Ex-YouTube Employee Reveals The ‘Toxic' Cycle Of Its Recommendation AI


YouTube has found itself embroiled in more than a few controversies in recent years thanks to its ‘Recommended for You' feature, which has been criticized for promoting violent and extremist content and, most recently, inappropriate videos of children.
It's a problem the firm has been scrambling to correct, but former Google employee Guillaume Chaslot says the root of the issue lies in the design of the recommendation algorithm itself.
The system, Chaslot explains, is built to predict and curate content geared toward the user's specific interests - be those innocent or nefarious - and gets better and better at its job the more you engage with it.
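To see why engagement alone can steer recommendations anywhere, here is a minimal, hypothetical sketch of an engagement-driven ranker in Python (the class, names and topics are invented for illustration; YouTube's actual model is far more complex and is not public):

from collections import defaultdict

class ToyRecommender:
    # Illustrative engagement-driven ranker; NOT YouTube's real system.
    def __init__(self):
        # per-user engagement score for each topic, defaulting to zero
        self.scores = defaultdict(lambda: defaultdict(float))

    def record_watch(self, user, topic, seconds_watched):
        # the engagement metric: longer watch time means a stronger signal
        self.scores[user][topic] += seconds_watched

    def recommend(self, user, topics):
        # rank candidates purely by this user's accumulated engagement
        user_scores = self.scores[user]
        return sorted(topics, key=lambda t: user_scores[t], reverse=True)

rec = ToyRecommender()
rec.record_watch("alice", "conspiracy", 600)   # ten minutes watched
rec.record_watch("alice", "cooking", 30)       # thirty seconds watched
print(rec.recommend("alice", ["cooking", "conspiracy", "music"]))
# -> ['conspiracy', 'cooking', 'music']

Whatever the user lingers on, innocent or nefarious, rises to the top; the ranker itself has no notion of whether the content is harmful.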
According to Chaslot, this means the system inherently comes with a ‘toxic potential.'
The software engineer worked on the AI for roughly a year between 2010 and 2011, and wrote in an essay for Wired in July that the issues coming to light now were ‘not unpredictable,' even if unintentional.
‘In some cases, the AI went terribly wrong,' Chaslot says.
Chaslot points to examples such as terrorist content and suggestive videos of children, which prompted widespread backlash and caused Disney and other companies to pull their ads.
The problem, according to the software engineer, can be found in the system's engagement metrics.
As the people those videos are aimed at interact with the recommendations, the AI becomes more precise in its suggestions.
Not only does that mean it will be better at recommending that content to that user, but it will also be less likely to show those videos to people who wouldn't want to see them, Chaslot says.
This is what's known as a feedback loop.
‘At that stage, problems with the algorithm become exponentially harder to notice, as content is unlikely to be flagged or reported,' Chaslot writes.
‘In the case of the pedophilia recommendation chain, YouTube should be grateful to the user who found and exposed it. Without him, the cycle could have continued for years.'
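A toy simulation of that loop (hypothetical topic names and weights, not YouTube's code) shows how quickly engagement-weighted recommendations can collapse onto a single interest:

import random

random.seed(0)  # deterministic for the example

# equal starting weights for three made-up topics
weights = {"music": 1.0, "cooking": 1.0, "borderline": 1.0}

def pick(weights):
    # sample one topic in proportion to its current weight
    topics = list(weights)
    return random.choices(topics, weights=[weights[t] for t in topics])[0]

# one user who only ever engages with the borderline topic
for _ in range(25):
    shown = pick(weights)
    if shown == "borderline":
        weights[shown] *= 1.5  # engagement feeds straight back into the weight

share = weights["borderline"] / sum(weights.values())
print(f"borderline topic now gets {share:.0%} of this user's recommendations")

After a few dozen iterations the borderline topic dominates this simulated user's feed, while viewers who never engage with it stop seeing it at all, which is exactly why, as Chaslot notes, such recommendation chains draw so few flags or reports.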