Saturday 13 July 2019

YouTube Recommendation System Gathers Together Videos Of Partially-Dressed Young Children And ‘Recommends’ Them To Pedophiles

by Geoffrey Grider

YouTube Recommendation System Links Innocent Videos of Children to Those Preferred by Pedophiles

YouTube’s automated video recommendation system has been allowing otherwise innocent videos of children to be categorized with those preferred by pedophiles, says a blockbuster new report by the New York Times.

Social media is getting more and more evil as we continue our journey into the last days before the Pretribulation Rapture of the Church takes place. Sites like Twitter, Facebook and YouTube are banning content from Christians and Conservatives, but evidently catering to the needs of bottom-feeding pedophiles is OK. Apparently YouTube thinks so, anyway. Their new algorithm automatically locates videos of young children swimming at the beach or engaged in other activities that leave them partially dressed, and places those videos in playlists for pedophiles.
YouTube is owned by parent company Google, which is owned by parent company Alphabet. When Google first started, it had a motto that simply read 'don't be evil', a motto it dumped a few years ago after purchasing a half-dozen companies that create robots and signing lucrative contracts with the Defense Department for military technology. In 2019, evil is big business and there are billions of dollars to be made from it, including presenting pedophiles with neatly gathered playlists of partially-dressed children for their viewing pleasure.

NYT: YouTube Recommendation System Links Innocent Videos of Children to Those Preferred by Pedophiles

FROM BREITBART NEWS: “YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families,” wrote columnists Max Fisher and Amanda Taub.
The writers spoke with researchers who discovered the problem of a YouTube algorithm that often referred innocent videos to a category of sexually-themed content. “The result was a catalog of videos that experts say sexualizes children,” they observed.
YouTube’s algorithm has been curating home movies of unwitting families into a catalog of semi-nude kids, we found.
YT often plays the videos after users watch softcore porn, building an audience of millions for what experts call child sexual exploitation https://t.co/zNwsd9UsgN
— Max Fisher (@Max_Fisher) June 3, 2019
Jonas Kaiser, a researcher at Harvard’s Berkman Klein Center for Internet and Society, identified YouTube’s algorithm as the means for connecting the channels, stated the report. “That’s the scary thing,” he said, adding that while YouTube never intended to connect family videos of young children to pedophiles, the reality of the situation is “disturbingly on point.”
The Google-owned platform is essentially leading users with pedophile interests to videos of partially-clothed children – possibly in swimsuits outside in their backyard pools – through its progressions of recommended videos.
Fisher and Taub wrote:
So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children’s clothes. Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.
On its own, each video might be perfectly innocent, a home movie, say, made by a child. Any revealing frames are fleeting and appear accidental. But, grouped together, their shared features become unmistakable.
The writers interviewed Christiane C., a mother from the Rio de Janeiro area, whose 10-year-old daughter and a friend uploaded an innocuous video of themselves while swimming in a backyard pool.
When Christiane’s daughter excitedly told her mother several days later that her video had 400,000 views, the mother fearfully viewed it again. YouTube’s recommendation system had shown the video to users with pedophile interests, researchers said.
According to Fisher and Taub, when the NYT alerted YouTube to its discovery, the company removed several of the videos, “but left up many others, including some apparently uploaded by fake accounts.”
Additionally, its recommendation system changed and no longer connected some of the innocent videos with sexually-themed content, though YouTube said this was only a product of routine tweaks to its system and not related to any effort to stop the exploitation of children.
“Protecting kids is at the top of our list,” said Jennifer O’Connor, YouTube product director for trust and safety, about the company’s commitment to end exploitation of children on its platform.

The NYT columnists, however, noted that YouTube has declined to switch off its recommendation system for videos of children, a decision that leaves those videos at continued high risk.

In February, Wired observed that major companies including McDonald’s, Nestlé, and Epic Games pulled ads from YouTube over reports that many of the platform’s “videos with tens of millions of views are being inundated with comments by pedophiles, with adverts from major brands running alongside the disturbing content.”
The Wall Street Journal reported that, during a conference call with ad buyers, Google executives “sought to assuage concerns by explaining the steps the company has taken to address brand safety problems that have plagued the platform.”
“The executives also told ad buyers the company will deliver a timeline in 24 hours outlining new restrictions and product changes, one of the people said,” the report noted.
However, as Breitbart News reported, “Google has struggled with pedophilia on YouTube for years, and in December 2017, the company claimed it would hire ‘thousands’ of human moderators to combat the problem.”
According to Fisher and Taub, YouTube described its recommendation system as artificial intelligence that continuously learns which suggestions will keep users watching videos.
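For readers wondering how a system like that ends up grouping these videos at all, the dynamic the report describes can be illustrated with a minimal sketch of an engagement-only recommender. To be clear, everything below is hypothetical: the names (`EngagementRecommender`, `record_watch`, `recommend`) are invented for this illustration and are not YouTube’s actual code or API. The sketch only shows how an objective built purely on watch time and viewing sequences will reinforce any cluster of videos that keeps a group of users watching, with no notion of whether that grouping is harmful.

```python
# Hypothetical sketch of an engagement-optimizing recommendation loop.
# This is NOT YouTube's code; it only illustrates the feedback dynamic
# described above: the system learns from whatever was watched, and
# ranks candidates purely by predicted engagement.

from collections import defaultdict


class EngagementRecommender:
    def __init__(self):
        # watch_time[user][video] accumulates observed seconds watched
        self.watch_time = defaultdict(lambda: defaultdict(float))
        # co_watch[a][b] counts how often video b was watched after video a
        self.co_watch = defaultdict(lambda: defaultdict(int))

    def record_watch(self, user, prev_video, video, seconds):
        """Feedback step: the loop trains on whatever the user watched."""
        self.watch_time[user][video] += seconds
        if prev_video is not None:
            self.co_watch[prev_video][video] += 1

    def recommend(self, user, current_video, candidates, k=3):
        """Rank candidates by engagement signals alone."""
        def score(video):
            # Videos that historically follow the current one, and that
            # this user has lingered on, rank highest -- regardless of
            # what the videos actually contain.
            return (self.co_watch[current_video][video],
                    self.watch_time[user][video])
        return sorted(candidates, key=score, reverse=True)[:k]


# Example: after enough users follow the same viewing path, the system
# will keep steering similar users down that same path.
rec = EngagementRecommender()
rec.record_watch("user1", "video_a", "video_b", seconds=120.0)
print(rec.recommend("user1", "video_a", ["video_b", "video_c"]))
```

The point of the sketch is the objective function: because the only signals are watch time and co-watch frequency, any sequence of videos that some audience watches together, innocent or not, gets reinforced and recommended to similar users, which is exactly the progression the researchers documented.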
Marcus Rogers, a psychologist at Purdue who has conducted research on child pornography, told the writers that such a gradual progression from innocent videos of children to increasingly sexualized recommendations makes the content fairly easy to “normalize.”
“A lot of people that are actively involved in chatting with kids are very, very adept at grooming these kids into posting more sexualized pictures or engaging in sexual activity and having it videotaped,” Rogers explained.
Similarly, Jenny Coleman, director of Stop It Now, a group that fights sexual exploitation of children, warned, “Even the most careful of families can get swept into something that is harmful or criminal.” READ MORE

YouTube's Algorithm Keeps Suggesting Home Videos Of Kids To Pedophiles

A Harvard scientist was researching the effects of YouTube on Brazilian society when he made a disturbing discovery. YouTube is scrambling to explain why some innocent home movies are being suggested to child predators.
