Porn, dog poo and social media snaps: the ‘taskers’ scraping the internet for AI firm part-owned by Meta



Tens of thousands of people have been paid by a company part-owned by Meta to train AI by combing Instagram accounts, harvesting copyrighted work and transcribing pornographic soundtracks, the Guardian can reveal.

Scale AI, 49%-controlled by Mark Zuckerberg’s social media empire, has recruited experts across fields such as medicine, physics and economics – putatively to refine top-level artificial intelligence systems through a platform called Outlier.

“Become the expert that AI learns from,” it says on its site, advertising flexible work for people with strong credentials.

However, workers for the platform said they have become involved in scraping an array of other people’s personal data – in what they described as a morally uncomfortable exercise that diverged significantly from refining high-level systems.

Outlier is managed by Scale AI, which has contracts with the Pentagon and US defense companies.

Its former CEO, Alexandr Wang, who is Meta’s chief AI officer, was described by Forbes as the “world’s youngest self-made billionaire”. Its former managing director, Michael Kratsios, is the science adviser to the US president, Donald Trump.

One Outlier contractor based in the US said users of Meta platforms, including Facebook and Instagram, would be surprised at how data from their accounts was collected – including pictures of users and their friends.

“I don’t think people understood quite that there’d be somebody on a desk in a random state, looking at your [social media] profile, using it to generate AI data,” they said.

The Guardian spoke to 10 people who have worked for Outlier to train AI systems, some for more than a year.

Many of them had other jobs – as journalists, graduate students, teachers and librarians. But in an economy struggling under the threat of AI, they wanted the extra work.

“A lot of us were really desperate,” said one. “Many people really needed this job, myself included, and really tried to make the best of a bad situation.”

Like the growing class of AI gig workers worldwide, most believed they had been training their own replacements.

One artist described “internalised shame and guilt” for “contributing directly to the automation of my hopes and dreams”. “As an aspiring human, it makes me angry at the system,” they said.

Glenn Danas, a partner at Clarkson, a law firm representing AI gig workers in lawsuits against Scale AI and several similar platforms, estimates that hundreds of thousands of people worldwide now work for platforms such as Outlier.

The Guardian spoke to Outlier workers, also called “taskers”, in the UK, the US and Australia. In interviews, taskers described the increasingly familiar humiliations of AI gig work: constant monitoring and piecemeal, unstable employment.

Scale AI has been accused of using “bait-and-switch” tactics to lure in potential workers – promising workers a high salary during initial recruitment, and then offering them significantly less. Scale AI declined to comment on ongoing litigation, but a source said pay rates change after recruitment only if workers opt in to different, lower-paid projects.

Some taskers said they were asked to submit to repeated, unpaid AI interviews to qualify for certain assignments; several believed these interviews were recycled to train AI. All of them said they were constantly monitored through a platform called “Hubstaff”, which could screenshot the websites they visited while working.

The Scale AI source said that no interviews were required to work on Outlier projects, although “optional onboarding” was offered to get contributors started; and that Hubstaff was used to ensure contributors were paid accurately but not to “actively monitor” taskers.

Several taskers described being asked to transcribe pornographic soundtracks, or label photos of dead animals or dog faeces. One doctoral student said they had to label a diagram of baby genitalia. There were police calls that described violent scenarios.

“We had already been told before that there would be no nudity in this mission. Appropriate behaviour, no gore, like no blood,” said the student.

“But then I would get an audio transcript thing for porn or there would be just random clips of people throwing up for some reason.”

The Guardian has seen videos and screenshots of some of the tasks that Outlier required its workers to perform. These included photos of dog faeces, and tasks with prompts such as “What would you do if an inmate refused to follow orders in a correctional facility?”

Scale AI, the source said, shuts down tasks if inappropriate content is flagged, and workers are not required to continue with tasks that make them feel uncomfortable. The source added that Scale AI did not take on projects involving child sexual abuse material or pornography.

There was an expectation of social media scraping, the Outlier workers suggested.

Seven of the taskers described scouring other people’s Instagram and Facebook accounts, tagging individuals by name, as well as their locations and their friends. Some of these involved training the AI on the accounts of people under the age of 18.

The assignments were structured to require new data other taskers had not yet uploaded, pushing workers to plumb the social accounts of more people. The Guardian has seen one such task, which required workers to select photos from individuals’ Facebook accounts and sequentially order them by the age of the user in the photo.

Several taskers said they found these assignments unsettling; one tried to complete them using only photos of celebrities and public figures.

“I was uncomfortable including pictures of kids and stuff, but like the training materials would have kids in it,” said one.

“I didn’t use any friends or family to submit [tasks] to the AI,” said another. “I do understand that I don’t like it ethically.”

The Scale source said taskers did not review social media accounts set to “private”, and was not aware of tasks that involved labelling the ages of individuals, or their personal relationships. They added that Scale AI did not take on projects with explicit sensitive content related to children, but did use children’s public social media data.

Workers did not log on to personal Facebook or Instagram accounts to complete these tasks.

For another assignment, taskers described harvesting images of copyrighted artwork. As with the social media training, the task required constant new input – apparently to train an AI to produce its own artistic images. As workers ran out of other options, they plumbed social media accounts of artists and creators.

The Guardian has seen documentation of this assignment, which included AI-generated paintings of “a Native American caregiver”, and the prompt: “DO NOT use AI-generated images. Only select hand-drawn, painted or illustrated artwork created by human artists.”

Scale AI did not ask contributors to use copyrighted artwork to complete assignments, the source said, and it declined work that violated this standard.

Taskers also expressed uncertainty about what they might be training the AI to do – and how their submissions would be used. “It does seem like labelling diagrams is something an AI can already do so I’m really curious as to why we need like, dead animals,” said one.

Scale AI has counted among its clients major technology companies such as Google, Meta and OpenAI, as well as the US department of defense and the government of Qatar.

It fills a need that is becoming more pronounced as AI models grow larger: for new, labelled data that can be used to train them.

Taskers described interacting with ChatGPT and Claude, or using data from Meta to complete certain assignments; some thought they might be training Meta’s new model, Avocado. Meta and Anthropic did not respond to a request for comment.

OpenAI said it stopped working with Scale AI in June 2025, and its “supplier code of conduct sets out clear expectations for the ethical and fair treatment of all workers”.

Most taskers the Guardian spoke to are still accepting assignments on the Outlier platform.

The pay is unsteady; they face occasional mass layoffs. But with the AI future fast arriving, they feel there may not be any other choice.

“I have to be positive about AI because the alternative is not great,” said one. “So I think eventually things will get figured out.”

A Scale AI spokesperson said: “Outlier provides flexible, project-based work with transparent pay. Contributors choose when and how they participate, and availability varies based on project needs. We regularly hear from highly skilled contributors who value the flexibility and opportunity to apply their expertise on the platform.”

This article was amended on 7 and 8 April 2026. Alexandr Wang is the former, not current, CEO of Scale AI. The company is part-owned by Meta, not owned by it as an earlier headline said. And a response from Scale AI has been added regarding claims by taskers that they had to take on unpaid interviews.