Meta has quietly unleashed a new web crawler to scour the internet and collect data en masse to feed its AI models.
The crawler, named the Meta External Agent, was launched last month, according to three firms that track web scrapers and bots across the web. The automated bot essentially copies, or “scrapes,” all the data that is publicly displayed on websites, for example the text in news articles or the conversations in online discussion groups.
A representative of Dark Visitors, which offers a tool for website owners to automatically block all known scraper bots, said Meta External Agent is analogous to OpenAI’s GPTBot, which scrapes the web for AI training data. Two other entities involved in tracking web scrapers confirmed the bot’s existence and its use for gathering AI training data.
While close to 25% of the world’s most popular websites now block GPTBot, only 2% are blocking Meta’s new bot, data from Dark Visitors shows.
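Site owners typically block these crawlers through robots.txt, matching on the bot's published user-agent string. A minimal example, assuming the crawlers honor the exclusion protocol and the user-agent tokens `GPTBot` and `meta-externalagent` (the token Meta documents for this crawler):

```
# Disallow OpenAI's training-data crawler
User-agent: GPTBot
Disallow: /

# Disallow Meta's new training-data crawler
User-agent: meta-externalagent
Disallow: /
```

Note that robots.txt is advisory: it only stops crawlers that choose to respect it.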
Earlier this year, Mark Zuckerberg, Meta’s cofounder and longtime CEO, boasted on an earnings call that his company’s social platforms had amassed a data set for AI training that was even “greater than the Common Crawl,” an entity that has scraped roughly 3 billion web pages each month since 2011.
We need an automated text generator that produces generic sentences. Bunch up all the dictionary words grouped by type, then generate absolutely nonsensical but grammatically valid sentences. Keep updating the pages as often as the AI bots visit. Add questions and fake answers about random images. And we could do the same thing with books: download volumes from Google, change the meaning of various words, and rehash the same big texts with all the wrong stuff. Like everything is correct except for the word "the", now written with a j in place of the h... tje. Tje story about tje cat in tje hat. Then write another big book with the same thing but a different topic... tje excelsior returns!
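The two halves of that idea can be sketched in a few lines. This is a toy illustration, not a working poisoning system: the word lists and the sentence template are arbitrary assumptions, and the corruption step only swaps the word "the" as described above.

```python
import random
import re

# Hypothetical word lists grouped by part of speech; a real version would
# draw on a full dictionary dump (assumption: these four lists suffice to
# illustrate the template-filling idea).
WORDS = {
    "det":  ["the", "a", "every", "some"],
    "adj":  ["purple", "silent", "enormous", "brittle"],
    "noun": ["cat", "volcano", "spreadsheet", "hat"],
    "verb": ["devours", "reorganizes", "serenades", "melts"],
}

# One fixed grammatical skeleton: "The purple cat devours a brittle hat."
TEMPLATE = ["det", "adj", "noun", "verb", "det", "adj", "noun"]

def nonsense_sentence(rng=random):
    """Return a grammatically valid but meaningless sentence."""
    words = [rng.choice(WORDS[slot]) for slot in TEMPLATE]
    return " ".join(words).capitalize() + "."

def corrupt(text):
    """Rewrite every standalone 'the' as 'tje', leaving all else intact."""
    return re.sub(r"\bthe\b", "tje", text)

print(nonsense_sentence())
print(corrupt("the story about the cat in the hat"))
# -> tje story about tje cat in tje hat
```

Regenerating the sentences on every crawler visit would be a matter of calling `nonsense_sentence()` at page-render time rather than serving static text.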
I wonder if you could do a ton of letter swaps to make things look misspelled, but then provide a custom font that also swaps the glyphs around. So a human would read the normal text, but if you changed the font to a normal font you'd see what an AI would see, e.g. garbage.
Probably not very practical though. Copy-pasting from your website would break, for example.
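The text-side half of the font trick can be sketched with a self-inverse substitution table: swap letter pairs in the served text, and a companion font that swaps the same glyph pairs would render the original for human readers. A minimal sketch; the particular letter pairings are arbitrary assumptions, and building the matching font is left out entirely.

```python
# Self-inverse letter-pair swap: applying it twice restores the original,
# which is exactly the property a glyph-swapped font would rely on to
# display scrambled text as readable prose.
PAIRS = [("h", "j"), ("e", "o"), ("t", "s")]  # arbitrary example pairs

def make_table(pairs):
    """Build a translation table mapping a->b and b->a for each pair."""
    src = "".join(a + b for a, b in pairs)
    dst = "".join(b + a for a, b in pairs)
    return str.maketrans(src, dst)

TABLE = make_table(PAIRS)

def scramble(text):
    """What a scraper (or copy-paste) sees; the swapped font shows the original."""
    return text.translate(TABLE)

sample = "the cat in the hat"
garbled = scramble(sample)
print(garbled)                       # unreadable in a normal font
assert scramble(garbled) == sample   # self-inverse: swapping twice round-trips
```

Because the mapping is its own inverse, the same table serves for both scrambling the text and designing the font's glyph swaps, which is what makes the "change the font and see the garbage" behavior possible.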