A machine learning librarian at Hugging Face just released a dataset composed of one million Bluesky posts, complete with when they were posted and who posted them, intended for machine learning research.
Daniel van Strien posted about the dataset on Bluesky on Tuesday.
“This dataset contains 1 million public posts collected from Bluesky Social's firehose API, intended for machine learning research and experimentation with social media data,” the dataset description says. “Each post contains text content, metadata, and information about media attachments and reply relationships.”
The data isn’t anonymous. In the dataset, each post is listed alongside the user’s decentralized identifier, or DID; van Strien also made a search tool for finding users based on their DID and published it on Hugging Face. A quick skim through the first few hundred of the million posts shows people doing normal types of Bluesky posting—arguing about politics, talking about concerts, saying stuff like “The cat is gay” and “When’s the last time yall had Boston baked beans?”—but the dataset has also swept up a lot of adult content.
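To make concrete why the DIDs in the dataset are not anonymous: for `did:plc` accounts, the public PLC directory (`https://plc.directory/<did>`) returns a DID document whose `alsoKnownAs` field lists the account’s handle, so any DID can be mapped back to a human-readable username. A minimal sketch of that lookup step, using a made-up DID document rather than a live request:

```python
# Sketch: mapping a DID back to a Bluesky handle from its DID document.
# A real document would be fetched from https://plc.directory/<did>;
# the sample below is invented for illustration, not a real account.

def handle_from_did_doc(did_doc: dict):
    """Extract the handle from a DID document's alsoKnownAs entries."""
    for aka in did_doc.get("alsoKnownAs", []):
        if aka.startswith("at://"):
            return aka.removeprefix("at://")
    return None

sample_doc = {
    "id": "did:plc:example123",  # hypothetical DID
    "alsoKnownAs": ["at://example.bsky.social"],
}

print(handle_from_did_doc(sample_doc))  # example.bsky.social
```

This is the same resolution step a DID-to-user search tool would perform for each identifier.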
Yes, your content. That's the only thing anyone ever claimed you keep, and the only part it would make sense to value. Owning it makes it incredibly simple to make that history available elsewhere, and it's incredibly likely that a future platform that emerges will facilitate that process, just like all the book platforms let you import from Goodreads.
It's not "incredibly likely" you can import social media from one network to another.
It has never been supported by any social media network, and Bluesky's architecture is such that the only entities that can realistically host it are giant organizations like megacorps, which are profit-driven and want lock-in, not portability.
You're hoping it works out, but without an example of viability, it's just conjecture that this is a real or even valuable feature.
It's a virtual certainty, because you control the information.
The lack of imports has nothing to do with the new places not wanting it and everything to do with the old place holding your data hostage. Having a clean, formally defined source of your data is all it takes to make building an import from a popular network trivial.
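The claim above can be sketched concretely: if the old network exports records in a documented schema, an "importer" reduces to a small field-mapping function. Both schemas below are invented for illustration; nothing here is a real platform's format.

```python
# Hypothetical sketch: importing a post from a formally defined export.
# The field names on both sides are made up; the point is that a clean,
# documented source format makes the mapping trivial to write.

def import_post(old: dict) -> dict:
    """Map one record from a hypothetical export schema to a new one."""
    return {
        "author": old["did"],                # stable decentralized ID
        "body": old["text"],
        "posted_at": old["createdAt"],
        "reply_to": old.get("replyParent"),  # None if not a reply
    }

exported = {
    "did": "did:plc:example123",
    "text": "Hello from the old network",
    "createdAt": "2024-11-26T12:00:00Z",
}

print(import_post(exported)["body"])  # Hello from the old network
```

The hard part historically was never this mapping; it was getting the old platform to hand over the data in any defined format at all.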