
cross-posted from: https://beehaw.org/post/17683690

Archived version

Download study (pdf)

GitHub, the de facto platform for open-source software development, provides a set of social-media-like features to signal high-quality repositories. Among them, the star count is the most widely used popularity signal, but it is also at risk of being artificially inflated (i.e., faked), which decreases its value as a decision-making signal and poses a security risk to all GitHub users.

In a recent paper published on arXiv (a preprint server operated by Cornell University), researchers present a systematic, global, and longitudinal measurement study of fake stars in GitHub. They introduce StarScout, a scalable tool able to detect anomalous starring behaviors (i.e., low activity and lockstep) across the entire GitHub metadata.
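To make those two signals concrete, here is a minimal, hypothetical sketch of how "low activity" and "lockstep" detection might look over GitHub event metadata. This is not StarScout's actual implementation; the event layout and both thresholds are illustrative assumptions.

```python
from collections import Counter, defaultdict

# Toy event stream: (actor, repo, day) tuples standing in for GitHub's
# public event metadata. Field layout and thresholds are illustrative
# assumptions, not StarScout's actual parameters.
events = [
    ("alice", "octo/real-project", "2024-11-01"),
    ("alice", "octo/other-project", "2024-11-05"),
    ("alice", "octo/real-project", "2024-11-09"),
    ("bot001", "evil/fake-repo", "2024-11-03"),
    ("bot002", "evil/fake-repo", "2024-11-03"),
    ("bot003", "evil/fake-repo", "2024-11-03"),
]

LOW_ACTIVITY_MAX_EVENTS = 1  # the star is essentially the account's only activity
LOCKSTEP_MIN_BURST = 3       # many accounts hitting the same repo on the same day

def low_activity_accounts(events):
    """Accounts with almost no recorded activity besides the star itself."""
    counts = Counter(actor for actor, _, _ in events)
    return {actor for actor, n in counts.items() if n <= LOW_ACTIVITY_MAX_EVENTS}

def lockstep_bursts(events):
    """(repo, day) pairs starred by a suspiciously large group at once."""
    groups = defaultdict(set)
    for actor, repo, day in events:
        groups[(repo, day)].add(actor)
    return {k: v for k, v in groups.items() if len(v) >= LOCKSTEP_MIN_BURST}

print(low_activity_accounts(events))  # {'bot001', 'bot002', 'bot003'}
print(lockstep_bursts(events))        # {('evil/fake-repo', '2024-11-03'): {...}}
```

Real detection would of course need cleverer thresholds and windowing than this, but the two heuristics themselves are the ones the paper names.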

Analyzing the data collected using StarScout, they find that:

1. Fake-star-related activities have surged rapidly since 2024.
2. The user profile characteristics of fake stargazers are not distinct from those of average GitHub users, but many of them have highly abnormal activity patterns.
3. The majority of fake stars are used to promote short-lived malware repositories masquerading as pirating software, game cheats, or cryptocurrency bots.
4. Some repositories may have acquired fake stars for growth hacking, but fake stars only have a promotion effect in the short term (i.e., less than two months) and become a burden in the long term.

The study has implications for platform moderators, open-source practitioners, and supply chain security researchers.

[–] ITeeTechMonkey 15 points 1 day ago (2 children)

Does Codeberg have anything that will prevent an influx of the bots and AI accounts that have plagued GitHub?

I ask because, as the user base for Codeberg grows, the bots, AI, and nefarious actors will follow.

I like the idea of a federated source code hosting platform, especially since it removes lock-in to a single corporation and a de facto monopoly.

That in itself is a good enough reason to migrate, but with regard to this particular issue, bots/AI and artificial project promotion with malicious intent, migrating feels like rearranging deck chairs on the Titanic.

[–] [email protected] 6 points 1 day ago

Once these things are federated, it seems reasonable to expect that each instance would be able to choose what stars/followers/etc it accepts or displays, roughly similar to what Lemmy does with allowed/blocked instances. That might put a dent in the problem. At least, there would no longer be a single, easy, high-value target for this sort of thing.
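As a rough illustration of that idea, here is a hypothetical sketch of per-instance star filtering, loosely modeled on Lemmy-style allow/block lists. None of this reflects an actual Codeberg/Forgejo API; the policy sets and helper are made up for the example.

```python
from urllib.parse import urlparse

# Hypothetical per-instance federation policy, loosely modeled on Lemmy's
# allowed/blocked instance lists. Not an actual Forgejo/Codeberg API.
BLOCKED_INSTANCES = {"starfarm.example"}
ALLOWED_INSTANCES = set()  # empty allowlist = accept anything not blocked

def accept_star(actor_uri: str) -> bool:
    """Decide whether a federated star is counted and displayed locally."""
    host = urlparse(actor_uri).hostname
    if host in BLOCKED_INSTANCES:
        return False
    return not ALLOWED_INSTANCES or host in ALLOWED_INSTANCES

incoming = ["https://codeberg.org/alice", "https://starfarm.example/bot42"]
print([uri for uri in incoming if accept_star(uri)])
# -> ['https://codeberg.org/alice']
```

The point is exactly the one made above: each instance gets its own policy knob, so there is no single counter worth gaming globally.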

[–] mesamunefire 1 points 1 day ago

No idea. I would assume it's the same as for all other ActivityPub providers.
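For context, a federated "star" would plausibly travel as an ActivityPub activity. A hypothetical payload, using the standard Activity Streams 2.0 "Like" type (the exact vocabulary a forge would use, e.g. ForgeFed extensions, is an assumption here):

```python
import json

# Hypothetical ActivityPub payload for a federated star, using the
# standard Activity Streams 2.0 "Like" type. The actor and object
# URLs are made up for the example.
star_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Like",
    "actor": "https://codeberg.org/alice",
    "object": "https://codeberg.org/someorg/somerepo",
}
print(json.dumps(star_activity, indent=2))
```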