this post was submitted on 08 Apr 2024
317 points (97.9% liked)
Fediverse
Are you able to include the HTTP method being called and the amount of data transferred per request? It's possible that the first request is an OPTIONS request and the second is a POST.
If you can see the amount of data transferred, that would give some more indication that double the requests are being sent, and let you at least quantify the bandwidth impact.
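For what it's worth, nginx can log both of those per request: `$request` includes the method, and `$request_length` is the size of the request line, headers and body. A rough sketch of a custom `log_format` (untested; the format name and log path are made up):

```nginx
# Log the method (part of $request), status, request size and response size.
log_format fedi '$remote_addr [$time_local] "$request" $status '
                'req_bytes=$request_length resp_bytes=$body_bytes_sent';
access_log /var/log/nginx/activitypub.log fedi;
```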
They're all POST requests. I trimmed it out of the log for space, but the first 6 requests in the video looked like this (nginx shows the data amount for GET, but not for POST):
If I were running Lemmy, every second line would say 400, from it rejecting the duplicate. In terms of bandwidth, every line represents a full JSON payload, so I guess it's about 2K minimum for the standard cruft, plus however much for the actual contents of the comment (the comment replying to this one would've been 8K).
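To put a rough number on that "standard cruft": even a stripped-down ActivityPub Create activity is a few hundred bytes before any real content, and real payloads carry more fields than this. The hostnames and IDs below are made up; this is an illustration, not an actual Lemmy payload:

```python
import json

# Hypothetical, stripped-down ActivityPub Create activity -- made-up hostnames,
# and far fewer fields than a real delivery -- just to show where the fixed
# per-request overhead comes from.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": "https://example.instance/activities/create/d8f1c2",
    "type": "Create",
    "actor": "https://example.instance/u/someuser",
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "object": {
        "id": "https://example.instance/comment/12345",
        "type": "Note",
        "attributedTo": "https://example.instance/u/someuser",
        "content": "<p>the actual comment text</p>",
        "published": "2024-04-08T12:00:00Z",
    },
}

# Even this skeleton is several hundred bytes on the wire.
print(len(json.dumps(activity).encode("utf-8")))
```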
My server just took the requests and dumped the bodies out to a file, and then a script was outputting the object.id, object.type, and object.actor into /tmp/demo.txt (which is another confirmation that they were POST requests, of course).
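For reference, a minimal sketch of that kind of extraction script (the actual one wasn't posted; the dump path and the one-JSON-body-per-line format are assumptions):

```python
#!/usr/bin/env python3
import json

# Assumed: the test server appended each POSTed body as one JSON document per
# line to this file. Both the path and the format are guesses.
DUMP_FILE = "/var/log/ap-bodies.jsonl"

with open(DUMP_FILE) as dump, open("/tmp/demo.txt", "a") as out:
    for line in dump:
        line = line.strip()
        if not line:
            continue
        activity = json.loads(line)
        obj = activity.get("object")
        if not isinstance(obj, dict):
            # Some activities only reference the object by URL; skip those.
            continue
        # object.id, object.type and object.actor, as in the demo output.
        out.write(f"{obj.get('id')} {obj.get('type')} {obj.get('actor')}\n")
```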
If the first one is OPTIONS, would that be a bug? Would the right design principle be to do it once per endpoint and then cache it for future requests?
I'm really curious because I don't know how this usually works...
That's pretty standard with most libraries
I've never really seen this in a (Java/Rust/PHP) backend personally, only in client-side JS (the CORS preflight).
It's a security feature for browsers making calls (checking the CORS headers before actually calling the endpoint), but for backends the only place it makes sense is if you're implementing something like webhooks, to validate the (user-submitted) endpoint.
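For the browser case, the "do it once per endpoint and cache it" idea asked about above is exactly what the Access-Control-Max-Age header on the preflight response does: the browser caches the OPTIONS result per URL and skips the preflight on later calls. A minimal sketch in plain Python (not Lemmy's code; the allowed origins and headers here are placeholders):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_OPTIONS(self):
        # CORS preflight: say what's allowed, and let the browser cache the
        # answer for 24 hours so it doesn't re-send OPTIONS for every POST.
        self.send_response(204)
        self.send_header("Access-Control-Allow-Origin", "*")
        self.send_header("Access-Control-Allow-Methods", "GET, POST, OPTIONS")
        self.send_header("Access-Control-Allow-Headers", "Content-Type")
        self.send_header("Access-Control-Max-Age", "86400")
        self.end_headers()

    def do_POST(self):
        # The actual endpoint the browser was asking permission to call.
        length = int(self.headers.get("Content-Length", 0))
        self.rfile.read(length)
        self.send_response(200)
        self.send_header("Access-Control-Allow-Origin", "*")
        self.send_header("Content-Length", "0")
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), Handler).serve_forever()
```

Note this only applies to browser clients; server-to-server deliveries like the federation traffic in this thread don't send a preflight at all.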
I wonder if the legacy webhooks implementation in Lemmy has left some artifacts that show up when the services that comprise Lemmy are split up as they are for larger instances.
This is pure speculation.
Ok so my assumptions were right. Interesting...