could you share some specific examples?
feel free to pm them if you don't want to post them in public.
do you have some more details about that?
Generally that is true, but when you access a remote community for the first time, Lemmy attempts to backfill several posts from the community. This backfill is limited to posts only, so comments and votes are not included. You can also "resolve" a post (or a comment) on an instance from its fedilink (the colorful icon you see next to posts and comments), so when someone links to something elsewhere, a lot of apps will try to open it on the current instance by resolving it, which can also result in posts or comments showing up even when there isn't a subscriber. Resolving can also be done manually by entering the URL in the search. This doesn't always work reliably on the first try though, so it can help to retry if resolving something fails the first time.
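For anyone curious what "resolving" means technically: it's the same thing the search box does when you paste a URL. A minimal sketch, assuming the standard /api/v3/resolve_object endpoint and a made-up fedilink (resolving usually requires being logged in, so a jwt is passed along):

```python
import requests

INSTANCE = "https://lemmy.world"            # instance you want the object to show up on
FEDILINK = "https://example.com/post/123"   # hypothetical fedilink of the remote object
JWT = "your-login-token"                    # placeholder login token

# ask the instance to fetch the remote object over federation and store it locally
resp = requests.get(
    f"{INSTANCE}/api/v3/resolve_object",
    params={"q": FEDILINK},
    headers={"Authorization": f"Bearer {JWT}"},
    timeout=30,
)
resp.raise_for_status()
# the response contains whichever of post/comment/community/person was resolved
print(resp.json())
```

Since the remote fetch can fail transiently, simply repeating the same request is usually enough when the first attempt doesn't work.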
I think there is also something that updates community information in the background from time to time. I'm not sure whether that only happens under certain conditions or at regular intervals, and I'm not sure whether it fetches new posts at that point either. If it does, it could explain new posts appearing at a roughly daily interval but without any comments or votes. Backfill should probably only happen initially, when the community is discovered for the first time.
from a quick look it doesn't seem like the crawler uses any federation; it seems to just iterate over the community list api for each tracked instance. it probably doesn't have logic to remove entries that no longer exist, considering that they're still in there.
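i haven't read the crawler's code, so this is just a sketch of what "iterating over the community list api" would look like, with a made-up instance list and the standard paginated /api/v3/community/list endpoint:

```python
import requests

# hypothetical list of tracked instances; the real crawler tracks many more
INSTANCES = ["https://lemmy.world", "https://lemmy.ml"]

def list_local_communities(instance: str):
    """Page through /api/v3/community/list for communities local to one instance."""
    page = 1
    while True:
        resp = requests.get(
            f"{instance}/api/v3/community/list",
            params={"type_": "Local", "limit": 50, "page": page, "show_nsfw": "true"},
            timeout=30,
        )
        resp.raise_for_status()
        communities = resp.json()["communities"]
        if not communities:
            break
        yield from communities
        page += 1

for instance in INSTANCES:
    for c in list_local_communities(instance):
        # a real crawler would upsert into its own database here; note there is no
        # step that removes communities that disappeared upstream, which would
        # explain stale entries sticking around
        print(instance, c["community"]["name"])
```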
I couldn't tell you the reason for this, but several posts in [email protected] have been locked by a moderator: https://lemmy.world/modlog/959443
as far as i know, locking a post does not affect voting; it only prevents new comments from being federated.
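if you want to check this sort of thing yourself, the modlog is also available through the api. a rough sketch, assuming the standard /api/v3/modlog endpoint and the ModLockPost action type (response field names may differ between lemmy versions):

```python
import requests

INSTANCE = "https://lemmy.world"

# fetch recent post-lock actions from the public modlog (no auth needed for public entries)
resp = requests.get(
    f"{INSTANCE}/api/v3/modlog",
    params={"type_": "ModLockPost", "limit": 20},
    timeout=30,
)
resp.raise_for_status()
for entry in resp.json().get("locked_posts", []):
    # each entry pairs the affected post with the lock action itself
    print(entry["post"]["name"], "locked:", entry["mod_lock_post"]["locked"])
```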
as for the other example you mentioned, i assume you're referring to the inconsistency on lemmyverse.net? i haven't looked at how that application works, but it's unlikely to be working with activitypub/federation; it's most likely just connecting to various instances and using their APIs. i've also left a comment over there about that.
i already explained before why these posts don't see votes on startrek.website - there is no local subscriber on that instance. once at least one person from that instance subscribes to the community it'll start seeing updates, which includes votes. there has also been a comment by one of the startrek.website admins about the federation issues caused by them accidentally blocking certain traffic from other instances here.
for discuss.online, there does not seem to have been a prolonged federation delay according to this dashboard, only a delay of about 1.5h at some point that was recovered from fairly quickly. it is also quite possible that the first subscriber to the community on discuss.online only subscribed after the post was created, as the more recent posts seem to be doing just fine with their vote counts when comparing discuss.online and lemmy.world numbers. looking at our database, i can see that the first subscriber to that community from discuss.online joined about 5 hours after the post was created, which would easily explain the partial votes.
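without database access you can still compare the subscriber situation from the outside, by asking each instance about the community directly. a rough sketch with a made-up community handle, assuming the standard /api/v3/community endpoint (newer lemmy versions also report a local subscriber count):

```python
import requests

COMMUNITY = "somecommunity@lemmy.world"   # hypothetical community handle in name@instance form
INSTANCES = ["https://lemmy.world", "https://startrek.website", "https://discuss.online"]

for instance in INSTANCES:
    resp = requests.get(
        f"{instance}/api/v3/community",
        params={"name": COMMUNITY},
        timeout=30,
    )
    resp.raise_for_status()
    counts = resp.json()["community_view"]["counts"]
    # "subscribers" is the total the instance knows about; "subscribers_local"
    # (where available) is the number that actually matters for receiving updates
    print(instance, counts.get("subscribers"), counts.get("subscribers_local"))
```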
there seem to be two separate issues relating to that.
the number at the top includes "all" communities, including those marked as nsfw.
at a quick glance, it seems all the nsfw marked ones are correctly marked as such, in the sense of also being nsfw on lemmy.
there is also a large number of communities missing overall, but at least the number next to the community tab matches the number of listed communities when the filter is set to also show nsfw communities.
there is also either some kind of data corruption going on or there may have been some strange spam communities on lemmy.world in the past, as it shows a bunch of communities with random numbers in the name and display names like oejwfiojwwqpofioqwfiowqiofkwqeifjwefwefoejwfiojwwqpofioqwfiowqiofkwqeifjwefwefoejwfiojwwqpofioqwfiowqiofkwqeifjwefwefoejwfiojwwqpofioqwfiowqiofkwqeifjwefwefoejwfiojwwqpofioqwfiowqiofkwqeifjwefwefoejwfiojwwqpofioqwfiowqiofkwqeifjwefwef
which don't currently exist on lemmy.world.
there is indeed a cutoff. there is an exponential delay for retrying, and at some point lemmy will stop trying until it sees the instance as active again.
there is also a scheduled task running once a week that deletes local activities older than a week. downtimes of a day or two can generally be recovered from easily, though depending on latency it can take a lot more time. if an instance is down for an extended time, it shouldn't expect to still get activities from the entire time it was offline.
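to illustrate why a long outage adds up quickly: the retry delay grows per failed attempt. the numbers below are made up for illustration and are not lemmy's actual schedule:

```python
from datetime import timedelta

def retry_delay(attempt: int) -> timedelta:
    """Illustrative exponential backoff: 1h, 2h, 4h, ... capped at 2 days."""
    return timedelta(hours=min(2 ** attempt, 48))

elapsed = timedelta()
for attempt in range(8):
    delay = retry_delay(attempt)
    elapsed += delay
    print(f"attempt {attempt + 1}: wait {delay}, total waited {elapsed}")
```

combined with the weekly cleanup of week-old activities, this is why a long outage can mean some activities are simply never delivered.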
downtime should not result in missing content when the sending instance is running lemmy 0.19.0 or newer. 0.19.0 introduced a persistent federation queue, which means lemmy will retry sending the same activities until the receiving instance is available again. depending on the type of downtime, it is also possible that a misconfiguration (e.g. a "wrong" http status code on a maintenance page) makes the sending instance think an activity was delivered successfully. if the receiving instance was unreachable (timeouts) or returning http 5xx errors, everything should be preserved.
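a simplified sketch of the decision the sending side makes per delivery attempt (not lemmy's actual code), which shows why a maintenance page answering with 200 is a problem:

```python
import requests

def classify_delivery(inbox_url: str, activity: dict) -> str:
    """Illustrative classification of a single activity delivery attempt."""
    try:
        resp = requests.post(inbox_url, json=activity, timeout=30)
    except requests.RequestException:
        return "retry"      # unreachable / timeout: the activity is kept and retried later
    if resp.ok:
        # any 2xx counts as delivered - a maintenance page that answers 200
        # can therefore silently swallow activities
        return "delivered"
    if resp.status_code >= 500:
        return "retry"      # 5xx: treated as a temporary failure, retried later
    return "other"          # 4xx etc.: handling depends on the lemmy version
```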
we are planning to post an announcement about the current situation with lemmy updates and our future plans in the coming days; stay tuned for that. you can find some info in my comment history already if you are curious.
we're currently aware of delayed federation from lemmy.ml towards lemmy.world and are still working on identifying the root cause - see https://lemmy.world/post/22196027 (which still needs updating to note that it's happening again).
aussie.zone has been about 6 weeks behind lemmy.world for a few weeks now i think, which at least means they're no longer losing activities, but it's still taking ages to reduce the lag.
i don't know what issue there might be with discuss.online right now, but for startrek.website the explanation is rather simple. as you can see in the sidebar, there are 0 local subscribers for the community. when there aren't any subscribers to a community on an instance, that instance will not receive any updates for the community; this includes new posts, comments, and votes.
startrek.website also had federation issues over the last weeks due to accidentally blocking lemmy instances in some situations.
lemdro.id has recently had some db performance issues that caused it to fall around 3 days behind lemmy.world; they've been slowly catching up again over the last few days.
they did leave before taking the screenshot
i still didn't understand what you were referring to, but now that i've looked at this comment thread on reddthat.com i can see that the other account that commented here is banned from lemmy.world: https://lemmy.world/modlog?userId=1250220
the justification for the ban is just "spam", which unfortunately doesn't provide much context, and i don't see anything immediately obvious that would justify it. especially considering that it's been a year since the ban, a permanent ban was likely not necessary. i've unbanned your reddthat account now.