So, she was pronounced dead at the end of the day on Tuesday; that's still a day and a half of normal working hours where no one noticed. The fact that it happened over a weekend makes it less bad, but it still means cleaning, security, her supervisor, and her coworkers all went a full working day around her body without any interaction with her. I think it's fair to say a failure happened here, and that they should be reevaluating some procedures.
GarrulousBrevity
You know some dipshit in upper management saw the report that she hadn't badged in for two workdays (the article says her last badge in was on a Friday), and was going to discipline her before checking on her work, or checking on her.
Pretty sure you just attacked someone for agreeing with you.
The comparison to COVID is interesting. If people can smell the smoke, they're literally breathing in the air the smoker is breathing out, and you can definitely smell smoke from the next table over. Maybe that's part of why lockdown didn't work as well as we hoped.
Also, if you can smell it, it's doing harm. Second-hand and third-hand smoke on your clothing are still real in outdoor spaces, and people should have the freedom to avoid that harm.
Sorry, I was referring to your comment; I'm not sure what you read.
That's incredibly subjective, and not true for many. A lot of people can tell if you've been smoking outside
Oh, no, that wasn't excusing Meta in general. I'm just giving them a pass because, to my knowledge, they have a history of respecting robots.txt, which makes this piece of software better than outright malware. Starting it secretly, without giving site hosts a chance to make sure they had their privacy configured the way they liked first, was a shady-as-hell move; no argument there.
Over 463,000 voters on the suspense list
So, about half a million valid voters
I think of this as a problem with opt-in-only systems. Think of how sites ask you to opt in to allow tracking cookies every goddamn time a page loads. A rule-based system like robots.txt, one that lets you opt in and opt out, so you could decline cookie requests once and tell all sites to fuck off, would be great.

@[email protected] is complaining about malicious crawlers that ignore those rules (assuming they're right and the rules are set up correctly), and lumping that malware in with software made by established corporations. However, Meta and the other big tech companies don't have a history of ignoring configurations like robots.txt. What they have had a problem with is using the data they scrape in ways different from what they claimed, or scraping data from a site that doesn't allow scraping by coming at it via a URL on a page they legitimately scraped. But that's not the kind of shenanigans this article is about, since Meta is being pretty upfront about what they're doing with the data. At least after they announced it existed.
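For anyone who hasn't poked at one, robots.txt rules are just per-crawler allow/deny lines, so a site can opt a specific bot out of part of the site while leaving everyone else alone. A made-up sketch (the bot name and path are placeholders; an empty Disallow means that agent may fetch everything):

# ExampleBot and /private/ are placeholders, not real values
User-agent: ExampleBot
Disallow: /private/

User-agent: *
Disallow:

Each crawler is supposed to follow the most specific User-agent group that matches it and ignore the rest.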
An opt-in-only solution would just lead to a world where all hosts were constantly bombarded with requests to opt in. My major takeaway from how Meta handled this is that you should configure any site you own to disallow any action from bots you don't recognize. As much as reddit can fuck off, I don't disagree with their move to change their configuration to:
User-agent: *
Disallow: /
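If you still want recognized crawlers, like search engines, to get in, the same approach works with per-bot exceptions; a sketch, with Googlebot standing in for whatever crawlers you actually trust, and an empty Disallow meaning that agent may fetch everything:

# Googlebot is just an example of a bot you might allowlist
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /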
I know what you're trying to say, but that phrasing though. Being able to opt out is an important part of consent. No means no, man.
But Meta's will, and AltaVista's. I'm not angry at them when a script kiddie makes a bad crawler.
If those ~25 Lemmy users could read, they'd be very upset.