this post was submitted on 19 Aug 2024
186 points (97.4% liked)

politics
[–] Limonene 32 points 1 month ago (5 children)

This is stupid. Their justification is an "unusual degree of vulnerabilities."

So why not outlaw vulnerabilities? Impose real fines or jail time, or at the very least a civil liability that can't be waived by EULA. Better than an unconstitutional bill of attainder.

[–] [email protected] 46 points 1 month ago (1 children)

So why not outlaw vulnerabilities?

Of course! If we make vulnerabilities illegal, then all the programmers will make perfect software! The solution was so easy!

[–] scrion 17 points 1 month ago

There is definitely a difference in quality when talking about imported software.

Also, "outlawing vulnerabilities" would not mean to simply assume everyone starts making perfectly secure software, but rather that you're fined if you can't prove your processes are up to spec and you adhered to best practices during development. Additionally, vendors are obliged to maintain their software and keep it secure.

And surprise, surprise, the EU ratified laws that do exactly that (and more) recently. In fact, they'll be in effect very soon:

https://en.m.wikipedia.org/wiki/Cyber_Resilience_Act

[–] Manifish_Destiny 16 points 1 month ago* (last edited 3 days ago) (3 children)

Outlaw vulnerabilities? Do they just get little virtual handcuffs when they're found? If I find a Microsoft vulnerability I get arrested? Not sure I'm following this one.

Edit: it's really obvious most of you haven't worked in infosec.

[–] [email protected] 13 points 1 month ago (1 children)

When WannaCry was a major threat to cybersecurity, shutting down banks and hospitals, it was found that it used a backdoor Microsoft intentionally kept open for governments to use.

https://en.wikipedia.org/wiki/WannaCry_ransomware_attack

EternalBlue is an exploit of Microsoft's implementation of their Server Message Block (SMB) protocol released by The Shadow Brokers. Much of the attention and comment around the event was occasioned by the fact that the U.S. National Security Agency (NSA) (from whom the exploit was likely stolen) had already discovered the vulnerability, but used it to create an exploit for its own offensive work, rather than report it to Microsoft.[15][16]

https://en.wikipedia.org/wiki/EternalBlue

EternalBlue[5] is a computer exploit software developed by the U.S. National Security Agency (NSA).[6] It is based on a vulnerability in Microsoft Windows that allowed users to gain access to any number of computers connected to a network. The NSA knew about this vulnerability but did not disclose it to Microsoft for several years, since they planned to use it as a defense mechanism against cyber attacks.

In real life, if I knowingly fail to prevent a crime I am aware was premeditated, I am guilty of failing in my duty. Corporations are people thanks to Citizens United, and governments are run by people, so hold them to the same standards they subject the populace to.

[–] [email protected] 6 points 4 weeks ago

Well. Your sources don't say Microsoft kept it. They say NSA didn't report it to Microsoft so that they would be able to keep using it.

[–] Limonene 3 points 1 month ago

If you are Microsoft, then yeah. You'd go to jail when a Windows vulnerability is found.

In all seriousness though: it would be more likely to be just a civil penalty, or a fine. If we did want corporate jail sentences, there are a few ways to do it. These are not specific to my proposal about software vulnerabilities being crimes; it's about corporate accountability in general.

First, a corporation could have a central person in charge of ethical decisions. They would go to prison when the corporation was convicted of a jailable offense. They would be entitled to know all the goings on in the company, and hit the emergency stop button for absolutely anything whenever they saw a legal problem. This is obviously a huge change in how things work, and not something that could be implemented any time soon in the US because of how much Congress loves corporations, and because of how many crimes a company commits on a daily basis.

Second, a corporation could be "jailed" for X days by fining them X/365 of their annual profit. This calculation would need to counter clever accounting tricks. For example some companies (like Amazon, I've heard) never pay dividends, and might list their profit as zero because they reinvest all the profit into expanding the company. So the criminal fine would take into account some types of expenditures.
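The X/365 fine described above can be sketched as a quick calculation. Everything here is illustrative: the function name, the idea of adding reinvested expenditures back into "effective profit" to counter the zero-profit accounting trick, and the figures in the example are all assumptions, not anything from an actual statute.

```python
def corporate_jail_fine(days_jailed: int, reported_profit: float,
                        reinvested_expenditures: float) -> float:
    """Fine equivalent to 'jailing' a corporation for some number of days.

    Hypothetical sketch: effective annual profit is the reported profit
    plus expenditures that merely expand the company (countering the
    zero-profit reinvestment trick), scaled by days_jailed / 365.
    """
    effective_profit = reported_profit + reinvested_expenditures
    return effective_profit * days_jailed / 365

# Example: a 30-day "sentence" for a company that reports $0 profit
# after reinvesting $36.5B into expansion:
fine = corporate_jail_fine(30, 0.0, 36.5e9)  # -> 3.0e9, i.e. $3B
```

The add-back term is the whole point: without it, a company that reinvests everything would report zero profit and pay a zero fine.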

[–] pennomi 2 points 1 month ago (1 children)

Presumably, once exploited, a vulnerability becomes an offense that the DOJ can fine the company for. I think that’s quite reasonable.

[–] [email protected] 4 points 1 month ago (1 children)

I'd go further: an unpatched vulnerability is an offense the DOJ can fine the company for.

[–] pennomi 2 points 1 month ago

Sounds fair enough to me.

[–] [email protected] 11 points 1 month ago* (last edited 1 month ago)

Because the NSA, CIA, and FBI love them. Vault 7, Magic Lantern, Intel ME and AMD PSP, Dual elliptic curve, COTTONMOUTH-I, ANT/TAO catalog, etc.

Hell, Microsoft willingly reports vulnerabilities and exploits to the government for them to use.

North Korea wishes it had this level of control on the goods its citizens willingly buy.

[–] [email protected] 6 points 1 month ago

Then you'd have to also go after Cisco.

[–] [email protected] 1 points 1 month ago

Why not?

Well…

It discourages self-reporting, makes vendors hostile to security researchers, opens the door to endless litigation over whose component actually “caused” a vulnerability, and encourages CYA culture (like following a third-party spec you know is bad rather than writing a good first-party one, because it guarantees the blame will fall on another party).

In a complex system with tight coupling, failure is normal, so you want to have a good way to monitor and remedy failure rather than trying to prevent 100% of it. The last thing you wanna do is encourage people to be hostile to failure-monitoring.

(See also: Normal Accident theory)