Why do so many companies and people say that your password has to be so long and complicated, just to have restrictions?

I am in the process of changing some passwords (I have been pwned, and it's the password I use for use-less-er sites) and suddenly they say “password may contain a maximum of 15 characters”… I mean, 15 is long, but it's nothing for a password manager.

And then there's the problem with special characters like äàáâæãåā ñ ī o ė ß ÿ ç, just to name a few, and some sites won't even let you type a [space]. Why is that? Is it bad programming? Or just a symptom of copy-pasta?

[–] [email protected] 68 points 1 year ago (3 children)

Is it bad programming?

With very few exceptions, yes. There should be no restrictions on the characters used or the length of a password (within reason) if you're storing passwords correctly.
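
For illustration, "storing passwords correctly" means hashing with a salted, slow KDF, roughly like this minimal sketch using Python's standard-library hashlib.scrypt (the cost parameters are illustrative, not a tuned recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Any Unicode password works; the stored digest is fixed-size either way."""
    salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.scrypt(
        password.encode("utf-8"),      # spaces, ä, ß, emoji -- all just bytes
        salt=salt, n=2**14, r=8, p=1,  # illustrative cost parameters
    )
    return salt, digest  # store both; the password itself is never stored

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(
        password.encode("utf-8"), salt=salt, n=2**14, r=8, p=1
    )
    return hmac.compare_digest(candidate, digest)  # constant-time compare
```

Since the stored digest is a fixed size no matter the input, nothing about the password's length or character set ever reaches the storage layer.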

[–] 0xSHODAN 41 points 1 year ago

And if a site does have such restrictions, it could be an indication that they store passwords in plaintext, rather than hashed

[–] [email protected] 20 points 1 year ago* (last edited 1 year ago) (3 children)

A very high max of something like 500 characters, just to make sure you don't get DoSed by folks hitting your endpoint with huge payloads, is about the most I would expect in terms of length restrictions. I'm not a security expert or anything though.
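
Concretely, something like this hypothetical guard (Python sketch, names made up) is about all the validation a password field should need:

```python
# Hypothetical guard: the only restriction a password field should need
MAX_PASSWORD_CHARS = 500  # generous cap, far beyond any sane passphrase

def validate_password(password: str) -> None:
    if len(password) > MAX_PASSWORD_CHARS:
        # reject before any expensive hashing work happens
        raise ValueError("password too long")
    # no character-set or composition rules needed beyond this
```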

[–] [email protected] 11 points 1 year ago* (last edited 1 year ago) (2 children)

That's a misunderstanding of DDoS. Zero-byte packets are actually worse than large packets.

Which is why most DDoS (at least historically) has been extremely slow zero-byte requests, kept up until the server throttles or crashes under the sheer number of requests.

E: Consider this: are you more likely to throttle a bandwidth of terabytes/petabytes with a couple million 1 GB requests, or to break it entirely by sending >4,294,967,295 zero-byte requests that effectively never stop being requested from the server?

[–] [email protected] 8 points 1 year ago (1 children)

It depends on what the DoS is targeting. If hashing is done with an expensive hash function, you can absolutely cause a lot of resource usage (CPU or memory, depending on the hash) by sending long passwords. That said, this likely isn't a huge concern, because only the first round needs to process the whole submitted input; later rounds only work on the previous round's output.

Simple empty requests or connection-opening attempts are likely to be stopped by edge services such as a CDN and a fleet of caches, which are often over-provisioned. A targeted DoS attack may find more success by crafting requests that make it through this layer and hit something that isn't so over-provisioned.

So yes, many DoS attacks are request or bandwidth floods, but that's because they are generic attacks that work on many targets. It doesn't mean that all DoS attacks work this way. The best attacks target specific weaknesses in the target rather than relying on pure brute-force floods.

[–] [email protected] 0 points 1 year ago (1 children)

Well, to be fair, if they're hashing server-side, they were doomed to begin with.

But yeah, there are a lot of ways to DDoS, and so many tools that make it a one-button click.

[–] [email protected] 2 points 1 year ago (1 children)

Who isn't hashing server-side? That just turns the hash into the password, which negates a lot of the benefits. (You can do split hashing, but that doesn't remove the need to hash server-side.)

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago) (2 children)

Hashing on the client side is both more private and more secure. All the user ever submits is a combined hash (auth/pubkey) of their username + password.

If the server has that hash? Check the DB for whether it requires 2FA and whether the user sent a challenge response. If not, fail the login.

Registering is pretty much the same: the user submits the hash, the server checks the DB against it, and fails if it already exists.

Edit: If the data is also encrypted properly in the DB, it doesn't even matter if the entire DB is completely public, leaked, or secured on their own servers.

[–] [email protected] 2 points 1 year ago

This means that the submitted hash is effectively a password. You get a minor benefit in that it obscures the original password in case it contains sensitive info or is reused. But the DB is now storing that hash, the effective password, in plaintext. This means that if the DB leaks, anyone can log in just by sending the hash.

If you want to do something like this, you would need some sort of challenge to prevent replay attacks (rough sketch below).

This scheme would also benefit from some salt, although the included username does act as a form of weak salt.
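
For example, a nonce-based challenge could look something like this rough sketch (hypothetical, not a standard protocol like SCRAM; note the server still stores a directly usable verifier, so this only stops replay of sniffed traffic, not a DB leak):

```python
import hashlib
import hmac
import os

# Server: issue a fresh, single-use nonce for each login attempt
def issue_nonce() -> bytes:
    return os.urandom(32)  # must expire quickly and never be reused

# Client: prove knowledge of client_hash without sending it verbatim
def make_proof(client_hash: bytes, nonce: bytes) -> bytes:
    return hmac.new(client_hash, nonce, hashlib.sha256).digest()

# Server: recompute the proof from the stored verifier and compare
def check_proof(stored_client_hash: bytes, nonce: bytes, proof: bytes) -> bool:
    expected = hmac.new(stored_client_hash, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, proof)  # a replayed old proof fails
```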

[–] [email protected] 1 points 1 year ago

Per your edit, the DB being "encrypted properly" just means "hashing server side". There's little benefit (though not necessarily zero) to encrypting the entire database, since the key has to live in plaintext somewhere on the same system. It's also making the slowest part of most systems even slower.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

Very true and a good explanation of DDoS, but I was talking about DoS generally, not specifically DDoS. In my (admittedly pretty limited) experience, a single mega-request that isn't blocked or rejected by your server can cause it to choke. If you don't have sufficient redundancy, or if several of these requests come through, it can take down some of your backend services.

It's a good point though; there are lots of different attack vectors, each fun in their own way, that you need to watch out for.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (1 children)

Right, that's why I put the "within reason" in my comment. You still need to guard against malicious inputs, so ultimately there is some max length limit, but it should be way beyond any reasonable password length.

[–] [email protected] 7 points 1 year ago

My password is the bee movie script

[–] eu8 4 points 1 year ago

The best way to handle passwords, IMO, is to have the browser compute a quick hash of the password, and then have the server hash that. That way the "password" being sent to the server is always the same length.
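
Something like this sketch of that idea (hypothetical; the client step is written in Python for brevity, though in practice it would be JavaScript in the browser, e.g. via SubtleCrypto):

```python
import hashlib
import bcrypt  # pip install bcrypt

# "Browser" step: normalize any password to a fixed-length, NUL-free value
def client_prehash(password: str) -> bytes:
    digest = hashlib.sha256(password.encode("utf-8")).hexdigest()
    return digest.encode("ascii")  # 64 hex bytes: fixed length, ASCII-safe

# Server step: treat the prehash as the password and hash it properly
def server_hash(prehash: bytes) -> bytes:
    return bcrypt.hashpw(prehash, bcrypt.gensalt())

def server_verify(prehash: bytes, stored: bytes) -> bool:
    return bcrypt.checkpw(prehash, stored)
```

Hex-encoding the digest keeps it ASCII-safe and, at 64 bytes, under the bcrypt limit discussed below.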

[–] [email protected] 0 points 1 year ago (1 children)

Underappreciated fact: bcrypt has a maximum of 72 bytes. It'll truncate passwords longer than that. Remember that the UTF-8 encoding of special characters can easily take more than one byte per character.

That said, this is rarely a problem in practice, except for some very long passphrases.
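
Easy to see with the pyca/bcrypt package, which (at least in the versions I've used) truncates silently rather than raising an error:

```python
import bcrypt

pw = b"x" * 72  # exactly at bcrypt's 72-byte limit
hashed = bcrypt.hashpw(pw, bcrypt.gensalt())

# Everything past byte 72 is ignored, so this "wrong" password verifies too:
print(bcrypt.checkpw(pw + b"this suffix is never checked", hashed))  # True
```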

[–] [email protected] 0 points 1 year ago (1 children)

Bcrypt has a maximum of 72 bytes. It'll truncate passwords longer than that. Remember that the UTF-8 encoding of special characters can easily take more than one byte per character.

Interesting: https://en.wikipedia.org/wiki/Bcrypt#Maximum_password_length

In the worst case a password is limited to 18 characters, when every character requires 4 bytes of UTF-8 encoding. For example:

𐑜𐑝𐑟𐑥𐑷𐑻𐑽𐑾𐑿𐑿𐑰𐑩𐑛𐑙𐑘𐑙𐑒𐑔 (18 characters, 72 bytes)

Makes me question whether bcrypt deserves to be widely used. Is there really no superior alternative?

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Not only that: bcrypt can be run on GPUs and FPGAs, which makes it more prone to brute-force attacks.

There are two modern alternatives: scrypt and argon2. Both require a substantial amount of memory, so GPU and dedicated-hardware computation is no longer feasible.
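
For what it's worth, argon2 is straightforward to adopt via the argon2-cffi package (a minimal sketch; the parameters are illustrative, not a tuned recommendation):

```python
from argon2 import PasswordHasher  # pip install argon2-cffi
from argon2.exceptions import VerifyMismatchError

# The memory-hard parameters are what make GPU/FPGA brute force expensive
ph = PasswordHasher(time_cost=3, memory_cost=64 * 1024, parallelism=4)  # 64 MiB

stored = ph.hash("correct horse battery staple äöü")  # salt handled internally

try:
    ph.verify(stored, "correct horse battery staple äöü")
    print("login ok")
except VerifyMismatchError:
    print("wrong password")
```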