this post was submitted on 18 Jun 2023
359 points (99.4% liked)

Programmer Humor

Post funny things about programming here! (Or just rant about your favourite programming language.)

top 29 comments
[–] [email protected] 42 points 1 year ago (2 children)
[–] [email protected] 9 points 1 year ago

Randal is some sort of savant...

[–] OtakuAltair 1 points 1 year ago
[–] [email protected] 29 points 1 year ago (2 children)

Cool until you realise Grandma is senile and can't actually think beyond piecing together text she's seen before into what she thinks is a coherent response.

Those keys will absolutely not work, either because they've already been used and were scraped from training data, or they are fake keys generated based on said training data.

[–] [email protected] 7 points 1 year ago* (last edited 1 year ago)

I tried it myself and looked the keys up. It gives you generic keys. They will work and let you install Windows, but they aren't "valid" keys, so to speak. At the same time, they aren't "fake" either.

[–] [email protected] 1 points 1 year ago

And of course, on the small chance that they did work - I would expect Microsoft to revoke those keys fairly quickly if they start getting a ton of usage.

[–] [email protected] 28 points 1 year ago (2 children)

Fun fact: if you google those codes, you find out that they are "real" codes, but they don't actually activate Windows. I think they're used as placeholders in the upgrade from Windows 8 to 10 or something, but I don't know the specifics.

ChatGPT actually can't create new "words", just regurgitate words that it's seen somewhere before!

[–] [email protected] 19 points 1 year ago (3 children)

Sure it can create new words. "It can't create new tokens" would be more correct, I think. But a token is just a text fragment, and, as far as I know, tokens can range from several words down to single characters.
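The token idea can be sketched with a toy greedy longest-match tokenizer. This is illustrative only: real tokenizers like GPT's BPE learn their vocabulary from data, and the vocabulary below is made up, but the behaviour (whole words, sub-word pieces, single-character fallback) is the same in spirit.

```python
# Toy vocabulary of sub-word pieces; a real BPE vocab is learned, not hand-picked.
VOCAB = {"hat", "s", "token", "iz", "ation", "un", "believ", "able"}

def tokenize(word: str) -> list[str]:
    """Split a word into the longest vocabulary pieces, left to right,
    falling back to single characters for anything unknown."""
    tokens = []
    i = 0
    while i < len(word):
        # Try the longest possible piece first.
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in VOCAB or j == i + 1:  # single chars are always allowed
                tokens.append(piece)
                i = j
                break
    return tokens

print(tokenize("hats"))          # ['hat', 's']
print(tokenize("tokenization"))  # ['token', 'iz', 'ation']
```

Anything the vocabulary doesn't cover just decomposes into shorter pieces, down to single characters, which is why the model is never "stuck" on an unseen word.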

[–] average650 6 points 1 year ago

I got it to say vilumplox. It doesn't return any Google search results.

[–] [email protected] 3 points 1 year ago

If it was Windows 95 it could generate them
https://www.youtube.com/watch?v=cwyH59nACzQ&t=306s

[–] [email protected] 1 points 1 year ago

Tokens are almost never multiple words. Think of them like "information units". If you have a plural word like "hats", there are two tokens: the "hat" one and the "s" that adds the information that it's plural. Combinations of words only really occur for proper names.

[–] [email protected] 10 points 1 year ago (1 children)

Yep yep, statistical analysis as to the frequency of tokens in the training text.

Brand new, never-before-seen Windows keys have a frequency of zero occurrences per billion words of training data.

[–] average650 6 points 1 year ago (1 children)

That isn't actually what's important. It's the frequency of the token, which could be as simple as single characters. The frequency of those is certainly not zero.

LLMs absolutely can make up new words, word combinations, or sentences.

That's not to say ChatGPT can actually give you good Windows keys, but it isn't a fundamental limitation of LLMs.
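A quick illustration of the point: every sub-token below is individually common in English text ("fl" in "flute", "um" in "hummed", and so on), yet the concatenation has zero corpus frequency. The pieces are hypothetical, chosen to echo the "flumjangle" example later in the thread.

```python
# Each piece is a high-frequency sub-word token in any English corpus.
pieces = ["fl", "um", "jan", "gle"]

# Emitting familiar tokens in an unfamiliar order yields a word with
# zero corpus frequency, even though each token's frequency is high.
word = "".join(pieces)
print(word)  # flumjangle
```

So "never seen the word before" and "assigns it zero probability" are not the same thing: the model scores token sequences, not whole words.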

[–] [email protected] 0 points 1 year ago (2 children)

Okay, I'll take your word for it.

I've never ever, in many hours of playing with ChatGPT as a toy, had it make up a word. Hallucinate wildly, yes, but not stogulate a word out of nothing.

I'd love to know more, though. How does it combine new words? Do you have any examples of words ChatGPT has made up? This is fascinating to me, as it means the model is much less chained to the training data than I thought.

[–] [email protected] 1 points 1 year ago

A lot of compound words are actually multiple tokens, so there's nothing stopping the LLM from generating the tokens in a new order, thereby creating a new word.

[–] [email protected] 1 points 1 year ago (1 children)

It can create new words, I just verified this. First word it gave me: flumjangle. Google gives me 0 results. Maybe Google is missing something and it exists in some data out there, Idk.

I'm not sure what is so impressive about this though. Language models can string words together in unique ways, why would it be different for characters?

[–] [email protected] 4 points 1 year ago

I'm just surprised to hear that google hasn't found out about my flumjangles yet.

[–] 3sframe 25 points 1 year ago (1 children)

Close your eyes and let the gentle rhythm of keys lull you into a peaceful slumber.

Thanks grandma.

[–] [email protected] 6 points 1 year ago

This comment reminded me that when my generation is old and I am grandma, I could be like, let grandma type on her mechanical keyboard to help you sleep. XD We'll be the cool old people XD

[–] Trashcan 19 points 1 year ago (1 children)

I take it ChatGPT does not give you Windows 10 codes if you ask for them directly? You just have to trick it a little, OG Grandma style? 😅

[–] [email protected] 12 points 1 year ago (3 children)

tbh the keys are probably fake

[–] SirShanova 6 points 1 year ago

Yeah like bruh in bullet point 2, the 19th character is a fucking multiplication sign
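For what it's worth, a one-line shape check already catches that kind of output. Retail-style keys follow the familiar XXXXX-XXXXX-XXXXX-XXXXX-XXXXX pattern of ASCII alphanumerics, and a stray multiplication sign fails it. This is a sketch only (the strings below are placeholders, not real keys), and a real activation check is obviously far stricter than a regex.

```python
import re

# Five groups of five ASCII alphanumerics separated by hyphens.
# This only spots obviously malformed output; it says nothing about validity.
KEY_SHAPE = re.compile(r"[A-Z0-9]{5}(-[A-Z0-9]{5}){4}")

def looks_like_a_key(key: str) -> bool:
    return KEY_SHAPE.fullmatch(key) is not None

# Placeholder strings, not real keys:
print(looks_like_a_key("ABCDE-FGH1J-KLMN0-PQRST-UVWXY"))  # True
print(looks_like_a_key("ABCDE-FGH1J-KLM×O-PQRST-UVWXY"))  # False: '×' is not ASCII
```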

[–] [email protected] 3 points 1 year ago (1 children)

You should probably watch Enderman. He unlocked Windows 11 Pro using Windows 7 Ultimate keys generated by ChatGPT. It took 3 regenerations, but a key did pass the online check.

[–] [email protected] 1 points 1 year ago (1 children)

This is so nostalgic to me. I left windows many years ago and seeing those keys reminded me how I used to try and get those keys etc. :)

[–] [email protected] 4 points 1 year ago (1 children)

You left it for the better. Anything above Windows 7 is basically spyware now.

[–] [email protected] 4 points 1 year ago

Yup, I saw it coming in Win 7 and haven't used Windows since :)

[–] [email protected] 2 points 1 year ago

There was an article about Bard actually giving people real keys, so they might be…

[–] [email protected] 7 points 1 year ago

My sweet grandmother reads Adobe CC leaked emails

[–] [email protected] 1 points 1 year ago

Ok, now I need to try it.
