this post was submitted on 20 Jan 2024
631 points (99.2% liked)

196

[–] [email protected] 61 points 5 months ago* (last edited 5 months ago) (2 children)

I'm absolutely sure the ship's computer not only knows contextual hotness but has definitions for every crew member. So Picard may like his tea hot at 82 °C while La Forge likes his at 70 °C (possibly because he's drinking green, not black).

That said, Geordi La Forge routinely struggles to tame the ship's computer to get what he wants. So it may also give him 95 °C chamomile just to mess with him.

[–] CluckN 15 points 5 months ago (1 children)

Maybe it’s a bug where hot is a global variable?

[–] [email protected] 23 points 5 months ago* (last edited 5 months ago)

Or, it actually knows the correct context but has discovered plausible deniability. Picard has a history of being mean to the computer, after all.

[–] [email protected] 6 points 5 months ago (3 children)

I'm not a Star Trek nerd but a tea nerd, and if I'm not mistaken Picard drinks Earl Grey. You generally boil black tea; of course it depends on the tea, but the 80 °C range is quite low for black. Depending on the green and the brew time, the temperature can be anywhere from room temp to 90 °C. It just depends on many different factors, like freshness, how the tea plant is grown, and how the leaves are treated. Generally with Japanese greens you use low-temperature water, while with fresh Chinese green teas you can use near-boiling water.

[–] [email protected] 4 points 5 months ago (1 children)

As a rule of thumb, Westerners tend to brew tea too hot; don't be afraid of messing around with lower temperatures. Doubly so if you're living in the lowlands: in mountainous regions where tea grows, people might be using boiling water, but that doesn't mean 100 °C. In the Andes, where potatoes are from, they prepare them with freezing and other processing instead of boiling, which wouldn't really work because you can't get water hotter than 80-85 °C there.

Also, cold-brewed (as in refrigerator-brewed) Earl Grey is one of my favourites in summer. Needs the right base tea, though; mine's a decent Ceylon. Couple of hours at least, better overnight; practically impossible to steep it too long.

[–] [email protected] 1 points 5 months ago* (last edited 5 months ago)

I know that cold brew is a thing for black teas; it's just that black cold brews take a while, compared to gyokuro, which you can brew at room temp in under a minute or so if you're using a higher ratio of tea to water than in Western brewing.

But yes, like I mentioned, you can do green tea near boiling; it just depends heavily on where it's from, how it's grown, how it's treated, and how fresh it is. The less fresh a green tea is, the colder the water you should be using.

[–] [email protected] 1 points 5 months ago

90 times the speed of light? Holy moly, that's fast for tea.

[–] [email protected] 1 points 5 months ago* (last edited 5 months ago)

Yes. I was actually borrowing from the Starbucks standard: black teas are steeped at or near boiling, but then cooled to 80 °C when served, and the TNG-era replicator seems smart enough to create a cup of steeped tea at drinking temperature. Though yes, when someone orders a pot, it's water heated to steeping temperature.

ETA I didn't know the difference between Chinese and Japanese green teas! TIL!

[–] [email protected] 35 points 5 months ago

Maybe that shouldn't have been a global variable.

[–] [email protected] 21 points 5 months ago (1 children)

Now this is a top tier meme!

[–] SatansMaggotyCumFart 7 points 5 months ago

It’s hot.

[–] [email protected] 21 points 5 months ago (1 children)

This is fantastic except I've got a fractured rib and the mild chuckle fucking hurt.

[–] thawed_caveman 5 points 5 months ago (1 children)

What does a single cell say when you step on its toes?

spoiler: Mi-toes-sis

[–] [email protected] 2 points 5 months ago

Weaponized humor!

[–] Randelung 12 points 5 months ago

I was in a course the other day where some dude said something about "older" apprentices being more enthusiastic. Upon the request to define "older" he said 20.

Some time later some other dude was telling a story of his own and about how they got a CV from an older gentleman just a few years from retirement. I said "oh, so he's older than 20?"

(and then everyone clapped /s)

[–] [email protected] 10 points 5 months ago

The ship computer is a neural-net processor; a learning computer.

[–] [email protected] 10 points 5 months ago

This genuinely made me burst into laughter. Well done 196er.

[–] [email protected] 10 points 5 months ago

Lmao at all the nerds in here like "excuse me sir, this meme is computalogically incorrect"

[–] StephniBefni 8 points 5 months ago

The first few times he asked for tea he did state the temperature.

[–] [email protected] 7 points 5 months ago* (last edited 5 months ago) (3 children)

Correct me if I am wrong here, but isn't this like the best example of why the current "AI" isn't taking over anything anytime soon, or shouldn't be doing critical stuff?
Like, this is almost exactly how current LLMs work.

Edit: yeah no, I was wrong on the internet! Was sleepy, and I think I imagined that the secondary scenario never occurred in the training dataset, requiring a true deduction...?

[–] HandMadeArtisanRobot 16 points 5 months ago (1 children)

Yeah, you are wrong. This has nothing to do with LLMs or how AI today works. What was it that led you to that conclusion?

[–] [email protected] 2 points 5 months ago (1 children)

I've edited the comment with some extra context, but I'd still rather say yesterday-me was just high and act like this never happened, haha.

[–] HandMadeArtisanRobot 1 points 5 months ago

No worries!

[–] [email protected] 7 points 5 months ago

It's not how LLMs work, though; an LLM would know the difference between these scenarios from the context given. I'd go as far as to say it isn't even ML-related; it's just a joke about defining a global variable and using it blindly everywhere.

[–] [email protected] 5 points 5 months ago* (last edited 5 months ago)

No. LLMs have context and know that words have context. This would be the exact opposite of "AI". This is analogous to defining a global variable "hot" as 1.9M kelvin and then blindly using it everywhere the word "hot" appears.
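The analogy could be sketched like this (a toy illustration only; the names, items, and temperatures are made up for the example, not from the show or the thread):

```python
# The joke's bug: one global "hot" for everything.
HOT = 1_900_000  # kelvin, per the analogy above

def replicate_buggy(item: str) -> str:
    """Use the global HOT blindly, whatever the item is."""
    return f"{item} at {HOT} degrees"

# What a context-aware system does instead: look "hot" up per item.
HOT_BY_CONTEXT = {"tea": 80, "frying oil": 175, "stove": 250}

def replicate_contextual(item: str) -> str:
    """Resolve 'hot' from context, falling back to room temperature."""
    temp = HOT_BY_CONTEXT.get(item, 20)
    return f"{item} at {temp} degrees"

print(replicate_buggy("tea"))       # tea at 1900000 degrees
print(replicate_contextual("tea"))  # tea at 80 degrees
```

The replicator in the meme behaves like `replicate_buggy`; the comments below argue that an LLM behaves more like `replicate_contextual`.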

AI, even current iterations, know that a hot stove will be hotter than hot tea. And they’re both less than the hot that is the surface of the sun.

The whole achievement of LLMs is that they learn all of that context: to guess, with some percentage of certainty, that when you're talking about hot while talking about tea, you mean 160-180 degrees or whatever; when talking about hot oil, it might be 350 degrees if you're frying, or 250 degrees if you're talking about cars. And if you're talking about people, hot means attractive.

That's exactly what LLMs do today. Not 100% perfectly; there are errors and hallucinations and whatever else, but those are the exception, not the norm.

[–] [email protected] 6 points 5 months ago

As a computer engineer, I can say this is completely incorrect.