agreyworld

joined 1 year ago
[–] agreyworld 21 points 1 year ago

Who could have predicted this!?!?? Why weren't we warned?

[–] agreyworld 20 points 1 year ago* (last edited 1 year ago) (1 children)

Does it not? It still does in my experience. Our company has these weird company-wide meetings where they tell us how they're doing great, everything's great, but because growth isn't "double digit" due to inflation, there's a pay freeze. (I'd love even a less-than-double-digit pay rise, even though that's still a pay cut with inflation.)

Pretty much exactly what you are saying. Employees are grumbling, but don't want to get laid off and are uncertain about the job market.

[–] agreyworld 3 points 1 year ago (2 children)

I love getting socks.

[–] agreyworld 3 points 1 year ago (1 children)

Building a garage so will be laying blocks. Takes bloody ages!

[–] agreyworld 4 points 1 year ago

I can see it from lemmy.world

[–] agreyworld 11 points 1 year ago (6 children)

13! That's crazy.

[–] agreyworld 4 points 1 year ago (1 children)

God, I don't miss exams every single year for 7 years straight

[–] agreyworld 3 points 1 year ago* (last edited 1 year ago)

Read openAI’s papers on chatGPT. They define it as a next token guesser and detail the exact hidden layer functions that accomplish this. It’s not an oversimplification, it’s what the creators of these AIs define their creations as.

I am aware of how LLMs work. I wasn't denying that it works through next token generation. My point is that saying it cannot accomplish anything general because it is predicting the probability of the next token is dumb. And referring to it as a "fancy autocomplete" to justify that is oversimplifying and overlooking its capabilities.
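For anyone unfamiliar with the "next token" framing being argued about here: a minimal toy sketch (nothing like a real LLM's hidden layers, just a bigram frequency table over a made-up corpus) of what "predicting the next token" means:

```python
from collections import Counter, defaultdict

# Toy corpus, purely illustrative.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows another (bigram counts).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_token(prev):
    """Return the most frequent word seen after `prev` in the corpus."""
    return follows[prev].most_common(1)[0][0]

print(next_token("the"))  # "cat" follows "the" most often in this corpus
```

The argument in the thread is about whether scaling this basic mechanism up (with deep networks instead of count tables) produces genuinely general behaviour, not about whether the mechanism itself is next-token prediction.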

Other properties may be there, but as you say they are emergent; emergent properties rely on two or more sources, if one goes away so do they.

What do you mean by this? What are you referring to as sources?

 

Don’t know if anyone is interested. I normally do videos on car restoration, but decided to film myself building a new garage.

Crosspost from https://lemmy.world/comment/213388

[–] agreyworld 4 points 1 year ago* (last edited 1 year ago) (2 children)

You can distil anything down to "just fancy X" by oversimplifying.

As much as people like to say "it's a fancy autocomplete", it's getting *much* more general, and displaying more and more emergent properties like reasoning.

https://openreview.net/pdf?id=yzkSU5zdwD

https://www.assemblyai.com/blog/emergent-abilities-of-large-language-models/

They are becoming more and more general. If you can't see that given all the pretty general tasks people have begun using them for...

[–] agreyworld 2 points 1 year ago (1 children)

I thought I was getting my head a little around the federated nature of lemmy and now you go telling me this!

Should I re-post it to a non-Beehaw DIY community, do you think?

 

Don't know if anyone is interested. I normally do videos on car restoration, but decided to film myself building a new garage.

[–] agreyworld 1 points 1 year ago (1 children)

God, I wish there were more "X of the week" type shows. I know Breaking Bad and a bunch of really great shows came out and made the golden age of serial, story-heavy TV happen, but man, I'm exhausted having to watch 20 hours of something to get any kind of plot resolution.

I miss the days where you could just watch 40 minutes of a show and have a whole story, start to finish. So much less exhausting not having to get super invested. I don't have the energy for serial TV shows these days!

[–] agreyworld 2 points 1 year ago

After a while I just found it too stressfully destructive to keep watching!
