As far as I know, the DeepMind paper was actually a challenge to the OpenAI paper, suggesting that large models are undertrained and therefore underperform for the compute they burn. They tested a 70B-parameter model and, by training it on more data, outperformed much larger models while using less compute (roughly the trade-off sketched below). I don't think any general conclusion about a hard ceiling on LLM performance can be drawn from this.
However, this doesn't change the fact that there are areas, ones that rely on correctness, where this kind of model simply can't replace people, and trying to make it do so is a foolish pursuit.
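For a rough sense of that compute/training trade-off, here's a back-of-the-envelope sketch. It assumes the common approximations C ≈ 6·N·D for training FLOPs and ~20 tokens per parameter for compute-optimal training; the 280B/300B-token and 70B figures are only rough, Gopher/Chinchilla-like numbers, not exact values from either paper.

```python
# Back-of-the-envelope sketch of the compute/training trade-off described above.
# Assumes the common approximations C ≈ 6 * N * D (training FLOPs) and
# D_opt ≈ 20 * N (compute-optimal tokens); both are rules of thumb, not the
# papers' exact fitted constants.

def training_flops(params: float, tokens: float) -> float:
    """Approximate training compute in FLOPs: C ≈ 6 * N * D."""
    return 6.0 * params * tokens

def compute_optimal_tokens(params: float) -> float:
    """Rule-of-thumb compute-optimal token count: roughly 20 tokens per parameter."""
    return 20.0 * params

# A 280B-parameter model trained on 300B tokens (roughly Gopher-like numbers).
budget = training_flops(280e9, 300e9)          # ~5.0e23 FLOPs

# For the same budget, a 70B-parameter model can see far more tokens.
tokens_70b = budget / (6.0 * 70e9)             # ~1.2e12 tokens
print(f"70B model gets ~{tokens_70b:.2e} tokens for the same compute")
print(f"compute-optimal tokens for a 70B model: ~{compute_optimal_tokens(70e9):.2e}")
```

The point is just that, for a fixed budget, a smaller model trained on far more tokens can come out ahead, which cuts against reading a hard ceiling into the scaling results.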
Just scan and simulate an actual human brain at 100x speed and gg
Human hardware is pretty impressive; we might need to move on from binary computers to emulate it efficiently.
What do you mean by "might need to move on from binary computers to emulate it efficiently"?
Neurons produce multiple types of neurotransmitters. That means they can have an effective state different from just on or off.
I'm not suggesting we resurrect analogue computers, per se, but I think we need something with a little more complexity as a good middle ground. It could even be something as simple as binary with conditional memory, maybe. Idk. I see the problem, not the solution.
I'm also not saying you can't emulate it with binary, but I am saying it isn't as efficient.
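To make that last point concrete, here's a toy sketch of encoding a multi-transmitter neuron state in binary. Everything in it (the transmitter list, the 8-bit quantization) is made up for illustration; it just shows that binary can represent richer-than-on/off states, at the cost of more bits and more work per neuron.

```python
# Toy illustration of the point above: a neuron influenced by several
# neurotransmitters isn't just "on or off", but its state can still be
# represented in binary -- it just takes more bits per neuron.
# The transmitter names and 8-bit quantization are made up for illustration.

from dataclasses import dataclass

@dataclass
class NeuronState:
    # Hypothetical per-transmitter levels, each quantized to 8 bits (0-255).
    dopamine: int = 0
    serotonin: int = 0
    glutamate: int = 0
    gaba: int = 0

    def pack(self) -> int:
        """Pack the four 8-bit levels into one 32-bit integer."""
        return (self.dopamine << 24) | (self.serotonin << 16) | \
               (self.glutamate << 8) | self.gaba

state = NeuronState(dopamine=200, serotonin=15, glutamate=90, gaba=4)
print(f"binary-encoded state: {state.pack():032b}")  # 32 bits for one neuron,
                                                     # vs. 1 bit for plain on/off
```

A real simulation would need far more state than this per neuron, which is the efficiency gap being pointed at.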