My suggestion has always been universal sidereal time. It is singular, doesn’t change, and carries no colonial baggage since it rotates around the whole earth. Even suitable as a home time if we become spacefaring.
theherk
Wait, so you’re saying the trip to the national anthropology museum was for work? Maybe.
You won’t catch me defending Apple’s repairability, but your claim that they often need repair is completely untrue in my experience and that of the wider user base. In fact their durability and longevity are very good. There are people still running around with 2014 models that have never been repaired.
I mean fuck their anti right to repair bullshit, but the machines are good.
Would she not be considered a tourist for her trip to Mexico City?
Yeah those MacBooks are really known for their poor build quality and terrible efficiency.
It isn’t like I’m not willing to pay. My NAS setup wasn’t exactly cheap. But the user experience is just incredible. I had Netflix for ten years, and several others for some time. The experience is just better. Watching whatever I want synchronized with my wife across devices of any type is superb. Who else offers that?
For what it’s worth, in spite of my poor choice of words and general ignorance on many topics, I agree with everything you said here, and find these fascinating topics. Especially that of our microbiome which I think by mass is larger than our brains; so who’s really doing the thinking around here?
I'm slightly confused. Which part needs an academic paper? I've made three admittedly reductive claims.
- Human brains are neural networks.
- Their outputs are based on training data built through reinforcement.
- We have a much more massive model than current artificial networks.
First, I'm not trying to make some really clever statement. I'm just saying there is a perspective from which describing the human brain can generally follow a similar description. Nevertheless, let's look at the only three assertions I make here. Given that the term neural network takes its name from the neurons that make up brains, I assume you don't take issue with the first. On the second point, I don't know if linking to scholarly research is helpful. Is it not well established that animals learn using reward circuitry, like the role of dopamine in neuromodulation? We also have... education, where we are fed information so that we retain it and can recount it down the road.
I guess maybe it is worth exploring the third, even though I really wasn't intending to make a scholarly statement. Here is an article in Scientific American that puts the number of neural connections around 100 trillion. How that equates directly to model parameters is absolutely unclear, but even if you count glial cells, whose number can be as low as 40-130 billion according to The search for true numbers of neurons and glial cells in the human brain: A review of 150 years of cell counting, that number is in the same order of magnitude as current models' parameter counts. So if your issue is that AI models are actually larger than the human brain, maybe there is something cogent there. But given that there is likely at least a 1000:1 ratio of neural connections to neurons, I just don't think that is fair at all.
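To make the order-of-magnitude comparison concrete, here is a minimal sketch using the figures cited above; the 175B parameter count is a hypothetical example of a current large model, not a figure from the discussion.

```python
import math

synaptic_connections = 100e12        # ~100 trillion neural connections (cited above)
glial_low, glial_high = 40e9, 130e9  # glial cell count range (cited above)
model_params = 175e9                 # assumed large-model parameter count (hypothetical)

# Glial counts and this model's parameter count land in the same order of magnitude.
print(round(math.log10(glial_high)))    # 11
print(round(math.log10(model_params)))  # 11

# But neural connections outnumber those parameters by roughly 500-1000x.
print(round(synaptic_connections / model_params))  # 571
```

So whether brains "beat" current models depends entirely on whether you count cells or connections, which is the point of the 1000:1 caveat.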
Fair enough, but it does somewhat undercut your message that every model I’ve tested, including quite old ones, answers this question correctly on the first try. This image is ChatGPT-4o.
Humans do the same thing. Have you heard of religion?
I meant doesn’t change with respect to time zones. Leap adjustments are still relevant in that scenario, since each orbit of the sun doesn’t divide into a whole number of days, and leap seconds are still needed due to variance in rotation.
With respect to the meridian, I envision it rotating around the earth once per year, hence sidereal. So 0000 would rotate around the earth over the course of the year, landing about one degree farther along each day.
Most likely, I’m just completely full of shit.
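For what it's worth, the drift I'm describing can be sketched in a few lines. The rate of 360 degrees over a 365.25-day year is my own rough interpretation, and the function name is made up for illustration.

```python
def meridian_offset_deg(day_of_year: float) -> float:
    """Degrees the 0000 meridian has drifted by the given day of the year.

    Assumes a uniform drift of 360 degrees per 365.25-day year, so it comes
    out to roughly one degree per day, returning to zero after a full year.
    """
    return (day_of_year * 360.0 / 365.25) % 360.0

print(round(meridian_offset_deg(1), 4))       # ~0.9856 degrees after one day
print(round(meridian_offset_deg(365.25), 4))  # 0.0 after a complete year
```

That ~0.9856 degrees per day is also why a sidereal day runs about four minutes shorter than a solar day.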