this post was submitted on 25 Apr 2024
703 points (95.5% liked)

Programmer Humor


cross-posted from: https://lemmy.ml/post/14869314

"I want to live forever in AI"

top 50 comments
[–] [email protected] 106 points 7 months ago (31 children)

Even if it were possible to scan the contents of your brain and reproduce them in a digital form, there's no reason that scan would be anything more than bits of data on the digital system. You could have a database of your brain... but it wouldn't be conscious.

No one has any idea how to replicate the activity of the brain. As far as I know there aren't any practical proposals in this area. All we have are vague theories about what might be going on, and a limited grasp of neurochemistry. It will be a very long time before reproducing the functions of a conscious mind is anything more than fantasy.

[–] [email protected] 48 points 7 months ago (4 children)

Counterpoint, from a complex systems perspective:

We don't fully know, nor are we fully able to model, the details of neurochemistry, but we do know some essential features that we can model: action potentials in spiking neuron models, for example.

It's likely that the details don't actually matter much. Take traffic jams as an example. There are lots of details going on: driver psychology, the physical mechanics of the car, etc., but you only need a handful of very rough parameters to reproduce traffic jams in a computer.

That's the thing with "emergent" phenomena: they are less complicated than the sum of their parts, which means you can achieve the same dynamics using other parts.
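
For a sense of how little it takes, here's a rough C++ sketch of a leaky integrate-and-fire neuron, one of the standard spiking-neuron abstractions (the parameter values are illustrative placeholders, not fitted to real neurons):

#include <cstdio>

// Leaky integrate-and-fire: the membrane potential decays toward rest,
// integrates its input, and a threshold crossing counts as a spike.
struct LIFNeuron {
    double v = 0.0;          // membrane potential (arbitrary units)
    double tau = 20.0;       // membrane time constant (ms)
    double threshold = 1.0;  // spike threshold

    bool step(double input, double dt) {
        v += dt * (-v / tau + input);  // leak toward zero plus external drive
        if (v >= threshold) {
            v = 0.0;                   // reset after the spike
            return true;
        }
        return false;
    }
};

int main() {
    LIFNeuron n;
    for (int t = 0; t < 200; ++t)
        if (n.step(0.08, 1.0))         // constant drive, 1 ms time steps
            std::printf("spike at t = %d ms\n", t);
}

A handful of rough parameters and you already get spiking dynamics; same idea as the traffic-jam models.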

[–] tburkhol 31 points 7 months ago (2 children)

Even if you ignore all the neuromodulatory chemistry, much of the interesting processing happens at sub-threshold depolarizations, depending on millisecond-scale coincidence detection from synapses distributed through an enormous, slow-conducting dendritic network. The simple electrical signal transmission model, where an input neuron causes reliable spiking in an output neuron, comes from skeletal muscle, which served as the model for synaptic transmission for decades just because it was a lot easier to study than actual inter-neural synapses.

But even that doesn't matter if we can't map the inter-neuronal connections, and so far that's only been done for the roughly 300 neurons of the C. elegans ganglia (i.e., not even a 'real' brain), after a decade of work. That's nowhere close to mapping the neuroscientists' favorite model, Aplysia, which only has 20,000 neurons. Maybe statistics will wash out some of those details by the time you get to humans' 10^11-neuron systems, but considering how bad current network models are at predicting even simple behaviors, I'm going to say more details matter than we will discover any time soon.

[–] [email protected] 14 points 7 months ago (1 children)

Thanks, fellow traveller, for punching holes in computational stupidity. Everything you said is true, but I also want to point out that the brain is an analog system, so the information in a neuron is infinite relative to a digital system (cf. digitizing analog recordings). As I tell my students, if you are looking for a binary event to start modeling, look to individual ions moving across the membrane.

[–] Blue_Morpho 12 points 7 months ago (7 children)

As I tell my students, if you are looking for a binary event to start modeling, look to individual ions moving across the membrane.

So it's not infinite and can be digitized. :)

But to be more serious, digitized analog recordings are a bad analogy, because audio can be digitized and perfectly reproduced. The Nyquist-Shannon sampling theorem means the output can be reconstructed perfectly. It's not approximate. It's perfect.

https://en.m.wikipedia.org/wiki/Nyquist%E2%80%93Shannon_sampling_theorem
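
For reference, the reconstruction the theorem guarantees (a sketch, under the assumption that the signal contains no frequencies at or above half the sampling rate f_s, with T = 1/f_s and sinc(u) = sin(pi*u)/(pi*u)):

x(t) = \sum_{n=-\infty}^{\infty} x(nT)\,\mathrm{sinc}\!\left(\frac{t - nT}{T}\right)

The "perfect" part holds exactly only under that band-limit assumption, which is what anti-aliasing filters are for.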

[–] [email protected] 9 points 7 months ago

I heard a hypothesis that the first human-made consciousness will be an AI algorithm designed to monitor and coordinate other AI algorithms, which makes a lot of sense to me.

Our consciousness is just the monitoring system of all our body's subsystems. It is most certainly an emergent phenomenon of the interaction and management of different functions competing or coordinating for resources within the body.

To me it seems very likely that the first human-made consciousness will not be designed to be conscious. It also seems likely that we won't be aware of the first consciousnesses because we won't be looking for them. Consciousness won't be the goal of the development that makes it possible.

[–] [email protected] 23 points 7 months ago (28 children)

We don't even know what consciousness is, let alone whether it's technically "real" (as in physical in any way). It's perfectly possible an uploaded brain would be just as conscious as a real brain, because there is no physical thing making us conscious; rather, it's just a result of our ability to think at all.
Similarly, I've heard people argue a machine couldn't feel emotions because it doesn't have the physical parts of the brain that allow that, so it could only ever simulate them. That argument has the same hole in that we don't actually know that we need those to feel emotions, or if the final result is all that matters. If we replaced the whole "this happens, release this hormone to cause these changes in behavior and physical function" with a simple statement that said "this happened, change behavior and function," maybe there isn't really enough of a difference to call one simulated and the other real. Just different ways of achieving the same result.

My point is, we treat all these things, consciousness, emotions, etc, like they're special things that can't be replicated, but we have no evidence to suggest this. It's basically the scientific equivalent of mysticism, like the insistence that free will must exist even though all evidence points to the contrary.

[–] [email protected] 8 points 7 months ago (3 children)

Also, some of what happens in the brain is just storytelling. Like, when the doctor hits your patellar tendon, just under your knee, with a reflex hammer. Your knee jerks, but the signals telling it to do that don't even make it to the brain. Instead the signal gets to your spinal cord and it "instructs" your knee muscles.

But they've studied similar things and found that in many cases where the brain isn't involved in making a decision, it still makes up a story that explains why you did something, to make it seem like it was a decision and not merely a reaction to a stimulus.

[–] nnullzz 15 points 7 months ago (2 children)

Consciousness might not even be “attached” to the brain. We think with our brains but being conscious could be a separate function or even non-local.

[–] Blue_Morpho 14 points 7 months ago (5 children)

I read that, and the summary is, "Here are current physical models that don't explain everything. Therefore, because science doesn't have an answer, it could be magic."

We know consciousness is attached to the brain because physical changes in the brain cause changes in consciousness. Physical damage can cause complete personality changes. We also have a complete spectrum of observed consciousness, from the flatworm with 300 neurons to the chimpanzee with 28 billion. Chimps have emotions, self-reflection, and everything but full language. We can step backwards from chimps to simpler animals and it's a continuous spectrum of consciousness. There isn't a hard divide; it's only less. Humans aren't magical.

[–] Maggoty 8 points 7 months ago (4 children)

I think we're going to learn how to mimic a transfer of consciousness before we learn how to actually do one. Basically we'll figure out how to boot up a new brain with all of your memories intact. But that's not actually a transfer, that's a clone. How many millions of people will we murder before we find out the Zombie Zuckerberg Corp was lying about it being a transfer?

[–] [email protected] 84 points 7 months ago (2 children)

Consciousness and conscience are not the same thing; this naming is horrible.

[–] [email protected] 51 points 7 months ago

This just makes it more realistic

[–] [email protected] 56 points 7 months ago* (last edited 7 months ago) (6 children)

The game SOMA represents this case the best. Highly recommended!

[–] Wolfwood1 10 points 7 months ago

Yes, I immediately thought about SOMA after reading the post. recommendations++

[–] [email protected] 55 points 7 months ago (7 children)

If anyone's interested in a hard sci-fi show about uploading consciousness they should watch the animated series Pantheon. Not only does the technology feel realistic, but the way it's created and used by big tech companies is uncomfortably real.

The show got kinda screwed over on advertising and fell to obscurity because of streaming service fuck ups and region locking, and I can't help but wonder if it's at least partially because of its harsh criticisms of the tech industry.

[–] [email protected] 9 points 7 months ago

Just FYI, content warning for Pantheon: there is a seriously disturbing gore/kill scene in the first season that is animated too well. Anyone who has seen the show knows what scene I am talking about. I found it pretty upsetting and almost didn't finish the show; I am still a little upset that the scene is burned into my memory.

[–] khannie 8 points 7 months ago (3 children)

Sounds good. Did it come to a conclusion or get axed midway?

[–] AFaithfulNihilist 9 points 7 months ago (1 children)

The series has a very satisfying conclusion.

It's one of the coolest fucking things we watched this last year.

[–] [email protected] 48 points 7 months ago (4 children)

SOMA is a wonderful game that covers this type of thing. It does make you wonder what consciousness really is... Maybe the ability to perceive and store information, along with retrieving that information, is enough to provide an illusion of a consistent self?

Or maybe it's some completely strange system, unknown to science. Who knows?

[–] [email protected] 42 points 7 months ago (4 children)

The Comic Sans makes this even deeper

[–] fidodo 16 points 7 months ago (3 children)

Who the fuck uses Comic Sans for programming? I use Comic Mono.

[–] [email protected] 35 points 7 months ago (3 children)

What if you do it in a Ship of Theseus type of way? Like, swapping each part of the brain with an electronic one slowly until there is no brain left.

Wonder if that will work.

[–] [email protected] 26 points 7 months ago (1 children)

If I remember right, the game The Talos Principle calls that the Talos principle

[–] [email protected] 13 points 7 months ago (1 children)

The TV show Pantheon figures it will work, but it will be very disturbing.

[–] ChewTiger 9 points 7 months ago (1 children)

Right? Like, what if, as cells die or degrade, instead of being replaced by the body naturally they are replaced by nanites/cybernetics/tech magic? If the process of fully converting took place over the course of 10 years, then I don't see how the subject would even notice.

It's an interesting thing to ponder.

[–] Thcdenton 27 points 7 months ago (9 children)

This prospect doesn't bother me in the least. I've already been replaced 5 times in my life so far. The soul is a spook. Let my clone smother me in my sleep and deal with the IRS instead.

[–] mynameisigglepiggle 9 points 7 months ago (4 children)

Makes me wonder how many times I've been replaced. Also makes me wonder if I just died yesterday and today I'm actually a new person. I have no evidence that yesterday happened except for a memory of it, and let's face it, since it was a public holiday, that's a pretty foggy memory.

[–] [email protected] 25 points 7 months ago* (last edited 7 months ago) (2 children)

A copy is fine. I can still seek vengeance on my enemies from beyond the grave.

[–] ZILtoid1991 21 points 7 months ago (2 children)
throws UserNotPaidException
[–] [email protected] 20 points 7 months ago* (last edited 7 months ago) (3 children)

would've made more sense if it was Rust

(or is the copy intentional here?)

[–] [email protected] 15 points 7 months ago (1 children)

Plot twist: consciousness is : Copy

[–] RustyNova 9 points 7 months ago (1 children)
#[derive(Clone, Copy)]
struct Consciousness {...}

fn upload_brain(brain: Consciousness) -> Result<(), Error>
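// `Consciousness` derives `Copy`, so `brain` is passed by value here:
// the upload works on a bitwise copy and the original stays behind.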
[–] [email protected] 13 points 7 months ago (4 children)

What's the difference between void fn(Type& var) and void fn(Type* var)?

[–] Tangent5280 29 points 7 months ago (2 children)

Sends original data vs making a copy of data and sending it.

In the meme's context, you'd just be making a copy of your consciousness and putting it in a machine. Whatever your reason for doing it (escape illness, survive Armageddon), nothing changes for you. A copy of you lives on, though.

[–] Valmond 10 points 7 months ago

It's not like the post; the second one is a pointer.

[–] [email protected] 8 points 7 months ago (1 children)

I mean, just kill the host as soon as the upload is complete. At best you are not conscious during the process, and when "you" wake up you are in the cloud. The version of you that wakes up gets told that the "transfer" was complete.

[–] [email protected] 8 points 7 months ago

I guess you're asking about C++. There, Type* can be null while Type& can't be. When it gets compiled, Type& (mostly) produces the same machine code as Type*.
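
For anyone following along, a rough C++ sketch of the three forms side by side (the struct and values are made up for illustration):

#include <iostream>

struct Consciousness { int memories = 42; };

// Pass by value: the function gets its own copy; the caller's object is untouched.
void by_value(Consciousness c) { c.memories = 0; }

// Pass by reference: operates directly on the caller's object; a reference can't be null.
void by_reference(Consciousness& c) { c.memories = 0; }

// Pass by pointer: also operates on the caller's object, but the pointer may be null.
void by_pointer(Consciousness* c) { if (c) c->memories = 0; }

int main() {
    Consciousness me;
    by_value(me);
    std::cout << me.memories << '\n';  // still 42: only a copy was "uploaded"
    by_reference(me);
    std::cout << me.memories << '\n';  // 0: the original was modified
    by_pointer(nullptr);               // safe: the null check guards it
}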

[–] [email protected] 12 points 7 months ago

the plot of

spoiler: SOMA
in a nutshell?

[–] [email protected] 10 points 7 months ago (2 children)

There's a cool computer game that makes this point as part of the story line... I'd recommend it, but I can't recommend it in this context without it being a spoiler!

[–] trashgirlfriend 9 points 7 months ago* (last edited 7 months ago) (1 children)

Guy's probably talking about

Tap for spoiler: SOMA

[–] [email protected] 9 points 7 months ago (1 children)

I know myself deeply enough to be totally fine with a copy. I’d be my own copy’s pet if it came to that. I trust me.

[–] [email protected] 9 points 7 months ago

Yeah we'd work together well and the sex would be great.

[–] Valmond 8 points 7 months ago (1 children)

void teleport(Person person);
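// pass by value: the destination gets a copy of `person`; the original never leaves the room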

[–] [email protected] 8 points 7 months ago (3 children)

I've had this thought and felt it was so profound I should write a short story about it. Now I see this meme and I feel dumb.
