vernes1978

vernes1978 t1_iwpni0x wrote

> be as smart as we are.

For a certain definition of "smart", of course.
*Takes a big bite of rainforest-killing, soybean-fed cow meat filled with microplastics*

But that means a person is a dataset applied to a "generally" identical neural net.
Okay, that statement might be generally a lie, but here is my question:

What would happen if we could measure all the synaptic weights/values of brain model A, belonging to ZeroRelevance, and used those values to adjust the neurons in brain model B (belonging to vernes1978)?
How much would brain model B react differently than ZeroRelevance?

How big would the difference be?
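The thought experiment can be sketched with toy networks (a hypothetical illustration, not a claim about real brains): if the two "brains" share an identical architecture and every weight is copied over, model B stops reacting like itself at all.

```python
import numpy as np

# Two toy "brains" with identical architecture but independently grown weights.
# These stand in for brain model A (ZeroRelevance) and B (vernes1978).
def make_brain(seed):
    r = np.random.default_rng(seed)
    return {"w1": r.normal(size=(4, 8)), "w2": r.normal(size=(8, 2))}

def react(brain, stimulus):
    hidden = np.tanh(stimulus @ brain["w1"])
    return np.tanh(hidden @ brain["w2"])

brain_a = make_brain(seed=1)  # "ZeroRelevance"
brain_b = make_brain(seed=2)  # "vernes1978"

stimulus = np.random.default_rng(0).normal(size=(1, 4))

# Before the transfer, the two brains react differently to the same stimulus.
before = np.abs(react(brain_a, stimulus) - react(brain_b, stimulus)).max()

# Measure every synaptic weight of A and write it into B.
brain_b = {name: w.copy() for name, w in brain_a.items()}

# Afterwards, B's reaction is numerically identical to A's.
after = np.abs(react(brain_a, stimulus) - react(brain_b, stimulus)).max()
print(before > 0, after == 0.0)
```

In this idealized setting the difference collapses to zero, which suggests the interesting part of the question is exactly where the idealization fails: real brains are not "generally identical" nets.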

1

vernes1978 t1_ivtegu5 wrote

You mention two situations: a meat brain in a jar, and a digital upload.

Yet you seem to focus on the meat-in-a-jar scenario for the rest of the post.

Also, it would be nice if you introduced each abbreviation with its verbose description at least once:

UBI
AGI
ASI

And you make a number of assumptions without clarifying how you came to them.

1

vernes1978 t1_iuqyylm wrote

Look, with articles like these I can understand why people write stories about how AI is going to split the sea and wipe its butt with general relativity.
I mean, it's still horrible fiction.
But even I can see that if you managed to duplicate a rat brain using this tech, you'd have a rat brain working 30,000 times faster than a human brain.
These are hard numbers you can spin a tale around.

I just don't see why you'd write fiction the moment AI can generate art of a lady with two different eyes, eleven and a half fingers, and a hint of a third arm.

Anyway, cool article.

3