SgathTriallair

SgathTriallair t1_it4qwlo wrote

The biggest issue isn't developing the tech, it's distributing the tech.

An AGI or even an ASI sitting on a computer in a lab doesn't do much to change the world. Even if it can act in the world, it's not fundamentally different from having one more person in the world.

The transformation happens when the AI, in whatever form, begins to be integrated.

DALL-E is cool, but until it's used widely in commercial applications it doesn't really change anything.

6

SgathTriallair t1_irsqvsf wrote

This is a question debated by philosophers.

Everything experiences things, i.e. stuff happens to everything. Consciousness is KNOWING that something is happening to you. So again: you receive stimuli, you analyze those stimuli, and then you do analysis of the analysis. That's it; there isn't any more woo woo to it than that.
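To put that in concrete (if toy) terms, here's a minimal sketch of the idea: a system that records first-order analyses of incoming stimuli and then runs a second-order analysis over its own record. The class and names are invented purely for illustration, not a claim about how any real system implements this.

```python
# Toy illustration of "analysis of the analysis":
# (1) a stimulus arrives, (2) the system analyzes it and records that
# analysis, (3) the system later analyzes its own record of analyses.
# All names here are made up for the example.

class ReflectiveSystem:
    def __init__(self):
        self.history = []  # record of first-order analyses the system can inspect

    def receive(self, stimulus: str) -> str:
        """First-order processing: something happens to the system."""
        analysis = f"observed: {stimulus}"
        self.history.append(analysis)
        return analysis

    def reflect(self) -> str:
        """Second-order processing: the system 'knows' what happened to it
        by analyzing its own record of first-order analyses."""
        return (f"I know something happened to me: I have processed "
                f"{len(self.history)} input(s), most recently '{self.history[-1]}'")

system = ReflectiveSystem()
system.receive("bright light")
print(system.reflect())
# -> I know something happened to me: I have processed 1 input(s), most recently 'observed: bright light'
```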

Your point about language gets at the really crucial question, though. As Turing pointed out, if it can accomplish all the things we use intelligence for, then we should assume it is intelligent. I can't view your consciousness either, but I assume it exists because your outward demeanor is the same as someone who has consciousness.

1

SgathTriallair t1_irs6ltg wrote

More or less, consciousness is the ability of a system to see itself. I am conscious because I have thoughts and I know I am having those thoughts.

I would argue that machines are already conscious in the same way that rats and bugs are, just to a much lesser degree.

What really separates humans is our ability to imagine and plan for the future. That is the trait we actually use to separate "intelligent" from "not-intelligent". It's also what we mean by AGI, so if we achieve AGI we will, by definition, have achieved this ability to imagine.

1