Mokebe890

Mokebe890 t1_iuvlde5 wrote

I mean, the main logical problem here is that the creation of ASI would make rich people poor. AGI may be controllable, but not ASI. Something better than humans in every aspect, and you control it? No way.

ASI will be either the end of every human problem or the end of the human race.

6

Mokebe890 t1_itwzkx1 wrote

Holy crap, you really believe that? Cancer is literally random mutations that occur in your body through DNA malfunction. And pretty much everything alters it, even your own body as you age, because there are more replication errors.

All you have to do is bioengineer the body back to a youthful state and fight cancer by repairing DNA, not the nature bullshit you're talking about.

1

Mokebe890 t1_ituamvx wrote

Sure, and by no means am I unhappy about that. But we won't get much further with simple medicines. mRNA tech and tweaking our genes will prevent and treat cancer, which no lifestyle adjustment, medicine, or anything else will achieve.

Universally, the best way would be to address aging itself as the root of every disease and just cure aging as a disease, reversing our bodies to a youthful state.

7

Mokebe890 t1_irzwmw4 wrote

Here and there for 10 years already, and on this sub for about a year now. I'm more realistic, leaning pessimistic, these days. There is some change in the air, but it's not advancing as fast as I'd expect. Reading the breakthrough papers from 10 years ago, we should really have some proto-AGI by now, or at least something truly autonomous like self-driving cars. But we got Stable Diffusion and DALL-E, which is also something important. At the current pace, if we have absolutely nothing by 2030-2035, I will be 100% pessimistic and convinced the singularity won't occur in my lifetime.

5

Mokebe890 t1_irsab1q wrote

I said that there is one thing that distinguishes us from other primates, and probably one thing that distinguishes an NN or LLM from AGI. That's what I meant.

Life isn't consciousness. If you look at it like that, only 0.0000001% of life is conscious, which actually sucks in terms of probability. My statement was that it doesn't have to emerge from life, but that artificial intelligence must have the same structure as biological intelligence for it to emerge.

1

Mokebe890 t1_irruj26 wrote

Why not? Consciousness applies only to humans, so there is something in humans from which it emerges. Also, is a gorilla conscious when it signs something to you?

Humans are limited, and the only astonishing thing is that we can do extremely complex operations per second on very little electricity. But still, we're limited, and we can reach this level as our tech progresses.

Look at primates: one particular thing made us conscious while the others are not, and that thing is probably the frontal lobe. Experiment long enough with NNs and maybe something will emerge.

4

Mokebe890 t1_irque0t wrote

"Do your research" - red flag. I'm well aware of the problems social media and the internet can bring, but that's just one side of the coin.

The problems must be addressed, but overall it's a wonderful thing, and developing virtual worlds is our advancement as a species. As I said before, the best thing would be to merge both worlds, like extremely advanced AR, but I don't see the problem with virtual worlds and living a virtual life.

Or if you're addressing aggressive TikTok algorithms, then okay, those must be controlled.

1

Mokebe890 t1_irnlmiv wrote

You could possibly enhance your brain to be constantly plugged into the digital world, not only the normal one. It's a matter of perspective. If you live 100% in the digital world, then the digital world is your real world; it's just created by humans. Something like a perfect simulation. Maybe our universe is actually a perfect simulation.

So honestly, it's very hard to tell. For me, the best outcome is to merge both worlds into one, but only time will tell. Anyway, I don't see why it would corrupt the young generation.

1

Mokebe890 t1_irm8elk wrote

Could be true, but we could be blocked somehow, or just not know what to look for. In the game Stellaris, you are still a primitive civ even with a space station orbiting the globe in the so-called "space development stage". Who knows, maybe we're being observed by some advanced civ that's waiting for us to unite the globe?

1

Mokebe890 t1_irdt1uj wrote

Well, I mean more like 10-20 years ahead, not that far in the future though.

Could you provide some insight? I'm in the by-2030 or 2030-2040 camp, but I'd like to see some papers arguing for as far out as 2060. Looking at current models, or following Altman, you'd think it's a lot sooner.

Good point, but for example in Europe, combine the war in Ukraine with the fact that almost every country here will have an energy crisis this winter, going as far as closing schools and facilities.

2

Mokebe890 t1_ir9lep9 wrote

Sure, why not? It will be expensive, but not "only for the 0.01%" expensive. Also, humans will be skeptical about it at first and reject the technologies and improvements.

Technology lowers its cost over time, as it always does. And when there is no problem with resources, why would the less developed bother you? Do the native tribes living in the Amazon forest bother you?

If I were an immortal super-being, I absolutely wouldn't do anything bad to humanity, because it would never bother me.

What's hostile? Superintelligence? Something so much more intelligent than we are that it wouldn't even think about human annihilation?

3

Mokebe890 t1_ir9fhy0 wrote

Survival instinct only applies to living beings; no one knows what an ASI would think. It's superintelligence, something above our level of intelligence; you really can't guess what it would think. Are people concerned about ants' lives? No, because ants don't bother them.

Upgraded Homo sapiens will win against the normal ones, just like Homo sapiens won against the Neanderthals; nothing strange.

2