Recent comments in /f/Futurology

jonhasglasses t1_jeerma7 wrote

This is the problem with new tech. We imagine it unrestrained by our current model of business and society, so when we prognosticate about what it will become, we see boundless possibilities. But the truth of the matter is that any new tech will be lashed down by the restrictions of our business models and governments. This happened with the internet. It was supposed to change everything about everything; the whole world was going to be revolutionized, and we were going to end disease and stop world hunger, all because of the internet. Well, all the internet did was make our current capitalist systems stronger, faster, and more robust. The same thing is going to happen with AI. It might be a multiplier, but it’s not going to be the one anyone is talking about.

2

FuturologyBot t1_jeerchi wrote

The following submission statement was provided by /u/Gari_305:


From the Article

>"Future missions that require HD video, robotics, sensing applications, telemetry or biometrics will need the advanced capabilities that cellular networks enable," Nokia said on its web page about the NASA partnership.
>
>Those technologies will help researchers locate lunar ice, which could help sustain human life on the planet by serving as a source of fuel, water and oxygen for future colonies, according to NASA.


Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/127lp8c/nokia_to_set_up_first_4g_network_on_moon_with_nasa/jeem9mg/

1

nobodyisonething t1_jeeqoio wrote

AI will creep into all jobs like water finding its way through cracks. It will happen, and it won't be stoppable.

The jobs that are done entirely at a computer keyboard are the easiest to target -- less engineering involved.

https://medium.datadriveninvestor.com/ai-with-change-comes-chance-5a7ff61cce0b

The potential for AI is much greater than human potential.

https://medium.com/predict/human-minds-and-data-streams-60c0909dc368

3

OriginalCompetitive t1_jeeq4b3 wrote

I think you’re missing how utterly alien any AGI will be to us. We have a single mind, closed off from direct contact with others.

But an AI mind will be able to split into thousands of separate copies, live independently, and then recombine (i.e., by literally copying itself onto multiple computers, severing the connections, and then reconnecting). Will that feel like being one mind, or a crowd of minds? Would a mind accustomed to creating copies and then shutting them down care about death?

Or consider its ability to store frozen copies of itself in storage files. What would that feel like? How would an AGI think of that? What sort of “morality” would a being have that constantly extinguishes copies of itself (kills them?) but itself never dies?

Would an AI that can store and revive itself across potentially decades or longer understand time? Would an AI that cannot physically move through the world understand the world? Would it live solely on the plane of abstract ideas, and never realize that a “real” world of space and time and humans with other minds even exists?

It’s absurd to wonder about the human morality of such an entity. It’s like asking whether the sound of the wind has morality.

1

echohole5 t1_jeep9ys wrote

It would be suicidal for the owning class to let the working class starve to death. The 99.9% will kill rather than starve, and the owners are outnumbered 1000:1. They know they'd have no hope of survival. The military and police would turn on their masters rather than let their friends and extended families die.

Besides, ruling over a nation of corpses isn't satisfying. There's nobody to feel superior to.

Capitalism requires consumption. Without it, nobody can sell anything and the corporations die with the citizens.

If we get to a point where human labor has no economic value, there will be a new way of redistributing wealth that takes the place of wages.

What worries me more is a future where human labor still has some value, but not much, and the kind of labor that is valuable is stuff every human can easily do. The ruling class would still want to incentivize working by not providing UBI, so everyone would be working minimum-wage jobs amid permanently high unemployment, with very limited safety nets. That's a future of grinding misery, but it wouldn't be bad enough to cause a revolution the way mass starvation would. That's the dystopia that worries me.

1

BigMemeKing t1_jeeof2b wrote

There are several movies about this very thing, and several songs as well. It's an old concept: we continue to rip and cut our way through every tree we can to make housing materials, furniture, and entertainment products like skateboards, snowboards, and surfboards, all so that giant corporations can profit.

It has been happening, and it won't stop until we're left with very few natural habitats. We've long been cutting through our rainforests and selling rights in national parks to certain companies so they can clear space for their machines.

But one hiker scribbles on some rocks with paint and the whole world loses its mind. Crazy, right? Then we feel such disgust: "Those are our national parks!" "We need to preserve them!" "We can't be slapping graffiti on the rocks there! Conserve their natural beauty!" Meanwhile, 100 ft to the left is a cleared-out patch of earth with an oil pump just sipping on that crude, lining some oil baron's pockets.

It's silly. Life is silly. Our priorities are silly. We have pipelines running for thousands of miles, causing oil spills at a disastrous level, and the courts just have these companies pay some millions of dollars and it's "ok." We see a price tag of $30 million and think, "Oh wow! That's a lot of money, so it's fair, right?" But to a multi-billion-dollar oil company that makes $500 million every year, $30 million is a small price to pay for progress, right? Especially when it's just a one-time payment for a disaster whose impacts will be felt for years to come.

Call it an oopsie tax.

9