AdditionalPizza

AdditionalPizza OP t1_isgfk33 wrote

>I don't think you're actually disagreeing with me any more.

Honestly, I don't think we ever were, aside from whether it requires a self-aware AI, the ways a singularity situation could be achieved, and how it's defined.

We both agreed it's a moment in time we can't predict beyond. I never claimed anything less; the original commenter stated something different, and I disagreed with them.

1

AdditionalPizza OP t1_isgcvl7 wrote

The singularity is literally a point in time though. It's not an ongoing event. We possibly have our social structures > singularity > we no longer have our social structures.

I don't think you understand what I'm saying. To be honest, I don't understand what your argument is either. I don't even know what we're debating at this point.

1

AdditionalPizza OP t1_isgcin7 wrote

I think it's correct to assume different countries will attempt different things. It's really hard to say, because while we can compare this to past revolutions, that doesn't really help us understand the implications of this one. This isn't the industrial revolution; this is a replacement of the entire workforce, sector by sector. The people in the first sectors to go might find other employment. Then more sectors will go, and then more.

We're waiting on the first big sector to go nearly or fully automated. Graphic designers aren't a sector; they're part of a creative sector. When entertainment and art are mostly automated, we will see. But it could be medicine, could be legal, could be computer sciences, could be retail. It could be one of the sectors that involve physical labour, though that seems less likely at the moment; still, a breakthrough in robotics soon would spell the end of that sector too.

One sector going will probably knock, I don't know, 5% of the workforce out? Maybe more? That's an immediate crisis. Then another. Then another.


>I still think there will be work for people, but it may be "make work" of the FDR New Deal style.

That's probably a solution we'll see attempted, but I don't know. I don't want to see that. That's a bandage for a giant wound. It might work for a few months, but then more people become unemployed.

OpenAI's Codex is crazy. That tech will accelerate all facets of IT, which means we're increasing the rate of exponential growth by orders of magnitude.

But you know, I hope your view on it is more correct than mine. At least for the transition period coming up soon enough.

1

AdditionalPizza OP t1_isg9xwj wrote

You're very set on one sci-fi writer's 1993 version being the absolute. He didn't even come up with the term; he just wrote a popular theory about it.

If you want to be so concrete about one man's theory, you should probably at least go with the original, not just the most popular one. The entire definition was originally a rate of returns on technology that surpasses human comprehension. That's it, and I'm sticking with it.

1

AdditionalPizza OP t1_isg4pbb wrote

>That's basically the foundational document of singularity theory.

Yeah, I know. If you consider it the definition of a word, I can understand not wanting to change it. If you consider it a theory, well, theories evolve all the time.

But to be clear, I was absolutely not talking about the singularity. I mentioned it because the original comment was referring to it, and I said what they were referring to sounded less like post-singularity and more like transformative AI. I was mostly avoiding talking about the singularity in this post; it's more about pre-singularity.

1

AdditionalPizza OP t1_isg1q1y wrote

Yes, I've seen that very old writing on it. There are several more modern viewpoints on what a technological singularity could be; they include many different ways of achieving it, but they all conclude the same basic thing: unfathomable change and uncontrollable, runaway technological innovation.

Regardless, we can agree to disagree on that point. I was trying to avoid talking about AGI and self-aware AI anyway.

1

AdditionalPizza OP t1_isfw4w7 wrote

Right now, those billionaires. People assume I'm suggesting we drug them to be more empathetic. I'm offering a possible optimistic scenario to those who fear billionaires and world leaders will use technology to suppress us further.

1

AdditionalPizza OP t1_isfvw73 wrote

If you have the exact blueprints of how the singularity will go down, then sure. But we have no idea yet whether AGI is absolutely necessary for the singularity to occur. We don't even know if self-awareness will be possible in AI, so it's possible a single entity could control an ASI and use it for whatever they want. We have no idea.

1

AdditionalPizza OP t1_isfv03z wrote

>The same thing will happen over the next 20 years.

The problem is that AI will be cheaper and perform several orders of magnitude more efficiently. There just won't be a job that AI can't perform better and for less cost, 24/7. At that point, why would we even be striving for work? We should be striving to be unshackled from working half our lives.

>devalue individual worth

This could be the hard pill to swallow during the upcoming revolution for a lot of people. That "value" and "worth" will cease to exist. People will have to come to terms with unemployment no longer being a thing, because mandatory employment simply disappears.

The part I worry about is the people who believe employment is essential and will prolong the suffering of many by dragging others to the bottom before the eventual collapse of the system. UBI is simply a stop-gap.

1

AdditionalPizza OP t1_isfsq7p wrote

I agree. My post is just optimism for the process in which that could have a possibility of happening. Hopefully we don't focus so much on fearing AI alignment that we forget to fear each other. Both are equally important, for the majority of society anyway.

2

AdditionalPizza OP t1_isfqvb1 wrote

Hmm, personally, by that reasoning I would say your future self is less separate than the others, given that your actions in the present directly affect what your future self will feel. But I suppose in this theory the present doesn't even exist, so I don't know.

I can't remember what someone else thinks though, so I don't know how much I agree.

But anyway, what were you getting at?

1

AdditionalPizza OP t1_isfq9ok wrote

>the billionaires you are talking about “force medicating”

I'll start there, just because that's not at all what I said, haha. They would possibly take a cure-all for longevity and healthy-aging reasons; nobody would be capturing billionaires and shoving a pill down their throats. I imagine it as a daily mix of medications and vitamins to make people healthier and prevent diseases, and people would regularly get diagnostics done to stay in top health.

>Sure, one can argue their treatment of workers could be better, but at the end of the day Bezos had to solve problems to get rich.

Sociopathic reasoning. "Eh so what if a ton of people suffer so I can have everything."

And I am fully aware capitalism is what we have, and it's currently the best solution, or at least appears to be. I'm talking about the future, when this system simply won't work anymore; at some point it can't, if we subscribe to the idea of a technological singularity.

0

AdditionalPizza OP t1_isfovt3 wrote

I agree, and that's the optimism of the post. Hopefully we get a sort of revolution long before the singularity. Otherwise it will be a very brutal road toward the singularity. But personally, unless the singularity occurs much quicker than I think it will, there will definitely have to be a transformation of society in the coming years.

1

AdditionalPizza OP t1_isfok1f wrote

See, the way I look at it, the singularity will for sure be unimaginable. I think a lot of people underestimate how unimaginable it will be. I don't expect you or I will be the same after it happens. The world will launch from a brisk jog to the speed of light in an instant.

What you're describing sounds to me more like transformative AI. It will be a much larger revolution, but not unlike the industrial revolution. We're already in the beginning of it. We haven't reached its full potential yet, but I've heard experts say we're ~30% to ~40% of the way to that point, whatever metrics they use. But that doesn't mean AGI or the singularity. It means the point where AI creates a situation equal to or greater than the industrial revolution, relative to its effect on the world.

We could likely see cures for several, possibly all health issues in this time period. When you have transformers with trillions, possibly quadrillions of parameters in this decade, we will most likely see this kind of application.

1

AdditionalPizza OP t1_isfl91r wrote

I agree. That's where my post came from. We often think about the ultra wealthy getting all of this stuff first, and possibly never letting common folk get their hands on it. Well, maybe the cure-all treatments of the near future will cure them of the urge to hold onto everything for generations.

2

AdditionalPizza OP t1_isfky6x wrote

>It's a clinical fact.

I don't have the data on billionaires' diagnoses, haha.

>Do we endeavor to eliminate sociopathy as a psychological possibility? If the focus is billionaires as opposed to sociopathy, then we should direct our focus to the economic systems which allow the existence of billionaires.

Yes, we endeavor to eliminate the mental illness. No, we don't focus on eliminating it from billionaires specifically; we focus on eliminating it in general, and billionaires would likely be the first to receive new/expensive treatments.

1

AdditionalPizza OP t1_isfkdgb wrote

Perhaps. But maybe enough of them will see "sociopathic, anti-social, psychopathic" on the diagnosis and want to cure it. If enough do, then the rest who don't will be the old dinosaurs. It's hypothetical, though, just a possible scenario that could ease some people's minds. There are a trillion possibilities for how the future will play out; it's important to keep some optimism, is all.

1

AdditionalPizza OP t1_isfk1jm wrote

This sub is literally about an AI revolution that essentially cancels capitalism, because capitalism likely won't work post-singularity, and possibly even pre-singularity, as I'm hinting at in my original post.

Capitalism is currently the best we have, and it's "worked" well enough to get us this far. But I can't imagine a future where the workforce is automated and the unemployed just die off.

1

AdditionalPizza OP t1_isfjhy1 wrote

>That's not how capitalism works. Everyone who actually manages a company would prefer to keep the government at a distance.

You think CEOs aren't in bed with politicians? You're describing capitalism on paper; it's nothing like it was 100 years ago. Sure, companies hate when politicians stifle their progress, but campaigns are funded by the wealthy.

1

AdditionalPizza OP t1_isfizb1 wrote

Well, the whole subject of transformative AI relates to some kind of revolution; not an overthrow scenario, but one like the industrial and agricultural revolutions. We are in the midst of a revolution. Everyone here is focused so much on the date of AGI/ASI and the singularity, which of course is what the sub is about, but we don't necessarily need those. I have no doubt they will come soon enough, but the next 5-10 years will be the most important in human history so far. Call it the computational revolution or whatever you want, probably something catchier than that. Of course it could be superseded shortly after by AGI or whatever, but regardless of whether people predict the singularity in 15 years or 100 years, the revolution is already here, and transformative AI is "slowly" coming out every few months. We don't need 100% unemployment to reach a crisis; we need like 10%.

Capitalism simply cannot be sustainable when unemployment rates start rapidly rising. We can say things like history repeats itself, that the industrial revolution actually created more jobs, etc. But history doesn't *always* repeat itself. We are going to automate everything, or at least enough things that there just won't be a *reason* to work. During the industrial revolution everyone feared being unemployed; this time around we should be asking whether we'll still need to work at all.

Capitalism has been exploited to achieve the gains we've seen so far, and the wealthy still need the lower classes. Without us, they don't sell products. Currently they need our labour; shortly they won't. Do they think they can take away our labour and paychecks and we'll still purchase their products? It just won't work.

1