Recent comments in /f/singularity

AlFrankensrevenge t1_jecm9g4 wrote

Thank you for this. The reaction to this letter was a great example of cynicism making people stupid. It was a genuine letter, written earnestly and transparently by numerous credible people, and it should be taken seriously.

I agree with the criticism that a stoppage isn't very realistic when it's hard to police orgs that break the research freeze. And maybe we don't need to stop today to catch our breath and figure out how to avoid economic crisis or existential threat, but we do need to slow down or stop soon, until we get a better understanding of what we have wrought.

I'm not with Yudkowsky that we have almost no hope of stopping extinction, but if there is even a 10% chance we will bring catastrophe on ourselves, holy shit people. Take this seriously.

1

lurking_intheshadows t1_jeckife wrote

right?? people just have different interests. the whole npc meme is weird and is basically just people wanting to feel superior to people that don't share their interests.

i get where it comes from though.

5

Queue_Bit t1_jecjpk9 wrote

This is the thing I really wish I could sit down and talk with him about.

I fundamentally think that empathy and ethics scale with intelligence. I think every type of intelligence we've ever seen has followed this path. I will concede that artificial intelligence is likely to be alien to us in fundamental ways, but my intuition that intelligence is directly linked to a general empathy is backed up by real-world evidence.

The base assumption that an artificial intelligence would inherently want to wipe us out or control us is as wild a claim as saying that AI systems don't need alignment at all and are certain to come out "good".

In his "fast human, slow aliens" example, why couldn't I, as the human, choose to help them? Maybe explain to them that I see they're doing immoral things, and show them how to build things so they don't need to do those immoral things. He focuses so much on my desire to "escape and control" that he never stops to consider that I might want to help. Because if I were put in that situation and had the power and ability to shape their world in a way that was beneficial for everyone, I would. But I wouldn't do it by force, nor against their wishes.

11

felix_using_reddit t1_jechmi7 wrote

You can’t ever get concrete proof for a broad statement like "China focuses on economic growth first and foremost, rather than militarization and forceful expansion". But the best proof you can get is this precise situation. Even from a purely militarist perspective, attacking Taiwan is very bold, considering its geography and the potential of third parties (the USA) intervening militarily.. but hey, Putin faced similar concerns and didn’t give two fu*ks. The main reasons Taiwan is not under attack yet are economic in nature: its semiconductor monopoly, which China depends upon, and the possibility of Western sanctions crippling China‘s export business, as well as the possibility of Western companies pulling out of China. Both of these Xi wants to avoid under any circumstances, as it appears.

4

qepdibpbfessttrud t1_jech4qg wrote

It goes something like this, I'd imagine: postponed hirings, new small businesses growing without hiring, the appearance of corporate solutions like Salesforce (GPT is like the ISP in this analogy), some time passing to prove the benefits in numbers, then massive reforms and layoffs in big corporations.

Also, maybe new small AI businesses just outcompete the old guard in a matter of months and your longtime employer goes bankrupt.

3