Recent comments in /f/singularity

MrEloi t1_jefy9t7 wrote

I have just asked my wife - who is an English graduate.

She said she would indeed have second thoughts about taking the course had AI been around at the time.

4

240pixels t1_jefxan2 wrote

That sounds very optimistic, but it also seems very challenging, close to impossible, to simply remove "desire" from human psychology. We are hard-wired to over-reach; that's how we survive. I'd argue some of our faults are what make us human, but we can also learn from them. All I'm saying is society isn't ready for immortality. We can't even be present in the life we live right now.

2

datsmamail12 t1_jefxabe wrote

Even if they pause it, it's still not going to do anything; everyone will release a new model 6 months later. This is just the most idiotic thing anyone ever said. They are not going to pause innovation; you just can't tell a company not to program if that's the way they operate. We are talking about the stock market here, it's not ever going to happen. Elon Musk has the whole world's wealth and he still couldn't get ahead of the curve, and now he just complains like a crybaby that wants to be part of it. Boohoo, get over it.

3

AndiLittle t1_jefwpls wrote

If you think about it, form follows function. Our language is so complex because our environment requires it. We need a lot of words to express the reality we live in and that the smartest of us were able to create for us. We also need a lot of language to deceive. Other species don't hold elections so they don't need to be that verbose. :) Makes sense?

1

just_thisGuy t1_jefwnt9 wrote

I'd argue that nuclear war only accelerates this; any safety is out the window. Fewer people, the need to rebuild, the need to wrestle with the effects of nuclear war, etc. That is, unless nobody is left. But yes, I agree there is no stopping this and it's only going to get faster, and I think that's good. It doesn't mean we aren't going to see negative effects; I just think the negative effects from humans are probabilistically worse.

5

salesforceonee t1_jefwlu6 wrote

The AI revolution has opened up many lucrative opportunities for those that look for them. My plan is to go from making decent money to making a fortune. I love being alive during this time in history.

1

flexaplext t1_jefw4sn wrote

Yeah, because we don't just see governments knee-jerk reacting to AI now when private enterprise has been developing and investing in it for many years.

And it isn't the most important and dangerous technology that will ever exist, and yet they have little to no regulation on it or proper plans going forward, despite this being obvious and known for decades.

And MPs know so much about computer programming, I'm sure they'll know how to lead AI development and appoint the right people to it, doing so in an efficient and innovative manner.

And I'm sure the best programmers will be lining up to work for the government and their military rather than OpenAI and progressive companies.

4

NonDescriptfAIth t1_jefvvpl wrote

Thanks man, that's pretty cool.

On reflection my comment, unfortunately, needed to be much longer, not shorter.

I'm writing a book at the moment based around my original comment.

I mistakenly gave off the impression that I think AGI will be evil outright, that my position is one of some Terminator-like takeover.

The reality is I think we're building an entity that could possibly be God-like to us; we had better be careful what we tell it to do.

1

Sailorman2300 t1_jefvr6l wrote

I'm thinking it's because government is best at being reactive instead of proactive. In the US, it is also beholden to a capitalist framework, which encourages the status quo.

I wouldn't put faith in government being able to even start to address this. It's not equipped to adapt to this type and speed of change. Government will be increasingly irrelevant.

3