Recent comments in /f/singularity

acutelychronicpanic t1_je9gsif wrote

INT. TITANIC - DECK - NIGHT

Panic-stricken passengers are running in every direction. A mother is clutching her child, and people are pushing each other to get on lifeboats. Water is gushing onto the deck.

PASSENGER 1 (screaming) We're going down! We're all going to die!

Amidst the chaos, CAPTAIN SMITH steps forward and raises his hands to calm the crowd.

CAPTAIN SMITH (firmly) Everyone, please, listen to me! I understand your fear, but there's no need to panic.

The crowd quiets down, turning their attention to the captain.

CAPTAIN SMITH (continuing) Throughout history, every time the water level has risen, there has always been more boat to climb. We may not see it now, but there could be even more boat to climb that we can't imagine.

PASSENGER 2 (uncertain) But, Captain... the ship is sinking!

CAPTAIN SMITH (smiling reassuringly) Trust me. We'll find a way to climb higher. We always do.

2

smokingPimphat t1_je9fso3 wrote

I wouldn't say 1000s, but it's easily 100 every year. There are 1000s of shit scripts that get sent; of those, maybe 100 could be good, of those maybe 20 get "sold" (meaning someone pays for the IP), and of those maybe 5 are good enough to go live (meaning go into production). AI could be the difference between 2 and 4 of those 5 getting made. It doesn't sound like much, but that is absolutely huge when you consider how many people (HUMANS) it takes to produce a "low budget" show.

1

Louheatar t1_je9fsii wrote

on a saturday a week ago (or maybe 2 weeks? idk, feels like a year ago, time has sped up haha), i decided to give gpt-4 a go. pasted it some code and told it to explain it, pressed enter.

my heart stopped for a moment when i started reading the response. it felt like everything i've ever known was false, and i was shaking for hours after that. :D a sudden total dramatic change of my entire world view. quit my job on the following monday (a tech startup facing a fundamental business risk from generative ai, no hope for the company anymore, no point staying), so i could focus on this full time. i've been fortunate enough to be employed in a bullshit job that gave me the opportunity to invest the extra money, so i can withstand the turmoil financially for a few years (in case this move seems too crazy for someone).

anyway, after that experience, i've had no stress about anything anymore. i think it's because i know all the super stressful grunt work i've had to do over the years, grinding with the computer from dusk till dawn, is now over. there's literally no reason to do that anymore. what's left is enjoying the ride and working on creative projects i've only dreamt of. 8)

just sharing this story because i'd be interested in hearing if others have similar stories :D

3

ShowerGrapes t1_je9fibd wrote

a vast simplification is this: neural pathways are created randomly at the start of each training cycle, then something is input (text, in gpt's case), the generated outputs are compared to the training data, and higher weights are attached to the pathways that generate the best output, reinforcing those pathways for future outputs. done millions or trillions of times, these reinforced pathways end up being impressive. how the pathways are created and updated is itself constantly changing and evolving, which is the programming aspect of it. eventually, the ai will probably be able to figure out how best to create the pathways itself. you can watch it in real time, see how bad it is in the beginning, then watch it get better. it's an interesting cycle. a rough sketch of that loop is below.
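
a rough sketch of that loop, assuming a toy numpy network learning XOR (all the names, sizes and numbers here are illustrative only, nothing like what gpt actually uses):

```python
import numpy as np

rng = np.random.default_rng(0)

# "pathways" start out random: weights and biases of a tiny 2-layer network
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

# training data: XOR of two inputs
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(20_000):
    # forward pass: input flows along the current pathways
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # compare the generated output to the training data
    error = out - y

    # backward pass: nudge the weights so pathways that produced better
    # output get reinforced (gradient descent on squared error)
    d_out = error * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

    if step % 5_000 == 0:
        # watch it start out bad and get better (exact numbers depend on the random start)
        print(f"step {step}: mean error {np.abs(error).mean():.3f}")
```

each pass nudges the weights a little toward whatever produced output closer to the training data, which is the "reinforcing the pathways" part.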

1

Tiamatium t1_je9fadb wrote

It probably won't.

Also look at the list chatGPT gave you, it broadly falls into two categories:

  1. Artists, and this is based on the idea that creativity and artistic expression are unique to humans and cannot be recreated by AI.

  2. Very smart knowledge workers with decades of specialized experience under their belt. The very fact that we consider these people supersmart should give you a hint that not everyone can be an AI systems developer or a researcher.

1

acutelychronicpanic t1_je9f9q0 wrote

Understanding, as it is relevant to the real world, can be accurately measured by performance on tasks.

If I ask you to design a more efficient airplane wing, and you do, why would I have any reason to say you don't understand airplane wings?

Maybe you don't have perfect understanding, and maybe we understand it in different ways.

But to do a task successfully at a high rate, you have to have some kind of mental/neural/mathematical model inside your mind that can usefully predict the outcome from changing inputs.

That's understanding.

1

Jinan_Dangor t1_je9e00f wrote

How'd you reach that conclusion? There are dozens of solutions to climate change right in front of us, the biggest opposition to these solutions is the people whose industries make them rich by destroying our planet. This is 100% an issue that can be solved by humans alone, with or without AI tools.

And why do you assume anything close to a 50% chance of paradise when AGI arrives? We literally already live in a post-scarcity society where the profits of automation and education are all going straight to the rich to make them richer; who's to say "Anyone without a billion dollars to their name shouldn't be considered human" won't make it in as the fourth law of robotics?

Genuinely: if you're scared about things like climate change, go look up some of the no-brainer solutions to it we already have that you as a voter can push us towards (public transport infrastructure is a great start). Hoping for a type of AI that many experts believe won't even exist for another century to save us from climate change takes up time you could be spending helping us achieve the very achievable goal of halting climate change!

1

acutelychronicpanic t1_je9dxaa wrote

Yes! This is exactly what is needed.

Concentrated development in big corps means few points of failure.

Distributed development means more mistakes, but they aren't as high-stakes.

That and I don't want humanity forever stuck on whatever version of morality is popular at Google/Microsoft or the Military.

275

fluffy_assassins t1_je9dwdw wrote

"is it gonna swing a hammer? are we all just gonna keep pretending there isn't a nationwide skilled labor shortage? There's plenty of work to go around. Everyone just thinks they're too good for it."

We are talking about jobs, not skilled jobs.

People think they're too good for unskilled labor.

Labor that is skilled requires skills. If it were as simple as swinging a hammer, it wouldn't be skilled labor. So you're talking about skilled labor as if it were unskilled labor, shifting the blame onto people who don't have the resources to acquire the skill, and then claiming they think they're too good for your skilled labor when there's just no way they can do it.

Yeah, no.

0

Kafke t1_je9drq3 wrote

Yes. You do realize our eyes only have three kinds of cones, right? RGB are the primary colors lol. CMY if you're looking at subtractive colors. Using these three colors, you can create every other color: RGB for light/additive, CMY for ink/paint/subtractive.

RYB is not primary in any sense of the word.
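
A minimal sketch of the additive vs. subtractive point, assuming colors are represented as (r, g, b) floats in [0, 1] (the function names are just for illustration):

```python
def mix_light(*colors):
    """Additive mixing (light): channels add up, clipped at full brightness."""
    return tuple(min(1.0, sum(c[i] for c in colors)) for i in range(3))

def rgb_to_cmy(rgb):
    """The subtractive primaries are the complements of the additive ones."""
    r, g, b = rgb
    return (1 - r, 1 - g, 1 - b)

RED, GREEN, BLUE = (1, 0, 0), (0, 1, 0), (0, 0, 1)

print(mix_light(RED, GREEN))         # (1.0, 1.0, 0) -> yellow
print(mix_light(RED, GREEN, BLUE))   # (1.0, 1.0, 1.0) -> white
print(rgb_to_cmy(RED))               # (0, 1, 1) -> cyan, red's complement
```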

1