AdditionalPizza

AdditionalPizza OP t1_ist6m5x wrote

Sorry, I know what you're saying. My question was posed as "without AGI", as in: between now and whenever AGI becomes a reality, if coding (actually writing code from text prompts) gets simple enough, then anyone could do it.

I really see no reason an AGI would be needed for that; a narrower AI that specializes in text-to-code would be more than sufficient. You don't need the AI to physically go and meet/discuss with clients.

Maybe we're just in disagreement on the level of AI needed to do enough automation toward programming to cause layoffs in the industry.

2

AdditionalPizza OP t1_issvfvh wrote

Yes, but from the company's financial angle, if they can get the same output as today for half the cost, that's just as much of an option as twice the output at the same cost.

Obviously it isn't that simple. But the reasoning can be applied to any ratio: the company might want to keep production levels the same and invest the savings elsewhere.

3

AdditionalPizza OP t1_isscciv wrote

>You can literally change one symbol in codebase and it fails or has bug.

But doesn't the new Codex check and correct itself?

But I know it's not the best comparison, and I understand there's much more to programming than just typing out code. It's more a question of whether the increased efficiency of each programmer will in turn mean fewer programmers are required to get a task completed. I don't think high-skill positions are going to be axed quite yet. But entry-level positions? I just don't see how they will be as plentiful as they are today.

Even if Codex isn't sufficient with the current model being released soon, these AIs have a track record of starting slow, then getting impressive, then mind-boggling within a year, and after that being updated and improved every few months.

Again, it doesn't have to be full automation, just enough to replace the majority of entry level positions.

1

AdditionalPizza OP t1_issbqr7 wrote

>It won't speed up the singularity. All techs seem to follow some form of exponential curve

I understand what you mean. But when you zoom in on an exponential curve, it's often made up of smaller S-curves.

Increasing the efficiency, accuracy, and speed of programming, while reducing its cost, should theoretically give a boost to the S-curves across different sectors.

The nature of an S-curve is that it plateaus for a while before accelerating again; a large enough upward S-curve near the end of the exponential could be a sign of an impending take-off.

Sorry this is hard to explain without drawing a diagram.
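Since I can't draw one here, a rough numerical sketch instead (all the curve parameters below are made up purely for illustration):

```python
import math

def logistic(t, midpoint, height, steepness=1.0):
    """One S-curve: slow start, rapid middle, plateau at `height`."""
    return height / (1 + math.exp(-steepness * (t - midpoint)))

def stacked_s_curves(t):
    """Successive S-curves, each one larger and starting later.

    Each component flattens out on its own, but the sum keeps
    climbing roughly like an exponential.
    """
    return sum(logistic(t, midpoint=10 * k, height=2 ** k) for k in range(1, 6))
```

Evaluating the sum at t = 10, 20, 30, 40 gives roughly 1, 4, 10, 22: every individual curve plateaus, yet the composite more than doubles each decade, which is the "exponential made of S-curves" picture. A bigger-than-usual jump in one component (say, programming efficiency) would drag the later, larger curves earlier.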

I feel as though this could be a signal toward much closer transformative AI than expected by most people. Maybe not the singularity yet, but a revolution nonetheless.

1

AdditionalPizza OP t1_issa9z8 wrote

How many programmers do you think will be able to transition into higher-level software development? There are a lot of web developers and other lower-skill programmers out there, tons and tons of them today, and I feel like a lot of them won't make the cut transitioning to higher-skill programming and engineering.

2

AdditionalPizza OP t1_iss9qxs wrote

I think directly increasing efficiency for programmers has the effect of accelerating everything you say is "on the other hand".

But I'll ask you this:

Say this increases programmers' efficiency by 100%, so now 1 programmer is able to get as much work done as 2 programmers. You can say "great, now they can get twice as much work done, so the company will get double the value."

But if that's truly the case, why wouldn't that company, today, just hire more programmers? Is it a matter of cost? If so, wouldn't it also make sense that a company could keep half as many programmers once each one is twice as productive?
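The tradeoff in rough numbers (a toy model; every figure here is hypothetical):

```python
def options_after_efficiency_gain(headcount, salary, efficiency_multiplier):
    """Two ways a company can spend a productivity gain:
    keep everyone and ship more, or hold output flat and cut payroll."""
    same_staff_output = headcount * efficiency_multiplier
    staff_for_same_output = headcount / efficiency_multiplier
    payroll_saved = (headcount - staff_for_same_output) * salary
    return same_staff_output, staff_for_same_output, payroll_saved

# 10 programmers at $100k each, tooling doubles their efficiency:
output, staff, saved = options_after_efficiency_gain(10, 100_000, 2)
# 20 "programmer-units" of output, or the same output from 5 people,
# freeing $500,000 of payroll to invest elsewhere.
```

Which branch a company picks is exactly the open question: both are rational, and nothing forces them to choose the one that keeps headcount flat.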

1

AdditionalPizza OP t1_iss93n2 wrote

But how long would it take to train someone to get the prompts down? It has to be less time than it takes to learn a programming language. It effectively removes the part of learning programming that most entry-level hires are paid decent salaries for.

I don't think it's a matter of replacing all or most. It's a question of how competitive and unskilled entry-level positions become.

2

AdditionalPizza OP t1_iss8f75 wrote

In the short term (5 to 10 years), this is pretty much the same feeling I have toward this subject.

I have a feeling the plentiful low-skill careers available now in programming/development are going to disappear. I wouldn't be surprised if it's nearly overnight. What company would continue to pay millions of dollars a year in wages when anyone with basic English will be able to do the job? I'm not worried about the higher-skilled engineers yet. I'm worried about entry-level positions.

Even if those positions don't disappear, I imagine there just won't be many new entry positions, companies will already be over staffed.

I'm saying this as someone who started to learn programming wanting to do a career shift. I was interested in it my whole life but went a different direction, and now I'm bored and feel at a dead end in my career. I started learning web development earlier this year but feel like it's a wasted effort. It'd be 2 years before I finish learning enough to try to land a job, and 2 years from now I feel like that job won't exist.

4

AdditionalPizza OP t1_isqnpnt wrote

>Btw, my time horizon for AGI that's capable of doing most economically valuable tasks is 2030.

We're in the same boat. Realistically, my question could be applied to basically any sector; it's just a matter of when. But yes, if it's 2030 before a possible disruption from full automation, it doesn't matter much what career someone chooses.

But it's impossible to think that far ahead, and given that programming could very likely be the main subject in the crosshairs for various reasons (it's the end goal for AI?), I'm apprehensive about further studying. I'm not in high school; it's more of a second-career situation, and time is of the essence to me, as in I don't want to gamble on the wrong horse.

2

AdditionalPizza OP t1_isqkeo6 wrote

At the risk of sounding disparaging: over the past year or so, I've found that most people in a "threatened" industry overestimate how irreplaceable their skills are to AI, seeing it instead as just an effective tool to increase output. It wasn't until text-to-image suddenly gave anyone who can type a prompt the ability to make beautiful art that graphic designers started to feel the looming storm.

Personally, after consideration, I think the most strategic sector to disrupt would be programming. Not because it's the easiest (programming has a lot more involved than just typing code), and not so AI can just write its own code and self-improve (we're probably a ways out from that). The reason I think it's the best to go first is that it would have the greatest effect across all industries. We don't want to sit around wondering when our jobs are all going to disappear; we'd want it to happen to as many people as possible, as quickly as possible. Programmers are the foundation of basically any business that has any aspect of tech involved. It'd be the biggest blow, which is a good thing in the long run.

But I don't know. I really do wonder if it's a career worth starting now. It feels like whack-a-mole for anyone trying to plan a career, because any sector could be disrupted in a month.

3

AdditionalPizza OP t1_isqj3qh wrote

Do you envision the "text-to-anything" AI we see cropping up staying as incapable as it is now? I have to assume the next iteration of GPT, or whatever comes next, will be markedly improved. I think it comes down to how long it will be until someone can enter text like "make a program that does so-and-so" and it does it. Give another prompt to edit any undesirable results, and then you're finished.

I know this is a total oversimplification of programming, but I certainly don't think it requires full AGI to do that sort of thing. I don't know how large your team is, but something well short of AGI could probably replace most of them, if Codex evolves at anything like the rate of other AI that uses this approach.

But the real question I'm asking isn't really about the skilled members of your team over the next 5 to 10 years. I'm asking about right now: are the prospects of beginning a career or schooling to learn programming assumed to be fruitful? There's no denying programmers are skilled workers now, but will this tech open the floodgates for anyone with the ability to write basic text prompts? That would effectively reduce the number of skilled programmers needed by a great percentage, making the skilled positions much more competitive and the unskilled ones much lower paid.

2

AdditionalPizza OP t1_isqhgfo wrote

Appreciate the comment, it's well thought out.

The first point is often brought up in regard to the singularity, though I personally think it's the least likely outcome. Comparing the upcoming revolution, when transformative AI starts knocking out employment sector after sector, to the industrial revolution just doesn't compute in my mind. There's a big difference between machines being operated and machines operating themselves. Not only operating themselves, but doing it better than humans in every measurable way.

But the point that really matters here is the last one, as it seems we are just sitting and waiting for the first disruption to bring things to a screeching halt. I know text-to-image gave us a taste of what that can be, but graphic designers (no offence to them) are not essential to today's economy. They are a subset within a non-essential sector (creative arts, more or less).

Programmers, however, may be the first realistic sector to see upheaval from AI. I imagine it's probably also the most important one to go first, because it might be the largest domino in moving the transition from number 7 to "the one that isn't numbered" in your list.

3

AdditionalPizza OP t1_isqa9am wrote

I suppose that just leaves the burning question we can't answer, which is how long?

I just don't understand how the swathes of lower-skilled programming positions won't be obliterated. My argument (if you want to call it that) is not that higher-skilled workers will feel the full effect of this, but that the lower-skilled ones simply won't be reusable elsewhere once their job only takes basic prompts. They certainly won't be moving to other countries for work while still demanding decent pay.

Programmers of all levels are considered skilled workers, until an AI reduces the skill needed to basic text prompting. Is it worth it to start a 4-year university course today with the hope of getting into web development? Today that's considered a skilled position; 4 years from now it might be as trivial as a call-centre position.

4

AdditionalPizza OP t1_isq6u2z wrote

>You don't ever have to worry about that.

Every single post I've ever submitted here has had that happen. Talked to the mods and they said my threads are too low effort/speculative. But anyway.


>But it's simply incapable of dispensing with the need for programmers

What do you mean by this? Incapable how?

But yes, if they migrate to low/no-code platforms, that's exactly what I'm getting at. The skill level will plummet and there will be a huge pool of people capable of turning text into code. The higher-level engineers will likely remain highly paid, but their positions will be extremely competitive.

3

AdditionalPizza OP t1_isq65kh wrote

So, you think it will increase the demand for developers initially? At the same pay scale and for how long? A decade? 2 years?

Web devs seem like they're always in demand, and I agree not every programmer will be able to transition to state-of-the-art environments. But I can't imagine large non-tech companies having a big demand for additional programmers when their mode of operation usually consists of having too few for the job as it is. I imagine those companies would have far fewer human programmers working with something like Codex to get as much done as possible while understaffed.

I also think a text to code AI will severely reduce the skill needed for a programmer to begin with (outside of state of the art stuff). Thus reducing wages, and demand.

I'm trying to imagine how this won't gravely affect employment in the field, especially as the AI gets better and better at writing code.

5

AdditionalPizza t1_isppjrh wrote

Are we talking full automation, or partial? Full meaning every single job in that sector, partial meaning anything between zero automation and full automation.

In the partial automation category, anything that just requires a brain will be automated first. Anything that requires a body will probably be last. There might be a few people left to do certain tasks, but ultimately everyone else would have to be on some kind of universal income at that point anyway, if money even exists in the same way it does now.

As for in the coming years, I think most of the jobs you listed will be mostly automated by 2030. I think full automation could be a while but who knows.

5

AdditionalPizza t1_ispk568 wrote

>I'm guessing it's more of a search engine type thing)

It isn't; it's fed training data, and then that data is removed. It literally learns from the training data. Much like when I say the word river: you don't just imagine a river you saw in a Google image search. You most likely think of a generic river that could be different the next time someone says the word, or maybe it's a quick rough image of a river near your house that you have driven by several times over the years. Really think about and examine what the first thing that pops into your head is. Do you think it's always the EXACT same? Do you think it's very detailed? The AI learned what a river is from data sets, and understands when it "sees" a painting of a unique river, the same as you and me.


>It can't 'imagine', it can only 'access what we've given it'.

This is exactly what the op asked for an answer to. You say it can't imagine something, it just has access to the data it was given. How do humans work? If I tell you to imagine the colour "shlupange" you can't. You have no data on that. Again, I will stress, these transformer AI have zero saved data the way you're imagining it that it just searches up and combines it all for an answer. It does not have access to the training data. So how do we say "well it can't imagine things, because it can't..."

...Can't what? I'm not saying they're conscious or have the ability to imagine. I'm saying nobody actually knows 100% how these AI come to their conclusions, outside of using probability to pick the best answer, which appears to be similar to how human brains work when you really think about the basic process that happens in your head. Transformers are a black box at a crucial step in their "imagination" that isn't understood yet.
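The "keeps the rule, not the data" point can be shown with a toy model (a trivial curve fit standing in for a real network; this is an analogy only, not how a transformer actually works):

```python
import numpy as np

rng = np.random.default_rng(0)

# "Training data": 200 noisy examples of a hidden rule, y = 3x + 1.
x_train = rng.uniform(-5, 5, 200)
y_train = 3 * x_train + 1 + rng.normal(0, 0.1, 200)

# "Learning" compresses all 200 examples into just two parameters.
slope, intercept = np.polyfit(x_train, y_train, 1)

# Throw the training data away, as a deployed model effectively has.
del x_train, y_train

# The model still answers for an input far outside anything it ever saw,
# because what it retained is the learned rule, not the examples.
prediction = slope * 100 + intercept  # close to 301
```

A real image or language model is vastly more complicated, but the principle is the same: after training, there is no database of examples left to "search", only parameters that encode what was learned.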

When you're reading this, you naturally just follow along and understand the sentence. When I tell you something, you instantly know what I'm saying. But it isn't instant; it actually takes a fraction of a second to process. Can you describe what happens in that quick moment? When I say the word cat, what exactly happened in your brain? What about turtle? Or forest fire? Or aardvark? I bet the last one tripped you up for a second. Did you notice your brain trying to search for something it thinks it might be? You had to try to remember your training data, but you don't have access to it, so you probably made up some weird animal in your head.

31

AdditionalPizza t1_ispeqhe wrote

I love this debate, it happens over and over about this stuff.

People think it's using a database of images or whatever, but the training data isn't stored, and the model doesn't have access to it. It literally learned from it. Others just dismiss it because "we're not there yet", with no real further explanation.

Do I think it's conscious? Probably not; I think it needs more senses for that, to truly understand what "feel" and "see" mean. But even that doesn't necessarily matter. As a human, I am incapable of really understanding another being's experience of consciousness, human or not. It's like the colour red: I can't prove that you and I both see the same colour we call red.

But what we do know is that we don't understand how human consciousness works, so why are we so quick to say AI doesn't have it? I'm not saying it does, just that we aren't 100% sure. Two or three years ago I would've said no way, but at this point I'm starting to think Google (or others) may have achieved far more in the realm of self-awareness/consciousness than what's publicly known. They're actively working on giving AI those other senses.

6

AdditionalPizza OP t1_isgin0q wrote

Oh, then I'm afraid our signals got mixed up somewhere along the line.

I do wonder if the singularity will affect those who refuse to take part in the technology preceding it. As in, some people choose to live off the grid; will they be left alone? I don't know, topic for another time I suppose.

1