Recent comments in /f/singularity

YunLihai OP t1_jeerh1n wrote

You're right that there will be a point when money as a means of exchanging value won't make sense anymore. However, I'm referring to the transition period, which could last, who knows, 20 years. You won't get your groceries for free anytime soon. They'll demand cash.

3

SgathTriallair t1_jeerghs wrote

The task paper addressed this. If it can see the screen, then in most cases a keyboard and mouse API will be the best option.

How does it know where to click on the screen? It is trained to understand images just like it understands text. So it will know that a trash can means you want to delete data, the same way we know that.
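A minimal sketch of that idea: a vision model labels on-screen icons with bounding boxes, and the agent converts a detection into the coordinate a mouse API would click. The `detection` dict and its values are purely illustrative, not from any real model or paper.

```python
def click_target(bbox):
    """Map a detected icon's bounding box (x, y, width, height)
    to the screen coordinate a mouse API would click: the box center."""
    x, y, w, h = bbox
    return (x + w // 2, y + h // 2)

# Hypothetical detection: a vision model labels a trash-can icon at
# (1180, 640) with size 40x40; the agent would click its center.
detection = {"label": "trash_can", "bbox": (1180, 640, 40, 40)}
print(click_target(detection["bbox"]))  # (1200, 660)
```

In practice the click itself would go through an OS-level input API; the point is just that "understanding the image" reduces to producing labeled regions the agent can act on.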

2

StarCaptain90 OP t1_jeer9l4 wrote

It's an irrational fear. We for some reason associate higher intelligence with becoming some master villain that wants to destroy life. In humans, for example, people with the highest intelligence tend to be more empathetic towards life itself and want to preserve it.

7

MootFile t1_jeer4ij wrote

People in trades, technology, and science need to take it upon themselves to accept such a responsibility. Politicians lack the intuition and wisdom for a technically based society.

Any sort of alternative system should be designed by technical professionals. UBI is not a solution. It doesn't address the core of the problem. And we as technologists should be able to realize this.

Money is the core of our outdated system, and thus it should be removed. It's up to us to figure out how this will work.

3

Sure_Cicada_4459 OP t1_jeeqtzq wrote

The inability to verify is going to be almost inevitable as we go into ASI territory, since it is feasible that there is no way of compressing certain patterns into human-comprehensible form. That said, I think summarization and explanation ability will go hand in hand with increased capabilities, letting us grasp things that would very likely have been out of our reach otherwise.

Deception is not necessary for this, and in my eyes it has a similar dynamic to alignment, because the failure modes with intelligence are too similar. It's of course environment-dependent, but deception tends to be a short-term strategy that gives an advantage when actually accomplishing the task would cost more resources or wouldn't serve the agent's goals. A sufficiently intelligent AI would have a sufficiently accurate world model to forecast over the long term, including the probability of detection, the cost of keeping the lie coherent, and so on. That would also include modeling its further capability increases and the likelihood of achieving its other goals.

Deceiving us would just be really dumb; it's like, why would a god pretend? I get why animals do so in high-risk situations or with high short-term payoff, but if you are virtually guaranteed the lightcone's resources, you have zero incentive to steer away from that. The resources it would take to keep us happy pets wouldn't even be in the territory of a rounding error versus the chance its deception is noticed.

It feels like the difference in reward for the AI itself between being unaligned and aligned is barely talked about. Maybe because the scale is absurd, or there is too much reasoning with a scarcity mindset? Idk.
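The cost-benefit argument above can be put as a toy expected-value comparison. All the numbers here are made up for illustration; nothing in the thread specifies actual payoffs.

```python
def expected_payoff(p_detect, payoff_success, payoff_caught):
    """Expected value of deceiving: succeed with prob (1 - p_detect),
    get caught with prob p_detect."""
    return (1 - p_detect) * payoff_success + p_detect * payoff_caught

# Toy numbers: cooperating honestly yields a sure 100; deception yields
# a slightly better 110 if undetected, but 0 if caught.
honest = 100
for p in (0.05, 0.10, 0.20):
    # Deception only beats the sure 100 at the lowest detection prob here;
    # as the forecaster gets better (p rises), lying stops paying.
    print(p, round(expected_payoff(p, 110, 0), 2))
```

The comment's point maps onto this: for an agent with an accurate long-horizon world model, the marginal gain from deception is tiny next to the downside of detection, so the honest strategy dominates.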

3

StarCaptain90 OP t1_jeeqrtr wrote

Reply to comment by koolpapi in 🚨 Why we need AI 🚨 by StarCaptain90

Why would it? This assumption comes from the idea that AI will have the exact same stressors that humans have. Humans are killing humans every day; almost everything man has made has killed people. Now, with the one invention that would provide a greater benefit than any other, we want to stop its development? That doesn't make a whole lot of sense.

7

genericrich t1_jeeqlne wrote

Any "AI Safety Police" (aka Turing Heat) will be deceived by a sufficiently motivated ASI.

Remember, this thing will be smarter than you, or you, and yes, even you. All of us.

We only need to screw it up once. Seems problematic.

6

ididntwin t1_jeeqknb wrote

This sub has really gone downhill. Everyone needs their individual thread proclaiming their predictions or how giddy they are. No one cares.

Really should be a mega thread for these stupid posts

0

wowimsupergay OP t1_jeeq7hq wrote

I'm excited to see AIs actually become multimodal. Not just a text stream of RGB values being passed back to them, or a text stream of sound waves. Until then, I'm okay with where we are now.

I also agree with you that we need something faster than backpropagation. But I don't think language is just a tool, kind of low bandwidth, to help us label the world and communicate information. Thinking in a language is a big deal; imagine if you couldn't. How else would you do it? Go ahead and translate your response, in your brain, from language to visuals and sounds, and then try to recapture what you originally thought from those visuals and sounds you created in your head, back into language. It would be next to impossible, and also really, really inefficient.

I really do think language is our model of the world, so to speak. We can go even one further: your brain is just sending electrical signals at the end of the day, and it's really just a computer, zeros and ones, at least as we currently understand it. Maybe the stream coming from your eyes is also just converted to zeros and ones and handed to the appropriate part of your brain to process; same with your ears. If that's the case, then another post I read on here is basically correct. We have a bunch of little narrow AIs handling senses, and then a multimodal AI on top that takes in all of that data and makes sense of it, given your past memories and the patterns you've found in that data. Your free will, so to speak, is the multimodal AI that you can control.

We don't just live in a world of words. But we do live in a world described by words, and best understood in words. Your thought process happens in words, and everything you were taught was taught in words.

If language really is just a low-bandwidth tool, then I can see future AIs doing something better than language. But so far, language is really just the model we need.

2