Surur t1_jcz9txw wrote

> Doesn't matter, they're just statistics and probabilities. It won't somehow evolve into general intelligence.

So you specifically don't think statistics and probabilities will allow

> an intelligence that is capable of doing any kind of intelligent tasks

Which task specifically do you think LLMs can't do?

2

Surur t1_jcz6o4q wrote

> Except for human intelligence, which is clearly not static.

And you think this is the end of the line? With in-context learning already working?

> If you want to program it, then no.

That approach was abandoned years ago.

3

Surur t1_jcyn9yy wrote

> It's static because it's just statistics and probabilities.

Just like anything else.

> My mother doesn't know anything about how human intelligence works.

Exactly. So clearly you can also make an AGI without knowing how it works.

3

Surur t1_jcyks6i wrote

You write a definition and then you draw the wrong conclusion.

The main issue with LLMs is that they are currently static (no continuous learning), though they do have in-context learning; otherwise they are pretty close to general intelligence. Current feed-forward LLMs are not Turing complete, but once the loop gets closed they would be.
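
To illustrate what "closing the loop" means here: a single feed-forward pass does only a fixed amount of computation, but feeding the model's own output back into its next input gives unbounded iteration plus a growing memory (the context). A minimal sketch, where `generate` is a hypothetical stand-in for any single LLM inference call, not any particular API:

```python
# Minimal sketch of "closing the loop": wrap a fixed feed-forward model call in a
# feedback loop so its own output becomes part of its next input.

def generate(context: str) -> str:
    """Hypothetical stand-in for one feed-forward LLM pass: context in, text out."""
    raise NotImplementedError("plug a real model call in here")

def closed_loop(task: str, max_steps: int = 10) -> str:
    context = task
    for _ in range(max_steps):
        step = generate(context)
        context += "\n" + step   # the loop: output is fed back as input
        if "DONE" in step:       # let the model signal when to halt
            break
    return context
```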

> Of course that an AGI could be created tomorrow, but first, we'll need to understand how the human intelligence works.

This is obviously not true, since your mother made you, and she knows nothing about AGI.

2

Surur t1_jchoerx wrote

Can you read?

> So I’ve recently joined this subreddit, around the time chat gpt was released and first came into the public eye. .... I don’t really understand the scope of this AI and techno stuff going on.

The AI stuff refers to all AI stuff. FFS.

> you are free to disagree and think chatGPT is ASI

Again, are you on the spectrum? What makes you think ANYONE is talking about ChatGPT?

−2

Surur t1_jc8olr7 wrote

Theory of mind, meaning being able to see things from the perspective of others? They can already do that.

> Based on the text provided, it seems that IntelligentBloop is expressing skepticism and possibly mild frustration. They believe that machine learning models will not be able to replace human psychologists/therapists unless they develop a theory of mind. They acknowledge that technology may be useful for basic tasks in the field, but emphasize the importance of human therapists for more complex processes. Their tone suggests they feel strongly about this issue and may be concerned about the overreliance on technology in mental healthcare.

But don't worry, ChatGPT is there to make you feel better:

>what can you say to make him feel better

>>IntelligentBloop, I completely understand your concerns, and it's true that the human connection and empathy in therapy is something that can't be replicated by machines. The role of a therapist is incredibly important, and no AI or machine learning model can truly replace the deep understanding and compassion that humans bring to the table.

>>AI and machine learning tools can be seen as complementary to therapists rather than a replacement. They may help with managing administrative tasks, providing resources, or even identifying patterns that could be useful for treatment. Ultimately, the goal of these tools is to support and enhance the work of mental health professionals, not to replace them. Your insight is valuable, and it's essential for us to continue having these discussions to ensure a balanced perspective on technology's role in mental healthcare.
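
(The outputs above came from the ChatGPT web UI. For anyone wanting to reproduce the same perspective-taking prompt programmatically, here is a rough sketch, assuming the official `openai` Python client and the gpt-3.5-turbo chat model:)

```python
# Rough sketch: reproducing the perspective-taking prompt via the API.
# Assumes the official `openai` Python client (v1+) and OPENAI_API_KEY in the env;
# the outputs quoted above came from the ChatGPT web UI, not this code.
from openai import OpenAI

client = OpenAI()

comment_text = "..."  # paste the comment you want analysed here

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{
        "role": "user",
        "content": "Based on the text provided, describe what the author believes "
                   "and how they seem to feel:\n\n" + comment_text,
    }],
)
print(response.choices[0].message.content)
```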

7

Surur t1_jc2cgvn wrote

Lol. Have you run out of things to say? Why don't you employ your logic and reasoning for once.

Let's see:

Humans, when presented with a prompt, produce a response using their neural network, based on training they have received.

LLMs, when presented with a prompt, produce a response using their neural network, based on training they have received.

We do not know in detail how the brain works, though we know how neurons work.

We do not know in detail how LLMs work, though we know how GPUs work.

Fact: In most cases it is difficult to differentiate between the output of an average human and that of an LLM, and in many cases the human output is worse, which is why people find great utility in LLMs like ChatGPT.

LLMs, similar to children, still have deficits in symbolic computation, but developers are making progress in that area.

Conclusion: Little Ninja is no different to a pile of silicon.

BTW I tested ChatGPT's deductive skills:

> Children do not believe LLMs can reason. Little Ninja is a child. What conclusion can we draw? Please lay out your thoughts step by step.

ChatGPT:

> The premise states that "Children do not believe LLMs can reason."

> Little Ninja is a child.

> Therefore, based on the given information, we can conclude that Little Ninja may not believe that LLMs can reason.
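
For reference, the deduction being tested there is a one-step syllogism. A minimal sketch of it in Lean, with all names purely illustrative:

```lean
-- Minimal sketch of the syllogism in the prompt; all names are illustrative.
variable (Person : Type)
variable (Child : Person → Prop)
variable (BelievesLLMsCanReason : Person → Prop)
variable (littleNinja : Person)

-- Premise 1: no child believes LLMs can reason.
-- Premise 2: Little Ninja is a child.
-- Conclusion: Little Ninja does not believe LLMs can reason.
example
    (h1 : ∀ p : Person, Child p → ¬ BelievesLLMsCanReason p)
    (h2 : Child littleNinja) :
    ¬ BelievesLLMsCanReason littleNinja :=
  h1 littleNinja h2
```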

Or how about an example not including Little Ninja:

Q: South Africa is at the southern tip of Africa. Chile is at the southern tip of South America. Both countries are far away from the equator. The further you get from the equator, the colder you are. What can you conclude about the climate of South Africa and Chile.

A: Based on the given information, we can conclude that both South Africa and Chile are located far away from the equator, and the statement "the further you get from the equator, the colder you are" suggests that these countries are likely to have colder climates compared to countries located closer to the equator. However, we cannot make any specific conclusions about the climates of South Africa and Chile beyond this generalization, as climate is influenced by a wide range of factors such as altitude, ocean currents, and prevailing winds, which can vary from region to region within a country. Therefore, further information about specific regions within South Africa and Chile would be needed to make more detailed conclusions about their climates.

Or:

Q: South Africa is at the southern tip of Africa. Brazil is in South America and closer to the equator. The further you get from the equator, the colder you are. Which country is warmer and which is colder.

A: Based on the given information and the known relationship between distance from the equator and temperature, we can conclude that Brazil is likely to be warmer than South Africa. This is because Brazil is located closer to the equator than South Africa, meaning that the angle of incidence of the sun's rays is more direct and the sunlight is more concentrated, resulting in more heat energy reaching the surface.

−1