Recent comments in /f/philosophy
Syllosimo t1_jaa50su wrote
Reply to comment by Avalanche2 in AI cannot achieve consciousness without a body. by seethehappymoron
If we simplify that much, we can also call humans just a bunch of "if else" statements operating on weighted data. And given that, how far do we have to go with those "if else" statements, as you say, before we can draw the line between machine and sentience?
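For illustration only, a minimal sketch (my own, with made-up weights, not anything from the thread) of what a single "if else" over weighted data might look like, a perceptron-style threshold decision:

```python
# A single "weighted if-else": a perceptron-style threshold decision.
# Weights, inputs, and bias are illustrative values, not from any real model.

def weighted_decision(inputs, weights, bias):
    # Combine the evidence: a weighted sum of the inputs.
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    # The "if else": fire (1) if the evidence crosses the threshold, else don't (0).
    if activation > 0:
        return 1
    else:
        return 0

# Example: three input signals, weighted differently.
print(weighted_decision([0.9, 0.1, 0.4], [0.5, -0.3, 0.8], bias=-0.6))  # -> 1
```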
LeopardOiler27 t1_ja9yhn0 wrote
Can anyone here prove that they themselves are conscious? Some truths that are self-evident to certain parties cannot be proven, but are nonetheless true. For example, somehow, some way, I know I am self-aware, but there is no way to prove that to anyone, nor can anyone rigorously prove their self-awareness to me.
I don't think current AIs are sentient to any degree.
Ghostyfied t1_ja9xbvm wrote
Reply to comment by eucIib in AI cannot achieve consciousness without a body. by seethehappymoron
Your argument might be right; we can only tell once someone actually experiences this. And because no one has experienced what it is like to be conscious entirely without a body (or at the very least, we do not know of such a case), we cannot say with certainty that it is possible.
And I think that means that, without further knowledge, both your argument and the author's concerning this specific situation could be correct.
livingthedream82 t1_ja9wkul wrote
Reply to comment by paxxx17 in AI cannot achieve consciousness without a body. by seethehappymoron
All this pain is an illusion
James_James_85 t1_ja9uwb0 wrote
Reply to comment by LeykisMinion007 in /r/philosophy Open Discussion Thread | February 27, 2023 by BernardJOrtcutt
>he discusses the experiments he did over 30 years that allude to us all being tapped into one human consciousness.
One interesting paradox solved by a global consciousness is the issue of split-brain patients. Patients who undergo a total corpus callosotomy have their two hemispheres completely separated; after the surgery, they emerge as two separately thinking entities, essentially two people in one body. Imagine undergoing such a surgery, slowly closing your eyes as you drift away under the anesthetic, then waking up afterward: which half would you find yourself as? I'd imagine wiring two separate brains together in the right way would likewise cause them to start thinking they are one person instead of two. The traditional view that each individual has their own consciousness fails to explain such paradoxes, or questions like "why am I aware of this body and not that body?", "why am I conscious of a body born in the 21st century instead of one in the year 3000?", or "why am I conscious of a human body instead of a bird's?"
I had such a thought too: I imagine a single consciousness throughout the entire universe, which experiences the universe in varying degrees of awareness through completely separate entities (brains, brain simulations, ...). However, it's important to note that even if this were true, no experiment would ever be able to establish any connection between separate organisms besides what can already be explained by the laws of physics. Whatever true consciousness is, it would appear that, paradoxically, it has absolutely no effect on our thoughts, decisions, or subconscious, as all those processes can be traced back to neural interactions and chemical processes in the brain. Lesioning certain parts of the brain, for example, can interfere with our decision making, personality, or memories.
>what sparks the initial thought in the brain?
Sensory input, for example, could serve as an initial spark. Certain types of neurons also fire spontaneously, due to certain chemical properties. I don't think true consciousness, whatever it is, can alter the path of a molecule or squeeze a neuron and cause it to fire; that just doesn't happen.
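For illustration, a minimal leaky integrate-and-fire sketch (a standard textbook model with made-up parameter values, not something from the comment) of how an external input can serve as the "initial spark" that makes a model neuron fire:

```python
# Leaky integrate-and-fire neuron: the membrane potential v integrates input
# current and emits a spike when it crosses a threshold. Parameter values
# are illustrative only.

def simulate(input_current, steps=200, dt=1.0, tau=20.0, v_rest=-65.0,
             v_thresh=-50.0, v_reset=-65.0, resistance=10.0):
    v = v_rest
    spikes = []
    for t in range(steps):
        # Leak toward rest plus drive from the input current.
        dv = (-(v - v_rest) + resistance * input_current) / tau
        v += dv * dt
        if v >= v_thresh:          # threshold crossed: the neuron "sparks"
            spikes.append(t)
            v = v_reset            # reset after the spike
    return spikes

print(simulate(input_current=0.0))  # no input -> no spikes
print(simulate(input_current=2.0))  # sustained input -> periodic spikes
```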
>I believe there’s much more to true consciousness than a bunch of 1’s and 0’s
I do too. It's easy enough to imagine why a complex enough brain would "think" it is conscious, but I could never see how it would be "truly" conscious. It's indeed a perplexing issue.
smallusdiccus t1_ja9sw5a wrote
Reply to Neuroscientist Gregory Berns argues that Thomas Nagel was wrong: neuroscience can give us knowledge about what it is like to be an animal. For example, his own fMRI studies on dogs have shown that they can feel genuine affection for their owners. by Ma3Ke4Li3
I don't agree with Nagel, but I think Berns is missing his point. Nagel argues that experience itself is a different and independent form of knowledge, acquirable only by the subject experiencing it. That is how experience escapes the cold data that scientific knowledge provides. Imagine we could describe every aspect of a newly discovered color (its wavelength in nanometers, how it affects the different photoreceptors of each animal that encounters it, which parts of the electromagnetic spectrum it absorbs and which it reflects, etc.). We would then theoretically know this color and every component that shapes it, but we still wouldn't get the effect, the new and different type of knowledge Nagel was referring to, that we'd get if we actually saw the color, if we experienced it. So if neuroscience can tell us what it is like to be an animal, Nagel would just respond that such knowledge only describes what it is to be an animal; we could only get what it is like to be an animal by being that animal in question, which is a form of knowledge that science essentially cannot reach.
Wroisu t1_ja9r7v5 wrote
Reply to comment by unskilledexplorer in AI cannot achieve consciousness without a body. by seethehappymoron
What if a future very advanced AI builds a human body for itself from the ground up?
fatty2cent t1_ja9ocqt wrote
Reply to comment by eucIib in AI cannot achieve consciousness without a body. by seethehappymoron
I think the problem is that there once was a mapped body part, and then it was removed. Are there phantom limbs in people who never had said limb? Can you have phantom limbs that exceed the normal limb arrangement of a human body? Likely not.
LeykisMinion007 t1_ja9nzck wrote
Reply to comment by James_James_85 in /r/philosophy Open Discussion Thread | February 27, 2023 by BernardJOrtcutt
I think before we can talk about a piece of paper having a consciousness, we need to prove that someone other than you has one.
I believe everyone does, but I can't see or feel your conscious mind. At the end of the day we all just assume we have one, which is a safe bet, but where and what is it?
If we look at it from an observer's perspective, then, knowing how I react and feel during my conscious state, I can assume you do as well; and the paper, for that matter.
But I think there is much more to consciousness than simply something mechanical. If you've read Power vs. Force by Dr. David R. Hawkins, he discusses the experiments he did over 30 years that allude to us all being tapped into one human consciousness.
So perhaps the conscious state we feel in the pure ego state is the separated “us” we know on the surface and the subconscious may have areas connected to everyone else.
Furthermore, as mentioned in Beyond the Quantum by Michael Talbot (and I'm paraphrasing), what sparks the initial thought in the brain? For example, you can follow the electrical impulses in the brain and say something like: this area of the brain sparked, which made the person open their hand, which sparked that area of the brain, which made them reach out for an apple, or whatever. But follow all the sparks back to the very first one: what made that spark?
Are we unable to measure such things yet? Or is something deeper taking place? Who knows?
I believe there’s much more to true consciousness than a bunch of 1’s and 0’s, or things like this paper example. Fun to entertain, but we still know so little about consciousness anyway.
Mustelafan t1_ja9m120 wrote
Reply to comment by NotObviouslyARobot in Neuroscientist Gregory Berns argues that Thomas Nagel was wrong: neuroscience can give us knowledge about what it is like to be an animal. For example, his own fMRI studies on dogs have shown that they can feel genuine affection for their owners. by Ma3Ke4Li3
Your example still falls prey to the problem of inverted spectra. Hypothetically, two people could have phenomenal experiences of sight and color that are exactly opposite of each other, and if these experiences were otherwise isomorphic (the relationships between the colors still proportionally the same), they could produce the exact same work of art yet each perceive it differently.
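To make the isomorphism point concrete, a minimal toy sketch (my own illustration, with hue values chosen arbitrarily) in which a hue inversion preserves every pairwise color relation:

```python
# Inverted-spectrum toy: two observers whose color experiences are related
# by a hue inversion. The mapping preserves every pairwise hue distance
# (it is structure-preserving), so any "artwork" built only from relations
# between colors comes out identical, even though each hue is experienced
# differently. Hues are degrees on a color wheel, values illustrative.

def invert(hue):
    # Observer B experiences the hue 180 degrees opposite to observer A.
    return (hue + 180) % 360

def circular_distance(h1, h2):
    d = abs(h1 - h2) % 360
    return min(d, 360 - d)

palette_a = [0, 60, 120, 240]               # observer A's hues
palette_b = [invert(h) for h in palette_a]  # observer B's inverted hues

# The relational structure is the same for both observers:
for (x, y) in [(0, 1), (1, 2), (2, 3), (0, 3)]:
    assert circular_distance(palette_a[x], palette_a[y]) == \
           circular_distance(palette_b[x], palette_b[y])
print("pairwise hue relations identical despite inverted experiences")
```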
Regarding bats, though blind humans are apparently capable of some form of echolocation, there's no way to know if their phenomenal experience of echolocation is the same as how bats experience echolocation. If their brains and brain activity are sufficiently similar we might reasonably infer that that's the case, but it's probably impossible to ever say for sure. Same with dogs; we can reasonably infer that dogs experience affection, but who can say whether the subjective feeling of affection is the same for dogs as it is for humans? This is where Mr. Berns has failed to properly address Nagel's question.
I might say affection is a sort of sweet feeling. You might say it's more red, to someone else it's gold or a feeling of levity. A dog might consider affection savory or warm. We all have experienced affection, but we may all experience it differently.
paxxx17 t1_ja9lnwy wrote
Reply to comment by MrGurabo in AI cannot achieve consciousness without a body. by seethehappymoron
It's our reminder here that we are not alone
DarkDracoPad t1_ja9l5tw wrote
But does the AI know this, cuz then they can start looking for a body 🧐
Alexander556 t1_ja9jvjk wrote
Reply to comment by Maximus_En_Minimus in /r/philosophy Open Discussion Thread | February 20, 2023 by BernardJOrtcutt
I personally think that, if something like this existed, we should only use it to prevent harm. If someone has planted a bomb somewhere, I have no problem with violating his privacy and reading his mind.
NullRad t1_ja9jdqh wrote
Reply to comment by ErisWheel in AI cannot achieve consciousness without a body. by seethehappymoron
How’s that working out for you… your dopamine bound to winning low risk arguments?
ErisWheel t1_ja9j5s9 wrote
Reply to comment by NullRad in AI cannot achieve consciousness without a body. by seethehappymoron
>when you Hitchens a Diogenes? Behold, a chicken
Cool, man. You've read some ancient philosophy somewhere and mish-mashed it with name-dropping Hitchens for some reason. Good stuff and keep those quips rolling, no matter how nonsensical they may be.
Whatever bone you've got to pick from here on out, the bacteria idea you offered earlier wasn't a good one.
Alexander556 t1_ja9iscw wrote
Reply to comment by Fuyoc in /r/philosophy Open Discussion Thread | February 20, 2023 by BernardJOrtcutt
Something like that was called "mind rape" in another context.
NullRad t1_ja9i11x wrote
Reply to comment by ErisWheel in AI cannot achieve consciousness without a body. by seethehappymoron
What do you get when you Hitchens a Diogenes? Behold, a chicken.
ErisWheel t1_ja9hpir wrote
Reply to comment by NullRad in AI cannot achieve consciousness without a body. by seethehappymoron
Do you know what ad hominem means? Because this ain't it.
You said you don't need evidence because you made a "tongue in cheek" comment on r/philosophy, which seems to suggest either a) you don't think evidence is important for arguments, b) you don't know what tongue in cheek means, or c) you think r/philosophy isn't a place that requires the above, or some combination of all of that.
How's what working out for me? Calling out a bullshit argument? Not all that hard, really. Feel free to provide support if you don't think that's true, but I'm not sure why you're upset that someone doesn't take your point seriously when your justification is "I don't need shit because my comments are flippant and this is r/philosophy".
[deleted] t1_ja9gck0 wrote
Reply to comment by Otarih in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
[deleted]
NullRad t1_ja9fned wrote
Reply to comment by ErisWheel in AI cannot achieve consciousness without a body. by seethehappymoron
How’s throwing ad hominems & snuck premises at people who don’t care working out for you?
ErisWheel t1_ja9fdlb wrote
Reply to comment by NullRad in AI cannot achieve consciousness without a body. by seethehappymoron
"My argument is bad and I don't care/don't believe it anyway."
Gotcha.
NullRad t1_ja9dxjm wrote
Reply to comment by ErisWheel in AI cannot achieve consciousness without a body. by seethehappymoron
Evidence? I don’t need shit to make a tongue in cheek comment on r/philosophy.
ErisWheel t1_ja9dpvi wrote
Reply to comment by NullRad in AI cannot achieve consciousness without a body. by seethehappymoron
Your hunger and thirst sensations are hormonally driven. They don't arise as a result of bacterial activity.
You're making huge sweeping assumptions: because the volume of bacteria in the human body is very high, they must "control everything". That's not how our biology works. There's no evidence at all that bacterial function in the body has any sort of causal link to higher-order brain function. Altered states of consciousness can arise as a result of serious infection, but that's not at all the same as bacteria being able to coordinate and "control" what the body does or how the conscious mind acts and reacts.
You'd need a LOT more evidence to even come close to supporting what you're suggesting.
Base_Six t1_ja9cmzl wrote
Reply to comment by unskilledexplorer in AI cannot achieve consciousness without a body. by seethehappymoron
If I grow a bunch of human organs, brain parts and whatnot in a lab and put them together into an artificial human, would I then not expect consciousness because of how the structures emerged? It seems most intuitive that, if I compose a physical structure that is the same as a naturally grown human body and functions in the same way, the brain and mind of that entity would be the same as a "natural" human's.
I can extrapolate, then, and ask what happens if I start replacing organic components with mechanical ones. Is it still conscious if it has one fully mechanical limb? How about all mechanical limbs? What if I similarly take out part of the brain and replace it with a mechanical equivalent?
techhouseliving t1_jaa788t wrote
Reply to AI cannot achieve consciousness without a body. by seethehappymoron
Fine, then give me the agreed-upon definition of consciousness?