Recent comments in /f/philosophy
Digital_Utopia t1_j9wfnpy wrote
Reply to comment by AllanfromWales1 in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
However, keep in mind that sophistication for a human and sophistication for a computer are two different things. Chances are, the more sophistication required on the part of a human, the easier the job is for a computer - the latter only struggles with what we consider easy, namely sight and the ability to hold a conversation.
While it's true that computers would struggle with creativity out of the blue, the type of creativity involved in actual jobs is much less than artists creating their own, independent art.
I mean, someone working on game art, or designing advertisements or websites, is getting parameters from their supervisors/clients similar to the ones given to Dall-E.
twnznz t1_j9wdk7b wrote
So what, you're gonna buy a massive training farm and just make it free to use? Because that's fundamentally what we're discussing, nothing to do with algorithms!
lordtrickster t1_j9wd15z wrote
Reply to comment by commandolandorooster in Reality is an openness that we can never fully grasp. We need closures as a means of intervening in the world. | Post-postmodern philosopher and critic of realism Hilary Lawson explains closure theory. by IAI_Admin
I function off a "working model for the situation" approach. You'll never know the entirety of everything, but you can know enough to be useful in a given situation.
JohnLawsCarriage t1_j9wcxns wrote
Reply to comment by norbertus in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
A big NVIDIA card? You'll need at the very least 8, and even then you're not coming close to something like ChatGPT. The computational power required is eye-watering. Check out this open-source GPT-2 bot that uses a decentralized network of many people's GPUs. I don't know exactly how many GPUs are on the network, but it's more than 8, and look how slow it is. Remember, this is only GPT-2, not GPT-3 like ChatGPT.
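For a sense of the scale gap, here is a minimal sketch (assuming the Hugging Face transformers library; this is not the bot linked above) of running GPT-2 on a single machine. GPT-2's 124M-1.5B parameters fit on one consumer GPU, while a 175B-parameter GPT-3-class model needs roughly 350 GB just to hold 16-bit weights, which is why multi-GPU clusters or volunteer GPU networks come into it.

```python
# Rough sketch: generating text with GPT-2 locally via Hugging Face transformers.
# This is an illustration of scale, not the decentralized bot mentioned above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained("gpt2")                 # 124M-parameter variant
model = AutoModelForCausalLM.from_pretrained("gpt2").to(device)   # fits on one consumer GPU

inputs = tokenizer("The job market apocalypse", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```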
edstatue t1_j9wcotn wrote
Reply to comment by quantumdeterminism in Reality is an openness that we can never fully grasp. We need closures as a means of intervening in the world. | Post-postmodern philosopher and critic of realism Hilary Lawson explains closure theory. by IAI_Admin
I don't agree that a higher level of consciousness necessarily correlates with a more accurate or complete perception of reality.
Perception doesn't require consciousness, and I think ants are probably not conscious, but we know they can perceive the world around them. Not exactly as humans do, but with various sensory organs.
Why then, would consciousness bring us closer to reality? How often does our conscious mind lie to us? We know that each time we recall a memory it gets modified before we move on... We know that our subconscious mind can even perceive stimuli better than our conscious mind (there's a reason you pull your hand off a hot burner without even thinking about it).
I posit that consciousness is a beautiful lie our bodies tell us, and that if we are to look for a living being on earth that experiences "reality" as closely as possible, it's going to be something that doesn't have sentience as a misleading bottleneck.
edstatue t1_j9wblzd wrote
Reply to Reality is an openness that we can never fully grasp. We need closures as a means of intervening in the world. | Post-postmodern philosopher and critic of realism Hilary Lawson explains closure theory. by IAI_Admin
What Lawson gets at with "openness" being all the possible ways in which a thing can be perceived reminds me of the quantum cosmological idea that reality is inherently probabilistic, and that all the different "options" available in a wave function never truly collapse, but collapse for each reference point in potentially different ways.
But where there's a difference is the idea that we're "always infinitely distant from the true open nature of things." The school of quantum mechanics I'm thinking of suggests that there is NO "God's-eye view" of reality, and thus every reference frame is equally legitimate, since no one and no thing can experience multiple reference points simultaneously.
So when Lawson says that reality is a bunch of homogeneous stuff that only appears to have differentiation when we use a closing tool like language, that's not far off from what quantum theorists have to say about reference frames and observation (or interaction).
Edit: a word
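For concreteness, the probabilistic picture this comment appeals to is the standard textbook one (a generic formulation, not specific to Lawson or to any particular interpretation): a system sits in a superposition over all its "options," and the Born rule assigns probabilities to those options only relative to a measurement.

```latex
% A generic superposition over outcomes |i>, with Born-rule probabilities.
% In the relational / no-God's-eye-view reading referenced above, no single
% outcome is picked out absolutely; "collapse" is relative to a reference frame.
\[
  \lvert \psi \rangle = \sum_i c_i \,\lvert i \rangle ,
  \qquad
  P(i) = \lvert c_i \rvert^{2},
  \qquad
  \sum_i \lvert c_i \rvert^{2} = 1 .
\]
```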
TheRoadsMustRoll t1_j9w9fdr wrote
Reply to comment by norbertus in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
>This article is kind of nonsense.
yep. here are some highlights:
>AI Creativity is Real
despite the author's wordy arguments, AI requires input, and only that input (however jumbled) will be returned on a query. if it were really creative it could dream up something on its own and populate a blank sheet of paper with something novel. AI isn't creative. the people who program it might be.
>3. Comparison: Human Brains vs. AI
despite the title of this section the author never actually makes any comparison. we only get this:
>The present analysis posits that the human brain, in terms of artistic creation, is lacking in two conditions that AI is capable of fulfilling.
>
>AI decomposes high-dimensional data into lower-dimensional features, known as latent space. [AI is more compact]
>
>AI can process massive amounts of data in a short time, enabling efficient learning and creation of new data. [AI is more comprehensive]
ftr: the human brain processes a massive amount of data and succeeds in keeping living beings alive while driving/painting/writing code. the list of things human brains can do that AI can't is very long.
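For anyone unsure what the quoted "latent space" sentence is actually describing, here is a toy sketch (my own illustration, assuming PyTorch; it is not from the article): an autoencoder squeezes high-dimensional inputs through a small latent vector and reconstructs them from it. Note that this only illustrates compression; whether that counts as creativity is exactly what's in dispute above.

```python
# Toy autoencoder: high-dimensional data -> low-dimensional "latent space" -> reconstruction.
import torch
from torch import nn

class AutoEncoder(nn.Module):
    def __init__(self, input_dim=784, latent_dim=16):
        super().__init__()
        # encoder: compress the input down to a small latent vector
        self.encoder = nn.Sequential(nn.Linear(input_dim, 128), nn.ReLU(),
                                     nn.Linear(128, latent_dim))
        # decoder: rebuild the input from the latent vector
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                     nn.Linear(128, input_dim))

    def forward(self, x):
        z = self.encoder(x)          # the "latent space" representation
        return self.decoder(z), z

model = AutoEncoder()
x = torch.rand(32, 784)              # a batch of made-up 784-dimensional inputs
reconstruction, latent = model(x)
print(latent.shape)                  # torch.Size([32, 16])
```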
Alexander556 t1_j9w96c7 wrote
If we had a device which could reliably (and painlessly) read people's minds without their consent, and could be used to pull specific information from someone's brain, should we restrict its use*?
Should we use this device at all?
-Would it be unethical to search the brain of a criminal to find information on a crime?
-Should such information be used in court, or would that amount to being made to witness against oneself?
-Would it be unethical to search the mind of a convicted murderer, to find the locations where s/he hid some of the victims, to give the families closure?
-Would it be a war crime to extract vital knowledge from a POW?
*For example, restricted to use by psychiatrists etc. to help consenting patients, for research purposes, etc.
Quantum-Bot t1_j9w8s44 wrote
Whatever we want AI to be like, it will most likely turn out much like computers and the internet, accessible to all who are tech savvy, but dominated by the elite. Everyone can benefit from AI in their daily lives and companies like Amazon and Google are happy to provide that service, as long as they can skim data and ad revenue off of every interaction.
skwww t1_j9w88rs wrote
Was this written by an AI?
What's with all the jumping around between topics?
theplanet1972 t1_j9w6c3r wrote
Reply to comment by Wegwerpbbq in Reality is an openness that we can never fully grasp. We need closures as a means of intervening in the world. | Post-postmodern philosopher and critic of realism Hilary Lawson explains closure theory. by IAI_Admin
Can anyone recommend a work by Wittgenstein that covers similar territory?
22HitchSlaps t1_j9w4onh wrote
Reply to comment by AllanfromWales1 in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
With how narrow AI is now, I tend to agree with you. No one is going to pay for that. But the thing is, we just don't know when AGI is coming; it may be a long way off. Even so, I see the continued prevalence of narrow AI as so destabilising that it'll affect every sector and job, even if it doesn't specifically take any of them over. Whether AGI or this kind of paradigm-shifting destabilisation happens in the next 10 years, who knows, but I do see it as inevitable. We need an entirely new approach to society, jobs and capitalism.
AllanfromWales1 t1_j9w35w6 wrote
Reply to comment by 22HitchSlaps in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
My work is facilitating a particular type of technical safety audit (HAZOP) in the process engineering industry. There's no reason why AI couldn't do it, but the demand isn't great and the complexity of learning is such that it would be unlikely to be cost-effective even in the medium term.
jl_theprofessor t1_j9w2w1o wrote
Reply to comment by 22HitchSlaps in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
You better learn how to hunt and fish since you're not going to be able to get a job.
22HitchSlaps t1_j9w2gdl wrote
Reply to comment by AllanfromWales1 in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
Well, that's actually fairly reasonable then. I'd say, though, that the idea that AI will do "some stuff sure but not MY STUFF" is shortsighted. Sufficiently advanced AI will do everything better than humans, and the thing is, the tech is like an avalanche that has already started. I'm 40 years younger than you; there's no job I could ever do that won't be done better by AI by the time I've learned it to the level you could reach in your lifetime. Such is life... now.
AllanfromWales1 t1_j9w1xgs wrote
Reply to comment by 22HitchSlaps in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
For what it's worth I'm 67 now, so probably yes. But I doubt that there'll be AI doing my job for a long time after that.
22HitchSlaps t1_j9w10bs wrote
Reply to comment by AllanfromWales1 in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
Hope you're retiring within 10 years.
norbertus t1_j9w05jq wrote
This article has some problems. The biggest one -- beyond some of the more basic conceptual problems with what these machine learning systems actually do -- is the vague demand that AI be "democratized."
They never define what they mean by "democratize," though they caution that "Big corporations are doing everything in their power to stop the democratization of AI."
We have AI because of big corporations. And nobody is going to "democratize" AI by giving every poor kid in the hood a big NVIDIA card and the skills to work with Python, Bash, Linux, Anaconda, CUDA, PyTorch, and the whole slew of technologies needed to make this stuff work. You can't just "give" people knowledge and skills.
This article is kind of nonsense.
rottentomatopi t1_j9vy0o4 wrote
Reply to comment by AllanfromWales1 in The Job Market Apocalypse: We Must Democratize AI Now! by Otarih
Yes, but there are very high-skill, well-paid jobs that AI is capable of doing cheaper than humans, primarily in the arts. It's already tough getting into those fields, so this is pretty troubling because it takes away opportunities that many people make a living from and feel fulfilled by.
DriftMantis t1_j9vxk30 wrote
Reply to Reality is an openness that we can never fully grasp. We need closures as a means of intervening in the world. | Post-postmodern philosopher and critic of realism Hilary Lawson explains closure theory. by IAI_Admin
I think the nature of reality is always a self-constructed byproduct of consciousness itself. In other words, reality or ultimate truth is not a fixed thing that exists outside of our grasp. It is only a truth insofar as we can perceive it as such.
This makes sense because you can assume humans have conjured some version of consensus reality, which is an abstracted version of reality as it appears to us. Therefore, we can assume that some extraterrestrial alien out there may have a different version of reality that they perceive, which may overlap with ours in some way or perhaps not at all.
If you could gain the perspective of every perspective, forever, simultaneously, I would assume you would become god at that point, the experience of which is ultimate reality itself.
Your ego as a conscious waking human cannot perceive ultimate reality, because your ego will block it from happening and only allow understanding of limited, everyday constructed reality.
Last_Contact t1_j9vxd7a wrote
Reply to comment by Wegwerpbbq in Reality is an openness that we can never fully grasp. We need closures as a means of intervening in the world. | Post-postmodern philosopher and critic of realism Hilary Lawson explains closure theory. by IAI_Admin
Yep, I wanted to mention this guy as well
rekdt t1_j9vwz7u wrote
Reply to comment by commandolandorooster in Reality is an openness that we can never fully grasp. We need closures as a means of intervening in the world. | Post-postmodern philosopher and critic of realism Hilary Lawson explains closure theory. by IAI_Admin
Why not investigate the depression and anxiety? You don't have to think to know reality. Forget about philosophy for a bit: what do you see here and now? That's all there is; thoughts about what reality is are just obstructions to the truth.