Bakoro
Bakoro t1_iuj6nik wrote
Reply to comment by Wassux in Giant farming robot uses 3D vision and robotic arms to harvest ripe strawberries by Anen-o-me
We have more than enough to make sure everyone has their basic needs met.
Once we get a guarantee that everyone gets the basics, it's not unreasonable to institute procreation limits of two per family, and adjust as resources allow.
People who bypass the limits can be put to extra work.
The point is to eliminate needless suffering and waste. The amount of wasted energy and effort in the world is phenomenal.
Bakoro t1_iuiunf2 wrote
Reply to comment by Quealdlor in Giant farming robot uses 3D vision and robotic arms to harvest ripe strawberries by Anen-o-me
Lower prices mean nothing if people don't have income to buy goods.
AI is not like past technologies that created more jobs than they took. It will create a small number of initially higher-paid jobs while eliminating ten or a thousand jobs for every one it creates. Then the competition for those jobs will drive wages down. The capitalists who own everything will keep the profits.
We've already seen during this past pandemic that the mass inflation is driven by corporate greed far more than anything else. They will continue to suck the blood out of people if nothing changes.
Bakoro t1_iub7rxu wrote
I am looking forward to seeing all the D&D campaigns which get transcribed to video.
Bakoro t1_iu2cm3b wrote
Reply to comment by Kong_Here in The Great People Shortage is coming — and it's going to cause global economic chaos | Researchers predict that the world's population will decline in the next 40 years due to declining birth rates — and it will cause a massive shortage of workers. by Shelfrock77
We absolutely do have the numbers. What we also have is a gross misalignment in the economy where many developers and engineers are being paid to make bullshit, and a lot of them are doing redundant work because of the idiotic "competition is always good no matter what" stance of capitalism, even when it's purely wasted effort.
In addition, a lot of people who are smart and hard working enough to be engineers simply don't have reasonable access to education.
We do still need more educated people, but we could be putting the people we have to more productive use.
Bakoro t1_itfwhe4 wrote
Reply to comment by Devoun in Given the exponential rate of improvement to prompt based image/video generation, in how many years do you think we'll see entire movies generated from a prompt? by yea_okay_dude
I'm pretty sure that simple story generators are already a thing. Maybe not full on scripts, but there is stuff to build off of.
I know that there are AI which can "read" a story and extract some of its defining qualities and themes. Look at MIT's Patrick Winston, who was working on symbolic understanding in AI.
Writing a novel is a lot more formulaic than a lot of people think.
Jim Butcher has a pretty fun story about his college professor, who is also a prolific romance novelist. Jim didn't want to follow her advice because he thought it would lead to generic feeling crap. To prove her wrong he followed her advice and wrote the first book of The Dresden Files. He's had a great career so far. For a while he was publishing two books a year.
There's NaNoWriMo, where people try to write 50k words of a novel during November. Pretty much any competent writer can bang out a generic script or story in hours or days. There's a formula; the trick is hiding it, twisting it, and making it less generic. The sheer amount of shit out there that's just Shakespeare with a wig and glasses is overwhelming.
There probably just has to be more guidance in a story writing AI.
Break it down into the elements of a story. You have the seven basic stories. Start there. There are probably character archetypes. There are relationship archetypes. Mix and match.
Central plot, central characters who each have their primary archetype and secondary/tertiary qualities. They have their central motivation. They have roles to fill.
The characters try to achieve their goals using the resources at their disposal, according to their parameters.
Statistical and symbolic analysis would probably tell us if there are reasonable approximations for how long each narrative description is, how long conversations should be, the patterns of conversational back and forth, how to divide the plot structure...
If I sat and thought about it, I could probably list a few dozen or more parameters to analyze, and then it just turns into a Mad Libs kind of thing.
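Just to make that concrete, here's a rough Python sketch of what the mix-and-match step could look like. The plot/archetype/motivation lists and the generate_logline function are made-up placeholders, not any real story-generation system; the point is just how far dumb recombination already gets you before any real AI is involved:

```python
import random

# A minimal sketch of the "mix and match" idea: pick one of the seven
# basic plots, a couple of character archetypes, a motivation, and a
# setting, then fill a Mad Libs-style logline. All lists here are
# illustrative placeholders.

BASIC_PLOTS = ["overcoming the monster", "rags to riches", "the quest",
               "voyage and return", "comedy", "tragedy", "rebirth"]
ARCHETYPES = ["reluctant hero", "trickster", "mentor", "rival", "caregiver"]
MOTIVATIONS = ["revenge", "redemption", "survival", "curiosity", "love"]
SETTINGS = ["a dying space colony", "a flooded city", "a feudal court"]

def generate_logline(rng: random.Random) -> str:
    plot = rng.choice(BASIC_PLOTS)
    hero, foil = rng.sample(ARCHETYPES, 2)  # two distinct archetypes
    return (f"A {hero} driven by {rng.choice(MOTIVATIONS)} faces a {foil} "
            f"in {rng.choice(SETTINGS)}; the shape of the story is '{plot}'.")

if __name__ == "__main__":
    rng = random.Random(42)
    for _ in range(3):
        print(generate_logline(rng))
```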
It'd probably even help to tie several AIs together. Like a city/building design AI to get an understanding of a world, and interrogating the generated content to feed to a character.
So, maybe in a roundabout way making a compelling story bot might lead to more generalized AI, depending on how it's done.
We may not get deeply philosophical and emotionally complex stuff at first, but it'd be good enough for a flashy action movie or a heist movie. Good enough to make an episode for the average CW series.
I don't think near-future generalized AI is going to be plausible without structure anyway.
Look at a human. It takes weeks for an infant to be more than an eating and pooping machine, months to be functional enough to start engaging and babbling, months to start crawling and walking, potentially years to start talking, and their cause-effect understanding is dubious at best. Children have lots of overfitting and beliefs based on false causation. It takes years for people to learn to read, and some never pick it up well. Some never learn more math than arithmetic. It takes most people decades to reach their full potential. We are expecting AI to do it in days? A few years? With piddling resources and limited run times?
AI is already more functional in many aspects than a typical human. With posits and the new hardware coming down the line, and the big dollars increasingly being spent, it won't take long to see an AI tell a coherent one-page story. A full script will come sooner rather than later, even if it's a boring one.
Bakoro t1_it4z9kk wrote
Reply to comment by haptiK in A primitive "holodeck" by Ezekiel_W
The Star Trek holodeck is a sci-fi thing which will likely never be as real as shown in the show, but realtime holographics aren't unfeasible.
As far as computing power goes, AI has for the most part been taking advantage of general-purpose hardware and half-measure hardware.
Between posits and ASICs, there is new technology already staged for manufacturing which is in the realm of 100 to 1000 times faster.
You need to hold onto your butt, because specialized hardware is going to put the already rapid progress on steroids.
Bakoro t1_it1bxzb wrote
Reply to comment by RavenWolf1 in Since Humans Need Not Apply video there has not much been videos which supports CGP Grey's claim by RavenWolf1
>Economy sure but not money necessary. Economy does not mean money.
Money is a useful abstraction for value. How many chickens to a television, and how many televisions to the beach house, is a hard problem.
If you have resource tokens, it's basically the same thing: the right to requisition x food resources, y labor resources, and z land resources. Anything fungible which replaces direct barter ends up being similar.
If humans are to still exist, they'll have to be part of the equation in terms of directing the AI. Like, who decides what the AI spends its discretionary time on? If the AI doesn't have its own motivation and interests, or otherwise just allocates resources to human requests, that can be a kind of money in and of itself. Start off giving everyone an equal share of AI requests, and the requests which generate the most positive feedback from the community yields more time to the person or group who made the request, and people can trade AI time share just like money.
I personally like the resource allocation model. It's basically money, only it ties value to quantifiable things. That's only viable when you have highly mechanized everything, where the energy and time costs are highly predictable, like a society run mostly by AI.
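To make the AI-time-share idea less hand-wavy, here's a toy Python sketch. The TimeShareLedger class, the starting allotment, and the 0.1-hours-per-upvote rate are all made up for illustration; the point is just that spend/reward/trade on a shared ledger behaves a lot like money:

```python
from collections import defaultdict

# Toy sketch: everyone starts with an equal allotment of AI-hours,
# well-received requests earn bonus hours from community feedback,
# and people can trade hours directly. Numbers and rules are made up.

class TimeShareLedger:
    def __init__(self, base_hours: float = 10.0):
        self.base_hours = base_hours
        self.balances: dict[str, float] = defaultdict(lambda: base_hours)

    def spend(self, person: str, hours: float) -> bool:
        """Spend hours on an AI request; refuse if the balance is too low."""
        if self.balances[person] < hours:
            return False
        self.balances[person] -= hours
        return True

    def reward(self, person: str, upvotes: int) -> None:
        """Grant bonus hours proportional to community feedback."""
        self.balances[person] += 0.1 * upvotes

    def trade(self, seller: str, buyer: str, hours: float) -> bool:
        """Transfer hours between people, like any other currency."""
        if self.balances[seller] < hours:
            return False
        self.balances[seller] -= hours
        self.balances[buyer] += hours
        return True

ledger = TimeShareLedger()
ledger.spend("alice", 4.0)         # Alice uses 4 AI-hours on a request
ledger.reward("alice", 25)         # the community liked it: +2.5 hours
ledger.trade("bob", "alice", 3.0)  # Bob sells Alice 3 of his hours
```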
Bakoro t1_iszogjh wrote
Reply to comment by Sashinii in Since Humans Need Not Apply video there has not much been videos which supports CGP Grey's claim by RavenWolf1
Economics will exist as long as there are people. Scarcity will always be a thing, it's essentially a law of the universe.
There is only so much beachfront property, only so many houses with an ocean view, only so many people who can live on the top of a hill.
One way or another there will have to be a way to decide who gets what limited resources, and who gets the new things first.
Even if you just make everything timeshare, so everyone takes turns with exclusivity of a thing, some people won't care about one thing but will want more of their favorite thing. Some things will be more popular.
"I'll trade you my week in Maui for a day in the glorgotron" you'll say, and I'd be like dang, that's a good deal, the glorgotron gives me a headache anyway...
It's just a matter of what people value, what people want exclusive access to, and what is limited. If nothing else, people's time will always be somewhat valuable into the distant future.
Bakoro t1_iszkyut wrote
Reply to Since Humans Need Not Apply video there has not much been videos which supports CGP Grey's claim by RavenWolf1
Eventually, nearly all jobs will be taken by AI. That's not a bad thing. That's the thing we should be running at, full speed.
People don't need jobs, people need things like air, food, shelter, sometimes clothes, and prefer to have something to keep themselves occupied.
Jobs are supposed to just be an inconvenient thing we have to do to get the things we need and want. What jobs turned into is a means to control the masses. "The economy" is a way for people to exploit people. There's no reason someone doing a "critical" job shouldn't be able to own a home or have good healthcare, yet not being able to has become the standard in the U.S. and other places.
"The economy" is full of bullshit nothing-jobs which only exist to try and move money around. Fuck jobs for the sake of jobs. Fuck having to justify your own existence by bowing down to some corporation.
Let AI do everything it can, and allow humans to have the free time to do as they wish.
In the past, new technologies opened the doors to more products, more technology, but we still needed people to do things. Technology freed up people to do other things that we wanted to get done, but couldn't put resources to. New jobs got created in the manufacture, installation, and repair of machines.
AI is not going to be like that. AI may create some new jobs, but if it goes well, it'll take more jobs than it creates, because the new jobs it creates will also be filled by AI.
Ideally, the only jobs people will have are going to be the ones they want. The only work people will do will be the work that fulfills them.
There won't be anything to stop people from making art, doing crafts, doing research and development.
What's more, people who have ideas will be able to actually explore those ideas; they'll be able to get AI to do the grunt work instead of needing to convince hundreds of people to do something, or convince a bank to give them millions.
The real questions are going to be about how resources are distributed, because even in an AI utopia there's still only a limited amount of land, there's only so much beachfront property, and not everyone can have their own personal theme park and sprawling castle.
Bakoro t1_is9328f wrote
Reply to Would you be friends with a robot? by TheHamsterSandwich
I'm willing to be friends with anything of even the most dubious intelligence, as long as it isn't totally psychopathic or antisocial (in the medical sense of harming others).
If an AI seems to have some kind of motivation and self direction, why wouldn't I be friends with them?
I wonder what an AI would think about friendship, though. When I think friendship, I think shared interests, shared goals, and just enjoying each other's company. What motivations and goals would an AI have that overlap with my own, and what do I have to offer an AI? Thumbs? Even that won't last long.
Kind of feels like being friends with an AI would be more like being friends with a hyper intelligent talking cat, rather than another human or a dog.
Like, perhaps a mutual respect and tolerance, with each occasionally fulfilling a role for the other. The AI gets to observe us and collect data points, gets to bounce ideas off us, and we get to enjoy the content they make or test their newest product.
Yeah, it's hard to even think about in a nontransactional way. An intelligence without the same biological needs. Any internal motivation it develops may still be as mysterious as a human's.
I would love to explore an AI's development with it, but it feels like it'd be like watching a child grow up at super speed and then surpass us in such a way that we look like the children instead.
Bakoro t1_irlyoif wrote
Reply to MIT And IBM Researchers Present A New Technique That Enables Machine Learning Models To Continually Learn From New Data On Intelligent Edge Devices Using Only 256KB Of Memory by Dr_Singularity
I want to make sure I've got this right, is it that you still train the pretty-good base model with major resources, and the edge device is able to improve it over time, or is it that you can start with a poop-tier model which can still become very good over time?
Bakoro t1_irfin4a wrote
Reply to comment by mootcat in “We present 3DiM (pronounced "three-dim"), a diffusion model for 3D novel view synthesis from as few as a single image” by Shelfrock77
Do whatever will make you the most money within ten years, sit on cash until the next market crash, and buy any land you can get your grubby mitts on.
Bakoro t1_ir98owh wrote
Reply to comment by saccharineboi in "The number of AI papers on arXiv per month grows exponentially with doubling rate of 24 months." by Smoke-away
For real, I'm dealing with that right now. One way or another, I'm going to have to make some software to do this thing. Do I want a fairly okay solution right now, which I can iterate on and easily explain/justify why my solution is roughly correct, OR do I dump some resources into machine learning, have nothing to show for it up until I do, but very likely get something on the other end which is almost magically good but I don't know why...
Bakoro t1_iqtddqi wrote
Reply to Self-Programming Artificial Intelligence Using Code-Generating: a self-programming AI implemented using a code generation model can successfully modify its own source code to improve performance and program sub-models to perform auxiliary tasks. by Schneller-als-Licht
Man, sometimes I wish that my life had been just a little easier and I could have finished college some years earlier.
So much of what I am seeing now are very similar to ideas I've had in the past few years, and I'm so preoccupied with getting my life right that I can't dig into things as much as I'd like.
In my dreams, AI will explode enough to design something which can connect my brain to an artificial extension.
Bakoro t1_iuj6whc wrote
Reply to comment by CoachAny in Giant farming robot uses 3D vision and robotic arms to harvest ripe strawberries by Anen-o-me
The point of UBI is making sure people have their basic needs met. It's functionally a food and housing distribution system, just an indirect one.
When instituting UBI, it must be codified that it covers all basic needs, or else we'll end up with the same issue as the minimum wage not covering basic needs.