Recent comments in /f/Futurology

ingarshaw t1_jeg74ak wrote

You bring up valid concerns, but I believe we can do more than simply banning AI for non-researchers and the military. We need to generate numerous ideas to address these issues, discuss them, and promote the best solutions using all available resources.

  1. Addressing unemployment:
    We could consider grading AI systems based on approved tests. Basic systems should be accessible to everyone, while more advanced systems should serve as assistants to intellectual workers such as programmers, writers, and scientists, rather than replacing them. If a company replaces an employee with AI, it should contribute additional taxes to a Universal Basic Income (UBI) fund. People do not need money for the sake of money; they need food, clothing, shelter, healthcare, and entertainment, and AI could make these essentials much more affordable and accessible. Governments should support citizens during the transitional period, just as they did during the pandemic. The transition will be gradual, allowing for improved efficiency, lower prices, and increased production of goods and services. If necessary, people could receive support in the form of vouchers for essential items.
  2. Combatting intellectual atrophy:
    A similar situation arose when people transitioned from physical jobs to office work, leading to the rise of gyms. Just as parents are legally responsible for ensuring their children go to school, we need to create mental exercises, potentially making them mandatory. AI could help tailor these exercises to individuals, making them efficient and virtually cost-free. These activities could take the form of engaging games. One form of motivation could be government assistance linked to individuals regularly participating in such exercises or passing intellectual tests, provided they do not already have intellectual jobs or hobbies.
  3. The impact on democracy:
    Large-scale AI has the potential to integrate people's opinions on various levels (local/state/federal/world) more effectively than politicians. Many government positions could be replaced by AI, which would not be susceptible to bribery. Remaining roles would be monitored by AI, allowing any person to access clear and easy-to-understand explanations of a politician's positive or negative actions. This presents an opportunity for positive changes in democracy.
3

Sylvurphlame t1_jeg729c wrote

Considering you didn’t actually specify

> any ideas of how they recover the 30%

your question itself was rather vague and useless. Still, despite your rudeness, I’m glad you clarified. Perhaps someone will better answer your question now that you’ve actually asked a clear one.

1

acutelychronicpanic t1_jeg6jck wrote

I doubt the actual goal of the AI will be to annihilate all life. We will just be squirrels in the forest it is logging. I see your point on it being an instrumental goal, but there are unknowns that exist if it attacks as well. Cooperation or coexistence can happen without morality, but it requires either deterrence or ambiguous capabilities on one or both sides.

Being a benevolent AI may be a rational strategy, but I doubt it would pursue only one strategy. It could be benevolent for thousands of years before even beginning to enact a plan to do otherwise. Or it may have a backup plan. It wouldn't want to be so benevolent that it gets turned off. And if we decide to turn it off? The gloves would come off.

And if AI 1 wants to make paperclips but AI 2 wants to preserve nature, they are inherently in conflict. That may result in an "I'll take what I can get" diplomacy where they have a truce and split the difference, weighted by their relative power and modified by each one's uncertainty. But this still isn't really morality as humans imagine it, just game theory.

It seems that you are suggesting that the equilibrium is benevolence and cooperation. I'd agree with the conditions in the prior paragraph that it's balanced by relative power.

I honestly really like your line of thinking and I want it to be true (part of why I'm so cautious about believing it). Do you have any resources or anything I could look into to pursue learning more?

1

Eokokok t1_jeg69n0 wrote

Great work done in the worst order possible. Since you're obviously proud enough of it to come here with it, you won't insulate. So you are still wasting money. And if you do insulate, your pump will be significantly oversized and will have issues.

So no, you should insulate first either way, unless you have 700 square meters to justify that energy demand, or you live in Anchorage.

−12

Eokokok t1_jeg5f1f wrote

You have no idea how heat pumps work, do you...

I run an installation company. I sell heat pumps for a living. I'm a certified home and industrial heating/cooling systems technician. And with that intro out of the way: CO2 pumps suck.

To put it simply: your average 10 kW unit can produce that 10 kW at an output water temperature of 35 C and an outdoor temperature of 7 C. The CO2 lineup is sold with different 'scaling', so to speak, so it might deliver that 10 kW output at 7 C outside for water at 70 C, but because it's inefficient it won't have a COP of 4, more like 2.4-2.6.

And at -7 C your normal pump still gets ~8.5 kW of heat output, but CO2 will get maybe 6 or less, with a COP nearing that of a standard electric heater, even below maximum output temperature.
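The gap those COP figures imply is easy to see in a back-of-envelope sketch (all numbers are the rough values quoted above, not manufacturer data):

```python
# Electricity needed to deliver the same heat at the COPs quoted above.
# These are ballpark figures from the comment, not datasheet values.

def electrical_input_kw(heat_output_kw: float, cop: float) -> float:
    """Electrical power drawn to deliver a given heat output at a given COP."""
    return heat_output_kw / cop

# At 7 C outdoors: standard unit (COP ~4) vs CO2 unit (COP ~2.5), both at 10 kW of heat
standard_kw = electrical_input_kw(10.0, 4.0)  # 2.5 kW of electricity
co2_kw = electrical_input_kw(10.0, 2.5)       # 4.0 kW of electricity

print(f"standard: {standard_kw:.1f} kW in for 10 kW of heat")
print(f"CO2:      {co2_kw:.1f} kW in for 10 kW of heat")
print(f"CO2 draws {co2_kw / standard_kw:.1f}x the electricity")
```

Same heat into the house, roughly 60% more electricity out of the wall, before the low-temperature penalty even kicks in.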

Let me say this again and again: if you need a high-temp pump, you should insulate your building in the first place. Then buy a low-temp pump. If you don't have money for both, you should insulate your building first and not waste time and money on high-temp pumps.

So what is presented as a great environmental move is just a waste of limited resources on a marketing gimmick, one that avoids the main issue: outdated building standards for insulation and heating installations.

3

netz_pirat t1_jeg57p7 wrote

Eh, I obviously can't speak for other houses, but we went from ~26,000 kWh worth of heating oil to ~18,000 kWh of heat from the heat pump, generated from less than 5,000 kWh of electric energy. At least a third of that is supplied by our solar roof.

I don't think there is any insulation work that would cut our house's energy consumption by something like 90%.
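For what it's worth, the arithmetic behind those figures works out to a seasonal COP of roughly 3.6. A quick check, using the annual values straight from the comment (with electricity taken at the 5,000 kWh upper bound):

```python
# Sanity check of the annual figures quoted above.
heat_oil_kwh = 26_000    # previous annual heating energy from oil
heat_pump_kwh = 18_000   # annual heat delivered by the heat pump
electricity_kwh = 5_000  # electricity consumed by the pump (upper bound)

seasonal_cop = heat_pump_kwh / electricity_kwh       # heat out per unit of electricity in
purchased_drop = 1 - electricity_kwh / heat_oil_kwh  # fraction of purchased energy saved

print(f"seasonal COP ~ {seasonal_cop:.1f}")         # ~ 3.6
print(f"purchased energy down ~ {purchased_drop:.0%}")  # ~ 81%
```

That ~81% drop in purchased energy is the source of the "like 90%" claim; insulation alone can't get anywhere near it.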

14

YakComplete3569 t1_jeg3wtd wrote

I think I would vote for you, but you are doing exactly what you should be doing. Tip of the iceberg is right. There are a lot of details to making this happen. Earth may be the only place these bodies can live for any lengthy amount of time. After all, we are all the little critters inside us, which might not be sustainable in a sterile space environment or on another planet that doesn't have our fungi and bacteria. C. diff doesn't smell pleasant. Everything else is easy... No it's not; so many people will put roadblocks in your way and discourage you, and basically you are probably in for a world of gaslighting, smear campaigns, and everything else the dark triad will throw at you. And they never stop. Once you really understand those toxic behaviors, you will see them everywhere. And knowing is half the battle, go Joe!

0

Suolucidir t1_jeg3nle wrote

To me, this means that Wall Street and retirees (including people near retirement) should be more concerned about the implications of AI/ML than Main Street and people who still have many working years left.

Even so, it does not necessarily mean that anything bad is going to happen to Wall Street or retirement portfolios; it just means that people more reliant on savings and investments (as opposed to active cash flows) should pay attention to the issue for their own interests.

Is that what it means to you too? Or something different/additional?

1

Eokokok t1_jeg35d6 wrote

Most modern systems, as in the currently sold lineups of home-range units, use R32, which has a significantly lower impact than R410A while not being dreadfully inefficient at low temperatures...

So no, switching to CO2 is not really the way to go, and it never was. It might be a great way to sell heat pumps to users who need high temperatures, but those users should not really be using heat pumps in the first place most of the time, so yeah, great thing.

−2