Recent comments in /f/explainlikeimfive

Horndog2015 t1_j6n2tbo wrote

I can see the GPU walking around pushing the characters. Some of the fans waving a giant leaf at parts to cool them. The heat sink is talking to the CPU, saying to put the heat in 'em. The keyboard is delegating duties to the CPU, and the GPU is saying "alright then." The RAM is auditing everything, and the HDD is like "Wow! This is an all-you-can-eat buffet!" The OS is just saying "I'll throw some papers on your desk. Figure it out." The MoBo is preaching to everyone about how it brings everyone together. The disc player just says "Whatever! I'm calling in today!"

3

Redshift2k5 t1_j6n11p0 wrote

Just planting seeds isn't enough. If it was a hybrid, it often won't have viable offspring, or the offspring will differ from the parent hybrid plants.

The last stalk of silphium was given to Emperor Nero. They knew exactly what the plant was, and they knew when it was no longer being cultivated.

5

PckMan t1_j6n10hl wrote

A GPU is almost like a separate computer inside your computer, suited to one specific task. It's like having a gaming console attached to your motherboard. A CPU can more or less do anything. After all, an "integrated GPU" is really just the CPU doing the job of the graphics card as well as its own. The thing is that for most typical PC uses, you don't need a dedicated GPU at all. Office computers, casual users just browsing the web and watching movies, store computers, etc. don't really need a GPU, because their tasks don't require lots of processing power. Conversely, some tasks/activities, like gaming, rendering, CAD/CAM software and others, do require a lot of processing power, a disproportionate amount compared to most other things. So the solution is to have a "separate" computer inside your computer, with its own processors and its own memory, dedicated to those tasks specifically, and since software is written around this industry convention, the GPU will perform those tasks more efficiently. Something like a server, used for different tasks, won't have a GPU at all, but it will have multiple CPUs and tons of storage space, because that's the kind of resources it needs for its tasks.

1

GalFisk t1_j6n0iwv wrote

Reflexes are controlled by local nerves and muscles. Some biochemistry is controlled by hormones made by glands, which affect how cells behave. Some is controlled only by the cells themselves.

Instincts, emotions, thoughts, memory, muscle memory, and all voluntary movement are controlled by the brain.

1

AshFraxinusEps t1_j6mznsg wrote

Interesting, and cheers. I did think we'd eventually rediscover it, as it was too widespread to be completely gone (although those are famous last words of conservationists throughout history).

From your points, while the link says it is slow growing, only point 1 really would explain it going completely extinct, as if it reproduced by asexual budding then any buds in the soil would have long since died. But from the link, it does seem seed-based, and therefore it'd be odd for it to be completely extinct.

1

Seygantte t1_j6mzhnj wrote

Saying RNA is correctly termed mRNA is like saying that cutlery is correctly termed forks. There are several varieties of RNA, of which mRNA is just one. Others include ribosomal RNA (rRNA), which makes up part of the ribosome, and transfer RNA (tRNA), which links amino acids to the mRNA during protein synthesis.

2

Thrawn89 t1_j6myw1u wrote

The explanation you are replying to is completely wrong. GPUs haven't been optimized for vector math since like 20 years ago. They all operate on what's called a SIMD architecture, which is why they can do this work faster.

In other words, they can do the exact same calculations as a CPU, except they run each instruction on like 32 shader instances at the same time. They also have multiple shader cores.

The CUDA core count Nvidia gives is this 32 × the number of shader cores. In other words, how many parallel ALU calculations they can do simultaneously. For example, the 4090 has 16384 CUDA cores, so it can do 512 unique instructions on 32 pieces of data each.
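As a quick sanity check on that arithmetic (figures taken from the comment above; the variable names are just illustrative):

```python
# Figures from the comment; names here are illustrative, not Nvidia terminology.
cuda_cores = 16384       # advertised CUDA core count on a 4090
lanes_per_warp = 32      # shader instances executing one instruction in lockstep

unique_instruction_streams = cuda_cores // lanes_per_warp
print(unique_instruction_streams)  # 512
```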

Your CPU can do maybe 8 unique instructions on a single piece of data each.

In other words, GPUs are vastly superior when you need to run the same calculations on many pieces of data. This fits well with graphics where you need to shade millions of pixels per frame, but it also works just as well for say calculating physics on 10000 particles at the same time or simulating a neural network with many neurons.
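A rough way to picture the SIMD idea in code (a NumPy sketch of the concept, not how a GPU is actually programmed): one arithmetic "instruction" is applied to a whole batch of data at once, versus a scalar loop touching one element per step.

```python
import numpy as np

# 32 "pixels" -- one warp's worth of data
pixels = np.arange(32, dtype=np.float32)

# SIMD-style: a single multiply-add applied across all 32 lanes at once
shaded_simd = pixels * 0.5 + 1.0

# Scalar-style: the same math, one element per iteration
shaded_loop = np.array([p * 0.5 + 1.0 for p in pixels], dtype=np.float32)

assert np.allclose(shaded_simd, shaded_loop)
```

Same result either way; the SIMD version just expresses it as one operation over many data elements, which is the access pattern GPUs are built around.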

CPUs are better at calculations that only need to be done on a single piece of data, since they are clocked higher and have no setup latency.

2

jran1984 t1_j6myp3p wrote

There is already a thing called decimal time, with hours, minutes, and seconds depicted this way. The reason it isn't widely adopted is that the current system is so ingrained that we can't dislodge it. We can't even get the US to switch to the metric system.

1