Recent comments in /f/explainlikeimfive

zachtheperson t1_j6lngib wrote

The CPU is really smart, but each "core" can only do one thing at a time. 4 cores means you can process 4 things at the same time.

A GPU has thousands of cores, but each core is really dumb (basic math, and that's about it) and is actually slower than a CPU core. Having thousands of them, though, means that certain operations which can be split up into thousands of simple math calculations can be done much faster than on a CPU, for example the millions of calculations needed to draw every pixel on your screen.

It's like having 4 college professors and 1000 second graders. If you need calculus done, you give it to the professors, but if you need a million simple addition problems done, you give them to the army of second graders. Even though each one works slower than a professor, doing the problems 1000 at a time is faster in the long run.
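The analogy above can be sketched in a few lines of Python. The numbers here are made up for illustration (a "professor" core is 10x faster per problem than a "second grader" core), not real benchmarks:

```python
# Toy model of the professors-vs-second-graders analogy.

def serial_time(num_jobs, time_per_job):
    """One fast core: jobs run one after another."""
    return num_jobs * time_per_job

def parallel_time(num_jobs, time_per_job, num_cores):
    """Many slow cores: jobs run in batches of num_cores at once."""
    batches = -(-num_jobs // num_cores)  # ceiling division
    return batches * time_per_job

# 1 professor, 10x faster per problem, doing a million problems serially:
professor = serial_time(1_000_000, 1)             # 1,000,000 time units
# 1000 second graders, each 10x slower, working in parallel:
graders = parallel_time(1_000_000, 10, 1000)      # 10,000 time units
print(professor, graders)
```

Even with each core 10x slower, the parallel army finishes 100x sooner, which is the whole point of a GPU.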

6

Semyaz t1_j6lnfnr wrote

To put this into perspective, a relatively low resolution monitor is 1920x1080 pixels. That is over 2 million pixels, each of which potentially needs 3 numbers (red, green, and blue values) every frame. One gigahertz is 1 billion operations per second. Rendering 60 frames per second means 60 frames * 3 color values * 2 million pixels = 360 million operations per second -- over 1/3 of 1 GHz. On top of that, graphics depend on tons of other operations, like lighting and antialiasing, that need to happen for every frame that is displayed.
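That back-of-the-envelope math, using the exact pixel count instead of the rounded 2 million:

```python
# Pixel throughput for a 1080p monitor at 60 fps.
pixels = 1920 * 1080        # 2,073,600 pixels per frame
values_per_pixel = 3        # red, green, blue
fps = 60

ops_per_second = pixels * values_per_pixel * fps
print(ops_per_second)       # 373,248,000 -- over a third of a 1 GHz budget
```

And that only counts pushing color values, before any of the actual rendering work.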

It becomes clear that raw speed is not going to solve the problem. We like fast processors because they are more responsive, just like our eyes like higher frame rates because it is smoother. To get smooth, high frame rate video, we need specialized processors that can render millions of pixels dozens of times a second. The trick with GPUs is parallelization.

GPUs have relatively low clock speeds (around 1 GHz) compared to CPUs (3-4 GHz), but they have thousands of cores. That’s right, thousands of cores. They also operate on wider data: a single GPU instruction often works on 256 bits at a time compared to a CPU’s typical 64 bits. What this all boils down to is throughput. Computing values for those millions of pixels becomes a whole lot easier when you have 2,000 “slower” cores doing the work all together.

The typical follow-up question is “why don’t we just use GPUs for everything since they are so fast and have so many cores?” Primarily because GPUs are purpose built for one kind of work. That doesn’t rule out general computing on GPUs, but we humans like computers to be super snappy: where a CPU can juggle dozens of different tasks without a hiccup, a GPU is a powerhouse for churning through an incredible volume of repetitive calculations.

PS: Some software takes advantage of the GPU for churning through data. Lots of video and audio editing software can leverage your GPU. Also CAD programs will use the GPU for physics simulations for the same reason.

2

symbiotic_Tao t1_j6ln0b7 wrote

Humans do bond for life (or used to at least, but I digress). Look what happens when an old woman loses her husband or vice versa. They often literally die from grief. It's a very real phenomenon. So yes, human couples absolutely bond for life. It's only recently, with society going the way it has, that things have changed.

−1

FellowConspirator t1_j6lmx2z wrote

A computer doesn’t need a GPU.

What a GPU is good at is performing the same task on a bunch of pieces of data at the same time. You want to add 3.4 to a million numbers? The GPU will do it much faster than a CPU can. On the other hand, it can’t do a series of complex things as well as a CPU can, and it can’t move stuff in and out of the computer’s memory or storage. You can use the GPU’s special abilities for all sorts of things, but math involving 3D objects and geometry is a big one — it’s super useful in computer graphics (why it’s called a Graphics Processing Unit) and games. If you want stunning graphics for games, the GPU is going to be the best at doing that for you.

The CPU talks to a GPU using a piece of software called a “driver”. It uses that to hand data to the GPU, like 3D shapes and textures, and then it sends commands like “turn the view 5 degrees”, “move object 1 left 10 units”, and stuff like that. The GPU performs the necessary calculations and makes the picture available to send to the screen.
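A command like “turn the view 5 degrees” boils down to applying the same rotation math to every vertex the GPU holds. A minimal 2D sketch of that per-vertex work (real GPUs use 3D/4D matrices, but the idea is the same):

```python
import math

def rotate(points, degrees):
    """Apply one rotation to every point -- the uniform per-vertex
    work a GPU does for thousands of vertices at once."""
    t = math.radians(degrees)
    c, s = math.cos(t), math.sin(t)
    return [(x * c - y * s, x * s + y * c) for x, y in points]

square = [(1, 0), (0, 1), (-1, 0), (0, -1)]
print(rotate(square, 90))
```

Every point goes through the identical formula, so the GPU can hand each one to a different core.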

It’s also possible to program the GPU to solve math problems that involve doing the same thing to a lot of pieces of data at the same time.

3

ThatGenericName2 t1_j6lmf86 wrote

It doesn't, the person that replied to you is incorrect.

Some CPUs have a low power GPU integrated into them, hence "integrated GPU", and not all CPUs have one.

These integrated GPUs are very weak and are meant for the bare minimum of graphics processing: enough to draw the interface of a word document or other programs, and that's about it. Attempting anything more complicated, such as 3D rendering, will start to strain them, even for really simple scenes.

Despite this, these integrated GPUs are still much better at graphics processing than software renderers that run on the CPU itself.

27

DressCritical t1_j6llpv6 wrote

Nope.

AC can induce muscle tetanus, but while it can cause you to lock a hand to something, DC is quite a bit more likely to do so. AC has a greater chance of causing a spasm that either removes the hand or allows the victim to pull free.

AC, however, is more likely to kill you, as it can trigger ventricular fibrillation. DC is more likely to stop your heart, which can actually be easier to recover from.

As for Edison, he electrocuted the elephant and promoted the electric chair to illustrate the dangers of AC. Not because he thought it would be all that dangerous at household voltages, but because he owned the patents on DC and wanted AC to look bad. Look into Edison a bit and you will see that he was something of a jerk.

However, for power grid purposes, AC is much better, and it won out in the end.

3

dadamn t1_j6lloag wrote

Lots of folks have the right answers for old planes, and plenty have noted that newer planes are mostly switching to standard 3.5mm jacks. One thing nobody has mentioned is that new planes will sometimes have 2-prong plugs in first and business class on long international flights. In those cabins the seats often come with noise cancelling headphones, and the second prong powers the noise cancelling circuitry. Typically on these newer 2-prong plugs, the audio prong is still a standard 3.5mm, so you can use any headphones if you choose not to use the provided ones.

1