Recent comments in /f/explainlikeimfive

CrashCalamity t1_j6lgtir wrote

We don't measure temperature based on feelings. The actual temperature is measured with thermometers (no moisture passes through the glass, so humidity isn't even a factor there), which rely on heat transfer from/to the surrounding air to make the measuring medium (like liquid mercury) expand and contract.

The difference you feel with humidity, and what the weather stations are trying to convey, is that when there is more water in the air it becomes harder for your body to cool off and regulate its temperature. We rely on our ability to sweat, and on that sweat evaporating quickly, to keep our internals at a comfortable level when we're exposed to heat.

3

johnn48 t1_j6lghf1 wrote

I ran across an interesting article which addressed just that point. A number of hypotheses were presented: 1) that the plant reproduced asexually by spreading its roots, 2) that the climate contributed to its demise, 3) that it grew in a narrow geographical area and was harvested to extinction. That's my narrow take on the article, but it's a definite must-read.

7

homarjr t1_j6lge7z wrote

"the brick brown big wall" would have a different meaning.

It makes it sound like "brick brown" is a type of colour.

And then "big wall" is sounds like something different than just a "wall", it's its own noun.

The strict order we normally put these words in actually gives us more flexibility in the words we can use. For example, it makes a colour called "brick brown" possible. If we allowed any random order to describe things, that couldn't happen.

9

TheJeeronian t1_j6lgdvg wrote

The actual temperature is super easy to measure. Get an object, set it in the shade until it reaches a balance (both temperature and moisture) with the surrounding air, and measure its temperature. Humidity and wind only impact us because we're not in balance. We're hotter and wetter than our surroundings.

11

zachtheperson t1_j6lgdmb wrote

Basically:

  1. Make your game look way better than it is
  2. Advertise to millions
  3. Hope thousands download
  4. Even if those thousands quit almost immediately, you still got to show them 1 or 2 ads.

Here's a video of two people talking about their pretty hilarious journey of doing something similar as a joke and accidentally ending up making quite a bit of money: https://youtu.be/E8Lhqri8tZk (seriously worth the watch)

6

FenderMoon t1_j6lg2w0 wrote

Yea, the CPU is basically giving the GPU commands, but the GPU can take those and execute them far faster than the CPU can.

GPUs are very good at anything that involves tons of parallel calculations. E.g. "take this texture, apply it over this region, and shade it with this shader." A CPU would sit there and calculate all of that out one pixel at a time, whereas a GPU has the hardware to load up the entire texture and do tons of pixels in parallel.
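Here's a rough toy sketch of that difference in Python/NumPy (not from any real renderer; the texture and the "shader" math are made up, it just shows one-at-a-time vs. all-at-once):

```python
import numpy as np

# A made-up 1080p grayscale "texture": one brightness value per pixel.
texture = np.random.rand(1080, 1920)

# CPU-style: visit every pixel one at a time.
shaded = np.empty_like(texture)
for y in range(texture.shape[0]):
    for x in range(texture.shape[1]):
        shaded[y, x] = texture[y, x] * 0.8 + 0.1  # toy "shader"

# GPU-style: the same operation expressed over the whole texture at once,
# the kind of work a GPU spreads across thousands of cores.
shaded_at_once = texture * 0.8 + 0.1

assert np.allclose(shaded, shaded_at_once)
```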

It's not that the CPU couldn't do these same calculations, but it'd be way slower at it. GPUs are specifically designed to do this sort of thing.

4

BobbyThrowaway6969 t1_j6lg0ft wrote

The CPU is a mathematician that sits in the attic working on a new theory.

The GPU is hundreds of thousands of 2nd graders working on 1+1 math all at the same time.

These days the CPU is more like 8 mathematicians sitting in the attic, but you get the point.

They're both suited for different jobs.

The CPU could update the picture that you see on the display, but that's grunt work.

Edit: I don't mean the cores in a GPU are stupid; what I meant is that their instruction set isn't as complex & versatile as a CPU's.

663

DragonFireCK t1_j6lfyqb wrote

The key difference is how the two processors function. A GPU is designed to do the same calculation lots of times at once, though with differing values, while a CPU is designed to do lots of different calculations quickly.

A simple way to think about this logic is that a single object on the screen in a game will cover multiple pixels at once, and each of those pixels will generally need the exact same set of calculations with just different input values (think a*b+c with differing values for a, b, and c). The actual rendering process applies the same idea at multiple levels: you typically position and rotate the points (vertices) of each object in the same way. It also turns out that this style of calculation is useful for a lot of other stuff: physics calculations*, large math problems*, and artificial intelligence*, to name a few.
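A toy sketch of that a*b+c idea in Python/NumPy (made-up data; it just shows the same formula applied across a million elements in one expression, which is exactly the shape of work GPUs are built for):

```python
import numpy as np

# A million made-up inputs: the same formula, different values each time.
a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)
c = np.random.rand(1_000_000)

# One expression computes a*b + c for every element; a GPU runs
# thousands of these multiply-adds in parallel in hardware.
result = a * b + c
```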

However, for general program logic you aren't repeating the same calculation over and over with different data; instead you need to vary the calculations constantly based on what the user is trying to do. This logic often takes the form of "if X do Y, else do Z".

Now, modern CPUs have some hardware designed to function like a GPU, even if you discount any embedded GPU. Using it is very good when you only need a small amount of that bulk processing, where the cost of asking the GPU to do it and getting the result back would be too expensive. However, it's nowhere near as fast as the full capabilities of a GPU.

Beyond those design differences which are shared between dedicated and embedded GPUs, a dedicated GPU has the benefit of having its own memory (RAM) and memory bus (the link between the processor and memory). This means both the CPU and GPU can access memory without stepping on each other and slowing each other down. Many uses of a GPU can see massive benefits from this, especially games using what is known as "deferred rendering" which requires a ton of memory.

As a note, there is no reason you couldn't just do everything on one side, and, in fact, older games (e.g. Doom) did everything on the CPU. In modern computers, both the CPU and GPU are what is known as Turing complete, which means they can theoretically perform every possible calculation. It's just that each is optimized to perform certain types of calculations, at the expense of other kinds.

* As a note, artificial intelligence heavily relies on linear algebra, as does computer rendering. Many other math problems can be described the same way, converting the problem into a set of matrix operations, which is precisely what GPUs specialize in.
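For example, repositioning every vertex of a model is a single matrix multiply over all the points at once (a toy NumPy sketch with made-up numbers):

```python
import numpy as np

# 100,000 made-up 3D points (vertices), one per column.
points = np.random.rand(3, 100_000)

# A 3x3 transform; a real one would rotate and scale the model.
transform = np.eye(3)

# One matrix multiply moves every vertex in a single bulk operation,
# exactly the uniform work GPUs specialize in.
moved = transform @ points
```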

2

Xanjis t1_j6lf95a wrote

It's the second. The GPU takes all the information for the scene (the location/size/rotation of objects in the scene) and then calculates what color each pixel of the screen should display. That calculation needs to be done millions of times per second, but it's a very simple calculation, so the GPU is suited to the task with its huge number of weak cores, whereas a CPU has a small number of incredibly powerful cores.

3