Recent comments in /f/explainlikeimfive

DiamondIceNS t1_j6oy6v3 wrote

Let me ask a counter question:

You are playing a game where someone rolls a 6-sided die over and over, and each time you have to guess which number it lands on before it is rolled. If you guess correctly, you win.

You happen to know that the die is loaded. It's not a 1/6 chance for each number to show up. The side numbered "4" is slightly more likely to occur than all of the others.

Knowing that, why would you ever guess anything other than "4"? You're not going to win every time, but it's always your best shot.
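The logic above is easy to check numerically. Here's a quick Python sketch (the exact weights are made up for illustration; the die just needs "4" slightly favored) showing that always guessing the most likely face wins most often:

```python
import random

# Hypothetical loaded die: "4" comes up a bit more often than the rest.
weights = {1: 0.15, 2: 0.15, 3: 0.15, 4: 0.25, 5: 0.15, 6: 0.15}
faces = list(weights)
probs = list(weights.values())

def win_rate(guess, trials=100_000):
    """Fraction of rolls where a fixed guess matches the loaded die."""
    rolls = random.choices(faces, weights=probs, k=trials)
    return sum(r == guess for r in rolls) / trials

# Guessing 4 every time wins ~25% of rolls; any other fixed guess wins ~15%.
print({g: round(win_rate(g), 3) for g in faces})
```

No strategy wins every time, but no other guess ever beats sticking with "4".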

1

-paperbrain- t1_j6oy26i wrote

But surely if EVERYONE refused to use the self checkout, piled into line for the cashier lanes, and made sure to take their business to the first stores to hire more cashiers, then that would have an effect on employment.

That may be unlikely, but stores don't dictate everything regardless of consumer preference. There are tons of things stores might like to do that customers wouldn't put up with, and there's at least a handful of examples of businesses folding to consumer sentiment and changing their plans.

So one person going through self checkout doesn't cost a job exactly. But everyone accepting the shift to self checkout IS costing jobs. Just like one person doesn't appoint the president, but the way everyone votes DOES.

2

Little_Noodles t1_j6oxw3i wrote

It’s also worth noting that in many stores, the self checkouts aren’t actually covering more lane space per employee than the staffed checkout lanes used to.

In my grocery store, they removed 2 checkout lines to put in two rows of self checkout, each with 6 stations, each row headed by an employee.

I get in and out of there faster, especially with small orders, since there’s less of a bottleneck at the registers. But it’s the same number of people per lane as it was before.

3

suedehelpme t1_j6ox9cy wrote

It doesn't really matter whether we call it an adjective or not. The point is that brick and wall co-occur so much that we register the pair as a single entry in our mental dictionary. We wouldn't typically call a wall made of concrete bricks or Lego bricks a "brick wall", meaning that the term refers to something distinct from "a wall that's made of bricks". The intonation of "brick wall" is also distinct from "tomato wall", since the latter is pronounced with a rise on tomato. That's pretty good evidence of whether or not something has essentially become a fused expression. Compare how you'd pronounce "cheese dip" and "cheese shoes".

2

blue_nylon t1_j6owwzd wrote

Say you want to draw a circle with radius R in location X,Y on your screen. To do this in the CPU you would need to do all the math to figure out which pixels should light up and which should not in order to make the circle, and then send that pixel data to the monitor. This takes a lot of CPU resources and will generally be very slow.

What a GPU does is simplify common operations like this for the CPU. Instead of doing all the math, the CPU can send a command to the GPU to draw the circle with the same parameters, and the GPU will handle it automatically. This will free up the CPU to do other stuff. Additionally the GPU will have dedicated circuits in the chip that can do all the math and draw to the screen much faster than the CPU.

So technically yes, the CPU could send pixel data directly to the display, but a separate GPU accelerates a lot of the common math needed to draw images to the screen.
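To make the "CPU does all the math" part concrete, here's a rough Python sketch (the framebuffer and sizes are made up for illustration) of deciding pixel by pixel which ones light up for a circle. This per-pixel arithmetic is exactly the kind of work a GPU's dedicated circuits take off the CPU's hands:

```python
# Hypothetical tiny framebuffer: draw a filled circle of radius R
# centered at (X, Y) by testing every pixel against the circle equation.
WIDTH, HEIGHT = 80, 24
X, Y, R = 40, 12, 8

framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]

for py in range(HEIGHT):
    for px in range(WIDTH):
        # A pixel is "on" if it lies within distance R of the center.
        if (px - X) ** 2 + (py - Y) ** 2 <= R * R:
            framebuffer[py][px] = 1

# The CPU would then ship this pixel data off toward the display.
```

With a GPU, the CPU would instead just issue one "draw circle at (X, Y) with radius R" command and move on to other work.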

2

Kuraio-Kadaver t1_j6owwkr wrote

The obsession around Area 51 comes from the so-called testimony of one Bob Lazar, who claimed in an interview that he worked on reverse-engineered alien technology at the site.

Lazar's claims were made into a film, "Bob Lazar: Area 51 and Flying Saucers."

In 2014 Boyd Bushman claimed he'd reverse engineered flying saucer tech for Lockheed Martin at Area 51, and so here we are.

1

stairway2evan t1_j6owpnh wrote

And our ancestors at the time were the early mammals - smaller, rat-like creatures, and a few other species. They were mostly scavengers who were better able to survive the difficult conditions, while many other animal and plant species died.

As things got better, without giant lizards stomping around everywhere, those mammals were able to thrive, diversify, and spread all over the world into the wide array of mammals we have today. Including the first primates, somewhere around 55 million years ago (well after the meteor extinction event), who would eventually (millions upon millions of years later) give rise to us.

13

dimonium_anonimo t1_j6owo34 wrote

A solar panel is one type of device that converts incident photons into electrical energy. A similar device is inside the camera and is generally referred to as "the sensor." Slightly different, though: the sensor is tuned to specific colors/wavelengths of light. It's split into pixels, and each pixel has a sub-pixel for red, green, and blue light. If a red, green, or blue photon hits the appropriate sub-pixel, it generates a small electrical current which can be measured.

Just like human eyesight, the sub-pixels are not extremely particular about what wavelength excites them. There's a wider response curve. So, for instance, yellow light is somewhere between red and green and will excite both the red and green sub-pixels, but to a slightly lesser degree, meaning a slightly weaker signal (lower current/voltage).

What's really key to making an image instead of a poor-efficiency solar panel is the lens, which very finely aims rays of light based both on where on the lens they hit and what angle they come from. The lens bends light because it has a different refractive index than air (feel free to ask additional questions on this... or any of these concepts really). The bent light rays hit very precise locations on the sensor, so only the right pixel is affected.

The electronics will scan the pixels, typically an entire row or column at a time, then shift their way down the entire sensor. This means if something moves very quickly relative to the speed at which the electronics scan the sensor, it can end up slightly warped due to something called rolling shutter.

These files end up fairly large, so the camera actually stores roughly the square root of the excitation (the current is first scaled from 0 for no current to 1 for max current, then the square root is taken and stored on a 0-255 scale). This not only compresses the data, it also samples small changes in low light levels better than the same change in very bright conditions, which is also how human vision works (one light bulb switched on when only 1 other is already on doubles the amount of light, but if there are 100 on and you turn on one more, it's only 1% more light; so we notice contrast better in low light conditions). Your computer squares the numbers back to a linear scale before displaying the picture. fun, relevant video
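A tiny sketch of that square-root trick (simplified; real cameras use a gamma curve that is close to, but not exactly, a square root):

```python
def encode(current_fraction):
    """Map sensor current (0.0 = dark, 1.0 = max) to a stored 0-255 value,
    taking the square root so dark tones get more of the code values."""
    return round((current_fraction ** 0.5) * 255)

def decode(stored):
    """Square the stored 0-255 value back to a linear 0.0-1.0 brightness."""
    return (stored / 255) ** 2

# Doubling the light at the dark end moves the stored value much more
# than the same doubling near the bright end:
print(encode(0.01), encode(0.02))   # dark: a big jump in code values
print(encode(0.50), encode(0.51))   # bright: a tiny jump
```

That matches the light-bulb intuition above: small absolute changes in dim scenes get many more distinct stored values than the same changes in bright scenes.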

1

MOOzikmktr t1_j6owo11 wrote

The first "thing" that came out of Area 51 as a tangible, functional object that captured real public interest was the SR-71 Blackbird. It had been kept under wraps for a while and was already well into its conceptual design & flight testing phase when the public was informed of its existence (perhaps by accident, I think).

This thing still holds the official flight speed record for an air-breathing jet, about Mach 3.3 (2,193 mph), set in 1976, with an unverified claim of roughly Mach 3.5 in 1986.

So if that's the FIRST thing that was uncovered from whatever they were working on in the early 60s at Area 51, imagine what weird-ass shit has gone on there in the decades since... back then the integrated circuit had barely been invented.

2