Monnok t1_jdodiq3 wrote

Your Earth-volume : lunar-orbit-volume ratio works for the likelihood of finding the asteroid inside or outside of the Earth at any given moment.

But we can assume the object traces an entire path of moments, passing straight through on [basically] a line. An asteroid “looking ahead” directly at the round perimeter of the Earth might briefly occupy some point “in front” of the Earth before collision… but that’s still a collision path. And it’s never gonna get to any points on the other side. What it “sees” is just a flat disc inside a flat disc. It’s either heading through the empty part or not.
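A quick back-of-the-envelope sketch of the difference (radii are approximate round numbers, and the lunar orbit is idealized as a sphere/disc centered on Earth): a *snapshot* probability scales with the volume ratio, but a body passing straight through on a line only “sees” Earth’s cross-sectional disc inside the disc of the lunar orbit, an area ratio.

```python
# Approximate figures: mean Earth radius and mean Earth-Moon distance.
EARTH_RADIUS_KM = 6_371
LUNAR_ORBIT_KM = 384_400

r = EARTH_RADIUS_KM / LUNAR_ORBIT_KM

# Snapshot picture: chance a random point inside the lunar-orbit sphere
# happens to be inside the Earth -> ratio of volumes, r cubed.
volume_ratio = r ** 3

# Straight-line pass: chance a line through the lunar-orbit disc also
# crosses Earth's disc -> ratio of areas, r squared.
area_ratio = r ** 2

print(f"volume ratio ~ 1 in {1 / volume_ratio:,.0f}")  # roughly 1 in 220,000
print(f"area ratio   ~ 1 in {1 / area_ratio:,.0f}")    # roughly 1 in 3,600
```

So the disc-in-a-disc picture makes a through-passing object dozens of times more likely to hit than the volume ratio alone would suggest.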

Monnok t1_ja28d43 wrote

There is a pretty widely accepted and specific definition for general AI... but I don't like it. It's basically a list of simple things the human brain can do that computers didn't happen to be able to do yet in like 1987. I think it's a mostly unhelpful definition.

I think "General Artificial Intelligence" really does conjure some vaguely shared cultural understanding laced with a tinge of fear for most people... but that the official definition misses the heart of the matter.

Instead, I always used to want to define General AI as a program that:

  1. Exists primarily to author other programs, and

  2. Actively alters its own programming to become better at authoring other programs

I always thought this captured the heart of the runaway-train fear that we all sorta share... without a program having to necessarily already be a runaway-train to qualify.
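The two criteria can be illustrated with a deliberately trivial toy (hypothetical names, nothing like a real AGI): a system whose primary output is the source code of other programs, and whose program-authoring logic gets rewritten between runs.

```python
# Toy sketch of the two-criterion definition above. "author_program" stands
# in for criterion 1 (its output is other programs, as source text), and
# "improve_self" stands in for criterion 2 (the authoring logic changes).

def author_program(template_version: int) -> str:
    # Criterion 1: the system's primary output is other programs.
    if template_version == 1:
        return "def add(a, b): return a + b"
    # A "later" version of the author emits a more general program.
    return "def add(*args): return sum(args)"

def improve_self(current_version: int) -> int:
    # Criterion 2 (stand-in): the system alters its own authoring behavior.
    return current_version + 1

version = 1
program_v1 = author_program(version)
version = improve_self(version)
program_v2 = author_program(version)

# The authored source is itself a runnable program:
scope = {}
exec(program_v2, scope)
print(scope["add"](1, 2, 3))  # prints 6
```

The runaway-train worry in the comment above is exactly the loop here: each pass through `improve_self` makes the next `author_program` output better, with no human in the cycle.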
