
visarga t1_iqzerze wrote

> If we assume AI can eventually create a movie that is oscar nomination worthy every 10 seconds for essentially no cost

It's not going to be a "movie" but more like a sim or a game, and we're not going to make it for entertainment but as a training ground for AI. Simulation goes hand in hand with AI because real-world data is expensive and limited, while sims only cost electricity to run.

We are already seeing generative models used as a source of training data. link

1

visarga t1_iqxstx5 wrote

Just remember how you use your phone and try explaining that to a person from 200 years ago; I bet they'd think you are already deep into the singularity by their standards.

Having food, water, a toilet, electricity and internet is nothing to brag about; even the poorest of us should have them. But just a couple of centuries ago these things would have been off the scale.

If you look back over decades or a couple of centuries, life has been getting steadily better. It wasn't fake progress, but we're busier than ever.

Many people think that after the singularity we'll have nothing to do anymore. On the contrary, I think we'll have more to do than before. We'll still compete, and we'll often be unhappy, just like before.

Who said the purpose of AI should be to improve our lives? The purpose of life is to expand and exist despite the challenges it meets. That means competition and exploration, not peace and detachment. We didn't come out on top of nature by being nice; we exploited every advantage and every bit of knowledge along the way.

5

visarga t1_iqsob64 wrote

Cool down. It's not as revolutionary as it sounds.

First of all, they reuse a code model.

> Our model is initialized with a standard encoder-decoder transformer model based on T5 (Raffel et al., 2020).

They use this model to randomly perturb the proposed model's source code.

> Given an initial source code snippet, the model is trained to generate a modified version of that code snippet. The specific modification applied is arbitrary
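In other words, the mutation operator is just a seq2seq code model sampling edits. A minimal sketch of that idea, assuming a Hugging Face T5 checkpoint; the checkpoint name, prompt format, and sampling settings here are placeholders, not the paper's:

```python
# Hedged sketch: a T5-style encoder-decoder proposing a modified version of a
# code snippet. "t5-base" is a placeholder checkpoint, not the paper's model.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

snippet = "def add(a, b):\n    return a + b"
inputs = tokenizer(snippet, return_tensors="pt")

# Sampling makes the proposed modification stochastic, i.e. a "random perturbation".
outputs = model.generate(**inputs, do_sample=True, top_p=0.95, max_new_tokens=64)
modified = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(modified)
```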

Then they use evolutionary methods: a population of candidates plus a genetic mutation and selection process, roughly as sketched after the next quote.

> Source code candidates that produce errors are discarded entirely, and the source code candidate with the lowest average training loss in extended few-shot evaluation is kept as the new query code
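Put together, the whole thing reads like a simple evolution strategy with the language model as the mutation operator. A rough sketch of that loop, not the paper's code; `mutate` and `evaluate` are hypothetical stand-ins for the model call above and the "extended few-shot evaluation" loss:

```python
# Rough sketch of the mutate / discard-errors / keep-lowest-loss loop described above.
from typing import Callable, List, Tuple


def evolve(query_code: str,
           mutate: Callable[[str], str],      # code model samples a modified snippet
           evaluate: Callable[[str], float],  # average training loss in few-shot evaluation
           population_size: int = 32,
           generations: int = 10) -> str:
    for _ in range(generations):
        # Mutation: the code model proposes arbitrary modifications of the current query code.
        candidates = [mutate(query_code) for _ in range(population_size)]

        scored: List[Tuple[float, str]] = []
        for cand in candidates:
            try:
                scored.append((evaluate(cand), cand))
            except Exception:
                # Candidates that produce errors are discarded entirely.
                continue

        if scored:
            # Selection: the candidate with the lowest average training loss
            # becomes the new query code.
            _, query_code = min(scored, key=lambda pair: pair[0])
    return query_code
```

Selection here is purely greedy on average loss, which is why it feels closer to random search than to the black-box optimisation work mentioned below.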

A few years ago we had black-box optimisation papers that used sophisticated probability estimates to pick the next candidate. It was an interesting subfield. This paper just takes random attempts.

76