blueSGL

blueSGL t1_iv80z8y wrote

> what did he see on Twitter that is worth spending 40 Billion dollars?

As far as I can remember, his financials (and the world's in general) were much better when he put the offer in. The problem with locking in something like that and then having the wind change is that you can be on the hook for comparatively a lot more: the sale price does not change, but the economic climate has.

It really looked like he wanted to go to court to get out of the deal. Then discovery started, and communications were allowed in that would have meant he had to pay anyway because of what was said, so he dropped it all and paid up.

2

blueSGL t1_iux9hew wrote

It'd be infrastructure costs. If you can get lights that direct light correctly, but they are not slot-in replacements and you need to replace/retrofit the attachments/poles, they will not get used because they cost more money.

And the above holds true even if there are slot-in replacements, if they cost more money.

The solution needs to be cheap and easy to implement otherwise it will face massive barriers to being done.

1

blueSGL t1_iuriw6r wrote

Matrix multiplication requires doing additions (and subtractions) and multiplications.

GPUs can do additions (and subtractions) faster than multiplications.

By rejiggering the way the matrix multiplication is written, you can use fewer multiplications and more additions, so it runs faster on the same hardware.

https://en.wikipedia.org/wiki/Strassen_algorithm

>Volker Strassen first published this algorithm in 1969

.....

>In late 2022, AlphaTensor was able to construct a faster algorithm for multiplying matrices of small sizes (e.g. specifically over the field Z₂, 4x4 matrices in 47 multiplications versus 49 by the Strassen algorithm, or 64 using the naive algorithm).[2] AlphaTensor's result of 96 multiplications for 5x5 matrices over any field (compared to 98 by the Strassen algorithm) was reduced to 95 a week later with further human optimization.
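To make the trade concrete, here's a minimal Python sketch of Strassen's 2x2 base case: 7 multiplications instead of the naive 8, paid for with extra additions/subtractions (the function name is mine; the seven products are the standard ones from the Wikipedia article, and applying this recursively to sub-blocks gives the full O(n^2.807) algorithm):

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices (lists of lists) using 7 multiplications."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    # Strassen's seven products
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    # Reassemble the result from additions/subtractions only
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]

print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

Note the naive method does 8 multiplications and 4 additions, while this does 7 multiplications and 18 additions/subtractions, which is a win when multiplications are the expensive operation.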

7

blueSGL t1_itsji9p wrote

I'm sure I saw a Lex Fridman interview with a neuroscientist who said that experience is a post hoc narrative of events, and that you can watch the brain make decisions using fMRI, where the choice is fixed in before the conscious observer thinks it is. Annoyingly, I can't remember who he was interviewing.

10

blueSGL t1_itsf4uj wrote

>the majority of small businesses are not buying robots to replace people in the near term.

What about small businesses where the work can be done remotely? The percentage of the entire workforce who don't need to be physically present in a specific location to carry out their jobs ranges (from a quick google) from 1/4 to 4/10.

And large business is already looking at automation. With control models like this making a six-axis arm simple to program and to reprogram for a different task, it only needs to be slightly better than a human on a cost/benefit analysis to make it worthwhile. (Was the cost in these things originally the hardware or the software? I've never looked into it.)

3

blueSGL t1_itscy5u wrote

It all comes down to money in the end: if [business] can make more money by using AI, it will get used.

Are there going to be enough companies left doing things 'the old way' to keep employment numbers up, even though it's less cost effective?

> My grandmother still goes to the bank window to withdraw cash and has never used a computer in her life.

And yet people like her don't provide enough financial incentive to keep branches open.

https://www.bankingdive.com/news/us-banks-close-2927-branches-in-2021-a-38-jump/617594/

5

blueSGL t1_its73w0 wrote

The other confusion is that you don't need a general human-level AI in all fields to cost jobs. A collection of narrow AIs, each selected for a type of work and feeding into each other, will be able to replace jobs without even looking at the larger multimodal systems that are being built.

24

blueSGL t1_its6j7f wrote

> highly specialized machines.

That is, of course, until you can use a language model to instruct humanoid robots.

Something like this: https://vimalabs.github.io/ but for a humanoid/Teslabot body.

Then it's generic hardware and generic (likely fine-tuned) software.

2

blueSGL t1_itlmagr wrote

Thinking about [thing] necessitates being able to form a representation/abstraction of [thing]; language is a formalization of that which allows for communication. It's perfectly possible to think without a language attached, but more than likely having a language makes thinking easier.

13

blueSGL t1_it7tzvt wrote

I've already seen people generate images for their RPG campaigns using Stable Diffusion. How many rule/campaign books will an LLM need to crunch through before it can spit out endless variations on a theme for your favorite system (or act as a major accelerator for those already creating them)?

Edit: actually, let's expand on this.

What happens when a sufficiently advanced model gets licensed for fine-tuning to paying companies, and Wizards of the Coast feeds in the entire corpus of data it controls, starts using that to create or help create expansions and systems, and the former creators shift to being editors?

Now do that for every industry that has a text backbone somewhere in it: movie/TV scripts, books, comics, radio dramas, music video concepts, and so on.

5