Recent comments in /f/deeplearning

Born_Judge6078 t1_j6c55d2 wrote

Evolutionary, reinforcement and swarm optimization algorithms are interesting variations of what you’ve described, and they’re not very complex to implement if you have beginner knowledge of vectors and matrices. The real struggle would be engineering the environment the agents/entities exist in: the environment is what defines an agent’s fitness to survive or solve a problem.
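
To give a flavour of how little machinery the loop itself needs, here’s a toy sketch in Python. The “environment” here is just a fitness function scoring each agent’s parameter vector; the target vector, population size and mutation scale are arbitrary example choices, not anything from a real problem.

```python
import random

TARGET = [3.0, -1.0, 2.0]          # stand-in for "the problem to solve"

def fitness(agent):
    # Higher is better: negative squared distance to the target behaviour.
    return -sum((a - t) ** 2 for a, t in zip(agent, TARGET))

population = [[random.uniform(-5.0, 5.0) for _ in TARGET] for _ in range(20)]

for generation in range(50):
    population.sort(key=fitness, reverse=True)   # rank agents by fitness
    survivors = population[:5]                   # selection
    population = survivors + [
        [gene + random.gauss(0, 0.3) for gene in random.choice(survivors)]
        for _ in range(15)                       # mutated offspring
    ]

print("best agent found:", max(population, key=fitness))
```

All the real difficulty hides inside `fitness`: for anything interesting, that line becomes a whole simulated environment.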

1

Rude_Ad_4174 t1_j6bdock wrote

Reply to comment by raulkite in M2 pro vs M2 max by raulkite

I’m a total noob in this space, but I agree it’s probably better to have just the M2 or M2 Pro and run notebooks on a remote server with an Nvidia GPU than to get the M2 Max. But the M2 Max can now be configured with 96 GB of memory, so I’m curious how much difference that makes 🤔

1

perrohunter t1_j6abuz7 wrote

For machine learning workloads the M2 Max will show a substantial difference, so I’d recommend you go with the M2 Max.

2

raulkite OP t1_j6a41la wrote

Is it worthwhile to move from buying the M2 Pro with 32 GB to the M2 Max with 64 GB? My feeling is no, and that I should keep running notebooks on the server. But the possibility of running small training jobs and testing fluently on the laptop is soooo interesting 🤔

The Neural Engine seems to be the same on both models.

Interested in opinions.

4

LiquidDinosaurs69 t1_j69jpk3 wrote

Actually, there aren’t evolution-based locomotion optimizers (that I know of), but there are reinforcement-learning-based ones (much faster and more efficient). It actually probably won’t be very difficult: you just need the stable-baselines3 reinforcement learning library plus an existing OpenAI Gym environment for whatever walking robot you want to experiment with. I think there are gyms available for humanoids and quadrupeds. You could get it running by following the stable-baselines3 tutorials.
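
Roughly, it looks like this (a minimal sketch, assuming a recent stable-baselines3 2.x, which uses Gymnasium, and the MuJoCo extras installed; the env id and timestep budget are just example values):

```python
import gymnasium as gym
from stable_baselines3 import PPO

env = gym.make("Humanoid-v4")                 # MuJoCo humanoid locomotion task
model = PPO("MlpPolicy", env, verbose=1)      # standard on-policy RL algorithm
model.learn(total_timesteps=1_000_000)        # training budget; tune to taste
model.save("humanoid_ppo")

# Roll out the learned walking policy.
obs, _ = env.reset()
for _ in range(1000):
    action, _ = model.predict(obs, deterministic=True)
    obs, reward, terminated, truncated, _ = env.step(action)
    if terminated or truncated:
        obs, _ = env.reset()
```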

Unfortunately, these optimize control policies, not the actual robot design itself, which I assume is what you want. Theo Jansen (the Strandbeest guy) used an evolutionary algorithm to come up with his linkage design, which I think is closer to what you want to do. I’m not aware of any existing software that lets you do that, though.

You could implement it yourself in Python or C++ using robotics-oriented rigid-body dynamics libraries and solvers, but if you haven’t done this before it will be pretty hard.
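
If you do go the DIY route, the shape of it is an evolutionary loop wrapped around a physics simulator. Purely as an illustrative sketch (assuming PyBullet; the bundled r2d2 model and the per-joint velocity vector are placeholders for whatever robot and design/controller parameterization you actually care about, not a real design optimizer):

```python
import random
import pybullet as p
import pybullet_data

p.connect(p.DIRECT)   # headless physics server

def evaluate(joint_velocities, steps=600):
    """Fitness = distance the base travels when each joint is driven at a fixed velocity."""
    p.resetSimulation()
    p.setAdditionalSearchPath(pybullet_data.getDataPath())
    p.setGravity(0, 0, -9.81)
    p.loadURDF("plane.urdf")
    robot = p.loadURDF("r2d2.urdf", [0, 0, 0.5])
    for j in range(p.getNumJoints(robot)):
        p.setJointMotorControl2(robot, j, p.VELOCITY_CONTROL,
                                targetVelocity=joint_velocities[j % len(joint_velocities)],
                                force=50)
    for _ in range(steps):
        p.stepSimulation()
    (x, y, _), _ = p.getBasePositionAndOrientation(robot)
    return (x ** 2 + y ** 2) ** 0.5

# Simple truncation-selection evolution over the parameter vector.
population = [[random.uniform(-10, 10) for _ in range(4)] for _ in range(8)]
for gen in range(5):
    scored = sorted(population, key=evaluate, reverse=True)
    parents = scored[:4]
    population = parents + [[v + random.gauss(0, 1.0) for v in parent] for parent in parents]
    print(f"gen {gen}: best distance {evaluate(scored[0]):.2f} m")
```

Swapping the velocity vector for actual morphology parameters (link lengths, joint placements) is where it gets genuinely hard.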

This is actually something I want to do at some point too but I’m busy with other projects right now.

1

Screend t1_j67ybnp wrote

I’m in a somewhat similar boat to you, as I’m stepping into a team grappling with problems different from the ones I’ve been specialising in. I would say: identify the most burning topics for your new role, take a short course, and most importantly, apply what you’ve learnt on a mini project. Be kind to yourself and pick 2-3 things max that will unlock you in your new role, then build from there.

1

JJJJJJtti t1_j67tyg1 wrote

I have no idea about pre-existing evolution simulation environments, but making your own is definitely feasible. It will simply require a solid programming background (perhaps physics as well, depending on what you have in mind). Based on your current background, I would suggest Python (or C/C++ if you really want maximum control over your hardware, even though that will be much harder) and go with it.

1

BellyDancerUrgot t1_j67p0e9 wrote

A degree, maybe a part-time professional master’s from a school where the teaching faculty do active research. Or just read papers. Two Minute Papers is a good channel to start off with.

In ML/DL, 1 year is already ancient and 4 years is prehistoric lol. For context, if you choose a topic, say 2D-to-3D translation since the advent of NeRFs a couple of years back, there’s a stupid number of papers on it trying various novel approaches: everything from using voxels to store geometry in new ways, to geometry-aware GANs, to multi-view compression using ViTs, etc.

So choose a topic and focus on that, otherwise it’s a lot.

4

FastestLearner t1_j677lru wrote

For an absolute beginner, PyTorch is definitely what I would recommend. It’s like an extension of NumPy.
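
A tiny example of what I mean (assuming torch is installed; shapes and names are arbitrary):

```python
import torch

x = torch.arange(6, dtype=torch.float32).reshape(2, 3)  # same feel as np.arange(...).reshape(...)
w = torch.randn(3, 1, requires_grad=True)                # parameters that track gradients

loss = (x @ w).sum()   # NumPy-style matmul / broadcasting semantics
loss.backward()        # autograd is the big thing NumPy doesn't give you
print(w.grad)          # d(loss)/dw, filled in by backward()
```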

Both frameworks are extremely mature and will get the job done no matter what you throw at them (I don’t get what you mean by practicality).

For industry purposes, if you have a particular company in mind, check which framework they use (ask an employee on LinkedIn) and learn that one (some companies still have their codebases in TF1; they never updated). If you’re in the market for a job hunt, having both on your CV will give you the best chance.

1

perfopt t1_j66x6kz wrote

I work in the DL area with a previous background in systems. The field is diverse and rapidly changing. I have sort of come to the conclusion that, for me, it is best to learn how to apply the technology in a few chosen domains (say NLP or image recognition) rather than chasing everything that is happening.

7

Appropriate_Ant_4629 t1_j65cjuf wrote

> OpenAI managed to solve all of its problems at once. They raised a boatload of money and have access to all the compute they need. On top of that, they solved their distribution problem. They now have access to Microsoft’s sales teams and their models will be integrated into MS Office products.

But they already did that years ago when they sold Microsoft an exclusive license to an older GPT.

This is just a nice extension of that deal.

4

Lankyie t1_j64czb1 wrote

First time I’m hearing the numbers, but paying out 255 billion is far from easy and won’t go ‘quickly’. In the best-case scenario, which in my opinion is unlikely, OpenAI becomes one of the largest companies in the world, and even then it would take the better part of a decade to pay back the money. We shouldn’t forget that Microsoft, Google, Salesforce, Adobe and others all have more personnel working on developing AI than OpenAI itself. Microsoft is also tapping into the knowledge generated by OpenAI, and Google has been at the forefront of relevant papers. Other than that, I agree with the thesis :)

5