Recent comments in /f/deeplearning

tired_fella t1_iqqejcl wrote

Give the Oryx Pro a thought too! For similar specs, you can get a similar price to the Alienware, with Ubuntu preinstalled. Sure, it's based on Clevo laptops, but they've also made some customizations specifically to make things work perfectly with Ubuntu.

But in general, desktop PCs are preferred for deep learning. In my case, I have an ITX machine for side projects. Sadly, the GPU in it is outdated...

1

cyberpunkstrategy t1_iqq2vh4 wrote

The fact that laptops are the suboptimal choice is exactly why people ask for advice when they face situational constraints.

Take me: I miss my home-built tower terribly, but a change in my life situation made anything immobile untenable. It's not the life situation I would choose, nor the setup, but it's the required setup for the situation.

2

nutpeabutter t1_iqpxj3a wrote

Kinda frustrating how half of the help posts here are requests for laptops. Like, have they not bothered to do even the tiniest bit of research?? At the same price you could get an equivalently specced desktop/server AND an additional laptop, with the added bonus of being able to run long training sessions without interruption.

2

cma_4204 t1_iqph0hw wrote

$2,700 on Newegg; my work got it for me. I have it on a stand, and it definitely gets a little warm when running PyTorch, but it's never been an issue. There is a key to instantly turn the fans up to 100% if needed, but I've never had to use it.

https://www.newegg.com/titanium-blue-msi-ge-series-ge66-raider-11ug-271-gaming-entertainment/p/N82E16834155921

2

incrediblediy t1_iqp8nmt wrote

> Price: $3,800

You can still get multiple RTX 3090 24 GB cards at this price; I've seen even brand-new cards going for US$900.

> Is 64 GB of RAM really needed for most moderate deep/machine learning projects or should 32 GB with a reduced batch size work fine?

You will be training on the GPU, so batch size is limited by the 16 GB of VRAM, not by system RAM.
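For a rough sense of what that limit means, here is a back-of-envelope sketch. All the sizes are hypothetical example numbers (a ~1 GB model, ~0.25 GB of activations per sample, Adam-style optimizer state), not measurements of any real model:

```python
# Back-of-envelope estimate of the largest batch that fits in VRAM.
# All sizes below are hypothetical examples, not real measurements.

def max_batch_size(vram_gb, model_gb, per_sample_gb):
    """How many samples fit once weights, gradients, and optimizer state
    are resident? (Adam roughly triples the weight footprint.)"""
    overhead = model_gb * 3           # weights + gradients + optimizer state
    free = vram_gb - overhead
    return int(free // per_sample_gb)

# A ~1 GB model with ~0.25 GB of activations per sample:
print(max_batch_size(16, 1.0, 0.25))  # 16 GB laptop GPU -> 52
print(max_batch_size(24, 1.0, 0.25))  # 24 GB RTX 3090   -> 84
```

The point is just that the batch size budget comes out of VRAM; adding 64 GB of system RAM doesn't change either number.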

3

yannbouteiller t1_iqozhsf wrote

Alienware. Take the minimum RAM/SSD and replace those yourself (you can do this on the 17″ version; double-check that this is also true for the 15″ version, since I think I remember some issue with the x15 like the RAM being soldered). You get the Dell on-site warranty, and the machine is much cooler, probably in both senses.

The real issue with both machines is that your GPU is soldered to the motherboard and thus will likely kill your laptop eventually.

Also, good to know before you buy, I got myself an AW x17 R2 for prototyping and gaming, and I realized that the built-in speakers make the chassis vibrate and create a terrible crackling noise if you use them at mid to high volume. This defect seems to be present on the whole series. Also, the webcam is crap, and the battery doesn't last long. Not sure if the Lambda laptop is any better in these regards, though.

A better bet might be the MSI Raider GE76 (if they have a 15-inch equivalent), but it looks a bit more flashy / less professional, you don't get on-site repairs, and the power supply is less portable, I think.

1

cma_4204 t1_iqogqbg wrote

I have an MSI laptop with an RTX 3070 and 32 GB of RAM. It is plenty for experimenting and training small models/datasets. Once I have something working and need more power, I rent GPUs by the hour from Lambda Labs or RunPod. For $1–2/hr I'm able to get around 80 GB of GPU memory. Long story short, I would go for the cheaper option and use the cloud when needed.
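The arithmetic behind "cheaper laptop + cloud" is worth spelling out. Using the $3,800 laptop price and the mid-point of the $1–2/hr rental rate quoted in this thread (both illustrative, not real quotes):

```python
# Break-even between buying a high-end GPU laptop and renting cloud GPUs.
# Figures are the illustrative numbers quoted in this thread.

laptop_price = 3800   # USD, high-end deep learning laptop
cloud_rate = 1.5      # USD per hour, mid-point of the $1-2/hr quote

break_even_hours = laptop_price / cloud_rate
print(round(break_even_hours))  # ~2533 hours of rented GPU time
```

That's over 2,500 hours of rented big-GPU time before the expensive laptop pays for itself, which is why many people buy the modest machine and rent the rest.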

9

Karyo_Ten t1_iqnb0i2 wrote

Neither.

Mac M1 for deep learning? No NVIDIA GPU, no deep learning. And before people pull out pitchforks about PyTorch and TensorFlow supporting the M1: it's a pain, and many ecosystem packages only support CUDA. Recompiling everything is a time sink.

The RTX 2060 is a bit of a bummer when the 3060 12 GB is a clean upgrade for not much more; 6 GB is getting small these days. And you didn't mention the RAM? 16 GB minimum just to have your browser, VS Code, and Discord/Slack or whatever you use to communicate, and then your model.

3