Recent comments in /f/deeplearning
PleaseKillMeNowOkay OP t1_iqqwpem wrote
Reply to comment by thebear96 in Neural network that models a probability distribution by PleaseKillMeNowOkay
That's what I thought, but I haven't been able to get the second model to even match the performance of the first one. I tried regularization methods without much success.
PleaseKillMeNowOkay OP t1_iqqw34u wrote
Reply to comment by UsernameRelevant in Neural network that models a probability distribution by PleaseKillMeNowOkay
I did. The second model performed worse. I didn't think that was possible.
BobDope t1_iqqvrl5 wrote
Reply to comment by BobDope in New Laptop for Deep/Machine Learning by MyActualUserName99
Y’all know it’s true
thebear96 t1_iqqsaoe wrote
Assuming the same hyperparameters, the second network should theoretically converge to a solution more quickly. So you may need to adjust the hyperparameters and perhaps add some dropout so that the model doesn't overfit.
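If it helps, a minimal PyTorch sketch of what adding dropout could look like. The layer sizes, dropout rate `p`, and input dimension here are all made up; tune them for the actual problem.

```python
# Hypothetical sketch: a small regression net with dropout in PyTorch.
# All sizes and p are illustrative, not recommendations.
import torch
import torch.nn as nn

class SmallNet(nn.Module):
    def __init__(self, in_dim=10, hidden=64, p=0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Dropout(p),          # randomly zeroes activations during training
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Dropout(p),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

model = SmallNet()
model.eval()  # dropout is disabled automatically in eval mode
```

Note that `model.train()` / `model.eval()` matter here: dropout is only active during training, so remember to call `eval()` before measuring validation or test performance.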
tired_fella t1_iqqejcl wrote
Give the Oryx Pro a thought too! For a similar spec, you can get a similar price to the Alienware, with Ubuntu preinstalled. Sure, it's based on Clevo laptops, but they've also made some customizations specifically so that everything works perfectly with Ubuntu.
But in general, desktop PCs are preferred for deep learning. In my case, I have an ITX machine for side projects. Sadly, the GPU in it is outdated...
UsernameRelevant t1_iqq5hp9 wrote
> Is my second network going to perform at least as well as my first network?
Impossible to say. In general, more parameters mean that you can get a better fit, but also that the model overfits more easily.
Why don’t you compare the models on a test set?
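For what it's worth, a toy sketch of that comparison (the stand-in "models" and data here are made up; swap in the two real networks' predictions):

```python
# Toy sketch: compare two models on a held-out test set by test MSE.
# model_a and model_b are placeholders for the two trained networks.
def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def model_a(x):
    return 2 * x        # stand-in for network 1's predictions

def model_b(x):
    return 2 * x + 1    # stand-in for network 2's predictions

# toy data: y = 2x, with the last few points held out for testing
xs = list(range(10))
ys = [2 * x for x in xs]
x_test, y_test = xs[8:], ys[8:]

mse_a = mse(y_test, [model_a(x) for x in x_test])
mse_b = mse(y_test, [model_b(x) for x in x_test])
# whichever model has the lower test MSE generalizes better on this data
```

The important part is that the held-out points were never used for training either model, so the comparison measures generalization rather than memorization.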
WhizzleTeabags t1_iqq59i6 wrote
It can perform worse, the same, or better.
cyberpunkstrategy t1_iqq2vh4 wrote
Reply to comment by nutpeabutter in New Laptop for Deep/Machine Learning by MyActualUserName99
The fact that laptops are the suboptimal choice is exactly why people ask for advice when they face situational constraints.
Take me: I miss my home-built tower terribly, but due to a change in life situation, anything immobile became untenable. It's not the life situation I would choose, nor the setup, but it's the setup the situation requires.
nutpeabutter t1_iqpxj3a wrote
Reply to comment by incrediblediy in New Laptop for Deep/Machine Learning by MyActualUserName99
Kinda frustrating how half of the help posts here are requests for laptops. Have they not bothered to do even the tiniest bit of research? At the same price you could get an equivalently specced desktop/server AND an additional laptop, with the added bonus of being able to run long training sessions without interruption.
majh27 t1_iqpw8pw wrote
Build a desktop or pay for hosting; get a MacBook Pro or one of those Pop!_OS laptops.
Final-Rush759 t1_iqpujzr wrote
A desktop with a 3090 is much better and cheaper. Buy a big case with good airflow. Or go for a 4090.
peltierchip t1_iqpj22d wrote
They're just rebranded Razer Blades.
cma_4204 t1_iqph0hw wrote
Reply to comment by Icy-Put177 in New Laptop for Deep/Machine Learning by MyActualUserName99
$2700 on Newegg; my work got it for me. I have it on a stand, and it definitely gets a little warm when running PyTorch, but that's never been an issue. There's a key to instantly turn the fans up to 100% if needed, but I've never had to use it.
Icy-Put177 t1_iqpgyi6 wrote
Reply to comment by Chigaijin in New Laptop for Deep/Machine Learning by MyActualUserName99
Who makes these laptops? Are they a recognized brand? What type of warranty do you get? And where do you take them for tech support if there's an issue?
Thanks in advance.
Icy-Put177 t1_iqpgnpl wrote
Reply to comment by cma_4204 in New Laptop for Deep/Machine Learning by MyActualUserName99
How's the price for the MSI setup? Does it get warm when the GPU is running? I've never used a laptop with a GPU; I just want one for hands-on small experiments.
Chigaijin t1_iqp975j wrote
The Tensorbook is only $3500 unless you're looking at the dual-boot model. I bought one and have had no regrets; it's an incredible machine. Maybe check out Sentdex's review on YouTube; I thought it was pretty good.
incrediblediy t1_iqp8nmt wrote
> Price: $3,800
you can still get multiple RTX 3090 24 GB cards at this price; I've seen even brand-new cards going for US$900
> Is 64 GB of RAM really needed for most moderate deep/machine learning projects or should 32 GB with a reduced batch size work fine?
you will be training on the GPU, so batch size is limited by the 16 GB of VRAM, not system RAM.
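A back-of-the-envelope sketch of why batch size is the lever here (the per-sample number below is purely illustrative, not any real model's footprint): activation memory scales roughly linearly with batch size, so halving the batch roughly halves that part of the VRAM bill.

```python
# Rough sketch of activation memory vs. batch size.
# activation_values_per_sample is a made-up illustrative figure.
def batch_mem_gb(batch_size, activation_values_per_sample, bytes_per_value=4):
    """Approximate GB of float32 activation memory for one batch."""
    return batch_size * activation_values_per_sample * bytes_per_value / 1024**3

per_sample = 50_000_000  # hypothetical ~50M activation values per sample
full = batch_mem_gb(64, per_sample)   # ~11.9 GB: tight on a 16 GB card
half = batch_mem_gb(32, per_sample)   # ~6.0 GB: comfortable
```

This ignores weights, gradients, and optimizer state, which also live in VRAM, so the real ceiling is lower than the raw card capacity.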
yannbouteiller t1_iqozhsf wrote
Alienware. Take the minimum RAM/SSD and replace those yourself (you can do this on the 17″ version; double-check that this is also true for the 15″ version, since I think I remember some issue with the x15, like the RAM being soldered). You get Dell's on-site warranty, and the machine is much cooler, probably in both senses.
The real issue with both machines is that the GPU is soldered to the motherboard, so a GPU failure will likely kill the whole laptop eventually.
Also, good to know before you buy, I got myself an AW x17 R2 for prototyping and gaming, and I realized that the built-in speakers make the chassis vibrate and create a terrible crackling noise if you use them at mid to high volume. This defect seems to be present on the whole series. Also, the webcam is crap, and the battery doesn't last long. Not sure if the Lambda laptop is any better in these regards, though.
A better bet might be the MSI Raider GE76 (if they have a 15-inch equivalent), but it looks a bit flashier / less professional, you don't get on-site repairs, and the power supply is less portable, I think.
BobDope t1_iqowbfc wrote
A Chromebook, and log in to your cloud provider.
obsoletelearner t1_iqorxoe wrote
If you're seriously considering doing deep learning, build yourself a server; with the budget you specified, you can build a decent machine.
Also, if I had to choose one of the two, I'd go with the Alienware.
cma_4204 t1_iqogqbg wrote
I have an MSI laptop with an RTX 3070 and 32 GB of RAM. It is plenty for experimenting and training small models/datasets. Once I have something working and need more power, I rent GPUs by the hour from Lambda Labs or RunPod. For $1-2/hr I'm able to get around 80 GB of GPU memory. Long story short, I would go for the cheaper option and use the cloud when needed.
Ttttrrrroooowwww t1_iqneo00 wrote
Posted here a lot already
Karyo_Ten t1_iqnb0i2 wrote
Neither.
A Mac M1 for deep learning? No Nvidia GPU, no deep learning. And before people pull out pitchforks about PyTorch and TensorFlow supporting the M1: it's a pain, many ecosystem packages only support CUDA, and recompiling everything is a time sink.
The RTX 2060 is a bit of a bummer when the 3060 12 GB is a clean upgrade for not much more; 6 GB is getting small these days. And you didn't mention the RAM? 16 GB minimum, just to run your browser, VS Code, and Discord/Slack or whatever you use to communicate, and then your model.
jk_xz t1_iqn3wh8 wrote
Some RTX 3060 laptops are coming down in price. Regardless, I wouldn't recommend the MacBook Air with 8 GB of RAM. I tried to use it for deep learning last year, and the RAM limit bites hard. One Visual Studio Code instance and several Chrome tabs would make it freeze.
thebear96 t1_iqqwxur wrote
Reply to comment by PleaseKillMeNowOkay in Neural network that models a probability distribution by PleaseKillMeNowOkay
Is the loss decreasing enough after running for the specified number of epochs? Are you getting a flat tail after convergence?
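To make "flat tail" concrete, here's a tiny hypothetical helper that checks whether the loss curve has stopped moving over the last few epochs. The window `k` and tolerance `tol` are arbitrary; scale them to your loss values.

```python
# Hypothetical helper: decide whether a loss curve has flattened out
# over the last k epochs. k and tol are arbitrary illustrative choices.
def has_flat_tail(losses, k=5, tol=1e-4):
    if len(losses) < k:
        return False  # not enough history to judge
    recent = losses[-k:]
    return max(recent) - min(recent) < tol

converged = [1.0, 0.6, 0.4, 0.3, 0.3, 0.3, 0.3, 0.3]   # flat tail
still_dropping = [1.0, 0.8, 0.6, 0.4, 0.2, 0.1]        # still improving
```

If the curve is still dropping at the end of training, the model simply hasn't converged yet and more epochs (or a higher learning rate) may help before reaching for regularization.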