Recent comments in /f/deeplearning
Comfortable-Author t1_ixsdaz5 wrote
Reply to comment by jazzzzzzzzzzzzzzzy in Is Linux still vastly preferred for deep learning over Windows? by moekou
Nvidia drivers for purely compute workloads are really good — just look at all the servers with Nvidia GPUs. The problem is when you need to output an image to a screen and deal with Wayland or X11.
Rephil1 t1_ixsa6jj wrote
Reply to comment by jazzzzzzzzzzzzzzzy in Is Linux still vastly preferred for deep learning over Windows? by moekou
Standard driver + CUDA. Get yourself a Docker container and you're good to go. This is probably the easiest way to do it.
RichardBJ1 t1_ixs8s3c wrote
In my dept it's a mix of Windows/Linux: CUDA-GPU fitted machines, mostly TensorFlow/Python. No significant issues with either. But that is, in CS terms, quite a limited remit.
RichardBJ1 t1_ixs8fsy wrote
Reply to comment by stillworkin in Is Linux still vastly preferred for deep learning over Windows? by moekou
Macs for deep-learning?
VinnyVeritas t1_ixs2kfa wrote
Reply to comment by jazzzzzzzzzzzzzzzy in Is Linux still vastly preferred for deep learning over Windows? by moekou
Maybe it was a pain years ago, I don't really know, but nowadays you just click "install Nvidia drivers" in the software management and it works. There's nothing painful or difficult about it.
stillworkin t1_ixs09pm wrote
I'll extend this: for the majority of real computer science/engineering work (especially back-end stuff), *nix is the way to go. It's a huge reason Macs are the standard for CS folks (ever since the terminal became part of macOS). At every university I attended and taught at, and at every company I've worked for, Linux/Macs have been used by essentially 100% of the people. Roughly once a year I'll hear of or see someone using Windows.
jazzzzzzzzzzzzzzzy t1_ixs06fo wrote
Reply to comment by notgettingfined in Is Linux still vastly preferred for deep learning over Windows? by moekou
I always found this weird. Getting Nvidia drivers to work on Linux is a big pain, yet most ML people use it. I use Windows, and WSL if I have to. I never go into my Linux dual boot.
deepneuralnetwork t1_ixryuig wrote
Longjumping-Wave-123 t1_ixrxasw wrote
Windows + WSL with Ubuntu is great
Cholojuanito t1_ixrsm65 wrote
Most DL hardware accelerator libraries are built and tested to run on servers, which are more than likely Linux machines, so yeah, Linux would be the way to go.
notgettingfined t1_ixrnwuu wrote
I dual boot. Iām sure you could get things to work on Windows but it would be a horrible experience.
schrodingershit t1_ixris61 wrote
pornthrowaway42069l t1_ixq1g4e wrote
Reply to comment by Sadness24_7 in Keras metrics and losses by Sadness24_7
Write a custom function and use it as a metric? Not sure what you mean by "what the training method says", but I think default metrics get averaged just like losses.
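A minimal sketch of the custom-metric idea, assuming TensorFlow/Keras is installed — the plain, unweighted RMSE below and the output names in the commented `compile` call are hypothetical:

```python
import tensorflow as tf

def rmse(y_true, y_pred):
    # Plain RMSE with no weighting: sqrt(mean((y_true - y_pred)^2))
    return tf.sqrt(tf.reduce_mean(tf.square(y_true - y_pred)))

# Attach it per output (by output name) so each output reports its
# own unweighted value instead of Keras's built-in aggregation:
# model.compile(optimizer="adam", loss="mse",
#               metrics={"out_a": [rmse], "out_b": [rmse]})
```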
Sadness24_7 OP t1_ixpuynf wrote
Reply to comment by IshanDandekar in Keras metrics and losses by Sadness24_7
I would rather use my own RMSE; this one does some weighting and it just does not make sense to me.
Sadness24_7 OP t1_ixpuuzn wrote
Reply to comment by pornthrowaway42069l in Keras metrics and losses by Sadness24_7
I don't care about the loss, I care about the RMSE metric. When I calculate it for each output it just does not add up to what the training method says.
The_Poor_Jew OP t1_ixmnfjo wrote
Reply to comment by liaminwales in I'd like to build a deep learning home server - any resources? by The_Poor_Jew
Thank you!
liaminwales t1_ixmhyfb wrote
It may be worth talking to a specialised PC shop, and you may also need to check that your house's electrics will support that kind of setup. You may need an electrician to come in and do some work for you.
https://www.pugetsystems.com/labs/articles/1-7x-nvidia-geforce-rtx-4090-gpu-scaling/
Puget has some info on power use that may be of interest; that link covers scaling from 1 to 7 RTX 4090s. They may also be worth talking to if you want someone to build your setup for you.
The_Poor_Jew OP t1_ixm9jj8 wrote
Reply to comment by incrediblediy in I'd like to build a deep learning home server - any resources? by The_Poor_Jew
Thanks for your input. Yes, it's a home project, but I'd possibly start using it commercially in the future. Otherwise, why do you think 3090s would give me a better ROI?
incrediblediy t1_ixm8sku wrote
I think you might find more info on such a rig if you search for building a "mining rig" — it is quite similar. I have seen people use multiple 1200 W server PSUs connected together with an interface board.
> power spikes can cause each of them to go up to 1000W
:O that's quite a lot. Is this for a commercial application or a home project? Otherwise you might be able to find 4 used 3090s with a better ROI.
pornthrowaway42069l t1_ixm7q3s wrote
Reply to Keras metrics and losses by Sadness24_7
You can specify several losses, or have multi-output with a single loss — in both cases Keras will average them out (I think it's unweighted by default, and you can specify the weights, but I don't remember 100%).
You can't really have 3 independent loss values driving a single network — otherwise it won't know how to backpropagate. The best you can do is write a custom loss function and mix them in a way that makes sense for your problem (you still need to provide a single value at the end), or provide the per-output weights (you'd need to look up the API docs for that).
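A sketch of the per-output weighting route, assuming TensorFlow/Keras is installed — the layer sizes and output names (`out_a`, `out_b`) are made up for illustration:

```python
import tensorflow as tf
from tensorflow import keras

# Toy two-output model: one shared trunk, two regression heads.
inputs = keras.Input(shape=(8,))
h = keras.layers.Dense(16, activation="relu")(inputs)
out_a = keras.layers.Dense(1, name="out_a")(h)
out_b = keras.layers.Dense(1, name="out_b")(h)
model = keras.Model(inputs, [out_a, out_b])

# Keras combines the per-output losses into one scalar for
# backprop: total_loss = 1.0 * mse(out_a) + 0.5 * mse(out_b)
model.compile(optimizer="adam",
              loss={"out_a": "mse", "out_b": "mse"},
              loss_weights={"out_a": 1.0, "out_b": 0.5})
```

The `loss_weights` dict is the documented way to control the mix; a fully custom combination would need a single custom loss instead.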
suflaj t1_ixlwf2t wrote
Reply to comment by The_Poor_Jew in I'd like to build a deep learning home server - any resources? by The_Poor_Jew
You'll probably have to look hard, as a power supply capable of this — plus its installation and modification for computer components — is anywhere from $2–5k alone. Server components cost way more than PC components. Good luck.
The_Poor_Jew OP t1_ixlw06g wrote
Reply to comment by suflaj in I'd like to build a deep learning home server - any resources? by The_Poor_Jew
Ah okay, thanks for the input. Yes, my goal is to build a $10–15k server, so I'm looking to see if someone here knows how to build such machines.
suflaj t1_ixlvsbg wrote
Reply to comment by The_Poor_Jew in I'd like to build a deep learning home server - any resources? by The_Poor_Jew
The solution with 2 power supplies refers to the 2-computer solution. For a single 4 kW power supply you'll need to go beyond consumer products, into industrial power supplies for high-performance servers or supercomputers. At that point you're no longer building a PC, and I don't know how you'd handle it, sorry. IMO it's just better to build 2 machines.
The_Poor_Jew OP t1_ixlvdok wrote
Reply to comment by suflaj in I'd like to build a deep learning home server - any resources? by The_Poor_Jew
I've read that 2 PSUs is not recommended. What do you think?
redditrasberry t1_ixsfztx wrote
Reply to comment by Longjumping-Wave-123 in Is Linux still vastly preferred for deep learning over Windows? by moekou
Is GPU passthrough to WSL/docker working smoothly now?