Recent comments in /f/deeplearning
ShadowStormDrift t1_iuivphh wrote
Reply to comment by GPUaccelerated in Do companies actually care about their model's training/inference speed? by GPUaccelerated
GPUaccelerated OP t1_iuitwy2 wrote
Reply to comment by suflaj in Do companies actually care about their model's training/inference speed? by GPUaccelerated
Not exactly sure, I'm not a lawyer. But it's something that gets taken very seriously by a lot of my medical-field clients. It's definitely something on their side, not mine. I just help those specific clients go on-prem.
Rare_Lingonberry289 OP t1_iuisx3d wrote
Reply to comment by suflaj in Does the length/size of a dimension affect accuracy? (CNN) by Rare_Lingonberry289
Do you think it would be beneficial to have 365 different time points (1 each day) in my fourth dimension? I know it’s hard to say, but what do you think?
suflaj t1_iuioq7c wrote
Reply to comment by GPUaccelerated in Do companies actually care about their model's training/inference speed? by GPUaccelerated
Which laws?
GPUaccelerated OP t1_iuimm3t wrote
Reply to comment by suflaj in Do companies actually care about their model's training/inference speed? by GPUaccelerated
Right, but in the medical field, for example, it's not a trust issue. It's a matter of laws that prevent patient data from leaving the physician's premises.
suflaj t1_iuim5q2 wrote
It could, but it doesn't have to. For temporal dimensions, 4 time points are very common, so you probably want to start with that first, then see how it compares to 3 or 2.
Intuitively, I think 2 time points are useless. It's difficult to generalize something new from such a short relation. I would like to sample t, t-1, t-2, and t-4, but I'd first confirm it's better than t, t-1, t-2, and t-3.
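A minimal sketch of the sampling scheme described above, assuming daily image-like data (the array shape and offsets here are illustrative, not from the thread):

```python
import numpy as np

# Hypothetical dataset: one frame per day, shape (days, height, width, channels)
data = np.random.rand(365, 64, 64, 3)

def sample_time_points(data, t, offsets=(0, 1, 2, 4)):
    """Gather frames at t, t-1, t-2, t-4 and stack them on a new time axis."""
    frames = [data[t - o] for o in offsets]
    return np.stack(frames, axis=0)  # shape: (len(offsets), H, W, C)

x = sample_time_points(data, t=100)
print(x.shape)  # (4, 64, 64, 3)
```

Swapping `offsets=(0, 1, 2, 4)` for `(0, 1, 2, 3)` gives the contiguous variant to compare against, as suggested.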
GPUaccelerated OP t1_iuim3cp wrote
Reply to comment by ShadowStormDrift in Do companies actually care about their model's training/inference speed? by GPUaccelerated
Very cool! But I think you mean 24GB of VRAM for the 3090?
Issues loading the web page, btw.
GPUaccelerated OP t1_iuilu92 wrote
Reply to comment by mayiSLYTHERINyourbed in Do companies actually care about their model's training/inference speed? by GPUaccelerated
okay cool! Thanks for explaining
GPUaccelerated OP t1_iuilp7x wrote
Reply to comment by sckuzzle in Do companies actually care about their model's training/inference speed? by GPUaccelerated
Got it! Thank you.
GPUaccelerated OP t1_iuill54 wrote
Reply to comment by Rephil1 in Do companies actually care about their model's training/inference speed? by GPUaccelerated
That's pretty intense.
ElCharles2997 t1_iuikq9j wrote
Reply to Is Colab still the place to go? by CodingButStillAlive
Sure. There are a lot of Stable Diffusion demos out there, for example.
[deleted] t1_iuhvi09 wrote
Reply to Is Colab still the place to go? by CodingButStillAlive
[removed]
[deleted] t1_iuhu9aj wrote
[removed]
jackhhchan OP t1_iugnxpa wrote
Reply to comment by chatterbox272 in 1080ti to RTX 3090ti by jackhhchan
Yes, the VRAM jump is one of the biggest reasons I'm considering the upgrade!
4 x 1080ti - that is certainly tempting.
Thanks!
jackhhchan OP t1_iugnskg wrote
Reply to comment by suflaj in 1080ti to RTX 3090ti by jackhhchan
Right! Thank you. Strangely enough, they're the same price since the 3090ti is on sale. Do you think a liquid-cooled (AIO) one would relieve some of the heat and noise issues?
jackhhchan OP t1_iugnhds wrote
Reply to comment by BalanceStandard4941 in 1080ti to RTX 3090ti by jackhhchan
thank you so much for the reply!
I think I'll get the 3090ti still just because it's on sale where I am at the moment. Gonna take your advice and go for it :)
Nixx233 OP t1_iug0nek wrote
Reply to comment by someone383726 in Hi guys, is rtx 3080ti enough for a deep learning beginner? by Nixx233
Thx!
Nixx233 OP t1_iug0kx3 wrote
Reply to comment by werres123 in Hi guys, is rtx 3080ti enough for a deep learning beginner? by Nixx233
All right🤣,thx!
Nixx233 OP t1_iug0ivw wrote
Reply to comment by Prestigious_Boat_386 in Hi guys, is rtx 3080ti enough for a deep learning beginner? by Nixx233
Thx!
Nixx233 OP t1_iug0hsv wrote
Reply to comment by eisaletterandanumber in Hi guys, is rtx 3080ti enough for a deep learning beginner? by Nixx233
Thx🤣
Nixx233 OP t1_iug0glq wrote
Reply to comment by llv77 in Hi guys, is rtx 3080ti enough for a deep learning beginner? by Nixx233
Thx a lot😃
someone383726 t1_iufx3rb wrote
I’ve been using a 1070ti and it works for training various models. I’d love something newer to speed up some of the trainings, but a 3080ti would be solid.
werres123 t1_iueyna6 wrote
Your friends are idiots to say that. You can easily use a 3080ti for training as a beginner.
Prestigious_Boat_386 t1_iuet51s wrote
My 10-year-old shitty mid-tier laptop graphics card is enough to learn deep learning. Even a CPU works when you're learning small CNNs or classical machine learning.
GPUaccelerated OP t1_iuixkzj wrote
Reply to comment by ShadowStormDrift in Do companies actually care about their model's training/inference speed? by GPUaccelerated
So cool! Good for you!