Recent comments in /f/deeplearning
Zealousideal-Copy463 OP t1_j5ylls6 wrote
Reply to comment by FuB4R32 in Best cloud to train models with 100-200 GB of data? by Zealousideal-Copy463
Thanks a lot, gonna try it!
Calm_Motor4162 t1_j5yh1bw wrote
Reply to best deep learning reference by Reasonable-Ball9018
Practical Deep Learning: A Python-Based Introduction by Kneusel, Dive Into Data Science, Practical TensorFlow
SometimesZero t1_j5ybxvm wrote
Reply to comment by BellyDancerUrgot in best deep learning reference by Reasonable-Ball9018
Compared to Andrew Ng’s specialization, that text is pretty advanced (though a classic). You might have better luck with Dive Into Deep Learning first.
EnlightenMePlss t1_j5y57ce wrote
Reply to comment by SevereSupermarket244 in What are the best ways to learn about deep learning? by Tureep
The first link seems to have no code, but for the others just click on "Code". Then you can sort the public notebooks by "most votes", "best score", etc. and have a look at them.
Reasonable-Ball9018 OP t1_j5xsp0c wrote
Reply to best deep learning reference by Reasonable-Ball9018
Thank you everyone!
libai123456 t1_j5xmzcn wrote
Reply to comment by randcraw in best deep learning reference by Reasonable-Ball9018
Couldn’t agree more. This is really an amazing book.
Tall_Help_1925 t1_j5xl2i4 wrote
Reply to Classify dataset with only 100 images? by Murii_
It depends on how similar the images are and on the amount of defect data, i.e. images containing cracks. You can rephrase it as an image segmentation problem and use a U-Net (without attention) as the model. Due to the limited receptive fields of the individual "neurons", the dataset will effectively be much larger. If the input data is already aligned, you could also try using the difference in feature space, i.e. calculate the difference between the activations of a pretrained network for non-defective images and those for the current image. I'd suggest using cosine distance.
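For illustration, a minimal sketch of that feature-space idea, assuming a torchvision ResNet-50 as the pretrained backbone and inputs that are already resized and normalized (the `anomaly_score` helper is just a placeholder name):

```python
import torch
import torch.nn.functional as F
import torchvision.models as models

# Pretrained backbone used as a fixed feature extractor.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the classifier, keep the 2048-d features
backbone.eval()

@torch.no_grad()
def anomaly_score(reference_batch, query_image):
    """Cosine distance between a query image and the mean feature of
    known-good (non-defective) reference images; higher = more suspect."""
    ref_mean = backbone(reference_batch).mean(dim=0, keepdim=True)  # (1, 2048)
    query_feat = backbone(query_image.unsqueeze(0))                 # (1, 2048)
    return 1.0 - F.cosine_similarity(query_feat, ref_mean).item()
```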
wise0807 t1_j5x4frf wrote
Reply to best deep learning reference by Reasonable-Ball9018
Data driven science and engineering book by Brunton
randcraw t1_j5x2hh4 wrote
Reply to best deep learning reference by Reasonable-Ball9018
Dive into Deep Learning. Free and up to date. https://d2l.ai/
Reasonable-Ball9018 OP t1_j5wxgm9 wrote
Reply to comment by BellyDancerUrgot in best deep learning reference by Reasonable-Ball9018
Will look into these. Thank you!
BellyDancerUrgot t1_j5wtg9z wrote
Reply to best deep learning reference by Reasonable-Ball9018
Deep Learning book - Courville, Bengio, Goodfellow
Andrew Ng ML and DL specializations on Coursera
suflaj t1_j5wgdsj wrote
Reply to Classify dataset with only 100 images? by Murii_
One thing people haven't mentioned is you could create synthetic images via 3D modelling. If you can get someone to set up realistic 3D models of those microchips, and then randomly generate cracks, you can get a pretty good baseline model you can then finetune on real data.
There are companies that could do that too, but I'm not sure the price would be approachable, or whether outsourcing is a viable solution given trade secrets. Datagen, for example, is a company that can do it.
chatterbox272 t1_j5wfrwp wrote
Reply to Classify dataset with only 100 images? by Murii_
You can try. Augment hard, use a pretrained network and freeze everything except the last layer, and don't let anyone actually try to deploy this thing. 100 images is enough for a small proof-of-concept, but nothing more than that.
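A minimal sketch of that recipe in PyTorch, assuming a torchvision ResNet-18 backbone (the augmentation values are illustrative, not tuned):

```python
import torch.nn as nn
import torchvision.models as models
import torchvision.transforms as T

# Aggressive augmentation to stretch a 100-image dataset.
train_tfms = T.Compose([
    T.RandomResizedCrop(224, scale=(0.6, 1.0)),
    T.RandomHorizontalFlip(),
    T.ColorJitter(brightness=0.4, contrast=0.4, saturation=0.4),
    T.RandomRotation(15),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False                  # freeze the pretrained backbone
model.fc = nn.Linear(model.fc.in_features, 2)    # new trainable head: crack / no crack
```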
manojs t1_j5wbyum wrote
Reply to Classify dataset with only 100 images? by Murii_
With such a small dataset, you should use a pre-existing classification model most similar to your data (search Hugging Face) and retrain just the last layer or last couple of layers ("freeze" all the prior layers), a.k.a. transfer learning. And yes, you can use the data augmentation suggestion, but if you build the entire network from scratch it will be challenging to get good results.
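As a rough sketch of what that looks like with Hugging Face (the checkpoint name is just an example; any image-classification model from the Hub works the same way):

```python
from transformers import AutoImageProcessor, AutoModelForImageClassification

checkpoint = "google/vit-base-patch16-224"                  # example checkpoint
processor = AutoImageProcessor.from_pretrained(checkpoint)  # handles resize/normalize
model = AutoModelForImageClassification.from_pretrained(
    checkpoint,
    num_labels=2,                  # e.g. defective / non-defective
    ignore_mismatched_sizes=True,  # swap out the 1000-class ImageNet head
)

# Freeze all pretrained layers; only the fresh classification head will train.
for param in model.base_model.parameters():
    param.requires_grad = False
```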
SevereSupermarket244 t1_j5w9w7r wrote
Reply to comment by EnlightenMePlss in What are the best ways to learn about deep learning? by Tureep
How can I see the code? Of the first link for example?
SevereSupermarket244 t1_j5w9k7u wrote
For an introduction, watch Andrew Ng's deep learning course on YT and then do what @EnlightenMePlss suggests
SevereSupermarket244 t1_j5w8vn2 wrote
Reply to comment by agentfuzzy999 in I just started out guys, wish me luck by 47153
😂😂😂😂
NinjaUnlikely6343 OP t1_j5vq6vj wrote
Reply to comment by emad_eldeen in Efficient way to tune a network by changing hyperparameters? by NinjaUnlikely6343
I've heard of it when I started delving into deep learning, but it seemed too complex for me at the time. I'll check it out!
NinjaUnlikely6343 OP t1_j5vps4g wrote
Reply to comment by ChingBlue in Efficient way to tune a network by changing hyperparameters? by NinjaUnlikely6343
Sweet! Thanks a lot!
NinjaUnlikely6343 OP t1_j5vkttj wrote
Reply to comment by thatpretzelife in Efficient way to tune a network by changing hyperparameters? by NinjaUnlikely6343
At neural networks specifically haha!
thatpretzelife t1_j5vf0oo wrote
Reply to comment by NinjaUnlikely6343 in Efficient way to tune a network by changing hyperparameters? by NinjaUnlikely6343
Haha and you claim to be a noob 😅
Internal-Diet-514 t1_j5vbxjl wrote
Reply to Classify dataset with only 100 images? by Murii_
I would try without data augmentation first. You need a baseline to understand what helps and what doesn't to increase performance. If there is a strong signal that can differentiate between the classes, 100 images may be enough; the amount of data you need is problem-dependent, not one-size-fits-all. As others have said, make sure you're splitting into train and test sets to evaluate performance, and that each has a distribution of classes similar to the overall population (this matters if you have an imbalanced dataset). Keep the network lightweight if you're not using transfer learning and build it up from there. At a certain point it will overfit, but that will most likely happen faster the larger your network is.
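For the splitting part, a stratified split is one call with scikit-learn; the paths and labels below are toy stand-ins for the real dataset:

```python
from sklearn.model_selection import train_test_split

# Toy stand-ins: 100 image paths, ~20% defective.
image_paths = [f"img_{i:03d}.png" for i in range(100)]
labels = [1 if i < 20 else 0 for i in range(100)]  # 1 = crack, 0 = no crack

# stratify keeps the train/test class balance close to the overall 20/80 split,
# which matters when you have this few defect examples.
train_paths, test_paths, y_train, y_test = train_test_split(
    image_paths, labels, test_size=0.2, stratify=labels, random_state=42
)
```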
FuB4R32 t1_j5v9mlr wrote
Reply to comment by Zealousideal-Copy463 in Best cloud to train models with 100-200 GB of data? by Zealousideal-Copy463
Yeah, as long as your VM is in the same region as the bucket it should be fine. Even if you have 200 GB, it doesn't take that long to move between regions either.
emad_eldeen t1_j5uw5sp wrote
Wandb is the best! https://wandb.ai/
Check the hyperparameter sweep option. It is FANTASTIC!
You can set a range or a list of values for each hyperparameter and let it run.
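A minimal example of what a sweep looks like (the hyperparameter names, ranges, and project name are illustrative; the real training loop goes where the placeholder is):

```python
import wandb

# Random search over two hyperparameters, minimizing validation loss.
sweep_config = {
    "method": "random",
    "metric": {"name": "val_loss", "goal": "minimize"},
    "parameters": {
        "learning_rate": {"min": 1e-5, "max": 1e-2},
        "batch_size": {"values": [16, 32, 64]},
    },
}

def train():
    run = wandb.init()            # picks up one hyperparameter combo from the sweep
    lr = run.config.learning_rate
    bs = run.config.batch_size
    # ... build and train the model with lr and bs here ...
    wandb.log({"val_loss": 0.0})  # placeholder: log the real validation loss
    run.finish()

sweep_id = wandb.sweep(sweep_config, project="my-project")
wandb.agent(sweep_id, function=train, count=20)  # run 20 trials
```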
BrendanKML t1_j5yqith wrote
Reply to Why add bias instead of subtracting bias? by ArthurLCTTheCool
Aren’t they the same thing?
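They effectively are, since the bias is a learned parameter whose sign the optimizer is free to flip. A one-line sketch:

```latex
Wx + b \;=\; Wx - c \quad \text{with } c := -b
```

So a network that adds a bias and one that subtracts a bias parameterize exactly the same family of functions; the learned values just come out with opposite signs.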