Recent comments in /f/deeplearning
suflaj t1_j3bt2eq wrote
That learning rate is about 100 times higher than you'd normally give Adam for that batch size, and that weight decay is also about 100 times too high. If you want to use weight decay with Adam, you should probably use the AdamW optimizer (which is more or less the same thing, it just fixes the interaction between Adam and weight decay).
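As a minimal PyTorch sketch of that suggestion (the model is a stand-in, and the exact lr/weight_decay values are just common starting points to tune, not something from the original post):

```python
import torch

# stand-in for the actual classifier being discussed
model = torch.nn.Linear(128, 10)

# AdamW decouples weight decay from the adaptive gradient update;
# lr=3e-4 and weight_decay=1e-2 are typical starting points,
# roughly 100x smaller than the values being criticized above
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4, weight_decay=1e-2)
```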
Also, loss by itself doesn't tell you how much a model has learned. You should check validation F1, or whatever metrics are relevant to the performance of your model.
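Checking such a metric is a one-liner, e.g. with scikit-learn (a sketch; the label arrays here are made up):

```python
from sklearn.metrics import f1_score

# hypothetical validation labels and model predictions
y_true = [1, 0, 1, 1, 0]
y_pred = [1, 0, 0, 1, 1]

# F1 balances precision and recall, unlike raw loss
print(f1_score(y_true, y_pred))  # ≈ 0.667
```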
Paid-Not-Payed-Bot t1_j3aw07q wrote
Reply to comment by mmeeh in Tracking keras experiments by liberollo
> I'm not paid to say
FTFY.
Although payed exists (the reason why autocorrection didn't help you), it is only correct in:
- Nautical context, when it means to paint a surface, or to cover with something like tar or resin in order to make it waterproof or corrosion-resistant. The deck is yet to be payed.
- Payed out when letting strings, cables or ropes out, by slacking them. The rope is payed out! You can pull now.
Unfortunately, I was unable to find nautical or rope-related words in your comment.
Beep, boop, I'm a bot
mmeeh t1_j3avzl1 wrote
Reply to Tracking keras experiments by liberollo
Check out the website neptune.ai. I'm not paid to say it, but they are pretty awesome.
Clicketrie t1_j37zi1a wrote
Reply to Tracking keras experiments by liberollo
Comet has an integration with Keras (disclosure: I work for Comet), and it does exactly what you're looking for and it's super easy.
The Comet integration with Keras automatically logs the following items:
- Model and graph description.
- Steps and epochs.
- Metrics (such as loss and accuracy).
- Hyperparameters.
- Optimizer parameters (such as the learning rate, beta decay rate, and more).
- Number of trainable parameters.
- Histograms for weights and biases.
Just_CurioussSss t1_j37t9e5 wrote
Reply to Tracking keras experiments by liberollo
Have you tried using tools such as TensorBoard, which is a visualization tool for TensorFlow that can be used to track the performance of your Keras models?
To use TensorBoard with Keras, you will need to install TensorFlow and modify your Keras code to write log files that TensorBoard can read. This can be done by using the TensorBoard callback provided by Keras, which writes log files to a specified directory that TensorBoard can use to visualize the results of your training runs.
Here is an example of how you might use the TensorBoard callback in your Keras code:
from tensorflow.keras.callbacks import TensorBoard
# Create a TensorBoard callback
tensorboard_callback = TensorBoard(log_dir='/path/to/logs')
# Compile and fit your Keras model
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(x_train, y_train, epochs=10, callbacks=[tensorboard_callback])
After you have trained your model, you can start TensorBoard by running the tensorboard command in a terminal, specifying the directory where the log files are stored. TensorBoard will then start a web server that you can access in a web browser to visualize the results of your training runs.
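Concretely, that launch step looks something like this (assuming TensorBoard is on your PATH and the log directory matches the one passed to the callback above):

```shell
# point TensorBoard at the callback's log directory, then open
# http://localhost:6006 in a browser to view the dashboards
tensorboard --logdir /path/to/logs --port 6006
```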
liberollo OP t1_j372as5 wrote
Reply to comment by ekbravo in Tracking keras experiments by liberollo
Thank you
ekbravo t1_j370wqd wrote
Reply to Tracking keras experiments by liberollo
Check out the Weights & Biases website, WandB.com. That's exactly what it does.
deepl3arning t1_j32tbjs wrote
Reply to Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
I started native C++/CUDA. I don't recommend it. Debugging was a nightmare, and the environment was immature at best. There were undocumented exceptions, errors and inconsistencies throughout, with unknown dependencies that would break your code catastrophically on a simple driver update.
I pushed into Theano, then TF/Keras and now spend most of my time with PyTorch.
As Nater5000 mentioned, now you can just google any weird error. Someone will likely have encountered it before, and probably has a workaround. For me, I also really appreciate the confidence I can have in my results: if something goes wrong, I can safely assume that I have messed up somewhere, and I can trust the outcomes, either good or bad.
If you need something peculiar, then you're probably best off extending or overriding the existing frameworks. As a starting point, the frameworks will make your life a lot easier, and there are many examples/videos/articles available for PT/TF. Best of luck with your journey.
Appropriate_Ant_4629 t1_j32exrg wrote
Reply to comment by WinterExtreme9316 in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
Sounds almost like you're talking about the first TensorFlow.
Or Torch-before-PyTorch.
enterthesun t1_j32div1 wrote
Reply to comment by BellyDancerUrgot in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
Wow, no one working on TF? That's crazy, I didn't know that!
enterthesun t1_j32dgvr wrote
Reply to comment by kraegarthegreat in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
Thank you.
todeedee t1_j31v7pu wrote
Reply to comment by suflaj in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
JAX looks appealing, but totally agree -- I'm not ready to go back to using Bazel to build code from source
suflaj t1_j319k9o wrote
Reply to comment by BellyDancerUrgot in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
It's basically just a higher abstraction layer for PyTorch. It's completely separate but works in tandem with PyTorch.
I use LightningModules (analogous to torch.nn.Module) basically as decorators over ordinary PyTorch models. So you have your model class, and then you create a LightningModule which is instantiated with said model, where you implement, e.g., which optimizers and schedulers you use, how your training, evaluation and testing go, which metrics you track and when, etc.
But once you're done with R&D you can just use ordinary PyTorch as-is, that's why I like it. It doesn't make setting stuff up for production different in any way, but it makes plenty of stuff during R&D effortless. It has some smelly parts but IMO they're not a dealbreaker, just take a day or two to learn it.
kraegarthegreat t1_j316z7p wrote
Reply to comment by enterthesun in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
I found PL helped reduce boilerplate code while still giving the niceties of torch versus tf.
The main thing I like is that it abstracts the training loops while still giving you the ability to add custom code to any part of the training loop. This likely sounds weird, but check out their page. 12/10 recommend.
Final-Rush759 t1_j312pgq wrote
Reply to Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
JAX is the new toy. It's not a custom framework, though.
BellyDancerUrgot t1_j30uc0r wrote
Reply to comment by enterthesun in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
Keras is a good tool when getting started, imo. But eventually it's better to switch to PyTorch because it's more pythonic. And although TensorFlow used to be the deployment standard, PyTorch has caught up; add to that, TF doesn't even have a team working on it anymore, iirc, since everyone moved on to JAX. I will say, though, PyTorch can be very frustrating to work on initially, because a lot of the internal optimizations that Keras does are absent in PyTorch. I've never used PL, though.
enterthesun t1_j30tnas wrote
Reply to comment by WinterExtreme9316 in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
Yikes that’s almost traumatizing just to hear.
enterthesun t1_j30tlfx wrote
Reply to comment by BellyDancerUrgot in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
I hope PL is better than the Keras situation because I’m very excited about PL and a big reason I started learning PyTorch was to avoid the crutch that is Keras.
WinterExtreme9316 t1_j30dfgv wrote
Reply to Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
You must have never spent a ridiculous amount of time learning and building on some API, just for some committee or company board to decide it's time to "end-of-life" it (i.e. can the whole thing). It changes you, and your future API choices.
BellyDancerUrgot t1_j3013um wrote
Reply to comment by suflaj in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
Quick question: is PyTorch Lightning a better PyTorch, or is it more "streamlined" (meaning high-level), like say Keras for TF?
StabbyPants t1_j2z0777 wrote
Reply to comment by Nater5000 in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
odds are, the mainline supported frameworks have so much work put into them, which you'd have to duplicate, that it's almost never a good idea to go solo
suflaj t1_j2yhjdw wrote
Reply to Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
My man, I only recently convinced myself to start using PyTorch Lightning, no way I'd be able to switch to some other new hip marginally better framework, when it was this hard to start using something that speeds stuff up 10x.
Unless there are clear benefits to switching to some other new technology, it's not worth it.
Nater5000 t1_j2y3k0e wrote
Reply to Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
>Do you also use frameworks aside from the popular ones?
No, and this isn't just limited to deep learning. Anybody with even a small amount of engineering experience will have learned the pitfalls of trying to work with immature frameworks, libraries, concepts, etc. When you're building anything non-trivial, you're not just choosing a framework, you're investing in it. If the framework ends up being trash, or it becomes obsolete/abandoned, or you have a hard time finding resources for it, then you're risking the success of your entire project. As such, most engineers prefer the safety of maturity.
To illustrate: if I get some weird, cryptic error from PyTorch, I can usually just copy it, paste it into Google, and immediately find a result describing it and how to resolve it. Try doing that for some new, immature framework and you'll quickly become disappointed as you realize that not only are you on your own in terms of understanding and troubleshooting the issue, but you may even be the one responsible for fixing it. That's a tough sell for anyone who values their time.
>I have a feeling people don't want to learn new things because they already worked hard to learn something else.
This may be a factor, but it's relatively minor compared to what I described above. I used to be all-in on TensorFlow. Once it became clear, at least in the domain I was focused on, that PyTorch was "winning," I switched over. There was some time I had to dedicate to figuring it out, but the switch was easy since I knew PyTorch was also mature. In fact, the reason I had originally invested in TensorFlow over PyTorch was because, at the time, PyTorch wasn't very mature and it seemed risky to invest anytime into it.
>But I also think choosing a framework depends on people’s needs for their projects. If that project could benefit from that new framework or just use it out of curiosity, people would definitely try it out.
In a vacuum this makes sense. But, like I said, there's a cost to adopting new frameworks/libraries. If there's a new framework that does something that'd take me weeks to develop on my own, I may try it out. But if it's a critical component and that new framework isn't very mature, I won't even touch it (although I may look through it for inspiration). I don't want my project to be dependent on something that may break/become obsolete/be full of bugs/etc., even at the cost of my time and effort.
Odds are if you're making a new framework, you'd be better off contributing to an existing framework.
_DeepFreeze_ t1_j2va7d4 wrote
Reply to comment by hayAbhay in Any inspiring and engaging deep learning courses for beginners? by westeast1000
Complete beginner here. Thank you for the recommendation.😁
AKavun OP t1_j3btlem wrote
Reply to comment by suflaj in Why didn't my convolutional image classifier network learn anything! by AKavun
I also have a validation accuracy of around 50%, which is basically what you'd expect from random guessing.
I removed the weight decay to keep things simpler and adjusted the learning rate to 0.0003. I will update this thread on the results.
Thank you for taking the time to help