Recent comments in /f/deeplearning

FastestLearner t1_j630p88 wrote

The thing with me is that I started with TensorFlow v1 back when PyTorch wasn’t even in the race, and because of the constant breaking changes to the TensorFlow API and the cryptic error messages, my experience was hellish TBH. Even getting support from Stack Overflow was messed up, because people would post solutions for different API versions. Then PyTorch got released, and boy, was it the savior I needed. It literally saved me hundreds of hours of debugging (and possibly from a brain hemorrhage too). Compared to the burning hell TF1 was, PT was like coding on a serene beach. And then TensorFlow v2 came out with eager execution, which promised a PyTorch-style way of doing things. But then the question is: why switch if it’s the same as PyTorch? And so I didn’t.
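For anyone who hasn’t seen the convergence, here is a minimal sketch (the values are arbitrary, just for illustration) of how TF2’s eager execution reads almost exactly like PyTorch’s define-by-run style:

```python
import tensorflow as tf
import torch

# TF2: eager by default, ops execute immediately (no Session/graph needed)
x_tf = tf.constant([1.0, 2.0])
w_tf = tf.Variable([0.5, 0.5])
print(x_tf * w_tf)  # result is available right away

# PyTorch: the same dynamic, define-by-run style
x_pt = torch.tensor([1.0, 2.0])
w_pt = torch.tensor([0.5, 0.5], requires_grad=True)
print(x_pt * w_pt)  # also runs immediately
```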

I’m coming from a research point of view. If I were coming from a production POV, things could’ve been different.

1

suflaj t1_j5zlq6k wrote

Aside from what others have mentioned, let's assume that we don't have a symmetrical situation, i.e. that the range of the function we're learning, as well as the domain of the weights and biases, is [0, ∞). Then it makes more sense to add the bias than to subtract it, as addition leads to smaller weights and less chance of overflow or exploding gradients.

It makes more sense to subtract the bias if, in the scenario described above, you want a more expressive layer at the cost of numerical stability. This is because a subtractive bias allows the weights to be of greater magnitude, which in turn gives you more effective range for the weights.
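A tiny worked example of that trade-off (all the numbers here are made up purely for illustration): to hit the same target output with the same bias magnitude, the subtractive form needs a strictly larger weight:

```python
# Hypothetical numbers: target y = 10 at x = 1, bias magnitude b = 4,
# with weights and biases constrained to [0, inf)
x, y, b = 1.0, 10.0, 4.0

w_additive = (y - b) / x     # from y = w*x + b  ->  w = 6
w_subtractive = (y + b) / x  # from y = w*x - b  ->  w = 14

# The subtractive bias buys a wider effective weight range, but those
# larger weights are exactly what raises the risk of overflow and
# exploding gradients
print(w_additive, w_subtractive)  # 6.0 14.0
```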

But note that neural networks are generally not trained with integer weights, and some libraries don't even have autograd for integers.
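PyTorch is one such library: asking for gradients on an integer tensor raises an error, as this snippet shows:

```python
import torch

# Float tensors support autograd as usual
w = torch.ones(3, requires_grad=True)

# Integer tensors do not; this raises:
# RuntimeError: Only Tensors of floating point and complex dtype
# can require gradients
try:
    torch.ones(3, dtype=torch.long, requires_grad=True)
except RuntimeError as e:
    print(e)
```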

3

OutrageousSundae8270 t1_j5zf7ko wrote

PyTorch is great; it's honestly much easier to use than TensorFlow, especially for beginners. TensorFlow, however, offers everything PyTorch does through heavy use of object-oriented design (primarily inheritance).

The functional API in TensorFlow is very similar to the default way of instantiating models in PyTorch. TensorFlow offers many convenience wrappers, but it also gives you the full freedom that PyTorch does, provided you can deal with the nuances and complexities of object-oriented design and refer heavily to the documentation.
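To make the comparison concrete, here is a minimal sketch (layer sizes are arbitrary, chosen only for illustration) of TensorFlow's functional API, TensorFlow's subclassing style, and PyTorch's default nn.Module style:

```python
import tensorflow as tf
import torch
from torch import nn

# TensorFlow's functional API wires layers together as a graph
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(10)(inputs)
functional_model = tf.keras.Model(inputs, outputs)

# TensorFlow's subclassing style leans on inheritance from tf.keras.Model
class TFNet(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(10)

    def call(self, x):
        return self.dense(x)

# PyTorch's default style: subclass nn.Module and define forward()
class TorchNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 10)

    def forward(self, x):
        return self.linear(x)
```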

1