Submitted by laprika0 t3_yj5xkp in MachineLearning
ThatInternetGuy t1_iummggq wrote
It's not that worth it.
People with RTX cards still often rent cloud GPU instances for training, because training (and sometimes inference) frequently needs more than 24GB of VRAM. Also, sometimes we just need to shorten the training time with 8x A100s, so... yeah, renting seems to be the only way to go.
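A rough back-of-the-envelope shows why 24GB runs out fast. Assuming the common mixed-precision Adam accounting (fp16 weights + fp32 master weights + fp16 grads + two fp32 Adam moments, roughly 16 bytes per parameter, before activations) — this rule of thumb is an assumption, not something from the comment:

```python
def training_memory_gb(n_params, bytes_per_param=16):
    # Mixed-precision Adam rule of thumb:
    #   fp16 weights (2) + fp32 master weights (4)
    #   + fp16 grads (2) + fp32 Adam moments (4 + 4)
    #   = 16 bytes per parameter, activations not included.
    return n_params * bytes_per_param / 1e9

for n in (1.5e9, 7e9):
    print(f"{n/1e9:.1f}B params -> ~{training_memory_gb(n):.0f} GB of weight/optimizer state")
```

By this estimate, a 1.5B-parameter model already fills a 24GB card with optimizer state alone, and anything near 7B needs on the order of 100GB+, i.e. multiple A100s or aggressive sharding.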