Recent comments in /f/deeplearning

gahaalt OP t1_izo0w5m wrote

Hi! Thanks for the feedback.

Actually, Progress Table is not tied to Keras or any other deep learning framework. You can use it to track any long-running process that produces data; the source code is not neural network specific :)

To help you start out, I've created a markdown file with a PyTorch integration example. Check it out: integrations.md. Let me know if it's clear!
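The gist of it, as a minimal sketch (the column names and the toy model here are just placeholders, see integrations.md for the real thing):

```python
import torch
from progress_table import ProgressTable

# Sketch of tracking a generic PyTorch training loop.
# Columns and the toy model are placeholders.
table = ProgressTable(columns=["epoch", "loss"])

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for epoch in range(5):
    x = torch.randn(32, 10)
    y = torch.randn(32, 1)
    loss = torch.nn.functional.mse_loss(model(x), y)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Nothing framework-specific here: any value can go into a cell.
    table["epoch"] = epoch
    table["loss"] = loss.item()
    table.next_row()

table.close()
```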

1

incrediblediy t1_izm4ey7 wrote

Looks like GTX 980 4 GB = 165 W and RTX 2080 6 GB = 160 W, which would be 325 W combined. I haven't used Intel K CPUs, so I am not that familiar with their power usage, but I think 850 W would be more than enough, if it is a proper 850 W PSU, even accounting for other components like the motherboard, RAM, SSD, etc.

You can use this to calculate power requirement https://outervision.com/power-supply-calculator

My power usage is AMD Ryzen 5600X (75 W) + RTX 3060 (170 W) + RTX 3090 (350 W) = 595 W at max; with other components the total came to about 750 W (system power budget: https://outervision.com/b/8XoZwf).
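Just to make the arithmetic explicit (throwaway Python, numbers copied from my build above):

```python
# Peak draw of the big consumers, using the numbers above.
components = {"Ryzen 5600X": 75, "RTX 3060": 170, "RTX 3090": 350}
peak_draw = sum(components.values())  # 595 W

system_budget = 750  # adding motherboard, RAM, SSD, fans, etc.
psu_rating = 850

print(f"CPU + GPUs: {peak_draw} W")
print(f"System budget: {system_budget} W, PSU headroom: {psu_rating - system_budget} W")
```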

I have an 850 W Tier A unit, a Deepcool PQ850M, which is a Seasonic-based 80+ Gold. I have stress tested it with OCCT and it was fine.

1

robbsc t1_iziz6ie wrote

I don't have the time to figure it out, but I'm pretty sure you can do it through some combination of permutations and reshapes. Play around with an NxN numpy array (e.g. np.arange(8**2).reshape(8, 8)), perform various transposes and reshapes, and see what comes out. You might have to add and remove an axis at some point too.
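For example, here's one instance of the pattern, splitting an 8x8 array into non-overlapping 2x2 blocks. I don't know your exact target layout, so treat this as a sketch of the reshape/transpose trick rather than the answer:

```python
import numpy as np

a = np.arange(8 ** 2).reshape(8, 8)

# Factor each axis into (block_index, within_block): (8, 8) -> (4, 2, 4, 2)
blocks = a.reshape(4, 2, 4, 2)

# Bring the two block indices to the front: (4, 2, 4, 2) -> (4, 4, 2, 2)
blocks = blocks.transpose(0, 2, 1, 3)

print(blocks[0, 0])  # top-left 2x2 block: [[0, 1], [8, 9]]
print(blocks[0, 1])  # the block to its right: [[2, 3], [10, 11]]
```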

1

suflaj t1_izihg01 wrote

This is only the code license for the open-source portion. TensorRT as a whole is proprietary software, and you also have to agree to its general SDK license: https://docs.nvidia.com/deeplearning/tensorrt/sla/index.html

In there, ownership is phrased so ambiguously that a company's legal team would probably never greenlight using it.

2

incrediblediy t1_izi6n58 wrote

>Now if I connect my 2060 along with the GTX 980, and connect my display to the 980, will PyTorch use the whole VRAM of the 2060?

Yes, I have a similar setup: RTX 3090 - no display (full VRAM for training), RTX 3060 - 2 monitors.

When I play games, I connect one monitor to the RTX 3090 and play on that, with the other monitor on the RTX 3060.
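If you want to be sure which card PyTorch is training on, pin the device explicitly. The indices below are an assumption; the driver decides the ordering, so print the names first:

```python
import torch

# The ordering of cuda:0 / cuda:1 depends on the driver, so check first.
for i in range(torch.cuda.device_count()):
    print(i, torch.cuda.get_device_name(i))

# Pin everything to the headless card (index 0 is an assumption here).
device = torch.device("cuda:0")
model = torch.nn.Linear(512, 10).to(device)
x = torch.randn(64, 512, device=device)
out = model(x)  # runs entirely on the training GPU

# Alternatively, hide the display GPU from PyTorch entirely:
#   CUDA_VISIBLE_DEVICES=0 python train.py
```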

2