Recent comments in /f/deeplearning

Prestigious_Boat_386 t1_ivuhxf7 wrote

Think they want the combination of that and splitting the audio up when different people talk, then assigning what's said to the person saying it.

Which isn't THAT hard when you can already recognize who's who; sometimes you could even just use the main pitch & formants plus the silent segments. It's just quite a niche application.
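
Rough sketch of the pitch idea, if anyone's curious (untested; assumes librosa and scikit-learn are installed, and speakers.wav is just a placeholder file):

```python
# Crude two-speaker segmentation: split on silence, then cluster segments
# by median fundamental frequency. A heuristic, not real diarization;
# it will fail for speakers with similar pitch.
import librosa
import numpy as np
from sklearn.cluster import KMeans

y, sr = librosa.load("speakers.wav", sr=None, mono=True)

# Silent gaps give candidate segment boundaries.
segments = librosa.effects.split(y, top_db=30)

# Median F0 per segment as a one-number speaker feature.
pitches = []
for start, end in segments:
    f0, _, _ = librosa.pyin(y[start:end], fmin=60, fmax=400, sr=sr)
    pitches.append(np.nanmedian(f0))

# Cluster segments into two speakers by pitch (NaN = fully unvoiced segment).
features = np.nan_to_num(np.array(pitches)).reshape(-1, 1)
labels = KMeans(n_clusters=2, n_init=10).fit_predict(features)
for (start, end), who in zip(segments, labels):
    print(f"{start / sr:6.2f}s to {end / sr:6.2f}s  speaker {who}")
```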

1

suflaj t1_ivuam4k wrote

You mean automatic speech recognition? Yeah, there are models for that. Google probably has the best proprietary one, but from what I understand it's still a work in progress, despite e.g. Whisper releasing recently.
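
Whisper at least is trivial to try; minimal sketch (assumes `pip install openai-whisper`, and audio.mp3 is a placeholder):

```python
# Plain transcription with the open-source Whisper. Note it only gives
# you *what* was said, not *who* said it, which is the gap above.
import whisper

model = whisper.load_model("base")      # tiny / base / small / medium / large
result = model.transcribe("audio.mp3")
print(result["text"])
```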

3

VU22 t1_ivsgucb wrote

I have literally never seen any company use MATLAB for deep learning. Some companies use it for prototyping image processing algorithms and the like, but PyTorch, TensorFlow, and Keras are by far the gold standard in this area. Additionally, I'd recommend checking out TensorRT and CUDA if you have C++ experience.

1

justanator101 t1_ivrko53 wrote

I used both MATLAB and Python (TensorFlow) during my master's research involving image processing. MATLAB was great for signal analysis, preprocessing tasks, and even, in some cases, whipping up simple baseline ML models. I'd export that data and use TensorFlow for any deep learning tasks. Like others have said, Python is definitely far more used in industry, so it's much better to know TensorFlow/PyTorch. The average job will require Python and not MATLAB; the oddly specific job will probably require both.
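
That hand-off is pretty painless in practice. A sketch of the MATLAB-to-Keras step (assumes scipy; features.mat and its variable names X/y are hypothetical):

```python
# Load features preprocessed and saved in MATLAB, then train in Keras.
# (MATLAB -v7.3 .mat files are HDF5 and would need h5py instead.)
import numpy as np
from scipy.io import loadmat
import tensorflow as tf

data = loadmat("features.mat")
X = np.asarray(data["X"], dtype=np.float32)   # e.g. (n_samples, n_features)
y = np.asarray(data["y"], dtype=np.float32).ravel()

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(X.shape[1],)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=10, batch_size=32)
```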

1

TheRealCpnObvious t1_ivr7c26 wrote

MATLAB's deep learning capabilities have been overshadowed by TensorFlow and PyTorch. I'm a MATLAB fanatic, and though it pains me to say it, MATLAB has historically been too restrictive for DL practitioners looking to really innovate in the field. That said, MATLAB is very good for rapid prototyping across a variety of applications, DL included; the Deep Learning tools and apps really do make it a lot easier to prototype with different experimental setups. But it still lags the popular frameworks in terms of SOTA implementations. MATLAB is also a good way to get started with DL if you already know it and aren't much of a programmer otherwise, so it has a lower relative barrier to entry and a gentler initial learning curve. So if you're an engineer looking to apply some deep learning models from the last few years, MATLAB can be enough (there's even some cross-framework interoperability in limited cases), but if you're trying to solve fairly new problems then you might struggle.
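
On the interoperability point, ONNX is the usual bridge. A sketch of the PyTorch side (toy model, placeholder names), after which MATLAB can pick the file up with importONNXNetwork from the Deep Learning Toolbox:

```python
# Export a toy PyTorch model to ONNX so another framework can load it.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
model.eval()

dummy = torch.randn(1, 10)   # an example input fixes the graph's shapes
torch.onnx.export(
    model, dummy, "model.onnx",
    input_names=["features"], output_names=["logits"],
    dynamic_axes={"features": {0: "batch"}, "logits": {0: "batch"}},
)
```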

These are my two cents, having done the bulk of my DL research in MATLAB for my PhD, which I completed last year. I started off intending to learn Python and TensorFlow for my DL research but ultimately chose MATLAB for short-sighted convenience reasons, which cost me the opportunity to learn the better tools/frameworks over the medium term.

4

OutrageousSundae8270 t1_ivr463t wrote

From my perspective, R is great for certain tasks (more related to classical machine learning and statistical analysis as you have mentioned). There are several libraries that have demonstrably better implementations in R than Python (lme4 comes to mind).

I will have to circle back to the same point regarding deep learning and Python. Whilst modern R does offer deep learning capabilities, its uptake for that purpose pales in comparison to Python's, so the same comment I made about MATLAB still applies.

2

OutrageousSundae8270 t1_ivqynjm wrote

TensorFlow/Keras and PyTorch are the gold standard deep learning frameworks in industry, and both are Python frameworks. It's not just about Python being open source; it's more about it being the tool of the trade. Nobody is going to employ you on a team that uses Python frameworks for deep learning unless you are skilled with those frameworks.

MATLAB isn't really popular for deep learning, even though it does offer some deep learning capabilities (limited relative to Python's).

16

fjodpod t1_ivk6pbg wrote

Reply to comment by xyrlor in Are AMD GPUs an option? by xyrlor

Personally I would either wait for the 4000-series mid-tier cards or just buy a 3000-series card with enough VRAM. However, keep in mind that the 4000 series could theoretically be worse than, or merely on par with, the 3000 series for machine learning in some cases due to lower memory throughput: (https://www.reddit.com/r/MachineLearning/comments/xjt129/comment/ipb6p8y/?utm_source=share&utm_medium=web2x&context=3)
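
Whatever you buy, it's worth sanity-checking what PyTorch actually sees; a quick sketch:

```python
# List visible CUDA devices and their total VRAM as reported by PyTorch.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"{props.name}: {props.total_memory / 1024**3:.1f} GiB VRAM")
else:
    print("No CUDA device visible")
```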

1

Emotional-Fox-4285 OP t1_ivjbz4s wrote

1

HowdThatGoIn t1_ivi1zzn wrote

I can't say for certain without the code, but it looks like the loss is being applied to every hidden unit (as a scalar) rather than being distributed based on each unit's contribution to the loss (as a vector). Check the shape of your loss as it moves back through each layer?

Edit: also, are you applying the total loss or the mean loss? It should be the latter.
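
Toy NumPy sketch of the shape check I mean (all names made up; a single linear layer with mean squared error):

```python
# The gradient flowing into a layer should keep one value per unit per
# sample; if it collapses to a scalar, every unit gets the same signal.
import numpy as np

batch, hidden, out = 32, 16, 4
h = np.random.randn(batch, hidden)            # hidden activations
W = np.random.randn(hidden, out) * 0.01
y_pred = h @ W
y_true = np.random.randn(batch, out)

# Mean loss (not the total): average over every element.
loss = ((y_pred - y_true) ** 2).mean()

# Backprop keeps the per-sample, per-unit structure.
grad_pred = 2 * (y_pred - y_true) / y_pred.size   # (batch, out)
grad_h = grad_pred @ W.T                          # (batch, hidden)
grad_W = h.T @ grad_pred                          # (hidden, out)

print(grad_pred.shape, grad_h.shape, grad_W.shape)
# Expect (32, 4) (32, 16) (16, 4); a scalar anywhere here is the bug.
```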

1