makeasnek t1_j067hnh wrote
Unfortunately, AI as it stands strongly favors those with large resources. ML training doesn't distribute well over volunteer computing because it depends heavily on low-latency communication between workers: it isn't a map-reduce problem or any other kind of workload that splits cleanly into independent chunks of computation. MLC@Home, if I understand correctly, trained a separate small model per machine/work unit rather than training a single model in a distributed fashion. A rough sketch of why the latency matters is below.
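Here's a back-of-the-envelope sketch (all numbers are made up for illustration) of why per-step synchronization is the killer: data-parallel SGD has to exchange gradients every optimizer step, so it pays the network round trip thousands of times, while a self-contained work unit only talks to the server at the start and end. This is not how MLC@Home or any particular framework is implemented, just the latency argument in code:

    # Minimal sketch with assumed numbers, contrasting communication patterns.
    STEPS = 10_000              # optimizer steps in a training run (assumed)
    LATENCY_S = 0.100           # round-trip latency between volunteer machines (assumed)
    COMPUTE_PER_STEP_S = 0.050  # local compute time per step (assumed)

    # Distributed data-parallel training: every step ends with a gradient
    # all-reduce, so each step pays at least one network round trip.
    data_parallel_time = STEPS * (COMPUTE_PER_STEP_S + LATENCY_S)

    # Independent work units (MLC@Home style): each machine trains its own
    # small model locally and only contacts the server twice per unit.
    work_unit_time = 2 * LATENCY_S + STEPS * COMPUTE_PER_STEP_S

    print(f"data-parallel wall time: {data_parallel_time:,.0f} s")
    print(f"independent work unit:   {work_unit_time:,.0f} s")

With these (hypothetical) numbers the synchronized run is roughly 3x slower, and the gap only grows as latency rises relative to per-step compute, which is exactly the regime volunteer computing lives in.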