Net-Specialist t1_j8wlpsp wrote
What drove the 2017 to 2018 cost reduction?
xsvfan t1_j8xf392 wrote
> Progress in AI is being driven by the availability of large volumes of structured data, algorithmic innovations, and compute capabilities. For example, the time to train an object detection task like ImageNet to over 90% accuracy has been reduced from over 10 hours to a few seconds, and the cost has declined from over $2,000 to substantially less than $10 within the span of the last three years (Perrault et al., 2019).
wallstreet_vagabond2 t1_j8xofqo wrote
Could the rise in open source computing power also be a factor?
PM_ME_WITTY_USERNAME t1_j8yb4oo wrote
What is that open source computing power you speak of?
greenking2000 t1_j8yr0yq wrote
You can donate your idle CPU time to research if you want.
You install a program and, when your CPU/GPU usage is low, it'll do calculations for that organisation.
https://en.m.wikipedia.org/wiki/List_of_volunteer_computing_projects
Though he's likely on about cloud computing and got the name wrong.
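The mechanism described above can be sketched in a few lines. This is a hypothetical illustration, not any real project's client: the idle threshold, queue, and function names are all assumptions (real clients like BOINC use far richer scheduling signals).

```python
# Hypothetical sketch of the volunteer-computing idea: a client runs donated
# work units only while the machine is otherwise idle. All names and the
# threshold below are illustrative assumptions, not a real project's API.
import os
import time

IDLE_LOAD_THRESHOLD = 0.5  # assumed cutoff: 1-minute load average below this counts as idle

def fetch_work_unit(queue):
    """Stand-in for downloading a work unit from the project's server."""
    return queue.pop(0) if queue else None

def machine_is_idle():
    # os.getloadavg() is POSIX-only; real clients watch CPU/GPU usage directly.
    one_minute_load, _, _ = os.getloadavg()
    return one_minute_load < IDLE_LOAD_THRESHOLD

def run_client(pending_work, compute, is_idle=machine_is_idle):
    """Drain the work queue, computing only while the machine is idle."""
    results = []
    while pending_work:
        if is_idle():
            unit = fetch_work_unit(pending_work)
            results.append(compute(unit))
        else:
            time.sleep(1)  # back off while the owner is using the machine
    return results
```

Passing a custom `is_idle` makes the loop testable; on a real machine the default load-average check decides when to contribute cycles.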
PM_ME_WITTY_USERNAME t1_j8zc52q wrote
I used Folding@Home before AI took over! :>
hppmoep t1_j90acma wrote
I have a theory that modern games are doing this.
Nekrosiz t1_j8ycmqp wrote
Think he means cloud-based computing power?
As in, rather than one PC here and one PC there doing the work individually, you link them up and share the load.
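The split-the-load idea can be sketched briefly. As an illustrative assumption, worker threads stand in for the separate machines; a real cloud or cluster setup would ship the chunks over the network instead.

```python
# Illustrative sketch of sharing a workload instead of running it all on one
# machine. Worker threads stand in for separate computers here (an assumption
# for illustration); a real cluster distributes the chunks over the network.
from concurrent.futures import ThreadPoolExecutor

def process_item(x):
    return x * x  # placeholder for the real per-item computation

def shared_load(items, workers=4):
    # Each "worker" picks up items as it becomes free, sharing the load.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_item, items))
```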
Bewaretheicespiders t1_j8x518f wrote
I changed my mind, and I will guess MobileNetV2.
splerdu t1_j8y0vw4 wrote
The most obvious thing to me is the introduction of Nvidia's RTX series cards.
foodhype t1_j8zci3y wrote
Deep learning techniques improved dramatically in 2017. TPUs were also first introduced in 2016, and it typically takes about 4 years for data centers to completely migrate.