
Making Deep Learning More Energy Efficient

Making Deep Learning More Energy Efficient "solitude"

IBM is proposing a way to make deep learning less of an energy drain by reducing the number of bits used to represent numbers from 16 to 4, reports MIT Technology Review.

Deep learning is notoriously energy-hungry. Training models requires massive amounts of data and abundant computational resources, which drives up electricity consumption.
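For readers unfamiliar with the idea, the sketch below illustrates the general technique of low-precision quantization, using simple uniform symmetric scaling. It is only an illustration of how 16-bit values can be approximated with 4-bit integer codes, not IBM's specific 4-bit training scheme described in the article; the function name and parameters are invented for this example.

import numpy as np

def quantize_4bit(x, num_bits=4):
    """Simulate uniform symmetric quantization to num_bits (here 4)."""
    qmax = 2 ** (num_bits - 1) - 1              # 7 for signed 4-bit codes
    scale = np.max(np.abs(x)) / qmax            # map the largest magnitude to qmax
    q = np.clip(np.round(x / scale), -qmax - 1, qmax)  # integer codes in [-8, 7]
    return q * scale                            # dequantize back to float

# Example: approximating FP16 weights with 4-bit codes
weights = np.random.randn(8).astype(np.float16)
approx = quantize_4bit(weights.astype(np.float32))
print(weights)
print(approx)   # same shape, but only 16 distinct levels are representable

Because each value is stored with 4 bits instead of 16, memory traffic and arithmetic cost drop sharply, which is where the energy savings come from; the research challenge is keeping model accuracy despite the much coarser representation.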

Read the article on MIT Technology Review


