From The New Yorker -- it will be interesting to see if the "watt metric" becomes a KPI.
"A.I. owes much of its recent success to biological metaphors. Deep learning, for example, which underlies technologies from Siri to Google Translate, uses several interconnected processing layers, modelled after the neuronal strata that compose the cortex. Still, given that even the most advanced neural networks are run on von Neumann machines, they are computationally intensive and energy-greedy. Last March, AlphaGo, a program created by Google DeepMind, was able to beat a world-champion human player of Go, but only after it had trained on a database of thirty million moves, running on approximately a million watts. (Its opponent’s brain, by contrast, would have been about fifty thousand times more energy-thrifty, consuming twenty watts.) Likewise, several years ago, Google’s brain simulator taught itself to identify cats in YouTube videos using sixteen thousand core processors and all the wattage that came with them. Now companies want to endow our personal devices with intelligence, to let our smartphones recognize our family members, anticipate our moods, and suggest adjustments to our medications. To do so, A.I. will need to move beyond algorithms run on supercomputers and become embodied in silico."