Deep neural networks can perform wonderful feats thanks to their extremely large and complicated webs of parameters. But their complexity is also their curse: the inner workings of neural networks are often a mystery, even to their creators. This opacity has troubled the artificial intelligence community since deep learning rose to popularity in the early 2010s. As deep learning has expanded into new domains and applications, interest has grown in techniques that try to explain neural networks by examining their outputs and learned parameters. But these explanations are often erroneous and misleading, and…

This story continues at The Next Web