
Neural Networks Are The New Apps


Shazam was the must-have app of the late aughts. It seemed to fulfill every promise of the post-iPhone world: With the mere tap of the screen, you could beam information to the cloud to identify a random song playing in a commercial, at the bar, or on the radio. But it required Shazam to build a huge server farm, an entire data center of its own, to handle the load.

For a tangible example of how things have changed in the decade since Shazam's smartphone app debuted, consider this: On the Pixel 2, with a feature called Now Playing, Google has shrunk the equivalent of Shazam's server farm to run entirely on the phone. It can match 70,000 songs, no internet required. And instead of you asking it what song is playing, Now Playing listens all the time and tells you before you even ask.
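The core idea behind matching songs on-device is to compress each track into a compact fingerprint and compare the fingerprint of whatever the microphone hears against a local database. Below is a minimal, purely illustrative sketch of that matching step: the fingerprints here are tiny hand-made vectors and the similarity measure is brute-force cosine similarity, whereas the real Now Playing system uses a learned neural fingerprinter and far more efficient search. All names and numbers are hypothetical.

```python
# Toy sketch of on-device fingerprint matching (not Google's actual method).
# A "fingerprint" here is just a short vector; real systems derive one
# from audio with a neural network.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def best_match(query, database, threshold=0.9):
    """Return the best-matching song title, or None if nothing clears
    the similarity threshold (i.e., the song is not in the database)."""
    best_title, best_score = None, -1.0
    for title, fingerprint in database.items():
        score = cosine(query, fingerprint)
        if score > best_score:
            best_title, best_score = title, score
    return best_title if best_score >= threshold else None

# A tiny stand-in for the on-device song database.
db = {
    "song_a": [0.9, 0.1, 0.0],
    "song_b": [0.0, 0.8, 0.6],
}

print(best_match([0.88, 0.12, 0.01], db))  # prints "song_a"
```

The threshold is what lets the matcher say "no match" rather than forcing a wrong answer, which matters for a feature that listens continuously.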

What made this possible? "There's been a deep learning revolution," says Matt Sharifi, a software engineer at Google, who first helped bring music identification to Google's own search bar back in 2010. "When we started working on this problem, the approaches to music recognition were different than in 2017. We did everything with deep learning and machine learning."