Amazon, Google: Here Comes Application-Specific A.I., Says Evercore ISI

Evercore ISI’s Ken Sena today offers up his findings from his firm’s “Machine Learning Symposium” in San Francisco, discussing the implications for Amazon.com (AMZN) and Alphabet’s (GOOGL) Google, among others, of the artificial intelligence revolution that’s sweeping computing.

One takeaway is just how important it is that Amazon and Google have the hardware power to do artificial intelligence in the cloud:

Hardware was brought up frequently not only in terms of the raw compute power required for AI and machine learning, but also in terms of the proliferation of devices. Gridspace’s Evan MacMillan spoke to the rise of quality microphones as a key enabler of his company’s natural language processing tools and pointed out that one of the most expensive components in any phone is the microphone array. Similarly, Kevin O’Brien from Orbital Insights mentioned that the explosion in the number of satellites from private space companies like Planet Labs has created more demand for imagery, as the influx of satellites is leading to an influx of data and hence better performance. In addition, Jack Clark from OpenAI highlighted the hardware necessary for the raw compute power needed to power machine learning applications as creating just as much of a moat as the data itself. This perspective on hardware makes us think that companies like Amazon and Google that provide cloud-based machine learning services are poised to benefit.

At the same time, writes Sena, there’s a rise in application-specific A.I., where big names like Amazon may not dominate:

The conclusion among panelists and speakers seemed to be both, as there will be several AI service providers for application providers to leverage, with these application providers holding certain domain expertise. For example, Gridspace uses voice recognition technology similar to Google’s and Amazon’s, but remains focused purely on enterprise applications rather than the consumer-facing voice recognition found in Google’s Assistant and Amazon’s Alexa.

And another topic was the limited engineering talent available relative to the high demand:

There are an estimated 1,000 engineers with the needed skills in this field, and concerns were expressed about the drain on the pipeline of data scientists within academia. Our VCs spoke to this bottleneck specifically, contrasting the notion of choosing to remain in school for five years (or more) with seven-figure job offers coming in every six months.

For more thoughts on A.I., see my interview with Nvidia (NVDA) CEO Jen-Hsun Huang this morning.
