Artificial Intelligence Hardware – Who Should Adopt it First, and Why?

From the soaring stock price of NVIDIA to the cutting-edge developments at Facebook and Google, AI hardware is a hot topic. We set out to learn what executives should know about the coming developments in AI hardware, and how they might impact different industries and sectors.
With the help of Kisaco Research, we asked three AI and analytics professionals (all speakers at the upcoming AI Hardware Summit in Mountain View, California) these three important questions:
  1. What are the most important new capabilities that cutting-edge AI hardware will allow for in business?
  2. What industries or business functions are most likely to be forced to adopt new AI hardware and why?
  3. What should business leaders do to prepare for AI hardware innovation today? (What should they learn, what should they know, etc.)
In the subsections of the article that follows, we will delve deeper into these questions, highlighting the key insights from the professionals we corresponded with.
AI software has always received the lion’s share of attention, but as the computational resources needed to run this software soar exponentially, a new generation of AI chips is coming into being. Developments in AI hardware will take the spotlight as companies converge at the AI Hardware Summit in September.
Ed Nelson, Conference Director of the event, describes the AI hardware ecosystem as “exciting and dynamic – some of the biggest AI chipset customers and the cloud service providers of the world are developing their own silicon architectures for processing deep learning, neural networks and computer vision. This hardware revolution is promising to slash times for training models in the Cloud and deliver ubiquitous AI at the Edge, which will in turn enable entirely new markets and business models.”
Chris Nicol, Co-founder and Chief Technology Officer of Wave Computing, gets more specific, saying that while most hardware solutions today are datacenter-centric and used for both training and inferencing, “there is a significant need for specialized AI hardware for embedded edge applications.”
He adds that “next-generation AI hardware solutions will need to be both more powerful and more cost efficient to meet the needs of the sophisticated training models that are increasingly being used in edge applications.”
Karl Freund, Senior Analyst for Machine Learning and HPC at Moor Insights and Strategy, predicts that the most important new capabilities will come from hardware accelerators, which can:
“…provide much faster insights into your customers’ behaviors and preferences, which can bring a number of benefits such as improving sales and customer satisfaction, improved manufacturing processes and uptime, reduced costs in supply chain management, portfolio management and customer relationship management.”

Industries and Business Functions Likely to Adopt New AI Hardware

AI software has permeated major industries such as banking and financial services, healthcare, retail, and media and entertainment. Among these, which will be compelled to deploy new AI hardware? And of the industries that have not yet made the foray, which will be compelled to implement AI? In Freund’s words:
“I cannot think of an industry that will not be impacted. From government to education to finance to healthcare to manufacturing…. AI will be pervasive.”
Nicol narrows this insight down by naming specific industries, such as healthcare and pharmaceuticals, retail, financial services, and manufacturing, as well as companies developing embedded edge applications such as autonomous vehicles and security cameras.
These companies, he said, “are facing business imperatives for better, faster insight from their data and will benefit from AI native hardware acceleration to help them extract value from their data at scale and at the speed of AI innovation.”
Nelson added that it is in “any data-driven business’ interest to keep up with these developments for the sake of proactively maintaining or achieving competitive advantage via AI.”

Preparing for Innovation

One common issue among business leaders is not knowing how to structure their company data to make it more useful. In preparing to innovate with AI, companies can start with the “low-hanging fruit.”
Freund recommends starting by “understanding what data you have that can be used more effectively, building your teams’ skills…and optimizing your hardware ecosystem to your business’ data requirements.”
He adds that before making large investments in AI hardware, businesses should first “understand how different types of hardware suit different needs. You certainly don’t want to be spending a ton of money on specialized hardware if you don’t have to. But to solve specific problems you may need accelerators.”
Nicol echoes this insight, saying “business leaders need to understand the impact AI can have on their businesses, before their businesses are impacted.”
The first step, he specifies, is for businesses “to map out how improvements in interaction with customers or suppliers could affect their business processes. Then, they should plan to adopt AI hardware and software solutions that can support these changes before they occur.”
Moving forward, Nelson anticipates that “we’ll see that the AI workloads of the future will require a variety of processing architectures…the space is chiefly composed of a few big players with the majority of the market share, plus a host of extremely well-funded startups led by semiconductor industry veterans.”
Nelson predicts that “at least one or two of these startups have a real chance of becoming the next major player… these guys are making AI run fast, and at lower power and cost.”
Phil Blunsom, who divides his time between the Department of Computer Science at Oxford University and DeepMind, summarizes the importance of AI hardware as equal to that of software:
“Developing deep learning models is a bit like being a software developer 40 years ago. You have to worry about the hardware and the hardware is changing quite quickly… Being at the forefront of deep learning also involves being at the forefront of what hardware can do.”