Thought Leadership

Episode 3: Identifying hardware design challenges and AI at the edge

A good book can often trigger an idea that I can apply to my work or personal life, which is why I constantly read both fiction and non-fiction. One of my favorite authors is Neal Stephenson. He is probably best known for his book “Snow Crash,” which inspired the visions and business plans of many dot-com startups. I heard one story that a CEO tossed the book onto a conference table and said, “Implement this!” Neal combines a strong technical background with creativity to produce complex science fiction stories set within intricate worlds. His books are hefty tomes that require your utmost concentration to read.

Recently, I finished reading his book “Fall; or, Dodge in Hell.” In it, a tech entrepreneur dies and his brain structure is transferred into a massive simulation system, where he gradually learns how to build out an entire world. As others pay to be transferred into the system after death, they “live” and build communities. In real life, people around the globe watch the simulation as a new form of entertainment. One of the aspects Neal explores is how much power this system consumes. The characters in the book have to develop new power sources and financing schemes to keep this AI community alive. Neal is giving a nod to the very real problem of the enormous energy required to fuel AI development and deployment.

Neal signing my first edition copy, when such events were still possible.

Most AI system developers don’t consider power consumption; they just want to create the best algorithms that deliver results. But the AI community is starting to worry about its carbon footprint, especially for the neural network training process that takes place on racks of computers in their labs. For example, in the paper “Energy and Policy Considerations for Deep Learning in NLP,” researchers estimated that training a single large natural language processing model (with neural architecture search) can emit roughly as much carbon as five cars do over their lifetimes, fuel included. So, AI teams are starting to apply energy measurement techniques at various points in the system (a software-level example is sketched after the list below):

  • Software level
  • Instruction set level
  • Hardware level

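At the software level, one illustrative approach is to wrap a training run with an open-source energy and carbon tracker such as the codecarbon package. This is a minimal sketch of that start/stop measurement pattern, not a tool mentioned in the podcast, and the training loop below is just a stand-in workload.

```python
# Minimal sketch of software-level energy measurement, assuming the
# open-source codecarbon package is installed (pip install codecarbon).
# The "training" function is a stand-in workload, not a real model.
from codecarbon import EmissionsTracker

def train_model():
    # Placeholder for a real neural network training loop.
    total = 0.0
    for step in range(1_000_000):
        total += step * 1e-6
    return total

tracker = EmissionsTracker()   # samples CPU/GPU/RAM power while the code runs
tracker.start()
train_model()
emissions_kg = tracker.stop()  # returns the estimated kg of CO2-equivalent
print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```
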
The custom integrated circuit (IC) design community is expert at conserving power and has been worried about this for decades. First, these teams do everything they can at the hardware (IC) level to minimize energy consumption. Then, within the system they are building, they design and simulate the instruction set that runs on the IC. They use tools to measure energy consumption as the instruction set runs and change the IC design if they see power problems. Finally, they run software programs built on the instruction set and test the IC’s power performance. AI system developers can learn a lot from custom IC teams.
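
To make the instruction-level step concrete, here is a toy sketch of summing a per-instruction energy model over an instruction trace. The energy figures and instruction classes are hypothetical placeholders, not numbers from any real IC or EDA tool, but they illustrate why off-chip memory traffic tends to dominate the energy picture.

```python
# Toy sketch of instruction-level energy estimation. The per-instruction
# energy figures and the traces below are hypothetical placeholders.

# Hypothetical energy cost per instruction class, in picojoules.
ENERGY_PJ = {
    "alu": 0.5,      # simple integer/logic operation
    "mul": 3.0,      # multiply / MAC
    "load": 6.0,     # on-chip memory read
    "store": 6.0,    # on-chip memory write
    "dram": 200.0,   # off-chip DRAM access
}

def estimate_energy(trace):
    """Sum the energy of an instruction trace (a list of instruction classes)."""
    return sum(ENERGY_PJ[op] for op in trace)

# The same arithmetic fed from off-chip memory costs far more energy
# than when the operands stay in on-chip memory.
trace_offchip = ["dram", "mul", "dram", "mul", "store"]
trace_onchip = ["load", "mul", "load", "mul", "store"]

print(estimate_energy(trace_offchip), "pJ")  # 412.0 pJ
print(estimate_energy(trace_onchip), "pJ")   # 24.0 pJ
```

In a real flow, the per-instruction costs would come from characterized libraries and power simulation rather than a hand-written table, but the bookkeeping idea is the same.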

Energy consumption in the lab is one thing, but businesses want to push AI into the Internet of Things (IoT) for applications in factories and cities, in addition to your smartphone. And some of these applications collect data at the edge, so teams want to add training on the device itself instead of just deploying static neural networks. These devices often run on limited power sources like batteries or solar, so they cannot consume massive amounts of energy. Data movement within a device also consumes power. So, companies are creating custom ICs with targeted AI applications and onboard memory to efficiently process as much data as possible at the edge and communicate with the Cloud only when absolutely necessary.
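
One common way to express that “Cloud only when necessary” idea in software is an edge-first pattern: run inference on the device and fall back to the Cloud only when the local model is not confident. The sketch below is purely illustrative; the model, confidence threshold, and Cloud call are stand-ins, not part of any product discussed in the podcast.

```python
import random

# Hypothetical sketch of the edge-first pattern: classify on the device and
# only pay the communication (and energy) cost of a Cloud round trip when
# the on-device model is not confident enough.

CONFIDENCE_THRESHOLD = 0.85  # assumed cutoff for trusting the local result

def classify_on_device(sample):
    """Stand-in for an on-device neural network; returns (label, confidence)."""
    confidence = random.random()
    return ("anomaly" if sample > 0.5 else "normal", confidence)

def classify_in_cloud(sample):
    """Stand-in for a round trip to a Cloud inference service."""
    return "anomaly" if sample > 0.5 else "normal"

def classify(sample):
    label, confidence = classify_on_device(sample)
    if confidence >= CONFIDENCE_THRESHOLD:
        return label                  # stay on-device: no radio traffic
    return classify_in_cloud(sample)  # uncertain result: escalate to the Cloud

if __name__ == "__main__":
    for reading in (0.2, 0.7, 0.9):
        print(reading, classify(reading))
```
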

And that brings us to a new podcast. In Episode 3 of their four-part series, Ellie Burns and Mike Fingeroff discuss AI power consumption, the different compute platforms that teams are trying, and the limitations of those platforms. This demand is leading to a surge of new custom IC development as hardware designers work to move AI to IoT edge devices. Listen here.

Thomas Dewey

Thomas Dewey (BSEE) has over 20 years of electronic design automation (EDA) experience at Siemens EDA (formerly Mentor Graphics). He has held various engineering, technical, and marketing responsibilities at the company, supporting custom integrated circuit design and verification solutions. For the last 4 years, he has researched, consulted, and written about all aspects of artificial intelligence.

This article first appeared on the Siemens Digital Industries Software blog at https://blogs.stage.sw.siemens.com/thought-leadership/2021/04/02/episode-3-identifying-hardware-design-challenges-and-ai-at-the-edge/