
AI chips soar, AI software in the spotlight.

Date: 20-07-2022

While the tech industry continues to tout a "renaissance" of artificial intelligence, the number of AI chip startups has begun to level off. AI chip startups are finding that the data center, once a promising market, has high and perhaps prohibitive barriers to entry. Their problems can be traced back to hyperscale companies such as Google, Amazon, and Facebook, which are now developing their own AI processors and accelerators to meet their specific needs.

To be clear, machine learning (ML) continues to evolve. More variants of neural networks are emerging, and artificial intelligence is becoming an inherent feature of every electronic system.

Arteris COO Laurent Moll predicts that in the future, "everyone has some kind of artificial intelligence in their SoC." That's good news for Arteris, which is in the business of helping companies integrate SoCs by providing network-on-chip (NoC) IP and IP development tools.

For AI chip startups? Not so much. Competition is getting fierce, compounding the challenge of cracking the right market segment for a particular AI design.

Next month EE Times will publish our "Silicon 100" (2021 edition), an annual list of emerging electronics and semiconductor startups. The report's author, Peter Clarke, has been following semiconductor startups closely for two decades. He tells us that the number of specialty chip startups focused on GPUs and AI is "flat compared to the previous year." He observed, "We sense that the industry may have reached 'peak AI' territory."

 

In short, the salad days of AI chip startups may be over.

Kevin Krewell, principal analyst at Tirias Research, expects more AI chip startups to be acquired. "After all, the explosion in AI startup funding happened after Intel acquired Nervana. Venture capitalists and angel investors saw a potentially lucrative exit strategy." He added, "There are too many [AI] startups today, more than the industry can support in the long term. I'm sure there will be more exotic solutions involving analog or optics. [But] eventually, AI/ML capabilities will be incorporated into larger SoC or chiplet designs."

Against this backdrop, EE Times recently sat down with Arteris' newly appointed chief operating officer. Moll, formerly Arteris' chief technology officer, spent more than seven years at Qualcomm, most recently as the mobile chip giant's vice president of engineering.

 

We asked Moll about the changing landscape of AI chips and where startups are headed.

Not surprisingly, Moll describes the industry's rush to AI as "one of the biggest gold rushes" he's ever seen. However, these latter-day 49ers are no longer just startups or small companies. Prospectors include "companies that have been making silicon for a long time, as well as many newcomers who [had] not made silicon before," Moll says. Everyone is "playing on the same stage" and "trying to solve problems."

The growing developer base and diverse applications play to Arteris' strengths, but the picture is very different for AI chip startups. They are no longer competing only with other AI startups with similarly fresh ideas; they are also up against the big boys. Hyperscalers and automotive OEMs are making a big push into AI so they can use their own chips in their own systems.

 

Still in the expansion phase

Arteris' Moll observes that the AI chip market is "still in the expansion phase" and that "everyone is exploring." Nonetheless, he sees "a little order" in the data center. That's largely because hyperscale companies are controlling their own destiny by developing their own AI processors and accelerators.

The difference between hyperscalers and other AI chip designers comes down to one factor: "They own the datasets," Moll says. Hyperscalers don't share their datasets with anyone else, and they are developing proprietary software stacks. "And they feel they can create chips more optimized for their data access."

Meanwhile, outside vendors, the smaller AI chip startups, are "developing new ways to build SoCs, new ways to use SRAM and DRAM, stacking, using optics," Moll says. "There are many ways to create secret recipes that allow them to do better than off-the-shelf AI chips. The small guys are changing the game and are very smart about doing things differently."

In contrast, the AI chips sought by hyperscale companies are less exotic. Moll observes that hyperscalers can afford to use more traditional methods. A good example is Google's TPU. "If you look at it, the architecture is great, but it's not revolutionary in many ways." Nevertheless, "it's perfect for what Google wants to do. So, it fits their purpose."

If small AI startups have such novel chips, shouldn't they be breaking into hyperscale data centers?

"No, no, no," Moll said. "It's unlikely that any small business will expand in the data center market... or that hyperscale companies will buy their products." However, he notes, "Once they see that their technology works and is applicable to what they want to do, they will certainly buy some of these startups."

Moll describes the hyper-scale mindset: "I know what my data set is. I know how to do a more centralized architecture. If someone has a great idea, let's capture that set of people and IP and improve our product."

Tirias Research's Krewell agrees. "You must do some amazing things to get hyperscalers to commit to using your machine learning chip." Cerebras, for example, pushes the limits with its wafer-scale chips, Krewell says. "Nvidia remains the default platform for AI development efforts because of its ubiquitous software and scalability."

 

What about the edge?

Moll notes that for AI chip designers, "the edge is a completely different story" compared to the data center. The end markets at the edge are diverse and require a broader range of solutions. "A lot of people are still figuring out where to apply AI and how to implement it," Moll said.

 

[Figure: By 2025, 19% of the semiconductor market will be AI/ML related.]

Tirias Research's Krewell concurs. "The edge is still a relatively unexplored area. There are still opportunities to add ML to sensors and edge devices. Very low-power analog and memory devices, and accelerators in MCUs and application processors, hold promise. I see great potential for INT4 and INT2 inference in edge processors, with high accuracy and much lower power and memory requirements."
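To make the low-bit idea concrete, here is a minimal Python sketch of symmetric INT4 weight quantization. The function names and the single per-tensor scale are illustrative assumptions, not any vendor's toolchain; production edge compilers typically quantize per channel and calibrate activations on sample data.

```python
# Illustrative sketch of symmetric INT4 quantization, the kind of low-bit
# inference Krewell describes for edge processors.

def quantize_int4(weights):
    """Map float weights to integers in [-8, 7] with one scale per tensor."""
    max_abs = max(abs(w) for w in weights) or 1.0
    scale = max_abs / 7.0                      # 7 = largest positive INT4 value
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from INT4 codes."""
    return [v * scale for v in q]

weights = [0.42, -0.91, 0.07, 0.63, -0.28]
q, scale = quantize_int4(weights)
approx = dequantize(q, scale)
# Each weight now needs 4 bits instead of 32, an 8x memory reduction,
# at the cost of the rounding error visible in `approx`.
```

The power and memory savings Krewell cites come from this trade: smaller multipliers and less SRAM traffic in exchange for bounded rounding error, which INT2 pushes further still.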

While the diversity of applications sounds exciting, startups run the risk of getting caught up in the edge-AI hype cycle.

Edge AI has become a buzzword not because the edge is a new market or because it represents any particular product category. Rather, the lack of definition has turned "edge" into a catch-all for startups to associate their products with.

Across a wide range of edge applications, Moll sees two distinct trends. One is "AI on a chip that can do other things," he notes. "That's where the explosion is happening."

He added that the embedded systems market is "where factors like form factor, power, and heat dissipation matter."

Another trend on the other end of the spectrum is "huge chips that just do artificial intelligence," Moll noted. However, the applications for big chips at the edge are still evolving.

The best example of "AI on a chip" is probably the smartphone application processor, Moll notes. AI accelerators have played a key role in speech recognition and vision processing, and today AI has become an important part of a phone's sales appeal. One result is that "the older players in the mobile space, such as Qualcomm, have an advantage," Moll admits.

 

AI in the car

Moll sees AI in vehicles as an entirely different story.

He notes that solutions will range from AI-heavy computer vision chips to large AI chips that do all the heavy processing. As cars move from ADAS to autonomous driving, Moll expects larger AI processors to play a key role in the high-end automotive market.

 

While automotive industry stalwarts, with their own small AI chips, have an advantage in ADAS, there is still plenty of room for AI chip startups in the autonomous driving market, which demands sizable AI chips.

 

But there's a catch.

Automotive OEMs, mimicking the hyperscalers, are also going vertical. Tesla has designed its own chip, the "Full Self-Driving" computer. A few weeks ago, Volkswagen CEO Herbert Diess told a German newspaper that the company plans to design and develop its own high-performance chip for self-driving cars, along with the software it needs.

Moll confirmed that automakers "are looking at this very carefully." Although Arteris is an IP company, "we get calls from automotive OEMs because they want to understand the whole stack" and to control the "big stacks of silicon" that are coming and changing the architecture of vehicles.

AI chip startups such as Recogni, Blaize, and Mythic cite automotive as an edge AI niche they are targeting. How automakers will eventually implement such chips in vehicles remains to be seen.

Krewell emphasized, "Automotive platforms are still evolving. Distributed functionality has the advantage of modularity and reduced risk, but it is more expensive to build and maintain than centralized processing complexes."

He added, "Another issue is data. Sensors will send a lot of data, and edge intelligence will reduce data transfer, but there is a trade-off between increased sensor latency and more distributed power in the chassis. Some balance of lightweight edge processing on sensors could reduce the load on the central processor without adding too much latency or requiring too much distributed power."

 

Artificial Intelligence Battle Shifts from Chip to Software

Krewell observed, "I'm seeing a shift in focus in AI from silicon to software. Deploying ML capabilities requires good software. In order for more embedded design engineers and programmers to use ML, it needs to be made low-code. It also needs to automate the creation of custom models for specific applications."

Moll came to a similar conclusion. He cited two things when asked why he decided to return to Arteris from Qualcomm.

First, Arteris once served a niche market, "a narrow place among IP vendors." But that niche has now become "one of the key spaces," where AI chip designers seek help in "assembling very large and complex SoCs" by building massive networks on chip. That's where Arteris' network-on-chip (NoC) IP comes in to solve the problem holistically.

Second, Arteris IP acquired Magillem last year, and Moll sees the "software layer" Magillem provides as another key to creating very large and complex SoCs. After working with the Qualcomm team responsible for delivering top-of-the-line chips, "I began to recognize the value that Arteris offered, as a user rather than a marketer."

 
