Arm in IoT Game for the Long Run

Article By: Nitin Dahad, EE Times

Stream acquisition signals intent, but rewards require commitment

LONDON — Arm is better positioned as a privately held company to take a strategic approach for long-term growth as it addresses the IoT market, according to Rajeev Misra, CEO of SoftBank’s $100B Vision Fund and a director of the SoftBank Group.

“IoT requires an entire stack, not just the chip, and in order to make these investments, you need to sacrifice profits over the next three to five years,” Misra said here this week at CogX, one of the biggest artificial intelligence conferences in the U.K. “It takes courage to take a long-term view when shareholders look for nearly 10% growth per year.”

Asked for his view on the future of tech, Misra said that digital security would be the biggest industry in the next three years, especially given the level of connectivity we’re likely to experience and the privacy challenges that come with it. He also believes that over the next couple of years, we’ll be able to deliver more processing power than the brain, but that processing power alone will not be enough, given the need to process, analyze, and act on the data obtained.

On the Vision Fund, he said that SoftBank has made 36 investments in the year since it was launched, only two of them in Europe, and that in the next two to three years, it expects to have 80 to 90 companies under its umbrella.

He dismissed suggestions that the fund was fueling a bubble in the tech industry. “When you take a 10-year view, short-term valuation fluctuations are not relevant,” he said. He added that some of the tech companies sitting on hundreds of billions of dollars in cash will start becoming prolific investors competing with the Vision Fund itself.

Misra also said that China is now creating a huge tech ecosystem, one that may end up being bigger than Silicon Valley in terms of its impact. The critical factor, he said, is not capital but mentors and role models, who come from the growing number of successful tech companies in countries like China.

Hardware gets airtime at AI conference

The two-day CogX conference, which reportedly drew more than 6,000 attendees, was unusual for a generalist AI conference in that hardware players were also given a stage. Many AI conferences today feature plenty of generalist discussion, and CogX had its share, with sessions on the impact of AI, AI taking over jobs, the ethics of AI, and women in AI. But this time, there was also an attempt to make generalists aware of what chip and hardware players are doing to enable the technology.

Companies like Intel, Graphcore, and LightOn explained to the audience, in simple terms, the basics of the hardware options for AI — CPUs, GPUs, TPUs, intelligence processing units (IPUs), and optical processing units (OPUs). FiveAI’s CEO, Stan Boland, also outlined the essential elements of developing hardware technology for fully autonomous vehicles.

In particular, the hardware players on the panel emphasized the need for a different approach to meeting the requirements of deep learning and explained why an optimal balance of computation and communication architecture must be struck for each algorithm to get the best performance from the hardware.

Intel’s Casimir Wierzynski, of the company’s artificial intelligence products group in San Diego, highlighted its accelerator for deep learning and its work on silicon photonics circuits to bring lasers right onto the wafer. “One day, we could bring light directly into the deep-learning accelerators,” he said. Simon Knowles, co-founder and CTO of Graphcore in Bristol, U.K., said that there’s a new wave of chip companies designing processors from scratch specifically for machine intelligence. Igor Carron, co-founder and CEO of LightOn in Paris, France, added that optical computing was the future for large-scale AI, in part because it can make use of diffusive media as memory.

James Wang, an analyst at New York-based Ark Invest, said that AI hardware had already come a long way and that the key to scaling up AI performance another 100,000 times would be to develop new substrates such as optical, analog, and quantum and to develop more efficient algorithms. In addition, he emphasized that TPU designs were fine but would be insufficient to supplant Nvidia, stating, “Nvidia is the most ferociously competitive company in the world.”

— Nitin Dahad is a European correspondent for EE Times.
