Three-way deal pushes for self-driving car industry standard

By Junko Yoshida

BMW, Intel and Mobileye hope to establish a big lead in the nascent self-driving car segment, but whether the deal will become the one to define an autonomous car platform is a tougher question.

Intel’s shopping spree

Let’s face it. Intel’s outsized ambition for the automotive market is no secret.

Intel already has underlying automotive technologies such as Wind River’s embedded operating system, software foundation and security expertise.

Further, in the last several months alone, Intel has been busy shopping around for more automotive-related technologies.

Earlier this year, Intel acquired Yogitech, which makes safety tools for autonomous car chips. In parallel, Intel’s Wind River unit bought Arynga, which offers GENIVI-compliant CarSync software for enabling over-the-air updates in automotive computers. What the two acquisitions have in common is that both will feed into Intel’s future chips and reference designs aimed at fully autonomous cars.

Then, in late May, Intel announced the acquisition of Itseez Inc., a company armed with Computer Vision algorithms and implementations for embedded and specialised hardware.

De Ambroggi wonders if Intel’s recent acquisitions, taken together, might spell trouble down the line, as Intel becomes a competitor to Mobileye rather than a partner. If that’s the case, how much can Mobileye trust Intel?

Then, there’s the nagging mystery of whether Intel—under this deal with Mobileye and BMW—will push its own processor as a decision-making CPU.

Intel said in the press release, “To handle the complex workloads required for autonomous cars in urban environments Intel provides the compute power that scales from Intel Atom to Intel Xeon processors delivering up to a total of 100 teraflops of power efficient performance without having to rewrite code.”

Going against the Intel-inside-the-autonomous-car theory is Intel’s poor record of automotive design wins thus far.

The Linley Group’s Demler can’t recall too many Intel victories. He explained, “First, the vast majority of processors in automotive applications are MCUs, so there’s no Intel play there. Intel claims a few design wins for In Vehicle Infotainment systems with Nissan (Infiniti) and Hyundai (Kia), but those are basically repurposed Bay Trail tablet processors.” In Demler’s view, “Intel doesn’t have a chip for embedded-vision processing like Mobileye’s, or even something equipped to run deep neural networks. They have Wind River software, and they’ve made some acquisitions for in-vehicle software, but nothing for ADAS hardware.”

Jim McGregor, a principal analyst for Tirias Research, added that Intel doesn’t have “a strong track record at this point.” In the current market, “the leaders in the command and control systems are NXP and Renesas, the leader in communications is Qualcomm, and the leader in computer vision is Mobileye,” he noted.

Foundry deal in the offing?

Intel’s role in the deal could be explained by the theory that Intel might become a foundry for Mobileye’s EyeQ5, which is expected to be ready in two years and is scheduled to be manufactured on a 10nm-or-below FinFET process node.

In fact, although Mobileye’s EyeQ4 is being produced by STMicroelectronics using ST’s 28nm FD-SOI process technology, both Mobileye and ST have acknowledged that EyeQ5 production must rely on a finer process node, which ST doesn’t have.

Mobileye has not disclosed a foundry for EyeQ5. The Linley Group’s Demler remains sceptical of Intel getting the deal. “Mobileye uses Imagination’s MIPS CPUs, which I doubt have been run in any Intel processes,” he said.

What open platform?

Although the joint announcement talks about the three companies’ commitment to define “an open platform for autonomous driving,” what “open” means is anybody’s guess.

Tirias Research’s Jim McGregor said, “I doubt that Intel and Mobileye's definition of ‘open’ will be the same as the rest of the industry. It will likely mean that anyone can use their technology if they choose to do so and they will try to push that as an industry standard.”

The Linley Group’s Demler said, “Every company has its own definition of ‘open,’ but that would be a huge change for Mobileye.” Demler said Mobileye has “hinted at opening up when they announced EyeQ5 in their Q1 earnings call, but right now everything they do is proprietary.”

__Figure 2:__ *Mobileye’s upcoming EyeQ5 block diagram (Source: Mobileye)*

Nvidia’s Drive system is open in the sense that it supplies not the ADAS software itself but a hardware-software development platform, he added. “Mobileye does it all in house, and nobody knows how they develop their algorithms. Nvidia supports open-source neural-network frameworks like Caffe. Google has an open platform for CNN development with TensorFlow.”
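For readers unfamiliar with what such open tooling looks like, below is a minimal sketch of a small image-classification CNN defined with TensorFlow’s Keras API. The layer sizes and input shape are arbitrary assumptions for illustration only and are not drawn from any Mobileye or Nvidia design.

```python
# Minimal sketch of a small CNN built with TensorFlow's open Keras API.
# Layer sizes and input shape are illustrative assumptions only.
import tensorflow as tf


def build_tiny_cnn(num_classes: int = 10) -> tf.keras.Model:
    """Build a small image classifier as an example of open CNN tooling."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(64, 64, 3)),                 # small RGB frames
        tf.keras.layers.Conv2D(16, 3, activation="relu"),  # feature extraction
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    build_tiny_cnn().summary()
```

Because frameworks like TensorFlow and Caffe are open source, any carmaker or supplier can inspect, train and deploy such models themselves, which is the contrast Demler draws with Mobileye’s in-house approach.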

In short, Demler sees current autonomous-vehicle development echoing the smartphone platform wars between the closed Apple iOS (Mobileye) and “open” Google-Android (Nvidia and everybody else). “I believe that there’s going to be no way for autonomous vehicles to proliferate on closed, proprietary systems. There’s too much at risk to rely on one company’s secret sauce.”

While it’s an easy-to-understand analogy, autonomous cars are still far from the smartphone’s level of maturity. Too many technologies are involved in self-driving cars, and they need to be sorted out before the industry can align.

Mapping – HERE and REM

IHS’ Juliussen senses that the automotive industry is finally moving into that much needed “open” development phase.

Mapping is an example. HERE, a company co-owned by German automotive companies Audi, BMW, and Daimler, developed Sensoris, an open spec for vehicle sensor data to be collected and transmitted to the cloud by connected vehicles. Sensoris is billed as a common language for all autonomous vehicles.

HERE this week took a step toward establishing Sensoris as the de facto standard by submitting the spec to ERTICO – ITS Europe, a private/public partnership for intelligent transportation systems in Europe.

HERE believes Sensoris is critical for combining data from all vehicles on the road. Once combined, data can be used in a system tracking traffic patterns, making predictions about potential bottlenecks and adjusting a self-driving car's path automatically, even intelligently updating mapping data.
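To make the data flow concrete, the Python sketch below shows the kind of vehicle-to-cloud report such a system would aggregate. The field names are hypothetical and are not taken from the actual Sensoris specification; they only illustrate the category of data a connected car might upload.

```python
# Illustrative sketch of vehicle-to-cloud sensor reporting.
# Field names are hypothetical and do NOT reflect the Sensoris schema.
import json
import time
from typing import Optional


def build_sensor_report(vehicle_id: str, lat: float, lon: float,
                        speed_mps: float,
                        hazard: Optional[str] = None) -> str:
    """Serialise one observation a connected car might send to the cloud."""
    report = {
        "vehicle_id": vehicle_id,
        "timestamp_utc": time.time(),
        "position": {"lat": lat, "lon": lon},
        "speed_mps": speed_mps,
        "detected_hazard": hazard,  # e.g. "stationary_object", None if clear
    }
    return json.dumps(report)


# A backend could aggregate many such reports to spot congestion and
# push updated routing or map data back to the fleet.
print(build_sensor_report("demo-car-001", 48.1374, 11.5755, 13.9,
                          hazard="stationary_object"))
```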

The Linley Group’s Demler explained that REM and HERE are two components that could be used for autonomous-vehicle navigation.

__Figure 3:__ *How Mobileye’s REM works in the backend (Source: Mobileye)*

Earlier this year, when Mobileye introduced REM, the company talked extensively about REM’s advantage over Google’s approach.

As Demler pointed out, it’s too early to say how autonomous-vehicle navigation will play out. But BMW, a key owner of HERE, is now working with Mobileye, which developed its own REM. Juliussen remains optimistic, noting that this indicates the automotive industry is migrating toward a more convergent approach that combines technologies.
