The Tegra processor from NVIDIA is based on technology used for graphics-intensive applications like games, which means that on-screen renderings are high fidelity. Automotive applications today tend to be for infotainment, but advanced driver assistance system implementations, such as vision processing, aren't far off.

The Tesla Model S uses Tegra processors for its instrument cluster and the touchscreen in the center stack. The carmaker is able to upgrade its system over the air.

Processors for Superior Performance in Automotive Apps

In the race to deploy more electronics-based systems, faster is better. And the processors from NVIDIA are awfully quick.

Danny Shapiro drives a 2011 Nissan LEAF. He charges it from the solar panels on his house in northern California. He says he likes the car. But there is a bit of a problem. “Even though the car is maintenance-free, I have to take it into the dealer to get a software update that is going to change the distance-to-empty algorithm,” he says, adding, “It’s ironic that a connected car still needs to be brought in to get a software update.”

Shapiro knows more than most people about automotive electronics, hardware and software alike. He's director of Automotive for NVIDIA Corp. (nvidia.com). NVIDIA, based in Santa Clara, California, is the company that invented the GPU, the graphics processing unit, in 1999. Graphics processing is highly data intensive, whether it is for gaming, computer-aided design, or creating special effects for movies. Work in this area has led the company to develop products like the Kepler GPU, which includes seven billion transistors. NVIDIA developers also created CUDA, a parallel computing platform and programming model.
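
To make that parallel model concrete, here is a minimal CUDA sketch, not production code and not tied to any NVIDIA automotive product; the kernel name and data are purely illustrative. Thousands of GPU threads each process one pixel of an image at the same time:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread brightens one pixel -- the same data-parallel pattern
// behind game rendering, CAD, and film effects.
__global__ void brighten(unsigned char* pixels, int n, int delta) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        int v = pixels[i] + delta;
        pixels[i] = v > 255 ? 255 : v;   // clamp to the 8-bit range
    }
}

int main() {
    const int n = 1 << 20;               // one megapixel of 8-bit data
    unsigned char* d_pixels;
    cudaMalloc(&d_pixels, n);
    cudaMemset(d_pixels, 100, n);        // stand-in image data

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    brighten<<<blocks, threads>>>(d_pixels, n, 40);
    cudaDeviceSynchronize();

    cudaFree(d_pixels);
    std::printf("processed %d pixels in parallel\n", n);
    return 0;
}
```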

So what does this have to do with cars? Have you ever seen Google Earth and Street View displayed on the screen in an Audi? Or interacted with the 17-inch touchscreen in a Tesla Model S? Those OEMs, as well as an array of others, are using NVIDIA technology, specifically Tegra mobile processors, to drive their in-vehicle infotainment systems and digital instrument clusters. The Tegra processor, which is the size of your thumbnail, uses four CPU cores plus a fifth, low-power core that handles low-level tasks. This means there is not only the power to render high-fidelity images (at up to 120 Hz, twice the refresh rate of a typical video monitor), but also the ability to dedicate each of the four CPU cores to an individual task.
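
That 120 Hz figure implies a hard timing constraint: each frame must be produced in 1/120 of a second, about 8.3 ms. A minimal C++ sketch of such a fixed-budget render loop, with the actual rendering work elided, might look like this:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    // Rendering at 120 Hz leaves a budget of 1/120 s, about 8.33 ms, per frame.
    const auto frame_budget = std::chrono::microseconds(1000000 / 120);

    for (int frame = 0; frame < 600; ++frame) {   // roughly 5 seconds of frames
        auto start = clock::now();

        // ... sample vehicle data, update the scene, issue draw calls ...

        auto elapsed = clock::now() - start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_until(start + frame_budget);
        else
            std::printf("frame %d overran its 8.33 ms budget\n", frame);
    }
    return 0;
}
```

A missed budget here is exactly what a driver would perceive as a stuttering needle or a dropped animation frame on the cluster display.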

Navigation, infotainment, climate control, and driver assistance could all be handled by one of these chips at the same time, although a separate processor would more likely be dedicated to advanced driver assistance operations: a collision avoidance system that suddenly goes out because something went wrong with the rear-seat entertainment would not be ideal.

Philip Hughes, NVIDIA vice president of Automotive Sales and Business Development, explains, “Theoretically, there is enough horsepower within Tegra to run an infotainment system and, in parallel, to run some ADAS [advanced driver assistance system] functionality, as we have four ARM Cortex-A15 CPUs running a hypervisor [a virtualization manager] that runs two separate systems. Some OEMs and Tier Ones are concerned and want a firewall, so we would have two Tegras, one for ADAS and one for infotainment.”
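
The hypervisor itself is proprietary, but the partitioning idea can be illustrated, very roughly, in userspace. The Linux-only sketch below is an analogy of our own, not NVIDIA's implementation: it pins an "ADAS" process and an "infotainment" process to disjoint CPU cores, so work on one set of cores does not disturb the other.

```cpp
#include <sched.h>
#include <unistd.h>
#include <sys/wait.h>
#include <cstdio>

// Pin the calling process to a contiguous range of CPU cores.
static void pin_to_cores(int first, int last) {
    cpu_set_t set;
    CPU_ZERO(&set);
    for (int c = first; c <= last; ++c)
        CPU_SET(c, &set);
    sched_setaffinity(0, sizeof(set), &set);
}

int main() {
    // "ADAS partition": cores 2-3, isolated from infotainment's cores.
    if (fork() == 0) {
        pin_to_cores(2, 3);
        std::printf("ADAS tasks confined to cores 2-3\n");
        _exit(0);   // a vision-processing loop would live here
    }
    // "Infotainment partition": cores 0-1. If it crashes and is restarted,
    // the ADAS partition's cores are untouched.
    if (fork() == 0) {
        pin_to_cores(0, 1);
        std::printf("infotainment confined to cores 0-1\n");
        _exit(0);
    }
    while (wait(nullptr) > 0) {}   // supervisor reaps both partitions
    return 0;
}
```

A real hypervisor enforces this separation in hardware, for memory and devices as well as CPU time, which is why some OEMs still prefer the physical firewall of two chips.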

Which brings us back to the LEAF. Shapiro says that when the Tesla Model S first came out, some customers didn’t like the fact that when they took their foot off the brake, the car didn’t creep forward the way non-electric vehicles do. So Tesla engineers developed a selectable function called “Creep” and updated the system over the air. The customers climbed into their vehicles one morning, and voilà: there was a new function. He maintains that going forward, customers are going to download apps for their cars. “They can add new capabilities and enhance the value over time, rather than the car just depreciating,” he says. This is not something that is going to be done at the dealership, but remotely. Tired of that digital instrument cluster? Download a new one.
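
The article doesn't detail Tesla's update mechanism, but the generic over-the-air pattern is straightforward: poll a server for a manifest, compare versions, verify the image, and stage it so a failed update can roll back. Every name in this sketch is hypothetical:

```cpp
#include <cstdio>
#include <string>

// Hypothetical OTA client logic -- not any OEM's actual protocol.
struct Manifest {
    std::string version;
    std::string sha256;   // digest used to verify the downloaded image
};

// Stub: a real client would fetch this over TLS from the OEM's server.
Manifest fetch_manifest() { return {"2.1.0", "stub-digest"}; }

// Stub: a real client would check a cryptographic signature here.
bool image_verified(const Manifest& m) { return !m.sha256.empty(); }

int main() {
    const std::string installed = "2.0.4";
    Manifest m = fetch_manifest();

    if (m.version != installed && image_verified(m)) {
        // Stage the image to a spare partition and switch on the next
        // key cycle, so a failed update falls back to the running version.
        std::printf("staging update %s -> %s\n",
                    installed.c_str(), m.version.c_str());
    } else {
        std::printf("no update needed\n");
    }
    return 0;
}
```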

According to Hughes, the “holy grail” for automotive displays has been creating a tachometer with an indicator that moves smoothly and looks electromechanical. Information about vehicle operating parameters is taken from the CAN bus, and the signal is then processed to render graphics that represent the input. The graphics then go through the video output to the display. In real time. It’s like high-end gaming systems: nothing is canned or preplanned; it is rendered on the fly. (Shapiro talked with designers at an OEM about digital instrument clusters. “They were proud of the fact that they have 250 screens that they designed in Photoshop. That makes no sense from a productivity standpoint. We have zero screens designed because we create beautiful, realistic content as needed; it is dynamic, it is not hard-baked.”) Like game developers and filmmakers, the OEMs or Tier One suppliers can create the 3D elements for their instrument clusters using software like Autodesk 3ds Max or Maya (autodesk.com) and then render them with NVIDIA’s UI Composer toolchain. The result is the sort of high-fidelity imagery that consumers are familiar with from their other electronic products.
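
As a rough sketch of the first half of that pipeline, the C++ fragment below reads engine speed from a Linux SocketCAN interface and low-pass filters it so a rendered needle sweeps smoothly instead of jumping. The CAN ID and byte layout are hypothetical; real ones come from the OEM's message database.

```cpp
#include <linux/can.h>
#include <linux/can/raw.h>
#include <net/if.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>

// Hypothetical layout: CAN ID 0x0C0 carries engine RPM in its first
// two bytes, big-endian. Real IDs and scaling are OEM-specific.
constexpr canid_t RPM_ID = 0x0C0;

int main() {
    int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);   // Linux SocketCAN socket
    ifreq ifr{};
    std::strcpy(ifr.ifr_name, "can0");
    ioctl(s, SIOCGIFINDEX, &ifr);                // resolve interface index
    sockaddr_can addr{};
    addr.can_family = AF_CAN;
    addr.can_ifindex = ifr.ifr_ifindex;
    bind(s, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));

    double needle = 0.0;                         // value the renderer draws
    can_frame frame{};
    while (read(s, &frame, sizeof(frame)) == sizeof(frame)) {
        if (frame.can_id != RPM_ID) continue;
        double target = (frame.data[0] << 8) | frame.data[1];

        // Low-pass filter the raw signal so the rendered needle sweeps
        // smoothly, like an electromechanical gauge, instead of snapping.
        needle += 0.15 * (target - needle);
        std::printf("needle input: %.0f rpm\n", needle);
        // ...hand `needle` to the GPU-rendered cluster each frame...
    }
    close(s);
    return 0;
}
```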

As OEMs install more cameras and radar sensors in their vehicles, the need for processing capability grows, and it will grow further still as the industry moves toward autonomous driving. Arguably, a supercomputer on a chip like the Tegra would be more than up to that challenge.