The Sony VISION-S in road testing. (Sony)

Sony jumps into AV sensors, software

The consumer-electronics giant leverages innovative tech for ADAS prototypes, partnering with Hungary-based AImotive for automated-driving software.

The growing prominence of sensor-based driving systems is creating opportunities for electronics companies to enter the auto industry. While speculation has swirled for years about Apple launching an electric vehicle (EV), Japanese electronics giant Sony is making concrete moves to offer next-generation cameras for automated-driving functions.

Sony unveiled the VISION-S – a sleek, battery-powered, automated vehicle – at the 2020 Consumer Electronics Show (CES) in Las Vegas. The company returned (virtually) to CES in 2021, this time announcing it was testing cars on public roads. At the same time, the company’s new video promoting the VISION-S identified Budapest-based AImotive as the provider of its automated-driving software.  

“Teaming up with a company that really knows about ADAS [advanced driver-assistance systems] to develop the software stack allows Sony to get more insight into how to make their products customized to the automotive market,” said Tom Jellicoe, head of autonomous vehicles at TTP, a U.K.-based consulting company that helps clients develop breakthrough technologies.  

Jellicoe noted, “It’s one thing to release marketing materials about selling sensors to the automotive market, but Sony has gone a lot further than that. It has a car using an automotive ADAS system.” Sony gets a willing partner, and AImotive can elevate its presence in a crowded field of ADAS suppliers. Work on the Sony VISION-S is currently AImotive’s biggest project.

At the 2021 CES, Sony announced that the VISION-S was “conceived to re-imagine mobility as we know it.” But whether the VISION-S sedan prototype – about 3.5 inches (89 mm) shorter than a Tesla Model S – will eventually become a direct consumer product has been a matter of speculation. Sony told us that it has no plans to mass-produce and sell a version of the prototype vehicle. Instead, the company will primarily use the vehicle as a rolling showcase of its auto-ready technologies.

Jellicoe said that the end game for Sony is to become a Tier 2 supplier. “I don’t think they want to get stuck in the depths of qualifying automotive assemblies or dealing with OEMs,” he said. “They sell probably billions of sensors into mobile phones. They would quite happily sell 100 million sensors into automotive.” Sony declined to comment on its plans to become an automotive supplier. 

Sony’s extraordinary parts bin 
Laszlo Kishonti, chief executive of AImotive, points to Sony’s leadership position in sensors, cameras, LCD screens, audio systems, videogames and entertainment as a powerful suite of technologies well-suited to next-gen vehicles. “Sony is not a standard EV company because they can build upon the technology they already own to add spice to the car,” said Kishonti. 

AImotive’s team of around 200 engineers is spread across Budapest, Detroit, Yokohama, Japan, and Mountain View, Calif. In 2017, the former PSA Group (now Stellantis) announced it was working with AImotive on an AI-based automated driving system (ADS) for highways in France. The R&D project focused on vision-based, so-called “Level 2-plus” systems. Kishonti said AImotive software will be used in mass-production vehicles by approximately 2023. “Automotive product cycles are still very long,” he cautioned.

Sony owns a treasure trove of electronic devices that could prove invaluable for automated-vehicle functions. In March, Izumi Kawanishi, Sony’s senior vice-president of AI robotics, told SAE’s Autonomous Vehicle Engineering about the company’s new complementary metal-oxide-semiconductor (CMOS) image sensor for automotive applications. He described it as “a stacked direct Time of Flight (DTOF) depth sensor for automotive lidar using single-photon avalanche diode pixels, an industry first.”

Jellicoe and other observers speculate that Sony is pairing a lidar chip with its image sensor in a single camera package. However, Sony said it provides the image sensors as photo detectors, not as modules. “With the time-of-flight SPAD array, it also measures the time it takes light to leave a source and get back to the sensor, so you get distance. That’s how lidar works,” said Jellicoe. He referred to the new Sony unit as a “hybrid between a camera and lidar.” Sony said the “hybrid” function relates to its sensor-fusion technology.
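
For readers who want the underlying arithmetic, here is a minimal sketch of the direct time-of-flight calculation Jellicoe describes: range is half the round-trip travel time of a light pulse multiplied by the speed of light. The timing figure below is illustrative, not a Sony specification.

```python
# Minimal sketch of direct time-of-flight (DTOF) ranging: a light pulse
# leaves the emitter, reflects off a target, and returns; range is half
# the round trip multiplied by the speed of light.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def dtof_range_m(round_trip_time_s: float) -> float:
    """Range to a target from a measured round-trip photon time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# A photon returning after ~667 nanoseconds implies a target ~100 m away.
print(f"{dtof_range_m(667e-9):.1f} m")  # ~100.0 m
```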

The latest Pro versions of Apple’s iPhone and iPad use a camera coupled with lidar for measuring depth up to about five meters (16 ft.). The technology supports advanced photography features such as portrait mode, as well as potential augmented-reality apps.

AImotive’s Kishonti said that his company uses these next-gen sensors in its fleet of approximately 20 test vehicles. It has deployed a set of “mule” cars – mainly in Europe but a few in the U.S. and Japan. They are equipped with a technology stack that AImotive will use as a basis to enable automated driving in Sony test cars. Quanta provides automotive-grade processing platforms for AImotive’s test fleet, while Nvidia, a shareholder in AImotive, is the chip supplier. 

Kawanishi also played up Sony’s use of image sensors with LED flicker mitigation and high-quality HDR capture. “They enable high-precision recognition regardless of lighting conditions,” he said. The specification sheet for the image sensor notes these frame rates: AD 10 bit at 40 frames per second and AD 12 bit at 30 frames per second. AD, in this case, refers to analog-to-digital conversion at the specified bit depth. Greater bit depth captures subtler variances of color, contrast and exposure, but the frame rate drops because the sensor’s limited readout bandwidth must accommodate more data per frame.
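
A back-of-the-envelope calculation shows why the frame rate falls as the bit depth rises: at a roughly fixed readout bandwidth, more bits per pixel leave room for fewer frames per second. The pixel count below is a placeholder assumption, not a published figure for this sensor.

```python
# Rough readout arithmetic for the two quoted sensor modes. The pixel
# count is hypothetical; the point is that both modes land near the
# same raw data rate, suggesting a shared bandwidth ceiling.

PIXELS = 5_000_000  # assumed 5 MP sensor, for illustration only

def readout_gbps(bit_depth: int, fps: int, pixels: int = PIXELS) -> float:
    """Raw pixel-data rate in gigabits per second."""
    return pixels * bit_depth * fps / 1e9

print(f"10-bit @ 40 fps: {readout_gbps(10, 40):.1f} Gbit/s")  # 2.0 Gbit/s
print(f"12-bit @ 30 fps: {readout_gbps(12, 30):.1f} Gbit/s")  # 1.8 Gbit/s
```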

Jellicoe explained that flicker mitigation could give Sony’s sensors an advantage over cameras with a fixed frame rate. “When you image anything that has a light source, some frames have no light, and some frames have loads of light, so it can be quite confusing,” he said. “It sounds like Sony has solved that.”
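
A toy simulation makes the problem concrete: a pulse-width-modulated LED sampled by a camera at a fixed frame rate is caught lit in some frames and dark in others. All numbers below are illustrative.

```python
# Toy model of LED flicker under a fixed frame rate: a PWM-driven LED
# (100 Hz, 30% duty cycle) sampled at 30 fps with a near-instantaneous
# exposure appears lit in some frames and dark in others.

LED_HZ = 100   # assumed PWM frequency of the LED source
DUTY = 0.30    # fraction of each PWM cycle the LED is lit
FPS = 30       # fixed camera frame rate

def led_lit(t: float) -> bool:
    """True if the LED is emitting at instant t."""
    return (t * LED_HZ) % 1.0 < DUTY

print([led_lit(i / FPS) for i in range(9)])
# [True, False, False, True, False, False, True, False, False]
```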

Kishonti is excited about the high resolution of Sony sensors. “HDR is probably the most important piece,” he said. “If I want to see 500 meters away, then I can use a specific small part of the image, kind of like a virtual zoom.” He explained that detecting the color of a traffic light several blocks away could help determine broader traffic patterns and thereby suggest if a car should slow down or maintain its pace. 
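
As a rough sketch of that “virtual zoom” idea: cropping a region of interest out of a high-resolution frame keeps full native detail on a distant object without any moving optics. The resolution and coordinates below are hypothetical.

```python
import numpy as np

# "Virtual zoom" sketch: instead of optics, crop a small region of
# interest (ROI) out of a high-resolution frame to inspect a distant
# object such as a traffic light several blocks away.

frame = np.zeros((3000, 4000, 3), dtype=np.uint8)  # assumed 12 MP frame

def virtual_zoom(img: np.ndarray, cx: int, cy: int, half: int) -> np.ndarray:
    """Return a (2*half x 2*half) crop centered on pixel (cx, cy)."""
    return img[cy - half:cy + half, cx - half:cx + half]

roi = virtual_zoom(frame, cx=2100, cy=900, half=64)  # distant traffic light
print(roi.shape)  # (128, 128, 3) -- a small patch at full native detail
```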

Sony is promoting the use of its high-resolution cameras and the potential adoption of onboard Ethernet communications. Nonetheless, Jellicoe, the TTP consultant, said that high-resolution cameras run the risk of producing too much data. “You end up needing an enormous computer to wade through it all,” he said. “Autonomous vehicles have not been screaming out for high-resolution cameras, but it’s nice to see where Sony sees itself.”
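
Some quick arithmetic makes the point concrete. The camera count, resolution, bit depth and frame rate below are illustrative assumptions, not figures from Sony or AImotive.

```python
# Rough sizing of the data firehose from a multi-camera rig with
# high-resolution sensors (all figures assumed for illustration).

CAMERAS = 8
PIXELS = 8_000_000   # 8 MP per camera
BIT_DEPTH = 12
FPS = 30

bits_per_second = CAMERAS * PIXELS * BIT_DEPTH * FPS
terabytes_per_hour = bits_per_second * 3600 / 8 / 1e12
print(f"{bits_per_second / 1e9:.0f} Gbit/s raw, "
      f"~{terabytes_per_hour:.0f} TB per hour uncompressed")
# 23 Gbit/s raw, ~10 TB per hour uncompressed
```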

Early days
Sony launched the VISION-S project in 2018. Three years later, its development advanced to the road-testing phase. AImotive’s Kishonti described Sony’s requirements for the VISION-S ADAS system as a “roadmap” rather than a specific set of requirements. The VISION-S, after all, is an entirely new vehicle platform being built from the ground up. He pointed to the upgradable features of a smartphone (or Tesla’s pioneering over-the-air updates) as the framework for launching a product. Sony can add more advanced features in potential future releases. 

When this author visited AImotive in Mountain View in early 2020, engineers used a Toyota Prius mule to demonstrate hands-off highway driving, traffic-jam assist and automated valet parking in a multi-story structure. The systems had minor flaws – such as difficulty navigating around undifferentiated concrete elements – but appeared well along in development.

Kishonti hinted that AImotive could apply those features to the Sony project at some point, but he balked at the prospect of fully autonomous driving. “I expect really high automation,” he said. “But if you’re driving in Europe, especially in southern countries where there are a lot of scooters around, that’s not an easy task to solve.”

AImotive’s work with Sony is less about grandiose plans for robotic operation and more about safety. That requires choosing the best set of sensor inputs for the broadest array of driving scenarios. “Every weather situation and every road surface are a little bit different,” Kishonti said. “You need to make a system which works reliably and safely in every condition. These are not [SAE] Level 5 systems, so you still have a backstop of the human driver. But you want to make sure that you don’t cause more problems as you add safety and convenience.”

Moreover, given the early stage, it’s critical to have a flexible hardware strategy. For example, the resolution of the cameras and the speed of the onboard compute might be suitable for American highways, but the system also needs to handle the unrestricted speeds on sections of the German autobahn. Sensors and chips are getting more powerful every day, so the potential for upgradability is paramount.
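
The resolution-versus-speed tradeoff behind that autobahn point can be sketched with small-angle arithmetic: higher closing speeds demand detection at longer range, and at longer range a car subtends fewer pixels. The field of view and resolution below are hypothetical.

```python
import math

# Approximate horizontal pixels a car subtends at a given range, for an
# assumed 4000-pixel-wide camera with a 60-degree horizontal field of view.

H_PIXELS = 4000
HFOV_RAD = math.radians(60)
CAR_WIDTH_M = 1.8

def pixels_on_target(distance_m: float) -> float:
    """Pixels across a car at the given range (small-angle approximation)."""
    return (CAR_WIDTH_M / distance_m) / (HFOV_RAD / H_PIXELS)

for d in (100, 200, 400):
    print(f"{d} m: {pixels_on_target(d):.0f} px")
# 100 m: 69 px, 200 m: 34 px, 400 m: 17 px
```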

It shouldn’t be overlooked that Sony also is famous for its sophisticated interfaces applied to videogames, electronic devices and household appliances. “I haven’t yet seen the Sony [car dashboard] system. But I know that we need to provide a lot of data to feed that human-machine interface,” Kishonti said.

The AImotive CEO said that a Sony car also could leverage the Japanese company’s significant brand presence. “There are only a few companies that have so many followers. Everybody knows Sony either for the Walkman or the PlayStation, depending on their age,” he said. “As an independent software solution provider, we can help these guys accelerate their development. The auto industry is in a transitional period. Nobody knows where it will be 10 or 20 years from now.”
