Episode 151 - Unlocking Another Dimension with Depthsensing Technology

Machines that see and understand the world around them? It’s possible with depth-sensing technology that maps surroundings, objects, and people more reliably and in more detail than ever before.

Leading this charge is Sony Depthsensing Solutions, whose Time-of-Flight (ToF) technology gives machines the ability to see accurately in 3D in real time by establishing the distance between the camera and multiple points on the subject’s surface. From mobile phones to smart homes to industrial robots, the possibilities for this technology are expanding into everyday life.

From an automotive perspective, Sony’s vision for car safety integrates advanced imaging and sensing technologies (radar, LiDAR, and camera) to deliver a 360-degree “safety cocoon” for faster responses to potential dangers and obstacles on the road. Gesture control and in-cabin monitoring help drivers stay focused on the road and can differentiate between people and objects in the car.

To learn more about this game-changing technology, we sat down with Daniël Van Nieuwenhove, President, Sony Depthsensing Solutions, to discuss how the company’s applications can be used across industries and the importance of creating a safer and more immersive experience on the road.

Meet Our Guest

President, Sony Depthsensing Solutions

Daniël Van Nieuwenhove obtained his engineering degree at the Vrije Universiteit Brussel (VUB) in 2002 and acquired a PhD at the same university on the subject of CMOS circuits and devices for 3D ToF (Time-of-Flight) imagers.

Post university, as co-founder of Optrima, he brought proprietary 3D CMOS ToF sensors and imagers to the market whilst working on CAPD (Current Assisted Photonic Demodulator). Daniël is a named inventor on multiple patents and has authored several scientific papers.

Following a merger, Optrima became SoftKinetic and was ultimately acquired by Sony in 2015. Now president of Sony Depthsensing Solutions, Daniël continues to pursue his passion for engineering by developing the mobile, industrial, and automotive 3D sensing business for new applications such as gesture control, 3D scanning, and navigation.

In 2019 Daniël was awarded the honor of Outstanding Engineer at Sony for "3D ToF Sensor and System Used in Cars and Mobile Applications." “My passion for technology and the great potential of 3D have been the driving forces which brought me from initiating a start-up to leading Sony Depthsensing Solutions, which has been, and remains, tremendously fun.”


Grayson Brulte:

Hello, I'm your host Grayson Brulte. Welcome to another episode of SAE Tomorrow Today, a show about emerging technology and trends in mobility with the leaders and innovators who make it all happen. On today's episode, we're absolutely honored to be joined by Daniël Van Nieuwenhove, President of Sony Depthsensing Solutions.

We'll discuss Sony's depth sensing technology and the impact it has on automotive safety. We hope you enjoy this episode. Daniël, welcome to the podcast. 

Daniël VanNieuwenhove:

Hey, good afternoon, Grayson. Welcome. 

Grayson Brulte:

Excited to have you here because depth sensing has immense capabilities, immense use cases throughout a variety of industries.

It's not just one industry. The technology carries across a variety of industries. And Daniel, to kick things off, let's dive right into this. What is depth sensing and how would you describe it to a listener? 

Daniël VanNieuwenhove:

Depth sensing. What few people realize is that cameras we have all around us, they're just recording color in two dimensions and they don't contain the depth.

So depth sensing actually means that you also record the distance to all the points in the image, and that's quite crucial, actually. Because as we all know, we live in a three-dimensional world. And in order to understand our environment properly, especially for machines, this third dimension is very crucial.

As humans, we tend to recognize things in color pictures but the depth is not actually present in those pictures. 

Grayson Brulte:

You've described depth sensing as a game-changing technology. Is that because of the distance in the image, or why do you describe it that way? 

Daniël VanNieuwenhove: 

Game changing means that it's really revolutionizing quite a few industries, actually. As I just mentioned, depth is really crucial for machines. In order to enable machines to properly understand their environment, they need these three dimensions. And basically there are a lot of applications that are trying to run off color images. If you take, for example, face recognition, it's running off color images as well in many mobile phones today.

However, if you just show it a picture of yourself, the face ID also recognizes you. So somebody carrying your picture could log in to all your details. To avoid that, you need to also have the exact representation of the geometry of your face. And that's like a fingerprint; it's really unique to every human being.

So this is an example where depth allows a machine to really know it's you and the color just allows the machine to think it's you. 

Grayson Brulte:

What you described there seems like depth sensing could increase security for biometrics, if you're at an ATM, for example, or you're trying to unlock a vehicle. Is that a fair statement?

Daniël VanNieuwenhove:

Oh, absolutely. Absolutely. There are some mobile phones today which already use some of the depth perception in order to bring this extra level of robustness. And the advantage of those is that they can be used for unlocking your financial data and these kinds of things, which makes wiring some money very rapidly from your phone very handy. So in that respect, we can also see that this is a natural step to also be included in ATMs, in airports, in all these kinds of things. And that's just the biometrics angle to it, which is just the tip of the iceberg, if you want. Otherwise it wouldn't be a game-changing technology. It's also things like autonomous navigation of robots, cars, drones. It's also things like 3D scanning your environment for augmented reality. It's also things like monitoring and properly understanding traffic, or the amount of humans, or whatever you want monitored properly.

All these things can be observed properly if the machine possesses some 3D perception instead of just color perception. 

Grayson Brulte:

That's game changing. You're increasing security. You're allowing the autonomous vehicle to drive better. You're allowing the augmented reality world to be made.

Let's dive into the automotive industry here. You described the example with the pedestrians. What are some other applications that depth sensing can be used for in the automotive industry? 

Daniël VanNieuwenhove:

Yeah. Automotive these days is a huge category where a lot of things are changing, and one of the evolutions we see coming is the autonomous driving evolution, which is a huge challenge in the sense that when cars start taking control of the steering wheel, you wanna be absolutely sure that you don't hit any pedestrian or any object that can damage your car. So in that respect, having some 3D perception is not just an option, it's mandatory. And you can either do that by multiple viewpoints: if you have multiple RGB viewpoints, you can see the depth from those.

The downside of those is that in darkness they're quite blinded, because they need light to see. And that's where there are other depth technologies, like radar, or active light-based depth technologies, or even sonar, like ultrasonics, that can be combined into a full 3D view of the surroundings of the car.

And that's autonomous driving. Besides that, there are also evolutions happening, or going to happen, inside the cabin. Some of which are already on the way, some of which will still follow. So there's quite a lot happening in automotive. 

Grayson Brulte:

Indeed. There's a lot happening. We're seeing the improvements in ADAS: pretty much every new vehicle today has emergency braking and lane-keep assist as standard features.

Is Sony's depth sensing technology currently being used in vehicles on the road today? 

Daniël VanNieuwenhove: 

Oh, Sony's providing a lot of sensors, and there are definitely vehicles today benefiting from that. So the short answer is yes. Unfortunately, I can't give you the details of which brands or which cars are really using that. But yes, there are quite a few cars on the road today using sensing of some form that is coming from Sony.

Grayson Brulte: 

You described earlier, let's say, the security application, where you're making that consumer's bank account safer through the depth sensing. And when everybody gets into a vehicle, if you're at home and your wife takes your child to school or goes to the grocery store, you hope and pray they get to the store and home safely.

With all the advancements being made today, the emergency braking, as I said, the lane-keep assist: what are the impacts that depth sensing could add to safety? Can depth sensing, as the technology develops, increase safety, so you know the vehicle's emergency systems will kick in because there's depth sensing there?

Daniël VanNieuwenhove: 

It definitely does. It's all about increasing the robustness of these systems, right? Of course, the ultimate is just being completely confident, in any situation, any environment, that the car will control and take care of everything. In reality, this will happen in steps. We are definitely getting there, and a lot of sensing and sensors will be necessary for that, including depth sensing, but also other modalities in combination with that.

Regular color sensors, et cetera. But there are some very important steps to go. Initially, we know ADAS today, where we are getting systems that are assisting in driving, the lane control, et cetera. You mentioned that. So at this step, these systems need to monitor as well, some way or the other, that you keep your hands on the steering wheel, that there is a level of control from the driver.

The next step will be to allow the driver to really release the steering wheel, give the autonomy to the car, and have the car take liability for the driving. And this is a significant step. And initially that step will be taken by still having some observation of the driver, so that the driver can take over the steering whenever the car feels it's not fully understanding, or not fully 1000% confident, that it's driving safely. So in that respect, the in-between step, between what we have today and just enjoying an autonomous ride, sleeping in the back of the car, will be that you can leave the steering to the car, let the car drive, but you still will be requested to keep some attention and be ready to take over when the car is not perfectly sure about what it's doing. And in order to get to that level, the car needs to have as much sensing and as much understanding of the 3D environment as possible, because that is what enables the car to be fully confident. And yes, we will only enjoy those services if there are some 3D capabilities given to the car to take care of that functionality. 

Grayson Brulte:

It's not just taking care of you, making it safe for you; it's making it safer for other individuals on the road. Perhaps they're driving and they wanna know that your system is monitoring to take care of you, to make sure that you're not misbehaving or doing something silly. You have individuals that do silly things in vehicles; we've all seen the videos on YouTube. But perhaps there's a driver that's distracted and keeps talking to the passenger, turning their head, or turning around telling the kids to be quiet because they're on a call.

Can your depth sensing technology do that? Say, wait a second, this driver's not paying attention; they're telling the kids to be quiet, talking to the passenger. Can you notice that? And then the system goes, perhaps, beep: driver, please pay attention or the system will disengage. 

Daniël VanNieuwenhove: 

Oh, absolutely. Yeah. You got it fully right there, Grayson. This is a very important field, especially as I explained that you will be required, in certain situations, to be able to take over the steering wheel. So all car manufacturers will need to install some mechanisms to fully monitor exactly those situations, so that no unnecessary accidents happen when drivers get used to not driving and all of a sudden need to take over.

Features like head orientation, the attention of the driver, where he is looking: all these angles and aspects are being taken care of by 3D systems in your future cars. Adding to that, a big thing in cars already today is what we call drowsiness detection. That's already a feature which is in some cars today, but the way it's implemented in most everyday cars today is quite low level, and it's not very accurate. So this is a good example of a system that can be made much, much better by bringing in some 3D, by bringing in multiple modalities and multiple sensors, so that you can know 100% sure when a driver is distracted, or when he's drowsy, or when things are not really as they should be. And indeed, you will get an annoying beep in those situations. 

Grayson Brulte:

When you monitor the drowsiness, the tiredness: the driver's putting their head backwards and forwards, they're half awake, half asleep, they're opening the window trying to get cold air to wake up. Is that where the depth technology comes in?

Wait a second, this head is moving in a pattern that we've determined could mean drowsiness. Is that an example of how the depth sensing technology could be used? 

Daniël VanNieuwenhove:

Yes, absolutely. Yeah. So the depth technology will know, down to the last detail, all your movements in three dimensions.

What you're doing, the speed of your head movements, how attentive you are. Also your gaze will be tracked, and is already being tracked in many cars today. So all this big library of data points, together with some good compute power, will enable the car to fully understand what the state of the driver is.

Also, maybe adding to that, if we talk about the state of the driver: also the happiness, the attitude, how the driver is feeling will be monitored, because eventually we want a tailor-made experience in a car. We want a car to be an extension of us as humans. It's taking us somewhere, and while going there, we wanna maximize the quality of our time. So all this is being looked at with sensing technologies. 

Grayson Brulte:

When you look at the sensing technologies: I have a vehicle that has sensing technology, and it always beeps at me and tells me to move my steering wheel, because the steering wheel's blocking where the sensor's put. And I'm just a consumer, okay?

You didn't put this in the right place, and obviously it makes a big difference. Do you have best practices for where the sensor should be put so the depth sensing will work the best? Or do you work with your OEM partners to determine: we recommend pointing the technology here to get the full benefit of it?

Daniël VanNieuwenhove:

Yeah. A lot of it is co-creation, of course, because in the value chain there are a lot of players, and we are typically providing the sensing part and some of the software. So in that context, many aspects of it are collaboration. Now, there is also a lot of joint investigation, joint research going on to see where these cameras need to be positioned in order to maximize their efficiency, their use, their quality, and minimize the cost for the consumer in the end, because you don't want to have cameras everywhere; that wouldn't make sense. So typically in the center of the car, behind the steering wheel; outside of the car, it's more front and rear, but also on the surroundings there is always a level of sensing necessary. I would say quite the obvious positions, but the effort is to minimize the amount of cameras and to maximize the efficiency of the cameras and what they can achieve in the car. That's the trend. 

Grayson Brulte:

The cameras can achieve a lot. I spoke to an Uber driver recently that has an electric vehicle with a lot of cameras on it. I asked this individual, do you love it? He said, I love it. I said, sir, why do you like it so much? He goes, now when I go through an intersection, I'm taking an unprotected left-hand turn,

I can see all the camera data in case somebody decides to be silly and run the light or run into it. Being able to see all those cameras makes me a safer driver. That's just one example that I've personally experienced, talking to an individual with the cameras there. From the safety perspective, Sony is all in on safety.

Your depth sensing technology is part of Sony's safety cocoon. I repeat Sony's safety cocoon. It's really a fascinating approach that's been well documented throughout Sony's YouTube videos. Could you talk about that approach to safety please? 

Daniël VanNieuwenhove:

First, we should distinguish the ranges around the car. There are multiple ranges around the car that need to be safeguarded. So there is what we call the short range around the car, which is the perimeter of around 10 to 15 meters around the car, especially in contexts close to your home where there are a lot of humans, like parking lots and these kinds of situations.

There is a strong need, even at slow driving, to be 1000% sure that you are not missing even the slightest detail. So that's the closest context around the car. Then you have the longer range technologies, and of course everything in between. But long range typically means on the front of the car: LiDAR technology is looking around a hundred, 150 meters far, mostly for highway driving, because there you really need to look far ahead of the car. The radar technologies give a complement to that. And then there are multiple mixes of modalities that are used by various OEMs in order to bring about what they believe is the best strategy to bring safety there.

And at Sony, we are trying to provide the highest quality of all these modalities to the market. But on this long range, in the front, there are already deployed technologies like automatic cruise control or adaptive cruise control. These exist today. But of course, when making lane changes, when taking exits and these kinds of things, you always need to have a mix and a proper understanding of everything going on on the highway.

An extra angle to this is that there are also different technologies. There are 3D technologies that are blinded by things like mist, or fog and rain, et cetera. There are technologies that can see through those. So those are partially under investigation, partially also being worked out.

That's part of the safety aspect we want to deploy in the car as well. That was the front of the car. In the rear of the car, for many years there has been a regular color rear-view camera, but of course, the more you can secure, by 3D technologies and others, what a car understands of its environment, the safer the car gets.

There is maybe a category in between the long range and the short range, a middle category, which is obviously used in between both those situations. Not much I can say about that, but those are definitely there as well. That's more a 40-50 meter category, which is typically also taken full 360 degrees around the car today.

Grayson Brulte:

You mentioned the environment. You have the safety cocoon, which is making the car safe to operate, safe to drive, or eventually safe and autonomous. And then the other path back to the environment is inside the vehicle. When the consumers are inside the vehicle, they want to have a great experience.

They want things to work. You wave your hand to open the sunroof, you want it to work. You ask the car to change some music or raise the volume, you want it to work inside the vehicle. How is Sony's depth sensing technology being used outside of safety? Are there consumer comforts that it's being used for, as described with opening and closing the sunroof, or perhaps if you're in front of the screen and you wanna swipe left to right to change tracks? How is it being used inside the vehicle? 

Daniël VanNieuwenhove:

Yeah, good question. Actually, inside the vehicle, as I mentioned earlier, there is this focus on making the experience as nice and as good and as high quality as possible. So inside the vehicle there are already today many comfort features, and the first focus of depth technologies as well is to further enable those comfort features.

I can give examples. Everybody knows you can now save the configuration of your electrical seat in the car so that it fits automatically, so that the mirrors and everything are configured properly. Now with depth technology, even saving it and having it configured for you is a redundant step, because the 3D technology can understand immediately what should be the most accurate position of the mirrors, et cetera, for you. There is the facial recognition that can launch that configuration automatically, where now sometimes it's Bluetooth, sometimes there are systems which are not always handy, or which you have to train or manually configure.

When you save your face in one of the future cars, it'll automatically know that it's you and be able to configure all these things automatically. And we also tend to look at a car as a multimedia device. We all remember the evolution from a feature phone, or an old regular phone, to a smartphone.

And in the beginning it felt a bit awkward to have all these internet-enabled things on your phone, but rapidly we adopted that and it made total sense. And we have a similar vision for a car, actually. We believe a car should be multimedia enabled. And in that context, you can think about face recognition, for example, doing a payment in the car, customizing your favorite movies, music, and all those things.

There are many more things that we see in the context of comfort. Maybe the last one I can mention is a technology like gesture control, which has been a very challenging technology to deploy up until a couple of years ago, because it requires a lot of robustness and differentiation so that there are minimal false positives and the system really operates securely. But enabled with 3D in the proper context, it is really a very easy and intuitive interface, and it allows you to keep your eyes on the road while controlling all these new features in the car, while switching music, or launching maybe a video call or other things that you will be doing whilst driving in your next-generation car. 

Grayson Brulte:

I love the fact that you get into the vehicle, the seat sets its position, the mirrors set their position. You're creating the bespoke experience; you're giving the Savile Row suit of vehicles. You go in there to get a beautiful bespoke suit made that fits you, and you're doing that for the vehicle.

You mentioned payments. I think in-car payments is gonna be a giant growth market as society shifts to autonomous vehicles. Could depth sensing technology be used to authenticate a passenger? So it says, okay, Daniël, or okay, Grayson: it has all of my metrics, I like the seat warm, it has my credit card on file, and it says, okay, car, continue riding, and it charges my card. I say, okay, car, turn on the radio. Does that become that authenticated thing, so somebody can't put a mask on and say, oogly-boogly, and then next thing you know they're charging all this stuff to my credit card? 

Daniël VanNieuwenhove:

Initially, obviously, there's gonna be a level of evolution here, and this is a roadmap being built by humanity, I would say. Because even if we look at all the car brands, they're just gonna deploy technologies and implement elements, and then see how we as the human race respond to that, what we appreciate and what we don't. Now, definitely, the things you mentioned, they are part of what is possible and what is considered now. How much, in what mix, and in what shape this all will be deployed depends on many factors, and that will really define the rollout of these things. So I wouldn't say that all of a sudden tomorrow you're gonna have theft protection in all the cars, or in many cars, or you're gonna have recognition of all your passengers. All these things are possible, but things are going in a natural evolution.

Grayson Brulte:

The possibility's good. You have really smart engineers like yourself and the great team at Sony that engineer and build some of the world's greatest products. So I'm not gonna give a timeline to it, but at some point you're probably gonna figure it out and deploy it through your customers into the vehicles.

Outside of the vehicles, are there applications for depth sensing in the home? Sony makes great TVs. You have a lot of products in individuals' homes. Are there applications for depth sensing there? 

Daniël VanNieuwenhove:

Oh, absolutely. There are. As I mentioned, the opportunities in depth sensing are really broad. I mean, there are a lot of things depth sensing can enable, and obviously that's also inside the home.

It's obviously nice that your face is properly recognized. One of the things I'm thinking of is everyday alarm systems. You forget the code or you miss a digit, and in the middle of the night you have this neighboring alarm system running, et cetera.

Somebody came home and missed a digit or whatever, and the whole neighborhood is awake. This happens way too often. Just having robust recognition systems and sensing systems makes it all the more easy. The code's just gonna be a backup, or not even be necessary.

Also access control of your home. You arrive, and if you have robust face ID, robust recognition, these things are just making your home safe. If somebody's running through your home while you are absent and the face is not recognized, there are also multiple gradations you can apply. For example, you are on a holiday and your parents-in-law are checking in, giving the plants some water while you are absent, et cetera. You don't want your alarm system to go off, but maybe you want some response stating, look, there is a female person who is not recognized by the system in your home right now. And then you can decide yourself what you'll be doing. This is, again, just tailoring it on the face recognition. Other things, like gesture recognition, can definitely have a position in the home.

Autonomous AGVs are already in your home. Most of us today have these robot vacuum cleaners, et cetera. They're just on the way to becoming much, much smarter. The problem with those is that they suck up things they shouldn't, or they get stuck somewhere.

And these things can be avoided by improving their sensing systems. So that's a couple of examples. Other things are multimedia systems that can understand better. Some examples I've heard that were also interesting: even for very expensive rental systems where very new movies are rented, you could have systems that count how many people are watching the movie. So I've heard many different things. It goes from monitoring, to comfort, to enabling robots to navigate, to drones flying around your house taking pictures of people in the house from time to time, so you keep a library of what's happening in your home. Who knows? 

Grayson Brulte:

The safety and the security, to me, seems to be a critical use case that can create a lot of value from an economic standpoint.

Insurers are gonna love it because it's gonna limit the amount of checks that they have to write. Going back to automotive: where I live, we have a phenomenon now where individuals leave their keys in the car all the time, and cars get stolen every day. A gentleman I know had his Ferrari stolen yesterday because he left his keys in it---it was a 488---and the Ferrari was gone.

But if it had your depth sensing technology, it would have said, oh, you're not Bob, you can't take the car. Okay, you're gonna help the insurer. So there's a lot of stuff there around the safety and security. I really like that the technology's evolving in a very positive way. There are a lot of use cases, as we described today.

You're sitting here as the President of Sony Depthsensing Solutions. You're an engineer building really great technology. How do you see depth sensing technology advancing over the next decade?

Daniël VanNieuwenhove:

If you look at depth sensing technologies today, there are, as you mentioned at the very beginning, a lot of different technologies trying to bring depth, one way or the other, to machines.

I went over some of those. Like, passive stereo is great, but it doesn't work in the dark. Ultrasonic is very nice, but it doesn't give you good resolution. Similarly for radar: radar is very robust in certain ways, but doesn't really give you resolution, and makes it difficult to understand where the signal is exactly coming from, these kinds of things.

All these technologies have their limitations. You have what they call time of flight, which is active light-based, light radars if you want; they call it LiDAR, flash LiDAR, but it's also called time of flight. You have multiple flavors of those. And essentially the reason you have all those is that the ultimate depth technology hasn't yet been conceived.

What are some of the limitations? Some of them have issues, as I mentioned, with resolution and understanding the granularity in the scene, where things are coming from. Some of the higher resolution ones, like time of flight, have difficulties with longer ranges, or with power consumption, because they have their own light source.

But obviously, when you wanna see very far, that light source will need to consume a bit of power. Or there are issues when multiple of these systems look at the same environment. I can imagine that two cars look ahead of them and they both use active light, or active technologies emitting certain signals in front of them.

Then there must be absolute certainty that they don't interfere. So it's in this kind of context that there is a lot of improvement being made. Making a long-range, low-power, high-resolution depth system that's fully robust to any circumstance is where the roadmap is heading.

And that, as we discussed, will fully enable the game-changing aspect of depth in the context of all these applications. 

Grayson Brulte:

As devices get smarter, low power's gonna be a huge trend for this decade. How can you build the world's best device running on lower power?

Speaking of that, what's the future of Sony Depthsensing Solutions? 

Daniël VanNieuwenhove:

The future of Sony Depthsensing Solutions: first, many of the things we talked about, we wanna enable the industry to have them see the daylight. So at Sony we have this huge portfolio of different sensors and this great capability of producing them with the highest quality.

And so Sony Depthsensing Solutions is involved in designing some of those sensors, but also in adding the system and software layer on top of that, to enable our partners and customers to make the best out of those sensors. So the future of Sony Depthsensing Solutions is exactly focused on all the applications we've talked about, helping our partners and customers see them successfully and create good business for them in the market.

That's what we are trying to achieve. And obviously our motto is: make devices smarter and improve everybody's life. So our ultimate goal is to see, with the rest of the ecosystem, how we can make those depth sensing technologies and sensing technologies improve everybody's life. That's where we are headed. 

Grayson Brulte:

You said the most important thing there, which the Sony brand stands for: high quality. The Sony brand stands for high quality when you buy a Sony product. Going back to when I had my first Walkman, and then my first Sony, the red one, then the yellow one: it was all high quality, and that goes throughout the entire organization, no matter what division you're in at Sony. And Daniel, as we look to wrap up this insightful conversation, what would you like our listeners to take away with them today? 

Daniël VanNieuwenhove:

I hope they got a bit of a feel that all these sensing and depth sensing technologies will really enable a lot of good new use cases and applications in the future, that depth sensing will enable devices to be smarter, and that it'll improve a lot of aspects in our lives. And this in multiple categories, like automotive, in the home, consumer uses, not to mention robotics, et cetera. This is just on the way. And your listeners, Grayson, should be aware that there's a lot of good stuff coming. 

Grayson Brulte:

There's a lot of good stuff coming. There's high quality stuff coming from Sony. Depth sensing is the future. Today is tomorrow, tomorrow is today, and the future is depth sensing.

Daniel, thank you so much for coming on SAE Tomorrow Today. 

Daniël VanNieuwenhove: 

The pleasure was mine. Thank you very much, Grayson. 

Grayson Brulte:

Thank you for listening to SAE Tomorrow Today. If you've enjoyed this episode and would like to hear more, please kindly rate, review, and let us know what topics you'd like for us to explore next.

Be sure to join us next week when we speak with Tim Zuercher from Torc Robotics. He'll discuss how Torc is developing autonomy for L4 trucking. 

SAE International makes no representations as to the accuracy of the information presented in this podcast. The information and opinions are for general information only.

SAE International does not endorse, approve, recommend, or certify any information, product, process, service, or organization presented or mentioned in this podcast.


Listen to the full back catalog on your favorite podcast platform.