Episode 158 - Saving Lives with Driver Monitoring Systems

As the automotive industry moves toward SAE Level 2 and Level 3 autonomy, monitoring driver distraction and encouraging safer driving behaviors is crucial. Driver monitoring systems (DMS) can be lifesaving—and they are increasingly being required by law in many countries around the world.

Cipia is leading this charge with DMS solutions that employ advanced computer vision and AI to monitor the driver’s state in real time. Driver Sense detects signs of drowsiness, distraction, unbuckled seatbelts, phone use while driving, and more, enabling carmakers to provide lifesaving alerts to the driver.

To learn more about this life-saving technology, we sat down with Tal Krzypow, Vice President of Product, to discuss Cipia’s advanced computer vision and AI capabilities, in-cabin monitoring and video telematics, and the overall state of DMS technology in markets around the world.

Meet Our Guest

Tal Krzypow, Vice President of Product, Cipia

Tal joined Cipia after a 5-year tenure with Microsoft, where he served as a Product Planner and Product Manager for key Microsoft Office applications (including Word and PowerPoint). It was there that Tal gained deep knowledge and understanding of global scale in software products. He is an alumnus of Northwestern University's Kellogg School of Management. Tal has spoken at many conferences, including EcoMotion 2022 and Euro NCAP events, where he gave a workshop on DMS.


Grayson Brulte:

Hello, I'm your host, Grayson Brulte. Welcome to another episode of SAE Tomorrow Today, a show about emerging technology and trends in mobility with the leaders and innovators who make it all happen. On today's episode, we're absolutely honored to be joined by Tal Krzypow, Vice President of Product, Cipia. He'll discuss Cipia's driver and occupancy monitoring systems and how the technology can create a safer and better in-car experience.

We hope you enjoy this episode. Tal, welcome to the podcast. 

Tal Krzypow:

Thank you very much.

Grayson Brulte:

It's an honor and a pleasure. I'm excited to have you here because driver monitoring is gonna play a critical, vital role in the development of SAE Level 2 and Level 3 vehicles. There are some analysts and pundits who say that without driver monitoring, you cannot scale to Level 3.

There are some pundits saying that you really need driver monitoring to unlock the true benefits of SAE Level 2. So I'm really excited to get into this conversation with you, since driver monitoring is a hot topic in the automotive world, but some of our listeners might not know it. Tal, what is the current state of driver monitoring?

Tal Krzypow:

I guess we can discuss two aspects. One is the adoption of driver monitoring, which still feels nascent, although we see it coming to vehicles in Europe, in the US, in China, in Japan, in Korea, worldwide. So it's certainly something that automotive manufacturers, OEMs, recognize the importance of. There is also the legislation that entered into effect, and the safety standards, that truly push it forward.

We can expect more and more vehicles to adopt it in the near future. And if you look at Europe and the General Safety Regulation update there, we know that by 2026, all vehicles in the M and N categories will actually have to have a driver monitoring system, for both drowsiness and distraction.

Grayson Brulte:

What happens to the vehicles, the '25s, the '24s, the '23s, that do not have driver monitoring? Will they still be able to operate in the EU? 

Tal Krzypow:

The requirements are divided by the cut-off date for receiving type approval. So already in July of 2024, a vehicle that wants type approval has to have driver distraction monitoring included.

But from July of 2026, distraction monitoring is required for registration. So a vehicle cannot be registered if it doesn't have the technology inside. So even vehicles that received type approval before July of 2024, and were perfectly legal to sell in Europe beforehand, will have to have the technology introduced by July of 2026, through a facelift or something along these lines. 

Grayson Brulte:

That opens up the whole secondary market, the aftermarket. Is there an emerging market there to upgrade, if you wanna use the term retrofit, vehicles to come into compliance with the EU regulations?

Tal Krzypow:

I'm actually not sure about the requirements of the type approval, but I believe that it has to be part of the vehicle. And of course, in order to be part of the vehicle, it has to be automotive grade, as opposed to aftermarket products that may meet only a commercial-grade standard. We do see interest in driver monitoring for the aftermarket, particularly for fleet management. With fleets, we see a lower turnover than with private vehicles, and because the vehicles are retained for more years, we can expect the technology embedded in new vehicles to take longer to arrive. That's why fleet managers already want to enjoy the benefits today, and they're looking for aftermarket solutions.

Beside the OEM offering, the software that we offer for integration, Cipia also offers an aftermarket device, for integration by telematics service providers, as an offering for fleet management systems.

Grayson Brulte:

Are the fleet managers putting in an aftermarket device to reduce risk, or to catch a driver who's goofing off or doing something they're not supposed to, or to lower their insurance rates? What is the main driver for fleet managers to put an aftermarket driver monitoring device in there?

Tal Krzypow:

It's sometimes hard for us to remember, as private people, as consumers, what the ramifications of an accident are. For us, if, let's say, God forbid, there are no injuries or deaths, it typically ends up with visiting the auto shop and fixing a dent or something along these lines.

But for a fleet manager, it means that there is now an asset in the fleet with downtime. It means that the service level agreement with the customer is hit and I may need to pay. It means that the cargo may have been hit and they need to recoup it through the insurance. It means that my premiums go up.

The bundle becomes of such enormous proportions that reducing the accident rate for any fleet is a critical aspect of efficiency, not just logistically but also financially. So it's a critical point and something that is really top of mind. And it's actually truly amazing to see how, with the right prompts, with the right feedback to the driver, a monitoring system can actually change behavior.

I have an example from one of the fleets that we went into: in the first week, 20% of the drivers drove with no seatbelt. Four weeks later, already 80% of the drivers wore seatbelts properly, because you provide constant feedback, and as human beings we adapt. I have another example; it's very anecdotal, but it was so powerful in my realization of how we change behavior.

We were at MWC in Barcelona, and we were set up as a POC in a vehicle. There was a reporter sitting in the driver's seat, and the system was designed there to provide prompts: whenever the tester or the reporter, whoever was sitting in the driver's seat, did not look through the front windshield, they got an audio prompt.

Please pay attention to the road. Okay, so the reporter is very nice. He starts to ask me questions. I'm standing outside of the vehicle, and I remind you the vehicle is stationary. It's on a stage. There are plenty of people around. It's not moving anywhere. The reporter looks at me, to the left, to ask me a question, and gets the audio prompt.

Please pay attention to the road. So he immediately turns his face to the road, and then he listens to my answer. Then he asks me another question, turns his face toward me, gets another prompt, tries to behave, and looks again at the road. The third question he threw from the corner of his mouth to the left while still keeping his eyes on the road, so he wouldn't get the prompt. And he wasn't even driving.

But the feedback that we receive as humans changes our behavior, and it was amazing to see that happening within 60 seconds. So the driver monitoring system as a tool, even before we get to vehicle intervention, changing the forward collision warning or activating a minimum risk maneuver, any of these things, already changes the driver's behavior through basic feedback. So it's a very powerful tool and I'm very excited to see it. 

Grayson Brulte:

That's a very profound change. The prompts, obviously, were annoying, because they got his attention. How would you define that prompt? Was it haptic feedback? Was it a beep? What was it? 

Tal Krzypow:

In this particular case, it was just audio feedback of a sentence, "Please pay attention to the road," literally, and nothing else. But when you look at the General Safety Regulation in Europe, it calls for really any type of feedback. So it can be audio, it can be haptic, it can be visual, and different OEMs may decide to apply it in different ways. It's really flexible from that perspective.

And obviously the best thing for us as humans is to combine different channels, because I may be with the windows down and the wind blowing over everything, and I may miss an audio feedback, but then a haptic feedback will actually make me pay attention. And a visual feedback can be very effective, but if it's emanating from the front while at the very same time I'm actually looking to the right and I need to get the distraction alert, I'm going to miss the flashing light in the front because I'm looking to the right.

Hence the importance of actually combining several types of feedback. 
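A toy sketch of that channel-selection idea, with entirely hypothetical names and thresholds (this is not Cipia's API, just an illustration of why redundant channels help):

```python
def choose_alert_channels(gaze_direction, cabin_noise_db):
    """Pick redundant feedback channels for a distraction alert.

    Hypothetical logic: haptic feedback is always usable; audio is
    skipped in a noisy cabin (windows down, wind blowing); a visual cue
    at the front only helps if the driver is already facing forward.
    """
    channels = ["haptic"]  # felt regardless of noise or gaze direction
    if cabin_noise_db < 70:  # illustrative noise threshold in dB
        channels.append("audio")
    if gaze_direction == "front":
        channels.append("visual")
    return channels
```

For example, a driver looking to the right in a loud cabin would get only the haptic pulse, which is exactly the scenario Tal describes for combining channels.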

Grayson Brulte:

Take a vehicle today with a steering wheel. In certain vehicles from certain OEMs globally, you can block the driver monitoring system, and it gives you this alert: please move your steering wheel. Where is the most ideal place to put the driver monitoring system,

so a person can't play trickery and put the steering wheel where it blocks the driver monitoring system? 

Tal Krzypow:

You're actually asking two different questions, where the answer is divided into two different topics. One is the ideal position for a driver monitoring system, and the second is how to prevent trickery of the system.

The camera position, from Cipia's perspective, is flexible, because we realize that our customers, especially our direct customers, which under the traditional model are typically the tier ones, each own a different real estate of the vehicle. Some own the instrument cluster, some own the infotainment console.

Some own the trim in the A-pillar cover. Some own the overhead console. And we want to work with as many as possible, so we train the system from multiple angles; no matter where you put the camera, we can provide adequate performance. From that perspective, Cipia's solution is fully flexible. Having said that, there is no truly ideal position, because each position offers certain trade-offs.

For example, a camera on the steering wheel column that looks straight at the driver's face is great because it's very centered. You have great coverage of gaze to the left and to the right. You can identify the different areas of interest the driver is looking at in the vehicle with a high level of accuracy.

Having said that, it's behind the steering wheel. The steering wheel moves; while you turn, you occlude the camera naturally, simply because you control the vehicle. So the availability of the system is better suited to driving straight on long stretches of road, which is where distraction and drowsiness really occur.

So it's not a material flaw or a hindrance to the system, but it limits the system's availability. If you put the camera on top of the infotainment console, then you are off to the right of the driver. You don't get as good coverage when the driver is looking left, but then you don't get the steering wheel occlusions, for example.

So each and every camera position offers a certain trade-off, but everything is workable. I can say that we are already in serial production from all four camera locations that I mentioned, and they all work, so from that perspective it's not a concern. Regarding the trickery, or the attempts to block the system:

one of the things that you can do, and that's what we do: we include certain measures in the system that analyze the video and detect failure points. For example, if we see a blurry image, that can be caused by an object that is very close, or a smudge on the lens, or something like that. That's something that we actively look for and alert on.

So we know when the system cannot be relied on because something is wrong with it, and we can report the problem. There are different types of visual quality issues that we detect and report, and then the OEM can choose what to do. 
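As one illustration of such a check (a common technique, though not necessarily the measure Cipia uses), a blurred or occluded frame has little high-frequency detail, which a variance-of-the-Laplacian score can flag:

```python
def laplacian_variance(img):
    """Variance of a 4-neighbour Laplacian over a grayscale image,
    given as a list of rows of pixel intensities. Sharp frames have
    high variance; a smudged or blocked lens drives it toward zero."""
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x]
                   + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def frame_looks_degraded(img, threshold=50.0):
    # The threshold is illustrative; a real system would calibrate it
    # per camera and report the failure upstream, as Tal describes.
    return laplacian_variance(img) < threshold
```

A production pipeline would compute this on hardware-accelerated image buffers, but the principle, watching the video itself for signs the sensor can no longer be trusted, is the same.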

Grayson Brulte:

Fascinating. Do you see some OEMs putting multiple driver monitoring systems in there for occupant detection?

We're getting ready to go into the summer, and every summer you read these horrific reports of a child being left in the backseat, or a dog, and NHTSA unfortunately publishes this data every year. Can you open the system up so it can do occupant detection, perhaps detecting that there's a child or an animal in the back?

If it's an electric vehicle, the air conditioning can turn on and save that child's life or that pet's life. Are you seeing that type of stuff as well? 

Tal Krzypow:

Absolutely, though there is a caveat. First I would say that there is a trend in the industry, an interest among OEMs, to move from pure driver monitoring to full cabin-wide occupancy monitoring, together with the OEMs' desire to cut down on costs, on hardware costs.

We also see the aspiration of combining both driver monitoring and occupancy monitoring through the same wide-field-of-view sensor that monitors the entire cabin. But with the desire to bundle all systems into a single camera at the front of the cabin, naturally the back seat is partially occluded by the backs of the front seats or other elements.

So some scenarios are missed. For example, one of the scenarios that Euro NCAP tests for child presence is a baby in a baby carrier with the canopy closed and the baby covered with a blanket all the way up. It's very reasonable, right? I know that my baby spent some time in the backseat of my car that way. But there is absolutely no line of sight to the baby, so a camera would not be able to detect it.

Or issue an alert in the case that the baby is, God forbid, forgotten in the vehicle. That's why we also see interest in radar systems in vehicles. There are radars in the 60 GHz band and up that can penetrate fabrics and are sensitive enough to vibrations and motions to detect respiration and heartbeat.

And these are great complements to cover the scenarios where children cannot be seen at all because they are completely occluded, whether in the baby carrier, or because an infant crawled into the footwell of the backseat, or something along these lines. To address all scenarios perfectly, there will be a need to combine technologies.

Grayson Brulte:

When you look at combining technologies, does this evolve similar to airbags, where before you just had one for the driver, then it went to the passenger, and now, if you wanna use the term, you're cocooned, surrounded by airbags? Do we get surrounded by driver monitoring and occupant detection from different sensor technologies to have the safest vehicle possible?

Tal Krzypow:

I guess so. Ultimately there is no reason not to. And I veer off here and look way ahead at the future autonomous vehicle that is truly our third space, a lounge on wheels that carries us, in the mobility-services world, from point A to point B without us even owning the vehicle.

Because the moment we step out, the vehicle comes to pick us up and brings us to our destination. We believe that there will be a fusion of multiple sensors to truly serve us, understand our needs, and offer us services and products that are relevant and desirable from our perspective for the ride.

And this will not be limited only to vision and radar, but will also include voice, maybe the wearables that we wear on us, and contextual information: where we are, the GPS, our destination, what the vehicle knows. And when you combine all these together, you can truly understand what's going on.

Who is in the car? Who are they? What is their state? Are they engaged right now in an activity? Let's say that I fell asleep in the car. Maybe the car should block all calls and just let me sleep. Maybe I'm right now driving with my wife and kids and we have a two-hour drive, and the vehicle knows the route.

Right in the middle, there is a restaurant we like. It can offer us: do you want your favorite lunch waiting for you there for pickup? Do you want to authenticate the payment with your face ID, et cetera?

And so the knowledge that you get from all these sensors can truly help the vehicle understand who is inside, what's going on, what's the context and what is the best offering. Even predict our needs.

Grayson Brulte:

What that comes down to is experience. That's a really great experience if it's your favorite restaurant. Or perhaps it's in the morning and you're on your way to the office, or you're going to a conference: you want a cup of coffee, and it knows the coffee shop's there, gets you a cup of coffee, the in-vehicle payment's already authorized, and it knows that you like your latte this way.

Oh, with two shots of espresso, and away you go. That's the future. That's somewhere I believe we're getting; we're not there yet, but eventually we will get there. But where we are today is driver monitoring: a growing market, a growing need for it. From a product perspective, how does the Cipia driver monitoring system work today?

Tal Krzypow:

The system consists of two layers of algorithms. First of all, we receive an infrared video. We use infrared because we want to work under all lighting conditions, with light that does not interfere with the driver's ability to see or change their perception. This is infrared in the 940-nanometer band, something that is beyond the visible spectrum. The first layer is computer vision algorithms. This layer analyzes the facial features of the driver to understand things like where the eyelids are and how open the eyes are, where the pupils and the irises are with respect to the eyes, and what the gaze direction is.

What is the head pose of the driver? A lot of parameters that are associated with the face. Then a second layer of human factors algorithms translates the visual cues into the physiological state of the driver: for example, how a series of blinks is translated into a level of drowsiness, or how a series of gaze directions is translated into distraction.
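As a rough sketch of that second, human-factors layer (the function names, thresholds, and frame rate here are illustrative assumptions, not Cipia's actual algorithms), per-frame visual cues can be aggregated into state estimates like this:

```python
def drowsiness_score(eye_openness, closed_thresh=0.2):
    """PERCLOS-style measure: the fraction of frames in which the eyes
    are (nearly) closed. eye_openness is a per-frame series in [0, 1]
    produced by the computer-vision layer."""
    closed = sum(1 for o in eye_openness if o < closed_thresh)
    return closed / len(eye_openness)

def is_distracted(gaze_on_road, fps=30, max_off_road_s=2.0):
    """Flag distraction when the trailing run of off-road-gaze frames
    exceeds max_off_road_s seconds. gaze_on_road is a per-frame series
    of booleans from the computer-vision layer."""
    run = 0
    for on_road in reversed(gaze_on_road):
        if on_road:
            break
        run += 1
    return run / fps > max_off_road_s
```

The point of the sketch is the layering Tal describes: the vision layer emits raw per-frame cues (eye openness, gaze direction), and a separate layer turns series of those cues into a driver state.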

So that's the way the system operates. There are two levels, and for our customers we offer both levels simultaneously, for each and every frame. The system works in a way that the algorithms receive a frame, analyze everything that happened up to that frame, and report back. From that point onwards, the tier one or the OEM takes over.

They take this output and apply it for safety based on the available systems in the vehicle. For example, let's say the vehicle is at Level 0 of autonomy. At Level 0, the vehicle can do nothing, but the minimum legal requirement is to issue an alert in case of distraction or drowsiness: the audio, visual, or haptic feedback we discussed. At Level 1 and Level 2, the vehicle can actually do something.

There are adaptive cruise control and lane keeping assist; Level 2 is a combination of both, with traffic jam assist or highway driving assist. And with these systems, what Euro NCAP wants to see, in order to assign the full rating in the safety-assist category for driver monitoring, is intervention by the vehicle.

So it's not enough to issue a warning: in case the driver is distracted, the forward collision warning should already go up to a higher sensitivity, and in case the driver becomes unresponsive, the vehicle should act. Unresponsive means the driver was distracted and received a distraction warning, but continued to be distracted for an additional period.

It's three seconds and three seconds. So after six seconds in total, the driver is deemed unresponsive, and the vehicle should engage the minimum risk maneuver, which means it needs to lower the speed to 10 kilometers per hour or less, or even come to a full stop. So these are active safety measures that, based on the driver's state, make the vehicle behave in a certain way to save the driver and those around them.
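The escalation Tal describes can be sketched as a simple ladder, using the three-plus-three-second timing he mentions (this is an illustration of the logic, not an OEM implementation, and real systems would also factor in speed, road type, and warning acknowledgement):

```python
def escalation_state(off_road_seconds):
    """Illustrative escalation ladder for a distracted driver:
    a warning after ~3 s of off-road gaze (with raised forward
    collision warning sensitivity), and a minimum risk maneuver
    (slow to <= 10 km/h or stop) once the driver stays distracted
    ~3 s past the warning, i.e. ~6 s in total."""
    if off_road_seconds < 3.0:
        return "monitor"
    if off_road_seconds < 6.0:
        return "distraction_warning"
    return "minimum_risk_maneuver"
```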

We also see interesting requirements from Consumer Reports and the Insurance Institute for Highway Safety in the US, which issued guidance for their full safety rating saying that L2 vehicles with highway driving assist or traffic jam assist need to have a driver monitoring system to get the full safety rating, because the driver monitoring system has to verify the driver's attention to the road.

While they enjoy the L2 capabilities, the concern there is that many drivers today do not understand the limits of the technology, and when they're given a vehicle with L2 capabilities, it feels like the vehicle can drive on its own, right? It has adaptive cruise control, it has lane keeping, and it does both together.

I can take my hands off the steering wheel: look, mommy, no hands. It keeps the distance, it keeps the position in the lane; I'm good. But they don't understand that the vehicle cannot cope with lane markings all of a sudden breaking up, or someone all of a sudden cutting into my lane in front of me, things that the vehicle is not designed to cope with.

Hence the driver is responsible, legally and functionally, and they have to maintain their vigilance and visual attention to the road at all times. Therefore they came up with a requirement that an L2 vehicle has to have a DMS verifying this visual attention. And of course, at Level 3, where finally the vehicle can drive an entire segment of the road on its own, let's say the highway,

still, the vehicle may at any point in time along the highway, and of course when the highway ends, issue a takeover request, and at this point the driver has to take over. So a DMS has to verify the availability of the driver to address a takeover request. 

Grayson Brulte:

You're right about the limits of the technology. You know it, I know it, the majority, if not all, of our listeners know the limits of the technology.

But a consumer going to a car rental agency, a Sixt or a Hertz or an Avis, to rent a car: oh, this is great, it drives itself. No, it doesn't drive itself. That causes issues. Is this where DMS comes in? Can it, and I'll use a very harsh term, force driver behavior, so drivers really understand the true limits of what an L2 system can do?

Does this put 'em into compliance? And not only put 'em into compliance, it's gonna increase safety. 

Tal Krzypow:

Absolutely. That's the goal here. Yeah, that has been our mission from day one. So it's actually very rewarding to work on such a product.

Grayson Brulte:

It is rewarding to work on such a product, and your product can do a lot. Now, when looking at the competitive landscape, is it your algorithms that separate you from your competitors? What is the secret sauce that has the OEMs and the tier ones signing up for Cipia? 

Tal Krzypow:

I'm going to say that of course we're the best; our leading competitors are going to say that they're the best.

What I hear from customers is that the leading providers provide a pretty similar level of performance from an algorithm perspective when it comes to the basic measures of glance, distraction, et cetera. What sets Cipia apart is our focus on lean hardware requirements and the flexibility of the platform.

So we make it a key focus to require, well, practically for OEMs, the cheapest hardware possible and do the magic on the algorithm side where possible, whether it's a lower resolution of the camera, whether it's leaner processing requirements, whether it's the flexibility of working on weaker platforms and leveraging, where available, hardware acceleration.

So we free up CPU resources for additional functions. We see a very clear trend: when driver monitoring systems were just introduced into the market, in the latest crop of vehicles, the focus was very much on standalone systems, so there is an ECU that is dedicated to the driver monitoring. But as we move forward, the market trend we see is the shift toward domain integration.

And the two primary domains are the ADAS ECU and the IVI ECU, and these ECUs, of course, are supposed to host a lot of other functions, whether it's the external ADAS perception or the internal functions of the IVI. And that's why the desire for lean hardware requirements is so important for the OEMs.

Grayson Brulte:

When I think of lean, I think of Toyota's manufacturing; that's been well done with the lean manufacturing process. But when I also think of lean, I think power-efficient; I think of Arm and how power-efficient it is. How power-efficient are the Cipia driver monitoring systems? Are they extremely power-efficient, tying into your lean strategy?

Tal Krzypow:

Absolutely, though it really depends. So honestly, the choice of an SoC to work on is typically not something that we control. It's something that the OEM already has in mind when they plan their architecture, and we need to play along and fit within the different platforms. I can say that the lowest platform from a power consumption perspective that we reached is the OmniVision OAX8000, where, if I recall correctly off the top of my head, and don't hold me to it if I'm mistaken, it's less than half a watt or something like that for our system to run. So it's pretty negligible. It's not something that you feel at any scale when you have the entire car in mind.

Grayson Brulte:

That's very important as we shift to electrification and eventually to autonomy. But that raises the question: how are you able to run a DMS with such low power when there are competitors in the market that use a lot more power to run their systems? 

Tal Krzypow:

I think it ties back to our roots. When Cipia started its path, it was known as eyeSight. We changed the name a couple of years ago. In our previous incarnation we were always an embedded computer vision company, but we focused on embedded computer vision for consumer electronics, for touch-free gesture control interfaces. And we had customers such as Lenovo, Sony, AMD, Toshiba, and NTT Docomo, really well-known household brands.

In all these devices, what characterized our activity was that the device was already designed and ready. When we came in with our solution, we were told: okay, you want to run on our laptop? We didn't have you in mind. We didn't plan for you. You need to make do with the leanest amount of processing power possible and with the existing camera, no matter what it was. So we developed this state of mind of being agile and agnostic to the platform and requiring the least amount of resources.

We also learned how to combine classic computer vision with advanced AI, because neural networks tend to be heavier, while classical computer vision is less performant but a lot more nimble. When you combine both and use each in the right section of the flow, you can enjoy the benefits of both and optimize the overall system performance.
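One common way to realize that kind of combination, sketched here with hypothetical function names rather than Cipia's actual pipeline, is to run the heavy neural network only periodically and let a cheap classic tracker carry the frames in between:

```python
def process_stream(frames, detect_nn, track_classic, redetect_every=10):
    """Hybrid pipeline sketch: run the expensive neural-network
    detector only every Nth frame (or when tracking is lost), and a
    lightweight classic tracker on the frames in between."""
    results, box = [], None
    for i, frame in enumerate(frames):
        if box is None or i % redetect_every == 0:
            box = detect_nn(frame)           # heavy, accurate
        else:
            box = track_classic(frame, box)  # light, fast
        results.append(box)
    return results
```

With `redetect_every=10`, the network runs on one frame in ten, so the average per-frame cost is dominated by the cheap classic step, which is the kind of optimization that keeps total power low.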

Grayson Brulte:

Would we be having this conversation today if your company had not started in the consumer electronics industry? 

Tal Krzypow:

I don't know. I have to say that I'm very thankful for the fact that in 2017, Euro NCAP released the roadmap that required, for the first time, driver monitoring in vehicles.

Very shortly after, we experienced a surge of interest among OEMs, who came over to us and told us: hey, we heard you're an awesome computer vision company; what do you have for in-cabin sensing? Initially we didn't have anything, but 2018 was dedicated to building a demo, establishing the product-market fit, and starting to develop a product. By 2019 it was clear to us that the consumer electronics market never picked up. We never reached the tipping point to be able to achieve mass-market adoption. And unfortunately, I don't think any gesture control technology in the world reached that; even Microsoft with the Kinect ultimately didn't penetrate the market as they hoped.

So we're very thankful to have identified a new market where our technology can be more meaningful, save lives, and at a certain point be a must in every vehicle. From that perspective, the pivot was a no-brainer and a great help in focusing the company and its efforts. 

Grayson Brulte:

You saw the opportunity; you made the pivot. I had a Microsoft Kinect. I played with it. It was fun to play the video games, but it got old after a while. But here you are. You didn't take the approach of, oh, we're just gonna wait this out and see where it goes. You took the leadership to pivot. You put your computer vision technology to work, and now you're scaling in driver monitoring.

You're doing really well. You're running low power, and I wanna highlight that: you're running low power. So as more EVs come online, and as vehicles take more computing power, you can go in there as well, because of your roots in the consumer electronics industry. Putting this whole conversation together, in your opinion, what is the future of driver monitoring?

Tal Krzypow:

First of all, I already mentioned that driver monitoring and occupancy monitoring will coalesce into one system: one sensor, one SoC, one everything, doing everything together. Specifically on the driver monitoring front, we already see interest from Euro NCAP and other regulatory bodies and safety standards

in capabilities to detect driver intoxication, alcohol impairment, and cognitive load. Today we focus very much on visual attention, on visual distractions, but sometimes it may look as if the driver is looking at the road while their mind is somewhere else: because I had a fight with my boss, because I'm emotionally distraught, because I'm excited about something, whatever it may be.

So there is a need to detect these aspects as well, and these are some of the next steps that you can look for coming to driver monitoring, like emotion. 

Grayson Brulte:

It will be very intense, because there are individuals, and this is a horrible term, that have road rage, and they go absolutely ballistically nuts. If you can detect road rage, you're gonna save a lot of lives.

Tal Krzypow:

I have to say that this is an extremely challenging area, because first of all, there is a difference between the expression of a person and their sentiment inside. Sometimes the expression betrays what we feel, and sometimes it doesn't.

Like right now I'm smiling and I seem very pleasant, but I may be furious inside. It's even more creepy that way. And that's something that is hard to tell from a visual perspective. But the even more challenging aspect is what would OEMs do with this information? There is a lot of uncertainty.

So let's say that you detect a certain emotion. Is it legitimate to interpret a certain emotion as dangerous for the driving task? I'm not sure. I can seem furious during a phone call because I'm angry at someone I'm talking to; it has nothing to do with road rage. How do you interpret the target of my anger?

And until we can do that, and I'm not sure we'll ever be able to, it's too risky and, to a certain extent, irresponsible to apply it in a way that overtakes the vehicle and does not let the driver drive, for example. There are similar questions even with technologies that exist today, like drowsiness detection.

Let's say that a driver gets into the car and they already look drowsy. I can tell you right now they're tired; it's unhealthy for them to drive. But if they only need to drive two blocks to get back home, should the OEM prohibit that and lock down the vehicle?

It's just two blocks. It's not that dangerous; the person is tired, but not to the level that they cannot sustain themselves all the way. As opposed to now they have a two-hour drive. But as long as you don't know this information, as long as you don't have the context, you can't always make the ultimate judgment.

Grayson Brulte:

And then you open up the can of worms called privacy. 

Tal Krzypow:

I actually don't see privacy as a can of worms, because from the very beginning it was clear that privacy is one of the most sacred values we have to respect, and the product was designed with privacy in mind. The system works in a way that each frame we receive is translated to metadata, and from our perspective the image can be discarded and deleted immediately.

Even the metadata is not retained by us; it's used for ongoing calculations, and then this data can be deleted too. The same goes for facial recognition, which some of our customers use for personalization purposes in the vehicle, to know which member of the household is actually driving and adapt certain aspects.

We translate each new face that we see in the image into a feature vector of 512 values, and this feature vector is saved as the driver ID. When we see a new face, we translate it to numbers and compare numbers to numbers. We never compare an image to an image, and we do not retain the image. We can't even backtrack from the numbers to an image.

It's a one-way process. So from a privacy and security perspective, everything is fully secure. No concerns.
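The one-way, numbers-to-numbers matching Tal describes can be sketched roughly as follows. This is a toy illustration, not Cipia's implementation: the embedding function is a deterministic stand-in for a real face-recognition network, and the 0.6 similarity threshold is invented for the demo. Only the 512-value vector survives; the image is deleted after embedding.

```python
import numpy as np

EMBEDDING_DIM = 512      # the 512-value feature vector Tal describes
MATCH_THRESHOLD = 0.6    # illustrative threshold, not Cipia's actual value

def embed_face(image: np.ndarray) -> np.ndarray:
    """Stand-in for a real face-embedding network: maps an image to a
    unit-length 512-dimensional vector. The mapping is one-way, so the
    image cannot be reconstructed from the vector."""
    seed = int(image.sum()) % (2**32)  # toy determinism for the demo only
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(EMBEDDING_DIM)
    return v / np.linalg.norm(v)

def same_driver(stored_id: np.ndarray, new_embedding: np.ndarray) -> bool:
    """Compare numbers to numbers: cosine similarity of two unit vectors."""
    return float(stored_id @ new_embedding) >= MATCH_THRESHOLD

# Enrollment: embed the frame, keep only the vector as the driver ID.
frame = np.zeros((64, 64), dtype=np.uint8)
driver_id = embed_face(frame)
del frame  # the image is discarded immediately; only metadata remains

# Recognition: each new frame is embedded and compared vector-to-vector.
same_face = np.zeros((64, 64), dtype=np.uint8)
other_face = np.full((64, 64), 200, dtype=np.uint8)
print(same_driver(driver_id, embed_face(same_face)))   # True
print(same_driver(driver_id, embed_face(other_face)))  # False
```

The privacy property comes from the embedding being non-invertible: two independent random unit vectors in 512 dimensions have near-zero cosine similarity, so only re-embeddings of the enrolled face clear the threshold.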

Grayson Brulte:

So you're doing the right thing on privacy, and I do respect that. But if an OEM decides that an individual is drowsy, and they only have to go two blocks, that's a whole other can of worms. That's not your can of worms, correct? That's for the OEMs to figure out. But Cipia is really focused on privacy. Tal, I've learned a lot during this conversation; I found it absolutely fascinating. As we look to wrap up this insightful conversation, what would you like our listeners to take away with them today?

Tal Krzypow:

In case some of the listeners are from OEMs and they own the vehicle interior experience: I urge you to push us, and also to leverage the outputs that we provide as much as possible. Today there is a tendency to focus on the legal and safety-standards requirements to ensure that the safety aspects are addressed, but when we provide the outputs of the driver state, we actually already provide so much.

We know, for example, the gaze origin and the gaze direction, so the side mirrors can be adjusted automatically for the driver. We know where the driver is looking. There are more and more displays in vehicles, for example the infotainment, the instrument cluster, the head-up display, and you can interact with all of them.

But how many joysticks can you have on the steering wheel to interact with all of these separately? When you know where the driver is looking, it's the most powerful indicator of the driver's intent. If I'm looking at the HUD and I'm moving the joystick, clearly I'm not trying to change something in the IVI, because I'm not looking at the infotainment.

So you can assign the right control based on my gaze direction. There is so much more that can be done with the existing outputs that I encourage anyone in the industry who is tied to this to leverage what's already there and improve the experience.
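The gaze-routed control Tal describes, one joystick serving whichever display the driver is looking at, might look something like this sketch. Everything here is illustrative: the angular zones, the field names, and the displays are invented for the demo; a real system would intersect the reported 3D gaze ray with the actual cabin geometry.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Display(Enum):
    CLUSTER = "instrument cluster"
    IVI = "infotainment"
    HUD = "head-up display"

@dataclass
class GazeRay:
    """Gaze angles as a DMS might report them (illustrative fields)."""
    yaw_deg: float    # horizontal angle, 0 = straight ahead, + = right
    pitch_deg: float  # vertical angle, 0 = horizon, + = up

def target_display(gaze: GazeRay) -> Optional[Display]:
    """Map gaze angles to the display being looked at.
    The angular zones below are made up for illustration."""
    if -5 <= gaze.yaw_deg <= 5 and gaze.pitch_deg > 2:
        return Display.HUD        # slightly above the road scene
    if -15 <= gaze.yaw_deg < -5 and gaze.pitch_deg < -5:
        return Display.CLUSTER    # down, behind the steering wheel
    if gaze.yaw_deg > 15 and gaze.pitch_deg < -5:
        return Display.IVI        # down and toward the center stack
    return None                   # eyes on the road

def route_joystick_input(gaze: GazeRay, action: str) -> str:
    """One steering-wheel control, routed by where the driver is looking."""
    display = target_display(gaze)
    if display is None:
        return "ignored: driver is looking at the road"
    return f"{action} sent to {display.value}"

print(route_joystick_input(GazeRay(yaw_deg=0, pitch_deg=5), "scroll"))
# scroll sent to head-up display
```

The design point is the one from the interview: gaze becomes the mode selector, so a single physical control can serve the HUD, cluster, and infotainment without dedicated buttons for each.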

Grayson Brulte:

There's no doubt we want an improved experience. There's also no doubt that driver monitoring will play an extremely important role in the future of autonomy.

Today is tomorrow. Tomorrow is today. The future is Cipia. Tal, thank you so much for coming on SAE Tomorrow Today.

Tal Krzypow:

Thank you very much. It has been a pleasure. 

Grayson Brulte:

Thank you for listening to SAE Tomorrow Today. If you've enjoyed this episode and would like to hear more, please kindly rate, review, and let us know what topics you'd like us to explore next.

Be sure to join us next week for another episode of SAE Tomorrow Today Unplugged, where I'll share my candid insights on the future of mobility.

SAE International makes no representations as to the accuracy of the information presented in this podcast. The information and opinions are for general information only.

SAE International does not endorse, approve, recommend, or certify any information, product, process, service, or organization presented or mentioned in this podcast.


Listen to the full back catalog on your favorite podcast platform.