Driver Technologies: innovators of automotive safety, part 2
Meet Driver Technologies, the pioneering AI-based mobility tech company dedicated to revolutionising automotive safety.
CEO Rashid Galadanci co-founded the company in 2018 with a mission to democratise advanced vehicle technologies and enhance safety for all drivers. Alongside him, COO Marcus Newbury brought 14 years of expertise in customer sales and mobility financial lending. Guiding the technological side of Driver Technologies is CTO Ben Heller, whose background spans mobile, web, and server-side technologies. Together, they lead a team committed to providing a safer driving experience.
In this exclusive two-part Q&A with Electronic Specifier, Rashid Galadanci, Marcus Newbury, and Ben Heller talk to Sheryl Miles about their insights, experiences, and the collective vision that drives Driver Technologies towards a safer and more secure driving future for everyone.
How does the motion detection feature operate? Does it detect any movement around the vehicle or only specific types of motion?
BH: The motion detection feature focuses on the car that has our device installed. To develop it, we had to synthesise several existing motion detection concepts into something we feel is reliable.
If you look at how Apple defines motion, it’s actually a little bit too smart, so other things that are not purely “motion” trigger events in their system. For example, if you connect to Bluetooth in your vehicle or open a door, Apple might think you’re driving even if your car is completely still.
To combat this, we take that data as one of many pieces that together paint a more complete picture of the vehicle’s state. This becomes particularly important during quick pit stops for fuel, heavy traffic, or driving on low-speed rural roads. We also have the ability to infer motion and paths for the other vehicles on the road. As part of our accident recreation platform, we can even generate a simulation environment that includes the other actors in the space. Based on this information, we’re inferring telematics data for vehicles in the area around us.
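As a rough illustration of the signal fusion Heller describes, a minimal Swift sketch might combine Apple's activity classifier with GPS speed before declaring the vehicle "driving". The class name, the speed threshold, and the fusion rule below are illustrative assumptions, not Driver Technologies' actual logic:

```swift
import CoreMotion
import CoreLocation

/// A minimal sketch of treating Apple's activity classification as one
/// signal among several, rather than trusting it outright. All names and
/// thresholds here are hypothetical.
final class VehicleStateEstimator {
    private let activityManager = CMMotionActivityManager()
    private var latestActivity: CMMotionActivity?
    private var latestSpeed: CLLocationSpeed = 0 // metres per second

    func start() {
        // Apple's classifier alone can misfire (e.g. connecting to
        // Bluetooth or opening a door), so it is not used on its own.
        activityManager.startActivityUpdates(to: .main) { [weak self] activity in
            self?.latestActivity = activity
        }
    }

    /// Feed in location fixes as they arrive.
    func update(with location: CLLocation) {
        latestSpeed = max(location.speed, 0)
    }

    /// Consider the vehicle "driving" only when the activity classifier
    /// and the measured speed agree (2 m/s is roughly walking pace).
    var isDriving: Bool {
        guard let activity = latestActivity else { return false }
        return activity.automotive && !activity.stationary && latestSpeed > 2.0
    }
}
```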
Can the motion detection feature differentiate between normal movements and potential threats or incidents?
BH: Yes, when looking at accelerometer data, it quickly becomes clear when you’re travelling or accelerating at a rate that feels unsafe from a human perspective.
RG: For example, one of our early partners, an autonomous vehicle company, was conducting tests where they put their data science PhDs in front of vehicles, drove cars at them, and asked them to raise their hands once they felt unsafe. Thankfully, with accelerometer data, we no longer need to put employees at risk. Instead, we can use the technology to determine whether you're accelerating or decelerating above a specific threshold and flag that behaviour for investigation.
BH: We can also do the same thing for harsh turns using a gyroscope. Overall, we want to get away from a place where these measures are subjective and to know with absolute certainty that if drivers engage in particular manoeuvres, it will impact their safety.
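To make the threshold idea concrete, here is a minimal Swift sketch using Apple's CoreMotion sensors. The 0.4 g and 1.0 rad/s limits are illustrative assumptions, and the sketch assumes a fixed device orientation that a real system would have to solve for:

```swift
import CoreMotion

/// A minimal sketch of threshold-based harsh-event detection.
/// Thresholds and the mounting assumption are hypothetical.
final class HarshEventDetector {
    private let motionManager = CMMotionManager()
    private let harshAccelerationG = 0.4  // longitudinal force, in g
    private let harshTurnRadPerSec = 1.0  // yaw rate, in rad/s

    func start(onEvent: @escaping (String) -> Void) {
        motionManager.accelerometerUpdateInterval = 1.0 / 50.0
        motionManager.gyroUpdateInterval = 1.0 / 50.0

        // Acceleration/braking: compare measured g-force to a threshold.
        motionManager.startAccelerometerUpdates(to: .main) { data, _ in
            guard let a = data?.acceleration else { return }
            // Assumes the phone's y-axis lies along the direction of
            // travel; a production system would first estimate the
            // device's orientation within the vehicle.
            if abs(a.y) > self.harshAccelerationG {
                onEvent("Harsh acceleration or braking: \(a.y) g")
            }
        }

        // Harsh turns: the same idea applied to the gyroscope's yaw rate.
        motionManager.startGyroUpdates(to: .main) { data, _ in
            guard let r = data?.rotationRate else { return }
            if abs(r.z) > self.harshTurnRadPerSec {
                onEvent("Harsh turn: \(r.z) rad/s")
            }
        }
    }
}
```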
Does the video quality depend on the mobile device being used? What resolution does it support?
MN: Right now, the Driver app supports a broad range of hardware you can bring to the application, from whatever you have in your pocket to a hardware dashcam a fleet may already have installed.
Our mission is to democratise road safety, and you can't do that if you only support the latest, shiniest smartphone. That said, we also want to capture everything with as much fidelity as possible, so our standard resolution is 720p HD video, comparable to what you'd find when streaming from a cloud-based video platform. Many smartphones on the market support more than that, and we're exploring recording at 1080p and even 4K resolutions, especially for brief moments of the drive triggered by extreme events.
Our engineering team is prototyping a burstable system, where video quality steps up to a higher resolution when an event such as a collision appears likely. Right now, if you download the app, you'll be recording in 720p HD video, but we also capture still images throughout the drive as an accountability dataset.
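The burstable system is still a prototype, but the general shape of the idea can be sketched with AVFoundation's capture presets. The trigger and the preset choices below are assumptions for illustration, not the team's implementation:

```swift
import AVFoundation

/// A sketch of "burstable" recording: step capture quality up when a
/// possible collision is flagged, then back down afterwards.
final class BurstableRecorder {
    let session = AVCaptureSession()

    init() {
        session.sessionPreset = .hd1280x720 // standard 720p recording
    }

    /// Called when a harsh event suggests a collision may be happening.
    func burstToHighResolution() {
        session.beginConfiguration()
        if session.canSetSessionPreset(.hd1920x1080) {
            session.sessionPreset = .hd1920x1080
        }
        session.commitConfiguration()
    }

    /// Drop back to 720p once the event window has passed.
    func returnToStandardResolution() {
        session.beginConfiguration()
        session.sessionPreset = .hd1280x720
        session.commitConfiguration()
    }
}
```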
What methods does the app use to detect driver alertness? Are there specific indicators or patterns that it looks for?
BH: Driver alertness is tricky, as it varies for everybody. As a person with glasses, I have a different baseline from somebody who doesn't wear glasses. The first thing we do is calibrate the app to understand what your normal state while driving looks like. All of this is performed on-device, and the calibration data is used only to determine the driver's baseline posture. Once that's complete, we evaluate lateral and vertical head tilt, eye position, and visible eye diameter to understand whether the driver has a clear view of the road. Blinking won't trigger an alert, nor will normal scanning of the lanes in front of your vehicle, but if your child drops their snack and you turn around to lend a hand, we'll alert you with a quick beep.
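As a rough illustration of the calibrate-then-compare approach, a sketch using Apple's Vision face observations might look like the following. The tolerance value is an assumption, and a production system would average the baseline over many frames and debounce over time so blinks and brief glances never trigger an alert:

```swift
import Vision

/// A minimal sketch of calibrating a driver's baseline head pose and
/// then flagging large deviations. The threshold is hypothetical.
final class AlertnessMonitor {
    private var baselineYaw: Double?
    private var baselineRoll: Double?
    private let maxDeviationRadians = 0.5 // assumed tolerance

    /// During calibration, record the driver's normal head pose.
    func calibrate(with face: VNFaceObservation) {
        baselineYaw = face.yaw?.doubleValue
        baselineRoll = face.roll?.doubleValue
    }

    /// After calibration, flag deviation from the baseline. A quick lane
    /// scan stays within tolerance; turning to the back seat does not.
    func isLookingAway(face: VNFaceObservation) -> Bool {
        guard let baseYaw = baselineYaw, let baseRoll = baselineRoll,
              let yaw = face.yaw?.doubleValue,
              let roll = face.roll?.doubleValue else { return false }
        return abs(yaw - baseYaw) > maxDeviationRadians
            || abs(roll - baseRoll) > maxDeviationRadians
    }
}
```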
How does the app notify the driver when it detects a lack of alertness or drowsiness?
RG: We have a few ways to notify the driver. First, there's an audio alert played at a volume the driver can hear inside the cabin of the vehicle. Our goal is that the driver isn't looking at their phone, but we also flash the screen a bright red colour, which might catch your eye if your volume is down. Either way, we want the driver to notice quickly that something is wrong.
In the case of drowsiness and distraction, the visual alert might catch the driver out of the corner of their eye, but we rely on the audio alert in the case of a forward collision. With forward collision alerts, it's common that you're looking straight ahead at the road but have zoned out while driving. The bright flash can jolt the driver back into quick braking, so we pair both strategies for alerts.
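A minimal sketch of pairing the two alert channels on iOS might look like the following; the sound asset name and the animation timings are hypothetical stand-ins:

```swift
import UIKit
import AVFoundation

/// A sketch of delivering an alert over two channels at once: an audible
/// tone plus a red full-screen flash.
final class AlertPresenter {
    private var player: AVAudioPlayer?

    func presentAlert(in view: UIView) {
        // Audio channel: loud enough to cut through cabin noise.
        if let url = Bundle.main.url(forResource: "alert_beep", withExtension: "wav") {
            player = try? AVAudioPlayer(contentsOf: url)
            player?.volume = 1.0
            player?.play()
        }

        // Visual channel: flash the screen red in case the volume is down.
        let flash = UIView(frame: view.bounds)
        flash.backgroundColor = .red
        flash.alpha = 0
        view.addSubview(flash)
        UIView.animate(withDuration: 0.15, animations: { flash.alpha = 0.9 }) { _ in
            UIView.animate(withDuration: 0.3, animations: { flash.alpha = 0 }) { _ in
                flash.removeFromSuperview()
            }
        }
    }
}
```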
Are there any additional smart features or functionalities that the dash cam app offers to enhance vehicle safety?
MN: One of the features we started developing for one of our partners is our auto start/auto stop video and telematics data recording capability. In those cases, the recording runs longer than the lifetime of the actual drive, so it can function as an in-cabin security system if the phone is disturbed or if there's motion inside or outside the vehicle that the driver wouldn't necessarily expect once the drive has naturally ended. For example, if you park for gas or a quick cup of coffee, that recording may persist for five to 10 minutes after you've parked the vehicle.
Overall, understanding what's happening on the road while you're driving remains the most important thing, but secondary to the safety of the driver is the safety of the vehicle. This focus lets us explore options for more commercial use cases where drivers might plug into an OBD-II port, have continuous power, and want to leave the recording on continuously. At Driver Technologies, we always look for ways to help drivers, even when they're not necessarily in motion.
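That post-park grace period could be sketched as a simple cancellable timer; the ten-minute window follows the "five to 10 minutes" Newbury mentions, and the recorder hooks below are hypothetical:

```swift
import Foundation

/// A sketch of auto start/stop recording: keep capturing for a grace
/// period after the vehicle parks, unless the drive resumes first.
final class PostParkRecorder {
    private var stopTimer: Timer?
    private let gracePeriod: TimeInterval = 10 * 60 // assumed window

    /// Hypothetical hook into the actual recording pipeline.
    var stopRecording: () -> Void = {}

    /// Called when the drive appears to have ended (vehicle parked).
    func vehicleDidPark() {
        stopTimer?.invalidate()
        stopTimer = Timer.scheduledTimer(withTimeInterval: gracePeriod,
                                         repeats: false) { [weak self] _ in
            self?.stopRecording()
        }
    }

    /// The drive resuming cancels the scheduled stop.
    func vehicleDidResume() {
        stopTimer?.invalidate()
        stopTimer = nil
    }
}
```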
How do you envision Driver integrating with autonomous vehicles in the future?
BH: This is what I dream about as I sleep at night. There are many ways to implement the Driver app experience on devices, and we’re in the process of extending that to OEM platforms like TomTom and Google Automotive Services that are native to the vehicle itself.
Fundamentally, we’re trying to understand human driving behaviour. I hope that we can take all of the learnings that we have gleaned from trying to help people stay safe on the road through what is effectively “just” a mobile phone app and teach the cars of the future how to drive in a way that’s more compatible with natural human behaviour.
I would love it if my car in the future drove like me, even if I’m not driving it. Imagine loading a profile from a decade’s worth of driving data to teach it how I want to be chauffeured around. It’s well within the realm of possibility, but ultimately the goal is to teach autonomous cars to drive alongside humans safely. Safety, above all else, is our mission. If we can also consider more eco-conscious driving along the way, we just might earn a future where we don’t need to talk about the dangers of driving quite so often.
What are your plans for the future?
RG: Over the next six months, we plan to announce several partnerships with insurers, automotive companies, and municipalities nationwide that will improve road safety. We look forward to further integrating our AI-powered dash cam and safety alert app, Driver, into insurance-backed programmes for commercial and personal drivers.
To learn more, visit the Driver Technologies website or download the Driver app from the Apple App Store or Google Play.