Aftermarket November 2022

Survey finds drivers want to stay hands-on

A new survey of over 1,000 drivers by the Insurance Institute for Highway Safety (IIHS) in America has found significant mistrust of automated lane changing systems, with drivers preferring to stay hands-on and initiate the manoeuvre themselves.

80% wanted to use "at least some form of lane centering" – a strong endorsement for what we Brits call automated lane keeping systems (ALKS). 36% preferred "hands-on-wheel" lane keeping, compared to 27% for "hands-free", with 18% having no preference between the two types, 16% not wanting to use any form of lane keeping and 4% being unsure.

Asked about lane changing assistance (as opposed to just lane keeping), 73% said they would use some form of auto lane change. However, 45% said they would prefer driver-initiated auto lane change, compared to only 14% for vehicle-initiated auto lane change. 23% said they wouldn't use either type, 13% had no preference and 5% were unsure.

Alexandra Mueller, the IIHS survey's primary designer, commented: "Automakers often assume that drivers want as much technology as they can get in their vehicles. But few studies have examined actual consumer opinions about partial driving automation."

Majorities of respondents were "at least somewhat comfortable" with in-cabin driver monitoring to support such systems: 70% for steering wheel sensors, 59% for camera monitoring of driver hands and 57% for camera monitoring of driver gaze.

Alexandra added: "The drivers who were the most comfortable with all types of driver monitoring tended to say they would feel safer knowing that the vehicle was monitoring them to ensure they were using the feature properly.
That suggests that communicating the safety rationale for monitoring may help to ease consumers' concerns about privacy or other objections."

One of my favourite interviewees this year was Lucas Noldus PhD, founder of Netherlands-based Noldus Information Technology and an expert in driver-vehicle interactions. "Over the coming decades, driving tasks will gradually diminish but, until full autonomy, the driver will have to remain on standby, ready to take over in certain situations," he said. "How will the vehicle know the driver is available? How quickly can he take over? What should be measured to understand the driver's behaviour? If we use a microphone, a video camera, a heartbeat monitor and a link to the ECU, how do we synchronise all of that?

"We work with OEMs, tier-1 suppliers, research institutes and simulator manufacturers to build in our DriveLab software platform. We try to capture all the driver-vehicle interactions, so if he pushes a pedal, changes gear or turns the steering wheel, that's all recorded and fed into the data stream.

"Eye tracking measures the point of gaze – what your pupils are focused on. In a vehicle, that might be the left, right and rear mirrors, through the windscreen or windows, around the interior, even looking back over your shoulders. Eye tracking generates all sorts of data. How long the eyes have been looking at something is called dwell time. Then there's how fast the eyes move from one fixed position to another – that's the saccade, measured in milliseconds.

"Another important metric is pupil diameter. Given a stable light condition, the diameter of your pupil says something about the cognitive load – the harder you have to think, the wider your pupils will open. If you're tired, your blink rate will go up. It's a very useful instrument.

"Another incredibly useful technique is face reading.
Simply by pointing a video camera at someone's face we can plot 500 points – the surroundings of the eyebrows, the eyelids, the nose, chin, mouth, lips.

"We feed this into a neural network model and classify it against a database of tens of thousands of annotated images, allowing us to identify basic emotions – happy, sad, angry, surprised, disgusted, scared or neutral. You can capture that from one photograph. For other states, like boredom or confusion, you need a series of images.

"These days we can even capture the heart rate just by looking at the face – tiny changes in colour resulting from the pulsation of the blood vessels in the skin. This field of face reading is evolving every year and I dare to claim that we are leading the pack."

Using eye tracking to tackle distracted driving
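The two gaze metrics Noldus describes – dwell time and saccade duration – fall out naturally once an eye tracker's output is reduced to timestamped "which region is the driver looking at" samples. The sketch below shows the idea in Python; `GazeSample`, the region labels and both functions are illustrative names of my own, not part of the DriveLab platform, and a real tracker would first map raw pupil coordinates onto these regions.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    """One hypothetical eye-tracker reading: time in milliseconds and
    the cabin region the gaze falls on (mirror, windscreen, etc.)."""
    t_ms: float
    region: str

def dwell_times(samples):
    """Total time the gaze rested on each region ('dwell time')."""
    totals = {}
    for a, b in zip(samples, samples[1:]):
        # Credit the interval to the region the gaze was in when it started.
        totals[a.region] = totals.get(a.region, 0.0) + (b.t_ms - a.t_ms)
    return totals

def saccade_durations(samples):
    """Duration of each jump between two different fixed positions -
    the saccade, measured in milliseconds."""
    return [b.t_ms - a.t_ms
            for a, b in zip(samples, samples[1:])
            if a.region != b.region]
```

For example, a driver who watches the windscreen for 430 ms, flicks to the left mirror in 30 ms and holds it for 470 ms yields dwell times of 430 ms and 470 ms and a single 30 ms saccade.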
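Extracting heart rate from those "tiny changes in colour" is known in the research literature as remote photoplethysmography (rPPG). A minimal sketch of the principle, assuming we already have the mean green-channel intensity of a facial patch for each video frame: detrend the signal, take a Fourier transform, and pick the strongest frequency inside the plausible human pulse band. This is my own simplified illustration, not Noldus's implementation, which will be far more robust to lighting and motion.

```python
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate pulse in beats per minute from per-frame mean green-channel
    values of a facial region (simplified rPPG sketch)."""
    signal = np.asarray(green_means, dtype=float)
    signal -= signal.mean()                       # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))        # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)        # 42-180 BPM pulse band
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                            # Hz -> beats per minute
```

Feeding in ten seconds of 30 fps samples whose intensity oscillates at 1.2 Hz returns roughly 72 BPM, which is why even a subtle, sub-visible colour pulsation is recoverable: it is periodic, and the spectrum concentrates it into one peak.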
