Technical Paper

“Fitting Data”: A Case Study on Effective Driver Distraction State Classification

2019-04-02
2019-01-0875
The goal of this project was to investigate how to make driver distraction state classification more efficient by applying selected machine learning techniques to existing datasets. The dataset used in this project included both overt driver behavior measures (e.g., lane keeping and headway measures) and indices of internal cognitive processes (e.g., driver situation awareness responses) collected under four conditions: no distraction, visual-manual distraction only, cognitive distraction only, and dual distraction. The baseline classification method was a support vector machine (SVM), applied first to identify driver states of visual-manual distraction and then to identify any cognitive-related distraction within both the visual-manual distraction cases and the remaining non-visual-manual cases.
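The two-stage SVM approach described above can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the feature set, labels, and data here are synthetic placeholders, and the stage-2 routing (one cognitive-distraction classifier per stage-1 branch) is an assumption about how the cascade might be wired.

```python
# Sketch of a two-stage SVM distraction classifier (illustrative only).
# Stage 1 classifies visual-manual (VM) distraction; stage 2 classifies
# cognitive distraction separately within the VM and non-VM groups.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))        # placeholder features: lane SD, headway, SA score, ...
y_vm = rng.integers(0, 2, size=200)  # visual-manual distraction label (0/1), synthetic
y_cog = rng.integers(0, 2, size=200) # cognitive distraction label (0/1), synthetic

# Stage 1: detect visual-manual distraction.
stage1 = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y_vm)
vm_pred = stage1.predict(X)

# Stage 2: one cognitive-distraction SVM per true VM condition.
stage2 = {}
for vm in (0, 1):
    mask = y_vm == vm
    stage2[vm] = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X[mask], y_cog[mask])

# At prediction time, stage-1 output routes each case to a stage-2 model.
cog_pred = np.empty(len(X), dtype=int)
for vm in (0, 1):
    mask = vm_pred == vm
    if mask.any():
        cog_pred[mask] = stage2[vm].predict(X[mask])

print(cog_pred.shape)  # (200,)
```

Combining `vm_pred` and `cog_pred` yields the four distraction states (none, visual-manual, cognitive, dual) from two binary decisions.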
Technical Paper

Hazard Cuing Systems for Teen Drivers: A Test-Track Evaluation on Mcity

2019-04-02
2019-01-0399
There is strong evidence that the overrepresentation of teen drivers in motor vehicle crashes is mainly due to their poor hazard perception skills, i.e., they are unskilled at appropriately detecting and responding to roadway hazards. This study evaluates two cuing systems designed to help teens better understand their driving environment. Both systems use directional color-coding to represent different levels of proximity between one’s vehicle and outside agents. The first system provides an overview of the location of adjacent objects in a head-up display in front of the driver and relies on drivers’ focal vision (focal cuing system). The second system presents similar information, but in the drivers’ peripheral vision, by using ambient lights (peripheral cuing system). Both systems were retrofitted into a test vehicle (2014 Toyota Camry). A within-subject experiment was conducted at the University of Michigan Mcity test-track facility.
Technical Paper

Can You Still Look Up? Remote Rotary Controller vs. Touchscreen

2017-03-28
2017-01-1386
The popularity of new Human-Machine-Interfaces (HMIs) comes with growing concerns about driver distraction. In part, this concern stems from the rising challenge of designing systems that make functions accessible to drivers while maintaining drivers’ ability to cope with the complex driving task. Therefore, engineers need assessment methods that can evaluate how well a user interface achieves the dual goal of making secondary tasks accessible while allowing safe driving. Most prior methods have emphasized measuring off-road glances during HMI use. An alternative is to consider both on-road and off-road glances, as done in Kircher and Ahlstrom’s AttenD algorithm [1]. In this study, we compared two types of prevalent visual-manual user interfaces based on AttenD. The two HMIs of interest were a touchscreen-based interface (already in production) and a remote-rotary-controller-based interface (a high-fidelity prototype).
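The core idea behind an AttenD-style measure — crediting on-road glances as well as penalizing off-road ones — can be sketched as an attention buffer. This is a simplified illustration: the 2 s capacity and symmetric 1 s/s drain/refill rates follow the commonly cited AttenD parameters, but the published algorithm includes additional rules (e.g., refill latency and glance-target distinctions) that are omitted here.

```python
# Simplified AttenD-style attention buffer (illustrative sketch).
# The buffer drains during off-road glances and refills during on-road
# glances; a buffer value of zero would flag a potential attention lapse.
BUFFER_MAX = 2.0  # buffer capacity in seconds (AttenD's commonly cited value)
RATE = 1.0        # drain/refill rate in seconds of buffer per second

def attend_buffer(glances, dt=0.1):
    """glances: sequence of booleans sampled every dt seconds,
    True = eyes on road. Returns the buffer trace over time."""
    buf, trace = BUFFER_MAX, []
    for on_road in glances:
        buf += (RATE if on_road else -RATE) * dt
        buf = min(max(buf, 0.0), BUFFER_MAX)  # clamp to [0, BUFFER_MAX]
        trace.append(buf)
    return trace

# A 1.5 s off-road glance drains the buffer from 2.0 s to 0.5 s.
trace = attend_buffer([False] * 15)
print(round(trace[-1], 2))  # 0.5
```

Comparing how often each HMI drives this buffer toward zero gives a single metric that accounts for both where drivers look and for how long.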
Technical Paper

Varying Levels of Reality in Human Factors Testing: Parallel Experiments at Mcity and in a Driving Simulator

2017-03-28
2017-01-1374
Mcity at the University of Michigan in Ann Arbor provides a realistic off-roadway environment in which to test vehicles and drivers in complex traffic situations. It is intended for testing of various levels of vehicle automation, from advanced driver assistance systems (ADAS) to fully self-driving vehicles. In a recent human factors study of interfaces for teen drivers, we performed parallel experiments in a driving simulator and Mcity. We implemented driving scenarios of moderate complexity (e.g., passing a vehicle parked on the right side of the road just before a pedestrian crosswalk, with the parked vehicle partially blocking the view of the crosswalk) in both the simulator and at Mcity.
Technical Paper

Improved Perception for Automated Vehicle Using Multi-Pose Camera System

2017-03-28
2017-01-1401
In this paper, a method of improving an automated vehicle’s perception using a multi-pose camera system (MPCS) is presented. The proposed MPCS is composed of two identical color, high-frame-rate cameras: one installed on the driver side and the other on the passenger side. The perspective of the MPCS varies with the width of the vehicle in which it is installed. To maximize perspective, we use the maximum width of the host vehicle as the camera-to-camera distance for the MPCS. In addition, the angular positions of the two cameras are controlled by two separate electric-motor-based actuators. The steering wheel angle, which is available from the vehicle Controller Area Network (CAN) messages, supplies information to the actuators to synchronize the MPCS camera positions with the host vehicle’s steering wheel.
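The steering-synchronized actuation described above can be sketched as a simple mapping from the CAN steering wheel angle to a camera pan command. This is a hypothetical illustration: the gain, steering ratio, and actuator travel limit below are assumed values, and the paper's actual control law is not given here.

```python
# Hypothetical sketch: synchronize MPCS camera pan with steering input.
STEERING_RATIO = 16.0  # assumed steering-wheel-to-road-wheel ratio
PAN_LIMIT = 45.0       # assumed actuator travel limit, degrees

def camera_pan_angle(steering_wheel_deg):
    """Map the steering wheel angle (read from CAN) to a pan command
    sent to both camera actuators, clamped to the actuator's travel."""
    road_wheel_deg = steering_wheel_deg / STEERING_RATIO
    return max(-PAN_LIMIT, min(PAN_LIMIT, road_wheel_deg))

# A 160-degree steering wheel input pans both cameras 10 degrees,
# keeping their views aligned with the host vehicle's intended path.
print(camera_pan_angle(160.0))  # 10.0
```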
Technical Paper

Automotive HVAC Induced Blade Passing Frequency (BPF) Tone and its Suppression

2013-05-13
2013-01-1915
Audible tones in vehicle interiors are undesirable because of their impact on customer satisfaction and quality metrics. Most of the loudest tonal noise sources located in the engine compartment are isolated from the vehicle interior by the dash-wall. The majority of automotive blowers, however, are located in the vehicle interior in close proximity to the driver and passengers. Hence, any blower-induced tones are airborne and readily propagate to the vehicle occupants. These tones will become even more annoying in future vehicles equipped with hybrid, electric, and start/stop-at-idle technologies adopted to meet fuel economy mandates. Due to increased demands for quieter vehicle interiors with higher airflow for quick thermal comfort, HVAC systems are designed with lower pressure drop, which helps reduce low-frequency broadband noise but does not mask the BPF tone and its harmonics.
Technical Paper

HUD Future in the Driverless Vehicle Society: Technology Leadership Brief

2012-10-08
2012-01-9022
New sensing and fast processing technologies will create an electronic driver in every car by 2025. All people in the vehicle will be passengers! The vehicle will drive by itself from A to B. In that case, what need will there be for a HUD? Below is an investigation of the key issues and some possible solutions.