Response Times for Visual, Auditory and Vibrotactile Directional Cues in Driver Assistance Systems
Lundkvist, André (Luleå University of Technology, Department of Civil, Environmental and Natural Resources Engineering, Operation, Maintenance and Acoustics)
Nykänen, Arne (Luleå University of Technology, Department of Civil, Environmental and Natural Resources Engineering, Operation, Maintenance and Acoustics; ORCID iD: 0000-0002-7048-523X)
2016 (English). In: SAE International Journal of Transportation Safety, ISSN 2327-5626, E-ISSN 2327-5634, Vol. 4, no. 1, p. 8-14. Article in journal (Refereed). Published.
Abstract [en]

The number of advanced driver assistance systems is constantly increasing. Many of these systems require visual attention, and multisensory signals could be one way to reduce the risks associated with inattention. A driver's attention is mainly directed ahead of the car, but inattention to the areas beside and behind it can also pose a risk. There is therefore a need for driver assistance systems capable of directing attention to the sides. In a simulator study, combined visual, auditory and vibrotactile signals for directional attention capture were designed for use in driver assistance systems such as blind spot information, parking assistance, collision warnings, navigation and lane departure warning. An experiment was conducted to measure the effects of different sensory modalities on directional attention (left/right) in driver assistance systems. Attention was assessed in a driving simulator using the Lane Change Task together with a secondary task designed to measure choice response times and error rates for directional (left/right) information conveyed by multisensory signals. Different combinations of visual, auditory and vibrotactile signals were tested and compared. Visual signals, alone (when detected by the driver) or in combination with other modalities, yielded the shortest response times (570 ms on average). Auditory and vibrotactile signals captured attention equally well in terms of response time (650 ms and 740 ms on average, respectively). No significant differences in localization error rates were observed.
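The kind of per-modality response-time comparison described in the abstract can be illustrated with a small simulation. Only the reported mean response times (570, 650 and 740 ms) come from the study; the standard deviation, trial count, and the `simulate_rt` helper below are hypothetical illustration values, not the authors' analysis.

```python
import random
import statistics

# Reported mean choice response times (ms) per signal modality.
REPORTED_MEANS = {"visual": 570, "auditory": 650, "vibrotactile": 740}
SD_MS = 120       # assumed within-modality spread (not from the study)
N_TRIALS = 200    # assumed number of simulated trials per modality

def simulate_rt(mean_ms, sd_ms, n, rng):
    """Draw n simulated choice response times (ms), floored at 0."""
    return [max(0.0, rng.gauss(mean_ms, sd_ms)) for _ in range(n)]

def summarize(samples):
    """Return (mean, standard deviation) of a response-time sample."""
    return statistics.mean(samples), statistics.stdev(samples)

rng = random.Random(42)  # fixed seed so the sketch is reproducible
for modality, mean_ms in REPORTED_MEANS.items():
    rts = simulate_rt(mean_ms, SD_MS, N_TRIALS, rng)
    m, sd = summarize(rts)
    print(f"{modality:>12}: mean {m:6.1f} ms, sd {sd:5.1f} ms")
```

With a few hundred trials per modality, the simulated sample means land close to the reported values, which is the setting in which a choice-response-time comparison like the one in the study becomes statistically meaningful.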

Place, publisher, year, edition, pages
2016. Vol. 4, no 1, p. 8-14
National Category
Fluid Mechanics and Acoustics
Research subject
Engineering Acoustics
Identifiers
URN: urn:nbn:se:ltu:diva-3313
DOI: 10.4271/2015-01-9153
Scopus ID: 2-s2.0-84979209151
Local ID: 11e93edc-f012-44cf-9438-b2710c8cf883
OAI: oai:DiVA.org:ltu-3313
DiVA id: diva2:976170
Note

Validated; 2016; Level 1; 20151127 (lunand)

Available from: 2016-09-29. Created: 2016-09-29. Last updated: 2018-07-10. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text | Scopus

Authority records

Lundkvist, André; Nykänen, Arne
