Ghost on the Windshield: Employing a Virtual Human Character to Communicate Pedestrian Acknowledgement and Vehicle Intention
Luleå University of Technology, Department of Health, Learning and Technology, Health, Medicine and Rehabilitation. ORCID iD: 0000-0003-3503-4676
Luleå University of Technology, Department of Health, Learning and Technology, Health, Medicine and Rehabilitation.
2022 (English). In: Information, E-ISSN 2078-2489, Vol. 13, no. 9, article id 420. Article in journal (Refereed). Published.
Abstract [en]

Pedestrians base their street-crossing decisions on vehicle-centric as well as driver-centric cues. In the future, however, drivers of autonomous vehicles will be preoccupied with non-driving related activities and will thus be unable to provide pedestrians with relevant communicative cues. External human–machine interfaces (eHMIs) hold promise for filling the expected communication gap by providing information about a vehicle’s situational awareness and intention. In this paper, we present an eHMI concept that employs a virtual human character (VHC) to communicate pedestrian acknowledgement and vehicle intention (non-yielding; cruising; yielding). Pedestrian acknowledgement is communicated via gaze direction while vehicle intention is communicated via facial expression. The effectiveness of the proposed anthropomorphic eHMI concept was evaluated in the context of a monitor-based laboratory experiment where the participants performed a crossing intention task (self-paced, two-alternative forced choice) and their accuracy in making appropriate street-crossing decisions was measured. In each trial, they were first presented with a 3D animated sequence of a VHC (male; female) that either looked directly at them or clearly to their right while producing either an emotional (smile; angry expression; surprised expression), a conversational (nod; head shake), or a neutral (neutral expression; cheek puff) facial expression. Then, the participants were asked to imagine they were pedestrians intending to cross a one-way street at a random uncontrolled location when they saw an autonomous vehicle equipped with the eHMI approaching from the right and indicate via mouse click whether they would cross the street in front of the oncoming vehicle or not. 
An implementation of the proposed concept, in which non-yielding intention is communicated via the VHC producing either an angry expression, a surprised expression, or a head shake; cruising intention via the VHC puffing its cheeks; and yielding intention via the VHC nodding, was shown to be highly effective in ensuring the safety of a single pedestrian or even two co-located pedestrians without compromising traffic flow in either case. The implications for the development of intuitive, culture-transcending eHMIs that can support multiple pedestrians in parallel are discussed.
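The signalling scheme the abstract describes, gaze direction for pedestrian acknowledgement and facial expression for vehicle intention, can be sketched as a small decision function. This is purely illustrative; the function name, string values, and mapping structure are assumptions for exposition, not the authors' implementation:

```python
# Illustrative sketch of the eHMI signal-to-decision mapping from the
# abstract. All identifiers and values here are hypothetical.

NON_YIELDING = {"angry", "surprised", "head_shake"}  # vehicle will not stop
CRUISING = {"cheek_puff"}                            # vehicle keeps cruising
YIELDING = {"nod"}                                   # vehicle will stop

def should_cross(gaze: str, expression: str) -> bool:
    """Return True only if the VHC acknowledges this pedestrian
    (direct gaze) AND signals yielding intention (a nod)."""
    if gaze != "direct":
        # Gaze averted: the cue is addressed to someone else,
        # so this pedestrian should not treat it as a yield.
        return False
    return expression in YIELDING

# Example trials analogous to the crossing intention task:
assert should_cross("direct", "nod") is True      # acknowledged + yielding
assert should_cross("direct", "angry") is False   # non-yielding
assert should_cross("right", "nod") is False      # yield addressed elsewhere
```

Note how gaze direction acts as an addressing mechanism: the same nod is a safe-to-cross signal only for the pedestrian the VHC is looking at, which is what allows the concept to support two co-located pedestrians in parallel.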

Place, publisher, year, edition, pages
MDPI, 2022. Vol. 13, no. 9, article id 420
Keywords [en]
external human–machine interfaces, autonomous vehicles, vehicle-to-pedestrian communication, traffic safety, gaze direction, emotional facial expressions, conversational facial expressions, neutral facial expressions
National Category
Vehicle Engineering; Applied Psychology
Research subject
Engineering Psychology
Identifiers
URN: urn:nbn:se:ltu:diva-90165
DOI: 10.3390/info13090420
ISI: 000856467300001
Scopus ID: 2-s2.0-85138736115
OAI: oai:DiVA.org:ltu-90165
DiVA, id: diva2:1651540
Note

Validated; 2022; Level 2; 2022-09-12 (hanlid)

Available from: 2022-04-12. Created: 2022-04-12. Last updated: 2023-05-08. Bibliographically approved.
In thesis
1. Virtual Human Characters for Autonomous Vehicle-to-Pedestrian Communication
2022 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

Pedestrians base their street-crossing decisions on both vehicle-centric cues, like speed and acceleration, and driver-centric cues, like gaze direction and facial expression. In the future, however, drivers of autonomous vehicles will be preoccupied with non-driving related activities and thus unavailable to provide pedestrians with relevant communicative cues. External human-machine interfaces (eHMIs) hold promise for filling the expected communication gap by providing information about the current state and future behaviour of an autonomous vehicle, primarily to ensure pedestrian safety and improve traffic flow, but also to promote public acceptance of autonomous vehicle technology. The aim of this thesis is the development of an intuitive, culture-transcending eHMI that can support multiple pedestrians in parallel in making appropriate street-crossing decisions by communicating pedestrian acknowledgement and vehicle intention. In the proposed anthropomorphic eHMI concept, a virtual human character (VHC) is displayed on the windshield to communicate pedestrian acknowledgement and vehicle intention via gaze direction and facial expression, respectively. The performance of different implementations of the proposed concept is evaluated in the context of three monitor-based laboratory experiments in which participants performed a crossing intention task. Four papers are appended to the thesis. Paper I provides an overview of controlled studies that employed naive participants to evaluate eHMI concepts. Paper II evaluates the effectiveness of the proposed concept in supporting a single pedestrian or two co-located pedestrians in making appropriate street-crossing decisions. Paper III evaluates the efficiency of emotional facial expressions in communicating non-yielding intention. Paper IV evaluates the efficiency of emotional and conversational facial expressions in communicating yielding and non-yielding intention.
An implementation of the proposed anthropomorphic eHMI concept in which a male VHC communicates non-yielding intention via an angry expression, cruising intention via a cheek puff, and yielding intention via a nod is shown to be both highly effective in ensuring the safety of a single pedestrian or even two co-located pedestrians without compromising traffic flow in either case, and the most efficient of the implementations evaluated. Importantly, this level of effectiveness is reached in the absence of any explanation of the rationale behind the eHMI concept or training to interact with it successfully.

Place, publisher, year, edition, pages
Luleå tekniska universitet, 2022
Series
Doctoral thesis / Luleå University of Technology 1 jan 1997 → …, ISSN 1402-1544
Keywords
external human-machine interfaces, pedestrian acknowledgement, vehicle intention, traffic safety, traffic flow, gaze direction, facial expression
National Category
Vehicle Engineering; Applied Psychology
Research subject
Engineering Psychology
Identifiers
urn:nbn:se:ltu:diva-90172 (URN)
978-91-8048-061-1 (ISBN)
978-91-8048-062-8 (ISBN)
Public defence
2022-06-10, A117, Luleå tekniska universitet, Luleå, 13:00 (English)
Available from: 2022-04-13. Created: 2022-04-12. Last updated: 2022-05-30. Bibliographically approved.

Open Access in DiVA

fulltext (11686 kB), 283 downloads
File information
File name: FULLTEXT02.pdf
File size: 11686 kB
Checksum (SHA-512): eb653ce57fd3ccaaf2a52cb578ec262055e2c3fd5768850bc2b53096758568f42e79b97b67005530d9fda982ba1723d969eb164a1595bd1f74808efaf724370c
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text; Scopus

Authority records

Rouchitsas, Alexandros; Alm, Håkan
