Rhythmic Interaction influences Synchrony Perception in VR - VR Eye Tracking Study Data

This repository contains the anonymised data collected for the study presented in the paper Rhythmic Interaction influences Synchrony Perception in VR. The corresponding study implementation is available at this repository.

File Descriptions

participant_[ID]_judgments.csv

Contains participant judgments and experimental conditions for each trial:

  • SOAValue: Stimulus Onset Asynchrony (SOA) values indicating temporal offsets between stimuli.

    • Values: -400 ms, -200 ms, -100 ms, -50 ms, 0 ms, 50 ms, 100 ms, 200 ms, 400 ms
    • Negative values indicate audio preceding video; positive values indicate video preceding audio.
  • FrequencyCondition: Frequency conditions tested (0.5 Hz, 1 Hz).

  • InteractionCondition: Boolean indicating if participants rhythmically moved their hand with the avatar.

  • SynchronyJudgement: Participant's judgement on whether stimuli were synchronous (True) or asynchronous (False).

  • TimeTillDecision: Time in seconds taken by participants to make a synchrony judgement.
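As a sketch of how these columns might be used (assuming standard comma-separated formatting; the example rows below are invented for illustration and do not come from the real `participant_[ID]_judgments.csv` files), the proportion of "synchronous" judgements per SOA can be computed like this:

```python
import csv
import io
from collections import defaultdict

# Invented example rows mimicking the documented columns.
example_csv = """SOAValue,FrequencyCondition,InteractionCondition,SynchronyJudgement,TimeTillDecision
-200,0.5,True,False,1.42
0,0.5,True,True,0.98
0,1,False,True,1.10
200,1,False,False,2.05
"""

counts = defaultdict(lambda: [0, 0])  # SOA -> [synchronous count, total count]
for row in csv.DictReader(io.StringIO(example_csv)):
    soa = int(row["SOAValue"])
    counts[soa][1] += 1
    if row["SynchronyJudgement"] == "True":
        counts[soa][0] += 1

# Proportion judged synchronous at each SOA, ordered by SOA.
proportions = {soa: sync / total for soa, (sync, total) in sorted(counts.items())}
print(proportions)  # {-200: 0.0, 0: 1.0, 200: 0.0}
```

Aggregating these proportions over participants would yield the points of a psychometric synchrony curve per condition.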

participant_[ID]_eye_data.csv

Eye-tracking data captured per frame during the experiment:

  • trialNr: Trial number.
  • trialHasStarted: Boolean indicating whether the trial was active.
  • isFirstBaseline / isSecondBaseline: Booleans marking baseline periods.
  • frame / time: Frame number and corresponding timestamp.
  • FrequencyCondition / InteractionCondition / SOACondition: Experimental conditions for the current trial.
  • pupil_diameter_left / pupil_diameter_right: Diameter of left and right pupils.
  • focal_distance: Estimated distance to focal point.
  • etRecord_x / etRecord_y: Eye-tracking coordinates on the XY plane where the avatar hand moved.
  • stim_x / stim_y: Coordinates of the stimulus hand on the XY plane.
  • audio_stim: Status of audio stimulus.
  • synchrony_judgement: Previous and current synchrony judgement (1 if judged asynchronous, 2 if judged synchronous).
  • dist_et_stim_x / dist_et_stim_y: Distance between eye-tracking point and stimulus (avatar hand) on the XY plane.
  • latencyCorrection: Latency correction applied based on synchronization with SyncOne2.
  • audio_source_level: Volume of audio stimulus.
  • total_time: Data recording time, linking CSV data with timestamps in VR participant data.
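A minimal sketch of working with the per-frame eye data, assuming standard CSV formatting (the rows below are invented and use only a subset of the documented columns): the per-axis distances can be combined into a Euclidean gaze-to-stimulus distance, restricted to frames where the trial was active.

```python
import csv
import io
import math

# Invented example rows; real files are named participant_[ID]_eye_data.csv.
example_csv = """trialNr,trialHasStarted,dist_et_stim_x,dist_et_stim_y
1,True,0.03,0.04
1,True,0.06,0.08
1,False,0.50,0.50
"""

# Euclidean gaze-to-stimulus distance per active frame.
distances = [
    math.hypot(float(r["dist_et_stim_x"]), float(r["dist_et_stim_y"]))
    for r in csv.DictReader(io.StringIO(example_csv))
    if r["trialHasStarted"] == "True"
]
print(distances)  # two active frames; the baseline/inactive frame is skipped
```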

participant_[ID]_vr_data.csv

VR-specific tracking data for participant controllers (left and right hands) and avatar hand movements:

  • Time: Timestamp.
  • Name: Identifier for VR objects (participant controllers, avatar hand components).
  • ID / ParentID: Internal identifiers for objects and their hierarchical relationships.
  • ActiveState: Boolean indicating object visibility or active status.
  • LocPosX / LocPosY / LocPosZ: Local coordinates relative to the parent object.
  • GloPosX / GloPosY / GloPosZ: Global coordinates in VR space.
  • LocScaX / LocScaY / LocScaZ / GloScaX / GloScaY / GloScaZ: Scale of objects locally and globally.
  • LocRotX / LocRotY / LocRotZ / LocRotW: Local quaternion rotations indicating orientation.
  • GloRotX / GloRotY / GloRotZ / GloRotW: Global quaternion rotations indicating orientation.
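Since each row describes one tracked object per timestamp, a trajectory for a single object can be recovered by filtering on `Name`. This sketch assumes standard CSV formatting; the object names and rows below are made up (the real identifiers appear in the `Name` column of `participant_[ID]_vr_data.csv`):

```python
import csv
import io

# Invented example rows with a subset of the documented columns.
example_csv = """Time,Name,GloPosX,GloPosY,GloPosZ
0.00,AvatarHand,0.10,1.20,0.50
0.01,LeftController,-0.30,1.00,0.40
0.02,AvatarHand,0.12,1.25,0.50
"""

# Global-space trajectory of one object: (time, x, y, z) per frame.
trajectory = [
    (float(r["Time"]), float(r["GloPosX"]), float(r["GloPosY"]), float(r["GloPosZ"]))
    for r in csv.DictReader(io.StringIO(example_csv))
    if r["Name"] == "AvatarHand"
]
print(len(trajectory))  # 2
```

The `total_time` column of the eye-data files can then be used to align such trajectories with the eye-tracking stream.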

questionnaire.csv

Contains demographic information and subjective responses collected from participants after the experiment:

  • Participant: Unique participant ID.
  • Start FMS / 1 FMS / 2 FMS / 3 FMS / Final FMS: Self-reported Flow Measurement Scale (FMS) ratings at different timepoints during the experiment.
  • Age: Age of the participant (in years).
  • Gender: Reported gender identity (e.g., Male, Female, Other).
  • Profession: Occupation or field of work.
  • VR Expertise: Self-rated experience with virtual reality (e.g., on a scale from 1 to 5).
  • Audio-Visual Timing Activity Frequency: Frequency of engagement in tasks requiring precise audio-visual timing (e.g., musical or rhythmic tasks), rated by the participant.
  • Vision Problems: Indicates whether the participant has vision impairments (Yes/No).
  • Visual Aids: Indicates whether the participant uses visual aids such as glasses or contact lenses (Yes/No).
  • Hearing Problems: Indicates whether the participant has hearing impairments (Yes/No).
  • Hearing Aids: Indicates whether the participant uses hearing aids (Yes/No).
  • Did the act of synchronising influence your synchrony perception? If so, how?: Open-ended response reflecting the participant's subjective experience of how synchronisation during the task may have influenced their perception of synchrony.
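Because the FMS ratings are stored wide (one column per timepoint), reshaping them into one rating series per participant is a common first step. A minimal sketch, assuming standard CSV formatting (the values below are invented, not taken from the real `questionnaire.csv`):

```python
import csv
import io

# Invented example rows covering a subset of the documented columns.
example_csv = """Participant,Start FMS,1 FMS,2 FMS,3 FMS,Final FMS,Age
1,0,1,1,2,1,27
2,0,0,1,0,0,31
"""

fms_columns = ["Start FMS", "1 FMS", "2 FMS", "3 FMS", "Final FMS"]

# Wide-to-long reshape: one chronological FMS rating list per participant.
fms_by_participant = {
    r["Participant"]: [int(r[c]) for c in fms_columns]
    for r in csv.DictReader(io.StringIO(example_csv))
}
print(fms_by_participant["1"])  # [0, 1, 1, 2, 1]
```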

BibTeX Citation

If you use the data in a scientific publication, we would appreciate the following citation:

@inproceedings{Lammert2025,
    doi       = {},
    url       = {},
    year      = {2025},
    booktitle = {2025 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
    author    = {Lammert, Anton Benjamin and Klass, Lina and Simon, Laura and Hornecker, Eva and Ehlers, Jan and Froehlich, Bernd},
    title     = {Rhythmic Interaction influences Synchrony Perception in VR},
}
