Sick Moves! Motion Parameters as Indicators of Simulator Sickness

Feigl T., Roth D., Gradl S., Wirth M., Latoschik M. E., Eskofier B., Philippsen M., Mutschler C.: In: IEEE Transactions on Visualization and Computer Graphics (TVCG), Beijing, China, 2019. We explore motion parameters, more specifically gait parameters, as objective indicators to assess simulator sickness in Virtual Reality (VR). We discuss the potential relationships between simulator sickness, immersion, and presence. We used two different camera pose (position and orientation) estimation methods for the evaluation of motion tasks in a large-scale VR environment: a simple model and an optimized model that allows for a more accurate and natural mapping of human senses. Participants performed multiple motion tasks (walking, balancing, running) in three…

Brain 2 Communicate: EEG-based Affect Recognition to Augment Virtual Social Interactions

Roth D., Westermeier F., Brübach L., Feigl T., Schell C., Latoschik M. E.: In: Mensch und Computer - Workshopband, Hamburg, Germany, 2019. The perception and expression of emotion is a fundamental part of social interaction. This project aims to utilize neuronal signals to augment avatar-mediated communications. We recognize emotions with a brain-computer interface (BCI) and supervised machine learning. Using an avatar-based communication interface that supports head tracking, gaze tracking, and speech-to-animation, we leverage the BCI-based affect detection to visualize emotional states.
DOI: 10.18420/muc2019-ws-571
URL: https://dl.gi.de/bitstream/handle/20.500.12116/25205/571.pdf
BibTeX: Download

A Social Interaction Interface Supporting Affective Augmentation Based on Neuronal Data

Roth D., Brübach L., Westermeier F., Schell C., Feigl T., Latoschik M. E.: In: Proceedings of the Symposium on Spatial User Interaction (SUI), New York, NY, USA, 2019. In this demonstration we present a prototype of an avatar-mediated social interaction interface that supports the replication of head and eye movements in distributed virtual environments. In addition to retargeting these natural behaviors, the system is capable of augmenting the interaction based on the visual presentation of affective states. We derive those states from neuronal data captured by electroencephalographic (EEG) sensing in combination with a machine-learning-driven classification of emotional states.
DOI: 10.1145/3357251.3360018
URL: https://dl.acm.org/citation.cfm?id=3357251.3360018
BibTeX: Download

Recurrent Neural Networks on Drifting Time-of-Flight Measurements

Feigl T., Nowak T., Philippsen M., Edelhäußer T., Mutschler C.: In: Proceedings of the 9th International Conference on Indoor Positioning and Indoor Navigation (IPIN), Nantes, France, 2018. Kalman filters (KFs) are popular methods to estimate position information from a set of time-of-flight (ToF) values in radio frequency (RF)-based locating systems. Such filters are proven to be optimal under zero-mean Gaussian error distributions. In the presence of multipath propagation, ToF measurement errors drift due to small-scale motion. This results in changing phases of the multipath components (MPCs), which cause a drift in the ToF measurements. Thus, on a short-term scale the ToF measurements have a non-constant bias…
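The abstract contrasts Kalman filtering, which is optimal only under zero-mean Gaussian noise, with drifting ToF errors. As a minimal sketch of the baseline it argues against, a scalar KF over noisy range values could look like this; the noise parameters q and r and the sample values are hypothetical, not taken from the paper:

```python
# Minimal scalar Kalman filter over ToF-derived range values.
# Illustrative only: q (process noise) and r (measurement noise) are
# hypothetical, and a real RF locating system tracks a 2D/3D state.

def kalman_1d(measurements, q=0.01, r=4.0):
    """Filter noisy ranges; optimal only for zero-mean Gaussian errors."""
    x, p = measurements[0], 1.0   # initial state estimate and its variance
    estimates = []
    for z in measurements:
        p += q                    # predict: variance grows by process noise
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update toward the measurement residual
        p *= (1 - k)              # update shrinks the variance
        estimates.append(x)
    return estimates

noisy = [10.2, 9.8, 10.5, 9.9, 10.1, 10.4, 9.7, 10.0]
print(kalman_1d(noisy)[-1])       # smoothed range close to 10
```

Under multipath-induced drift the residual z - x is no longer zero-mean, so this update systematically lags the true range, which motivates a learned (recurrent) model instead.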

Supervised Learning for Yaw Orientation Estimation

Feigl T., Mutschler C., Philippsen M.: In: Proceedings of the 9th International Conference on Indoor Positioning and Indoor Navigation (IPIN), Nantes, France, 2018. With free movement and multi-user capabilities, there is demand to open up Virtual Reality (VR) for large spaces. However, the cost of accurate camera-based tracking grows with the size of the space and the number of users. No-pose (NP) tracking is cheaper, but so far it cannot accurately and stably estimate the yaw orientation of the user's head in the long run. Our novel yaw orientation estimation combines a single inertial sensor located at the user's head with inaccurate positional tracking.…

Head-to-Body-Pose Classification in No-Pose VR Tracking Systems

Feigl T., Mutschler C., Philippsen M.: In: Proceedings of the 25th IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR), Reutlingen, Germany, 2018. Pose tracking does not yet work reliably in large-scale interactive multi-user VR. Our novel head orientation estimation combines a single inertial sensor located at the user's head with inaccurate positional tracking. We exploit the fact that users tend to walk in their viewing direction and classify head and body motion to estimate heading drift. This enables low-cost, long-term stable head orientation tracking. We evaluate our method and show that it sustains immersion.
DOI: 10.1109/VR.2018.8446495
URL: http://www2.informatik.uni-erlangen.de/publication/download/IEEE-VR2018b.pdf
BibTeX: Download
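The key idea above, that users tend to walk in their viewing direction, can be sketched as a drift correction that slowly pulls the gyro-integrated yaw toward the heading derived from position deltas whenever the user is walking. The function name, the blend factor alpha, and the boolean walking flag are illustrative assumptions, not the paper's classifier:

```python
import math

# Hypothetical sketch: while the user walks, correct the slowly drifting
# inertial (gyro-integrated) yaw with the coarse heading obtained from
# positional tracking. All names and parameters are illustrative.

def correct_yaw(gyro_yaw, prev_pos, cur_pos, walking, alpha=0.05):
    """Blend inertial yaw toward the walking direction when walking."""
    if not walking:
        return gyro_yaw                          # no positional cue at rest
    dx, dy = cur_pos[0] - prev_pos[0], cur_pos[1] - prev_pos[1]
    heading = math.atan2(dy, dx)                 # heading from position deltas
    err = math.atan2(math.sin(heading - gyro_yaw),
                     math.cos(heading - gyro_yaw))  # wrap angle error to (-pi, pi]
    return gyro_yaw + alpha * err                # pull yaw toward the heading

# A yaw drifted by 0.3 rad converges toward the true walking direction (0 rad).
yaw = 0.3
for step in range(1, 101):
    yaw = correct_yaw(yaw, (step - 1, 0.0), (step, 0.0), walking=True)
print(round(yaw, 3))  # → 0.002
```

The small alpha keeps the short-term orientation dominated by the low-noise inertial signal while removing its long-term drift, which is why walking phases must first be classified.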

Beyond Replication: Augmenting Social Behaviors in Multi-User Social Virtual Realities

Roth D., Kleinbeck C., Feigl T., Mutschler C., Latoschik M. E.: In: Proceedings of the 25th IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR), Reutlingen, Germany, 2018. This paper presents a novel approach for the augmentation of social behaviors in virtual reality (VR). We designed three visual transformations for behavioral phenomena crucial to everyday social interactions: eye contact, joint attention, and grouping. To evaluate the approach, we let users interact socially in a virtual museum using a large-scale multi-user tracking environment. Using a between-subjects design (N = 125), we formed groups of five participants. Participants were represented as simplified avatars and experienced the virtual…

Human Compensation Strategies for Orientation Drifts

Feigl T., Mutschler C., Philippsen M.: In: Proceedings of the IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR), Reutlingen, Germany, 2018. No-pose (NP) tracking systems rely on a single sensor located at the user's head to determine the position of the head. They estimate the head orientation with inertial sensors and analyze the body motion to compensate for their drift. However, with orientation drift, VR users implicitly lean their heads and bodies sideways. Hence, to determine the sensor drift and to explicitly adjust the orientation of the VR display, there is a need to understand and consider both the user's head and body…

A Location-Based VR Museum

Lugrin J. L., Kern F., Schmidt R., Kleinbeck C., Roth D., Daxer C., Feigl T., Mutschler C., Latoschik M. E.: In: Proceedings of the International Conference on Virtual Worlds and Games for Serious Applications (VS-Games), Würzburg, Germany, 2018. This poster presents a novel type of Virtual Reality (VR) application for education and culture: a location-based VR museum, i.e., a large, room-scale, multi-user, multi-zone virtual museum. The VR museum was designed to support over 100 simultaneous users walking in a large tracking area (600 m²) and sharing a virtual space ten times larger (7,000 m²) that contains indoor and outdoor dinosaur exhibitions. This poster gives an overview of the system…

Acoustical Manipulation for Redirected Walking

Feigl T., Mutschler C., Philippsen M., Kõre E.: In: Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST), Gothenburg, Sweden, 2017. Redirected Walking (RDW) manipulates the scene that is displayed to VR users so that they unknowingly compensate for the scene motion and can thus explore a large virtual world in a limited space. So far, mostly visual manipulation techniques have been studied. This paper shows that users can also be manipulated by means of acoustical signals. In an experiment with a dynamically moving audio source, we see deviations of up to 30% from a 20 m long straight-line walk for male participants and of up…
