Immersive Systems 2015
Immersion systems, commonly known as “virtual reality”, are used for a variety of functions such as gaming, rehabilitation, and training. These systems blend the virtual with the actual, and they have implications for cybersecurity because attackers may make the jump from virtual to actual systems. For the Science of Security community, this work is relevant to resilience, human factors, cyber-physical systems, privacy, and composability. The work cited here was presented in 2015.
Ntokas, I.; Maratou, V.; Xenos, M., "Usability and Presence Evaluation of a 3D Virtual World Learning Environment Simulating Information Security Threats," in Computer Science and Electronic Engineering Conference (CEEC), 2015 7th, pp. 71-76, 24-25 Sept. 2015. doi: 10.1109/CEEC.2015.7332702
Abstract: The use of 3-D immersive Virtual World Learning Environments (VWLE) for educational purposes has increased rapidly in the last few decades. Recent studies focusing on the evaluation of such environments have shown the great potential of virtual worlds in e-learning, providing improvements in factors such as satisfaction, enjoyment, concentration, and presence compared to traditional educational practices. In this paper we present the 3D VWLE that has been developed within the framework of the V-ALERT project; the system's main goal is to contribute to improving awareness of Information Security (IS) issues. In particular, we present the methodology followed to evaluate critical aspects of the implemented platform, such as usability, presence, and educational value. The data analysis has shown that the implemented system is usable, offered users a high perception of presence, and increased their knowledge of IS. In addition, the evaluation results have shown that interface improvements should be considered and the training session should be enhanced in order to strengthen the system's functionality and educational scope.
Keywords: computer aided instruction; human factors; security of data; 3D VWLE; 3D immersive virtual world learning environments; 3D virtual world learning environment; IS issues awareness; V-ALERT project; critical aspect evaluation; data analysis; e-learning; information security issue awareness; information security threat simulation; interface improvements; presence evaluation; training session; usability evaluation; user concentration; user enjoyment; user satisfaction; Computer science; Electronic mail; Europe; Information security; Three-dimensional displays; Usability; Evaluation of VWLE; Information Security (IS); Virtual World Learning Environments (VWLE); educational value; presence; usability (ID#: 15-8784)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7332702&isnumber=7332684
Sharma, S.; Rajeev, S.P.; Devearux, P., "An Immersive Collaborative Virtual Environment of a University Campus for Performing Virtual Campus Evacuation Drills and Tours for Campus Safety," in Collaboration Technologies and Systems (CTS), 2015 International Conference on, pp. 84-89, 1-5 June 2015. doi: 10.1109/CTS.2015.7210404
Abstract: The use of a collaborative virtual reality environment for training and virtual tours has been increasingly recognized as an alternative to traditional real-life tours of university campuses. Our proposed application shows an immersive collaborative virtual reality environment for performing virtual online campus tours and evacuation drills using Oculus Rift head-mounted displays. The immersive collaborative virtual reality environment also offers a unique way of training for emergencies for campus safety. Participants can enter the collaborative virtual reality environment set up on the cloud and take part in an evacuation drill or a tour, which leads to considerable cost advantages over large-scale real-life exercises. This paper presents an experimental design approach to gather data on human behavior and emergency response in a university campus environment among a set of players in an immersive virtual reality environment. We present three ways of controlling crowd behavior: by defining rules for computer-simulated agents, by providing controls for users to navigate the VR environment as autonomous agents, and by providing controls for users with a keyboard/joystick along with an immersive VR headset in real time. Our contribution lies in combining these three methods of behavior in order to perform virtual evacuation drills and virtual tours in a multi-user virtual reality environment for a university campus. Results from this study can be used to measure the effectiveness of current safety, security, and evacuation procedures for campus safety.
Keywords: educational institutions; groupware; helmet mounted displays; multi-agent systems; safety; virtual reality; Oculus Rift head mounted displays; VR environment; autonomous agents; campus safety; computer simulated agents; crowd behavior control; emergency response; experimental design approach; human behavior; immersive VR head set; immersive collaborative virtual reality environment; multiuser virtual reality environment; university campus; virtual campus evacuation drills; virtual campus evacuation tours; virtual online campus tours; Buildings; Computational modeling; Computers; Servers; Solid modeling; Three-dimensional displays; Virtual reality; behavior simulation; collaborative virtual environment; evacuation; virtual reality (ID#: 15-8785)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7210404&isnumber=7210375
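The paper's first crowd-control method, rule-based computer-simulated agents, lends itself to a compact illustration. Below is a minimal sketch in Python under assumed rules (move toward the nearest exit, keep roughly a meter of separation); the class, parameters, and thresholds are editorial assumptions, not the authors' implementation.

```python
import math

# Minimal rule-based evacuation agent, sketching the paper's first
# crowd-control method (computer-simulated agents). The rules and
# parameters here are illustrative assumptions, not the authors' code.

class EvacuationAgent:
    def __init__(self, x, y, speed=1.4):          # ~1.4 m/s walking speed
        self.x, self.y, self.speed = x, y, speed

    def step(self, exit_xy, neighbors, dt=0.1):
        """Move toward the exit while keeping distance from neighbors."""
        ex, ey = exit_xy
        dx, dy = ex - self.x, ey - self.y
        dist = math.hypot(dx, dy) or 1e-9
        vx, vy = dx / dist, dy / dist              # attraction toward exit
        for nx, ny in neighbors:                   # simple separation rule
            sx, sy = self.x - nx, self.y - ny
            d = math.hypot(sx, sy)
            if 0 < d < 1.0:                        # repel within 1 m
                vx += sx / d
                vy += sy / d
        norm = math.hypot(vx, vy) or 1e-9
        self.x += self.speed * dt * vx / norm
        self.y += self.speed * dt * vy / norm

# Usage: advance one agent toward an exit at (10, 0), avoiding a neighbor.
agent = EvacuationAgent(0.0, 0.0)
for _ in range(100):
    agent.step(exit_xy=(10.0, 0.0), neighbors=[(1.0, 0.5)])
```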
Jansen Dos Reis, P.R.; Falcao Matos, C.E.; Sousa Diniz, P.; Mota Silva, D.; Dantas, W.; Braz, G.; Cardoso De Paiva, A.; Araujo, A.S., "An Immersive Virtual Reality Application for Collaborative Training of Power Systems Operators," in Virtual and Augmented Reality (SVR), 2015 XVII Symposium on, pp. 121-126, 25-28 May 2015. doi: 10.1109/SVR.2015.24
Abstract: The use of immersive Virtual Reality applications for training in industrial settings has been increasing due to the benefits of that technology. This paper presents an application for training power system operators in a collaborative and immersive environment. The application aims to enhance user immersion and increase collaborative training in Virtual Reality using a Collaborative Virtual Environment and a Problem Based Learning approach. It was built in the Unity engine and presents a fully integrated power system visualization scenario with a supervisor module that improves training through the simulation of real events.
Keywords: computer based training; data visualisation; power engineering computing; virtual reality; Unity engine; collaborative training; collaborative virtual environment; immersive environment; immersive virtual reality application; industrial areas; power system operator training; power system visualization; power systems operators; problem based learning approach; supervisor module; user immersion; Collaboration; Mice; Power systems; Three-dimensional displays; Training; Virtual reality; Visualization; Collaborative Virtual Environment; Power System; Virtual Reality (ID#: 15-8786)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7300736&isnumber=7300710
Tredinnick, R.; Broecker, M.; Ponto, K., "Experiencing Interior Environments: New Approaches for the Immersive Display of Large-Scale Point Cloud Data," in Virtual Reality (VR), 2015 IEEE, pp. 297-298, 23-27 March 2015. doi: 10.1109/VR.2015.7223413
Abstract: This document introduces a new application for rendering massive LiDAR point cloud data sets of interior environments within high-resolution immersive VR display systems. The overall contributions are: to create an application that can visualize large-scale point clouds at interactive rates in immersive display environments, to develop a flexible pipeline for processing LiDAR data sets that allows display of both minimally processed and more rigorously processed point clouds, and to provide visualization mechanisms that produce accurate renderings of interior environments to better understand the physical aspects of interior spaces. The work identifies three problems with producing accurate immersive renderings of LiDAR point cloud data sets of interiors and presents solutions to them. Rendering performance is compared between the developed application and a previous immersive LiDAR viewer.
Keywords: computer displays; data visualisation; optical radar; pipeline processing; rendering (computer graphics); virtual reality; LiDAR point cloud data sets; flexible pipeline processing; immersive VR display systems; interior environments; rendering; visualization mechanisms; Graphics processing units; Laser radar; Loading; Mirrors; Rendering (computer graphics); Three-dimensional displays; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism — Virtual reality; I.3.8 [Computer Graphics]: Applications (ID#: 15-8787)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7223413&isnumber=7223305
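Interactive rendering of massive point clouds typically rests on spatial chunking plus level-of-detail (LOD) selection driven by projected screen-space size. The sketch below illustrates that general strategy; the Chunk structure, thresholds, and heuristic are editorial assumptions, not the paper's actual pipeline.

```python
from dataclasses import dataclass

# Illustrative LOD selection for chunked point clouds: pick a coarser
# level when the chunk projects to few pixels. All names and thresholds
# are editorial assumptions, not the paper's pipeline.

@dataclass
class Chunk:
    center: tuple     # (x, y, z) world-space center
    radius: float     # bounding-sphere radius, meters
    levels: list      # point count per level, coarse -> fine

def select_lod(chunk, eye, px_per_radian, target_px_per_point=2.0):
    """Return the index of the coarsest level dense enough for the view."""
    d = max(sum((c - e) ** 2 for c, e in zip(chunk.center, eye)) ** 0.5, 1e-6)
    chunk_px = px_per_radian * (2 * chunk.radius) / d   # projected diameter
    for level, n_points in enumerate(chunk.levels):
        # points per pixel across the chunk's projected footprint
        if chunk_px / (n_points ** 0.5) <= target_px_per_point:
            return level
    return len(chunk.levels) - 1

# A distant chunk can use a coarse level; a near one needs the finest.
c = Chunk(center=(0, 0, -10), radius=1.0, levels=[1_000, 10_000, 100_000])
print(select_lod(c, eye=(0, 0, 0), px_per_radian=1000))   # -> 1
```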
Kyriakou, M.; Xueni Pan; Chrysanthou, Y., "Interaction with Virtual Agents — Comparison of the Participants' Experience Between an IVR and a Semi-IVR System," in Virtual Reality (VR), 2015 IEEE, pp. 217-218, 23-27 March 2015. doi: 10.1109/VR.2015.7223373
Abstract: In this paper we compare participants' behavior and experience when navigating through a virtual environment populated with virtual agents in an IVR (Immersive Virtual Reality) system and a semi-IVR system. We measured the impact of collision and basic interaction between participants and virtual agents in both systems. Our findings show that it is especially important for semi-IVR systems to facilitate collision avoidance between the user and the virtual agents, accompanied by basic interaction between them. This can increase the sense of presence and make the virtual agents and the environment appear more realistic and lifelike.
Keywords: human computer interaction; multi-agent systems; virtual reality; collision avoidance; collision impact; immersive virtual reality system; participants' experience; participants' interaction; semi-IVR system; virtual agents; virtual environment; Collision avoidance; Computer science; Navigation; Teleoperators; Tracking; Virtual environments (ID#: 15-8788)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7223373&isnumber=7223305
Wang Dawei; Yang Hongyan, "An Intention Based Manipulation Method of Large Models in Virtual Environment," in Control, Automation and Robotics (ICCAR), 2015 International Conference on, pp. 209-213, 20-22 May 2015. doi: 10.1109/ICCAR.2015.7166033
Abstract: This paper proposes a big-span position tracking algorithm, based on a pre-position time series, to determine the user's operation intention in a virtual environment. Problems such as losing the tracked subject and inconvenient operation, which arise when a tracker works with large-scale models like an airplane or a ship, are not well solved in immersive virtual reality systems. Based on the moving trajectory and speed of a tracker over a unit time span, the tracking algorithm complements the mainstream relative-position algorithm. The algorithm can determine the magnitude and intensity of a user's operation, which makes it especially convenient to display large-scale models in a CAVE system. Case studies are used in this paper to demonstrate the effectiveness and convenience of the algorithm.
Keywords: manipulators; object tracking; position control; virtual reality; CAVE system; big-span position tracking algorithm; cave automatic virtual environment system; immersive virtual reality system; intention based manipulation method; mainstream relative position algorithm; pre-position time series; Cameras; Charge coupled devices; Computational modeling; Solid modeling; Time series analysis; Tracking; Trajectory; CAVE; relative position; tracking algorithm; virtual reality (ID#: 15-8789)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7166033&isnumber=7165846
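The core of the approach, reading a user's intention from the distance and speed of recent tracker motion, can be sketched briefly. The window size, sampling rate, and interpretation below are editorial assumptions, not the algorithm as published.

```python
from collections import deque
import math

# Sketch of inferring operation "magnitude and intensity" from a tracker's
# recent positions, in the spirit of the paper's pre-position time-series
# idea. Window size and sampling rate are illustrative assumptions.

class IntentEstimator:
    def __init__(self, window=0.5, hz=60):
        self.samples = deque(maxlen=int(window * hz))  # (t, x, y, z)

    def add(self, t, x, y, z):
        self.samples.append((t, x, y, z))

    def intent(self):
        """Return (distance moved, mean speed) over the sample window."""
        if len(self.samples) < 2:
            return 0.0, 0.0
        pts = list(self.samples)
        dist = sum(math.dist(p0, p1)
                   for (t0, *p0), (t1, *p1) in zip(pts, pts[1:]))
        span = pts[-1][0] - pts[0][0]
        return dist, dist / span if span > 0 else 0.0

# A large distance at high mean speed reads as a "big-span" gesture,
# e.g. a request to move a ship-sized model, rather than hand jitter.
est = IntentEstimator()
for i in range(30):
    est.add(t=i / 60, x=i * 0.05, y=0.0, z=0.0)   # fast sweep to the right
print(est.intent())   # -> roughly (1.45, 3.0): meters moved, meters/second
```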
Papadopoulos, C.; Mirhosseini, S.; Gutenko, I.; Petkov, K.; Kaufman, A.E.; Laha, B., "Scalability Limits of Large Immersive High-Resolution Displays," in Virtual Reality (VR), 2015 IEEE, pp. 11-18, 23-27 March 2015. doi: 10.1109/VR.2015.7223318
Abstract: We present the results of a variable information space experiment, targeted at exploring the scalability limits of immersive high-resolution tiled-display walls under physical navigation. Our work is motivated by a lack of evidence supporting the extension of previously established benefits on substantially large, room-shaped displays. Using the Reality Deck, a gigapixel resolution immersive display, as its apparatus, our study spans four display form-factors, starting at 100 megapixels arranged planarly and up to one gigapixel in a horizontally immersive setting. We focus on four core tasks: visual search, attribute search, comparisons and pattern finding. We present a quantitative analysis of per-task user performance across the various display conditions. Our results demonstrate improvements in user performance as the display form-factor changes to 600 megapixels. At the 600 megapixel to 1 gigapixel transition, we observe no tangible performance improvements and the visual search task regressed substantially. Additionally, our analysis of subjective mental effort questionnaire responses indicates that subjective user effort grows as the display size increases, validating previous studies on smaller displays. Our analysis of the participants' physical navigation during the study sessions shows an increase in user movement as the display grew. Finally, by visualizing the participants' movement within the display apparatus space, we discover two main approaches (termed “overview” and “detail”) through which users chose to tackle the various data exploration tasks. The results of our study can inform the design of immersive high-resolution display systems and provide insight into how users navigate within these room-sized visualization spaces.
Keywords: computer displays; data visualisation; user interfaces; Reality Deck; attribute search task; comparisons task; data exploration task; display apparatus space; display form-factors; gigapixel resolution immersive display; horizontally immersive display setting; immersive high-resolution displays; pattern finding task; per-task user performance; quantitative analysis; room-shaped displays; room-sized visualization space; scalability limit; tiled-display walls; user navigation; variable information space experiment; visual search task; Data visualization; Navigation; Rendering (computer graphics); Scalability; Timing; Visualization; Wall displays; display scalability; high resolution display; immersion; navigation; user studies; visualization (ID#: 15-8790)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7223318&isnumber=7223305
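The study's physical-navigation analysis reduces to familiar measures over head-tracking logs. A minimal sketch: total path length, plus a crude split between the reported "overview" and "detail" strategies by mean distance from the display wall; the 2 m threshold is an editorial assumption, not the study's criterion.

```python
import math

# Illustrative physical-navigation measures over tracked floor positions.
# The threshold and classification are editorial assumptions.

def path_length(positions):
    """Total distance walked, summed over consecutive samples."""
    return sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))

def strategy(positions, wall_dist, threshold_m=2.0):
    """Label a session 'overview' if the user mostly stood far back."""
    mean_d = sum(wall_dist(p) for p in positions) / len(positions)
    return "overview" if mean_d > threshold_m else "detail"

track = [(0.0, 0.0), (1.0, 0.0), (1.0, 3.0)]          # (x, y) floor positions
print(path_length(track))                              # -> 4.0
print(strategy(track, wall_dist=lambda p: p[1]))       # mean y = 1.0 -> "detail"
```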
Owens, B.D.; Crocker, A.R., "SimSup's Loop: A Control Theory Approach to Spacecraft Operator Training," in Aerospace Conference, 2015 IEEE, pp. 1-17, 7-14 March 2015. doi: 10.1109/AERO.2015.7118921
Abstract: Immersive simulation is a staple of training for many complex system operators, including astronauts and ground operators of spacecraft. However, while much has been written about simulators, simulation facilities, and operator certification programs, the topic of how one develops simulation scenarios to train a spacecraft operator is relatively understated in the literature. In this paper, an approach is presented for using control theory as the basis for developing the immersive simulation scenarios for a spacecraft operator training program. The operator is effectively modeled as a high level controller of lower level hardware and software control loops that affect a select set of system state variables. Simulation scenarios are derived from a STAMP-based hazard analysis of the operator's high and low level control loops. The immersive simulation aspect of the overall training program is characterized by selecting a set of scenarios that expose the operator to the various inadequate control actions that stem from control flaws and inadequate control executions in the different sections of the typical control loop. Results from the application of this approach to the Lunar Atmosphere and Dust Environment Explorer (LADEE) mission are provided through an analysis of the simulation scenarios used for operator training and the actual anomalies that occurred during the mission. The simulation scenarios and inflight anomalies are mapped to specific control flaws and inadequate control executions in the different sections of the typical control loop to illustrate the characteristics of anomalies arising from the different sections of the typical control loop (and why it is important for operators to have exposure to these characteristics). Additionally, similarities between the simulation scenarios and inflight anomalies are highlighted to make the case that the simulation scenarios prepared the operators for the mission.
Keywords: aerospace computing; aerospace simulation; certification; control engineering computing; industrial training; space vehicles; LADEE; STAMP-based hazard analysis; SimSup loop; astronauts; complex system operators; control executions; control flaws; control theory approach; ground operators; hardware control loops; immersive simulation scenarios; inflight anomalies; lunar atmosphere and dust environment explorer mission; operator certification programs; simulation facilities; software control loops; spacecraft operator training program; Biographies; Control systems; NASA; Training (ID#: 15-8791)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7118921&isnumber=7118873
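The scenario-derivation step can be made concrete: STAMP/STPA decomposes a control loop into sections (controller, actuator, controlled process, sensor) and enumerates generic inadequate control actions, and each pairing suggests a candidate simulation scenario. The sketch below uses the standard STPA vocabulary; the pairing into scenario strings is an editorial illustration, not the paper's actual scenario set.

```python
# Enumerate candidate training scenarios from a STAMP-style decomposition
# of the operator's control loop. Section and flaw labels follow generic
# STAMP/STPA vocabulary; the pairing is an editorial illustration.

CONTROL_LOOP_SECTIONS = ["controller", "actuator", "controlled process", "sensor"]
INADEQUATE_CONTROL = [
    "control action not provided",
    "unsafe control action provided",
    "control action too early or too late",
    "control action stopped too soon or applied too long",
]

def derive_scenarios(sections=CONTROL_LOOP_SECTIONS, flaws=INADEQUATE_CONTROL):
    """One candidate simulation scenario per (section, flaw) pair."""
    return [f"Inject '{flaw}' in the {section}"
            for section in sections for flaw in flaws]

for scenario in derive_scenarios()[:4]:     # the controller-section scenarios
    print(scenario)
```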
Reich, D.; Stark, R., "The Influence of Immersive Driving Environments on Human-Cockpit Evaluations," in System Sciences (HICSS), 2015 48th Hawaii International Conference on, pp. 523-532, 5-8 Jan. 2015. doi: 10.1109/HICSS.2015.69
Abstract: To ensure safety and usability of advanced in-car cockpit solutions, prospective evaluation during early prototyping stages is important, especially when developing innovative human-cockpit-interactions. In this context, highly realistic test environments will help to provide reliable and valid findings. Nevertheless, real car driving studies are difficult to control, manipulate, replicate and standardize. They are also more time consuming and expensive. One economizing suggestion is the implementation of immersive driving environments within simulator studies to provide users with a more realistic awareness of the situation. This paper discusses research investigating the influence of immersive driving environments. We evaluated three interaction modalities (touch, spin controller, free-hand gestures), and two levels of immersivity (low, high) to examine this methodology. Twenty participants took part in the driving simulator study. Objective and subjective data show advantages regarding situational awareness and perception for high immersive driving environments when interacting with a navigation system.
Keywords: digital simulation; human computer interaction; road safety; road vehicles; traffic engineering computing; advanced in-car cockpit solution safety; driving simulator; free-hand gesture interaction; human-cockpit evaluations; immersive driving environments; innovative human-cockpit-interactions; navigation system; spin controller interaction; touch interaction; Glass; Keyboards; Navigation; Reliability; Software; Three-dimensional displays; Vehicles; Human-Cockpit-Interactions; Immersive environments (ID#: 15-8792)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7069718&isnumber=7069647
Lin, M.C., "Perceptually-inspired Computing," in Intelligent Technologies for Interactive Entertainment (INTETAIN), 2015 7th International Conference on, pp. 1-1, 10-12 June 2015. doi: (not provided)
Abstract: Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and to develop perceptually-inspired computational models. In this talk, I will survey some recent efforts on perceptually-inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, and perceptually-inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multimodal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experiences. I will conclude by discussing possible future research directions.
Keywords: computer animation; computer games; hearing; human computer interaction; human factors; interactive systems; adaptive algorithms; auditory perception; computer animation; crowd simulation; data-driven personality modeling; entertainment; example-guided physics-based sound synthesis; human perceptual systems; human sensory systems; human-computer interaction; immersive experiences; interactive applications; multimodal interaction; perceptually guided principles; perceptually-inspired computational models; physical environment; psychophysics; shared social experience; video games; visual computing; Adaptation models; Animation; Computational modeling; Computer science; Entertainment industry; Games; Solid modeling; computer animation; crowd simulation; entertainment; human perceptual systems; human-computer interaction; multimodal interaction; perceptually-inspired computing; video games (ID#: 15-8793)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7325476&isnumber=7325470
Drouhard, Margaret; Steed, Chad A.; Hahn, Steven; Proffen, Thomas; Daniel, Jamison; Matheson, Michael, "Immersive Visualization for Materials Science Data Analysis using the Oculus Rift," in Big Data (Big Data), 2015 IEEE International Conference on, pp. 2453-2461, Oct. 29 2015-Nov. 1 2015. doi: 10.1109/BigData.2015.7364040
Abstract: In this paper, we propose strategies and objectives for immersive data visualization with applications in materials science using the Oculus Rift virtual reality headset. We provide background on currently available analysis tools for neutron scattering data and other large-scale materials science projects. In the context of the current challenges facing scientists, we discuss immersive virtual reality visualization as a potentially powerful solution. We introduce a prototype immersive visualization system, developed in conjunction with materials scientists at the Spallation Neutron Source, which we have used to explore large crystal structures and neutron scattering data. Finally, we offer our perspective on the greatest challenges that must be addressed to build effective and intuitive virtual reality analysis tools that will be useful for scientists in a wide range of fields.
Keywords: Crystals; Data visualization; Instruments; Neutrons; Solid modeling; Three-dimensional displays; Virtual reality (ID#: 15-8794)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7364040&isnumber=7363706
Garcia, A.S.; Roberts, D.J.; Fernando, T.; Bar, C.; Wolff, R.; Dodiya, J.; Engelke, W.; Gerndt, A., "A Collaborative Workspace Architecture for Strengthening Collaboration among Space Scientists," in Aerospace Conference, 2015 IEEE, pp. 1-12, 7-14 March 2015. doi: 10.1109/AERO.2015.7118994
Abstract: Space exploration missions have produced data of immense value to both research and the planning and operation of future missions. However, current datasets and simulation tools fragment teamwork, especially across disciplines and geographical locations. The aerospace community already exploits virtual reality for purposes including space tele-robotics, interactive 3D visualization, simulation, and training. However, collaborative virtual environments are yet to be widely deployed or routinely used in space projects. Advanced immersive and collaborative visualization systems have the potential to enhance the efficiency and efficacy of data analysis, simplifying visual benchmarking, presentations, and discussions. We present preliminary results of the EU-funded international project CROSS DRIVE, which develops an infrastructure of collaborative workspaces for space science and missions. The aim is to allow remote scientific and engineering experts to collectively analyze and interpret combined datasets using shared simulation tools. The approach combines advanced 3D visualization techniques and interactive tools with immersive virtuality telepresence. This will give scientists and engineers the impression of teleportation from their respective buildings across Europe to stand together on a planetary surface, surrounded by the information and tools that they need. The conceptual architecture and proposed realization of the collaborative workspace are described. ESA's planned ExoMars mission provides the use case for deriving user requirements and evaluating our implementation.
Keywords: aerospace computing; data visualisation; interactive systems; EU funded international project; ExoMars mission; aerospace community; collaborative visualization systems; collaborative workspace architecture; data analysis; immersive virtuality telepresence; interactive 3D visualization; interactive tools; space exploration missions; space tele-robotics; Collaboration; Computer architecture; Data visualization; Mars; Solid modeling; Space missions; Space vehicles (ID#: 15-8795)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7118994&isnumber=7118873
Panzoli, D.; Pons-Lelardeux, C.; Lagarrigue, P., "Communication and Knowledge Sharing in an Immersive Learning Game," in Games and Virtual Worlds for Serious Applications (VS-Games), 2015 7th International Conference on, pp. 1-8, 16-18 Sept. 2015. doi: 10.1109/VS-GAMES.2015.7295768
Abstract: Learning games are becoming serious contenders to real-life simulations for professional training, particularly in highly technical jobs where their cost-effectiveness is a sizeable asset. The most appreciated feature of a learning game is to automatically provide each learner with integrated feedback in real time during the game and, ideally, a personally meaningful debriefing at the end of each session. Immersive learning games use virtual reality and 3D environments to allow several learners at once to collaborate in the most natural way. Managing communication, on the other hand, has so far proven a more difficult problem to overcome. In this article, we present a communication system designed for use in immersive learning games. This innovative system is based neither on voice chat nor on branching dialogues but on the idea that pieces of information can be manipulated as tangible objects in a virtual environment. The system endeavours to offer the simplest and most intuitive way for several learners to acquire and share knowledge in an immersive virtual environment while complying with the requirements of a reliable assessment of their performance. A first experiment with nurse anaesthetist students gives evidence that this simple communication system can support lifelike behaviours such as consultation, debate, conflict, or irritation.
Keywords: computer based training; computer games; virtual reality; 3D environments; branching dialogues; communication system; highly technical jobs; immersive learning games; immersive virtual environment; knowledge sharing; professional training; real-life simulations; virtual reality; voice-chat; Collaboration; Communication systems; Context; Games; Real-time systems; Surgery; Virtual environments (ID#: 15-8796)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7295768&isnumber=7295691
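The central mechanism, pieces of information handled as tangible objects that learners acquire and share, with each transfer loggable for assessment, can be sketched as a small data structure. All names below are editorial assumptions, not the authors' implementation.

```python
from dataclasses import dataclass, field

# Sketch of information manipulated as a tangible, shareable object.
# Class and field names are illustrative assumptions.

@dataclass
class InformationObject:
    content: str                     # e.g. "patient allergic to latex"
    source: str                      # who first observed or stated it
    holders: set = field(default_factory=set)

    def acquire(self, learner: str):
        """A learner picks up the information object."""
        self.holders.add(learner)

    def share(self, giver: str, receiver: str):
        """Sharing is an explicit, loggable act, useful for assessment."""
        if giver not in self.holders:
            raise ValueError(f"{giver} does not hold this information")
        self.holders.add(receiver)

info = InformationObject("patient allergic to latex", source="chart")
info.acquire("nurse_A")
info.share("nurse_A", "anaesthetist_B")
print(info.holders)    # -> {'nurse_A', 'anaesthetist_B'}
```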
Mentzelopoulos, Markos; Ferguson, Jeffrey; Protopsaltis, Aristidis, "Perceptually Captured Gesture Interaction with Immersive Information Retrieval Environments: An Experimental Framework for Testing and Rapid Iteration," in Interactive Mobile Communication Technologies and Learning (IMCL), 2015 International Conference on, pp. 307-311, 19-20 Nov. 2015. doi: 10.1109/IMCTL.2015.7359608
Abstract: The use of perceptual inputs is an emerging area within HCI that points toward a developing Perceptual User Interface (PUI), which may prove advantageous for those involved in mobile serious games and immersive social network environments. Since there is a large variety of input devices, software platforms, possible interactions, and myriad ways to combine all of these elements in pursuit of a PUI, we propose in this paper a basic experimental framework that can standardize the study of this wide range of interactive applications, testing their efficacy for learning or information retrieval, and also suggest improvements to emerging PUIs by enabling quick iteration. This rapid iteration will begin to define a targeted range of interactions that are intuitive and comfortable as perceptual inputs and that enhance learning and information retention in comparison to traditional GUI systems. The work focuses on planning the technical development of two scenarios.
Keywords: Engines; Graphical user interfaces; Grasping; Hardware; Navigation; Software; Three-dimensional displays; Graphical User Interface (GUI); HCI; PUI; perceptual; serious games (ID#: 15-8797)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7359608&isnumber=7359535
Covarrubias, M.; Bordegoni, M., "Immersive VR for Natural Interaction with a Haptic Interface for Shape Rendering," in Research and Technologies for Society and Industry Leveraging a better tomorrow (RTSI), 2015 IEEE 1st International Forum on, pp. 82-89, 16-18 Sept. 2015. doi: 10.1109/RTSI.2015.7325075
Abstract: This paper presents an immersive virtual reality system that includes a natural interaction approach based on free-hand gestures, used to drive a Desktop Haptic Strip for Shape Rendering (DHSSR). The DHSSR is a mechatronic display of virtual curves intersecting 3D virtual objects, and aims to allow designers to evaluate the quality of shapes during the conceptual design phase of new products. The DHSSR consists of a 6-DOF servo-actuated developable metallic strip, which reproduces cross-sectional curves of 3D virtual objects. Virtual curves can be generated interactively on the 3D surface of the virtual object, and the DHSSR haptic interface renders them coherently. An intuitive and natural modality for interacting with the 3D virtual objects and 3D curves is offered to users, who are mainly industrial designers: an immersive virtual reality system for visualizing the 3D virtual models, and a hand-gesture interaction approach for handling the models. The system has been implemented using low-cost and open technologies, and combines a software engine for interactive 3D content generation (Unity 3D), the Oculus Rift head-mounted display for stereo 3D visualization, a motion capture sensor (Leap Motion) for tracking the user's hands, and an Arduino Leonardo board for controlling the components. The results reported in the paper are positive regarding both the quality of the surface rendering and the proposed interaction modality.
Keywords: haptic interfaces; interactive systems; rendering (computer graphics); shape recognition; virtual reality; 3D stereo visualization; 3D virtual objects; 6DOF servo actuated developable metallic strip; Arduino Leonardo board; DHSSR; LeapMotion; cross sectional curves; desktop haptic strip for shape rendering; hand gestural interaction; haptic interface; immersive VR; industrial designers; interactive 3D content generation; mechatronic display; motion capture sensor; natural interaction; oculus rift head mounted display; software engine; virtual curves; virtual reality system; Haptic interfaces; Interpolation; Rendering (computer graphics); Shape; Solid modeling; Strips; Three-dimensional displays (ID#: 15-8798)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7325075&isnumber=7325058
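Rendering a cross-sectional curve of a 3D model amounts to intersecting a triangle mesh with a cutting plane. Below is a minimal, generic plane-section routine offered as an editorial illustration, not the DHSSR's actual algorithm.

```python
# Intersect a triangle mesh with a cutting plane to obtain the
# cross-sectional segments a shape-rendering strip would reproduce.
# A generic mesh-slicing sketch; names are illustrative assumptions.

def plane_section(triangles, plane_point, plane_normal):
    """Return intersection segments of mesh triangles with a plane.

    triangles: list of three (x, y, z) vertex tuples per triangle.
    """
    def signed_dist(p):
        return sum((pi - qi) * ni
                   for pi, qi, ni in zip(p, plane_point, plane_normal))

    segments = []
    for tri in triangles:
        pts = []
        for a, b in ((0, 1), (1, 2), (2, 0)):        # each triangle edge
            da, db = signed_dist(tri[a]), signed_dist(tri[b])
            if da * db < 0:                           # edge crosses the plane
                t = da / (da - db)
                pts.append(tuple(pa + t * (pb - pa)
                                 for pa, pb in zip(tri[a], tri[b])))
        if len(pts) == 2:
            segments.append(tuple(pts))
    return segments

# Example: a single triangle cut by the z = 0 plane.
tri = [(0, 0, -1), (1, 0, 1), (0, 1, 1)]
print(plane_section([tri], plane_point=(0, 0, 0), plane_normal=(0, 0, 1)))
# -> [((0.5, 0.0, 0.0), (0.0, 0.5, 0.0))]
```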
Sidorakis, N.; Koulieris, G.A.; Mania, K., "Binocular Eye-Tracking for the Control of a 3D Immersive Multimedia User Interface," in Everyday Virtual Reality (WEVR), 2015 IEEE 1st Workshop on, pp. 15-18, 23 March 2015. doi: 10.1109/WEVR.2015.7151689
Abstract: In this paper, we present an innovative approach to designing a gaze-controlled Multimedia User Interface for modern, immersive headsets. The widespread availability of consumer-grade Virtual Reality Head Mounted Displays such as the Oculus Rift™ has transformed VR into a commodity available for everyday use. However, Virtual Environments require new User Interface paradigms, since standard 2D interfaces are designed to be viewed from a static vantage point only, e.g., the computer screen. Additionally, traditional input methods such as the keyboard and mouse are hard to manipulate when the user wears a Head Mounted Display. We present a 3D Multimedia User Interface based on eye-tracking and develop six applications which cover commonly performed actions of everyday computing, such as mail composition and multimedia viewing. We performed a user study to evaluate our system, acquiring both quantitative and qualitative data. The study indicated that users make fewer typing errors while operating the eye-controlled interface than when using the standard keyboard during immersive viewing. Subjects stated that they enjoyed the eye-tracking 3D interface more than the keyboard/mouse combination.
Keywords: gaze tracking; helmet mounted displays; multimedia computing; user interfaces; virtual reality; 3D immersive multimedia user interface; Oculus Rift; binocular eye-tracking; gaze-controlled multimedia user interface; immersive headsets; immersive viewing; virtual environments; virtual reality head mounted displays; Electronic mail; Games; Keyboards; Mice; Multimedia communication; Three-dimensional displays; User interfaces; H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems-Artificial augmented and virtual realities; I.3.6 [Computer Graphics]: Methodology and Techniques-Interaction techniques; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism-Virtual Reality (ID#: 15-8799)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7151689&isnumber=7151684
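Gaze-controlled interfaces of this kind commonly rely on dwell-time selection, where an element activates once the gaze rests on it beyond a threshold. The sketch below illustrates that common pattern; the 800 ms threshold and class names are editorial assumptions, and the abstract does not state which selection mechanism the system uses.

```python
import time

# Dwell-time gaze selection: an element fires after the gaze rests on it
# for a threshold duration. Threshold and names are editorial assumptions.

class DwellSelector:
    def __init__(self, dwell_s=0.8):
        self.dwell_s = dwell_s
        self.target = None
        self.since = None

    def update(self, gazed_element, now=None):
        """Feed the currently gazed UI element; return it once dwell elapses."""
        now = time.monotonic() if now is None else now
        if gazed_element != self.target:
            self.target, self.since = gazed_element, now   # gaze moved: reset
            return None
        if gazed_element is not None and now - self.since >= self.dwell_s:
            self.since = now            # re-arm: fires again after a full dwell
            return gazed_element
        return None

sel = DwellSelector()
assert sel.update("mail_icon", now=0.0) is None
assert sel.update("mail_icon", now=0.9) == "mail_icon"   # selected after dwell
```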
Katz, B.F.G.; Felinto, D.Q.; Touraine, D.; Poirier-Quinot, D.; Bourdot, P., "BlenderVR: Open-Source Framework for Interactive and Immersive VR," in Virtual Reality (VR), 2015 IEEE, pp. 203-204, 23-27 March 2015. doi: 10.1109/VR.2015.7223366
Abstract: BlenderVR is an open-source project framework for interactive and immersive applications, based on an extension of the Blender Game Engine (BGE) to Virtual Reality applications. BlenderVR is a generalization of the BlenderCAVE project, accounting for alternate platforms (e.g., HMDs, video walls). The goal is to provide a flexible and easy-to-use framework for creating VR applications for various platforms, making use of the existing power of the BGE's graphics rendering and physics engine. Compatible with the three major operating systems, BlenderVR has been developed by VR researchers with support from the Blender community. BlenderVR currently handles multi-screen/multi-user tracked stereoscopic rendering through an efficient low-level master/slave synchronization process, with multimodal interactions via the OSC and VRPN protocols.
Keywords: protocols; public domain software; rendering (computer graphics); synchronisation; virtual reality; Blender game engine; BlenderVR; OSC protocol; VRPN protocol; graphics rendering; immersive application; interactive application; open-source framework; physics engine; synchronization process; virtual reality; Engines; Games; Navigation; Rendering (computer graphics); Synchronization; Virtual reality; H.5.1 [Multimedia Information Systems]: Artificial, augmented, and virtual realities; I.3.2 [Graphics Systems]: Distributed/network graphics (ID#: 15-8800)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7223366&isnumber=7223305
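The master/slave synchronization mentioned in the abstract follows a familiar pattern: the master samples input once per frame and broadcasts the authoritative scene state, and every slave renders it. The sketch below shows only the pattern; UDP, JSON, and the message fields are editorial assumptions, not BlenderVR's actual wire protocol.

```python
import json
import socket

# Master side of a master/slave render cluster: broadcast one frame's
# authoritative state so every slave draws the same frame. Transport and
# message format here are editorial assumptions, not BlenderVR's protocol.

SLAVES = [("127.0.0.1", 9001), ("127.0.0.1", 9002)]   # hypothetical hosts

def broadcast_frame(sock, frame, camera_pose, objects):
    """Send one frame's scene state to all slave render nodes."""
    msg = json.dumps({"frame": frame,
                      "camera": camera_pose,
                      "objects": objects}).encode()
    for addr in SLAVES:
        sock.sendto(msg, addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
broadcast_frame(sock, frame=42,
                camera_pose=[0.0, 1.7, 0.0, 0.0, 0.0, 0.0],
                objects={"cube": [1.0, 0.0, -2.0]})
```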
Jooyoung Lee; Hasup Lee; BoYu Gao; HyungSeok Kim; Jee-In Kim, "Multiple Devices as Windows for Virtual Environment," in Virtual Reality (VR), 2015 IEEE, pp. 219-220, 23-27 March 2015. doi: 10.1109/VR.2015.7223374
Abstract: We introduce a method for using multiple devices as windows for interacting with a 3-D virtual environment. Our work is motivated by creating a collaborative workspace from the multiple devices found in our daily lives, such as desktop PCs and mobile devices. Given a life-size virtual environment, each device shows a scene of the 3-D virtual space according to its position and direction, allowing users to perceive the virtual space in a more immersive way. By adding mobile devices to our system, users can not only see beyond the stationary screen by relocating their mobile device, but also have a personalized view of the working space. To acquire each device's pose and orientation, we adopt vision-based approaches. Finally, we introduce an implementation of a system for managing multiple devices and keeping their performance synchronized.
Keywords: computer vision; groupware; mobile computing; virtual reality; 3D virtual environment; 3D virtual space; collaborative workspace; desktop PC; mobile devices; vision-based approaches; Electronic mail; Mobile communication; Mobile handsets; Servers; Virtual environments; AR; Multiple device; Shared virtual space; immersive VR (ID#: 15-8801)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7223374&isnumber=7223305
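The "device as window" metaphor means each device derives its camera from its own tracked pose within the shared scene. A minimal sketch follows; the pose format (position plus yaw and pitch) is an editorial assumption.

```python
import math

# Each device renders the shared 3-D scene from a camera derived from
# its own tracked pose. Pose format is an illustrative assumption.

def device_camera(position, yaw_deg, pitch_deg):
    """Build a look-at camera from a device pose; a real renderer would
    turn this into a full 4x4 view matrix."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    forward = (math.cos(pitch) * math.sin(yaw),
               math.sin(pitch),
               -math.cos(pitch) * math.cos(yaw))
    target = tuple(p + f for p, f in zip(position, forward))
    return {"eye": position, "target": target, "up": (0.0, 1.0, 0.0)}

# A tablet held 1 m to the right of a desktop monitor, angled 30 degrees
# inward, sees a different slice of the same virtual space:
print(device_camera((1.0, 1.2, 0.0), yaw_deg=-30.0, pitch_deg=0.0))
print(device_camera((0.0, 1.2, 0.0), yaw_deg=0.0, pitch_deg=0.0))
```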
Rodehutskors, Tobias; Schwarz, Max; Behnke, Sven, "Intuitive Bimanual Telemanipulation Under Communication Restrictions by Immersive 3D Visualization and Motion Tracking," in Humanoid Robots (Humanoids), 2015 IEEE-RAS 15th International Conference on, pp. 276-283, 3-5 Nov. 2015. doi: 10.1109/HUMANOIDS.2015.7363547
Abstract: Robots which solve complex tasks in environments too dangerous for humans to enter are desperately needed, e.g., for search and rescue applications. As fully autonomous robots are not yet capable of operating in highly unstructured real-world scenarios, teleoperation is often used to embed the cognitive capabilities of human operators into the robotic system. The many degrees of freedom of anthropomorphic robots and communication restrictions pose challenges to the design of teleoperation interfaces, though. In this work, we propose to combine immersive 3D visualization with tracking of operator head and hand motions into an intuitive interface for bimanual teleoperation. 3D point clouds acquired from the robot are visualized together with a 3D robot model and camera images using a tracked 3D head-mounted display. 6D magnetic trackers capture the operator's hand motions, which are mapped to the grippers of our two-armed robot Momaro. The proposed user interface allows complex manipulation tasks to be solved over degraded communication links, as demonstrated at the DARPA Robotics Challenge Finals and in lab experiments.
Keywords: Cameras; Mobile robots; Robot vision systems; Three-dimensional displays; Tracking (ID#: 15-8802)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7363547&isnumber=7362951
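Mapping tracked hand motion to gripper targets typically requires a clutch, so the operator can reposition a hand without moving the robot, and often workspace scaling. The sketch below illustrates that common pattern; the names and scale factor are editorial assumptions, not the Momaro system's implementation.

```python
# Clutched mapping from a tracked operator hand to a robot gripper
# target: the robot only follows hand displacement while the clutch is
# engaged, scaled to fit the robot's workspace. Names and the scale
# factor are illustrative assumptions.

class ClutchedMapping:
    def __init__(self, scale=0.5):
        self.scale = scale            # workspace scaling, hand -> robot
        self.hand_ref = None          # hand pose when clutch engaged
        self.robot_ref = None         # gripper pose when clutch engaged

    def engage(self, hand_pos, robot_pos):
        """Latch current hand and gripper positions as references."""
        self.hand_ref, self.robot_ref = hand_pos, robot_pos

    def target(self, hand_pos):
        """Gripper target = reference + scaled hand displacement."""
        if self.hand_ref is None:
            return None               # clutch not engaged: robot holds still
        return tuple(r + self.scale * (h - h0)
                     for r, h, h0 in zip(self.robot_ref, hand_pos, self.hand_ref))

m = ClutchedMapping()
m.engage(hand_pos=(0.0, 0.0, 0.0), robot_pos=(0.4, 0.0, 0.9))
print(m.target((0.1, 0.0, 0.0)))     # -> (0.45, 0.0, 0.9)
```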
Lu, Ching-Hu, "IoT-enhanced and Bidirectionally Interactive Information Visualization for Context-Aware Home Energy Savings," in Mixed and Augmented Reality - Media, Art, Social Science, Humanities and Design (ISMAR-MASH'D), 2015 IEEE International Symposium on, pp. 15-20, Sept. 29 2015-Oct. 3 2015. doi: 10.1109/ISMAR-MASHD.2015.20
Abstract: In recent years, due to worsening global warming, increasing attention has been paid to home energy savings, which is often a serious and not very engaging task. In this regard, we propose a playful and bidirectionally interactive eco-feedback system with three kinds of information visualization integrated into a 3D pet-raising game, which synchronously visualizes information from the physical environment in the virtual environment by leveraging IoT (Internet of Things) technologies, in hopes of enhancing user experience and prolonging users' engagement in energy savings. In addition to the mere physical-to-virtual mapping of traditional game-based energy saving, this study also uses the other direction to form a bidirectional mapping that empowers users with direct and flexible remote control anywhere and anytime, in a more natural and playful way. Furthermore, integrating context-awareness with the bidirectional mapping in an energy-saving system also enhances the users' immersive experience.
Keywords: Avatars; Games; Home appliances; Positron emission tomography; Sensors; Visualization; Context-awareness; Game-based eco-feedback; IoT; Mixed Reality; physical-cyber system (ID#: 15-8803)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7350729&isnumber=7350713
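The bidirectional mapping can be made concrete in two small functions: one maps metered consumption into the pet's state (physical to virtual), and one issues a real appliance command from an action on its virtual counterpart (virtual to physical). Everything below, including the health formula, names, and message format, is an editorial assumption, not the paper's implementation.

```python
# Bidirectional mapping between physical home and virtual pet game.
# Health formula, names, and message format are editorial assumptions.

def physical_to_virtual(daily_kwh, baseline_kwh=10.0):
    """Map household consumption to pet health in [0, 100]:
    saving relative to a baseline makes the pet healthier."""
    saving_ratio = max(0.0, 1.0 - daily_kwh / baseline_kwh)
    return round(100 * min(1.0, 0.5 + saving_ratio), 1)

def virtual_to_physical(appliance_id, turn_on, send_command):
    """Acting on the virtual appliance issues a real IoT command."""
    send_command({"device": appliance_id, "state": "on" if turn_on else "off"})

commands = []
virtual_to_physical("living_room_ac", False, commands.append)
print(physical_to_virtual(daily_kwh=7.0), commands)
# -> 80.0 [{'device': 'living_room_ac', 'state': 'off'}]
```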
Davoudpour, M.; Sadeghian, A.; Rahnama, H., "Synthesizing Social Context for Making Internet of Things Environments More Immersive," in Network of the Future (NOF), 2015 6th International Conference on the, pp. 1-5, Sept. 30 2015-Oct. 2 2015. doi: 10.1109/NOF.2015.7333282
Abstract: The growth in context-aware systems and smart devices elevates another technology in ubiquitous computing: the Internet of Things (IoT), where all objects are connected. The integration of smart objects and social networking plays an important role in today's life. This paper promotes a management approach and architecture for adaptive social services within the IoT, where objects interact based on social behavior. We propose social context-awareness and an ontology as the major keys of this study. Our main goal is to make the presented framework, CANthings, a standard social framework that can be used in both research and industry projects.
Keywords: Internet of Things; ontologies (artificial intelligence); social aspects of automation; social networking (online); CANthings framework; Internet of Things environments; adaptive social services; context-aware systems; smart devices; smart objects; social behavior; social context synthesis; social networking; ubiquitous computing; Computer architecture; Context; Internet of things; Interoperability; Ontologies; Social network services; Context-aware; Internet of Things (IoT); Interoperability; Ontology; Social IoT (ID#: 15-8804)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7333282&isnumber=7333276
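One ingredient the abstract names, social relations among smart objects, can be sketched using the relation types common in the Social IoT literature (ownership, co-location, co-work, parental). The structure below is an editorial illustration, not the CANthings ontology.

```python
# Social relations among smart objects, with relation types drawn from
# the Social IoT literature. Structure and names are editorial
# assumptions, not the CANthings framework's actual ontology.

RELATION_TYPES = {"ownership", "co-location", "co-work", "parental"}

class SocialObject:
    def __init__(self, name):
        self.name = name
        self.relations = {}                    # other object -> relation type

    def relate(self, other, relation):
        """Create a symmetric social relation between two objects."""
        if relation not in RELATION_TYPES:
            raise ValueError(f"unknown relation: {relation}")
        self.relations[other.name] = relation
        other.relations[self.name] = relation

thermostat = SocialObject("thermostat")
heater = SocialObject("heater")
thermostat.relate(heater, "co-location")       # same room: may cooperate
print(thermostat.relations)                    # -> {'heater': 'co-location'}
```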
Note:
Articles listed on these pages have been found on publicly available internet pages and are cited with links to those pages. Some of the information included herein has been reprinted with permission from the authors or data repositories. Direct any requests via Email to news@scienceofsecurity.net for removal of the links or modifications to specific citations. Please include the ID# of the specific citation in your correspondence.