Paper at CHI'22 and Article in the IEEE TVCG Journal

We are happy to announce that we will present our ReLive paper at this year's CHI'22 in New Orleans:

ReLive: Bridging In-Situ and Ex-Situ Visual Analytics for Analyzing Mixed Reality User Studies

Sebastian Hubenschmid*¹; Jonathan Wieland*¹; Daniel Immanuel Fink*¹; Andrea Batch²; Johannes Zagermann¹; Niklas Elmqvist²; Harald Reiterer¹

¹University of Konstanz; ²University of Maryland
*First three authors contributed equally

The nascent field of mixed reality is seeing an ever-increasing need for user studies and field evaluation, which are particularly challenging given device heterogeneity, diversity of use, and mobile deployment. Immersive analytics tools have recently emerged to support such analysis in situ, yet the complexity of the data also warrants an ex-situ analysis using more traditional non-immersive visual analytics setups. To bridge the gap between both approaches, we introduce ReLive: a mixed-immersion visual analytics framework for exploring and analyzing mixed reality user studies. ReLive combines an in-situ virtual reality view with a complementary ex-situ desktop view. While the virtual reality view allows users to relive interactive spatial recordings replicating the original study, the synchronized desktop view provides a familiar interface for analyzing aggregated data. We validated our concepts in a two-step evaluation consisting of a design walkthrough and an empirical expert user study.
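
To give a rough flavor of how such a mixed-immersion setup can stay consistent, here is a minimal sketch of a shared playback timeline that synchronizes an in-situ VR replay with an ex-situ desktop view. All names here are hypothetical illustrations, not ReLive's actual implementation:

```python
# Minimal sketch of the mixed-immersion idea: a shared playback timeline keeps
# an in-situ (VR) view and an ex-situ (desktop) view in sync. All names are
# hypothetical illustrations, not ReLive's actual implementation.

class SharedTimeline:
    """Single source of truth for the current playback time of a recording."""

    def __init__(self):
        self._listeners = []

    def subscribe(self, listener):
        self._listeners.append(listener)

    def seek(self, time):
        # Either view may seek; every view is notified, so both stay in sync.
        for listener in self._listeners:
            listener(time)


def vr_view(time):
    print(f"[VR]      replaying spatial recording at t={time:.1f}s")


def desktop_view(time):
    print(f"[Desktop] highlighting aggregated data at t={time:.1f}s")


timeline = SharedTimeline()
timeline.subscribe(vr_view)
timeline.subscribe(desktop_view)

# Scrubbing on the desktop timeline also moves the VR replay, and vice versa.
timeline.seek(12.5)
```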

You can watch a video about ReLive here.
ReLive is available as an open-source project on GitHub!

_ _ _ _ _

In addition, a new article from our collaboration with Graz University of Technology and the University of Stuttgart was published in the IEEE Transactions on Visualization and Computer Graphics (TVCG) journal:

RagRug: A Toolkit for Situated Analytics

Philipp Fleck¹; Aimée Sousa Calepso²; Sebastian Hubenschmid³; Michael Sedlmair²; Dieter Schmalstieg¹

¹Graz University of Technology; ²University of Stuttgart; ³University of Konstanz

The article presents RagRug, an open-source toolkit for situated analytics. The abilities of RagRug go beyond previous immersive analytics toolkits by focusing on specific requirements emerging when using augmented reality (AR) rather than virtual reality. RagRug combines state-of-the-art visual encoding capabilities with a comprehensive physical-virtual model, which lets application developers systematically describe the physical objects in the real world and their role in AR. We connect AR visualization with data streams from the Internet of Things using distributed dataflow. To this end, we use reactive programming patterns so that visualizations become context-aware, i.e., they adapt to events coming in from the environment. The resulting authoring system is low-code; it emphasizes describing the physical and the virtual world and the dataflow between the elements contained therein. We describe the technical design and implementation of RagRug, and report on five example applications illustrating the toolkit's abilities.
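
To illustrate the reactive programming pattern the article describes, here is a minimal, self-contained sketch of a dataflow in which incoming sensor events automatically re-encode a situated visualization. All names here are hypothetical illustrations, not RagRug's actual API:

```python
# Minimal sketch of the reactive pattern described above: readings from an
# IoT sensor stream flow through a small dataflow, and the (simulated)
# situated visualization re-encodes itself on every event. All names here
# are hypothetical illustrations, not RagRug's actual API.

class Stream:
    """A push-based event stream with a map operator."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, fn):
        self._subscribers.append(fn)

    def emit(self, value):
        for fn in self._subscribers:
            fn(value)

    def map(self, fn):
        out = Stream()
        self.subscribe(lambda value: out.emit(fn(value)))
        return out


# Temperature readings arriving from a sensor in the environment.
temperature = Stream()

# Dataflow step: turn raw readings into a visual encoding (color by threshold).
encoding = temperature.map(lambda celsius: "red" if celsius > 30 else "blue")

# The AR visualization reacts to each new encoding automatically.
encoding.subscribe(lambda color: print(f"AR marker recolored: {color}"))

temperature.emit(22.0)  # -> AR marker recolored: blue
temperature.emit(35.5)  # -> AR marker recolored: red
```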

You can watch the video here.
RagRug is also available as an open-source project on GitHub!

_ _ _ _ _