Recent developments in AR & VR hardware have resulted in a range of nascent commercial products, e.g. the Microsoft Hololens and DAQRI smart helmet (augmented reality), and the Oculus Rift and HTC Vive (virtual reality). Laboratory use is an obvious application of current tether-free AR technology, which could enable new experimental methodologies as well as offer basic procedural, efficiency, training, and health and safety benefits. VR technology, which typically requires tethering to a high-performance PC, provides a complementary platform, better suited to fully immersive computational uses such as multi-dimensional data visualization and big data applications.
Early work with the Hololens, investigating 3D visualization and basic lab usage, has already begun in the group; this project would extend that work, further exploring and developing these capabilities.
Our ongoing work on the calculation of time-dependent wavepackets and observables in photoionization is now collected in an OSF project (DOI: 10.17605/OSF.IO/RJMPD). Aspects of this work have previously been published, but much of the detail and methodology underlying the calculations has remained sitting on our computers. As part of our Open Science Initiative, we’re letting this data go free! Head over to the OSF project “Time-dependent Wavepackets and Photoionization – CS2” for more.
[Figure: TRPADs results. (a) Calculated TRPADs (0.7 eV); (b), (c) comparison with experimental TRPADs at discrete times.]
New on the arXiv:
ePSproc: Post-processing suite for ePolyScat electron-molecule scattering calculations
ePSproc provides codes for post-processing results from ePolyScat (ePS), a suite of codes for the calculation of quantum scattering problems, developed and released by Lucchese & co-workers (Gianturco et al. 1994)(Natalense and Lucchese 1999)(R. R. Lucchese and Gianturco 2016). ePS is a powerful computational engine for solving scattering problems, but its inherent complexity, combined with additional post-processing requirements, ranging from simple visualizations to more complex processing involving further calculations based on ePS outputs, presents a significant barrier to use for most researchers. ePSproc aims to lower this barrier by providing a range of functions for reading, processing and plotting outputs from ePS. Since ePS calculations are currently finding multiple applications in AMO physics (see below), ePSproc is expected to have significant reuse potential in the community, both as a basic tool-set for researchers beginning to use ePS, and as a more advanced post-processing suite for those already using ePS. ePSproc is currently written for Matlab/Octave, and distributed via Github: https://github.com/phockett/ePSproc.
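To give a flavour of the kind of workflow a post-processing suite like ePSproc streamlines — reading structured text output from a scattering code and restructuring it for further analysis — here is a minimal sketch in Python. The file format, labels and function names below are invented for illustration only; they do not reflect ePSproc's actual (Matlab/Octave) API or the real layout of ePS output files.

```python
import re

# Hypothetical snippet of a scattering-calculation text output: blocks of
# (energy, cross-section) pairs. Real ePS output files are far more complex,
# which is exactly the parsing burden a post-processing suite removes.
raw_output = """
# CrossSection block (illustrative only)
E= 1.0  sigma= 12.34
E= 2.0  sigma= 10.12
E= 3.0  sigma=  8.56
"""

def parse_cross_sections(text):
    """Extract (energy, cross-section) pairs from the toy output format."""
    pairs = []
    for match in re.finditer(r"E=\s*([\d.]+)\s+sigma=\s*([\d.]+)", text):
        pairs.append((float(match.group(1)), float(match.group(2))))
    return pairs

data = parse_cross_sections(raw_output)
print(data)  # [(1.0, 12.34), (2.0, 10.12), (3.0, 8.56)]
```

Once the raw text is mapped into native data structures like this, downstream steps (plotting, frame rotations, further matrix-element calculations) become straightforward library calls rather than manual file wrangling.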
We’ve just finished a manuscript summarising our early work with the Hololens, including data visualization and interdisciplinary work. This is a little different in flavour to our usual work, but will provide a solid foundation for more advanced work with the Hololens, including lab uses and more advanced data visualization.
Paul Hockett & Tim Ingleby
Early hands-on experiences with the Microsoft Hololens augmented/mixed reality device are reported and discussed, with the general aim of exploring basic 3D visualization. A range of use cases is tested, including data visualization and immersive data spaces, in-situ visualization of 3D models, and full-scale architectural form visualization. Ultimately, the Hololens is found to be a remarkable tool for moving from traditional visualization of 3D objects on a 2D screen to fully experiential 3D visualizations embedded in the real world.
The manuscript is currently available on Authorea, and the arXiv.
The Hololens (microsoft.com/microsoft-hololens/) is here! Welcome to week 3* of the future, with augmented/mixed reality.
This week, some large scale basic 3D visualization as we begin to explore the power of the Hololens…
* Original video, July 2016; not uploaded until Sept. 2016 due to embargo on the cabin design.
A full manuscript on this work is now available.
The Hololens is here! Welcome to week 4 of the future, with augmented/mixed reality.
This week, a bit of basic use in the laser lab, using the Hololens for a remote desktop feed and rapid reference data snapshots. These are first steps towards more sophisticated and interactive use in the lab, which could bring together data from multiple discrete instruments around the lab and present it to the user either in a spatially fixed form (as in the video), or as a HUD which tracks the user and remains visible as they move around.
Some additional notes & links: