
The School of Biomedical Science at Monash University runs a Talented Student Program for high-achieving students in the Bachelor of Biomedical Science Advanced Honours and Scholars programs. As part of this program, David Barnes presented a talk on 4 May 2017 on the capabilities of the Monash Immersive Visualisation Platform, concluding with a tour of the Monash CAVE2 facility to show biomedical applications.

MIVP and Monash Global Terrorism Research Centre student John Pollard presented “Man, The Machine, and War” at the DSI-DSTG discovery workshop “Bi-Directional Communication for Human-Robot Teaming”, held 6-7 April in Melbourne.

“Man, The Machine, & War” is a multipart project to determine the extent to which artificial autonomous agents can comply with the requirements of the laws of armed conflict, the prescriptions of just war theory, and the expectations of moral conduct of warfighters in the modern battlespace. Building on the premise that artificial autonomous agents will be required to conduct themselves with the same degree of moral competence as their human counterparts, this presentation outlines the combined capabilities of the Monash Immersive Visualisation Platform CAVE2 facility and Bohemia Interactive Simulations’ “Virtual Battlespace 3”.

Real-time testing of autonomous agents partnered with dismounted infantry in Man-Machine Combat Teams is the most difficult problem in evaluating human-autonomy teaming, because of the inherent limitations of head-mounted virtual reality systems, which are generally better suited to vehicle simulations. The CAVE2 platform is an ideal environment for real-time ethical testing of artificial moral agency, in realistic tactical simulations that are scalable from individual kinetic action up to theatre-wide conflict scenarios, and in strategic simulations scalable from ocean depths to low Earth orbit.

Offering a solution to the dismounted man-machine combat team simulation problem, with hybrid AR-VR in a room-scale arena, this presentation introduces DSTG and Army to ready and familiar real-time simulations on a platform that is fully compatible with present and future doctrine and training. This unique combination of systems allows the interactions within and between these teams to be analysed, and enables ethical testing of potentially lethal situations in deadly conflict zones and hostile contested spaces.

David Barnes, Director of MIVP, gave a short overview of large-scale comparative visualisation - "parallel visualisation" - to the Australian National Data Service (ANDS) community at the Tech Talk sessions on Friday 3 March 2017. Slides from the talk are available.

Hilton San Francisco Union Square Hotel

Oral presentation - Stereoscopic Displays and Applications XXVII

Bjorn Sommer - 2016 February 15

Stereoscopic Space Map – Semi-immersive Configuration of 3D-stereoscopic Tours in Multi-display Environments

Although large-scale stereoscopic 3D environments like CAVEs are a favorable location for group presentations, the perspective projection and stereoscopic optimization usually follow a navigator-centric approach. As a result, these presentations are often accompanied by strong side effects, such as motion sickness, which is frequently caused by disturbed stereoscopic vision. The reason is that the stereoscopic visualization is usually optimized for the only head-tracked person in the CAVE – the navigator – ignoring the needs of the real target group: the audience.

To overcome this shortcoming, this work proposes an alternative to head tracking-based optimization of the stereoscopic effect. Using an interactive virtual overview map in 3D, both pre-tour and on-tour configuration of the stereoscopic effect is provided, partly utilizing our previously published interactive projection plane approach. This Stereoscopic Space Map is visualized on a zSpace 200®, while the virtual world is shown on a panoramic 330° CAVE2™.

A pilot expert study with eight participants was conducted using pre-configured tours through 3D models. Comparison of manual and automatic stereoscopic adjustment showed that the proposed approach is an appropriate alternative to the currently common head tracking-based stereoscopic adjustment.

Rydges World Square Sydney

Oral presentation - Astronomical Data Analysis Software and Systems XXV

Christopher Fluke - 2015 October 28

The Ultimate Display

Astronomical images and datasets are increasingly high-resolution and multi-dimensional. The vast majority of astronomers perform all of their visualisation and analysis tasks on low-resolution, two-dimensional desktop monitors. If there were no technological barriers to designing the ultimate immersive stereoscopic display for astronomy, what would it look like? What capabilities would we require of our compute hardware to drive it? And are existing technologies even close to providing a true 3D experience that is compatible with the depth resolution of human stereoscopic vision? With the CAVE2 (an 80 Megapixel, hybrid 2D and 3D virtual reality environment directly integrated with a 100 Tflop/s GPU-powered supercomputer) and the Oculus Rift (a low-cost, head-mounted display) as examples at opposite financial ends of the immersive display spectrum, I will discuss the changing face of high-resolution, immersive visualisation for astronomy.