Description
The ‘Internet of Things’ (IoT), increasingly cheap and accessible data storage, greater data/sensor readout rates, and the ability to access data remotely from anywhere in the world have together given rise to “Big Data” - the backbone upon which both modern Machine Learning (ML) and Artificial Intelligence (AI) reside. Like the multi-decade exponential growth in processing power driven by unabating increases in transistor density, the amount of globally distributed “Big Data” is also increasing at a mind-boggling rate. While hard to predict accurately, industry and government estimates place the per-capita data volume at more than 500 TB for citizens of developed countries - only a small part of which is held by the user, with sprawling amounts held by businesses, social media networks, video sharing platforms, and e-commerce websites. As more countries (and their citizens) enter the “Big Data Age”, this ‘data density’ (alongside the infrastructure and requirements that underpin it) will only increase.
Alongside these personal and commercial uses for large datasets, nuclear security and monitoring, typically conducted as part of complex wide-area detection networks, also collects and processes vast volumes of intricate, multi-variable data. As with non-radiometric datasets, the growing size and complexity of monitoring data raises a new challenge: how do you best represent such “Big Data” so that expert human operators can make the most appropriate, timely, and informed decisions? What do they need to see? What is essential in underpinning their thought process? And what is surplus to the decision-making? These are the considerations and requirements that this project sought to address for datasets such as SIGMA and the University of Bristol’s own distributed monitoring network, where real-time decision-making, contextualisation, and response are required.
This project has successfully produced a scalable system in which spatially distributed, contextualised, spectral radiometric data is visualised through an intuitive interface, whether on a big screen or in virtual reality (VR; e.g. the Meta Quest), enhancing how this complex data is explored. The use of VR and augmented reality (AR) across the nuclear sector is still in its infancy, but it is hoped that developments and deployments such as this will highlight their promise and future applications across the space.