Authors: 
Klaus Mueller
Abstract: 
When I began my research career, volume rendering was a hot topic. It was invented in part because of great advances on another front: medical imaging. These emerging modalities, such as X-ray Computed Tomography, produced data that needed to be visualized fully in 3D and not just by surfaces. During my PhD years and afterwards, I worked in both domains: 3D volume reconstruction from the projection data produced by CT, and the visualization of these volumes using volume rendering. Volume data were considered 'big' back then and their visualization 'computationally expensive'. Fortunately, another great advance came along: the birth of commodity graphics hardware, now known as GPUs. The first attempts to use these boards for volume rendering were largely hacks, albeit very creative ones. But eventually, driven by the strong market forces of computer games, both the hardware and the APIs of GPUs became very flexible, and one could soon render even large volumes in amazing beauty, with complex special effects, and at interactive speeds. So where to go from here: increase data size, add more dimensions, make things more irregular? I decided to do it all and ventured into the dark universe of high-dimensional data. There, I was soon 'cursed by high dimensionality' and got lost in the maze of 'redundant subspaces'. But eventually I 'illuminated my path' and hopefully that of others. On my journey I found that knowing volume rendering can be quite helpful for understanding some of the issues that arise in high-dimensional data visualization, for example, in sampling and rendering. In this talk I want to share my experiences and also present our software package 'The ND-Scope', which features some of the practical outcomes of the research I have conducted along with my students in recent years.