The Blue Brain Project has made major efforts to create morphologically accurate neurons to simulate sub-cellular and electrical activities, e.g. molecular simulations of neuron biochemistry or multi-scale simulations of neuronal function. Ray-tracing can help to highlight areas of the circuits where cells touch each other and where synapses are being created. In combination with ‘global illumination’, which uses light, shadow, and depth of field effects to simulate photo-realistic images, this technique makes it easier to visualise how the neurons function.
Brayns is a visualiser that can perform ray-traced rendering of scientific data. It provides an abstraction of the underlying rendering engines, so that the best possible acceleration libraries can be used for the relevant hardware.
Here is its story... (WORK IN PROGRESS...)
North-East Wales Institute of Technology
Strange encounter with GPU programming
Client Server architecture
Thanks to my experience with client/server architectures, the engine becomes cloud-ready. The client sends information such as mouse and keyboard events to the server, which takes care of the rendering and sends a stream of images back to the client. Transport is optimized using compression technologies. Each client can enjoy a different and fully customizable view of the 3D scene.
Source code for the client/server setup using ZeroC's Ice framework:
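The original listing is not reproduced here, but the exchange described above could be captured in a minimal Slice interface definition along these lines (module, type, and operation names are hypothetical, not taken from the project):

```slice
// ImageStreamer.ice -- hypothetical Slice sketch, not the project's actual interface
module solr
{
    sequence<byte> ByteSeq;

    struct Event
    {
        int x;      // mouse position
        int y;
        int button; // mouse button state
        int key;    // keyboard key code
    };

    interface ImageStreamer
    {
        // Client pushes mouse/keyboard events to the server
        void sendEvent(Event e);

        // Server returns the latest rendered frame, compressed
        ByteSeq getFrame(int width, int height);
    };
};
```

Ice would then generate the client proxy and server skeleton from this definition, so each client can drive its own view of the scene through the same interface.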
To achieve optimal rendering speed, I implement a naive bounding volume hierarchy. This technique allows the ray-tracing engine to work with thousands of objects. The GPU implementation offers the computing power necessary for real-time rendering.
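The idea behind the naive bounding volume hierarchy can be sketched as follows: primitives are grouped into axis-aligned boxes, a ray is tested against each box first, and only the primitives inside boxes that the ray actually hits need a full intersection test. This is an illustrative sketch with made-up names, not Sol-R's actual code:

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };
struct Sphere { Vec3 c; float r; };

// A bounding box that grows to enclose the primitives added to it.
struct AABB
{
    Vec3 lo{ 1e30f, 1e30f, 1e30f };
    Vec3 hi{ -1e30f, -1e30f, -1e30f };
    std::vector<Sphere> prims;

    void add(const Sphere& s)
    {
        lo = { std::min(lo.x, s.c.x - s.r), std::min(lo.y, s.c.y - s.r),
               std::min(lo.z, s.c.z - s.r) };
        hi = { std::max(hi.x, s.c.x + s.r), std::max(hi.y, s.c.y + s.r),
               std::max(hi.z, s.c.z + s.r) };
        prims.push_back(s);
    }
};

// Slab test: does the ray (origin o, direction d) intersect the box?
// Naive version: assumes the origin does not lie exactly on a slab plane.
bool hitBox(const AABB& b, const Vec3& o, const Vec3& d)
{
    float tmin = 0.f, tmax = 1e30f;
    const float os[3] = { o.x, o.y, o.z }, ds[3] = { d.x, d.y, d.z };
    const float los[3] = { b.lo.x, b.lo.y, b.lo.z };
    const float his[3] = { b.hi.x, b.hi.y, b.hi.z };
    for (int i = 0; i < 3; ++i)
    {
        const float inv = 1.f / ds[i];
        float t0 = (los[i] - os[i]) * inv;
        float t1 = (his[i] - os[i]) * inv;
        if (inv < 0.f) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
        if (tmax < tmin) return false;
    }
    return true;
}

// How many primitives actually need a full ray-sphere test for this ray.
size_t candidateCount(const std::vector<AABB>& boxes, const Vec3& o, const Vec3& d)
{
    size_t n = 0;
    for (const auto& b : boxes)
        if (hitBox(b, o, d))
            n += b.prims.size();
    return n;
}
```

With thousands of spheres split across a few dozen boxes, most rays reject entire boxes with a single cheap test instead of testing every sphere, which is what makes the approach viable on the GPU.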
Birth of the Sol-R Engine
Sol-R is an open-source CUDA/OpenCL-based real-time ray-tracer compatible with the Oculus Rift DK1, Kinect, Razer Hydra and Leap Motion devices. Sol-R was used by the Interactive Molecular Visualiser project (http://www.molecular-visualization.com)
A number of videos can be found on my channel: https://www.youtube.com/user/CyrilleOnDrums
Sol-R was written as a hobby project in order to understand and learn more about CUDA and OpenCL. Most of the code was written at night and during weekends, meaning that it's probably not the best quality ever ;-)
The idea was to produce a ray-tracer with its own "personality". Most of the code does not rely on any literature about ray-tracing, but rather on a naive approach to what rays could be used for. The idea was not to produce a physically based ray-tracer, but a simple engine that could produce cool images interactively.
Take it for what it is! Sol-R is a lot of fun to play with if you like coding computer generated images.
The Sol-R source code is available here: https://github.com/favreau/Sol-R
Interactive ray-tracing on Android mobile device
First, an HTTP server is required to serve images generated by Sol-R:
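The original server code is not shown here, but the core of such a server is simply wrapping each compressed frame in an HTTP response; a real implementation would also run an accept() loop on a socket and write this response for every request. This is a hypothetical sketch (the function name and framing details are assumptions):

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Wrap a JPEG-compressed frame in a minimal HTTP/1.1 response so a
// mobile client can fetch rendered images with a plain HTTP GET.
std::string makeHttpResponse(const std::vector<uint8_t>& jpeg)
{
    std::string r = "HTTP/1.1 200 OK\r\n";
    r += "Content-Type: image/jpeg\r\n";
    r += "Content-Length: " + std::to_string(jpeg.size()) + "\r\n";
    r += "Connection: close\r\n\r\n";
    r.append(jpeg.begin(), jpeg.end()); // body: the raw JPEG bytes
    return r;
}
```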
And then a client application using the Android SDK:
Virtual Reality with Oculus Rift DK1 with multiple GPUs
Sol-R is now able to simultaneously produce a different view for each eye, making it ready for NVIDIA 3D Vision: simply capture the window and play it back in a 3D Vision-compatible player to enjoy immersive experiences.
Sol-R also provides a cheap way to visualize molecules in 3D, thanks to anaglyph technology. Get a pair of glasses for less than $2 and enjoy an immersive and unmatched experience.
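A red-cyan anaglyph is cheap to compute once the two eye images exist: each output pixel takes its red channel from the left eye's image and its green and blue channels from the right eye's. An illustrative sketch (not Sol-R's actual code):

```cpp
#include <cstdint>

struct RGB { uint8_t r, g, b; };

// Combine one pixel from each eye into a red-cyan anaglyph pixel:
// red filter passes the left view, cyan filter passes the right view.
RGB anaglyphPixel(const RGB& left, const RGB& right)
{
    return { left.r, right.g, right.b };
}
```

Applied to every pixel pair, this turns the stereo render into a single image viewable with $2 red-cyan glasses.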
From OpenGL to Ray-tracing in 2 lines of code
Fully raytraced viewports
Join Blue Brain Project at EPFL
Initial version of Brayns
First prototype of a hardware-agnostic, high-performance ray-tracer dedicated to scientific visualization.
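Hardware-agnostic here means the application talks to an abstract engine interface, and a concrete backend suited to the available hardware is selected at runtime. A minimal sketch of that pattern (class and factory names are assumptions, not Brayns' actual API):

```cpp
#include <memory>
#include <stdexcept>
#include <string>

// Abstract rendering engine: the application only sees this interface.
class Engine
{
public:
    virtual ~Engine() = default;
    virtual std::string name() const = 0;
    virtual void render() = 0; // trace one frame with the backend's own kernels
};

class CpuEngine : public Engine
{
public:
    std::string name() const override { return "cpu"; }
    void render() override { /* CPU ray-tracing kernels would run here */ }
};

class GpuEngine : public Engine
{
public:
    std::string name() const override { return "gpu"; }
    void render() override { /* GPU ray-tracing kernels would run here */ }
};

// Factory: pick the acceleration backend best suited to the hardware.
std::unique_ptr<Engine> createEngine(const std::string& type)
{
    if (type == "cpu") return std::make_unique<CpuEngine>();
    if (type == "gpu") return std::make_unique<GpuEngine>();
    throw std::runtime_error("unknown engine: " + type);
}
```

Because callers depend only on `Engine`, swapping the underlying acceleration library requires no change to the visualization code.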