The life of Brayns

One of the keys to understanding how the brain works as a whole is visualising how the individual cells function. In particular, the more morphologically accurate the visualisation, the easier it is for experts in the biological field to validate cell structures; photo-realistic rendering is therefore important.

The Blue Brain Project has made major efforts to create morphologically accurate neurons to simulate sub-cellular and electrical activities, e.g. molecular simulations of neuron biochemistry or multi-scale simulations of neuronal function. Ray-tracing can help to highlight areas of the circuits where cells touch each other and where synapses are being created. In combination with ‘global illumination’, which uses light, shadow, and depth of field effects to simulate photo-realistic images, this technique makes it easier to visualise how the neurons function.

Brayns is a visualiser that can perform ray-traced rendering of scientific data. It provides an abstraction of the underlying rendering engines, so that the best possible acceleration libraries can be used for the relevant hardware.

https://github.com/BlueBrain/Brayns.git

Here is its story... (WORK IN PROGRESS...)

1995

North-East Wales Institute of Technology

 


2011

Strange encounter with GPU programming 

 


It all suddenly started with the "CUDA, Supercomputing for the Masses" article series by Rob Farber.

2012

DICOM attempt

 

 

Client/Server architecture

Thanks to my experience with client/server architectures, the engine becomes cloud-ready. The client sends information such as mouse and keyboard events to the server, which takes care of the rendering and sends a stream of images back to the client. Transport is optimized using compression technologies. Each client can enjoy a different, fully customizable view of the 3D scene.

 

Source code for the client/server setup using ZeroC's ICE framework:

 

Acceleration structures

To achieve optimal rendering speed, I implement a naive bounding volume hierarchy. This technique allows the ray-tracing engine to handle thousands of objects, and the GPU implementation provides the computing power needed for real-time rendering.

 

2013

Birth of the Sol-R Engine

Sol-R is an open-source CUDA/OpenCL-based real-time ray-tracer compatible with the Oculus Rift DK1, Kinect, Razer Hydra and Leap Motion devices. Sol-R was used by the Interactive Molecular Visualiser project (http://www.molecular-visualization.com).
A number of videos can be found on my channel: https://www.youtube.com/user/CyrilleOnDrums
Sol-R was written as a hobby project in order to understand and learn more about CUDA and OpenCL. Most of the code was written at night and during weekends, meaning that it's probably not the best quality ever ;-)
The idea was to produce a ray-tracer with its own "personality". Most of the code does not rely on the ray-tracing literature, but rather on a naive approach to what rays could be used for. The idea was not to produce a physically based ray-tracer, but a simple engine that could produce cool images interactively.
Take it for what it is! Sol-R is a lot of fun to play with if you like coding computer generated images.

The Sol-R source code is available here: https://github.com/favreau/Sol-R
 

Interactive ray-tracing on an Android mobile device

First, an HTTP server is required to serve images generated by Sol-R:


And then a client application using the Android SDK:


 

Protein visualizer

 





 

Virtual Reality with the Oculus Rift DK1 and multiple GPUs

 

Sol-R is now able to simultaneously produce a different view for each eye, making it ready for NVIDIA 3DVision: simply capture the window and play it back in a 3DVision-compatible player to enjoy an immersive experience.


Sol-R also provides a cheap way to visualize molecules in 3D, thanks to anaglyph technology. Get a pair of glasses for less than $2 and enjoy an immersive and unmatched experience.

 

From OpenGL to Ray-tracing in 2 lines of code

 

4-view visualizer

 

Fully ray-traced viewports

 

 

2014

Joining the Blue Brain Project at EPFL


2015

Initial version of Brayns

First prototype of a hardware-agnostic, high-performance ray-tracer dedicated to scientific visualization.

2016

First iteration of the Web UI over HTTP and open-sourcing of Brayns

Demo with Intel at SC Frankfurt


2017

First iteration of the Python API over HTTP

https://github.com/favreau/pyBrayns