Sunday, February 18, 2018

Transfer function editor in Jupyter Notebook

Just created a Jupyter Notebook widget for volume rendering with Brayns. Also improved the volume rendering module for better performance and image quality.

The code sources are:
Volume rendering module:

And the code for the transfer function is the following:

import seaborn as sns
from ipywidgets import widgets, Layout, Button, Box, VBox, ColorPicker
import webcolors
%matplotlib inline

class ColorMapEditor(object):

    def __init__(self, renderer, filename=None, name='rainbow',
                 size=32, alpha=0.0, data_range=(0, 255),
                 continuous_update=True):
        self.renderer = renderer
        self.palette = list()
        self.alpha_sliders = list()
        self.color_pickers = list()
        self.continuous_update = continuous_update
        self.data_range = data_range
        self.send_updates_to_renderer = True
        if filename is None:
            # Initialize palette from seaborn
            for color in sns.color_palette(name, size):
                self.palette.append([color[0], color[1], color[2], alpha])
        else:
            # Load palette from file
            self._load(filename)

        # Create controls and assign palette
        self._create_controls()
        self._update_controls()

    def _load(self, filename):
        # Clear the current palette
        self.palette = list()

        # Read colormap file: one 'r g b a' line per entry
        with open(filename, 'r') as f:
            for line in f:
                words = line.split()
                if len(words) == 4:
                    r = float(words[0])
                    g = float(words[1])
                    b = float(words[2])
                    a = float(words[3])
                    self.palette.append([r, g, b, a])

    def save(self, filename):
        with open(filename, 'w') as f:
            f.write(str(len(self.palette)) + '\n')
            for color in self.palette:
                f.write(' '.join(str(c) for c in color) + '\n')

    def set_palette(self, name):
        size = len(self.palette)
        new_palette = sns.color_palette(name, size)
        for i in range(size):
            color = new_palette[i]
            self.palette[i] = [color[0], color[1], color[2], self.palette[i][3]]

    def set_range(self, data_range):
        self.data_range = data_range

    def _html_color(self, index):
        color = self.palette[index]
        return '#%02x%02x%02x' % (int(color[0] * 255),
                                  int(color[1] * 255),
                                  int(color[2] * 255))

    def _update_colormap(self, change):
        self._send_colormap_to_renderer()

    def _update_colorpicker(self, change):
        for i in range(len(self.palette)):
            self.alpha_sliders[i].style.handle_color = self.color_pickers[i].value
        self._send_colormap_to_renderer()

    def _create_controls(self):
        self.send_updates_to_renderer = False        
        # Layout        
        alpha_slider_item_layout = Layout(
            overflow_x='hidden', height='180px', max_width='20px')
        color_picker_item_layout = Layout(
            overflow_x='hidden', height='20px', max_width='20px')
        box_layout = Layout(display='inline-flex', flex_flow='row wrap')

        # Sliders
        self.alpha_sliders = [widgets.IntSlider(
            value=int(self.palette[i][3] * 255), min=0, max=255, step=1,
            orientation='vertical',
            continuous_update=self.continuous_update,
            layout=alpha_slider_item_layout
        ) for i in range(len(self.palette))]

        # Color pickers
        self.color_pickers = [ColorPicker(
            concise=True, value=self._html_color(i),
            layout=color_picker_item_layout,
            disabled=False) for i in range(len(self.palette))]

        # Display controls
        color_box = Box(children=self.color_pickers)
        alpha_box = Box(children=self.alpha_sliders)
        box = VBox([color_box, alpha_box], layout=box_layout)
        display(box)

        # Attach observers        
        for i in range(len(self.palette)):
            self.alpha_sliders[i].observe(self._update_colormap, names='value')
            self.color_pickers[i].observe(self._update_colorpicker, names='value')
        self.send_updates_to_renderer = True

    def _update_controls(self):
        self.send_updates_to_renderer = False        
        for i in range(len(self.palette)):
            color = self._html_color(i)
            self.alpha_sliders[i].style.handle_color = color
            self.color_pickers[i].value = color
        self.send_updates_to_renderer = True

    def _send_colormap_to_renderer(self):
        if not self.send_updates_to_renderer:
            return
        self.renderer.material_lut.diffuse = []
        self.renderer.material_lut.contribution = []
        self.renderer.material_lut.emission = []

        for i in range(len(self.palette)):
            try:
                color = webcolors.name_to_rgb(self.color_pickers[i].value)
            except ValueError:
                color = webcolors.hex_to_rgb(self.color_pickers[i].value)

            c = [
                float(color[0]) / 255.0,
                float(color[1]) / 255.0,
                float(color[2]) / 255.0,
                float(self.alpha_sliders[i].value) / 255.0
            ]
            self.palette[i] = c
            self.renderer.material_lut.diffuse.append([c[0], c[1], c[2], c[3]])
            self.renderer.material_lut.emission.append([0, 0, 0])

        self.renderer.material_lut.range = self.data_range
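For reference, the colormap file format written by save() (and read back through the filename constructor argument) is plain text: a line with the number of entries, then one 'r g b a' line per entry, with all components in [0, 1]. A hypothetical 4-entry file would look like:

```
4
1.0 0.0 0.0 0.0
1.0 1.0 0.0 0.25
0.0 1.0 0.0 0.5
0.0 0.0 1.0 1.0
```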

Saturday, February 10, 2018

Sharing my experimental modules (plugins) for Brayns

I created this Github repository to share my experiments with Brayns. It's all about shaders, volume rendering, custom cameras, etc.

Feel free to download, try, contribute, star the repo, etc :) Hope this will be helpful to anyone willing to enjoy ray-tracing based computer graphics.

Tuesday, January 23, 2018

Advanced interactive volume rendering

Realtime volume rendering using:
  • Phong and Blinn shading. The normal is defined by a vector going from the voxel position to the barycenter of the surrounding voxels, weighted by opacity. I got the idea in the middle of the night, thanks to a strong winter storm that prevented me from getting a good rest, but it seems to work OK :-)
  • Soft and hard shadows 
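The shading-normal trick from the first bullet can be sketched as follows. This is my reconstruction in Python of the idea described above, not the actual Brayns shader code; the volume is assumed to store one opacity value per voxel, and p is assumed to be an interior voxel:

```python
import numpy as np

def gradient_free_normal(opacity, p):
    """Normal at voxel p: the direction from the opacity-weighted
    barycenter of the 26 surrounding voxels towards p."""
    x, y, z = p
    barycenter = np.zeros(3)
    total_weight = 0.0
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                if dx == dy == dz == 0:
                    continue  # skip the voxel itself
                q = (x + dx, y + dy, z + dz)
                w = opacity[q]
                barycenter += w * np.array(q, dtype=float)
                total_weight += w
    if total_weight == 0.0:
        return np.zeros(3)  # fully transparent neighbourhood: no normal
    barycenter /= total_weight
    n = np.array(p, dtype=float) - barycenter
    norm = np.linalg.norm(n)
    return n / norm if norm > 0.0 else np.zeros(3)
```

For a flat opaque slab, the barycenter sits inside the slab, so the normal points away from it, exactly as a surface normal should.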

 Source code:

Demo running on Quadro K6000 NVIDIA GPU (960x540 pixels)

Sunday, January 21, 2018

Brayns is rendering everything, everywhere!

Thanks to Docker, Brayns is now available on any platform (Windows, Mac and Linux), in a couple of clicks only:

First download and start the Brayns renderer:
docker pull bluebrain/brayns
docker run -ti --rm -p 8200:8200 bluebrain/brayns

Then download the corresponding web UI:
docker pull bluebrain/sauron
docker run -ti --rm -p 8080:8080 bluebrain/sauron

And connect to the UI using the following address:

That's it, you're done! Enjoy, and contribute! :-)

Tuesday, January 9, 2018

Happy new year 2018!

For most people, it is a stretch of the imagination to understand the world in four dimensions but a new study has discovered structures in the brain with up to eleven dimensions -- ground-breaking work that is beginning to reveal the brain's deepest architectural secrets.

Thursday, November 16, 2017

Visualization of brain simulation (87TB dataset)

Below are a few frames from the video I led in the context of the Blue Brain Project. The full video should soon be available on various channels. Stay tuned.

The video was produced using Brayns, the visualizer I created for high-quality visualization of large neuroscience datasets.

The rendering technique is of course ray-tracing with a bit of global illumination to enhance the rendering of neuronal activity.

Visiting K Computer in Kobe

Many thanks to my good friend Syoyo for making it possible to visit the K Computer in Kobe. Great time, great people! Inspiring :)

Monday, September 25, 2017

Real-time raytracing of animated nD geometry

Let's face reality, 3D is dead, long live nD. Now that the Sol-R ray-tracer is more or less complete (for what it was initially designed for, anyway), it's time to move on to the next level. I added the Hypercube scene to the Sol-R viewer so that we can now play around with n-dimensional hypercubes, thanks to the Delphi code by cs_Forman.

Now that it's pretty clear that our world has more than 3 dimensions, I am hoping that Sol-R can help us understand what n-dimensional geometry looks like when it intersects our 3D world.
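To give a flavour of what such a scene involves, here is a minimal Python sketch (not the Sol-R or cs_Forman code) that generates the 16 vertices of a tesseract and perspective-projects them down to 3D, ready to hand to a 3D renderer:

```python
import itertools
import numpy as np

def hypercube_vertices(n):
    """All 2^n vertices of the unit n-cube, centered at the origin."""
    return np.array(list(itertools.product((-0.5, 0.5), repeat=n)))

def project_to_3d(points, distance=2.0):
    """Perspective-project n-D points down to 3-D, one axis at a time:
    each extra coordinate acts as a depth term, like z does for a camera."""
    pts = points.astype(float)
    while pts.shape[1] > 3:
        w = pts[:, -1]
        scale = distance / (distance - w)  # simple pinhole projection
        pts = pts[:, :-1] * scale[:, None]
    return pts

verts4d = hypercube_vertices(4)   # 16 vertices of a tesseract
verts3d = project_to_3d(verts4d)  # "near" and "far" cubes at different scales
```

The projection makes the two 3D cells of the tesseract appear as nested cubes, the classic cube-within-a-cube picture.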

The code is there:

PS: The code was written while watching the inspiring Christopher Nolan Interstellar movie :-)

Saturday, July 8, 2017

Saturday, June 17, 2017

Deep learning and neuron visualization

Currently teaching my network to go from a simple representation of a neuron morphology to a more detailed and realistic representation. Here are the first (promising) results:

Tuesday, March 7, 2017

Global illumination / Volume rendering

Last night, I got inspired by the Exposure Render: An Interactive Photo-Realistic Volume Rendering Framework, and started adding global illumination to volume rendering in Brayns. The first results are the following ones, and the code is about to be merged :-)

Monday, February 20, 2017

When Virtual Drums meet Interactive Brain Visualization

Could not help it, had to synchronize my VDrums with Brayns, the large scale interactive brain visualizer I created at EPFL.

Components used for the demo:
- Mido: 
- Brayns:

Wednesday, January 25, 2017

In the insideHPC News!

 In this silent video from the Blue Brain Project at SC16, 865 segments from a rodent brain are simulated with isosurfaces generated from Allen Brain Atlas image stacks. The work is derived from the INCITE program’s project entitled: Biophysical Principles of Functional Synaptic Plasticity in the Neocortex.

I produced 2 sequences of that video using Brayns, the application I designed in the context of the Blue Brain Project.

Monday, January 9, 2017

Happy New Year!!!

Happy new year, best wishes to all, and ray-tracing forever!

Saturday, December 3, 2016

Brayns for neuro-robotics

Last Tuesday, I presented how Brayns could be used to render high-quality images for our colleagues in the neuro-robotics team. Brayns is hardware agnostic, and it takes no more than one command-line argument to switch between the OSPRay (CPU) and OptiX (GPU) backends. The following video shows Brayns in action, on a 24-megapixel display wall! Brayns is running on a single machine powered by 2 NVIDIA Quadro K5000 GPUs.

Friday, October 28, 2016

SIMD accelerated voxelizer

SIMDVoxelizer is a CPU-based voxelizer that takes advantage of vectorization units to speed up the creation of 8-bit raw volumes.

usage: SIMDVoxelizer <voxel_size> <cutoff_distance> <input_file> <output_file>

The input file is a binary array of floats: x, y, z, radius and value for each element. Each voxel of the final volume contains the sum of all elements, with a weight that corresponds to the value of the element divided by its squared distance to the voxel. Note that values in the final volume are normalized.

This is currently a brute-force implementation that produces accurate 8-bit volumes.

The <output_file> is suffixed by the size of the volume.
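The weighting scheme described above can be sketched in Python with NumPy. This is a brute-force reconstruction from the description, not the actual ISPC code; the function and parameter names are illustrative, and the distance is clamped to the element radius to avoid the singularity at the element's center:

```python
import numpy as np

def voxelize(elements, voxel_size, cutoff_distance, volume_shape,
             origin=(0.0, 0.0, 0.0)):
    """Brute-force voxelization: every voxel accumulates, for each element
    within cutoff_distance, the element value divided by its squared
    distance to the voxel. The result is normalized to an 8-bit volume."""
    volume = np.zeros(volume_shape, dtype=np.float64)
    zi, yi, xi = np.indices(volume_shape)
    # World-space coordinates of the voxel centers
    centers = np.stack([origin[0] + (xi + 0.5) * voxel_size,
                        origin[1] + (yi + 0.5) * voxel_size,
                        origin[2] + (zi + 0.5) * voxel_size], axis=-1)
    for x, y, z, radius, value in elements:
        d2 = np.sum((centers - np.array([x, y, z])) ** 2, axis=-1)
        d2 = np.maximum(d2, radius * radius)  # clamp to avoid division by zero
        mask = d2 <= cutoff_distance ** 2
        volume[mask] += value / d2[mask]
    vmax = volume.max()
    if vmax > 0:
        volume = volume / vmax * 255.0  # normalize to 8 bits
    return volume.astype(np.uint8)
```

The real implementation vectorizes the inner loop with ISPC; the sketch only keeps the weighting and normalization.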

SIMDVoxelizer makes use of the Intel ISPC compiler and requires ispc to be in the PATH.

To build SIMDVoxelizer, simply run make in the source folder.

Source code available on github.

Sunday, October 16, 2016

Volume rendering and raytracing ... merged!

It's there, merged and operational in Brayns, and the first results are quite exciting.

Mixing volume rendering and ray-tracing allows more control over atmospheric effects which, in the case of neuroscience, can be used to represent the electromagnetic fields surrounding the neurons.

Saturday, October 8, 2016

Volume Rendering coupled to Ray-Tracing: a win-win combination

I am currently working on adding volume rendering to the existing Brayns implementation. Volumes are clearly the way to go to represent the activity that takes place outside of the geometry. Ray-tracing, on the other hand, concentrates on high-quality surface rendering, shadows and other global illumination effects.

Many thanks to my colleagues Raphael and Grigori, without whom this development would have taken a few more days!

Currently in my volume branch, but soon to be merged into master!!

Saturday, September 24, 2016

Framework for efficient synthesis of spatially embedded morphologies

I was delighted to contribute to this paper:



Many problems in science and engineering require the ability to grow tubular or polymeric structures up to large volume fractions within a bounded region of three-dimensional space. Examples range from the construction of fibrous materials and biological cells such as neurons, to the creation of initial configurations for molecular simulations. A common feature of these problems is the need for the growing structures to wind throughout space without intersecting. At any time, the growth of a morphology depends on the current state of all the others, as well as the environment it is growing in, which makes the problem computationally intensive. Neuron synthesis has the additional constraint that the morphologies should reliably resemble biological cells, which possess nonlocal structural correlations, exhibit high packing fractions, and whose growth responds to anatomical boundaries in the synthesis volume. We present a spatial framework for simultaneous growth of an arbitrary number of nonintersecting morphologies that presents the growing structures with information on anisotropic and inhomogeneous properties of the space. The framework is computationally efficient because intersection detection is linear in the mass of growing elements up to high volume fractions and versatile because it provides functionality for environmental growth cues to be accessed by the growing morphologies. We demonstrate the framework by growing morphologies of various complexity.

Saturday, August 13, 2016

360 previews of 7'000 neurons

Thanks to the OSPRay equirectangular camera, Brayns can now render 360 images and videos.
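For context, an equirectangular camera simply maps every pixel to a direction on the unit sphere: the horizontal image axis covers 360° of azimuth and the vertical axis 180° of elevation. A minimal sketch of that mapping (a common convention, not OSPRay's actual implementation):

```python
import numpy as np

def equirect_ray_direction(u, v):
    """Map normalized image coordinates (u, v) in [0, 1] to a ray
    direction on the unit sphere, as an equirectangular camera does."""
    azimuth = (u - 0.5) * 2.0 * np.pi  # full 360 degrees horizontally
    elevation = (0.5 - v) * np.pi      # 180 degrees vertically
    return np.array([
        np.cos(elevation) * np.sin(azimuth),
        np.sin(elevation),
        np.cos(elevation) * np.cos(azimuth),
    ])
```

Tracing one such ray per pixel yields a 360° image that standard viewers can unwrap.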

Wednesday, July 13, 2016

In the news:

Brayns in the news, thanks to an amazing collaboration with the Intel team.

Using Intel’s Xeon Phi for Brain Research Visualisation

Sunday, June 26, 2016

Bringing interactive ray-tracing to neuroscience

Brayns, the application I've been working on for 6 months at EPFL, is now open-source!

To me, this application represents the first step to bringing interactive ray-tracing to neuroscience. Very exciting moments and millions of ideas are on their way. Stay tuned!

Saturday, June 25, 2016

ISC 2016 Tech Demo on Intel's new KNL architecture

Had a great time presenting Brayns at the Intel booth ISC2016.

Brayns is finally open-source, star it, fork it and contribute to introducing ray-tracing to the world of Neuro-Science. 

Brayns was presented running on 4 KNL nodes, and it ran amazingly fast: up to 60fps in Full HD on a model of 500 million geometric primitives (spheres, cones and cylinders).

Sunday, May 22, 2016

Short trip in the virtual brain

Work in progress... Cool camera lens effects coupled with ambient occlusion and light emitting objects, thanks to ray-tracing.

Source code is meant to go open-source in the next few weeks on the Blue Brain GitHub repository.

Stay tuned!

Monday, May 16, 2016

Stepping into the virtual brain


One of the keys towards understanding how the brain works as a whole is visualisation of how the individual cells function; photo-realistic rendering is therefore important.