Tuesday, October 2, 2018

Talk at EPFL - Visual Computing Seminar

Super excited to visit Wenzel Jakob and give the following talk in the context of the Visual Computing Seminar.

Blue Brain Brayns, A platform for high-fidelity, large-scale and interactive visualization of scientific data and brain structures

The Blue Brain Project has made major efforts to create morphologically accurate neurons to simulate sub-cellular and electrical activities, for example, molecular simulations of neuron biochemistry or multi-scale simulations of neuronal function.

One of the keys towards understanding how the brain works as a whole is visualizing how the individual cells function. In particular, the more morphologically accurate the visualization, the easier it is for experts in the biological field to validate cell structures; photo-realistic rendering is therefore important. Brayns is a visualization platform that can interactively perform high-quality, high-fidelity rendering of large neuroscience datasets. Thanks to its client/server architecture, Brayns can run in the cloud as well as on a supercomputer, and stream the rendering to any browser, either in a web UI or in a Jupyter notebook.

At the Blue Brain Project, the Visualization team makes intensive use of Blue Brain Brayns to produce ultra-high-resolution (8K) movies and high-fidelity images for scientific publications. Brayns also serves immersive visualization on large displays, as well as on unique devices such as the curved OpenDeck located at the Blue Brain office.

Brayns is also designed to accelerate scientific visualization and to adapt to a large number of environments. Thanks to its modular architecture, Brayns makes it easy to use various rendering back-ends, such as Intel's OSPRay (CPU) or NVIDIA's OptiX (GPU). Each scientific use case (DICOM, DTI, Blue Brain research, etc.) is a standalone plug-in that runs on top of Brayns, allowing scientists and researchers to benefit from a high-performance, high-fidelity rendering system without having to deal with its technical complexity.

Brayns currently implements a number of basic primitives such as meshes, volumes, point clouds, parametric geometries, and pioneers new rendering modalities for scientific visualization, like signed distance fields.

During this talk, I will explain the motivations behind the creation of the Brayns platform, give some technical insight about the architecture of the system and the various techniques that we already use to render datasets. I will also describe how new datasets, as well as rendering components (engines, shaders, materials, etc), can be added to the platform.

Links:
https://github.com/BlueBrain/Brayns
http://rgl.epfl.ch/courses/VCS18f

Friday, August 10, 2018

DICOM Photorealistic and interactive visualization

Just created an initial version of a DICOM plugin for Brayns, in 4 hours... :)

Source code is there:

https://github.com/favreau/Brayns-UC-DICOM

And you'll need the latest version of Brayns to make it work. Follow the README instructions provided on the GitHub page of the plugin and start enjoying cool visualizations of your DICOM datasets.

This is of course work in progress, but if you keep an eye on the github repo, you will soon be able to visualize more.


Thursday, May 10, 2018

The music of neurons

Music generated from approximate brain simulation data... Just for the fun of it :)

Data is generated using Neuron and CoreNeuron:

https://github.com/BlueBrain/CoreNeuron
https://github.com/nrnhines/nrn.git 

With the help of this tutorial:

https://github.com/nrnhines/ringtest

Once the tutorial is successfully completed, the following file is used to produce the music:

coreneuron_tutorial/sources/ringtest/coreneuron_data/spk1.std

This file contains the timestamp of each spike. These timestamps are translated into a MIDI file using the mido library:

https://github.com/olemb/mido

Fractal love in high resolution

After all these years, I'm still in love with fractals. These are rendered with Brayns on a 40-megapixel curved display. Source code is here:

https://github.com/favreau/Brayns-Research-Modules/tree/master/fractals



Thursday, April 5, 2018

Menger Sponge in Brayns

Based on the very cool Shadertoy from Inigo Quilez, I ported the code to ISPC and exploited the vectorized units of the CPU to render the Menger sponge in real time:


As usual, the code is there:

https://github.com/favreau/Brayns-Research-Modules/blob/master/fractals/ispc/renderer/



More information on ray-marching and distance functions can be found on Inigo's website.

Thursday, March 1, 2018

Contours and wireframe in ray-tracing

I created this simple shader that attempts to draw geometry contours using ray-tracing. The idea is to send extra rays, parallel to the camera ray, translated along the normal of the intersected surface. If a parallel ray hits the same geometry, the point is not on a contour.

It somehow seems to work, but still needs a bit of refinement.
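The parallel-ray test above can be sketched in a few lines of Python, using spheres as the only primitive. Names (Sphere, intersect, is_contour) and the offset value are illustrative, not the actual ContoursRenderer code:

```python
# Minimal sketch of the parallel-ray contour test: shift the ray origin
# along the hit normal and re-trace with the same direction. If the shifted
# ray misses the geometry, the original hit lies on a contour.
import math

class Sphere:
    def __init__(self, center, radius):
        self.center, self.radius = center, radius

def intersect(origin, direction, sphere):
    """Hit distance along a unit-direction ray, or None."""
    oc = [o - c for o, c in zip(origin, sphere.center)]
    b = sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - sphere.radius ** 2
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 0 else None

def is_contour(origin, direction, spheres, offset=0.05):
    # Trace the primary ray and keep the closest hit
    hits = [(intersect(origin, direction, s), s) for s in spheres]
    hits = [(t, s) for t, s in hits if t is not None]
    if not hits:
        return False
    t, hit_sphere = min(hits, key=lambda h: h[0])
    p = [o + t * d for o, d in zip(origin, direction)]
    normal = [(pi - ci) / hit_sphere.radius
              for pi, ci in zip(p, hit_sphere.center)]
    # Secondary ray: same direction, origin shifted along the surface normal
    shifted = [o + offset * n for o, n in zip(origin, normal)]
    return intersect(shifted, direction, hit_sphere) is None
```

Rays through the middle of a sphere survive the shift, while grazing rays near the silhouette do not, which is exactly where contours should appear.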

The code is in the ContoursRenderer:

https://github.com/favreau/brayns-research-module/tree/master/cartoon

And can be easily used with the Brayns viewer, with the following command line arguments:

braynsViewer --module cartoon --renderer contours



Sunday, February 18, 2018

Transfer function editor in Jupyter Notebook

Just created a Jupyter Notebook widget for volume rendering with Brayns. Also improved the volume rendering module for better performance and image quality.




Install it with:

pip install ipyTransferFunction

The code sources are:
Transfer Function: https://github.com/favreau/ipyTransferFunction
Brayns: https://github.com/BlueBrain/Brayns
Volume rendering module: https://github.com/favreau/brayns-research-module



Saturday, February 10, 2018

Sharing my experimental modules (plugins) for Brayns

I created this Github repository to share my experiments with Brayns. It's all about shaders, volume rendering, custom cameras, etc.

https://github.com/favreau/brayns-research-module

Feel free to download, try, contribute, star the repo, etc. :) I hope this will be helpful to anyone willing to enjoy ray-tracing-based computer graphics.

Tuesday, January 23, 2018

Advanced interactive volume rendering

Realtime volume rendering using:
  • Phong and Blinn shading. The normal is defined by a vector going from the voxel position to the barycenter of the surrounding voxels, weighted by opacity. I got the idea in the middle of the night, thanks to a strong winter storm that prevented me from getting a good rest, but it seems to work OK :-)
  • Soft and hard shadows
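The opacity-weighted barycenter normal can be sketched with NumPy as below. This is an illustrative reconstruction, not Brayns code; the neighbourhood radius and the sign convention are assumptions:

```python
# Sketch of the shading normal described above: take the opacity-weighted
# barycenter of the voxels around (x, y, z) and use the vector from the
# voxel to that barycenter as the normal. Depending on conventions you may
# want to negate it so it points away from dense material.
import numpy as np

def voxel_normal(opacity, x, y, z, radius=1):
    """Normal at voxel (x, y, z) of a 3D opacity grid."""
    weighted_sum = np.zeros(3)
    total_weight = 0.0
    for i in range(-radius, radius + 1):
        for j in range(-radius, radius + 1):
            for k in range(-radius, radius + 1):
                xi, yj, zk = x + i, y + j, z + k
                if (0 <= xi < opacity.shape[0] and
                        0 <= yj < opacity.shape[1] and
                        0 <= zk < opacity.shape[2]):
                    w = opacity[xi, yj, zk]
                    weighted_sum += w * np.array([xi, yj, zk], dtype=float)
                    total_weight += w
    if total_weight == 0.0:
        return np.zeros(3)  # fully transparent neighbourhood: no normal
    barycenter = weighted_sum / total_weight
    n = barycenter - np.array([x, y, z], dtype=float)
    length = np.linalg.norm(n)
    return n / length if length > 0 else n
```

With opacity increasing along one axis, the normal aligns with that axis, which is the gradient-like behaviour a shading normal needs.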

 Source code: https://github.com/favreau/Brayns/tree/aidenoiser

Demo running on an NVIDIA Quadro K6000 GPU (960x540 pixels)

Sunday, January 21, 2018

Brayns is rendering everything, everywhere!

Thanks to Docker, Brayns is now available on any platform (Windows, Mac and Linux) with just a couple of commands:

First download and start the Brayns renderer:
docker pull bluebrain/brayns
docker run -ti --rm -p 8200:8200 bluebrain/brayns

Then download the corresponding web UI:
docker pull bluebrain/sauron
docker run -ti --rm -p 8080:8080 bluebrain/sauron

And connect to the UI using the following address:
http://localhost:8080/?host=localhost:8200

That's it, you're done! Enjoy, and contribute! :-)


Thursday, November 16, 2017

Visualization of brain simulation (87TB dataset)

Below are a few frames from the video I led in the context of the Blue Brain Project. The full video should soon be available on various channels. Stay tuned.



The video was produced using Brayns, the visualizer I created for high-quality rendering of large neuroscience datasets.

The rendering technique is, of course, ray-tracing, with a bit of global illumination to enhance the rendering of neuronal activity.


Visiting K Computer in Kobe

Many thanks to my good friend Syoyo for making it possible to visit the K Computer in Kobe. Great time, great people! Inspiring :)





Monday, September 25, 2017

Real-time raytracing of animated nD geometry

Let's face reality, 3D is dead, long live nD. Now that the Sol-R ray-tracer is more or less complete (for what it was initially designed for, anyway), it's time to move on to the next level. I added the Hypercube scene to the Sol-R viewer so that we can now play around with n-dimensional hypercubes, thanks to the Delphi code by cs_Forman (http://codes-sources.commentcamarche.net/source/33735-hypercubes).

Now that it's pretty clear that our world has more than 3 dimensions, I am hoping that Sol-R can help us understand what n-dimensional geometry looks like when it intersects our 3D world.
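The basic machinery behind such a scene can be sketched in a few lines. This is an illustrative Python version, not the cs_Forman Delphi code or the Sol-R implementation: build the 2^n vertices of an n-cube, rotate them in one coordinate plane, and perspective-project the extra dimensions down to 3D one axis at a time.

```python
# Illustrative sketch: n-dimensional hypercube vertices, a rotation in one
# coordinate plane, and a simple perspective projection down to 3D.
import itertools
import math

def hypercube_vertices(n):
    """All 2^n vertices of the unit n-cube centred on the origin."""
    return [list(v) for v in itertools.product((-0.5, 0.5), repeat=n)]

def rotate(v, axis_a, axis_b, angle):
    """Rotate vector v in the plane spanned by two coordinate axes."""
    c, s = math.cos(angle), math.sin(angle)
    a, b = v[axis_a], v[axis_b]
    v = list(v)
    v[axis_a] = a * c - b * s
    v[axis_b] = a * s + b * c
    return v

def project_to_3d(v, distance=2.0):
    """Collapse dimensions above 3 with a perspective divide per axis."""
    v = list(v)
    while len(v) > 3:
        w = v.pop()                      # drop the highest dimension
        scale = distance / (distance - w)
        v = [x * scale for x in v]
    return v
```

Animating the angle of a rotation that involves the 4th axis is what produces the familiar "tesseract turning inside out" effect when the result is rendered in 3D.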

The code is there:

https://github.com/favreau/Sol-R

PS: The code was written while watching the inspiring Christopher Nolan Interstellar movie :-)




Saturday, July 8, 2017

Saturday, June 17, 2017

Deep learning and neuron visualization

Currently teaching my network to go from a simple representation of a neuron morphology to a more detailed and realistic one. Here are the first (promising) results:










Tuesday, March 7, 2017

Global illumination / Volume rendering

Last night, I got inspired by Exposure Render: An Interactive Photo-Realistic Volume Rendering Framework, and started adding global illumination to volume rendering in Brayns. Here are the first results; the code is about to be merged :-)


Monday, February 20, 2017

When Virtual Drums meet Interactive Brain Visualization

Could not help it, I had to synchronize my V-Drums with Brayns, the large-scale interactive brain visualizer I created at EPFL.


Components used for the demo:
- Mido: http://mido.readthedocs.io/en/latest/ 
- Brayns: https://github.com/BlueBrain/Brayns

Wednesday, January 25, 2017

In the insideHPC News!

In this silent video from the Blue Brain Project at SC16, 865 segments from a rodent brain are simulated, with isosurfaces generated from Allen Brain Atlas image stacks. The work is derived from the INCITE program’s project entitled: Biophysical Principles of Functional Synaptic Plasticity in the Neocortex.

I produced two sequences of that video using Brayns, the application I designed in the context of the Blue Brain Project.

Monday, January 9, 2017

Happy New Year!!!

Happy new year, best wishes to all, and ray-tracing forever!

Saturday, December 3, 2016

Brayns for neuro-robotics

Last Tuesday, I presented how Brayns could be used to render high-quality images for our colleagues from the neuro-robotics team. Brayns is hardware-agnostic, and it takes no more than one command-line argument to switch between the OSPRay (CPU) and OptiX (GPU) back-ends. The following video shows Brayns in action on a 24-megapixel display wall! Brayns is running on a single machine powered by two NVIDIA Quadro K5000 GPUs.


Friday, October 28, 2016

SIMD accelerated voxelizer



SIMDVoxelizer is a CPU-based voxelizer that takes advantage of vectorization units to speed up the creation of 8-bit raw volumes.

usage: SIMDVoxelizer <voxel_size> <cutoff_distance> <input_file> <output_file>

The input file is a binary array of floats: x, y, z, radius and value for each element. Each voxel of the final volume contains the sum of the contributions of all elements, where the weight of an element corresponds to its value divided by its squared distance to the voxel. Note that values in the final volume are normalized.

This is currently a brute-force implementation that produces accurate 8-bit volumes.
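The weighting scheme can be sketched in NumPy as below. This is a simplified reconstruction of the idea, not the ISPC code: the grid layout and the distance clamp are assumptions, and the element radius is read but unused here.

```python
# Brute-force sketch of the voxelization scheme: every voxel accumulates
# value / squared-distance over all elements (up to a cutoff), then the
# volume is normalized to 8 bits.
import numpy as np

def voxelize(elements, grid_shape, voxel_size, cutoff_distance):
    """elements: iterable of (x, y, z, radius, value) tuples."""
    volume = np.zeros(grid_shape, dtype=np.float64)
    zi, yi, xi = np.indices(grid_shape)
    # World-space coordinates of the voxel centres
    coords = np.stack([xi, yi, zi], axis=-1) * voxel_size + voxel_size / 2.0
    for x, y, z, radius, value in elements:  # radius unused in this sketch
        d2 = np.sum((coords - np.array([x, y, z])) ** 2, axis=-1)
        d2 = np.maximum(d2, (voxel_size / 2.0) ** 2)  # avoid division by zero
        weight = value / d2
        weight[np.sqrt(d2) > cutoff_distance] = 0.0   # apply the cutoff
        volume += weight
    # Normalize to the 8-bit range
    if volume.max() > 0:
        volume = volume / volume.max() * 255.0
    return volume.astype(np.uint8)
```

The real tool vectorizes the inner distance loop with ISPC instead of NumPy, but the accumulation and normalization are the same idea.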

The <output_file> is suffixed by the size of the volume.

SIMDVoxelizer makes use of the Intel ISPC compiler and requires ispc to be in the PATH.

To build SIMDVoxelizer, simply run make in the source folder.

Source code is available on GitHub.

Sunday, October 16, 2016

Volume rendering and raytracing ... merged!




It's there, merged and operational in Brayns, and the first results are quite exciting.

Mixing volume rendering and ray-tracing allows more control on atmospheric effects which, in the case of neurosciences, can be used to represent the electromagnetic fields surrounding the neurons.



Saturday, October 8, 2016

Volume Rendering coupled to Ray-Tracing: a win-win combination

I am currently working on adding volume rendering to the existing Brayns implementation. Using volumes is clearly the way to go to represent the activity that takes place outside of the geometry. Ray-tracing, on the other hand, concentrates on high-quality surface rendering: shadows and other global illumination effects.

Many thanks to my colleagues Raphael and Grigori, without whom this development would have taken a few more days!


Currently in my volume branch, but soon to be merged into master!!




Saturday, September 24, 2016

Framework for efficient synthesis of spatially embedded morphologies

I was delighted to contribute to this paper:




 

Abstract:

Many problems in science and engineering require the ability to grow tubular or polymeric structures up to large volume fractions within a bounded region of three-dimensional space. Examples range from the construction of fibrous materials and biological cells such as neurons, to the creation of initial configurations for molecular simulations. A common feature of these problems is the need for the growing structures to wind throughout space without intersecting. At any time, the growth of a morphology depends on the current state of all the others, as well as the environment it is growing in, which makes the problem computationally intensive. Neuron synthesis has the additional constraint that the morphologies should reliably resemble biological cells, which possess nonlocal structural correlations, exhibit high packing fractions, and whose growth responds to anatomical boundaries in the synthesis volume. We present a spatial framework for simultaneous growth of an arbitrary number of nonintersecting morphologies that presents the growing structures with information on anisotropic and inhomogeneous properties of the space. The framework is computationally efficient because intersection detection is linear in the mass of growing elements up to high volume fractions and versatile because it provides functionality for environmental growth cues to be accessed by the growing morphologies. We demonstrate the framework by growing morphologies of various complexity.

Saturday, August 13, 2016

360 previews of 7'000 neurons

Thanks to the OSPRay equirectangular camera, Brayns can now render 360 images and videos.
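The core of an equirectangular camera is a simple mapping from pixel to ray direction. The sketch below mirrors the idea behind OSPRay's panoramic camera rather than its exact code; the axis conventions are assumptions:

```python
# Hedged sketch of an equirectangular (360-degree) camera: the horizontal
# image axis spans longitude [-pi, pi], the vertical axis spans latitude
# [pi/2, -pi/2], and each pixel becomes a unit ray direction on the sphere.
import math

def equirectangular_direction(u, v):
    """u, v in [0, 1): normalized image coordinates -> unit ray direction."""
    longitude = (u - 0.5) * 2.0 * math.pi   # -pi .. pi across the image
    latitude = (0.5 - v) * math.pi          # pi/2 at top, -pi/2 at bottom
    x = math.cos(latitude) * math.sin(longitude)
    y = math.sin(latitude)
    z = math.cos(latitude) * math.cos(longitude)
    return (x, y, z)
```

Rendering one ray per pixel of a 2:1 image with this mapping yields a frame that 360-degree video players can display directly.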



Wednesday, July 13, 2016

In the news: Top500.org

Brayns in the news, thanks to an amazing collaboration with the Intel team.

Using Intel’s Xeon Phi for Brain Research Visualisation




Sunday, June 26, 2016

Bringing interactive ray-tracing to neuroscience

Brayns, the application I've been working on for 6 months at EPFL, is now open-source!





To me, this application represents the first step in bringing interactive ray-tracing to neuroscience. Very exciting moments and millions of ideas are on their way. Stay tuned!

Saturday, June 25, 2016

ISC 2016 Tech Demo on Intel's new KNL architecture

Had a great time presenting Brayns at the Intel booth at ISC 2016.

Brayns is finally open-source. Star it, fork it and contribute to bringing ray-tracing to the world of neuroscience.


Brayns was presented running on 4 KNL nodes and ran amazingly fast: up to 60 fps in Full HD on a model of 500 million geometric primitives (spheres, cones and cylinders).