Wednesday, February 26, 2020

Neurones, les intelligences simulées (Centre Pompidou)

Incredibly happy and proud to have created the Blue Brain Project movie that is currently on display at the Centre Pompidou, a world-class modern art museum in Paris.

"Alors que l’intelligence artificielle semble avoir envahi tous les domaines industriels du monde contemporain, de la finance au domaine médical, des jeux aux objets à comportement, de l’architecture au militaire, cette situation n’a jamais été véritablement mise en relation avec l’histoire des neurosciences et de la neuro-computation. L’exposition « Neurones, les intelligences simulées » souligne la continuité des recherches d’artistes, d’architectes, de designers et de musiciens avec celle développées par les grands laboratoires scientifiques ou ceux du monde industriel. Dans le cadre de « Mutations/Créations ».


With « Mutations/Créations », the Centre Pompidou turns into a laboratory of creation and innovation at the crossroads of the arts, science and engineering. Each year, the programme brings together artists, engineers, scientists and entrepreneurs. In 2020, « Mutations/Créations » continues its forward-looking research through two exhibitions, « Neurones, les intelligences simulées » and « Jeremy Shaw », after three editions successively devoted to 3D printing (« Imprimer le monde » and « Ross Lovegrove » in 2017), to computer languages (« Coder le monde » and « Ryoji Ikeda » in 2018) and to creation blending the artificial and the living (« La Fabrique du vivant » and « Erika Verzutti » in 2019)."



More information available here.


Tuesday, January 28, 2020

Biggest and Most Detailed Map of the Fly Brain

Thanks to the amazing dataset provided by HHMI Janelia, I had to see it for myself, and decided to create the few renders below.





Dataset is downloadable from here: https://www.janelia.org/news/unveiling-the-biggest-and-most-detailed-map-of-the-fly-brain-yet





Saturday, December 7, 2019

Hemispheres of the Mind

Hemispheres of the Mind is a collaboration between the EPFL Laboratory of Experimental Museology and the EPFL Blue Brain Project for the ArtLab Infinity Room II 2019. 

It brings together Cyrille Favreau (Blue Brain Project), Sarah Kenderdine (eM+ EPFL) and Peter Morse, and celebrates 50 years of the Ecole polytechnique fédérale de Lausanne.

Special thanks to Grigori Chevtchenko for the fisheye camera implementation, and Paweł Podhajski for allowing the deployment of Brayns on the Blue Brain Supercomputer.


Audio Production by Peter Morse and Cathie Travers.
Music by Kai Engel & Cathie Travers

#epfl #project #laboratory #bluebrainproject #brain #sciart




Tuesday, May 21, 2019

Journal of Neuroscience cover published!

Making an image of the brain is always a challenge. The idea behind the cover image is to pay tribute to the amazing work initiated by Camillo Golgi back in the late 19th century. His discovery of a staining technique called the black reaction changed the way one could visualize brain structures. Recent studies have shown that the aesthetics of images representing data visualization in neuroscience play a significant role in the way those images are understood.

In recent years, computer technologies have improved dramatically, together with the field of computational neuroscience. Thanks to advanced computer graphics techniques, it is now possible to reconstruct realistic shapes of neurons from a set of points and radii describing the structure of the cell. Many of those datasets for single neurons can be found on the web. Combining them and placing them in space is a simple way to imagine what regions of the brain could look like if they were to be seen using Golgi's technique.

In the near future, it seems clear that realistic, physically based rendering of in-silico data will become a necessary tool to understand how the brain functions. But while those new tools are very promising, one should not forget the work of the pioneers, and this is what this cover image is all about.

Wednesday, January 30, 2019

Celebrating the beauty of the brain






So... I played, and I won :) I am extremely happy to be one of the winners of the NeuroArt contest for December 2018.

Making an image of the brain is always a challenge. The idea behind the cover image is to pay tribute to the amazing work initiated by Camillo Golgi back in the late 19th century. His discovery of a staining technique called the black reaction changed the way one could visualize brain structures. Recent studies have shown that the aesthetics of images representing data visualization in neuroscience play a significant role in the way those images are understood.

In recent years, computer technologies have improved dramatically, together with the field of computational neuroscience. Thanks to advanced computer graphics techniques, it is now possible to reconstruct realistic shapes of neurons from a set of points and radii describing the structure of the cell. Many of those datasets for single neurons can be found on the web. Combining them and placing them in space is a simple way to imagine what regions of the brain could look like if they were to be seen using Golgi's technique.

In the near future, it seems clear that realistic, physically based rendering of in-silico data will become a necessary tool to understand how the brain functions.
But while those new tools are very promising, one should not forget the work of the pioneers, and this is what this cover image is all about.


Saturday, January 12, 2019

Full stack/frontend software engineer wanted!

Your mission if you accept it

The EPFL Blue Brain Project (BBP), situated on the Campus Biotech in Geneva, Switzerland, applies advanced neuroinformatics, data analytics, high-performance computing and simulation-based approaches to the challenge of understanding the structure and function of the mammalian brain in health and disease. The BBP provides the community with regular releases of data, models and tools to accelerate neuroscience discovery and clinical translation through open science and global collaboration.

We are looking for a self-motivated full stack/frontend software engineer (W/M) to join our team and help us with the development of our visualization tools.

You will be working in a dynamic team with highly skilled software engineers and our goal is to aid scientists in visualizing and understanding their (neuroscientific) data.

Main duties and responsibilities include:

Your responsibility will be to develop new features for our current interactive 3D viewer Brayns (on the frontend) and maintain existing ones, and to drive the development of our new hub application where scientists can manage their data visualizations.
 
More information available here.


Tuesday, October 2, 2018

Talk at EPFL - Visual Computing Seminar

Super excited to visit Wenzel Jakob and give the following talk in the context of the Visual Computing Seminars.

Blue Brain Brayns, A platform for high fidelity large-scale and interactive visualization of scientific data and brain structures

The Blue Brain Project has made major efforts to create morphologically accurate neurons to simulate sub-cellular and electrical activities, for example, molecular simulations of neuron biochemistry or multi-scale simulations of neuronal function.

One of the keys to understanding how the brain works as a whole is visualization of how the individual cells function. In particular, the more morphologically accurate the visualization can be, the easier it is for experts in the biological field to validate cell structures; photo-realistic rendering is therefore important. Brayns is a visualization platform that can interactively perform high-quality and high-fidelity rendering of large neuroscience datasets. Thanks to its client/server architecture, Brayns can be run in the cloud as well as on a supercomputer, and stream the rendering to any browser, either in a web UI or a Jupyter notebook.

At the Blue Brain Project, the Visualization team makes intensive use of Blue Brain Brayns to produce ultra-high-resolution movies (8K) and high-fidelity images for scientific publications. Brayns is also used to serve immersive visualization on large displays, as well as on unique devices such as the curved OpenDeck located at the Blue Brain office.

Brayns is also designed to accelerate scientific visualization and to adapt to a large number of environments. Thanks to its modular architecture, Brayns makes it easy to use various rendering back-ends such as Intel's OSPRay (CPU) or NVIDIA's OptiX (GPU). Every scientific use case, such as DICOM, DTI or Blue Brain research, is a standalone plug-in that runs on top of Brayns, allowing scientists and researchers to benefit from a high-performance, high-fidelity, high-quality rendering system without having to deal with its technical complexity.

Brayns currently implements a number of basic primitives such as meshes, volumes, point clouds and parametric geometries, and pioneers new rendering modalities for scientific visualization, such as signed distance fields.

During this talk, I will explain the motivations behind the creation of the Brayns platform, give some technical insight about the architecture of the system and the various techniques that we already use to render datasets. I will also describe how new datasets, as well as rendering components (engines, shaders, materials, etc), can be added to the platform.

Links:
https://github.com/BlueBrain/Brayns
http://rgl.epfl.ch/courses/VCS18f

Thursday, May 10, 2018

The music of neurons

Music generated from approximate brain simulation data... Just for the fun of it :)

Data is generated using Neuron and CoreNeuron:

https://github.com/BlueBrain/CoreNeuron
https://github.com/nrnhines/nrn.git 

With the help of this tutorial:

https://github.com/nrnhines/ringtest

Once the tutorial is successfully completed, the following file is used to produce the music:

coreneuron_tutorial/sources/ringtest/coreneuron_data/spk1.std

This file contains the timestamp of each spike. This is translated into a MIDI file using the mido library.

https://github.com/olemb/mido
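
For illustration, here is a minimal Python sketch of that conversion. The spike file layout (one "time gid" pair per line) and the note mapping are assumptions on my side; only the mido calls are the real API.

import mido

mid = mido.MidiFile()
track = mido.MidiTrack()
mid.tracks.append(track)

ticks_per_ms = 4                          # arbitrary time scaling
last_tick = 0

with open('spk1.std') as spikes:          # assumes spikes are sorted by time
    for line in spikes:
        time_ms, gid = line.split()
        tick = int(float(time_ms) * ticks_per_ms)
        note = 36 + int(float(gid)) % 48  # map the cell id to a MIDI pitch
        # mido messages use delta times (ticks since the previous message)
        track.append(mido.Message('note_on', note=note, velocity=100, time=tick - last_tick))
        track.append(mido.Message('note_off', note=note, velocity=0, time=8))
        last_tick = tick + 8

mid.save('spikes.mid')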

Fractal love in high resolution

After all these years, still in love with fractals. These are rendered with Brayns on a 40-megapixel curved display. Source code is here:

https://github.com/favreau/Brayns-Research-Modules/tree/master/fractals



Thursday, April 5, 2018

Menger Sponge in Brayns

Based on the very cool Shadertoy from Inigo Quilez, I ported the code to ISPC and exploited the vectorized units of the CPU to render the Menger sponge in real time:


As usual, the code is there:

https://github.com/favreau/Brayns-Research-Modules/blob/master/fractals/ispc/renderer/



More information on ray-marching and distance functions can be found on Inigo's website.
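
For reference, the distance estimator that such a ray-marcher evaluates looks roughly like this in Python. This is a transcription of Inigo Quilez's well-known formulation, not the ISPC module itself.

import numpy as np

def sd_box(p, b):
    # Signed distance from point p to an axis-aligned box of half-size b
    d = np.abs(p) - b
    return min(max(d[0], d[1], d[2]), 0.0) + np.linalg.norm(np.maximum(d, 0.0))

def sd_menger(p, iterations=4):
    # Menger sponge distance estimator: start from a unit box and
    # repeatedly carve out crosses at finer and finer scales.
    d = sd_box(p, np.array([1.0, 1.0, 1.0]))
    s = 1.0
    for _ in range(iterations):
        a = np.mod(p * s, 2.0) - 1.0
        s *= 3.0
        r = np.abs(1.0 - 3.0 * np.abs(a))
        da, db, dc = max(r[0], r[1]), max(r[1], r[2]), max(r[2], r[0])
        d = max(d, (min(da, db, dc) - 1.0) / s)
    return d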

Thursday, March 1, 2018

Contours and wireframe in ray-tracing

I created this simple shader that attempts to draw geometry contours using ray-tracing. The idea is to send extra rays, parallel to the camera ray, translated according to the normal of the intersected surface. If the parallel ray hits the same geometry, the point is not on a contour.

It somehow seems to work, but it still needs a bit of refinement.
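
As a rough Python-style sketch of that test (the scene.intersect helper and all names are hypothetical; this is not the actual ContoursRenderer code):

import numpy as np

def is_contour(scene, origin, direction, contour_width=0.01):
    # Cast the primary ray; assume intersect returns (geometry_id, normal) or None.
    hit = scene.intersect(origin, direction)
    if hit is None:
        return False
    geometry_id, normal = hit
    # Cast an extra ray, parallel to the camera ray, with its origin
    # shifted along the normal of the intersected surface.
    offset_origin = np.asarray(origin) + contour_width * np.asarray(normal)
    parallel_hit = scene.intersect(offset_origin, direction)
    # If the parallel ray misses or hits another geometry, we are on a contour.
    return parallel_hit is None or parallel_hit[0] != geometry_id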

The code is in the ContoursRenderer:

https://github.com/favreau/brayns-research-module/tree/master/cartoon

It can easily be used with the Brayns viewer, with the following command-line arguments:

braynsViewer --module cartoon --renderer contours



Sunday, February 18, 2018

Transfer function editor in Jupyter Notebook

Just created a Jupyter Notebook widget for volume rendering with Brayns. Also improved the volume rendering module for better performance and image quality.




Install it with:

pip install ipyTransferFunction

The code sources are:
Transfer Function: https://github.com/favreau/ipyTransferFunction
Brayns: https://github.com/BlueBrain/Brayns
Volume rendering module: https://github.com/favreau/brayns-research-module



Saturday, February 10, 2018

Sharing my experimental modules (plugins) for Brayns

I created this Github repository to share my experiments with Brayns. It's all about shaders, volume rendering, custom cameras, etc.

https://github.com/favreau/brayns-research-module

Feel free to download, try, contribute, star the repo, etc :) I hope this will be helpful to anyone who wants to enjoy ray-tracing-based computer graphics.

Tuesday, January 23, 2018

Advanced interactive volume rendering

Realtime volume rendering using:
  • Phong and Blinn shading. The normal is defined by a vector going from the voxel position to the barycenter of the surrounding voxels, weighted by opacity (a sketch of the idea follows this list). I got the idea in the middle of the night, thanks to a strong winter storm that prevented me from getting a good rest, but it seems to work ok :-)
  • Soft and hard shadows 
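
A minimal sketch of that normal estimation, assuming a 3D scalar volume and an opacity transfer function; names are illustrative and this is not the Brayns code.

import numpy as np

def voxel_normal(volume, opacity, x, y, z, radius=1):
    # Opacity-weighted barycenter of the voxels surrounding (x, y, z).
    barycenter = np.zeros(3)
    total_weight = 0.0
    for i in range(-radius, radius + 1):
        for j in range(-radius, radius + 1):
            for k in range(-radius, radius + 1):
                if i == j == k == 0:
                    continue
                weight = opacity(volume[x + i, y + j, z + k])
                barycenter += weight * np.array([x + i, y + j, z + k], dtype=float)
                total_weight += weight
    if total_weight == 0.0:
        return np.array([0.0, 0.0, 1.0])  # arbitrary fallback for empty neighbourhoods
    barycenter /= total_weight
    # Vector from the voxel position to the weighted barycenter, as described above.
    normal = barycenter - np.array([x, y, z], dtype=float)
    length = np.linalg.norm(normal)
    return normal / length if length > 0.0 else np.array([0.0, 0.0, 1.0])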

 Source code: https://github.com/favreau/Brayns/tree/aidenoiser

Demo running on an NVIDIA Quadro K6000 GPU (960x540 pixels)

Sunday, January 21, 2018

Brayns is rendering everything, everywhere!

Thanks to Docker, Brayns is now available on any platform (Windows, Mac and Linux), with just a couple of commands:

First download and start the Brayns renderer:
docker pull bluebrain/brayns
docker run -ti --rm -p 8200:8200 bluebrain/brayns

Then download the corresponding web UI:
docker pull bluebrain/sauron
docker run -ti --rm -p 8080:8080 bluebrain/sauron

And connect to the UI using the following address:
http://localhost:8080/?host=localhost:8200

That's it, you're done! Enjoy, and contribute! :-)


Thursday, November 16, 2017

Visualization of brain simulation (87TB dataset)

Below are a few frames of the video I led in the context of the Blue Brain Project. The full video should soon be available on various channels. Stay tuned.



The video was produced using Brayns, the visualizer I created for high-quality rendering of large neuroscience-related datasets.

The rendering technique is of course ray-tracing with a bit of global illumination to enhance the rendering of neuronal activity.


Monday, September 25, 2017

Real-time raytracing of animated nD geometry

Let's face reality: 3D is dead, long live nD. Now that the Sol-R ray-tracer is more or less complete (for what it was initially designed for, anyway), it's time to move on to the next level. I added the Hypercube scene to the Sol-R viewer so that we can now play around with n-dimensional hypercubes, thanks to the Delphi code by cs_Forman (http://codes-sources.commentcamarche.net/source/33735-hypercubes).

Now that it's pretty clear that our world has more than 3 dimensions, I am hoping that Sol-R can help in understanding what n-dimensional geometry looks like when it intersects our 3D world.

The code is there:

https://github.com/favreau/Sol-R

PS: The code was written while watching Christopher Nolan's inspiring movie Interstellar :-)




Saturday, June 17, 2017

Deep learning and neuron visualization

Currently teaching my network to go from a simple representation of a neuron morphology to a more detailed and realistic representation. Here are the first (promising) results:










Tuesday, March 7, 2017

Global illumination / Volume rendering

Last night, I got inspired by Exposure Render: An Interactive Photo-Realistic Volume Rendering Framework, and started adding global illumination to volume rendering in Brayns. Here are the first results, and the code is about to be merged :-)


Monday, February 20, 2017

When Virtual Drums meet Interactive Brain Visualization

Could not help it, I had to synchronize my V-Drums with Brayns, the large-scale interactive brain visualizer I created at EPFL.


Components used for the demo:
- Mido: http://mido.readthedocs.io/en/latest/ 
- Brayns: https://github.com/BlueBrain/Brayns
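
A minimal sketch of the wiring, assuming a hypothetical trigger_brayns_event helper on the Brayns side; only the mido calls are the real API.

import mido

def trigger_brayns_event(note, velocity):
    # Hypothetical placeholder: forward the drum hit to Brayns
    # (e.g. through its client API); not the actual integration code.
    print('drum hit: note=%d velocity=%d' % (note, velocity))

# Listen to the electronic drum kit and react to every pad hit.
with mido.open_input() as midi_in:        # opens the default MIDI input port
    for message in midi_in:
        if message.type == 'note_on' and message.velocity > 0:
            trigger_brayns_event(message.note, message.velocity)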

Wednesday, January 25, 2017

In the insideHPC News!

 In this silent video from the Blue Brain Project at SC16, 865 segments from a rodent brain are simulated with isosurfaces generated from Allen Brain Atlas image stacks. The work is derived from the INCITE program’s project entitled: Biophysical Principles of Functional Synaptic Plasticity in the Neocortex.

I produced two sequences of that video using Brayns, the application I designed in the context of the Blue Brain Project.

Monday, January 9, 2017

Happy New Year!!!

Happy new year, best wishes to all, and ray-tracing forever!

Saturday, December 3, 2016

Brayns for neuro-robotics

Last Tuesday, I presented how Brayns could be used to render high-quality images for our colleagues from the neuro-robotics team. Brayns is hardware agnostic, and it takes no more than one command-line argument to switch between the OSPRay (CPU) and OptiX (GPU) backends. The following video shows Brayns in action on a 24-megapixel display wall! Brayns is running on a single machine powered by two NVIDIA Quadro K5000 GPUs.


Friday, October 28, 2016

SIMD accelerated voxelizer



SIMDVoxelizer is a CPU-based voxelizer that takes advantage of vectorization units to speed up the creation of 8-bit raw volumes.

usage: SIMDVoxelizer <voxel_size> <cutoff_distance> <input_file> <output_file>

The input file is a binary array of floats: x, y, z, radius and value for each element. Each voxel of the final volume contains the sum of all elements, each weighted by the value of the element divided by its squared distance to the voxel. Note that in the final volume, values are normalized.

This is currently a brute-force implementation that produces accurate 8-bit volumes.
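
For illustration, a brute-force Python sketch of the same weighting rule; the element layout follows the description above, while float32 encoding and the file name are assumptions.

import numpy as np

# Consecutive float32 quintuplets: x, y, z, radius, value
elements = np.fromfile('elements.bin', dtype=np.float32).reshape(-1, 5)

def voxel_value(voxel_center, elements, cutoff_distance):
    total = 0.0
    for x, y, z, radius, value in elements:
        d2 = (x - voxel_center[0])**2 + (y - voxel_center[1])**2 + (z - voxel_center[2])**2
        if d2 > cutoff_distance * cutoff_distance:
            continue                      # ignore elements beyond the cutoff
        total += value / max(d2, 1e-6)    # value weighted by inverse squared distance
    return total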

The <output_file> is suffixed by the size of the volume.

SIMDVoxelizer makes use of the Intel ISPC compiler and requires ispc to be in the PATH.

To build SIMDVoxelizer, simply run make in the source folder.

Source code available on GitHub.

Sunday, October 16, 2016

Volume rendering and raytracing ... merged!




It's there, merged and operational in Brayns, and the first results are quite exciting.

Mixing volume rendering and ray-tracing allows more control over atmospheric effects which, in the case of neuroscience, can be used to represent the electromagnetic fields surrounding the neurons.