Wednesday, August 28, 2013
Randomizing the light intensity for each pixel creates a sort of film grain effect that can look pretty cool in some cases. Ray-tracing has this weakness of creating images that are so "perfect" that they do not look real. Path tracing is amazing but very expensive. I thought that this randomization technique was a nice compromise, but it's for you to say if it really is ;-) Here are a few samples:
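For the curious, the idea boils down to multiplying each pixel by a random gain after the ray-tracing pass. Here is a minimal CUDA sketch of the technique; the buffer layout and names are illustrative, not my engine's actual code:

#include <curand_kernel.h>

// Illustrative kernel: 'color' is the ray-traced RGBA pixel buffer, 'seed'
// changes every frame so the grain is animated like real film grain.
__global__ void applyGrain(float4* color, int width, int height,
                           unsigned long long seed, float strength)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    int index = y * width + x;

    // One lightweight RNG per pixel, seeded per frame. (Re-initializing the
    // state every frame is the slow-but-simple option; a persistent state
    // buffer would be faster.)
    curandState state;
    curand_init(seed, index, 0, &state);

    // Random gain around 1.0: each pixel ends up slightly brighter or darker.
    float gain = 1.0f + strength * (curand_uniform(&state) - 0.5f);

    color[index].x *= gain;
    color[index].y *= gain;
    color[index].z *= gain;
}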
Sunday, August 18, 2013
Incredibly happy
Incredibly happy to have my CUDA molecule visualizer referenced on the nVidia website!!
First ever fully ray-traced photo visualizer for Oculus
Got a CUDA card? An Oculus Rift? A Kinect? Download the first ever immersive, fully ray-traced photo visualizer. Then drop a few JPG files in the photos subfolder and enjoy a unique experience!
Download link: https://www.dropbox.com/s/3zjqs2deu0a4gee/VRGallery.zip
Sunday, August 4, 2013
CUDA, Kinect and Oculus Rift, what a team!
I am finally getting to what I have been thinking about for the last few months, maybe even years: a fully ray-traced, immersive virtual environment. The following technologies made the project possible (see the sketch after this list):
- Oculus Rift for immersive 3D visualization and head orientation
- Kinect for tracking the actor's position in real 3D space
- CUDA for real-time ray-tracing on a consumer PC
- WiiMotes for hand orientation and object manipulation
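To give an idea of how the pieces fit together: each frame, the camera position comes from the Kinect skeleton, the camera orientation from the Rift's sensor fusion, and the result is fed to the CUDA kernel. A hypothetical per-frame update could look like this; none of the Kinect/WiiMote helper names are real APIs, they stand for the per-SDK plumbing each device needs:

// Hypothetical glue code combining the three devices each frame.
void Scene::updateCamera()
{
    // Kinect gives the actor's position in the room (head joint, in meters).
    m_viewPos = m_kinect.getHeadJointPosition();   // illustrative helper

    // The Rift gives the head orientation (see the code further down).
    if (m_sensorFusion.IsAttachedToSensor())
    {
        OVR::Quatf orientation = m_sensorFusion.GetOrientation();
        m_viewAngles.x = orientation.x;
        m_viewAngles.y = orientation.y;
        m_viewAngles.z = orientation.z;
    }

    // WiiMotes drive hand orientation and object manipulation, not the camera.
    m_hands.update(m_wiimotes);                    // illustrative helper

    // Everything ends up as camera parameters for the CUDA kernel.
    m_gpuKernel->setCamera(m_viewPos, m_viewDir, m_viewAngles);
}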
Download the Windows demo and judge for yourself :-)
Thursday, August 1, 2013
Interactive Ray-tracing and Oculus Rift
That's it, I found an Oculus Rift in my mailbox two days ago and since then, my social life sort of isn't the same anymore ;-) After getting sick during an endless HL2 session, I decided it was time to move on to more serious stuff. If you read my blog, you already know that I had really, really been waiting for that Rift, and despite the amazing weather outside, I spent another afternoon integrating the new device with my interactive CUDA ray-tracer. And here is the result:
Device initialization and destruction were as simple as this:
void Scene::initializeOVR()
{
    OVR::HMDInfo Info;
    bool InfoLoaded;

    // Start up LibOVR and create the device manager.
    OVR::System::Init();
    m_manager = *OVR::DeviceManager::Create();

    // Look for an attached HMD; fall back to a bare sensor if none is found.
    m_HMD = *m_manager->EnumerateDevices<OVR::HMDDevice>().CreateDevice();
    if (m_HMD)
    {
        InfoLoaded = m_HMD->GetDeviceInfo(&Info);
        m_sensor   = *m_HMD->GetSensor();
    }
    else
    {
        m_sensor = *m_manager->EnumerateDevices<OVR::SensorDevice>().CreateDevice();
    }

    // Let the SDK's sensor fusion turn raw samples into an orientation.
    if (m_sensor)
    {
        m_sensorFusion.AttachToSensor(m_sensor);
    }
}

void Scene::finalizeOVR()
{
    // Release devices in reverse order of creation, then shut down LibOVR.
    m_sensor.Clear();
    m_HMD.Clear();
    m_manager.Clear();
    OVR::System::Destroy();
}
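One subtlety worth mentioning: the m_manager = *OVR::DeviceManager::Create(); pattern comes from the SDK samples. If I read the docs correctly, Create() returns a raw pointer with a reference count of one, and assigning the dereferenced pointer to an OVR::Ptr<> takes ownership without adding an extra reference, which is why a Clear() on each smart pointer is all the cleanup needed.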
And getting the orientation of the Rift is as simple as that:
if (m_sensorFusion.IsAttachedToSensor())
{
    // Head orientation from the Rift's sensor fusion, as a quaternion.
    OVR::Quatf orientation = m_sensorFusion.GetOrientation();

    // Quick and dirty: feed the quaternion components straight into the
    // camera angles. Good enough for small head rotations.
    m_viewAngles.x = orientation.x;
    m_viewAngles.y = orientation.y;
    m_viewAngles.z = orientation.z;
}
m_gpuKernel->setCamera(m_viewPos, m_viewDir, m_viewAngles);
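A small caveat: copying the quaternion components straight into the view angles is only an approximation that holds for small rotations. For proper Euler angles, the SDK can extract them from the quaternion, something like the snippet below (the axis order and the mapping to my m_viewAngles may need tweaking for your camera convention):

if (m_sensorFusion.IsAttachedToSensor())
{
    OVR::Quatf orientation = m_sensorFusion.GetOrientation();

    // Proper yaw/pitch/roll extraction, as done in the SDK samples.
    float yaw, pitch, roll;
    orientation.GetEulerAngles<OVR::Axis_Y, OVR::Axis_X, OVR::Axis_Z>(
        &yaw, &pitch, &roll);

    m_viewAngles.x = pitch;
    m_viewAngles.y = yaw;
    m_viewAngles.z = roll;
}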
Splitting the screen for both eyes was already implemented in my engine, originally for 3D Vision, but I have to admit that the Oculus is way more convincing.
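For reference, the per-eye split is just two renders with a horizontal camera offset into the two halves of the window. A stripped-down sketch of the idea; setViewport and render are placeholders for whatever the engine exposes, and the eye-separation value should really come from the HMD's inter-pupillary distance:

// ~64mm average inter-pupillary distance, in meters (placeholder value).
const float eyeSeparation = 0.064f;

for (int eye = 0; eye < 2; ++eye)
{
    // Shift the camera half the eye separation to the left or right.
    float side = (eye == 0) ? -0.5f : 0.5f;
    float3 eyePos = m_viewPos;
    eyePos.x += side * eyeSeparation;   // assumes +x is the camera's right

    // Each eye renders into its own half of the window.
    m_gpuKernel->setViewport(eye * m_width / 2, 0, m_width / 2, m_height);
    m_gpuKernel->setCamera(eyePos, m_viewDir, m_viewAngles);
    m_gpuKernel->render();
}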
More to come...