Using textures to compute intersections allows creation of complex objects with minimal computation.
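As a minimal sketch of the idea, assuming the object is a height field stored in a texture, the intersection can be found by simply marching along the ray and comparing heights; sampleHeight() below is only a placeholder for the engine's actual texture fetch:
#include <cmath>

// Stand-in for a texture fetch; a real engine would sample a height map here
static float sampleHeight( float x, float z )
{
    return 0.2f * sinf( x ) * cosf( z );
}

// March along the ray and report the distance at which it dips below the
// height field, or -1.0f if nothing is hit within maxDist
float intersectHeightField( float ox, float oy, float oz,
                            float dx, float dy, float dz,
                            float maxDist, float step )
{
    for( float t = 0.0f; t < maxDist; t += step )
    {
        float x = ox + t * dx;
        float y = oy + t * dy;
        float z = oz + t * dz;
        if( y <= sampleHeight( x, z ) )
            return t;   // the ray entered the surface
    }
    return -1.0f;
}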
Thursday, June 30, 2011
Monday, June 20, 2011
When CUDA meets Kinect...
When CUDA meets Kinect, realtime raytracing takes on a whole new dimension. Thanks to Microsoft, the Kinect SDK allows the creation of human 3D structures that can be inserted into the real-time ray-tracing engine.
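As a rough sketch of how such structures could be fed to the raytracer, the skeleton joints reported by the SDK can be turned into simple sphere primitives; the Sphere struct and the g_skeletonSpheres container below are hypothetical stand-ins for the engine's own scene representation:
#include <vector>
#include <windows.h>
#include <NuiApi.h>   // Kinect for Windows SDK (the header name differs in the 2011 beta)

// Hypothetical primitive and scene container, standing in for the engine's own API
struct Sphere { float x, y, z, radius; };
static std::vector<Sphere> g_skeletonSpheres;

// Convert every tracked skeleton joint into a small sphere the raytracer can render
void updateSkeletonSpheres()
{
    NUI_SKELETON_FRAME skeletonFrame = {0};
    if( FAILED( NuiSkeletonGetNextFrame( 0, &skeletonFrame ) ) )
        return;

    g_skeletonSpheres.clear();
    for( int i = 0; i < NUI_SKELETON_COUNT; ++i )
    {
        const NUI_SKELETON_DATA& data = skeletonFrame.SkeletonData[i];
        if( data.eTrackingState != NUI_SKELETON_TRACKED )
            continue;

        for( int j = 0; j < NUI_SKELETON_POSITION_COUNT; ++j )
        {
            const Vector4& p = data.SkeletonPositions[j];   // joint position in meters
            Sphere s = { p.x, p.y, p.z, 0.05f };            // arbitrary 5 cm radius
            g_skeletonSpheres.push_back( s );
        }
    }
}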
Friday, June 10, 2011
Realtime raytracing including Kinect video capture
Hi, I finally made it! I wanted to introduce real-time video capture into my raytraced world and there it is. It's still running pretty smoothly on my NVIDIA GTX260 (around 25fps in 1280x720). This was achieved using Kinect, mainly because it's much simpler to get an array of bytes using this device, rather than using the default DirectShow stuff.
C++ code for Kinect initialisation:
// Variables
HANDLE m_hNextDepthFrameEvent;
HANDLE m_hNextVideoFrameEvent;
HANDLE m_hNextSkeletonEvent;
HANDLE m_pVideoStreamHandle;
HANDLE m_pDepthStreamHandle;
char* m_hVideo; // latest locked video (color) frame
char* m_hDepth; // latest locked depth frame
// Initialize Kinect
NuiInitialize( NUI_INITIALIZE_FLAG_USES_DEPTH_AND_PLAYER_INDEX | NUI_INITIALIZE_FLAG_USES_SKELETON | NUI_INITIALIZE_FLAG_USES_COLOR);
// Events signalled when new depth, video and skeleton frames are available
m_hNextDepthFrameEvent = CreateEvent( NULL, TRUE, FALSE, NULL );
m_hNextVideoFrameEvent = CreateEvent( NULL, TRUE, FALSE, NULL );
m_hNextSkeletonEvent = CreateEvent( NULL, TRUE, FALSE, NULL );
// Enable skeleton tracking and open the color and depth streams
NuiSkeletonTrackingEnable( m_hNextSkeletonEvent, NUI_SKELETON_TRACKING_FLAG_ENABLE_SEATED_SUPPORT );
NuiImageStreamOpen( NUI_IMAGE_TYPE_COLOR, NUI_IMAGE_RESOLUTION_640x480, 0, 2, m_hNextVideoFrameEvent, &m_pVideoStreamHandle );
NuiImageStreamOpen( NUI_IMAGE_TYPE_DEPTH_AND_PLAYER_INDEX, NUI_IMAGE_RESOLUTION_320x240, 0, 2, m_hNextDepthFrameEvent, &m_pDepthStreamHandle );
// Reset the motorized tilt to horizontal
NuiCameraElevationSetAngle( 0 );
For each iteration, the Kinect frames need to be retrieved:
// Video
const NUI_IMAGE_FRAME* pImageFrame = 0;
WaitForSingleObject( m_hNextVideoFrameEvent, INFINITE );
HRESULT status = NuiImageStreamGetNextFrame( m_pVideoStreamHandle, 0, &pImageFrame );
if( ( status == S_OK ) && pImageFrame )
{
   INuiFrameTexture* pTexture = pImageFrame->pFrameTexture;
   NUI_LOCKED_RECT LockedRect;
   pTexture->LockRect( 0, &LockedRect, NULL, 0 );
   if( LockedRect.Pitch != 0 )
   {
      m_hVideo = (char*)LockedRect.pBits;
   }
}

// Depth
const NUI_IMAGE_FRAME* pDepthFrame = 0;
WaitForSingleObject( m_hNextDepthFrameEvent, INFINITE );
status = NuiImageStreamGetNextFrame( m_pDepthStreamHandle, 0, &pDepthFrame );
if( ( status == S_OK ) && pDepthFrame )
{
   INuiFrameTexture* pTexture = pDepthFrame->pFrameTexture;
   if( pTexture )
   {
      NUI_LOCKED_RECT LockedRectDepth;
      pTexture->LockRect( 0, &LockedRectDepth, NULL, 0 );
      if( LockedRectDepth.Pitch != 0 )
      {
         m_hDepth = (char*)LockedRectDepth.pBits;
      }
   }
}

// Give the frames back to the runtime once their data has been used
if( pImageFrame ) NuiImageStreamReleaseFrame( m_pVideoStreamHandle, pImageFrame );
if( pDepthFrame ) NuiImageStreamReleaseFrame( m_pDepthStreamHandle, pDepthFrame );
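Once locked, the buffers can be pushed to the GPU for the CUDA kernels to use. This is only a sketch, assuming plain device buffers (d_video, d_depth) and the 640x480 32-bit color / 320x240 16-bit depth formats opened above:
#include <cuda_runtime.h>

unsigned char* d_video = 0;   // 640x480 BGRA video frame on the device
unsigned char* d_depth = 0;   // 320x240 16-bit depth frame on the device

// Copy the latest locked Kinect buffers to the GPU, allocating them on first use
void uploadKinectFrames( const char* hVideo, const char* hDepth )
{
    if( !d_video ) cudaMalloc( &d_video, 640 * 480 * 4 );
    if( !d_depth ) cudaMalloc( &d_depth, 320 * 240 * 2 );

    if( hVideo ) cudaMemcpy( d_video, hVideo, 640 * 480 * 4, cudaMemcpyHostToDevice );
    if( hDepth ) cudaMemcpy( d_depth, hDepth, 320 * 240 * 2, cudaMemcpyHostToDevice );
}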
Finally, at shutdown, the events are closed and the sensor is released:
CloseHandle( m_hNextDepthFrameEvent );
CloseHandle( m_hNextVideoFrameEvent );
CloseHandle( m_hNextSkeletonEvent );
NuiShutdown();