Francois Demoullin's webpage

Ray Caster using OpenGL, GLSL and C++

A short presentation with stunning images can be found at http://1drv.ms/1OZrDQF

The source code can be found at: https://bitbucket.org/FrancoisDemoullin/volumerenderer

Tri-linear vs Tri-cubic interpolation

Tri-linear interpolation linearly estimates the value within each voxel, along with the gradient at that point, based on the surrounding data points. Tri-cubic interpolation is a more advanced method for interpolating values within a voxel: 64 tensor-product coefficients are computed per voxel in a pre-processing step, and both the gradient and the value are then smoothly interpolated using Bernstein polynomials.
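
As a rough illustration (not the project's actual code), tri-linear interpolation of a scalar value on a regular grid can be written as follows; the grid layout and names are assumptions made for this sketch.

#include <cmath>
#include <vector>

// Minimal tri-linear interpolation sketch. 'volume' is assumed to be a scalar
// grid addressed as volume[x + dimX * (y + dimY * z)], and (x, y, z) is assumed
// to lie inside the grid so that the 8 neighbours exist.
float sampleTrilinear(const std::vector<float>& volume,
                      int dimX, int dimY, int dimZ,
                      float x, float y, float z)
{
    (void)dimZ; // only needed for bounds checking, which this sketch omits

    int x0 = (int)std::floor(x), y0 = (int)std::floor(y), z0 = (int)std::floor(z);
    float fx = x - x0, fy = y - y0, fz = z - z0;

    auto at = [&](int i, int j, int k) {
        return volume[i + dimX * (j + dimY * k)];
    };

    // Interpolate along x, then y, then z between the 8 surrounding data points.
    float c00 = at(x0, y0,     z0    ) * (1 - fx) + at(x0 + 1, y0,     z0    ) * fx;
    float c10 = at(x0, y0 + 1, z0    ) * (1 - fx) + at(x0 + 1, y0 + 1, z0    ) * fx;
    float c01 = at(x0, y0,     z0 + 1) * (1 - fx) + at(x0 + 1, y0,     z0 + 1) * fx;
    float c11 = at(x0, y0 + 1, z0 + 1) * (1 - fx) + at(x0 + 1, y0 + 1, z0 + 1) * fx;

    float c0 = c00 * (1 - fy) + c10 * fy;
    float c1 = c01 * (1 - fy) + c11 * fy;
    return c0 * (1 - fz) + c1 * fz;
}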

CVM and perspective changes

For the Volume Renderer I have implemented a full Camera Viewing Model (CVM). This is not standard for volume rendering, as volume renderers often resort to parallel projections along the axes. The CVM allows the user to change the perspective and look at the image from different viewing angles. The From Point is fixed at (0, 0, 0) because the image is produced around the origin.
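
As a rough sketch of what such a camera viewing model can look like, the view and projection matrices could be built as below. This uses GLM, which is an assumption for illustration; the actual project may construct its matrices differently.

#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Illustrative camera viewing model sketch (names and library are assumptions).
glm::mat4 buildViewProjection(const glm::vec3& fromPoint,  // camera position
                              const glm::vec3& toPoint,    // point being looked at
                              float fovDegrees, float aspect)
{
    // The view matrix places the camera at 'fromPoint' looking towards 'toPoint'.
    glm::mat4 view = glm::lookAt(fromPoint, toPoint, glm::vec3(0.0f, 1.0f, 0.0f));

    // A perspective projection, as opposed to the parallel (orthographic)
    // projection that volume renderers often default to.
    glm::mat4 projection = glm::perspective(glm::radians(fovDegrees), aspect, 0.1f, 100.0f);

    return projection * view;
}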

Performance limitations + future work

I want real time! The implementation is done entirely on the CPU. I could drastically improve the performance by using OpenGL compute shaders or CUDA. I like real time, a lot, so ultimately that will be the goal.

Color and opacity mapping editor: I want any user to be able to specify whatever opacity function and color mapping they see as appropriate for the data set. The most fun I had with this project was tweaking the mappings to produce the best images. However, that was all done in code, and I would like non-programmers to be able to easily come up with their own mappings.
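
For illustration only, a color and opacity mapping (often called a transfer function) can be as simple as a piecewise mapping from the sampled density to an RGBA value. The thresholds and colors below are made-up examples, not the mappings used in the project.

// Illustrative transfer function sketch: maps a normalized density in [0, 1]
// to a color and an opacity. The breakpoints and colors are arbitrary examples.
struct RGBA { float r, g, b, a; };

RGBA transferFunction(float density)
{
    if (density < 0.2f) return {0.0f, 0.0f, 0.0f, 0.0f};   // empty space: fully transparent
    if (density < 0.5f) return {0.8f, 0.4f, 0.2f, 0.05f};  // faint orange for low densities
    if (density < 0.8f) return {0.9f, 0.9f, 0.6f, 0.3f};   // denser material, more opaque
    return {1.0f, 1.0f, 1.0f, 0.95f};                      // very dense: nearly opaque white
}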

Source code and library dependencies

The source code is located at https://bitbucket.org/FrancoisDemoullin. My program must be linked against the GLEW and OpenGL libraries in order to build.

Particle System on the CPU using OpenGL, GLSL and C++

The source code can be found at: https://bitbucket.org/FrancoisDemoullin

How the particles are stored and how that affects blending

All particle locations are stored in a vector on the heap. This has the advantage that the array can easily be sorted by depth at each frame; however, that sort is rather slow and makes the frame rate drop. For this project I opted for additive blending instead, because additive blending makes sorting unnecessary, giving my program a boost in performance and allowing for a higher number of particles.
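
A minimal sketch of the corresponding OpenGL blend state is shown here; the drawParticles call is a hypothetical stand-in for the project's actual draw call.

#include <GL/glew.h>

// Additive blending sketch: fragments are simply summed into the framebuffer,
// so the draw order of the particles no longer matters and the per-frame depth
// sort can be skipped.
void renderParticlesAdditive(void (*drawParticles)())
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE);  // source scaled by alpha, added onto the destination
    glDepthMask(GL_FALSE);              // keep depth testing, but don't write depth for particles

    drawParticles();

    glDepthMask(GL_TRUE);
    glDisable(GL_BLEND);
}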

How the particles are generated in the fragment shader

The main program only keeps track of a single location for each particle. However, one particle is much more than just a point sprite: it has a randomly chosen color and a randomly generated radius. Furthermore, I wanted all particles to fade out towards the outside. This means that close to the center of the particle the color should be more vivid than at the particle's edges. This effect is achieved in the Fragment shader.
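
As an illustrative sketch of that fade-out, the effect can be expressed as a point-sprite fragment shader, shown here embedded as a GLSL string in the C++ host program. The names are assumptions for illustration, not the project's actual shader.

// Hypothetical point-sprite fragment shader implementing the fade towards the edge.
const char* particleFragmentShader = R"glsl(
#version 330 core
in  vec4 particleColor;   // per-particle color passed in from the vertex shader
out vec4 fragColor;

void main()
{
    // gl_PointCoord runs from (0,0) to (1,1) across the point sprite;
    // measure the distance from the sprite's center.
    float dist = length(gl_PointCoord - vec2(0.5));

    // Discard fragments outside the circular radius, then fade the color
    // towards the edge so the particle is most vivid at its center.
    if (dist > 0.5)
        discard;
    float fade = 1.0 - smoothstep(0.0, 0.5, dist);
    fragColor = vec4(particleColor.rgb, particleColor.a * fade);
}
)glsl";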

Performance limitations

The performance of this implementation is relatively limited. Beyond roughly half a million particles, my program starts lagging and the frame rate drops below 10 fps. The bottleneck is definitely the big array that holds all the particles: it needs to be traversed at every frame because the position of every particle needs to be updated, which is rather inefficient. As a follow-up project to improve the performance of the particle system, I started implementing the particle system on the GPU. See the project below for more details.
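
Conceptually, the per-frame CPU update is a loop like the one below; the Particle fields and the force are simplified stand-ins, not the project's exact simulation.

#include <vector>

// Simplified sketch of the per-frame CPU update. Every particle in the vector
// must be visited each frame, which is what limits the particle count.
struct Particle {
    float px, py, pz;   // position
    float vx, vy, vz;   // velocity
};

void updateParticles(std::vector<Particle>& particles, float dt)
{
    const float gravity = -9.81f;   // example force; the real simulation may differ
    for (Particle& p : particles)   // O(n) traversal every frame
    {
        p.vy += gravity * dt;
        p.px += p.vx * dt;
        p.py += p.vy * dt;
        p.pz += p.vz * dt;
    }
}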

Source code and library dependencies

The source code is located at https://bitbucket.org/FrancoisDemoullin and is publicly accessible. My program must be linked against the GLEW and OpenGL libraries in order to build.

Particle System on the GPU using OpenGL, GLSL and C++

The source code can be found at: https://bitbucket.org/FrancoisDemoullin

How the particles are stored

All particles are stored in textures. That is unconventional, but it brings a massive boost in performance over storing the particles on the CPU. Upon creation, two position textures and two velocity textures are created, and each texture coordinate represents one particle. At every frame, two render passes are performed: the first updates each particle's velocity based on the previous velocity and on simulated physics, and the second uses the newly updated velocity to update the position. The reason there are two sets of textures is that at every frame one set is used to render the particles while the other set is being updated. This effectively stores all information about the particles on the GPU instead of the CPU. In addition, all particles are updated in the fragment shader, exploiting the parallel nature of the GPU to gain performance.
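
Conceptually, the ping-pong scheme alternates which set of textures is read and which is written each frame. The sketch below shows only the swap logic; the pass functions are hypothetical stand-ins for the project's actual render passes, not code from the repository.

#include <GL/glew.h>

// Two sets of position/velocity textures: one is read, the other is written.
struct ParticleTextures {
    GLuint position;   // floating-point texture, one texel per particle position
    GLuint velocity;   // floating-point texture, one texel per particle velocity
};

// Hypothetical render-pass functions, assumed to be implemented elsewhere.
void runVelocityPass(GLuint previousVelocityTex, GLuint nextVelocityTex);
void runPositionPass(GLuint previousPositionTex, GLuint nextVelocityTex, GLuint nextPositionTex);
void drawParticles(GLuint positionTex);

void simulateFrame(ParticleTextures set[2], int& readIndex)
{
    int writeIndex = 1 - readIndex;

    // First pass: update velocities from the previous velocities and simulated physics.
    runVelocityPass(set[readIndex].velocity, set[writeIndex].velocity);

    // Second pass: use the newly updated velocities to update positions.
    runPositionPass(set[readIndex].position, set[writeIndex].velocity, set[writeIndex].position);

    // Render using the freshly written textures, then swap for the next frame.
    drawParticles(set[writeIndex].position);
    readIndex = writeIndex;
}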

Why not OpenCL

While OpenCL is intended for general-purpose computation on the GPU, it made no sense to use OpenCL for my project. The context switch from OpenCL back to OpenGL takes approximately half a second, and this switch would be necessary for rendering every single frame because OpenCL does not allow rendering operations. This effectively reduces the frame rate to 2 frames per second, which, of course, does not allow for a smooth animation.

Future plans and termination of this project

I have not yet had the time to finish the particle system. While I am really close, I am currently trying to refactor my code and put together a demo. Unfortunately, a big course load at school got in the way of completing this project, although I keep working on it on quiet weekends and sometimes after school. This particle system means a lot to me, and completing it will be an amazing feeling that I am looking forward to.

Tweet Plot Web Application using Ruby on Rails, Google Maps API and the Twitter API

The deployed application: http://tweet-maps.herokuapp.com/

The source code can be found at: https://github.com/bftf/Tweet_Map_public

How it works

Sign up with your own Twitter account and see where you tweeted from. Our web application plots all your geo-tagged tweets onto a Google Map. You can also see tweets from any other Twitter account that has geo-tagging turned on. Another feature is search by location: search for Paris, France, and all tweets from Paris will be plotted on the map.

Screen Shots

I am sorry, screenshots are currently not available. Please refer to the deployed application to see how it works.