WebGL Hello World

Earth rendered with WebGL

This small demo renders the Earth using HTML5 and WebGL.

The demo uses shaders for per-pixel lighting, procedural clouds, an atmosphere, and a day/night cycle. The high-resolution textures are taken from NASA's Visible Earth gallery. You can move with the WASD keys and change the view by clicking and dragging.
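The idea behind the day/night cycle can be sketched outside the actual shader code: the blend between the day and night textures is driven by the angle between the surface normal and the direction to the sun. The C++ below is only an illustration with made-up names, not the demo's shader:

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Hypothetical sketch: 0 = full night texture, 1 = full day texture.
// ndotl > 0 means the surface point faces the sun; the scale and offset
// soften the terminator instead of producing a hard cut.
float dayNightBlend(const Vec3& normal, const Vec3& sunDir) {
    float ndotl = dot(normal, sunDir);
    return std::clamp(ndotl * 4.0f + 0.5f, 0.0f, 1.0f);
}

// Usage idea (per fragment): color = mix(nightColor, dayColor, dayNightBlend(n, s));
```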

You can try it by clicking on the following link (a WebGL-compatible browser is required, e.g. Firefox, Chrome, or Opera):

WebGL Hello World


Stochastic Order Independent Transparency

Rendering transparent objects is not an easy task with z-buffering. The reason is that z-buffering requires drawing all transparent objects from back to front. However, this is not always possible: for example, two triangles (or objects) can intersect each other, making it impossible to tell which one is in front of the other. To solve this problem, you need an algorithm that is order independent, hence the name order-independent transparency. There are already a few algorithms for this: using an A-buffer, depth peeling, screen-door transparency, and stochastic transparency, which I implemented as a test for another demo.
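The core idea of stochastic transparency is easy to sketch: treat every transparent fragment like an opaque one, but keep it only with probability equal to its alpha. Averaged over several samples per pixel (e.g. with MSAA), this converges to the correct blended result regardless of draw order. The following C++ only illustrates that per-fragment decision, with a made-up hash as the random source; it is not the demo's shader code:

```cpp
#include <cstdint>

// Hypothetical integer hash, standing in for a per-fragment random number
// generator. Maps a 32-bit seed to a uniform value in [0, 1).
static float hashToUnit(uint32_t x) {
    x ^= x >> 16; x *= 0x7feb352dU;
    x ^= x >> 15; x *= 0x846ca68bU;
    x ^= x >> 16;
    return (x & 0x00ffffffU) / 16777216.0f;
}

// Keep the fragment (and write it to the z-buffer like an opaque one)
// with probability equal to its alpha; discard it otherwise.
bool keepFragment(float alpha, uint32_t pixelIndex, uint32_t sampleIndex) {
    float r = hashToUnit(pixelIndex * 9781u + sampleIndex * 6271u);
    return r < alpha;
}
```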

Continue reading “Stochastic Order Independent Transparency”

LIDAR Rendering

When I had a day off, I thought it would be fun to write a small viewer for LIDAR point cloud data.
Most of the freely available LIDAR data comes in a simple ASCII format or in the LAS file format. While it is quite easy to read the point data from both formats, the LAS format is more flexible and requires more implementation effort.
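A reader for the ASCII variant fits in a few lines. The sketch below assumes whitespace-separated x, y and z values per line, with any extra columns ignored; it is not necessarily the exact layout of every data set, and a real LAS reader would instead parse the binary header and point records defined in the LAS specification:

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

struct Point { double x, y, z; };

// Read one point per line: "x y z [additional columns...]".
std::vector<Point> readAsciiPoints(const std::string& path) {
    std::vector<Point> points;
    std::ifstream file(path);
    std::string line;
    while (std::getline(file, line)) {
        std::istringstream ss(line);
        Point p;
        if (ss >> p.x >> p.y >> p.z)   // trailing columns are ignored
            points.push_back(p);
    }
    return points;
}
```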
After writing the reader, I tried to render aerial LIDAR point data directly, but it didn’t work as expected. The reason: the file format stores x, y and z coordinates, but it doesn’t say in which coordinate system they are defined. Most of the time, cartographic coordinate systems are used. Fortunately, there is the PROJ4 library, which allows you to project the point data into a coordinate system that is more useful for visualization. Since I’m more interested in terrestrial LIDAR data, I decided to add PROJ4 support later.
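For reference, such a reprojection with the classic PROJ.4 C API could look roughly like the sketch below. The UTM source definition is only a placeholder; the real data set dictates the actual coordinate system:

```cpp
#include <proj_api.h>
#include <vector>

// Reproject points in place from a projected CRS (placeholder: UTM zone 11)
// to geocentric coordinates, which are convenient for 3D visualization.
// Both systems here use meters, so no degree/radian conversion is needed.
bool projectToGeocentric(std::vector<double>& x,
                         std::vector<double>& y,
                         std::vector<double>& z) {
    projPJ src = pj_init_plus("+proj=utm +zone=11 +datum=WGS84 +units=m");
    projPJ dst = pj_init_plus("+proj=geocent +datum=WGS84 +units=m");
    if (!src || !dst)
        return false;
    int err = pj_transform(src, dst, static_cast<long>(x.size()), 1,
                           x.data(), y.data(), z.data());
    pj_free(src);
    pj_free(dst);
    return err == 0;   // 0 means the transformation succeeded
}
```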
There isn’t much terrestrial LIDAR data available for free; however, UC Santa Barbara provides terrestrial LIDAR scans of its campus in the ASCII format. Some data sets are locally aligned, making them well suited for direct rendering.
The following screenshots show my first results, rendering three data sets with about 11.2 million points in real time:

[Screenshots: LIDAR point clouds rendered in real time]
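Direct rendering in this sense is little more than uploading the points into a vertex buffer once and drawing them as GL_POINTS each frame. The sketch below assumes a GL function loader, a bound shader program with the position attribute at location 0, and (for a core profile) a vertex array object already set up:

```cpp
#include <GL/glew.h>   // any GL loader works; GLEW is only an example
#include <vector>

struct Point { float x, y, z; };

// Upload the point cloud once into a vertex buffer object.
GLuint createPointBuffer(const std::vector<Point>& points) {
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER,
                 points.size() * sizeof(Point),
                 points.data(), GL_STATIC_DRAW);
    return vbo;
}

// Draw all points in a single call each frame.
void drawPoints(GLuint vbo, GLsizei count) {
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(Point), nullptr);
    glDrawArrays(GL_POINTS, 0, count);
}
```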

I’m not sure whether I will continue this project, but the next steps would be to see how many more points can be rendered in real time, to implement a spatial hierarchy, and to evaluate other rendering techniques.