Category Archives: Web

Intro to Pixel Shaders in Three.js


I recently started playing with shaders in three.js and I wanted to share some of what I’ve discovered so far. Shaders are the ‘secret sauce’ of modern graphics programming and understanding them gives you a lot of extra graphical fire-power.

For me the big obstacle to learning shaders was the lack of documentation or simple examples, so hopefully this post will be useful to others starting out. This post will focus on using pixel shaders to add post-processing effects to Three.js scenes. This post assumes you already know the basics of using Three.js.

What is a Shader?

A Shader is a piece of code that runs directly on the GPU. Most modern devices have powerful GPUs designed to handle graphics effects without taxing the CPU. This means you get a lot of graphical power essentially for free.

The big conceptual shift when considering shaders is that they run in parallel. Instead of looping sequentially through each pixel one by one, shaders are applied to every pixel simultaneously, taking advantage of the parallel architecture of the GPU.
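One way to internalize this: a pixel shader is just a pure function from an input pixel to an output pixel. Here is a rough CPU-side sketch in plain JavaScript (the function names are made up for illustration) – on the CPU we have to loop over pixels, whereas the GPU effectively runs every call at once:

```javascript
// Conceptual sketch: a 'pixel shader' is a pure function of one pixel.
// The GPU applies it to all pixels in parallel; here we just map over them.
function invertShader(pixel) {
  // pixel = { r, g, b } with components in the range 0..1
  return { r: 1 - pixel.r, g: 1 - pixel.g, b: 1 - pixel.b };
}

// Apply to every pixel (a GPU does this simultaneously, not in a loop)
var image = [{ r: 1, g: 0, b: 0.5 }, { r: 0, g: 1, b: 1 }];
var output = image.map(invertShader);
console.log(output[0]); // { r: 0, g: 1, b: 0.5 }
```

The key property that makes this parallelizable is that each call depends only on its own input pixel, never on its neighbours' results.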

There are two main types of shaders: vertex shaders and pixel shaders.

  • Vertex Shaders generate or modify 3D geometry by manipulating its vertices. A good example is this fireball, where the vertex positions of a sphere geometry are deformed by Perlin noise.
  • Pixel Shaders (or ‘Fragment Shaders’) modify or draw the pixels in a scene. They are used to render a 3D scene into pixels (rasterization), and also typically used to add lighting and other effects to a 3D scene.

There are two different kinds of pixel shaders:

  • Shaders that draw an image or texture directly. These allow you to draw the kind of abstract patterns seen on glsl.heroku.com. These types of shaders can be loaded into a THREE.ShaderMaterial to give cool textures to 3D objects like this example.
  • Shaders that modify another image or texture. These allow you to do post-processing on an existing texture, for example to add a glow or blur to a 3D scene. This second type of shader is what we will be talking about for the remainder of this post.

Pixel Shaders in Three.js

Three.js has an effects manager called EffectComposer and many useful shaders built in. This code is not compiled into the main Three.js file; rather, it is maintained separately in two folders in the three.js root folder:

  • /examples/js/postprocessing – contains the main EffectComposer() class and a number of ShaderPasses.
  • /examples/js/shaders – contains multiple individual shaders.

Unfortunately these shaders are not very well documented, so you need to dig in and test them out yourself.

Preview some of the three.js built-in shaders with this demo.


Applying Shaders in Three.js

Applying a shader is pretty straightforward. This example applies a dot screen and RGB shift effect to a simple 3D scene:

To use shaders that come with three.js, first we need to include the required shader JS files. Then in the scene initialization we set up the effect chain:

// postprocessing
composer = new THREE.EffectComposer( renderer );
composer.addPass( new THREE.RenderPass( scene, camera ) );

var dotScreenEffect = new THREE.ShaderPass( THREE.DotScreenShader );
dotScreenEffect.uniforms[ 'scale' ].value = 4;
composer.addPass( dotScreenEffect );

var rgbEffect = new THREE.ShaderPass( THREE.RGBShiftShader );
rgbEffect.uniforms[ 'amount' ].value = 0.0015;
rgbEffect.renderToScreen = true;
composer.addPass( rgbEffect );

First we create an EffectComposer() instance. The effect composer is used to chain together multiple shader passes by calling addPass(). Each shader pass applies a different effect to the scene. Order is important, as each pass operates on the output of the pass before it. The first pass is typically the RenderPass(), which renders the 3D scene into the effect chain.

To create a shader pass we either create a ShaderPass() passing in a shader from the ‘shaders’ folder, or we can use some of the pre-built passes from the ‘postprocessing’ folder, such as BloomPass. Each Shader has a number of uniforms which are the input parameters to the shader and define the appearance of the pass. A uniform can be updated every frame, however it remains uniform across all the pixels in the pass. Browse which uniforms are available by viewing the shader JS file.

The last pass in the composer chain needs to have renderToScreen set to true. Then, in the render loop, instead of calling renderer.render() you call composer.render().
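Putting that together, the render loop change is minimal. A sketch (it assumes the composer and passes were set up as above, so it isn't runnable on its own; the per-frame uniform update is just an illustrative possibility):

```javascript
// Render loop sketch: composer.render() replaces renderer.render().
// Assumes 'composer' and 'rgbEffect' exist from the setup code above.
function animate() {
  requestAnimationFrame(animate);

  // Uniforms can be updated every frame before rendering, e.g.
  // animating the RGB shift amount over time:
  // rgbEffect.uniforms['amount'].value = 0.001 + 0.001 * Math.sin(Date.now() * 0.002);

  composer.render();
}
```

Calling animate() once kicks off the loop; requestAnimationFrame keeps it in sync with the browser's repaint cycle.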

That’s all you need to apply existing effects. If you want to build your own effects, continue.

GLSL Syntax

WebGL shaders are written in GLSL. The best intro to GLSL syntax I found is at Toby Schachman’s Pixel Shaders interactive tutorial. Go through this quick tutorial first and you should get a lightbulb appearing over your head. Next take a look at the examples in his example gallery. You can live edit the code to see changes.

GLSL syntax is based on C, so get out your ‘Kernighan and Ritchie’ :). Luckily a little code goes a long way, so you won’t need to write anything too verbose. The main WebGL language docs are the GLSL ES Reference Pages, which list all the available functions. GLSL data types are described here. Usually Googling ‘GLSL’ plus your query will give you good results.

Some GLSL Notes:

  • Floats always need a number after the decimal point so 1 is written as 1.0
  • GLSL has many useful utility functions built in such as mix() for linear interpolation and clamp() to constrain a value.
  • GLSL allows access to components of vectors using the letters x,y,z,w and r,g,b,a. So for a 2D coordinate vec2 you can use pos.x, pos.y. For a vec4 color you can use col.r, col.g,col.b,col.a
  • Most GLSL functions can handle multiple input types, e.g. float, vec2, vec3 and vec4.
  • Debugging GLSL is notoriously difficult, however Chrome’s JS console will provide pretty good error messaging and will indicate which line of the shader is causing a problem.
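If it helps to see them in a familiar language, mix() and clamp() behave like these plain-JavaScript equivalents (the names mirror GLSL; the implementations are my own sketch of the float versions):

```javascript
// JS equivalents of two common GLSL utility functions (float versions).

// mix(a, b, t): linear interpolation between a and b by factor t (0..1).
function mix(a, b, t) {
  return a * (1 - t) + b * t;
}

// clamp(x, lo, hi): constrain x to the range [lo, hi].
function clamp(x, lo, hi) {
  return Math.min(Math.max(x, lo), hi);
}

console.log(mix(0, 10, 0.25));    // 2.5
console.log(clamp(1.7, 0.0, 1.0)); // 1
```

In GLSL these also accept vector types (vec2, vec3, vec4) and operate component-wise.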

Brightness Shader Example

For the first example we will walk through a super simple brightness shader. Slide the slider to change the brightness of the 3D scene.

Shader code can be included in the main JS file or maintained in separate JS files. In this case the shader code is in its own file. We can break the shader code into three sections: the uniforms, the vertex shader and the fragment shader. For this example we can skip the vertex shader, since that section remains unchanged for pixel shaders. Note that three.js shaders require both a vertex and a fragment shader, even if you are only modifying one of them.

UNIFORMS
The “uniforms” section lists all the inputs from the main JS. Uniforms can change every frame, but remain the same across all processed pixels.

uniforms: {
  "tDiffuse": { type: "t", value: null },
  "amount": { type: "f", value: 0.5 }
},
  • tDiffuse is the texture from the previous shader. This name is always the same for three.js shaders. Type ‘t’ is a texture – essentially a 2D bitmap. tDiffuse is always passed in from the previous shader in the effect chain.
  • amount is a custom uniform defined for this shader. Passed in from the main JS. Type ‘f’ is a float.

FRAGMENT SHADER
The fragmentShader (pixel shader) is where the actual pixel processing occurs. First we declare the variables, then we define the main() function.

fragmentShader: [

  "uniform sampler2D tDiffuse;",
  "uniform float amount;",
  "varying vec2 vUv;",

  "void main() {",
  "vec4 color = texture2D(tDiffuse, vUv);",
  "gl_FragColor = color*amount;",
  "}"

].join("\n")

Here you will notice one of the quirks of shaders in three.js: the shader code is written as a list of strings that are concatenated. This is because there is no agreed way to load and parse separate GLSL files. It’s not great, but you get used to it pretty quickly.

  • uniform variables are passed in from main JS. The uniforms listed here must match the uniforms in the uniforms section at the top of the file.
  • varying variables vary for each pixel that is processed. vUv is a 2D vector that contains the UV coordinates of the pixel being processed. UV coords go from 0 to 1. This value is always called vUv and is passed in automatically by three.js.

The main() function is the code that runs on each pixel.

  • The texture2D() call gets the color of this pixel from the passed-in texture (tDiffuse) at the coordinate of this pixel (vUv). vec4 colors are in RGBA format with values from 0 to 1, so (1.0, 0.0, 0.0, 0.5) would be red at 50% opacity.
  • The next line sets gl_FragColor. gl_FragColor is always the output of a pixel shader – this is where you define the color of each output pixel. In this case we simply multiply the incoming pixel color by the amount uniform to create a simple brightness effect.
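To see exactly what gl_FragColor = color*amount is doing, here is the same math on the CPU in plain JavaScript (a sketch, not three.js code – in GLSL the multiplication by a scalar happens component-wise, including alpha):

```javascript
// CPU analogue of the brightness fragment shader.
// In GLSL, color * amount multiplies every RGBA component by the scalar.
function brightnessShader(color, amount) {
  return {
    r: color.r * amount,
    g: color.g * amount,
    b: color.b * amount,
    a: color.a * amount
  };
}

var pixel = { r: 0.8, g: 0.4, b: 0.2, a: 1.0 };
console.log(brightnessShader(pixel, 0.5)); // { r: 0.4, g: 0.2, b: 0.1, a: 0.5 }
```

Note that the alpha channel is scaled too; a shader that should only affect brightness would typically leave color.a untouched.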

Mirror Shader Example

In addition to modifying the colors of each pixel, you can also copy pixels from one area to another, as in this Mirror Shader example.

For example, to copy the left-hand side of the screen to the right, you can do this:

"uniform sampler2D tDiffuse;",
"varying vec2 vUv;",

"void main() {",
"vec2 p = vUv;",
"if (p.x > 0.5) p.x = 1.0 - p.x;",
"vec4 color = texture2D(tDiffuse, p);",
"gl_FragColor = color;",
"}"

This code checks the x position of each pixel it runs on (p.x). If it’s greater than 0.5, the pixel is on the right-hand side of the screen. In that case, instead of sampling the texture at the pixel’s own position, it samples at 1.0 - p.x, which is the mirrored position on the opposite side of the screen.
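The coordinate flip is easier to see outside of GLSL. A plain-JavaScript sketch of just the sampling-coordinate logic (mirrorX is a made-up name for illustration):

```javascript
// Mirror the sampling coordinate: pixels on the right half (x > 0.5)
// read their color from the mirrored position on the left half.
function mirrorX(x) {
  return x > 0.5 ? 1.0 - x : x;
}

console.log(mirrorX(0.3));  // 0.3  (left half, unchanged)
console.log(mirrorX(0.75)); // 0.25 (right half, mirrored)
```

This is the general pattern for any 'pixel copying' effect: transform the UV coordinate first, then sample the texture at the transformed position.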

More Shaders!

Here are some examples of more advanced shaders that I built recently. View the source to see how they work.

BAD TV SHADER:

Simulates a bad TV via horizontal distortion and vertical roll, using Ashima WebGL Noise. Click to randomize uniforms.

[ EDIT: View source code on Github]


DOT MATRIX SHADER:

Renders a texture as a grid of dots. The demo then applies a glow pass by blurring and compositing the scene via an Additive Blend Shader.


PAREIDOLIA:

An audio reactive animation. Post-processing via a combination of mirror, dotscreen shader and RGB shift shaders:


Of course this is just scratching the surface of what you can do with shaders, but hopefully it will be enough to get some people started. If you made it this far, congratulations! Let me know how you get on in the comments.

Stage3D vs WebGL Performance

So you want to build some crazy next-gen 3D graphics in the browser? Right now there are 2 good options available: Flash 11’s Stage3D versus JavaScript’s native WebGL.

Demos and Source Code

To compare the performance of both options I built a couple of simple demos that show 100 semi-transparent cubes with additive blending and 2 point lights. The purpose of these demos is to compare two equivalent 3D scenes built with the two technologies.

If you are getting low frame rates, try reducing the size of the browser window. On the Stage3D demo, check whether the debug panel’s DRIV field says “OpenGL” or “Software” (see below for what this means).

Building the Demos

For these demos I used Away3D 4 for Stage3D and Three.js for WebGL.

Away3D 4 is the latest version of the popular Flash 3D engine, built to utilize Stage3D. Away3D has a great online community and a well maintained set of API docs. Away3D 4 is currently in alpha and some features are missing, for example there is no Particle System object and the filters/shaders support is pretty limited. Use this tutorial to get started with Away3D 4.

Three.js is an open source WebGL 3D engine that has an energetic community who are constantly adding new features. There are plenty of tutorials showing how to get started with Three.js.

Code Comparison

Away3D and Three.js have very similar logical models and syntax. For example to create a Cube in Away3D, you do this:

var cube:Cube = new Cube(material, 100,100,100); 
view.scene.addChild(cube); 

To do the same in Three.js, you do this:

var geometry = new THREE.CubeGeometry(100, 100, 100); 
var cube = new THREE.Mesh(geometry, material); 
scene.add(cube); 

Supported Platforms

To run with hardware acceleration, Stage3D requires Flash 11 and a recent graphics card. If Stage3D cannot use hardware rendering it will fall back to software rendering which is around 5-10 times slower. For a list of Stage3D unsupported GPUs, check here. You can manually detect for software rendering mode, and handle it appropriately. Stage3D uses OpenGL on Mac and DirectX on Windows. The big advantage for Stage3D is that it will run in IE.

WebGL requires no plugins and is currently supported in Chrome, Firefox, Safari and Opera. (But not IE – cheers Microsoft!). WebGL in Chrome on Windows uses ANGLE which converts WebGL to DirectX and gives good performance.

On the Mac, both Stage3D and WebGL translate to OpenGL, so there is not much difference in performance. Neither option currently runs on mobile devices (iOS / Android), however both are expected to in the future.

Performance Comparison

I tested the 2 demos on a MacBook Pro (with an NVIDIA GeForce GT 330M) and a mid-range Windows 7 Dell laptop (with integrated Intel(R) HD Graphics). Your mileage will vary.

                     Stage3D              WebGL
Mac / Chrome         60 FPS (OpenGL)      60 FPS
Mac / Firefox        60 FPS (OpenGL)      60 FPS
Mac / Safari         60 FPS (OpenGL)      60 FPS
Windows / Chrome     8 FPS (Software)     50 FPS
Windows / Firefox    8 FPS (Software)     Does not run
Windows / IE         12 FPS (Software)    Does not run


On the Mac, Stage3D and WebGL both perform well. On Windows, Stage3D performs well if you have supported hardware, otherwise poorly. On Windows, WebGL performs well in Chrome, otherwise it does not run. It’s interesting that Windows / Chrome with WebGL gives good performance even with an integrated GPU.

On the Mac, the Stage3D demo is slower to initialize and the frame-rate has more stutters than the WebGL equivalent. Stage3D gives more interesting color blending and specular highlights.

Summary

Stage3D and WebGL are both great technologies for delivering interactive 3D content in the browser. Away3D and Three.js provide surprisingly similar APIs for developing 3D content.

When the hardware supports it, both options give great performance. Unfortunately neither option runs well across the most common hardware configurations. If you want to target the general population, you will need to provide alternative content for non-supported machines.

Stage3D’s software mode is a nice idea, in that it will show 3D content on unsupported machines. The problem is that the performance is unusably slow. Is it better to show something that performs badly, or to redirect to alternative content?

Since Chrome is a free install on Mac and Windows, you could argue that using WebGL gives a broader reach with decent performance. Perhaps we will start seeing more ‘View this site in Chrome’ banners?

Want more stuff like this? Follow me on Twitter.

UPDATE – There are many comments reporting 60FPS with Windows/Stage3D. The numbers above do not claim to represent a broad range of machine configurations, just what I observed on the 2 machines closest to me. My test PC laptop has an integrated GPU and therefore does not support Stage3D hardware mode. It is a mid-range laptop, less than 6 months old and as such is probably a good indication of where the general public is at.

UPDATE 2 – I’ve updated the demos to allow you to add more cubes, so please go crazy with 1000s of cubes! I also added a mouse look and updated Three.js to r46. On my MBP, WebGL handles 1000 cubes a lot better than Stage3D.

View All Your Tweets On One Page with AllMyTweets

Trying to find that cool URL that you posted to Twitter a few months ago? Want to easily archive all your tweets? Now you can with AllMyTweets. AllMyTweets displays all your tweets on one page. Just enter your Twitter user name. All of the data is pulled from Twitter’s public API, so no authentication is required. To archive your tweets, select all the text on the page, copy it and paste to a text document – old school style 🙂

Note that Twitter limits the maximum number of tweets returned to 3,200, so if you have more tweets than that you are out of luck. Also the Twitter API is not always completely reliable so you may find the loading hangs sometimes. If this happens, please try again later.

I built this because I couldn’t find anything similar on the web, and I figured other people must want to be able to do the same thing. Please put any feedback in the comments. Thanks!

WebGL Roundup

Since the demise of the Flash Player’s ubiquity, it’s like the Wild West out there for rich media web developers today. There is a whole world of new technologies that allow for animation and rich interactivity on the web. These include jQuery, Canvas, CSS3 transitions, Unity and WebGL. Unfortunately none of these work across all platforms so we are back to the days of creating multiple versions of our content and switching them out based on browser detection.

One of the most exciting new technologies is WebGL. WebGL is an open-standard browser implementation of OpenGL – the graphics API that is used just about everywhere. The strength of WebGL is that it uses hardware acceleration to allow for complex, high frame-rate 3D animations and games. WebGL is scripted with JavaScript.

WebGL is currently supported in Chrome 9+, Firefox 4+ and Safari on OS X 10.6+. Since Google turned on WebGL support in the current release version of Chrome, anyone can view WebGL content on the web for free. As this is a new technology it’s a little flaky. Some of these demos may hog your machine’s resources, and you may need to restart your browser now and again.

Three.js

WebGL is a low-level API that provides a lot of flexibility and power, with the trade-off that it can take a lot of code to make a simple scene. Luckily the ubiquitous Mr Doob has built a very nice JavaScript library that allows you to easily utilize WebGL: Three.js. Three.js provides the option to choose a Canvas renderer or a WebGL renderer. Download the library, including many WebGL examples, here.

WebGL Demos

Here are some demos from around the web that show the potential of WebGL. Use the latest Google Chrome to view them. Let me know in the comments if I’ve missed any good ones.

Three.js/WebGL demos from AlteredQualia.

Three.js/WebGL Demos

FractalLab – insane 3D fractal visualizer from subblue.

Demoscene style shader effects from Frank Reitberger.

WebGL demos from Evgeny Demidov


It’s a Good Time to be a Web Developer.

I think this graph from Google I/O says it all:

Webs Up!

With all the FUD around Flash and the iPad, it’s easy to forget that we are living in a golden age of web development. The web is fulfilling all but the most outlandish predictions of its success from back in the 90s. Things are just starting to get interesting.

The other take-away I got from the I/O keynote is that a lot of HTML5 is basically HTML playing catch-up with Flash. Many of the coolest new features have been available in Flash for years. This means all the skills you honed as a Flash developer are directly applicable to the newly emerging platforms of HTML5, Unity, Android and even iApps:

  • Creating animations and transitions that enhance static content.
  • Optimizing for performance and fast page loads (including preloading).
  • Handling rich media: video, audio and images.
  • Custom text layouts and font handling.
  • Asynchronously querying the backend.

Only the syntax has changed – the end result remains the same.

Which Way Now? Web Development in the iPad Era.

Flash? No Thanks!

So it turns out Steve Jobs does not like Flash, and won’t be allowing it onto the iPad in the foreseeable future. This creates the first dent in the Flash player’s ubiquity for many years and leaves a more complex set of web development choices ahead of us.

Rather than get into the debate over whether Flash is good or evil, I would like to discuss what options are available for delivering rich interactive experiences on the web today. Most developers don’t have a philosophical preference over which tools they use, they simply want the technology that can provide the best experience while reaching the most users. With that in mind, let’s look at some numbers.

How Big is the iPad?

The iPad is selling phenomenally well and may be the herald of a new era in computing. That said, if we look at the web usage numbers for the iPad’s progenitors, we see that the usage numbers do not equate to the amount of media buzz these devices get.

In the last 12 months, mobile web usage accounted for 1.48% of total web usage, versus desktop usage at 98.52%.

Of that number, 32% came from iDevices (iPhone + iPod Touch). That gives iDevices a whopping 0.47% of total web usage. Let’s be generous and say the iPad doubles this usage in the coming year. That will give iDevices around 1% of total web usage in the coming year. Admittedly, mobile usage is about to explode and iDevices will be part of that, but I think it’s important to keep some perspective on the current state of the market.

What About HTML5?

Steve Jobs has suggested that we should drop Flash for HTML5. HTML5 holds a great deal of promise, however it is currently unsupported on the great majority of web browsers.

For example, based on current browser usage, 71% of web users cannot view the HTML5 video tag. This means that building a website solely in HTML5 is not currently an option. Since IE9 will support HTML5, IE9 adoption rates will be the deciding factor as to when we can start targeting HTML5.

Where Does this Leave Web Developers Now?

So what should we recommend to our clients who want a brand experience or RIA today? It depends 🙂

  • If you can simplify the requirements enough, build it in plain HTML with a few jQuery animations thrown in. This will work in many situations. The truth is that many sites that are currently in Flash do not need to be. Most sites can get by with plain HTML, images and simple JavaScript animations.
  • If you need complex animations, interactivity, games, video, audio, web cam support, etc – build it in Flash. Your content will be viewable by the vast majority of web users. In addition, build a simplified HTML version for non-Flash devices. This is typically what we have been doing since the first web accessible mobile devices came around.

Looking Forward

Personally I find it hard to believe that Apple can single-handedly kill Flash when it is so ubiquitous on the web and has such a huge and loyal developer community. I still hold out hope that Flash will eventually come to the iPad in the same way that multitasking came to the iPhone. In the short-term, iPad users will be locked out from a lot of great content. Only when HTML5 browser support and tooling is broadly available can we start looking at developing RIAs with it.

[Credit to Noel Billing for pointing me to the StatCounter global data.]