This page collects assorted advanced snippets of code that bring diverse bits of the reference section together. Remember: it's best not to copy and paste any code that you a) don't understand or b) can't test.
The AR initialization ritual
The code for grabbing a position in space from a mouse down and then putting a loaded object there:
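Something along these lines. This is a minimal sketch only: none of the names here (hitTest, loadedObject) are Field's literal API, they just stand in for the pieces involved.

```javascript
// hypothetical sketch, placeholder names throughout
window.addEventListener("mousedown", (e) => {
	// normalize the touch to 0..1 screen coordinates
	var x = e.clientX / window.innerWidth
	var y = e.clientY / window.innerHeight

	// 'hitTest' stands in for the ARKit hit-test bridge:
	// a screen point goes in, positions on sensed surfaces come out
	var hits = hitTest(x, y)
	if (hits.length > 0)
		loadedObject.position.copy(hits[0].point) // put the object there
})
```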
Custom Shaded Duck
This is an example of loading a model from FBX and applying a custom ‘shader’ to it — instructing the graphics hardware of your phone / desktop to transform the vertices and fill in the pixels of your model differently.
The model loading is quite straightforward:
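A sketch, with loadFBX standing in for whatever loader Field actually exposes:

```javascript
// hypothetical sketch: 'loadFBX' and 'scene' are stand-in names
var duck = loadFBX("duck.fbx")
scene.add(duck)
```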
Of course, the real action is in the custom shader. You can access the different parts of the shader using the ctrl-space menu:
Writing ‘edit fragment’ (or enough of it) and pressing return swaps you to a different ‘tab’ of the editor. ‘edit vertex’ brings you to the vertex shader, ‘edit code’ brings you back to JavaScript. Writing ‘reload shader’ causes your edits to be sent to the graphics card (and error messages to be sent back to the top of the document).
Our vertex shader is left untouched — those vertices end up exactly where they were going to go anyway — but our fragment shader takes the existing texture coordinates that were on the model and uses them very differently:
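For instance, a fragment shader in this spirit (a sketch, not the original shader) paints the coordinates themselves rather than looking anything up:

```glsl
varying vec2 vUv; // the model's texture coordinates, per pixel

void main() {
	// reuse the coordinates directly as red and green
	gl_FragColor = vec4(vUv.x, vUv.y, 0.0, 1.0);
}
```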
Of course, some of the power of shaders comes from manipulating these complex forms with small pieces of data sent from JavaScript. If we change our declaration of the shader to include some data from JavaScript:
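Perhaps something like this; the declaration syntax here is a placeholder, not Field's actual call:

```javascript
// hypothetical sketch: the declaration gains an 'offset' uniform that
// JavaScript owns ('vec' is a placeholder constructor)
shader.offset = vec(0, 0, 0, 0)
```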
Now our fragment shader can read:
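For example (a sketch):

```glsl
varying vec2 vUv;
uniform vec4 offset; // the data sent from JavaScript

void main() {
	// the coordinate-painting from before, shifted by the uniform
	gl_FragColor = vec4(vUv.x + offset.x, vUv.y + offset.y, 0.0, 1.0);
}
```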
Now we can animate offset.x and, with it, our duck. Export offset to boxes below this box:
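A sketch, with _.offset standing in for Field's actual export idiom:

```javascript
// hypothetical sketch: publish 'offset' so connected boxes can see it
_.offset = shader.offset
```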
Then, in a connected new box:
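Perhaps, assuming the box re-runs every frame:

```javascript
// hypothetical sketch: nudge the uniform a little each frame
_.offset.x += 0.02 // slowly slide the duck's colors sideways
```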
Fully Alpha-Composited Video
There’s no good mobile-hardware-compatible compression format for playing video that includes an animated transparency layer. This isn’t an issue for Netflix or YouTube, because there's nothing for them to composite their content onto. But for us AR pioneers, the treatment of the edge of video, the places where our material yields to the world, is very important. Let’s hack our way around this limitation using custom shaders.
First, let’s recall how to load a video onto a plane:
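A sketch only; loadVideo and makePlane (and the filename) are stand-ins for the actual calls:

```javascript
// hypothetical sketch: a plane textured with a video
var video = loadVideo("ourVideo.mov") // placeholder filename
var plane = makePlane()
plane.map = video // the texture ends up under the name 'map'
scene.add(plane)
```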
We’ve made a very special video by carefully preparing it (in After Effects):
This video has some color information (what we want to draw) on the left-hand side and, on the right, corresponding transparency information. Here black is transparent, pure white is opaque, and grey marks the places in between. Obviously, showing it this way doesn’t make any sense. But, with a shader, we can ‘decode’ this video and show it differently.
Let’s start by adding a custom shader that includes the video texture map under the name map:
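A sketch of the shape of it; the calls here are placeholders for Field's own:

```javascript
// hypothetical sketch: attach a custom shader to the plane, handing it
// the video texture under the uniform name 'map'
var shader = makeShader()
shader.map = video
plane.shader = shader
```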
Just adding the shader wipes out the texturing and replaces it with a monochrome red rectangle. Let’s first restore the texturing. We want to change pixels, so we change the fragment shader:
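Along these lines:

```glsl
uniform sampler2D map; // the video texture
varying vec2 vUv;      // 0..1 across the plane

void main() {
	vec4 m1 = texture2D(map, vUv); // look the video up at our coordinates
	gl_FragColor = m1;
}
```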
That gets us right back to where we were. Now let’s focus on the line vec4 m1 = ..... If we change it to
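Presumably:

```glsl
// our reconstruction: squash x into 0..0.5, i.e. the left half of the frame
vec4 m1 = texture2D(map, vec2(vUv.x * 0.5, vUv.y));
```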
We get the left hand side of the texture stretched over the whole thing:
While
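(by the same reconstruction)

```glsl
// x squashed into 0.5..1, the right half of the frame
vec4 m1 = texture2D(map, vec2(vUv.x * 0.5 + 0.5, vUv.y));
```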
Gets us the right half:
Now we can finish the task:
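A sketch of the finished fragment shader:

```glsl
void main() {
	// color from the left half, transparency from the right half
	vec4 color = texture2D(map, vec2(vUv.x * 0.5, vUv.y));
	vec4 alpha = texture2D(map, vec2(vUv.x * 0.5 + 0.5, vUv.y));
	// black = transparent, white = opaque, so any one channel will do
	gl_FragColor = vec4(color.rgb, alpha.r);
}
```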
This gives us an alpha-blended movie like the gif at the opening of this section. The left half here maps to color and the right maps to alpha.
Now we are off to the races! To feather the border of everything, consider this cryptic edit:
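Something along these lines; the matching y term is our guess at the rest of the edit:

```glsl
// fade the alpha out towards every edge of the plane
float ax = vUv.x*(1.0-vUv.x)*4.0;
float ay = vUv.y*(1.0-vUv.y)*4.0; // the same trick in y (our assumption)
gl_FragColor = vec4(color.rgb, alpha.r * ax * ay);
```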
Yielding a smooth transition to transparent regardless of the contents of the video:
Study float ax = vUv.x*(1.0-vUv.x)*4.0; in the knowledge that vUv.x goes from 0 to 1 across the video plane: the product is zero at either edge and peaks at 1.0 in the middle.
Live microphone input [half-duplex only on iOS]
After this video, here's live microphone input in Field:
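If Field's runtime is browser-based (an assumption here), the standard Web Audio API route looks like this:

```javascript
// a sketch using the standard Web Audio API; on iOS this is half-duplex,
// so opening the microphone mutes other audio output
navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
	var context = new AudioContext()
	var source = context.createMediaStreamSource(stream)
	var analyser = context.createAnalyser()
	source.connect(analyser)

	// read the current waveform whenever we like
	var samples = new Float32Array(analyser.fftSize)
	analyser.getFloatTimeDomainData(samples)
})
```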
More “World Sensing” 1 — keypoints [iOS]
This code gives access to the keypoints currently sensed by the phone:
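A sketch of the idea only; AR.keypoints() and drawMarker are placeholders, not Field's literal bridge to ARKit:

```javascript
// hypothetical sketch: walk ARKit's current feature points
for (var p of AR.keypoints()) {
	// p is a position in world space; mark it with a (placeholder) helper
	drawMarker(p.x, p.y, p.z)
}
```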
More “World Sensing” 2 — planes [iOS]
This screenshot shows a cube peeking out of a ‘plane’ made out of FLine:
This is the code that digs out the current plane information from ARKit and draws it on the screen. We’ve drawn the FLine with a red translucent fill, but if we’d drawn it completely transparently it would ‘correctly’ occlude other objects in the scene.
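A sketch of the idea only; AR.planes(), the corner layout, and the fill attributes are placeholders for whatever ARKit bridge Field actually exposes:

```javascript
// hypothetical sketch: trace each sensed plane's extent with an FLine
for (var plane of AR.planes()) {
	var f = new FLine()
	var c = plane.corners // assume the plane's extent, in world space
	f.moveTo(c[0].x, c[0].y, c[0].z)
	for (var i = 1; i < c.length; i++)
		f.lineTo(c[i].x, c[i].y, c[i].z)
	f.lineTo(c[0].x, c[0].y, c[0].z) // close the outline
	f.filled = true
	f.color = vec(1, 0, 0, 0.2)     // red, translucent (placeholder attributes)
	_.lines.add(f)                  // placeholder for however FLines get drawn
}
```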
More shading — The collapsing tree
Two things together. First, a point cloud stored in a .ply file:
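A sketch, with loadPLY standing in for Field's actual loader:

```javascript
// hypothetical sketch: read the point cloud in and add it to the scene
var tree = loadPLY("tree.ply")
scene.add(tree)
```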
That gets us:
And then a (very) custom vertex shader:
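The original shader isn't reproduced here; this sketch shows the kind of thing, collapsing every point towards the origin under the control of an alpha uniform:

```glsl
// 'position', 'color', 'modelViewMatrix' and 'projectionMatrix' are assumed
// to be provided by the environment, just as 'vUv' and 'map' were above
uniform vec4 alpha; // our route in from JavaScript
varying vec3 vColor;

void main() {
	vColor = color; // hand the point's color to the fragment shader
	// pull every point towards the origin as alpha.x grows
	vec3 collapsed = mix(position, vec3(0.0), alpha.x);
	gl_Position = projectionMatrix * modelViewMatrix * vec4(collapsed, 1.0);
	gl_PointSize = 2.0;
}
```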
And a fragment shader to color things correctly:
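A plausible sketch, passing each point's own color straight through:

```glsl
varying vec3 vColor; // handed over from the vertex shader above

void main() {
	gl_FragColor = vec4(vColor, 1.0);
}
```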
If you look carefully at the vertex shader you’ll see this line here:
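```glsl
uniform vec4 alpha;
```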
That is our route in: we can conspire to set alpha from JavaScript and have it sent to the shader.
Let’s connect all these things together:
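A sketch, placeholder names throughout:

```javascript
// hypothetical sketch: attach the shader to the point cloud and publish
// 'alpha' so connected boxes can animate it
tree.shader = shader
_.alpha = shader.alpha
```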
Now we can animate alpha.x to distort all of our points.