Notes on VR support in Field

This page explains some of the VR support in Field.

Field has extensive support for 2- and 3-d drawing which typically takes place inside a Stage — a place on the canvas (the main window) or a separate ‘fullscreen’ window that acts as a container for geometry. Geometry is typically fashioned out of FLines — objects that, in turn, support drawing instructions that describe lines (that might be filled in to create solid areas).

To draw in VR in Field is to draw onto a Stage that just happens to have its contents visible in VR somehow. Field can do this in two ways.

  1. Firstly, by supporting the Oculus Windows SDK, Field can render directly to an Oculus HMD (note, this includes the actual Field UI itself).
  2. Secondly, by way of a network geometry protocol, Field can send geometric forms and other drawing instructions to WebVR-capable web browsers.

These two techniques are really rather different, but, as far as you are concerned, they both consist of drawing things to the Stage. This means that you’ll absolutely want to become comfortable with that documentation before proceeding much further. And although the Stage is primarily documented as if it were oriented towards 2d drawing, Field secretly draws everything in 3d anyway.

This page also collects, for now, miscellaneous VR-related 3d things in Field.

Direct HMD / Fake HMD

Field’s direct HMD support is the most efficient, richest and most direct option (and the most hardware-demanding — it only actually works on Windows, although Field is happy to pretend on OS X). To start drawing on an HMD, simply insert an ‘Oculus’-capable stage into your sheet. You can grab one from the ‘usual place’: the command palette (ctrl-space), followed by ‘insert from workspace’:

The current one is called ‘graphics.stage.VR’:

That will give you a new Stage, which you’ll have to execute once (just alt-click on it) to initialize it. Upon doing that, your Oculus app will start up (and possibly try to sell you something) and, more importantly, a new, ominously black window will appear on your desktop:

That black window is the HMD debug display. It’s showing what is showing in the headset, which is currently absolutely nothing. Let’s fix that.

var layer = _.stage.withName("myLayer")

// make this layer a fully 3d affair
layer.vrDefaults()

var f = new FLine()
f.moveTo(-1,-1,-5)
f.lineTo(1,-1,-5)
f.lineTo(1,1,-5)
f.lineTo(-1,1,-5)
f.lineTo(-1,-1,-5)

f.color = vec(1,0,1,1)
f.filled=true
layer.lines.f = f

This code draws a little magenta square centered on 0,0,-5. If we look in the right direction in the HMD, we’ll see it:

Three notes on cameras:

  1. Note that the Stage shows a ‘front’ camera all the time, whereas the HMD debug window shows exactly what is entering your eyes. To reset where the ‘front’ of the VR experience is, right-click on the Stage and select ‘reset’.

  2. Once you’ve had fun driving the camera around with the keyboard to find just the right angle on something, you might ask yourself: how do I do that with code? See here

  3. You can move the whole world around by selecting the stage and using the arrow keys. See the documentation for the keyboard camera, and the bottom of this page.
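As a sketch of driving a layer's camera from code: the property names `position` and `target` below are assumptions rather than confirmed API (each layer's `layer.camera` is mentioned later on this page), so check the keyboard camera documentation for the real names.

```javascript
// grab the camera attached to a layer made earlier with _.stage.withName(...)
// NOTE: .position and .target are assumed property names, not confirmed API
var camera = layer.camera

// pull the viewpoint back and up a little, looking at the origin
camera.position = vec(0, 1, 5)
camera.target = vec(0, 0, 0)

// redraw so the change shows up
_.redraw()
```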

Hints

The Oculus Rift’s idea of space and Field’s idea of (2d) space are fairly different. To reduce this mismatch, call layer.vrDefaults() to initialize the camera and some scaling constants to be much more uniform across our different canvases.

The WebVR cell-phone trick

The second kind of VR rendering supported by Field is to send the contents of a Stage to a web browser. This typically means a cell phone (Firefox on Windows works as well, although it isn’t nearly as good as Field’s native HMD support).

All normal 2D Stages support this ‘hack’. To connect your web browser / cell phone to Field, browse to http://YOUR.IP.ADDRESS.HERE:8090/boot. To find your IP address you might need to open your network settings (or you can insert a ‘graphics.stage.webvr’ template following the instructions above, and it will tell you at the bottom). This is likely to only work when your cell phone and the machine running Field are on the very same WiFi network and, even then, it’s dependent on the whims of the security policies of whoever set up the network.

Some notes:

  1. Field only updates the geometry on your cell phone when it redraws its screen. Make sure you call _.redraw() or _.stage.frame() (or wiggle the canvas with the mouse).

  2. If you draw too much then your cell phone won’t be able to handle it. If you change too much then your WiFi connection won’t be able to handle it.

  3. Texturing is supported, but Field’s video support isn’t. We’ll build towards supporting the kinds of (stereoscopic) video that you can play in web-browsers instead.

  4. If you restart Field or reinitialize the Stage you’ll need to reload your browser page to get new updates.
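Putting the first note into practice, here is a minimal sketch (using only calls shown elsewhere on this page) that draws to an ordinary 2D Stage and then forces the redraw that pushes the geometry out to any connected phones:

```javascript
var layer = _.stage.withName("phoneLayer")

// a small filled triangle
var f = new FLine()
f.moveTo(20, 80)
f.lineTo(80, 80)
f.lineTo(50, 20)
f.lineTo(20, 80)
f.filled = true
f.color = vec(0, 0.5, 1, 1)
layer.lines.f = f

// without this, connected cell phones won't see the new geometry
_.redraw()
```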

Other extensions to Stage

Right now, the only other extension to Field needed to let you build things in 3D is the ability to constrain layers to be visible in only one eye. This is useful when you want to display different textures to different eyes (as in the case of the stereo pair viewer). So, with code like this:

// here are a pair of images
var layer1 = _.stage.withTexture("c:/Users/marc/Pictures/classPairs/09/left.jpg")
var layer2 = _.stage.withTexture("c:/Users/marc/Pictures/classPairs/09/right.jpg")

// you'll need to tweak these a bit
var z = -50
// the aspect ratio (width/height) of the images; 1.5 here is just a placeholder
var aspect = 1.5
layer1.lines.clear()
layer2.lines.clear()

// this constrains layer1 to be on the left eye and
layer1.sides=1
// layer 2 to be on the right
layer2.sides=2

// let's build a rectangle set back by '-z'
var f = new FLine()
f.moveTo(50-50*aspect, 0, -z)
f.lineTo(50+50*aspect, 0, -z)
f.lineTo(50+50*aspect, 100, -z)
f.lineTo(50-50*aspect, 100, -z)
f.lineTo(50-50*aspect, 0, -z)
layer1.bakeTexture(f)

f.filled=true
f.color = vec(1,1,1,1)
layer1.lines.f = f

// let's build another rectangle set back by '-z'
// (just in case we want it to be different)
var f = new FLine()
f.moveTo(50-50*aspect, 0, -z)
f.lineTo(50+50*aspect, 0, -z)
f.lineTo(50+50*aspect, 100, -z)
f.lineTo(50-50*aspect, 100, -z)
f.lineTo(50-50*aspect, 0, -z)
layer2.bakeTexture(f)
f.filled=true
f.color = vec(1,1,1,1)
layer2.lines.f = f

You can completely rebuild the original stereo pair viewer for Sketch 1. Of course, with Field there’s a lot more you can do: you can place those pairs anywhere you want (try one pair per box), in a timeline, with varying opacity, etc.

Interaction — knowing where the head and hands are

Layers also export the following functions:

It’s worth closing your eyes and thinking, visually, for a moment about why, given that each layer has its own layer.camera, it’s important that all of these calls go through layer rather than _.stage. The answer is that they return positions that are ready to use for drawing an FLine in this layer, given the camera associated with the layer. Your head, in short, is not in the space of the layer camera, but the FLine is — thus the layer needs to help out with the math.
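The idea can be sketched as follows. The function name `headPosition()` here is purely hypothetical, standing in for whichever layer function actually returns the head position; the shape of the code is the point:

```javascript
// 'headPosition' is a HYPOTHETICAL name; substitute the real layer function
var head = layer.headPosition()

// because the returned position is already in this layer's space,
// it can be used directly as FLine coordinates
var f = new FLine()
f.moveTo(head.x, head.y, head.z)
f.lineTo(0, 0, -5)
f.color = vec(1, 1, 0, 1)
layer.lines.headMarker = f
```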

Field’s 3d linear algebra

If you have been working through the tutorials on this site diligently you’ll have seen code such as:

var v = vec(1,2)

or

var v = vec(1,2) * rotate(10)

or even

var v = vec(1,2) * rotate(10).pivot(40,50)
var f = myFLine * rotate(10)
var f = myFLine * scale(2)

These are snippets of code that manipulate the positions of things (including the positions of points inside FLines). Most of this site is dedicated to drawing things in 2D, so all of these operations act on the ‘x’ and ‘y’ dimensions. But for VR it makes sense to manipulate things in 3D. Thus:

// make a new 3d vector
var v = vec(1,2,3) 

or

// rotation by 10 degrees clockwise around the y axis
var v = vec(1,2,3) * rotate3(10, vec(0,1,0))

or even

// rotation by 10 degrees clockwise around an axis pointing in the yz direction
var v = vec(1,2,3) * rotate3(10).pivot(0,1,1)
// rotation by 10 degrees clockwise around the z axis
var f = myFLine * rotate3(10, vec(0,0,1))
// scale by 2 in the x direction 4 in the y direction and 3 in the z direction
var f = myFLine * scale3(2,4,3)

The crucial point here is that to get full 3d rotations, scales, and translations you should use rotate3, scale3, and translate3 accordingly.
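Assuming these operators chain from left to right just as the 2d versions do, a combined transform might look like:

```javascript
// rotate 45 degrees around the y axis, double the size,
// then push the result 5 units further from the viewer
var f = myFLine * rotate3(45, vec(0, 1, 0)) * scale3(2, 2, 2) * translate3(0, 0, -5)
```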

Simple VR spatialized audio — [not iOS]

Recent builds of Field include support for spatial audio with head position governed by tracking devices. This technology stack is based on Google’s Omnitone / Resonance Audio work, which enables higher-order ambisonic encoding, rendering, reverberation, and decoding to binaural output. Field provides a simple interface to this engine running in a webpage.

Having inserted a WebVR- or Oculus-aware stage into the canvas, you’ll have access to a new variable, _.space. This functions a lot like _.stage: it’s available everywhere. To get the spatial audio engine running, open a browser page at the usual spot: http://YOUR.IP.ADDRESS:8090/boot — the spatial audio engine is included in our WebVR engine, but it is also available at this url when running Oculus.

Like stage, _.space has an idea of a named layer that is created on demand:

var layer = _.space.withFile("/Users/marc/Documents/c3.wav")

That call will open the file and start streaming it to the browser(s). Given a layer you can do the following with it:

 // start at the beginning
layer.play(0)

 // start 10 seconds in 
layer.play(10)

 // pause
layer.pause()

 // continue from where you left off
layer.play()

 // move the sound to position 1,2,3
layer.position(1,2,3)

Additionally you can:

_.space.setRoomDimensions(4,5,6) // sets the size of the room that you are 'in'
_.space.setRoomMaterial('plywood') // makes the room made out of plywood

See here for a list of materials.
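Putting these pieces together, a small sketch (using only the calls shown above; the file path is just an example) that streams a file, sets up the room, and places the source off to the left:

```javascript
// open a sound file and start streaming it to connected browsers
var layer = _.space.withFile("/Users/marc/Documents/c3.wav")

// a 4m x 5m x 6m room made of plywood
_.space.setRoomDimensions(4, 5, 6)
_.space.setRoomMaterial('plywood')

// play from the beginning, with the source two units to the left
layer.play(0)
layer.position(-2, 0, 0)
```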

Ambisonic soundscape playback

For now, a quick hack to play back a Second Order Ambisonic (SoA) soundscape:

var layer = _.space.playSoASet("/Users/marc/Documents/back2_")

Loading Point clouds

Finally, a way of loading point clouds that have been exported from COLMAP (via “Export as text file…”) or .ply files without triangles (such as you might get from MeshLab). This is the example code you want:

// secret invocation to enable point-cloud loading
var PointCloud = Java.type("trace.graphics.PointCloud")

// load a point cloud from a points3D.txt file somewhere
var points = new PointCloud("/Users/marc/Documents/colmapproject/points3D.txt")

// make a new FLine
var f = new FLine()

// make sure we actually draw the points
f.pointed=true
f.pointSize = 0.3

// put the points into the FLine
points.toFLine(f)

// make a layer for the points on a stage
var layer = _.stage.withName("somePoint")

// and add it
layer.lines.f = f

// and send it to the screen (and any cellphones)
_.redraw()
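Since the point cloud ends up inside an ordinary FLine, the 3d transforms described earlier on this page apply to it too. For example (the scale factor here is something you would tune by eye):

```javascript
// shrink the cloud and push it back in front of the camera
var f2 = f * scale3(0.1, 0.1, 0.1) * translate3(0, 0, -5)
layer.lines.f = f2
_.redraw()
```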