Notes on VR support in Field

This page explains some of the VR support in Field.

Field has extensive support for 2- and 3-d drawing which typically takes place inside a Stage — a place on the canvas (the main window) or a separate ‘fullscreen’ window that acts as a container for geometry. Geometry is typically fashioned out of FLines — objects that, in turn, support drawing instructions that describe lines (that might be filled in to create solid areas).
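
For orientation, here is a minimal, non-VR sketch of that workflow, using only calls that appear elsewhere on this page: a named layer is fetched from the stage and an FLine is handed to it.

// grab (or create) a named layer on the stage
var layer = _.stage.withName("intro")

// an FLine is a list of drawing instructions
var f = new FLine()
f.moveTo(10, 10)
f.lineTo(90, 10)
f.lineTo(90, 90)
f.lineTo(10, 90)
f.lineTo(10, 10)

// fill it in to make a solid area
f.filled = true
f.color = vec(0, 0.5, 1, 1)

// hand the line to the layer under a name
layer.lines.f = f

// redraw the stage
_.stage.frame()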

To draw in VR in Field is to draw onto a Stage that just happens to have its contents visible in VR. Field does this by supporting the OpenVR SDK, which lets it render directly to the VR headset.

Since Field is ‘merely’ extending its drawing Stage to appear in VR, you’ll absolutely want to become comfortable with the documentation for Stage before proceeding much further. Although the Stage is documented primarily as if it were oriented towards 2d drawing, Field secretly draws everything in 3d anyway.

This page also collects, for now, miscellaneous VR-related 3d things in Field.

Direct VR

Field’s direct VR support is the most efficient, richest and most direct option (and the most hardware-demanding — it only actually works on Windows, although Field is happy to pretend on OS X). To start drawing on a VR headset, simply insert a ‘VR’-capable stage into your sheet. You can grab one from the ‘usual place’: the command palette (ctrl-space), followed by ‘insert from workspace’:

The current one is called ‘graphics.stage.VR’:

That will give you a new Stage. Upon doing that, your Oculus or Steam app will start up (and possibly try to sell you something). You’ll no doubt want to open the SteamVR settings and ask it to show you what’s being displayed on the headset (or pair up with somebody who doesn’t mind being your VR ‘spotter’).

Let’s draw something in VR

var layer = _.stage.withName("myLayer")

var f = new FLine()
f.moveTo(-1,-1,-5)
f.lineTo(1,-1,-5)
f.lineTo(1,1,-5)
f.lineTo(-1,1,-5)
f.lineTo(-1,-1,-5)

f.color = vec(1,0,1,1)
f.filled=true
layer.lines.f = f

This code draws a little magenta square centered on 0,0,-5. If we look in the right direction in the HMD, we’ll see it:
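
Nothing restricts a layer to a single square. As a quick variation (a sketch that assumes, as with most path representations, that each moveTo starts a new sub-path within the same FLine), this places a row of squares at increasing depths, which makes distance much easier to judge inside the headset:

var layer = _.stage.withName("myLayer")

var f = new FLine()

// five squares, receding from z=-5 to z=-13, all in one FLine
for (var i = 0; i < 5; i++) {
	var z = -5 - i * 2
	f.moveTo(-1, -1, z)
	f.lineTo(1, -1, z)
	f.lineTo(1, 1, z)
	f.lineTo(-1, 1, z)
	f.lineTo(-1, -1, z)
}

f.color = vec(1, 0, 1, 1)
f.filled = true
layer.lines.f = f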

How to get work done on a Mac (without a headset)

Field’s VR-capable stage will start up on a Mac just fine, except there will be no VR happening. Still, it works as a Stage. To get it as close to VR as possible, try code like this:

var layer = _.stage.withName("something or other")

// this sets up the camera and scale properly
layer.vrDefaults()

// drive the camera around with the keyboard 
// rather than your head
layer.makeKeyboardCamera()

var f = new FLine()
f.moveTo(0,1)
f.lineTo(10,10)
f.color=vec(1,1,1,1)
layer.lines.f = f

// explicitly redraw the canvas 
// (which you don't need to do with a headset that's always drawing)
_.stage.frame()

Remember, the geometry you’ve drawn might be behind you. And, likely, you’ll have to move things around once you move back to VR. Leave time for that, plan accordingly.

Interaction — knowing where the head and hands are

Layers also export a handful of functions, solely for interactive VR, for querying where the head and the hands are.

It’s worth closing your eyes and thinking, visually, for a moment about why, given that each layer has its own layer.camera, it’s important that all of these calls go through the layer rather than the _.stage. The answer is that they return positions that are ready to use for drawing an FLine in this layer, given the camera associated with the layer. Your head, in short, is not in the space of the layer camera, but the FLine is — thus the layer needs to help out with the math.
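
To make that concrete, here is the shape such code takes. The function name below is a hypothetical placeholder (use the layer's actual exported functions), but the pattern holds: ask the layer for a tracked position, then use it directly as FLine coordinates in that same layer.

var layer = _.stage.withName("myLayer")

// hypothetical placeholder call: ask the layer where the (left) hand is,
// already expressed in this layer's space
var hand = layer.leftHandPosition()

// because 'hand' is in layer space, it can be used directly as geometry
var f = new FLine()
f.moveTo(hand.x - 0.05, hand.y, hand.z)
f.lineTo(hand.x + 0.05, hand.y, hand.z)
f.color = vec(1, 1, 0, 1)
layer.lines.marker = f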

Field’s 3d linear algebra

If you have been working through the tutorials on this site diligently you’ll have seen code such as:

var v = vec(1,2)

or

var v = vec(1,2) * rotate(10)

or even

var v = vec(1,2) * rotate(10).pivot(40,50)
var f = myFLine * rotate(10)
var f = myFLine * scale(2)

These are snippets of code that manipulate the positions of things (including the positions of the pieces inside FLines). Most of this site is dedicated to drawing things in 2D, so all of these operations act on the ‘x’ and ‘y’ dimensions. But for VR it makes sense to manipulate things in 3D. Thus:

// make a new 3d vector
var v = vec(1,2,3) 

or

// rotation by 10 degrees clockwise around the y axis
var v = vec(1,2,3) * rotate3(10, vec(0,1,0))

or even

// rotation by 10 degrees clockwise around an axis pointing in the yz direction
var v = vec(1,2,3) * rotate3(10).pivot(0,1,1)
// rotation by 10 degrees clockwise around the z axis
var f = myFLine * rotate3(10, vec(0,0,1))
// scale by 2 in the x direction 4 in the y direction and 3 in the z direction
var f = myFLine * scale3(2,4,3)

The crucial point here is that to get full 3d rotations, translations and scales, you should use rotate3, translate3 and scale3 respectively.
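
These compose just like their 2d counterparts. A small sketch of chaining them (assuming translate3 takes x, y and z offsets, in the same style as scale3):

// scale a line up, spin it about the y axis, then push it away from the origin
var f2 = myFLine * scale3(2, 2, 2) * rotate3(45, vec(0, 1, 0)) * translate3(0, 0, -5)

// the same transforms apply to plain vectors
var corner = vec(1, 0, 0) * rotate3(90, vec(0, 0, 1))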

Simple VR spatialized audio — [not iOS]

Recent builds of Field include support for spatial audio with head position governed by tracking devices. This technology stack is based on Google’s Omnitone / Resonance Audio work, which enables “high order ambisonic” encoding, rendering, reverberation and decoding to binaural displays. Field provides a simple interface to this engine running in a webpage.

Having inserted a WebVR or VR-aware stage into the canvas, you’ll have access to a new variable _.space. This functions a lot like _.stage: it’s available everywhere. To get the spatial audio engine running, open a browser page at the usual spot: http://YOUR.IP.ADDRESS:8090/boot — the spatial audio engine is included in our WebVR engine, but it’s also available at this URL when running Oculus / Vive.

Like stage, _.space has an idea of a named layer that is created on demand:

var layer = _.space.withFile("/Users/marc/Documents/c3.wav")

That call will open the file and start streaming it to the browser(s). Given a layer you can do the following with it:

// start at the beginning
layer.play(0)

// start 10 seconds in
layer.play(10)

// pause
layer.pause()

// continue from where you left off
layer.play()

// move the sound to position 1,2,3
layer.position(1,2,3)

Additionally you can:

_.space.setRoomDimensions(4,5,6) // sets the size of the room that you are 'in'
_.space.setRoomMaterial('plywood') // makes the room sound as if it's made of plywood

See here for a list of materials.
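
Putting those pieces together, here is a short sketch (reusing the example file path from above) that describes a small plywood room, opens a file, starts it playing and places it in the room:

// describe the (virtual) room first
_.space.setRoomDimensions(4, 5, 6)
_.space.setRoomMaterial('plywood')

// open a sound file and start streaming it to the browser(s)
var voice = _.space.withFile("/Users/marc/Documents/c3.wav")

// start at the beginning, then place the source at (-1, 0, -2)
voice.play(0)
voice.position(-1, 0, -2)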

Ambisonic soundscape playback

For now, a quick hack to play back a Second Order Ambisonic (SoA) soundscape:

var layer = _.space.playSoASet("/Users/marc/Documents/back2_")

Loading Point clouds

Finally, there is a way of loading point clouds that have been exported from COLMAP via “Export as text file…”, or .ply files without triangles (such as you might get from MeshLab). This is the example code you want:

// secret invocation to enable point-cloud loading
var PointCloud = Java.type("trace.graphics.PointCloud")

// load a point cloud from a points3D.txt file somewhere
var points = new PointCloud("/Users/marc/Documents/colmapproject/points3D.txt")

// make a new FLine
var f = new FLine()

// make sure we actually draw the points
f.pointed=true
f.pointSize = 0.3

// put the points into the FLine
points.toFLine(f)

// make a layer for the points on a stage
var layer = _.stage.withName("somePoint")

// and add it
layer.lines.f = f

// and send it to the screen (and any cellphones)
_.redraw()
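
Point clouds from COLMAP usually arrive at an arbitrary scale and orientation, so they rarely land where you want them in VR. Since the points end up inside an ordinary FLine, the 3d transforms from earlier apply to them as well; a sketch, assuming translate3 takes x, y and z offsets:

// shrink the cloud and push it back into the scene
var placed = f * scale3(0.1, 0.1, 0.1) * translate3(0, 0, -2)

// set the point attributes again, in case they don't carry across the transform
placed.pointed = true
placed.pointSize = 0.3

// swap it in for the original and redraw
layer.lines.f = placed
_.redraw()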

Other extensions to Stage [ADVANCED]

Right now the only other extension to Field needed to let you build things in 3D is the ability to constrain layers to be visible only in particular eyes. This is useful in the case where you want to display different textures to different eyes (as in the stereo pair viewer). So, with code like this:

var KeyboardCamera = Java.type('field.graphics.util.KeyboardCamera')

// here are a pair of images
var layer1 = _.stage.withTexture("c:/Users/marc/Pictures/classPairs/09/left.jpg")
var layer2 = _.stage.withTexture("c:/Users/marc/Pictures/classPairs/09/right.jpg")

// you'll need to tweak this a bit
var z = -50

// aspect ratio of the image pair (placeholder value; set it to the width/height of your pictures)
var aspect = 1.5

layer1.lines.clear()
layer2.lines.clear()

// this constrains layer1 to be on the left eye and
layer1.sides=1
// layer 2 to be on the right
layer2.sides=2

// let's build a rectangle set back by '-z'
var f = new FLine()
f.moveTo(50-50*aspect, 0, -z)
f.lineTo(50+50*aspect, 0, -z)
f.lineTo(50+50*aspect, 100, -z)
f.lineTo(50-50*aspect, 100, -z)
f.lineTo(50-50*aspect, 0, -z)
layer1.bakeTexture(f)

f.filled=true
f.color = vec(1,1,1,1)
layer1.lines.f = f

// let's build another rectangle set back by '-z'
// (just in case we want it to be different)
var f = new FLine()
f.moveTo(50-50*aspect, 0, -z)
f.lineTo(50+50*aspect, 0, -z)
f.lineTo(50+50*aspect, 100, -z)
f.lineTo(50-50*aspect, 100, -z)
f.lineTo(50-50*aspect, 0, -z)
layer2.bakeTexture(f)
f.filled=true
f.color = vec(1,1,1,1)
layer2.lines.f = f

You can completely rebuild the original stereo pair viewer for Sketch 1. Of course, with Field there’s a lot more you can do: you can place those pairs anywhere you want (try one pair per box), put them in a timeline, vary their opacity, and so on.