This page explains some of the VR support in Field.
Field has extensive support for 2d and 3d drawing, which typically takes place inside a
Stage — a place on the canvas (the main window), or a separate ‘fullscreen’ window, that acts as a container for geometry. Geometry is typically fashioned out of
FLines — objects that, in turn, support drawing instructions describing lines (which might be filled in to create solid areas).
To draw in VR in Field is to draw onto a
Stage that just happens to have its contents visible in VR. By supporting the OpenVR SDK, Field can render directly to the VR headset.
Since Field is ‘merely’ extending its drawing
Stage to appear in VR, you’ll absolutely want to become comfortable with the documentation for
Stage before proceeding much further. Although the
Stage is primarily documented as if it were oriented towards 2d drawing, Field secretly draws everything in 3d anyway.
This page also collects, for now, miscellaneous VR related 3d things in Field.
Field’s direct VR support is the most efficient and richest route (and the most hardware demanding — it only actually works on Windows, although Field is happy to pretend on OS X). To start drawing on a VR headset, simply insert a ‘VR’-capable stage into your sheet. You can grab one from the ‘usual place’, the command palette (ctrl-space), followed by ‘insert from workspace’:
The current one is called ‘graphics.stage.VR’:
That will give you a new Stage. Upon doing that your Oculus or Steam app will start up (and possibly try to sell you something). You’ll no doubt want to open the Steam VR settings and ask it to show you what’s being displayed on the headset (or, pair up with somebody who doesn’t mind being your VR ‘spotter’).
Let’s draw something in VR
This code draws a little magenta square centered on 0,0,-5. If we look in the right direction in the HMD, we’ll see it:
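The original code isn’t reproduced here; as an illustrative stand-in that runs outside Field, this sketch builds the geometry of such a square in plain JavaScript — the `Path` recorder with `moveTo`/`lineTo` is a hypothetical analogue of an FLine, not Field’s API:

```javascript
// Minimal stand-in for an FLine-style path recorder (hypothetical, not Field's API).
class Path {
  constructor() { this.points = []; }
  moveTo(x, y, z) { this.points.push([x, y, z]); return this; }
  lineTo(x, y, z) { this.points.push([x, y, z]); return this; }
}

// A 1x1 magenta square centered on (0, 0, -5) — that is, 5 units "into the
// screen" along -z, facing back towards the default viewer.
const s = 0.5;                        // half the side length
const square = new Path()
  .moveTo(-s, -s, -5)
  .lineTo( s, -s, -5)
  .lineTo( s,  s, -5)
  .lineTo(-s,  s, -5)
  .lineTo(-s, -s, -5);                // close the loop

const color = [1, 0, 1, 1];           // RGBA magenta, the fill/stroke color

console.log(square.points.length);    // 5 — four corners plus the closing point
```

In Field itself you would record the same five points into an FLine, give it the magenta color, and hand it to the VR stage’s layer to draw.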
Field’s VR-capable stage will start up on a Mac just fine, except there will be no VR happening. Still, it works as a
Stage. To get it as close to VR as possible, try code like this:
Remember, the geometry you’ve drawn might be behind you. And, likely, you’ll have to move things around once you move back to VR. Leave time for that, plan accordingly.
Layers also export the following functions, solely for interactive VR:

- layer.vrViewerPosition() — a Vec3: the position of the inside of the viewer’s head. Note that drawing things here is a recipe for confusion! You might want to offset them by:
- layer.vrGazeDirection() — a Vec3 that points along your nose.
- layer.vrViewerPosition() + layer.vrGazeDirection() is right in front of (and between) your eyes.
- a Vec3 giving where the hand controller is for the left hand. It ‘points’, as if it were a gun, along:
- a Vec3 giving the direction in which the left hand controller is ‘pointing’.
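To make the “offset along your gaze” idea concrete, here is the vector math as a standalone JavaScript sketch — the `add`/`scale` helpers and the constant stand-ins for `vrViewerPosition()`/`vrGazeDirection()` are assumptions for illustration, not Field’s Vec3 API:

```javascript
// Tiny Vec3 helpers (hypothetical stand-ins; Field has its own Vec3 class).
const add = (a, b) => [a[0] + b[0], a[1] + b[1], a[2] + b[2]];
const scale = (a, s) => [a[0] * s, a[1] * s, a[2] * s];

// Pretend values for layer.vrViewerPosition() and layer.vrGazeDirection():
const viewer = [0, 1.6, 0];   // head roughly 1.6 units above the floor
const gaze = [0, 0, -1];      // looking straight ahead, down -z

// Drawing at `viewer` itself puts geometry *inside* your head — confusing.
// Offset along the gaze direction instead:
const inFront = add(viewer, scale(gaze, 2));
// inFront is [0, 1.6, -2]: 2 units in front of (and between) the eyes.
```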
It’s worth closing your eyes and thinking, visually, for a moment about why, given that each layer has its own
layer.camera, it’s important that all of these calls go through
layer rather than the
_.stage. The answer is that these are returning positions that are ready to use to draw an
FLine in this layer given the camera associated with the layer. Your head, in short, is not in the space of the layer camera, but the
FLine is — thus the layer needs to help out with the math.
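As a sketch of why the layer must “help out with the math”: a tracked head position arrives in world space and has to be carried into the layer’s camera space before it can be used as FLine coordinates. Assuming, to keep the example tiny, a camera that only translates (real cameras also rotate), that transform is just a subtraction:

```javascript
// World-space head position, as a tracker might report it.
const headWorld = [0.25, 1.5, 0.5];

// A per-layer camera that is simply translated (translation-only is an
// assumption here to keep the sketch short; real cameras also rotate).
const cameraPosition = [0, 1, 2];

// To draw an FLine "at the head" in this layer, the world position must be
// re-expressed relative to this layer's camera — the job that going through
// `layer` (rather than `_.stage`) does for you in Field.
const headInLayer = headWorld.map((v, i) => v - cameraPosition[i]);
// headInLayer is [0.25, 0.5, -1.5] — different layers, different cameras,
// different answers for the "same" head.
```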
If you have been working through the tutorials on this site diligently, you’ll have seen code such as:
These are snippets of code that manipulate the positions of things (including the positions of pieces inside
FLines). Most of this site is dedicated to drawing things in 2d, so all of these operations act on the ‘x’ and ‘y’ dimensions. But for VR it makes sense to manipulate things in 3d. Thus:
The crucial point here is that to get full 3d rotations, scales and transformations you should use
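For instance, a 2d rotate only ever turns things in the x/y plane, whereas in 3d you rotate about an axis. This plain-JavaScript sketch (illustrative only — not Field’s transformation API) rotates a point a quarter turn about the y axis:

```javascript
// Rotate a 3d point about the y axis by `angle` radians.
// In 2d you'd only ever rotate in the x/y plane; in 3d the axis matters.
function rotateY([x, y, z], angle) {
  const c = Math.cos(angle), s = Math.sin(angle);
  return [c * x + s * z, y, -s * x + c * z];
}

const p = [0, 0, -5];                // straight ahead of the viewer
const q = rotateY(p, Math.PI / 2);   // quarter turn about y
// q is now (to floating-point precision) [-5, 0, 0]: what was in front
// of you is now off to your left.
```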
Recent builds of Field include support for spatial audio, with head position governed by tracking devices. This technology stack is based on Google’s Omnitone / Resonance Audio work, which enables “higher-order ambisonic” encoding, rendering, reverberation, and decoding to binaural displays. Field provides a simple interface to this engine running in a webpage.
Having inserted a WebVR or VR-aware stage into the canvas, you’ll have access to a new variable,
_.space. This functions a lot like
_.stage: it’s available everywhere. To get the spatial audio engine running, open a browser webpage at the usual spot:
http://YOUR.IP.ADDRESS:8090/boot — the spatial audio engine is included in our WebVR engine, but also available at this url when running Oculus / Vive.
_.space has an idea of a named layer that is created on demand:
That call will open the file and start streaming it to the browser(s). Given a
layer you can do the following with it:
Additionally you can:
See here for a list of materials.
For now, a quick hack to playback a Second Order Ambisonic (SoA) soundscape:
Finally, a way of loading point-clouds that have been exported from COLMAP as “Export as text file…” or .ply files without triangles (such as you might get from MeshLab). This is the example code you want:
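The example code referenced above isn’t reproduced here, but for orientation, this standalone sketch parses a minimal ASCII .ply vertex list without triangles, of the kind MeshLab can export. It is illustrative only — not Field’s loader — and a real loader should honor the full property list declared in the header:

```javascript
// Parse a tiny ASCII .ply that contains only vertices (no faces),
// returning an array of [x, y, z] points. Illustrative only — assumes
// x, y, z are the first three properties of each vertex line.
function parsePlyPoints(text) {
  const lines = text.trim().split("\n").map(l => l.trim());
  const end = lines.indexOf("end_header");
  const countLine = lines.find(l => l.startsWith("element vertex"));
  const count = parseInt(countLine.split(/\s+/)[2], 10);
  return lines.slice(end + 1, end + 1 + count)
              .map(l => l.split(/\s+/).slice(0, 3).map(Number));
}

const ply = `ply
format ascii 1.0
element vertex 2
property float x
property float y
property float z
end_header
0 0 -5
1 2 3`;

// → two points: [0, 0, -5] and [1, 2, 3]
const points = parsePlyPoints(ply);
```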
Right now, the only other extension to Field needed to let you build things in 3d is the ability to constrain layers to be visible only in particular eyes. This is useful when you want to display different textures to different eyes (as in the case of the stereo pair viewer). So with code like this:
You can completely rebuild the original stereo pair viewer for Sketch 1. Of course, with Field there’s a lot more you can do: you can place those pairs anywhere you want (try one pair per box), in a timeline, with varying opacity, and so on.