Virtual reality with React: Moving from React VR to React 360

Recently, I built a virtual reality application called Find Your Zen, originally written in React VR and now ported over to React 360, which lets the user choose an immersive meditation environment, each of which comes with its own mantra inspired by the show “The Good Place”.

If you want an introduction to React VR (as well as Recompose), you can find that here and here.

As I ported my application over to React 360, I made note of the significant differences between the libraries.

Viewing the finished demo code

$ git clone https://github.com/lilybarrett/find-your-zen.git
$ cd find-your-zen
$ npm i
$ npm start

File structure

The basic file structure for React VR was as follows:

  • index.vr.js = entry point for your app
  • vr folder = stores the code that launches your app, includes index.html and client.js files
  • static_assets = stores images, audio files, and other external resources

And here’s the new file structure for React 360:

  • index.js = entry point for your app
  • client.js = sets up the “runtime,” which turns our React components into 3D elements in our VR landscape
  • index.html = as in a typical React application, provides a place for you to mount your React code
  • static_assets = stores images, audio files, and other external resources

I set up the rest of my folder structure as follows:

- components // shared components
  - base-button
  - content 
- consts 
- providers
  // Recompose providers live here 
- scenes
  - home-environment
    - components
      - menu
      - title 
      - zen-button
      - zens 
  - zen-environment
    - components
      - home-button
      - mantra 
- static_assets
  - images 
  - sounds 

Shared components live in the top-level components folder. Stored in scenes, the HomeEnvironment scene — the first environment to load, where the user accesses a menu of meditation environments to explore — and the ZenEnvironment scene each have their own sets of relevant components. My state management is handled by Recompose providers, functionally composed into each component that needs access to state.

Mounting the app

In React VR, our client.js was pretty simple and didn’t give us too many configuration options:

// React VR application -- vr/client.js
// Auto-generated content.
// This file contains the boilerplate to set up your React app.
// If you want to modify your application, start in "index.vr.js"

// Auto-generated content.
import {VRInstance} from "react-vr-web";

function init(bundle, parent, options) {
  const vr = new VRInstance(bundle, "MeditationApp", parent, {
    cursorVisibility: "auto",
    // Add custom options here
    ...options,
  });
  vr.render = function() {
    // Any custom behavior you want to perform on each frame goes here
  };
  // Begin the animation loop
  vr.start();
  return vr;
}

window.ReactVR = {init};

In React 360, we can mount our application’s content to a surface or a location. Surfaces, as the docs say, “allow you to add 2D interfaces in 3D space, letting you work in pixels instead of physical dimensions.” In my case, I wrap the visual content of my application in an AppContent component, which I mount to React 360’s default, cylindrical surface. This surface projects the content onto the inside of a cylinder, centered in front of the user, with a 4-meter radius.

You can create your own custom surfaces in React 360, increasing or decreasing the radius or making the surface flat rather than cylindrical.
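
For instance, here’s a minimal sketch of creating and mounting a custom flat surface in client.js (the pixel dimensions and the “MyPanel” root are illustrative, not from my app):

// a sketch -- "MyPanel" and the dimensions are illustrative
import { ReactInstance, Surface } from "react-360-web";

function init(bundle, parent, options = {}) {
  const r360 = new ReactInstance(bundle, parent, { ...options });

  // an 800x600-pixel flat panel instead of the default cylinder
  const flatPanel = new Surface(800, 600, Surface.SurfaceShape.Flat);
  // angle the panel relative to the user: yaw and pitch, in radians
  flatPanel.setAngle(0, -0.3);

  r360.renderToSurface(r360.createRoot("MyPanel"), flatPanel);
}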

I also mount the entire app itself to React 360’s default location, which allows my app to take advantage of React 360’s runtime.

The new runtime is one of React 360’s significant advantages over React VR. Why? Separating the rendering or “runtime” aspects of the application from the application code reduces latency: the time between a user action and the moment the pixels in the view update in response to that action. If the data transfer is too slow, the result is a choppy, disorienting view for the user — similar to buffering on a YouTube video or static on a television screen.

As the React 360 docs further explain, web browsers are single-threaded, which means that while part of the app updates behind the scenes, that work can block or slow data transfer. “This is especially problematic for users viewing your 360 experience on a VR headset, where significant rendering latency can break the sense of immersion,” the docs tell us. “By running your app code in a separate context, we allow the rendering loop to consistently update at a high frame rate.”

In my index.js, I register my MeditationApp to mount to the default location — giving my entire application access to the runtime — while I register the content I want to display (again, stored in AppContent) to the default cylindrical surface.

// components/content.js
import React from "react";
import { View } from "react-360";
import { HomeEnvironment, ZenEnvironment } from "../../scenes";
import { withAppContext } from "../../providers";

const AppContent = withAppContext(() => (
   <View>
        <HomeEnvironment />
        <ZenEnvironment />
   </View>
));

export default AppContent;

// index.js
import React from "react";
import {
  AppRegistry,
  View,
} from "react-360";
import { AppContent } from "./components";
import { withAppContext } from "./providers";

const MeditationApp = withAppContext(() => (
    <View style={{
      transform: [{ translate: [0, 0, -2] }]
    }}>
      <AppContent />
    </View>
));


AppRegistry.registerComponent("AppContent", () => AppContent);
AppRegistry.registerComponent("MeditationApp", () => MeditationApp);

My client.js deals with mounting my component to locations and surfaces:

// client.js
import { ReactInstance, Surface } from "react-360-web";

function init(bundle, parent, options = {}) {
  const r360 = new ReactInstance(bundle, parent, {
    fullScreen: true,
    // Add custom options here
    ...options,
  });

  r360.renderToSurface(
    r360.createRoot("AppContent", { /* initial props */ }),
    r360.getDefaultSurface()
  );

  r360.renderToLocation(
    r360.createRoot("MeditationApp", { /* initial props */ }),
    r360.getDefaultLocation(),
  );

  r360.compositor.setBackground(r360.getAssetURL("images/homebase.png"));
}

window.React360 = {init};

Playing audio

In React VR, we used a Sound component, which took in a URL for a sound file in the static_assets folder as a source prop. To prevent audio from playing in certain environments, I implemented logic via Recompose for “hiding” and “showing” the Sound component based on whether the selected environment had an audio file associated with it.

// React VR -- components/audio.js
import React from "react";
import { Sound, asset } from "react-vr";
import zens from "../consts/zens.js";
import { compose } from "recompose";
import { hideIf, usingAppContext } from "../providers/index.js";

const hideIfNoAudioUrl = hideIf(({ selectedZen }) => {
    const zenAudio = zens[selectedZen - 1].audio;
    return zenAudio === null || zenAudio === undefined || zenAudio.length === 0;
});

export default compose(
    usingAppContext,
    hideIfNoAudioUrl,
)(({ selectedZen }) => {
    const zenAudio = zens[selectedZen - 1].audio;
    return (
        <Sound source={asset(zenAudio)} />
    )
});

React 360 improves upon this. For playing audio, we use the AudioModule Native Module. Its playEnvironmental method takes a path to the audio in our assets folder and a volume, and plays the file on a loop: once the audio file finishes, it starts again.

One thing to keep in mind is that you’ll need to tell your application to stop playing a particular audio file when you switch scenes. (Otherwise, as in Find Your Zen, you may wind up listening to audio from your previous environment — e.g., church bells in a city square in Paris — after you navigate back to the home environment.) You can do this via the AudioModule’s stopEnvironmental method.
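
Here’s a minimal sketch of both calls (the “sounds/waves.wav” file is hypothetical):

// a sketch -- "sounds/waves.wav" is a hypothetical file in static_assets
import { NativeModules, asset } from "react-360";
const { AudioModule } = NativeModules;

// play the file on a loop at 30% volume
AudioModule.playEnvironmental({
  source: asset("sounds/waves.wav"),
  volume: 0.3,
});

// later, when switching scenes, stop the looping track
AudioModule.stopEnvironmental();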

Using images

In React VR, we used a Pano component to display a 360-degree photo. To display a specific image, Pano, like Sound, took in an asset URL as a source prop. Based on which environment the user selected, the state of the app updated to display an image for that environment.

// React VR -- components/wrapped-pano.js
import React from "react";
import { Pano, asset } from "react-vr";
import { usingAppContext } from "../providers/index.js";
import { Audio } from "../components/index.js";
import zens from "../consts/zens.js";

export default usingAppContext(({ selectedZen }) => {
    return (
        <Pano source={asset(zens[selectedZen - 1].image)} >
            <Audio />
        </Pano>
    )
});

You may have noticed that, in my React 360 application’s client.js, I write the following line after rendering my application’s components:

r360.compositor.setBackground(r360.getAssetURL("images/homebase.png"));

This line of code, which sets the background image as soon as the app mounts, uses React 360’s getAssetURL utility to look inside our static_assets folder for the correct image.

That’s all well and good, but keep in mind you’ll eventually want to change the image based on which environment the user selects. You can handle dynamic images from within a React event by using React 360’s Environment module. Example:

Environment.setBackgroundImage(asset(someImage));

And, to pull it all together, here’s how I dynamically set my background image and audio based on which environment the user selects, using Recompose’s withState and withHandlers functions:

// providers/withStateAndHandlers.js
import React from "react";
import { withState, withHandlers, compose } from "recompose";
import { Environment, asset, NativeModules } from "react-360";
const { AudioModule } = NativeModules;
import { zens } from "../consts";

const withStateAndHandlers = compose(
    // "selectedZen" is the state value; "zenClicked" is its setter (initial zen: 4)
    withState("selectedZen", "zenClicked", 4),
    withHandlers({
        // this handler shadows the setter prop of the same name; inside it,
        // props.zenClicked still refers to the setter created by withState above
        zenClicked: (props) => (id, evt) => {
            Environment.setBackgroundImage(asset(zens[id - 1].image));
            if (zens[id - 1].audio !== null && zens[id - 1].audio !== undefined) {
                AudioModule.playEnvironmental({
                    source: asset(zens[id - 1].audio),
                    volume: 0.3,
                });
            } else {
                // no audio for this zen, so silence any still-looping track
                AudioModule.stopEnvironmental();
            }
            props.zenClicked(selectedZen => id);
        }
    }),
);

export default withStateAndHandlers;
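
To use the provider, I functionally compose it into any component that needs the shared state. Here’s a hypothetical usage sketch (ZenMenu is an illustrative name, not a component from the actual codebase):

// hypothetical usage sketch -- ZenMenu is illustrative, not from the real app
import React from "react";
import { View } from "react-360";
import withStateAndHandlers from "../providers/withStateAndHandlers";

const ZenMenu = withStateAndHandlers(({ selectedZen, zenClicked }) => (
    <View>
        {/* each button calls zenClicked(id) to swap the environment */}
    </View>
));

export default ZenMenu;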

Styling the app

React 360, like React VR, uses Flexbox to easily adapt the application’s layout to any display, whether that’s a laptop browser, a phone screen, or a VR headset. However, for parts of the application mounted to a location — like MeditationApp in my case — React 360 switches from Flexbox layout to a three-dimensional, meter-based coordinate system. That’s why you see this code in my index.js:

// index.js

// other code goes here
const MeditationApp = withAppContext(() => (
    <View style={{
      transform: [{ translate: [0, 0, -2] }]
    }}>
      <AppContent />
    </View>
));
// other code goes here

The values passed into transform are x, y, and z, in that order. x positions an object to the user’s right (negative values move it to the left); y positions it up or down; and z controls the perceived distance from the user, with negative values moving the object forward into the scene, in front of the user.

In the example above, the View should be in the center and 2 perceived meters ahead of the user.

Transforms are all positioned relative to their parents.
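
To make that concrete, here’s a hypothetical nesting (not from my app): the child’s translation composes with its parent’s, so the inner View ends up a perceived 3 meters in front of the user.

// hypothetical nesting -- child transforms compose with the parent's
<View style={{ transform: [{ translate: [0, 0, -2] }] }}>
    <View style={{ transform: [{ translate: [0, 0, -1] }] }}>
        {/* rendered 3 meters ahead: -2 from the parent, plus -1 here */}
    </View>
</View>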

Practices that worked well for me

StyleSheets

React 360’s StyleSheet API, borrowed from React Native, allows us to use JavaScript to pass styling attributes to our React components. As an example:

// scenes/home-environment/components/zen-button/style.js
import { StyleSheet } from "react-360";

export default StyleSheet.create({
    text: {
        backgroundColor: "#29ECCE",
        textAlign: "center",
        color: "white",
        marginTop: 30  
    }
})

Here, we create and export a StyleSheet object that allows us to reference styles in a terse, DRY manner in our component itself:

// scenes/home-environment/components/zen-button/index.js
import React from "react";
import { BaseButton } from "../../../../components";
import style from "./style";

const ZenButton = ({ text, buttonClick, selectedZen }) => {
  return (
    <BaseButton
      text={text}
      selectedZen={selectedZen}
      buttonClick={buttonClick}
      textStyle={style.text}
    />
  )
}

export default ZenButton;

State management

Because, at the end of the day, this is still just React, you can approach handling state in the same way you would in a typical React application: Redux, Recompose, MobX, etc. I chose to use Recompose because I love how it allows me to build functional components. I wrote some posts about Recompose in the context of React VR, which you can find here and here. I did not need to change anything about my state management approach when porting my application from React VR over to React 360.

Debugging React 360

When you Inspect Element, you’ll see that React 360 bundles all its files into one giant blob that isn’t super easy to grok. Fortunately, because it supports source maps, we can still access the original files, use debugger statements, etc.
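
For instance, dropping a debugger statement into a component (ZenButton here, though any component works) pauses DevTools in the original, pre-bundled file:

// a sketch -- add a debugger statement to any component, e.g. zen-button/index.js
const ZenButton = ({ text, buttonClick, selectedZen }) => {
  // with source maps, DevTools pauses here in the original file, not the bundle
  debugger;
  return (
    <BaseButton
      text={text}
      selectedZen={selectedZen}
      buttonClick={buttonClick}
    />
  );
};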

