Flower UI: A Gaze-Based Interaction Approach for Mobile VR

For a while now I have wrestled with the concept of a verb-based interaction system for mobile VR/AR that uses only gaze. The PC market currently has numerous peripherals for tracking movements and actions, with more arriving monthly, and they give us a multitude of ways to interact in VR. But the lure of mobile VR is the ease of setting up and consuming these experiences without being encumbered by those additional peripherals: a quick jump-in-and-play experience. It also feels like an opportunity has been missed; in the last few years gaze UI has remained largely the same.

A little bit of history:

Initially I came up with something simple called “Verbinator”, but the approach was flawed. It was far too linear and lacked the ability to empower a user to explore and interact with a virtual world. It also required the user to press a button, which I soon found through analytics on mobile VR was a stumbling block. Children of this experiment can be seen in the early versions of “You Are In A Maze” (the historical builds are available on Aptoide (YAIAM V1.00), so you can see how it evolved) and a few other unreleased experiments, like the one pictured below.

I looked back at a few classic games with interesting verb-based interaction systems, games where the mouse cursor is your eyes. You find this type of interface in the classic graphic adventure games from LucasArts, and in the relatively contemporary, critically acclaimed adventure RPG Anachronox from Ion Storm, designed by Tom Hall, in which your cursor is a living, breathing character called Fatima.

Then I looked at more modern point-and-click adventure titles and saw a movement toward contextual actions to perform on an object. From this came the idea that the interactions could not be as static as they had been in traditional point-and-click adventures, and that the Verbinator concept needed to evolve.

Where the solution stands:

The current solution plays as follows:

  • Gaze at an interactive object.
  • A contextual Flower UI element appears.
    • This is driven by settings on the interactive object.
  • Gazing away from the object, between the petals, closes the Flower UI element.
  • Gazing at a petal will either:
    • instantly start an action, or
    • show a radial countdown (common in gaze interfaces today) over the petal before the action occurs, so the user can abort by gazing away.
  • When an action is confirmed by gazing at a petal, one of two things happens:
    • The petal and flower close and the action takes place: handy for interactions that are a firm choice and will then entertain the player with a reaction.
    • Or the petal and flower remain open and you can continue to gaze at petals to make changes: handy for colour and character-creation changes that happen quickly.
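The petal flow above can be sketched as a small dwell-timer state machine. This is a hypothetical, engine-agnostic Python sketch, not the actual Unity scripts; names such as `DWELL_TIME` and the `Petal` class are assumptions for illustration.

```python
# Hypothetical sketch of the petal dwell/confirm logic described above.
# In the real Unity prefabs this would run per-frame from Update().

DWELL_TIME = 1.0  # seconds of sustained gaze before a petal confirms (assumed value)

class Petal:
    def __init__(self, action, instant=False, keep_open=False):
        self.action = action        # callback to run on confirmation
        self.instant = instant      # True: fire immediately, no radial countdown
        self.keep_open = keep_open  # True: flower stays open after the action
        self.dwell = 0.0            # current radial-countdown progress

    def update(self, gazed_at, dt):
        """Advance the dwell timer by dt seconds.

        Returns 'confirmed' states as 'closed' (flower closes) or 'open'
        (flower stays open, or no confirmation yet).
        """
        if not gazed_at:
            self.dwell = 0.0        # gazing away aborts the countdown
            return 'open'
        if self.instant:
            self.action()
            return 'open' if self.keep_open else 'closed'
        self.dwell += dt
        if self.dwell >= DWELL_TIME:
            self.dwell = 0.0
            self.action()
            return 'open' if self.keep_open else 'closed'
        return 'open'
```

An "instant" petal with `keep_open=True` models the colour/character-creation case, while the default models the one-shot, flower-closing case.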

Given its inspiration, the solution is currently not suited to fast-paced action games that require the player to dictate movement, defend and attack simultaneously. However, it would be a comfortable fit for management, base-building, farming, auto-play mobile RPGs, turn-based RPGs and adventure games; even clickers could become “gazers”.

You can still play an action arcade title that does not require body movement if firing at targets occurs on gaze rather than on a button press (as it does in “Shooty Blocks”).

The Future:

The next steps are to update the currently released experiments to use this input method for navigation, and to refine the Gaze Flower UI prefabs and scripts for release on the Unity Asset Store.

Gaze-based communication and text entry is challenging, but worth attempting. I won’t go too deep into this here, but while we wait for voice recognition to become more accurate we may need a gaze-based solution for correcting and composing our misheard words.

Further in the future:

A living, breathing “gaze-based cursor”. I’m calling it the “Bee”: a gaze cursor with more character to it. A cursor that is wisecracking like Cortana/Fatima/Siri but less digital than the current UI flower. Like a bird or a bee, it comes to rest on the flower rather than instantly landing on it, giving the UI solution a more delicate feel and presence. Essentially, the cursor becomes the visual embodiment of your mobile device’s AI within VR and AR spaces.

–Shane

 

Auspicious VadR Analytics

With the recent launch of the gaze-focused game “Shooty Blocks VR”, we have incorporated VadR’s unique telemetry for gathering information about where players are looking.

VadR’s implementation is almost as simple as setting up Unity’s own analytics platform:

  • Register on their website
  • Drag and drop the prefab into your scene
  • Update the prefab with your codes

To complement visualisation of the data, you can also import your scene into their web viewer to analyse the collected information in 3D space.

The aggregated gaze data so far resembles the shape of a rooster or chicken, which is quite auspicious given the Chinese calendar year. Beyond this, it also helps us understand how far up people are willing to tilt their heads while playing, and where they look the most.

TIP: In open environments, analytics platforms like this may not record coordinates for where people are looking unless there are invisible boundaries around the edges of your arena for the gaze ray to hit. In the above example I placed an invisible boundary ahead of the player, where the main action takes place. This way it captures idle gazes into the distance as well as active gazes at targets as they arrive on the scene.
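The reason the invisible boundary helps: a gaze ray needs geometry to intersect before an analytics SDK can log a 3D point, so a far plane in front of the player turns idle gazes into loggable coordinates. A minimal ray-plane intersection sketch of that idea (plain Python for illustration, not the VadR SDK; in Unity this would be a collider and a raycast):

```python
def gaze_hit_on_boundary(origin, direction, plane_point, plane_normal):
    """Intersect a gaze ray with an invisible boundary plane.

    Returns the 3D hit point to log, or None when the ray is parallel to
    the plane or the plane is behind the player (nothing to record).
    Vectors are (x, y, z) tuples.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray parallel to the boundary plane
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    if t < 0:
        return None  # boundary is behind the gaze direction
    return tuple(o + t * d for o, d in zip(origin, direction))

# Example: player at the origin gazing straight ahead, boundary 10 m away.
hit = gaze_hit_on_boundary((0, 0, 0), (0, 0, 1), (0, 0, 10), (0, 0, -1))
# → (0.0, 0.0, 10.0)
```

Without the boundary plane (no geometry ahead), the idle gaze simply never produces a hit point, which is why those coordinates vanish from the heat map.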

Shooty Blocks VR

A Ballistic Musical Experiment. Stop the invasion!

A simple gaze based ballistic music game.

Protect the desert mirage from the Catchy Blocks.

  • Gaze at white blocks to fire musical rockets at them
  • Gaze at ammo pick-ups when you are running low
  • Last for 5 days and nights in order to win
  • Post your high scores on Google Play
  • Earn achievements.

The third VR experiment from Booster Pack.

Made in Singapore

Available Now on Google Play.

Check out the Booster Pack Store on Aptoide

Aptoide enables us to reach audiences beyond those that Google Play can usually reach. It also gives you, as a consumer, a powerful way to select the version of the game you want to play. For instance, if we changed something in one build and you were not 100% happy with the change, you can roll back. Or, if you are curious about how the game looked on day 1, you can go and have a look at a time capsule of that build.

Check out our store on Aptoide here.

Sky Marshall VR

Sky Marshall, stand on the bridge of your space cruiser and defend against an incoming attack from an unknown enemy.

The thousand years of peace have come to an abrupt halt; all across the galaxy a new enemy has emerged. Will you survive the first wave?

  • A simple science fiction bridge simulation
  • Launch Defensive Drones
  • Charge and use your mega weapon
  • Protect yourself by activating your shields
  • Low on energy? Watch a video from the intergalactic fleet to get a recharge (sorry, no 360 advert providers at this time)
  • Earn achievements.

Sky Marshall VR is the second experimental VR game from Booster Pack; we are attempting to find what sticks when it comes to mobile VR game making.

Contribute and shape the game with your feedback and reviews.

Your choices in game will also shape how the game continues to develop.

Made in Singapore.

Available Now on Google Play.