Facebook AR Studio – How to: Use tap gestures to change effect materials
Recently I was given the opportunity to spend some time playing with Facebook’s awesome AR (Augmented Reality) Studio. I worked through Facebook’s quick start tutorial and created a mustache effect in no time. I was immediately hooked; it was all so easy to use. AR Studio does the heavy lifting for the face tracking, so all you really need is a texture to make your first filter. After the first couple of basic filters, though, I wanted to make something more dynamic: I wanted to change the filter on tap, and suddenly my progress screeched to a halt. The documentation no longer had what I needed. The tutorials showed how to add scripts, but not how to change materials or bind events to specific objects. After hours of experimentation I was able to solve it, so here is my guide to creating a filter effect that updates a material with touch events.
Facebook Quick Start Guide Recap
If you haven’t completed it, or the tasks below (minus the mustache texture) don’t make sense to you, then you should check out Facebook’s quick start guide first.
- Create new project
- Insert new face mesh
- Add the mustache texture
- Add a new material
- Set the material shader type to face paint with the mustache texture
- Set the face mesh to use the new material
- Basic effect complete!
Alright, you should have something that looks like this:
Let’s Ramp it up
Since the goal here is to use tap events to change materials, we’re going to need some more materials with textures. I created a rough Pennywise (from Stephen King’s IT) and a lipstick texture to use for the demo. With those in hand, we need to:
- Create a new project or add to the basic tutorial project
- Add two new textures, in my case pennywise and lipstick
- Add two new materials, both with face paint shaders, each assigned one of the new textures
- Test the new materials by swapping the material on the face mesh, making sure each individual effect looks good
Alright, so now my effects are as follows (plus the original mustache effect):
Okay, hopefully you’ve made it this far, because this is where I started to run into issues.
Adding buttons to trigger the various effects
Let’s add some buttons so the user can select an effect from the choices and also maybe the ability to turn them off.
Also worth noting: the goal here is not a refined look, just a functional demo.
- Add four rectangles to the scene, by default they will all be stacked in the center of the display. I aligned them each to a corner for ease.
- Create four new materials, one for each of the new “buttons” (the rectangles)
- Add four textures to the project to use as a skin for the buttons, anything will do. I used some screen clippings of icons.
Below is my version with the face mesh turned off and the buttons skinned
Let’s write that script so they can start doing some work!
Adding the script to put it all together
Our script will need to be able to target each of our buttons, the face mesh, and our materials.
In order to use touch gestures in your effect, they need to be enabled, which is done through the Project menu, then Properties, as seen below.
Below is a screenshot of my incomplete script. I wanted to go over a few key details about it to make sure you can follow along.
- Items in the project all follow a hierarchy from the Scene root, but in reality all of our objects sit under Focal Distance, so I declare that directory as my “base”. Also be mindful of names: lookups are case sensitive, and objects will not be found unless the names match exactly. The added lines show the files on the left being referenced by the script.
- I am only printing to the console on press. My goal was for this to be a simple testing step and also a way to show the diagnostic tool in action and how simple it is to print values to the console.
This is the preliminary script for testing the touch events and seeing the diagnostics in action. Note that if you copy it, you must make sure your references all use the correct names, or you will see errors in the console at runtime.
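In case the screenshot is hard to read, here is a rough sketch of what my preliminary script looks like. The object names (facemesh0, plane0 through plane3) are placeholders from my scene; swap in whatever yours are called in the Scene panel, matching case exactly.

```javascript
// Preliminary script: wire up tap events and log to the console.
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');
const Diagnostics = require('Diagnostics');

// Names are case sensitive and must match the Scene panel exactly.
const faceMesh = Scene.root.find('facemesh0');
const mustacheButton = Scene.root.find('plane0');
const pennywiseButton = Scene.root.find('plane1');
const lipstickButton = Scene.root.find('plane2');
const offButton = Scene.root.find('plane3');

// For now, just print on press so we can confirm each tap fires.
TouchGestures.onTap(mustacheButton).subscribe(function () {
  Diagnostics.log('mustache button tapped');
});
TouchGestures.onTap(pennywiseButton).subscribe(function () {
  Diagnostics.log('pennywise button tapped');
});
TouchGestures.onTap(lipstickButton).subscribe(function () {
  Diagnostics.log('lipstick button tapped');
});
TouchGestures.onTap(offButton).subscribe(function () {
  Diagnostics.log('off button tapped');
});
```

Tap each button in the simulator and watch the console in the diagnostics panel; if a name is wrong you will see an error there instead of your log line.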
The final step: connecting the events
Looking at the final version of my script below, you can see that it doesn’t take much more code to make the dynamic changes. All we needed to do was pull in the Materials module, set the material on the face mesh on each tap, and toggle overall effect visibility.
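Sketched out, the finished script looks something like the following. As before, the object names are placeholders from my project, and the material names (mustacheMaterial, pennywiseMaterial, lipstickMaterial) are whatever you called your three face paint materials.

```javascript
// Final script: swap the face mesh material on tap.
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');
const Materials = require('Materials');

const faceMesh = Scene.root.find('facemesh0');
const mustacheButton = Scene.root.find('plane0');
const pennywiseButton = Scene.root.find('plane1');
const lipstickButton = Scene.root.find('plane2');
const offButton = Scene.root.find('plane3');

// Each effect button assigns its material and makes sure the mesh is visible.
TouchGestures.onTap(mustacheButton).subscribe(function () {
  faceMesh.material = Materials.get('mustacheMaterial');
  faceMesh.hidden = false;
});
TouchGestures.onTap(pennywiseButton).subscribe(function () {
  faceMesh.material = Materials.get('pennywiseMaterial');
  faceMesh.hidden = false;
});
TouchGestures.onTap(lipstickButton).subscribe(function () {
  faceMesh.material = Materials.get('lipstickMaterial');
  faceMesh.hidden = false;
});

// The fourth button turns the effect off by hiding the face mesh.
TouchGestures.onTap(offButton).subscribe(function () {
  faceMesh.hidden = true;
});
```

Hiding the face mesh rather than removing the material keeps the toggle simple: any effect button brings it straight back.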
We made it, and it was easy… right?
Hopefully your app looks like the one below… actually I hope it looks better.