Maker Faire at TheCO: Post-Mortem

On October 31st, the Memphis Game Developers were invited up to Jackson, TN for the first annual Maker Faire at TheCO. This time around it was only Ernest McCracken and I showing off games. He was showing the latest version of Fallen Space: Genesis, and I was showing off a technical demo of a Point & Click engine we had developed using the Oculus Rift and a Leap Motion controller.

This was the first technical demo of this engine I had shown to people, so I was looking for some raw feedback on it. The way the engine works is that it casts an invisible ray from the player’s eye, through their right fingertip, onto any object that can be “clicked” on. When the player hovers over something, they see a blue sphere appear on the object. They then reach forward and “tap” to activate it. The tap is detected by a sphere collider wrapped around the player, which registers the collision and sends the “click” event. Originally, I had the raycast extend from the player’s fingertip so they could naturally point at things. Unfortunately, when you point, your hand tends to obscure your finger, and the Leap Motion would have to make a “best guess” at what you were pointing at.
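
To make the hover step concrete, here’s a minimal sketch of that raycast. It assumes a Unity setup (the engine’s collider terminology suggests Unity), and the names here — HoverPointer, eyeTransform, fingertipTransform, indicatorSphere — are illustrative, not our actual code:

```csharp
using UnityEngine;

// Sketch: cast a ray from the eye through the right fingertip and see if
// it lands on anything "clickable". All names are hypothetical.
public class HoverPointer : MonoBehaviour
{
    public Transform eyeTransform;        // HMD eye anchor from the Rift rig
    public Transform fingertipTransform;  // right index fingertip from the Leap Motion
    public GameObject indicatorSphere;    // the blue orb shown on hovered objects
    public LayerMask clickableLayers;     // restrict hits to "clickable" objects

    // The object the ray is currently hovering over, if any.
    public Transform CurrentTarget { get; private set; }

    void Update()
    {
        // Direction from the eye through the fingertip.
        Vector3 dir = (fingertipTransform.position - eyeTransform.position).normalized;

        RaycastHit hit;
        if (Physics.Raycast(eyeTransform.position, dir, out hit, 100f, clickableLayers))
        {
            // Show the blue indicator on the hovered object.
            CurrentTarget = hit.transform;
            indicatorSphere.SetActive(true);
            indicatorSphere.transform.position = hit.point;
        }
        else
        {
            CurrentTarget = null;
            indicatorSphere.SetActive(false);
        }
    }
}
```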

An immediate problem I noticed was that some of the mesh colliders fit their objects too exactly, so players’ rays would miss them and they’d never see the click indicator. I fixed this by swapping out the mesh colliders for sphere or box colliders where necessary, and making them much larger than the object. The other issue was “tapping” on objects themselves. To reiterate: players would hover their finger over an object, and if it was “clickable,” a blue orb would appear on it; they would then reach out and tap on the sphere around them to click. Some players had no problems with this, but others would start the tap with their hand already outside the sphere, or stay inside it and not reach far enough. Once I showed them how, it would “click” for them and they would catch on. The general feedback was confusion about which sphere they were supposed to tap on, as both the collider sphere and the indicator sphere were blue. I came up with a solution for this, but didn’t have time to implement it at the Maker Faire, so I focused on better education instead. (I’ll cover this in our next Dev Blog post.)
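
The tap itself boils down to something like the following sketch, again assuming Unity; TapSphere, the HoverPointer reference, and the “Fingertip” tag are hypothetical names for illustration:

```csharp
using UnityEngine;

// Sketch: a trigger sphere wrapped around the player fires the "click"
// when the fingertip reaches out and touches it. The sphere collider must
// be marked as a trigger, and the fingertip object needs its own collider
// (with a Rigidbody on one side) for trigger events to fire.
[RequireComponent(typeof(SphereCollider))]
public class TapSphere : MonoBehaviour
{
    public HoverPointer pointer;  // whatever the hover raycast is currently over

    void OnTriggerEnter(Collider other)
    {
        // Only respond to the tracked fingertip, not stray colliders.
        if (other.CompareTag("Fingertip") && pointer.CurrentTarget != null)
        {
            // Route the "click" event to the hovered object.
            pointer.CurrentTarget.SendMessage("OnClicked", SendMessageOptions.DontRequireReceiver);
        }
    }
}
```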

The only other problem we ran into was environmental: TheCO was too nice! They had freshly waxed the floors, all the desks and screens were polished, and they had some nice chrome bar stools near us. The Leap Motion uses IR sensors to detect your hands, and the reflective surfaces were playing havoc with it. I was able to code in a quick input to recenter the player, and as you can see in the header image, I had players face away from the reflective surfaces as much as possible to avoid IR junk data. In our development this was never a real issue, and when we did A/B testing the next week, we had zero problems.
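
The recenter input is tiny; here’s a sketch assuming Unity’s built-in VR support (the actual build may have used the Oculus SDK’s own recenter call, and the R key binding is just for illustration):

```csharp
using UnityEngine;
using UnityEngine.VR;  // UnityEngine.XR in newer Unity versions

// Sketch: re-zero head tracking on a key press so the player faces
// forward again after IR reflections throw the tracking off.
public class RecenterInput : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.R))
        {
            InputTracking.Recenter();
        }
    }
}
```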

Alright, the bad is out of the way; here’s the good: everyone who tried it liked it. Even the people who had input problems enjoyed it once they figured it out. I noticed that people 25 and older understood what “Point & Click” meant, whereas younger players said, “Oh! It’s like an Escape Room!” People were also very understanding of the technical limitations. The best quote I heard was, “Man, this is so cool. I can tell the tech [for hand sensors] isn’t quite there yet, but it’s cool seeing it now and feels like we’re getting closer to the future!” With that in mind, we fully intend to integrate support for the Oculus Touch and the HTC Vive controllers as soon as public developer kits are available. When our dev team tried them, we loved the 1:1 fidelity they offered, but for now we will continue working with the Leap Motion.

Stay tuned to our dev blog: we’ve been doing some A/B testing with our engine’s locomotion, and we’ll post our results there.