Sunday, 26 January 2014

Android sensors for AR

I've added a simple class to NetMash called Sensors, which I use to pick up the phone's orientation and drive the 3D view. It's only active once I select the "Around" menu item. I currently only use "azimuth" to pan around the room.

That's something I learned doing this project: "azimuth", also called "yaw", is panning around you; "pitch" is looking up and down; and "roll" is tipping or rotating the device while looking at the same thing.

The Android documentation has some rather odd axis conventions, at least to the eyes of someone who hasn't taken the trouble to work through all the maths - but working code beats theory every time.
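For reference, Android's SensorManager.getOrientation() reports those three angles in radians, as values[0] = azimuth (yaw), values[1] = pitch, values[2] = roll, with azimuth in the range -π..π. A small sketch (the class and method names here are my own, not NetMash's) of turning that azimuth into a 0-360° compass-style heading you can pan a view with:

```java
// Sketch: mapping the azimuth from SensorManager.getOrientation()
// (radians, -PI..PI, 0 = north) into a 0..360 degree heading,
// which is often more convenient for panning a 3D view.
public class Orientation {

    public static double headingDegrees(float azimuthRadians) {
        double degrees = Math.toDegrees(azimuthRadians);
        // wrap -180..180 into 0..360
        return (degrees + 360.0) % 360.0;
    }

    public static void main(String[] args) {
        System.out.println(headingDegrees(0f));                    // facing north: 0.0
        System.out.println(headingDegrees((float)(-Math.PI / 2))); // facing west: ~270
    }
}
```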

Obviously, the code was more-or-less copied from multiple StackOverflow posts, but I couldn't find a single post or article anywhere that gave me complete code like the simple class I ended up with. The smoothing algorithm I invented is probably far too simplistic, but I can refine it.

Actual Positions

So now that I can pan around my 3D place, it becomes pretty obvious that the room isn't oriented exactly square to north. But more importantly, the positions of the lights bear no relation to their actual positions in the room around me.

Setting Thing positions to their actual locations in the room will have to be done manually for now, when I run NetMash on each Pi. The positions will then be notified to the place object, which can pick them up in its initial discovery rule.

I'm still intending to video all of this working.
