FreeTrack Forum

360 degree 1:1 Tracking using Multiple Point Models
farhan #1 04/09/2014 - 04h47

Class : Apprenti
Posts : 3
Registered on : 03/09/2014

Off line

Hello all,

I am new to FreeTrack and have been quite impressed with its performance.

I was wondering whether it is (theoretically) possible to use FreeTrack to do 360 degree 1:1 head tracking? If so, is it possible to use multiple similar point models attached to my cap/head at different angles (e.g. 120 degrees apart in azimuth) and to turn only one of them on based on the estimated head pose?

If the above is possible in theory, how should the software be tweaked to control the selection of the particular point model to be powered? Would it require a Kalman filter to predict the next pose and turn a point model on when the predicted pose lies in the region where that model's points can be tracked, or could we do it simply using the current pose and its delta change or rate of change?
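To illustrate the simpler option, this is roughly what I have in mind (only a sketch; the names, the -120/0/+120 degree cluster positions and the prediction step are my own assumptions, nothing from the FreeTrack code):

// Rough illustration of the "current pose + delta" idea: predict the next yaw
// linearly and decide which point model should be powered for the next frame.
#include <cmath>

struct Pose { double yaw, pitch, roll; };   // degrees

// Predict the yaw a short time ahead from the current yaw and its rate of change.
double predictYaw(const Pose& current, double yawRateDegPerSec, double dtSec)
{
    return current.yaw + yawRateDegPerSec * dtSec;
}

// Pick the cluster whose centre is nearest to the predicted yaw,
// assuming clusters mounted at -120, 0 and +120 degrees in azimuth.
// (Wrap-around at +/-180 degrees is ignored in this sketch.)
int selectModel(double predictedYawDeg)
{
    const double centres[3] = { -120.0, 0.0, 120.0 };
    int best = 0;
    for (int i = 1; i < 3; ++i)
        if (std::fabs(predictedYawDeg - centres[i]) < std::fabs(predictedYawDeg - centres[best]))
            best = i;
    return best;
}

I suppose a Kalman filter would essentially replace predictYaw() with a proper prediction step that also handles the measurement noise.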

Many thanks

Farhan Zaidi
Steph #2 04/09/2014 - 17h51

Class : Moderator
Posts : 656
Registered on : 16/11/2007

Off line

Hi,

In my opinion this isn't realizable. Turning your head to the left, for example, with one-to-one tracking from 20° to -20°, you are already using the full tracking angle of your webcam and point model. Switching to another point model does not give you any more angle, only the same range you've already covered, because you are "one to one".
CKDV #3 04/09/2014 - 23h17

Class : Apprenti
Posts : 5
Registered on : 24/08/2014

Off line

If you were serious about achieving this for some reason, you'd be better off wearing a STEM module on your head ( http://sixense.com/wireless ), though outside of VR, which uses that kind of tech for tracking anyway, I'm not sure why you'd want to... FreeTrack functions quite well as an input for stationary displays in front of you, and if you actually had displays surrounding you in a circle, why would you need a system to rotate the on-screen view based on where you are looking? With a full field of view, aren't you achieving looking in different directions just by looking in those directions anyway?
farhan #4 05/09/2014 - 05h38

Class : Apprenti
Posts : 3
Registered on : 03/09/2014

Off line

Steph - As per my understanding, if we use multiple point models with marginally overlapping tracking angles, together with proper ON/OFF switching and model selection, then it would be a simple matter of adding a pre-determined constant offset to the tracked data to calculate the actual/global line of sight of the head. For instance, if I have 3 point models (clusters), one facing forward with respect to my face, one 20 degrees to the left of the centre cluster and one 20 degrees to the right of it, then whenever my webcam is tracking the right-most cluster I would simply add 20 degrees to the yaw value of the calculated raw pose of that cluster to get the global/real pose.
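In rough code terms, what I mean is something like this (only a sketch; the offsets, names and the idea of an "active cluster" index are my assumptions, not anything taken from FreeTrack):

// Global pose = raw pose of the active cluster + that cluster's fixed mounting offset.
struct Pose { double yaw, pitch, roll; };   // degrees

// Mounting offsets of the three clusters relative to the face:
// index 0 = 20 degrees left, 1 = centre, 2 = 20 degrees right (in azimuth).
const double clusterYawOffsetDeg[3] = { -20.0, 0.0, 20.0 };

Pose toGlobalPose(const Pose& rawPose, int activeCluster)
{
    Pose global = rawPose;
    global.yaw += clusterYawOffsetDeg[activeCluster];
    return global;
}

The sign of each offset would of course depend on the camera/head convention being used.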

But the above is just my understanding. Comments and opinions are highly welcome.

CKDV - I am planning to use FreeTrack for head pose estimation for a DIY Head Mounted Display I want to construct. A 360 degree linear pose estimate from FreeTrack would be used to control the imagery generation. I got this idea of multiple point models after looking at the optical tracking systems used in the Helmet Mounted Displays of fighter jets. The operational description of one particular system revealed that it operated on the same principle: it used multiple similar clusters of points which were activated one at a time, and the estimated pose from the active cluster was used to derive a global or real pose.

As I mentioned earlier, the above is my understanding of how to achieve this. However, it would require a lot of tweaking of the code.

The only barrier I seem to encounter while thinking this through is how to properly switch between point models based on the head movement. One pitfall would be confusion in selecting which point model to switch on during fast or erratic head movement. A second would arise if the tracking angles of the different point models did not overlap at all, so that there is a region where a small head movement produces a view in which no model presents enough points for pose estimation.

I need a solution for robust point model selection based on the current and previous head positions. A prediction model seems a good option but would be computationally intensive, while a simple rule like "switch to model 'n' if the global pose is (xRoll, yPitch, zYaw)" is easy but not robust.
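One compromise I can picture is adding some hysteresis around the hand-over angle, so that jitter near a boundary does not cause rapid switching (again only a sketch with made-up thresholds, using the 20-degree cluster spacing from my example above):

// Model selection with hysteresis: hand over to the neighbouring cluster only
// once the head has moved clearly past the boundary, so small jitter around
// the boundary does not make the system flicker between models.
const double clusterCentreDeg[3] = { -20.0, 0.0, 20.0 };  // assumed mounting angles
const double handoverDeg   = 10.0;  // nominal boundary: half the 20-degree spacing
const double hysteresisDeg = 3.0;   // extra margin required before switching

int updateActiveCluster(int active, double globalYawDeg)
{
    double offsetFromActive = globalYawDeg - clusterCentreDeg[active];

    if (offsetFromActive > handoverDeg + hysteresisDeg && active < 2)
        ++active;      // head has turned clearly past the boundary to the right
    else if (offsetFromActive < -(handoverDeg + hysteresisDeg) && active > 0)
        --active;      // head has turned clearly past the boundary to the left
    return active;
}

Fed with a predicted pose instead of the current one, this might give the switch a small head start without needing a full Kalman filter.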

Thanks for your help. Once again comments and opinions are most welcome.

P.S. The tracking system has to be open source so that it can be modified in software to cater for the customized requirements of my particular project!
Steph #5 05/09/2014 - 06h33

Class : Moderator
Posts : 656
Registered on : 16/11/2007

Off line

Well it sounds like a very interesting project, but I'm not sure if Freetrack is the right starting point for this.

If you need to fiddle around with code (which would be totally beyond my competences :stuart: )... why not go for a 3D compass? Most mobile phones can run apps like that, and there is open source code out there as well ( Ido/3D-Compass ).
Steph #6 05/09/2014 - 17h04

Class : Moderator
Posts : 656
Registered on : 16/11/2007

Off line

...going further into that stuff, I found this interesting page:
http://www.intorobotics.com/accelerometer-gyroscope-and-imu-sensors-tutorials/

I think Arduino could be a good starting point for this.
Edited by Steph on 05/09/2014 at 17h06.
moeburn #7 10/09/2014 - 15h01

Class : Apprenti
Posts : 8
Registered on : 30/01/2012

Off line

farhan @ 05/09/2014 - 07h38 said:

CKDV - I am planning to use FreeTrack for head pose estimation for a DIY Head Mounted Display I want to construct.

You're starting in the wrong place.  For 360-degree tracking, you are going to want to use a gyroscope sensor.  Most Android phones built after 2012 have a gyroscope sensor in them (NOT an accelerometer, those won't do).

There are some apps for Android that can use an Android phone as a joystick for a PC game over wifi (with a server/client app on the PC too, of course). You could get one of those apps and assign the joystick it creates to control your game/app's view. Then just duct tape the Android phone to your head. Now you have a 360 degree head tracker, no coding required.

The problem is that these apps work through wifi; I'm not aware of any that work through USB, so it's much laggier than a webcam would be.

EDIT: I see Steph has already commented on this. If you want to go hardcore and build your own from scratch, an Arduino with a gyro sensor module would be the next easiest solution, though for head tracking you would want one of the fastest Arduino boards available.
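For example, reading raw rotation rates from an MPU-6050 gyro breakout over I2C looks roughly like this (just a sketch based on the MPU-6050 register map; I haven't tested it on any particular board, and you'd still have to integrate the rates into angles and feed them to the PC):

// Minimal Arduino sketch: read raw gyro rates from an MPU-6050 over I2C.
// Registers: 0x6B = PWR_MGMT_1 (write 0 to wake the chip), 0x43 = GYRO_XOUT_H.
// At the default +/-250 deg/s range the scale factor is 131 LSB per deg/s.
#include <Wire.h>

const int MPU_ADDR = 0x68;

int16_t read16()
{
  int16_t hi = Wire.read();
  int16_t lo = Wire.read();
  return (hi << 8) | lo;
}

void setup()
{
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);            // PWR_MGMT_1
  Wire.write(0);               // clear the sleep bit to wake the sensor
  Wire.endTransmission(true);
}

void loop()
{
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x43);            // start reading at GYRO_XOUT_H
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 6, true);   // X, Y, Z as 16-bit big-endian values

  float gx = read16() / 131.0;           // degrees per second
  float gy = read16() / 131.0;
  float gz = read16() / 131.0;

  Serial.print(gx); Serial.print('\t');
  Serial.print(gy); Serial.print('\t');
  Serial.println(gz);

  delay(10);
}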
Edited by moeburn on 10/09/2014 at 15h02.
doveman #8 12/09/2014 - 14h13

Class : Habitué
Posts : 106
Registered on : 09/05/2012

Off line

I'm having quite good results with a Pro Micro 5V/16MHz and a GY-85 sensor module, so I don't think you need a particularly fast Arduino.

The code still needs some work, and it can take a few tries of replugging the USB cable and restarting the software to get it going, but once it does it works quite well.
