
Trip's treatise on motion compensation and enclosed simulators

Discussion in 'VR Headsets and Sim Gaming - Virtual Reality' started by Trip Rodriguez, Jan 9, 2020.

  1. Trip Rodriguez

    Trip Rodriguez VR Pilot

    Joined:
    May 8, 2016
    Messages:
    675
    Location:
    Lake Ariel, Pennsylvania
    Balance:
    3,922Coins
    Ratings:
    +330 / 6 / -0
    My Motion Simulator:
    6DOF
    Edit: This is a complete re-write for reasons explained within.

    Please don't hesitate to correct me anywhere I'm wrong. I'll update the post to correct errors.

An internet buddy asked a question that started me down the slippery slope that led to a couple hours of writing this all down. Thirteen hours actually, by the time the re-write was finished. Then I lost the last hour's worth of writing.

    Anyhoo.. It may be of interest to some people, but probably not too many. =P

    Basics: How VR tracking works

IMU tracking: Pitch and roll are measured by an IMU using Earth's gravity as the reference. Gravity gives it a constant to measure by. Yaw cannot use gravity as a frame of reference, since the direction of gravity never changes relative to yaw, but I believe a magnetometer can be used in a similar fashion with magnetic North as the point of reference. This is a much weaker reference, so it is subject to a lot more error.

Definitions first: We all know pitch, roll, and yaw, which are the rotational DOFs. The other three DOFs are the translational ones: surge, sway, and heave. Surge is strafe forward and back. Sway is strafe left and right. Heave is strafe up and down.
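To make the gravity-reference idea concrete, here's a minimal sketch (my own illustration, with an assumed axis convention, not any vendor's code) of recovering pitch and roll from a 3-axis accelerometer's gravity reading, and of why yaw is invisible to it:

```python
import math

def pitch_roll_from_gravity(ax, ay, az):
    """Estimate pitch and roll (degrees) from a 3-axis accelerometer
    reading of gravity, in g units. Assumed convention for this sketch:
    x forward, y left, z up; a level, motionless sensor reads (0, 0, 1)."""
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

print(pitch_roll_from_gravity(0.0, 0.0, 1.0))    # level: (0.0, 0.0)
print(pitch_roll_from_gravity(0.5, 0.0, 0.866))  # pitched back 30 deg: (~30, 0)

# Now yaw the sensor 90 degrees while keeping it level: gravity still reads
# (0, 0, 1), so the accelerometer cannot see yaw at all. That is why the
# magnetometer (a much weaker reference) has to cover that axis.
print(pitch_roll_from_gravity(0.0, 0.0, 1.0))    # unchanged: (0.0, 0.0)
```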

Contrary to what I believed until today, the IMU is used not just for rotations but for all 6DOF. I should have realized it before, as it's pretty obvious, but I made the common mistake of failing to consider, question, and verify things I've "known" for a long time. I read years ago that VR used optical for translations and an IMU for rotations, and I accepted that until writing all this down; addressing Noorbeast's replies tonight made me realize that doing it that way would make no sense when both methods can provide data on all six DOF.

The IMU uses accelerometers to measure linear accelerations. I'm not sure if yaw involves an accelerometer, a magnetometer, or both; now that I've thought a lot more about it, I'd guess the latter, or both. Accelerometers are nowhere near 100% accurate, so if you don't somehow correct the data, the errors pile up and the estimate "drifts" off pretty quickly. From what I understand, the reason for using an IMU when optical tracking could do the job with much more reliability is speed. The optical method is slower for whatever reason, and for VR we need the fastest possible response time when we move.
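Here's a toy illustration (all numbers invented) of how a tiny uncorrected accelerometer bias snowballs once you double-integrate it into position:

```python
# A motionless IMU with a tiny constant accelerometer bias, double-integrated
# at 1 kHz. The sensor never moves, yet its computed position runs away.
bias = 0.01          # m/s^2 of measurement error (invented for illustration)
dt = 0.001           # 1000 Hz sample rate
velocity = 0.0
position = 0.0
for step in range(10_000):        # 10 seconds of samples
    velocity += bias * dt         # integrate acceleration into velocity
    position += velocity * dt     # integrate velocity into position
    if (step + 1) % 2500 == 0:
        print(f"t = {(step + 1) * dt:4.1f} s, drift = {position * 100:5.1f} cm")
# By t = 10 s the "motionless" sensor has drifted about 50 cm, which is why
# something absolute (the optical tracking) has to keep correcting the IMU.
```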


Optical tracking: VR optical tracking is done using cameras, which may be infrared or visible-light spectrum. DK2 and CV1 had stationary IR cameras and IR LEDs on the HMDs. The camera sees the lights and tells what the HMD's position is from that. By using a pattern of LEDs, what Oculus calls "constellation" tracking, the computer can tell not just where the HMD is but also what its orientation is. It can also do this with a single camera, no stereo camera required! Here's how it works: from a given perspective (the camera location) the pattern of the LEDs appears to change. If you have four lights in a simple rectangle, it appears to be a perfect rectangle when it faces the camera straight on. Turn it to one side, and the lights on the side now farther from the camera appear closer together, while the ones closer appear farther apart. Multiply that by a whole lot of little lights and you've got the "original" type of optical tracking used by DK2 and CV1.
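Here's a tiny sketch of that perspective effect, boiled down to just two "LEDs" so the numbers stay readable (assumed pinhole-camera math, purely my illustration):

```python
import math

def project(points, focal=800.0):
    """Pinhole projection of camera-space (x, y, z) points to pixel x offsets."""
    return [focal * x / z for x, y, z in points]

def yaw_about(points, degrees, center=(0.0, 0.0, 1.0)):
    """Rotate points about a vertical axis through `center`."""
    c, s = math.cos(math.radians(degrees)), math.sin(math.radians(degrees))
    cx, _, cz = center
    return [(cx + c * (x - cx) + s * (z - cz), y, cz - s * (x - cx) + c * (z - cz))
            for x, y, z in points]

# Two "LEDs" 10 cm either side of center, 1 m in front of the camera.
leds = [(-0.10, 0.0, 1.0), (0.10, 0.0, 1.0)]

print([f"{u:6.1f}" for u in project(leds)])                 # straight on: -80.0, +80.0
print([f"{u:6.1f}" for u in project(yaw_about(leds, 30))])  # yawed 30 deg
# Yawed, the far LED lands ~66 px from center while the near one lands ~73 px:
# the spacing is no longer symmetric, and that asymmetry encodes orientation,
# which is why one camera can recover pose from a constellation of lights.
```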

Because this is a completely viable full-6DOF method of tracking by itself, I asked Oculus years ago if we motion sim folks could just disable the IMU. It would solve the motion platform problem, as long as the Rift camera was on the simulator instead of in a fixed location. The optical tracking could do the job 100% all by itself without the IMU helping. This is exactly what TrackIR5 does: 100% optical 6DOF with only three IR LEDs or three reflectors (vs. the many on the Oculus DK2 and CV1), and with no IMU to back it up. Anyway, I was told that if you disabled the IMU, the tracking response time would be too slow, which would likely cause visible tracking latency and would definitely make motion sickness much more likely. I wish they'd let me at least try, but it was a dead end for me at that point.

Lighthouse (and Nolo) are the opposite arrangement. They have the cameras in the HMD (I'm ignoring controllers for the purpose of this write-up), and instead of having cameras spread around the room that have to transmit data to the computer, they have these amazing laser arrays that sweep IR in fancy patterns for the cameras to see and use to measure their location. The big problem here is that those laser arrays are spinning very fast on multiple axes, causing a gyroscopic effect. You can't move them even gently while they are running; they shut off to protect themselves from damage at the tiniest movement. One issue with this setup is confusion in terminology. This is technically an "inside-out" tracking system (and the original VR inside-out system at that!) since the cameras are on the HMD, so it's actually the same setup as Rift S and WMR, except instead of using ambient light it uses stationary IR lasers for much greater accuracy and reliability. People constantly refer to Lighthouse tracking as an "outside-in" solution because it requires an external component. This error can cause a breakdown in communications at times, so I'm always more specific when it comes to Lighthouse so there can be no mistake.

And finally we've got the "inside-out" ambient light tracking that has become the standard now, with cameras in the HMD using ambient light bouncing off the environment for reference. To put it simply, the cameras look at random stuff in the room and use that to triangulate movement.

    Past Efforts to solve the problem:

First, what exactly IS the problem? The problem is a bit tricky to grasp. I'll attach my lovely art to try to help with that. What happens is that when the motion simulator moves us around, the tracking thinks we are walking or jumping around as in room scale. It assumes that if we are moving relative to the real world, we are supposed to be moving in the sim world: walking around a VR room or peeking around a VR corner.

    compensation explanation.png

When we use motion simulators, the vehicle you see in VR is actually locked to the living room floor. The car moving doesn't make you move in real life at all. If you lean over you can stick your head out the window, but the car doesn't move over to follow you when you do that, right? It's stuck to the real-life floor.

When you use a motion sim and, for example, the car goes over a bump, the VR tracking thinks maybe you just did a little bunny-hop while standing in your living room. Maybe you were hopping across a little VR obstacle. =) Every motion that the motion sim makes moves your VR headset around, because it's moving and you are sitting on it. That's bad. You want the tracking to totally ignore all that, but you DON'T want it to ignore when you lean over inside your motion cockpit to stick your head out the window, or turn your head to the right to check that mirror. We have to separate those two, because the VR tracking thinks they are both the same.

    What we need is to attach the VR vehicle to the simulator instead of to the living room floor/Earth. This is what you see in my image above. Our head tracking needs to track only how we move relative to the car.

The Rift camera method: What we all tried, with partial success, using Rift DK2 and CV1 was to put the VR optical tracking camera on the simulator. Now the optical tracking is attached to the car, your positions are relative to one another, and all is well. If both camera and reference are moving together, there is no change in position between the two. Now you won't slide over to the passenger seat, up through the roof, or through your driver's door to outside the car. We can get the same result with WMR and Rift S by having the ambient light references (the "room") attached to the simulator. You would do this by putting an enclosure on the sim platform. It has to have enough light inside for the cameras to see, and it may need some help with "objects" that are easy for the cameras to track, something like X's on the inside walls. This isn't possible with Lighthouse, because the spinning laser assemblies will not tolerate being moved while operating.

Where things go wrong with this approach is the IMU tracking element. We obviously can't move gravity and magnetic north around with the simulator, and without the proto-molecule (nerd culture reference) we can't move around in the surge, sway, and heave DOFs without inertia getting involved. Unfortunately BOTH types of tracking are used, and we can change the frame of reference for one but not the other. Now, I read years ago that the IMU was used for pitch and roll; optical was used for surge, sway, and heave; and the IMU was used for yaw, with optical correcting the yaw drift. I should have realized sooner, by pure common sense, that this was incorrect. It's a myth almost certainly born of the fact that before optical tracking (DK1) we had only 3DOF tracking, and we knew it was 100% IMU. Then we got a fancy little IR camera, optical tracking, and all the excitement of finally having positional head tracking so we could lean over and look down at the ground from our aircraft. Apparently that caused many people to assume the new optical system was added just to add positional tracking, right? Nope. Both systems are used for all 6DOF. This revelation is why I had to re-write this entire article. Moving on!

The VR system does its best to adjust the IMU's super-fast "guesses" (the data we can't possibly make relative to the sim instead of to Earth) using the facts gleaned from the optical tracking, which reports the correct position and orientation of the HMD in absolute terms. Normally this deviation is going to be fairly small. The system corrects the errors in real time, before they have a chance to pile up and turn into big deviations, and we are none the wiser.
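That correction loop can be pictured as a simple complementary filter. A hedged, generic sketch of the pattern (a textbook idea, not the actual Oculus or SteamVR code; the gain is invented):

```python
def fuse(imu_angle, optical_angle, gain=0.02):
    """One correction step: keep the fast IMU estimate, but nudge it a few
    percent toward the slower, absolute optical measurement every frame."""
    return imu_angle + gain * (optical_angle - imu_angle)

# Normal operation: the IMU has drifted half a degree; the camera pulls it back.
pitch = 0.5
for _ in range(200):
    pitch = fuse(pitch, 0.0)
print(f"drift after 200 corrected frames: {pitch:.4f} deg")  # ~0.009

# Motion-sim case: the rig tilts the "room" 20 degrees, so the two sources
# disagree by 20 degrees, far beyond what gentle nudging was designed for.
# That is when the software concludes something is broken and "jumps".
```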

    When we "fix" the optical tracking to work the way we want, but don't do anything about the IMU tracking data the software is suddenly looking at potentially MASSIVE deviations. The bigger travel and angle the sim can create the bigger the deviations. The bigger the deviations, the less likely that the VR tracking is going to work things out in an acceptable way. This manifested as the player's viewpoint making sudden jumps as the software decided it had things totally wrong and tried to fix itself. Now the myth that surge, sway, and heave were optical and that pitch and roll were IMU was actually backed up by results, which is probably why I accepted it all this time. For the most part when we put the camera on the sim we stopped having the big positional errors where we would be outside our vehicle, but we still saw rotational errors full force.

Some folks with motion sims say they have no idea what we are all complaining about: putting the camera on the sim works perfectly, or well enough anyway. Sometimes they are quite noisy about it. The fact is that it's a matter of "mine's bigger than yours". =P If you've got pretty small, ahem... equipment... you likely won't have any problems. Seriously though, on my big 6DOF, when I stomped the go pedal and accelerated like a bat out of hell, my brain was asking me why my feet were sticking up through the windshield. And also invisible. When I stomped the brake, my feet seemed to be not just below the floor of the car but buried well into the road beneath. And all of this was happening while I had the Rift camera on the sim and had mostly (except for the jarring "jumps" of the camera) stopped having significant positional errors.

    The Rift Camera Gimbal Method:

The next thing that came along was using an electronically stabilized camera gimbal, meant for drone photography, to mount the camera on the sim. This gimbal kept the camera level relative to the real earth. The roll axis helped; the pitch axis ultimately made things worse for folks with bigger simulators, and most disabled it, using only the roll portion. This method corrected heave, sway, and surge errors because the camera was mounted to the sim. It did not correct roll at all, but because it eliminated the perspective issue on that axis, it did allow you to have positional tracking and use as much roll as your brain would accept, without making the camera jump around often. Pitch it actually turned into heave deviation, so pitch wasn't solved at all: with the pitch axis enabled on the camera gimbal it would quickly put you through the roof of the car, or have you sitting on the pavement beneath it, so you still had to go really light on the pitch DOF cues. Again, people who didn't have big 6DOF rigs often found this good enough.

    I decided in a vague way that the problem was the computer noticing that the data from IMU and optical didn't at all match up and getting upset about it. The point was it wasn't good enough, and about that time the "motion compensation" plugin came along.

    For those of you who are scratching your heads and can't understand where the problem is because with the camera on the motion simulator the positional data will always be relative and therefore correct, here's the explanation of what was going wrong that I now know in a non-vague (but very confusing) way.

Let's use the example of hard forward acceleration in a simulated race car. Your IMU is keeping you looking straight ahead relative to the real world, and your brain is OK with that, but your optical tracking says straight ahead is up in the sky in front of you. The sim is tilted back, remember? So the CV1 camera, for example, is now raised up and tilted down, looking at you at an angle. It's seeing the top front of the HMD instead of the front straight on, but the IMU tells it that you do not have your head tilted down.

In the example of the CV1, the camera is not seeing what it expects to see in terms of the LED array on the HMD; the spacing is all wrong because of perspective. It's recognizing the LED pattern as showing that your head is tilted forward/down, and actively trying to correct what it thinks is bad IMU data saying you are looking straight ahead.

Enclosed simulators: Unfortunately, the same thing will most likely happen with a visually enclosed simulator using Rift S or WMR, because it will be looking for its "straight ahead" reference (for example, an 'X' drawn on the inside of the sim straight in front of the pilot when the sim is level) straight ahead, when that reference is actually now up higher and at an angle. Same thing as the Rift CV1 camera being moved to the same location. In both cases the VR headset thinks it's safe to assume that those things are stuck to the Earth, so they aren't supposed to move. As far as the optical tracking is concerned, the room is changing shape around it. It thinks maybe you are sitting in a tesseract! It would prefer Flatland, and it politely asks for a change of venue while completely failing to work properly.

SteamVR OVRIE (OpenVR Input Emulator) motion compensation:
Finally we had a proper fix, at least partially working. It didn't work properly at all with Oculus, though, and we were mostly unable to use bass shakers (which are awesome in sims with software support!) as they caused big tracking problems.

    To use the motion compensation plugin what you do is you attach a tracked motion controller such as a Vive wand, Vive Tracker, or Knuckles controller to the motion platform. When the simulator moves, that tracked device moves with it.

    When you move your head, that tracked device does NOT move with it. This is the key.

    So if you subtract the motion of that device from the motion of the HMD, you just subtracted the motion of the motion simulator which was the whole objective.
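In math terms, that "subtraction" means re-expressing the HMD pose in the tracked device's coordinate frame instead of the room's. A minimal 2D sketch of the idea (my illustration; real implementations do this with full 3D quaternion or matrix transforms):

```python
import math

def relative_pose(hmd_pos, hmd_yaw, ref_pos, ref_yaw):
    """Express the HMD pose in the reference (tracker) frame: subtract the
    tracker position, then un-rotate by the tracker yaw. Any motion the rig
    applied to BOTH devices cancels; only in-cockpit head motion remains."""
    dx, dy = hmd_pos[0] - ref_pos[0], hmd_pos[1] - ref_pos[1]
    c, s = math.cos(-ref_yaw), math.sin(-ref_yaw)
    return (c * dx - s * dy, s * dx + c * dy), hmd_yaw - ref_yaw

# The rig surges 0.3 m and yaws 15 deg; the pilot also leans 0.1 m sideways.
yaw = math.radians(15.0)
tracker = (0.3, 0.0)
lean = (0.1 * -math.sin(yaw), 0.1 * math.cos(yaw))  # the lean, rotated with the rig
hmd = (tracker[0] + lean[0], tracker[1] + lean[1])

pos, heading = relative_pose(hmd, yaw, tracker, yaw)
print(pos, math.degrees(heading))   # ~(0.0, 0.1) and 0.0: only the lean is left
```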

    But it's not that easy. =/ For it to be perfect, the tracked device would have to be in the exact same location as your head. I do not recommend trying to have two objects occupy the same space at the same time under any circumstances, but when one of those objects is your head it goes double!

So there's going to be some error: the farther the device is from your head, the bigger the deviation. With Lighthouse we generally place the tracker on top of the seat, right behind our head. This is quite close to the head, and that's good. Unfortunately, WMR headsets and others with that type of tracking, such as Rift S and Quest, cannot track the hand controller if it's directly behind you. The only option is to have it in front of you or alongside you. This puts it much farther from your head, so there is a lot more error, but it's much worse than that.

On most motion simulators the center of rotation is almost always fairly close, right in front of the driver/pilot. This means that if you put the tracked controller two feet in front of you, when the simulator makes a "pitch" movement, the tracked controller and the pilot's head go opposite directions on the heave axis! If the sim pitches up, everything forward of the CoR goes up and everything aft of the CoR goes down. If the sim yaws left, the controller moves left on the sway axis and the driver moves right. If you put the tracked controller alongside the pilot, say on his left side, to fix the huge pitch problem, then when the sim rolls right the controller goes up, but in a pure roll motion the pilot's head does not; in fact it actually goes down a little bit. And that's where things go horribly wrong with WMR motion compensation.
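The geometry is easy to put numbers on. A quick sketch, with the distances assumed for illustration:

```python
import math

def heave_at(offset_forward, pitch_deg):
    """Vertical travel of a point rigidly attached to the platform,
    offset_forward metres ahead (+) or behind (-) the center of rotation,
    when the platform pitches up by pitch_deg."""
    return offset_forward * math.sin(math.radians(pitch_deg))

pitch = 10.0                         # sim pitches up 10 degrees
tracker = heave_at(+0.6, pitch)      # controller 0.6 m ahead of the CoR
head = heave_at(-0.2, pitch)         # pilot's head 0.2 m behind the CoR
print(f"tracker heave: {tracker * 100:+5.1f} cm")   # +10.4 cm (up)
print(f"head heave:    {head * 100:+5.1f} cm")      # -3.5 cm (down)
# Opposite signs: the plugin subtracts a heave the head never experienced,
# so instead of cancelling the error it roughly doubles it.
```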

As I mentioned, it didn't work properly with Oculus, and again some of the guys with smaller sims didn't notice much trouble, for basically the same reasons detailed above. What they did notice was black edges coming into view during roll cues. This was a big glaring issue, not subtle at all. Probably for that reason, a lot of people didn't seem to notice that (for Oculus users) it was also only half working at best in all other respects.

Anyway, I very unhappily switched to Vive, which was expensive and had lower pixel density, but it gave me access to proper motion compensation. It also made me turn off my beloved "SimShaker for Aviators" and "SimShaker Wheels" powered bass shaker system. The bass shaker vibrations were causing the IMU, and to a lesser extent the cameras, to have extreme tracking issues. It might be your whole view vibrating so badly you could hardly see, or got you motion sick, or both; or it might be much less significant and semi-tolerable. With significant effort, some folks succeeded in isolating the controllers adequately, mostly or completely solving this. This was another example of me being too dumb to question facts that didn't add up. I got it mostly solved, but as soon as I turned my bass shakers up I would frequently be teleported fifty feet from my airplane in a semi-random direction. I kept fighting with vibration, but that kept happening. I thought it was odd that the tracking error would manifest in that way and that nobody else described this particular symptom, but it was kinda similar to the Oculus camera jumping thing (just much worse), so I figured the vibration-corrupted IMU or camera data made the computer decide to try to fix things, and it fixed them badly.

I'm now highly suspicious that it was actually EMI interrupting communication between the tracked controller and the PC: not from the motion sim, but from the amplifier and speaker wires that were powering the bass shakers. And I had already solved the vibration. Oh well.


    The "Short, short, short version" of what I typed and lost:
    spaceballs.jpg

My original idea, based on the misconception that the IMU data didn't include surge, sway, or heave at all, was to use a G-seat for all the cues except heave, plus a big heave elevator, which wouldn't affect tracking at all with a Reverb. I hoped I might later add surge and sway as well.

Now I know that the IMU does contribute translational movement data, but there is still hope for the concept, and it's not a total disaster if it doesn't work. We know that at least with the CV1, putting the camera on the sim pretty much entirely removed unwanted translational movements. That indicates the fusion was heavily weighted to prioritize the optical data over the IMU for translations. If that's the case with WMR, it might work with barely noticeable error on the heave axis.

    If that doesn't work, I'm back to sticking with Lighthouse and having to use "motion compensation" for the heave axis. This was the original plan anyway and should be fine as long as motion compensation doesn't get broken by a SteamVR patch or compatibility issue with my hand tracking software.

The only downsides to that, really, are still relying on the abandoned plugin that might break, and having to build a rig to raise the tracker up and down while isolating it from vibration. The other thing that worries me is that, between the actuator "towers" for the elevator and the overhead switch panel and supports for the Huey, it might be a problem to find a place to mount the lighthouses where the sim can go through its entire 38-plus inches (1000-plus mm) of heave travel without the HMD losing sight of the lighthouses behind one or more of those structures.

In truth I should be hoping that the Reverb doesn't work out, because that would save me around $800 that I really shouldn't spend: the cost of the Reverb plus an allowance for enclosing the cockpit. If I'm going to be honest, though, it's probably more about the following reasons:

• Internet buddies and internet strangers keep telling me how great the Reverb is and that I'm missing out. I've been made to believe that I can get better-looking visuals with higher frame rates if I use the Reverb instead of supersampling the Index as much as my computer can handle.
    • Having the dedicated HMD for the sim will allow me to install it on a flight helmet to increase immersion, improve the ability to apply G-loading to my head, and make it possible to implement a much more immersive sound system by having comms in headphones in the helmet but aircraft noise on speakers inside the cockpit which will be enclosed if I use Reverb and create what I expect to be amazingly realistic acoustic effects. Holy run-on sentence, batman. I've been unwilling to helmet-ify my primary VR HMD because I use it for other things as well. Half-Life Alyx incoming!
    • A third reason I can not remember
    • A fourth reason that doesn't exist but I thought the third reason looked like it needed another aimless reason to hang out with.
    • I just remembered the third reason. Paypal offered me two years zero interest financing on the Reverb which made it a lot easier to click the button and buy it.
    OK, that wasn't all that short, just much less time and effort put into it. I need sleep and might be slightly delirious. Oh wait, it's too late to sleep, I've gotta head to the Dentist in two hours.


    More information on the motion compensation software solution here:
    https://www.xsimulator.net/communit...r-needed-please-help.13226/page-5#post-189850
• Informative x 1
    Last edited: Jan 9, 2020
  2. noorbeast

    noorbeast VR Tassie Devil Staff Member Moderator Race Director

    Joined:
    Jul 13, 2014
    Messages:
    21,159
    Occupation:
    Innovative tech specialist for NGOs
    Location:
    St Helens, Tasmania, Australia
    Balance:
    148,640Coins
    Ratings:
    +10,909 / 54 / -2
    My Motion Simulator:
    3DOF, DC motor, JRK
Just a note: powerful transducers can also affect a camera mounted to a rig.

    I am not sure I agree with the example given for a camera mounted to the rig Vs the IMU. The optical tracking reference is always relative to the HMD, if the camera is mounted on the rig. Though I may not have fully understood what you meant.
  3. J-1775

    J-1775 Aviator

    Joined:
    Jan 28, 2014
    Messages:
    175
    Location:
    Switzerland
    Balance:
    1,564Coins
    Ratings:
    +51 / 0 / -0
    My Motion Simulator:
    6DOF
MANY THANKS FOR THIS, Trip! It's a wealth of information tied together. :thumbs Hopefully this becomes a lively thread where we can focus on these specific problems and their solutions.
From my first reading there's one topic I am missing: how do Vive Trackers (and Wands) fit in the compensation solution within the Vive ecosystem? Are they tracked in a way comparable to the HMD? Is there a difference between trackers and wands?
    And finally: are the new Lighthouse and wands alternatives (coming with Index and Pimax) comparable, bringing the same benefits and problems?
    I understand that my questions do not add much to your solution, but your insights could help me and others to find their individual solutions.
  4. noorbeast

    noorbeast VR Tassie Devil Staff Member Moderator Race Director

    Joined:
    Jul 13, 2014
    Messages:
    21,159
    Occupation:
    Innovative tech specialist for NGOs
    Location:
    St Helens, Tasmania, Australia
    Balance:
    148,640Coins
    Ratings:
    +10,909 / 54 / -2
    My Motion Simulator:
    3DOF, DC motor, JRK
    The wands and trackers are variations of the same technology, but members have reported slight differences in tracking quality, though really they should be the same.

v2 of the Valve tracking mainly concerns the base stations, which are simplified but also allow additional base stations for greater coverage and/or less occlusion. The possibility of an additional base station and greater tracking fidelity may help a little, but I suspect it won't really change the fundamental known issues.
  5. Trip Rodriguez

    Trip Rodriguez VR Pilot

    Joined:
    May 8, 2016
    Messages:
    675
    Location:
    Lake Ariel, Pennsylvania
    Balance:
    3,922Coins
    Ratings:
    +330 / 6 / -0
    My Motion Simulator:
    6DOF
    It is very confusing! This is the same bit most folks stumble over. I've got a good understanding of it and I still have to be careful not to get tripped up thinking about it.

    Ok so we are talking about running the sim like a CV1, without the motion compensation plugin.

When you accelerate forward, the sim pitches up to cue this motion, right? But the car and the world in the VR headset did NOT tip up. Alright, I found the diagram I made a year or two ago to help with this. =)

    compensation explanation.png
    So that explains why we need motion compensation, which isn't what we are discussing so basically ignore the top half.

    We are talking about the CV1 camera without motion compensation so that's the bottom diagram there. Let's say the CV1 camera is on the hood of the car, that's about right. So the IMU in the HMD is making you look "real world straight ahead" as in the bottom drawing because the reference is real gravity. The camera thinks that it's on the hood of the car in the bottom drawing.

The problem is that the camera is actually on the hood of the car in the top drawing, because it went up with the platform tilting. It's looking down at you from up there, while you are looking straight ahead relative to the real world, because the IMU says that way is straight and level and that's where your windshield and the road ahead are.

So the camera thinks it hasn't moved and you should be looking straight at it. But what it sees is the HMD looking at an angle below it. As far as it is concerned, you are looking at a downward angle. If that were true, then the IMU should be showing that your head is tilted down, but it says you are looking straight ahead, so they disagree completely. This should help:

    camera confusion3.png
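Rough numbers for that disagreement (the angles here are invented for illustration):

```python
platform_pitch = 15.0      # rig pitched up 15 deg, camera riding on the "hood"

imu_pitch = 0.0                   # gravity says: "the head is level"
camera_pitch = -platform_pitch    # tilted camera says: "the head sits 15 deg
                                  # below my straight-ahead, so it's pitched down"

print(f"disagreement: {abs(imu_pitch - camera_pitch):.0f} deg")   # 15 deg
# Drift corrections are normally fractions of a degree; a steady 15 degree
# contradiction is what makes the tracker conclude something is broken.
```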

    and finally, here is another (earlier) attempt at an explanation that might confuse more than help:
    camera confusion.png
    Last edited: Jan 9, 2020
  6. noorbeast

    noorbeast VR Tassie Devil Staff Member Moderator Race Director

    Joined:
    Jul 13, 2014
    Messages:
    21,159
    Occupation:
    Innovative tech specialist for NGOs
    Location:
    St Helens, Tasmania, Australia
    Balance:
    148,640Coins
    Ratings:
    +10,909 / 54 / -2
    My Motion Simulator:
    3DOF, DC motor, JRK
    Are you talking about tracking camera mounted on the rig, or off it? Your sketch suggests the latter, but I am not sure if that is what you meant.

    As I said, with a camera mounted on the rig the camera tracking view remains relative between the camera and HMD.
  7. Trip Rodriguez

    Trip Rodriguez VR Pilot

    Joined:
    May 8, 2016
    Messages:
    675
    Location:
    Lake Ariel, Pennsylvania
    Balance:
    3,922Coins
    Ratings:
    +330 / 6 / -0
    My Motion Simulator:
    6DOF
    Glad it wasn't a wasted effort. =) And this thread is here quite specifically to help you and others so ask away!

So here we go: trackers, wands, and Knuckles are interchangeable, though I've discovered the trackers have inferior tracking and the Knuckles superior tracking (more on that a few paragraphs down).

    So to use the motion compensation plugin what you do is you attach one of those to the motion platform. When the simulator moves, that tracked device moves with it.

    When you move your head, that tracked device does NOT move with it. This is the key.

    So if you subtract the motion of that device from the motion of the HMD, you just subtracted the motion of the motion simulator which was the whole objective.

    But it's not that easy. =/ For it to be perfect, the tracked device would have to be in the exact same location as your head. I do not recommend trying to have two objects occupy the same space at the same time under any circumstances, but when one of those objects is your head it goes double!

So there's going to be some error: the farther the device is from your head, the bigger the deviation. With Lighthouse we generally place the tracker on top of the seat, right behind our head. This is quite close to the head, and that's good. Unfortunately, WMR headsets and others with that type of tracking, such as Rift S and Quest, cannot track the hand controller if it's directly behind you. The only option is to have it in front of you or alongside you. This puts it much farther from your head, so there is a lot more error, but it's much worse than that.

On most motion simulators the center of rotation is almost always fairly close, right in front of the driver/pilot. This means that if you put the tracked controller two feet in front of you, when the simulator makes a "pitch" movement, the tracked controller and the pilot's head go opposite directions on the heave axis! If the sim pitches up, everything forward of the CoR goes up and everything aft of the CoR goes down. If the sim yaws left, the controller moves left on the sway axis and the driver moves right. If you put the tracked controller alongside the pilot, say on his left side, to fix the huge pitch problem, then when the sim rolls right the controller goes up, but in a pure roll motion the pilot's head does not; in fact it actually goes down a little bit. And that's where things go horribly wrong with WMR motion compensation.

Comparison of tracked devices: I have trackers, wands, and Knuckles. The trackers have the exact same tracking cameras, but they definitely lose tracking due to reflections or partial occlusion a lot more easily than the controllers. I'm 100% sure of this, and I rarely say that. I would guess it's a matter of bandwidth, but that is only a guess. The wands are middle of the road, and the Knuckles are noticeably more resistant to reflections and such than even the wands. I also tested a 2.0 tracker (mine are 1.0) and there was no difference when using v1 lighthouses. All my tests were with Lighthouse 1.0, but that shouldn't matter.

Lighthouse 2.0 vs. Lighthouse 1.0 theoretically has no effect on this whatsoever. The only differences between the two were supposed to be that 2.0 is cheaper (funny about that, eh?) and that you can use more than just two.
  8. Trip Rodriguez

    Trip Rodriguez VR Pilot

    Joined:
    May 8, 2016
    Messages:
    675
    Location:
    Lake Ariel, Pennsylvania
    Balance:
    3,922Coins
    Ratings:
    +330 / 6 / -0
    My Motion Simulator:
    6DOF
    Neg, this is all about what happens with the camera on the rig. Mostly pay attention to the middle image.

Here is another way to explain it. If you completely removed the IMU from the VR headset, the optical tracking would be able to track the HMD 100% on all 6DOF without it. The only reason for the IMU is apparently that it's faster. TrackIR5 works exactly this way, 100% optical.

So you put the camera on the rig, and the camera knows exactly where the HMD is and its pitch/roll/yaw angles from the shape of the constellation of LEDs. It's 100% fooled by our trick of putting the camera on the rig and has no idea that anything unusual is going on.

But we can't disable the IMU. The IMU doesn't know or care that the camera is on the rig; it knows THE TRUTH, because it uses gravity, and perhaps magnetic north, as its references.

If the camera is 100% fooled on all 6DOF, but the IMU isn't fooled at all, then the software has to deal with two totally contradictory data sets being reported for the pitch, roll, and yaw position of the HMD.

Now, I actually think that instead of understanding what the camera is seeing as a second data set on rotational position and trying to decide what's correct, it's simply assuming that the pitch/roll/yaw data from the IMU is correct, looking out there, and saying "WTF, these LEDs are a mess. Did this guy melt his HMD?" If the IMU says you are looking straight forward, it should see the flat face of the HMD, but instead (with the pitch-up example) it's seeing a perspective shot from the top. That would totally change the spacing and shape of the IR LEDs. If it's specifically looking for the arrangement of LEDs that the IMU is telling it to expect, it's going to assume that it is getting corrupt positional data due to reflections, physical damage to the HMD, or whatever.
  9. Trip Rodriguez

    Trip Rodriguez VR Pilot

    Joined:
    May 8, 2016
    Messages:
    675
    Location:
    Lake Ariel, Pennsylvania
    Balance:
    3,922Coins
    Ratings:
    +330 / 6 / -0
    My Motion Simulator:
    6DOF
Noorbeast, what you say right there would be correct if the camera tracking used a single reference point, like PS Move. It doesn't, though.

You've made me think about this from more angles and answered one of the few unknowns for me. =) This has to be right, for the very reasons that are making you resist the idea!

The camera is tracking the position AND the orientation of the HMD. The IMU data is used for its speed, but the software is constantly adjusting for drift and error in the IMU based on full 6DOF optical tracking.

That's the bit you made me realize. I thought it was "most likely" using the camera to compensate for yaw drift via 6DOF optical tracking. If it weren't tracking 6DOF with the camera, there would be no reason at all for the fancy pattern of IR tracking LEDs; it would only need to see one light. That is how PS Move controller tracking works, and it's the reason that system is so much less accurate than our PC VR controller tracking.

Our PC VR software "might" only be applying the rotational tracking data from the camera for yaw correction, but even so, that's enough. One cross-reference is made, and it's wrong, so it gets confused.
  10. Trip Rodriguez

    Trip Rodriguez VR Pilot

    Joined:
    May 8, 2016
    Messages:
    675
    Location:
    Lake Ariel, Pennsylvania
    Balance:
    3,922Coins
    Ratings:
    +330 / 6 / -0
    My Motion Simulator:
    6DOF
    Here we go, proof that the Oculus software is tracking 6DOF with the camera! I think you will agree that if the camera is tracking 6DOF and the camera is on the simulator, the camera and the IMU will disagree, yes?

This is direct from Oculus; I'll provide the link, but here is a quote:

    The key underlying principle that drives Constellation tracking is the detection and triangulation of infrared LEDs within camera images which have extremely short exposure times. For each controller, the tracking system attempts to solve for the 3D pose, which is the 3D position and orientation. Within each frame, the system executes the following steps:

    • Search camera images for bright volumes of infrared light
    • Determine a matching scheme between image projections and the underlying 3D model of the controller
    • Compute the 3D pose of the controller with respect to the headset and fuse with inertial data


    In order for the system to obtain a sufficient number of constraints to solve for the position and orientation, we need a minimum number of observations. In turn, one of the major issues we faced with tracking the Quest controllers was that the typical number of LEDs visible in any given camera image is quite low. Due to lower camera resolutions and various other constraints, the Quest controllers have fewer LEDs placed on them for tracking (15 vs. 22 on the Rift Touch controllers). This issue was further compounded by the fact that there are very few poses where more than one camera can see a controller at a time, unlike Rift where the controller is typically viewed by two or three cameras at a time, depending on your setup.

    They are talking about improving controller tracking, but the topic is the constellation tracking system which is what was used on CV1 and DK2 for the HMD.
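The "compute the 3D pose" step in that quote is the classic perspective-n-point (PnP) problem from computer vision. Here's a sketch of the same idea using OpenCV's general-purpose solver; this is my illustration of the principle, not Oculus's implementation, and the LED layout, pixel coordinates, and camera intrinsics are all invented for the example:

```python
import numpy as np
import cv2  # opencv-python

# A made-up planar "constellation": four LED positions on the HMD faceplate,
# in metres, in the HMD's own frame (x right, y down, z forward, OpenCV-style).
led_model = np.array([
    [-0.08, -0.03, 0.0],
    [ 0.08, -0.03, 0.0],
    [-0.08,  0.03, 0.0],
    [ 0.08,  0.03, 0.0],
], dtype=np.float64)

# Pixel coordinates where those LEDs appear in one camera frame. These numbers
# were generated for an HMD sitting 1 m straight ahead of the camera.
image_points = np.array([
    [576.0, 456.0],
    [704.0, 456.0],
    [576.0, 504.0],
    [704.0, 504.0],
], dtype=np.float64)

# Assumed pinhole intrinsics: 800 px focal length, principal point (640, 480).
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 480.0],
              [  0.0,   0.0,   1.0]])

ok, rvec, tvec = cv2.solvePnP(led_model, image_points, K, None)
print("HMD position (m):", tvec.ravel())    # ~[0, 0, 1]: 1 m straight ahead
print("HMD rotation (rad):", rvec.ravel())  # ~[0, 0, 0]: facing the camera
```

Position AND orientation come out of the one solve, which is the point Trip is making: the camera alone recovers all 6DOF.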
  11. noorbeast

    noorbeast VR Tassie Devil Staff Member Moderator Race Director

    Joined:
    Jul 13, 2014
    Messages:
    21,159
    Occupation:
    Innovative tech specialist for NGOs
    Location:
    St Helens, Tasmania, Australia
    Balance:
    148,640Coins
    Ratings:
    +10,909 / 54 / -2
    My Motion Simulator:
    3DOF, DC motor, JRK
Some things to take into account: first, the limitations and complexities of Oculus tracking, particularly when subjected to large movement. Here is the original Oculus paper covering the use of optical and IMU: http://msl.cs.uiuc.edu/~lavalle/papers/LavYerKatAnt14.pdf

And here is how that plays out in reality: Rift tracking is subject to glitches even in a static environment, as captured here, and it gets even more complicated when subjected to significant mechanical vibration attached to a rig, irrespective of other factors:

  12. Trip Rodriguez

    Trip Rodriguez VR Pilot

    Joined:
    May 8, 2016
    Messages:
    675
    Location:
    Lake Ariel, Pennsylvania
    Balance:
    3,922Coins
    Ratings:
    +330 / 6 / -0
    My Motion Simulator:
    6DOF
Noorbeast, yes, there's more info there on why my plan to not use motion compensation may not work, but it's also more proof of why my explanation of how the camera and IMU disagree is what causes the big "jumps" on a CV1 when the camera is mounted to a high-displacement 6DOF. =)

Putting the camera on the rig changes the camera-based tracking data.

    Doing that does not in any way change the IMU tracking data.

    Now the two disagree on the pose by a far larger margin than is normal, and the software has to try to figure out how to handle that.
  13. SeatTime

    SeatTime Well-Known Member

    Joined:
    Dec 27, 2013
    Messages:
    2,573
    Occupation:
    Retired
    Location:
    Brisbane Australia
    Balance:
    28,370Coins
    Ratings:
    +2,844 / 39 / -0
    My Motion Simulator:
    AC motor, Motion platform
The above discussion is yet another reason why I have built my new sim the way I have. VR motion cancellation... don't need to worry about it anymore :).
• Agree x 2
  14. noorbeast

    noorbeast VR Tassie Devil Staff Member Moderator Race Director

    Joined:
    Jul 13, 2014
    Messages:
    21,159
    Occupation:
    Innovative tech specialist for NGOs
    Location:
    St Helens, Tasmania, Australia
    Balance:
    148,640Coins
    Ratings:
    +10,909 / 54 / -2
    My Motion Simulator:
    3DOF, DC motor, JRK
I also stuck with fairly modest movement from the outset, for similar reasons when it comes to incorporation with VR.
  15. Trip Rodriguez

    Trip Rodriguez VR Pilot

    Joined:
    May 8, 2016
    Messages:
    675
    Location:
    Lake Ariel, Pennsylvania
    Balance:
    3,922Coins
    Ratings:
    +330 / 6 / -0
    My Motion Simulator:
    6DOF
ARGH, NOOOOO!!! The past hour-plus of typing just got erased when I finally finished and hit "post reply".

It returned "please submit a post with no more than 25,000 characters" (or maybe words?) and totally erased the whole last section. D=

Thank goodness it wasn't the whole thing that got wiped. I spent the past 13 hours straight on this between the original version, replies, and then the full re-write. I'd have thrown my computer out the window and vowed never to touch one again.


Worst part is, it wouldn't have been over the limit if I'd known. There were several temporary WIP paragraphs I planned to delete.
    Last edited: Jan 9, 2020
  16. cfischer

    cfischer Active Member Gold Contributor

    Joined:
    Sep 7, 2015
    Messages:
    372
    Location:
    Colorado
    Balance:
    2,688Coins
    Ratings:
    +259 / 1 / -0
    Ouch, that sucks^

I don't think this is true. For the best tracking, the tracker should be placed at the center of rotation of your rig, not at your head. (One could argue the gyros in the tracker can go anywhere, but the accelerometers need to go at the center of rotation.) The second part is that the tracker should be placed on the stiffest part of your rig and then damped for vibration to ensure accurate tracking. My rig is currently a D-Box-style 4-actuator 3DOF 8020 setup with 230 mm of travel on each actuator. I've got the Vive tracker placed on the main rail (very close to the center of rotation, very far from my head). Through trial and error this works the best for me. I can run a very high level of vibration from the actuators and bass shakers and still get good tracking. Don't bolt stuff to cantilevers (like the back of a seat) and expect it to track.
  17. Jorant

    Jorant Member

    Joined:
    Jul 15, 2019
    Messages:
    31
    Balance:
    326Coins
    Ratings:
    +1 / 0 / -0
    My Motion Simulator:
    3DOF
Why can't we just take the telemetry data from the game and use that to offset the motion? Or, to game programmers (and this should be super easy): have a VR camera mode that senses the telemetry and offsets the in-game camera. So let's say you are going up a hill; the camera would actually look downwards a bit to offset the fact that the player will be leaned back, therefore offsetting the motion sim. You could have a slider in game to make the effect more or less powerful. That would take some programmer like... an hour to put into the game.
  18. noorbeast

    noorbeast VR Tassie Devil Staff Member Moderator Race Director

    Joined:
    Jul 13, 2014
    Messages:
    21,159
    Occupation:
    Innovative tech specialist for NGOs
    Location:
    St Helens, Tasmania, Australia
    Balance:
    148,640Coins
    Ratings:
    +10,909 / 54 / -2
    My Motion Simulator:
    3DOF, DC motor, JRK
@pmvcda already considered something similar but concluded latency was an issue on the software side, so he was pursuing the hardware side; see here for details: https://www.xsimulator.net/community/threads/flypt-motion-cancelling-project-trial.13527/
  19. Trip Rodriguez

    Trip Rodriguez VR Pilot

    Joined:
    May 8, 2016
    Messages:
    675
    Location:
    Lake Ariel, Pennsylvania
    Balance:
    3,922Coins
    Ratings:
    +330 / 6 / -0
    My Motion Simulator:
    6DOF
I have to disagree with you there. I think you've got things a little jumbled, which is what happens to pretty much all of us with this stuff.

    We aren't using the tracker to measure what the motion sim is doing. If that was the goal, you would be correct.

We are using the tracker to measure precisely what the motion sim is doing to the HMD sensors. That's what we need to subtract. The HMD is cantilevered, so the tracker must also be cantilevered. Ideally we want the sensors in the tracked controller and the HMD to give identical readings if you were to sit 100% perfectly still, so that the only difference between the two is when you move around in the seat and turn your head.

Alternatively, you could measure the actual motion sim movements and then use a model of the geometry to extrapolate how the HMD would be affected, but that wouldn't be very efficient.
  20. Trip Rodriguez

    Trip Rodriguez VR Pilot

    Joined:
    May 8, 2016
    Messages:
    675
    Location:
    Lake Ariel, Pennsylvania
    Balance:
    3,922Coins
    Ratings:
    +330 / 6 / -0
    My Motion Simulator:
    6DOF
That's a very attractive idea that has been much discussed, but it's not nearly that simple.

For one thing, since our sims are not identical, and beyond that our personal judgment calls on motion cue tuning aren't identical, each of our sims responds differently to the same telemetry data. There's even deviation between the instructed position and the actual position at any given point in time, and almost none of us are using a system that reports that info back to the computer at all. PMVCDA is the only one I know of who has a 6DOF constantly telling the PC what its position is. Most have the whole closed loop downstream from the PC.

Also, tracking how the HMD is going to be moved around by the sim requires forward kinematic calculations using all the dimensions of your specific motion sim, because the HMD is not at or even near the center of the motion platform.
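To make "forward kinematics" concrete, here's a minimal 2D sketch (pitch and heave only; the head offsets are assumed for illustration) of how a head that is not at the center of rotation gets swung around by a platform move:

```python
import math

def hmd_displacement(pitch_deg, heave_m, head_offset):
    """2D forward kinematics: where a head rigidly mounted at
    head_offset = (forward, up) metres from the platform's center of
    rotation ends up after the platform pitches and heaves."""
    p = math.radians(pitch_deg)
    fwd, up = head_offset
    x = fwd * math.cos(p) - up * math.sin(p)
    z = fwd * math.sin(p) + up * math.cos(p) + heave_m
    return x - fwd, z - up      # displacement caused purely by the platform

# Head 0.2 m behind and 0.7 m above the CoR; sim pitches up 10 deg + 5 cm heave.
dx, dz = hmd_displacement(10.0, 0.05, (-0.2, 0.7))
print(f"head surge {dx * 100:+5.1f} cm, head heave {dz * 100:+5.1f} cm")
# ~ -11.9 cm surge and +0.5 cm heave: numbers that depend entirely on YOUR
# rig's geometry, which is why a generic in-game slider can't do this.
```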

    If we were dealing with a bunch of identical sims running identical software with identical settings this might be practical, but that is not the case.

    The idea is much more likely to be possible with sims that have individual drives for each axis rather than a hexapod, but hexapods are generally what need this compensation the most!

    As Noorbeast mentioned, there are also latency issues and such.

Here is an exception, though: as I argue this point, I sit here knowing that my own new rig is one sim where this actually might be practical. I might even ask PMVCDA about it when the time comes. My v.2 simulator is going to be 1DOF! Heave only; the rest is G-seat. If I could install a single simple linear sensor and have the computer read that and use it for 1DOF motion compensation, I'd be set. =) Surge and sway should also be this simple if there are no rotations involved whatsoever.
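For that 1DOF case the compensation really would be a single subtraction. A sketch, assuming a hypothetical linear sensor that reports the elevator's travel:

```python
def compensated_height(tracked_hmd_height, elevator_travel):
    """Heave-only motion compensation: subtract the elevator's current travel
    (from a hypothetical linear position sensor, in metres, zero at rest)
    from the tracked HMD height before the pose reaches the game."""
    return tracked_hmd_height - elevator_travel

# Elevator drops 0.30 m for a downdraft cue while the pilot holds still:
print(compensated_height(1.20 - 0.30, -0.30))         # 1.2: the view stays put

# Pilot ducks 0.10 m while the elevator is raised 0.20 m:
print(compensated_height(1.20 - 0.10 + 0.20, +0.20))  # 1.1: the duck survives
```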