OpenVR-MotionCompensation

Discussion in 'VR Headsets and Sim Gaming - Virtual Reality' started by Dschadu, Apr 19, 2020.

  1. Avee

    Avee Virtual Pilot

    Joined:
    Jul 5, 2020
    Messages:
    141
    Location:
    Germany
    Balance:
    1,119Coins
    Ratings:
    +35 / 1 / -0
    My Motion Simulator:
    2DOF
    The lighthouses emit an encoded laser pattern. By evaluating this laser pattern, headsets, controllers and trackers can find out where they are in space compared to the lighthouse. That is what the calibration is for, to find out how the center of the default playing position relates to the position of the lighthouse.

    So everything "sees" the lighthouse, and therefore knows where it is. You need to see it with multiple sensors on one thing, so you can triangulate the distance as well. The Vive Tracker has 25 optical sensors. Theoretically, rotation can also be calculated by comparing different sensors in different positions on the controllers, trackers and headsets. However, this requires more precision, and I think for this reason everything has a gyro sensor as well. The calculations then probably get mixed in a Kalman filter to estimate the most accurate pose possible basing on the positions in space of the multiple sensors and the information from the gyros.

    The protocol for building a lighthouse sensor is actually open, AFAIK. All you need is some optical sensors and a microcontroller like an Arduino. The difficult part then is to do all the calculations and set up an appropriate Kalman filter if you also use a gyro.
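
    Just to illustrate the idea of that gyro + optical fusion (a complementary filter as a much simpler stand-in for a real Kalman filter - the 1D setup, names and numbers here are purely my own invention):

    #include <cstdio>

    // 1D complementary filter: integrate the gyro for smoothness, correct the
    // drift with the absolute angle triangulated from the lighthouse sweeps.
    struct PoseFilter1D {
        double angle = 0.0;   // current estimate [rad]
        double alpha = 0.98;  // how much we trust the gyro prediction vs. the optical fix

        void update(double gyroRate, double opticalAngle, double dt) {
            double predicted = angle + gyroRate * dt;                  // gyro integration
            angle = alpha * predicted + (1.0 - alpha) * opticalAngle;  // pull back towards the optical measurement
        }
    };

    int main() {
        PoseFilter1D f;
        for (int i = 0; i < 100; ++i)
            f.update(0.1 /* rad/s */, 0.05 /* rad */, 0.01 /* 10 ms */);
        std::printf("estimated angle: %f rad\n", f.angle);
    }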

    The PC then combines all positions to make them show up in the right spots and to set the camera right so that it matches the headset. In the case of this motion compensation software, one tracker will be used to subtract the motion of the platform from the motion of the headset, so that only movements that are not caused by the platform control the camera.
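
    In its simplest form (ignoring rotation for a moment) that subtraction is roughly compensatedHeadsetPos = headsetPos - (currentTrackerPos - trackerPosAtCalibration); that's just my own shorthand, not the driver's actual variable names.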

    I think the libsurvive project would be a starting point if you want to DIY a tracking solution:
    https://github.com/cntools/libsurvive
    Last edited: Jul 18, 2020
  2. Fireblade69

    Fireblade69 New Member

    Joined:
    Nov 11, 2015
    Messages:
    12
    Location:
    UK
    Balance:
    271Coins
    Ratings:
    +0 / 0 / -0
    Wow, great detailed reply, thanks. That is curious - I would have thought it was the other way round, with the lighthouse locating the devices in 3D space rather than the devices orienting themselves based on a home pattern. Which is probably why they are called Lighthouses, eh?

    That being the case, it would be too complex for me to do a DIY version that could be cost-effective, even with the high markup I'm seeing on the trackers.

    Motion compensation will be a challenge for motion simulators in VR; until that's solved I'm uncomfortable dropping too much money into a motion platform.
  3. Dschadu

    Dschadu Active Member

    Joined:
    Jan 2, 2017
    Messages:
    109
    Location:
    Germany
    Balance:
    1,933Coins
    Ratings:
    +145 / 1 / -0
    A challenge that has already been solved - that's why this thread exists. Most of the issues users report are related to inside-out tracking; Lighthouse tracking works just fine. That is, until you add very strong vibrations into the mix. Then you run into hardware issues with the integrated IMU. But that is also solved by using FlyPT Mover as the data source. There is currently an alpha driver available on Discord.
    • Agree Agree x 1
  4. xxpelle

    xxpelle Discord "TPMax#9574" Gold Contributor

    Joined:
    Dec 24, 2017
    Messages:
    120
    Occupation:
    Discord "TPMax#9574"
    Location:
    Germany
    Balance:
    614Coins
    Ratings:
    +150 / 0 / -0
    My Motion Simulator:
    DC motor, Arduino, Motion platform, 6DOF

    Where exactly on Discord?
  5. Fireblade69

    Fireblade69 New Member

    Joined:
    Nov 11, 2015
    Messages:
    12
    Location:
    UK
    Balance:
    271Coins
    Ratings:
    +0 / 0 / -0
    It may have been worked around in certain circumstances, but it isn’t solved by any means.
    I use an Oculus Rift-S and have a Reverb G2 on pre-order, both use inside-out tracking which as far as I can tell is still an issue. The motion compensation I’ve experienced is OpenVR only and it is good, but imperfect. I’ll check out the one you reference on Discord as soon as I can find it.
  6. Dschadu

    Dschadu Active Member

    Joined:
    Jan 2, 2017
    Messages:
    109
    Location:
    Germany
    Balance:
    1,933Coins
    Ratings:
    +145 / 1 / -0
    Please describe in more detail what is imperfect and "not solved by any means".
    I'm happy to hear your detailed feedback to improve motion compensation.

    @xxpelle: write something on our Discord channel and I will then post a link to the alpha driver.
  7. PeterW

    PeterW alias Wickie

    Joined:
    Oct 21, 2018
    Messages:
    227
    Occupation:
    Dipl. Ing. Mb (FH)
    Location:
    Germany
    Balance:
    1,747Coins
    Ratings:
    +385 / 3 / -0
    My Motion Simulator:
    6DOF
    Hello everybody,
    since the problems with my Pimax just won't end, I switched back to my HTC Vive yesterday.
    For the motion cancellation I downloaded and tried the OVRMC.
    However - and this is also my question - the picture in VR is tilted and I don't know how to get it level.
    In addition, no rotations are compensated, only the position of the tracker in the room.
    I would be very happy if someone could give me a quick hint!
    Wickie
  8. Dschadu

    Dschadu Active Member

    Joined:
    Jan 2, 2017
    Messages:
    109
    Location:
    Germany
    Balance:
    1,933Coins
    Ratings:
    +145 / 1 / -0
    What version and which controller are you using?
    The latest release in the forum is 0.2.3; on Discord there is 0.2.4 with minor bug fixes.
  9. Michael Mariacher

    Michael Mariacher Member

    Joined:
    Jan 25, 2019
    Messages:
    65
    Balance:
    626Coins
    Ratings:
    +47 / 0 / -0
    My Motion Simulator:
    4DOF
    Hi, it's not.
    Motion compensation works flawlessly if your HMD allows it.

    See:

    I started with an Oculus Rift CV1 and matzman666's OpenVR-InputEmulator: https://github.com/matzman666/OpenVR-InputEmulator/releases

    It worked well until Facebook decided to cut support for OpenVR.

    So I moved to an HTC Vive Pro almost 2 years ago, and I can tell you it's brilliant.
    I assume it will work with Valve's HMD the same.

    No matter whether you like flying, racing or simple driving.

    Positioning the Lighthouse base stations at an angle between 60° and 90° to each other makes compensation much better on my rig, a Fasetech RacingCube RC4 v3.



    If the tracker (whatever you use) is not as rigidly mounted as mine, it may be better to try the other available motion compensation software, OVRMC, which is available here too and is discussed in the Fasetech forum: https://forum.fasetech.com/support/new-motion-compensation-ovrmc/

    I'm fine with matzman666's, so I stick to that one.
    My mount: a camera mount on a metal bracket, glued to the seat with double-sided 3M tape (duct tape just as backup)

    DSC02969.JPG

    Check out my other videos where you can see motion compensation in action.

    https://www.xsimulator.net/community/threads/michael-racing-cube-rc4-flexrig.14420/#post-193808

    Michael
  10. PeterW

    PeterW alias Wickie

    Joined:
    Oct 21, 2018
    Messages:
    227
    Occupation:
    Dipl. Ing. Mb (FH)
    Location:
    Germany
    Balance:
    1,747Coins
    Ratings:
    +385 / 3 / -0
    My Motion Simulator:
    6DOF
    Hi @Dschadu
    I use the htc vive tracker with your 0.2.3 version.
    I tried some different SteamVR settings and now it works perfectly! Don't know what the key was.
    Your motion cancellation works much better than the Pimax motion cancellation. Thanks a lot !!
    Wickie
  11. Avee

    Avee Virtual Pilot

    Joined:
    Jul 5, 2020
    Messages:
    141
    Location:
    Germany
    Balance:
    1,119Coins
    Ratings:
    +35 / 1 / -0
    My Motion Simulator:
    2DOF
    I have found that the mount to the chair is very important. I used a thick double-sided gel tape, which conforms very nicely to everything and thus sticks like nothing else does. But the slight flexibility of the material was translating my vertical vibrations into all kinds of different vibrations and rotations. So the mount has to be really solid, including where it is glued, otherwise it will create mayhem.

    Another option I found for a vibration dampening mount:
    https://www.quadlockcase.com/pages/quad-lock-vibration-dampener
  12. Dschadu

    Dschadu Active Member

    Joined:
    Jan 2, 2017
    Messages:
    109
    Location:
    Germany
    Balance:
    1,933Coins
    Ratings:
    +145 / 1 / -0
    Yes, the tracker mount is very important! It has to be very rigid.

    Keep in mind that you can mount the tracker anywhere you want! No need to mount it near your head. So you can put it e.g. above your steering wheel and mount it to your 20x60 profile.
  13. Graham J

    Graham J moving while sitting

    Joined:
    May 28, 2020
    Messages:
    17
    Occupation:
    Developer
    Location:
    Ottawa, Canada
    Balance:
    204Coins
    Ratings:
    +6 / 0 / -0
    My Motion Simulator:
    2DOF, Motion platform
    I'm struggling to understand this last point. If the controller moves along a different vector and with a different magnitude than the headset for a given seat movement - which I believe it would if it's not close to the headset - how can the software apply the correct offset?
  14. Dschadu

    Dschadu Active Member

    Joined:
    Jan 2, 2017
    Messages:
    109
    Location:
    Germany
    Balance:
    1,933Coins
    Ratings:
    +145 / 1 / -0
    The formula in the background handles this.

    compensatedPoseWorldPos = _ZeroPos + vrmath::quaternionRotateVector(_RefRot, _RefRotInv, poseWorldPos - _RefPos, true);

    If you cut down the rotation stuff, you end up with:
    compensatedPos = ZeroPos + poseWorld - RefPos

    ZeroPos: The tracker position in 3d space after you hit "apply" (this position is saved)
    poseWorld: Headset position in 3d space
    RefPos: Current tracker position

    An example:
    "HC" is Headset Compensated

    We have this initial position:
    Example 01.png

    Now our Rig moves 1 Unit to the left:
    Example 02.png

    As you can see, the [H]eadset moves, together with the [R]eference, one unit to the left. But our Zero position does not move. The [HC] position stays where it was at the beginning.

    Now the rotation:
    The easiest way to understand it is this: Imagine the grid shown above is rotated by 25°. Only the grid, not the devices. Don't rotate it about the middle - rotate it about the point (-5, 3). It would be very hard to compensate any XY movement, as this point of rotation is always involved, and it would have to be found before we could do any compensation.
    Why? Consider this case: the tracker is at the front of your rig, your head at the back. The front of your rig moves down, the rear moves up. The tracker goes down, your head goes up. The motions are opposite.
    But if we undo the rotation, we don't have to bother with any point of rotation. And this is what happens: the rotation is reversed and we end up with a straight grid with no point of rotation. Now we can do the XY compensation.
    Done :)
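
    If it helps, here is a rough C++ sketch of those two steps (my own simplification, not the actual driver code - the variable names are only there to match the explanation above):

    #include <cstdio>

    struct Vec3 { double x, y, z; };
    struct Quat { double w, x, y, z; };  // assumed to be a unit quaternion

    // Rotate vector v by quaternion q (q * v * q^-1), written out for unit quaternions.
    Vec3 rotate(const Quat& q, const Vec3& v) {
        Vec3 t{ 2*(q.y*v.z - q.z*v.y), 2*(q.z*v.x - q.x*v.z), 2*(q.x*v.y - q.y*v.x) };
        return { v.x + q.w*t.x + (q.y*t.z - q.z*t.y),
                 v.y + q.w*t.y + (q.z*t.x - q.x*t.z),
                 v.z + q.w*t.z + (q.x*t.y - q.y*t.x) };
    }

    Vec3 compensate(const Vec3& headsetPos,   // poseWorld: current headset position
                    const Vec3& trackerPos,   // RefPos: current tracker position
                    const Quat& rigRotInv,    // inverse of the rig rotation relative to calibration
                    const Vec3& zeroPos) {    // ZeroPos: tracker position when "Apply" was hit
        // 1. offset of the headset from the current tracker position
        Vec3 d{ headsetPos.x - trackerPos.x, headsetPos.y - trackerPos.y, headsetPos.z - trackerPos.z };
        // 2. undo the rig rotation, so the grid is "straight" again
        Vec3 r = rotate(rigRotInv, d);
        // 3. re-attach the result to the saved zero position
        return { zeroPos.x + r.x, zeroPos.y + r.y, zeroPos.z + r.z };
    }

    int main() {
        // The example from above: the rig moves 1 unit to the left, no rotation.
        Quat identity{1, 0, 0, 0};
        Vec3 hc = compensate({-1, 1, 0}, {-1, 0, 0}, identity, {0, 0, 0});
        std::printf("HC = (%g, %g, %g)\n", hc.x, hc.y, hc.z);  // stays at (0, 1, 0)
    }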

    I hope this helps a little bit!
  15. Graham J

    Graham J moving while sitting

    Joined:
    May 28, 2020
    Messages:
    17
    Occupation:
    Developer
    Location:
    Ottawa, Canada
    Balance:
    204Coins
    Ratings:
    +6 / 0 / -0
    My Motion Simulator:
    2DOF, Motion platform
    Wow thanks for the detailed reply!

    I think I almost understand. Cancelling the rotation removes the issue of the center of rotation for sure, so that makes sense. But depending on the tracker location the tracker and the headset will also translate different amounts.

    For example let's say the tracker is at the rig center of rotation and the headset is higher up on the seat. For a given rotation the tracker will not translate at all, whereas the headset will. Even after removing the rotation, how do you arrive at a proper translation offset?

    In your second grid I envision H being more to the left of where you show it, because it moved more in that direction than the tracker did. Accordingly, HC would also be more to the left, because its location is H - (R - Z). But then it would not be in the correct location, because the required offset is greater than R - Z.

    In short, it looks to me like the tracker must be the same distance from the center of rotation as the headset or else you would need to know the ratio of tracker to headset movement in each axis.

    Although I am a developer, math is not my forte so I'm sure I am missing something!
    Last edited: Aug 2, 2020
  16. Dschadu

    Dschadu Active Member

    Joined:
    Jan 2, 2017
    Messages:
    109
    Location:
    Germany
    Balance:
    1,933Coins
    Ratings:
    +145 / 1 / -0
    I'm not sure what you mean by H being more to the left? If your sim moves 1 unit to the left, your tracker moves 1 unit to the left and your head does so, too. So both devices are now 1 unit further to the left.
    The formula is HC = Z + H - R.
    The example shown is of course only in 2D, to visualize what happens. The code itself does this in all three dimensions.

    To the translation problem:
    It's vector math; it's a bit hard for me to explain in easy words (and in English), as I'm also not 100% solid on the subject. It happens within that quaternionRotateVector bit, which rotates the translation. You then end up with new XYZ values: the translation gets rotated by the rotation difference between the tracker's current pose and its zero pose.
    Maybe someone else can explain it a bit better?
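
    Maybe a concrete example with numbers helps (my own numbers, continuing the grid idea from above, in 2D):
    - The tracker sits exactly at the rig's centre of rotation, so ZeroPos = RefPos = (0, 0). The headset starts 1 unit above it at (0, 1).
    - The rig now rotates the whole grid by 90° around the tracker (an extreme angle, just to keep the numbers simple). The tracker does not translate at all, but the headset swings over to (-1, 0).
    - Compensation: poseWorld - RefPos = (-1, 0) - (0, 0) = (-1, 0). Undoing the 90° rotation turns (-1, 0) back into (0, 1). Adding ZeroPos gives (0, 1) - exactly where the headset was before the rig moved.
    So even though the headset translated and the tracker did not, the compensated pose stays put. That is what quaternionRotateVector takes care of, just in 3D with quaternions instead of a 2D angle.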
  17. Graham J

    Graham J moving while sitting

    Joined:
    May 28, 2020
    Messages:
    17
    Occupation:
    Developer
    Location:
    Ottawa, Canada
    Balance:
    204Coins
    Ratings:
    +6 / 0 / -0
    My Motion Simulator:
    2DOF, Motion platform
    hehe no it's ok, I get it now. quaternionRotateVector handles the differing device translations that result from each device being a different distance from the rig's center of rotation. I was thinking about it from a device-centric point of view but when you rotate the grid both the rotations and translations that result from a rig rotation are taken into account.

    So now I just need to figure out what I'm seeing in the headset. It feels like there are two things happening - the rotational offset as discussed above which is fairly immediate, but also something else that seems to happen more slowly over time, like the world is chasing my head. When driving I sort of glide around in the cockpit which ideally MC should be eliminating.

    I'm using 0.01 LPF and 7 samples which I believe should give minimal latency. Do you know what I might be doing wrong?

    Thanks again!
  18. Globespy

    Globespy Member

    Joined:
    May 18, 2020
    Messages:
    88
    Balance:
    666Coins
    Ratings:
    +24 / 0 / -0
    My Motion Simulator:
    3DOF
    I was referred to this thread after being unable to resolve VR motion issues (Oculus Rift S), and wondered if this new plugin has progressed enough to work with the Rift S (using OpenVR) without too many compromises.
    I seem to understand that HMDs with inside-out tracking (no external sensors or lighthouses) will not work well? A shame, as VR is clearly moving away from external sensors/lighthouses as inside-out tracking continues to improve, and it is obviously a much less expensive direction for HMD manufacturers to go.
    Happy to test if there's still beta testing going on.
    Thanks
    Last edited: Aug 3, 2020
  19. Dschadu

    Dschadu Active Member

    Joined:
    Jan 2, 2017
    Messages:
    109
    Location:
    Germany
    Balance:
    1,933Coins
    Ratings:
    +145 / 1 / -0
    LPF and Samples work in opposite directions. LPF is for rotation: a value of 1 means no filter, 0.01 is extremely strong filtering.
    DEMA is for translation: 2 samples means no filter. 7 should be OK, but you can go lower.
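
    If you are curious what the two filters roughly do, here is my own simplified sketch (not the actual driver code, and the driver may scale these parameters differently):

    #include <cstdio>

    // One-pole low-pass filter, as used for the rotation (LPF setting).
    // factor = 1.0 passes the input straight through; small values smooth heavily but add lag.
    double lowPass(double input, double previous, double factor) {
        return previous + factor * (input - previous);
    }

    // Double exponential moving average (DEMA), as used for the translation.
    // ema1/ema2 hold the filter state between calls; alpha is derived from the sample count.
    double dema(double input, double& ema1, double& ema2, double alpha) {
        ema1 += alpha * (input - ema1);   // EMA of the raw input
        ema2 += alpha * (ema1 - ema2);    // EMA of the first EMA
        return 2.0 * ema1 - ema2;         // DEMA: less lag than a plain EMA
    }

    int main() {
        double ema1 = 0.0, ema2 = 0.0, smoothedRot = 0.0;
        double alpha = 2.0 / (7 + 1);  // textbook EMA factor for 7 samples; just an illustration
        for (int i = 0; i < 20; ++i) {
            smoothedRot = lowPass(1.0, smoothedRot, 0.01);      // LPF 0.01: very strong filtering
            double smoothedPos = dema(1.0, ema1, ema2, alpha);  // DEMA on a constant input
            std::printf("%2d: rot %.4f  pos %.4f\n", i, smoothedRot, smoothedPos);
        }
    }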

    @Globespy If your tracking works flawlessly, OVRMC will do, too.
    • Informative Informative x 1
  20. J-1775

    J-1775 Aviator

    Joined:
    Jan 28, 2014
    Messages:
    175
    Location:
    Switzerland
    Balance:
    1,564Coins
    Ratings:
    +51 / 0 / -0
    My Motion Simulator:
    6DOF
    Holy cow, so glad you said that again. Shame on me, I had missed it when you said it over two months ago. Hence I just wasted a good couple of months of my life. :(

    After reading it today I immediately moved my tracker from behind the seat's headrest to a back corner of my 6 DoF Stewart platform. This indeed worked fine for compensation and at the same time removed almost all of the vibration problems I had with a Buttkicker LFE attached to the seat plus a JetSeat placed on it (the seat is well isolated from the platform).

    But the real "pièce de résistance" was the Reverb, my favourite HMD. Though in OVRIE times I had tried EVERYTHING to apply compensation, even the ominous SpaceCalibrator tool to match the SteamVR workspace with the WMR workspace, I always failed to get motion compensation for the Reverb. For all the past months it was either Vive Pro WITH full motion (compensation) or the Reverb with the platform's envelope reduced to a mere 20%.

    Not believing it could work, I mounted one Reverb controller right beside my joystick's base - of course all IN FRONT of me, that is, on the other side of the CoG (!) - confirmed it as the reference tracker and lifted off in IL-2. My jaw dropped: COMPENSATION WORKED PERFECTLY!

    Dschadu, you are a GENIUS. Me and other people with platforms are so glad for your work with the compensation tool and really appreciate it. THANKS A LOT! Of course, you deserve more than nice words...
    • Like Like x 2
    • Informative Informative x 1