Need help to implement a virtual tracker for OVRMC

Discussion in 'Miscellaneous' started by Dirty, Sep 15, 2023.

  1. Dirty

    Dirty Well-Known Member Gold Contributor

    Joined:
    Oct 15, 2017
    Messages:
    744
    Occupation:
    All the way up front.
    Location:
    Germany
    Balance:
    7,909Coins
    Ratings:
    +878 / 3 / -0
    Hi all :)

    In my motion cueing software (YAME) I would like to implement a virtual tracker that talks to OVRMC via a MemoryMappedFile. As far as I understand, the MMF only contains a struct that holds the necessary data:
    (Screenshot: the struct that the MMF holds)

    I already have all the necessary data to fill this struct, but since my software is written in C# and doesn't use an OpenVR library, "vr::HmdVector3d_t" is an unknown type to it.

    I was thinking that for the MMF it shouldn't make a difference, as long as the bits & bytes are written in the right places. So my question is: Can I just use a simple Vector3 instead? It's just three doubles, right?

    Can anyone help me with some example code to write to the MMF that does not require an OpenVR Library?
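    Something along these lines is what I have in mind (the mapping name and the exact field order are guesses based on the struct above, so please correct me if they're wrong):
    Code:
    using System;
    using System.IO.MemoryMappedFiles;
    using System.Runtime.InteropServices;

    // Stand-in for vr::HmdVector3d_t: just three consecutive doubles.
    [StructLayout(LayoutKind.Sequential)]
    struct HmdVector3d { public double X, Y, Z; }

    // Guessed layout matching the struct in the screenshot:
    // translation, rotation and two flag words. Order and types must match the C++ side exactly.
    [StructLayout(LayoutKind.Sequential)]
    struct OvrmcMmfData
    {
        public HmdVector3d Translation;
        public HmdVector3d Rotation;
        public uint Flags_1;
        public uint Flags_2;
    }

    class OvrmcMmfWriter : IDisposable
    {
        // The mapping name is an assumption; it has to match whatever name OVRMC opens.
        const string MapName = "OVRMC_MMFv1";
        readonly MemoryMappedFile _mmf;
        readonly MemoryMappedViewAccessor _view;

        public OvrmcMmfWriter()
        {
            int size = Marshal.SizeOf<OvrmcMmfData>();
            _mmf = MemoryMappedFile.CreateOrOpen(MapName, size); // keep the mapping open while the tracker runs
            _view = _mmf.CreateViewAccessor(0, size);
        }

        public void Write(OvrmcMmfData data) => _view.Write(0, ref data); // blits the struct byte-for-byte

        public void Dispose() { _view.Dispose(); _mmf.Dispose(); }
    }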
  2. cfischer

    cfischer Active Member Gold Contributor

    Joined:
    Sep 7, 2015
    Messages:
    372
    Location:
    Colorado
    Balance:
    2,688Coins
    Ratings:
    +259 / 1 / -0
    Hi Dirty,
    I wish I had some wisdom for you on this; instead I have some comments and a question.

    When using the OVRMC virtual tracker in FlyPT Mover I need to painstakingly align the origin to my rig's center of rotation. I think most people get away with a rough alignment because they have small travel on each DOF. I have a large-travel sim (going much larger soon) and would really love to see a way to calibrate the sim's rotation to the origin point (maybe with a tracker, just for calibration). I imagine that with your long-travel aircraft sim you will also want a way to do this. Have you thought about this at all?

    Always a pleasure reading your posts.
  3. Dirty

    I have indeed thought about this a lot, unfortunately with very few results as of yet! :) I was under the impression that all it would (or should) take is a simple one-time calibration while the rig is in the zero position. I call that the Standby position, when all DOFs are at zero. Just a button click, and from that moment on all motion relative to that reference position is transmitted to OVRMC's MMF.
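    In code I picture that calibration roughly like this (just a sketch with made-up names, assuming I have the rig pose as a position plus a quaternion):
    Code:
    using System.Numerics;

    // One-time calibration: capture the rig pose in the Standby (all-zero) position,
    // then express every subsequent pose relative to that reference.
    class ReferencePose
    {
        Vector3 _refPos;
        Quaternion _refRotInv;

        public void Calibrate(Vector3 pos, Quaternion rot)
        {
            _refPos = pos;
            _refRotInv = Quaternion.Inverse(rot);
        }

        public (Vector3 dPos, Quaternion dRot) Relative(Vector3 pos, Quaternion rot)
        {
            var dPos = Vector3.Transform(pos - _refPos, _refRotInv); // translation expressed in the reference frame
            var dRot = _refRotInv * rot;                             // rotation relative to the reference
            return (dPos, dRot);
        }
    }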

    I haven't managed to do that yet, 'cause I got distracted with other stuff, but in theory I thought it could work. If it turns out it doesn't, it could mean that the nested transforms that create the different DOFs are not nested in the same order as in my software, or that I have categorically misunderstood how OVRMC works. Probably the latter :)

    As soon as I get this MMF thing working, I will tinker about and see how it goes. I will probably initially start with translations only and then stumble my way into adding other DOFs. I'm sure at some point I will see where the culprit lies.

    That painstaking alignment is something that kinda concerns me. In my world view it shouldn't be necessary, but since it obviously is, I think I may have a fundamental misconception :confused:

    Thanks for the kind words :thumbs
  4. cfischer

    I can confirm your thinking is correct: you zero the rig, turn on the motion compensation, and from then on you are good. This works well in Mover if you want to experiment quickly.

    You are also correct that once you set up the origin coordinates, as long as your rig doesn't slide on your floor or something, you don't have to change anything each time you jump in to drive/fly.

    The problem with the origin is that you must wear the headset and adjust the origin numbers (the origin is displayed in the headset at some point in your room). Then you pull the headset off and put it back on, again and again, trying to put the crosshairs at the center of rotation of your rig.

    I'm thinking the right way to find the center of rotation is to install a tracker or controller on the rig and drive the sim through its travel, recording the coordinates at many points. Once you do that you can derive the origin from those points and type it in exactly.

    My rig has 360 degrees of yaw, so I don't want the origin to be off even a little, because it shows up as a wobble during the travel. Also, my rig doesn't have any visual indicator that marks the center of rotation, so I'm just kind of guessing. It seems to work OK for +/-8 degrees of pitch and roll, but I'm building the rig to go farther, so that will probably be an issue too.


    If I had a way of seeing the tracker location while it's moving through its travel, then I could grab the points, plot them in CAD, and find the center quite accurately with geometry, then type the coordinates back in. It's some work, but it only needs to be done once per installation (if you move your sim you must do this again).
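    If someone wanted to skip the CAD step, the same geometry can be done numerically: record the tracker's two horizontal coordinates while the rig sweeps through yaw and run a least-squares circle fit over them; the fitted centre is the yaw axis. Rough sketch (C#, since that's what Dirty is working in; mapping the axes to OVRMC's convention is up to you):
    Code:
    using System.Collections.Generic;
    using System.Linq;

    static class CircleFit
    {
        // Least-squares ("Kasa") circle fit: returns the best-fit centre of the
        // circle traced by the recorded points, i.e. the centre of rotation.
        public static (double cx, double cy) FitCenter(IReadOnlyList<(double x, double y)> pts)
        {
            double xMean = pts.Average(p => p.x);
            double yMean = pts.Average(p => p.y);

            // Sums over the mean-centred coordinates.
            double suu = 0, svv = 0, suv = 0, suuu = 0, svvv = 0, suvv = 0, svuu = 0;
            foreach (var (x, y) in pts)
            {
                double u = x - xMean, v = y - yMean;
                suu += u * u; svv += v * v; suv += u * v;
                suuu += u * u * u; svvv += v * v * v;
                suvv += u * v * v; svuu += v * u * u;
            }

            // Solve the 2x2 system  [suu suv; suv svv] * [uc; vc] = [rhs1; rhs2].
            double rhs1 = (suuu + suvv) / 2.0;
            double rhs2 = (svvv + svuu) / 2.0;
            double det = suu * svv - suv * suv;
            double uc = (rhs1 * svv - suv * rhs2) / det;
            double vc = (suu * rhs2 - suv * rhs1) / det;

            return (xMean + uc, yMean + vc);
        }
    }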

    I actually installed Unity because I saw that you could see the location of the controllers, trackers and headset in real time. Unfortunately the coordinates in Unity are different from the coordinates in OVRMC. Even the number of decimal places is different (three in OVRMC rather than two in Unity).


    (Now I'm rambling.) On top of the origin problem, I am also on a journey to get encoder data into OVRMC alongside the virtual tracker data in Mover. So I am trying to learn how to read the data from an Arduino (which is watching the absolute encoder via SPI) into some program I write (in C++ probably?), so that I can use the custom hook input in Mover to bring the encoder data in and send it out with the other DOFs to OVRMC. Not sure it's remotely the same problem you are facing writing your own MMF into OVRMC, but I feel a kinship with your challenge.
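    For what it's worth, the PC-side read of the Arduino can be a very small program. A C# sketch of just that part (port name, baud rate, message format and encoder resolution are all made up, and the Mover custom-hook side isn't covered here):
    Code:
    using System;
    using System.IO.Ports;

    class EncoderReader
    {
        static void Main()
        {
            // Match these to whatever the Arduino sketch uses.
            using var port = new SerialPort("COM3", 115200) { NewLine = "\n" };
            port.Open();

            while (true)
            {
                // Assumes the Arduino prints one absolute encoder count per line.
                string line = port.ReadLine();
                if (int.TryParse(line, out int count))
                {
                    double yawDegrees = count * 360.0 / 4096.0; // assuming a 12-bit encoder
                    Console.WriteLine(yawDegrees);
                }
            }
        }
    }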
  5. cfischer

    Oh, and re-reading your post I see that you think the origin shouldn't matter. It won't matter for translation, but it will for rotations.
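    To put a rough number on it: if the origin is off the true yaw axis by 5 cm, a 90° yaw drags the compensated view sideways by about 2 · 0.05 m · sin(45°) ≈ 7 cm, and over a full 360° sweep the view traces a circle 10 cm across. That's the wobble I mean.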


    This is what you see in VR. For translations you need to rotate the yaw to line up with your rig.
    (screenshot)

    Here is where you input the origin coordinates and rotations to line up with your rig.
    (screenshot)
  6. herrylauu

    herrylauu New Member

    Joined:
    Nov 8, 2024
    Messages:
    1
    Balance:
    16Coins
    Ratings:
    +1 / 0 / -0
    My Motion Simulator:
    DC motor
    If you're working with Oculus devices, the Oculus SDK provides built-in features for motion and position tracking using their sensors. The SDK supports both controllers and hand tracking.