Right now AR is augmenting reality with fake shit: you see the real world and can put non-real things in it. What I’d like is the reverse: to be in a fake world and bring real-world objects into it.

Like, say I’m in VRChat just sitting somewhere hanging out; it would be cool if I could bring my drink, my food, my vape, etc. into the game so I don’t have to fumble around for them with my hands. I haven’t found anything that does this, though… :/

    • 🇰 🔵 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net (OP) · 3 months ago

      I can technically accomplish what I want with a more complicated setup using my PC, a camera, and some printed-out QR tags, since there are a couple of FOSS tools on GitHub specifically for adding tracking to anything (a rough sketch of that approach is below). But I keep coming back to the fact that the HMD itself (in my case, a Quest 3) has hand tracking that works well enough that it should be able to do what I’m asking here without too many additional resources.
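
      The FOSS tools aren’t named above, so purely as an illustration of the “camera plus printed tags” route, here is a minimal sketch using OpenCV’s ArUco fiducial markers, one common way to add tracking to arbitrary objects. The marker IDs, camera index, and OpenCV version (4.7+ for the ArucoDetector API) are all my assumptions, not the OP’s actual setup.

      ```python
      # Illustration only: marker-based tracking with OpenCV's ArUco module.
      # Print a marker, tape it to the drink/vape/etc., and point a webcam at it.
      import cv2

      dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
      detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

      cap = cv2.VideoCapture(0)  # assumed webcam index
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          corners, ids, _rejected = detector.detectMarkers(gray)
          if ids is not None:
              # Each marker ID could be mapped to an object (e.g. 0 = drink,
              # 1 = vape) and its corners forwarded to an in-game overlay.
              cv2.aruco.drawDetectedMarkers(frame, corners, ids)
          cv2.imshow("marker tracking", frame)
          if cv2.waitKey(1) & 0xFF == ord("q"):
              break
      cap.release()
      cv2.destroyAllWindows()
      ```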

  • LanternEverywhere@kbin.social · 3 months ago

    It’s a neat idea that I’ve never heard of before.

    For now you can just bounce into passthrough mode for half a second. On Oculus headsets you can set it so a double tap on the side of the device toggles passthrough on and off.

    • 🇰 🔵 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net (OP) · 3 months ago

      I do, sometimes… I like to turn the boundary off when I’m just sitting somewhere, because even if I’m nowhere near the edge, the boundary lines often pop up and mess with my vision. It sucks that passthrough mode only works when the boundary is on. It doesn’t even make sense: if you’re in passthrough, you don’t need a boundary 🤷🏻‍♂️

  • j4k3@lemmy.world · 3 months ago

    I think you will have the same problem as with 3D printers and accuracy. People don’t realize their 3D printers are precision machines that are not accurate: there is considerable variation in the actual zero-point location of the nozzle tip on the bed. It simply doesn’t matter, because every position is relative to that zero point.

    This is why 3D printers are cheap as hobby tools. When you try to design a printer with two extruder heads that move independently, you need to know the absolute position of each print head, and that position must be accurate at all times. This is a MUCH harder problem to solve.

    In your VR headset you are floating, like a cheap printer: your absolute position is irrelevant. If you want to incorporate external objects, you need to know absolute position. Computationally that is a larger problem, but it also requires a much more expensive sensor system, especially if you want to triangulate location from sensors mounted on the headset, where the separation distance is small (see the sketch below). I think you would really struggle to incorporate that and keep a half-decent battery-life-to-weight balance.
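
    A rough way to see why a small sensor baseline hurts: in standard stereo triangulation, depth comes from Z = f·B/d (focal length times baseline over disparity), so a disparity error of δd translates into a depth error of roughly Z²/(f·B)·δd, which grows as the baseline B shrinks. The numbers below are made up purely to illustrate the scaling, not measurements from any headset.

    ```python
    # Hypothetical numbers: 0.5 m object distance, ~450 px focal length,
    # half-pixel disparity noise. Only the scaling with baseline matters here.
    def depth_error_m(z_m: float, baseline_m: float, focal_px: float, disp_err_px: float) -> float:
        """Approximate stereo depth uncertainty: dZ ~ Z^2 / (f * B) * dd."""
        return z_m ** 2 / (focal_px * baseline_m) * disp_err_px

    for baseline_m in (0.10, 0.02):  # 10 cm rig vs ~2 cm headset-scale spacing
        err_mm = depth_error_m(0.5, baseline_m, 450.0, 0.5) * 1000
        print(f"baseline {baseline_m * 100:.0f} cm -> ~{err_mm:.1f} mm depth error")
    ```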

  • EP51L0N@sh.itjust.works · 3 months ago

    I feel like that would be significantly more difficult to implement. You’d have to constantly track the objects, cut out a portion of the gameplay view to render them, and also make sure they don’t overlap with anything in the game world that would hide them. I think it’d be better for them to add a slider that changes the amount of passthrough, so you can see both the game world and the real world. (Facebook showed a demo of this in one of their Quest ads, but has yet to deliver.)

    • 🇰 🔵 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net (OP) · 3 months ago

      They already do this with your hands, and those change shape. It shouldn’t be harder to track a fairly static object using the cameras and software it already uses for hand tracking. It would just have to be trained on those objects, since I assume it uses AI to handle some of this.

      • EP51L0N@sh.itjust.works · 3 months ago

        Oh I thought you meant passing the object itself through the playspace. I guess something like that could work, but again I don’t know exactly how everything works.

  • secret300@lemmy.sdf.org · 28 days ago

    That would honestly be dope. It kinda reminds me of a YouTube video I saw where a dude recreated his room in VR so he wouldn’t bump into things and could easily walk to his bed or chair without taking the headset off.