
Kunal Desai

Posted on • Originally published at kunaldesai.dev

Setting Up a Virtual Desk

I recently noticed that the base Oculus Quest 2 model was being sold for $300. Shocked that it was so cheap, I decided to buy one and try it out. Eight years earlier, I remember attending a Meetup where some engineers were pitching their startup: an IDE in VR. I tried their demo and wasn't very impressed, so I was curious how much had improved since then. It's safe to say, I was shocked.

I ordered the Oculus Quest 2 128GB model, and it shipped to my house in some nice packaging. It came with the headset, a charger, two handheld controllers, and some accessories to make the headset fit better on your face.

I turned on the Oculus, let it charge, and finally got through the setup instructions. I was greeted by a beautiful backdrop with menu options for a browser, messaging (there is an integration with Facebook Messenger), and an app store. Like most people, the first thing I downloaded was Beat Saber.

The game was okay. I played it a couple of times, and it reminded me of Guitar Hero back in the day. I can see why people really enjoy it; I'm just not a huge gamer. The experience was really cool, though. The latency was low enough that the game felt immediately responsive to my actions. Things actually felt real!

After that, I decided I was ready to see how I could set up a coding environment. I found Horizon Workrooms, an application built by Meta aimed at simulating a virtual work environment. It goes well beyond desks and monitors, but I was particularly interested in the monitor setup. As someone who enjoys traveling while working (a "digital nomad," I guess?), one of my biggest pain points was having to pack light and not being able to carry a monitor around with me. As a software engineer, that screen space is a critical piece of my productivity! My hope in buying the Oculus Quest 2 was that I could create many screens of many different sizes without any additional monitors or hardware. That way, I could travel while working and still have access to the same screen real estate I have at home.

To set this up, I needed to download the Oculus Remote Desktop app on my MacBook and install the Horizon Workrooms beta on my Oculus Quest 2. Once that was ready, I connected my Oculus Quest 2 to my laptop and it just. worked. I was shocked. I could see my laptop screen, see my keyboard, and write code in my normal IDE while in VR. The experience was seamless.

One feature I was mesmerized by was the hand gestures. Instead of using the provided controllers, there is an option to just use your hands. I kid you not, I spent at least 10 minutes just moving my hands around in space, feeling how imperceptible the latency was between when I told my hand to move and when it moved in virtual reality. On top of that, I was able to click and pan with my hands and fingers rather than the controllers.

What's missing?

There have clearly been massive improvements in virtual reality technology over the last 8 years, and I'm in awe of the engineering behind them. Nonetheless, there are still some areas for improvement. Here is my wish list:

  1. Solving my remote work monitor problem - The Oculus Quest 2 doesn't yet solve my remote work monitor problem. The virtual screen is limited to two fixed sizes, and I can only view one screen at a time. To have the same screen real estate I have at home, I need to see multiple screens simultaneously, ideally at higher resolutions.
  2. Keyboard recognition - I use an ergonomic Microsoft keyboard at home, and my Oculus wasn't able to recognize it. So when I looked down at my hands, it looked like I was typing in space.
  3. Text blurriness - I experienced some text blurriness while using the Oculus. The text is sharper in some headset positions than in others. I'd like a way to keep the headset more stable, or a software setting to adjust how text is rendered.
  4. Hand gestures - The hand gestures were incredibly low latency, but the tracking wasn't always accurate. At certain distances from the headset, my VR hands wouldn't accurately mirror what my real hands were doing. Additionally, the Oculus would sometimes fail to identify my hand and finger positions when they were close together. For example, sometimes I would touch my fingers together in real life, but my VR hands wouldn't touch.

Man, all these improvements in VR get me so excited for its future. This has definitely sparked a craving to do some engineering work in this space. Similar to Mighty, it scratches my itch for deep, low-level software challenges while being a consumer-facing product with a big mission.
