Categories
minecraft technology virtualreality

We’re going to Disney! … in Minecraft VR

Welcome to Disney in Minecraft in Virtual Reality!

Disclaimer: This guide involves sideloading apks onto your Meta Quest 3. This is not officially supported by Meta and can go horribly wrong, to the point where you may have to factory reset your device – if you are not comfortable with this, stop here!

Requirements:

  • A purchased Java edition of Minecraft
  • A Meta Quest 2, 3 or 3s
  • Access to QuestCraft at the most recent version, at the time of writing 1.21.11
  • SideQuest installed on your PC, and possibly a USB cable if you want to connect your Quest to your PC in order to install apk files

In April our family went to Disneyworld and had a blast. When we came back, my young daughter was gloomy that the trip was over. Not to worry, with Disney in Minecraft I was able to bring our vacation to her!

While she is too young to wear a Meta Quest 3, she enjoys riding on the virtual rides on her PC in Imagine Fun, a family friendly, extremely detailed and faithful replica of Disneyland in Minecraft. Lucky me, I get to join her in virtual reality via QuestCraft.

To join the server, I use the current version of QuestCraft, which I sideload onto the Quest using SideQuest. If you have not sideloaded an Android package (apk file) before, I recommend following SideQuest’s setup guide. QuestCraft shows up under “Unknown Sources” in the Application launcher after it is installed.
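Under the hood, SideQuest drives the install with adb (the Android Debug Bridge). If you are comfortable on the command line, the same step can be sketched like this – the apk filename and path are examples, and it assumes Developer Mode is enabled on the headset and it is plugged in over USB:

```shell
# example path - use wherever you saved the QuestCraft apk
APK="$HOME/Downloads/questcraft.apk"

# the headset should appear in this list as "device"
adb devices

# push and install the apk onto the Quest
adb install "$APK"

echo "sideload attempted for $APK"
```

If adb is not on your PATH, you can launch it from the folder where SideQuest installed it.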

I subscribe to the QuestCraft Patreon so I benefit from more frequent and recent updates. You do not have to do this to enjoy Imagine Fun, so long as you have a version of QuestCraft that includes 1.21.11 or higher.

How to join the Imagine Fun server

Once you have opened QuestCraft on your headset, selected the most recent version supported by QuestCraft and clicked “Play”, the game will load.

Click “Multiplayer”, then add the server.


The server address is

mc.imaginefun.net

Add the server and click “Join Server” to join.

Click “Proceed” when prompted to accept the server’s resource pack download. Make sure you have the resource pack setting set to prompt or auto-download.

It will take about a minute to join the server and the scene may freeze. Don’t worry, this is perfectly normal!

Once you join the server, you may want to tweak a few settings to maximize your experience. I turn my rotation increment to “Smooth” to disable snap turning and avoid a jerky spin when I turn my view. To do this, use the settings button on the left controller to open the Minecraft menu.

Click Options, then use the trigger button on the right controller to select VR Settings then Locomotion Settings.

I also calibrate my height to my elevation while sitting down – this is the button on the very first Minecraft menu, below “Options” and above “Come back soon.”

Finally, I adjust the world size, which is under VR Settings. I enjoy the server while seated – of course, you are welcome to experience it while standing too!

Advanced: How to enable Audio through the Headset

I was frustrated with the default audio experience on the headset: you have to generate an audio streaming link through the chat with the /audio command. On a PC this is a natural flow, since you can click the link and it will open a tab in your browser that pairs your movements with audio from your in-game location. Pretty cool, right? Not so cool on the Quest, since trying to paste the link into a browser there will freeze your game.

I came up with a workaround for this that uses my Android phone as a relay for the audio. For what it’s worth I do not know if this would work on a non-Android phone or a PC.

To be fair, while it is a much better quality audio experience once you get used to it, there is a definite possibility that the Imagine Fun folks will add in-server audio any time now and this method will become unnecessary. It will probably take you some time to get it working – so only dive in if, like me, you are relatively tech savvy and really would prefer an elaborate workaround to typing the address into a browser every time!

I sideloaded KDE Connect and AudioRelay onto the Quest using the same approach as I did with QuestCraft. Note that to use this approach you need the actual apk files. You can’t install Android apps from the Google Play store directly onto the Quest; you need to find and download the apk files from an apk mirror and sideload them. On the Android phone, however, you definitely can and should install these apps via the Google Play store.

KDE Connect allows you to send clipboard contents from the Quest to your Android phone so you don’t have to type out the audio link on your phone every time you get one.

AudioRelay allows your Android phone to mirror its sound to the Quest. The idea is you copy the audio link from the Quest to the phone, then transfer the phone’s audio back to the Quest. Note that only the default browser, not Chrome, appeared to work with AudioRelay.

But how to set this up exactly once in game?

First you must pair the Quest and the phone in both KDE Connect and AudioRelay. This is a one-time setup.

After installing KDE Connect and AudioRelay on both the Quest and the phone, join the Imagine Fun server and move to the side so you can see what you’re doing. Everyone spawns in the same location, which gets crowded quickly.

To generate the audio link, the easiest way is to use the “IFone”. You can also use a Magic band, which is what I use in the screenshots. The difference is purely cosmetic.

This “IFone” object auto-spawns in your inventory and is your way to interact with the park. Press the Y button on the left controller to bring up your inventory. Use the controller grip buttons to highlight the IFone, then use the left trigger to open its interface, which is an inventory-like menu. Use the right controller to click the “volume” icon, and this will post the link in your chat.

Quickly, before it scrolls up, press and hold the “B” button on the right controller to open your radial menu. You should have an “Open Chat” option, so click that with a right trigger press.

Use the right controller to click the “here” in the audio link where it says “Click *here* to open the web client”. They mean the audio web client.

Click “Copy to Clipboard”, not “Yes” or “Cancel”.

At this point make sure you have exited the chat – click the chat input and then click “Esc” on your floating keyboard to close the chat before moving to the next step.

Now that you have the link on your clipboard, use the Meta button to open the Quest interface. This will temporarily freeze your QuestCraft game. Don’t worry, once you exit the Meta menu the QuestCraft game will resume.

Click the nine-dot application launcher and choose “Unknown sources”, then click “KDE Connect”. Click the three dots on the top right corner. Click “Send clipboard”.

Without taking off your headset (lift it up slightly, or however you need to do it), paste the link – which should now have been transferred from the Quest to your phone – into the default browser on your Android phone.

A page will come up to establish the audio connection; click the big button to confirm and the phone will start playing the in-game audio. First part achieved!

Now to send the audio back to the Quest …

On your phone, open AudioRelay and click the Server tab.

Click the “Apps” section, and share the browser window you opened.

Now it should be available for the Quest to pick up.

Back on the Quest, if you need to hit the Meta Quest button to reopen the Meta menu again do so, then open the Unknown Sources applications again and open AudioRelay.

In the Player tab, select the phone. You should now hear the audio coming through the Quest speakers! Click the X on the bar below the AudioRelay application to close it, and the audio will continue playing while you are in the world of Imagine Fun.

Keep the phone from locking – extend the screen timeout as much as you can and keep it handy to unlock if needed.

One more nugget of advice – the Disneyland park map spawns in your offhand, and it is surprisingly difficult to remove because the Minecraft keybinds do not map perfectly to the controller buttons. If you do not remove it, it will constantly remain in your offhand, which can be distracting on rides.

To fix this without a keybind, aim your right controller at your hotbar displayed along your in-game left arm and hold down the trigger button to focus the individual squares in the hotbar. Play with it until you highlight the offhand square, then you can drag the map away from your offhand. Weird how the simplest things can be the hardest to accomplish …

Playing with my daughter in Minecraft from VR is fun already, but spicing it up with a trip to Disney makes it all the more magical! Hope this helps you make some memories too!

Categories
raspberrypi technology virtualreality

Control your Raspberry Pi 5 from your Meta Quest 3 over Bluetooth

The Meta Quest 3 is capable of so much since it is an Android-based device that can leverage the existing Android app ecosystem – if you know how to do it! Let’s control our Raspberry Pi 5 from within virtual reality over Bluetooth.

Keep in mind that using Bluetooth to pair these two devices limits our ability to connect from a physically distant location. I personally prefer this, since exposing it over, say, SSH would open it up to potential exploits over the Internet. I wouldn’t trust an SSH private key to an Android app – if you see one asking for that, I would recommend against using it. For my use case, Bluetooth works perfectly well since I always use my Pi when I’m in close proximity to my Quest.

Disclaimer:

In this tutorial I endorse the use of Bluedot, an Android app by the tech rockstar Martin O’Hanlon. The app is MIT licensed, which means you are free to use it for private or commercial purposes and even redistribute it, but it comes with no warranty and the author accepts no liability.

Prerequisites:

  • Follow this video tutorial or this pdf tutorial and have SideQuest downloaded, installed and connected to your Quest 3. Note that you will need the “advanced” version of SideQuest in order to install apk files not distributed by the SideQuest store.
  • You will need to download the apk file from the Bluedot Github repository – click here for the direct download link, or here for the repository link
  • You should be familiar with how to sideload an apk file (not from the SideQuest store). See page 7 of the pdf tutorial for a refresher
  • Raspberry Pi 5 with appropriate accessories
    • an SD card with the Raspberry Pi OS and enough space for this exercise – 50 GB or more should be plenty
    • a power supply that outputs 27W that connects via USB-C
    • a micro HDMI cable with an output compatible with a monitor of your choice
    • a mouse and keyboard that can connect via USB type A (the standard big type) or Bluetooth

If you are brand new to the Raspberry Pi, there is a great “Getting Started” tutorial at https://www.raspberrypi.com/documentation/computers/getting-started.html . This tutorial assumes you have already gone through these steps and have your Pi powered, booted up, configured with your login details, and connected to Wifi .

Steps:

1. Sideload the Bluedot apk onto the Meta Quest 3

2. Boot up your Pi and ensure that Bluetooth is enabled – on mine it came up enabled automatically.
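If you want to check from the terminal instead, here is a quick sketch (it assumes the stock Raspberry Pi OS with the BlueZ Bluetooth stack):

```shell
# report whether the Bluetooth service is running
status=$(systemctl is-active bluetooth 2>/dev/null || echo "unknown")
echo "bluetooth service: $status"
```

If it reports inactive, sudo systemctl enable --now bluetooth should bring it up.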

3. Create a Python script on your Pi to listen for Bluedot button presses. Here is a simple one from the Bluedot docs that says “Hello World” when the button is pressed:

from bluedot import BlueDot
from signal import pause

def say_hello():
    print("Hello World")

bd = BlueDot()                 # starts a Bluetooth server and waits for the app
bd.when_pressed = say_hello    # callback fired whenever the dot is pressed

pause()                        # keep the script alive

4. Install the bluedot dependency, using a virtual environment as described in this tutorial (signal is part of Python’s standard library and does not need installing). Keep in mind that using pip to install packages into the Pi’s system environment directly may break system packages.
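That step looks roughly like this – the environment name myenv is just an example:

```shell
# create and activate an isolated environment, then install bluedot
python3 -m venv myenv
. myenv/bin/activate
pip install bluedot

# sanity check: signal comes with the standard library, no install needed
python -c "import signal; print('ok')"
```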

5. Run the Python script on the Pi. Unless this script is active and running, Bluedot on the Pi will not be able to pair with the Bluedot app on the Quest 3

(myenv) pi@raspberrypi:~ $ python3 myrecipe.py

6. Back on the Quest, turn on Settings > Bluetooth. It should automatically begin scanning for available devices

7. On the Pi, click the Bluetooth icon and click “Make Discoverable”.

The icon should begin flashing

8. On the Quest you should see “raspberrypi” as an available device. When you click it, it will give you a pairing code. Back on the Pi, click the pop-up for pairing. On the Quest, click “Pair” as well. I found it easier to click the pairing pop-up on the Pi first and then on the Quest. If you happen to do it in the reverse order clicking “Pair” on the Quest first, you will need to click “Stop Discovery” on the Pi to get it to recognize the pairing.

The Quest and Pi should be paired via Bluetooth at this point.

9. Open the Bluedot app from App Library > Unknown Sources

10. Select the raspberrypi from the list of devices.

It should pop up with a screen that says “Connected” and a big blue dot

11. Click the dot, and check the Pi to see if it says “Hello World”

12. If it does, congratulations, you have connected your Quest 3 to your Pi 5!
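Once the round trip works, the same pattern extends to real hardware. Here is a hypothetical sketch, not from the Bluedot docs, that toggles an LED when the dot is pressed – it assumes the bluedot and gpiozero packages are installed on the Pi and an LED (with a resistor) is wired to GPIO 17:

```python
from signal import pause

def main():
    # hardware-specific imports live inside main so the module can be
    # inspected on machines without the bluedot/gpiozero packages
    from bluedot import BlueDot
    from gpiozero import LED

    bd = BlueDot()   # starts the Bluetooth server and waits for the app
    led = LED(17)    # GPIO pin 17 is an assumption - match your wiring

    bd.when_pressed = led.on    # dot pressed  -> LED on
    bd.when_released = led.off  # dot released -> LED off
    pause()                     # keep the script alive

# on the Pi, call main() to start listening for presses
```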

In this tutorial we learned how to connect a Raspberry Pi 5 to a Quest 3 over Bluetooth, and click a button to execute an action on the Pi from within virtual reality.

Enjoy!

Categories
raspberrypi technology virtualreality

Adventures with Immersed VR on the Pi 5

Raspberry Pis are cool. Virtual Reality, also cool.

Let’s make a Raspberry Pi Virtual Reality coolness sandwich!

To be clear, this tutorial is about getting an Immersed VR virtual monitor working with the Pi 5. It is NOT about running a VR simulator with the Pi 5 – if you want that you can get some guides with a Google search on “DIY VR on raspberry pi.”

Disclaimer:

This tutorial makes use of a prototype build compatible with the Raspberry Pi which the Immersed VR team made available to me through Discord. It is not yet available on their downloads page and may not be compatible with all headset/raspberry pi combinations. YMMV.

Shout-Out:
This tutorial would not have been possible without the inspiring work done by Augusto Ícaro. Check out https://github.com/augustoicaro/Immersed-Linux-Virtual-Monitors and be prepared to go ooooh-ahhhhh 🙂

Prerequisites:

  • An Immersed VR account – their free starter mode should work, although you can also earn Pro by using Immersed in starter mode at least 3 out of every 7 days. More on that at https://immersed.com/faq#how_to_unlock_pro_mode
  • Access to the Immersed VR Discord – invite available at https://discord.com/invite/zy6KMbJ – in order to request and download the prototype build. Check out the #linux-help channel for the download and any assistance you might need
  • Headset compatible with Immersed VR – you can see their compatibility statement at https://immersed.com/faq under the “Compatibility” heading
  • Raspberry Pi 5 with appropriate accessories
    • an SD card with the Raspberry Pi OS and enough space for this exercise – 50 GB or more should be plenty
    • a power supply that outputs 27W that connects via USB-C
    • a micro HDMI cable with an output compatible with a monitor of your choice
    • a mouse and keyboard that can connect via USB type A (the standard big type) or Bluetooth
  • Wifi connectivity – Immersed VR works over your wifi network. Both the Pi and the headset should be connected to the same wifi network.
    • Note that it is advisable to use a private wifi network rather than a public one, since public ones are vulnerable to hacking and could result in your session being compromised

If you are brand new to the Raspberry Pi, there is a great “Getting Started” tutorial at https://www.raspberrypi.com/documentation/computers/getting-started.html . This tutorial assumes you have already gone through these steps and have your Pi powered, booted up, configured with your login details, and connected to Wifi .

All the following steps should be executed on the Pi itself, except for the scp command which would be from a separate computer.

Steps:

  1. Download the app image from the Discord #linux-help Immersed VR channel to your Pi. If you don’t want to install Discord on your Pi, you can also download it on a separate computer that has Discord and use secure copy (scp) to transfer it to the Pi
    To use this second method, first copy your main computer’s public ssh key to the Pi as ~/.ssh/authorized_keys. A sample scp command to transfer the AppImage would then be:
scp -i mylaptop_private_ssh_key /Users/mylaptopuser/Downloads/Immersed-aarch64.AppImage mypiuser@192.168.1.165:/home/mypiuser/Downloads

2. Once the App Image is on the Pi, grab the Linux setup script

wget https://raw.githubusercontent.com/augustoicaro/Immersed-Linux-Virtual-Monitors/main/scripts/immersed-setup/immersed-setup.sh

3. This script assumes your app image is in /home/mypiuser/Applications. Either create that directory and move both the setup script and the app image into that directory, or modify the setup script to find these resources wherever you want them to live.
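For the first option, the commands are along these lines (assuming both files landed in your Downloads folder):

```shell
# create the directory the setup script expects
mkdir -p ~/Applications

# move the AppImage and the setup script into it
mv ~/Downloads/Immersed-aarch64.AppImage ~/Applications/
mv ~/Downloads/immersed-setup.sh ~/Applications/

# confirm both files are in place
ls -l ~/Applications
```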

4. Modify the setup script to use the App Image you downloaded. I had to replace “x86_” with “aarch”. A sample command to do this is:

sed -i -e 's/x86_/aarch/g' ~/Applications/immersed-setup.sh

5. Make the setup script executable

chmod a+x ~/Applications/immersed-setup.sh

6. Put your Applications directory on your PATH by adding this export line to your .bashrc file and then sourcing it to load it in your terminal.

echo 'export PATH=$PATH:$HOME/Applications' >> ~/.bashrc
source ~/.bashrc

7. Install additional packages and make a symbolic link for libwlroots. Note that these versions are specific to the 10.3.2 version of the App Image and may change when this is updated.

sudo apt install -y libfuse2 libwebkit2gtk-4.0 libva2 libva-drm2
sudo ln -s /usr/lib/aarch64-linux-gnu/libwlroots.so.11 /usr/lib/aarch64-linux-gnu/libwlroots.so.5

8. Adjust the ~/.ImmersedConf file to set up your v4l2 camera device (to enable the virtual webcam feature). Use the Text Editor to find the line that starts with “CameraDevice” and adjust it to read like so:

CameraDevice=/dev/video7

9. Run the immersed-setup.sh script from the Applications directory.

cd ~/Applications
./immersed-setup.sh


This script may end with an error similar to this:

modprobe fatal module card_label=Immersed not found in directory /lib/modules/6.6.31+rpt-rpi-2712

If so, not to worry: it means the v4l2loopback module hasn’t been loaded into your current session. The v4l2loopback module provides the webcam functionality.

To load the module, restart your Pi with sudo shutdown -r now, or by hitting the power button, letting it power off, and hitting the power button again.
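After the reboot, you can confirm the module is present (v4l2loopback is the kernel module that backs the virtual webcam):

```shell
# list loaded kernel modules and look for the v4l2loopback entry
mods=$(lsmod | grep v4l2loopback || echo "v4l2loopback not loaded yet")
echo "$mods"
```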

10. At this point you should be able to run the setup script and not get any errors. Now it’s time to run the app image!

cd ~/Applications
./Immersed-aarch64.AppImage

Depending on what happened with the v4l2loopback module, the app image may generate a pop-up prompting you to install a few more dependencies. If so, not to worry: we already installed these in step 7, so you can click “Do not display again” on these.

At this point you should be able to sign in with your Immersed VR account and pair the VR app with the Pi. The pop-up should say “Agent Ready” when it is available for the VR app.

11. In the VR app, select “Computers” and click the “raspberry pi” option. It should connect after a few moments. If it does not connect right away, try stopping the AppImage with Ctrl + C and re-running it.

12. View your fancy new monitor in VR!


Congratulations! You now have the ability to connect to your Raspberry Pi 5 in virtual reality!

If you have any questions or get stuck, feel free to ping in the #linux-help channel of the Immersed VR Discord and we’ll see what we can do!

Categories
internet of things technology

Internet of Things Developer Days with Intel

Back in May we ran an Internet of Things Developer Day at the Axeda industry event Connexion with Intel Galileos and Raspberry Pis. A Developer Day is a hands-on workshop where developers get a guided experience with Axeda coaches while connecting microcontrollers to sensors and sending up data. It just so happened an Intel rep was there and had such a great time that we ended up partnering with Intel to organize a road show for the fall. We did one in New Jersey two weeks ago and we have now finished our first one in Silicon Valley.

Our technical team consisted of Kevin Holbrook, Joe Biron, Chris Meringolo, Allen Smith and Haris Iqbal all from Axeda, Howard Alyne from Wind River and Val Laolagi from ThingWorx. Intel supported us on the administrative side so we were able to focus purely on the content.

We had about 70 developers connect Galileos using our Axeda Developer Toolbox, which allows you to pick from any of about 25 Axeda Ready devices and get a self-guided tutorial on how to send data up to our cloud. The Galileos ran a proprietary Wind River version of Yocto Linux, which has cool security features baked in such as application signing and device identity key verification. The baseline for completing the tutorial was a round trip for the data, sending up light, sound, and temperature readings from the board and then triggering an action on the board from the app – in this case a blinking LED and a buzzing buzzer. We sent the developers home with documentation on an advanced path which took them through the ThingWorx dashboard, as well as a sample app that did AJAX calls to the Axeda RESTful web services.

It’s a gratifying experience for me to be able to coach developers past the initial hurdles of connecting a device. One student in particular whom I was helping had a Galileo board that was not able to get a serial connection to his PC over the COM port. I was able to log into the Axeda platform and see the IP address it was sending up as data, which we were then able to use to SSH into the board. A few minutes later he had his first circuit built with an LED, and by entering “blink 5” into the Toolbox app he saw the Axeda agent receive the downstream command and then blink the LED five times. After only a few more minutes he had his buzzer triumphantly buzzing as well, and then high five!

My key takeaway from these events is that developers are hungry to get experience with hardware and learn what the Internet of Things really is beyond the hype. Holding events like this one allows developers to find the meaning behind the buzz words and start laying the foundation for the future of their companies, one LED at a time.

For more information for Axeda Developers, check out http://developer.axeda.com .

Categories
technology

Apology to the participants in my last Newbie 4357

I accidentally deleted the last episode of the Newbie 4357 show on Operator 11 – crap!! I apologize to the people who participated in it – I was trying to delete the episode that had the sound problems.  Not a good night for Op11 directing!!

Please come and visit next week and we’ll have another great show – Fridays at 6:30pm.

Categories
technology

What do all the Doodads Do?

Depending on how much your loved ones spent on you this Christmas, chances are you have a new electronic thingamajig that does some nifty trick.  I want to take a moment to ponder doodads – why are they so much fun, can/should one doodad do everything, and are we getting too dependent?

 First, a Gizmo Review:

Cellphone – be reachable by phone again, as well as receive text messages; oh yeah, and it’s a camera too, because your cellphone, unlike your digital camera, is something you automatically take with you all the time

mp3/mp4 player – the Apple iPod or the Creative Zen, welcome to gigs of storage for music, photos, movies, and stuff, this is an entertainment center – think solitaire on steroids

flash drive – use this as backup for your computer or as an extra drive, and then take your files with you to another computer

Bluetooth – this headset syncs up compatible devices when it comes within range, so whichever computer you’re at knows who you are

 PDAs – no longer public displays of affection, this acronym has been co-opted for personal digital assistant – Blackberry, Treo, Palm Pilot are a few – send your e-mail, access the internet and combine functions of the above devices in one handy device

The funny thing about these doodads is that once you start using them, you start to wonder how you ever functioned without them.  So that brings up my next question – are we Inspector Gadget, with the preparedness of the Boy Scout saving the day with our gear – or are we the Borg, giving up autonomy for the sake of being in sync through our implants?  I would say it depends on why we use these devices.  I don’t worry about my work reaching me on my cellphone because I’m not on call when I’m not at work.  My boss however is on call, and he groans about having to answer e-mail on his handheld device when he’s out of the office.  I don’t like competing with someone’s iPod if I’m trying to speak to him/her, but then again I do like walking to work to a beat.  Then there’s the question of whether these doodads are user-friendly.  I’ve heard plenty of people complain about texting because of the pain of getting the right letters.  I tried Bluetooth and had to return it to the store because I didn’t get the whole press and hold the button thing.  I can see why you’d want to access your e-mail all of the time, but then do you really want to be accessible by e-mail all the time?

The whole tricorder-aspect of the portable device appeals to me.  I like the idea of having flexibility about how I communicate.  And as the flight attendants so wisely tell us, their switches do have an “off” position.

Categories
technology

Mistakes in Logic

In computer science and in life, there are a lot of ways you can screw up the program.  The way I want to talk about is the logic error, the one that has nothing to do with how I phrase the instructions, but rather that occurred because I didn’t create a successful algorithm.  This kind of error has nothing to do with knowledge, but with experience.  No matter how well I know a language, I can still make something silly happen with it. 

So how am I supposed to avoid logic errors?  Testing the algorithm helps, but there is only so much trial and error can do.  Eventually plain simple thought is what actually produces the answer.  Seems obvious right?  Well, if thought were a liquid running through the pipes of my mind, there would be a few ways to clear the way for freer flow. 

Sleep/Exercise – mind-body connection here, my body should be healthy in order for my mind to be

Dealing with Animal magnetism – a term from Christian Science; I use it to cover all the mental failings that don’t really seem to come from anywhere but can become a problem – examples being doubt of my own ability to succeed, depression, loneliness, etc.  (A note upon speaking with a Christian Science practitioner: according to Christian Science, in order to defeat animal magnetism I have to pray to know that God is Mind.)  Dealing with these issues is as simple as refusing to listen to any thought that tells me emotions are in control of my decisions instead of me.  I’m not arguing for a Spock-like repression of emotions – emotions are okay to feel, but reason and consistent decisions should be what lead me.

Willingness to make mistakes – Part of computer programming is the thrill of tackling new challenges.  It’s only a matter of time before I’ll know 80% of it, but there will always be that last remaining 20% that escapes me.  Without the courage to try things, I’ll be stuck trying the same old solutions when what is needed is radical new approaches. 

Listening to other people’s solutions – Letting someone else help me out can be a humbling experience.  The truth is I don’t have all the answers, and somebody out there probably has the one I’m looking for.  I have to be willing to admit my own weaknesses and failings and then, when the opportunity presents itself, take the time to hear what others have learned and are going through.

The one thing a computer isn’t going to do for me is think.  It takes effort, but it is a reward in and of itself.