Human Computer Integration Lab
June 2022 - Present
What is it?
The Human Computer Integration Lab is an engineering lab run by Professor Pedro Lopes in the University of Chicago's Department of Computer Science. The lab focuses on integrating technology with the human body, typically by creating new wearable and haptic devices, often with virtual/augmented reality applications. The lab consists almost entirely of PhD students, with only a few select undergraduates.
What do I do?
As a computer science major with years of programming experience, I contribute to the lab's software at every level. I write high-level software, such as VR scenes used to evaluate and demo devices, as well as low-level software, such as the control logic running on Arduino microcontrollers.
As for my position at the lab, I was originally hired as a 10-week summer intern. Now, I am honored to have been invited by Professor Lopes to continue working at the lab indefinitely.
Note: I do not own any of the code written for these projects, so I cannot post anything publicly on the internet, and I am limited in what I can share. If you would like to see some of my code for any of these projects, please email me at firstname.lastname@example.org and I can send small snippets of what I've worked on.
[Unpublished Project] - Fall 2022 - Present
We are still working on this project, so I cannot post too much information about it yet. However, I can detail some of the work I have done so far.
This project requires an augmented reality demo, so I built the backend systems of a simple AR racing game for the HoloLens. Without going into too much detail, two features I have implemented so far stand out. The first is the ability to resize and position the game map in the real world before the game begins; by default, the HoloLens places applications in the real world relative to the headset's starting position and orientation, which is problematic when certain objects need to sit in specific parts of the real world. The second is a simple racecar AI bot; there are currently multiple implementations of this AI, one of which is detailed here.
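The map-placement idea can be sketched in a few lines: instead of leaving the map where the headset booted up, re-anchor every map point to a user-chosen position and scale. This is a minimal illustrative sketch in Python (the actual demo is Unity/C#), and all names here are my own, not the lab's code.

```python
# Hypothetical sketch of pre-game map placement: map-local points are scaled
# and offset so the map sits at a user-chosen real-world anchor, independent
# of the headset's startup pose. Names and signatures are illustrative only.

def place_map(map_points, anchor, scale):
    """Re-anchor map-local (x, y, z) points to `anchor` at a uniform `scale`."""
    ax, ay, az = anchor
    return [(ax + x * scale,
             ay + y * scale,
             az + z * scale) for (x, y, z) in map_points]

# e.g. a map corner at local (1, 0, 0), anchored 2 m ahead at half size:
corner = place_map([(1, 0, 0)], (2, 0, 0), 0.5)[0]
```

In the real demo this transform would be applied to the map's root object before the race starts, so every child object inherits the placement.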
Jumpmod Haptic Backpack - Summer 2022, Accepted to CHI 2023 (I am a co-author!)
TLDR - Watch the video above: it showcases the VR demo scene I built on my own for the backpack.
The Jumpmod backpack is a haptic backpack that enhances users' perception of vertical acceleration, especially during jumps. The backpack rapidly moves a weight up and down the user's back on command, which can create a variety of effects, including, but not limited to, the feeling of a higher jump or of a harder landing. The PhD student I worked with on this project, Romain Nith, focuses primarily on the hardware, so I helped him by writing the majority of the software for this project.
My first contribution was writing and refactoring all of the code for controlling the hardware, including an OSC + BLE communication protocol that lets the hardware communicate easily with Unity demos. When a new hardware revision swapped out several components, much of the original code became obsolete, so I stepped in and updated everything. These updates included complete reworks of preexisting functionality to improve performance, ease of use, and readability. I also refactored the old VR demo's code to fix several detrimental bugs and to make the demo work with the new communication protocol.
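To give a flavor of the OSC side of such a protocol, here is a minimal encoder for the standard OSC 1.0 wire format (address string, type tags, big-endian arguments, all 4-byte aligned). This is a generic, stdlib-only Python sketch of the format itself; the actual protocol, addresses, and transport code in the project are different and not shown here.

```python
# Minimal OSC 1.0 message encoder (stdlib only). The address "/jump/effect"
# below is a made-up example, not an address from the lab's protocol.
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Pack an address and int/float arguments into an OSC 1.0 message."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)  # 32-bit big-endian float
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)  # 32-bit big-endian int
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# e.g. a hypothetical command telling the backpack to fire effect 1 at 75%:
packet = osc_message("/jump/effect", 1, 0.75)
```

In practice a Unity demo would send packets like this over the network (or, in our setup, bridged to the hardware over BLE), and the firmware side decodes the address to decide which haptic routine to run.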
Another contribution is a pair of jump prediction algorithms that are vital for timing the device's haptic effects: a 100% accurate algorithm for VR applications and a less accurate algorithm for non-VR applications. The VR algorithm uses the headset's head tracking to directly determine the user's position on a jump curve; whenever that position reaches a key point on the curve, an event fires that can trigger haptics or anything else in the VR demo. The non-VR method instead uses the accelerometer of the IMU on the microcontroller attached to the backpack to predict the same key jump phases. This method is less precise for two reasons: the backpack's accelerometer was quite inaccurate, and none of the jump phases can be determined directly from acceleration, each requiring either velocity or position. To estimate velocity and position, the algorithm uses semi-implicit Euler integration, which accumulates error over time because acceleration is only sampled at discrete time steps. To combat this, the algorithm re-calibrates its estimates at certain points, though there is no way to eliminate the drift entirely. I validated the effectiveness and accuracy of these algorithms by running a user study with 12 participants and comparing each algorithm's output to a post-jump analysis of the participants' positions (see the paper for more information).
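The core of the non-VR idea can be sketched as follows: integrate the vertical acceleration into velocity and position with semi-implicit Euler (update velocity first, then position with the new velocity), flag jump phases from those estimates, and re-zero the estimates at a known event to limit drift. This Python sketch is illustrative only; the thresholds, names, and phase definitions are assumptions on my part, not the lab's actual values.

```python
# Sketch of accelerometer-based jump phase detection via semi-implicit Euler.
# accel_samples: vertical acceleration in m/s^2 with gravity already removed.
# takeoff_v is an illustrative threshold, not the project's real value.

def detect_jump_phases(accel_samples, dt, takeoff_v=1.0):
    v = x = 0.0
    airborne = False
    phases = []
    for i, a in enumerate(accel_samples):
        v += a * dt          # semi-implicit Euler: update velocity first...
        x += v * dt          # ...then position using the *new* velocity
        if not airborne and v > takeoff_v:
            airborne = True
            phases.append(("takeoff", i))
        elif airborne and v <= 0.0 and phases[-1][0] == "takeoff":
            phases.append(("peak", i))   # apex: vertical velocity crosses zero
        elif airborne and x <= 0.0 and phases[-1][0] == "peak":
            airborne = False
            phases.append(("landing", i))
            v = x = 0.0      # re-zero estimates to limit accumulated drift
    return phases

# A crude synthetic jump: a brief push-off burst, then free fall.
jump = [200.0] * 2 + [-9.81] * 200
events = detect_jump_phases(jump, dt=0.01)
```

Each emitted phase would trigger the corresponding haptic effect on the backpack (e.g. the weight firing on "landing"); the re-zeroing step stands in for the real algorithm's drift corrections.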
My largest contribution is a VR demo game, built in the Unity game engine, that dynamically showcases many of the backpack's features. This game is now the HCI Lab's primary means of demoing the project, and it has already been used in another 12-participant user study to determine the impact and enjoyability of the haptic effects. I even had the chance to present this VR scene at Argonne National Lab! The game is an escape room platformer where all of the jumping actions used to solve the puzzles trigger haptic effects in the backpack. These effects include a jump boost when the user uses a jump pad, a hard landing when the user jumps from high up, an elevator effect suggesting the movement of the elevator between levels, a slight vibration suggesting the presence of the ghost, and more. A video showcasing a first-person playthrough of the game is displayed above.