Human Computer Integration Lab 

June 2022 - Present

What is it?

The Human Computer Integration Lab is an engineering lab in the University of Chicago's Department of Computer Science, run by Professor Pedro Lopes. The lab focuses on integrating technology with the human body, typically by creating new wearable and haptic devices, often with virtual/augmented reality applications. The lab consists almost entirely of graduate students, with only a few select undergraduates.

What do I do?

As a computer science major, I contribute to the lab's software. I primarily create simulations that showcase the lab's various haptics projects. These simulations are typically in virtual or augmented reality, though I have also helped with some physical demonstrations. Additionally, I have worked on other software needed to get the haptic devices up and running, including wireless communication scripting and low-level Arduino logic.

Projects

Note: I do not own any of the code written for these projects, so I cannot post anything publicly on the internet, and I am limited in what I can share. If you would like to see some of my code for any of these projects, please email me at jserf02@gmail.com and I can send small snippets of what I've worked on.

Stick and Slip - Fall 2022 - Spring 2023

[Video: Stick and Slip full demo]

The Stick and Slip is a surface haptics device that modulates friction by dropping small amounts of liquid onto users' fingertips. We chose specific non-toxic fluids that evaporate quickly to simulate both reduced and increased friction. More information about the device's hardware can be found in the video above.

For this project, I implemented the entirety of the Augmented Reality demo: a racing game where users control a car with their fingers and drive it along a constantly evolving racetrack. As users drive, the track changes in various ways, including getting slippery from ice, slimy from a giant slime monster, and sticky from bees. During each of these effects, the Stick and Slip device activates so the user can physically feel the changes in the track. Examples of some of these effects are shown in the video above.
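To give a sense of how this kind of surface-to-haptics mapping can be wired up in Unity, here is a minimal C# sketch. Every name here (TrackSurface, TrackSegment, StickAndSlipDevice) is a hypothetical stand-in, not the lab's actual code:

```csharp
using UnityEngine;

// Hypothetical surface categories for the racetrack.
public enum TrackSurface { Normal, Ice, Slime, Honey }

// Attached to each piece of track to label its current surface.
public class TrackSegment : MonoBehaviour
{
    public TrackSurface surface;
}

// Stand-in for the component that actually commands the hardware.
public class StickAndSlipDevice : MonoBehaviour
{
    public void SetSurface(TrackSurface surface)
    {
        // The real demo would dispense the fluid mapped to this surface;
        // here we just log the change.
        Debug.Log($"Surface changed to {surface}");
    }
}

// Attached to the car: watches the track underneath it and notifies the
// device whenever the surface type changes.
public class TrackFrictionRelay : MonoBehaviour
{
    public StickAndSlipDevice device;
    private TrackSurface current = TrackSurface.Normal;

    void Update()
    {
        // Raycast straight down from the car to find the segment beneath it.
        if (Physics.Raycast(transform.position, Vector3.down, out RaycastHit hit, 1f))
        {
            var segment = hit.collider.GetComponent<TrackSegment>();
            if (segment != null && segment.surface != current)
            {
                current = segment.surface;
                device.SetSurface(current);
            }
        }
    }
}
```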

Jumpmod Haptic Backpack - Summer 2022, Accepted and Demoed at CHI 2023 (I am a co-author!), Will Soon Demo at SIGGRAPH 2023

[Video: Jumpmod Escape Room Let's Play]

TL;DR: Watch the video above, which showcases the VR demo scene I made on my own for the backpack.

The Jumpmod backpack is a haptic backpack that enhances users' perception of vertical acceleration, especially during jumps. The backpack moves a weight up and down the user's back at high speed on demand, which can create a variety of effects, including, but not limited to, the feeling of a higher jump or a harder landing. The PhD student I worked with on this project, Romain Nith, concentrates more heavily on the hardware, so I helped him by writing the majority of the software for this project.

One contribution I made was writing and refactoring all of the code for controlling the hardware, including creating an OSC + BLE communication protocol so the hardware can easily communicate with Unity demos. After multiple components were switched out in a new revision of the hardware, much of the original code became obsolete, so I stepped in and updated everything. These updates included complete reworks of preexisting functionality to improve performance, ease of use, and readability. I also refactored the code for the old VR demo to fix numerous detrimental bugs and to make the demo work with the new communication protocol.
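As an illustration of what the Unity side of such a protocol can look like, here is a minimal sketch of an OSC-over-UDP sender. The message address, argument, host, and port are all hypothetical; the actual protocol's message layout is not public:

```csharp
using System;
using System.Collections.Generic;
using System.Net.Sockets;
using System.Text;

// Minimal OSC message sender over UDP. OSC strings are null-terminated
// and padded to a multiple of 4 bytes; numeric arguments are big-endian.
public static class OscSender
{
    private static readonly UdpClient udp = new UdpClient();

    public static void Send(string host, int port, string address, int value)
    {
        var bytes = new List<byte>();
        bytes.AddRange(OscString(address)); // e.g. "/jumpmod/effect" (hypothetical)
        bytes.AddRange(OscString(",i"));    // type tag: one int32 argument
        bytes.AddRange(BigEndian(value));
        udp.Send(bytes.ToArray(), bytes.Count, host, port);
    }

    private static byte[] OscString(string s)
    {
        // Pad to the next multiple of 4, leaving at least one null terminator.
        int padded = (s.Length / 4 + 1) * 4;
        var buf = new byte[padded];
        Encoding.ASCII.GetBytes(s, 0, s.Length, buf, 0);
        return buf;
    }

    private static byte[] BigEndian(int value)
    {
        var b = BitConverter.GetBytes(value);
        if (BitConverter.IsLittleEndian) Array.Reverse(b);
        return b;
    }
}
```

A demo script could then fire an effect with something like OscSender.Send("127.0.0.1", 9000, "/jumpmod/effect", 1), with a bridge relaying the command over BLE to the microcontroller.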

Another contribution I made is two jump prediction algorithms that are vital for timing the device's haptic effects: a 100% accurate algorithm for VR applications and a less accurate algorithm for non-VR applications.

The VR algorithm uses the headset's head tracking to directly determine the user's position on a jump curve. Whenever the position reaches a key point on the curve, an event fires that can trigger haptics or anything else in the VR demo.

The non-VR method uses the accelerometer from the IMU on the microcontroller attached to the backpack to predict the same key phases of a jump. However, this method is less precise than the VR method, both because the accelerometer on the backpack was incredibly inaccurate and because none of the jump phases can be determined directly from acceleration; they require velocity or position instead. To estimate velocity and position, the algorithm uses semi-implicit Euler integration, though error accumulates over time because acceleration is only sampled at discrete time intervals. To combat this, the algorithm adjusts its estimates at certain points, though there is no perfect way to prevent this error. I validated the effectiveness and accuracy of both algorithms by running a user study with 12 participants and comparing each algorithm's output to a post-jump analysis of the participants' positions (see the paper for more information).
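To make the integration step concrete, here is a small C# sketch of the semi-implicit Euler idea with a drift-correction reset. The thresholds and the apex test are hypothetical; the real algorithm described in the paper is more involved:

```csharp
using System;

// Estimates vertical velocity and height from accelerometer samples using
// semi-implicit Euler integration, resetting the estimates when the wearer
// appears to be standing still to keep accumulated error bounded.
public class JumpEstimator
{
    public float Velocity { get; private set; }
    public float Height { get; private set; }
    public bool ApexDetected { get; private set; }

    private const float Gravity = 9.81f;         // m/s^2
    private const float RestTolerance = 0.3f;    // m/s^2, hypothetical
    private const float AirborneHeight = 0.05f;  // m, hypothetical

    // Call once per sample; accel is vertical acceleration including
    // gravity, dt is the time since the previous sample.
    public void Step(float accel, float dt)
    {
        float net = accel - Gravity;
        float prevVelocity = Velocity;

        // Semi-implicit Euler: update velocity first, then integrate
        // position using the *new* velocity.
        Velocity += net * dt;
        Height += Velocity * dt;

        // Example phase event: the apex is where upward velocity crosses
        // zero while the wearer is clearly airborne.
        ApexDetected = prevVelocity > 0f && Velocity <= 0f && Height > AirborneHeight;

        // Drift correction: a near-1g reading close to the ground suggests
        // the wearer is standing still, so zero out the accumulated error.
        if (Math.Abs(net) < RestTolerance && Height < 0.02f)
        {
            Velocity = 0f;
            Height = 0f;
        }
    }
}
```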

The largest contribution I made is a VR demo game, built in the Unity game engine, that dynamically showcases many of the backpack's features. This VR game is now the HCI Lab's primary means of demoing the project, and it has already been used in another 12-participant user study to measure the impact and enjoyability of the haptic effects. I even had the chance to present this VR scene at Argonne National Lab! The game is an escape-room platformer where all of the jumping actions used to solve the puzzles trigger haptic effects in the backpack. These effects include a jump boost when the user uses a jump pad, a hard landing when the user jumps from high up, an elevator effect to suggest the elevator's movement between levels, a slight vibration to suggest the presence of a ghost, and more. A first-person playthrough of the game is shown in the video above.
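As a sketch of how one of these effects might be wired up in Unity, here is a hypothetical jump pad that launches the player and fires the matching backpack effect. BackpackHaptics and the effect name are illustrative stand-ins, not the demo's real API:

```csharp
using UnityEngine;

// Stand-in for the component that relays effect commands to the backpack
// (for example, via the OSC layer sketched earlier).
public class BackpackHaptics : MonoBehaviour
{
    public void Play(string effect)
    {
        Debug.Log($"Backpack effect: {effect}");
    }
}

// Launches the player upward when they touch the pad, then fires the
// jump-boost haptic effect. Names and values are illustrative.
[RequireComponent(typeof(Collider))]
public class JumpPad : MonoBehaviour
{
    public BackpackHaptics haptics;
    public float launchVelocity = 8f;

    void OnTriggerEnter(Collider other)
    {
        var body = other.attachedRigidbody;
        if (body == null || !other.CompareTag("Player")) return;

        // Replace the player's vertical velocity, then cue the haptics so
        // the physical boost lines up with the virtual one.
        body.velocity = new Vector3(body.velocity.x, launchVelocity, body.velocity.z);
        haptics.Play("JumpBoost");
    }
}
```

The key design point, per the description above, is that every puzzle interaction that moves the player vertically also cues a matching backpack effect, so the physical sensation stays synchronized with what the player sees.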