View Paper and View Code
I worked on the MADE-Axis during my internship at the Immersive Analytics Laboratory at Monash University from late 2019 (the second year of my undergraduate studies) to the end of 2020. The devices were designed to serve as composable controllers for data visualization in extended reality and conventional 2D formats. Below is a video demonstration of the device from ACM ISS 2021.
The controllers consist of a pair of actuated sliders, a rotary encoder (knob), and a push button.
I contributed to the development of these controllers by programming the firmware that runs on the devices, enabling features such as ranged selection of data and haptic feedback from the sliders. I also developed a Unity program to facilitate Bluetooth and wired (USB) communication between the controllers and a host PC. The repository containing my contributions to this project can be found here.
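The full firmware and Unity bridge live in the linked repository; to give a flavour of the host-side message handling involved, below is a minimal C++ sketch of parsing a line-based state report from the device. The packet format, field names, and value ranges here are assumptions made for illustration, not the actual MADE-Axis protocol.

```cpp
#include <cstdio>
#include <cstdint>

// Hypothetical device state: two sliders, one encoder, one button.
struct AxisState {
    uint16_t slider1;  // assumed 10-bit ADC reading, 0-1023
    uint16_t slider2;
    int32_t  encoder;  // accumulated encoder ticks
    bool     button;   // push button pressed
};

// Parse a line such as "S:512,300 E:42 B:1" into an AxisState.
// Returns true on success. The format is illustrative only.
bool parseStateLine(const char* line, AxisState& out) {
    unsigned s1, s2, b;
    int e;
    if (std::sscanf(line, "S:%u,%u E:%d B:%u", &s1, &s2, &e, &b) != 4)
        return false;
    out.slider1 = static_cast<uint16_t>(s1);
    out.slider2 = static_cast<uint16_t>(s2);
    out.encoder = e;
    out.button  = (b != 0);
    return true;
}

int main() {
    AxisState st{};
    if (parseStateLine("S:512,300 E:42 B:1", st))
        std::printf("slider1=%u slider2=%u encoder=%d button=%d\n",
                    st.slider1, st.slider2, st.encoder, st.button);
    return 0;
}
```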
In addition to developing the software for the devices, during 2020, when I worked part-time with the lab, I explored the design space offered by these controllers by creating demo applications that showcase interactions unique to the devices. These included an application that allowed the device to interact with Adobe Photoshop, as well as custom Windows Forms applications. Below are a select few examples.
Using the device to choose a color
An interesting interaction afforded by these controllers is accurate color selection, achieved by modifying the sensitivity of one slider using the other. An early prototype application I developed allowed modifying RGB colors in two ways:
• Binding one slider each to the red and green color channels, and the rotary encoder to the blue channel.
• Clicking the rotary encoder’s push button toggled between color channels; in this mode, one slider set the sensitivity of the other, allowing more precise color selection. The sliders could also be used as “joysticks”: holding a slider at either extreme end of its axis let the user keep scrolling past the end of the range of colors mapped to it. While doing this, a haptic “tapping” sensation was produced by the sliders’ motors. (A rough sketch of both mappings follows below.)
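To make the sensitivity mechanic concrete, here is a minimal C++ sketch of both mappings, assuming slider positions normalised to [0, 1] and 8-bit color channels. The function names, constants, and deadzone threshold are illustrative, not taken from the actual firmware.

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>

// One slider provides a coarse value; the other applies a fine offset
// whose reach is scaled by the sensitivity. All positions in [0, 1].
uint8_t channelValue(double coarse, double fine, double sensitivity) {
    // At sensitivity 1.0 the fine slider spans the full channel;
    // smaller values shrink its reach around the coarse position.
    double value = coarse + (fine - 0.5) * sensitivity;
    value = std::clamp(value, 0.0, 1.0);
    return static_cast<uint8_t>(value * 255.0);
}

// "Joystick" mode: holding the slider at an extreme keeps scrolling.
// Each tick past the end nudges the value; the firmware would also
// fire a haptic pulse ("tap") on the slider's motor at that point.
void joystickTick(double sliderPos, double& value, double step) {
    const double kDeadzone = 0.02;  // assumed threshold at the ends
    if (sliderPos >= 1.0 - kDeadzone) {
        value = std::min(1.0, value + step);
    } else if (sliderPos <= kDeadzone) {
        value = std::max(0.0, value - step);
    }
}

int main() {
    std::printf("channel = %u\n", channelValue(0.5, 0.75, 0.1));  // small nudge up
    double v = 0.5;
    joystickTick(1.0, v, 0.01);  // held at the top: scroll up one step
    std::printf("after joystick tick: %.2f\n", v);
    return 0;
}
```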
How to communicate the range of colors available at a given sensitivity value was another question I sought to answer. One approach I took was to show bars on either side of the cursor on a color channel, which filled to show the domain of colors reachable from the slider.
Another approach I demonstrated was showing the available colors in a gradient directly below the color channel, with a cursor (circle) that tracked the user’s selection along this range. I also included a small canvas in the application for drawing simple shapes and squiggles.
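Both visualisations boil down to computing the sub-range of the channel that the fine slider can currently reach. A small sketch of that calculation, consistent with the assumed mapping in the earlier sketch:

```cpp
#include <algorithm>
#include <cstdio>

// The reachable sub-range of a channel at the current value and
// sensitivity. This is what the bars / gradient in the UI would
// visualise. Illustrative only; values normalised to [0, 1].
struct Range { double lo, hi; };

Range reachableRange(double value, double sensitivity) {
    // The fine slider's offset spans +/- sensitivity/2 around the
    // current value, clamped to the channel's bounds.
    Range r{value - sensitivity * 0.5, value + sensitivity * 0.5};
    r.lo = std::max(0.0, r.lo);
    r.hi = std::min(1.0, r.hi);
    return r;
}

int main() {
    Range r = reachableRange(0.9, 0.4);
    std::printf("reachable: [%.2f, %.2f]\n", r.lo, r.hi);  // [0.70, 1.00]
    return 0;
}
```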
The color selection techniques I developed could be applied more broadly than just to art. Here, they are used to modify the shade of a data series, tying back to the core function of the devices as controllers for data visualization.
Using the device to control Adobe Photoshop
The controllers could be used to resize and rotate an image, with the sliders mapped to resizing operations and the knob mapped to rotation.
The sliders could also be used as joysticks to translate a layer of the project.
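As a sketch of what these mappings might look like in code: the scale range and degrees-per-tick figures below are assumptions, and how the demo actually communicated the values to Photoshop is not covered here.

```cpp
#include <cstdio>

// Illustrative mapping from device inputs to Photoshop-style
// transform parameters.
struct Transform { double scaleX, scaleY, rotationDeg; };

Transform mapToTransform(double slider1, double slider2, int encoderTicks) {
    Transform t;
    // Sliders (0..1) map to 10%..200% scale on each axis (assumed range).
    t.scaleX = 0.10 + slider1 * 1.90;
    t.scaleY = 0.10 + slider2 * 1.90;
    // Assume 24 encoder ticks per revolution -> 15 degrees per tick.
    t.rotationDeg = encoderTicks * 15.0;
    return t;
}

int main() {
    Transform t = mapToTransform(0.5, 0.5, 3);
    std::printf("scale=(%.0f%%, %.0f%%) rotation=%.0f deg\n",
                t.scaleX * 100, t.scaleY * 100, t.rotationDeg);
    return 0;
}
```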
The device could also be mapped to, and used to adjust, the color widget in Photoshop.
Because the sliders on the devices are actuated, a color could be "swabbed" (sampled) in Photoshop, with the physical sliders moving to reflect the selected color's channel values. This color could then be fine-tuned, or applied to other areas of the canvas.
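One simple way to drive an actuated slider to a sampled value is a proportional controller; the sketch below simulates a slider seeking the position for a swabbed channel value. The gain, deadband, and update step are assumptions, not the device's actual control loop.

```cpp
#include <cmath>
#include <cstdio>

// When a colour is "swabbed", the firmware would drive each motorised
// slider to the position matching that channel's value. A proportional
// controller is one straightforward way to do this.
double motorCommand(double current, double target) {
    const double kGain = 4.0;        // assumed proportional gain
    const double kDeadband = 0.005;  // stop when close enough
    double error = target - current;
    if (std::fabs(error) < kDeadband) return 0.0;  // slider has arrived
    double cmd = kGain * error;
    // Clamp to the motor's normalised drive range [-1, 1].
    if (cmd > 1.0) cmd = 1.0;
    if (cmd < -1.0) cmd = -1.0;
    return cmd;
}

int main() {
    // Simulate one slider seeking the sampled red channel (0.68).
    double pos = 0.20, target = 0.68;
    for (int step = 0; step < 50; ++step) {
        double cmd = motorCommand(pos, target);
        if (cmd == 0.0) break;   // within deadband
        pos += cmd * 0.05;       // crude integration step
    }
    std::printf("settled at %.3f (target %.3f)\n", pos, target);
    return 0;
}
```

A proportional approach like this slows the slider as it nears the target, which keeps the motion smooth enough that the user can still read the final position at a glance.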