I have just completed the second of three sprints in which I am creating an Augmented Reality application that uses the Meta 1 glasses. Below, I share my process during this second phase of the project and the results of the usability tests I conducted with the help of Quarry Integrated Communications.
The purpose of conducting these usability tests was to determine whether it is worth continuing to develop for the Meta 1 headset, or whether it would be better to shift focus to other, more easily accessible platforms. The version of the Meta in the lab is the first available version, which came from the Meta Kickstarter campaign. This means the headset is still in the development stage and has some kinks to work out yet.
By conducting these tests, I wanted to find out if the technical capabilities of the Meta were at a level that would warrant the time needed to develop an app for it.
I created a quick prototype based on the location-based AR app I built in my previous sprint. In it, I developed three different ways for people to learn about the technologies in the lab. The first is text-based. When the user looks over at the SoundShower, a parabolic speaker in the lab, it is highlighted with a label. Underneath, there is a button with the text “Learn More.” When pressed, a text box appears that lists the features of the SoundShower.
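Under the hood, this "look at an object to highlight it" behaviour comes down to checking the angle between the user's gaze direction and the direction from the headset to the object. Here is a minimal sketch of that check in Python; the function names and the 10° threshold are my own illustrative choices, not the Meta SDK's actual API:

```python
import math

def angle_between(gaze_dir, to_target):
    """Angle in degrees between the gaze direction and the vector to a target."""
    dot = sum(g * t for g, t in zip(gaze_dir, to_target))
    mag = (math.sqrt(sum(g * g for g in gaze_dir))
           * math.sqrt(sum(t * t for t in to_target)))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

def should_highlight(gaze_dir, head_pos, target_pos, threshold_deg=10.0):
    """True when the user is looking (within a small cone) at the target,
    i.e. when its label and highlight should be shown."""
    to_target = [t - h for t, h in zip(target_pos, head_pos)]
    return angle_between(gaze_dir, to_target) <= threshold_deg
```

Looking straight at the SoundShower would return `True`; glancing 90° away would not.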
The second method involves watching a video. To activate the video, the user simply reaches out and grabs it. Originally, my intention was for the video to play when touched, but I ran into issues on the development side, so I opted for the grabbing motion instead.
The third method involves interacting with a 3D model, in this case a model of the Oculus Rift headset. The user can reach out and grab it to move it around. Grabbing with two hands allows the user to rotate the model in different directions. This method is much more interactive and hands-on than the first two, and offered a unique experience with the Meta compared to other AR viewers, which tend to be much more static.
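The two-handed rotation can be thought of as tracking the vector between the two hands and applying its change in heading to the model each frame. Below is a minimal Python sketch of that idea for rotation about the vertical axis, assuming simple (x, y, z) hand positions; the names are invented for illustration and this is not the actual SDK's interface:

```python
import math

def yaw_delta(prev_left, prev_right, left, right):
    """Rotation in degrees about the vertical (y) axis implied by the change
    in the left-to-right hand vector, projected onto the horizontal x-z plane.
    (Wrap-around at +/-180 degrees is ignored for brevity.)"""
    def heading(a, b):
        # Heading of the vector from hand a to hand b in the x-z plane.
        return math.atan2(b[2] - a[2], b[0] - a[0])
    return math.degrees(heading(left, right) - heading(prev_left, prev_right))
```

Each frame, the model's yaw would be incremented by `yaw_delta(...)` while both hands are grabbing; pitch and roll could be handled the same way in the other planes.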
In total, I tested the app with six different users. Each was an employee from Quarry. For each test, I began by explaining what the Meta is and the purpose of the app we were testing. I helped them get settled with the headset and then open the app. After waiting for the calibration to complete, I would make sure they were seeing what they were supposed to. Then, I guided them through each of the three methods. If the user ran into any problems or was confused as to what to do next, I provided further guidance.
Unfortunately, in a few cases the Meta did not work properly for some of the methods, resulting in the need to skip them. Once all the tasks were completed, I gave the user a bit of time to play with the app before concluding the test, and then gave them a survey to fill out. The results of this survey and the notes I took during the tests are presented in my attached report.
User Testing Results
During testing, I noted several pain points. There were a few occasions where at least one of the methods wouldn’t work properly. Another major issue was that users often weren’t aware of what exactly they needed to do for each method. With my instructions, they were usually able to activate the video and interact with the 3D model with a grabbing motion. However, they didn’t know how far they had to reach out in order to interact with the virtual items.
One user told me that there wasn’t enough visual feedback based on their actions, and that more visual cues would be helpful to know what exactly to do.
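One way to address both complaints (unclear reach distance and lack of feedback) is to classify the hand's distance to a virtual object into explicit states and render a distinct visual cue for each. A minimal sketch of that idea, with the state names and distance thresholds invented purely for illustration:

```python
import math

def interaction_state(hand_pos, obj_pos,
                      highlight_radius=0.35, grab_radius=0.12):
    """Classify the hand relative to a virtual object (distances in metres):
    'idle' when out of range, 'near' when close enough that a highlight cue
    should appear, 'grab' when within grabbing reach."""
    d = math.dist(hand_pos, obj_pos)
    if d <= grab_radius:
        return "grab"
    if d <= highlight_radius:
        return "near"
    return "idle"
```

The app would then tie a visible cue to each state, e.g. a glow in the `near` state that brightens on `grab`, so users can tell how far to reach without verbal guidance.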
While the attached report shares the details of my usability testing, here are some quick highlights of what I learned:
These results went against my original assumptions. I was surprised that users found the video most informative. I thought that the 3D model of the Oculus might be the most informative, but users found that method to be the most difficult to interact with.
One of the participants brought up something I hadn’t thought of while I was creating the Meta app. They said that if there had been more visual feedback on what to do next, then I wouldn’t have had to guide the users so explicitly through each task.
Overall, my usability testing with the Meta showed that several factors still hinder the user experience of the product and cannot be fixed through app development alone. Hardware issues, including inconsistent hand-gesture recognition and the limited size of the viewport, prevented users from completing tasks quickly, or in some cases at all. For anyone who wants to pursue app development for the Meta, it is important to remember that users need visual feedback from the virtual interface, as mentioned above.
Since AR headsets are an emerging technology, it is important to design with first-time users in mind. They will need explicit instruction on how to interact with this virtual environment, and they will need feedback from the interface to know when they’ve completed a task.