Test Report

Updated Research Questions:

  1. How can the limited, individual immersion of VR experiences be improved for an interactive art project?
  2. How can shared awareness be created between the HMD wearer and other observers in a VR experience by merging other emerging technologies?
  3. How can speculative design be used as a tool to explore a mobile-rendered mixed-reality HMD for the future?
  4. How can active communication and interaction between the HMD wearer and other observers help overcome trust issues?
  5. How can game design methods be used to set goals and constraints in a shared VR experience?
  6. How can the relationship between humans and animals be discussed by showing different perspectives in a shared VR experience?

The purpose of this user testing is to gather useful feedback on my main research question about shared awareness between the HMD wearer and other audiences, so that I can further improve the experience. On the more technical side, I want to test the wearability of the headset, the usability of the AR app and the functionality of the IoT communication. Beyond that, I also want to test the artistic concept behind the relationship between humans and animals and their different perspectives.

Prototype:

HMD wearer: a DIY VR headset with LEDs, servos and a felt marker, plus an Android device placed inside the headset running a VR scene designed around a rabbit’s vision.

Observer: an iOS device running an AR app that can control the components on the headset.
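The post only says that the AR app controls the headset components over "IoT communications", without naming the transport. Purely as an illustration, here is a minimal sketch of the observer side under the assumption of an MQTT broker on the local network; the broker address, topic name and payloads are placeholders, and the real iOS app would of course not be written in Python.

```python
# Observer-side sketch (illustrative only): publish ON/OFF commands that the
# headset electronics can react to. Assumes an MQTT broker on the local
# network; the broker address and topic are hypothetical placeholders.
from paho.mqtt import publish

BROKER = "192.168.0.10"           # hypothetical local broker
TOPIC = "rabbot/headset/control"  # hypothetical control topic

def set_headset(on: bool) -> None:
    # Retained message so the headset picks up the latest state on reconnect.
    publish.single(TOPIC, "ON" if on else "OFF", retain=True, hostname=BROKER)
```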

User flow:

  1. The HMD wearer puts on the headset.
  2. The observer launches the AR app.
  3. The observer touches the ON button in the AR app.
  4. The LEDs on the headset turn on and the motors move (a sketch of the headset-side handler follows this list).
  5. The observer touches the OFF button in the AR app.
  6. The LEDs on the headset turn off.
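On the headset side, the matching handler could look roughly like the sketch below. It assumes a Raspberry-Pi-style controller driving the LEDs and an ear servo through gpiozero, and the same hypothetical MQTT topic as above; the actual build may well use an Arduino/ESP board instead, so treat the pins and library choices as placeholders.

```python
# Headset-side sketch (illustrative only): listen for ON/OFF commands and
# drive the LED and one ear servo. Pins, topic and broker are placeholders.
from gpiozero import LED, Servo
from paho.mqtt import subscribe

BROKER = "192.168.0.10"
TOPIC = "rabbot/headset/control"

led = LED(17)    # hypothetical GPIO pin for the headset LEDs
ear = Servo(18)  # hypothetical GPIO pin for one ear servo

def on_message(client, userdata, message):
    command = message.payload.decode().strip().upper()
    if command == "ON":
        led.on()
        ear.max()   # a small ear movement; testers preferred subtle motion
    elif command == "OFF":
        led.off()
        ear.mid()   # return the ear to its rest position

# Blocks and dispatches every message published on the control topic.
subscribe.callback(on_message, TOPIC, hostname=BROKER)
```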

Types of my user testers:

  • Self-testing
  • My cohort
  • Professors
  • People I don’t know

Before the formal testing on Tuesday, I tested the prototype on myself and a friend, with each of us trying both the HMD wearer and observer roles. I was aiming to make the VR scene move like a rabbit, but both of us felt very sick when the VR camera bounced and moved forward. As a consequence, I decided to remove the ability to move in VR for my user testing. On Tuesday, our class ran a user testing session with my cohort and professors. After that, I also asked someone who had no idea about my thesis to test the prototype.

Methods of user-testing:

  • One-to-one testing
  • Interview
  • Open discussion



In-session observations

  • The HMD wearer moved their head through a large range once they put on the headset to look around in VR. The observer holding the iOS device had to chase the headset’s movement to keep the AR control panel triggered.
  • The HMD wearer didn’t feel any discomfort wearing the headset, and their body movement was quite natural.
  • The HMD wearer tried to walk around and wondered what interactions were available in VR.
  • The observer with the iOS device had some difficulty touching the buttons in the AR app.
  • The sound in the VR scene was not loud enough.
  • Some people had difficulty adjusting their eye focus in VR; I had to tell them to look at the trees far away, which helped a lot.

Questions and interviews

To HMD wearer:

  1. How does it feel to be a rabbit in VR?
  2. How do you find the VR scene?
  3. How does it feel when the ears are moving?
  4. Do you feel any discomfort when wearing the headset?
  5. Does the headset feel heavy to you?

To AR App observer:

  1. Do you find the app difficult to use?
  2. How does it feel to be able to interact with the HMD wearer?

Feedback and Answers:

  • The headset is not heavy at all; it’s quite comfortable to wear.
  • I would prefer more instructions from you.
  • The menu in the AR app is too small.
  • I wonder why there is a campfire on the headset.
  • The sound is too low; I can’t hear anything.
  • I wonder if there is any interaction I can do in VR.
  • If I were a rabbit, I would prefer to see my pink nose, furry feet and paws when I look down.
  • Wow, I feel so small, like a rabbit.
  • I prefer the small movements of the ears; the big movements feel too robotic.
  • I can see that some of the tree shadows are pink.
  • I wonder if a rabbit can turn its neck at such a big angle.
  • I found the graphics in VR a bit rough.
  • Everything feels a bit blurry in VR.

Reflections and Revision ideas

Based on the feedback I received from the testers, I first want to write a speculative background story that can serve as instructions for the participants while also immersing them in a cyberculture futurist setup and further introducing the idea of human-animal perspectives.

Secondly, I would like to improve both the AR and VR apps. For VR, I want to add a lab scene outside the forest to make sense of my background story, and also improve the graphics quality. For AR, I want to replace the small buttons with a 3D rabbit model, so that instead of touching buttons, users can touch specific parts of the rabbit model to interact with the corresponding parts of the headset (a rough sketch of this mapping follows below).
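The rabbit-model interaction is still only a plan, but at its core it is a simple mapping from touched model parts to headset commands. Continuing the same hypothetical MQTT setup as above, a sketch might look like this; the part names and command strings are invented for illustration.

```python
# Planned AR revision, sketched: touching a part of the 3D rabbit model sends
# a command for the matching headset component. All names are placeholders;
# the real app would handle the touch/raycast in its own engine and only
# needs to publish the mapped command.
from paho.mqtt import publish

BROKER = "192.168.0.10"
TOPIC = "rabbot/headset/control"

PART_TO_COMMAND = {
    "left_ear": "WIGGLE_LEFT_EAR",
    "right_ear": "WIGGLE_RIGHT_EAR",
    "eyes": "TOGGLE_LED",
}

def on_part_touched(part_name: str) -> None:
    command = PART_TO_COMMAND.get(part_name)
    if command is not None:
        publish.single(TOPIC, command, hostname=BROKER)
```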

Adding a background story:


Today is 26 July 2038. My name is Yiyi Shao, and I’m the chief scientist at AniBot Lab. Welcome, welcome! It’s an exciting day: we are unveiling our new project, RabBot, to the public. We have implanted two robotic ears and eyes into our test subject. It’s still the first stage of our experiment, but we have already seen remarkable success. Now, may I invite our guest to wake up RabBot 1.0?

Overall, I found the user testing very useful for my thesis. It helped me find a way to combine my theoretical framework of cybernetics and speculative design with my research question about shared awareness between the HMD wearer and other observers. It also helped me settle on the artistic style that I intend to incorporate into my design. Beyond that, I got positive results from the RtD, UCD and agile methodologies I want to use for my thesis, where design and research contribute to each other in examining my research questions. From the user testing feedback, I understand more about how the HMD wearer and other observers expect to interact and communicate with each other. All of these results will feed into my next round of prototypes.
