Blog

What? Why? How? So What?

What?

The identification of a ‘hunch’ or tentative research proposition, leading eventually to a defined and viable research question — Gary

After I got my bachelor's degree in Digital Arts and Technology, I went to work as a teaching assistant in Roy Ascott's Technoetic Arts Studio, teaching basic knowledge and skills in VR. At the end of the semester, I also helped supervise the students as they set up the VR section of the exhibition “A Technoetic Multiverse”. Through teaching, I became more interested in VR and also found that it has many limitations. One of these is that most VR experiences are built around individual immersion: the wearer puts the headset on, is suddenly disconnected from his or her surroundings, and dives into another space. In an exhibition setting especially, other audience members cannot get involved in that experience. I argue that a good VR experience should focus not only on the HMD wearer but also on the other audience members. With that in mind, I want to find possible ways to improve shared awareness in VR experiences.

Why?

The need for your research in relation to the wider context, in order to test out the value of your proposition, locate your research position, and explore a range of research strategies — Gary

Firstly, the limitation I just mentioned cuts off the social connections and awareness between the virtual world and the physical world, and between the HMD wearer and other audience members. As a consequence, people can get bored of VR very easily.

Secondly, VR has been around for more than 30 years and has only recently entered the slope of enlightenment. It needs new waves of revolution and innovation, achieved by merging with other parallel emerging technologies.

How?

The importance of developing an appropriate methodology and specific methods for gathering and generating information relevant to your research question, and evaluating, analysing and interpreting research evidence. — Gary

The methodologies I intend to use for my thesis are Research Through Design and Agile Design. I will make different versions of prototypes in an iterative process to help improve shared awareness in VR. At the same time, I will do academic research to gain more knowledge and contribute to my design process.

So What?

Challenges you to think about the significance and value of your research contribution, not only to your practice but to the wider research context, and how this is best communicated and disseminated. — Gary

So in the end, I want to support my argument for designing shared, social VR experiences for both the HMD wearer and other audiences through the final outputs of my thesis. On top of that, I want to encourage other artists, designers, and researchers to pay more attention to the physical world and to other audience members when designing and developing a VR experience.

Reading and Research

Virtual reality has been around for more than 30 years, yet only recently has it gradually found its way into the commercial world and become more commonly known. Although VR can provide very immersive experiences, it still has many limitations. One of these is the gap between the HMD wearer and other audiences: this immersive technology cuts off the social connections and awareness between the virtual world and the physical world. As an artist and designer, I argue that a good VR experience should focus not only on the HMD wearer but also on other audiences. Hence, I want to make an interactive art project with a redesigned HMD that addresses this problem by combining other emerging technologies with VR. My thesis aims to explore how to improve shared awareness in virtual reality experiences.

My independent study this summer is a series of experiments exploring whether the technology is capable of helping to answer the research questions, so that I can establish a fundamental framework and add more creative content later on. These experiments include miniaturizing wearable circuits, working with e-textiles and fabrics, 3D printing, laser cutting, and developing an IoT AR app.

Virtual Reality and Augmented Reality

Figure 1. Simplified representation of the RV Continuum. Source: https://www.researchgate.net/figure/Simplified-representation-of-a-RV-Continuum_fig1_308889275

The definition of Mixed Reality can be traced back to a 1994 research paper by Paul Milgram and Fumio Kishino. As shown in Figure 1, the reality-virtuality continuum encompasses all possible variations and compositions of real and virtual objects. Many virtual reality headsets rely on smartphones to display the content; while these devices are an excellent introduction to VR and provide a budget-friendly solution for my thesis, they lack the visual quality to deliver an immersive experience. In his blog, Mark Billinghurst describes seven types of displays, as follows:

  1. Monitor-based (non-immersive) video displays, showing video of the real world onto which digital images are superimposed.
  2. An HMD showing video. The same as type 1, but the content is shown in an HMD.
  3. Optical see-through HMD. A see-through display that allows virtual images to appear superimposed over the real world.
  4. Video see-through HMD. The same as type 3, but showing video of the real world in front of the user with virtual graphics superimposed on it.
  5. Monitor-based AV system, showing 3D graphics on a monitor with superimposed video.
  6. Immersive or partially immersive AV, showing 3D graphics in an immersive display with video superimposed on it.
  7. Partially immersive AV systems, which allow additional real-object interactions, such as interacting with one's own (real) hand.

AR is about augmenting the human experience and it will not advance in isolation. The real impact AR will have is when it becomes a super medium that combines other parallel emerging technologies like wearable computing, sensors, the Internet of Things (IoT), machine learning, and artificial intelligence. — Helen Papagiannis

Working with e-textiles and fabrics

E-textiles can more easily adapt to fast changes in the computational and sensing requirements of any specific application, this one representing a useful feature for power management and context awareness (Stoppa & Chiolerio, 2014).

Smart textiles will serve as a means of increasing social welfare, and they might lead to important savings on the welfare budget. They integrate a high level of intelligence and can be divided into three subgroups (Stoppa & Chiolerio, 2014):

  • Passive smart textiles: only able to sense the environment/user, based on sensors;
  • Active smart textiles: reactive sensing to stimuli from the environment, integrating an actuator function and a sensing device;
  • Very smart textiles: able to sense, react and adapt their behavior to the given circumstances.

For this prototype, I only incorporated a small piece of pressure-sensitive fabric into the headset, but I would like to try placing a larger piece of smart textile in the area around the eye sockets of the headset so it can detect the movement or bio-data of the wearer's face and contribute to the interactions of the whole experience.
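As a rough illustration of how such a fabric sensor could feed into the interactions, here is a minimal Particle Photon sketch, assuming the pressure-sensitive fabric is wired as one leg of a voltage divider into analog pin A0; the pin, threshold, and event name are illustrative placeholders rather than my actual circuit.

    #include "Particle.h"

    const int FABRIC_PIN = A0;   // assumption: fabric + fixed resistor form a voltage divider into A0
    int pressure = 0;            // latest ADC reading, exposed to the Particle cloud

    void setup() {
        Particle.variable("pressure", pressure);   // readable from the cloud or a companion app
    }

    void loop() {
        pressure = analogRead(FABRIC_PIN);         // Photon ADC returns 0-4095
        if (pressure > 2000) {                     // illustrative threshold for "face pressed"
            Particle.publish("face-pressed", String(pressure), PRIVATE);
            delay(1000);                           // crude rate limit; the cloud allows roughly 1 publish/second
        }
        delay(50);
    }

Reading the fabric as a plain analog value like this would let the pressure data drive events in the AR/VR scene in the same way as the other sensors.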

3D printing and laser cutting

The DDF (Digital Design Fabrication) method is a two-stage way of working that integrates generative computing and RP (Rapid Prototyping) into one process. Together they support a process that generates diverse candidate artifacts as solutions to design problems (Sass & Oxman, 2006).

I used the laser cutter to cut the felt marker for my headset. The problem I faced is that the felt cloth kept moving under tension as it passed through the cutting zone. After reading older patents by Macken and Medley, I found that magnetic material may be a better solution: a template formed of a generally rigid magnetic material is placed over the sheets of material to be cut, with a base plate below it in general alignment with the template. The base plate provides a magnetic field that attracts the template toward it, applying pressure to the sheets of material stacked between them (Macken, 1984). I haven't tried this method yet; instead I placed a wooden plate under the fabric and used paper tape to align it. My fabric does have some slight burn marks in random places after laser cutting, as well as off-track results, because the high speed setting (I set it to 60) made the fabric move around. This is acceptable when cutting a small surface in cheap material, but for my next iteration it would be good to align the materials completely to achieve a better result.

Reference List

De Angeli, Daniela, and Eamonn J. O’Neill. “Development of an Inexpensive Augmented Reality (AR) Headset.” Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM, 2015.

Macken, John A. “Method and apparatus for laser engraving of smoke-sensitive materials.” U.S. Patent No. 4,458,133. 3 Jul. 1984.

Medley, John A. “Method of and apparatus for cutting cloth.” U.S. Patent No. 3,614,369. 19 Oct. 1971.

Sass, Larry, and Rivka Oxman. “Materializing design: the implications of rapid prototyping in digital design.” Design Studies 27.3 (2006): 325-355.

Stoppa, Matteo, and Alessandro Chiolerio. “Wearable electronics and smart textiles: a critical review.” Sensors 14.7 (2014): 11957-11992.

Final Prototype

Version Conflicts Problem (solved):

After several attempts with different combinations of system and software versions, I finally found by far the best combination:

  • macOS: High Sierra 10.13.5 (17F77)
  • iOS: 11.3.1 (15E302)
  • Xcode: 9.4 (9F1027a)
  • Vuforia: 7.1.35
  • Unity 3D: 2018.1.4f1

The newest version of a piece of software usually has some incompatibility bugs, so it's better to check against a slightly older version. However, Vuforia 7.1.35 actually fixed the black screen issue I had had since last time.

The newer version of Unity comes with native Vuforia support; it's slightly different from the old version, but much more convenient. An Image Target can be added directly through the GameObject menu, and it can load the database I customized on the Vuforia developer portal.


Final Circuit:

I reduced the number of components for the final prototype to make it clear and simple enough to work with. In the end I used three AAA batteries: on the one hand, they supply a stable current to power my microcontroller; on the other hand, they are lightweight and easy to replace and maintain.


Assembling:

First things first, the wires needed to be fixed in place, so I removed the felt cover and untied everything. I then followed my initial sketch to attach the microcontroller and battery case at the bottom of the headset.


The battery pack should be placed in a position that makes it easy to replace the batteries, and the microcontroller needs to stay exposed for future development. I glued all the wires along the edges to keep them out of the way when wearing the headset. I also swapped all the jumper wires for hookup wire, so everything stays in place neatly.

The wires that come with the servo end in a female connector that is not secure enough, so I cut it off, soldered the wires, and covered the joints with shrink tube.

Then I laser cut two new patterns for the right and left sides of the headset. As you can see from the first video, the felt cover didn't fit the case very well, which made it difficult for the AR app to trigger the image targets. I also adjusted the design a little to bring the rating up to 5 stars, which means the marker is as readable as possible for Vuforia.


Here are the final results:


Future Plan:

This is the very first version of a prototype for my thesis. I want to explore possibilities for creating shared awareness in VR experiences and also to speculatively design a future HMD. IoT and AR have become two very workable approaches for me. This prototype provides a fundamental framework to which I can add more creative content as my thesis develops, so I can further examine my research questions.

I would like to do the following for my next iteration:

  • Add more visual effects to the headset, such as LEDs, EL wire, and EL panels.
  • Find more solutions for the image targets.
  • Develop a VR interactive scene that communicates with the AR app.

Combining and Finalizing

 

This week I am combining and finalizing my headset prototype. I printed out another LED chimney and attached it to the headset, and now it looks like this.

The next step is building the app to my iPhone, but I am having a hard time with software and system version conflicts. I am building the app for an iPhone 6 Plus running iOS 11.3.1, from a mid-2012 MacBook Pro on macOS Sierra 10.12.6.

Version Conflicts Problem

Xcode: The problem started with Xcode. My Xcode was 9.2, which only supports up to iOS 11.2, so when I tried to build the app to my iPhone a warning window popped up. I therefore had to update Xcode to the latest version. However, that in turn required my macOS to be up to date as well, at least High Sierra 10.13.2.


macOS: I checked Software Update and it said my macOS was already the latest, so I went to the App Store and manually downloaded a High Sierra update, which then installed High Sierra on my MacBook. However, the App Store page only installed 10.13.1, which still couldn't work with the new Xcode, so I also had to update manually to 10.13.3 after finding an update package here.

 

NVIDIA and CUDA: Once my system was on 10.13.3, it could then be updated from the App Store to 10.13.5. I opened Unity to continue working on my project and nothing showed up. I then found out that the new macOS High Sierra isn't compatible with the NVIDIA driver for my graphics card; the CUDA preference pane also showed an “Update Required” message. After some research, I found a post about a fully compatible driver for High Sierra here.

“Right now the macOS’ native graphics driver that help the system communicate with the nVIDIA GPU, is still not updated to the really compatible version. So what we have to do here is to install the “web version” of the driver, which is an OFFICIAL version from nVIDIA, it’s just not a “native version” from Apple.”

After I removed my NVIDIA driver and installed the web driver, Unity now works fine.


Unity version: I was then able to build the app on my iPhone, but it couldn't detect the image target from my device.
Screenshot of the iPhone running the app

So I rebuilt the app from Unity and it showed another error, this time caused by my Unity version; I was using 5.6.1. I updated Unity to the latest version (2018.1.4), which then logged even more errors, so I had to go back down to 2017.4.3. It then worked fine without logging any errors.

Unity version 2018.1.4 error messages / black screen issue

After all the versions were updated, I successfully built the app to my iPhone, only to run into a new black screen issue. It is still one of the bugs from Xcode 9.3; I found a tutorial on YouTube for replacing the iPhone support files with an older version.

I am still working on this and hoping to fix the problem this week.

Laser cutting

AR Marker

An AR marker, also known as a fiducial marker, provides a fixed point of reference for position or scale in a scene. AR markers help create an interface between the physical world and augmented reality content such as 3D models or videos. The device's camera calculates position and orientation from the AR markers in real time, which is known as tracking.

A good, effective AR marker should contain a high-contrast, random pattern. I have two ideas in mind for making the AR markers: one is using an LED screen to display dynamic QR codes, and the other is laser cutting the markers into a piece of fabric that covers the whole headset.

  • LED screen markers
    • Benefits: aesthetically pleasing; the whole screen can be hidden under light-diffusing materials such as fabric or plastic. Ideally, it could generate dynamic QR codes to trigger different objects in AR.
    • Shortcomings: it would take a lot of pins on a microcontroller to generate animations. My original plan was to use a single Particle Photon board, which has a very limited number of pins.
  • Laser-cut markers
    • Benefits: can also be aesthetically pleasing if designed well. Multiple patterns can be placed at different angles and trigger different objects in AR. No technical limitations.
    • Shortcomings: compared to an LED screen, it's harder to reveal the markers.

For prototype one, I decided to experiment with laser-cut AR markers to see whether they would work nicely with Vuforia in Unity3D. I found an online AR marker generator that produces high-quality, complex patterns and lets you download them as .png or .jpg files readable by any AR application. I converted the pattern to a vector file in Illustrator and tested it with the laser cutter. However, it didn't work well: the pattern has too much detail and the headset itself isn't big enough to carry it, so in the end the laser couldn't cut such a small piece.

I then decided to design the patterns myself so they work better on the small surface. I created three different patterns for three angles: front, left, and right.

This time the laser cutting worked much better, and I also tuned my machine settings to reduce burning on the fabric I was using.

Machine settings: speed 60, power 10, on 1.5mm felt
Results after covering my headset

Then I uploaded my customized markers to the Vuforia developer site, which generated a Unity editor package for me to import into Unity later on. However, I was stuck here for a while because Unity couldn't read my markers for some reason. It's important to be aware that the customized image target database needs to be checked as “Activate” in the Vuforia configuration in Unity.


All the components are working properly

Here is the video of the laser-cut markers working at different angles and triggering different objects in Unity.
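As a rough sketch of how each marker can trigger its own object, here is a minimal Unity C# script following Vuforia's standard trackable event handler pattern; the class name and the content field are my own hypothetical additions (Vuforia's bundled DefaultTrackableEventHandler does much the same job), and one instance would be attached to each Image Target.

    using UnityEngine;
    using Vuforia;

    // Hypothetical helper: attach to an Image Target and assign the object it should reveal.
    public class MarkerContentToggle : MonoBehaviour, ITrackableEventHandler
    {
        public GameObject content;             // the 3D object this marker should trigger
        private TrackableBehaviour trackable;

        void Start()
        {
            trackable = GetComponent<TrackableBehaviour>();
            if (trackable != null)
                trackable.RegisterTrackableEventHandler(this);
            if (content != null)
                content.SetActive(false);      // hidden until the marker is detected
        }

        public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                            TrackableBehaviour.Status newStatus)
        {
            bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                         newStatus == TrackableBehaviour.Status.TRACKED ||
                         newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
            if (content != null)
                content.SetActive(found);      // show the content only while this marker is visible
        }
    }

With one of these per marker, the front, left, and right patterns can each switch on a different object as the camera moves around the headset.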

3D Print Design

 

3D Print Servo Attachment Design

This week, I decided to work on the design of the headset. The most important part is the servo attachment. First I measured the dimensions of a 9g servo and sketched out the ear; then I created the 3D model in Rhino and added the screw mount so it can attach to the servo afterwards. As I hadn't used Rhino before, I just followed a beginner tutorial on YouTube and created a fairly simple model. I printed the parts on an Ultimaker 2+ with yellow PLA filament and a 0.4mm nozzle. I am not a big fan of yellow, but it was what was loaded on the machine; I will paint the parts in my preferred color later on.


After that, I secured each servo onto its holder with two screws and hot glued each one onto my headset. There are another two screw holes left at the bottom, but I left those until the end, once the other parts have been assembled. I downloaded the free servo holder file from here.

After attaching the ears to my headset, I calculated the right angles for the movements, so that the two ears move in both directions at the same time.

Then I merged the servo code with my previous LEDUnity code in the Particle IDE. At first it wouldn't work, so I created an empty sketch with just simple servo code in it to test whether the servo itself was working. After that, I combined the test code with my LEDUnity code and found that it only worked when I put the servo calls inside loop(). For web control, though, the code can't sit inside loop(); it has to be a function. I then did some research, found a tutorial on the Particle website, and realized that the servo should first be attached and given a starting angle in setup(), so that it can then be written to from inside a function.
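Here is a minimal sketch of that structure, assuming the two ear servos sit on pins D0 and D1 of the Particle Photon; the pin choices, variable names, and the "setAngle" function name are placeholders rather than the exact code from my project, and the LED code is left out.

    #include "Particle.h"

    Servo leftEar;
    Servo rightEar;

    int setAngle(String command);   // forward declaration of the cloud function

    void setup() {
        // Attach the servos and give them a starting angle in setup(),
        // so they can safely be written to later from a cloud function.
        leftEar.attach(D0);         // assumed pins; D0 and D1 both support servo output on the Photon
        rightEar.attach(D1);
        leftEar.write(90);
        rightEar.write(90);

        // Expose a cloud function so the app (or any HTTP client) can move the ears.
        Particle.function("setAngle", setAngle);
    }

    void loop() {
        // Nothing needed here: the servos are only driven when the cloud function is called.
    }

    // Called via POST to https://api.particle.io/v1/devices/<device-id>/setAngle
    int setAngle(String command) {
        int angle = command.toInt();    // expects "0".."180"
        if (angle < 0 || angle > 180) {
            return -1;                  // reject out-of-range requests
        }
        leftEar.write(angle);
        rightEar.write(180 - angle);    // mirror the angle so both ears move at the same time
        return angle;                   // the returned value shows up in the cloud response
    }

Because attach() and the initial write() happen in setup(), the cloud function only has to write new angles, which is what makes the web-controlled version work.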

Testing code
Amended code

Power Supply Issue

Continuing with last week's power supply issue, my supervisor pointed out that the reason I can't use a coin cell battery is that its voltage drops a lot under load. CR2320 Engineering Specifications.

I tested two LiPo batteries in the same circuit as last week, using a 5V regulator; they successfully powered my Photon, and the voltage only dropped 0.3V after connecting the Particle Photon. Considering my overall budget, though, I decided to use AAA batteries to build this prototype.


Sewing electronics onto fabric

 

Power supply issue

Following on from my previous blog post, the first problem I need to solve here is converting my 6V supply to 5V. I connected a 5V regulator to my coin cell batteries and, as you can see from the multimeter measurements in the photos below, the output came down from 6.32V to 3.86V.

How I connected the regulator

 

However, I had another problem after connecting this power supply to the Particle Photon. The Photon wouldn't light up properly: the status LED turned red, and when I kept trying, it sometimes didn't light up at all. At first I thought it might be a Wi-Fi connection problem, so I powered the Photon directly from a USB cable, and it worked fine. I then switched to a pack of three used AA batteries (4.08V) to test whether it would work, since I had powered it this way before and knew it could work. Under load, that supply measured 3.41V. When I connected the coin cell batteries back, the multimeter read 2.23V. I guess the problem is that the coin cell supply somehow drops very low once it's connected to the board, although I don't know the reason. I will consider using the AA battery pack for my prototype instead.

Sewing electronics onto fabrics

I followed this online course and this tutorial to practice my sewing skills and prepare for building my prototype.

One very important thing I have learned is to avoid short circuits when sewing with conductive thread, which can come from things like loose knots and loopy connections.

So first I made a 3V battery pouch for a 20mm coin cell battery, after downloading the stencil and cutting it out of the felt.


I did this wrong the first time because the conductive fabric should go through to the back

 

 


Solved communication from Unity to Particle Photon

There are only a few scattered resources on the internet about this kind of communication, but it actually follows the same concept as a net-controlled LED and is very easy. I created a button in Unity that sends a form to the HTTP server as a POST request.
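Here is a minimal Unity C# sketch of that idea, assuming Unity 2017.2 or newer and assuming the Photon exposes a cloud function named "led" that takes "on"/"off"; the device ID, access token, and function name below are placeholders, not the real values from my project.

    using System.Collections;
    using UnityEngine;
    using UnityEngine.Networking;

    // Hypothetical helper: hook ToggleLed() up to a UI Button's OnClick event.
    public class ParticleButton : MonoBehaviour
    {
        // Placeholders -- replace with your own device ID, access token, and function name.
        private const string DeviceId = "YOUR_DEVICE_ID";
        private const string AccessToken = "YOUR_ACCESS_TOKEN";
        private const string FunctionName = "led";

        private bool ledOn;

        public void ToggleLed()
        {
            ledOn = !ledOn;
            StartCoroutine(PostToParticle(ledOn ? "on" : "off"));
        }

        private IEnumerator PostToParticle(string arg)
        {
            string url = "https://api.particle.io/v1/devices/" + DeviceId + "/" + FunctionName;

            // The Particle cloud API expects a form-encoded POST with the function argument in "args".
            WWWForm form = new WWWForm();
            form.AddField("access_token", AccessToken);
            form.AddField("args", arg);

            using (UnityWebRequest request = UnityWebRequest.Post(url, form))
            {
                yield return request.SendWebRequest();   // older Unity versions used request.Send()

                if (request.isNetworkError || request.isHttpError)
                    Debug.LogError("Particle call failed: " + request.error);
                else
                    Debug.Log("Particle response: " + request.downloadHandler.text);
            }
        }
    }

On the Photon side this maps onto a matching Particle.function handler, so the same pattern covers the LED here and the servo control later on.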


 

Useful Resources:

https://docs.unity3d.com/ScriptReference/Networking.UnityWebRequest.Send.html

https://stackoverflow.com/questions/46003824/sending-http-requests-in-c-sharp-with-unity?utm_medium=organic&utm_source=google_rich_qa&utm_campaign=google_rich_qa

https://stackoverflow.com/questions/48627680/unitywebrequest-post-to-php-not-work#new-answer

https://docs.particle.io/guide/getting-started/examples/core/

https://answers.unity.com/questions/239248/web-service-in-a-button-event-handler.html

Related YouTube tutorial