
The Winter Time Project

The Winter Time MR Project is an immersive mixed reality initiative designed to address the challenges of Seasonal Affective Disorder (SAD) and related mood disorders that often emerge during the colder, darker months of the year. It leverages MR technology to give individuals an interactive, engaging way to combat the symptoms of SAD and enhance their overall well-being during the winter season. Developed by Shuting Lei, Peiru Chen, and Yawen Chen from the Goldsmiths MA/MSc Virtual & Augmented Reality programme.

As a VR designer, I contributed to MR development, UI/UX design, sound design, interactive and visual effects (VFX) design, integration, testing, and video editing.

Why create an MR project about winter?


01

Innovative Use of Technology for Well-being:

Winter months are often associated with Seasonal Affective Disorder, a type of depression related to the changing seasons. By creating immersive and interactive experiences, the project offers a novel approach to improving mood during these months.

02

Advancements in MR Technology with Head-Mounted Devices:

With the evolution of MR technology, the use of head-mounted devices, such as the Quest 3, has become increasingly feasible. These devices offer a more immersive experience compared to smartphones by seamlessly blending virtual interactions with the real world. The head-mounted display provides a more encompassing and interactive environment, enhancing the user's sense of presence within the MR world.


03

Leveraging Environmental Changes:

Winter is characterized by shorter days and longer nights, leading to reduced exposure to natural sunlight. This can affect mood and energy levels. An AR project can simulate brighter and more stimulating environments, countering the lack of natural light and its psychological impacts.


04

Enhancing Outdoor Activities in Winter:

The cold and often lifeless nature of winter leads to a significant decrease in outdoor exploration and physical activity. By integrating MR experiences that can be accessed outdoors, the project encourages users to engage in physical activity and exploration, even in colder weather. This not only combats the sedentary lifestyle commonly adopted in winter but also adds an element of fun and discovery to outdoor environments.

"In winter's grasp, 
we yearn for the sun's gentle touch,
for moments of dazzle in winter’s cold.
 
When skies are gray,
colors burst forth,
in a virtual display.
A technological haven,
an escape from the chill."


------Shuting Lei & Ross Bowes

Get Inspired

MR Features

The Winter Time MR Project incorporates several key mixed reality (MR) functionalities:

1

Outdoor Exploration Enhancements

2

Mood-Enhancing Visuals

3


Based on Quest 3 & Unity

4

Hand Tracking

5

Location-Based Interaction

6

Interactive Elements

User Experience Design

First of all, find a wide, flat piece of open land!

Players will first enter the initial interface and read the introduction of the project.

Next, players automatically enter the experience. They only need to follow the prompts in the scene to experience the full functionality.


1

There are seven circles, representing the seven colors of the rainbow.

2

After you trigger the final color, purple, you'll receive a rainbow that can be tracked by gestures.

3

You can see fireflies gathering together, and when you get close to them, they scatter.

4

When you step into the circle at the center of the scene, it causes changes in the sun.

Project Development Process

11.20-11.22

Setting up the Unity development environment

11.23-12.3

Scene Design
User Experience Design

11.23-12.13

Location Interaction
Gesture Tracking Interaction

Visual Effects Design

12.10-12.12

UI design
Sound Design

12.11-12.13

Integration
Testing

Personal Contribution

01

AR development environment setup

Since this was the first time my teammates and I had developed an AR project in Unity, we needed to set up an AR development environment in the early stages of development.

I used the Oculus XR framework and installed the Oculus Integration packages. After completing the setup and running a simple test on Quest 3, I sent the AR environment setup files to my teammates to support their subsequent development.


02

UI design

I created the opening interface and project introduction page, using the sun element as our main visual.


Also, I've added text prompts on the map to guide viewers through the entire project experience, with unique prompts for each interactive experience. 


03

Background Music

Background music plays immediately when the player enters the initial interface. I used DontDestroyOnLoad to ensure the sound continues playing across different scenes without interruption.

using UnityEngine;

public class ContinuePlayingMusic : MonoBehaviour
{
    void Awake()
    {
        // If a music object already survived an earlier scene load, destroy this duplicate.
        GameObject[] musicObj = GameObject.FindGameObjectsWithTag("GameMusic");
        if (musicObj.Length > 1)
        {
            Destroy(this.gameObject);
            return;
        }

        // Keep this object (and its AudioSource) alive across scene loads.
        DontDestroyOnLoad(this.gameObject);
    }
}

04

Interactive and VFX design

I designed the rainbow-trail VFX and used a VFX Property Binder to drive it with the tracked position of the hand.
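The hand-position binding can be sketched as a custom binder built on Unity's VFXBinderBase. This is a minimal sketch, assuming a handAnchor Transform supplied by the hand-tracking rig and an exposed Vector3 property named "HandPosition" in the Visual Effect Graph (both names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.VFX;
using UnityEngine.VFX.Utility;

// Sketch: writes a tracked hand's world position into an exposed
// Vector3 property of a Visual Effect Graph each frame.
public class HandPositionBinder : VFXBinderBase
{
    public Transform handAnchor;  // assumed: anchor from the hand-tracking rig

    [VFXPropertyBinding("UnityEngine.Vector3")]
    public ExposedProperty positionProperty = "HandPosition";  // illustrative name

    public override bool IsValid(VisualEffect component)
    {
        // Only bind when the anchor exists and the graph exposes the property.
        return handAnchor != null && component.HasVector3(positionProperty);
    }

    public override void UpdateBinding(VisualEffect component)
    {
        component.SetVector3(positionProperty, handAnchor.position);
    }
}
```

Placed alongside a VFX Property Binder component, a binder like this makes the rainbow trail follow the tracked hand without any per-frame glue code elsewhere.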


After players trigger the final color, purple, they'll receive a rainbow that can be tracked by gestures. To achieve this functionality I used OnTriggerEnter, Invoke, and Collider.


using UnityEngine;
using UnityEngine.Events;

public class Trigger : MonoBehaviour
{
    [SerializeField] UnityEvent onTriggerEnter;
    [SerializeField] UnityEvent onTriggerExit;

    void OnTriggerEnter(Collider other)
    {
        onTriggerEnter.Invoke();
    }

    void OnTriggerExit(Collider other)
    {
        onTriggerExit.Invoke();
    }
}
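The seven-circle sequence itself can then be driven by a small manager wired to each circle's UnityEvent. This is a hypothetical sketch (the class, field, and method names are illustrative, not the project's actual code) showing how MonoBehaviour.Invoke can reveal the rainbow shortly after the final, purple circle fires:

```csharp
using UnityEngine;

// Hypothetical sketch: counts triggered rainbow circles and, once the
// final (purple) one fires, reveals the hand-tracked rainbow after a delay.
public class RainbowSequence : MonoBehaviour
{
    [SerializeField] int totalCircles = 7;       // seven colors of the rainbow
    [SerializeField] GameObject rainbowEffect;   // VFX object, disabled at start
    [SerializeField] float revealDelay = 1.5f;   // seconds before the reveal

    int triggeredCount;

    // Hooked up to each circle's Trigger component via its UnityEvent.
    public void OnCircleTriggered()
    {
        triggeredCount++;
        if (triggeredCount >= totalCircles)
        {
            // Invoke calls the named method after the given delay.
            Invoke(nameof(ShowRainbow), revealDelay);
        }
    }

    void ShowRainbow()
    {
        rainbowEffect.SetActive(true);
    }
}
```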

I used the particle system to design the VFX of the ground halo. In the scene, when the player steps into the halo, a rainbow will spray out.
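A minimal sketch of the halo interaction, assuming the halo has a trigger collider and the player rig is tagged "Player" (both assumptions, not confirmed by the project files):

```csharp
using UnityEngine;

// Sketch: plays a rainbow ParticleSystem while the player stands in the halo.
[RequireComponent(typeof(Collider))]
public class HaloSpray : MonoBehaviour
{
    [SerializeField] ParticleSystem rainbowSpray;  // assigned in the Inspector

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            rainbowSpray.Play();
        }
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            rainbowSpray.Stop();
        }
    }
}
```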


05

Integration & Testing 

After each team member completed their part, they exported their files and sent them to me, and I imported and integrated them one by one on my computer.


During the integration process, I ran into some issues. Because of the project's heavy visual effects, the application jittered and froze when running on Quest 3, which seriously affected the player experience. I therefore optimized all of the VFX, reducing particle counts while maintaining good visual quality.
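One way to apply such an optimization pass is a helper that scales down the maximum particle count of every particle system in the scene. This is an illustrative sketch under that assumption, not the exact tooling used in the project:

```csharp
using UnityEngine;

// Illustrative sketch: uniformly scales down maxParticles across
// all ParticleSystems in the scene to reduce load on Quest 3.
public class ParticleBudget : MonoBehaviour
{
    [SerializeField, Range(0.1f, 1f)] float scale = 0.5f;  // 0.5 = halve counts

    void Start()
    {
        foreach (var ps in FindObjectsOfType<ParticleSystem>())
        {
            var main = ps.main;  // MainModule is a struct wrapper; edits apply to the system
            main.maxParticles = Mathf.Max(1, Mathf.RoundToInt(main.maxParticles * scale));
        }
    }
}
```

In practice per-effect tuning gives better results than a uniform scale, but a pass like this makes it easy to find a frame-rate budget quickly.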


06

Screen recording & Editing

After continuous adjustment and testing, the final project ran completely and smoothly on Quest 3. Finally, I recorded the full experience on the college green of the Goldsmiths campus and edited the footage into the project demo video.


Reflection

As a first-time developer entering AR programming in Unity, this project brought a completely new set of hurdles. I had to start from scratch and learn everything from the ground up. Initially, my team and I were at a loss about where to start, especially because the Quest 3 was such a recent device and there were few tutorials or case studies on using an HMD for AR development. Fortunately, with the support of professors and senior students, we were able to find our way 🤝❤️🤝.

Coming from artistic backgrounds 🎨, none of us had experience with software development or coding 🛠️, which made the programming side very difficult. Our strength, however, was in producing spectacular visual effects 🤗✨💥, so we decided to play to it and deliver an impressive visual experience, applying our expertise while using minimal code to enable the interactive parts.

Fortunately, the vast amount of Unity resources available online 🌏, together with the cooperation of our seniors and tutors, allowed us to create stunning dynamic effects. I hope our research provides some comfort during the chilly winter months ☀️❄️🔥, and that AR and MR technologies can be used more extensively to improve everyday life. I am proud of the hard work and outcomes that my team and I have accomplished over this time 🤟🙏🤞.

Looking ahead, we hope to continue improving our project by introducing more interesting interactive elements and fine-tuning nuances such as interaction sound design 📢 and haptic feedback. Finally, my aim is to create a series of projects that can transform any flat surface on Earth into a personal playground 🛝, showing the joy that technology gives. What an exciting prospect 🤩🤩🤩!
