15.06.2025 - 06.07.2025 (Week 8 - Week 10)
Gunn Joey / 0366122
Experiential Design / Bachelor of Design (Hons) in Creative Media
Task 3: Project MVP Prototype
TABLE OF CONTENTS
1. Module Information Booklet
2. Lectures
3. Task 3
4. Feedback
5. Reflection
MODULE INFORMATION BOOKLET
LECTURES
Week 7:
Mr. Razif guided us through the steps required to build and run a Unity project on a mobile device. He explained how to prepare the project for deployment and configure the necessary settings for mobile testing. For those of us on Mac, we used Xcode to deploy the application onto our phones. This allowed us to test the app outside the Unity Editor and see how it performs on actual devices. The session gave us a clear understanding of the workflow needed to develop, test, and troubleshoot Unity projects for mobile platforms, which is a key part of mobile app development.
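As a small illustration of what this workflow looks like in script form, the sketch below drives the same Build Settings step from an editor menu item. It is only a sketch: the scene paths and output folder are placeholders, not our actual project files.

```csharp
// Editor-only sketch of the File > Build Settings workflow in code.
// Scene paths and the output folder below are placeholders.
using UnityEditor;

public static class MobileBuild
{
    [MenuItem("Build/iOS (Xcode project)")]
    public static void BuildForiOS()
    {
        string[] scenes = { "Assets/Scenes/Landing.unity", "Assets/Scenes/Home.unity" };

        // On iOS this generates an Xcode project; we then open it in Xcode
        // to sign the build and deploy it onto the phone.
        BuildPipeline.BuildPlayer(scenes, "Builds/iOS", BuildTarget.iOS, BuildOptions.None);
    }
}
```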
Week 8:
Week 9:
Week 10:
TASK 3
Once their proposal is approved, the students will work on a prototype of their project. The prototype will enable the students to discover limitations they might not have encountered before, and they will have to think creatively about how to overcome those limitations to materialize their proposed ideas. The objective of this task is for the students to test out the key functionality of their project; the output may not necessarily be a finished, visually designed project. The students will be gauged on their prototype's functionality and their ability to think creatively about alternatives to achieve the desired outcome.
Requirements
- Screen Design visual prototype (Figma)
- Functioning MVP of the App Experience
- Video walkthrough and presentation of the prototype
- Online posts in your E-portfolio as your reflective studies
Progress
A. Figma Screen Design Prototype
Fig - Demo Progress
Fig - Original Design (Figma Mockups)
Fig - Original Colour Scheme
Fig - Refinement (Figma Mockups)
Fig - Colour Scheme Refinement
B. Set Up Progress
Using the prototype from Task 2 as a starting point, we began by setting up the Normal Mode scene in Unity, which served as the main environment for user interaction and feature testing.
To begin, we imported the Vuforia Engine into Unity and set up both the AR Camera and Image Target. This allowed us to start creating AR effects that would appear on the specified image. We started by laying out the core visual elements and then introduced animations to make the experience feel more interactive and lively.
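As a rough sketch of how these target-driven effects can be wired up in code, the script below listens for Vuforia's tracking-status event and fires an animation when the image is detected. The Animator reference and the "Appear" trigger name are assumptions for illustration, not our exact setup.

```csharp
// Sketch using the Vuforia Engine 10+ observer API; the Animator and its
// "Appear" trigger are assumed names, not our actual project setup.
using UnityEngine;
using Vuforia;

public class TargetAnimationTrigger : MonoBehaviour
{
    public Animator contentAnimator;   // animator on the AR content (assumed)
    ObserverBehaviour observer;

    void Awake()
    {
        observer = GetComponent<ObserverBehaviour>();
        observer.OnTargetStatusChanged += OnStatusChanged;
    }

    void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // Play the entrance animation whenever the image target is tracked.
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        if (tracked) contentAnimator.SetTrigger("Appear");
    }

    void OnDestroy()
    {
        if (observer != null) observer.OnTargetStatusChanged -= OnStatusChanged;
    }
}
```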
Designing our avatar was a hands-on creative process. We started by sketching and illustrating it manually using Procreate, which gave us the freedom to add a personal and artistic touch. Once we were happy with the initial drawing, we brought it into Adobe Illustrator to refine the details further. There, we added background elements, subtle glow effects, and other enhancements to elevate its visual appeal and ensure it fit well within our overall design style.
With the avatar complete, we shifted our focus to building the landing scene. We incorporated a carefully chosen background image to set the tone and atmosphere, then added interactive buttons to make the scene functional and engaging for users. This combination of hand-drawn elements and interactive design helped create a welcoming and polished entry point to our AR experience.
The Home Scene was designed to follow the same visual structure as the landing scene, maintaining consistency in layout and style. However, instead of a single button, we introduced two distinct options: one for Normal Mode and another for AR Mode. This allowed users to easily choose how they wanted to engage with the application, depending on their preferences or needs at the moment.
Fig - Image Target (Business Card)
1. Normal Mode Scene
Fig - Normal Mode Progress (Import Image Target & Assets)
Fig - Normal Mode Progress 2 (Animation)
Fig - Normal Mode Progress 3 (Video Clip)
Next, we brought in the video clip and made sure all visual assets were configured to follow the Image Target. By placing everything in World Space, we ensured that the AR content remained properly anchored and responsive to user movement. After testing the scene and confirming that all components worked smoothly, we continued by developing the next part of the experience.
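A minimal sketch of the video portion of this setup, assuming a VideoPlayer component on a quad parented under the Image Target (the component references and method names here are illustrative):

```csharp
// Minimal sketch: a VideoPlayer on a quad parented under the Image Target.
// The VideoPlayer reference is assigned in the Inspector (assumed setup).
using UnityEngine;
using UnityEngine.Video;

public class TargetVideo : MonoBehaviour
{
    public VideoPlayer videoPlayer;

    void Start()
    {
        videoPlayer.playOnAwake = false;  // wait for tracking before playing
        videoPlayer.isLooping = true;
    }

    // Called from a target-found event so the clip only runs while the
    // business card is in view.
    public void PlayClip() => videoPlayer.Play();

    public void StopClip() => videoPlayer.Stop();
}
```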
Fig - Normal Mode Outcome (Layout)
Fig - Normal Mode Preview
2. Landing Scene
Fig - Avatar png (Adobe Illustrator)
Fig - Landing Page Outcome
Fig - Landing Page Outcome Preview (GIF)
3. Home Scene
Fig - Home Scene Outcome Preview (GIF)
To make the interface feel more interactive and user-friendly, we added responsive features to the buttons in both scenes. When users hover over a button, it visually responds to indicate that it is clickable. Additionally, when a button is pressed, it gently scales down to give a sense of tactile feedback. These small but thoughtful details helped create a smoother and more engaging experience, encouraging users to interact confidently with the interface.
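This kind of feedback can be scripted with Unity's EventSystem pointer interfaces; the sketch below shows one way to do it, with scale factors that are assumptions rather than the exact values we used.

```csharp
// Hedged sketch of hover/press feedback; the 1.05/0.95 scale factors are
// illustrative values, not our exact settings.
using UnityEngine;
using UnityEngine.EventSystems;

public class ButtonFeedback : MonoBehaviour,
    IPointerEnterHandler, IPointerExitHandler,
    IPointerDownHandler, IPointerUpHandler
{
    Vector3 baseScale;

    void Awake() => baseScale = transform.localScale;

    public void OnPointerEnter(PointerEventData e) => transform.localScale = baseScale * 1.05f; // hover: grow slightly
    public void OnPointerExit(PointerEventData e)  => transform.localScale = baseScale;         // reset
    public void OnPointerDown(PointerEventData e)  => transform.localScale = baseScale * 0.95f; // press: shrink
    public void OnPointerUp(PointerEventData e)    => transform.localScale = baseScale * 1.05f; // release
}
```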
Fig - Button Settings
Fig - Buttons code Progress (Navigate to scenes)
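A minimal version of the navigation script in the screenshot above might look like the following; the scene name passed in from each button is a placeholder.

```csharp
// Sketch of button-driven scene navigation; scene names are supplied per
// button in the Inspector, so the ones used here are placeholders.
using UnityEngine;
using UnityEngine.SceneManagement;

public class SceneNavigator : MonoBehaviour
{
    // Hooked up to each Button's OnClick() with the destination scene name,
    // e.g. LoadScene("HomeScene") or LoadScene("NormalMode").
    public void LoadScene(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```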
4. AR Mode
Fig - AR Mode Outcome Preview (Screen Overlay)
5. Learn About Me Scene
When users tap the "Learn About Me" button, the avatar stays on screen and begins to speak, introducing itself in a friendly and engaging manner. This interaction is designed to feel more personal, giving users the sense that they are having a direct conversation with the character rather than just watching a scripted animation. To enhance this experience, the avatar’s self-introduction is accompanied by subtle animations and facial expressions that match the dialogue.
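One way this interaction could be wired up is sketched below, assuming the avatar has an Animator with a "Talk" trigger and an AudioSource carrying the voice-over; both names are hypothetical.

```csharp
// Sketch of the self-introduction trigger; the "Talk" trigger and the
// AudioSource voice clip are assumed names, not our actual assets.
using UnityEngine;

public class AvatarIntroduction : MonoBehaviour
{
    public Animator avatarAnimator;  // drives the talking expressions (assumed)
    public AudioSource voiceLine;    // the recorded self-introduction (assumed)

    // Assigned to the "Learn About Me" button's OnClick() in the Inspector.
    public void Introduce()
    {
        avatarAnimator.SetTrigger("Talk");
        voiceLine.Play();
    }
}
```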
Fig - Learn About Me Scene Outcome Preview (Screen Overlay)
6. Watch My Intro Scene
Fig - Watch My Intro Scene Outcome Preview (Screen Overlay)
7. Ask Me Anything Scene
For the "Ask Me Anything" page, I designed an interactive feature that simulates a conversation with the avatar, activated through a simple button click. To begin, I planned and arranged the layout of the page to ensure it was user-friendly and visually balanced. Once the structure was in place, I added animations to make the interaction feel more dynamic and engaging.
To enhance the realism of the experience, I used the button’s OnClick() function to trigger the conversation. This means the interaction only appears after the user actively presses the button, creating a sense of responsiveness and giving the impression that the avatar is directly answering their prompt. This thoughtful approach helped make the interaction feel more natural, as if the user were genuinely chatting with the avatar in real time.
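Sketched minimally, that OnClick() flow could look like the following; the reply panel, Animator, and trigger names are placeholders rather than our actual object names.

```csharp
// Sketch of the Ask Me Anything flow: the reply stays hidden until the user
// presses the button. Object and trigger names below are placeholders.
using UnityEngine;

public class AskMeAnything : MonoBehaviour
{
    public GameObject replyPanel;    // the conversation bubble (assumed)
    public Animator avatarAnimator;  // plays the answering animation (assumed)

    // Assigned to the Ask button's OnClick() in the Inspector.
    public void OnAskPressed()
    {
        replyPanel.SetActive(true);          // reveal the reply only on demand
        avatarAnimator.SetTrigger("Answer"); // make the avatar react
    }
}
```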
Fig - Ask Me Anything Progress (Button onclick)
Fig - Ask Me Anything Progress (Animator)
Fig - Onclick Settings
Fig - Ask Me Anything Outcome Preview (GIF)
C. Final Submission
Google Drive Link:
Walkthrough Video:
Presentation Video:
FEEDBACK
REFLECTION
During the development process, I noticed that interactive features had a strong impact on user engagement. Animations, responsive buttons, and character movements made the app feel more lively and immersive. A simple static interface did not create the same effect. I also observed that consistency in layout and color choices helped the app feel more professional and easier to use. When the design elements worked well together, it made navigation smoother and more intuitive. These small details contributed to a better overall experience for users.
Through this task, I learned that successful AR design is not just about visual quality but also about how the user interacts with the content. Every element needs to serve a purpose and work together to create a clear and enjoyable experience. Placing assets correctly in World Space and linking them to the Image Target helped ensure that the AR content stayed anchored and responsive. I also realized the importance of refining a design based on feedback. By improving the color scheme and adding more interactive features, the final version of the app became more effective and user-friendly. This experience showed me the value of combining creativity with functionality in AR development.