
Sprint 3

Asset design

As a 3D visual artist, my goal is to design a high-quality character within a central hub, serving as a guiding entity for a story-based game. This character should possess both facial and body movement capabilities, enabling immersive interaction with players.

First sprint goal

Exploration

In order to incorporate facial movements and expressions into my creation, I recognized the importance of understanding how people naturally use their faces when communicating. To gain insight, I actively sought out inspirational videos showing how humans use facial expressions during conversation. These videos provided valuable guidance on aspects such as eye blinking patterns, eyebrow movements, hand gestures, and mouth movement. Additionally, I explored videos showcasing the expressions of cyborgs, as my entity would have a humanized robot appearance. This research helped me capture the essence of realistic and engaging facial motions for my animated character.

Screenshot 2023-05-24 173104.png

To explore facial movement, I first tried Pose mode, which Human Generator supports through a pre-made rig. Since I had sculpted much of the face and body myself, movement through Pose mode didn't work. Researching other approaches to facial animation, I found shape keys, which would let me designate specific parts of the mesh and animate them. In this case, however, shape keys didn't work either, because my entity started from Human Generator, which already carries a set of preset shape keys. When I added a new shape key and entered Edit mode, all the sculpting work I had done disappeared.

Screenshot 2023-04-21 111916.png

Sculpted entity in Edit mode

Screenshot 2023-04-24 135422.png

Designated eyelids in Weight Paint mode

Screenshot 2023-04-24 141652.png

Changing keyframe in Edit mode

After realizing that shape keys were not compatible with Human Generator, I looked for another solution and went to a classmate for advice. He introduced me to the concept of weight painting, which involves grouping specific parts of the mesh and assigning movement to them. Upon reflection, I noticed that Human Generator did not support Edit mode: entering Edit mode would reset all shape key values to zero, erasing the sculpting work I had done. Nevertheless, I decided to take a risk and experiment with changing the location of the entity in Edit mode, just to see the outcome when returning to Object mode. Surprisingly, the changes I made persisted in Object mode, even though the entity appeared to have zero values in Edit mode. This revelation sparked an idea. I started to think about how I could designate a part and animate it through a keyframed value, and I decided to combine weight painting with shape keys. Through weight painting, I grouped the meshes where I desired movement, while shape keys enabled me to introduce movement with values ranging from 0 to 1. This approach allowed me to successfully animate the eyes, eyebrows, and cheek movements of the entity.

Body movement

Screenshot 2023-05-24 173232.png

Pose mode rig moving

Screenshot 2023-05-24 173200.png

Inserting a keyframe in the timeline

Screenshot 2023-05-24 173131.png

Once I had completed the work on facial expressions, I shifted my focus towards animating body gestures. Utilizing the pre-made rig provided by Human Generator, I began creating gestures with keyframes, adjusting them over time. To ensure authenticity, I studied how people naturally use their bodies when explaining something, and also studied the movements of entities in other games. I observed that the hands, wrists, and fingers tend to move in unison, initially displaying symmetrical motions before adopting more individualized movements. Furthermore, hand movements often involve corresponding shoulder movements. Armed with this research, I proceeded to create poses by adjusting the location and rotation of the various body parts. To ensure smooth animation, I set the frame rate to 60 fps, since movement in Unity tends to flow at around 72 fps.

Full animation

Synchronized with an audio sample

Reflection

During this sprint, I gained valuable insights into how the human body moves during communication. Additionally, I discovered an alternative method for animating facial expressions, beyond the limitations of relying solely on Human Generator. The weight painting technique proved to be a significant improvement, allowing more precise control over designated parts and their corresponding movements. I received positive feedback from a fellow group member, who commended the entity's realistic movements, particularly noting the synchronization of the mouth with the accompanying audio. These experiences have contributed to my personal growth and improved my overall understanding of animation techniques.

User Interaction

Sprint 3

As a 3D developer, I want to incorporate coding to enable interaction with a shoe model by attaching an interaction C# script to the shoe.

Second sprint goal

Exploration

During this semester, I had a goal of contributing to user interaction within our project. To achieve this, I collaborated with my group members to brainstorm potential ideas. After consideration, I decided to implement a scaling interaction feature for the shoe model. In the central hub of our game, where the shoe is prominently displayed in the middle of the room, players will have the ability to dynamically adjust the size of the shoe. This interactive feature will allow players to zoom in or out, providing them with an overview of the shoe from different perspectives.

C# scaling tutorial

https://www.youtube.com/watch?v=Zlwg3SIX7FA

Screenshot 2023-05-25 135204.png


Script from YouTube, missing "z" input

Screenshot 2023-05-25 140517.png

Added "z" input

Screenshot 2023-05-25 140719.png
Screenshot 2023-05-25 140743.png

Adding a lower limit so the size cannot decrease below zero.

Screenshot 2023-05-10 141326.png
Screenshot 2023-05-10 123658.png
Screenshot 2023-05-10 141956.png

In my search for tutorials on resizing objects through C# scripting, I came across a helpful guide that explained how to manipulate the size of 2D objects. I decided to apply this concept to a 3D cube and discovered that by adding the same value to the "Z" field, the object's size would change across all three dimensions. With this knowledge, I successfully implemented a script that used the "V" key to increase and the "C" key to decrease the object's dimensions. During my experimentation with size adjustments, I encountered a potential issue when working with the shoe model: as I kept decreasing the size, once it reached zero the object would flip and begin growing in the opposite direction. To address this concern, I asked my colleague Kevin for assistance. He suggested adding a limitation to the script that would prevent the shoe model from scaling below zero. With this solution, the script halts any further size reduction once it reaches zero, effectively resolving the problem of the flipped, inversely growing shoe.

Screenshot 2023-05-25 140918.png

Script for shoe scale interaction.
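The behavior described above can be sketched in a short Unity C# script. This is a minimal, hypothetical reconstruction based on the text, not the actual script: the key bindings (V to grow, C to shrink) follow the description, while the field names, scale speed, and minimum-scale value are illustrative assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of the shoe-scaling interaction described above.
public class ShoeScaler : MonoBehaviour
{
    [SerializeField] private float scaleSpeed = 0.5f; // scale change per second (assumed value)
    [SerializeField] private float minScale = 0.1f;   // lower limit that prevents the flip at zero

    void Update()
    {
        Vector3 scale = transform.localScale;

        // V increases the size, C decreases it, uniformly on all three axes.
        if (Input.GetKey(KeyCode.V))
            scale += Vector3.one * scaleSpeed * Time.deltaTime;
        if (Input.GetKey(KeyCode.C))
            scale -= Vector3.one * scaleSpeed * Time.deltaTime;

        // Clamp so the shoe never shrinks past the limit and starts
        // growing inverted in the opposite direction.
        float clamped = Mathf.Max(scale.x, minScale);
        transform.localScale = new Vector3(clamped, clamped, clamped);
    }
}
```

Clamping a single component and reusing it for x, y, and z works here because the shoe is scaled uniformly; a non-uniform scale would need each axis clamped separately.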

* Code language note *

int: a whole number, e.g. 1, 2, 3, 4, 5

float: a number with decimals, e.g. 1.2343

string: text, e.g. a name

Vector3: direction or position data with x, y, z components

; : closes the statement

public: can be adjusted from outside the script (similar effect to [SerializeField] private)

private: can only be changed inside the script; safer than public

// : a comment (note only)

R: scale (Unity editor shortcut)

T: rect/adjust tool (Unity editor shortcut)

E: rotate (Unity editor shortcut)

GetAxis: returns input as a value between -1 and 1

FixedUpdate: runs at a fixed rate (50 times per second by default), giving consistent behavior

SerializeField: exposes a private field (e.g. movement speed) in the Unity Inspector, which is more convenient than editing the script

Transform in a script refers to the object's Transform component in Unity

GetKeyDown vs GetKey: GetKeyDown applies once, on the frame the button is pressed; GetKey applies every frame while the button is held
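Several of the terms in the note above can be seen working together in one short Unity snippet. This is purely illustrative; the class name, field values, and input axis are assumptions, not code from the project.

```csharp
using UnityEngine;

// Illustrative example tying together terms from the code language note.
public class MoveExample : MonoBehaviour
{
    [SerializeField] private float speed = 2f; // private, but editable in the Inspector

    // FixedUpdate runs at a fixed interval (50x per second by default),
    // so movement here stays consistent regardless of frame rate.
    void FixedUpdate()
    {
        float h = Input.GetAxis("Horizontal"); // value between -1 and 1
        transform.position += new Vector3(h, 0f, 0f) * speed * Time.fixedDeltaTime;
    }

    void Update()
    {
        // GetKeyDown fires once, on the frame the key is first pressed.
        if (Input.GetKeyDown(KeyCode.Space))
            Debug.Log("Space pressed"); // just a note in the console
    }
}
```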

Reflection

Throughout this exploration, I acquired knowledge of C# scripting for adjusting object sizes and successfully incorporated the script into my project. During this process, I also discovered a potential issue with continuous size reduction and object flipping, which I addressed and resolved. I learned coding terms and how the scaling script works as well. In the beginning, even finding the right tutorial was hard, because searching for "scaling in Unity" only turned up basic scaling in the editor. However, through conversations with experienced game developers, I obtained the necessary keywords and successfully located the relevant tutorials. This journey has been quite exhilarating, as I have managed to contribute a small yet valuable enhancement to user interaction. Moving forward, my next challenge will be adapting this interaction for the Oculus experience, as the current implementation relies on computer keyboard input rather than the immersive VR controls.

© 2023 by Yoojin Seo.
