Module 1 Activity Research

Weekly Activity Template

Tzu Yu Hwa


Project 1


Module 1

This activity shows a basic LED circuit built with the Arduino UNO. The breadboard connects an LED, a resistor, and jumper wires to form a simple input–output system. The LED lights up when the circuit is powered or when a signal is triggered, demonstrating how the board's digital pins send output to physical components. It is a simple but foundational example of how inputs and outputs work in physical computing.

Activity 1

The LED circuit is powered through the Arduino board connected to a laptop via USB, allowing quick testing and a live power supply. A close-up of the LED and resistor on the breadboard shows the wiring and how the resistor limits current for stable brightness. A top view of the Arduino UNO and breadboard highlights the connections between the digital pins, the resistor, and the LED output. Two LEDs are connected in parallel to demonstrate how multiple outputs can respond to the same signal from the Arduino pins. Finally, a push button is added to the circuit, enabling basic user input to control the LED's on-and-off behavior.
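The button-to-LED behavior described above boils down to a small piece of state logic. The sketch below illustrates it in Python rather than actual Arduino firmware; on the board the same idea would use digitalRead() on the button pin and digitalWrite() on the LED pin, and the "toggle" variant is an assumption about one possible wiring of the interaction, not the exact code used in the activity.

```python
# Sketch of the push-button -> LED logic from Activity 1, written in
# Python for clarity; on the Arduino the same idea maps to
# digitalRead() on the button pin and digitalWrite() on the LED pin.

def led_states(button_samples, mode="momentary"):
    """Return the LED state for each button reading.

    momentary: LED is on only while the button reads HIGH (1).
    toggle:    each press (0 -> 1 edge) flips the LED on or off.
    """
    led = 0
    prev = 0
    states = []
    for sample in button_samples:
        if mode == "momentary":
            led = sample
        elif sample == 1 and prev == 0:  # rising edge = a new press
            led = 1 - led
        prev = sample
        states.append(led)
    return states

# Holding the button keeps the LED lit in momentary mode...
print(led_states([0, 1, 1, 0]))  # [0, 1, 1, 0]
# ...while in toggle mode one press turns it on, the next turns it off.
print(led_states([0, 1, 0, 1, 0], mode="toggle"))  # [0, 1, 1, 0, 0]
```

On real hardware a debounce delay would also be needed, since a physical button can bounce between 0 and 1 within a few milliseconds of each press.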

Activity 2

Simple Arduino code was written to read the potentiometer’s analog values and send them to the serial monitor. This step tested the communication between hardware and software before integrating sensors. ProtoPie Connect shows the live link between Arduino and the digital prototype. Real-time temperature and humidity values appear here, confirming smooth data transmission. An AHT20 sensor was added to measure temperature and humidity. The updated code sends both data types through the serial port, preparing for real-time interaction with ProtoPie. An interactive smart home interface was built in ProtoPie. The temperature and humidity values on the screen respond instantly to the sensor data coming from Arduino. The completed prototype displays live readings from the sensor. Temperature and humidity values update continuously, demonstrating the connection between physical input and digital feedback.
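The serial link described above carries one line of text per reading. A minimal sketch of the host-side parsing is shown below in Python; it assumes the Arduino prints a comma-separated "temperature,humidity" pair per line (e.g. "23.5,61.2"), which is a common convention but not necessarily the exact format of the sketch uploaded to the board.

```python
# Sketch of the host-side parsing for Activity 2, assuming the Arduino
# prints one "temperature,humidity" pair per line, e.g. "23.5,61.2".
# The actual field order and separator depend on the uploaded sketch.

def parse_reading(line):
    """Turn one serial line into a (temperature, humidity) float pair."""
    temp_str, hum_str = line.strip().split(",")
    return float(temp_str), float(hum_str)

# In the real setup these lines arrive over the serial port via
# ProtoPie Connect; here we just feed in sample strings.
for raw in ["23.5,61.2", "24.1,60.8"]:
    temperature, humidity = parse_reading(raw)
    print(f"Temp: {temperature} C  Humidity: {humidity} %")
```

Keeping the wire format this simple is what lets the serial monitor, ProtoPie Connect, and any other host tool all read the same stream without extra decoding.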

Activity 3

The operator library in TouchDesigner is shown, where different node families such as COMP, CHOP, SOP, and DAT are used to build interactive systems. Serial data from Arduino is imported and processed using DAT and Select operators, converting sensor readings into usable numeric values. A 3D sphere model is created and linked with transform and material nodes to prepare for dynamic visual changes. Math and channel operations are applied to map sensor values to the sphere's visual properties, such as brightness or scale. The final output demonstrates color and light intensity changing in real time, reacting to data received from Arduino and visualizing the interaction between physical input and digital form.
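The mapping step above is a linear range remap: a sensor range is rescaled onto a visual range. The sketch below shows the formula in Python (TouchDesigner's scripting language); the 0–1023 input range matches a typical analogRead, but the specific ranges are illustrative assumptions, not values taken from the actual project file, where a Math CHOP's from-range/to-range parameters would do the same job.

```python
# Sketch of the value mapping used in Activity 3: a sensor range
# (e.g. 0-1023 from analogRead) is remapped onto a visual range such
# as brightness 0.0-1.0 or a scale factor. A Math CHOP's
# "from range / to range" parameters perform this same remap.

def remap(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max]."""
    return (value - in_min) * (out_max - out_min) / (in_max - in_min) + out_min

# A mid-range sensor reading becomes 50% brightness...
print(remap(511.5, 0, 1023, 0.0, 1.0))  # 0.5
# ...and the same reading can instead drive the sphere's scale.
print(remap(511.5, 0, 1023, 0.5, 2.0))  # 1.25
```

Because the remap is a single multiply-and-offset, the same sensor channel can be split and mapped onto several properties at once (brightness, scale, color) just by changing the output range.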

Research Activity


WGSN Consumer Trend Research

Both categories explore how UX/UI and AI use emotional design to build deeper connections between people and technology. They highlight the shift from functional interaction to empathetic, human-centered experiences that make technology feel more intuitive and emotionally aware.

This visual shows global research on how people emotionally engage with AI and technology. Findings include that 74% of U.S. teens feel peaceful yet anxious when offline, and 77% of Indian teens experience anxiety when separated from their phones. These statistics reveal the strong emotional bond between users and devices and stress the need for empathetic, emotionally aware UX/UI design in AI-driven interactions. The image shows Crdl, a musical device that uses touch to create sound and emotional connection between people. It illustrates how design can turn physical interaction into a shared emotional experience, aligning with the trend’s goal of bonding through technology. The data highlights that 92% of Gen Z prefer in-person relationships, and 50% of U.S. adults experience loneliness. This underlines the emotional gap modern technology needs to bridge and reinforces why emotion-driven UX/UI design is so valuable for fostering connection and wellbeing.

WGSN Personas Research

This section presents two personas based on WGSN research, showing how people connect with technology in emotional ways. One focuses on building human connection through empathy, while the other explores emotional intelligence through AI and ethical design.

A mindfulness scene features a person meditating beside a smart ambient device, illustrating how emotional technology supports calm, human-centered experiences that balance tech and emotion. A concept design of an AI-driven mobile interface prioritizes user privacy and emotional sensitivity, representing the rise of empathetic, minimalist UX/UI design for digital wellbeing. A timeline chart shows executives' predictions for when AI agents will surpass traditional apps and websites, highlighting accelerating trust in and expectations of AI integration in everyday digital interactions.

HMI Research

These examples highlight how emotional design bridges technology and human experience. From interactive storytelling toys to tactile devices and AI-driven wellness tools, each creation explores new ways to visualize emotions, foster empathy, and promote mindful digital interaction. Together, they demonstrate how design can transform emotional awareness into meaningful, human-centered connections.

This tactile device creates a calm, mindful experience through touch-based interaction. It helps users reconnect with their emotions and promotes comfort, awareness, and a sense of human warmth in daily life. This interactive storytelling toy encourages children to build emotional connection through dialogue and imagination. By using voice interaction, it fosters empathy, curiosity, and social engagement in a playful and meaningful way. These emotion-driven wellness products use AI and sensory feedback to visualize emotions through light, color, and sound. They represent how technology can support emotional wellbeing, self-awareness, and empathy in everyday routines.

Project Path

These visuals are part of my personal exploration combining digital art and emotion. Created through p5.js and hand-drawn elements, each piece captures the movement, rhythm, and warmth of human feeling — translating emotion into visual expression through both code and creativity.

This piece represents the fluid and expressive nature of emotions, blending light and texture to capture the warmth of human feelings through visual abstraction. The structure symbolizes the invisible patterns of emotional energy, transforming digital motion into a delicate, responsive form. This circular visual expresses emotional balance and inner calm, reflecting how technology can translate sound and touch into sensory harmony.

Project 1 Concept


This project aims to explore emotional visualization through interactive digital art. Using p5.js, it will transform human signals such as facial expression, sound, and movement into dynamic visuals that reflect the connection between people and technology.

This project explores the connection between human emotion and digital expression. Instead of using a photo frame, the concept focuses on the human body as a living interface, translating feelings into visual motion through gestures, expressions, or sound. Using p5.js with real-time input, the work visualizes emotions such as calmness, tension, or joy as evolving patterns of light and rhythm. It aims to create a poetic space where technology not only responds to emotion but also reflects how emotion can be seen, heard, and felt through digital form.
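One way to make the concept concrete is a single mapping from an emotion estimate to visual parameters. The sketch below is a minimal illustration in Python: a normalized "arousal" value (0 = calm, 1 = tense) drives both a hue and a pulse rate. The specific ranges are assumptions for illustration, not values from the final p5.js piece, where the same mapping would run once per frame on live input.

```python
# Minimal sketch of the Project 1 mapping idea: one normalized
# "arousal" value (0 = calm, 1 = tense) drives both a hue and a
# pulse rate. The ranges are illustrative assumptions only.

def emotion_to_visual(arousal):
    """Map a 0-1 arousal level to (hue_degrees, pulses_per_second)."""
    arousal = max(0.0, min(1.0, arousal))   # clamp noisy input
    hue = 220 - 220 * arousal               # calm blue -> tense red
    pulse_rate = 0.5 + 2.5 * arousal        # slow drift -> fast rhythm
    return hue, pulse_rate

print(emotion_to_visual(0.0))  # calm:  (220.0, 0.5)
print(emotion_to_visual(1.0))  # tense: (0.0, 3.0)
```

Clamping the input matters in practice, since live signals from a camera or microphone are noisy and can briefly overshoot the expected range.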