Graduation Project — Motion-Sensing Product-Service System for People with Disabilities
Jan 2016
Student
20 mins

Through two core touchpoints (motion-sensing smart hardware and interactive software), this project aims to help special groups, typified by the deaf-mute community, express themselves more freely in daily life. During travel, for instance, it helps them stay aware of their surroundings, intervenes effectively in dangerous situations to keep them safe, and assists with communication in everyday scenarios. These tools are designed to compensate organically for physical impairments.
As a designer, my goal is to advance an accessible society: promoting empathy, embracing diversity, and championing inclusive, barrier-free design, so that every individual has the opportunity for full self-expression, can show their personality, achieve equality, pursue happiness, and enjoy the rewards of learning and creating.
The following video demonstrates simple, universal sign language vocabulary, such as: "I", "know", "you", "happy", "of", "eat", and "people".
(If the video does not play smoothly, a VPN may be required.)

The Product-Service System (PSS) shifts the focus of business activity from designing and selling physical products alone to offering integrated solutions that combine products and services to meet specific customer needs through strategic innovation. It extends physical products sustainably into models foreseeable in a post-industrial society, connecting intangible services and tangible products into one integrated system. The value of this approach lies in strengthening a technological ecosystem shaped by its participants, technological artifacts, services, business models, and forces such as sustainable development and dematerialization.
Motion-sensing technology fulfills the sensory, emotional, and social needs of individuals with disabilities while enabling self-expression. It reduces these users' cognitive load, adapts to their diverse usage contexts, and markedly elevates their engagement and emotional experience.
Interaction design for special groups represents a classic model of the multimodal Natural User Interface (NUI). NUI employs multimodal interactive technologies such as eye tracking, speech recognition, gesture input, facial recognition, and sensory feedback. It lets users interact with computers or mobile devices in parallel, imprecise ways by drawing on their natural sensory and cognitive abilities, improving the naturalness and efficiency of human-computer interaction. In an ideal NUI, users can concentrate fully on their activities while forgetting the interface entirely, achieving truly "natural" interaction.
Principles for motion-sensing interaction design for special groups: 1) Non-intrusive interactions for physically capable deaf-mute individuals, 2) The utilization of physical mediums, 3) Multimodal fusion with context awareness, 4) Correct and straightforward conceptual models, 5) Feedback and constraints, 6) Consideration of system scalability.
Systematicity, centering the whole design on a unified subject
Foresight, where new technologies uncover more possibilities for users
Complexity, ensuring reliable, high-quality service across diverse scenarios
Convenience, prioritizing user-friendly operation in the system design
Choice of research methods and tools: 1) Objective observation of disabled individuals as research subjects; 2) Experience prototyping method to simulate real-life scenarios for disabled users.
1) Address the daily challenges faced by deaf-mute individuals, unearth their emotional and psychological needs, and prioritize specific content to provide a foundational basis for upcoming product feature design. 2) Understand the usage patterns and operational suggestions from deaf-mute individuals regarding smartphones, mobile apps, and the mobile internet, guiding future product interaction design. 3) Further explore their understanding and acceptance of motion-sensing interaction technology as a reference for optimizing service design.
The target research subjects are mostly deaf-mute individuals, ranging from school-age children to young adults. This demographic has already mastered sign language and is in a phase of growing life experience and accumulating knowledge. They also hold a keen interest in emerging novelties such as consumer electronics and mobile applications, which eases the adoption of new software and hardware. Most of their daily inconveniences and needs surface naturally, making them easier for researchers to observe and record, and providing critical design evidence.
Physical disabilities, because the site and severity of injury vary unpredictably, are hard to quantify within the limit parameters of human factors engineering; the sheer number of unstable variables would make a product-service system prohibitively complex. By contrast, people with speech and hearing impairments generally share a single, well-defined area of disability, making them a highly effective entry point for designing a well-structured PSS and laying the foundation for future academic research in related fields.
According to a 2008 report, special groups consist primarily of individuals with speech and hearing disabilities. Prolonged inability to communicate often leads to alexithymia, accompanied by psychological and mental health challenges, making societal acceptance difficult. Moreover, disability acquired at different life stages produces varying gaps between knowledge and capability, along with assorted lifestyle barriers, resulting in diverse patterns of social life.
The activity characteristics of special groups primarily manifest as limited physical activity spaces; brief periods of effective engagement; and high costs coupled with low quality of participation. In other words, within the experiential makeup of special groups, instrumental performance is the primary evaluation metric, and it must deliver strict efficiency.
Through careful and meticulous observation, designers can identify what triggers users to pick up a product, their interactions with the environment, any personalized modifications made to products to meet specific needs, latent product attributes, and hidden user needs. Researchers stepping personally into the contexts where users operate products allows data to be gathered first-hand, with minimal distortion, through authentic real-world observation.
1) During observations, three deaf-mute individuals exhibited varying degrees of hearing impairment and demonstrated a reliance on family care in their daily lives.
2) Observations prominently highlighted patterns of social anxiety.
3) During observations, communication between the three subjects and their immediate family members was adequate, but communication quality dropped significantly when interacting with extended relatives or friends.
Physiological Simulation: Earplugs and face masks were used to physically simulate the physiological limitations of hearing loss and impaired expressive ability during character immersion.
Roleplay: Deaf-mute Person A, Person A's Mother, Deaf-mute Person B, Deaf-mute Person C, Store Clerk, Driver
1) Through scenario immersion, researchers and designers recognized that the hearing-impaired face significant travel pressure in complex traffic environments. It was further identified that due to physiological limitations, the deaf-mute community struggles to express their feelings during sudden incidents, describe ongoing situations, or seek help effectively. A reliable solution is essential to allow hearing-impaired individuals to send SOS signals and grant them a sense of security while traveling, making the product design their emotional anchor in such scenarios.
2) Sign language is the most common communication method for the hearing-impaired in daily life, but it usually fails when interacting with the general public. This results in widespread social anxiety among this demographic, which calls for an appropriate design solution.
3) Through scenario simulation, it was concluded that sign language translation services should be shifted from offline to online via an app, restoring the most natural communication environment and helping deaf-mute individuals experience the joy of connection.
User Behavior Trajectories and Key Scenarios Summary
User Experience Journey Map Summary
1) Help individuals with disabilities understand their surroundings and ensure travel safety.
2) Professional sign language translation bridging communication barriers.
3) Visualized voice-to-text input.
4) Family sharing of sign language services to foster a harmonious home environment.
5) Emergency alarm functionality for sudden incidents.
The "Hand-Translate" hardware embraces two forms of wearable design: neck-mounted and clamp-on, addressing the crucial focus of portability in interaction design for special groups. This hardware satisfies the explicit requirement of overcoming communication barriers while meeting the implicit desire of deaf-mute individuals to express individuality. Furthermore, the design has a feature to capture vehicle honking noises and send vibration alerts to mobile devices to keep them safe on the road. Technical support is built on the infrared sensors and SDK interface provided by Leap Motion, creating a non-intrusive motion-sensing interactive technology that is highly suited for people with disabilities. The core features operate the device switch, emergency alarm triggering, and real-time sign language translation. It serves the disabled community by rebuilding natural face-to-face communication models between them and non-disabled speakers, shattering walls of isolation, and empowering hearing-impaired people to reintegrate into society and create wealth in both the social and psychological spheres.
According to the position guidelines for sign language interpretation defined in the national standard "GB/T 24435-2009, Basic Sign Language Gestures of the People's Republic of China": standard gestures are generally performed within a spatial range extending below the forehead, above the waist, between the shoulders, and roughly two palm-widths outward from the chest (as shown below). Some specific gestures may be executed in other appropriate positions depending on the expressed content.
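To make that positional rule concrete, here is a minimal sketch of how a translation pipeline might gate gestures to the signing space described above. The body landmarks and calibration step are assumptions for illustration, not part of the standard.

```python
from dataclasses import dataclass


@dataclass
class BodyCalibration:
    # Landmark coordinates in millimetres, assumed to come from a
    # one-off user calibration step (hypothetical).
    forehead_y: float
    waist_y: float
    left_shoulder_x: float
    right_shoulder_x: float
    chest_z: float
    palm_width: float


def in_signing_space(palm_x: float, palm_y: float, palm_z: float,
                     body: BodyCalibration) -> bool:
    """True if the palm lies in the signing region described by the text:
    below the forehead, above the waist, between the shoulders, and
    within roughly two palm-widths outward from the chest."""
    if not (body.waist_y <= palm_y <= body.forehead_y):
        return False
    if not (body.left_shoulder_x <= palm_x <= body.right_shoulder_x):
        return False
    outward = abs(palm_z - body.chest_z)
    return outward <= 2.0 * body.palm_width
```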
The mobile app revolves around four core features:
1) GPS tracking monitors the user's location in real time so that help can arrive quickly in an accident.
2) During transit, the app listens to the environment for vehicle horns and vibrates the phone, a critical travel-safety warning for deaf-mute users.
3) Video calling provides sign-language-to-voice translation alongside voice-to-text dictation, shifting face-to-face communication from an offline paradigm to an online one and restoring a natural way of communicating. After each call, users are asked to rate the translation service, drawing them into the product development lifecycle so that users and the service grow together; routine system updates let the software and hardware iterate as user needs broaden.
4) Emergency alarm: when the user encounters a sudden incident, the app rapidly dispatches alerts to family contacts to secure immediate assistance, and the user cancels the alarm once the situation is resolved. This mechanism delivers rapid feedback and directly informs contacts of the ongoing emergency.
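The horn-listening feature (point 2 above) amounts to a lightweight audio-classification loop. Below is a minimal sketch assuming the `sounddevice` and `numpy` Python packages; the frequency band, energy threshold, and `trigger_vibration` hook are hypothetical placeholders, since a production detector would use a trained classifier and the phone platform's haptics API.

```python
import numpy as np
import sounddevice as sd  # assumed dependency for microphone capture

SAMPLE_RATE = 16_000          # Hz
BLOCK_SECONDS = 0.5           # analysis window length
HORN_BAND = (300.0, 1500.0)   # hypothetical horn fundamental range, Hz
ENERGY_THRESHOLD = 0.05       # hypothetical RMS loudness gate


def looks_like_horn(block: np.ndarray) -> bool:
    """Crude heuristic: loud enough, with the spectral peak in the horn band."""
    rms = float(np.sqrt(np.mean(block ** 2)))
    if rms < ENERGY_THRESHOLD:
        return False
    spectrum = np.abs(np.fft.rfft(block))
    freqs = np.fft.rfftfreq(len(block), d=1.0 / SAMPLE_RATE)
    dominant = freqs[int(np.argmax(spectrum))]
    return HORN_BAND[0] <= dominant <= HORN_BAND[1]


def trigger_vibration():
    # Hypothetical hook: on a phone this would call the platform haptics API.
    print("HORN DETECTED -> vibrate")


def monitor(duration_seconds: int = 60):
    def callback(indata, frames, time_info, status):
        if looks_like_horn(indata[:, 0]):
            trigger_vibration()

    with sd.InputStream(channels=1,
                        samplerate=SAMPLE_RATE,
                        blocksize=int(SAMPLE_RATE * BLOCK_SECONDS),
                        callback=callback):
        sd.sleep(duration_seconds * 1000)  # sounddevice sleeps in milliseconds


if __name__ == "__main__":
    monitor()
```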