AROW
Designing an instant, intelligent way to call for help.
Role:
Brand Designer, Motion Designer, Visual Designer
Duration:
3 Months, 2024
Category:
Visual Design, Motion Design
Tools:
Figma, Photoshop, After Effects, Cinema 4D, Blender, Redshift
Summary
AROW reimagines how emergency support systems can serve people in danger: quickly, intelligently, and without friction. The concept envisions a product built for real-world urgency, where users might be unable to speak, think clearly, or even unlock their phone. Through AI and contextual sensing, AROW detects potential distress and automates the response, sending alerts, location data, and status updates to trusted contacts or local authorities within seconds.
This project challenges conventional UX boundaries by combining product design, interaction strategy, and ethical AI to create a system that saves time, and potentially lives.
Design Thinking
This project began with a question: How might we make emergency response effortless for ANYONE, ANYWHERE? Through rapid prototyping and behavioral research, I focused on minimizing cognitive load during panic in every action, motion, and visual cue. AROW’s AI logic interprets sensor data and user context, transforming complex decisions into automated, human-centered actions. Designing AROW meant designing for incapacity, coercion, noise, panic, and ambiguity.
danger scenarios → user limitations → design responses
Silent / Coercive Situations
When users can’t speak or must hide their actions, AROW allows silent activation through discreet gestures or hardware triggers. The interface stays dark, sending live location data and alerts invisibly.
Physical Impairment / Injury
If sudden impact or loss of movement is detected, AROW automatically initiates a safety check or dispatches help with the last known coordinates.
Unable to Speak or Hear
For users who can’t communicate verbally, AROW uses tap patterns, icons, or visual prompts to send specific alerts — ensuring accessibility for all.
Crowded or Noisy Environments
In high-noise settings like stadiums or events, haptic feedback and bold visual cues replace audio prompts to maintain reliable interaction.
Remote or Low-Connectivity Areas
Offline-first functionality stores alerts locally and relays them once signal returns, keeping safety active even without a network.
Transportation Risks
If a rideshare route deviates or a car crash is detected, AROW auto-verifies the situation and alerts emergency contacts — no manual input required.
Domestic or Repeat Abuse
For recurring threats, AROW lets users set exclusion zones and automatic proximity alerts to warn them or trusted contacts when danger approaches.
Public or Large-Scale Emergencies
During disasters or public incidents, AROW switches to a contextual mode—providing evacuation routes, local alerts, and quick SOS broadcast functions.
Sensitive or Legal Barriers
For users unable or unwilling to involve authorities, AROW allows routing alerts to trusted contacts or community responders, preserving safety and choice.
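The offline-first behavior described above (storing alerts locally and relaying them once signal returns) can be sketched as a simple store-and-forward queue. This is an illustrative sketch only: the `OfflineAlertQueue` class and the `send` callback are hypothetical names, not part of AROW's actual implementation.

```python
import json
import time
from collections import deque

class OfflineAlertQueue:
    """Store-and-forward queue: alerts persist locally until a
    network link is available, then flush in creation order."""

    def __init__(self, send):
        self.send = send        # callback that transmits one alert; returns True on success
        self.pending = deque()  # locally stored alerts awaiting connectivity

    def enqueue(self, alert_type, location):
        # Record the alert immediately, even with no signal.
        self.pending.append({
            "type": alert_type,
            "location": location,
            "created_at": time.time(),
        })

    def flush(self):
        """Called when connectivity returns; relays queued alerts.
        Returns how many alerts were successfully sent."""
        sent = 0
        while self.pending:
            alert = self.pending[0]
            if not self.send(json.dumps(alert)):
                break  # link dropped again; keep the rest queued
            self.pending.popleft()
            sent += 1
        return sent
```

On a real device the queue would live in persistent storage and `flush` would be driven by the OS connectivity callbacks rather than called manually.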
Experience Architecture & System Design
Designing AROW’s experience architecture meant translating real-world danger into a system of clarity, automation, and trust. The core challenge wasn’t visual; it was structural: how to create a product that thinks for the user when they can’t.
AROW’s ecosystem
AROW’s ecosystem is built on three layers of response, allowing the system to function intuitively across edge cases, whether the user can’t speak, move, or even unlock their device. By removing unnecessary decision points, AROW transforms complex emergency protocols into an adaptive, intelligent flow that feels instant and human.

User Layer
Immediate Action
A single-tap or silent trigger activates the system without cognitive effort. This layer is optimized for speed and minimal visibility, guiding users through one clear action instead of multiple decisions.

Intelligence Layer
Context Awareness
Once triggered, AI interprets sensor data, movement, sound, and location to assess risk. It differentiates between accidental inputs and real distress, and routes the alert to the right channel: trusted contacts, nearby responders, or local authorities.

Assurance Layer
Feedback & Trust
Every response loop provides confirmation through discreet cues such as vibration, a color pulse, or a visual signal, reassuring the user that help is on the way. After the event, AROW delivers transparent logs showing what was shared and with whom.
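The three layers read as one pipeline: trigger, assess risk, route the alert, confirm back to the user. A hedged sketch of that flow, where the function names, sensor keys, score weights, and thresholds are all illustrative assumptions, not the product's real model:

```python
def assess_risk(sensors):
    """Intelligence layer: turn raw context into a rough risk score.
    The weights here are placeholders for a real model."""
    score = 0.0
    if sensors.get("sudden_impact"):
        score += 0.5
    if sensors.get("no_movement_seconds", 0) > 60:
        score += 0.3
    if sensors.get("loud_noise"):
        score += 0.2
    return score

def route_alert(score):
    """Route to the right channel based on assessed risk."""
    if score >= 0.7:
        return "local_authorities"
    if score >= 0.4:
        return "nearby_responders"
    if score > 0.0:
        return "trusted_contacts"
    return "none"  # likely an accidental trigger

def respond(sensors):
    """Full loop: assess, route, and pick an assurance cue for the user."""
    channel = route_alert(assess_risk(sensors))
    feedback = "haptic_pulse" if channel != "none" else "silent_dismiss"
    return channel, feedback
```

The point of the sketch is the shape, not the numbers: every input path ends in both a routing decision and a feedback cue, so the user is never left without the assurance layer.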
The Trigger
A crucial design question emerged while shaping AROW’s system architecture—what actually triggers it, and how? In an emergency, users can’t rely on complex gestures or visual interfaces. The trigger had to be instinctive, tactile, and accessible in any condition. We explored physical gestures and hardware interactions that could activate the system without visual focus—such as a long hold on the side or camera button, or a subtle shake motion when the phone is in a pocket. Each option was tested for speed, discretion, and reliability across iOS and Android devices, ensuring that the action feels natural under pressure and technologically feasible within real hardware constraints.
Refining the Gesture Logic
After exploring multiple activation gestures, we focused on two that balanced speed, safety, and technical feasibility: a long-press of the side or camera button, and a triple-shake motion for in-pocket use. These gestures leveraged existing smartphone hardware while remaining subtle and accessible with one hand. The challenge was to design haptic feedback and timing that felt deliberate—not accidental—under stress. Each millisecond of vibration and delay was tuned to help users feel confident that AROW understood their intent.
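The two activation gestures can be modeled as small state machines over time-stamped input events. A simplified sketch, using the 3-second hold explored in testing; the class names and the 1.5-second triple-shake window are hypothetical:

```python
class LongPressTrigger:
    """Fires when the side/camera button is held for the full hold duration."""

    def __init__(self, hold_seconds=3.0):
        self.hold_seconds = hold_seconds
        self.pressed_at = None

    def press(self, t):
        self.pressed_at = t

    def release(self, t):
        # True means: activate AROW; short presses are ignored.
        held = self.pressed_at is not None and (t - self.pressed_at) >= self.hold_seconds
        self.pressed_at = None
        return held

class TripleShakeTrigger:
    """Fires when three shake events land inside a short window,
    which filters out single accidental jolts."""

    def __init__(self, window_seconds=1.5, shakes_needed=3):
        self.window = window_seconds
        self.needed = shakes_needed
        self.timestamps = []

    def shake(self, t):
        # Keep only shakes still inside the sliding window.
        self.timestamps = [s for s in self.timestamps if t - s <= self.window]
        self.timestamps.append(t)
        return len(self.timestamps) >= self.needed
```

In practice these thresholds are exactly what the testing sessions tuned: long enough to prevent false triggers, short enough to feel instant under stress.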

User Testing & Timing Validation
To validate the system, we ran a series of gesture and timing tests with participants simulating high-stress conditions—walking at night, carrying items, or keeping the phone in a pocket. We measured success rate, reaction time, and perceived confidence in both gestures. Photos and prototypes captured each iteration, from early timer calibrations to haptic feedback mapping. These sessions helped fine-tune the hold duration and feedback rhythm, striking the right balance between responsiveness and false-trigger prevention.

Long-Press Testing
Testing the 3-second long-press trigger for speed and comfort under one-handed use. The goal was to make activation feel deliberate, not accidental.

Shake Motion Prototype
Simulating the triple-shake gesture while the phone is in-pocket. We measured consistency and false-trigger prevention across multiple users.

Haptic Feedback Calibration
Mapping vibration timing and intensity to confirm that users can recognize activation without looking at the screen.
Storyboard

Service blueprint
Once triggered, AROW activates a chain of human and system interactions built for speed and trust. Sensors capture motion, sound, and location data; AI interprets the context; and encrypted alerts reach trusted contacts or responders, all within seconds. The blueprint shows how these layers work together using today’s smartphone technology, from background sensors and machine learning to secure APIs and haptic feedback systems, all designed within existing mobile capabilities.
Wait!
During development, one question reshaped the system: What if AI gets it wrong? A false alarm could waste critical resources, or worse, put someone in danger. That realization led to restoring human control as an essential layer of safety. AROW’s AI now acts first to detect, but confirms through human context: trusted contacts, user gestures, or manual override. It’s a balance between automation and empathy, ensuring every response is both intelligent and accountable.
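That human-in-the-loop safeguard can be expressed as a confirmation step: the AI detects, then a human signal gets the final say before anything escalates. A hedged sketch, where the function name, response values, and confidence threshold are illustrative assumptions:

```python
def resolve_alert(ai_confidence, user_response, threshold=0.9):
    """Decide the outcome of an AI-detected incident.

    user_response is one of: "confirm", "cancel", or None
    (no response received inside the confirmation window).
    """
    if user_response == "cancel":
        return "dismissed"   # manual override always wins
    if user_response == "confirm":
        return "escalate"
    # No human input: escalate only when the AI is highly confident;
    # otherwise ask trusted contacts to verify before involving authorities.
    if ai_confidence >= threshold:
        return "escalate"
    return "verify_with_contacts"
```

The design choice is that a cancel beats any confidence score, and an uncertain AI defers to people rather than dispatching help on its own.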
Designing for Panic, Not Perfection
AROW’s interface was designed for moments of fear and urgency. Every interaction had to work under pressure—when users can’t think, see clearly, or speak. The flow was reduced to a single action, with large targets, silent gestures, and instant haptic feedback confirming safety.
Testing Human Instinct
Prototype tests simulated real stress (dim light, trembling hands, limited mobility) to study instinctive behavior, all against a timer, because timing is everything. Results showed users reacted faster to tactile cues than visual ones, leading to refinements that made the system feel natural and reliable even in chaos.
Trust and Transparency
Because AROW operates in moments of fear, every automated action had to feel accountable. Users can see what data is shared, with whom, and when. Clear feedback loops and privacy logs were built to ensure that trust isn’t just implied—it’s designed into the system.
Visual Design — Clarity Under Pressure
The visual system was intentionally minimal—built to communicate faster than words. Large, high-contrast shapes guide focus, while color and motion signal urgency or safety. Typography is bold and legible under low light, and animations are restrained, designed only to orient or reassure. Every element serves one purpose: to make the interface readable, instinctive, and calm when everything else isn’t.
Conclusion
AROW pushed me to design beyond interfaces, to think about safety, trust, and accountability as user experiences. It taught me that in moments of fear, good design isn’t decorative, it’s decisive. Every choice, from AI logic to color contrast, became a study in empathy and restraint.
This project deepened my belief that technology should extend human capability, not replace it. By blending automation with human judgment, AROW shows how UX can turn complexity into confidence, and systems into lifelines.