ACUITY · AI Startup Design Hackathon · 2026
Clarity before crisis — turning biometric drift into the next right action
Maya wakes up with chest discomfort. Her smartwatch shows an elevated heart rate, but the numbers are still technically within a “normal” range.
She hesitates — is this something serious, or just stress?
Modern wearables detect physiological signals but rarely translate them into clear medical action.
ACUITY bridges that gap. It detects sustained deviation from a user’s baseline and escalates guidance with proportionate urgency:
nudge → schedule → urgent care.
Product
AI-powered preventive health monitoring ecosystem (bracelet + mobile app + provider pathway)
Primary user
High-risk cardiovascular patients during recovery and monitoring periods
Core innovation
An intervention engine translating biometric drift into clear medical next actions
Business model
Distributed through insurers and healthcare systems for preventive monitoring
What I’m optimizing for
Hiring signals for innovation roles: judgment under time pressure, systems thinking, AI orchestration, and end‑to‑end craft. This case study is organized around the decisions that shaped ACUITY — not a checklist of process steps.
Overview
The Data‑Action Gap
In our research sprint, the pattern was consistent: people experience symptoms, look at their wearable, and still feel stuck. They have numbers — not meaning. ACUITY is designed as an interpretation layer that bridges raw signals to a confident next step, without alarming users or over‑triaging care.
System at a glance
Inputs
Multi‑modal vitals (HRV, SpO2, sleep, ECG, blood pressure where available) + patient‑reported symptoms.
Brain
Deviation detector compares real‑time patterns against a personalized baseline + clinical risk priors.
Outputs
Tiered guidance: subtle haptic cue → “Schedule now” pathway → urgent escalation when warranted.
Deployment
Distributed through insurers / health systems to reach high‑risk members and align incentives.
Who
High‑risk cardiovascular patients, with specific attention to groups that are frequently misdiagnosed or dismissed.
What we shipped
Hardware concept + end‑to‑end mobile experience + service pathway (provider + insurer integration).
Constraint
24 hours — forced clarity on the few system behaviors that create trust (thresholds, escalation, language).
Solution
A story‑first walkthrough (lead with motion)
This film is the fastest way to understand the end‑to‑end ecosystem: the bracelet, the tiered alert logic, and the “Schedule now” handoff. It sits above the fold so reviewers get the concept before deciding whether to scroll.
Core promise
ACUITY turns sustained baseline drift into a clear action. Not more data. A decision you can trust.
What’s novel
Designing the confidence thresholds and escalation behaviors — the system’s “when do we speak?” logic — as a first‑class UX surface.
AI workflow
AI as a force multiplier (directing > executing)
In 24 hours, the bottleneck is not ideation — it’s synthesis and throughput. I treated AI tools as an orchestration layer: Claude for reasoning and decision framing, Midjourney/Sora for believable media, and Figma Make for rapid UI construction. My job was to set constraints, define evaluation criteria, and keep the system coherent.
Brainstorming · Claude
Problem framing, tradeoff exploration, tier logic, and copy that avoids medical over‑claiming.
Execution · Figma Make
Accelerated layout scaffolds and component variants so we could spend time on the system behaviors.
Media · Sora + Midjourney
Story film + hardware visuals that made the concept feel “real enough” to evaluate.
Key takeaway
AI didn’t replace design judgment — it surfaced more options faster. The differentiator was choosing which system behaviors mattered and making them legible.
Key decisions
The decisions that made ACUITY credible
Decision 1 — Interpretation over more data
We chose to design a translation layer (meaning + next step) instead of a richer vitals dashboard.
Why: In early‑symptom moments, “normal range” numbers can still hide dangerous trend shifts. Users needed contextual guidance, not another chart.
Evidence: Research repeatedly described uncertainty: symptoms without clarity, or metrics that didn’t map to action.
Design consequence: Copy + thresholds became core UI. We prioritized “what this means” and “what to do now.”
Decision 2 — Deploy via insurers, not direct‑to‑consumer
We designed ACUITY as an infrastructure product: high‑risk enrollment + provider pathways + aligned incentives.
Why: The people who most need preventive intervention are least likely to self‑purchase a new wearable. Insurer deployment reaches the right users sooner.
Evidence: Operationally, prevention works when triage and follow‑through are baked into care pathways — not left to the user.
Design consequence: We added a clinician‑ready summary and a “Schedule now” CTA as the primary action in the alert card.
System logic
Deviation detector + logic gate
ACUITY’s intelligence isn’t “AI magic.” It’s a legible system that weighs drift magnitude, duration, and symptom context. The UX is designed around trust: reduce false alarms, explain why the system is escalating, and always offer a safe next step.
Deviation detector
Continuously compares current signals to a personalized baseline; flags sustained abnormal patterns (not one‑off spikes).
Logic gate
Routes the user to the right intervention tier by combining confidence thresholds + risk priors + user confirmation.
Designing for false positives
We treated “when do we interrupt?” as a design problem: ACUITY escalates only after sustained drift, uses cautious language, and makes “schedule evaluation” the default path before urgent action.
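The “sustained drift, not one‑off spikes” rule above can be sketched as a rolling‑window check against a personalized baseline. This is an illustrative sketch, not the production logic — the baseline statistics, window size, and thresholds are hypothetical placeholders that clinicians would tune:

```python
from collections import deque

class DeviationDetector:
    """Flags sustained deviation from a personal baseline, ignoring one-off spikes.

    baseline_mean / baseline_std would come from weeks of the user's own data;
    the window and threshold values here are illustrative only.
    """

    def __init__(self, baseline_mean, baseline_std,
                 window=12, z_threshold=2.0, min_abnormal=9):
        self.mean = baseline_mean
        self.std = baseline_std
        self.window = deque(maxlen=window)  # e.g. 12 five-minute readings = 1 hour
        self.z_threshold = z_threshold      # how far from baseline counts as abnormal
        self.min_abnormal = min_abnormal    # how many abnormal readings = "sustained"

    def update(self, reading):
        z = abs(reading - self.mean) / self.std
        self.window.append(z > self.z_threshold)
        # A single spike never triggers; only a mostly-abnormal full window does.
        return (len(self.window) == self.window.maxlen
                and sum(self.window) >= self.min_abnormal)

detector = DeviationDetector(baseline_mean=62, baseline_std=4)
for hr in [64, 61, 88, 63, 62, 65, 60, 64, 63, 62, 61, 64]:
    sustained = detector.update(hr)
print(sustained)  # False: one spike (88 bpm) is not sustained drift
```

The design choice the sketch encodes is the one described above: interruption requires persistence, so a single anomalous reading can never wake the user.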
AI decision engine
Turning biometric signals into interventions
ACUITY’s intelligence lies in the intervention engine — the system that decides when to act. Instead of reacting to individual spikes, the model evaluates multiple signals together:
- Drift magnitude — how far the signal deviates from baseline
- Drift duration — whether abnormal readings persist
- Context signals — symptoms, sleep, recent activity
| Tier | System interpretation | Product response |
|---|---|---|
| Tier 1 | Mild deviation | Passive monitoring + gentle nudge |
| Tier 2 | Sustained abnormal trend | Recommend scheduling evaluation |
| Tier 3 | High-confidence risk | Urgent care escalation |
Designing these thresholds was the core product challenge: intervene too often and users lose trust; intervene too late and critical events are missed.
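The tier table above can be read as a small routing function that combines drift magnitude, duration, and symptom context. A minimal sketch — every cutoff here is a hypothetical stand-in for thresholds that would be validated with clinicians:

```python
def route_tier(drift_z, sustained_minutes, symptoms_reported):
    """Map drift signals to an intervention tier (1 = nudge, 2 = schedule, 3 = urgent).

    Illustrative cutoffs only; not clinically validated values.
    """
    # Tier 3: large, persistent deviation plus corroborating symptoms.
    if drift_z >= 3.0 and sustained_minutes >= 30 and symptoms_reported:
        return 3
    # Tier 2: a sustained abnormal trend, even without reported symptoms.
    if drift_z >= 2.0 and sustained_minutes >= 60:
        return 2
    # Tier 1: mild deviation -> passive monitoring + gentle nudge.
    if drift_z >= 1.5:
        return 1
    return 0  # within personal baseline: stay silent

RESPONSES = {
    0: "No action",
    1: "Passive monitoring + gentle nudge",
    2: "Recommend scheduling evaluation",
    3: "Urgent care escalation",
}

print(RESPONSES[route_tier(drift_z=2.4, sustained_minutes=90, symptoms_reported=False)])
# -> Recommend scheduling evaluation
```

Note how the structure mirrors the trust tradeoff in the text: Tier 3 demands the most corroboration, and the default path below it is “schedule evaluation,” not an alarm.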
Core flow examples
1) Tiered alert that explains the “why”
The bracelet vibration is intentionally subtle. The primary UI work happens in the card: risk tier, plain‑language explanation, and a single primary CTA (“Schedule an appointment soon”).
2) Low‑friction handoff to care
The product is only as good as follow‑through. The scheduling flow reduces steps, pre‑fills context, and confirms what happens next.
3) Clinician‑ready summary
A concise, shareable report translates drift into medically relevant context so providers can triage quickly — without the patient needing to interpret charts.
Note: keep screen recordings framed in a consistent device mock (same scale, same background, no heavy shadows). If you have many screens, group by outcome (alert → action), not by navigation.
Impact
User outcomes + system value
ACUITY focuses on reducing the delay between symptom onset and medical evaluation. Even small behavioral shifts could significantly reduce emergency admissions.
If deployed to 100,000 high‑risk cardiovascular patients, modeled outcomes could include:
- 30–40% reduction in time between symptom onset and seeking care
- 1–2% reduction in emergency cardiovascular admissions
- $8–12M potential annual cost reduction for insurers
User impact
Shorter time‑to‑action: reduces the gap between symptom and seeking care.
Lower uncertainty: replaces raw metrics with meaning + a next step.
Less alarm fatigue: escalates only after sustained drift; cautious language; transparent rationale.
Business / system value
Fewer avoidable admissions: even small reductions can meaningfully lower emergency costs.
Aligned incentives: insurer deployment encourages early evaluation and closes the follow‑through loop.
Scalable care capacity: reduces unnecessary ER burden by routing “elevated” to clinics first.
Next steps (with more time)
Validate tier thresholds with clinicians, prototype “false positive” handling (user confirmation + clinician review), and pilot insurer enrollment workflows. Expand the hardware story with form‑factor iterations and sensor placement constraints.
Technical appendix
Prompt logs + orchestration notes
I include prompt artifacts selectively to show how I direct AI systems — not to show “I used AI.” Keep these collapsed by default so the main narrative stays clean.
Claude · decision framing prompts
Include 3–5 prompts that demonstrate systems thinking and constraint setting. Example categories:
- Tier escalation rubric (inputs, thresholds, language constraints)
- False positive / false negative tradeoffs and mitigations
- Deployment model comparison (DTC vs insurer)
[PROMPT]
Define a 3-tier escalation system for a wearable that detects cardiovascular risk.
Constraints: minimize alarm fatigue; avoid medical over-claiming; require sustained deviation; include a safe, actionable next step.
Output: tier names, trigger conditions, on-device cue, in-app card copy, primary CTA.
[OUTPUT EXCERPT]
…
Sora / Midjourney · media direction
Show how you locked variables (actor consistency, bracelet geometry, lighting, shot constraints) for physics accuracy and reuse.
[PROMPT]
Continue the video using the SAME woman from the previous clip.
Bracelet locked on LEFT wrist in every scene.
Cinematic realism, medium-wide framing, soft natural light.
…
Figma Make · component scaffolding
Include a short log of what you generated vs. what you intentionally edited by hand (copy, hierarchy, states, CTA priority).
[LOG]
Generated: baseline dashboard scaffolds, card variants, spacing system.
Refined manually: alert copy, tier semantics, CTA hierarchy, edge-case states.