Mobile Application Design
The Eyedar application is a LiDAR-driven echolocation tool that helps blind and visually impaired users navigate their environment through sound.
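To make the echolocation idea concrete, here is a minimal, illustrative sketch (not Eyedar's actual implementation) of the core mapping such a tool needs: translating a LiDAR depth reading into audio parameters, so nearer obstacles sound higher pitched and louder. The function name, ranges, and scaling are all assumptions for illustration.

```javascript
// Illustrative sketch, not Eyedar's real pipeline: map a LiDAR depth
// reading (in meters) to tone parameters for audio feedback.
function depthToTone(distanceMeters, maxRange = 5) {
  // Clamp the reading to the sensor's assumed usable range.
  const clamped = Math.min(Math.max(distanceMeters, 0), maxRange);
  // proximity: 1 = obstacle very close, 0 = at or beyond max range.
  const proximity = 1 - clamped / maxRange;
  return {
    frequencyHz: 220 + proximity * 660, // 220 Hz far, up to 880 Hz close
    gain: 0.1 + proximity * 0.9,        // quiet when far, loud when close
  };
}

console.log(depthToTone(0.5)); // close obstacle: high pitch, loud
console.log(depthToTone(5));   // at max range: low pitch, quiet
```

In a real app these values would feed something like a Web Audio oscillator or a spatial-audio engine; the point of the sketch is only the distance-to-sound mapping.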
User Experience Architect with 10+ years designing products that deliver measurable business outcomes while solving complex user problems at scale. I bridge design and engineering by translating strategy into systems, components, and shipped features across the full product lifecycle.
Domain expertise in high-stakes environments: I've designed healthcare/pharmaceutical products navigating FDA regulations, HIPAA compliance, and accessibility standards (WCAG 2.1 AA). My work includes patient-facing mobile applications, clinician workflow tools, and AR-enabled diagnostic systems used by thousands of healthcare professionals.
Technical fluency beyond mockups: I understand how systems work, including API integrations, design token architectures, and component libraries. I write production HTML/CSS/JavaScript, build Figma plugins, and automate design-to-development workflows. I don't just design interfaces; I implement design systems that scale.
Proven impact: Award-winning work (Cannes Lion, 4 Clio Gold) with outcomes including increased patient adherence, reduced clinician cognitive load, and faster product adoption. I measure success in user retention, task completion rates, and reduced support tickets, not just aesthetics.
Currently seeking: Product Designer or UX Engineer roles (L5/L6) at tech or healthtech companies where systems thinking, technical implementation skills, and regulated domain expertise create competitive advantage.
Optimized the information architecture and redesigned self-injection instructions for a multi-indication pharmaceutical website; the IFU (Instructions for Use) pattern was adopted into the client's global component library.
Engineered an automated workflow that syncs design tokens from Figma to code via the Model Context Protocol, keeping design and development in alignment.
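As a rough sketch of what a token-sync step looks like downstream: once tokens are exported from Figma as JSON, they can be flattened into CSS custom properties. The token shape, names, and output format below are illustrative assumptions, not the actual pipeline described above.

```javascript
// Minimal sketch (assumed token shape): flatten a nested design-token
// JSON export into CSS custom property declarations.
function tokensToCss(tokens, prefix = "") {
  return Object.entries(tokens).flatMap(([name, node]) => {
    const key = prefix ? `${prefix}-${name}` : name;
    // Leaf tokens carry a `value`; everything else is a nested group.
    if (node && typeof node === "object" && "value" in node) {
      return [`  --${key}: ${node.value};`];
    }
    return tokensToCss(node, key);
  });
}

// Example export, shaped like a typical Figma token plugin's JSON.
const tokens = {
  color: {
    brand: { value: "#0a6cff" },
    text: { primary: { value: "#111111" } },
  },
  spacing: { sm: { value: "4px" } },
};

console.log(`:root {\n${tokensToCss(tokens).join("\n")}\n}`);
```

Generating the stylesheet from the same source of truth the designers edit is what keeps the two sides aligned; the MCP piece would automate triggering this transform.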
An augmented reality tool for the Veeva CLM platform that transformed standard KOL video content into an immersive, interactive experience for pharmaceutical sales representatives.
I start every engagement by understanding the business case. Before beginning sketches or concept development, I assess what exists: functionality gaps, usability issues, technical constraints. This groundwork builds credibility with stakeholders and prevents wasted cycles.
I build systems, not just screens: design tokens, interaction specifications, and research frameworks, artifacts that scale beyond a single project and keep teams aligned as products evolve.
When it comes to product design and feasibility, I believe in leading by example. Pitching, doing, creating and iterating. When I advocate for a direction, I ensure that stakeholders are informed and I show the work. This approach invites collaboration rather than debate, and it moves teams from ambiguity to execution faster.