Interaction Design: Eyedar mobile application
LiDAR-driven echolocation mobile application for visually impaired and blind users to navigate their environment through sound.
I'm a User Experience Architect with 10+ years designing products that balance user needs, technical constraints, and business impact. My work spans the full product development lifecycle—from research and strategy through interaction design and post-launch optimization.
I specialize in complex, regulated environments: healthcare/pharmaceutical products, accessibility-first mobile applications, API integrations, and emerging technology (XR/AI). I've shipped award-winning products (Cannes Lion, 4 Clio Golds) used by thousands of patients and healthcare professionals.
My approach combines deep user research with technical fluency. I understand how systems work, not just how they look. I build design systems, automate workflows, and bridge the gap between design intent and engineering implementation.
I'm currently seeking Product Design or UX Engineering roles in tech/healthtech where I can leverage my domain expertise and systems thinking at scale.
Design & Systems: Figma, design tokens, component architecture, design system governance
Development: HTML, CSS, JavaScript, Figma API, Model Context Protocol
Specialized: Front-end development, XR development, accessibility, LiDAR integration, AI model integration
Methods: User research, usability testing, information architecture, interaction design, prototyping
📍 Based in New York, open to remote and hybrid roles
💼 Currently seeking product design opportunities in tech/health tech
📧 Available for interviews immediately
LiDAR-driven echolocation mobile application for visually impaired and blind users to navigate their environment through sound.
Optimized information architecture and redesigned self-injection instructions for a multi-indication pharmaceutical website. The redesigned IFU (Instructions for Use) pattern was adopted into the client's global component library.
Automated workflow that syncs design tokens from Figma to code using the Model Context Protocol, keeping design and development in alignment.
An augmented reality sales tool for iPad that transformed standard KOL video content into an immersive, interactive experience for pharmaceutical sales representatives.
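The token-sync project above hinges on one transformation: design tokens pulled from Figma become variables the codebase can consume. A minimal sketch of that step, assuming a simplified flat token map (the real Figma Variables API response is richer) and a hypothetical helper name `tokensToCss`:

```javascript
// Convert a flat map of design tokens (as might be pulled from the
// Figma Variables REST API) into CSS custom properties.
// Token names follow Figma's "group/name" convention.
function tokensToCss(tokens, selector = ":root") {
  const lines = Object.entries(tokens).map(([name, value]) => {
    // "color/brand/primary" -> "--color-brand-primary"
    const cssName = "--" + name.replace(/\//g, "-");
    return `  ${cssName}: ${value};`;
  });
  return `${selector} {\n${lines.join("\n")}\n}`;
}

// Example: two tokens synced from design to code.
const stylesheet = tokensToCss({
  "color/brand/primary": "#0d6efd",
  "space/sm": "8px",
});
console.log(stylesheet);
```

In a real pipeline this function would run against the live Figma file on every design change, so engineers always build against the current token values rather than a stale copy.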
I start every project by understanding the business case. Before sketching screens, I assess what exists: functionality gaps, usability issues, technical constraints. This groundwork builds credibility with stakeholders and prevents wasted cycles.
I build systems, not just screens. Design tokens, interaction specifications, research frameworks—artifacts that scale beyond a single project and keep teams aligned as products evolve.
I lead through demonstration: pitching, doing, creating. When I advocate for a direction, I show the work. This approach invites collaboration rather than debate, and it moves teams from ambiguity to execution faster.
