Mobile Application Design
The Eyedar application is a LiDAR-driven echolocation tool for visually impaired and blind users to navigate their environment through sound.
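The core interaction, translating LiDAR depth readings into audible feedback, can be sketched as a distance-to-pitch mapping. This is an illustrative sketch only; the function name, sensing range, and frequency bounds are assumptions, not the shipped app's tuning:

```javascript
// Map a LiDAR distance reading (meters) to an audio frequency (Hz).
// Closer obstacles produce higher pitches, on an exponential scale so
// equal steps in distance are heard as equal steps in pitch.
function distanceToPitch(meters, { min = 0.3, max = 5, lowHz = 220, highHz = 1760 } = {}) {
  // Clamp to the usable sensing range before mapping.
  const clamped = Math.min(Math.max(meters, min), max);
  const t = (max - clamped) / (max - min); // 1 = nearest, 0 = farthest
  return lowHz * Math.pow(highHz / lowHz, t);
}
```

In a browser prototype, the returned frequency could drive a Web Audio `OscillatorNode`; a native build would feed an equivalent audio engine.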
I'm a User Experience Architect with 10+ years of experience designing products that balance user needs, technical constraints, and business impact. My work spans the full product development lifecycle, from research and strategy through interaction design and post-launch optimization.
I specialize in complex, regulated environments: healthcare/pharmaceutical products, accessibility-first mobile applications, API integrations, and emerging technology (XR/AI). I've shipped award-winning products (Cannes Lion, 4 Clio Golds) used by thousands of patients and healthcare professionals.
My approach combines deep user research with technical fluency. I understand how systems work, not just how they look. I build design systems, automate workflows, and bridge the gap between design intent and engineering implementation.
I'm currently seeking Product Design or UX Engineering roles in tech/healthtech where I can leverage my domain expertise and systems thinking at scale.
Design & Systems: Figma, design tokens, component architecture, design system governance
Development: HTML, CSS, JavaScript, Figma API, Model Context Protocol
Specialized: Front-end development, XR development, accessibility, LiDAR integration, AI model integration
Methods: User research, usability testing, information architecture, interaction design, prototyping
📍 Based in New York, open to remote and hybrid roles
💼 Currently seeking product design opportunities in tech/health tech
📧 Available for interviews immediately
Optimized the information architecture and redesigned self-injection instructions for a multi-indication pharmaceutical website. The IFU (Instructions for Use) pattern was adopted into the client's global component library.
Engineered an automated workflow that syncs design tokens from Figma to code via the Model Context Protocol, keeping design and development in alignment.
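At its core, a sync like this transforms token data pulled from Figma into platform variables. A minimal sketch of that transform step, assuming a flat name-to-value token shape (a real pipeline would fetch variables over the Figma REST API or an MCP server and resolve modes and aliases first):

```javascript
// Turn a flat map of Figma token names into CSS custom properties.
// Assumes tokens have already been fetched and resolved to raw values.
function tokensToCss(tokens) {
  const lines = Object.entries(tokens).map(
    // Normalize "Color/Primary" style names into kebab-case variable names.
    ([name, value]) => `  --${name.replace(/[\/\s]+/g, "-").toLowerCase()}: ${value};`
  );
  return `:root {\n${lines.join("\n")}\n}`;
}

// Example: tokensToCss({ "Color/Primary": "#0D6EFD" })
// → ":root {\n  --color-primary: #0D6EFD;\n}"
```

Writing the output to a versioned CSS file on each sync keeps the token source of truth in Figma while giving engineering a reviewable artifact.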
An augmented reality tool for the Veeva CLM platform that transformed standard KOL video content into an immersive, interactive experience for pharmaceutical sales representatives.
I start every engagement by understanding the business case. Before beginning sketches or concept development, I assess what exists: functionality gaps, usability issues, technical constraints. This groundwork builds credibility with stakeholders and prevents wasted cycles.
I build systems, not just screens: design tokens, interaction specifications, and research frameworks that scale beyond a single project and keep teams aligned as products evolve.
When it comes to product design and feasibility, I lead by example: pitching, building, and iterating. When I advocate for a direction, I keep stakeholders informed and show the work. This approach invites collaboration rather than debate, and it moves teams from ambiguity to execution faster.