Google Smart Displays
Establishing Design Systems Across Multiple Teams
Smart displays evolved from smart speakers by adding a visual layer to voice-based AI interactions. Leading UX for Google's smart display portfolio meant creating unified experiences across Assistant, YouTube, Photos, Smart Home, and hardware teams, each with its own design language and user expectations.
Cross-Functional Design Framework
I established design principles and collaborative processes that enabled five different Google teams to work cohesively. This included developing experience patterns for complex features such as touchless personalization, air gesture controls, and hands-free video calling, capabilities Nest had never built before.
User Research & Privacy Innovation
Led extensive user research to understand how families interact with always-on displays in intimate home spaces. This research informed our approach to privacy controls, interface feedback, and hardware design decisions, ensuring user trust while enabling powerful AI features.
“The Google Nest Hub Max sets a new benchmark for smart displays.”
— Digital Trends
My role:
Lead end-to-end UX for Nest Hub, Nest Hub Max, and Sleep Sensing, including hardware and software experiences
Coordinate Interaction Designers, Visual Designers, Motion Designers, UX Researchers, and UX Engineers
Partner with the Product Management Lead to define the vision, strategy, and roadmap for the Displays portfolio
Establish design systems unifying experiences across five Google teams
Define privacy-first design patterns for AI-powered home devices
Develop advanced interaction models including voice, touch, and gesture controls