I design AI features that people don't just use — they trust. From the first generative photo editor on any phone to experiences touching two billion interactions.
I think about how things look and how they feel to use — and I don't think those are separate questions.
I'm a Lead Product Designer at Samsung R&D Institute India, where I've spent 3+ years making AI feel natural on a phone. My background is in Fine Arts (BFA, Delhi University) and Interaction Design (M.Des, IIITDM Jabalpur) — which gives me an unusual lens: I care as much about the aesthetic quality of a moment as about its usability.
The work I'm most proud of sits at a hard intersection — where cutting-edge AI capability meets the everyday person who just wants a beautiful photo. I've contributed to Photo Assist (the first generative photo editing on any smartphone), Instruction-Based Editing, AI Portrait styling, and a draw-to-generate camera experience. I was promoted to Lead in 2024 and was one of a select few Bengaluru designers chosen for an on-site residency at Samsung UX HQ in Seoul.
Before Samsung, I designed B2B enterprise products at Infosys for BHP and Deutsche Bank — where I learned how design thinking can unlock commercial value, not just better UX. Before that, an internship at Cafe Coffee Day taught me brand and packaging design, and a film internship at Kamalan taught me visual storytelling from the ground up.
From Galaxy's first generative photo editor to natural-language editing on the S26 — a multi-year evolution of how people interact with AI in their most personal space: their photos. Portrait Studio, AI Erase, Outpaint, Sketch-to-Image, and Instruction-Based Editing — each building on the last.
Generative AI photo editing didn't exist on any smartphone. The challenge wasn't just building a feature — it was defining what this category should feel like on a phone. How do you make a complex AI capability feel effortless in a tool people use every day?
Part of the core team that shipped Photo Assist. My focus was on usability research, prompt structuring, and iterating on the interaction model to ensure AI outputs felt controllable and trustworthy. Worked directly with the Seoul tech team to translate model constraints into design decisions.
Also developed a cross-cultural design framework for AI face features — addressing global facial markers and cultural sensitivity — adopted by Samsung Seoul as standard protocol for all AI face-related features.
2024 — Photo Assist launches on Galaxy Z Flip6 & Fold7 with Portrait Studio, AI Erase, Outpaint, and Sketch-to-Image. The first commercial generative photo-editing experience on any smartphone.
2025 — Instruction-Based Editing ships on Galaxy S26. Users describe edits in natural language — "add water and fish in the bubbles" — and the AI generates it. A leap from button-driven to language-driven interaction.
2026 and beyond — The remaining features on the editing roadmap extend to Fold8, carrying forward a product shaped from the ground up across multiple release cycles.
The hardest design decision was what to leave out. A generative model can do a hundred things — the job was to pick the ten that would feel magical and hide the ninety that would feel broken.
First of its kind — launched as the first commercial generative photo-editing experience on any smartphone.
Employee of the Year — awarded Samsung SRIB's highest individual recognition.
Established the playbook — interaction patterns from Photo Assist became the foundation for every subsequent AI editing feature at Samsung.
A journey from the first mobile feature to retain facial identity during style transfer, to pet portraits, to full-photo stylization. When AI became powerful enough to generate any style, the real challenge became choosing which styles work universally.
Style transfer that doesn't destroy what makes a face recognisable. Then: extending that same precision to pets. Then: applying it to entire photos. Each step raised the bar — and when AI became powerful enough to generate any style, the real design challenge became curation, not capability.
With 50+ styles technically possible, the question became: which ones resonate universally? We ran extensive secondary research and fielded global surveys to understand cultural preferences, aesthetic expectations, and what "beautiful" means across different markets.
The research narrowed 50+ candidate styles down to 10 that shipped — styles that worked across demographics, skin tones, and cultural contexts. This wasn't a technical filtering exercise; it was a design decision about taste, sensitivity, and inclusivity at scale.
When AI can generate anything, the designer's job shifts from "what can we build?" to "what should we ship?" That distinction shaped every decision in this project.
Lead Product Designer — designed the prompt systems for People Portrait, scaling across visual styles while preserving facial identity. Extended the system to Pet Portraits and full-photo stylization. Selected as one of a small number of Bengaluru designers for an on-site residency at Samsung UX HQ in Seoul to co-develop the feature with the research team.
People AI Portrait — First mobile feature to retain facial identity during style transfer. Comic, 3D cartoon, watercolor, sketch — each preserving the person.
Pet AI Portrait — Extended the system to pets. Different challenge: animal features require different landmark detection and style mapping than human faces.
Entire Photo Stylization — Full-frame style transfer. Oil painting, studio, fisheye lens, and more — applied to the entire photo, not just the subject.
A partnership feature embedded into Galaxy A-series cameras. Designed the integration UX and lens qualification guidelines. The feature crossed 2 billion interactions.
Embedding a third-party experience (Snapchat Lenses) inside Samsung's native camera without it feeling foreign. The A-series audience skews younger — the experience had to feel native to Samsung while leveraging Snapchat's creative engine.
Designed the integration UX and established safe-zone guidelines for lens qualification. Proposed two features — Favoriting Lenses and Personalised Lenses — both adopted into the next release cycle.
Surpassed 2 billion cumulative interactions. The two proposed features shipped in the subsequent release, proving the design thinking extended beyond the initial launch.
Five generations of Samsung Camera UI — owning visual specs, GUI/UI guides, and production handoffs with offshore engineering in China. Plus the Expert Raw Sky Guide: a visual language for planets, stars and constellations recognised across pro photography communities.
Samsung Camera ships across S-series, Fold, Flip, Triple Fold, and Tablets — each with different screens, aspect ratios, and hardware. Maintaining a coherent, high-quality camera UI across all of them, over five OS generations, is an exercise in systems thinking.
Owned the full production pipeline: visual specs, GUI/UI documentation, design system guides for A & M-series, testing, and dev handoffs with the offshore engineering team in China. In parallel, redesigned the Magnifier accessibility app across all six form factors.
Defined the Expert Raw Sky Guide design system — a visual language for Samsung's pro photography app covering planets, stars and constellations, recognised across pro photography communities.
Three AI-native concepts explored over two years. None shipped in their original form. But they shaped what came after.
Three innovation explorations: an AI video editing suite, an AI photo editing pipeline that predated Photo Assist, and a Creator's Camera mode for content creators.
None shipped as conceived — but they weren't failures. The photo editing pipeline directly informed Photo Assist. The Creator's Camera shaped Samsung's understanding of their pro-sumer audience. The video work built AI interaction patterns the team still draws on.
Showing unshipped work is a deliberate choice. These projects represent thinking that only comes from exploring without a guaranteed outcome — and that thinking is present in everything that did ship.