03 Accessibility Zappar · 2021–2023

Zapvision

Accessible QR technology for the 2.2 billion people packaging was never designed for.

Nobody had designed product packaging to speak to blind users. I led the product that changed that.
Role
Lead Designer
Company
Zappar
Year
2021–2023
Team
Lead Designer · RNIB · Unilever
Platform
Mobile app · iOS & Android
Tags
Accessibility · AR/XR · Inclusive Design · Product Strategy

Context

The World Health Organisation estimates 2.2 billion people have near or distance vision impairment. Of those, 284 million are partially sighted and 39 million are registered blind. For most of them, identifying a product on a supermarket shelf or checking allergens on packaging at home is a daily challenge with no good solution.

I led the design of Zapvision from the first internal pitch through to the shipped product. The case for building it was mine to make before any design work started.

Zapvision is a four-part integrated system: an app, an SDK, a CMS, and the Accessible QR code itself. I designed all four. The brand is also mine, built as part of the Zappar ecosystem. The product was announced at AWE 2022 and launched with Unilever as the first major brand partner.

The Problem

AR experiences and product packaging share the same assumption: the person using them can see. Every design decision in both disciplines treats vision as a given.

The consequences are practical and daily. Two products that feel identical to the touch, like Persil Bio and Non-Bio, can have completely different ingredients. A blind user picking them off a shelf has no way to tell them apart. That's not an edge case. It's a routine problem for tens of millions of people, with no solution anywhere on the market.

This wasn't only a question of inclusion. It was a large, underserved market with real commercial potential. Healthcare, education, and consumer packaged goods were all affected. I made the case for building Zapvision on both grounds: moral responsibility and commercial opportunity. The internal pitch positioned it as a competitive differentiator and a route to market segments Zappar had no other path into. That framing was what got it built.

Research & Discovery

I worked with the RNIB, who formally commissioned an expert assessment of the Zapvision SDK and led consumer research with the blind and partially sighted community. Testing sessions were conducted in environments that simulated real use: rooms set up as shop floors and in-home cupboard settings. Not a lab. Situations that reflected where the product would actually be used.

I also gathered insights directly from Unilever, the RNIB, and the team at Microsoft's Seeing AI, to understand how the product needed to behave to fit into the routines of people who already relied on specific apps for daily independence.

I tested the physics of QR code detection in detail: what happens to camera alignment when there is no visual feedback, and what detection distance would make scanning genuinely usable without sight. A standard 15mm QR code scans at roughly 15cm. That requires precise alignment no blind user can consistently achieve. The detection mechanism needed to work from at least five times that distance to be usable in a real environment. That constraint shaped the entire technical approach.

Key Findings

  • A standard 15mm QR code scans at roughly 15cm. That requires precise alignment a blind user cannot achieve without visual feedback. The detection mechanism needed to work from over a metre away to be genuinely usable in a real shopping environment.
  • Existing accessible solutions were stigmatising or device-dependent. RNIB consumer research confirmed what the co-design sessions had already suggested: the technology had to live inside the apps users already relied on daily, like Seeing AI and Envision, not in a new app they would need to discover and adopt.
  • Audio quality, structure, and spatial placement were as critical as content. Users couldn't evaluate whether information was complete or relevant if it was poorly organised or competed with ambient sound. The information architecture of the audio output was a core design problem.
  • Blind users are not a homogeneous group. Participants ranged from congenitally blind to recently acquired sight loss, with very different relationships to visual concepts and different levels of assistive technology familiarity. The design had to work across that full range.
  • Voice commands were researched as a potential interaction model and consciously excluded from the MVP. Users were already operating their phones through established accessibility routines. Adding a new interaction paradigm would have created friction rather than removing it.

My Thinking

The first decision was the detection model itself. We developed the D3 code: a pattern of additional dots and dashes around one corner of a standard QR code. The D3 augments the existing QR without replacing it. When scanned by a standard camera app, the code behaves exactly as before, directing the sighted user to the brand's standard destination. When scanned by a Zapvision-enabled app, it unlocks the accessible experience. One code on the packaging serves everyone. That constraint was non-negotiable. Brands were not going to clutter their packaging with multiple codes.
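That one-code constraint is easier to see as logic. The sketch below is purely illustrative, assuming hypothetical ScanResult and Destination types; the real work of spotting the D3 dot-and-dash pattern happens in the computer vision layer, not in routing code like this.

    import Foundation

    // Conceptual sketch only: one printed code, two behaviours depending on the app scanning it.
    struct ScanResult {
        let qrPayload: URL        // the standard QR destination any camera app can decode
        let hasD3Markers: Bool    // true only when a Zapvision-enabled scanner detects the corner pattern
    }

    enum Destination {
        case brandWebsite(URL)           // standard camera app: existing behaviour, untouched
        case accessibleExperience(URL)   // Zapvision-enabled app: unlocks the audio-first experience
    }

    func route(_ scan: ScanResult, zapvisionEnabled: Bool) -> Destination {
        if zapvisionEnabled && scan.hasD3Markers {
            return .accessibleExperience(scan.qrPayload)
        }
        // A standard scanner never sees the D3 markers, so nothing changes for sighted users.
        return .brandWebsite(scan.qrPayload)
    }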

The second decision was the proximity model. Rather than announcing everything at once, I designed a two-stage audio approach that mirrors how a sighted person shops. At around 1.15 metres, the app announces the product category and its physical distance from the user. At around 60cm, it announces the full product information: ingredients, allergens, usage instructions. That staged disclosure gives users orientation before detail. It respects how people actually navigate a shelf.
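That staging reduces to a small piece of logic. A minimal sketch follows, assuming hypothetical Product and DisclosureStage types; the two distance thresholds are the only figures taken from the design, everything else is illustrative.

    import Foundation

    // Sketch of the two-stage proximity model: orientation first, detail second.
    struct Product {
        let category: String        // e.g. "Laundry detergent capsules"
        let fullDetails: String     // ingredients, allergens, usage instructions
    }

    enum DisclosureStage {
        case outOfRange
        case orientation   // ~1.15 m: announce category and distance
        case detail        // ~0.60 m: announce the full product information
    }

    func stage(atDistance metres: Double) -> DisclosureStage {
        switch metres {
        case ..<0.60: return .detail
        case ..<1.15: return .orientation
        default:      return .outOfRange
        }
    }

    func announcement(for product: Product, atDistance metres: Double) -> String? {
        switch stage(atDistance: metres) {
        case .outOfRange:  return nil
        case .orientation: return "\(product.category), about \(Int(metres * 100)) centimetres ahead."
        case .detail:      return product.fullDetails
        }
    }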

The third decision was how the product should communicate AR content to users who cannot see it. I immediately rejected the idea of an audio overlay on top of a visual experience. It would have made accessibility a secondary track, designed for sighted users and then described to everyone else. Building audio-first forced every design decision through the lens of the actual target user from the beginning.

The fourth decision was platform scope and distribution. I chose native mobile over web: the spatial audio and haptic capabilities that made the experience meaningful required native hardware access that web APIs couldn't reliably deliver in 2021. And rather than a standalone Zappar-branded app, I built the Zapvision SDK as a free, royalty-free integration for developers. A proprietary SDK means adoption depends on Zappar's commercial relationships. A free SDK means any accessibility app in the world can add support. The technology spreads faster, reaches more users, and becomes infrastructure rather than a product feature. It is now integrated into Microsoft's Seeing AI and the Envision App, which has over 250,000 users.
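To make the distribution point concrete: the sketch below imagines how a host accessibility app might wire in D3 detection. It is a hypothetical shape only; ZapvisionScanner and onAccessibleProduct are invented names for illustration and are not the real SDK's API.

    import Foundation

    // Hypothetical integration sketch; not the actual Zapvision SDK API.
    final class ZapvisionScanner {
        // Called when a D3-augmented code is detected: product information plus distance in metres.
        var onAccessibleProduct: ((String, Double) -> Void)?
        func start() { /* camera capture and D3 detection would live here */ }
    }

    // Inside an existing accessibility app (a Seeing AI- or Envision-style scanner),
    // support can be added without changing the interaction model users already know:
    let scanner = ZapvisionScanner()
    scanner.onAccessibleProduct = { info, distance in
        // Hand the announcement to the host app's existing speech output.
        print("Speak via the host app: \(info), about \(distance) metres away")
    }
    scanner.start()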

My Role & The Team

I created the brand identity within the Zappar ecosystem, conducted user testing with the RNIB, and designed the Zapvision app and CMS for brand partners. The CMS was a significant body of work in its own right. Designed in close collaboration with Unilever, it gave merchants the tools to input and manage product information at scale: seamless product input, layered information architecture supporting everything from ingredients to usage instructions, editable duplicates for SKU management, categorisation using EAN and GTIN standards, multilingual support with auto-translation review, and pronunciation logic for terms that are written as abbreviations but need to be read as full words. TNA, for example, should be announced as Tree Nut Allergy, not as three letters. Every one of those decisions was about making the audio output genuinely useful, not just technically functional.
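The pronunciation rule is simple to sketch. Assuming the CMS holds a lookup of abbreviation-to-spoken-form overrides (the TNA entry is the real example from the project; the function and the other entry are illustrative), the expansion runs just before text-to-speech:

    import Foundation

    // Sketch of pronunciation expansion ahead of audio output.
    let pronunciationOverrides: [String: String] = [
        "TNA": "Tree Nut Allergy",            // real example from the CMS work
        "GTIN": "Global Trade Item Number"    // illustrative entry
    ]

    func speakable(_ text: String) -> String {
        // Replace whole-word abbreviations so the speech engine reads the full term
        // instead of spelling out letters.
        text.split(separator: " ")
            .map { pronunciationOverrides[String($0)] ?? String($0) }
            .joined(separator: " ")
    }

    print(speakable("Contains TNA warning"))
    // "Contains Tree Nut Allergy warning"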

The Outcome

Accessible QR codes are now on over 5 billion product items globally. Unilever launched first, with Persil capsules and Ultimate Liquids in the UK, before rolling the codes out across other brands. Bayer followed. The technology opened healthcare, education, and consumer packaged goods as markets Zappar had no previous route into.

Zapvision has grown into its own product line within the Zappar ecosystem, with its own brand identity, mission, and partner network. The work was recognised with four industry awards: Pentawards, the Gaadys Award, the AWE Award, and the UK Packaging Award.

The bigger internal shift was less expected. The research and design principles from Zapvision became the accessibility baseline across the entire Zappar product suite. That wasn't planned. It happened because the work was good enough that the team wanted to carry it forward.

What's Next

The technology is being extended into complex indoor navigation in partnership with Auki Labs and their PoseMesh spatial mapping protocol, with pilot work underway in offices, retail environments, transit hubs, and public buildings. The design principles from Zapvision are now the accessibility standard for every new Zappar product.