Trying On The Levi's And Google Smart Jacket At SXSW Feels Like The Future
- Dipesh Pal
- Mar 14, 2017
- 3 min read
The most important thing to say about the Levi's x Jacquard by Google jacket is that fundamentally there's little about it that screams technology.
We've known that ever since Paul Dillinger, VP of global product innovation at Levi Strauss & Co, demonstrated it on the Google I/O stage in May 2016. But this weekend, anyone who wanted to could give it a go at SXSW in Austin, where a whole rail of jackets in different sizes, embedded with the conductive yarn that enables touch interactivity, was on show.
Physically getting your hands on a technology that has been just a concept for over 18 months is an exciting leap – especially as it comes with the announcement that Levi’s and Google’s Advanced Technology and Products (ATAP) group will finally launch it to the public this fall at a retail price of $350.
But again, what strikes you most when you put it on is the fact that this is a fashion item first. Where most other "wearables" have been hardware devices with the occasional fashion accessorising, this is all about textiles. The design is based on an existing Levi's jacket – the Commuter Trucker. It's aimed specifically at urban cyclists, which is the point of the tech’s functionality too, but it's also simply a nice looking, great feeling performance piece.
And that’s what’s going to be critical for the longevity and mass uptake of wearable technology today – that it merges with what we’re both used to and want to put on our bodies. The tech has to enhance what we wear, not supersede how it looks and feels.

In this instance, the tech integration itself is twofold. On one hand there's the conductive interface on the left cuff of the sleeve, designed to look like a slight irregularity in the weave and act as a touchscreen. The second part is a black strap that slots into the jacket next to that interface before snapping on like a popper. That part holds the battery and the USB connector used to charge it (roughly every couple of days).
Fundamentally the experience itself is incredibly intuitive. That strap, or smart tag, is really simple to snap into place. It then automatically connects you to the smartphone app that enables the functionality. Within that you can choose exactly what you want the experience to consist of at any given time. It's a drag-and-drop user experience – building blocks, if you will, for the type of interaction you're intending to have with the sleeve and the resulting action you want it to trigger.
At this point, the demos are pretty simple – brush in, brush out, double tap. The resulting actions are then things like answer a call, turn music on, ask the time, get directions, read a message and more.
Those actions were at one point going to be called superpowers, according to Ivan Poupyrev, technical program lead at Google’s ATAP, speaking at SXSW this weekend, but have ended up being named "abilities." From his perspective it was about ensuring they didn't overwhelm the user with too many options at this stage. It is of course possible to do much more, he explained.
The other important part was to give the user confirmation of the action they take – a sign that brushing in or out is actually working as they do it. As a result, there's a small vibration on your arm each time, as well as an LED light that comes on. That haptic feedback, the physical response to what you're doing, is pleasantly satisfying. If the music doesn't start playing immediately, how else do you know your double tap succeeded?
In an increasingly screen-less world, that's an increasingly interesting proposition. Connected clothing needs to feel, not just act.