Imagining personalized, interactive wardrobes

The Fashion Innovation Agency (FIA) at London College of Fashion, University of the Arts London, and Reactive Reality are creating the personalized digital human stylists of the future, stylists that will transform the way we interact with our clothes, with help from Microsoft AI and IoT.

London’s Fashion Innovation Agency has dreamt up a new way for fashion lovers to try on clothing and get uniquely customized style tips, directly from their personal devices. They’re introducing digital human stylists, powered by Microsoft AI, IoT, and natural language processing together with Reactive Reality’s PictoFit technology, that have the potential to transform the entire ritual of getting dressed, with huge implications for the fashion industry.

So, what are digital human stylists exactly? They’re like personal avatars, only far more realistic and complex. The digital humans are brought to life with natural animation and speech. Each one is tailored to a user’s body and style preferences, with an engaging interface to boot.

Interacting and building trust with digital humans will have a huge impact on the way that we experience and consume fashion. These digital twins will allow for a far more immersive and meaningful connection that will forever change the fashion and retail industries.

Matthew Drinkwater, Head of LCF Fashion Innovation Agency

Like each of us, every digital human stylist is unique. They’re tailored to a user’s own body shape and driven by the clothes a person already owns. They’re further informed by contextual intelligence such as current location and environmental data, including local services, weather, and community information. With this nuanced information, the digital stylists create outfit recommendations based on pre-scanned 3D garments stored in a user’s digital wardrobe, ultimately suggesting more fashionable and sustainable ways for us to make our daily clothing choices.
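To make that recommendation idea concrete, here is a minimal sketch of how a stylist might pick garments from a digital wardrobe using the weather and a user’s style preferences. This is our own illustration, not the project’s actual code: the `Garment` type, the 1–5 warmth scale, and the tag-overlap ranking are all assumptions.

```python
from dataclasses import dataclass


@dataclass
class Garment:
    name: str
    warmth: int       # assumed scale: 1 (light) .. 5 (heavy)
    style_tags: set   # e.g. {"classic", "formal"}


def recommend_outfit(wardrobe, temperature_c, preferred_styles):
    """Pick garments the user already owns, filtered by weather
    and ranked by overlap with the user's style preferences."""
    # Map temperature to a target warmth level (an assumed heuristic).
    target_warmth = 5 if temperature_c < 5 else 3 if temperature_c < 15 else 1
    # Keep garments whose warmth is close to the target.
    candidates = [g for g in wardrobe if abs(g.warmth - target_warmth) <= 1]
    # Rank by how many of the user's preferred style tags each garment shares.
    return sorted(candidates,
                  key=lambda g: len(g.style_tags & preferred_styles),
                  reverse=True)
```

A real system would of course draw these inputs from live weather data and the scanned 3D wardrobe rather than hand-built lists.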

Image: the creation of a digital human stylist, shown side by side with an example interaction with the stylist.

The interaction with the digital humans goes beyond just what we wear in the here and now. Using Microsoft’s HoloLens 2 or a mobile device, people can have actual conversations with their digital humans about future choices as well: providing feedback about outfits, requesting new garment options, and even adding alternate looks or garments to their wardrobes.

The digital human stylists also leverage a broad suite of connected Microsoft products to make contextually relevant style recommendations. Bing search provides a broad slate of outfits and can be used to locate local services like dry cleaning, or fill gaps in a user’s wardrobe with new garments for purchase based on the styles users already love. With help from a Microsoft Outlook calendar, digital humans provide outfit recommendations while keeping schedules in mind. Whether it’s a wedding, a job interview, or a formal dinner, each outfit is tailored down to the tiniest detail while considering parameters like venue, location, and guest list, so you don’t end up wearing the same outfit with the same people.
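The calendar-aware detail above, avoiding a repeat outfit in front of the same guests, can be sketched as a simple filter over past events. This is a hypothetical illustration only; the `wear_history` shape (pairs of outfit name and guest set) is our own assumption, not the project’s data model.

```python
def outfits_not_seen_by(guests, wear_history, wardrobe):
    """Exclude outfits already worn at events that shared any of these guests.

    guests:       set of names on the upcoming event's guest list
    wear_history: list of (outfit_name, guest_set) pairs from past events
    wardrobe:     list of outfit names available to recommend
    """
    # Outfits are "seen" if any past event's guest list overlaps this one's.
    seen = {outfit for outfit, past_guests in wear_history
            if past_guests & guests}
    return [outfit for outfit in wardrobe if outfit not in seen]
```

In practice the guest lists would come from calendar invitations (for example, via Outlook), with the same overlap check applied before ranking recommendations.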

At the core of this game-changing project is an initiative with massive potential across retail, enterprise, advertising, showcases, and the entire fashion industry. FIA is always pushing the envelope of what’s possible for the fashion industry of tomorrow. Reactive Reality’s solution, which is built on Microsoft Azure, can help retailers unleash the power of AI and innovation to bring the most value to their customers. FIA and Reactive Reality are only just beginning to explore the possibilities of this technology and how it can transform fashion; while the project is currently a proof-of-concept demo, it’s a window into what tomorrow might look like.