A novel body-motion capture prototype to test digital fashion

You can watch my video of the results of the project here.

Having taken part in the research, both Maria and I then spent an extremely interesting hour discussing the experience, benefits and challenges of this particular digital fashion solution with Adam Drazin and Marie Lindberg from the University of London’s Department of Anthropology. They had been recruited to carry out in-depth interviews with those taking part, exploring questions relating to the installation, fashion shopping and attitudes, reactions to the experience of this type of technology, and ideas about how it could work in the future.

With our backgrounds in deep tech, customer experience design, psychology and experimental user experience methodologies, we really enjoyed the opportunity to have this conversation. While the technology was interesting, it was by no means perfect, and I could also see certain flaws in the methodology. For this reason I thought it would be interesting for those of you involved in these areas if I offered a thoughtful critique, along with my own ideas about where this could go.

So firstly, what exactly is Made in Code?

Photo: Simon Robinson

At the heart of the technology is a novel form of body-motion capture which does not rely on a person using a motion-capture suit of any description. Here you can see Maria being prompted by one of the developers, showing how she should make expansive and expressive movements inside of the circle marked on the floor.

Maria had the choice of four outfits: three silk dresses, and a fourth outfit consisting of a loose-fitting blouse and trousers (also silk). Once Maria had recorded a twenty-second performance, her movements were encoded, and a video was then created of the clothes mimicking her movements.

All of the videos were then available to be screened by the participant in a nearby installation. Maria’s video can be watched on YouTube here.

As you will have seen in the two videos, unlike most other digital technologies designed to help people imagine themselves in new outfits, Made in Code is ‘disembodied’, in that the person does not appear; only their movements, as expressed by the outfits, are visible. Additionally, the clothes have not been designed to fit the physical characteristics of each individual; it is just the movements which are mapped onto the one-size items of clothing.

One of the principal research questions was “what was it like to experience yourself moving in the clothes?” As someone who has been responsible for the design and implementation of complex user interface, product and service methodologies, I could see a basic flaw in this question: the latency of thirty minutes or so between the performance and the video being made available.

This is not a real-time solution: people do not see themselves in the clothes as they make their movements. It requires a major leap of the imagination, rendering the choice of outfit before the performance meaningless. Any or all of the outfits could have been chosen, and the same outcome achieved.

The second aspect, which is actually quite interesting to explore, is the value of only seeing your movements modelled digitally, and not yourself. While it would be easy to write off the experience as a gimmick (as I was certainly tempted to do), Maria certainly enjoyed seeing herself and her movements on display.

However, if we think about the validity of the results, what is being modelled is not a person’s natural everyday movements, but movements which have been primed by the person carrying out the test. This is the very opposite of the objective of scientific UX testing, where the aim is to influence a participant’s behaviour as little as possible.

While in theory the aim of the research was to explore people’s attitudes to the use of digital technology in fashion, the actual aim of the installation was to showcase the flowing qualities of the fashion designer’s silk clothes. You have to question the usefulness of this solution for trying out less expressive clothing, such as jeans or a shirt.

In 1997 Steve Jobs made one of the most astute observations about the design of new technology when he said that “you’ve got to start with the customer experience and work backwards to the technology. You can’t start with the technology and try to figure out where you’re going to try to sell it.”

This is advice that very much applies to this prototype. I don’t want to sound too harsh, as those taking part in the installation were of course being questioned about its efficacy, and were also invited to offer their own ideas about how they could potentially benefit from this kind of solution.

I think the developers and clothing designers need to turn the value proposition around. Rather than coaching people into making unnatural movements in an artificial setting, I would be really interested to explore people’s reactions to experiencing the pure disembodied dynamics of their own natural movements, given the chance to interact with physical objects in natural settings. As Maria observed, being able to see your body’s movements without your actual body does lead to some fascinating reactions, but what are the actual scenarios where this would be useful?

We now live in a very mixed up world where metaverse designers are struggling with the experience of virtual food, let alone more profound questions of body parts and dehumanised identities. With so much technology to play with, the fundamental question of the experience can often end up secondary to the desire to build first and ask questions later. I don’t think Made in Code quite succeeded in offering participants an emotional ‘moment of truth experience’, but in time I am sure that an appreciation of dynamics and flow could result in a performative installation that can really wow those taking part.

But certainly, it was great fun taking part, and both Maria and I thank all those involved for giving us the chance to do so.

