Photograph: Isabelle Whiteley / That’s What She Said Project

Artificial intelligence and the automation of body dysmorphia

Marc Crouch
5 min read · Dec 3, 2018

Computer vision technology is automating a lot of things that traditionally required Photoshop skills, including image manipulation and airbrushing. Worryingly, companies are becoming less and less subtle about the commercial applications of such tech in the realm of body image transformation, a fertile commercial playing field with young women as the prime target. As A.I. innovation spreads ever further into the alcoves of human activity, we need to ask ourselves whose responsibility it is to monitor its impact on human beings. Whilst a lot of interesting technology is being developed, most of it is still built on the foundation of “can it be done” rather than “should it be done”. This needs to change.

Orbo is a highly innovative company. They are operating very much at the practical cutting edge of A.I. image innovation with some really cool technologies under their belt that will doubtless enhance aspects of human life. For example, they have technology that can automatically adjust photos to be more vivid or compensate for low-light conditions. Handy for those late-night selfies and evening scenery shots, and the results are quite convincing.

Orbo’s background removal tool in action

They also have some really useful tech for photo manipulation, including a background removal tool which any designer who’s spent tedious minutes trying to draw a selection around an object in Photoshop will really appreciate. This kind of thing is a massive time saver and most definitely valuable.

Then we have their face transformation technology, and here is where the question starts to surface. At one end they have technology that can apply makeup to a face using A.I. This could certainly have practical applications in the beauty and cosmetics industries; for example, a woman could upload a photo of her face to an online store and try different combinations before buying — effectively digitising the in-store experience of beauty counters. Where it becomes more insidious is in the physical features manipulation. Orbo’s face transform page has demos of technology that can:

  • Add a smile to a non-smiling face
  • Automatically slim down a face
  • Conversely, automatically plump up skinnier faces
  • Enlarge the eyes on a face

Orbo’s face shaper tool, which allows women to magically lose weight on social media

There is no ambiguity about how this technology is targeted. All the faces on the demo page are women, and the accompanying text is explicit about its intended use, for example:

“Now you can loose (sic) some facial fat before your diet kicks and look the best in your online social circle.”

In other words, use our A.I. to automatically airbrush yourself in Instagram selfies.

Old problem, new platform

I don’t need to rehearse the debate around the mental health impact on young women of being constantly presented with impossible, and usually doctored, images of the female form. Nor is this kind of technology particularly new: a couple of years ago Samsung caused controversy by shipping a “beauty” filter on their phones’ front-facing cameras that was switched on by default.

Melissa Wells on Instagram

What concerns me is that the supply-and-demand paradigm is overruling the well-intentioned ideals of scientific advancement, the tradition to which A.I. development belongs. It’s common knowledge that girls as young as 12 routinely doctor their own photos on mobile phones for social media consumption, so the commercial temptation to build such technology is clear. Equally clear, though, is the overwhelming evidence of the psychological harm such image manipulation causes, with the mass proliferation of doctored bodies linked to low self-esteem, body dysmorphia, depression, and eating disorders. Younger teenagers, both boys and girls, are particularly vulnerable.

The problems are so well researched and well known that governments have started to clamp down on mainstream media use of doctored body imagery. In France, a law was introduced last year forcing all doctored images of models to bear a cigarette-packet-style warning label indicating that they have been manipulated. In London, the mayor introduced a ban on such imagery on buses and underground trains, again acknowledging the overwhelming evidence of the harm it causes.

Which brings me back to Orbo, who are based in Mumbai, India, but were recently backed by the London-based VC Founders Factory. While the debate around body image is perhaps not as advanced in India, there is no such excuse for a London-based VC. Their precise plans for the investment and the future of the company are not public, but I can only hope they embrace a sense of responsibility about how such powerful technology should be deployed.

Nor do I mean to pick on Orbo specifically, as there are many other companies providing similar image manipulation technology, both manual and automated. The app stores are full of them, and young girls in particular are lapping them up. And so the question arises: if there is such a high demand for image manipulation technology of this nature, shouldn’t we focus on addressing that demand? The answer to that depends on how you would respond to two other questions:

  1. Do you care about the social impact of the technology you develop?
  2. Are you fuelling the demand or responding to a pre-existing problem?

Artificial intelligence is on a fast track to changing our world in countless ways, and amidst that pace of change I feel these questions often get forgotten, particularly when the commercial potential is so potent. I, however, could not live with myself if I ever heard that my technology had in some way contributed to a young girl dying of anorexia or taking her own life out of depression.

In the whirlwind of A.I. technology advancement, we mustn’t forget to be human.
