![A cartoon of a woman with a smartphone next to her showing it has recognised her face.](/media/gzgnpp5r/hdtw-image.png?width=1000&height=600&format=webp&rnd=133419556809800000)
Face filters
Dig into any of the biggest social media apps and you’ll find an array of filters that can transform the input from your phone’s front camera. These augmented reality (AR) effects can change a person’s appearance, or that of their background, using computer vision and image processing models that perform real-time video modifications.
Among the millions of filters available, you’ll find all sorts. You can accessorise with some glasses or throw in a few sparkles, as if you’ve been doused in glitter. Other filters are limited only by their creators’ imaginations and range from the ridiculous to the fantastical. The classics include the face, age or gender swap; appended animal ears, or the Pixar version of you. But you can also, if you like, freakishly distort your features, or see what it would look like if your face were plastered onto the body of a giant prawn. Or instead of the boring wall behind you, why not take on the lo-fi graphics of a noughties music video – using the very same background subtraction technology popularised by Zoom during the pandemic?
Face filters rely on computer vision and image processing models that can modify a video feed in real time. Central to the process is a facial detection algorithm, such as the Viola-Jones algorithm (for more on facial recognition technology, see Ingenia 79). This algorithm works out differences in contrast between portions of an image to detect the edges of the features. For example, as any portrait artist knows, the eye sockets, sides of the nose and lower lip are darker than the upper lip, bridge of the nose and middle of the forehead. If the algorithm finds enough of these features, it can detect a face.
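As a rough illustration of this step, the sketch below uses OpenCV's bundled Haar cascade classifier – a detector in the Viola-Jones family that compares light and dark regions of a grayscale frame – to find faces in a live webcam feed. The cascade file path and the webcam index are illustrative assumptions rather than details from the article.

```python
# A minimal sketch of Viola-Jones-style face detection using OpenCV's
# pre-trained Haar cascade (trained on contrast-based Haar features).
# The cascade path and webcam index 0 are assumptions for illustration.
import cv2

cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

capture = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = capture.read()
    if not ok:
        break
    # The cascade works on grayscale intensity, comparing light and dark regions
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("Detected faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
capture.release()
cv2.destroyAllWindows()
```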
Having detected the facial features, the software then aligns a statistical model of the face with the face in question using machine learning. Called an active shape model, this is derived from hundreds of thousands of facial images, on which people have marked the borders of features. A mesh representing the ‘average face’ from this model is then scaled and aligned with the user’s face, with adjustments made where it doesn’t fit perfectly.
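As a minimal sketch of this alignment step, the snippet below uses Google's MediaPipe Face Mesh – a learned landmark model rather than a classical active shape model, but the principle of fitting an 'average face' mesh to the live image is the same. The webcam index is an assumption.

```python
# A minimal sketch of fitting a face mesh to the live video, using
# MediaPipe Face Mesh in place of a classical active shape model.
# Webcam index 0 is an assumption for illustration.
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)
drawing = mp.solutions.drawing_utils

capture = cv2.VideoCapture(0)
while True:
    ok, frame = capture.read()
    if not ok:
        break
    # MediaPipe expects RGB input
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        for landmarks in results.multi_face_landmarks:
            # Draw the dense mesh, scaled and aligned to the user's face
            drawing.draw_landmarks(frame, landmarks,
                                   mp.solutions.face_mesh.FACEMESH_TESSELATION)
    cv2.imshow("Face mesh", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
capture.release()
cv2.destroyAllWindows()
```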
This mesh is where the magic happens. Software can distort it, attach accessories to it, or apply colour changes to segments of it (such as the eyes). Importantly, the mesh and any programmed alterations to it must track the video of your face in real time for a smooth viewing experience. This tracking isn’t quite perfect yet: effects can disappear when people turn their heads to the side. In some cases, the algorithms must also account for occlusion. This refers to what happens when, say, a hat is seen from a three-quarter profile: the head blocks the back of the hat from view. So, for a hat filter, the part of the animation behind the head must be subtracted.
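To show how an effect can be anchored to the mesh, here is a hypothetical helper that draws cartoon ‘glasses’ scaled to the distance between the eyes, so the overlay tracks the head frame by frame. The landmark indices and the `draw_glasses` name are assumptions for illustration; the function expects the normalised landmarks produced by a mesh model such as the one sketched above.

```python
# A minimal sketch of anchoring an effect to the face mesh: given per-frame
# landmarks (e.g. from the MediaPipe sketch above), draw cartoon glasses
# sized to the face so the overlay follows the head in real time.
# The landmark indices below are assumed eye-centre points, for illustration.
import cv2

LEFT_EYE_IDX, RIGHT_EYE_IDX = 159, 386  # assumed indices near the eye centres

def draw_glasses(frame, landmarks):
    h, w = frame.shape[:2]
    # Landmark coordinates are normalised [0, 1]; convert to pixel positions
    lx, ly = int(landmarks[LEFT_EYE_IDX].x * w), int(landmarks[LEFT_EYE_IDX].y * h)
    rx, ry = int(landmarks[RIGHT_EYE_IDX].x * w), int(landmarks[RIGHT_EYE_IDX].y * h)
    # Scale the lenses to the distance between the eyes
    radius = max(8, int(0.35 * ((lx - rx) ** 2 + (ly - ry) ** 2) ** 0.5))
    cv2.circle(frame, (lx, ly), radius, (0, 0, 0), 3)  # one lens
    cv2.circle(frame, (rx, ry), radius, (0, 0, 0), 3)  # other lens
    cv2.line(frame, (lx, ly), (rx, ry), (0, 0, 0), 3)  # bridge
    return frame
```

Called once per video frame, the circles follow the eyes as the head moves; a production filter would instead render a textured 3D accessory and hide the parts occluded by the head, as described above.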
One concern about filters that has been increasingly highlighted is the beauty filter. Although these filters rely on distortion too, the distortion is so subtle you might not immediately notice it if you saw it on someone else’s photo. However, the effects are worrying: noses or jawlines can be slimmed down, skin can be smoothed, eyes and lips enlarged, and eyelashes lengthened to achieve a so-called ‘Instagram face’ – a Eurocentric standard of beauty often achieved with cosmetic procedures.
What’s more, not all ‘beauty’ filters are labelled as such – sometimes a hat filter might also slim down your nose. All of this adds up to a troubling phenomenon that has been dubbed in the media as ‘filter dysmorphia’, a new version of an old problem to which some young people are particularly susceptible. This is an issue tech companies will need to consider carefully if filters continue their trajectory of popularity.
Beyond purely social media uses, brands are developing filters that allow people to try on clothing, accessories, jewellery and makeup, or see what a piece of furniture could look like at home. While these haven’t gone mainstream yet, in the next few years you could well be investing in your next pair of glasses without ever seeing them in person.
***
This article originally appeared in Ingenia 92 (September 2022).