Innovation Monitor: Beauty is in the eye of the algorithm

Welcome to this week’s Innovation Monitor.

Augmented reality innovation excites our community and me (check out our previous editions): AR adapts complex tech into fun, everyday uses that surprise and delight. We love Wallflower, a beautiful and playful AR app created by a team in NYC Media Lab's recent Synthetic Media & Storytelling challenge. It's a great example of how, especially during the pandemic, filters provided a wonderful reprieve from the gravity of what we were facing during family FaceTime and Zoom calls.

At the NYC Media Lab, we embrace the rewards and recognize the risks of emerging tech. So as we explore and celebrate new creative expressions immersive and creative tech offers, we’ll also consider the unintended consequences of rapid technological innovation.

In this week’s edition we look into how AR-based facial filters have exploded in popularity with relatively few guidelines. What societal implications are most urgent, especially as these tools collect our biometric information? How does this culture of filters, followers, and instant fame impact the development and mental health of children and young people? Read on to find ways people are using AR that may be both surprising and disturbing.

Thank you for reading, and as always, stay safe. If you were forwarded this email, you can easily sign up here!

All best,
Erica Matsumoto

Filtering your life

In his 1996 magnum opus Infinite Jest, the late David Foster Wallace wrote an eerily prescient passage: in the book, the telecommunications industry devises "High-Definition Masking," starting off with "flattering multi-angle photos" and culminating in "a form-fitting polybutylene-resin mask." Today, we also wear masks, though in the form of filters.

The beauty filter is rooted in selfie culture, which blossomed in the mid-90s ‘kawaii’ scene in Japan, particularly at purikura, “photo booths that allowed customers to decorate self-portraits,” according to MIT Tech Review.

AR filters themselves started off innocently enough. Arguably, Snapchat's face-swapping Lenses kicked off users' and the media's interest, though the tech had been around for a decade. A sophisticated early example arrived in 2011, when media artist Kyle McDonald's face substitution research with Arturo Castro blew people's minds:

Things took off in 2015, when Snapchat added Lenses (aka filters), and in 2016, when the company made face swapping a viral phenomenon. Notably, the app also had the youngest demographic among the social giants at the time. And therein lies the concern:

“The amount of biometric data that TikTok, Snapchat and Facebook have collected through these filters [is worrying]. Though both Facebook and Snapchat say they do not use filter technology to collect personally identifiable data, a review of their privacy policies shows that they do indeed have the right to store data from the photographs and videos on the platforms.”

Beyond very real privacy concerns, researchers still aren't sure of the lasting impact that continuous filter use has on our mental health. When Snapchat launched in 2011, selfies shifted from byproduct to a primary mode of conversation. By 2013, "selfie" had made it into the Oxford Dictionary. Today, over 90% of young people in the US, France, and the UK use Snapchat's AR features. For some, AR filters are not just a makeup substitute but a lifeline:

“Caroline Rocha, a makeup artist and photographer, says that social media filters provided her a lifeline at a crucial moment. In 2018, she was at a personal low point… [filters gave her] the chance to travel … to experiment, to try on makeup, to try a piece of jewelry.”

Rocha became a popular Lens creator among the hundreds of thousands designing filters for Snapchat. Her first hit was “Alive,” dedicated to her battle with mental illness.

But filters can also become a compulsive habit: Rocha says some people “refuse to be seen without these filters, because in their mind they think that they look like that.” Even the creative direction of Lens creators is changing, says Rocha, with focus shifting to beautification for “money and fame.”

“There is a bad mood in the community. It’s all about fame and number of followers, and I think it’s sad, because we are making art, and it’s about our emotions … It’s very sad what’s happening right now.”

According to Claire Pescott, a researcher who studies the behavior of preteens on social media, young girls use AR filters primarily for beautification: "[In a study, girls] were all saying things like, 'I put this filter on because I have flawless skin. It takes away my scars and spots.' And these were children of 10 and 11."

Facial distortion

Filters don't just smooth the skin; they also perform facial distortions, a kind of digital cosmetic surgery: making lips bigger, lifting eyebrows, narrowing the jaw, thinning the nose, and enlarging the eyes. This is appalling.

Instagram actually banned cosmetic surgery filters in 2019 due to the potential negative impact, and then brought them back with some limitations in 2020. (As CNBC noted in 2018, “even if Snapchat or Instagram removed its filters, other apps would simply take their place.”)

Changing one's online persona or avatar — especially at a young age — may lead to a change in behavior. Coined by Stanford's Nick Yee and Jeremy Bailenson (founding director of Stanford's Virtual Human Interaction Lab), the Proteus Effect is the hypothesis that "an individual's behavior conforms to their digital self-representation independent of how others perceive them."

“Across different behavioral measures and different representational manipulations, we observed the effect of an altered self-representation on behavior. Participants who had more attractive avatars exhibited increased self-disclosure and were more willing to approach opposite-gendered strangers after less than 1 minute of exposure to their altered avatar. In other words, the attractiveness of their avatars impacted how intimate participants were willing to be with a stranger.”

When asked what he thought about AR filters' effects on his own two daughters, Bailenson said it's "a real tough one, because it goes against everything that we're taught in all of our basic cartoons, which is 'Be yourself.'"

The eye of the algorithm

A quick Google search will yield a collection of startups around the world offering facial recognition and analysis APIs, leveraging AI for use cases like beauty scoring, makeup recommendations, dating apps, and more. The world's largest open facial recognition platform, Face++, has a beauty-scoring AI.

With thousands of available filters able to morph your appearance toward a predefined popular look, as well as algorithms that rate you on how well you achieve that look, it feels like beauty is becoming a commodity. This doesn't bode well for youth. According to a report from the Royal Society for Public Health, which surveyed 1,500 social media users aged 14 to 24:

"Every single platform, other than YouTube, was associated with users' anxiety and depression. In fact, the two most image-centric platforms, Snapchat and Instagram, were ranked lowest for users' well-being, particularly pertaining to bullying and [FOMO], and in news that will not surprise anyone who has looked at the #thinstagram hashtag, Instagram scored poorly related to body image and anxiety."

Social media algorithms aren’t helping. In fact, they’ve come under continuous criticism for promoting this commodified version of beauty, excluding or flagging people of color or those with disabilities. Adore Me’s viral tweet thread, which demonstrated how videos on TikTok with plus-size, Black, or disabled lingerie customers were taken down, is a great example:

This Week in Business History

June 14th, 1822: Charles Babbage introduces his "Difference Engine," setting the course for the future of the computer.

On this day, Babbage unveiled the design for what would be the first mechanical computing machine. The British government funded its construction; though the Difference Engine was never completed, it set the stage for modern computing, and the subsequent Analytical Engine would become a precursor to how we understand computer science and software today. This newsletter took a deep dive into this topic, and Ada Lovelace, a few editions ago:

After spending grant money equal “to the cost of two large warships,” the inventor found that there was nobody in the early 1800s that could manufacture the necessary parts. (Someone did eventually fund the construction of the Difference Engine… in the 1990s.)

It was around this time — 1833 — that Babbage, 41, met 17-year-old Lovelace (who went by Ada Byron at the time). Lovelace was fascinated by Babbage’s Difference Engine — and understood how it worked — and the two kept in touch.
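The Difference Engine's principle — the method of finite differences — lets a machine tabulate any polynomial using repeated addition alone, since a degree-n polynomial has a constant nth difference. Here's a minimal Python sketch of that arithmetic idea (the function name and structure are illustrative, not a model of Babbage's mechanism):

```python
def tabulate(initial_differences, steps):
    """Tabulate a polynomial via the method of finite differences,
    the principle behind Babbage's Difference Engine.

    initial_differences: [f(0), Δf(0), Δ²f(0), ...], where the
    highest-order difference is constant for a polynomial.
    Each new value is produced with additions only — no multiplication.
    """
    diffs = list(initial_differences)
    values = [diffs[0]]
    for _ in range(steps):
        # Fold each higher-order difference into the one below it.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
        values.append(diffs[0])
    return values

# Example: f(x) = x² has f(0)=0, Δf(0)=1, Δ²f=2 (constant).
print(tabulate([0, 1, 2], 5))  # [0, 1, 4, 9, 16, 25]
```

This is why the engine needed only adding mechanisms: once the initial column of differences is set, every subsequent table entry falls out of cascaded additions.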


NYC Media Lab · 370 Jay Street, 3rd floor · Brooklyn, New York 11201 · USA

NYC Media Lab connects university researchers and NYC’s media tech companies to create a new community of digital media & tech innovators in New York City.