This AI Gives Emojis Real Faces and the Results Are Horrifying
If you train a machine learning tool to find human features in emojis, the results are not pretty.
Withered eyes on child,
Red lips as pepperoni,
Ghost with gaping jaw.
This haiku was inspired by Jonathan Fly’s horrifying post on his blog “I Forced a Bot.” Fly took a machine learning model—designed to reconstruct high-resolution human faces from 16 x 16 pixel images of faces—and fed it 16 x 16 emojis, Twitch emotes, and video game sprites instead. I don’t like how it looks!
Human-inspired emojis, like the woman-raising-hand emoji or the sweating-while-smiling emoji, look disturbingly photo-realistic. But arguably the worst results come from object emojis, like the pizza or fountain emojis, which sprout eyes, hair, and lips where they shouldn’t. It has the same effect as looking at that “realistic SpongeBob” art: imposing realistic features on cutesy images will always be a bit jarring. Fly told Motherboard via Twitter DM that he made these images mainly because he was interested in what would happen.
“It’s not so much inspiration as it is curiosity,” Fly said. “I guess it’s just my default mode to try things the wrong way or that shouldn’t work.”
“My favorites are the fountain because it worked so unexpectedly well, and the pizza with human lips, because it’s horrifying,” Fly said.
According to his blog, Fly generated his images using a model from an arxiv.org paper published August 22 by researchers at the Korea Advanced Institute of Science and Technology. The paper, called “Progressive Face Super-Resolution via Attention to Facial Landmark,” uses a technique called face super-resolution, which reconstructs a plausible high-resolution human face from a low-resolution input. The technique isn’t new, but this paper takes 16 x 16 pixel images as input. Conveniently, Fly said, that’s roughly the native size of many emojis and video game sprites. (Fly had to adjust the Twitch emote image sizes to apply the model to those images, and also scale down the resolution of the emojis he used.)
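That resizing step is simple to picture. A minimal, illustrative sketch (not Fly’s actual code, which handles real image files) of scaling an emote down to the 16 x 16 grid the model expects, using plain-Python nearest-neighbor resampling on a stand-in pixel grid:

```python
# Illustrative sketch: the paper's model takes 16x16 inputs, so larger
# emojis and Twitch emotes must be downscaled first. Here we use simple
# nearest-neighbor resampling; a real pipeline would use an image library.

def resize_nearest(pixels, new_w, new_h):
    """Resize a 2D grid of pixel values (list of rows) by nearest-neighbor sampling."""
    old_h, old_w = len(pixels), len(pixels[0])
    return [
        [pixels[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

# Stand-in for a 112x112 Twitch emote (single grayscale channel for brevity).
emote = [[(x + y) % 256 for x in range(112)] for y in range(112)]
lowres = resize_nearest(emote, 16, 16)
print(len(lowres), len(lowres[0]))  # 16 16
```

Once everything is at 16 x 16, any image—face or pizza—looks like a valid input to the model, which is exactly why the experiment works.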
“These samples are cherry picked and many outputs were not very interesting,” Fly notes in his blog. “I also went to some lengths to encourage the model to make aggressive guesses about facial features.”
Fly posted the code he used to make these emojis on GitHub. So if you have the patience, knowledge, and resources to replicate the machine learning tool used here, you can try to duplicate these results. Do as you please, godspeed.
This article originally appeared on VICE.com. To read the full article and see the images, click here.
Nastel Technologies is the global leader in Integration Infrastructure Management (i2M). It helps companies achieve flawless delivery of digital services powered by integration infrastructure, providing tools for middleware management, monitoring, tracking, and analytics that detect anomalies, accelerate decisions, answer business-centric questions, and give decision-makers actionable guidance so customers can constantly innovate. It is particularly focused on IBM MQ, Apache Kafka, Solace, TIBCO EMS, and ACE/IIB, and also supports RabbitMQ, ActiveMQ, Blockchain, IoT, DataPower, MFT, IBM Cloud Pak for Integration, and many more.
The Nastel i2M Platform provides:
- Secure self-service configuration management with auditing for governance & compliance
- Message management for Application Development, Test, & Support
- Real-time performance monitoring, alerting, and remediation
- Business transaction tracking and IT message tracing
- AIOps and APM
- Automation for CI/CD DevOps
- Analytics for root cause analysis & Management Information (MI)
- Integration with ITSM/SIEM solutions including ServiceNow, Splunk, & AppDynamics