Computers are becoming remarkably good at generating fake faces using a technology called the generative adversarial network (GAN). This technology lets propagandists hide behind computer-generated personas that carry no baggage. GANs are good at imitating people, but a few features still trouble them, so many artificially generated pictures carry signature glitches. Here are some things to look for when judging whether an image was produced by a GAN and trying to identify fake, AI-generated faces:


Straight Hair That Looks Like Paint

Long hair often takes on a hyper-straight look: a small patch may seem fine, but the longer strands look as if someone smudged acrylic paint with a huge brush or a palette knife.

Asymmetrical Jewelry or Facial Features

Suppose a big, strange blob is floating off the top of a person’s head in a picture. That kind of aberration or artifact is common in AI-generated images. You can also check a photo for asymmetry by looking around the person’s ears: you may notice fuzzy hair, mismatched earrings, or an earring missing from one side entirely.
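If you want to eyeball this more systematically, the short Python sketch below flips a photo horizontally and measures how much it differs from its mirror image. It is only an illustration under simple assumptions: the file name and the threshold are placeholders, and a real check would need the face to be aligned and centered first.

```python
# Minimal sketch: flag strong left-right asymmetry in a (roughly centered) face photo.
# Assumes Pillow and NumPy are installed; "face.jpg" and the threshold are placeholders.
import numpy as np
from PIL import Image, ImageOps

img = Image.open("face.jpg").convert("L")   # grayscale keeps the comparison simple
mirrored = ImageOps.mirror(img)             # horizontal flip

a = np.asarray(img, dtype=np.float32)
b = np.asarray(mirrored, dtype=np.float32)

# Mean absolute difference between the image and its mirror.
# Real faces are never perfectly symmetric, so this is only a crude signal;
# large differences around the ear and earring regions are what this tip is about.
asymmetry = np.abs(a - b).mean() / 255.0
print(f"asymmetry score: {asymmetry:.3f}")
if asymmetry > 0.15:                        # arbitrary illustrative threshold
    print("noticeable left-right asymmetry - worth a closer manual look")
```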

Indecipherable Text


Face-generating GANs have difficulty capturing anything in the background that has real structure. Writing is especially hard for them to model because they are typically trained on both the original and mirrored (horizontally flipped) versions of each image, while real text only ever appears in one orientation. The result is signage or print that resembles text but cannot actually be read.
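To make the mirrored-data point concrete, here is a minimal sketch of the horizontal-flip augmentation commonly applied to face datasets, written with torchvision as an assumed setup rather than any particular model’s actual training code. Because each image is randomly mirrored, any writing in the frame is seen both forwards and backwards, so the generator never learns a consistent letter shape.

```python
# Minimal sketch of the horizontal-flip augmentation commonly used on face datasets.
# torchvision is an assumed dependency; "faces/" is a placeholder directory.
from torchvision import datasets, transforms

augment = transforms.Compose([
    transforms.Resize(128),
    transforms.CenterCrop(128),
    transforms.RandomHorizontalFlip(p=0.5),  # the mirroring discussed above
    transforms.ToTensor(),
])

# Each epoch, roughly half the images (and any text in them) are seen reversed.
dataset = datasets.ImageFolder("faces/", transform=augment)
```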

Teeth

The algorithm doesn’t really know how many teeth a mouth should have or how they should be oriented; instead, it draws on examples of teeth seen from many different angles. Sometimes it gets this wrong and leaves its creations with mangled, odd-looking teeth. This can be a little harder to spot than a disfigured or missing earring, but if you look closely at the mouth, you can often find some weirdness.

Surreal Background

GAN-created faces look believable partly because all of the training data has been centered, which means there is less variability for the GAN to model when placing and rendering features such as eyes and ears. The background, on the other hand, can contain almost anything. That is too much for the GAN to model, so it produces generic background-like textures rather than coherent, “real” scenes.

Non-Stereotypical Gender Presentation

Many face-generating GANs are trained on a collection of roughly 200k images of more than 10k celebrity faces, and in their output they regularly mix features drawn from stereotypically masculine and feminine presentations. More generally, this happens because GANs don’t learn the same categories that humans socially reinforce. It is vital to be clear, though, that neither asymmetry nor non-stereotypical gender presentation is inherently an indicator that an image is not “real”; plenty of genuine photos show both.

Strange Backgrounds or Clothing

Patterns are sometimes a mystery to face-generating algorithms, leading to strange structures in the image background or weird clothing on the subject. Watch out for text in the background, too, since it is almost always malformed. The subject may look stunning while appearing to sit inside some bizarre geometric structure.

To Sum Up

Hopefully, the tips above will help you identify fake AI-generated faces and make you start to question what you see in new ways. They should also push you to corroborate evidence, even when an image looks convincingly human.

 
