Google Pixel 6 “Magic Eraser” Reveals AI Flaw

From an article praising the Google Pixel 6, here’s the before:

Source: CNet

And the after:

Source: CNet

Note the obvious contrast failure of the AI (light hand vs. dark hair, light straw vs. dark clothing). Areas of high contrast should be the easiest for the algorithm to manage, since light/dark borders are exactly what a classifier for a human hand, for example, would key on; yet these are the areas that reveal flaws in the algorithm.

It’s important to highlight the image flaws here because the next point made in the article is that “Real Tone” is a “major rethink” about people with different complexions, intended to get contrasts right (e.g. better handling of darker skin, related to a historic problem of racism in technology engineered for photography). That raises the question of why a white hand wasn’t outlined properly against dark black hair.

Also, I’m just going to say that a people “eraser” that leaves behind all the artifacts of life (cups on the table) is a very cynical feature for a “major rethink” team of engineers to ship. They’re allegedly trying to see people more accurately while building a feature that removes people entirely… put the two together and you get the worst chapters in history.

On the plus side, perhaps the feature could generate a whole new class of “ghost” art from Pixel 6 owners to raise awareness: users publishing photos with everyone “erased” and the things left behind, to emphasize the horrors.

Genocide documentation comes to mind, where tables set for dinner were left behind by people abruptly seized and exterminated… perhaps Pixel could even facilitate imagery for the 1838 Trail of Tears, which was initiated with house invasions at dinner time specifically so U.S. soldiers could force as many Americans into internment camps as quickly as possible.

Also worth considering is how Google has run afoul of Illinois privacy law over image processing without consent, and whether more localized on-device processing is meant to help avoid prosecution.

Google’s face grouping tool, which sorts faces in the Google Photos app by similarity, runs afoul of Illinois’ biometric privacy law. The law requires companies to get user consent for the use of such technologies.

I’ve written here before about Civil War-era photograph manipulation, in the context of impersonation. Instead of asking the age-old question of what happens when something you are can be faked or manipulated, Google brings forward an even older question: what if you and all your friends and family can be erased using technology?
