Face Off: AI’s facial identity tricks and its future uses
Quasimodo (The Hunchback of Notre Dame) and Joseph Merrick (the Elephant Man) would have no trouble getting dates in today’s software-intensive world. Even if accused of “catfishing,” the men could still satisfy their wish for companionship. Innovations in artificial intelligence, facial recognition, and face-substitution software would make at-a-distance relationships possible. Yes, Mr. Merrick could conjure up a Brad Pitt-like visage for his fair conquests to view as he spun his tantalizing tales.
The market for this software, also known as face-swapping apps, includes:
FaceApp — AI face editor
Faceover Lite: Photo face swap
Face Swap Booth — Face changer
Face Swap Live
Reflect: Realistic face swap
“The global facial recognition market size (will) grow from USD 3.2 billion in 2019 to USD 7.0 billion by 2024, at a Compound Annual Growth Rate (CAGR) of 16.6% during 2019–2024. Major growth drivers for the market include increasing users and data security initiatives by government, growing usage of mobile devices, and increasing demand for robust fraud detection and prevention systems.”
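As a sanity check on the quoted forecast, the implied growth rate can be recomputed from the two endpoints using the standard CAGR formula. This is only an illustrative calculation; the small gap between the recomputed figure and the quoted 16.6% likely reflects a different compounding or year-count convention in the original report.

```python
# Sanity-check the quoted forecast: USD 3.2B (2019) -> USD 7.0B (2024).
# CAGR formula: (end / start) ** (1 / years) - 1
start, end, years = 3.2, 7.0, 5  # endpoints from the quoted forecast

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # prints: Implied CAGR: 16.9%
```

The recomputed rate of roughly 17% is consistent with the quoted figure, so the forecast’s arithmetic holds together.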
The market, therefore, is not limited to lonely lotharios or bashful maids casting lingering looks at their cell phones. As connectivity grows and the devices become more widely available, the surge will continue, according to investors. Once the genie is out of the bottle, it will not go back.
The Devil is in the details
Inventors of the first bicycles didn’t realize the dangers of a wheeled mechanism rolling down the roadway without any brakes. The term “header” came from that design catastrophe, and brakes soon became as standard a part of any bike as heaters would be in cars. But facial recognition, facial spoofing, and distortion software have no such built-in safeguards to prevent misuse of the technology and willful deception. How do you hide your face if it is used for financial transactions? Your credit cards are safe and out of sight, but not your face.
Getting a “date” online is one thing; stealing someone’s face to make withdrawals from accounts or purchases is another. One software program, Zao, recently released in China, has drawn sharp criticism for its potential lack of security for personal data, including one’s face.
Slippery-slope thinking in the service of technology may carry the ignorance of the past into the technology of the future. For example, how were criminal types determined by law enforcement, or the public at large, in the 19th century when phrenology came into existence? The belief was that character and personality were revealed by the bumps and other physical features of the head. The system was a favorite of the Nazis during WWII.
Erroneous 19th-century thinking
Scientists made head measurements and detailed maps to ensure a means of analysis that mimicked science. Was it the math that made it seem scientific? Is an algorithm any less likely to make these mistakes? How are the data sets biased, and the results skewed? Where are the brakes? If you don’t know the questions, you’ll never get the answers.
A variant of the theory had its admirers. President Richard Nixon was eager to identify future criminals upon their entry to grade school. Undoubtedly, it was a system intended to single out a specific segment of the population, most probably children of color.
Nixon’s psychiatrist, Dr. Arnold Hutschnecker, was willing to do his bidding. The psychiatrist was a devotee of pre-crime intervention, first theorized by Cesare Lombroso, who held that physical characteristics revealed criminal tendencies. Did he have the means or the science to accomplish this goal, and was it an ethical thing to do?
Dr. Hutschnecker believed criminality was genetic and ran in families. He claimed that he had the science to back up his claims and his plan. The science was never presented. He devised a program where all 6-to-8-year-olds in the nation would be given a psychological test to determine criminal potential.
According to a book written by Norman Denzin, “those who flunk these tests — which have been shown to provide successful individual prognosis slightly over 50% of the time — would be sent to rehabilitation centers ‘in a romantic setting with trees out West,’ as Hutschnecker phrased it. This late, unlamented proposal was sent on White House stationery to the Secretary of HEW with the request for suggestions on how to implement it.”
Does that sound like remediation, or punishment for a crime never committed? Nixon dropped the plan before it ever got off the ground.
Reading emotions not head bumps
Apple has proven to be a company with aspirations beyond the personal computer and entertainment fields. In 2016 it purchased Emotient, a program that uses AI to detect emotion from facial expressions. The software has been used by retailers, physicians, and advertising companies. Apple’s plan for it hasn’t been revealed.
Developed by a research team at the University of California, San Diego, the AI program may find a new use in films. Producers wouldn’t need to ask audience members to fill out comment cards; the software could evaluate audiences’ emotional reactions and do the work in minutes.
Law enforcement in California uses facial recognition AI to search criminal databases. Theoretically, Emotient could be used to evaluate interviews with witnesses or persons charged with a crime. Here, however, any bias baked into the software’s coding may need some braking. Does everyone reveal emotional states in the same way?
Facial recognition, facial emotion evaluation, and face swapping are all barely out of their infancy. Even now, we see how the technology can be used to skew video coverage of political events or news and affect the fate of countries.
Only the future will tell whether we’ve used appropriate prudence with the technology or shown our inability to stop evil or faulty data processing. Are the coders the ultimate authorities, or do we need an ethical guidance system for supervision and evaluation?