Prepare all you want, but know it doesn’t matter because your job interview fate doesn’t lie in your hands but the AI behind it. No, it’s not “where do you want to be in five years” or “what positive qualities do you have” but factors that are hidden from you.
Be as gracious and charming as you want; it’s all been decided before you sit down in front of that HR person. Forget the resume, the school, your charitable work, and your close association with your field. Experience, your social network, and everything else doesn’t matter. It’s done well before you ever applied for this job. Hopeless? Maybe or maybe not.
Artificial intelligence has entered the hiring process, and the transition hasn’t been smooth. Both the companies using it and the applicants being interviewed are asking questions about what information is collected and how it’s being obtained.
Applicants, often without their knowledge, are being scored on factors such as facial movements, which the AI programs translate into an “employability score.” The number of companies using this technology for hiring is not insignificant, and the practice has already been the subject of both new laws and lawsuits.
What are the problems?
An estimated 100 companies have already processed over a million job-interview applicants with AI. The program used by many corporations is HireVue, just one of many now on the market, and the process appears to be a simple one.
The algorithm uses videos of applicants to produce an assessment of their employability. However, researchers know that facial expressions alone are a poor indicator of an individual’s ability or experience level.
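To make the critique concrete, here is a minimal, purely hypothetical sketch of how a video-scoring pipeline of this kind might reduce an interview to a single number. HireVue has not published its model, so every feature name, weight, and threshold below is an assumption for illustration only; the point is simply that whatever goes into such a score is learnable, and therefore gameable.

```python
# Hypothetical sketch only: the vendor's actual features and weights are not public,
# so everything here is an illustrative assumption, not a real scoring model.
from dataclasses import dataclass


@dataclass
class InterviewFeatures:
    smile_ratio: float        # assumed: fraction of frames classified as "smiling"
    eye_contact_ratio: float  # assumed: fraction of frames with gaze toward the camera
    speech_rate_wpm: float    # assumed: words per minute from the audio track


def employability_score(f: InterviewFeatures) -> float:
    """Combine facial/vocal features into a single 0-100 score (illustrative weights)."""
    # Penalize deviation from an arbitrary "ideal" speech rate of 150 wpm.
    speech_component = max(0.0, 1.0 - abs(f.speech_rate_wpm - 150) / 150)
    raw = 0.4 * f.smile_ratio + 0.4 * f.eye_contact_ratio + 0.2 * speech_component
    return round(100 * raw, 1)


if __name__ == "__main__":
    candidate = InterviewFeatures(smile_ratio=0.62, eye_contact_ratio=0.78, speech_rate_wpm=138)
    print(employability_score(candidate))  # ~74.4 with these inputs
```

Note what the sketch implies: a candidate coached to smile more and hold eye contact raises the score without becoming any more qualified, which is exactly the objection researchers raise.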
We’ve all heard the expression “You can’t judge a book by its cover,” and it applies to job applicants as well. Plenty of people are very good at maintaining a specific, highly valued countenance, and that alone could deceive the software.
These programs have also shown bias: they have been found to score applicants who are not white, and in some cases not male, less favorably. The programs are designed to homogenize applicants toward what is considered a “traditional” candidate, and those who are not native English speakers may be scored negatively. Is all of this going unnoticed?
Industry experts are raising alarms that this type of interview screening will encourage applicants to try to deceive the program if they know it is being used. The concern is serious enough that the Electronic Privacy Information Center (EPIC) has asked the Federal Trade Commission to investigate at least one of the programs in use, HireVue.
One of the main points of contention is the program’s reliance on the aforementioned facial expressions. Anyone who has studied acting, or had any association with it, will quickly understand the validity of that concern.
Throughout our lives, we are taught to send the “appropriate” behavioral signals and, in doing so, to mislead the viewer about our intentions and personality. One would have to be quite naïve, which AI programs may be, to fall for this deception. Wouldn’t HR professionals be adept at detecting this type of trickery?
Technology, not HR professionals, appears to be the prime mover here, and corporations have bought it hook, line, and sinker. In the process, AI employability scores have potentially discriminated against qualified candidates in favor of those who can game the system with their acting abilities. A charming visage isn’t what it seems to portend.
The programs depend on learned, overt behavioral cues tied to what is considered traditional in the culture. Anyone who falls outside that pattern is penalized, including people with disabilities of any kind; their scores would be lower than those of someone who can mimic the “correct,” acceptable behavioral cues.
The legal implications of AI interviewing
The Electronic Privacy Information Center and others have lodged complaints against the use of AI software in hiring. The problem, however, is that there is little regulation of this industry tool, and companies are reluctant to share any information about their data or its potential bias, even though bias in AI is a well-known problem. These factors have contributed to the lack of significant lawsuits and legislation in most states.
One state, Illinois, has decided to take the bull by the horns. Recognizing that companies are using third-party digital hiring platforms, the state legislature passed the Artificial Intelligence Video Interview Act, which creates disclosure and consent requirements for the use of this technology.
“Video interviews constitute one type of product offered by certain digital hiring platforms. Video interviews may be offered in a variety of forms — from live interviews conducted by a hiring manager but simultaneously recorded for future audiences, to recorded interviews conducted by the computer program, giving applicants a limited time (e.g., 30 seconds) to record an answer to each question. In any recorded form, these digital hiring platforms use artificial intelligence (“AI”) to analyze an applicant’s answers. AI may be used to analyze facial expressions or eye contact, or even the speed of an individual’s response, in order to evaluate the quality of an applicant’s answers.”
The Illinois law requires that any employer using AI-enabled video interview technology must:
1. Notify each applicant before the interview that AI may be used to analyze the applicant’s video interview and consider the applicant’s fitness for the position
2. Provide each applicant with information before the interview explaining how the AI works and what general types of characteristics it uses to evaluate applicants
3. Obtain prior consent from the applicant to be evaluated by the AI program
The law also includes provisions for maintaining applicants’ privacy and limits sharing of the videos to persons qualified to evaluate the applicant’s fitness for the position.
Also, all applicants must be permitted to request that all copies of their videos, and all backups of these videos, be destroyed “no later than 30 days after the applicant requests the company to do so.”
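The statute describes obligations, not software, but a minimal sketch can show what an employer’s interview system might check before running any AI analysis and when honoring a deletion request. All function and field names below are hypothetical assumptions for illustration; nothing here is defined by the Act itself.

```python
# Hypothetical compliance sketch: the Act defines no API; the names and fields
# below are assumptions meant only to mirror the statute's steps.
from datetime import datetime, timedelta

DELETION_DEADLINE = timedelta(days=30)  # videos must be destroyed within 30 days of a request


def may_run_ai_analysis(applicant: dict) -> bool:
    """AI evaluation may proceed only if all three statutory steps were completed."""
    return (
        applicant.get("notified_of_ai_use", False)           # step 1: notice before the interview
        and applicant.get("received_ai_explanation", False)  # step 2: how the AI works, what it evaluates
        and applicant.get("consented_to_ai", False)          # step 3: prior consent
    )


def deletion_overdue(request_date: datetime, now: datetime) -> bool:
    """True if a video-destruction request has gone unfulfilled past the 30-day window."""
    return now - request_date > DELETION_DEADLINE


applicant = {"notified_of_ai_use": True, "received_ai_explanation": True, "consented_to_ai": False}
print(may_run_ai_analysis(applicant))                          # False: consent was never obtained
print(deletion_overdue(datetime(2024, 1, 2), datetime(2024, 2, 15)))  # True: more than 30 days elapsed
```

The design point is simply that notice, explanation, and consent must all be recorded before analysis begins, and deletion requests need a tracked clock, which is why vagueness in the 30-day language matters in practice.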
This law is in keeping with Illinois’s history of protecting employee privacy, e.g., the 2008 Biometric Information Privacy Act (BIPA). That law anticipated the need to protect biometric data and is thought to be the first of its kind in the nation.
Future considerations for AI and employment
Regulation of AI is both complicated and premature in its development, and laws written for AI must establish new language and norms. The Illinois law, for instance, still does not define what artificial intelligence means, nor does it specify what information a candidate must receive in order to fully understand the mechanism used.
The 30-day deletion requirement is also seen as vague and may need to be reworded to bring it into line with other regulations. Even so, the Illinois law provides a template that other states can use to write laws that more adequately address the technological changes that will affect employment in the future.
http://www.drfarrell.net