I lost my job after AI recruitment tool assessed my body language, says make-up artist::A make-up artist says she lost her job at a leading brand after an AI recruitment tool that used facial recognition technology marked her down for her body language.
Absolute bigotry against neurodivergent people. Normalizing body language is exactly the sort of prejudice neurodivergent people have to put up with all the time.
Thinking more on this, you don't even need to have autism or ADHD or any other form of diagnosable neurodivergence. You could just be an introverted person who doesn't do well in such situations.
And then there's the nationality issue. Different nationalities and cultures have different body language. It is disrespectful to make eye contact in Japan and expected in the U.S. So what does this AI do with a Japanese applicant?
These kinds of tools can easily just be the fall guy. An excuse they use to get rid of you. Then if you complain, they can just be like "the AI did it!"
Suddenly, viciously reminded of that quote: "If a machine cannot be held accountable, it cannot be allowed to make a management decision" (paraphrased)
...should prooooobably start legislating that shit
This isn't new. Recruiting firms such as HireVue have been pushing out "AI" interviewing platforms that automatically judge your body language, fashion choices, tone of voice, etc. since at least 2018.
For a brief moment I worked in that industry as a programmer. The whole point is not to find the most qualified candidate but to find the one that fits into the company culture the most in order to reduce turnover. These algorithms will throw away applications from people of color because they have "behaviors not in line with the company culture" or applications from disabled people because they would "not react properly to certain situations".
Of course they aren't explicitly rejecting these people, but the questions and answers on the tests for applications are specifically and painstakingly crafted to filter out these people without making it clear what type of person the question is trying to filter out.
This doesn't necessarily have to do with the AI in question, but my point is that the entire hiring/firing process is totally fucked, and companies are constantly looking for ways to get around discrimination laws.
Using an AI to grade someone's body language seems like a horrible thing.
Although I will say there is some validity to being careful about who you hire, company-culture-wise, and I'm not talking about race, gender, or disability.
We've turned down the 'best programmer' numerous times, some people that really had some solid skills, because they came in aggressive and brash.
One guy got his "sorry, but no thanks" and said, "Look at my resume, I'm an absolute master at everything you do," and he wasn't kidding, he was very good. We told him we recognized his skills but that socially he was difficult and abrasive just in the interviews, and that there was no way we could subject the rest of the company to him. He unleashed a string of profanities and asked couldn't we just have him work somewhere else on his own separate projects. No, that wasn't going to be an option.
Nobody wants to hire somebody who's going to make a workplace toxic. That means sometimes you turn down some of the better-skilled candidates, but you can always find somebody nicer and train/educate them.
As far as race, gender, quirks, we have you meet with everybody, a group takes you out to lunch. You can be shy, flighty, uncomfortable, awkward, the basic test is, can you mostly do the job and would other people want to work with you. And if the people come back with the answer of no, we don't bring you on. We've done that since the very beginning, so everybody there is already pretty much a tolerant nice person.
I had this one guy interview for my department. He made it through the morning interviews with no problems. Gold star. The lunch crew took him out to lunch. He turned it into a people-watching affair and started making horrible comments about all the people coming in the door. One of the strongest personalities I know was at the lunch, and she came back to me and said, "He makes me very uncomfortable." I sent him packing.
I hope we don't get to the point where all jobs are using AI to weed people out without humans checking behind it.
Funny. That sounds exactly like how they tried to use "intelligence tests" to prevent Black people from voting. The questions didn't explicitly exclude Black people, but were written in a vague and subjective way so that the test-giver could claim any answer was right/wrong and thereby exclude anyone they wanted.
They are not using it to stop bias. If history has proven anything, it's that AI is biased as shit. They are using AI to excuse bias, because "computers ergo cold hard logic," while ignoring that the models aren't trained on ethical and moral considerations.
What's twisted is that, if you go way out at the edge of the curve, you'll find people who either actually do enjoy any horrible job you can think of, or else are willing to do an earnest imitation of enjoying it. Large enough employers can search through a large enough pool, using sophisticated enough tools, to find these weirdos and try to employ only the kinds of lunatics who would say "I love this job, and I think a pizza party is an acceptable alternative to a raise!"
Only if the AI used discriminatory criteria from a protected class.
They CAN fire you because they feel you're likely to sue. They can't retaliate against a lawsuit, but there isn't one yet. At-will employment sucks, and the thing that protects against this is a union, not discrimination laws.
It classified her as "most likely to be critical and sue company for ethical violations".
But seriously, I don't know what it is with the AI craze. Today, HAL 9000 seems like a documentary, because that's how most of these AIs behave: highly reliable until they go completely off the deep end and suddenly aren't. They are at their worst when deployed in highly subjective and dynamic situations, like the one mentioned in this article.