BID® Daily Newsletter
Sep 30, 2022

Keep Your Eye on AI in Your Hiring Process

Summary: AI can help you handle your CFI’s hiring process by efficiently finding qualified job candidates. Keep in mind, though, that it learns what you teach it. Plan to check AI’s results regularly, respond personally to the candidates you interview, and keep the data AI gathers secure.

Would you go see a movie made by artificial intelligence (AI)? The Crow, a film made with AI text-to-video technology, just won an award at the Cannes Short Film Festival. Creator Glenn Marshall used other videos as an image source for the film, which blends a video of a dancer mimicking a crow with a more painting-based rendering of a crow against a bleak landscape. AI can do some pretty incredible things, so it’s no wonder that it can help you out with ordinary tasks, like sorting through job applicants.
If your community financial institution (CFI) is like many employers, it already uses AI to sort through resumes when you're ready to hire. AI is not a magic wand that solves every human resources problem, but it is a time-saving filter that helps you find people with the right qualifications to interview, freeing your human resources staff for tasks that genuinely need a human touch. If your CFI doesn't use AI in recruitment, it may be giving up an advantage your competitors already enjoy, and that could cost you top talent.
What AI does best

AI excels at saving users from routine, time-devouring tasks. It will happily schedule interviews using automated email, chat, or voice recognition. Ask it to search for particular words on resumes and it will be your best buddy. Or, if you prefer, AI will scan resumes posted to job boards for the keywords you want and alert you when it finds a match.
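For readers who want to picture what that keyword screen looks like under the hood, here's a minimal sketch in Python. It assumes plain-text resumes and an illustrative keyword list; it is not any particular vendor's logic.

```python
# A minimal keyword screen over plain-text resumes. The keyword list and the
# "minimum matches" threshold below are illustrative, not a vendor's actual rules.

REQUIRED_KEYWORDS = {"teller", "cash handling", "customer service"}  # hypothetical

def matching_keywords(resume_text: str, keywords: set[str]) -> set[str]:
    """Return the keywords that appear anywhere in the resume text."""
    text = resume_text.lower()
    return {kw for kw in keywords if kw in text}

def screen(resumes: dict[str, str], keywords: set[str], minimum: int = 2) -> list[str]:
    """Flag candidates whose resumes contain at least `minimum` of the keywords."""
    return [name for name, text in resumes.items()
            if len(matching_keywords(text, keywords)) >= minimum]

# Example:
# screen({"A. Smith": "Five years of cash handling and customer service..."},
#        REQUIRED_KEYWORDS)
# -> ["A. Smith"]
```

Real recruiting tools layer machine learning on top of matching like this, but the basic value is the same: the software reads every resume so your staff doesn't have to.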
AI differs from other kinds of automation technology in that it can be paired with machine learning to actually learn from the ways in which you interact with it. The more you use AI, the more it learns. In theory, it gets better over time.
AI’s pitfalls

In reality, however, AI is a bit like your dog: it learns exactly what you teach it, for better or worse. That works out great when it learns what you intended to teach. It's less excellent when its new behavior reveals a flaw in your own thinking and habits.
For instance, you might direct AI to find you job candidates who share qualities with your best employees. That could work out well if those qualities are mission-critical, and less so if those characteristics are incidental. AI can’t tell the difference. Show AI resumes from only one type of employee, and AI will learn to look for only that kind of job candidate.
In some ways, AI is more objective than people. It doesn't care what school a candidate attended as long as that person has the job skills it seeks. But AI is also good at learning our unconscious biases. In practice, AI tends to exclude women, LGBTQIA+ candidates, members of the disability community, and people who aren't white. Amazon found this out when its AI recruiting tool, trained on historical hiring data, turned out to be biased against women because the company's previous hiring had favored male candidates. A mistake like that can put you on the wrong side of equal employment laws, cause you to overlook some potentially terrific hires, and rob your organization of the diversity that it needs to thrive.
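One rough, widely used way to spot that kind of skew is to compare selection rates across applicant groups against the "four-fifths" rule of thumb from US equal employment guidance. The short Python sketch below illustrates only the arithmetic; the group labels and counts are hypothetical.

```python
# A simple adverse-impact check using the "four-fifths rule": each group's
# selection rate should be at least 80% of the highest group's rate.

def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (candidates selected, total applicants)."""
    return {group: selected / total for group, (selected, total) in outcomes.items()}

def adverse_impact(outcomes: dict[str, tuple[int, int]], threshold: float = 0.8) -> dict[str, float]:
    """Return groups whose selection rate falls below the threshold ratio."""
    rates = selection_rates(outcomes)
    highest = max(rates.values()) or 1.0  # avoid dividing by zero if nothing was selected
    return {group: rate / highest for group, rate in rates.items()
            if rate / highest < threshold}

# Example with hypothetical numbers:
# adverse_impact({"group_a": (30, 100), "group_b": (18, 100)})
# -> {"group_b": 0.6}, a signal the screen deserves a closer look
```

A ratio below 0.8 doesn't prove discrimination, but it is a signal to review both the algorithm and the historical data it learned from.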
Check algorithms

Whether or not your AI system is finding you the job candidates you want, you need to keep an eye on what it's learning from you. Review how your AI is functioning by having a human being, two or three times a year, review a random set of resumes for a job you've recently filled. If a staff member spots qualified candidates that the AI filtered out, then your algorithm needs a tweak. As your dataset gets better and the AI becomes more consistent in delivering the most-qualified candidates, you won't need to audit your AI recruiting process as often.
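If it helps to make that spot check concrete, here is a rough Python sketch of the audit, assuming you've pulled a sample of applications for a filled role and had a reviewer mark each one as qualified or not. The field names are hypothetical.

```python
import random

def spot_check(applications: list[dict], sample_size: int = 25, seed: int = 0):
    """Compare a human reviewer's judgments with the AI screen's decisions.

    Each application dict is assumed to carry two booleans filled in after
    manual review: 'ai_passed' and 'human_qualified'.
    """
    rng = random.Random(seed)
    sample = rng.sample(applications, min(sample_size, len(applications)))

    qualified = [a for a in sample if a["human_qualified"]]
    missed = [a for a in qualified if not a["ai_passed"]]  # good candidates the AI rejected

    miss_rate = len(missed) / len(qualified) if qualified else 0.0
    return missed, miss_rate

# A nonzero miss rate means the filter screened out people a human would have
# interviewed, which is the cue to tweak keywords, training data, or thresholds.
```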
The human touch

One of the ways that AI streamlines some of the tasks around hiring is by sending out automated responses to job candidates. That's fine at the beginning of a search, when you might be fielding resumes from dozens of candidates. It is less fine once you're into the interview process and candidates have invested substantial time and energy in learning about your CFI and shaking hands with possible future colleagues. Every job applicant is a potential customer or future hire, and every applicant deserves the courtesy of a response from a person. Candidates you've interviewed should get a personal response from you.
Make candidates aware

Many state and local governments require employers to tell job applicants that they’re using AI. You might need candidate consent to use facial recognition software or guarantee that you’ll delete applicant data when you’re done with it. Even if no laws require this, treating job candidates with the respect you’d offer future customers or colleagues can generate a lot of community goodwill, which is a crucial ingredient in your CFI’s success.
As a tool, AI works best in combination with the discernment that only a person can provide. AI is great for filtering resumes and simplifying the early stages of recruitment, but it's only as good as the care and attention your CFI puts into ensuring it's actually selecting the best candidates for the job. Your potential hires benefit most from both an automated and a personal touch during hiring, at the right stages and in the right amounts. Keeping AI's pluses and minuses in mind can help you work more efficiently, find gems you might otherwise have overlooked, and create a vibrantly diverse workplace.