So you say your recruitment software is using AI? It's not!

AI, artificial intelligence, is the hottest of all hype words in the recruitment industry these days. New types of recruitment software keep popping up, claiming to use AI to somehow enhance the recruitment process for recruiters or job seekers. Unfortunately, all too often that's simply not true!

I remember moving to London about 12 years ago and, for the first time, getting to work with a proper ATS (applicant tracking system). It had all the cool features: mass text messages, integration with Broadbean to advertise jobs on multiple job boards with just one click, multiple ways to organise search results, and more.

Once a CV was uploaded to the system, it automatically parsed the CV / resume for keywords. I think either the first or the second recruitment agency I worked for used a system that relied on Daxtra to parse the information from CVs and applications.

Daxtra still operates in the same field, and nowadays they can parse social media profiles, too. You could go to someone's LinkedIn profile, and the system would parse all the relevant information from it just like that. In their own words: "DaXtra Parser supports the parsing of social media content, and the growing list of social CVs and business profiles, including all electronic communication contact types, such as Twitter, Skype, LinkedIn and Xing contact info, amongst others."

Why am I talking about parsing when I should be talking about AI?

You may ask why I'm talking about parsing when I should be talking about AI. Well, parsing and matching go very much hand in hand, and matching is what many "AI-powered" recruitment tools are really all about, rather than any sort of real intelligence.

AI according to Wikipedia: "... the term "artificial intelligence" is applied when a machine mimics "cognitive" functions that humans associate with other human minds, such as "learning" and "problem solving"."

So, a machine (a piece of software) is supposed to be able to learn, solve problems, and work the way a human mind would in a similar situation. It's not about writing algorithms that tell the computer how to act in a given situation, but about the program's ability to learn how to act without anyone specifically telling it.

Lately I've read several organisations' manifestos about their new or upcoming recruitment software using AI to match candidates and jobs without human input. To me, that's not AI, just a good set of matching algorithms.

Once a CV / resume or a social media profile is parsed, the information is saved in a database alongside other similar datasets. Once a job description is typed into the system, the software is then supposed to "automagically" match the position to suitable people already in the database, or compare the job description against any new profile added from that moment on. Magic? No. AI? No. Basic matching algorithms? Yes.
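
To make that concrete, here is roughly what such "automagic" matching boils down to: count how many of the job description's keywords also appear in each parsed profile and sort by the count. This is just a toy sketch in Python; the field names and the sample profiles are my own inventions, not any vendor's actual implementation.

    import re

    def keywords(text):
        """Lower-case the text and split it into a set of simple word tokens."""
        return set(re.findall(r"[a-z]+", text.lower()))

    def match(job_description, profiles):
        """Rank parsed profiles by how many job-description keywords they contain."""
        wanted = keywords(job_description)
        scored = [(len(wanted & keywords(p["parsed_text"])), p["name"]) for p in profiles]
        return sorted(scored, reverse=True)

    # Toy "database" of parsed CVs -- purely illustrative data.
    profiles = [
        {"name": "Alice", "parsed_text": "mechanical engineer London CAD"},
        {"name": "Bob",   "parsed_text": "marketing manager Manchester"},
    ]

    print(match("mechanical engineer needed in London", profiles))
    # [(3, 'Alice'), (0, 'Bob')] -- keyword overlap, nothing more.

No learning happens anywhere in there; the ranking is fully determined by rules someone wrote in advance.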

Let's say I perform a simple search in a CV database or ATS. I type in keywords related to a skill set and a location, and the system returns a list of names and profiles for me to view. The results come back in some default order that I can then play with. Depending on how sophisticated the system is, I might arrange them by the date the profile was entered or last edited, or by when the application came in. I might organise the candidates by location, or put more weight on the skill-related keywords and less on location, salary, or whatever other criteria there may be. Newer profiles, people who live closer to the location, people who are available sooner, and so on then become the top candidates I start reaching out to.
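
That kind of re-ordering can be written down in a few lines as a weighted score. A minimal sketch, assuming invented profile fields and weights I picked out of thin air:

    from datetime import date

    # Invented example profiles -- in a real ATS these fields would come from parsed CVs.
    candidates = [
        {"name": "Alice", "skill_hits": 4, "miles_away": 5,  "last_updated": date(2018, 3, 1)},
        {"name": "Bob",   "skill_hits": 5, "miles_away": 80, "last_updated": date(2016, 6, 1)},
    ]

    def score(candidate, today=date(2018, 4, 1)):
        """Weighted score: skill keywords count for, distance and staleness count against."""
        days_old = (today - candidate["last_updated"]).days
        return 10 * candidate["skill_hits"] - 0.5 * candidate["miles_away"] - 0.01 * days_old

    for candidate in sorted(candidates, key=score, reverse=True):
        print(candidate["name"])
    # Alice first (nearby, fresh profile), then Bob (more skill hits, but far away and stale).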

That's normal; that's how systems are expected to work these days. Not magic, and not AI.

If I were searching for a certain type of engineer and used the keyword "engineer", and the software understood that some people instead write "B.Sc. (eng.)" or "Ing." or "Eng.", or even mistype it as "Ingineer", and that they can be just as relevant matches as the people with the literal keyword "engineer" in their profile, then we'd be getting closer to AI.
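
Even that doesn't necessarily need machine learning, mind you; a hand-made list of variants plus a fuzzy string comparison gets part of the way there. A toy sketch using only Python's standard library (the variant list is my own, not from any real product):

    import difflib
    import re

    # Hand-picked token-level variants that should count as "engineer".
    # Multi-word forms like "B.Sc. (Eng.)" would need phrase-level handling on top.
    ENGINEER_VARIANTS = {"engineer", "eng", "eng.", "ing", "ing."}

    def mentions_engineer(profile_text):
        """True if any token is an engineer variant or a close misspelling of one."""
        for token in re.findall(r"[a-z.]+", profile_text.lower()):
            if token in ENGINEER_VARIANTS:
                return True
            # difflib catches near-misses such as "ingineer" or "enginer".
            if difflib.get_close_matches(token, ["engineer"], cutoff=0.8):
                return True
        return False

    print(mentions_engineer("Senior Ingineer, automotive"))  # True
    print(mentions_engineer("Marketing manager"))            # False

The catch is that someone still has to write and maintain that list by hand; the step towards AI is when the system figures the variants out for itself.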

If the system were smart enough to learn, from all the CVs in the database, that certain words, such as "engineering" in a job title, mean roughly the same thing as an engineering degree, or are at least closely related to it, we'd be talking about AI.
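
One simple way a system could start learning that sort of relatedness from its own data is to count which words keep appearing in the same CVs and treat frequent co-occurrence as a hint that they belong together. A toy sketch with made-up CV snippets; a real system would use far more data and proper word-embedding models, but the principle is the same: the relationship is learned from the data rather than hard-coded.

    from collections import Counter
    from itertools import combinations
    import re

    # Made-up CV snippets standing in for a real database of parsed profiles.
    cvs = [
        "Engineering degree, mechanical engineer, CAD",
        "Senior engineer with an engineering background, automotive",
        "Marketing degree, brand manager, campaigns",
    ]

    def tokens(text):
        return set(re.findall(r"[a-z]+", text.lower()))

    # Count how often each pair of words appears in the same CV.
    pair_counts = Counter()
    for cv in cvs:
        for a, b in combinations(sorted(tokens(cv)), 2):
            pair_counts[(a, b)] += 1

    def related(word, min_count=2):
        """Words that co-occur with `word` in at least `min_count` CVs."""
        return {a if b == word else b
                for (a, b), n in pair_counts.items()
                if word in (a, b) and n >= min_count}

    print(related("engineering"))  # {'engineer'} -- learned from the data, not hard-coded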

If the software understood that, previously, when I've been looking for engineers, I've actually hired people with a certain type of background, education, degree, or job title, and it suggested similar people to me the next time I run a similar search, that would be AI.
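
That's the step where my past hiring decisions become training data. A minimal sketch of the idea, with invented candidates and a simple frequency profile of past hires standing in for a real machine-learning model:

    from collections import Counter
    import re

    def tokens(text):
        return set(re.findall(r"[a-z.]+", text.lower()))

    # Invented history: parsed profiles of people I actually hired for past engineer roles.
    past_hires = [
        "B.Sc. mechanical engineering, CAD, automotive",
        "M.Sc. mechanical engineering, design, CAD",
    ]

    # Frequency profile of what my past hires tend to have in common.
    hired_profile = Counter()
    for hire in past_hires:
        hired_profile.update(tokens(hire))

    def hire_likeness(candidate_text):
        """Score a new candidate by overlap with the traits of people I hired before."""
        return sum(hired_profile[t] for t in tokens(candidate_text))

    candidates = {
        "Carol": "B.Sc. mechanical engineering, CAD, robotics",
        "Dave":  "Sales manager, automotive dealership",
    }
    for name, text in sorted(candidates.items(), key=lambda kv: -hire_likeness(kv[1])):
        print(name, hire_likeness(text))
    # Carol 7, Dave 1 -- Carol looks like the people I've hired for this kind of role before.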

If the system did the matching and the contacting of people automatically, building me a shortlist of verified candidates, that's AI. Link the system to a chatbot that even interviews the candidates at a basic level, and that's AI!

The "matching algorithms" alone don't make AI, so stop saying your system "uses AI", if it doesn't! That's just stupid.

 
