There's no right or wrong answer when it comes to a company's use of generative AI tools in the hiring process
Making sense of generative AI can be, to say the least, confusing. “So many benefits and potential disruptions,” tech industry insiders are saying. But then those same insiders are also saying, “So much uncertainty and things that could go wrong.”
With all these mixed signals around generative AI, trying to figure out how and whom to hire — particularly technical talent — is especially confusing for talent acquisition (TA) professionals.
Many companies don’t have firm policies on employees using ChatGPT — and, in many cases, TA leaders are left to navigate AI on their own, while being pressured to grow their teams, yesterday.
This awkward position explains what appears to be a contradiction in the results of our recent talent acquisition survey.
While companies are sorting out their long-term policies, it’s important to commit to one of two approaches for using generative AI, especially when hiring — even if that approach is likely to change.
Let’s dig into “the why” through the lens of two of our customers here at Filtered.
One of our clients, an emerging leader in cloud-based IT services, has a technical team of more than 1,000 engineers.
In the near term, the company projects revenue to grow several times faster than headcount.
To achieve this goal, the company’s technical team uses a variety of cutting-edge tools to increase efficiency – and has readily embraced the emergence of generative AI.
Like many developers, our client’s technical team members use generative AI tools throughout their day-to-day work.
To continue to make strides with efficiency using generative AI, this company encourages candidates to use tools like ChatGPT in technical assessments.
With job simulations, powered by Filtered, hiring teams monitor whether candidates use tools such as ChatGPT, GitHub Copilot, Bard, and others to create code snippets that can plug into solutions as a part of technical skill assessment exercises.
And with our video recording tool, the company can also probe candidates, both technical and non-technical, about how they use these different tools during the assessments — covering topics such as prompt engineering, responsibly sourcing code/libraries from external sources, and quality control.
There are some industries that, rightly, place information security above all else. And because generative AI tools are still immature and largely unregulated, some companies are concerned about the risks of employees feeding proprietary and personal information into these tools.
As a result, employees of these companies, including technical team members, are not allowed to use these tools.
For example, one of Filtered’s customers is a publicly traded financial institution. It is involved in transactions amounting to hundreds of billions of dollars every year and has financial data about millions of individuals.
While known as an innovator in its space, the company does not currently allow the use of generative AI tools, citing security concerns.
When seeking new members for its technical team, the hiring team seeks out development expertise using its approved tools. However, given the industry it operates in, ChatGPT will not be among the approved tools in its tech stack until proper guidelines and regulations are in place.
To be fair and transparent to candidates, the company communicates the rules of the assessment — stating, among other guidance, that developers should not use generative AI tools.
The hiring team then monitors when candidates leave the simulated job interface to visit other sites; the Filtered platform can flag activity that, by this company’s definition, counts as cheating or potential fraud.
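For readers curious how this kind of monitoring can work under the hood, here is an illustrative sketch — not Filtered’s actual implementation, and all names are hypothetical — of browser-side focus tracking. The testable core is a small log that tallies how long a candidate spent away from the assessment tab; in a real browser it would be fed by the standard Page Visibility API.

```typescript
// Hypothetical sketch of assessment-tab focus tracking.
// Not Filtered's API; names are illustrative only.

type VisibilityState = "hidden" | "visible";

interface VisibilityChange {
  state: VisibilityState;
  at: number; // timestamp in milliseconds
}

class FocusLog {
  private events: VisibilityChange[] = [];

  record(state: VisibilityState, at: number): void {
    this.events.push({ state, at });
  }

  // Total milliseconds the candidate spent away from the assessment tab:
  // sum the gaps between each "hidden" event and the next "visible" event.
  timeAwayMs(): number {
    let away = 0;
    let hiddenAt: number | null = null;
    for (const e of this.events) {
      if (e.state === "hidden") {
        hiddenAt = e.at;
      } else if (hiddenAt !== null) {
        away += e.at - hiddenAt;
        hiddenAt = null;
      }
    }
    return away;
  }
}

// In a browser, the log would be wired to the Page Visibility API:
// document.addEventListener("visibilitychange", () =>
//   log.record(document.hidden ? "hidden" : "visible", Date.now()));
```

The design choice here is to separate the pure bookkeeping (which can be tested and audited) from the browser event wiring, so a platform can apply whatever threshold or policy a given company defines on top of the raw data.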
From the start of the interviewing process, companies need to be clear about their policy on generative AI with candidates.
All candidates deserve to understand the full picture of the job they're pursuing — including the tools they are expected to use, or not use, on a daily basis. In fact, candidates told us in a recent survey that clear insight into the tech stack and day-to-day work was the top factor in choosing a job.
Meanwhile, as more companies invest in skills-based hiring tools and processes, TA leaders should focus on assessing the very specific skills they need out of their next hire.
Investing interview time in assessing a candidate’s skill level with generative AI will only create more confusion if your company or industry has not yet approved the use of those tools.
Put another way, don’t hire an electrician if you need a plumber!
Right now, there’s no right or wrong answer when it comes to a company’s use of gen AI tools among its employees.
As discussed above, there’s much to consider with AI in the hiring process. But we do believe that a company’s hiring process today should be consistent with the company’s overall policy today.
That’s the only way to ensure companies land the specific talent and skills they need today. And we can help.
Filtered can help you define and design the assessments that match the skills you need. You can create simulations with the very specific work environment and tools your candidates will experience if they get hired.
We can help you clearly communicate the rules of those assessments so that there’s no ambiguity for candidates about which tools they may use. After candidates complete their assessment, we’ll provide the data and insights to help you evaluate their performance, including how they used different tools during the test.
In the end, we’ll help you find the right talent for the job you need done today.