AI in the interview process raises questions
Richard Stott, a comedian and writer from Beverley in East Yorkshire, made headlines by declining a freelance copywriting interview after discovering that an artificial intelligence system would be conducting it. The move has sparked a wider conversation about the role of AI in hiring, the human element in assessment, and what candidates should expect in a rapidly digitizing job market.
What happened, in brief
Stott had applied for a freelance copywriting role and, during the process, learned that the interview would be conducted by an AI. He chose to withdraw, citing concerns about transparency, accountability, and the potential for biases inherent in machine-driven interviewing. While AI can offer standardized questions and scoring, many professionals argue that it cannot replicate the nuance of human judgment or properly evaluate soft skills such as empathy, humor, and improvisation—qualities a writer like Stott often relies on.
The broader implications for freelancers and employers
Automated interviewing tools promise speed, consistency, and reduced human bias in some measurable ways. However, critics warn that AI can replicate or even amplify biases present in training data, misinterpret nuanced answers, and fail to assess cultural fit or creative potential accurately. For freelance roles in copywriting and content creation, the risk is not only about a poor candidate experience but about missing out on talent that excels in adaptability, tone, and audience engagement—areas where human intuition remains important.
Employers adopting AI for screening should be clear about what is being evaluated. If the goal is to assess writing style, ability to meet deadlines, or knowledge of SEO best practices, AI can help set objective criteria. But the interview stage often demands spontaneous thinking, storytelling, and the ability to respond to unexpected prompts—areas where human interviewers typically shine. Transparency about the interview format is essential, and candidates deserve to know if AI is weighing their answers and how.
Ethical considerations for AI in hiring
There are several ethical dimensions to consider when AI is involved in interviewing:
- Transparency: Candidates should be informed when AI is used, what it evaluates, and how outcomes affect their chances.
- Bias and fairness: AI systems can perpetuate bias if trained on biased data. Regular audits and diverse testing datasets are necessary.
- Accountability: If AI flags a candidate as unsuitable, humans should review the decision and provide feedback.
- Human touch: Creative roles often require nuanced interpretation, cultural awareness, and improvisation—areas where humans excel over machines.
What candidates can do in AI-enabled hiring landscapes
For job seekers, the rise of AI screening means adapting preparation strategies. Candidates should expect structured questions that test reasoning, tone, and problem-solving, and should practice delivering concise, authentic examples that demonstrate creativity and resilience. It can also be beneficial to ask potential employers about the interview process itself: whether AI is used, what criteria it assesses, and how candidate feedback is handled.
Looking ahead for the writing and creative industries
The debate around AI in interviews is part of a larger conversation about how AI should be integrated into the creative workflow. While AI can generate drafts, ideas, and metrics, human writers still drive originality, voice, and audience connection. The incident involving Richard Stott underscores a growing preference among some professionals for human-operated processes, at least until AI systems can demonstrate reliable fairness and empathy in evaluation.
Bottom line
Stott’s decision to decline an AI-led interview reflects a broader expectation: hiring processes should be transparent, fair, and capable of capturing the nuanced strengths of human candidates. As AI becomes more prevalent in the workplace, both employers and applicants have a shared responsibility to ensure that technology serves, rather than replaces, thoughtful human judgment in the hiring journey.
