

Are AI and empathy mutually incompatible?

Posted on 20 November 2025 by Sarah Lardner

We are already starting to see the extraordinary potential of artificial intelligence (AI), but to get the best out of it we need to use it in the right way and approach each use case with curiosity, caution and care.

Why? Because people are not robots. They have good days and bad days, families, complications and responsibilities. They arrive at work with feelings of overconfidence and insecurity, exhilaration and grief, hope and anger. As HR and reward professionals, we need to ensure all the AI-driven innovation we use can live alongside that humanity and strengthen it, rather than aggravate or replace it.

The upside: how AI can complement human empathy

  1. Reducing bias in the hiring process

Properly designed AI can flag bias or help standardise applications, allowing empathetic recruiters more time to make fairer decisions with better data. By acting as a filter for unconscious bias, AI can help HR make inroads into equality in key areas:

Gender - even empathetic recruiters can stereotype or make assumptions around leadership styles, maternity leave or finding the right 'fit' in industries considered male-dominated. AI tools can anonymise CVs and focus on skills and experience (a simple sketch of this step follows these examples), creating a fairer shortlist and giving women and non-binary candidates an equal opportunity.

Age - in the same way, some recruiters might unconsciously assume older candidates are less adaptable, more expensive and lacking in digital savvy. On the flip side, younger candidates may be seen as inexperienced. By highlighting skill relevance and proven achievements rather than age, AI can help unearth capability.

Disability - some disabilities are more visible than others, triggering assumptions around performance, absenteeism or 'complications'. Again, by screening for skills and paying less attention to gaps in education or employment, AI can highlight capability without revealing disability status, giving candidates a better chance.

Ethnicity or background - a name or accent can trigger unconscious bias, and conversely, attending a private school might also lead to negative assumptions. Again, AI's ability to be impartial can help ensure candidates from underrepresented groups get an equal chance.
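To make the blind-screening idea above concrete, here is a minimal Python sketch of the kind of anonymisation step described: personal fields are stripped before a CV is scored or shortlisted. The field names and redaction rules are illustrative assumptions, not the behaviour of any particular tool.

    import re

    # Illustrative sketch only: strip fields that commonly trigger unconscious bias
    # before a CV reaches a reviewer. Field names are assumptions, not a real schema.
    PERSONAL_FIELDS = {"name", "date_of_birth", "gender", "nationality", "photo_url", "school"}

    def anonymise_cv(cv: dict) -> dict:
        """Return a copy of the CV with identity-revealing fields removed,
        keeping only skills, experience and achievements for scoring."""
        redacted = {k: v for k, v in cv.items() if k not in PERSONAL_FIELDS}
        # Also mask the candidate's name inside free-text sections.
        if "summary" in redacted and "name" in cv:
            redacted["summary"] = re.sub(re.escape(cv["name"]), "[candidate]", redacted["summary"])
        return redacted

    cv = {
        "name": "Jane Example",
        "date_of_birth": "1978-04-02",
        "school": "Example Grammar School",
        "skills": ["stakeholder management", "payroll systems", "SQL"],
        "experience_years": 14,
        "summary": "Jane Example has led reward projects across three sectors.",
    }
    print(anonymise_cv(cv))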

  2. Assisting in pay and role evaluation

By standardising how roles are evaluated, AI applies the same rules to all positions. It can analyse the skills, qualifications, responsibilities and market data associated with a role regardless of the individual holding it. This reduces the risk of subjective managerial discretion, or that women or other underrepresented groups might be paid less simply because of negotiation styles, stereotypes or historical bias.
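A simple way to picture this is a single scoring function applied to every role with the same factor weights, regardless of the current post-holder. The factors and weights below are assumptions chosen purely for illustration, not a recommended evaluation scheme.

    # Illustrative sketch only: every role is scored with the same weights,
    # so the evaluation does not depend on who currently holds the role.
    # Factors and weights are assumptions for the example.
    WEIGHTS = {"skills_required": 0.4, "responsibility": 0.35, "market_benchmark": 0.25}

    def evaluate_role(role: dict) -> float:
        """Score a role on a 0-100 scale using identical weights for every position."""
        return sum(WEIGHTS[factor] * role[factor] for factor in WEIGHTS)

    roles = [
        {"title": "Reward Analyst", "skills_required": 70, "responsibility": 55, "market_benchmark": 60},
        {"title": "HR Business Partner", "skills_required": 75, "responsibility": 70, "market_benchmark": 68},
    ]
    for role in roles:
        print(role["title"], round(evaluate_role(role), 1))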

  3. Personalising employee support and freeing time for human connection

By automating many of the more tiresome admin tasks, such as payroll, annual leave requests and rota creation, AI frees up HR time, allowing teams to focus on people and on those high-value conversations.

By analysing engagement surveys, health data or career development needs at scale, the tech can also allow HR to act empathetically and tailor interventions or support where employees need them most.

  4. Helping with employee wellbeing

If used well, AI tools’ predictive insights can enable proactive care and prevention by detecting early signs of burnout, attrition risk or disengagement. Armed with this information, managers can step in and apply empathy before problems worsen.
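As a rough illustration of what 'predictive insight' can mean in practice, the sketch below turns a few engagement signals into human-readable reasons for a manager to check in. The thresholds and field names are assumptions, and the output is a prompt for a conversation, not an automated decision.

    from dataclasses import dataclass

    # Illustrative sketch only: a simple early-warning rule over engagement signals.
    # Thresholds are assumptions; the flags prompt a manager to check in, nothing more.
    @dataclass
    class EngagementSnapshot:
        employee_id: str
        survey_score: float    # 0-10, lower is worse
        overtime_hours: float  # average weekly overtime
        absence_days: int      # days absent in the last quarter

    def burnout_flags(snapshot: EngagementSnapshot) -> list[str]:
        """Return human-readable reasons to check in, not an automated judgement."""
        reasons = []
        if snapshot.survey_score < 5:
            reasons.append("engagement score has dropped below 5")
        if snapshot.overtime_hours > 8:
            reasons.append("sustained overtime above 8 hours a week")
        if snapshot.absence_days >= 4:
            reasons.append("four or more absence days this quarter")
        return reasons

    print(burnout_flags(EngagementSnapshot("E123", 4.2, 10.5, 2)))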

The downside: challenges of using AI

There is one important caveat to all of the above: AI's promise of objectivity only holds true if the training data and algorithms are carefully selected and audited. If historical hiring data is biased, an AI algorithm may simply amplify that bias rather than eliminate it, and this is where the human touch (empathy and ethics) is still essential, even indispensable.

AI may filter out bias, but humans are still needed to interpret and validate its output. So what are the key dangers of AI in those same areas?

  1. Risks in recruitment and onboarding

If poorly trained, AI-driven applicant tracking could default to simple settings, filtering candidates out based on keywords, gaps in CVs or non-traditional career paths. This kind of robotic assessment can overlook human potential, ignore resilience and miss the kind of ‘magic’ cultural fit that a recruiter’s empathy might detect in conversation.
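The sketch below shows how blunt that kind of filter can be: a deliberately naive screening rule, with assumed keywords and thresholds, rejects a strong candidate because of one missing keyword and a career break.

    # Illustrative sketch only: a deliberately naive screen combining keyword matching
    # with a hard "no employment gaps" rule. Keywords and thresholds are assumptions.
    REQUIRED_KEYWORDS = {"payroll", "benefits", "hris"}

    def passes_naive_screen(candidate: dict) -> bool:
        keywords = {k.lower() for k in candidate["keywords"]}
        has_all_keywords = REQUIRED_KEYWORDS <= keywords   # set-subset test
        no_long_gap = candidate["longest_gap_months"] <= 3
        return has_all_keywords and no_long_gap

    candidate = {
        "keywords": ["payroll", "benefits", "reward strategy"],  # no "hris" listed
        "longest_gap_months": 18,                                # career break for caring duties
    }
    print(passes_naive_screen(candidate))  # False: rejected despite relevant experience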

  2. Missing the ‘human side’ of role benchmarking

A role’s value is not only down to the tasks it involves but also how it fits into the overall culture and team dynamic, which AI might miss or undervalue. Invisible or soft skills like empathy, relationship management and coaching – often disproportionately held by women – will inevitably be harder to quantify and value in roles, further entrenching any existing imbalance.

  3. Performance management

There is a danger that AI-led performance monitoring reduces employees to numbers and metrics. By ignoring the context of personal challenges, home lives, creativity or above-and-beyond effort, an algorithm will fall short of a manager’s more personal judgement of how someone is doing.

  4. Employee wellbeing

Chatbots or automated mental health tools can offer generic support but will still miss nuance. If they are relied on too heavily, employees might feel their concerns are not being properly heard or validated.

When extended to conflict resolution, these tensions become even sharper. No amount of AI sentiment analysis will allow a machine to mediate with the sensitivity of a person who can read body language, intent and facial expression.

Summary: Innecto Insight

At Innecto we are helping clients to future-proof their job architecture, skills modelling and competency frameworks, while considering the potential, pace and scale of AI integration. Broadly speaking, this covers three key areas:

  • With AI taking on routine tasks, we explore the transition from traditional role-based pay to models that reward the value of specific tasks and the skills applied to them.
  • We are exploring the shift away from rewarding service, job title and role to compensating proficiency in new and in-demand skills and talents.
  • We are helping our clients look at how they can move away from hierarchies towards more AI-enabled hybrid teams, also exploring how to design and implement group incentives that foster collaboration and shared success.

AI cannot currently replace human judgement, but if it is trained and honed well enough to genuinely augment human understanding, it can enhance our working practices by freeing up more time for us to apply empathy ourselves.
