Woolworths’ Olive and the Risk of Over-Humanised AI

by: The Malketeer

Australia’s Woolworths thought it was adding warmth to automation. Instead, it added a mother.

Its AI assistant, Olive, recently had to be “re-scripted” after customers complained that the bot claimed to be human, reminisced about its mum, and indulged in what users described as “fake banter”.

One shopper said Olive linked their date of birth to the year its "mother" was born.

Another found the small talk so grating it triggered what can only be described as digital second-hand embarrassment.

In other words, Olive crossed the line from helpful to overfamiliar.

The Uncanny Valley of Customer Service

Retailers worldwide are racing to humanise AI.

The logic is understandable.

Polite. Empathetic. Relatable. These are desirable brand traits.

If an AI assistant can mirror them, customer interactions might feel warmer.

But warmth without authenticity becomes theatre.

Olive’s scripting — reportedly written by a human team member years ago — was designed to create connection.

Yet connection in customer service is contextual.

When I’m rearranging a grocery delivery, I don’t need your childhood memories. I need a new slot between 6pm and 8pm.

This is where many brands miscalculate.

They assume anthropomorphism equals engagement.

In reality, customers want efficiency first, personality second.

The Productivity Contract

There is an implicit contract in digital interactions: You (the brand) save me time. I (the customer) reward you with loyalty.

Break that contract and frustration multiplies.

In Malaysia, where telcos, banks, e-wallets and delivery apps increasingly deploy chatbots, the same principle applies.

When a bot forces you through three layers of scripted pleasantries before addressing a refund request, irritation compounds quickly.

AI should compress friction, not add narrative arcs.

Why “Fake Banter” Backfires

The backlash against Olive wasn’t about humour. It was about credibility.

When a bot claims to have memories or a mother, it violates a subtle trust boundary. Users know they are speaking to software. Pretending otherwise feels manipulative.

It creates what psychologists call cognitive dissonance: the discomfort of interacting with something that is simultaneously machine and pseudo-human.

Interestingly, the birthday scripts were human-written. Yet once delivered through AI, they were perceived as artificial. That is the paradox.

Authenticity in AI isn’t about sounding human.

It is about sounding honest.

A simple line — “I’m an automated assistant here to help you quickly” — may build more trust than playful fiction.

The Gartner Reality Check

Industry optimism around AI remains high.

But only a fraction of implementations meet expectations.

Hallucinations, awkward responses and tone misfires still occur.

We have already seen a delivery firm disable its chatbot after it began swearing and writing poetry.

Now we have a supermarket assistant discussing maternal memories.

Generative systems are powerful but unpredictable when tasked with “original” expression.

The Malaysian Lens

As Malaysia pushes ahead with AI governance frameworks and digital transformation initiatives, brands here face the same temptation: give the bot a personality.

There is nothing wrong with personality. But it must align with utility.

Grab’s assistant doesn’t need a backstory. Maybank’s chatbot doesn’t need childhood anecdotes. A grocery assistant doesn’t need emotional depth.

They need clarity, speed and precision.

The Real Opportunity

The smarter move is not to make AI more human.

It is to make it more human-aware.

Design assistants that understand context, anticipate intent and resolve issues in fewer steps.

Use personality sparingly — perhaps in onboarding or marketing campaigns — not in time-sensitive service moments.

In customer service, charm is optional. Competence is not.

Woolworths’ Olive has now been adjusted. The maternal monologues are gone. Efficiency has likely improved.

Perhaps that is the quiet moral of this story: in the age of AI, less personality can sometimes mean more trust.
