George Gouzounis
  • My Newsletter
  • Insights
    • 2511 The Empathy Protocol
    • 2510 The Elephant In The Room
    • 2509 AI: Buy, Build, or Wait
    • 2508 How AI Will Transform Aged Care
    • 2507 From Policy to Practice
    • 2504 Social Robots in Aged Care
  • Custom GPT Instructions
  • Creative Pursuits
20 October 2025

The elephant in the room: when your staff are using AI in secret (and why)

A veteran aged care worker recently rang me with a story that should be a wake-up call for every board and executive team in the sector. They asked for anonymity – and once you hear what they shared, you'll understand why.

This person has been using AI tools like ChatGPT for a couple of years now, finding them invaluable for managing travel, research, and personal projects. When their organisation started preparing for Support at Home, they spotted a problem: their client assessment tool wasn't fit for purpose under the new service model.

So they did what came naturally – asked ChatGPT to help design something better. They fed it the parameters and criteria for what Support at Home would require. What came back was an excellent starting framework for a more comprehensive, relevant assessment instrument.

Excited by the possibilities, they circulated it to management and colleagues for input.

Management was indifferent; they saw no immediate value.

Their co-workers' reactions ranged from "it will be a threat to my job" to "don't tell management – if they know about AI, they'll just load more work onto us."

A tool that could genuinely enhance client-centred care was viewed with either apathy or fear. There seemed to be no awareness that in our increasingly challenging environment, such technology might actually contribute to organisational viability and allow staff to do what they love most – working directly with clients.

Here's where it gets uncomfortable: they continue to use AI quietly to achieve what they need and free themselves up for direct client work*. But they don't feel they can share this openly within their organisation.

A couple of days after we spoke on the phone, they sent me a link to an article about hospital workforce shortages. In it, one clinician said: "That means you can make eye contact with patients, actually have a genuine conversation, and leave work on time without hours of paperwork waiting for you." They said that's exactly how they feel about using AI.

The elephant is in the room with us

This isn't an isolated case. An MIT study found that whilst only 40% of companies have official AI subscriptions, employees at more than 90% are already using personal accounts for daily work. This unapproved use points to a pattern of stalled formal adoption alongside rapid grassroots uptake.

Your staff are already using AI. They're using it because it works. They're using it because it helps them do their jobs better. They're using it on their personal accounts because they don't have sanctioned tools – and sometimes because they're afraid to ask.

The researchers warn that whilst this boosts productivity, it also raises genuine risks around data security, compliance, and trust between staff and IT teams. These risks exist right now, whether you acknowledge them or not.

Time to lead by example

If your staff are already using AI (and they are), you have two choices:

Option One: Continue pretending it's not happening. Let staff use unsecured personal accounts. Create an environment where innovation has to happen in the shadows. Risk data breaches, compliance issues, and a growing trust gap between the people doing the work and the people managing it.

Option Two: Get ahead of it. Provide proper tools. Create clear policies. Build a culture where staff can openly discuss how technology can enhance care rather than whispering about it in corridors.

The good news is you don't have to reinvent the wheel. The AI Adoption in Aged Care workgroup has developed the GenAI Guidelines specifically for our sector. These provide a practical framework for organisations ready to move from denial to deployment. You can access them for free today.

Your workers want to provide better care. They're already finding ways to do it. Now it's time for organisations, management, and boards to catch up and give them the tools they need to do their work properly – with the security, governance, and support that comes from official adoption.


* It is important to note that they do observe governance principles associated with AI use in a community-based aged care setting, despite the organisation not yet having adopted any guidelines.

P.S. The person who rang me has nicknamed their AI assistant "HAL 9000" after 2001: A Space Odyssey. The irony of naming your helpful AI after a homicidal computer isn't lost on aged care workers.
© 2025 GG 