
Privacy, AI and Telephony in Community Housing: What Providers Need to Get Right

“The technology is moving faster than the Privacy Act can keep up.” 

Alex MacDonald, Director, Vixels


The pace of change in communication technology—particularly AI-driven telephony—is outstripping how most organisations think about privacy.

I had the opportunity to hear Alex MacDonald, Director of Vixels (www.vixels.co), speak at a recent event about his work advising companies on privacy, and he generously agreed to discuss it further. Vixels' core service is expert advice on privacy compliance, governance and policy development for emerging technologies, so we explored that in the context of community housing: how providers should approach privacy as voice (telephony), AI and data converge.

Below is a distilled view of the key considerations.

Telephony is now a privacy system (not just a communication tool)

Modern telephony platforms don’t just carry conversations. They can record, transcribe, summarise, and analyse interactions.

The shift is simple but material: voice is now structured data.

From a privacy standpoint, the focus should not be the technology itself, but the controls around it.

Key guardrails include:
• Clear upfront notification that covers analytics as well as recording
• Defined data retention limits aligned to sector obligations
• Role-based access controls
• A clearly documented purpose for data collection

A practical takeaway raised in discussion: where there is uncertainty about privacy obligations, organisations should seek advice early rather than relying on vendor assumptions.

Transparency is the new compliance baseline

A consistent theme was “scope creep”.

Technology is often introduced for one purpose, such as call recording, but quickly expands into sentiment analysis, behavioural insights, and performance monitoring.

The risk is not the capability itself. It is the gap between what organisations are doing and what customers believe is happening.

Best practice includes:
• Aligning messaging with actual system capability
• Updating privacy policies to reflect real use cases
• Ensuring tenants understand how their data is being used 

If AI is extracting insight from conversations, transparency must scale with it.

Treat all communication data as personal information

Even if a call does not explicitly identify a person, it can still be used to do so.

The safest operational model is to treat all voice data, transcripts, and metadata as personal information by default.

This avoids definitional complexity, ensures good governance, and reduces risk exposure.

Minimum viable governance model for CHPs

From a practical standpoint, CHPs should implement:

  • Documented policies outlining why data is collected and how it is used
  • Controlled access to recordings and transcripts through defined hierarchies
  • Data sovereignty checks to understand where information is stored
  • Privacy Impact Assessments (PIAs)
  • Regular audits to ensure policies are followed in practice

A key point is that risk often comes from operational breakdown, not the technology itself.

Vulnerable cohorts require stronger consent frameworks

Community housing and NDIS environments often involve vulnerable individuals.

Organisations must consider:
• Whether the individual has the capacity to provide informed consent
• Whether a guardian or authorised party is required
• How clearly the use of data has been explained

This moves privacy beyond compliance into ethical responsibility.

The biggest emerging risk: purpose drift

One of the most important risks is the shift from primary to secondary use of data. 

For example:
• Recording calls for training → using them for behavioural analysis
• Capturing interactions → applying AI for insights not originally disclosed

While the Privacy Act allows for the secondary use of data, that use needs to be related to the primary purpose of collection. This creates exposure if the secondary use is not clearly communicated or aligned with the original purpose.

Industry perspective: privacy risk is accelerating

“AI is accelerating at a pace that privacy frameworks are struggling to keep up with. Privacy is a key concern for community housing providers managing data for tenants. AI platforms with built-in privacy and ‘human in the loop’ steps and processes offer distinct advantages in this sector.”

— Tier 1 CHP Privacy Officer

This reinforces a key shift: privacy is no longer just about compliance frameworks—it is about how platforms are designed and how organisations operationalise control.

What privacy legislation applies to Community Housing?

Federal framework
• Privacy Act 1988 (Commonwealth)
• Australian Privacy Principles (APPs)
https://www.oaic.gov.au/privacy/the-privacy-act

State-based legislation (government entities)
NSW: https://www.ipc.nsw.gov.au
VIC: https://ovic.vic.gov.au
QLD: https://www.oic.qld.gov.au
ACT: https://www.oaic.gov.au/privacy
TAS: https://www.personalinfoprotection.tas.gov.au
NT: https://infocomm.nt.gov.au

Simple interpretation for CHPs
• Large CHPs (private sector businesses or non-profit organisations with more than $3M annual turnover): Federal Privacy Act applies
• CHPs handling health/sensitive data (regardless of annual turnover): Federal Privacy Act applies
• Government-owned housing entities: State legislation applies
• Smaller CHPs: May fall outside thresholds but should follow best practice

State legislation broadly mirrors the APPs, with some nuanced differences.

Final observation

Technology is no longer the primary risk.

The real risk sits in the misalignment between:
• what the technology is capable of
• what the organisation says it is doing
• what the customer believes is happening

For Community Housing Providers, the opportunity is significant—but only if implemented with clear purpose, strong governance, and full transparency.

As Alex pointed out, “the real risk isn’t the technology itself—it’s how organisations implement, communicate, and govern it”.

Alex MacDonald of Vixels can be reached at alex@vixels.co

About Donnabrook

Donnabrook advises and consults the community, care and health sectors on telephony, supporting selection, transition, transformation and alignment to strategy.

For advice or updates on the emerging capabilities of next-generation telephony, and how they may impact community housing providers or other organisations focused on privacy, governance and efficiency, feel free to reach out: Andrew.olsen@donnabrook.com
