March 3, 2026
Am I Allowed to Use This AI Tool?
Legal commentary on using AI tools for clinicians in Canada.
Through TIL, we've been exploring how therapists can best collaborate with AI to improve care.
Yet as we've researched this space, something has become clear: many of us are still in the 'Am I even allowed to use this?' stage. A major friction point is the lack of clarity around legal and contractual responsibility for AI tools in mental health, especially when client data is involved.
This is exactly what motivated us to invite licensed attorneys to share practical legal commentary on one core question:
What due diligence is reasonable (and necessary) before a clinician uses an AI tool in clinical workflows, especially when client data may be involved?
Today's installment includes guest commentary from Richard Stobbe, Licensed attorney at Field Law (fieldlaw.com) in Alberta, Canada. He comments on various considerations to take into account before using AI tools, as well as regional requirements specific to Alberta.
The Contract Reality: Choosing an AI tool is not a single decision. Stobbe recommends clinicians identify exactly what they are using: is it a consumer-grade tool like ChatGPT, a professional healthcare-oriented platform, or a custom-built solution?
The service tier often dictates the level of protection. Many sole practitioners will be tempted to try an entry-level version before committing to a pro version. However, while a professional-grade provider might tailor its terms for compliance, consumer-grade or free versions rarely address clinical risks, and their terms seldom favor the practitioner.
The Label Trap: Do not assume an AI scribe provider will have protocols that align with your professional obligations. Many vendors target specific industries but rely on American (HIPAA) or European (GDPR) standards, and these do not necessarily meet Canadian requirements.
Regional Mandates -- The Alberta Perspective: In Alberta, psychologists acting as custodians under the Health Information Act (HIA) face two primary obligations. (1) Consent: according to guidance from Alberta's Office of the Information and Privacy Commissioner (OIPC), clinicians may need written consent from clients depending on how data is collected. (2) Privacy Impact Assessments (PIAs): introducing an AI scribe is considered a significant modification to an electronic system, and such changes typically trigger a legal requirement to complete a formal PIA.
Choosing an AI tool requires clear thinking. To ensure your choice is one you can defend, make sure to: (1) identify the exact tool and service tier, (2) confirm consent requirements and privacy assessment expectations, and (3) ensure your contract matches the technical reality of the tool.
Thanks for reading! If you learned something useful from this post, help us grow by sharing it with a like-minded colleague.