This page is written for the people who have to answer the hard questions: in-house counsel, privacy officers, compliance managers, and risk committees.
Most professional services firms are not legally prohibited from using cloud AI. But using it without proper controls creates obligations they are not equipped to manage — and exposes their clients to risks those clients never agreed to.
When a law firm or accounting practice uses ChatGPT or Microsoft Copilot (on non-enterprise tiers), client data is processed on shared infrastructure operated by a US company, under that company's terms of service and retention policies.
Every AKYB deployment is single-tenant: your data is processed in an environment that exists only for you, shared with no other customer.
All Alberta healthcare deployments include: data flow diagrams, vendor attestation letter, technical safeguards documentation, and a pre-populated HIA s.64 PIA template for your legal counsel to review and sign. Note: The PIA must be conducted and submitted by the health custodian — AKYB provides the technical evidence to support that process. For on-premise deployments, the PIA answer to "where does health information go?" is "it stays on hardware in this clinic" — the simplest and most defensible answer possible.
There is no specific Canadian law prohibiting law firms from using cloud AI. The issue is professional obligation. Solicitor-client privilege requires that client confidences be protected from disclosure — including disclosure to third parties whose data handling practices the firm cannot control.
When client files are processed by a US AI provider, the firm has created a record outside its custody, subject to laws and policies it did not negotiate and cannot audit.
There is no legal prohibition on oil & gas companies using cloud AI for internal documents. The risk is commercial: exploration datasets, reservoir models, and engineering reports are among the highest-value proprietary assets in the industry. Processing them on shared infrastructure operated by a third party — under that third party's terms, not yours — is a risk assessment question, not a compliance one. Operators with significant proprietary assets consistently reach the same conclusion: this data should not leave their control.
AKYB's deployment architecture — whether on-premise or on the operator's own infrastructure — keeps this data inside your operational environment and outside any third-party system.
CPA Canada professional standards impose strict client confidentiality obligations. Using cloud AI to process client financial records without explicit consent and a documented data processing agreement creates professional liability exposure — not because a specific rule prohibits it, but because the firm cannot demonstrate the controls that confidentiality obligations require. A confidential AI system deployed on infrastructure the firm controls makes those controls straightforward to demonstrate.
This is the most commonly misunderstood aspect of cloud AI compliance for Canadian regulated industries.
The US Clarifying Lawful Overseas Use of Data (CLOUD) Act allows US law enforcement to compel US-incorporated companies to produce data held anywhere in the world — including on servers physically located in Canada.
Microsoft, Google, and Amazon are US-incorporated. Storing your data in their Canadian datacentres does not remove it from CLOUD Act jurisdiction. The company — not the server location — determines legal exposure.
The Alberta OIPC has explicitly flagged the CLOUD Act as a primary risk factor in its AI governance guidance for health custodians.
The only way to eliminate CLOUD Act exposure is to remove US-incorporated companies from the data processing chain entirely.
AKYB's on-premise and client-infrastructure deployment models eliminate CLOUD Act exposure entirely. The cloud deployment model does not — and we are transparent about that distinction with every client.
These are the questions every GC asks before signing. The answers are unambiguous.
The client owns all data at all times. AKYB has no claim on any document, record, or output. AKYB's access during installation and maintenance is governed by a Data Processing Agreement signed before any data is touched.
The hardware is owned by the client (purchased as part of the project). The software uses open-weight models and open-source tooling — no proprietary AKYB licence is required for the system to keep running. If AKYB's engagement ends, the system keeps working.
All embeddings generated from client documents, and any fine-tuned model weights trained on client data, are the exclusive property of the client. AKYB retains no copies. A written destruction certificate is provided upon request at contract termination.
All client data held on AKYB systems during configuration or maintenance is permanently deleted within five business days, following NIST SP 800-88 media sanitization guidelines. A written destruction certificate is provided. All project documentation — architecture diagrams, the SOW, and training materials — belongs to the client.
Where AKYB performs fine-tuning using client data, the resulting model weights are owned by the client and transferred in full at project completion. AKYB retains no copy of fine-tuned weights. This is contractually guaranteed in the project SOW.
Book a scoping call. We'll work through your specific regulatory context, existing infrastructure, and what the right deployment architecture looks like for your situation.
Schedule a Confidential Consultation