Built for the way regulated
industries actually work.
This page is written for the people who have to answer the hard questions: in-house counsel, privacy officers, compliance managers, and risk committees.
How AKYB helps organizations
get from concern to approval.
Different organizations land on different deployment models, but the approval logic is consistent: reduce external exposure, document the controls, and define contractual responsibility before go-live.
When each deployment model fits
- ✓ On-premise: best when data residency, physical control, or operational sensitivity makes outside processing unacceptable.
- ✓ Client infrastructure: best when the organization already has servers, private cloud, or an internal IT environment that can host the stack.
- ✓ Dedicated cloud: suitable only when single-tenant isolation is acceptable under the client's legal and risk framework.
What AKYB provides for review
- ✓ Deployment architecture and data flow explanation
- ✓ Technical safeguards and access-control documentation
- ✓ Ownership, deletion, and data handling terms
- ✓ Sector-specific compliance support scoped during Discovery
- ✓ Clear statement of what remains client counsel's responsibility
Confidentiality obligations
don't pause for productivity tools.
Most professional services firms are not legally prohibited from using cloud AI. But using it without proper controls creates obligations they are not equipped to manage — and exposes their clients to risks those clients never agreed to.
What "shared AI" means in practice
When a law firm or accounting practice uses ChatGPT or Microsoft Copilot (on non-enterprise tiers), client data is processed on shared infrastructure operated by a US company — outside the firm's control and outside the scope of what their clients consented to. That data:
- ✗ Is processed outside the firm's custody and control
- ✗ May be retained and used to improve the provider's models
- ✗ Is subject to US law enforcement requests under the CLOUD Act
- ✗ Cannot be audited by the client in any meaningful way
- ✗ Was processed without the firm's clients' knowledge or consent
What AKYB's architecture changes
Every AKYB deployment is single-tenant. Your data is processed in an environment that exists only for you:
- ✓ No data shared with or accessible by any third party
- ✓ AKYB contractually prohibited from using your data for any other purpose
- ✓ Full query audit trail — who asked what, when, what the AI returned
- ✓ Deployment environment matches your actual legal and regulatory requirements
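The audit-trail capability above can be pictured with a minimal sketch. This is illustrative only: the field names, record shape, and hashing choice are assumptions for the example, not AKYB's actual logging schema.

```python
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass(frozen=True)
class QueryAuditRecord:
    """One entry in a query audit trail: who asked what, when,
    and a digest of what the AI returned."""
    user_id: str           # who asked
    timestamp: str         # when (UTC, ISO 8601)
    query: str             # what was asked
    response_sha256: str   # digest of the AI's response

def record_query(user_id: str, query: str, response: str) -> QueryAuditRecord:
    # Storing a hash of the response lets reviewers verify what was
    # returned without duplicating sensitive text into the log itself.
    return QueryAuditRecord(
        user_id=user_id,
        timestamp=datetime.now(timezone.utc).isoformat(),
        query=query,
        response_sha256=hashlib.sha256(response.encode("utf-8")).hexdigest(),
    )

rec = record_query("jdoe", "Summarize engagement file", "Summary text...")
print(json.dumps(asdict(rec), indent=2))
```

Records like this answer the reviewer's question directly: for any query, there is a who, a when, and a verifiable fingerprint of the output.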
What the rules actually say.
Healthcare — Alberta HIA
What the HIA requires
- ▸ HIA s.64: Health custodians must complete a Privacy Impact Assessment before any new AI information system goes live.
- ▸ HIA s.69.1: Health information may not be transferred outside Canada without specific authorization or consent.
- ▸ OIPC guidance (Sept 2025): AI tools used by health custodians must be evaluated under HIA s.64 — including AI scribes, clinical decision support, and document retrieval systems.
Why cloud AI fails this test
- ✗ Microsoft Copilot is not certified as HIA-compliant by the OIPC
- ✗ Canadian datacentres do not eliminate US CLOUD Act risk for US-incorporated providers
- ✗ Most small clinics cannot independently complete the required PIA documentation
AKYB's HIA PIA Documentation Support
All Alberta healthcare deployments include: data flow diagrams, vendor attestation letter, technical safeguards documentation, and a pre-populated HIA s.64 PIA template for your legal counsel to review and sign. Note: The PIA must be conducted and submitted by the health custodian — AKYB provides the technical evidence to support that process. For on-premise deployments, the PIA answer to "where does health information go?" is "it stays on hardware in this clinic" — the simplest and most defensible answer possible.
Law Firms — Solicitor-Client Privilege
There is no specific Canadian law prohibiting law firms from using cloud AI. The issue is professional obligation. Solicitor-client privilege requires that client confidences be protected from disclosure — including disclosure to third parties whose data handling practices the firm cannot control.
When client files are processed by a US AI provider, the firm has created a record outside its custody, subject to laws and policies it did not negotiate and cannot audit.
Practical risks
- ▸ US CLOUD Act: law enforcement can compel the provider to produce data without notifying the Canadian firm or its client
- ▸ Provider retention: queries and outputs may be retained beyond the session
- ▸ No audit trail: firm cannot demonstrate to a client or regulator exactly what data was processed and when
Oil & Gas — Commercial Sensitivity
There is no legal prohibition on oil & gas companies using cloud AI for internal documents. The risk is commercial: exploration datasets, reservoir models, and engineering reports are among the highest-value proprietary assets in the industry. Processing them on shared infrastructure operated by a third party — under that third party's terms, not yours — is a risk assessment question, not a compliance one. Most operators with significant proprietary assets reach the same conclusion: that exposure is not acceptable.
AKYB's deployment architecture — whether on-premise or on the operator's own infrastructure — keeps this data inside your operational environment and outside any third-party system.
Accounting & Finance — CPA Standards
CPA Canada professional standards impose strict client confidentiality obligations. Using cloud AI to process client financial records without explicit consent and a documented data processing agreement creates professional liability exposure — not because a specific rule prohibits it, but because the firm cannot demonstrate the controls that confidentiality obligations require. A confidential AI system gives the firm a straightforward way to demonstrate exactly those controls.
What internal reviewers usually need
before they can say yes.
This is the practical checklist most legal, privacy, IT, and operations stakeholders are trying to satisfy during internal review.
Internal approval questions
- ✓ Where is the data processed?
- ✓ Who can access the system and how is access controlled?
- ✓ Is there an audit trail for queries and outputs?
- ✓ Does the client keep ownership of hardware, data, and model artifacts?
- ✓ What happens to data at termination?
AKYB review package supports
- ✓ Architecture review by IT or infrastructure teams
- ✓ Privacy review by internal or external counsel
- ✓ Procurement review of ownership, support, and exit terms
- ✓ Operational sign-off on deployment model and support process
- ✓ Pre-go-live checklist for user access, training, and documentation
Why "Canadian datacentre"
is not the same as "Canadian control."
This is the most commonly misunderstood aspect of cloud AI compliance for Canadian regulated industries.
What the CLOUD Act does
The US Clarifying Lawful Overseas Use of Data (CLOUD) Act allows US law enforcement to compel US-incorporated companies to produce data held anywhere in the world — including on servers physically located in Canada.
Microsoft, Google, and Amazon are US-incorporated. Storing your data in their Canadian datacentres does not remove it from CLOUD Act jurisdiction. The company — not the server location — determines legal exposure.
The Alberta OIPC has explicitly flagged the CLOUD Act as a primary risk factor in its AI governance guidance for health custodians.
What removes CLOUD Act exposure
The only way to eliminate CLOUD Act exposure is to keep US-incorporated companies out of the data processing chain entirely. This means:
- ✓ On-premise hardware owned and controlled by the client
- ✓ Dedicated deployment on the client's own infrastructure
- ✓ Open-weight AI models with no network telemetry
AKYB's on-premise and client-infrastructure deployment models eliminate CLOUD Act exposure entirely. The cloud deployment model does not — and we are transparent about that distinction with every client.
What belongs to you.
These are the questions every GC asks before signing. The answers are unambiguous.
Who owns the data?
The client owns all data at all times. AKYB has no claim on any document, record, or output. AKYB's access during installation and maintenance is governed by a Data Processing Agreement signed before any data is touched.
Who owns the AI system?
The hardware is owned by the client (purchased as part of the project). The software uses open-weight models and open-source tooling — no proprietary AKYB licence is required for the system to keep running. If AKYB's engagement ends, the system keeps working.
Who owns the embeddings and fine-tuned weights?
All embeddings generated from client documents, and any fine-tuned model weights trained on client data, are the exclusive property of the client. AKYB retains no copies. Fine-tuned weights are transferred in full at project completion. A written destruction certificate is provided upon request at contract termination.
What happens at termination?
All client data held on AKYB systems during configuration or maintenance is permanently deleted within 5 business days, following NIST SP 800-88 media sanitization standards. A written destruction certificate is provided. All project documentation — architecture diagrams, SOW, training materials — belongs to the client.
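The deletion commitment above can be illustrated with a hedged sketch of file-level sanitization in the spirit of NIST SP 800-88 "Clear": overwrite, sync, then unlink. This is an assumption-laden example, not AKYB's actual procedure — real sanitization depends on media type (SSDs generally require device-level purge commands, not file overwrites), and 800-88 compliance is verified at the device level.

```python
import os
import secrets

def clear_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's bytes with random data, force the write to
    disk, then remove the file (Clear-level sketch for magnetic media)."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # replace contents with noise
            f.flush()
            os.fsync(f.fileno())                # push past OS write caches
    os.remove(path)
```

The fsync step matters: without it, the overwrite may sit in an OS cache while the original blocks remain on disk.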
What AKYB provides and what
the client still owns.
Strong compliance language only works when responsibility is explicit. AKYB supports the client's review process, but does not replace the client's own legal authority or regulatory obligations.
AKYB provides
- ✓ Architecture design and deployment logic
- ✓ Technical safeguards documentation
- ✓ Data handling, ownership, and deletion terms
- ✓ Implementation evidence to support legal and privacy review
- ✓ Scoped compliance support artifacts during Discovery and delivery
The client or client counsel remains responsible for
- ✓ Final legal interpretation and approval decision
- ✓ PIA submission where required by law
- ✓ Internal policy approval and governance sign-off
- ✓ Determining whether dedicated cloud is acceptable for their obligations
- ✓ User behavior, permissions, and ongoing policy enforcement inside the organization
You're not the obstacle.
You're the approval authority.
Your role isn't to block AI adoption — it's to ensure the organization can adopt it without violating what it owes its clients. Standard tools don't give you what you need to approve. AKYB is built to.
How AKYB is structured
under Alberta privacy law.
These are the questions your legal counsel will ask about AKYB's own obligations — not just your system's architecture.
Is AKYB a health information custodian?
No. AKYB is not a health custodian as defined under HIA s.1(1)(f). The client — physician, clinic, or health services provider — remains the custodian at all times. AKYB acts as an information manager under HIA s.66(1).
What governs AKYB's access to health data?
All health information handling is governed by a signed Information Manager Agreement (IMA) executed before any health information is accessed. This satisfies HIA s.66(2).
What if the OIPC rejects a PIA?
AKYB's contractual obligation is to provide the technical documentation that supports a compliant PIA. If the OIPC requires modifications, AKYB will modify the architecture; changes beyond the original scope are quoted as a change order.
Compliance questions specific to your organization?
Book a private AI assessment and we'll work through your regulatory context, infrastructure constraints, and what approval-ready deployment looks like for your situation.
Book a Private AI Assessment