AI Compliance · US Accountants & CPAs 18 April 2026

AI Compliance for US Accountants & CPAs 2026:
AICPA, SEC, PCAOB & State AI Laws

Artificial intelligence is now embedded in accounting, audit, tax preparation, and financial advisory workflows across the United States. The compliance frameworks have not moved as fast as the tools — but they are moving. The AICPA has published guidance on AI in audit and advisory work. The SEC has issued statements on AI use in financial reporting and fiduciary contexts. The PCAOB is actively examining how AI is being used in audit engagements. Multiple states have enacted AI laws in the past eighteen months that directly affect how firms use AI in hiring and client-facing decision processes. And when an AI-generated error appears in a financial statement, a tax filing, or an audit opinion, the liability lands where it always has: with the professional who signed off on it.

This applies to your firm if:
  • You use AI tools in audit, tax preparation, financial reporting, or advisory services
  • You are FINRA-registered or SEC-registered and use AI in client-facing investment or advisory processes
  • You use AI in hiring, candidate screening, or employee performance review at your firm
  • You have California, Colorado, Illinois, or New York clients whose personal data is processed by AI tools
  • You serve EU-based clients — the EU AI Act applies to your AI systems used in EU contexts regardless of where your firm is based
See our US AI compliance packages →

AICPA guidance: what professional competence requires for AI

The American Institute of Certified Public Accountants has published guidance confirming that existing professional standards — including the Code of Professional Conduct and Statements on Standards for Accounting and Review Services — apply to AI-assisted work. The AICPA's position mirrors that of every major professional body globally: competence obligations extend to understanding the tools you use.

For AI specifically, AICPA guidance identifies four areas where CPAs must maintain documented practices:

  • Output verification: AI-generated financial analyses, summaries, and computations must be verified against primary source data before inclusion in client deliverables. The verification process must be documented in working papers.
  • Understanding tool limitations: CPAs must understand the training data limitations, known error patterns, and domain-specific weaknesses of AI tools used in client work. General-purpose AI tools trained on broad internet data may not be reliable for specialized tax code interpretation or complex financial calculations.
  • Client data confidentiality: Client financial data uploaded to AI platforms is subject to the same confidentiality obligations as data shared with any third party. CPAs must review vendor data handling terms and obtain appropriate client authorization where required.
  • Documentation: Working papers must reflect how AI was used, what outputs were generated, how they were reviewed, and what the CPA's independent conclusion was.

SEC guidance: AI in financial reporting and investment advice

The Securities and Exchange Commission has been clear about its expectations for AI use in regulated financial contexts. SEC staff have issued guidance and risk alerts covering:

  • AI in financial reporting: Public companies and their auditors using AI in financial reporting processes must ensure adequate controls over AI-generated outputs. Material errors in financial statements caused by AI systems do not constitute a new category of defence — existing standards for financial reporting accuracy and internal controls apply.
  • Investment advisers: SEC-registered investment advisers using AI in portfolio management, client recommendations, or risk assessment must apply their fiduciary obligations to AI-assisted decisions. An adviser cannot discharge fiduciary duty by delegating judgment to an AI tool — the adviser remains responsible for the recommendation.
  • Marketing and communications: AI-generated performance projections, risk disclosures, or investment recommendations in client communications are subject to the same accuracy and fair dealing standards as human-generated content.

Building an AI governance framework for your US accounting practice?

Our AI compliance packages give professional services firms a documented AI policy, data processing review, and verification framework. Fixed price, delivered in five to seven working days.

See the US Compliance Packages →

PCAOB: AI in audit engagements under active oversight

The Public Company Accounting Oversight Board has explicitly identified AI in audit as a focus area for inspection and standard-setting. PCAOB inspectors are examining how audit firms use AI tools in audit procedures, what documentation supports those uses, and whether AI-assisted conclusions meet existing standards for audit evidence sufficiency and appropriateness.

Key PCAOB considerations for audit firms using AI:

  • Audit evidence standards: AI-generated analysis must meet the same standards for sufficiency and appropriateness as any other audit procedure. An AI tool that identifies anomalies or patterns is generating a starting point for audit work, not a conclusion.
  • Auditor judgment: PCAOB standards require that significant audit judgments be made by the engagement partner or engagement team, not delegated to AI. Where AI assists in forming a judgment, the file must document the human review process.
  • Third-party tool validation: Audit firms that rely on AI tools developed by third-party vendors must have a basis for concluding that those tools are fit for purpose in the specific audit context. Vendor marketing claims are not sufficient — independent assessment is expected.

State AI laws: what is already in force

In 2025, US state legislators introduced over 1,100 AI-related bills across all fifty states — and 145 were enacted. Several directly affect accounting practices:

  • Illinois HB 3773 (in force January 2026): Applies to any employer using AI in hiring, promotion, or performance review decisions affecting Illinois employees, regardless of where the employer is based. Written notice to candidates and employees is required. This applies to accounting firms using AI in their own HR processes.
  • Colorado AI Act (in force February 2026): Applies to consequential AI decisions on Colorado residents — including financial decisions. Firms making AI-assisted financial determinations about Colorado residents must provide notice, explanation, and an opportunity to appeal.
  • California ADMT (in force 2026): Automated decision-making affecting California consumers carries notice and opt-out rights. AI-driven financial recommendations or client assessments for California residents are in scope.
  • New York City Local Law 144: Any AI hiring tool used for New York City employees must undergo an independent bias audit. Already in force.
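A first-pass applicability screen over the four laws above can be expressed as a simple mapping from client and employee locations to potentially applicable regimes. This is a deliberately simplified sketch — real scope questions (employee counts, decision types, exemptions) must be confirmed against the statutes themselves:

```python
def applicable_state_ai_laws(client_states: set[str],
                             employee_cities: set[str],
                             uses_ai_in_hiring: bool,
                             makes_financial_decisions: bool) -> list[str]:
    """Rough first-pass screen of the state AI laws discussed above.

    Scope conditions are simplified for illustration; this is not
    legal advice and thresholds/exemptions are omitted.
    """
    laws = []
    if uses_ai_in_hiring:
        laws.append("Illinois HB 3773 (notice for AI in employment decisions)")
        if "New York City" in employee_cities:
            laws.append("NYC Local Law 144 (independent bias audit)")
    if makes_financial_decisions and "CO" in client_states:
        laws.append("Colorado AI Act (notice, explanation, appeal)")
    if "CA" in client_states:
        laws.append("California ADMT (notice and opt-out rights)")
    return laws

# A firm with CA and CO clients, NYC employees, and AI in hiring
# triggers all four regimes in this simplified model.
print(applicable_state_ai_laws(
    client_states={"CA", "CO"},
    employee_cities={"New York City"},
    uses_ai_in_hiring=True,
    makes_financial_decisions=True,
))
```

The point of the sketch is the shape of the analysis: applicability follows the location of clients and employees, not the location of the firm.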

Client data and state privacy laws

US accounting firms using AI platforms that process client personal financial data face a patchwork of state privacy obligations determined by where clients are located:

  • CCPA/CPRA (California): Service provider agreements with AI vendors must include required CCPA contractual provisions. Client rights to know about AI processing of their personal data must be accommodated. Penalties up to $7,500 per intentional violation.
  • Virginia, Connecticut, Colorado, and other states: Comprehensive state privacy laws with consent and disclosure requirements for automated processing of personal data. AI tools processing the personal financial data of residents in these states require documented compliance assessments.

EU AI Act: US firms with EU clients are in scope

The EU AI Act's high-risk classification explicitly includes credit scoring and financial risk modelling. US accounting and financial services firms that use AI in these contexts for EU-based clients are in scope for high-risk obligations from August 2026 — regardless of where the firm is incorporated. The Act follows the location of the client, not the service provider.

What a compliant US accounting practice AI framework requires

  1. AI inventory: Every AI tool documented — what it processes, what client data it handles, where that data is stored, and what the vendor's data handling terms say.
  2. Verification and documentation policy: Written procedures for reviewing AI-generated outputs before delivery to clients or regulatory bodies, with documentation requirements for working papers.
  3. State compliance mapping: Assessment of which state AI laws apply based on client and employee locations — Illinois, Colorado, California, and NYC at minimum.
  4. Vendor data agreements: Service provider agreements with every AI vendor handling client personal data, including CCPA-compliant contractual provisions where required.
  5. EU exposure assessment: For firms with EU clients, determination of whether AI use in financial contexts triggers EU AI Act high-risk classification before August 2026.
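Steps 1 and 4 of the framework above lend themselves to a structured inventory with automated gap-checking. A minimal sketch, assuming a flat per-tool record; the keys, tool, and vendor names are illustrative, not drawn from any regulatory template:

```python
# One entry per AI tool, covering the fields named in step 1 of the
# framework: what it processes, what client data it handles, where
# that data is stored, and the status of vendor-terms review.
ai_inventory = [
    {
        "tool": "(hypothetical) TaxDraft Assistant",
        "vendor": "ExampleVendor Inc.",
        "processes": "Draft responses to client tax questions",
        "client_data_handled": ["names", "income figures", "filing history"],
        "data_storage_region": "US (vendor-hosted)",
        "vendor_terms_reviewed": True,
        "ccpa_service_provider_agreement": True,
    },
]

def inventory_gaps(entries: list[dict]) -> list[str]:
    """Flag entries missing the reviews that steps 1 and 4 require."""
    gaps = []
    for e in entries:
        if not e.get("vendor_terms_reviewed"):
            gaps.append(f"{e['tool']}: vendor data-handling terms not reviewed")
        if e.get("client_data_handled") and not e.get("ccpa_service_provider_agreement"):
            gaps.append(f"{e['tool']}: no CCPA service-provider agreement on file")
    return gaps

print(inventory_gaps(ai_inventory))  # → []
```

Running the gap check on every new tool before it touches client data turns the checklist from a one-time exercise into an ongoing control.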

US Accounting Practice AI Compliance

AI policy, AICPA compliance mapping, state law assessment, vendor data review. Fixed price, delivered in five to seven working days. Built for US CPAs and accounting firms.

See the US Compliance Packages →
Call now
Book a free call