An auditor asks a simple question that can ruin your week: “When your AML model changed, what exactly changed, and who approved it?”
If you don’t have a data science team, “the model” might be a vendor score, a ruleset, or a mix of thresholds and case workflows. Either way, auditors still expect you to prove control, traceability, and testing. That’s what an AML model change log is for.
This guide shows how to document AML model changes in a lightweight way, plus a template you can copy into Google Sheets or Excel.
What “AML model changes” usually mean when you’re not doing ML
In smaller fintechs, marketplaces, iGaming brands, and payments startups, the “model” is often a practical risk engine. Document changes to any of these:
- Rules and thresholds: velocity rules, structuring thresholds, geo rules, “funds-in-funds-out” patterns.
- Risk scoring logic: customer risk tiers, weighting changes, new risk factors.
- Data inputs: adding a new field (device ID, merchant category), changing a data source, fixing a mapping.
- Watchlist/sanctions screening configuration: match thresholds, new lists, tuning false positives.
- Vendor model updates: a vendor releases a new score version, feature set, or calibration.
- Alert routing and case workflow: new queues, new disposition codes, SAR escalation triggers.
If your program is rules-heavy, it helps to think like a regulator: rules are your “instrument panel.” When a dial moves, you log it. For a practical example of how rules evolve over time, see this internal reference on simple transaction monitoring rules for iGaming operators.
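To make "log it when a dial moves" concrete, here is a minimal sketch in Python, assuming your rules can be expressed as named parameters. The rule name and threshold values are hypothetical, but the before/after diff it produces is exactly what belongs in the Description column of your log.

```python
# A minimal sketch: capture a rules-based "model" as plain parameters so any
# change is just a before/after diff you can paste into the change log.
# The rule name and thresholds below are hypothetical, not recommendations.

RULE_BEFORE = {
    "rule_id": "FIFO-NEW-ACCOUNTS",          # hypothetical "funds-in-funds-out" rule
    "window_hours": 24,
    "withdrawal_to_deposit_ratio": 0.90,
    "account_age_days_max": 30,
}

RULE_AFTER = {
    "rule_id": "FIFO-NEW-ACCOUNTS",
    "window_hours": 12,
    "withdrawal_to_deposit_ratio": 0.80,
    "account_age_days_max": 30,
}

def diff_rule(before: dict, after: dict) -> dict:
    """Return only the parameters that changed, as (before, after) pairs."""
    return {
        key: (before.get(key), after.get(key))
        for key in sorted(set(before) | set(after))
        if before.get(key) != after.get(key)
    }

if __name__ == "__main__":
    # Prints: {'window_hours': (24, 12), 'withdrawal_to_deposit_ratio': (0.9, 0.8)}
    print(diff_rule(RULE_BEFORE, RULE_AFTER))
```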
What auditors look for (it’s less math, more discipline)
Auditors don’t need a PhD explanation. They need evidence you’re operating a controlled process:
- A clear before-and-after: what changed, when, and in which environment.
- A business reason: what problem the change solved (noise reduction, new product risk, regulator feedback).
- Testing evidence: even simple “backtesting” on recent cases, alert volume comparisons, or QA checks.
- Approvals: who requested, who reviewed, who signed off (and whether they’re appropriate).
- Traceability: link the change log entry to a ticket, email approval, meeting minutes, or vendor release note.
- Post-change monitoring: did alert volumes spike, did true positives drop, did investigators complain?
If you use a third-party platform, borrow its structure. Product-grade versioning and release notes show the level of clarity auditors like, even if your own log is simpler (see AML Watcher’s change log approach).
The lightweight system: one log, a few attachments, no chaos
Your AML model change log works best when it’s boring and consistent. A simple setup:
- One source of truth: Google Sheet, Excel in SharePoint, or a Notion database.
- One owner: usually Compliance, Risk, or the MLRO (even if Engineering implements changes).
- A rule that nothing ships without a log entry: no exceptions, no “we’ll write it later.”
- A folder structure for evidence: each change gets a small folder with testing notes and approvals.
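If you want the evidence folders to look the same every time, a short script can scaffold them. This is a sketch only, assuming a local or synced drive; the root path and subfolder names are suggestions to adapt to wherever your evidence actually lives (SharePoint, Drive, etc.).

```python
# A minimal sketch of the evidence folder convention: one folder per change ID,
# with consistent subfolders for testing notes, approvals, and release notes.
from pathlib import Path

EVIDENCE_ROOT = Path("aml-change-evidence")   # e.g. a SharePoint-synced folder
SUBFOLDERS = ["testing", "approvals", "release-notes"]

def scaffold_evidence_folder(change_id: str) -> Path:
    """Create one folder per change ID with a consistent internal layout."""
    change_dir = EVIDENCE_ROOT / change_id
    for name in SUBFOLDERS:
        (change_dir / name).mkdir(parents=True, exist_ok=True)
    # A short README reminds whoever drops files here what belongs where.
    (change_dir / "README.txt").write_text(
        f"Evidence for {change_id}: testing notes, approval emails, vendor release notes.\n"
    )
    return change_dir

if __name__ == "__main__":
    print(scaffold_evidence_folder("AML-CL-2025-014"))
```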
If you’re hiring or formalizing responsibilities, it’s useful to define ownership clearly, similar to this iGaming compliance officer job description template.
Auditor-friendly AML model change log template (copy into a sheet)
Use this table as your standard set of columns. It’s intentionally plain.
| Field | What to record | Auditor-friendly tip |
|---|---|---|
| Change ID | Sequential ID (AML-CL-2025-001) | Don’t reuse IDs, even if a change is rolled back |
| Date requested | When it was raised | Match to the ticket created date |
| Date approved | When approval was granted | Approval should be before production deployment |
| Effective date | When it went live | If phased rollout, note the phases |
| Component | Rules engine, vendor score, screening, data pipeline | Make it searchable and consistent |
| Change type | New rule, threshold tweak, bug fix, vendor upgrade | Avoid vague labels like “optimization” |
| Description (before → after) | What changed in one paragraph | Include exact thresholds or logic names |
| Rationale | Why you changed it | Tie to risk assessment, volume, product change, or findings |
| Impact assessment | Expected alert volume, risk coverage, ops impact | Even estimates are better than silence |
| Testing performed | What you tested and on what period | Note dataset dates and sample sizes if you can |
| Results summary | What you saw | “Alerts down 18%, true positives steady” (only if verified) |
| Approvers | Names and roles | Include “prepared by” and “reviewed by” |
| Implementation owner | Person/team who deployed | Helps trace execution and accountability |
| Artifacts/links | URLs to ticket, files, release notes | Keep links stable and access-controlled |
| Post-release check | Date and what you reviewed | Add a 7-day and 30-day check when possible |
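If you’d rather start from a file than type the columns by hand, here is a minimal sketch that writes the header row as a CSV you can open in Excel or import into Google Sheets. The filename is arbitrary; rename the columns to match your own conventions.

```python
# A minimal sketch: create the change log as a CSV with the template columns.
import csv

COLUMNS = [
    "Change ID", "Date requested", "Date approved", "Effective date",
    "Component", "Change type", "Description (before → after)", "Rationale",
    "Impact assessment", "Testing performed", "Results summary", "Approvers",
    "Implementation owner", "Artifacts/links", "Post-release check",
]

if __name__ == "__main__":
    with open("aml_model_change_log.csv", "w", newline="", encoding="utf-8") as f:
        csv.writer(f).writerow(COLUMNS)
    print("Created aml_model_change_log.csv with", len(COLUMNS), "columns")
```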
Want to see what “compliance documentation” looks like in more automated platforms? DataRobot’s overview of generating model compliance documentation is a helpful benchmark for the kinds of evidence regulators expect, even if you’re doing it manually.
Example: one completed entry (what “good” looks like)
This is the level of detail that passes reviews without turning your log into a novel.
| Field | Example entry |
|---|---|
| Change ID | AML-CL-2025-014 |
| Effective date | 2025-12-05 |
| Component | Transaction monitoring rules |
| Change type | Threshold tweak |
| Description (before → after) | “Funds-in-funds-out” rule window changed from 24h to 12h; trigger threshold changed from 90% to 80% withdrawal-to-deposit ratio for new accounts (0 to 30 days). |
| Rationale | New payment method increased rapid cash-out patterns; investigators flagged missed cases in QA sample. |
| Testing performed | Replayed last 30 days of transactions in staging; compared alert volume and dispositions for same population. |
| Results summary | Alert volume +22%; investigator review time +8%; two previously missed suspicious patterns captured in sample review. |
| Approvers | MLRO (review), Head of Risk (approve) |
| Artifacts/links | Ticket, test notes, approval email saved in evidence folder |
| Post-release check | 2025-12-12: alert volume stable; no system performance issues |
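For the “Testing performed” and “Results summary” rows, even a small script makes the staging-replay comparison reproducible. The sketch below assumes you can export alerts from both runs as CSVs with one row per alert and a rule_id column; the file and column names are hypothetical.

```python
# A minimal sketch: compare alert volumes per rule between two replay exports.
# Assumes each CSV has one row per alert and a "rule_id" column (hypothetical names).
import csv
from collections import Counter

def alerts_by_rule(path: str) -> Counter:
    """Count alerts per rule in an exported CSV."""
    with open(path, newline="", encoding="utf-8") as f:
        return Counter(row["rule_id"] for row in csv.DictReader(f))

def compare(before_path: str, after_path: str) -> None:
    before, after = alerts_by_rule(before_path), alerts_by_rule(after_path)
    for rule in sorted(set(before) | set(after)):
        b, a = before[rule], after[rule]
        change = f"{(a - b) / b:+.0%}" if b else "new"
        print(f"{rule}: {b} -> {a} ({change})")

if __name__ == "__main__":
    compare("alerts_before_change.csv", "alerts_after_change.csv")
```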
If you manage cases in a platform, map your log entry to the case lifecycle your team follows. Google’s AML AI documentation on the lifecycle of a risk case is a clear reference for consistent statuses and handoffs.
A no-data-science workflow that still feels “controlled”
You can run this process with Compliance, Ops, and one technical owner:
- Raise a change request (ticket): what’s broken, what’s proposed, what’s the risk of doing nothing.
- Write a one-page impact note: estimated alert volume, investigation workload, customer friction, risk coverage.
- Test in a safe way: staging replay, sampled historical review, or a controlled pilot for a subset of customers.
- Approve with separation of duties: the requester and approver shouldn’t be the same person (a simple automated check is sketched after this list).
- Deploy and log immediately: same-day entry, linked artifacts, noted effective date.
- Post-release monitoring: a 7-day check of volume, hit rate, and investigator feedback.
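Some of these controls can be checked automatically. The sketch below assumes the log is exported as a CSV with the template columns, plus an optional “Requested by” column if you track the requester separately, and flags entries where approval came after go-live, the requester approved their own change, or no evidence links were recorded.

```python
# A minimal sketch of sanity checks over the change log CSV.
# Assumes ISO dates (YYYY-MM-DD) in the date columns and the template column names.
import csv
from datetime import date

def check_entry(row: dict) -> list[str]:
    issues = []
    approved = date.fromisoformat(row["Date approved"])
    effective = date.fromisoformat(row["Effective date"])
    if approved > effective:
        issues.append("approval recorded after the effective date")
    # "Requested by" is not in the base template; add the column if you want this check.
    if row.get("Requested by") and row["Requested by"] == row["Approvers"]:
        issues.append("requester and approver are the same person")
    if not row.get("Artifacts/links", "").strip():
        issues.append("no evidence links recorded")
    return issues

if __name__ == "__main__":
    with open("aml_model_change_log.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for issue in check_entry(row):
                print(f'{row["Change ID"]}: {issue}')
```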
This is also where simple automation helps. If you’re using AI tools to summarize test notes or standardize write-ups, keep the human sign-off explicit. This internal primer on generative AI applications in finance can help you frame AI use in a way that stays audit-friendly.
Mistakes that trigger painful audit follow-ups
A few common traps show up again and again:
- Backfilled logs: entries created weeks later, with fuzzy dates.
- No “before” state: you can’t prove what changed.
- Testing that’s implied, not documented: “we checked it” isn’t evidence.
- Missing approvals: especially for changes that materially affect coverage or customer outcomes.
- No post-release validation: auditors want to see you looked after deployment.
In 2025, regulatory scrutiny keeps rising across many markets, so change-control hygiene matters more than ever (see Accountancy Europe’s overview of new EU AML rules).
Conclusion
If you can’t explain your last AML change in two minutes, you don’t have a change log; you have a memory. A lightweight AML model change log fixes that by making every adjustment traceable, testable, and approved, without needing a data science team.
Set up one sheet, keep evidence links tidy, and treat every change like it could be questioned later, because it probably will.

Adeyemi Adetilewa leads the editorial direction at IdeasPlusBusiness.com. He has driven more than 10 million content views through strategic content marketing, with work trusted and published by platforms including HackerNoon, HuffPost, Addicted2Success, and others.