# Automated Documentation Validation in Your CI/CD Pipeline

## What This Builds
Every time a pull request touches documentation files, a GitHub Actions workflow automatically runs Claude against the changed docs and posts a style review comment on the PR — before any human reviewer sees it. The PR gets a "pre-check" that catches style violations, missing sections, and clarity issues automatically, so your documentation reviews focus on accuracy and completeness rather than formatting.
## Prerequisites

- Docs managed in a Git repository on GitHub (GitLab users can adapt the workflow to GitLab CI/CD)
- Documentation written in Markdown or similar plain text format
- Claude API key (Anthropic account, pay-as-you-go, ~$0.01-0.05 per PR check)
- Comfortable with basic YAML and GitHub Actions concepts
- Cost: Claude API ~$0-2/month for typical PR volume
## The Concept

Think of this as a documentation linter — just as ESLint or the TypeScript compiler checks your code for errors before merging, this workflow checks your documentation for quality before merging. It's like having a colleague who reads every documentation PR the moment it's opened and posts immediate, specific feedback.
The pipeline: PR opened or updated → GitHub Action detects changed .md files → extracts the diff → sends changed content to Claude API with your style rules → Claude returns specific feedback → GitHub bot posts the feedback as a PR comment.
## Build It Step by Step

### Part 1: Set Up Your Claude API Key as a GitHub Secret
- Go to your GitHub repository → Settings → Secrets and variables → Actions
- Click "New repository secret"
- Name: `CLAUDE_API_KEY`; Value: your Anthropic API key
- Click "Add secret"
What you should see: A secret named `CLAUDE_API_KEY` in your repository secrets list.
### Part 2: Create the GitHub Actions Workflow File

In your repository, create the file `.github/workflows/docs-review.yml`:
```yaml
name: Documentation Quality Review

on:
  pull_request:
    paths:
      - 'docs/**/*.md'
      - 'content/**/*.md'
      - '*.md'
    types: [opened, synchronize, reopened]

jobs:
  review-docs:
    runs-on: ubuntu-latest
    permissions:
      pull-requests: write
      contents: read
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Get changed documentation files
        id: changed-files
        run: |
          CHANGED=$(git diff --name-only origin/${{ github.base_ref }}...HEAD -- '*.md' 'docs/**/*.md' 'content/**/*.md')
          # Multiline output needs the heredoc form of GITHUB_OUTPUT
          echo "files<<EOF" >> "$GITHUB_OUTPUT"
          echo "$CHANGED" >> "$GITHUB_OUTPUT"
          echo "EOF" >> "$GITHUB_OUTPUT"
          # Collect the content of each changed file
          CONTENT=""
          for file in $CHANGED; do
            if [ -f "$file" ]; then
              CONTENT="${CONTENT}$(printf '\n\n=== FILE: %s ===\n' "$file")$(cat "$file")"
            fi
          done
          echo "content<<EOF" >> "$GITHUB_OUTPUT"
          echo "$CONTENT" >> "$GITHUB_OUTPUT"
          echo "EOF" >> "$GITHUB_OUTPUT"

      - name: Review with Claude
        id: claude-review
        if: steps.changed-files.outputs.files != ''
        env:
          CLAUDE_API_KEY: ${{ secrets.CLAUDE_API_KEY }}
          DOC_CONTENT: ${{ steps.changed-files.outputs.content }}
        run: |
          # Build the request body with jq so quotes and newlines in the docs
          # are escaped as valid JSON instead of breaking the payload
          BODY=$(jq -n --arg content "$DOC_CONTENT" '{
            model: "claude-3-haiku-20240307",
            max_tokens: 1500,
            messages: [{
              role: "user",
              content: ("Review this documentation PR for quality issues. Return a concise numbered list of: (1) style violations (passive voice, second-person inconsistencies, jargon without definition), (2) missing required sections for the doc type, (3) clarity issues. Keep feedback specific and actionable. If the docs look good, say so. Be encouraging but honest.\n\nDocumentation changes:\n" + $content)
            }]
          }')
          RESPONSE=$(curl -s -X POST https://api.anthropic.com/v1/messages \
            -H "x-api-key: $CLAUDE_API_KEY" \
            -H "anthropic-version: 2023-06-01" \
            -H "content-type: application/json" \
            -d "$BODY")
          FEEDBACK=$(echo "$RESPONSE" | python3 -c "import sys, json; d = json.load(sys.stdin); print(d['content'][0]['text'])")
          echo "feedback<<EOF" >> "$GITHUB_OUTPUT"
          echo "$FEEDBACK" >> "$GITHUB_OUTPUT"
          echo "EOF" >> "$GITHUB_OUTPUT"

      - name: Post review comment
        if: steps.changed-files.outputs.files != '' && steps.claude-review.outputs.feedback != ''
        uses: actions/github-script@v7
        env:
          FEEDBACK: ${{ steps.claude-review.outputs.feedback }}
          FILES: ${{ steps.changed-files.outputs.files }}
        with:
          script: |
            // Read the outputs from env rather than interpolating them into the
            // script, so backticks in Claude's feedback can't break the template
            const feedback = process.env.FEEDBACK;
            const files = process.env.FILES;
            await github.rest.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: `## Documentation Quality Review\n\n**Files reviewed:** ${files.split('\n').filter(f => f).join(', ')}\n\n${feedback}\n\n---\n*Automated review by Claude. This is a first pass — human review is still required.*`
            });
```
Customize the review prompt: replace the instructions in the "Review with Claude" step with your actual style rules. You can make this as specific as your style guide requires.
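For example, a team with a strict voice-and-structure guide might swap in instructions like these (the rules shown are illustrative, not from any particular style guide):

```text
Review this documentation PR against our style guide:
1. Procedural steps must use imperative second person ("Click Save", not "The user should click Save").
2. Every how-to article must include a Prerequisites section before the first numbered step.
3. Flag any acronym that is not expanded on first use.
Return a numbered list of violations, each with the offending quote and a suggested rewrite.
```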
### Part 3: Test the Workflow
- Create a test branch with a documentation change
- Open a pull request to your main branch
- Watch the Actions tab — the workflow should trigger within 30-60 seconds
- After it completes, check the PR for a comment from `github-actions[bot]`
What you should see: A comment on your PR listing any style or quality issues Claude found, or a confirmation that the docs look good.
Troubleshooting:
- Workflow doesn't trigger: Check that your markdown files match the paths listed under `on.pull_request.paths`
- API error in logs: Check that the `CLAUDE_API_KEY` secret is set correctly
- Comment not appearing: Check that the job has `permissions: pull-requests: write`
### Part 4: Tune the Review Instructions
The quality of feedback depends on how well you specify your standards. After a week of PRs, review what Claude flagged:
- Too many false positives? → Make your instructions more specific: "Only flag passive voice in numbered procedural steps, not in introductory paragraphs"
- Missing important issues? → Add them explicitly: "Flag any procedural article that doesn't have a Prerequisites section"
- Wrong tone? → Add: "Be encouraging. Frame feedback as suggestions, not corrections"
Update the prompt in your workflow file and commit the change.
## Real Example: PR Review in Action
Scenario: Your colleague opens a PR adding documentation for a new admin configuration feature.
PR diff includes:

```markdown
## Configuring Integration Settings

The integration settings can be found by navigating to the admin panel.
Users should click on "Settings" and then select "Integrations".

Once this has been done, the API key should be entered.
```
Claude's automated comment:

> ## Documentation Quality Review
>
> **Files reviewed:** docs/admin/integration-settings.md
>
> Issues found:
>
> 1. **Passive voice in steps** — "should be entered" → "Enter the API key" (line 6)
> 2. **Inconsistent person** — "Users should click" uses third person; change to "Click 'Settings'" (line 4)
> 3. **Missing Prerequisites section** — This article doesn't specify what permissions are required to access admin integration settings
> 4. **Vague navigation** — "the admin panel" — specify the exact path: Settings > Admin > Integrations
>
> Overall: Good structure and clear purpose. These four fixes will bring it up to our documentation standards before merge.
Result: Your colleague fixes all four issues before you even see the PR. Your review focuses on accuracy — is this actually how the feature works?
## What to Do When It Breaks
- Workflow runs but Claude returns empty feedback → The content extraction may have failed. Add logging to the "Get changed documentation files" step to verify content is being captured
- Claude hits token limits on large PRs → Add a file-size check that skips files over 10KB, or chunk large files
- Too noisy for developers → Add a condition to post only when there are more than 3 issues, or switch to a "passing" / "review needed" summary instead of listing every issue
- Cost is higher than expected → Claude Haiku is the most cost-efficient model; verify the workflow uses `claude-3-haiku-20240307`
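If you take the chunking route, the split can stay simple. A minimal sketch (the function name and the 10KB default are illustrative; it breaks on paragraph boundaries and lets a single oversized paragraph through as its own chunk rather than splitting mid-sentence):

```python
def chunk_markdown(text: str, max_bytes: int = 10_000) -> list[str]:
    """Split a Markdown document into chunks of at most max_bytes,
    breaking on paragraph boundaries so sections stay readable."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        candidate = f"{current}\n\n{para}" if current else para
        if len(candidate.encode("utf-8")) > max_bytes and current:
            # Adding this paragraph would overflow the chunk: flush and start over
            chunks.append(current)
            current = para
        else:
            current = candidate
    if current:
        chunks.append(current)
    return chunks
```

Each chunk can then be sent as its own API call, with the feedback lists concatenated into one comment.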
## Variations
Simpler version: Skip the CI/CD integration entirely. Create a one-click script (shell or Python) that you run locally on a file to get Claude's review before you even commit. Much simpler; you just lose the automation.
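A minimal sketch of that local script, using only the standard library and the same Messages API call as the workflow (the `REVIEW_INSTRUCTIONS` text is illustrative; it expects `CLAUDE_API_KEY` in your environment):

```python
import json
import os
import sys
import urllib.request

# Illustrative instructions: swap in your own style rules
REVIEW_INSTRUCTIONS = (
    "Review this documentation for quality issues. Return a concise numbered "
    "list of style violations, missing sections, and clarity issues. "
    "If the docs look good, say so."
)

def build_request(doc_text: str, model: str = "claude-3-haiku-20240307") -> dict:
    """Build the Messages API request body for a single-file review."""
    return {
        "model": model,
        "max_tokens": 1500,
        "messages": [
            {"role": "user",
             "content": f"{REVIEW_INSTRUCTIONS}\n\nDocumentation:\n{doc_text}"}
        ],
    }

def review_file(path: str) -> str:
    """Send one Markdown file to Claude and return the review text."""
    with open(path, encoding="utf-8") as f:
        body = build_request(f.read())
    req = urllib.request.Request(
        "https://api.anthropic.com/v1/messages",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "x-api-key": os.environ["CLAUDE_API_KEY"],
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["content"][0]["text"]

if __name__ == "__main__" and len(sys.argv) > 1:
    print(review_file(sys.argv[1]))
```

Run it as `python review_doc.py docs/my-page.md` to get the same feedback you'd see on a PR, before you commit.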
Extended version: Add a second Claude call that checks the documentation against your existing docs library (fetched via API) to flag potential duplicate content or suggest cross-references to related articles.
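Before adding that second API call, a crude local approximation can flag obvious near-duplicates with the standard library alone (the 0.6 threshold is an arbitrary starting point, not a tuned value):

```python
import difflib
from pathlib import Path

def similar_docs(new_text: str, docs_dir: str,
                 threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return (path, similarity) pairs for existing Markdown docs that
    look like near-duplicates of new_text, most similar first."""
    matches = []
    for path in Path(docs_dir).rglob("*.md"):
        ratio = difflib.SequenceMatcher(
            None, new_text, path.read_text(encoding="utf-8")
        ).ratio()
        if ratio >= threshold:
            matches.append((str(path), round(ratio, 2)))
    return sorted(matches, key=lambda m: -m[1])
```

Anything this flags is a candidate for a cross-reference suggestion in the PR comment.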
## What to Do Next
- This week: Deploy the workflow on one repository and monitor the first 10 PRs
- This month: Refine the instructions based on what reviewers say is helpful vs. noisy
- Advanced: Add a second job that validates Markdown syntax, checks for broken internal links, and verifies all code blocks have a language specified — combine style review with technical validation in one PR check
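The internal-link check in that second job could start from a sketch like this (the regex is deliberately simplified: it skips external, `mailto:`, and pure-anchor links, and resolves targets relative to the containing file):

```python
import re
from pathlib import Path

# Matches [text](target) links, skipping http(s)://, mailto:, and #anchor targets
LINK_RE = re.compile(r"\[[^\]]*\]\((?!https?://|mailto:|#)([^)#\s]+)")

def find_broken_links(md_path: str) -> list[str]:
    """Return relative link targets in one Markdown file that don't exist on disk."""
    md_file = Path(md_path)
    text = md_file.read_text(encoding="utf-8")
    broken = []
    for target in LINK_RE.findall(text):
        # Resolve each link relative to the file that contains it
        if not (md_file.parent / target).resolve().exists():
            broken.append(target)
    return broken
```

A second workflow step could run this over the changed files and fail the check, or append the broken targets to the same PR comment.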
This is an advanced guide for technical writing professionals. These techniques use more sophisticated AI features that may require paid subscriptions.