Auto-Generate Release Notes from Commit History Using AI
Learn how to use AI to automatically create release notes from commit history efficiently and accurately.

An AI release notes generator built on commit history exists because the status quo for release notes is almost universally bad. Release notes get written from memory, copied from Jira titles, or skipped entirely. When they do exist, they are written in developer shorthand that means nothing to a product manager or customer success team.
The after state is release notes drafted automatically from commit history and PR descriptions, grouped by type, written in plain language, and ready for review before the release tag is created. This guide shows how to build that workflow from scratch.
Key Takeaways
- Commit messages are structured data, not documentation: The AI's job is to interpret intent and translate technical commits into language non-engineers can act on.
- PR descriptions are the highest-signal input: Well-written PR descriptions give the AI context to distinguish a bug fix from a feature addition or a performance improvement.
- Audience segmentation matters: Release notes for the internal team, customer success, and external customers require different language, detail level, and framing.
- Conventional Commits dramatically improve AI output: Teams using `feat:`, `fix:`, and `chore:` prefixes give the AI pre-classified inputs that improve grouping and accuracy.
- The AI still needs a human editor: AI-generated release notes require review for accuracy before reaching customers, especially for security, pricing, or breaking changes.
- Release notes become a knowledge asset: Stored consistently in Notion or Confluence, AI release notes create a searchable product history that reduces support and sales queries.
What Does AI Release Note Generation Do That Commit Message Summaries Can't?
Raw commit log exports give you developer-language output. An LLM interprets the intent behind each commit and produces customer-facing explanations grouped by user impact.
The approach differs from basic summarisation tools in important ways. Standard commit summaries produce entries like "refactor auth middleware to use JWT." That is an internal changelog entry, not a release note. An LLM reads that same commit alongside its linked PR description and produces "Improved login speed for accounts with multi-factor authentication enabled."
- Intent interpretation: LLMs go beyond the commit message text to understand what the change means for the end user.
- Grouping by type: Commit logs are chronological; release notes need grouping by new features, bug fixes, improvements, and deprecations.
- Prioritisation by impact: Not all commits belong in customer-facing notes; the AI filters internal refactors from user-visible changes.
- PR description fallback: When commit messages are inconsistent or sparse, the AI falls back to PR titles and descriptions, which is why PR hygiene matters more than commit hygiene.
Release note generation is a tractable starting point for AI process automation for engineering teams with minimal workflow disruption, because the inputs already exist in your version control system.
What Does the AI Need to Generate Notes Humans Actually Read?
The input design determines output quality. The input hierarchy runs from PR descriptions (highest signal) through commit messages (medium signal) to linked Jira or GitHub issue titles (supplementary signal).
The input design here follows the same principles as the full engineering workflow automation stack: structure the inputs before the AI node, not after. Define your data sources and format requirements before writing a single prompt.
- Commit range fetching: Use the GitHub API endpoint `GET /repos/{owner}/{repo}/compare/{base}...{head}` to retrieve the exact commits between the previous release tag and the current one.
- Conventional Commits advantage: When commits use `feat:`, `fix:`, `perf:`, `chore:`, or `docs:` prefixes, the AI can group entries by type without inferring intent from ambiguous messages.
- Audience specification in the prompt: Define three output sections: internal (technical detail), customer-facing (plain language, user impact), and executive summary (3-5 sentences on what changed and why).
- Structured JSON output: Instruct the AI to return a JSON object with arrays per section type so downstream nodes can route each version to the correct destination.
The workflow works without Conventional Commits, but output accuracy improves significantly when commits are pre-classified by type.
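As a sketch of that structured output, the JSON the AI node returns might look like the example below. The field names and entries here are illustrative assumptions, not a fixed schema; the only requirement is one key per audience section so downstream nodes can route them.

```python
import json

# Hypothetical example of the JSON shape the prompt asks the model to return;
# field names and contents are illustrative, not a fixed schema.
example_output = json.loads("""
{
  "internal_notes": [
    {"type": "fix", "summary": "Refactored auth middleware to use JWT", "pr": 123}
  ],
  "customer_release_notes": [
    {"type": "improvement",
     "summary": "Improved login speed for accounts with multi-factor authentication enabled"}
  ],
  "executive_summary": "This release focuses on authentication performance and reliability."
}
""")

# A downstream node can verify all three sections exist before routing anything.
required = {"internal_notes", "customer_release_notes", "executive_summary"}
missing = required - set(example_output)
assert not missing, f"AI output missing sections: {missing}"
```

Validating the shape before routing means a malformed model response fails loudly at one node instead of producing a half-published release.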
How to Build the AI Release Notes Generator Workflow — Step by Step
The AI release notes generator blueprint provides the full n8n workflow with GitHub, Notion, and Slack integrations pre-configured. The steps below walk through how each piece connects.
Step 1: Configure the Release Trigger and Tag Comparison
Set up a GitHub webhook trigger that fires on the release event and establishes the exact commit range for this release.
- Webhook trigger setup: In n8n or Make, configure a GitHub webhook that fires on the `release` event with action type `published` for the target repository.
- Tag extraction: Extract the new release tag name from the webhook payload to use as the head reference for the commit comparison API call.
- Previous tag fetch: Call `GET /repos/{owner}/{repo}/releases` to retrieve the previous release tag, which becomes the base reference for the comparison range.
- Exact range definition: Comparing two tags gives you the precise commits belonging to this release without guessing based on dates or branch names.
- Prior release exclusion: Tag-based comparison ensures no commits from a previous release accidentally appear in the current notes.
This range definition step is the foundation. An incorrect base tag propagates errors into every downstream note generated from that commit set.
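Outside a visual workflow tool, the previous-tag lookup can be sketched in a few lines of Python. The owner and repo names here are hypothetical, and the sketch assumes a `GITHUB_TOKEN` environment variable; the releases endpoint returns releases newest-first, so the base tag is simply the entry after the current one.

```python
import json
import os
import urllib.request

OWNER, REPO = "acme", "widget-app"  # hypothetical repository names


def fetch_releases() -> list:
    """Fetch the repo's releases, newest first (assumes GITHUB_TOKEN is set)."""
    req = urllib.request.Request(
        f"https://api.github.com/repos/{OWNER}/{REPO}/releases",
        headers={
            "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
            "Accept": "application/vnd.github+json",
        },
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)


def previous_tag(releases: list, current_tag: str) -> str:
    """Given releases newest-first, return the tag published just before current_tag."""
    tags = [r["tag_name"] for r in releases]
    return tags[tags.index(current_tag) + 1]
```

Keeping the lookup logic separate from the HTTP call makes the base-tag selection testable against canned release lists, which matters given how much depends on getting this range right.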
Step 2: Fetch Commits, PRs, and Linked Issues
Build a structured input array from commits, PR descriptions, and linked issues to maximise AI context.
- Comparison API call: Use `GET /repos/{owner}/{repo}/compare/{base}...{head}` to retrieve all commits in the release range.
- PR description fetch: For each commit referencing a PR (for example `(#123)`), call `GET /repos/{owner}/{repo}/pulls/123` to retrieve the full PR description.
- Linked issue fetch: If a GitHub issue number appears in the PR description, fetch the issue body to add another layer of context about the change's intent.
- Structured array build: Assemble an array of objects with fields: commit SHA, commit message, PR title, PR body, issue title, and issue body.
- Completeness as quality lever: The completeness of this array determines how much context the AI has; missing PR bodies are the most common cause of thin release note entries.
This structured array is the AI node's input payload. Its quality directly determines whether the output is useful or generic.
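A minimal sketch of the array-building step, assuming squash-merge commit messages that end in a `(#123)`-style PR reference and pre-fetched PR and issue lookups keyed by number (the `linked_issue` field is a hypothetical convenience, not part of the GitHub API response):

```python
import re


def pr_number(commit_message: str):
    """Extract a trailing PR reference like '(#123)' from a squash-merge message."""
    match = re.search(r"\(#(\d+)\)", commit_message)
    return int(match.group(1)) if match else None


def build_context(commits, prs_by_number, issues_by_number):
    """Assemble the AI input payload from commits plus any linked PRs and issues."""
    records = []
    for c in commits:
        num = pr_number(c["commit"]["message"])
        pr = prs_by_number.get(num, {})
        issue = issues_by_number.get(pr.get("linked_issue"), {})
        records.append({
            "sha": c["sha"],
            "commit_message": c["commit"]["message"],
            "pr_title": pr.get("title", ""),
            "pr_body": pr.get("body", ""),
            "issue_title": issue.get("title", ""),
            "issue_body": issue.get("body", ""),
        })
    return records
```

Defaulting missing PR and issue fields to empty strings keeps the payload shape stable, which makes thin entries (the missing-PR-body failure mode above) easy to spot in the output.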
Step 3: Write the Release Note Generation Prompt
Structure the prompt to define the AI's role, the product context, and the three required output sections.
- System message role: Define the AI as a technical writer translating engineering changes into release notes for three distinct audiences.
- Product context: Include the product name, what it does, and who the users are so the AI can frame changes from a user perspective.
- Three output sections: Specify `internal_notes` (technical detail, no length limit), `customer_release_notes` (plain language, grouped by feature/fix/improvement), and `executive_summary` (3-5 sentences).
- JSON output instruction: Instruct the Claude API or OpenAI API to return a JSON object with one key per section for clean downstream routing.
- Commit exclusion rules: Add explicit instructions for `chore:` and `docs:` commits, which are typically excluded from customer-facing output entirely.
Add the exclusion rules in the system message, not the user message. Exclusion logic applied consistently at the system level produces more reliable filtering than per-request instructions.
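A sketch of what that prompt structure could look like in code. The product description is a placeholder you would replace with your own, and the exact wording is a starting point to iterate on, not a tested final prompt:

```python
import json

# System message: role, product context, output contract, and exclusion rules.
# "Acme Widget" and its description are hypothetical placeholders.
SYSTEM_PROMPT = """\
You are a technical writer who translates engineering changes into release notes
for three audiences. The product is Acme Widget, a scheduling tool for field
service teams.

Return a single JSON object with exactly these keys:
- "internal_notes": technical detail for engineers, no length limit
- "customer_release_notes": plain language, grouped by feature / fix / improvement
- "executive_summary": 3-5 sentences on what changed and why it matters

Exclusion rules (apply always):
- Never include commits prefixed chore: or docs: in customer_release_notes.
- Omit internal refactors with no user-visible impact from customer_release_notes.
"""


def build_user_message(records: list) -> str:
    """Render the structured commit/PR array from Step 2 as the user message."""
    return "Generate release notes for these changes:\n" + json.dumps(records, indent=2)
```

Note the exclusion rules live in the system message, matching the advice above: they apply to every request rather than being restated per release.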
Step 4: Route Each Audience Version to Its Destination
Parse the JSON output and route each section to its intended destination independently.
- Internal notes routing: Post `internal_notes` to the engineering Slack channel and append to the repository's CHANGELOG.md via a GitHub API file update.
- Customer notes routing: Post `customer_release_notes` to the product update Notion page or Confluence release notes space using their respective APIs.
- Executive summary routing: Send `executive_summary` as a formatted Slack message to the product and leadership channel with release tag and date metadata.
- Airtable release record: Add the full structured JSON output to an Airtable base as a searchable release record for sales and support teams.
- Record searchability: The Airtable record becomes the release history that sales and support can query without contacting the engineering team.
Route sections in parallel where possible. Sequential routing adds latency that delays the engineering channel notification during an active release window.
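The parse-and-route step can be sketched as a small dispatcher. The `destinations` mapping stands in for whatever posting functions your stack provides (Slack webhook, Notion API call, and so on); a thread pool sends the sections concurrently rather than one after another:

```python
import concurrent.futures
import json


def route_release_notes(ai_output_json: str, destinations: dict) -> None:
    """Parse the AI's JSON output and send each known section to its destination.

    `destinations` maps a section name (e.g. "internal_notes") to a callable
    that posts that section -- a Slack, Notion, or Airtable sender in practice.
    """
    sections = json.loads(ai_output_json)
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [
            pool.submit(destinations[name], content)
            for name, content in sections.items()
            if name in destinations  # ignore sections with no configured target
        ]
        for f in futures:
            f.result()  # re-raise any posting error instead of failing silently
```

Because each section is an independent submit, a slow Notion write no longer delays the engineering Slack notification.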
Step 5: Flag Entries That Require Human Review
Add a classification node that scans customer_release_notes for keywords indicating sensitive changes requiring review.
- Keyword scan: Check for terms including "breaking," "deprecated," "removed," "pricing," "security," "authentication," and "data" in the customer-facing section.
- Review flag: For any entry containing these terms, add a review flag field to the Airtable record before any external publication step runs.
- Slack notification: Post a Slack message to the product lead with the flagged entry highlighted and a 24-hour review deadline before external publication.
- Publication gate: Hold customer-facing notes in "pending review" status until the product lead approves or edits the flagged entry.
- Build time: The flag step takes under 10 minutes to build and eliminates a meaningful publication risk on every sensitive release.
This step prevents a security-relevant or breaking change from reaching customers with incorrect framing. The risk is real; the build cost is minimal.
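The keyword scan itself is genuinely small. A sketch using plain substring matching, which deliberately over-flags (for example, "data" also matches "database"): for a review gate, a false flag costs a minute of reviewer time, while a missed flag risks a bad customer-facing note.

```python
# Terms from the checklist above; extend to match your product's risk areas.
SENSITIVE_TERMS = {"breaking", "deprecated", "removed", "pricing",
                   "security", "authentication", "data"}


def needs_review(entry: str) -> bool:
    """Flag a customer-facing entry if it mentions any sensitive term."""
    text = entry.lower()
    return any(term in text for term in SENSITIVE_TERMS)


def flag_entries(entries: list) -> list:
    """Annotate each entry with the review flag written to the Airtable record."""
    return [{"text": e, "needs_review": needs_review(e)} for e in entries]
```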
Step 6: Test and Validate AI Release Note Quality Before Going Live
Run the workflow against historical releases with existing human-written notes before enabling live generation.
- Historical test set: Use the last 5-10 releases where notes were already written manually as the validation dataset for side-by-side comparison.
- Line-by-line comparison: Compare AI-generated customer-facing notes against the human-written version for accuracy of user-impact identification and language quality.
- Exclusion accuracy check: Verify the AI correctly excludes internal refactors and dependency updates from the customer-facing section in every test case.
- Accuracy target: Hit 80% accuracy on user-impact categorisation before enabling live generation for real releases.
- Common failure fix: The most frequent early failure is `chore:` commits appearing in customer-facing output; fix with explicit prompt exclusion rules, not post-generation filtering.
Fix prompt issues before going live rather than building post-generation filters. Downstream filters mask prompt problems rather than solving them.
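The 80% target is simple to measure once you have human labels for the historical test set. A sketch, assuming you have labelled each entry's user-impact category by hand and collected the AI's category for the same entries:

```python
def categorisation_accuracy(predicted: list, expected: list) -> float:
    """Share of entries where the AI's user-impact category matches the human label."""
    if len(predicted) != len(expected):
        raise ValueError("predicted and expected label lists must align 1:1")
    matches = sum(p == e for p, e in zip(predicted, expected))
    return matches / len(expected)
```

Run this per historical release and only enable live generation once the aggregate clears your threshold.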
How Do You Connect Release Note Generation to Error Log Context?
Bug fix entries are more credible when they reference the specific error that was fixed. Connecting the error log workflow to release note generation provides that specificity automatically.
The AI error log analysis workflow produces the bug-specific context that makes fix entries in release notes genuinely informative, rather than generic summaries that could apply to any release.
- Error-linked fix descriptions: When a bug fix PR references a Sentry issue that has an attached error log analysis, that analysis becomes an input to the release note for that fix.
- Customer communication value: "Fixed an authentication timeout error affecting accounts with SSO enabled" outperforms "Fixed auth bug" in usefulness, credibility, and customer trust.
- Airtable as the integration layer: Store error log analysis outputs in Airtable with a release-tag field so the release note workflow can query them during generation.
- Specificity from log data: Error log analysis provides the technical context that lets the AI write fix descriptions precise enough to be genuinely useful to affected users.
The AI error log analyzer blueprint includes output schema documentation showing how to pass error analysis into the release notes workflow, so you can connect both systems without rebuilding the data model.
How Do You Connect Release Notes to Broader Process Documentation?
Release notes generated consistently become a structured product history. Process documentation automation can draw from that history to produce reports, changelogs, and support content without additional manual work.
The AI process documentation automation guide covers how generated release notes feed into a broader documentation maintenance workflow that keeps knowledge bases current without requiring a dedicated technical writer for every update.
- Metadata-tagged storage: Store release notes in Notion or Confluence with tags for release version, date, affected components, and audience type so they are queryable by support and sales.
- Quarterly summary generation: AI can use the release notes history to generate "what changed in the last quarter" reports or feature changelog summaries for sales enablement materials.
- Support ticket reduction: Well-structured release notes reduce the volume of tickets about "when did X change" because the answer is findable without contacting the engineering team.
- Knowledge base currency: Consistent release notes give AI documentation tools a reliable source to pull from when updating help articles or product documentation after each release.
The process documentation generator blueprint shows how to use the release notes archive as a source for automated changelog and documentation updates, connecting the release workflow to a living knowledge base.
What Must Editors Review Before Release Notes Go to Customers?
AI-generated release notes require a defined human review gate before external publication. The review is not optional, and the checklist is not long.
Five areas require editor attention before customer-facing notes are published. Skipping any of them introduces risk that the workflow's automation cannot catch.
- Accuracy check: Verify each entry correctly describes what the change does, because the AI can misinterpret PR descriptions, especially for complex refactors.
- Audience appropriateness: Confirm that infrastructure changes with no user impact have not leaked into the customer-facing section, which happens when commit messages are ambiguous.
- Breaking change verification: Any entry involving a deprecated feature, removed API endpoint, or changed behaviour must be verified by the engineer who made the change.
- Tone and brand voice: AI defaults to neutral technical writing; the editor's job is to make the notes sound like the product team, not an automated changelog system.
- Legal and security review trigger: Any entry touching security, data handling, or pricing must go through a legal or security review before publication; AI classification is a flagging tool, not a compliance gate.
Editors should expect the review to take 10-15 minutes per release once the workflow is well-tuned. Early releases may require more intervention as you refine the prompt.
Conclusion
An AI release notes generator from commit history solves the documentation debt that compounds across every release cycle. When built with the right input structure and a clear human review gate, it produces notes that are more consistent and more useful than what most teams currently publish manually.
Before building, check whether your team uses Conventional Commits. If not, adopt the standard for the next sprint before writing a single workflow node. Commit format quality is the single biggest lever on output quality, and it costs nothing to change.
Want Release Notes That Write Themselves Before the Release Ships?
Most engineering teams accept poor release notes as a fixed cost of shipping software. They don't have to be.
At LowCode Agency, we are a strategic product team, not a dev shop. We design and build AI automation workflows that connect your existing tools and generate documentation assets that used to require manual effort after every release. Release note generation is one of the highest-ROI automations for teams that ship frequently, because the compounding benefit grows with every release cycle.
- Trigger design: We configure GitHub webhook triggers that fire on the exact release events your workflow needs to process.
- Multi-source input fetching: We build the commit, PR, and issue fetching layer that gives the AI the full context to generate accurate notes.
- Prompt engineering: We write and iterate the prompts that produce consistent, audience-appropriate output across every release.
- Audience routing: We configure Notion, Confluence, and Slack routing so each version of the notes reaches the right team automatically.
- Review gate automation: We build the keyword-based flagging system that routes sensitive entries to the right reviewer before publication.
- Historical validation: We test the workflow against your existing release history to hit accuracy targets before going live.
- Ongoing optimisation: We monitor output quality and refine the prompt as your product and commit patterns evolve over time.
We have built 350+ products for clients including Coca-Cola, American Express, and Medtronic. Our AI agent development services include release note generation workflows built for GitHub, Notion, Confluence, and Slack environments. Start the conversation today and we'll build a release documentation workflow that ships alongside your next release.
Last updated on April 15, 2026.