Revit Automation with Claude Code Skills: A Practical Guide for Architects
Build Claude Code skills that automate Revit workflows - family management, schedule exports, model audits, and Dynamo script generation.
Why Revit Workflows Are Perfect for Claude Code Skill Automation
Revit projects generate enormous amounts of structured data - family parameters, schedules, warnings, view templates, workset configurations - and most of that data follows predictable patterns. When something deviates from the pattern, you get coordination failures, model bloat, and late-night fire drills before submission deadlines.
The problem is not that architects lack the knowledge to audit these things. The problem is that manual auditing takes hours and nobody has the bandwidth to run full compliance checks on a weekly basis. This is exactly where Claude Code skills excel. A skill is a reusable markdown instruction file stored in your project’s .claude/skills/ directory. When invoked, Claude Code reads the instructions and executes a multi-step workflow - reading exported Revit data, applying your firm’s rules, and generating actionable reports.
Revit workflows are uniquely suited for this because:
- Revit data exports to structured formats. Schedules export to CSV or TXT. Warnings export to HTML. Family parameters can be exported via tools like DiRoots ParaManager or even Dynamo scripts. These are all formats that Claude Code can parse natively.
- BIM standards are rule-based. Your BEP defines naming conventions, shared parameter requirements, workset structures, and view template standards. Rules translate directly into skill validation logic.
- Repetitive auditing is the bottleneck. Nobody questions the value of checking shared parameters before a model submission. The question is always whether anyone has time to do it. Skills eliminate the time constraint.
- Results need to be consistent. When a junior architect runs the audit, the output should match what a senior BIM manager would catch. Skills enforce that consistency.
This guide walks through six complete, production-ready Claude Code skills for Revit automation. Each skill is presented as a full SKILL.md file that you can copy directly into your project.
Setting Up Your Revit Project for Claude Code Integration
Before building skills, you need a clean data pipeline from Revit to your project folder. Claude Code operates on files in your repository - it cannot connect directly to the Revit API. The bridge between Revit and Claude Code is exported data.
Here is the folder structure we recommend:
project-root/
├── .claude/
│ └── skills/
│ ├── revit-family-auditor/
│ │ └── SKILL.md
│ ├── schedule-validator/
│ │ └── SKILL.md
│ ├── warnings-analyzer/
│ │ └── SKILL.md
│ ├── dynamo-generator/
│ │ └── SKILL.md
│ ├── view-template-checker/
│ │ └── SKILL.md
│ └── workset-auditor/
│ └── SKILL.md
├── revit-exports/
│ ├── families/
│ │ └── parameter-export-2026-04-01.csv
│ ├── schedules/
│ │ ├── door-schedule-2026-04-01.csv
│ │ └── room-schedule-2026-04-01.csv
│ ├── warnings/
│ │ └── warnings-export-2026-04-01.html
│ ├── view-templates/
│ │ └── view-template-list-2026-04-01.txt
│ └── worksets/
│ └── workset-export-2026-04-01.csv
├── standards/
│ ├── shared-parameters.csv
│ ├── naming-conventions.md
│ ├── view-template-standards.csv
│ └── workset-standards.csv
└── reports/
└── (skill outputs go here)
Export conventions that make skills reliable:
- Date-stamp every export. Use the format YYYY-MM-DD in filenames. Skills can then always find the most recent export by sorting filenames.
- Use CSV for tabular data. Revit schedule exports and parameter lists should be CSV with headers. Avoid XLSX - Claude Code parses CSV natively without external libraries.
- Export warnings as HTML. Revit’s built-in warning export produces an HTML file. Skills can parse the HTML table structure to extract warning categories and counts.
- Keep a standards folder. Your firm’s BIM standards should live as machine-readable files (CSV, JSON, or markdown tables) alongside the exports. Skills reference these as the source of truth.
- Commit exports to git. This gives you a history of model health over time and lets the entire team access the latest exports.
With this structure in place, every skill follows the same pattern: read the latest export from revit-exports/, compare it against the corresponding standard in standards/, and write a report to reports/.
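Under these conventions, the "find the most recent export" step every skill starts with reduces to a filename sort. A minimal Python sketch of the idea (the function name and arguments are illustrative, not part of any skill file):

```python
from pathlib import Path

def latest_export(folder, suffix=".csv"):
    """Return the most recent date-stamped export in a folder.

    Because filenames embed YYYY-MM-DD dates, a plain lexicographic
    sort puts the newest file last - no date parsing needed.
    """
    files = sorted(Path(folder).glob("*" + suffix))
    return files[-1] if files else None
```

This is exactly why the YYYY-MM-DD convention matters: any tool that can sort strings can find the latest export.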
Skill: Revit Family Parameter Auditor
This skill reads an exported CSV of family parameters and flags missing shared parameters, incorrect parameter types, and naming convention violations. It is the single most valuable audit you can automate because shared parameter mismatches cause downstream failures in schedules, tags, and data extraction.
# Revit Family Parameter Auditor
Audit exported Revit family parameters against the project's
shared parameter standards. Flags missing parameters, type
mismatches, and naming convention violations.
## When to Use
Invoke with `/revit-family-auditor` before any model submission,
after loading new families, or as part of weekly model hygiene.
## Instructions
1. Find the most recent CSV file in `revit-exports/families/`
by sorting filenames (they are date-stamped YYYY-MM-DD).
2. Read the shared parameter standard from
`standards/shared-parameters.csv`. This file has columns:
ParameterName, ParameterType, ParameterGroup, IsInstance,
IsRequired, AllowedValues.
3. Parse the family export CSV. Expected columns:
FamilyName, FamilyCategory, ParameterName, ParameterType,
IsShared, IsInstance, GUID.
4. For each family in the export, run these checks:
a. MISSING REQUIRED PARAMETERS: Compare the family's
parameters against all rows in the standard where
IsRequired = TRUE and the standard's category matches
the family's category. Report any standard parameters
not found in the family.
b. TYPE MISMATCHES: Where a parameter exists in both the
family and the standard, verify ParameterType matches.
Common issues: Text vs Length, Integer vs Number.
c. NON-SHARED PARAMETERS: Flag any parameter whose name
matches a standard parameter but IsShared = FALSE.
These will break schedule consolidation.
d. NAMING VIOLATIONS: Check all parameter names against
these rules:
- Must start with a capital letter
- Must use PascalCase (no spaces, no underscores)
- Must not start with a number
- Must not contain special characters except periods
e. DUPLICATE GUIDs: Flag any two parameters across
different families that share the same GUID but have
different names. This indicates a shared parameter
file conflict.
5. Group findings by severity:
- CRITICAL: Missing required parameters, GUID conflicts
- WARNING: Type mismatches, non-shared parameters
- INFO: Naming convention violations
## Output Format
Write the report to `reports/family-audit-YYYY-MM-DD.md` using
today's date. Format:
### Family Parameter Audit Report - [Date]
**Export file:** [filename]
**Families scanned:** [count]
**Issues found:** [count by severity]
#### Critical Issues
- [FamilyName]: Missing required parameter "ParameterName"
(required for category: [Category])
#### Warnings
- [FamilyName]: Parameter "ParameterName" type mismatch -
expected [StandardType], found [ActualType]
#### Info
- [FamilyName]: Parameter "parameterName" violates PascalCase
naming convention
#### Summary Table
| Family | Critical | Warning | Info | Status |
|--------|----------|---------|------|--------|
| Name | count | count | count| PASS/FAIL |
A family PASSES only if it has zero Critical issues.
This skill catches the exact problems that cause schedule failures in coordinated models. The GUID conflict check alone has saved firms from spending days debugging why door schedules show incorrect hardware values after a family reload.
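The naming rules in check 4d collapse into a single regular expression. A minimal sketch of the check, assuming only the four rules listed in the skill (the function name is illustrative):

```python
import re

# Check 4d's rules as one pattern: a capital letter, then only
# letters, digits, or periods - no spaces or underscores, and a
# leading digit is impossible by construction.
PASCAL_CASE = re.compile(r"^[A-Z][A-Za-z0-9.]*$")

def naming_violations(parameter_names):
    """Return the parameter names that break the convention."""
    return [n for n in parameter_names if not PASCAL_CASE.match(n)]
```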
Skill: Schedule Export Validator
Schedules are one of the most visible outputs of a Revit model, and errors in schedules erode client confidence faster than almost anything else. This skill validates exported schedule CSVs against your project standards.
# Schedule Export Validator
Validate exported Revit schedule CSVs against project
standards for column structure, data completeness, and
value consistency.
## When to Use
Invoke with `/schedule-validator` after exporting schedules
for submission review or CD package preparation.
## Instructions
1. Scan `revit-exports/schedules/` for all CSV files. Process
each one individually.
2. For each schedule CSV, determine the schedule type from the
filename. Expected naming: [type]-schedule-YYYY-MM-DD.csv
where type is: door, window, room, finish, equipment, wall.
3. Read the corresponding standard from `standards/` if one
exists. Standard files follow the pattern:
`standards/[type]-schedule-standard.csv` with columns:
ColumnName, DataType, IsRequired, AllowedValues, MinValue,
MaxValue, Format.
If no standard file exists for the schedule type, perform
only generic checks (steps 4a and 4b).
4. Run these validation checks:
a. EMPTY CELLS IN REQUIRED COLUMNS: If the standard marks
a column as IsRequired = TRUE, every row must have a
non-empty value. Report the row number and column name
for each violation.
b. DUPLICATE ROWS: Check for rows where the primary
identifier column (first column) has duplicate values.
For door schedules, this is the Mark column. For room
schedules, this is the Number column. Duplicates
indicate unresolved Revit warnings.
c. VALUE RANGE VIOLATIONS: For columns with MinValue or
MaxValue in the standard, check that numeric values
fall within range. Example: Room Area should not be
less than 5 sqm (likely a modeling error).
d. ALLOWED VALUE VIOLATIONS: For columns with
AllowedValues in the standard (semicolon-separated
list), verify every cell value appears in the list.
Example: Fire Rating column should only contain
"None", "30min", "60min", "90min", "120min".
e. FORMAT VIOLATIONS: For columns with a Format rule,
validate the pattern. Common formats:
- DOOR_MARK: Must match pattern [Level]-[Type]-[Number]
e.g., "L01-SD-001"
- ROOM_NUMBER: Must match pattern [Level].[Number]
e.g., "01.001"
f. HEADER MISMATCH: Compare the CSV column headers against
the standard's ColumnName list. Flag any missing
standard columns and any extra non-standard columns.
5. Calculate a completeness percentage for each schedule:
(non-empty required cells / total required cells) * 100
## Output Format
Write to `reports/schedule-validation-YYYY-MM-DD.md`:
### Schedule Validation Report - [Date]
**Schedules validated:** [count]
**Overall completeness:** [average percentage across all]
#### [Schedule Type] Schedule
- **File:** [filename]
- **Rows:** [count]
- **Completeness:** [percentage]
- **Issues:**
- [Row X]: Column "ColumnName" is empty (required)
- [Row Y]: Column "FireRating" value "45min" not in
allowed values [None, 30min, 60min, 90min, 120min]
- Duplicate Mark values: "L01-SD-003" appears in rows 12, 47
#### Pass/Fail Summary
| Schedule | Completeness | Critical | Warnings | Status |
|----------|-------------|----------|----------|--------|
| Door | 97% | 2 | 5 | FAIL |
| Room | 100% | 0 | 1 | PASS |
PASS requires 100% completeness and zero critical issues.
The format validation rules (like door mark patterns) are what elevate this from a simple completeness check to a genuine quality gate. Firms that enforce consistent mark formats avoid the chaos of retagging hundreds of doors during construction documentation.
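The Format rules in check 4e also translate directly into regular expressions. A sketch assuming the two example patterns from the skill ("L01-SD-001", "01.001") - your firm's conventions will differ, so treat these regexes as placeholders:

```python
import re

# Illustrative regexes for the Format rules in check 4e.
FORMATS = {
    "DOOR_MARK": re.compile(r"^L\d{2}-[A-Z]{2}-\d{3}$"),
    "ROOM_NUMBER": re.compile(r"^\d{2}\.\d{3}$"),
}

def format_violations(fmt, values):
    """Return (row, value) pairs failing the named format rule.
    Row numbers are 1-based, with row 1 as the CSV header."""
    pattern = FORMATS[fmt]
    return [(i + 2, v) for i, v in enumerate(values)
            if not pattern.match(v)]
```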
Skill: Revit Warnings Analyzer
Every Revit model accumulates warnings over its lifecycle. The built-in Revit warnings dialog is nearly useless for prioritization - it shows a flat list with no categorization or severity ranking. This skill transforms the exported warning HTML into an actionable triage report.
# Revit Warnings Analyzer
Parse an exported Revit warnings HTML file, categorize
warnings by type and severity, identify the most impactful
warnings to resolve, and generate a prioritized cleanup plan.
## When to Use
Invoke with `/revit-warnings-analyzer` during model cleanup
sprints, before submission milestones, or as part of weekly
model health monitoring.
## Instructions
1. Find the most recent HTML file in `revit-exports/warnings/`.
2. Parse the HTML to extract all warning entries. Revit
warning exports contain a table with columns: Warning
Description, Element IDs, Category.
Note: Some Revit versions export warnings as a simple
list rather than a table. Handle both formats by looking
for repeated patterns in the HTML structure.
3. Categorize each warning into one of these severity tiers:
CRITICAL (fix immediately - causes data loss or corruption):
- "Elements have duplicate 'Mark' values"
- "Room is not in a properly enclosed region"
- "Area is not in a properly enclosed region"
- "Multiple Rooms are in the same enclosed region"
- "Highlighted walls overlap"
- "Could not create a valid extrusion"
- "One element is completely inside another"
HIGH (fix before submission - causes schedule/tag errors):
- "Elements have same 'Number' value"
- "Room Tag is outside of its Room"
- "There are identical instances in the same place"
- "Stair path does not reach top or bottom"
- "Railings cannot be found for this stair"
MEDIUM (fix during cleanup - performance or coordination):
- "Highlighted elements are slightly off axis"
- "One or more elements is slightly off axis"
- "Line in sketch is slightly off axis"
- "Wall join is not clean"
- "Location line of this Wall is slightly off"
LOW (informational - resolve when convenient):
- All other warnings not matching above patterns
4. For each category, count occurrences and identify the
most-affected elements by counting how many warnings
reference the same Element ID.
5. Calculate the "Model Health Score" as:
Score = 100 - (CRITICAL * 5) - (HIGH * 2) - (MEDIUM * 0.5)
Minimum score is 0. Target score is 80+.
6. Generate a cleanup priority list - the top 10 actions that
would resolve the most warnings. Group overlapping warnings
that share Element IDs because fixing one element often
resolves multiple warnings.
## Output Format
Write to `reports/warnings-analysis-YYYY-MM-DD.md`:
### Revit Warnings Analysis - [Date]
**Total warnings:** [count]
**Model Health Score:** [score]/100
**Target:** 80+
#### Warning Distribution
| Severity | Count | Percentage |
|----------|-------|------------|
| Critical | X | X% |
| High | X | X% |
| Medium | X | X% |
| Low | X | X% |
#### Top 10 Cleanup Actions (by impact)
1. Fix [N] duplicate Mark values in [Category] elements.
Resolves approximately [N] warnings.
Affected elements: [ID list, max 5]
2. Enclose [N] rooms with properly bounded walls.
Resolves approximately [N] warnings.
Affected elements: [ID list, max 5]
[Continue for top 10...]
#### Category Breakdown
**Duplicate Marks** ([N] warnings)
Affected categories: Doors (X), Windows (Y), Walls (Z)
Most-affected element: ID [XXXXX] appears in [N] warnings
[Continue for each warning category with 3+ occurrences...]
#### Trend (if previous reports exist)
Compare against the most recent previous report in
`reports/` matching the pattern `warnings-analysis-*.md`.
Report: total warnings delta, score delta, new warning
types not present in previous report.
The cleanup priority list is the most valuable part. In a model with 500+ warnings, knowing which 10 actions will resolve 60% of them transforms an overwhelming cleanup sprint into a focused two-hour task.
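The scoring logic in steps 3 and 5 is simple enough to sketch directly. The pattern lists below are abbreviated - the full lists are in the skill - and the substring matching is deliberate: it keeps the tiers tolerant of wording changes between Revit versions.

```python
# Severity tiers from step 3 (abbreviated) and the Model Health
# Score formula from step 5, as the skill states them.
TIERS = {
    "CRITICAL": ["duplicate 'Mark' values", "properly enclosed region"],
    "HIGH": ["same 'Number' value", "identical instances in the same place"],
    "MEDIUM": ["slightly off axis", "Wall join is not clean"],
}

def classify(description):
    for tier, patterns in TIERS.items():
        if any(p in description for p in patterns):
            return tier
    return "LOW"

def model_health_score(counts):
    score = (100 - counts.get("CRITICAL", 0) * 5
                 - counts.get("HIGH", 0) * 2
                 - counts.get("MEDIUM", 0) * 0.5)
    return max(score, 0.0)
```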
Skill: Dynamo Script Generator
Dynamo is Revit’s visual programming environment, but many powerful Dynamo operations require Python Script nodes. This skill generates Python scripts for common Revit automation tasks, saving you from writing boilerplate code and looking up Revit API class names.
# Dynamo Script Generator
Generate Dynamo-compatible Python scripts for common Revit
automation tasks. Produces Python code that runs inside a
Dynamo Python Script node using the RevitAPI.
## When to Use
Invoke with `/dynamo-generator` followed by a description of
what you want to automate. Examples:
- /dynamo-generator rename all doors by level and sequence
- /dynamo-generator export room data to CSV
- /dynamo-generator set wall comments parameter from CSV
## Instructions
1. Read the user's task description provided after the
skill invocation command.
2. Determine which Revit API namespaces are needed. Common
imports for Dynamo Python scripts:
```python
import clr
clr.AddReference('RevitAPI')
clr.AddReference('RevitServices')
clr.AddReference('RevitNodes')
from Autodesk.Revit.DB import *
from RevitServices.Persistence import DocumentManager
from RevitServices.Transactions import TransactionManager
   ```
3. Generate a complete Python script following these rules:
   a. Always start with the standard Dynamo imports above.
   b. Get the current document:
      `doc = DocumentManager.Instance.CurrentDBDocument`
   c. Wrap all model modifications in a Transaction:
      `TransactionManager.Instance.EnsureInTransaction(doc)`
      `# ... modifications ...`
      `TransactionManager.Instance.TransactionTaskDone()`
   d. Use FilteredElementCollector for element queries.
   e. Always include error handling with try/except blocks.
   f. Output results to the OUT variable for Dynamo display.
   g. Include inline comments explaining each major step.
   h. Never use deprecated API methods. Use current Revit
      2024/2025 API conventions.
4. Also generate the Dynamo node setup instructions:
   - Which input ports to create and what to connect
   - Which Dynamo nodes to combine with the Python Script
   - Expected output format
5. Include a "What This Does" section in plain English so
   non-programmers on the team understand the script's purpose.
6. If the task involves reading from or writing to external
   files (CSV, Excel), include the Python file I/O code using
   standard library modules (csv, os).
## Output Format
Write to `reports/dynamo-script-[task-slug]-YYYY-MM-DD.md`:
### Dynamo Script: [Task Description]
**Generated:** [Date]
#### What This Does
[2-3 sentence plain-English explanation]
#### Python Script
[Complete script with comments]
#### Dynamo Setup
- Create a new Dynamo graph
- Add a Python Script node
- [Input port configuration]
- [Additional node connections if needed]
- Run the graph
#### Testing
- Before running on the full model, test with a filtered selection of 5-10 elements
- Verify the output matches expectations before committing the transaction
- Keep Revit's Undo available (Dynamo transactions can be undone as a single operation)
#### Limitations
[Any known edge cases or limitations of the generated script]
The real value of this skill is not just code generation - it is the Dynamo setup instructions and testing guidance that make the scripts accessible to architects who are not comfortable writing Python from scratch. A BIM coordinator can invoke this skill, get a working script, and deploy it within minutes.
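The `[task-slug]` in the report filename is the one piece of the output path the skill has to derive from free-form input. One way to build it (the word limit and lowercasing here are illustrative choices, not rules from the skill itself):

```python
import re

def task_slug(description, max_words=5):
    """Turn a free-form task description into a filename-safe slug.
    Keeps only lowercase alphanumeric words, joined by hyphens."""
    words = re.findall(r"[a-z0-9]+", description.lower())
    return "-".join(words[:max_words])
```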
Skill: View Template Compliance Checker
View templates control the visual consistency of every sheet in your document set. When templates drift from firm standards - either through manual overrides or copied templates from other projects - the result is inconsistent line weights, incorrect visibility settings, and unprofessional deliverables. This skill catches template drift before it reaches the printer.
# View Template Compliance Checker
Verify that Revit view templates match firm standards for
visibility settings, graphic overrides, detail level, and
naming conventions.
## When to Use
Invoke with `/view-template-checker` before sheet set reviews,
when inheriting a model from another team, or after copying
templates between projects.
## Instructions
1. Find the most recent file in `revit-exports/view-templates/`.
This should be a CSV or tab-delimited file with columns:
TemplateName, ViewType, DetailLevel, VisibilityOverrides,
ScaleValue, Discipline, FilterOverrides.
Note: This export can be generated using the Dynamo script
in `.claude/skills/dynamo-generator/` or via third-party
tools like BIM Interoperability Tools.
2. Read the firm standard from
`standards/view-template-standards.csv` with columns:
TemplateName, ViewType, RequiredDetailLevel,
RequiredScaleValue, RequiredDiscipline,
RequiredVisibilityCategories (semicolon-separated list of
categories that must be visible), BannedCategories
(semicolon-separated list that must be hidden).
3. For each template in the export, find the matching standard
row by TemplateName. If no match exists, flag as
NON-STANDARD TEMPLATE.
4. Run these checks for matched templates:
a. DETAIL LEVEL: Verify DetailLevel matches
RequiredDetailLevel. Mismatches cause linework
inconsistencies across sheets.
b. SCALE VALUE: Verify ScaleValue matches
RequiredScaleValue. Wrong scale with correct template
usually means the template was applied to the wrong
view type.
c. DISCIPLINE: Verify Discipline matches
RequiredDiscipline. Architecture templates applied to
structural views cause visibility chaos.
d. REQUIRED CATEGORIES HIDDEN: Parse VisibilityOverrides
to check that all RequiredVisibilityCategories are set
to visible. Flag any that are hidden.
e. BANNED CATEGORIES VISIBLE: Parse VisibilityOverrides
to check that all BannedCategories are hidden. Example:
MEP categories should be hidden in Architectural plan
templates.
f. NAMING CONVENTION: Template names must follow the
pattern: [Discipline]-[ViewType]-[Scale]-[Description]
Example: "ARCH-FloorPlan-100-Furniture" or
"ARCH-Section-50-WallDetail"
5. Count views using each template (if the export includes
a ViewCount column) to prioritize fixes. A non-compliant
template used by 40 views is more urgent than one used
by 2 views.
## Output Format
Write to `reports/view-template-audit-YYYY-MM-DD.md`:
### View Template Compliance Report - [Date]
**Templates checked:** [count]
**Standards matched:** [count]
**Non-standard templates:** [count]
**Compliance rate:** [matching without issues / total matched]
#### Non-Standard Templates (no matching standard)
These templates exist in the model but have no corresponding
firm standard. Review whether they should be added to
standards or removed from the model.
- [TemplateName] (used by [N] views)
#### Compliance Issues
| Template | Issue | Expected | Actual | Views Affected |
|----------|-------|----------|--------|----------------|
| Name | Detail Level | Fine | Medium | 12 |
#### Template Health Summary
| Template | Checks Passed | Total Checks | Status |
|----------|--------------|--------------|--------|
| ARCH-FloorPlan-100 | 6/6 | 6 | PASS |
| ARCH-Section-50 | 4/6 | 6 | FAIL |
PASS requires all checks to pass.
The discipline check (step 4c) alone prevents one of the most common mistakes in multi-discipline coordination - applying an architectural template to a linked structural model view and accidentally hiding critical elements.
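The naming pattern in check 4f is another one-regex check. A sketch assuming the `[Discipline]-[ViewType]-[Scale]-[Description]` pattern from the skill - the discipline codes in the alternation are an assumption for illustration, so extend them to match your firm's list:

```python
import re

# Check 4f's pattern: [Discipline]-[ViewType]-[Scale]-[Description],
# e.g. "ARCH-FloorPlan-100-Furniture".
TEMPLATE_NAME = re.compile(r"^(ARCH|STRUC|MEP)-[A-Za-z]+-\d+-[A-Za-z]+$")

def template_name_violations(template_names):
    """Return template names that do not follow the convention."""
    return [n for n in template_names if not TEMPLATE_NAME.match(n)]
```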
Skill: Workset Organization Auditor
Worksets are the backbone of multi-user Revit collaboration, but workset discipline degrades over time as team members create elements without selecting the correct active workset. This skill audits workset structure and element distribution to catch organizational drift.
# Workset Organization Auditor
Audit Revit workset structure and element distribution against
project standards. Identifies elements on incorrect worksets,
missing standard worksets, and workset bloat.
## When to Use
Invoke with `/workset-auditor` weekly during design development
and construction documentation phases, or before model handoffs.
## Instructions
1. Find the most recent CSV in `revit-exports/worksets/`.
Expected columns: WorksetName, ElementCount, Category,
IsEditable, Owner.
2. Read the workset standard from `standards/workset-standards.csv`
with columns: WorksetName, IsRequired, AllowedCategories
(semicolon-separated), MaxElementCount, Description.
3. Run these checks:
a. MISSING REQUIRED WORKSETS: Flag any workset in the
standard with IsRequired = TRUE that does not appear
in the export.
b. NON-STANDARD WORKSETS: Flag any workset in the export
that does not appear in the standard. These often
result from users creating personal worksets or copying
worksets from other projects.
c. CATEGORY VIOLATIONS: For each workset with
AllowedCategories defined in the standard, check that
the export does not show elements from disallowed
categories on that workset.
Example: The "A-Interiors" workset should only contain
Furniture, Casework, and Specialty Equipment categories.
If Walls appear on this workset, flag it.
d. ELEMENT DISTRIBUTION: Calculate the percentage of total
elements on each workset. Flag any workset that holds
more than 40% of all elements (usually indicates
everything was placed on the default workset).
e. WORKSET BLOAT: Flag any workset exceeding its
MaxElementCount from the standard. Bloated worksets
cause slow sync times and checkout conflicts.
f. CHECKOUT CONFLICTS: If multiple rows show the same
workset with different Owner values, flag as a
potential checkout conflict (this depends on the
export format including per-element ownership data).
4. Calculate a "Workset Health Score":
Start at 100. Deduct:
- 10 points per missing required workset
- 5 points per non-standard workset
- 3 points per category violation (per workset)
- 15 points if any workset exceeds 40% of elements
Minimum score is 0.
## Output Format
Write to `reports/workset-audit-YYYY-MM-DD.md`:
### Workset Organization Audit - [Date]
**Workset Health Score:** [score]/100
**Worksets in model:** [count]
**Standard worksets present:** [count]/[total required]
#### Missing Required Worksets
- [WorksetName]: [Description from standard]
#### Non-Standard Worksets
- [WorksetName]: Contains [N] elements across [categories]
Recommendation: Merge into [suggested standard workset]
#### Category Violations
| Workset | Disallowed Category | Element Count |
|---------|-------------------|---------------|
| A-Interiors | Walls | 47 |
#### Element Distribution
| Workset | Elements | Percentage | Status |
|---------|----------|------------|--------|
| A-Shell | 2,340 | 28% | OK |
| Workset1| 4,100 | 49% | OVER |
#### Recommendations
1. Move [N] [Category] elements from [Workset] to [Workset]
2. Rename [non-standard workset] to [standard name] or
merge its elements into [standard workset]
3. [Additional specific recommendations based on findings]
The element distribution check is particularly powerful. In nearly every project that reports slow sync times, you will find that 50% or more of all elements are sitting on “Workset1” because nobody changed the active workset after creating the model. This skill catches that pattern immediately.
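The score deductions in step 4 and the 40% distribution check in step 3d can be sketched as a single function, following the skill's formula exactly (the function signature is illustrative):

```python
# Workset Health Score: start at 100, apply the step 4 deductions,
# and dock 15 points if any single workset holds more than 40% of
# all elements (step 3d's distribution check).
def workset_health_score(missing_required, non_standard,
                         category_violations, element_counts):
    score = 100
    score -= 10 * missing_required
    score -= 5 * non_standard
    score -= 3 * category_violations
    total = sum(element_counts.values())
    if total and any(c / total > 0.40 for c in element_counts.values()):
        score -= 15  # one workset dominates the model
    return max(score, 0)
```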
Integrating Skills into Your BIM Execution Plan
Skills become most effective when they are not just available but are formally integrated into your project’s BIM Execution Plan (BEP). Here is how to make that happen:
Add skill checkpoints to your BEP milestones. For each project milestone (schematic design completion, DD submission, CD review), list which skills must be run and what passing criteria apply. Example:
- SD Completion: Run `/revit-warnings-analyzer`. Model Health Score must be 70+.
- DD Submission: Run all six skills. No CRITICAL issues allowed in family audit or schedule validation. Warnings analysis score must be 80+.
- CD Package: All skills must show PASS status. Any FAIL requires sign-off from the BIM Manager before proceeding.
Standardize the export workflow. Add a “Weekly Export Checklist” to your BEP that specifies exactly which data to export and where to save it. When exports are consistent, skills work reliably without manual intervention.
Version your standards files. The CSV files in your standards/ folder are the source of truth for every skill. When standards change (a new shared parameter is added, a workset is renamed), update the CSV and commit it to git. Skills automatically pick up the new rules on the next run.
Assign skill ownership. In your BEP team roster, assign each skill to a team member who is responsible for running it on schedule and reviewing the output. This prevents the common failure mode where skills exist but nobody remembers to use them.
Track results over time. Because skill outputs are saved as dated markdown files in reports/, you can track model health trends across the project lifecycle. The warnings analyzer skill already includes trend comparison. Consider adding similar trend tracking to other skills as your team matures.
Tips for Revit-Specific Skill Development
Building effective skills for Revit workflows requires understanding both the data Revit produces and the common failure patterns in architectural practice. Here are lessons learned from production deployments:
Start with the export format, not the ideal format. Revit’s built-in schedule export has quirks - merged cells, header rows with project info, encoding issues with special characters. Write your skill instructions to handle the actual export format, not a cleaned-up version. Include explicit instructions for skipping header rows and handling empty trailing columns.
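A sketch of what "handle the actual export format" means in practice - scan for the real header row instead of assuming it is row one. The `header_keyword` default is an assumption; pick any column you know the schedule contains:

```python
import csv

def read_schedule(path, header_keyword="Mark"):
    """Read a Revit schedule export, skipping title rows above the
    real header. Revit's CSV export often puts the schedule title
    on the first row(s), so we scan for the first row containing a
    known column name. Extra trailing columns are truncated to the
    header width, and blank rows are dropped."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    for i, row in enumerate(rows):
        if header_keyword in row:
            header = row
            data = [r[:len(header)] for r in rows[i + 1:] if any(r)]
            return header, data
    raise ValueError("header row not found in " + path)
```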
Use category-aware rules. A parameter that is required for doors may be irrelevant for windows. A workset rule that makes sense for walls does not apply to furniture. Every validation rule should be scoped to the appropriate Revit category. The family auditor skill handles this by matching categories between the export and the standard.
Account for linked models. Many Revit warnings and workset issues originate from linked models rather than the host model. Your skill instructions should specify whether to include or exclude linked model elements. For most audits, filtering to only host model elements gives cleaner results.
Handle Revit version differences. Warning messages change slightly between Revit versions. The warnings analyzer skill uses substring matching (“duplicate ‘Mark’ values”) rather than exact string matching to accommodate these variations. Apply the same principle to parameter types and export column headers.
Make skills composable. The dynamo generator skill is deliberately open-ended - it takes a task description as input rather than hardcoding specific scripts. This makes it useful across hundreds of scenarios rather than just the five you anticipated at setup time.
Common Mistakes When Automating Revit with AI
Even with well-designed skills, teams fall into predictable traps when introducing AI automation to their Revit workflows. Avoiding these mistakes will save you time and credibility with your team:
Mistake 1: Skipping the standards files. Skills without explicit standards devolve into generic checks that produce vague results. The difference between “this parameter might be wrong” and “parameter FireRating is type Text but your standard requires it to be type YesNo” is the difference between a useful tool and an annoying one. Invest the upfront time to document your standards as machine-readable files.
Mistake 2: Running skills without reviewing the output. AI is excellent at pattern matching but it does not understand design intent. A skill might flag a “non-standard workset” that was created deliberately for a unique project condition. Always have a qualified team member review skill outputs before acting on recommendations.
Mistake 3: Trying to automate model modifications directly. Claude Code skills are best suited for analysis, validation, and report generation. They should not directly modify your Revit model. The dynamo generator skill follows this principle - it generates scripts that a human then reviews, configures, and runs inside Revit. Keep the human in the loop for any model changes.
Mistake 4: Over-engineering early. Start with one skill (the warnings analyzer is the best starting point) and run it for two weeks before adding more. This gives your team time to refine the export workflow, adjust standards files, and build confidence in the process. Deploying six skills simultaneously overwhelms teams and leads to abandonment.
Mistake 5: Ignoring file encoding. Revit exports sometimes use UTF-16 or Windows-1252 encoding instead of UTF-8. If your skills produce garbled output for special characters (degree symbols, accent marks in room names), add explicit encoding handling instructions to your skill files.
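One common way to handle the encoding problem is a try-in-order fallback. A sketch (the encoding order is the important part - Windows-1252 accepts almost any byte sequence, so it must come last):

```python
def read_text_any_encoding(path):
    """Try the encodings Revit exports actually produce, in order:
    UTF-8 first, then UTF-16 (schedule TXT exports often use it),
    then Windows-1252 as the fallback of last resort."""
    for encoding in ("utf-8", "utf-16", "cp1252"):
        try:
            with open(path, encoding=encoding) as f:
                return f.read()
        except (UnicodeDecodeError, UnicodeError):
            continue
    raise ValueError("could not decode " + path)
```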
Mistake 6: Not committing exports to version control. When exports live only on someone’s local machine, the skills cannot run on other team members’ setups and you lose the ability to track model health trends. Treat exports as project deliverables and commit them alongside your standards.
What Comes Next
The six skills in this guide cover the most impactful Revit automation workflows, but they are just the starting point. Once your team is comfortable with the skill-based workflow, consider expanding into these areas:
- Clash detection pre-screening - Parse Navisworks clash reports and cross-reference with Revit element data to identify systematic coordination failures
- Sheet index automation - Generate and validate drawing sheet indexes against your project’s sheet numbering standard
- Specification cross-referencing - Compare Revit material assignments against specification sections to catch mismatches before submittal
- Model size monitoring - Track .rvt file size over time and correlate with element counts to identify model bloat sources
Each of these follows the same pattern: export structured data from Revit, define your standard, and write a skill that bridges the two.
If you want to deepen your Revit and BIM skills alongside these automation techniques, explore the structured courses at Archgyan Academy. The courses cover everything from Revit fundamentals through advanced family creation and project management - exactly the domain knowledge you need to write effective automation skills. The more deeply you understand Revit’s data structures and workflows, the more powerful your Claude Code skills become.