
The Bloomfield Team

AI in Aerospace Manufacturing: What the Compliance Layer Looks Like


An aerospace machine shop considering AI tools faces a question that general-purpose AI vendors rarely understand: every output of every system that touches production data is potentially auditable. AS9100 registrars, NADCAP auditors, prime contractor quality teams, and FAA designees all have the authority to ask how a decision was made, what data supported it, and whether the process followed the documented procedure. An AI tool that cannot answer those questions is a compliance liability, regardless of how well it performs.

Roughly 12,000 manufacturers in the United States hold AS9100 certification. Most employ between 30 and 200 people. They machine structural components, fabricate sheet metal assemblies, perform special processes, and produce the precision parts that go into every airframe and engine in production. For these shops, AI adoption depends on one thing above all: can the tool operate within the compliance framework that governs everything they do?

The Regulatory Stack

Aerospace manufacturing compliance is layered. Understanding those layers is the starting point for understanding where AI can and cannot operate.

AS9100 Rev D is the quality management system standard derived from ISO 9001 with aerospace-specific additions. It requires documented processes, controlled records, risk-based thinking applied to every change, and a configuration management system that tracks the relationship between requirements, design, production, and inspection. Any AI tool that generates recommendations, surfaces data, or supports decision-making in a quality-relevant process needs to fit within the QMS documentation structure. That means the tool's inputs, outputs, logic, and version history must be traceable.

NADCAP governs special processes: heat treatment, chemical processing, welding, nondestructive testing, and surface coatings. NADCAP audits focus on process control, calibration, operator certification, and the documented evidence that every step was performed within specified parameters. AI tools that monitor special process parameters or predict process drift need to produce records that meet NADCAP documentation requirements. An AI-generated alert that a furnace load may need extended soak time is valuable. An alert with no traceable logic or data source is an audit finding waiting to happen.

ITAR (International Traffic in Arms Regulations) restricts access to technical data related to defense articles. For the roughly 30% of aerospace shops that handle ITAR-controlled work, this means the AI system cannot send data to foreign servers, cannot be accessed by non-US persons, and cannot store technical data in any location that is not ITAR-compliant. Cloud-based AI tools built on infrastructure that spans multiple countries fail this requirement by default unless they are specifically architected for ITAR compliance with data residency controls and access restrictions.

FAA production approvals (Production Certificates, Parts Manufacturer Approval, and Technical Standard Order authorizations) carry their own documentation requirements for any system that influences production decisions. If an AI tool recommends a process parameter, inspection frequency, or material substitution on an FAA-controlled part, the recommendation and the data behind it become part of the quality record.

Where AI Operates Safely in Aerospace

The compliance framework does not prevent AI adoption. It constrains how AI tools are designed, deployed, and documented. The practical applications fall into categories based on their proximity to production decisions.

Quoting and estimating. This is the lowest compliance risk application because quoting happens before a production order exists. An AI tool that surfaces historical job data, comparable past quotes, material pricing, and setup time estimates to support the estimator's pricing decisions operates entirely outside the AS9100 quality record. The estimator uses the information to build a price. The tool does not make production decisions. For aerospace shops where slow quoting costs contracts, this is the highest-value, lowest-risk starting point.

Production planning and scheduling. AI-assisted scheduling that accounts for machine capability, operator certifications, and material availability is a planning function. The schedule itself is not a controlled record in most QMS implementations. The work orders, travelers, and inspection records generated from that schedule are controlled. An AI scheduling tool that optimizes load balancing and identifies bottlenecks ahead of time operates in a planning capacity. The compliance requirement is that the resulting work orders follow the documented process for their creation and approval.

Nonconformance pattern detection. An AI tool that analyzes NCR data to identify recurring defect patterns by part type, machine, operator, or material lot is a quality analysis function. The tool surfaces patterns. The quality engineer evaluates them and decides on corrective action through the documented CAPA process. The AI output is an input to a human decision, and the decision and its rationale are documented in the CAPA record. This architecture satisfies AS9100 requirements because the controlled process (CAPA) is human-driven, and the AI serves as an analytical tool.
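As a concrete illustration of this architecture, the pattern-detection step can be as simple as counting recurring combinations across NCR records. The field names below (part, machine, defect) are illustrative assumptions, not a specific QMS schema, and the function only surfaces candidates; the decision to open a CAPA stays with the quality engineer:

```python
from collections import Counter

def recurring_patterns(ncrs, min_count=2):
    """Surface defect combinations that recur across NCRs.

    `ncrs` is a list of dicts with illustrative keys. The output is
    an input to a human decision, not a corrective action itself.
    """
    counts = Counter((n["part"], n["machine"], n["defect"]) for n in ncrs)
    return [(combo, n) for combo, n in counts.most_common() if n >= min_count]

patterns = recurring_patterns([
    {"part": "P-100", "machine": "VMC-3", "defect": "bore oversize"},
    {"part": "P-100", "machine": "VMC-3", "defect": "bore oversize"},
    {"part": "P-200", "machine": "VMC-1", "defect": "surface finish"},
])
```

The quality engineer reviews `patterns`, and any resulting corrective action is documented through the existing CAPA process.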

Supplier performance analysis. Tracking supplier delivery performance, quality metrics, and pricing trends with AI-assisted analysis supports the required supplier evaluation process under AS9100 Section 8.4. The AI tool aggregates data that already exists in the QMS and purchasing records. The supplier rating decisions remain with the quality team.

The Architecture That Passes Audit

An AI tool built for an aerospace manufacturing environment needs five architectural features that general-purpose AI tools typically lack.

Traceability of inputs. When the AI surfaces a recommendation, the system must show exactly which data records contributed to that output. If the quoting tool suggests a setup time of 4.5 hours for a new part, the system should display the three historical jobs it analyzed to reach that estimate, with their job numbers, actual setup times, and the dates they were run. The auditor can then follow the chain from AI output back to source records.
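A minimal sketch of what a traceable recommendation record could look like, using the setup-time example above. The structure and field names here are illustrative assumptions, not a particular product's schema; the point is that the output carries its source records with it:

```python
from dataclasses import dataclass

@dataclass
class SourceJob:
    # Historical job record that contributed to the estimate
    job_number: str
    actual_setup_hours: float
    run_date: str  # ISO date the job was run

@dataclass
class Recommendation:
    # AI output bundled with every record that produced it
    suggested_setup_hours: float
    source_jobs: list

    def audit_trail(self):
        """The chain an auditor can follow back to source records."""
        return [(j.job_number, j.actual_setup_hours, j.run_date)
                for j in self.source_jobs]

rec = Recommendation(
    suggested_setup_hours=4.5,
    source_jobs=[
        SourceJob("J-1042", 4.2, "2023-06-14"),
        SourceJob("J-1187", 4.8, "2023-09-02"),
        SourceJob("J-1263", 4.5, "2024-01-21"),
    ],
)
```

An auditor asking "where did 4.5 hours come from?" gets three job numbers and their actuals, not a black-box answer.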

Version control on logic. If the AI model is updated, retrained, or adjusted, the system must maintain a record of what changed, when, and what the previous version produced. This mirrors the configuration management requirements that AS9100 applies to product design and process documentation. A model that produces different outputs today than it did last month, with no record of what changed, creates an uncontrolled condition.
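One way to implement this is an append-only model registry that records each change before the new version is used, mirroring configuration-management practice. This is a sketch under assumed names, not a prescribed design; hashing the parameters makes any silent change detectable:

```python
import hashlib
import json
from datetime import datetime, timezone

class ModelRegistry:
    """Append-only log of model versions: what changed, and when."""

    def __init__(self):
        self.versions = []

    def register(self, params: dict, change_note: str) -> str:
        # Hash the parameter set so any undocumented change shows up
        # as a hash mismatch against the registered version
        digest = hashlib.sha256(
            json.dumps(params, sort_keys=True).encode()).hexdigest()[:12]
        self.versions.append({
            "version": len(self.versions) + 1,
            "params_hash": digest,
            "change_note": change_note,
            "registered_at": datetime.now(timezone.utc).isoformat(),
        })
        return digest

registry = ModelRegistry()
registry.register({"lookback_jobs": 3, "weighting": "recency"},
                  "Initial quoting model")
registry.register({"lookback_jobs": 5, "weighting": "recency"},
                  "Expanded lookback after validation review")
```

The registry answers the auditor's question directly: what changed between last month's outputs and today's, and when was the change made.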

Access controls aligned with ITAR. For shops handling controlled technical data, the AI system must enforce access restrictions at the data level. An operator cleared for commercial work should not be able to query the AI system about a defense program's historical job data. The access control model must match the facility's Technology Control Plan.
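Enforced at the data level, that restriction looks like a check the system runs before answering any query. The records, users, and flags below are hypothetical placeholders; the clearance rules in a real deployment come from the facility's Technology Control Plan:

```python
# Illustrative records and users -- in practice these come from the
# QMS database and the facility's Technology Control Plan roster
RECORDS = {
    "J-1042": {"program": "commercial", "itar": False},
    "J-2001": {"program": "defense", "itar": True},
}

USERS = {
    "operator_a": {"us_person": True, "itar_cleared": False},
    "qe_b": {"us_person": True, "itar_cleared": True},
}

def can_query(user_id: str, job_number: str) -> bool:
    """Refuse queries against ITAR-tagged records unless the user
    is a US person cleared under the TCP."""
    record = RECORDS[job_number]
    if not record["itar"]:
        return True
    user = USERS[user_id]
    return user["us_person"] and user["itar_cleared"]
```

With this check in place, the operator cleared only for commercial work can query commercial job history but gets nothing back on the defense program.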

Audit logging. Every query, every recommendation, and every user action within the AI system should be logged with timestamps, user IDs, and the specific data accessed. This log serves the same function as the controlled document register: it provides an auditable trail of how the system was used.
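A minimal sketch of such a log, with illustrative field names. The essential property is that each entry names the specific records accessed, not just a category of activity:

```python
from datetime import datetime, timezone

class AuditLog:
    """Append-only record of every query and recommendation served."""

    def __init__(self):
        self.entries = []

    def record(self, user_id: str, action: str, data_refs: list):
        self.entries.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user_id": user_id,
            "action": action,
            # The specific records touched, so the trail is auditable
            "data_accessed": list(data_refs),
        })

log = AuditLog()
log.record("estimator_1", "quote_recommendation",
           ["J-1042", "J-1187", "J-1263"])
```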

Human-in-the-loop architecture. For any application that touches production decisions, the AI recommends and the human decides. This is not a philosophical position. It is the only architecture that satisfies the AS9100 requirement for competent persons making quality-relevant decisions. The AI tool is a decision support system, and the documented process must reflect that the human authority for each decision is clearly defined.

What the Auditor Asks

In a surveillance or recertification audit, the registrar will want to understand any new system that influences quality-relevant processes. Based on current audit practices, the questions follow a predictable pattern.

How is this system documented in the QMS? The AI tool should appear in the relevant process procedures as a tool used within the workflow, with its scope and limitations defined. It does not need its own standalone procedure. It needs to be referenced in the procedures it supports.

What training is required to use this system? The answer should reference the training records for each user, consistent with the competency requirements in AS9100 Section 7.2. If the quoting tool requires the estimator to interpret AI-generated recommendations, the training should cover what the recommendations mean, how to evaluate them, and when to override them.

How do you validate the system's outputs? The answer is ongoing comparison of AI recommendations against actual results. For a quoting tool, that means tracking quoted cost versus actual cost for jobs where the AI's recommendations were used. For a quality prediction tool, tracking predicted defects against actual inspection results. This validation data becomes part of the management review input under Section 9.3.
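The comparison itself is simple arithmetic. A sketch of a validation summary for a quoting tool, with made-up numbers, whose output would feed the Section 9.3 management review as trend data:

```python
def validation_summary(jobs):
    """Compare AI-recommended setup hours to actuals for closed jobs.

    `jobs` is a list of (recommended_hours, actual_hours) pairs.
    """
    errors = [abs(rec - act) / act for rec, act in jobs]
    return {
        "jobs_reviewed": len(jobs),
        "mean_abs_error_pct": round(100 * sum(errors) / len(errors), 1),
        "worst_error_pct": round(100 * max(errors), 1),
    }

# Hypothetical closed jobs: (AI-recommended hours, actual hours)
summary = validation_summary([(4.5, 4.2), (6.0, 6.5), (3.0, 3.1)])
# summary -> {"jobs_reviewed": 3, "mean_abs_error_pct": 6.0,
#             "worst_error_pct": 7.7}
```

Tracked over time, a drifting error percentage is the signal that the model needs retraining, and the retrain then goes through the version-control record described earlier in this article.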

What happens when the system is unavailable? The fallback process must be documented. If the AI quoting tool goes down, the estimator reverts to the manual process. That manual process must still be documented and maintained, which it already is in most shops because the AI tool was built on top of an existing workflow.

ITAR-Specific Considerations

For shops handling ITAR-controlled work, the AI system's data architecture requires specific controls that many AI vendors cannot provide.

All data must reside on servers physically located within the United States, operated by US persons. Cloud providers that offer GovCloud or ITAR-compliant regions satisfy the infrastructure requirement, but the AI application layer must also enforce data residency. A model trained on ITAR data cannot be deployed on shared infrastructure where non-US-person administrators have access to the underlying data.

The Technology Control Plan must be updated to address the AI system. The TCP already defines how the facility controls access to ITAR technical data. The AI tool is a new access point for that data, and the TCP must document how access is controlled, who has access, and what technical data the system can access.

Encryption requirements apply to data in transit and at rest. The AI system's communication with ERP systems, file servers, and user interfaces must use encryption that meets the facility's existing security requirements.

Starting the Right Way

Aerospace shops that adopt AI successfully do so by starting with applications that carry low compliance risk and high operational value. Quoting is the natural entry point. It sits outside the production quality record, delivers measurable speed and accuracy improvements, and gives the quality team time to develop the documentation and training framework that will be needed for higher-risk applications later.

The second application typically involves quality data analysis or supplier performance tracking, both of which operate as analytical tools within existing QMS processes. By the time the shop considers AI for production-adjacent applications like scheduling optimization or process monitoring, the compliance architecture is established and the audit team has already evaluated the approach.

The shops that struggle are the ones that deploy a general-purpose AI tool without considering the compliance layer, then discover during an audit that the system's outputs are not traceable, its logic is not documented, and its data handling does not meet ITAR requirements. That discovery usually results in the tool being shut down until it can be brought into compliance, which often means rebuilding it from the ground up.

Building the compliance architecture into the tool from day one costs less and takes less time than retrofitting it after an auditor raises a finding. For aerospace manufacturers, the compliance layer is where AI adoption either succeeds or stalls. Getting it right at the start is the only path that works.

See how AI fits within your aerospace compliance framework

We build AI tools for aerospace manufacturers with AS9100, NADCAP, and ITAR requirements built into the architecture from the start.

Talk to Our Team