
The Bloomfield Team

How to Connect Systems That Were Never Meant to Talk to Each Other

A typical mid-size manufacturer runs between four and seven software systems. An ERP for job tracking and invoicing. A CAD/CAM system for programming. A quality management system, maybe a standalone app, maybe a module bolted onto the ERP. A scheduling tool, which might be a spreadsheet. An accounting package. A CRM, possibly. And email, which ends up holding more operational data than any of the purpose-built tools.

None of these systems were designed to work together. They were purchased at different times, from different vendors, to solve different problems. The ERP vendor did not anticipate that you would need to pull data from your quality system into your quoting process. The CAD/CAM vendor did not build an integration with your scheduling spreadsheet. The accounting software does not know your ERP exists.

The result is an operation where information moves between systems the same way it has for decades: a person looks at one screen, remembers or writes down what they see, walks to another screen, and types it in again.

The Real Cost of Disconnected Systems

A production planner at a 60-person precision machining shop in Michigan described her Monday morning this way. She opens the ERP to see what jobs are due this week. She opens a shared spreadsheet to check machine availability. She walks to the shop floor to verify that the schedule on the whiteboard matches what the ERP says. She checks her email for material delivery confirmations from three suppliers. She opens the quality system to see if any jobs from last week have open nonconformances that will affect this week's schedule.

By the time she has a complete picture of the week ahead, two hours have passed. She has touched five systems and made at least three trips to the floor. The information she compiled is accurate as of 9:00 AM Monday. By 11:00 AM, something will change and parts of it will be wrong.

Multiply this across every role that needs information from more than one system. Estimators cross-referencing the ERP with supplier emails. Quality managers comparing inspection data with job records. Sales managers checking order status in the ERP and delivery promises in their CRM. The total labor hours spent moving information between disconnected systems in a typical 50 to 100 person shop ranges from 800 to 2,000 hours per year. At a blended labor rate of $35 per hour, that is $28,000 to $70,000 annually in data transfer labor alone.
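The cost range above is straightforward to reproduce, and easy to adapt to a specific shop's hours and blended rate. A minimal sketch using the article's own figures:

```python
# Annual hours spent moving data between systems by hand (range from the article)
hours_low, hours_high = 800, 2000
# Blended labor rate in dollars per hour
rate = 35

cost_low = hours_low * rate    # 28,000
cost_high = hours_high * rate  # 70,000
print(f"${cost_low:,} to ${cost_high:,} per year in data transfer labor")
```

Substituting your own headcount and rate into the same two lines gives a first-pass estimate for your operation.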

The labor cost is measurable. The decision cost is harder to quantify but larger. When pulling the full picture takes too long, decisions get made on incomplete information, and every one of those decisions carries an accuracy penalty. Quotes take longer and contain more errors. Schedules miss dependencies that one system knows about but another does not. Quality issues recur because the connection between a nonconformance and the job parameters that caused it lives in two different databases with no link between them.

Why Traditional Integration Fails

The obvious answer is to integrate the systems. Buy middleware. Hire a consultant. Build API connections. This approach has a long track record in manufacturing, and most of that track record is expensive failure.

A 2023 report from Gartner found that 65% of ERP integration projects in mid-market manufacturing exceed their original budget by at least 50%. The average timeline for a full integration project stretches from 9 to 18 months. Many stall entirely after the vendor discovers that the ERP's database schema does not support the data mapping they assumed during the scoping phase.

The core problem is that traditional integration tries to make systems speak the same language natively. It requires mapping every field in System A to a corresponding field in System B, handling every edge case where the data formats differ, and maintaining that mapping as both systems get updated over time. For systems that were designed in different decades by different teams with different data models, this is an enormous engineering effort.

A stamping shop in Indiana spent $180,000 on an ERP-to-quality-system integration project in 2022. The project took fourteen months. When it was finished, the integration handled about 70% of the data flow between the two systems. The remaining 30% still required manual entry because the field mappings could not account for the shop's custom quality codes. Two years later, the ERP vendor released a major update that broke three of the integration's data pipelines. Fixing them cost another $40,000.

This pattern repeats across the industry. The integration works until it does not. The maintenance burden grows over time. And the original promise of seamless data flow between systems remains partially fulfilled at best.

A Different Approach

The alternative is to stop trying to make the systems talk to each other directly and instead build a layer that reads from all of them.

Think of it as an operational data layer. It connects to the ERP database as a read source. It connects to the quality system the same way. It ingests the scheduling spreadsheet. It reads supplier emails. It pulls in whatever structured or unstructured data the operation generates. Rather than forcing System A and System B to share a common language, the data layer translates from each system into a common structure that supports the queries people actually need to run.
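The "common structure" idea can be made concrete with a small sketch. Everything here is illustrative: the record fields, the ERP column names, and the quality-system record shape are assumptions, not any vendor's actual schema. The point is the pattern: one translation function per source, all producing the same shape.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class JobRecord:
    """Common structure: one job, regardless of which system it came from."""
    job_number: str
    part_number: str
    due_date: str              # ISO 8601 date string
    status: str
    open_nonconformances: int = 0
    material_eta: Optional[str] = None

def from_erp_row(row: dict) -> JobRecord:
    """Translate one ERP row (hypothetical column names) into the common shape."""
    return JobRecord(
        job_number=row["JOB_NO"],
        part_number=row["PART_NO"],
        due_date=row["DUE_DT"],
        status=row["STATUS"].strip().lower(),
    )

def merge_quality(job: JobRecord, qms_records: list[dict]) -> JobRecord:
    """Attach the open-nonconformance count from the quality system to the same job."""
    job.open_nonconformances = sum(
        1 for r in qms_records
        if r["job_number"] == job.job_number and r["state"] == "open"
    )
    return job
```

Each new source system gets its own small translator; nothing in System A ever needs to know System B exists.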

This approach has three advantages over traditional integration.

First, it does not require changing any existing system. The ERP keeps running exactly as it does today. The quality system stays in place. No data migration, no field mapping, no vendor coordination. The data layer reads from these systems. It does not write to them, modify them, or depend on their APIs being stable.

Second, it can handle unstructured data. Traditional integration only works with structured databases. An operational data layer can also ingest PDFs (inspection reports, supplier certifications, job travelers), emails (supplier quotes, customer communications, internal notes), and even images (photos of setups, whiteboard schedules). This is where AI becomes relevant. Modern language models can read a PDF inspection report, extract the relevant measurements, and structure them alongside the corresponding job record from the ERP.
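What "extract and structure" looks like downstream of the PDF step can be sketched briefly. Here plain pattern matching stands in for the extraction; in practice a language model would handle the messy variety of real report layouts. The line format below is an assumption for illustration, not any inspection system's actual output.

```python
import re

# Assumed line format: "Feature name: measured (nominal expected)"
MEASUREMENT = re.compile(
    r"(?P<feature>[\w\- ]+?):\s*(?P<actual>\d+\.\d+)\s*"
    r"\(nominal\s+(?P<nominal>\d+\.\d+)\)"
)

def extract_measurements(report_text: str) -> list[dict]:
    """Turn inspection-report text (already pulled from a PDF) into rows
    that can sit alongside the corresponding ERP job record."""
    rows = []
    for m in MEASUREMENT.finditer(report_text):
        actual = float(m.group("actual"))
        nominal = float(m.group("nominal"))
        rows.append({
            "feature": m.group("feature").strip(),
            "actual": actual,
            "nominal": nominal,
            "deviation": round(actual - nominal, 4),
        })
    return rows
```

Once measurements are rows instead of pixels in a PDF, they can be joined to job records by part number and queried like any other data.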

Third, it is incremental. You do not have to connect everything at once. Start with the two systems whose disconnection causes the most pain. For most shops, that is the ERP and the quoting process. Or the ERP and the scheduling function. Connect those first, prove the value, then expand. A traditional integration project is all-or-nothing. An operational data layer can deliver value in the first month and grow from there.

How It Works in Practice

Step one is mapping the data sources. This means listing every system, spreadsheet, shared drive, and email thread where operational data lives. In most shops, this exercise alone is revealing. The owner thinks there are four systems. The mapping reveals seven, plus a dozen spreadsheets and two shared drives full of PDFs that contain data people reference daily.

Step two is identifying the high-value connections. Where does a person currently serve as the bridge between two systems? Where does someone look at Screen A, then walk to Screen B, then make a decision based on information from both? Those bridges are the integration targets. Each one represents labor hours, decision latency, and error risk.

Step three is building the connectors. For databases with standard schemas (most ERP systems), this involves setting up a read connection that pulls data on a regular schedule or in real time. For unstructured sources, it involves AI-powered extraction: reading PDFs, parsing emails, and converting that unstructured information into structured data. For spreadsheets, it involves either direct file reading or, where the spreadsheet is the source of truth for a process, building a lightweight interface that captures the same data in a queryable format.
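A database connector in this model is deliberately simple: a scheduled, read-only query. The sketch below uses SQLite as a stand-in for the ERP database, and the table and column names are assumptions; the important detail is the read-only connection, which guarantees the data layer can never corrupt the source system.

```python
import sqlite3

def pull_open_jobs(erp_db_path: str) -> list[dict]:
    """Scheduled read-only pull of open jobs from the ERP database.
    SQLite stands in for the real database; 'jobs' and its columns
    are illustrative names."""
    # mode=ro opens the database read-only: the data layer never writes back.
    conn = sqlite3.connect(f"file:{erp_db_path}?mode=ro", uri=True)
    conn.row_factory = sqlite3.Row
    try:
        rows = conn.execute(
            "SELECT job_no, part_no, due_date, status "
            "FROM jobs WHERE status != 'closed'"
        ).fetchall()
        return [dict(r) for r in rows]
    finally:
        conn.close()
```

Running this on a schedule (or against a replica) keeps the data layer current without touching the ERP's own workflows.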

Step four is building the query interface. This is the tool that people in the operation actually use. It might be a dashboard that shows production status from the ERP alongside quality data from the QMS alongside material delivery status from supplier emails. It might be a search tool that lets an estimator type a part number and get the full history from every system that part has touched. The form depends on the workflow. The principle is consistent: bring the data to the person, structured around the decision they need to make.
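The part-number search described above reduces to a simple aggregation once every source has been translated into a common shape. A minimal sketch, assuming each source's records already carry shared `part_no` and `date` keys (an assumption produced by the translation step, not something the raw systems provide):

```python
def part_history(part_no: str, sources: dict[str, list[dict]]) -> list[dict]:
    """Return every record touching a part number, newest first,
    tagged with the system it came from."""
    hits = [
        {"source": name, **record}
        for name, records in sources.items()
        for record in records
        if record.get("part_no") == part_no
    ]
    return sorted(hits, key=lambda h: h["date"], reverse=True)
```

An estimator typing one part number gets ERP jobs, quality events, and supplier emails back as a single timeline, which is the "full history from every system" described above.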

What Changes When Systems Are Connected

The production planner who spent two hours assembling her Monday morning picture now opens one screen. Job status from the ERP, machine availability from the scheduling system, material delivery confirmations from supplier emails, and open quality holds from the QMS are all visible in a single view. Her Monday morning takes fifteen minutes instead of two hours.

The estimator who used to search the ERP for similar past jobs, then check emails for material pricing, then walk to the floor to ask about setup issues now gets all of that information surfaced automatically when they open an RFQ. Their quote cycle drops from days to hours.

The quality manager who used to manually cross-reference nonconformance reports with job parameters from the ERP now sees the correlations automatically. When three different jobs with the same material and tolerance band all show the same dimensional issue, the system flags it. A pattern that used to take weeks to notice becomes visible in real time.

None of this requires replacing any existing system. The ERP stays. The quality system stays. The accounting package stays. What changes is that the data they contain, which used to be trapped inside each one, can now flow across system boundaries to the people who need it.

The custom tools we build at Bloomfield do exactly this. They sit on top of your existing systems, read from all of them, and deliver the connected picture your team has been assembling manually for years.

Map your disconnected systems

We will walk through your current tool stack and identify where connected data can save time and improve decision quality.