When an EU ETS verifier reviews a monitoring plan or an annual emissions report, one of the things they check is the metering uncertainty basis. They need to be satisfied that the operator's claimed uncertainty is credible, traceable, and consistent with the applicable MRR tier requirement.
A single number on a spreadsheet does not provide that satisfaction.
More than a headline number
Stating that a flowmeter has an uncertainty of ±1.2% is a claim. The verifier's job is to assess whether that claim is supported by evidence. They need to understand where the number came from, what inputs were used, what assumptions were made, and whether the methodology is consistent with recognised standards.
In practice, this means the uncertainty report needs to show its working, not just its conclusion.
The uncertainty budget
The core of any uncertainty report is the budget table. This should show every contributor to the combined uncertainty, with:
- The name and description of the error source
- Whether it is Type A (evaluated statistically from repeated measurements) or Type B (evaluated by other means, such as datasheets or certificates)
- The source of the uncertainty value (datasheet, calibration certificate, lab data, operational data)
- The sensitivity coefficient and how it was derived
- The standard uncertainty at k=1
- The contribution to the combined uncertainty (sensitivity coefficient multiplied by standard uncertainty)
- The percentage of total variance
- The degrees of freedom for the Welch-Satterthwaite calculation
The budget should also show the root-sum-square (RSS) combination, the effective degrees of freedom, the coverage factor, and the final expanded uncertainty.
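As a minimal sketch, the combination steps can be written in a few lines. The budget rows below are hypothetical values chosen for illustration; real entries come from the instrument datasheets and certificates discussed later.

```python
import math

# Hypothetical budget rows: (source, standard uncertainty at k=1 in %,
# sensitivity coefficient, degrees of freedom). Type B values taken
# from datasheets are conventionally assigned infinite degrees of freedom.
budget = [
    ("DP transmitter accuracy", 0.035, 0.5, math.inf),
    ("DP calibration",          0.020, 0.5, 24),
    ("Discharge coefficient",   0.250, 1.0, math.inf),
    ("Expansibility factor",    0.020, 1.0, math.inf),
    ("Bore diameter",           0.030, 1.1, math.inf),
    ("Fluid density",           0.100, 0.5, 12),
]

# Each row's contribution is c_i * u_i; combine by root-sum-square.
contributions = [c * u for _, u, c, _ in budget]
u_c = math.sqrt(sum(x * x for x in contributions))

# Percentage of total variance per row (should sum to 100%).
variance_share = [100 * (x / u_c) ** 2 for x in contributions]

# Welch-Satterthwaite effective degrees of freedom; infinite-dof rows
# contribute nothing to the denominator.
nu_eff = u_c**4 / sum(
    x**4 / nu for x, (_, _, _, nu) in zip(contributions, budget)
    if nu != math.inf
)

# Expanded uncertainty at ~95% confidence. A fixed k=2 is common when
# nu_eff is large; otherwise k is read from the Student-t distribution.
k = 2.0
U = k * u_c
print(f"u_c = {u_c:.3f}%  nu_eff = {nu_eff:.0f}  U(k={k}) = {U:.3f}%")
```

Printing the variance shares alongside the table makes it easy for a verifier to see which two or three sources dominate the result.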
Complete input record
The verifier needs to see every input value that was used in the calculation. This includes:
- Pipe and bore dimensions with their measurement uncertainties
- DP transmitter specification: URL, span, operating DP, accuracy, drift, calibration uncertainty, and the basis for each
- Process conditions: line pressure, temperature, and their uncertainties
- Fluid properties and the source of those properties
- Compensation configuration
Without this, the verifier cannot check whether the inputs are reasonable or whether they match the installed equipment.
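As an illustration, the input record can be captured as structured data rather than scattered across a spreadsheet, which makes the installed-equipment cross-check straightforward. The field names and values below are hypothetical, not a standard schema:

```python
from dataclasses import dataclass

@dataclass
class MeterInputs:
    """Sketch of the input record a verifier would expect to see."""
    pipe_id_mm: float            # pipe internal diameter
    pipe_id_u_mm: float          # ...and its measurement uncertainty
    bore_mm: float               # orifice bore diameter
    bore_u_mm: float
    dp_url_mbar: float           # DP transmitter upper range limit
    dp_span_mbar: float          # calibrated span
    dp_operating_mbar: float     # normal operating DP
    dp_accuracy_pct_span: float  # datasheet accuracy, basis recorded separately
    line_pressure_bar: float
    line_temperature_c: float
    density_kg_m3: float
    density_source: str          # e.g. "lab analysis" or "design basis"
    pt_compensation: bool        # live pressure/temperature compensation?

# Illustrative values only; a real record mirrors the installed equipment.
inputs = MeterInputs(
    pipe_id_mm=152.4, pipe_id_u_mm=0.1,
    bore_mm=102.0, bore_u_mm=0.05,
    dp_url_mbar=250.0, dp_span_mbar=200.0, dp_operating_mbar=50.0,
    dp_accuracy_pct_span=0.04,
    line_pressure_bar=8.0, line_temperature_c=25.0,
    density_kg_m3=6.0, density_source="design basis",
    pt_compensation=True,
)
```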
Calculation workings
Best practice is to show the step-by-step derivation for each budget row. This means showing how the raw input was converted to a standard uncertainty, how the sensitivity coefficient was calculated from the ISO 5167 flow equation, and how the contribution was computed.
This is not about making the report longer. It is about making it auditable. A verifier who can follow each step from input to contribution can form an independent judgement about whether the result is credible. A verifier who sees only the final table has to take the entire calculation on trust.
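As one example of showing the working, the analytic sensitivity coefficients can be verified numerically against the ISO 5167 mass-flow equation. The operating point below is assumed for illustration; a real report would use the installed meter's dimensions and process conditions.

```python
import math

def q_m(C, eps, d, D, dp, rho):
    """ISO 5167 orifice mass-flow equation (SI units)."""
    beta = d / D
    return (C / math.sqrt(1 - beta**4)) * eps * (math.pi / 4) * d**2 \
        * math.sqrt(2 * dp * rho)

# Hypothetical operating point (assumed values for illustration).
args = dict(C=0.606, eps=0.997, d=0.102, D=0.1524, dp=5_000.0, rho=6.0)

def rel_sensitivity(name, h=1e-6):
    """Dimensionless sensitivity (d ln q / d ln x) by central difference."""
    hi, lo = dict(args), dict(args)
    hi[name] *= 1 + h
    lo[name] *= 1 - h
    return (q_m(**hi) - q_m(**lo)) / (2 * h * q_m(**args))

beta4 = (args["d"] / args["D"]) ** 4
print(rel_sensitivity("dp"))   # 0.5: the square root halves DP errors
print(rel_sensitivity("rho"))  # 0.5, for the same reason
print(rel_sensitivity("d"))    # 2 + 2*beta4/(1 - beta4), from d^2/sqrt(1-beta^4)
print(rel_sensitivity("D"))    # -2*beta4/(1 - beta4)
```

Agreement between the analytic coefficient in the budget table and a numerical check like this is exactly the kind of working a verifier can follow independently.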
Instrument traceability
The uncertainty values in the budget are ultimately derived from specific instruments with specific calibration histories. A complete report should record:
- Make, model, and serial number of each instrument (DP transmitter, pressure transmitter, temperature transmitter, densitometer)
- Calibration certificate numbers and dates
- Orifice plate dimensional inspection certificate
- Flow computer model and software version
- Preventive maintenance work order references
- Laboratory accreditation and analysis method for any lab-derived values
This creates a direct traceability chain: the uncertainty value in the budget can be traced to a specific calibration certificate for a specific instrument with a specific serial number.
Operational evidence
A design-point uncertainty analysis tells you what the meter should achieve. But the verifier may also want to know what it actually achieved across the reporting period.
Operational evidence can include daily flow profile data back-tested against the turndown curve, laboratory density samples compared with the design-basis value (for bias assessment), and speed-of-sound diagnostic data for ultrasonic meters.
This kind of evidence is not always required, but when it is available, it significantly strengthens the uncertainty claim because it demonstrates real-world performance rather than theoretical specification.
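One way to sketch a flow-profile back-test, assuming daily flow totals and a turndown curve are available. A flow-weighted mean of the expanded uncertainty is one common convention (weighting variances is an alternative); the daily figures below are invented for illustration:

```python
# Hypothetical daily totals (tonnes) and the expanded uncertainty (%)
# read from the meter's turndown curve at each day's average flow.
daily = [
    (120.0, 1.05),  # near design point: low uncertainty
    (118.0, 1.06),
    (45.0,  1.80),  # turned down: DP error dominates, uncertainty rises
    (12.0,  3.40),  # deep turndown
]

total = sum(q for q, _ in daily)

# Flow-weighted mean uncertainty: low-flow days carry less of the annual
# quantity, so their higher uncertainty is weighted down accordingly.
u_weighted = sum(q * u for q, u in daily) / total
print(f"flow-weighted U = {u_weighted:.2f}%")
```

Run over a full reporting period, this shows whether the meter's real operating envelope kept the annual figure within the tier threshold, not just the design point.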
Documented exclusions
If any potential error sources were excluded from the budget, the report should document that explicitly and give the technical justification. For example, if the 4-20 mA analogue-to-digital conversion error was excluded because the meter uses digital HART communication, that should be stated with the reasoning.
Unexplained exclusions undermine confidence in the result. Documented exclusions with clear justification demonstrate engineering judgement.
Verifier checklist
When reviewing an uncertainty report, a verifier will typically check:
- Is the methodology consistent with ISO 5168 / ISO 5167 / GUM?
- Are all significant error sources included in the budget?
- Are the input values traceable to documented sources?
- Are the sensitivity coefficients correctly derived?
- Is the DP transmitter error correctly converted between % URL, % span, and % reading?
- Has the coverage factor been correctly determined (fixed k=2 or Welch-Satterthwaite)?
- Does the result comply with the applicable MRR tier?
- If flow profile data is available, does the flow-weighted uncertainty also comply?
- Are any exclusions from the budget documented and justified?
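The % URL / % span / % reading conversion in the checklist is a frequent source of error, because a transmitter rarely operates at full scale. A minimal sketch with assumed transmitter figures:

```python
# Hypothetical transmitter: URL 250 mbar, calibrated span 0-200 mbar,
# operating at 50 mbar. Datasheet accuracy 0.04% of span; annual drift
# 0.10% of URL. All figures are assumptions for illustration.
URL, span, dp_op = 250.0, 200.0, 50.0
accuracy_pct_span = 0.04
drift_pct_url = 0.10

# Convert each figure to % of reading at the operating DP: the absolute
# error is fixed, so its relative size grows as the DP falls.
accuracy_pct_reading = accuracy_pct_span * span / dp_op  # 0.16% of reading
drift_pct_reading = drift_pct_url * URL / dp_op          # 0.50% of reading

# Flow varies with the square root of DP, so a DP error in % reading
# contributes half that amount to flow in % reading.
accuracy_flow = 0.5 * accuracy_pct_reading               # 0.08% of flow
drift_flow = 0.5 * drift_pct_reading                     # 0.25% of flow
print(accuracy_flow, drift_flow)
```

Note how the drift term, quoted innocuously as 0.10% of URL, becomes the larger contributor at turndown; this is the arithmetic a verifier will want to see spelled out.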
A report that answers all of these questions clearly, with traceable evidence, will pass verification much more smoothly than one that provides only a headline number and a summary table.
See what a complete uncertainty report looks like
The MeterProof example report shows all of these elements: uncertainty budget, calculation workings, complete input record, instrument register, documented exclusions, and flow profile back-test.