Understanding Mycotoxin Reference Materials

Reliable mycotoxin testing depends on measurement systems that deliver accurate, comparable, and defensible results. This requirement arises not only from analytical considerations but also from the regulatory, commercial, and accreditation environments in which food and feed laboratories operate. Quantitative data influence market access, compliance decisions, and risk management; as such, they must be supported by traceable calibration and quality control processes rather than by isolated analytical measurements.

Reference materials (RMs) fulfill this role by providing stable and well-characterized measurement points that anchor calibration, method validation, and performance monitoring. Without such materials, quantitative results would lack the traceability and comparability needed for regulatory reporting, interlaboratory assessment, or accredited conformity evaluation.


Types of mycotoxin reference materials

Different analytical objectives require different types of reference materials, and the choice of format has direct implications for calibration, recovery studies, and routine quality control. Neat materials represent the simplest form: a purified toxin delivered as a solid. These are often used for gravimetric preparation of calibration or spiking solutions in LC-MS/MS or HPLC workflows, offering well-defined concentration levels and straightforward traceability through mass and purity.
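To make that traceability chain concrete, the sketch below shows the arithmetic behind a gravimetrically prepared stock solution from a neat solid. All values (mass, purity, volume) are hypothetical and chosen only for illustration.

```python
# Illustrative sketch: assigning the concentration of a stock solution
# prepared gravimetrically from a neat (solid) mycotoxin standard.
# All numbers below are hypothetical, not taken from any certificate.

mass_weighed_mg = 10.12       # weighed mass of the neat solid, mg
purity_mass_fraction = 0.985  # assigned purity (mass fraction), e.g. from qNMR
volume_ml = 100.0             # final volume of the volumetric flask, mL

# Concentration traceable through mass and purity: c = (m * P) / V
conc_ug_per_ml = (mass_weighed_mg * 1000.0 * purity_mass_fraction) / volume_ml

print(f"Stock concentration: {conc_ug_per_ml:.1f} µg/mL")
# -> Stock concentration: 99.7 µg/mL
```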

Gravimetrically prepared solutions derived from purified materials provide an additional layer of convenience and reproducibility. Because the concentration is assigned at production, laboratories can use such solutions directly for instrument calibration, method validation, and control of long-term analytical performance without the added uncertainty of in-house preparation.

Matrix-based reference materials occupy a complementary role. Prepared from naturally contaminated commodities such as maize, wheat, barley, oats, peanuts, or spices, they reflect authentic sample complexity and include endogenous components, co-contaminants, and matrix-dependent effects. This makes them indispensable for assessing extraction efficiency, recovery, matrix effects, and method selectivity. In many cases, the analytical challenge stems not from quantifying the pure analyte but from isolating and measuring it in the presence of interfering substances. Matrix materials therefore provide insights that cannot be obtained from neat or solution-based standards alone.
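As a minimal illustration of what matrix materials allow a laboratory to quantify, the sketch below computes an apparent recovery and a signal suppression/enhancement figure. The function names and all numerical values are hypothetical.

```python
# Illustrative sketch of two common figures of merit assessed with
# matrix-based materials. All values are hypothetical.

def recovery_percent(measured_spiked, measured_blank, spike_level):
    """Apparent recovery from a spiked or naturally contaminated matrix."""
    return 100.0 * (measured_spiked - measured_blank) / spike_level

def matrix_effect_percent(area_matrix_spike, area_solvent_standard):
    """Signal suppression/enhancement: post-extraction spike vs. solvent
    standard at the same concentration (100 % = no matrix effect)."""
    return 100.0 * area_matrix_spike / area_solvent_standard

print(recovery_percent(18.6, 1.2, 20.0))       # -> 87.0 (% recovery)
print(matrix_effect_percent(84_500, 112_000))  # -> 75.44... (% signal suppression)
```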

A further category includes isotope-labeled internal standards, which act as metrological anchors in LC-MS/MS workflows. When added before extraction, they track the analyte through all analytical steps and compensate for recovery losses and ion suppression or enhancement. Although not reference materials in the traditional sense, they are central to modern quantitative mycotoxin analysis and complement both neat and matrix materials.
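A minimal sketch of the underlying arithmetic follows, assuming a relative response factor established during calibration; all peak areas and concentrations are hypothetical.

```python
# Minimal sketch of quantification with an isotope-labeled internal
# standard (stable isotope dilution). Because the label is added before
# extraction, losses and ion suppression affect analyte and internal
# standard alike and largely cancel in the area ratio. Numbers are hypothetical.

c_is = 50.0   # internal standard concentration, µg/kg equivalent in the sample
rrf = 1.02    # relative response factor from calibration (ratio of slopes)

area_analyte = 41_200  # peak area of the native analyte
area_is = 39_800       # peak area of the labeled internal standard

# c_analyte = (A_analyte / A_IS) * c_IS / RRF
c_analyte = (area_analyte / area_is) * c_is / rrf
print(f"{c_analyte:.1f} µg/kg")  # -> 50.7 µg/kg
```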


Production and characterization principles

The production of mycotoxin reference materials involves distinct pathways depending on whether neat materials or matrix materials are intended. Neat materials typically rely on fungal biosynthesis in controlled fermentation systems, followed by extraction, solvent partitioning, and chromatographic purification to isolate the target mycotoxin. The purified toxin may then be released as a crystalline solid or gravimetrically dissolved to produce solutions suitable for LC-MS/MS or HPLC calibration.

Matrix materials require a different approach. Naturally contaminated commodities are selected, screened, and processed to ensure that the analyte distribution reflects authentic sample conditions. Controlled milling, blending, and homogenization steps are used to achieve consistent particle-size distribution and to minimize sub-sampling errors. Because matrix materials contain endogenous components and potential co-contaminants, they provide a realistic test of extraction performance, recovery, and matrix effects.

Characterization encompasses several analytical and metrological elements. Identity and purity confirmation for neat materials often relies on orthogonal techniques such as MS and NMR, while concentration verification for solutions typically involves chromatographic quantification. Quantitative NMR (qNMR) has gained particular importance for purity determination because it provides SI-traceable mass fraction values with high accuracy and minimal method bias. However, qNMR is also one of the most resource-intensive characterization techniques: depending on the analyte, on the order of 10–30 mg of purified solid may be required to obtain sufficient signal for a high-quality purity assignment. Despite the cost and complexity of producing such purified mycotoxins, and the limited number of laboratories capable of performing qNMR at the necessary metrological level, the technique is regarded as a gold standard for purity determination in reference material production.
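For orientation, the standard qNMR purity relationship can be expressed compactly in code. The sketch below is generic, and the inputs are hypothetical (a deoxynivalenol-like analyte measured against a maleic acid calibrant); it is not a protocol drawn from this text.

```python
# Standard qNMR purity relationship (illustrative, hypothetical inputs):
# the analyte mass fraction follows from signal integrals (I), numbers of
# contributing nuclei (N), molar masses (M), weighed masses (m), and the
# purity of the internal calibrant:
#   P_a = (I_a / I_cal) * (N_cal / N_a) * (M_a / M_cal) * (m_cal / m_a) * P_cal

def qnmr_purity(I_a, I_cal, N_a, N_cal, M_a, M_cal, m_a, m_cal, P_cal):
    return (I_a / I_cal) * (N_cal / N_a) * (M_a / M_cal) * (m_cal / m_a) * P_cal

# Hypothetical example: deoxynivalenol (M ~ 296.32 g/mol, one olefinic proton
# integrated) against maleic acid (M ~ 116.07 g/mol, two olefinic protons).
purity = qnmr_purity(I_a=0.380, I_cal=1.000, N_a=1, N_cal=2,
                     M_a=296.32, M_cal=116.07,
                     m_a=20.0, m_cal=10.0, P_cal=0.9999)
print(f"Assigned purity: {purity:.3f}")  # -> Assigned purity: 0.970
```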

Homogeneity testing evaluates whether analyte levels are consistent across multiple units and within sub-samples, while stability studies examine how assigned values evolve under defined storage and transport conditions. These assessments are essential for establishing the validity of assigned values throughout the material’s intended lifetime. Taken together, these procedures ensure that reference materials behave predictably and support defensible quantitative measurement in both routine and accredited laboratory environments.
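As an illustration of the homogeneity step, the sketch below estimates a between-unit standard deviation from replicate measurements of several packaged units via a one-way ANOVA, in the spirit of ISO Guide 35. The data are hypothetical.

```python
# Minimal homogeneity sketch: one-way ANOVA over replicate measurements of
# several packaged units, estimating the between-unit standard deviation
# s_bb (in the spirit of ISO Guide 35). All data are hypothetical.

units = [  # replicate results (e.g. µg/kg) for 5 randomly selected units
    [10.2, 10.4], [10.1, 10.3], [10.6, 10.5], [10.0, 10.2], [10.3, 10.3],
]

n = len(units[0])  # replicates per unit
grand = sum(sum(u) for u in units) / (len(units) * n)
means = [sum(u) / n for u in units]

ms_among = n * sum((m - grand) ** 2 for m in means) / (len(units) - 1)
ms_within = sum((x - m) ** 2 for u, m in zip(units, means) for x in u) / (
    len(units) * (n - 1))

# Between-unit variance component; clipped at zero if within-unit scatter dominates.
s_bb = max(0.0, (ms_among - ms_within) / n) ** 0.5
print(f"s_bb = {s_bb:.3f}")  # -> s_bb = 0.147
```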


Certified Reference Materials and traceability

Certified Reference Materials (CRMs) constitute the most rigorously defined class of reference materials. Their assigned values are determined using validated and traceable measurement procedures, accompanied by stated measurement uncertainties, and linked to SI units via a documented calibration hierarchy. This enables laboratories to confirm calibration systems, validate internal standards, and operate measurement processes that can withstand regulatory, accreditation, and conformity assessment scrutiny.

The distinction between CRMs and standard RMs is primarily metrological. While standard RMs provide known property values for calibration, method evaluation, and quality control, they may not include a full uncertainty budget or explicit traceability to SI units. CRMs, by contrast, provide both: a value that is traceable and a quantified uncertainty that defines the range within which the true value lies with a specified confidence level. This combination allows CRMs to support measurement uncertainty estimation, interlaboratory comparability, and the maintenance of defensible calibration hierarchies.
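One common way this combination is used in practice is an acceptance check of a laboratory result against a certified value: the observed difference is compared with its expanded combined uncertainty. The sketch below shows this criterion with hypothetical numbers.

```python
# Sketch of a common acceptance check when analysing a CRM: the difference
# between the laboratory mean and the certified value is compared with its
# expanded combined uncertainty (k = 2, roughly 95 % confidence).
# All values are hypothetical.

x_lab, u_lab = 4.62, 0.15  # lab mean and standard uncertainty, µg/kg
x_crm, u_crm = 4.80, 0.10  # certified value and standard uncertainty, µg/kg

delta = abs(x_lab - x_crm)
u_delta = (u_lab**2 + u_crm**2) ** 0.5
passes = delta <= 2 * u_delta

print(f"|d| = {delta:.2f}, U(d) = {2 * u_delta:.2f}, pass = {passes}")
# -> |d| = 0.18, U(d) = 0.36, pass = True
```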

In practice, laboratories use CRMs to verify or adjust internal quality standards, benchmark method performance, and validate new analytical procedures. Standard RMs remain essential for routine calibration and ongoing quality control tasks, but CRMs occupy the top tier of the measurement hierarchy, anchoring laboratory systems to internationally recognized reference frameworks.


The competence of reference material producers

ISO 17034 defines the competence requirements for organizations involved in the production of reference materials. It establishes how candidate materials are planned, characterized, value-assigned, tested for homogeneity and stability, documented, and released. The standard does not prescribe analytical methods, but it specifies how the underlying metrological processes must be controlled so that end users can rely on the assigned values and supporting documentation.

ISO 17034 is closely linked to ISO/IEC 17025, which governs the competence of testing laboratories. While 17034 focuses on the production of reference materials, value assignment and stability or homogeneity measurements often require 17025-accredited analytical procedures. The combination of both standards ensures that the metrological steps behind a CRM – such as purity determination, concentration verification, and stability assessment – are traceable, validated, and associated with quantified uncertainties.

In practice, ISO 17034 implementations frequently employ high-end characterization techniques such as qNMR as part of the value assignment process. Although technically demanding and costly, qNMR is considered the gold standard for purity determination because it enables SI-traceable and low-bias measurement of mass fractions. Its use within ISO 17034 workflows underscores the objective of achieving the highest possible level of analytical certainty and product integrity, even if only a limited number of laboratories can perform such measurements at the required metrological level.

By aligning production, characterization, documentation, and release with ISO 17034, CRMs provide laboratories with materials whose assigned values are defensible, traceable, and accompanied by stated uncertainties. This level of rigor enables laboratories in regulated or accredited environments to maintain calibration hierarchies, validate internal quality standards, and demonstrate analytical competence during audits, proficiency testing, and regulatory inspections.


Certificates and regulatory context

The certificate accompanying an RM or CRM serves as the central technical document for its correct use. Beyond listing the assigned value, it provides the critical metadata that define the material’s metrological quality and intended function. Elements such as measurement uncertainty (for CRMs), traceability statements, validity periods, storage conditions, evidence of homogeneity and stability testing, and safety information guide how the material can be integrated into calibration, validation, and ongoing quality control workflows. In accredited environments, laboratories must be able to reference and interpret these details to maintain defensible measurement systems.
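As an illustration of how these certificate elements might be captured in a laboratory's quality system, the sketch below defines a hypothetical record structure. The field names are illustrative only and do not follow any standard schema.

```python
# Hypothetical sketch of certificate metadata a laboratory might record in
# its quality system. Field names are illustrative, not a standard schema.
from dataclasses import dataclass
from datetime import date

@dataclass
class CertificateRecord:
    material_id: str                     # producer's catalogue/lot identifier
    analyte: str
    assigned_value: float                # e.g. mass fraction in µg/kg
    expanded_uncertainty: float | None   # stated for CRMs; may be absent for RMs
    coverage_factor: float | None        # typically k = 2 (~95 %)
    traceability: str                    # e.g. basis of the SI link
    expiry: date                         # end of the certified validity period
    storage: str                         # required storage conditions

crm = CertificateRecord(
    material_id="LOT-2024-0001", analyte="aflatoxin B1",
    assigned_value=8.3, expanded_uncertainty=0.6, coverage_factor=2.0,
    traceability="SI via gravimetric preparation and qNMR purity",
    expiry=date(2027, 6, 30), storage="-20 °C, protect from light",
)
```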

The certificate also reveals differences in quality levels between material types. CRMs, for example, include declared uncertainties and documented traceability to SI units, while standard RMs typically provide value assignments without full uncertainty budgets or traceability chains. The presence or absence of such information allows auditors, regulators, and laboratory personnel to determine whether a material is suitable for calibration, for validation or verification work, or for routine internal quality control. As a result, certificates not only document compliance but also support decision-making in laboratory quality management.

From a regulatory perspective, reference materials play a structural role in ensuring that analytical results are comparable across jurisdictions and over time. Maximum levels, action limits, and compliance thresholds for major mycotoxins create environments in which quantitative accuracy is essential. Accreditation standards such as ISO/IEC 17025 require validated and traceable measurement procedures, and proficiency testing relies on comparable datasets. In this setting, high-quality RMs function not merely as chemical substances, but as components of the broader conformity assessment system that links analytical measurements to regulatory decisions.

Taken together, reference materials provide the infrastructure that connects laboratory instrumentation with regulatory, commercial, and public safety outcomes. Their proper use ensures that mycotoxin data are defensible, comparable, and meaningful across laboratories, markets, and time, attributes that are indispensable for food and feed safety.