
DPP API Integration — Connecting Your Systems to Digital Product Passport Infrastructure

Most manufacturers already have the data needed for a digital product passport. The problem is that data lives in three different systems, formatted differently in each one, maintained by teams who don't talk to each other. An API integration solves that problem by letting your existing systems feed DPP generation automatically — no manual re-entry, no compliance bottlenecks at the point of shipment.

Why Manual DPP Entry Doesn't Scale

When companies start thinking about the process of creating a digital product passport, the first instinct is often to assign someone to fill in a form for each product. That works for a pilot with five SKUs. It falls apart at five hundred, and it's completely unworkable at fifty thousand.

Consider what a mid-sized manufacturer actually produces. Multiple product lines, each with dozens of variants. Each variant might have different material compositions depending on the supplier batch used that month. Carbon footprint figures that update quarterly as your energy mix changes. Certifications with expiry dates. Supplier declarations that need refreshing annually. Maintaining all of that manually is not a compliance strategy — it's a data entry job that will eventually produce wrong data, and wrong data in a regulatory instrument creates liability, not protection.

API integration is the only approach that keeps DPP data accurate at scale. When your ERP creates a new product record, an API call creates the DPP. When your PLM updates a material composition, an API call updates the passport. The data carrier — the QR code on the physical product — always resolves to current information because the source of truth systems are directly connected.

Understanding the DPP API Architecture

The DPP-Tool API is RESTful, uses JSON throughout, and follows standard HTTP conventions. Before building your integration, it helps to understand the core resource types the API exposes.

The Products Resource

Products are the top-level entity. A product record in the API represents a model — all units of a given design. The product resource contains the static attributes that don't change unit to unit: product name, GTIN, manufacturer details, material composition, sustainability metrics, certifications, and end-of-life instructions.

Creating a product via API looks straightforward:

POST /v1/products
Content-Type: application/json
Authorization: Bearer {token}

{
  "gtin": "09506000134376",
  "name": "Industrial Pump Model X-400",
  "manufacturer": {
    "name": "Acme Manufacturing GmbH",
    "country": "DE",
    "registration": "DE123456789"
  },
  "materials": [
    {"name": "Stainless Steel 316L", "percentage": 68, "recycled_content": 35},
    {"name": "PTFE", "percentage": 12, "recycled_content": 0}
  ],
  "carbon_footprint_gco2eq": 245000,
  "recyclable_fraction_percent": 92
}

The response returns the product ID and the DPP record URL — the persistent endpoint that a QR code will resolve to. Understanding this resource is the foundation for the more complex integration patterns that follow.
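The request above can be sketched in Python using only the standard library. This is illustrative only: the base URL is a placeholder, and the response fields are assumptions based on the description above.

```python
import json
import urllib.request

# Placeholder base URL for illustration — use the endpoint from your API credentials.
API_BASE = "https://api.dpp-tool.example/v1"

def build_product_request(product: dict, token: str) -> urllib.request.Request:
    """Build the POST /v1/products request with JSON body and bearer auth."""
    return urllib.request.Request(
        f"{API_BASE}/products",
        data=json.dumps(product).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

def create_product(product: dict, token: str) -> dict:
    """Send the request and return the response body, which includes
    the product ID and the persistent DPP record URL."""
    with urllib.request.urlopen(build_product_request(product, token)) as resp:
        return json.load(resp)
```

Separating request construction from sending makes the integration testable without network access — a pattern worth keeping in production code.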

Units and Batches

Below the product level, the API supports batch and unit resources for serialised products. A batch record inherits all product-level data and adds batch-specific fields: production date, facility, quality control reference, batch-specific material declarations. A unit record goes further, adding individual serial number, assembly date, and any unit-specific test results.

Not every product category will require serialisation. The ESPR regulation leaves serialisation requirements to category-specific delegated acts. Batteries and some electronics will likely require item-level DPPs. Textiles may only require model-level. Your integration architecture should support all three levels even if only one is active today.

ERP Integration Pattern

The most common integration point is the ERP system, because that's typically where product master data is created and maintained. The integration pattern has three parts: trigger, transform, and push.

Trigger

What event in your ERP should create or update a DPP? Options include: creation of a new product master record, approval of a product for sale (status change), creation of a production order for a regulated product category, or shipment of a product to an EU market. The right trigger depends on your business process — but the principle is that DPP creation should be automatic, not a manual step someone remembers to do before shipment.

ERP systems like SAP use BAdI (Business Add-Ins) or output conditions to trigger external calls. Oracle Cloud ERP has native REST API webhooks. Microsoft Dynamics 365 supports Power Automate flows. Each has its own mechanism, but the destination is the same: a call to the DPP-Tool API with the product data extracted from the ERP record.

Transform

This is where most integration projects spend most of their time. Your ERP stores data in its own format, using its own field names and code lists. The DPP API expects a standardised schema. The transformation layer maps between them.

Common mapping challenges: material codes in your ERP are internal identifiers that need translating to standard substance names or CAS numbers. Weight percentages may be stored as raw weights that need normalising to percentages. Country codes may be ISO 3166-1 alpha-2 in the DPP API but full country names in your ERP. Certification references may need enriching with expiry dates from a separate quality management system.

Building a robust transform layer upfront — with explicit mappings, validation rules, and error handling — is what separates an integration that works in production from one that works in a demo. The DPP requirements checklist is useful here because it documents the expected format for each field, making the mapping specification concrete.
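A minimal transform layer with explicit mappings, weight-to-percentage normalisation, and error collection might look like the sketch below. The ERP field names and code lists are invented for illustration — your own mapping specification will differ.

```python
# Hypothetical mapping tables — replace with your ERP's actual code lists.
COUNTRY_NAMES_TO_ISO = {"Germany": "DE", "France": "FR", "Italy": "IT"}
MATERIAL_CODES = {"MAT-0042": "Stainless Steel 316L", "MAT-0108": "PTFE"}

def transform(erp_record: dict) -> dict:
    """Map an ERP product record to the DPP payload shape, collecting
    all mapping failures instead of stopping at the first one."""
    errors = []

    country = COUNTRY_NAMES_TO_ISO.get(erp_record["country"])
    if country is None:
        errors.append(f"unmapped country: {erp_record['country']}")

    # Normalise raw weights to percentages of total product weight.
    total_weight = sum(m["weight_g"] for m in erp_record["materials"])
    materials = []
    for m in erp_record["materials"]:
        name = MATERIAL_CODES.get(m["code"])
        if name is None:
            errors.append(f"unmapped material code: {m['code']}")
            continue
        materials.append({
            "name": name,
            "percentage": round(100 * m["weight_g"] / total_weight, 1),
        })

    if errors:
        # Surface every problem at once so data owners can fix them in one pass.
        raise ValueError("; ".join(errors))

    return {
        "gtin": erp_record["ean"],
        "name": erp_record["description"],
        "manufacturer": {"country": country},
        "materials": materials,
    }
```

Collecting all errors before failing, rather than raising on the first unmapped code, shortens the feedback loop when cleaning up master data.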

Push

With the trigger and transform in place, the push is the API call itself. For new products, this is a POST to /v1/products. For updates, it's a PATCH to /v1/products/{id}. The integration should handle HTTP errors gracefully: retry on 429 (rate limit) and 503 (service unavailable), alert on 4xx errors that indicate data problems, and log all calls with their response codes for audit purposes.
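The retry behaviour described above — back off on 429 and 503, fail fast on data errors — can be sketched as a small wrapper. The `send` callable stands in for whatever HTTP client your integration uses:

```python
import time

RETRYABLE = {429, 503}  # rate limited / service unavailable

def push_with_retry(send, payload, max_attempts=5, base_delay=1.0):
    """Call send(payload), which returns an HTTP status code.
    Retry with exponential backoff on 429/503; raise on other errors
    so data problems surface immediately instead of being retried."""
    for attempt in range(max_attempts):
        status = send(payload)
        if status < 300:
            return status
        if status in RETRYABLE and attempt < max_attempts - 1:
            time.sleep(base_delay * 2 ** attempt)
            continue
        raise RuntimeError(f"DPP push failed with HTTP {status}")
```

In production you would also log each call and its response code, as noted above, to preserve an audit trail.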

PLM and PIM Integration Patterns

ERPs handle transactional data well but are often weak on product lifecycle data — material compositions, design specifications, test results, sustainability attributes. That data typically lives in a PLM (Product Lifecycle Management) system. Similarly, marketing and channel data — product descriptions, images, feature lists — live in a PIM (Product Information Management) system.

A complete DPP integration often needs to pull from all three. The architecture that works in practice uses the ERP as the trigger (because it owns the product lifecycle status and knows when a product is active for sale) and calls PLM and PIM APIs to enrich the DPP payload before pushing to DPP-Tool.

This orchestration layer can be implemented as middleware using tools like MuleSoft, Azure Integration Services, or a custom microservice. For simpler setups, n8n or similar no-code automation platforms can handle the workflow without custom development. The key requirement is that the orchestration layer can handle asynchronous calls — PLM queries can take seconds, and you don't want your ERP process to block waiting for a DPP response.
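An orchestration step that queries PLM and PIM concurrently, then merges the results into one DPP payload, can be sketched with `asyncio`. The two fetch functions here are stand-ins for real API calls:

```python
import asyncio

# Stand-ins for real PLM/PIM API calls — the sleep simulates query latency.
async def plm_fetch(product_id: str) -> dict:
    await asyncio.sleep(0.01)
    return {"materials": [{"name": "Stainless Steel 316L", "percentage": 68}]}

async def pim_fetch(product_id: str) -> dict:
    await asyncio.sleep(0.01)
    return {"name": "Industrial Pump Model X-400"}

async def build_dpp_payload(product_id: str, base: dict) -> dict:
    """Enrich the ERP-triggered base record with PLM and PIM data,
    querying both systems concurrently so neither blocks the other."""
    plm, pim = await asyncio.gather(plm_fetch(product_id), pim_fetch(product_id))
    return {**base, **pim, **plm}
```

Because `asyncio.gather` runs the two queries concurrently, total latency is bounded by the slower system rather than the sum of both — exactly the property you want when the ERP process must not block waiting on a DPP response.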

Batch Creation for Scale

For manufacturers with large product catalogues who need to create DPPs for existing products retroactively — not just new ones — the batch API endpoint is the practical solution.

POST /v1/products/batch
Content-Type: application/json

{
  "products": [
    { "gtin": "...", "name": "...", ... },
    { "gtin": "...", "name": "...", ... },
    ...
  ],
  "webhook_url": "https://your-system.com/webhooks/dpp-batch-complete"
}

The API processes the array asynchronously and returns a job ID immediately. When processing completes, it calls your webhook URL with a summary: total records, success count, error count, and an array of any records that failed validation with the specific error reason. Your system can then address failures without re-submitting the entire batch.

Batch sizes up to 1,000 records per call are supported. For catalogues of tens of thousands, you'll run multiple batch jobs — the rate limits and guidance on sequencing are documented on the platform features page.
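Splitting a large catalogue into batch jobs under the 1,000-record limit is a simple chunking exercise. A sketch, with the webhook URL taken from the example above:

```python
def chunk_products(products, size=1000,
                   webhook_url="https://your-system.com/webhooks/dpp-batch-complete"):
    """Yield batch payloads that stay within the per-call record limit."""
    for i in range(0, len(products), size):
        yield {
            "products": products[i:i + size],
            "webhook_url": webhook_url,
        }
```

Each yielded payload maps to one `POST /v1/products/batch` call; the webhook summaries let you resubmit only the failed records from each job.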

Webhook Notifications and Event-Driven Architecture

The API is not just a push destination — it can push back to your systems. Webhook subscriptions let DPP-Tool notify your systems when significant events occur.

The events that matter most in production: dpp.published (the passport is live and the QR code can now be applied to products), dpp.certification_expiring (a certification referenced in the DPP expires within 30 days), dpp.compliance_alert (a regulatory check found a potential issue), and batch.completed (for batch operations). Subscribing to these events means your systems can react automatically — routing a compliance alert to your quality team, or holding a production order when a DPP is not yet published.

Webhooks are authenticated using HMAC-SHA256 signatures. Every webhook payload includes a signature header that your endpoint should verify before processing the event, preventing unauthorised requests from triggering your internal workflows.
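Verifying an HMAC-SHA256 webhook signature is a few lines of standard-library Python. The exact header name carrying the signature is not specified here, so this sketch takes the signature value as an argument:

```python
import hashlib
import hmac

def verify_webhook(secret: bytes, body: bytes, signature_hex: str) -> bool:
    """Recompute the HMAC-SHA256 of the raw request body and compare it
    to the hex signature from the webhook's signature header."""
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    # compare_digest is constant-time, preventing timing attacks.
    return hmac.compare_digest(expected, signature_hex)
```

Verify against the raw request bytes before any JSON parsing — re-serialising the parsed payload can change whitespace or key order and invalidate the comparison.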

Testing Your Integration

The DPP-Tool API provides a sandbox environment that mirrors production exactly but does not generate real data carriers or trigger regulatory notifications. All integration development and testing should happen in sandbox. The sandbox uses the same authentication, the same endpoints (with a /sandbox prefix), and the same response formats.

Before going live, run through the complete DPP specification for your product category to confirm your integration is submitting all required fields. A common mistake is treating optional fields as truly optional — some fields that are technically optional in the API schema are mandatory under category-specific delegated acts. The compliance validator in the sandbox environment flags these issues before they become problems in production.

The DPP creation guide walks through the full lifecycle of a DPP from schema design to publication, which is useful context for understanding what your API integration is ultimately producing.

API Access and Getting Started

API access is available on all DPP-Tool plans, including the free tier, which supports up to 10 products and is suitable for testing your integration end-to-end before committing to a production plan. API documentation, code examples in Python, JavaScript, and PHP, and a Postman collection are all available after signup.

The pricing page covers the request limits and features available at each tier. For enterprise integrations involving ERP connections and batch volumes over 10,000 products, the enterprise plan includes integration support and custom rate limit configuration.

Frequently Asked Questions

What is a digital product passport API?

A digital product passport API is a RESTful interface that allows external systems — ERPs, PLMs, PIMs, and manufacturing execution systems — to programmatically create, update, retrieve, and publish DPP records. Rather than entering product data manually, your existing business systems push data to the DPP platform automatically whenever a product record is created or updated.

How does batch DPP creation work via API?

Batch creation endpoints accept arrays of product records in a single API call. Instead of creating DPPs one by one, you can send hundreds or thousands of records in a single request. The API processes them asynchronously, returning a job ID immediately and triggering a webhook notification when the batch completes. This is the practical approach for manufacturers generating DPPs at production scale.

What authentication method does the DPP-Tool API use?

The DPP-Tool API uses OAuth 2.0 with API key fallback for simpler integrations. For production integrations with ERP or PLM systems, OAuth 2.0 client credentials flow is recommended as it supports token refresh without manual intervention. API keys can be used for initial testing and webhook configurations.

Can a DPP API integration update existing passport records?

Yes, and this is one of the most important features for manufacturers. A product passport is not a static document — it gets updated when certifications are renewed, when repair instructions change, or when new environmental data becomes available. The API supports PUT and PATCH operations so your systems can push updates that propagate immediately to all data carriers (QR codes, RFID) without re-printing or re-labelling.

What webhook events does the DPP API support?

Webhook notifications are triggered for DPP creation, update, status change (draft to published), data carrier generation (QR code ready), compliance check completion, and batch job completion. Each event sends a JSON payload to a configured endpoint, allowing downstream systems to react automatically — for example, updating an order management system when a DPP is published and ready for shipment.

Ready to Get Started?

Create your first Digital Product Passport today.

Try DPP-Tool Free