Integrations
Connect to what you already have.
Digitillis meets your infrastructure where it is. Native industrial protocol support, enterprise system connectors, and open APIs — without replacing what works.
Integration coverage
Six integration categories
From the plant floor to the boardroom — every layer of your operation can connect to Digitillis.
Industrial Protocols
Native support for the protocols your equipment already speaks. No translation layers, no proprietary gateways required.
- OPC-UA: Subscriber mode — reads tags from historians, PLCs, and SCADA systems
- MQTT: Subscriber mode — consumes sensor telemetry from edge brokers
- Modbus TCP/RTU: Reads coil and register data from legacy PLCs and field devices
- Siemens S7: Direct read access to S7-300/400/1200/1500 series controllers
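Register-level protocols like Modbus deliver raw 16-bit words rather than typed values; a 32-bit float spans two consecutive registers, and which register holds the high word varies by vendor. A minimal decoding sketch (the register values and tag interpretation are illustrative, not from any specific device):

```python
import struct

def decode_float32(registers, word_order="big"):
    """Decode two consecutive 16-bit Modbus registers into an IEEE-754 float.

    Many field devices expose 32-bit values across two holding registers;
    the word order (which register carries the high word) is vendor-specific.
    """
    hi, lo = registers if word_order == "big" else reversed(registers)
    return struct.unpack(">f", struct.pack(">HH", hi, lo))[0]

# 0x4248 0x0000 is the big-endian register encoding of 50.0
# (e.g. a temperature tag reading 50.0 degrees)
print(decode_float32([0x4248, 0x0000]))
```

The same pair read from a device with swapped word order decodes with `word_order="little"`; connectors typically make this a per-device setting rather than guessing.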
Enterprise Systems
Bidirectional integration with the enterprise systems you depend on — pulling production context in, pushing work orders and alerts out.
- SAP: RFC connector for master data, production orders, and work order creation
- CMMS (Maximo, ServiceMax): REST connector for work order push and maintenance history pull
- MES: Production schedule and job status synchronisation via REST
- ERP: Order, inventory, and cost data via configurable REST adapters
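Work order push is the most common write-back path. A hedged sketch of assembling a work-order payload for a REST CMMS connector; every field name below is illustrative, not a documented Digitillis or CMMS schema:

```python
from datetime import datetime, timezone

def build_work_order(asset_id, description, priority="medium", source_alert=None):
    """Assemble a work-order payload for push to a CMMS REST endpoint.

    Field names are hypothetical; a real connector maps them onto the
    target system's own object structure (e.g. Maximo's work order fields).
    """
    payload = {
        "assetId": asset_id,
        "description": description,
        "priority": priority,
        "requestedDate": datetime.now(timezone.utc).isoformat(),
        "origin": "digitillis",
    }
    if source_alert:
        # Keep a trace back to the triggering alert for audit purposes
        payload["sourceAlertId"] = source_alert
    return payload

wo = build_work_order("PUMP-0042", "Bearing RUL below 14 days",
                      priority="high", source_alert="alrt-9f3")
```

Keeping the alert ID on the work order is what lets maintenance history flow back the other way: the CMMS record and the platform alert stay linked.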
Data Sources
Connect directly to your existing data stores and historians without moving data to a vendor cloud.
- OSIsoft PI / Aveva: PI Web API or AF SDK connector for historical and live tag data
- Historian (generic): OPC-HDA or OPC-UA Historical Access for any standards-compliant historian
- SQL / PostgreSQL: Direct table or view reads for structured operational data
- CSV / Excel upload: One-time or scheduled data ingestion for batch operational data
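Direct table or view reads are plain SQL. A minimal sketch using Python's built-in sqlite3 as a stand-in for a PostgreSQL connection; the table and columns are hypothetical:

```python
import sqlite3

# In-memory SQLite stands in for a production PostgreSQL connection;
# the connector issues ordinary SELECTs against a table or view you expose.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE batch_quality (batch_id TEXT, yield_pct REAL)")
conn.executemany("INSERT INTO batch_quality VALUES (?, ?)",
                 [("B-101", 97.2), ("B-102", 94.8)])

# Read structured operational data, e.g. batches below a yield threshold
rows = conn.execute(
    "SELECT batch_id, yield_pct FROM batch_quality WHERE yield_pct < 95"
).fetchall()
print(rows)
```

Exposing a read-only view rather than the underlying tables is the usual pattern here: it keeps the integration on a least-privilege footing and decouples it from schema changes.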
REST APIs
Every Digitillis capability is accessible via authenticated REST endpoints — enabling integration with any BI, workflow, or custom tooling.
- Prediction APIs: Real-time predictions for RUL, anomaly, quality, demand, and more
- Alert APIs: Query, acknowledge, and subscribe to platform alerts
- Equipment APIs: Asset hierarchy, sensor metadata, and health summaries
- Webhook support: Push alerts and prediction outputs to any HTTP endpoint
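Downstream tooling typically consumes prediction responses as JSON. A sketch of parsing one into the fields a workflow tool needs; the response shape shown is an assumption for illustration, not the documented Digitillis schema:

```python
import json

def parse_prediction(body: str) -> dict:
    """Parse a prediction-API response body into the fields a downstream
    workflow usually acts on. The JSON field names here are illustrative."""
    doc = json.loads(body)
    return {
        "asset": doc["assetId"],
        "kind": doc["predictionType"],   # e.g. "rul", "anomaly", "quality"
        "value": doc["value"],
        # For RUL-style predictions, a value below the threshold is actionable
        "breach": doc["value"] < doc.get("threshold", float("inf")),
    }

sample = ('{"assetId": "PUMP-0042", "predictionType": "rul", '
          '"value": 11.5, "threshold": 14}')
result = parse_prediction(sample)
```

The same parsing applies whether the payload arrives by polling the prediction API or by webhook push; webhooks just invert who initiates the HTTP call.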
Streaming & Events
For teams that already use event streaming infrastructure, Digitillis publishes validated prediction and alert events directly to your topics.
- Apache Kafka: Publishes prediction events, alerts, and sensor summaries to configurable topics
- Azure Event Hubs: Compatible with the Kafka protocol — use existing Event Hubs topics
- AWS Kinesis: Configurable outbound stream for prediction outputs
- Schema validation: All outbound events are validated against a declared schema before publication
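The schema-validation step can be pictured as a gate on the outbound path: an event only reaches the topic if it satisfies the declared schema. A simplified sketch, where a hand-rolled required-field/type check stands in for a real mechanism such as JSON Schema or an Avro schema registry, and the field set is hypothetical:

```python
# Declared schema for outbound prediction events (field set is illustrative)
EVENT_SCHEMA = {
    "eventType": str,
    "assetId": str,
    "timestamp": str,
    "value": float,
}

def validate_event(event: dict) -> list:
    """Return a list of schema violations; an empty list means safe to publish."""
    errors = []
    for field, expected in EVENT_SCHEMA.items():
        if field not in event:
            errors.append(f"missing field: {field}")
        elif not isinstance(event[field], expected):
            errors.append(f"{field}: expected {expected.__name__}")
    return errors

good = {"eventType": "prediction", "assetId": "PUMP-0042",
        "timestamp": "2024-05-01T06:00:00Z", "value": 11.5}
print(validate_event(good))                      # empty list: publishable
print(validate_event({"assetId": "PUMP-0042"}))  # missing-field violations
```

Validating before publication, rather than leaving it to consumers, is what keeps a shared topic trustworthy: malformed events are rejected at the producer instead of breaking every subscriber.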
Export & Reporting
Bring Digitillis intelligence into the tools your teams already use — without custom development.
- Excel / CSV export: Download prediction history, alert logs, and KPI summaries
- PDF briefings: Automated executive briefing documents at daily, weekly, or monthly cadence
- Power BI connector: Live OData feed for Power BI dashboards
- Grafana data source: Prometheus-compatible metrics endpoint for operational dashboards
Integration philosophy
Additive, not disruptive.
Every integration is designed to layer intelligence on top of what you already have. Your MES, your CMMS, your historian, your protocols — unchanged. Digitillis reads from them and, where configured, writes back actionable outputs like work orders and alerts.
- Digitillis never replaces your existing systems — it sits alongside them
- All integrations use read-only access by default; write-back (work orders, alerts) is opt-in per tier
- Cloud deployments: sensor data is encrypted in transit and processed in your isolated tenant infrastructure. On-premise deployments: all processing stays within your network with zero data egress to Digitillis
- Authentication via API keys, OAuth 2.0, or service accounts depending on system type
- Custom connectors available for proprietary or legacy systems on the Enterprise tier
What a typical integration looks like
1. Connect edge gateway
Deploy the Digitillis edge gateway in your industrial DMZ (Purdue Level 3.5). Configure OPC-UA/MQTT subscriptions to your historian or broker.
2. Map equipment hierarchy
Import your asset list from your CMMS or MES, or configure manually. Tag sensors to assets.
3. Enable predictions
Activate the agents relevant to your use case. Pre-trained models run immediately on your live sensor stream.
4. Configure outputs
Set alert thresholds, work order auto-creation rules, and reporting cadence. Connect to your existing notification channels.
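Taken together, the four steps reduce to a small amount of declarative configuration. A hypothetical sketch of what that configuration could look like; every endpoint, tag, agent name, and threshold below is invented for illustration:

```python
# Hypothetical gateway configuration covering the four steps; a real
# deployment would manage this through the platform rather than by hand.
gateway_config = {
    "sources": [  # step 1: subscriptions into the edge gateway
        {"protocol": "opcua",
         "endpoint": "opc.tcp://historian.plant.local:4840",
         "tags": ["Line1/Pump42/VibX", "Line1/Pump42/BearingTemp"]},
        {"protocol": "mqtt",
         "broker": "mqtt://edge-broker.plant.local:1883",
         "topics": ["plant/line1/+/telemetry"]},
    ],
    "assets": {  # step 2: sensor-to-asset mapping
        "PUMP-0042": ["Line1/Pump42/VibX", "Line1/Pump42/BearingTemp"],
    },
    "agents": ["rul", "anomaly"],  # step 3: enabled prediction agents
    "outputs": {  # step 4: alerting and work-order rules
        "alert_threshold_days": 14,
        "auto_work_order": True,
        "notify": ["teams-channel:maintenance"],
    },
}

def sensors_without_asset(cfg):
    """Sanity check for step 2: every subscribed tag should map to an asset."""
    mapped = {t for tags in cfg["assets"].values() for t in tags}
    subscribed = {t for s in cfg["sources"] for t in s.get("tags", [])}
    return sorted(subscribed - mapped)
```

A check like `sensors_without_asset` catches the most common onboarding gap: tags that are being ingested but feed no asset, and therefore no prediction.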
Have specific integration requirements?
Tell us what systems and protocols you are running. We will confirm compatibility and outline what a deployment looks like for your specific environment.
Discuss Your Requirements