Careers

Build the future of industrial AI.

Join a team solving real problems for the world's most complex industries.

We're looking for people who are excited about applying AI to hard, real-world problems. Our team works directly with engineers at chemical plants, refineries, and manufacturing facilities — shipping software that makes a tangible difference in how they work every day.

Open Roles

Forward Deployed Engineer (Software)

Open Application · Full-time

Technical Challenges

We digitize P&ID drawings — the foundational engineering documents for refineries, chemical plants, and pharmaceutical facilities — into structured, simulation-ready data. The areas below are what you'd work on directly.

Depth in one or two is enough. Be honest about where you're strong, where you'd ramp, and where you're curious.

01

Air-gapped, on-premises deployment

TL;DR: Design how the whole stack installs and upgrades inside a customer firewall with no phone-home.

Process customers won't send proprietary engineering data to the public internet — the product has to live entirely inside their network. That means models, identity, licensing, telemetry, and updates all have to work offline, shipped as signed artifacts instead of registry pulls, and turnkey enough that a customer admin can install, upgrade, and troubleshoot without our engineers on site.
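The "signed artifacts instead of registry pulls" requirement boils down to verifying an install bundle entirely offline. A minimal sketch of that kind of check, assuming a manifest of SHA-256 digests that ships alongside the bundle (in practice the manifest itself would carry a public-key signature; all names here are illustrative):

```python
import hashlib
import pathlib


def verify_bundle(bundle_dir: pathlib.Path, manifest: dict) -> list[str]:
    """Check every artifact in an offline install bundle against its
    expected SHA-256 digest from a (separately signed) manifest.
    Returns the names of artifacts that are missing or tampered with."""
    failures = []
    for name, expected in manifest["sha256"].items():
        path = bundle_dir / name
        if not path.exists():
            failures.append(name)
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest != expected:
            failures.append(name)
    return failures
```

Because verification needs no network access, a customer admin can run it behind the firewall before an install or upgrade proceeds.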

02

Hybrid neural + symbolic extraction pipeline

TL;DR: Turn a P&ID into a validated process graph using CV, VLMs, and classical topology — stages that cooperate rather than compete.

Neither pure neural nor pure rule-based extraction is good enough on dense engineering drawings: models hallucinate, rules are brittle. The pipeline interleaves them — neural detection for symbols and lines, symbolic topology for connectivity, and scoped model calls only where they're the right tool — so each stage validates the one before it and the final graph is something an engineer can trust.
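One way to picture "each stage validates the one before it": the symbolic topology stage only admits connections whose endpoints correspond to actually detected symbols, so a hallucinated line from the neural stage gets rejected rather than propagated. A toy sketch under that assumption (the symbol classes and tags are made up for illustration):

```python
from dataclasses import dataclass


@dataclass
class Symbol:
    tag: str
    kind: str  # e.g. "valve", "pump" -- illustrative classes


def build_graph(symbols: list[Symbol], lines: list[tuple[str, str]]):
    """Symbolic stage: turn neural detections into an adjacency map,
    rejecting any line whose endpoints don't match a detected symbol --
    the topology stage validating the detection stage."""
    tags = {s.tag for s in symbols}
    graph: dict[str, set[str]] = {s.tag: set() for s in symbols}
    rejected = []
    for a, b in lines:
        if a in tags and b in tags:
            graph[a].add(b)
            graph[b].add(a)
        else:
            rejected.append((a, b))  # likely hallucinated or mis-linked
    return graph, rejected
```

The rejected list is itself useful: it tells you where a scoped model call or a human review pass is the right next tool.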

03

Agentic HAZOP orchestration

TL;DR: A multi-agent safety analysis running over the extracted graph, with bounded and verifiable reasoning at every step.

HAZOP is high-stakes — a missed safeguard is a safety incident, and a fabricated one is worse. The design uses a deterministic orchestrator with parallel analyst agents and an auditor stage that verifies every claim against the actual graph, so findings that cite nodes, edges, or tags not present in the source get filtered out by pointer validation rather than trust.
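The auditor stage's pointer validation is mechanically simple, which is the point — findings are trusted because every citation is checked against the graph, not because the model said so. A minimal sketch, assuming findings arrive as dicts with a list of cited tags (the field names are illustrative):

```python
def audit_findings(findings: list[dict], graph: dict):
    """Auditor stage: keep only findings whose cited tags actually
    exist in the extracted process graph (pointer validation).
    `graph` is an adjacency dict keyed by equipment tag."""
    valid_tags = set(graph)
    kept, dropped = [], []
    for finding in findings:
        if all(tag in valid_tags for tag in finding["cites"]):
            kept.append(finding)
        else:
            dropped.append(finding)  # cites a node not in the source
    return kept, dropped
```

A fabricated safeguard that cites a tag absent from the drawing lands in `dropped` deterministically, with no judgment call required.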

04

LLMs and VLMs as black-box services, made reliable

TL;DR: Make stochastic models produce trustworthy structured output against a known ground truth — the extracted graph.

Engineers won't accept outputs they can't verify, and model providers change behavior under us. The integration layer pins structured output via schema validation, grounds claims in the graph via pointer validation, bounds tool-use loops with wall-clock and call budgets, falls back to algorithmic paths when the model fails, and records runs for deterministic replay — the same scaffolding has to work against cloud models and whatever runs behind a customer firewall.
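The budget-and-fallback pattern described above can be sketched in a few lines: retry a stochastic call until its output validates, under both a call budget and a wall-clock budget, then hand off to a deterministic path. All three callables here are caller-supplied stand-ins, not a real provider API:

```python
import time


def call_with_budget(model_call, validate, fallback,
                     max_calls: int = 3, max_seconds: float = 10.0):
    """Retry a stochastic model call until its output passes validation
    (e.g. schema + pointer checks), bounded by a call budget and a
    wall-clock budget; fall back to an algorithmic path otherwise.
    Returns (result, source) so callers can log which path produced it."""
    deadline = time.monotonic() + max_seconds
    for _ in range(max_calls):
        if time.monotonic() > deadline:
            break
        out = model_call()
        if validate(out):
            return out, "model"
    return fallback(), "fallback"
```

Recording `(result, source)` per run is also what makes deterministic replay possible later: a logged run can be re-driven without touching the model at all.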

05

Interactive correction UX over dense drawings

TL;DR: Give engineers pixel-accurate tools to fix what the model got wrong.

Automated extraction is never 100%, and the gap between "mostly right" and "fully trusted" is the product. The UX has to layer editable overlays onto high-resolution PDFs, snap to meaningful geometry, stage edits into an undoable changeset, and stay responsive across drawings with hundreds of symbols and thousands of line segments — with hit detection, virtualization, and multi-user editing as the frontier.
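"Stage edits into an undoable changeset" usually means modeling each correction as an invertible operation pair. A minimal sketch of that idea (the class and its API are illustrative, not the product's actual editor):

```python
class ChangeSet:
    """Stage correction edits as (do, undo) pairs so an engineer can
    undo/redo freely before anything is committed to the graph."""

    def __init__(self):
        self._done = []    # applied edits, oldest first
        self._undone = []  # redo stack

    def apply(self, do, undo):
        do()
        self._done.append((do, undo))
        self._undone.clear()  # a new edit invalidates redo history

    def undo(self):
        do, undo = self._done.pop()
        undo()
        self._undone.append((do, undo))

    def redo(self):
        do, undo = self._undone.pop()
        do()
        self._done.append((do, undo))
```

Keeping edits as data rather than mutating the graph directly is also what makes multi-user editing tractable later: staged changesets can be diffed and merged before commit.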

06

Process graph as the product's spine

TL;DR: Everything downstream — HAZOP, reporting, digital twin with thermodynamics — runs on the extracted graph.

The graph isn't a side-effect of extraction, it's the asset customers buy. It needs industry-classed nodes, fast traversal for upstream/downstream and safeguard/isolation queries, off-page connector resolution across a full facility, and lossless round-trip to industry formats so an engineer can take it into their simulator and back — with thermodynamic properties attached so each piece of equipment carries real physical meaning.
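An upstream query of the kind mentioned above is a plain graph traversal once the graph exists as data. A sketch assuming a directed adjacency dict (node tag to its set of downstream tags — the representation is illustrative):

```python
from collections import deque


def upstream(graph: dict[str, set[str]], start: str) -> set[str]:
    """Everything upstream of `start` in a directed process graph
    (adjacency dict: tag -> set of downstream tags)."""
    # Invert the edges once, then breadth-first search backwards.
    inbound: dict[str, set[str]] = {n: set() for n in graph}
    for src, dsts in graph.items():
        for dst in dsts:
            inbound.setdefault(dst, set()).add(src)
    seen: set[str] = set()
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for src in inbound.get(node, ()):
            if src not in seen:
                seen.add(src)
                queue.append(src)
    return seen
```

Safeguard and isolation queries are variants of the same traversal with edge filters; the hard parts in production are scale, off-page connector resolution, and keeping traversal fast across a full facility.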

07

Forward-deployed customer work

TL;DR: A lot of the job lives outside the codebase — with process engineers, control engineers, and plant IT.

Process plants have dialects: their own symbol conventions, tag schemas, and export formats. The forward-deployed job is running pilots, translating domain feedback into rules or schema changes, and building extension points the customer's team can own — less "translate a spec into tickets," more "sit with an engineer and decide whether the fix is config, code, or a conversation with the model."