KProvEngine

V1 (scope-locked)

KProvEngine turns each run into structured, reviewable evidence so teams can trust the process, not just the final output.

I built it as a capstone around provenance, release trust, reproducibility, and reviewer-friendly evidence.

  • Deterministic runs
  • Explicit human review
  • Audit-grade artifacts
  • SBOM / provenance / SLSA-aligned signals

Quick value

  • Explains what happened in a run without digging through logs.
  • Produces evidence files a reviewer can inspect quickly.
  • Makes provenance and release trust visible in plain artifacts.
  • Keeps local execution and CI behavior aligned.
  • Shows practical supply-chain-aware engineering discipline.

What it solves

In real engineering environments, the risk is often not in the final output itself. It is in everything around it: inconsistent execution, weak traceability, and poor visibility into how an artifact was produced.

  • Runtime behavior varies across environments
  • Artifact lineage is hard to reconstruct
  • Release trust signals are weak
  • Human- and AI-assisted workflows are hard to review after the fact

What it does

KProvEngine provides a deterministic pipeline with stable output contracts, explicit stages, and evidence files that make each run inspectable from start to finish.

  • Deterministic CLI behavior with stable output contracts
  • A clear pipeline: normalize -> parse -> extract -> render
  • Per-run evidence artifacts such as run_summary.json, manifest.json, provenance.json, and human_review.json
  • Guardrails for identity, artifact hygiene, and Python version policy
  • CI parity and release discipline workflows
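The normalize -> parse -> extract -> render pipeline can be pictured as a chain of small deterministic functions, each with a stable contract. This is an illustrative sketch only: the function names, the "key=value" payload, and the sorted output are assumptions for the example, not KProvEngine's actual API.

```python
# Illustrative four-stage pipeline sketch. The payload format ("key=value"
# pairs in a string) and all names here are hypothetical, chosen to show
# how deterministic stages with stable contracts compose.
from dataclasses import dataclass

@dataclass(frozen=True)
class Record:
    key: str
    value: str

def normalize(raw: str) -> str:
    # Deterministic cleanup: collapse whitespace, stable casing.
    return " ".join(raw.split()).lower()

def parse(text: str) -> list[tuple[str, str]]:
    # Extract "key=value" pairs; input order is preserved.
    return [tuple(pair.split("=", 1)) for pair in text.split() if "=" in pair]

def extract(pairs: list[tuple[str, str]]) -> list[Record]:
    return [Record(k, v) for k, v in pairs]

def render(records: list[Record]) -> str:
    # Stable output contract: sorted by key, newline-delimited.
    return "\n".join(f"{r.key}: {r.value}"
                     for r in sorted(records, key=lambda r: r.key))

def run(raw: str) -> str:
    return render(extract(parse(normalize(raw))))
```

Because every stage is a pure function with a defined output shape, the same input always yields byte-identical output, which is what makes per-run evidence meaningful to compare.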

Architecture

High-level view of the pipeline, reviewer checkpoints, and the evidence each run leaves behind.

KProvEngine architecture diagram showing workflow stages, review checkpoints, and evidence outputs.

The diagram places the workflow in the middle and the evidence surfaces around it, mirroring how the project is meant to be reviewed.

What it produces

  • run_summary.json for execution summary
  • manifest.json for produced artifacts
  • provenance.json for workflow lineage
  • human_review.json for reviewer checkpoints
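To make the evidence layout concrete, here is a minimal sketch of how a run directory might be populated. The field names, the `write_evidence` helper, and the run-ID value are hypothetical; KProvEngine's real schemas are defined by its output contracts, and only the file names above come from the project.

```python
# Hypothetical minimal evidence writer. File names match the artifacts
# listed above; everything else (field names, helper name) is illustrative.
import hashlib
import json
from pathlib import Path

def write_evidence(run_dir: Path, artifacts: dict[str, bytes]) -> None:
    run_dir.mkdir(parents=True, exist_ok=True)
    # manifest.json: one entry per produced artifact, content-addressed.
    manifest = {
        name: {"sha256": hashlib.sha256(data).hexdigest(), "bytes": len(data)}
        for name, data in sorted(artifacts.items())
    }
    # run_summary.json: a small, stable execution summary.
    summary = {"run_id": "example-run", "artifact_count": len(artifacts),
               "status": "ok"}
    (run_dir / "manifest.json").write_text(
        json.dumps(manifest, indent=2, sort_keys=True))
    (run_dir / "run_summary.json").write_text(
        json.dumps(summary, indent=2, sort_keys=True))
```

Sorting keys and hashing content keeps the files diff-stable across runs, so a reviewer can spot a real change at a glance.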

Tech stack

  • Python and CLI packaging for local-first execution
  • JSON output contracts for machine-readable review surfaces
  • GitHub Actions for CI and release checks
  • tox and make preflight for local gate parity
  • MIT license for the open-source core

Technical highlights

  • Deterministic exit codes and stable output contracts
  • Run-scoped evidence layout and review surfaces
  • Multi-version CI with coverage gates
  • Release discipline around stable public interfaces
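"Deterministic exit codes" means a fixed mapping from run outcomes to process exit statuses that callers and CI can script against. The sketch below shows the pattern; the specific code values and the `classify` helper are assumptions for illustration, not KProvEngine's documented contract.

```python
# Illustrative exit-code contract. The numeric values and the result-dict
# keys are hypothetical; the point is that the mapping is fixed and total.
EXIT_OK = 0
EXIT_CONTRACT_VIOLATION = 2
EXIT_REVIEW_REQUIRED = 3

def classify(result: dict) -> int:
    # Review requests take precedence: a human checkpoint blocks success.
    if result.get("needs_review"):
        return EXIT_REVIEW_REQUIRED
    # Anything that failed its output contract is a distinct, stable code.
    if not result.get("contract_ok", False):
        return EXIT_CONTRACT_VIOLATION
    return EXIT_OK
```

A CI job can then branch on the exit status alone, without parsing logs.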

Why this matters

This project shows how I think about engineering quality: reproducible behavior, clear contracts, reviewable evidence, and release trust that holds up outside a demo.

  • Reproducibility is treated as a product feature.
  • Traceability is visible in the artifacts, not implied in docs.
  • Reviewer-friendly evidence is built into the workflow itself.
  • Supply-chain-aware release discipline is part of the engineering model.

Open source model

KProvEngine is published as an MIT-licensed open-source core with maintainer-led governance.

Changes flow through issues, pull requests, and local gates so public contract surfaces stay stable as the project evolves.

Outcome

This capstone demonstrates production-minded software engineering: systems that can show what they did, under defined constraints, with reviewer-ready evidence.

For full professional context, see the Resume or the Recruiter Page.