GET /api/policy-optimization/online/jobs/{job_id}/results
Fetch the best prompt and optimization history after a job completes. For MIPRO runs, the response also includes canonical lever and sensor outputs so you can reproduce exactly which prompt lever version won.
job_id (string, required)
Policy optimization job ID.
curl -X GET "https://api.usesynth.ai/api/policy-optimization/online/jobs/{job_id}/results" \
  -H "Authorization: Bearer $SYNTH_API_KEY"
{
  "job_id": "po_abc123",
  "status": "completed",
  "best_reward": 0.94,
  "best_prompt": {
    "messages": [
      {"role": "system", "content": "You are a helpful assistant..."}
    ],
    "reward": 0.94
  },
  "lever_summary": {
    "prompt_lever_id": "mipro.prompt.sys_abc",
    "candidate_lever_versions": {
      "baseline": 1,
      "candidate_9b12": 4
    },
    "best_candidate_id": "candidate_9b12",
    "selected_candidate_id": "candidate_9b12",
    "baseline_candidate_id": "baseline",
    "lever_count": 4,
    "mutation_count": 3,
    "latest_version": 4
  },
  "lever_versions": {
    "mipro.prompt.sys_abc": 4
  },
  "best_lever_version": 4,
  "sensor_frames": [
    {
      "frame_id": "frame_01hzk...",
      "created_at": "2026-02-12T18:44:00Z",
      "sensor_count": 3,
      "sensor_kinds": ["rollout", "reward", "resource"],
      "trace_ids": ["trace_abc"],
      "lever_versions": {
        "mipro.prompt.sys_abc": 4
      }
    }
  ],
  "history": [
    {"generation": 1, "best_reward": 0.72},
    {"generation": 2, "best_reward": 0.85},
    {"generation": 3, "best_reward": 0.94}
  ]
}
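
As a usage sketch (not part of the reference): assuming a JOB_ID shell variable and jq installed, the snippet below fetches the results and prints the winning system prompt once the job reports completed. The field paths mirror the example response above.

# Print the best system prompt once the job has completed (no output otherwise)
curl -s "https://api.usesynth.ai/api/policy-optimization/online/jobs/$JOB_ID/results" \
  -H "Authorization: Bearer $SYNTH_API_KEY" \
  | jq -r 'select(.status == "completed") | .best_prompt.messages[0].content'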

Reproducibility Fields

  • lever_versions + best_lever_version let you pin the exact winning prompt lever version.
  • sensor_frames summarize reward/rollout/resource telemetry that informed candidate selection.
  • lever_summary.candidate_lever_versions shows version assignment across baseline and evolved candidates.
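
A minimal sketch of recording these fields for reproducibility, under the same assumptions as above (JOB_ID shell variable, jq available); the selected paths follow the example response:

# Capture the pinned lever versions alongside the winning candidate id
curl -s "https://api.usesynth.ai/api/policy-optimization/online/jobs/$JOB_ID/results" \
  -H "Authorization: Bearer $SYNTH_API_KEY" \
  | jq '{best_lever_version, lever_versions, best_candidate_id: .lever_summary.best_candidate_id}'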