Jobs

Check job status and browse your job history.

GET /v1/jobs/{job_id} (Free)
GET /v1/jobs (Free)

Description

Check the status of async jobs and list your job history. Every async endpoint (download, transcribe, translate, preview, fingerprint, comments) returns a job_id. Use these endpoints to poll for completion or browse past jobs.

Get Job Status

GET /v1/jobs/{job_id}

Code Examples

cURL:

curl "https://videoconduit.com/v1/jobs/a1b2c3d4-e5f6-7890-abcd-ef1234567890" \
  -H "Authorization: Bearer vc_your_api_key"

Python:

import requests

response = requests.get(
    "https://videoconduit.com/v1/jobs/a1b2c3d4-e5f6-7890-abcd-ef1234567890",
    headers={"Authorization": "Bearer vc_your_api_key"},
)
job = response.json()
print(job["status"], job.get("download_url"))

JavaScript:

const response = await fetch(
  "https://videoconduit.com/v1/jobs/a1b2c3d4-e5f6-7890-abcd-ef1234567890",
  { headers: { "Authorization": "Bearer vc_your_api_key" } }
);
const job = await response.json();
console.log(job.status, job.download_url);

PHP:

$client = new GuzzleHttp\Client();
$response = $client->get("https://videoconduit.com/v1/jobs/a1b2c3d4-e5f6-7890-abcd-ef1234567890", [
    "headers" => ["Authorization" => "Bearer vc_your_api_key"],
]);
$job = json_decode($response->getBody(), true);
echo $job["status"];

Go:

import (
    "encoding/json"
    "net/http"
)

req, _ := http.NewRequest("GET",
    "https://videoconduit.com/v1/jobs/a1b2c3d4-e5f6-7890-abcd-ef1234567890", nil)
req.Header.Set("Authorization", "Bearer vc_your_api_key")
resp, _ := http.DefaultClient.Do(req)
defer resp.Body.Close()
var job map[string]interface{}
json.NewDecoder(resp.Body).Decode(&job)

Ruby:

require "net/http"
require "json"

uri = URI("https://videoconduit.com/v1/jobs/a1b2c3d4-e5f6-7890-abcd-ef1234567890")
req = Net::HTTP::Get.new(uri)
req["Authorization"] = "Bearer vc_your_api_key"
res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) { |http| http.request(req) }
job = JSON.parse(res.body)
puts job["status"]

Response Fields

Field | Type | Description
job_id | string | Unique job identifier (UUID)
job_type | string | One of: download, transcribe, translate, preview, fingerprint, comments
status | string | One of: pending, processing, completed, failed
source_url | string | Original source URL
parameters | object | Request parameters used to create the job
credits_charged | integer | Credits deducted for the job
download_url | string? | URL of the result file (completed jobs only)
result_data | object? | Result metadata (completed jobs only)
error_message | string? | Error details (failed jobs only)
created_at | string | ISO 8601 datetime the job was created
completed_at | string? | ISO 8601 datetime the job finished
expires_at | string? | ISO 8601 datetime when download_url expires
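
As a concrete illustration, a completed download job might be shaped like this (values are illustrative only; the exact contents of parameters and result_data depend on the original request and job type):

```json
{
  "job_id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
  "job_type": "download",
  "status": "completed",
  "source_url": "https://youtube.com/watch?v=dQw4w9WgXcQ",
  "parameters": {},
  "credits_charged": 1,
  "download_url": "https://videoconduit.com/files/example",
  "result_data": {},
  "error_message": null,
  "created_at": "2025-01-15T10:00:00Z",
  "completed_at": "2025-01-15T10:00:42Z",
  "expires_at": "2025-01-16T10:00:42Z"
}
```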

Job Statuses

pending

Job created, waiting for a worker.

processing

Worker is actively processing the job.

completed

Done. download_url and result_data are available.

failed

Something went wrong. error_message has details. Credits are refunded.

Automatic Refunds

If a job fails, credits are automatically refunded to your balance.

Polling Pattern

import time
import requests

job_id = "a1b2c3d4-..."
while True:
    resp = requests.get(
        f"https://videoconduit.com/v1/jobs/{job_id}",
        headers={"Authorization": "Bearer vc_your_api_key"},
    )
    job = resp.json()
    if job["status"] in ("completed", "failed"):
        break
    time.sleep(2)  # Poll every 2 seconds
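
A fixed 2-second interval works, but backing off reduces request volume for long-running jobs. The helper below is a sketch, not an official client: poll_job and its fetch_job parameter are our own names, with the fetcher injected so the loop can be exercised without network access. Wire fetch_job to the requests.get call shown above.

```python
import time


def poll_job(fetch_job, max_wait=300, initial_delay=2.0, factor=1.5, max_delay=30.0):
    """Poll fetch_job() until the job reaches a terminal status.

    fetch_job: zero-argument callable returning the job dict, e.g. a
    wrapped GET /v1/jobs/{job_id} request. Raises TimeoutError if the
    job is still running after max_wait seconds.
    """
    delay = initial_delay
    deadline = time.monotonic() + max_wait
    while True:
        job = fetch_job()
        if job["status"] in ("completed", "failed"):
            return job
        if time.monotonic() + delay > deadline:
            raise TimeoutError("job did not finish within max_wait seconds")
        time.sleep(delay)
        delay = min(delay * factor, max_delay)  # exponential backoff, capped
```

Usage with the API would look like poll_job(lambda: requests.get(url, headers=headers).json()).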

Use Webhooks in Production

For production use, we recommend webhooks instead of polling. See the Webhooks guide.

List Jobs

GET /v1/jobs

Query Parameters

Parameter | Type | Default | Description
cursor | string | null | Pagination cursor from the previous response
limit | int | 20 | Results per page (max 100)
status | string | null | Filter by status: pending, processing, completed, failed
job_type | string | null | Filter by type: download, transcribe, translate, preview, fingerprint, comments

Code Examples

cURL:

curl "https://videoconduit.com/v1/jobs?status=completed&limit=20" \
  -H "Authorization: Bearer vc_your_api_key"

Python:

import requests

response = requests.get(
    "https://videoconduit.com/v1/jobs",
    params={"status": "completed", "limit": 20},
    headers={"Authorization": "Bearer vc_your_api_key"},
)
data = response.json()
for job in data["items"]:
    print(job["job_id"], job["status"])

JavaScript:

const response = await fetch(
  "https://videoconduit.com/v1/jobs?status=completed&limit=20",
  { headers: { "Authorization": "Bearer vc_your_api_key" } }
);
const data = await response.json();
data.items.forEach(job => console.log(job.job_id, job.status));

PHP:

$client = new GuzzleHttp\Client();
$response = $client->get("https://videoconduit.com/v1/jobs", [
    "query" => ["status" => "completed", "limit" => 20],
    "headers" => ["Authorization" => "Bearer vc_your_api_key"],
]);
$data = json_decode($response->getBody(), true);
foreach ($data["items"] as $job) {
    echo $job["job_id"] . " " . $job["status"] . "\n";
}

Go:

import (
    "encoding/json"
    "net/http"
)

req, _ := http.NewRequest("GET",
    "https://videoconduit.com/v1/jobs?status=completed&limit=20", nil)
req.Header.Set("Authorization", "Bearer vc_your_api_key")
resp, _ := http.DefaultClient.Do(req)
defer resp.Body.Close()
var data map[string]interface{}
json.NewDecoder(resp.Body).Decode(&data)

Ruby:

require "net/http"
require "json"

uri = URI("https://videoconduit.com/v1/jobs?status=completed&limit=20")
req = Net::HTTP::Get.new(uri)
req["Authorization"] = "Bearer vc_your_api_key"
res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) { |http| http.request(req) }
data = JSON.parse(res.body)
data["items"].each { |job| puts "#{job["job_id"]} #{job["status"]}" }

Response

{
  "items": [
    {
      "job_id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
      "job_type": "download",
      "status": "completed",
      "source_url": "https://youtube.com/watch?v=dQw4w9WgXcQ",
      "credits_charged": 1,
      "created_at": "2025-01-15T10:00:00Z"
    }
  ],
  "next_cursor": "eyJjcmVhdGVkX2F0IjogIjIwMjUtMDEtMTQifQ==",
  "has_more": true
}

This endpoint uses cursor-based pagination. Pass the next_cursor value from the response as the cursor parameter in your next request to fetch the next page. When has_more is false, you’ve reached the end. See Rate Limits & Pagination for details.
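
The cursor loop described above can be sketched as a small generator. Note this is illustrative, not an official SDK: iter_jobs and its fetch_page parameter are our own names, with the page fetcher injected so the generator works against any source of items/next_cursor/has_more responses.

```python
def iter_jobs(fetch_page):
    """Yield every job across all pages of GET /v1/jobs.

    fetch_page: callable taking a cursor (None for the first page) and
    returning the decoded response dict with items, next_cursor, has_more.
    """
    cursor = None
    while True:
        page = fetch_page(cursor)
        yield from page["items"]
        if not page.get("has_more"):
            break
        cursor = page["next_cursor"]  # feed back as the cursor parameter
```

With requests, fetch_page could be lambda c: requests.get("https://videoconduit.com/v1/jobs", params={"cursor": c, "limit": 100}, headers=headers).json().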
