batches

Creates, cancels, gets, or lists a batches resource.

Overview

| Property | Value |
|---|---|
| Name | batches |
| Type | Resource |
| Id | `openai.batch.batches` |

Fields

| Name | Datatype | Description |
|---|---|---|
| id | string | |
| cancelled_at | integer | The Unix timestamp (in seconds) for when the batch was cancelled. |
| cancelling_at | integer | The Unix timestamp (in seconds) for when the batch started cancelling. |
| completed_at | integer | The Unix timestamp (in seconds) for when the batch was completed. |
| completion_window | string | The time frame within which the batch should be processed. |
| created_at | integer | The Unix timestamp (in seconds) for when the batch was created. |
| endpoint | string | The OpenAI API endpoint used by the batch. |
| error_file_id | string | The ID of the file containing the outputs of requests with errors. |
| errors | object | |
| expired_at | integer | The Unix timestamp (in seconds) for when the batch expired. |
| expires_at | integer | The Unix timestamp (in seconds) for when the batch will expire. |
| failed_at | integer | The Unix timestamp (in seconds) for when the batch failed. |
| finalizing_at | integer | The Unix timestamp (in seconds) for when the batch started finalizing. |
| in_progress_at | integer | The Unix timestamp (in seconds) for when the batch started processing. |
| input_file_id | string | The ID of the input file for the batch. |
| metadata | object | Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys can be a maximum of 64 characters long and values can be a maximum of 512 characters long. |
| object | string | The object type, which is always `batch`. |
| output_file_id | string | The ID of the file containing the outputs of successfully executed requests. |
| request_counts | object | The request counts for different statuses within the batch. |
| status | string | The current status of the batch. |

Methods

| Name | Accessible by | Required Params | Description |
|---|---|---|---|
| list_batches | SELECT | | |
| retrieve_batch | SELECT | batch_id | |
| create_batch | INSERT | data__completion_window, data__endpoint, data__input_file_id | |
| cancel_batch | EXEC | batch_id | |
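
The `cancel_batch` method is invoked with `EXEC` rather than a SQL verb. A minimal sketch, assuming StackQL's `EXEC ... @param` syntax and a placeholder batch ID (`batch_abc123` is not a real ID):

```sql
/*+ exec */
EXEC openai.batch.batches.cancel_batch
@batch_id = 'batch_abc123';
```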

SELECT examples

```sql
SELECT
id,
cancelled_at,
cancelling_at,
completed_at,
completion_window,
created_at,
endpoint,
error_file_id,
errors,
expired_at,
expires_at,
failed_at,
finalizing_at,
in_progress_at,
input_file_id,
metadata,
object,
output_file_id,
request_counts,
status
FROM openai.batch.batches
;
```
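
To fetch a single batch via `retrieve_batch`, filter on its required `batch_id` parameter. A sketch with a placeholder ID (`batch_abc123` is illustrative, not a real batch):

```sql
SELECT
id,
status,
request_counts
FROM openai.batch.batches
WHERE batch_id = 'batch_abc123'
;
```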

INSERT example

Use the following StackQL query to create a new batches resource.

```sql
/*+ create */
INSERT INTO openai.batch.batches (
data__input_file_id,
data__endpoint,
data__completion_window,
data__metadata
)
SELECT
'{{ input_file_id }}',
'{{ endpoint }}',
'{{ completion_window }}',
'{{ metadata }}'
;
```