POST /v1/data/batch-insert
Insert Batch Data into Table from File Path
curl --request POST \
  --url https://api.example.com/v1/data/batch-insert \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '
{
  "bucket": "my_bucket",
  "credentials": {
    "access_key": "accesskey",
    "secret_key": "secretkey"
  },
  "file_job_parallel_size": 10,
  "file_path": [
    "parquet/big.parquet"
  ],
  "options": {
    "allow_http": true,
    "endpoint": "http://127.0.0.1:8080",
    "virtual_hosted_style": false
  },
  "storage_type": "CEPH"
}
'
Example response:
{
  "code": 200,
  "data": {
    "elapsed_time": null,
    "inserted_record_batches": 1,
    "inserted_row_count": 1
  },
  "exception": null,
  "success": true
}
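The same request can be issued from Python; a minimal sketch using only the standard library, with the placeholder URL and token from the curl example above:

```python
import json
import urllib.request

# Request body mirroring the curl example above.
body = {
    "file_path": ["parquet/big.parquet"],
    "bucket": "my_bucket",
    "credentials": {"access_key": "accesskey", "secret_key": "secretkey"},
    "file_job_parallel_size": 10,
    "options": {
        "allow_http": True,
        "endpoint": "http://127.0.0.1:8080",
        "virtual_hosted_style": False,
    },
    "storage_type": "CEPH",
}

req = urllib.request.Request(
    "https://api.example.com/v1/data/batch-insert",  # placeholder endpoint
    data=json.dumps(body).encode("utf-8"),
    headers={
        "Authorization": "Bearer <token>",  # replace with a real token
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would perform the call; omitted here.
```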

Authorizations

Authorization
string
header
required

Bearer token authentication. Include the token in the Authorization header as 'Bearer <token>'.

Query Parameters

format
enum<string>

The format of the file to insert. Default is Parquet.

Available options:
Jsonl,
Parquet
batch_insert_size
integer

Maximum number of rows to insert at a time; it determines the size of each record batch. Default is 1024.

Required range: x >= 0
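Both query parameters are passed in the URL; a sketch of encoding them with the Python standard library (the endpoint is the placeholder from the curl example, and the parameter values are illustrative):

```python
from urllib.parse import urlencode

base = "https://api.example.com/v1/data/batch-insert"  # placeholder endpoint
params = {"format": "Jsonl", "batch_insert_size": 2048}  # illustrative values

url = f"{base}?{urlencode(params)}"
# → https://api.example.com/v1/data/batch-insert?format=Jsonl&batch_insert_size=2048
```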

Body

application/json

The file path containing the data to be inserted. Note that inserting vector data is not supported for the Jsonl (JSON Lines) format.

file_path
string[]
required

Path to the file to insert.

Example:

"/Path/to/data/file.parquet"

bucket
string | null

Bucket name of cloud storage.

credentials
object

Credentials for accessing cloud storage. When not provided, credentials from the configuration (such as environment variables) are used.

directory_path
string | null

Directory path containing the files to insert.

Example:

"/Path/to/data/"

file_job_parallel_size
integer<int32> | null
default:1

Number of files to process in parallel.

Required range: x >= 0
options
object

Options for Cloud Storage.

storage_type
null | enum<string>
default:LOCAL

Storage type of the file to insert. When it is not provided, it is treated as a local file. Currently, only 'AWS', 'S3', 'CEPH', 'MINIO', and 'S3_COMPATIBLE' are supported.

Available options:
LOCAL,
AWS,
S3,
GCP,
AZURE,
CEPH,
MINIO,
S3_COMPATIBLE
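The two common body shapes follow from the fields above: a local file needs only the required `file_path` (since `storage_type` defaults to LOCAL), while cloud storage adds `storage_type`, `bucket`, and optionally `credentials` and `options`. A sketch with illustrative values:

```python
import json

# Local file: storage_type defaults to LOCAL, so file_path alone suffices.
local_body = {"file_path": ["/Path/to/data/file.parquet"]}

# S3-compatible cloud storage with explicit credentials and endpoint options.
cloud_body = {
    "file_path": ["parquet/big.parquet"],
    "storage_type": "MINIO",
    "bucket": "my_bucket",
    "credentials": {"access_key": "accesskey", "secret_key": "secretkey"},
    "options": {"endpoint": "http://127.0.0.1:8080", "allow_http": True},
}

payload = json.dumps(local_body)
```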

Response

Data successfully inserted.

code
integer<int32>
required

HTTP status code.

Required range: x >= 0
success
boolean
required

Whether the request was successful.

data
object
exception
object