SDK Batch Processing
Process multiple files using Python's async capabilities.

Async Batch Processing
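As a hedged sketch of the async pattern (the `convert` coroutine below is a stand-in for a real SDK call, not the documented API), concurrency can be bounded with a semaphore while `asyncio.gather` collects results:

```python
import asyncio

async def convert(path):
    # Stand-in for a real SDK call (e.g. an async client's convert method); hypothetical.
    await asyncio.sleep(0.01)
    return f"{path}: converted"

async def batch_convert(paths, max_concurrent=5):
    # A semaphore caps in-flight requests so the batch stays under rate limits.
    sem = asyncio.Semaphore(max_concurrent)

    async def bounded(path):
        async with sem:
            return await convert(path)

    # return_exceptions=True keeps one failure from cancelling the whole batch.
    return await asyncio.gather(*(bounded(p) for p in paths), return_exceptions=True)

results = asyncio.run(batch_convert([f"doc{i}.pdf" for i in range(10)]))
```

Results come back in submission order, with any per-file exception returned in place of its result rather than raised.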
For higher throughput, process files concurrently rather than one at a time.

CLI Batch Processing
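As an illustrative sketch only — the command name and flags below are hypothetical, not the documented interface; consult the CLI's own help output for the real invocation:

```
# Hypothetical invocation: point the CLI at a directory and it converts every file in it.
datalab convert ./input_docs --output ./converted
```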
The CLI handles directory processing automatically.

REST API Batch Processing
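One way to sketch parallel requests with retry handling is a thread pool plus a small backoff wrapper; the `send` callable here is an assumption standing in for an actual HTTP request to the API:

```python
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

def with_retry(fn, max_attempts=4, base_delay=0.05):
    """Call fn(), retrying on exception with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

def batch_request(items, send, max_workers=5):
    """Send each item in parallel; map item -> result (or the exception raised)."""
    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(with_retry, lambda i=item: send(i)): item
                   for item in items}
        for fut in as_completed(futures):
            item = futures[fut]
            try:
                results[item] = fut.result()
            except Exception as exc:
                results[item] = exc
    return results

# Demo with a fake transport that fails once per item, then succeeds.
calls = {}
def flaky_send(item):
    calls[item] = calls.get(item, 0) + 1
    if calls[item] == 1:
        raise RuntimeError("transient error")
    return f"{item}: ok"

results = batch_request([f"doc{i}.pdf" for i in range(5)], flaky_send)
```

Swapping `flaky_send` for a real HTTP call keeps the retry and concurrency logic unchanged.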
For raw API usage, implement parallel requests with retry handling.

Rate Limits
- Default limit: 200 requests per minute per account
- The SDK and CLI handle rate limiting automatically
- For REST API, use retry logic with exponential backoff
- Enterprise plans can request higher limits
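The backoff bullet can be made concrete. A common variant adds jitter so many clients that hit the limit together don't all retry in lockstep; the base and cap values below are illustrative, not Datalab's recommended settings:

```python
import random

def backoff_delays(max_attempts=5, base=0.5, cap=30.0):
    """Delay before each retry: exponential growth, capped, with full jitter."""
    return [random.uniform(0, min(cap, base * 2 ** attempt))
            for attempt in range(max_attempts)]

delays = backoff_delays()
```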
Tips
- Use async for high throughput - Async processing handles many concurrent requests efficiently
- Limit concurrency - Start with 5-10 concurrent requests and adjust based on your rate limits
- Handle failures gracefully - Use `return_exceptions=True` with `asyncio.gather` to continue processing on errors
- Save progress - Write results incrementally to avoid losing work on long batches
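The last two tips can be combined in one sketch: results are appended to a JSONL file as each task finishes, so a crash mid-batch loses at most the in-flight items (`convert` is a stand-in for a real conversion call, not the SDK API):

```python
import asyncio
import json

async def convert(path):
    # Stand-in for a real conversion request; hypothetical.
    await asyncio.sleep(0.01)
    return {"file": path, "status": "ok"}

async def run_batch(paths, out_path):
    with open(out_path, "w", encoding="utf-8") as out:
        # as_completed yields results as they finish, not in submission order.
        for coro in asyncio.as_completed([convert(p) for p in paths]):
            try:
                record = await coro
            except Exception as exc:
                record = {"status": "error", "error": str(exc)}
            out.write(json.dumps(record) + "\n")
            out.flush()  # persist progress immediately

asyncio.run(run_batch([f"doc{i}.pdf" for i in range(8)], "results.jsonl"))
```

Restarting a failed batch then only needs the files missing from the output, not the whole input set.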
Try Datalab
Get started with our API in less than a minute. We include free credits.