Our on-prem container’s API mimics Datalab’s API.
It doesn’t serve `/api/v1/table_rec` or `/api/v1/layout` at the moment, but will in an upcoming release.
You can send the `X-API-Key` header detailed here, but it will be ignored and any value works.
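For illustration, here’s a minimal sketch of a request that sends the header; the container address and form field name are assumptions for this example:

```python
import requests

# The container accepts the X-API-Key header for drop-in compatibility with
# Datalab's hosted API, but ignores its value entirely.
resp = requests.post(
    "http://localhost:8000/api/v1/marker",       # assumed container address
    headers={"X-API-Key": "any-value-works"},    # present but ignored
    files={"file": open("document.pdf", "rb")},  # assumed field name
)
print(resp.status_code)
```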
### `use_llm` is supported with your own keys

The `use_llm` flag in `/api/v1/marker` is supported in the container image, but you must bring your own keys.
The container will support any LLM service supported by Marker. Google’s models are the best-supported with `use_llm`.
We can’t speak to performance using other models, but you’re welcome to try (and we’d love to hear from you when you do, as would folks in our Discord channel).
Note: `use_llm` will only work with a vision-capable model.
### Using `marker` with `use_llm` in the container

To use `use_llm` with `/api/v1/marker` in the container, you must send your keys and configuration for `use_llm` in the API request itself.
Here’s what that looks like if you want to use Gemini 2.0 Flash (the model we use in production) via OpenRouter:
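(The sketch below assumes Marker’s OpenAI-compatible service keys and standard multipart form fields; the container address and the JSON-string encoding of `additional_config` are assumptions of this example, so adjust to your deployment.)

```python
import json
import requests

# Route use_llm through OpenRouter by pointing Marker's OpenAI-compatible
# service at OpenRouter's endpoint. The llm_service path and openai_* keys
# follow Marker's OpenAI service config; verify against your Marker version.
additional_config = {
    "llm_service": "marker.services.openai.OpenAIService",
    "openai_base_url": "https://openrouter.ai/api/v1",
    "openai_model": "google/gemini-2.0-flash-001",  # Gemini 2.0 Flash on OpenRouter
    "openai_api_key": "sk-or-...",                  # your own OpenRouter key
}

with open("document.pdf", "rb") as f:
    resp = requests.post(
        "http://localhost:8000/api/v1/marker",  # assumed container address
        files={"file": ("document.pdf", f, "application/pdf")},
        data={
            "use_llm": True,
            "additional_config": json.dumps(additional_config),
        },
    )

resp.raise_for_status()
print(resp.json())
```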
The `additional_config` parameter is a dictionary of configs passed to `marker` itself. We allow a small subset of configuration keys on our hosted API via `additional_config`, but the entirety of Marker’s configs is usable in the container.
Below are the keys required for each provider. One to note is `openai_base_url`, which defaults to OpenAI’s servers.
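As a hedged sketch of what per-provider `additional_config` payloads can look like (the `llm_service` paths and key names here are assumptions drawn from Marker’s service configs; check Marker’s documentation for the authoritative list):

```python
# Hypothetical per-provider additional_config payloads. Key names are
# assumptions based on Marker's LLM service configs and may differ by version.
PROVIDER_CONFIGS = {
    "gemini": {
        "llm_service": "marker.services.gemini.GoogleGeminiService",
        "gemini_api_key": "...",
    },
    "openai": {
        "llm_service": "marker.services.openai.OpenAIService",
        "openai_api_key": "...",
        "openai_model": "gpt-4o",  # must be vision-capable
        # openai_base_url is optional and defaults to OpenAI's servers:
        # "openai_base_url": "https://api.openai.com/v1",
    },
}
```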