Customers can choose increasing levels of control. Some of these options are available now and some are roadmap items. The point is to make the tradeoffs explicit instead of pretending every deployment has the same privacy needs.
1. Use your own storage (Planned)

Under Settings, customers will be able to provide credentials for compatible S3 storage so source files and outputs live in their own bucket. Without hosted storage enabled, we should have no access to documents after processing completes.
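As a rough sketch of the shape such a configuration might take — the field names and validation rules here are illustrative, not the actual Settings schema:

```python
from dataclasses import dataclass

@dataclass
class StorageConfig:
    """Hypothetical customer-provided, S3-compatible storage settings."""
    endpoint_url: str       # an S3-compatible endpoint the customer controls
    bucket: str             # bucket where source files and outputs live
    access_key_id: str
    secret_access_key: str

    def validate(self) -> None:
        # Minimal sanity checks before the service tries to use the bucket.
        if not self.endpoint_url.startswith(("http://", "https://")):
            raise ValueError("endpoint_url must be an HTTP(S) URL")
        if not self.bucket:
            raise ValueError("bucket name is required")
        if not (self.access_key_id and self.secret_access_key):
            raise ValueError("both access key ID and secret are required")

# Usage (placeholder values):
# cfg = StorageConfig("https://s3.example.com", "my-docs", "KEY", "SECRET")
# cfg.validate()
```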
2. Build on our API (Available today)

Teams that need tighter control can use the API directly and build their own front end, workflow, or portal on top of the same conversion backend we use internally.
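A minimal sketch of what building on the API looks like, assuming a hypothetical `POST /v1/conversions` endpoint with bearer-token auth and a JSON payload — none of these names are taken from the real API surface:

```python
import json
import urllib.request

def build_conversion_request(base_url: str, api_key: str,
                             source_url: str) -> urllib.request.Request:
    """Construct (but do not send) a conversion request.

    The endpoint path, auth scheme, and payload shape are illustrative
    assumptions, not the documented API contract.
    """
    payload = json.dumps({"source_url": source_url}).encode()
    return urllib.request.Request(
        url=f"{base_url}/v1/conversions",
        data=payload,
        method="POST",
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# Sending is a one-liner once the request is built:
# with urllib.request.urlopen(build_conversion_request(...)) as resp:
#     job = json.load(resp)
```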
3. Run the stack in your AWS account (Planned)

For organizations with stricter controls, we will provide a CDK deployment and coordination layer so the full processing path runs inside the customer’s AWS environment and we never see the data.
4. Run the stack with Docker (Planned)

For self-managed environments, we plan to offer a Docker-based deployment option so customers can run the service in their own infrastructure and manage network, storage, and retention locally.
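To make "manage network, storage, and retention locally" concrete, a compose file for such a deployment might look roughly like this — the image name, port, environment variable, and paths are all placeholders, not a published image or schema:

```yaml
# Hypothetical compose file; every name below is illustrative.
services:
  converter:
    image: example/conversion-stack:latest  # placeholder image name
    ports:
      - "8080:8080"                 # expose only on the local network
    environment:
      RETENTION_DAYS: "0"           # illustrative: delete artifacts immediately
    volumes:
      - ./data:/var/lib/converter   # keep all storage on local disk
```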
5. Bring your own LLM keys (Planned)

Customers will be able to provide their own model keys. As long as at least one compatible vision-capable model is configured, the pipeline can route image and layout analysis through customer-controlled providers.
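One way that routing decision could work, sketched with invented provider names and a made-up `vision` capability flag (this is not the real configuration schema):

```python
# Hypothetical registry of customer-supplied model keys. Provider names and
# the "vision" capability flag are illustrative assumptions.
CUSTOMER_KEYS = {
    "provider_a": {"api_key": "KEY_A", "vision": True},
    "provider_b": {"api_key": "KEY_B", "vision": False},
}

def pick_vision_provider(keys: dict) -> str:
    """Return the first customer-controlled provider that can handle
    image and layout analysis; fail loudly if none qualifies."""
    for name, cfg in keys.items():
        if cfg.get("api_key") and cfg.get("vision"):
            return name
    raise RuntimeError("no compatible vision-capable model key configured")
```

The key point is the fallback behavior: if no configured key is vision-capable, the pipeline refuses the job rather than silently routing image analysis elsewhere.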