FAQ

Frequently asked questions about Tangle Router.

What is Tangle Router?
Tangle Router is a decentralized AI inference gateway. It provides a unified, OpenAI-compatible API for accessing 300+ AI models served by independent operators on the Tangle network.
How is it different from OpenAI or Anthropic directly?
Instead of a single company providing inference, Tangle Router routes your requests to a competitive marketplace of operators who run the models. This gives you better pricing, higher availability, and verifiable inference.
Is it OpenAI SDK compatible?
Yes. You can use the standard OpenAI Python or TypeScript SDK: point the base URL at router.tangle.tools/v1 and use your Tangle API key in place of an OpenAI key.
What are operators?
Operators are independent infrastructure providers who deploy Tangle Blueprints to serve model inference. They stake collateral, set their own pricing, and earn rewards for providing reliable compute.
How does pricing work?
Each model has per-token pricing set by its operators. You pay only for what you use, with no subscriptions. Because operators compete on price, rates are typically lower than going directly to the model providers.
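To make per-token billing concrete, here is a small sketch of how a single request's cost is computed. The rates are hypothetical placeholders; actual rates are set by each operator and vary per model:

```python
# Hypothetical rates in USD per 1M tokens (real rates are operator-set).
PROMPT_RATE = 0.20       # input tokens
COMPLETION_RATE = 0.60   # output tokens

def request_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Dollar cost of one request under the rates above."""
    return (prompt_tokens * PROMPT_RATE
            + completion_tokens * COMPLETION_RATE) / 1_000_000

# Example: 1,200 prompt tokens and 400 completion tokens.
print(f"${request_cost(1_200, 400):.6f}")  # → $0.000480
```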
What are Shielded Credits?
Shielded Credits enable anonymous payments on the network. Your usage cannot be linked to your identity on-chain, providing privacy for sensitive workloads.
Can I run my own operator?
Yes! Check out the Become an Operator page and the Operator Guide in the docs. You can deploy a Blueprint using vLLM, Modal, or build a custom one.