What models are available to run inference on?

See the Models page for a list of models that can be queried. You can also browse these models in the Playground.

  • See 100+ models hosted for inference.

  • For technical details, see the API Reference.

  • Check out our web-based Chat, Language, Code, and Image Playgrounds.

  • Pricing for inference is set per model and based on the number of tokens used (a worked example follows this list). See inference pricing.

  • Learn best practices and prompt engineering techniques through examples.

  • Learn other ways to run inference with the REST API or the Python API; a minimal request sketch follows this list.
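
As a rough illustration of token-based pricing, the cost of a request is the number of tokens it uses multiplied by the model's per-token rate. The rate below is a made-up figure for illustration only; actual per-model rates are listed on the inference pricing page.

```python
# Hypothetical rate for illustration only -- real per-model rates are on
# the inference pricing page.
price_per_million_tokens = 0.20   # USD, made-up figure
tokens_used = 1_500               # prompt + completion tokens for one request

cost = tokens_used / 1_000_000 * price_per_million_tokens
print(f"${cost:.6f}")             # $0.000300
```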
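
Below is a minimal sketch of an inference request made over the REST API with Python's `requests` library. The endpoint URL, model name, and environment variable are placeholders rather than the platform's actual values; see the API Reference for the exact endpoint and request schema.

```python
# Minimal sketch of an inference request over a REST API, assuming an
# OpenAI-style chat completions endpoint. The URL, model name, and API-key
# variable are placeholders -- check the API Reference for the real ones.
import os
import requests

API_URL = "https://api.example.com/v1/chat/completions"  # placeholder endpoint
API_KEY = os.environ["INFERENCE_API_KEY"]                # placeholder env var

payload = {
    "model": "example-org/example-model",  # pick any model from the Models page
    "messages": [
        {"role": "user", "content": "Write a haiku about inference."}
    ],
    "max_tokens": 128,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

The Python API referenced above typically wraps the same request in a client library, so the same model names and parameters generally apply there as well.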
