Select providers

Continue makes it easy to use different providers for serving your chat, autocomplete, and embeddings models.

To select the providers you want to use, add them to your config.json.
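
For example, one possible config.json (the model names, keys, and provider choices below are illustrative placeholders) might serve chat from a commercial API, autocomplete from a local model, and embeddings from the built-in local provider:

```json
{
  "models": [
    {
      "title": "GPT-4o",
      "provider": "openai",
      "model": "gpt-4o",
      "apiKey": "YOUR_OPENAI_API_KEY"
    }
  ],
  "tabAutocompleteModel": {
    "title": "StarCoder2 3B",
    "provider": "ollama",
    "model": "starcoder2:3b"
  },
  "embeddingsProvider": {
    "provider": "transformers.js"
  }
}
```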

Self-hosted

Local

You can run a model on your local computer using tools such as Ollama, LM Studio, or llama.cpp.
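
As a minimal sketch (assuming Ollama is running locally and the llama3.1:8b model has been pulled), a locally served chat model can be added to the "models" list:

```json
{
  "models": [
    {
      "title": "Llama 3.1 8B",
      "provider": "ollama",
      "model": "llama3.1:8b"
    }
  ]
}
```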

Remote

You can deploy a model in your AWS, GCP, Azure, or other cloud account using serving frameworks such as vLLM or Hugging Face TGI.
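
Many self-hosted servers (vLLM, for example) expose an OpenAI-compatible API, so one way to reach a model deployed in your own cloud is the "openai" provider with an "apiBase" pointing at your endpoint; the URL and model name below are placeholders:

```json
{
  "models": [
    {
      "title": "Self-hosted Llama",
      "provider": "openai",
      "model": "meta-llama/Meta-Llama-3.1-8B-Instruct",
      "apiBase": "https://your-server.example.com/v1"
    }
  ]
}
```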

SaaS

Open-source models

You can use open-source LLMs hosted by services such as Together AI or Replicate.
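
For instance, a model hosted by Together AI might be configured like this (the model identifier is a placeholder; check the provider's catalog for exact names):

```json
{
  "models": [
    {
      "title": "Llama 3.1 70B (Together)",
      "provider": "together",
      "model": "meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",
      "apiKey": "YOUR_TOGETHER_API_KEY"
    }
  ]
}
```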

Commercial models

You can use commercial LLMs through their APIs with providers such as OpenAI, Anthropic, Mistral, and Google Gemini.
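
For example, Anthropic's Claude could be added as a chat model (the model identifier is illustrative; use whichever version you have access to):

```json
{
  "models": [
    {
      "title": "Claude 3.5 Sonnet",
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-20240620",
      "apiKey": "YOUR_ANTHROPIC_API_KEY"
    }
  ]
}
```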

In addition to selecting providers, you will need to figure out what models to use.