Vortex Proxy
Vortex acts as a proxy, seamlessly bridging your applications with Large Language Models (LLMs), such as those provided by OpenAI.
Prerequisites
Before initiating the integration process, ensure you have:
- Vortex Proxy URL: The endpoint through which your application communicates with LLMs. For details on obtaining your Vortex proxy URL, see the Developers section.
- Consumer API Key for Vortex: A unique key that authenticates requests from your application to Vortex. For details on obtaining your consumer API key, see the Generating Vortex API Keys section.
Establishing a Connection to LLMs via Vortex
With a channel configured and a Vortex API key generated for your consumer, connecting your application to leverage LLM services becomes a streamlined process. While the exact integration steps might vary based on your application's programming language and framework, the primary mechanism involves using the Vortex API key to authenticate requests and ensure a secure connection.
- VORTEX_CHANNEL_NAME: the name of the channel you have created, e.g. openai
- VORTEX_CONSUMER_API_KEY: the Vortex API key created for a consumer
Below are examples demonstrating how to connect to OpenAI services through Vortex:
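As a minimal sketch, a chat-completion request via cURL might look like the following. The proxy URL, channel name, and model name are placeholders, and the Bearer-style Authorization header is an assumption — Vortex may expect the consumer key in a different header, so check your channel configuration:

```shell
# Placeholder values — replace with your own.
VORTEX_PROXY_URL="https://your.vortex.proxy.url.com"
VORTEX_CHANNEL_NAME="openai"
VORTEX_CONSUMER_API_KEY="your-consumer-api-key"

# The consumer API key authenticates the request to Vortex, which then
# forwards it to the OpenAI chat-completions endpoint for the channel.
curl -s "$VORTEX_PROXY_URL/$VORTEX_CHANNEL_NAME/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $VORTEX_CONSUMER_API_KEY" \
  -d '{
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello from Vortex!"}]
      }'
```

The same pattern — proxy URL plus channel name plus endpoint path, with the consumer key in a header — carries over to any HTTP client or SDK that lets you override the base URL.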
These examples illustrate how to send requests to the OpenAI API through Vortex, utilising the proxy URL and your consumer API key for authentication. Following this pattern, you can integrate OpenAI services into your applications via Vortex regardless of language or framework.
Common Errors
Invalid authentication credentials
If you see this error, check your Vortex consumer API key.
{
  "message": "Invalid authentication credentials"
}
No Route Matched Error
When using the Vortex proxy URL for chat completions in cURL, include /chat/completions at the end of the URL. For example: https://your.vortex.proxy.url.com/VORTEX_CHANNEL_NAME/chat/completions
{
  "message": "no Route matched with those values"
}
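As a quick diagnostic sketch (placeholder URL and channel name; the Bearer-style Authorization header is an assumption), printing only the HTTP status code makes routing problems easy to spot:

```shell
# Placeholder values — replace with your own.
VORTEX_PROXY_URL="https://your.vortex.proxy.url.com"
VORTEX_CHANNEL_NAME="openai"
VORTEX_CONSUMER_API_KEY="your-consumer-api-key"

# -o /dev/null discards the body; -w prints only the status code.
# A 404 together with the "no Route matched" body usually means the
# path is missing /chat/completions or the channel name is wrong.
curl -s -o /dev/null -w "%{http_code}\n" \
  -X POST \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $VORTEX_CONSUMER_API_KEY" \
  "$VORTEX_PROXY_URL/$VORTEX_CHANNEL_NAME/chat/completions" \
  -d '{}'
```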