Azure Foundry models don't work (Mistral Codestral 2501 with Azure) #3735
Labels
area:configuration
Relates to configuration options
area:docs
Relates to documentation
ide:vscode
Relates specifically to VS Code extension
kind:enhancement
Indicates a new feature request, improvement, or extension
priority:high
Indicates high priority
priority:medium
Indicates medium priority
Issue Category
Undocumented feature or missing documentation
Affected Documentation Page URL
https://docs.continue.dev/customize/model-providers/azure
Issue Description
Codestral 2501, available in the Azure Hub through the Azure Foundry or Azure ML products, has a different target URI than Azure OpenAI models.
Azure OpenAI Target URI:
https://just-an-example.openai.azure.com/openai/deployments/gpt-4o-july/chat/completions?api-version=2023-03-15-preview
Foundry Codestral URL example (same process for AzureML deployed models):
https://just-an-example.openai.azure.com/chat/completions?api-version=2023-03-15-preview
The only difference is the path segment:
openai/deployments/gpt-4o-july
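To make the two endpoint shapes concrete, here is a small illustrative sketch (the host and deployment name are the example values from above, not real endpoints):

```python
from typing import Optional


def build_target_uri(base: str, api_version: str,
                     deployment: Optional[str] = None) -> str:
    """Build a chat-completions URL in either the Azure OpenAI shape
    (with an /openai/deployments/<name> prefix) or the
    Foundry / Azure ML shape (no deployment prefix)."""
    if deployment:
        # Azure OpenAI style
        path = f"/openai/deployments/{deployment}/chat/completions"
    else:
        # Azure Foundry / Azure ML style
        path = "/chat/completions"
    return f"{base}{path}?api-version={api_version}"


base = "https://just-an-example.openai.azure.com"
# → .../openai/deployments/gpt-4o-july/chat/completions?api-version=2023-03-15-preview
print(build_target_uri(base, "2023-03-15-preview", "gpt-4o-july"))
# → .../chat/completions?api-version=2023-03-15-preview
print(build_target_uri(base, "2023-03-15-preview"))
```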
How can such models be configured?
Could you please update the Codestral documentation? It seems to be light/incorrect for Codestral autocomplete on Azure.
Adding this feature would allow both Azure Foundry and Azure ML models to be used with Continue, since the URIs are the same for both products (I know Azure is messy).
RESOLUTION: just add a new value for apiType = 'foundry' that makes Continue use the correct URI shape.
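As a sketch of what the proposed configuration might look like in Continue's config.json (note that the 'foundry' apiType value does not exist yet, and the title, model name, and apiBase below are placeholders, not tested values):

```json
{
  "tabAutocompleteModel": {
    "title": "Codestral (Azure Foundry)",
    "provider": "azure",
    "model": "codestral-2501",
    "apiBase": "https://just-an-example.openai.azure.com",
    "apiKey": "YOUR_API_KEY",
    "apiType": "foundry"
  }
}
```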
Version: 0.8.66
IDE: VS Code
Thank you,
Expected Content
No response