An AI provider in Select AI is the service that hosts the LLM, the embedding model (transformer), or both, used to process natural language prompts and generate responses. These providers offer models that interpret and convert natural language for the use cases highlighted under the LLM concept.
See Select your AI Provider for the list of supported providers.
1. Provider¶
- class select_ai.Provider(embedding_model: str | None = None, model: str | None = None, provider_name: str | None = None, provider_endpoint: str | None = None, region: str | None = None)¶
Base class for AI providers.
To create a Provider object, use one of the concrete AI provider implementations below.
- Parameters:
embedding_model (str) – The embedding model, also known as a transformer. The supported embedding models vary by AI provider
model (str) – The name of the LLM being used to generate responses
provider_name (str) – The name of the provider being used
provider_endpoint (str) – Endpoint URL of the AI provider being used
region (str) – The cloud region of the Gen AI cluster
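As a minimal sketch, a concrete provider can be constructed directly; the model and embedding model names below are placeholder assumptions, not values required by the library.
import select_ai

# Instantiate a concrete provider rather than the Provider base class.
# The model names are placeholders; use models available in your account.
provider = select_ai.OpenAIProvider(
    model="gpt-4o-mini",                       # placeholder LLM name
    embedding_model="text-embedding-3-small",  # placeholder embedding model
)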
2. AnthropicProvider¶
- class select_ai.AnthropicProvider(embedding_model: str | None = None, model: str | None = None, provider_name: str = 'anthropic', provider_endpoint: str | None = None, region: str | None = None)¶
Anthropic-specific attributes
3. AzureProvider¶
- class select_ai.AzureProvider(embedding_model: str | None = None, model: str | None = None, provider_name: str = 'azure', provider_endpoint: str | None = None, region: str | None = None, azure_deployment_name: str | None = None, azure_embedding_deployment_name: str | None = None, azure_resource_name: str | None = None)¶
Azure-specific attributes
- Parameters:
azure_deployment_name (str) – Name of the Azure OpenAI Service deployed model.
azure_embedding_deployment_name (str) – Name of the Azure OpenAI deployed embedding model.
azure_resource_name (str) – Name of the Azure OpenAI Service resource.
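The following sketch shows how these attributes fit together; the resource and deployment names are placeholders for your own Azure OpenAI Service setup.
import select_ai

# All names below are placeholders; substitute the resource and deployment
# names configured in your Azure OpenAI Service account.
provider = select_ai.AzureProvider(
    azure_resource_name="my_resource",
    azure_deployment_name="my_llm_deployment",
    azure_embedding_deployment_name="my_embedding_deployment",
)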
4. AWSProvider¶
- class select_ai.AWSProvider(embedding_model: str | None = None, model: str | None = None, provider_name: str = 'aws', provider_endpoint: str | None = None, region: str | None = None, aws_apiformat: str | None = None)¶
AWS-specific attributes
5. CohereProvider¶
- class select_ai.CohereProvider(embedding_model: str | None = None, model: str | None = None, provider_name: str = 'cohere', provider_endpoint: str | None = None, region: str | None = None)¶
Cohere-specific attributes
6. OpenAIProvider¶
- class select_ai.OpenAIProvider(embedding_model: str | None = None, model: str | None = None, provider_name: str = 'openai', provider_endpoint: str | None = 'api.openai.com', region: str | None = None)¶
OpenAI-specific attributes
7. OCIGenAIProvider¶
- class select_ai.OCIGenAIProvider(embedding_model: str | None = None, model: str | None = None, provider_name: str = 'oci', provider_endpoint: str | None = None, region: str | None = None, oci_apiformat: str | None = None, oci_compartment_id: str | None = None, oci_endpoint_id: str | None = None, oci_runtimetype: str | None = None)¶
OCI Gen AI-specific attributes
- Parameters:
oci_apiformat (str) – Specifies the format in which the API expects data to be sent and received. Supported values are ‘COHERE’ and ‘GENERIC’
oci_compartment_id (str) – Specifies the OCID of the compartment you are permitted to access when calling the OCI Generative AI service
oci_endpoint_id (str) – This attribute indicates the endpoint OCID of the Oracle dedicated AI hosting cluster
oci_runtimetype (str) – This attribute indicates the runtime type of the provided model. The supported values are ‘COHERE’ and ‘LLAMA’
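A minimal sketch of constructing an OCI Generative AI provider follows; the model name and compartment OCID are placeholders for values from your own tenancy.
import select_ai

# The model name and compartment OCID are placeholders; replace them with
# values from your OCI tenancy.
provider = select_ai.OCIGenAIProvider(
    model="meta.llama-3.1-70b-instruct",                  # placeholder model
    oci_compartment_id="ocid1.compartment.oc1..example",  # placeholder OCID
    oci_apiformat="GENERIC",                               # 'COHERE' or 'GENERIC'
)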
8. GoogleProvider¶
- class select_ai.GoogleProvider(embedding_model: str | None = None, model: str | None = None, provider_name: str = 'google', provider_endpoint: str | None = None, region: str | None = None)¶
Google AI-specific attributes
9. HuggingFaceProvider¶
- class select_ai.HuggingFaceProvider(embedding_model: str | None = None, model: str | None = None, provider_name: str = 'huggingface', provider_endpoint: str | None = None, region: str | None = None)¶
HuggingFace-specific attributes
10. Enable AI service provider¶
10.1. Sync API¶
This method adds an ACL (access control list) that allows the specified database users to invoke the AI provider’s HTTP endpoint.
import os
import select_ai
admin_user = os.getenv("SELECT_AI_ADMIN_USER")
password = os.getenv("SELECT_AI_ADMIN_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")
select_ai_user = os.getenv("SELECT_AI_USER")
# Connect as the ADMIN user and grant the Select AI user access to the
# provider endpoint
select_ai.connect(user=admin_user, password=password, dsn=dsn)
select_ai.grant_http_access(
    users=select_ai_user, provider_endpoint="api.openai.com"
)
print("Enabled AI provider for user:", select_ai_user)
output:
Enabled AI provider for user: <select_ai_db_user>
10.2. Async API¶
import asyncio
import os
import select_ai
admin_user = os.getenv("SELECT_AI_ADMIN_USER")
password = os.getenv("SELECT_AI_ADMIN_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")
select_ai_user = os.getenv("SELECT_AI_USER")
async def main():
    # Connect asynchronously as the ADMIN user and grant the Select AI user
    # access to the provider endpoint
    await select_ai.async_connect(user=admin_user, password=password, dsn=dsn)
    await select_ai.async_grant_http_access(
        users=select_ai_user, provider_endpoint="*.openai.azure.com"
    )
    print("Enabled AI provider for user:", select_ai_user)

asyncio.run(main())
output:
Enabled AI provider for user: <select_ai_db_user>
11. Disable AI service provider¶
This method removes the ACL, preventing the specified database users from invoking the AI provider’s HTTP endpoint.
11.1. Sync API¶
import os
import select_ai
admin_user = os.getenv("SELECT_AI_ADMIN_USER")
password = os.getenv("SELECT_AI_ADMIN_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")
select_ai_user = os.getenv("SELECT_AI_USER")
# Connect as the ADMIN user and revoke the Select AI user's access to the
# provider endpoint
select_ai.connect(user=admin_user, password=password, dsn=dsn)
select_ai.revoke_http_access(
    users=select_ai_user, provider_endpoint="*.openai.azure.com"
)
print("Disabled AI provider for user:", select_ai_user)
output:
Disabled AI provider for user: <select_ai_db_user>
11.2. Async API¶
import asyncio
import os
import select_ai
admin_user = os.getenv("SELECT_AI_ADMIN_USER")
password = os.getenv("SELECT_AI_ADMIN_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")
select_ai_user = os.getenv("SELECT_AI_USER")
async def main():
    # Connect asynchronously as the ADMIN user and revoke the Select AI
    # user's access to the provider endpoint
    await select_ai.async_connect(user=admin_user, password=password, dsn=dsn)
    await select_ai.async_revoke_http_access(
        users=select_ai_user, provider_endpoint="*.openai.azure.com"
    )
    print("Disabled AI provider for user:", select_ai_user)

asyncio.run(main())
output:
Disabled AI provider for user: <select_ai_db_user>