An AsyncProfile object can be created with select_ai.AsyncProfile(). AsyncProfile supports concurrent programming with asyncio. Unless explicitly noted as synchronous, AsyncProfile methods should be used with await.

1. AsyncProfile API

class select_ai.AsyncProfile(*args, **kwargs)

AsyncProfile defines methods to interact with the underlying AI Provider asynchronously.

async add_negative_feedback(prompt_spec: Tuple[str, Action] | None = None, sql_id: str | None = None, response: str | None = None, feedback_content: str | None = None)

Give negative feedback to the LLM

Parameters:
  • prompt_spec (Tuple[str, Action]) – First element is the prompt and second is the corresponding action

  • sql_id (str) – SQL identifier from V$MAPPED_SQL view

  • response (str) – Expected SQL from LLM

  • feedback_content (str) – Actual feedback in natural language

async add_positive_feedback(prompt_spec: Tuple[str, Action] | None = None, sql_id: str | None = None)

Give positive feedback to the LLM

Parameters:
  • prompt_spec (Tuple[str, Action]) – First element is the prompt and second is the corresponding action

  • sql_id (str) – SQL identifier from V$MAPPED_SQL view
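
For example, a minimal sketch of sending feedback, assuming a connection is already open via select_ai.async_connect() and async_profile refers to an existing profile; the prompts, SQL, and feedback text are illustrative:

import select_ai


async def send_feedback(async_profile: select_ai.AsyncProfile) -> None:
    # Reinforce a correct response by referencing its prompt and action
    await async_profile.add_positive_feedback(
        prompt_spec=("How many promotions?", select_ai.Action.SHOWSQL)
    )
    # Correct a wrong response: supply the SQL you expected and explain why
    await async_profile.add_negative_feedback(
        prompt_spec=("How many promotions?", select_ai.Action.SHOWSQL),
        response='SELECT COUNT(DISTINCT "PROMO_ID") FROM "SH"."PROMOTIONS"',
        feedback_content="Count distinct promotion ids, not all rows",
    )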

async chat(prompt, params: Mapping = None) str

Asynchronously chat with the LLM

Parameters:
  • prompt (str) – Natural language prompt

  • params – Parameters to include in the LLM request

Returns:

str

chat_session(conversation: AsyncConversation, delete: bool = False)

Starts a new chat session for context-aware conversations

Parameters:
  • conversation (AsyncConversation) – Conversation object to use for this chat session

  • delete (bool) – Delete conversation after session ends
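
The sketch below assumes chat_session is used as an async context manager and that AsyncConversation is available as select_ai.AsyncConversation; both are assumptions, so check the conversation documentation before relying on this pattern:

import select_ai


async def run_chat_session(async_profile: select_ai.AsyncProfile) -> None:
    conversation = select_ai.AsyncConversation()  # assumed constructor
    # Assumed usage: prompts issued inside the session share context, and
    # delete=True removes the conversation when the session ends
    async with async_profile.chat_session(conversation, delete=True):
        print(await async_profile.chat(prompt="What is OCI?"))
        print(await async_profile.chat(prompt="Summarize your previous answer"))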

async create(replace: int | None = False) None

Asynchronously create an AI Profile in the Database

Parameters:

replace (bool) – Set to True to replace an existing profile with the same name

Returns:

None

Raises:

oracledb.DatabaseError

async delete(force=False) None

Asynchronously deletes an AI profile from the database

Parameters:

force (bool) – Ignores errors if AI profile does not exist.

Returns:

None

Raises:

oracledb.DatabaseError
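
A minimal sketch of the delete/create lifecycle, assuming a connection is open and profile is an AsyncProfile object that already carries its attributes:

import select_ai


async def recreate_profile(profile: select_ai.AsyncProfile) -> None:
    # force=True ignores the error raised if the profile does not exist
    await profile.delete(force=True)
    # Recreate the profile in the database from the attributes held on the
    # Python object; replace=True overwrites a profile of the same name
    await profile.create(replace=True)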

async delete_feedback(prompt_spec: Tuple[str, Action] = None, sql_id: str | None = None)

Delete feedback from the database

Parameters:
  • prompt_spec (Tuple[str, Action]) – First element is the prompt and second is the corresponding action

  • sql_id (str) – SQL identifier from V$MAPPED_SQL view

async explain_sql(prompt: str, params: Mapping = None)

Explain the generated SQL

Parameters:
  • prompt (str) – Natural language prompt

  • params – Parameters to include in the LLM request

Returns:

str

async classmethod fetch(profile_name: str) AsyncProfile

Asynchronously create an AI Profile object from attributes saved in the database

Parameters:

profile_name (str)

Returns:

select_ai.AsyncProfile

Raises:

ProfileNotFoundError
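
For example, a saved profile can be loaded by name instead of re-specifying its attributes (connection assumed to be open):

import select_ai


async def load_profile() -> select_ai.AsyncProfile:
    # Raises ProfileNotFoundError if no profile with this name is saved
    return await select_ai.AsyncProfile.fetch(
        profile_name="async_oci_ai_profile"
    )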

async generate(prompt: str, action=Action.SHOWSQL, params: Mapping = None) DataFrame | str | None

Asynchronously perform AI translation using this profile

Parameters:
  • prompt (str) – Natural language prompt to translate

  • action (select_ai.profile.Action)

  • params – Parameters to include in the LLM request, e.g. conversation_id for context-aware chats

Returns:

Union[pandas.DataFrame, str]
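
A sketch of calling generate with an explicit action, assuming a connection is open and async_profile refers to an existing profile; the conversation_id value is a placeholder:

import select_ai


async def translate(async_profile: select_ai.AsyncProfile) -> None:
    # Action.SHOWSQL returns the generated SQL as a string
    sql = await async_profile.generate(
        prompt="How many promotions?", action=select_ai.Action.SHOWSQL
    )
    print(sql)
    # Extra parameters are passed through to the request, e.g. a
    # conversation id for context-aware chat (placeholder value)
    answer = await async_profile.generate(
        prompt="What is OCI?",
        action=select_ai.Action.CHAT,
        params={"conversation_id": "<conversation-id>"},
    )
    print(answer)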

async generate_synthetic_data(synthetic_data_attributes: SyntheticDataAttributes) None

Generate synthetic data for a single table, multiple tables or a full schema.

Parameters:

synthetic_data_attributes (select_ai.SyntheticDataAttributes)

Returns:

None

Raises:

oracledb.DatabaseError
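
A sketch of generating rows for a single table; the SyntheticDataAttributes field names used here (object_name, owner_name, record_count) are assumptions and may differ in the actual class:

import select_ai


async def make_synthetic_rows(async_profile: select_ai.AsyncProfile) -> None:
    attributes = select_ai.SyntheticDataAttributes(
        object_name="PROMOTIONS",  # assumed field name
        owner_name="SH",  # assumed field name
        record_count=50,  # assumed field name
    )
    await async_profile.generate_synthetic_data(attributes)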

async get_attributes() ProfileAttributes

Asynchronously gets AI profile attributes from the Database

Returns:

select_ai.ProfileAttributes

Raises:

ProfileNotFoundError

classmethod list(profile_name_pattern: str = '.*') AsyncGenerator[AsyncProfile, None]

Asynchronously list AI Profiles saved in the database.

Parameters:

profile_name_pattern (str) – Regular expression used to match profile names; the database function REGEXP_LIKE performs the match. The default value “.*” matches all AI profiles.

Returns:

AsyncGenerator[AsyncProfile, None]

async narrate(prompt, params: Mapping = None) str

Narrate the result of the SQL

Parameters:
  • prompt (str) – Natural language prompt

  • params – Parameters to include in the LLM request

Returns:

str
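
For example, narrate answers a question in natural language based on the SQL result instead of returning rows (connection and profile assumed to exist):

import select_ai


async def narrate_result(async_profile: select_ai.AsyncProfile) -> None:
    story = await async_profile.narrate(prompt="How many promotions?")
    print(story)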

async run_pipeline(prompt_specifications: List[Tuple[str, Action]], continue_on_error: bool = False) List[str | DataFrame]

Send multiple prompts to the database in a single round trip

Parameters:
  • prompt_specifications (List[Tuple[str, Action]]) – List of 2-element tuples. First element is the prompt and second is the corresponding action

  • continue_on_error (bool) – Set to True to continue with the remaining prompts when one fails

Returns:

List[Union[str, pandas.DataFrame]]

async run_sql(prompt, params: Mapping = None) DataFrame

Run the generated SQL and return the result

Parameters:
  • prompt (str) – Natural language prompt

  • params – Parameters to include in the LLM request

Returns:

pandas.DataFrame

async set_attribute(attribute_name: str, attribute_value: bool | str | int | float | Provider)

Updates an AI profile attribute on the Python object and saves it in the database

Parameters:
  • attribute_name (str) – Name of the AI profile attribute

  • attribute_value (Union[bool, str, int, float, Provider]) – Value of the profile attribute

Returns:

None

async set_attributes(attributes: ProfileAttributes)

Updates AI profile attributes on the Python object and saves them in the database

Parameters:

attributes (ProfileAttributes) – Object specifying AI profile attributes

Returns:

None
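
A sketch of updating attributes one at a time or in bulk, assuming ProfileAttributes accepts temperature and comments as keyword arguments (both names appear in the attribute listing in the profile creation example below):

import select_ai


async def tune_profile(async_profile: select_ai.AsyncProfile) -> None:
    # Update a single attribute by name
    await async_profile.set_attribute("temperature", 0.5)
    # Update several attributes in one call
    await async_profile.set_attributes(
        select_ai.ProfileAttributes(temperature=0.5, comments=True)
    )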

async show_prompt(prompt: str, params: Mapping = None)

Show the prompt sent to the LLM

Parameters:
  • prompt (str) – Natural language prompt

  • params – Parameters to include in the LLM request

Returns:

str
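
show_prompt can be used to inspect what is actually sent to the LLM, for example (connection and profile assumed to exist):

import select_ai


async def inspect_prompt(async_profile: select_ai.AsyncProfile) -> None:
    # Print the full prompt text that would be sent to the LLM
    print(await async_profile.show_prompt(prompt="How many promotions?"))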

async show_sql(prompt, params: Mapping = None)

Show the generated SQL

Parameters:
  • prompt (str) – Natural language prompt

  • params – Parameters to include in the LLM request

Returns:

str

async summarize(content: str = None, prompt: str = None, location_uri: str = None, credential_name: str = None, params: SummaryParams = None) str

Generate a summary of text supplied inline or referenced by a location URI

Parameters:
  • prompt (str) – Natural language prompt to guide the summary generation

  • content (str) – Specifies the text you want to summarize

  • location_uri (str) – URI of the object store location, or path to a local file, containing the text to summarize

  • credential_name (str) – Identifies the credential object used to authenticate with the object store

  • params (select_ai.summary.SummaryParams) – Parameters to include in the LLM request

2. Async Profile creation

import asyncio
import os
from pprint import pformat

import select_ai

user = os.getenv("SELECT_AI_USER")
password = os.getenv("SELECT_AI_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")


# This example shows how to asynchronously create an AI profile and read back its attributes
async def main():
    await select_ai.async_connect(user=user, password=password, dsn=dsn)
    provider = select_ai.OCIGenAIProvider(
        region="us-chicago-1", oci_apiformat="GENERIC"
    )
    profile_attributes = select_ai.ProfileAttributes(
        credential_name="my_oci_ai_profile_key",
        object_list=[{"owner": "SH"}],
        provider=provider,
    )
    async_profile = await select_ai.AsyncProfile(
        profile_name="async_oci_ai_profile",
        attributes=profile_attributes,
        description="MY OCI AI Profile",
        replace=True,
    )
    print("Created async profile ", async_profile.profile_name)
    profile_attributes = await async_profile.get_attributes()
    print(
        "Profile attributes: ",
        pformat(profile_attributes.dict(exclude_null=False)),
    )


asyncio.run(main())

output:

Created async profile  async_oci_ai_profile
Profile attributes:  {'annotations': None,
 'case_sensitive_values': None,
 'comments': None,
 'constraints': None,
 'conversation': None,
 'credential_name': 'my_oci_ai_profile_key',
 'enable_source_offsets': None,
 'enable_sources': None,
 'enforce_object_list': None,
 'max_tokens': '1024',
 'object_list': '[{"owner":"SH"}]',
 'object_list_mode': None,
 'provider': OCIGenAIProvider(embedding_model=None,
                              model=None,
                              provider_name='oci',
                              provider_endpoint=None,
                              region='us-chicago-1',
                              oci_apiformat='GENERIC',
                              oci_compartment_id=None,
                              oci_endpoint_id=None,
                              oci_runtimetype=None),
 'seed': None,
 'stop_tokens': None,
 'streaming': None,
 'temperature': None,
 'vector_index_name': None}

3. Async explain SQL

import asyncio
import os

import select_ai

user = os.getenv("SELECT_AI_USER")
password = os.getenv("SELECT_AI_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")


async def main():
    await select_ai.async_connect(user=user, password=password, dsn=dsn)
    async_profile = await select_ai.AsyncProfile(
        profile_name="async_oci_ai_profile",
    )
    response = await async_profile.explain_sql("How many promotions ?")
    print(response)


asyncio.run(main())

output:

To answer the question "How many promotions", we need to write a SQL query that counts the number of rows in the "PROMOTIONS" table. Here is the query:

```sql
SELECT
  COUNT("p"."PROMO_ID") AS "Number of Promotions"
FROM
  "SH"."PROMOTIONS" "p";
```

Explanation:

* We use the `COUNT` function to count the number of rows in the table.
* We use the table alias `"p"` to refer to the `"PROMOTIONS"` table.
* We enclose the table name and column name in double quotes to make them case-sensitive.
* We use the `AS` keyword to give an alias to the count column, making it easier to read.

This query will return the total number of promotions in the `"PROMOTIONS"` table.

4. Async run SQL

import asyncio
import os

import select_ai

user = os.getenv("SELECT_AI_USER")
password = os.getenv("SELECT_AI_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")


# This example shows how to asynchronously run the generated SQL
async def main():
    await select_ai.async_connect(user=user, password=password, dsn=dsn)
    async_profile = await select_ai.AsyncProfile(
        profile_name="async_oci_ai_profile",
    )
    # run_sql returns a pandas DataFrame
    df = await async_profile.run_sql("How many promotions?")
    print(df)


asyncio.run(main())

output:

   PROMOTION_COUNT
0              503

5. Async show SQL

import asyncio
import os

import select_ai

user = os.getenv("SELECT_AI_USER")
password = os.getenv("SELECT_AI_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")


async def main():
    await select_ai.async_connect(user=user, password=password, dsn=dsn)
    async_profile = await select_ai.AsyncProfile(
        profile_name="async_oci_ai_profile",
    )
    response = await async_profile.show_sql("How many promotions?")
    print(response)


asyncio.run(main())

output:

SELECT COUNT("p"."PROMO_ID") AS "PROMOTION_COUNT" FROM "SH"."PROMOTIONS" "p"

6. Async concurrent SQL

import asyncio
import os

import select_ai

user = os.getenv("SELECT_AI_USER")
password = os.getenv("SELECT_AI_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")


async def main():
    await select_ai.async_connect(user=user, password=password, dsn=dsn)
    async_profile = await select_ai.AsyncProfile(
        profile_name="async_oci_ai_profile",
    )
    sql_tasks = [
        async_profile.show_sql(prompt="How many customers?"),
        async_profile.run_sql(prompt="How many promotions?"),
        async_profile.explain_sql(prompt="How many promotions?"),
    ]

    # Collect results from multiple asynchronous tasks
    for sql_task in asyncio.as_completed(sql_tasks):
        result = await sql_task
        print(result)


asyncio.run(main())

output:

SELECT COUNT("c"."CUST_ID") AS "customer_count" FROM "SH"."CUSTOMERS" "c"

To answer the question "How many promotions", we need to write a SQL query that counts the number of rows in the "PROMOTIONS" table. Here is the query:

```sql
SELECT
  COUNT("p"."PROMO_ID") AS "number_of_promotions"
FROM
  "SH"."PROMOTIONS" "p";
```

Explanation:

* We use the `COUNT` function to count the number of rows in the table.
* We use the table alias `"p"` to refer to the `"PROMOTIONS"` table.
* We specify the schema name `"SH"` to ensure that we are accessing the correct table.
* We enclose the table name, schema name, and column name in double quotes to make them case-sensitive.
* The `AS` keyword is used to give an alias to the calculated column, in this case, `"number_of_promotions"`.

This query will return the total number of promotions in the `"PROMOTIONS"` table.

   PROMOTION_COUNT
0              503

7. Async chat

import asyncio
import os

import select_ai

user = os.getenv("SELECT_AI_USER")
password = os.getenv("SELECT_AI_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")


async def main():
    await select_ai.async_connect(user=user, password=password, dsn=dsn)
    async_profile = await select_ai.AsyncProfile(
        profile_name="async_oci_ai_profile"
    )

    # Asynchronously send multiple chat prompts
    chat_tasks = [
        async_profile.chat(prompt="What is OCI ?"),
        async_profile.chat(prompt="What is OML4PY?"),
        async_profile.chat(prompt="What is Autonomous Database ?"),
    ]
    for chat_task in asyncio.as_completed(chat_tasks):
        result = await chat_task
        print(result)


asyncio.run(main())

output:

OCI stands for several things depending on the context:

1. **Oracle Cloud Infrastructure (OCI)**: This is a cloud computing service offered by Oracle Corporation. It provides a range of services including computing, storage, networking, database, and more, allowing businesses to build, deploy, and manage applications and services in a secure and scalable manner.

...
..
OML4PY provides a Python interface to OML, allowing users to create, manipulate, and analyze models using Python scripts. It enables users to leverage the power of OML and OMF from within Python, making it easier to integrate modeling and simulation into larger workflows and applications.
...
...

An Autonomous Database is a type of database that uses artificial intelligence (AI) and machine learning (ML) to automate many of the tasks typically performed by a database administrator (DBA)
...
...

8. Async summarize

import asyncio
import os

import select_ai

user = os.getenv("SELECT_AI_USER")
password = os.getenv("SELECT_AI_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")

content = """
A gas cloud in our galaxy, Sagittarius B2, contains enough alcohol to brew 400
trillion pints of beer, and some stars are so cool that you could touch them
without being burned. Meanwhile, on the exoplanet 55 Cancri e, a form of
"hot ice" exists where high pressure prevents water from becoming gas even at
high temperatures. Additionally, some ancient stars found in the Milky Way's
halo are much older than the Sun, providing clues about the early universe and
its composition
"""


async def main():
    await select_ai.async_connect(user=user, password=password, dsn=dsn)
    async_profile = await select_ai.AsyncProfile(
        profile_name="async_oci_ai_profile",
    )
    summary = await async_profile.summarize(content=content)
    print(summary)


asyncio.run(main())

output:

A gas cloud in the Sagittarius B2 galaxy contains a large amount of alcohol,
while some stars are cool enough to touch without being burned. The exoplanet
55 Cancri e has a unique form of "hot ice" where water remains solid despite
high temperatures due to high pressure. Ancient stars in the Milky Way's halo
are older than the Sun, providing insights into the early universe and its composition,
offering clues about the universe's formation and evolution.
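
A summary can also be generated from content referenced by a location URI rather than passed inline, as in the following example: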
import asyncio
import os

import select_ai

user = os.getenv("SELECT_AI_USER")
password = os.getenv("SELECT_AI_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")


async def main():
    await select_ai.async_connect(user=user, password=password, dsn=dsn)
    async_profile = await select_ai.AsyncProfile(
        profile_name="async_oci_ai_profile",
    )
    summary = await async_profile.summarize(
        location_uri="https://en.wikipedia.org/wiki/Astronomy"
    )
    print(summary)


asyncio.run(main())

output:

Astronomy is a natural science that studies celestial objects and phenomena,
using mathematics, physics, and chemistry to explain their origin and evolution.
The field has a long history, with early civilizations making methodical
observations of the night sky, and has since split into observational and
theoretical branches. Observational astronomy focuses on acquiring data
from observations, while theoretical astronomy develops computer or
analytical models to describe astronomical objects and phenomena. The study
of astronomy has led to numerous discoveries, including the existence of
galaxies, the expansion of the universe, and the detection of gravitational
waves. Astronomers use various methods, such as radio, infrared, optical,
ultraviolet, X-ray, and gamma-ray astronomy, to study objects and events in
the universe. The field has also led to the development of new technologies and
has inspired new areas of research, such as astrobiology and the search for
extraterrestrial life. Overall, astronomy is a dynamic and constantly evolving
field that seeks to understand the universe and its many mysteries.

9. Async pipeline

import asyncio
import os

import select_ai

user = os.getenv("SELECT_AI_USER")
password = os.getenv("SELECT_AI_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")


async def main():
    await select_ai.async_connect(user=user, password=password, dsn=dsn)
    async_profile = await select_ai.AsyncProfile(
        profile_name="async_oci_ai_profile"
    )
    prompt_specifications = [
        ("What is Oracle Autonomous Database?", select_ai.Action.CHAT),
        ("Generate SQL to list all customers?", select_ai.Action.SHOWSQL),
        (
            "Explain the query: SELECT * FROM sh.products",
            select_ai.Action.EXPLAINSQL,
        ),
        ("Explain the query: SELECT * FROM sh.products", "INVALID ACTION"),
    ]

    # 1. Multiple prompts are sent in a single roundtrip to the Database
    # 2. Results are returned as soon as Database has executed all prompts
    # 3. Application doesn't have to wait on one response before sending
    #    the next prompts
    # 4. Fewer round trips and database is kept busy
    # 5. Efficient network usage
    results = await async_profile.run_pipeline(
        prompt_specifications, continue_on_error=True
    )
    for i, result in enumerate(results):
        print(
            f"Result {i} for prompt '{prompt_specifications[i][0]}' is: {result}"
        )


asyncio.run(main())

output:

Result 0 for prompt 'What is Oracle Autonomous Database?' is: Oracle Autonomous Database is a cloud-based database service that uses artificial intelligence (AI) and machine learning (ML) to automate many of the tasks associated with managing a database. It is a self-driving, self-securing, and self-repairing database that eliminates the need for manual database administration, allowing users to focus on higher-level tasks.


Result 1 for prompt 'Generate SQL to list all customers?' is: SELECT "c"."CUST_ID" AS "Customer ID", "c"."CUST_FIRST_NAME" AS "First Name", "c"."CUST_LAST_NAME" AS "Last Name", "c"."CUST_EMAIL" AS "Email" FROM "SH"."CUSTOMERS" "c"

Result 2 for prompt 'Explain the query: SELECT * FROM sh.products' is: ```sql
SELECT
  p.*
FROM
  "SH"."PRODUCTS" p;
```

**Explanation:**

This query is designed to retrieve all columns (`*`) from the `"SH"."PRODUCTS"` table.

Here's a breakdown of the query components:


Result 3 for prompt 'Explain the query: SELECT * FROM sh.products' is: ORA-20000: Invalid action - INVALID ACTION

10. List profiles asynchronously

import asyncio
import os

import select_ai

user = os.getenv("SELECT_AI_USER")
password = os.getenv("SELECT_AI_PASSWORD")
dsn = os.getenv("SELECT_AI_DB_CONNECT_STRING")


async def main():
    await select_ai.async_connect(user=user, password=password, dsn=dsn)
    async_profile = await select_ai.AsyncProfile()
    # matches the start of string
    async for fetched_profile in async_profile.list(
        profile_name_pattern="^oci"
    ):
        print(fetched_profile.profile_name)


asyncio.run(main())

output:

OCI_VECTOR_AI_PROFILE
OCI_AI_PROFILE