WayFlow
Build robust AI-powered assistants for task automation and enhanced user experiences
WayFlow is a powerful, intuitive Python library for building advanced AI-powered assistants. It offers a standard library of modular building blocks that streamlines the creation of both workflow-based and agent-style assistants, encourages reusability, and speeds up development.
WayFlow is a reference runtime implementation for Agent Spec, with native support for all Agent Spec Agents and Flows.
Why WayFlow?
Flexibility: WayFlow supports multiple approaches to building AI assistants, including Agents and Flows.
Interoperability: WayFlow works with LLMs from many different vendors and supports an open approach to integration.
Reusability: WayFlow enables you to build reusable, composable components for rapid development of AI assistants.
Extensibility: WayFlow has powerful abstractions to handle all types of LLM applications and provides a standard library of steps.
Openness: WayFlow is an open-source project, welcoming contributions from diverse teams looking to take AI agents to the next level.
Quick Start
To install WayFlow on Python 3.10, run the following command to install it from the package index:
pip install "wayflowcore==25.4.1"
For complete installation instructions, including supported Python versions and platforms, see the installation guide.
With WayFlow installed, you can now try it out.
WayFlow supports several LLM API providers. Select an LLM from the options below:
from wayflowcore.models import OCIGenAIModel

if __name__ == "__main__":
    llm = OCIGenAIModel(
        model_id="provider.model-id",
        service_endpoint="https://url-to-service-endpoint.com",
        compartment_id="compartment-id",
        auth_type="API_KEY",
    )
from wayflowcore.models import VllmModel

llm = VllmModel(
    model_id="model-id",
    host_port="VLLM_HOST_PORT",
)
from wayflowcore.models import OllamaModel

llm = OllamaModel(
    model_id="model-id",
)
Then create an agent and start a conversation, as shown in the example below:
from wayflowcore.agent import Agent

assistant = Agent(llm=llm)
conversation = assistant.start_conversation()

conversation.append_user_message("I need help regarding my SQL query")
conversation.execute()

# Get the assistant's response to your query.
assistant_answer = conversation.get_last_message().content
print(assistant_answer)
# I'd be happy to help with your SQL query...
Tip
Self-Hosted Models: To use locally hosted models with WayFlow, see the Installing Ollama guide.