WayFlow
Robust AI-powered assistants for task automation and enhanced user experiences.
WayFlow is a powerful, intuitive Python library for building advanced AI-powered assistants. It offers a standard library of modular building blocks that streamline the creation of both workflow-based and agent-style assistants, encourage reusability, and speed up development.
With WayFlow you can build both structured Flows and autonomous Agents, giving you complete flexibility and allowing you to choose the paradigm that best fits your use case.
Why WayFlow?
WayFlow has several advantages over existing open-source frameworks:
Flexibility: WayFlow supports multiple approaches to building AI Assistants, including Agents and Flows.
Interoperability: WayFlow works with LLMs from many different vendors and supports an open approach to integration.
Reusability: Build reusable and composable components to enable rapid development of AI Assistants.
Extensibility: WayFlow has powerful abstractions to handle all types of LLM applications and provides a standard library of steps.
Openness: We want to build a community and welcome contributions from diverse teams looking to take the next step in open-source AI Agents.
Quick start
To install wayflowcore (on Python 3.10) from PyPI, run:
pip install "wayflowcore==25.4.1"
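If you want to keep the installation isolated from your system Python, you can install into a virtual environment first. This is standard Python tooling rather than anything WayFlow-specific; the environment name below is just an example:

python -m venv wayflow-env
source wayflow-env/bin/activate   # on Windows: wayflow-env\Scripts\activate
pip install "wayflowcore==25.4.1"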
For full details on installation, including supported Python versions and platforms, please see our installation guide.
With WayFlow installed, you can now try it out.
WayFlow supports several LLM API providers. First choose an LLM from one of the options below:
OCI GenAI:

from wayflowcore.models import OCIGenAIModel

if __name__ == "__main__":
    llm = OCIGenAIModel(
        model_id="provider.model-id",
        service_endpoint="https://url-to-service-endpoint.com",
        compartment_id="compartment-id",
        auth_type="API_KEY",
    )
vLLM:

from wayflowcore.models import VllmModel

llm = VllmModel(
    model_id="model-id",
    host_port="VLLM_HOST_PORT",
)
Ollama:

from wayflowcore.models import OllamaModel

llm = OllamaModel(
    model_id="model-id",
)
Then create an agent and have a conversation with it, as shown in the code below:
from wayflowcore.agent import Agent

assistant = Agent(llm=llm)

conversation = assistant.start_conversation()
conversation.append_user_message("I need help regarding my sql query")
conversation.execute()

# Get the assistant's response to your query
assistant_answer = conversation.get_last_message().content
print(assistant_answer)
# I'd be happy to help with your SQL query...
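You can continue the same conversation by appending further user messages and executing again, using the same calls shown above. A minimal continuation of the example (the follow-up question is purely illustrative):

# Ask a follow-up question in the same conversation
conversation.append_user_message("The query joins two large tables and runs slowly. Any suggestions?")
conversation.execute()

# Print the assistant's latest reply
print(conversation.get_last_message().content)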
Tip
Self-hosted models: If you are interested in using locally hosted models, see our guide on using them with WayFlow: How to install Ollama.
Next Steps
Familiarize yourself with the basics - Tutorials
Start with the Tutorial building a simple conversational assistant with Agents.
Step through the Tutorial building a simple conversational assistant with Flows.
Then work through the Tutorial building a simple code review assistant.
Ways to use WayFlow to solve common tasks - How-to Guides
The how-to guides show you how to achieve common tasks and use cases with WayFlow.
Explore the API documentation
Dive deeper into the API documentation to learn about the various classes, methods, and functions available in the library.
Dive Deeper
Security
LLM-based assistants and LLM-based flows require careful security assessments before deployment. Please see our Security considerations page to learn more.
Frequently Asked Questions
Look through our Frequently Asked Questions.