How to Serialize and Deserialize Conversations#

Download the Python script/notebook for this guide: Serialize conversation how-to script

Prerequisites

This guide assumes familiarity with WayFlow Agents and Flows.

When building AI assistants, it is often necessary to save the state of a conversation to disk and restore it later. This is essential for creating persistent applications that can:

  • Resume conversations after an application restart

  • Save user sessions for later continuation

  • Implement conversation history and analytics

  • Support multi-session workflows

In this tutorial, you will learn how to:

  • Serialize Agent conversations to JSON files

  • Serialize Flow conversations at any point during execution

  • Deserialize and resume both types of conversations

  • Build persistent conversation loops that survive application restarts

Concepts shown in this guide#

  • serialize() to convert conversations to a storable format

  • autodeserialize() to restore conversations from storage

  • Handling conversation state persistence for both Agents and Flows

Basic Serialization#

Step 1. Add imports and configure LLM#

Start by importing the necessary packages for serialization:

import json
import os

from wayflowcore.agent import Agent
from wayflowcore.controlconnection import ControlFlowEdge
from wayflowcore.conversation import Conversation
from wayflowcore.dataconnection import DataFlowEdge
from wayflowcore.executors.executionstatus import (
    FinishedStatus,
    UserMessageRequestStatus,
)
from wayflowcore.flow import Flow
from wayflowcore.serialization import autodeserialize, serialize
from wayflowcore.steps import (
    CompleteStep,
    InputMessageStep,
    OutputMessageStep,
    StartStep,
)

WayFlow supports several LLM API providers. Select an LLM from the options below:

from wayflowcore.models import OCIGenAIModel, OCIClientConfigWithApiKey

llm = OCIGenAIModel(
    model_id="provider.model-id",
    compartment_id="compartment-id",
    client_config=OCIClientConfigWithApiKey(
        service_endpoint="https://url-to-service-endpoint.com",
    ),
)

Step 2. Create storage functions#

Define helper functions to store and load conversations:

DIR_PATH = "path/to/your/dir"

def store_conversation(path: str, conversation: Conversation) -> str:
    """Store the given conversation and return the conversation id."""
    conversation_id = conversation.conversation_id
    serialized_conversation = serialize(conversation)

    # Read existing data
    if os.path.exists(path):
        with open(path, "r") as f:
            data = json.load(f)
    else:
        data = {}

    # Add new conversation
    data[conversation_id] = serialized_conversation

    # Write back to file
    with open(path, "w") as f:
        json.dump(data, f, indent=2)

    return conversation_id


def load_conversation(path: str, conversation_id: str) -> Conversation:
    """Load a conversation given its id."""
    with open(path, "r") as f:
        data = json.load(f)

    serialized_conversation = data[conversation_id]
    return autodeserialize(serialized_conversation)

These functions:

  • Use WayFlow’s serialize() to convert conversations to a storable format

  • Store multiple conversations in a single JSON file indexed by conversation ID

  • Use autodeserialize() to restore the original conversation objects
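One caveat of the read-modify-write pattern above: if the process crashes mid-write, the JSON file can be left truncated. A common hardening, sketched below with only the standard library (the helper name `store_payload_atomically` is illustrative, not part of WayFlow), is to write to a temporary file and atomically swap it into place:

```python
import json
import os
import tempfile


def store_payload_atomically(path: str, key: str, payload: str) -> None:
    """Update one entry in a JSON store without risking a truncated file."""
    data = {}
    if os.path.exists(path):
        with open(path, "r") as f:
            data = json.load(f)
    data[key] = payload

    # Write to a temp file in the same directory, then atomically replace
    # the original. os.replace is atomic on both POSIX and Windows.
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(data, f, indent=2)
        os.replace(tmp_path, path)
    except BaseException:
        os.unlink(tmp_path)
        raise
```

With this in place, readers always see either the old or the new store, never a partially written file.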

Serializing Agent conversations#

Agent conversations can be serialized at any point during execution:

Step 1. Create an Agent#

assistant = Agent(
    llm=llm,
    custom_instruction="You are a helpful assistant. Be concise.",
    agent_id="simple_assistant",
)

Step 2. Run the conversation#

# Start a conversation
conversation = assistant.start_conversation()
conversation_id = conversation.conversation_id
print(f"1. Started conversation with ID: {conversation_id}")

# Execute initial greeting
status = conversation.execute()
print(f"2. Assistant says: {conversation.get_last_message().content}")

# Add user message
conversation.append_user_message("What is 2+2?")
print("3. User asks: What is 2+2?")

# Execute to get response
status = conversation.execute()
print(f"4. Assistant responds: {conversation.get_last_message().content}")

Step 3. Serialize the conversation#

AGENT_STORE_PATH = os.path.join(DIR_PATH, "agent_conversation.json")
store_conversation(AGENT_STORE_PATH, conversation)
print(f"5. Conversation serialized to {AGENT_STORE_PATH}")

Step 4. Deserialize the conversation#

loaded_conversation = load_conversation(AGENT_STORE_PATH, conversation_id)
print(f"6. Conversation deserialized from {AGENT_STORE_PATH}")

# Print the loaded conversation messages
print("7. Loaded conversation messages:")
messages = loaded_conversation.message_list.messages
for i, msg in enumerate(messages):
    if msg.message_type.name == "AGENT":
        role = "Assistant"
    elif msg.message_type.name == "USER":
        role = "User"
    else:
        role = msg.message_type.name
    print(f"   [{i}] {role}: {msg.content}")

Key points:

  • Each conversation has a unique conversation_id

  • The entire conversation state is preserved, including message history

  • Loaded conversations retain their complete state and can resume execution

  • Access messages through conversation.message_list.messages

Serializing Flow Conversations#

Flow conversations can be serialized at any point, including mid-execution while the flow is waiting for user input:

Step 1. Create a Flow#

First, create a flow with its steps and edges.

start_step = StartStep(name="start_step")
input_step = InputMessageStep(
    name="input_step",
    message_template="What's your favorite color?",
    output_mapping={InputMessageStep.USER_PROVIDED_INPUT: "user_color"},
)
output_step = OutputMessageStep(
    name="output_step", message_template="Your favorite color is {{ user_color }}. Nice choice!"
)
end_step = CompleteStep(name="end_step")

simple_flow = Flow(
    begin_step=start_step,
    control_flow_edges=[
        ControlFlowEdge(start_step, input_step),
        ControlFlowEdge(input_step, output_step),
        ControlFlowEdge(output_step, end_step),
    ],
    data_flow_edges=[
        DataFlowEdge(
            source_step=input_step,
            source_output="user_color",
            destination_step=output_step,
            destination_input="user_color",
        )
    ],
)

Step 2. Run the conversation#

Then start and run the flow conversation.

flow_conversation = simple_flow.start_conversation()
flow_id = flow_conversation.conversation_id
print(f"1. Started flow conversation with ID: {flow_id}")

# Execute until user input is needed
status = flow_conversation.execute()
print(f"2. Flow asks: {flow_conversation.get_last_message().content}")

Step 3. Serialize during execution#

You can serialize the conversation at any point during its execution. Here, for instance, the flow is waiting for the user to provide input, but you can serialize the conversation now and resume it later.

FLOW_STORE_PATH = os.path.join(DIR_PATH, "flow_conversation.json")
store_conversation(FLOW_STORE_PATH, flow_conversation)
print(f"3. Flow conversation serialized to {FLOW_STORE_PATH}")

Step 4. Deserialize the conversation#

You can now load back the serialized conversation.

loaded_flow_conversation = load_conversation(FLOW_STORE_PATH, flow_id)
input_step_1 = loaded_flow_conversation.flow.steps['input_step']  # steps are restored on the loaded flow
print(f"4. Flow conversation deserialized from {FLOW_STORE_PATH}")

# Provide user input to the loaded conversation
loaded_flow_conversation.append_user_message("Blue")
print("5. User responds: Blue")

Step 5. Resume the conversation execution#

You can now resume the conversation from the exact state it was in when it was serialized.

outputs = loaded_flow_conversation.execute()
print(f"6. Flow output: {outputs.output_values[OutputMessageStep.OUTPUT]}")

# Print the loaded conversation messages
print("7. Loaded flow conversation messages:")
messages = loaded_flow_conversation.message_list.messages
for i, msg in enumerate(messages):
    if msg.message_type.name == "AGENT":
        role = "Flow"
    elif msg.message_type.name == "USER":
        role = "User"
    else:
        role = msg.message_type.name
    print(f"   [{i}] {role}: {msg.content}")


Important considerations:

  • Flows can be serialized while waiting for user input

  • The loaded Flow conversation resumes exactly where it left off

  • User input can be provided to the loaded conversation to continue execution

Building persistent applications#

For real-world applications, you’ll want to create persistent conversation loops:

from typing import Optional

def run_persistent_agent(assistant: Agent, store_path: str, conversation_id: Optional[str] = None):
    """Run an agent with persistent conversation storage."""

    # Load existing conversation or start new one
    if conversation_id:
        try:
            conversation = load_conversation(store_path, conversation_id)
            print(f"Resuming conversation {conversation_id}")
        except (FileNotFoundError, KeyError):
            print(f"Conversation {conversation_id} not found, starting new one")
            conversation = assistant.start_conversation()
    else:
        conversation = assistant.start_conversation()
        print(f"Started new conversation {conversation.conversation_id}")

    # Main conversation loop
    while True:
        status = conversation.execute()

        if isinstance(status, FinishedStatus):
            print("Conversation finished")
            break
        elif isinstance(status, UserMessageRequestStatus):
            # Save before waiting for user input
            store_conversation(store_path, conversation)

            print(f"Assistant: {conversation.get_last_message().content}")
            user_input = input("You: ")

            if user_input.lower() in ["exit", "quit"]:
                print("Exiting and saving conversation...")
                break

            conversation.append_user_message(user_input)

    # Final save
    final_id = store_conversation(store_path, conversation)
    print(f"Conversation saved with ID: {final_id}")
    return final_id


This function:

  • Loads existing conversations or starts new ones

  • Saves state before waiting for user input

  • Allows users to exit and resume later

  • Returns the conversation ID for future reference
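To let users choose which session to resume, you can enumerate the IDs already present in the store. This small helper (`list_conversation_ids` is a hypothetical name, not a WayFlow API) reads the same JSON file that the storage functions above write:

```python
import json
import os


def list_conversation_ids(path: str) -> list:
    """Return the IDs of all conversations stored at `path`, sorted."""
    if not os.path.exists(path):
        # No store yet: nothing to resume.
        return []
    with open(path, "r") as f:
        data = json.load(f)
    return sorted(data.keys())
```

The returned IDs can then be passed as the `conversation_id` argument of `run_persistent_agent` to resume a previous session.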

Best Practices#

  1. Save before user input: Always serialize conversations before waiting for user input to prevent data loss.

  2. Use unique IDs: Store conversations using their built-in conversation_id to avoid conflicts.

  3. Handle errors gracefully: Wrap deserialization in try-except blocks to handle missing or corrupted data.

  4. Consider storage format: While JSON is human-readable, consider other formats for production use.

  5. Version your serialization: Consider adding version information to handle future schema changes.
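Practice 5 can be implemented with a thin envelope around the serialized payload, so that on load you can check the version before handing the payload to autodeserialize(). A minimal sketch follows; the envelope format is an assumption for illustration, not a WayFlow convention:

```python
import json

# Hypothetical schema version for your own storage format.
SCHEMA_VERSION = 1


def wrap_with_version(serialized_conversation: str) -> str:
    """Wrap a serialized conversation in a versioned envelope."""
    return json.dumps({"version": SCHEMA_VERSION, "payload": serialized_conversation})


def unwrap_versioned(envelope: str) -> str:
    """Check the schema version and return the raw payload for autodeserialize()."""
    data = json.loads(envelope)
    if data.get("version") != SCHEMA_VERSION:
        raise ValueError(f"Unsupported schema version: {data.get('version')}")
    return data["payload"]
```

When the schema changes, bump `SCHEMA_VERSION` and add a migration branch in `unwrap_versioned` instead of raising.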

Limitations#

  • Tool state: When using tools with Agents, ensure tools are stateless or their state is managed separately.

  • Large conversations: Very long conversations may result in large serialized files.

  • Binary data: The default JSON serialization does not handle binary data directly.
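For the large-conversations caveat, compressing the serialized text before writing it can shrink files considerably, since conversation JSON is highly repetitive. A standard-library sketch (the one-file-per-conversation layout here is an assumption, different from the shared JSON store above):

```python
import gzip


def write_compressed(path: str, serialized_conversation: str) -> None:
    """Write a serialized conversation as gzip-compressed UTF-8 text."""
    with gzip.open(path, "wt", encoding="utf-8") as f:
        f.write(serialized_conversation)


def read_compressed(path: str) -> str:
    """Read a gzip-compressed serialized conversation back as text."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return f.read()
```

The decompressed text can be passed straight to autodeserialize().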

Agent Spec Exporting/Loading#

You can export the assistant configuration to its Agent Spec configuration using the AgentSpecExporter.

from wayflowcore.agentspec import AgentSpecExporter

serialized_assistant = AgentSpecExporter().to_json(assistant)

Here is what the Agent Spec representation looks like:
{
  "component_type": "Agent",
  "id": "simple_assistant",
  "name": "agent_0f75efc7",
  "description": "",
  "metadata": {
    "__metadata_info__": {
      "name": "agent_0f75efc7",
      "description": ""
    }
  },
  "inputs": [],
  "outputs": [],
  "llm_config": {
    "component_type": "VllmConfig",
    "id": "80c8028b-0dd6-4dc9-ad7d-a4860ef1b849",
    "name": "LLAMA_MODEL_ID",
    "description": null,
    "metadata": {
      "__metadata_info__": {}
    },
    "default_generation_parameters": null,
    "url": "LLAMA_API_URL",
    "model_id": "LLAMA_MODEL_ID"
  },
  "system_prompt": "You are a helpful assistant. Be concise.",
  "tools": [],
  "agentspec_version": "25.4.1"
}

You can then load the configuration back to an assistant using the AgentSpecLoader.

from wayflowcore.agentspec import AgentSpecLoader

assistant: Agent = AgentSpecLoader().load_json(serialized_assistant)

Next steps#

In this guide, you learned how to:

  • Serialize both Agent and Flow conversations

  • Restore conversations and continue execution

  • Build persistent conversation loops

  • Handle conversation state across application restarts

Having learned how to serialize a conversation, you may now proceed to How to Serialize and Deserialize Flows and Agents.

Full code#

Click on the card at the top of this page to download the full code for this guide or copy the code below.

# Copyright © 2025 Oracle and/or its affiliates.
#
# This software is under the Apache License 2.0 (LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0) or Universal Permissive License
# (UPL) 1.0 (LICENSE-UPL or https://oss.oracle.com/licenses/upl), at your option.

# %%[markdown]
# Code Example - How to Serialize and Deserialize Conversations
# -------------------------------------------------------------

# How to use:
# Create a new Python virtual environment and install the latest WayFlow version.
# ```bash
# python -m venv venv-wayflowcore
# source venv-wayflowcore/bin/activate
# pip install --upgrade pip
# pip install "wayflowcore==26.1"
# ```

# You can now run the script
# 1. As a Python file:
# ```bash
# python howto_serialize_conversations.py
# ```
# 2. As a Notebook (in VSCode):
# When viewing the file,
#  - press the keys Ctrl + Enter to run the selected cell
#  - or Shift + Enter to run the selected cell and move to the cell below


# %%[markdown]
## Imports for this guide

# %%
import json
import os

from wayflowcore.agent import Agent
from wayflowcore.controlconnection import ControlFlowEdge
from wayflowcore.conversation import Conversation
from wayflowcore.dataconnection import DataFlowEdge
from wayflowcore.executors.executionstatus import (
    FinishedStatus,
    UserMessageRequestStatus,
)
from wayflowcore.flow import Flow
from wayflowcore.serialization import autodeserialize, serialize
from wayflowcore.steps import (
    CompleteStep,
    InputMessageStep,
    OutputMessageStep,
    StartStep,
)

# %%[markdown]
## Configure your LLM

# %%
from wayflowcore.models import VllmModel

llm = VllmModel(
    model_id="LLAMA_MODEL_ID",
    host_port="LLAMA_API_URL",
)

# %%[markdown]
## Create storage functions

# %%
DIR_PATH = "path/to/your/dir"

def store_conversation(path: str, conversation: Conversation) -> str:
    """Store the given conversation and return the conversation id."""
    conversation_id = conversation.conversation_id
    serialized_conversation = serialize(conversation)

    # Read existing data
    if os.path.exists(path):
        with open(path, "r") as f:
            data = json.load(f)
    else:
        data = {}

    # Add new conversation
    data[conversation_id] = serialized_conversation

    # Write back to file
    with open(path, "w") as f:
        json.dump(data, f, indent=2)

    return conversation_id


def load_conversation(path: str, conversation_id: str) -> Conversation:
    """Load a conversation given its id."""
    with open(path, "r") as f:
        data = json.load(f)

    serialized_conversation = data[conversation_id]
    return autodeserialize(serialized_conversation)

# %%[markdown]
## Creating an agent

# %%
assistant = Agent(
    llm=llm,
    custom_instruction="You are a helpful assistant. Be concise.",
    agent_id="simple_assistant",
)

# %%[markdown]
## Run the agent

# %%
# Start a conversation
conversation = assistant.start_conversation()
conversation_id = conversation.conversation_id
print(f"1. Started conversation with ID: {conversation_id}")

# Execute initial greeting
status = conversation.execute()
print(f"2. Assistant says: {conversation.get_last_message().content}")

# Add user message
conversation.append_user_message("What is 2+2?")
print("3. User asks: What is 2+2?")

# Execute to get response
status = conversation.execute()
print(f"4. Assistant responds: {conversation.get_last_message().content}")

# %%[markdown]
## Serialize the conversation

# %%
AGENT_STORE_PATH = os.path.join(DIR_PATH, "agent_conversation.json")
store_conversation(AGENT_STORE_PATH, conversation)
print(f"5. Conversation serialized to {AGENT_STORE_PATH}")

# %%[markdown]
## Deserialize the conversation

# %%
loaded_conversation = load_conversation(AGENT_STORE_PATH, conversation_id)
print(f"6. Conversation deserialized from {AGENT_STORE_PATH}")

# Print the loaded conversation messages
print("7. Loaded conversation messages:")
messages = loaded_conversation.message_list.messages
for i, msg in enumerate(messages):
    if msg.message_type.name == "AGENT":
        role = "Assistant"
    elif msg.message_type.name == "USER":
        role = "User"
    else:
        role = msg.message_type.name
    print(f"   [{i}] {role}: {msg.content}")

# %%[markdown]
## Creating a flow

# %%
start_step = StartStep(name="start_step")
input_step = InputMessageStep(
    name="input_step",
    message_template="What's your favorite color?",
    output_mapping={InputMessageStep.USER_PROVIDED_INPUT: "user_color"},
)
output_step = OutputMessageStep(
    name="output_step", message_template="Your favorite color is {{ user_color }}. Nice choice!"
)
end_step = CompleteStep(name="end_step")

simple_flow = Flow(
    begin_step=start_step,
    control_flow_edges=[
        ControlFlowEdge(start_step, input_step),
        ControlFlowEdge(input_step, output_step),
        ControlFlowEdge(output_step, end_step),
    ],
    data_flow_edges=[
        DataFlowEdge(
            source_step=input_step,
            source_output="user_color",
            destination_step=output_step,
            destination_input="user_color",
        )
    ],
)

# %%[markdown]
## Run the flow

# %%
flow_conversation = simple_flow.start_conversation()
flow_id = flow_conversation.conversation_id
print(f"1. Started flow conversation with ID: {flow_id}")

# Execute until user input is needed
status = flow_conversation.execute()
print(f"2. Flow asks: {flow_conversation.get_last_message().content}")

# %%[markdown]
## Serialize before providing user input

# %%
FLOW_STORE_PATH = os.path.join(DIR_PATH, "flow_conversation.json")
store_conversation(FLOW_STORE_PATH, flow_conversation)
print(f"3. Flow conversation serialized to {FLOW_STORE_PATH}")

# %%[markdown]
## Deserialize the flow conversation

# %%
loaded_flow_conversation = load_conversation(FLOW_STORE_PATH, flow_id)
input_step_1 = loaded_flow_conversation.flow.steps['input_step']  # steps are restored on the loaded flow
print(f"4. Flow conversation deserialized from {FLOW_STORE_PATH}")

# Provide user input to the loaded conversation
loaded_flow_conversation.append_user_message("Blue")
print("5. User responds: Blue")

# %%[markdown]
## Resume the conversation execution

# %%
outputs = loaded_flow_conversation.execute()
print(f"6. Flow output: {outputs.output_values[OutputMessageStep.OUTPUT]}")

# Print the loaded conversation messages
print("7. Loaded flow conversation messages:")
messages = loaded_flow_conversation.message_list.messages
for i, msg in enumerate(messages):
    if msg.message_type.name == "AGENT":
        role = "Flow"
    elif msg.message_type.name == "USER":
        role = "User"
    else:
        role = msg.message_type.name
    print(f"   [{i}] {role}: {msg.content}")

# %%[markdown]
## Creating a persistent conversation loop

# %%
from typing import Optional

def run_persistent_agent(assistant: Agent, store_path: str, conversation_id: Optional[str] = None):
    """Run an agent with persistent conversation storage."""

    # Load existing conversation or start new one
    if conversation_id:
        try:
            conversation = load_conversation(store_path, conversation_id)
            print(f"Resuming conversation {conversation_id}")
        except (FileNotFoundError, KeyError):
            print(f"Conversation {conversation_id} not found, starting new one")
            conversation = assistant.start_conversation()
    else:
        conversation = assistant.start_conversation()
        print(f"Started new conversation {conversation.conversation_id}")

    # Main conversation loop
    while True:
        status = conversation.execute()

        if isinstance(status, FinishedStatus):
            print("Conversation finished")
            break
        elif isinstance(status, UserMessageRequestStatus):
            # Save before waiting for user input
            store_conversation(store_path, conversation)

            print(f"Assistant: {conversation.get_last_message().content}")
            user_input = input("You: ")

            if user_input.lower() in ["exit", "quit"]:
                print("Exiting and saving conversation...")
                break

            conversation.append_user_message(user_input)

    # Final save
    final_id = store_conversation(store_path, conversation)
    print(f"Conversation saved with ID: {final_id}")
    return final_id

# %%[markdown]
## Export config to Agent Spec

# %%
from wayflowcore.agentspec import AgentSpecExporter

serialized_assistant = AgentSpecExporter().to_json(assistant)

# %%[markdown]
## Load Agent Spec config

# %%
from wayflowcore.agentspec import AgentSpecLoader

assistant: Agent = AgentSpecLoader().load_json(serialized_assistant)