How to Enable Agents to Handle Long Contexts#
Prerequisites
This guide assumes familiarity with:
In agentic systems, conversations can become lengthy, building up extensive context over many interactions. Large Language Models (LLMs) have limits on the context they can process effectively, and performance may degrade or errors may occur when these limits are exceeded. Long contexts also incur higher costs.
To address these performance issues and reduce cost, you can apply techniques that reduce the context size while retaining key information.
This guide demonstrates three methods for reducing context size:
Discarding old messages: For long conversations, a straightforward approach is to remove older messages and retain only the most recent ones.
Summarizing tool outputs: Tool outputs are often lengthy and include details that might not be relevant anymore after a few rounds of conversation. Summarizing them to extract the key points will shorten the context and can help the agent stay focused in extended conversations.
Summarizing old messages: Rather than discarding old messages entirely, summarizing old messages can allow the agent to retain historical information.
This guide shows how to create MessageTransform objects to reduce the context size in agents.
Introduction#
Message Transforms#
A MessageTransform is a transformation applied to a PromptTemplate. It modifies the list of messages before they are sent to the LLM powering the agent. You can learn more about them in the Advanced Prompting Techniques guide.
In this guide, we define new message transforms to adjust the agent’s chat history by setting them as pre_rendering_transforms on the agent’s PromptTemplate.
LLM Setup#
In this guide, we use an LLM for both the agent and summarization tasks:
from wayflowcore.models import VllmModel
llm = VllmModel(
    model_id="LLAMA_MODEL_ID",
    host_port="LLAMA_API_URL",
)
Discarding Old Messages#
Creating the MessageTransform#
This method employs a MessageTransform that discards older messages, keeping only the latest ones.
A key consideration is to avoid dropping tool requests, as some LLM providers may fail if they receive tool results without matching requests. Here’s a helper function to split messages while maintaining consistency:
from typing import List, Tuple

from wayflowcore import Message


def _split_messages_and_guarantee_tool_calling_consistency(
    messages: List[Message], keep_x_most_recent_messages: int
) -> Tuple[List[Message], List[Message]]:
    """Split messages while guaranteeing consistency of tool requests / results."""
    messages_to_summarize = messages[:-keep_x_most_recent_messages]
    messages_to_keep = messages[-keep_x_most_recent_messages:]
    # detect tool results missing their tool request
    missing_tool_request_ids = set()
    tool_request_ids = set()
    for msg in messages_to_keep:
        if msg.tool_requests:
            for tool_request in msg.tool_requests:
                tool_request_ids.add(tool_request.tool_request_id)
        if msg.tool_result:
            tool_request_id = msg.tool_result.tool_request_id
            if tool_request_id not in tool_request_ids:
                missing_tool_request_ids.add(tool_request_id)
    if len(missing_tool_request_ids) == 0:
        return messages_to_summarize, messages_to_keep
    # everything from the earliest orphaned tool request onwards must be kept
    for idx, msg in enumerate(messages_to_summarize):
        if any(
            tc.tool_request_id in missing_tool_request_ids for tc in (msg.tool_requests or [])
        ):
            return messages_to_summarize[:idx], messages_to_summarize[idx:] + messages_to_keep
    raise ValueError("Found a tool result with no matching tool request in the message list")
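Before wiring this helper into a transform, its behavior can be checked in isolation. The following standalone sketch reimplements the same splitting logic with plain dictionaries in place of wayflowcore Message objects; the dict keys and function name are ours, purely illustrative:

```python
from typing import List, Tuple

def split_keep_tool_pairs(messages: List[dict], keep: int) -> Tuple[List[dict], List[dict]]:
    """Split into (to_summarize, to_keep) without separating a tool result from its request."""
    to_summarize, to_keep = messages[:-keep], messages[-keep:]
    # tool requests issued inside the kept window
    request_ids = {rid for m in to_keep for rid in m.get("tool_request_ids", [])}
    # tool results in the kept window whose request falls outside it
    orphans = {
        m["tool_result_id"]
        for m in to_keep
        if "tool_result_id" in m and m["tool_result_id"] not in request_ids
    }
    if not orphans:
        return to_summarize, to_keep
    # move the cut point back to just before the earliest orphaned request
    for idx, m in enumerate(to_summarize):
        if orphans & set(m.get("tool_request_ids", [])):
            return to_summarize[:idx], to_summarize[idx:] + to_keep
    raise ValueError("tool result without any matching tool request")

history = [
    {"content": "user question"},
    {"tool_request_ids": ["req-1"]},  # agent issues a tool call
    {"tool_result_id": "req-1"},      # the tool answers
    {"content": "agent reply"},
]
# Keeping only the last 2 messages would orphan the result of "req-1",
# so the cut moves back to include the request message as well.
old, recent = split_keep_tool_pairs(history, keep=2)
# old == [{"content": "user question"}]; recent contains the remaining 3 messages
```

With `keep=1` no tool pair is split, so the function falls back to a plain tail split.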
The message transform can then be defined as follows:
from wayflowcore.transforms import MessageTransform


class KeepOnlyRecentMessagesTransform(MessageTransform):
    """Message transform that only keeps the X most recent messages."""

    def __init__(self, keep_x_most_recent_messages: int = 10):
        super().__init__()
        self.keep_x_most_recent_messages = keep_x_most_recent_messages

    async def call_async(self, messages: List["Message"]) -> List["Message"]:
        old_messages, recent_messages = _split_messages_and_guarantee_tool_calling_consistency(
            messages=messages,
            keep_x_most_recent_messages=self.keep_x_most_recent_messages,
        )
        return recent_messages
Integrating the MessageTransform into the Agent#
After defining the MessageTransform, incorporate it into the agent’s prompt template, then run the agent:
from wayflowcore import Agent

transform = KeepOnlyRecentMessagesTransform(keep_x_most_recent_messages=1)
agent = Agent(
    llm=llm,
    tools=[read_logs_tool],
    agent_template=llm.agent_template.with_additional_pre_rendering_transform(
        transform, append=False
    ),
)
conversation = agent.start_conversation()
# this message is too long to be processed, but it will be dropped because it is too old
conversation.append_user_message("Super long message: " + "..." * 1000000)
conversation.append_agent_message("OK")
conversation.append_user_message("What is the capital of Switzerland?")
conversation.execute()
Note
We use a pre_rendering message transform because these are applied to the chat history. Here,
we only want to modify the chat history, not the system messages or the formatting of the tools
for the LLM.
In general, pre_rendering transforms modify the chat history, while post_rendering transforms
format the final prompt for the LLM.
Note
Using append=False means that the message transform is added as the first message transform in
the list. The order in which message transforms are configured matters, since each may modify
the content that the next transform receives.
Therefore, to avoid cache misses, it is best to place message transforms that use caching as
early as possible in the template's list of transforms.
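To illustrate the ordering caveat, here is a toy sketch with plain functions standing in for message transforms (none of this is wayflowcore API): because each transform receives the output of the previous one, swapping the order changes the result.

```python
from typing import Callable, List

# Toy "transforms": each maps a list of message strings to a new list.
def keep_last_two(messages: List[str]) -> List[str]:
    return messages[-2:]

def drop_short(messages: List[str]) -> List[str]:
    return [m for m in messages if len(m) >= 8]

def apply_all(
    transforms: List[Callable[[List[str]], List[str]]], messages: List[str]
) -> List[str]:
    # Each transform receives the output of the previous one.
    for transform in transforms:
        messages = transform(messages)
    return messages

history = ["hi", "a long message", "ok", "another long one"]

trimmed_first = apply_all([keep_last_two, drop_short], history)   # 1 message survives
filtered_first = apply_all([drop_short, keep_last_two], history)  # 2 messages survive
```

The same pipeline with the same history produces different outputs depending on order, which is why a caching transform placed late in the list may see different inputs on every turn and miss its cache.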
Summarizing Tool Outputs#
Creating the MessageTransform#
This approach uses a MessageTransform that splits large tool results into chunks, summarizes each chunk sequentially with an LLM, and replaces the original messages with summarized versions.
The following MessageTransform is an example that can be customized for other scenarios.
import logging
from typing import ClassVar, Dict, List

from wayflowcore import Message
from wayflowcore.tools import ToolResult
from wayflowcore.transforms import MessageTransform
from wayflowcore._utils.async_helpers import run_async_function_in_parallel


class SummarizeToolResultMessageTransform(MessageTransform):

    MAX_LENGTH: ClassVar[int] = 10_000

    def __init__(self):
        super().__init__()
        self._summarized_messages_cache: Dict[str, Message] = {}

    async def call_async(self, messages: List[Message]) -> List[Message]:
        return await run_async_function_in_parallel(
            func_async=self._summarize_message_content_if_tool_result,
            input_list=messages,
        )

    async def _summarize_message_content_if_tool_result(self, message: Message) -> Message:
        # Caching is important for this message transform, to avoid recomputing the
        # costly summarization on every turn.
        if message.tool_result is None:
            return message
        message_hash = message.hash
        if message_hash not in self._summarized_messages_cache:
            # Create a new message to replace the message whose content is too long
            self._summarized_messages_cache[message_hash] = Message(
                tool_result=ToolResult(
                    content=await self._summarize_content(message.tool_result.content),
                    tool_request_id=message.tool_result.tool_request_id,
                ),
            )
        return self._summarized_messages_cache[message_hash]

    @staticmethod
    async def _summarize_content(content: str) -> str:
        if len(content) < SummarizeToolResultMessageTransform.MAX_LENGTH:
            return content
        current_summary = "Nothing summarized yet"
        chunk_size = SummarizeToolResultMessageTransform.MAX_LENGTH
        for chunk_index in range(0, len(content), chunk_size):
            logging.info(f"Summarizing chunk {chunk_index}/{len(content)}")
            llm_completion = await llm.generate_async(
                prompt=(
                    "Please generate a new summary based on the previous summary and the added content."
                    " The summary should be just a few sentences retaining the most important information.\n\n"
                    "Previous summary:\n"
                    f"{current_summary}\n\n"
                    "Added content:\n"
                    f"{content[chunk_index:chunk_index + chunk_size]}\n"
                    "Reminder: your response will be replacing the whole content, so just return a summary."
                )
            )
            current_summary = llm_completion.message.content
        summarized_tool_result = f"Summarized result:\n{current_summary}"
        logging.info(f"Message has been summarized to '''\n{summarized_tool_result}\n'''")
        return summarized_tool_result
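The rolling-summary loop above can be exercised without an LLM by stubbing out the generation call. In this sketch (all names are ours, not wayflowcore API), each chunk is folded into a running summary exactly as the transform does:

```python
from typing import Callable

def rolling_summary(
    content: str,
    chunk_size: int,
    summarize: Callable[[str, str], str],
) -> str:
    """Fold each chunk into a running summary, mirroring the transform's loop."""
    summary = "Nothing summarized yet"
    for start in range(0, len(content), chunk_size):
        summary = summarize(summary, content[start : start + chunk_size])
    return summary

calls = []

def fake_llm(previous_summary: str, chunk: str) -> str:
    # Stand-in for the llm.generate_async call: record inputs, return a short digest.
    calls.append((previous_summary, chunk))
    return f"summary of {len(chunk)} chars"

result = rolling_summary("x" * 25, chunk_size=10, summarize=fake_llm)
# Three chunks (10 + 10 + 5 chars), each call seeing the previous summary.
```

Each iteration sees only one chunk plus the running summary, so the prompt size stays bounded regardless of how long the original tool output is.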
Integrating the MessageTransform into the Agent#
After defining the MessageTransform, add it to the agent’s prompt template as shown below:
from wayflowcore import Agent, Message, tool


@tool
def read_logs_tool() -> str:
    """Return logs from the system"""
    return (
        "Starting long processing\n"
        + "Waiting for process ...\n" * 2_000
        + "Found error: Missing credentials for user kurt_andrews."
        + " Please pass the correct credentials.\n"
    )


transform = SummarizeToolResultMessageTransform()
agent = Agent(
    llm=llm,
    tools=[read_logs_tool],
    agent_template=llm.agent_template.with_additional_pre_rendering_transform(
        transform, append=False
    ),
)
Then, run the agent like this:
conversation = agent.start_conversation()
conversation.append_user_message("Can you explain the error in the system?")
conversation.execute()
# INFO:root:Summarizing chunk 0/480118
# INFO:root:Summarizing chunk 100000/480118
# INFO:root:Summarizing chunk 200000/480118
# INFO:root:Summarizing chunk 300000/480118
# INFO:root:Summarizing chunk 400000/480118
# INFO:root:Message has been summarized to '''
# This long tool result has been summarized:
# The system is stuck in an infinite loop, repeatedly displaying "Waiting for process..." without
# any indication of progress or completion. Additionally, an error was found due to missing
# credentials for user kurt_andrews, requiring correct credentials to be passed.
# '''
A natural extension of this transform would be to summarize long tool results only once they are old, keeping them intact while the agent is still actively working with them.
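As a rough illustration of that extension, the sketch below shortens only the older tool results and leaves the most recent one intact; plain strings stand in for messages and simple truncation stands in for the LLM summarization (all names are hypothetical):

```python
from typing import List, Optional

def summarize_old_tool_results(
    contents: List[Optional[str]],  # None = not a tool result, str = tool result content
    max_length: int,
    keep_last_intact: int = 1,
) -> List[Optional[str]]:
    """Shorten long tool results, except the `keep_last_intact` most recent ones."""
    tool_indices = [i for i, c in enumerate(contents) if c is not None]
    protected = set(tool_indices[-keep_last_intact:]) if keep_last_intact else set()
    return [
        # truncation stands in for the real LLM summarization
        c if c is None or i in protected or len(c) <= max_length
        else c[:max_length] + " ...[summarized]"
        for i, c in enumerate(contents)
    ]

history = [None, "a" * 50, None, "b" * 50]
shortened = summarize_old_tool_results(history, max_length=10)
# The older tool result is shortened; the most recent one is left intact.
```

In the real transform, the age condition would simply gate the call to `_summarize_content` on the message's position in the history.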
Summarizing Old Messages#
Creating the MessageTransform#
This technique uses a MessageTransform to summarize older messages, similar to the discarding method but preserving some past information.
from typing import Dict, Generic, List, Optional, TypeVar

from wayflowcore.models import LlmModel
from wayflowcore._utils.hash import fast_stable_hash
from wayflowcore._utils._templating_helpers import render_template

T = TypeVar("T")


class MessageTransformCache(Generic[T]):
    def __init__(self) -> None:
        self.state: Dict[str, T] = {}

    def add(self, key: str, value: T) -> None:
        self.state[key] = value

    def remove(self, key: str) -> None:
        raise NotImplementedError()

    def get(self, key: str) -> Optional[T]:
        return self.state.get(key, None)


class SummarizationMessageTransform(MessageTransform):
    """
    Stateful message transform that summarizes the list of messages when it becomes too long.
    Preserves consistency of tool calls/results.
    """

    def __init__(
        self,
        llm: LlmModel,
        max_num_messages: int = 20,
        min_num_messages: int = 5,
        summarization_instruction: str = "Please make a summary of the previous messages. Include relevant information and keep it short. Your response will replace the messages, so just output the summary directly, no introduction needed.",
        summarized_message: str = "Summarized conversation: {{summary}}",
        _cache_implementation: type = MessageTransformCache,
    ):
        """
        Parameters
        ----------
        llm:
            LLM to use for the summarization.
        max_num_messages:
            Number of messages after which summarization is triggered. Tune this parameter depending on the
            context length of your model and the price you are willing to pay (higher means longer conversation
            prompts and more tokens).
        min_num_messages:
            Number of recent messages to exclude from summarization. Tune this parameter to avoid summarizing
            very recent messages and keep the agent responsive and relevant.
        summarization_instruction:
            Instruction for the LLM on how to summarize the previous messages.
        summarized_message:
            Jinja2 template describing how to present the summary (with variable `summary`) to the agent using the transform.

        Examples
        --------
        >>> summarization_transform = SummarizationMessageTransform(
        ...     llm=llm,
        ...     # if the conversation reaches 30 messages, it will trigger summarization
        ...     max_num_messages=30,
        ...     # when summarization is triggered, it will summarize all the messages except the last 10,
        ...     min_num_messages=10,
        ... )
        """
        super().__init__()
        self.llm = llm
        self.summarization_instruction = summarization_instruction
        self.summarized_message = summarized_message
        self.max_num_messages = max_num_messages
        self.min_num_messages = min_num_messages
        self.internal_cache = _cache_implementation["Message"]()  # type: ignore
        # a cached message means all the messages before it have been summarized into that message

    def _partition_cached_and_new_messages(self, messages: List["Message"]) -> List["Message"]:
        messages_hashes = []
        for idx, msg in enumerate(messages):
            messages_hashes.append(msg.hash)
            curr_hash = fast_stable_hash(messages_hashes)
            found_msg = self.internal_cache.get(curr_hash)
            if found_msg is not None:
                # this prefix was already summarized: replace it by its summary message
                return self._partition_cached_and_new_messages([found_msg] + messages[idx + 1 :])
        return messages

    async def call_async(self, messages: List["Message"]) -> List["Message"]:
        formatted_messages = self._partition_cached_and_new_messages(messages)
        if len(formatted_messages) <= self.max_num_messages:
            return formatted_messages
        messages_to_summarize, messages_to_keep = _split_messages_and_guarantee_tool_calling_consistency(
            messages=formatted_messages,
            keep_x_most_recent_messages=self.min_num_messages,
        )
        summarized_message = await self._summarize(messages_to_summarize)
        summarized_hash = fast_stable_hash([msg.hash for msg in messages_to_summarize])
        self.internal_cache.add(summarized_hash, summarized_message)
        return [summarized_message] + messages_to_keep

    async def _summarize(self, messages: List[Message]) -> Message:
        chat_history = messages + [Message(role="user", content=self.summarization_instruction)]
        prompt = await self.llm.chat_template.format_async(
            inputs={
                self.llm.chat_template.CHAT_HISTORY_PLACEHOLDER_NAME: chat_history,
            }
        )
        completion = await self.llm.generate_async(prompt=prompt)
        summary = completion.message.content
        return Message(
            content=render_template(template=self.summarized_message, inputs=dict(summary=summary)),
            role="user",
        )
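The prefix-caching trick used above can be illustrated in isolation. This sketch uses plain strings as messages and hashlib in place of fast_stable_hash (all names are ours, not wayflowcore API): a summary is cached under the hash of the prefix it replaces, and later calls substitute it back in.

```python
import hashlib
from typing import Dict, List

def prefix_hash(messages: List[str]) -> str:
    # Stand-in for fast_stable_hash over the message hashes.
    h = hashlib.sha256()
    for m in messages:
        h.update(m.encode())
    return h.hexdigest()

class SummaryCache:
    """Maps the hash of a summarized prefix to its one-message summary."""

    def __init__(self) -> None:
        self._state: Dict[str, str] = {}

    def add(self, key: str, summary: str) -> None:
        self._state[key] = summary

    def shorten(self, messages: List[str]) -> List[str]:
        # Find the shortest cached prefix and replace it by its summary, recursively.
        for end in range(1, len(messages) + 1):
            summary = self._state.get(prefix_hash(messages[:end]))
            if summary is not None:
                return self.shorten([summary] + messages[end:])
        return messages

cache = SummaryCache()
history = ["msg1", "msg2", "msg3", "msg4"]
cache.add(prefix_hash(history[:3]), "summary-of-1-to-3")
shortened = cache.shorten(history)  # ["summary-of-1-to-3", "msg4"]
```

Because the cache key depends only on the message hashes, the substitution is repeatable across turns without re-invoking the LLM, which is what keeps the stateful transform cheap.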
Integrating the MessageTransform into the Agent#
After defining the MessageTransform, add it to the agent’s prompt template, then run the agent:
transform = SummarizationMessageTransform(llm=llm)
agent = Agent(
    llm=llm,
    tools=[read_logs_tool],
    agent_template=llm.agent_template.with_additional_pre_rendering_transform(
        transform, append=False
    ),
)
conversation = agent.start_conversation()
LONG_CONVERSATION = [
{"role": "user", "content": "Hi! Can you tell me something interesting about dolphins?"},
{
"role": "assistant",
"content": "Absolutely! Dolphins are fascinating creatures, famous for their intelligence and complex behavior. For example, they have been observed using tools, such as covering their snouts with sponges to protect themselves while foraging on the seafloor. Dolphins also display strong social bonds and have been known to help injured individuals within their pods. Communication among dolphins is advanced; they use a series of clicks, whistles, and body movements to convey information, and some species even have distinctive signature whistles that function like names.",
},
{"role": "user", "content": "Wow, I didn’t know that. Do dolphins have good memory?"},
{
"role": "assistant",
"content": "Dolphins possess remarkable memories, particularly when it comes to their social groups and vocal communication. Researchers have discovered that dolphins can remember the unique signature whistles of other dolphins for over 20 years, which is the longest social memory recorded in non-human animals. This ability highlights their sophisticated cognitive abilities and the importance of long-term relationships in dolphin societies. Memory also plays a crucial role in their navigation and hunting skills, as dolphins migrate and follow paths in the oceans over great distances. In addition, their keen memory supports learning from one another, enhancing the social structure of their pods.",
},
{
"role": "user",
"content": "That’s impressive. Are there other animals with similar intelligence?",
},
{
"role": "assistant",
"content": "Indeed, there are several other animals that exhibit forms of intelligence comparable to dolphins. Great apes, such as chimpanzees and orangutans, have been observed making and using tools, solving puzzles, and displaying empathy and self-awareness. Elephants show emotional intelligence and have complex social relationships, including mourning their dead and helping injured companions. Crows and parrots, members of the bird family, demonstrate advanced problem-solving, mimicry, and even the ability to plan for future events. These examples across different branches of the animal kingdom show that intelligence has evolved in diverse and fascinating ways.",
},
{"role": "user", "content": "Can elephants recognize themselves in a mirror like dolphins?"},
{
"role": "assistant",
"content": "Yes, elephants are among the elite group of species known to pass the mirror self-recognition test, an experiment generally considered evidence of self-awareness. In these experiments, elephants explore themselves in mirrors and investigate marks on their bodies they otherwise couldn’t see. Notably, this test has only been passed by a few species, including certain great apes, dolphins, magpies, and elephants. Such ability hints at a deep cognitive capacity, suggesting they have a mental representation of themselves, which is foundational for empathy and understanding the experiences of others. This self-recognition is just one of the many amazing aspects of elephant intelligence that continue to be studied.",
},
{"role": "user", "content": "What is the mirror test?"},
{
"role": "assistant",
"content": "The mirror test, also known as the mark test, is a behavioral technique designed to assess self-awareness in animals. In the test, an animal is marked with an odorless dye in a place it cannot see without a mirror. The animal is then given access to a mirror, and researchers observe whether it investigates or tries to touch the mark on its own body, indicating recognition of its reflection as itself, not another animal. Passing the mirror test suggests a level of self-concept, which is considered an advanced cognitive trait. However, not all intelligent animals pass the mirror test, and failing it doesn’t necessarily mean an animal lacks self-awareness – it might simply not care about the mark or not rely on visual cues.",
},
{
"role": "user",
"content": "Besides animals, what’s another field where recognition is important?",
},
{
"role": "assistant",
"content": "Recognition is vitally important in many human technologies, notably in the field of artificial intelligence. Facial recognition technology is widely used for security purposes, such as unlocking smartphones or monitoring public spaces. In social media, these algorithms help automatically tag friends in photos, organizing vast amounts of visual data. Additionally, recognition systems are applied in retail, for personalized marketing, and in health care, for patient identification and monitoring. However, these powerful tools also raise important questions around privacy and data security, sparking ongoing debates about their regulation and ethical use.",
},
{"role": "user", "content": "Is facial recognition always accurate?"},
{
"role": "assistant",
"content": "Facial recognition technology has improved significantly over recent years, but it is not flawless and its accuracy can vary depending on several factors. Lighting, angle, image quality, and the diversity of data used to train the algorithms all influence performance. Notably, many studies have shown that facial recognition systems can have higher error rates when identifying people with darker skin tones or from underrepresented demographics, leading to concerns about bias and discrimination. This has prompted researchers and policymakers to push for better training data, algorithmic transparency, and stricter oversight. As the technology becomes more widespread, ensuring fairness and accuracy is an essential focus for developers and regulators alike.",
},
{"role": "user", "content": "What are some other ethical concerns in AI?"},
{
"role": "assistant",
"content": "Ethical challenges in AI are numerous and complex, reflecting the far-reaching impact of these technologies on society. Privacy concerns are paramount, as AI systems often require large amounts of personal data, raising questions about consent and data protection. Bias in algorithms is another major issue, as AI can inadvertently reinforce existing societal prejudices if not carefully monitored and tested. Transparency is crucial, because many AI systems, especially those using deep learning, can behave as 'black boxes'—making decisions in ways that aren’t easily understandable to humans. Additionally, there are fears around job displacement, as automation could replace roles faster than new opportunities are created, necessitating careful consideration and retraining initiatives.",
},
{
"role": "user",
"content": "Job displacement is interesting. Which jobs are most at risk from AI?",
},
{
"role": "assistant",
"content": "Jobs that involve repetitive, predictable tasks are currently considered the most vulnerable to automation by AI. For example, data entry clerks, assembly line workers, and certain types of customer service representatives face a high likelihood of being replaced or augmented by machines. Even within fields like transportation, we’re seeing the rise of autonomous vehicles that could impact truck drivers and delivery personnel. On the other hand, roles that require creativity, complex decision-making, emotional intelligence, or physical dexterity in unpredictable environments—such as healthcare providers, artists, and skilled trades—are less susceptible for now. To manage this transition, many experts recommend investing in retraining programs and encouraging adaptability in the workforce.",
},
{"role": "user", "content": "On a lighter note, do you know any fun space facts?"},
{
"role": "assistant",
"content": "Absolutely, space offers an endless trove of fascinating discoveries. For example, Venus has a very unusual rotation: a single day on Venus (one rotation on its axis) is actually longer than its year (one complete orbit around the Sun) due to its slow spin. There are also diamond rains on planets like Neptune and Uranus, caused by extreme pressure turning carbon into solid diamonds. You might be surprised to learn that there are more stars in the observable universe than grains of sand on all the Earth’s beaches combined. Finally, if you were to fall into a black hole, the laws of physics as we understand them break down at the singularity—a phenomenon still not fully explained by scientists.",
},
{"role": "user", "content": "That’s fascinating! How hot does it get on Venus?"},
{
"role": "assistant",
"content": "Venus is the hottest planet in our solar system, even outpacing Mercury, which is closer to the Sun. The average surface temperature on Venus can reach an astonishing 465 degrees Celsius, or about 869 degrees Fahrenheit. This is hot enough to melt lead, and it creates an extremely inhospitable environment for life as we know it. The intense heat is largely due to the planet’s dense atmosphere, composed mostly of carbon dioxide, which traps heat through a runaway greenhouse effect. In addition to being scorching, Venus’s atmosphere features thick clouds of sulfuric acid that make its surface perpetually shrouded and acidic.",
},
{"role": "user", "content": "Why is Venus hotter than Mercury?"},
{
"role": "assistant",
"content": "Despite being the second planet from the Sun, Venus is hotter than Mercury primarily because of its thick, carbon dioxide-rich atmosphere. This atmosphere acts as a blanket, trapping incoming solar radiation and preventing heat from escaping, leading to the planet's infamous runaway greenhouse effect. Meanwhile, Mercury has only a thin exosphere, which allows heat to escape into space quickly, resulting in dramatic temperature swings between day and night. The conditions on Venus are so extreme that even spacecraft sent to the surface survive only a few hours before being destroyed by the temperature and pressure. In effect, Venus offers a dramatic example of how atmospheric composition can significantly influence planetary climates.",
},
{"role": "user", "content": "Are there any missions planned to Venus soon?"},
{
"role": "assistant",
"content": "Yes, the next decade promises to be exciting for Venus exploration, with both NASA and the European Space Agency planning new missions. NASA has announced the VERITAS and DAVINCI+ missions, aiming to study Venus’s geology, atmosphere, and history, with launches expected later this decade. ESA is preparing its own mission called EnVision, which will investigate Venus's surface and atmosphere using advanced radar and spectrometry. These missions hope to resolve key questions, such as whether Venus ever had oceans, what processes led to its intense greenhouse state, and if volcanic activity is still ongoing. Renewed interest in Venus stems from its potential to teach us about planetary evolution, including factors that affect habitability.",
},
{"role": "user", "content": "Cool! Back to Earth, what's the largest living structure here?"},
{
"role": "assistant",
"content": "The largest living structure on Earth is the Great Barrier Reef, stretching more than 2,300 kilometers along Australia’s northeastern coast. This massive coral system is so large it can be seen from space, and it comprises thousands of individual reefs and hundreds of islands built by billions of tiny coral polyps. The reef supports an extraordinary diversity of marine life, hosting over 1,500 fish species, turtles, sea snakes, and countless invertebrates. It also plays a crucial ecological role, protecting coastlines from storms and supporting local economies through tourism and fishing. Unfortunately, despite its size and importance, the Great Barrier Reef faces significant threats from climate change, particularly coral bleaching and ocean acidification.",
},
{"role": "user", "content": "Is it in danger?"},
{
"role": "assistant",
"content": "Sadly, the Great Barrier Reef is indeed at risk due to multiple environmental pressures. One of the most severe challenges is coral bleaching, which occurs when higher sea temperatures stress corals, causing them to expel the symbiotic algae they rely on for food and color. Repeated bleaching events in recent years have resulted in large-scale loss of coral cover, and the recovery process can be slow or incomplete. Additional threats include poor water quality from agricultural runoff, overfishing, cyclones, and outbreaks of crown-of-thorns starfish, which prey on corals. Conservation efforts are underway, but reducing global carbon emissions is crucial to ensuring the reef’s long-term survival.",
},
{"role": "user", "content": "Can coral reefs recover from bleaching?"},
{
"role": "assistant",
"content": "Reef recovery after bleaching is possible, but it depends on the severity of the event and the time between subsequent stress episodes. Corals can regain their symbiotic algae and recover their vibrant colors if water temperatures return to normal quickly enough. However, back-to-back bleaching events—such as those in recent years—inhibit full recovery and can lead to widespread coral death. The resilience of a reef also depends on the health of its local ecosystem, water quality, and the diversity of coral species present. Active restoration efforts, like coral gardening and selective breeding for heat-tolerant corals, are being explored to help boost reef recovery in the face of a warming climate.",
},
{"role": "user", "content": "Are there artificial reefs?"},
{
"role": "assistant",
"content": "Yes, artificial reefs have been employed worldwide as a means to enhance marine habitats and promote biodiversity. These structures can be made from a variety of materials, including concrete blocks, purpose-built modules, and even deliberately sunken ships or decommissioned military vehicles. When placed on the ocean floor, artificial reefs provide surfaces for corals to attach to and develop, which in turn attract fish and other marine species seeking food and shelter. Over time, these reefs can become thriving ecosystems, sometimes rivaling their natural counterparts in productivity. Besides ecological benefits, artificial reefs are also used to create tourist attractions for diving and help reduce pressure on natural reefs by diverting fishing and recreational activity.",
},
{"role": "user", "content": "Do fish immediately move into new artificial reefs?"},
{
"role": "assistant",
"content": "Fish and other marine organisms often begin exploring and colonizing new artificial reefs quite quickly—sometimes within just a few weeks of their deployment. The structures offer instant shelter from predators and new feeding opportunities, which attract small fish and invertebrates first. Over several months, larger predatory species may start appearing as the new reef becomes established as a hunting and breeding ground. The colonization process is further accelerated if the surrounding water already hosts healthy marine life populations. Eventually, the artificial reef can develop complex food webs, supporting a community of organisms similar to that of natural reefs, though full ecosystem development takes several years.",
},
{
"role": "user",
"content": "I love learning these facts. Can you tell me one more surprising animal fact?",
},
{
"role": "assistant",
"content": "Of course! One particularly astonishing animal fact concerns the octopus: these remarkable creatures possess not one, but three hearts. Two of the hearts are responsible for pumping blood through the gills, where it picks up oxygen, while the third pumps oxygenated blood throughout the rest of the body. Octopuses also have blue blood, a consequence of their use of a copper-rich protein called hemocyanin instead of hemoglobin, which is based on iron and gives human blood its red color. The blue blood is more efficient at transporting oxygen in the cold, low-oxygen conditions of the ocean, making it especially advantageous for the octopus’s varied habitats. In addition to their unique circulatory system, octopuses are also known for their high intelligence, problem-solving abilities, and incredible camouflage skills.",
},
{"role": "user", "content": "That’s wild. Why do they have blue blood?"},
{
"role": "assistant",
"content": "The blue color of octopus blood is due to the presence of hemocyanin, a copper-based molecule that binds with oxygen for transport throughout the body. This adaptation is particularly effective in the cold, low-oxygen environments found in many parts of the oceans where octopuses live. Hemocyanin is more efficient than hemoglobin (the iron-based molecule in red blood) under these conditions, which helps octopuses thrive in diverse and sometimes extreme marine settings. The evolution of blue blood is an example of the many unique physiological characteristics that have enabled cephalopods to become such successful marine invertebrates. It’s remarkable how life develops specialized solutions to meet the challenges of different environments.",
},
]
for msg in LONG_CONVERSATION:
    conversation.append_message(
        Message(content=msg["content"], role=msg["role"])  # type: ignore
    )
conversation.append_user_message("What tool do dolphins sometimes use when foraging on the seafloor, and why?")
conversation.execute()
Agent Spec Exporting/Loading#
Due to the custom MessageTransform, the agent cannot be exported to an Agent Spec
configuration.
Next Steps#
With your new knowledge of using MessageTransform to manage large message contents, proceed
to How to Build Assistants with Tools.
Full Code#
Click the card at the top of this page to download the complete code for this guide, or copy it below.
# Copyright © 2025 Oracle and/or its affiliates.
#
# This software is under the Apache License 2.0
# (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0) or Universal Permissive License
# (UPL) 1.0 (LICENSE-UPL or https://oss.oracle.com/licenses/upl), at your option.

# %%[markdown]
# Code Example - How to use MessageTransform
# ------------------------------------------

# How to use:
# Create a new Python virtual environment and install the latest WayFlow version.
# ```bash
# python -m venv venv-wayflowcore
# source venv-wayflowcore/bin/activate
# pip install --upgrade pip
# pip install "wayflowcore==26.1"
# ```

# You can now run the script:
# 1. As a Python file:
# ```bash
# python howto_long_context.py
# ```
# 2. As a Notebook (in VSCode):
# When viewing the file,
# - press the keys Ctrl + Enter to run the selected cell
# - or Shift + Enter to run the selected cell and move to the cell below


import logging
from typing import ClassVar, List, Dict

logging.basicConfig(level=logging.INFO)

# %%[markdown]
## Define the llm

# %%
from wayflowcore.models import VllmModel

llm = VllmModel(
    model_id="LLAMA_MODEL_ID",
    host_port="LLAMA_API_URL",
)

# %%[markdown]
## Creating the message transform

# %%
from wayflowcore import Message
from wayflowcore.tools import ToolResult
from wayflowcore.transforms import MessageTransform
from wayflowcore._utils.async_helpers import run_async_function_in_parallel


class SummarizeToolResultMessageTransform(MessageTransform):

    MAX_LENGTH: ClassVar[int] = 10_000

    def __init__(self):
        super().__init__()
        self._summarized_messages_cache: Dict[str, Message] = {}

    async def call_async(self, messages: List[Message]) -> List[Message]:
        return await run_async_function_in_parallel(
            func_async=self._summarize_message_content_if_tool_result,
            input_list=messages,
        )

    async def _summarize_message_content_if_tool_result(self, message: Message) -> Message:
        # Caching is important for this message transform, to avoid recomputing
        # the costly summarization on every turn.
        if message.tool_result is None:
            return message
        message_hash = message.hash
        if message_hash not in self._summarized_messages_cache:
            # Create a new message that replaces the one whose content is too long
            self._summarized_messages_cache[message_hash] = Message(
                tool_result=ToolResult(
                    content=await self._summarize_content(message.tool_result.content),
                    tool_request_id=message.tool_result.tool_request_id,
                ),
            )
        return self._summarized_messages_cache[message_hash]

    @staticmethod
    async def _summarize_content(content: str) -> str:
        if len(content) < SummarizeToolResultMessageTransform.MAX_LENGTH:
            return content

        current_summary = "Nothing summarized yet"
        chunk_size = SummarizeToolResultMessageTransform.MAX_LENGTH
        for chunk_index in range(0, len(content), chunk_size):
            logging.info(f"Summarizing chunk {chunk_index}/{len(content)}")
            llm_completion = await llm.generate_async(
                prompt=(
                    "Please generate a new summary based on the previous summary and the added content."
                    " The summary should be just a few sentences retaining the most important information.\n\n"
                    "Previous summary:\n"
                    f"{current_summary}\n\n"
                    "Added content:\n"
                    f"{content[chunk_index:chunk_index + chunk_size]}\n"
                    "Reminder: your response will be replacing the whole content, so just return a summary."
                )
            )
            current_summary = llm_completion.message.content
        summarized_tool_result = f"Summarized result:\n{current_summary}"
        logging.info(f"Message has been summarized to '''\n{summarized_tool_result}\n'''")
        return summarized_tool_result


# %%[markdown]
## Creating the agent

# %%
from wayflowcore import Agent, Message, tool

@tool
def read_logs_tool() -> str:
    """Return logs from the system"""
    return (
        "Starting long processing\n"
        + "Waiting for process ...\n" * 2_000
        + "Found error: Missing credentials for user kurt_andrews."
        + " Please pass the correct credentials.\n"
    )

transform = SummarizeToolResultMessageTransform()
agent = Agent(
    llm=llm,
    tools=[read_logs_tool],
    agent_template=llm.agent_template.with_additional_pre_rendering_transform(
        transform, append=False
    ),
)


# %%[markdown]
## Running the agent

# %%
conversation = agent.start_conversation()
conversation.append_user_message("Can you explain the error in the system?")
conversation.execute()
# INFO:root:Summarizing chunk 0/480118
# INFO:root:Summarizing chunk 100000/480118
# INFO:root:Summarizing chunk 200000/480118
# INFO:root:Summarizing chunk 300000/480118
# INFO:root:Summarizing chunk 400000/480118
# INFO:root:Message has been summarized to '''
# This long tool result has been summarized:
# The system is stuck in an infinite loop, repeatedly displaying "Waiting for process..." without
# any indication of progress or completion. Additionally, an error was found due to missing
# credentials for user kurt_andrews, requiring correct credentials to be passed.
# '''
conversation.append_user_message("What is the exact message repeated in a loop? No need to recall the tool")
conversation.execute()


# %%[markdown]
## Keep Messages Consistent

# %%
from typing import Tuple


def _split_messages_and_guarantee_tool_calling_consistency(
    messages: List[Message], keep_x_most_recent_messages: int
) -> Tuple[List[Message], List[Message]]:
    """Guarantees consistency of tool requests / results"""
    messages_to_summarize = messages[:-keep_x_most_recent_messages]
    messages_to_keep = messages[-keep_x_most_recent_messages:]

    # detect tool results missing their tool request
    missing_tool_request_ids = set()
    tool_request_ids = set()
    for msg in messages_to_keep:
        if msg.tool_requests:
            for tool_request in msg.tool_requests:
                tool_request_ids.add(tool_request.tool_request_id)
        if msg.tool_result:
            tool_request_id = msg.tool_result.tool_request_id
            if tool_request_id not in tool_request_ids:
                missing_tool_request_ids.add(tool_request_id)

    if len(missing_tool_request_ids) == 0:
        return messages_to_summarize, messages_to_keep

    # everything from the dangling tool request onwards must be kept, not summarized
    for idx, msg in enumerate(messages_to_summarize):
        if any(
            tc.tool_request_id in missing_tool_request_ids for tc in (msg.tool_requests or [])
        ):
            return messages_to_summarize[:idx], messages_to_summarize[idx:] + messages_to_keep

    raise ValueError("Should not happen")


# %%[markdown]
## Drop Old Message Transform

# %%
class KeepOnlyRecentMessagesTransform(MessageTransform):
    """Message transform that only keeps the X most recent messages."""

    def __init__(self, keep_x_most_recent_messages: int = 10):
        super().__init__()
        self.keep_x_most_recent_messages = keep_x_most_recent_messages

    async def call_async(self, messages: List["Message"]) -> List["Message"]:
        old_messages, recent_messages = _split_messages_and_guarantee_tool_calling_consistency(
            messages=messages,
            keep_x_most_recent_messages=self.keep_x_most_recent_messages,
        )
        return recent_messages


# %%[markdown]
## Drop Old Message Transform Run

# %%
transform = KeepOnlyRecentMessagesTransform(keep_x_most_recent_messages=1)
agent = Agent(
    llm=llm,
    tools=[read_logs_tool],
    agent_template=llm.agent_template.with_additional_pre_rendering_transform(
        transform, append=False
    ),
)

conversation = agent.start_conversation()
# this message is so long that it could not be processed, but it will be dropped because it is too old
conversation.append_user_message("Super long message: " + "..." * 1000000)
conversation.append_agent_message("OK")
conversation.append_user_message("What is the capital of Switzerland?")
conversation.execute()


# %%[markdown]
## Summarize Old Message Transform

# %%
from typing import TypeVar, Generic, Optional

from wayflowcore.models import LlmModel
from wayflowcore._utils.hash import fast_stable_hash
from wayflowcore._utils._templating_helpers import render_template


T = TypeVar("T")


class MessageTransformCache(Generic[T]):
    def __init__(self) -> None:
        self.state: Dict[str, T] = {}

    def add(self, key: str, value: T) -> None:
        self.state[key] = value

    def remove(self, key: str) -> None:
        raise NotImplementedError()

    def get(self, key: str) -> Optional[T]:
        return self.state.get(key, None)

class SummarizationMessageTransform(MessageTransform):
    """
    Stateful message transform that summarizes the list of messages when it becomes too long.
    Preserves consistency of tool calls/results.
    """

    def __init__(
        self,
        llm: LlmModel,
        max_num_messages: int = 20,
        min_num_messages: int = 5,
        summarization_instruction: str = "Please make a summary of the previous messages. Include relevant information and keep it short. Your response will replace the messages, so just output the summary directly, no introduction needed.",
        summarized_message: str = "Summarized conversation: {{summary}}",
        _cache_implementation: type = MessageTransformCache,
    ):
        """
        Parameters
        ----------
        llm:
            LLM to use for the summarization.
        max_num_messages:
            Number of messages after which summarization is triggered. Tune this parameter depending on the
            context length of your model and the price you are willing to pay (higher means longer conversation
            prompts and more tokens).
        min_num_messages:
            Number of recent messages to exclude from summarization. Tune this parameter to avoid summarizing
            very recent messages and keep the agent responsive and relevant.
        summarization_instruction:
            Instruction for the LLM on how to summarize the previous messages.
        summarized_message:
            Jinja2 template describing how to present the summary (with variable `summary`) to the agent using the transform.

        Examples
        --------
        >>> summarization_transform = SummarizationMessageTransform(
        ...     llm=llm,
        ...     # if the conversation reaches 30 messages, it will trigger summarization
        ...     max_num_messages=30,
        ...     # when summarization is triggered, it will summarize all but the last 10 messages
        ...     min_num_messages=10,
        ... )

        """
        super().__init__()
        self.llm = llm
        self.summarization_instruction = summarization_instruction
        self.summarized_message = summarized_message
        self.max_num_messages = max_num_messages
        self.min_num_messages = min_num_messages

        self.internal_cache = _cache_implementation["Message"]()  # type: ignore
        # a cached message means all the messages before this message have been summarized

    def _partition_cached_and_new_messages(self, messages: List["Message"]) -> List["Message"]:
        messages_hashes = []
        for idx, msg in enumerate(messages):
            messages_hashes.append(msg.hash)
            curr_hash = fast_stable_hash(messages_hashes)
            found_msg = self.internal_cache.get(curr_hash)
            if found_msg is not None:
                # a prefix of the conversation was already summarized; replace it with the summary
                return self._partition_cached_and_new_messages([found_msg] + messages[idx + 1 :])
        return messages

    async def call_async(self, messages: List["Message"]) -> List["Message"]:
        formatted_messages = self._partition_cached_and_new_messages(messages)

        if len(messages) <= self.max_num_messages:
            return formatted_messages

        messages_to_summarize, messages_to_keep = _split_messages_and_guarantee_tool_calling_consistency(
            messages=formatted_messages,
            keep_x_most_recent_messages=self.min_num_messages,
        )
        summarized_message = await self._summarize(messages_to_summarize)

        summarized_hash = fast_stable_hash([msg.hash for msg in messages_to_summarize])
        self.internal_cache.add(summarized_hash, summarized_message)

        return [summarized_message] + messages_to_keep

    async def _summarize(self, messages: List[Message]) -> Message:
        chat_history = messages + [Message(role="user", content=self.summarization_instruction)]
        prompt = await self.llm.chat_template.format_async(
            inputs={
                self.llm.chat_template.CHAT_HISTORY_PLACEHOLDER_NAME: chat_history,
            }
        )
        completion = await self.llm.generate_async(prompt=prompt)
        summary = completion.message.content
        return Message(
            content=render_template(template=self.summarized_message, inputs=dict(summary=summary)),
            role="user",
        )


# %%[markdown]
## Summarize Old Message Transform Run

# %%
transform = SummarizationMessageTransform(llm=llm)
agent = Agent(
    llm=llm,
    tools=[read_logs_tool],
    agent_template=llm.agent_template.with_additional_pre_rendering_transform(
        transform, append=False
    ),
)

conversation = agent.start_conversation()


LONG_CONVERSATION = [
    {"role": "user", "content": "Hi! Can you tell me something interesting about dolphins?"},
    {
        "role": "assistant",
        "content": "Absolutely! Dolphins are fascinating creatures, famous for their intelligence and complex behavior. For example, they have been observed using tools, such as covering their snouts with sponges to protect themselves while foraging on the seafloor. Dolphins also display strong social bonds and have been known to help injured individuals within their pods. Communication among dolphins is advanced; they use a series of clicks, whistles, and body movements to convey information, and some species even have distinctive signature whistles that function like names.",
    },
    {"role": "user", "content": "Wow, I didn’t know that. Do dolphins have good memory?"},
    {
        "role": "assistant",
        "content": "Dolphins possess remarkable memories, particularly when it comes to their social groups and vocal communication. Researchers have discovered that dolphins can remember the unique signature whistles of other dolphins for over 20 years, which is the longest social memory recorded in non-human animals. This ability highlights their sophisticated cognitive abilities and the importance of long-term relationships in dolphin societies. Memory also plays a crucial role in their navigation and hunting skills, as dolphins migrate and follow paths in the oceans over great distances. In addition, their keen memory supports learning from one another, enhancing the social structure of their pods.",
    },
    {
        "role": "user",
        "content": "That’s impressive. Are there other animals with similar intelligence?",
    },
    {
        "role": "assistant",
        "content": "Indeed, there are several other animals that exhibit forms of intelligence comparable to dolphins. Great apes, such as chimpanzees and orangutans, have been observed making and using tools, solving puzzles, and displaying empathy and self-awareness. Elephants show emotional intelligence and have complex social relationships, including mourning their dead and helping injured companions. Crows and parrots, members of the bird family, demonstrate advanced problem-solving, mimicry, and even the ability to plan for future events. These examples across different branches of the animal kingdom show that intelligence has evolved in diverse and fascinating ways.",
    },
    {"role": "user", "content": "Can elephants recognize themselves in a mirror like dolphins?"},
    {
        "role": "assistant",
        "content": "Yes, elephants are among the elite group of species known to pass the mirror self-recognition test, an experiment generally considered evidence of self-awareness. In these experiments, elephants explore themselves in mirrors and investigate marks on their bodies they otherwise couldn’t see. Notably, this test has only been passed by a few species, including certain great apes, dolphins, magpies, and elephants. Such ability hints at a deep cognitive capacity, suggesting they have a mental representation of themselves, which is foundational for empathy and understanding the experiences of others. This self-recognition is just one of the many amazing aspects of elephant intelligence that continue to be studied.",
    },
    {"role": "user", "content": "What is the mirror test?"},
    {
        "role": "assistant",
        "content": "The mirror test, also known as the mark test, is a behavioral technique designed to assess self-awareness in animals. In the test, an animal is marked with an odorless dye in a place it cannot see without a mirror. The animal is then given access to a mirror, and researchers observe whether it investigates or tries to touch the mark on its own body, indicating recognition of its reflection as itself, not another animal. Passing the mirror test suggests a level of self-concept, which is considered an advanced cognitive trait. However, not all intelligent animals pass the mirror test, and failing it doesn’t necessarily mean an animal lacks self-awareness – it might simply not care about the mark or not rely on visual cues.",
    },
    {
        "role": "user",
        "content": "Besides animals, what’s another field where recognition is important?",
    },
    {
        "role": "assistant",
        "content": "Recognition is vitally important in many human technologies, notably in the field of artificial intelligence. Facial recognition technology is widely used for security purposes, such as unlocking smartphones or monitoring public spaces. In social media, these algorithms help automatically tag friends in photos, organizing vast amounts of visual data. Additionally, recognition systems are applied in retail, for personalized marketing, and in health care, for patient identification and monitoring. However, these powerful tools also raise important questions around privacy and data security, sparking ongoing debates about their regulation and ethical use.",
    },
    {"role": "user", "content": "Is facial recognition always accurate?"},
    {
        "role": "assistant",
        "content": "Facial recognition technology has improved significantly over recent years, but it is not flawless and its accuracy can vary depending on several factors. Lighting, angle, image quality, and the diversity of data used to train the algorithms all influence performance. Notably, many studies have shown that facial recognition systems can have higher error rates when identifying people with darker skin tones or from underrepresented demographics, leading to concerns about bias and discrimination. This has prompted researchers and policymakers to push for better training data, algorithmic transparency, and stricter oversight. As the technology becomes more widespread, ensuring fairness and accuracy is an essential focus for developers and regulators alike.",
    },
    {"role": "user", "content": "What are some other ethical concerns in AI?"},
    {
        "role": "assistant",
        "content": "Ethical challenges in AI are numerous and complex, reflecting the far-reaching impact of these technologies on society. Privacy concerns are paramount, as AI systems often require large amounts of personal data, raising questions about consent and data protection. Bias in algorithms is another major issue, as AI can inadvertently reinforce existing societal prejudices if not carefully monitored and tested. Transparency is crucial, because many AI systems, especially those using deep learning, can behave as 'black boxes'—making decisions in ways that aren’t easily understandable to humans. Additionally, there are fears around job displacement, as automation could replace roles faster than new opportunities are created, necessitating careful consideration and retraining initiatives.",
    },
    {
        "role": "user",
        "content": "Job displacement is interesting. Which jobs are most at risk from AI?",
    },
    {
        "role": "assistant",
        "content": "Jobs that involve repetitive, predictable tasks are currently considered the most vulnerable to automation by AI. For example, data entry clerks, assembly line workers, and certain types of customer service representatives face a high likelihood of being replaced or augmented by machines. Even within fields like transportation, we’re seeing the rise of autonomous vehicles that could impact truck drivers and delivery personnel. On the other hand, roles that require creativity, complex decision-making, emotional intelligence, or physical dexterity in unpredictable environments—such as healthcare providers, artists, and skilled trades—are less susceptible for now. To manage this transition, many experts recommend investing in retraining programs and encouraging adaptability in the workforce.",
    },
    {"role": "user", "content": "On a lighter note, do you know any fun space facts?"},
    {
        "role": "assistant",
        "content": "Absolutely, space offers an endless trove of fascinating discoveries. For example, Venus has a very unusual rotation: a single day on Venus (one rotation on its axis) is actually longer than its year (one complete orbit around the Sun) due to its slow spin. There are also diamond rains on planets like Neptune and Uranus, caused by extreme pressure turning carbon into solid diamonds. You might be surprised to learn that there are more stars in the observable universe than grains of sand on all the Earth’s beaches combined. Finally, if you were to fall into a black hole, the laws of physics as we understand them break down at the singularity—a phenomenon still not fully explained by scientists.",
    },
    {"role": "user", "content": "That’s fascinating! How hot does it get on Venus?"},
    {
        "role": "assistant",
        "content": "Venus is the hottest planet in our solar system, even outpacing Mercury, which is closer to the Sun. The average surface temperature on Venus can reach an astonishing 465 degrees Celsius, or about 869 degrees Fahrenheit. This is hot enough to melt lead, and it creates an extremely inhospitable environment for life as we know it. The intense heat is largely due to the planet’s dense atmosphere, composed mostly of carbon dioxide, which traps heat through a runaway greenhouse effect. In addition to being scorching, Venus’s atmosphere features thick clouds of sulfuric acid that make its surface perpetually shrouded and acidic.",
    },
    {"role": "user", "content": "Why is Venus hotter than Mercury?"},
    {
        "role": "assistant",
        "content": "Despite being the second planet from the Sun, Venus is hotter than Mercury primarily because of its thick, carbon dioxide-rich atmosphere. This atmosphere acts as a blanket, trapping incoming solar radiation and preventing heat from escaping, leading to the planet's infamous runaway greenhouse effect. Meanwhile, Mercury has only a thin exosphere, which allows heat to escape into space quickly, resulting in dramatic temperature swings between day and night. The conditions on Venus are so extreme that even spacecraft sent to the surface survive only a few hours before being destroyed by the temperature and pressure. In effect, Venus offers a dramatic example of how atmospheric composition can significantly influence planetary climates.",
    },
    {"role": "user", "content": "Are there any missions planned to Venus soon?"},
    {
        "role": "assistant",
        "content": "Yes, the next decade promises to be exciting for Venus exploration, with both NASA and the European Space Agency planning new missions. NASA has announced the VERITAS and DAVINCI+ missions, aiming to study Venus’s geology, atmosphere, and history, with launches expected later this decade. ESA is preparing its own mission called EnVision, which will investigate Venus's surface and atmosphere using advanced radar and spectrometry. These missions hope to resolve key questions, such as whether Venus ever had oceans, what processes led to its intense greenhouse state, and if volcanic activity is still ongoing. Renewed interest in Venus stems from its potential to teach us about planetary evolution, including factors that affect habitability.",
    },
    {"role": "user", "content": "Cool! Back to Earth, what's the largest living structure here?"},
    {
        "role": "assistant",
        "content": "The largest living structure on Earth is the Great Barrier Reef, stretching more than 2,300 kilometers along Australia’s northeastern coast. This massive coral system is so large it can be seen from space, and it comprises thousands of individual reefs and hundreds of islands built by billions of tiny coral polyps. The reef supports an extraordinary diversity of marine life, hosting over 1,500 fish species, turtles, sea snakes, and countless invertebrates. It also plays a crucial ecological role, protecting coastlines from storms and supporting local economies through tourism and fishing. Unfortunately, despite its size and importance, the Great Barrier Reef faces significant threats from climate change, particularly coral bleaching and ocean acidification.",
    },
    {"role": "user", "content": "Is it in danger?"},
    {
        "role": "assistant",
        "content": "Sadly, the Great Barrier Reef is indeed at risk due to multiple environmental pressures. One of the most severe challenges is coral bleaching, which occurs when higher sea temperatures stress corals, causing them to expel the symbiotic algae they rely on for food and color. Repeated bleaching events in recent years have resulted in large-scale loss of coral cover, and the recovery process can be slow or incomplete. Additional threats include poor water quality from agricultural runoff, overfishing, cyclones, and outbreaks of crown-of-thorns starfish, which prey on corals. Conservation efforts are underway, but reducing global carbon emissions is crucial to ensuring the reef’s long-term survival.",
    },
    {"role": "user", "content": "Can coral reefs recover from bleaching?"},
    {
        "role": "assistant",
        "content": "Reef recovery after bleaching is possible, but it depends on the severity of the event and the time between subsequent stress episodes. Corals can regain their symbiotic algae and recover their vibrant colors if water temperatures return to normal quickly enough. However, back-to-back bleaching events—such as those in recent years—inhibit full recovery and can lead to widespread coral death. The resilience of a reef also depends on the health of its local ecosystem, water quality, and the diversity of coral species present. Active restoration efforts, like coral gardening and selective breeding for heat-tolerant corals, are being explored to help boost reef recovery in the face of a warming climate.",
    },
    {"role": "user", "content": "Are there artificial reefs?"},
    {
        "role": "assistant",
        "content": "Yes, artificial reefs have been employed worldwide as a means to enhance marine habitats and promote biodiversity. These structures can be made from a variety of materials, including concrete blocks, purpose-built modules, and even deliberately sunken ships or decommissioned military vehicles. When placed on the ocean floor, artificial reefs provide surfaces for corals to attach to and develop, which in turn attract fish and other marine species seeking food and shelter. Over time, these reefs can become thriving ecosystems, sometimes rivaling their natural counterparts in productivity. Besides ecological benefits, artificial reefs are also used to create tourist attractions for diving and help reduce pressure on natural reefs by diverting fishing and recreational activity.",
    },
    {"role": "user", "content": "Do fish immediately move into new artificial reefs?"},
    {
        "role": "assistant",
        "content": "Fish and other marine organisms often begin exploring and colonizing new artificial reefs quite quickly—sometimes within just a few weeks of their deployment. The structures offer instant shelter from predators and new feeding opportunities, which attract small fish and invertebrates first. Over several months, larger predatory species may start appearing as the new reef becomes established as a hunting and breeding ground. The colonization process is further accelerated if the surrounding water already hosts healthy marine life populations. Eventually, the artificial reef can develop complex food webs, supporting a community of organisms similar to that of natural reefs, though full ecosystem development takes several years.",
    },
    {
        "role": "user",
        "content": "I love learning these facts. Can you tell me one more surprising animal fact?",
    },
    {
        "role": "assistant",
        "content": "Of course! One particularly astonishing animal fact concerns the octopus: these remarkable creatures possess not one, but three hearts. Two of the hearts are responsible for pumping blood through the gills, where it picks up oxygen, while the third pumps oxygenated blood throughout the rest of the body. Octopuses also have blue blood, a consequence of their use of a copper-rich protein called hemocyanin instead of hemoglobin, which is based on iron and gives human blood its red color. The blue blood is more efficient at transporting oxygen in the cold, low-oxygen conditions of the ocean, making it especially advantageous for the octopus’s varied habitats. In addition to their unique circulatory system, octopuses are also known for their high intelligence, problem-solving abilities, and incredible camouflage skills.",
    },
    {"role": "user", "content": "That’s wild. Why do they have blue blood?"},
    {
        "role": "assistant",
        "content": "The blue color of octopus blood is due to the presence of hemocyanin, a copper-based molecule that binds with oxygen for transport throughout the body. This adaptation is particularly effective in the cold, low-oxygen environments found in many parts of the oceans where octopuses live. Hemocyanin is more efficient than hemoglobin (the iron-based molecule in red blood) under these conditions, which helps octopuses thrive in diverse and sometimes extreme marine settings. The evolution of blue blood is an example of the many unique physiological characteristics that have enabled cephalopods to become such successful marine invertebrates. It’s remarkable how life develops specialized solutions to meet the challenges of different environments.",
    },
]

for msg in LONG_CONVERSATION:
    conversation.append_message(
        Message(content=msg["content"], role=msg["role"])  # type: ignore
    )

conversation.append_user_message("What tool do dolphins sometimes use when foraging on the seafloor, and why?")
conversation.execute()