## Agent Integration
The default `Agent` already includes `SessionExtension`, so the methods below are available out of the box.
### Attach / Detach
```python
agent.attach_session(session)
agent.detach_session()
```

### Convenience helpers
```python
agent.enable_session_lite(chars=12000, messages=8)
agent.enable_session_memo(chars=6000, messages=12)
agent.enable_quick_session()
agent.disable_quick_session()
```

### chat_history is auto-proxied
When a Session is attached, these methods write into the Session:

- `set_chat_history(...)`
- `add_chat_history(...)`
- `reset_chat_history()`

Before each request, `session.current_chat_history` is injected into the prompt.
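A minimal sketch of the proxied calls. The message dicts follow the `role` / `content` shape used elsewhere on this page; whether `add_chat_history` accepts a single dict or only a list is an assumption here:

```python
# With a Session attached, these calls write into the Session, not local agent state.
agent.set_chat_history([
    {"role": "user", "content": "My name is Ada."},
    {"role": "assistant", "content": "Nice to meet you, Ada."},
])
agent.add_chat_history({"role": "user", "content": "Please keep answers short."})

# On the next request, session.current_chat_history is injected automatically.
print(agent.input("What's my name?").get_text())

# Clear the proxied history when starting over.
agent.reset_chat_history()
```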
### Record control
```python
agent.settings.set("session.record.input.paths", ["input", ["input", "user.id"]])
agent.settings.set("session.record.input.mode", "all")
agent.settings.set("session.record.output.paths", ["reply", "data.result"])
agent.settings.set("session.record.output.mode", "first")
```

### Custom record handler
```python
def my_record_handler(result):
    return [
        {"role": "user", "content": "custom input"},
        {"role": "assistant", "content": "custom output"},
    ]

agent.set_record_handler(my_record_handler)
```

### Practical scenario: trigger different summaries after multi-turn chat
1) Full setup + auto trigger (lite / deep)
```python
from agently import Agently
from agently.core import Session

agent = Agently.create_agent()
session = Session(agent=agent).configure(
    mode="memo",
    limit={"chars": 6000, "messages": 12},
    every_n_turns=4,
)
agent.attach_session(session)
```

Default policy order (sketched below):
- over char limit → deep
- over message limit → lite
- every_n_turns → lite
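A minimal sketch of that ordering, assuming the checks run after each recorded turn. The function and parameter names are illustrative, not the library's internals:

```python
# Illustrative only: the order in which triggers are evaluated.
def pick_resize_mode(chars_used, message_count, turn_index, limit, every_n_turns):
    if chars_used > limit["chars"]:          # over char limit -> deep
        return "deep"
    if message_count > limit["messages"]:    # over message limit -> lite
        return "lite"
    if every_n_turns and turn_index % every_n_turns == 0:  # every_n_turns -> lite
        return "lite"
    return None  # no resize this turn
```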
2) Manual trigger
```python
agent.session.resize(force="deep")
agent.session.resize(force="lite")
```

3) Different handlers for lite vs deep
```python
def lite_resize_handler(full_history, current_history, memo, settings):
    return full_history, current_history[-6:], memo

def deep_resize_handler(full_history, current_history, memo, settings):
    return full_history, current_history[-4:], memo

session.set_resize_handlers("lite", lite_resize_handler)
session.set_resize_handlers("deep", deep_resize_handler)
```

### How summaries flow back into the next turn (long-term memory)
Key points:

- `current_chat_history` is injected automatically
- `memo` is not injected automatically; you must add it yourself
Option A: inject memo before every request
```python
async def inject_memo(prompt, settings):
    memo = agent.session.memo
    if memo:
        prompt.set("info.memo", memo)

agent.extension_handlers.append("request_prefixes", inject_memo)
```

Option B: manual injection
```python
agent.info({"memo": agent.session.memo})
agent.input("continue...").get_text()
```

This creates the loop (see the end-to-end sketch below):
- a resize updates `memo`
- the next request injects `memo` into the prompt
- the model keeps stable preferences and facts across turns
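A minimal end-to-end sketch of that loop, combining the pieces above (Option A for injection; the user messages are placeholders):

```python
from agently import Agently
from agently.core import Session

agent = Agently.create_agent()
session = Session(agent=agent).configure(
    mode="memo",
    limit={"chars": 6000, "messages": 12},
    every_n_turns=4,
)
agent.attach_session(session)

# Option A from above: push the current memo into every request.
async def inject_memo(prompt, settings):
    memo = agent.session.memo
    if memo:
        prompt.set("info.memo", memo)

agent.extension_handlers.append("request_prefixes", inject_memo)

# Multi-turn chat: each turn is recorded into the Session; once a trigger
# fires, a resize updates the memo, which inject_memo feeds back into the
# next request.
for question in [
    "My name is Ada and I prefer short answers.",
    "Summarize what we agreed on so far.",
    "What's my name?",
]:
    print(agent.input(question).get_text())

# Optionally force a deep summary at the end of a long conversation.
agent.session.resize(force="deep")
```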