It's great work! I really appreciate it.
Here is my feedback, and some things I'd like to discuss further.
In general, I am not sure we can simply split an atomic agent into parts like tool, context manager, and prompt.
If we take the context manager as the core, where we orchestrate our business logic, and the logic is formed as a DAG, then maybe there are 4 boundaries:
- North: the human interface, i.e. how the agent interacts with humans: a UI, or just a bot that captures your events (e.g. `@Depbot rebase`).
- South: in most cases an agent invokes a RESTful API (OpenAI SDK, Google SDK, etc.), which means that, as part of factor 5, execution status can be surfaced here.
- West: I suppose memory management happens here; we decide what information is cached and how it is used.
- East: the core interaction with tools (whether directly invoking a system function, a function call, MCP, or A2A).
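To make the four boundaries concrete, here is a minimal Python sketch of an agent loop wired to the four directions. All class and method names are my own hypothetical illustration, not something from this repo:

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    """Hypothetical context-manager core with four boundaries:
    North = human interface, South = LLM API, East = tools, West = memory."""
    memory: list = field(default_factory=list)          # West

    def handle(self, user_event: str) -> str:           # North: entry point
        self.memory.append(("user", user_event))        # West: cache context
        plan = self.call_llm(user_event)                # South: LLM API
        if plan.startswith("tool:"):
            result = self.call_tool(plan[5:])           # East: tool invocation
            plan = self.call_llm(result)                # South again
        self.memory.append(("agent", plan))             # West
        return plan                                     # North: reply to human

    def call_llm(self, prompt: str) -> str:
        # Stand-in for a real SDK call (OpenAI, Google, ...).
        return f"echo: {prompt}"

    def call_tool(self, name: str) -> str:
        # Stand-in for a function call / MCP / A2A invocation.
        return f"ran {name}"
```

The point of the sketch is that the core only orchestrates; every side effect is pushed out through one of the four boundaries.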
The business logic: when a human uses this agent, we start from North, go to South, and come back.
On this journey we may need to move East for tooling support, and move West for historical context.
The visa you need here is a structured data format, since we travel in a digital way.
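Such a "visa" could be a small structured record that travels with the request through all four boundaries. The field names below are purely my assumption, just to illustrate the idea:

```python
from dataclasses import dataclass, field

@dataclass
class Visa:
    """Hypothetical structured state carried from North to South and back."""
    question: str                                        # original NLP question (North)
    llm_calls: list = field(default_factory=list)        # South: execution status (factor 5)
    tool_results: dict = field(default_factory=dict)     # East: tool outputs
    history_keys: list = field(default_factory=list)     # West: which memories were used

# Usage: the visa accumulates state as the request moves around.
visa = Visa(question="rebase my PR")
visa.tool_results["rebase"] = "ok"
```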
The map I want to share is a 4-step strategy for running an agent autonomously:
- Start from an NLP question: we need to know the scope, and under what conditions the LLM can answer the specific business question.
- Go from the NLP question to structured output: our visa.
- This step may be a huge one: build the DAG and feed control back to the human (ask the human Y/N at each step to confirm).
- Summarize it into a workflow and let it go (run automatically).
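The four steps above could be sketched as a pipeline that builds a DAG of steps, asks the human Y/N on each one, and then runs the confirmed workflow unattended. Everything here (function names, callbacks) is a hypothetical illustration:

```python
def run_strategy(question, in_scope, to_structured, build_dag, confirm, execute):
    # Step 1: check the scope / conditions under which the LLM can answer.
    if not in_scope(question):
        return None
    # Step 2: NLP question -> structured output (the "visa").
    visa = to_structured(question)
    # Step 3: build the DAG and get human Y/N feedback on each step.
    dag = [step for step in build_dag(visa) if confirm(step)]
    # Step 4: summarize into a workflow and let it run automatically.
    return [execute(step) for step in dag]

# Tiny usage example with stub callbacks:
result = run_strategy(
    "rebase my PR",
    in_scope=lambda q: True,
    to_structured=lambda q: {"intent": q},
    build_dag=lambda v: ["fetch", "rebase", "push"],
    confirm=lambda step: step != "push",   # the human says N to "push"
    execute=lambda step: f"{step}: done",
)
```

In a real system `confirm` would block on an actual human prompt; once every step has been confirmed once, the saved workflow can be replayed without asking again.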