Description
When invoking the agent with only `message_history`, the latest `ModelRequest` gets popped off the history, but currently doesn't get passed to `RunContext.prompt`. This means the message isn't really accessible from the `RunContext`:
```python
from pydantic_ai import Agent, RunContext
from pydantic_ai.messages import ModelRequest, UserPromptPart

agent = Agent(
    "openai:gpt-4o",
    instructions="You are a helpful assistant",
)

@agent.instructions
def guardrail_instructions(ctx: RunContext[str]) -> str:
    print(ctx.messages)  # []
    print(ctx.prompt)  # None
    result = check_guardrail(ctx.prompt)  # user-defined guardrail check (not shown)
    if result.warn:
        return f"Proceed with caution! Input triggered guardrail due to {result.reason}"
    return ""

result = agent.run_sync(
    message_history=[ModelRequest(parts=[UserPromptPart(content="Hello!")])],
)
print(result.output)
```
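For context, one possible workaround today (assuming the standard `run_sync` signature, which accepts the user prompt positionally alongside `message_history`) is to pass the latest user message as the prompt rather than appending it to the history, so that `ctx.prompt` is populated inside instructions functions:

```python
# Workaround sketch: pass the newest user message as the positional prompt so that
# RunContext.prompt is set; only *earlier* turns go into message_history.
result = agent.run_sync(
    "Hello!",
    message_history=[],  # prior ModelRequest/ModelResponse turns, if any
)
print(result.output)
```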
Note: It seems like `prompt` is just the text content of the `ModelRequest`... It feels like the actual `ModelRequest` itself should be available somewhere in the context, since there are cases where the relevant part could be a `ToolReturnPart | RetryPromptPart` instead of a `UserPromptPart`.
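To illustrate why the full request matters, here is a hypothetical sketch (pydantic-ai does not currently expose the popped `ModelRequest` on `RunContext`, and the `describe_latest_request` helper below is made up for illustration). If the context did carry the latest `ModelRequest`, an instructions function could branch on the part type instead of relying on the text-only `prompt`:

```python
from pydantic_ai.messages import (
    ModelRequest,
    RetryPromptPart,
    ToolReturnPart,
    UserPromptPart,
)

def describe_latest_request(request: ModelRequest) -> str:
    """Illustrative helper: summarise whichever part kind the latest request carries."""
    for part in request.parts:
        if isinstance(part, UserPromptPart):
            return f"user prompt: {part.content}"
        if isinstance(part, ToolReturnPart):
            return f"tool return from {part.tool_name!r}"
        if isinstance(part, RetryPromptPart):
            return "retry prompt (model asked to try again)"
    return "request with no recognised parts"
```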
References
https://pydanticlogfire.slack.com/archives/C083V7PMHHA/p1757561116135159