Utility functions
This section documents utility functions.
prompt_templates.utils
format_for_client
Format OpenAI-style chat messages for different LLM clients.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
messages | List[Dict[str, Any]] | List of message dictionaries in OpenAI format | required |
client | ClientType | The client format to use ('openai', 'anthropic', 'google'). Defaults to 'openai'. | 'openai' |
Returns:

Type | Description |
---|---|
Union[List[Dict[str, Any]], Dict[str, Any]] | Messages formatted for the specified client |
Raises:

Type | Description |
---|---|
ValueError | If an unsupported client format is specified |
TypeError | If messages is not a list of dicts |
Examples:
Format messages for different LLM clients:
>>> messages = [
... {"role": "system", "content": "You are a helpful assistant"},
... {"role": "user", "content": "Hello!"}
... ]
>>> # OpenAI format (default, no change)
>>> openai_messages = format_for_client(messages)
>>> print(openai_messages)
[{'role': 'system', 'content': 'You are a helpful assistant'}, {'role': 'user', 'content': 'Hello!'}]
>>> # Anthropic format
>>> anthropic_messages = format_for_client(messages, "anthropic")
>>> print(anthropic_messages)
{'system': 'You are a helpful assistant', 'messages': [{'role': 'user', 'content': 'Hello!'}]}
>>> # Google (Gemini) format
>>> google_messages = format_for_client(messages, "google")
>>> print(google_messages)
{'system_instruction': 'You are a helpful assistant', 'contents': 'Hello!'}
Source code in prompt_templates/utils.py
format_for_anthropic
Format OpenAI-style messages for the Anthropic client.
Converts OpenAI-style messages to Anthropic's expected format by:

1. Extracting the system message (if any) into a top-level 'system' key
2. Moving all non-system messages into a 'messages' list
Parameters:

Name | Type | Description | Default |
---|---|---|---|
messages | List[Dict[str, Any]] | List of message dictionaries in OpenAI format | required |
Returns:

Type | Description |
---|---|
Dict[str, Any] | Dict with 'system' and 'messages' keys formatted for Anthropic |
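Examples:
A minimal sketch, mirroring the 'anthropic' case shown for format_for_client above:
>>> messages = [
... {"role": "system", "content": "You are a helpful assistant"},
... {"role": "user", "content": "Hello!"}
... ]
>>> format_for_anthropic(messages)
{'system': 'You are a helpful assistant', 'messages': [{'role': 'user', 'content': 'Hello!'}]}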
Source code in prompt_templates/utils.py
format_for_google
Format OpenAI-style messages for the Google GenAI SDK's generate_content method.
Converts OpenAI-style messages to Google Gemini's expected format by:

1. Extracting the system message (if any) into a top-level 'system_instruction' key
2. Moving all non-system messages into a 'contents' list of Part objects (or a single string if there is only one message)
Parameters:

Name | Type | Description | Default |
---|---|---|---|
messages | List[Dict[str, Any]] | List of message dictionaries in OpenAI format | required |
Returns:

Type | Description |
---|---|
Dict[str, Any] | Dict with 'system_instruction' and 'contents' keys formatted for Google Gemini |
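Examples:
A minimal sketch, mirroring the 'google' case shown for format_for_client above; with more than one non-system message, 'contents' would instead be a list of Part objects:
>>> messages = [
... {"role": "system", "content": "You are a helpful assistant"},
... {"role": "user", "content": "Hello!"}
... ]
>>> format_for_google(messages)
{'system_instruction': 'You are a helpful assistant', 'contents': 'Hello!'}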
Source code in prompt_templates/utils.py
create_yaml_handler
Create a YAML handler with the specified configuration. ruamel is the default because it preserves formatting better and defaults to the newer YAML 1.2 specification. PyYAML can also be used; it can be faster and is more widely used.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
library | str | The YAML library to use ("ruamel" or "pyyaml"). Defaults to "ruamel". | 'ruamel' |
Returns:

Type | Description |
---|---|
Union[YAML, Any] | A configured YAML handler |
Raises:

Type | Description |
---|---|
ValueError | If an unsupported YAML library is specified |
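Examples:
Illustrative usage; the returned object differs by library, as the Union return type above indicates:
>>> yaml_handler = create_yaml_handler()          # ruamel (default), YAML 1.2
>>> yaml_handler = create_yaml_handler("pyyaml")  # use pyyaml instead
>>> create_yaml_handler("toml")                   # raises ValueError: unsupported library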
Source code in prompt_templates/utils.py
format_template_content
Recursively format content strings to use YAML literal block scalars. Strings in the YAML file are written with the "|-" indicator, so multiline content behaves like a triple-quoted ("""...""") block in Python, which makes it easier to read and edit.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
node | Any | The prompt template content to format | required |
Returns:

Type | Description |
---|---|
Any | The formatted content with literal block scalars for multiline strings |
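Examples:
An illustrative sketch of the intended effect; the exact scalar type used to wrap multiline strings is an implementation detail, but when the result is dumped with the YAML handler, multiline strings use the "|-" literal block style:
>>> template = {"prompt": "You are a helpful assistant.\nAnswer concisely."}
>>> formatted = format_template_content(template)
>>> # Dumped YAML then looks like:
>>> # prompt: |-
>>> #   You are a helpful assistant.
>>> #   Answer concisely.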