t-prompts


Provenance-preserving prompts for LLMs using Python 3.14's template strings

t-prompts turns Python 3.14+ t-strings into navigable trees that preserve full provenance (expression text, conversions, format specs) while rendering to plain strings. Perfect for building, composing, and auditing LLM prompts.

Requirements: Python 3.14+

Quick Example

from t_prompts import prompt

# Simple prompt with labeled interpolation
instructions = "Always answer politely."
p = prompt(t"Obey {instructions:inst}")

# Renders like an f-string
print(str(p))  # "Obey Always answer politely."

# But preserves provenance
node = p['inst']
print(node.expression)  # "instructions"
print(node.value)       # "Always answer politely."
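
Because rendered prompts keep their tree structure, they can also be composed. Below is a minimal sketch, assuming that interpolating one prompt into another t-string nests it in the tree (an assumption; only prompt(), str(), and label lookup appear in the example above):

from t_prompts import prompt

# Hypothetical composition: nest one prompt inside another t-string.
style = "Answer in one sentence."
rules = prompt(t"Style: {style:style}")

question = "What is provenance?"
p = prompt(t"{rules:rules} Question: {question:q}")

print(str(p))  # "Style: Answer in one sentence. Question: What is provenance?"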

This enables rich tooling:

(Interactive widget demo)

Targeted Use Cases

  • Prompt Debugging: "What exactly did this tangle of code render to?" (see the sketch below)
  • Prompt Optimization (Performance): "What wording / content best achieves my goal?"
  • Prompt Optimization (Size): "How do I get the same result with fewer words?"
  • Prompt Compacting: "The LLM tells me to keep it short; now what do I do?"
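
For the debugging use case, the provenance tree maps each piece of the rendered string back to the expression that produced it. A minimal sketch, using only the prompt(), str(), label-lookup, .expression, and .value API shown in the quick example:

from t_prompts import prompt

role = "You are a terse assistant"
task = "Summarize the input in one line."
p = prompt(t"{role:role}. {task:task}")

print(str(p))  # "You are a terse assistant. Summarize the input in one line."

# Trace each labeled part back to the expression that produced it
for label in ("role", "task"):
    node = p[label]
    print(label, "<-", node.expression, "=", repr(node.value))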

Caveats:

Despite the name, nothing ties this library to LLMs or generative models in particular; it targets the creation of structured, multi-modal prompts in general. (It is more "t" than "prompts".) To use it for an actual LLM call, you need to convert the IR into a model-specific form, though for plain text this can be as simple as str(prompt).
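
For example, with a text-only prompt, the rendered string can go straight into a chat-style API payload. A minimal sketch; the messages structure below is the common chat-completions shape, not something t-prompts provides:

from t_prompts import prompt

task = "Summarize this document in three bullet points."
p = prompt(t"You are a careful summarizer. {task:task}")

# str(p) yields the model-ready text; the surrounding message dict is
# the standard chat-API shape, not part of t-prompts itself.
messages = [{"role": "user", "content": str(p)}]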

Get Started