TextGrad
TextGrad is an automatic differentiation framework, analogous to autograd, that uses Large Language Model (LLM) feedback as "textual gradients" to optimize complex AI systems.
The framework backpropagates rich, natural-language suggestions (textual gradients) from LLMs to refine individual components of a compound AI system, exposing a PyTorch-like API as a familiar abstraction. TextGrad treats these systems as black boxes and optimizes variables ranging from code snippets and prompts to molecular structures. For example, it improved GPT-4o's zero-shot accuracy on the GPQA benchmark from 51% to 55% and delivered a 20% relative performance gain on LeetCode-Hard coding problems. Because users need to supply only the objective function, the system remains flexible and streamlines the optimization of next-generation AI pipelines.
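The loop described above can be sketched in miniature. This is a toy illustration of the textual-gradient idea, not the real TextGrad API: the `critic` and `editor` functions are hypothetical stand-ins for the LLM calls that, in practice, generate feedback and apply it.

```python
# Toy sketch of a textual-gradient optimization step. All names here
# (Variable, critic, editor, backward, step) are illustrative stand-ins,
# not the actual TextGrad API; a stubbed "critic" plays the feedback LLM.

class Variable:
    """A piece of text to optimize, plus accumulated textual feedback."""
    def __init__(self, value, role):
        self.value = value
        self.role = role
        self.gradients = []  # natural-language suggestions, not numbers

def critic(variable, objective):
    """Stand-in for an LLM call that critiques `variable` w.r.t. the objective."""
    return f"To better satisfy '{objective}', make the {variable.role} more specific."

def editor(variable):
    """Stand-in for an LLM call that rewrites the variable using its feedback."""
    return variable.value + " [revised per: " + "; ".join(variable.gradients) + "]"

def backward(variable, objective):
    """Analogue of loss.backward(): attach a textual gradient to the variable."""
    variable.gradients.append(critic(variable, objective))

def step(variable):
    """Analogue of optimizer.step(): apply the feedback, then clear it."""
    variable.value = editor(variable)
    variable.gradients.clear()

# One optimization step on a prompt variable.
prompt = Variable("Answer the question.", role="system prompt")
backward(prompt, "maximize answer accuracy")
step(prompt)
print(prompt.value)
```

In the real framework the two LLM roles (criticizing and rewriting) are handled by a backward engine and an optimizer over `Variable` objects, mirroring the `loss.backward()` / `optimizer.step()` pattern familiar from PyTorch.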