Sends each record through an LLM prompt to generate new data fields. Use it to classify records, extract insights, estimate values, or enrich records with AI-generated content.

Configuration

  • Integration: The LLM provider (Gemini, OpenAI, or Anthropic)
  • Prompt Template: The instruction sent to the LLM. Use Mentions to inject record values dynamically
  • Data Fields: Additional structured data sent alongside the prompt
  • Result Field Name: The name of the output field that stores the LLM’s response
  • Temperature: Controls response creativity (0 = deterministic, 1 = creative)
  • Model: The specific model to use (e.g., Gemini 2.5 Flash Lite)
  • Limit: Maximum number of requests to process
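Conceptually, these settings combine into one request per record. A minimal sketch of that assembly, assuming the node substitutes Mentions before sending the prompt (the function name, settings keys, and model identifier below are illustrative, not the node’s actual internals):

```python
def build_request(record, settings):
    """Assemble a per-record LLM request from the node's settings (sketch)."""
    prompt = settings["prompt_template"]
    # Mentions like {{Account Name}} are replaced with the record's values.
    for field, value in record.items():
        prompt = prompt.replace("{{" + field + "}}", str(value))
    return {
        "model": settings["model"],
        "temperature": settings["temperature"],
        "prompt": prompt,
        # Data Fields travel alongside the prompt as structured context.
        "data_fields": {f: record[f] for f in settings["data_fields"] if f in record},
    }

settings = {
    "model": "gemini-2.5-flash-lite",  # assumed identifier for illustration
    "temperature": 0.0,
    "prompt_template": "Classify the industry of {{Account Name}}.",
    "data_fields": ["Industry"],
}
record = {"Account Name": "Acme Corp", "Industry": "Manufacturing"}
request = build_request(record, settings)
# request["prompt"] == "Classify the industry of Acme Corp."
```

Note the separation: Mentions are inlined into the prompt text, while Data Fields ride along as structured data the model can reference.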

How It Works

  1. Configure the LLM. Choose a provider, model, and temperature setting.
  2. Write a prompt template. This is the instruction sent to the LLM; use Mentions to inject record values (e.g., {{Account Name}}, {{Industry}}).
  3. Name the result field. Choose a name for the output field where the LLM’s response will be stored.
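The steps above amount to mapping every record through the model and attaching the reply. A self-contained sketch, with a stubbed `llm` callable standing in for the provider request (names are illustrative):

```python
def enrich(records, prompt_template, result_field, llm):
    """Run each record through the LLM and attach the response (sketch)."""
    out = []
    for record in records:
        # Resolve Mentions against this record's fields.
        prompt = prompt_template
        for field, value in record.items():
            prompt = prompt.replace("{{" + field + "}}", str(value))
        enriched = dict(record)              # original fields are preserved
        enriched[result_field] = llm(prompt)  # provider call in the real node
        out.append(enriched)
    return out

# Stub provider: echoes the prompt so the flow is visible without an API key.
records = [{"Account Name": "Acme Corp"}]
result = enrich(records, "Summarize {{Account Name}}.", "summary",
                llm=lambda p: "[LLM] " + p)
# result[0]["summary"] == "[LLM] Summarize Acme Corp."
```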

Output

A new field (named by Result Field Name) is added to each record containing the LLM’s response. This field becomes available as a Mention in all downstream nodes.
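Because the result field is just another field on the record, a later node’s Mentions resolve it exactly like a source field. A small illustrative sketch:

```python
# After enrichment, the record carries the result field alongside the originals.
record = {"Account Name": "Acme Corp", "num_employees": "500"}

# A downstream node's template can mention the enriched field directly.
template = "{{Account Name}} has about {{num_employees}} employees."
text = template
for field, value in record.items():
    text = text.replace("{{" + field + "}}", str(value))
# text == "Acme Corp has about 500 employees."
```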

Example

Enrich Salesforce Accounts with employee count estimates:
  1. Set the prompt to instruct the LLM to return an employee count
  2. Use Mentions to pass Account Name, Industry, and Employees as context
  3. Set the result field to num_employees
  4. The enriched field becomes available as a Mention in downstream nodes
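Under the configuration described above, the example can be sketched end to end. The prompt wording and the stand-in reply below are illustrative; only the overall flow mirrors the example:

```python
import re

prompt_template = (
    "Estimate the total employee count for {{Account Name}}, "
    "a company in the {{Industry}} industry. "
    "Return only an integer."
)
# "Employees" is passed via Data Fields as extra context in the example.
record = {"Account Name": "Acme Corp", "Industry": "Manufacturing", "Employees": 480}

# Mention resolution: fill {{Field}} placeholders from the record.
prompt = re.sub(r"\{\{(.+?)\}\}",
                lambda m: str(record.get(m.group(1), m.group(0))),
                prompt_template)

# A strict "Return only an integer" prompt at temperature 0 keeps the
# reply parseable; "500" is a stand-in for the model's actual response.
reply = "500"
record["num_employees"] = int(reply)
```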

Best Practices

  • Be specific in your prompt template — vague prompts produce inconsistent results
  • Use low temperature (0–0.2) for factual extraction, higher for creative tasks
  • Set a Limit when testing to avoid processing the entire dataset
  • Include relevant context fields in Data Fields to improve LLM accuracy

Related

  • Data Normalization — uses an LLM to clean existing fields rather than generating new ones
  • Web Search — enriches records with live web data instead of LLM-generated content