Summary
"Use AI" is a type of step you can use when configuring Custom Automations
This AI step type allows you to connect to different LLMs, either via your own external connection or via Planhat's managed connections to various models
The AI step takes a model, a connection and a prompt as input
If only one connection is available (a single external connection, or only Planhat's), you don't need to choose a connection
AI steps are only available in Automations in upgraded Planhat (ws.planhat.com)
Who is this article for?
Planhat Users who are building Custom Automations for their organization and would like to leverage AI
Series
This article is part of a series on Automations:
"Use AI" steps in Automations ⬅️ You are here
And a series on AI in Planhat:
"Use AI" steps in Automations ⬅️ You are here
Article content
This is a technical deep-dive article
Read on if you'd like to learn about "Use AI" steps - a type of step you can use when configuring Custom Automations. Ensure you read our article on Custom Automations before this one, so you are familiar with the context in which these AI steps can be used.
If you would simply like a general introduction to Automations, check out our overview article here.
What are AI steps?
The "Use AI" step type allows you to integrate Large Language Models (LLMs) directly into your Custom Automations. Whether you're looking to analyze customer data, summarize content, or generate dynamic responses, the AI step gives you the flexibility to leverage AI where you need it most.
You can connect to models either through Planhat's managed connections to LLMs (OpenAI's GPT, Anthropic's Claude, or Google's Gemini), or to any LLM you prefer using your own external connections. Read about this in our article on setting up AI connections.
📌 Important to note
When using Planhat's managed LLM connections, usage is billed in credits. Read the article about AI credits here.
Why use AI steps?
LLMs in Automations can be used to generate written content for a range of use cases:
Prepare for a QBR by summarizing the state of the account, including any opportunities, objectives, open issues or support tickets, and combine with the notes from the last QBR
Rank Companies in your portfolio by analyzing how they fit your ICP (Ideal Customer Profile)
Summarize all open issues, support tickets and conversations to identify friction points for a customer
Define your weekly priorities based on your upcoming meetings this week
Run a cross-portfolio issue analysis to identify the most common issues or themes of issues across your portfolio
Identify your at-risk Companies across your entire portfolio, and provide you with a game plan with action points
How to set up AI steps
When adding an AI step to your Custom Automation, you'll configure three key inputs:
"Model" – Choose the LLM you want to use (e.g., GPT-4, Claude 3, etc.)
Note that we are not referring to Planhat models (objects) here!
"Connection" – This will only show if there is more than one connection to the LLM provider. Choose between:
Your own account (API key / integration setup)
Planhat managed connections
"Prompt" – Write the prompt that will be sent to the model. You can insert dynamic references from the Automation context to personalize each response. Read more about dynamic references in our article here.
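For instance, a prompt using dynamic references might look like the sketch below. Note that the `<<...>>` token shown here is a hypothetical placeholder for illustration only; the actual tokens are inserted from your own Automation's context:

```
Summarize the current state of the account <<s-1Ab.companyName>>, including
any open opportunities, objectives and support tickets. Keep the summary
under 200 words and end with three suggested next steps.
```

The more context you reference from the Automation, the more tailored each response will be.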
The output of the AI step can optionally be set to "JSON Output" via the checkbox at the bottom. JSON output is useful when you want to, for example, create data in Planhat, or when you want multiple outputs from the LLM. Define the output format in your prompt – e.g. ask the model to return both a Title and a Summary in JSON format. You can then access each field using dynamic references; for the example below, this would be
<<s-4Pp.Title>>
and <<s-4Pp.Summary>>
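As a sketch of what such output could look like: if your prompt instructs the model to respond with JSON containing a Title and a Summary, the step might return something like the following (the field values here are purely illustrative):

```json
{
  "Title": "QBR preparation summary",
  "Summary": "Two open opportunities, one escalated support ticket, and three action points carried over from the last QBR."
}
```

Each field can then be accessed individually via dynamic references in later steps of the Automation.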
Prompt engineering
Writing good prompts is an important part of using LLMs – the more explicitly each task is defined, the better the response in terms of quality and relevance. Test iteratively!