Prompt Step
Create Prompt Chains, preprocess data, and leverage LLMs to their fullest
This is the step where you feed data to the AI provider of your choice and get a response. You can use any of your existing Prompts, or just create a new one.
When your prompt references variable data (i.e. it contains tags like @tag), use the Input menu to feed data into each variable. Here you are telling Easybeam how to fill in each of these tags in your prompt.
There are four types of data that can be fed in:
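Conceptually, the Input menu performs a tag-for-value substitution: each @tag in the prompt is replaced with the value you wired to it. Here is a minimal sketch in Python, assuming a simple @word tag syntax; the `fill_prompt` helper is purely illustrative and not part of Easybeam:

```python
import re

def fill_prompt(template: str, inputs: dict) -> str:
    """Replace each @tag in the template with its configured value."""
    def lookup(match: re.Match) -> str:
        tag = match.group(1)
        if tag not in inputs:
            # Every tag in the prompt must have an input configured
            raise KeyError(f"No input configured for @{tag}")
        return str(inputs[tag])
    return re.sub(r"@(\w+)", lookup, template)

filled = fill_prompt(
    "Summarize the weather in @city for @name.",
    {"city": "Berlin", "name": "Alex"},
)
# filled == "Summarize the weather in Berlin for Alex."
```

The actual tag syntax and substitution rules are handled by Easybeam itself; this sketch only illustrates the idea of mapping each tag to a configured value.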
Output
This is the output of a previous step; it's how you reference data from an API step or build prompt chains.
Text
This is static text, useful when you want to fill in a value that stays the same every time the prompt is called.
Variable
These are the variables you receive from your user, as defined in the Start step.
Conversational
This is data you specify in the Conversation Data cards of an Action step, and it is only available within a branch of that Action step. An example of conversational data is "location": your end user would need to mention their location before taking this action, and any step off the Action that defined it could then select that value. See the documentation for the Action step for more information.
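The four input types above can be thought of as a dispatch over where a variable's value comes from. Here is a minimal Python sketch of that idea; all identifiers (`resolve_input`, the dictionary shapes, the type names) are hypothetical and do not come from the Easybeam API:

```python
def resolve_input(source: dict, step_outputs: dict,
                  start_variables: dict, conversation_data: dict) -> str:
    """Resolve a prompt variable's value from one of the four input types."""
    kind = source["type"]
    if kind == "output":          # value produced by a previous step
        return step_outputs[source["step"]]
    if kind == "text":            # static text, identical on every call
        return source["value"]
    if kind == "variable":        # user-supplied, defined in the Start step
        return start_variables[source["name"]]
    if kind == "conversational":  # gathered in an Action step's branch
        return conversation_data[source["name"]]
    raise ValueError(f"Unknown input type: {kind}")

value = resolve_input(
    {"type": "variable", "name": "location"},
    step_outputs={},
    start_variables={"location": "Copenhagen"},
    conversation_data={},
)
# value == "Copenhagen"
```

The point of the sketch is simply that each input type reads from a different pool of data: previous step outputs, static configuration, Start-step variables, or conversational data collected in an Action step.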