Prompts are essential for interacting effectively with Large Language Models (LLMs). They are the primary means of communicating with LLMs, guiding them toward the desired responses. Prompts often require iterative refinement to get the best performance from a model.
In kis.ai, prompts are managed through YAML files in a product, located in the folder ai/prompts. Because these files are part of the product's git repository, changes to them can be tracked over time. Prompts can be categorized, tagged, and searched for quick retrieval. kis.ai also provides analytics and feedback mechanisms to help developers refine their prompts, ensuring high-quality responses from the LLMs.
Simple Prompts
Developers can define prompts in a structured manner, across multiple folders and YAML files, to match the complexity of the product and its context. Each prompt has a name and the natural-language text to be sent to the LLM. Prompts are essentially string templates, written in Liquid, allowing developers to dynamically adapt them to the information passed in by end-users.
For most models, a system-prompt and a user-prompt are available and provide the means to tell the LLM what is expected. The {{ parameter }} notation of Liquid templates can be used to demarcate the dynamic sections of the prompt.
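As an illustrative sketch, a prompt file for an email-template-gen prompt could look like the following. The field names (name, system-prompt, user-prompt) and the parameter names are assumptions for illustration, not kis.ai's exact schema:

```yaml
# ai/prompts/email-template-gen.yaml -- illustrative sketch; field and
# parameter names are assumptions, not the exact kis.ai schema
name: email-template-gen
system-prompt: |
  You are an assistant that writes professional email templates.
user-prompt: |
  Write an email template for {{ purpose }} addressed to {{ audience }},
  using a {{ tone }} tone.
```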
In the above example, email-template-gen's user-prompt will be rendered by substituting the {{ parameter }} placeholders with the values from the JSON payload sent through the AI Gateway or AI Flow.
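For instance, a payload along these lines (the parameter names are illustrative assumptions) would fill the placeholders in the user-prompt before it is sent to the model:

```json
{
  "purpose": "welcoming a new customer",
  "audience": "first-time subscribers",
  "tone": "friendly"
}
```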
Few-shot Prompts
Few-shot prompting is a technique in natural language processing where a language model is provided with a small number of examples (typically a few) to understand the context or task it needs to perform. Unlike zero-shot prompting, which relies on the model’s pre-existing knowledge without examples, few-shot prompting gives the model specific examples to guide its responses, improving accuracy and relevance. This approach helps the model generalize from the examples to generate appropriate outputs for similar tasks.
With plain text
Question & Answer example
When the user sends a question, the model answers following the pattern established by the examples. The rendered prompt is sent as a JSON payload to the AI Gateway.
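A sketch of how such a few-shot Q&A prompt might be written (the schema and field names are assumptions):

```yaml
# Illustrative few-shot Q&A prompt; the schema is an assumption
name: capital-qa
system-prompt: |
  Answer questions concisely, following the examples.
user-prompt: |
  Q: What is the capital of France?
  A: Paris
  Q: What is the capital of Japan?
  A: Tokyo
  Q: {{ question }}
  A:
```

A corresponding payload could then be as simple as:

```json
{ "question": "What is the capital of Italy?" }
```

Guided by the two examples, the model would be expected to answer in the same single-word style.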
Text generation example
When the user sends a prompt, the model produces text whose style and structure follow the provided examples.
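A hedged sketch of a few-shot text-generation prompt (the prompt name, schema, and examples are assumptions for illustration):

```yaml
# Illustrative few-shot text-generation prompt; schema is an assumption
name: product-tagline-gen
system-prompt: |
  Generate a short marketing tagline, imitating the style of the examples.
user-prompt: |
  Product: running shoes -> Tagline: Run farther, feel lighter.
  Product: coffee maker -> Tagline: Wake up to perfection.
  Product: {{ product }} -> Tagline:
```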
With inline code
Generate YAML with inline samples
This is an example prompt for generating structured YAML as output.
When the user sends a prompt, the model returns a YAML response that follows the format of the inline samples.
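One way such a prompt could be structured is to embed a sample of the desired YAML directly in the user-prompt. This is an illustrative sketch; the prompt name, schema, and sample format are assumptions:

```yaml
# Illustrative prompt embedding an inline YAML sample; schema is an assumption
name: task-yaml-gen
system-prompt: |
  You produce task definitions as valid YAML only, following the sample format.
user-prompt: |
  Sample output format:

  task:
    name: send-report
    schedule: "0 9 * * 1"
    steps:
      - fetch-data
      - render-pdf
      - email-result

  Now generate a YAML task definition for: {{ task_description }}
```

Embedding the sample inline keeps the expected structure versioned alongside the prompt itself in the product repository.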
With file inclusion
Generate Vue Code
In this example, the user prompt generates Vue code based on the samples provided through code file inclusions.
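Assuming an include mechanism along the lines of Liquid's {% include %} tag (kis.ai's actual file-inclusion syntax may differ), such a prompt could be sketched as follows; the prompt name, file paths, and include syntax are all assumptions:

```yaml
# Illustrative prompt using file inclusion; the include syntax is an assumption
name: vue-component-gen
system-prompt: |
  You generate Vue single-file components in the style of the provided samples.
user-prompt: |
  Here are sample components from our codebase:

  {% include 'samples/BaseButton.vue' %}
  {% include 'samples/BaseCard.vue' %}

  Generate a Vue component that: {{ component_description }}
```

Pulling the samples in from files, rather than pasting them inline, keeps long code examples maintainable and reusable across prompts.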