Understanding Prompts

Introduction to Prompts

In the world of Large Language Models (LLMs), a prompt is typically a string of text that provides instructions or context to the AI. However, at Mirai, we've expanded this concept to make prompts more powerful and flexible.

The Evolution of Prompts

Basic Prompts

At its simplest, a prompt might look like this:

What is the Answer to the Ultimate Question of Life, The Universe, and Everything?

Structured Prompts

In more advanced applications, prompts often include different roles:

[
  {
    "role": "system",
    "content": "You are a highly advanced and immensely powerful supercomputer."
  },
  {
    "role": "user",
    "content": "What is the Answer to the Ultimate Question of Life, The Universe, and Everything?"
  }
]
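
For illustration only, a role-based message list like this is what chat-style APIs consume. A minimal sketch, assuming the OpenAI Python SDK and a chat-capable model name such as gpt-4o (neither is part of Mirai):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model name; any chat-capable model works
    messages=[
        {"role": "system", "content": "You are a highly advanced and immensely powerful supercomputer."},
        {"role": "user", "content": "What is the Answer to the Ultimate Question of Life, The Universe, and Everything?"},
    ],
)
print(response.choices[0].message.content)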

Dynamic Prompts

Real-world applications often need to incorporate variable data:

request = 'Google'
user_message = f'What is {request}?'
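
A variable like user_message is then typically injected into the structured format shown earlier. A purely illustrative sketch (the system prompt here is hypothetical):

messages = [
    {"role": "system", "content": "You are a concise encyclopedia."},  # hypothetical system prompt
    {"role": "user", "content": user_message}  # variable data injected at runtime
]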

Challenges with Traditional Prompts

  1. Code Integration: Prompts become part of the application code, making them difficult to modify.

  2. Template Limitations: Simple templates can help, but they struggle with complex scenarios involving conditions or loops (see the sketch after this list).

  3. Data Pipeline Complexity: Prompts may depend on complex computations or other prompts, requiring additional processing.
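
As a purely hypothetical illustration of point 2, rendering a dialog with a templating engine such as Jinja2 already requires a loop and a condition, and that logic ends up living in the application rather than with the prompt (all names and data below are made up):

from jinja2 import Template

# Hypothetical dialog data; the field names are illustrative only.
dialog = [
    {"author": "Arthur", "text": "Is there any tea on this spaceship?"},
    {"author": "Marvin", "text": ""},
    {"author": "Ford", "text": "Try the Nutri-Matic."},
]

# Even a modest prompt needs a loop and a condition once the data is dynamic.
template = Template(
    "Summarize the following dialog.\n"
    "{% for message in dialog %}"
    "{% if message.text %}{{ message.author }}: {{ message.text }}\n{% endif %}"
    "{% endfor %}"
)

prompt_text = template.render(dialog=dialog)
print(prompt_text)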

Existing Solutions and Their Limitations

Tools like LangChain attempt to address these issues:

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

llm = ChatOpenAI()
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a highly advanced and immensely powerful supercomputer."),
    ("user", "{input}")
])
output_parser = StrOutputParser()
chain = prompt | llm | output_parser
chain.invoke({"input": "What is the Answer to the Ultimate Question of Life, The Universe, and Everything?"})

However, these solutions often limit flexibility and still require significant application-side logic.

The Mirai Approach: Redefining Prompts

At Mirai, we've redefined prompts to achieve greater flexibility and completeness. In our system, a prompt is essentially a program that interacts with LLMs.

Components of a Mirai Prompt

  1. Input Data Format Description: A JSON schema that defines the structure of input data.

  2. Processing Logic: A Python script that processes the input data and interacts with the LLM.

Example of a Mirai Prompt

Input Data Format:

{
  "type": "object",
  "properties": {
    "messages": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "text": { "type": "string" },
          "user_identifier": { "type": "string" }
        },
        "required": ["text", "user_identifier"]
      }
    },
    "users": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "identifier": { "type": "string" },
          "full_name": { "type": "string" },
          "username": { "type": "string" }
        },
        "required": ["identifier", "full_name"]
      }
    }
  },
  "required": ["messages", "users"]
}
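
For illustration, a hypothetical input document that satisfies this schema could look like this (all values are made up):

{
  "messages": [
    { "text": "Has anyone tried the new release yet?", "user_identifier": "u1" },
    { "text": "Yes, the upgrade went smoothly for us.", "user_identifier": "u2" }
  ],
  "users": [
    { "identifier": "u1", "full_name": "Arthur Dent", "username": "arthur" },
    { "identifier": "u2", "full_name": "Ford Prefect" }
  ]
}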

Processing Logic:

key_topics = 'topics'
key_actual = 'actual'

def validate_json(choice) -> str:
    # Validation sketch (assumed implementation): check that the model's
    # reply is valid JSON containing the expected fields before accepting it.
    import json
    parsed = json.loads(choice)
    if key_topics not in parsed or key_actual not in parsed:
        raise ValueError('LLM response is missing required fields')
    return choice

# Register the custom filter and apply it after the built-in ones.
context.filters.register('validate_json', validate_json)
context.filters.use([
    'default',
    'lowercase',
    'validate_json',
])

# `lines` is assumed to be the dialog from context.input rendered as text,
# one line per message; its construction is omitted in this example.
system_lines = ['Dialog start.'] + lines + ['Dialog end.']

user_lines = [
    'Please provide a comprehensive list of topics discussed in the dialog.',
    f'Return a valid JSON with fields: \'{key_topics}\' (list of suggested topics) and \'{key_actual}\' (most relevant current topic).',
    'If all participants have shared opinions on the current topic, suggest a new, relevant topic based on the conversation context.',
    'Ensure suggested topics maintain relevance and continuity in the dialog.'
]

# Build the final message list and run it against the LLM.
messages = context.functions.build_messages(system_lines, user_lines)
context.functions.execute_with_messages(context.input, messages)

Benefits of the Mirai Approach

  1. Flexibility: You can include complex logic and custom operations within the prompt itself.

  2. Completeness: All prompt-related logic is contained in one place, separate from the main application code.

  3. Ease of Editing: Prompts can be modified without changing the underlying application.

  4. Powerful Processing: You can perform complex data manipulations and multiple LLM calls within a single prompt.

By redefining prompts as mini-programs, Mirai offers a powerful and flexible solution for creating sophisticated AI-powered applications.
