
Comments (2)

dosubot commented on September 17, 2024

To set a PromptTemplate as the default wrapper for the model when working with llama-index, you can use the SelectorPromptTemplate class to define your default prompt and any conditionals. Here is an example of how you can do this:

from llama_index.core.prompts.base import SelectorPromptTemplate, PromptTemplate
from llama_index.core.prompts.utils import is_chat_model

# Define your custom PromptTemplate
custom_prompt_template = PromptTemplate(template="Your custom prompt template here")

# Define conditionals if any (e.g., for chat models)
custom_conditionals = [(is_chat_model, custom_prompt_template)]

# Create a SelectorPromptTemplate with your custom prompt and conditionals
CUSTOM_PROMPT_SEL = SelectorPromptTemplate(
    default_template=custom_prompt_template,
    conditionals=custom_conditionals,
)

# `CUSTOM_PROMPT_SEL` can now be passed wherever the model expects a prompt
# template; the selector picks the right template for the model at call time

This code snippet demonstrates how to create a SelectorPromptTemplate, define your custom prompt and conditionals, and set it as the default wrapper for the model. The SelectorPromptTemplate class allows you to specify a default template and conditionals, similar to how default prompt selectors are defined in the llama-index library [1][2][3][4].

Regarding your second question, this approach is indeed a proper way to work with llama-index when you need to set a default prompt template for your model. It avoids the need to subclass OpenAILike and provides a flexible way to manage prompt templates.
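To make the selection logic concrete, here is a self-contained sketch (plain Python, no llama-index dependency) of the idea behind a selector-style template: walk the conditionals in order and fall back to the default. The `is_chat_model` and `FakeLLM` stand-ins below are illustrative only, not the library's actual implementations.

```python
# Minimal sketch of selector-style template resolution: return the first
# template whose condition matches the LLM, otherwise the default template.

def select_template(default_template, conditionals, llm):
    """Return the first matching conditional template, else the default."""
    for condition, template in conditionals:
        if condition(llm):
            return template
    return default_template

# Hypothetical stand-ins for illustration only:
def is_chat_model(llm):
    return getattr(llm, "is_chat", False)

class FakeLLM:
    def __init__(self, is_chat):
        self.is_chat = is_chat

default_template = "completion prompt"
conditionals = [(is_chat_model, "chat prompt")]

print(select_template(default_template, conditionals, FakeLLM(is_chat=True)))   # chat prompt
print(select_template(default_template, conditionals, FakeLLM(is_chat=False)))  # completion prompt
```

This mirrors why you pass both a `default_template` and a `conditionals` list to `SelectorPromptTemplate`: one object covers both chat and completion models.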



logan-markewich commented on September 17, 2024

@egeres you can set some function hooks in OpenAILike -- completion_to_prompt() and messages_to_prompt()

from llama_index.llms.openai_like import OpenAILike


def completion_to_prompt(completion):
    system_prompt = "..."
    return f"""<|begin_of_text|><|start_header_id|>system<|end_header_id|>

{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>

{completion}<|eot_id|><|start_header_id|>assistant<|end_header_id|>"""


# this is very naive -- llama3 has a proper chat format, and I'm not typing
# out the full logic here; normally you'd use message.role and message.content
# to build the prompt
def messages_to_prompt(messages):
    prompt_str = "\n".join([str(x) for x in messages])
    return completion_to_prompt(prompt_str)


llm = OpenAILike(..., messages_to_prompt=messages_to_prompt, completion_to_prompt=completion_to_prompt)
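Picking up the comment above about using `message.role` and `message.content`, here is a hedged sketch of a less naive, role-aware `messages_to_prompt` for the Llama-3 chat format. The `SimpleMessage` class below is a stand-in for llama-index's `ChatMessage` (which exposes the same `role`/`content` attributes), so the function is runnable on its own.

```python
# Role-aware messages_to_prompt sketch for the Llama-3 chat format.
# SimpleMessage stands in for llama_index's ChatMessage (role/content attrs).
from dataclasses import dataclass

@dataclass
class SimpleMessage:
    role: str      # "system", "user", or "assistant"
    content: str

def messages_to_prompt(messages):
    # one header/body block per message, in order
    prompt = "<|begin_of_text|>"
    for m in messages:
        prompt += f"<|start_header_id|>{m.role}<|end_header_id|>\n\n{m.content}<|eot_id|>"
    # trailing assistant header cues the model to generate the reply
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

msgs = [
    SimpleMessage("system", "You are helpful."),
    SimpleMessage("user", "Hi!"),
]
print(messages_to_prompt(msgs))
```

A function shaped like this can be passed as the `messages_to_prompt` hook in the `OpenAILike(...)` call above instead of the naive string-join version.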


