Comments (2)
To set a `PromptTemplate` as the default wrapper for the model when working with llama-index, you can use the `SelectorPromptTemplate` class to define your default prompt and any conditionals. Here is an example of how you can do this:
```python
from llama_index.core.prompts.base import SelectorPromptTemplate, PromptTemplate
from llama_index.core.prompts.utils import is_chat_model

# Define your custom PromptTemplate
custom_prompt_template = PromptTemplate(template="Your custom prompt template here")

# Define conditionals if any (e.g., for chat models)
custom_conditionals = [(is_chat_model, custom_prompt_template)]

# Create a SelectorPromptTemplate with your custom prompt and conditionals
CUSTOM_PROMPT_SEL = SelectorPromptTemplate(
    default_template=custom_prompt_template,
    conditionals=custom_conditionals,
)

# Now, `CUSTOM_PROMPT_SEL` is set as the default wrapper for the model
```
This code snippet demonstrates how to create a `SelectorPromptTemplate`, define your custom prompt and conditionals, and set it as the default wrapper for the model. The `SelectorPromptTemplate` class allows you to specify a default template and conditionals, similar to how default prompt selectors are defined in the llama-index library [1][2][3][4].

Regarding your second question, this approach is indeed a proper way to work with llama-index when you need to set a default prompt template for your model. It avoids the need to subclass `OpenAILike` and provides a flexible way to manage prompt templates.
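To make the selection behavior concrete, here is a minimal plain-Python sketch of the logic a prompt selector implements: try each `(condition, template)` pair in order and fall back to the default. The names `select_template`, `FakeLLM`, and this standalone `is_chat_model` are illustrative stand-ins, not the llama-index API.

```python
# Sketch of conditional template selection: first matching condition wins,
# otherwise the default template is returned. Not the llama-index API.

def select_template(default_template, conditionals, llm):
    for condition, template in conditionals:
        if condition(llm):
            return template
    return default_template


# Toy stand-ins for an LLM object and the is_chat_model predicate
class FakeLLM:
    def __init__(self, is_chat):
        self.is_chat = is_chat


def is_chat_model(llm):
    return llm.is_chat


chat_template = "chat-style prompt"
completion_template = "completion-style prompt"
conditionals = [(is_chat_model, chat_template)]

print(select_template(completion_template, conditionals, FakeLLM(is_chat=True)))   # chat-style prompt
print(select_template(completion_template, conditionals, FakeLLM(is_chat=False)))  # completion-style prompt
```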
To continue talking to Dosu, mention @dosu.
@egeres you can set some function hooks in `OpenAILike` -- `completion_to_prompt()` and `messages_to_prompt()`:
```python
def completion_to_prompt(completion):
    system_prompt = "..."
    return f"""<|begin_of_text|><|start_header_id|>system<|end_header_id|>
{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>
{completion}<|eot_id|><|start_header_id|>assistant<|end_header_id|>"""


# this is very naive, I know llama3 has a chat format, I'm not typing out the logic
# normally you'd use message.role and message.content to build a prompt
def messages_to_prompt(messages):
    prompt_str = "\n".join([str(x) for x in messages])
    return completion_to_prompt(prompt_str)


llm = OpenAILike(..., messages_to_prompt=messages_to_prompt, completion_to_prompt=completion_to_prompt)
```
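For a less naive `messages_to_prompt` that does use `message.role` and `message.content` as the comment suggests, here is a self-contained sketch. The `Message` dataclass is a stand-in for llama-index's `ChatMessage`; the special tokens follow Llama 3's chat format.

```python
# Hedged sketch: building a Llama 3 chat prompt from role/content pairs.
# Message is a stand-in for llama-index's ChatMessage type.
from dataclasses import dataclass


@dataclass
class Message:
    role: str      # "system", "user", or "assistant"
    content: str


def messages_to_prompt(messages):
    prompt = "<|begin_of_text|>"
    for m in messages:
        prompt += f"<|start_header_id|>{m.role}<|end_header_id|>\n\n{m.content}<|eot_id|>"
    # Cue the model to generate the assistant's reply
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt


print(messages_to_prompt([Message("system", "You are helpful."), Message("user", "Hi!")]))
```

This version would be passed to `OpenAILike` the same way as above, via the `messages_to_prompt` keyword argument.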