Comments (3)
Hey Muhammad,
In that case, you would wrap the model invocation in a custom callable, following the docs here. It could be as simple as:
def custom_llm_callable(prompt, **kwargs):
    # Pass the prompt (and any generation kwargs) to the LLM here.
    return llm(prompt, **kwargs)

res = guard(
    custom_llm_callable,
    prompt_params={"doctors_notes": doctors_notes},
    max_tokens=1024,
    temperature=0.3,
)
from guardrails.
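The wrapper pattern above can be exercised in isolation, independent of Guardrails: any callable that accepts the prompt plus keyword arguments and returns a string can stand in for the model. A minimal sketch (`fake_llm` below is a hypothetical stand-in, not a real model or part of the Guardrails API):

```python
def fake_llm(prompt, **kwargs):
    # Stand-in for a real model call; simply echoes the prompt back.
    return f"model saw: {prompt}"

def custom_llm_callable(prompt, **kwargs):
    # Forward the prompt (and generation kwargs such as temperature)
    # to whatever model you are using.
    return fake_llm(prompt, **kwargs)

out = custom_llm_callable("hello", temperature=0.3)
print(out)  # -> model saw: hello
```

Because `guard(...)` just needs a callable with this shape, swapping `fake_llm` for a real client call is the only change required in practice.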
@MuhammadHammadBashir we also recently added support for HuggingFace models. So if you are using one of those from the transformers library, you can treat model.generate like an LLM API.
We still need to add better docs for this functionality, but you can find a summary and example here:
https://www.guardrailsai.com/blog/0.3-release/#anthropic-and-hugging-face-models-support
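The "treat model.generate like an LLM API" idea amounts to adapting a token-ids-in, token-ids-out interface to a prompt-string callable. A hedged sketch of that adapter, using fake stand-ins for the transformers model and tokenizer (the real objects come from `AutoModelForCausalLM` / `AutoTokenizer`; only the shape of the adapter is the point here):

```python
class FakeTokenizer:
    # Stand-in for a transformers tokenizer: text <-> token ids.
    def encode(self, text):
        return [ord(c) for c in text]

    def decode(self, ids):
        return "".join(chr(i) for i in ids)

class FakeModel:
    # Stand-in for a transformers model with a generate() method.
    def generate(self, input_ids, **kwargs):
        # A real model would append newly generated tokens; this
        # fake just echoes the input ids back.
        return input_ids

tokenizer = FakeTokenizer()
model = FakeModel()

def hf_llm_callable(prompt, **kwargs):
    # Tokenize the prompt, call generate like an LLM API,
    # and decode the resulting ids back into a string.
    ids = model.generate(tokenizer.encode(prompt), **kwargs)
    return tokenizer.decode(ids)

print(hf_llm_callable("hi"))  # -> hi
```

With a real transformers model you would pass batched tensors and generation parameters (e.g. max_new_tokens) through `**kwargs`, but the callable's signature stays the same, which is what lets Guardrails drive it.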