danielmiessler / fabric

fabric is an open-source framework for augmenting humans using AI. It provides a modular framework for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.

Home Page: https://danielmiessler.com/p/fabric-origin-story

License: MIT License

Languages: Python 64.56%, HTML 6.85%, JavaScript 24.43%, CSS 4.06%, Shell 0.09%
Topics: ai, augmentation, flourishing, life, work

fabric's People

Contributors

agu3rra, argandov, ayberkydn, bpmcircuits, brianteeman, chroakpro, clintgibler, cubermessenger, cwhuang119, danielmiessler, dependabot[bot], dfinke, dheerapat, duckpaddle, eltociear, endogen, fr0gger, fureigh, gilgamesh555, ichoosetoaccept, invisiblethreat, ksylvan, lmccay, lukewegryn, sbehrens, sleeper, streichsbaer, xssdoctor, yourpcchris, zestysoft


fabric's Issues

Token limit?

How does this handle really long documents?

(fabric) xxx@xxxx-MBA client % cat /Users/xxxx/Downloads/xxx\ xxx\ Story\ Building\ Contents/Synthesis_Report.md | fabric --pattern write_essay

Error: Error code: 429 - {'error': {'message': 'Request too large for gpt-4 in organization org-xVSxxxxxbU on tokens per min (TPM): Limit 40000, Requested 40939. The input or output tokens must be reduced in order to run successfully. Visit https://platform.openai.com/account/rate-limits to learn more.', 'type': 'tokens', 'param': None, 'code': 'rate_limit_exceeded'}}

[Bug]: setup.sh error on 'sed' command on macOS

What happened?


I was running setup.sh on macOS and got errors from the sed command:
sed: 1: "/Users/user/.zshrc": invalid command code f

As a result the .zshrc file contained empty aliases alias fabric=''

The modified version of the sed command that worked for me (BSD sed on macOS requires an explicit, possibly empty, backup suffix after -i) is:

sed -i "" "/alias $cmd=/c\\
alias $cmd='$CMD_PATH'
" "$config_file"
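The incompatibility can be illustrated outside setup.sh (a hedged sketch; the temp file is a stand-in, not the script's code):

```shell
# GNU sed accepts -i with the suffix attached to it; BSD sed (macOS)
# requires the suffix as a separate argument, even when it is empty.
# Detecting which sed we have avoids the "invalid command code" error.
printf "alias fabric=old\n" > /tmp/fabric_rc_demo
if sed --version >/dev/null 2>&1; then
    sed -i "s/old/new/" /tmp/fabric_rc_demo      # GNU sed
else
    sed -i '' "s/old/new/" /tmp/fabric_rc_demo   # BSD sed
fi
cat /tmp/fabric_rc_demo    # alias fabric=new
```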

Version check

  • Yes I was.

Relevant log output

Updating /Users/user/.bashrc
sed: 1: "/Users/user/.bashrc": invalid command code f
Updated alias for fabric in /Users/user/.bashrc.
sed: 1: "/Users/user/.bashrc": invalid command code f
Updated alias for fabric-api in /Users/user/.bashrc.
sed: 1: "/Users/user/.bashrc": invalid command code f
Updated alias for fabric-webui in /Users/user/.bashrc.
Updating /Users/user/.zshrc
sed: 1: "/Users/user/.zshrc": invalid command code f
Updated alias for fabric in /Users/user/.zshrc.
sed: 1: "/Users/user/.zshrc": invalid command code f
Updated alias for fabric-api in /Users/user/.zshrc.
sed: 1: "/Users/user/.zshrc": invalid command code f
Updated alias for fabric-webui in /Users/user/.zshrc.
Updating /Users/user/.bash_profile
sed: 1: "/Users/user/.bash_pro ...": invalid command code f
Updated alias for fabric in /Users/user/.bash_profile.
sed: 1: "/Users/user/.bash_pro ...": invalid command code f
Updated alias for fabric-api in /Users/user/.bash_profile.
sed: 1: "/Users/user/.bash_pro ...": invalid command code f
Updated alias for fabric-webui in /Users/user/.bash_profile.

Relevant screenshots (optional)

No response

CLI Option to Select GPT Model Version - Client CLI

Hello, first off great work, I love the idea.

I have a suggestion. Currently, to switch between different versions of the GPT model, one needs to modify the code directly within the client folder. It would be highly beneficial if users could select the GPT model version directly from the command line interface (CLI) without needing to alter the code.

Proposed Solution:

  • Add a command-line interface (CLI) option that allows users to specify the GPT model version directly. This could look something like --model/-m [MODEL_NAME] to set the model that will be used, and --modellist to show the available list of models.
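A minimal argparse sketch of the proposed flags (an illustration using the flag names suggested above, not fabric's actual CLI):

```python
import argparse

parser = argparse.ArgumentParser(description="fabric client (sketch)")
parser.add_argument("--model", "-m", default="gpt-4",
                    help="GPT model version to use")
parser.add_argument("--modellist", action="store_true",
                    help="show the available list of models")

# Example invocation: fabric -m gpt-3.5-turbo
args = parser.parse_args(["-m", "gpt-3.5-turbo"])
print(args.model)  # gpt-3.5-turbo
```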

I can submit a PR if you agree with this enhancement.

Thanks.

Visual Studio - Powershell

What is your question?

Hello. After spending a considerable amount of time referencing this site, following some YouTube walk-throughs, and troubleshooting with ChatGPT, I still can't get through the setup.sh script. Although I've been able to limp along, the main issues appear when attempting to execute Unix commands in the Windows Visual Studio PowerShell environment. I was able, at one point, to get setup.sh to install the dependencies, but it fails in the commands-and-alias part of the script. Is there any reference or comparable script for running in the VS PowerShell terminal environment? Has anyone tried Windows add-ons such as WSL or Cygwin for running this script? I already tried Git Bash, but that failed as well. Any help is appreciated. Thank you.

Can't find module 'utils' (Windows)[Solution]

Environment

  • Operating System: Windows
  • Python Version: [3.11.5]
  • Poetry Version: [1.7.1]
  • Fabric Version: [0.1.0]

Issue Description

When attempting to execute a sequence of commands involving Poetry and Fabric, an error is encountered stating that the module 'utils' cannot be found. This issue arises after initiating a Poetry environment and executing the fabric command.

Steps to Reproduce

  1. Run poetry install to install dependencies.
  2. Activate the virtual environment with poetry shell.
  3. Execute the fabric command.

Expected Behavior

The fabric command should execute without any issues, with all modules being correctly resolved.

Actual Behavior

An error message is displayed: can't find module 'utils'. This prevents the fabric command from executing as expected.

Solution

The issue appears to be related to an incorrect import statement within the Fabric library. To resolve this issue, a modification in the import path within a specific file is required.

Detailed Steps to Resolve

  1. Navigate to the file located at \fabric\client\fabric\fabric.py.
  2. In the first line of the file, locate the import statement for utils.
  3. Modify the import statement from import utils to import fabric.utils.

After making this change, the fabric command executes successfully without encountering the module resolution error.
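The one-line fix above can also be written defensively so the module resolves whether fabric is run as an installed package or directly from the client directory (a sketch, not the shipped code):

```python
try:
    from fabric import utils  # package-qualified import (the fix)
except ImportError:
    try:
        import utils  # legacy flat import when run from the client directory
    except ImportError:
        utils = None  # neither form is available in this environment
```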

Additional Notes

  • This solution is specific to the described environment and versions. If the issue persists, please check for any updates or patches for the involved software.
  • It's recommended to document any changes made to library files for future reference or in case of updates.

I will try

Actually, I can't understand much of it yet, but I will try, and I want to learn.

Use config to generate server routes

The idea is that you could use a YAML document to autogenerate all of the routes (and probably the client code too).

The code below is more complicated than necessary so I'll clean it up and make it more concise and submit as a PR. Just wanted to get the idea out there for discussion.

import yaml
from flask import Flask, request, jsonify
from your_authentication_module import auth_required  # Import your authentication module
from your_fetch_module import fetch_content_from_url  # Import your content fetching module
import openai

app = Flask(__name__)

def create_endpoint(endpoint_config):
    # Give each view a unique endpoint name; without this, Flask raises
    # an error when the second route reuses the name "generic_endpoint".
    @app.route(endpoint_config["path"],
               methods=endpoint_config.get("methods", ["POST"]),
               endpoint=endpoint_config["path"])
    @auth_required
    def generic_endpoint():
        data = request.get_json()

        if "input" not in data:
            return jsonify({"error": "Missing input parameter"}), 400

        input_data = data["input"]

        responses = []
        for content_url in endpoint_config["content_urls"]:
            content = fetch_content_from_url(content_url)
            responses.append({"role": "system" if content_url.endswith("system.md") else "user", 
                              "content": content + ("\n" + input_data if content_url.endswith("user.md") else "")})

        try:
            response = openai.chat.completions.create(
                model=endpoint_config.get("model", "gpt-4-1106-preview"),
                messages=responses,
                temperature=endpoint_config.get("temperature", 0.0),
                top_p=endpoint_config.get("top_p", 1),
                frequency_penalty=endpoint_config.get("frequency_penalty", 0.1),
                presence_penalty=endpoint_config.get("presence_penalty", 0.1),
            )
            assistant_message = response.choices[0].message.content
            return jsonify({"response": assistant_message})
        except Exception as e:
            return jsonify({"error": str(e)}), 500

    return generic_endpoint

# Load the config file
with open("config.yml", "r") as file:
    config = yaml.safe_load(file)

# Create endpoints based on the config
for endpoint_name, endpoint_config in config.items():
    create_endpoint(endpoint_config)

if __name__ == "__main__":
    app.run(debug=True)

The accompanying config.yml might look like this:

extwis:
  path: "/extwis"
  methods: ["POST"]
  content_urls:
    - "https://raw.githubusercontent.com/danielmiessler/fabric/main/patterns/extract_wisdom/system.md"
    - "https://raw.githubusercontent.com/danielmiessler/fabric/main/patterns/extract_wisdom/user.md"
  model: "gpt-4-1106-preview"
  temperature: 0.0
  top_p: 1
  frequency_penalty: 0.1
  presence_penalty: 0.1
# More endpoints can be defined similarly

Documentation Enhancement for Poetry Users - Executable Access

Description

I recently installed fabric using poetry and encountered an issue where the fabric executable was not directly accessible from the command line. Despite a successful installation, this led to some confusion and required additional steps to resolve the issue.

Issue Details

  • Environment: macOS
  • Installation Method: Poetry
  • Fabric Version: 1.0.0
  • Python Version: 3.12

Problem

After installing fabric with poetry, I found that attempting to run the fabric command in the terminal resulted in a "command not found" error. This issue appears to be due to the fabric executable being placed in the virtual environment's bin directory, which is not automatically included in the system's PATH.

Temporary Solution

I was able to manually resolve this issue by creating an alias for the fabric executable located within the poetry-created virtual environment. Here are the steps I took:

  1. Locate the fabric executable within the poetry virtual environment:
    find ~/Library/Caches/pypoetry/virtualenvs -name fabric
  2. Create an alias in my shell configuration file (.bashrc or .zshrc):
    alias fabric='/path/to/virtualenv/bin/fabric'
    Please replace /path/to/virtualenv/bin/fabric with the actual path to the executable found in step 1.
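Steps 1 and 2 can be combined into a single snippet (an illustration that assumes the macOS poetry cache location; on another machine the find may return nothing and the printed alias will be empty):

```shell
# Locate the fabric executable inside poetry's virtualenv cache and
# print the alias line to append to your shell configuration file.
FABRIC_BIN=$(find ~/Library/Caches/pypoetry/virtualenvs -name fabric -type f 2>/dev/null | head -1)
echo "alias fabric='$FABRIC_BIN'"
```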

Suggested Improvement

To improve the user experience and assist future users who might encounter a similar situation, I suggest updating the installation documentation with a note or section dedicated to poetry users. This could include guidance on how to locate the fabric executable within the poetry virtual environment and recommend creating an alias for easier use.

Conclusion

Incorporating this information into the documentation could greatly streamline the setup process for users installing fabric via poetry, ensuring a smoother initial experience. Thank you for considering this suggestion to enhance the project's documentation and user accessibility.

`Feature` Add a GUI for a better user experience

Would it be possible to make a GUI version of the client?
It could feature elements that, for example, help the user create task plans. Or it could allow users to attach files so the AI could use them as reference.

Issue running poetry in macos

Hi I'm facing the below issue (See log output part)

macos config:

  • System Version: macOS 14.1.1 (23B81)
  • Secure Virtual Memory: Enabled
  • Memory: 16 GB

Python:

  • Python 3.12.1

pip list | grep fabric returns:

Package  Version  Editable project location
fabric   0.1.0    /Users/xxxx/Documents/git/fabric/fabric/client

See log output part

(fabric-py3.12) (base) XXXX@XXXX % poetry shell
The currently activated Python version 3.9.2 is not supported by the project (^3.10).
Trying to find and use a compatible version.
Using python3 (3.12.1)
Virtual environment already activated: /Users/xxx/Library/Caches/pypoetry/virtualenvs/fabric-MBJlZ2wn-py3.12
(fabric-py3.12) (base) XXXX@XXXX % fabric -h
Traceback (most recent call last):
  File "/Users/xxx/Library/Caches/pypoetry/virtualenvs/fabric-MBJlZ2wn-py3.12/bin/fabric", line 3, in <module>
    from fabric import main
  File "/Users/xxx/Documents/git/fabric/fabric/client/fabric/__init__.py", line 1, in <module>
    from .fabric import main
  File "/Users/xxx/Documents/git/fabric/fabric/client/fabric/fabric.py", line 1, in <module>
    from utils import Standalone, Update, Setup
ModuleNotFoundError: No module named 'utils'
(fabric-py3.12) (base) XXXX@XXXX %

Idea - Developer Volunteer Pipeline

As per feedback from Daniel ( #38 (comment) ), pitching an idea for a potential area of contribution.

GOAL:

Help create a simple and easy to manage developer volunteer pipeline so Daniel has a decent volunteer "roster" to lean on to solve small to mid-level issues (or feature enhancements).

Details:

  1. Use a website asset (or online form) that Daniel can share from time to time (or put against ads on social media) with a "Call for Volunteers" that quickly and easily outlines the following:
  • preferred languages
  • areas of need
  • commitment level
  • contact info (in case there isn't an immediate need- but you want to reach out to them later).

This way, Daniel can manage the expectations of those who fill out the form while also quickly sizing up basic info about their background. The form could even feature an example of a great contribution/pull request so users can align their expectations to what's needed.

What is needed:

  1. A place to submit sample form questions (so Daniel can finalize/approve). Do this in a pull request? Or just send a you a doc?
  2. Have Daniel list out preferred/ideal languages/experience that would be a good fit.
  3. A sample pull request he likes so it can be featured in the form
  4. BONUS/STRETCH GOAL: Attach this form (or a link, if the form lives on your website) to a bot in your Discord (or wherever your community lives) and have it shared once a month, or at whatever frequency works. That way you don't have to think about "recruiting" contributors; it happens for you.
  5. Bonus bonus ;) You could experiment with running a small targeted ad to see where recruiting volunteers can take you. I have some digital marketing experience and can assist with that. It might be overkill, but maybe not if you are interested in building a solid core team you can actually trust.

Adding function that allows fabric to work under Visual Studio

Under Visual Studio (not Visual Studio Code), sys.stdin.read() doesn't work so well. I am proposing a function called get_input that selects the input method based on the platform it is running on. This allowed me to test under VS.

At the head of fabric

from utils import Standalone, Update, Setup
import argparse
import sys
import os
import platform


script_directory = os.path.dirname(os.path.realpath(__file__))


# aided by ChatGPT; a function for Python
# incorporating platform.system
# that accepts either piped input or console input
# from either Windows or Linux
def get_input():
    system = platform.system()
    if system == 'Windows':
        if not sys.stdin.isatty():  # Check if input is being piped
            return sys.stdin.readline().strip()  # Read piped input
        else:
            return input("Enter Question: ")  # Prompt user for input from console
    else:
        return sys.stdin.read()  # Non-Windows: read all piped input



At the bottom of fabric
else:
    text = get_input() 
if args.stream:
    standalone.streamMessage(text)
else:
    standalone.sendMessage(text)

Broken file

DEMO.write_essay.mov is labeled as a corrupt file

More detail about dynamic path-pattern mapping

This has conflicts, unfortunately, and I'm not sure I'm fully grasping the idea. Can you restate?

example: currently, when I go to localhost:5000/extwis, I can send a POST request to tell the server to mill my input with your pattern and then send it to OpenAI to get the answer.

I proposed that instead of you building @app.route("/extwis", methods=["POST"]) for every pattern, we can make the path /<pattern> dynamic by adding a mapping dictionary that maps the path to the pattern the user wants to use.
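A minimal sketch of the idea (the mapping contents are hypothetical, and this is not the Mill's actual code):

```python
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Hypothetical mapping from URL path segment to pattern directory name
PATTERN_MAP = {
    "extwis": "extract_wisdom",
    "summarize": "summarize",
}

@app.route("/<pattern>", methods=["POST"])
def run_pattern(pattern):
    # One dynamic rule replaces a hand-written route per pattern
    if pattern not in PATTERN_MAP:
        abort(404)
    return jsonify({"pattern": PATTERN_MAP[pattern]})
```

With this in place, POSTing to /extwis resolves to the extract_wisdom pattern without a dedicated route, and unknown paths return 404.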

Originally posted by @dheerapat in #52 (comment)

Typo in `README.md`

On #L119, ... systemClone ... should be ... system. Clone ....

1. Navigate to where you want the Fabric project to live on your systemClone the directory to a semi-permanent place on your computer.

Update OpenAI API Key Handling

Description:
The current implementation directly reads the OpenAI API key from a file in the Setup class in client/utils.py. This can be a potential security risk as the key is stored in plain text. Additionally, as a user, I already have the OpenAI API key stored as an environment variable, so it is inconvenient to generate a new API key and paste it into the code.

Proposed Changes:

  1. Modify the Setup class to set the API key as an environment variable (OPENAI_API_KEY) instead of writing it to a file.
  2. Directly use OpenAI() without manually setting the API key, as OpenAI() can infer the API key from the environment variables.
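A sketch of the proposed behavior (the function and file names here are assumptions, not fabric's actual code):

```python
import os

def get_api_key(key_file="~/.config/fabric/apikey"):
    """Prefer OPENAI_API_KEY from the environment; fall back to the
    legacy key file only if the variable is absent."""
    key = os.environ.get("OPENAI_API_KEY")
    if key:
        return key
    path = os.path.expanduser(key_file)
    if os.path.exists(path):
        with open(path) as f:
            return f.read().strip()
    return None
```

The openai client library also reads OPENAI_API_KEY on its own, so OpenAI() can then be constructed without passing a key explicitly.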

Use python poetry for managing dependencies in a deterministic fashion

I noticed the repo contains a collection of requirements.txt files without pinned dependencies. This means that every time a "build" (a dependency pull, in Python's case) happens, a new set of dependencies is resolved (time-dependent), which could lead to issues reproducing the same exact dev environment on multiple machines.

I suggest using https://python-poetry.org/ to manage dependencies as it is really efficient and makes it clear which dependencies you rely on.
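For illustration, a pinned poetry setup might look roughly like this (package names and version constraints are placeholders, not a proposal of exact pins):

```toml
[tool.poetry]
name = "fabric-client"
version = "0.1.0"
description = "fabric CLI client"

[tool.poetry.dependencies]
python = "^3.10"
openai = "^1.12"
```

Running poetry lock then freezes the fully resolved dependency tree in poetry.lock, which is what makes the environment reproducible across machines.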

I volunteer to submit a PR if you're up for it. Just let me know @danielmiessler.

Cheers! \o/

PS: I assume you wish to manage the client and server dependencies separately. Please confirm if that's the case.

Volunteer opportunities?

Hello,

I have 3+ years of software project management experience. While I use openSUSE as my daily driver, I'm not a programmer or dev, so only limited technical experience.

I am fascinated with your take on AI tools, and wanted to see if there are opportunities to volunteer and help the project out.

README Instructions need more Clarification

I noticed through one of the older issues that it is expected that you run "poetry shell" to pick up the installed dependencies and find the CLI. This should be added to the instructions after "poetry install", I think.

Steps 5 and 6 seem redundant and maybe slightly inaccurate? When I tried that:

  1. it first attempted to run it as a bash script, complaining that there was a problem near the unexpected token '('...
  2. then I changed the alias to "python3 /path/to/fabric/client/fabric" and it complained that it was a directory...
  3. then I changed it to fully qualify the path to the fabric.py file
  4. then it failed to import the dependencies and modules installed by poetry
  5. once I used "poetry shell" it seemed to work

So, I'd like to fix up the instructions but am not sure what exactly we need to do.

I can certainly interject the use of "poetry shell" but do we actually expect the bashrc alias bit to work without being in the shell?
If so, what else do we need to do there?

Also, what should step 6 actually be? Seems to just be a slight variation on 5 even though it says to restart your shell and to make sure you can do something...

problem running poetry shell

OS: Windows & Linux Mint
Poetry version: 1.7.1
Python 3.10

I'm having issues on both windows and linux with getting this to work.
Sorry if this is an easy fix, I've tried a few different things to make this work, including changing my OS completely.

Followed the instructions up to the point where I need to run the poetry shell.
I don't remember what was going on with my Windows machine, but here's what I get with this one.
It seems like I'm getting an error related to the "cleo" package. I've upgraded poetry, injected cleo, and tried to go back and make sure I did the earlier steps properly.
Any help is greatly appreciated.
I'm not sure if I'm just missing something, or what.

When I try to run it,
Original exception was:
Traceback (most recent call last):
  File "/usr/bin/poetry", line 5, in <module>
    from poetry.console import main
  File "/usr/lib/python3/dist-packages/poetry/console/__init__.py", line 1, in <module>
    from .application import Application
  File "/usr/lib/python3/dist-packages/poetry/console/application.py", line 3, in <module>
    from cleo import Application as BaseApplication
ImportError: cannot import name 'Application' from 'cleo' (/home/enigma/.local/lib/python3.10/site-packages/cleo/__init__.py)

Adding own patterns

I've created a folder with the system and user file and I've tried running update. But it's not recognizing my added pattern and returns "pattern not found".

What is the right process?

Local LLMs

The optional server-side functionality of fabric is called the Mill.

Is the Mill only meant to interact with OpenAI, or is it possible to set up a Mill with a local LLM via something like LM Studio/Ollama/LangChain?

The infrastructure code seems to only support OpenAI connections. Maybe that's just meant as a starting example...

Missing requirements.txt file for client install

In the README quickstart for the client install, it instructs to

Enter the project and its /client folder

cd fabric/client

Install the pre-requisites

pip3 install -r requirements.txt

However, there is no requirements.txt in the fabric/client folder. I did find one in the 'server' folder and installed those requirements, but perhaps something is still missing: after adding the alias (or even trying it manually), it doesn't run; it complains that I'm referencing a directory when I do 'fabric -h':

$ fabric -h
-bash: /home/user/ai/fabric/client/fabric: Is a directory

Thanks

[Question]: Fabric not found in the current Poetry environment

What is your question?

Hi, I followed the Quickstart, but when I tried to run ./setup.sh,

it gave this error:
Command fabric not found in the current Poetry environment.
Command fabric-api not found in the current Poetry environment.
Command fabric-webui not found in the current Poetry environment.

Did I install it wrong? Can you provide some steps to solve this?

Token limit

When testing extractwisdom and summarize I run into the 4000 input/output token limit when using the 3.5 turbo model.

Are you aware of any way to address this?
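One common workaround is to split the input into pieces that fit the per-request token budget. A rough sketch (not fabric's API; tokens are approximated here as ~4 characters each, with a small overlap so context isn't lost at the boundaries):

```python
def chunk_text(text, max_tokens=3000, overlap_tokens=200):
    """Split text into chunks that fit a token budget, approximating
    tokens as ~4 characters each, with overlap between chunks."""
    max_chars = max_tokens * 4
    overlap_chars = overlap_tokens * 4
    chunks, start = [], 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap_chars  # back up so chunks share context
    return chunks
```

Each chunk can then be summarized separately, though as noted below, naive chunking yields disjoint per-chunk summaries unless a final merge pass combines them.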

I tried doing chunking, but then one receives weird output like this (with different summaries for different parts of the text):

SUMMARY:

This article provides a deep dive into the history and evolution of OAuth, the authentication protocol used by many APIs. It explains how OAuth was developed as a solution to the limitations of HTTP Basic Auth and the problems associated with sharing passwords with third-party applications. The article also discusses the differences between various authentication mechanisms used by different platforms and how they eventually led to the development of OAuth. It further highlights the challenges faced by OAuth 1, especially in the context of mobile apps, which eventually led to the development of OAuth 2.

IDEAS:

1. OAuth was developed as a response to the limitations and security risks associated with using HTTP Basic Auth for API authentication.
2. Various platforms had their own authentication mechanisms, such as FlickrAuth, AuthSub, MD5 signed requests, and BBAuth, before the development of OAuth.
3. Developers realized that they were all trying to solve the same problem and started working together, leading to the development of OAuth 1.
4. OAuth 1 had limitations and was not suitable for mobile apps, which eventually led to the development of OAuth 2.
5. OAuth 2 aimed to simplify the authentication process, improve security, and make it compatible with a wider range of applications, including mobile and single-page apps.

QUOTES:

1. "Before OAuth, it was actually very common for APIs to use HTTP Basic Auth. That’s just an application sending a username and password to the API." 
2. "So in the mid-2000s, as many websites started building public APIs, they all started solving this problem. But everybody did it slightly differently and everyone used different names for things."
3. "OAuth 1 was published in 2007 and was deployed at several companies, including Twitter."
4. "OAuth 2 aimed to make authentication easier for developers by dropping the signature requirements and using Bearer tokens instead."
5. "OAuth 2 was designed to be used securely in mobile apps and single-page apps in a browser."

HABITS:

1. Collaboration and working together: Developers from different companies collaborated to develop OAuth, realizing they were all trying to solve the same problem.
2. Continuous improvement: OAuth 1 was improved upon and eventually replaced with OAuth 2 to address the limitations and challenges faced.
3. Security-consciousness: The development of OAuth 2 aimed to improve security and ensure safe authentication in various applications.
4. Adaptability: OAuth 2 was designed to be compatible with a wide range of applications, including mobile and single-page apps.

FACTS:

1. OAuth 1 was first deployed in 2007 and was used by several companies, including Twitter.
2. OAuth 2 was developed to address the limitations of OAuth 1 and make authentication easier for developers.
3. OAuth 2 introduced the use of Bearer tokens for authentication instead of signatures.
4. OAuth 2 was designed to be secure and compatible with mobile apps and single-page apps in browsers.

REFERENCES:

1. FlickrAuth - Flickr's API authentication mechanism.
2. AuthSub - Google's API authentication mechanism.
3. MD5 signed requests - Facebook's API authentication mechanism.
4. BBAuth - Yahoo!'s browser-based authentication mechanism.

RECOMMENDATIONS:

1. Developers should adopt OAuth for secure and user-friendly authentication in their applications.
2. Mobile app developers should utilize OAuth 2 for safe and efficient authentication.
3. Companies providing public APIs should consider implementing OAuth 2 to facilitate secure data access for third-party applications.

# SUMMARY

This content discusses the adoption of OAuth by larger companies and the separation between the authorization server and API server in OAuth 2.0. It highlights the improvements made in the OAuth 2.0 spec and the continued progress by the IETF Working Group. The content explores how OAuth improves application security by addressing the limitations of handling authentication directly in applications. It emphasizes the benefits of OAuth for single sign-on and enabling multiple apps to share the same user database.

# IDEAS:
1. OAuth 2.0 was designed to address the scalability and security concerns of handling authentication directly in applications.
2. The separation between the authorization server and API server is a key feature of OAuth 2.0, allowing for better scalability and security.
3. The OAuth 2.0 spec has evolved since its finalization in 2012, with additional extensions and best practices being developed to enhance its functionality and security.
4. OAuth enables single sign-on and allows multiple apps from different companies to share a common user database.
5. Handling passwords directly in applications raises trust and security concerns for users, as they are unsure how their passwords will be used or stored.
6. OAuth eliminates the need for applications to collect and store user passwords by providing a secure authorization mechanism.
7. OAuth has specific benefits for native apps, smart TVs, and other scenarios, as demonstrated by extensions like OAuth 2.0 for Native Apps and the Device Grant.
8. The OAuth Security Best Current Practice provides guidelines for building secure OAuth systems.
9. OAuth provides a more flexible and secure approach to authentication, especially when integrating with multiple apps and platforms.

# QUOTES:
1. "As larger companies started looking at adopting OAuth for their own APIs, they also needed to make sure that it would work at a larger scale." 
2. "OAuth 2.0 Spec was finalized in 2012, but the work hasn’t stopped there. The IETF Working Group has continued to make progress on the specs, filling in some of the missing functionality and making OAuth more useful or more secure in even more situations."
3. "It’s the single sign-on case that really gets to the core of why OAuth was created, especially once you consider that you may want multiple apps by different companies to be able to share the same user database."
4. "Without OAuth, these applications would have to collect the user’s password and send it to the API."
5. "Handling passwords directly in applications raises trust and security concerns for users. They are unsure how their passwords will be used or stored."
6. "OAuth eliminates the need for applications to collect and store user passwords by providing a secure authorization mechanism."
7. "OAuth has become a widely adopted standard for authentication and authorization in various scenarios, ensuring better security and scalability."

# HABITS:
1. Storing password hashes properly to ensure better security.
2. Never logging passwords accidentally to protect user privacy.

# FACTS:
1. OAuth 2.0 Spec was finalized in 2012.
2. The OAuth 2.0 For Native Apps extension provides specifications for integrating OAuth in native applications.
3. The Device Grant extension enables the use of OAuth on smart TVs.
4. The OAuth Security Best Current Practice outlines the most secure way to build OAuth systems.

# REFERENCES:
1. OAuth 2.0 For Native Apps extension
2. Device Grant extension
3. OAuth Security Best Current Practice

# RECOMMENDATIONS:
1. Adopt OAuth for authentication and authorization in your applications to enhance security and scalability.
2. Implement OAuth for single sign-on functionality and to enable multiple apps to share the same user database.
3. Follow the OAuth Security Best Current Practice guidelines to ensure the highest level of security in your OAuth systems.
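The delegated-access idea in the notes above can be sketched from a client application's perspective. This is a minimal sketch of the OAuth 2.0 authorization code flow; every endpoint, client ID, and secret below is a hypothetical placeholder, not a real provider.

```python
from urllib.parse import urlencode

# Hypothetical authorization server endpoints (placeholders, not a real provider)
AUTH_ENDPOINT = "https://auth.example.com/authorize"
TOKEN_ENDPOINT = "https://auth.example.com/token"

def build_authorization_url(client_id, redirect_uri, scope, state):
    """Step 1: send the user to the authorization server.

    The application never sees the user's password -- this is the core
    property the notes above describe.
    """
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
        "state": state,  # CSRF protection; echoed back on the redirect
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

def build_token_request(code, client_id, client_secret, redirect_uri):
    """Step 2: exchange the short-lived authorization code for an access token.

    This dict would be sent as the POST body to TOKEN_ENDPOINT.
    """
    return {
        "grant_type": "authorization_code",
        "code": code,
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
    }
```

The application only ever handles the short-lived code and the resulting token, never the password, which is why revoking one app's access does not affect any other app sharing the same user database.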

How to install?

There are no instructions in the README on how to install it locally.

Also, I'd love a branch where people upload their use cases so that others can take them as examples.

I don't know

Can you describe this project in more detail?
Why do people need this project, beyond OpenAI?
Do you share any prompts in this project? Where can I find them?
Thanks for your kindness.

Cost management measures

A brief review of the project reveals that there are no built-in API cost management measures, and OpenAI's own limits are widely reported not to work. There needs to be a warning, and budgets; otherwise people's accounts are going to get cleaned out.
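To make the concern concrete, here is a minimal client-side spend-guard sketch. Nothing like this currently exists in fabric, and the per-token price is an illustrative assumption, not a real OpenAI rate.

```python
class BudgetGuard:
    """Refuse API calls once an estimated spend cap is reached (sketch only)."""

    def __init__(self, budget_usd):
        self.budget = budget_usd
        self.spent = 0.0

    def charge(self, tokens, usd_per_1k_tokens=0.002):
        # usd_per_1k_tokens is an illustrative placeholder price
        cost = tokens / 1000 * usd_per_1k_tokens
        if self.spent + cost > self.budget:
            raise RuntimeError("budget exceeded; refusing the API call")
        self.spent += cost
        return cost
```

A wrapper like this, called before each completion request with the estimated token count, would at least fail loudly instead of silently draining an account.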

[Bug]: Failing to import Installer

What happened?

I've followed the quickstart recipe on macOS in a fresh conda env for Python 3.10, but when running fabric I get an error about the installer. Maybe it's not installed correctly, or maybe an __init__.py is missing there?

Version check

  • Yes I was.

Relevant log output

(fabric-env) fabric % which fabric
fabric: aliased to /Users/joe/miniconda3/envs/fabric-env/bin/fabric

(fabric-env) fabric % fabric
Traceback (most recent call last):
  File "/Users/joe/miniconda3/envs/fabric-env/bin/fabric", line 3, in <module>
    from installer import cli
ImportError: cannot import name 'cli' from 'installer' (/Users/joe/miniconda3/envs/fabric-env/lib/python3.10/site-packages/installer/__init__.py)

Relevant screenshots (optional)

No response

Remove Virtual Environment Folder from "client" Directory

Description:
The "client" directory currently includes a "source" folder containing a virtual environment.

Proposed Solution:

  1. Add "source/" to the .gitignore file to exclude the virtual environment folder from version control.
    or
  2. When creating virtual environments, let's adhere to the existing naming conventions in the .gitignore file, such as "env," "venv," and "ENV."

Can someone explain about this project?

It looks like a very promising project, but I don't understand what kind of framework it is. There aren't any examples included or anything. I'm taking courses in machine learning and would like to contribute to this project once I understand the basic gist of it.

Option to use other LLM Models

Love your project! 👏

It would be great if fabric could use local LLMs like Llama from Meta!
I do not know how to do it, though...

Question on the usage of the `server`.

I understood the main purpose of the project's server is to create endpoints which will host one's own patterns of system.md instructions to then forward requests to the OpenAI API. Did I get this right?

I wonder if we could simply "bake" all of the ready-to-use patterns in the patterns folder into the fabric CLI directly and then make them discoverable via a built-in fabric --help command. We could perhaps allow users to add their own patterns (assuming they won't contribute them back to this repo) to our ~/.fabric/ config folder. Any folder containing, say, ~/.fabric/foobar/system.md would then become discoverable for and usable by the CLI.

Does that make any sense? I suppose the alternative is that people should manually add endpoints to the Flask API.
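As a sketch of the discovery idea above (the ~/.fabric/ layout is this proposal, not an existing fabric feature):

```python
from pathlib import Path

def discover_patterns(config_dir=None):
    """Map pattern names to system.md paths under the user's config folder.

    Any subfolder containing a system.md (e.g. ~/.fabric/foobar/system.md)
    becomes a discoverable pattern named after the folder -- a hypothetical
    layout matching the proposal above.
    """
    config_dir = Path(config_dir) if config_dir else Path.home() / ".fabric"
    if not config_dir.is_dir():
        return {}
    return {p.parent.name: p for p in sorted(config_dir.glob("*/system.md"))}
```

The CLI could then merge this dict with the repo's built-in patterns when rendering --help, so user-added patterns appear alongside the shipped ones.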

@danielmiessler Thoughts?

Wisdom Input?

Looking through the Patterns to try and understand how to make use of them, I'm confused as to why the Wisdom one doesn't have an Input section in its system.md.

# INPUT:

INPUT:

[Question]: After setting up fabric with my OpenAI GPT API key, it looks like the model is still trying to use a different API key ending with "92xm". Are there setup instructions I can follow to run my own client/server/mill functions with my own API keys?

What is your question?

After reading the documentation, I am still not clear on how to get my own fabric Mill ecosystem set up. After running $ fabric --setup, entering my personal API key, and downloading the current patterns, I am unable to successfully run any patterns. This is an example of trying one particular pattern:
$ echo "This is brokentext text, ." | fabric --stream --pattern clean_text

I get the following error:
Error: Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-a9HhR***************************************92xm. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}} Error code: 401 - {'error': {'message': 'Incorrect API key provided: sk-a9HhR***************************************92xm. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

This is not my api key and I do have a valid subscription.
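One possible cause — an assumption, not confirmed from fabric's code — is a stale OPENAI_API_KEY environment variable taking precedence over the key entered during --setup. A quick way to compare the environment against the masked key in the error message:

```python
import os

def key_suffix(key, n=4):
    """Show only the tail of a key so it can be safely compared
    against the masked value in the error message."""
    return key[-n:] if key else "(not set)"

print("env OPENAI_API_KEY ends with:",
      key_suffix(os.environ.get("OPENAI_API_KEY", "")))
```

If this prints "92xm", the shell environment (or a shell profile file) is supplying the unexpected key.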

Here are some of my system specs in case this helps:
$ cat /var/log/installer/media-info ; echo $XDG_CURRENT_DESKTOP ; echo ; lsb_release -a ; echo ; uname -r ; echo ; python --version ; echo ; poetry --version

Ubuntu-Server 22.04.3 LTS "Jammy Jellyfish" - Release amd64 (20230810)

No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 22.04.4 LTS
Release: 22.04
Codename: jammy

5.15.0-94-generic

Python 3.10.12

Poetry (version 1.7.1)

I know it is early on in the project and I'm thrilled you've chosen to release it! I am happy to contribute patterns & documentation. I just can't figure out getting to the successful starting point yet!

Thank you kindly -- @0xsalt.

General help please

Hey Daniel, a belated happy new year to you, sir. Thank you for your wonderful work. I love your AI Fabric open-source idea. I am just learning AI and wondered how you managed to create those GPTs you demonstrated via your Linux on-prem box?

I want to try out extractwisdom but am not too sure how to start. I know you mentioned Flask and some other technologies, but wondered if you had an idiot's guide to setting up a GPT like extractwisdom.


Any dummies guide help would be gratefully appreciated

Thank you sir
AJ

API server only supports the extract_wisdom pattern? I think the API path should be dynamic

We can do something like this:

from flask import Flask, abort, jsonify

app = Flask(__name__)

path_mappings = {
    "path1": {"system_url": "https://example.com/system1.md", "user_url": "https://example.com/user1.md"},
    "path2": {"system_url": "https://example.com/system2.md", "user_url": "https://example.com/user2.md"},
    # these could also be relative paths, since the back-end server lives alongside the patterns directory
}

@app.route("/<path_segment>", methods=["POST"])
def mill(path_segment):
    mapping = path_mappings.get(path_segment)
    if mapping is None:
        abort(404)  # unknown pattern
    # fetch the system/user prompts from `mapping` and forward the request to the LLM here
    return jsonify(mapping)

Just a rough idea; I will tackle this over the weekend. I want to contribute to this and improve the quality of the back-end server.

Dockerfile

Hello! You should consider adding a Dockerfile to allow people to easily deploy the server to their own infrastructure.

Feature: Compatibility with Ollama, locally hosted AI LLMs

First, thank you for the tremendous project! I love it.

I am currently a proponent of Ollama and locally hosted LLMs. It would be awesome if, as a long-term goal, we could integrate fabric into a locally hosted LLM.

Ollama does expose OpenAI-compatible APIs, etc., but I believe the goal would be to use the locally hosted LLMs over OpenAI if that is what you wished.

https://github.com/ollama/ollama

I would love to offer up my help in testing with my ollama installation if I can.
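Since Ollama exposes an OpenAI-compatible HTTP API (by default at http://localhost:11434/v1), a client-side sketch of targeting it could look like the following. The model name is a placeholder, and the request is only constructed here, not sent:

```python
import json
from urllib.request import Request

def build_chat_request(base_url, model, prompt):
    """Build an OpenAI-style chat completion request aimed at a local server."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

# Point the same request shape at a local Ollama server instead of api.openai.com;
# "llama2" is a placeholder model name you would have pulled with `ollama pull`:
req = build_chat_request("http://localhost:11434/v1", "llama2", "Summarize this text.")
```

Because the request shape is identical, supporting Ollama could be as small as making the base URL (and API key handling) configurable.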
