fixie-ai / ai-jsx
The AI Application Framework for JavaScript
Home Page: https://docs.ai-jsx.com
License: MIT License
When using the docs chat (https://ai-jsx-nextjs-demo.vercel.app/docs-chat), I asked a question (see attached screenshot) and then experienced three issues:
The app also allows multiple presses of Enter, which just creates an empty dialog between the user and the robot before it finally replies to the empty queries (see the second screenshot below).
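A minimal sketch of one possible fix, assuming the chat input is a controlled component (the function name here is illustrative, not from the ai-jsx codebase): only allow a submit when the text is non-empty and no request is already in flight.

```typescript
// Hypothetical guard for the docs-chat input: the submit handler would call
// this before appending a user turn, so repeated Enter presses can no longer
// create empty user/robot exchanges.
function shouldSubmit(text: string, pending: boolean): boolean {
  return text.trim().length > 0 && !pending;
}
```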
TypeScript can be a lot for newcomers. We should have examples that show usage with just JS.
In the "run some queries" section:
The suggested queries in this section produce the expected results, so I am using some unknown corpora.
Namely, document that in cases like:

```tsx
const memoized = memo(<Component />);
return (
  <>
    <OpenAI chatModel="gpt-4">{memoized}</OpenAI>
    <OpenAI chatModel="gpt-3.5-turbo">{memoized}</OpenAI>
  </>
);
```

the `Component` will not be memoized, whereas in cases like:

```tsx
const memoized = memo(AI.withContext(<Component />, renderContext));
return (
  <>
    <OpenAI chatModel="gpt-4">{memoized}</OpenAI>
    <OpenAI chatModel="gpt-3.5-turbo">{memoized}</OpenAI>
  </>
);
```

the `Component` will be memoized and will effectively disregard the `<OpenAI>` tags.
Right now our `<DocQA>` component depends on a memory store. This would allow you to have a Pinecone provider to access embeddings stored in a Pinecone DB.
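To make that concrete, here is a hedged sketch of what a pluggable store interface might look like; the names (`EmbeddingStore`, `MemoryStore`) are illustrative assumptions, not the actual ai-jsx API. A Pinecone provider would implement the same interface against the Pinecone client.

```typescript
// Illustrative provider interface: DocQA would depend on this abstraction
// instead of a concrete in-memory store.
interface EmbeddingStore {
  upsert(id: string, text: string, embedding: number[]): Promise<void>;
  // Returns the ids of the topK most similar stored embeddings.
  query(embedding: number[], topK: number): Promise<string[]>;
}

// The current behavior, expressed as one implementation of the interface.
class MemoryStore implements EmbeddingStore {
  private items = new Map<string, { text: string; embedding: number[] }>();

  async upsert(id: string, text: string, embedding: number[]): Promise<void> {
    this.items.set(id, { text, embedding });
  }

  async query(embedding: number[], topK: number): Promise<string[]> {
    // Dot product as a stand-in similarity measure.
    const score = (a: number[], b: number[]) => a.reduce((sum, x, i) => sum + x * b[i], 0);
    return [...this.items.entries()]
      .sort(([, a], [, b]) => score(b.embedding, embedding) - score(a.embedding, embedding))
      .slice(0, topK)
      .map(([id]) => id);
  }
}
```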
Error condition 1: https://fixie-ai.slack.com/archives/C05J75EFZ6X/p1693945790214519?thread_ts=1693945728.012129&cid=C05J75EFZ6X
Error condition 2: The user-defined `npm run dev` command fails. In this case, the CLI tool should present a clean message to the user telling them to fix their command. (One way it could fail is if the user references `fixie-serve-bin` but fails to install `@fixieai/sdk`.) In the event of an error, the CLI tool could try running `npm run dev` locally to ensure it succeeds.
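A sketch of that pre-flight check, under stated assumptions: `checkDevCommand` is a hypothetical helper, and in practice the check would need a timeout, since a working dev server never exits on its own. This only illustrates the clean-error path.

```typescript
import { spawnSync } from "node:child_process";

// Hypothetical pre-flight check for the CLI: run the user's dev command once
// and, on failure, return an actionable message instead of a raw stack trace.
// Returns null when the command succeeds.
function checkDevCommand(command: string): string | null {
  const result = spawnSync(command, { shell: true, encoding: "utf8" });
  if (result.status === 0) {
    return null;
  }
  return (
    `Your configured dev command ("${command}") exited with status ${result.status}.\n` +
    `Please fix it and try again. (A common cause is referencing fixie-serve-bin ` +
    `without installing @fixieai/sdk.)`
  );
}
```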
Love what @juberti is building with Fixie voice, and the open-source benchmarking sites for ASR and TTS. Something I personally want to test is how noise cancellation algorithms like Picovoice's Koala affect these ASRs' transcriptions, and I think that benchmarking site would be a good place for me to contribute and test that out.
Just wanted to check with @juberti whether that's a welcome addition for me to contribute to the site. I'm not sure exactly how you'd imagine the layout, but I'd like to compare the transcriptions both with and without the noise cancellation. Would you want it as a separate page, or a checkbox on top of the ASR page to enable noise cancellation? Happy to discuss!
I am using this in a React project with Vite and getting this error:
```
✘ [ERROR] Failed to resolve entry for package "@nick.heiner/openai-edge". The package may have incorrect main/module/exports specified in its package.json. [plugin vite:dep-pre-bundle]

    node_modules/vite/node_modules/esbuild/lib/main.js:1360:21:
      1360 │ let result = await callback({
```
Mobile Client Error

```
Error: TextEncoder is required for this module to work in the browser
ERROR Invariant Violation: Failed to call into JavaScript module method AppRegistry.runApplication(). Module has not been registered as callable. Registered callable JavaScript modules (n = 11): Systrace, JSTimers, HeapCapture, SamplingProfiler, RCTLog, RCTDeviceEventEmitter, RCTNativeAppEventEmitter, GlobalPerformanceLogger, JSDevSupportModule, HMRClient, RCTEventEmitter.
```

A frequent cause of the error is that the application entry file path is incorrect. This can also happen when the JS bundle is corrupt or there is an early initialization error when loading React Native.
We should create a draft of this and share it with the community. We should also make it so that someone has to adopt the code of conduct before they can contribute to the repo.
The image for the Loom demo video is broken in the README. The README needs to be fixed and the npm package patched.
See adriatic/fixie-logs#2 for details.
Need to clarify that there are public corpora.
Can I use it in React Native?
Currently, our demos only support streaming text. The next step is to get JIT'd UI working.
The autogenerated types from `typedoc` aren't super readable.
Also, sometimes when we link to generated docs from hand-written docs, we get a build failure. For example:
When you render on the top level, you use `AI.createRenderContext()` to create a [`renderContext`](../api/interfaces/RenderContext.md)
Locally, I run `yarn workspace docs build`, and it works. But on CI, I get a failure:

```
YN0000: │ ai-jsx-monorepo-root@workspace:. STDOUT docs:build: Error: Docs markdown link couldn't be resolved: (../api/interfaces/RenderContext.md) in "/vercel/path0/packages/docs/docs/guides/rendering.md" for version current
```
I think this is because the API docs aren't being generated prior to the link checker running – there's some race condition that only manifests on CI.
Hello Community Maintainers,
I thought it would be nice to have a demo video with a YouTube link in the repository for better outreach.
Add support for logging while streaming when using `toStreamResponse` (or document how, if it's already possible):

```tsx
  return toStreamResponse(
    <ChatCompletion>
      <SystemMessage>
        You are an assistant who writes poems. If the user asks for anything else, politely decline.
      </SystemMessage>
      <UserMessage>Write a poem about {topic}.</UserMessage>
    </ChatCompletion>
  );
}
```
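One possible approach (an assumption, not an existing ai-jsx API): since `toStreamResponse` returns a standard `Response`, its body can be tee'd so one branch streams to the client while the other is drained for logging. A sketch:

```typescript
// Hypothetical wrapper: tee the Response body so every streamed chunk is also
// passed to a logging callback. Requires a runtime with web streams
// (Node 18+, edge runtimes).
function withStreamLogging(response: Response, log: (chunk: string) => void): Response {
  if (!response.body) {
    return response;
  }
  const [toClient, toLogger] = response.body.tee();
  // Drain the logging branch in the background.
  (async () => {
    const decoder = new TextDecoder();
    const reader = toLogger.getReader();
    for (;;) {
      const { done, value } = await reader.read();
      if (done) break;
      log(decoder.decode(value, { stream: true }));
    }
  })();
  return new Response(toClient, { status: response.status, headers: response.headers });
}
```

The caller would wrap the return value, e.g. `withStreamLogging(toStreamResponse(...), console.log)`.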
Let users edit and change avatars
If a component yields two non-append-only frames, the append-only render will not be append-only. The fix is to disable yielding intermediate values from non-append-only streams when processing an AsyncIterator in `renderStream`.
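A sketch of the proposed behavior, under an assumed frame model (frames are cumulative strings, and a frame that does not extend the previously emitted text is an intermediate value to be skipped):

```typescript
// Skip intermediate frames that would violate the append-only contract,
// yielding only the suffix deltas of frames that extend prior output.
async function* appendOnlyDeltas(frames: AsyncIterable<string>): AsyncGenerator<string> {
  let emitted = "";
  for await (const frame of frames) {
    if (frame.startsWith(emitted)) {
      const delta = frame.slice(emitted.length);
      if (delta) {
        yield delta;
      }
      emitted = frame;
    }
    // Non-extending (non-append-only) intermediate frames are dropped.
  }
}
```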
Need to implement corpus deletion. Then update the Corpus API examples to use the client for the delete example in JS.
Are there plans to support local backends instead of just OpenAI/Claude, etc.?
When trying to run the RedditQA example, the first step is to load the docs with `yarn tsx packages/examples/src/bakeoff/reddit-qa/load-articles.mts`. This failed for me because tsx was missing. Not sure if I missed a step along the way, but I fixed this by running `yarn add tsx`.
What about folks that don't use React?
The context is described here. The oddity is this: I manually edited the file index.js (which was initially created the first time this app ran), something I should not be doing. The next time I ran `npm run dev`, index.js was left unchanged (my edits were still there).
This of course has huge potential for problems, and the behavior could be explained by an access-rights issue with the script `npm run build && npm run serve`.
Observed by running VSCode version 1.8.4 (Universal) on macOS Sonoma 14.0.
It would be great to have all the examples in the docs and elsewhere integrated with CodeSandbox. This makes it nicer to navigate the code, alter it, etc.
Trying to run `yarn tsx packages/examples/src/bakeoff/reddit-qa/ai-jsx.tsx` and it throws with the below. This is after running `yarn tsx packages/examples/src/bakeoff/reddit-qa/load-articles.mts`.
```
ben@Bens-MacBook-Air ai-jsx % yarn tsx packages/examples/src/bakeoff/reddit-qa/ai-jsx.tsx
/Users/ben/Dropbox/Mac (2)/Documents/GitHub/ai-jsx/packages/examples/src/bakeoff/reddit-qa/ai-jsx.tsx:34
  throw Error('No documents available. Did you run load-articles first?');
        ^
Error: No documents available. Did you run load-articles first?
    at <anonymous> (/Users/ben/Dropbox/Mac (2)/Documents/GitHub/ai-jsx/packages/examples/src/bakeoff/reddit-qa/ai-jsx.tsx:34:9)

Node.js v20.4.0
```
`.env` has a variable for DOCS_URL that gets set to localhost for local development. Automate setting this to docs.ai-jsx.com when deploying to production.
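A minimal sketch of that automation, assuming a Vercel deployment (where `VERCEL_ENV` is set automatically); the fallback values are assumptions based on the issue, not the repo's actual configuration:

```typescript
// Resolve DOCS_URL from the deployment environment instead of hand-editing .env.
function resolveDocsUrl(env: Record<string, string | undefined>): string {
  if (env.VERCEL_ENV === "production") {
    return "https://docs.ai-jsx.com";
  }
  // Local development: honor an explicit .env override, else default to localhost.
  return env.DOCS_URL ?? "http://localhost:3000";
}
```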
I've set `EXPORT ANTHRPIC_API_KEY=sk-ant-api03-xxxxxxxxxxxxxxxxxxxxxxxxxxxx` in .bashrc and during individual terminal sessions, but it doesn't work as it does with OpenAI.
Where do I set the following, or another option pointing to Anthropic?
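For comparison, a setup that bash will actually apply: `export` must be lowercase, and the variable is conventionally spelled `ANTHROPIC_API_KEY` (the command quoted above uses uppercase `EXPORT` and spells the variable `ANTHRPIC_API_KEY`, neither of which bash will treat as an environment-variable export).

```shell
# In ~/.bashrc or the current session. Lowercase `export`; note the
# variable spelling ANTHROPIC_API_KEY.
export ANTHROPIC_API_KEY=sk-ant-api03-xxxxxxxxxxxxxxxxxxxxxxxxxxxx
```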
In https://docs.ai-jsx.com/sidekicks/sidekicks-docsqa change Example Corpus note to be:
Rather than creating a corpus from scratch, you can also query an existing corpus that has a bunch of web pages with exciting facts about foxes. This corpus has ID 44094d5a-f817-4c2e-a2a4-8f8a0c936d0f and can be used in your code if you don't want to take the time to create a corpus of your own. If that is the case, you can jump ahead to the section below titled "Querying the Corpus" (https://docs.ai-jsx.com/sidekicks/sidekicks-docsqa#querying-the-corpus). Otherwise, you can continue with "Adding Sources to the Corpus" and make your own corpus to use.
The pages "Quickstart: Fixie Sidekicks" and "Part 1 - Introduction to Sidekicks" are too similar to each other to remain as they are now.
I believe the fixie.ai docs collection of pages is being rewritten; please merge the two pages pointed out above as part of this rewrite.
Make it explicit in the content that the listed UUIDs are examples only and need to be swapped out with one that belongs to you.
Bring in the darkness for the dashboard (i.e., add dark mode).
I deployed (to the cloud) the sample https://docs.ai-jsx.com/sidekicks/sidekicks-quickstart and asked the question "show me what can you help with". This resulted in the following error from `lookUpGitHubKnowledgeBase`:

```
This model response had an error: "Error during generation: AI.JSX(1032): OpenAI API Error: 400 This model's maximum context length is 4097 tokens. However, your messages resulted in 6069 tokens (5970 in the messages, 99 in the functions). Please reduce the length of the messages or functions. It's unclear whether this was caused by a bug in AI.JSX, in your code, or is an expected runtime error.
```

I have no doubt that I exceeded my token limit; I am reporting it just to be safe.
Added later: I reran with a different (but similar) question: "show me what can you help with", and this time everything went fine:
I can help you with various tasks related to Git and GitHub, including:

- Providing guidance on Git and GitHub workflows.
- Assisting with creating, cloning, and initializing repositories.
- Explaining how to commit changes and manage branches in Git.
- Guiding you through the process of creating and merging pull requests on GitHub.
- Helping you resolve merge conflicts in Git.
- Assisting with configuring and using Git remotes.
- Explaining how to collaborate with others using Git and GitHub.
- Providing information on Git and GitHub APIs and how to use them.

If you have any specific questions or need assistance with a particular task, please let me know, and I'll be glad to help you!
Perhaps this LLM is too smart for me: running it with the first question, which had resulted in the error "This model's maximum context length is 4097 tokens", now responded fine.
Note: I am fascinated by the difference in answering the first and second questions. Debugging this seems like a nightmare 😄
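For what it's worth, a generic mitigation for this class of error is trimming the oldest context until a token estimate fits the model's window. This is an illustrative sketch, not how the Sidekick actually manages its context, and the 4-characters-per-token estimate is a rough heuristic:

```typescript
// Drop the oldest messages until an approximate token count fits maxTokens.
// estimate() is a crude chars/4 heuristic; a real implementation would use a
// proper tokenizer for the target model.
function trimToBudget(messages: string[], maxTokens: number): string[] {
  const estimate = (m: string) => Math.ceil(m.length / 4);
  const kept: string[] = [];
  let total = 0;
  // Walk from newest to oldest, keeping messages while the budget allows.
  for (let i = messages.length - 1; i >= 0; i--) {
    const tokens = estimate(messages[i]);
    if (total + tokens > maxTokens) break;
    kept.unshift(messages[i]);
    total += tokens;
  }
  return kept;
}
```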
When logged in to the console, the session doesn't seem to persist and requires repeated logins; even when the session does persist, the UI doesn't change to indicate it. Also, the user data from console.fixie.ai doesn't carry into app.fixie.ai. I think that after the user logs in, the red-circled area should toggle to the user's avatar image to show they are signed in.
interactivity:
latency:
https://arethetypeswrong.github.io/?p=fixie%402.2.0
And in general, trying to import it into a NextJS project is challenging.
The Reddit-QA README refers to `yarn tsx packages/examples/src/bakeoff/reddit-qa/ai-jsx-imperative.ts`. However, the file ai-jsx-imperative.ts is not there.
We should move the experimental version of NextJS + RSC into an experimental folder so people can explore the ideas and contribute.
In the "Sending messages to your Sidekick" section of the quickstart, add a section for "### Method 3: Via Embedding in Another Web App" with some instructions. Consider adding a sample web page to the template that can be modified to show this.