
llm-answer-engine's Introduction

Perplexity-Inspired LLM Answer Engine


This repository contains the code and instructions needed to build a sophisticated answer engine that leverages the capabilities of Groq, Mistral AI's Mixtral, Langchain.JS, Brave Search, Serper API, and OpenAI. Designed to efficiently return sources, answers, images, videos, and follow-up questions based on user queries, this project is an ideal starting point for developers interested in natural language processing and search technologies.

YouTube Tutorials

Technologies Used

  • Next.js: A React framework for building server-side rendered and static web applications.
  • Tailwind CSS: A utility-first CSS framework for rapidly building custom user interfaces.
  • Vercel AI SDK: A library for building AI-powered streaming text and chat UIs.
  • Groq & Mixtral: Technologies for processing and understanding user queries.
  • Langchain.JS: A JavaScript library focused on text operations, such as text splitting and embeddings.
  • Brave Search: A privacy-focused search engine used for sourcing relevant content and images.
  • Serper API: Used for fetching relevant video and image results based on the user's query.
  • OpenAI Embeddings: Used for creating vector representations of text chunks.
  • Cheerio: Utilized for HTML parsing, allowing the extraction of content from web pages.
  • Ollama (Optional): Used for streaming inference and embeddings.
  • Upstash Redis Rate Limiting (Optional): Used for setting up rate limiting for the application.
  • Upstash Semantic Cache (Optional): Used for caching data for faster response times.

Getting Started

Prerequisites

  • Ensure Node.js and npm are installed on your machine.
  • Obtain API keys from OpenAI, Groq, Brave Search, and Serper.

Obtaining API Keys

  • OpenAI API Key: Generate one at https://platform.openai.com/
  • Groq API Key: Get one at https://console.groq.com/
  • Brave Search API Key: Sign up at https://brave.com/search/api/
  • Serper API Key: Sign up at https://serper.dev/

Installation

  1. Clone the repository:
    git clone https://github.com/developersdigest/llm-answer-engine.git
    
  2. Install the required dependencies:
    npm install
    
    or
    bun install
    
  3. Create a .env file in the root of your project and add your API keys:
    OPENAI_API_KEY=your_openai_api_key
    GROQ_API_KEY=your_groq_api_key
    BRAVE_SEARCH_API_KEY=your_brave_search_api_key
    SERPER_API=your_serper_api_key
    

Running the Server

To start the server, execute:

npm run dev

or

bun run dev

The server will be listening on http://localhost:3000 (the default Next.js port).

Editing the Configuration

The configuration file is located at app/config.tsx. You can modify the following values (a minimal sketch of the file follows this list):

  • useOllamaInference: false
  • useOllamaEmbeddings: false
  • inferenceModel: 'mixtral-8x7b-32768'
  • inferenceAPIKey: process.env.GROQ_API_KEY
  • embeddingsModel: 'text-embedding-3-small'
  • textChunkSize: 800
  • textChunkOverlap: 200
  • numberOfSimilarityResults: 2
  • numberOfPagesToScan: 10
  • nonOllamaBaseURL: 'https://api.groq.com/openai/v1'
  • useFunctionCalling: true
  • useRateLimiting: false
  • useSemanticCache: false
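
For reference, app/config.tsx simply exports these values as a plain object; a minimal sketch with the defaults above (the comments are illustrative, not from the repo):

export const config = {
    useOllamaInference: false,
    useOllamaEmbeddings: false,
    inferenceModel: 'mixtral-8x7b-32768', // Groq model ID; other OpenAI-compatible model IDs also work
    inferenceAPIKey: process.env.GROQ_API_KEY,
    embeddingsModel: 'text-embedding-3-small', // OpenAI embeddings model
    textChunkSize: 800,
    textChunkOverlap: 200,
    numberOfSimilarityResults: 2,
    numberOfPagesToScan: 10,
    nonOllamaBaseURL: 'https://api.groq.com/openai/v1',
    useFunctionCalling: true,
    useRateLimiting: false,
    useSemanticCache: false,
};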

Function Calling Support (Beta)

Currently, function calling is supported with the following capabilities:

  • Maps and Locations (Serper Locations API)
  • Shopping (Serper Shopping API)
  • TradingView Stock Data (Free Widget)

If there is any functionality you would like to see here, please open an issue or submit a PR. To enable function calling and conditional streaming UI (currently in beta), ensure useFunctionCalling is set to true in the config file.

Ollama Support (Partially supported)

Currently, streaming text responses are supported for Ollama, but follow-up questions are not yet supported.

Embeddings are supported; however, time-to-first-token can be quite long when using both a local embedding model and a local model for the streaming inference. I recommend decreasing some of the RAG values specified in the app/config.tsx file to reduce the time-to-first-token when using Ollama.

To get started, make sure Ollama is running on your local machine with the model you would like to use, set that model within the config, and set useOllamaInference and/or useOllamaEmbeddings to true (a sketch of these values follows below).

Note: When useOllamaInference is set to true, the model will be used for text generation, but the follow-up questions inference step will be skipped.

More info: https://ollama.com/blog/openai-compatibility
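
A sketch of the relevant config values for a local Ollama setup (the model names are examples; any model pulled via Ollama works, and 'ollama' is the default placeholder API key):

export const config = {
    useOllamaInference: true,
    useOllamaEmbeddings: true,
    inferenceModel: 'mistral', // any model pulled via `ollama pull`
    inferenceAPIKey: 'ollama', // placeholder key for Ollama's OpenAI-compatible API
    embeddingsModel: 'nomic-embed-text', // local embeddings model
    textChunkSize: 500, // smaller values reduce time-to-first-token locally
    textChunkOverlap: 100,
    numberOfSimilarityResults: 2,
    numberOfPagesToScan: 4,
    nonOllamaBaseURL: 'https://api.groq.com/openai/v1',
};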

Roadmap

  • [ ] Add AI Gateway to support multiple models and embeddings (OpenAI, Azure OpenAI, Anyscale, Google Gemini & Palm, Anthropic, Cohere, Together AI, Perplexity, Mistral, Nomic, AI21, Stability AI, DeepInfra, Ollama, etc.) https://github.com/Portkey-AI/gateway
  • [ ] Add a settings component to allow users to select the model, embeddings model, and other parameters from the UI
  • [ ] Add support for follow-up questions when using Ollama
  • [Completed] Add support for semantic caching to improve response times
  • [Completed] Add support for dynamic and conditionally rendered UI components based on the user's query
  • [Completed] Add dark mode support based on the user's system preference

Backend + Node Only Express API

Watch the Express tutorial here for a detailed guide on setting up and running this project. In addition to the Next.js version of the project, there is a backend-only version that uses Node.js and Express, located in the 'express-api' directory. This is a standalone version of the project that can be used as a reference for building a similar API. A README in the 'express-api' directory explains how to run it.

Upstash Redis Rate Limiting

Watch the Upstash Redis Rate Limiting tutorial here for a detailed guide on setting up and running this project. Upstash Redis is a service with a free tier that allows you to set up rate limiting for your application, and it provides a simple interface for configuring and managing limits. With Upstash, you can set limits on the number of requests per user, IP address, or other criteria, which helps prevent abuse and keeps your application from being overwhelmed with requests.
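
As an illustration, rate limiting with the @upstash/ratelimit package typically looks like the sketch below (a minimal example, not necessarily how this repo wires it in; the identifier and limits are placeholders):

import { Ratelimit } from '@upstash/ratelimit';
import { Redis } from '@upstash/redis';

// Redis.fromEnv() reads UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN
const ratelimit = new Ratelimit({
    redis: Redis.fromEnv(),
    limiter: Ratelimit.slidingWindow(10, '10 s'), // 10 requests per 10 seconds
});

export async function checkRateLimit(identifier: string): Promise<boolean> {
    // identifier could be a user ID or IP address
    const { success } = await ratelimit.limit(identifier);
    return success;
}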

Contributing

Contributions to the project are welcome. Feel free to fork the repository, make your changes, and submit a pull request. You can also open issues to suggest improvements or report bugs.

License

This project is licensed under the MIT License.

Star History Chart

I'm the developer behind Developers Digest. If you find my work helpful or enjoy what I do, consider supporting me.

llm-answer-engine's People

Contributors

amacsmith, developersdigest, ftoppi, linbojin, qin2dim, starbuck100


llm-answer-engine's Issues

OpenAI API

There is another OpenAI API key being set. I removed my own from .env and another one is still being used. Is this my configuration, or is it being set dynamically somewhere else in the code?

Node.js error

Hello,

I'm experiencing the following error on any search:

llm-answer-engine-1  | There was a problem with your fetch operation: [Error: Network response was not ok. Status: 429]
llm-answer-engine-1  | Error: Network response was not ok. Status: 429
llm-answer-engine-1  |     at getImages (webpack-internal:///(rsc)/./app/action.tsx:201:19)
llm-answer-engine-1  |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
llm-answer-engine-1  |     at async Promise.all (index 0)
llm-answer-engine-1  |     at async eval (webpack-internal:///(rsc)/./app/action.tsx:325:69)
llm-answer-engine-1  | The streamable UI has been slow to update. This may be a bug or a performance issue or you forgot to call `.done()`.

I updated to the latest repo version and the same error is still happening.

How can I set a proxy?

When I run:
npm run dev

I get this info:

[email protected] dev
next dev

▲ Next.js 14.1.2

✓ Ready in 1763ms
○ Compiling / ...
✓ Compiled / in 4.8s (10805 modules)
⨯ ReferenceError: window is not defined
at webpack_require (/Users/xxl/llm-answer-engine/.next/server/edge-runtime-webpack.js:37:33)
at fn (/Users/xxl/llm-answer-engine/.next/server/edge-runtime-webpack.js:318:21)
at eval (./components/answer/Map.tsx:10:65)
at (ssr)/./components/answer/Map.tsx (/Users/xxl/llm-answer-engine/.next/server/app/page.js:188:1)
at webpack_require (/Users/xxl/llm-answer-engine/.next/server/edge-runtime-webpack.js:37:33)
at fn (/Users/xxl/llm-answer-engine/.next/server/edge-runtime-webpack.js:318:21)
✓ Compiled in 460ms (5070 modules)
Error fetching videos: [Error: Network response was not ok. Status: 403]
Error: Network response was not ok. Status: 403
at getVideos (webpack-internal:///(rsc)/./app/action.tsx:233:19)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async Promise.all (index 2)
at async eval (webpack-internal:///(rsc)/./app/action.tsx:299:69)
There was a problem with your fetch operation: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching search results: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
The streamable UI has been slow to update. This may be a bug or a performance issue or you forgot to call .done().

Remote Agentic AI Backend with LangServe, LangGraph consumed as RemoteRunnable in NextJS with AI/RSC

First of all, thank you @developersdigest for this excellent YouTube tutorial and repo.

Please take the following as an idea and potential feature request.

I feel it is not ideal to mix all the AI logic with the web app. I would like to see a stable example of separating the concerns of a "complex AI backend" from the consumer Next.js web app.

Imagine the following setup for an Answer Engine:

Backend:

  • Python - the lingua franca of LLMs and home to the better part of the LangChain libs
  • LangServe (with LCEL streaming) as API
  • LangGraph for complex state & process management and tool invocation
  • Persistence of history (graph) e.g. in EdgeDB or Zep
  • Rate limiting
  • Semantic Caching

Frontend:

  • NextJS app
  • Streaming AI backend consumed as RemoteRunnable of LangChain
  • Vercel AI SDK with AI/RSC for streaming React components from AI backend
  • Able to stream intermediate agentic results / graph state from remote backend

There are multiple requests for such a setup, and nobody has come up with a stable solution.

I have filed a similar request in the Vercel AI SDK repo:
vercel/ai#1506

ollama interface problem

 ✓ Ready in 667ms
 ○ Compiling / ...
 ✓ Compiled / in 1628ms (1571 modules)
 ✓ Compiled in 293ms (470 modules)
There was a problem with your fetch operation: [Error: Network response was not ok. Status: 422]
Error: Network response was not ok. Status: 422
    at getImages (webpack-internal:///(rsc)/./app/action.tsx:178:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Promise.all (index 0)
    at async eval (webpack-internal:///(rsc)/./app/action.tsx:293:43)
Error fetching search results: [Error: HTTP error! status: 422]
Error fetching videos: [Error: Network response was not ok. Status: 403]
The streamable UI has been slow to update. This may be a bug or a performance issue or you forgot to call `.done()`.


app/config.tsx

// - The below are going to be the default values, eventually this will move to a UI component so it can be easily changed by the user
// - To enable + use Ollama models, ensure inference and/or embeddings model are downloaded and ollama is running https://ollama.com/library
// - Icons within UI are not yet dynamic, to change currently, you must change the img src path in the UI component
// - IMPORTANT: when Ollama Embeddings + Ollama inference enabled at the same time, this can cause time-to-first-token to be quite long
// - IMPORTANT: Follow-up questions are not yet implemented with Ollama models, only OpenAI compatible models that use {type: "json_object"}

export const config = {
    useOllamaInference: true,
    useOllamaEmbeddings: true,
    inferenceModel: 'mistral', // Groq: 'mixtral-8x7b-32768', 'gemma-7b-it' // OpenAI: 'gpt-3.5-turbo', 'gpt-4' // Ollama: 'mistral', 'llama2', etc.
    inferenceAPIKey: process.env.GROQ_API_KEY, // Groq: process.env.GROQ_API_KEY // OpenAI: process.env.OPENAI_API_KEY // Ollama: 'ollama' is the default
    embeddingsModel: 'llama2', // Ollama: 'llama2', 'nomic-embed-text' // OpenAI: 'text-embedding-3-small', 'text-embedding-3-large'
    textChunkSize: 500, // Recommended to decrease for Ollama
    textChunkOverlap: 200, // Recommended to decrease for Ollama
    numberOfSimilarityResults: 4, // Number of similarity results to return per page
    numberOfPagesToScan: 5, // Recommended to decrease for Ollama
    nonOllamaBaseURL: 'https://api.groq.com/openai/v1', // Groq: https://api.groq.com/openai/v1 // OpenAI: https://api.openai.com/v1
};

Local File Search

Objective: Develop an enhanced search tool capable of recursively searching through files in a specified directory and its subdirectories. This tool will search for files containing a specified search term within the filename, metadata, or contents, with case-insensitive matching.

Key Features:

  • Advanced Search Capabilities: Utilize semantic search powered by a local RAG to index and search contents. The search will consider various file types and metadata attributes.
  • Integration of Local and Online Data: Seamlessly integrate data from local files with relevant online content to provide enriched search results. Users can choose to start with local data and pull supplementary information from online sources, or vice versa.
  • User Interface: Offer both a command-line interface for advanced users and a graphical user interface for ease of use, including search parameter configuration and real-time results display.
  • Performance: Implement efficient indexing and caching strategies to handle large datasets with minimal performance impact.

Use Cases:

  • Academics researching historical documents could find references and additional resources by searching through both local copies of primary sources and enriched online databases.
  • Software developers could use the tool to locate specific code snippets that are both locally available and in public repositories.

Technologies Used: Python, Elasticsearch for indexing
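
As a baseline before any semantic indexing, the recursive, case-insensitive filename/content match could look something like this sketch (written in TypeScript for consistency with the rest of the repo, although the request names Python; everything here is illustrative):

import { promises as fs } from 'fs';
import path from 'path';

// Recursively collect files under `dir` whose name or contents contain `term`
// (case-insensitive). Metadata matching and semantic search would layer on top.
async function searchFiles(dir: string, term: string): Promise<string[]> {
    const needle = term.toLowerCase();
    const matches: string[] = [];
    for (const entry of await fs.readdir(dir, { withFileTypes: true })) {
        const fullPath = path.join(dir, entry.name);
        if (entry.isDirectory()) {
            matches.push(...await searchFiles(fullPath, term));
        } else if (entry.isFile()) {
            const content = await fs.readFile(fullPath, 'utf8').catch(() => '');
            if (entry.name.toLowerCase().includes(needle) || content.toLowerCase().includes(needle)) {
                matches.push(fullPath);
            }
        }
    }
    return matches;
}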

'"ai/rsc"' has no exported member named 'readStreamableValue'

I installed all the dependencies and ran the code locally. I searched for 'readStreamableValue' in the entire repo and couldn't find it; this is the path to the file in which I get the error: "llm-answer-engine-main/app/page.tsx".
Path to ai/rsc: "llm-answer-engine-main/node_modules/ai/rsc/dist/"

Can I only use GPT's API?

Because I don't have a credit card, I can't get the API keys for Brave and the one below for free. Can I use this repo without them?

Hey, I am on Windows 11 and get this error

  1. Received POST request
  2. Destructured request data
  3. Initializing Search Engine Process
  4. Rephrasing input
  5. Rephrased input and got answer from Groq
  6. Rephrased message and got documents from BraveSearch
  7. Fetching page content for https://smodin.io/free-english-rewriter-and-spinner
  8. Fetching page content for https://www.prepostseo.com/paraphrasing-tool
  9. Fetching page content for https://products.groupdocs.app/rewriter/text
  10. Fetching page content for https://paraphraz.it/
  11. Extracting main content from HTML for https://products.groupdocs.app/rewriter/text
  12. Extracting main content from HTML for https://smodin.io/free-english-rewriter-and-spinner
  13. Extracting main content from HTML for https://paraphraz.it/
  14. Extracting main content from HTML for https://www.prepostseo.com/paraphrasing-tool
  15. Processed 1 sources for https://products.groupdocs.app/rewriter/text
    file:///C:/Users/Toprak/Desktop/fetcher/llm-answer-engine/node_modules/@langchain/openai/dist/embeddings.js:203
    input: this.stripNewLines ? text.replace(/\n/g, " ") : text,
    ^

TypeError: Cannot read properties of undefined (reading 'replace')
at OpenAIEmbeddings.embedQuery (file:///C:/Users/Toprak/Desktop/fetcher/llm-answer-engine/node_modules/@langchain/openai/dist/embeddings.js:203:46)
at MemoryVectorStore.similaritySearch (file:///C:/Users/Toprak/Desktop/fetcher/llm-answer-engine/node_modules/@langchain/core/dist/vectorstores.js:104:90)
at fetchAndProcess (file:///C:/Users/Toprak/Desktop/fetcher/llm-answer-engine/index.js:97:30)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async Promise.all (index 2)
at async searchEngineForSources (file:///C:/Users/Toprak/Desktop/fetcher/llm-answer-engine/index.js:57:12)
at async file:///C:/Users/Toprak/Desktop/fetcher/llm-answer-engine/index.js:100:19

Node.js v20.9.0

Please create a Docker image for this repo

Please create a Docker image for this repo, making it extremely easy for newcomers to deploy this project on their own server.

I can send a pull request with a Dockerfile and a docker-compose.yml file. Let me know if you're up for it.

Can it access the whole conversation?

It seems that if I ask a follow-up question in the same thread, it has no idea about the previous message content. Is that true? If so, do you have plans to support this feature in the future?

Run without GUI?

Awesome project, thanks!
Quick question: any guidance on how to run this without the GUI so I can use the output in my automation?
Cheers,
Steven

Feature Enhancement: Implement an Authentication Mechanism for Enhanced Security

As the project relies on Next.js as its primary technology, deploying it on Vercel seems like a natural fit. However, we need to address some concerns, specifically regarding authentication. To prevent unauthorized access, we should implement authentication measures. One possible solution is to add an authentication code as an environment variable. If you’re short on time, I’m more than happy to contribute to this effort. I can submit a pull request to implement a simple authentication mode.
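
A minimal sketch of the env-variable auth-code idea as Next.js middleware (the header name and AUTH_CODE variable are hypothetical, not part of the repo):

// middleware.ts -- hypothetical sketch
import { NextRequest, NextResponse } from 'next/server';

export function middleware(request: NextRequest) {
    // compare a shared secret from the request against an environment variable
    const authCode = request.headers.get('x-auth-code');
    if (authCode !== process.env.AUTH_CODE) {
        return new NextResponse('Unauthorized', { status: 401 });
    }
    return NextResponse.next();
}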

Open Source embedding - Nomic

Hello. Thank you for open-sourcing this awesome tool.
I was watching the video by Matthew Berman, and he suggested improving the open-source nature of the project by replacing the OpenAI embeddings with the open-source Nomic embeddings.

Which APIs receive the entire web crawl as input or output tokens?

Kindly help me understand which API calls include the entire crawl as tokens; this will help to assess the cost correctly.

Serper - per request - no issues here.
OpenAI (embeddings) - input + output tokens for the text alone, or including the web crawl data tokens?
Groq/Mixtral (for query analysis) - input + output text tokens alone, or including the web crawl data tokens?

Thanks in advance for your response.

Suggestions: Use Readability (JSDOM) and Query de-structuring

Suggestions:

  1. Use Mozilla's Readability (requires JSDOM); it does a great job reducing the size of the content from the webpage for the context injection. https://github.com/mozilla/readability

  2. Use query decomposition rather than rephrasing, and do parallel searches; this helps for more complex queries (but does multiply the hits to the Brave/search engine by 3). E.g. "Your role is to generate a few short and specific search queries to answer the QUERY below. List 3 options, 1 per line." - then parse the response with a split on \n and remove numbers, or use a JSON response (a sketch of such a helper follows the code sample below).
    Unchecked code sample:

import _ from 'lodash';

// get variations of the input (getVariations and searchWithBrave are the
// hypothetical helpers described above: LLM query decomposition and a Brave Search wrapper)
const queryVariations = [...(await getVariations(input)), input];
// run the search for each variation in parallel
const braveTasks = queryVariations.map(q => searchWithBrave(q));
// flatten and remove duplicates across all variations to avoid fetching the HTML of
// the same page more than once (you could instead use the duplicate count to rerank:
// a result that shows up for more than one variation could rank higher for RAG)
const uniqueResults = _.uniqBy((await Promise.all(braveTasks)).flat(), 'url');
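
For completeness, a sketch of what the hypothetical getVariations helper could look like, using the prompt wording from suggestion 2 and the OpenAI Node SDK (the model choice and parsing are illustrative):

import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function getVariations(input: string): Promise<string[]> {
    const prompt = 'Your role is to generate a few short and specific search queries ' +
        'to answer the QUERY below. List 3 options, 1 per line.\nQUERY: ' + input;
    const completion = await openai.chat.completions.create({
        model: 'gpt-3.5-turbo',
        messages: [{ role: 'user', content: prompt }],
    });
    const text = completion.choices[0].message.content ?? '';
    // split on newlines and strip any leading list numbering, e.g. "1." or "2)"
    return text
        .split('\n')
        .map((line) => line.replace(/^\s*\d+[.)]\s*/, '').trim())
        .filter(Boolean);
}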

SUGGESTION: Tavily API !!

That would be a BANGER addition !

Tavily API is a specialized search engine that empowers large language models (LLMs) and AI agents. It delivers tailored and accurate search results optimized for retrieval-augmented generation (RAG), helping to minimize hallucinations.

Here's how Tavily API benefits your project:

  • Aggregates and filters information from over 20 online sources per API call, intelligently selecting the most relevant content for your query or task.
  • Designed specifically for AI developers and LLM agents, streamlining the process of content retrieval and formatting.
  • Offers a free tier with 1,000 monthly API calls and an interactive API playground for testing. [1][3]

Citations:
[1] https://docs.tavily.com/docs/faq
[2] https://tavily.com
[3] https://docs.tavily.com/docs/tavily-api/introduction
[4] https://python.langchain.com/docs/integrations/retrievers/tavily
[5] https://js.langchain.com/docs/integrations/retrievers/tavily
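
A rough sketch of calling Tavily's search endpoint from TypeScript (the field names follow the documented REST API, but check docs.tavily.com for the current request shape and auth scheme; the env variable name is an assumption):

async function tavilySearch(query: string) {
    const response = await fetch('https://api.tavily.com/search', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
            api_key: process.env.TAVILY_API_KEY, // assumed env var name
            query,
            search_depth: 'basic',
            max_results: 5,
        }),
    });
    if (!response.ok) throw new Error(`Tavily error: ${response.status}`);
    return response.json(); // { results: [{ title, url, content, score, ... }], ... }
}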

Open source embeddings

Right now, I have to use OpenAI embeddings for this. Can anyone provide a lead on using open-source embeddings instead, to make it completely open source?

Unable to run this on GitHub Codespaces

I'm trying to run this on GitHub Codespaces. When I run npm run dev I get the following error.

`x-forwarded-host` header with value `xxxx-xxx-xxxxxxx-3000.app.github.dev` does not match `origin` header with value `localhost:3000` from a forwarded Server Actions request. Aborting the action.
 ⨯ Error: Invalid Server Actions request.
    at AsyncLocalStorage.run (node:async_hooks:346:14)
    at AsyncLocalStorage.run (node:async_hooks:346:14)

I tried suggestions from here and here; they didn't work.
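
One possible workaround (assuming Next.js 14's serverActions.allowedOrigins option, which is intended for reverse-proxy setups like this) is to allow the forwarded Codespaces domain in next.config.js:

// next.config.js -- sketch, assuming Next.js >= 14.1
module.exports = {
    experimental: {
        serverActions: {
            // allow Server Actions requests forwarded from the Codespaces domain
            allowedOrigins: ['*.app.github.dev', 'localhost:3000'],
        },
    },
};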

The streamable UI has been slow to update. This may be a bug or a performance issue or you forgot to call `.done()`.

[email protected] dev
next dev

▲ Next.js 14.1.2

✓ Ready in 12.2s
○ Compiling / ...
✓ Compiled / in 3.4s (1588 modules)
✓ Compiled in 210ms (470 modules)
There was a problem with your fetch operation: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error: fetch failed
at context.fetch (D:\sider_search\llm-answer-engine\node_modules\next\dist\server\web\sandbox\context.js:272:38)
at fetch (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/react/cjs/react.shared-subset.development.js:182:16)
at doOriginalFetch (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:363:24)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:488:20)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:108:36)
at NoopContextManager.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:7062)
at ContextAPI.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:518)
at NoopTracer.startActiveSpan (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:18108)
at ProxyTracer.startActiveSpan (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:18869)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:97:103)
at NoopContextManager.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:7062)
at ContextAPI.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:518)
at NextTracerImpl.trace (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:97:28)
at globalThis.fetch (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:166:81)
at getImages (webpack-internal:///(rsc)/./app/action.tsx:169:32)
at eval (webpack-internal:///(rsc)/./app/action.tsx:294:13)
at $$ACTION_0 (webpack-internal:///(rsc)/./app/action.tsx:344:7)
at endpoint (webpack-internal:///(rsc)/./node_modules/next/dist/build/webpack/loaders/next-flight-action-entry-loader.js?actions=%5B%5B%22D%3A%5C%5Csider_search%5C%5Cllm-answer-engine%5C%5Capp%5C%5Caction.tsx%22%2C%5B%22%24%24ACTION_0%22%5D%5D%2C%5B%22D%3A%5C%5Csider_search%5C%5Cllm-answer-engine%5C%5Cnode_modules%5C%5Cai%5C%5Crsc%5C%5Cdist%5C%5Crsc-server.mjs%22%2C%5B%22%24%24ACTION_0%22%5D%5D%5D&client_imported=!:9:17)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async eval (webpack-internal:///(rsc)/./node_modules/ai/rsc/dist/rsc-server.mjs:1146:24)
at async $$ACTION_0 (webpack-internal:///(rsc)/./node_modules/ai/rsc/dist/rsc-server.mjs:1142:12)
at async eval (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/action-handler.js:310:31)
at async handleAction (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/action-handler.js:239:9)
at async renderToHTMLOrFlightImpl (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/app-render.js:756:33)
at async doRender (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/base-server.js:1325:30)
at async cacheEntry.responseCache.get.routeKind (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/base-server.js:1450:28)
at async eval (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/response-cache/web.js:51:36) {
cause: ConnectTimeoutError: Connect Timeout Error
at onConnectTimeout (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9145:28)
at eval (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9101:50)
at Immediate.eval [as _onImmediate] (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9133:13)
at process.processImmediate (node:internal/timers:478:21)
at process.callbackTrampoline (node:internal/async_hooks:130:17) {
code: 'UND_ERR_CONNECT_TIMEOUT'
}
}
Error fetching search results: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching image link https://i.ytimg.com/vi/mZ8BPrdKTkc/mqdefault.jpg?sqp=-oaymwEFCJQBEFM&rs=AMzJL3lRyzOU467uTpWUa49AhfweToex3g: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching image link https://i.ytimg.com/vi/-DYkuIBsWzs/mqdefault.jpg?sqp=-oaymwEFCJQBEFM&rs=AMzJL3nbR7mEEioQfWBbs17h0sksG1fneA: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching image link https://i.ytimg.com/vi/Zb31mMMkH6Q/mqdefault.jpg?sqp=-oaymwEFCJQBEFM&rs=AMzJL3l4JobnYd4EG_MjeAhBvZC8E-3X3A: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching image link https://i.ytimg.com/vi/9Nzz3R2ojjc/mqdefault.jpg?sqp=-oaymwEFCJQBEFM&rs=AMzJL3kqYSyXK6HVTu5i0lqPqQwMGC0IDw: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching image link https://i.ytimg.com/vi/v8IVq6Vxthk/mqdefault.jpg?sqp=-oaymwEFCJQBEFM&rs=AMzJL3mNiL-Y7YlaQ20BM4rjS4O2_7DCXA: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching image link https://i.ytimg.com/vi/-5tYQzum2b0/mqdefault.jpg?sqp=-oaymwEFCJQBEFM&rs=AMzJL3ldxkVaL6qvlR6BgGbYdmk3GbJV2Q: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching image link https://i.ytimg.com/vi/JXkWoUTSZGM/mqdefault.jpg?sqp=-oaymwEFCJQBEFM&rs=AMzJL3lSup36ZxbZ3oIP4-8RmWYIJDn-Mg: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching image link https://i.ytimg.com/vi/Xg4S6uMJtas/mqdefault.jpg?sqp=-oaymwEFCJQBEFM&rs=AMzJL3kqR-Td1bO789YvPGIx9JhgTVsjLg: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching image link https://i.ytimg.com/vi/-DAY8lwLN8A/mqdefault.jpg?sqp=-oaymwEFCJQBEFM&rs=AMzJL3k3GEkK8lln85ul30z_w3rMXr6wsA: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching image link https://i.ytimg.com/vi/O1DI4PjKqV4/mqdefault.jpg?sqp=-oaymwEFCJQBEFM&rs=AMzJL3kFZ2-OI0GviTWpwBkbttQnK1RzUA: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
The streamable UI has been slow to update. This may be a bug or a performance issue or you forgot to call .done().
Terminate batch job (Y/N)?
Y
(.venv) PS D:\sider_search\llm-answer-engine> npm run dev

[email protected] dev
next dev

▲ Next.js 14.1.2

✓ Ready in 3.5s
○ Compiling / ...
✓ Compiled / in 1344ms (1570 modules)
✓ Compiled in 213ms (470 modules)
There was a problem with your fetch operation: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error: fetch failed
at context.fetch (D:\sider_search\llm-answer-engine\node_modules\next\dist\server\web\sandbox\context.js:272:38)
at fetch (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/react/cjs/react.shared-subset.development.js:182:16)
at doOriginalFetch (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:363:24)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:488:20)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:108:36)
at NoopContextManager.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:7062)
at ContextAPI.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:518)
at NoopTracer.startActiveSpan (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:18108)
at ProxyTracer.startActiveSpan (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:18869)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:97:103)
at NoopContextManager.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:7062)
at ContextAPI.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:518)
at NextTracerImpl.trace (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:97:28)
at globalThis.fetch (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:166:81)
at getImages (webpack-internal:///(rsc)/./app/action.tsx:169:32)
at eval (webpack-internal:///(rsc)/./app/action.tsx:294:13)
at $$ACTION_0 (webpack-internal:///(rsc)/./app/action.tsx:344:7)
at endpoint (webpack-internal:///(rsc)/./node_modules/next/dist/build/webpack/loaders/next-flight-action-entry-loader.js?actions=%5B%5B%22D%3A%5C%5Csider_search%5C%5Cllm-answer-engine%5C%5Capp%5C%5Caction.tsx%22%2C%5B%22%24%24ACTION_0%22%5D%5D%2C%5B%22D%3A%5C%5Csider_search%5C%5Cllm-answer-engine%5C%5Cnode_modules%5C%5Cai%5C%5Crsc%5C%5Cdist%5C%5Crsc-server.mjs%22%2C%5B%22%24%24ACTION_0%22%5D%5D%5D&client_imported=!:9:17)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async eval (webpack-internal:///(rsc)/./node_modules/ai/rsc/dist/rsc-server.mjs:1146:24)
at async $$ACTION_0 (webpack-internal:///(rsc)/./node_modules/ai/rsc/dist/rsc-server.mjs:1142:12)
at async eval (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/action-handler.js:310:31)
at async handleAction (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/action-handler.js:239:9)
at async renderToHTMLOrFlightImpl (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/app-render.js:756:33)
at async doRender (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/base-server.js:1325:30)
at async cacheEntry.responseCache.get.routeKind (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/base-server.js:1450:28)
at async eval (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/response-cache/web.js:51:36) {
cause: ConnectTimeoutError: Connect Timeout Error
at onConnectTimeout (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9145:28)
at eval (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9101:50)
at Immediate.eval [as _onImmediate] (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9133:13)
at process.processImmediate (node:internal/timers:478:21)
at process.callbackTrampoline (node:internal/async_hooks:130:17) {
code: 'UND_ERR_CONNECT_TIMEOUT'
}
}
Error: fetch failed
at context.fetch (D:\sider_search\llm-answer-engine\node_modules\next\dist\server\web\sandbox\context.js:272:38)
at fetch (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/react/cjs/react.shared-subset.development.js:182:16)
at doOriginalFetch (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:363:24)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:488:20)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:108:36)
at NoopContextManager.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:7062)
at ContextAPI.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:518)
at NoopTracer.startActiveSpan (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:18108)
at ProxyTracer.startActiveSpan (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:18869)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:97:103)
at NoopContextManager.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:7062)
at ContextAPI.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:518)
at NextTracerImpl.trace (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:97:28)
at globalThis.fetch (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:166:81)
at getImages (webpack-internal:///(rsc)/./app/action.tsx:169:32)
at eval (webpack-internal:///(rsc)/./app/action.tsx:294:13)
at $$ACTION_0 (webpack-internal:///(rsc)/./app/action.tsx:344:7)
at endpoint (webpack-internal:///(rsc)/./node_modules/next/dist/build/webpack/loaders/next-flight-action-entry-loader.js?actions=%5B%5B%22D%3A%5C%5Csider_search%5C%5Cllm-answer-engine%5C%5Capp%5C%5Caction.tsx%22%2C%5B%22%24%24ACTION_0%22%5D%5D%2C%5B%22D%3A%5C%5Csider_search%5C%5Cllm-answer-engine%5C%5Cnode_modules%5C%5Cai%5C%5Crsc%5C%5Cdist%5C%5Crsc-server.mjs%22%2C%5B%22%24%24ACTION_0%22%5D%5D%5D&client_imported=!:9:17)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async eval (webpack-internal:///(rsc)/./node_modules/ai/rsc/dist/rsc-server.mjs:1146:24)
at async $$ACTION_0 (webpack-internal:///(rsc)/./node_modules/ai/rsc/dist/rsc-server.mjs:1142:12)
at async eval (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/action-handler.js:310:31)
at async handleAction (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/action-handler.js:239:9)
at async renderToHTMLOrFlightImpl (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/app-render.js:756:33)
at async doRender (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/base-server.js:1325:30)
at async cacheEntry.responseCache.get.routeKind (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/base-server.js:1450:28)
at async eval (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/response-cache/web.js:51:36) {
cause: ConnectTimeoutError: Connect Timeout Error
at onConnectTimeout (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9145:28)
at eval (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9101:50)
at Immediate.eval [as _onImmediate] (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9133:13)
at process.processImmediate (node:internal/timers:478:21)
at process.callbackTrampoline (node:internal/async_hooks:130:17) {
code: 'UND_ERR_CONNECT_TIMEOUT'
}
}
⨯ unhandledRejection: Error: fetch failed
at context.fetch (D:\sider_search\llm-answer-engine\node_modules\next\dist\server\web\sandbox\context.js:272:38)
at fetch (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/react/cjs/react.shared-subset.development.js:182:16)
at doOriginalFetch (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:363:24)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:488:20)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:108:36)
at NoopContextManager.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:7062)
at ContextAPI.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:518)
at NoopTracer.startActiveSpan (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:18108)
at ProxyTracer.startActiveSpan (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:18869)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:97:103)
at NoopContextManager.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:7062)
at ContextAPI.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:518)
at NextTracerImpl.trace (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:97:28)
at globalThis.fetch (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:166:81)
at getImages (webpack-internal:///(rsc)/./app/action.tsx:169:32)
at eval (webpack-internal:///(rsc)/./app/action.tsx:294:13)
at $$ACTION_0 (webpack-internal:///(rsc)/./app/action.tsx:344:7)
at endpoint (webpack-internal:///(rsc)/./node_modules/next/dist/build/webpack/loaders/next-flight-action-entry-loader.js?actions=%5B%5B%22D%3A%5C%5Csider_search%5C%5Cllm-answer-engine%5C%5Capp%5C%5Caction.tsx%22%2C%5B%22%24%24ACTION_0%22%5D%5D%2C%5B%22D%3A%5C%5Csider_search%5C%5Cllm-answer-engine%5C%5Cnode_modules%5C%5Cai%5C%5Crsc%5C%5Cdist%5C%5Crsc-server.mjs%22%2C%5B%22%24%24ACTION_0%22%5D%5D%5D&client_imported=!:9:17)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async eval (webpack-internal:///(rsc)/./node_modules/ai/rsc/dist/rsc-server.mjs:1146:24)
at async $$ACTION_0 (webpack-internal:///(rsc)/./node_modules/ai/rsc/dist/rsc-server.mjs:1142:12)
at async eval (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/action-handler.js:310:31)
at async handleAction (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/action-handler.js:239:9)
at async renderToHTMLOrFlightImpl (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/app-render.js:756:33)
at async doRender (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/base-server.js:1325:30)
at async cacheEntry.responseCache.get.routeKind (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/base-server.js:1450:28)
at async eval (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/response-cache/web.js:51:36) {
cause: ConnectTimeoutError: Connect Timeout Error
at onConnectTimeout (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9145:28)
at eval (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9101:50)
at Immediate.eval [as _onImmediate] (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9133:13)
at process.processImmediate (node:internal/timers:478:21)
at process.callbackTrampoline (node:internal/async_hooks:130:17) {
code: 'UND_ERR_CONNECT_TIMEOUT'
}
}
[the identical ⨯ unhandledRejection trace is printed a second time]
Error fetching search results: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching image link https://i.ytimg.com/vi/mZ8BPrdKTkc/mqdefault.jpg?sqp=-oaymwEFCJQBEFM&rs=AMzJL3lRyzOU467uTpWUa49AhfweToex3g: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
[the same [TypeError: fetch failed] / UND_ERR_CONNECT_TIMEOUT error is printed for nine more i.ytimg.com thumbnail URLs]
The streamable UI has been slow to update. This may be a bug or a performance issue or you forgot to call .done().
✓ Compiled in 534ms (1570 modules)
Terminate batch job (Y/N)?
Y
(.venv) PS D:\sider_search\llm-answer-engine> npm run dev

[email protected] dev
next dev

▲ Next.js 14.1.2

✓ Ready in 2.5s
○ Compiling / ...
✓ Compiled / in 1486ms (1570 modules)
✓ Compiled in 480ms (470 modules)
There was a problem with your fetch operation: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error: fetch failed
at context.fetch (D:\sider_search\llm-answer-engine\node_modules\next\dist\server\web\sandbox\context.js:272:38)
at fetch (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/react/cjs/react.shared-subset.development.js:182:16)
at doOriginalFetch (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:363:24)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:488:20)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:108:36)
at NoopContextManager.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:7062)
at ContextAPI.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:518)
at NoopTracer.startActiveSpan (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:18108)
at ProxyTracer.startActiveSpan (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:18869)
at eval (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:97:103)
at NoopContextManager.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:7062)
at ContextAPI.with (webpack-internal:///(rsc)/./node_modules/next/dist/compiled/@opentelemetry/api/index.js:2:518)
at NextTracerImpl.trace (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/trace/tracer.js:97:28)
at globalThis.fetch (webpack-internal:///(rsc)/./node_modules/next/dist/esm/server/lib/patch-fetch.js:166:81)
at getImages (webpack-internal:///(rsc)/./app/action.tsx:171:32)
at eval (webpack-internal:///(rsc)/./app/action.tsx:296:13)
at $$ACTION_0 (webpack-internal:///(rsc)/./app/action.tsx:350:7)
at endpoint (webpack-internal:///(rsc)/./node_modules/next/dist/build/webpack/loaders/next-flight-action-entry-loader.js?actions=%5B%5B%22D%3A%5C%5Csider_search%5C%5Cllm-answer-engine%5C%5Capp%5C%5Caction.tsx%22%2C%5B%22%24%24ACTION_0%22%5D%5D%2C%5B%22D%3A%5C%5Csider_search%5C%5Cllm-answer-engine%5C%5Cnode_modules%5C%5Cai%5C%5Crsc%5C%5Cdist%5C%5Crsc-server.mjs%22%2C%5B%22%24%24ACTION_0%22%5D%5D%5D&client_imported=!:9:17)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async eval (webpack-internal:///(rsc)/./node_modules/ai/rsc/dist/rsc-server.mjs:1146:24)
at async $$ACTION_0 (webpack-internal:///(rsc)/./node_modules/ai/rsc/dist/rsc-server.mjs:1142:12)
at async eval (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/action-handler.js:310:31)
at async handleAction (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/action-handler.js:239:9)
at async renderToHTMLOrFlightImpl (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/app-render/app-render.js:756:33)
at async doRender (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/base-server.js:1325:30)
at async cacheEntry.responseCache.get.routeKind (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/base-server.js:1450:28)
at async eval (webpack-internal:///(ssr)/./node_modules/next/dist/esm/server/response-cache/web.js:51:36) {
cause: ConnectTimeoutError: Connect Timeout Error
at onConnectTimeout (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9145:28)
at eval (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9101:50)
at Immediate.eval [as _onImmediate] (eval at requireWithFakeGlobalScope (D:\sider_search\llm-answer-engine\node_modules\next\dist\compiled\edge-runtime\index.js:1:657096), :9133:13)
at process.processImmediate (node:internal/timers:478:21)
at process.callbackTrampoline (node:internal/async_hooks:130:17) {
code: 'UND_ERR_CONNECT_TIMEOUT'
}
}
[the identical stack trace is printed three more times: once more as a plain Error and twice as ⨯ unhandledRejection]
Error fetching search results: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
Error fetching image link https://i.ytimg.com/vi/X86Spp7g0pY/mqdefault.jpg?sqp=-oaymwEFCJQBEFM&rs=AMzJL3kCjs-RhfUg3CAMXzikR2oggNgL6g: [TypeError: fetch failed] {
cause: [ConnectTimeoutError: Connect Timeout Error] {
name: 'ConnectTimeoutError',
code: 'UND_ERR_CONNECT_TIMEOUT',
message: 'Connect Timeout Error'
}
}
[the same error is printed for seven more i.ytimg.com thumbnail URLs]
The streamable UI has been slow to update. This may be a bug or a performance issue or you forgot to call .done().

The above is my error log. As an npm novice, how should I go about debugging this?
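For context on debugging this: every failure above is the same UND_ERR_CONNECT_TIMEOUT raised inside server-side fetch calls (getImages in app/action.tsx, the Serper search call, the thumbnail fetches), which almost always points to a connectivity, firewall, or proxy problem rather than a bug in the app. A quick way to confirm is to probe the same hosts outside Next.js; note that Node's built-in fetch ignores HTTP(S)_PROXY environment variables unless you wire a dispatcher up yourself. A minimal sketch (the host list is read off the traces above and the README, and the proxy handling assumes you export HTTPS_PROXY):

```
// probe-network.mjs - run with `node probe-network.mjs` (Node 18+; `npm install undici` first).
import { setGlobalDispatcher, ProxyAgent } from 'undici';

// Node's built-in fetch ignores HTTP(S)_PROXY environment variables,
// so if your machine sits behind a proxy you must wire it up explicitly.
if (process.env.HTTPS_PROXY) {
  setGlobalDispatcher(new ProxyAgent(process.env.HTTPS_PROXY));
}

// Hosts taken from the traces above and the project README.
const targets = [
  'https://google.serper.dev', // Serper API
  'https://i.ytimg.com',       // YouTube thumbnails that timed out
  'https://api.groq.com',      // inference endpoint
];

for (const url of targets) {
  try {
    const res = await fetch(url, { method: 'HEAD' });
    console.log(url, '->', res.status);
  } catch (err) {
    console.error(url, '->', err.cause ?? err);
  }
}
```

If this script also times out, the problem is the network path (VPN, corporate proxy, or regional blocking) and no change to the app will help; if it succeeds, the timeout is specific to the dev server and worth reporting with these details.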

WSL Run Issues

Hi, I am on Windows 11 using WSL with Ubuntu 20.04. I am having issues installing/running this and am hoping you can help.

When I try to install using npm, I get a slew of EBADENGINE / Unsupported engine errors. If I try installing using Bun, the install works with no issues, but when I run it, it simply prints '$ next dev' and hangs there with no additional information.

I've tried both in a conda environment and on a fresh Ubuntu install. Each time I switched between npm and Bun, I created a new directory and re-cloned the repository to minimize leftover state.

I'm genuinely excited about getting this to work as I think it'd be incredibly valuable! Please let me know if you need any other info or have any suggestions. Thanks for the help!
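For the EBADENGINE part: that warning almost always means the Node.js version inside WSL is older than the engines range in package.json (Ubuntu 20.04's default apt Node is very old). A minimal check-and-upgrade sketch using nvm; the version number here is an example, so match it to the engines field yourself:

```
node -v                       # what WSL is currently running
nvm install 20 && nvm use 20  # example target version via nvm
rm -rf node_modules
npm install                   # EBADENGINE warnings should now be gone
npm run dev
```

If Bun still hangs at '$ next dev' afterwards, npm should at least now surface a real error message instead.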

Free embedding alternative?

Since OpenAI's embeddings are the only component that requires payment, have you tried to find free/local alternatives?
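One free path is already wired into the repo: Ollama embeddings, toggled in app/config.tsx. A minimal sketch of the relevant keys, assuming a local Ollama server with an embedding model already pulled ('nomic-embed-text' is just an example model name, via `ollama pull nomic-embed-text`):

```
// app/config.tsx - only the embedding-related keys shown; leave the rest unchanged.
useOllamaEmbeddings: true,            // route embeddings to the local Ollama server
embeddingsModel: 'nomic-embed-text',  // example local embedding model
```

Local transformers.js embeddings are another option, but see the HuggingFaceTransformersEmbeddings issue below for an edge-runtime caveat.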

Please add LiteLLM to this project

This project is pretty great, BUT we need more options for using different LLMs. You don't have to build a solution that supports 100+ LLMs yourself: LiteLLM is another FOSS project that already does this.
Project LiteLLM link - https://github.com/BerriAI/litellm
Adding LiteLLM would be a big win: many people could easily use many more LLMs, which is what everyone wants. The project would only need three parameters from the user (base URL, model name, and API key), and since LiteLLM follows the general OpenAI API structure it can forward the query and return the result. Many big projects have started adding support for it; the maintainers are quite responsive if you have questions, and I can also describe my own experience of using it with other great projects like Flowise. (A configuration sketch follows below.)
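Worth noting: because LiteLLM exposes an OpenAI-compatible endpoint, you can already point this project at a LiteLLM proxy through the existing config, with no new code. A hedged sketch; the model name is an example, and 4000 is LiteLLM's default proxy port at the time of writing, so verify both against your setup:

```
// Start the proxy first, e.g.: litellm --model ollama/mistral
// Then in app/config.tsx:
inferenceModel: 'ollama/mistral',              // whatever model your proxy serves
inferenceAPIKey: process.env.LITELLM_API_KEY,  // hypothetical env var; any string if auth is off
nonOllamaBaseURL: 'http://localhost:4000/v1',  // LiteLLM's OpenAI-compatible endpoint
```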

Hello, I'm having some issues with my OpenAI API key

Hello,
I have a balance on my account; I even removed my old key and created a new one, but I still get the same error on every search:

llm-answer-engine-1  | Error processing content for https://medium.com/chainsecurity/zero-gas-price-transactions-what-they-do-who-creates-them-and-why-they-might-impact-scalability-aeb6487b8bb0:  [InsufficientQuotaError: 429 You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.] {
llm-answer-engine-1  |   name: 'InsufficientQuotaError'
llm-answer-engine-1  | }
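A 429 InsufficientQuotaError is raised by OpenAI's billing system, not by this project: a positive account balance is not enough if the key's project/organization has hit its usage limit or the credit has not been applied to API usage. One way to confirm the key itself is the problem is to reproduce the call outside the app; a minimal sketch using the official openai package:

```
// check-key.mjs - run with `node check-key.mjs` after `npm install openai`.
import OpenAI from 'openai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

try {
  const res = await openai.embeddings.create({
    model: 'text-embedding-3-small', // same model as the default config
    input: 'quota check',
  });
  console.log('OK: got a vector of length', res.data[0].embedding.length);
} catch (err) {
  // A 429 here as well means the quota issue lives in the OpenAI account,
  // not in llm-answer-engine.
  console.error(err.status, err.message);
}
```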

HuggingFaceTransformersEmbeddings does not work

Hi,

I have tried to use the following for embeddings. However, I got an error that is probably related to using runtime = "edge":

const embeddings = new HuggingFaceTransformersEmbeddings({modelName: "Xenova/all-MiniLM-L6-v2"});

Here is the link to the LangChain JS documentation: https://js.langchain.com/docs/integrations/text_embedding/transformers

I am wondering if there is an easy fix for this.

Thanks
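HuggingFaceTransformersEmbeddings runs transformers.js, which needs Node.js APIs that the Edge runtime sandbox does not provide, so the usual fix is to remove export const runtime = 'edge' from the file that builds the embeddings and let it run in the Node.js runtime. A minimal sketch, assuming @xenova/transformers is installed as the peer dependency; the import path matches the LangChain JS docs linked above, but verify it against your installed LangChain version:

```
// Runs in the default Node.js runtime (remove `export const runtime = 'edge'`).
// npm install @langchain/community @xenova/transformers
import { HuggingFaceTransformersEmbeddings } from '@langchain/community/embeddings/hf_transformers';

const embeddings = new HuggingFaceTransformersEmbeddings({
  modelName: 'Xenova/all-MiniLM-L6-v2', // same local model as in the report above
});

// The first call downloads the model; afterwards everything runs locally for free.
const vector = await embeddings.embedQuery('hello world');
console.log(vector.length); // 384 dimensions for all-MiniLM-L6-v2
```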

How to use my own OpenAI proxy?

Thank you to the author for creating such an excellent project. I would like to ask: how can I replace OpenAI's API with my own OpenAI proxy?
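For chat completions the knob already exists: nonOllamaBaseURL in app/config.tsx is the base URL the OpenAI-compatible client talks to. For the embeddings client, LangChain's OpenAIEmbeddings accepts a custom base URL through its configuration option. A hedged sketch; the proxy URL is a placeholder, and where exactly the embeddings client is constructed depends on your checkout of app/action.tsx:

```
// app/config.tsx: route inference through your proxy instead of api.openai.com.
nonOllamaBaseURL: 'https://my-proxy.example.com/v1',  // placeholder proxy URL

// Wherever the embeddings client is created (e.g. in app/action.tsx),
// pass the same base URL to the LangChain OpenAIEmbeddings client:
import { OpenAIEmbeddings } from '@langchain/openai';

const embeddings = new OpenAIEmbeddings({
  modelName: 'text-embedding-3-small',
  configuration: { baseURL: 'https://my-proxy.example.com/v1' }, // placeholder
});
```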
