
chatgpt-api's Introduction

ChatGPT API

Node.js client for the official ChatGPT API.

NPM Build Status MIT License Prettier Code Formatting

Intro

This package is a Node.js wrapper around ChatGPT by OpenAI. TS batteries included. ✨

Example usage

Updates

April 10, 2023

This package now fully supports GPT-4! 🔥

We also just released a TypeScript chatgpt-plugin package which contains helpers and examples to make it as easy as possible to start building your own ChatGPT Plugins in JS/TS. Even if you don't have developer access to ChatGPT Plugins yet, you can still use the chatgpt-plugin repo to get a head start on building your own plugins locally.

If you have access to the gpt-4 model, you can run the following to test out the CLI with GPT-4:

npx chatgpt@latest --model gpt-4 "Hello world"

Using the chatgpt CLI with gpt-4

We still support both the official ChatGPT API and the unofficial proxy API, but we now recommend using the official API since it's significantly more robust and supports GPT-4.

Method Free? Robust? Quality?
ChatGPTAPI ❌ No ✅ Yes ✅ Real ChatGPT models + GPT-4
ChatGPTUnofficialProxyAPI ✅ Yes ❌ No ✅ ChatGPT webapp

Note: We strongly recommend using ChatGPTAPI since it uses the officially supported API from OpenAI. We will likely remove support for ChatGPTUnofficialProxyAPI in a future release.

  1. ChatGPTAPI - Uses the gpt-3.5-turbo model with the official OpenAI chat completions API (official, robust approach, but it's not free)
  2. ChatGPTUnofficialProxyAPI - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)
Previous Updates

March 1, 2023

The official OpenAI chat completions API has been released, and it is now the default for this package! 🔥

Method Free? Robust? Quality?
ChatGPTAPI ❌ No ✅ Yes ✅ Real ChatGPT models
ChatGPTUnofficialProxyAPI ✅ Yes ☑️ Maybe ✅ Real ChatGPT

Note: We strongly recommend using ChatGPTAPI since it uses the officially supported API from OpenAI. We may remove support for ChatGPTUnofficialProxyAPI in a future release.

  1. ChatGPTAPI - Uses the gpt-3.5-turbo model with the official OpenAI chat completions API (official, robust approach, but it's not free)
  2. ChatGPTUnofficialProxyAPI - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)
Feb 19, 2023

We now provide three ways of accessing the unofficial ChatGPT API, all of which have tradeoffs:

Method Free? Robust? Quality?
ChatGPTAPI ❌ No ✅ Yes ☑️ Mimics ChatGPT
ChatGPTUnofficialProxyAPI ✅ Yes ☑️ Maybe ✅ Real ChatGPT
ChatGPTAPIBrowser (v3) ✅ Yes ❌ No ✅ Real ChatGPT

Note: I recommend that you use either ChatGPTAPI or ChatGPTUnofficialProxyAPI.

  1. ChatGPTAPI - (Used to use) text-davinci-003 to mimic ChatGPT via the official OpenAI completions API (most robust approach, but it's not free and doesn't use a model fine-tuned for chat)
  2. ChatGPTUnofficialProxyAPI - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)
  3. ChatGPTAPIBrowser - (deprecated; v3.5.1 of this package) Uses Puppeteer to access the official ChatGPT webapp (uses the real ChatGPT, but very flaky, heavyweight, and error prone)
Feb 5, 2023

OpenAI has disabled the leaked chat model we were previously using, so we're now defaulting to text-davinci-003, which is not free.

We've found several other hidden, fine-tuned chat models, but OpenAI keeps disabling them, so we're searching for alternative workarounds.

Feb 1, 2023

This package no longer requires any browser hacks – it is now using the official OpenAI completions API with a leaked model that ChatGPT uses under the hood. 🔥

import { ChatGPTAPI } from 'chatgpt'

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY
})

const res = await api.sendMessage('Hello World!')
console.log(res.text)

Please upgrade to chatgpt@latest (at least v4.0.0). The updated version is significantly more lightweight and robust compared with previous versions. You also don't have to worry about IP issues or rate limiting.

Huge shoutout to @waylaidwanderer for discovering the leaked chat model!

If you run into any issues, we do have a pretty active ChatGPT Hackers Discord with over 8k developers from the Node.js & Python communities.

Lastly, please consider starring this repo and following me on Twitter to help support the project.

Thanks && cheers, Travis

CLI

To run the CLI, you'll need an OpenAI API key:

export OPENAI_API_KEY="sk-TODO"
npx chatgpt "your prompt here"

By default, the response is streamed to stdout, the results are stored in a local config file, and every invocation starts a new conversation. You can use -c to continue the previous conversation and --no-stream to disable streaming.
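
For example, to ask a question and then follow up in the same conversation without streaming (assuming OPENAI_API_KEY is exported as above; the prompts are just placeholders):

npx chatgpt "what is the capital of France?"
npx chatgpt -c --no-stream "and what is its population?"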

Usage:
  $ chatgpt <prompt>

Commands:
  <prompt>  Ask ChatGPT a question
  rm-cache  Clears the local message cache
  ls-cache  Prints the local message cache path

For more info, run any command with the `--help` flag:
  $ chatgpt --help
  $ chatgpt rm-cache --help
  $ chatgpt ls-cache --help

Options:
  -c, --continue          Continue last conversation (default: false)
  -d, --debug             Enables debug logging (default: false)
  -s, --stream            Streams the response (default: true)
  -s, --store             Enables the local message cache (default: true)
  -t, --timeout           Timeout in milliseconds
  -k, --apiKey            OpenAI API key
  -o, --apiOrg            OpenAI API organization
  -n, --conversationName  Unique name for the conversation
  -h, --help              Display this message
  -v, --version           Display version number

If you have access to the gpt-4 model, you can run the following to test out the CLI with GPT-4:

npx chatgpt@latest --model gpt-4 "Hello world"

Using the chatgpt CLI with gpt-4

Install

npm install chatgpt

Make sure you're using node >= 18 so fetch is available (or node >= 14 if you install a fetch polyfill).
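
On node 14–17, one way to provide fetch is a polyfill import before loading the package. This is a minimal sketch assuming the isomorphic-fetch package is installed (unfetch/polyfill works similarly):

import 'isomorphic-fetch' // registers a global fetch on older Node versions
import { ChatGPTAPI } from 'chatgpt'

const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })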

Usage

To use this module from Node.js, you need to pick between two methods:

Method Free? Robust? Quality?
ChatGPTAPI ❌ No ✅ Yes ✅ Real ChatGPT models + GPT-4
ChatGPTUnofficialProxyAPI ✅ Yes ❌ No ✅ Real ChatGPT webapp

  1. ChatGPTAPI - Uses the gpt-3.5-turbo model with the official OpenAI chat completions API (official, robust approach, but it's not free). You can override the model, completion params, and system message to fully customize your assistant.

  2. ChatGPTUnofficialProxyAPI - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)

Both approaches have very similar APIs, so it should be simple to swap between them.

Note: We strongly recommend using ChatGPTAPI since it uses the officially supported API from OpenAI and it also supports gpt-4. We will likely remove support for ChatGPTUnofficialProxyAPI in a future release.

Usage - ChatGPTAPI

Sign up for an OpenAI API key and store it in your environment.

import { ChatGPTAPI } from 'chatgpt'

async function example() {
  const api = new ChatGPTAPI({
    apiKey: process.env.OPENAI_API_KEY
  })

  const res = await api.sendMessage('Hello World!')
  console.log(res.text)
}

You can override the default model (gpt-3.5-turbo) and any OpenAI chat completion params using completionParams:

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  completionParams: {
    model: 'gpt-4',
    temperature: 0.5,
    top_p: 0.8
  }
})

If you want to track the conversation, you'll need to pass the parentMessageId like this:

const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })

// send a message and wait for the response
let res = await api.sendMessage('What is OpenAI?')
console.log(res.text)

// send a follow-up
res = await api.sendMessage('Can you expand on that?', {
  parentMessageId: res.id
})
console.log(res.text)

// send another follow-up
res = await api.sendMessage('What were we talking about?', {
  parentMessageId: res.id
})
console.log(res.text)

You can add streaming via the onProgress handler:

const res = await api.sendMessage('Write a 500 word essay on frogs.', {
  // print the partial response as the AI is "typing"
  onProgress: (partialResponse) => console.log(partialResponse.text)
})

// print the full text at the end
console.log(res.text)

You can add a timeout using the timeoutMs option:

// timeout after 2 minutes (which will also abort the underlying HTTP request)
const response = await api.sendMessage(
  'write me a really really long essay on frogs',
  {
    timeoutMs: 2 * 60 * 1000
  }
)

If you want to see more info about what's actually being sent to OpenAI's chat completions API, set the debug: true option in the ChatGPTAPI constructor:

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  debug: true
})

We default to a basic systemMessage. You can override this in either the ChatGPTAPI constructor or sendMessage:

const res = await api.sendMessage('what is the answer to the universe?', {
  systemMessage: `You are ChatGPT, a large language model trained by OpenAI. You answer as concisely as possible for each response. If you are generating a list, do not have too many items.
Current date: ${new Date().toISOString()}\n\n`
})

Note that we automatically handle appending the previous messages to the prompt and attempt to optimize for the available tokens (which defaults to 4096).
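
If you need a larger token budget and your account supports a bigger model, the limits can reportedly be tuned via constructor options. The option names below (maxModelTokens, maxResponseTokens) are assumptions based on this package's typings, so verify them against the auto-generated docs before relying on them:

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  completionParams: { model: 'gpt-4' },
  // assumed options: total token budget and the share reserved for the response
  maxModelTokens: 8100,
  maxResponseTokens: 1000
})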

Usage in CommonJS (Dynamic import)

async function example() {
  // To use ESM in CommonJS, you can use a dynamic import like this:
  const { ChatGPTAPI } = await import('chatgpt')
  // You can also try dynamic importing like this:
  // const importDynamic = new Function('modulePath', 'return import(modulePath)')
  // const { ChatGPTAPI } = await importDynamic('chatgpt')

  const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })

  const res = await api.sendMessage('Hello World!')
  console.log(res.text)
}

Usage - ChatGPTUnofficialProxyAPI

The API for ChatGPTUnofficialProxyAPI is almost exactly the same. You just need to provide a ChatGPT accessToken instead of an OpenAI API key.

import { ChatGPTUnofficialProxyAPI } from 'chatgpt'

async function example() {
  const api = new ChatGPTUnofficialProxyAPI({
    accessToken: process.env.OPENAI_ACCESS_TOKEN
  })

  const res = await api.sendMessage('Hello World!')
  console.log(res.text)
}

See demos/demo-reverse-proxy for a full example:

npx tsx demos/demo-reverse-proxy.ts

ChatGPTUnofficialProxyAPI messages also contain a conversationId in addition to parentMessageId, since the ChatGPT webapp can't reference messages across different accounts & conversations.
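
For example, a sketch of continuing a proxy conversation by passing both IDs back (assuming the response object exposes conversationId and id, matching the note above):

let res = await api.sendMessage('What is OpenAI?')

// continue the same webapp conversation
res = await api.sendMessage('Can you expand on that?', {
  conversationId: res.conversationId,
  parentMessageId: res.id
})
console.log(res.text)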

Reverse Proxy

You can override the reverse proxy by passing apiReverseProxyUrl:

const api = new ChatGPTUnofficialProxyAPI({
  accessToken: process.env.OPENAI_ACCESS_TOKEN,
  apiReverseProxyUrl: 'https://your-example-server.com/api/conversation'
})

Known reverse proxies run by community members include:

Reverse Proxy URL Author Rate Limits Last Checked
https://ai.fakeopen.com/api/conversation @pengzhile 5 req / 10 seconds by IP 4/18/2023
https://api.pawan.krd/backend-api/conversation @PawanOsman 50 req / 15 seconds (~3 r/s) 3/23/2023

Note: info on how the reverse proxies work is not being published at this time in order to prevent OpenAI from disabling access.

Access Token

To use ChatGPTUnofficialProxyAPI, you'll need an OpenAI access token from the ChatGPT webapp. To do this, you can use any of the following methods which take an email and password and return an access token:

These libraries work with email + password accounts (i.e., they do not support accounts that authenticate via Microsoft / Google).

Alternatively, you can manually get an accessToken by logging in to the ChatGPT webapp and then opening https://chat.openai.com/api/auth/session, which will return a JSON object containing your accessToken string.

Access tokens last for days.

Note: using a reverse proxy will expose your access token to a third party. There shouldn't be any adverse effects from this, but please consider the risks before using this method.

Docs

See the auto-generated docs for more info on methods and parameters.

Demos

Most of the demos use ChatGPTAPI. It should be pretty easy to convert them to use ChatGPTUnofficialProxyAPI if you'd rather use that approach. The only thing that needs to change is how you initialize the api with an accessToken instead of an apiKey.

To run the included demos:

  1. clone repo
  2. install node deps
  3. set OPENAI_API_KEY in .env

A basic demo is included for testing purposes:

npx tsx demos/demo.ts

A demo showing the onProgress handler:

npx tsx demos/demo-on-progress.ts

The onProgress demo uses the optional onProgress parameter to sendMessage to receive intermediary results as ChatGPT is "typing".

A conversation demo:

npx tsx demos/demo-conversation.ts

A persistence demo shows how to store messages in Redis for persistence:

npx tsx demos/demo-persistence.ts

Any keyv adaptor is supported for persistence, and there are overrides if you'd like to use a different way of storing / retrieving messages.
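
As a rough sketch, Redis-backed persistence might look like the following, assuming the ChatGPTAPI constructor accepts a Keyv-compatible messageStore option as used in demos/demo-persistence.ts (verify against the docs before relying on it):

import Keyv from 'keyv'
import KeyvRedis from '@keyv/redis'
import { ChatGPTAPI } from 'chatgpt'

// store messages in Redis instead of the default in-memory map
const messageStore = new Keyv({ store: new KeyvRedis('redis://localhost:6379') })

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  messageStore
})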

Note that persisting messages is required for remembering the context of previous conversations beyond the scope of the current Node.js process, since by default, we only store messages in memory. Here's an external demo of using a completely custom database solution to persist messages.

Note: Persistence is handled automatically when using ChatGPTUnofficialProxyAPI because it is connecting indirectly to ChatGPT.

Projects

All of these awesome projects are built using the chatgpt package. 🤯

If you create a cool integration, feel free to open a PR and add it to the list.

Compatibility

  • This package is ESM-only.
  • This package supports node >= 14.
  • This module assumes that fetch is installed.
    • In node >= 18, it's installed by default.
    • In node < 18, you need to install a polyfill like unfetch/polyfill (guide) or isomorphic-fetch (guide).
  • If you want to build a website using chatgpt, we recommend using it only from your backend API.

Credits

License

MIT © Travis Fischer

If you found this project interesting, please consider sponsoring me or following me on Twitter.

chatgpt-api's People

Contributors

0x7030676e31, 189, alxmiron, barnesoir, billylo1, danielehrhardt, easydu2002, fuergaosi233, gencay, ikechan8370, kodjunkie, mrloldev, noelzappy, nucks, oceanlvr, optionsx, pawanosman, simon300000, skippyyy, timkmecl, timmsgithub, transitive-bullshit, waylaidwanderer, wong2, xtremehpx, yi-ge, youking-lib, yunyu950908, zeke, zhengxs2018


chatgpt-api's Issues

Can I use Chatgpt api in react-native app?

Hi, first of all, thanks for your effort in publishing this API.
It is greatly helpful for building projects on my side.
Due to a problem implementing this API in my hybrid app, I am asking this question:
Is it possible to use this API in my React Native app?
Please give me an answer with an update.
Regards.

Fix history so the bot knows previous messages + misc cookie issue

Been playing around with the api. Here are a couple fixes that could make it work better

  • parentId should always be updated to the previous message sent by the bot
  • conversation id should be populated after initial message received from bot (comes in parsedData)

It's not working for me with sessionToken (I kept getting unauthorized), so I modified it to use the entire cookie string and it works.
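
For reference, current versions of the package expose this via the parentMessageId option on sendMessage (see the conversation example in the README above):

// track the conversation by passing the previous response id back in
let res = await api.sendMessage('first message')
res = await api.sendMessage('follow-up message', { parentMessageId: res.id })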

Quietly exit for no reason

Thanks for making the API, it works pretty well until a very annoying bug surfaces:

If you make several queries in a row, calling api.sendMessage() eventually causes the program to exit without any error output at the console.

I'm using node v16.18.1, if that matters.

Add better support for conversations

  • Expose message IDs to the user
  • Store the latest message ID as the default conversationId
  • Automatically update the default conversation ID if no conversationId is passed

See #26 which I need to review

[ERROR] Error: ChatGPTAPI error 429

Hello,

I was able to install and use it for simple requests, but when it takes a while to formulate a response I get the error:

[ERROR] Error: ChatGPTAPI error 429
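
A 429 response means the request was rate-limited. One hedged workaround is to back off and retry; the retry count and delays below are arbitrary:

async function sendWithRetry(api, prompt, retries = 3) {
  for (let attempt = 0; attempt < retries; attempt++) {
    try {
      return await api.sendMessage(prompt)
    } catch (err) {
      // rethrow non-rate-limit errors and give up after the final attempt
      if (attempt === retries - 1 || !String(err).includes('429')) throw err
      await new Promise((resolve) => setTimeout(resolve, 2 ** attempt * 1000))
    }
  }
}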

Add support for browser-like edge runtimes

It shouldn't take much effort to make this package work in both Node.js + browser environments.

This could be really important, for instance, for running on Edge runtimes like CF workers, Vercel edge functions, trusted code in Chrome extensions, etc.

Note
This package is not meant to be run client-side in a browser because it contains a reference to your private session token.

webpack bundle error

webpack 5.66.0 compiled successfully in 3940 ms
Error [ERR_REQUIRE_ESM]: Must use import to load ES Module: xxx/node_modules/chatgpt/build/index.js
require() of ES modules is not supported.
require() of xxx/node_modules/chatgpt/build/index.js from xxx/dist/src/modules/chat/Chat.service.js is an ES module file as it is a .js file whose nearest parent package.json contains "type": "module" which defines all .js files in that package scope as ES modules.
Instead rename index.js to end in .cjs, change the requiring code to use import(), or remove "type": "module" from xxx/node_modules/chatgpt/package.json.
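
Since the package is ESM-only, one workaround from a CommonJS/webpack build is the dynamic import shown in the CommonJS example earlier in this README:

// inside an async function in your CommonJS module
const { ChatGPTAPI } = await import('chatgpt')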

Why throw this error (`ChatGPTAPI error ${res.status || res.statusText}`)

file:///code/node_modules/chatgpt/src/fetch-sse.ts:13
    throw new Error(`ChatGPTAPI error ${res.status || res.statusText}`)
          ^
Error: ChatGPTAPI error 503
    at fetchSSE (file:///code/node_modules/chatgpt/src/fetch-sse.ts:13:11)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)

env version: node: 16.18.0


Issue with published cjs version of package

Wondering if I'm doing something wrong, but

with version 1.3.0 installed like this:

{
    "dependencies": {
        "chatgpt": "^1.3.0" 
    }
}

and trying to import it like this:

let { ChatGPTAPI } = require("chatgpt");

I'm still getting this error:

Error [ERR_REQUIRE_ESM]: Must use import to load ES Module: /var/task/node_modules/chatgpt/build/index.js require() of ES modules is not supported.

Am I doing anything wrong?

Add an automated way to fetch and refresh session tokens

This is currently the biggest hindrance with this package (and any other approach which uses the unofficial REST API).

At some point, session tokens expire. This is currently recognized and handled by throwing an error with the message ChatGPT failed to refresh auth token. Error: session token has expired, but most long-running integrations will need a better solution for this going forwards.

Does this work in browser js?

Forgive the noob question, but can I run this in browser js? I tried putting it into a static webpage, but I got some errors. Now I think I'm realizing this is only intended to run in node.js.

getting captcha'd for every prompt

Appreciate your effort getting things to work after Dec 11 changes 🙏

I built a simple node app that can take a prompt and print the response on terminal window.

In my case it seems to always open a new Chrome window/instance (thus prompting a new captcha) each time I enter a new prompt. I haven't used Puppeteer before, but it looks like it's opening a new window each time; there might be a way to reuse an already open window? 🤔 Please ignore if it already does that.

Some digging led me to links below:
https://medium.com/@jaredpotter1/connecting-puppeteer-to-existing-chrome-window-8a10828149e0

puppeteer/puppeteer#1788

TODO

  • Add message and conversation IDs
  • Add support for streaming responses
  • Add basic unit tests

ChatGPT failed to refresh auth token. Error: Unauthorized, request returns code 304

Hi! Very excited to use this lib; it looks very polished for the amount of time that has passed since ChatGPT was launched! I am trying to initialise a client, but keep getting a ChatGPT failed to refresh auth token. Error: Unauthorized error. I copied the value of the __Secure-next-auth.session-token cookie. I have tried the other ones as well, but they don't work either.

When I look at the requests being sent, I can see that the session request returns a 304, and the body is empty. I assume that that is not what is supposed to happen, as the same request on the openAI website does not return an empty body. Any ideas why this could be happening?

I overcome Cloudflare with puppeteer

the code is here

const fs = require('fs')
const puppeteer = require('puppeteer-extra')
puppeteer.use(require('puppeteer-extra-plugin-stealth')())

const sleep = duration => new Promise(resolve => setTimeout(resolve, duration))

async function start() {
  const browser = await puppeteer.launch({
    executablePath: '/Applications/Google Chrome.app/Contents/MacOS/Google Chrome',
    headless: false,
    args: ['--proxy-server=127.0.0.1:7890'],
    ignoreHTTPSErrors: true
  })
  const page = (await browser.pages())[0]

  await page.goto('https://chat.openai.com')
  await page.waitForSelector('#__next .btn-primary')
  // wait for networkidle
  await sleep(10 * 1000)
  await page.click('#__next .btn-primary')

  await page.waitForNavigation({
    waitUntil: 'networkidle0'
  })

  await page.type('#username', 'username')
  await page.click('button[name="action"]')

  await page.waitForSelector('#password')

  // await page.screenshot({
  //   path: 'screenshot.png'
  // })

  await page.type('#password', 'password')
  await page.click('button[name="action"]')

  await page.waitForNavigation({
    waitUntil: 'networkidle0'
  })

  fs.writeFileSync('cookies.json', JSON.stringify(await page.cookies()))

  // await sleep(60 * 1000)
  await browser.close()
}

start()
  .then(() => {
    process.exit(0)
  })
  .catch(() => {
    process.exit(1)
  })

Front-end cross-domain

Hi, I'm using a React.js build, but it shows a CORS error:

Access to fetch at 'https://chat.openai.com/api/auth/session' from origin 'http://localhost:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
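
Calling chat.openai.com directly from the browser is blocked by CORS (and would also expose your credentials); the Compatibility section above recommends keeping chatgpt on your backend. A minimal sketch of such a backend, assuming Express is installed:

import express from 'express'
import { ChatGPTAPI } from 'chatgpt'

const app = express()
app.use(express.json())

const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })

// the React frontend calls this endpoint instead of chat.openai.com
app.post('/api/chat', async (req, res) => {
  const result = await api.sendMessage(req.body.prompt)
  res.json({ text: result.text })
})

app.listen(3001)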

getting _undici.fetch error

I'm getting this error when authenticating, but I did verify I have the undici library installed. I'm on windows... maybe doing something dumb?

(node:11744) UnhandledPromiseRejectionWarning: Error: ChatGPT failed to refresh auth token. TypeError: _undici.fetch is not a function

Windows issue using chatgpt as commonjs package

My error is in the chatgpt package: it says that it needs a main module but is not finding one. I am trying to use this package with React Native.

The error:

Error: While trying to resolve module chatgpt from file D:\Acadamics and Work\Work\Software Projects\ChatgptAssistant\frontend\src\screens\Home\Home.tsx, the package D:\Acadamics and Work\Work\Software Projects\ChatgptAssistant\frontend\node_modules\chatgpt\package.json was successfully found. However, this package itself specifies a main module field that could not be resolved (D:\Acadamics and Work\Work\Software Projects\ChatgptAssistant\frontend\node_modules\chatgpt\index. Indeed, none of these files exist:

  • D:\Acadamics and Work\Work\Software Projects\ChatgptAssistant\frontend\node_modules\chatgpt\index(.native|.android.js|.native.js|.js|.android.jsx|.native.jsx|.jsx|.android.json|.native.json|.json|.android.ts|.native.ts|.ts|.android.tsx|.native.tsx|.tsx)
  • D:\Acadamics and Work\Work\Software Projects\ChatgptAssistant\frontend\node_modules\chatgpt\index\index(.native|.android.js|.native.js|.js|.android.jsx|.native.jsx|.jsx|.android.json|.native.json|.json|.android.ts|.native.ts|.ts|.android.tsx|.native.tsx|.tsx)
    at DependencyGraph.resolveDependency (D:\Acadamics and Work\Work\Software Projects\ChatgptAssistant\frontend\node_modules\metro\src\node-haste\DependencyGraph.js:277:17)
    at Object.resolve (D:\Acadamics and Work\Work\Software Projects\ChatgptAssistant\frontend\node_modules\metro\src\lib\transformHelpers.js:170:21)
    at resolveDependencies (D:\Acadamics and Work\Work\Software Projects\ChatgptAssistant\frontend\node_modules\metro\src\DeltaBundler\graphOperations.js:466:33)
    at processModule (D:\Acadamics and Work\Work\Software Projects\ChatgptAssistant\frontend\node_modules\metro\src\DeltaBundler\graphOperations.js:232:31)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async addDependency (D:\Acadamics and Work\Work\Software Projects\ChatgptAssistant\frontend\node_modules\metro\src\DeltaBundler\graphOperations.js:361:18)
    at async Promise.all (index 4)
    at async processModule (D:\Acadamics and Work\Work\Software Projects\ChatgptAssistant\frontend\node_modules\metro\src\DeltaBundler\graphOperations.js:279:3)
    at async addDependency (D:\Acadamics and Work\Work\Software Projects\ChatgptAssistant\frontend\node_modules\metro\src\DeltaBundler\graphOperations.js:361:18)
    at async Promise.all (index 2)

Switching to direct API requests

Headless Chrome seems a bit overkill for a Node.js API wrapper, and may not work on all machines (see line 36 in chatgpt-api.ts). The ChatGPT API allows direct requests given a CSRF token (which doesn't seem to change in my testing) and a session token, which can be obtained through the /api/auth/session endpoint.
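
A rough sketch of the idea described above, using the session cookie name mentioned elsewhere on this page; this is not a supported API and may still be blocked by Cloudflare:

// fetch the session info directly, given an existing session token
const res = await fetch('https://chat.openai.com/api/auth/session', {
  headers: {
    cookie: `__Secure-next-auth.session-token=${process.env.SESSION_TOKEN}`
  }
})
const session = await res.json()
console.log(session.accessToken) // short-lived token used for backend API calls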

getting a 403 Forbidden error when use version 2.2

Hello, I am now getting a 403 Forbidden error when using chatgpt-api version 2.2.

After modifying the demo-conversation.ts file's api parameter and running it, it throws the exception below:

(node:6860) ExperimentalWarning: The Fetch API is an experimental feature. This
feature could change at any time
(Use `node --trace-warnings ...` to show where the warning was created)
ChatGPTError: ChatGPT failed to refresh auth token. Error: 403 Forbidden
    at ChatGPTAPI.refreshAccessToken (c:\Users\Administrator\Desktop\TestChatGPT
\chatgpt-api-main (1)\src\chatgpt-api.ts:342:21)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5
)
    at ChatGPTAPI.ensureAuth (c:\Users\Administrator\Desktop\TestChatGPT\chatgpt
-api-main (1)\src\chatgpt-api.ts:264:12)
    at main (c:\Users\Administrator\Desktop\TestChatGPT\chatgpt-api-main (1)\dem
os\demo-conversation.ts:36:3) {
  response: Response {
    [Symbol(realm)]: null,
    [Symbol(state)]: {
      aborted: false,
      rangeRequested: false,
      timingAllowPassed: true,
      requestIncludesCredentials: true,
      type: 'default',
      status: 403,
      timingInfo: [Object],
      cacheState: '',
      statusText: 'Forbidden',
      headersList: [HeadersList],
      urlList: [Array],
      body: [Object]
    },
    [Symbol(headers)]: HeadersList {
      [Symbol(headers map)]: [Map],
      [Symbol(headers map sorted)]: null
    }
  },
  statusCode: 403,
  statusText: 'Forbidden',
  originalError: ChatGPTError: 403 Forbidden
      at <anonymous> (c:\Users\Administrator\Desktop\TestChatGPT\chatgpt-api-mai
n (1)\src\chatgpt-api.ts:298:25)
      at process.processTicksAndRejections (node:internal/process/task_queues:95
:5)
      at ChatGPTAPI.refreshAccessToken (c:\Users\Administrator\Desktop\TestChatG
PT\chatgpt-api-main (1)\src\chatgpt-api.ts:292:19)
      at ChatGPTAPI.ensureAuth (c:\Users\Administrator\Desktop\TestChatGPT\chatg
pt-api-main (1)\src\chatgpt-api.ts:264:12)
      at main (c:\Users\Administrator\Desktop\TestChatGPT\chatgpt-api-main (1)\d
emos\demo-conversation.ts:36:3) {
    response: Response {
      [Symbol(realm)]: null,
      [Symbol(state)]: [Object],
      [Symbol(headers)]: [HeadersList]
    },
    statusCode: 403,
    statusText: 'Forbidden'
  }
}

And my demo-conversation.ts is as below:

import dotenv from 'dotenv-safe'
import { oraPromise } from 'ora'

import { ChatGPTAPI } from '../src'
import { getOpenAIAuthInfo } from './openai-auth-puppeteer'

//dotenv.config()

/**
 * Demo CLI for testing conversation support.
 *
 * ```
 * npx tsx src/demo-conversation.ts
 * ```
 */
async function main() {
  //const email = process.env.EMAIL
  //const password = process.env.PASSWORD

/*
  const authInfo = await getOpenAIAuthInfo({
    email,
    password
  })

  const api = new ChatGPTAPI({ ...authInfo })
  */
  const api = new ChatGPTAPI({
  sessionToken: "**my sessionToken*",
  clearanceToken: "IGIXGUiyKUrNckUjBZ.NiGTOYn7bKjjfmRDYAmU8aXQ-1670850561-0-160",
  userAgent: 'Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/108.0.0.0 Safari/537.36' // needs to match your browser's user agent
})

  
  
  await api.ensureAuth()

  const conversation = api.getConversation()

  const prompt = 'What is OpenAI?'

  const response = await oraPromise(conversation.sendMessage(prompt), {
    text: prompt
  })

  console.log(response)

  const prompt2 = 'Did they made OpenGPT?'

  console.log(
    await oraPromise(conversation.sendMessage(prompt2), {
      text: prompt2
    })
  )

  const prompt3 = 'Who founded this institute?'

  console.log(
    await oraPromise(conversation.sendMessage(prompt3), {
      text: prompt3
    })
  )

  const prompt4 = 'Who is that?'

  console.log(
    await oraPromise(conversation.sendMessage(prompt4), {
      text: prompt4
    })
  )
}

main().catch((err) => {
  console.error(err)
  process.exit(1)
})

about the "undici" error

Hello, when I send a message like "Write a singleton pattern in Java.", it throws an "undici" error:

/app/qqbot/service/bin/node_modules/undici/lib/fetch/index.js:1921
fetchParams.controller.controller.error(new TypeError('terminated', {
^

TypeError: terminated
at Fetch.onAborted (/app/qqbot/service/bin/node_modules/undici/lib/fetch/index.js:1921:49)
at Fetch.emit (node:events:390:28)
at Fetch.emit (node:domain:475:12)
at Fetch.terminate (/app/qqbot/service/bin/node_modules/undici/lib/fetch/index.js:93:10)
at Object.onError (/app/qqbot/service/bin/node_modules/undici/lib/fetch/index.js:2062:34)
at Request.onError (/app/qqbot/service/bin/node_modules/undici/lib/core/request.js:265:27)
at null.errorRequest (/app/qqbot/service/bin/node_modules/undici/lib/client.js:1739:13)
at TLSSocket.onSocketClose (/app/qqbot/service/bin/node_modules/undici/lib/client.js:998:5)
at TLSSocket.emit (node:events:402:35)
at TLSSocket.emit (node:domain:475:12)


What can I do about that?

How can I close a conversation?

It seems that if I don't close a conversation, the speed becomes slower and slower.

Is there a way to close a conversation?

Error: Failed to launch the browser process

Error: Failed to launch the browser process! spawn /Applications/Google Chrome.app/Contents/MacOS/Google Chrome ENOENT


TROUBLESHOOTING: https://github.com/puppeteer/puppeteer/blob/main/docs/troubleshooting.md

    at onClose (C:\Users\User\Desktop\New folder\node_modules\puppeteer-core\lib\cjs\puppeteer\node\BrowserRunner.js:299:20)
    at ChildProcess.<anonymous> (C:\Users\User\Desktop\New folder\node_modules\puppeteer-core\lib\cjs\puppeteer\node\BrowserRunner.js:293:24)
    at ChildProcess.emit (node:events:513:28)
    at ChildProcess._handle.onexit (node:internal/child_process:289:12)
    at onErrorNT (node:internal/child_process:476:16)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21)
Error: Failed to launch the browser process! spawn /Applications/Google Chrome.app/Contents/MacOS/Google Chrome ENOENT


TROUBLESHOOTING: https://github.com/puppeteer/puppeteer/blob/main/docs/troubleshooting.md

    at onClose (C:\Users\User\Desktop\New folder\node_modules\puppeteer-core\lib\cjs\puppeteer\node\BrowserRunner.js:299:20)
    at ChildProcess.<anonymous> (C:\Users\User\Desktop\New folder\node_modules\puppeteer-core\lib\cjs\puppeteer\node\BrowserRunner.js:293:24)
    at ChildProcess.emit (node:events:513:28)
    at ChildProcess._handle.onexit (node:internal/child_process:289:12)
    at onErrorNT (node:internal/child_process:476:16)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21)

Handle response timeouts

Sometimes the request will freeze and never finish; it might be useful to have a timeout-in-ms attribute.
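
Note: newer versions of the package expose exactly this via the timeoutMs option on sendMessage (see the timeout example in the README above):

const res = await api.sendMessage('your prompt here', { timeoutMs: 2 * 60 * 1000 })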

Contribution to your project

Hi,

I want to contribute to your project. I can make translation, code, 2D graphics - icons, logos. Will you be interested?

Thank you in advance.

Valeriia Zhuravska

[Bug] TypeError [ERR_INVALID_CHAR]: Invalid character in header content ["cookie"]. Has the ChatGPT API been updated?


Here is the error:

AggregateError: 
    Error: ChatGPT failed to refresh auth token. TypeError [ERR_INVALID_CHAR]: Invalid character in header content ["cookie"]
        at w.refreshAccessToken (/var/task/dist/index-6d56a3b6.js:21385:2251)
        at async w.sendMessage (/var/task/dist/index-6d56a3b6.js:21385:1039)
        at async search (/var/task/dist/index-6d56a3b6.js:21400:10)
        at async /var/task/dist/index-6d56a3b6.js:21483:24
        at async Promise.all (index 1)
        at async middleware (/var/task/node_modules/.pnpm/@[email protected]/node_modules/@octokit/webhooks/dist-node/index.js:355:5)
        at async Server.<anonymous> (/var/task/___vc/__helpers.js:813:13)

Add ability to override auth url

Currently the ChatGPTAPI constructor allows you to set the apiBaseURL and the backendAPIBaseURL; however, the auth URL used for authentication in the refreshAccessTokens function is hardcoded to https://chat.openai.com/api/auth/session. To avoid issues with CORS, I would like to be able to modify the auth URL so it would go through a koa proxy.

Alternatively this could be resolved by implementing a proxy configuration as described by #47

Fetch

(node:1754) ExperimentalWarning: The Fetch API is an experimental feature. This feature could change at any time
(Use node --trace-warnings ... to show where the warning was created)

How do I get rid of it? This is generated when I use console.log().

Thoughts on using Google to bypass reCAPTCHA/hCAPTCHA?

By login using Google, the only thing that requires user operations appears to be the 2-step verification from Google, and this can be done away using your smart phone.

Or is it going to be detected by Google after a few tries?

Like this:

...
await page.goto('https://chat.openai.com/auth/login')
await page.waitForSelector('#__next .btn-primary', { timeout })
await delay(1000)
try {
    if (email && password) {
        await Promise.all([
            page.click('#__next .btn-primary'),
            page.waitForNavigation({
                waitUntil: 'networkidle0'
            })
        ])
        await page.waitForSelector('button[data-provider="google"]', { timeout });
        await page.click('button[data-provider="google"]');
        await page.waitForNavigation({
            waitUntil: 'networkidle0'
        })
        await page.type('input[type="email"]', email, { delay: 50 })
        await page.keyboard.press("Enter");
        await delay(3000);
        await page.keyboard.type(password, { delay: 50 });
        await page.keyboard.press("Enter");
    }
} catch (err) {
    await browser.close();
    throw err;
}
await page.waitForSelector("items-center");
...

This browser or app may not be secure.

When trying to log in with Google I get...

This browser or app may not be secure.
Try using a different browser. If you're already using a supported browser, you can try again to sign in.
