helix-gpt's People

Contributors

defasdefbe, kyfanc, leona, mapomagpie, sigmasd, silvercoder, yukaii

helix-gpt's Issues

Undefined is not an object

I've set up helix-gpt (I think properly, see below), but when editing a Python file, I get

2024-01-26T14:08:07.382 helix_lsp::transport [ERROR] gpt err <- "TypeError: undefined is not an object (evaluating 'this.contents.split')\n"
2024-01-26T14:08:07.382 helix_lsp::transport [ERROR] gpt err <- "      at positionalUpdate (/$bunfs/root/helix-gpt:238:19)\n"
2024-01-26T14:08:07.382 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:232:9\n"
2024-01-26T14:08:07.382 helix_lsp::transport [ERROR] gpt err <- "      at forEach (:1:21)\n"
2024-01-26T14:08:07.382 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:231:7\n"
2024-01-26T14:08:07.382 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:230:41\n"
2024-01-26T14:08:07.382 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:260:9\n"
2024-01-26T14:08:07.382 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:258:39\n"
2024-01-26T14:08:07.382 helix_lsp::transport [ERROR] gpt err <- "      at emit (node:events:154:95)\n"
2024-01-26T14:08:07.382 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:319:7\n"
2024-01-26T14:08:07.382 helix_lsp::transport [ERROR] gpt err <- "      at receiveLine (/$bunfs/root/helix-gpt:313:23)\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "233 |       });\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "234 |       ctx.contentVersion = request.params.textDocument.version;\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "235 |     });\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "236 |   }\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "237 |   positionalUpdate(text, range) {\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "238 |     const lines = this.contents.split(\"\\n\");\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "                        ^\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "TypeError: undefined is not an object (evaluating 'this.contents.split')\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "      at positionalUpdate (/$bunfs/root/helix-gpt:238:19)\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:232:9\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "      at forEach (:1:21)\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:231:7\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:230:41\n"
2024-01-26T14:08:07.666 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:260:9\n"
2024-01-26T14:08:07.667 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:258:39\n"
2024-01-26T14:08:07.667 helix_lsp::transport [ERROR] gpt err <- "      at emit (node:events:154:95)\n"
2024-01-26T14:08:07.667 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:319:7\n"
2024-01-26T14:08:07.667 helix_lsp::transport [ERROR] gpt err <- "      at receiveLine (/$bunfs/root/helix-gpt:313:23)\n"

My languages.toml contains

[language-server.gpt]
command = "/opt/homebrew/bin/helix-gpt" 
config = {}
args = ["--logFile", "/tmp/helix-gpt.log", "--handler", "copilot", "--copilotApiKey", "xxxx"]

[[language]]
name = "python"
scope = "source.python"
language-servers = [ "gpt" ]

/opt/homebrew/bin/helix-gpt exists and is executable (I ran /opt/homebrew/bin/helix-gpt --authCopilot to get the copilot key).

An excerpt from the log shows

APP 2024-01-26T13:08:04.264Z --> sent request | {"jsonrpc":"2.0","method":"initialize","id":0,"result":{"capabilities":{"completionProvider":{"resolveProvider":false,"triggerCharacters":["{","(",")","=",">"," ",",",":",".","<","/"]},"textDocumentSync":{"change":2}}}}

APP 2024-01-26T13:08:04.265Z --> failed to parse line: | JSON Parse error: Unable to parse JSON string | Content-Length: 52

{"jsonrpc":"2.0","method":"initialized","params":{}}Content-Length: 3888

{"jsonrpc":"2.0","method":"textDocument/didOpen","params":{"textDocument":{"languageId":"python","text":"import base64\nimport json\nimport os\nfrom pathlib import Path\nfrom typing import Any, Dict\n\nfrom utilities.parser import parse_query\nfrom utilities.salesforce import Salesforce\nfrom google.cloud import storage\nfrom google.cloud import pubsub_v1\nfrom google.oauth2 import service_account\nimport logging\n\nfrom google.cloud.bigquery import Client\n\n\nqueries = {p.stem: p.open().read() for p in Path(\"queries\").glob(\"*.sql\")}\n\ndef get_salesforce_client() -> Salesforce:\n    blob: str = (\n        storage.Client(project=os.environ[\"GOOGLE_CLOUD_PROJECT\"])\n        .get_bucket(os.environ[\"STORAGE_BUCKET\"])\n        .get_blob(os.environ[\"APP\"])\n        .download_as_string()\n    )\n\n    parsed = json.loads(blob)\n\n    conn_login = parsed[\"conn_login\"]\n    conn_password = parsed[\"conn_password\"]\n    conn_security_token = parsed[\"conn_security_token\"]\n    conn_host = parsed[\"conn_host\"]\n\n    return Salesforce(\n        conn_login=conn_login,\n        conn_password=conn_password,\n        conn_security_token=conn_security_token,\n        conn_host=conn_host,\n    )\n\n\ndef get_pandas_gbq_credentials():\n    blob: str = (\n        storage.Client(project=os.environ[\"GOOGLE_CLOUD_PROJECT\"])\n        .get_bucket(os.environ[\"STORAGE_BUCKET\"])\n        .get_blob(\"pandas.json\")\n        .download_as_string()\n    )\n\n    account_info = json.loads(blob)\n\n    credentials = service_account.Credentials.from_service_account_info(account_info)\n\n    return credentials\n\n\ndef schedule_events(event, context):\n    \"\"\"\n    Publish the tables that need to be fetched from salesforce\n\n    :param dict event: The dictionary with data specific to this type of\n         event. The `data` field contains the PubsubMessage message. The\n         `attributes` field will contain custom attributes if there are any.\n    :param google.cloud.functions.Context context: The Cloud Functions event\n         metadata. The `event_id` field contains the Pub/Sub message ID. The\n         `timestamp` field contains the publish time.\n    \"\"\"\n    topic = os.environ[\"APP\"]\n    project_id = os.environ[\"GOOGLE_CLOUD_PROJECT\"]\n    publisher = pubsub_v1.PublisherClient()\n    topic_name = f\"projects/{project_id}/topics/{topic}\"\n\n    for table in queries.keys():\n        publisher.publish(topic_name, bytes(table, \"utf-8\"))\n\n\ndef import_salesforce(event: Dict[str, Any], context):\n    \"\"\"\n    Import the tables from salesforce to bigquery\n\n    :param event: The dictionary with data specific to this type of\n         event. The `data` field contains the PubsubMessage message. The\n         `attributes` field will contain custom attributes if there are any.\n    :param context: The Cloud Functions event\n         metadata. The `event_id` field contains the Pub/Sub message ID. 
The\n         `timestamp` field contains the publish time.\n    \"\"\"\n\n    sf = get_salesforce_client()\n\n    if \"data\" in event:\n        table_name = base64.b64decode(event[\"data\"]).decode(\"utf-8\")\n        query = queries.get(table_name)\n\n        if query:\n            parsed = parse_query(\n                query, strip_aliases=True, strip_escapes=False, strip_comments=True\n            )\n            table: str = parsed.table\n            logging.info(\"Importing %s\" % table)\n            df = sf.query_df(soql=query)\n            Salesforce.insert_in_bq(\n                df,\n                destination_table=f\"salesforce_xebia.{table}\",\n                project_id=os.environ[\"GOOGLE_CLOUD_PROJECT\"],\n                credentials=get_pandas_gbq_credentials(),\n            )\n            logging.info(\"Finished %s\" % table)\n","uri":"file:///Users/gio/code/salesforce-to-bigquery/main.py","version":0}}}Content-Length: 86

{"jsonrpc":"2.0","method":"workspace/didChangeConfiguration","params":{"settings":{}}}

APP 2024-01-26T13:08:07.750Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":2,"line":15},"textDocument":{"uri":"file:///Users/gio/code/salesforce-to-bigquery/main.py"}},"id":1}

APP 2024-01-26T13:08:07.877Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":3,"line":15},"textDocument":{"uri":"file:///Users/gio/code/salesforce-to-bigquery/main.py"}},"id":2}

What could be wrong?
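For readers following the trace: positionalUpdate splits this.contents, which is only populated once a didOpen notification has been processed. The "failed to parse line" entry in the log above suggests the didOpen and its neighbouring messages arrived concatenated and may have been dropped, so this.contents was still undefined when the first didChange triggered positionalUpdate. A minimal TypeScript sketch of that failure mode, with a guard that surfaces it more clearly (illustrative only, not the project's actual source):

// Illustrative sketch, mirroring the shape of the code quoted in the stack trace above.
class TrackedBuffer {
  contents?: string // set by didOpen; stays undefined if the didOpen was never parsed

  didOpen(text: string) {
    this.contents = text
  }

  positionalUpdate(text: string, range: { start: { line: number; character: number }; end: { line: number; character: number } }) {
    if (this.contents === undefined) {
      // Without a guard, this.contents.split throws the TypeError seen in the helix log.
      throw new Error("positionalUpdate called before didOpen populated contents")
    }
    const lines = this.contents.split("\n")
    // ...applying the incremental edit described by text and range is elided in this sketch...
    return lines.join("\n")
  }
}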

chat handler: openai doesn't exist

Test CLI:
helix-gpt --handler openai --openaiKey 'sk-ogR5nc.......'

It hangs: no errors, no info.

Helix languages.toml:

[language-server.gpt]
command = "helix-gpt"
config = {}
args = ["--handler", "openai", "--openaiKey", "sk-ogR5nc......."


[[language]]
name = "bash"
auto-format = true
language-servers = [ "gpt" ]


[[language]]
name = "php"
language-servers = ["phpactor","gpt"]
auto-format = true

In Helix: %, then Space + a, then generate documentation.

It's not working, no logs.

tail -f /root/.cache/helix/helix.log
Blank

tail -f /app/helix-gpt.log # Or wherever you set --logFile to
Not found

locate helix-gpt.log
Not found
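For reference, a hedged languages.toml sketch that adds --logFile so helix-gpt writes a log at all, based on the flags already shown in the config above (key redacted, log path is a placeholder):

[language-server.gpt]
command = "helix-gpt"
config = {}
args = ["--handler", "openai", "--openaiKey", "sk-...", "--logFile", "/tmp/helix-gpt.log"]

After a :lsp-restart (or restarting Helix), /tmp/helix-gpt.log should exist if the binary starts at all; if it never appears, the server is not being launched.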

[BUG] failed to initialize language server: request 0 timed out

device info
Apple M2 Pro

helix-editor version
helix 23.10 (1d1806c8)

helix-gpt version
0.31

Describe the bug
The initialization request from Helix times out when opening a file for which helix-gpt is configured to run.
helix-gpt waits in src/models/lsp.ts at line 207 for chunks from Bun.stdin.stream(), but nothing ever arrives. I checked this by logging the chunkText.
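To reproduce that check outside of helix-gpt, a minimal Bun sketch (assuming a recent Bun where the stdin ReadableStream is async-iterable) that simply logs whatever arrives on stdin, the same source src/models/lsp.ts reads from:

// stdin-probe.ts, run with: bun run stdin-probe.ts
// Prints every chunk received on stdin to stderr so it does not interfere with LSP output.
const decoder = new TextDecoder()
for await (const chunk of Bun.stdin.stream()) {
  console.error("chunk:", decoder.decode(chunk))
}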

If I start helix-gpt manually with

echo "Hello Helix-GPT" | bun run /Users/xaver.himmelsbach/Development/helix-gpt/src/app.ts --logFile /Users/xaver.himmelsbach/gpt.log

the input is read from stdin:

APP 2024-03-19T16:45:21.209Z --> triggerCharacters: | ["{","("," "]

APP 2024-03-19T16:45:21.213Z --> Hello Helix-GPT

APP 2024-03-19T16:45:21.214Z --> failed to parse line:| failed to parse | Hello Helix-GPT

helix-gpt logs
Most of the time, helix-gpt just prints the triggerCharacters and receives nothing more.
Sometimes, when repeatedly executing :lsp-restart in Helix, an initialization request comes through, but that still doesn't seem to be handled correctly.

These logs show the request showing up after repeatedly calling :lsp-restart:

APP 2024-03-19T16:48:07.090Z --> triggerCharacters: | ["{","("," "]

APP 2024-03-19T16:49:41.674Z --> triggerCharacters: | ["{","("," "]

APP 2024-03-19T16:50:33.410Z --> triggerCharacters: | ["{","("," "]

APP 2024-03-19T16:51:46.195Z --> triggerCharacters: | ["{","("," "]

APP 2024-03-19T16:51:49.266Z --> Content-Length: 2016

{"jsonrpc":"2.0","method":"initialize","params":{"capabilities":{"general":{"positionEncodings":["utf-8","utf-32","utf-16"]},"textDocument":{"codeAction":{"codeActionLiteralSupport":{"codeActionKind":{"valueSet":["","quickfix","refactor","refactor.extract","refactor.inline","refactor.rewrite","source","source.organizeImports"]}},"dataSupport":true,"disabledSupport":true,"isPreferredSupport":true,"resolveSupport":{"properties":["edit","command"]}},"completion":{"completionItem":{"deprecatedSupport":true,"insertReplaceSupport":true,"resolveSupport":{"properties":["documentation","detail","additionalTextEdits"]},"snippetSupport":true,"tagSupport":{"valueSet":[1]}},"completionItemKind":{}},"hover":{"contentFormat":["markdown"]},"inlayHint":{"dynamicRegistration":false},"publishDiagnostics":{"tagSupport":{"valueSet":[1,2]},"versionSupport":true},"rename":{"dynamicRegistration":false,"honorsChangeAnnotations":false,"prepareSupport":true},"signatureHelp":{"signatureInformation":{"activeParameterSupport":true,"documentationFormat":["markdown"],"parameterInformation":{"labelOffsetSupport":true}}}},"window":{"workDoneProgress":true},"workspace":{"applyEdit":true,"configuration":true,"didChangeConfiguration":{"dynamicRegistration":false},"didChangeWatchedFiles":{"dynamicRegistration":true,"relativePatternSupport":false},"executeCommand":{"dynamicRegistration":false},"fileOperations":{"didRename":true,"willRename":true},"inlayHint":{"refreshSupport":false},"symbol":{"dynamicRegistration":false},"workspaceEdit":{"documentChanges":true,"failureHandling":"abort","normalizesLineEndings":false,"resourceOperations":["create","rename","delete"]},"workspaceFolders":true}},"clientInfo":{"name":"helix","version":"23.10 (1d1806c8)"},"processId":23053,"rootPath":"/Users/xaver.himmelsbach/Development/cruise-cruised","rootUri":"file:///Users/xaver.himmelsbach/Development/cruise-cruised","workspaceFolders":[{"name":"cruise-cruised","uri":"file:///Users/xaver.himmelsbach/Development/cruise-cruised"}]},"id":0}

APP 2024-03-19T16:51:49.267Z --> received request: | {"jsonrpc":"2.0","method":"initialize","params":{"capabilities":{"general":{"positionEncodings":["utf-8","utf-32","utf-16"]},"textDocument":{"codeAction":{"codeActionLiteralSupport":{"codeActionKind":{"valueSet":["","quickfix","refactor","refactor.extract","refactor.inline","refactor.rewrite","source","source.organizeImports"]}},"dataSupport":true,"disabledSupport":true,"isPreferredSupport":true,"resolveSupport":{"properties":["edit","command"]}},"completion":{"completionItem":{"deprecatedSupport":true,"insertReplaceSupport":true,"resolveSupport":{"properties":["documentation","detail","additionalTextEdits"]},"snippetSupport":true,"tagSupport":{"valueSet":[1]}},"completionItemKind":{}},"hover":{"contentFormat":["markdown"]},"inlayHint":{"dynamicRegistration":false},"publishDiagnostics":{"tagSupport":{"valueSet":[1,2]},"versionSupport":true},"rename":{"dynamicRegistration":false,"honorsChangeAnnotations":false,"prepareSupport":true},"signatureHelp":{"signatureInformation":{"activeParameterSupport":true,"documentationFormat":["markdown"],"parameterInformation":{"labelOffsetSupport":true}}}},"window":{"workDoneProgress":true},"workspace":{"applyEdit":true,"configuration":true,"didChangeConfiguration":{"dynamicRegistration":false},"didChangeWatchedFiles":{"dynamicRegistration":true,"relativePatternSupport":false},"executeCommand":{"dynamicRegistration":false},"fileOperations":{"didRename":true,"willRename":true},"inlayHint":{"refreshSupport":false},"symbol":{"dynamicRegistration":false},"workspaceEdit":{"documentChanges":true,"failureHandling":"abort","normalizesLineEndings":false,"resourceOperations":["create","rename","delete"]},"workspaceFolders":true}},"clientInfo":{"name":"helix","version":"23.10 (1d1806c8)"},"processId":23053,"rootPath":"/Users/xaver.himmelsbach/Development/cruise-cruised","rootUri":"file:///Users/xaver.himmelsbach/Development/cruise-cruised","workspaceFolders":[{"name":"cruise-cruised","uri":"file:///Users/xaver.himmelsbach/Development/cruise-cruised"}]},"id":0}

APP 2024-03-19T16:51:49.267Z --> sent request | {"jsonrpc":"2.0","method":"initialize","id":0,"result":{"capabilities":{"codeActionProvider":true,"executeCommandProvider":{"commands":["resolveDiagnostics","generateDocs","improveCode","refactorFromComment","writeTest"]},"completionProvider":{"resolveProvider":false,"triggerCharacters":["{","("," "]},"textDocumentSync":{"change":1,"openClose":true}}}}

APP 2024-03-19T16:51:49.311Z --> triggerCharacters: | ["{","("," "]

helix logs
Helix seems to send the initialization request and times out after 10 seconds (default LSP timeout). Increasing the timeout in languages.toml just led to longer waiting times.

2024-03-19T17:50:33.362 helix_lsp::transport [INFO] gpt -> {"jsonrpc":"2.0","method":"initialize","params":{"capabilities":{"general":{"positionEncodings":["utf-8","utf-32","utf-16"]},"textDocument":{"codeAction":{"codeActionLiteralSupport":{"codeActionKind":{"valueSet":["","quickfix","refactor","refactor.extract","refactor.inline","refactor.rewrite","source","source.organizeImports"]}},"dataSupport":true,"disabledSupport":true,"isPreferredSupport":true,"resolveSupport":{"properties":["edit","command"]}},"completion":{"completionItem":{"deprecatedSupport":true,"insertReplaceSupport":true,"resolveSupport":{"properties":["documentation","detail","additionalTextEdits"]},"snippetSupport":true,"tagSupport":{"valueSet":[1]}},"completionItemKind":{}},"hover":{"contentFormat":["markdown"]},"inlayHint":{"dynamicRegistration":false},"publishDiagnostics":{"tagSupport":{"valueSet":[1,2]},"versionSupport":true},"rename":{"dynamicRegistration":false,"honorsChangeAnnotations":false,"prepareSupport":true},"signatureHelp":{"signatureInformation":{"activeParameterSupport":true,"documentationFormat":["markdown"],"parameterInformation":{"labelOffsetSupport":true}}}},"window":{"workDoneProgress":true},"workspace":{"applyEdit":true,"configuration":true,"didChangeConfiguration":{"dynamicRegistration":false},"didChangeWatchedFiles":{"dynamicRegistration":true,"relativePatternSupport":false},"executeCommand":{"dynamicRegistration":false},"fileOperations":{"didRename":true,"willRename":true},"inlayHint":{"refreshSupport":false},"symbol":{"dynamicRegistration":false},"workspaceEdit":{"documentChanges":true,"failureHandling":"abort","normalizesLineEndings":false,"resourceOperations":["create","rename","delete"]},"workspaceFolders":true}},"clientInfo":{"name":"helix","version":"23.10 (1d1806c8)"},"processId":23053,"rootPath":"/Users/xaver.himmelsbach/Development/cruise-cruised","rootUri":"file:///Users/xaver.himmelsbach/Development/cruise-cruised","workspaceFolders":[{"name":"cruise-cruised","uri":"file:///Users/xaver.himmelsbach/Development/cruise-cruised"}]},"id":0}
2024-03-19T17:50:53.365 helix_lsp [ERROR] failed to initialize language server: request 0 timed out

Question about API key

Please forgive this stupid question, I seem to be having a blackout about the required API key. I pay for GitHub Copilot, and for a long time I've been using Copilot in Visual Studio and Neovim. In your instructions, you mention a required API key. But I can't figure out for the life of me how to obtain that key. I can find no corresponding function in my GitHub account. How do I find the key?

And is this key the same as the "token" you mention in the context of helix-gpt --authCopilot? Or are key and token two different things?

What is supposed to happen when I use helix-gpt --authCopilot? What I get is "error: no handler key provided". But isn't the very command supposed to get me the key?

Can you maybe clarify your instructions in this regard?
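For what it's worth, the first issue on this page shows the flow that appears intended: the token printed by helix-gpt --authCopilot is the value that gets passed as --copilotApiKey, so "key" and "token" refer to the same string. A sketch of the resulting languages.toml entry, taken from the configurations quoted elsewhere on this page (token redacted):

[language-server.gpt]
command = "helix-gpt"
config = {}
args = ["--handler", "copilot", "--copilotApiKey", "<token printed by --authCopilot>"]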

Publishing releases to github!

Hi, first of all, thank you for making this!
This is really nice!

Would you mind publishing an initial release to GitHub?
The reason I ask is that I would like to have a go at packaging this with nix.
This will likely take time, but having a first release (no matter the number) would be a first step.

P.S. In case you are interested: the litellm project (https://github.com/zya/litellmjs) could be an easy way to support other models.

[Feature request] Add copilot chat as a command and append the results to a buffer

Is your feature request related to a problem? Please describe.
I'm still configuring this, but I saw no reference to being able to use Copilot Chat.

Describe the solution you'd like
I'd like to run a command in Helix (even if it's an SH command) to ask a question in a copilot chat session, and see the response in a buffer dedicated to this conversation (so I can move it around Helix, split windows, etc).
This buffer would contain the question I entered as a command, and the response, and then each new question+response would still be in the buffer separated by some character and new lines.

Describe alternatives you've considered
A dedicated interactive window would be nicer, but that does not seem to be supported in Helix.
Getting the chat to suggest code would be OK, but having it in a buffer is more than enough.

Additional context
Feature in other IDEs:
https://docs.github.com/en/copilot/github-copilot-chat/using-github-copilot-chat-in-your-ide

Thanks!!
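As a stopgap for the "even if it's an SH command" idea: Helix can insert a shell command's output into the current buffer with :insert-output, so a hypothetical wrapper script could approximate the appended question-and-answer log described above. Sketch only; ask-chat and your-chat-backend are placeholder names, nothing like this ships with helix-gpt:

#!/usr/bin/env sh
# ask-chat (hypothetical): echo the question, call some chat backend, add a separator.
printf 'Q: %s\n\n' "$1"
your-chat-backend "$1"    # placeholder for whatever chat CLI or API wrapper you have
printf '\n---\n'

Inside Helix, :insert-output ask-chat "your question" would then drop both the question and the answer at the cursor of whichever buffer is focused.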

[BUG] completion doesn't get inserted correctly (codeium)

helix-editor version
helix 23.10 (a8761485)

helix-gpt version
4647598

Describe the bug

completion doesn't get inserted correctly (tried with codeium)

Screencast.from.2024-02-26.15-35-31.webm

The whitespace after static triggers the bug; if I request completion with the cursor immediately at the end of static, it works correctly.

helix-gpt logs

APP 2024-02-26T14:30:41.663Z --> running completion on buffer | {"uri":"file:///home/mrcool/dev/others/helix-gpt/a.ts","text":"class A {\n  static \n}\n","languageId":"typescript","version":1}

APP 2024-02-26T14:30:41.664Z --> calling completion event

APP 2024-02-26T14:30:41.664Z --> sending diagnostics | [{"message":"Fetching completion...","severity":3,"range":{"start":{"line":1,"character":0},"end":{"line":2,"character":0}}}]

APP 2024-02-26T14:30:41.664Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///home/mrcool/dev/others/helix-gpt/a.ts","diagnostics":[{"message":"Fetching completion...","severity":3,"range":{"start":{"line":1,"character":0},"end":{"line":2,"character":0}},"source":"helix-gpt"}]}}

APP 2024-02-26T14:30:41.664Z --> codeium | completion request

APP 2024-02-26T14:30:41.664Z --> fetch | /exa.language_server_pb.LanguageServerService/GetCompletions

APP 2024-02-26T14:30:41.901Z --> response | https://web-backend.codeium.com/exa.language_server_pb.LanguageServerService/GetCompletions | 200

APP 2024-02-26T14:30:41.902Z --> completion hints: |   static a = 1,  static a: number,  static foo() {

APP 2024-02-26T14:30:41.902Z --> TEST | 1 | 23

APP 2024-02-26T14:30:41.902Z --> TEST | 1 | 27

APP 2024-02-26T14:30:41.902Z --> TEST | 1 | 25

APP 2024-02-26T14:30:41.902Z --> sent request | {"jsonrpc":"2.0","id":4,"result":{"isIncomplete":false,"items":[{"label":"static a = 1","kind":1,"preselect":true,"detail":"  static a = 1","insertText":"  static a = 1","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":1,"character":23},"end":{"line":1,"character":23}}}]},{"label":"static a: number","kind":1,"preselect":true,"detail":"  static a: number","insertText":"  static a: number","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":1,"character":27},"end":{"line":1,"character":27}}}]},{"label":"static foo() {","kind":1,"preselect":true,"detail":"  static foo() {","insertText":"  static foo() {","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":1,"character":25},"end":{"line":1,"character":25}}}]}]}}

llama.cpp endpoint alternative

First of all, thank you for your work on this!

There is a server implementation for llama.cpp and llamafile, and I use them very frequently. The API happens to be compatible with the OpenAI API.

It would be very helpful to be able to set the base URI with an environment variable or flag for people who self-host llama.cpp or llamafile.

P.S. I noticed the base URI is hardcoded. I'm not very knowledgeable in TS, but if I find the time I might make a PR.
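To make the request concrete, a sketch of what it could look like, assuming a hypothetical --openaiEndpoint flag (the base URI is hardcoded at the time of this report, so no such flag exists yet) and llama.cpp's bundled server, which exposes an OpenAI-compatible /v1/chat/completions route. Model path and port are placeholders:

# start a local llama.cpp server
./server -m ./models/some-model.gguf --port 8080

# point the openai handler at it (hypothetical flag, for illustration only)
helix-gpt --handler openai --openaiKey unused --openaiEndpoint http://localhost:8080/v1/chat/completions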

[BUG] Works initially but at some point gives an error and prevents other LSPs as well

helix-editor version
24.3

helix-gpt version
0.31

Describe the bug
It works for some amount of time and then always gives the same error below.
The handler I use is "codeium".

helix logs

Fetch failed with status 400 body {"code":"invalid_argument","message":"validation error:\n - document.editor_language: value length must be at least 1 characters [string.min_len]: validation error:\n - document.editor_language: value length must be at least 1 characters [string.min_len]"} url: /exa.language_server_pb.LanguageServerService/GetCompletions

Remove all triggers

Is there a way to remove all triggers and just trigger it manually with C-x?

I know you can override them, but I tried passing an empty string and it didn't work.

[language-server.gpt]
command = "bun"
args = ["run", "/Users/att/.local/bin/helix-gpt.js"]
triggerCharacters = ""

Provides normal code completion while waiting for gpt to complete the code

Is your feature request related to a problem? Please describe.
After adding helix-gpt, I always have to wait a long time for the completion popup to appear due to network issues, not only for the gpt completions but also for the ones provided by rust-analyzer, which is very disruptive to my normal use.

Describe the solution you'd like
I'd like gpt's completion options to be inserted into the completion list asynchronously, once they finish loading.

Describe alternatives you've considered
I'm not sure that helix provides the ability to do that

Additional context
In LazyVim, the AI-provided completions are dynamically added to the code completion list, and I can choose between them and the other options provided by a traditional LSP.

[BUG] Helix crashes on long completions

helix-editor version
Build from source: helix 23.10 (29ac2bb)

helix-gpt version
0.27 with Bun

Describe the bug
Helix crashes when accepting a long completion. Not sure if helix or the LSP is the problem. Created an issue at helix as well.

Reproduction Steps

I tried this:

  1. Installing helix-gpt as LSP according to the provided instructions
  2. hx test.ts
  3. write const longString = "Lorem " and wait for a long copilot completion
  4. accept the completion
  5. crash

helix-gpt logs

~/.cache/helix/helix-gpt.log
APP 2024-02-13T15:09:14.857Z --> triggerCharacters: | ["{","("," "]

APP 2024-02-13T15:09:14.860Z --> received request: | {"jsonrpc":"2.0","method":"initialize","params":{"capabilities":{"general":{"positionEncodings":["utf-8","utf-32","utf-16"]},"textDocument":{"codeAction":{"codeActionLiteralSupport":{"codeActionKind":{"valueSet":["","quickfix","refactor","refactor.extract","refactor.inline","refactor.rewrite","source","source.organizeImports"]}},"dataSupport":true,"disabledSupport":true,"isPreferredSupport":true,"resolveSupport":{"properties":["edit","command"]}},"completion":{"completionItem":{"deprecatedSupport":true,"insertReplaceSupport":true,"resolveSupport":{"properties":["documentation","detail","additionalTextEdits"]},"snippetSupport":true,"tagSupport":{"valueSet":[1]}},"completionItemKind":{}},"hover":{"contentFormat":["markdown"]},"inlayHint":{"dynamicRegistration":false},"publishDiagnostics":{"versionSupport":true},"rename":{"dynamicRegistration":false,"honorsChangeAnnotations":false,"prepareSupport":true},"signatureHelp":{"signatureInformation":{"activeParameterSupport":true,"documentationFormat":["markdown"],"parameterInformation":{"labelOffsetSupport":true}}}},"window":{"workDoneProgress":true},"workspace":{"applyEdit":true,"configuration":true,"didChangeConfiguration":{"dynamicRegistration":false},"didChangeWatchedFiles":{"dynamicRegistration":true,"relativePatternSupport":false},"executeCommand":{"dynamicRegistration":false},"fileOperations":{"didRename":true,"willRename":true},"inlayHint":{"refreshSupport":false},"symbol":{"dynamicRegistration":false},"workspaceEdit":{"documentChanges":true,"failureHandling":"abort","normalizesLineEndings":false,"resourceOperations":["create","rename","delete"]},"workspaceFolders":true}},"clientInfo":{"name":"helix","version":"23.10 (783ff27b)"},"processId":666987,"rootPath":"/home/hendrik/tmp","rootUri":null,"workspaceFolders":[]},"id":0}

APP 2024-02-13T15:09:14.860Z --> sent request | {"jsonrpc":"2.0","method":"initialize","id":0,"result":{"capabilities":{"codeActionProvider":true,"executeCommandProvider":{"commands":["resolveDiagnostics","generateDocs","improveCode","refactorFromComment","writeTest"]},"completionProvider":{"resolveProvider":false,"triggerCharacters":["{","("," "]},"textDocumentSync":{"change":1,"openClose":true}}}}

APP 2024-02-13T15:09:14.860Z --> received request: | {"jsonrpc":"2.0","method":"initialized","params":{}}

APP 2024-02-13T15:09:14.861Z --> received didOpen | language: typescript

APP 2024-02-13T15:09:20.271Z --> received didChange | language: typescript | contentVersion: 1 | uri: file:///home/hendrik/tmp/test.ts

APP 2024-02-13T15:09:20.278Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":21,"line":0},"textDocument":{"uri":"file:///home/hendrik/tmp/test.ts"}},"id":1}

APP 2024-02-13T15:09:20.482Z --> received didChange | language: typescript | contentVersion: 2 | uri: file:///home/hendrik/tmp/test.ts

APP 2024-02-13T15:09:20.491Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":22,"line":0},"textDocument":{"uri":"file:///home/hendrik/tmp/test.ts"}},"id":2}

APP 2024-02-13T15:09:20.612Z --> received didChange | language: typescript | contentVersion: 3 | uri: file:///home/hendrik/tmp/test.ts

APP 2024-02-13T15:09:20.621Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":23,"line":0},"textDocument":{"uri":"file:///home/hendrik/tmp/test.ts"}},"id":3}

APP 2024-02-13T15:09:20.682Z --> received didChange | language: typescript | contentVersion: 4 | uri: file:///home/hendrik/tmp/test.ts

APP 2024-02-13T15:09:20.691Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":24,"line":0},"textDocument":{"uri":"file:///home/hendrik/tmp/test.ts"}},"id":4}

APP 2024-02-13T15:09:20.761Z --> received didChange | language: typescript | contentVersion: 5 | uri: file:///home/hendrik/tmp/test.ts

APP 2024-02-13T15:09:20.771Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":25,"line":0},"textDocument":{"uri":"file:///home/hendrik/tmp/test.ts"}},"id":5}

APP 2024-02-13T15:09:21.022Z --> received didChange | language: typescript | contentVersion: 6 | uri: file:///home/hendrik/tmp/test.ts

APP 2024-02-13T15:09:21.022Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":26,"line":0},"textDocument":{"uri":"file:///home/hendrik/tmp/test.ts"}},"id":6}

APP 2024-02-13T15:09:21.424Z --> running completion on buffer | {"uri":"file:///home/hendrik/tmp/test.ts","text":"const longString = \"Lorem \"","languageId":"typescript","version":6}

APP 2024-02-13T15:09:21.425Z --> calling completion event

APP 2024-02-13T15:09:21.425Z --> sending diagnostics | [{"message":"Fetching completion...","severity":3,"range":{"start":{"line":0,"character":0},"end":{"line":1,"character":0}}}]

APP 2024-02-13T15:09:21.425Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///home/hendrik/tmp/test.ts","diagnostics":[{"message":"Fetching completion...","severity":3,"range":{"start":{"line":0,"character":0},"end":{"line":1,"character":0}},"source":"helix-gpt"}]}}

APP 2024-02-13T15:09:21.425Z --> copilot | completion request

APP 2024-02-13T15:09:21.426Z --> fetch | /copilot_internal/v2/token

APP 2024-02-13T15:09:21.727Z --> response | https://api.github.com/copilot_internal/v2/token | 200

APP 2024-02-13T15:09:21.727Z --> fetch | /v1/engines/copilot-codex/completions

APP 2024-02-13T15:09:24.777Z --> response | https://copilot-proxy.githubusercontent.com/v1/engines/copilot-codex/completions | 200

APP 2024-02-13T15:09:27.282Z --> completion hints: | ipsum dolor sit amet, consectetur adipiscing elit. Maecenas nec purus vel mi sodales blandit. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Etiam ut nisl at elit tincidunt tincidunt. Maecenas nec purus vel mi sodales blandit. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc,ipsum dolor sit amet, consectetur adipiscing elit. In sed urna vel massa molestie suscipit. Nullam sit amet mi in mauris efficitur vulputate. Nam fermentum, ipsum nec vehicula lacinia, odio dui dapibus leo, et condimentum enim tortor nec risus. Nulla facilisi. Duis quis eros ac mi tincidunt vestibulum. Nulla facilisi. Morbi in libero a elit suscipit eleifend. Aenean auctor, ipsum vel tincidunt vehicula, justo libero lacinia tortor, eget tempor magna purus id ex. Proin sodales, lectus id lobortis sollicitudin, ipsum arcu volutpat turpis, non ultricies elit ex eu tortor. Nam vehicula, erat at tincidunt tincidunt, nunc elit cursus mauris, eget luctus est odio sed nunc. Nullam et libero at libero maximus posuere vel auctor nunc. Nulla facilisi. Fusce at tincidunt libero. Donec non mi eget leo semper tincidunt. Aliquam erat volutpat. Vestibulum ante ipsum primis in faucibus orci luctus et ultrices posuere cubilia Curae; Mauris ut ex vel enim ullamcorper malesuada. Morbi vel ligula vehicula, tincidunt purus et, egestas nulla. Vivamus pretium, dolor eget tincidunt tempus, ex eros mollis nisi, id volutpat nisl lectus in eros. In vel nunc ac ex auctor efficitur. Maecenas auctor, ex nec ultrices luctus, libero libero convallis elit, nec venenatis dolor arcu in libero. Nullam in nisi a libero varius vestibulum. Fusce auctor, libero at dignissim bibendum, nulla risus varius libero, vel tincidunt libero felis nec libero. Sed et sapien et mi volutpat egestas. Integer nec odio nec libero tempus malesuada. Nulla facilisi. Mauris in libero at libero semper fermentum. Pellentesque habitant morbi tristique senectus et netus et malesuada fames ac turpis egestas. In vel libero id libero lacinia vehicula.,ipsum dolor sit amet, consectetur adipiscing elit. Donec et lacus sit amet ligula laoreet dapibus. 
Nullam vel mauris nec leo auctor lacinia. In hac habitasse platea dictumst. Donec dapibus, nunc et condimentum facilisis, odio eros vehicula erat, ac egestas risus elit ac nulla. Sed lacinia, dui non sagittis pellentesque, sem metus aliquam tortor, non sodales nunc orci eget ipsum. Nunc varius, sapien in suscipit pharetra, libero turpis venenatis lectus, nec fermentum sapien lectus ut elit. Sed eget orci at leo lacinia interdum. Sed nec leo sed lacus semper posuere. In hac habitasse platea dictumst. Quisque in ante nec tortor lacinia tincidunt. Morbi auctor velit vitae urna feugiat, id tincidunt lacus fermentum. Proin eget ipsum nec purus tincidunt tincidunt. Nullam et sapien auctor, suscipit odio in, cursus arcu. In hac habitasse platea dictumst. Nam at quam vitae libero aliquet

APP 2024-02-13T15:09:27.282Z --> sent request | {"jsonrpc":"2.0","id":6,"result":{"isIncomplete":false,"items":[{"label":"ipsum dolor sit amet, consectetur adipiscing elit. Maecenas nec purus vel mi sodales blandit. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Etiam ut nisl at elit tincidunt tincidunt. Maecenas nec purus vel mi sodales blandit. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc","kind":1,"preselect":true,"detail":"ipsum dolor sit amet, consectetur adipiscing elit. Maecenas nec purus vel mi sodales blandit. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Etiam ut nisl at elit tincidunt tincidunt. Maecenas nec purus vel mi sodales blandit. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. 
Quisque eget leo a nunc","insertText":"ipsum dolor sit amet, consectetur adipiscing elit. Maecenas nec purus vel mi sodales blandit. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Etiam ut nisl at elit tincidunt tincidunt. Maecenas nec purus vel mi sodales blandit. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc tincidunt lacinia. Nulla facilisi. Nulla facilisi. Suspendisse potenti. Nulla facilisi. Nulla facilisi. Mauris vel odio nec sem tincidunt tristique. Nulla facilisi. Nulla facilisi. Quisque eget leo a nunc","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":0,"character":1696},"end":{"line":0,"character":200}}}]},{"label":"ipsum dolor sit amet, consectetur adipiscing elit. In sed urna vel massa molestie suscipit. Nullam sit amet mi in mauris efficitur vulputate. Nam fermentum, ipsum nec vehicula lacinia, odio dui dapibus leo, et condimentum enim tortor nec risus. Nulla facilisi. Duis quis eros ac mi tincidunt vestibulum. Nulla facilisi. Morbi in libero a elit suscipit eleifend. Aenean auctor, ipsum vel tincidunt vehicula, justo libero lacinia tortor, eget tempor magna purus id ex. Proin sodales, lectus id lobortis sollicitudin, ipsum arcu volutpat turpis, non ultricies elit ex eu tortor. Nam vehicula, erat at tincidunt tincidunt, nunc elit cursus mauris, eget luctus est odio sed nunc. Nullam et libero at libero maximus posuere vel auctor nunc. Nulla facilisi. Fusce at tincidunt libero. Donec non mi eget leo semper tincidunt. Aliquam erat volutpat. Vestibulum ante ipsum primis in faucibus orci luctus et ultrices posuere cubilia Curae; Mauris ut ex vel enim ullamcorper malesuada. Morbi vel ligula vehicula, tincidunt purus et, egestas nulla. Vivamus pretium, dolor eget tincidunt tempus, ex eros mollis nisi, id volutpat nisl lectus in eros. In vel nunc ac ex auctor efficitur. Maecenas auctor, ex nec ultrices luctus, libero libero convallis elit, nec venenatis dolor arcu in libero. Nullam in nisi a libero varius vestibulum. Fusce auctor, libero at dignissim bibendum, nulla risus varius libero, vel tincidunt libero felis nec libero. Sed et sapien et mi volutpat egestas. Integer nec odio nec libero tempus malesuada. Nulla facilisi. Mauris in libero at libero semper fermentum. Pellentesque habitant morbi tristique senectus et netus et malesuada fames ac turpis egestas. 
In vel libero id libero lacinia vehicula.","kind":1,"preselect":true,"detail":"ipsum dolor sit amet, consectetur adipiscing elit. In sed urna vel massa molestie suscipit. Nullam sit amet mi in mauris efficitur vulputate. Nam fermentum, ipsum nec vehicula lacinia, odio dui dapibus leo, et condimentum enim tortor nec risus. Nulla facilisi. Duis quis eros ac mi tincidunt vestibulum. Nulla facilisi. Morbi in libero a elit suscipit eleifend. Aenean auctor, ipsum vel tincidunt vehicula, justo libero lacinia tortor, eget tempor magna purus id ex. Proin sodales, lectus id lobortis sollicitudin, ipsum arcu volutpat turpis, non ultricies elit ex eu tortor. Nam vehicula, erat at tincidunt tincidunt, nunc elit cursus mauris, eget luctus est odio sed nunc. Nullam et libero at libero maximus posuere vel auctor nunc. Nulla facilisi. Fusce at tincidunt libero. Donec non mi eget leo semper tincidunt. Aliquam erat volutpat. Vestibulum ante ipsum primis in faucibus orci luctus et ultrices posuere cubilia Curae; Mauris ut ex vel enim ullamcorper malesuada. Morbi vel ligula vehicula, tincidunt purus et, egestas nulla. Vivamus pretium, dolor eget tincidunt tempus, ex eros mollis nisi, id volutpat nisl lectus in eros. In vel nunc ac ex auctor efficitur. Maecenas auctor, ex nec ultrices luctus, libero libero convallis elit, nec venenatis dolor arcu in libero. Nullam in nisi a libero varius vestibulum. Fusce auctor, libero at dignissim bibendum, nulla risus varius libero, vel tincidunt libero felis nec libero. Sed et sapien et mi volutpat egestas. Integer nec odio nec libero tempus malesuada. Nulla facilisi. Mauris in libero at libero semper fermentum. Pellentesque habitant morbi tristique senectus et netus et malesuada fames ac turpis egestas. In vel libero id libero lacinia vehicula.","insertText":"ipsum dolor sit amet, consectetur adipiscing elit. In sed urna vel massa molestie suscipit. Nullam sit amet mi in mauris efficitur vulputate. Nam fermentum, ipsum nec vehicula lacinia, odio dui dapibus leo, et condimentum enim tortor nec risus. Nulla facilisi. Duis quis eros ac mi tincidunt vestibulum. Nulla facilisi. Morbi in libero a elit suscipit eleifend. Aenean auctor, ipsum vel tincidunt vehicula, justo libero lacinia tortor, eget tempor magna purus id ex. Proin sodales, lectus id lobortis sollicitudin, ipsum arcu volutpat turpis, non ultricies elit ex eu tortor. Nam vehicula, erat at tincidunt tincidunt, nunc elit cursus mauris, eget luctus est odio sed nunc. Nullam et libero at libero maximus posuere vel auctor nunc. Nulla facilisi. Fusce at tincidunt libero. Donec non mi eget leo semper tincidunt. Aliquam erat volutpat. Vestibulum ante ipsum primis in faucibus orci luctus et ultrices posuere cubilia Curae; Mauris ut ex vel enim ullamcorper malesuada. Morbi vel ligula vehicula, tincidunt purus et, egestas nulla. Vivamus pretium, dolor eget tincidunt tempus, ex eros mollis nisi, id volutpat nisl lectus in eros. In vel nunc ac ex auctor efficitur. Maecenas auctor, ex nec ultrices luctus, libero libero convallis elit, nec venenatis dolor arcu in libero. Nullam in nisi a libero varius vestibulum. Fusce auctor, libero at dignissim bibendum, nulla risus varius libero, vel tincidunt libero felis nec libero. Sed et sapien et mi volutpat egestas. Integer nec odio nec libero tempus malesuada. Nulla facilisi. Mauris in libero at libero semper fermentum. Pellentesque habitant morbi tristique senectus et netus et malesuada fames ac turpis egestas. 
In vel libero id libero lacinia vehicula.","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":0,"character":1738},"end":{"line":0,"character":200}}}]},{"label":"ipsum dolor sit amet, consectetur adipiscing elit. Donec et lacus sit amet ligula laoreet dapibus. Nullam vel mauris nec leo auctor lacinia. In hac habitasse platea dictumst. Donec dapibus, nunc et condimentum facilisis, odio eros vehicula erat, ac egestas risus elit ac nulla. Sed lacinia, dui non sagittis pellentesque, sem metus aliquam tortor, non sodales nunc orci eget ipsum. Nunc varius, sapien in suscipit pharetra, libero turpis venenatis lectus, nec fermentum sapien lectus ut elit. Sed eget orci at leo lacinia interdum. Sed nec leo sed lacus semper posuere. In hac habitasse platea dictumst. Quisque in ante nec tortor lacinia tincidunt. Morbi auctor velit vitae urna feugiat, id tincidunt lacus fermentum. Proin eget ipsum nec purus tincidunt tincidunt. Nullam et sapien auctor, suscipit odio in, cursus arcu. In hac habitasse platea dictumst. Nam at quam vitae libero aliquet","kind":1,"preselect":true,"detail":"ipsum dolor sit amet, consectetur adipiscing elit. Donec et lacus sit amet ligula laoreet dapibus. Nullam vel mauris nec leo auctor lacinia. In hac habitasse platea dictumst. Donec dapibus, nunc et condimentum facilisis, odio eros vehicula erat, ac egestas risus elit ac nulla. Sed lacinia, dui non sagittis pellentesque, sem metus aliquam tortor, non sodales nunc orci eget ipsum. Nunc varius, sapien in suscipit pharetra, libero turpis venenatis lectus, nec fermentum sapien lectus ut elit. Sed eget orci at leo lacinia interdum. Sed nec leo sed lacus semper posuere. In hac habitasse platea dictumst. Quisque in ante nec tortor lacinia tincidunt. Morbi auctor velit vitae urna feugiat, id tincidunt lacus fermentum. Proin eget ipsum nec purus tincidunt tincidunt. Nullam et sapien auctor, suscipit odio in, cursus arcu. In hac habitasse platea dictumst. Nam at quam vitae libero aliquet","insertText":"ipsum dolor sit amet, consectetur adipiscing elit. Donec et lacus sit amet ligula laoreet dapibus. Nullam vel mauris nec leo auctor lacinia. In hac habitasse platea dictumst. Donec dapibus, nunc et condimentum facilisis, odio eros vehicula erat, ac egestas risus elit ac nulla. Sed lacinia, dui non sagittis pellentesque, sem metus aliquam tortor, non sodales nunc orci eget ipsum. Nunc varius, sapien in suscipit pharetra, libero turpis venenatis lectus, nec fermentum sapien lectus ut elit. Sed eget orci at leo lacinia interdum. Sed nec leo sed lacus semper posuere. In hac habitasse platea dictumst. Quisque in ante nec tortor lacinia tincidunt. Morbi auctor velit vitae urna feugiat, id tincidunt lacus fermentum. Proin eget ipsum nec purus tincidunt tincidunt. Nullam et sapien auctor, suscipit odio in, cursus arcu. In hac habitasse platea dictumst. Nam at quam vitae libero aliquet","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":0,"character":915},"end":{"line":0,"character":200}}}]}]}}

APP 2024-02-13T15:09:27.282Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///home/hendrik/tmp/test.ts","diagnostics":[]}}

helix logs
Console Output:

thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: Invalid char range 1786..200: start must be <= end', /home/hendrik/.cargo/registry/src/index.crates.io-6f17d22bba15001f/ropey-1.6.1/src/rope.rs:546:37
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

Log: see the linked issue, as the character limit for a GitHub issue is reached.
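The panic lines up with the ranges in the completion response above: each additionalTextEdits entry has a start character (1696, 1738, 915) larger than its end character (200), and once Helix maps those positions to char indices, ropey rejects a range whose start exceeds its end. A minimal TypeScript illustration of the malformed edit and an obvious server-side sanity check (not the project's code):

// Shape of one edit from the log above.
const edit = {
  newText: "",
  range: { start: { line: 0, character: 1696 }, end: { line: 0, character: 200 } },
}

// Swap start and end if they arrive inverted, so the client never sees start > end.
type Pos = { line: number; character: number }
function normaliseRange(range: { start: Pos; end: Pos }): { start: Pos; end: Pos } {
  const { start, end } = range
  const ordered =
    start.line < end.line || (start.line === end.line && start.character <= end.character)
  return ordered ? range : { start: end, end: start }
}

const fixed = normaliseRange(edit.range) // start becomes character 200, end becomes 1696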

[BUG] [ERROR] gpt err

helix-editor version
helix 23.10 (2661e05b)

helix-gpt version
version 0.21

Describe the bug
Not seeing any actions in Helix and not getting any code completion in typescript files

helix-gpt logs
Couldn't find any logs even though I have it enabled like so

[language-server.gpt]
command = "helix-gpt"
config = {}
args = ["--handler", "copilot", "--copilotApiKey", "<copilot_key>", "--logFile"]

helix logs

2024-01-29T22:21:05.071 helix_lsp::transport [INFO] gpt -> {"jsonrpc":"2.0","method":"initialize","params":{"capabilities":{"general":{"positionEncodings":["utf-8","utf-32","utf-16"]},"textDocument":{"codeAction":{"codeActionLiteralSupport":{"codeActionKind":{"valueSet":["","quickfix","refactor","refactor.extract","refactor.inline","refactor.rewrite","source","source.organizeImports"]}},"dataSupport":true,"disabledSupport":true,"isPreferredSupport":true,"resolveSupport":{"properties":["edit","command"]}},"completion":{"completionItem":{"deprecatedSupport":true,"insertReplaceSupport":true,"resolveSupport":{"properties":["documentation","detail","additionalTextEdits"]},"snippetSupport":true,"tagSupport":{"valueSet":[1]}},"completionItemKind":{}},"hover":{"contentFormat":["markdown"]},"inlayHint":{"dynamicRegistration":false},"publishDiagnostics":{"versionSupport":true},"rename":{"dynamicRegistration":false,"honorsChangeAnnotations":false,"prepareSupport":true},"signatureHelp":{"signatureInformation":{"activeParameterSupport":true,"documentationFormat":["markdown"],"parameterInformation":{"labelOffsetSupport":true}}}},"window":{"workDoneProgress":true},"workspace":{"applyEdit":true,"configuration":true,"didChangeConfiguration":{"dynamicRegistration":false},"didChangeWatchedFiles":{"dynamicRegistration":true,"relativePatternSupport":false},"executeCommand":{"dynamicRegistration":false},"fileOperations":{"didRename":true,"willRename":true},"inlayHint":{"refreshSupport":false},"symbol":{"dynamicRegistration":false},"workspaceEdit":{"documentChanges":true,"failureHandling":"abort","normalizesLineEndings":false,"resourceOperations":["create","rename","delete"]},"workspaceFolders":true}},"clientInfo":{"name":"helix","version":"23.10 (2661e05b)"},"initializationOptions":{},"processId":1706339,"rootPath":"/home/xxx/development/xxx-003","rootUri":"file:///home/xxx/development/xxx-003","workspaceFolders":[{"name":"xxx-003","uri":"file:///home/xxx/development/xxx-003"}]},"id":0}
2024-01-29T22:21:05.118 helix_lsp::transport [ERROR] gpt err <- "46 |     query: \"Write a unit test for this code. Do not include any imports.\"\n"
2024-01-29T22:21:05.118 helix_lsp::transport [ERROR] gpt err <- "47 |   }\n"
2024-01-29T22:21:05.118 helix_lsp::transport [ERROR] gpt err <- "48 | ];\n"
2024-01-29T22:21:05.118 helix_lsp::transport [ERROR] gpt err <- "49 | \n"
2024-01-29T22:21:05.118 helix_lsp::transport [ERROR] gpt err <- "50 | // src/config.ts\n"
2024-01-29T22:21:05.118 helix_lsp::transport [ERROR] gpt err <- "51 | var { values } = parseArgs({\n"
2024-01-29T22:21:05.118 helix_lsp::transport [ERROR] gpt err <- "                      ^\n"
2024-01-29T22:21:05.118 helix_lsp::transport [ERROR] gpt err <- "TypeError: Option '--logFile <value>' argument missing\n"
2024-01-29T22:21:05.118 helix_lsp::transport [ERROR] gpt err <- " code: \"ERR_PARSE_ARGS_INVALID_OPTION_VALUE\"\n"
2024-01-29T22:21:05.118 helix_lsp::transport [ERROR] gpt err <- "\n"
2024-01-29T22:21:05.118 helix_lsp::transport [ERROR] gpt err <- "      at /$bunfs/root/helix-gpt:51:18\n"
2024-01-29T22:21:05.121 helix_lsp [ERROR] failed to initialize language server: server closed the stream
2024-01-29T22:21:05.121 helix_lsp::transport [ERROR] gpt err: <- StreamClosed

Support for codeium

Is your feature request related to a problem? Please describe.
Codeium is an alternative code-specific completion engine that is free for individuals and supposedly better than Copilot.
https://codeium.com/

Describe the solution you'd like
Having an alternative handler codeium would be amazing

Describe alternatives you've considered
Not supporting it

Additional context

Let me know if you need more details.

Thanks a lot for making this !

Not getting completions

Hey this is amazing, I just can't get it to work. I'm a helix noob tho so please excuse my incompetence. This is what I did:

  • downloaded the js file and put it in ~/bin/helix-gpt.js
  • ran the script with --authCopilot and retrieved my key which i then put into a file: ~/bin/.env
  • created a ~/.config/helix/languages.toml file and added the code below
  • opened a .ts file in helix
  • pressed ctrl+x to open suggestions (is this how you're supposed to do this?)
  • and no copilot suggestions were shown

When I run bun run ~/bin/helix-gpt.js, a process runs without errors and with no output.

I also tried running :lsp-workspace-commands in Helix to see if it's running, but I only see this:

Screenshot 2024-01-26 at 15 15 03

Here are my files:

~/.config/helix/languages.toml

[language-server.gpt]
command = "bun"
args = ["run", "~/bin/helix-gpt.js --handler "]

[language-server.ts]
command = "typescript-language-server"
args = ["--stdio"]
language-id = "javascript"

[language-server.efm-lsp-prettier]
command = "efm-langserver"

[language-server.efm-lsp-prettier.config]
documentFormatting = true
languages = { typescript = [ { formatCommand ="prettier --stdin-filepath ${INPUT}", formatStdin = true } ] }

[[language]]
name = "typescript"
language-servers = [
    { name = "efm-lsp-prettier", only-features = [ "format" ] },
    "ts",
    "gpt"
]

~/.config/helix/config.toml

theme = "amberwood"

[editor]
line-number = "relative"
auto-save = true
color-modes = true

[editor.cursor-shape]
insert = "bar"
normal = "block"
select = "underline"

[editor.file-picker]
hidden = true

[keys.normal]
"ret" = "expand_selection"
"backspace" = "shrink_selection"

~/bin/.env

COPILOT_API_KEY=...
HANDLER=copilot

What am I missing here? Do you know any good ways to check whether an LSP is connected and running, or to test the LSP directly?

Thanks in advance!

[BUG] Getting this error

helix-editor version
helix 23.10 (6f941978)

helix-gpt version
0.27

Describe the bug
image

helix-gpt logs
Helix GPT output logs. Pass --logFile /app/helix-gpt.log to save them.

helix logs
image

[BUG] Looks like openai is using gpt-4 even though the config shows 3.5-turbo?

helix-editor version
helix 24.3 (b974716b)

helix-gpt version
helix-gpt-0.31-x86_64-linux

Describe the bug
I have the following config:

[language-server.gpt]
command = "helix-gpt"
args = ["--handler", "openai", "--openaiKey", "123", "--openaiModel", "gpt-3.5-turbo-16k",  "--logFile", "/tmp/helix-gpt.log"]

On the OpenAI usage dashboard, I'm noticing my gpt-4 usage increasing as I use helix.

helix-gpt logs
The only possibly relevant logs are as follows:

APP 2024-04-11T21:10:18.821Z --> fetch | /v1/chat/completions

APP 2024-04-11T21:10:24.465Z --> response | https://api.openai.com/v1/chat/completions | 200

helix logs
No relevant helix logs

Does helix-gpt default to gpt-4 for certain actions? I'm mainly testing documentation generation.

Allow fully customizing system and user message

The system-message can be partially customized using options like --ollamaContext. However, a portion of the system-message and all of the user-message are hard-coded (see https://github.com/leona/helix-gpt/blob/master/src/providers/ollama.ts#L68). The hard-coded messages work poorly for some models and use cases.

How about, instead of,

config.ollamaContext?.replace("<languageId>", languageId) + "\n\n" + `End of file context:\n\n${contents.contentAfter}`

for the system-message, and,

`Start of file context:\n\n${contents.contentBefore}`

for the user-message, something like

config.ollamaSystemMessage?.replace("<languageId>", languageId).replace("<contentBefore>", contents.contentBefore).replace("<contentAfter>", contents.contentAfter)

for the system-message, and

config.ollamaUserMessage?.replace("<languageId>", languageId).replace("<contentBefore>", contents.contentBefore).replace("<contentAfter>", contents.contentAfter)

for the user-message. The default messages can still be the same, and users can still create the same messages as before, but this allows greater customization.
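
A minimal sketch of what that could look like inside the provider (the option names ollamaSystemMessage and ollamaUserMessage are hypothetical and do not exist today; languageId, contents, and config would come from the existing provider code in src/providers/ollama.ts):

// hypothetical helper next to the existing message construction
const fillTemplate = (template: string | undefined, languageId: string, contents: { contentBefore: string, contentAfter: string }) =>
  template
    ?.replace("<languageId>", languageId)
    .replace("<contentBefore>", contents.contentBefore)
    .replace("<contentAfter>", contents.contentAfter)

// usage inside the provider:
const systemMessage = fillTemplate(config.ollamaSystemMessage, languageId, contents)
const userMessage = fillTemplate(config.ollamaUserMessage, languageId, contents)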

P.S. It would be nice to similarly customize chat-messages. However, that may be better handled by fully customizing actions.

Unable to complete the final patch operation

helix-editor version
helix 23.10 (f6021dd0)

helix-gpt version
0.21

Describe the bug
helix-gpt can successfully get a reply from the copilot server, but cannot send the patched code to helix

helix-gpt logs

APP 2024-01-30T03:48:47.476Z --> received request: | {"jsonrpc":"2.0","method":"initialize","params":{"capabilities":{"general":{"positionEncodings":["utf-8","utf-32","utf-16"]},"textDocument":{"codeAction":{"codeActionLiteralSupport":{"codeActionKind":{"valueSet":["","quickfix","refactor","refactor.extract","refactor.inline","refactor.rewrite","source","source.organizeImports"]}},"dataSupport":true,"disabledSupport":true,"isPreferredSupport":true,"resolveSupport":{"properties":["edit","command"]}},"completion":{"completionItem":{"deprecatedSupport":true,"insertReplaceSupport":true,"resolveSupport":{"properties":["documentation","detail","additionalTextEdits"]},"snippetSupport":true,"tagSupport":{"valueSet":[1]}},"completionItemKind":{}},"hover":{"contentFormat":["markdown"]},"inlayHint":{"dynamicRegistration":false},"publishDiagnostics":{"versionSupport":true},"rename":{"dynamicRegistration":false,"honorsChangeAnnotations":false,"prepareSupport":true},"signatureHelp":{"signatureInformation":{"activeParameterSupport":true,"documentationFormat":["markdown"],"parameterInformation":{"labelOffsetSupport":true}}}},"window":{"workDoneProgress":true},"workspace":{"applyEdit":true,"configuration":true,"didChangeConfiguration":{"dynamicRegistration":false},"didChangeWatchedFiles":{"dynamicRegistration":true,"relativePatternSupport":false},"executeCommand":{"dynamicRegistration":false},"inlayHint":{"refreshSupport":false},"symbol":{"dynamicRegistration":false},"workspaceEdit":{"documentChanges":true,"failureHandling":"abort","normalizesLineEndings":false,"resourceOperations":["create","rename","delete"]},"workspaceFolders":true}},"clientInfo":{"name":"helix","version":"23.10 (f6021dd0)"},"processId":1553,"rootPath":"/Users/ls/rust/gpt","rootUri":"file:///Users/ls/rust/gpt","workspaceFolders":[{"name":"gpt","uri":"file:///Users/ls/rust/gpt"}]},"id":0}

APP 2024-01-30T03:48:47.476Z --> sent request | {"jsonrpc":"2.0","method":"initialize","id":0,"result":{"capabilities":{"codeActionProvider":true,"executeCommandProvider":{"commands":["generateDocs","improveCode","refactorFromComment","writeTest"]},"completionProvider":{"resolveProvider":false,"triggerCharacters":["{","("," ","."]},"textDocumentSync":{"change":1,"openClose":true}}}}

APP 2024-01-30T03:48:47.477Z --> received request: | {"jsonrpc":"2.0","method":"initialized","params":{}}

APP 2024-01-30T03:48:47.477Z --> received didOpen | language: rust

APP 2024-01-30T03:48:52.870Z --> received didChange | language: rust | contentVersion: 1 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:48:53.593Z --> received didChange | language: rust | contentVersion: 2 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:48:54.028Z --> received didChange | language: rust | contentVersion: 3 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:48:54.186Z --> received didChange | language: rust | contentVersion: 4 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:48:54.189Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":2,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":1}

APP 2024-01-30T03:48:54.516Z --> received didChange | language: rust | contentVersion: 5 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:48:54.516Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":3,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":2}

APP 2024-01-30T03:48:54.926Z --> received didChange | language: rust | contentVersion: 6 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:48:55.022Z --> running completion on buffer | {"uri":"file:///Users/ls/rust/gpt/src/main.rs","text":"fn main() {\n    println!(\"Hello, world!\");\n}\n\nfn p\n","languageId":"rust","version":6}

APP 2024-01-30T03:48:55.022Z --> skipping because content is stale

APP 2024-01-30T03:48:55.022Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[]}}

APP 2024-01-30T03:48:55.022Z --> sent request | {"jsonrpc":"2.0","id":2,"result":{"isIncomplete":false,"items":[]}}

APP 2024-01-30T03:48:55.071Z --> received didChange | language: rust | contentVersion: 7 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:48:55.072Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":5,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":3}

APP 2024-01-30T03:48:55.275Z --> received didChange | language: rust | contentVersion: 8 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:48:55.276Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":6,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":4}

APP 2024-01-30T03:48:55.383Z --> received didChange | language: rust | contentVersion: 9 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:48:55.386Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":7,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":5}

APP 2024-01-30T03:48:55.510Z --> received didChange | language: rust | contentVersion: 10 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:48:55.510Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":8,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":6}

APP 2024-01-30T03:48:56.012Z --> running completion on buffer | {"uri":"file:///Users/ls/rust/gpt/src/main.rs","text":"fn main() {\n    println!(\"Hello, world!\");\n}\n\nfn print\n","languageId":"rust","version":10}

APP 2024-01-30T03:48:56.017Z --> calling completion event

APP 2024-01-30T03:48:56.017Z --> sending diagnostics | [{"message":"Fetching completion...","severity":3,"range":{"start":{"line":4,"character":0},"end":{"line":5,"character":0}}}]

APP 2024-01-30T03:48:56.017Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[{"message":"Fetching completion...","severity":3,"range":{"start":{"line":4,"character":0},"end":{"line":5,"character":0}},"source":"helix-gpt"}]}}

APP 2024-01-30T03:48:56.019Z --> copilot | completion request

APP 2024-01-30T03:48:56.021Z --> fetch | /copilot_internal/v2/token

APP 2024-01-30T03:48:57.007Z --> response | https://api.github.com/copilot_internal/v2/token | 200

APP 2024-01-30T03:48:57.008Z --> fetch | /v1/engines/copilot-codex/completions

APP 2024-01-30T03:48:58.397Z --> response | https://copilot-proxy.githubusercontent.com/v1/engines/copilot-codex/completions | 200

APP 2024-01-30T03:48:58.577Z --> completion hints: | _usage(program: &str, opts: Options) {
    let brief = format!("Usage: {} [options]", program);
    print!("{}", opts.usage(&brief));
}
,_usage(program: &str, opts: &Options) {
    let brief = format!("Usage: {} [options]", program);
    print!("{}", opts.usage(&brief));
},_type_of<T>(_: &T) {
    println!("{}", std::any::type_name::<T>())
}

APP 2024-01-30T03:48:58.578Z --> sent request | {"jsonrpc":"2.0","id":6,"result":{"isIncomplete":true,"items":[{"label":"_usage(program: &str, opts: Options) {","kind":1,"preselect":true,"detail":"_usage(program: &str, opts: Options) {\n    let brief = format!(\"Usage: {} [options]\", program);\n    print!(\"{}\", opts.usage(&brief));\n}\n","insertText":"_usage(program: &str, opts: Options) {\n    let brief = format!(\"Usage: {} [options]\", program);\n    print!(\"{}\", opts.usage(&brief));\n}\n","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":8,"character":0},"end":{"line":8,"character":200}}}]},{"label":"_usage(program: &str, opts: &Options) {","kind":1,"preselect":true,"detail":"_usage(program: &str, opts: &Options) {\n    let brief = format!(\"Usage: {} [options]\", program);\n    print!(\"{}\", opts.usage(&brief));\n}","insertText":"_usage(program: &str, opts: &Options) {\n    let brief = format!(\"Usage: {} [options]\", program);\n    print!(\"{}\", opts.usage(&brief));\n}","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":7,"character":1},"end":{"line":7,"character":200}}}]},{"label":"_type_of<T>(_: &T) {","kind":1,"preselect":true,"detail":"_type_of<T>(_: &T) {\n    println!(\"{}\", std::any::type_name::<T>())\n}","insertText":"_type_of<T>(_: &T) {\n    println!(\"{}\", std::any::type_name::<T>())\n}","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":6,"character":1},"end":{"line":6,"character":200}}}]}]}}

APP 2024-01-30T03:48:58.579Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[]}}

APP 2024-01-30T03:48:58.925Z --> received didChange | language: rust | contentVersion: 11 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:48:58.926Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":9,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":7}

APP 2024-01-30T03:48:59.427Z --> running completion on buffer | {"uri":"file:///Users/ls/rust/gpt/src/main.rs","text":"fn main() {\n    println!(\"Hello, world!\");\n}\n\nfn print_\n","languageId":"rust","version":11}

APP 2024-01-30T03:48:59.429Z --> calling completion event

APP 2024-01-30T03:48:59.429Z --> sending diagnostics | [{"message":"Fetching completion...","severity":3,"range":{"start":{"line":4,"character":0},"end":{"line":5,"character":0}}}]

APP 2024-01-30T03:48:59.430Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[{"message":"Fetching completion...","severity":3,"range":{"start":{"line":4,"character":0},"end":{"line":5,"character":0}},"source":"helix-gpt"}]}}

APP 2024-01-30T03:48:59.430Z --> copilot | completion request

APP 2024-01-30T03:48:59.430Z --> fetch | /v1/engines/copilot-codex/completions

APP 2024-01-30T03:48:59.996Z --> response | https://copilot-proxy.githubusercontent.com/v1/engines/copilot-codex/completions | 200

APP 2024-01-30T03:49:00.170Z --> completion hints: | usage(program: &str, opts: Options) {
    let brief = format!("Usage: {} [options]", program);
    print!("{}", opts.usage(&brief));
},usage(program: &str, opts: Options) {
    let brief = format!("Usage: {} [options] FILE", program);
    print!("{}", opts.usage(&brief));
}
,usage(program: &str, opts: Options) {
    let brief = format!("Usage: {} FILE [options]", program);
    print!("{}", opts.usage(&brief));
}


APP 2024-01-30T03:49:00.170Z --> sent request | {"jsonrpc":"2.0","id":7,"result":{"isIncomplete":true,"items":[{"label":"usage(program: &str, opts: Options) {","kind":1,"preselect":true,"detail":"usage(program: &str, opts: Options) {\n    let brief = format!(\"Usage: {} [options]\", program);\n    print!(\"{}\", opts.usage(&brief));\n}","insertText":"usage(program: &str, opts: Options) {\n    let brief = format!(\"Usage: {} [options]\", program);\n    print!(\"{}\", opts.usage(&brief));\n}","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":7,"character":1},"end":{"line":7,"character":200}}}]},{"label":"usage(program: &str, opts: Options) {","kind":1,"preselect":true,"detail":"usage(program: &str, opts: Options) {\n    let brief = format!(\"Usage: {} [options] FILE\", program);\n    print!(\"{}\", opts.usage(&brief));\n}\n","insertText":"usage(program: &str, opts: Options) {\n    let brief = format!(\"Usage: {} [options] FILE\", program);\n    print!(\"{}\", opts.usage(&brief));\n}\n","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":8,"character":0},"end":{"line":8,"character":200}}}]},{"label":"usage(program: &str, opts: Options) {","kind":1,"preselect":true,"detail":"usage(program: &str, opts: Options) {\n    let brief = format!(\"Usage: {} FILE [options]\", program);\n    print!(\"{}\", opts.usage(&brief));\n}\n","insertText":"usage(program: &str, opts: Options) {\n    let brief = format!(\"Usage: {} FILE [options]\", program);\n    print!(\"{}\", opts.usage(&brief));\n}\n","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":8,"character":0},"end":{"line":8,"character":200}}}]}]}}

APP 2024-01-30T03:49:00.171Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[]}}

APP 2024-01-30T03:49:00.325Z --> received didChange | language: rust | contentVersion: 12 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:49:00.326Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":10,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":8}

APP 2024-01-30T03:49:00.440Z --> received didChange | language: rust | contentVersion: 13 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:49:00.440Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":11,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":9}

APP 2024-01-30T03:49:00.759Z --> received didChange | language: rust | contentVersion: 14 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:49:00.761Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":12,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":10}

APP 2024-01-30T03:49:01.262Z --> running completion on buffer | {"uri":"file:///Users/ls/rust/gpt/src/main.rs","text":"fn main() {\n    println!(\"Hello, world!\");\n}\n\nfn print_cat\n","languageId":"rust","version":14}

APP 2024-01-30T03:49:01.266Z --> calling completion event

APP 2024-01-30T03:49:01.266Z --> sending diagnostics | [{"message":"Fetching completion...","severity":3,"range":{"start":{"line":4,"character":0},"end":{"line":5,"character":0}}}]

APP 2024-01-30T03:49:01.266Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[{"message":"Fetching completion...","severity":3,"range":{"start":{"line":4,"character":0},"end":{"line":5,"character":0}},"source":"helix-gpt"}]}}

APP 2024-01-30T03:49:01.266Z --> copilot | completion request

APP 2024-01-30T03:49:01.267Z --> fetch | /v1/engines/copilot-codex/completions

APP 2024-01-30T03:49:01.877Z --> received didChange | language: rust | contentVersion: 15 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:49:01.877Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":13,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":11}

APP 2024-01-30T03:49:01.958Z --> response | https://copilot-proxy.githubusercontent.com/v1/engines/copilot-codex/completions | 200

APP 2024-01-30T03:49:01.965Z --> completion hints: | () {
    let cat = "cat";
    println!("Hello, {}!", cat);
},() {
    println!("cat");
},() {
    println!("       _._     _,-'\"\"`-._");
    println!("      (,-.`._,'(       |\\`-/|");
    println!("          `-.-' \\ )-`( , o o)");
    println!("                `-    \\`_`\"'-");
}

APP 2024-01-30T03:49:01.965Z --> sent request | {"jsonrpc":"2.0","id":10,"result":{"isIncomplete":true,"items":[{"label":"() {","kind":1,"preselect":true,"detail":"() {\n    let cat = \"cat\";\n    println!(\"Hello, {}!\", cat);\n}","insertText":"() {\n    let cat = \"cat\";\n    println!(\"Hello, {}!\", cat);\n}","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":7,"character":1},"end":{"line":7,"character":200}}}]},{"label":"() {","kind":1,"preselect":true,"detail":"() {\n    println!(\"cat\");\n}","insertText":"() {\n    println!(\"cat\");\n}","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":6,"character":1},"end":{"line":6,"character":200}}}]},{"label":"() {","kind":1,"preselect":true,"detail":"() {\n    println!(\"       _._     _,-'\\\"\\\"`-._\");\n    println!(\"      (,-.`._,'(       |\\\\`-/|\");\n    println!(\"          `-.-' \\\\ )-`( , o o)\");\n    println!(\"                `-    \\\\`_`\\\"'-\");\n}\n","insertText":"() {\n    println!(\"       _._     _,-'\\\"\\\"`-._\");\n    println!(\"      (,-.`._,'(       |\\\\`-/|\");\n    println!(\"          `-.-' \\\\ )-`( , o o)\");\n    println!(\"                `-    \\\\`_`\\\"'-\");\n}\n","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":10,"character":3},"end":{"line":10,"character":200}}}]}]}}

APP 2024-01-30T03:49:01.966Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[]}}

APP 2024-01-30T03:49:02.382Z --> running completion on buffer | {"uri":"file:///Users/ls/rust/gpt/src/main.rs","text":"fn main() {\n    println!(\"Hello, world!\");\n}\n\nfn print_cat()\n","languageId":"rust","version":15}

APP 2024-01-30T03:49:02.383Z --> calling completion event

APP 2024-01-30T03:49:02.383Z --> sending diagnostics | [{"message":"Fetching completion...","severity":3,"range":{"start":{"line":4,"character":0},"end":{"line":5,"character":0}}}]

APP 2024-01-30T03:49:02.383Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[{"message":"Fetching completion...","severity":3,"range":{"start":{"line":4,"character":0},"end":{"line":5,"character":0}},"source":"helix-gpt"}]}}

APP 2024-01-30T03:49:02.383Z --> copilot | completion request

APP 2024-01-30T03:49:02.384Z --> fetch | /v1/engines/copilot-codex/completions

APP 2024-01-30T03:49:02.962Z --> response | https://copilot-proxy.githubusercontent.com/v1/engines/copilot-codex/completions | 200

APP 2024-01-30T03:49:02.966Z --> completion hints: | ) {
    println!("        /\\_/\\");
    println!("   ____/ o o \\");
    println!(" /~____  =รธ= /");
    println!("(______)__m_m)");
}
,cat: &Cat) {
    println!("{} says {}", cat.name, cat.sound);
}

APP 2024-01-30T03:49:02.967Z --> sent request | {"jsonrpc":"2.0","id":11,"result":{"isIncomplete":true,"items":[{"label":") {","kind":1,"preselect":true,"detail":") {\n    println!(\"        /\\\\_/\\\\\");\n    println!(\"   ____/ o o \\\\\");\n    println!(\" /~____  =รธ= /\");\n    println!(\"(______)__m_m)\");\n}\n","insertText":") {\n    println!(\"        /\\\\_/\\\\\");\n    println!(\"   ____/ o o \\\\\");\n    println!(\" /~____  =รธ= /\");\n    println!(\"(______)__m_m)\");\n}\n","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":10,"character":0},"end":{"line":10,"character":200}}}]},{"label":"cat: &Cat) {","kind":1,"preselect":true,"detail":"cat: &Cat) {\n    println!(\"{} says {}\", cat.name, cat.sound);\n}","insertText":"cat: &Cat) {\n    println!(\"{} says {}\", cat.name, cat.sound);\n}","insertTextFormat":1,"additionalTextEdits":[{"newText":"","range":{"start":{"line":6,"character":1},"end":{"line":6,"character":200}}}]}]}}

APP 2024-01-30T03:49:02.967Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[]}}

APP 2024-01-30T03:49:03.470Z --> received didChange | language: rust | contentVersion: 16 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:49:03.471Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":15,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":12}

APP 2024-01-30T03:49:03.791Z --> received didChange | language: rust | contentVersion: 17 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:49:03.792Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/completion","params":{"position":{"character":16,"line":4},"textDocument":{"uri":"file:///Users/ls/rust/gpt/src/main.rs"}},"id":13}

APP 2024-01-30T03:49:04.142Z --> received didChange | language: rust | contentVersion: 18 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:49:04.296Z --> running completion on buffer | {"uri":"file:///Users/ls/rust/gpt/src/main.rs","text":"fn main() {\n    println!(\"Hello, world!\");\n}\n\nfn print_cat() {\n    \n}\n","languageId":"rust","version":18}

APP 2024-01-30T03:49:04.297Z --> skipping because content is stale

APP 2024-01-30T03:49:04.298Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[]}}

APP 2024-01-30T03:49:04.298Z --> sent request | {"jsonrpc":"2.0","id":13,"result":{"isIncomplete":false,"items":[]}}

APP 2024-01-30T03:49:06.022Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[]}}

APP 2024-01-30T03:49:09.431Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[]}}

APP 2024-01-30T03:49:11.270Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[]}}

APP 2024-01-30T03:49:12.384Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///Users/ls/rust/gpt/src/main.rs","diagnostics":[]}}

APP 2024-01-30T03:49:12.412Z --> received didChange | language: rust | contentVersion: 19 | uri: file:///Users/ls/rust/gpt/src/main.rs

APP 2024-01-30T03:49:52.649Z --> received request: | {"jsonrpc":"2.0","method":"shutdown","id":14}

APP 2024-01-30T03:49:52.649Z --> received shutdown request

helix logs

2024-01-30T11:49:01.965 helix_lsp::transport [ERROR] Tried sending response into a closed channel (id=Num(10)), original request likely timed out
2024-01-30T11:49:02.967 helix_lsp::transport [ERROR] gpt err: <- Parse(Error("EOF while parsing an object", line: 1, column: 935))
2024-01-30T11:49:52.650 helix_lsp::transport [ERROR] gpt err: <- StreamClosed
2024-01-30T11:49:52.732 helix_lsp::transport [ERROR] rust-analyzer err: <- StreamClosed
2024-01-30T11:49:55.650 helix_term::application [ERROR] Timed out waiting for language servers to shutdown

Support for Claude API

Problem:
Project lacks support for Claude API.

Solution:
Add support for Claude API

Alternatives:
None

Additional Context:
Claude 3's Opus model is state of the art, and considered by many to be even better than GPT-4.

Consider changing an environment var to `OPENAI_API_KEY`

Hi, thanks for this project!

Please note that the two best and most used CLI apps for ChatGPT, aichat and shell_gpt, use the following to export the API key:

OPENAI_API_KEY=

rather than your:

OPENAI_KEY=123 # required

Changing to the more commonly accepted name would make things easier, as many users will already be exporting their key, and having to make another, identical export would be annoying. Cheers!
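
A minimal sketch of the fallback, assuming the key is read via parseArgs / process.env as in src/config.ts (values.openaiKey here is just shorthand for however the existing --openaiKey flag is read):

// prefer the current OPENAI_KEY, fall back to the more common OPENAI_API_KEY
const openaiKey = values.openaiKey ?? process.env.OPENAI_KEY ?? process.env.OPENAI_API_KEY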

https://github.com/sigoden/aichat/wiki/Environment-Variables
https://github.com/TheR1D/shell_gpt#runtime-configuration-file

EDIT: You might also want to go with sgpt's DEFAULT_MODEL?

Default to higher timeout ?

Is your feature request related to a problem? Please describe.
For the copilot handler, I'm hitting a lot of timeouts. Is there a way to customize the timeout?

Describe the solution you'd like
I'm sure it's not without tradeoffs, but ideally there should be a way to run actions without a timeout when they are triggered explicitly by the user (like code actions). For everything that is triggered automatically (completions), I think it makes sense to keep a timeout.
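
Something like the following is what I have in mind, purely as a sketch (the env var name and the user-triggered distinction are my own, not existing behaviour):

// apply a timeout only to automatic requests (completions); let explicit user
// actions (code actions) run without one
const timeoutMs = Number(process.env.REQUEST_TIMEOUT ?? 10000)

const fetchWithTimeout = (url: string, init: RequestInit, userTriggered: boolean) =>
  fetch(url, { ...init, signal: userTriggered ? undefined : AbortSignal.timeout(timeoutMs) })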

Describe alternatives you've considered
You could customize the timeout with an env var, but I don't think that's a great solution: you might hit the API too much or not enough, depending on whether the timeout is too short or too long.

Additional context
Thanks again for making this!

Add a shebang to the bundled file?

I expected to be able to just execute the downloaded JavaScript file with bun installed on my system, but since there is no shebang, the system doesn't know how to execute it. I would suggest adding the following shebang to the top of the file:

#!/usr/bin/env bun
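
With that shebang in place, the file could then be made executable and, presumably, referenced directly in languages.toml instead of going through bun run (paths are illustrative):

chmod +x ~/bin/helix-gpt.js

[language-server.gpt]
command = "/home/<user>/bin/helix-gpt.js"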

[BUG] triggering github's abuse systems

helix-editor version
24.3

helix-gpt version
0.31

Describe the bug
Turns out GitHub just emailed me!

Hello @lun-4,

On behalf of the GitHub Security team, I want to first extend our gratitude for your continued use of GitHub and for being a valued member of the GitHub community.

Recent activity on your account caught the attention of our abuse-detection systems. This activity included use of Copilot that was indicative of scripted interactions or of an otherwise deliberately unusual or strenuous nature. While we have not yet restricted Copilot access for your account, further anomalous activity could result in a temporary suspension of your Copilot access.

While I'm unable to share specifics on rate limits, we prohibit all use of our servers for any form of excessive automated bulk activity, as well as any activity that places undue burden on our servers through automated means. Please refer to our Acceptable Use Policies on this topic: https://docs.github.com/site-policy/acceptable-use-policies/github-acceptable-use-policies#4-spam-and-inauthentic-activity-on-github.

Please also refer to our Terms for Additional Products and Features for GitHub Copilot for specific terms: https://docs.github.com/site-policy/github-terms/github-terms-for-additional-products-and-features#github-copilot.

Sincerely,
GitHub Security


I do not use editors other than Helix, so there is no extension other than helix-gpt that could be triggering GitHub's abuse systems for Copilot. I am not automating Helix in any way, but it's possible they ran some detection on the headers that helix-gpt uses to look like VS Code. The Copilot VS Code extension doesn't appear to be under any kind of permissive license that would allow reverse engineering, so I don't think an official GH ticket would give any results (it might even backfire and take the repo down, but that's the worst case). I am open to ideas, though.

My current opinion is that if GH is going to play cat-and-mouse, then we should act accordingly as the mouse: move towards continuously updating fingerprints so that we look more and more like VS Code. That is a major maintenance burden that I do not think I would be able to take on myself, but it is a way. (And that is not even talking about possibly having to move the repo somewhere else, which would kill the social factor of having this plugin be known at all. Maybe a migration to Codeberg could work, but that is another major undertaking, and one I would put a lot of thought into before acting on a random person's request.)

Allow use of a simple config file for helix-gpt

Is your feature request related to a problem? Please describe.
I would like to use helix-gpt with paid LLM API providers, like OpenAI, but I can't really specify the API key inside my helix languages.toml config since I use NixOS, which has the languages.toml config specified in plaintext.

Describe the solution you'd like
A CLI flag, --config, which takes the path of a file in which I can specify my API keys. This could just be like a .env file. To me, this doesn't seem like much effort to add (I'll try to add a small PR to illustrate).
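
To illustrate, the file could look just like the .env-style values people already export today (the --config flag and the file location are only proposals here):

# ~/.config/helix-gpt/config
HANDLER=openai
OPENAI_KEY=<your key>

# languages.toml
[language-server.gpt]
command = "helix-gpt"
args = ["--config", "/home/<user>/.config/helix-gpt/config"]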

Describe alternatives you've considered
For me specifically, I could use sops-nix; however, this means I'd have to spend a bunch of time integrating it into my config.
More generally, having a config file for helix-gpt means even non-NixOS users can benefit, since they won't have to worry about exposing the API key to every program run through their shell, and they can also create dedicated API keys for helix-gpt, which allows easy monitoring of their credentials.

Additional Comments
The config file doesn't necessarily need to be just for API keys: since the OpenAI endpoint, timeout limit, etc. are also read through environment variables, these could be set through the config file as well.

[BUG] Nothing works for me

helix-editor version
23.10

helix-gpt version
0.31

Describe the bug
Nothing seems to work.
:lsp-workspace-command
image

Nothing happens when I press C-x or space-a
image

languages.toml

[[language]]
name = "tsx"
language-servers = [ "eslint", "emmet-ls", "typescript-language-server", "tailwindcss-ls", "gpt" ]
formatter = { command = "prettier", args = [ "--parser", "typescript" ] }
auto-format = true

[language-server.gpt]
command = "helix-gpt"
args = ["--logFile", "/tmp/helix-gpt.log"]

helix-gpt logs

tail -F /tmp/helix-gpt.log
APP 2024-03-26T09:14:27.113Z --> triggerCharacters: | ["{","("," "]

helix logs
This is the only reference to gpt I see in the logs

2024-03-26T10:14:27.077 helix_lsp::transport [INFO] gpt -> {"jsonrpc":"2.0","method":"initialize","params":{"capabilities":{"general":{"positionEncodings":["utf-8","utf-32","utf-16"]},"textDocument":{"codeAction":{"codeActionLiteralSupport":{"codeActionKind":{"valueSet":["","quickfix","refactor","refactor.extract","refactor.inline","refactor.rewrite","source","source.organizeImports"]}},"dataSupport":true,"disabledSupport":true,"isPreferredSupport":true,"resolveSupport":{"properties":["edit","command"]}},"completion":{"completionItem":{"deprecatedSupport":true,"insertReplaceSupport":true,"resolveSupport":{"properties":["documentation","detail","additionalTextEdits"]},"snippetSupport":true,"tagSupport":{"valueSet":[1]}},"completionItemKind":{}},"hover":{"contentFormat":["markdown"]},"inlayHint":{"dynamicRegistration":false},"publishDiagnostics":{"versionSupport":true},"rename":{"dynamicRegistration":false,"honorsChangeAnnotations":false,"prepareSupport":true},"signatureHelp":{"signatureInformation":{"activeParameterSupport":true,"documentationFormat":["markdown"],"parameterInformation":{"labelOffsetSupport":true}}}},"window":{"workDoneProgress":true},"workspace":{"applyEdit":true,"configuration":true,"didChangeConfiguration":{"dynamicRegistration":false},"didChangeWatchedFiles":{"dynamicRegistration":true,"relativePatternSupport":false},"executeCommand":{"dynamicRegistration":false},"inlayHint":{"refreshSupport":false},"symbol":{"dynamicRegistration":false},"workspaceEdit":{"documentChanges":true,"failureHandling":"abort","normalizesLineEndings":false,"resourceOperations":["create","rename","delete"]},"workspaceFolders":true}},"clientInfo":{"name":"helix","version":"23.10 (f6021dd0)"},"processId":1720,"rootPath":"/Users/adrian/Developer/e-id","rootUri":"file:///Users/adrian/Developer/e-id","workspaceFolders":[{"name":"e-id","uri":"file:///Users/adrian/Developer/e-id"}]},"id":0}

Anything else to look for?

Fetch failed with status 404 body {"message": "Not Found"...

helix-editor version
helix 23.10 (83f09ecb)

helix-gpt version
0.27

Describe the bug
I can view the code actions successfully but when I pick one, I get the following error (I manually copied helix's diagnostic):

Fetch failed with status 404 body {"message": "Not Found", "documentation_url": "https://docs.github.com/rest"} {url: "/copilot_internal/v2/token"

helix-gpt logs
Helix GPT output logs. Pass --logFile /app/helix-gpt.log to save them.

helix logs

2024-02-15T20:27:59.618 helix_lsp::transport [INFO] pylsp <- {"jsonrpc":"2.0","id":1,"result":[]}
2024-02-15T20:27:59.618 helix_lsp::transport [INFO] pylsp <- []
2024-02-15T20:27:59.624 helix_lsp::transport [INFO] gpt <- {"jsonrpc":"2.0","id":1,"result":[{"title":"Resolve diagnostics","kind":"quickfix","diagnostics":[],"command":{"title":"Resolve diagnostics","command":"resolveDiagnostics","arguments":[{"range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}},"query":"Resolve the diagnostics for this code.","diagnostics":[]}]}},{"title":"Generate documentation","kind":"quickfix","diagnostics":[],"command":{"title":"Generate documentation","command":"generateDocs","arguments":[{"range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}},"query":"Add documentation to this code.","diagnostics":[]}]}},{"title":"Improve code","kind":"quickfix","diagnostics":[],"command":{"title":"Improve code","command":"improveCode","arguments":[{"range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}},"query":"Improve this code.","diagnostics":[]}]}},{"title":"Refactor code from a comment","kind":"quickfix","diagnostics":[],"command":{"title":"Refactor code from a comment","command":"refactorFromComment","arguments":[{"range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}},"query":"Refactor this code based on the comment.","diagnostics":[]}]}},{"title":"Write a unit test","kind":"quickfix","diagnostics":[],"command":{"title":"Write a unit test","command":"writeTest","arguments":[{"range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}},"query":"Write a unit test for this code. Do not include any imports.","diagnostics":[]}]}}]}
2024-02-15T20:27:59.625 helix_lsp::transport [INFO] gpt <- [{"command":{"arguments":[{"diagnostics":[],"query":"Resolve the diagnostics for this code.","range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}}}],"command":"resolveDiagnostics","title":"Resolve diagnostics"},"diagnostics":[],"kind":"quickfix","title":"Resolve diagnostics"},{"command":{"arguments":[{"diagnostics":[],"query":"Add documentation to this code.","range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}}}],"command":"generateDocs","title":"Generate documentation"},"diagnostics":[],"kind":"quickfix","title":"Generate documentation"},{"command":{"arguments":[{"diagnostics":[],"query":"Improve this code.","range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}}}],"command":"improveCode","title":"Improve code"},"diagnostics":[],"kind":"quickfix","title":"Improve code"},{"command":{"arguments":[{"diagnostics":[],"query":"Refactor this code based on the comment.","range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}}}],"command":"refactorFromComment","title":"Refactor code from a comment"},"diagnostics":[],"kind":"quickfix","title":"Refactor code from a comment"},{"command":{"arguments":[{"diagnostics":[],"query":"Write a unit test for this code. Do not include any imports.","range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}}}],"command":"writeTest","title":"Write a unit test"},"diagnostics":[],"kind":"quickfix","title":"Write a unit test"}]
2024-02-15T20:28:00.089 helix_lsp::transport [INFO] gpt -> {"jsonrpc":"2.0","method":"workspace/executeCommand","params":{"arguments":[{"diagnostics":[],"query":"Resolve the diagnostics for this code.","range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}}}],"command":"resolveDiagnostics"},"id":2}
2024-02-15T20:28:00.099 helix_lsp::transport [INFO] gpt <- {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///home/lucca/api2/app.py","diagnostics":[{"message":"Executing resolveDiagnostics...","range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}},"severity":3,"source":"helix-gpt"}]}}
2024-02-15T20:28:00.396 helix_lsp::transport [INFO] gpt <- {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///home/lucca/api2/app.py","diagnostics":[{"message":"Fetch failed with status 404 body {\"message\":\"Not Found\",\"documentation_url\":\"https://docs.github.com/rest\"} url: /copilot_internal/v2/token","severity":1,"range":{"end":{"character":5,"line":18},"start":{"character":4,"line":18}},"source":"helix-gpt"}]}}

Having multiple providers running

Is your feature request related to a problem? Please describe.
I think the copilot handler can only do completion, right? It would be nice to also have GPT-4 running for things like docs. Just running two instances doesn't really work.

Describe the solution you'd like
A way to add more than one provider, and a way to assign each provider to a different LSP function.

Describe alternatives you've considered
Running two instances, but that gets messy.

Configuration to always pick one-line completions

Describe the solution you'd like
To be able to configure helix-gpt to always pick one-line completions

Additional context
helix-gpt (at least using openai) provides three options for completion.

I would like the first one to be a one-liner, like it is in VS Code.
It could be an opt-in or opt-out feature, or even the default.

In general the one-liners have simply been better and more useful, but usually only the last completion is a one-liner.

What does the maintainer think about this?

Publish releases to Homebrew

Is your feature request related to a problem? Please describe.
I manage installed applications and tools via Homebrew wherever possible, and specifically use homebrew-bundle to easily install apps across systems.

Describe the solution you'd like
I would find it very useful if helix-gpt could be added to Homebrew. I install Helix the same way, and this would ensure both tools can be installed with one command.

Describe alternatives you've considered
The installation instructions in the README are the obvious alternative, but I don't currently use bun, and would prefer to avoid the manual step of copying and pasting the command to use the precompiled binary.

Additional context
This might be somewhat similar to #1.

InvalidExe ?

I'm getting the following error log:

2024-01-26T21:45:37.138 helix_lsp::transport [ERROR] gpt err <- "error: Failed to run \"helix-gpt.js\" due to error InvalidExe\n"

helix-gpt is in my path. I'm on an M1.

[language-server.gpt]
command = "bun"
args = ["run", "helix-gpt.js"]

[BUG] "error: no handler key provided

helix-editor version
Exact version of the Helix editor you are using.

helix-gpt version
Exact version of Helix GPT.

Describe the bug
A clear and concise description of what the bug is.

helix-gpt logs
Helix GPT output logs. Pass --logFile /app/helix-gpt.log to save them.

helix logs
Helix editor output logs. Ensure you start helix in verbose mode hx -v example.file

[Request] Expose workspace command to temporarily disable suggestions (and code actions)

First of all, thanks for the amazing work! Not having Copilot in Helix was really an issue for me, but not anymore 🙏

Sometimes I'd like to temporarily disable the Copilot/GPT suggestions when they are making too many or irrelevant ones. Currently I'm not aware of any way of accomplishing this other than modifying the languages.toml file. I think it would be nice if the LSP exposed a workspace command, which could be executed from Helix using :lsp-workspace-command.

What do you think?

[Request] Trigger completion also on `Ctrl-x`

This message occurs frequently in the helix-gpt log file:

skipping |  | not in | {,(,),=,>, ,,,:,.,<,/

I'm assuming it's related to what you describe in your comment on the Helix thread about how helix-gpt works.

Obviously, it makes total sense not to send completion requests on every key press. It would be useful, however, to be able to trigger a request manually with the built-in Ctrl-x keybinding for autocompletion (which I spam all the time).

Without it, use cases like the following (i.e. code generation based on a description) remain out of reach.

# function that takes a number x and multiplies it by 2
.. def double(x [...]

Also, I imagine that the set of characters/keywords that signal sensible contexts for auto-fetching completions varies depending on the programming language.

Of course, all this depends on whether LSP servers are able to differentiate between automatic requests and manually triggered requests. Any ideas?

ARM Support

Is your feature request related to a problem? Please describe.
Unable to use this on a Mac M1, as it seems to only be built for x86_64 (amd64).

Describe the solution you'd like
A compiled binary for ARM64

[BUG] "error: no handler key provided" when using `--authCopilot`

helix-editor version
23.10

helix-gpt version
0.28

Describe the bug
When booting up Helix, the GPT language server crashes with the log "error: no handler key provided" when configured to use Copilot.
This happens when starting Helix with HANDLER=copilot hx main.rs

helix-gpt logs
n.a.

helix logs

2024-03-03T08:22:32.702 helix_lsp::transport [ERROR] gpt err <- "error: no handler key provided\n"
2024-03-03T08:22:32.702 helix_lsp::transport [ERROR] gpt err <- "      at /***.local/bin/helix-gpt:3:2197\n"
2024-03-03T08:22:32.703 helix_lsp::transport [ERROR] gpt err <- "error: \"helix-gpt\" exited with code 1\n"

My configuration for rust is:

[language-server.gpt]
command = "bun"
args = ["run", "/***.local/bin/helix-gpt", "--authCopilot"]

[[language]]
name = "rust"
language-servers = [
    "rust-analyzer",
    "gpt"
]
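
For comparison, my reading of the intended flow (which may be wrong) is that --authCopilot is a one-off command used to obtain the key, not an LSP argument, and the LSP entry itself passes --handler plus the key:

# one-off, outside helix, to obtain COPILOT_API_KEY
bun run /***/.local/bin/helix-gpt --authCopilot

[language-server.gpt]
command = "bun"
args = ["run", "/***/.local/bin/helix-gpt", "--handler", "copilot"]

with COPILOT_API_KEY set in the environment (or in a .env file, as other issues here do).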

[BUG] Codeium - No chat provider for Codeium

helix-editor version
helix 23.10 (f6021dd0)

helix-gpt version
0.28

Describe the bug
Codeium does not work. I haven't set any environment variables for codeium as explained in the README. Here is my languages.toml:

[language-server.gpt]
command = "helix-gpt"
args = ["--handler","codeium","--logFile","/tmp/helix-gpt.log"]

[[language]]
name = "python"
roots = ["pyproject.toml", "."]
language-servers = [ "pyright","ruff-lsp","gpt"]
auto-format = true

Steps to reproduce

  • Highlight some code
  • code actions > improve code
  • Observe the error message: No chat provider for Codeium

Also, if you install Codeium in VS Code, it asks for an auth token. I suspect it is not possible to use Codeium without a token, since they explain that it is required to prevent abuse/spam.
helix-gpt logs

APP 2024-02-17T23:54:15.358Z --> triggerCharacters: | ["{","("," "]

APP 2024-02-17T23:54:15.362Z --> received request: | {"jsonrpc":"2.0","method":"initialize","params":{"capabilities":{"general":{"positionEncodings":["utf-8","utf-32","utf-16"]},"textDocument":{"codeAction":{"codeActionLiteralSupport":{"codeActionKind":{"valueSet":["","quickfix","refactor","refactor.extract","refactor.inline","refactor.rewrite","source","source.organizeImports"]}},"dataSupport":true,"disabledSupport":true,"isPreferredSupport":true,"resolveSupport":{"properties":["edit","command"]}},"completion":{"completionItem":{"deprecatedSupport":true,"insertReplaceSupport":true,"resolveSupport":{"properties":["documentation","detail","additionalTextEdits"]},"snippetSupport":true,"tagSupport":{"valueSet":[1]}},"completionItemKind":{}},"hover":{"contentFormat":["markdown"]},"inlayHint":{"dynamicRegistration":false},"publishDiagnostics":{"versionSupport":true},"rename":{"dynamicRegistration":false,"honorsChangeAnnotations":false,"prepareSupport":true},"signatureHelp":{"signatureInformation":{"activeParameterSupport":true,"documentationFormat":["markdown"],"parameterInformation":{"labelOffsetSupport":true}}}},"window":{"workDoneProgress":true},"workspace":{"applyEdit":true,"configuration":true,"didChangeConfiguration":{"dynamicRegistration":false},"didChangeWatchedFiles":{"dynamicRegistration":true,"relativePatternSupport":false},"executeCommand":{"dynamicRegistration":false},"inlayHint":{"refreshSupport":false},"symbol":{"dynamicRegistration":false},"workspaceEdit":{"documentChanges":true,"failureHandling":"abort","normalizesLineEndings":false,"resourceOperations":["create","rename","delete"]},"workspaceFolders":true}},"clientInfo":{"name":"helix","version":"23.10 (f6021dd0)"},"processId":98758,"rootPath":"/home/USER/Desktop/dotfiles","rootUri":"file:///home/USER/Desktop/dotfiles","workspaceFolders":[{"name":"dotfiles","uri":"file:///home/USER/Desktop/dotfiles"}]},"id":0}

APP 2024-02-17T23:54:15.362Z --> sent request | {"jsonrpc":"2.0","method":"initialize","id":0,"result":{"capabilities":{"codeActionProvider":true,"executeCommandProvider":{"commands":["resolveDiagnostics","generateDocs","improveCode","refactorFromComment","writeTest"]},"completionProvider":{"resolveProvider":false,"triggerCharacters":["{","("," "]},"textDocumentSync":{"change":1,"openClose":true}}}}

APP 2024-02-17T23:54:15.363Z --> received request: | {"jsonrpc":"2.0","method":"initialized","params":{}}

APP 2024-02-17T23:54:15.363Z --> received didOpen | language: python

APP 2024-02-17T23:54:22.150Z --> received request: | {"jsonrpc":"2.0","method":"textDocument/codeAction","params":{"context":{"diagnostics":[],"triggerKind":1},"range":{"end":{"character":0,"line":142},"start":{"character":0,"line":134}},"textDocument":{"uri":"file:///home/USER/Desktop/dotfiles/system/scripts/python/update_wp/gtasks/main.py"}},"id":1}

APP 2024-02-17T23:54:22.151Z --> sent request | {"jsonrpc":"2.0","id":1,"result":[{"title":"Resolve diagnostics","kind":"quickfix","diagnostics":[],"command":{"title":"Resolve diagnostics","command":"resolveDiagnostics","arguments":[{"range":{"end":{"character":0,"line":142},"start":{"character":0,"line":134}},"query":"Resolve the diagnostics for this code.","diagnostics":[]}]}},{"title":"Generate documentation","kind":"quickfix","diagnostics":[],"command":{"title":"Generate documentation","command":"generateDocs","arguments":[{"range":{"end":{"character":0,"line":142},"start":{"character":0,"line":134}},"query":"Add documentation to this code.","diagnostics":[]}]}},{"title":"Improve code","kind":"quickfix","diagnostics":[],"command":{"title":"Improve code","command":"improveCode","arguments":[{"range":{"end":{"character":0,"line":142},"start":{"character":0,"line":134}},"query":"Improve this code.","diagnostics":[]}]}},{"title":"Refactor code from a comment","kind":"quickfix","diagnostics":[],"command":{"title":"Refactor code from a comment","command":"refactorFromComment","arguments":[{"range":{"end":{"character":0,"line":142},"start":{"character":0,"line":134}},"query":"Refactor this code based on the comment.","diagnostics":[]}]}},{"title":"Write a unit test","kind":"quickfix","diagnostics":[],"command":{"title":"Write a unit test","command":"writeTest","arguments":[{"range":{"end":{"character":0,"line":142},"start":{"character":0,"line":134}},"query":"Write a unit test for this code. Do not include any imports.","diagnostics":[]}]}}]}

APP 2024-02-17T23:54:23.409Z --> received request: | {"jsonrpc":"2.0","method":"workspace/executeCommand","params":{"arguments":[{"diagnostics":[],"query":"Improve this code.","range":{"end":{"character":0,"line":142},"start":{"character":0,"line":134}}}],"command":"improveCode"},"id":2}

APP 2024-02-17T23:54:23.409Z --> sending diagnostics | [{"message":"Executing improveCode...","range":{"end":{"character":0,"line":142},"start":{"character":0,"line":134}},"severity":3}]

APP 2024-02-17T23:54:23.409Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///home/USER/Desktop/dotfiles/system/scripts/python/update_wp/gtasks/main.py","diagnostics":[{"message":"Executing improveCode...","range":{"end":{"character":0,"line":142},"start":{"character":0,"line":134}},"severity":3,"source":"helix-gpt"}]}}

APP 2024-02-17T23:54:23.410Z --> getting content from range | {"end":{"character":0,"line":142},"start":{"character":0,"line":134}} | uri: file:///home/USER/Desktop/dotfiles/system/scripts/python/update_wp/gtasks/main.py | current buffers: ["file:///home/USER/Desktop/dotfiles/system/scripts/python/update_wp/gtasks/main.py"]

APP 2024-02-17T23:54:23.410Z --> chat request content: | 
def insert_newlines(text, line_length=33, offset=0):
    lines = textwrap.wrap(text, line_length, break_long_words=False)
    if len(lines) > 1:
        indented_lines = [lines[0]] + [(offset * " ") + line for line in lines[1:]]
        return "\n".join(indented_lines)
    else:
        return text

APP 2024-02-17T23:54:23.410Z --> codeium | chat request | ["Improve this code.","\ndef insert_newlines(text, line_length=33, offset=0):\n    lines = textwrap.wrap(text, line_length, break_long_words=False)\n    if len(lines) > 1:\n        indented_lines = [lines[0]] + [(offset * \" \") + line for line in lines[1:]]\n        return \"\\n\".join(indented_lines)\n    else:\n        return text","file:///home/USER/Desktop/dotfiles/system/scripts/python/update_wp/gtasks/main.py","python"]

APP 2024-02-17T23:54:23.410Z --> No chat provider for: codeium

APP 2024-02-17T23:54:23.411Z --> chat failed | No chat provider for: codeium

APP 2024-02-17T23:54:23.411Z --> sending diagnostics | [{"message":"No chat provider for: codeium","severity":1,"range":{"end":{"character":0,"line":142},"start":{"character":0,"line":134}}}]

APP 2024-02-17T23:54:23.411Z --> sent request | {"jsonrpc":"2.0","method":"textDocument/publishDiagnostics","params":{"uri":"file:///home/USER/Desktop/dotfiles/system/scripts/python/update_wp/gtasks/main.py","diagnostics":[{"message":"No chat provider for: codeium","severity":1,"range":{"end":{"character":0,"line":142},"start":{"character":0,"line":134}},"source":"helix-gpt"}]}}

