CodeGPT.nvim

CodeGPT is a Neovim plugin that provides commands for interacting with ChatGPT. The focus is on code-related usage: code completion, refactoring, generating docs, and so on.

Installation

  • Set the environment variable OPENAI_API_KEY to your OpenAI API key.
  • The plugins plenary.nvim and nui.nvim are also required.
  • OpenAI's tokenizer tiktoken is recommended for accurate token count estimates.

Installing with lazy.nvim.

{
    "dpayne/CodeGPT.nvim",
    dependencies = {
      'nvim-lua/plenary.nvim',
      'MunifTanjim/nui.nvim',
    },
    config = function()
        require("codegpt.config")
    end
}

Installing with packer.

use({
   "dpayne/CodeGPT.nvim",
   requires = {
      "MunifTanjim/nui.nvim",
      "nvim-lua/plenary.nvim",
   },
   config = function()
      require("codegpt.config")
   end
})

Installing with vim-plug.

Plug("nvim-lua/plenary.nvim")
Plug("MunifTanjim/nui.nvim")
Plug("dpayne/CodeGPT.nvim")

Installing OpenAI's tokenizer

pip install tiktoken

Commands

The top-level command is :Chat. Its behavior differs depending on whether text is selected and whether arguments are passed.

Completion

  • :Chat with a text selection will trigger the completion command; ChatGPT will try to complete the selected code snippet.

Code Edit

  • :Chat some instructions with a text selection and command args will invoke the code_edit command. This treats the command args as instructions for what to do with the code snippet. In the example below, :Chat refactor to use iteration applies the instruction refactor to use iteration to the selected code.

Command Invocation

  • :Chat <command> — if there is only one argument and that argument matches a command name, it will invoke that command with the given text selection. In the example below, :Chat tests will attempt to write unit tests for the selected code.

Chat

  • :Chat hello world without any text selection will trigger the chat command. This sends the arguments hello world to ChatGPT and shows the result in a popup.

A full list of the predefined commands is below.

command input Description
completion text selection Will ask ChatGPT to complete the selected code.
code_edit text selection and command args Will ask ChatGPT to apply the given instructions (the command args) to the selected code.
explain text selection Will ask ChatGPT to explain the selected code.
question text selection Will pass the command args to ChatGPT and return the answer in a text popup.
debug text selection Will pass the code selection to ChatGPT to analyze it for bugs; the results will be shown in a text popup.
doc text selection Will ask ChatGPT to document the selected code.
opt text selection Will ask ChatGPT to optimize the selected code.
tests text selection Will ask ChatGPT to write unit tests for the selected code.
chat command args Will pass the given command args to ChatGPT and return the response in a popup.

Overriding Command Configurations

The configuration option vim.g["codegpt_commands_defaults"] = {} can be used to override command configurations. It is a Lua table keyed by command name, containing the options you want to override.

vim.g["codegpt_commands_defaults"] = {
  ["completion"] = {
      user_message_template = "This is a template of the message passed to ChatGPT. Hello, the code snippet is {{text_selection}}."
  }
}

The above overrides the message template for the completion command.

A full list of overrides:

name default description
model "gpt-3.5-turbo" The model to use.
max_tokens 4096 The maximum number of tokens to use including the prompt tokens.
temperature 0.6 0 -> 1, what sampling temperature to use.
system_message_template "" Helps set the behavior of the assistant.
user_message_template "" Instructs the assistant.
callback_type "replace_lines" Controls what the plugin does with the response.
language_instructions {} A table of filetype => instructions. The current buffer's filetype is used in this lookup. This is useful for triggering different instructions for different languages.
extra_params {} A table of custom parameters to be sent to the API.

Overriding the global defaults

The overrides can be set globally using vim.g["codegpt_global_commands_defaults"]. This can be useful to set up a custom configuration for APIs that emulate OpenAI's, such as LocalAI.

    vim.g["codegpt_global_commands_defaults"] = {
        model = "mixtral",
        max_tokens = 4096,
        temperature = 0.4,
        -- extra_params = { -- optional list of extra parameters to send to the API
        --     presence_penalty = 1,
        --     frequency_penalty = 1
        -- }
    }

Templates

The system_message_template and the user_message_template can contain template macros. For example:

macro description
{{filetype}} The filetype of the current buffer.
{{text_selection}} The selected text in the current buffer.
{{language}} The name of the programming language in the current buffer.
{{command_args}} Everything passed to the command as an argument, joined with spaces. See below.
{{language_instructions}} The found value in the language_instructions map. See below.
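Putting the macros together, a command override might look like this (a sketch; the wording of the templates is illustrative, not the plugin's defaults):

```lua
-- Illustrative override of the built-in explain command using template macros.
-- {{language}} and {{text_selection}} are expanded by CodeGPT
-- before the message is sent to the API.
vim.g["codegpt_commands_defaults"] = {
  ["explain"] = {
    system_message_template = "You are a helpful {{language}} assistant.",
    user_message_template = "Explain the following {{language}} code: {{text_selection}}",
  },
}
```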

Language Instructions

Some commands have templates that use the {{language_instructions}} macro to allow for additional instructions for specific filetypes.

vim.g["codegpt_commands_defaults"] = {
  ["completion"] = {
      language_instructions = {
          cpp = "Use trailing return type.",
      },
  }
}

The above adds the instruction "Use trailing return type." to the completion command for the cpp filetype.

Command Args

Commands are normally a single token, for example :Chat completion. A command such as :Chat completion value would therefore be interpreted as a code_edit command with the arguments "completion value", not as completion with the argument "value". You can make a command accept additional arguments by using the {{command_args}} macro anywhere in its user_message_template or system_message_template. For example:

vim.g["codegpt_commands"] = {
  ["testwith"] = {
      user_message_template =
        "Write tests for the following code: ```{{filetype}}\n{{text_selection}}```\n{{command_args}} " ..
        "Only return the code snippet and nothing else."
  }
}

After defining this command, any :Chat invocation with testwith as its first argument will be handled by it. For example, :Chat testwith some additional instructions will be interpreted as the testwith command with the arguments "some additional instructions".

Custom Commands

Custom commands can be added to the vim.g["codegpt_commands"] configuration option to extend the available commands.

vim.g["codegpt_commands"] = {
  ["modernize"] = {
      user_message_template = "I have the following {{language}} code: ```{{filetype}}\n{{text_selection}}```\nModernize the above code. Use current best practices. Only return the code snippet and comments. {{language_instructions}}",
      language_instructions = {
          cpp = "Refactor the code to use trailing return type, and the auto keyword where applicable.",
      },
  }
}

The above configuration adds the command :Chat modernize, which attempts to modernize the selected code snippet.

Command Defaults

The default command configuration is:

{
    model = "gpt-3.5-turbo",
    max_tokens = 4096,
    temperature = 0.6,
    number_of_choices = 1,
    system_message_template = "",
    user_message_template = "",
    callback_type = "replace_lines",
}

More Configuration Options

Custom status hooks

You can add custom hooks to update your status line or other UI elements. For example, this code turns the status line yellow while a request is in progress.

vim.g["codegpt_hooks"] = {
  request_started = function()
    vim.cmd("hi StatusLine ctermbg=NONE ctermfg=yellow")
  end,
  request_finished = vim.schedule_wrap(function()
    vim.cmd("hi StatusLine ctermbg=NONE ctermfg=NONE")
  end),
}

Lualine Status Component

There is a convenience function get_status that lets you add a status component to lualine.

local CodeGPTModule = require("codegpt")

require('lualine').setup({
    sections = {
        -- ...
        lualine_x = { CodeGPTModule.get_status, "encoding", "fileformat" },
        -- ...
    }
})

Popup options

Popup filetype

The default filetype of the text popup window is markdown. You can change this by setting the codegpt_text_popup_filetype variable.

vim.g["codegpt_text_popup_filetype"] = "markdown"

Popup commands

vim.g["codegpt_ui_commands"] = {
  -- some default commands, you can remap the keys
  quit = "q", -- key to quit the popup
  use_as_output = "<c-o>", -- key to use the popup content as output and replace the original lines
  use_as_input = "<c-i>", -- key to use the popup content as input for a new API request
}
vim.g["codegpt_ui_commands"] = {
  -- tables as defined by nui.nvim https://github.com/MunifTanjim/nui.nvim/tree/main/lua/nui/popup#popupmap
  {"n", "<c-l>", function() print("do something") end, {noremap = false, silent = false}}
}

Popup layouts

vim.g["codegpt_popup_options"] = {
  -- a table as defined by nui.nvim https://github.com/MunifTanjim/nui.nvim/tree/main/lua/nui/popup#popupupdate_layout
  relative = "editor",
  position = "50%",
  size = {
    width = "80%",
    height = "80%"
  }
}

Popup border

vim.g["codegpt_popup_border"] = {
  -- a table as defined by nui.nvim https://github.com/MunifTanjim/nui.nvim/tree/main/lua/nui/popup#border
  style = "rounded"
}

Popup window options

-- Enable text wrapping and line numbers
vim.g["codegpt_popup_window_options"] = {
  wrap = true,
  linebreak = true,
  relativenumber = true,
  number = true,
}

Move completion to popup window

For any command, you can override the callback type to move the completion to a popup window. An example below is for overriding the completion command.

require("codegpt.config")

vim.g["codegpt_commands"] = {
  ["completion"] = {
    callback_type = "code_popup",
  },
}

Horizontal or vertical split window

If you prefer a horizontal or vertical split window, you can change the popup type to horizontal or vertical.

-- options are "horizontal", "vertical", or "popup". Default is "popup"
vim.g["codegpt_popup_type"] = "horizontal"

To set the height of the horizontal window or the width of the vertical window, you can use the codegpt_horizontal_popup_size and codegpt_vertical_popup_size variables.

vim.g["codegpt_horizontal_popup_size"] = "20%"
vim.g["codegpt_vertical_popup_size"] = "20%"

Miscellaneous Configuration Options

-- OpenAI API key and API endpoint
vim.g["codegpt_openai_api_key"] = os.getenv("OPENAI_API_KEY")
vim.g["codegpt_chat_completions_url"] = "https://api.openai.com/v1/chat/completions"
vim.g["codegpt_api_provider"] = "OpenAI" -- or Azure

-- clears visual selection after completion
vim.g["codegpt_clear_visual_selection"] = true

Callback Types

Callback types control what is done with the response.

name Description
replace_lines Replaces the selected lines with the response. If no text is selected, it inserts the response at the cursor.
text_popup Will display the result in a text popup window.
code_popup Will display the result in a popup window with the filetype set to the filetype of the current buffer.
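For example, to show a command's response in a text popup instead of replacing the selection, override its callback_type (shown here for the opt command as an illustration; any command works the same way):

```lua
-- Route the opt command's response to a text popup
-- instead of replacing the selected lines.
vim.g["codegpt_commands"] = {
  ["opt"] = {
    callback_type = "text_popup",
  },
}
```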

Template Variables

name Description
language Programming language of the current buffer.
filetype filetype of the current buffer.
text_selection Any selected text.
command_args Command arguments.
filetype_instructions Filetype-specific instructions.

Example Configuration

Note that CodeGPT should work without any configuration. This example configuration shows some of the available options:

require("codegpt.config")

-- Override the default chat completions url, this is useful to override when testing custom commands
-- vim.g["codegpt_chat_completions_url"] = "http://127.0.0.1:800/test"

vim.g["codegpt_commands"] = {
  ["tests"] = {
    -- Language specific instructions for java filetype
    language_instructions = {
        java = "Use the TestNG framework.",
    },
  },
  ["doc"] = {
    -- Language specific instructions for python filetype
    language_instructions = {
        python = "Use the Google style docstrings."
    },

    -- Overrides the max tokens to be 1024
    max_tokens = 1024,
  },
  ["code_edit"] = {
    -- Overrides the system message template
    system_message_template = "You are a {{language}} developer.",

    -- Overrides the user message template
    user_message_template = "I have the following {{language}} code: ```{{filetype}}\n{{text_selection}}```\nEdit the above code. {{language_instructions}}",

    -- Display the response in a popup window. The popup window filetype will be the filetype of the current buffer.
    callback_type = "code_popup",
  },
  -- Custom command
  ["modernize"] = {
    user_message_template = "I have the following {{language}} code: ```{{filetype}}\n{{text_selection}}```\nModernize the above code. Use current best practices. Only return the code snippet and comments. {{language_instructions}}",
    language_instructions = {
        cpp = "Use modern C++ syntax. Use auto where possible. Do not import std. Use trailing return type. Use the c++11, c++14, c++17, and c++20 standards where applicable.",
    },
  }
}

Goals

  • Code related usages.
  • Simple.
  • Easy to add custom commands.

codegpt.nvim's Issues

Feature request: allow styling of popup border

Hi, first of all, thanks for this excellent plugin!

What I especially like about this plugin is that it's quite easy to customize. There's one more thing I'd like to be able to customize though: the popup's border. Perhaps you could add a vim.g['codegpt_popup_border'] option or something like that, that defaults to { style = 'rounded' }? Then you could simply change https://github.com/dpayne/CodeGPT.nvim/blob/master/lua/codegpt/ui.lua#L9 to:

border = vim.g['codegpt_popup_border'],

Thanks in advance!

Adding custom commands does not work

In the readme it says:

Custom Commands

Custom commands can be added to the vim.g["codegpt_commands"] configuration option to extend the available commands.

vim.g["codegpt_commands"] = {
  ["modernize"] = {
      user_message_template = "I have the following {{language}} code: ```{{filetype}}\n{{text_selection}}```\nModernize the above code. Use current best practices. Only return the code snippet and comments. {{language_instructions}}",
      language_instructions = {
          cpp = "Refactor the code to use trailing return type, and the auto keyword where applicable.",
      },
  }
}

The above configuration adds the command :Chat modernize that attempts to modernize the selected code snippet.

So, trying it, I added exactly that to my vimrc, selected this piece of C++ code:

#include <iostream>
using namespace std;

int foobar(){
  return 0;
}

bool whatnow(){
  return 1;
}

int main(){
  return 0;
}

And the call made by CodeGPT is not using the user_message_template at all; instead it makes this call:

{                                                                                                                                                                                                           
  max_tokens = 4003,
  messages = { {
      content = "You are a C++ coding assistant.",
      role = "system"
    }, {
      content = "I have the following C++ code: ```cpp\n#include <iostream>\nusing namespace std;\n\nint foobar(){\n  return 0;\n}\n\nbool whatnow(){\n  return 1;\n}\n\nint main(){\n  return 0;\n}```\nmodernize.
Only return the code snippet and nothing else.",
      role = "user"
    } },
  model = "gpt-3.5-turbo",
  n = 1,
  temperature = 0.8
}

I know because I added this little snippet to my vimrc, which I suggest adding to the README under a 'debugging' heading:

OpenAIApi = require("codegpt.openai_api")
function OpenAIApi.make_call(payload, cb)
  print(vim.inspect(payload))
end

Autocompleted result is broken

Hello. Any idea why this might be happening? Sometimes the result is correct, but half the time you get this broken result.

Is there a way to debug actual API response to see if the problem is with the autocompleted result and not in the way plugin replaces the selected lines with the ChatGPT result?

Kapture.2023-03-09.at.11.05.08.mp4

Thanks!

Feature Request: Horizontal Buffer Output

Would it be possible to make the ChatGPT output show up in a new horizontal buffer? This would make it easier to review the output and copy and paste what is needed back into the project that I'm working on.

Thanks for creating the plugin. It's been fun to experiment with ChatGPT.

Support for FIM (Fill-in-the-middle) ?

Hi,

With the recent introduction of Codestral (https://mistral.ai/news/codestral/), I am wondering if support is planned or would be easy to implement for nvim, particularly for the FIM feature. So far, I haven't seen any implementation that handles this for Vim.

What I have in mind is the ability to activate the FIM mode in "copilot" mode to suggest auto-completion at the cursor automatically with a customizable backend. Specifically, I would like to:

  • Control when the suggestion is triggered or not
  • Control the timeout to trigger the call
  • Control cycling through suggestions
  • Control the size of the context window sent (prompt and suffix)

What do you think about this?

OpenAIApi Key not found, set in vim with 'codegpt_openai_api_key' or as the env variable 'OPENAI _API_KEY'

when I configure openai_api_key in config.lua like below:
vim.g["codegpt_openai_api_key"] = os.getenv("sk-**********************I")
how do I fix this bug?
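A likely cause: os.getenv takes the name of an environment variable, not the key itself, so the snippet above looks up a variable literally named after the key. Either of the following should work:

```lua
-- Read the key from the OPENAI_API_KEY environment variable:
vim.g["codegpt_openai_api_key"] = os.getenv("OPENAI_API_KEY")

-- Or set the key directly (avoid committing it to version control):
-- vim.g["codegpt_openai_api_key"] = "sk-..."
```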

Error Message:

Error executing Lua callback: ...ack/packer/start/CodeGPT.nvim/lua/codegpt/openai_api.lua:87: OpenAIApi Key not found, set in vim with 'codegpt_openai_api_key' or as the env variable 'OPENAI
_API_KEY'
stack traceback:
[C]: in function 'error'
...ack/packer/start/CodeGPT.nvim/lua/codegpt/openai_api.lua:87: in function 'make_call'
.../pack/packer/start/CodeGPT.nvim/lua/codegpt/commands.lua:53: in function 'run_cmd'
...nvim/site/pack/packer/start/CodeGPT.nvim/lua/codegpt.lua:34: in function 'run_cmd'
...m/site/pack/packer/start/CodeGPT.nvim/plugin/codegpt.lua:4: in function <...m/site/pack/packer/start/CodeGPT.nvim/plugin/codegpt.lua:3>

openSUSE curl (8.7.1) error "Failed writing received data to disk/application"

As I'm trying to get this configured I've run into the following issue after any :Chat call:

Error: post https://api.openai.com/v1/chat/completions - curl error exit_code=23 
stderr={ "curl: (23) Failed writing received data to disk/application" }.
  • Checked the OPENAI_API_KEY environmental variable
  • Checked that the tiktoken package is available

What else could be happening?

Language instructions appear to be ignored

Hi there, thanks for making this neat plugin! I have been trying to add language instructions as per the README but haven't had much luck. When following the Example configuration section to add instructions to use pytest for python tests, it appeared to not be respected:

require("codegpt.config")

vim.g["codegpt_commands"] = {
  ["tests"] = {
    language_instructions = {
        python = "Use the pytest unit framework",
    },
  }
}

Adding a print for debugging showed that the prompt didn't include these instructions, it said the following (two spaces between the sentences point to the language_instructions template not being rendered):

Write really good unit tests using best practices for the given language. Only return the unit tests.

I did confirm that the buffer I was in had the appropriate filetype.

I was able to get it working with this configuration; is the documentation out of date or is this a bug? (using this functionality)

vim.g.codegpt_lang_instructions_python_tests = "Use the pytest test framework."

Would be happy to try and contribute a fix if you think this is worth fixing. Thanks!

Enable custom commands with args and without text selection

lua/codegpt.lua makes it clear that when there is no text selection but there are args, the command must be chat. Why is that?

    if text_selection ~= "" and command_args ~= "" then
        local cmd_opts = CommandsList.get_cmd_opts(command)
        if cmd_opts ~= nil and has_command_args(cmd_opts) then
            command_args = table.concat(opts.fargs, " ", 2)
        elseif cmd_opts and 1 == #opts.fargs then
            command_args = ""
        else
            command = "code_edit"
        end
    elseif text_selection ~= "" and command_args == "" then
        command = "completion"
    elseif text_selection == "" and command_args ~= "" then
        command = "chat"
    end

For example, this custom command will not work, unless you select some arbitrary whitespace first:

vim.g["codegpt_commands"] = vim.tbl_extend("force", vim.g["codegpt_commands"] or {}, 
  {
    ["eli"] = {
        system_message_template = "You are a tutor to a ten year old child. Explain everything using simple English.",
        user_message_template = "{{command_args}}",
        callback_type = "text_popup",
    }
})

I would expect doing :Chat eli how are cars made? to use the custom template, but it actually just reverts to "you are a general assistant to a software developer". This behaviour works if I select some whitespace and do :'<,'>Chat eli how are cars made?

A current workaround is to define your own command that directly calls run_cmd with the right command name:

vim.api.nvim_create_user_command("Eli", function(opts)
    require("codegpt.commands").run_cmd("eli", table.concat(opts.fargs, " "), "")
end, {range = true, nargs = "*"})

Now :Eli how are cars made? works as expected. This shouldn't be necessary.

ChatGPT think my selection is not valid YAML

I have a YAML file where I want to reformulate some text and ensure there are no grammar errors in my descriptions.
If I select one sentence and ask ChatGPT to reformulate it, it tells me that it is not valid YAML.

Example:

info:
  title: Example
  version: '1.0'
  description: |
    Bienvenue sur la documentation XXX. Cette API vous permet de contrôler YYY et ZZZ.

If I select just the line with "Bienvenue" and ask:

:Chat Corrige la phrase

The output will be:

I'm sorry, but the given input is not a valid YAML code snippet. YAML is a data serialization language and requires proper indentation and syntax.

Is there a way to avoid the YAML tagging of the selection?
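One possible workaround (a sketch, assuming the custom-command templating described earlier in the README): define a command whose template sends the selection as plain text, without the filetype code tag. The proofread command name and wording below are hypothetical:

```lua
-- Hypothetical proofread command that sends the selection as plain text,
-- avoiding the YAML code-block tagging.
vim.g["codegpt_commands"] = {
  ["proofread"] = {
    user_message_template =
      "Correct the following sentence and return only the corrected text: {{text_selection}}",
    callback_type = "replace_lines",
  },
}
```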

Text does not wrap around in the popup by default

When using nvim with the setting vim.o.wrap = false, the popup that appears when using the Chat command (e.g. :Chat hello world) also does not wrap text.
This is probably expected behavior, but I think no one would use this popup without wrapping, so setting wrap to true by default on the popup window would probably be better.
Screenshot 2023-03-08 at 20 50 53
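This can likely be worked around today with the popup window options documented earlier in the README:

```lua
-- Enable wrapping inside the CodeGPT popup regardless of the global 'wrap' setting.
vim.g["codegpt_popup_window_options"] = {
  wrap = true,
  linebreak = true,
}
```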

Suggestion: Add lifecycle callbacks to support having loading indicators

Hi,

Thanks for the plugin :), since it can take a few seconds to have the ChatGPT response it would be cool have some sort of UI loading feedback (e.g statusline loading spinner).

I suggest having some callbacks like:

vim.g["codegpt_hooks"] = {
  request_started = function() 
   -- update some ui, start a spinner
  end,

  request_finished = function()
   -- update some ui, stop the spinner
  end
}

A lualine component would be 🔥 .

Feature: automatically pull token definitions in the codebase into the context when the token is selected.

It would be nice to have the class/function, etc. definitions pulled into the context when we select some code.

Features:

  • When we select some code, find the definition of the class, functions, etc. in the codebase and put it into the context. Might require an external tool to achieve this.
  • Make this feature optional.

Additional thoughts:

  • If the function body is too large, maybe we should reduce the content of the function.
  • Add a parameter to limit how many class/function definitions to include in the context.

Feature Request: chat question

Please add a question command:

command input Description
question text selection and command args Will ask ChatGPT a question (the command args) to the selected code.

I think this is a quite frequent use case. We already have "explain", but that offers a generic explanation, not a specific answer to a given question about the selected code.

License

Hello!
Thank you for this great plugin!
Could you provide under which license is the repository/plugin?

What's the difference with ChatGPT.nvim?

To my understanding, the only true difference is that CodeGPT implements actions from Lua snippets, while ChatGPT.nvim also allows external JSON to be loaded, plus a number of utilities, UI, etc.

Is there any other difference?

Feature request: generate command

Please add a generate command:

command input Description
generate command args Will ask ChatGPT to generate the command args in the language of the buffer

This is like completion in that it only outputs the code and puts that into the buffer, but without requiring any existing code. For example, if I open a new file and do :Chat generate fibonacci function it should insert a function called fibonacci with the appropriate implementation in the language indicated by the language of the buffer.

Recommend Projects

  • React photo React

    A declarative, efficient, and flexible JavaScript library for building user interfaces.

  • Vue.js photo Vue.js

    🖖 Vue.js is a progressive, incrementally-adoptable JavaScript framework for building UI on the web.

  • Typescript photo Typescript

    TypeScript is a superset of JavaScript that compiles to clean JavaScript output.

  • TensorFlow photo TensorFlow

    An Open Source Machine Learning Framework for Everyone

  • Django photo Django

    The Web framework for perfectionists with deadlines.

  • D3 photo D3

    Bring data to life with SVG, Canvas and HTML. 📊📈🎉

Recommend Topics

  • javascript

    JavaScript (JS) is a lightweight interpreted programming language with first-class functions.

  • web

    Some thing interesting about web. New door for the world.

  • server

    A server is a program made to process requests and deliver data to clients.

  • Machine learning

    Machine learning is a way of modeling and interpreting data that allows a piece of software to respond intelligently.

  • Game

    Some thing interesting about game, make everyone happy.

Recommend Org

  • Facebook photo Facebook

    We are working to build community through open source technology. NB: members must have two-factor auth.

  • Microsoft photo Microsoft

    Open source projects and samples from Microsoft.

  • Google photo Google

    Google ❤️ Open Source for everyone.

  • D3 photo D3

    Data-Driven Documents codes.