Comments (6)

jwortmann commented on June 24, 2024

I was able to reproduce it and I already have a fix for it, but I'll see if I can also capture the typed input while the overlay isn't yet open and insert it in the input field when it opens.

Edit: I discarded the capturing idea for now, because I think we would then need an individual command for each key.

rchl commented on June 24, 2024

Note that we have logic for preventing keys from modifying the view while document symbols are requested, but maybe some recent changes opened a timing window where this doesn't apply. Maybe it's due to the extra input handler.

jwortmann commented on June 24, 2024

I guess it might happen because we still erase that setting directly when the response arrives at

self.view.settings().erase(SUPPRESS_INPUT_SETTING_KEY)
but after that there might still be a small delay before the overlay opens, because we run it through window.run_command. Maybe we can move that line somewhere into the input handler class. But I'll have to take a closer look later, so that we don't miss some code path where the setting accidentally never gets erased.

It might also be possible now to catch the typed input during that time and insert it into the text field when it opens, via initial_text of the newly introduced input handler. But I guess I'll need a sufficiently large file to be able to test this; when I run the command the overlay still opens basically instantly even on files with several thousand lines.
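
A rough sketch of both ideas, with purely made-up names (the actual plugin has its own handler class, setting key constant, and would need its own way of buffering keystrokes, if any): erase the suppression setting only when the input handler for the overlay is constructed, and pre-fill any captured keystrokes via initial_text.

    import sublime_plugin

    # All names here are illustrative; the plugin defines its own handler
    # class and setting key constant.
    SUPPRESS_INPUT_SETTING_KEY = "lsp_suppress_input"


    class DocumentSymbolsInputHandler(sublime_plugin.ListInputHandler):
        def __init__(self, view, items, buffered_input=""):
            self.view = view
            self.items = items
            # Hypothetical: keystrokes captured while the request was in flight.
            self.buffered_input = buffered_input
            # Erase the suppression flag when the handler for the overlay is
            # created, rather than when the LSP response arrives, to narrow
            # the window in which typed keys could still reach the view.
            self.view.settings().erase(SUPPRESS_INPUT_SETTING_KEY)

        def initial_text(self):
            # Pre-fill the overlay's text field so keystrokes typed before it
            # opened are not lost.
            return self.buffered_input

        def list_items(self):
            return self.items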

rchl commented on June 24, 2024

When testing with a server (tsc) that is still initializing, I can easily trigger a case where documentSymbols takes a long time to respond, but then I can't reproduce the issue (the input is blocked while the request is pending). So I guess it's as you are saying: the problematic period is between when the response arrives and when the list is shown.

sylbru commented on June 24, 2024

I didn’t have a chance to answer before, but it’s great that you were able to fix this! Thanks!

jwortmann commented on June 24, 2024

So the bug that the text accidentally gets inserted into the file is fixed now on main, but I'm still investigating why the new Goto Symbol overlay is a little bit slower than before.

I tried it first with the largest Julia file I found (3000+ lines) and with a lot of symbols in it (~ 500k characters in the response payload), and the overlay still opens basically instantly, before I can move my fingers from the key binding back to the normal typing position. But now I tested with a huge JSON file with a 1.3M character response payload, and there is indeed a noticeable delay before the overlay opens.

The overlay was rewritten using a different Sublime API (ListInputHandler instead of window.show_quick_panel) to make it possible to filter the symbols by symbol kind if desired. I hope that users find this new functionality useful; personally I like it a lot. But I don't think this change alone has an effect on the performance.
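
For illustration, a minimal sketch of the ListInputHandler wiring, with hypothetical command and handler names (not the plugin's actual ones): instead of calling window.show_quick_panel with a flat list, the command returns an input handler from input(), which opens the overlay and leaves room to chain an additional handler, e.g. for the symbol kind filter.

    import sublime_plugin

    # Illustrative names only; the real command and handler live in the plugin.
    class ExampleGotoSymbolCommand(sublime_plugin.WindowCommand):
        def run(self, symbol_index=None):
            # Receives the value of the item chosen in the overlay.
            if symbol_index is not None:
                print("navigate to symbol", symbol_index)

        def input(self, args):
            # Returning an input handler here is what opens the overlay,
            # replacing the previous window.show_quick_panel() call.
            if "symbol_index" not in args:
                return ExampleSymbolInputHandler()


    class ExampleSymbolInputHandler(sublime_plugin.ListInputHandler):
        def name(self):
            # Must match the run() argument that receives the chosen value.
            return "symbol_index"

        def list_items(self):
            return [("Symbol A", 0), ("Symbol B", 1)]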

However, I've got two ideas for how to possibly improve the performance:

  1. When generating the list items for the overlay, we create each item with a ListInputItem.value like
    {"kind": 5, "region": [10, 20], "deprecated": false}
    and I remember from the completions handling that the Sublime API can be a bit slow when large data from thousands of items is transported over the plugin host, so maybe it can be optimized like
    {"k": 5, "r": [10, 20], "d": false}
    or even
    [5, [10, 20], false]
  2. At the moment the symbol positions (which are used for highlighting and scrolling the view when you scroll through the items in the overlay) are converted from LSP row/column encoding to Sublime region offsets when the list items are generated. This means that two API calls are needed for each list item. To be fair, exactly the same was done before, and there were (potentially) even more API calls because previously we processed both the selectionRange and also the full range of each symbol, so this cannot really be the cause of the performance regression. But nevertheless it's probably a bad idea to do this computation up front; instead it could be done lazily in ListInputHandler.preview, just when the item gets highlighted (see the sketch after this list). The sorting of the symbols can likewise be done in LSP row/column encoding.
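
A sketch of both ideas combined, assuming made-up names and a simplified symbol shape: keep a compact list as the ListInputItem value, and convert LSP positions to Sublime text points lazily in preview, only for the item that is currently highlighted (UTF-16 column subtleties of LSP positions are ignored here).

    import sublime
    import sublime_plugin


    class ExampleSymbolListInputHandler(sublime_plugin.ListInputHandler):
        def __init__(self, view, symbols):
            self.view = view
            self.symbols = symbols  # raw LSP DocumentSymbol-like dicts

        def list_items(self):
            # Idea 1: ship a compact list instead of a dict with long keys,
            # to reduce the payload crossing the plugin host boundary.
            return [
                sublime.ListInputItem(
                    symbol["name"],
                    [symbol["kind"], symbol["selectionRange"]["start"]],
                )
                for symbol in self.symbols
            ]

        def preview(self, value):
            if value is None:
                return ""
            kind, start = value  # the kind could also drive the kind filter
            # Idea 2: convert the LSP position only for the highlighted item,
            # instead of converting every symbol up front.
            point = self.view.text_point(start["line"], start["character"])
            self.view.show_at_center(point)
            return ""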

I will test if those points make any difference.
