
Comments (8)

Prid13 commented on August 21, 2024

Upon closer inspection, why is the "Total size" showing 0? :(

from diskusage.

aleksaan commented on August 21, 2024

Upon closer inspection, why is the "Total size" showing 0? :(

50 rows is the limit for what gets printed in human-readable mode.

Is there a planned feature to allow for all rows to be shown in -hr mode?

And what will you do if you get 100,000 rows in the console, for example? :) I could add a no-limit flag, but it would lead to dangerous and unpredictable behavior. You can't look over all of those rows with your eyes, and you can't process them manually. But if you want to process the rows in another program, you shouldn't use -hr mode; get the full results as JSON instead.

Upon closer inspection, why is the "Total size" showing 0? :(
first few letters are omitted from folders

Yes, these are bugs. Thanks, I'll try to fix them in the next few days.


aleksaan commented on August 21, 2024

@Prid13 please test new release https://github.com/aleksaan/diskusage/releases/tag/v2.9.0


Prid13 commented on August 21, 2024

And what will you do if you get 100,000 rows in the console, for example? :) I could add a no-limit flag, but it would lead to dangerous and unpredictable behavior. You can't look over all of those rows with your eyes, and you can't process them manually. But if you want to process the rows in another program, you shouldn't use -hr mode; get the full results as JSON instead.

Maybe increase the upper limit to something like 10,000 or 50,000 files? :)

I'll come clean and admit that my use case differs from this tool's intended usage. I simply want to list all the files and folders in a given directory and get human-readable output with nice file sizes and metadata, so that a scheduled script can repeatedly print and save the contents of certain folders for backup and history-logging purposes :)

You don't need to implement this feature. But maybe you can show me how to compile the source code for Windows after I edit the limit myself? :)
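For reference, diskusage is a Go project, so a self-edited build is straightforward with a standard Go toolchain. This is only a sketch: the exact file and constant holding the row limit are not named in this thread, so locating them is left to the reader.

```shell
# Sketch, assuming a standard Go module layout (the file holding the
# 50-row limit must be found and edited by hand before building).
git clone https://github.com/aleksaan/diskusage.git
cd diskusage

# Building natively on Windows:
go build -o diskusage.exe .

# Cross-compiling for 64-bit Windows from Linux/macOS:
GOOS=windows GOARCH=amd64 go build -o diskusage.exe .
```

`go build` picks up the edited source automatically; no separate configure step is needed.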


Prid13 commented on August 21, 2024

You're an absolute legend! This works wonderfully, and I truly appreciate you doing this so much ~⭐

I'm sorry if I caused you any inconvenience. I know this is beyond the scope of this tool, but such a feature is really handy for my use case πŸ˜‡

Are there any plans to add more options, like showing creation/modification dates and sorting features? :)


Actually, I just realized that this tool is a bit slow on deeply-nested folders like Documents and Downloads. I tried limiting the search by adding a -depth 1 option, but the listing took the same amount of time with or without it. Take, for example, my usage on the Downloads folder:

[screenshots: Downloads folder scan output, with and without -depth 1]

Notice also how the total dirs, files, and size remain unchanged.

I discovered this because I tried using the tool on my Documents folder, where I really only want the top-level (or at most level-2) files and folders; it was taking so long that I thought an error was occurring. But testing on the Downloads folder showed me that limiting the search with a depth filter still yields the same run time.

Is this the intended behavior? :O


aleksaan commented on August 21, 2024

@Prid13 Thanks for the thanks!
Why are the times with and without a depth limit the same?
To calculate the size of a folder, you must calculate the sizes of all included subfolders and files, down to the deepest level.
The file system does not store a previously calculated size for a folder.

So the -depth option only reduces the number of result rows (it is still a full scan).


aleksaan commented on August 21, 2024

Are there any plans to add more options, like showing creation/modification dates and sorting features?

That should be a new issue. Please create it.


aleksaan commented on August 21, 2024

a bit slow when it comes to deeply-nested folders like Documents and Downloads

14,000+ files (and folders) per second - that's not slow, I think ) And here's an interesting fact: this program calculates sizes more accurately than FAR. You can check it on the C:\Users folder, for example, or C:\ProgramData.

