freaky / compactor

A user interface for Windows 10 filesystem compression

License: MIT License
WimBootCompress.ini or WimBootReCompress.ini, found in the Windows\System32 folder, are exclusion lists created during WinNTSetup installs in Wimboot mode and Compress mode, respectively.
So as I understand things, using them as the default exclusion list should make it safe to use Compactor on the whole system disk (C:). Is that right?
More info is available at the link below, for an app that enables Windows Compact compression all the way back to Win 7:
http://reboot.pro/topic/22007-wofcompress-tool-for-win7-win10/?p=211055
ALSO:
I noticed that the wimlib algorithms are said to be more performant than the Win 10 algorithms.
Faster and smaller files:
https://wimlib.net/compression.html#Benchmarks
I'm no programmer, but I thought I'd mention this in the hope that they might be used to improve this excellent app.
If I want to compress the top 5 folders in Program Files, I have to select and compress each one individually. It would be better to be able to select multiple folders by holding Ctrl and compress them all at once.
Example video. Also tested on my laptop and got the same result.
https://wimlib.net/compression.html#Benchmarks
I hope you find some of those compressors handy for speeding up Compactor, and that on-the-fly write/compression filter drivers might also be possible?
Thank you for this great tool. Could you add it to the Scoop package manager, for easier installation and upgrades?
Noticed a few folder scans on two different machines that reported small (a few hundred KiB) negative amounts of space saved. It seems to only happen on an initial scan; a re-scan immediately afterwards usually doesn't see the same thing.
I have the following folders excluded for all drives:
*:\Windows*
*:\$Recycle.Bin*
*:\Documents and Settings*
*:\System Volume Information*
Yet Compactor still goes through each file in those folders while adding them to the total 'excluded' value. Would it not be more efficient to skip scanning excluded directories and their subdirectories entirely?
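The pruning being asked for can be sketched in Python: `os.walk` lets a caller drop excluded directories before descending into them, so their contents are never scanned at all. The patterns below are hypothetical stand-ins for the reporter's exclude list.

```python
import fnmatch
import os

# Hypothetical exclusion patterns, modelled on the reporter's examples.
EXCLUDES = [r"*:\Windows*", r"*:\$Recycle.Bin*"]

def is_excluded(path: str) -> bool:
    return any(fnmatch.fnmatch(path, pat) for pat in EXCLUDES)

def walk_with_pruning(root: str):
    for dirpath, dirnames, filenames in os.walk(root):
        # Mutating dirnames in place stops os.walk from descending into
        # excluded directories, rather than scanning every file inside
        # them just to count it as excluded.
        dirnames[:] = [d for d in dirnames
                       if not is_excluded(os.path.join(dirpath, d))]
        for name in filenames:
            yield os.path.join(dirpath, name)
```

With a pattern like `*:\Windows*`, the walker never even enumerates the contents of `C:\Windows`, which is where most of the wasted scan time would come from.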
After a restart, Windows Search is not functional and index is gone.
It's unclear which files cause this, but it's repeatable when compressing the whole C: drive with the default blacklist.
In the settings, there's an option for choosing the compression method. One of them, LZX, says it's slow but has a high compression ratio. Does 'slow' refer only to the act of compressing, or would it also make accessing the files slower?
This includes modification time and last access time.
Sorry if this is an inappropriate use of GitHub; I normally only use this site as a user and don't really use my account for much. This is my first time reporting an issue of any kind for anything on GitHub.
So it's pretty much as the title suggests. I noticed it after doing some digging in AppData: I tried compressing a large game, only to see I had lost 50 GB of space. It really had me worried.
When compressing folders from OTHER drives, Compactor works normally and with CompactGUI, ALL my drives compress normally.
I can get into more specific details if need be, but I thought it might be worth including that my C drive is an SSD and my other 2 drives are HDDs.
When testing an updated i386 build, I noted that the path database does not filter out known-incompressibles found by the 64-bit build. This doesn't really make sense, since the databases should be bit-identical across architectures.
Hello Thomas!
I just wanted to pop in and say thank you for creating this handy piece of software. It's helped myself and many I know easily free up more bits and bytes. Great job!
All the best,
Blake
Tried whole folder - Recovery mode after reboot.
Is Compactor safe to use with SSD drives?
After a restart, Windows Defender history appears blank and non-functional. I did recover this in the past, but I don't remember the exact steps or which files are involved.
I'll update the issue if I find a solution.
subj
On my PC, the scanning part only uses ~11% of CPU (Ryzen 2600, 11% is usually single threaded stuff) and ~6% of my SSD (Crucial P1 NVME 1TB). Would it be possible to somehow speed this up?
Also, sometimes the compression itself doesn't use the PC fully either, usually disk usage is below 50% and CPU usage below 30%, I think there might be a way to optimize that too.
Windows 10 crashes to a BSoD with a PAGE FAULT IN NONPAGED AREA error when attempting to decompress an LZX-compressed directory.
I've noticed that if Compactor compresses Discord in %localappdata%, on the next launch Discord will force-expire the session and reset its settings, requiring the user to log in and configure it again.
I propose adding Discord to the blacklist.
wrong place whoops
Files that get updated a lot, particularly if they're large, are poor candidates for compaction.
An obvious simple heuristic that seems likely to be relatively effective is to check file modification time, and simply have a configurable cutoff - if it's changed in the past n
days, skip it. Tiering this by file size might also make sense - frequently decompressing and recompressing a small file is much less costly than a large one.
This sort of thing will be more important should Compactor migrate to a background system service with patrol compactions.
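The cutoff heuristic described above might be sketched like this; the tier boundaries and day counts below are purely illustrative assumptions, not values from the project.

```python
import time
from typing import Optional

DAY = 86400
# Hypothetical tiered cutoffs: small files are cheap to decompress and
# recompress, so they get a short window; large files get a longer one.
CUTOFFS = [
    (1 * 1024 * 1024,   3 * DAY),   # < 1 MiB: skip if modified within 3 days
    (100 * 1024 * 1024, 14 * DAY),  # < 100 MiB: within 14 days
]
DEFAULT_CUTOFF = 30 * DAY           # >= 100 MiB: within 30 days

def should_skip(size: int, mtime: float, now: Optional[float] = None) -> bool:
    """True if the file changed too recently to be worth compacting."""
    now = time.time() if now is None else now
    cutoff = next((c for limit, c in CUTOFFS if size < limit), DEFAULT_CUTOFF)
    return (now - mtime) < cutoff
```

A background patrol would call `should_skip(st.st_size, st.st_mtime)` per file and move on without touching anything recently modified.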
subj
The compresstimator currently cuts off at 0.95 - it might be nice to make this value (and indeed compresstimation in general) configurable, so users can choose between quickly getting the low-hanging fruit or getting every last byte of potential saving.
This would be best done alongside changes to the incompressible database to store the estimate.
I'd like the option to tell Compactor to compress .bin files. .bin files can be anything; they aren't always incompressible binaries, and compressing them could save a lot of space for certain games.
It seems that Compactor forces itself to not compress these. Can this be adjusted?
EDIT: It also seems to ignore CAS files. Is there something it detects to prevent compression?
I find that CompactGUI compresses some folders far more than Compactor, sometimes saving a couple of extra GB. (I tested with the same data, starting from decompressed.)
Offer a read-only dry-run mode using the compression API, simulating the effects of compacting a folder without actually doing it.
This allows for a safe way to present a user with potential savings, and offers an opportunity to validate excludes and compresstimation both more efficiently than actually performing compaction, and more accurately than relying on lz4 as a stand-in.
This links with the desire to benchmark algorithms, which also requires compression API access.
Every time I'm trying to launch the program I get a blank window and then it closes itself, sometimes I get this error (so far after 10 tries or so I got it twice):
The exception unknown software exception (0x80000003) occurred in the application at location 0x00007FFD5E799173.
Click OK to terminate this program.
I've tried deleting both AppData folders to no avail. This error has happened just now, last time I launched the program which was just a week more or less ago, it launched normally.
I already used the Properties → Advanced → "compress contents to save disk space" attribute for years, and it saved me 55GB on my tiny 250GB SSD. (I compress almost everything, even things you exclude.) But now I discovered this amazing tool and over the past two days I uncompressed and recompressed my entire drive, having to do it folder by folder. (As there was no way to free up 50GB for doing it all in one go.) I now have 10 GB additional free space! (65GB saved.)
Please add a Recompress option that does this automatically: uncompress files one by one and immediately recompress them, or a single folder at a time.
Please also add a "Compress without analyzing" button, as I trust this tool and don't care about waiting for the predicted details, only the actual results. I had to wait a long time for each folder. I wish I could have just "dry run" it.
I also tried it on Windows 7: my entire C: drive, compressed offline. It was a bad idea:
Please warn Win7 users upon opening the program. And Win10 users dual booting Win7...
Luckily an offline decompress saved my data and everything is back to normal.
I would like a "fall back to a compatible algorithm" option for Win7 users.
Taken from LZX — new Windows 10 NTFS compression algorithm:
I strongly recommend avoiding LZX and using xpress16k instead.
The xpress algorithms do not require a full read of the file before performing operations; they can decompress on the fly.
Thus, if you have a 1 GB file and move it to another drive (say), with LZX your OS will sit at 0 until the whole thing is decompressed, only starting to copy after sitting around for 30 seconds or so.
If you use xpress4k, xpress8k or xpress16k, the file will begin copying immediately.
And the compression-ratio improvement from xpress16k to LZX is modest - like 1.8 to 1.9, for example - so it is not worth the overhead, especially for large files.
I would suggest an option to use xpress16k on larger files even if LZX is chosen, with a switch to override it if really needed.
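The proposed override is simple to express; here is a sketch, where the 64 MiB threshold is an assumed example value, not anything from Compactor itself.

```python
# Assumed size threshold above which LZX is swapped for XPRESS16K,
# per the suggestion above. 64 MiB is an illustrative value.
LARGE_FILE = 64 * 1024 * 1024

def pick_algorithm(requested: str, size: int, force: bool = False) -> str:
    """Swap LZX for XPRESS16K on large files, unless explicitly forced."""
    if requested == "LZX" and size >= LARGE_FILE and not force:
        return "XPRESS16K"  # cheaper on-the-fly decompression for big files
    return requested
```

The `force` flag is the suggested escape hatch for users who really do want LZX everywhere.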
Since some files can take a relatively long time to compress, I think a toggleable option letting the user decide whether the program should close itself, or shut down the computer, once compression finishes could come in really handy for some users!
Whenever I try to compact an Xbox game on PC (Forza Horizon 4 for example) it says Access Denied: OS error 5. Any way to solve this? I have full admin control on my user and I've tried launching the .exe as admin.
For task scheduling, it would be useful to have some flags to run Compactor automatically on a certain folder.
Compactor claims that some files were compressed and tells me how many GiB were saved, but my disk is still as full as before. Free space didn't change at all, numbers are still the same. Why is that?
Just noticed Compactor idling now consumes a core. Appears to be Boscop/web-view#241
As the title says ...
I instantly opened Task Manager to see if it was still running; luckily, it was.
Opening the exe another time opens a new window.
How do I reopen the same window?
Also, if possible, please add a tray icon.
Following this Reddit thread, it would clearly be useful to warn the user if the selected folder can't be compressed.
There is literally a function for doing exactly this, I just need to hook it up.
My PC crashes with no BSOD when in the middle of compressing Metro Exodus.
Would it be possible to make this app available via winget?
Would it be possible to add a context menu option to quickly compress a folder, like CompactGUI does?
Is it worth using LZX, or will games run worse? I have a 3600X, which is a somewhat good CPU.
subj
I used Compactor on Genshin Impact (because why not), and it seems to skip a lot of files that could be compressed. I tried CompactGUI instead, and it compressed much better than Compactor. Can you add a setting so it compresses every file?
Compacting SQLite databases while they are in use causes them to become corrupted. I've not checked whether this is limited to SQLite, but after compacting my system drive, all the programs I had problems with had SQLite databases. These include Thunderbird and Sticky Notes (from MS), among others.
Simply excluding .sqlite or .db files and their corresponding -shm and -wal files isn't enough, as programs such as Chrome use SQLite DB files without any extensions.
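One content-based approach: every SQLite 3 database starts with the 16-byte header string `SQLite format 3\0`, so a sniff of the first bytes catches extensionless databases that name-based excludes miss. This is only a sketch; companion `-wal`/`-shm` files don't carry that header and would still need name-based handling alongside it.

```python
SQLITE_MAGIC = b"SQLite format 3\x00"  # first 16 bytes of every SQLite 3 file

def looks_like_sqlite(path: str) -> bool:
    """Content-based check that catches extensionless SQLite databases."""
    try:
        with open(path, "rb") as f:
            return f.read(len(SQLITE_MAGIC)) == SQLITE_MAGIC
    except OSError:
        return False  # unreadable/locked files are handled elsewhere
```

A compactor could skip (or defer) any file this returns true for, regardless of its name.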
There's like, 3 pixels left in this bar and it accounts for about 2/3rds of the data...
We've just had one release fixing a data corruption issue, and another one fixing a CPU use bug, and I have limited reach to inform people of these updates.
At its simplest, I could simply put a link to the releases page on the initial window and encourage people to manually check it from time to time. Easy to do, but equally easy to ignore and makes work for the user.
A custom check shouldn't be that difficult, but it's a bit of a can of worms with regard to how it's triggered, how the user can opt out, and even just the basics of how it works; I guess checking a bit of JSON hosted on GitHub, triggered from a button the user hits.
Automated updates... can probably wait for the Steam release. Which would also be a convenient if somewhat inefficient place for people to donate...
As an addition to the great if-statement trivial-hash machine-learning advanced condition-based AI logic database (sorry, couldn't resist), I would suggest implementing a list of files/folders in categories (like archives, databases, etc.) that are not meant to be compressed, instead of a huge list with little to no explanation.
For files this would be .etl, .etw, and other database or log files (like those from Windows Search):
*.etl
*.etw
*.evtx
*.jrs
*.jtx
*.jcp
*.jfm
*.edb
*.crwl
*.gthr
%ALLUSERSPROFILE%\Microsoft\Windows Defender
%ALLUSERSPROFILE%\Microsoft\Search
%ALLUSERSPROFILE%\NVIDIA
%TEMP%
This would speed the tool up even more, and reduce SSD wear.
Also, a really cool option would be a way to scan for magic numbers: read the first few bytes of a file (if the file extension is unknown), compare them against a list of known magic numbers, and decide how to compress accordingly.
A possible way to programmatically get some of those files/paths would be to read the registry and see where some of those files are stored.
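The magic-number idea above can be sketched as a simple prefix lookup. The table here is illustrative only; real magic-number lists run to hundreds of entries, and which kinds to skip is a policy decision, not a fact about the formats.

```python
from typing import Optional

# Illustrative table only; a real list would be far longer.
MAGIC_NUMBERS = {
    b"PK\x03\x04": "zip",   # already compressed: likely skip
    b"\x1f\x8b":   "gzip",  # already compressed: likely skip
    b"\x89PNG":    "png",   # already compressed: likely skip
    b"%PDF":       "pdf",   # mixed content: worth trying
}

def sniff(header: bytes) -> Optional[str]:
    """Classify a file by the first few bytes of its contents."""
    for magic, kind in MAGIC_NUMBERS.items():
        if header.startswith(magic):
            return kind
    return None
```

A scanner would read the first handful of bytes of each unknown-extension file, call `sniff`, and route already-compressed kinds straight to the skip list without attempting compression.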
Thanks for this great tool! It works better than anything else I used before, as it uses the API instead of just being a wrapper around compact.exe. Very nice!