parpar's Issues

Building Static Binary Issue

Following the instructions you have on your Nyuu repo using your nexe build script, I get the following error coming from the resulting binary after building:

internal/bootstrap/loaders.js:111
    throw new Error(`No such module: ${module}`);
    ^

Error: No such module: parpar_gf
    at process.binding (internal/bootstrap/loaders.js:111:11)
    at Object.<anonymous> (/home/lib/par2.js:4:18)
    at Module._compile (internal/modules/cjs/loader.js:945:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:962:10)
    at Module.load (internal/modules/cjs/loader.js:798:32)
    at Function.Module._load (internal/modules/cjs/loader.js:711:12)
    at Module.require (internal/modules/cjs/loader.js:838:19)
    at require (internal/modules/cjs/helpers.js:74:18)
    at Object.<anonymous> (/home/lib/parpar.js:3:12)
    at Module._compile (internal/modules/cjs/loader.js:945:30)

node is 12.12.0
nexe is 3.3.2
node sources in the next folder are 4.9.1
yencode-src is latest pulled from your yencode repo

I am trying to build a static binary for linux.

Only one big par2 file gets created

I installed parpar today and I'm so far loving it (what a difference in speed compared to par2cmdline).

I run it with the following settings:
parpar -s2800000B -r5% -o xyz.par2 xyz*.rar
The 2800000B is a multiple of my 700K Nyuu article size, and I always like to have 5% redundancy.

My problem is that it creates the "little" .par2 file and only ONE big .volxxx+xxx.par2 file.
In my current example it created a 3700MB file for about 74GB of archives.
How can I achieve the "normal" par2 behaviour of several .volxxx+xxx.par2 files?

PS: Sorry, I know this isn't an issue but a user error on my side, but I didn't know where else to ask this question.
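For what it's worth, the reported size is consistent with all recovery slices landing in a single volume. A quick check of the arithmetic (taking "about 74GB" as 74e9 bytes, an assumption):

```javascript
// Sanity check of the numbers reported above: recovery size is recovery
// slices × slice size, with slice counts rounded up.
const sliceBytes = 2800000;
const inputBytes = 74e9; // "about 74GB of archives"
const inputSlices = Math.ceil(inputBytes / sliceBytes);   // 26429
const recoverySlices = Math.ceil(inputSlices * 0.05);     // 5% -> 1322
const recoveryMB = recoverySlices * sliceBytes / 1e6;
console.log(recoveryMB.toFixed(0) + ' MB'); // 3702 MB, i.e. one ~3.7 GB volume
```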

Repair completed but the data files still appear to be damaged

Hi,

Odd question maybe. I've seen many downloads fail recently; the PAR2 creation software reported by NZBGet is almost always ParPar (sometimes par2cmdline).

I tried uploading some things with Usenet Farm and then downloading them using 7 USPs in NZBGet 10 minutes later. NZBGet reports 99.9% availability, even on the same USP used to upload.

After downloading it starts repairing, but it always fails with "repair completed but the data files still appear to be damaged". Recovery files created by: ParPar v0.4.0-dev [https://animetosho.org/app/parpar]

I checked the logs for one of them and it had 1 bad block in just 1 file. After the failed repair it now had several bad blocks in several files.

Another upload seemed to download fine but then came back with corrupted RARs and another failed repair. The log for that one is pretty long and posted here: https://kopy.io/py5x5#yPCMI7lVUZXKtV

Here's a check on that same file with par2cmdline:

Loading "9d38b600a3c96c4c7a3d401b760a6eee.par2".
Loaded 114 new packets
Loading "9d38b600a3c96c4c7a3d401b760a6eee.par2".
No new packets found

There are 56 recoverable files and 0 other files.
The block size used was 5242880 bytes.
There are a total of 2225 data blocks.
The total size of the data files is 11663825678 bytes.

Verifying source files:

Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part01.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part02.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part01.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part02.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part03.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part04.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part03.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part04.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part05.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part06.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part05.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part06.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part07.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part08.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part07.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part08.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part09.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part10.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part09.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part10.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part11.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part12.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part11.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part12.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part13.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part14.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part13.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part14.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part15.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part16.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part15.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part16.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part17.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part18.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part17.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part18.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part19.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part20.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part19.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part20.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part21.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part22.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part21.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part22.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part23.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part24.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part23.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part24.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part25.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part26.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part25.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part26.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part27.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part28.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part27.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part28.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part29.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part30.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part29.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part30.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part31.rar"
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part32.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part31.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part33.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part32.rar" - damaged. Found 37 of 40 data blocks.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part34.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part33.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part35.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part34.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part36.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part35.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part37.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part36.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part38.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part37.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part39.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part38.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part40.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part39.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part41.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part40.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part42.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part41.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part43.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part42.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part44.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part43.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part45.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part44.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part46.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part45.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part47.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part46.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part48.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part47.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part49.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part48.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part50.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part49.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part51.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part50.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part52.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part51.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part53.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part52.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part54.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part53.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part55.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part54.rar" - found.
Opening: "9d38b600a3c96c4c7a3d401b760a6eee.part56.rar"
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part55.rar" - found.
Target: "9d38b600a3c96c4c7a3d401b760a6eee.part56.rar" - found.

Scanning extra files:


Repair is required.
1 file(s) exist but are damaged.
55 file(s) are ok.
You have 2222 out of 2225 data blocks available.
Repair is not possible.
You need 3 more recovery blocks to be able to repair.

I used Nyuu to upload these 200mb RARs and tried these settings:

  "check-connections": 3,
  "check-tries": 3,
  "check-delay": "5s",
  "check-retry-delay": "5s",
  "check-post-tries": 3,

  "post-retries": 3,
  "request-retries": 3,
  "connect-retries": 3,
  "retry-on-bad-resp": true,
  "reconnect-delay": 3000,

  "skip-errors": false,
  "quiet": false

Meanwhile I used ParPar to create the PARs like this:

parpar.js -n -r5% -s5M --seq-read-size=5M -dpow2 -o outputname.par2 sourcefiles

No clue what's happening and why and hoping you could shed some light on this.

Guidance on how to pick slices and max/min inputs?

As a noob, I don't really know the ideal way of picking the correct number of slices; is there a guide on how to pick? So far I've just been doing random numbers with:

~/ParPar-0.3.2/bin/parpar.js -s random$M -r12% -o input ~/input.mkv

Sorry if this is really basic, but I haven't seen other people ask this question on Reddit or elsewhere.

Edit: I also meant to ask how slices should be adjusted according to the size and number of the file(s).
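Not official guidance, just the arithmetic that usually drives the choice; the 32768 figure is the input-slice limit from the PAR2 specification, and PAR2 requires the slice size to be a multiple of 4:

```javascript
// Sketch of the arithmetic behind slice-size choice. The PAR2 format allows
// at most 32768 input slices, so the slice size must be large enough to keep
// the slice count under that limit; beyond that, smaller slices give finer
// recovery granularity at the cost of more overhead.
const MAX_SLICES = 32768;

// How many input slices a given slice size produces.
function sliceCount(totalBytes, sliceBytes) {
  return Math.ceil(totalBytes / sliceBytes);
}

// Smallest permissible slice size (rounded up to a multiple of 4) that keeps
// the slice count within the format limit.
function minSliceSize(totalBytes) {
  const raw = Math.ceil(totalBytes / MAX_SLICES);
  return Math.ceil(raw / 4) * 4;
}

const total = 25 * 1024 ** 3; // a hypothetical 25 GiB input
console.log(sliceCount(total, 5 * 1024 ** 2)); // 5120 slices at 5 MiB
console.log(minSliceSize(total)); // 819200 bytes
```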

Verbose commandline switch?

Is it possible to include a verbose command-line switch so we get more feedback at runtime that it is actually doing stuff?

I am mainly asking because I run your application from mine. But right now my code checks if the external application is alive by looking at the returned messages. If no messages are returned in X minutes (configurable, default: 5) I consider the external process hung and kill it. A verbose mode would let me know things are still being worked on.

-d pow2 and -p parameters do not seem to work together

Hello,

I'm trying to use both the -d pow2 and -p parameters to limit the number of recovery slices each .par2 file contains, and I'm getting unexpected results.

For example this command:

parpar -s 1536000 -r 573 -d pow2 -p 34

should generate 573 recovery slices of 1536000 bytes each, distributed across par2 files with at most 34 slices per file, starting from 1 slice, then 2, 4, 8, 16, 32, and then 34 for the rest of the par2 files. I should then have 21 files (1 + 2 + 4 + 8 + 16 + 32 + 34*15 = 573).

Unfortunately the program generates only 10 files (+ the single .par2), for example:
test.vol000+001.par2
test.vol001+002.par2
test.vol003+004.par2
test.vol007+008.par2
test.vol015+016.par2
test.vol031+032.par2
test.vol063+034.par2
test.vol127+034.par2
test.vol255+034.par2
test.vol511+034.par2

Where are the rest of the files?

This is working fine with other software like MultiPar.

Thanks

ParPar recurse for rar files in a folder in Windows

Hi

I have several rar files inside a folder, and I saw that under Linux it might work like:

parpar -s 2M -r 10% -m 3000M -o "/156570e54cd242c1847ed39a4b923e11.par2" "/156570e54cd242c1847ed39a4b923e11*rar"

But how about the Windows command prompt?

I tried:

parpar -s 2M -r 10% -m 3000M -o "F:\8043f2ba7cce429683acd5bad\8043f2ba7cce429683acd5bad.par2" "F:\8043f2ba7cce429683acd5bad\*.rar"

but I got:

ENOENT: no such file or directory, stat 'F:\8043f2ba7cce429683acd5bad\*.rar'

even though that folder (8043f2ba7cce429683acd5bad) has over 10 rar files, all with the .rar extension!

Ubuntu 20.04: "Error: Cannot find module '../build/Release/parpar_gf.node'"

Possibly a PEBKAC, but I get this on Ubuntu 20.04:


$ sudo npm install -g @animetosho/parpar
/usr/local/bin/parpar -> /usr/local/lib/node_modules/@animetosho/parpar/bin/parpar.js

> @animetosho/[email protected] install /usr/local/lib/node_modules/@animetosho/parpar
> node-gyp rebuild

I'm assuming that output is OK. So:

$ parpar
internal/modules/cjs/loader.js:638
    throw err;
    ^

Error: Cannot find module '../build/Release/parpar_gf.node'
    at Function.Module._resolveFilename (internal/modules/cjs/loader.js:636:15)
    at Function.Module._load (internal/modules/cjs/loader.js:562:25)
    at Module.require (internal/modules/cjs/loader.js:692:17)
    at require (internal/modules/cjs/helpers.js:25:18)
    at Object.<anonymous> (/usr/local/lib/node_modules/@animetosho/parpar/lib/par2.js:4:10)
    at Module._compile (internal/modules/cjs/loader.js:778:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:789:10)
    at Module.load (internal/modules/cjs/loader.js:653:32)
    at tryModuleLoad (internal/modules/cjs/loader.js:593:12)
    at Function.Module._load (internal/modules/cjs/loader.js:585:3)

Based on #19 (comment) I tried

$ sudo npm install [email protected]
npm WARN deprecated [email protected]: request has been deprecated, see https://github.com/request/request/issues/3142
npm WARN saveError ENOENT: no such file or directory, open '/home/sander/package.json'
npm notice created a lockfile as package-lock.json. You should commit this file.
npm WARN enoent ENOENT: no such file or directory, open '/home/sander/package.json'
npm WARN sander No description
npm WARN sander No repository field.
npm WARN sander No README data
npm WARN sander No license field.

+ [email protected]
added 266 packages from 278 contributors and audited 1518 packages in 13.06s

2 packages are looking for funding
  run `npm fund` for details

found 1 low severity vulnerability
  run `npm audit fix` to fix them, or `npm audit` for details

I have no idea what is going on. Tips?

More:
I tried on my Ubuntu 18.04, and that gives a different output:

$ sudo npm install -g @animetosho/parpar
[sudo] password for sander: 
/usr/local/bin/parpar -> /usr/local/lib/node_modules/@animetosho/parpar/bin/parpar.js

> @animetosho/[email protected] install /usr/local/lib/node_modules/@animetosho/parpar
> node-gyp rebuild

make: Entering directory '/usr/local/lib/node_modules/@animetosho/parpar/build'
  CC(target) Release/obj.target/gf-complete/gf-complete/gf.o
  CC(target) Release/obj.target/gf-complete/gf-complete/gf_w16.o
  CC(target) Release/obj.target/gf-complete/gf-complete/gf_w16/shuffle128.o
  CC(target) Release/obj.target/gf-complete/gf-complete/gf_w16/shuffle128_neon.o
  CC(target) Release/obj.target/gf-complete/gf-complete/gf_w16/shuffle256.o
  CC(target) Release/obj.target/gf-complete/gf-complete/gf_w16/shuffle512.o
  CC(target) Release/obj.target/gf-complete/gf-complete/gf_w16/xor128.o
  CC(target) Release/obj.target/gf-complete/gf-complete/gf_w16/xor256.o
  CC(target) Release/obj.target/gf-complete/gf-complete/gf_w16/xor512.o
  CC(target) Release/obj.target/gf-complete/gf-complete/gf_w16/affine128.o
  CC(target) Release/obj.target/gf-complete/gf-complete/gf_w16/affine512.o
  AR(target) Release/obj.target/gf-complete.a
  COPY Release/gf-complete.a
  CC(target) Release/obj.target/multi_md5/md5/md5.o
  CC(target) Release/obj.target/multi_md5/md5/md5-simd.o
  AR(target) Release/obj.target/multi_md5.a
  COPY Release/multi_md5.a
  CXX(target) Release/obj.target/parpar_gf/src/gf.o
  CXX(target) Release/obj.target/parpar_gf/gf-complete/module.o
  CXX(target) Release/obj.target/parpar_gf/src/gyp_warnings.o
  SOLINK_MODULE(target) Release/obj.target/parpar_gf.node
  COPY Release/parpar_gf.node
make: Leaving directory '/usr/local/lib/node_modules/@animetosho/parpar/build'
/usr/local/lib
└── @animetosho/[email protected] 

and then OK:

$ parpar
Values for `out` and `input-slices` are required
Enter `parpar --help` for usage information

macOS compile error

Versions
sw_vers

ProductName: Mac OS X
ProductVersion: 10.13.6
BuildVersion: 17G65

node --version

v8.14.0

gcc --version

Configured with: --prefix=/Applications/Xcode.app/Contents/Developer/usr --with-gxx-include-dir=/usr/include/c++/4.2.1
Apple LLVM version 9.0.0 (clang-900.0.39.2)
Target: x86_64-apple-darwin17.7.0
Thread model: posix
InstalledDir: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin

/usr/bin/xcodebuild -version

Xcode 9.2
Build version 9C40

The error
$ npm install -g @animetosho/parpar

clang: error: unsupported option '-fopenmp'

I've just done some basic novice searching and nothing has worked. Any help is appreciated.

How to add files from subdirectories ?

If I add a folder that includes subdirectories as input, files inside those are ignored by ParPar. I know I can add them file by file, but it's not convenient.

Pipe outputs to another process

The README says that "output doesn’t have to be written to disk", but I can't find documentation on how to do that.

Specifically, how would we combine ParPar with, for example, Nyuu, to have a single pipeline from a source file to a group containing that file and its recovery files for Nyuu to post, without touching the disk?

failed install on Mac

matt@MacBook-Pro-Matt Downloads % npm install -g @animetosho/parpar
/Users/matt/.nvm/versions/node/v8.11.2/bin/parpar -> /Users/matt/.nvm/versions/node/v8.11.2/lib/node_modules/@animetosho/parpar/bin/parpar.js

[email protected] install /Users/matt/.nvm/versions/node/v8.11.2/lib/node_modules/@animetosho/parpar/node_modules/yencode
node-gyp rebuild

CXX(target) Release/obj.target/crcutil/crcutil-1.0/code/crc32c_sse4.o
CXX(target) Release/obj.target/crcutil/crcutil-1.0/code/multiword_64_64_cl_i386_mmx.o
CXX(target) Release/obj.target/crcutil/crcutil-1.0/code/multiword_64_64_gcc_amd64_asm.o
CXX(target) Release/obj.target/crcutil/crcutil-1.0/code/multiword_64_64_gcc_i386_mmx.o
CXX(target) Release/obj.target/crcutil/crcutil-1.0/code/multiword_64_64_intrinsic_i386_mmx.o
CXX(target) Release/obj.target/crcutil/crcutil-1.0/code/multiword_128_64_gcc_amd64_sse2.o
CXX(target) Release/obj.target/crcutil/crcutil-1.0/examples/interface.o
LIBTOOL-STATIC Release/crcutil.a
CXX(target) Release/obj.target/yencode_sse2/src/encoder_sse2.o
In file included from ../src/encoder_sse2.cc:4:
../src/encoder_sse_base.h:31:2: error: use of undeclared identifier 'aligned_alloc'
ALIGN_ALLOC(lookups, sizeof(lookups), 16);
^
../src/common.h:42:57: note: expanded from macro 'ALIGN_ALLOC'
#define ALIGN_ALLOC(buf, len, align) *(void**)&(buf) = aligned_alloc(align, ((len) + (align)-1) & ~((align)-1))
^
1 error generated.
make: *** [Release/obj.target/yencode_sse2/src/encoder_sse2.o] Error 1
gyp ERR! build error
gyp ERR! stack Error: make failed with exit code: 2
gyp ERR! stack at ChildProcess.onExit (/Users/matt/.nvm/versions/node/v8.11.2/lib/node_modules/npm/node_modules/node-gyp/lib/build.js:258:23)
gyp ERR! stack at emitTwo (events.js:126:13)
gyp ERR! stack at ChildProcess.emit (events.js:214:7)
gyp ERR! stack at Process.ChildProcess._handle.onexit (internal/child_process.js:198:12)
gyp ERR! System Darwin 19.4.0
gyp ERR! command "/Users/matt/.nvm/versions/node/v8.11.2/bin/node" "/Users/matt/.nvm/versions/node/v8.11.2/lib/node_modules/npm/node_modules/node-gyp/bin/node-gyp.js" "rebuild"
gyp ERR! cwd /Users/matt/.nvm/versions/node/v8.11.2/lib/node_modules/@animetosho/parpar/node_modules/yencode
gyp ERR! node -v v8.11.2
gyp ERR! node-gyp -v v3.6.2
gyp ERR! not ok
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! [email protected] install: node-gyp rebuild
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the [email protected] install script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.

npm ERR! A complete log of this run can be found in:
npm ERR! /Users/matt/.npm/_logs/2020-05-18T06_39_06_467Z-debug.log
matt@MacBook-Pro-Matt Downloads %

Error while running

~/compile/ParPar/bin/parpar.js -r 75% -s 10M -c "Set 2" -o set2 *
Multiply method used: XOR JIT (128 bit), 4 threads
Generating 99.1 GiB recovery data (10148 slices) from 132.12 GiB of data
~/compile/ParPar/lib/par2.js:266
if(this._processStarted) throw new Error('Cannot reset recovery buffers whilst processing');
^

Error: Cannot reset recovery buffers whilst processing
at PAR2Chunked.bufferedClear (/compile/ParPar/lib/par2.js:266:34)
at PAR2Chunked.setChunkSize (/compile/ParPar/lib/par2.js:970:8)
at PAR2Chunked.setRecoverySlices (/compile/ParPar/lib/par2.js:904:9)
at PAR2Gen.freeMemory (/compile/ParPar/lib/par2gen.js:602:18)
at ~/compile/ParPar/lib/par2gen.js:907:9
at /usr/share/javascript/async/async.js:680:28
at ~/compile/ParPar/lib/par2gen.js:865:19
at /usr/share/javascript/async/async.js:680:28
at /usr/share/javascript/async/async.js:254:17
at /usr/share/javascript/async/async.js:151:21

Error compiling on RPI (armhf)

Hi,
I've just tried to install ParPar on my Raspberry PI 4 running Raspbian and I'm getting a compilation error:

In file included from ../gf-complete/gf_w16.c:25:
../gf-complete/gf_w16_additions.c: In function ‘detect_cpu’:
../gf-complete/gf_w16_additions.c:124:35: error: ‘HWCAP_ASIMD’ undeclared (first use in this function); did you mean ‘HWCAP_THUMB’?
  has_neon = getauxval(AT_HWCAP) & HWCAP_ASIMD;
                                   ^~~~~~~~~~~
                                   HWCAP_THUMB
../gf-complete/gf_w16_additions.c:124:35: note: each undeclared identifier is reported only once for each function it appears in
make: *** [gf-complete.target.mk:115: Release/obj.target/gf-complete/gf-complete/gf_w16.o] Error 1
make: Leaving directory '/home/pi/apps/ParPar/build'
gyp ERR! build error 
gyp ERR! stack Error: `make` failed with exit code: 2
gyp ERR! stack     at ChildProcess.onExit (/usr/share/node-gyp/lib/build.js:262:23)
gyp ERR! stack     at ChildProcess.emit (events.js:189:13)
gyp ERR! stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:248:12)
gyp ERR! System Linux 4.19.75-v7l+
gyp ERR! command "/usr/bin/node" "/usr/bin/node-gyp" "rebuild"
gyp ERR! cwd /home/pi/apps/ParPar
gyp ERR! node -v v10.15.2
gyp ERR! node-gyp -v v3.8.0
gyp ERR! not ok 
pi@raspberrypi:~/apps/ParPar $ 

Can it be compiled on armhf architecture or is it not supported?

When supplying a file list, does \n or \0 act as separator vs terminator?

Running ParPar on Debian 11 under WSL (Windows Subsystem for Linux) on Windows 11.

ParPar was installed with the one-liner npm install -g @animetosho/parpar.

On the help screen it says about input files:

  -i,  --input-file          Supply a list of files to be included as input,
                             separated by newlines. Can be `-` to read from
                             stdin, or a command prefixed with proc:// to read
                             from the stdout of specified process (example:
                             `proc://cat somefile.txt`). Can also be an fd
                             prefixed with fd:// (requires NodeJS >= 0.12),
                             i.e. `fd://0` is an alias for stdin.
                             Can be specified multiple times.
  -0,  --input-file0         Same as the `--input-file` option, except files
                             are separated by null characters instead of
                             newlines.

Are you using the word 'separated' in a rigorous sense? Normally, if items in a list are separated by a delimiter, all except the last one have a trailing delimiter. If items are terminated, all (including the last one) have a trailing delimiter.

Anyway, I did some tests on the behaviour when using \0 and \n as separator and as terminator:

Test steps

  1. Create a test directory:

    mkdir ~/testbed
    cd ~/testbed
    
  2. Create three test files a, b and c.

    for file in a b c; do dd if=/dev/urandom of="$file" bs=10000 count=400; done
    
  3. Create parity set using \n as separator (works OK)

    printf "a\nb\nc" | parpar -i - -s10000B -r10% --out test1.par2
    
  4. Create parity set using \n as terminator (works OK)

    printf "a\nb\nc\n" | parpar -i - -s10000B -r10% --out test2.par2
    
  5. Create parity set using \0 as separator (works OK)

    printf "a\0b\0c" | parpar -0 - -s10000B -r10% --out test3.par2
    
  6. Create parity set using \0 as terminator (not OK)

    printf "a\0b\0c\0" | parpar -0 - -s10000B -r10% --out test4.par2
    

    The error message is

    Error: ENOENT: no such file or directory, stat ''
    

So if \n is used to delimit paths, ParPar does not mind whether the paths are separated or terminated. But if \0 is used, ParPar errors when it is used as a terminator.

Probably there are bugs, but I can't say which without knowing whether \n and \0 are intended as path separators or terminators.

[P.S. Usually such a list is terminated, because creating a separated list requires extra logic to ensure the last item does not end with a delimiter, which is less simple.]
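The asymmetry observed above matches a naive split on the delimiter that keeps a trailing empty entry: a terminated \0 list yields an empty filename at the end, and stat('') then fails with ENOENT. (Illustration only, not ParPar's actual parsing code.)

```javascript
// Splitting a separated vs. a terminated list on the delimiter:
console.log('a\0b\0c'.split('\0'));   // [ 'a', 'b', 'c' ]
console.log('a\0b\0c\0'.split('\0')); // [ 'a', 'b', 'c', '' ]

// A tolerant reader would drop empty entries before stat-ing anything:
function parseList(input, delim) {
  return input.split(delim).filter((s) => s.length > 0);
}
console.log(parseList('a\0b\0c\0', '\0')); // [ 'a', 'b', 'c' ]
```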

Generating the files takes a long time when the source size exceeds 50 GB

Hello!

This is not exactly an issue but rather a question. I have the following parameter configuration, and what I do is pass a set of files (between 10 and 20) that in total add up to between 50 and 100 GB. Could you help me understand why it takes approximately 1 hour to generate the par2 files?

It is running on a 12-core CPU and 16 GB of RAM, with an M.2 disk.

parpar -s5M -r1n*0.6 --threads 14 --memory 2048M -p1l --progress stdout -q file1 file2 file3 ...
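A rough throughput figure helps judge whether an hour is slow here; the midpoint of the stated range is assumed:

```javascript
// Back-of-envelope throughput for the scenario above: ~75 GB processed in
// about an hour. Recovery generation cost grows with input size × recovery
// slice count, so the raw MB/s rate is what to compare against other tools
// or hardware.
const inputBytes = 75e9; // midpoint of the 50-100 GB range (assumption)
const seconds = 3600;
console.log((inputBytes / seconds / 1e6).toFixed(1) + ' MB/s'); // 20.8 MB/s
```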

No multithreading in macOS

Versions
sw_vers

ProductName: Mac OS X
ProductVersion: 10.12.6
BuildVersion: 16G29

node --version

v8.4.0

parpar --version

0.2.2

that's e351cc4
also tried git head as in b1dc024

Problem
parpar -r5% -s 716800 -o test test.mkv

Method used: Shuffle (128 bit), 1 thread

Only 1 thread; I was expecting 4.

using -t, --threads even leads to an error:
parpar -t 4 -r5% -s 716800 -o test test.mkv

/Users/test/src/ParPar/bin/parpar.js:156
if(argv.t) ParPar.setMaxThreads(argv.t | 0);
^

TypeError: ParPar.setMaxThreads is not a function
at Object.<anonymous> (/Users/test/src/ParPar/bin/parpar.js:156:19)
at Module._compile (module.js:573:30)
at Object.Module._extensions..js (module.js:584:10)
at Module.load (module.js:507:32)
at tryModuleLoad (module.js:470:12)
at Function.Module._load (module.js:462:3)
at Function.Module.runMain (module.js:609:10)
at startup (bootstrap_node.js:158:16)
at bootstrap_node.js:598:3

Building v0.2.2 results in a binary with YError messages

Trying to build parpar v0.2.2 at this point in time results in a binary that outputs YError messages.
Possibly some npm dependency issue.

Steps to reproduce
git checkout tags/v0.2.2

npm i

./bin/parpar.js

YError: Invalid first argument. Expected boolean or string but received function.
at argumentTypeError (/Users/test/src/ParPar/node_modules/yargs/lib/argsert.js:65:9)
at parsed.optional.forEach (/Users/test/src/ParPar/node_modules/yargs/lib/argsert.js:47:39)
at Array.forEach (<anonymous>)
at argsert (/Users/test/src/ParPar/node_modules/yargs/lib/argsert.js:42:21)
at Object.version (/Users/test/src/ParPar/node_modules/yargs/yargs.js:757:5)
at Object.<anonymous> (/Users/test/src/ParPar/bin/parpar.js:34:3)
at Module._compile (module.js:649:30)
at Object.Module._extensions..js (module.js:660:10)
at Module.load (module.js:561:32)
at tryModuleLoad (module.js:501:12)
ParPar, a high performance PAR2 creation tool
[...]

Possible culprits
npm list

[email protected] /Users/test/src/ParPar
├─┬ [email protected]
│ └── [email protected]
├─┬ [email protected]
│ ├─┬ [email protected]
│ │ ├── [email protected] deduped
│ │ ├─┬ [email protected]
│ │ │ └── [email protected]
│ │ └─┬ [email protected]
│ │ ├─┬ [email protected]
│ │ │ ├── [email protected]
│ │ │ ├─┬ [email protected]
│ │ │ │ └── [email protected]
│ │ │ └── [email protected] deduped
│ │ └─┬ [email protected]
│ │ └── [email protected]
│ ├── [email protected]
│ ├─┬ [email protected]
│ │ └─┬ [email protected]
│ │ ├─┬ [email protected]
│ │ │ └─┬ [email protected]
│ │ │ └── [email protected]
│ │ └── [email protected]
│ ├── [email protected]
│ ├─┬ [email protected]
│ │ ├─┬ [email protected]
│ │ │ ├─┬ [email protected]
│ │ │ │ ├─┬ [email protected]
│ │ │ │ │ ├── [email protected]
│ │ │ │ │ └── [email protected]
│ │ │ │ ├─┬ [email protected]
│ │ │ │ │ └── [email protected]
│ │ │ │ └─┬ [email protected]
│ │ │ │ └── [email protected]
│ │ │ ├── [email protected]
│ │ │ ├── [email protected]
│ │ │ ├─┬ [email protected]
│ │ │ │ └── [email protected]
│ │ │ ├── [email protected]
│ │ │ ├── [email protected]
│ │ │ └── [email protected]
│ │ ├─┬ [email protected]
│ │ │ └── [email protected]
│ │ └─┬ [email protected]
│ │ └── [email protected]
│ ├── [email protected]
│ ├── [email protected]
│ ├── [email protected]
│ ├─┬ [email protected]
│ │ ├── [email protected]
│ │ └── [email protected] deduped
│ ├── [email protected]
│ ├── [email protected]
│ └─┬ [email protected]
│ └── [email protected]
└── [email protected]

Workaround
Use git HEAD, but you didn't recommend that in the past:
https://www.reddit.com/r/usenet/comments/7bdymn/some_of_your_failing_downloads_might_be_due_to_a/dpipd9s/
So I would prefer a fixed, released version.

Recovery file size

In par2cmdline I used to have the option '-l' activated to limit file size. I recall that this limits the recovery files to about the size of the .rars and splits them into multiple files if the total recovery size is greater.

I realised after parpar finished (using '-s8192000b -r10%') that I only got a single recovery file, which is pretty much useless for posting to Usenet. Then I found the '-p' switch, but for bigger archives this results in way too many files (e.g. my slice size is 8192000b, and if I limit it to 10 slices per file, then my recovery files are only 82 MB, which is fine for smaller archives of about 3 to 5 GB, but for 50 GB archives I'd like them to be around 250 MB, just like the rar size).

Is it possible to add a similar option as the '-l' switch in par2cmdline? Or did I miss something?
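As a stopgap, the per-file slice count for a target volume size can be derived from the slice size and passed to `-p`. This is only a sketch; it assumes `-p` counts recovery slices per file, as described above:

```shell
# Derive -p from a target recovery-volume size, given a fixed slice size.
# Assumption: -p limits the number of recovery slices per output file.
slice_size=8192000                         # matches -s8192000b above
target_bytes=$((250 * 1000 * 1000))        # desired ~250 MB per volume
slices_per_file=$((target_bytes / slice_size))
echo "$slices_per_file"
# then (hypothetical invocation):
# parpar -s ${slice_size}b -r 10% -p ${slices_per_file} -o my_recovery.par2 *.rar
```

With 30 slices per file, each recovery volume would come to roughly 246 MB, close to the 250 MB target.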

Error when trying to run Parpar from the command line on Debian

I have successfully installed Parpar with no errors. But When running in the terminal I get this error

`tommyboy@storm :~$ parpar
internal/modules/cjs/loader.js:1187
return process.dlopen(module, path.toNamespacedPath(filename));
^

Error: /home200/tommy/lib/node_modules/@animetosho/parpar/build/Release/parpar_gf.node: undefined symbol: gf_w16_xor_final_avx512
at Object.Module._extensions..node (internal/modules/cjs/loader.js:1187:18)
at Module.load (internal/modules/cjs/loader.js:985:32)
at Function.Module._load (internal/modules/cjs/loader.js:878:14)
at Module.require (internal/modules/cjs/loader.js:1025:19)
at require (internal/modules/cjs/helpers.js:72:18)
at Object. (/home23/usb668/lib/node_modules/@animetosho/parpar/lib/par2.js:4:10)
at Module._compile (internal/modules/cjs/loader.js:1137:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1157:10)
at Module.load (internal/modules/cjs/loader.js:985:32)
at Function.Module._load (internal/modules/cjs/loader.js:878:14)
`

"MD5_SIMD_NUM == 2" may fail

I referred to ParPar's MD5 code for optimization, and was able to improve my own MD5 function. Thank you.

However, I found a strange point in your code. Though it's not actually used now, it may cause a problem someday. The file is "md5-sse2.c", in the macro below line 99.

X(a) = _mm_loadl_epi64(data0++);
X(b) = _mm_loadl_epi64(data1++);

These lines read 8 bytes each from data0 and data1, but the pointer increments advance them by 16 bytes. While I'm not sure about the data alignment, 8 bytes may be lost between calls to the macro. I thought that adding one more variable as an offset might be good.

Par2cmdline compatible command line parameters.

Hi,

Would it be possible to implement a "par2cmdline"-compatibility switch?
I am currently porting my backup scripts to use parpar. But I am missing the
equivalent of the -r switch - that is the level of redundancy.

The -r option of parpar seems to differ from par2cmdline:

ParPar:
-r, --recovery-slices Number of recovery slices to generate.

par2cmdline:
-r<n> : Level of Redundancy (%%)

This is the option I miss the most - but having a "fully compatible" command line would be quite handy.
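Until such a switch exists, a thin wrapper can translate the par2cmdline form into ParPar's. A sketch, assuming ParPar accepts the `-rN%` percentage form its help documents:

```shell
# Translate par2cmdline-style "-rN" (whole-number percent) into ParPar's
# percentage form "-rN%". Hypothetical helper, for use in backup scripts.
par2_compat_redundancy() {
    printf -- '-r%s%%' "$1"
}
par2_compat_redundancy 10   # prints: -r10%
```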

Error Compiling gf_w16.c

Attempted to compile the current ParPar (as of commit b8961c8) on CentOS Linux release 7.4.1708 (Core). I used git clone to download the files, and when I attempt to compile I get the error listed below.

Specifically it appears to be related to this:
../gf-complete/gf_w16.c:390:5: error: ‘for’ loop initial declarations are only allowed in C99 mode
for(int i=0; i<16; i++) {
^
../gf-complete/gf_w16.c:390:5: note: use option -std=c99 or -std=gnu99 to compile your code

If I move the declaration of i out of the for loop and just set i=0 in the for loop, everything compiles correctly.

I apologize if my report isn't detailed enough as I am new to github.

[[email protected] ParPar]# node-gyp rebuild
gyp info it worked if it ends with ok
gyp info using [email protected]
gyp info using [email protected] | linux | x64
gyp info spawn /usr/bin/python2
gyp info spawn args [ '/usr/lib/node_modules/node-gyp/gyp/gyp_main.py',
gyp info spawn args 'binding.gyp',
gyp info spawn args '-f',
gyp info spawn args 'make',
gyp info spawn args '-I',
gyp info spawn args '/root/File/ParPar/build/config.gypi',
gyp info spawn args '-I',
gyp info spawn args '/usr/lib/node_modules/node-gyp/addon.gypi',
gyp info spawn args '-I',
gyp info spawn args '/root/.node-gyp/6.12.3/include/node/common.gypi',
gyp info spawn args '-Dlibrary=shared_library',
gyp info spawn args '-Dvisibility=default',
gyp info spawn args '-Dnode_root_dir=/root/.node-gyp/6.12.3',
gyp info spawn args '-Dnode_gyp_dir=/usr/lib/node_modules/node-gyp',
gyp info spawn args '-Dnode_lib_file=/root/.node-gyp/6.12.3/<(target_arch)/node.lib',
gyp info spawn args '-Dmodule_root_dir=/root/File/ParPar',
gyp info spawn args '-Dnode_engine=v8',
gyp info spawn args '--depth=.',
gyp info spawn args '--no-parallel',
gyp info spawn args '--generator-output',
gyp info spawn args 'build',
gyp info spawn args '-Goutput_dir=.' ]
gyp info spawn make
gyp info spawn args [ 'BUILDTYPE=Release', '-C', 'build' ]
make: Entering directory `/root/File/ParPar/build'
  CC(target) Release/obj.target/gf-complete/gf-complete/gf.o
  CC(target) Release/obj.target/gf-complete/gf-complete/gf_w16.o
../gf-complete/gf_w16.c: In function ‘gf_w16_split_init’:
../gf-complete/gf_w16.c:390:5: error: ‘for’ loop initial declarations are only allowed in C99 mode
 for(int i=0; i<16; i++) {
 ^
../gf-complete/gf_w16.c:390:5: note: use option -std=c99 or -std=gnu99 to compile your code
make: *** [Release/obj.target/gf-complete/gf-complete/gf_w16.o] Error 1
make: Leaving directory `/root/File/ParPar/build'
gyp ERR! build error
gyp ERR! stack Error: make failed with exit code: 2
gyp ERR! stack at ChildProcess.onExit (/usr/lib/node_modules/node-gyp/lib/build.js:258:23)
gyp ERR! stack at emitTwo (events.js:106:13)
gyp ERR! stack at ChildProcess.emit (events.js:191:7)
gyp ERR! stack at Process.ChildProcess._handle.onexit (internal/child_process.js:219:12)
gyp ERR! System Linux 3.10.0-514.10.2.el7.x86_64
gyp ERR! command "/usr/bin/node" "/usr/bin/node-gyp" "rebuild"
gyp ERR! cwd /root/File/ParPar
gyp ERR! node -v v6.12.3
gyp ERR! node-gyp -v v3.6.2
gyp ERR! not ok

Error running parpar on MacOS

Hello there,

I'm on Catalina, by the way. Every time I run parpar, I get a malloc error.

file.iso is a 2.1 GB Linux Mint ISO.
Command: parpar -s 1M -r 64 -o my_recovery.par2 file.iso

Result:

Multiply method used: XOR JIT (128 bit), 1 thread
Generating 64 MiB recovery data (64 slices) from 2061.98 MiB of data
ParPar(20553,0x11b65ddc0) malloc: Incorrect checksum for freed object 0x7faa02557f98: probably modified after being freed.
Corrupt value: 0x39b70000b4cb
ParPar(20553,0x11b65ddc0) malloc: *** set a breakpoint in malloc_error_break to debug
zsh: abort      parpar -s 1M -r 64 -o my_recovery.par2

Command: parpar -s 10M -r 500 -o my_recovery.par2 file.iso

Result:

Multiply method used: XOR JIT (128 bit), 1 thread
Generating 5000 MiB recovery data (500 slices) from 2061.98 MiB of data
ParPar(20785,0x117b0adc0) malloc: Incorrect checksum for freed object 0x7fa3c8452dd8: probably modified after being freed.
Corrupt value: 0xc3e900001e67
ParPar(20785,0x117b0adc0) malloc: *** set a breakpoint in malloc_error_break to debug
zsh: abort      parpar -s 10M -r 500 -o my_recovery.par2

It keeps aborting, giving the same "malloc: Incorrect checksum for freed object" error. Of the resulting par2 files, my_recovery.par2 is 0 bytes; the others claim to have "all files correct" when I open them, but nothing shows up to check. Macpar shows empty when I open any of the par files, not showing file.iso.

I'm on Catalina, parpar has been installed with npm.

Cannot repair with par2 created by parpar

I received this NZB from a user. It has something I have never seen before: Multipar and par2cmdline repair it, but after repair the files are still damaged.
TestParPar.zip
Also NZBGet says PAR Failure.

I purposefully modified this NZB to have some missing articles, because on my usenet server the download is complete and verifies correctly as being 100% complete.
The user has a number of NZBs where repair fails like this, all created using ParPar (it could of course still be a coincidence!).

how to use parpar to create multiple par2 files?

Short (NOOB) question: how should I use parpar to create multiple par2 files?

From the help:

Examples

parpar -s 1M -r 64 -o my_recovery.par2 file1 file2
Generate 64MB of PAR2 recovery files from file1 and file2, named "my_recovery"

So I tried that:

sander@Stream-13:~/git/ParPar$ bin/parpar.js  -s 1M -r 64 -o my_recovery.par2 million-50M.bin 
Method used: XOR JIT (128 bit), 2 threads
Calculating: 100.00%
PAR2 created. Time taken: 1.541 second(s)

but that creates:

-rw-r--r--  1 sander sander     1424 nov  6 21:04 my_recovery.par2
-rw-r--r--  1 sander sander 67121180 nov  6 21:04 my_recovery.vol00-64.par2

With the classic par2 and par2 create -r10 -n7 my_par2_files million-50M.bin, I get

-rw-r--r--  1 sander sander    40408 nov  6 21:11 my_par2_files.par2
-rw-r--r--  1 sander sander   133292 nov  6 21:11 my_par2_files.vol000+02.par2
-rw-r--r--  1 sander sander   226176 nov  6 21:11 my_par2_files.vol002+04.par2
-rw-r--r--  1 sander sander   371636 nov  6 21:11 my_par2_files.vol006+08.par2
-rw-r--r--  1 sander sander   622248 nov  6 21:11 my_par2_files.vol014+16.par2
-rw-r--r--  1 sander sander  1083164 nov  6 21:11 my_par2_files.vol030+32.par2
-rw-r--r--  1 sander sander  1964688 nov  6 21:11 my_par2_files.vol062+64.par2
-rw-r--r--  1 sander sander  2227568 nov  6 21:11 my_par2_files.vol126+74.par2
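A hedged sketch of what splitting those 64 recovery slices would look like, assuming ParPar's `-d`/`--slice-dist pow2` option produces par2cmdline-style doubling volumes:

```shell
# Hypothetical invocation requesting the doubling distribution explicitly:
# parpar -s 1M -r 64 -d pow2 -o my_recovery.par2 million-50M.bin
#
# With a doubling (pow2) scheme, volume sizes go 1, 2, 4, ... slices
# until the requested slice count is exhausted:
total=64
size=1
volumes=0
while [ "$total" -gt 0 ]; do
    n=$(( size < total ? size : total ))   # last volume takes the remainder
    total=$(( total - n ))
    volumes=$(( volumes + 1 ))
    size=$(( size * 2 ))
done
echo "$volumes volumes"
```

Under this doubling scheme, 64 slices would be spread across 7 volumes (+1, +2, +4, +8, +16, +32 and a final +1), rather than one large file.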

Error using version 0.4.1 with ngPost

Hi, I just updated ParPar to the newest version, 0.4.1. I haven't changed any of the settings in my ngPost config file, but I'm getting a weird error when generating par2 files.

Command executed from ngPost configs:

Generating par2: /home/bin/parpar -s5M -r1n*0.6 -m2048M -p1l --progress stdout -q -o /tmp/xxxx/xxxx.par2 -R /tmp/xxxx

Error during par2 generation: 4
user@server:~$

Just like that, posting stops due to this error. Does it have to do with the arguments currently used?

Any way to get more speed?

I am currently using parpar -n -t30 -r10% -s1M -dpow2 as the parameters to ParPar. I am reading files from a RAID10 SSD array and writing to that same array.

CPU has 32 cores, using 30 threads in ParPar to leave some CPU cycles for other stuff.

$cat /proc/cpuinfo

flags           : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 cx16 xtpr pdcm pcid dca sse4_1 sse4_2 x2apic popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm cpuid_fault epb pti intel_ppin ssbd ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid fsgsbase smep erms xsaveopt dtherm ida arat pln pts md_clear flush_l1d
vmx flags       : vnmi preemption_timer posted_intr invvpid ept_x_only ept_1gb flexpriority apicv tsc_offset vtpr mtf vapic ept vpid unrestricted_guest vapic_reg vid ple
bugs            : cpu_meltdown spectre_v1 spectre_v2 spec_store_bypass l1tf mds swapgs itlb_multihit
Multiply method used: Xor-Jit (SSE2), 30 threads

Is there anything I can enable/disable to get more speed out of ParPar in this scenario?

utf-8 support

Non-ascii characters (like öäå) in file names are saved garbled and then reported as not found when verifying.

Recovery not possible from (large) generated par2 files

After generating a large par2 file (from 3 large input files) I can't seem to recover data.

Par2 file generation with ParPar:

$ parpar.js -s1000 -r$PARRATIO% -m 8G -F1 -o pardata.par2 data*
Calculating parity information...
Multiply method used: Shuffle2x (AVX2), 12 threads
Generating 630.16 GiB recovery data (100 slices) from 6297.68 GiB of data
Calculating: 100.00%
PAR2 created. Time taken: 39234.39 second(s)

damage one of the input files:

dd if=/dev/zero of=data02 conv=notrunc bs=1M count=300k

Recovery with par2cmdline:

$ par2cmdline/par2 v pardata.par2
Loading "pardata.par2".
Loaded 8 new packets
Loading "pardata.vol000+100.par2".
Loaded 100 new packets including 100 recovery blocks
Loading "pardata.par2".
No new packets found

There are 3 recoverable files and 0 other files.
The block size used was 6766304348 bytes.
There are a total of 1000 data blocks.
The total size of the data files is 6762085150752 bytes. 

Verifying source files:

Opening: "data01"
Opening: "data00"
Target: "data01" - found.
Target: "data00" - found.
Opening: "data02"
Target: "data02" - damaged. Found 216 of 264 data blocks.

Scanning extra files:

Repair is required.
1 file(s) exist but are damaged.
2 file(s) are ok.
You have 952 out of 1000 data blocks available.
You have 100 recovery blocks available.
Repair is possible.
You have an excess of 52 recovery blocks.
48 recovery blocks will be used to repair.
Command exited with non-zero status 1

Recovery with par2tbb:

$ par2tbb/par2 v pardata.par2 
par2cmdline version 0.4, Copyright (C) 2003 Peter Brian Clements.
Modifications for concurrent processing, Unicode support, and hierarchial
directory support are Copyright (c) 2007-2009 Vincent Tan.
Concurrent processing utilises Intel Thread Building Blocks 2.0,
Copyright (c) 2007-2008 Intel Corp.
Executing using the 64-bit x86 (AMD64) instruction set.

par2cmdline comes with ABSOLUTELY NO WARRANTY.

This is free software, and you are welcome to redistribute it and/or modify
it under the terms of the GNU General Public License as published by the
Free Software Foundation; either version 2 of the License, or (at your
option) any later version. See COPYING for details.

Processing verifications and repairs concurrently.
Loading "pardata.par2".
Loaded 8 new packets

There are 3 recoverable files and 0 other files.
The block size used was 6766304348 bytes.
There are a total of 1000 data blocks.
The total size of the data files is 6762085150752 bytes.

Verifying source files:

Could not read 13532608696 bytes from data02 at offset 0
Could not read 13532608696 bytes from data00 at offset 0
Could not read 13532608696 bytes from data01 at offset 0

Recovery with par2j:

$ wine multipar/par2j64.exe v pardata.par2 
fixme:heap:HeapSetInformation 0x3c4000 0 0x22fe10 4
Parchive 2.0 client version 1.3.1.1 by Yutaka Sawada

fixme:file:GetLongPathNameW UNC pathname L"\\\\?\\D:\\pardata.par2"
Base Directory  : "D:\"
Recovery File   : "D:\pardata.par2"
CPU thread      : 12 / 12
CPU cache       : 768 KB per set
CPU extra       : x64 SSSE3 CLMUL AVX2
Memory usage    : Auto (115751 MB available)

PAR File list :
         Size :  Filename
        20876 : "pardata.par2"
 676630566276 : "pardata.vol000+100.par2"

PAR File total size     : 676630587152
PAR File possible count : 2

valid file is not found

Are there some limits for par2 files in terms of file sizes? The same procedure works for smaller (e.g. 5 GB) files.

[Request] Make the program check itself.

I wish you could add automatic checksum verification to the program (similar to the one implemented in par2j), to make sure the executable is not damaged, by bit rot for example.

unknown option -S, --auto-slice-size

I'm struggling with scripting parpar to use -S ... no matter what options I pass, parpar seems to balk at either of these options.

I have a friend who built his install manually (Debian), and he claims this option works; however, my installation (Arch) fails to recognize the option.

Configuration:

  • parpar-bin 0.3.2 (AUR)
  • Node v17.7.1

According to the GitHub help file, -S is an alias for --max-input-slices=32768. However, when I give an initial -s 400k and --max-input-slices=32768 as options (expecting parpar to adjust the slice size for me), parpar still fails with "too many input slices".

My input sizes vary, so I'd really like to get --auto-slice-size working so I don't have to do the math in bash. Also, I don't have enough experience to know what a good number of slices to start with is. I've read all of your closed issues and there are some good discussions; however, I wasn't convinced that dividing my input size (in bytes) by 25000 to come up with an initial slice size (in bytes) would cover all foreseeable cases where I would call the script. I guess anything over 25000 bytes would be okay, but then you run into performance issues again: smaller (~100 MiB) files don't really need that many slices, and larger files would incur too much of a performance hit.
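A hedged bash sketch of the math in question. The 32768 ceiling comes from the "too many input slices" limit mentioned above; the 400 KiB floor and the rounding are assumptions for illustration, not ParPar's actual algorithm:

```shell
# Pick a slice size that keeps the slice count under a ceiling, rounded up
# to a 4-byte multiple (PAR2 slice sizes must be multiples of 4), with a
# floor so small inputs don't get absurdly tiny slices.
input_bytes=$((50 * 1024 * 1024 * 1024))   # example: 50 GiB of input
max_slices=32768
floor=$((400 * 1024))                      # assumed floor: don't go below 400 KiB

slice=$(( (input_bytes + max_slices - 1) / max_slices ))  # ceiling division
slice=$(( (slice + 3) / 4 * 4 ))                          # round up to 4 bytes
if [ "$slice" -lt "$floor" ]; then slice=$floor; fi
echo "$slice"
```

The result could then be passed as `-s ${slice}b` from the script.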

Piping from stdin

Hi! Is it possible to pipe from stdin to create par files? (Say, backing up a drive and piping it from dd)

Cannot reset recovery buffers whilst processing

Windows binary v0.3.1 running on a Xeon W3690
Running ParPar on a giant archive (around 200GB with multiple RAR volumes) with slices set to 7 MBytes

Parity file not generated: Multiply method used: XOR JIT (128 bit), 12 threads
Generating 19.21 GiB recovery data (2810 slices) from 192.08 GiB of data
[eval]:1154
		if(this._processStarted) throw new Error('Cannot reset recovery buffers whilst processing');
		                         ^
Error: Cannot reset recovery buffers whilst processing
    at PAR2Chunked.bufferedClear ([eval]:1154:34)
    at PAR2Chunked.setChunkSize ([eval]:1858:8)
    at PAR2Chunked.setRecoverySlices ([eval]:1792:9)
    at PAR2Gen.freeMemory ([eval]:2685:18)
    at [eval]:2991:9
    at [eval]:4192:16
    at next ([eval]:8445:25)
    at [eval]:2949:19

ParPar removes empty nested directories

If i par the following directory:
parpar -s1400K -m1024M --threads 2 --slice-size-multiple=700K --max-input-slices=2000 -r1n -R -o "[EAC][170215] Porter Robinson & Madeon - SHELTER" "[EAC][170215] Porter Robinson & Madeon - SHELTER"

[EAC][170215] Porter Robinson & Madeon - SHELTER
├── [EAC][170215] Porter Robinson & Madeon - SHELTER
│   └── BDROM
│       └── PORTER_ROBINSON_MADEON_SHELTER
│           ├── BDMV
│           └── CERTIFICATE
├── [EAC][170215] Porter Robinson & Madeon - SHELTER.par2
├── [EAC][170215] Porter Robinson & Madeon - SHELTER.vol000+001.par2
├── [EAC][170215] Porter Robinson & Madeon - SHELTER.vol001+002.par2
├── [EAC][170215] Porter Robinson & Madeon - SHELTER.vol003+004.par2
├── [EAC][170215] Porter Robinson & Madeon - SHELTER.vol007+008.par2
├── [EAC][170215] Porter Robinson & Madeon - SHELTER.vol015+016.par2
├── [EAC][170215] Porter Robinson & Madeon - SHELTER.vol031+032.par2
└── [EAC][170215] Porter Robinson & Madeon - SHELTER.vol063+062.par2

then uploaded it with nyuu

nyuu.exe -C config-sample.json -o "[EAC][170215] Porter Robinson & Madeon - SHELTER.nzb" "[EAC][170215] Porter Robinson & Madeon - SHELTER"
[2023-08-14 00:46:30.200][INFO] Uploading 2815 article(s) from 32 file(s) totalling 1905.05 MiB
[2023-08-14 00:46:30.201][INFO] Reading file 00000.clpi...
[2023-08-14 00:46:30.205][INFO] Reading file 00001.clpi...
[2023-08-14 00:46:30.206][INFO] Reading file 00002.clpi...
[2023-08-14 00:46:30.206][INFO] Reading file 00003.clpi...
[2023-08-14 00:46:30.207][INFO] Reading file index.bdmv...
[2023-08-14 00:46:30.207][INFO] Reading file MovieObject.bdmv...
[2023-08-14 00:46:30.207][INFO] Reading file 00000.mpls...
[2023-08-14 00:46:30.207][INFO] Reading file 00001.mpls...
[2023-08-14 00:46:30.209][INFO] Reading file 00002.mpls...
[2023-08-14 00:46:30.209][INFO] Reading file 00000.clpi...
[2023-08-14 00:46:30.209][INFO] Reading file 00001.clpi...
[2023-08-14 00:46:30.210][INFO] Reading file 00002.clpi...
[2023-08-14 00:46:30.210][INFO] Reading file 00003.clpi...
[2023-08-14 00:46:31.238][INFO] Reading file index.bdmv...
[2023-08-14 00:46:31.239][INFO] Reading file MovieObject.bdmv...
[2023-08-14 00:46:31.249][INFO] Reading file 00000.mpls...
[2023-08-14 00:46:31.257][INFO] Reading file 00001.mpls...
[2023-08-14 00:46:31.261][INFO] Reading file 00002.mpls...
[2023-08-14 00:46:31.264][INFO] Reading file 00000.m2ts...
[2023-08-14 00:46:38.813][INFO] Reading file 00001.m2ts...
[2023-08-14 00:46:38.826][INFO] Reading file 00002.m2ts...
[2023-08-14 00:49:24.860][INFO] Reading file 00003.m2ts...
[2023-08-14 00:49:25.153][INFO] Reading file id.bdmv...
[2023-08-14 00:49:25.172][INFO] Reading file id.bdmv...
[2023-08-14 00:49:25.221][INFO] Reading file [EAC][170215] Porter Robinson & Madeon - SHELTER.par2...
[2023-08-14 00:49:25.334][INFO] Reading file [EAC][170215] Porter Robinson & Madeon - SHELTER.vol000+001.par2...
[2023-08-14 00:49:25.457][INFO] Reading file [EAC][170215] Porter Robinson & Madeon - SHELTER.vol001+002.par2...
[2023-08-14 00:49:25.848][INFO] Reading file [EAC][170215] Porter Robinson & Madeon - SHELTER.vol003+004.par2...
[2023-08-14 00:49:26.383][INFO] Reading file [EAC][170215] Porter Robinson & Madeon - SHELTER.vol007+008.par2...
[2023-08-14 00:49:27.277][INFO] Reading file [EAC][170215] Porter Robinson & Madeon - SHELTER.vol015+016.par2...
[2023-08-14 00:49:29.533][INFO] Reading file [EAC][170215] Porter Robinson & Madeon - SHELTER.vol031+032.par2...
[2023-08-14 00:49:33.743][INFO] Reading file [EAC][170215] Porter Robinson & Madeon - SHELTER.vol063+062.par2...
[2023-08-14 00:49:42.235][INFO] All file(s) read...
[2023-08-14 00:49:51.776][INFO] Finished uploading

and then download it with Sabnzbd

The resulting reconstruction looks like this

[EAC][170215] Porter Robinson & Madeon - SHELTER
├── BDMV
└── CERTIFICATE

I'm trying to avoid raring large files like BDMVs, which is why I'm looking into par reconstruction. The directory structure is important and needs to be kept 1:1 for it to work and to maintain cross-seed ability. The above probably becomes a bigger problem for multi-disc releases with nested folders, although I haven't done any extensive testing on it.

Is --progress stdout working?

Hi,
It doesn't seem to work in my environment... Am I doing something wrong?

~/Downloads/testNgPost$ ~/apps/ParPar/bin/parpar.js --version
0.3.1
~/Downloads/testNgPost$ ~/apps/ParPar/bin/parpar.js -s1M -r8% -m1024M --progress stdout -o test.par2 jdk-8u171-linux-x64.tar.gz > out.txt 2> err.txt
~/Downloads/testNgPost$ wc -l out.txt err.txt 
  0 out.txt
  4 err.txt
  4 total

In fact --progress none behaves the same way :\

slice-dist not working

--slice-dist equal and --slice-dist uniform yield one big par2 file, and don't split it.
--slice-dist pow2 (the default) can yield very big files once it's past the +32 file. Is it possible to generate par2 volume sizes like par2cmdline does, which don't surpass the largest rar file size? For example:

[screenshot]

help needed with CLI command to generate par2create-like files

I would like to switch from par2 to parpar, but I don't know which CLI command to use (so this is related to #2).

I always use
par2 create -r10 -n7 mypar2files *rar
with the output at the bottom of this post.

I tried a few parpar commands, but couldn't generate the same result.

$ parpar -r10%  -s 100 -o somerandombin-500MB *rar
Cannot determine a slice size to satisfy required number of slices (100): using a slice size of 6553596 bytes would produce 101 slice(s), whilst a size of 6553600 bytes would produce 81 slice(s)
Enter `parpar --help` for usage information

and

$ parpar -r10  -s 100M -o somerandombin-500MB *rar
Multiply method used: XOR JIT (128 bit), 4 threads
Generating 1000 MiB recovery data (10 slices) from 500 MiB of data
Calculating: 100.00%
PAR2 created. Time taken: 18.204 second(s)

$ ll *par2
-rw-rw-r-- 1 sander sander       5820 jan 12 13:22 somerandombin-500MB.par2
-rw-rw-r-- 1 sander sander 1048593908 jan 12 13:22 somerandombin-500MB.vol00+10.par2

Tips appreciated
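One hedged approach (a sketch, assuming `-d pow2` gives the doubling volume layout): rather than asking ParPar to derive a slice size, pass par2cmdline's own parameters from the output below explicitly.

```shell
# par2cmdline chose a 264792-byte block size and 198 recovery blocks (see
# the output below). PAR2 slice sizes must be multiples of 4; verify that:
block_size=264792
echo $(( block_size % 4 ))
# then (hypothetical invocation):
# parpar -s ${block_size}b -r 198 -d pow2 -o mypar2files *.rar
```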

$ par2 create -r10 -n7 mypar2files   *rar

Block size: 264792
Source file count: 21
Source block count: 1981
Redundancy: 10%
Recovery block count: 198
Recovery file count: 7

Opening: somerandombin-500MB.part01.rar
Opening: somerandombin-500MB.part02.rar
Opening: somerandombin-500MB.part03.rar
Opening: somerandombin-500MB.part04.rar
Opening: somerandombin-500MB.part05.rar
Opening: somerandombin-500MB.part06.rar
Opening: somerandombin-500MB.part07.rar
Opening: somerandombin-500MB.part08.rar
Opening: somerandombin-500MB.part09.rar
Opening: somerandombin-500MB.part10.rar
Opening: somerandombin-500MB.part11.rar
Opening: somerandombin-500MB.part12.rar
Opening: somerandombin-500MB.part13.rar
Opening: somerandombin-500MB.part14.rar
Opening: somerandombin-500MB.part15.rar
Opening: somerandombin-500MB.part16.rar
Opening: somerandombin-500MB.part17.rar
Opening: somerandombin-500MB.part18.rar
Opening: somerandombin-500MB.part19.rar
Opening: somerandombin-500MB.part20.rar
Opening: somerandombin-500MB.part21.rar
Computing Reed Solomon matrix.
Constructing: done.
Wrote 16776936 bytes to disk
Wrote 16776936 bytes to disk
Wrote 16776936 bytes to disk
Wrote 2098008 bytes to disk
Writing recovery packets
Writing verification packets
Done
$ ll *par2
-rw-rw-r-- 1 sander sander    45008 jan 12 13:16 mypar2files.par2
-rw-rw-r-- 1 sander sander   619632 jan 12 13:16 mypar2files.vol000+02.par2
-rw-rw-r-- 1 sander sander  1194256 jan 12 13:16 mypar2files.vol002+04.par2
-rw-rw-r-- 1 sander sander  2298600 jan 12 13:16 mypar2files.vol006+08.par2
-rw-rw-r-- 1 sander sander  4462384 jan 12 13:16 mypar2files.vol014+16.par2
-rw-rw-r-- 1 sander sander  8745048 jan 12 13:16 mypar2files.vol030+32.par2
-rw-rw-r-- 1 sander sander 17265472 jan 12 13:16 mypar2files.vol062+64.par2
-rw-rw-r-- 1 sander sander 19384352 jan 12 13:16 mypar2files.vol126+72.par2

Crash on (I'm guessing) bad parameters for slice size / recovery slice count

Ran into this on a file <64M:

c:\apps\parpar\parpar.cmd --slice-size 40000 --recovery-slices 10000 --memory 64M --index --slice-dist pow2 --alt-naming-scheme -o E:\file\some_file.par2 E:\file\some_file.dat
Method used: Shuffle (128 bit), 6 threads
(node) warning: Recursive process.nextTick detected. This will break in the next version of node. Please use setImmediate for recursive deferral.
(476 more lines of the above message)

RangeError: Maximum call stack size exceeded

None of the pars have any recovery blocks in them when viewed in Multipar.

-r does not work with %

Hi,

In help.txt I read:
-r15.7%: generates 15.7% recovery

That does not work. I get:

Values for out and input-slices are required
Enter parpar --help for usage information

%PARPAR% -s 1M -S -r15.7% -o %dir%\filename.par2 -R %dir%\

For example

%PARPAR% -s 1M -S -r100 -o %dir%\filename.par2 -R %dir%\

Works.

What am I doing wrong?

Unexpected size alert

Hi,

I'm backing up some personal videos that are in folders of 25 GB each (to burn to BD-R).

When I point parpar at the 8 disc folders, I get an unexpected error.

D:\discs>d:\temp\parpar -s 6M -r 23.1G -R -o c:\temp\my_recovery.par2 disc1 disc2 disc3 disc4 disc5 disc6 disc7 disc8
Too many input slices: 31230 exceeds maximum of 32768. Please consider increasing the slice size, or reducing the amount of input data
Enter parpar --help for usage information

31230 is < 32768, so why the error?

ParPar randomly gets stuck at 100%

[screenshot]
ParPar was stuck here for 2 days until I checked. This issue happens randomly. I encountered this multiple times on multiple different machines, but I can't reproduce it other than just running ParPar on multiple machines and waiting for it to happen. Killing (CTRL+C) parpar and then re-running always fixes the issue.

Here are my commands:

parpar -s1400k --slice-size-multiple=700K --max-input-slices=8000 -r10% -R -o base_dir base_dir
find "$dir" -name "*.mkv" -print0 | parallel -0 -j 8 parpar -s700K --threads 2 --slice-size-multiple=700K --max-input-slices=1000 -r1n*1.2 -R -o {.} {}

I know this isn't much to go off but I can't seem to find anything else that might be causing the issue.

TypeError: binding.FSReqWrap is not a constructor

Using ParPar installed from git source using node v11.1.0 gives the following error after calculating the par2 recovery data.

Multiply method used: XOR JIT (128 bit), 8 threads
Generating 227 MiB recovery data (227 slices) from 2265.38 MiB of data
/home/rcguy/usenet/bin/ParPar/lib/writev.js:14
    var req = new binding.FSReqWrap();
              ^

TypeError: binding.FSReqWrap is not a constructor
    at writev (/home/rcguy/usenet/bin/ParPar/lib/writev.js:14:15)
    at /home/rcguy/usenet/bin/ParPar/lib/par2gen.js:763:7
    at /home/rcguy/usenet/bin/ParPar/lib/par2gen.js:743:13
    at PAR2Gen.<anonymous> (/home/rcguy/usenet/bin/ParPar/lib/par2gen.js:744:7)
    at /home/rcguy/usenet/bin/ParPar/node_modules/async/dist/async.js:1135:9
    at replenish (/home/rcguy/usenet/bin/ParPar/node_modules/async/dist/async.js:1011:17)
    at /home/rcguy/usenet/bin/ParPar/node_modules/async/dist/async.js:1016:9
    at _asyncMap (/home/rcguy/usenet/bin/ParPar/node_modules/async/dist/async.js:1133:5)
    at /home/rcguy/usenet/bin/ParPar/node_modules/async/dist/async.js:1219:16
    at timeLimit (/home/rcguy/usenet/bin/ParPar/node_modules/async/dist/async.js:4997:5)
