Comments (9)
You could do all the above things already with ImageMagick's compare tool:
http://www.imagemagick.org/script/compare.php
And check the different metrics here:
http://www.imagemagick.org/script/command-line-options.php#metric
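The metric computations themselves are straightforward. As a rough illustration only (not ImageMagick's actual implementation), PSNR over two equal-length sequences of hypothetical flat 8-bit pixel data looks like this:

```python
import math

def psnr(a, b, maxval=255):
    # Peak signal-to-noise ratio between two equal-length pixel sequences.
    # Identical images have zero mean squared error, hence infinite PSNR.
    mse = sum((p - q) ** 2 for p, q in zip(a, b)) / len(a)
    return float('inf') if mse == 0 else 10 * math.log10(maxval ** 2 / mse)

print(psnr([10, 20, 30], [10, 20, 30]))  # inf (identical)
```

Higher PSNR means the decoded image is closer to the original; identical images give infinity, which is why PSNR is mostly useful for comparing lossy results.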
I know ImageMagick's JPEG 2000 support used to be problematic because of its dependency on the buggy Jasper library, but they're now using OpenJPEG instead, so I expect this has improved by now (I haven't tried it, though).
In any case this functionality requires decoding of the image data, which is completely out of the scope of jpylyzer. So I don't see this happening.
from jpylyzer.
Thanks. As I understand it, jpylyzer is intended to help JPEG 2000 consumers validate vendor codecs.
If so, how do consumers currently validate pixel data?
I understand this is out of scope for jpylyzer, but it would be useful to have a codec-agnostic tool that compares image quality across codecs.
The problem is that in order to do any analysis of pixel-level data you'll need to decompress the image data, and for that you need a codec. So there's simply no codec-agnostic way to do this (unless you were to develop a brand-new codec specifically for this purpose, which would be an enormous task).
So for things like PSNR and losslessness I would use existing tools like ImageMagick / GraphicsMagick, since this is a job these tools do well (although yes, there'll be a codec dependency). This is something you can use on top of jpylyzer, see e.g. the link below for how we did this for a valuable collection of old TIFF images that we migrated to JP2:
http://wiki.opf-labs.org/pages/viewpage.action?pageId=36012209
However some recent work by the British Library suggests that not all codecs decode JPEG 2000 images in exactly the same way, which is something that may influence results. See:
http://www.scape-project.eu/wp-content/uploads/2013/11/iPres2013_Palmer_JPEG2000Codecs.pdf
Note that their analysis only covers lossy compression; in our Metamorfoze work (which uses lossless compression) I've never encountered any of these issues. There we simply compare the pixel values before and after the migration and count the number of pixels that aren't identical (this must be 0 for lossless compression, and that is exactly what we got in all cases).
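That pixel-counting check is trivial once both images are decoded to raw samples. A minimal sketch in plain Python, with flat pixel lists standing in for the decoded image data (illustrative only, not the tooling we actually used):

```python
def count_differing_pixels(before, after):
    # Number of sample positions that changed; must be 0 for a
    # truly lossless compress-decompress cycle.
    assert len(before) == len(after), "images must have the same dimensions"
    return sum(1 for p, q in zip(before, after) if p != q)

src = [12, 40, 200, 7]
roundtrip = [12, 40, 201, 7]   # one sample off by one
print(count_differing_pixels(src, roundtrip))  # 1
```

Any nonzero count means the round trip was not lossless, regardless of how small the per-pixel differences are.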
Thanks. Very interesting. I see how complex the situation is.
For lossless, I have the following concern: encoding with one codec and decoding with another may give a result that differs from the original. Have you had a chance to investigate this scenario?
I have been working a fair bit with the OpenJPEG code, and here is an example that is concerning. For the wavelet transform, the standard mandates how a codec should handle pixels at the image boundary: it specifies symmetric extension (mirroring). OpenJPEG, however, uses clamping: pixels outside the boundary are clamped to the boundary values.
So, if one codec uses mirroring and another uses clamping, then I think one would get a lossy encode-decode round trip for certain images.
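To make the difference concrete, here is a sketch of the two boundary-extension policies on a 1-D row of samples (illustrative only; not OpenJPEG's actual code):

```python
def mirror_ext(x, i):
    # Whole-sample symmetric extension (mirroring), as described above:
    # x[-i] maps to x[i], and x[n-1+i] maps to x[n-1-i].
    n = len(x)
    if i < 0:
        return x[-i]
    if i >= n:
        return x[2 * (n - 1) - i]
    return x[i]

def clamp_ext(x, i):
    # Clamping: out-of-bounds indices are pinned to the boundary value.
    n = len(x)
    return x[max(0, min(i, n - 1))]

row = [10, 20, 30]
print([mirror_ext(row, i) for i in range(-2, 5)])  # [30, 20, 10, 20, 30, 20, 10]
print([clamp_ext(row, i) for i in range(-2, 5)])   # [10, 10, 10, 20, 30, 30, 30]
```

The two policies feed different extended signals into the same wavelet filter near the edges, so an encoder using one policy and a decoder using the other can disagree exactly where it matters for round-trip losslessness.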
Yes, we did this. For the Metamorfoze migration we used the Aware codec to compress the source TIFFs to lossless JP2. Then we converted those JP2s back to temporary TIFFs using Kakadu's kdu_expand tool. Finally we did a pixel-level comparison between each source TIFF and its corresponding temporary TIFF (the latter being the result of a full compress-decompress cycle). We did this with GraphicsMagick, using the following command line (off the top of my head):
gm compare -metric MAE source.tif fromjp2.tif
Result if pixels are identical:
Image Difference (MeanAbsoluteError):
           Normalized    Absolute
          ============  ==========
     Red:  0.0000000000        0.0
   Green:  0.0000000000        0.0
    Blue:  0.0000000000        0.0
   Total:  0.0000000000        0.0
Metrics explained here:
http://www.imagemagick.org/script/command-line-options.php#metric
Having another look at this, PAE might be a better metric. We originally used ImageMagick with AE, but then switched to GraphicsMagick, which doesn't support that metric. Just do a few tests yourself to check what works best in your case.
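For reference, the two metrics summarize the error differently: MAE averages the per-sample error over the whole image, while PAE reports the single worst deviation. A rough sketch over flat pixel lists of hypothetical 8-bit data (not GraphicsMagick's implementation):

```python
def mae(a, b, maxval=255):
    # Mean absolute error, normalized to [0, 1].
    return sum(abs(p - q) for p, q in zip(a, b)) / (len(a) * maxval)

def pae(a, b, maxval=255):
    # Peak absolute error: the largest single-sample deviation, normalized.
    return max(abs(p - q) for p, q in zip(a, b)) / maxval

a = [0, 0, 0, 0]
b = [0, 0, 0, 255]                # one badly wrong pixel
print(mae(a, b))  # 0.25  (error diluted across all samples)
print(pae(a, b))  # 1.0   (worst-case pixel stands out)
```

This is why PAE is attractive for lossless verification: a single corrupted pixel can be diluted to near zero in MAE, but it always shows up at full strength in PAE.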
Thanks. I will try this. Did you investigate the open source codecs, such as OpenJPEG and Jasper?
It looks like Jasper is no longer developed, but OpenJPEG is set to become a reference implementation of the standard and has seen a lot of activity lately. The big issue with OpenJPEG, of course, is that it is very slow.
Not much experience with those codecs. Jasper isn't actively developed and has various issues, so I would really recommend staying away from it. The OpenJPEG situation seems to have improved a lot over the past few years; it's just that we haven't really used it for production work.
Great, thanks so much for your help.
By the way, I am actually developing a new JPEG 2000 codec. It will run on the GPU, and it's going to be very fast. It will also be open source. It should be in alpha by spring 2015.