I'm trying to get the quality value from a Bitmap which was loaded from a jpeg image. This is what I have so far:

    int GetQuality(Bitmap bitmap) {
        if (!bitmap.RawFormat.Equals(ImageFormat.Jpeg)) return -1;

        ImageCodecInfo jpegEncoder = ImageCodecInfo.GetImageEncoders().First(c => c.FormatID == ImageFormat.Jpeg.Guid);
        var jpegGuid = jpegEncoder.Clsid;
        var jpegParams = bitmap.GetEncoderParameterList(jpegGuid);
        EncoderParameter qualityParam = jpegParams?.Param[1]!;

        BindingFlags bindFlags = BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Static;
        FieldInfo? field = typeof(EncoderParameter).GetField("_parameterValue", bindFlags);
        object obj = field!.GetValue(qualityParam)!;
        IntPtr qualityParamValue = (IntPtr) obj;
        unsafe {
            // Dereference the pointer to read the stored quality byte.
            var quality = *(byte*) qualityParamValue; // quality is of type byte, but its value is 0
            return quality;
        }
    }
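
For context, the bitmap I'm inspecting was originally saved with the JPEG encoder at 50% quality, roughly like this (a sketch; the file paths are placeholders):

    using System.Drawing;
    using System.Drawing.Imaging;
    using System.Linq;

    // Save a bitmap as jpeg at 50% quality (the setup this question assumes).
    using var source = new Bitmap("input.png");   // placeholder input path
    ImageCodecInfo jpegEncoder = ImageCodecInfo.GetImageEncoders()
        .First(c => c.FormatID == ImageFormat.Jpeg.Guid);
    using var encoderParams = new EncoderParameters(1);
    encoderParams.Param[0] = new EncoderParameter(Encoder.Quality, 50L);
    source.Save("output.jpg", jpegEncoder, encoderParams);   // placeholder output path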

But the quality value comes back zero for a jpeg bitmap that was saved at 50% quality (EncoderParameter(Encoder.Quality, 50L), as sketched above). Either my reflection and unsafe code are wrong, or the quality value is not saved in the jpeg bitmap. I derived this code by looking at the EncoderParameter constructor, which looks like this:

    public EncoderParameter(Encoder encoder, byte value)
    {
        _parameterGuid = encoder.Guid;

        _parameterValueType = EncoderParameterValueType.ValueTypeByte;
        _numberOfValues = 1;
        _parameterValue = Marshal.AllocHGlobal(sizeof(byte));

        *(byte*)_parameterValue = value;
        GC.KeepAlive(this);
    }

where byte value is the quality (0-100). Does my unsafe code correctly do the inverse of *(byte*)_parameterValue = value;? If not, what am I doing wrong? If so, does that mean the quality is not actually stored in the compressed bitmap?
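
For comparison, the same read can be done without an unsafe block by using Marshal.ReadByte on the reflected pointer (a sketch, assuming the same qualityParamValue obtained above):

    using System;
    using System.Runtime.InteropServices;

    static int ReadQualityByte(IntPtr parameterValue)
    {
        // Reads the single byte stored at the unmanaged address, i.e. the managed
        // equivalent of *(byte*)parameterValue.
        return Marshal.ReadByte(parameterValue);
    }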

  • If you're lucky, you could parse the Exif data. Otherwise you may be able to fingerprint a jpeg file, deducing which application and which quality level was used. Or use a simpler approximation like pixelcount / filesize (see the sketch after these comments). But there's no simple answer to this question. FYI stackoverflow.com/questions/2024947/…
  • Why do you think you want this data?
  • A jpg file doesn't contain a quality value; the value determines the quantization table (QT) used during compression, but there is no standard QT set.
  • @shingo: Where does Visual Studio get the _parameterValue when I view the contents of qualityParam in its debugger?
  • @mjwills: I'm adding an "Image Info" page to an image viewer. One of the questions I'd like it to answer is: "What quality value was used when this jpeg image was saved?"
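
A minimal sketch of the "pixelcount / filesize" idea from the first comment, expressed as compressed bits per pixel (a heuristic that loosely tracks the quality setting, not a recovered quality value; the helper name is made up):

    using System.Drawing;
    using System.IO;

    static double CompressedBitsPerPixel(string jpegPath)
    {
        long sizeBytes = new FileInfo(jpegPath).Length;
        using var bmp = new Bitmap(jpegPath);
        // More bits per pixel generally means a higher quality setting was used,
        // but there is no exact mapping back to the original 0-100 value.
        return sizeBytes * 8.0 / ((double)bmp.Width * bmp.Height);
    }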
