How to get "mean" or "average" from image statistics?

Questions and postings pertaining to the usage of ImageMagick regardless of the interface. This includes the command-line utilities, as well as the C and C++ APIs. Usage questions are like "How do I use ImageMagick to create drop shadows?".
konstantin
Posts: 50
Joined: 2013-08-07T13:50:31-07:00


Post by konstantin »

When I type:

    identify -verbose 1.bmp

I can see:

Channel statistics:
    Pixels: 57600
    Gray:
      min: 0 (0)
      max: 255 (1)
      mean: 254.996 (0.999983)
      standard deviation: 1.06249 (0.00416663)
      kurtosis: 57595
      skewness: -239.994
      entropy: 0.000299591
  Colors: 2
  Histogram:
         1: (  0,  0,  0) #000000 gray(0)
     57599: (255,255,255) #FFFFFF gray(255)

So I can see the computed "mean": 254.996 (0.999983)

But when I type:

    identify -format "%[mean]" 1.bmp

I get:

    65533.9
How can I get the correct result for the mean and what is its exact meaning?
fmw42
Posts: 25562
Joined: 2007-07-02T17:14:51-07:00
Location: Sunnyvale, California, USA

Re: How to get "mean" or "average" from image statistics?

Post by fmw42 »

You need to convert %[mean] from the QuantumRange of your install down to the range 0 to 255. That is where the numbers in parentheses come in: they are always normalized to the range 0 to 1. So it is better to do:

    convert 1.bmp -format "%[fx:255*mean]" info:

%[mean] gives values from 0 to QuantumRange for your compile (65535 on a Q16 build)

%[fx:mean] gives values from 0 to 1
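As a sanity check, the three numbers the poster saw can be reproduced from the histogram alone. The sketch below assumes a Q16 build (QuantumRange = 65535), which is consistent with the reported 65533.9; it is not ImageMagick code, just the same arithmetic done by hand:

```python
# Recompute the mean from the histogram in the identify output above:
# 1 black pixel (value 0) and 57599 white pixels (value 255) out of 57600.
pixels = 57600
mean_255 = (1 * 0 + 57599 * 255) / pixels   # mean on the 8-bit scale
mean_norm = mean_255 / 255                  # normalized value (0..1), as shown in parentheses
# Assuming a Q16 build, QuantumRange is 65535, so %[mean] reports:
mean_q16 = mean_norm * 65535
print(round(mean_255, 3), round(mean_norm, 6), round(mean_q16, 1))
# prints: 254.996 0.999983 65533.9
```

So 65533.9 is not wrong, it is simply the same mean expressed on the QuantumRange scale of the build rather than on the 0-255 scale shown in the verbose output.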