Gamma correction, contrast stretching and blurring.
posted by dJsLiM on Saturday, September 24, 2005 ::
click for class reference material ::
Ok, so where were we... Oh yeah, examples of image processing filters. Right. So, let's start with a somewhat obscure filter called gamma correction. Gamma correction is an action taken to adjust the intensity value projected at a given pixel, so that it either becomes darker or brighter. Sounds like a pretty simple brightness control, eh? Well, it is, but with a slight twist.
If you have heard of the term gamma correction, chances are good that you heard it in relation to your monitor. The problem with monitors is that the light they project for a given brightness value is often perceived by humans to have a different brightness. The relationship between the input value r and the intensity s that you actually see turns out to follow a power law:
s = r^γ
You'll notice the gamma exponent in the equation, and this is where the term gamma in gamma correction comes from. Now, given this relationship, if we can figure out the value of gamma, we can find the inverse of the function, which will yield the "corrected" intensity value for each intensity value that the monitor would normally produce. The "corrected" intensity essentially compensates for the quirks of human perception or the electronics of the monitor.
The simplest way to carry this out in the absence of a photometer is to project an image that we know humans will perceive as having a certain brightness, and then, right next to it, project what the monitor produces when it's told to display that same brightness. With both images side by side, a human being can raise or lower the intensity of the monitor-produced image until the two match up. Given this mapping, the inverse function can be found from the power law. Now, the only thing left to figure out is how to project an image that we know for sure will be perceived by humans to have a certain brightness. Well, when we can't depend on the brain of the machine to figure things out, I guess we'll just have to do it the old fashioned way and let the human brain do the job. =)
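To make that concrete, here's a minimal sketch of the correction step, written in Python with NumPy (my choice of language, and the example gamma of 2.2 is just a typical assumed value, not something the matching experiment above hands you directly):

    import numpy as np

    def gamma_correct(image, gamma):
        # Undo the monitor's power law s = r^gamma by raising the
        # normalized intensities to the power 1/gamma.
        r = image.astype(np.float64) / 255.0          # assume 8-bit input
        s = np.power(r, 1.0 / gamma)
        return np.round(s * 255.0).astype(np.uint8)

    # A CRT-style gamma of about 2.2 is an assumed, typical value;
    # note how the darker values get pushed up the most.
    sample = np.array([[0, 64, 128, 255]], dtype=np.uint8)
    print(gamma_correct(sample, 2.2))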
As we have discussed before, the human eye is not a photometer, but it is very sensitive to differences in brightness. When two different intensities sit right next to each other in small enough patches, though, the eye blends them together and perceives something close to their average. By taking advantage of this fact we can project a pattern of alternating 0% and 100% intensity values to yield an image that is perceived as roughly 50% brightness. This is called dithering, and it is the technique used to produce the illusion of gradation on a monochrome monitor.
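Here's a tiny sketch of that idea (Python/NumPy again, and the pattern size is arbitrary): a checkerboard of pure black and pure white pixels whose average works out to about 50% brightness.

    import numpy as np

    def checkerboard(height, width):
        # Alternate 0 and 255 so neighbouring pixels average out to ~50%.
        rows = np.arange(height).reshape(-1, 1)
        cols = np.arange(width).reshape(1, -1)
        return np.where((rows + cols) % 2 == 0, 255, 0).astype(np.uint8)

    pattern = checkerboard(64, 64)
    print(pattern.mean())   # ~127.5, i.e. roughly 50% brightness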
Ok, so on with the next filter. The next filter has a rather well-known name, and it's quite likely that you're familiar with it. The filter is called contrast stretching. Often times, an image is acquired in such a way that the brightness values found in the image do not make full use of the available range. For example, although the camera is able to record brightness values ranging from, say, 0 to 255, the acquired image might only contain values bunched up in a narrow band somewhere in the middle. Contrast stretching remaps the intensity values so that they are spread back out over the full available range.
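The most direct fix is a straight linear stretch: map the smallest value that actually occurs in the image to 0 and the largest to 255. A rough sketch (the function name and 8-bit range are my own assumptions):

    import numpy as np

    def stretch_contrast(image):
        # Linearly remap [min, max] of the image onto the full [0, 255] range.
        lo, hi = float(image.min()), float(image.max())
        if hi == lo:
            return image.copy()        # flat image: nothing to stretch
        scaled = (image.astype(np.float64) - lo) / (hi - lo)
        return np.round(scaled * 255.0).astype(np.uint8)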
The most well-known algorithm used to carry this stretching out is histogram equalization. Histogram equalization entails plotting the normalized cumulative histogram, where the X-axis encodes the intensity values found in the original image and the Y-axis encodes the fraction of pixels whose intensity is at or below that value. Once you plot this graph you're basically going to end up with a lookup table, where the X-axis represents the old intensity value and the Y-axis represents the factor by which you multiply the maximum available intensity to get the new intensity value.
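In code, the whole thing boils down to building that cumulative lookup table and then indexing into it. A sketch for 8-bit grayscale images (assuming NumPy; a real implementation would also worry about color images and empty bins):

    import numpy as np

    def equalize_histogram(image):
        # Count how many pixels have each intensity 0..255.
        hist = np.bincount(image.ravel(), minlength=256)
        # Normalized cumulative histogram: fraction of pixels <= each value.
        cdf = np.cumsum(hist).astype(np.float64)
        cdf /= cdf[-1]
        # Use it as a lookup table: new value = fraction * max intensity.
        lookup = np.round(cdf * 255.0).astype(np.uint8)
        return lookup[image]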
The last filter we'll cover is called the mean filter, aka the blurring filter. Mean filtering is basically a technique used to filter out noise and smooth out a given image. What you do in mean filtering is pick a window over which to apply the filter, and modify the value at the center of that window so that it becomes the mean of all the values within the window. You then repeat this process for every pixel you'd like to have smoothed out. Sliding a window over the image like this is known as cross-correlation filtering, and the formula takes the following form:
G = H ⊗ F
In this equation, H is the "kernel" and F is the original image. The kernel is basically the aforementioned window with a multiplier in each slot; each pixel value that falls under a slot is multiplied by that slot's multiplier, and the results are summed together. You can take a look at an example of the kernel in action here. For a smoothing filter like this, the multipliers in the kernel always sum to 1, so the overall brightness of the image is preserved.
In a mean filter, the kernel contains the exact same multiplier in each slot. This means that all the pixels found inside the window are given equal weight when calculating the mean. The problem with this is that as the window gets larger, the chance of the mean being skewed by distant intensity values that happen to be drastically bigger or smaller than the rest also grows. To counter this, the slots near the center of the kernel can be given larger multipliers than the ones near the edges, which is what a weighted kernel such as the Gaussian does.
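Putting the last few paragraphs together, here's a sketch of cross-correlation with a kernel, plus a 3x3 mean kernel and a center-weighted kernel for comparison (Python/NumPy again; the edge handling and kernel sizes are my own choices):

    import numpy as np

    def cross_correlate(image, kernel):
        # Slide the kernel over the image; at each position, multiply the
        # pixels under the kernel by the kernel's slots and sum them up.
        kh, kw = kernel.shape
        padded = np.pad(image.astype(np.float64),
                        ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode="edge")
        out = np.zeros(image.shape, dtype=np.float64)
        for i in range(image.shape[0]):
            for j in range(image.shape[1]):
                out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
        return np.clip(np.round(out), 0, 255).astype(np.uint8)

    # Mean filter: every slot gets the same weight, and the weights sum to 1.
    mean_kernel = np.full((3, 3), 1.0 / 9.0)

    # A weighted alternative that favours the center (weights still sum to 1).
    weighted_kernel = np.array([[1, 2, 1],
                                [2, 4, 2],
                                [1, 2, 1]], dtype=np.float64) / 16.0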
Was that enough filter babble for ya? =) Now whenever you open up Photoshop and click on that mighty "filters" menu, take some time to stop and think about how they work. Who knows? You might be the one to come up with the next coolest filter! =)