Dark processing is a term that is often misused to cover the handling of dead, off-scale, hot, and cold pixels in the CCD. A dead or off-scale pixel reads either zero or the maximum value no matter how much light it is exposed to. Dead pixels appear black and don't stand out on a black background; off-scale pixels show up as white and stand out clearly. In a processed image, a hot pixel may affect a pixel or two around it and create a hot spot. Hot spots are easily confused with stars, which makes on-the-fly drizzle a crap shoot. Knowing your hot spots will help you avoid them during alignment, and will probably upset you every time you think about all the money you spent on the camera.

RAW images have not been processed for darks. Uncombined images may or may not have been, and there is no way to tell (other than visual inspection) whether an Uncombined image has been dark-processed. Dark processing depends on both exposure time and temperature. You should have a set of darks at every exposure time you use, and make a new set whenever the temperature changes, since hot and cold spots vary with temperature. Even with a full set of darks, this part of the processing still seems to need further investigation; I'm simply not happy with the results I've gotten so far from Meade's dark processing.

The DSI uses the Sony ICX404AK video chip, an ultra-sensitive camcorder CCD. Because the chip was not designed for astronomical imaging, you will find plenty of imperfections in the DSI. When discussing astronomical imagers, the terms Class I and Class II describe how many pixels are "bad": a Class I chip cannot have even one whole "bad" column or row. My DSI has no visually "bad" rows or columns, but it easily has over 50 hot spots. To further complicate dark processing, RAW images are stored as a CYMG mosaic.
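Setting the mosaic complication aside for a moment, the basic workflow described above (match darks by exposure time and temperature, subtract, and mask known hot spots so they don't get mistaken for stars during alignment) can be sketched as follows. This is a generic illustration, not Meade's actual algorithm; the function names and the 5-sigma threshold are my own choices.

```python
import numpy as np

def build_hot_pixel_map(dark, sigma=5.0):
    """Flag pixels in a dark frame that sit well above the noise floor.

    The `sigma` threshold is an illustrative choice, not a value taken
    from Meade's software.
    """
    median = np.median(dark)
    # Robust spread estimate: median absolute deviation scaled to ~1 std dev.
    mad = np.median(np.abs(dark - median)) * 1.4826
    return dark > median + sigma * max(mad, 1e-6)

def subtract_dark(light, dark, hot_map=None):
    """Basic dark subtraction: light frame minus a matched dark frame.

    The dark must match the light's exposure time and temperature,
    otherwise the hot pixels won't cancel cleanly.
    """
    result = light.astype(np.float64) - dark.astype(np.float64)
    np.clip(result, 0, None, out=result)
    if hot_map is not None:
        # Replace flagged pixels with the frame median so leftover hot
        # spots don't masquerade as stars.
        result[hot_map] = np.median(result)
    return result
```

In practice you would build the hot-pixel map once from a master dark (an average of several darks) and reuse it for every light frame taken at that exposure and temperature.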
Because of the mosaic, a hot pixel may affect other pixels in the same row, since pixel values are blended as the chip reads out the CCD array and sends the raw data to the computer. The computer software then separates the colors back into CYMG (at least, we assume this is how Meade stores its RAW CYMG images). The actual process or formula does not seem to be documented anywhere; having that information would make it possible to write algorithms that give better results. In the meantime, we are stuck with Meade's algorithms.
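Since Meade's actual storage layout is undocumented, the color-separation step can only be illustrated with an assumed pattern. The sketch below assumes a plain repeating 2x2 CYMG tile, which may well not match the DSI's real readout order; the point is just to show how a single hot pixel in the mosaic lands in exactly one quarter-resolution color plane after separation.

```python
import numpy as np

def split_cymg(raw):
    """Split a raw mosaic into four quarter-resolution color planes.

    Assumes a 2x2 tile with C/Y on even rows and M/G on odd rows --
    an illustrative guess, not Meade's documented format.
    """
    return {
        "C": raw[0::2, 0::2],
        "Y": raw[0::2, 1::2],
        "M": raw[1::2, 0::2],
        "G": raw[1::2, 1::2],
    }
```

Under this assumption, a hot pixel at mosaic position (3, 3) would contaminate only the "G" plane, which is why hot-pixel handling ideally happens before, or with knowledge of, the mosaic separation.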
PS - If you find anything on this page that is copyrighted material for which we did not give an appropriate copyright notice, first realize that it is an oversight; we claim credit for only a few of the pictures on this site. Then please let us know about the item in question. And finally, realize that this is a private, non-commercial, and hopefully educational site. So buzz off.

Copyright © 2007, Gary Gorsline. All Rights Reserved