Author Topic: Basic PixInsight processing example of a galaxy image  (Read 34603 times)

Offline Nocturnal

  • PixInsight Jedi Council Member
  • *******
  • Posts: 2727
    • http://www.carpephoton.com
All,

I've uploaded some screenshots to show how I process a typical galaxy
image with PixInsight. I spent less than an hour doing this session so
it's far from optimal but it demonstrates some of the techniques I use
regularly.

http://gallery.tungstentech.com/main.php?g2_itemId=1247

Demonstrated are:

- previews
- DynamicBackgroundExtraction (DBE)
- ScreenTransferFunction (STF)
- Histogram
- ACDNR
- HDRWaveletTransform
- PixelMath
- Curves
- Real Time Preview

Naturally I'm always looking to improve my processes, so if you see mistakes please let me know :) My biggest challenge right now is preserving a nice background after doing large-scale ACDNR; it ends up looking too pastel-colored.
Best,

    Sander
---
Edge HD 1100
QHY-8 for imaging, IMG0H mono for guiding, video cameras for occultations
ASI224, QHY5L-IIc
HyperStar3
WO-M110ED+FR-III/TRF-2008
Takahashi EM-400
PixInsight, DeepSkyStacker, PHD, Nebulosity

Offline georg.viehoever

  • PTeam Member
  • PixInsight Jedi Master
  • ******
  • Posts: 2132
Basic PixInsight processing example of a galaxy image
« Reply #1 on: 2009 March 28 09:41:24 »
Sander,

this tutorial is just great. Thanks a lot!

I sometimes wonder if it is wise to apply a non-linear RGB stretch so early in the process. The non-linear stretch also removes some of the color information that is present in the picture. Example: if your RGB values are 0.01, 0.02, 0.04 before the stretch, they are something like 0.2, 0.3, 0.35 afterwards, giving an entirely different color balance and generally making bright points whiter than they were in the raw data.
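
Just to put numbers on it, here is a quick sketch in Python (outside PixInsight; mtf() is my own stand-in for the usual midtones transfer function, and the midtones value is arbitrary):

Code:
import numpy as np

def mtf(x, m):
    # midtones transfer function: keeps 0 and 1 fixed, maps the midtones value m to 0.5
    return (m - 1.0) * x / ((2.0 * m - 1.0) * x - m)

rgb_linear = np.array([0.01, 0.02, 0.04])    # a faint pixel before the stretch
rgb_stretched = mtf(rgb_linear, 0.01)        # an aggressive non-linear stretch

print(rgb_linear / rgb_linear.max())         # [0.25 0.5  1.  ]  -> clearly bluish
print(rgb_stretched / rgb_stretched.max())   # roughly [0.62 0.83 1.0] -> much closer to white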

I recently tried (using a quick Lulin image) using the histogram transform for the linear transformation only, essentially making the sky background and star colors more or less neutral. I then did an HSV channel extraction, applied the non-linear transform to the V channel only, and then did an HSV channel combination. The result retains the greenish colour of the comet core, which is otherwise turned white by the non-linear transform. Also, there is less noise, and the stars are more colourful.
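
Roughly, the procedure corresponds to something like this (a Python sketch outside PixInsight, just to show the idea; the midtones value is arbitrary and mtf() stands in for whatever non-linear stretch you prefer):

Code:
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb

def mtf(x, m):
    # midtones transfer function: keeps 0 and 1 fixed, maps the midtones value m to 0.5
    return (m - 1.0) * x / ((2.0 * m - 1.0) * x - m)

def stretch_v_only(rgb, midtones=0.05):
    # rgb: float array of shape (h, w, 3), values in [0, 1], already linearly balanced
    hsv = rgb_to_hsv(rgb)
    hsv[..., 2] = mtf(hsv[..., 2], midtones)   # stretch V only; hue and saturation stay linear
    return hsv_to_rgb(hsv)

# stretched = stretch_v_only(linear_rgb)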

Being an amateur in astronomical image processing, I wonder whether this is a valid procedure and whether it would benefit your galaxy image as well?

Have a nice day (and a clear night)

Georg
Georg (6 inch Newton, unmodified Canon EOS40D+80D, unguided EQ5 mount)

Offline Nocturnal

  • PixInsight Jedi Council Member
  • *******
  • Posts: 2727
    • http://www.carpephoton.com
Basic PixInsight processing example of a galaxy image
« Reply #2 on: 2009 March 28 12:51:59 »
Hi Georg,

this process is based on a series of steps as presented by Juan in the videos and by Jack in his AIC talk. Certain aspects of it mirror the Zone System by Ron Wodaski. It is quite possible I misinterpreted what they taught, of course :)

Certainly there are alternative processing methods and I thank you for sharing yours. I can't quite picture yet what you're doing so when I have some time I'll give it a try.
Best,

    Sander
---
Edge HD 1100
QHY-8 for imaging, IMG0H mono for guiding, video cameras for occulations
ASI224, QHY5L-IIc
HyperStar3
WO-M110ED+FR-III/TRF-2008
Takahashi EM-400
PIxInsight, DeepSkyStacker, PHD, Nebulosity

Offline shold

  • Newcomer
  • Posts: 21
    • http://www.flickr.com/photos/siegi252
Basic PixInsight processing example of a galaxy image
« Reply #3 on: 2009 March 29 00:57:42 »
Sander,

Thanks, this tutorial is just great.
It is very helpful for new PixInsight users!

Greetings
Siegfried

Offline georg.viehoever

  • PTeam Member
  • PixInsight Jedi Master
  • ******
  • Posts: 2132
Basic PixInsight processing example of a galaxy image
« Reply #4 on: 2009 March 29 02:37:38 »
Hi Sander.

Quote from: "Nocturnal"
It is quite possible I misinterpreted what they taught, of course :)


I don't think you misinterpreted anything. I just wanted to outline a different procedure for the non-linear stretch that, in my opinion, preserves color better than the usual non-linear histogram stretch. As always in astronomical image processing, the preferred procedure may depend on the processing goals.

Greetings from Munich,

Georg
Georg (6 inch Newton, unmodified Canon EOS40D+80D, unguided EQ5 mount)

Offline Juan Conejero

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 7111
    • http://pixinsight.com/
Basic PixInsight processing example of a galaxy image
« Reply #5 on: 2009 March 29 17:09:49 »
Hi Sander,

Thank you very much for this tutorial. It's great and I'm sure it can help a lot of people to understand many basic and important things.

Georg, to neutralize the background you can use this simple PixelMath expression (for the RGB/K slot and "Use a single RGB/K expression" enabled):

Code:
$T - Med($T)

with rescaling enabled. This is a purely linear transformation. It subtracts the median from each channel of the image, then the final rescaling operation redistributes all values within the [0,1] range.

You can also use:

Code:
$T - Med($T) + 0.005

with rescaling disabled. The constant is to avoid zero clipping; you may want to use a different number depending on the image.
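
For reference, the arithmetic is roughly this (a NumPy sketch of the same idea, not the actual PixelMath implementation):

Code:
import numpy as np

def neutralize_background(img, pedestal=None):
    # img: linear RGB image, float array of shape (h, w, 3), values in [0, 1]
    out = img - np.median(img, axis=(0, 1))     # subtract each channel's median, like Med($T)
    if pedestal is None:
        lo, hi = out.min(), out.max()
        return (out - lo) / (hi - lo)           # stand-in for the 'rescale result' option
    return np.clip(out + pedestal, 0.0, 1.0)    # the small constant avoids clipping to zero

# neutral = neutralize_background(img)          # first expression, rescaling enabled
# neutral = neutralize_background(img, 0.005)   # second expression, rescaling disabled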

Linearity is extremely important and should be preserved during the initial stages of processing. In this regard we are preparing a little surprise for PI 1.5  8)
Juan Conejero
PixInsight Development Team
http://pixinsight.com/

Offline georg.viehoever

  • PTeam Member
  • PixInsight Jedi Master
  • ******
  • Posts: 2132
Median
« Reply #6 on: 2009 March 30 01:40:01 »
Juan,

thanks for the median hint. This works very nicely indeed, and is much more accurate than doing it manually with HistogramTransformation. Maybe adding a "median" button to the HistogramTransformation tool would be a good idea?

Some websites suggest adjusting the white balance of a picture by balancing the top 0.5% of the image, since these pixels are usually mainly stars. (Considering your comments in the G2V discussion, I know you are critical of such approaches, but nevertheless I would like to give it a try.) I tried to do this with PixelMath, but I did not find a way to determine the statistics of the top 0.5% of an image. Do you see a way to do it (in one expression)? Would it be possible to add functions for percentile-based statistics?
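
For instance, in a rough NumPy sketch (outside PixInsight), what I'm after would be something like:

Code:
import numpy as np

def top_fraction_stats(channel, fraction=0.005):
    # channel: a single-channel float image; statistics of its brightest `fraction` of pixels
    threshold = np.quantile(channel, 1.0 - fraction)   # clipping point for the top 0.5%
    top = channel[channel >= threshold]
    return threshold, top.mean(), np.median(top)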

Have a nice day,

Georg
Georg (6 inch Newton, unmodified Canon EOS40D+80D, unguided EQ5 mount)

Offline Juan Conejero

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 7111
    • http://pixinsight.com/
Basic PixInsight processing example of a galaxy image
« Reply #7 on: 2009 March 30 02:42:26 »
Quote
Some websites suggest adjusting the white balance of a picture by balancing the top 0.5% of the image, since these pixels are usually mainly stars.


My advice is: keep yourself as far away from those websites as you can. Colorful stars are an essential part of a deep-sky image. Setting them to pure white is a crime.

Color balancing is no reason to clip a single pixel in the highlights. You can apply simple color correction factors such as:

Code:
$T * 0.850
$T
$T * 1.195


where the magic numbers should come from a white balancing procedure, including (heaven forbid) G2V.

Instead of assuming that all objects in a deep sky image are being illuminated by a particular type of star (which is what G2V does :) ), you can use an approach similar to what we have been discussing in this thread:

http://pixinsight.com/forum/viewtopic.php?t=1077

which IMHO follows much better-founded criteria.

Quote
I tried to do this with Pixelmath, but I did not find a way to determine the statistics of the top 0.5% of an image. Do you see a way to do it (in one expression)? Would it be possible to add functions for percentile based statistics?


Definitely yes; thank you for this suggestion. I'll think about a new PixelMath function that returns the clipping point corresponding to a given percentile.

You can use the HistogramTransformation tool to do this very easily. Select your view, specify the desired percentage of the total pixels by clicking the Auto Clip Setup button, then click the Auto clip highlights button, and the high clipping point will be set automatically at the required position. With HistogramTransformation you can always know how many pixels are being clipped.

I wish a nice day also to you and to everyone on this forum.
Juan Conejero
PixInsight Development Team
http://pixinsight.com/

Offline georg.viehoever

  • PTeam Member
  • PixInsight Jedi Master
  • ******
  • Posts: 2132
Basic PixInsight processing example of a galaxy image
« Reply #8 on: 2009 March 30 02:56:07 »
Juan,

the literature I am referring to suggests using the 0.5% of pixels with the highest luminosity to do the white balancing, assuming that a reasonably large population of stars is usually "white" on average. They don't suggest any clipping (which would be harmful, I agree!).

I think this procedure is quite similar to the approach suggested in http://pixinsight.com/forum/viewtopic.php?t=1077 , where you just use a galaxy as a population of stars. I have to admit that a galaxy usually consists of many millions of stars while the average star field has only some hundreds, but hundreds should still be sufficient for reasonable statistics.

Anyway, I never tried this procedure (yet), but maybe some time soon.

Georg
Georg (6 inch Newton, unmodified Canon EOS40D+80D, unguided EQ5 mount)

Offline Juan Conejero

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 7111
    • http://pixinsight.com/
Basic PixInsight processing example of a galaxy image
« Reply #9 on: 2009 March 30 03:25:28 »
Quote
the literature I am referring to suggests using the 0.5% of pixels with the highest luminosity to do the white balancing, assuming that a reasonably large population of stars is usually "white" on average. They don't suggest any clipping (which would be harmful, I agree!).


Ah, I see. I misunderstood what you said (indeed, you didn't use the word "clipping", I must admit). Sorry for my unjustified vehemence. We're seeing some things recently that make us a bit touchy :)

Do you have some links to those websites, or some references?

Quote
I think this procedure is quite similar to the approach suggested in http://pixinsight.com/forum/viewtopic.php?t=1077 , where you just use a galaxy as a population of stars. I have to admit that a galaxy usually consists of many millions of stars while the average star field has only some hundreds, but hundreds should still be sufficient for reasonable statistics.


We're achieving very good results using the integrated light of nearby galaxies to derive color correction factors. In images where no suitable galaxy is available, this can be a good alternative, and the 0.5% of pixels sounds reasonable. I see some potential pitfalls, though. For example, objects that are being reddened by absorption could be a problem.

It would be great if Vicent Peris, who first used galaxies as white balance references (see for example http://pixinsight.com/examples/NGC7331-CAHA/en.html), chimed in here. Vicent! :)
Juan Conejero
PixInsight Development Team
http://pixinsight.com/

Offline georg.viehoever

  • PTeam Member
  • PixInsight Jedi Master
  • ******
  • Posts: 2132
Basic PixInsight processing example of a galaxy image
« Reply #10 on: 2009 March 30 10:47:02 »
Juan,

the reference that I can immediately retrieve is Richard Berry and James Burnell, "The Handbook of Astronomical Image Processing", Second English Edition, p. 577: "Use Histogram Percentiles to set Black and White: ...".

For me, this is one of the best books on astronomical image processing available, since the authors don't hesitate to give some of the mathematical background that is usually missing from the more colorful books on astrophotography.

Greetings from cloudy Munich,

Georg
Georg (6 inch Newton, unmodified Canon EOS40D+80D, unguided EQ5 mount)

Offline vicent_peris

  • PTeam Member
  • PixInsight Padawan
  • ****
  • Posts: 988
    • http://www.astrofoto.es/
Basic PixInsight processing example of a galaxy image
« Reply #11 on: 2009 March 31 05:20:08 »
Hello all,

the method of integrating the starlight of a single photo can be valid as a relative color calibration. IMO, it cannot be an absolute color calibration method for your optical system, as it will give you different RGB scale factors in each photo.

OTOH, sometimes it can be preferable to have a relative color calibration. In my experience with professional telescopes, when photographing very distant galaxies it's better to take the closest galaxies in the photo as the white reference. If you take a nearby galaxy as the white reference, all the distant galaxies in your photo will appear yellowish, as all the objects in the photo have a significant redshift. So a relative color calibration can enhance the redshift effects of the objects in the photo: you will see the reddening of the farther galaxies with respect to the closer ones.

I have given Georg's idea a try with an M42 photo, but with some significant variations...

To measure only the star intensities, I have removed the wavelet layers of the image down to the 16-pixel one. This acts as a kind of local background subtraction.

Also, to remove saturated pixels, I made a mask that is a binarization of the image. This mask is white only for the saturated pixels. With this mask, you set those pixels of the flattened image to zero, which excludes them from the statistics calculations.

Lastly, to measure only the illumination of the 0.5% brightest pixels, we use a small trick: in the Histograms tool, we set the shadows clipping parameter to 99.5%. :) Applying this clipping to the flattened image sets 99.5% of the pixels to zero, so only the brightest ones remain.

In the end, we have the original pixel values of the 0.5% brightest pixels that are not completely saturated.
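
In NumPy terms the selection step is roughly this (just a sketch outside PixInsight; it assumes the image is already background-flattened, and it uses an arbitrary saturation level and a simple mean-based luminance instead of my binarized mask):

Code:
import numpy as np

def bright_star_reference(flat, sat_level=0.98, fraction=0.005):
    # flat: background-flattened linear RGB image, shape (h, w, 3), values in [0, 1]
    lum = flat.mean(axis=2)                        # simple luminance, only used to rank pixels
    unsaturated = (flat < sat_level).all(axis=2)   # crude stand-in for the saturation mask
    threshold = np.quantile(lum[unsaturated], 1.0 - fraction)
    selected = unsaturated & (lum >= threshold)    # the ~0.5% brightest unsaturated pixels
    return flat[selected].mean(axis=0)             # per-channel averages = the white reference

The three averages returned here play the role of the Statistics readouts described below.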

In this screen image you can see the complete process:



"m42_Original" is the original image and "M42_ColorStars" is the processed one. "satmask" is the mask of the saturated pixels. "color_Stars" is the linear flattened image. "bg" is the background reference image, used to preserve background level in the RGB channels.

You can also see two process instances in the image:

- The Statistics instance shows the average illumination of the flattened image in each RGB channel. These values are our white reference and are used to scale the RGB channels.

- The PixelMath instance shows the formulae used to scale the RGB channels. As the G channel has the highest average illumination, we use it as the unitary channel, so we scale R and B with respect to G. The formula works as follows:

     - We multiply the channel by a factor after subtracting the background level ( (m42[0]-Med(bg[0])) ). The factor is the average illumination of the reference color channel divided by the average illumination of the target color channel ( *(Avg(color[1])/Avg(color[0])) ).
     - We add the background level, multiplied by the same factor that we applied to the background-subtracted image ( +(Med(bg[0])*(Avg(color[1])/Avg(color[0]))) ).


As you can see, the color is slightly redder. The original image has a completely manual color calibration, adjusted simply by eye. The resulting color balance must be judged by your own taste, I think... What do you think?


Regarding color calibration with spiral galaxies: I'm starting work on this problem. My idea is to establish the spiral galaxy as a standard candle for astrophotography color calibration. Spiral galaxies act as a good white reference because they contain the broadest range of object types within a single object. My idea is to make a survey of spiral galaxies, calculate an "average spiral galaxy", and establish a correction factor for each survey galaxy. In this way we will have a limited number of "standard galaxies" (around 70-80) from which to establish the RGB scaling factors for our imaging system. The galaxies in the survey must have the properties below (to be reviewed):

- Must be closer than 50 Mpc.
- Hubble classification of Sa, Sb, Sc, Scd, SBa, SBb, SBc or SBcd.
- Inclination of <60°.
- Integrated intrinsic intergalactic and galactic reddening of <0.5 mag in Johnson B.

I think I will have the results at the end of this year.



Hope this helps... Best regards,
Vicent.

Offline georg.viehoever

  • PTeam Member
  • PixInsight Jedi Master
  • ******
  • Posts: 2132
Basic PixInsight processing example of a galaxy image
« Reply #12 on: 2009 March 31 08:37:20 »
Wow!

This is following the general idea that I had in mind, but it is certainly a lot more sophisticated! And the result looks nice indeed. Such a procedure is certainly very useful for getting an initial white balance, and then it is up to the user to refine it using aesthetic or photometric criteria!

Georg
Georg (6 inch Newton, unmodified Canon EOS40D+80D, unguided EQ5 mount)

Offline vicent_peris

  • PTeam Member
  • PixInsight Padawan
  • ****
  • Posts: 988
    • http://www.astrofoto.es/
Basic PixInsight processing example of a galaxy image
« Reply #13 on: 2009 March 31 09:14:03 »
Hi again,

I'm sorry, there was an error in the formulae. The correct ones for good background leveling are below:

R channel:   m42[0]*(Avg(color[1])/Avg(color[0])) - (Med(bg[0])*(Avg(color[1])/Avg(color[0]))) + Med(bg[0])

G channel: m42[1]

B channel:   m42[2]*(Avg(color[1])/Avg(color[2])) - (Med(bg[2])*(Avg(color[1])/Avg(color[2]))) + Med(bg[2])
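
In other words, each channel is scaled around its own background level: out = (channel - Med(bg))*k + Med(bg). A quick NumPy equivalent (the array layout in the usage lines is just an assumption):

Code:
import numpy as np

def scale_channel(channel, bg_channel, k):
    # out = channel*k - Med(bg)*k + Med(bg): scales the channel while keeping the background level
    bg = np.median(bg_channel)
    return channel * k - bg * k + bg

# k_r = Avg(color[1]) / Avg(color[0]);  k_b = Avg(color[1]) / Avg(color[2])
# r_cal = scale_channel(m42[..., 0], bg[..., 0], k_r)
# b_cal = scale_channel(m42[..., 2], bg[..., 2], k_b)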



Best regards,
Vicent.

Offline ManoloL

  • PixInsight Addict
  • ***
  • Posts: 220
Basic PixInsight processing example of a galaxy image
« Reply #14 on: 2009 March 31 14:42:22 »
Hello everyone:

First of all, I apologize for not continuing in English.
Vicent:
I'm glad you're finally letting some of your secrets out.
Tomorrow I will try to apply your "recipe" to one of my latest images.
I'm curious to see what coefficients I get for the modified 400D using the procedure you have described.
I suppose it can be applied to an image stacked with DSS without corrections to the RGB channels.

Regards,

Manolo L.