Show Posts


Topics - mmnb

Pages: [1]
Gallery / coma cluster - C14 Hyperstar EOS 60Da
« on: 2019 July 16 17:14:12 »
This is what I did with 32x30s exposures of the Coma Cluster.  Standard preprocessing (no darks, because darks are hard to measure reliably with the Canon camera) up to PCC.  I just did two gentle passes of denoising with MultiscaleLinearTransform; I tried using a blurry luminance mask, but I could always see a "noise seam" in the galaxies, so I denoised without masking.  Adaptive stretch and some curve bumping in saturation and lightness.  I paid particular attention to the face-on spiral, trying to be sure I wasn't losing its "S" shape in my processing.  Does anyone see any obvious problems in the image?

Finally have my Hyperstar setup working.  My stars suffer from some eccentricity, which will hopefully be resolved after tuning collimation.  I also had some backlash in my focusing that must have affected things.

Not sure how I did here.  After calibration and PCC, I mostly did some very gentle denoising, deconvolution, and stretching, and really just bumped the saturation curve.  Other images have better hue contrast between the core and arms, but somehow this looks OK to me.  I noticed a bad blue cast after integration, which I fixed with LinearFit.  I'm not sure how that happened (was blue weaker in the flats?).  I don't think I lost any color contrast, but I'm a little worried, since MaxIm DL's RAW color images (automatically debayered and balanced) seem to suggest I would see more.
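For anyone curious, my understanding is that a LinearFit-style fix amounts to regressing the cast channel against a reference channel and applying the fitted line; here is a toy numpy sketch (the function name and toy data are mine, not PixInsight's actual implementation):

```python
import numpy as np

def linear_fit_channel(target, reference):
    """Rescale target so its linear least-squares fit matches reference."""
    a, b = np.polyfit(target.ravel(), reference.ravel(), 1)  # reference ~ a*target + b
    return a * target + b

# toy example: the blue channel carries a global linear cast
rng = np.random.default_rng(0)
green = rng.uniform(0.1, 0.9, size=(64, 64))
blue = 0.5 * green + 0.1          # "cast" version of the same structure
fixed = linear_fit_channel(blue, green)
print(np.allclose(fixed, green))  # True: the global cast is removed
```

Note this only corrects a *global* linear offset/scale per channel; it can't fix a spatially varying cast.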

Would love any constructive criticism.  500 MB linear XISF after drizzle integration and PCC:

General / Overscan newb questions
« on: 2018 October 15 18:11:08 »
I just got a camera (QHY 16200A) with overscan capabilities, wanted to check my understanding to see if I have this right:

Previously I created a master bias according to the Light Vortex tutorials with a camera that did not have overscan.  I should be able to use the overscan capability to do better bias correction for an image: because the bias depends on the state of the (non-cooled) components in the camera, overscan provides better correction than a master bias frame averaged from a bunch of rapidly acquired bias images.

The bias for a given image is determined by taking the overscan region, averaging the columns, and fitting a smooth representation of the bias *along* the columns.  This is done for *all* images (lights, darks, flats).  The master bias (now the zero frame) is also corrected this way.  The zero frame is still subtracted from darks, lights and flats the way the original master bias would have been.  All of this is done simply by specifying the overscan regions in BatchPreProcessing.

Is that right?  I am not sure what the 'target' region is in the overscan region specification.  Does using the overscan region provide materially better accuracy than simply making a master bias out of a quickly acquired set of bias frames?
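If my understanding above is right, the correction would look something like this minimal numpy sketch (all names and the choice of a low-order polynomial for the smooth fit are my own assumptions, not PixInsight internals):

```python
import numpy as np

def overscan_correct(frame, overscan_cols, order=3):
    """Subtract a per-row bias level estimated from the overscan region."""
    rows = np.arange(frame.shape[0])
    levels = frame[:, overscan_cols].mean(axis=1)               # mean overscan level per row
    smooth = np.polyval(np.polyfit(rows, levels, order), rows)  # smooth bias along the rows
    return frame - smooth[:, None]

# toy frame: 120 columns, the last 10 are overscan (bias only);
# the first 110 carry a flat 100 ADU signal on top of a bias ramp
bias = np.linspace(500.0, 509.0, 100)
frame = np.tile(bias[:, None], (1, 120))
frame[:, :110] += 100.0
corrected = overscan_correct(frame, overscan_cols=slice(110, 120))
print(np.allclose(corrected[:, :110], 100.0))  # True: bias removed, signal kept
```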

Image Processing Challenges / PCC help...again for ngc4565
« on: 2018 June 05 09:15:44 »
Some stupidity and a little bad luck led to the focuser screws coming loose, changing the camera angle midway and giving a funny alignment; as resolved in another thread, moonlight was pretty bad as well (DBE leaves a bit of a dark spot around the galaxy; that seems difficult to avoid).  One thing that did go well: I was able to collect while autofocusing between subs on a nearby star (FWHM for the subs seemed significantly improved).

PCC still seems difficult after playing with the focal length/pixel size (FL: 1371.60 mm, pixel size: 9 µm).  I'm trying to figure out whether it is the image quality (FWHM, noise, etc.) or the ugly cropped areas that might be interfering with things.
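As a sanity check on those numbers (since PCC needs them to plate-solve), the image scale follows from the standard small-angle formula, scale ("/px) = 206.265 × pixel size (µm) / focal length (mm):

```python
# quick sanity check of the image scale implied by my setup values
focal_length_mm = 1371.60
pixel_size_um = 9.0
scale = 206.265 * pixel_size_um / focal_length_mm  # arcsec per pixel
print(round(scale, 3))  # 1.353 "/px
```

If the value entered in PCC is far from what the plate solver finds, solving tends to fail before any photometry happens.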

I recently acquired some images of NGC 4565 while fixing a few things with my setup.  There was some stupidity that resulted in non-aligned images, but I am trying to see if I've dealt with the moonlight correctly.  I was unsure whether there was something about weighting/local normalization that I may have done incorrectly, but these threads make it seem like there is not:
(didn't realize the PI documentation was so good, things are laid out very clearly)

This is the image after drizzle integration:

An earlier sub exposure where the bg isn't so bad:

A later one where it is worse, but better S/N (to the best of my understanding the better S/N is not an artifact of the brighter bg):

A vanilla application of DBE doesn't *seem* to pick up the background correctly?

Performing DBE in a standard sort of way does remove a lot of the background, but it still seems too high.  Does anyone have any advice on how to reduce the background intensity?
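To make sure I understand what DBE is supposed to be doing: sample the background at star-free points, fit a smooth 2-D surface through them, and subtract it.  Here is a much-simplified numpy sketch of that idea (a low-order polynomial stands in for DBE's splines; all names and toy data are mine):

```python
import numpy as np

def fit_background(image, samples, order=2):
    """Fit a 2-D polynomial surface through (x, y) background sample points."""
    xs, ys = samples[:, 0], samples[:, 1]
    zs = image[ys, xs]
    terms = [(i, j) for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.column_stack([xs**i * ys**j for i, j in terms])   # design matrix
    coef, *_ = np.linalg.lstsq(A, zs, rcond=None)
    gx, gy = np.meshgrid(np.arange(image.shape[1]), np.arange(image.shape[0]))
    return sum(c * gx**i * gy**j for c, (i, j) in zip(coef, terms))

# toy image: a linear gradient "sky" plus a bright blob we must not sample
h, w = 80, 80
gy, gx = np.mgrid[0:h, 0:w]
background = 0.2 + 0.001 * gx + 0.002 * gy
image = background.copy()
image[35:45, 35:45] += 0.5                      # the "galaxy"
rng = np.random.default_rng(1)
pts = rng.integers(0, 80, size=(50, 2))
keep = ~((pts[:, 0] > 30) & (pts[:, 0] < 50) & (pts[:, 1] > 30) & (pts[:, 1] < 50))
model = fit_background(image, pts[keep])
print(np.allclose(model, background, atol=1e-6))  # True: gradient recovered
```

If the fitted model comes out too low, it would leave exactly the "still too high" residual background I'm seeing, which is why I suspect my sample placement.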

Bug Reports / Batch Preprocessing Bug?
« on: 2018 March 24 00:12:56 »
If I am plowing through data over multiple dates, I'd like to 'Clear' flats and lights and then rerun with the master flat and bias I just used on the previous date.  Something about the BPP state gets messed up after the first run: when I 'Load Flats' for the next date, the grouping seems messed up (this is repeatable).  The weird grouping goes away if I close the script and start it again.

EDIT: Crap, was on the lights tab when I did the screen cap...but be assured that the flats look fine in that tab.

Image Processing Challenges / Yet another m42 image
« on: 2018 March 07 13:22:09 »
Hi all,
I did some quick exposures of M42 (20-30 for each channel; the L exposures were around 30s; I used a set of shorter exposures to erase blooming with PixelMath) on my 402ME.

Starting to feel a little more comfortable with PI.

Any suggestions on improving this image? Looking at it now, it looks a bit blurry from the drizzled increase in resolution; I should downsample a bit (the 402ME chip is pretty small).

I was having trouble with my guider and left it off (since the exposures were short), so I think there is a bit of movement in the stars; but overall things felt pretty smooth.  The stars had to be shrunk a fair bit, but I think it worked without introducing too many artifacts.  The central stars are really hard to separate; is there a good reason why stars shouldn't be shrunk while the image is linear? (EDIT: just answered my own dumb question: you can't deconvolve if your stars don't reflect the atmospheric conditions of the acquisition.)

Link to denoised L and RGB (which was PCC'd):

Image Processing Challenges / First try with an m27 dataset
« on: 2017 November 10 11:50:52 »
I just love PI! It is a joy to use.

After going through the Light Vortex tutorials, the attached PNG is the result of a first try with an M27 dataset.  The RGB and L images after preprocessing and color correction are here:

Some notes on what I was thinking as I did things (so others can point out potential flaws in my thinking):

-About 28 L images and ~12 images each for R, G and B and 1x1 binning.

-Some other small problems during acquisition (only a single set of flats due to some bugaboos; RGB only got about 9 flat frames to average).  There is some flat overcorrection in the B image.  I'll take a closer look at that when I have a dataset with a proper set of flats (there is some suggestion in the thread that this is due to the pedestal setting in BatchPreProcessing).

-Stars are *so* bloaty in the L image; I think this will be the bottleneck for final quality.  I want to blame seeing conditions, but I am very concerned there is some sort of fundamental focus problem (I have an FLI DF-2 with no temperature regulation and have filter offsets set for each filter). I'll be acquiring with FocusMax for the next dataset (and likely better seeing) to see if things improve.

-Photometric color calibration was a little problematic (resolved in another thread), but didn't seem to change things very much.

-Began with a nebulosity mask (stars subtracted) and deconvolution on the linear image.  Higher iteration counts introduced "streaks" in the nebula (some are still there).  I tried to be conservative, balancing a slightly improved image (brighter stars in nebula areas) against some clear defects.

-Made a mask over the dark areas to denoise; some processes blurred those areas strongly, so again I kept things fairly conservative with MultiscaleLinearTransform.

-Noticed that the dimmest parts of the nebula (the "wings") had some noticeable noise in them; masked those areas with RangeSelection and gently denoised the same way as before.

-Deconvolution on the linear RGB image resulted in severe desaturation, so I left it out (is that supposed to happen?).

-Some gentle denoising on RGB image.

-Applied some gentle star reduction; over-shrinking leaves ugly halos, and the general bloatiness made it hard to avoid this entirely.

-LRGB combined and tried to pump saturation.

-Finally stretched via HistogramTransformation so that the nebula looked saturated and some subtle contrast in the dark areas of sky was visible.
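For reference, my understanding is that the midtones slider in that stretch applies the standard midtones transfer function, where a pixel at the midtones balance m maps to mid-gray:

```python
# midtones transfer function as I understand it (m = midtones balance in (0, 1))
def mtf(x, m):
    return (m - 1.0) * x / ((2.0 * m - 1.0) * x - m)

print(mtf(0.25, 0.25))  # 0.5: the midtones point maps to mid-gray
```

The endpoints are fixed (0 maps to 0, 1 maps to 1), so lowering m brightens everything between them without clipping.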

But perhaps I've committed an obvious/grave error somewhere.  Is it possible to do a lot better with this dataset?
Many thanks in advance for your input.
