Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Topics - mmnb

Pages: [1]
Bug Reports / pixel trails that follow cursor?
« on: 2019 September 01 23:14:37 »
I have PI running on a newer MacBook Air (just fine) and an older MacBook Pro where I am running into an odd problem:

I've adjusted the UI settings a bit for accessibility and see rather annoying pixel trails that follow the mouse cursor on images (they disappear after a bit).  I don't see anything similar reported on the forum.  Some machine info:

15 inch (Late 2013)
Mojave 10.14.6
PixInsight Core

Image Processing Challenges / m8 lagoon nebula
« on: 2019 August 14 17:04:08 »
C14+hyperstar+ EOS 60Da

Found my darks sooner than I thought I could get a hold of them, and indeed including them in the calibration (as recommended here: by bulrichl) really did seem to make the thin dark lines disappear (they weren't going away despite dithering). I'll try to post some controlled comparisons on that, since there seems to be some varying advice on how to deal with Canon dark current suppression.

Nothing fancy:
Up to PCC
Deconvolution (still hard to see the effects here, very easy to overdo)
Denoising dark areas
Sharpening nebulosity
Star shrinking (while linear, very light)
arcsinh stretch (I liked this stretch better than the masked stretch as I seemed to be getting smaller stars and better color although the outer nebulosity was dimmer)
HDRMT (no mask, brought out the internal dark areas to look more like the masked stretch)
Small L and Sat curve adjustment.

Lots of nice images of this target out there; mine seems to be in the right ballpark, but I would love to hear any criticism.

XISF up to PCC:

Image Processing Challenges / m106
« on: 2019 August 07 13:39:33 »
C14 Edge HD + Hyperstar + EOS 60Da

Standard preprocessing up to deconvolution, masked stretch, then HDRMT (masked to the galaxies) and some saturation and curve nudging.  There is a line that shows up in the bottom sixth in all my frames; not sure what the cause is there.  I might try kicking it out with cosmetic correction.   Is collimation/focus error obvious here in the cropped frame (I can see the chromatic aberration in the stars when working on the image)?  I'm trying to figure out how to get it fixed; right now it seems that the error is position dependent.

EDIT: Link to XISF, linear image processed up to PCC:

Gallery / ngc4236 C14 hyperstar + EOS 60Da
« on: 2019 July 23 23:51:39 »
Stars are rounder than last time from subframe selector stats and PSF, and seem less 'painful' visually. I hope this was a result of a tiny bit of hyperstar adjustment and persists in my other datasets I've acquired since.

This was 150s x 24; would love any advice on improving it.  Sharpening with deconvolution or multiscale linear transform didn't help much, and the galaxy was quite noisy so I just applied gentle denoising. A touch of extra curves adjustment in GIMP.

Took some images with my C14 Hyperstar + EOS 60Da (32x30s IIRC) and am trying to take the drizzled and PCC'd linear image through deconvolution/denoising and stretching. I think I am reasonably proficient with star masks now (I do miss a few double stars....it seems like a lot of trouble to refine the mask to get those last bits; is this typical?) and tried to create an appropriate deringing support as advised here:

No matter what I do, I can't seem to improve the image with deconvolution.  Where I can see "too much", it looks like the process is being stretched around noise or next to nothing. I can't see the same sorts of visual contrasts that are in the tutorial.  An important issue:

When I create the external PSF, the star profile is not round; a brutally honest reminder that the collimation is off (I've been trying to fix it, but having a hard time).  Does the poor collimation affect how deconvolution will perform? E.g., will it try to model the collimation error in the underlying model that is based on the atmosphere?

Are there any other image issues that stand out? Note I only correct with bias and flat due to the way the Canon does darks.

I've included masks in the zip file (1G):

Edit: Added a screenshot of a blind masked stretch and HDRMT.  Nice to see that there is some comparable detail in the data...I wouldn't know to do much more than pre/post denoising in the dark areas and a little bit of saturation and brightness adjustment.  Should I be able to do much better with this data? I swear I see a bit of the IFN when I compare to other pictures, but it is quite faint and I'm not sure if I can successfully pull it out.

Gallery / coma cluster - C14 Hyperstar EOS 60Da
« on: 2019 July 16 17:14:12 »
This is what I did with 32x30s exposures of the Coma cluster.  Standard preprocessing (no darks because of how the Canon camera's darks are hard to measure) up to PCC.  Just did two gentle passes of denoising with multiscale linear transform; I tried using a blurry luminance mask but I could always see a "noise seam" in the galaxies, so I denoised without masking.  Adaptive stretch and some curve bumping in saturation and lightness.  One thing I watched was the face-on spiral, trying to be sure I wasn't losing the "S" in my processing.  Does anyone see any obvious problems in the image?

Finally have my hyperstar setup working. My stars suffer from some eccentricity, hopefully will be resolved after tuning collimation.  I also had some backlash in my focusing that must've affected things.

Not sure how I did here.  After calibration and PCC, I mostly did some very gentle denoising, deconvolution, stretching, and really just bumped the saturation curve.  Other images have better hue contrast between the core and arms, but somehow this looks OK to me.  I noticed a bad blue cast after integration, which I fixed with LinearFit.  Not sure how that happened (blue was weaker in the flats?), so I don't think I lost any color contrast, but I'm a little worried since MaximDL's RAW Color images (automatically debayered and balanced) seem to suggest I would see more.
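For anyone curious, the LinearFit-style cast fix can be sketched in a few lines of numpy. This is just the general idea, a least-squares straight line mapping one channel onto a reference channel; it is not PixInsight's actual implementation, and the names are illustrative:

```python
import numpy as np

def linear_fit_channel(channel, reference):
    """Map `channel` onto `reference` with a least-squares straight line,
    in the spirit of PixInsight's LinearFit (illustrative sketch only)."""
    a, b = np.polyfit(channel.ravel(), reference.ravel(), 1)
    return a * channel + b
```

Applied to the blue channel with, say, green as the reference, this removes a global multiplicative/additive cast while leaving per-pixel color structure alone, which is why it shouldn't cost any real color contrast.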

Would love any constructive crit. 500MB linear XISF after drizzle integration and PCC:

General / Overscan newb questions
« on: 2018 October 15 18:11:08 »
I just got a camera (QHY 16200A) with overscan capabilities, wanted to check my understanding to see if I have this right:

Previously I created a master bias according to the Light Vortex tutorials with a camera that did not have overscan.  I should be able to use the overscan capability to do better bias correction for an image.  Because the bias is dependent on the state of the (non-cooled) components in the camera, overscan provides better correction than a master bias frame from an average of a bunch of rapidly acquired bias images.

The bias for a given image is determined by taking the overscan region, averaging columns and fitting a smooth representation of the bias *along* the columns.  This is done for *all* images (lights, darks, flats).  The master bias (now the zero frame) is also corrected this way.  The zero frame is still subtracted from darks, lights and flats the way the original master bias would have been.  All of this is done simply by specifying the overscan regions in BatchPreProcessing.

Is that right?  I am not sure what the 'target' region is in the overscan region specification.  Does using the overscan region provide materially better accuracy than simply making a master bias out of a quickly acquired set of bias frames?

Image Processing Challenges / PCC help...again for ngc4565
« on: 2018 June 05 09:15:44 »
Some stupidity and a little bad luck led to the focuser screws coming loose, changing the camera angle midway and giving a funny alignment; as resolved in another thread, moonlight was pretty bad as well (DBE leaves a bit of a dark spot around the galaxy...seems difficult to avoid).  One thing that did go well is that I was able to collect while autofocusing between subs on a nearby star (FWHM for subs seemed significantly improved).

PCC still seems difficult after trying to play with the focal length/pixel size (FL: 1371.60, pixel size 9).  Trying to figure out if it is the image quality (FWHM, noise etc.) or the ugly cropped areas that might be interfering with things?

I recently acquired some images of ngc4565 while fixing a few things with my setup.  There was stupidity that resulted in non-aligned images, but I am trying to see if I've dealt with the moonlight correctly.  I was unsure if there was something about weighting/local normalization that I may have done incorrectly, but these threads make it seem like there isn't:
(didn't realize the PI documentation was so good, things are laid out very clearly)

This is the image after drizzle integration:

An earlier sub exposure where the bg isn't so bad:

A later one where it is worse, but better S/N (to the best of my understanding the better S/N is not an artifact of the brighter bg):

Vanilla application of DBE doesn't *seem* to pick up the background correctly?

Performing DBE in a standard sort of way does remove a lot of the background, but it still seems too high.  Does anyone have any advice on how to reduce the background intensity?

Bug Reports / Batch Preprocessing Bug?
« on: 2018 March 24 00:12:56 »
If I am plowing through data over multiple dates, I'd like to 'Clear' flats and lights and then rerun with my master flat and bias that I just used on the previous day.  Something about the BPP state gets messed up after the first run, when I 'load flats' for the next date, the grouping seems messed up (this is repeatable).  The weird grouping goes away if I close the script and start it again.

EDIT: Crap, was on the lights tab when I did the screen cap...but be assured that the flats look fine in that tab.

Image Processing Challenges / Yet another m42 image
« on: 2018 March 07 13:22:09 »
Hi all,
I did some quick exposures of M42 (20-30 for each channel, L exposures were around 30s; used a set of shorter exposures to erase blooming with PixelMath) on my 402ME.
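The blooming erase with shorter exposures boils down to something like the following numpy sketch (the exposure times, threshold, and names are illustrative; the actual PixelMath expression would do the equivalent per pixel):

```python
import numpy as np

def patch_saturated(long_exp, short_exp, t_long, t_short, thresh=0.95):
    """Replace blown pixels in the long exposure with short-exposure data
    rescaled to the long exposure's flux scale (values assumed in [0, 1];
    the threshold and linear scaling are illustrative assumptions)."""
    scale = t_long / t_short            # flux ratio between the exposures
    blown = long_exp >= thresh          # saturated / blooming pixels
    return np.where(blown, short_exp * scale, long_exp)
```

This assumes the two stacks are registered and that the sensor is linear, so a pixel's short-exposure value times the exposure ratio is a fair stand-in for what the long exposure would have recorded without saturating.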

Starting to feel a little more comfortable with PI.

Any suggestions on improving this image? Looking at it now, it looks a bit blurry from the drizzled increase in resolution; I should probably downsample a bit (the 402ME chip is pretty small).

I was having trouble with my guider and I left it off (since the exposures were short), so I think there is a bit of movement in the stars; but overall things felt pretty smooth.  Stars had to be shrunk a fair bit, but I think it worked without too much artifact introduction.  The central stars are really hard to separate; is there a good reason why stars shouldn't be shrunk while the image is linear? (EDIT: just answered my own dumb question, you can't deconvolve if your stars don't reflect the atmospheric conditions of the acquisition).

Link to denoised L and  RGB (which was PCC'd):

Image Processing Challenges / First try with an m27 dataset
« on: 2017 November 10 11:50:52 »
I just love PI! It is a joy to use.

After going through the lightvortex tutorials, the attached png is the result of a first try with an m27 dataset.  The RGB and L images after preprocessing and color correction are here:

Some notes on what I am thinking as I did things (for others to point out potential flaws in my thinking):

-About 28 L images and ~12 images each for R, G and B and 1x1 binning.

-Some other small problems during acquisition (only a single set of flats due to some bugaboos; RGB only got about 9 flat frames to average).  There is some flat overcorrection in the B image.  I'll take a closer look at that when I have a dataset with a proper set of flats (some suggestion in the thread that this is due to the pedestal setting in BatchPreProcessing).

-Stars are *so* bloaty in the L image; I think this will be the bottleneck for final quality.  I want to blame seeing conditions but am very concerned it's some sort of fundamental focus problem (I have an FLI DF2 with no temp regulation and have filter offsets set for each filter). I'll be acquiring with FocusMax for the next dataset (and likely better seeing) to see if things improve.

-Photometric color calibration was a little problematic (resolved in another thread), but didn't seem to change things very much.

-Began with a nebulosity mask (subtracted stars) and deconvolution on linear image.  Higher iterations introduced "streaks" in nebula (still some there).  Tried to be conservative to balance slightly improved image (brighter stars in nebula areas) against some clear defects.

-Made a mask for dark areas to denoise, some processes really blurred areas strongly, so again, kept things fairly conservative with MultiScaleLinearTransform.

-Noticed that dimmest parts of nebula ("wings") had some noticeable noise in them, masked those areas with range selection and gently denoised same way as before.

-Deconvolution on linear RGB image resulted in severe desaturation.  Left it out (is that supposed to happen?).

-Some gentle denoising on RGB image.

-Applied some gentle star reduction; over-shrinking leaves ugly halos, and the general bloatiness made it hard to avoid this entirely.

-LRGB combined and tried to pump saturation.

-Finally stretched via HistogramTransformation such that the nebula looked saturated and some subtle contrast in dark areas of the sky was visible.

But perhaps I've committed an obvious/grave error somewhere.  Is it possible to do a lot better with this dataset?
Many thanks in advance for your input.
