Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - mmnb

Pages: [1] 2
1
Lately I've found it pretty easy to create star masks with MLT (I used to have a really hard time with the StarMask tool, trying to get the noise threshold just right). My typical workflow has been to create a small-star mask with MLT and a big-star mask with RangeSelection, then take the max of those to make the final star mask.  I then take the lightness channel, low-pass filter it, stretch it, paint out rings from bright stars, and subtract the star mask to make the nebulosity mask.  In some nebulas I use the red channel instead of lightness to get a better nebula "shape".
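Roughly, the combination steps as a toy NumPy sketch (the arrays here are synthetic stand-ins, not real exported masks; inside PI, PixelMath's max() does the same combine):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Synthetic stand-ins for masks exported from PI as floats in [0, 1].
small_stars = np.zeros((64, 64)); small_stars[10, 10] = 1.0  # MLT small-star mask
big_stars = np.zeros((64, 64)); big_stars[40, 40] = 1.0      # RangeSelection mask
lightness = np.random.default_rng(0).random((64, 64))        # stand-in L channel

# Final star mask: pixel-wise max of the two partial masks
# (equivalent to PixelMath's max(small, big)).
star_mask = np.maximum(small_stars, big_stars)

# Nebulosity mask: low-pass the lightness, then subtract the star mask.
nebulosity_mask = np.clip(gaussian_filter(lightness, sigma=4) - star_mask, 0, 1)
```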

This linear M17 image (Hyperstar C14, EOS 60Da, ~20 exposures) has been processed up to PCC. I am having a lot of trouble getting a good star mask: making the threshold low picks up nebulosity (there are some really bright regions in there... I am honestly not sure whether they are poorly resolved stars). Is the right solution to clone-paint that stuff away? Or am I just not making star masks competently?


https://drive.google.com/open?id=1vtFBJJ8RWCDvykjMt3kClM6vfWTrzp5a

2
Catalina 10.15.1
Late 2013 MacBook

*** PCL Unix Signal Handler: Critical signal caught (11): Segmentation violation

PixInsight 1.8.8-3 - Critical Signal Backtrace
Received signal 11 (SIGSEGV)
Module: 0   PixInsight                          0x0000000107edff4e InitializePixInsightModule + 1849342
================================================================================
 45: 1   libsystem_platform.dylib            0x00007fff63e4fb1d _sigtramp + 29
 44: 2   ???                                 0x0000000000000a80 0x0 + 2688
 43: 3   PixInsight                          0x0000000107ca21a2 PixInsight + 14520738
 42: 4   QtWidgets                           0x0000000110b7d979 _ZN7QWidget5eventEP6QEvent + 1161
 41: 5   QtWidgets                           0x0000000110b42c20 _ZN19QApplicationPrivate13notify_helperEP7QObjectP6QEvent + 272
 40: 6   QtWidgets                           0x0000000110b43fd5 _ZN12QApplication6notifyEP7QObjectP6QEvent + 581
 39: 7   PixInsight                          0x00000001070010e4 PixInsight + 1278180
 38: 8   QtCore                              0x0000000111a82094 _ZN16QCoreApplication15notifyInternal2EP7QObjectP6QEvent + 212
 37: 9   QtWidgets                           0x0000000110b7c71d _ZN14QWidgetPrivate11show_helperEv + 413
 36: 10  QtWidgets                           0x0000000110b7d32c _ZN14QWidgetPrivate10setVisibleEb + 828
 35: 11  PixInsight                          0x00000001072e7a0b PixInsight + 4319755
 34: 12  PixInsight                          0x00000001072e8e9b PixInsight + 4325019
 33: 13  QtWidgets                           0x0000000110b7d896 _ZN7QWidget5eventEP6QEvent + 934
 32: 14  QtWidgets                           0x0000000110b42c20 _ZN19QApplicationPrivate13notify_helperEP7QObjectP6QEvent + 272
 31: 15  QtWidgets                           0x0000000110b43fd5 _ZN12QApplication6notifyEP7QObjectP6QEvent + 581
 30: 16  PixInsight                          0x00000001070010e4 PixInsight + 1278180
 29: 17  QtCore                              0x0000000111a82094 _ZN16QCoreApplication15notifyInternal2EP7QObjectP6QEvent + 212
 28: 18  QtWidgets                           0x0000000110b4274a _ZN19QApplicationPrivate18dispatchEnterLeaveEP7QWidgetS1_RK7QPointF + 1674
 27: 19  QtWidgets                           0x0000000110b43501 _ZN19QApplicationPrivate14sendMouseEventEP7QWidgetP11QMouseEventS1_S1_PS1_R8QPointerIS0_Ebb + 833
 26: 20  QtWidgets                           0x0000000110b9c91f _ZN14QDesktopWidget11qt_metacallEN11QMetaObject4CallEiPPv + 9087
 25: 21  QtWidgets                           0x0000000110b9b66a _ZN14QDesktopWidget11qt_metacallEN11QMetaObject4CallEiPPv + 4298
 24: 22  QtWidgets                           0x0000000110b42c20 _ZN19QApplicationPrivate13notify_helperEP7QObjectP6QEvent + 272
 23: 23  QtWidgets                           0x0000000110b43fd5 _ZN12QApplication6notifyEP7QObjectP6QEvent + 581
 22: 24  PixInsight                          0x00000001070010e4 PixInsight + 1278180
 21: 25  QtCore                              0x0000000111a82094 _ZN16QCoreApplication15notifyInternal2EP7QObjectP6QEvent + 212
 20: 26  QtGui                               0x000000011113ae61 _ZN22QGuiApplicationPrivate17processMouseEventEPN29QWindowSystemInterfacePrivate10MouseEventE + 3441
 19: 27  QtGui                               0x00000001111224bb _ZN22QWindowSystemInterface22sendWindowSystemEventsE6QFlagsIN10QEventLoop

3
PCL and PJSR Development / Telea inpainting
« on: 2019 December 05 21:27:04 »
Coming back from J-P Metsavainio's talk at AIC, I really liked the idea of separating the stars and working on the nebulosity.  Adam Block was also talking about his star-shrinking method, which involves removing stars and then using MMT to "inpaint".

I was just playing with OpenCV's inpainting, using PI to create the star-subtracted image and the star mask (created with MLT and some dilation):
 https://www.learnopencv.com/image-inpainting-with-opencv-c-python/


It seems that OpenCV's Telea method does a pretty good job (it takes a ridiculously long time though: several hours on my pokey 2015 MacBook Air for a 2x-drizzled image, originally 10k x 7k pixels). Nothing great about the M31 image (just the last thing I took/processed), but the star traces seem a touch better with OpenCV... I think the larger stars need masks that are a bit bigger (they leave bigger traces in the background).  I'm no expert here, just wondering if anyone might be interested in having this in PI.  Masking the bigger stars with bigger masks for inpainting would improve things a bit more, I think.

(big pngs)
Original:
https://drive.google.com/open?id=1T1TAUwjb_4iB3KXBocnygmt74J9ZNqTy

Starnet (default params):
https://drive.google.com/open?id=13mkj5TGKb7PGvz_UyuoCi51ocuKVcGkH


OpenCV inpaint:
https://drive.google.com/open?id=12uEw5qmw0WQmftcdLtd9MXTd2Tugprmx

EDIT: zooming in, Telea sure does leave very "radial" patterns... from far away, though, the filled-in regions look lower contrast to me.

4
Gallery / m33
« on: 2019 November 22 23:09:17 »
EOS 60Da + C14 EdgeHD

Would love any critical comments.

5
General / filtering with MLT vs. starmask
« on: 2019 November 18 21:57:00 »
I've always really struggled with StarMask, trying to tweak it exactly so it doesn't pick up too much noise, checking the mask over my image to make sure I am getting all the stars, trying to get the ones in front of nebulosity, etc.

Adam Block demo'd his star removal technique, and he starts by constructing a star mask with MLT, simply removing the residual and the very smallest-scale components.  Am I missing something, or is straightforward filtering this way much easier?  I just tried it, and it seems that I got a much better quality star mask with no effort, relative to using the StarMask process.

I feel a little silly for such a simple question, but is there something that I am missing? Do others prefer making their star mask one way or the other?
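As a sanity check on my own understanding, the "drop the residual and the smallest scale" idea can be approximated outside PI with a crude Gaussian blur pyramid (only a rough stand-in for MLT's wavelet layers, on synthetic data):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Synthetic test image: smooth "nebulosity" plus two blurred "stars".
rng = np.random.default_rng(1)
img = gaussian_filter(rng.random((128, 128)), 8) * 0.2
img[20, 20] += 0.8
img[90, 60] += 0.8
img = gaussian_filter(img, 1.5)  # seeing blur

# Crude multiscale split (a stand-in for MLT's wavelet layers): detail
# layer k holds structure between blur scales 2**(k-1) and 2**k.
blurs = [img] + [gaussian_filter(img, 2.0 ** k) for k in range(5)]
layers = [blurs[k] - blurs[k + 1] for k in range(5)]
residual = blurs[-1]  # large-scale residual: mostly nebulosity

# "Star mask by filtering": drop the smallest layer (noise) and the
# residual (nebulosity); keep the star-dominated middle scales.
star_mask = np.clip(sum(layers[1:4]), 0, 1)
```

The star pixels end up well above the background in the filtered result, with no threshold tuning at all.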

6
Bug Reports / pixel trails that follow cursor?
« on: 2019 September 01 23:14:37 »
I have PI running on a newer MacBook Air (just fine) and an older MacBook Pro where I am running into an odd problem:

I've adjusted the UI settings a bit for accessibility and now see rather annoying pixel trails that follow the mouse cursor over images (they disappear after a bit).  I don't see anything similar reported in the forum.  Some machine info:

15 inch (Late 2013)
Mojave 10.14.6
PixInsight Core 01.08.06.1475

7
Image Processing Challenges / m8 lagoon nebula
« on: 2019 August 14 17:04:08 »
C14+hyperstar+ EOS 60Da
20x30s

Found my darks sooner than I thought I could get a hold of them, and indeed including them in the calibration (as recommended here by bulrichl: https://pixinsight.com/forum/index.php?topic=11968) really did seem to make the thin dark lines disappear (they weren't going away despite dithering). I'll try to post some controlled comparisons on that, since there seems to be some varying advice on how to deal with Canon dark current suppression.

Nothing fancy:
Up to PCC
Deconvolution (still hard to see the effects here, very easy to overdo)
Denoising dark
Sharpening nebulosity
Star shrinking (while linear, very light)
arcsinh stretch (I liked this stretch better than the masked stretch as I seemed to be getting smaller stars and better color although the outer nebulosity was dimmer)
HDRMT (no mask; brought out the internal dark areas to look more like the masked stretch)
Small L and Sat curve adjustment.

Lots of nice images of this target out there; mine seems to be in the right ballpark, but I would love to hear any criticism.

XISF up to PCC: https://drive.google.com/open?id=1_iyvukiNlpD-Dj2xKsGIivsykwnD13vl

8
Image Processing Challenges / m106
« on: 2019 August 07 13:39:33 »
C14 Edge HD + Hyperstar + EOS 60Da

Standard preprocessing up to deconvolution, masked stretch, then HDRMT (masked to the galaxies) and some saturation and curve nudging.  There is a line that shows up in the bottom sixth of all my frames; not sure what the cause is there. I might try kicking it out in cosmetic correction.   Is collimation/focus error obvious here in the cropped frame (I can see the chromatic aberration in the stars when working on the image)?  I'm trying to figure out how to get it fixed; right now it seems that the error is position dependent.

EDIT: Link to the XISF, linear image processed up to PCC:
https://drive.google.com/open?id=16cdndGEv2eXvJTQmWNv3IUyDG7omTZlZ

9
Gallery / ngc4236 C14 hyperstar + EOS 60Da
« on: 2019 July 23 23:51:39 »
Stars are rounder than last time (per SubframeSelector stats and PSF) and seem less 'painful' visually. I hope this was the result of a tiny bit of Hyperstar adjustment and persists in the other datasets I've acquired since.

This was 150s x 24; would love any advice on improving it.  Sharpening with deconvolution or MultiscaleLinearTransform didn't help much, and the galaxy was quite noisy, so I just applied gentle denoising. A touch of extra curves adjustment in GIMP.

10
Took some images with my C14 Hyperstar + EOS 60Da (32x30s, IIRC) and am trying to take the drizzled and PCC'd linear image through deconvolution/denoising and stretching. I think I am reasonably proficient with star masks now (I do miss a few double stars... it seems like a lot of trouble to refine the mask to get those last bits; is this typical?) and tried to create an appropriate deringing support as advised here:
http://www.pixinsight.com/examples/M81M82/index.html

No matter what I do, I can't seem to improve the image with deconvolution.  I either see "too much", where it looks like the process is stretching around noise, or next to nothing. I can't see the same sorts of visual contrasts that are in the tutorial.  An important issue:

When I create the external PSF, the star profile is not round; a brutally honest reminder that the collimation is off (I've been trying to fix it, but having a hard time).  Does the poor collimation affect how deconvolution will perform? E.g., will it try to model the collimation error within an underlying model that assumes atmospheric blur?
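To clarify what I mean about the PSF mattering: a bare Richardson-Lucy loop (PI's Deconvolution is, as I understand it, a regularized variant of this) will happily accept an elongated PSF; it's the mismatch between the supplied PSF and the true blur that produces artifacts. A toy sketch on synthetic data (everything here is invented for illustration):

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=30):
    """Bare Richardson-Lucy iteration (no damping/deringing, unlike the
    regularized version in PI's Deconvolution tool)."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(image, image.mean())
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)
        estimate = estimate * fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Elongated "miscollimated" PSF: an asymmetric Gaussian.
y, x = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 / 2.0 + y**2 / 8.0))

# Blur a point source with it, then deconvolve with the same PSF:
scene = np.zeros((64, 64)); scene[32, 32] = 1.0
blurred = fftconvolve(scene, psf / psf.sum(), mode="same")
restored = richardson_lucy(blurred, psf, iterations=50)
```

With a matched PSF the flux re-concentrates at the star; with a mismatched one, it smears instead.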

Are there any other image issues that stand out? Note I only correct with bias and flat due to the way the Canon does darks.


I've included masks in the zip file (1G):
https://drive.google.com/open?id=1tA7bES-s9rlv09VReQBoYFz6qVgUcDxv

Edit: Added a screenshot of a blind masked stretch and HDRMT.  Nice to see that there is some comparable detail in the data... beyond that I wouldn't know to do much more than pre/post denoising in the dark areas and a little saturation and brightness adjustment.  Should I be able to do much better with this data? I swear I see a bit of the IFN when I compare to other pictures, but it is quite faint and I'm not sure I can successfully pull it out.

11
Gallery / coma cluster - C14 Hyperstar EOS 60Da
« on: 2019 July 16 17:14:12 »
This is what I did with 32x30s exposures of the Coma cluster.  Standard preprocessing (no darks, because of how hard the Canon camera's darks are to measure) up to PCC.  I did two gentle passes of denoising with MultiscaleLinearTransform; I tried using a blurry luminance mask, but I could always see a "noise seam" in the galaxies, so I denoised without masking.  Adaptive stretch and some curve bumping in saturation and lightness.  What I kept an eye on was the face-on spiral, trying to be sure I wasn't losing the "S" in my processing.  Does anyone see any obvious problems in the image?

12
Finally have my Hyperstar setup working. My stars suffer from some eccentricity, which hopefully will be resolved after tuning the collimation.  I also had some backlash in my focusing that must've affected things.

Not sure how I did here.  After calibration and PCC, I mostly did some very gentle denoising, deconvolution, and stretching, and really just bumped the saturation curve.  Other images have better hue contrast between the core and arms, but somehow this looks OK to me.  I noticed a bad blue cast after integration, which I fixed with LinearFit.  Not sure how that happened (blue was weaker in the flats?). I don't think I lost any color contrast, but I'm a little worried, since MaximDL's RAW color images (automatically debayered and balanced) seem to suggest I would see more.

Would love any constructive criticism. 500 MB linear XISF after drizzle integration and PCC:
https://drive.google.com/open?id=1x3L7CZ7NfR-RFOF9XeV58pQB7R8zwcdr

13
General / Overscan newb questions
« on: 2018 October 15 18:11:08 »
I just got a camera (QHY 16200A) with overscan capabilities, wanted to check my understanding to see if I have this right:

Previously I created a master bias according to the Light Vortex tutorials with a camera that did not have overscan.  I should be able to use the overscan capability to do better bias correction for an image.  Because the bias is dependent on the state of the (non-cooled) components in the camera, overscan provides better correction than a master bias frame from an average of a bunch of rapidly acquired bias images.

The bias for a given image is determined by taking the overscan region, averaging the columns, and fitting a smooth representation of the bias *along* the columns.  This is done for *all* images (lights, darks, flats).  The master bias (now the zero frame) is also corrected this way.  The zero frame is still subtracted from darks, lights, and flats the way the original master bias would have been.  All of this is done simply by specifying the overscan regions in BatchPreProcessing.

Is that right?  I am not sure what the 'target' region is in the overscan region specification.  Does using the overscan region provide materially better accuracy than simply making a master bias out of a quickly acquired set of bias frames?
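If I have the mechanics right, the column-averaging and smooth-fit step would look something like this in NumPy (the frame, drift model, overscan width, and polynomial order here are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
h, w, w_os = 128, 256, 16                        # exposed area + 16 overscan columns

# Synthetic frame: sky signal in the exposed area only, read noise
# everywhere, and a bias level that drifts slowly along the columns.
bias_level = 300.0 + 0.1 * np.arange(h)
frame = np.zeros((h, w + w_os))
frame[:, :w] = rng.poisson(50, (h, w))
frame += bias_level[:, None] + rng.normal(0, 2, (h, w + w_os))

overscan = frame[:, w:]                          # overscan region (sees no light)
row_means = overscan.mean(axis=1)                # average the overscan columns

# Fit a smooth (low-order polynomial) representation of the bias along
# the columns, then subtract it from the exposed area row by row.
rows = np.arange(h)
bias_fit = np.polyval(np.polyfit(rows, row_means, deg=3), rows)
corrected = frame[:, :w] - bias_fit[:, None]
```

The point of the fit (rather than subtracting the raw row means) is that it smooths out the read noise in the overscan itself.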

14
Image Processing Challenges / PCC help...again for ngc4565
« on: 2018 June 05 09:15:44 »
Some stupidity and a little bad luck led to the focuser screws coming loose, changing the camera angle midway and giving a funny alignment; as resolved in another thread, moonlight was pretty bad as well (DBE leaves a bit of a dark spot around the galaxy; that seems difficult to avoid).  One thing that did go well: I was able to collect while autofocusing between subs on a nearby star (FWHM for the subs seemed significantly improved).

PCC still seems difficult, even after playing with the focal length/pixel size (FL: 1371.60, pixel size 9).  I'm trying to figure out whether it is the image quality (FWHM, noise, etc.) or the ugly cropped areas that might be interfering.

https://drive.google.com/open?id=1BmgayEaRCDwF4dsZuBDhwWgae0FGL2QV

15
I recently acquired some images of NGC 4565 while fixing a few things with my setup.  There was some stupidity that resulted in non-aligned images, but I am trying to see if I've dealt with the moonlight correctly.  I was unsure whether there was something about weighting/local normalization that I may have done incorrectly, but these threads make it seem like I haven't:

https://www.cloudynights.com/topic/594474-what-really-is-the-best-reference-frame-for-local-normalization/
https://pixinsight.com/doc/tools/ImageIntegration/ImageIntegration.html#description_002
(didn't realize the PI documentation was so good, things are laid out very clearly)

This is the image after drizzle integration:
https://drive.google.com/open?id=1CbhL0ghV-jLAIDee-OX_qqd10i2LJo-d


An earlier sub exposure where the bg isn't so bad:
https://drive.google.com/open?id=1XBYpe3ocNrPml0H9f2_W9ArZ0MZpTqze

A later one where it is worse, but better S/N (to the best of my understanding the better S/N is not an artifact of the brighter bg):
https://drive.google.com/open?id=1E096hK-trZ3OANngpFuemr4D9cfGITxu

Vanilla application of DBE doesn't *seem* to pick up the background correctly?
https://drive.google.com/open?id=15FJQdcmrNTLNVUe9J1uSvy_JPJ5TP_Qc


Performing DBE in a standard sort of way does remove a lot of the background, but it still seems too high.  Does anyone have any advice on how to reduce the background intensity?
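In case it helps frame the question, my understanding of the basic DBE idea as a toy sketch (DBE itself fits splines to interactively placed samples; this uses a simple plane on synthetic data):

```python
import numpy as np

# Toy image: a planar "sky" gradient plus a bright "galaxy" blob.
h, w = 96, 96
yy, xx = np.mgrid[0:h, 0:w]
background = 0.2 + 0.001 * xx + 0.0005 * yy
galaxy = 0.5 * np.exp(-(((xx - 48) ** 2 + (yy - 48) ** 2) / 200.0))
img = background + galaxy

# DBE-style idea: sample the background away from the object, fit a
# smooth model to the samples, and subtract the model.
ys, xs = np.mgrid[4:h:16, 4:w:16]
samples = img[ys, xs]
keep = galaxy[ys, xs] < 0.01                     # reject samples on the galaxy
A = np.column_stack([np.ones(keep.sum()), xs[keep], ys[keep]])
coef, *_ = np.linalg.lstsq(A, samples[keep], rcond=None)
model = coef[0] + coef[1] * xx + coef[2] * yy
flattened = img - model
```

If residual background survives this, it usually means the samples landed on real signal (nebulosity, moon glow structure) rather than pure sky.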
