Author Topic: Prevent growth of spider veins?  (Read 5939 times)

Offline dmcclain

  • PixInsight Addict
  • ***
  • Posts: 117
Prevent growth of spider veins?
« on: 2016 May 19 09:32:28 »
I'm wondering how I can reduce star emphasis in images without incurring the growth of unseemly "spider veins." An example is shown below: the left is the original frame, and the right is the result of 4 successive passes of morphological selection under the protection of a star mask.

The selection operation gently tends toward erosion and does a decent job of de-emphasis when viewed at much lower spatial resolution. But it takes almost a 4:1 reduction in spatial resolution before the unseemly spider veins are no longer apparent. The gentle nature of the morphological filtering seems intended to prevent even worse artifact growth. But bridges still tend to form between pairs of nearby stars, and faint stars get smeared into strings.
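For what it's worth, the bridging behavior is easy to reproduce outside PixInsight. Here is a minimal Python sketch (scipy's grey erosion standing in for MT's erosion operator; star sizes and spacing are invented for illustration): the saddle between a close pair is shielded by bright pixels on both sides, so it erodes far more slowly than the isolated periphery and a filament survives.

```python
import numpy as np
from scipy.ndimage import grey_erosion

# Two synthetic Gaussian "stars" 8 px apart (geometry chosen for illustration).
yy, xx = np.mgrid[0:64, 0:64]

def star(cx, cy, sigma=2.5):
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))

img = star(28, 32) + star(36, 32)

# Four gentle erosion passes, roughly analogous to iterating MT.
eroded = img.copy()
for _ in range(4):
    eroded = grey_erosion(eroded, size=(3, 3))

saddle = eroded[32, 32]    # midpoint between the pair
outside = eroded[32, 40]   # same distance from a star, but on its isolated side
print(saddle, outside)     # the bridge survives; the periphery is wiped out
```

The point of the toy model: erosion removes whatever the dimmest pixel in the neighborhood is, and between two stars every neighborhood still contains bright pixels, so the inter-star ridge is the last thing to go.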

Suggestions?


Offline msmythers

  • PTeam Member
  • PixInsight Jedi
  • *****
  • Posts: 1178
    • astrobin
Re: Prevent growth of spider veins?
« Reply #1 on: 2016 May 19 10:26:58 »
I don't know what type of star mask you're using, but I generally find this type of problem occurs because the noise threshold is too low. Anywhere that stars or noise blend together in a star mask will blend together when you apply MorphologicalTransformation.

Here is a simple star mask using contours showing the difference between a 0.15 and 0.25 setting of the noise threshold.

Of course you can target certain size stars with both the scale and the mask preprocessing settings. You can also tighten up the size of the stars in the mask with smoothness to create more separation between stars. You have to be careful though as this can cause dark rings in your image if not used carefully.
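The effect of the noise threshold can be illustrated with a toy example (plain numpy/scipy, not PixInsight's StarMask; the field, star positions, and noise statistics are invented). At a low threshold, noise pixels break through and the binary mask fragments into many structures that can touch the stars; raise the threshold and only the stars survive as cleanly separated islands.

```python
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(1)
yy, xx = np.mgrid[0:64, 0:64]

# Sky background with Gaussian noise, plus three synthetic stars.
img = rng.normal(0.10, 0.03, (64, 64))
for cx, cy, amp in [(16, 16, 1.0), (38, 38, 0.6), (48, 48, 0.5)]:
    img += amp * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 8.0)

# Count connected structures in the binary mask at two thresholds.
counts = {}
for thr in (0.15, 0.25):
    counts[thr] = label(img > thr)[1]

print(counts)  # far more components leak through at 0.15 than at 0.25
```

Every one of those extra low-threshold components is a place where the mask can glue a faint star to its neighbor once MT starts eroding.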


Mike

Offline Andres.Pozo

  • PTeam Member
  • PixInsight Padawan
  • ****
  • Posts: 927
Re: Prevent growth of spider veins?
« Reply #2 on: 2016 May 19 10:38:33 »
Hi,
your image is a bit strange. It looks as if your mask is protecting the stars and applying the morphological transform to the background. You should be using an inverted mask that protects the background and is clear over the stars.

Offline pfile

  • PTeam Member
  • PixInsight Jedi Grand Master
  • ********
  • Posts: 4729
Re: Prevent growth of spider veins?
« Reply #3 on: 2016 May 19 10:39:13 »
i think the mask is not turned on... this is what MT looks like with no mask. the screenshot shows no mask applied, but it's possible you turned it off before taking the screenshot.

rob

Offline msmythers

Re: Prevent growth of spider veins?
« Reply #4 on: 2016 May 19 10:49:19 »
Wow, I didn't even notice that but it does look like it wasn't inverted or maybe not applied at all.

Mike

Offline pfile

Re: Prevent growth of spider veins?
« Reply #5 on: 2016 May 19 11:29:07 »
the only reason i suspect this is that early on in my PI career something like this happened to me while doing star reduction. the result was the same.

i eventually believed (wrongly) that i could no longer apply masks to images. but what really happened is that the mask on/off state is separate from the act of choosing a mask... in other words, you might expect that choosing a mask would cause the mask to become active, but it doesn't work that way. at that time PI did not have the colored tabs indicating that a mask was active, nor did it have the green stripe for STF. even today, sometimes i choose or drag-apply a mask and forget that the mask state was off.

i guess it's also possible that the star mask reveals too much which allows the 'tendril' growth to occur, or as Andres says it's been applied in the wrong inversion state.

rob



Offline dmcclain

Re: Prevent growth of spider veins?
« Reply #6 on: 2016 May 20 13:29:54 »
Thanks for the suggestion about setting a higher noise level. The mask is fine. Had it been inverted then all my nebular structures would have been eroded, and they were not.

Offline dmcclain

Re: Prevent growth of spider veins?
« Reply #7 on: 2016 May 20 18:20:21 »
I tried the suggestions. But along the way I did some experiments on mask building, and what I'm finding is that MorphologicalTransform is ignoring the mask. I built a completely binary mask and tried the erosion on star images with the mask sense in both directions. The results are absolutely identical with and without the mask. It appears that MT is simply ignoring it.

I had the sense that the mask is a destination mask, meaning that computations take place on all image pixels, and the mask determines only how strongly to mix the result back into the image. But stars completely hidden beneath the mask (which I expected to be shielded) still get eroded in the resulting image, and that is what leads to spider veins: faint stars near the noise level get merged into stringy channels even when completely covered by the mask. And that holds whether or not the mask is inverted.
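For reference, the destination-mask model described here reduces to a per-pixel blend, and under that model a fully protected pixel (mask value 0) must come back unchanged. A quick numpy sketch of the expected semantics (this is the textbook blend, not PI's actual implementation):

```python
import numpy as np

def apply_masked(original, processed, mask):
    """Destination-mask blend: mask = 1 takes the processed value,
    mask = 0 keeps the original pixel (fully protected)."""
    return mask * processed + (1.0 - mask) * original

orig = np.array([0.8, 0.5, 0.2])
proc = np.array([0.1, 0.1, 0.1])   # e.g. an eroded result
mask = np.array([1.0, 0.5, 0.0])

out = apply_masked(orig, proc, mask)
print(out)  # the fully protected pixel keeps its original 0.2
```

So if a star sitting under mask value 0 still comes out eroded, the blend above is demonstrably not being applied, which is exactly the symptom described.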

So, either I am completely misunderstanding how masks are supposed to work, or this is a bug in MT or the image store routines.

Eh?

Offline dmcclain

Re: Prevent growth of spider veins?
« Reply #8 on: 2016 May 20 18:37:19 »
Okay, digging even deeper... I find that one-off MT respects the mask. I had been performing small amounts of MT in succession, storing the sequence in a ProcessContainer. THAT does not respect the existing image mask.

Looking further, the ProcessContainer has a column indicating MASK. It is not set when I construct the PC from a series of MT operations. But if I perform a manual series of MT on a masked image and then copy the PC belonging to the image, it *does* contain an indication of the mask in use.

It seems there is no way to force the PC to use a mask, and when the mask column is left empty, it forces unmasked behavior. This default makes the PC somewhat less useful for building up command chains that could be applied to any image, masked or not.

So what to do?

Offline pfile

Re: Prevent growth of spider veins?
« Reply #9 on: 2016 May 20 18:44:50 »
oh... yes that explains it.

if you drag out a process from a process container (a single line of the process container) onto an image, the mask will not be applied. however, i'm pretty sure that if you create a process container and then apply the whole thing in one go the masks will be applied.

rob



Offline dmcclain

Re: Prevent growth of spider veins?
« Reply #10 on: 2016 May 20 18:54:56 »
Ahem.... yes you are correct...

But I see, upon reading the source code of the ProcessContainer, that the mother process establishes the masks separately for each contained process, to be applied when that sub-process is performed.

I see the conflict that arises... Does an image mask currently in effect have the right to override the mask that was used to perform a sub-process along the way? Yes, and no... PI chose the *no* variant.

The conflict arises because you have a procedural language being applied in a functional manner, but it can't quite cut it. These are not truly functional language building blocks. They pretend to be. But beware.

I see that as long as I make a mask image with a special name, then I can use these PC blocks in a functional manner against any image, as long as the mask in use has that same special name. But that does not answer the question of when an image mask should override PC behavior. For those cases, I can manually edit the code in the PC block and get what I want.

Offline dmcclain

Re: Prevent growth of spider veins?
« Reply #11 on: 2016 May 20 19:02:50 »
Personally, I think a more useful behavior would be for PC blocks to respect an image mask in effect when the subprocesses have no mask specified. That allows one to build up a functional PC block that can be applied at will to any image, masked or not.

By forcing a PC to run unmasked when no mask was specified at its creation, you subvert the user's intent. If a mask was specified when the PC was built, it can make sense to re-apply that mask on the next application. But if no mask was specified, it is less useful to impose a no-mask condition.
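The behavior proposed here amounts to a simple precedence rule: a mask recorded with the sub-process wins, and an empty mask slot falls back to whatever mask the target image currently carries, rather than forcing no mask. A hypothetical sketch (the Step type and mask ids are invented for illustration; this is not PI's API):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Step:
    name: str
    mask: Optional[str]  # mask image id recorded in the PC, or None

def effective_mask(step: Step, image_mask: Optional[str]) -> Optional[str]:
    """Proposed rule: a mask recorded in the container takes precedence;
    otherwise fall back to the target image's current mask."""
    return step.mask if step.mask is not None else image_mask

print(effective_mask(Step("MT", None), "star_mask"))      # falls back to the image mask
print(effective_mask(Step("MT", "pc_mask"), "star_mask"))  # the recorded mask wins
```

Under this rule a PC built without masks behaves functionally: apply it to a masked image and the mask is honored, apply it to an unmasked image and it runs unmasked.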

Offline dmcclain

Re: Prevent growth of spider veins?
« Reply #12 on: 2016 May 20 19:05:40 »
... there are good arguments for any variation on this theme, including the existing one. Hence my complaint that a procedural language is being used in a functional context. That underlying language is the problem.

Offline dmcclain

Re: Prevent growth of spider veins?
« Reply #13 on: 2016 May 20 19:11:23 »
... maybe we need a NO-MASK option so that sub-processes that truly need no mask in effect will have their way. Then any subprocesses that have no mask specified can be used more flexibly against masked and non-masked images. And you also need a way to specify the mask in use for any sub-process. (I suppose you already do by allowing us to edit the source code. But then why pretend to be an object block language?)

Offline dmcclain

Re: Prevent growth of spider veins?
« Reply #14 on: 2016 June 10 10:27:10 »
In response to the original question...

I find that these "spider veins" are a natural byproduct of closely spaced stars. The star-removal morphological operations cannot help but build bridges between the centers of these stars as they erode.

So the partial remedy I found, which goes a long way toward eliminating spider vein growth without entirely avoiding it, is a two-stage star reduction. Concentrate on the faintest and smallest star images first, erode them, and then attack the next level of star images.

I select the faintest stars by running MMT with layers 2 & 3 (scales 2 & 4) enabled, all others disabled, against a grayscale luminance image. Then I perform a 3-element morphological closing, followed by a 3-element dilation, and finally a histogram midtones stretch to saturate most of the resulting islands. This builds the mask to lay over the image, and then normal star reduction is applied using an iterated morphological selection with a 5-element disk kernel.
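As a rough translation of that mask-building recipe outside PixInsight, here is a sketch in which a difference of Gaussians stands in for MMT layer selection and a simple clip stands in for the midtones stretch (all scales and the knee value are illustrative assumptions, not the actual MMT layer responses):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, grey_closing, grey_dilation

def small_star_mask(lum, lo=1.0, hi=4.0, knee=0.05):
    """Crude analogue of keeping only the small MMT layers: a difference
    of Gaussians isolates structure at roughly the smallest star scales."""
    band = np.clip(gaussian_filter(lum, lo) - gaussian_filter(lum, hi), 0.0, None)
    band = grey_closing(band, size=(3, 3))   # 3-element closing
    band = grey_dilation(band, size=(3, 3))  # 3-element dilation
    # stand-in for the midtones stretch: saturate the surviving islands
    return np.clip(band / knee, 0.0, 1.0)

# One faint, small synthetic star on an empty field.
yy, xx = np.mgrid[0:32, 0:32]
lum = 0.3 * np.exp(-((xx - 16) ** 2 + (yy - 16) ** 2) / 2.0)
mask = small_star_mask(lum)
```

The resulting mask is near 1 over the small star and near 0 in the background, which is the separation the two-stage scheme relies on before any erosion is applied.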

This may be sufficient star removal, and you could stop there. But if you are intent on removing all but the very brightest stars, the second stage will do that.

The second stage attacks the remaining brighter stars by repeating the same exercise, but with the initial MMT filtering layers 4 & 5 (scales 8 & 16) selected, all others disabled. Everything else remains the same.

Doing this in two stages helps prevent the growth of spider veins, since we first diminish the faintest members of closely spaced star groups. The second wave of star shrinkage then has less intensity left in whatever bridge remains between closely spaced brighter stars.