Show Posts


Topics - TinySpeck

General / PixInsight Big Files
« on: 2020 January 25 15:26:03 »
As I work on an image, many mystery files build up in my operating folder.  They range in size from a few dozen kB to 100 MB, and often total about a GB after a while.

I back up my computer daily, including my current image processing folder, and I get many GB of accumulated mystery files even using incremental backups. 

What are these files for?  Do I need to restore them to restart a project after a computer crash?  Can I reduce or eliminate them?

General / Weighted Batch PreProcessing (WBPP) cosmetic correction?
« on: 2020 January 13 18:21:15 »
I'm trying this script out for the first time, and I can't find anything about the Cosmetic Correction box on the Lights tab.  To use it you need to specify a "template icon".  Can anyone explain what that is?

General / 1.8.7 out of memory errors
« on: 2019 October 08 21:01:26 »
I'm stacking the largest collection of subs I've ever tried, and running into out-of-memory errors during ImageIntegration.  I'm closing all programs except PI.  I have the latest PI release and updates (including one from today), and the latest Windows 10 with updates.  I'm using the new automatic buffer sizes option (and have also tried fixed buffer sizes).  I got through the initial 'no pixel rejection' integration after a couple of out-of-memory failures, but now I can't proceed with pixel rejection.  After reading the subs from the cache in half a minute or so, the console reports:

Integration of 1806 images:
Pixel combination .................. average
Output normalization ............... additive + scaling
Weighting mode ..................... custom keyword: SUB_WEIGHT
Scale estimator .................... iterative k-sigma / BWMV
Pixel rejection .................... Winsorized sigma clipping
Rejection normalization ............ scale + zero offset
Rejection clippings ................ low=yes high=yes
Rejection parameters ............... sigma_low=4.000 sigma_high=4.000 cutoff=4.000
Large-scale rejection clippings .... low=no high=yes
Large-scale rejection parameters ... lsr_layers_low=2 lsr_grow_low=2 lsr_layers_high=3 lsr_grow_high=4

* Available physical memory: 11.869 GiB
* Allocated pixel buffer: 319 rows, 8.894 GiB
* Using 12 concurrent pixel stack(s), 1.338 GiB

* Integrating channel 1 of 3:
  Analyzing pixel rows:     0 ->   318:   3%
*** Error: Out of memory
<* failed *>

There is a pause of a couple of minutes after "analyzing pixel rows", and then the memory failure happens.  The amount of physical memory reported varies by a GiB or two each time I try this.  I have tried rebooting.  I once got through the initial integration by setting a fixed buffer size of 32 MiB and a stack size of 2048 MiB, but now that doesn't seem to help.
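For what it's worth, the console figures can be sanity-checked with a little arithmetic.  (The frame width below is inferred, not something I know; 32-bit float samples are assumed.)

```javascript
// Sanity check of the ImageIntegration console figures above.
// Assumption: samples are stored as 32-bit floats; the frame width
// is inferred from the reported buffer size, not known.
const GiB = 1024 ** 3;
const images = 1806;            // subs being integrated (from the log)
const bufferRows = 319;         // "Allocated pixel buffer: 319 rows"
const bufferBytes = 8.894 * GiB;
const bytesPerSample = 4;       // float32

// Frame width implied by the buffer size:
const impliedWidth = bufferBytes / (images * bufferRows * bytesPerSample);
console.log(Math.round(impliedWidth)); // roughly a DSLR-width frame
```

That 8.894 GiB buffer alone is most of the 11.869 GiB of available physical memory, which is consistent with the allocation tipping over once the per-stack buffers are added.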

It seems like PI should never run out of memory; it should adjust its processes to use what's available.  Is this problem a function of the new automatic buffer sizes?

Can anyone suggest anything?

General / Superpixel = supergreen
« on: 2019 September 23 16:16:38 »
I'm fooling around with some old data, taken with a Canon DSLR.  I don't have bias, flat, or dark frames for it.  I stacked without image calibration, and used SuperPixel debayering for the first time.  My result looks reasonable, but it's way too green.  I've tried BackgroundNeutralization followed by ColorCalibration using the image stars as a white reference, and also PhotometricColorCalibration.  Both results are nearly identical, with about 2x the green there should be (i.e. the green hump in the image histogram is about twice as high as the red and blue).

Is there something fishy about the Superpixel method?  I've read here in the forum that it does divide G intensity by two, which sounds right, but is there something else I need to do?
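A toy sketch of the factor of two I'm seeing (a flat gray test scene is hypothetical here, just to make the arithmetic obvious):

```javascript
// Why SuperPixel output would come out 2x green if the two G sites in
// each 2x2 RGGB cell were summed instead of averaged.
// Hypothetical flat gray scene: every raw site reads 0.10.
const cell = { R: 0.10, G1: 0.10, G2: 0.10, B: 0.10 };

// Correct SuperPixel: average the two green samples.
const averaged = { R: cell.R, G: (cell.G1 + cell.G2) / 2, B: cell.B };

// Naive variant: sum them, giving twice the green signal.
const summed   = { R: cell.R, G: cell.G1 + cell.G2,       B: cell.B };

console.log(averaged.G); // 0.1 — neutral
console.log(summed.G);   // 0.2 — the "supergreen" 2x histogram hump
```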

Thanks for any help!

General / Script functions available?
« on: 2019 September 22 10:38:33 »
I've written 1 (one) PixInsight script, which is just enough to whet my appetite and make me dangerous.  ;D  I have a much more complicated script in mind now, which could make use of the following large-scale functions I find in PixInsight:

  * Star finding, as in StarAlignment
  * Gaussian surface fitting, as in DynamicPSF
  * The entire StarAlignment process

Are these complex tools available in the programming interface of PI scripts?  If they are, I will plunge in and figure out how to use them.

Thanks for any help!

SFS graphs are wonderful for seeing trends and deciding on acceptance criteria.  I would like to see the ability to graph only approved subs, though.  Sometimes you don't want to see the effects of unapproved subs, which can distort the results.  For example, it would then be possible to set weighting so that approved subs run from a lower to an upper limit.  Right now I have to copy the data into Excel and use the Excel "NA" data value to exclude unapproved subs.

General / Star saturation in 3DPlot script
« on: 2019 June 27 11:18:04 »
I'm examining the saturation of a few bright stars from the Extract Lightness version of my still-linear RGB image.  I'm attaching a screen shot of the region I'm examining (auto-stretched) and the result of running the 3DPlot script on it.

You can see the three brightest stars flat-topping in the 3D plot.  But when I zoom way in on the original image and use Readout mode with the cursor I don't see this.  They all look like they have reasonably rounded peaks (as near as I can tell from the Readout data).  The biggest star peaks at around 0.90, but the highest I can see on the others is 0.28 and 0.18.  None of them is saturating.

So why are they all flattened at the same Z level?

General / Suppress rejected subs data in SFS?
« on: 2019 June 24 09:17:05 »
I have an image with about 1000 subs in SubframeSelector, working on setting a Weighting expression.  My problem is that my 100 or so rejected subs are polluting the data with their outlier numbers.  I would like to see the results of my weight expressions only for the subs which I'm not rejecting.

I could go through the subs table and manually select and remove subs, but that is time-consuming and a poor solution if I change my rejection criteria or want to see the rejected sub data for some reason.

Can anyone recommend a way to suppress the data from rejected subs in SFS?

I do my astrophotography with a 14-bit DSLR, and find that I usually have several bright stars with blown-out "magenta" cores in my images.  The magenta comes from out-of-range values being created in the debayering process, way early on during stacking.  With a 14-bit camera all my raw pixels should be 0.25 or less in the normalized PixInsight data range, but in my blown-out cores I see values much higher.

The Repaired HSV Separation script doesn't work well for me.  It helps, but star cores are still blown out.  It also puts a break in the image history since it creates a new repaired image.

ColorClip replaces out-of-range pixels with the mean of neighboring in-range pixels.  This can be done early in the linear phase, so dynamic range during stretching is maximized.  It does a decent job of replacing blown-out cores with the color immediately outside the blown-out region and results in more natural-looking stars.  The threshold is adjustable, with 0.25 the default (for 14-bit cameras).
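Here's a minimal sketch of the replacement rule described above.  (This mirrors the described behavior, not the script's actual code; the function name and patch data are made up for illustration.)

```javascript
// Sketch of the ColorClip replacement rule: any pixel above the
// threshold is replaced by the mean of its in-range 8-neighbors.
// (Illustration only — not the actual script source.)
function colorClip(channel, threshold = 0.25) {
  const h = channel.length, w = channel[0].length;
  const out = channel.map(row => row.slice());
  for (let y = 0; y < h; y++) {
    for (let x = 0; x < w; x++) {
      if (channel[y][x] <= threshold) continue;   // in range: keep as-is
      let sum = 0, n = 0;
      for (let dy = -1; dy <= 1; dy++) {
        for (let dx = -1; dx <= 1; dx++) {
          const yy = y + dy, xx = x + dx;
          if (yy < 0 || yy >= h || xx < 0 || xx >= w) continue;
          const v = channel[yy][xx];
          if (v <= threshold) { sum += v; n++; }  // only in-range neighbors
        }
      }
      if (n > 0) out[y][x] = sum / n;             // replace with local mean
    }
  }
  return out;
}

// 3x3 patch with one blown-out center pixel:
const patch = [
  [0.10, 0.10, 0.10],
  [0.10, 0.90, 0.10],
  [0.10, 0.10, 0.10],
];
console.log(colorClip(patch)[1][1]); // ≈ 0.1 — center repaired from neighbors
```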

I hope people find this useful.  You can get the script at .  Please let me know if you find bugs or problems.  It's my first JavaScript script so I'm sure there are some beginner gaffes in there.

General / Script slider not working
« on: 2019 February 16 19:17:12 »
I've almost got my first script going, except for one little detail: my slider control doesn't do anything.  The numeric data doesn't update when I move the slider, the tool tip doesn't appear when I hover over the slider or numeric data, and the onValueUpdated function of the NumericControl doesn't get called.

My script runs without console messages if I don't move the slider, but if I move the slider and then click OK I see
** Warning [162]: C:/Program Files/PixInsight/include/pjsr/NumericControl.jsh, line 314: reference to undefined property this.parent.edit
*** Error [022]: C:/Program Files/PixInsight/include/pjsr/NumericControl.jsh, line 322: TypeError: this.parent.sliderValueToControl is not a function

on the console.

I based my script originally on PolarCoordinates.js, which doesn't have any user controls, but I copied the NumericControl code from AberrationSpotter.js into my dialog GUI code, and it seems to run okay.  I've been comparing my script to others with sliders and I can't see any differences which would explain this.  I've checked my #includes too.  I've been poking and experimenting for a couple of hours now and can't get it going.

Does this ring a bell for anyone?  What am I missing?  Thanks for any help!

General / Script methods for get/set pixel values by coords?
« on: 2019 February 14 11:39:52 »
I'm having a hard time finding useful information on the methods of the PCL classes.  I'm looking for methods which read and write the pixel data at a given (x, y) coordinate in an Image.  I've been scanning the available script source code for clues, and the closest I can find is Image.interpolate(), which seems like overkill.
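In case it helps later readers: as far as I can tell, the PJSR Image object exposes per-pixel accessors of the form Image.sample( x, y[, channel] ) and Image.setSample( value, x, y[, channel] ).  Since PJSR code only runs inside PixInsight, here's a plain-JavaScript stand-in with the same shape (a mock class, not the real PJSR object):

```javascript
// Mock of the PJSR-style per-pixel accessors. The real PixInsight
// methods are believed to be Image.sample(x, y, chan) and
// Image.setSample(value, x, y, chan); this stand-in only mirrors
// the calling pattern so it can run outside PixInsight.
class MockImage {
  constructor(width, height, channels = 1) {
    this.width = width;
    this.height = height;
    this.channels = channels;
    this.data = new Float32Array(width * height * channels);
  }
  sample(x, y, chan = 0) {
    return this.data[(chan * this.height + y) * this.width + x];
  }
  setSample(value, x, y, chan = 0) {
    this.data[(chan * this.height + y) * this.width + x] = value;
  }
}

const img = new MockImage(4, 4);
img.setSample(0.5, 2, 1);        // write pixel at (x=2, y=1)
console.log(img.sample(2, 1));   // 0.5
```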


General / Fix debayered saturated stars?
« on: 2019 February 13 09:20:09 »
I shoot astrophotos with a 14-bit DSLR.  My raw images are therefore constrained to 0 - 0.25 in PixInsight data space.  My exposures and ISO are set to bring my histogram humps up about 10% from the left side, so my photo noise is separated from my camera noise.
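The 0.25 ceiling follows directly from the bit depths: a 14-bit maximum mapped into PixInsight's 16-bit-normalized [0, 1] range.

```javascript
// Why a 14-bit DSLR tops out near 0.25 in PixInsight's normalized range:
const max14 = 2 ** 14 - 1;        // 16383, largest 14-bit raw value
const max16 = 2 ** 16 - 1;        // 65535, full scale of the [0, 1] mapping
const ceiling = max14 / max16;
console.log(ceiling.toFixed(4));  // "0.2500"
```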

When I do this, there are usually 20 or so bright stars in the image whose cores are saturated.  Right after debayering I see magenta cores with R, G, or B pixel values well above 0.25.  So saturation and wonky data are happening way early in the process.

Because the saturated cores are artificially well over 0.25 due to debayering, they limit the dynamic range available during stretching.  I want to limit the saturated cores while still linear so stretching can work with its optimum dynamic range.

I've tried the Repaired HSV Separation script, and it helps but does not work very well.  It still leaves saturated cores with artificial data.  It also puts a break in the image history since it creates a new image rather than updating the original.

I've also done HDRComposition using subs with a wide range of exposures from 100 s down to 3 s.  This works better, although there are still saturated cores at 3 s.  It's also a big pain.  The results are nice though: stars which simply show their color almost to the center of their core.

What I want is to merge the linear pixel data from just outside the saturated cores inward to fill the saturated core.  This would result in just about the same thing HDRComposition does, but would be a lot easier and less time-consuming.  It would also work with images where HDR short-exposure subs weren't taken.  I can't find any PI processes or scripts to do this, though, and can't think of a way to coax it out of the ones I find.
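To be concrete, here's a sketch of what I'm imagining (hypothetical, not an existing PI tool): repeatedly replace saturated pixels that touch at least one unsaturated neighbor with the mean of those neighbors, so values flood inward from the ring just outside the core.

```javascript
// Sketch of the wished-for core fill (illustration only): values from
// the ring just outside the saturated core propagate inward, one
// pixel layer per pass, until no saturated pixel with an unsaturated
// neighbor remains.
function fillCore(ch, sat = 0.25) {
  const h = ch.length, w = ch[0].length;
  let changed = true;
  while (changed) {
    changed = false;
    const next = ch.map(r => r.slice());
    for (let y = 0; y < h; y++)
      for (let x = 0; x < w; x++) {
        if (ch[y][x] < sat) continue;               // not saturated: keep
        let sum = 0, n = 0;
        for (const [dy, dx] of [[-1, 0], [1, 0], [0, -1], [0, 1]]) {
          const yy = y + dy, xx = x + dx;
          if (yy < 0 || yy >= h || xx < 0 || xx >= w) continue;
          if (ch[yy][xx] < sat) { sum += ch[yy][xx]; n++; }
        }
        if (n) { next[y][x] = sum / n; changed = true; }
      }
    ch = next;
  }
  return ch;
}

// 4x4 patch with a 2x2 saturated core:
const star = [
  [0.10, 0.10, 0.10, 0.10],
  [0.10, 0.90, 0.90, 0.10],
  [0.10, 0.90, 0.90, 0.10],
  [0.10, 0.10, 0.10, 0.10],
];
console.log(fillCore(star)[1][1] < 0.25); // true — core pulled below saturation
```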

Any ideas?  Thanks for any help.

General / So are my stars saturated or not?
« on: 2018 November 29 17:27:15 »
When I load my integrated images into PI and hover my mouse over the cores of the biggest, brightest stars, it looks like I have plenty of headroom.  50% - 80% maximum is typical for each of R, G, and B on the readout at the bottom of the screen.  The maximum pixel values in Statistics also show no more than about 80%.  When I look at the linear image there are typically a few dim dots for the brightest stars.

But when I try to stretch the linear image (with no other processing) it gets messy.  HistogramTransformation and STF stretches work fine, bringing the star cores up to full white with a natural falloff to the background.  But with ArcsinhStretch, MaskedStretch, and AdaptiveStretch there is a big off-colored blob in the middle of the brightest stars.  See the example photo attached.  You can see an off-colored blob in the linear image (the star halo is blue and the central blob is magenta) and this just gets brighter in the stretched image.
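My current guess at what's different (illustration only, assuming ArcsinhStretch preserves R:G:B ratios while HistogramTransformation-style stretches act per channel):

```javascript
// Toy comparison of a per-channel stretch vs a color-ratio-preserving
// stretch on a core pixel whose green channel is relatively low, so it
// looks magenta.  (Assumption: ArcsinhStretch preserves R:G:B ratios.)
const asinhStretch = (v, k = 500) => Math.asinh(k * v) / Math.asinh(k);
const core = { R: 0.90, G: 0.60, B: 0.90 }; // hypothetical magenta core

// Per-channel stretch: all three channels pushed toward 1.0 -> near white.
const perChannel = Object.fromEntries(
  Object.entries(core).map(([c, v]) => [c, asinhStretch(v)]));

// Ratio-preserving stretch: scale all channels by the same luminance
// gain, so the magenta cast survives as a bright colored blob.
const L = (core.R + core.G + core.B) / 3;
const gain = asinhStretch(L) / L;
const ratioPreserving = Object.fromEntries(
  Object.entries(core).map(([c, v]) => [c, Math.min(1, v * gain)]));

// The per-channel result is closer to neutral (G/R nearer 1):
console.log(perChannel.G / perChannel.R > ratioPreserving.G / ratioPreserving.R); // true
```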

So what's going on here?  Are my stars actually saturated?  If so, why doesn't that show up in the Readout or Statistics data, and why do some stretches handle the saturation just fine?

Bug Reports / Access violation installing modules
« on: 2018 November 28 18:29:25 »
I downloaded three .dlls from Carlos's web page: TGV-pxm, InterChannelCurves-pxm, and CMIntensity-pxm.  They all give me an access violation error when I do Process / Modules / Install Modules in PI on my Windows 10 computer (see attached image).

Can anyone tell me how to get around this?

General / Lumpy background extraction
« on: 2018 November 27 11:35:31 »
This is a recurring problem for me.  No matter what I do with ABE or DBE I end up with a lumpy chrominance background like the attachment.  To some extent the lumps are already present in the raw integrated data, prior to background extraction.

I've tried many different function degrees in ABE and DBE, a range of sampling density and box size, various other settings, and multiple passes with either/both ABE/DBE.  The lumps seem to move around a little but they don't flatten out.  I can't seem to get rid of them in processing after background extraction either (i.e. BackgroundNeutralization or large-scale chrominance noise reduction).

Can anyone recommend any tricks for this?
