Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - pk825

Pages: [1] 2
1
Anyway, yesterday I got some bin1 frames of (1093). The night had high clouds and the Moon, so the photos are terrible, but the asteroid is there and I think astrometry is possible.
I'm sharing the single frames and the integration frames (star-aligned and comet-aligned):
https://drive.google.com/open?id=1dhF9qkS7umc00gOx57IdG7FG18SJngc1
The comet-aligned one has elongated stars, and I could only plate-solve the image by setting the star detection sensitivity to minimum.

2
I have to check whether the frames have the correct date/time, because I think I fixed the time synchronization just after this acquisition. In that case there is an error of about 30 s.
I sent you 772 because it is the only one in the list you attached in the first post; if you want, I have frames of other asteroids, not in the list, with the correct date/time.

You say: It would be great if we had access to the unbinned images, since PSF parametrization, and hence centroid coordinates, would be more accurate with a better sampling.
But in my case, I'm not sure it is better to acquire images in bin1, because the seeing at the observatory site is not very good and the guiding error is about 0.7'' RMS, so bin2, with a sampling of 1.7''/pixel, seems the best choice. Do you suggest bin1 in my case as well?
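For what it's worth, the sampling figures above can be reproduced with a short calculation. This is only an illustrative sketch: the 7.4 µm pixel size (typical of the G2-4000's KAI-4022 sensor) and the ~1740 mm effective focal length (0.30 m aperture at f/5.8 with the reducer) are my assumptions, not values stated in the posts.

```javascript
// Image scale in arcsec/pixel for the setup described above (assumed values).
const PIXEL_UM = 7.4;   // physical pixel size, micrometers (assumed, KAI-4022)
const FOCAL_MM = 1740;  // effective focal length, millimeters (assumed, 0.30 m at f/5.8)

// scale ["/px] = 206.265 * pixel size [um] * binning / focal length [mm]
function imageScale(binning) {
  return 206.265 * PIXEL_UM * binning / FOCAL_MM;
}

console.log(imageScale(1).toFixed(2)); // ~0.88 "/px at bin1
console.log(imageScale(2).toFixed(2)); // ~1.75 "/px at bin2
```

With ~0.7'' RMS guiding and mediocre seeing, the bin2 scale of about 1.75''/px indeed matches the 1.7''/pix figure quoted above.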

3
Do you still need images? I have just taken some frames of 772 Tanete that I could share.
They were taken from a remote personal observatory with a G2-4000 at bin2 (https://sites.google.com/view/3zobservatory).
TEL 0.30-m f/8.0 Ritchey-Chretien + CCD + f/5.8 focal reducer
COM Long. 11 33 48 E, Lat. 42 28 48 N, Alt. 150m, Google Earth

Here is a link
https://drive.google.com/file/d/11KNRYMm_vNkSw8wKyGp_bUux3VMw56NR/view?usp=sharing
of the raw data and an integration frame calibrated with darks, biases and flats.
Just pay attention to the date/time in the FITS header, which is not correct (an ASCOM driver problem): the UTC is +2 h with respect to the real one, so you have to subtract two hours.
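As a purely illustrative sketch (the helper name is hypothetical, not part of any PixInsight API), the two-hour correction could be applied to a FITS DATE-OBS value like this:

```javascript
// Shift a FITS DATE-OBS value by a given number of hours to undo the
// ASCOM driver offset mentioned above. Hypothetical helper, for illustration.
function correctDateObs(dateObs, offsetHours) {
  const d = new Date(dateObs + "Z");            // treat the header value as UTC
  d.setUTCHours(d.getUTCHours() + offsetHours); // apply the correction (handles day rollover)
  return d.toISOString().slice(0, 19);          // back to FITS date/time format
}

console.log(correctDateObs("2019-06-01T02:30:00", -2));
// "2019-06-01T00:30:00"
```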



4
General / Re: Star detection
« on: 2019 June 10 10:32:40 »
Yes with the latest 1.8.6 versions of PixInsight:

Code:
#include <pjsr/StarDetector.jsh>

// Detect stars on the active image and write their positions, fluxes and
// celestial coordinates to a plain-text table.
var window = ImageWindow.activeWindow;
var S = new StarDetector;
var stars = S.stars( window.mainView.image );
var f = File.createFileForWriting( "/tmp/stars.txt" );
f.outTextLn( "Star      X        Y      Flux       R.A.         Dec.    " );
f.outTextLn( "===== ======== ======== ======== ============ ============" );
for ( let i = 0; i < stars.length; ++i )
{
   // Transform image coordinates to equatorial coordinates (degrees).
   let q = window.imageToCelestial( stars[i].pos );
   f.outTextLn( format( "%5d %8.2f %8.2f %8.3f %12.8f %+12.8f", i, stars[i].pos.x, stars[i].pos.y, stars[i].flux, q.x, q.y ) );
}
f.close();

Of course, this script requires the active image to have a valid astrometric solution. The computed right ascension and declination are ICRS/J2000.0 coordinates, in degrees, for the barycenter of each detected star.

It is perfect: I got all the data from the image! Thank you very much.
I have just modified the output format to obtain a CSV file.
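For reference, here is a minimal sketch of that kind of modification (the helper name is hypothetical): build each output line as comma-separated values instead of fixed-width columns, and write it with f.outTextLn as in the original script.

```javascript
// Hypothetical helper: format one detected star as a CSV line.
// 'star' has { pos: {x, y}, flux } as returned by StarDetector.stars(),
// 'q' has { x: RA, y: Dec } as returned by ImageWindow.imageToCelestial().
function starToCsvLine(index, star, q) {
  return [index,
          star.pos.x.toFixed(2), star.pos.y.toFixed(2),
          star.flux.toFixed(3),
          q.x.toFixed(8), q.y.toFixed(8)].join(",");
}

// Example with plain objects mimicking the script's data:
const line = starToCsvLine(0, { pos: { x: 102.5, y: 87.25 }, flux: 1.234 },
                           { x: 184.70554321, y: 47.30412345 });
console.log(line); // "0,102.50,87.25,1.234,184.70554321,47.30412345"
```

In the script above one would then replace the format(...) call with f.outTextLn( starToCsvLine( i, stars[i], q ) ) and a comma-separated header line.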

5
General / Re: Star detection
« on: 2019 June 09 02:42:08 »
You'll have to use JavaScript to do this. The StarDetector object is the best option. To get a first grasp of this object, here is an example that will create a mask with all stars detected on the current image:

Code:
#include <pjsr/StarDetector.jsh>

var S = new StarDetector;
S.test( ImageWindow.activeWindow.mainView.image, true/*createStarmask*/ );

Activate the generated mask on the image and inspect it. You'll see a circle drawn centered on each detected star.

A more interesting example:

Code:
#include <pjsr/StarDetector.jsh>

var S = new StarDetector;
var stars = S.stars( ImageWindow.activeWindow.mainView.image );
var f = File.createFileForWriting( "/tmp/stars.txt" );
for ( let i = 0; i < stars.length; ++i )
   f.outTextLn( format( "%5d %8.2f %8.2f %8.3f", i, stars[i].pos.x, stars[i].pos.y, stars[i].flux ) );
f.close();

This example will create a new text file where each line corresponds to a detected star. The columns are: star index, X coordinate, Y coordinate, and star flux (accumulated pixel data on the star detection region). Obviously, star detection works best on linear images.

You can also use the StarAlignment process, although it doesn't provide information on star fluxes. The StarAlignment.outputData property contains relevant data for all stars used for alignment after a successful execution of the StarAlignment process. Here is a schematic example:

Code:
#define NUMBER_OF_PAIR_MATCHES    2
#define REFERENCE_X              29
#define REFERENCE_Y              30
#define TARGET_X                 31
#define TARGET_Y                 32

   // 'view' is the View object of the target image to be registered.
   let SA = new StarAlignment;
   if ( SA.executeOn( view ) )
   {
      let stars = [];
      let n = SA.outputData[0][NUMBER_OF_PAIR_MATCHES];
      for ( let i = 0; i < n; ++i )
         stars.push( { refX:SA.outputData[0][REFERENCE_X][i],
                       refY:SA.outputData[0][REFERENCE_Y][i],
                       tgtX:SA.outputData[0][TARGET_X][i],
                       tgtY:SA.outputData[0][TARGET_Y][i] } );
   }

This would create an array of objects, where refX,refY are the image coordinates of a reference star and tgtX,tgtY are the corresponding coordinates of the same star matched on the target image (the image in the 'view' object in the example above). Coordinates are in pixels, where 0,0 is the top left corner of the pixel at the top left corner of the image. As noted before, StarAlignment does not provide star fluxes.

Let me know if this helps.

Great code, it's nearly perfect for my needs. Is it possible to also get the RA and Dec coordinates?

6
General / Re: Average date time of integration frame
« on: 2019 June 06 22:18:22 »
In the integration frame, PixInsight puts the average time of the integrated frames in the header. But, if I have analyzed it correctly, it uses the average of the start date/time of the first frame and the start date/time of the last frame. For my astrometry analysis, instead, the correct value is the average of the start date/time of the first frame and the end date/time of the last frame.
Is it possible to have this information in the header?

Your analysis is correct: ImageIntegration computes the average acquisition time from exposure start times, ignoring exposure durations. A difference of a few minutes (at most; usually much smaller) is completely irrelevant for the calculation of star proper motions, so this cannot have any impact on astrometric solutions computed for the integrated image.

However, for ephemeris calculations the difference can be significant at the arcsecond level, which may be important for measuring object positions on the integrated image. I'll change the way this metadata is generated in the next version of the ImageIntegration tool: it will compute the midpoint between the first starting time and the last starting time plus its corresponding exposure time, or, if the DATE-END keyword is present (which unfortunately happens rarely), use its value instead.
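A minimal sketch of that corrected midpoint computation (illustrative only, not the actual ImageIntegration implementation; function and parameter names are hypothetical):

```javascript
// Midpoint between the first exposure start and the last exposure end
// (last start time plus its exposure duration), all treated as UTC.
function integrationMidpoint(firstStartISO, lastStartISO, lastExposureSec) {
  const t0 = Date.parse(firstStartISO + "Z");                   // first start, ms
  const t1 = Date.parse(lastStartISO + "Z") + lastExposureSec * 1000; // last end, ms
  return new Date((t0 + t1) / 2).toISOString().slice(0, 19);
}

// One hour between starts, 480 s last exposure -> midpoint at 00:34:00.
console.log(integrationMidpoint("2019-06-01T00:00:00",
                                "2019-06-01T01:00:00", 480));
// "2019-06-01T00:34:00"
```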

Thank you for pointing out this issue. I'll try to release an update to the ImageIntegration module with these changes implemented as soon as possible.

Thank you very much. I need it for asteroid astrometry analysis, and an automatic way to get this information will help me avoid mistakes.
PixInsight is a great piece of software and your work is great :D

7
General / Average date time of integration frame
« on: 2019 June 06 04:23:28 »
In the integration frame, PixInsight puts the average time of the integrated frames in the header. But, if I have analyzed it correctly, it uses the average of the start date/time of the first frame and the start date/time of the last frame. For my astrometry analysis, instead, the correct value is the average of the start date/time of the first frame and the end date/time of the last frame.
Is it possible to have this information in the header?

8
New Scripts and Modules / Re: Subframe Selector PCL Module
« on: 2019 June 05 07:40:35 »
Thanks Brian,
I can confirm the issue is with the Weighting Expression, not the Approval Expression, although that one can display the X for some reason.

The 'plain' variables such as FWHM and Eccentricity only represent the current subframe being measured. So, Math.min(FWHM) and Math.max(FWHM) will return the same thing as FWHM, and subtracting these values will result in 0. The 'nan' error appears when it's dividing by 0.

The simple fix here is to use the 'special' variables such as FWHMMax and FWHMMin which are calculated before each subframe and represent the global max and min of all subframe FWHMs. Here's that expression modified to work for me:

Code:
35*(1-(FWHM - FWHMMin) / (FWHMMax - FWHMMin)) + 7*(1-(Eccentricity - EccentricityMin) / (EccentricityMax - EccentricityMin)) + 18*((SNRWeight - SNRWeightMin) / (SNRWeightMax - SNRWeightMin)) + 40

That said, I will definitely look into 'nicer' or more obvious messages when 'nan's are involved, and try to make the X's work better in these cases.
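To see why the normalization behaves this way, here is a plain-JavaScript check of the expression above (illustrative only; inside SubframeSelector the Min/Max variables are provided automatically and no such function is needed):

```javascript
// Map a value into [0, 1] across the measured range of all subframes.
function norm(v, min, max) { return (v - min) / (max - min); }

// The weighting expression above: lower FWHM and eccentricity are better
// (hence the 1 - norm terms), higher SNR is better, plus a floor of 40.
function weight(m) {
  return 35 * (1 - norm(m.FWHM, m.FWHMMin, m.FWHMMax))
       + 7 * (1 - norm(m.Eccentricity, m.EccentricityMin, m.EccentricityMax))
       + 18 * norm(m.SNRWeight, m.SNRWeightMin, m.SNRWeightMax)
       + 40;
}

// Best possible subframe (sharpest, roundest, highest SNR) -> 35+7+18+40 = 100.
console.log(weight({ FWHM: 2.0, FWHMMin: 2.0, FWHMMax: 4.0,
                     Eccentricity: 0.3, EccentricityMin: 0.3, EccentricityMax: 0.6,
                     SNRWeight: 12, SNRWeightMin: 8, SNRWeightMax: 12 })); // 100
```

The worst subframe in the set gets exactly 40, which is why the division only blows up ('nan') when all subframes share the same FWHM, eccentricity, or SNR.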

Could you help me understand whether it is possible to get the min or max of variables such as FWHM, Eccentricity or SNRWeight over the analysed frames?

I understood how it works: I have to use the variables FWHMMax and FWHMMin.

9
New Scripts and Modules / Re: Subframe Selector PCL Module
« on: 2019 June 04 01:51:33 »
Thanks Brian,
I can confirm the issue is with the Weighting Expression, not the Approval Expression, although that one can display the X for some reason.

The 'plain' variables such as FWHM and Eccentricity only represent the current subframe being measured. So, Math.min(FWHM) and Math.max(FWHM) will return the same thing as FWHM, and subtracting these values will result in 0. The 'nan' error appears when it's dividing by 0.

The simple fix here is to use the 'special' variables such as FWHMMax and FWHMMin which are calculated before each subframe and represent the global max and min of all subframe FWHMs. Here's that expression modified to work for me:

Code:
35*(1-(FWHM - FWHMMin) / (FWHMMax - FWHMMin)) + 7*(1-(Eccentricity - EccentricityMin) / (EccentricityMax - EccentricityMin)) + 18*((SNRWeight - SNRWeightMin) / (SNRWeightMax - SNRWeightMin)) + 40

That said, I will definitely look into 'nicer' or more obvious messages when 'nan's are involved, and try to make the X's work better in these cases.

Could you help me understand whether it is possible to get the min or max of variables such as FWHM, Eccentricity or SNRWeight over the analysed frames?

10
Thank you for your good tutorial.
Just a question: why do you use an aggregated background image with stars for BackgroundNeutralization? I suppose it should contain no stars or deep-sky objects.

11
Image Processing Challenges / m106 color problem
« on: 2014 May 27 23:30:59 »
Hi all, I have a lot of problems calibrating the color and improving the center of an integration of 15 lights of 8 minutes of M106.
https://www.dropbox.com/s/rlzoj7zh0dxwqj7/L_m106_Bin1x1_480s_2014-05-25_00-47-02__-16C_c_d_r_i_work03.fit
The link points to the image with BN, DBE, deconvolution, MLT (for noise reduction) and MaskedStretch applied.
Could you help me enhance the photo?

12
Thank you bitli for your hints.
It isn't a problem to crop the image and darken the background to avoid problems.
I can't find the attached process; could you add it again?

13
Hi all,
I have a lot of problems with DBE on my M81/M82 photo. It is a two-hour integration from 60 lights of 120 s at ISO 800, with darks, biases and flats. All the frames look good.
Could you help me flatten the image with DBE?
Here is the integrated photo:
https://www.dropbox.com/s/qfshun54yu48xef/light-BINNING_1.fit

14
General / Re: Color balance DSLR e batch preprocessing
« on: 2014 February 09 10:40:12 »
Thank you, it is clear.
I confirm that with DBE the colors are correct.

15
General / Re: Color balance DSLR e batch preprocessing
« on: 2014 February 09 09:30:43 »
Thank you for the quick answer. But my question is: why does PixInsight produce a blue cast while DSS gives more natural colors? I perform the same steps in both programs: darks, flats, biases, align and combine.
