Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - pfile

1
General / Re: can I download a second copy of PI
« on: Today at 20:48 »
yes, see section 2.6:

https://pixinsight.com/faq/index.html

rob

2
move your mouse over the format explorer tab on the left side of the screen and double-click the "FITS" module. then change the coordinate origin setting to the opposite of its current value (top-down vs. bottom-up) and click OK. then rewrite your FITS file and check it again in aladin.
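
if it helps to see what that setting changes, here's a rough python/astropy sketch (not anything PI does internally; the filename is made up) - the two origin conventions differ only by a vertical flip of the pixel rows:

import numpy as np
from astropy.io import fits

data = fits.getdata("myimage.fits")                            # made-up filename
flipped = np.flipud(data)                                      # same data under the opposite origin convention
fits.writeto("myimage_flipped.fits", flipped, overwrite=True)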

rob

3
General / Re: Debayering mystique
« on: 2019 July 17 11:15:41 »
ok - sorry then to rehash but from your initial post it seemed you didn't have a grasp on that...

rob

4
General / Re: Debayering mystique
« on: 2019 July 17 08:11:26 »
all sensors are monochrome sensors. that's just how CCDs work. as i tried to explain, you need filters to isolate the colors you are interested in. in the case of an OSC, the filters are permanently attached to the sensor, in a 2x2 square pattern, where 1 pixel has a red filter, 1 has a blue filter, and 2 have green filters. this 2x2 square pattern is called a "bayer matrix" after the guy at kodak who invented it.

to produce a 3-plane color image from an OSC camera, you have to debayer. debayering must be done on the calibrated light, which is still in bayer format. if you try to register the subs before debayering, you will destroy the 2x2 matrix and it will be impossible to debayer the image.

the right flow is: calibrate, debayer, register, integrate.
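
to see why registering first is fatal, here's a tiny numpy toy (nothing to do with PI's actual code) - even a simple 1-pixel shift puts red-filtered values where the debayer expects green ones, and real registration does sub-pixel interpolation, which is even worse:

import numpy as np

# label each pixel of a small sensor with the color of the filter sitting on top of it
h, w = 4, 4
cfa_colors = np.empty((h, w), dtype="U1")
cfa_colors[0::2, 0::2] = "R"
cfa_colors[0::2, 1::2] = "G"
cfa_colors[1::2, 0::2] = "G"
cfa_colors[1::2, 1::2] = "B"
print(cfa_colors)                           # the intact RGGB pattern

shifted = np.roll(cfa_colors, 1, axis=1)    # "register" the sub by shifting it one pixel
print(shifted)                              # the pattern no longer lines up with the RGGB positions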

rob

5
General / Re: Debayering mystique
« on: 2019 July 16 21:11:31 »
check this link for some understanding of what is going on

http://deepskystacker.free.fr/english/technical.htm#rawdecod

in short, a CCD device is sensitive to wavelengths of light from the ultraviolet to the infrared. in order to produce a color image compatible with human vision, you need filters in front of the sensor to isolate the wavelengths of light that the eye is sensitive to. in a one-shot color camera, there is a permanent set of R,G,G and B filters arranged in the checkerboard pattern you see in the link above. copying only the pixels belonging to a certain filter to one plane of an RGB image is what debayering does.
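
if it helps to see that in code, here is a minimal numpy sketch of the simplest ("superpixel") form of debayering - just an illustration, not PI's Debayer process, and 'cfa' is a made-up stand-in for a calibrated mosaic frame:

import numpy as np

cfa = np.random.rand(8, 8).astype(np.float32)   # stand-in for a calibrated RGGB mosaic

r  = cfa[0::2, 0::2]                 # pixels under the red filters
g1 = cfa[0::2, 1::2]                 # first set of green-filtered pixels
g2 = cfa[1::2, 0::2]                 # second set of green-filtered pixels
b  = cfa[1::2, 1::2]                 # pixels under the blue filters

rgb = np.dstack([r, (g1 + g2) / 2, b])   # half-resolution 3-plane color image
print(rgb.shape)                         # (4, 4, 3)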

rob

6
not yet... the only module that exists right now is for windows. however there is a standalone program called starnet++ that runs on macosx. you need some familiarity with the osx command line to use it though.

rob

7
there is a chance this is caused by apple's sub-optimal thermal design for these new macbooks. we've seen this before.

to rule that out, you can go to Edit > Global Preferences. when the global preferences window comes up, click on "Parallel Processing and Threads" in the left-hand side pane, and uncheck "Allow Using all available processors", then enter a number of processors which is less than the total number of threads your CPU supports. i think on that i9 there are no hyperthreads, so maybe put in 4 or 5 instead of 6, which is the total number of cores. after this, click the circular button at the bottom of the global preferences to apply your changes. if you're able to successfully run ImageIntegration a few times, the problem is probably thermal in nature.

rob

8
Gallery / Re: 2019 Solar eclipse from Punta Colorada, Chile
« on: 2019 July 10 15:43:54 »
i will take a look at that paper. i did actually do something like what you are describing - filling in the moon with the color of the corona right behind the moon. maybe making the whole moon a single value was a mistake. it did reduce the dark ringing but i still ended up with bright ringing right at the edge of the moon. although, if i remember right this may have come from the processing i did on the moon itself, which might have benefitted from the same trick.
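
for what it's worth, the fill i did was roughly like this numpy sketch (single channel only; the center, radius and array names are placeholders, not values from my actual image):

import numpy as np

def fill_moon(img, cy, cx, r):
    yy, xx = np.indices(img.shape)
    dist = np.hypot(yy - cy, xx - cx)
    disk = dist <= r                          # the lunar disk
    annulus = (dist > r) & (dist <= 1.1 * r)  # thin ring of corona just outside the limb
    out = img.copy()
    out[disk] = np.median(img[annulus])       # the "single value" fill described above
    return out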

rob



9
General / Re: best time to debayer?
« on: 2019 July 10 14:07:34 »
the reason to use bias frames is if you want to scale (or "optimize") your master dark.

the idea is this: the dark current in any sensor is a function of temperature and time. since we usually use cooled cameras, we can hold the temperature constant between the darks and the lights. it turns out the dark current is a linear function of time, so it is possible to linearly scale the dark before subtraction from the light when the dark and light durations do not match. so for instance you might sometimes do 20 minute lights for narrowband and sometimes 10 minute lights for LRGB. you can make a master dark out of 20 minute subs and then scale it to 10 minutes by multiplying it by 0.5 before subtraction, instead of maintaining two separate master darks.

but there is a problem - the bias signal is not a function of time. so if you want to scale your dark, first you have to subtract the bias signal, and then scale the dark. if you don't do this, the calibration will be completely wrong. so when optimizing darks you either have to load a master bias and tell PI to both calibrate and optimize the dark, or you need to make a master dark which has the bias signal pre-subtracted. in that case you *always* need to load the master bias (whether or not you optimize the dark) since the bias signal is missing from the master dark. in this case you need to be sure to never check "calibrate dark" as that will amount to a double bias subtraction, which is also bad.
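
the arithmetic in code form, in case that's clearer - just a numpy sketch of the idea, with made-up stand-in arrays, not PI's actual ImageCalibration code:

import numpy as np

light       = np.random.rand(4, 4).astype(np.float32)   # stand-ins so the sketch runs
master_dark = np.random.rand(4, 4).astype(np.float32)   # master dark *with* the bias still in it
master_bias = np.random.rand(4, 4).astype(np.float32)

k = 0.5   # e.g. scaling a 20 minute master dark down to a 10 minute light

# the bias has to come out of the dark before scaling, because bias does not scale with time
calibrated = light - master_bias - k * (master_dark - master_bias)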

as an aside, PI doesn't actually care about the dark or light durations. what it does is iteratively scale the dark by calibrating a small portion of the light with different dark scaling factors and choosing the scaling factor that minimizes the noise in the calibrated result.
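
the principle looks something like this brute-force numpy sketch (the function name and the noise estimate are mine; PI's optimizer is considerably smarter than this):

import numpy as np

def best_dark_scale(light_crop, dark_crop, factors=np.linspace(0.0, 2.0, 81)):
    # light_crop and dark_crop are small bias-subtracted sections of the frames
    def noise(img):
        return np.median(np.abs(img - np.median(img)))   # crude robust noise estimate
    return min(factors, key=lambda k: noise(light_crop - k * dark_crop))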

as a 2nd aside, this generally works for dedicated astro cameras. DSLRs play all sorts of dark current suppression tricks in the camera firmware and these tricks cannot be turned off, so sometimes dark optimization does not work well for DSLRs.

rob

10
Gallery / Re: 2019 Solar eclipse from Punta Colorada, Chile
« on: 2019 July 10 10:26:22 »
OK i'm glad to hear that the trouble i had wasn't "just me".

the large scale/small scale stuff seems very important. in my own images there is a huge low-frequency "halo" around the sun that i had trouble removing. when you look at images from druckmuller, these large-scale components are entirely gone. but i understand he and his team have developed lots of code for processing eclipse images so i am not surprised.
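
the basic large-scale removal idea can be sketched in a few lines of python/scipy - the sigma is an arbitrary made-up number here, and this is obviously nowhere near what druckmuller's tools actually do:

from scipy.ndimage import gaussian_filter

def remove_large_scale(img, sigma=100.0):
    large = gaussian_filter(img, sigma)   # low-frequency component (the halo)
    return img - large                    # what's left is the small-scale coronal detail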

in my own efforts i also ended up with a sort of ringing artifact around the moon which was undoubtedly caused by HDR compression, but my efforts to suppress it were not successful. i have been meaning to go back and try again but frankly it is a lot of work!

rob

11
Gallery / Re: 2019 Solar eclipse from Punta Colorada, Chile
« on: 2019 July 09 22:47:38 »
are there any background stars in your image? for the 2017 eclipse many of the brighter stars in leo were visible. regulus was especially apparent so i was able to use DynamicAlignment to hand-align my frames. automatic methods (FFTRegistration) tend to fail since they latch on to the dark disk of the moon, so you end up with a moon-aligned image.

carlos, that is a great image. did you have any problem with the overexposed areas in the HDR merge? i had a lot of posterization in my 2017 merge which i think may have been caused by the 14-bit data from the camera not filling up the entire i16 space... in other words similar to the "pink stars" problem you sometimes see with DSLRs. in the end i had to split the channels before doing the HDR merges to prevent the posterization.
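
just to put numbers on the "not filling up the entire i16 space" part, a quick numpy illustration (the array is a made-up stand-in for 14-bit camera data):

import numpy as np

print((2**14 - 1) / 65535.0)                  # 14-bit saturation only reaches ~25% of the i16 range
raw14 = np.random.randint(0, 2**14, size=(4, 4)).astype(np.float32)
rescaled = raw14 / (2**14 - 1)                # rescale so 14-bit full scale maps to 1.0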

rob

12
for any integrated image you should use a 32-bit sample format. xisf is superior to fits in a lot of ways and is the native file format for pixinsight, so it makes sense to use xisf.

rob

13
General / Re: RAW Modules Does Not Product DSLR Color Images
« on: 2019 July 06 11:55:49 »
what was the problem?

14
open the global preferences process, click on the directories and network section, and then specify one or more directories where you want to store swap files. then when done, click the global apply button (the circular button at the bottom of the window.)

the system temporary folders are a totally rational place to store swap files, but over the years it was discovered that sometimes windows (and macosx) will clean out these directories automatically, thus preventing you from saving your projects. so juan added this warning so that you will know you are at risk of this happening.

rob

15
General / Re: Empty Image? Not so much.
« on: 2019 July 04 14:01:38 »
yep, changing the gain and/or camera drivers will probably lead to problems like this. can you re-take the calibration frames using the same gain/driver as the new lights? (just a handful would work to test this)

rob
