About the new MARS project, how to contribute?

Hi

I am interested in contributing as well, also in NZ (Wellington) under Bortle 4 skies, using OSC (ASI2600MC-Pro). I have a number of images I can upload (where?).
 
The website and upload function will be available in the coming weeks, as explained in the article. It is a bit long, but very much worth reading; it is very informative.

CS Gerrit
 
Hello,

I think this is a really brilliant and exciting project, I am very much looking forward to the first results from this!

I have some integrations from last year, taken under Bortle 3 skies with a 50 mm lens and an APS-C camera. Maybe they are good enough to contribute; I will give it a try as soon as possible. They cover fields in the northern winter constellations (Orion, Taurus, etc.).

Just some points that come to my mind:
- How are gradients in the 35 mm images handled? These will be the initial reference for all the other images, so any gradients still present in them could propagate to all the other images. Maybe all-sky images taken under a truly dark sky are a solution for correcting gradients in these initial wide-field images.
- The MARS-pi survey will be captured with 1 h/2 h of integration per field. How will this play out when normalizing gradients in images with very high SNR (>4 h of integration under similar conditions)? Will the gradient normalization of our images be limited by the depth of the MARS survey, or is such a limit only of theoretical nature?
- For the sake of this example, let's assume I'm imaging with an unmodified DSLR. All the images from the MARS survey will be captured with Ha-sensitive cameras (at least that's what I assume). Will normalizing my images against the survey introduce the brightness variations caused by Ha nebulae into my images? I'm asking because the same principle (although with much less intensity) will probably apply to all differences in spectral sensitivity.
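To make the gradient question more concrete, here is a toy sketch (pure speculation on my part, in Python with NumPy; this has nothing to do with the actual MARS pipeline) of how normalizing an image against a gradient-free reference could work in principle: fit a smooth model to the ratio between the image and the reference, then divide it out.

```python
import numpy as np

def correct_gradient(image, reference):
    """Fit a smooth multiplicative model (here just a plane a + b*x + c*y)
    to image/reference and divide it out. Toy illustration only."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    ratio = image / reference
    # Design matrix for a first-order polynomial surface
    A = np.stack([np.ones(h * w), xx.ravel(), yy.ravel()], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, ratio.ravel(), rcond=None)
    model = (A @ coeffs).reshape(h, w)
    return image / model

# Synthetic demo: a flat "sky" plus a linear left-to-right gradient
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
reference = np.full((h, w), 100.0)   # gradient-free reference field
gradient = 1.0 + 0.5 * xx / w        # 50% gradient across the frame
image = reference * gradient
corrected = correct_gradient(image, reference)
print(np.allclose(corrected, reference))  # True: gradient removed
```

Of course, a real pipeline would have to be far more robust (outlier rejection, resolution matching, photometric scaling), which is exactly what my questions above are about.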

CS Gerrit

Hi,

As you say, in a survey like this you always need some top-quality data acquired under completely controlled conditions. That's why we are going to do the MARS-pi survey. The user-contributed data is very important and will help increase the SNR of the final data set, but we also need data sets acquired by ourselves.

In our experiments, we have concluded that you don't need very long exposures to achieve a good gradient correction (as shown in the gradient correction article). But this will be improved over time. We don't plan to remove the telescope. :-)

As for unmodified DSLR data, we can handle these images as well. Everything is welcome and everything is in my head. :-)


Best regards,
Vicent.
 
Hi,

Thank you all so much for your interest. We are working as fast as we can to get the website up and running as soon as possible.

Best regards,
Vicent.
 
Hi Vicent,

thanks for your reply, that's good to know!

"As for unmodified DSLR data, we can handle these images as well."
All my images are captured with modified DSLRs; my question was the other way around.

Let's assume the tool is ready and all the survey data have been captured. And let's also assume that I capture my images (just for the sake of the example) using an unmodified DSLR. Will correcting the gradients in my image introduce the "gradients" caused by Ha emission that is present in the survey images but not in my own? Or is this somehow corrected for? Admittedly, this is an extreme example, but I think the same principle would apply to other differences in spectral response between different equipment. I hope that makes my question somewhat clearer.
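Here is a toy 1-D example of what I mean (hypothetical numbers, Python with NumPy; this is my own illustration, not the actual MARS method). If the survey reference contains Ha flux that an unmodified DSLR does not record, a naive normalization against the survey would carve a dip into the DSLR data:

```python
import numpy as np

# A row of survey pixels with an Ha-bright patch in the middle,
# and the same row as seen by an unmodified DSLR (no Ha response).
survey = np.array([100.0, 100.0, 140.0, 140.0, 100.0])  # Ha adds 40
dslr   = np.array([100.0, 100.0, 100.0, 100.0, 100.0])  # flat

# Naive normalization treats the Ha patch as a "gradient"
# and divides it out, depressing the DSLR data there:
corrected = dslr / (survey / survey.mean())
print(corrected)  # the middle pixels end up darker than the edges
```

That is the effect I am worried about when the spectral responses of the survey cameras and the contributor's camera differ.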

CS Gerrit
 
Hi,

I have some ideas we'll implement to make this effect as small as possible. In the end, we'll need a minimum of two databases, since there are a lot of people taking pictures with unmodified DSLR cameras.


Best regards,
Vicent.
 
I don't have anything against ML, as there are research directions that aim to respect physics (see physics-informed neural networks), and I am open to adopting such solutions, especially if they are developed by Pleiades.

My concern is about others (not Pleiades) who could potentially use PI to assemble training sets for their "AI" products. This malpractice is already a threat: for example, someone could use SPCC to create training sets for a commercial, standalone "AI" color correction tool/product outside the PI ecosystem. I am far from being an expert in intellectual property, but I suspect that neither the PI license nor the Gaia DR3/SP license allows such practices.

I have nothing against machine learning when correctly applied for legitimate purposes. For example, ML is being used to purify the Gaia catalog by detecting nonstellar objects. This is an excellent example of sound—and remarkably successful—ML usage in science.

Professional software developers and software development companies are basically defenseless against the practices you are describing. Nothing available to us can effectively prevent this type of malpractice, and small companies like ours cannot afford the necessary legal resources.

The only ones who can defend us are our users, both current and potential. On one hand, by not using this type of "derived product" generated through fraudulent activities. On the other hand, by realizing that astronomical imaging is scientific/documentary imaging, where methodologies based on observational data and respectful of the acquired data are the only valid ones. This is clear in the example you mentioned: color calibration. The same applies to the project we are working on: gradient modeling and correction.
 
G'day all.
I'm in a Bortle 4 area of AU. Just a question on drizzle data that I can't get my head around. I use drizzle x1 in WeightedBatchPreprocessing because that's what I watched in the PI videos. Reading a previous post, would I still drizzle that data in the DrizzleIntegration tool as explained? I have been using PI for a year and am still a tad new. I have some good mono data that may be OK, which I have taken with a ZWO 2600. Ta
 
Hi! Excuse me, but I do not understand how to contribute master lights. I have some captures of different objects from the southern hemisphere, some under Bortle 2 and some under Bortle 4 skies.
I appreciate comments.
federico
 
Hi Vicent & Juan,
First of all congratulations for this new (and really necessary) project!!
I have a certain amount of data to contribute, taken from my observatory in Àger (Bortle 3) with 530 mm and 1070 mm focal lengths and monochrome CCD cameras (9 µm pixels).
I can roughly group my data in two categories:
1) Images taken in my first period of astrophoto activity, from 2008 to 2015 (*).
For these images, I can provide master lights without problem, BUT they are:
- Not drizzled
- In FITS format
Are images under these conditions still useful for the project?

2) Images taken in my second period, from 2020 until now (**). These images are already in XISF format, and some of them are drizzled.
I understand that these images are OK.
Best regards
Jordi Gallego
(*) http://www.astrosurf.com/jordigallego/album/Files_dark_sky_SBIG/index.html
(**) http://www.astrosurf.com/jordigallego/album/Files_dark_sky_SBIG_remote/index.html
 
Hi Jordi, and everybody willing to contribute to the MARS project.

I am working on a dedicated MARS website that will include a form to upload files to our servers, instructions about the requirements for contributed data, and information on project progress and technical details. So I ask for a bit of patience. We appreciate your support and willingness to contribute. This is a fundamental project that will change many essential things in astronomical imaging.
 