About the new MARS project, how to contribute?

Marcelofig

Well-known member
Thank you for this project; I will be watching and am happy to contribute.

But I have a couple of questions: what kind of data exactly do you need?

- What camera, mono, color, both, cooled or not?
- What filters, narrowband, broadband?
- Images with some calibration (darks and flats) or pure .fits?
- Does light pollution matter?
- Any sensor size preference? Will a small one like the 533 work, for example?

That, for now :D
 
Hi Marcelo,

Thank you so much! As described on the MARS webpage, your contributed images must meet the following requirements:
  • The images can be acquired with mono or color cameras (RGB bands). In the case of mono cameras, we also ask for narrowband H-alpha, [O-III], and [S-II] images.
  • In the case of broadband images, they should have been acquired under a sky with a minimum darkness quality (Bortle scale 1 to 4).
  • The images must be completely preprocessed master lights with no additional processing. We need your original masters in XISF format without any gradient correction.
  • The master lights must be generated with drizzle (even if you use drizzle x1). This is especially important for color cameras, since drizzle preserves the validity of the stellar photometry.
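If it helps as a quick self-check before uploading, here is a small, purely illustrative Python sketch (not an official MARS tool; it assumes astropy is installed and that your capture or stacking software writes common FITS keywords such as FILTER and INSTRUME, which varies between programs) that lists the basic properties of a candidate master:

```python
# Minimal pre-submission check for a candidate master light (illustrative only).
# Keyword names vary between capture/stacking programs; adapt as needed.
from astropy.io import fits
import numpy as np

def summarize_master(path):
    with fits.open(path) as hdul:
        header = hdul[0].header
        data = hdul[0].data.astype(np.float64)  # copy the pixel data

    # Report the keywords most relevant to a submission, if present.
    for key in ("INSTRUME", "FILTER", "EXPTIME", "XBINNING", "FOCALLEN", "BAYERPAT"):
        print(f"{key:>9}: {header.get(key, 'not in header')}")

    # A linear, unstretched master typically has a low median relative to its peak.
    print(f"    shape: {data.shape}")
    print(f"   median: {np.median(data):.6g}")
    print(f"      max: {np.max(data):.6g}")

summarize_master("master_light_Ha.fits")  # hypothetical file name
```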

Best regards,
Vicent.
 
OK, to avoid future confusion I just uploaded* to my Google Drive my latest H-alpha master light of the Tarantula Nebula (NGC 2070, southern hemisphere :D, Bortle 8, Askar 107 PHQ, ASI 533MM), processed only with WBPP, including LN.



Is this the kind of data they need?


*I guess for the official image collection there will be a better way to send the images.
 
Thank you very much for your data. It is better for us to have wider fields of view, but your data is very welcome.
 
Hi
Unfortunately I will not be able to contribute, as I live in a Bortle 6+ area, but I wanted to say I always hoped this could be done this way, and it will be a huge step forward, particularly for those of us who live in the orange goo.
Really exciting stuff!

Harry
 
Hi Vicent

I live in NZ in a Bortle 2-3

I think we can help here for the southern hemisphere

I can do NB / broadband too

Was there a specific field width you need ... I think it was 3-50 degrees?

Do you need any particular combinations of image width and pixel scale?

thanks

Simon

Greendale Observatory, Greendale, NZ
 
Hi,

I live in a Bortle 2 area in NZ. I have taken over 150 ZWO 2400MC images of 1-1.5 hr each with a Nikon 200mm f/2 [10 x 7 deg field size] on 7.5 x 5 deg field centres. The data covers about 25% of the southern sky at 6 arcsec resolution, with an SNR of ~40 at 22 mag/arcsec^2.
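If anyone wants to check whether their own lens and camera land in a similar field size, here is a rough back-of-the-envelope sketch; the full-frame dimensions and 5.94 µm pixel size used below are my assumed values for the 2400MC:

```python
# Rough field-of-view and pixel-scale check (illustrative; assumed sensor specs).
import math

focal_mm = 200.0                        # Nikon 200mm f/2
sensor_w_mm, sensor_h_mm = 36.0, 24.0   # assumed full-frame sensor dimensions
pixel_um = 5.94                         # assumed ASI2400MC pixel size

# Field of view in degrees from the small-angle geometry of the sensor.
fov_w = math.degrees(2 * math.atan(sensor_w_mm / (2 * focal_mm)))
fov_h = math.degrees(2 * math.atan(sensor_h_mm / (2 * focal_mm)))

# Pixel scale in arcsec/px: 206.265 * pixel size (um) / focal length (mm).
scale = 206.265 * pixel_um / focal_mm

print(f"FOV ~ {fov_w:.1f} x {fov_h:.1f} deg, pixel scale ~ {scale:.1f} arcsec/px")
# -> roughly 10.3 x 6.9 deg at about 6.1 arcsec/px
```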

All data have been processed by WBPP [max quality], with no ABE/DBE applied. This is quite a large volume of data, but it's all sitting in a Dropbox account. I am happy to give you access.

It is part of an attempt to cover the whole sky, initiated as part of an Astrobin community survey. Unfortunately there are only a few active participants, but they will probably be happy to share their data too. In total we cover about 40% of the southern sky and aim to have a complete first pass by July next year.

Full survey field list and observations to date can be found here:


Brian

Professor Brian J Boyle
Antimony Observatory and Vineyard
Gibbston,
Queenstown RD1
New Zealand.
 
That is amazing! I have made a thread on Astrobin to let everyone know. I'm from Australia, so hopefully I can get a few peeps to contribute. Do the original masters really need to be in XISF? Not everyone in my crew of astrophotographers uses PixInsight for stacking.
 
Hi @bsteeve,
Thank you for your interest. Images in FITS format are also welcome.

Best regards
 
Hi all,

Thank you so much for your interest. Certainly, your contribution would be very welcome, since we are more limited in acquiring our own data in the southern hemisphere. Any field of view in the range we specify is needed, though by design the critical pieces here are the wider fields, since they are the seed for a global correction towards longer focal lengths.

The survey data you are collecting on Astrobin would be a very valuable contribution to our project.

We also want to thank the community for this enthusiastic reception.


Best regards,
Vicent.
 
Hi, a quick question to Vicent et al. I have a few data sets taken with my ASI6200 mono and Canon EF 200mm lens that fit the criteria for field of view, etc. The FOV is 10d x 7d in this case. I'm also at a Bortle 3 site in the southern hemisphere, which helps. My question relates to the drizzle integration requirements. I have the masters that were prepared by the WBPP script, which come with the requisite *.xdrz files. To supply these masters as drizzle x1, do I need to run the registered files through the drizzle integration process to produce new masters before I post them to a Dropbox folder?

TIA,
Rodney
 
Hello MARS team!

I am interested in contributing my northern hemisphere data, obtained with a 135 mm lens and a crop-sensor DSLR. I visually rate my sky as Bortle 5, but I have a feeling that it might be lower than that. Most data have been taken with care to avoid local LP sources and with the objects as high as possible in the local sky.

I have some questions:
  • The data have been manually calibrated and integrated without local normalization by using older versions of PI. Is this acceptable?
  • What about the licensing of the data? I don't want my dataset being part of some closed-source machine learning training dataset from someone outside Pleiades.
  • If possible, I would like to have some feedback regarding the quality of my submissions in order to improve my acquisition procedure for any future submissions.
  • Are you planning to publish your efforts in a scientific journal? If not, please consider doing so!
Thanks, and good luck with your project!
 
Hi, I am interested in contributing if possible. I live in Australia under a Bortle 2 sky. Is there guidance on areas of the sky you especially need data for? Are there areas that you do not need more data for? Cheers
 
Hi, first of all, thank you so much! We are just starting this project, so any data is welcome. We can only cover the northern hemisphere (plus a band down to about 25 degrees south in good conditions) with the observing station we are installing in Spain (MARS-pi), so all contributions of southern data are fantastic.
 
Hello MARS team!

I am interested in contributing my northern hemisphere data, obtained with a 135 mm lens and a crop-sensor DSLR. I visually rate my sky as Bortle 5, but I have a feeling that it might be lower than that. Most data have been taken with care to avoid local LP sources and with the objects as high as possible in the local sky.

Great! Thank you!

  • The data have been manually calibrated and integrated without local normalization by using older versions of PI. Is this acceptable?

Yes, there should be no problems.

  • What about the licensing of the data? I don't want my dataset being part of some closed-source machine learning training dataset from someone outside Pleiades.

Rest assured that the data used to build the MARS databases will be used exclusively by our team and staff. Nobody outside Pleiades Astrophoto will have access to it. MARS is a project based exclusively on observational data. We are against using generative "AI" techniques for gradient correction, so nobody will use either the raw data or the final MARS databases for these purposes.

However, I want to be crystal clear about this project from the very beginning: the MARS databases and the tools that will use them will be closed-source and exclusive to the PixInsight platform. We are investing significant intellectual and economic resources in this project and will invest much more in the future.

  • If possible, I would like to have some feedback regarding the quality of my submissions in order to improve my acquisition procedure for any future submissions.

Of course. We'll try to inform all contributors about our use of their data. Many aspects of this project are still under development, and we are now working on a dedicated MARS website, where we'll provide up-to-date (or even real-time) detailed information.

  • Are you planning to publish your efforts in a scientific journal? If not, please consider doing so!

We have yet to think about this, but it's always possible. Thank you for pointing this out.

Thanks, and good luck with your project!

Thank you so much!
 
Hi, a quick question to Vicent et al. I have a few data sets taken with my ASI6200 mono and Canon EF 200mm lens that fit the criteria for field of view, etc. The FOV is 10d x 7d in this case. I'm also at a Bortle 3 site in the southern hemisphere, which helps. My question relates to the drizzle integration requirements. I have the masters that were prepared by the WBPP script, which come with the requisite *.xdrz files. To supply these masters as drizzle x1, do I need to run the registered files through the drizzle integration process to produce new masters before I post them to a Dropbox folder?

TIA,
Rodney

Hi Rodney, thank you for your interest!

We ask for drizzle-integrated data because we need master images generated without any pixel interpolation. This is important in all cases because pixel interpolation generates aliasing artifacts, which are large-scale structures that can potentially contaminate the survey. Aliasing also degrades stellar photometry, which we will apply (using Gaia photometric and spectral data) to normalize all the data. This is particularly critical for color images (DSLR, OSC) because demosaicing artifacts can have a very negative impact on photometry. In other words, we need data as pure as possible, and currently only drizzle can guarantee this.

To apply a drizzle integration, just load the .xdrz files with the DrizzleIntegration tool and set Scale = 1. Ensure the original calibrated frames (before registration) are still in the same folders as when WBPP generated the .xdrz files; otherwise, open the Format Hints section and select the appropriate input directory. Apply the process globally (F6, or the blue circle) and wait. Let us know if you need further help with this.
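If a conceptual picture helps, the toy sketch below (plain NumPy, purely illustrative, and not the PixInsight implementation) shows the core idea of a scale = 1, pixfrac = 1 drizzle: each input pixel's flux is deposited onto the output grid in proportion to geometric overlap, so no new pixel values are ever created by interpolation.

```python
# Toy drizzle (scale = 1, pixfrac = 1) for frames related by pure translation.
# Illustrative only: real drizzle handles full geometric transforms, CFA data, etc.
import numpy as np

def drizzle_add(frame, dx, dy, flux, weight):
    """Deposit one registered frame onto the output accumulators.

    frame  : 2-D array of pixel values.
    dx, dy : sub-pixel offset of this frame relative to the output grid.
    flux, weight : accumulator arrays with the output image shape.
    """
    h, w = frame.shape
    for i in range(h):
        for j in range(w):
            # The input pixel covers the unit square [x0, x0+1) x [y0, y0+1)
            # in output coordinates; split its flux by overlap area.
            x0, y0 = j + dx, i + dy
            jx, iy = int(np.floor(x0)), int(np.floor(y0))
            fx, fy = x0 - jx, y0 - iy
            for oi, oj, area in (
                (iy,     jx,     (1 - fx) * (1 - fy)),
                (iy,     jx + 1, fx * (1 - fy)),
                (iy + 1, jx,     (1 - fx) * fy),
                (iy + 1, jx + 1, fx * fy),
            ):
                if area > 0 and 0 <= oi < flux.shape[0] and 0 <= oj < flux.shape[1]:
                    flux[oi, oj] += frame[i, j] * area
                    weight[oi, oj] += area

# Example: combine two synthetic frames shifted by a fraction of a pixel.
out_shape = (64, 64)
flux = np.zeros(out_shape)
weight = np.zeros(out_shape)
rng = np.random.default_rng(0)
for dx, dy in [(0.0, 0.0), (0.4, 0.7)]:
    drizzle_add(rng.normal(100.0, 5.0, out_shape), dx, dy, flux, weight)

# The master is the flux-conserving, weight-normalized combination.
master = np.where(weight > 0, flux / np.maximum(weight, 1e-12), 0.0)
```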
 
Excellent, thanks for the heads-up! I will run the new process as soon as I get a chance. I have moved the folders since I did the preprocessing, so I will need to spend a moment assigning the appropriate directories.

Cheers,
Rodney
 
Hello,

I think this is a really brilliant and exciting project, I am very much looking forward to the first results from this!

I have some integrations from last year taken under Bortle 3 with a 50 mm lens and an APS-C camera. Maybe they are good enough to contribute; I will give it a try as soon as possible. They cover fields in the northern winter constellations (Orion, Taurus, etc.).

Just some points that come to my mind:
- How are gradients in the 35 mm images handled? These will be the initial reference for all the other images, so any gradients still present in them could influence all the other images. Maybe all-sky images from a truly dark sky are a solution for correcting gradients in these initial large-field images.
- The MARS-pi survey will be captured with 1-2 h of integration per field. How will this play out when normalizing gradients in images with very high SNR (>4 h of integration under similar conditions)? Will the gradient normalization in our images be limited by the depth of the MARS survey, or is such a limit only theoretical?
- For the sake of this example, let's assume I'm imaging with an unmodified DSLR. All the images from the MARS survey will be captured with an Ha-sensitive camera (at least that's what I assume). Will normalizing my images against the survey introduce the brightness variations caused by Ha nebulae into my images? I'm asking because the same principle (although with much lower intensity) will probably apply to all differences in spectral sensitivity.

CS Gerrit
 
Hello Juan, and thank you for your detailed answers.

Rest assured that the data used to build the MARS databases will be used exclusively by our team and staff. Nobody outside Pleiades Astrophoto will have access to it. MARS is a project based exclusively on observational data. We are against using generative "AI" techniques for gradient correction, so nobody will use either the raw data or the final MARS databases for these purposes.

However, I want to be crystal clear about this project from the very beginning: the MARS databases and the tools that will use them will be closed-source and exclusive to the PixInsight platform. We are investing significant intellectual and economic resources in this project and will invest much more in the future.

I don't have anything against ML, as there are research directions which aim to respect physics (see physics-informed neural networks), and I am open to adopting such solutions, especially if they are developed by Pleiades.

My concern is about others (not Pleiades) who could potentially use PI to assemble training sets for their "AI" products. This malpractice is already a threat: for example, someone could use SPCC to create training sets for a commercial, standalone "AI" color correction tool/product outside the PI ecosystem. I am far from being an expert in intellectual property, but I suspect that neither the PI license nor the Gaia DR3/SP license allows for such practices.
 