The default RGBWS is sRGB. It has been set as the default to keep PixInsight compatible with the rest of imaging applications. If I had chosen a uniform RGBWS (1:1:1) as the default, Google would be full of things like "guys, never use PixInsight to convert your images to grayscale", or "PI is a poor performer for daylight images", etc. PI's scope is broad and, as such, I sometimes have to make a choice that is suboptimal for astronomy but reasonable as a "mean" for the largest possible set of scenarios. The default sRGB RGBWS is clearly one of those cases.
Thanks for picking good defaults Juan!
Thanks Sander, but I'm afraid I don't deserve that on this occasion (it's nice to hear, though). The sRGB space isn't a good option for deep-sky images, basically due to its strongly green-biased luminance, which mimics the human eye's chromatic response. This doesn't mean that you can't get good results using it, of course, but optimal choices are always preferable.
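To make the green bias concrete, here is a small sketch (not PixInsight code) using the standard sRGB/Rec.709 luminance coefficients for linear RGB values:

```python
# Illustrative sketch: the sRGB (Rec.709 primaries, D65 white point)
# luminance coefficients. Note how heavily green dominates.
def srgb_luminance(r, g, b):
    """Relative luminance of linear sRGB components."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Green alone contributes over 70% of the luminance:
print(srgb_luminance(0.0, 1.0, 0.0))   # 0.7152
```

So in an sRGB luminance extraction, the green channel weighs roughly ten times more than blue, which is exactly the bias that makes it a questionable choice for deep-sky work.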
In general, a reasonable choice is what we call a uniform RGBWS. In a uniform RGBWS, all luminance weights are equal, so no particular color is given more importance. The RGBWorkingSpace tool allows you to enter a weight of one for each of red, green and blue; it rescales the weights internally so they sum to one (if the three weights are mutually equal, each is rescaled to 1/3).
Another good option can be 1:0.25:1, or something similar that decreases the relative weight of green, for deep-sky images where green carries little information. It depends on how you acquire your images. For one-shot color images (DSLR, OSC CCD), I'd use a uniform RGBWS because these images have been acquired through broadband filters with extended superposition, that is, adjacent pairs of filters share a region of the spectrum.
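A quick sketch of the internal rescaling described above (the function name is mine, just for illustration; the RGBWorkingSpace tool does this for you):

```python
# Hypothetical helper mimicking how entered luminance weights are
# rescaled internally so that they sum to one.
def normalize_weights(r, g, b):
    s = r + g + b
    return (r / s, g / s, b / s)

# Uniform RGBWS: 1:1:1 becomes (1/3, 1/3, 1/3).
print(normalize_weights(1, 1, 1))

# Green de-weighted: 1:0.25:1 becomes (4/9, 1/9, 4/9).
print(normalize_weights(1, 0.25, 1))
```

Either way you enter them, only the ratios between the weights matter, not their absolute values.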
Another thing to consider is the linearity of the transformations performed in a RGBWS. We talk of a linear RGBWS when it has a gamma value of one. If we are going to perform luminance/chrominance separations on linear images (for deconvolution, for example), then the resulting luminance and chrominance components must also be linear. The CIE XYZ components are calculated as follows:

   X = M11·R^γ + M12·G^γ + M13·B^γ
   Y = M21·R^γ + M22·G^γ + M23·B^γ
   Z = M31·R^γ + M32·G^γ + M33·B^γ

where M is the RGB-to-CIE XYZ conversion matrix and γ is the gamma of the RGBWS. X and Z are the components of the linear chrominance, and Y is the linear luminance.
In the formula above, note that chrominance and luminance are linear functions of the coefficients Mij, but X, Y and Z can be nonlinear functions of the RGB components. To ensure that the XYZ components are linear combinations of RGB, gamma must be equal to one. In this way, we can perform luminance/chrominance separations of linear RGB data where the resulting luminance and chrominance are also linear functions of the incident light. This is what the "Linear" checkbox does in Deconvolution, for example: separate luminance and chrominance in the CIE XYZ space instead of CIE L*c*h*, and assume that the user knows what she is doing.
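The linearity argument above can be sketched in a few lines. The matrix below is the standard sRGB (D65) RGB-to-XYZ matrix, used here only for illustration; PixInsight computes M from the primaries and white point of the RGBWS you define:

```python
# Standard sRGB (D65) RGB-to-CIE XYZ matrix, for illustration only.
M = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def rgb_to_xyz(r, g, b, gamma=1.0):
    """Apply the RGBWS gamma, then the linear matrix transform."""
    rgb = [r ** gamma, g ** gamma, b ** gamma]
    return [sum(m * c for m, c in zip(row, rgb)) for row in M]

# With gamma = 1 the transform is linear in RGB:
# doubling the RGB input doubles the luminance Y exactly.
x1, y1, z1 = rgb_to_xyz(0.1, 0.2, 0.3)
x2, y2, z2 = rgb_to_xyz(0.2, 0.4, 0.6)
assert abs(y2 - 2 * y1) < 1e-12

# With gamma != 1, Y is no longer proportional to the input,
# so a "linear" deconvolution of Y would be working on distorted data.
_, y3, _ = rgb_to_xyz(0.2, 0.4, 0.6, gamma=2.2)
assert abs(y3 - 2 * y1) > 1e-3
```

This is why a gamma of one is required when you want the extracted luminance to remain a linear function of incident light.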