Non-normalized Separable Gaussian Filter in Star Detection

ghilios

Member
I'm digging into Star Detection to better understand it, and I noticed that the separable Gaussian filter applied during noise detection isn't normalized.

JavaScript:
// Build a Gaussian kernel of size 2^structureLayers + 1, then convolve the
// image separably, using the kernel's central row as both the row and column vector.
let G = Matrix.gaussianFilterBySize( 1 + (1 << this.structureLayers) );
s.convolveSeparable( G.rowVector( G.rows >> 1 ), G.rowVector( G.rows >> 1 ) );

For structureLayers = 5 (a 33-tap kernel), the central row vector contains the following values, which sum to about 13.19.
JavaScript:
0.010000000707805157, 0.017465760931372643, 0.02942727319896221, 0.04782858118414879, 0.0749894231557846, 0.11341944336891174, 0.16548171639442444, 0.2329096645116806, 0.3162277638912201, 0.4141784608364105, 0.5232990980148315, 0.637804388999939, 0.7498942017555237, 0.8505257964134216, 0.9305720329284668, 0.982171893119812, 1, 0.982171893119812, 0.9305720329284668, 0.8505257964134216, 0.7498942017555237, 0.637804388999939, 0.5232990980148315, 0.4141784608364105, 0.3162277638912201, 0.2329096645116806, 0.16548171639442444, 0.11341944336891174, 0.0749894231557846, 0.04782858118414879, 0.02942727319896221, 0.017465760931372643, 0.010000000707805157
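
As far as I can tell, those coefficients are just a unit-peak Gaussian that drops to about 0.01 at the ends of the kernel; the following sketch reproduces them under that assumption (a guess based on the numbers, not the actual code of gaussianFilterBySize):
JavaScript:
// Sketch only: regenerate the 33-tap row assuming a unit peak and ~0.01 at the ends.
const size = 1 + (1 << 5);   // structureLayers = 5 -> 33 taps
const c = size >> 1;         // center index = 16
const sigma = c / Math.sqrt( 2*Math.log( 1/0.01 ) );
let row = [], sum = 0;
for ( let i = 0; i < size; ++i )
{
   row.push( Math.exp( -(i - c)*(i - c)/(2*sigma*sigma) ) );
   sum += row[i];
}
console.log( row[0], row[c], sum ); // ~0.01, 1, ~13.19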

If I understand the math correctly, this results in blurred values that are a constant multiple of what they should be, which could result in some unexpected clipping unless you're rescaling values. Is this intentional?
 
ghilios said:
this results in blurred values that are a constant multiple of what they should be

This does not happen. Our implementations of non-separable and separable convolutions always normalize low-pass filters internally. In the case of separable filters, the filter weight is equal to the product of the sums of the row and column vectors. For this Gaussian filter the weight is 174.03915 (to 32-bit floating-point precision).
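
As a rough illustration of what this means in practice (a sketch only; the names and structure below are invented for the example and are not the actual PCL/PJSR code), the separable pass can be applied with the raw coefficients and the result divided by the filter weight, which for the full row-plus-column pass is the product of the two vector sums, about 13.19 × 13.19 ≈ 174.04:
JavaScript:
// An unnormalized 1-D kernel like the one posted above (unit peak, ~0.01 at the ends).
const size = 33, c = size >> 1;
const sigma = c / Math.sqrt( 2*Math.log( 100 ) );
const kernel = Array.from( { length: size },
   ( _, i ) => Math.exp( -(i - c)*(i - c)/(2*sigma*sigma) ) );

function convolve1D( line, k )
{
   const r = k.length >> 1, out = new Array( line.length ).fill( 0 );
   for ( let x = 0; x < line.length; ++x )
      for ( let j = -r; j <= r; ++j )
      {
         const xx = Math.min( Math.max( x + j, 0 ), line.length - 1 ); // clamp at the borders
         out[x] += line[xx]*k[j + r];
      }
   return out;
}

// One unnormalized pass over a constant row of 0.5 scales it by the kernel sum (~13.19)...
const raw = convolve1D( new Array( 64 ).fill( 0.5 ), kernel );
console.log( raw[32] ); // ~6.6

// ...but dividing by that sum restores the original values. For the full
// row+column separable convolution the divisor is the product of both sums,
// i.e. the 174.04 filter weight quoted above.
const weight = kernel.reduce( ( a, b ) => a + b, 0 );
console.log( raw[32]/weight ); // ~0.5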
 
Ahh, I see: so you normalize whatever vector is provided. What confused me is that OpenCV doesn't do that, so I got a different result when I applied the same filter there. I'm glad there isn't a bug here, thanks!
 