Thanks, John. I use that script too. I found I had to set the "repaired level" down to around 20% before it would pick up my saturated cores, though, and the result usually needs a little smoothing to look natural.
Rob, when I look at bright stars in a single calibrated, debayered light frame, I see the magenta core, and the RGB values are roughly 17%, 14%, and 40%. So the blue channel shows the same strange >25% reading.
If I load a raw .CR2 image straight from the camera (still bayered), it shows up in PI as a checkerboard grayscale. The cores of bright stars are a uniform gray, fixed at K = 0.2335 over many pixels. Not quite 0.25, but this sure looks like 14-bit saturation to me, coming straight from the camera.
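For my own sanity I did the back-of-the-envelope conversion from PI's normalized [0,1] display back to raw ADU. This is just my arithmetic, assuming the CR2 data is being shown scaled as 16-bit (divided by 65535):

    # K = 0.2335 in PI's normalized display, converted back to camera ADU
    # (assumes the raw data is presented as 16-bit, i.e. divided by 65535)
    k = 0.2335
    adu = k * 65535          # ~15302 ADU
    full_14bit = 2**14 - 1   # 16383 ADU, the nominal 14-bit ceiling
    print(round(adu), full_14bit)

About 15,300 ADU is a bit short of 16,383, but as far as I know many Canon sensors clip somewhat below the full 14-bit count, so a flat plateau pinned at one fixed level still looks like in-camera saturation to me.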
So I guess calibration and debayering scale the data to >25%. I think my stars are indeed saturating in the camera.
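Here's a toy illustration of the kind of scaling I mean; the flat value and the blue multiplier are invented for illustration, and I have no idea what the real numbers are in my data or what PI actually does internally:

    # Hypothetical: a blue pixel clipped at the raw ceiling (~0.2335) gets
    # pushed past 0.25 during calibration and debayering.
    raw_clip   = 0.2335
    after_flat = raw_clip / 0.95    # flat-field division by a value < 1 raises it
    after_wb   = after_flat * 1.5   # an invented blue white-balance multiplier
    print(round(after_flat, 3), round(after_wb, 3))   # ~0.246, ~0.369

So nothing mysterious has to happen for a clipped channel to end up well above 25% after calibration; the clipped value just stops meaning anything physical once it has been rescaled.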
There is still the question of why some stretch processes handle saturated data fine while others seem to propagate the false core color. I probably wouldn't understand the reason anyway, so I should just put up with it.
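That said, here is my rough (possibly wrong) mental model of the difference, as a toy example in numbers rather than anything any particular tool actually does. A per-channel stretch pushes all three clipped-core values up toward the top, so the ratios compress and the core ends up close to white, while a colour-ratio-preserving stretch keeps the false R:G:B ratio and just makes it brighter. The MTF-style curve and the 0.05 midtone value below are only placeholders:

    import numpy as np

    core = np.array([0.17, 0.14, 0.40])   # the false-colour core from my light

    # A simple midtones-transfer-style curve, used here only as an example stretch
    def stretch(x, m=0.05):
        return (m - 1) * x / ((2 * m - 1) * x - m)

    # Per-channel: each channel is stretched independently; values near the
    # top compress toward each other, so the core washes out toward white
    per_channel = stretch(core)

    # Ratio-preserving: stretch the brightest channel, scale the others by the
    # same factor, so the (false) colour ratio survives into the result
    scale = stretch(core.max()) / core.max()
    ratio_preserving = np.clip(core * scale, 0, 1)

    print(per_channel)        # ~[0.80, 0.76, 0.93] -> nearly white core
    print(ratio_preserving)   # ~[0.39, 0.32, 0.93] -> still strongly colour-cast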