Our data-reduction methods often involve image alignment, which is based on interpolation. An important question is whether the mean is conserved during such alignment.
To test this we repeatedly shift an observed image, measuring the mean of a patch defined by selenographic coordinates. We iterate many times - i.e. reuse the previously shifted image for the next round - in order to get a clearer answer. Visual inspection shows the iterated image getting 'fuzzy' as the shifts recur; that is, the image standard deviation suffers. How bad is the situation for the mean of an area?
The answer is 'not very much'. Iterating the shifts 40 times we find that the per-shift loss in mean value over a realistically sized patch on the DS is -0.005%. The mean in the patch was about 10 counts, which is 'bright' for the DS. At other lunar phases the counts may be just 2 or 3 - still only a factor of 5 below the above.
We would never shift the same image 40 times, of course, but the iterated shifts above allow us to build up a change in the mean that can be fitted with a regression. We have learned that one shift of an image may 'cost' 0.005% of the mean flux. This is well below the accuracy goal we have set, which is 0.1%.
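The test above can be sketched as follows. This is a minimal illustration, not our pipeline code: the patch location, image size, shift range, and the use of scipy's cubic-spline shift are all assumptions made for the example. The idea is to reuse the shifted image each round, record the patch mean after every shift, and fit a line to get the per-shift loss.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def iterated_shift_means(image, n_iter=40, patch=(slice(20, 40), slice(20, 40)), seed=0):
    """Apply n_iter random sub-pixel shifts (cubic-spline interpolation),
    reusing the shifted image each round, and record the patch mean."""
    rng = np.random.default_rng(seed)
    img = image.astype(float)
    means = [img[patch].mean()]
    for _ in range(n_iter):
        dy, dx = rng.uniform(-0.5, 0.5, size=2)  # random sub-pixel shift
        img = nd_shift(img, (dy, dx), order=3, mode="nearest")
        means.append(img[patch].mean())
    return np.array(means)

# Synthetic stand-in for an observed image: ~10 counts, like the DS patch.
rng = np.random.default_rng(1)
img = 10.0 + rng.normal(0.0, 1.0, size=(64, 64))

means = iterated_shift_means(img, n_iter=40)
# Linear regression of patch mean vs. iteration number gives the per-shift change.
slope = np.polyfit(np.arange(means.size), means, 1)[0]
per_shift_pct = 100.0 * slope / means[0]
```

The fitted slope, expressed as a percentage of the initial patch mean, plays the role of the -0.005% per-shift figure quoted above (the exact number depends on the image and interpolation kernel).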
Added later: an identical test, but using ROT to rotate an image by a random amount, iterated, showed that the error per rotation was almost always less than 0.01%.
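The rotation test follows the same pattern. A hedged sketch, using scipy's `rotate` as a stand-in for IDL's ROT; the angle range, image, and patch are again assumptions for illustration:

```python
import numpy as np
from scipy.ndimage import rotate

def iterated_rotation_means(image, n_iter=20, patch=(slice(24, 40), slice(24, 40)), seed=0):
    """Apply n_iter random rotations (cubic-spline interpolation),
    reusing the rotated image each round, and record the central patch mean."""
    rng = np.random.default_rng(seed)
    img = image.astype(float)
    means = [img[patch].mean()]
    for _ in range(n_iter):
        angle = rng.uniform(-10.0, 10.0)  # random rotation angle in degrees
        img = rotate(img, angle, reshape=False, order=3, mode="nearest")
        means.append(img[patch].mean())
    return np.array(means)

rng = np.random.default_rng(2)
img = 10.0 + rng.normal(0.0, 1.0, size=(64, 64))

means = iterated_rotation_means(img, n_iter=20)
slope = np.polyfit(np.arange(means.size), means, 1)[0]
per_rot_pct = 100.0 * slope / means[0]
```

Keeping the patch near the image centre matters here: with `reshape=False`, pixels near the corners are affected by edge handling, so an off-centre patch would mix in boundary artefacts rather than pure interpolation loss.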