Add a normalization option #113

Open
rmjarvis opened this issue Jun 12, 2020 · 8 comments

Comments

@rmjarvis (Owner)

Right now our PSF models are all normalized to have unit integral.

@erykoff has explained that this isn't the right thing if you expect to get accurate photometry at the end. Here is the procedure he laid out for doing the aperture correction:

  1. Take a model.
  2. Apply it to all the stars on a CCD.
  3. Measure the PSF flux with the “unnormalized” PSF.
  4. Measure the aperture flux in the reference aperture.
  5. The normalization is the median/mean/clipped mean/favorite statistic of the ratio between the unnormalized PSF flux and the reference aperture flux (see the sketch below).
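
For concreteness, a minimal sketch of step 5 as a sigma-clipped mean. Here psf_flux and aper_flux are placeholder arrays holding the per-star measurements from steps 3 and 4; nothing in this sketch is existing Piff API:

import numpy as np

def normalization_factor(psf_flux, aper_flux, nsigma=4.0, niter=3):
    # Per-star ratio of unnormalized-PSF flux to reference-aperture flux.
    ratio = np.asarray(psf_flux, dtype=float) / np.asarray(aper_flux, dtype=float)
    keep = np.ones(ratio.size, dtype=bool)
    for _ in range(niter):
        # Clip outliers around the median (blends, bad measurements, etc.).
        med = np.median(ratio[keep])
        sig = np.std(ratio[keep])
        keep = np.abs(ratio - med) < nsigma * sig
    # Clipped mean of the surviving ratios is the normalization.
    return np.mean(ratio[keep])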

My proposal is to do this at the very end of the fitting process for an exposure, using all the stars that were used for fitting. This would set an overall normalization number for that exposure, which can be saved at the PSF level and applied when drawing the PSF with the final model.

This seems independent of any particular PSF type, so I think this can be a top-level field, which could look like this in the config:

normalization:
    type: Aperture
    diameter: 22.22   # radius also allowed
    units: pixels   # arcsec also allowed
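
A sketch of how such a block might be interpreted. The keys mirror the example above, but parse_aperture and its pixel_scale argument are hypothetical, not existing Piff code:

def parse_aperture(config, pixel_scale):
    # Accept either 'diameter' or 'radius' and return a radius in pixels.
    if 'diameter' in config:
        radius = 0.5 * config['diameter']
    else:
        radius = config['radius']
    # 'units' may be 'pixels' (default) or 'arcsec'.
    if config.get('units', 'pixels') == 'arcsec':
        radius /= pixel_scale   # pixel_scale in arcsec/pixel
    return radius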

@beckermr @esheldon @brianyanny

@rmjarvis (Owner Author)

Further info from Eli on Slack:

As an update to above, step 5 will break if you have systematic background errors. Which we do. So you want to take the brightest x% when you do this, because those are less affected by backgrounds. Notably, the pure-psf-model version is immune to this and so might actually be better!
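
That brightest-fraction cut is easy to bolt onto the sketch above, using the same placeholder arrays. The 10% default here is illustrative, not a recommendation:

import numpy as np

def brightest_stars(psf_flux, aper_flux, frac=0.10):
    # Keep only the brightest frac of stars (by aperture flux),
    # since faint stars are most affected by background errors.
    cut = np.quantile(aper_flux, 1.0 - frac)
    keep = aper_flux >= cut
    return psf_flux[keep], aper_flux[keep]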

@rmjarvis (Owner Author) commented Jun 12, 2020

We could also consider a different normalization type that does the pure PSF-model version, which Eli thinks should work, but Lupton thinks won't. Then it would be relatively easy to compare and see which is better.

The other possible algorithm looks something like:

  1. Take a given Piff model for a specific location.
  2. Draw it at fine resolution (or maybe just don't care about the accuracy issues due to the finite pixel scale).
  3. Integrate the profile within the aperture. This gives a number modestly less than 1. Call this f_ap.
  4. Renormalize the full profile by 1/f_ap (see the sketch after this list).
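
A minimal sketch of steps 2-4, using a GalSim Gaussian as a stand-in for the Piff model at the chosen location. The aperture radius, fine pixel scale, and image size are all illustrative:

import numpy as np
import galsim

def aperture_fraction(profile, radius, scale=0.05, size=512):
    # Step 2: draw the profile on a finely sampled grid.
    image = profile.drawImage(nx=size, ny=size, scale=scale)
    # Distance of each pixel center from the image center (same units as radius).
    x = (np.arange(size) - (size - 1) / 2.) * scale
    xx, yy = np.meshgrid(x, x)
    rr = np.hypot(xx, yy)
    # Step 3: fraction of the flux inside the aperture; modestly less than 1.
    return image.array[rr <= radius].sum() / image.array.sum()

psf = galsim.Gaussian(fwhm=0.9)        # stand-in for the model at one location
f_ap = aperture_fraction(psf, radius=2.0)
renormalized = psf.withFlux(1. / f_ap)  # step 4: renormalize by 1/f_ap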

@beckermr (Collaborator)

We could also consider a different normalization type that does the pure PSF-model version, which Eli thinks should work, but Lupton thinks won't.

LOL

@beckermr (Collaborator)

The normalization is the median/mean/clipped mean/favorite statistic of the ratio between the unnormalized PSF flux and the reference aperture flux.

Is the norm supposed to be constant per exposure, or should it be allowed to vary in some way?

@erykoff commented Jun 12, 2020

It should at least be per-CCD, not per-exposure. In the Rubin stack it can vary at sub-CCD scales; in SExtractor/PSFEx it can't.

@beckermr (Collaborator)

Sorry, yes. I meant CCD.

@erykoff commented Jun 12, 2020

I am not convinced that allowing it to vary at sub-CCD scales is particularly useful overall, as it creates more degrees of freedom for things to go wrong. It's also something that might be more important for HSC than for DECam.
