feat: Support configuring fixed values automatically from Model #1051
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master    #1051   +/-  ##
=======================================
  Coverage   96.78%   96.78%
=======================================
  Files          59       59
  Lines        3389     3394    +5
  Branches      489      490    +1
=======================================
+ Hits         3280     3285    +5
  Misses         68       68
  Partials       41       41
|
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
@kratsg Core wise this is great. Thanks for stepping up and knocking this out this week. Just some readability suggestions and questions for the most part.
Suggestion: For the minuit optimizer, the initial step size is set in
@kratsg LGTM for the original scope. As this is a release PR, it would be good to get @lukasheinrich's review as well.
I haven't looked at @alexander-held's suggestion in terms of scope yet, so should we try to get that in here, or is it beyond scope and should it be its own Issue + PR? Also, it would be good to understand how the changes in 0ad8a7d managed to break everything, if there's time.
This is done in #1054.
I've queued this up. When it gets merged, I'll create a patch release from v0.5.1 → v0.5.2 which includes the following 19 change(s), including this PR. If you make any more changes, you probably want to re-trigger me again by removing the bumpversion/patch label and then adding it back again.
Triggered by #1051 via GitHub Actions.
Description
Replaces #846.
Provides support for automatically passing the fixed parameters from the model configuration through into all of the minimization/inference that we do in the pyhf code base, namely:
- pyhf.infer.hypotest
- pyhf.infer.calculators.generate_asimov_data
- pyhf.infer.calculators.AsymptoticCalculator
- pyhf.infer.mle.fit
- pyhf.infer.mle.fixed_poi_fit
- pyhf.infer.test_statistics._qmu_like
- pyhf.infer.test_statistics._tmu_like
- pyhf.infer.test_statistics.qmu
- pyhf.infer.test_statistics.qmu_tilde
- pyhf.infer.test_statistics.tmu
- pyhf.infer.test_statistics.tmu_tilde
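The keyword-resolution pattern these functions adopt can be sketched with a minimal, self-contained illustration. The `StubConfig`, `StubModel`, and `fit` names below are hypothetical stand-ins, not pyhf internals; the sketch only shows how an explicit `fixed_params` argument takes precedence over the model's suggestion.

```python
# Sketch of the default-resolution pattern this PR threads through the
# inference APIs: keyword arguments fall back to the model configuration.
# StubConfig, StubModel, and fit are illustrative, not pyhf internals.

class StubConfig:
    def suggested_init(self):
        # initial parameter values suggested by the model
        return [1.0, 1.0, 1.0]

    def suggested_fixed(self):
        # one boolean per parameter: True means "hold this parameter fixed"
        return [False, True, False]

class StubModel:
    config = StubConfig()

def fit(data, pdf, init_pars=None, fixed_params=None):
    """Resolve defaults from the model config when the caller omits them."""
    if init_pars is None:
        init_pars = pdf.config.suggested_init()
    if fixed_params is None:
        fixed_params = pdf.config.suggested_fixed()
    # ... hand off to the optimizer; here we just return what would be used
    return init_pars, fixed_params

init, fixed = fit(data=[], pdf=StubModel())
print(init)   # [1.0, 1.0, 1.0]
print(fixed)  # [False, True, False]
```

Callers that already know their fixed-parameter mask can still pass it explicitly, in which case the model configuration is never consulted.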
We will pass fixed_params all the way through to the optimization, which will stitch the two together if need be. This is a two-fold feature that needs to be refactored/made more elegant a bit later:
- fixed_vals is a list of tuples of the form [(parameter_index, parameter_value), ...].
- fixed_params is what comes from pdf.config.suggested_fixed().
- If fixed_params is not provided, the model's fixed parameters are retrieved; if fixed_params is provided, nothing is done (a pass-through).
This tries to model identically the behavior we have for init_pars and par_bounds everywhere.