Enforce consistent tensor shape for scalars #413
Conversation
In mxnet >= v1.4.0 the shape of an NDArray when a scalar has been passed to it is empty. As a result, this should be caught and forced to have a consistent shape by wrapping the scalar in a list.
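A quick illustration of the kind of inconsistency described above, using NumPy as a stand-in for the backend (not the MXNet call itself):

    # Illustrative only: NumPy stands in for the tensor backend here.
    # A bare scalar produces a 0-dimensional array with an empty shape,
    # while wrapping the scalar in a list gives the consistent shape (1,).
    import numpy as np

    print(np.asarray(10).shape)    # () -- empty shape for a bare scalar
    print(np.asarray([10]).shape)  # (1,) -- consistent shape after wrapping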
We're missing coverage tests where we force the shape checks for the exceptions.
Actually, looking at this in more depth now, I think that the MXNet backend's…
@kratsg @lukasheinrich Looking at this again, is there a specific reason that for our backends astensor doesn't convert a scalar into the equivalent of a single-element list, such that they both return a tensor of shape (1,)? I can't remember if there is a reason that we don't force this, beyond trying to keep parity with the underlying libraries themselves.
Another reason I am tempted to do this: to do this properly we would want to have…
This is perhaps more efficient(?)
The TensorFlow backend's astensor method already ensures that a tensor has shape (1,) at minimum.

Example:

    >>> import pyhf
    >>> import tensorflow as tf
    >>> pyhf.set_backend(pyhf.tensor.tensorflow_backend(session=tf.Session()))
    >>> pyhf.tensorlib.astensor(10)
    <tf.Tensor 'Cast:0' shape=(1,) dtype=float32>
Hi @matthewfeickert -- I think you are right. The point of the tensorlib is to act as a shim that unifies the interfaces across the different backends (as, annoyingly, they have small differences). In that sense, we should expect all shapes to be the same for a given input shape (as well as decide on specific casts) and implement them accordingly.
Instead of trying to catch an IndexError, just first check if the input is a scalar and if so, wrap it in a list
Additionally, add tests for these types of bad tensor types
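A rough sketch of the scalar check described in this change (illustrative only; the function below is a simplified stand-in, not the actual pyhf backend code):

    # Check whether the input is a bare scalar and, if so, wrap it in a list
    # before building the tensor. Simplified stand-in for a backend astensor.
    import numbers
    import numpy as np

    def astensor(tensor_in):
        # numbers.Number covers Python ints and floats
        if isinstance(tensor_in, numbers.Number):
            tensor_in = [tensor_in]
        return np.asarray(tensor_in)

    assert astensor(10).shape == (1,)
    assert astensor([10]).shape == (1,)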
I think we shouldn't cast scalars to lists. There's nothing in our code that is returning scalars -- so it's a user problem if scalars are passed in...
So do you just want the offending tests to get changed then? The CI currently breaks without this because we don't enforce a consistent return structure.
Maybe. I wonder what @lukasheinrich thinks. The problem is that I think for a given fit, we call…
Yeah, this is a very valid point, though we already have this in the TensorFlow backend, so do we know how much it gets slowed down?
Instead of doing any type checking, try to access different data members, and if they don't exist then make the necessary corrections to the object so that it is a tensor of non-empty shape
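A rough sketch of that duck-typing style (assumptions: a NumPy-like backend and len() as the probed member; the real change may probe different attributes):

    # Try the data member first and only correct the object if the access
    # fails, rather than doing explicit type checks up front. Illustrative only.
    import numpy as np

    def astensor(tensor_in):
        try:
            # array-like inputs (lists, tuples, ndarrays) support len()
            len(tensor_in)
        except TypeError:
            # bare scalars do not, so wrap them to guarantee a non-empty shape
            tensor_in = [tensor_in]
        return np.asarray(tensor_in)

    assert astensor(3.14).shape == (1,)
    assert astensor([1.0, 2.0]).shape == (2,)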
If we have people trying to pass strings as tensors then there are bigger problems afoot
This is more appropriate for the use case. From the pytest docs:

Using pytest.raises is likely to be better for cases where you are testing exceptions your own code is deliberately raising, whereas using @pytest.mark.xfail with a check function is probably better for something like documenting unfixed bugs (where the test describes what "should" happen) or bugs in dependencies.

cf. https://docs.pytest.org/en/latest/assert.html#assertions-about-expected-exceptions
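A minimal example of the pytest.raises pattern being referred to (the function and exception below are hypothetical, not pyhf's actual test code):

    import pytest

    def to_tensor(value):
        # hypothetical stand-in for a backend astensor that rejects strings
        if isinstance(value, str):
            raise ValueError("cannot build a tensor from a string")
        return value

    def test_bad_tensor_type():
        # pytest.raises is used for an exception our own code deliberately raises
        with pytest.raises(ValueError):
            to_tensor("not a tensor")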
* Allow for arbitrary shape tensors, including 0-dimensional ("empty") shape tensors (floats) - Reverts PR #413
* Enforce optimizers return objective function as a scalar to harmonize return type
* Revert tests that enforce minimum shape
* Add tests for return structure shape of pyhf.infer.mle.fit
* Add docstring examples for astensor
Description
In mxnet >= v1.4.0 the shape of an NDArray when a scalar has been passed to it is empty. As a result, this should be caught and forced to have a consistent shape by wrapping the scalar in a list. However, at this point it is probably worth enforcing a default shape on tensors made from scalars, as the different tensor libraries handle them differently and we should have a consistent form.

Checklist Before Requesting Reviewer
Before Merging
For the PR Assignees: