Conversation
@ptrendx Thanks for your contribution! Could you please look into the CI failures? @mxnet-label-bot add [pr-awaiting-review, python]
Could you add a unit test for this?
@junrushao1994 @reminisce FYI, the `ndim()` API is used here.
Thanks for the fix. I am not sure about the last two params. They were added by @eric-haibin-lin. We can add comments in later PRs. This PR is good to merge.
* \param fdefault default function used for inference if the node does not
* provide its own implementation.
* \param scalars_only whether the inferred attribute may be only a scalar
* \param bwd_identity_assign TODO
It is only equal to false when inferring storage types. It indicates whether the attributes of the forward ndarray and the backward ndarray are necessarily the same.
* provide its own implementation.
* \param scalars_only whether the inferred attribute may be only a scalar
* \param bwd_identity_assign TODO
* \param dispatch_mode_name TODO
Name of the dispatch mode attribute on the node. Used for storage type inference.
* \param scalars_only whether the inferred attribute may be only a scalar
* \param bwd_identity_assign TODO
* \param dispatch_mode_name TODO
* \param default_mode_val TODO
Default value of the dispatch mode attribute on the node. Used for storage type inference.
@junrushao1994 I wanted to rebase this PR on top of the current master and I'm confused - in PR #14193 you made InferShapeAttr a standalone function, but then in #14270 you changed InferShape to use the generic InferAttr again. Was this intentional? Which one should I use then?
@ptrendx Sorry, that was a mistake. I am actually working on two projects, dynamic shape and numpy operators. In dynamic shape, I should make the infer shape pass standalone, but in the numpy operators everything is just fine. I think these two projects interfered a bit.
OK, so I should move it back to InferShapeAttr and make my changes there, then?
@ptrendx Yes, please. I am so sorry for the inconvenience.
Force-pushed from 6aec7b6 to 29ed75d
@junrushao1994 @reminisce I think this PR is good to go. Do you have any other comments?
LGTM.
This PR introduces `fnum_unknown` to the shape inference pass, which counts the number of unknown dimensions in an NDArray, and leverages this function to do shape inference more correctly.
* Fix InferShape pass
* nnvm::TShape -> mxnet::TShape in InferShapeAttr
* More nnvm -> mxnet namespace changes
* And more nnvm -> mxnet
* Retrigger CI
Description
This PR fixes shape inference, which currently produces incorrect results in some cases.
An example problematic case (as it behaves in the current version of MXNet, before applying this change): running shape inference with a fully specified input shape gives the expected result, but when the H and W dimensions of the shape are changed to 0s, the inferred shape of the weight becomes `()`.
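The original snippets are not reproduced here, so the following is only a minimal sketch of that kind of case, assuming a convolution symbol and the `infer_shape_partial` API; the operator, names, and shapes are illustrative assumptions rather than the exact code from the report.

```python
import mxnet as mx

# Hypothetical reproduction of the case described above; the exact operator
# and shapes from the original report are assumptions.
data = mx.sym.Variable('data')
conv = mx.sym.Convolution(data=data, kernel=(3, 3), num_filter=16, name='conv')

# Fully specified input shape: the weight shape is inferred as expected.
arg_shapes, _, _ = conv.infer_shape_partial(data=(1, 3, 224, 224))
print(dict(zip(conv.list_arguments(), arg_shapes)))
# e.g. {'data': (1, 3, 224, 224), 'conv_weight': (16, 3, 3, 3), 'conv_bias': (16,)}

# With H and W set to 0 (unknown), the buggy pass could report the weight
# shape as (), even though (16, 3, 3, 3) is still deducible from the channel dim.
arg_shapes, _, _ = conv.infer_shape_partial(data=(1, 3, 0, 0))
print(dict(zip(conv.list_arguments(), arg_shapes)))
```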
Checklist
Essentials
Please feel free to remove inapplicable items for your PR.
Changes
Previously, a shape contributed 1 to `num_unknown` if it had at least one zero. After the change, the number of 0 elements is added to `num_unknown`, so the shape inference pass does not end prematurely if only some of the elements of a shape were deduced.

Comments
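Below is a minimal Python sketch of the counting idea described in the Changes section above. The actual change lives in the C++ InferShape pass; the helper name and the example shapes here are illustrative only.

```python
def num_unknown_dims(shape):
    """Count unknown (zero) dimensions in a shape tuple.

    Sketch of the idea behind fnum_unknown: each 0 dimension counts as one
    unknown, so partially deduced shapes still lower the unknown count and
    the inference pass keeps iterating until nothing more can be deduced.
    """
    return sum(1 for dim in shape if dim == 0)

# Before the fix, a shape with any zeros effectively counted as a single
# unknown; with per-dimension counting, deducing H alone changes the total.
assert num_unknown_dims((1, 3, 0, 0)) == 2
assert num_unknown_dims((1, 3, 224, 0)) == 1
assert num_unknown_dims((1, 3, 224, 224)) == 0
```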