Conventions: Use "rank" for variable names, when appropriate #627
Conversation
@fdwr - can you take a look? I think the remaining uses of "size" in variable names refer to element counts or magnitudes (e.g. width and height). Maybe I missed some? I was on the fence about the broadcast algorithms that still use sizeA/sizeB. I'm happy to convert those over, but wanted your opinion. If you want to search the rendered spec for "size", I find it handy to remove the IDL and code samples, e.g. by running this on the console.
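The console snippet itself was not captured in this thread; the following is only a minimal sketch of the kind of one-liner meant, assuming Bikeshed's usual `pre.idl` and `pre.highlight` markup for IDL blocks and code samples.

```js
// Hypothetical sketch (assumed selectors): hide IDL blocks and code samples in
// the rendered spec so a Ctrl+F for "size" only matches prose and algorithm steps.
document.querySelectorAll('pre.idl, pre.highlight')
  .forEach(el => { el.style.display = 'none'; });
```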
Thanks Josh 🙂. Scanning for all "size" occurrences, I only found one other operator to replace with rank, plus some dubious size-related statements. Do these proposed edits look valid to you?
1. If |options|.{{MLBatchNormalizationOptions/scale}} [=map/exists=]:
- 1. If its [=list/size=] is not 1, then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/rank=] is not 1, then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLBatchNormalizationOptions/scale}}'s [=MLOperand/shape=][0] is not equal to |input|'s [=MLOperand/shape=][|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLBatchNormalizationOptions/bias}} [=map/exists=]:
- 1. If its [=list/size=] is not 1, then [=exception/throw=] a {{TypeError}}.
+ 1. If its [=MLOperand/rank=] is not 1, then [=exception/throw=] a {{TypeError}}.
1. If |options|.{{MLBatchNormalizationOptions/bias}}'s [=MLOperand/shape=][0] is not equal to |input|'s [=MLOperand/shape=][|options|.{{MLBatchNormalizationOptions/axis}}], then [=exception/throw=] a {{TypeError}}.
1. *Make graph connections:* Since the rest of these are
: <dfn>keepDimensions</dfn>
::
- If true, retains reduced dimensions with [=list/size=] 1.
+ If true, the output has the same rank as the input, setting any reduced dimensions to size 1.
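To illustrate the proposed keepDimensions wording with a concrete, hypothetical example, assume `builder` is an MLGraphBuilder and `input` is a rank-3 operand with shape [2, 3, 4]:

```js
// Reducing over axis 1 without keepDimensions drops that dimension,
// so the output rank is 2.
const reduced = builder.reduceSum(input, { axes: [1], keepDimensions: false });
// reduced shape: [2, 4]

// With keepDimensions true the output keeps the input's rank,
// and the reduced dimension is retained with size 1.
const kept = builder.reduceSum(input, { axes: [1], keepDimensions: true });
// kept shape: [2, 1, 4]
```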
<details open algorithm>
<summary>
The <dfn method for=MLGraphBuilder>slice(|input|, |starts|, |sizes|)</dfn> method steps are:
</summary>
1. If [=MLGraphBuilder/validating operand=] with [=this=] and |input| returns false, then [=exception/throw=] a {{TypeError}}.
- 1. If |sizes|'s [=list/size=] is 0, then [=exception/throw=] a {{TypeError}}.
+ 1. If any of |sizes|'s have size 0, then [=exception/throw=] a {{TypeError}}.
1. If |starts|'s [=list/size=] and |sizes|'s [=list/size=] are not both equal to |input|'s [=MLOperand/rank=], then [=exception/throw=] a {{TypeError}}.
I'm okay with sizeA here because it's actually about the shape's size. Though, somebody reported to me that
- To unidirectionally broadcast the shapes A and B, perform the following steps. A and B are [lists](https://infra.spec.whatwg.org/#list) of positive integers, representing the dimensions of tensors, and the steps return a new [list](https://infra.spec.whatwg.org/#list) of positive integers, or failure.
+ To unidirectionally broadcast the shapes shapeA and shapeB, perform the following steps. shapeA and shapeB are [lists](https://infra.spec.whatwg.org/#list) of positive integers, representing the dimensions of tensors, and the steps return a new [list](https://infra.spec.whatwg.org/#list) of positive integers, or failure.
- 1. Let |sizeA| be |A|'s [=list/size=].
+ 1. Let |sizeA| be |shapeA|'s [=list/size=].
- 1. Let |sizeB| be |B|'s [=list/size=].
+ 1. Let |sizeB| be |shapeB|'s [=list/size=].
1. If |sizeB| > |sizeA|, then return failure.
- 1. Let |paddedB| be a [=list/clone=] of |B|.
+ 1. Let |paddedB| be a [=list/clone=] of |shapeB|.
1. While |paddedB|'s [=list/size=] is less than |sizeA|, [=list/prepend=] 1 to |paddedB|.
1. Let |outputShape| be a new [=/list=].
1. [=list/For each=] |index| in [=the range=] 0 to |sizeA|, exclusive:
- 1. Let |dimA| be |A|[|index|].
+ 1. Let |dimA| be |shapeA|[|index|].
1. Let |dimB| be |paddedB|[|index|].
1. If |dimA| is not equal to |dimB| and |dimA| is not equal to 1, then return failure.
1. [=list/Append=] |dimA| to |outputShape|.
1. Return |outputShape|.
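As a non-normative aid, the broadcasting steps quoted above can be transcribed directly into JavaScript roughly as follows; the function name is illustrative, and failure is modelled as returning null.

```js
// Direct transcription of the "unidirectionally broadcast the shapes" steps above.
function unidirectionallyBroadcastShapes(shapeA, shapeB) {
  const sizeA = shapeA.length;
  const sizeB = shapeB.length;
  if (sizeB > sizeA) return null;                  // failure
  const paddedB = shapeB.slice();
  while (paddedB.length < sizeA) paddedB.unshift(1);
  const outputShape = [];
  for (let index = 0; index < sizeA; ++index) {
    const dimA = shapeA[index];
    const dimB = paddedB[index];
    if (dimA !== dimB && dimA !== 1) return null;  // failure
    outputShape.push(dimA);
  }
  return outputShape;
}

// e.g. unidirectionallyBroadcastShapes([1, 3, 4], [3, 4]) returns [1, 3, 4].
```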
Just to confirm: this was referring to the
Nah, too much noise, this is all related.
Yes - incorporated verbatim in 38f9261 except for in slice()
I made this change instead:
- 1. If |sizes|'s [=list/size=] is 0, then [=exception/throw=] a {{TypeError}}.
+ 1. If any of |sizes|'s [=list/items=] are 0, then [=exception/throw=] a {{TypeError}}.
... and made sure
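A hypothetical illustration of the difference between the two checks, assuming `builder` is an MLGraphBuilder and `input` has shape [4, 4]:

```js
// An empty sizes list is already rejected by the rank check
// (starts and sizes must both match input's rank).
builder.slice(input, [], []);          // TypeError

// The revised wording additionally rejects any individual size of 0.
builder.slice(input, [0, 0], [2, 0]);  // TypeError

// A valid call: output shape is [2, 3].
builder.slice(input, [0, 1], [2, 3]);
```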
LGTM JB. TY. 😎
Yep, it looked like you found every other case of size -> rank for other operators. @huningxin Any thoughts on this one?
LGTM, thanks @inexorabletash !
SHA: e52f163 Reason: push, by huningxin Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Fixes #588
Preview | Diff