add back gelu
a-sully committed Jun 7, 2024
1 parent 78d9d7b commit e948f7e
Showing 1 changed file with 18 additions and 0 deletions: index.bs
@@ -2755,6 +2755,7 @@ Compute the <a href="https://en.wikipedia.org/wiki/Rectifier_(neural_networks)#G
<script type=idl>
partial interface MLGraphBuilder {
MLOperand gelu(MLOperand input);
MLActivation gelu();
};
</script>
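
A minimal usage sketch (not part of this diff), assuming an already-created {{MLContext}} named `context` and a hypothetical 2x2 float32 input; it shows the existing operand-returning overload next to the {{MLActivation}}-returning overload this commit adds back:

<pre highlight=js>
// Sketch only; `context`, the input name, and its shape are assumptions,
// not part of this commit. Descriptor keys follow the spec's MLOperandDescriptor.
const builder = new MLGraphBuilder(context);
const x = builder.input('x', {dataType: 'float32', dimensions: [2, 2]});

// Operand form: applies GELU element-wise and yields a new MLOperand.
const y = builder.gelu(x);

// Activation form (added back here): yields an MLActivation that can be
// handed to builder methods that accept fused activation functions.
const geluFn = builder.gelu();
</pre>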

@@ -2798,6 +2799,23 @@ partial interface MLGraphBuilder {
1. Return |output|.
</details>

#### {{MLGraphBuilder/gelu()}} #### {#api-mlgraphbuilder-gelu}
<div>
**Arguments:**
- None.

**Returns:**
- an {{MLActivation}}. The activation function representing the gelu operation.
</div>

<details open algorithm>
<summary>
The <dfn method for=MLGraphBuilder id=gelu-noargs>gelu()</dfn> method steps are:
</summary>
1. Let |op| be the result of [=creating an MLActivation=] given [=this=] and "gelu".
1. Return |op|.
</details>
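
As a hedged illustration of where the returned {{MLActivation}} could be consumed, the sketch below passes it through the `activations` member of {{MLGruOptions}}, which is defined elsewhere in the spec rather than in this diff; the operand names, shapes, and scalar values are invented for the example:

<pre highlight=js>
// Hedged sketch: gru() and its `activations` option come from other parts of
// the spec; whether a backend accepts gelu here is an implementation question.
const steps = 2, batchSize = 1, inputSize = 4, hiddenSize = 3;
const input = builder.input('input',
    {dataType: 'float32', dimensions: [steps, batchSize, inputSize]});
const weight = builder.input('weight',
    {dataType: 'float32', dimensions: [1, 3 * hiddenSize, inputSize]});
const recurrentWeight = builder.input('recurrentWeight',
    {dataType: 'float32', dimensions: [1, 3 * hiddenSize, hiddenSize]});

// The no-argument overload returns an MLActivation bound to this builder.
const geluFn = builder.gelu();
const outputs = builder.gru(input, weight, recurrentWeight, steps, hiddenSize,
                            {activations: [geluFn, geluFn]});
</pre>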

### gemm ### {#api-mlgraphbuilder-gemm}
Calculate the [general matrix multiplication of the Basic Linear Algebra Subprograms](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms#Level_3). The calculation follows the expression `alpha * A * B + beta * C`, where `A` is a 2-D tensor with shape [M, K] or [K, M], `B` is a 2-D tensor with shape [K, N] or [N, K], and `C` is [=unidirectionally broadcastable=] to the shape [M, N]. `A` and `B` may optionally be transposed prior to the calculation.
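
A hedged sketch of the expression above using the builder's gemm() method; the option names (`alpha`, `beta`, `c`) mirror {{MLGemmOptions}} from the rest of the spec, and the shapes and coefficients are illustrative only:

<pre highlight=js>
// Sketch of alpha * A * B + beta * C. Shapes and values are invented;
// option names follow MLGemmOptions as defined elsewhere in the spec.
const A = builder.input('A', {dataType: 'float32', dimensions: [3, 4]});  // [M, K]
const B = builder.input('B', {dataType: 'float32', dimensions: [4, 5]});  // [K, N]
const C = builder.input('C', {dataType: 'float32', dimensions: [1, 5]});  // broadcasts to [M, N]
const output = builder.gemm(A, B, {alpha: 2.0, beta: 0.5, c: C});
</pre>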

