Commit 230e459

Deploying to gh-pages from @ 5504d97 🚀

facebook-github-bot committed Apr 11, 2024
1 parent 87938a5 commit 230e459
Showing 5 changed files with 60 additions and 7 deletions.
20 changes: 16 additions & 4 deletions genindex.html
@@ -1460,6 +1460,8 @@ <h2 id="F">F</h2>
<li><a href="torchrec.distributed.sharding.html#torchrec.distributed.sharding.dp_sharding.DpPooledEmbeddingDist.forward">(torchrec.distributed.sharding.dp_sharding.DpPooledEmbeddingDist method)</a>
</li>
<li><a href="torchrec.distributed.sharding.html#torchrec.distributed.sharding.dp_sharding.DpSparseFeaturesDist.forward">(torchrec.distributed.sharding.dp_sharding.DpSparseFeaturesDist method)</a>
</li>
<li><a href="torchrec.distributed.sharding.html#torchrec.distributed.sharding.rw_sharding.InferCPURwSparseFeaturesDist.forward">(torchrec.distributed.sharding.rw_sharding.InferCPURwSparseFeaturesDist method)</a>
</li>
<li><a href="torchrec.distributed.sharding.html#torchrec.distributed.sharding.rw_sharding.InferRwPooledEmbeddingDist.forward">(torchrec.distributed.sharding.rw_sharding.InferRwPooledEmbeddingDist method)</a>
</li>
@@ -1638,7 +1640,7 @@ <h2 id="G">G</h2>
</li>
<li><a href="torchrec.distributed.html#torchrec.distributed.quant_embeddingbag.get_device_from_parameter_sharding">get_device_from_parameter_sharding() (in module torchrec.distributed.quant_embeddingbag)</a>
</li>
<li><a href="torchrec.distributed.html#torchrec.distributed.quant_embeddingbag.get_device_from_sharding_type">get_device_from_sharding_type() (in module torchrec.distributed.quant_embeddingbag)</a>
<li><a href="torchrec.distributed.html#torchrec.distributed.quant_embeddingbag.get_device_from_sharding_infos">get_device_from_sharding_infos() (in module torchrec.distributed.quant_embeddingbag)</a>
</li>
<li><a href="torchrec.distributed.html#torchrec.distributed.embedding.get_ec_index_dedup">get_ec_index_dedup() (in module torchrec.distributed.embedding)</a>
</li>
@@ -1666,9 +1668,11 @@ <h2 id="G">G</h2>
</li>
<li><a href="torchrec.distributed.html#torchrec.distributed.types.ShardingPlan.get_plan_for_module">get_plan_for_module() (torchrec.distributed.types.ShardingPlan method)</a>
</li>
<li><a href="torchrec.distributed.html#torchrec.distributed.embedding_lookup.InferGroupedEmbeddingsLookup.get_tbes_to_register">get_tbes_to_register() (torchrec.distributed.embedding_lookup.InferGroupedEmbeddingsLookup method)</a>
<li><a href="torchrec.distributed.html#torchrec.distributed.embedding_lookup.InferCPUGroupedEmbeddingsLookup.get_tbes_to_register">get_tbes_to_register() (torchrec.distributed.embedding_lookup.InferCPUGroupedEmbeddingsLookup method)</a>

<ul>
<li><a href="torchrec.distributed.html#torchrec.distributed.embedding_lookup.InferGroupedEmbeddingsLookup.get_tbes_to_register">(torchrec.distributed.embedding_lookup.InferGroupedEmbeddingsLookup method)</a>
</li>
<li><a href="torchrec.distributed.html#torchrec.distributed.embedding_lookup.InferGroupedPooledEmbeddingsLookup.get_tbes_to_register">(torchrec.distributed.embedding_lookup.InferGroupedPooledEmbeddingsLookup method)</a>
</li>
<li><a href="torchrec.distributed.html#torchrec.distributed.embedding_lookup.MetaInferGroupedEmbeddingsLookup.get_tbes_to_register">(torchrec.distributed.embedding_lookup.MetaInferGroupedEmbeddingsLookup method)</a>
@@ -1747,6 +1751,10 @@ <h2 id="I">I</h2>
<table style="width: 100%" class="indextable genindextable"><tr>
<td style="width: 33%; vertical-align: top;"><ul>
<li><a href="torchrec.sparse.html#torchrec.sparse.jagged_tensor.KeyedJaggedTensor.index_per_key">index_per_key() (torchrec.sparse.jagged_tensor.KeyedJaggedTensor method)</a>
</li>
<li><a href="torchrec.distributed.html#torchrec.distributed.embedding_lookup.InferCPUGroupedEmbeddingsLookup">InferCPUGroupedEmbeddingsLookup (class in torchrec.distributed.embedding_lookup)</a>
</li>
<li><a href="torchrec.distributed.sharding.html#torchrec.distributed.sharding.rw_sharding.InferCPURwSparseFeaturesDist">InferCPURwSparseFeaturesDist (class in torchrec.distributed.sharding.rw_sharding)</a>
</li>
<li><a href="torchrec.distributed.sharding.html#torchrec.distributed.sharding.cw_sharding.InferCwPooledEmbeddingDist">InferCwPooledEmbeddingDist (class in torchrec.distributed.sharding.cw_sharding)</a>
</li>
@@ -1810,6 +1818,8 @@ <h2 id="I">I</h2>
<li><a href="torchrec.modules.html#torchrec.modules.mc_modules.MCHManagedCollisionModule.input_size">(torchrec.modules.mc_modules.MCHManagedCollisionModule method)</a>
</li>
</ul></li>
</ul></td>
<td style="width: 33%; vertical-align: top;"><ul>
<li><a href="torchrec.distributed.html#id21">input_sizes (torchrec.distributed.comm_ops.ReduceScatterBaseInfo attribute)</a>, <a href="torchrec.distributed.html#torchrec.distributed.comm_ops.ReduceScatterBaseInfo.input_sizes">[1]</a>

<ul>
@@ -1818,8 +1828,6 @@ <h2 id="I">I</h2>
<li><a href="torchrec.distributed.html#id25">(torchrec.distributed.comm_ops.ReduceScatterVInfo attribute)</a>, <a href="torchrec.distributed.html#torchrec.distributed.comm_ops.ReduceScatterVInfo.input_sizes">[1]</a>
</li>
</ul></li>
</ul></td>
<td style="width: 33%; vertical-align: top;"><ul>
<li><a href="torchrec.distributed.html#id18">input_split_sizes (torchrec.distributed.comm_ops.All2AllVInfo attribute)</a>, <a href="torchrec.distributed.html#torchrec.distributed.comm_ops.All2AllVInfo.input_split_sizes">[1]</a>
</li>
<li><a href="torchrec.distributed.html#torchrec.distributed.comm_ops.All2AllDenseInfo.input_splits">input_splits (torchrec.distributed.comm_ops.All2AllDenseInfo attribute)</a>
@@ -3838,6 +3846,8 @@ <h2 id="T">T</h2>
<li><a href="torchrec.distributed.html#torchrec.distributed.embedding_lookup.GroupedEmbeddingsLookup.training">(torchrec.distributed.embedding_lookup.GroupedEmbeddingsLookup attribute)</a>
</li>
<li><a href="torchrec.distributed.html#torchrec.distributed.embedding_lookup.GroupedPooledEmbeddingsLookup.training">(torchrec.distributed.embedding_lookup.GroupedPooledEmbeddingsLookup attribute)</a>
</li>
<li><a href="torchrec.distributed.html#torchrec.distributed.embedding_lookup.InferCPUGroupedEmbeddingsLookup.training">(torchrec.distributed.embedding_lookup.InferCPUGroupedEmbeddingsLookup attribute)</a>
</li>
<li><a href="torchrec.distributed.html#torchrec.distributed.embedding_lookup.InferGroupedEmbeddingsLookup.training">(torchrec.distributed.embedding_lookup.InferGroupedEmbeddingsLookup attribute)</a>
</li>
@@ -3884,6 +3894,8 @@ <h2 id="T">T</h2>
<li><a href="torchrec.distributed.sharding.html#torchrec.distributed.sharding.dp_sharding.DpPooledEmbeddingDist.training">(torchrec.distributed.sharding.dp_sharding.DpPooledEmbeddingDist attribute)</a>
</li>
<li><a href="torchrec.distributed.sharding.html#torchrec.distributed.sharding.dp_sharding.DpSparseFeaturesDist.training">(torchrec.distributed.sharding.dp_sharding.DpSparseFeaturesDist attribute)</a>
</li>
<li><a href="torchrec.distributed.sharding.html#torchrec.distributed.sharding.rw_sharding.InferCPURwSparseFeaturesDist.training">(torchrec.distributed.sharding.rw_sharding.InferCPURwSparseFeaturesDist attribute)</a>
</li>
<li><a href="torchrec.distributed.sharding.html#torchrec.distributed.sharding.rw_sharding.InferRwPooledEmbeddingDist.training">(torchrec.distributed.sharding.rw_sharding.InferRwPooledEmbeddingDist attribute)</a>
</li>
Binary file modified objects.inv
Binary file not shown.
2 changes: 1 addition & 1 deletion searchindex.js

Large diffs are not rendered by default.

20 changes: 18 additions & 2 deletions torchrec.distributed.html
@@ -4009,6 +4009,22 @@

</dd></dl>

<dl class="py class">
<dt class="sig sig-object py" id="torchrec.distributed.embedding_lookup.InferCPUGroupedEmbeddingsLookup">
<em class="property"><span class="pre">class</span><span class="w"> </span></em><span class="sig-prename descclassname"><span class="pre">torchrec.distributed.embedding_lookup.</span></span><span class="sig-name descname"><span class="pre">InferCPUGroupedEmbeddingsLookup</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">grouped_configs_per_rank</span></span><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="n"><span class="pre">List</span><span class="p"><span class="pre">[</span></span><span class="pre">List</span><span class="p"><span class="pre">[</span></span><a class="reference internal" href="#torchrec.distributed.embedding_types.GroupedEmbeddingConfig" title="torchrec.distributed.embedding_types.GroupedEmbeddingConfig"><span class="pre">GroupedEmbeddingConfig</span></a><span class="p"><span class="pre">]</span></span><span class="p"><span class="pre">]</span></span></span></em>, <em class="sig-param"><span class="n"><span class="pre">world_size</span></span><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="n"><span class="pre">int</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">fused_params</span></span><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="n"><span class="pre">Optional</span><span class="p"><span class="pre">[</span></span><span class="pre">Dict</span><span class="p"><span class="pre">[</span></span><span class="pre">str</span><span class="p"><span class="pre">,</span></span><span class="w"> </span><span class="pre">Any</span><span class="p"><span class="pre">]</span></span><span class="p"><span class="pre">]</span></span></span><span class="w"> </span><span class="o"><span class="pre">=</span></span><span class="w"> </span><span class="default_value"><span class="pre">None</span></span></em>, <em class="sig-param"><span class="n"><span 
class="pre">device</span></span><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="n"><span class="pre">Optional</span><span class="p"><span class="pre">[</span></span><span class="pre">device</span><span class="p"><span class="pre">]</span></span></span><span class="w"> </span><span class="o"><span class="pre">=</span></span><span class="w"> </span><span class="default_value"><span class="pre">None</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#torchrec.distributed.embedding_lookup.InferCPUGroupedEmbeddingsLookup" title="Permalink to this definition">¶</a></dt>
<dd><p>Bases: <a class="reference internal" href="#torchrec.distributed.embedding_lookup.InferGroupedLookupMixin" title="torchrec.distributed.embedding_lookup.InferGroupedLookupMixin"><code class="xref py py-class docutils literal notranslate"><span class="pre">InferGroupedLookupMixin</span></code></a>, <a class="reference internal" href="#torchrec.distributed.embedding_types.BaseEmbeddingLookup" title="torchrec.distributed.embedding_types.BaseEmbeddingLookup"><code class="xref py py-class docutils literal notranslate"><span class="pre">BaseEmbeddingLookup</span></code></a>[<a class="reference internal" href="#torchrec.distributed.embedding_types.KJTList" title="torchrec.distributed.embedding_types.KJTList"><code class="xref py py-class docutils literal notranslate"><span class="pre">KJTList</span></code></a>, <code class="xref py py-class docutils literal notranslate"><span class="pre">List</span></code>[<code class="xref py py-class docutils literal notranslate"><span class="pre">Tensor</span></code>]], <code class="xref py py-class docutils literal notranslate"><span class="pre">TBEToRegisterMixIn</span></code></p>
<dl class="py method">
<dt class="sig sig-object py" id="torchrec.distributed.embedding_lookup.InferCPUGroupedEmbeddingsLookup.get_tbes_to_register">
<span class="sig-name descname"><span class="pre">get_tbes_to_register</span></span><span class="sig-paren">(</span><span class="sig-paren">)</span> <span class="sig-return"><span class="sig-return-icon">&#x2192;</span> <span class="sig-return-typehint"><span class="pre">Dict</span><span class="p"><span class="pre">[</span></span><span class="pre">IntNBitTableBatchedEmbeddingBagsCodegen</span><span class="p"><span class="pre">,</span></span><span class="w"> </span><a class="reference internal" href="#torchrec.distributed.embedding_types.GroupedEmbeddingConfig" title="torchrec.distributed.embedding_types.GroupedEmbeddingConfig"><span class="pre">GroupedEmbeddingConfig</span></a><span class="p"><span class="pre">]</span></span></span></span><a class="headerlink" href="#torchrec.distributed.embedding_lookup.InferCPUGroupedEmbeddingsLookup.get_tbes_to_register" title="Permalink to this definition">¶</a></dt>
<dd></dd></dl>

<dl class="py attribute">
<dt class="sig sig-object py" id="torchrec.distributed.embedding_lookup.InferCPUGroupedEmbeddingsLookup.training">
<span class="sig-name descname"><span class="pre">training</span></span><em class="property"><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="pre">bool</span></em><a class="headerlink" href="#torchrec.distributed.embedding_lookup.InferCPUGroupedEmbeddingsLookup.training" title="Permalink to this definition">¶</a></dt>
<dd></dd></dl>

</dd></dl>
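The constructor signature documented above for the new `InferCPUGroupedEmbeddingsLookup` class can be illustrated with a minimal, self-contained Python sketch. Everything below is a hypothetical stand-in, not torchrec's implementation: `GroupedEmbeddingConfig` is reduced to a named placeholder, and `get_tbes_to_register` keys by plain objects instead of real `IntNBitTableBatchedEmbeddingBagsCodegen` modules.

```python
from typing import Any, Dict, List, Optional


class GroupedEmbeddingConfig:
    """Hypothetical stand-in for torchrec's GroupedEmbeddingConfig."""

    def __init__(self, name: str) -> None:
        self.name = name


class InferCPUGroupedEmbeddingsLookupSketch:
    """Toy sketch mirroring the documented signature: one list of
    GroupedEmbeddingConfig per rank, a world_size, and optional
    fused_params / device arguments."""

    def __init__(
        self,
        grouped_configs_per_rank: List[List[GroupedEmbeddingConfig]],
        world_size: int,
        fused_params: Optional[Dict[str, Any]] = None,
        device: Optional[str] = None,
    ) -> None:
        # One config list is expected per rank (assumed invariant).
        assert len(grouped_configs_per_rank) == world_size
        self.grouped_configs_per_rank = grouped_configs_per_rank
        self.world_size = world_size

    def get_tbes_to_register(self) -> Dict[object, GroupedEmbeddingConfig]:
        # The documented return type maps TBE modules to their configs;
        # here placeholder objects stand in for the TBE modules.
        return {
            object(): cfg
            for rank_configs in self.grouped_configs_per_rank
            for cfg in rank_configs
        }


lookup = InferCPUGroupedEmbeddingsLookupSketch(
    [[GroupedEmbeddingConfig("table_a")], [GroupedEmbeddingConfig("table_b")]],
    world_size=2,
)
print(len(lookup.get_tbes_to_register()))  # 2
```

The sketch only demonstrates the shape of the API (per-rank config nesting and the registration mapping), not CPU inference behavior.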

<dl class="py class">
<dt class="sig sig-object py" id="torchrec.distributed.embedding_lookup.InferGroupedEmbeddingsLookup">
<em class="property"><span class="pre">class</span><span class="w"> </span></em><span class="sig-prename descclassname"><span class="pre">torchrec.distributed.embedding_lookup.</span></span><span class="sig-name descname"><span class="pre">InferGroupedEmbeddingsLookup</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">grouped_configs_per_rank</span></span><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="n"><span class="pre">List</span><span class="p"><span class="pre">[</span></span><span class="pre">List</span><span class="p"><span class="pre">[</span></span><a class="reference internal" href="#torchrec.distributed.embedding_types.GroupedEmbeddingConfig" title="torchrec.distributed.embedding_types.GroupedEmbeddingConfig"><span class="pre">GroupedEmbeddingConfig</span></a><span class="p"><span class="pre">]</span></span><span class="p"><span class="pre">]</span></span></span></em>, <em class="sig-param"><span class="n"><span class="pre">world_size</span></span><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="n"><span class="pre">int</span></span></em>, <em class="sig-param"><span class="n"><span class="pre">fused_params</span></span><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="n"><span class="pre">Optional</span><span class="p"><span class="pre">[</span></span><span class="pre">Dict</span><span class="p"><span class="pre">[</span></span><span class="pre">str</span><span class="p"><span class="pre">,</span></span><span class="w"> </span><span class="pre">Any</span><span class="p"><span class="pre">]</span></span><span class="p"><span class="pre">]</span></span></span><span class="w"> </span><span class="o"><span class="pre">=</span></span><span class="w"> </span><span class="default_value"><span class="pre">None</span></span></em>, <em class="sig-param"><span class="n"><span 
class="pre">device</span></span><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="n"><span class="pre">Optional</span><span class="p"><span class="pre">[</span></span><span class="pre">device</span><span class="p"><span class="pre">]</span></span></span><span class="w"> </span><span class="o"><span class="pre">=</span></span><span class="w"> </span><span class="default_value"><span class="pre">None</span></span></em><span class="sig-paren">)</span><a class="headerlink" href="#torchrec.distributed.embedding_lookup.InferGroupedEmbeddingsLookup" title="Permalink to this definition">¶</a></dt>
@@ -6468,8 +6484,8 @@
<dd></dd></dl>

<dl class="py function">
<dt class="sig sig-object py" id="torchrec.distributed.quant_embeddingbag.get_device_from_sharding_type">
<span class="sig-prename descclassname"><span class="pre">torchrec.distributed.quant_embeddingbag.</span></span><span class="sig-name descname"><span class="pre">get_device_from_sharding_type</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">emb_shard_infos</span></span><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="n"><span class="pre">List</span><span class="p"><span class="pre">[</span></span><a class="reference internal" href="#torchrec.distributed.embedding_sharding.EmbeddingShardingInfo" title="torchrec.distributed.embedding_sharding.EmbeddingShardingInfo"><span class="pre">EmbeddingShardingInfo</span></a><span class="p"><span class="pre">]</span></span></span></em><span class="sig-paren">)</span> <span class="sig-return"><span class="sig-return-icon">&#x2192;</span> <span class="sig-return-typehint"><span class="pre">str</span></span></span><a class="headerlink" href="#torchrec.distributed.quant_embeddingbag.get_device_from_sharding_type" title="Permalink to this definition">¶</a></dt>
<dt class="sig sig-object py" id="torchrec.distributed.quant_embeddingbag.get_device_from_sharding_infos">
<span class="sig-prename descclassname"><span class="pre">torchrec.distributed.quant_embeddingbag.</span></span><span class="sig-name descname"><span class="pre">get_device_from_sharding_infos</span></span><span class="sig-paren">(</span><em class="sig-param"><span class="n"><span class="pre">emb_shard_infos</span></span><span class="p"><span class="pre">:</span></span><span class="w"> </span><span class="n"><span class="pre">List</span><span class="p"><span class="pre">[</span></span><a class="reference internal" href="#torchrec.distributed.embedding_sharding.EmbeddingShardingInfo" title="torchrec.distributed.embedding_sharding.EmbeddingShardingInfo"><span class="pre">EmbeddingShardingInfo</span></a><span class="p"><span class="pre">]</span></span></span></em><span class="sig-paren">)</span> <span class="sig-return"><span class="sig-return-icon">&#x2192;</span> <span class="sig-return-typehint"><span class="pre">str</span></span></span><a class="headerlink" href="#torchrec.distributed.quant_embeddingbag.get_device_from_sharding_infos" title="Permalink to this definition">¶</a></dt>
<dd></dd></dl>

</section>
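The rename above (`get_device_from_sharding_type` → `get_device_from_sharding_infos`) keeps the documented signature: a `List[EmbeddingShardingInfo]` in, a device string out. A minimal sketch of a plausible contract follows; the `EmbeddingShardingInfo` stand-in and the all-shards-agree check are assumptions for illustration, not torchrec's actual logic.

```python
from typing import List


class EmbeddingShardingInfo:
    """Hypothetical stand-in carrying only a device string."""

    def __init__(self, device: str) -> None:
        self.device = device


def get_device_from_sharding_infos(
    emb_shard_infos: List[EmbeddingShardingInfo],
) -> str:
    # Assumed contract: every sharding info names the same device,
    # and that shared device is returned.
    devices = {info.device for info in emb_shard_infos}
    assert len(devices) == 1, "expected a single device across sharding infos"
    return devices.pop()


infos = [EmbeddingShardingInfo("cpu"), EmbeddingShardingInfo("cpu")]
print(get_device_from_sharding_infos(infos))  # cpu
```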