Deploying to gh-pages from @ 10c07a9 🚀
facebook-github-bot committed Apr 14, 2024
1 parent 230e459 commit 8d55657
Showing 5 changed files with 34 additions and 11 deletions.
14 changes: 10 additions & 4 deletions genindex.html
@@ -873,6 +873,8 @@ <h2 id="C">C</h2>
 <li><a href="torchrec.modules.html#torchrec.modules.crossnet.CrossNet">CrossNet (class in torchrec.modules.crossnet)</a>
 </li>
 <li><a href="torchrec.distributed.html#id2">cumsum_dim_sum_per_rank_tensor (torchrec.distributed.comm_ops.All2AllPooledInfo attribute)</a>, <a href="torchrec.distributed.html#torchrec.distributed.comm_ops.All2AllPooledInfo.cumsum_dim_sum_per_rank_tensor">[1]</a>
 </li>
+<li><a href="torchrec.distributed.planner.html#torchrec.distributed.planner.types.CustomTopologyData">CustomTopologyData (class in torchrec.distributed.planner.types)</a>
+</li>
 <li><a href="torchrec.distributed.sharding.html#torchrec.distributed.sharding.cw_sharding.CwPooledEmbeddingSharding">CwPooledEmbeddingSharding (class in torchrec.distributed.sharding.cw_sharding)</a>
 </li>
@@ -1324,8 +1326,6 @@ <h2 id="F">F</h2>
 <li><a href="torchrec.distributed.sharding.html#torchrec.distributed.sharding.tw_sharding.BaseTwEmbeddingSharding.features_per_rank">(torchrec.distributed.sharding.tw_sharding.BaseTwEmbeddingSharding method)</a>
 </li>
 </ul></li>
-<li><a href="torchrec.quant.html#torchrec.quant.embedding_modules.features_to_dict">features_to_dict() (in module torchrec.quant.embedding_modules)</a>
-</li>
 <li><a href="torchrec.distributed.html#torchrec.distributed.embedding_types.FeatureShardingMixIn">FeatureShardingMixIn (class in torchrec.distributed.embedding_types)</a>
 </li>
 <li><a href="torchrec.distributed.planner.html#torchrec.distributed.planner.proposers.EmbeddingOffloadScaleupProposer.feedback">feedback() (torchrec.distributed.planner.proposers.EmbeddingOffloadScaleupProposer method)</a>
@@ -1637,6 +1637,8 @@ <h2 id="G">G</h2>
 <li><a href="torchrec.distributed.planner.html#torchrec.distributed.planner.proposers.EmbeddingOffloadScaleupProposer.get_budget">get_budget() (torchrec.distributed.planner.proposers.EmbeddingOffloadScaleupProposer static method)</a>
 </li>
 <li><a href="torchrec.distributed.planner.html#torchrec.distributed.planner.proposers.EmbeddingOffloadScaleupProposer.get_cacheability">get_cacheability() (torchrec.distributed.planner.proposers.EmbeddingOffloadScaleupProposer static method)</a>
 </li>
+<li><a href="torchrec.distributed.planner.html#torchrec.distributed.planner.types.CustomTopologyData.get_data">get_data() (torchrec.distributed.planner.types.CustomTopologyData method)</a>
+</li>
 <li><a href="torchrec.distributed.html#torchrec.distributed.quant_embeddingbag.get_device_from_parameter_sharding">get_device_from_parameter_sharding() (in module torchrec.distributed.quant_embeddingbag)</a>
 </li>
@@ -1724,18 +1726,20 @@ <h2 id="G">G</h2>
 <h2 id="H">H</h2>
 <table style="width: 100%" class="indextable genindextable"><tr>
 <td style="width: 33%; vertical-align: top;"><ul>
+<li><a href="torchrec.distributed.planner.html#torchrec.distributed.planner.types.CustomTopologyData.has_data">has_data() (torchrec.distributed.planner.types.CustomTopologyData method)</a>
+</li>
 <li><a href="torchrec.distributed.html#torchrec.distributed.embedding_types.GroupedEmbeddingConfig.has_feature_processor">has_feature_processor (torchrec.distributed.embedding_types.GroupedEmbeddingConfig attribute)</a>
 
 <ul>
 <li><a href="torchrec.modules.html#torchrec.modules.embedding_configs.EmbeddingTableConfig.has_feature_processor">(torchrec.modules.embedding_configs.EmbeddingTableConfig attribute)</a>
 </li>
 </ul></li>
 <li><a href="torchrec.distributed.planner.html#torchrec.distributed.planner.types.Storage.hbm">hbm (torchrec.distributed.planner.types.Storage attribute)</a>
 </li>
-<li><a href="torchrec.distributed.html#torchrec.distributed.types.ParameterStorage.HBM">HBM (torchrec.distributed.types.ParameterStorage attribute)</a>
-</li>
 </ul></td>
 <td style="width: 33%; vertical-align: top;"><ul>
+<li><a href="torchrec.distributed.html#torchrec.distributed.types.ParameterStorage.HBM">HBM (torchrec.distributed.types.ParameterStorage attribute)</a>
+</li>
 <li><a href="torchrec.distributed.planner.html#torchrec.distributed.planner.types.Topology.hbm_mem_bw">hbm_mem_bw (torchrec.distributed.planner.types.Topology property)</a>
 </li>
 <li><a href="torchrec.distributed.planner.html#torchrec.distributed.planner.planners.HeteroEmbeddingShardingPlanner">HeteroEmbeddingShardingPlanner (class in torchrec.distributed.planner.planners)</a>
@@ -3278,6 +3282,8 @@ <h2 id="S">S</h2>
 <li><a href="torchrec.sparse.html#torchrec.sparse.jagged_tensor.KeyedJaggedTensor.stride_per_key_per_rank">stride_per_key_per_rank() (torchrec.sparse.jagged_tensor.KeyedJaggedTensor method)</a>
 </li>
 <li><a href="torchrec.modules.html#torchrec.modules.embedding_configs.PoolingType.SUM">SUM (torchrec.modules.embedding_configs.PoolingType attribute)</a>
 </li>
+<li><a href="torchrec.distributed.planner.html#torchrec.distributed.planner.types.CustomTopologyData.supported_fields">supported_fields (torchrec.distributed.planner.types.CustomTopologyData attribute)</a>
+</li>
 <li><a href="torchrec.modules.html#torchrec.modules.activation.SwishLayerNorm">SwishLayerNorm (class in torchrec.modules.activation)</a>
 </li>
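The genindex hunks above record the members this commit adds to the docs for a new planner type: the class `CustomTopologyData` in `torchrec.distributed.planner.types`, its methods `get_data()` and `has_data()`, and its `supported_fields` attribute. The real signatures live in that module; the sketch below is only an illustration of the indexed surface, with the constructor, field names, and validation contract assumed rather than taken from this commit.

```python
from typing import Dict, List


class CustomTopologyData:
    """Illustrative stand-in for torchrec.distributed.planner.types.CustomTopologyData.

    Only the member names (get_data, has_data, supported_fields) come from the
    genindex diff; everything else here is an assumption for demonstration.
    """

    # Assumed: fields a custom topology is allowed to override per device.
    supported_fields = ["ddr_cap", "hbm_cap"]

    def __init__(self, data: Dict[str, List[int]]) -> None:
        # One plausible contract: reject fields outside the supported set.
        for field in data:
            if field not in self.supported_fields:
                raise ValueError(f"unsupported topology field: {field}")
        self._data = data

    def get_data(self, key: str) -> List[int]:
        # Per-device values for a supported field, e.g. HBM capacity per rank.
        return self._data[key]

    def has_data(self, key: str) -> bool:
        return key in self._data


# Example: override per-rank HBM capacities (bytes) for two devices.
topo = CustomTopologyData({"hbm_cap": [32 * 1024**3, 16 * 1024**3]})
print(topo.has_data("hbm_cap"))  # True
print(topo.has_data("ddr_cap"))  # False
```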
Binary file modified objects.inv
Binary file not shown.
2 changes: 1 addition & 1 deletion searchindex.js

Large diffs are not rendered by default.

