
Commit

#8764: Add note about PYTHON_ENV_DIR in the Models getting started and put notes about ttnn jupyter tutorials only working on GS for now
tt-rkim committed Jun 3, 2024
1 parent 6a2d4b8 commit 09d74ad
Showing 2 changed files with 23 additions and 13 deletions.
4 changes: 4 additions & 0 deletions docs/source/ttnn/tt_metal_models/get_started.rst
@@ -17,6 +17,10 @@ which you'll be working.

source python_env/bin/activate

.. note::
   You can use the ``PYTHON_ENV_DIR`` environment variable with the provided
   ``create_venv.sh`` script to control where the environment is created.

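For example, a minimal sketch of this workflow (assumptions: ``create_venv.sh`` is run from the repository root, and ``$HOME/venvs/ttnn`` is only an illustrative target path):

::

    # Assumption: run from the repository root; the target path is only an example.
    export PYTHON_ENV_DIR=$HOME/venvs/ttnn
    ./create_venv.sh
    source $PYTHON_ENV_DIR/bin/activate
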
Set ``PYTHONPATH`` to the repository root for running models. This is a common practice.

::
32 changes: 19 additions & 13 deletions docs/source/ttnn/ttnn/get_started.rst
@@ -18,19 +18,7 @@ Install and build the project by following the instructions in the
`installation guide
<../ttnn/installing.html>`_.

2. TT-NN Tutorial: Multi-Head Attention (Simple)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Learn the basics of multi-head attention operations with TT-NN
with a simple example: `TT-NN simple module <../../ttnn/ttnn/tutorials/ttnn_tutorials/003.html#Write-Multi-Head-Attention-using-ttnn>`_.

3. TT-NN Tutorial: Multi-Head Attention (Optimized)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Dive deeper into multi-head attention operations in TT-NN, optimizing
performance: `optimizing performance <../../ttnn/ttnn/tutorials/ttnn_tutorials/003.html#Write-optimized-version-of-Multi-Head-Attention>`_.

4. Explore our model demos
2. Explore our model demos
^^^^^^^^^^^^^^^^^^^^^^^^^^

Get started with the Falcon 7B demo. Navigate to the `Falcon 7B demo folder
@@ -44,6 +32,24 @@ You can also check our demos for
and
`Llama2-70B (coming soon on our T3000 platforms) <https://github.com/tenstorrent/tt-metal/tree/main/models/demos/t3000/llama2_70b>`_.

3. TT-NN Tutorial: Multi-Head Attention (Simple)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. note::
   This tutorial currently works on Grayskull only.

Learn the basics of multi-head attention operations in TT-NN
with a simple example: `TT-NN simple module <../../ttnn/ttnn/tutorials/ttnn_tutorials/003.html#Write-Multi-Head-Attention-using-ttnn>`_.

4. TT-NN Tutorial: Multi-Head Attention (Optimized)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. note::
   This tutorial currently works on Grayskull only.

Dive deeper into multi-head attention operations in TT-NN with an optimized
implementation: `optimizing performance <../../ttnn/ttnn/tutorials/ttnn_tutorials/003.html#Write-optimized-version-of-Multi-Head-Attention>`_.

Where to go from here
^^^^^^^^^^^^^^^^^^^^^

