Commit

Merge branch 'k2-fsa:master' into dev/cv-zipformer
JinZr authored Mar 23, 2024
2 parents c274003 + bb9ebcf commit 1d92107
Showing 7 changed files with 545 additions and 4 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/ljspeech.yml
@@ -90,7 +90,7 @@ jobs:
path: ./*.wav

- name: Release exported onnx models
-        if: matrix.python-version == '3.9' && matrix.torch-version == '2.2.0'
+        if: matrix.python-version == '3.9' && matrix.torch-version == '2.2.0' && github.event_name == 'push'
uses: svenstaro/upload-release-action@v2
with:
file_glob: true
4 changes: 4 additions & 0 deletions docs/source/for-dummies/environment-setup.rst
@@ -74,6 +74,10 @@ to install dependencies of `icefall`_:
pip install k2==1.24.4.dev20231220+cpu.torch2.0.0 -f https://k2-fsa.github.io/k2/cpu.html
# For users from China
# Users in mainland China: if you cannot access huggingface, please use
# pip install k2==1.24.4.dev20231220+cpu.torch2.0.0 -f https://k2-fsa.github.io/k2/cpu-cn.html
# Install the latest version of lhotse
pip install git+https://github.com/lhotse-speech/lhotse
3 changes: 3 additions & 0 deletions docs/source/installation/index.rst
@@ -206,6 +206,9 @@ We will install `k2`_ from pre-compiled wheels by following
.. code-block:: bash
(test-icefall) kuangfangjun:~$ pip install k2==1.24.3.dev20230725+cuda11.6.torch1.13.0 -f https://k2-fsa.github.io/k2/cuda.html
# For users from China
# Users in mainland China: if you cannot access huggingface, please use
# pip install k2==1.24.3.dev20230725+cuda11.6.torch1.13.0 -f https://k2-fsa.github.io/k2/cuda-cn.html
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Looking in links: https://k2-fsa.github.io/k2/cuda.html
2 changes: 1 addition & 1 deletion egs/librispeech/ASR/zipformer/zipformer.py
@@ -788,7 +788,7 @@ def forward(
selected_attn_weights = attn_weights[0:1]
if torch.jit.is_scripting() or torch.jit.is_tracing():
pass
-        elif not self.training and random.random() < float(self.const_attention_rate):
+        elif self.training and random.random() < float(self.const_attention_rate):
# Make attention weights constant. The intention is to
# encourage these modules to do something similar to an
# averaging-over-time operation.
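The corrected condition applies the constant-attention trick only in training mode. A minimal standalone sketch of that logic, using plain nested lists in place of the attention tensors and a hypothetical `maybe_constant_attn` helper (not part of zipformer.py):

```python
import random

def maybe_constant_attn(attn_weights, const_attention_rate, training):
    """With probability `const_attention_rate`, and only during training,
    replace every attention row with a uniform distribution over time,
    nudging the module toward an averaging-over-time operation.

    Hypothetical sketch: nested lists stand in for attention tensors.
    """
    if training and random.random() < float(const_attention_rate):
        # Each row sums to 1 and spreads weight evenly over all positions.
        n = len(attn_weights[0])
        return [[1.0 / n] * n for _ in attn_weights]
    return attn_weights
```

With `const_attention_rate=1.0` the rows always collapse to uniform weights in training mode, while in eval mode (`training=False`) the weights pass through unchanged.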
6 changes: 4 additions & 2 deletions egs/librispeech/ASR/zipformer_adapter/export-onnx.py
@@ -27,11 +27,13 @@
2. Export the model to ONNX
-./zipformer/export-onnx.py \
+./zipformer_adapter/export-onnx.py \
--tokens $repo/data/lang_bpe_500/tokens.txt \
--use-averaged-model 0 \
--epoch 99 \
--avg 1 \
+  --use-adapters 1 \
+  --adapter-dim 32 \
--exp-dir $repo/exp \
--num-encoder-layers "2,2,3,4,3,2" \
--downsampling-factor "1,2,4,8,4,2" \
@@ -131,7 +133,7 @@ def get_parser():
parser.add_argument(
"--exp-dir",
type=str,
-        default="zipformer/exp",
+        default="zipformer_adapter/exp",
help="""It specifies the directory where all training related
files, e.g., checkpoints, log, etc, are saved
""",

0 comments on commit 1d92107
