
# Transformer-TTS

## Training

1. Download and extract the LJ Speech dataset.
2. Create a `preprocessed` folder in the LJSpeech directory, and create `char_seq`, `phone_seq`, and `melspectrogram` folders inside it.
3. Set `data_path` in `hparams.py` to the LJSpeech folder.
4. Using `prepare_data.ipynb`, prepare the melspectrogram and text (converted into index) tensors; a rough sketch of this step follows the list.
5. Run `python train.py`.
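
To make step 4 concrete, here is a minimal sketch of the preprocessing, not the actual `prepare_data.ipynb`: it assumes `librosa` for mel extraction, and the character table and STFT parameters below are illustrative guesses (the real values live in `hparams.py`). Phoneme sequences for `phone_seq` would be produced analogously with a g2p tool.

```python
# Minimal preprocessing sketch (NOT the actual prepare_data.ipynb).
# All hyperparameters and the character table are assumptions.
import os
import librosa
import numpy as np
import torch

DATA_PATH = "LJSpeech-1.1"                          # should match data_path in hparams.py
CHARS = "abcdefghijklmnopqrstuvwxyz '.,?!"          # hypothetical symbol set
CHAR2IDX = {c: i + 1 for i, c in enumerate(CHARS)}  # 0 reserved for padding

def text_to_indices(text: str) -> torch.Tensor:
    """Map a transcript to a 1-D tensor of character indices."""
    return torch.LongTensor([CHAR2IDX[c] for c in text.lower() if c in CHAR2IDX])

def wav_to_mel(wav_path: str) -> torch.Tensor:
    """Compute a log-mel spectrogram; 80 mels at 22.05 kHz are assumed values."""
    wav, sr = librosa.load(wav_path, sr=22050)
    mel = librosa.feature.melspectrogram(y=wav, sr=sr, n_fft=1024,
                                         hop_length=256, n_mels=80)
    return torch.from_numpy(np.log(np.clip(mel, 1e-5, None)))

# For each (file_id, transcript) pair read from metadata.csv:
file_id, transcript = "LJ001-0001", "Printing, in the only sense ..."
root = os.path.join(DATA_PATH, "preprocessed")
torch.save(text_to_indices(transcript), os.path.join(root, "char_seq", f"{file_id}.pt"))
torch.save(wav_to_mel(os.path.join(DATA_PATH, "wavs", f"{file_id}.wav")),
           os.path.join(root, "melspectrogram", f"{file_id}.pt"))
```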

## Training curve (Orange: character / Blue: phoneme)

- Stop prediction loss (train / val)
- Guided attention loss (train / val)
- L1 loss (train / val)

## Alignments (Left: character / Right: phoneme)

- Encoder Alignments
- Decoder Alignments
- Encoder-Decoder Alignments
- Melspectrogram (target / before / after POSTNET)
- Stop prediction

## Audio Samples

You can listen to the audio samples here.

## Notice

1. Unlike the original paper, I did not use an encoder prenet, following ESPnet.
2. I apply an additional "guided attention loss" to the two heads of the last two layers (sketched after this list).
3. Batch size is important, so I use gradient accumulation (also sketched below).
4. You can also use `DataParallel`; set `n_gpus`, `batch_size`, and `accumulation` appropriately.
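
For point 2, here is a minimal sketch of a diagonal guided attention loss in the style of Tachibana et al. (2018); the repo's exact weighting and masking may differ, and the width `g = 0.2` is an assumed default.

```python
import torch

def guided_attention_loss(attn: torch.Tensor, g: float = 0.2) -> torch.Tensor:
    """attn: (batch, T_dec, T_enc) weights of the heads being regularized."""
    _, t_dec, t_enc = attn.size()
    dec_pos = torch.arange(t_dec, device=attn.device).float() / t_dec  # n / N
    enc_pos = torch.arange(t_enc, device=attn.device).float() / t_enc  # t / T
    # W[n, t] = 1 - exp(-((n/N - t/T)^2) / (2 g^2)): near zero on the diagonal,
    # so off-diagonal attention mass is penalized.
    w = 1.0 - torch.exp(-((dec_pos[:, None] - enc_pos[None, :]) ** 2) / (2 * g ** 2))
    return (attn * w.unsqueeze(0)).mean()
```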
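
And for point 3, a sketch of a gradient accumulation loop; `accumulation` mirrors the hyperparameter named above, everything else is illustrative.

```python
def train_epoch(model, loader, optimizer, accumulation: int = 8):
    """The effective batch size is the loader's batch size times
    `accumulation`, at the memory cost of a single micro-batch."""
    optimizer.zero_grad()
    for step, batch in enumerate(loader, start=1):
        loss = model(batch)               # assumed to return the total loss
        (loss / accumulation).backward()  # scale so accumulated grads average
        if step % accumulation == 0:
            optimizer.step()              # one update per `accumulation` batches
            optimizer.zero_grad()
```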

## TODO

1. Dynamic batching

## Fastspeech

1. For FastSpeech, the generated melspectrograms and attention matrices should be saved for later use.
   1-1. Set `teacher_path` in `hparams.py` and create `alignments` and `targets` directories there.
   1-2. Using `prepare_fastspeech.ipynb`, prepare the alignments and targets.

2. To draw attention plots for each head, I changed the return value of `torch.nn.functional.multi_head_attention_forward()`:

```python
# before: weights averaged over heads
return attn_output, attn_output_weights.sum(dim=1) / num_heads

# after: per-head weights returned unchanged
return attn_output, attn_output_weights
```
3. Among the `num_layers * num_heads` attention matrices, the one with the highest focus rate is saved (see the sketch below).
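
As a reference for step 3, here is a minimal sketch of the focus rate as defined in the FastSpeech paper (Ren et al., 2019), assuming each candidate is a `(T_dec, T_enc)` attention matrix.

```python
import torch

def focus_rate(attn: torch.Tensor) -> torch.Tensor:
    """F = (1/T) * sum_t max_s attn[t, s]: how sharply each decoder
    step attends to a single encoder position. attn: (T_dec, T_enc)."""
    return attn.max(dim=1).values.mean()

# Keep the most diagonal of the num_layers * num_heads candidates:
# best = max(per_head_matrices, key=focus_rate)
```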

## Reference

1. NVIDIA/tacotron2: https://github.com/NVIDIA/tacotron2
2. espnet/espnet: https://github.com/espnet/espnet
3. soobinseo/Transformer-TTS: https://github.com/soobinseo/Transformer-TTS