Dear @kathrinse,
Thanks for your excellent study on deep neural networks for tabular data! I came across your repository and paper during the research for my ongoing master's thesis.
I noticed two aspects that I wanted to share with you:
1. You base your implementation of the TabTransformer on this repository. The implementation, however, deviates from the paper (http://arxiv.org/abs/2012.06678) regarding the column embedding, which is one of the authors' contributions. The paper proposes a column embedding that consists of a feature-value-specific embedding and a shared embedding; the two are either concatenated or added element-wise. The shared embedding is important in the authors' work (see pp. 10-11), yet the used implementation doesn't implement any shared embedding (a sketch of the paper's scheme follows below).
2. The implementation also introduces a scaling factor of `input_size // 8` for the hidden dimensions of the MLP (see https://github.com/kathrinse/TabSurvey/blob/main/models/tabtransformer.py), which I could not find in the paper (see p. 3). Thus, the net might have a much smaller capacity (illustrated in the second sketch below).
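For reference, here is a minimal PyTorch sketch of the column embedding as I read it from the paper. All names (`ColumnEmbedding`, `shared_dim`, etc.) are mine, not TabSurvey's, and this shows the concatenation variant; the paper also evaluates adding both parts element-wise at full dimension:

```python
import torch
import torch.nn as nn

class ColumnEmbedding(nn.Module):
    """Sketch of the paper's column embedding (concatenation variant):
    a unique per-value embedding of size dim - shared_dim, concatenated
    with a per-column shared embedding of size shared_dim."""

    def __init__(self, cardinalities, dim, shared_dim):
        super().__init__()
        # one table over all feature values; offsets keep each column's ids disjoint
        offsets = torch.tensor([0] + list(cardinalities[:-1])).cumsum(0)
        self.register_buffer("offsets", offsets)
        self.unique = nn.Embedding(sum(cardinalities), dim - shared_dim)
        # one learned vector per column, shared by all values of that column
        self.shared = nn.Parameter(torch.randn(len(cardinalities), shared_dim))

    def forward(self, x_cat):
        # x_cat: (batch, n_cols) integer category indices
        uniq = self.unique(x_cat + self.offsets)                       # (batch, n_cols, dim - shared_dim)
        shar = self.shared.unsqueeze(0).expand(x_cat.size(0), -1, -1)  # (batch, n_cols, shared_dim)
        return torch.cat([uniq, shar], dim=-1)                         # (batch, n_cols, dim)
```

With, say, `dim=32` and `shared_dim=8`, each token carries 24 value-specific plus 8 column-shared dimensions; the shared part is what the ablation on pp. 10-11 studies, and it is exactly this part that I cannot find in the used implementation.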
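And a hypothetical illustration of the capacity point. `make_mlp` and its defaults are my own invention, chosen only to contrast hidden widths proportional to the input size with the same widths divided by 8:

```python
import torch.nn as nn

def make_mlp(input_size, mults=(4, 2), shrink=1):
    # hidden widths: input_size * m // shrink for each multiplier m
    dims = [input_size] + [input_size * m // shrink for m in mults] + [1]
    layers = []
    for d_in, d_out in zip(dims[:-1], dims[1:]):
        layers += [nn.Linear(d_in, d_out), nn.ReLU()]
    return nn.Sequential(*layers[:-1])  # no activation after the output layer

proportional = make_mlp(64)            # hidden sizes 256, 128 -> ~49.7k parameters
shrunk       = make_mlp(64, shrink=8)  # hidden sizes  32,  16 -> ~2.6k parameters
```

For the same input width, that is roughly a 19x reduction in parameters, which is what I mean by "much smaller capacity".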
Sorry if I have missed something.
Keep up the excellent work 💯
Best,
Markus