This system is part of a paper accepted at the NAACL-HLT 2016 conference. See the paper here: http://arxiv.org/pdf/1603.01360v1.pdf
John Smith went to Pittsburgh .
PER  PER   O    O  LOC        O
Corresponding sequence of operations (generated by convert-conll2trans.pl):
SHIFT
SHIFT
REDUCE(PER)
OUT
OUT
SHIFT
REDUCE(LOC)
OUT
- buffer - sequence of tokens, read from left to right
- stack - working memory
- output buffer - sequence of labeled segments, constructed from left to right

SHIFT - moves a word from the front of the buffer to the top of the stack
REDUCE(X) - pops all words from the stack, combines them into a single segment, labels it with X, and copies it to the output buffer
OUT - moves one token from the buffer directly to the output buffer
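To make the transition system concrete, here is a small illustrative sketch (our own, not part of the released code) that replays the gold action sequence above over the example sentence and recovers the labeled segments:

```python
# Illustrative sketch of the transition system: replay a sequence of
# SHIFT / REDUCE(X) / OUT actions over a token buffer.
def replay(tokens, actions):
    buffer = list(tokens)   # tokens read from left to right
    stack = []              # working memory
    output = []             # labeled segments, built left to right
    for act in actions:
        if act == "SHIFT":
            stack.append(buffer.pop(0))           # buffer -> top of stack
        elif act == "OUT":
            output.append((buffer.pop(0), "O"))   # buffer -> output buffer
        elif act.startswith("REDUCE("):
            label = act[len("REDUCE("):-1]        # e.g. "PER", "LOC"
            output.append((" ".join(stack), label))  # pop all, label, emit
            stack = []
    return output

segments = replay(
    ["John", "Smith", "went", "to", "Pittsburgh", "."],
    ["SHIFT", "SHIFT", "REDUCE(PER)", "OUT", "OUT", "SHIFT", "REDUCE(LOC)", "OUT"],
)
# segments == [("John Smith", "PER"), ("went", "O"), ("to", "O"),
#              ("Pittsburgh", "LOC"), (".", "O")]
```

The REDUCE action always empties the whole stack, so multi-word entities like "John Smith" come out as a single labeled segment.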
We use the datasets from the CoNLL-2002 and CoNLL-2003 shared tasks.
Convert the CoNLL format to NER actions (convert-conll2trans.pl), then convert those into a parser-friendly format (conll2parser.py):
perl convert-conll2trans.pl conll2003/train > conll2003/train.trans
python conll2parser.py -f conll2003/train.trans > conll2003/train.parser
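The tag-to-action conversion done by convert-conll2trans.pl can be sketched roughly as follows. This is our own approximation assuming IOB-style tags; the Perl script remains the authoritative implementation:

```python
# Rough sketch (not the released script) of mapping IOB tags to
# SHIFT / REDUCE(X) / OUT actions.
def tags_to_actions(tags):
    actions = []
    run = None                    # label of the entity currently being built
    for tag in tags + ["O"]:      # sentinel "O" flushes a trailing entity
        label = tag.split("-")[-1] if tag != "O" else None
        # Close the current entity when it ends or a new one begins.
        if run is not None and (tag == "O" or tag.startswith("B-") or label != run):
            actions.append("REDUCE(%s)" % run)
            run = None
        if tag == "O":
            actions.append("OUT")
        else:
            actions.append("SHIFT")
            run = label
    return actions[:-1]           # drop the action emitted for the sentinel

print(tags_to_actions(["B-PER", "I-PER", "O", "O", "B-LOC", "O"]))
# ['SHIFT', 'SHIFT', 'REDUCE(PER)', 'OUT', 'OUT', 'SHIFT', 'REDUCE(LOC)', 'OUT']
```

Applied to the example sentence above, this reproduces the action sequence shown earlier.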
Link to the word vectors that we used in the NAACL 2016 paper for English: sskip.100.vectors.
The first time you clone the repository, you need to sync the cnn/ submodule.
git submodule init
git submodule update
mkdir build
cd build
cmake .. -DEIGEN3_INCLUDE_DIR=/path/to/eigen
make -j2
./lstm-parse -T conll2003/train.parser -d conll2003/dev.parser --hidden_dim 100 --lstm_input_dim 100 -w sskip.100.vectors --pretrained_dim 100 --rel_dim 20 --action_dim 20 --input_dim 100 -t -S -D 0.3 > logNERYesCharNoPosYesEmbeddingsD0.3.txt &
./lstm-parse -T conll2003/train.parser -d conll2003/test.parser --hidden_dim 100 --lstm_input_dim 100 -w sskip.100.vectors --pretrained_dim 100 --rel_dim 20 --action_dim 20 --input_dim 100 -m latest_model -S > output.txt
python attach_prediction.py -p output.txt -t conll2003/test -o evaloutput.txt
Attach your predictions to the test file:
python attach_prediction.py -p (prediction) -t /path/to/conll2003/test -o (output file)
./conlleval < (output file)
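For reference, the entity-level scoring performed by conlleval can be sketched as follows: a predicted entity counts as correct only if both its span and its label exactly match a gold entity. This is a simplified approximation; the conlleval script is the reference implementation.

```python
# Simplified sketch of conlleval-style entity-level F1 (not a replacement
# for the official script).
def spans(tags):
    """Extract (start, end, label) spans from an IOB tag sequence."""
    out, start, label = set(), None, None
    for i, tag in enumerate(tags + ["O"]):   # sentinel closes a final entity
        if start is not None and (tag == "O" or tag.startswith("B-")
                                  or tag.split("-")[-1] != label):
            out.add((start, i, label))
            start = None
        if tag != "O" and start is None:
            start, label = i, tag.split("-")[-1]
    return out

def f1(gold_tags, pred_tags):
    gold, pred = spans(gold_tags), spans(pred_tags)
    correct = len(gold & pred)               # exact span + label matches
    p = correct / len(pred) if pred else 0.0
    r = correct / len(gold) if gold else 0.0
    return 2 * p * r / (p + r) if p + r else 0.0
```

Note that per-token accuracy can be high while entity-level F1 is low, which is why the shared tasks report the latter.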
If you make use of this software, please cite the following:
@inproceedings{2016naacl,
author={Guillaume Lample and Miguel Ballesteros and Sandeep Subramanian and Kazuya Kawakami and Chris Dyer},
title={Neural Architectures for Named Entity Recognition},
booktitle={Proc. NAACL-HLT},
year=2016,
}
This software is released under the terms of the Apache License, Version 2.0.
For questions and usage issues, please contact [email protected]