This demonstration uses GATE's bootstrapped training data and 32-dimensional embeddings, with their mostly-PTB tagset.
Tokenizer implementation by Myle Ott.
Important disclaimer: I'm not responsible for any tweets shown here, or the contents of any external links or images they contain.
Questions? Feedback? Comments? Send a tweet, or email
richard<at>sentimentron.co.uk