Inspired by CharSCNN [1,3], Dracula combines character-level embeddings with two levels of deep LSTM representation, achieving strong performance with small model sizes.
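The two-level idea can be sketched in plain NumPy: a character-level LSTM turns each word's character embeddings into a word vector, and a second LSTM reads those word vectors to represent the whole tweet. This is a minimal illustration, not Dracula's actual Theano code; all names, dimensions, and the random initialisation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W, U, b):
    # One LSTM step: all four gates computed from input x and previous hidden h.
    z = W @ x + U @ h + b
    n = h.size
    i = 1 / (1 + np.exp(-z[:n]))        # input gate
    f = 1 / (1 + np.exp(-z[n:2*n]))     # forget gate
    o = 1 / (1 + np.exp(-z[2*n:3*n]))   # output gate
    g = np.tanh(z[3*n:])                # candidate cell state
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def run_lstm(xs, dim, params):
    # Run an LSTM over a sequence; the final hidden state summarises it.
    W, U, b = params
    h, c = np.zeros(dim), np.zeros(dim)
    for x in xs:
        h, c = lstm_step(x, h, c, W, U, b)
    return h

def make_params(in_dim, dim):
    # Hypothetical random initialisation for illustration only.
    return (rng.normal(0, 0.1, (4 * dim, in_dim)),
            rng.normal(0, 0.1, (4 * dim, dim)),
            np.zeros(4 * dim))

CHAR_DIM, WORD_DIM = 16, 32                       # illustrative sizes
char_emb = rng.normal(0, 0.1, (128, CHAR_DIM))    # one row per ASCII code
char_lstm = make_params(CHAR_DIM, WORD_DIM)
word_lstm = make_params(WORD_DIM, WORD_DIM)

def encode(tweet):
    # Level 1: character LSTM builds each word's vector from its characters.
    word_vecs = [run_lstm([char_emb[ord(ch)] for ch in w], WORD_DIM, char_lstm)
                 for w in tweet.split()]
    # Level 2: word LSTM reads the word vectors to represent the tweet.
    return run_lstm(word_vecs, WORD_DIM, word_lstm)

vec = encode("so happy it works")
print(vec.shape)  # → (32,)
```

Because every word vector is built from characters, there is no fixed word vocabulary: misspellings, hashtags, and novel tokens all receive representations for free.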
Dracula's performance (with 128-dimensional word embeddings) on GATE's TweetIE T-Eval dataset is competitive with Derczynski et al., but Dracula achieves this without word lists, frequency filtering, or any regular expressions.
Dracula is implemented in Theano and originates from Pierre Luc Carrier and Kyunghyun Cho's LSTM Networks for Sentiment Analysis tutorial. It is also readily portable to other frameworks, such as Google's TensorFlow.