Training a text classifier

We are now ready to train our text classifier model. Let's start with something simple: for now, we will treat the model itself as a black box.

Model architectures are explained better by other sources, including the lecture videos from Stanford's CS224n (http://web.stanford.edu/class/cs224n/). I suggest that you explore those and connect them with the know-how you already have:

class SimpleLSTMBaseline(nn.Module):
    def __init__(self, hidden_dim, emb_dim=300,
                 spatial_dropout=0.05, recurrent_dropout=0.1, num_linear=2):
        super().__init__()  # don't forget to call this!
        self.embedding = nn.Embedding(len(TEXT.vocab), emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, num_layers=num_linear, ...
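To see how such a baseline fits together end to end, here is a minimal, self-contained sketch in the same spirit. Note that the vocabulary size, the linear prediction head, and the shapes in `forward` are assumptions for illustration, not the book's exact model; the original relies on a torchtext `TEXT.vocab` that is not defined here.

```python
import torch
import torch.nn as nn

class SimpleLSTMBaseline(nn.Module):
    # hypothetical completion: vocab_size replaces len(TEXT.vocab),
    # and a Linear head maps the final hidden state to class logits
    def __init__(self, vocab_size, hidden_dim, emb_dim=300,
                 num_linear=2, num_classes=2):
        super().__init__()  # don't forget to call this!
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, num_layers=num_linear)
        self.predictor = nn.Linear(hidden_dim, num_classes)

    def forward(self, seq):
        # seq: (seq_len, batch) of token indices
        embedded = self.embedding(seq)           # (seq_len, batch, emb_dim)
        _, (hidden, _) = self.encoder(embedded)  # hidden: (num_layers, batch, hidden_dim)
        return self.predictor(hidden[-1])        # logits: (batch, num_classes)

model = SimpleLSTMBaseline(vocab_size=1000, hidden_dim=128)
dummy = torch.randint(0, 1000, (20, 4))  # sequence of 20 tokens, batch of 4
logits = model(dummy)
print(logits.shape)  # torch.Size([4, 2])
```

Taking `hidden[-1]`, the last layer's final hidden state, as the sentence representation is the simplest pooling choice; mean-pooling the outputs over time is a common alternative.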
