sentiment-lstm

pytorch · v1.0.0

Classifies text sequences into positive/negative sentiment using a bidirectional LSTM with learned attention weights. The attention mechanism computes importance scores for each timestep, enabling the model to focus on sentiment-bearing words. Includes embedding layer, dropout, and synthetic data generation for self-contained training.

Tags: NLP · text-classification · sentiment · lstm · intermediate

Install

openmodelstudio install sentiment-lstm

SDK Usage

import openmodelstudio as oms

model = oms.use_model("sentiment-lstm")
handle = oms.register_model("my-sentiment-lstm", model=model)
job = oms.start_training(handle.model_id, wait=True)

Source Preview (104 lines)

"""Sentiment analysis with bidirectional LSTM."""

import torch
import torch.nn as nn

class SentimentLSTM(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=128,
                 hidden_dim=256, n_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional, so downstream layers see 2 * hidden_dim features.
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.attention = nn.Linear(hidden_dim * 2, 1)
        self.classifier = nn.Linear(hidden_dim * 2, n_classes)

    def forward(self, x):
        embedded = self.embedding(x)       # (batch, seq, embed_dim)
        lstm_out, _ = self.lstm(embedded)  # (batch, seq, 2 * hidden_dim)
        # Softmax over the sequence dimension gives per-timestep weights.
        attn = torch.softmax(self.attention(lstm_out), dim=1)
        context = (lstm_out * attn).sum(dim=1)  # attention-weighted sum
        return self.classifier(context)
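As a quick sanity check, the previewed class can be run on a batch of random token ids. This is a minimal sketch, assuming the class exactly as shown in the preview; the batch size and sequence length are arbitrary demo values.

```python
import torch
import torch.nn as nn

class SentimentLSTM(nn.Module):
    """Architecture as in the source preview above."""
    def __init__(self, vocab_size=10000, embed_dim=128,
                 hidden_dim=256, n_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.attention = nn.Linear(hidden_dim * 2, 1)
        self.classifier = nn.Linear(hidden_dim * 2, n_classes)

    def forward(self, x):
        embedded = self.embedding(x)
        lstm_out, _ = self.lstm(embedded)
        attn = torch.softmax(self.attention(lstm_out), dim=1)
        context = (lstm_out * attn).sum(dim=1)
        return self.classifier(context)

model = SentimentLSTM()
model.eval()
tokens = torch.randint(0, 10000, (4, 32))  # 4 sequences of 32 token ids
with torch.no_grad():
    logits = model(tokens)
print(logits.shape)  # torch.Size([4, 2]): one logit pair per sequence
```

The output has shape (batch, n_classes) because the attention-weighted sum collapses the sequence dimension before classification.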

Details

Author
openmodelstudio
License
MIT
Source
104 lines
Version
1.0.0

Dependencies

torch>=2.0
numpy>=1.24

Hyperparameters

epochs: 10 (number of training epochs)

lr: 0.001 (learning rate)

hidden_dim: 256 (LSTM hidden dimension)

vocab_size: 10000 (vocabulary size)
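The hyperparameters above map onto a standard full-batch training loop. This is a sketch only: the random token ids and labels are a hypothetical stand-in for the package's synthetic data generator (not shown in the preview), and the demo uses smaller dimensions than the listed defaults to run quickly.

```python
import torch
import torch.nn as nn

class SentimentLSTM(nn.Module):
    """Architecture as in the source preview above."""
    def __init__(self, vocab_size=10000, embed_dim=128,
                 hidden_dim=256, n_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.attention = nn.Linear(hidden_dim * 2, 1)
        self.classifier = nn.Linear(hidden_dim * 2, n_classes)

    def forward(self, x):
        embedded = self.embedding(x)
        lstm_out, _ = self.lstm(embedded)
        attn = torch.softmax(self.attention(lstm_out), dim=1)
        context = (lstm_out * attn).sum(dim=1)
        return self.classifier(context)

# Hypothetical synthetic data: random ids and binary labels.
torch.manual_seed(0)
X = torch.randint(0, 1000, (64, 32))  # 64 sequences of length 32
y = torch.randint(0, 2, (64,))

# Reduced dims for a fast demo; defaults match the listed hyperparameters.
model = SentimentLSTM(vocab_size=1000, embed_dim=32, hidden_dim=64)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)  # lr
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):  # epochs
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
print(f"final loss: {loss.item():.3f}")
```

In the real package, the loop would iterate over mini-batches from the generated dataset rather than a single full-batch tensor.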
