PyTorch/TensorFlow Deep Learning Quick Start

Simple Applications of Neural Networks


What is Deep Learning?

Deep Learning = Multi-layer Neural Networks

Social Science Use Cases:

  • Text classification (sentiment analysis, topic classification)
  • Image analysis (social media images)
  • Sequence prediction (time series)

Note: For social science research, traditional machine learning (sklearn) is usually sufficient!


PyTorch Basics

Installation

bash
pip install torch
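
After installing, a quick sanity check confirms the import works and tells you whether a GPU is visible (a minimal sketch; the exact version string will differ on your machine):

```python
import torch

print(torch.__version__)          # installed version, e.g. "2.x.x"
print(torch.cuda.is_available())  # True only if a CUDA-capable GPU is set up
```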

Simple Neural Network

python
import torch
import torch.nn as nn
import torch.optim as optim

# Define model
class SimpleNN(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super(SimpleNN, self).__init__()
        self.fc1 = nn.Linear(input_size, hidden_size)
        self.relu = nn.ReLU()
        self.fc2 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        x = self.fc1(x)
        x = self.relu(x)
        x = self.fc2(x)
        return x

# Create model
model = SimpleNN(input_size=10, hidden_size=20, output_size=1)

# Loss function and optimizer
criterion = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=0.001)

# Synthetic data for illustration (replace with your own)
X_train = torch.randn(100, 10)
y_train = torch.randn(100, 1)

# Training loop (simplified)
for epoch in range(100):
    # Forward pass
    outputs = model(X_train)
    loss = criterion(outputs, y_train)

    # Backward pass
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Prediction
X_test = torch.randn(20, 10)  # synthetic test data for illustration
model.eval()
with torch.no_grad():
    predictions = model(X_test)
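
As an aside, the same architecture can be written more compactly with `nn.Sequential`, which closely mirrors the Keras API shown next. A minimal sketch:

```python
import torch
import torch.nn as nn

# Same architecture as SimpleNN above, expressed as a Sequential container
model = nn.Sequential(
    nn.Linear(10, 20),
    nn.ReLU(),
    nn.Linear(20, 1),
)

x = torch.randn(4, 10)   # a batch of 4 samples with 10 features
print(model(x).shape)    # torch.Size([4, 1])
```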

TensorFlow/Keras

More Concise API

python
import tensorflow as tf
from tensorflow import keras

# Define model
model = keras.Sequential([
    keras.layers.Dense(20, activation='relu', input_shape=(10,)),
    keras.layers.Dense(1)
])

# Compile
model.compile(
    optimizer='adam',
    loss='mse',
    metrics=['mae']
)

# Synthetic data for illustration (replace with your own)
X_train = tf.random.normal((100, 10))
y_train = tf.random.normal((100, 1))

# Train
history = model.fit(
    X_train, y_train,
    epochs=100,
    batch_size=32,
    validation_split=0.2,
    verbose=0
)

# Predict
X_test = tf.random.normal((20, 10))  # synthetic test data
predictions = model.predict(X_test)
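
Keras also provides `model.evaluate`, which computes the compiled loss and metrics on held-out data. A self-contained sketch using synthetic data (the model mirrors the one above; values are illustrative):

```python
import tensorflow as tf
from tensorflow import keras

# Same shape of model as above
model = keras.Sequential([
    keras.layers.Dense(20, activation='relu', input_shape=(10,)),
    keras.layers.Dense(1)
])
model.compile(optimizer='adam', loss='mse', metrics=['mae'])

# Synthetic data for illustration
X = tf.random.normal((100, 10))
y = tf.random.normal((100, 1))
model.fit(X, y, epochs=2, verbose=0)

# Returns [loss, metric, ...] in the order given to compile()
loss, mae = model.evaluate(X, y, verbose=0)
print(f"MSE: {loss:.3f}  MAE: {mae:.3f}")
```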

Practical Example: Text Sentiment Classification

python
from tensorflow import keras
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Text data
texts = [
    "This product is amazing!",
    "Terrible experience, would not recommend.",
    "Pretty good overall.",
    # ...
]
labels = [1, 0, 1]  # 1=positive, 0=negative (one label per text; extend with your data)

# Text vectorization
tokenizer = Tokenizer(num_words=10000)
tokenizer.fit_on_texts(texts)
sequences = tokenizer.texts_to_sequences(texts)
X = pad_sequences(sequences, maxlen=100)

# Model
model = keras.Sequential([
    keras.layers.Embedding(10000, 16, input_length=100),
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(16, activation='relu'),
    keras.layers.Dense(1, activation='sigmoid')
])

model.compile(
    optimizer='adam',
    loss='binary_crossentropy',
    metrics=['accuracy']
)

# Train
model.fit(X, labels, epochs=10, batch_size=32)
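
To see what the vectorization step actually produces, here is a minimal, self-contained sketch of `Tokenizer` and `pad_sequences` on the three sample sentences (these legacy preprocessing utilities may emit deprecation warnings on recent TensorFlow/Keras versions):

```python
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

texts = [
    "This product is amazing!",
    "Terrible experience, would not recommend.",
    "Pretty good overall.",
]

tokenizer = Tokenizer(num_words=10000)
tokenizer.fit_on_texts(texts)
sequences = tokenizer.texts_to_sequences(texts)  # lists of word indices
X = pad_sequences(sequences, maxlen=100)         # pad/truncate each row to length 100

print(X.shape)               # (3, 100)
print(tokenizer.word_index)  # word -> integer index mapping
```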

When to Use Deep Learning?

Good for Deep Learning

  • Large-scale data (>100k samples)
  • Images, text, audio
  • Complex nonlinear relationships

Not Suitable for Deep Learning

  • Small samples (<1,000)
  • Tabular data (sklearn usually performs better)
  • Problems that require interpretability

Social Science Recommendation: Prioritize sklearn unless you have special needs!
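
To make the recommendation concrete: for a small tabular problem, a sklearn baseline takes only a few lines and comes with built-in cross-validation. A minimal sketch on synthetic data (the dataset and decision rule are purely illustrative):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical small tabular dataset: 200 samples, 10 features
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # simple linear decision rule

# Logistic regression baseline with 5-fold cross-validation
scores = cross_val_score(LogisticRegression(), X, y, cv=5)
print(f"Mean accuracy: {scores.mean():.3f}")
```

A baseline like this is also interpretable: the fitted coefficients show each feature's direction and strength, which a neural network does not give you directly.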


Practice Exercises

python
# (Optional) Try using PyTorch or TensorFlow:
# 1. Build a simple regression model
# 2. Train and evaluate
# 3. Compare results with sklearn

Next Steps

Next Section: LLM APIs Quick Start (Most practical!)

Keep going!

Released under the MIT License. Content © Author.