# IncomeNet-5.5k-dropout-log
This model is a Multi-Layer Perceptron (MLP) featuring Dropout layers for improved generalization. It classifies whether an individual's income exceeds $50,000 per year based on census data.
## Model Description
- Architecture: 3-Layer MLP with Dropout (approx. 5.5k parameters).
- Regularization: Dropout was implemented to prevent the model from over-relying on specific features.
- Key Preprocessing: Log-transformation on numerical features was applied to handle skewed data distributions.
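The exact transform is defined by the bundled preprocessor; as an illustration, a `log1p` transform on a skewed numeric column looks like the sketch below (the feature name and values are hypothetical, not taken from the actual dataset):

```python
import numpy as np

# Hypothetical skewed numeric feature; the real feature set is defined by preprocessor.pkl
capital_gain = np.array([0.0, 15024.0, 7298.0, 0.0])

# log1p(x) = log(1 + x) keeps zeros at zero while compressing large values,
# which reduces the skew that otherwise dominates gradient updates
capital_gain_log = np.log1p(capital_gain)
print(capital_gain_log)
```

`log1p` is preferred over a plain `log` here because census features such as capital gains contain many exact zeros, where `log(0)` is undefined.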
## Performance

The Dropout-Log model shows a strong balance between predictive power and training stability:
- Accuracy: 0.814
- F1-Score: 0.776
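For reference, the two reported metrics on a binary task are computed as follows; the labels below are a toy example, not the actual evaluation set:

```python
# Toy predictions and labels to illustrate the metric definitions (not the real eval data)
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

# Confusion-matrix counts for the positive class (income > 50K)
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print(accuracy, f1)
```

F1 is the more informative of the two here because the income classes in census data are imbalanced, so accuracy alone can overstate performance.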
## Experimental Comparison

In our experimental series, this model (indicated by the triangle marker in the comparison plot) consistently outperformed the Embedding-based models.
## Training Stability & Generalization

A key advantage of this model is its training behavior: while the Base-Log model shows earlier signs of diverging evaluation loss, the Dropout-Log variant remains stable for longer.
## How to Use

To run inference, you need `model_architecture.py` (containing the `IncomeNetMLPDropout` class) and `preprocessor.pkl`.
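The contents of `model_architecture.py` are not shown in this card; a minimal sketch of what the `IncomeNetMLPDropout` class might look like is below. The hidden sizes and dropout rate are assumptions chosen so the parameter count lands near the stated ~5.5k, not the actual configuration:

```python
import torch
import torch.nn as nn

class IncomeNetMLPDropout(nn.Module):
    """3-layer MLP with Dropout. Hidden sizes (40, 32) and p=0.3 are
    illustrative guesses; with input_dim=105 this gives 5,585 parameters,
    close to the ~5.5k stated in the card."""

    def __init__(self, input_dim: int = 105, p: float = 0.3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, 40),
            nn.ReLU(),
            nn.Dropout(p),
            nn.Linear(40, 32),
            nn.ReLU(),
            nn.Dropout(p),
            nn.Linear(32, 1),  # single logit for the binary income label
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)
```

Note that `model.eval()` in the snippet below matters specifically for this architecture: it disables the Dropout layers, which must be inactive at inference time.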
```python
import torch
from safetensors.torch import load_model
from model_architecture import IncomeNetMLPDropout

# Instantiate the architecture and load the trained weights
model = IncomeNetMLPDropout(input_dim=105)
load_model(model, "model.safetensors")
model.eval()
```
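The card does not document the output head. Assuming the model emits a single logit per example (a common choice for binary classification), post-processing would look like this sketch; the logit values shown are stand-ins, not real model output:

```python
import torch

# Assumption: one logit per example; sigmoid maps it to P(income > 50K)
logits = torch.tensor([[0.73], [-1.20]])  # stand-in values, not real model output
probs = torch.sigmoid(logits)
labels = [">50K" if p.item() >= 0.5 else "<=50K" for p in probs]
print(labels)
```

If the head instead emits two logits, replace the sigmoid with `torch.argmax` over the class dimension.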