DocRED Stage 2 Model (ATLOP + DREEAM)
Base Model
- bert-base-uncased (Apache 2.0)
Dataset
- DocRED (CC BY-SA 4.0)
Training
- Stage 2: distant-supervision pre-training followed by fine-tuning on the human-annotated DocRED data
- Pooling: LogSumExp
- Classifier: ATLOP
- Thresholding: adaptive
- Encoder: bert-base-uncased
- Max sequence length: 512
- Batch size: 4
- Optimizer: AdamW
- Encoder LR: 2e-5
- Classifier LR: 1e-4
- Warmup ratio: 0.06
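The schedule implied by the hyperparameters above can be sketched as follows. This is a minimal illustration, not the training script: the card only states the warmup ratio and the two learning rates, so the linear warmup-then-linear-decay shape and the function name `lr_at_step` are assumptions.

```python
def lr_at_step(step: int, total_steps: int, base_lr: float,
               warmup_ratio: float = 0.06) -> float:
    """Assumed schedule: linear warmup to base_lr over the first
    warmup_ratio of training, then linear decay to zero."""
    warmup_steps = max(1, int(total_steps * warmup_ratio))
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

# The card uses separate learning rates for the two parameter groups.
ENCODER_LR, CLASSIFIER_LR = 2e-5, 1e-4

# At the end of warmup each group reaches its base LR:
total = 10_000
peak = int(total * 0.06)
print(lr_at_step(peak, total, ENCODER_LR))     # 2e-05
print(lr_at_step(peak, total, CLASSIFIER_LR))  # 0.0001
```

In practice the two learning rates would be passed to AdamW as two parameter groups (encoder vs. classifier head), each driven by this schedule.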
Stage 2 Enhancements
- Entity pooling: LogSumExp (ATLOP)
- Relation classifier: ATLOP (grouped bilinear)
- Thresholding: Adaptive threshold
- Evidence supervision: Enabled (DREEAM)
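The first two enhancements above can be sketched in a few lines of numpy. This is an illustrative sketch under stated assumptions, not the model code: the function names are mine, and the threshold class is assumed to sit at index 0 of the logit vector, as in the ATLOP formulation.

```python
import numpy as np

def logsumexp_pool(mention_embs: np.ndarray) -> np.ndarray:
    """Pool mention embeddings of shape (num_mentions, hidden) into a
    single entity embedding via a numerically stable log-sum-exp."""
    m = mention_embs.max(axis=0)
    return m + np.log(np.exp(mention_embs - m).sum(axis=0))

def adaptive_threshold_predict(logits: np.ndarray, th_index: int = 0) -> list:
    """ATLOP adaptive thresholding: a learned threshold (TH) class at
    th_index; a relation is predicted iff its logit exceeds TH's logit."""
    th = logits[th_index]
    return [i for i, score in enumerate(logits) if i != th_index and score > th]

# Three mentions of one entity pooled into one embedding:
entity = logsumexp_pool(np.array([[0.1, 2.0], [1.5, 0.3], [0.2, 0.2]]))

# Pair logits over 4 classes; class 0 is TH (logit 0.5),
# so only classes 1 and 3 clear the threshold:
preds = adaptive_threshold_predict(np.array([0.5, 1.2, 0.1, 0.9]))
# preds == [1, 3]
```

Because log-sum-exp is a smooth upper bound on the max, frequent or strongly-activated mentions dominate the pooled entity embedding without discarding the others.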
License
This model is distributed under CC BY-SA 4.0 because it is trained on the DocRED dataset, which carries that license.
- Base model: bert-base-uncased (Apache 2.0)
- Dataset: DocRED (CC BY-SA 4.0)