Darwin V6 Evolved Model
Created by the Darwin V6 diagnostic-guided evolutionary merge engine.
Parent Models
- Father: FINAL-Bench/Darwin-4B-Opus
- Mother: DavidAU/gemma-4-E4B-it-The-DECKARD-Expresso-Universe-HERETIC-UNCENSORED-Thinking
Evolution Result
- Benchmark score: 0.8412
- Merge method: slerp
- Merge hash: 663333c9
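The slerp merge method interpolates parent weights along the arc between them rather than along a straight line, which better preserves weight-vector norms. A minimal plain-Python sketch of the idea, treating flattened tensors as vectors (function name and fallback behavior are illustrative assumptions, not the engine's actual API):

```python
import math

def slerp(a, b, t, eps=1e-8):
    """Spherical linear interpolation between weight vectors a and b.

    t=0 returns a (the father's tensor), t=1 returns b (the mother's).
    Falls back to linear interpolation when the vectors are nearly
    parallel, where the sin() denominator would vanish.
    """
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    cos_theta = max(-1.0, min(1.0, dot / (norm_a * norm_b + eps)))
    theta = math.acos(cos_theta)
    if theta < eps:  # nearly parallel: plain lerp is numerically safer
        return [(1 - t) * x + t * y for x, y in zip(a, b)]
    s = math.sin(theta)
    wa = math.sin((1 - t) * theta) / s
    wb = math.sin(t * theta) / s
    return [wa * x + wb * y for x, y in zip(a, b)]

father = [1.0, 0.0]
mother = [0.0, 1.0]
print(slerp(father, mother, 0.5))  # midpoint on the unit arc, ≈ [0.7071, 0.7071]
```

Note that unlike linear interpolation, the t=0.5 result here still has unit norm.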
Merge Statistics
- Total tensors merged: 0
- Transplant A (Father preserved): 0
- Transplant B (Mother preserved): 0
- Blended: 0
Optimal Genome
global_ratio: 0.5024
attn_ratio: 0.0625
ffn_ratio: 0.9059
embed_ratio: 0.4207
density_a: 0.9875
density_b: 0.9038
block_0_ratio: 0.8219
block_1_ratio: 0.5590
block_2_ratio: 0.6907
block_3_ratio: 0.3676
block_4_ratio: 0.3214
block_5_ratio: 0.5250
block_6_ratio: 0.6208
block_7_ratio: 0.6995
mri_trust: 0.9357
merge_method_weight: 0.3998
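The eight block_N_ratio genes suggest the model's transformer layers are partitioned into eight contiguous groups, each receiving its own interpolation ratio. A hypothetical sketch of that mapping (the group count, the even partitioning, and the 32-layer example are assumptions, not the engine's confirmed layout):

```python
# The eight block ratios from the optimal genome above
BLOCK_RATIOS = [0.8219, 0.5590, 0.6907, 0.3676,
                0.3214, 0.5250, 0.6208, 0.6995]

def block_ratio_for_layer(layer_idx, num_layers, ratios=BLOCK_RATIOS):
    """Map a transformer layer index to its genome block ratio.

    Layers are split into len(ratios) contiguous groups; integer
    division clamps the last group so it absorbs any remainder.
    """
    group = min(layer_idx * len(ratios) // num_layers, len(ratios) - 1)
    return ratios[group]

# Assuming a 32-layer model: layers 0-3 use block_0_ratio, 4-7 block_1_ratio, ...
print(block_ratio_for_layer(0, 32))   # 0.8219
print(block_ratio_for_layer(31, 32))  # 0.6995
```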
MRI Prescription Summary
- Average ratio_b: nan
- Attention ratio: nan
- FFN ratio: nan
- Embed ratio: 0.500
- Transplant A: 0
- Transplant B: 0
- Blended: 1160
Health Check
failed: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '/data/ray_temp/ginipick/darwin_merge_cache/merged_42bf4a71'. Use repo_type argument if needed.
Method
Darwin V6 implements DARE-TIES merge directly via PyTorch tensor operations. Per-tensor ratios are determined by MRI diagnostic (static tensor analysis + probe-based functional importance) combined with evolutionary genome search.
Formula: final_ratio = mri_ratio * mri_trust + genome_ratio * (1 - mri_trust)
DARE-TIES algorithm: DARE (Yu et al., 2023) combined with TIES-Merging (Yadav et al., 2023); re-implemented directly, not library-dependent.
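The two steps above can be sketched together: the trust-weighted ratio formula exactly as stated, plus a simplified DARE drop-and-rescale with a TIES-style sign-conflict rule. This is a plain-Python illustration under stated assumptions (function names, the seed handling, and the keep-larger-magnitude conflict rule are simplifications of the engine's actual implementation):

```python
import random

def final_ratio(mri_ratio, genome_ratio, mri_trust):
    """Trust-weighted blend of the MRI prescription and the evolved gene,
    per the formula: mri_ratio * mri_trust + genome_ratio * (1 - mri_trust)."""
    return mri_ratio * mri_trust + genome_ratio * (1 - mri_trust)

def dare_prune(delta, density, rng):
    """DARE: randomly drop delta entries with probability (1 - density),
    then rescale survivors by 1/density so the expected delta is preserved."""
    return [d / density if rng.random() < density else 0.0 for d in delta]

def dare_ties_merge(base, a, b, ratio_b, density_a, density_b, seed=0):
    """Merge two fine-tunes onto a shared base tensor (given as flat lists).

    Each parent's delta from the base is DARE-pruned, then combined;
    where the pruned deltas disagree in sign, the larger-magnitude one
    wins (a simplified stand-in for TIES sign election).
    """
    rng = random.Random(seed)
    da = dare_prune([x - s for x, s in zip(a, base)], density_a, rng)
    db = dare_prune([x - s for x, s in zip(b, base)], density_b, rng)
    merged = []
    for s, x, y in zip(base, da, db):
        if x * y < 0:  # sign conflict: keep the dominant delta
            d = x if abs(x) >= abs(y) else y
        else:
            d = (1 - ratio_b) * x + ratio_b * y
        merged.append(s + d)
    return merged

# Example with the genome's mri_trust = 0.9357: the MRI prescription dominates
r = final_ratio(0.5, 0.5024, 0.9357)  # ≈ 0.5002
```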
Built by VIDRAFT. Apache 2.0.
Model tree for SeaWolf-AI/Darwin-Darwin-4B-Opus-x-gemma-4-E4B-it-The-D-08412
- Base model: google/gemma-4-E4B-it