LG-AI-EXAONE committed on
Commit b779282 · 1 Parent(s): 9165a5c

Update README.md

Files changed (1)
  1. README.md +6 -4
README.md CHANGED
@@ -1,4 +1,6 @@
  ---
+ base_model: LGAI-EXAONE/EXAONE-4.5-33B
+ base_model_relation: quantized
  license: other
  license_name: exaone
  license_link: LICENSE
@@ -31,7 +33,7 @@ library_name: transformers
  <a href="https://www.lgresearch.ai/blog/view?seq=641" style="text-decoration: none;">
  <img src="https://img.shields.io/badge/📝-Blog-E343BD?style=for-the-badge" alt="Blog">
  </a>
- <a href="https://github.com/LG-AI-EXAONE/EXAONE-4.5/blob/main/assets/Technical_Report__EXAONE_4_5.pdf" style="text-decoration: none;">
+ <a href="http://arxiv.org/abs/2604.08644" style="text-decoration: none;">
  <img src="https://img.shields.io/badge/📑-Technical_Report-684CF4?style=for-the-badge" alt="Technical Report">
  </a>
  <a href="https://github.com/LG-AI-EXAONE/EXAONE-4.5" style="text-decoration: none;">
@@ -54,7 +56,7 @@ Integrating a dedicated visual encoder into the existing EXAONE 4.0 framework, w
  EXAONE 4.5 features 33 billion parameters in total, including 1.2 billion parameters from the vision encoder.
  EXAONE 4.5 achieves competitive performance in general benchmark while outperforming SOTA models of similar size in document understanding and Korean contextual reasoning, inheriting powerful language capabilities from our previous language models.

- For more details, please refer to the [technical report](https://github.com/LG-AI-EXAONE/EXAONE-4.5/blob/main/assets/Technical_Report__EXAONE_4_5.pdf), [blog](https://www.lgresearch.ai/blog/view?seq=641) and [GitHub](https://github.com/LG-AI-EXAONE/EXAONE-4.5).
+ For more details, please refer to the [technical report](http://arxiv.org/abs/2604.08644), [blog](https://www.lgresearch.ai/blog/view?seq=641) and [GitHub](https://github.com/LG-AI-EXAONE/EXAONE-4.5).


  ### Model Configuration
@@ -88,7 +90,7 @@ For more details, please refer to the [technical report](https://github.com/LG-A

  ## Evaluation Results

- The following table shows the benchmark results for the original EXAONE 4.5. Detailed evaluation results of the original model can be found in our [technical report](https://github.com/LG-AI-EXAONE/EXAONE-4.5/blob/main/assets/Technical_Report__EXAONE_4_5.pdf).
+ The following table shows the benchmark results for the original EXAONE 4.5. Detailed evaluation results of the original model can be found in our [technical report](http://arxiv.org/abs/2604.08644).


  ### Vision-Language Tasks
@@ -772,7 +774,7 @@ The model is licensed under [EXAONE AI Model License Agreement 1.2 - NC](./LICEN
  @article{exaone-4.5,
  title={EXAONE 4.5 Technical Report},
  author={{LG AI Research}},
- journal={arXiv preprint arXiv:XXXX.XXXXX},
+ journal={arXiv preprint arXiv:2604.08644},
  year={2026}
  }
  ```
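The two lines this commit adds at the top of the README (`base_model` and `base_model_relation`) are flat `key: value` fields in the model card's YAML front matter. As a quick sanity check, they can be read back with a minimal stdlib parser (a sketch, assuming the README starts with a `---`-delimited block as shown in the diff; the parser itself is illustrative, not part of the commit):

```python
# Minimal reader for the flat "key: value" front-matter fields shown in the diff.
# Assumes the README begins with a YAML block delimited by "---" lines.

README_HEAD = """\
---
base_model: LGAI-EXAONE/EXAONE-4.5-33B
base_model_relation: quantized
license: other
license_name: exaone
license_link: LICENSE
---
"""

def parse_front_matter(text: str) -> dict:
    """Extract flat key: value pairs between the leading '---' delimiters."""
    lines = text.splitlines()
    if not lines or lines[0] != "---":
        return {}
    fields = {}
    for line in lines[1:]:
        if line == "---":
            break  # end of the front-matter block
        key, _, value = line.partition(":")
        fields[key.strip()] = value.strip()
    return fields

meta = parse_front_matter(README_HEAD)
print(meta["base_model"])           # LGAI-EXAONE/EXAONE-4.5-33B
print(meta["base_model_relation"])  # quantized
```

On the Hub, `base_model_relation: quantized` is what links this repository back to the full-precision `LGAI-EXAONE/EXAONE-4.5-33B` model as a quantized variant.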