Two additional model variants explore different capabilities and inference optimizations.
We evaluated granite-speech-4.1-2b on standard benchmarks alongside other speech-language models in the sub-8B parameter range, as well as dedicated ASR and AST systems. The evaluation spanned multiple public benchmarks, with particular emphasis on English ASR tasks, while also including multilingual ASR and AST for X-En and En-X translation.
<br>
We evaluated the model’s keyword list biasing (KWB) capability by comparing performance with and without KWB applied at inference time.
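As a rough illustration of what keyword biasing does during decoding (the model card does not describe the implementation), a common realization is shallow fusion: beam-search hypotheses that match a prefix of a user-supplied keyword receive a score bonus. The function names and the bonus value below are hypothetical, not the model's actual API.

```python
# Hypothetical shallow-fusion sketch of keyword list biasing (KWB), not the
# model's actual implementation: hypotheses ending in a prefix of a biasing
# keyword get a log-score bonus, steering decoding toward rare terms.

def keyword_bonus(hypothesis_words, keywords, bonus=2.0):
    """Return a score bonus if the last word is a prefix of any keyword.

    `hypothesis_words` is the word sequence decoded so far; `keywords` is the
    user-supplied biasing list. Both names are illustrative.
    """
    last = hypothesis_words[-1].lower() if hypothesis_words else ""
    for kw in keywords:
        if last and kw.lower().startswith(last):
            return bonus  # boost partial and exact keyword matches
    return 0.0

def rescore(beams, keywords):
    """Add the keyword bonus to each (words, log_score) beam entry."""
    return sorted(
        ((words, score + keyword_bonus(words, keywords))
         for words, score in beams),
        key=lambda b: b[1], reverse=True,
    )

beams = [(["the", "granite"], -4.1), (["the", "granted"], -3.9)]
reranked = rescore(beams, keywords=["granite"])
# The beam containing the biasing keyword now ranks first.
```

Production systems typically organize the keyword list as a prefix trie and apply the bonus per decoding step, but the rescoring idea is the same.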
We also evaluated our model on a variety of corpora to assess its punctuation and capitalization performance.

| Test Set | PER (↓) | Cap-F1 (↑) |
|:---------|:----:|:------:|
| LScln | 25.70 | 89.71 |
| LSoth | 22.27 | 91.26 |
| VoxPopuli | 24.86 | 95.35 |
| Earnings-22 | 22.87 | 95.19 |
| CV-EN | 9.13 | 96.75 |
| CV-DE | 3.66 | 99.50† |
| CV-ES | 11.61 | 95.68 |
| CV-FR | 11.00 | 97.25 |
| CV-PT | 7.86 | 98.51 |

† *We report a Cap-F1 of 99.5 on German, where noun capitalization is required.*
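As a rough illustration of what these two columns measure, the sketch below computes a punctuation error rate and a capitalization F1 under simplified, assumed definitions (position-aligned words, trailing punctuation only); the model card does not specify the exact metric implementations.

```python
# Illustrative metric sketch with ASSUMED definitions, not the exact ones used
# in the table above: PER = wrong punctuation slots / reference punctuation
# slots; Cap-F1 = F1 of predicting which words are capitalized. Assumes the
# reference and hypothesis share the same underlying word sequence.
import string

def split_token(tok):
    """Split a token into (lowercase word, trailing punctuation, is_capitalized)."""
    punct = tok[-1] if tok and tok[-1] in string.punctuation else ""
    word = tok[:-1] if punct else tok
    return word.lower(), punct, word[:1].isupper()

def per_and_cap_f1(ref_tokens, hyp_tokens):
    ref = [split_token(t) for t in ref_tokens]
    hyp = [split_token(t) for t in hyp_tokens]
    # Punctuation error rate over reference punctuation slots.
    slots = sum(1 for _, p, _ in ref if p)
    errors = sum(1 for (_, rp, _), (_, hp, _) in zip(ref, hyp) if rp != hp)
    per = errors / max(slots, 1)
    # Precision/recall/F1 of capitalized-word predictions.
    tp = sum(1 for (_, _, rc), (_, _, hc) in zip(ref, hyp) if rc and hc)
    fp = sum(1 for (_, _, rc), (_, _, hc) in zip(ref, hyp) if hc and not rc)
    fn = sum(1 for (_, _, rc), (_, _, hc) in zip(ref, hyp) if rc and not hc)
    precision = tp / max(tp + fp, 1)
    recall = tp / max(tp + fn, 1)
    cap_f1 = 2 * precision * recall / max(precision + recall, 1e-9)
    return per, cap_f1

per, cap_f1 = per_and_cap_f1(
    "Hello, world. How are you?".split(),
    "hello, world, How are you?".split(),
)
# One of three punctuation marks is wrong; "Hello" lost its capital.
```

Published implementations usually align reference and hypothesis with an edit-distance alignment first, since the recognized words rarely match the reference exactly.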
<br>