
Speech DAC Tokens (3 Codebooks)

Pre-tokenized speech dataset using the Descript Audio Codec (DAC). Each audio clip has been encoded into discrete codebook tokens from DAC's first 3 residual vector quantization codebooks, paired with its text transcription.

Dataset Summary

Stat Value
Total samples 241,451
Total audio ~780 hours
Language English
Codebooks 3 (of DAC's 9)
Codebook size 1,024 entries each
DAC model 44kHz
Tokens per second ~258 (86 frames/s × 3 codebooks)
Token sequence length 219-4,096 (mean: 3,063)
Audio duration range ~0.8s-15.7s
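
The stats above imply a simple token budget per clip: ~86 DAC frames per second, 3 codebook tokens per frame, plus the text tokens and the audio start/end markers. A minimal sketch (`estimate_n_tokens` is an illustrative helper, not part of the dataset tooling):

```python
# Rough token-budget estimate for one clip, assuming (per the stats above)
# 86 DAC frames/s, 3 codebook tokens per frame, plus text tokens and the
# <|audio_start|>/<|audio_end|> markers. Illustrative helper only.
def estimate_n_tokens(duration_s: float, n_text_tokens: int) -> int:
    n_frames = round(duration_s * 86)          # DAC 44kHz model: ~86 frames/s
    n_audio_tokens = n_frames * 3              # one token per codebook per frame
    return n_text_tokens + 2 + n_audio_tokens  # +2 for the audio start/end markers

print(estimate_n_tokens(15.7, 50))  # -> 4102: the longest clips already exceed ~4,000 tokens
print(estimate_n_tokens(0.8, 10))   # -> 219: the shortest clips sit near the sequence-length floor
```

This is why clips past 4,096 tokens had to be excluded (see Processing Details below).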

Data Sources

Source Split Clips License
LibriSpeech clean-100 train.100 ~24,200 CC BY 4.0
LibriSpeech clean-360 train.360 ~88,500 CC BY 4.0
LibriSpeech other-500 train.500 ~128,750 CC BY 4.0

Format

Each row contains:

Column Type Description
text string Original text transcription
prompt string Full training prompt: {text}<|audio_start|><|c1_X|><|c2_Y|><|c3_Z|>...<|audio_end|>
input_ids list[int] Pre-tokenized 3-codebook prompt. Ready for training.
input_ids_1cb list[int] Pre-tokenized 1-codebook prompt (c1 only, shorter sequences).
input_ids_2cb list[int] Pre-tokenized 2-codebook prompt (c1+c2).
attention_mask list[int] All 1s, same length as input_ids (3cb).
labels list[int] Copy of input_ids (3cb). Used as training targets.
n_audio_frames int Number of DAC time frames
n_tokens int Total token count (text + audio tokens)

Audio tokens are interleaved per time frame: c1, c2, c3, c1, c2, c3, ... where:

  • c1 (codebook 1): Coarse structure - pitch, rhythm, broad spectral shape
  • c2 (codebook 2): Fine detail - residual from c1
  • c3 (codebook 3): Finest detail - residual from c1+c2
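
The interleaving above can be sketched as a prompt builder (`build_prompt` is a hypothetical helper, not part of the dataset tooling; the token values are taken from the prompt format described above):

```python
# Illustrative construction of the interleaved training prompt:
# {text}<|audio_start|><|c1_X|><|c2_Y|><|c3_Z|>...<|audio_end|>
# Hypothetical helper, not shipped with the dataset.
def build_prompt(text, c1, c2, c3):
    assert len(c1) == len(c2) == len(c3), "one code per codebook per frame"
    audio = "".join(
        f"<|c1_{a}|><|c2_{b}|><|c3_{c}|>" for a, b, c in zip(c1, c2, c3)
    )
    return f"{text}<|audio_start|>{audio}<|audio_end|>"

print(build_prompt("HELLO", [330, 330], [684, 875], [973, 145]))
# HELLO<|audio_start|><|c1_330|><|c2_684|><|c3_973|><|c1_330|><|c2_875|><|c3_145|><|audio_end|>
```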

Use Cases

Text-to-Speech Training

Train a language model to predict DAC tokens from text input. The model learns to generate the audio token sequence, which is then decoded back to audio using DAC's decoder. No spectrogram or vocoder needed - just token prediction.

Input:  Hello world
Output: <|audio_start|><|c1_551|><|c2_118|><|c3_42|>...<|audio_end|>
-> DAC decoder -> audio waveform
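
Because the audio tokens must follow the strict c1, c2, c3 cycle, a decoder can constrain sampling by position. A minimal sketch of that idea (`expected_codebook` is a hypothetical helper, not part of any shipped decoding code):

```python
# Which codebook is legal at each audio-token position, given the fixed
# c1, c2, c3 interleaving. A sampler could use this to mask logits so only
# tokens from the expected codebook are drawn. Hypothetical helper.
def expected_codebook(audio_token_index: int) -> int:
    return audio_token_index % 3 + 1  # positions 0,1,2,3,... -> codebooks 1,2,3,1,...

print([expected_codebook(i) for i in range(6)])  # -> [1, 2, 3, 1, 2, 3]
```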

Audio Language Modeling

Train unconditional or conditional audio generation models using discrete tokens, similar to how language models generate text.

Speech Understanding

Use the tokenized representation for speech classification, speaker identification, or other downstream tasks that benefit from discrete audio representations.

Codec Research

Study the information captured at different codebook levels, or compare DAC's tokenization against other codecs (EnCodec, SpeechTokenizer).

How to Decode Audio

import torch
import dac
from dac.utils import load_model
import re

# Load DAC decoder
dac_model = load_model(tag="latest", model_type="44khz")
dac_model.eval()

# Parse tokens from a prompt (assumes `dataset` was loaded beforehand,
# e.g. with datasets.load_dataset)
prompt = dataset[0]["prompt"]
pattern = r'<\|c(\d+)_(\d+)\|>'
matches = re.findall(pattern, prompt)

# Group into frames (every 3 tokens = 1 frame)
frames = []
frame = [None, None, None]
for cb_str, val_str in matches:
    cb = int(cb_str) - 1
    frame[cb] = int(val_str)
    if cb == 2:
        frames.append(list(frame))
        frame = [None, None, None]

# Decode: pad to 9 codebooks (DAC expects all 9)
codes = torch.tensor(frames).T.unsqueeze(0).long()
full_codes = torch.zeros(1, 9, codes.shape[2], dtype=torch.long)
full_codes[:, :3, :] = codes

with torch.no_grad():
    # from_codes returns (z, codes, latents); only the latent z is needed here
    z, _, _ = dac_model.quantizer.from_codes(full_codes)
    audio = dac_model.decode(z)

# audio[0, 0] is the waveform at 44100 Hz

Processing Details

  • Audio resampled from 16kHz (LibriSpeech native) to 44.1kHz (DAC native)
  • Clips exceeding 4,096 tokens were excluded (~17% of source data)
  • DAC encoding performed on Apple MPS (M4 Max) at ~2.4 clips/sec
  • No word-level alignment or prosodic features - raw text + DAC codes only
  • No CPU fallback failures occurred during encoding
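
The 16 kHz to 44.1 kHz rate change can be illustrated with a minimal linear-interpolation resampler. This is only a sketch of the rate arithmetic; the actual pipeline (and any production resampler) would use a proper band-limited method:

```python
import numpy as np

# Minimal linear-interpolation resampler from 16 kHz to 44.1 kHz, shown only
# to illustrate the rate change above. Not the dataset's actual pipeline:
# a real resampler should be band-limited (polyphase/sinc) to avoid aliasing.
def resample_16k_to_44k1(x: np.ndarray) -> np.ndarray:
    n_out = round(len(x) * 44100 / 16000)
    t_in = np.arange(len(x)) / 16000.0
    t_out = np.arange(n_out) / 44100.0
    return np.interp(t_out, t_in, x)

x = np.zeros(16000)                  # one second of silence at 16 kHz
print(len(resample_16k_to_44k1(x)))  # -> 44100
```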

Citation

If you use this dataset, please cite the original data sources:

@inproceedings{panayotov2015librispeech,
  title={Librispeech: an ASR corpus based on public domain audio books},
  author={Panayotov, Vassil and Chen, Guoguo and Povey, Daniel and Khudanpur, Sanjeev},
  booktitle={ICASSP},
  year={2015}
}

@inproceedings{kumar2023high,
  title={High-fidelity audio compression with improved RVQGAN},
  author={Kumar, Rithesh and others},
  booktitle={NeurIPS},
  year={2023}
}