Japanese ASR

This repository contains the models and datasets used to train and evaluate Japanese ASR systems, produced in the process of building the kotoba-whisper models. The following tables show CER and WER comparisons across different sizes of ReazonSpeech used to distill openai/whisper-large-v3. The model names follow the pattern japanese-asr/distil-whisper-large-v3-ja-reazonspeech-{size of reazonspeech}.

CER

WER

Note that kotoba-tech/kotoba-whisper-v1.0 is an alias of japanese-asr/distil-whisper-large-v3-ja-reazonspeech-large, and kotoba-tech/kotoba-whisper-v2.0 is an alias of japanese-asr/distil-whisper-large-v3-ja-reazonspeech-all.
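
As a quick reference, below is a minimal sketch of running inference with one of these checkpoints through the Hugging Face transformers pipeline. The model id and the local audio file `sample.wav` are placeholders; substitute any checkpoint from the tables above and your own audio.

```python
# Minimal sketch: transcribe Japanese audio with a distilled checkpoint via the
# transformers ASR pipeline. "sample.wav" is a hypothetical local audio file.
import torch
from transformers import pipeline

model_id = "kotoba-tech/kotoba-whisper-v1.0"  # alias noted above; any checkpoint from the tables works
device = "cuda:0" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if torch.cuda.is_available() else torch.float32

pipe = pipeline(
    "automatic-speech-recognition",
    model=model_id,
    torch_dtype=dtype,
    device=device,
)

# Force Japanese transcription (as opposed to translation into English).
result = pipe("sample.wav", generate_kwargs={"language": "ja", "task": "transcribe"})
print(result["text"])
```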

Please find more detailed results in the kotoba-whisper codebase.