transformers
a72e5a4b - 🚨 Fix Inconsistent `input_feature` length and `attention_mask` length in `WhisperFeatureExtractor` (#39221)

Commit · 197 days ago
🚨 Fix Inconsistent `input_feature` length and `attention_mask` length in `WhisperFeatureExtractor` (#39221)

* Update feature_extraction_whisper.py
* Reformat
* Add feature extractor shape test
* Reformat
* Fix omni
* Fix new failing whisper test
* Update src/transformers/models/whisper/feature_extraction_whisper.py
* Make style
* Revert omni test changes
* Add comment

---------

Co-authored-by: lvyuanjun.lyj <lvyuanjun.lyj@alibaba-inc.com>
Co-authored-by: Anton Vlasjuk <73884904+vasqu@users.noreply.github.com>
Co-authored-by: Vasqu <antonprogamer@gmail.com>
Co-authored-by: eustlb <94853470+eustlb@users.noreply.github.com>
Co-authored-by: Eustache Le Bihan <eulebihan@gmail.com>
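The inconsistency this commit fixes is a length mismatch between the extracted features and the returned `attention_mask`: the mask must have one entry per feature frame, not per raw audio sample. Below is a minimal, hypothetical sketch of that frame-level alignment, assuming Whisper's defaults of a 160-sample hop length and a fixed 30-second window of 16 kHz audio; the function and constant names are illustrative, not the actual `WhisperFeatureExtractor` implementation.

```python
# Hypothetical sketch, not the actual transformers code.
HOP_LENGTH = 160         # samples between successive feature frames (Whisper default)
MAX_SAMPLES = 480_000    # 30 s of 16 kHz audio

def extract(samples):
    """Return (features, attention_mask) with matching lengths."""
    # Pad with silence or truncate the waveform to the fixed 30 s window.
    padded = (list(samples) + [0.0] * MAX_SAMPLES)[:MAX_SAMPLES]
    num_frames = MAX_SAMPLES // HOP_LENGTH                    # 3000 frames total
    valid_frames = min(len(samples), MAX_SAMPLES) // HOP_LENGTH
    # One mask entry per *feature frame*, so the mask length always
    # equals the feature length regardless of how much padding was added.
    attention_mask = [1] * valid_frames + [0] * (num_frames - valid_frames)
    # Stand-in for the real log-mel frames: one value per frame.
    features = [padded[i * HOP_LENGTH] for i in range(num_frames)]
    return features, attention_mask

feats, mask = extract([0.1] * 16_000)  # 1 s of audio
assert len(feats) == len(mask) == 3000
```

With one second of input, the first 100 mask entries are 1 (16,000 samples / 160-sample hop) and the remaining 2,900 are 0, covering the padded region.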