transformers
bb965d8e - fix type annotation for ALL_ATTENTION_FUNCTIONS (#36690)

Commit · 278 days ago
fix type annotation for ALL_ATTENTION_FUNCTIONS (#36690)

Corrects the type annotation to match actual usage. The variable was typed as Dict[str, Dict[str, Callable]] but is actually used as Dict[str, Callable], where keys are attention mechanism names and values are the corresponding attention functions directly. This change makes the type annotation consistent with how the dictionary is used in the codebase.
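A minimal sketch of the fix described above (not the actual transformers source; the placeholder attention functions are hypothetical): the registry maps an attention mechanism name directly to a callable, so a flat Dict[str, Callable] annotation matches usage, while the old nested Dict[str, Dict[str, Callable]] would imply an extra lookup level that never existed.

```python
from typing import Callable, Dict

# Hypothetical attention implementations standing in for the real ones.
def eager_attention(query, key, value):
    return query  # placeholder body

def sdpa_attention(query, key, value):
    return query  # placeholder body

# Correct annotation: name -> function, with no nested dict in between.
ALL_ATTENTION_FUNCTIONS: Dict[str, Callable] = {
    "eager": eager_attention,
    "sdpa": sdpa_attention,
}

# A lookup yields the callable directly, matching the flat annotation.
attn_fn = ALL_ATTENTION_FUNCTIONS["eager"]
```

With the old nested annotation, a type checker would have expected `ALL_ATTENTION_FUNCTIONS["eager"]` to be another dictionary rather than a function, flagging every call site as an error.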