text-generation-inference
59ea38cb - Simplify the `attention` function (#2609)

Committed 1 year ago
Simplify the `attention` function (#2609)

* Simplify the `attention` function

  - Use one definition rather than multiple.
  - Add `key`/`value` arguments, so that we don't need the `PREFILL_IN_KVCACHE` constant.
  - Make it kwargs-only (to avoid mixing up the various `Tensor` args).

* Fixup flashinfer support
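A minimal sketch of the idea behind this change, not the actual text-generation-inference implementation: a single kwargs-only `attention` function that takes `key`/`value` explicitly, so callers no longer select between multiple definitions via a flag like `PREFILL_IN_KVCACHE`, and cannot mix up the tensor-like positional arguments. The naive scaled-dot-product body below (plain Python lists instead of `Tensor`s) is purely illustrative.

```python
import math

def attention(*, query, key, value):
    """Naive scaled dot-product attention (illustrative only).

    All arguments are keyword-only, mirroring the commit's kwargs-only
    design so the tensor-like arguments cannot be passed out of order.
    `query`, `key`, `value` are lists of vectors (seq_len x dim).
    """
    dim = len(query[0])
    scale = 1.0 / math.sqrt(dim)
    out = []
    for q in query:
        # Scaled dot-product scores of this query against every key.
        scores = [scale * sum(qi * ki for qi, ki in zip(q, k)) for k in key]
        # Numerically stable softmax over the scores.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # Attention output: weighted sum of the value vectors.
        out.append([
            sum(w * v[d] for w, v in zip(weights, value))
            for d in range(len(value[0]))
        ])
    return out
```

Because the signature is keyword-only, a call like `attention(q, k, v)` raises a `TypeError`; callers must write `attention(query=q, key=k, value=v)`.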