text-generation-inference
8e0c161d - fix: incomplete generations w/ single-token generations and models that did not support chunking (#2770)

Commit
1 year ago
fix: incomplete generations w/ single-token generations and models that did not support chunking (#2770)

* Incomplete generation stream fix (#2754)

  entries.len() could be > batch.size during prefill, so entries need to be filtered there as well.

  Signed-off-by: Wang, Yi A <yi.a.wang@intel.com>

* entries was wrongly extended for models that did not support chunking

---------

Signed-off-by: Wang, Yi A <yi.a.wang@intel.com>
Co-authored-by: Wang, Yi <yi.a.wang@intel.com>
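For context, the first fix boils down to having the router's prefill path keep only the entries that the backend actually included in the batch. The following is a minimal, hypothetical Rust sketch of that filtering step, assuming simplified types; `Entry`, `Batch`, and `filter_entries` are illustrative names, not the actual TGI router types, which carry far more state.

```rust
use std::collections::{HashMap, HashSet};

// Hypothetical, simplified stand-in for the router's per-request state.
struct Entry {}

// Hypothetical, simplified stand-in for the batch returned by the backend.
struct Batch {
    request_ids: Vec<u64>,
    size: u32,
}

/// Drop any pending entry whose request is not part of the batch that will be
/// prefilled, so that entries.len() can never exceed batch.size.
fn filter_entries(entries: &mut HashMap<u64, Entry>, batch: &Batch) {
    let kept: HashSet<u64> = batch.request_ids.iter().copied().collect();
    entries.retain(|request_id, _| kept.contains(request_id));
    debug_assert!(entries.len() <= batch.size as usize);
}

fn main() {
    let mut entries: HashMap<u64, Entry> = HashMap::new();
    entries.insert(0, Entry {});
    entries.insert(1, Entry {});
    entries.insert(2, Entry {}); // e.g. a request the backend did not keep

    let batch = Batch { request_ids: vec![0, 1], size: 2 };
    filter_entries(&mut entries, &batch);
    assert_eq!(entries.len(), 2);
}
```

Filtering by the batch's request IDs, rather than assuming the entries map and the batch stay in lockstep, is what keeps entries.len() from exceeding batch.size when some requests are not carried into the prefill batch; the second part of the fix avoids extending entries at all for models that do not support chunking.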