feat(server): Add non-flash MPT. (#514)
# What does this PR do?
This adds a non-flash version of MPT.
A flash version is harder because it would require a CUDA flash attention
kernel that supports an additive attention bias.
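For context, a non-flash path is straightforward because the full score matrix is materialized, so an additive bias (MPT uses ALiBi) can simply be added before the softmax. The sketch below is illustrative only, not the actual implementation in this PR; the function names and shapes are assumptions.

```python
import torch


def alibi_slopes(n_heads: int) -> torch.Tensor:
    # Geometric per-head slopes as in the ALiBi paper;
    # assumes n_heads is a power of two for simplicity.
    start = 2 ** (-8 / n_heads)
    return torch.tensor([start ** (i + 1) for i in range(n_heads)])


def attention_with_bias(q, k, v, slopes):
    # q, k, v: [heads, seq, dim]. Plain (non-flash) attention: the
    # [heads, seq, seq] score matrix exists in memory, so applying an
    # arbitrary bias is trivial. Flash attention never materializes
    # this matrix, which is why it needs a bias-aware kernel.
    seq = q.shape[1]
    scores = q @ k.transpose(-1, -2) / q.shape[-1] ** 0.5
    # ALiBi bias: slope * (j - i), i.e. increasingly negative for
    # keys further in the past.
    distance = torch.arange(seq)[None, :] - torch.arange(seq)[:, None]
    scores = scores + slopes[:, None, None] * distance[None, :, :]
    # Causal mask: disallow attending to future positions.
    mask = torch.triu(torch.ones(seq, seq, dtype=torch.bool), diagonal=1)
    scores = scores.masked_fill(mask, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v
```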
Fixes https://github.com/huggingface/text-generation-inference/issues/361
Fixes https://github.com/huggingface/text-generation-inference/issues/491
Fixes https://github.com/huggingface/text-generation-inference/issues/290