Add guide to deploy TGI to Amazon SageMaker (#1409)
* add guide on how to deploy LLMs to SageMaker
copied from https://aws.amazon.com/blogs/machine-learning/announcing-the-launch-of-new-hugging-face-llm-inference-containers-on-amazon-sagemaker/
* add intro with link to TGI
* add link to supported models
* Update docs/sagemaker/inference.md
Co-authored-by: Philipp Schmid <32632186+philschmid@users.noreply.github.com>
---------
Co-authored-by: Philipp Schmid <32632186+philschmid@users.noreply.github.com>
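
As a rough illustration of what the added guide covers: the Hugging Face LLM (TGI) container on SageMaker is configured through environment variables passed to the model. A minimal sketch follows; the model ID and token limits are hypothetical placeholders, not values taken from the docs.

```python
# Illustrative TGI container configuration for a SageMaker endpoint.
# The model ID and limits below are example placeholders.
config = {
    "HF_MODEL_ID": "tiiuae/falcon-7b-instruct",  # any supported model from the Hub
    "SM_NUM_GPUS": "1",              # number of GPUs on the endpoint instance
    "MAX_INPUT_LENGTH": "1024",      # max prompt length in tokens
    "MAX_TOTAL_TOKENS": "2048",      # prompt + generated tokens
}

# In the guide, a dict like this is passed as `env=` to
# sagemaker.huggingface.HuggingFaceModel(...) before calling .deploy(...).
print(config["HF_MODEL_ID"])
```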