The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Hey! This is great. Is this already in alpha?
Team, is there a tentative timeline for releasing this v3 alpha?
I can't wait any longer :) Please update me when it's released!
@xenova Can I test the v3 alpha via NPM? When I try to run it, I get this issue.
Use this commit to resolve the issue: https://github.com/kishorekaruppusamy/transformers.js/commit/7af8ef1e5c37f3052ed3a8e38938595702836f09
Thanks for your reply @kishorekaruppusamy. I tried your branch and ran into other issues.
Please give me your advice!
https://github.com/kishorekaruppusamy/transformers.js/blob/V3_BRANCH_WEBGPU_BUG_FIX/src/backends/onnx.js#L144
Change this URL to point to the local dist directory inside your build.
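For reference, here is a minimal sketch of the same idea done through the library's `env` configuration instead of patching the source; the `/dist/` path is illustrative and should point to wherever your build copies the onnxruntime-web binaries:

```js
import { env } from '@xenova/transformers';

// Serve the ONNX runtime .wasm binaries from a local directory
// instead of the default CDN URL. '/dist/' is a placeholder path;
// use whatever directory your bundler outputs them to.
env.backends.onnx.wasm.wasmPaths = '/dist/';
```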
/**
 * @typedef {'cpu'|'gpu'|'wasm'|'webgpu'|null} DeviceType
Out of curiosity, what is `'gpu'`?
It's meant to be a "catch-all" for the different ways that the library can be used with GPU support (not just in the browser with WebGPU). The idea is that it will simplify documentation, as transformers.js will select the best execution provider depending on the environment. For example, DML/CUDA support in onnxruntime-node (see microsoft/onnxruntime#16050 (comment))
Of course, this is still a work in progress, so it can definitely change!
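As a hedged illustration of that intent (the option name comes from the typedef above; everything else is subject to change), passing `device: 'gpu'` would let the library pick the best GPU backend for the current environment, e.g. WebGPU in the browser or DML/CUDA through onnxruntime-node:

```js
import { pipeline } from '@xenova/transformers';

// 'gpu' as a catch-all: the library chooses the best available GPU
// execution provider for the current environment. Model name is only an example.
const classifier = await pipeline(
    'text-classification',
    'Xenova/distilbert-base-uncased-finetuned-sst-2-english',
    { device: 'gpu' },
);

const result = await classifier('Transformers.js v3 looks great!');
```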
In preparation for Transformers.js v3, I'm compiling a list of issues/features which will be fixed/included in the release.
- WebGPU support (update `onnxruntime-web` → 1.17.0). Closes:
- `topk` → `top_k` parameter (see the sketch below).
- `transpose` → `permute`.
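To make the renames concrete, here is a hedged sketch of what they look like at the call site (the exact API may still change before release; the model name is only an example):

```js
import { pipeline } from '@xenova/transformers';

const classifier = await pipeline(
    'text-classification',
    'Xenova/distilbert-base-uncased-finetuned-sst-2-english',
);

// v2: await classifier('I love transformers!', { topk: 5 });
// v3: the option is renamed to match the Python library.
const output = await classifier('I love transformers!', { top_k: 5 });

// Similarly, the Tensor method `transpose(...)` is renamed to `permute(...)`:
// const permuted = someTensor.permute(1, 0);
```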
Useful commands:
`npm version prerelease --preid=alpha -m "[version] Update to %s"`
How to use WebGPU
First, install the development branch
Then specify the `device` parameter when loading the model. Here's example code to get started. Please note that this is still a WORK IN PROGRESS, so the following usage may change before release.
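Here is a minimal sketch of that usage, assuming the feature-extraction pipeline with `Xenova/all-MiniLM-L6-v2` purely as an example model; the options follow the `device` parameter discussed above and may change before release:

```js
import { pipeline } from '@xenova/transformers';

// Create a feature-extraction pipeline that runs on WebGPU.
const extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2', {
    device: 'webgpu',
});

// Compute normalized, mean-pooled sentence embeddings.
const sentences = ['Hello world', 'This is a test sentence'];
const embeddings = await extractor(sentences, { pooling: 'mean', normalize: true });
console.log(embeddings.tolist());
```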