benchmark
ca751200
- run timm models in optimized_for_inference mode (#383)
run timm models in optimized_for_inference mode (#383)
- run timm models in optimized_for_inference mode
- fix errors
- fix another bug
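For context, a minimal sketch of what running a timm model in optimized_for_inference mode looks like: `torch.jit.optimize_for_inference` takes a scripted/traced module, freezes it, and applies inference-oriented graph optimizations. This is not the torchbenchmark harness from the commit; the model name ("resnet50") and input shape are illustrative assumptions.

```python
import torch
import timm

# Illustrative model choice; the benchmark suite covers many timm models.
model = timm.create_model("resnet50", pretrained=False).eval()
example = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    # Trace (or script) the model to get a ScriptModule, then let
    # optimize_for_inference freeze it and run inference-only passes.
    scripted = torch.jit.trace(model, example)
    optimized = torch.jit.optimize_for_inference(scripted)
    out = optimized(example)

print(out.shape)
```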
References
#383 - run timm models in optimized_for_inference mode
Author: Krovatkin
Parents: 1e795528