Closed
Labels
model:transformer (issues related to a transformer model: BERT, GPT2, Hugging Face, Longformer, T5, etc.)
platform:web (issues related to ONNX Runtime web; typically submitted using template)
Description
Describe the issue
I am trying to load a model using the WebGPU backend of onnxruntime-web.
I could load the model downloaded from:
https://github.com/onnx/models/blob/main/vision/classification/mobilenet/model/mobilenetv2-12.onnx
But I couldn't load the following model:
https://huggingface.co/runwayml/stable-diffusion-v1-5/tree/onnx/vae_encoder
Both models can be loaded using Python onnxruntime.
To reproduce
Download the model from:
https://huggingface.co/runwayml/stable-diffusion-v1-5/tree/onnx/vae_encoder
and run the following code:
const ort = require('onnxruntime-web/webgpu');

async function main() {
  const modelPath = './models/sd15_vae_encoder_model.onnx';
  const session = await ort.InferenceSession.create(modelPath, { executionProviders: ['webgpu'] });
}

main();
Urgency
No response
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
Execution Provider
Other / Unknown