Using OnnxTransformer throws TypeInitializationException #5262

@yousiftouma

Description

System information

  • OS version/distro: Windows 7
  • .NET version (e.g., dotnet --info): .NET Core 3.1

Issue

When trying to use OnnxTransformer, the native libraries aren't loaded properly, even though I can see them under bin\Debug\netcoreapp3.1\runtimes\(platform)\native.
If I use package version 1.4.0 of OnnxTransformer, without installing the runtime myself, it works.
I couldn't find any docs about the requirement to install the runtime manually; I figured it out by browsing all over the place, not really through the docs. I'd expect this to be stated clearly, since I'm not using the onnxruntime package explicitly but rather the higher-level API of OnnxTransformer.

On a separate note: Is it sufficient to install the GPU natives and use the fallbackToCpu flag of ApplyOnnxModel to be able to run inference on both CPU and GPU? I'm having a hard time finding this documented.
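For reference, the fallbackToCpu usage I'm asking about looks roughly like this (a sketch, not a verified setup; "model.onnx" and the device index are placeholders):

```csharp
using Microsoft.ML;

var mlContext = new MLContext();

// ApplyOnnxModel exposes optional gpuDeviceId and fallbackToCpu parameters.
// The question above is whether this combination, with only the GPU native
// package installed, is enough to run on both CPU and GPU.
var pipeline = mlContext.Transforms.ApplyOnnxModel(
    modelFile: "model.onnx",
    gpuDeviceId: 0,         // try GPU device 0 first
    fallbackToCpu: true);   // fall back to CPU execution if the GPU can't be used
```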

  • What did you do?
    Installed Microsoft.ML.OnnxTransformer 1.5.0 and Microsoft.ML.OnnxRuntime 1.3.0 and used ApplyOnnxModel in a pipeline.

  • What happened?
    Calling ApplyOnnxModel throws System.TypeInitializationException.

  • What did you expect?
    That my ONNX model can be used.
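A minimal sketch of the repro described in the bullets above (the model path and column names are placeholders for an actual ONNX model):

```csharp
using Microsoft.ML;

var mlContext = new MLContext();

// With Microsoft.ML.OnnxTransformer 1.5.0 and Microsoft.ML.OnnxRuntime 1.3.0
// installed, this call throws System.TypeInitializationException because the
// native onnxruntime DLL cannot be resolved at load time.
var pipeline = mlContext.Transforms.ApplyOnnxModel(
    outputColumnNames: new[] { "output" },
    inputColumnNames: new[] { "input" },
    modelFile: "model.onnx");
```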

Source code / logs

Inner exception message:

"Unable to load DLL 'onnxruntime' or one of its dependencies: The specified module could not be found. (0x8007007E)"

Metadata

Labels

  • Awaiting User Input: awaiting author to supply further info (data, model, repro); will close issue if no more info is given
  • P3: doc bugs, questions, minor issues, etc.
  • documentation: related to documentation of ML.NET
  • question: further information is requested
