This repository was archived by the owner on Apr 8, 2025. It is now read-only.

How to get Inferencer output for token_classification & sequence_classification heads at the same time? #408

@FTuma

Description

Question
I'm currently in the process of building a custom processor with a sequence_classification & a token_classification head. After some small adjustments to the FARM codebase and the creation of a new type of processor, I was able to successfully train and evaluate my model on the eval & test data, but I'm stuck predicting on new inputs...

My task at hand is Joint Intent Classification & Slot Filling, i.e. essentially text classification + NER, but it could probably be easily extended to any other task involving joint token & sequence classification. I would be open to providing a PR once it is working properly (at the very least for the more specific Joint Intent Classification & Slot Filling task).
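To make the desired output concrete, here is a minimal sketch of what a combined prediction for one utterance could look like in this joint task. All names and the output format here are illustrative assumptions, not FARM's actual API:

```python
# Hypothetical sketch: merge a sequence-level intent label with
# token-level slot tags into one prediction dict per utterance.
# Function name, format, and labels are invented for illustration.

def combine_joint_predictions(text, intent_pred, slot_preds):
    """Pair each whitespace token with its slot tag and attach the intent."""
    tokens = text.split()
    assert len(tokens) == len(slot_preds), "expected one slot tag per token"
    return {
        "text": text,
        "intent": intent_pred,                   # from the sequence_classification head
        "slots": list(zip(tokens, slot_preds)),  # from the token_classification head
    }

result = combine_joint_predictions(
    "book a flight to Berlin",
    "BookFlight",
    ["O", "O", "O", "O", "B-destination"],
)
print(result["intent"])     # BookFlight
print(result["slots"][-1])  # ('Berlin', 'B-destination')
```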

I would like to know what's the easiest/best way to get the inference output for a custom task consisting of 1 token_classification & 1 sequence_classification head.
Currently I'm running into a KeyError because formatted_preds of the trained model falls through to the else clause used for Natural Questions.
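One possible direction (a minimal pure-Python sketch of the dispatch pattern, not FARM's real code — the head classes, attributes, and routing logic here are all assumptions) is to have the model-level formatted_preds route each head's logits to that head's own formatter, instead of assuming a single output type such as Natural Questions:

```python
# Sketch of per-head dispatch in a model-level formatted_preds.
# SequenceHead/TokenHead are stand-ins for FARM prediction heads;
# the logits here are fake, pre-decoded values for illustration.

class SequenceHead:
    ph_output_type = "per_sequence"
    def formatted_preds(self, logits, **kwargs):
        return {"task": "text_classification", "predictions": logits}

class TokenHead:
    ph_output_type = "per_token"
    def formatted_preds(self, logits, **kwargs):
        return {"task": "ner", "predictions": logits}

def formatted_preds(heads, all_logits, **kwargs):
    """Route each head's logits to that head's own formatted_preds,
    so no head falls through to an unrelated else branch."""
    return [
        head.formatted_preds(logits=logits, **kwargs)
        for head, logits in zip(heads, all_logits)
    ]

out = formatted_preds(
    [SequenceHead(), TokenHead()],
    [["BookFlight"], [["O", "B-destination"]]],
)
print(out[0]["task"])  # text_classification
print(out[1]["task"])  # ner
```

The idea is that the Inferencer then receives a list with one formatted entry per head, rather than one merged structure keyed for a single task.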

Additional context

[Screenshot: FARM KeyError traceback]

Metadata

Labels: question (Further information is requested)
