This repository was archived by the owner on Apr 8, 2025. It is now read-only.

can't make inference on conll03-en  #320

@renaud

Description

I am trying to perform NER inference on conll03-en.

This is what I tried:

  1. Train conll03-en from config, then perform inference with saved model
from farm.experiment import run_experiment, load_experiments
from farm.infer import Inferencer

# Train from the config, then load the saved model for inference
experiments = load_experiments('experiments/ner/conll2003_en_config.json')
run_experiment(experiments[0])

model = Inferencer.load('saved_models/CONLL2003')
basic_texts = [
    {"text": "Japan began the defence of their Asian Cup title with a lucky 2 win against Syria in a Group C championship match on Friday."},
]
result = model.inference_from_dicts(dicts=basic_texts)
print(result)

This does not return any predictions: [{'task': 'ner', 'predictions': []}]
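For reference, a minimal check of the structure returned above (the field names are taken from the output shown; the non-empty case in the test is hypothetical):

```python
# The Inferencer returns one dict per task; an empty "predictions"
# list means no entity spans were produced for the input batch.
result = [{'task': 'ner', 'predictions': []}]

def has_predictions(result):
    """Return True if any task dict in the result carries predictions."""
    return any(r.get('predictions') for r in result)

print(has_predictions(result))  # False for the output shown above
```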

  2. I also tried rewriting conll2003_en_config.json as pure Python code by adapting examples/ner.py, but had no luck either.

  3. Also note that NER evaluation works in both setups above; only inference fails.

  4. Note also that examples/ner.py (German CoNLL) worked fine for inference.

What am I missing?

Thanks

Labels: question (further information is requested)