Add Generative QA Models like RAG #443

@tholor

Description

What?
So far, most generative QA models have not been very useful in practice because they can only answer fairly generic questions that were covered by their "wiki + web" training corpora. For industry use cases we mostly want to:

  • ask domain-specific questions
  • get the answer from a specified, reliable corpus
  • be able to verify the answer (e.g. by inspecting the document / context it came from)

Pure generative models don't fulfill these requirements. However, recent retrieval-augmented approaches could be worth testing.

How?

The latest transformers release includes Facebook's RAG model (https://ai.facebook.com/blog/retrieval-augmented-generation-streamlining-the-creation-of-intelligent-natural-language-processing-models). We could add a "TransformersGenerator" class in Haystack that takes the documents returned by the retriever and generates an answer conditioned on them.
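A rough sketch of what such a class could look like. Everything here is an assumption, not existing Haystack API: the class name, `build_context`, and `predict` are illustrative, and for simplicity the sketch conditions a plain seq2seq model on the concatenated retrieved texts. The actual RAG model would instead tie retrieval into generation via `RagRetriever` / `RagTokenForGeneration` from transformers.

```python
def build_context(question, documents, max_docs=5):
    """Flatten the retriever's documents into one conditioning string.

    Expects documents as dicts with a "text" field (assumed shape,
    mirroring what a Haystack retriever returns).
    """
    context = " ".join(doc["text"] for doc in documents[:max_docs])
    return f"question: {question} context: {context}"


class TransformersGenerator:
    """Sketch: generate a free-form answer conditioned on retrieved docs."""

    def __init__(self, model_name="facebook/bart-large"):
        self.model_name = model_name
        self._model = None  # loaded lazily so construction stays cheap
        self._tokenizer = None

    def _load(self):
        # Heavy imports/downloads deferred until the first prediction.
        from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
        self._tokenizer = AutoTokenizer.from_pretrained(self.model_name)
        self._model = AutoModelForSeq2SeqLM.from_pretrained(self.model_name)

    def predict(self, question, documents):
        if self._model is None:
            self._load()
        inputs = self._tokenizer(
            build_context(question, documents),
            return_tensors="pt",
            truncation=True,
        )
        output_ids = self._model.generate(input_ids=inputs["input_ids"])
        return self._tokenizer.batch_decode(
            output_ids, skip_special_tokens=True
        )[0]
```

A Finder/Pipeline could then call `generator.predict(question, retriever_results)` instead of a reader, which would also keep the source documents around so users can check where the answer came from.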
