
Integrate Transformer-Based Detectors #184

Open
meriemjebali opened this issue Nov 20, 2024 · 0 comments
Description of Problem:

While Hugging Face models can be used easily and efficiently within the Melusine scope, the current version includes only deterministic, regex-based detectors.

Overview of the Solution:

  1. Technical features

Add a base class that loads models from Hugging Face and incorporates them into the detection workflow, thus providing more accurate results.

  • Adding a MelusineMlDetector base class with the following methods:
    • pre_detect
    • by_regex_detect
    • load_model
    • predict
    • by_ml_detect
    • post_detect
  2. Functional implementations:

    • Adding a DissatisfactionDetector to detect dissatisfaction emails, and/or
    • Enriching the thanksDetector with a transformer-based model for enhanced predictions
  3. Documentation
    Documentation of this work will be necessary, to provide a how-to template for using these kinds of models in the Melusine environment.
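To make the proposed interface concrete, here is a minimal, self-contained sketch of what a MelusineMlDetector could look like. Only the method names come from the list above; everything else (the row dict layout, the threshold, the way the two signals are combined) is an assumption for illustration, not Melusine's actual API:

```python
import re
from typing import Any, Dict, List


class MelusineMlDetector:
    """Sketch of a detector mixing regex rules with an ML model.

    Only the method names are taken from this issue; the behaviour
    below is illustrative, not Melusine's real implementation.
    """

    positive_patterns: List[str] = []  # regexes used by by_regex_detect

    def pre_detect(self, row: Dict[str, Any]) -> Dict[str, Any]:
        # Normalise the text that the later steps consume.
        row["effective_text"] = (row.get("body") or "").strip().lower()
        return row

    def by_regex_detect(self, row: Dict[str, Any]) -> Dict[str, Any]:
        text = row["effective_text"]
        row["regex_match"] = any(re.search(p, text) for p in self.positive_patterns)
        return row

    def load_model(self) -> None:
        # Real subclasses would load a Hugging Face model here, e.g. a
        # text-classification pipeline built on distil-camembert-base.
        raise NotImplementedError

    def predict(self, text: str) -> float:
        # Probability that the text belongs to the detected class.
        raise NotImplementedError

    def by_ml_detect(self, row: Dict[str, Any], threshold: float = 0.5) -> Dict[str, Any]:
        row["ml_score"] = self.predict(row["effective_text"])
        row["ml_match"] = row["ml_score"] >= threshold
        return row

    def post_detect(self, row: Dict[str, Any]) -> Dict[str, Any]:
        # Combine both signals; a real detector might weight them differently.
        row["detection"] = row["regex_match"] or row["ml_match"]
        return row

    def detect(self, row: Dict[str, Any]) -> Dict[str, Any]:
        for step in (self.pre_detect, self.by_regex_detect,
                     self.by_ml_detect, self.post_detect):
            row = step(row)
        return row
```

A DissatisfactionDetector would then subclass this, declare its regex patterns, and implement load_model / predict on top of a fine-tuned checkpoint.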

Examples:
Models such as distil-camembert-base.
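A checkpoint fine-tuned from distil-camembert-base could be served through the transformers text-classification pipeline. As a hedged sketch (the label name and the score mapping below are assumptions, not part of this issue), the detector's predict step would reduce the pipeline output to a single score like this:

```python
def hf_score(pipe, text: str, positive_label: str = "dissatisfied") -> float:
    """Map a Hugging Face text-classification result to one probability.

    `pipe` is assumed to behave like a transformers pipeline, e.g.
    pipe = pipeline("text-classification", model=...) where the model is
    fine-tuned from distil-camembert-base; the label name "dissatisfied"
    is hypothetical.
    """
    result = pipe(text)[0]  # e.g. {"label": "dissatisfied", "score": 0.97}
    score = result["score"]
    # If the top label is the positive class, keep its score;
    # otherwise return the complementary probability (binary case).
    return score if result["label"] == positive_label else 1.0 - score
```

Keeping this mapping separate from the detector makes the pipeline easy to swap or mock in tests.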

Blockers: None

Definition of Done: New Melusine version
