Using BERT for conditional natural language generation by fine-tuning pre-trained BERT on a custom dataset.
1. Install the necessary requirements.
2. Download the fine-tuned BERT model and place it at the same level as the code.
3. Run `python3 app.py`.
4. Open `index.html` in the browser.
- This implementation performs fill-in-the-blank prediction recursively until the target text length is reached (see the generation sketch after this list).
- Top-k sampling (with k=5) is used as the decoding strategy at every time step. One could also implement beam search for experimentation.
- `bert-base-uncased` is the pre-trained model, whose language model head is fine-tuned on hotel reviews (a fine-tuning sketch also follows the list).
- Download the review-fine-tuned BERT model here.
- You can adjust the length of the text you want to generate using the slider shown in the demo image. The slider operates at word-level granularity, and the current limit is a maximum of 100 words at a time.
- You can choose between the Random Hop and Left-to-Right generation schemes; Random Hop usually produces better text than Left-to-Right. Both schemes appear in the generation sketch below.
- Left-to-Right Scheme
- Random Hop Scheme
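
The generation loop described above can be sketched as follows. This is a minimal illustration rather than the repo's actual `app.py`: it assumes the Hugging Face `transformers` API, and `MODEL_PATH`, the `generate` helper, and the seed text are all hypothetical. The sketch fills a block of `[MASK]` tokens one position at a time, picking the position either left to right or by random hop, and sampling each token from the top-k (k=5) logits.

```python
import random

import torch
from transformers import BertForMaskedLM, BertTokenizer

# Hypothetical path: point this at the downloaded review-fine-tuned
# checkpoint, or use the stock "bert-base-uncased" weights.
MODEL_PATH = "bert-base-uncased"

tokenizer = BertTokenizer.from_pretrained(MODEL_PATH)
model = BertForMaskedLM.from_pretrained(MODEL_PATH)
model.eval()


def generate(seed_text, length=20, k=5, scheme="random"):
    """Fill [MASK] slots recursively until `length` tokens are generated.

    scheme="left_to_right" fills masks in order; scheme="random"
    (Random Hop) fills a randomly chosen remaining mask at each step.
    """
    seed_ids = tokenizer.encode(seed_text, add_special_tokens=False)
    # Layout: [CLS], seed tokens, `length` [MASK] slots, [SEP].
    ids = ([tokenizer.cls_token_id] + seed_ids
           + [tokenizer.mask_token_id] * length + [tokenizer.sep_token_id])
    masked = list(range(1 + len(seed_ids), 1 + len(seed_ids) + length))

    while masked:
        pos = masked[0] if scheme == "left_to_right" else random.choice(masked)
        with torch.no_grad():
            logits = model(torch.tensor([ids])).logits[0, pos]
        # Top-k sampling: keep the k most likely tokens, renormalize,
        # and draw one of them.
        topk = torch.topk(logits, k)
        probs = torch.softmax(topk.values, dim=-1)
        ids[pos] = topk.indices[torch.multinomial(probs, 1)].item()
        masked.remove(pos)

    return tokenizer.decode(ids[1:-1])


print(generate("the hotel room was", length=15, scheme="random"))
```

One intuition for why Random Hop tends to read better: with left-to-right filling, each token is picked while everything to its right is still `[MASK]`, whereas random hop lets later predictions condition on already-filled tokens on both sides.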
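
The fine-tuning script itself isn't included here, so the following is only a plausible sketch of how `bert-base-uncased` could be tuned as a masked LM on hotel reviews using the Hugging Face `transformers` API. The `hotel_reviews.txt` file, the hyperparameters, and the output directory name are all assumptions.

```python
import torch
from torch.utils.data import DataLoader
from transformers import (BertForMaskedLM, BertTokenizer,
                          DataCollatorForLanguageModeling)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical file: one hotel review per line.
with open("hotel_reviews.txt") as f:
    reviews = [line.strip() for line in f if line.strip()]

encodings = [tokenizer(r, truncation=True, max_length=128) for r in reviews]

# The collator pads each batch and applies standard BERT masking
# (15% of tokens), producing the `labels` tensor for the MLM loss.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)
loader = DataLoader(encodings, batch_size=16, shuffle=True,
                    collate_fn=collator)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):
    for batch in loader:
        loss = model(**batch).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

model.save_pretrained("bert-hotel-reviews")
tokenizer.save_pretrained("bert-hotel-reviews")
```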