This is a demo showcasing Dolphin's ability to generate conversational datasets for training AI, using a simple Python script.

Usage

  • install ollama
  • ollama pull dolphin-mixtral
  • pip install ollama
  • python generate-usecase-chats.py
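
For orientation, here is a minimal sketch of what a script like generate-usecase-chats.py might do, using the official ollama Python client and the dolphin-mixtral model. The system prompt, topic, turn count, and output filename are illustrative assumptions, not the actual script's contents.

```python
# Hypothetical sketch of a generation script (not the actual generate-usecase-chats.py).
import json
import ollama

SYSTEM_PROMPT = "You are Dolphin, a helpful AI assistant."  # assumed system prompt

def generate_conversation(topic: str, turns: int = 3) -> dict:
    """Generate one conversation: a system message plus human/gpt turn pairs."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    conversation = {"system": SYSTEM_PROMPT, "turns": []}
    for _ in range(turns):
        # Ask the model to play the user side and pose a realistic question.
        user_turn = ollama.chat(
            model="dolphin-mixtral",
            messages=messages + [{
                "role": "user",
                "content": f"Write one realistic user question about {topic}.",
            }],
        )["message"]["content"]
        messages.append({"role": "user", "content": user_turn})
        # Let the model answer as the assistant.
        reply = ollama.chat(model="dolphin-mixtral", messages=messages)["message"]["content"]
        messages.append({"role": "assistant", "content": reply})
        conversation["turns"].append({"human": user_turn, "gpt": reply})
    return conversation

if __name__ == "__main__":
    data = [generate_conversation("home networking") for _ in range(2)]
    with open("usecase-chats.json", "w") as f:
        json.dump(data, f, indent=2)
```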

Example output: (screenshot of generated conversations)

There is a reason I'm outputting in this wonky schema: it forces exactly one system message and alternating human/gpt pairs per conversation.

After generating, you will need to convert it to ShareGPT format (either Axolotl style or LlamaFactory style).
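
As an example, here is a sketch of a conversion to Axolotl-style ShareGPT (one JSON object per line, with "from"/"value" keys). The input field names ("system", "turns", "human", "gpt") are assumptions based on the schema description above; adjust them to whatever generate-usecase-chats.py actually emits.

```python
# Minimal sketch: convert the generated file into ShareGPT-style JSONL for Axolotl.
import json

def to_sharegpt(record: dict) -> dict:
    # One system message first, then alternating human/gpt turns.
    conversations = [{"from": "system", "value": record["system"]}]
    for turn in record["turns"]:
        conversations.append({"from": "human", "value": turn["human"]})
        conversations.append({"from": "gpt", "value": turn["gpt"]})
    return {"conversations": conversations}

with open("usecase-chats.json") as f:
    records = json.load(f)

with open("sharegpt.jsonl", "w") as f:
    for record in records:
        f.write(json.dumps(to_sharegpt(record)) + "\n")
```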