Releases: simonw/llm-gemini

0.9

22 Jan 2025, 03:38
  • Added support for grounding prompts against Google search using the new -o google_search 1 option (example below). Thanks, Ricardo Mestre. #29
  • Added support for text/plain and text/csv attachments (example below). Thanks, Michele Gregori. #34
  • Fixed bug handling application/ogg attachments. #35
  • New model: learnlm-1.5-pro-experimental. Thanks, Leonard Tulipan. #36
  • New model: gemini-2.0-flash-thinking-exp-01-21. #37
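  For example (the model choices, prompts, and file name here are illustrative):
    llm -m gemini-1.5-pro-latest 'what are the latest developments in LLM tooling?' \
      -o google_search 1
    llm -m gemini-1.5-flash-latest 'summarize this file' -a report.csv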

0.8

19 Dec 2024, 23:44
  • Support for Gemini 2.0 Flash Thinking Mode experimental model: llm -m gemini-2.0-flash-thinking-exp-1219 "Solve 3*x^3-5*x=1". #31
  • Fixed bug with async models used in conversations. Thanks, Sukhbinder Singh. #30

0.7

11 Dec 2024, 16:10

0.6

06 Dec 2024, 17:29
  • New model: gemini-exp-1206. #27

0.5

02 Dec 2024, 00:12
  • Track token usage using new capability introduced in LLM 0.19. #25
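  For example, assuming the -u/--usage flag added in LLM 0.19 (the prompt is illustrative):
    llm -m gemini-1.5-flash-latest 'say hello' -u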

0.4.2

22 Nov 2024, 00:43
  • Support for new gemini-exp-1121 model. #26

0.5a0

20 Nov 2024, 05:13
Pre-release
  • Track token usage. #25

0.4.1

18 Nov 2024, 08:04
  • Depends on LLM 0.18 or higher. #23

0.4

18 Nov 2024, 07:33
  • Handle attachments that are sent without a prompt. #20
  • Support for new gemini-exp-1114 model. Thanks, Dominik Hayon. #21
  • Support for JSON output mode using -o json_object 1 (example below). #22
  • Now provides async versions of the Gemini models, compatible with LLM 0.18. #23
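  For example (the prompt is illustrative):
    llm -m gemini-1.5-flash-latest 'Invent three names for a pet pelican, as a JSON array' \
      -o json_object 1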

0.3

29 Oct 2024, 04:12
  • Multi-modal model support with LLM 0.17 attachments. Gemini 1.5 models can now accept images, audio and video. #17
    llm -m gemini-1.5-flash-8b-latest 'describe image' \
      -a https://static.simonwillison.net/static/2024/pelicans.jpg
  • Support for code execution mode. #18
    llm -m gemini-1.5-pro-latest 'write and execute python to calculate factorial of 13' \
      -o code_execution 1
  • Support for options: temperature, max_output_tokens, top_p, top_k. Pass these as e.g. -o temperature 0.5. #3
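  For example, combining several of these options in one call (values and prompt are illustrative):
    llm -m gemini-1.5-flash-latest 'write a haiku about a pelican' \
      -o temperature 0.9 -o max_output_tokens 100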