project / repo / mailing list / issues
ℹ️ Note: The canonical project locations are linked above. Other locations are mirrors.
An AI assistant for your terminal.
- Works with Anthropic and OpenAI models
- Configurable parameters: change the model, system prompt, response length, and more
- Automatic environment detection: automatically detects your system and shell environment for accurate responses
- Automatic copy and paste: optionally copies the generated response to the clipboard and pastes it in your terminal, ready to execute
- Install `term-assist` using `pipx`: `pipx install term-assist`
- In your terminal, configure your API keys depending on which model(s) you want to use (a sketch of the key setup follows this list).
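A minimal sketch of the key setup, assuming `term-assist` reads the standard `ANTHROPIC_API_KEY` and `OPENAI_API_KEY` environment variables used by the Anthropic and OpenAI SDKs; the exact mechanism your installation expects may differ.

```sh
# Assumption: term-assist picks up the providers' standard environment variables.
export ANTHROPIC_API_KEY="sk-ant-..."   # only needed for Anthropic models
export OPENAI_API_KEY="sk-..."          # only needed for OpenAI models

# Add the exports to your shell profile (e.g. ~/.bashrc or ~/.zshrc) to persist them.
```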
usage: ta [-h] [--version] [--model MODEL] [--environment ENVIRONMENT]
          [prompt ...]

term-assist: an AI assistant for your terminal.

positional arguments:
  prompt                prompt for the AI model

options:
  -h, --help            show this help message and exit
  --version             display the program version
  --model MODEL, -m MODEL
                        specify a model to use in the format BRAND:MODEL
                        (overrides the setting in your config file)
  --environment ENVIRONMENT, -e ENVIRONMENT
                        specify environment details (overrides automatic
                        environment detection)
> ta unzip a tgz archive
> ta --model openai:gpt-4o follow a file as it updates
> ta -m ...
> ta --environment "windows 95 dos" list com ports
> ta --environment bash list free drive space
> ta -e ...
The configuration file is `~/.config/term-assist/config.json`; it will be created on first run and initialized with default configuration parameters if it does not already exist. See `~/.config/term-assist/config_default.json` for the default configuration.
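For example, one way to see which settings you have changed from the shipped defaults (the paths are the ones given above; the `diff` invocation itself is only an illustration):

```sh
# Compare your configuration against the defaults created on first run.
# Paths come from the section above; adjust if your config lives elsewhere.
diff ~/.config/term-assist/config_default.json ~/.config/term-assist/config.json
```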
The AI model to use. This parameter should be set in the format `BRAND:MODEL`. For example, to use OpenAI's GPT-4o model, set this parameter to `openai:gpt-4o`. See `~/.config/term-assist/models.json` for available models.
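As a quick illustration (the `openai:gpt-4o` value and the `models.json` path come from the text above; the prompt is just a placeholder):

```sh
# See which BRAND:MODEL strings are available.
cat ~/.config/term-assist/models.json

# Try a different model for a single invocation without editing config.json.
ta --model openai:gpt-4o show the ten largest files in this directory
```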
The maximum number of tokens that will be generated for output.
The amount of randomness injected into the response. Ranges from 0.0 to 1.0.
The system prompt that is given to the model. Automatically collected information about your system environment will be appended to the end of this string. This information can be overridden using the `--environment`/`-e` option.
If true, automatically copy the AI's response to your clipboard.
If true (and if `auto_copy` is also true), automatically paste the AI's response so it is ready to execute.
Module tests are located in the `src/term_assist/test/` directory.

- Install Docker.
- Run `run.sh`, which will build the Docker image and initiate testing inside a container.
- Results will be copied to the `src/term_assist/test/output` directory in both XML and plaintext log formats.
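A minimal sketch of that workflow, assuming `run.sh` sits at the repository root and the Docker daemon is already running; adjust the script path if it lives elsewhere:

```sh
# Build the test image and run the suite inside a container (script name from the
# section above; its location at the repo root is an assumption).
./run.sh

# Inspect the copied results (XML and plaintext logs).
ls src/term_assist/test/output
```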