Clarified aborting examples and readme (#157)
Clarified aborting examples and updated readme
ParthSareen authored Nov 5, 2024
1 parent 6b6221d commit c97f231
Showing 5 changed files with 109 additions and 60 deletions.
6 changes: 4 additions & 2 deletions README.md
@@ -204,8 +204,10 @@ ollama.ps()
ollama.abort()
```

- This method will abort all streamed generations currently running.
- All asynchronous threads listening to streams (typically the ```for await (const part of response)```) will throw an ```AbortError``` exception
+ This method will abort **all** streamed generations currently running with the client instance.
+ If there is a need to manage streams with timeouts, it is recommended to have one Ollama client per stream.
+
+ All asynchronous threads listening to streams (typically the ```for await (const part of response)```) will throw an ```AbortError``` exception. See [examples/abort/abort-all-requests.ts](examples/abort/abort-all-requests.ts) for an example.

## Custom client

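For reference only (not part of this commit): a minimal sketch of catching the `AbortError` described in the README text above around a single streamed generation. It assumes a Node/ESM setup with top-level await and a locally pulled `llama3.2` model; the 2-second timeout is purely illustrative.

```typescript
import ollama from 'ollama'

// Illustrative timeout: abort all in-flight streams on the default client after 2 seconds.
setTimeout(() => ollama.abort(), 2000)

try {
  const stream = await ollama.generate({
    model: 'llama3.2', // assumed to be available locally
    prompt: 'Write a long story about dragons',
    stream: true,
  })
  for await (const part of stream) {
    process.stdout.write(part.response)
  }
} catch (error) {
  // abort() surfaces in the awaiting loop as an error named 'AbortError'; rethrow anything else.
  if (error instanceof Error && error.name === 'AbortError') {
    console.log('\nGeneration aborted')
  } else {
    throw error
  }
}
```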
55 changes: 55 additions & 0 deletions examples/abort/abort-all-requests.ts
@@ -0,0 +1,55 @@
import ollama from 'ollama'

// Set a timeout to abort all requests after 5 seconds
setTimeout(() => {
  console.log('\nAborting all requests...\n')
  ollama.abort()
}, 5000) // 5000 milliseconds = 5 seconds

// Start multiple concurrent streaming requests
Promise.all([
  ollama.generate({
    model: 'llama3.2',
    prompt: 'Write a long story about dragons',
    stream: true,
  }).then(
    async (stream) => {
      console.log(' Starting stream for dragons story...')
      for await (const chunk of stream) {
        process.stdout.write(' 1> ' + chunk.response)
      }
    }
  ),

  ollama.generate({
    model: 'llama3.2',
    prompt: 'Write a long story about wizards',
    stream: true,
  }).then(
    async (stream) => {
      console.log(' Starting stream for wizards story...')
      for await (const chunk of stream) {
        process.stdout.write(' 2> ' + chunk.response)
      }
    }
  ),

  ollama.generate({
    model: 'llama3.2',
    prompt: 'Write a long story about knights',
    stream: true,
  }).then(
    async (stream) => {
      console.log(' Starting stream for knights story...')
      for await (const chunk of stream) {
        process.stdout.write(' 3> ' + chunk.response)
      }
    }
  )
]).catch(error => {
  if (error.name === 'AbortError') {
    console.log('All requests have been aborted')
  } else {
    console.error('An error occurred:', error)
  }
})
50 changes: 50 additions & 0 deletions examples/abort/abort-single-request.ts
@@ -0,0 +1,50 @@
import { Ollama } from 'ollama'

// Create multiple ollama clients
const client1 = new Ollama()
const client2 = new Ollama()

// Set a timeout to abort just the first request after 5 seconds
setTimeout(() => {
  console.log('\nAborting dragons story...\n')
  // abort the first client
  client1.abort()
}, 5000) // 5000 milliseconds = 5 seconds

// Start multiple concurrent streaming requests with different clients
Promise.all([
  client1.generate({
    model: 'llama3.2',
    prompt: 'Write a long story about dragons',
    stream: true,
  }).then(
    async (stream) => {
      console.log(' Starting stream for dragons story...')
      for await (const chunk of stream) {
        process.stdout.write(' 1> ' + chunk.response)
      }
    }
  ),

  client2.generate({
    model: 'llama3.2',
    prompt: 'Write a short story about wizards',
    stream: true,
  }).then(
    async (stream) => {
      console.log(' Starting stream for wizards story...')
      for await (const chunk of stream) {
        process.stdout.write(' 2> ' + chunk.response)
      }
    }
  ),
]).catch(error => {
  if (error.name === 'AbortError') {
    console.log('Dragons story request has been aborted')
  } else {
    console.error('An error occurred:', error)
  }
})


27 changes: 0 additions & 27 deletions examples/abort/any-request.ts

This file was deleted.

31 changes: 0 additions & 31 deletions examples/abort/specific-request.ts

This file was deleted.
