
Edgen Node API Library

This library provides convenient access to the Edgen REST API from TypeScript or JavaScript. It is a fork of the official OpenAI Node library.

The original library was generated from the OpenAPI specification with Stainless.

Installation

npm install --save edgen
# or
yarn add edgen

Usage

The full API of this library can be found in the api.md file, along with many code examples. The code below shows how to get started with the chat completions API.

import Edgen from 'edgen';

const edgen = new Edgen({
  baseURL: process.env['EDGEN_BASE_URL'], // This is the default and can be omitted
});

async function main() {
  const chatCompletion = await edgen.chat.completions.create({
    messages: [{ role: 'user', content: 'Say this is a test' }],
    model: 'default',
  });
}

main();
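
The returned chat completion follows the familiar OpenAI-compatible shape, so (assuming the response contains at least one choice) the generated text can be read from the first choice. Continuing main() above:

  // Inside main() above: the assistant's reply is on the first choice.
  console.log(chatCompletion.choices[0]?.message?.content);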

Streaming Responses

We provide support for streaming responses using Server Sent Events (SSE).

import Edgen from 'edgen';

const edgen = new Edgen();

async function main() {
  const stream = await edgen.chat.completions.create({
    model: 'default',
    messages: [{ role: 'user', content: 'Say this is a test' }],
    stream: true,
  });
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
  }
}

main();

If you need to cancel a stream, you can break from the loop or call stream.controller.abort().
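
As a minimal sketch, reusing the stream from the example above, aborting after the first chunk could look like this:

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
  stream.controller.abort(); // cancel the underlying request
  break; // or simply break out of the loop without aborting
}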

Request & Response types

This library includes TypeScript definitions for all request params and response fields. You may import and use them like so:

import Edgen from 'edgen';

const edgen = new Edgen();

async function main() {
  const params: Edgen.Chat.ChatCompletionCreateParams = {
    messages: [{ role: 'user', content: 'Say this is a test' }],
    model: 'default',
  };
  const chatCompletion: Edgen.Chat.ChatCompletion = await edgen.chat.completions.create(params);
}

main();

Documentation for each method, request param, and response field is available in docstrings and will appear on hover in most modern editors.

Handling errors

When the library is unable to connect to the API, or if the API returns a non-success status code (i.e., 4xx or 5xx response), a subclass of APIError will be thrown:

async function main() {
  const chatCompletion = await edgen.chat.completions
    .create({
      messages: [{ role: 'user', content: 'Say this is a test' }],
      model: 'default',
    })
    .catch((err) => {
      if (err instanceof Edgen.APIError) {
        console.log(err.status); // 400
        console.log(err.name); // BadRequestError
        console.log(err.headers); // {server: 'nginx', ...}
      } else {
        throw err;
      }
    });
}

main();

Error codes are as follows:

Status Code   Error Type
400           BadRequestError
401           AuthenticationError
403           PermissionDeniedError
404           NotFoundError
422           UnprocessableEntityError
429           RateLimitError
>=500         InternalServerError
N/A           APIConnectionError
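
As a sketch, assuming each error type in the table is exposed on the client class in the same way as Edgen.APIError above, you can branch on the specific subclass:

import Edgen from 'edgen';

const edgen = new Edgen();

async function main() {
  try {
    await edgen.chat.completions.create({
      messages: [{ role: 'user', content: 'Say this is a test' }],
      model: 'default',
    });
  } catch (err) {
    if (err instanceof Edgen.RateLimitError) {
      console.log('Rate limited (429); back off and retry later.');
    } else if (err instanceof Edgen.APIConnectionError) {
      console.log('The request never reached the API.');
    } else {
      throw err;
    }
  }
}

main();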

Retries

Certain errors will be automatically retried 2 times by default, with a short exponential backoff. Connection errors (for example, due to a network connectivity problem), 408 Request Timeout, 409 Conflict, 429 Rate Limit, and >=500 Internal errors will all be retried by default.

You can use the maxRetries option to configure or disable this:

// Configure the default for all requests:
const edgen = new Edgen({
  maxRetries: 0, // default is 2
});

// Or, configure per-request:
await edgen.chat.completions.create({ messages: [{ role: 'user', content: 'How can I get the name of the current day in Node.js?' }], model: 'default' }, {
  maxRetries: 5,
});

Timeouts

Requests time out after 10 minutes by default. You can configure this with a timeout option:

// Configure the default for all requests:
const edgen = new Edgen({
  timeout: 20 * 1000, // 20 seconds (default is 10 minutes)
});

// Override per-request:
await edgen.chat.completions.create({ messages: [{ role: 'user', content: 'How can I list all files in a directory using Python?' }], model: 'default' }, {
  timeout: 5 * 1000,
});

On timeout, an APIConnectionTimeoutError is thrown.

Note that requests which time out will be retried twice by default.
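
A minimal sketch of catching it, assuming APIConnectionTimeoutError is exposed on the client class like the other error types (maxRetries: 0 is set here so the timeout surfaces immediately instead of being retried):

try {
  await edgen.chat.completions.create(
    { messages: [{ role: 'user', content: 'Say this is a test' }], model: 'default' },
    { timeout: 5 * 1000, maxRetries: 0 },
  );
} catch (err) {
  if (err instanceof Edgen.APIConnectionTimeoutError) {
    console.log('The request timed out.');
  } else {
    throw err;
  }
}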

Advanced Usage

Customizing the fetch client

By default, this library uses node-fetch in Node, and expects a global fetch function in other environments.

If you would prefer to use a global, web-standards-compliant fetch function even in a Node environment (for example, if you are running Node with --experimental-fetch or using NextJS, which polyfills with undici), add the following import before your first import from "edgen":

// Tell TypeScript and the package to use the global web fetch instead of node-fetch.
// Note, despite the name, this does not add any polyfills, but expects them to be provided if needed.
import 'edgen/shims/web';
import Edgen from 'edgen';

To do the inverse, add import "edgen/shims/node" (which does import polyfills). This can also be useful if you are getting the wrong TypeScript types for Response - more details here.
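
For example, the inverse setup looks like this:

// Tell TypeScript and the package to use node-fetch; this shim does import polyfills.
import 'edgen/shims/node';
import Edgen from 'edgen';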

You may also provide a custom fetch function when instantiating the client, which can be used to inspect or alter the Request or Response before/after each request:

import { fetch } from 'undici'; // as one example
import Edgen from 'edgen';

const client = new Edgen({
  fetch: async (url: RequestInfo, init?: RequestInit): Promise<Response> => {
    console.log('About to make a request', url, init);
    const response = await fetch(url, init);
    console.log('Got response', response);
    return response;
  },
});

Note that if the DEBUG=true environment variable is set, this library will log all requests and responses automatically. This is intended for debugging purposes only and may change in the future without notice.

Semantic Versioning

This package follows SemVer conventions.

We take backwards-compatibility seriously and work hard to ensure you can rely on a smooth upgrade experience.

Requirements

TypeScript >= 4.5 is supported.

The following runtimes are supported:

  • Node.js 18 LTS or later (non-EOL) versions.
  • Bun 1.0 or later.
  • Cloudflare Workers.
  • Vercel Edge Runtime.
  • Jest 28 or greater with the "node" environment ("jsdom" is not supported at this time).
  • Nitro v2.6 or greater.

Note that React Native is not supported at this time.

If you are interested in other runtime environments, please open or upvote an issue on GitHub.