
fix(specs): Typos in API descriptions #3932

Merged 6 commits on Oct 9, 2024
3 changes: 3 additions & 0 deletions scripts/buildLanguages.ts
@@ -59,6 +59,9 @@ async function buildLanguage(language: Language, gens: Generator[], buildType: B

if (buildType !== 'guides') {
fileNames = gens.reduce((prev, curr) => `${prev} ${createClientName(curr.client, curr.language)}.ts`, '');
+} else if (!fileNames.includes('search')) {
+  // only search is needed for guides right now, if it's not being built, no need to validate guides
+  break;
}

await run(`yarn tsc ${fileNames} --noEmit`, {
6 changes: 3 additions & 3 deletions specs/crawler/common/schemas/configuration.yml
@@ -9,7 +9,7 @@ Configuration:
actions:
type: array
description: |
-Instructions how to process crawled URLs.
+Instructions about how to process crawled URLs.

Each action defines:

@@ -31,7 +31,7 @@
The API key must have the following access control list (ACL) permissions:
`search`, `browse`, `listIndexes`, `addObject`, `deleteObject`, `deleteIndex`, `settings`, `editSettings`.
The API key must not be the admin API key of the application.
-The API key must have access to the indices which the crawler is supposed to create.
+The API key must have access to create the indices that the crawler will use.
For example, if `indexPrefix` is `crawler_`, the API key must have access to all `crawler_*` indices.
appId:
$ref: '../parameters.yml#/applicationID'
@@ -54,7 +54,7 @@
description: |
References to external data sources for enriching the extracted records.

-For more information, see [Enrich extrated records with external data](https://www.algolia.com/doc/tools/crawler/guides/enriching-extraction-with-external-data/).
+For more information, see [Enrich extracted records with external data](https://www.algolia.com/doc/tools/crawler/guides/enriching-extraction-with-external-data/).
maxItems: 10
items:
type: string
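To make the `apiKey`/`indexPrefix` wording above concrete, here is a minimal sketch of a crawler configuration. Field names (`appId`, `apiKey`, `indexPrefix`, `actions`, `recordExtractor`) follow Algolia's public Crawler docs; the values, and the assumption that records land in `indexPrefix` + `indexName`, are illustrative rather than taken from this diff.

```ts
// Hypothetical crawler configuration sketch (placeholder values).
const configuration = {
  appId: 'YOUR_APP_ID',
  // Not the admin API key: it needs the search, browse, listIndexes, addObject,
  // deleteObject, deleteIndex, settings, and editSettings ACLs, plus access to
  // every index matching the prefix below (crawler_*).
  apiKey: 'YOUR_CRAWLER_SCOPED_API_KEY',
  indexPrefix: 'crawler_',
  actions: [
    {
      indexName: 'docs', // assumed final index: indexPrefix + indexName, i.e. crawler_docs
      pathsToMatch: ['https://www.example.com/docs/**'],
      recordExtractor: ({ url, $ }: { url: URL; $: any }) => [
        { objectID: url.href, title: $('title').text() },
      ],
    },
  ],
};
```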
4 changes: 2 additions & 2 deletions specs/crawler/paths/crawlerTaskCancel.yml
@@ -4,8 +4,8 @@ post:
description: |
Cancels a blocking task.

-Tasks that ran into an error block the futher schedule of your Crawler.
-To unblock the crawler, you can cancel the blocking task.
+Tasks that ran into an error will block your crawler's schedule.
+To unblock the crawler, cancel the blocking task.
tags:
- tasks
parameters:
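As a rough illustration of the description above, this is what cancelling a blocking task could look like over HTTP. The `crawler.algolia.com/api/1/crawlers/{id}/tasks/{taskId}/cancel` path and the basic-authentication credentials (Crawler user ID plus Crawler API key, the same header scheme mentioned in the crawler spec below) are assumptions based on the public Crawler API docs, not on this diff.

```ts
// Hypothetical call to unblock a crawler by cancelling its blocking task.
const crawlerUserId = 'YOUR_CRAWLER_USER_ID';
const crawlerApiKey = 'YOUR_CRAWLER_API_KEY';
const crawlerId = 'YOUR_CRAWLER_ID';
const taskId = 'BLOCKING_TASK_ID';

// Basic authentication header: base64 of "id:key".
const auth = Buffer.from(`${crawlerUserId}:${crawlerApiKey}`).toString('base64');

const response = await fetch(
  `https://crawler.algolia.com/api/1/crawlers/${crawlerId}/tasks/${taskId}/cancel`,
  { method: 'POST', headers: { Authorization: `Basic ${auth}` } },
);
console.log(response.status); // a 2xx status means the blocking task was cancelled
```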
4 changes: 2 additions & 2 deletions specs/crawler/spec.yml
@@ -14,7 +14,7 @@ info:

## Availability and authentication

-Acess to the Crawler API is available with the [Crawler add-on](https://www.algolia.com/pricing/).
+Access to the Crawler API is available with the [Crawler add-on](https://www.algolia.com/pricing/).

To authenticate your API requests, use the **basic authentication** header:

@@ -73,7 +73,7 @@ tags:
In the Crawler configuration, you specify which URLs to crawl, when to crawl, how to extract records from the crawl, and where to index the extracted records.
The configuration is versioned, so you can always restore a previous version.
It's easiest to make configuration changes in the [Crawler dashboard](https://crawler.algolia.com/admin/).
-The editor has autocomplete and builtin validation so you can try your configuration changes before comitting them.
+The editor has autocomplete and built-in validation so you can try your configuration changes before committing them.
- name: crawlers
x-displayName: Crawler
description: |
2 changes: 1 addition & 1 deletion specs/ingestion/common/schemas/transformation.yml
@@ -35,7 +35,7 @@ Description:
description: A descriptive name for your transformation of what it does.

AuthenticationIDs:
-description: The authentications associated for the current transformation.
+description: The authentications associated with the current transformation.
type: array
items:
$ref: './common.yml#/authenticationID'
2 changes: 1 addition & 1 deletion specs/ingestion/paths/runs/events/events.yml
@@ -2,7 +2,7 @@ get:
tags:
- observability
summary: List task run events
-description: Retrieves a list of events for a task run, identified by it's ID.
+description: Retrieves a list of events for a task run, identified by its ID.
operationId: listEvents
x-acl:
- addObject
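A hedged sketch of the `listEvents` operation above using plain `fetch`. The `data.<region>.algolia.com` host and the `/1/runs/{runID}/events` path are inferred from the spec's file layout and should be treated as assumptions.

```ts
// Hypothetical request for the observability events of a single task run.
const appId = 'YOUR_APP_ID';
const apiKey = 'YOUR_API_KEY';
const runID = 'YOUR_RUN_ID';

const res = await fetch(`https://data.us.algolia.com/1/runs/${runID}/events`, {
  headers: {
    'X-Algolia-Application-Id': appId,
    'X-Algolia-API-Key': apiKey,
  },
});
const { events } = await res.json(); // one entry per event recorded for the run
console.log(events);
```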
7 changes: 1 addition & 6 deletions specs/ingestion/spec.yml
@@ -38,14 +38,9 @@ info:
Successful responses return a `2xx` status. Client errors return a `4xx` status. Server errors are indicated by a `5xx` status.
Error responses have a `message` property with more information.

-The Insights API doesn't validate if the event parameters such as `indexName`, `objectIDs`, or `userToken`,
-correspond to anything in the Search API. It justs checks if they're formatted correctly.
-Check the [Events](https://dashboard.algolia.com/events/health) health section,
-whether your events can be used for Algolia features such as Analytics, or Dynamic Re-Ranking.

## Version

-The current version of the Insights API is version 1, as indicated by the `/1/` in each endpoint's URL.
+The current version of the Ingestion API is version 1, as indicated by the `/1/` in each endpoint's URL.

version: 1.0.0
components:
6 changes: 3 additions & 3 deletions specs/insights/spec.yml
@@ -10,7 +10,7 @@ info:
## Client libraries

Use Algolia's API clients, libraries, and integrations to collect events from your UI and send them to the Insights API.
-See: [Algolia's ecosystem](https://www.algolia.com/doc/guides/getting-started/how-algolia-works/in-depth/ecosystem/)
+See: [Algolia's ecosystem](https://www.algolia.com/doc/guides/getting-started/how-algolia-works/in-depth/ecosystem/).

## Base URLs

@@ -45,7 +45,7 @@ info:
Error responses have a `message` property with more information.

The Insights API doesn't validate if the event parameters such as `indexName`, `objectIDs`, or `userToken`,
-correspond to anything in the Search API. It justs checks if they're formatted correctly.
+correspond to anything in the Search API. It just checks if they're formatted correctly.
Check the [Events](https://dashboard.algolia.com/events/health) health section,
whether your events can be used for Algolia features such as Analytics, or Dynamic Re-Ranking.

@@ -79,7 +79,7 @@ tags:
x-displayName: Events
description: >-
Events represent user interactions with your website or app.
-They include details like the event's name, type, a timestamp or a user token.
+They include details like the event's name, type, a timestamp, or a user token.
- name: usertokens
x-displayName: User tokens
description: |
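Because the spec above stresses that the Insights API only checks formatting, here is a minimal, correctly formatted `click` event. The `https://insights.algolia.io/1/events` endpoint and payload shape follow Algolia's public Insights documentation; the index, user token, and object ID are placeholders.

```ts
// Minimal click event; indexName/objectIDs/userToken are not validated against the Search API.
await fetch('https://insights.algolia.io/1/events', {
  method: 'POST',
  headers: {
    'X-Algolia-Application-Id': 'YOUR_APP_ID',
    'X-Algolia-API-Key': 'YOUR_SEARCH_API_KEY',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    events: [
      {
        eventType: 'click',
        eventName: 'Product Clicked',
        index: 'products',
        userToken: 'anonymous-42',
        objectIDs: ['9780545139700'],
        timestamp: Date.now(), // optional timestamp in milliseconds
      },
    ],
  }),
});
```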
2 changes: 1 addition & 1 deletion specs/monitoring/common/parameters.yml
@@ -3,7 +3,7 @@ Clusters:
name: clusters
in: path
required: true
-description: Subset of clusters, separated by comma.
+description: Subset of clusters, separated by commas.
schema:
# The `correct` (?) schema should be type array/items string,
# But the SDK generator expects a string, because it can't replace a list of strings in the URL.
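To show how the comma-separated `clusters` path parameter above is used: only the parameter format comes from this diff; the `status.algolia.com/1/status/{clusters}` endpoint is an assumption based on the public Monitoring API docs.

```ts
// The SDKs pass the cluster list as a single comma-separated string in the URL path.
const clusters = ['c1-de', 'c2-de'].join(','); // "c1-de,c2-de"

const res = await fetch(`https://status.algolia.com/1/status/${clusters}`);
console.log(await res.json()); // per-cluster status (assumed response shape)
```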