
Commit

various minor edits
benironside committed Nov 18, 2024
1 parent e968841 commit 194dabd
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions docs/AI-for-security/knowledge-base.asciidoc
@@ -143,7 +143,7 @@ Refer to the following video for an example of adding an index to Knowledge Base

You can use an {es} connector or web crawler to create an index that contains data you want to add to Knowledge Base.

- This section provides an example of adding a threat intelligence feed to Knowledge Base using a web crawler. For more information on adding data to {es} using a connector, refer to {ref}/es-connectors.html[Ingest data with Elastic connectors]. For more information on web crawlers, refer to {ref}/crawler.html[Elastic web crawler].
+ This section provides an example of adding a threat intelligence feed to Knowledge Base using a web crawler. For more information on adding data to {es} using a connector, refer to {ref}/es-connectors.html[Ingest data with Elastic connectors]. For more information on web crawlers, refer to {enterprise-search-ref}/crawler.html[Elastic web crawler].

[discrete]
==== Use a web crawler to add threat intelligence to Knowledge Base
@@ -154,12 +154,12 @@ First, you'll need to set up a web crawler to add the desired data to an index,
. Click **New web crawler**.
.. Under **Index name**, name the index where the data from your new web crawler will be stored, for example `threat_intelligence_feed_1`. Click **Create index**.
.. Under **Domain URL**, enter the URL where the web crawler should collect data. Click **Validate Domain** to test it, then **Add domain**.
- . The previous step opens a page with the details of your new crawler. Go to its **Mappings** tab, then click **Add field**.
+ . The previous step opens a page with the details of your new index. Go to its **Mappings** tab, then click **Add field**.
+
NOTE: Remember, each index added to Knowledge Base must have at least one semantic text field.
.. Under **Field type**, select `Semantic text`. Under **Select an inference endpoint**, select `elastic-security-ai-assistant-elser2`. Click **Add field**, then **Save mapping**.
. Go to the **Scheduling** tab. Enable the **Enable recurring crawls with the following schedule** setting, and define your desired schedule.
- . Go to the **Manage Domains** tab. Select the domain associated with your new web crawler, then go the its **Crawl rules** tab and click **Add crawl rule**.
+ . Go to the **Manage Domains** tab. Select the domain associated with your new web crawler, then go to its **Crawl rules** tab and click **Add crawl rule**. For more information, refer to {enterprise-search-ref}/crawler-extraction-rules.html[Web crawler content extraction rules].
.. Under **Policy**, select `Allow`. Under **Rule**, select `Contains`. Under **Path pattern**, enter your path pattern, for example `threat-intelligence`. Click **Save**.
.. Click **Add crawl rule** again. Under **Policy**, select `Disallow`. Under **Rule**, select `Regex`. Under **Path pattern**, enter `.*`. Click **Save**.
.. Click **Crawl**, then **Crawl all domains on this index**. A message appears that says "Successfully scheduled a sync, waiting for a connector to pick it up".
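The Allow/Disallow rule pair in the steps above can be sketched as a simple path filter. This is a hypothetical illustration of the rule semantics only (rules are evaluated in order and the first matching rule decides), not the crawler's actual implementation; the `should_crawl` helper and `CRAWL_RULES` structure are names invented for this sketch.

```python
import re

# Hypothetical sketch of the two crawl rules configured above.
# Rules are checked top to bottom; the first match determines the policy.
CRAWL_RULES = [
    # Policy: Allow, Rule: Contains, Path pattern: "threat-intelligence"
    ("allow", lambda path: "threat-intelligence" in path),
    # Policy: Disallow, Rule: Regex, Path pattern: ".*" (catch-all)
    ("disallow", lambda path: re.fullmatch(".*", path) is not None),
]

def should_crawl(path: str) -> bool:
    """Return True if the first matching rule allows this path."""
    for policy, matches in CRAWL_RULES:
        if matches(path):
            return policy == "allow"
    return True  # unreachable here: the final regex matches every path

print(should_crawl("/blog/threat-intelligence/latest"))  # True
print(should_crawl("/about"))                            # False
```

The catch-all Disallow rule is what restricts the crawl to only the paths the Allow rule matches; without it, the crawler would collect every page on the domain.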
