Update more docs
samzhou2 committed Oct 11, 2024
1 parent e9bbc18 commit a81793b
Showing 21 changed files with 319 additions and 124 deletions.
39 changes: 33 additions & 6 deletions modules/cloud4/modules/gsql-editor/pages/index.adoc
@@ -7,17 +7,17 @@ The GSQL Editor is a powerful tool for developing and executing GSQL queries, al

The GSQL Editor contains 4 work areas:

image::Screenshot 2024-04-18 at 10.04.32 AM.png[]
* Left image:Screenshot 2024-04-18 at 10.05.54 AM.png[width=50] - GSQL xref:gsql-editor:index.adoc#_file_list[File List ] Panel.
* Bottom image:Screenshot 2024-04-18 at 10.06.17 AM.png[width=50] - GSQL Result Panel.
* Right image:Screenshot 2024-04-18 at 10.06.37 AM.png[width=50] Schema Designer Panel.
image::Screenshot 2024-10-11 at 11.39.23.png[]
* Left image:left.png[] - GSQL xref:gsql-editor:index.adoc#_file_list[File List] Panel.
* Bottom image:bottom.png[] - GSQL Result Panel.
* Right image:right.png[] - Schema Designer Panel.
+
[NOTE]
====
For more details please see xref:cloud4:schema-designer:index.adoc[].
====

* Center image:Screenshot 2024-04-23 at 10.05.36 PM.png[width=50] Main GSQL Editing Panel.
* Main area image:main.png[] - Main GSQL Editing Panel.

[TIP]
====
@@ -35,7 +35,7 @@ It helps you organize and manage your GSQL queries effectively with a list of GS

* Navigate through the files and folders by clicking on their names in the list.
+
image:Screenshot 2024-04-18 at 10.07.50 AM.png[]
image:Screenshot 2024-10-11 at 10.59.49.png[]
+
* It also includes a search bar that allows you to search for specific files by their names.
+
@@ -86,6 +86,33 @@ image::Screenshot 2024-04-18 at 10.10.47 AM.png[width=250]
+
image::Screenshot 2024-04-18 at 10.11.14 AM.png[width=250]

== Query List

The Query List in the GSQL Editor provides an overview of all custom queries for each graph. It helps you manage your queries efficiently, allowing you to create, edit, delete, and install queries.

* View all custom queries for the current graph.

image::queries.png[]

* Create a query by clicking the btn:[ + ] button next to the target graph.

image::add-query.png[]

[TIP]
====
This feature helps you quickly view and edit custom queries on your graph.
====

image::query-details.png[]

* Edit existing queries by selecting the query and clicking the btn:[ Edit ] button.

* Delete queries that are no longer needed by selecting the query and clicking the btn:[ Delete ] button.

* If the query is not installed, you can install it by selecting the query and clicking the btn:[ Install ] button (see the example query sketch below).

image::install-query.png[]
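
For context, each entry in the Query List corresponds to a GSQL query defined on the graph.
The minimal sketch below shows what such a query and its installation might look like; the graph name `Social`, vertex type `Person`, and edge type `Friendship` are hypothetical placeholders rather than names taken from this guide.

[source,gsql]
----
// Hypothetical example: a one-hop query on a graph named "Social".
// "Person" and "Friendship" are placeholder vertex and edge types.
CREATE QUERY friends_of(VERTEX<Person> seed) FOR GRAPH Social {
  Start = {seed};
  Result = SELECT t
           FROM Start:s -(Friendship:e)- Person:t;
  PRINT Result;
}

// Installing the query compiles it so it can be run from the Query List.
INSTALL QUERY friends_of
----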

== Edit Schema

The schema defines the structure of your graph database, influencing its performance and functionality.
4 changes: 3 additions & 1 deletion modules/cloud4/modules/load-data/nav.adoc
@@ -1,5 +1,7 @@
* xref:index.adoc[Load Data]
** xref:load-data:load-from-local.adoc[]
** xref:load-data:load-from-cloud.adoc[]
** xref:load-data:load-from-s3.adoc[]
** xref:load-data:load-from-gcs.adoc[]
** xref:load-data:load-from-blob.adoc[]
** xref:load-data:load-from-other-sources.adoc[]
*** xref:load-data:jdbc.adoc[]
12 changes: 10 additions & 2 deletions modules/cloud4/modules/load-data/pages/index.adoc
@@ -51,9 +51,17 @@ Step-by-step support will be added for all sources, but during the beta release

Check out our step-by-step guide on loading data from a local file.

== xref:cloud4:load-data:load-from-cloud.adoc[]
== xref:cloud4:load-data:load-from-s3.adoc[]

Check out our step-by-step guide on loading data from a AWS s3.
Check out our step-by-step guide on loading data from AWS S3.

== xref:cloud4:load-data:load-from-gcs.adoc[]

Check out our step-by-step guide on loading data from Google Cloud Storage.

== xref:cloud4:load-data:load-from-blob.adoc[]

Check out our step-by-step guide on loading data from Azure Blob Storage.


== xref:cloud4:load-data:load-from-other-sources.adoc[]
modules/cloud4/modules/load-data/pages/load-from-blob.adoc
@@ -1,4 +1,4 @@
= Load from the AWS S3
= Load from Azure Blob Storage
:experimental:

If you store your data in Azure Blob Storage, TigerGraph Cloud provides seamless integration for data ingestion.
@@ -67,8 +67,7 @@ image::Screenshot 2024-04-17 at 5.55.35 PM.png[]
+
[CAUTION]
====
TigerGraph Cloud 4 is still in beta release and the schema generation feature is still a preview feature.
The correctness and efficiency of the resulting graph schema and mapping could vary.
The schema generation feature is still a preview feature. The correctness and efficiency of the resulting graph schema and mapping could vary.
====

. In the `Source` column, you can choose the specific column from the data source that you want to map with the attribute.
109 changes: 109 additions & 0 deletions modules/cloud4/modules/load-data/pages/load-from-gcs.adoc
@@ -0,0 +1,109 @@
= Load from Google Cloud Storage (GCS)
:experimental:

If you store your data in Google Cloud Storage (GCS), TigerGraph Cloud provides seamless integration for data ingestion.
You can directly load data from your GCS buckets into your graph databases, eliminating the need for manual data transfers.
This simplifies the process of importing large datasets and enables you to leverage the scalability and durability of GCS for your graph analysis.

== 1) Select Source

.Once you’ve selected Google Cloud Storage, you will be asked to configure the GCS data source.
. Click on image:Screenshot 2024-04-17 at 9.36.58 PM.png[width=50] to add a new GCS data source.
+
image:Screenshot 2024-04-17 at 9.36.27 PM.png[]

. You will need to provide the access credentials for your GCS bucket.
+
image:Screenshot 2024-04-17 at 9.37.32 PM.png[]
. Once you have those configured, you can add one or more `GCS URI` values within the same GCS bucket (see the example after this list).

. Click btn:[ Next ] to process the file(s).
+
[NOTE]
====
The current data loading tool only supports CSV files. Other formats will be available in later releases.
====
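
As a rough illustration, GCS object URIs are commonly written in the form `gs://<bucket>/<object-path>`.
The bucket and file names below are hypothetical placeholders, not values from this guide, and the exact form accepted by the loader may differ.

----
gs://my-example-bucket/data/person.csv
gs://my-example-bucket/data/friendship.csv
----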

== 2) Configure File
.This step lets you configure the source file details.
. The data loading tool will automatically detect the `.csv` separators and line breaks.
The parser automatically splits each line into a series of tokens.
+
image::Screenshot 2024-04-17 at 5.53.45 PM.png[]
+
[TIP]
====
If the parsing is *not* correct, click on the image:Screenshot 2024-04-17 at 5.54.17 PM.png[width=75]
button to configure different parsing options, such as the delimiter, `eol`, `quote`, and `header`.
image:Screenshot 2024-04-17 at 5.54.50 PM.png[]
====
+
The enclosing character is used to mark the boundaries of a token, overriding the delimiter character (a sample file illustrating this appears after this list).
+
====
For example, if your delimiter is a comma, but you have commas in some strings, then you can define single or double quotes as the enclosing character to mark the endpoints of your string tokens.
====
+
[NOTE]
====
It is not necessary for every token to have enclosing characters. The parser uses enclosing characters when it encounters them.
====
+
[TIP]
====
You can edit the header line of the parsing result to give each column a more intuitive name, since you will be referring to these names when mapping the data to the graph.
The header names themselves are not loaded as data.
====

. Once you are satisfied with the file settings, click btn:[ Next ] to proceed.
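
To make the enclosing-character behavior described above concrete, here is a small hypothetical CSV sample in which the delimiter is a comma and double quotes enclose a field that itself contains a comma:

----
id,name,address
1,Tom,"100 Main St, Springfield"
2,Dana,42 Oak Ave
----

Only the second row needs the enclosing quotes; as noted above, the parser applies enclosing characters only where it encounters them.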

== 3) Configure Map

.If you are loading data into a brand-new graph, you will be prompted to let our engine generate a schema and mapping for you, or you can start from scratch. For more details on schema design, please refer to xref:cloud4:schema-designer:index.adoc[Schema Designer].
. Select `Generate the schema only` or `Generate the schema and data mapping`.
+
image::Screenshot 2024-04-17 at 5.55.35 PM.png[]
+
[CAUTION]
====
The schema generation feature is still a preview feature. The correctness and efficiency of the resulting graph schema and mapping could vary.
====

. In the `Source` column, you can choose the specific column from the data source that you want to map with the attribute.
+
image::Screenshot 2024-04-17 at 5.56.15 PM.png[]
+
. Use the `+` button to create a new attribute of the target vertex or edge.
+
image::Screenshot 2024-04-17 at 5.57.07 PM.png[]

. The `Map all to target` button aligns existing attribute names with the corresponding data source headers to establish mappings, and also creates new attributes for any unmatched data source headers.
+
image::Screenshot 2024-04-17 at 5.57.52 PM.png[]
. Click btn:[Next] to proceed.



== 4) Confirm

.This step will let you confirm the changes made to the schema and the data mapping you created to load the data.
. Simply review the `Schema to be changed` and `Data to be loaded` lists.
+
image::Screenshot 2024-04-17 at 5.58.36 PM.png[]
+
[CAUTION]
====
Please be aware that some schema changes can result in unintended deletion of data. Please review the warning message carefully before confirming the load.
====
. Click on the btn:[Confirm] button to run the loading jobs and monitor their `Status`.
+
image::Screenshot 2024-04-17 at 5.59.16 PM.png[]

== Next Steps

Next, learn how to use the xref:cloud4:schema-designer:index.adoc[Schema Designer] in TigerGraph Cloud 4.

Or return to the xref:cloud4:overview:index.adoc[Overview] page for a different topic.


3 changes: 1 addition & 2 deletions modules/cloud4/modules/load-data/pages/load-from-local.adoc
@@ -61,8 +61,7 @@ image::Screenshot 2024-04-17 at 5.55.35 PM.png[]
+
[CAUTION]
====
TigerGraph Cloud 4 is still in beta release and the schema generation feature is still a preview feature.
The correctness and efficiency of the resulting graph schema and mapping could vary.
The schema generation feature is still a preview feature. The correctness and efficiency of the resulting graph schema and mapping could vary.
====

. In the `Source` column, you can choose the specific column from the data source that you want to map with the attribute.
109 changes: 109 additions & 0 deletions modules/cloud4/modules/load-data/pages/load-from-s3.adoc
@@ -0,0 +1,109 @@
= Load from AWS S3
:experimental:

If you store your data in AWS S3, TigerGraph Cloud provides seamless integration for data ingestion.
You can directly load data from your S3 buckets into your graph databases, eliminating the need for manual data transfers.
This simplifies the process of importing large datasets and enables you to leverage the scalability and durability of AWS S3 for your graph analysis.

== 1) Select Source

.Once you’ve selected AWS S3, you will be asked to configure the AWS S3 data source.
. Click on image:Screenshot 2024-04-17 at 9.36.58 PM.png[width=50] to add a new AWS S3 data source.
+
image:Screenshot 2024-04-17 at 9.36.27 PM.png[]

. You will need to provide your S3 `AWS access key id` and S3 `AWS secret access key`.
+
image:Screenshot 2024-04-17 at 9.37.32 PM.png[]
. Once you have those configured, you can add one or more `S3 URI` values within the same S3 bucket (see the example after this list).

. Click btn:[ Next ] to process the file(s).
+
[NOTE]
====
The current data loading tool only supports CSV files. Other formats will be available in later releases.
====
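
For reference, an S3 URI takes the form `s3://<bucket>/<object-path>`.
The bucket and file names below are hypothetical placeholders, not values from this guide.

----
s3://my-example-bucket/data/person.csv
s3://my-example-bucket/data/friendship.csv
----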

== 2) Configure File
.This step lets you configure the source file details.
. The data loading tool will automatically detect the `.csv` separators and line breaks.
The parser automatically splits each line into a series of tokens.
+
image::Screenshot 2024-04-17 at 5.53.45 PM.png[]
+
[TIP]
====
If the parsing is *not* correct, click on the image:Screenshot 2024-04-17 at 5.54.17 PM.png[width=75]
button to configure different parsing options, such as the delimiter, `eol`, `quote`, and `header`.
image:Screenshot 2024-04-17 at 5.54.50 PM.png[]
====
+
The enclosing character is used to mark the boundaries of a token, overriding the delimiter character.
+
====
For example, if your delimiter is a comma, but you have commas in some strings, then you can define single or double quotes as the enclosing character to mark the endpoints of your string tokens.
====
+
[NOTE]
====
It is not necessary for every token to have enclosing characters. The parser uses enclosing characters when it encounters them.
====
+
[TIP]
====
You can edit the header line of the parsing result to give each column a more intuitive name, since you will be referring to these names when mapping the data to the graph.
The header names themselves are not loaded as data.
====

. Once you are satisfied with the file settings, click btn:[ Next ] to proceed.

== 3) Configure Map

.If you are loading data into a brand-new graph, you will be prompted to let our engine generate a schema and mapping for you (a sketch of what a generated schema might look like appears at the end of this section), or you can start from scratch. For more details on schema design, please refer to xref:cloud4:schema-designer:index.adoc[Schema Designer].
. Select `Generate the schema only` or `Generate the schema and data mapping`.
+
image::Screenshot 2024-04-17 at 5.55.35 PM.png[]
+
[CAUTION]
====
The schema generation feature is still a preview feature. The correctness and efficiency of the resulting graph schema and mapping could vary.
====

. In the `Source` column, you can choose the specific column from the data source that you want to map with the attribute.
+
image::Screenshot 2024-04-17 at 5.56.15 PM.png[]
+
. Use the `+` button to create a new attribute of the target vertex or edge.
+
image::Screenshot 2024-04-17 at 5.57.07 PM.png[]

. The `Map all to target` button aligns existing attribute names with the corresponding data source headers to establish mappings, and also creates new attributes for any unmatched data source headers.
+
image::Screenshot 2024-04-17 at 5.57.52 PM.png[]
. Click btn:[Next] to proceed.
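
To give a sense of what the generated output could look like, the sketch below shows a minimal, hypothetical GSQL schema for a simple person/friendship dataset; the actual schema and mapping produced by the engine depend on your data and may differ.

[source,gsql]
----
// Hypothetical schema the generator might propose; all names are placeholders.
CREATE VERTEX Person (PRIMARY_ID id STRING, name STRING, age INT)
CREATE UNDIRECTED EDGE Friendship (FROM Person, TO Person, since DATETIME)
CREATE GRAPH Social (Person, Friendship)
----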



== 4) Confirm

.This step will let you confirm the changes made to the schema and the data mapping you created to load the data.
. Simply review the `Schema to be changed` and `Data to be loaded` lists.
+
image::Screenshot 2024-04-17 at 5.58.36 PM.png[]
+
[CAUTION]
====
Please be aware that some schema changes can result in unintended deletion of data. Please review the warning message carefully before confirming the load.
====
. Click on the btn:[Confirm] button to run the loading jobs and monitor their `Status`.
+
image::Screenshot 2024-04-17 at 5.59.16 PM.png[]

== Next Steps

Next, learn how to use the xref:cloud4:schema-designer:index.adoc[Schema Designer] in TigerGraph Cloud 4.

Or return to the xref:cloud4:overview:index.adoc[Overview] page for a different topic.


4 changes: 3 additions & 1 deletion modules/cloud4/modules/overview/pages/index.adoc
@@ -56,7 +56,9 @@ Learn how to Load Data into TigerGraph Cloud 4.

xref:load-data:index.adoc[Overview] |
xref:load-data:load-from-local.adoc[Local] |
xref:load-data:load-from-cloud.adoc[Cloud] |
xref:load-data:load-from-s3.adoc[AWS S3] |
xref:load-data:load-from-gcs.adoc[Google Cloud Storage] |
xref:load-data:load-from-blob.adoc[Azure Blob Storage] |
xref:load-data:load-from-other-sources.adoc[Other Sources]
¦
image:TG_Icon_Library-135.png[alt=schemadesigner,width=74,height=74]