# Modify the repl and spark ec documents #773

Merged (2 commits), Dec 16, 2023
## docs/engine-usage/repl.md (17 additions, 0 deletions)

### 3.1 Submit `java` tasks through `Linkis-cli`

Single method
```shell
sh bin/linkis-cli -engineType repl-1 -code \
"import org.apache.commons.lang3.StringUtils;

public void sayHello() {
    System.out.println(\"hello\");
    System.out.println(StringUtils.isEmpty(\"hello\"));
}" \
-codeType repl -runtimeMap linkis.repl.type=java
```

Multiple methods
```shell
sh bin/linkis-cli -engineType repl-1 -code \
"import org.apache.commons.lang3.StringUtils;

public void sayHello() {
    System.out.println(\"hello\");
    System.out.println(StringUtils.isEmpty(\"hello\"));
}
public void sayHi() {
    System.out.println(\"hi\");
    System.out.println(StringUtils.isEmpty(\"hi\"));
}" \
-codeType repl -runtimeMap linkis.repl.type=java -runtimeMap linkis.repl.method.name=sayHi
```
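The escaped quotes inside the `-code` string can be hard to read. A minimal local sketch (no Linkis deployment needed; the snippet names are taken from the example above) that prints the Java code exactly as the shell would hand it to `linkis-cli`:

```shell
# Build the same double-quoted string used above and print what the shell
# actually expands it to; each \" becomes a literal double quote.
CODE="import org.apache.commons.lang3.StringUtils;

public void sayHi() {
    System.out.println(\"hi\");
    System.out.println(StringUtils.isEmpty(\"hi\"));
}"
printf '%s\n' "$CODE"
```

Printing the expansion first is a quick way to catch quoting mistakes before submitting the task.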

### 3.2 Submit `scala` tasks through `Linkis-cli`

## docs/engine-usage/spark.md (13 additions, 0 deletions)

### 3.5 Submitting spark yarn cluster tasks via `Linkis-cli`

Upload the jar package and configuration
```shell
# Upload the jar package under the lib of the linkis spark engine (modify the following parameters according to your actual installation directory)
cd /appcom/Install/linkis/lib/linkis-engineconn-plugins/spark/dist/3.2.1/lib
hdfs dfs -put *.jar hdfs:///spark/cluster/

# Upload the linkis configuration file (modify the following parameters according to your actual installation directory)
cd /appcom/Install/linkis/conf
hdfs dfs -put * hdfs:///spark/cluster/
```
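After the upload it is worth confirming that the files actually landed. A hypothetical check, guarded so it is a no-op on a machine without an HDFS client, assuming the default `hdfs:///spark/cluster` path:

```shell
# List the uploaded jars and config files; skip gracefully when no
# HDFS client is installed on this machine.
if command -v hdfs >/dev/null 2>&1; then
    hdfs dfs -ls hdfs:///spark/cluster/
else
    echo "hdfs client not found; skipping verification"
fi
```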
The `linkis.spark.yarn.cluster.jars` parameter can be used to change the default `hdfs:///spark/cluster` path.
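If the jars need to live somewhere else, the default can be overridden. A hedged sketch, assuming the property is read from `linkis.properties` in the Linkis conf directory (the target HDFS path and install directory below are illustrative):

```shell
# Hypothetical override: point linkis.spark.yarn.cluster.jars at a custom
# HDFS directory; the engine plugin needs a restart to pick up the change.
echo "linkis.spark.yarn.cluster.jars=hdfs:///user/hadoop/spark/cluster" \
    >> /appcom/Install/linkis/conf/linkis.properties
```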

Execute the test case
```shell
# Use `engingeConnRuntimeMode=yarnCluster` to specify the yarn cluster mode
sh ./bin/linkis-cli -engineType spark-3.2.1 -codeType sql -labelMap engingeConnRuntimeMode=yarnCluster -submitUser hadoop -proxyUser hadoop -code "select 123"
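In yarn cluster mode the Spark driver runs inside the YARN ApplicationMaster rather than on the Linkis server, so the submitted task can also be watched from the YARN side. A hypothetical check, double-guarded (requires both a `yarn` command and `HADOOP_HOME`, so it stays a no-op on machines where `yarn` is the Node.js package manager); the `grep` filter is illustrative:

```shell
# List running YARN applications and look for the Spark job Linkis submitted;
# skip gracefully when no Hadoop YARN client is available.
if command -v yarn >/dev/null 2>&1 && [ -n "${HADOOP_HOME:-}" ]; then
    yarn application -list -appStates RUNNING 2>/dev/null | grep -i spark || true
else
    echo "yarn client not found; skipping check"
fi
```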
## repl.md (Chinese / zh-CN version)

### 3.1 Submit `java` tasks through `Linkis-cli`

Single method
```shell
sh bin/linkis-cli -engineType repl-1 -code \
"import org.apache.commons.lang3.StringUtils;

public void sayHello() {
    System.out.println(\"hello\");
    System.out.println(StringUtils.isEmpty(\"hello\"));
}" \
-codeType repl -runtimeMap linkis.repl.type=java
```

Multiple methods
```shell
sh bin/linkis-cli -engineType repl-1 -code \
"import org.apache.commons.lang3.StringUtils;

public void sayHello() {
    System.out.println(\"hello\");
    System.out.println(StringUtils.isEmpty(\"hello\"));
}
public void sayHi() {
    System.out.println(\"hi\");
    System.out.println(StringUtils.isEmpty(\"hi\"));
}" \
-codeType repl -runtimeMap linkis.repl.type=java -runtimeMap linkis.repl.method.name=sayHi
```

### 3.2 Submit `scala` tasks through `Linkis-cli`

## spark.md (Chinese / zh-CN version)

### 3.5 Submitting spark yarn cluster tasks via `Linkis-cli`

Upload the jar package and configuration
```shell
# Upload the jar packages under the lib of the linkis spark engine (modify the following parameters according to your actual installation directory)
cd /appcom/Install/linkis/lib/linkis-engineconn-plugins/spark/dist/3.2.1/lib
hdfs dfs -put *.jar hdfs:///spark/cluster/

# Upload the linkis configuration files (modify the following parameters according to your actual installation directory)
cd /appcom/Install/linkis/conf
hdfs dfs -put * hdfs:///spark/cluster/
```

> **Reviewer (Contributor):** need to put hive-site.xml into the hdfs dir
> **Author (Member):** Ok, thanks for reviewing the code
The `linkis.spark.yarn.cluster.jars` parameter can be used to change the default `hdfs:///spark/cluster` path.

Execute the test case
```shell
# Use `engingeConnRuntimeMode=yarnCluster` to specify the yarn cluster mode
sh ./bin/linkis-cli -engineType spark-3.2.1 -codeType sql -labelMap engingeConnRuntimeMode=yarnCluster -submitUser hadoop -proxyUser hadoop -code "select 123"