Modify the repl and spark ec documents (#773)
* Modify the repl and spark ec documents

* Modify spark.md
ChengJie1053 authored Dec 16, 2023
1 parent f79357b commit 0b4ee17
Showing 4 changed files with 68 additions and 0 deletions.
17 changes: 17 additions & 0 deletions docs/engine-usage/repl.md
@@ -75,6 +75,7 @@ select * from linkis_cg_engine_conn_plugin_bml_resources;

### 3.1 Submit `java` tasks through `Linkis-cli`

Single method
```shell
sh bin/linkis-cli -engineType repl-1 -code \
"import org.apache.commons.lang3.StringUtils;
@@ -85,6 +86,22 @@ select * from linkis_cg_engine_conn_plugin_bml_resources;
-codeType repl -runtimeMap linkis.repl.type=java
```

Multiple methods
```shell
sh bin/linkis-cli -engineType repl-1 -code \
"import org.apache.commons.lang3.StringUtils;
public void sayHello() {
System.out.println(\"hello\");
System.out.println(StringUtils.isEmpty(\"hello\"));
}
public void sayHi() {
System.out.println(\"hi\");
System.out.println(StringUtils.isEmpty(\"hi\"));
}" \
-codeType repl -runtimeMap linkis.repl.type=java -runtimeMap linkis.repl.method.name=sayHi
```
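The inner `\"` escapes matter here: inside the outer double quotes they become literal quotes, so the Java source the engine receives is well-formed. A minimal local sketch of how the shell expands such a string (no Linkis needed; `CODE` is just an illustrative variable):

```shell
# Sketch: how the escaped -code argument expands before linkis-cli sees it.
# Each \" inside the outer double quotes becomes a literal " in the Java source.
CODE="import org.apache.commons.lang3.StringUtils;
public void sayHi() {
    System.out.println(\"hi\");
    System.out.println(StringUtils.isEmpty(\"hi\"));
}"
printf '%s\n' "$CODE"
```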

### 3.2 Submit `scala` tasks through `Linkis-cli`

```shell
17 changes: 17 additions & 0 deletions docs/engine-usage/spark.md
@@ -180,6 +180,23 @@ Token-User: linkis

### 3.5 Submitting spark yarn cluster tasks via `Linkis-cli`

Upload the jar package and configuration
```shell
# Upload the jar package under the lib of the linkis spark engine (modify the following parameters according to your actual installation directory)
cd /appcom/Install/linkis/lib/linkis-engineconn-plugins/spark/dist/3.2.1/lib
hdfs dfs -put *.jar hdfs:///spark/cluster

# Upload the linkis configuration file (modify the following parameters according to your actual installation directory)
cd /appcom/Install/linkis/conf
hdfs dfs -put * hdfs:///spark/cluster

# Upload hive-site.xml (modify the following parameters according to your actual installation directory)
cd $HIVE_CONF_DIR
hdfs dfs -put hive-site.xml hdfs:///spark/cluster
```
The target directory `hdfs:///spark/cluster` can be changed via the `linkis.spark.yarn.cluster.jars` parameter.
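The three uploads above can also be staged locally and pushed with a single `hdfs dfs -put`. The sketch below only assembles the staging directory and prints the upload command it would run; the default paths are assumptions taken from the steps above, so adjust them to your installation:

```shell
# Sketch: stage jars, linkis conf and hive-site.xml locally, then upload once.
# LINKIS_HOME and the HIVE_CONF_DIR fallback are assumptions; override as needed.
STAGING=$(mktemp -d)
LINKIS_HOME=${LINKIS_HOME:-/appcom/Install/linkis}
SPARK_LIB="$LINKIS_HOME/lib/linkis-engineconn-plugins/spark/dist/3.2.1/lib"

for src in "$SPARK_LIB"/*.jar "$LINKIS_HOME/conf"/* "${HIVE_CONF_DIR:-/etc/hive/conf}/hive-site.xml"; do
  if [ -e "$src" ]; then
    cp "$src" "$STAGING"
  fi
done

# Dry run: print the single upload command instead of executing it.
echo "hdfs dfs -put $STAGING/* hdfs:///spark/cluster"
```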

Execute the test case
```shell
# Use `engingeConnRuntimeMode=yarnCluster` to specify the yarn cluster mode
sh ./bin/linkis-cli -engineType spark-3.2.1 -codeType sql -labelMap engingeConnRuntimeMode=yarnCluster -submitUser hadoop -proxyUser hadoop -code "select 123"
@@ -73,6 +73,7 @@ select * from linkis_cg_engine_conn_plugin_bml_resources;

### 3.1 Submit `java` tasks through `Linkis-cli`

Single method
```shell
sh bin/linkis-cli -engineType repl-1 -code \
"import org.apache.commons.lang3.StringUtils;
@@ -83,6 +84,22 @@ select * from linkis_cg_engine_conn_plugin_bml_resources;
-codeType repl -runtimeMap linkis.repl.type=java
```

Multiple methods
```shell
sh bin/linkis-cli -engineType repl-1 -code \
"import org.apache.commons.lang3.StringUtils;
public void sayHello() {
System.out.println(\"hello\");
System.out.println(StringUtils.isEmpty(\"hello\"));
}
public void sayHi() {
System.out.println(\"hi\");
System.out.println(StringUtils.isEmpty(\"hi\"));
}" \
-codeType repl -runtimeMap linkis.repl.type=java -runtimeMap linkis.repl.method.name=sayHi
```

### 3.2 Submit `scala` tasks through `Linkis-cli`

```shell
@@ -178,6 +178,23 @@ Token-User: linkis

### 3.5 Submitting spark yarn cluster tasks via `Linkis-cli`

Upload the jar package and configuration
```shell
# Upload the jar packages under the lib of the linkis spark engine (modify the following parameters according to your actual installation directory)
cd /appcom/Install/linkis/lib/linkis-engineconn-plugins/spark/dist/3.2.1/lib
hdfs dfs -put *.jar hdfs:///spark/cluster

# Upload the linkis configuration files (modify the following parameters according to your actual installation directory)
cd /appcom/Install/linkis/conf
hdfs dfs -put * hdfs:///spark/cluster

# Upload hive-site.xml (modify the following parameters according to your actual installation directory)
cd $HIVE_CONF_DIR
hdfs dfs -put hive-site.xml hdfs:///spark/cluster
```
The target directory `hdfs:///spark/cluster` can be changed via the `linkis.spark.yarn.cluster.jars` parameter.

Execute the test case
```shell
# Use `engingeConnRuntimeMode=yarnCluster` to specify the yarn cluster mode
sh ./bin/linkis-cli -engineType spark-3.2.1 -codeType sql -labelMap engingeConnRuntimeMode=yarnCluster -submitUser hadoop -proxyUser hadoop -code "select 123"
Expand Down
