
add spark.kubernetes.hadoop.conf.configmap.name conf to use existing hadoop conf configmap #599

Open: wants to merge 1 commit into branch-2.2-kubernetes from hadoop-conf-configmap

Conversation

ChenLingPeng

add spark.kubernetes.hadoop.conf.configmap.name conf to use existing hadoop conf configmap

Signed-off-by: forrestchen [email protected]

What changes were proposed in this pull request?

See issue #580.
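
In short, instead of the submission client packaging the local HADOOP_CONF_DIR into a freshly created ConfigMap on every run, the new conf lets a submission point at a ConfigMap that already exists in the cluster. A minimal sketch of the two modes, assuming a ConfigMap named little-hadoop-config already holds the Hadoop conf files (the same name is used in the test script below):

# Without this change: the client builds a new ConfigMap from the local conf dir.
export HADOOP_CONF_DIR=`pwd`/littlehadoopconf
bin/spark-submit --deploy-mode cluster ...

# With this change: re-use the existing ConfigMap instead.
bin/spark-submit --deploy-mode cluster \
  --conf spark.kubernetes.hadoop.conf.configmap.name="little-hadoop-config" \
  ...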

How was this patch tested?

Manual tests.

I manually ran the PageRank example with the following script:

export HADOOP_CONF_DIR=`pwd`/littlehadoopconf
export SPARK_EXECUTOR_MEMORY=2G

bin/spark-submit \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPageRank \
  --master k8s://http://10.0.0.1:8080 \
  --kubernetes-namespace default \
  --conf spark.app.name=forrestperf-pagerank \
  --conf spark.kubernetes.driver.docker.image=spark-driver:cm0108 \
  --conf spark.kubernetes.executor.docker.image=spark-executor:cm0108 \
  --conf spark.kubernetes.initcontainer.docker.image=spark-init:cm0108 \
  --conf spark.kubernetes.resourceStagingServer.uri=http://10.0.0.2:31000 \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir=hdfs://hdfsCluster/spark/ramieventlog \
  --conf spark.kubernetes.delete.executors=false \
  --conf spark.kubernetes.initcontainer.inannotation=true \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.kubernetes.shuffle.namespace=default \
  --conf spark.kubernetes.shuffle.labels="app=spark-shuffle-service,spark-version=2.2.0" \
  --conf spark.kubernetes.hadoop.conf.configmap.name="little-hadoop-config" \
  --conf spark.local.dir=/data/spark-local \
  --conf spark.kubernetes.node.selector.subnet=10.175.106.192-26 \
  --conf spark.kubernetes.docker.image.pullPolicy=Always \
  examples/jars/spark-examples_2.11-2.2.0-k8s-0.5.0.jar \
  hdfs://hdfsCluster/spark/data/pagerank/data/part-00001 10

With the conf spark.kubernetes.hadoop.conf.configmap.name, we can re-use an existing Hadoop conf ConfigMap instead of creating a new one from the local directory indicated by export HADOOP_CONF_DIR=xxx.
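
For reference, such a ConfigMap can be created once from the same conf directory and then shared across submissions. A sketch using plain kubectl (the directory and ConfigMap names are taken from the script above; the describe step is only a sanity check):

# Each file in the directory (core-site.xml, hdfs-site.xml, ...)
# becomes a key in the ConfigMap.
kubectl create configmap little-hadoop-config \
  --from-file=`pwd`/littlehadoopconf \
  --namespace=default

# Verify the keys before pointing spark-submit at the ConfigMap.
kubectl describe configmap little-hadoop-config --namespace=default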

@ChenLingPeng force-pushed the hadoop-conf-configmap branch from 2ec9b55 to b8216b2 on January 11, 2018 04:34
ChenLingPeng (Author) commented Jan 11, 2018

It seems the current build failure is not related to this PR.
[screenshot: CI build failure]

@liyinan926, how can I re-trigger this test?

@ChenLingPeng force-pushed the hadoop-conf-configmap branch from b8216b2 to 704430a on January 18, 2018 08:59