This repository has been archived by the owner on Jan 9, 2020. It is now read-only.

Process spark-env.sh in containers not using spark-class. #605

Open · wants to merge 1 commit into base: branch-2.2-kubernetes

Conversation

coderanger

This allows setting things like HADOOP_CONF_DIR in the more traditional Spark way.
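For context, spark-env.sh is the conventional place to export per-deployment environment variables, and load-spark-env.sh sources it if it exists. A minimal sketch (the path and variable values below are illustrative, not part of this PR):

```sh
# ${SPARK_HOME}/conf/spark-env.sh -- illustrative example, not part of this PR.
# load-spark-env.sh sources this file if it exists, so variables exported here
# become visible to the Spark process running in the container.
export HADOOP_CONF_DIR=/etc/hadoop/conf   # point Spark at the Hadoop client configs
```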

What changes were proposed in this pull request?

Adds a source "${SPARK_HOME}/bin/load-spark-env.sh" to the command in each non-spark-class container.
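As a rough sketch of the effect (illustrative only; the real command string is assembled by the Kubernetes submission code), the container command gains a source step before whatever it was already going to run:

```sh
# Illustrative shape of a non-spark-class container command after this change.
# "$@" stands in for the existing driver/executor launch command, which is an assumption
# here and differs per container.
source "${SPARK_HOME}/bin/load-spark-env.sh"   # pulls in conf/spark-env.sh if present
exec "$@"
```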

How was this patch tested?

Manual testing with my local development environment.

foxish (Member) commented on Jan 24, 2018

@coderanger, it would be great if you could help rebase this entire fork on top of the upstream Spark effort. Then we'd be in a better position to use this PR, since the Dockerfiles etc. are now very different.
