V3.4.1 lyft #57

Merged 89 commits on Oct 10, 2023.

Commits (89)
ed7a392
Preparing development version 3.4.1-SNAPSHOT
xinrong-meng Apr 7, 2023
f19c37b
[SPARK-43069][BUILD] Use `sbt-eclipse` instead of `sbteclipse-plugin`
dongjoon-hyun Apr 7, 2023
d7f5c4c
[SPARK-43067][SS] Correct the location of error class resource file i…
HeartSaVioR Apr 8, 2023
2f6725d
[SPARK-43075][CONNECT] Change `gRPC` to `grpcio` when it is not insta…
bjornjorgensen Apr 9, 2023
a0938cf
[MINOR][SQL][TESTS] Tests in `SubquerySuite` should not drop view cre…
bersprockets Apr 10, 2023
5f53a44
[SPARK-43072][DOC] Include TIMESTAMP_NTZ type in ANSI Compliance doc
gengliangwang Apr 10, 2023
7f31b98
[SPARK-43071][SQL] Support SELECT DEFAULT with ORDER BY, LIMIT, OFFSE…
dtenedor Apr 10, 2023
83484c5
[SPARK-43083][SQL][TESTS] Mark `*StateStoreSuite` as `ExtendedSQLTest`
dongjoon-hyun Apr 10, 2023
933abb8
[SPARK-43085][SQL] Support column DEFAULT assignment for multi-part t…
dtenedor Apr 13, 2023
ede226b
[SPARK-43126][SQL] Mark two Hive UDF expressions as stateful
cloud-fan Apr 14, 2023
89d3e39
[SPARK-43125][CONNECT] Fix Connect Server Can't Handle Exception With…
Hisoka-X Apr 14, 2023
e04bdbe
[SPARK-43050][SQL] Fix construct aggregate expressions by replacing g…
wangyum Apr 15, 2023
e077310
[SPARK-43139][SQL][DOCS] Fix incorrect column names in sql-ref-syntax…
wangyum Apr 16, 2023
64afee8
[SPARK-42475][DOCS][FOLLOW-UP] Fix PySpark connect Quickstart binder …
HyukjinKwon Apr 17, 2023
b0e1263
[SPARK-43158][DOCS] Set upperbound of pandas version for Binder integ…
HyukjinKwon Apr 17, 2023
6a799f0
[SPARK-42475][DOCS][FOLLOW-UP] Fix the version string with dev0 to wo…
HyukjinKwon Apr 17, 2023
de79e2c
Revert "[SPARK-42475][DOCS][FOLLOW-UP] Fix the version string with de…
HyukjinKwon Apr 17, 2023
3dff7ba
[SPARK-43141][BUILD] Ignore generated Java files in checkstyle
HyukjinKwon Apr 16, 2023
29730dd
[SPARK-42078][PYTHON][FOLLOWUP] Add `CapturedException` to utils
itholic Apr 17, 2023
4686fe8
[SPARK-43113][SQL] Evaluate stream-side variables when generating cod…
bersprockets Apr 18, 2023
404259d
[SPARK-43098][SQL] Fix correctness COUNT bug when scalar subquery has…
jchen5 Apr 19, 2023
8bda273
[SPARK-37829][SQL] Dataframe.joinWith outer-join should return a null…
kings129 Apr 19, 2023
b8bb32d
Revert [SPARK-39203][SQL] Rewrite table location to absolute URI base…
cloud-fan Apr 21, 2023
279da72
[MINOR][CONNECT][PYTHON][DOCS] Fix the doc of parameter `num` in `Dat…
zhengruifeng Apr 21, 2023
c5172ad
[SPARK-43113][SQL][FOLLOWUP] Add comment about copying steam-side var…
bersprockets Apr 21, 2023
d3f1eec
[SPARK-43249][CONNECT] Fix missing stats for SQL Command
grundprinzip Apr 24, 2023
8f52bbd
[SPARK-43293][SQL] `__qualified_access_only` should be ignored in nor…
cloud-fan Apr 27, 2023
3b681ff
[SPARK-43156][SQL][3.4] Fix `COUNT(*) is null` bug in correlated scal…
Hisoka-X May 2, 2023
75be2ac
[SPARK-43336][SQL] Casting between Timestamp and TimestampNTZ require…
gengliangwang May 2, 2023
27b2797
[SPARK-43313][SQL] Adding missing column DEFAULT values for MERGE INS…
dtenedor May 4, 2023
0bba750
[SPARK-43378][CORE] Properly close stream objects in deserializeFromC…
eejbyfeldt May 5, 2023
02aa835
[SPARK-43284] Switch back to url-encoded strings
databricks-david-lewis May 5, 2023
3544ee6
[SPARK-43284][SQL][FOLLOWUP] Return URI encoded path, and add a test
databricks-david-lewis May 5, 2023
8b1b153
[SPARK-43337][UI][3.4] Asc/desc arrow icons for sorting column does n…
maytasm May 5, 2023
ff81db5
[SPARK-43340][CORE] Handle missing stack-trace field in eventlogs
amahussein May 5, 2023
d7c034b
[SPARK-43374][INFRA] Move protobuf-java to BSD 3-clause group and upd…
yaooqinn May 5, 2023
eea5dac
[SPARK-43395][BUILD] Exclude macOS tar extended metadata in make-dist…
pan3793 May 6, 2023
b824fc5
[SPARK-43342][K8S] Revert SPARK-39006 Show a directional error messag…
dcoliversun May 7, 2023
0e92da5
[SPARK-43414][TESTS] Fix flakiness in Kafka RDD suites due to port bi…
JoshRosen May 8, 2023
81de599
[SPARK-43425][SQL][3.4] Add `TimestampNTZType` to `ColumnarBatchRow`
Fokko May 11, 2023
2921bb6
[SPARK-43441][CORE] `makeDotNode` should not fail when DeterministicL…
TQJADE May 11, 2023
689e35d
[SPARK-43471][CORE] Handle missing hadoopProperties and metricsProper…
dongjoon-hyun May 11, 2023
e820b23
[SPARK-43483][SQL][DOCS] Adds SQL references for OFFSET clause
beliefer May 15, 2023
5f5459a
[SPARK-43281][SQL] Fix concurrent writer does not update file metrics
ulysses-you May 16, 2023
7234334
[SPARK-43517][PYTHON][DOCS] Add a migration guide for namedtuple monk…
HyukjinKwon May 16, 2023
318ceb0
[SPARK-43043][CORE] Improve the performance of MapOutputTracker.updat…
jiangxb1987 May 16, 2023
dbceb0f
[SPARK-43527][PYTHON] Fix `catalog.listCatalogs` in PySpark
zhengruifeng May 16, 2023
482dce6
[SPARK-42826][3.4][FOLLOWUP][PS][DOCS] Update migration notes for pan…
itholic May 18, 2023
c19e0f3
[SPARK-43547][3.4][PS][DOCS] Update "Supported Pandas API" page to po…
itholic May 18, 2023
f6db4f5
[SPARK-43157][SQL] Clone InMemoryRelation cached plan to prevent clon…
robreeves May 18, 2023
7324cf2
[SPARK-43522][SQL] Fix creating struct column name with index of array
Hisoka-X May 18, 2023
38ae900
[SPARK-43450][SQL][TESTS] Add more `_metadata` filter test cases
olaky May 18, 2023
ce79b99
Revert "[SPARK-43313][SQL] Adding missing column DEFAULT values for M…
cloud-fan May 18, 2023
2681707
[SPARK-43541][SQL][3.4] Propagate all `Project` tags in resolving of …
MaxGekk May 18, 2023
b071461
[SPARK-43587][CORE][TESTS] Run `HealthTrackerIntegrationSuite` in a d…
dongjoon-hyun May 19, 2023
e23d149
[SPARK-43589][SQL] Fix `cannotBroadcastTableOverMaxTableBytesError` t…
dongjoon-hyun May 19, 2023
fc9401e
[SPARK-43718][SQL] Set nullable correctly for keys in USING joins
bersprockets May 23, 2023
21c12e2
[SPARK-43719][WEBUI] Handle `missing row.excludedInStages` field
dongjoon-hyun May 23, 2023
963a368
[MINOR][PS][TESTS] Fix `SeriesDateTimeTests.test_quarter` to work pro…
itholic May 23, 2023
382a0fe
[SPARK-43758][BUILD] Upgrade snappy-java to 1.1.10.0
sunchao May 24, 2023
68d34dd
[SPARK-43758][BUILD][FOLLOWUP][3.4] Update Hadoop 2 dependency manifest
dongjoon-hyun May 24, 2023
2d792a0
[SPARK-43759][SQL][PYTHON] Expose TimestampNTZType in pyspark.sql.types
ueshin May 24, 2023
d8b79c7
[SPARK-43751][SQL][DOC] Document `unbase64` behavior change
pan3793 May 26, 2023
177d172
[SPARK-43802][SQL][3.4] Fix codegen for unhex and unbase64 with failO…
Kimahriman May 27, 2023
3b3ffe3
[SPARK-42421][CORE] Use the utils to get the switch for dynamic alloc…
jiwq May 29, 2023
80a396f
[SPARK-43894][PYTHON] Fix bug in df.cache()
grundprinzip May 31, 2023
4248442
[SPARK-43760][SQL][3.4] Nullability of scalar subquery results
agubichev Jun 1, 2023
fd6397d
[SPARK-43949][PYTHON] Upgrade cloudpickle to 2.2.1
HyukjinKwon Jun 2, 2023
9db9002
[SPARK-43956][SQL][3.4] Fix the bug doesn't display column's sql for …
beliefer Jun 3, 2023
c8884e8
[SPARK-43911][SQL] Use toSet to deduplicate the iterator data to prev…
mcdull-zhang Jun 4, 2023
71f3bbc
Revert "[SPARK-43911][SQL] Use toSet to deduplicate the iterator data…
HyukjinKwon Jun 6, 2023
f7c4f1f
[SPARK-43973][SS][UI] Structured Streaming UI should display failed q…
rednaxelafx Jun 6, 2023
d7532d5
[SPARK-43510][YARN] Fix YarnAllocator internal state when adding runn…
manuzhang Jun 6, 2023
c435245
[SPARK-43976][CORE] Handle the case where modifiedConfigs doesn't exi…
dongjoon-hyun Jun 6, 2023
0f6c5da
[SPARK-43973][SS][UI][TESTS][FOLLOWUP][3.4] Fix compilation by switch…
dongjoon-hyun Jun 6, 2023
c74b99c
[MINOR][SQL][TESTS] Move ResolveDefaultColumnsSuite to 'o.a.s.sql'
dongjoon-hyun Jun 8, 2023
45812eb
[SPARK-42290][SQL] Fix the OOM error can't be reported when AQE on
Hisoka-X Jun 8, 2023
445e3ed
[SPARK-43404][SS][3.4] Skip reusing sst file for same version of Rock…
anishshri-db Jun 9, 2023
1f5d7da
[SPARK-43398][CORE] Executor timeout should be max of idle shuffle an…
warrenzhu25 Jun 12, 2023
9a2bdbe
[SPARK-32559][SQL] Fix the trim logic did't handle ASCII control char…
Jun 13, 2023
ea87ac5
[SPARK-44031][BUILD] Upgrade silencer to 1.7.13
dongjoon-hyun Jun 13, 2023
f1c8b0d
Revert "[SPARK-44031][BUILD] Upgrade silencer to 1.7.13"
dongjoon-hyun Jun 13, 2023
af8425e
[SPARK-44038][DOCS][K8S] Update YuniKorn docs with v1.3
dongjoon-hyun Jun 13, 2023
ead8e1f
[SPARK-44053][BUILD][3.4] Update ORC to 1.8.4
guiyanakuang Jun 14, 2023
729396d
[SPARK-44040][SQL] Fix compute stats when AggregateExec node above Qu…
wangyum Jun 16, 2023
15e69ee
[SPARK-44070][BUILD] Bump snappy-java 1.1.10.1
wangyum Jun 16, 2023
4134bac
[MINOR][K8S][DOCS] Fix all dead links for K8s doc
wangyum Jun 17, 2023
ffa7e68
[SPARK-44018][SQL] Improve the hashCode and toString for some DS V2 E…
beliefer Jun 19, 2023
9e54608
Preparing Spark release v3.4.1-rc1
dongjoon-hyun Jun 19, 2023
Files changed
2 changes: 1 addition & 1 deletion LICENSE-binary
@@ -431,7 +431,6 @@ javolution:javolution
 com.esotericsoftware:kryo-shaded
 com.esotericsoftware:minlog
 com.esotericsoftware:reflectasm
-com.google.protobuf:protobuf-java
 org.codehaus.janino:commons-compiler
 org.codehaus.janino:janino
 jline:jline
@@ -443,6 +442,7 @@ pl.edu.icm:JLargeArrays
 BSD 3-Clause
 ------------

+com.google.protobuf:protobuf-java
 dk.brics.automaton:automaton
 org.antlr:antlr-runtime
 org.antlr:ST4
2 changes: 1 addition & 1 deletion R/pkg/DESCRIPTION
@@ -1,6 +1,6 @@
 Package: SparkR
 Type: Package
-Version: 3.4.0
+Version: 3.4.1
 Title: R Front End for 'Apache Spark'
 Description: Provides an R Front end for 'Apache Spark' <https://spark.apache.org>.
 Authors@R:
2 changes: 1 addition & 1 deletion assembly/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../pom.xml</relativePath>
   </parent>

5 changes: 3 additions & 2 deletions binder/postBuild
@@ -33,9 +33,9 @@ else
 fi

 if [[ ! $VERSION < "3.4.0" ]]; then
-  pip install plotly "pyspark[sql,ml,mllib,pandas_on_spark,connect]$SPECIFIER$VERSION"
+  pip install plotly "pandas<2.0.0" "pyspark[sql,ml,mllib,pandas_on_spark,connect]$SPECIFIER$VERSION"
 else
-  pip install plotly "pyspark[sql,ml,mllib,pandas_on_spark]$SPECIFIER$VERSION"
+  pip install plotly "pandas<2.0.0" "pyspark[sql,ml,mllib,pandas_on_spark]$SPECIFIER$VERSION"
 fi

 # Set 'PYARROW_IGNORE_TIMEZONE' to surpress warnings from PyArrow.
@@ -44,6 +44,7 @@ echo "export PYARROW_IGNORE_TIMEZONE=1" >> ~/.profile
 # Add sbin to PATH to run `start-connect-server.sh`.
 SPARK_HOME=$(python -c "from pyspark.find_spark_home import _find_spark_home; print(_find_spark_home())")
 echo "export PATH=${PATH}:${SPARK_HOME}/sbin" >> ~/.profile
+echo "export SPARK_HOME=${SPARK_HOME}" >> ~/.profile

 # Add Spark version to env for running command dynamically based on Spark version.
 SPARK_VERSION=$(python -c "import pyspark; print(pyspark.__version__)")
2 changes: 1 addition & 1 deletion common/kvstore/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion common/network-common/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion common/network-shuffle/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion common/network-yarn/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion common/sketch/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion common/tags/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion common/unsafe/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

common/unsafe/src/main/java/org/apache/spark/unsafe/types/UTF8String.java
@@ -499,6 +499,14 @@ private UTF8String copyUTF8String(int start, int end) {
     return UTF8String.fromBytes(newBytes);
   }

+  /**
+   * Determines if the specified character (Unicode code point) is white space or an ISO control
+   * character according to Java.
+   */
+  private boolean isWhitespaceOrISOControl(int codePoint) {
+    return Character.isWhitespace(codePoint) || Character.isISOControl(codePoint);
+  }
+
   /**
    * Trims space characters (ASCII 32) from both ends of this string.
    *
@@ -535,14 +543,14 @@ public UTF8String trim() {
   public UTF8String trimAll() {
     int s = 0;
     // skip all of the whitespaces in the left side
-    while (s < this.numBytes && Character.isWhitespace(getByte(s))) s++;
+    while (s < this.numBytes && isWhitespaceOrISOControl(getByte(s))) s++;
     if (s == this.numBytes) {
       // Everything trimmed
       return EMPTY_UTF8;
     }
     // skip all of the whitespaces in the right side
     int e = this.numBytes - 1;
-    while (e > s && Character.isWhitespace(getByte(e))) e--;
+    while (e > s && isWhitespaceOrISOControl(getByte(e))) e--;
     if (s == 0 && e == numBytes - 1) {
       // Nothing trimmed
       return this;
@@ -1131,11 +1139,11 @@ public boolean toLong(LongWrapper toLongResult) {

   private boolean toLong(LongWrapper toLongResult, boolean allowDecimal) {
     int offset = 0;
-    while (offset < this.numBytes && Character.isWhitespace(getByte(offset))) offset++;
+    while (offset < this.numBytes && isWhitespaceOrISOControl(getByte(offset))) offset++;
     if (offset == this.numBytes) return false;

     int end = this.numBytes - 1;
-    while (end > offset && Character.isWhitespace(getByte(end))) end--;
+    while (end > offset && isWhitespaceOrISOControl(getByte(end))) end--;

     byte b = getByte(offset);
     final boolean negative = b == '-';
@@ -1228,11 +1236,11 @@ public boolean toInt(IntWrapper intWrapper) {

   private boolean toInt(IntWrapper intWrapper, boolean allowDecimal) {
     int offset = 0;
-    while (offset < this.numBytes && Character.isWhitespace(getByte(offset))) offset++;
+    while (offset < this.numBytes && isWhitespaceOrISOControl(getByte(offset))) offset++;
     if (offset == this.numBytes) return false;

     int end = this.numBytes - 1;
-    while (end > offset && Character.isWhitespace(getByte(end))) end--;
+    while (end > offset && isWhitespaceOrISOControl(getByte(end))) end--;

     byte b = getByte(offset);
     final boolean negative = b == '-';
common/unsafe/src/test/java/org/apache/spark/unsafe/types/UTF8StringSuite.java
@@ -229,6 +229,10 @@ public void trims() {
     assertEquals(fromString("1"), fromString("1").trim());
     assertEquals(fromString("1"), fromString("1\t").trimAll());

+    assertEquals(fromString("1中文").toString(), fromString("1中文").trimAll().toString());
+    assertEquals(fromString("1"), fromString("1\u0003").trimAll());
+    assertEquals(fromString("1"), fromString("1\u007F").trimAll());
+
     assertEquals(fromString("hello"), fromString(" hello ").trim());
     assertEquals(fromString("hello "), fromString(" hello ").trimLeft());
     assertEquals(fromString(" hello"), fromString(" hello ").trimRight());
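For context on the SPARK-32559 fix above: Character.isWhitespace alone does not match ASCII control characters such as \u0003, which is why trimAll previously left them in place. A minimal standalone sketch (not part of the diff) of the combined predicate:

import static java.lang.Character.isISOControl;
import static java.lang.Character.isWhitespace;

public class TrimAllPredicateDemo {
  // Mirrors the new isWhitespaceOrISOControl helper from the diff above.
  static boolean isWhitespaceOrISOControl(int codePoint) {
    return isWhitespace(codePoint) || isISOControl(codePoint);
  }

  public static void main(String[] args) {
    // '\u0003' (END OF TEXT) is an ISO control character, not Java
    // whitespace, so the old isWhitespace-only loop skipped over it.
    System.out.println(isWhitespace('\u0003'));             // false
    System.out.println(isISOControl('\u0003'));             // true
    System.out.println(isWhitespaceOrISOControl('\u0003')); // true
    // Ordinary whitespace is still trimmed.
    System.out.println(isWhitespaceOrISOControl('\t'));     // true
  }
}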
2 changes: 1 addition & 1 deletion connector/avro/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion connector/connect/client/jvm/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion connector/connect/common/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion connector/connect/server/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../../pom.xml</relativePath>
   </parent>

connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala
@@ -1667,7 +1667,7 @@ class SparkConnectPlanner(val session: SparkSession) {
         .build())

     // Send Metrics
-    SparkConnectStreamHandler.sendMetricsToResponse(sessionId, df)
+    responseObserver.onNext(SparkConnectStreamHandler.createMetricsResponse(sessionId, df))
   }

   private def handleRegisterUserDefinedFunction(
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/SparkConnectService.scala
@@ -74,6 +74,7 @@ class SparkConnectService(debug: Boolean)
   }

   private def buildStatusFromThrowable(st: Throwable): RPCStatus = {
+    val message = StringUtils.abbreviate(st.getMessage, 2048)
     RPCStatus
       .newBuilder()
       .setCode(RPCCode.INTERNAL_VALUE)
@@ -85,7 +86,7 @@
             .setDomain("org.apache.spark")
             .putMetadata("classes", compact(render(allClasses(st.getClass).map(_.getName))))
             .build()))
-      .setMessage(StringUtils.abbreviate(st.getMessage, 2048))
+      .setMessage(if (message != null) message else "")
       .build()
   }

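The guard above matters because Throwable.getMessage can return null and generated protobuf setters reject null. A tiny sketch (illustrative only, assuming Apache Commons Lang's StringUtils) of the failure mode being fixed:

import org.apache.commons.lang3.StringUtils;

public class NullMessageDemo {
  public static void main(String[] args) {
    Throwable st = new RuntimeException(); // constructed without a message
    // StringUtils.abbreviate(null, n) returns null rather than throwing...
    String message = StringUtils.abbreviate(st.getMessage(), 2048);
    // ...so passing it straight to a protobuf setMessage(...) would throw
    // NullPointerException; the fix falls back to an empty string.
    System.out.println(message != null ? message : "");
  }
}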
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/service/SparkConnectStreamHandler.scala
@@ -65,7 +65,7 @@ class SparkConnectStreamHandler(responseObserver: StreamObserver[ExecutePlanResponse])
       SparkConnectStreamHandler.sendSchemaToResponse(request.getSessionId, dataframe.schema))
     processAsArrowBatches(request.getSessionId, dataframe, responseObserver)
     responseObserver.onNext(
-      SparkConnectStreamHandler.sendMetricsToResponse(request.getSessionId, dataframe))
+      SparkConnectStreamHandler.createMetricsResponse(request.getSessionId, dataframe))
     if (dataframe.queryExecution.observedMetrics.nonEmpty) {
       responseObserver.onNext(
         SparkConnectStreamHandler.sendObservedMetricsToResponse(request.getSessionId, dataframe))
@@ -215,7 +215,7 @@ object SparkConnectStreamHandler {
       .build()
   }

-  def sendMetricsToResponse(sessionId: String, rows: DataFrame): ExecutePlanResponse = {
+  def createMetricsResponse(sessionId: String, rows: DataFrame): ExecutePlanResponse = {
     // Send a last batch with the metrics
     ExecutePlanResponse
       .newBuilder()
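The rename reflects a build-then-send split: the helper now only constructs the ExecutePlanResponse, and the caller emits it via responseObserver.onNext. A generic sketch of that pattern (illustrative only, not Spark Connect's actual types):

import java.util.function.Consumer;

public class BuildThenSendDemo {
  // The builder returns a value instead of performing the side effect...
  static String createMetricsResponse(String sessionId) {
    return "metrics-for-" + sessionId;
  }

  public static void main(String[] args) {
    Consumer<String> responseObserver = System.out::println; // stand-in for onNext
    // ...so the caller decides when and where the response is sent.
    responseObserver.accept(createMetricsResponse("session-1"));
  }
}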
2 changes: 1 addition & 1 deletion connector/docker-integration-tests/pom.xml
@@ -22,7 +22,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion connector/kafka-0-10-assembly/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion connector/kafka-0-10-sql/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion connector/kafka-0-10-token-provider/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion connector/kafka-0-10/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

connector/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/KafkaTestUtils.scala
@@ -240,9 +240,7 @@ private[kafka010] class KafkaTestUtils extends Logging {
   private def brokerConfiguration: Properties = {
     val props = new Properties()
     props.put("broker.id", "0")
-    props.put("host.name", localHostNameForURI)
-    props.put("advertised.host.name", localHostNameForURI)
-    props.put("port", brokerPort.toString)
+    props.put("listeners", s"PLAINTEXT://$localHostNameForURI:$brokerPort")
     props.put("log.dir", brokerLogDir)
     props.put("zookeeper.connect", zkAddress)
     props.put("zookeeper.connection.timeout.ms", "60000")
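Conceptually, the single listeners entry above carries the host and port that the three removed legacy properties expressed separately. A sketch (assuming standard Kafka broker configuration keys; the host and port values are stand-ins, not from the PR):

import java.util.Properties;

public class BrokerListenersDemo {
  public static void main(String[] args) {
    String host = "127.0.0.1"; // stand-in for localHostNameForURI
    int port = 9092;           // stand-in for brokerPort
    Properties props = new Properties();
    // One "listeners" key replaces the deprecated
    // host.name / advertised.host.name / port trio.
    props.put("listeners", String.format("PLAINTEXT://%s:%d", host, port));
    System.out.println(props.getProperty("listeners"));
  }
}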
2 changes: 1 addition & 1 deletion connector/kinesis-asl-assembly/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion connector/kinesis-asl/pom.xml
@@ -20,7 +20,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion connector/protobuf/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion connector/spark-ganglia-lgpl/pom.xml
@@ -20,7 +20,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion core/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.12</artifactId>
-    <version>3.4.0</version>
+    <version>3.4.1</version>
     <relativePath>../pom.xml</relativePath>
   </parent>

2 changes: 1 addition & 1 deletion core/src/main/resources/error/error-classes.json
@@ -4790,7 +4790,7 @@
   },
   "_LEGACY_ERROR_TEMP_2249" : {
     "message" : [
-      "Cannot broadcast the table that is larger than <maxBroadcastTableBytes>GB: <dataSize> GB."
+      "Cannot broadcast the table that is larger than <maxBroadcastTableBytes>: <dataSize>."
     ]
   },
   "_LEGACY_ERROR_TEMP_2250" : {
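The revised template above implies callers now substitute already-formatted, human-readable sizes instead of raw numbers with a hardcoded GB suffix. A generic illustration of such formatting (a hypothetical helper, not Spark's actual code):

public class BytesToStringDemo {
  private static final String[] UNITS = {"B", "KiB", "MiB", "GiB", "TiB"};

  // Hypothetical formatter: converts a raw byte count into a
  // human-readable size with a unit suffix.
  static String bytesToString(long bytes) {
    double value = bytes;
    int unit = 0;
    while (value >= 1024 && unit < UNITS.length - 1) {
      value /= 1024;
      unit++;
    }
    return String.format("%.1f %s", value, UNITS[unit]);
  }

  public static void main(String[] args) {
    System.out.println(bytesToString(8L << 30)); // "8.0 GiB"
    System.out.println(bytesToString(512L));     // "512.0 B"
  }
}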
core/src/main/resources/org/apache/spark/ui/static/executorspage.js
@@ -45,7 +45,7 @@ function formatStatus(status, type, row) {
   }

   if (status) {
-    if (row.excludedInStages.length == 0) {
+    if (typeof row.excludedInStages === "undefined" || row.excludedInStages.length == 0) {
       return "Active"
     }
     return "Active (Excluded in Stages: [" + row.excludedInStages.join(", ") + "])";