[#3515] feat(flink-connector): Support flink iceberg catalog #5914
Conversation
@FANNG1 @coolderli PTAL

Cool! I'll review this PR, but may need some time. :)

ok, thanks
The Apache Gravitino Flink connector offers the capability to read and write Iceberg tables, with the metadata managed by the Gravitino server. To enable the use of the Iceberg catalog within the Flink connector, you must download the Iceberg Flink runtime JAR to the Flink classpath.
Suggested change:
The Apache Gravitino Flink connector can be used to read and write Iceberg tables, with the metadata managed by the Gravitino server. To enable the Flink connector, you must download the Iceberg Flink runtime JAR and place it in the Flink classpath.
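For illustration, a minimal sketch of using such a catalog from the Flink Table API once the JARs are in place; the catalog name `iceberg_catalog` is hypothetical (a catalog registered through Gravitino), not something defined in this PR:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class IcebergCatalogSmokeTest {
  public static void main(String[] args) {
    // Assumes the Gravitino Flink connector JAR and the Iceberg Flink
    // runtime JAR are both on the Flink classpath.
    TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inBatchMode());

    // "iceberg_catalog" is a hypothetical Gravitino-managed catalog name.
    tEnv.executeSql("USE CATALOG iceberg_catalog");
    tEnv.executeSql("SHOW DATABASES").print();
  }
}
```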
- `INSERT INTO & OVERWRITE`
- `SELECT`

#### Not supported operations:
Suggested change:
#### Operations Not Supported:
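To make the supported operations listed above concrete, a hedged sketch; catalog, database, and table names are hypothetical, and batch mode is used because Flink only allows `INSERT OVERWRITE` in batch jobs:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class IcebergDmlSketch {
  public static void main(String[] args) throws Exception {
    // Batch mode: INSERT OVERWRITE is not allowed in streaming jobs.
    TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inBatchMode());
    tEnv.executeSql("USE CATALOG iceberg_catalog");

    // Supported: INSERT INTO & OVERWRITE, SELECT. await() sequences the jobs.
    tEnv.executeSql("INSERT INTO sample_db.sample_table VALUES (1, 'a')").await();
    tEnv.executeSql("INSERT OVERWRITE sample_db.sample_table VALUES (2, 'b')").await();
    tEnv.executeSql("SELECT * FROM sample_db.sample_table").print();
  }
}
```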
## Catalog properties

Gravitino Flink connector will transform the following property names defined in catalog properties to Flink Iceberg connector configuration.
Suggested change:
The Gravitino Flink connector transforms the following properties in a catalog to Flink connector configuration.
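As an illustration of what that translation looks like, a hedged sketch: the `catalog-backend` to `catalog-type` renaming is the one visible in this PR, while treating `uri` and `warehouse` as pass-through keys (and all the values) are assumptions:

```java
import com.google.common.collect.ImmutableMap;
import java.util.Map;

public class PropertyTranslationSketch {
  public static void main(String[] args) {
    // Hypothetical Gravitino catalog properties for a Hive-backed Iceberg catalog.
    Map<String, String> gravitinoProps =
        ImmutableMap.of(
            "catalog-backend", "hive",
            "uri", "thrift://hive-metastore:9083",
            "warehouse", "hdfs://namenode:8020/user/hive/warehouse");

    // What the Flink Iceberg connector would receive: the backend key is
    // renamed to "catalog-type"; the other keys pass through unchanged.
    Map<String, String> flinkProps =
        ImmutableMap.of(
            "catalog-type", "hive",
            "uri", "thrift://hive-metastore:9083",
            "warehouse", "hdfs://namenode:8020/user/hive/warehouse");

    System.out.println(gravitinoProps);
    System.out.println(flinkProps);
  }
}
```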
### S3

You need to add s3 secret to the Flink configuration using `s3.access-key-id` and `s3.secret-access-key`. Additionally, download the [Iceberg AWS bundle](https://mvnrepository.com/artifact/org.apache.iceberg/iceberg-aws-bundle) and place it in the classpath of Flink.
Suggested change:
You need to add an S3 secret to the Flink configuration using `s3.access-key-id` and `s3.secret-access-key`. Additionally, you need to download the [Iceberg AWS bundle](https://mvnrepository.com/artifact/org.apache.iceberg/iceberg-aws-bundle) and place it in the Flink classpath.
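A hedged sketch of wiring those two keys into a Flink job's configuration; the key names come from the text above, the values are placeholders, and `withConfiguration` assumes Flink 1.15+:

```java
import org.apache.flink.configuration.Configuration;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class S3SecretSketch {
  public static void main(String[] args) {
    // Key names from the docs above; values are placeholders.
    Configuration conf = new Configuration();
    conf.setString("s3.access-key-id", "<access-key-id>");
    conf.setString("s3.secret-access-key", "<secret-access-key>");

    // Assumes the iceberg-aws-bundle JAR is already on the Flink classpath.
    TableEnvironment tEnv =
        TableEnvironment.create(
            EnvironmentSettings.newInstance().withConfiguration(conf).build());
  }
}
```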
### OSS

You need to add OSS secret key to the Flink configuration using `client.access-key-id` and `client.access-key-secret`. Additionally, download the [Aliyun OSS SDK](https://gosspublic.alicdn.com/sdks/java/aliyun_java_sdk_3.10.2.zip) and copy `aliyun-sdk-oss-3.10.2.jar`, `hamcrest-core-1.1.jar`, `jdom2-2.0.6.jar` in the classpath of Flink.
Suggested change:
You need to add an OSS secret key to the Flink configuration using `client.access-key-id` and `client.access-key-secret`. Additionally, you need to download the [Aliyun OSS SDK](https://gosspublic.alicdn.com/sdks/java/aliyun_java_sdk_3.10.2.zip) and copy `aliyun-sdk-oss-3.10.2.jar`, `hamcrest-core-1.1.jar`, and `jdom2-2.0.6.jar` to the Flink classpath.
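The OSS keys follow the same pattern as the S3 sketch above; a short hedged fragment with placeholder values:

```java
import org.apache.flink.configuration.Configuration;

public class OssSecretSketch {
  public static void main(String[] args) {
    // Key names from the docs above; values are placeholders.
    Configuration conf = new Configuration();
    conf.setString("client.access-key-id", "<access-key-id>");
    conf.setString("client.access-key-secret", "<access-key-secret>");
  }
}
```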
### GCS

No extra configuration is needed. Please make sure the credential file is accessible by Flink, like using `export GOOGLE_APPLICATION_CREDENTIALS=/xx/application_default_credentials.json`, and download [Iceberg GCP bundle](https://mvnrepository.com/artifact/org.apache.iceberg/iceberg-gcp-bundle) and place it to the classpath of Flink.
Suggested change:
No extra configuration is needed. Please make sure the credential file is accessible by Flink. For example, `export GOOGLE_APPLICATION_CREDENTIALS=/xx/application_default_credentials.json`. You need to download the [Iceberg GCP bundle](https://mvnrepository.com/artifact/org.apache.iceberg/iceberg-gcp-bundle) and place it in the Flink classpath.
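Since GCS relies on Application Default Credentials rather than Flink configuration keys, here is a hedged sketch that only verifies the credential file is visible to the Flink process:

```java
import java.io.File;

public class GcsCredentialCheck {
  public static void main(String[] args) {
    // Set beforehand, e.g.:
    // export GOOGLE_APPLICATION_CREDENTIALS=/xx/application_default_credentials.json
    String path = System.getenv("GOOGLE_APPLICATION_CREDENTIALS");
    if (path == null || !new File(path).canRead()) {
      throw new IllegalStateException(
          "GOOGLE_APPLICATION_CREDENTIALS is not set or not readable by Flink");
    }
  }
}
```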
FactoryUtils.createCatalogFactoryHelper(this, context);
return new GravitinoIcebergCatalog(
    context.getName(),
    helper.getOptions().get(GravitinoIcebergCatalogFactoryOptions.DEFAULT_DATABASE),
Do we have to use such a long name? Maybe `CatalogFactoryOptions` or even `FactoryOptions` is enough in such a well-defined package hierarchy?
*/
@Override
public String gravitinoCatalogProvider() {
  return "lakehouse-iceberg";
Is there a constant for this string?
There is no constant; this value comes from the catalog provider, and there is no unified definition of it as a constant.
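If a shared constant were wanted, it could look like the hypothetical sketch below; neither the class nor the field exists in this PR:

```java
// Hypothetical holder for catalog provider short names; not part of this PR.
public final class CatalogProviders {
  public static final String LAKEHOUSE_ICEBERG = "lakehouse-iceberg";

  private CatalogProviders() {}
}
```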
String GRAVITINO_ICEBERG_CATALOG_BACKEND_HIVE = "hive";

@VisibleForTesting String ICEBERG_CATALOG_BACKEND_REST = CatalogUtil.ICEBERG_CATALOG_TYPE_REST;
Most, if not all, fields in this interface are constants, right?
Yes, they are all constants.
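For reference, fields declared in a Java interface are implicitly `public static final`, so declarations like the ones above are compile-time constants even without modifiers; the interface name below is hypothetical:

```java
// Every field in a Java interface is implicitly public static final.
interface IcebergConstantsExample {
  String GRAVITINO_ICEBERG_CATALOG_BACKEND_HIVE = "hive";
}
```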
all.put(
    CommonCatalogOptions.CATALOG_TYPE.key(), GravitinoIcebergCatalogFactoryOptions.IDENTIFIER);
// Map "catalog-backend" to "catalog-type".
String catalogBackend = all.remove(IcebergConstants.CATALOG_BACKEND);
We can maintain a mapping list.
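A hedged sketch of the mapping list this comment suggests, replacing the one-off remove/put with a declarative table; only the `catalog-backend` to `catalog-type` entry is confirmed by this PR, and the class and method names are hypothetical:

```java
import com.google.common.collect.ImmutableMap;
import java.util.Map;

public class PropertyKeyMapping {
  // Gravitino property key -> Flink Iceberg connector key.
  // Add further renamings here as they are needed.
  private static final Map<String, String> KEY_MAPPING =
      ImmutableMap.of("catalog-backend", "catalog-type");

  // Renames mapped keys and passes all other keys through unchanged.
  static Map<String, String> translate(Map<String, String> gravitinoProps) {
    ImmutableMap.Builder<String, String> result = ImmutableMap.builder();
    gravitinoProps.forEach(
        (key, value) -> result.put(KEY_MAPPING.getOrDefault(key, key), value));
    return result.build();
  }
}
```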
What changes were proposed in this pull request?
Support Flink Iceberg catalog.
Why are the changes needed?
Fix: #3515
Does this PR introduce any user-facing change?
No.
How was this patch tested?
FlinkIcebergCatalogIT
FlinkIcebergHiveCatalogIT