Search before asking
I had searched in the issues and found no similar feature requirement.
Description
Now that the CatalogStore and StreamPark catalog management have been implemented, we need to do two more things:
1. The Flink CLI ships the catalogstore jars into the Flink runtime via yarn.provided.lib.dirs (see the CLI sketch after this list).
2. Rebuild the Catalog object from the stored catalog configuration and perform CRUD operations on the managed databases/tables, as follows:
a. Build a catalog-plugin that bundles the catalogstore, flink-connector-jdbc, and flink-connector-paimon, and make sure SPI can discover the CatalogFactory implementations (see the service-file sketch after this list).
b. Package the catalog-plugin into the streampark/plugin directory.
c. Load the catalog-plugin into a dedicated classloader together with the catalogstore configuration, instantiate the Catalog via reflection, and then implement Catalog CRUD for databases/tables (see the loader sketch below):
public static Catalog createCatalog(String catalogName, Map<String, String> options, ReadableConfig configuration, ClassLoader classLoader)
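For step 1, a minimal CLI sketch, assuming the plugin jars have already been uploaded to a remote directory (all paths and the job jar name are illustrative); yarn.provided.lib.dirs takes a semicolon-separated list of directories:

```shell
# Illustrative only: ship the catalog-plugin jars (pre-uploaded to HDFS)
# to the YARN runtime alongside the regular Flink lib directory.
flink run-application -t yarn-application \
  -Dyarn.provided.lib.dirs="hdfs:///streampark/plugin;hdfs:///flink/lib" \
  my-flink-sql-job.jar
```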
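For step (a), Flink discovers table factories through Java SPI: the catalog-plugin jar needs a META-INF/services/org.apache.flink.table.factories.Factory file listing the bundled CatalogFactory classes. A sketch (the exact factory class names depend on the connector versions that get bundled):

```
org.apache.flink.connector.jdbc.catalog.factory.JdbcCatalogFactory
org.apache.paimon.flink.FlinkCatalogFactory
```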
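For step (c), a minimal sketch; the class name, plugin-path handling, and classloading strategy here are all hypothetical, and only FactoryUtil.createCatalog and its signature come from this issue:

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Map;
import java.util.stream.Stream;

import org.apache.flink.configuration.Configuration;
import org.apache.flink.table.catalog.Catalog;
import org.apache.flink.table.factories.FactoryUtil;

/** Hypothetical helper, not the actual StreamPark implementation. */
public final class CatalogPluginLoader {

  /** Rebuilds a Catalog from its stored catalogstore options. */
  public static Catalog createCatalog(String catalogName,
                                      Map<String, String> options,
                                      Path pluginDir) throws Exception {
    // Collect every jar under the plugin directory (e.g. streampark/plugin).
    URL[] jars;
    try (Stream<Path> files = Files.list(pluginDir)) {
      jars = files.filter(p -> p.toString().endsWith(".jar"))
                  .map(CatalogPluginLoader::toUrl)
                  .toArray(URL[]::new);
    }

    // A plain parent-first URLClassLoader is used for brevity; a child-first
    // loader may be preferable to isolate connector dependencies.
    ClassLoader pluginClassLoader =
        new URLClassLoader(jars, CatalogPluginLoader.class.getClassLoader());

    // FactoryUtil scans the META-INF/services entries on the given
    // classloader, matches the options against a CatalogFactory, and
    // instantiates the Catalog -- the signature quoted in step (c) above.
    return FactoryUtil.createCatalog(
        catalogName, options, new Configuration(), pluginClassLoader);
  }

  private static URL toUrl(Path path) {
    try {
      return path.toUri().toURL();
    } catch (java.net.MalformedURLException e) {
      throw new IllegalStateException("Bad plugin jar path: " + path, e);
    }
  }

  private CatalogPluginLoader() {}
}
```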
Usage Scenario
When the Flink SQL job starts, it lazily loads the catalog and parses the SQL against it.
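A usage sketch of the lazy-load path, assuming the hypothetical CatalogPluginLoader above and a JDBC catalog configured with flink-connector-jdbc's catalog options (all option values are illustrative):

```java
import java.nio.file.Paths;
import java.util.Map;

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.Catalog;

public class LazyCatalogExample {
  public static void main(String[] args) throws Exception {
    TableEnvironment tableEnv =
        TableEnvironment.create(EnvironmentSettings.inStreamingMode());

    // Options as they would come back from the catalogstore; values illustrative.
    Map<String, String> options = Map.of(
        "type", "jdbc",                          // matched to a CatalogFactory
        "base-url", "jdbc:mysql://db-host:3306",
        "default-database", "demo",
        "username", "user",
        "password", "secret");

    // Rebuild the catalog only now, when the SQL job starts (lazy loading).
    Catalog catalog = CatalogPluginLoader.createCatalog(
        "my_jdbc_catalog", options, Paths.get("streampark/plugin"));

    // Register it so parsing can resolve databases/tables from the catalog.
    tableEnv.registerCatalog("my_jdbc_catalog", catalog);
    tableEnv.useCatalog("my_jdbc_catalog");

    tableEnv.executeSql("SELECT * FROM some_table").print();
  }
}
```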
Related issues
#3912
Are you willing to submit a PR?
Code of Conduct