Updated integration tests account setup scripts (#1445)
sfc-gh-astus authored Aug 20, 2024
1 parent db8d9f1 commit 903b2b1
Showing 18 changed files with 176 additions and 134 deletions.
1 change: 1 addition & 0 deletions .github/workflows/test_fork.yaml
@@ -66,6 +66,7 @@ jobs:
SNOWFLAKE_CONNECTIONS_INTEGRATION_AUTHENTICATOR: SNOWFLAKE_JWT
SNOWFLAKE_CONNECTIONS_INTEGRATION_USER: ${{ secrets.SNOWFLAKE_USER }}
SNOWFLAKE_CONNECTIONS_INTEGRATION_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
SNOWFLAKE_CONNECTIONS_INTEGRATION_DATABASE: ${{ secrets.SNOWFLAKE_DATABASE }}
SNOWFLAKE_CONNECTIONS_INTEGRATION_PRIVATE_KEY_PATH: ${{ env.PRIVATE_KEY_PATH }}
run: python -m hatch run ${{ inputs.hatch-run }}

1 change: 1 addition & 0 deletions .github/workflows/test_trusted.yaml
@@ -66,5 +66,6 @@ jobs:
SNOWFLAKE_CONNECTIONS_INTEGRATION_AUTHENTICATOR: SNOWFLAKE_JWT
SNOWFLAKE_CONNECTIONS_INTEGRATION_USER: ${{ secrets.SNOWFLAKE_USER }}
SNOWFLAKE_CONNECTIONS_INTEGRATION_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
SNOWFLAKE_CONNECTIONS_INTEGRATION_DATABASE: ${{ secrets.SNOWFLAKE_DATABASE }}
SNOWFLAKE_CONNECTIONS_INTEGRATION_PRIVATE_KEY_PATH: ${{ env.PRIVATE_KEY_PATH }}
run: python -m hatch run ${{ inputs.hatch-run }}
49 changes: 27 additions & 22 deletions CONTRIBUTING.md
@@ -69,40 +69,45 @@ or by running `pytest` inside activated environment.
Every integration test should have the `integration` mark. By default, integration tests are not executed when running `pytest`.

To execute only integration tests, run `hatch run integration:test` or `pytest -m integration` inside the environment.
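For example, either of the following runs only the tests carrying the `integration` mark:

```bash
# Run the integration suite via hatch
hatch run integration:test

# Or run it directly with pytest from an activated environment
pytest -m integration
```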
### Connection parameters in `config.toml`

Add the following connection to your `config.toml`
### User setup

```toml
[connections.integration]
host = <host>
account = <account_name>
user = <user>
password = <password>
```
Integration tests require environment variables to be set up. Parameters must use the following format:

### Connection parameters in environment parameters
``SNOWFLAKE_CONNECTIONS_INTEGRATION_<key>=<value>``

Parameters must use the following format:
where ``<key>`` is the name of the key. The following environment variables are required:

``SNOWFLAKE_CONNECTIONS_INTEGRATION_<key>=<value>``
- `SNOWFLAKE_CONNECTIONS_INTEGRATION_HOST`
- `SNOWFLAKE_CONNECTIONS_INTEGRATION_ACCOUNT`
- `SNOWFLAKE_CONNECTIONS_INTEGRATION_USER`
- `SNOWFLAKE_CONNECTIONS_INTEGRATION_PASSWORD` or `SNOWFLAKE_CONNECTIONS_INTEGRATION_PRIVATE_KEY_PATH` (if using private key authentication, also set `SNOWFLAKE_CONNECTIONS_INTEGRATION_AUTHENTICATOR=SNOWFLAKE_JWT`)
- `SNOWFLAKE_CONNECTIONS_INTEGRATION_ROLE`
- `SNOWFLAKE_CONNECTIONS_INTEGRATION_DATABASE`
- `SNOWFLAKE_CONNECTIONS_INTEGRATION_WAREHOUSE`
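For local runs, these can be exported in the shell before invoking the tests. Below is a minimal sketch with placeholder values; substitute your own credentials. The role, database, and warehouse names shown are only illustrative defaults that appear elsewhere in this change:

```bash
# Placeholder values only; replace with your own account details.
export SNOWFLAKE_CONNECTIONS_INTEGRATION_HOST="<host>"
export SNOWFLAKE_CONNECTIONS_INTEGRATION_ACCOUNT="my-account"
export SNOWFLAKE_CONNECTIONS_INTEGRATION_USER="SNOWCLI_TEST"
export SNOWFLAKE_CONNECTIONS_INTEGRATION_PASSWORD="<password>"
export SNOWFLAKE_CONNECTIONS_INTEGRATION_ROLE="INTEGRATION_TESTS"
export SNOWFLAKE_CONNECTIONS_INTEGRATION_DATABASE="SNOWCLI_DB"
export SNOWFLAKE_CONNECTIONS_INTEGRATION_WAREHOUSE="XSMALL"
```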

where ``<key>`` is the name of the key
### Integration account setup script

For example: SNOWFLAKE_CONNECTIONS_INTEGRATION_ACCOUNT="my-account"
To set up an account for integration tests, run the following script with the `ACCOUNTADMIN` role:

List of required parameter keys:
- host
- account
- user
- password
```bash
snow sql \
-f tests_integration/scripts/integration_account_setup.sql \
-D "user=${SNOWFLAKE_CONNECTIONS_INTEGRATION_USER}" \
-D "role=${SNOWFLAKE_CONNECTIONS_INTEGRATION_ROLE}" \
-D "warehouse=${SNOWFLAKE_CONNECTIONS_INTEGRATION_WAREHOUSE}" \
-D "main_database=${SNOWFLAKE_CONNECTIONS_INTEGRATION_DATABASE}"\
-c <your_connection_name>
```

### User setup
Note: Before running the script, set up your environment variables.

### Build and push Docker images

Run the script with ACCOUNTADMIN role
To build and push all required Docker images, run the following script:

```bash
tests_integration/scripts/integration_account_setup.sql
./tests_integration/spcs/docker/build_and_push_all.sh
```

## Remote debugging with PyCharm or IntelliJ
4 changes: 2 additions & 2 deletions scripts/cleanup.py
@@ -50,7 +50,7 @@ def remove_resources(single: str, plural: str, known_instances: t.List[str], rol


if __name__ == "__main__":
role = "INTEGRATION_TESTS"
role = os.getenv("SNOWFLAKE_CONNECTIONS_INTEGRATION_ROLE", "INTEGRATION_TESTS")
config = {
"authenticator": "SNOWFLAKE_JWT",
"account": os.getenv("SNOWFLAKE_CONNECTIONS_INTEGRATION_ACCOUNT"),
@@ -64,7 +64,7 @@ def remove_resources(single: str, plural: str, known_instances: t.List[str], rol
update_connection_details_with_private_key(config)
session = Session.builder.configs(config).create()

session.use_role("INTEGRATION_TESTS")
session.use_role(role)

known_objects: t.Dict[t.Tuple[str, str], t.List[str]] = {
("database", "databases"): [
1 change: 0 additions & 1 deletion tests_integration/config/connection_configs.toml
@@ -16,4 +16,3 @@
[connections.integration]
authenticator = "SNOWFLAKE_JWT"
schema = "public"
role = "INTEGRATION_TESTS"
1 change: 0 additions & 1 deletion tests_integration/config/world_readable.toml
@@ -18,4 +18,3 @@
[connections.default]
[connections.integration]
schema = "public"
role = "INTEGRATION_TESTS"
149 changes: 79 additions & 70 deletions tests_integration/scripts/integration_account_setup.sql
@@ -13,98 +13,107 @@
See the License for the specific language governing permissions and
limitations under the License.
*/

SET INT_TEST_USER = 'SNOWCLI_TEST';
CREATE USER IF NOT EXISTS IDENTIFIER($INT_TEST_USER);
CREATE USER IF NOT EXISTS IDENTIFIER('&{ user }');

-- BASE SETUP
CREATE ROLE IF NOT EXISTS INTEGRATION_TESTS;
GRANT CREATE ROLE ON ACCOUNT TO ROLE INTEGRATION_TESTS;
GRANT CREATE DATABASE ON ACCOUNT TO ROLE INTEGRATION_TESTS;
GRANT CREATE COMPUTE POOL ON ACCOUNT TO ROLE INTEGRATION_TESTS;
GRANT BIND SERVICE ENDPOINT ON ACCOUNT TO ROLE INTEGRATION_TESTS;
GRANT CREATE APPLICATION PACKAGE ON ACCOUNT TO ROLE INTEGRATION_TESTS;
GRANT CREATE APPLICATION ON ACCOUNT TO ROLE INTEGRATION_TESTS;
GRANT CREATE DATABASE ON ACCOUNT TO ROLE INTEGRATION_TESTS WITH GRANT OPTION;
GRANT CREATE WAREHOUSE ON ACCOUNT TO ROLE INTEGRATION_TESTS;
GRANT ROLE INTEGRATION_TESTS TO USER IDENTIFIER($INT_TEST_USER);
CREATE ROLE IF NOT EXISTS &{ role };
GRANT CREATE ROLE ON ACCOUNT TO ROLE &{ role };
GRANT CREATE DATABASE ON ACCOUNT TO ROLE &{ role };
GRANT CREATE COMPUTE POOL ON ACCOUNT TO ROLE &{ role };
GRANT BIND SERVICE ENDPOINT ON ACCOUNT TO ROLE &{ role };
GRANT CREATE APPLICATION PACKAGE ON ACCOUNT TO ROLE &{ role };
GRANT CREATE APPLICATION ON ACCOUNT TO ROLE &{ role };
GRANT CREATE DATABASE ON ACCOUNT TO ROLE &{ role } WITH GRANT OPTION;
GRANT CREATE WAREHOUSE ON ACCOUNT TO ROLE &{ role };
GRANT ROLE &{ role } TO USER IDENTIFIER('&{ user }');

-- WAREHOUSE SETUP
CREATE WAREHOUSE IF NOT EXISTS XSMALL WAREHOUSE_SIZE=XSMALL;
GRANT ALL ON WAREHOUSE XSMALL TO ROLE INTEGRATION_TESTS;
CREATE WAREHOUSE IF NOT EXISTS &{ warehouse } WAREHOUSE_SIZE=XSMALL;
GRANT ALL ON WAREHOUSE &{ warehouse } TO ROLE &{ role };

-- DATABASES SETUP
CREATE DATABASE IF NOT EXISTS SNOWCLI_DB;
GRANT ALL ON DATABASE SNOWCLI_DB TO ROLE INTEGRATION_TESTS;
GRANT ALL ON SCHEMA SNOWCLI_DB.PUBLIC TO ROLE INTEGRATION_TESTS;
-- MAIN DATABASES SETUP
CREATE DATABASE IF NOT EXISTS &{ main_database };
GRANT ALL ON DATABASE &{ main_database } TO ROLE &{ role };
GRANT ALL ON SCHEMA &{ main_database }.PUBLIC TO ROLE &{ role };
USE DATABASE &{ main_database };

-- STAGES SETUP
CREATE STAGE IF NOT EXISTS SNOWCLI_DB.PUBLIC.SNOWCLI_STAGE DIRECTORY = ( ENABLE = TRUE );
CREATE STAGE IF NOT EXISTS &{ main_database }.PUBLIC.SNOWCLI_STAGE DIRECTORY = ( ENABLE = TRUE );

-- CONTAINERS SETUP
CREATE OR REPLACE IMAGE REPOSITORY SNOWCLI_DB.PUBLIC.SNOWCLI_REPOSITORY;
GRANT READ, WRITE ON IMAGE REPOSITORY SNOWCLI_DB.PUBLIC.SNOWCLI_REPOSITORY TO ROLE INTEGRATION_TESTS;
CREATE IMAGE REPOSITORY IF NOT EXISTS &{ main_database }.PUBLIC.SNOWCLI_REPOSITORY;
GRANT READ, WRITE ON IMAGE REPOSITORY &{ main_database }.PUBLIC.SNOWCLI_REPOSITORY TO ROLE &{ role };

CREATE COMPUTE POOL IF NOT EXISTS SNOWCLI_COMPUTE_POOL
MIN_NODES = 1
MAX_NODES = 1
INSTANCE_FAMILY = CPU_X64_XS;
CREATE COMPUTE POOL IF NOT EXISTS snowcli_compute_pool
MIN_NODES = 1
MAX_NODES = 1
INSTANCE_FAMILY = CPU_X64_XS;

GRANT USAGE ON COMPUTE POOL SNOWCLI_COMPUTE_POOL TO ROLE INTEGRATION_TESTS;
GRANT MONITOR ON COMPUTE POOL SNOWCLI_COMPUTE_POOL TO ROLE INTEGRATION_TESTS;
GRANT USAGE ON COMPUTE POOL snowcli_compute_pool TO ROLE &{ role };
GRANT MONITOR ON COMPUTE POOL snowcli_compute_pool TO ROLE &{ role };

ALTER COMPUTE POOL SNOWCLI_COMPUTE_POOL SUSPEND;
ALTER COMPUTE POOL snowcli_compute_pool SUSPEND;

-- EXTERNAL ACCESS INTEGRATION
CREATE OR REPLACE NETWORK RULE snowflake_docs_network_rule
CREATE NETWORK RULE IF NOT EXISTS snowflake_docs_network_rule
MODE = EGRESS
TYPE = HOST_PORT
VALUE_LIST = ('docs.snowflake.com');

CREATE OR REPLACE SECRET test_secret
CREATE SECRET IF NOT EXISTS test_secret
TYPE = GENERIC_STRING
-- SECRET_STRING = ''; -- provide password
GRANT READ ON SECRET test_secret TO ROLE integration_tests;
SECRET_STRING = 'test'; -- provide password
GRANT READ ON SECRET test_secret TO ROLE &{ role };

CREATE EXTERNAL ACCESS INTEGRATION IF NOT EXISTS snowflake_docs_access_integration
ALLOWED_NETWORK_RULES = (snowflake_docs_network_rule)
ALLOWED_AUTHENTICATION_SECRETS = (test_secret)
ENABLED = true;
GRANT USAGE ON INTEGRATION snowflake_docs_access_integration TO ROLE &{ role };

CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION snowflake_docs_access_integration
CREATE EXTERNAL ACCESS INTEGRATION IF NOT EXISTS cli_test_integration
ALLOWED_NETWORK_RULES = (snowflake_docs_network_rule)
ALLOWED_AUTHENTICATION_SECRETS = (test_secret)
ENABLED = true;
GRANT USAGE ON INTEGRATION snowflake_docs_access_integration TO ROLE integration_tests;
GRANT USAGE ON INTEGRATION cli_test_integration TO ROLE &{ role };

-- API INTEGRATION FOR SNOWGIT
CREATE API INTEGRATION snowcli_testing_repo_api_integration
API_PROVIDER = git_https_api
API_ALLOWED_PREFIXES = ('https://github.com/snowflakedb/')
ALLOWED_AUTHENTICATION_SECRETS = ()
ENABLED = true;
GRANT USAGE ON INTEGRATION snowcli_testing_repo_api_integration TO ROLE INTEGRATION_TESTS;

-- Notebooks setup
CREATE DATABASE NOTEBOOK;

-- CORTEX SEARCH SETUP UNCOMMENT THIS WHEN ENABLING CORTEX INTEGRATION TESTS
-- CREATE TABLE transcripts (
-- transcript_text VARCHAR,
-- region VARCHAR,
-- agent_id VARCHAR
-- );
--
-- INSERT INTO transcripts VALUES('Ah, I see you have the machine that goes "ping!". This is my favourite.', 'Meaning of Life', '01'),
-- ('First shalt thou take out the Holy Pin. Then shalt thou count to three, no more, no less.', 'Holy Grail', '02'),
-- ('And the beast shall be huge and black, and the eyes thereof red with the blood of living creatures', 'Life of Brian', '03'),
-- ('This parrot is no more! It has ceased to be! It`s expired and gone to meet its maker!', 'Flying Circus', '04');
--
-- CREATE OR REPLACE CORTEX SEARCH SERVICE test_service
-- ON transcript_text
-- ATTRIBUTES region
-- WAREHOUSE = mywh
-- TARGET_LAG = '1 day'
-- AS (
-- SELECT
-- transcript_text,
-- region,
-- agent_id
-- FROM support_transcripts
-- );
-- END OF CORTEX SETUP - THIS LINE CAN BE DELETED AFTER UNCOMMENTING ABOVE CODE WHEN ENABLING CORTEX TESTS
CREATE API INTEGRATION IF NOT EXISTS snowcli_testing_repo_api_integration
API_PROVIDER = git_https_api
API_ALLOWED_PREFIXES = ('https://github.com/snowflakedb/')
ALLOWED_AUTHENTICATION_SECRETS = ()
ENABLED = true;
GRANT USAGE ON INTEGRATION snowcli_testing_repo_api_integration TO ROLE &{ role };

-- NOTEBOOKS SETUP
CREATE DATABASE IF NOT EXISTS NOTEBOOK;

-- CORTEX SEARCH SETUP
CREATE TABLE IF NOT EXISTS transcripts (
transcript_text VARCHAR,
region VARCHAR,
agent_id VARCHAR
);

-- INSERT IF NOT EXISTS
MERGE INTO transcripts AS t USING (
VALUES('Ah, I see you have the machine that goes "ping!". This is my favourite.', 'Meaning of Life', '01'),
('First shalt thou take out the Holy Pin. Then shalt thou count to three, no more, no less.', 'Holy Grail', '02'),
('And the beast shall be huge and black, and the eyes thereof red with the blood of living creatures', 'Life of Brian', '03'),
('This parrot is no more! It has ceased to be! It`s expired and gone to meet its maker!', 'Flying Circus', '04')
) AS s (c1, c2, c3) ON t.agent_id = s.c3
WHEN NOT MATCHED THEN
INSERT (transcript_text, region, agent_id) VALUES (s.c1, s.c2, s.c3);

CREATE CORTEX SEARCH SERVICE IF NOT EXISTS test_service
ON transcript_text
ATTRIBUTES region
WAREHOUSE = &{ warehouse }
TARGET_LAG = '1 day'
AS (
SELECT
transcript_text,
region,
agent_id
FROM transcripts
);
6 changes: 6 additions & 0 deletions tests_integration/spcs/docker/build_and_push_all.sh
@@ -0,0 +1,6 @@
SCRIPT_DIR=$(dirname "$0")

cd "$SCRIPT_DIR/echo_service"
source "build_and_push.sh"
cd "../test_counter"
source "build_and_push.sh"

This file was deleted.

8 changes: 8 additions & 0 deletions tests_integration/spcs/docker/echo_service/build_and_push.sh
@@ -0,0 +1,8 @@
set -e
export SF_REGISTRY="$(snow spcs image-registry url -c integration)"
DATABASE=$(echo "${SNOWFLAKE_CONNECTIONS_INTEGRATION_DATABASE}" | tr '[:upper:]' '[:lower:]')

echo "Using registry: ${SF_REGISTRY}"
docker build --platform linux/amd64 -t "${SF_REGISTRY}/${DATABASE}/public/snowcli_repository/snowpark_test_echo:1" .
snow spcs image-registry token --format=json -c integration | docker login "${SF_REGISTRY}/${DATABASE}/public/snowcli_repository" -u 0sessiontoken --password-stdin
docker push "${SF_REGISTRY}/${DATABASE}/public/snowcli_repository/snowpark_test_echo:1"
8 changes: 8 additions & 0 deletions tests_integration/spcs/docker/test_counter/build_and_push.sh
@@ -0,0 +1,8 @@
set -e
export SF_REGISTRY="$(snow spcs image-registry url -c integration)"
DATABASE=$(echo "${SNOWFLAKE_CONNECTIONS_INTEGRATION_DATABASE}" | tr '[:upper:]' '[:lower:]')

echo "Using registry: ${SF_REGISTRY}"
docker build --platform linux/amd64 -t "${SF_REGISTRY}/${DATABASE}/public/snowcli_repository/test_counter" .
snow spcs image-registry token --format=json -c integration | docker login "${SF_REGISTRY}/${DATABASE}/public/snowcli_repository" -u 0sessiontoken --password-stdin
docker push "${SF_REGISTRY}/${DATABASE}/public/snowcli_repository/test_counter"
6 changes: 0 additions & 6 deletions tests_integration/spcs/docker/test_counter/build_image.sh

This file was deleted.

7 changes: 5 additions & 2 deletions tests_integration/spcs/test_image_repository.py
@@ -11,21 +11,24 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os

import pytest
from snowflake.cli.api.project.util import escape_like_pattern

from tests_integration.test_utils import contains_row_with, row_from_snowflake_session
from tests_integration.testing_utils import ObjectNameProvider

INTEGRATION_DATABASE = "SNOWCLI_DB"
INTEGRATION_DATABASE = os.environ.get(
"SNOWFLAKE_CONNECTIONS_INTEGRATION_DATABASE", "SNOWCLI_DB"
)
INTEGRATION_SCHEMA = "PUBLIC"
INTEGRATION_REPOSITORY = "snowcli_repository"


@pytest.mark.integration
def test_list_images_tags(runner):
# test assumes the testing environment has been set up with /SNOWCLI_DB/PUBLIC/snowcli_repository/snowpark_test_echo:1
# test assumes the testing environment has been set up with /<DATABASE>/PUBLIC/snowcli_repository/snowpark_test_echo:1
_list_images(runner)
_list_tags(runner)

1 change: 0 additions & 1 deletion tests_integration/spcs/test_services.py
@@ -11,7 +11,6 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import uuid
from typing import Tuple
