Remove Try Snowplow (#793)
stanch authored Mar 8, 2024
1 parent 85c058b commit 2637851
Showing 43 changed files with 36 additions and 1,172 deletions.
6 changes: 0 additions & 6 deletions docs/first-steps/custom-events/index.md
@@ -7,10 +7,4 @@ description: "How to define your own events and entities"

While Snowplow provides various [events](/docs/understanding-your-pipeline/events/index.md) and [entities](/docs/understanding-your-pipeline/entities/index.md) out of the box, you can define your own to match your business more closely.

:::note

This is currently not available in Try Snowplow.

:::

The documentation on [self-describing events](/docs/understanding-your-pipeline/events/index.md#self-describing-events) provides a high-level view of how you might create and track one. For more guidance, check out [this tutorial](/docs/recipes/recipe-basic-tracking-design/index.md).
8 changes: 1 addition & 7 deletions docs/first-steps/installing/index.md
@@ -13,12 +13,6 @@ Each offering has its own setup guide:
* [Snowplow BDP Cloud](/docs/getting-started-on-snowplow-bdp-cloud/index.md)
* [Snowplow Community Edition](/docs/getting-started-on-community-edition/index.md)

## Other options

### Try Snowplow

If you want to experiment with Snowplow but don’t feel like installing anything, check out [Try Snowplow](/docs/try-snowplow/index.md) — it’s a free 14-day self-serve trial experience (similar to BDP Cloud). Try Snowplow comes with a Postgres database included, so you can start tracking and analyzing events immediately.

### Snowplow Micro
## Snowplow Micro

While not a full substitute for a real Snowplow pipeline, [Snowplow Micro](/docs/testing-debugging/snowplow-micro/index.md) can be a quick way for more technical users to get a feel for how Snowplow works. Note that Micro does not store data in any warehouse or database, but you can still inspect the available fields.
5 changes: 0 additions & 5 deletions docs/first-steps/modeling/index.md
@@ -22,11 +22,6 @@ Refer to the [setup instructions](/docs/modeling-your-data/running-data-models-v
</TabItem>
<TabItem value="cloud" label="BDP Cloud">

You will need to install dbt and run the models yourself — see the “quick start” links below.

</TabItem>
<TabItem value="try" label="Try Snowplow">

You will need to install dbt and run the models yourself — see the “quick start” links below.

</TabItem>
7 changes: 0 additions & 7 deletions docs/first-steps/querying/index.md
@@ -24,13 +24,6 @@ Use the connection details you provided when setting up BDP Enterprise.

You can find the connection details in the [Console](https://console.snowplowanalytics.com/destinations/catalog), under the destination you’ve selected.

</TabItem>
<TabItem value="try" label="Try Snowplow">

You can find the connection details in the [Try Snowplow UI](https://try.snowplowanalytics.com/access-data): hostname, port, database, username and password (request credentials in the UI if you haven’t done so).

For a step-by-step guide on how to query data in Try Snowplow, see [this tutorial](/docs/recipes/querying-try-data/index.md).

</TabItem>
<TabItem value="community" label="Community Edition">

18 changes: 0 additions & 18 deletions docs/first-steps/tracking/index.md
@@ -15,14 +15,6 @@ import { sampleTrackingCode } from '@site/src/components/FirstSteps/sampleTracki

Once your pipeline is set up, you will want to send some events to it. Here’s an overview of the different options.

:::note Limits

Keep in mind that some of our offerings have limits on the number of events you can send:
* For Try Snowplow, there is a cap of 50 events per second. Any events above this cap will be dropped
* For Community Edition, the default setup is sized for around 100 events per second

:::

:::tip Latency

Regardless of how you send the events, it might take a few minutes for them to reach your destination (e.g. data warehouse). This depends on which [destination and loader](/docs/storing-querying/storage-options/index.md) you have configured.
@@ -43,11 +35,6 @@ You can find the Collector URL (Collector Endpoint) in the [Console](https://con

You can find the Collector URL (Collector Endpoint) in the [Console](https://console.snowplowanalytics.com/environments).

</TabItem>
<TabItem value="try" label="Try Snowplow">

You can find the Collector URL (Collector Endpoint) in the [Console](https://try.snowplowanalytics.com/).

</TabItem>
<TabItem value="community" label="Community Edition">

@@ -96,11 +83,6 @@ BDP Enterprise can automatically generate the snippet for you. Go to the [tag ge

You can find the pre-generated snippet in the [Getting started](https://console.snowplowanalytics.com/environments/start-tracking-events?fromDocs) section.

</TabItem>
<TabItem value="try" label="Try Snowplow">

You can find the pre-generated snippet in the [Console](https://try.snowplowanalytics.com/).

</TabItem>
<TabItem value="community" label="Community Edition">

@@ -6,13 +6,10 @@ sidebar_position: 2
```mdx-code-block
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
import TrySnowplow from "@site/docs/reusable/try-snowplow/_index.md";
```

This guide will take you through how to spin up a Snowplow Community Edition pipeline using the [Snowplow Terraform modules](https://registry.terraform.io/namespaces/snowplow-devops). _(Not familiar with Terraform? Take a look at [Infrastructure as code with Terraform](https://learn.hashicorp.com/tutorials/terraform/infrastructure-as-code?in=terraform/aws-get-started).)_

<TrySnowplow/>

## Prerequisites

[Sign up](https://snowplow.io/pricing/) for Snowplow Community Edition and follow the link in the email to get a copy of the repository containing the Terraform code.
2 changes: 1 addition & 1 deletion docs/getting-started-on-snowplow-bdp-enterprise/index.md
@@ -34,4 +34,4 @@ Snowplow BDP customers can take advantage of a number of additional features inc
- **[Outage protection](https://snowplowanalytics.com/blog/2021/02/11/how-to-protect-your-data-pipeline-against-the-next-cloud-outage/):** Protect against cloud platform outages by leveraging Snowplow Outage Protection, rerouting traffic to data pipelines in “backup” regions to minimize data loss.
- **[Multi-cloud topology](https://snowplowanalytics.com/blog/2020/02/25/why-run-a-multi-cloud-data-pipeline/):** Prevent vendor lock-in and benefit from features across multiple cloud platforms with Snowplow’s multi-cloud support.

If you’d like to learn more about Snowplow BDP you can **[book a demo with our team](https://snowplowanalytics.com/get-started/?utm_content=try-snowplow&utm-medium=related-content&utm_campaign=snowplow-docs)**, or if you’d prefer, you can **[try Snowplow technology for yourself quickly and easily](https://try.snowplowanalytics.com/?utm_content=get-started&utm-medium=related-content&utm_campaign=snowplow-docs)**.
If you’d like to learn more about Snowplow BDP you can **[book a demo with our team](https://snowplowanalytics.com/get-started/?utm-medium=related-content&utm_campaign=snowplow-docs)**.
@@ -10,21 +10,13 @@ Each type of failed event is stored in its own table. You can get a full list of

```sql
SELECT * FROM information_schema.tables
WHERE table_schema = 'badrows';
WHERE table_schema = 'atomic_bad';
```
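The same table-listing idea can be prototyped locally. Below is a minimal Python sketch using `sqlite3`, with invented table names standing in for the warehouse tables; SQLite has no schemas or `information_schema`, so `sqlite_master` plus a name prefix plays that role here:

```python
import sqlite3

# In-memory stand-in for the warehouse. The table names are hypothetical
# examples mimicking the "<schema>.<badrows table>" naming convention.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE "atomic_bad.com_snowplowanalytics_snowplow_badrows_schema_violations_2" (payload);
CREATE TABLE "atomic_bad.com_snowplowanalytics_snowplow_badrows_adapter_failures_1" (payload);
CREATE TABLE "atomic.events" (payload);
""")

# SQLite has no information_schema, so sqlite_master plays its role:
# keep only tables whose name starts with the failed-events prefix.
tables = [
    row[0]
    for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' AND name LIKE 'atomic_bad.%'"
    )
]
print(sorted(tables))
```

This mirrors the `information_schema.tables` query above: filter the catalog by schema, read off the table names.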

:::info Database schema name

The example above uses `badrows` as the database schema name in Postgres. This will depend on how you’ve set up your loader. Typically, it’s `badrows` for [Try Snowplow](/docs/try-snowplow/index.md) and `atomic_bad` for [Community Edition Quick Start](/docs/getting-started-on-community-edition/what-is-quick-start/index.md).

We will use `badrows` throughout the rest of this page — feel free to substitute your own schema name.

:::

For instance, to check the number of [schema violations](/docs/understanding-your-pipeline/failed-events/index.md#schema-violation), you can query the respective table:

```sql
SELECT COUNT(*) FROM badrows.com_snowplowanalytics_snowplow_badrows_schema_violations_2;
SELECT COUNT(*) FROM atomic_bad.com_snowplowanalytics_snowplow_badrows_schema_violations_2;
```

Taking it further, you can check how many failed events you have by [schema](/docs/understanding-your-pipeline/schemas/index.md) and error type:
@@ -34,7 +26,7 @@ SELECT
"failure.messages"->0->'error'->'error' AS error,
"failure.messages"->0->'schemaKey' AS schema,
count(*) AS failed_events
FROM badrows.com_snowplowanalytics_snowplow_badrows_schema_violations_2
FROM atomic_bad.com_snowplowanalytics_snowplow_badrows_schema_violations_2
GROUP BY 1,2
ORDER BY 3 DESC
```
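For readers who prefer to prototype the aggregation outside the warehouse, here is a minimal Python sketch of the same grouping logic over hypothetical failed-event payloads — the error strings and schema keys are invented, but the JSON shape mirrors the `failure.messages` path the SQL drills into:

```python
import json
from collections import Counter

# Hypothetical failed-event rows; each carries a "failure.messages" array
# like the one the SQL above reaches with ->0->'error'->'error'.
rows = [
    '{"failure": {"messages": [{"error": {"error": "ValidationError"}, "schemaKey": "iglu:com.acme/checkout/jsonschema/1-0-0"}]}}',
    '{"failure": {"messages": [{"error": {"error": "ValidationError"}, "schemaKey": "iglu:com.acme/checkout/jsonschema/1-0-0"}]}}',
    '{"failure": {"messages": [{"error": {"error": "ResolutionError"}, "schemaKey": "iglu:com.acme/search/jsonschema/1-0-0"}]}}',
]

# GROUP BY error, schemaKey; ORDER BY count DESC
counts = Counter()
for raw in rows:
    msg = json.loads(raw)["failure"]["messages"][0]
    counts[(msg["error"]["error"], msg["schemaKey"])] += 1

for (error, schema), n in counts.most_common():
    print(error, schema, n)
```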
5 changes: 0 additions & 5 deletions docs/recipes/custom-events-entities/_index.md

This file was deleted.

Binary file not shown.
81 changes: 0 additions & 81 deletions docs/recipes/querying-try-data/index.md

This file was deleted.

2 changes: 1 addition & 1 deletion docs/recipes/recipe-anonymous-tracking/index.md
@@ -18,7 +18,7 @@ By default, Snowplow captures identifiers with all events that can be considered
Snowplow provides two ways for limiting the amount of PII you capture and store:

- Not collecting the PII in the first place (this is covered in this recipe)
- Pseudonymizing the PII during enrichment (on by default in Try Snowplow and BDP Cloud, but configurable with Snowplow BDP)
- Pseudonymizing the PII during enrichment

You will be updating your JavaScript tracker implementation to stop setting and collecting the following four fields:

22 changes: 8 additions & 14 deletions docs/recipes/recipe-content-analytics/index.md
@@ -25,12 +25,6 @@ For this purpose, you can add a content [entity](/docs/understanding-your-pipeli

We have already created a custom `content` entity for you in [Iglu Central](http://iglucentral.com/).

```mdx-code-block
import DataStructuresSharedBlock from "@site/docs/recipes/custom-events-entities/_index.md"
<DataStructuresSharedBlock/>
```

The `content` entity has the following fields:

<table><tbody><tr><td><strong>Field</strong></td><td><strong>Description</strong></td><td><strong>Type</strong></td><td><strong>Validation</strong></td><td><strong>Required?</strong></td></tr><tr><td><code>name</code></td><td>The name of the piece of content</td><td>string</td><td><code>maxLength: 255</code></td><td>✅&nbsp;</td></tr><tr><td>id</td><td>The content identifier</td><td>string</td><td>maxLength: 255</td><td>❌</td></tr><tr><td><code>category</code></td><td>The category of the piece of content</td><td>string</td><td><code>maxLength: 255</code></td><td>❌</td></tr><tr><td><code>date_published</code></td><td>The date the piece of content was published</td><td>string</td><td><code>maxLength: 255</code></td><td>❌</td></tr><tr><td><code>author</code></td><td>The author of the piece of content</td><td>string</td><td><code>maxLength: 255</code></td><td>❌</td></tr></tbody></table>
Expand All @@ -54,7 +48,7 @@ window.snowplow('trackPageView', {
"data": {
"name": "example_name",
"id": "example_id",
"category": "example_category",
"category": "example_category",
"date_published": "01-01-1970",
"author": "example_author"
}
@@ -98,30 +92,30 @@ CREATE TABLE derived.content AS(

SELECT
wp.id AS page_view_id,
c.category AS content_category,
c.name AS content_name,
c.category AS content_category,
c.name AS content_name,
c.date_published AS date_published,
c.author AS author,
10*SUM(CASE WHEN ev.event_name = 'page_ping' THEN 1 ELSE 0 END) AS time_engaged_in_s,
10*SUM(CASE WHEN ev.event_name = 'page_ping' THEN 1 ELSE 0 END) AS time_engaged_in_s,
ROUND(100*(LEAST(LEAST(GREATEST(MAX(COALESCE(ev.pp_yoffset_max, 0)), 0), MAX(ev.doc_height)) + ev.br_viewheight, ev.doc_height)/ev.doc_height::FLOAT)) AS percentage_vertical_scroll_depth

FROM atomic.events AS ev
INNER JOIN atomic.com_snowplowanalytics_snowplow_web_page_1 AS wp
ON ev.event_id = wp.root_id AND ev.collector_tstamp = wp.root_tstamp
INNER JOIN atomic.io_snowplow_foundation_content_1 AS c
ON ev.event_id = c.root_id AND ev.collector_tstamp = c.root_tstamp

GROUP BY 1,2,3,4,5,ev.br_viewheight,ev.doc_height

)

SELECT
content_category,
content_name,
content_category,
content_name,
date_published,
author,
COUNT(DISTINCT page_view_id) AS page_views,
ROUND(SUM(time_engaged_in_s)/COUNT(DISTINCT page_view_id)) AS average_time_engaged_in_s,
ROUND(SUM(time_engaged_in_s)/COUNT(DISTINCT page_view_id)) AS average_time_engaged_in_s,
  ROUND(SUM(percentage_vertical_scroll_depth)/COUNT(DISTINCT page_view_id)) AS average_percentage_vertical_scroll_depth

FROM content_page_views
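The per-page-view arithmetic in the query above can be sketched in plain Python. The sample values are invented, and the factor of 10 assumes the default heartbeat of one page ping every 10 seconds:

```python
# Hypothetical per-page-view sample: number of page_ping events received,
# max vertical scroll offset, viewport height, and document height.
page_views = [
    {"pings": 6, "yoffset_max": 1200, "viewheight": 800, "doc_height": 3000},
    {"pings": 2, "yoffset_max": 0, "viewheight": 800, "doc_height": 2000},
]

results = []
for pv in page_views:
    # 10 * page_ping count, assuming one ping every 10 seconds
    time_engaged_in_s = 10 * pv["pings"]
    # Mirror the SQL: clamp yoffset to [0, doc_height], add the viewport
    # height, cap at doc_height, then express as a percentage of the page.
    depth = min(
        min(max(pv["yoffset_max"], 0), pv["doc_height"]) + pv["viewheight"],
        pv["doc_height"],
    )
    pct_scroll_depth = round(100 * depth / pv["doc_height"])
    results.append((time_engaged_in_s, pct_scroll_depth))

print(results)
```

Each tuple is `(time_engaged_in_s, percentage_vertical_scroll_depth)` for one page view, matching the `GREATEST`/`LEAST` clamping in the SQL.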