Join us for this interactive hands-on workshop, where together we will ingest data in near-real-time from different sources into Snowflake using Upsolver.
Our wizard-based UI helps you set up connections and configure your ingestion job in just a few clicks. Launch the job and your pipeline is deployed, with data beginning to appear in your Snowflake schema in less than 10 minutes. You will be able to watch the data flow and identify quality issues within seconds of launching the ingestion job.
In the workshop we will cover the following data quality topics: volume spikes and dips, stale and new fields, missing values in key columns, and illegal characters in field names.
We will also preview and monitor our ingestion job, including pipeline health, cluster utilization, data volume scanned and written, and data delays or errors.
Lastly, we will set data quality “expectations” using SQL and demonstrate how ingestion jobs automatically quarantine bad records.
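To give a feel for what such an expectation looks like, here is a minimal sketch in Upsolver's SQL dialect. The job, connection, schema, and column names (load_orders, my_s3_connection, orderid, and so on) are hypothetical, and the exact clause placement is an assumption based on our reading of the docs; consult Upsolver's CREATE JOB reference for the authoritative syntax.

```sql
-- Hypothetical ingestion job with a data quality expectation.
-- Rows where orderid is NULL violate the expectation; ON VIOLATION DROP
-- keeps those rows out of the Snowflake target instead of loading them.
CREATE SYNC JOB load_orders
AS COPY FROM S3 my_s3_connection
   LOCATION = 's3://my-bucket/orders/'  -- illustrative path
WITH EXPECTATION exp_orderid_not_null
     EXPECT orderid IS NOT NULL
     ON VIOLATION DROP
INTO SNOWFLAKE my_snowflake_connection.RAW.ORDERS;
```

The expectation is an ordinary SQL predicate, so the same pattern extends to range checks or pattern matches on any ingested column.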
There will be time for Q&A at the end of the workshop, as well as a sneak peek at future workshop sessions.
Since the workshop is hands-on, you will need access to a Snowflake account. Your three choices are: