


Part 1 of this two-part post demonstrated how to build a Type 2 Slowly Changing Dimension (SCD) using Snowflake's Stream functionality: setting up a stream and inserting data. Now, let's automate the stream and have it run on a schedule. First, you'll update some data and then manually process it. Then, you'll delete data and set up automatic processing.

To start, run the following transaction to update two records:

begin;

update nation
set n_comment = 'New comment for Brazil',
    update_timestamp = current_timestamp()::timestamp_ntz
where n_nationkey = 2;

update nation
set n_comment = 'New comment for Canada',
    update_timestamp = current_timestamp()::timestamp_ntz
where n_nationkey = 3;

commit;

View the data in the NATION table by running the following command:

select * from nation where n_nationkey in (1, 2, 3);

In the following figure, notice the updated values in the N_COMMENT and UPDATE_TIMESTAMP columns for N_NATIONKEY 2 and N_NATIONKEY 3.

Next, let's look at what the stream captured. Run the following command:

select * from nation_table_changes;

In the following figure, notice that the METADATA$ISUPDATE column contains TRUE. You updated two records in the NATION table, so there are four rows in the stream: a stream records each update as a pair of rows, one for the old version of the record and one for the new.

Next, merge the stream data into the NATION_HISTORY table by executing the MERGE statement again:

-- MERGE statement that uses the CHANGE_DATA view to load data into the NATION_HISTORY table
merge into nation_history nh -- Target table to merge changes from NATION into
using nation_change_data m -- CHANGE_DATA is a view that holds the logic that determines what to insert/update into the NATION_HISTORY table
on nh.n_nationkey = m.n_nationkey -- n_nationkey and start_time determine whether there is a unique record in the NATION_HISTORY table
and nh.start_time = m.start_time
when matched and m.dml_type = 'U' then update -- Indicates the record has been updated and is no longer current; the end_time needs to be stamped
    set nh.end_time = m.end_time,
        nh.current_flag = 0
when matched and m.dml_type = 'D' then update -- Deletes are essentially logical deletes; the record is stamped and no newer version is inserted
    set nh.end_time = m.end_time,
        nh.current_flag = 0
when not matched and m.dml_type = 'I' then insert -- Inserting a new n_nationkey and updating an existing one both result in an insert
    (n_nationkey, n_name, n_regionkey, n_comment, country_code, start_time, end_time, current_flag)
    values (m.n_nationkey, m.n_name, m.n_regionkey, m.n_comment, m.country_code, m.start_time, m.end_time, m.current_flag);

The MERGE statement updated two records in the NATION_HISTORY table and inserted two more, as shown in the following figure.

If you view the data in the NATION_HISTORY table, you'll now see 27 records instead of 25, as shown in the following figure:

select * from nation_history;

As shown in the following figure, filtering for only the N_NATIONKEY values you updated shows how records are stamped for START_TIME, END_TIME, and CURRENT_FLAG:

select * from nation_history where n_nationkey in (2, 3) order by n_nationkey, start_time;

Again, because you executed a DML statement using the stream on the NATION table, the stream was purged to ensure you don't process the same changed data twice. To verify this, run the following command:

select * from nation_table_changes;

The following figure shows that the stream has been purged.

Manually executing the MERGE command is starting to get tiresome. Luckily, you don't have to: using a task, you can schedule the MERGE statement to run on a recurring basis and execute only if there is data in the NATION_TABLE_CHANGES stream.
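A minimal sketch of what such a task could look like is shown below. The task name, warehouse name, and five-minute schedule are illustrative assumptions, not details from this post; the MERGE body is the same statement used for manual processing.

```sql
-- Sketch of a task that automates the MERGE (task name, warehouse,
-- and schedule are illustrative assumptions).
create or replace task populate_nation_history
  warehouse = compute_wh      -- assumed warehouse name
  schedule = '5 minute'
when
  -- Only execute when the stream actually contains change records
  system$stream_has_data('NATION_TABLE_CHANGES')
as
merge into nation_history nh
using nation_change_data m
  on nh.n_nationkey = m.n_nationkey
 and nh.start_time = m.start_time
when matched and m.dml_type = 'U' then update
  set nh.end_time = m.end_time, nh.current_flag = 0
when matched and m.dml_type = 'D' then update
  set nh.end_time = m.end_time, nh.current_flag = 0
when not matched and m.dml_type = 'I' then insert
  (n_nationkey, n_name, n_regionkey, n_comment, country_code,
   start_time, end_time, current_flag)
  values (m.n_nationkey, m.n_name, m.n_regionkey, m.n_comment,
          m.country_code, m.start_time, m.end_time, m.current_flag);

-- Tasks are created in a suspended state; resume the task to start the schedule.
alter task populate_nation_history resume;
```

Once resumed, you can confirm the task is firing with `show tasks;` or by querying the `information_schema.task_history()` table function.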

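The delete step mentioned earlier ("Then, you'll delete data...") is not included in this excerpt. As a hedged sketch, a delete would flow through the same stream and be handled by the MERGE statement's dml_type = 'D' branch; the specific n_nationkey deleted here is purely illustrative:

```sql
-- Illustrative delete; the chosen n_nationkey is an assumption
begin;
delete from nation where n_nationkey = 1;
commit;

-- The stream now holds the delete record; the next run of the MERGE
-- stamps end_time and sets current_flag = 0 for that row in NATION_HISTORY.
select * from nation_table_changes;
```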