Snowflake offers a straightforward solution for securely sharing data, allowing account-to-account sharing of database tables, secure views, and secure UDFs. Data sharing is a simple but powerful feature: data shared from one account can be queried from another. A data provider can give any number of data consumers live access to its data within minutes, without replicating or moving the data. And because of Snowflake's multi-cluster shared data architecture, data consumers can query the shared data without creating any contention with the data provider's own workloads.
Snowflake uses Secure Data Sharing to distribute live data across Snowflake accounts:
The term "Secure Data Sharing" refers to the fact that no actual data is copied or transferred between accounts. All sharing is performed through Snowflake's unique services layer and metadata store, so shared data takes up no storage in the consumer account and therefore does not contribute to the consumer's monthly data storage charges.
Providers share data by creating shares and giving them a name. Shares are first-class Snowflake objects that encapsulate all of the information required to share a database.
Each share includes the following components:
New objects added to a share are immediately available to all consumers, allowing them to access shared data in real time. Access to a share (or any of its contents) can be revoked at any time.
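For instance, access can be revoked either by removing an object from the share or by removing a consumer account from it. A minimal sketch, using the share, table, and account names from the example later in this post:

REVOKE SELECT ON TABLE sales_db.aggregates_eula.aggregate_1 FROM SHARE sales_s; -- stop sharing one table
ALTER SHARE sales_s REMOVE ACCOUNTS = yz23456; -- remove one consumer account from the share entirely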
Data sharing is supported only between Snowflake accounts. As a data provider, you may want to share data with a consumer who does not have a Snowflake account and/or is not yet ready to become a licensed Snowflake customer. To facilitate sharing with such consumers, Snowflake lets providers create reader accounts. Reader accounts (formerly known as "read-only accounts") provide a quick, easy, and cost-effective way to share data without requiring the consumer to become a Snowflake customer. Users in a reader account can query data that has been shared with them, but they cannot perform any of the DML tasks available to full account users (data loading, insert, update, etc.).
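A reader account can also be created in SQL rather than the UI. A minimal sketch; the account name and admin credentials below are placeholders, not values from this example:

USE ROLE accountadmin;
CREATE MANAGED ACCOUNT sales_reader -- placeholder name for the reader account
  ADMIN_NAME = 'reader_admin',        -- login name for the reader account's administrator
  ADMIN_PASSWORD = 'Str0ngPassw0rd!', -- placeholder; choose a strong password
  TYPE = READER;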
Snowflake data sharing works because of its multi-cluster shared data architecture, which allows any Snowflake customer (the data provider) who wishes to share data to grant other organisations live access to it over a secure connection. Instead of being physically transferred to the consumer, the data remains in the provider's account and is accessible and queryable by the consumer via SQL.
The data provider is charged only for the Snowflake compute and storage resources that it uses. The compute resources used to query the shared data are paid for by the data consumer. When an object is shared with a data consumer, the object remains in the provider's account, so the data consumer incurs no storage costs for the shared data unless it copies the data into its own tables.
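On the consumer side, the shared data is made queryable by creating a database from the share. A minimal sketch, assuming a hypothetical provider account identifier provider_acct (the database and role names are also illustrative):

USE ROLE accountadmin;
CREATE DATABASE sales_shared FROM SHARE provider_acct.sales_s; -- no data is copied; this database points at the share
GRANT IMPORTED PRIVILEGES ON DATABASE sales_shared TO ROLE sysadmin; -- let other roles query the shared data
SELECT * FROM sales_shared.aggregates_eula.aggregate_1; -- compute for this query is billed to the consumer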
We must first create the account with which we will share the data. In Snowflake, we can create full user accounts or reader accounts as needed:
In this example, we will see how to create a reader account.
In the form below, fill in all the required information.
Now that the reader account is created, you next need to create the Snowflake user for it.
If you have the ACCOUNTADMIN role (or a role that has been granted the CREATE SHARE privilege), you can perform most tasks related to creating and managing shares by clicking the Shares button in the menu of the Snowflake web interface.
Next, click the Create Share option; a window will open where you can fill in all the details.
Once a share has been created in the UI, consumers can be added.
A pop-up window appears, allowing you to select the accounts with which you want to share the data.
The same steps can be performed in worksheets using SQL commands.
Step 1: Create a Share
You can follow these steps in worksheets once the accounts for the clients with whom we need to share the data have been created.
To create a share, use CREATE SHARE. At this point, the share is merely a container awaiting the addition of objects and accounts.
Step 2: Grant Privileges to Add Objects to the Share
Use GRANT &lt;privilege&gt; ... TO SHARE to grant the following object privileges to the share:
Step 3: Add an Account or Accounts to the Share
ALTER SHARE can be used to add one or more accounts to the share. SHOW GRANTS OF SHARE can be used to view the accounts that have been added to the share.
It is critical that the role sharing the database owns it (or has been granted the required privileges); otherwise the GRANT statements will fail with an insufficient-privileges error.
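If a grant does fail, one quick check (a sketch, using the database name from the example below) is to see which role owns the database:

SHOW GRANTS ON DATABASE sales_db; -- look for the OWNERSHIP privilege to find the owning role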
That's all! The share has become suitable for consumption by the designated accounts.
The following illustration depicts the entire provider process as described above.
It should be noted that the following assumptions are made in this example:
There is a database called sales_db, which has a schema called aggregates_eula and a table called aggregate_1.
Two accounts, xy12345 and yz23456, will have access to the database, schema, and table.
USE ROLE accountadmin;
CREATE SHARE sales_s;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_s;
GRANT USAGE ON SCHEMA sales_db.aggregates_eula TO SHARE sales_s;
GRANT SELECT ON TABLE sales_db.aggregates_eula.aggregate_1 TO SHARE sales_s;
SHOW GRANTS TO SHARE sales_s;
ALTER SHARE sales_s ADD ACCOUNTS=xy12345, yz23456;
SHOW GRANTS OF SHARE sales_s;
Consumers face some important limitations when using shared databases: shared databases are read-only (the data can be queried but not modified), consumers cannot create objects in them or clone them, Time Travel is not available on shared objects, and a consumer cannot re-share the data with other accounts.
In this blog post, we discussed Snowflake data sharing and walked through the steps to create shares and shared accounts in detail. If you have any doubts, please drop them in the comments section to get them resolved.