
SnowPro Advanced: Architect Recertification Exam Question and Answers

SnowPro Advanced: Architect Recertification Exam

Last Update May 1, 2024
Total Questions : 162

Questions 1

Which of the following are characteristics of Snowflake’s parameter hierarchy?

Options:

A.  

Session parameters override virtual warehouse parameters.

B.  

Virtual warehouse parameters override user parameters.

C.  

Table parameters override virtual warehouse parameters.

D.  

Schema parameters override account parameters.

Discussion 0
Questions 2

A Snowflake Architect is designing a multi-tenant application strategy for an organization in the Snowflake Data Cloud and is considering using an Account Per Tenant strategy.

Which requirements will be addressed with this approach? (Choose two.)

Options:

A.  

There needs to be fewer objects per tenant.

B.  

Security and Role-Based Access Control (RBAC) policies must be simple to configure.

C.  

Compute costs must be optimized.

D.  

Tenant data shape may be unique per tenant.

E.  

Storage costs must be optimized.

Discussion 0
Questions 3

A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.

Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)

Options:

A.  

Use, at minimum, the Business Critical edition of Snowflake.

B.  

Create Dynamic Data Masking policies and apply them to columns that contain PHI.

C.  

Use the Internal Tokenization feature to obfuscate sensitive data.

D.  

Use the External Tokenization feature to obfuscate sensitive data.

E.  

Rewrite SQL queries to eliminate projections of PHI data based on current_role().

F.  

Avoid sharing data with partner organizations.

Discussion 0
Questions 4

A company is trying to ingest 10 TB of CSV data into a Snowflake table using Snowpipe as part of its migration from a legacy database platform. The records need to be ingested in the MOST performant and cost-effective way.

How can these requirements be met?

Options:

A.  

Use ON_ERROR = CONTINUE in the COPY INTO command.

B.  

Use PURGE = TRUE in the COPY INTO command.

C.  

Use PURGE = FALSE in the COPY INTO command.

D.  

Use ON_ERROR = SKIP_FILE in the COPY INTO command.

Discussion 0
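For reference, the COPY options named in these answer choices can be sketched in a plain COPY INTO statement. This is a minimal illustration only; the table and stage names are hypothetical, not from the question.

```sql
-- Hedged sketch: loading staged CSV files with explicit error and purge behavior.
-- my_table and my_stage are placeholder names.
copy into my_table
from @my_stage
file_format = (type = 'CSV')
on_error = 'SKIP_FILE'  -- skip an entire file when any record in it fails
purge = true;           -- remove staged files after a successful load
```

ON_ERROR controls per-record failure handling during the load, while PURGE controls whether the staged files are deleted after loading; the two options address different cost and performance concerns.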
Questions 5

What integration object should be used to place restrictions on where data may be exported?

Options:

A.  

Stage integration

B.  

Security integration

C.  

Storage integration

D.  

API integration

Discussion 0
Questions 6

The Business Intelligence team reports that when some team members run queries for their dashboards in parallel with others, the query response time gets significantly slower. What can a Snowflake Architect do to identify what is occurring and troubleshoot this issue?

(Answer choices A–D were presented as images and are not reproduced here.)

Options:

A.  

Option A

B.  

Option B

C.  

Option C

D.  

Option D

Discussion 0
Questions 7

A company has an external vendor who puts data into Google Cloud Storage. The company's Snowflake account is set up in Azure.

What would be the MOST efficient way to load data from the vendor into Snowflake?

Options:

A.  

Ask the vendor to create a Snowflake account, load the data into Snowflake and create a data share.

B.  

Create an external stage on Google Cloud Storage and use the external table to load the data into Snowflake.

C.  

Copy the data from Google Cloud Storage to Azure Blob storage using external tools and load data from Blob storage to Snowflake.

D.  

Create a Snowflake Account in the Google Cloud Platform (GCP), ingest data into this account and use data replication to move the data from GCP to Azure.

Discussion 0
Questions 8

A company’s client application supports multiple authentication methods, and is using Okta.

What is the best practice recommendation for the order of priority when applications authenticate to Snowflake?

Options:

A.  

1) OAuth (either Snowflake OAuth or External OAuth)

2) External browser

3) Okta native authentication

4) Key Pair Authentication, mostly used for service account users

5) Password

B.  

1) External browser, SSO

2) Key Pair Authentication, mostly used for development environment users

3) Okta native authentication

4) OAuth (either Snowflake OAuth or External OAuth)

5) Password

C.  

1) Okta native authentication

2) Key Pair Authentication, mostly used for production environment users

3) Password

4) OAuth (either Snowflake OAuth or External OAuth)

5) External browser, SSO

D.  

1) Password

2) Key Pair Authentication, mostly used for production environment users

3) Okta native authentication

4) OAuth (either Snowflake OAuth or External OAuth)

5) External browser, SSO

Discussion 0
Questions 9

An Architect needs to meet a company requirement to ingest files from the company's AWS storage accounts into the company's Snowflake Google Cloud Platform (GCP) account. How can the ingestion of these files into the company's Snowflake account be initiated? (Select TWO).

Options:

A.  

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

B.  

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 Glacier storage.

C.  

Create an AWS Lambda function to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

D.  

Configure AWS Simple Notification Service (SNS) to notify Snowpipe when new files have arrived in Amazon S3 storage.

E.  

Configure the client application to issue a COPY INTO command to Snowflake when new files have arrived in Amazon S3 Glacier storage.

Discussion 0
This option is not relevant because it does not use Snowpipe, but rather the standard COPY command, which is a batch loading method. Moreover, the COPY command does not support ingesting files from Amazon S3 Glacier storage.

References:
  • SnowPro Advanced: Architect | Study Guide
  • Snowflake Documentation | Snowpipe Overview
  • Snowflake Documentation | Using the Snowpipe REST API
  • Snowflake Documentation | Loading Data Using Snowpipe and AWS Lambda
  • Snowflake Documentation | Supported File Formats and Compression for Staged Data Files
  • Snowflake Documentation | Using Cloud Notifications to Trigger Snowpipe
  • Snowflake Documentation | Loading Data Using COPY into a Table

    Questions 10

    An Architect is designing a pipeline to stream event data into Snowflake using the Snowflake Kafka connector. The Architect’s highest priority is to configure the connector to stream data in the MOST cost-effective manner.

    Which of the following is recommended for optimizing the cost associated with the Snowflake Kafka connector?

    Options:

    A.  

    Utilize a higher buffer.flush.time in the connector configuration.

    B.  

    Utilize a higher buffer.size.bytes in the connector configuration.

    C.  

    Utilize a lower buffer.size.bytes in the connector configuration.

    D.  

    Utilize a lower buffer.count.records in the connector configuration.

    Discussion 0
    Questions 11

    A company's Architect needs to find an efficient way to get data from an external partner, who is also a Snowflake user. The current solution is based on daily JSON extracts that are placed on an FTP server and uploaded to Snowflake manually. The files are changed several times each month, and the ingestion process needs to be adapted to accommodate these changes.

    What would be the MOST efficient solution?

    Options:

    A.  

    Ask the partner to create a share and add the company's account.

    B.  

    Ask the partner to use the data lake export feature and place the data into cloud storage where Snowflake can natively ingest it (schema-on-read).

    C.  

    Keep the current structure but request that the partner stop changing files, instead only appending new files.

    D.  

    Ask the partner to set up a Snowflake reader account and use that account to get the data for ingestion.

    Discussion 0
    Questions 12

    An Architect needs to design a solution for building environments for development, test, and pre-production, all located in a single Snowflake account. The environments should be based on production data.

    Which solution would be MOST cost-effective and performant?

    Options:

    A.  

    Use zero-copy cloning into transient tables.

    B.  

    Use zero-copy cloning into permanent tables.

    C.  

    Use CREATE TABLE ... AS SELECT (CTAS) statements.

    D.  

    Use a Snowflake task to trigger a stored procedure to copy data.

    Discussion 0
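As an illustration of the cloning approach these options describe, a zero-copy clone into a transient table is a single statement. The database and table names below are hypothetical.

```sql
-- Clone production data into a transient table for a development environment.
-- No data is physically copied at clone time, and transient tables avoid
-- Fail-safe storage costs, which suits short-lived environments.
create transient table dev_db.public.orders
  clone prod_db.public.orders;
```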
    Questions 13

    Which security, governance, and data protection features require, at a MINIMUM, the Business Critical edition of Snowflake? (Choose two.)

    Options:

    A.  

    Extended Time Travel (up to 90 days)

    B.  

    Customer-managed encryption keys through Tri-Secret Secure

    C.  

    Periodic rekeying of encrypted data

    D.  

    AWS, Azure, or Google Cloud private connectivity to Snowflake

    E.  

    Federated authentication and SSO

    Discussion 0
    Questions 14

    Database DB1 has schema S1 which has one table, T1.

    DB1 --> S1 --> T1

    The retention period of DB1 is set to 10 days.

    The retention period of S1 is set to 20 days.

    The retention period of T1 is set to 30 days.

    The user runs the following command:

    DROP DATABASE DB1;

    What will the Time Travel retention period be for T1?

    Options:

    A.  

    10 days

    B.  

    20 days

    C.  

    30 days

    D.  

    37 days

    Discussion 0
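The retention settings in this question map to statements like the following. Note the documented behavior: when a database is dropped, the database-level retention period governs the dropped children, so longer retention set explicitly on a child schema or table is not honored unless the child is dropped separately first.

```sql
alter database DB1      set data_retention_time_in_days = 10;
alter schema   DB1.S1   set data_retention_time_in_days = 20;
alter table    DB1.S1.T1 set data_retention_time_in_days = 30;

-- Dropping the database applies the database-level retention to T1.
drop database DB1;
```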
    Questions 15

    A global company needs to securely share its sales and inventory data with a vendor using a Snowflake account.

    The company has its Snowflake account in the AWS eu-west-2 Europe (London) region. The vendor's Snowflake account is on the Azure platform in the West Europe region. How should the company's Architect configure the data share?

    Options:

    A.  

    1. Create a share.

    2. Add objects to the share.

    3. Add a consumer account to the share for the vendor to access.

    B.  

    1. Create a share.

    2. Create a reader account for the vendor to use.

    3. Add the reader account to the share.

    C.  

    1. Create a new role called db_share.

    2. Grant the db_share role privileges to read data from the company database and schema.

    3. Create a user for the vendor.

    4. Grant the db_share role to the vendor's users.

    D.  

    1. Promote an existing database in the company's local account to primary.

    2. Replicate the database to Snowflake on Azure in the West-Europe region.

    3. Create a share and add objects to the share.

    4. Add a consumer account to the share for the vendor to access.

    Discussion 0
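The replicate-then-share workflow described in option D can be sketched as follows. This is a hedged outline of the documented cross-cloud sharing pattern; all organization, account, database, and share names are hypothetical placeholders.

```sql
-- In the provider's AWS eu-west-2 account: enable replication of the database
-- to the provider's own account on Azure West Europe.
alter database sales_db
  enable replication to accounts myorg.azure_westeurope_acct;

-- In the provider's Azure account: create and refresh the secondary database.
create database sales_db as replica of myorg.aws_euwest2_acct.sales_db;
alter database sales_db refresh;

-- Still in the Azure account (same cloud/region as the vendor): create the share.
create share sales_share;
grant usage on database sales_db to share sales_share;
alter share sales_share add accounts = myorg.vendor_acct;
```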
    Questions 16

    Is it possible for a data provider account with a Snowflake Business Critical edition to share data with an Enterprise edition data consumer account?

    Options:

    A.  

    A Business Critical account cannot be a data sharing provider to an Enterprise consumer. Any consumer accounts must also be Business Critical.

    B.  

    If a user in the provider account with role authority to create or alter share adds an Enterprise account as a consumer, it can import the share.

    C.  

    If a user in the provider account with a share owning role sets share_restrictions to False when adding an Enterprise consumer account, it can import the share.

    D.  

    If a user in the provider account with a share owning role which also has override share restrictions privilege share_restrictions set to False when adding an Enterprise consumer account, it can import the share.

    Discussion 0
    Questions 17

    What is a characteristic of Role-Based Access Control (RBAC) as used in Snowflake?

    Options:

    A.  

    Privileges can be granted at the database level and can be inherited by all underlying objects.

    B.  

    A user can use a "super-user" access along with securityadmin to bypass authorization checks and access all databases, schemas, and underlying objects.

    C.  

    A user can create managed access schemas to support future grants and ensure only schema owners can grant privileges to other roles.

    D.  

    A user can create managed access schemas to support current and future grants and ensure only object owners can grant privileges to other roles.

    Discussion 0
    Questions 18

    A Developer is having a performance issue with a Snowflake query. The query receives up to 10 different values for one parameter and then performs an aggregation over the majority of a fact table. It then joins against a smaller dimension table. This parameter value is selected by the different query users when they execute it during business hours. Both the fact and dimension tables are loaded with new data in an overnight import process.

    On a Small or Medium-sized virtual warehouse, the query performs slowly. Performance is acceptable on a size Large or bigger warehouse. However, there is no budget to increase costs. The Developer needs a recommendation that does not increase compute costs to run this query.

    What should the Architect recommend?

    Options:

    A.  

    Create a task that will run the 10 different variations of the query corresponding to the 10 different parameters before the users come in to work. The query results will then be cached and ready to respond quickly when the users re-issue the query.

    B.  

    Create a task that will run the 10 different variations of the query corresponding to the 10 different parameters before the users come in to work. The task will be scheduled to align with the users' working hours in order to allow the warehouse cache to be used.

    C.  

    Enable the search optimization service on the table. When the users execute the query, the search optimization service will automatically adjust the query execution plan based on the frequently-used parameters.

    D.  

    Create a dedicated size Large warehouse for this particular set of queries. Create a new role that has USAGE permission on this warehouse and has the appropriate read permissions over the fact and dimension tables. Have users switch to this role and use this warehouse when they want to access this data.

    Discussion 0
    Questions 19

    When loading data from stage using COPY INTO, what options can you specify for the ON_ERROR clause?

    Options:

    A.  

    CONTINUE

    B.  

    SKIP_FILE

    C.  

    ABORT_STATEMENT

    D.  

    FAIL

    Discussion 0

    Questions 20

    A data platform team creates two multi-cluster virtual warehouses with the AUTO_SUSPEND value set to NULL on one, and '0' on the other. What would be the execution behavior of these virtual warehouses?

    Options:

    A.  

    Setting a '0' or NULL value means the warehouses will never suspend.

    B.  

    Setting a '0' or NULL value means the warehouses will suspend immediately.

    C.  

    Setting a '0' or NULL value means the warehouses will suspend after the default of 600 seconds.

    D.  

    Setting a '0' value means the warehouses will suspend immediately, and NULL means the warehouses will never suspend.

    Discussion 0
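The AUTO_SUSPEND semantics in question can be expressed directly; both 0 and NULL disable automatic suspension. Warehouse names below are hypothetical.

```sql
create warehouse wh_a with warehouse_size = 'XSMALL' auto_suspend = 0;    -- never auto-suspends
create warehouse wh_b with warehouse_size = 'XSMALL' auto_suspend = null; -- never auto-suspends

-- A more typical production setting suspends after 60 seconds of inactivity:
alter warehouse wh_a set auto_suspend = 60;
```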
    Questions 21

    What transformations are supported in the below SQL statement? (Select THREE).

    CREATE PIPE ... AS COPY ... FROM (...)

    Options:

    A.  

    Data can be filtered by an optional where clause.

    B.  

    Columns can be reordered.

    C.  

    Columns can be omitted.

    D.  

    Type casts are supported.

    E.  

    Incoming data can be joined with other tables.

    F.  

    The ON_ERROR = ABORT_STATEMENT command can be used.

    Discussion 0
    The CREATE PIPE ... AS COPY ... FROM (...) statement is used by Snowpipe to load data from an ingestion queue into tables. The statement uses a subquery in the FROM clause to transform the data from the staged files before loading it into the table.

    The transformations supported in the subquery are filtering with a WHERE clause, reordering columns, and omitting columns, as in the following examples:

    create pipe mypipe as
    copy into mytable
    from (
      select * from @mystage
      where col1 = 'A' and col2 > 10
    );

    create pipe mypipe as
    copy into mytable (col1, col2, col3)
    from (
      select col3, col1, col2 from @mystage
    );

    create pipe mypipe as
    copy into mytable (col1, col2)
    from (
      select col1, col2 from @mystage
    );

    The other options are not supported in the subquery. Type casts, joins against other tables, and an ON_ERROR clause inside the subquery, as attempted below, are rejected:

    create pipe mypipe as
    copy into mytable (col1, col2)
    from (
      select col1::date, col2 from @mystage
    );

    create pipe mypipe as
    copy into mytable (col1, col2, col3)
    from (
      select s.col1, s.col2, t.col3 from @mystage s
      join othertable t on s.col1 = t.col1
    );

    create pipe mypipe as
    copy into mytable
    from (
      select * from @mystage
      on error abort
    );

    References:

    • CREATE PIPE | Snowflake Documentation
    • Transforming Data During a Load | Snowflake Documentation

    Questions 22

    A new table and streams are created with the following commands:

    CREATE OR REPLACE TABLE LETTERS (ID INT, LETTER STRING) ;

    CREATE OR REPLACE STREAM STREAM_1 ON TABLE LETTERS;

    CREATE OR REPLACE STREAM STREAM_2 ON TABLE LETTERS APPEND_ONLY = TRUE;

    The following operations are processed on the newly created table:

    INSERT INTO LETTERS VALUES (1, 'A');

    INSERT INTO LETTERS VALUES (2, 'B');

    INSERT INTO LETTERS VALUES (3, 'C');

    TRUNCATE TABLE LETTERS;

    INSERT INTO LETTERS VALUES (4, 'D');

    INSERT INTO LETTERS VALUES (5, 'E');

    INSERT INTO LETTERS VALUES (6, 'F');

    DELETE FROM LETTERS WHERE ID = 6;

    What would be the output of the following SQL commands, in order?

    SELECT COUNT (*) FROM STREAM_1;

    SELECT COUNT (*) FROM STREAM_2;

    Options:

    A.  

    2 & 6

    B.  

    2 & 3

    C.  

    4 & 3

    D.  

    4 & 6

    Discussion 0
    Questions 23

    A company wants to deploy its Snowflake accounts inside its corporate network with no visibility on the internet. The company is using a VPN infrastructure and Virtual Desktop Infrastructure (VDI) for its Snowflake users. The company also wants to re-use the login credentials set up for the VDI to eliminate redundancy when managing logins.

    What Snowflake functionality should be used to meet these requirements? (Choose two.)

    Options:

    A.  

    Set up replication to allow users to connect from outside the company VPN.

    B.  

    Provision a unique company Tri-Secret Secure key.

    C.  

    Use private connectivity from a cloud provider.

    D.  

    Set up SSO for federated authentication.

    E.  

    Use a proxy Snowflake account outside the VPN, enabling client redirect for user logins.

    Discussion 0
    Questions 24

    Two queries are run on the customer_address table:

    create or replace TABLE CUSTOMER_ADDRESS (
      CA_ADDRESS_SK NUMBER(38,0),
      CA_ADDRESS_ID VARCHAR(16),
      CA_STREET_NUMBER VARCHAR(10),
      CA_STREET_NAME VARCHAR(60),
      CA_STREET_TYPE VARCHAR(15),
      CA_SUITE_NUMBER VARCHAR(10),
      CA_CITY VARCHAR(60),
      CA_COUNTY VARCHAR(30),
      CA_STATE VARCHAR(2),
      CA_ZIP VARCHAR(10),
      CA_COUNTRY VARCHAR(20),
      CA_GMT_OFFSET NUMBER(5,2),
      CA_LOCATION_TYPE VARCHAR(20)
    );

    ALTER TABLE DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS ADD SEARCH OPTIMIZATION ON SUBSTRING(CA_ADDRESS_ID);

    Which queries will benefit from the use of the search optimization service? (Select TWO).

    Options:

    A.  

    select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where substring(CA_ADDRESS_ID,1,8) = substring('AAAAAAAAPHPPLBAAASKDJHASLKDJHASKJD',1,8);

    B.  

    select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID = substring('AAAAAAAAPHPPLBAAASKDJHASLKDJHASKJD',1,16);

    C.  

    select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID LIKE '%BAAASKD%';

    D.  

    select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID LIKE '%PHPP%';

    E.  

    select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID NOT LIKE '%AAAAAAAAPHPPL%';

    Discussion 0
    Questions 25

    An Architect is designing a solution that will be used to process changed records in an ORDERS table. Newly-inserted orders must be loaded into the F_ORDERS fact table, which will aggregate all the orders by multiple dimensions (time, region, channel, etc.). Existing orders can be updated by the sales department within 30 days after the order creation. In case of an order update, the solution must perform two actions:

    1. Update the order in the F_ORDERS fact table.

    2. Load the changed order data into the special table ORDER_REPAIRS.

    This table is used by the Accounting department once a month. If the order has been changed, the Accounting team needs to know the latest details and perform the necessary actions based on the data in the ORDER_REPAIRS table.

    What data processing logic design will be the MOST performant?

    Options:

    A.  

    Use one stream and one task.

    B.  

    Use one stream and two tasks.

    C.  

    Use two streams and one task.

    D.  

    Use two streams and two tasks.

    Discussion 0
    Questions 26

    A user has the appropriate privilege to see unmasked data in a column.

    If the user loads this column data into another column that does not have a masking policy, what will occur?

    Options:

    A.  

    Unmasked data will be loaded in the new column.

    B.  

    Masked data will be loaded into the new column.

    C.  

    Unmasked data will be loaded into the new column but only users with the appropriate privileges will be able to see the unmasked data.

    D.  

    Unmasked data will be loaded into the new column and no users will be able to see the unmasked data.

    Discussion 0
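For context, a Dynamic Data Masking policy of the kind this question assumes might look like the sketch below. The role, table, and column names are illustrative only.

```sql
-- Masking policy: only PII_ADMIN sees the raw value; everyone else sees a mask.
create masking policy email_mask as (val string) returns string ->
  case
    when current_role() = 'PII_ADMIN' then val
    else '***MASKED***'
  end;

-- Attach the policy to a column. A column without any policy simply stores
-- whatever value the loading user's query produced.
alter table customers modify column email set masking policy email_mask;
```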
    Questions 27

    An Architect needs to grant a group of ORDER_ADMIN users the ability to clean old data in an ORDERS table (deleting all records older than 5 years), without granting any privileges on the table. The group’s manager (ORDER_MANAGER) has full DELETE privileges on the table.

    How can the ORDER_ADMIN role be enabled to perform this data cleanup, without needing the DELETE privilege held by the ORDER_MANAGER role?

    Options:

    A.  

    Create a stored procedure that runs with caller’s rights, including the appropriate "> 5 years" business logic, and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

    B.  

    Create a stored procedure that can be run using both caller’s and owner’s rights (allowing the user to specify which rights are used during execution), and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

    C.  

    Create a stored procedure that runs with owner’s rights, including the appropriate "> 5 years" business logic, and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

    D.  

    This scenario would actually not be possible in Snowflake – any user performing a DELETE on a table requires the DELETE privilege to be granted to the role they are using.

    Discussion 0
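The owner's-rights pattern these options describe can be sketched as a SQL stored procedure; the object and role names below are hypothetical.

```sql
-- Owner's rights procedure: executes with the privileges of its owning role
-- (here, a role holding DELETE on ORDERS), not the caller's privileges.
create or replace procedure purge_old_orders()
returns varchar
language sql
execute as owner
as
$$
begin
  delete from orders
   where order_date < dateadd(year, -5, current_date());
  return 'cleanup complete';
end;
$$;

-- Callers need only USAGE on the procedure, not DELETE on the table.
grant usage on procedure purge_old_orders() to role order_admin;
```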
    Questions 28

    Which of the following objects can be cloned in Snowflake?

    Options:

    A.  

    Permanent table

    B.  

    Transient table

    C.  

    Temporary table

    D.  

    External tables

    E.  

    Internal stages

    Discussion 0
    Questions 29

    Which Snowflake architecture recommendation needs multiple Snowflake accounts for implementation?

    Options:

    A.  

    Enable a disaster recovery strategy across multiple cloud providers.

    B.  

    Create external stages pointing to cloud providers and regions other than the region hosting the Snowflake account.

    C.  

    Enable zero-copy cloning among the development, test, and production environments.

    D.  

    Enable separation of the development, test, and production environments.

    Discussion 0
    Questions 30

    How can an Architect enable optimal clustering to enhance performance for different access paths on a given table?

    Options:

    A.  

    Create multiple clustering keys for a table.

    B.  

    Create multiple materialized views with different cluster keys.

    C.  

    Create super projections that will automatically create clustering.

    D.  

    Create a clustering key that contains all columns used in the access paths.

    Discussion 0
    Questions 31

    An Architect has a VPN_ACCESS_LOGS table in the SECURITY_LOGS schema containing timestamps of the connection and disconnection, username of the user, and summary statistics.

    What should the Architect do to enable the Snowflake search optimization service on this table?

    Options:

    A.  

    Assume role with OWNERSHIP on future tables and ADD SEARCH OPTIMIZATION on the SECURITY_LOGS schema.

    B.  

    Assume role with ALL PRIVILEGES including ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

    C.  

    Assume role with OWNERSHIP on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

    D.  

    Assume role with ALL PRIVILEGES on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

    Discussion 0
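Enabling the service itself is a single statement once the active role holds the required privileges (OWNERSHIP on the table plus ADD SEARCH OPTIMIZATION on the schema). The verification query is an optional illustration.

```sql
-- Run with a role that owns the table and has ADD SEARCH OPTIMIZATION
-- on the SECURITY_LOGS schema.
alter table SECURITY_LOGS.VPN_ACCESS_LOGS add search optimization;

-- Check the SEARCH_OPTIMIZATION column to confirm the service is active:
show tables like 'VPN_ACCESS_LOGS' in schema SECURITY_LOGS;
```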
    Questions 32

    What are some of the characteristics of result set caches? (Choose three.)

    Options:

    A.  

    Time Travel queries can be executed against the result set cache.

    B.  

    Snowflake persists the data results for 24 hours.

    C.  

    Each time persisted results for a query are used, a 24-hour retention period is reset.

    D.  

    The data stored in the result cache will contribute to storage costs.

    E.  

    The retention period can be reset for a maximum of 31 days.

    F.  

    The result set cache is not shared between warehouses.

    Discussion 0
    Questions 33

    A company is using Snowflake in Azure in the Netherlands. The company analyst team also has data in JSON format that is stored in an Amazon S3 bucket in the AWS Singapore region that the team wants to analyze.

    The Architect has been given the following requirements:

    1. Provide access to frequently changing data

    2. Keep egress costs to a minimum

    3. Maintain low latency

    How can these requirements be met with the LEAST amount of operational overhead?

    Options:

    A.  

    Use a materialized view on top of an external table against the S3 bucket in AWS Singapore.

    B.  

    Use an external table against the S3 bucket in AWS Singapore and copy the data into transient tables.

    C.  

    Copy the data between providers from S3 to Azure Blob storage to collocate, then use Snowpipe for data ingestion.

    D.  

    Use AWS Transfer Family to replicate data between the S3 bucket in AWS Singapore and an Azure Netherlands Blob storage, then use an external table against the Blob storage.

    Discussion 0
    Questions 34

    A company has a Snowflake account named ACCOUNTA in AWS us-east-1 region. The company stores its marketing data in a Snowflake database named MARKET_DB. One of the company’s business partners has an account named PARTNERB in Azure East US 2 region. For marketing purposes the company has agreed to share the database MARKET_DB with the partner account.

    Which of the following steps MUST be performed for the account PARTNERB to consume data from the MARKET_DB database?

    Options:

    A.  

    Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA create a share of database MARKET_DB, create a new database out of this share locally in AWS us-east-1 region, and replicate this new database to AZABC123 account. Then set up data sharing to the PARTNERB account.

    B.  

    From account ACCOUNTA create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then make this database the provider and share it with the PARTNERB account.

    C.  

    Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA replicate the database MARKET_DB to AZABC123 and from this account set up the data sharing to the PARTNERB account.

    D.  

    Create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then replicate this database to the partner’s account PARTNERB.

    Discussion 0
    Questions 35

    There are two databases in an account, named fin_db and hr_db which contain payroll and employee data, respectively. Accountants and Analysts in the company require different permissions on the objects in these databases to perform their jobs. Accountants need read-write access to fin_db but only require read-only access to hr_db because the database is maintained by human resources personnel.

    An Architect needs to create a read-only role for certain employees working in the human resources department.

    Which permission sets must be granted to this role?

    Options:

    A.  

    USAGE on database hr_db, USAGE on all schemas in database hr_db, SELECT on all tables in database hr_db

    B.  

    USAGE on database hr_db, SELECT on all schemas in database hr_db, SELECT on all tables in database hr_db

    C.  

    MODIFY on database hr_db, USAGE on all schemas in database hr_db, USAGE on all tables in database hr_db

    D.  

    USAGE on database hr_db, USAGE on all schemas in database hr_db, REFERENCES on all tables in database hr_db

    Discussion 0
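The grant pattern under discussion translates directly to SQL; the role name below is hypothetical. The final FUTURE grant is an optional extension beyond the question's scope, covering tables created later.

```sql
grant usage  on database hr_db                to role hr_readonly;
grant usage  on all schemas in database hr_db to role hr_readonly;
grant select on all tables  in database hr_db to role hr_readonly;

-- Optional hardening (not part of the question): also cover future tables.
grant select on future tables in database hr_db to role hr_readonly;
```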
    Questions 36

    An Architect clones a database and all of its objects, including tasks. After the cloning, the tasks stop running.

    Why is this occurring?

    Options:

    A.  

    Tasks cannot be cloned.

    B.  

    The objects that the tasks reference are not fully qualified.

    C.  

    Cloned tasks are suspended by default and must be manually resumed.

    D.  

    The Architect has insufficient privileges to alter tasks on the cloned database.

    Discussion 0
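Because cloned tasks start in a suspended state, they must be resumed explicitly. The sketch below assumes a cloned database named DEV_DB with a hypothetical task graph.

```sql
-- Resume a single cloned task:
alter task dev_db.public.load_orders_task resume;

-- Or resume an entire task graph from its root in one call:
select system$task_dependents_enable('dev_db.public.root_task');
```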
    Questions 37

    An Architect is implementing a CI/CD process. When attempting to clone a table from a production to a development environment, the cloning operation fails.

    What could be causing this to happen?

    Options:

    A.  

    The table is transient.

    B.  

    The table has a masking policy.

    C.  

    The retention time for the table is set to zero.

    D.  

    Tables cannot be cloned from a higher environment to a lower environment.

    Discussion 0
    Questions 38

    Which organization-related tasks can be performed by the ORGADMIN role? (Choose three.)

    Options:

    A.  

    Changing the name of the organization

    B.  

    Creating an account

    C.  

    Viewing a list of organization accounts

    D.  

    Changing the name of an account

    E.  

    Deleting an account

    F.  

    Enabling the replication of a database
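
    As background, typical ORGADMIN operations look like the following sketch (account names, credentials, and the organization name are hypothetical placeholders):

    ```sql
    USE ROLE ORGADMIN;

    -- View all accounts in the organization.
    SHOW ORGANIZATION ACCOUNTS;

    -- Create a new account.
    CREATE ACCOUNT sales_account
      ADMIN_NAME = admin_user
      ADMIN_PASSWORD = 'ChangeMe123!'   -- placeholder only
      EMAIL = 'admin@example.com'
      EDITION = ENTERPRISE;

    -- Rename an existing account.
    ALTER ACCOUNT sales_account RENAME TO sales_account_emea;

    -- Enable database replication for an account.
    SELECT SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER(
      'MYORG.SALES_ACCOUNT', 'ENABLE_ACCOUNT_DATABASE_REPLICATION', 'true');
    ```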

    Discussion 0
    Questions 39

    The Data Engineering team at a large manufacturing company needs to engineer data coming from many sources to support a wide variety of use cases and data consumer requirements which include:

    1) Finance and Vendor Management team members who require reporting and visualization

    2) Data Science team members who require access to raw data for ML model development

    3) Sales team members who require engineered and protected data for data monetization

    What Snowflake data modeling approaches will meet these requirements? (Choose two.)

    Options:

    A.  

    Consolidate data in the company’s data lake and use EXTERNAL TABLES.

    B.  

    Create a raw database for landing and persisting raw data entering the data pipelines.

    C.  

    Create a set of profile-specific databases that align data with usage patterns.

    D.  

    Create a single star schema in a single database to support all consumers’ requirements.

    E.  

    Create a Data Vault as the sole data pipeline endpoint and have all consumers directly access the Vault.

    Discussion 0
    Questions 40

    A DevOps team has a requirement for recovery of staging tables used in a complex set of data pipelines. The staging tables are all located in the same staging schema. One of the requirements is to have online recovery of data on a rolling 7-day basis.

    After setting up the DATA_RETENTION_TIME_IN_DAYS at the database level, certain tables remain unrecoverable past 1 day.

    What would cause this to occur? (Choose two.)

    Options:

    A.  

    The staging schema has not been set up for MANAGED ACCESS.

    B.  

    The DATA_RETENTION_TIME_IN_DAYS for the staging schema has been set to 1 day.

    C.  

    The tables exceed the 1 TB limit for data recovery.

    D.  

    The staging tables are of the TRANSIENT type.

    E.  

    The DevOps role should be granted ALLOW_RECOVERY privilege on the staging schema.
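
    As background, a sketch of how the effective retention and table type can be inspected (database and schema names are hypothetical):

    ```sql
    -- The 'kind' column distinguishes TRANSIENT tables, which are capped
    -- at a maximum Time Travel retention of 1 day regardless of the
    -- database-level setting.
    SHOW TABLES IN SCHEMA etl_db.staging;

    -- Retention can also be overridden at the schema level; check it here.
    SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS' IN SCHEMA etl_db.staging;
    ```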

    Discussion 0
    Questions 41

    A company needs to share its product catalog data with one of its partners. The product catalog data is stored in two database tables: product_category and product_details. Both tables can be joined by the product_id column. Data access should be governed, and only the partner should have access to the records.

    The partner is not a Snowflake customer. The partner uses Amazon S3 for cloud storage.

    Which design will be the MOST cost-effective and secure, while using the required Snowflake features?

    Options:

    A.  

    Use Secure Data Sharing with an S3 bucket as a destination.

    B.  

    Publish product_category and product_details data sets on the Snowflake Marketplace.

    C.  

    Create a database user for the partner and give them access to the required data sets.

    D.  

    Create a reader account for the partner and share the data sets as secure views.
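
    For context, a reader account combined with secure views is sketched below (all object names and credentials are hypothetical placeholders):

    ```sql
    USE ROLE ACCOUNTADMIN;

    -- Provision a reader account managed (and paid for) by the provider.
    CREATE MANAGED ACCOUNT partner_reader
      ADMIN_NAME = partner_admin,
      ADMIN_PASSWORD = 'ChangeMe123!',   -- placeholder only
      TYPE = READER;

    -- Share a secure view joining the two catalog tables.
    CREATE SHARE product_share;
    GRANT USAGE ON DATABASE catalog_db TO SHARE product_share;
    GRANT USAGE ON SCHEMA catalog_db.public TO SHARE product_share;
    GRANT SELECT ON VIEW catalog_db.public.v_product_catalog TO SHARE product_share;

    -- Add the reader account (locator returned by CREATE MANAGED ACCOUNT).
    ALTER SHARE product_share ADD ACCOUNTS = myorg.partner_reader;
    ```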

    Discussion 0
    Questions 42

    Which feature provides the capability to define an alternate cluster key for a table with an existing cluster key?

    Options:

    A.  

    External table

    B.  

    Materialized view

    C.  

    Search optimization

    D.  

    Result cache
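
    As background, a materialized view can carry its own clustering key, distinct from the base table's. A sketch with hypothetical table and column names:

    ```sql
    -- Base table clustered for date-range queries.
    CREATE TABLE sales (
      sale_date   DATE,
      customer_id NUMBER,
      amount      NUMBER(12,2)
    ) CLUSTER BY (sale_date);

    -- Materialized view over the same data, clustered for customer lookups.
    CREATE MATERIALIZED VIEW sales_by_customer
      CLUSTER BY (customer_id)
      AS SELECT sale_date, customer_id, amount FROM sales;
    ```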

    Discussion 0
    Questions 43

    What Snowflake system functions are used to view and/or monitor the clustering metadata for a table? (Select TWO).

    Options:

    A.  

    SYSTEM$CLUSTERING

    B.  

    SYSTEM$TABLE_CLUSTERING

    C.  

    SYSTEM$CLUSTERING_DEPTH

    D.  

    SYSTEM$CLUSTERING_RATIO

    E.  

    SYSTEM$CLUSTERING_INFORMATION
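
    For context, the clustering metadata functions are called like this (table and column names are hypothetical):

    ```sql
    -- Average overlap depth of micro-partitions for the given columns.
    SELECT SYSTEM$CLUSTERING_DEPTH('sales_db.public.sales', '(sale_date)');

    -- Detailed clustering statistics returned as a JSON document.
    SELECT SYSTEM$CLUSTERING_INFORMATION('sales_db.public.sales', '(sale_date)');
    ```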

    Discussion 0
    Questions 44

    The following table exists in the production database:

    A regulatory requirement states that the company must mask the username for events that are older than six months based on the current date when the data is queried.

    How can the requirement be met without duplicating the event data and making sure it is applied when creating views using the table or cloning the table?

    Options:

    A.  

    Use a masking policy on the username column using an entitlement table with valid dates.

    B.  

    Use a row level policy on the user_events table using an entitlement table with valid dates.

    C.  

    Use a masking policy on the username column with event_timestamp as a conditional column.

    D.  

    Use a secure view on the user_events table using a case statement on the username column.
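
    As background, a conditional masking policy evaluates additional columns to decide whether to mask a value. A sketch using hypothetical table and policy names:

    ```sql
    -- Mask usernames on rows older than six months, evaluated at query time.
    CREATE MASKING POLICY mask_old_usernames AS
      (username STRING, event_timestamp TIMESTAMP) RETURNS STRING ->
      CASE
        WHEN event_timestamp < DATEADD(MONTH, -6, CURRENT_TIMESTAMP()) THEN '*****'
        ELSE username
      END;

    -- Attach the policy; event_timestamp is passed as the conditional column.
    ALTER TABLE user_events MODIFY COLUMN username
      SET MASKING POLICY mask_old_usernames USING (username, event_timestamp);
    ```

    Because the policy is bound to the column itself, it continues to apply to views built on the table and to clones of it, with no data duplicated.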

    Discussion 0
    Questions 45

    A data share exists between a data provider account and a data consumer account. Five tables from the provider account are being shared with the consumer account. The consumer role has been granted the IMPORTED PRIVILEGES privilege.

    What will happen to the consumer account if a new table (table_6) is added to the provider schema?

    Options:

    A.  

    The consumer role will automatically see the new table and no additional grants are needed.

    B.  

    The consumer role will see the table only after this grant is given on the consumer side:

    GRANT IMPORTED PRIVILEGES ON DATABASE PSHARE_EDW_4TEST_DB TO ROLE DEV_ROLE;

    C.  

    The consumer role will see the table only after this grant is given on the provider side:

    USE ROLE ACCOUNTADMIN;

    GRANT SELECT ON TABLE EDW.ACCOUNTING.Table_6 TO SHARE PSHARE_EDW_4TEST;

    D.  

    The consumer role will see the table only after this grant is given on the provider side:

    USE ROLE ACCOUNTADMIN;

    GRANT USAGE ON DATABASE EDW TO SHARE PSHARE_EDW_4TEST;

    GRANT USAGE ON SCHEMA EDW.ACCOUNTING TO SHARE PSHARE_EDW_4TEST;

    GRANT SELECT ON TABLE EDW.ACCOUNTING.Table_6 TO DATABASE PSHARE_EDW_4TEST_DB;

    Discussion 0
    Questions 46

    A company has built a data pipeline using Snowpipe to ingest files from an Amazon S3 bucket. Snowpipe is configured to load data into staging database tables. Then a task runs to load the data from the staging database tables into the reporting database tables.

    The company is satisfied with the availability of the data in the reporting database tables, but the reporting tables are not pruning effectively. Currently, a size 4X-Large virtual warehouse is being used to query all of the tables in the reporting database.

    What step can be taken to improve the pruning of the reporting tables?

    Options:

    A.  

    Eliminate the use of Snowpipe and load the files into internal stages using PUT commands.

    B.  

    Increase the size of the virtual warehouse to a size 5X-Large.

    C.  

    Use an ORDER BY command to load the reporting tables.

    D.  

    Create larger files for Snowpipe to ingest and ensure the staging frequency does not exceed 1 minute.
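
    For context, sorting rows on the common filter column during the load helps micro-partition pruning. A hypothetical sketch of the task's load statement:

    ```sql
    -- Rewriting the reporting table with sorted data co-locates values
    -- per micro-partition, so range filters can prune whole partitions.
    INSERT OVERWRITE INTO reporting_db.public.sales_report
    SELECT *
    FROM staging_db.public.sales_stage
    ORDER BY sale_date;
    ```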

    Discussion 0
    Questions 47

    An Architect is using SnowCD to investigate a connectivity issue.

    Which system function will provide a list of endpoints that the network must be able to access to use a specific Snowflake account, leveraging private connectivity?

    Options:

    A.  

    SYSTEM$ALLOWLIST()

    B.  

    SYSTEM$GET_PRIVATELINK

    C.  

    SYSTEM$AUTHORIZE_PRIVATELINK

    D.  

    SYSTEM$ALLOWLIST_PRIVATELINK()
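
    As background, the private-connectivity allowlist function is invoked as a plain SELECT, and its output can be fed to SnowCD:

    ```sql
    -- Returns a JSON array of hostnames and ports the network must be able
    -- to reach when using private connectivity (e.g., AWS PrivateLink).
    SELECT SYSTEM$ALLOWLIST_PRIVATELINK();
    ```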

    Discussion 0
    Questions 48

    A large manufacturing company runs a dozen individual Snowflake accounts across its business divisions. The company wants to increase the level of data sharing to support supply chain optimizations and increase its purchasing leverage with multiple vendors.

    The company’s Snowflake Architects need to design a solution that would allow the business divisions to decide what to share, while minimizing the level of effort spent on configuration and management. Most of the company divisions use Snowflake accounts in the same cloud deployments with a few exceptions for European-based divisions.

    According to Snowflake recommended best practice, how should these requirements be met?

    Options:

    A.  

    Migrate the European accounts in the global region and manage shares in a connected graph architecture. Deploy a Data Exchange.

    B.  

    Deploy a Private Data Exchange in combination with data shares for the European accounts.

    C.  

    Deploy to the Snowflake Marketplace making sure that invoker_share() is used in all secure views.

    D.  

    Deploy a Private Data Exchange and use replication to allow European data shares in the Exchange.

    Discussion 0