SnowPro Advanced: Architect Certification Exam (ARA-C01) Questions and Answers

Last Update Oct 16, 2025
Total Questions: 162

Questions 1

Database DB1 has schema S1 which has one table, T1.

DB1 --> S1 --> T1

The retention period of DB1 is set to 10 days.

The retention period of S1 is set to 20 days.

The retention period of T1 is set to 30 days.

The user runs the following command:

DROP DATABASE DB1;

What will the Time Travel retention period be for T1?

Options:

A.  

10 days

B.  

20 days

C.  

30 days

D.  

37 days
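
For reference, the retention hierarchy this question describes would typically be configured as follows. This is a minimal sketch using the object names from the question; DATA_RETENTION_TIME_IN_DAYS is the parameter that controls Time Travel retention.

ALTER DATABASE DB1 SET DATA_RETENTION_TIME_IN_DAYS = 10;
ALTER SCHEMA DB1.S1 SET DATA_RETENTION_TIME_IN_DAYS = 20;
ALTER TABLE DB1.S1.T1 SET DATA_RETENTION_TIME_IN_DAYS = 30;
DROP DATABASE DB1;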

Questions 2

Based on the architecture in the image, how can the data from DB1 be copied into TBL2? (Select TWO).

(The answer choices A through E are presented as images in the original question and are not reproduced here.)

Options:

A.  

Option A

B.  

Option B

C.  

Option C

D.  

Option D

E.  

Option E

Questions 3

An Architect is designing a solution that will be used to process changed records in an orders table. Newly inserted orders must be loaded into the F_ORDERS fact table, which will aggregate all the orders by multiple dimensions (time, region, channel, etc.). Existing orders can be updated by the sales department within 30 days after the order creation. In case of an order update, the solution must perform two actions:

1. Update the order in the F_ORDERS fact table.

2. Load the changed order data into the special table ORDER_REPAIRS.

This table is used by the Accounting department once a month. If an order has been changed, the Accounting team needs to know the latest details and perform the necessary actions based on the data in the ORDER_REPAIRS table.

What data processing logic design will be the MOST performant?

Options:

A.  

Use one stream and one task.

B.  

Use one stream and two tasks.

C.  

Use two streams and one task.

D.  

Use two streams and two tasks.
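
As a hedged illustration of the stream-and-task pattern this question tests, the skeleton below wires one stream into one task. The warehouse, column, and object names are illustrative assumptions, not part of the question.

CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

CREATE OR REPLACE TASK process_orders
  WAREHOUSE = transform_wh          -- illustrative warehouse name
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO f_orders f
  USING orders_stream s ON f.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET f.order_amount = s.order_amount
  WHEN NOT MATCHED THEN INSERT (order_id, order_amount)
    VALUES (s.order_id, s.order_amount);

Note that a task body runs a single SQL statement (or a stored procedure), which is why the one-task versus two-task distinction matters here.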

Questions 4

A media company needs a data pipeline that will ingest customer review data into a Snowflake table, and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set available publicly for advertising companies who use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage, leveraging event notifications. Also, the operational complexity, infrastructure maintenance (including platform upgrades and security), and development effort should be minimal.

Which design will meet these requirements?

Options:

A.  

Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

B.  

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

C.  

Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

D.  

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
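
For orientation, the Snowpipe and external function building blocks named in these options look roughly like the sketch below. The stage, integration, and endpoint names are hypothetical.

CREATE PIPE reviews_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_reviews
  FROM @reviews_stage
  FILE_FORMAT = (TYPE = 'JSON');

CREATE EXTERNAL FUNCTION get_sentiment(review_text STRING)
  RETURNS VARIANT
  API_INTEGRATION = comprehend_api_int   -- hypothetical API integration
  AS 'https://example.execute-api.us-east-1.amazonaws.com/prod/sentiment';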

Questions 5

A data platform team creates two multi-cluster virtual warehouses with the AUTO_SUSPEND value set to NULL on one, and '0' on the other. What would be the execution behavior of these virtual warehouses?

Options:

A.  

Setting a '0' or NULL value means the warehouses will never suspend.

B.  

Setting a '0' or NULL value means the warehouses will suspend immediately.

C.  

Setting a '0' or NULL value means the warehouses will suspend after the default of 600 seconds.

D.  

Setting a '0' value means the warehouses will suspend immediately, and NULL means the warehouses will never suspend.
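
The two warehouse configurations being compared would be created along these lines (a minimal sketch; warehouse names and sizes are illustrative):

CREATE WAREHOUSE wh_null_suspend
  WAREHOUSE_SIZE = 'XSMALL'
  MAX_CLUSTER_COUNT = 3
  AUTO_SUSPEND = NULL;

CREATE WAREHOUSE wh_zero_suspend
  WAREHOUSE_SIZE = 'XSMALL'
  MAX_CLUSTER_COUNT = 3
  AUTO_SUSPEND = 0;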

Questions 6

A company needs to share its product catalog data with one of its partners. The product catalog data is stored in two database tables: product_category and product_details. Both tables can be joined by the product_id column. Data access should be governed, and only the partner should have access to the records.

The partner is not a Snowflake customer. The partner uses Amazon S3 for cloud storage.

Which design will be the MOST cost-effective and secure, while using the required Snowflake features?

Options:

A.  

Use Secure Data Sharing with an S3 bucket as a destination.

B.  

Publish product_category and product_details data sets on the Snowflake Marketplace.

C.  

Create a database user for the partner and give them access to the required data sets.

D.  

Create a reader account for the partner and share the data sets as secure views.
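
To make the option wording concrete, a reader-account share is assembled roughly as follows. This is a hedged sketch; the account, column, and object names are invented for illustration.

CREATE MANAGED ACCOUNT partner_reader
  ADMIN_NAME = partner_admin,
  ADMIN_PASSWORD = 'Str0ngPassw0rd!',   -- placeholder credential
  TYPE = READER;

CREATE SECURE VIEW product_db.public.catalog_v AS
  SELECT c.product_id, c.category_name, d.product_name
  FROM product_db.public.product_category c
  JOIN product_db.public.product_details d ON c.product_id = d.product_id;

CREATE SHARE catalog_share;
GRANT USAGE ON DATABASE product_db TO SHARE catalog_share;
GRANT USAGE ON SCHEMA product_db.public TO SHARE catalog_share;
GRANT SELECT ON VIEW product_db.public.catalog_v TO SHARE catalog_share;
ALTER SHARE catalog_share ADD ACCOUNTS = myorg.partner_reader;   -- reader account identifier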

Questions 7

What Snowflake features should be leveraged when modeling using Data Vault?

Options:

A.  

Snowflake’s support of multi-table inserts into the data model’s Data Vault tables

B.  

Data needs to be pre-partitioned to obtain a superior data access performance

C.  

Scaling up the virtual warehouses will support parallel processing of new source loads

D.  

Snowflake’s ability to hash keys so that hash key joins can run faster than integer joins

Questions 8

How can the Snowpipe REST API be used to keep a log of data load history?

Options:

A.  

Call insertReport every 20 minutes, fetching the last 10,000 entries.

B.  

Call loadHistoryScan every minute for the maximum time range.

C.  

Call insertReport every 8 minutes for a 10-minute time range.

D.  

Call loadHistoryScan every 10 minutes for a 15-minute range.

Questions 9

The following chart represents the performance of a virtual warehouse over time:

A Data Engineer notices that the warehouse is queueing queries. The warehouse is size X-Small, the minimum and maximum cluster counts are set to 1, the scaling policy is set to Standard, and auto-suspend is set to 10 minutes.

How can the performance be improved?

Options:

A.  

Change the cluster settings.

B.  

Increase the size of the warehouse.

C.  

Change the scaling policy to economy.

D.  

Change auto-suspend to a longer time frame.

Questions 10

A global company needs to securely share its sales and inventory data with a vendor using a Snowflake account.

The company has its Snowflake account in the AWS eu-west-2 Europe (London) region. The vendor's Snowflake account is on the Azure platform in the West Europe region. How should the company's Architect configure the data share?

Options:

A.  

1. Create a share. 2. Add objects to the share. 3. Add a consumer account to the share for the vendor to access.

B.  

1. Create a share. 2. Create a reader account for the vendor to use. 3. Add the reader account to the share.

C.  

1. Create a new role called db_share. 2. Grant the db_share role privileges to read data from the company database and schema. 3. Create a user for the vendor. 4. Grant the db_share role to the vendor's users.

D.  

1. Promote an existing database in the company's local account to primary. 2. Replicate the database to Snowflake on Azure in the West Europe region. 3. Create a share and add objects to the share. 4. Add a consumer account to the share for the vendor to access.
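
The replication-based flow described in option D maps to statements like the following. This is a hedged sketch; the organization and account names are placeholders.

-- In the source (AWS London) account: promote the database to primary
ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.azure_westeurope_acct;

-- In the company's own Azure West Europe account: create and refresh the secondary
CREATE DATABASE sales_db AS REPLICA OF myorg.aws_london_acct.sales_db;
ALTER DATABASE sales_db REFRESH;

-- Still in the Azure account: share the data with the vendor
CREATE SHARE vendor_share;
GRANT USAGE ON DATABASE sales_db TO SHARE vendor_share;
ALTER SHARE vendor_share ADD ACCOUNTS = vendor_acct;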

Questions 11

What is a characteristic of Role-Based Access Control (RBAC) as used in Snowflake?

Options:

A.  

Privileges can be granted at the database level and can be inherited by all underlying objects.

B.  

A user can use "super-user" access along with SECURITYADMIN to bypass authorization checks and access all databases, schemas, and underlying objects.

C.  

A user can create managed access schemas to support future grants and ensure only schema owners can grant privileges to other roles.

D.  

A user can create managed access schemas to support current and future grants and ensure only object owners can grant privileges to other roles.
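
For context, a managed access schema, the feature referenced in the last two options, is created like this (names are illustrative):

CREATE SCHEMA finance_db.restricted WITH MANAGED ACCESS;
-- In a managed access schema, object owners cannot grant privileges on their
-- objects; only the schema owner or a role with MANAGE GRANTS can.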

Questions 12

A company is designing high availability and disaster recovery plans and needs to maximize redundancy and minimize recovery time objectives for their critical application processes. Cost is not a concern as long as the solution is the best available. The plan so far consists of the following steps:

1. Deployment of Snowflake accounts on two different cloud providers.

2. Selection of cloud provider regions that are geographically far apart.

3. The Snowflake deployment will replicate the databases and account data between both cloud provider accounts.

4. Implementation of Snowflake client redirect.

What is the MOST cost-effective way to provide the HIGHEST uptime and LEAST application disruption if there is a service event?

Options:

A.  

Connect the applications using the - URL. Use the Business Critical Snowflake edition.

B.  

Connect the applications using the - URL. Use the Virtual Private Snowflake (VPS) edition.

C.  

Connect the applications using the - URL. Use the Enterprise Snowflake edition.

D.  

Connect the applications using the - URL. Use the Business Critical Snowflake edition.

Questions 13

A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.

Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)

Options:

A.  

Use, at minimum, the Business Critical edition of Snowflake.

B.  

Create Dynamic Data Masking policies and apply them to columns that contain PHI.

C.  

Use the Internal Tokenization feature to obfuscate sensitive data.

D.  

Use the External Tokenization feature to obfuscate sensitive data.

E.  

Rewrite SQL queries to eliminate projections of PHI data based on current_role().

F.  

Avoid sharing data with partner organizations.

Questions 14

An Architect needs to improve the performance of reports that pull data from multiple Snowflake tables, join, and then aggregate the data. Users access the reports using several dashboards. There are performance issues on Monday mornings between 9:00am and 11:00am, when many users check the sales reports.

The size of the group has increased from 4 to 8 users. Waiting times to refresh the dashboards have increased significantly. Currently this workload is being served by a virtual warehouse with the following parameters:

AUTO_RESUME = TRUE, AUTO_SUSPEND = 60, SIZE = Medium

What is the MOST cost-effective way to increase the availability of the reports?

Options:

A.  

Use materialized views and pre-calculate the data.

B.  

Increase the warehouse to size Large and set auto_suspend = 600.

C.  

Use a multi-cluster warehouse in maximized mode with 2 size Medium clusters.

D.  

Use a multi-cluster warehouse in auto-scale mode with 1 size Medium cluster, and set min_cluster_count = 1 and max_cluster_count = 4.
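
The settings being compared in these options would be applied roughly as follows (a sketch; the warehouse name is assumed):

ALTER WAREHOUSE reports_wh SET
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;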

Questions 15

What are characteristics of the use of transactions in Snowflake? (Select TWO).

Options:

A.  

Explicit transactions can contain DDL, DML, and query statements.

B.  

The autocommit setting can be changed inside a stored procedure.

C.  

A transaction can be started explicitly by executing a BEGIN WORK statement and ended explicitly by executing a COMMIT WORK statement.

D.  

A transaction can be started explicitly by executing a BEGIN TRANSACTION statement and ended explicitly by executing an END TRANSACTION statement.

E.  

Explicit transactions should contain only DML statements and query statements. All DDL statements implicitly commit active transactions.
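
A minimal sketch of an explicit transaction in Snowflake, for comparison with the options above (the table is illustrative):

BEGIN TRANSACTION;   -- BEGIN WORK is an accepted synonym
INSERT INTO audit_log (event) VALUES ('step 1');
INSERT INTO audit_log (event) VALUES ('step 2');
COMMIT;              -- COMMIT WORK is likewise accepted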

Questions 16

Which query will identify the specific days and virtual warehouses that would benefit from a multi-cluster warehouse to improve the performance of a particular workload?

(The candidate queries A through D are presented as images in the original question and are not reproduced here.)

Options:

A.  

Option A

B.  

Option B

C.  

Option C

D.  

Option D

Questions 17

An Architect runs the following SQL query:

How can this query be interpreted?

Options:

A.  

FILEROWS is a stage. FILE_ROW_NUMBER is the line number in the file.

B.  

FILEROWS is the table. FILE_ROW_NUMBER is the line number in the table.

C.  

FILEROWS is a file. FILE_ROW_NUMBER is the file format location.

D.  

FILEROWS is the file format location. FILE_ROW_NUMBER is a stage.

Questions 18

An Architect has a VPN_ACCESS_LOGS table in the SECURITY_LOGS schema containing timestamps of the connection and disconnection, username of the user, and summary statistics.

What should the Architect do to enable the Snowflake search optimization service on this table?

Options:

A.  

Assume role with OWNERSHIP on future tables and ADD SEARCH OPTIMIZATION on the SECURITY_LOGS schema.

B.  

Assume role with ALL PRIVILEGES including ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

C.  

Assume role with OWNERSHIP on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

D.  

Assume role with ALL PRIVILEGES on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.
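
The statement at the center of this question has the following shape; the options differ only in which privileges the executing role needs.

ALTER TABLE security_logs.vpn_access_logs ADD SEARCH OPTIMIZATION;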

Questions 19

A company’s daily Snowflake workload consists of a huge number of concurrent queries triggered between 9pm and 11pm. At the individual level, these queries are smaller statements that get completed within a short time period.

What configuration can the company’s Architect implement to enhance the performance of this workload? (Choose two.)

Options:

A.  

Enable a multi-clustered virtual warehouse in maximized mode during the workload duration.

B.  

Set the MAX_CONCURRENCY_LEVEL to a higher value than its default value of 8 at the virtual warehouse level.

C.  

Increase the size of the virtual warehouse to size X-Large.

D.  

Reduce the amount of data that is being processed through this workload.

E.  

Set the connection timeout to a higher value than its default.
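
The two configuration levers named in these options would be applied roughly as follows (a sketch; the warehouse name and values are assumptions):

-- Multi-cluster in maximized mode: MIN_CLUSTER_COUNT equals MAX_CLUSTER_COUNT
ALTER WAREHOUSE night_batch_wh SET
  MIN_CLUSTER_COUNT = 4
  MAX_CLUSTER_COUNT = 4;

-- Raise the concurrency target above its default of 8
ALTER WAREHOUSE night_batch_wh SET MAX_CONCURRENCY_LEVEL = 16;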

Questions 20

What Snowflake system functions are used to view and/or monitor the clustering metadata for a table? (Select TWO).

Options:

A.  

SYSTEM$CLUSTERING

B.  

SYSTEM$TABLE_CLUSTERING

C.  

SYSTEM$CLUSTERING_DEPTH

D.  

SYSTEM$CLUSTERING_RATIO

E.  

SYSTEM$CLUSTERING_INFORMATION
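
As a usage sketch, the documented clustering functions take a table name and an optional column list (names here are illustrative):

SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(order_date)');
SELECT SYSTEM$CLUSTERING_DEPTH('sales', '(order_date)');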

Questions 21

A retail company has 2000+ stores spread across the country. Store Managers report that they are having trouble running key reports related to inventory management, sales targets, payroll, and staffing during business hours. The Managers report that performance is poor and time-outs occur frequently.

Currently all reports share the same Snowflake virtual warehouse.

How should this situation be addressed? (Select TWO).

Options:

A.  

Use a Business Intelligence tool for in-memory computation to improve performance.

B.  

Configure a dedicated virtual warehouse for the Store Manager team.

C.  

Configure the virtual warehouse to be multi-clustered.

D.  

Configure the virtual warehouse to size 4-XL.

E.  

Advise the Store Manager team to defer report execution to off-business hours.

Questions 22

An Architect would like to save quarter-end financial results for the previous six years.

Which Snowflake feature can the Architect use to accomplish this?

Options:

A.  

Search optimization service

B.  

Materialized view

C.  

Time Travel

D.  

Zero-copy cloning

E.  

Secure views
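
For reference, a point-in-time snapshot of a table can be captured by combining cloning with Time Travel, along these lines (names and timestamp are illustrative):

CREATE TABLE financials_2023_q4 CLONE financials
  AT (TIMESTAMP => '2023-12-31 23:59:59'::TIMESTAMP_LTZ);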

Questions 23

Which security, governance, and data protection features require, at a MINIMUM, the Business Critical edition of Snowflake? (Choose two.)

Options:

A.  

Extended Time Travel (up to 90 days)

B.  

Customer-managed encryption keys through Tri-Secret Secure

C.  

Periodic rekeying of encrypted data

D.  

AWS, Azure, or Google Cloud private connectivity to Snowflake

E.  

Federated authentication and SSO

Questions 24

When using the COPY INTO <table> command with the CSV file format, how does the MATCH_BY_COLUMN_NAME parameter behave?

Options:

A.  

It expects a header to be present in the CSV file, which is matched to a case-sensitive table column name.

B.  

The parameter will be ignored.

C.  

The command will return an error.

D.  

The command will return a warning stating that the file has unmatched columns.
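
The parameter in question is specified as shown below. This is a syntax sketch only; the table and stage names are assumptions, and in current releases CSV support for this parameter relies on PARSE_HEADER = TRUE in the file format.

COPY INTO customer_reviews
FROM @landing_stage
FILE_FORMAT = (TYPE = 'CSV' PARSE_HEADER = TRUE)
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;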

Questions 25

How can the Snowflake context functions be used to help determine whether a user is authorized to see data that has column-level security enforced? (Select TWO).

Options:

A.  

Set masking policy conditions using current_role targeting the role in use for the current session.

B.  

Set masking policy conditions using is_role_in_session targeting the role in use for the current account.

C.  

Set masking policy conditions using invoker_role targeting the executing role in a SQL statement.

D.  

Determine if there are ownership privileges on the masking policy that would allow the use of any function.

E.  

Assign the accountadmin role to the user who is executing the object.
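
A minimal masking policy sketch showing the context-function pattern these options describe (policy, role, table, and column names are illustrative):

CREATE MASKING POLICY phi_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('CLINICAL_ANALYST') THEN val
    WHEN INVOKER_ROLE() IN ('REPORTING_ROLE') THEN val
    ELSE '***MASKED***'
  END;

ALTER TABLE patients MODIFY COLUMN diagnosis SET MASKING POLICY phi_mask;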

Questions 26

An Architect needs to design a Snowflake account and database strategy to store and analyze large amounts of structured and semi-structured data. There are many business units and departments within the company. The requirements are scalability, security, and cost efficiency.

What design should be used?

Options:

A.  

Create a single Snowflake account and database for all data storage and analysis needs, regardless of data volume or complexity.

B.  

Set up separate Snowflake accounts and databases for each department or business unit, to ensure data isolation and security.

C.  

Use Snowflake's data lake functionality to store and analyze all data in a central location, without the need for structured schemas or indexes.

D.  

Use a centralized Snowflake database for core business data, and use separate databases for departmental or project-specific data.

Questions 27

Which columns can be included in an external table schema? (Select THREE).

Options:

A.  

VALUE

B.  

METADATA$ROW_ID

C.  

METADATA$ISUPDATE

D.  

METADATA$FILENAME

E.  

METADATA$FILE_ROW_NUMBER

F.  

METADATA$EXTERNAL_TABLE_PARTITION
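
For context, an external table exposes a VALUE column plus metadata pseudocolumns, which can be queried like this (stage and table names are assumptions):

CREATE EXTERNAL TABLE events_ext
  LOCATION = @events_stage
  FILE_FORMAT = (TYPE = 'JSON');

SELECT value, metadata$filename, metadata$file_row_number
FROM events_ext;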

Questions 28

A company is storing large numbers of small JSON files (ranging from 1-4 bytes) that are received from IoT devices and sent to a cloud provider. In any given hour, 100,000 files are added to the cloud provider.

What is the MOST cost-effective way to bring this data into a Snowflake table?

Options:

A.  

An external table

B.  

A pipe

C.  

A stream

D.  

A copy command at regular intervals

Questions 29

Which of the following commands will use warehouse credits?

Options:

A.  

SHOW TABLES LIKE 'SNOWFL%';

B.  

SELECT MAX(FLAKE_ID) FROM SNOWFLAKE;

C.  

SELECT COUNT(*) FROM SNOWFLAKE;

D.  

SELECT COUNT(FLAKE_ID) FROM SNOWFLAKE GROUP BY FLAKE_ID;

Questions 30

A user is executing the following commands sequentially within a timeframe of 10 minutes from start to finish:

What would be the output of this query?

Options:

A.  

Table T_SALES_CLONE successfully created.

B.  

Time Travel data is not available for table T_SALES.

C.  

The offset => is not a valid clause in the clone operation.

D.  

Syntax error line 1 at position 58 unexpected 'at'.
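
The command screenshot is not reproduced above; the statement the options describe, a clone using a Time Travel offset, has this general shape (a sketch; -600 seconds matches the 10-minute window mentioned in the question):

CREATE TABLE t_sales_clone CLONE t_sales
  AT (OFFSET => -600);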

Questions 31

Which steps are recommended best practices for prioritizing cluster keys in Snowflake? (Choose two.)

Options:

A.  

Choose columns that are frequently used in join predicates.

B.  

Choose lower cardinality columns to support clustering keys and cost effectiveness.

C.  

Choose TIMESTAMP columns with nanoseconds for the highest number of unique rows.

D.  

Choose cluster columns that are most actively used in selective filters.

E.  

Choose cluster columns that are actively used in the GROUP BY clauses.

Questions 32

An Architect entered the following commands in sequence:

USER1 cannot find the table.

Which of the following commands does the Architect need to run for USER1 to find the tables using the Principle of Least Privilege? (Choose two.)

Options:

A.  

GRANT ROLE PUBLIC TO ROLE INTERN;

B.  

GRANT USAGE ON DATABASE SANDBOX TO ROLE INTERN;

C.  

GRANT USAGE ON SCHEMA SANDBOX.PUBLIC TO ROLE INTERN;

D.  

GRANT OWNERSHIP ON DATABASE SANDBOX TO USER INTERN;

E.  

GRANT ALL PRIVILEGES ON DATABASE SANDBOX TO ROLE INTERN;
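
For orientation, a least-privilege chain that lets a role query a table generally needs usage on the database and schema plus a privilege on the object itself (a sketch; the table name is illustrative):

GRANT USAGE ON DATABASE sandbox TO ROLE intern;
GRANT USAGE ON SCHEMA sandbox.public TO ROLE intern;
GRANT SELECT ON TABLE sandbox.public.my_table TO ROLE intern;   -- illustrative table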

Questions 33

An Architect has designed a data pipeline that is receiving small CSV files from multiple sources. All of the files are landing in one location. Specific files are filtered for loading into Snowflake tables using the COPY command. The loading performance is poor.

What changes can be made to improve the data loading performance?

Options:

A.  

Increase the size of the virtual warehouse.

B.  

Create a multi-cluster warehouse and merge smaller files to create bigger files.

C.  

Create a specific storage landing bucket to avoid file scanning.

D.  

Change the file format from CSV to JSON.

Questions 34

A user named USER_01 needs access to create a materialized view on the schema EDW.STG_SCHEMA. How can this access be provided?

Options:

A.  

GRANT CREATE MATERIALIZED VIEW ON SCHEMA EDW.STG_SCHEMA TO USER USER_01;

B.  

GRANT CREATE MATERIALIZED VIEW ON DATABASE EDW TO USER USER_01;

C.  

GRANT ROLE NEW_ROLE TO USER USER_01; GRANT CREATE MATERIALIZED VIEW ON SCHEMA EDW.STG_SCHEMA TO NEW_ROLE;

D.  

GRANT ROLE NEW_ROLE TO USER_01; GRANT CREATE MATERIALIZED VIEW ON EDW.STG_SCHEMA TO NEW_ROLE;

Questions 35

When using the COPY INTO <table> command with the CSV file format, how does the MATCH_BY_COLUMN_NAME parameter behave?

Options:

A.  

It expects a header to be present in the CSV file, which is matched to a case-sensitive table column name.

B.  

The parameter will be ignored.

C.  

The command will return an error.

D.  

The command will return a warning stating that the file has unmatched columns.

Questions 36

A retailer's enterprise data organization is exploring the use of Data Vault 2.0 to model its data lake solution. A Snowflake Architect has been asked to provide recommendations for using Data Vault 2.0 on Snowflake.

What should the Architect tell the data organization? (Select TWO).

Options:

A.  

Change data capture can be performed using the Data Vault 2.0 HASH_DIFF concept.

B.  

Change data capture can be performed using the Data Vault 2.0 HASH_DELTA concept.

C.  

Using the multi-table insert feature in Snowflake, multiple Point-in-Time (PIT) tables can be loaded in parallel from a single join query from the data vault.

D.  

Using the multi-table insert feature, multiple Point-in-Time (PIT) tables can be loaded sequentially from a single join query from the data vault.

E.  

There are performance challenges when using Snowflake to load multiple Point-in-Time (PIT) tables in parallel from a single join query from the data vault.
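
A hedged sketch of the multi-table insert feature referenced in these options, loading two PIT tables from one query (all object names are illustrative):

INSERT ALL
  INTO pit_customer_daily (hub_customer_hk, snapshot_ts)
    VALUES (hub_customer_hk, snapshot_ts)
  INTO pit_customer_monthly (hub_customer_hk, snapshot_ts)
    VALUES (hub_customer_hk, snapshot_ts)
SELECT h.hub_customer_hk, s.load_ts AS snapshot_ts
FROM hub_customer h
JOIN sat_customer s ON h.hub_customer_hk = s.hub_customer_hk;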

Questions 37

How can the Snowpipe REST API be used to keep a log of data load history?

Options:

A.  

Call insertReport every 20 minutes, fetching the last 10,000 entries.

B.  

Call loadHistoryScan every minute for the maximum time range.

C.  

Call insertReport every 8 minutes for a 10-minute time range.

D.  

Call loadHistoryScan every 10 minutes for a 15-minute time range.

Questions 38

Why might a Snowflake Architect use a star schema model rather than a 3NF model when designing a data architecture to run in Snowflake? (Select TWO).

Options:

A.  

Snowflake cannot handle the joins implied in a 3NF data model.

B.  

The Architect wants to remove data duplication from the data stored in Snowflake.

C.  

The Architect is designing a landing zone to receive raw data into Snowflake.

D.  

The BI tool needs a data model that allows users to summarize facts across different dimensions, or to drill down from the summaries.

E.  

The Architect wants to present a simple flattened single view of the data to a particular group of end users.

Questions 39

An Architect needs to meet a company requirement to ingest files from the company's AWS storage accounts into the company's Snowflake Google Cloud Platform (GCP) account. How can the ingestion of these files into the company's Snowflake account be initiated? (Select TWO).

Options:

A.  

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

B.  

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 Glacier storage.

C.  

Create an AWS Lambda function to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

D.  

Configure AWS Simple Notification Service (SNS) to notify Snowpipe when new files have arrived in Amazon S3 storage.

E.  

Configure the client application to issue a COPY INTO <table> command to Snowflake when new files have arrived in Amazon S3 Glacier storage.

Questions 40

Which feature provides the capability to define an alternate cluster key for a table with an existing cluster key?

Options:

A.  

External table

B.  

Materialized view

C.  

Search optimization

D.  

Result cache
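
For context, a materialized view can carry its own clustering key, independent of the base table's key (a sketch with illustrative names):

CREATE MATERIALIZED VIEW orders_by_customer
  CLUSTER BY (customer_id)
AS
  SELECT customer_id, order_id, order_date, amount
  FROM orders;   -- the base table may be clustered on a different key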

Questions 41

A company is trying to ingest 10 TB of CSV data into a Snowflake table using Snowpipe as part of its migration from a legacy database platform. The records need to be ingested in the MOST performant and cost-effective way.

How can these requirements be met?

Options:

A.  

Use ON_ERROR = CONTINUE in the COPY INTO command.

B.  

Use PURGE = TRUE in the COPY INTO command.

C.  

Use PURGE = FALSE in the COPY INTO command.

D.  

Use ON_ERROR = SKIP_FILE in the COPY INTO command.

Questions 42

An Architect uses COPY INTO with the ON_ERROR=SKIP_FILE option to bulk load CSV files into a table called TABLEA, using its table stage. One file named file5.csv fails to load. The Architect fixes the file and re-loads it to the stage with the exact same file name it had previously.

Which commands should the Architect use to load only the file5.csv file from the stage? (Choose two.)

Options:

A.  

COPY INTO tablea FROM @%tablea RETURN_FAILED_ONLY = TRUE;

B.  

COPY INTO tablea FROM @%tablea;

C.  

COPY INTO tablea FROM @%tablea FILES = ('file5.csv');

D.  

COPY INTO tablea FROM @%tablea FORCE = TRUE;

E.  

COPY INTO tablea FROM @%tablea NEW_FILES_ONLY = TRUE;

F.  

COPY INTO tablea FROM @%tablea MERGE = TRUE;

Questions 43

A Snowflake Architect created a new data share and would like to verify that only specific records in secure views are visible within the data share by the consumers.

What is the recommended way to validate data accessibility by the consumers?

Options:

A.  

Create reader accounts as shown below and impersonate the consumers by logging in with their credentials.

CREATE MANAGED ACCOUNT reader_acct1 ADMIN_NAME = user1, ADMIN_PASSWORD = 'Sdfed43da!44', TYPE = READER;

B.  

Create a row access policy as shown below and assign it to the data share.

CREATE OR REPLACE ROW ACCESS POLICY rap_acct AS (acct_id VARCHAR) RETURNS BOOLEAN ->
CASE WHEN 'acct1_role' = CURRENT_ROLE() THEN TRUE ELSE FALSE END;

C.  

Set the session parameter SIMULATED_DATA_SHARING_CONSUMER as shown below in order to impersonate the consumer accounts.

ALTER SESSION SET SIMULATED_DATA_SHARING_CONSUMER = 'Consumer Acct1';

D.  

Alter the share settings as shown below in order to impersonate a specific consumer account.

ALTER SHARE sales_share SET ACCOUNTS = 'Consumer1' SHARE_RESTRICTIONS = true;

Questions 44

A Snowflake Architect is designing a multi-tenant application strategy for an organization in the Snowflake Data Cloud and is considering using an Account Per Tenant strategy.

Which requirements will be addressed with this approach? (Choose two.)

Options:

A.  

There needs to be fewer objects per tenant.

B.  

Security and Role-Based Access Control (RBAC) policies must be simple to configure.

C.  

Compute costs must be optimized.

D.  

Tenant data shape may be unique per tenant.

E.  

Storage costs must be optimized.

Questions 45

How does a standard virtual warehouse policy work in Snowflake?

Options:

A.  

It conserves credits by keeping running clusters fully loaded rather than starting additional clusters.

B.  

It starts only if the system estimates that there is a query load that will keep the cluster busy for at least 6 minutes.

C.  

It starts only if the system estimates that there is a query load that will keep the cluster busy for at least 2 minutes.

D.  

It prevents or minimizes queuing by starting additional clusters instead of conserving credits.

Questions 46

When activating Tri-Secret Secure in a hierarchical encryption model in a Snowflake account, at what level is the customer-managed key used?

Options:

A.  

At the root level (HSM)

B.  

At the account level (AMK)

C.  

At the table level (TMK)

D.  

At the micro-partition level

Questions 47

A Snowflake Architect is designing a multiple-account design strategy.

This strategy will be MOST cost-effective with which scenarios? (Select TWO).

Options:

A.  

The company wants to clone a production database that resides on AWS to a development database that resides on Azure.

B.  

The company needs to share data between two databases, where one must support Payment Card Industry Data Security Standard (PCI DSS) compliance but the other one does not.

C.  

The company needs to support different role-based access control features for the development, test, and production environments.

D.  

The company security policy mandates the use of different Active Directory instances for the development, test, and production environments.

E.  

The company must use a specific network policy for certain users to allow and block given IP addresses.

Questions 48

You are a Snowflake Architect in an organization. The business team has asked you to deploy a use case that requires loading some data which they can visualize through Tableau. Every day new data comes in, and the old data is no longer required.

What type of table will you use in this case to optimize cost?

Options:

A.  

TRANSIENT

B.  

TEMPORARY

C.  

PERMANENT
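
For reference, a transient table with minimal retention is created like this (a sketch; the name, columns, and retention value are illustrative):

CREATE TRANSIENT TABLE daily_feed (
  id NUMBER,
  payload VARIANT
)
DATA_RETENTION_TIME_IN_DAYS = 0;   -- transient tables also have no Fail-safe period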
