
ExamsBrite Dumps

SnowPro Advanced: Architect Recertification Exam (ARA-R01) Questions and Answers

Last Update Oct 16, 2025
Total Questions: 162

Questions 1

How can the Snowpipe REST API be used to keep a log of data load history?

Options:

A.  

Call insertReport every 20 minutes, fetching the last 10,000 entries.

B.  

Call loadHistoryScan every minute for the maximum time range.

C.  

Call insertReport every 8 minutes for a 10-minute time range.

D.  

Call loadHistoryScan every 10 minutes for a 15-minute range.

Questions 2

When using the COPY INTO command with the CSV file format, how does the MATCH_BY_COLUMN_NAME parameter behave?

Options:

A.  

It expects a header to be present in the CSV file, which is matched to a case-sensitive table column name.

B.  

The parameter will be ignored.

C.  

The command will return an error.

D.  

The command will return a warning stating that the file has unmatched columns.
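For reference, a minimal sketch of where the MATCH_BY_COLUMN_NAME copy option appears in a COPY INTO statement. The table, stage, and file names are illustrative, and JSON is used here because it is the file format the parameter is most commonly documented with:

copy into customer_reviews
from @reviews_stage/reviews.json
file_format = (type = 'JSON')
match_by_column_name = case_insensitive;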

Questions 3

How is the change of local time due to daylight savings time handled in Snowflake tasks? (Choose two.)

Options:

A.  

A task scheduled in a UTC-based schedule will have no issues with the time changes.

B.  

Task schedules can be designed to follow specified or local time zones to accommodate the time changes.

C.  

A task will move to a suspended state during the daylight savings time change.

D.  

A frequent task execution schedule like minutes may not cause a problem, but will affect the task history.

E.  

A task schedule will follow only the specified time and will fail to handle lost or duplicated hours.
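For reference, a minimal sketch of a task whose cron schedule is pinned to a named time zone (task, warehouse, and procedure names are illustrative); a cron schedule follows the rules of the specified time zone, including daylight saving transitions:

create task nightly_refresh
  warehouse = transform_wh
  schedule = 'USING CRON 0 2 * * * America/Los_Angeles'
as
  call refresh_sales_proc();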

Questions 4

Based on the Snowflake object hierarchy, what securable objects belong directly to a Snowflake account? (Select THREE).

Options:

A.  

Database

B.  

Schema

C.  

Table

D.  

Stage

E.  

Role

F.  

Warehouse

Questions 5

An Architect needs to allow a user to create a database from an inbound share.

To meet this requirement, the user’s role must have which privileges? (Choose two.)

Options:

A.  

IMPORT SHARE;

B.  

IMPORT PRIVILEGES;

C.  

CREATE DATABASE;

D.  

CREATE SHARE;

E.  

IMPORT DATABASE;
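For reference, a sketch of the grants involved and how they are used (role, account, share, and database names are illustrative):

grant import share on account to role analyst_role;
grant create database on account to role analyst_role;
-- A user with ANALYST_ROLE can then create a database from the inbound share:
create database partner_db from share partner_account.partner_share;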

Questions 6

Which query will identify the specific days and virtual warehouses that would benefit from a multi-cluster warehouse to improve the performance of a particular workload?

Options:

[The four answer options are SQL query screenshots in the original and are not reproduced here.]

Questions 7

A company's Architect needs to find an efficient way to get data from an external partner, who is also a Snowflake user. The current solution is based on daily JSON extracts that are placed on an FTP server and uploaded to Snowflake manually. The files are changed several times each month, and the ingestion process needs to be adapted to accommodate these changes.

What would be the MOST efficient solution?

Options:

A.  

Ask the partner to create a share and add the company's account.

B.  

Ask the partner to use the data lake export feature and place the data into cloud storage where Snowflake can natively ingest it (schema-on-read).

C.  

Keep the current structure but request that the partner stop changing files, instead only appending new files.

D.  

Ask the partner to set up a Snowflake reader account and use that account to get the data for ingestion.

Questions 8

Company A would like to share data in Snowflake with Company B. Company B is not on the same cloud platform as Company A.

What is required to allow data sharing between these two companies?

Options:

A.  

Create a pipeline to write shared data to a cloud storage location in the target cloud provider.

B.  

Ensure that all views are persisted, as views cannot be shared across cloud platforms.

C.  

Set up data replication to the region and cloud platform where the consumer resides.

D.  

Company A and Company B must agree to use a single cloud platform: Data sharing is only possible if the companies share the same cloud provider.

Questions 9

A media company needs a data pipeline that will ingest customer review data into a Snowflake table, and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set available publicly for advertising companies who use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage, leveraging event notifications. Also, the operational complexity, the maintenance of the infrastructure (including platform upgrades and security), and the development effort should be minimal.

Which design will meet these requirements?

Options:

A.  

Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

B.  

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

C.  

Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

D.  

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
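For reference, a sketch of how an external function backed by a sentiment-analysis endpoint would be declared; the integration name, IAM role ARN, and URLs are illustrative placeholders, not values from the question:

create api integration comprehend_api
  api_provider = aws_api_gateway
  api_aws_role_arn = 'arn:aws:iam::123456789012:role/comprehend-proxy'
  api_allowed_prefixes = ('https://example.execute-api.us-east-1.amazonaws.com/prod/')
  enabled = true;

create external function get_sentiment(review_text string)
  returns variant
  api_integration = comprehend_api
  as 'https://example.execute-api.us-east-1.amazonaws.com/prod/sentiment';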

Questions 10

A company has several sites in different regions from which the company wants to ingest data.

Which of the following will enable this type of data ingestion?

Options:

A.  

The company must have a Snowflake account in each cloud region to be able to ingest data to that account.

B.  

The company must replicate data between Snowflake accounts.

C.  

The company should provision a reader account to each site and ingest the data through the reader accounts.

D.  

The company should use a storage integration for the external stage.

Questions 11

What are purposes for creating a storage integration? (Choose three.)

Options:

A.  

Control access to Snowflake data using a master encryption key that is maintained in the cloud provider’s key management service.

B.  

Store a generated identity and access management (IAM) entity for an external cloud provider regardless of the cloud provider that hosts the Snowflake account.

C.  

Support multiple external stages using one single Snowflake object.

D.  

Avoid supplying credentials when creating a stage or when loading or unloading data.

E.  

Create private VPC endpoints that allow direct, secure connectivity between VPCs without traversing the public internet.

F.  

Manage credentials from multiple cloud providers in one single Snowflake object.
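For reference, a sketch of a storage integration backing more than one stage without supplying credentials (integration, stage, bucket, and ARN values are illustrative):

create storage integration s3_int
  type = external_stage
  storage_provider = 'S3'
  enabled = true
  storage_aws_role_arn = 'arn:aws:iam::123456789012:role/snowflake_access'
  storage_allowed_locations = ('s3://my-bucket/raw/', 's3://my-bucket/exports/');

create stage raw_stage
  url = 's3://my-bucket/raw/'
  storage_integration = s3_int;

create stage export_stage
  url = 's3://my-bucket/exports/'
  storage_integration = s3_int;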

Questions 12

A global company needs to securely share its sales and inventory data with a vendor using a Snowflake account.

The company has its Snowflake account in the AWS eu-west-2 Europe (London) region. The vendor's Snowflake account is on the Azure platform in the West Europe region. How should the company's Architect configure the data share?

Options:

A.  

1. Create a share.

2. Add objects to the share.

3. Add a consumer account to the share for the vendor to access.

B.  

1. Create a share.

2. Create a reader account for the vendor to use.

3. Add the reader account to the share.

C.  

1. Create a new role called db_share.

2. Grant the db_share role privileges to read data from the company database and schema.

3. Create a user for the vendor.

4. Grant the db_share role to the vendor's users.

D.  

1. Promote an existing database in the company's local account to primary.

2. Replicate the database to Snowflake on Azure in the West-Europe region.

3. Create a share and add objects to the share.

4. Add a consumer account to the share for the vendor to access.
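For reference, a sketch of the replication steps described in the last option above (organization, account, and database names are illustrative):

-- In the AWS London account: allow replication to the company's Azure account.
alter database sales_inventory_db enable replication to accounts my_org.azure_westeurope_acct;

-- In the Azure West Europe account: create and refresh the secondary database,
-- then build the share from that account.
create database sales_inventory_db as replica of my_org.aws_london_acct.sales_inventory_db;
alter database sales_inventory_db refresh;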

Questions 13

Which of the following objects can be cloned in Snowflake?

Options:

A.  

Permanent table

B.  

Transient table

C.  

Temporary table

D.  

External tables

E.  

Internal stages
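For reference, cloning syntax for the table types listed above (table names are illustrative):

create table orders_dev clone orders;
create transient table orders_tmp clone orders;
create temporary table orders_scratch clone orders;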

Questions 14

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

What is the reason for this?

Options:

A.  

The query is processing a very large dataset.

B.  

The query has overly complex logic.

C.  

The query is queued for execution.

D.  

The query is reading from remote storage.

Questions 15

Which steps are recommended best practices for prioritizing cluster keys in Snowflake? (Choose two.)

Options:

A.  

Choose columns that are frequently used in join predicates.

B.  

Choose lower cardinality columns to support clustering keys and cost effectiveness.

C.  

Choose TIMESTAMP columns with nanoseconds for the highest number of unique rows.

D.  

Choose cluster columns that are most actively used in selective filters.

E.  

Choose cluster columns that are actively used in the GROUP BY clauses.
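For reference, a minimal sketch of defining a clustering key on columns used in selective filters and join predicates (table and column names are illustrative):

alter table store_sales cluster by (sale_date, store_id);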

Questions 16

A company needs to have the following features available in its Snowflake account:

1. Support for Multi-Factor Authentication (MFA)

2. A minimum of 2 months of Time Travel availability

3. Database replication in between different regions

4. Native support for JDBC and ODBC

5. Customer-managed encryption keys using Tri-Secret Secure

6. Support for Payment Card Industry Data Security Standards (PCI DSS)

In order to provide all the listed services, what is the MINIMUM Snowflake edition that should be selected during account creation?

Options:

A.  

Standard

B.  

Enterprise

C.  

Business Critical

D.  

Virtual Private Snowflake (VPS)

Questions 17

Which of the following ingestion methods can be used to load near real-time data by using the messaging services provided by a cloud provider?

Options:

A.  

Snowflake Connector for Kafka

B.  

Snowflake streams

C.  

Snowpipe

D.  

Spark

Questions 18

Which feature provides the capability to define an alternate cluster key for a table with an existing cluster key?

Options:

A.  

External table

B.  

Materialized view

C.  

Search optimization

D.  

Result cache

Questions 19

A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.

Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)

Options:

A.  

Use, at minimum, the Business Critical edition of Snowflake.

B.  

Create Dynamic Data Masking policies and apply them to columns that contain PHI.

C.  

Use the Internal Tokenization feature to obfuscate sensitive data.

D.  

Use the External Tokenization feature to obfuscate sensitive data.

E.  

Rewrite SQL queries to eliminate projections of PHI data based on current_role().

F.  

Avoid sharing data with partner organizations.

Questions 20

What considerations need to be taken when using database cloning as a tool for data lifecycle management in a development environment? (Select TWO).

Options:

A.  

Any pipes in the source are not cloned.

B.  

Any pipes in the source referring to internal stages are not cloned.

C.  

Any pipes in the source referring to external stages are not cloned.

D.  

The clone inherits all granted privileges of all child objects in the source object, including the database.

E.  

The clone inherits all granted privileges of all child objects in the source object, excluding the database.

Questions 22

The following table exists in the production database:

A regulatory requirement states that the company must mask the username for events that are older than six months based on the current date when the data is queried.

How can the requirement be met without duplicating the event data, while making sure the masking is applied when views are created from the table or when the table is cloned?

Options:

A.  

Use a masking policy on the username column using an entitlement table with valid dates.

B.  

Use a row access policy on the user_events table using an entitlement table with valid dates.

C.  

Use a masking policy on the username column with event_timestamp as a conditional column.

D.  

Use a secure view on the user_events table using a case statement on the username column.
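For reference, a sketch of a masking policy that uses a conditional column; the column names and types are assumptions, since the table definition is shown only as an image in the original:

create masking policy mask_old_usernames as
  (username string, event_timestamp timestamp_ntz) returns string ->
  case
    when event_timestamp >= dateadd(month, -6, current_date()) then username
    else '***MASKED***'
  end;

alter table user_events modify column username
  set masking policy mask_old_usernames using (username, event_timestamp);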

Questions 23

What is a key consideration when setting up search optimization service for a table?

Options:

A.  

Search optimization service works best with a column that has a minimum of 100 K distinct values.

B.  

Search optimization service can significantly improve query performance on partitioned external tables.

C.  

Search optimization service can help to optimize storage usage by compressing the data into a GZIP format.

D.  

The table must be clustered with a key having multiple columns for effective search optimization.

Questions 24

A group of Data Analysts have been granted the ANALYST_ROLE role. They need a Snowflake database where they can create and modify tables, views, and other objects to load with their own data. The Analysts should not have the ability to give other Snowflake users outside of their role access to this data.

How should these requirements be met?

Options:

A.  

Grant ANALYST_ROLE OWNERSHIP on the database, but make sure that ANALYST_ROLE does not have the MANAGE GRANTS privilege on the account.

B.  

Grant SYSADMIN ownership of the database, but grant the create schema privilege on the database to the ANALYST_ROLE.

C.  

Make every schema in the database a managed access schema, owned by SYSADMIN, and grant create privileges on each schema to the ANALYST_ROLE for each type of object that needs to be created.

D.  

Grant ANALYST_ROLE ownership on the database, but grant the ownership on future [object type] s in database privilege to SYSADMIN.
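For reference, a sketch of the managed access pattern (database, schema, and role names are illustrative):

create database analyst_db;
create schema analyst_db.workspace with managed access;
grant usage on database analyst_db to role analyst_role;
grant usage, create table, create view on schema analyst_db.workspace to role analyst_role;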

Questions 25

The IT Security team has identified that there is an ongoing credential stuffing attack on many of their organization's systems.

What is the BEST way to find recent and ongoing login attempts to Snowflake?

Options:

A.  

Call the LOGIN_HISTORY Information Schema table function.

B.  

Query the LOGIN_HISTORY view in the ACCOUNT_USAGE schema in the SNOWFLAKE database.

C.  

View the History tab in the Snowflake UI and set up a filter for SQL text that contains the text "LOGIN".

D.  

View the Users section in the Account tab in the Snowflake UI and review the last login column.
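For reference, a minimal sketch of querying the Account Usage view (it retains a year of history, with up to about two hours of latency, whereas the Information Schema table function only covers the last seven days):

select event_timestamp, user_name, client_ip, reported_client_type, is_success, error_message
from snowflake.account_usage.login_history
where event_timestamp > dateadd(hour, -24, current_timestamp())
order by event_timestamp desc;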

Questions 26

A Snowflake Architect created a new data share and would like to verify that only specific records in secure views are visible within the data share by the consumers.

What is the recommended way to validate data accessibility by the consumers?

Options:

A.  

Create reader accounts as shown below and impersonate the consumers by logging in with their credentials.

create managed account reader_acct1 admin_name = user1, admin_password = 'Sdfed43da!44', type = reader;

B.  

Create a row access policy as shown below and assign it to the data share.

create or replace row access policy rap_acct as (acct_id varchar) returns boolean -> case when 'acct1_role' = current_role() then true else false end;

C.  

Set the session parameter called SIMULATED_DATA_SHARING_CONSUMER as shown below in order to impersonate the consumer accounts.

alter session set simulated_data_sharing_consumer = 'Consumer_Acct1';

D.  

Alter the share settings as shown below, in order to impersonate a specific consumer account.

alter share sales_share set accounts = 'Consumer1' share_restrictions = true;

Questions 27

A user is executing the following commands sequentially within a timeframe of 10 minutes from start to finish:

What would be the output of this query?

Options:

A.  

Table T_SALES_CLONE successfully created.

B.  

Time Travel data is not available for table T_SALES.

C.  

The OFFSET => is not a valid clause in the clone operation.

D.  

Syntax error line 1 at position 58 unexpected 'at’.
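For reference (this is not the screenshot from the question, which is not reproduced here), a clone that uses Time Travel with an offset looks like the following; the offset is expressed in seconds before the current time:

create table t_sales_clone clone t_sales at (offset => -600);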

Questions 28

Which organization-related tasks can be performed by the ORGADMIN role? (Choose three.)

Options:

A.  

Changing the name of the organization

B.  

Creating an account

C.  

Viewing a list of organization accounts

D.  

Changing the name of an account

E.  

Deleting an account

F.  

Enabling the replication of a database

Questions 29

An Architect has a VPN_ACCESS_LOGS table in the SECURITY_LOGS schema containing timestamps of the connection and disconnection, username of the user, and summary statistics.

What should the Architect do to enable the Snowflake search optimization service on this table?

Options:

A.  

Assume role with OWNERSHIP on future tables and ADD SEARCH OPTIMIZATION on the SECURITY_LOGS schema.

B.  

Assume role with ALL PRIVILEGES including ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

C.  

Assume role with OWNERSHIP on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

D.  

Assume role with ALL PRIVILEGES on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.
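For reference, the statement that enables the service once the right privileges are in place:

alter table security_logs.vpn_access_logs add search optimization;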

Questions 30

An Architect is integrating an application that needs to read and write data to Snowflake without installing any additional software on the application server.

How can this requirement be met?

Options:

A.  

Use SnowSQL.

B.  

Use the Snowpipe REST API.

C.  

Use the Snowflake SQL REST API.

D.  

Use the Snowflake ODBC driver.

Questions 31

A retail company has over 3000 stores all using the same Point of Sale (POS) system. The company wants to deliver near real-time sales results to category managers. The stores operate in a variety of time zones and exhibit a dynamic range of transactions each minute, with some stores having higher sales volumes than others.

Sales results are provided in a uniform fashion using data engineered fields that will be calculated in a complex data pipeline. Calculations include exceptions, aggregations, and scoring using external functions interfaced to scoring algorithms. The source data for aggregations has over 100M rows.

Every minute, the POS sends all sales transactions files to a cloud storage location with a naming convention that includes store numbers and timestamps to identify the set of transactions contained in the files. The files are typically less than 10MB in size.

How can the near real-time results be provided to the category managers? (Select TWO).

Options:

A.  

All files should be concatenated before ingestion into Snowflake to avoid micro-ingestion.

B.  

A Snowpipe should be created and configured with AUTO_INGEST = true. A stream should be created to process INSERTS into a single target table using the stream metadata to inform the store number and timestamps.

C.  

A stream should be created to accumulate the near real-time data and a task should be created that runs at a frequency that matches the real-time analytics needs.

D.  

An external scheduler should examine the contents of the cloud storage location and issue SnowSQL commands to process the data at a frequency that matches the real-time analytics needs.

E.  

The COPY INTO command with a task scheduled to run every second should be used to achieve the near real-time requirement.
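For reference, a sketch of the Snowpipe, stream, and task pattern (object names are illustrative; the scoring UDF stands in for the external scoring functions described above):

create pipe pos_pipe auto_ingest = true as
  copy into raw_sales
  from @pos_stage
  file_format = (type = 'CSV');

create stream raw_sales_stream on table raw_sales;

create task score_sales
  warehouse = etl_wh
  schedule = '1 minute'
  when system$stream_has_data('RAW_SALES_STREAM')
as
  insert into sales_scored
  select *, score_transaction(amount) from raw_sales_stream;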

Questions 32

How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)

Options:

A.  

Shared databases are read-only.

B.  

Shared databases must be refreshed in order for new data to be visible.

C.  

Shared databases cannot be cloned.

D.  

Shared databases are not supported by Time Travel.

E.  

Shared databases will have the PUBLIC or INFORMATION_SCHEMA schemas without explicitly granting these schemas to the share.

F.  

Shared databases can also be created as transient databases.

Questions 33

The Business Intelligence team reports that when some team members run queries for their dashboards in parallel with others, the query response time is getting significantly slower. What can a Snowflake Architect do to identify what is occurring and troubleshoot this issue?

Options:

[The four answer options are screenshots in the original and are not reproduced here.]

Questions 34

An Architect has designed a data pipeline that is receiving small CSV files from multiple sources. All of the files are landing in one location. Specific files are filtered for loading into Snowflake tables using the COPY command. The loading performance is poor.

What changes can be made to improve the data loading performance?

Options:

A.  

Increase the size of the virtual warehouse.

B.  

Create a multi-cluster warehouse and merge smaller files to create bigger files.

C.  

Create a specific storage landing bucket to avoid file scanning.

D.  

Change the file format from CSV to JSON.

Questions 35

A user can change object parameters using which of the following roles?

Options:

A.  

ACCOUNTADMIN, SECURITYADMIN

B.  

SYSADMIN, SECURITYADMIN

C.  

ACCOUNTADMIN, USER with PRIVILEGE

D.  

SECURITYADMIN, USER with PRIVILEGE

Questions 36

Which data models can be used when modeling tables in a Snowflake environment? (Select THREE).

Options:

A.  

Graph model

B.  

Dimensional/Kimball

C.  

Data lake

D.  

Inmon/3NF

E.  

Bayesian hierarchical model

F.  

Data vault

Questions 37

A new user user_01 is created within Snowflake. The following two commands are executed:

Command 1 -> show grants to user user_01;

Command 2 -> show grants on user user_01;

What inferences can be made about these commands?

Options:

A.  

Command 1 defines which user owns user_01

Command 2 defines all the grants which have been given to user_01

B.  

Command 1 defines all the grants which are given to user_01

Command 2 defines which user owns user_01

C.  

Command 1 defines which role owns user_01

Command 2 defines all the grants which have been given to user_01

D.  

Command 1 defines all the grants which are given to user_01

Command 2 defines which role owns user_01
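For reference, the two statements side by side:

show grants to user user_01;  -- lists the roles that have been granted to USER_01
show grants on user user_01;  -- lists the privileges (such as OWNERSHIP) held on the USER_01 object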

Questions 38

Which columns can be included in an external table schema? (Select THREE).

Options:

A.  

VALUE

B.  

METADATA$ROW_ID

C.  

METADATA$ISUPDATE

D.  

METADATA$FILENAME

E.  

METADATA$FILE_ROW_NUMBER

F.  

METADATA$EXTERNAL_TABLE_PARTITION
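For reference, a sketch of an external table that exposes metadata pseudocolumns as virtual columns (stage path and object names are illustrative):

create external table ext_sales (
  source_file string as (metadata$filename),
  source_row number as (metadata$file_row_number)
)
location = @lake_stage/sales/
file_format = (type = parquet);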

Questions 39

Which technique will efficiently ingest and consume semi-structured data for Snowflake data lake workloads?

Options:

A.  

IDEF1X

B.  

Schema-on-write

C.  

Schema-on-read

D.  

Information schema

Questions 40

A company is designing high availability and disaster recovery plans and needs to maximize redundancy and minimize recovery time objectives for their critical application processes. Cost is not a concern as long as the solution is the best available. The plan so far consists of the following steps:

1. Deployment of Snowflake accounts on two different cloud providers.

2. Selection of cloud provider regions that are geographically far apart.

3. The Snowflake deployment will replicate the databases and account data between both cloud provider accounts.

4. Implementation of Snowflake client redirect.

What is the MOST cost-effective way to provide the HIGHEST uptime and LEAST application disruption if there is a service event?

Options:

A.  

Connect the applications using the - URL. Use the Business Critical Snowflake edition.

B.  

Connect the applications using the - URL. Use the Virtual Private Snowflake (VPS) edition.

C.  

Connect the applications using the - URL. Use the Enterprise Snowflake edition.

D.  

Connect the applications using the - URL. Use the Business Critical Snowflake edition.
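For reference, a sketch of the client redirect piece of this design (organization, connection, and account names are illustrative):

create connection if not exists prod_conn;
alter connection prod_conn enable failover to accounts my_org.dr_account;
-- Clients connect through the connection URL, for example
-- my_org-prod_conn.snowflakecomputing.com, which can be redirected between accounts.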
