
SnowPro Advanced: Architect Certification Exam Question and Answers

SnowPro Advanced: Architect Certification Exam

Last Update: Apr 22, 2024
Total Questions: 155

We are offering free ARA-C01 Snowflake exam questions. Simply sign up, provide your details, and prepare with the free ARA-C01 exam questions before moving on to the complete pool of SnowPro Advanced: Architect Certification Exam practice questions.

Question 1

The IT Security team has identified that there is an ongoing credential stuffing attack on many of their organization's systems.

What is the BEST way to find recent and ongoing login attempts to Snowflake?

Options:

A.  

Call the LOGIN_HISTORY Information Schema table function.

B.  

Query the LOGIN_HISTORY view in the ACCOUNT_USAGE schema in the SNOWFLAKE database.

C.  

View the History tab in the Snowflake UI and set up a filter for SQL text that contains the text "LOGIN".

D.  

View the Users section in the Account tab in the Snowflake UI and review the last login column.
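
For context, a minimal sketch of both query-based approaches (the 24-hour and 1-hour windows are arbitrary choices):

    -- ACCOUNT_USAGE view: up to 365 days of history, but with latency that can reach two hours
    select event_timestamp, user_name, client_ip, is_success, error_message
    from snowflake.account_usage.login_history
    where event_timestamp >= dateadd('hour', -24, current_timestamp())
    order by event_timestamp desc;

    -- Information Schema table function: lower latency, limited to the last 7 days
    select *
    from table(information_schema.login_history(
        time_range_start => dateadd('hour', -1, current_timestamp())));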

Question 2

A company is trying to ingest 10 TB of CSV data into a Snowflake table using Snowpipe as part of its migration from a legacy database platform. The records need to be ingested in the MOST performant and cost-effective way.

How can these requirements be met?

Options:

A.  

Use ON_ERROR = CONTINUE in the COPY INTO command.

B.  

Use PURGE = TRUE in the COPY INTO command.

C.  

Use PURGE = FALSE in the COPY INTO command.

D.  

Use ON_ERROR = SKIP_FILE in the COPY INTO command.
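
As a point of reference, a hedged sketch of a COPY INTO command combining these options (the table, stage, and file format settings are hypothetical):

    copy into sales_raw
    from @landing_stage/sales/
    file_format = (type = csv skip_header = 1)
    on_error = ABORT_STATEMENT  -- the default for bulk COPY; fail fast on bad records
    purge = TRUE;               -- remove staged files after a successful load to avoid ongoing storage costs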

Question 3

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

What is the reason for this?

Options:

A.  

The query is processing a very large dataset.

B.  

The query has overly complex logic.

C.  

The query is queued for execution.

D.  

The query is reading from remote storage.
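
COMPILATION_TIME and EXECUTION_TIME (both in milliseconds) can be compared directly in the QUERY_HISTORY output, for example:

    -- Find recent queries that spent longer compiling than executing
    select query_id, compilation_time, execution_time, total_elapsed_time
    from table(information_schema.query_history(result_limit => 100))
    where compilation_time > execution_time
    order by start_time desc;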

Question 4

A user can change object parameters using which of the following roles?

Options:

A.  

ACCOUNTADMIN, SECURITYADMIN

B.  

SYSADMIN, SECURITYADMIN

C.  

ACCOUNTADMIN, USER with PRIVILEGE

D.  

SECURITYADMIN, USER with PRIVILEGE

Question 5

The following DDL command was used to create a task based on a stream:

Assuming MY_WH is set to AUTO_SUSPEND = 60 and is used exclusively for this task, which statement is true?

Options:

A.  

The warehouse MY_WH will be made active every five minutes to check the stream.

B.  

The warehouse MY_WH will only be active when there are results in the stream.

C.  

The warehouse MY_WH will never suspend.

D.  

The warehouse MY_WH will automatically resize to accommodate the size of the stream.
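
The DDL command itself is not reproduced in this copy of the question. As a hedged illustration only, a task built on a stream typically looks like the sketch below (object names hypothetical); the WHEN condition is evaluated without resuming the warehouse, so the warehouse is only made active when the stream actually has data:

    create or replace task process_stream_task
      warehouse = MY_WH
      schedule = '5 MINUTE'
      when system$stream_has_data('MY_STREAM')
    as
      insert into target_table select * from my_stream;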

Question 6

Which of the following are characteristics of Snowflake’s parameter hierarchy?

Options:

A.  

Session parameters override virtual warehouse parameters.

B.  

Virtual warehouse parameters override user parameters.

C.  

Table parameters override virtual warehouse parameters.

D.  

Schema parameters override account parameters.

Question 7

An Architect is designing a pipeline to stream event data into Snowflake using the Snowflake Kafka connector. The Architect’s highest priority is to configure the connector to stream data in the MOST cost-effective manner.

Which of the following is recommended for optimizing the cost associated with the Snowflake Kafka connector?

Options:

A.  

Utilize a higher buffer.flush.time in the connector configuration.

B.  

Utilize a higher buffer.size.bytes in the connector configuration.

C.  

Utilize a lower buffer.size.bytes in the connector configuration.

D.  

Utilize a lower buffer.count.records in the connector configuration.
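
The three properties above are the connector's buffer flush thresholds; a flush is triggered when the first threshold is reached, so raising them produces fewer, larger files per ingest. As an illustration only (the values below are arbitrary, not recommendations), the relevant portion of a connector configuration might read:

    buffer.flush.time=120
    buffer.count.records=10000
    buffer.size.bytes=20000000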

Question 8

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

Options:

A.  

All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.

B.  

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.

C.  

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.

D.  

All rows loaded using a specific COPY statement will have the same timestamp value.
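
A minimal sketch of such a table (names hypothetical); because CURRENT_TIMESTAMP() returns the start time of the statement, every row loaded by a single COPY receives the same value:

    create or replace table events (
        id      number,
        payload variant,
        load_ts timestamp_ltz default current_timestamp()  -- constant within one COPY statement
    );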

Question 9

Which Snowflake data modeling approach is designed for BI queries?

Options:

A.  

3NF

B.  

Star schema

C.  

Data Vault

D.  

Snowflake schema

Question 10

A Snowflake Architect is setting up database replication to support a disaster recovery plan. The primary database has external tables.

How should the database be replicated?

Options:

A.  

Create a clone of the primary database then replicate the database.

B.  

Move the external tables to a database that is not replicated, then replicate the primary database.

C.  

Replicate the database ensuring the replicated database is in the same region as the external tables.

D.  

Share the primary database with an account in the same region that the database will be replicated to.

Question 11

A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.

Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)

Options:

A.  

Use, at minimum, the Business Critical edition of Snowflake.

B.  

Create Dynamic Data Masking policies and apply them to columns that contain PHI.

C.  

Use the Internal Tokenization feature to obfuscate sensitive data.

D.  

Use the External Tokenization feature to obfuscate sensitive data.

E.  

Rewrite SQL queries to eliminate projections of PHI data based on current_role().

F.  

Avoid sharing data with partner organizations.
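
A hedged sketch of the Dynamic Data Masking approach from option B (role and object names hypothetical):

    create masking policy phi_mask as (val string) returns string ->
        case
            when current_role() in ('PHI_ANALYST') then val
            else '*** MASKED ***'
        end;

    alter table patients modify column diagnosis set masking policy phi_mask;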

Question 12

A retail company has over 3000 stores all using the same Point of Sale (POS) system. The company wants to deliver near real-time sales results to category managers. The stores operate in a variety of time zones and exhibit a dynamic range of transactions each minute, with some stores having higher sales volumes than others.

Sales results are provided in a uniform fashion using data engineered fields that will be calculated in a complex data pipeline. Calculations include exceptions, aggregations, and scoring using external functions interfaced to scoring algorithms. The source data for aggregations has over 100M rows.

Every minute, the POS sends all sales transactions files to a cloud storage location with a naming convention that includes store numbers and timestamps to identify the set of transactions contained in the files. The files are typically less than 10MB in size.

How can the near real-time results be provided to the category managers? (Select TWO).

Options:

A.  

All files should be concatenated before ingestion into Snowflake to avoid micro-ingestion.

B.  

A Snowpipe should be created and configured with AUTO_INGEST = true. A stream should be created to process INSERTS into a single target table using the stream metadata to inform the store number and timestamps.

C.  

A stream should be created to accumulate the near real-time data and a task should be created that runs at a frequency that matches the real-time analytics needs.

D.  

An external scheduler should examine the contents of the cloud storage location and issue SnowSQL commands to process the data at a frequency that matches the real-time analytics needs.

E.  

The copy into command with a task scheduled to run every second should be used to achieve the near-real time requirement.

Question 13

An Architect has chosen to separate their Snowflake Production and QA environments using two separate Snowflake accounts.

The QA account is intended to run and test changes on data and database objects before pushing those changes to the Production account. It is a requirement that all database objects and data in the QA account be an exact copy of those in the Production account, including privileges, on at least a nightly basis.

Which is the LEAST complex approach to use to populate the QA account with the Production account’s data and database objects on a nightly basis?

Options:

A.  

1) Create a share in the Production account for each database

2) Share access to the QA account as a Consumer

3) The QA account creates a database directly from each share

4) Create clones of those databases on a nightly basis

5) Run tests directly on those cloned databases

B.  

1) Create a stage in the Production account

2) Create a stage in the QA account that points to the same external object-storage location

3) Create a task that runs nightly to unload each table in the Production account into the stage

4) Use Snowpipe to populate the QA account

C.  

1) Enable replication for each database in the Production account

2) Create replica databases in the QA account

3) Create clones of the replica databases on a nightly basis

4) Run tests directly on those cloned databases

D.  

1) In the Production account, create an external function that connects into the QA account and returns all the data for one specific table

2) Run the external function as part of a stored procedure that loops through each table in the Production account and populates each table in the QA account
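
For reference, a minimal sketch of the replication-plus-clone flow from option C (organization and account names hypothetical; the nightly refresh and clone would typically be driven by a scheduled task):

    -- In the Production account:
    alter database prod_db enable replication to accounts my_org.qa_account;

    -- In the QA account:
    create database prod_db_replica as replica of my_org.prod_account.prod_db;
    alter database prod_db_replica refresh;
    create or replace database qa_nightly clone prod_db_replica;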

Question 14

Is it possible for a data provider account with a Snowflake Business Critical edition to share data with an Enterprise edition data consumer account?

Options:

A.  

A Business Critical account cannot be a data sharing provider to an Enterprise consumer. Any consumer accounts must also be Business Critical.

B.  

If a user in the provider account with role authority to create or alter share adds an Enterprise account as a consumer, it can import the share.

C.  

If a user in the provider account with a share-owning role sets SHARE_RESTRICTIONS to FALSE when adding an Enterprise consumer account, it can import the share.

D.  

If a user in the provider account with a share-owning role that also has the OVERRIDE SHARE RESTRICTIONS privilege sets SHARE_RESTRICTIONS to FALSE when adding an Enterprise consumer account, it can import the share.
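
For reference, the cross-edition add is expressed on the share itself. A sketch (share and account names hypothetical); executing it from a Business Critical provider requires the OVERRIDE SHARE RESTRICTIONS privilege:

    alter share sensitive_share
      add accounts = consumer_org.enterprise_acct
      share_restrictions = false;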

Question 15

Two queries are run on the customer_address table:

create or replace TABLE CUSTOMER_ADDRESS (
    CA_ADDRESS_SK NUMBER(38,0),
    CA_ADDRESS_ID VARCHAR(16),
    CA_STREET_NUMBER VARCHAR(10),
    CA_STREET_NAME VARCHAR(60),
    CA_STREET_TYPE VARCHAR(15),
    CA_SUITE_NUMBER VARCHAR(10),
    CA_CITY VARCHAR(60),
    CA_COUNTY VARCHAR(30),
    CA_STATE VARCHAR(2),
    CA_ZIP VARCHAR(10),
    CA_COUNTRY VARCHAR(20),
    CA_GMT_OFFSET NUMBER(5,2),
    CA_LOCATION_TYPE VARCHAR(20)
);

ALTER TABLE DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS ADD SEARCH OPTIMIZATION ON SUBSTRING(CA_ADDRESS_ID);

Which queries will benefit from the use of the search optimization service? (Select TWO).

Options:

A.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where substring(CA_ADDRESS_ID,1,8) = substring('AAAAAAAAPHPPLBAAASKDJHASLKDJHASKJD',1,8);

B.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID = substring('AAAAAAAAPHPPLBAAASKDJHASLKDJHASKJD',1,16);

C.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID LIKE '%BAAASKD%';

D.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID LIKE '%PHPP%';

E.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID NOT LIKE '%AAAAAAAAPHPPL%';

Question 16

A Developer is having a performance issue with a Snowflake query. The query receives up to 10 different values for one parameter and then performs an aggregation over the majority of a fact table. It then joins against a smaller dimension table. This parameter value is selected by the different query users when they execute it during business hours. Both the fact and dimension tables are loaded with new data in an overnight import process.

On a Small or Medium-sized virtual warehouse, the query performs slowly. Performance is acceptable on a size Large or bigger warehouse. However, there is no budget to increase costs. The Developer needs a recommendation that does not increase compute costs to run this query.

What should the Architect recommend?

Options:

A.  

Create a task that will run the 10 different variations of the query corresponding to the 10 different parameters before the users come in to work. The query results will then be cached and ready to respond quickly when the users re-issue the query.

B.  

Create a task that will run the 10 different variations of the query corresponding to the 10 different parameters before the users come in to work. The task will be scheduled to align with the users' working hours in order to allow the warehouse cache to be used.

C.  

Enable the search optimization service on the table. When the users execute the query, the search optimization service will automatically adjust the query execution plan based on the frequently-used parameters.

D.  

Create a dedicated size Large warehouse for this particular set of queries. Create a new role that has USAGE permission on this warehouse and has the appropriate read permissions over the fact and dimension tables. Have users switch to this role and use this warehouse when they want to access this data.

Question 17

Which of the following objects can be cloned in Snowflake?

Options:

A.  

Permanent table

B.  

Transient table

C.  

Temporary table

D.  

External tables

E.  

Internal stages

Question 18

A Data Engineer is designing a near real-time ingestion pipeline for a retail company to ingest event logs into Snowflake to derive insights. A Snowflake Architect is asked to define security best practices to configure access control privileges for the data load for auto-ingest to Snowpipe.

What are the MINIMUM object privileges required for the Snowpipe user to execute Snowpipe?

Options:

A.  

OWNERSHIP on the named pipe, USAGE on the named stage, target database, and schema, and INSERT and SELECT on the target table

B.  

OWNERSHIP on the named pipe, USAGE and READ on the named stage, USAGE on the target database and schema, and INSERT and SELECT on the target table

C.  

CREATE on the named pipe, USAGE and READ on the named stage, USAGE on the target database and schema, and INSERT and SELECT on the target table

D.  

USAGE on the named pipe, named stage, target database, and schema, and INSERT and SELECT on the target table
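
Expressed as grants, the minimum privilege set from option B looks like this sketch (object and role names hypothetical):

    grant usage on database raw_db to role snowpipe_role;
    grant usage on schema raw_db.landing to role snowpipe_role;
    grant read on stage raw_db.landing.events_stage to role snowpipe_role;  -- READ for an internal stage; USAGE for an external stage
    grant insert, select on table raw_db.landing.events to role snowpipe_role;
    grant ownership on pipe raw_db.landing.events_pipe to role snowpipe_role;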

Question 19

A company is using a Snowflake account in Azure. The account has SAML SSO set up using ADFS as a SCIM identity provider. To validate Private Link connectivity, an Architect performed the following steps:

* Confirmed Private Link URLs are working by logging in with a username/password account

* Verified DNS resolution by running nslookups against Private Link URLs

* Validated connectivity using SnowCD

* Disabled public access using a network policy set to use the company’s IP address range

However, the following error message is received when using SSO to log into the company account:

IP XX.XXX.XX.XX is not allowed to access snowflake. Contact your local security administrator.

What steps should the Architect take to resolve this error and ensure that the account is accessed using only Private Link? (Choose two.)

Options:

A.  

Alter the Azure security integration to use the Private Link URLs.

B.  

Add the IP address in the error message to the allowed list in the network policy.

C.  

Generate a new SCIM access token using system$generate_scim_access_token and save it to Azure AD.

D.  

Update the configuration of the Azure AD SSO to use the Private Link URLs.

E.  

Open a case with Snowflake Support to authorize the Private Link URLs’ access to the account.

Question 20

How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)

Options:

A.  

Shared databases are read-only.

B.  

Shared databases must be refreshed in order for new data to be visible.

C.  

Shared databases cannot be cloned.

D.  

Shared databases are not supported by Time Travel.

E.  

Shared databases will have the PUBLIC or INFORMATION_SCHEMA schemas without explicitly granting these schemas to the share.

F.  

Shared databases can also be created as transient databases.

Question 21

Which query will identify the specific days and virtual warehouses that would benefit from a multi-cluster warehouse to improve the performance of a particular workload?

(The four candidate queries, labeled Options A through D, were presented as images and are not reproduced here.)

Options:

A.  

Option A

B.  

Option B

C.  

Option C

D.  

Option D

Question 22

Role A has the following permissions:

. USAGE on db1

. USAGE and CREATE VIEW on schema1 in db1

. SELECT on table1 in schema1

Role B has the following permissions:

. USAGE on db2

. USAGE and CREATE VIEW on schema2 in db2

. SELECT on table2 in schema2

A user has Role A set as the primary role and Role B as a secondary role.

What command will fail for this user?

Options:

A.  

use database db1;

use schema schema1;

create view v1 as select * from db2.schema2.table2;

B.  

use database db2;

use schema schema2;

create view v2 as select * from db1.schema1.table1;

C.  

use database db2;

use schema schema2;

select * from db1.schema1.table1 union select * from table2;

D.  

use database db1;

use schema schema1;

select * from db2.schema2.table2;
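
The key behavior here is that secondary roles contribute their privileges to queries and DML, but the authorization to create an object comes from the primary role only. A short illustration:

    use role role_a;          -- primary role
    use secondary roles all;  -- Role B's privileges now apply to reads

    -- Succeeds: SELECT can draw on the secondary role's privileges
    select * from db2.schema2.table2;

    -- Fails: CREATE VIEW in schema2 requires the primary role to hold that privilege
    create view db2.schema2.v2 as select * from db1.schema1.table1;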

Question 23

When activating Tri-Secret Secure in a hierarchical encryption model in a Snowflake account, at what level is the customer-managed key used?

Options:

A.  

At the root level (HSM)

B.  

At the account level (AMK)

C.  

At the table level (TMK)

D.  

At the micro-partition level

Question 24

What is a characteristic of Role-Based Access Control (RBAC) as used in Snowflake?

Options:

A.  

Privileges can be granted at the database level and can be inherited by all underlying objects.

B.  

A user can use a "super-user" access along with securityadmin to bypass authorization checks and access all databases, schemas, and underlying objects.

C.  

A user can create managed access schemas to support future grants and ensure only schema owners can grant privileges to other roles.

D.  

A user can create managed access schemas to support current and future grants and ensure only object owners can grant privileges to other roles.

Question 25

A data platform team creates two multi-cluster virtual warehouses with the AUTO_SUSPEND value set to NULL on one and '0' on the other. What will be the execution behavior of these virtual warehouses?

Options:

A.  

Setting a '0' or NULL value means the warehouses will never suspend.

B.  

Setting a '0' or NULL value means the warehouses will suspend immediately.

C.  

Setting a '0' or NULL value means the warehouses will suspend after the default of 600 seconds.

D.  

Setting a '0' value means the warehouses will suspend immediately, and NULL means the warehouses will never suspend.
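
Both settings can be applied with ALTER WAREHOUSE, and both disable auto-suspension:

    alter warehouse my_wh set auto_suspend = 0;     -- never auto-suspends
    alter warehouse my_wh set auto_suspend = null;  -- also never auto-suspends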

Question 26

A Snowflake Architect is designing an application and tenancy strategy for an organization where strong legal isolation rules as well as multi-tenancy are requirements.

Which approach will meet these requirements if Role-Based Access Control (RBAC) is a viable option for isolating tenants?

Options:

A.  

Create accounts for each tenant in the Snowflake organization.

B.  

Create an object for each tenant strategy if row level security is viable for isolating tenants.

C.  

Create an object for each tenant strategy if row level security is not viable for isolating tenants.

D.  

Create a multi-tenant table strategy if row level security is not viable for isolating tenants.

Question 27

A retailer's enterprise data organization is exploring the use of Data Vault 2.0 to model its data lake solution. A Snowflake Architect has been asked to provide recommendations for using Data Vault 2.0 on Snowflake.

What should the Architect tell the data organization? (Select TWO).

Options:

A.  

Change data capture can be performed using the Data Vault 2.0 HASH_DIFF concept.

B.  

Change data capture can be performed using the Data Vault 2.0 HASH_DELTA concept.

C.  

Using the multi-table insert feature in Snowflake, multiple Point-in-Time (PIT) tables can be loaded in parallel from a single join query from the data vault.

D.  

Using the multi-table insert feature, multiple Point-in-Time (PIT) tables can be loaded sequentially from a single join query from the data vault.

E.  

There are performance challenges when using Snowflake to load multiple Point-in-Time (PIT) tables in parallel from a single join query from the data vault.
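
A hedged sketch of loading two PIT tables in parallel with a single multi-table insert over the data vault (all object names hypothetical):

    insert all
        into pit_customer (hub_hashkey, snapshot_ts) values (hk_customer, snap_ts)
        into pit_order    (hub_hashkey, snapshot_ts) values (hk_order, snap_ts)
    select l.hk_customer, l.hk_order, current_timestamp() as snap_ts
    from dv.link_customer_order l;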

Question 28

An Architect runs the following SQL query:

How can this query be interpreted?

Options:

A.  

FILEROWS is a stage. FILE_ROW_NUMBER is the line number in the file.

B.  

FILEROWS is the table. FILE_ROW_NUMBER is the line number in the table.

C.  

FILEROWS is a file. FILE_ROW_NUMBER is the file format location.

D.  

FILEROWS is the file format location. FILE_ROW_NUMBER is a stage.

Question 29

What are some of the characteristics of result set caches? (Choose three.)

Options:

A.  

Time Travel queries can be executed against the result set cache.

B.  

Snowflake persists the data results for 24 hours.

C.  

Each time persisted results for a query are used, a 24-hour retention period is reset.

D.  

The data stored in the result cache will contribute to storage costs.

E.  

The retention period can be reset for a maximum of 31 days.

F.  

The result set cache is not shared between warehouses.

Question 30

What built-in Snowflake features make use of the change tracking metadata for a table? (Choose two.)

Options:

A.  

The MERGE command

B.  

The UPSERT command

C.  

The CHANGES clause

D.  

A STREAM object

E.  

The CHANGE_DATA_CAPTURE command
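
Both features read the same change tracking metadata. A brief illustration (table names hypothetical; the CHANGES clause requires CHANGE_TRACKING = TRUE on the table):

    -- A stream records DML changes for downstream consumption, often via MERGE:
    create or replace stream orders_stream on table orders;

    -- The CHANGES clause queries change tracking metadata directly:
    select *
    from orders
      changes(information => default)
      at(timestamp => dateadd('hour', -1, current_timestamp()));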

Question 31

Which technique will efficiently ingest and consume semi-structured data for Snowflake data lake workloads?

Options:

A.  

IDEF1X

B.  

Schema-on-write

C.  

Schema-on-read

D.  

Information schema
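
Schema-on-read in Snowflake typically means landing the raw documents in a VARIANT column and applying structure at query time. A minimal sketch (stage and attribute names hypothetical):

    create or replace table raw_events (v variant);

    copy into raw_events
    from @iot_stage
    file_format = (type = json);

    -- Structure is imposed when the data is queried, not when it is loaded:
    select v:device_id::string    as device_id,
           f.value:reading::float as reading
    from raw_events,
         lateral flatten(input => v:readings) f;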

Question 32

A company is using Snowflake in Azure in the Netherlands. The company's analyst team also has data in JSON format, stored in an Amazon S3 bucket in the AWS Singapore region, that it wants to analyze.

The Architect has been given the following requirements:

1. Provide access to frequently changing data

2. Keep egress costs to a minimum

3. Maintain low latency

How can these requirements be met with the LEAST amount of operational overhead?

Options:

A.  

Use a materialized view on top of an external table against the S3 bucket in AWS Singapore.

B.  

Use an external table against the S3 bucket in AWS Singapore and copy the data into transient tables.

C.  

Copy the data between providers from S3 to Azure Blob storage to collocate, then use Snowpipe for data ingestion.

D.  

Use AWS Transfer Family to replicate data between the S3 bucket in AWS Singapore and an Azure Netherlands Blob storage, then use an external table against the Blob storage.
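
A hedged sketch of option A (stage and attribute names hypothetical); the materialized view keeps query results local to the Snowflake region while the external table tracks the changing S3 data:

    create or replace external table analytics.ext_events
      with location = @s3_sg_stage/events/
      auto_refresh = true
      file_format = (type = json);

    create materialized view analytics.mv_events as
      select value:device_id::string as device_id,
             value:reading::float    as reading
      from analytics.ext_events;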

Question 33

A company is storing large numbers of small JSON files (ranging from 1-4 bytes) that are received from IoT devices and sent to a cloud provider. In any given hour, 100,000 files are added to the cloud provider.

What is the MOST cost-effective way to bring this data into a Snowflake table?

Options:

A.  

An external table

B.  

A pipe

C.  

A stream

D.  

A copy command at regular intervals

Question 34

Which command will create a schema without Fail-safe and will restrict object owners from passing on access to other users?

Options:

A.  

create schema EDW.ACCOUNTING WITH MANAGED ACCESS;

B.  

create schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

C.  

create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 1;

D.  

create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

Question 35

What transformations are supported in the below SQL statement? (Select THREE).

CREATE PIPE ... AS COPY ... FROM (...)

Options:

A.  

Data can be filtered by an optional where clause.

B.  

Columns can be reordered.

C.  

Columns can be omitted.

D.  

Type casts are supported.

E.  

Incoming data can be joined with other tables.

F.  

The ON_ERROR = ABORT_STATEMENT copy option can be used.
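
A brief sketch of the supported transformations inside a pipe definition (object names hypothetical):

    create pipe demo_pipe as
    copy into target_table (id, city, name)
    from (
        select $1::number,  -- casts are supported
               $3,          -- columns can be reordered...
               $2           -- ...and any remaining file columns are simply omitted
        from @landing_stage
    )
    file_format = (type = csv);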
