SnowPro Advanced: Architect Recertification Exam
Last Update May 1, 2024
Total Questions: 162
Which of the following are characteristics of Snowflake’s parameter hierarchy?
A.
Session parameters override virtual warehouse parameters.
B.
Virtual warehouse parameters override user parameters.
C.
Table parameters override virtual warehouse parameters.
D.
Schema parameters override account parameters.
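Parameter precedence can be observed directly: a session-level setting takes precedence over the account-level value for the same parameter. A minimal sketch, assuming a role with ALTER ACCOUNT privileges:

```sql
-- Account-level default
ALTER ACCOUNT SET TIMEZONE = 'UTC';

-- Session-level override takes precedence for this session only
ALTER SESSION SET TIMEZONE = 'America/New_York';

-- The "level" column reports SESSION, confirming the override
SHOW PARAMETERS LIKE 'TIMEZONE' IN SESSION;
```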
A Snowflake Architect is designing a multi-tenant application strategy for an organization in the Snowflake Data Cloud and is considering using an Account Per Tenant strategy.
Which requirements will be addressed with this approach? (Choose two.)
A.
There needs to be fewer objects per tenant.
B.
Security and Role-Based Access Control (RBAC) policies must be simple to configure.
C.
Compute costs must be optimized.
D.
Tenant data shape may be unique per tenant.
E.
Storage costs must be optimized.
The Account Per Tenant strategy involves creating separate Snowflake accounts for each tenant within the multi-tenant application. This approach offers a number of advantages.
Option B: With separate accounts, each tenant's environment is isolated, making security and RBAC policies simpler to configure and maintain. This is because each account can have its own set of roles and privileges without the risk of cross-tenant access or the complexity of maintaining a highly granular permission model within a shared environment.
Option D: This approach also allows for each tenant to have a unique data shape, meaning that the database schema can be tailored to the specific needs of each tenant without affecting others. This can be essential when tenants have different data models, usage patterns, or application customizations.
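Under an Account Per Tenant strategy, provisioning is typically automated at the organization level. A minimal sketch, assuming ORGADMIN privileges (the tenant name, credentials, and edition are illustrative):

```sql
USE ROLE ORGADMIN;

-- One dedicated account per tenant; each gets its own isolated RBAC model
CREATE ACCOUNT tenant_acme
  ADMIN_NAME = acme_admin
  ADMIN_PASSWORD = 'REPLACE_WITH_STRONG_PASSWORD'
  EMAIL = 'admin@acme.example'
  EDITION = ENTERPRISE;
```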
A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.
Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)
A.
Use, at minimum, the Business Critical edition of Snowflake.
B.
Create Dynamic Data Masking policies and apply them to columns that contain PHI.
C.
Use the Internal Tokenization feature to obfuscate sensitive data.
D.
Use the External Tokenization feature to obfuscate sensitive data.
E.
Rewrite SQL queries to eliminate projections of PHI data based on current_role().
F.
Avoid sharing data with partner organizations.
References: Snowflake Security &amp; Compliance Reports; Snowflake Editions; Dynamic Data Masking; External Tokenization; Secure Data Sharing
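A Dynamic Data Masking policy (option B) can be sketched as follows; the role name, table, and column are illustrative assumptions:

```sql
-- Reveal PHI only to an authorized role; mask it for everyone else
CREATE MASKING POLICY phi_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PHI_READER') THEN val
    ELSE '***MASKED***'
  END;

-- Attach the policy to a column containing PHI
ALTER TABLE patients MODIFY COLUMN diagnosis
  SET MASKING POLICY phi_mask;
```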
A company is trying to ingest 10 TB of CSV data into a Snowflake table using Snowpipe as part of its migration from a legacy database platform. The records need to be ingested in the MOST performant and cost-effective way.
How can these requirements be met?
A.
Use ON_ERROR = CONTINUE in the COPY INTO command.
B.
Use PURGE = TRUE in the COPY INTO command.
C.
Use PURGE = FALSE in the COPY INTO command.
D.
Use ON_ERROR = SKIP_FILE in the COPY INTO command.
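For a one-time bulk migration of this size, PURGE = TRUE removes successfully loaded files from the stage, avoiding ongoing storage charges and repeated re-scanning of already-loaded files. A sketch (the table, stage path, and file format are illustrative):

```sql
COPY INTO legacy_data
  FROM @migration_stage/csv/
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
  PURGE = TRUE;  -- delete staged files after a successful load
```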
What integration object should be used to place restrictions on where data may be exported?
A.
Stage integration
B.
Security integration
C.
Storage integration
D.
API integration
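A storage integration is the object that constrains where data may be read from or written to, via its allowed and blocked locations. A sketch with illustrative bucket names and role ARN:

```sql
CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_access'
  ENABLED = TRUE
  -- Unload/export is only permitted to these locations
  STORAGE_ALLOWED_LOCATIONS = ('s3://approved-bucket/exports/')
  STORAGE_BLOCKED_LOCATIONS = ('s3://approved-bucket/exports/restricted/');
```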
The Business Intelligence team reports that when some team members run queries for their dashboards in parallel with others, the query response time gets significantly slower. What can a Snowflake Architect do to identify what is occurring and troubleshoot this issue?
(Answer choices A through D were presented as images and are not reproduced here.)
A.
Option A
B.
Option B
C.
Option C
D.
Option D
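Concurrency slowdowns of this kind usually show up as warehouse queuing. One way to investigate is to query the ACCOUNT_USAGE.QUERY_HISTORY view for queued time on the BI warehouse (the warehouse name and time window are illustrative):

```sql
SELECT query_id,
       warehouse_name,
       queued_overload_time,   -- ms spent queued because the warehouse was overloaded
       total_elapsed_time
FROM snowflake.account_usage.query_history
WHERE warehouse_name = 'BI_WH'
  AND start_time >= DATEADD('hour', -24, CURRENT_TIMESTAMP())
ORDER BY queued_overload_time DESC
LIMIT 20;
```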
A company has an external vendor who puts data into Google Cloud Storage. The company's Snowflake account is set up in Azure.
What would be the MOST efficient way to load data from the vendor into Snowflake?
A.
Ask the vendor to create a Snowflake account, load the data into Snowflake and create a data share.
B.
Create an external stage on Google Cloud Storage and use the external table to load the data into Snowflake.
C.
Copy the data from Google Cloud Storage to Azure Blob storage using external tools and load data from Blob storage to Snowflake.
D.
Create a Snowflake Account in the Google Cloud Platform (GCP), ingest data into this account and use data replication to move the data from GCP to Azure.
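Snowflake accounts can read directly from cloud storage hosted on a different provider, so an external stage over the vendor's GCS bucket avoids intermediate copies. A sketch with illustrative names:

```sql
-- Integration granting the Azure-hosted account access to the vendor's GCS bucket
CREATE STORAGE INTEGRATION gcs_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'GCS'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://vendor-bucket/');

CREATE STAGE vendor_stage
  URL = 'gcs://vendor-bucket/'
  STORAGE_INTEGRATION = gcs_int;

COPY INTO vendor_data FROM @vendor_stage FILE_FORMAT = (TYPE = CSV);
```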
A company’s client application supports multiple authentication methods and uses Okta.
What is the best practice recommendation for the order of priority when applications authenticate to Snowflake?
A.
1) OAuth (either Snowflake OAuth or External OAuth)
2) External browser
3) Okta native authentication
4) Key Pair Authentication, mostly used for service account users
5) Password
B.
1) External browser, SSO
2) Key Pair Authentication, mostly used for development environment users
3) Okta native authentication
4) OAuth (either Snowflake OAuth or External OAuth)
5) Password
C.
1) Okta native authentication
2) Key Pair Authentication, mostly used for production environment users
3) Password
4) OAuth (either Snowflake OAuth or External OAuth)
5) External browser, SSO
D.
1) Password
2) Key Pair Authentication, mostly used for production environment users
3) Okta native authentication
4) OAuth (either Snowflake OAuth or External OAuth)
5) External browser, SSO
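Key pair authentication, recommended mainly for service accounts, is configured by attaching a public key to the user. A sketch (the user name and key value are illustrative placeholders):

```sql
-- Register the service user's RSA public key (PEM body without header/footer lines)
ALTER USER svc_loader SET RSA_PUBLIC_KEY = 'MIIBIjANBgkqhkiG...';

-- Verify the key fingerprint was recorded
DESC USER svc_loader;
```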
An Architect needs to meet a company requirement to ingest files from the company's AWS storage accounts into the company's Snowflake Google Cloud Platform (GCP) account.
How can the ingestion of these files into the company's Snowflake account be initiated? (Select TWO).
A.
Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.
B.
Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 Glacier storage.
C.
Create an AWS Lambda function to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.
D.
Configure AWS Simple Notification Service (SNS) to notify Snowpipe when new files have arrived in Amazon S3 storage.
E.
Configure the client application to issue a COPY INTO