This set of 30 MCQs focuses on advanced Blob Storage features in Azure. Topics include lifecycle management for automatic tiering, configuring CORS (Cross-Origin Resource Sharing), and leveraging Azure Data Lake Storage Gen2 for analytics.
Lifecycle Management for Automatic Tiering
What is the primary purpose of Azure Blob Storage lifecycle management? A) To provide encryption for stored data B) To automate data tiering and management C) To increase storage capacity D) To back up data regularly
Which tier is used for data that is infrequently accessed in Azure Blob Storage? A) Hot B) Cool C) Archive D) Premium
What action does Azure lifecycle management support for blobs? A) Automatically deleting blobs after a certain period B) Moving blobs between storage accounts C) Archiving blobs based on last access time D) Converting blobs to files
What is the primary function of the Archive tier in Blob Storage? A) To store blobs that need to be frequently accessed B) To store blobs that are infrequently accessed C) To store blobs that are rarely accessed and have low cost D) To store blobs for real-time analytics
What happens when a blob is moved to the “cool” tier? A) It is archived and cannot be accessed B) It is automatically deleted after 30 days C) It is stored with a lower cost but slower access times D) It is indexed for search optimization
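For reference, the tiering behavior the questions above describe can be automated with a lifecycle management policy. The sketch below is a minimal example using the azure-mgmt-storage Python SDK; the subscription, resource group, and account names are placeholders, and authentication is assumed to work via DefaultAzureCredential.

```python
# Minimal sketch: create a lifecycle management policy that tiers and deletes blobs by age.
# All <...> values are placeholders; DefaultAzureCredential is assumed to resolve an identity.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "<resource-group>"     # placeholder
ACCOUNT_NAME = "<storage-account>"      # placeholder

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# One rule: block blobs under "logs/" move to Cool after 30 days without modification,
# to Archive after 90 days, and are deleted after 365 days.
policy = {
    "policy": {
        "rules": [
            {
                "enabled": True,
                "name": "age-out-logs",
                "type": "Lifecycle",
                "definition": {
                    "filters": {"blobTypes": ["blockBlob"], "prefixMatch": ["logs/"]},
                    "actions": {
                        "baseBlob": {
                            "tierToCool": {"daysAfterModificationGreaterThan": 30},
                            "tierToArchive": {"daysAfterModificationGreaterThan": 90},
                            "delete": {"daysAfterModificationGreaterThan": 365},
                        }
                    },
                },
            }
        ]
    }
}

# A storage account has at most one lifecycle policy, and its name must be "default".
client.management_policies.create_or_update(RESOURCE_GROUP, ACCOUNT_NAME, "default", policy)
```

The same rules can also be authored as raw JSON in the Azure portal or applied with Azure CLI or PowerShell.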
Configuring CORS (Cross-Origin Resource Sharing)
What does CORS stand for in Azure Blob Storage? A) Cloud Origin Request Service B) Cloud Operations Resource Service C) Cross-Origin Resource Sharing D) Cross-Origin Request Service
What is the primary use of CORS in Azure Blob Storage? A) To improve data security B) To allow cross-origin requests from specific domains C) To encrypt data during transmission D) To monitor storage account activity
Which HTTP method is NOT typically configured in a CORS rule? A) GET B) PUT C) DELETE D) POST
How can you configure CORS in Azure Blob Storage? A) Using Azure CLI or PowerShell B) Only through the Azure Portal C) By modifying the storage account’s primary keys D) Through a third-party application
Which of the following is required to enable CORS for an Azure Blob Storage account? A) Specify allowed HTTP methods B) Enable multi-region replication C) Define a custom domain name D) Create a virtual network
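For reference, the sketch below shows one way to add a CORS rule with the azure-storage-blob Python SDK, covering the allowed origins, methods, headers, and max-age settings the questions above mention. The account URL and allowed origin are assumed placeholders.

```python
# Minimal sketch: configure a CORS rule on the Blob service.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient, CorsRule

client = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

# Allow GET and PUT requests from a single web origin; browsers may cache the
# preflight response for up to one hour (max_age_in_seconds).
rule = CorsRule(
    allowed_origins=["https://www.contoso.com"],          # assumed example origin
    allowed_methods=["GET", "PUT"],
    allowed_headers=["x-ms-blob-type", "content-type"],
    exposed_headers=["x-ms-request-id"],
    max_age_in_seconds=3600,
)

# CORS rules are part of the Blob service properties; this call replaces any existing rules.
client.set_service_properties(cors=[rule])
```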
Using Azure Data Lake Storage Gen2 for Analytics
What feature does Azure Data Lake Storage Gen2 offer that makes it suitable for big data analytics? A) High availability B) Native integration with Hadoop and Spark C) Built-in encryption D) Reduced storage costs
How does Azure Data Lake Storage Gen2 differ from regular Blob Storage? A) It supports hierarchical namespace B) It offers lower storage costs C) It is only for video storage D) It can only be used with Azure SQL
What is the key benefit of using the hierarchical namespace in Azure Data Lake Storage Gen2? A) Faster retrieval of data from the archive B) Easier file management and organization C) Better support for blob versioning D) Improved encryption techniques
Which Azure service can you integrate with Azure Data Lake Storage Gen2 to process large datasets? A) Azure Cosmos DB B) Azure Databricks C) Azure Logic Apps D) Azure Kubernetes Service
What is the main advantage of using Data Lake Storage Gen2 for analytics? A) It is optimized for low-latency file access B) It provides high throughput for streaming data C) It allows for complex analytics workloads at scale D) It supports only basic querying features
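For reference, the sketch below uses the azure-storage-file-datalake Python SDK to create a directory and a file through the hierarchical namespace; the account URL, container name, and paths are assumed placeholders.

```python
# Minimal sketch: use the hierarchical namespace in Data Lake Storage Gen2.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

# With the hierarchical namespace enabled, directories are first-class objects,
# so analytics engines can list or rename a folder as a single operation.
fs = service.get_file_system_client("analytics")       # assumed container name
directory = fs.create_directory("raw/2024/05")          # a real directory, not just a name prefix
file_client = directory.create_file("events.csv")
file_client.upload_data(b"id,value\n1,42\n", overwrite=True)
```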
Lifecycle Management for Automatic Tiering (Continued)
Which of the following is true about lifecycle management in Azure Blob Storage? A) It is free of charge B) It requires manual intervention for tier changes C) It automates the movement of blobs between tiers D) It only supports tier changes for large blobs
What is a key benefit of automatic tiering in Blob Storage? A) It allows for more granular access control B) It reduces costs by automatically managing the data lifecycle C) It enhances data encryption automatically D) It boosts performance by moving data to premium storage
Which Azure Blob Storage tier is designed for rarely accessed data but requires quick retrieval? A) Hot B) Cool C) Archive D) Premium
How can you automate the transition of blobs between storage tiers? A) Use Azure Data Factory B) Use lifecycle management policies C) Use PowerShell scripting D) Use Azure Monitor
Which of the following is NOT a valid lifecycle management action in Azure Blob Storage? A) Move data to another subscription B) Delete data after a certain retention period C) Archive data for long-term retention D) Convert data to another format
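For reference, lifecycle rules can also key off last access time rather than last modification time. The sketch below uses the same placeholder names as the earlier lifecycle example and assumes last-access-time tracking is already enabled on the storage account.

```python
# Minimal sketch: lifecycle rule based on last access time (placeholders as before).
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")  # placeholder

policy = {
    "policy": {
        "rules": [{
            "enabled": True,
            "name": "cool-idle-blobs",
            "type": "Lifecycle",
            "definition": {
                "filters": {"blobTypes": ["blockBlob"]},
                "actions": {
                    "baseBlob": {
                        # Blobs not read for 45 days move to Cool; after a year idle they are deleted.
                        "tierToCool": {"daysAfterLastAccessTimeGreaterThan": 45},
                        "delete": {"daysAfterLastAccessTimeGreaterThan": 365},
                    }
                },
            },
        }]
    }
}

client.management_policies.create_or_update("<resource-group>", "<storage-account>", "default", policy)
```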
Configuring CORS (Cross-Origin Resource Sharing) (Continued)
What is the default behavior of CORS in Azure Blob Storage? A) It is disabled by default B) It is enabled for all domains C) It only allows Azure domain requests D) It is enabled for secure URLs only
When configuring CORS in Azure Blob Storage, which of the following is NOT required? A) Allowed origins B) Allowed methods C) Allowed headers D) Allowed IP addresses
What happens if a CORS request is made from a domain that is not allowed by the configured CORS rules? A) The request is automatically blocked B) The request is logged and allowed C) A CORS error is returned to the client D) The request is redirected to another domain
What does the “max-age” parameter do in a CORS configuration? A) Defines the maximum duration for which CORS responses are cached by browsers B) Sets the maximum age of the storage account C) Sets the expiration time for blobs D) Controls how long CORS rules can be modified
What is a limitation of CORS in Azure Blob Storage? A) It can only be configured via the Azure Portal B) It can’t be used with certain HTTP methods like PUT C) It is limited to only 5 cross-origin domains D) It does not support certain HTTP headers
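For reference, the sketch below reads the Blob service properties back to inspect configured CORS rules. On a new account the list is empty, reflecting that CORS is disabled by default, and a browser request from an origin matching no rule fails its CORS check. The account URL is a placeholder.

```python
# Minimal sketch: inspect the CORS rules currently set on the Blob service.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

client = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),
)

props = client.get_service_properties()
cors_rules = props.get("cors", [])
if not cors_rules:
    print("No CORS rules configured - cross-origin browser requests will be refused.")
for rule in cors_rules:
    print(rule.allowed_origins, rule.allowed_methods, rule.max_age_in_seconds)
```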
Using Azure Data Lake Storage Gen2 for Analytics (Continued)
What permission model does Azure Data Lake Storage Gen2 use? A) Role-based access control (RBAC) B) Access Control List (ACL) C) Shared access signatures (SAS) D) Public access
Which of the following is an example of a use case for Azure Data Lake Storage Gen2? A) Hosting small websites B) Storing large, unstructured datasets for analytics C) Storing transactional data in relational databases D) Backup of SQL databases
How does Azure Data Lake Storage Gen2 help with performance optimization for analytics workloads? A) By providing low-cost storage B) By supporting high-throughput and low-latency reads/writes C) By providing built-in machine learning models D) By automatically compressing stored data
What is the default security setting for Data Lake Storage Gen2? A) Public access enabled B) No encryption C) Azure Active Directory (AAD) authentication D) Anonymous access
How can you process data stored in Azure Data Lake Storage Gen2? A) Using Power BI only B) Using Azure Databricks or HDInsight C) Using Azure Logic Apps D) Only via manual download and processing
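For reference, the sketch below combines Azure Active Directory authentication (via DefaultAzureCredential) with a POSIX-style ACL on a Data Lake Storage Gen2 directory, illustrating the permission model covered above. The account, container, directory, and object ID values are assumed placeholders.

```python
# Minimal sketch: AAD authentication plus POSIX-style ACLs on an ADLS Gen2 directory.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),                           # Azure AD authentication
)

directory = service.get_file_system_client("analytics").get_directory_client("raw/2024")

# Grant a specific AAD object (user or service principal) read+execute on the directory,
# in addition to the owning user and group permissions.
acl = "user::rwx,group::r-x,other::---,user:<object-id>:r-x"  # <object-id> is a placeholder
directory.set_access_control(acl=acl)

print(directory.get_access_control()["acl"])  # read the effective ACL back
```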
Answers
1. B) To automate data tiering and management
2. B) Cool
3. C) Archiving blobs based on last access time
4. C) To store blobs that are rarely accessed and have low cost
5. C) It is stored with a lower cost but slower access times
6. C) Cross-Origin Resource Sharing
7. B) To allow cross-origin requests from specific domains
8. C) DELETE
9. A) Using Azure CLI or PowerShell
10. A) Specify allowed HTTP methods
11. B) Native integration with Hadoop and Spark
12. A) It supports hierarchical namespace
13. B) Easier file management and organization
14. B) Azure Databricks
15. C) It allows for complex analytics workloads at scale
16. C) It automates the movement of blobs between tiers
17. B) It reduces costs by automatically managing the data lifecycle
18. B) Cool
19. B) Use lifecycle management policies
20. A) Move data to another subscription
21. A) It is disabled by default
22. D) Allowed IP addresses
23. C) A CORS error is returned to the client
24. A) Defines the maximum duration for which CORS responses are cached by browsers
25. D) It does not support certain HTTP headers
26. B) Access Control List (ACL)
27. B) Storing large, unstructured datasets for analytics
28. B) By supporting high-throughput and low-latency reads/writes
29. C) Azure Active Directory (AAD) authentication
30. B) Using Azure Databricks or HDInsight