Azure Data Factory (ADF) is a cloud-based data integration service that orchestrates and automates data movement and transformation. This collection of Azure Data Factory MCQs explores key concepts such as pipelines, activities, and triggers. Learn about pipeline design, data movement activities, control flow logic, and scheduling for efficient data integration and processing in ADF.
MCQs
1. Understanding Pipelines, Activities, and Triggers
1. What is the purpose of a pipeline in Azure Data Factory?
a) Storing data
b) Orchestrating data movement and transformation activities
c) Running SQL queries
d) Managing virtual machines

2. Which of the following is an activity in Azure Data Factory?
a) Azure Monitor
b) Data transformation
c) Data replication
d) Data modeling

3. In Azure Data Factory, what is a trigger used for?
a) Automating virtual network configurations
b) Initiating a pipeline based on a specific schedule or event
c) Storing large datasets
d) Monitoring real-time data

4. Pipelines in Azure Data Factory are made up of:
a) Triggers and flows
b) Activities and datasets
c) Virtual machines
d) Key vaults and secrets

5. Which type of trigger in ADF starts a pipeline based on an event?
a) Tumbling window trigger
b) Event-based trigger
c) Schedule-based trigger
d) Time-based trigger

6. An activity in Azure Data Factory represents:
a) A logical unit of work performed within a pipeline
b) A monitoring tool for Azure resources
c) A replication task
d) A data storage unit
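To make the pipeline/activity/trigger concepts above concrete, here is a minimal sketch of a pipeline in ADF's JSON definition format. The names (`CopyBlobData`, `SourceBlobDataset`, `SinkSqlDataset`) are hypothetical; the structure follows ADF's documented pipeline schema, where a pipeline is a logical grouping of activities:

```json
{
  "name": "CopyBlobData",
  "properties": {
    "description": "A pipeline groups one or more activities into a unit of work.",
    "activities": [
      {
        "name": "CopyFromBlob",
        "type": "Copy",
        "inputs": [
          { "referenceName": "SourceBlobDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "SinkSqlDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```

A trigger would be defined as a separate resource that references this pipeline by name, which is why a pipeline can run on a schedule, on an event, or on demand without changing its own definition.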
2. Designing Pipelines Using the ADF Studio
7. The ADF Studio interface is used to:
a) Create and manage pipelines, datasets, and triggers
b) Run SQL queries on Azure databases
c) Manage virtual networks
d) Configure Azure AD policies

8. Which section in ADF Studio helps monitor pipeline runs?
a) Author
b) Monitor
c) Manage
d) Data flow

9. In the pipeline design interface, activities are added to:
a) Linked services
b) Datasets
c) The canvas area
d) Azure Storage

10. What is the first step when creating a pipeline in ADF Studio?
a) Adding a dataset
b) Configuring linked services
c) Defining activities
d) Testing the pipeline

11. ADF Studio allows debugging pipelines by:
a) Running the pipeline with sample data
b) Editing the source code
c) Creating multiple triggers
d) Exporting the pipeline

12. Which feature in ADF Studio enables reusing pipeline configurations?
a) Linked templates
b) Parameters
c) Debugging tools
d) Activity logs
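Parameters are what make a pipeline configuration reusable across runs. A hypothetical sketch in ADF's JSON format (the names `ParameterizedCopy`, `sourceFolder`, and the datasets are illustrative, not from any real factory):

```json
{
  "name": "ParameterizedCopy",
  "properties": {
    "parameters": {
      "sourceFolder": { "type": "string", "defaultValue": "input" }
    },
    "activities": [
      {
        "name": "CopyFolder",
        "type": "Copy",
        "inputs": [
          {
            "referenceName": "SourceDataset",
            "type": "DatasetReference",
            "parameters": { "folder": "@pipeline().parameters.sourceFolder" }
          }
        ],
        "outputs": [
          { "referenceName": "SinkDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      }
    ]
  }
}
```

At run time (or from "Trigger Now" in ADF Studio), a caller supplies a value for `sourceFolder`, so the same pipeline can process different folders without being edited.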
3. Data Movement Activities
13. The Copy activity in Azure Data Factory is used to:
a) Move and transform data from source to destination
b) Monitor network traffic
c) Replicate Azure VMs
d) Perform SQL transformations

14. What is the purpose of the Lookup activity in ADF?
a) Retrieve a single row or value from a dataset
b) Monitor data pipeline progress
c) Store secrets securely
d) Create a virtual machine

15. Which activity would you use to remove unwanted files from a data lake?
a) Copy activity
b) Delete activity
c) Lookup activity
d) Move activity

16. Data movement activities require:
a) Virtual machine resources
b) Linked services for source and destination configuration
c) Azure Active Directory integration
d) Blob encryption

17. The Copy activity supports:
a) Only SQL-based transformations
b) Data movement across different storage formats
c) Real-time data replication only
d) Monitoring of virtual networks

18. Which is NOT a data movement activity in ADF?
a) Copy
b) Lookup
c) ForEach
d) Delete
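As an illustration of the Lookup activity covered above, here is a hedged sketch of its JSON definition. The names (`LookupConfigRow`, `ConfigDataset`, the query) are hypothetical; the `firstRowOnly` flag is what makes Lookup return a single row for later activities to consume:

```json
{
  "name": "LookupConfigRow",
  "type": "Lookup",
  "typeProperties": {
    "source": {
      "type": "AzureSqlSource",
      "sqlReaderQuery": "SELECT TOP 1 * FROM WatermarkTable"
    },
    "dataset": {
      "referenceName": "ConfigDataset",
      "type": "DatasetReference"
    },
    "firstRowOnly": true
  }
}
```

A downstream activity would then reference the result with an expression such as `@activity('LookupConfigRow').output.firstRow`, which is the usual pattern for passing a lookup value into a Copy or control flow activity.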
4. Control Flow Activities
19. What is the purpose of the ForEach activity in ADF?
a) Iterating through items in a collection
b) Managing linked services
c) Debugging pipeline issues
d) Configuring Azure AD authentication

20. The Until activity in ADF is used for:
a) Looping until a condition is met
b) Scheduling pipelines
c) Managing dataset dependencies
d) Encrypting data pipelines

21. What does the If Condition activity do in ADF?
a) Filters data from a source dataset
b) Performs conditional logic within a pipeline
c) Encrypts pipeline configurations
d) Manages virtual machine backups

22. Control flow activities enable:
a) Orchestrating complex workflows in pipelines
b) Monitoring real-time data
c) Configuring Azure networking
d) Storing large datasets

23. Which is an example of a control flow activity in ADF?
a) Lookup
b) Copy
c) ForEach
d) SQL transformation

24. How do you specify the order of activities in a pipeline?
a) By using activity dependencies
b) By configuring linked services
c) By adding triggers
d) By monitoring logs
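The ForEach and dependency concepts above can be sketched together in ADF's JSON format. In this hypothetical fragment, a ForEach activity runs only after a preceding `GetFileList` activity succeeds (an activity dependency), then iterates over the items that activity produced; the activity names and dataset are illustrative assumptions:

```json
{
  "name": "ProcessEachFile",
  "type": "ForEach",
  "dependsOn": [
    { "activity": "GetFileList", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "items": {
      "value": "@activity('GetFileList').output.childItems",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "DeleteFile",
        "type": "Delete",
        "typeProperties": {
          "dataset": {
            "referenceName": "LakeFileDataset",
            "type": "DatasetReference"
          }
        }
      }
    ]
  }
}
```

The `dependsOn` block is how execution order is specified in a pipeline (question on activity ordering), while `items` shows the expression syntax ForEach uses to receive its collection.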
5. Scheduling and Triggering Pipelines
25. Which trigger type schedules pipelines to run at regular intervals?
a) Event-based trigger
b) Tumbling window trigger
c) Schedule-based trigger
d) Activity-based trigger

26. Pipelines can be triggered manually by:
a) Clicking “Trigger Now” in ADF Studio
b) Creating a service principal
c) Deleting existing triggers
d) Running PowerShell scripts

27. Tumbling window triggers are useful for:
a) Real-time data processing
b) Scheduling non-overlapping pipeline runs
c) Debugging pipelines
d) Running event-based pipelines

28. Event-based triggers respond to:
a) Data ingestion events like file uploads
b) Scheduled intervals
c) Looping conditions
d) Network traffic changes

29. What is required to use triggers effectively in ADF?
a) A configured Azure Function
b) A defined pipeline and linked services
c) An encrypted blob storage account
d) Azure Policy integration

30. Triggers can be monitored in which ADF Studio section?
a) Author
b) Monitor
c) Manage
d) Data flow
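A schedule trigger from this section can be sketched in ADF's JSON format as follows. The trigger and pipeline names are hypothetical; the structure shows how a trigger is a separate resource that references an existing pipeline, which is why a defined pipeline is a prerequisite for using triggers:

```json
{
  "name": "DailySchedule",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2024-01-01T00:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "MyDailyPipeline",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```

An event-based trigger would use `"type": "BlobEventsTrigger"` with blob path filters instead of a `recurrence` block, and a tumbling window trigger (`"type": "TumblingWindowTrigger"`) adds a fixed, non-overlapping window size to each run.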
Answers
1. b) Orchestrating data movement and transformation activities
2. b) Data transformation
3. b) Initiating a pipeline based on a specific schedule or event
4. b) Activities and datasets
5. b) Event-based trigger
6. a) A logical unit of work performed within a pipeline
7. a) Create and manage pipelines, datasets, and triggers
8. b) Monitor
9. c) The canvas area
10. b) Configuring linked services
11. a) Running the pipeline with sample data
12. b) Parameters
13. a) Move and transform data from source to destination
14. a) Retrieve a single row or value from a dataset
15. b) Delete activity
16. b) Linked services for source and destination configuration