Implementing an Azure Data Solution (DP-200)

In this course, students will implement various data platform technologies into solutions that align with business and technical requirements, covering on-premises, cloud, and hybrid data scenarios that incorporate both relational and NoSQL data. They will also learn how to process data using a range of technologies and languages, for both streaming and batch data.

Students will also explore how to implement data security, including authentication, authorization, and data policies and standards. They will define and implement monitoring for both data storage and data processing activities. Finally, they will manage and troubleshoot Azure data solutions, including the optimization and disaster recovery of big data, batch processing, and streaming data solutions.


Ramp-up to prepare for exam DP-200 ‘Implementing an Azure Data Solution’.


The primary audience for this course is data professionals, data architects, and business intelligence professionals who want to learn about the data platform technologies that exist on Microsoft Azure. The secondary audience is individuals who develop applications that deliver content from those same data platform technologies.


Microsoft Azure Fundamentals (AZ-900) or comparable knowledge.
In addition to their professional experience, students who take this training should have technical knowledge equivalent to the AZ-900 Microsoft Azure Fundamentals course.


    • Module 1: Azure for the Data Engineer
      • Explain the evolving world of data
      • Survey the services in the Azure Data Platform
      • Identify the tasks that are performed by a Data Engineer
      • Describe the use cases for the cloud in a Case Study
    • Module 2: Working with Data Storage
      • Choose a data storage approach in Azure
      • Create an Azure Storage Account
      • Explain Azure Data Lake storage
      • Upload data into Azure Data Lake
    • Module 3: Enabling Team Based Data Science with Azure Databricks
      • Explain Azure Databricks and Machine Learning Platforms
      • Describe the Team Data Science Process
      • Provision Azure Databricks and workspaces
      • Perform data preparation tasks
    • Module 4: Building Globally Distributed Databases with Cosmos DB
      • Create an Azure Cosmos DB database built to scale
      • Insert and query data in your Azure Cosmos DB database
      • Provision a .NET Core app for Cosmos DB in Visual Studio Code
      • Distribute your data globally with Azure Cosmos DB
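To give a flavor of the query work in this module, a Cosmos DB SQL API query might look like the sketch below. The `customers` container and its properties are illustrative examples, not taken from the course materials.

```sql
-- Illustrative Cosmos DB SQL API query; the "customers" container
-- and its property names are hypothetical.
SELECT c.id, c.name, c.address.city
FROM customers c
WHERE c.address.country = 'NL'
ORDER BY c.name
```

Note the dotted property paths: Cosmos DB queries address nested JSON properties directly, without joins across documents.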
    • Module 5: Working with Relational Data Stores in the Cloud
      • Explain Azure SQL Database and SQL Data Warehouse
      • Provision an Azure SQL database to store data
      • Provision and load data into Azure SQL Data Warehouse
    • Module 6: Performing Real-Time Analytics with Stream Analytics
      • Explain data streams and event processing
      • Query streaming data using Stream Analytics
      • Process data with Azure Blob storage and Stream Analytics
      • Process data with Event Hubs and Stream Analytics
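The queries covered in this module use the Stream Analytics query language, a SQL-like dialect with windowing functions. A minimal sketch is shown below; the input alias `telemetry`, the output alias `avgoutput`, and the event fields are hypothetical, not from the course.

```sql
-- Illustrative Stream Analytics query; input "telemetry" and
-- output "avgoutput" are hypothetical job aliases.
SELECT
    deviceId,
    AVG(temperature) AS avgTemperature
INTO avgoutput
FROM telemetry TIMESTAMP BY eventTime
GROUP BY deviceId, TumblingWindow(second, 30)
```

`TIMESTAMP BY` tells the job to use the event's own timestamp rather than arrival time, and `TumblingWindow(second, 30)` aggregates events into non-overlapping 30-second windows.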
    • Module 7: Orchestrating Data Movement with Azure Data Factory
      • Explain how Azure Data Factory works
      • Create linked services and datasets
      • Create pipelines and activities
      • Configure Azure Data Factory pipeline execution and triggers
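As a sketch of how the pipelines and activities in this module fit together, a minimal Data Factory pipeline definition with a single Copy activity could look like the JSON below. The pipeline and dataset names are hypothetical placeholders; the referenced datasets and their linked services would be defined separately.

```json
{
  "name": "CopyBlobToSqlPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlobToSql",
        "type": "Copy",
        "inputs": [
          { "referenceName": "BlobInputDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "SqlOutputDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      }
    ]
  }
}
```

The separation shown here mirrors the module's structure: linked services hold connection information, datasets describe the data's shape and location, and the pipeline's activities reference datasets rather than connections directly.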
    • Module 8: Securing Azure Data Platforms
      • Configure network security
      • Configure authentication
      • Configure authorization
      • Audit security
    • Module 9: Monitoring and Troubleshooting Data Storage and Processing
      • Apply a data engineering troubleshooting approach
      • Use Azure monitoring capabilities
      • Troubleshoot common data storage issues
      • Troubleshoot common data processing issues
    • Module 10: Integrating and Optimizing Data Platforms
      • Integrate data platforms
      • Optimize data stores
      • Optimize streaming data
      • Manage disaster recovery


