Overview
Level up with Microsoft Certified: Azure Data Engineer Associate.
In this DP-203T00 Data Engineering on Microsoft Azure course, students will learn about data engineering patterns and practices as they pertain to working with batch and real-time analytical solutions using Azure data platform technologies. Students will begin by understanding the core compute and storage technologies that are used to build an analytical solution.
They will then explore how to design analytical serving layers and focus on data engineering considerations for working with source files. Students will learn how to interactively explore data stored in files in a data lake, and the various ingestion techniques that can be used to load data using the Apache Spark capability found in Azure Synapse Analytics or Azure Databricks, or using Azure Data Factory or Azure Synapse pipelines.
Students will also learn the various ways they can transform data using the same technologies that are used to ingest it. They will spend time in the course learning how to monitor and analyze the performance of an analytical system so that they can optimize the performance of data loads, or of queries issued against the system.
They will understand the importance of implementing security to ensure that data is protected at rest and in transit. Students will then see how the data in an analytical system can be used to create dashboards or build predictive models in Azure Synapse Analytics.
Get Microsoft Azure certified with Microsoft Malaysia’s Learning Partner of the Year 2023 today.
Skills Covered
- Explore compute and storage options for data engineering workloads in Azure
- Design and Implement the serving layer
- Understand data engineering considerations
- Run interactive queries using serverless SQL pools
- Explore, transform, and load data into the Data Warehouse using Apache Spark
- Perform data Exploration and Transformation in Azure Databricks
- Ingest and load data into the Data Warehouse
- Transform data with Azure Data Factory or Azure Synapse Pipelines
- Integrate Data from Notebooks with Azure Data Factory or Azure Synapse Pipelines
- Optimize Query Performance with Dedicated SQL Pools in Azure Synapse
- Analyze and Optimize Data Warehouse Storage
- Support Hybrid Transactional Analytical Processing (HTAP) with Azure Synapse Link
- Perform end-to-end security with Azure Synapse Analytics
- Perform real-time Stream Processing with Stream Analytics
- Create a Stream Processing Solution with Event Hubs and Azure Databricks
- Build reports using Power BI integration with Azure Synapse Analytics
- Perform Integrated Machine Learning Processes in Azure Synapse Analytics
Who Should Attend
The primary audience for this Azure certification is data professionals, data architects, and business intelligence professionals who want to learn about data engineering and building analytical solutions using data platform technologies that exist on Microsoft Azure. The secondary audience for this course is data analysts and data scientists who work with analytical solutions built on Microsoft Azure.
This Microsoft Official Course prepares students for the Microsoft Certified: Azure Data Engineer Associate certification.
The associated DP-203 exam measures your ability to accomplish the following technical tasks: design and implement data storage; design and develop data processing; design and implement data security; and monitor and optimize data storage and data processing.
Course Curriculum
Prerequisites
Successful students start this course with knowledge of cloud computing and core data concepts, and professional experience with data solutions.
Specifically, by completing:
- AZ-900T00: Microsoft Azure Fundamentals
- DP-900T00: Microsoft Azure Data Fundamentals
Course Modules
Learn about the features and capabilities of Azure Synapse Analytics – a cloud-based platform for big data processing and analysis.
Learning objectives
In this module, you’ll learn how to:
- Identify the business problems that Azure Synapse Analytics addresses.
- Describe core capabilities of Azure Synapse Analytics.
- Determine when to use Azure Synapse Analytics.
Prerequisites
Before completing this module, you should have the following prerequisite knowledge and experience:
- Familiarity with cloud computing concepts and Microsoft Azure.
- Familiarity with fundamental data concepts.
Azure Databricks is a cloud service that provides a scalable platform for data analytics using Apache Spark.
Learning objectives
In this module, you’ll learn how to:
- Provision an Azure Databricks workspace.
- Identify core workloads and personas for Azure Databricks.
- Describe key concepts of an Azure Databricks solution.
Prerequisites
Before starting this module, you should have a fundamental knowledge of data analytics concepts. Consider completing the Azure Data Fundamentals certification before starting this module.
Learn how Azure Data Lake Storage provides a cloud storage service that is highly available, secure, durable, scalable, and redundant and brings new efficiencies to processing big data analytics workloads.
Learning objectives
In this module you will:
- Decide when you should use Azure Data Lake Storage Gen2
- Create an Azure storage account by using the Azure portal
- Compare Azure Data Lake Storage Gen2 and Azure Blob storage
- Explore the stages for processing big data by using Azure Data Lake Store
- List the supported open-source platforms
Prerequisites
None
Explore how Azure Stream Analytics integrates with your applications or Internet of Things (IoT) devices to gain insights with real-time streaming data. Learn how to consume and analyze data streams and derive actionable results.
Learning objectives
In this module, you will:
- Understand data streams.
- Understand event processing.
- Learn about processing events with Azure Stream Analytics.
Prerequisites
Before taking this module, it is recommended that you complete Data Fundamentals.
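The module’s core idea, aggregating a stream of events over fixed time windows, can be sketched in plain Python. This is a conceptual illustration only; in Azure Stream Analytics the same logic is expressed declaratively with a `TumblingWindow` in the query language, and the event data and window size below are invented for the example:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, value) events into fixed, non-overlapping
    windows and count events per window -- the tumbling-window idea
    used in stream processing."""
    counts = defaultdict(int)
    for ts, _value in events:
        # Each event belongs to exactly one window, keyed by its start time.
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)

# Simulated sensor events: (unix_timestamp, reading) -- illustrative data.
events = [(0, 21.5), (3, 21.7), (5, 22.0), (11, 22.4), (14, 21.9)]
print(tumbling_window_counts(events, 5))  # {0: 2, 5: 1, 10: 2}
```

Each event lands in exactly one window, which is what distinguishes a tumbling window from hopping or sliding windows, where windows overlap.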
With Azure Synapse serverless SQL pool, you can leverage your SQL skills to explore and analyze data in files, without the need to load the data into a relational database.
Learning objectives
After the completion of this module, you will be able to:
- Identify capabilities and use cases for serverless SQL pools in Azure Synapse Analytics
- Query CSV, JSON, and Parquet files using a serverless SQL pool
- Create external database objects in a serverless SQL pool
Prerequisites
Consider completing the Explore data analytics in Azure and Get started querying with Transact-SQL learning paths before starting this module. You will need knowledge of:
- Analytical data workloads in Microsoft Azure
- Querying data with Transact-SQL
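The appeal of a serverless SQL pool is querying files where they sit, without loading them into a database first. As a rough analogy in stdlib Python (the CSV contents and column names below are made up for illustration; the real course uses T-SQL with `OPENROWSET` over files in a data lake):

```python
import csv
import io

# An in-memory CSV standing in for a file in a data lake (illustrative data).
raw = """product,category,price
keyboard,peripherals,49.90
monitor,displays,199.00
mouse,peripherals,19.90
"""

# Roughly: SELECT product, price FROM <file> WHERE category = 'peripherals'
rows = csv.DictReader(io.StringIO(raw))
peripherals = [(r["product"], float(r["price"]))
               for r in rows if r["category"] == "peripherals"]
print(peripherals)  # [('keyboard', 49.9), ('mouse', 19.9)]
```

The point of the sketch: the "table" is just a file, read and filtered on demand, which is the mental model behind querying CSV, JSON, and Parquet files with a serverless SQL pool.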
Why choose between working with files in a data lake or a relational database schema? With lake databases in Azure Synapse Analytics, you can combine the benefits of both.
Learning objectives
After completing this module, you will be able to:
- Understand lake database concepts and components
- Describe database templates in Azure Synapse Analytics
- Create a lake database
Prerequisites
Consider completing the Explore data analytics in Azure and Get started querying with Transact-SQL learning paths before starting this module. You will need knowledge of:
- Analytical data workloads in Microsoft Azure
- Querying data with Transact-SQL
Learn how you can set up security when using Azure Synapse serverless SQL pools.
Learning objectives
After the completion of this module, you will be able to:
- Choose an authentication method in Azure Synapse serverless SQL pools
- Manage users in Azure Synapse serverless SQL pools
- Manage user permissions in Azure Synapse serverless SQL pools
Prerequisites
- It is recommended that students have completed Data Fundamentals before starting this learning path.
Azure Databricks is built on Apache Spark and enables data engineers and analysts to run Spark jobs to transform, analyze and visualize data at scale.
Learning objectives
In this module, you’ll learn how to:
- Describe key elements of the Apache Spark architecture.
- Create and configure a Spark cluster.
- Describe use cases for Spark.
- Use Spark to process and analyze data stored in files.
- Use Spark to visualize data.
Prerequisites
Before starting this module, you should have a basic knowledge of Azure Databricks. Consider completing the previous modules in the Data Engineering with Azure Databricks learning path before this one.
Delta Lake is an open-source storage layer for Spark that you can use to implement a data lakehouse architecture in Azure Databricks.
Learning objectives
In this module, you’ll learn how to:
- Describe core features and capabilities of Delta Lake.
- Create and use Delta Lake tables in Azure Databricks.
- Create Spark catalog tables for Delta Lake data.
- Use Delta Lake tables for streaming data.
Prerequisites
Before starting this module, you should have a basic knowledge of Azure Databricks. Consider completing the previous modules in the Data Engineering with Azure Databricks learning path before this one.
Apache Spark is a core technology for large-scale data analytics. Learn how to use Spark in Azure Synapse Analytics to analyze and visualize data in a data lake.
Learning objectives
After completing this module, you will be able to:
- Identify core features and capabilities of Apache Spark.
- Configure a Spark pool in Azure Synapse Analytics.
- Run code to load, analyze, and visualize data in a Spark notebook.
Prerequisites
If you are not already familiar with Azure Synapse Analytics, consider completing the Introduction to Azure Synapse Analytics module before starting this module.
Learn how to integrate SQL and Apache Spark pools in Azure Synapse Analytics.
Learning objectives
After completing this module, you will be able to:
- Describe the integration methods between SQL and Spark Pools in Azure Synapse Analytics
- Understand the use-cases for SQL and Spark Pools integration
- Authenticate in Azure Synapse Analytics
- Transfer data between SQL and Spark Pool in Azure Synapse Analytics
- Authenticate between Spark and SQL Pool in Azure Synapse Analytics
- Integrate SQL and Spark Pools in Azure Synapse Analytics
- Externalize the use of Spark Pools within Azure Synapse workspace
- Transfer data outside the Synapse workspace using SQL Authentication
- Transfer data outside the Synapse workspace using the PySpark Connector
- Transform data in Apache Spark and write back to SQL Pool in Azure Synapse Analytics
Prerequisites
Before taking this module, it is recommended that you complete the following modules:
- Data Fundamentals
- Introduction to Azure Data Factory
- Introduction to Azure Synapse Analytics
Learn the best practices you need to adopt to load data into a data warehouse in Azure Synapse Analytics.
Learning objectives
In this module, you will:
- Understand data loading design goals
- Explain loading methods into Azure Synapse Analytics
- Manage source data files
- Manage singleton updates
- Set up dedicated data loading accounts
- Manage concurrent access to Azure Synapse Analytics
- Implement Workload Management
- Simplify ingestion with the Copy Activity
Prerequisites
- Before taking this module, it is recommended that you complete Data Fundamentals.
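One of the load-design points above, preferring a single bulk load over many singleton updates, can be illustrated with stdlib `sqlite3` standing in for the warehouse. The table and staged rows are hypothetical; in Azure Synapse the equivalent would be the COPY statement or PolyBase rather than per-row inserts:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")

staged_rows = [(1, 10.0), (2, 25.5), (3, 7.25)]  # rows staged for loading

# Bulk load in one statement and one transaction, instead of issuing
# a separate INSERT (a "singleton update") per row.
with conn:
    conn.executemany("INSERT INTO sales (id, amount) VALUES (?, ?)",
                     staged_rows)

total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
print(total)  # (3, 42.75)
```

Batching matters because each singleton statement pays transaction and logging overhead; a bulk load amortizes that cost across the whole batch.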
In this module, you will learn the various methods that can be used to ingest data between various data stores using Azure Data Factory.
Learning objectives
- Introduction
- List the data factory ingestion methods
- Describe data factory connectors
- Exercise: Use the data factory copy activity
- Exercise: Manage the self-hosted integration runtime
- Exercise: Set up the Azure integration runtime
- Understand data ingestion security considerations
- Knowledge check
- Summary
Prerequisites
The student should be able to:
- Log into the Azure portal
- Explain and create resource groups
- Describe Azure Data Factory and its core components
In this module, you will examine Azure Data Factory and the core components that enable you to create large scale data ingestion solutions in the cloud
Learning objectives
In this module, you will:
- Understand Azure Data Factory
- Describe data integration patterns
- Explain the data factory process
- Understand Azure Data Factory components
- Azure Data Factory security
- Set up Azure Data Factory
- Create linked services
- Create datasets
- Create data factory activities and pipelines
- Manage integration runtime
Prerequisites
The student should be able to:
- Log into the Azure portal
- Explain and create resource groups
In this module, you will learn how to perform common data transformation and cleansing activities within Azure Data Factory without using code.
Learning objectives
- Introduction
- Explain Data Factory transformation methods
- Describe Data Factory transformation types
- Exercise – Author an Azure Data Factory mapping data flow
- Debug mapping data flow
- Exercise – Use Data Factory wrangling data flows
- Exercise – Use compute transformations within Data Factory
- Exercise – Integrate SQL Server Integration Services packages within Data Factory
- Knowledge check
- Summary
Prerequisites
The student should be able to:
- Log into the Azure portal
- Explain and create resource groups
- Describe Azure Data Factory core components
- Ingest data into Azure Data Factory using the Copy Activity
In this module, you will learn how Azure Data Factory can orchestrate large scale data movement by using other Azure Data Platform and Machine Learning technologies.
Learning objectives
- Introduction
- Understand data factory control flow
- Work with data factory pipelines
- Debug data factory pipelines
- Add parameters to data factory components
- Integrate a Notebook within Azure Synapse Pipelines
- Execute data factory packages
- Knowledge check
- Summary
Prerequisites
The student should be able to:
- Log into the Azure portal
- Explain and create resource groups
- Describe Azure Data Factory and its core components
- Ingest data into Azure Data Factory using the Copy Activity
Learn how hybrid transactional / analytical processing (HTAP) can help you perform operational analytics with Azure Synapse Analytics.
Learning objectives
After completing this module, you’ll be able to:
- Describe Hybrid Transactional / Analytical Processing patterns.
- Identify Azure Synapse Link services for HTAP.
Prerequisites
Before starting this module, you should have a basic knowledge of data analytics and Azure services for data. Consider completing the Azure Data Fundamentals certification first.
Azure Synapse Link for Azure Cosmos DB enables HTAP integration between operational data in Azure Cosmos DB and Azure Synapse Analytics runtimes for Spark and SQL.
Learning objectives
After completing this module, you’ll be able to:
- Configure an Azure Cosmos DB Account to use Azure Synapse Link.
- Create an analytical store enabled container.
- Create a linked service for Azure Cosmos DB.
- Analyze linked data using Spark.
- Analyze linked data using Synapse SQL.
Prerequisites
Before starting this module, you should have a basic knowledge of Azure Cosmos DB and Azure Synapse Analytics. Consider completing the following modules first:
Learn how to approach and implement security to protect your data with Azure Synapse Analytics.
Learning objectives
In this module, you will:
- Understand network security options for Azure Synapse Analytics
- Configure Conditional Access
- Configure Authentication
- Manage authorization through column and row level security
- Manage sensitive data with Dynamic Data masking
- Implement encryption in Azure Synapse Analytics
Prerequisites
- Before taking this module, it is recommended that you complete Data Fundamentals.
Storing and handling secrets, encryption keys, and certificates directly is risky, and every usage introduces the possibility of unintentional data exposure. Azure Key Vault provides a secure storage area for managing all your app secrets so you can properly encrypt your data in transit or while it’s being stored.
Learning objectives
In this module, you will:
- Explore proper usage of Azure Key Vault
- Manage access to an Azure Key Vault
- Explore certificate management with Azure Key Vault
- Configure a Hardware Security Module Key-generation solution
Prerequisites
None
Explore data classification capabilities and degrees of confidentiality. Implement security options to keep private data safe, including Azure SQL auditing, Microsoft Defender for SQL, row-level security, Dynamic Data Masking, and Azure SQL Database Ledger.
Learning objectives
After completing this module, you will be able to:
- Plan and implement data classification in Azure SQL Database
- Understand and configure row-level security and dynamic data masking
- Understand the usage of Microsoft Defender for SQL
- Explore how Azure SQL Database Ledger works
Prerequisites
- Ability to write code in the SQL language, particularly the Microsoft T-SQL dialect, at a basic level.
- Experience creating and configuring resources using the Azure portal
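The row-level security and dynamic data masking concepts in this module can be sketched in plain Python. In Azure SQL these are implemented as T-SQL security policies and masking functions, not application code; the user, table, and data below are invented for illustration, and the email mask only roughly mimics SQL Server's built-in `email()` masking function:

```python
def mask_email(email):
    """Roughly mimic SQL Server's email() mask: expose the first
    character and the top-level domain, hide everything else."""
    local, _, domain = email.partition("@")
    return local[:1] + "XXX@XXXX." + domain.rsplit(".", 1)[-1]

def visible_rows(rows, user_region):
    """Row-level security: a predicate filters which rows the
    calling user can see at all."""
    return [r for r in rows if r["region"] == user_region]

customers = [
    {"name": "Ana", "email": "ana@contoso.com", "region": "EU"},
    {"name": "Raj", "email": "raj@fabrikam.com", "region": "APAC"},
]

# An EU-scoped user sees only EU rows, with sensitive columns masked.
for row in visible_rows(customers, "EU"):
    print(row["name"], mask_email(row["email"]))  # Ana aXXX@XXXX.com
```

The two mechanisms are complementary: row-level security controls *which rows* a user can read, while masking controls *what a readable column looks like* to non-privileged users.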
Connect sending and receiving applications with Event Hubs so you can handle extremely high loads without losing data.
Learning objectives
In this module, you will:
- Create an event hub using the Azure CLI
- Configure applications to send or receive messages through the event hub
- Evaluate performance of event hub using the Azure portal
Prerequisites
- Experience creating and managing resources using the Azure portal
- Experience with using Azure CLI to sign into Azure, and to create resources
- Knowledge of basic big data concepts such as streaming and event processing
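Event Hubs spreads incoming events across partitions, and events that share a partition key are always routed to the same partition, which preserves their ordering. A stdlib sketch of that routing idea (the real service uses its own internal hash, so the specific partition numbers here are illustrative):

```python
import hashlib

def assign_partition(partition_key, partition_count):
    """Map a partition key to a partition deterministically, as an
    Event Hubs-style broker does. The hash algorithm is illustrative;
    the property that matters is same key -> same partition."""
    digest = hashlib.sha256(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % partition_count

# Events with the same key (e.g. a device ID) keep their relative
# ordering because they always land in the same partition.
for device in ["sensor-1", "sensor-2", "sensor-1"]:
    print(device, "-> partition", assign_partition(device, 4))
```

This is why choosing a partition key with enough distinct values matters: too few keys concentrates load on a few partitions and limits throughput.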
Training Options
- ILT: Instructor-Led Training
- VILT: Virtual Instructor-Led Training
Exam & Certification
Microsoft Certified: Azure Data Engineer Associate.
The Azure Data Engineer Associate certification is for candidates who have subject matter expertise in integrating, transforming, and consolidating data from various structured and unstructured data systems into structures that are suitable for building analytics solutions.
In this role, you help stakeholders understand the data, and you use different tools and techniques to explore, build, and maintain secure and compliant data processing pipelines. To store and produce cleansed and enhanced datasets for analysis, you use various Azure data services and languages. As an Azure data engineer, you also help ensure that data pipelines and stores are high-performing, efficient, organized, and reliable, given a specific set of business requirements and constraints. You deal with unanticipated issues swiftly, and you minimize data loss. You also design, implement, monitor, and optimize data platforms to meet the data pipeline needs.
Training & Certification Guide
A candidate for this certification must have solid knowledge of data processing languages, such as SQL, Python, or Scala, and they need to understand parallel processing and data architecture patterns.
This exam measures your ability to accomplish the following technical tasks: design and implement data storage; design and develop data processing; design and implement data security; and monitor and optimize data storage and data processing.
Skills measured:
- Design and implement data storage (40-45%)
- Design and develop data processing (25-30%)
- Design and implement data security (10-15%)
- Monitor and optimize data storage and data processing (10-15%)
When you earn a certification or learn a new skill, it’s an accomplishment worth celebrating with your network. It often takes less than a minute to update your LinkedIn profile and share your achievements, highlight your skills, and help boost your career potential. Here’s how:
- If you’ve earned a certification already, follow the instructions in the congratulations email you received. Or find your badge on your Certification Dashboard, and follow the instructions there to share it. (You’ll be transferred to the Acclaim website.)
- To add specific skills, visit your LinkedIn profile and update the Skills and endorsements section. Tip: We recommend that you choose skills listed in the skills outline guide for your certification.
If you’ve already earned your Azure Data Engineer Associate certification, but it’s expiring in the near future, we’ve got good news. You’ll soon be able to renew your current certifications by passing a free renewal assessment on Microsoft Learn—anytime within six months before your certification expires. For more details, please read our blog post, Stay current with in-demand skills through free certification renewals.
Azure Strategy & Implementation Guide
Get a step-by-step introduction to using Azure for your cloud infrastructure with this Packt e-book. Read the latest edition of the Azure Strategy and Implementation Guide for detailed guidance on how to create a successful cloud adoption strategy with new innovations, capabilities, and security features from Microsoft Azure.
Microsoft Azure SQL Jumpstart Guide
Find out how to get started launching your first Azure SQL database or find ways to make your existing SQL database work harder. Download the Azure SQL Jumpstart Guide for detailed instructions and in-depth insights to help you make your Azure SQL deployment, migration, or enhancement run smoothly.
Low-code Application Development – Microsoft PowerApps and Azure
Build production-ready apps faster with a low-code environment. Quickly stand up your applications with Power Apps and get more time to apply your technical expertise to extending and optimizing those apps in Azure.
Azure Cloud Native Architecture Mapbook
Grow your cloud architecture skills with guidance from Azure Experts. Go beyond developing cloud-native applications to planning and implementing cloud application infrastructure. In this free e-book from Packt Publishing, you’ll find best practices for infrastructure design and patterns for building a complete solution.
Windows Virtual Desktop Security
Find out how to secure your Windows Virtual Desktop environment when migrating your virtual desktop infrastructure (VDI) to Azure. Read this security handbook to get technical hands-on guidance on how to help protect your apps and data in your Windows Virtual Desktop deployment.
Discover how to get more value from your on-premises Windows Server and SQL Server investments and move some or all of your workloads to the cloud using your existing skills. See how to start using the cloud to support new ways of doing business and help ensure business continuity even if you need to keep some of your IT assets on-premises due to regulatory or data governance requirements.
Discover how to build highly scalable applications using containers and how to deploy and manage those containers at scale with Kubernetes on Azure. Read the completely reviewed and updated Packt e-book, Hands-On Kubernetes on Azure, Third Edition, and discover what’s new, including security enhancements, continuous integration and continuous delivery (CI/CD) automation, and the latest supported technologies. Gain insight into building reliable applications in the new foreword by Kubernetes co-founder Brendan Burns.
Azure Synapse Analytics Proof of Concept Playbook
Learn how to perform a proof of concept efficiently and economically with Azure Synapse Analytics. Read the Azure Synapse Analytics Proof of Concept Playbook to understand the key concepts involved in deploying data warehousing, data lake, and big data workloads with Azure Synapse and get the evidence you need to make the case for implementation at your organization.
Spend less time managing server infrastructure and more time building great apps. Get your solutions to market faster using Azure Functions, a fully managed compute platform for processing data, integrating systems, and building simple APIs and microservices. Through the development of a basic back-end web API that performs simple operations, the Azure Serverless Computing Cookbook helps you understand how to persist data in Azure Storage services.
Top 7 Data Analytics Certification 2023
Are you looking to level up your career in data analytics? With the increasing demand for data-driven insights in today’s business landscape, obtaining a data analytics certification can be a game-changer.
But with so many options available, how do you choose the best one for you?
Top Data Science Certifications You Should Know in 2024
Data science certifications are vital for IT professionals to validate their skills, set themselves apart in a competitive job market, and meet the rising demand in the field, as the employment of data scientists is projected to grow by 35% from 2022 to 2032.
Frequently Asked Questions
A data engineer integrates, transforms, and consolidates data from various structured and unstructured data systems into structures that are suitable for building analytics solutions. The data engineer also helps design and support data pipelines and data stores that are high-performing, efficient, organized, and reliable, given a specific set of business requirements and constraints.
Azure data engineers are responsible for managing and transforming data within Azure environments. They design and implement data storage solutions, help ensure data quality, and create pipelines to process and analyze data.
As an Azure data engineer, you work with big data technologies, data warehouses, and data lakes to enable effective data-driven decision-making. Proficiency in Azure services, like Azure SQL Database, Azure Data Factory, and Azure Databricks, is crucial for this role. Strong SQL and data-modeling skills, along with knowledge of data integration and extract, transform, and load (ETL) processes, are also essential.
Many organizations today have petabytes of data, and analytics and AI play pivotal roles in putting this data to work—as do data engineers. These professionals work with data from many sources, and they know how to do this quickly and securely to deliver cost savings, new insights, improved business processes, and ground-breaking value. Ready to prove your worth to your team—and to current and future employers? Roll up your sleeves, and get started earning your Azure Data Engineer certification.
Earning a Microsoft Certification is globally recognized and industry-endorsed evidence of mastering real world skills. It shows you demonstrate proficiency in keeping pace with technology. It’s a career move that yields many positive results.
Getting a Microsoft Certification is also a great way to break into the tech industry. A Microsoft Certification immediately confers a level of authority and expertise, especially helpful for someone new to the industry.
The number of questions on a certification exam is subject to change as Microsoft makes updates to keep it aligned with current changes in the technology and job role. Most Microsoft Certification exams contain between 40 and 60 questions and allow around 60–140 minutes.
Starting June 30, 2021, all newly earned role-based and specialty certifications are valid for one year from the date the certification was earned.
To stay up to date, IT pros are constantly learning and adding skills. The IDC study concluded that Microsoft Learning Partners are well positioned to help organizations achieve their business and learning goals. The IT leaders who were surveyed found the most value from a Learning Partner that provides:
- An end-to-end solution which starts with identifying skill gaps, simplifies the learning experience, and finishes by evaluating how well the Learning Partner met the organization goals.
- Scale, flexibility, and speed to train teams of any size, in any location, amid changing circumstances.
- Value-added services, such as hands-on labs, classroom training, and custom content that help the skills development program succeed.
- High-quality content and delivery, meaning accurate, relevant courseware, top-notch instructors, and a path to certification, if needed.
DP-060T00: Migrate NoSQL Workloads to Azure Cosmos DB
This DP-060T00: Migrate NoSQL Workloads to Azure Cosmos DB course will teach students what Cosmos DB is and how you can migrate MongoDB and Cassandra workloads to Cosmos DB.
DP-070T00: Migrate Open Source Data Workloads to Azure
This course will enable students to understand Azure SQL Database and what is required to migrate MySQL and PostgreSQL workloads to Azure SQL Database.
DP-080T00: Querying Data with Microsoft Transact-SQL Get started with Transact SQL
Learn the basics of Microsoft’s standard SQL language and master skills required as a data analyst, a data engineer, a data scientist, a database administrator or a database developer to query and modify data in relational databases that are hosted in Microsoft SQL Server-based database systems.
DP-090T00: Implementing a Machine Learning Solution with Microsoft Azure Databricks
Master the art and science of how to use machine learning to deliver valuable insights based on your organization’s data with Microsoft Azure Databricks. Learn the key concepts behind Azure Databricks to prepare data for modeling and analytics; model predictive analytics solution for real-world customer scenarios; and implement an end-to-end machine learning pipeline.
DP-100T01: Designing and Implementing a Data Science Solution on Azure
Learn how to operate machine learning solutions at cloud scale using Azure Machine Learning. This course teaches you to leverage your existing knowledge of Python and machine learning to manage data ingestion and preparation, model training and deployment, and machine learning solution monitoring in Microsoft Azure.
DP-203-AO: Building a Data LakeHouse using Azure Synapse Analytics
This session explores the concept of the Data Lakehouse: a new, open data-management architecture that combines the flexibility, cost-efficiency, and scale of data lakes with the data management and ACID transactions of data warehouses, enabling business intelligence (BI) and machine learning (ML) on all data.
DP-300T00: Administering Relational Databases on Microsoft Azure
This course provides students with the knowledge and skills to administer a SQL Server database infrastructure for cloud, on-premises, and hybrid relational databases, and to work with the Microsoft PaaS relational database offerings. Additionally, it will be of use to individuals who develop applications that deliver content from SQL-based relational databases.
DP-420T00: Designing and Implementing Cloud-Native Applications Using Microsoft Azure Cosmos DB
This course teaches developers how to create applications using the SQL API and SDK for Azure Cosmos DB. Students will learn how to write efficient queries, create indexing policies, manage provisioned resources, and perform common operations with the SDK.
This Microsoft Azure Enterprise Data Analyst course covers methods and practices for performing advanced data analytics at scale. Students will build on existing analytics experience and will learn to implement and manage a data analytics environment, query and transform data, implement and manage data models, and explore and visualize data.