Candidates for this exam analyze the requirements for AI solutions, recommend appropriate tools and technologies, and implement solutions that meet scalability and performance requirements.
Candidates translate the vision from solution architects and work with data scientists, data engineers, IoT specialists, and AI developers to build complete end-to-end solutions. Candidates design and implement AI apps and agents that use Microsoft Azure Cognitive Services and Azure Bot Service. Candidates can recommend solutions that use open source technologies.
Candidates understand the components that make up the Azure AI portfolio and the available data storage options.
Candidates implement AI solutions that use Cognitive Services, Azure bots, Azure Search, and data storage in Azure. Candidates understand when a custom API should be developed to meet specific requirements.
- Analyze solution requirements (25-30%)
- Design AI solutions (40-45%)
- Implement and monitor AI solutions (25-30%)
DP-100: Designing and Implementing a Data Science Solution on Azure
Candidates for this exam apply scientific rigor and data exploration techniques to gain actionable insights and communicate results to stakeholders. Candidates use machine learning techniques to train, evaluate, and deploy models to build AI solutions that satisfy business objectives. Candidates use applications that involve natural language processing, speech, computer vision, and predictive analytics.
Candidates serve as part of a multi-disciplinary team that incorporates ethical, privacy, and governance considerations into the solution. Candidates typically have a background in mathematics, statistics, and computer science.
- Define and prepare the development environment (15-20%)
- Prepare data for modeling (25-30%)
- Perform feature engineering (15-20%)
- Develop models (40-45%)
Candidates for this exam are Microsoft Azure data engineers who collaborate with business stakeholders to identify and meet the data requirements to implement data solutions that use Azure data services.
Azure data engineers are responsible for data-related implementation tasks that include provisioning data storage services, ingesting streaming and batch data, transforming data, implementing security requirements, implementing data retention policies, identifying performance bottlenecks, and accessing external data sources.
Candidates for this exam must be able to implement data solutions that use the following Azure services: Azure Cosmos DB, Azure SQL Database, Azure Synapse Analytics (formerly Azure SQL DW), Azure Data Lake Storage, Azure Data Factory, Azure Stream Analytics, Azure Databricks, and Azure Blob storage.
- Implement data storage solutions (40-45%)
- Manage and develop data processing (25-30%)
- Monitor and optimize data solutions (30-35%)
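The ingestion and transformation responsibilities above boil down to validate-transform-aggregate steps. In Azure these would typically run in Data Factory, Stream Analytics, or Databricks; the CSV source, schema, and quarantine logic below are made up for illustration.

```python
# Illustrative batch transformation of the kind a data pipeline performs.
import csv
import io

raw = io.StringIO(
    "device_id,reading\n"
    "sensor-1,20.5\n"
    "sensor-2,bad\n"      # malformed row, to be rejected
    "sensor-1,21.5\n"
)

clean, rejected = [], []
for row in csv.DictReader(raw):
    try:
        clean.append((row["device_id"], float(row["reading"])))
    except ValueError:
        rejected.append(row)  # quarantine rows that fail validation

# Aggregate per device, a typical transform step before loading.
totals = {}
for device, value in clean:
    totals[device] = totals.get(device, 0.0) + value

print(totals, f"rejected={len(rejected)}")
```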
Candidates for this exam are Microsoft Azure data engineers who collaborate with business stakeholders to identify and meet the data requirements to design data solutions that use Azure data services.
Azure data engineers are responsible for data-related tasks that include designing Azure data storage solutions that use relational and non-relational data stores, batch and real-time data processing solutions, and data security and compliance solutions.
Candidates for this exam must design data solutions that use the following Azure services: Azure Cosmos DB, Azure SQL Database, Azure SQL Data Warehouse, Azure Data Lake Storage, Azure Data Factory, Azure Stream Analytics, Azure Databricks, and Azure Blob storage.
- Design Azure data storage solutions (40-45%)
- Design data processing solutions (25-30%)
- Design for data security and compliance (25-30%)
This exam is intended for SQL Server database administrators, system engineers, and developers with two or more years of experience who are seeking to validate their skills and knowledge in writing queries.
- Manage data with Transact-SQL (40–45%)
- Query data with advanced Transact-SQL components (30–35%)
- Program databases by using Transact-SQL (25–30%)
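A query of the kind this exam tests combines aggregation, filtering on the aggregate, and ordering. The exam targets Transact-SQL on SQL Server; sqlite3 is used here only as a locally runnable stand-in (T-SQL adds dialect features such as TOP, table variables, and window-function extensions), and the table and data are invented.

```python
# GROUP BY / HAVING / ORDER BY -- core query-writing skills, run via sqlite3.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE Orders (OrderID INTEGER PRIMARY KEY, CustomerID TEXT, Amount REAL);
    INSERT INTO Orders VALUES (1, 'ALFKI', 100.0), (2, 'ALFKI', 50.0), (3, 'BONAP', 75.0);
""")

rows = con.execute("""
    SELECT CustomerID, SUM(Amount) AS Total
    FROM Orders
    GROUP BY CustomerID
    HAVING SUM(Amount) > 60
    ORDER BY Total DESC
""").fetchall()
print(rows)  # [('ALFKI', 150.0), ('BONAP', 75.0)]
```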
This exam is intended for database professionals who build and implement databases across organizations and who ensure high levels of data availability. Their responsibilities include creating database files, data types, and tables; planning, creating, and optimizing indexes; ensuring data integrity; implementing views, stored procedures, and functions; and managing transactions and locks.
- Design and implement database objects (25–30%)
- Implement programmability objects (20–25%)
- Manage database concurrency (25–30%)
- Optimize database objects and SQL infrastructure (20–25%)
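Managing transactions is one of the concurrency skills listed above: a multi-statement change must commit or roll back as a unit. The exam covers SQL Server, but sqlite3 serves as a runnable stand-in here, with an invented two-account transfer to show rollback.

```python
# Transaction atomicity: a simulated failure undoes the partial transfer.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Accounts (id INTEGER PRIMARY KEY, balance REAL)")
con.execute("INSERT INTO Accounts VALUES (1, 100.0), (2, 0.0)")
con.commit()

try:
    with con:  # the connection acts as a transaction context manager
        con.execute("UPDATE Accounts SET balance = balance - 40 WHERE id = 1")
        con.execute("UPDATE Accounts SET balance = balance + 40 WHERE id = 2")
        raise RuntimeError("simulated failure mid-transfer")
except RuntimeError:
    pass  # the with-block rolled the partial transfer back

balances = con.execute("SELECT balance FROM Accounts ORDER BY id").fetchall()
print(balances)  # [(100.0,), (0.0,)] -- both updates were undone
```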
This exam is intended for database professionals who perform installation, maintenance, and configuration tasks. Other responsibilities include setting up database systems, making sure those systems operate efficiently, regularly storing and backing up data, and securing data from unauthorized access.
- Configure data access and auditing (20–25%)
- Manage backup and restore of databases (20–25%)
- Manage and monitor SQL Server instances (35–40%)
- Manage high availability and disaster recovery (20–25%)
This exam is intended for architects, senior developers, infrastructure specialists, and development leads. Candidates have a working knowledge of the various cloud service models and service model architectures, data storage options, and data synchronization techniques. Candidates also have a working knowledge of deployment models and of upgrading and migrating databases, applications, and services, in addition to integrating Azure applications with external resources.
- Implement SQL in Azure (40–45%)
- Manage databases and instances (30-35%)
- Manage Storage (30–35%)
This exam is intended for extract, transform, and load (ETL) and data warehouse developers who create business intelligence (BI) solutions. Their responsibilities include data cleansing, in addition to ETL and data warehouse implementation.
- Design, implement, and maintain a data warehouse (35–40%)
- Extract, transform, and load data (40–45%)
- Build data quality solutions (15–20%)
This exam is intended for business intelligence (BI) developers who focus on creating BI solutions that require implementing multidimensional data models, implementing and maintaining OLAP cubes, and implementing tabular data models.
- Design a multidimensional business intelligence (BI) semantic model (25–30%)
- Design a tabular BI semantic model (20–25%)
- Develop queries using Multidimensional Expressions (MDX) and Data Analysis Expressions (DAX) (15–20%)
- Configure and maintain SQL Server Analysis Services (SSAS) (30–35%)
Candidates for this exam are developers and architects who leverage Azure Cosmos DB. Candidates should understand fundamental concepts of partitioning, replication, and resource governance for building and configuring scalable applications that are agnostic of a Cosmos DB API. Candidates should also have basic working knowledge of the Cosmos DB SQL API.
Candidates for this exam design, build, and troubleshoot Cosmos DB solutions that meet business and technical requirements.
- Partition and Model Data
- Replicate Data Across the World
- Tune and Debug Azure Cosmos DB Solutions
- Perform Integration and Develop Solutions
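Partitioning, the first skill area above, means routing each document to a physical partition by hashing its partition key. Cosmos DB performs this hashing internally with its own algorithm; the MD5-based routing, partition count, and `deviceId` key below are only an illustrative sketch of the concept.

```python
# Sketch of hash partitioning, the concept behind Cosmos DB partition keys.
import hashlib

NUM_PARTITIONS = 4  # illustrative; Cosmos DB manages physical partitions itself

def partition_for(partition_key: str) -> int:
    """Map a partition key deterministically onto one of N partitions."""
    digest = hashlib.md5(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % NUM_PARTITIONS

docs = [{"id": i, "deviceId": f"device-{i % 3}"} for i in range(6)]
placement = {}
for doc in docs:
    placement.setdefault(partition_for(doc["deviceId"]), []).append(doc["id"])

# All documents sharing a partition key land on the same partition,
# which is what makes single-partition queries efficient.
print(placement)
```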
Candidates for this exam should have a good understanding of how to use Power BI to perform data analysis. Candidates should be proficient in connecting to data sources and performing data transformations, modeling and visualizing data by using Microsoft Power BI Desktop, and configuring dashboards by using the Power BI service. Candidates should also be proficient in implementing direct connectivity to Microsoft SQL Azure and SQL Server Analysis Services (SSAS), and implementing data analysis in Microsoft Excel. Candidates may include BI professionals, data analysts, and other roles responsible for creating reports by using Power BI.
- Consuming and Transforming Data by Using Power BI Desktop
- Modeling and Visualizing Data
- Configure Dashboards, Reports and Apps in the Power BI Service
Candidates for this exam should have a strong understanding of how to use Microsoft Excel to perform data analysis. Candidates should be able to consume, transform, model, and visualize data in Excel. Candidates should also be able to configure and manipulate data in PowerPivot, PivotTables, and PivotCharts. Candidates may include BI professionals, data analysts, and other roles responsible for analyzing data with Excel.
- Consume and Transform Data by Using Microsoft Excel (30-35%)
- Model Data (35-40%)
- Visualize Data (30-35%)
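The modeling skills above center on the group-and-aggregate operation that PivotTables and PowerPivot perform: rows and columns define groups, and a value field is summarized per group. A plain-Python cross-tabulation sketches the idea; the sales records are invented for the example.

```python
# PivotTable-style cross-tabulation: rows = region, columns = quarter,
# values = SUM(amount).
from collections import defaultdict

sales = [
    {"region": "East", "quarter": "Q1", "amount": 100},
    {"region": "East", "quarter": "Q2", "amount": 150},
    {"region": "West", "quarter": "Q1", "amount": 200},
]

pivot = defaultdict(lambda: defaultdict(float))
for rec in sales:
    pivot[rec["region"]][rec["quarter"]] += rec["amount"]

for region in sorted(pivot):
    print(region, dict(pivot[region]))
```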