How should you evaluate the AZ-204 exam: difficult or easy?

AZ-204 exam

Let me talk about how to prepare. The first step is to start with the official documentation and take notes. The second step is to practice with test questions. It's important to choose a good learning platform; the Pass4itSure website is recommended here. Pass4itSure's Microsoft AZ-204 dumps draw on many years of experience, are updated quickly, and completely cover the questions and answers in the AZ-204 exam syllabus. AZ-204 dumps link: (Q&As: 324). The AZ-204 exam is neither easy nor difficult, so if you follow the points mentioned, it should be easier to pass.

Excellent! Yes! 😀

In this article, I will share the free practice questions (PDF + online practice) provided by Pass4itSure to help you pass this exam. Of course, this is not the complete set; please visit the Pass4itSure AZ-204 dumps for the complete version!

Easy! Microsoft AZ-204 exam dumps free PDF

Microsoft AZ-204 exam PDF shared from Pass4itSure

Microsoft AZ-204 exam questions practice test


You are developing an ASP.NET Core website that can be used to manage photographs that are stored in Azure Blob
Storage containers.
Users of the website authenticate by using their Azure Active Directory (Azure AD) credentials.

You implement role-based access control (RBAC) role permissions on the containers that store photographs. You
assign users to RBAC roles.
You need to configure the website's Azure AD application so that users' permissions can be used with the Azure Blob containers.

How should you configure the application? To answer, drag the appropriate setting to the correct location. Each setting can be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.

Select and Place:

Correct Answer:

Box 1: user_impersonation

Box 2: delegated
Example:
1. Select the API permissions section.
2. Click the Add a permission button, and then ensure that the My APIs tab is selected.
3. In the list of APIs, select the API TodoListService-aspnetcore.
4. In the Delegated permissions section, ensure that the right permissions are checked: user_impersonation.
5. Select the Add permissions button.

Box 3: delegated
Example:
1. Select the API permissions section.
2. Click the Add a permission button, and then ensure that the Microsoft APIs tab is selected.
3. In the Commonly used Microsoft APIs section, click Microsoft Graph.
4. In the Delegated permissions section, ensure that the right permissions are checked: User.Read. Use the search box if necessary.
5. Select the Add permissions button.



You need to ensure that all messages from Azure Event Grid are processed. What should you use?

A. Azure Event Grid topic
B. Azure Service Bus topic
C. Azure Service Bus queue
D. Azure Storage queue
E. Azure Logic App custom connector
Correct Answer: C

As a solution architect/developer, you should consider using Service Bus queues when:
Your solution needs to receive messages without having to poll the queue. With Service Bus, you can achieve this by using a long-polling receive operation over the TCP-based protocols that Service Bus supports.
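The contrast between busy polling and a long-polling receive can be sketched with Python's standard-library queue module, used here as a local stand-in for a Service Bus queue (illustrative only; this is not the Azure SDK):

```python
import queue
import threading
import time

# Local stand-in for a Service Bus queue.
q = queue.Queue()

def producer():
    # Simulate a message arriving a little after the receiver starts waiting.
    time.sleep(0.2)
    q.put("order-42")

threading.Thread(target=producer).start()

# Long-polling style: this receive call blocks until a message arrives
# (or the timeout elapses) instead of repeatedly asking an empty queue.
message = q.get(timeout=5)
print(message)  # order-42
```

The real long-polling behavior comes from the TCP-based protocols (such as AMQP) used by the Service Bus client libraries; the sketch only mirrors the blocking-receive semantics.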



You manage several existing Logic Apps.
You need to change definitions, add new logic, and optimize these apps on a regular basis.

What should you use? To answer, drag the appropriate tools to the correct functionalities. Each tool may be used once,
more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each
correct selection is worth one point.

Box 1: Enterprise Integration Pack
For business-to-business (B2B) solutions and seamless communication between organizations, you can build automated, scalable enterprise integration workflows by using the Enterprise Integration Pack (EIP) with Azure Logic Apps.

Box 2: Code View Editor
Edit JSON – Azure portal
1. Sign in to the Azure portal.
2. From the left menu, choose All services. In the search box, find “logic apps”, and then from the results, select your logic app.
3. On your logic app's menu, under Development Tools, select Logic App Code View.
4. The Code View editor opens and shows your logic app definition in JSON format.
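For orientation, the definition shown in the Code View editor is JSON along these lines. This is a minimal, hypothetical sketch (the trigger, action, and URI are invented for illustration), not a definition from a real app:

```json
{
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "triggers": {
    "Recurrence": {
      "type": "Recurrence",
      "recurrence": { "frequency": "Hour", "interval": 1 }
    }
  },
  "actions": {
    "CheckStatus": {
      "type": "Http",
      "inputs": { "method": "GET", "uri": "https://example.com/status" }
    }
  },
  "outputs": {}
}
```

Editing this JSON directly is how you change definitions that the visual Logic Apps Designer cannot express.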

Box 3: Logic Apps Designer



Note: This question is part of a series of questions that present the same scenario. Each question in the series contains
a unique solution that might meet the stated goals. Some question sets might have more than one correct solution,
while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not
appear in the review screen.

You are developing a medical records document management website. The website is used to store scanned copies of
patient intake forms.

If the stored intake forms are downloaded from storage by a third party, the contents of the forms must not be compromised.

You need to store the intake forms according to the requirements.

Solution:

1. Create an Azure Key Vault key named sky.
2. Encrypt the intake forms using the public key portion of sky.
3. Store the encrypted data in Azure Blob storage.

Does the solution meet the goal?

A. Yes
B. No
Correct Answer: A


You are developing a Docker/Go application by using Azure App Service Web App for Containers. You plan to run the container in App Service on Linux. You identify a Docker container image to use.
None of your current resource groups reside in a location that supports Linux. You must minimize the number of
resource groups required.
You need to create the application and perform an initial deployment.
Which three Azure CLI commands should you use to develop the solution? To answer, move the appropriate
commands from the list of commands to the answer area and arrange them in the correct order.
Select and Place:

Correct Answer:

You can host native Linux applications in the cloud by using Azure Web Apps. To create a Web App for Containers, you
must run Azure CLI commands that create a group, then a service plan, and finally the web app itself.

Step 1: az group create
In the Cloud Shell, create a resource group with the az group create command.

Step 2: az appservice plan create
In the Cloud Shell, create an App Service plan in the resource group with the az appservice plan create command.

Step 3: az webapp create
In the Cloud Shell, create a web app in the myAppServicePlan App Service plan with the az webapp create command. Don't forget to replace the placeholder values with a unique app name and with your Docker ID.



You are creating a CLI script that creates an Azure web app and related services in Azure App Service. The web app
uses the following variables:

You need to automatically deploy code from GitHub to the newly created web app.
How should you complete the script? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Hot Area:

Correct Answer:

Box 1: az appservice plan create
The az group create command returns JSON results on success. Now we can use the resource group to create an Azure App Service plan.

Box 2: az webapp create
Create a new web app.

Box 3: --plan $webappname
...with the service plan we created in step 1.

Box 4: az webapp deployment source config
Continuous delivery with GitHub. Example:
az webapp deployment source config --name firstsamplewebsite1 --resource-group websites --repo-url $gitrepo --branch master --git-token $token

Box 5: --repo-url $gitrepo --branch master --manual-integration



You develop an app that allows users to upload photos and videos to Azure storage. The app uses a storage REST API
call to upload the media to a blob storage account named Account1. You have blob storage containers named
Container1 and Container2.

Uploading of videos occurs on an irregular basis.
You need to copy specific blobs from Container1 to Container2 in real-time when specific requirements are met,
excluding backup blob copies.
What should you do?

A. Download the blob to a virtual machine and then upload the blob to Container2.
B. Run the Azure PowerShell command Start-AzureStorageBlobCopy.
C. Copy blobs to Container2 by using the Put Blob operation of the Blob Service REST API.
D. Use AzCopy with the Snapshot switch to copy blobs to Container2.
Correct Answer: B

The Start-AzureStorageBlobCopy cmdlet starts to copy a blob.
Example 1: Copy a named blob
C:\PS> Start-AzureStorageBlobCopy -SrcBlob "ContosoPlanning2015" -DestContainer "ContosoArchives" -SrcContainer "ContosoUploads"
This command starts the copy operation of the blob named ContosoPlanning2015 from the container named ContosoUploads to the container named ContosoArchives.



You are developing an Azure function that connects to an Azure SQL Database instance. The function is triggered by an
Azure Storage queue.

You receive reports of numerous System.InvalidOperationException errors with the following message:

"Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached."
You need to prevent the exception.

What should you do?

A. In the host.json file, decrease the value of the batchSize option
B. Convert the trigger to Azure Event Hub
C. Convert the Azure Function to the Premium plan
D. In the function.json file, change the value of the type option to queue calling
Correct Answer: C

With the Premium plan, the number of outbound connections per instance is unbounded, compared to the 600 active (1,200 total) connections allowed in a Consumption plan.

Note: The number of available connections is limited partly because a function app runs in a sandbox environment. One of the restrictions that the sandbox imposes on your code is a limit on the number of outbound connections, which is currently 600 active (1,200 total) connections per instance.

When you reach this limit, the functions runtime writes the following message to the logs: Host thresholds exceeded: Connections.
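A bounded pool and the resulting timeout can be illustrated with a small Python sketch (purely illustrative; in Azure Functions the cap is enforced by the sandbox and the ADO.NET connection pool, not by code like this):

```python
import threading

class ConnectionPool:
    """A toy pool with a hard cap, mimicking 'max pool size was reached'."""
    def __init__(self, max_size):
        self._slots = threading.BoundedSemaphore(max_size)

    def acquire(self, timeout):
        # Returns False when no connection frees up in time -- the local
        # analogue of "Timeout expired ... max pool size was reached."
        return self._slots.acquire(timeout=timeout)

    def release(self):
        self._slots.release()

pool = ConnectionPool(max_size=2)
assert pool.acquire(timeout=0.1)      # connection 1
assert pool.acquire(timeout=0.1)      # connection 2
assert not pool.acquire(timeout=0.1)  # pool exhausted: times out
pool.release()
assert pool.acquire(timeout=0.1)      # a freed connection can be reused
```

Raising the cap (moving to the Premium plan) or lowering concurrency (a smaller batchSize) are the two levers the answer choices describe.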



You need to add code at line AM09 to ensure that users can review content using ContentAnalysisService.
How should you complete the code? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Hot Area:

Correct Answer:

Box 1: “oauth2Permissions”: [“login”]
oauth2Permissions specifies the collection of OAuth 2.0 permission scopes that the web API (resource) app exposes to
client apps. These permission scopes may be granted to client apps during consent.

Box 2: “oauth2AllowImplicitFlow”:true
For applications (Angular, Ember.js, React.js, and so on), the Microsoft identity platform supports the OAuth 2.0 Implicit
Grant flow.
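In the application manifest, both settings are top-level JSON properties. A trimmed, hypothetical fragment is sketched below; note that in a real manifest oauth2Permissions is a collection of scope objects (which the answer above abbreviates as ["login"]), and the GUID and descriptions here are placeholders:

```json
{
  "oauth2AllowImplicitFlow": true,
  "oauth2Permissions": [
    {
      "adminConsentDescription": "Allow the application to sign in users.",
      "adminConsentDisplayName": "Sign in users",
      "id": "00000000-0000-0000-0000-000000000000",
      "isEnabled": true,
      "type": "User",
      "value": "login"
    }
  ]
}
```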



You are developing a microservices solution. You plan to deploy the solution to a multinode Azure Kubernetes Service (AKS) cluster.

You need to deploy a solution that includes the following features:
1. reverse proxy capabilities
2. configurable traffic routing
3. TLS termination with a custom certificate

Which component should you use? To answer, drag the appropriate components to the correct requirements. Each
component may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

Select and Place:

Correct Answer:

Box 1: Helm
To create the ingress controller, use Helm to install Nginx-ingress.

Box 2: kubectl
To find the cluster IP address of a Kubernetes pod, use the kubectl get pod command on your local machine, with the
option -o wide.

Box 3: Ingress Controller
An ingress controller is a piece of software that provides reverse proxy, configurable traffic routing, and TLS termination
for Kubernetes services. Kubernetes ingress resources are used to configure the ingress rules and routes for individual
Kubernetes services.
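Once the nginx ingress controller is installed with Helm, routing and TLS termination are configured through an ingress resource. A hypothetical manifest (the host, secret, and service names are invented for illustration):

```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: example-ingress
spec:
  ingressClassName: nginx          # handled by the nginx ingress controller
  tls:
  - hosts:
    - example.com
    secretName: example-tls        # custom certificate for TLS termination
  rules:                           # configurable traffic routing
  - host: example.com
    http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: example-service
            port:
              number: 80
```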

Incorrect Answers:
Virtual Kubelet: Virtual Kubelet is an open-source Kubernetes kubelet implementation that masquerades as a kubelet.
This allows Kubernetes nodes to be backed by Virtual Kubelet providers such as serverless cloud container platforms.

CoreDNS: CoreDNS is a flexible, extensible DNS server that can serve as the Kubernetes cluster DNS. Like
Kubernetes, the CoreDNS project is hosted by the CNCF.



You need to configure the integration for Azure Service Bus and Azure Event Grid.
How should you complete the CLI statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Box 1: eventgrid
To create an event subscription, use: az eventgrid event-subscription create
Box 2: event-subscription
Box 3: servicebusqueue
Scenario: Azure Service Bus and Azure Event Grid

Azure Event Grid must use Azure Service Bus for queue-based load leveling.
Events in Azure Event Grid must be routed directly to Service Bus queues for use in buffering.
Events from Azure Service Bus and other Azure services must continue to be routed to Azure Event Grid for processing.



You are implementing an order processing system. A point of sale application publishes orders to topics in an Azure
Service Bus queue. The Label property for the topic includes the following data:

You need to implement filtering and maximize throughput while evaluating filters.
Which filter types should you implement? To answer, drag the appropriate filter types to the correct subscriptions. Each
filter type may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.

Select and Place:

FutureOrders: SQL filter
HighPriorityOrders: Correlation filter (CorrelationId only)
InternationalOrders: SQL filter (Country NOT USA requires an SQL filter)
HighQuantityOrders: SQL filter (relational operators require an SQL filter)
AllOrders: No filter

SQL filters: A SqlFilter holds a SQL-like conditional expression that is evaluated in the broker against the arriving messages' user-defined properties and system properties. All system properties must be prefixed with sys. in the conditional expression.

The SQL-language subset for filter conditions tests
for the existence of properties (EXISTS), as well as for null-values (IS NULL), logical NOT/AND/OR, relational
operators, simple numeric arithmetic, and simple text pattern matching with LIKE.

Correlation filters: A CorrelationFilter holds a set of conditions that are matched against one or more of an arriving message's user and system properties.

A common use is to match against the CorrelationId property, but the application can also choose to match against
ContentType, Label, MessageId, ReplyTo, ReplyToSessionId, SessionId, To, and any user-defined properties.

A match exists when an arriving message's value for a property is equal to the value specified in the correlation filter. For string expressions, the comparison is case-sensitive.

When specifying multiple match properties, the filter combines them as a logical AND condition, meaning for the filter to match, all conditions must match.
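The correlation-filter matching rules just described (exact, case-sensitive equality, with multiple properties combined as a logical AND) can be sketched in Python (an illustration, not the Service Bus SDK):

```python
def correlation_match(message_props, filter_props):
    """Return True when every filter property equals the message's value.

    Mirrors the CorrelationFilter semantics described above: comparisons
    are exact and case-sensitive, and multiple conditions are ANDed.
    """
    return all(message_props.get(k) == v for k, v in filter_props.items())

message = {"CorrelationId": "high", "Label": "Order", "Country": "USA"}

assert correlation_match(message, {"CorrelationId": "high"})
assert correlation_match(message, {"CorrelationId": "high", "Label": "Order"})
assert not correlation_match(message, {"CorrelationId": "HIGH"})  # case-sensitive
assert not correlation_match(message, {"Label": "Invoice"})       # AND fails
```

This simple equality matching is why correlation filters are cheaper to evaluate than SQL filters, maximizing throughput for subscriptions that only need to match CorrelationId.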

Boolean filters: The TrueFilter and FalseFilter cause either all arriving messages (true) or none of the arriving messages (false) to be selected for the subscription.



You are developing an Azure App Service hosted ASP.NET Core web app to deliver video-on-demand streaming media.

You enable Azure Content Delivery Network (CDN) Standard for the web endpoint. Customer videos are downloaded from the web app by using the following example URL: http://
All media content must expire from the cache after one hour.

Customer videos with varying quality must be delivered to the closest regional point of presence (POP) node.

You need to configure Azure CDN caching rules.
Which options should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.

Hot Area:

Correct Answer:

Box 1: Override
Override: Ignore origin-provided cache duration; use the provided cache duration instead. This will not override cache-control: no-cache.

Set if missing: Honor origin-provided cache-directive headers, if they exist; otherwise, use the provided cache duration.
Bypass cache: Do not cache and ignore origin-provided cache-directive headers.

Box 2: 1 hour
All media content must expire from the cache after one hour.

Box 3: Cache every unique URL
Cache every unique URL: In this mode, each request with a unique URL, including the query string, is treated as a unique asset with its own cache.

For example, the response from the origin server for a request for example.ashx?q=test1 is cached at the POP node and returned for subsequent requests with the same query string. A request for example.ashx?q=test2 is cached as a separate asset with its own time-to-live setting.

Incorrect Answers:
Bypass caching for query strings: In this mode, requests with query strings are not cached at the CDN POP node. The
POP node retrieves the asset directly from the origin server and passes it to the requestor with each request.

Ignore query strings: Default mode. In this mode, the CDN point-of-presence (POP) node passes the query strings from
the requestor to the origin server on the first request and caches the asset.

All subsequent requests for the asset that
is served from the POP ignore the query strings until the cached asset expires.
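The three query-string modes differ only in what they use as the cache key. A Python sketch of the behavior (illustrative; Azure CDN applies these rules at the POP, not in your code):

```python
from urllib.parse import urlsplit

def cache_key(url, mode):
    """Compute a CDN-style cache key for a URL under each query-string mode.

    - "ignore": the query string is dropped, so all variants share one cached asset.
    - "cache-every-unique-url": the full path plus query string is the key.
    - "bypass": returns None -- the request is never cached at the POP.
    """
    parts = urlsplit(url)
    if mode == "bypass":
        return None
    if mode == "ignore":
        return parts.path
    if mode == "cache-every-unique-url":
        return parts.path + ("?" + parts.query if parts.query else "")
    raise ValueError("unknown mode: " + mode)

assert cache_key("https://cdn/example.ashx?q=test1", "ignore") == "/example.ashx"
assert cache_key("https://cdn/example.ashx?q=test1", "cache-every-unique-url") == "/example.ashx?q=test1"
assert cache_key("https://cdn/example.ashx?q=test2", "cache-every-unique-url") == "/example.ashx?q=test2"
assert cache_key("https://cdn/example.ashx?q=test1", "bypass") is None
```

Because each quality variant of a video arrives with a different query string, "Cache every unique URL" is the mode that keeps each variant as its own cached asset.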



You are developing an ASP.NET Core Web API web service. The web service uses Azure Application Insights for all
telemetry and dependency tracking. The web service reads and writes data to a database other than Microsoft SQL

You need to ensure that dependency tracking works for calls to the third-party database.
Which two dependency telemetry properties should you use? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

A. Telemetry.Context.Cloud.RoleInstance
B. Telemetry.Id
C. Telemetry.Name
D. Telemetry.Context.Operation.Id
E. Telemetry.Context.Session.Id
Correct Answer: BD


public async Task Enqueue(string payload)
{
    // StartOperation is a helper method that initializes the telemetry item
    // and allows correlation of this operation with its parent and children.
    var operation = telemetryClient.StartOperation<DependencyTelemetry>("Enqueue " + queueName);
    operation.Telemetry.Type = "Azure Service Bus";
    operation.Telemetry.Data = "Enqueue " + queueName;

    var message = new BrokeredMessage(payload);
    // A Service Bus queue allows a property bag to pass along with the message.
    // We will use it to pass our correlation identifiers (and other context)
    // to the consumer.
    message.Properties.Add("ParentId", operation.Telemetry.Id);
    message.Properties.Add("RootId", operation.Telemetry.Context.Operation.Id);
}



You are using Azure Front Door Service.
You are expecting inbound files to be compressed by using Brotli compression. You discover that inbound XML files are
not compressed. The files are 9 megabytes (MB) in size.

You need to determine the root cause of the issue.
To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Hot Area:

Correct Answer:

Box 1: No
Front Door can dynamically compress content on the edge, resulting in smaller and faster responses to your clients. Not all files are eligible for compression: a file must be of a MIME type that is on the compression-eligible list, and its size must be within the supported range (1 KB to 8 MB). At 9 MB, the XML files are too large to be compressed.

Box 2: No
Sometimes you may wish to purge cached content from all edge nodes and force them all to retrieve newly updated assets. This might be due to updates to your web application, or to quickly update assets that contain incorrect information.
Box 3: Yes
These profiles support the following compression encodings: gzip (GNU zip) and Brotli.


If you are interested in other Microsoft certifications, you can check my other blog articles on Microsoft exam practice tests.


This exam is neither difficult nor easy! It takes time and practice to prepare properly. The key is to find your own learning pattern and keep improving! Pass4itSure AZ-204 dumps are a good helper; come and get them: (PDF + VCE) two formats to choose from!