Thursday, March 20, 2025

Optimizing Your Cloud Spending: Planning Resource Group Costs for the Next Year

Cloud cost management remains crucial even after migrating and modernizing your workloads. It's easy to overlook cost optimization once you've adopted the cloud. Without proper governance, you may end up overspending on unnecessary cloud resources.

That's why cost optimization is a key pillar of the Azure Well-Architected Framework. In this article, we’ll explore a structured approach to effectively control and manage costs for a specific Resource Group.

The following is the approach I took.

1. Analysis with the Daily Costs report

First, we need to understand the historical costs for the specific Resource group. To gain these insights, navigate to the Cost Analysis section in Cost Management.

Let's change the scope to our target Resource group first.

I like to start the analysis with the Daily Costs report. However, the daily view doesn’t provide meaningful insights for effective cost governance. To get a clearer picture, I change the granularity to Monthly.

This initially gives me only one data point. To gain a broader perspective, I adjust the time frame to one year.

Now, I can see that my usual monthly cost is around $70.
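The switch from daily to monthly granularity is just an aggregation step. As a minimal Python sketch of what Cost Analysis does for us (the daily rows here are hypothetical sample data, not my actual figures):

```python
from collections import defaultdict
from datetime import date

def monthly_totals(daily_costs):
    """Roll up (date, cost) rows into a (year, month) -> total cost map."""
    totals = defaultdict(float)
    for day, cost in daily_costs:
        totals[(day.year, day.month)] += cost
    return dict(totals)

# Hypothetical daily rows, as a Daily Costs report would list them
rows = [(date(2025, 2, 1), 2.3), (date(2025, 2, 2), 2.4), (date(2025, 3, 1), 2.5)]
print(monthly_totals(rows))
```

Seeing one number per month, rather than thirty noisy daily points, is what makes the trend (and the ~$70 baseline) obvious.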


2. Analysis with Cost by resource

With the Daily Cost report, we gained a general view of our Resource group expenditure. Now, we can dive deeper using the built-in Cost by resource report. This will show how individual resources contribute to the monthly costs.

Navigate to the Cost analysis section and select the Cost by resource report.

Set the granularity to Monthly, as we did for the previous report.

Next, navigate to the time period filter and select Last 12 Months.

Finally, change the chart type to Column (Grouped) for better visibility and a clearer breakdown of resource costs.

The following is the final report I received.

With this report, I can clearly see that the majority of the cloud spend for this Resource group comes from my Azure API Management instance.

3. Control and save costs

Now, let's explore how we can control and manage this expenditure for the next year. Ideally, we should review the Azure services within the Resource group and identify measures we can enforce to reduce costs.

In my example, I should focus on cost optimization measures for my APIM instance. One effective way to reduce costs is by implementing Auto-scaling, which adjusts resources based on demand, preventing unnecessary overprovisioning.

Here are some strategies to help reduce costs for your Azure resources:
  • Right-Sizing Resources
  • Use Reserved Instances
  • Leverage Spot Instances
  • Auto-scaling
  • Optimize storage
  • Turn off unused resources

4. Monitor and govern costs

The next step is to ensure that my costs for the next year are properly planned and to set up alerts that notify me of any deviations from my budget. This helps me maintain control and take timely action to avoid unexpected expenses.

Let's create a Budget for my Resource group. To do this, navigate to the Budgets section in Cost Management, ensuring the Scope is set to the specific Resource group.

Click Add to create a new Budget. Then, provide the Creation date, Expiration date, and set a Budget amount. After my cost optimization efforts, I plan for a $60 monthly budget.

Next, I will set up conditions to generate alerts for any deviations or anomalies in my cloud spending. In the following example, I have configured two alert conditions.
  • Alert 1: Trigger an alert when actual spending reaches 80% of the budget ($48).
  • Alert 2: Trigger an alert when Azure predicts that spending will exceed 90% of the budget before the end of the current month.
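The same budget and alerts can also be created programmatically, via a PUT to the Microsoft.Consumption budgets endpoint on the Resource group scope. The following is a minimal Python sketch of the request body only; the notification names, dates, and e-mail address are placeholders I made up, so verify the schema against the current Budgets API reference before relying on it:

```python
def budget_body(amount, start, end, emails):
    """Build a Microsoft.Consumption/budgets request body with the two
    alert conditions described above: actual spend at 80% of the budget
    and forecasted spend at 90%."""
    return {
        "properties": {
            "category": "Cost",
            "amount": amount,            # monthly budget, e.g. 60
            "timeGrain": "Monthly",
            "timePeriod": {"startDate": start, "endDate": end},
            "notifications": {
                "Actual80": {
                    "enabled": True,
                    "operator": "GreaterThanOrEqualTo",
                    "threshold": 80,             # 80% of $60 = $48
                    "thresholdType": "Actual",
                    "contactEmails": emails,
                },
                "Forecast90": {
                    "enabled": True,
                    "operator": "GreaterThanOrEqualTo",
                    "threshold": 90,             # forecasted spend
                    "thresholdType": "Forecasted",
                    "contactEmails": emails,
                },
            },
        }
    }

body = budget_body(60, "2025-04-01", "2026-03-31", ["me@example.com"])
```

The `thresholdType` of `Forecasted` is what distinguishes the predictive alert from the actual-spend one.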

These measures help me achieve cost optimization for my Resource group, ensuring better financial control and efficient cloud spending.

Tuesday, February 18, 2025

How to Obtain an Access Token for Azure Cost Management API Calls

In this article, I will explain how to obtain an access token for calling Azure Cost Management APIs.

The following are the steps I used.

Create an App Registration and note the Tenant ID and Client ID.

Next, generate a client secret and store it securely.

Our app registration is now successful. Next, we need to assign the service principal (app registration) the appropriate permissions to access cost data from our subscription.

Go to your Subscription, navigate to the Access Control (IAM) section, and click on 'Add role assignment'.

We will assign the Cost Management Reader role.

Next, select the App registration we created and complete the role assignment process.

That's all we have to do within the Azure portal. Let's construct our API request using the Postman client.

#URL
https://login.microsoftonline.com/{Tenant ID}/oauth2/v2.0/token

#METHOD
POST

#X-WWW-FORM-URLENCODED
grant_type=client_credentials
&client_id={Client ID}
&client_secret={Client Secret}
&scope=https://management.azure.com/.default
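The same token request can be scripted instead of using Postman. The following is a minimal Python sketch using only the standard library; the tenant ID, client ID, and secret are placeholders you would supply from your own App Registration:

```python
import json
import urllib.parse
import urllib.request

def build_token_request(tenant_id, client_id, client_secret):
    """Build the client-credentials token request for Microsoft Entra ID."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://management.azure.com/.default",
    }).encode()
    return urllib.request.Request(url, data=body, method="POST")

def get_access_token(tenant_id, client_id, client_secret):
    """Send the request and pull access_token out of the JSON response."""
    req = build_token_request(tenant_id, client_id, client_secret)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

Note that the form body is URL-encoded, exactly as in the Postman request above, and the scope is the Azure Resource Manager `.default` scope.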

The following is the response I get, including the access token.



Monday, February 17, 2025

Optimizing Data Retention for Specific Tables in Azure Log Analytics Workspace

Application Insights plays a crucial role in applications hosted on Microsoft Azure. It is important to periodically review the associated Log Analytics Workspace instances to ensure they are properly governed and optimized.

The following areas should be examined for these resources in alignment with the Azure Well-Architected Framework.

  • Reliability
  • Security
  • Cost Optimization
  • Operational Excellence
  • Performance Efficiency
The following analysis aligns with the Cost Optimization, Operational Excellence, and Performance Efficiency pillars of the Well-Architected Framework.

Recently, I noticed an increase in app usage for a client, leading to higher log volumes. 

We can easily analyze this pattern by navigating to the Log Analytics Workspace and checking the Usage & Estimated Costs section.

It is evident that AppTraces have increased recently, which may negatively impact overall query performance.

My Log Analytics Workspace is set to the default 90-day data retention period, which I prefer to keep unchanged to achieve cost optimization.

Since the outlier is only in the AppTraces table, I want to reduce the retention period specifically for that table. The expectation is to maintain a lower data volume in that container while keeping the overall workspace retention unchanged.

Following was the approach I used.

Navigate to Tables in the Log Analytics Workspace and select the AppTraces table.

Click on Manage Table to access the table settings.

Change the retention period to 60 days, which overrides the default retention setting for this table only.

That's it! The AppTraces table will now retain data for 60 days, reducing log volume while keeping the overall workspace retention unchanged.
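The same per-table override can be applied through the Log Analytics Tables REST API by patching the table's `retentionInDays`. The following is a minimal Python sketch that only builds the PATCH URL and body; the subscription, resource group, and workspace names are placeholders, and you should confirm the api-version against the current Tables API reference:

```python
def table_retention_request(subscription, resource_group, workspace, table, days):
    """Build the URL and body for a PATCH that overrides retention
    on a single Log Analytics table, leaving the workspace default alone."""
    url = (
        "https://management.azure.com"
        f"/subscriptions/{subscription}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.OperationalInsights"
        f"/workspaces/{workspace}/tables/{table}"
        "?api-version=2022-10-01"
    )
    body = {"properties": {"retentionInDays": days}}
    return url, body

url, body = table_retention_request("sub-id", "my-rg", "my-law", "AppTraces", 60)
```

This is handy when you want to script the override across several workspaces instead of clicking through Manage Table each time.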

Let's consider the opposite approach: How can you increase data retention while still optimizing costs?

You can also use this approach to optimize costs. For example, if you need to increase the retention period from 90 to 120 days, instead of applying the change to all tables, you can selectively extend retention only for the tables that truly require it. This helps balance cost efficiency and data availability.

Additional retention will incur costs based on the amount of storage consumed.

By strategically planning data retention for specific tables, you can effectively optimize costs.

Thursday, January 16, 2025

Using Azure Cost Management API to Analyze Resource Group Expenditures by Cost Unit

Azure Cost Management offers various tools to track expenses and optimize costs across our tenants. The following are some features provided by Azure Cost Management:
  • Cost Analysis
  • Budgets
  • Cost Alerts
  • Cost Allocation
  • Exports
  • Recommendations
  • Multi-Cloud Support
  • Governance and Accountability
  • Integration with Power BI
  • API
In this short article, I'll explain how to view expenditures for a specific resource group, broken down by the various meters Azure uses for cost calculation.

By analyzing the output, we can gain insights into how Azure calculates costs for our resources, ensuring there are no surprises when the bill arrives.

One way to achieve this is by using Cost Management, specifically the Cost Analysis section.

However, I sometimes prefer using the Cost Management API over the Cost Analysis section because it offers greater flexibility and allows me to customize parameters to better achieve my objectives.

The following is the API call I used.

Request
#URL
https://management.azure.com/subscriptions/{Subscription ID}/resourceGroups/{Resource Group}/providers/Microsoft.CostManagement/query?api-version=2024-08-01

#METHOD
POST

#AUTHORIZATION
Bearer Token

#BODY
{
  "type": "Usage",
  "timeframe": "MonthToDate",
  "dataset": {
    "granularity": "None",
    "aggregation": {
      "totalCost": {
        "name": "PreTaxCost",
        "function": "Sum"
      }
    },
    "grouping": [
      {
        "type": "Dimension",
        "name": "ResourceType"
      },
      {
        "type": "Dimension",
        "name": "MeterCategory"
      }
    ]
  }
}

You can use the approach described in the earlier article on obtaining an access token.

Response
{
    "id": "subscriptions/{Tenant ID}/resourcegroups/{Resource Group}/providers/Microsoft.CostManagement/query/5c25de9e-06aa-4839-81b4-64273e0bc86f",
    "name": "5c25de9e-06aa-4839-81b4-64273e0bc86f",
    "type": "Microsoft.CostManagement/query",
    "location": null,
    "sku": null,
    "eTag": null,
    "properties": {
        "nextLink": null,
        "columns": [
            {
                "name": "PreTaxCost",
                "type": "Number"
            },
            {
                "name": "ResourceType",
                "type": "String"
            },
            {
                "name": "MeterCategory",
                "type": "String"
            },
            {
                "name": "Currency",
                "type": "String"
            }
        ],
        "rows": [
            [
                34.957915,
                "microsoft.apimanagement/service",
                "API Management",
                "AUD"
            ],
            [
                0.0016342144,
                "microsoft.keyvault/vaults",
                "Key Vault",
                "AUD"
            ],
            [
                0.036166705555556,
                "microsoft.loadtestservice/loadtests",
                "Azure Load Testing",
                "AUD"
            ],
            [
                0.034666562722667,
                "microsoft.operationalinsights/workspaces",
                "Log Analytics",
                "AUD"
            ],
            [
                0.000007542612,
                "microsoft.storage/storageaccounts",
                "Bandwidth",
                "AUD"
            ],
            [
                0.0001935796,
                "microsoft.storage/storageaccounts",
                "Storage",
                "AUD"
            ],
            [
                0.0,
                "microsoft.web/sites",
                "Azure App Service",
                "AUD"
            ],
            [
                0.000000359172,
                "microsoft.web/sites",
                "Bandwidth",
                "AUD"
            ]
        ]
    }
}

With these insights, I can identify the various meters used for my resources and see the expenditure associated with each meter.
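The columns/rows shape of the query response is easy to turn into records for further analysis. A minimal sketch (the sample data below mirrors the response format above, trimmed to two columns):

```python
def rows_to_records(properties):
    """Zip the Cost Management query's 'columns' and 'rows' into dicts."""
    names = [c["name"] for c in properties["columns"]]
    return [dict(zip(names, row)) for row in properties["rows"]]

def total_cost(records):
    """Sum PreTaxCost across all meter rows."""
    return sum(r["PreTaxCost"] for r in records)

properties = {
    "columns": [
        {"name": "PreTaxCost", "type": "Number"},
        {"name": "MeterCategory", "type": "String"},
    ],
    "rows": [
        [34.957915, "API Management"],
        [0.0016342144, "Key Vault"],
    ],
}
records = rows_to_records(properties)
```

Once the rows are dicts, sorting by `PreTaxCost` or grouping by `MeterCategory` is trivial, which is exactly why I prefer the API over the portal charts for this kind of breakdown.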

Wednesday, January 15, 2025

Mocking Custom Responses with Azure API Management – Custom Mock Response from External Files

This article is the third part of a multi-part series. In this section, I will explain how to render custom mock responses from JSON files stored in a storage container. Below are the different parts of this article series.

As discussed in the second part of this series, we can embed JSON responses directly within the policy. However, this approach can become messy and difficult to modify or maintain over time if you have many mock scenarios.

Custom mock responses from external files

As an improvement, we can store mock responses externally and retrieve them based on the request using APIM policies. An added advantage is that we can better organize our mock responses in external storage, where we can leverage features like versioning, enhanced availability, and more efficient management.

Here’s the approach I used:

I provisioned an Azure Storage Account and created a Blob Container. Then, I organized my mock responses within a dedicated folder inside the container.

The following is the content of one of the JSON files.

{
    "id": 1,
    "name": "Introduction to Azure",
    "category": "Cloud Computing",
    "description": "A beginner-friendly course covering the fundamentals of Microsoft Azure.",
    "duration": "4 weeks",
    "instructor": "John Doe"
}


Next, I generated a Shared Access Signature (SAS) token with Read permissions and obtained the Blob SAS URL to securely access the mock responses.

Then, I navigated to the APIM instance, selected the Inbound Processing section, and opened the Policy Editor.

The following is an extract of the policy.
<inbound>
        <base />
        <choose>
            <when condition="@(context.Request.Url.Query.GetValueOrDefault("id") == "1")">
                <send-request mode="new" response-variable-name="response" timeout="10" ignore-error="false">
                    <set-url>@($"https://rgstoragefedora001.blob.core.windows.net/data/Subject/1.json?sp=r&st=2024-03-08T12:01:42Z&se=2024-03-08T20:01:42Z&spr=https&sv=2022-11-02&sr=b&sig=DrqWZJxGP38PNJYFKCoMgIqn6AjNC71nw0x%2FjITu4aM%3D")</set-url>
                    <set-method>GET</set-method>
                </send-request>
                <return-response>
                    <set-status code="200" reason="OK" />
                    <set-header name="Content-Type" exists-action="override">
                        <value>application/json</value>
                    </set-header>
                    <set-body>@(new JObject(((IResponse)context.Variables["response"]).Body.As<JObject>()).ToString())</set-body>
                </return-response>
            </when>
            <when condition="@(context.Request.Url.Query.GetValueOrDefault("id") == "2")">
                <send-request mode="new" response-variable-name="response" timeout="10" ignore-error="false">
                    <set-url>@($"https://rgstoragefedora001.blob.core.windows.net/data/Subject/2.json?sp=r&st=2024-03-08T12:34:15Z&se=2024-03-08T20:34:15Z&spr=https&sv=2022-11-02&sr=b&sig=OVhf3uGIASMHWHtg8F%2BypQzLatI9DeEzqoFns2iGOUk%3D")</set-url>
                    <set-method>GET</set-method>
                </send-request>
                <return-response>
                    <set-status code="200" reason="OK" />
                    <set-header name="Content-Type" exists-action="override">
                        <value>application/json</value>
                    </set-header>
                    <set-body>@(new JObject(((IResponse)context.Variables["response"]).Body.As<JObject>()).ToString())</set-body>
                </return-response>
            </when>
            <otherwise>
                <return-response>
                    <set-status code="404" reason="Not Found" />
                    <set-header name="Content-Type" exists-action="override">
                        <value>application/json</value>
                    </set-header>
                    <set-body>{
                            "error": "Student ID not found"
                        }</set-body>
                </return-response>
            </otherwise>
        </choose>
    </inbound>


That's all we need to configure within the policy. Next, I navigated to the Test console to verify the response. I need to specify the query string as shown below.

Following is the response I got as expected

Tuesday, December 24, 2024

Mocking Custom Responses with Azure API Management – Custom Mock Response

This article is the second part of a three-part series. We will discuss how to render custom mock responses using APIM policy. Below are the different parts of this article series.

Custom mock responses

Often, we need more than just a 200 OK response without a body. Instead, we require comprehensive responses formatted as JSON messages. Adding to the complexity, the response often needs to vary based on specific query string parameters.

Here’s the approach I used to generate custom mock responses in Azure API Management based on varying query string parameters.

Navigate to your API Management instance and select the specific API operation you want to configure.


Click on the Inbound Processing section and open the Policy Code Editor.

We will modify the inbound section of the policy. We will add a policy segment to dynamically generate the response body based on a specified query string parameter.

Here is the code I used


<inbound>
        <base />
        <choose>
            <when condition="@(context.Request.Url.Query.GetValueOrDefault("id") == "1")">
                <return-response>
                    <set-status code="200" reason="OK" />
                    <set-header name="Content-Type" exists-action="override">
                        <value>application/json</value>
                    </set-header>
                    <set-body>{
                        "studentId": "1",
                        "name": "Jane Smith",
                        "grade": "B"
                    }</set-body>
                </return-response>
            </when>
            <when condition="@(context.Request.Url.Query.GetValueOrDefault("id") == "2")">
                <return-response>
                    <set-status code="200" reason="OK" />
                    <set-header name="Content-Type" exists-action="override">
                        <value>application/json</value>
                    </set-header>
                    <set-body>{
                        "studentId": "2",
                        "name": "Jane Smith",
                        "grade": "B",
                    }</set-body>
                </return-response>
            </when>
            <otherwise>
                <return-response>
                    <set-status code="404" reason="Not Found" />
                    <set-header name="Content-Type" exists-action="override">
                        <value>application/json</value>
                    </set-header>
                    <set-body>{
                            "error": "Student ID not found"
                        }</set-body>
                </return-response>
            </otherwise>
        </choose>
    </inbound>
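For clarity, the branching in the policy above behaves like this small Python sketch; it uses the same mock student data as the policy:

```python
def mock_response(query):
    """Mimic the APIM <choose> policy: pick a mock body by the 'id'
    query string parameter, otherwise return a 404 body."""
    responses = {
        "1": {"studentId": "1", "name": "Jane Smith", "grade": "B"},
        "2": {"studentId": "2", "name": "Jane Smith", "grade": "B"},
    }
    student = responses.get(query.get("id"))
    if student is None:
        return 404, {"error": "Student ID not found"}
    return 200, student
```

Each `<when>` element is one dictionary lookup here, and the `<otherwise>` branch is the 404 fallback.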

You can test the outcome by navigating to the Test console and specifying the appropriate query string parameter, as shown below.

Then, submit the request to verify that the appropriate response is returned based on the specified query string parameter.

Tuesday, December 17, 2024

Ensuring Static IP for Azure App Service When Accessing External APIs Over the Internet

Azure App Service assigns a range of outbound IPs when accessing external resources. However, if the external resource requires IP whitelisting, the default configuration may not be practical. This article outlines the steps to ensure that the external API is accessed using a static public IP.

The following is the default configuration. With this setting, the external API may be called from any of the IPs within the range.

To achieve this objective, I chose to use a NAT Gateway and related components. We need to complete the following tasks to implement the solution:
  • Integrate the Web App with a Subnet in a Virtual Network
  • Create a Public IP
  • Create and Configure a NAT Gateway
  • Associate the Web App with the NAT Gateway
  • Test

If the external API or resource were within a private network (e.g., on-premises), we could have used Hybrid Connections.

Let's discuss the implementation of each item.

1. Integrate the Web App with a Subnet in a Virtual Network

Create a virtual network and a subnet, or use an existing virtual network.

Next, navigate to your Web App, go to the Networking section, and enable Virtual Network Integration to connect it to the designated subnet.

2. Create a Public IP

3. Create and Configure a NAT Gateway

Specify the outbound IP. Once the NAT Gateway is configured, we can associate multiple public IP addresses with it if needed.

Specify the Subnet

4. Associate the Web App with the NAT Gateway

Since the Web App is integrated with the same subnet, the NAT Gateway is automatically applied to its outbound traffic.

5. Test

Let's test the solution. To validate the setup, I deployed a sample .NET API that calls an external service, which returns the calling IP address. 


    [ApiController]
    [Route("api/[controller]")]
    public class GatewayTestController : ControllerBase
    {
        private readonly HttpClient _httpClient;
        public GatewayTestController(HttpClient httpClient)
        {
            _httpClient = httpClient;
        }

        [HttpGet]
        public async Task<IActionResult> GetOutboundIp()
        {
            // Call external API over internet
            var response = await _httpClient.GetAsync("https://httpbin.org/ip");
            if (!response.IsSuccessStatusCode)
            {
                return StatusCode((int)response.StatusCode, "Failed to get outbound IP");
            }

            var callerIP = await response.Content.ReadAsStringAsync();
            return Ok(callerIP);
        }
    }

The following is the response I get. It matches exactly the public IP I provisioned.