Sunday, August 4, 2024
Presentation - API Security: Essential Practices for Developers
Tuesday, July 30, 2024
Mentorship - Mentoring Circle for Microsoft Data & AI Bootcamp
Monday, June 10, 2024
Presentation - Securely expose backend services with Azure API Management
Sunday, April 21, 2024
Presentation - Implementing Zero Trust strategy with Azure
I recently conducted a session on addressing modern security challenges and implementing the zero-trust model at Global Azure 2024, hosted by Microsoft in Perth.
It was an enlightening event with a wealth of technical content presented.
Below is the presentation I delivered at the event.
Monday, February 26, 2024
Understand your Azure spending: Harnessing Power BI to analyze monthly expenditure
Cloud cost management, a component of FinOps, is a complex and challenging exercise. Azure, being a public cloud, hosts diverse workloads across different service tiers and regions, which makes tracking and controlling costs difficult.
In this article, I will demonstrate how I developed a Power BI dashboard to explore and analyze the costs associated with my Azure usage.
There are several methods to access usage and associated costs. While utilizing the Azure cost management API is one approach, for this article, I will opt for the monthly usage file, which offers a more convenient solution.
As the first step, navigate to the subscription and open the Invoices section.
Afterward, proceed to the 'More Options' section and download the usage file in CSV format for the desired billing period.
Next, upload the CSV file to your Power BI environment. Once uploaded, you'll be able to view the schema in the data pane.
Let's begin creating our dashboard. Firstly, we'll analyze the cost by each resource type. To do this, drag the 'Cost' and 'MeterCategory' columns onto the canvas. Then, convert the visualization to a Pie Chart.
Now, let's proceed to our second visualization. This visualization will enable us to analyze the cost of each service based on the plan or tier. To achieve this, we will create a table displaying the 'Cost', 'MeterSubCategory', and 'MeterName'.
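Both visuals boil down to simple group-by aggregations over the usage file. As a rough illustration, here is how the same roll-ups look in plain Python. The sample rows and cost values below are made up for the example; real usage exports contain many more columns, and exact column names can vary by billing account type.

```python
import csv
import io
from collections import defaultdict

# Made-up rows mimicking an Azure monthly usage CSV export.
sample_csv = """MeterCategory,MeterSubCategory,MeterName,Cost
Storage,General Block Blob,LRS Data Stored,1.25
Virtual Machines,Dv3 Series,D2 v3,10.50
Storage,General Block Blob,Write Operations,0.30
Virtual Machines,Dv3 Series,D4 v3,21.00
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))

# Visual 1: total cost per resource type (what the pie chart shows).
cost_by_category = defaultdict(float)
for row in rows:
    cost_by_category[row["MeterCategory"]] += float(row["Cost"])

# Visual 2: cost per plan/tier (what the table shows).
cost_by_meter = defaultdict(float)
for row in rows:
    key = (row["MeterSubCategory"], row["MeterName"])
    cost_by_meter[key] += float(row["Cost"])

print(cost_by_category["Virtual Machines"])        # 31.5
print(cost_by_meter[("Dv3 Series", "D2 v3")])      # 10.5
```

Power BI does the same aggregation implicitly when you drag 'Cost' against 'MeterCategory'; seeing it spelled out makes it easier to sanity-check the dashboard numbers against the raw file.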
Sunday, February 18, 2024
Optimize Your Azure Spending: How to visualize expenditure across services with Azure Cost Management - Cost Analysis
Managing our cloud expenditures can present a formidable challenge. The presence of multiple tiers and a diverse range of services further complicates the task. However, cost management is very important in building a well-architected cloud.
Azure provides a feature called Cost Management, which is the central place to monitor and govern our cloud expenditure.
The landing page of Cost Analysis gives you a basic idea of your cloud costs. Additionally, it highlights which categories to watch in order to control and govern them.
However, it doesn't let you drill down into each service to analyze which products or services caused the expenditure. Yet this analysis is essential to implementing the Cost Optimization pillar of the Azure Well-Architected Framework.
Fortunately, the Cost Analysis tool offers distinct dashboards tailored for examining expenses at the service level.
To access these dashboards, simply select the "Services" option from the Cost by Resource menu as shown in the diagram below.
You will then be taken to a different dashboard with further insights.
If you have multiple workloads originating from a particular service, you can expand that service to view the expenditure breakdown for each individual product or component.
Clicking on a specific service, such as Azure App Service, will navigate you to another dashboard. Notably, this dashboard provides a suggested monthly budget value to assist with financial planning.
With insights gleaned from the aforementioned dashboards, we can make informed decisions regarding our cloud expenditure. By creating tailored budgets for specific services and setting up alerts to notify us when these services are nearing predefined thresholds, we can effectively manage our spending and optimize resource utilization.
Wednesday, February 7, 2024
Interact with the cache using Azure Cache for Redis - Redis Console
Azure Cache for Redis is immensely valuable for improving response times: by serving data from the cache, it reduces latency and can significantly enhance application performance.
You can interact with your Redis cluster using redis-cli after installing the required tools on your client workstation.
Alternatively, Azure offers a convenient solution directly within the Azure portal: the Redis Console. Accessible through the Console menu, this integrated feature provides a user-friendly interface for streamlined management of your Redis cache.
Within the Redis Console, you can interact with your Redis cluster using redis-cli commands. Here are some examples:
SCAN 0                // iterate the keyspace and list current keys
GET hello             // get the value of a specific key
HGETALL userprofile   // get all fields of a key whose value is a hash (object/collection)
Overall, the experience with the Redis Console is seamless.
Monday, January 29, 2024
Optimizing Static File Performance: Implementing Caching and Compression with Azure Front Door
Azure Front Door is a global CDN service that enables you to securely expose your web applications and content to the outside world. In this short article, I will demonstrate how to cache and compress responses by leveraging the caching and compression features provided by Azure Front Door.
It is advisable to apply caching and compression to static files such as CSS, images, JSON files, CSV, etc., as opposed to dynamic content. Therefore, careful route planning is imperative before embarking on the implementation of caching and compression strategies.
Following is an example.
- route 1 - /api/* (dynamic content; caching and compression disabled)
- route 2 - /assets/* (static content; caching and compression enabled)
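To make the split concrete, here is a small, purely illustrative Python sketch of the routing decision described above: static asset paths get caching and compression, API paths pass through untouched. The function and constants are hypothetical helper logic for illustration, not an Azure Front Door API.

```python
# Hypothetical route-prefix lists mirroring the example routes above.
CACHED_ROUTE_PREFIXES = ("/assets/",)
BYPASS_ROUTE_PREFIXES = ("/api/",)

def caching_policy(path: str) -> dict:
    """Return the caching/compression decision for a request path."""
    if path.startswith(CACHED_ROUTE_PREFIXES):
        # Static content: safe to cache at the edge and compress.
        return {"cache": True, "compress": True}
    if path.startswith(BYPASS_ROUTE_PREFIXES):
        # Dynamic content: forward to the origin untouched.
        return {"cache": False, "compress": False}
    # Default: treat unknown routes as dynamic to stay safe.
    return {"cache": False, "compress": False}

print(caching_policy("/assets/site.css"))  # cached and compressed
print(caching_policy("/api/orders"))       # passed through
```

Planning the routes up front like this keeps dynamic responses from ever being served stale from the cache.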
Tuesday, January 9, 2024
Investigate the root cause for latency with Azure Application Insights
In this article, I will demonstrate how to pinpoint the root cause when end users experience general latency. Azure Monitor - Application Insights will be instrumental in this process.
Firstly, we need to navigate to the Performance blade of the Application Insights instance.
Following that, it is better to apply filters to refine the dataset.
As we aim to identify the worst-performing requests, it is advisable to conduct the investigation using the 99th percentile.
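To see why the 99th percentile surfaces slow requests that averages and medians hide, here is a small Python sketch over made-up request durations (the numbers are invented for illustration; the percentile helper uses the simple nearest-rank method, which may differ slightly from how Application Insights interpolates):

```python
import math

# Made-up request durations (ms): mostly fast, with a slow 5% tail.
durations = [100] * 95 + [5000] * 5

def percentile(values: list[float], p: float) -> float:
    """Nearest-rank percentile: smallest value covering at least p% of the data."""
    ordered = sorted(values)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

mean = sum(durations) / len(durations)
p50 = percentile(durations, 50)
p99 = percentile(durations, 99)

print(mean)  # 345.0 -- the average understates the tail
print(p50)   # 100   -- the median hides it completely
print(p99)   # 5000  -- the 99th percentile exposes the slow requests
```

This is exactly why the Performance blade's percentile selector matters: the slow tail that end users complain about often vanishes in mean and median views.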
There is a distinct outlier present. Let's delve deeper into the investigation by narrowing down the time range to examine specific instances of the failure.