Friday 23 February 2024

UML Activity Diagrams

UML (Unified Modeling Language) activity diagrams are graphical representations used to model workflows, processes, and behaviours within a system. They provide a visual overview of the steps involved in accomplishing a task or achieving a specific outcome. Most UML diagrams are geared towards software design; however, a few can be leveraged to understand systems from both a human and a technology perspective. In the previous example on simple flowcharts we created a high-level model of a checkout experience; for this example we'll focus on the cash register process.

Symbols

Activity diagrams, unlike simple flowcharts, have a more stringent set of symbols; this is because UML is much more standardised, whereas flowcharts are much more pragmatic. Activity diagrams are not appropriate for all audiences, but they are a great way for business analysts and software engineers to collaborate and ensure that they have a shared understanding.

Symbol Description
  • A solid dot represents the start of the process
  • A solid dot with a circle around it represents the end of the process
  • A circle with an X in the middle represents the end of a branch or activity
  • A diamond indicates that a branch is created, or that two branches merge into one
  • A rectangle with rounded corners represents an activity, something that is done
  • A bar with one flow entering and a number of flows exiting depicts parallel processes
  • A bar with multiple flows entering and only one exiting depicts a pause point, where all parallel processes must converge
  • The note symbol gives the diagram creator the opportunity to annotate the diagram with contextual information for the reader
  • The send signal notation indicates that the process is outputting a message to another process
  • The receive signal notation indicates that the process is waiting for input from another process
  • The timer notation indicates how long a process is supposed to wait before continuing
  • The frequency notation indicates a timed job, i.e. how long before an action is triggered
  • The interrupt notation indicates that something has occurred which needs to branch to a different flow
  • The call other process notation indicates that the activity is actually a sub-process

Example

Previously we created a simple flowchart depicting a checkout process; this time let's create an activity diagram laying out the cash register process.


Again, this is a quick, contrived example that only touches upon the process; there is no doubt in my mind that if you sat down with a cashier you would find many more nuances and improvements. Notice, however, that if you showed this diagram to a cashier they would most likely be confused, whereas a simple flowchart would most likely facilitate a discovery workshop.

Monday 19 February 2024

Swim lanes

Swim lanes are not exclusive to any particular business processing technique, and can in fact be used with simple flowcharts, activity diagrams, BPMN 2.0 or almost any other process modelling technique. Swim lanes are visual containers that partition a business process diagram horizontally or vertically. They represent different participants, roles, departments, systems, or organisational units involved in the execution of a process. Swim lanes provide clarity and structure to process diagrams by categorising activities based on their performers or responsibilities.

The above is called a 'Pool'; this pool has swim lanes. Here I've depicted vertical swim lanes, but there is nothing wrong with using horizontal lanes. I generally choose vertical ones because most mediums have more vertical space than horizontal, not to mention that no one likes reading sideways text; however, the following is perfectly legitimate.

Whether the lanes are horizontal or vertical is completely inconsequential; the value of the swim lane is twofold: it easily identifies the actor, and it segments their contribution to the process.

One thing to keep in mind is that the above is very much a contrived example; one could argue that this is in fact multiple business processes, that the stock clerk is a variation, etc.

The takeaway here is simple: swim lanes segment a process by actor. They represent the different participants, roles, departments, systems, or organisational units involved in the execution of a process, and they provide clarity on who does what and at which point during the process.

Friday 16 February 2024

Simple flowcharts

Flowcharts are visual representations of processes or workflows, often used in various fields such as software development, engineering, business management, and more. They provide a clear and structured way to illustrate the sequence of steps within a system or a procedure. 

Symbols

Simple flowcharts have a number of standard symbols and notations; however, they are a pragmatic approach to process modelling and oftentimes incorporate symbols outside their loosely defined standard set. This pragmatism is as much a benefit as it is a drawback: it is very easy to overuse flowcharts, because anyone can understand them, and oftentimes initial drafts become finalised documents even though other notations may be more appropriate for the particular thing being modelled.

I generally look at simple flowcharts as rough sketches, an easy and quick way to discuss a process, but by no means the finished deliverable.

Symbol Description
  • The start/stop symbol is exactly what it sounds like: it indicates that the process is starting or terminating
  • An action or step is something that happens during the process; this is generally where the work is done
  • The input or output represents something the process needs from a participant in order to continue, or an output which a participant needs to proceed
  • Decision: a decision symbol is a split in logic; based on some condition, the process enters an alternative flow
  • Merge: oftentimes during a process there is a split and the process has multiple flows running in parallel; the merge represents the coming together of these flows, think of it as parallel actions that eventually must wait for each other to complete
  • An artifact is similar to an output; the difference is that the artifact will still exist after the process is complete
  • The artifacts symbol simply represents multiple artifacts
  • The annotation is a simple decoration; it provides the viewer of the process with contextual information that may not be obviously communicated by the diagram itself
  • The process link out symbol indicates that this flow continues somewhere else on the page; the content of the symbol is the key to look for in its link-in counterpart
  • The process link in symbol indicates that this flow is the continuation from somewhere else on the page; the content of the symbol is the key to look for in its link-out counterpart
  • The off page link simply indicates that this process continues on a different page; generally the content of this symbol indicates the continuation page
  • The subprocess symbol indicates that this step is actually its own sub-process, and generally indicates where that sub-process can be found
The above are the basic symbols one could use to model just about any process. These symbols are simple to understand, and though they may not get into the granular details of technical implementations, they are an excellent way to simply depict a process for the purpose of a general discussion.

Grocery store checkout example

Keep in mind that the following is a simplification; there are a large number of sub-processes that are not depicted: what if the client does not have the means to pay, what if they will soon return to pay, what if the item doesn't have a price. Many more variations could exist; however, the following does serve the purpose of a simple depiction.



Friday 9 February 2024

Business process scoping

The core idea of Business Process Modelling (BPM) is to understand the outcome, sequence, and activities needed to achieve a specific result, and to define the rules of interaction between all participants.

Any business process is an end-to-end set of activities which collectively respond to an event and transform information, materials, and other resources into outputs that deliver value directly to the customer(s) of the process. This value may be realised internally within an organisation, or it may span several organisations.

When looking to automate a process, at a high level there are only four aspects to any process:
  1. Trigger: What starts the process
  2. Result: what does the process accomplish
  3. Steps: What are the steps in the process
  4. Needs: What are the specific needs for each step
If you can identify all of the above, you can successfully map any business process. This may seem simple enough on paper; however, in real life this is generally where the murky waters start. Oftentimes organisations do not have clear-cut business processes. Most processes are born out of necessity and start relatively simple, but over time they grow and mutate: something that started pragmatically has, over the years, transformed into a behemoth. Many times these 'organic' processes are never documented, and more often than not they reside in someone's head, or in the heads of a group of people. In the latter case, various stakeholders often understand only part of the process, or worse yet have varying opinions on what the actual process is. For this reason, before modelling anything it is important to lay out the boundaries of the process.

There are a number of business process scoping methods; these methods help you understand the environment around the process, the value of the process, and give you a high-level overview of the process. Understanding these elements lets you establish the boundaries of the process before you begin modelling it.

SIPOC

SIPOC is an acronym that stands for Suppliers, Inputs, Process, Outputs, and Customers. It is a high-level process mapping tool used in business analysis and process improvement.
  • Suppliers: These are the entities that provide the inputs needed for the process to function.
  • Inputs: These are the resources, materials, or information required to initiate the process.
  • Process: This refers to the series of steps or activities that transform inputs into outputs.
  • Outputs: These are the results or products generated by the process.
  • Customers: These are the individuals or entities who receive the outputs of the process.
SIPOC diagrams help to identify and understand the relationships between these key elements of a process, providing a clear overview that aids in analysing and improving processes within an organisation.
  • Suppliers: This is where all of your inputs come from; there could be one supplier or dozens.
  • Inputs: This represents all of the tangible or intangible things you get from your suppliers and that you need for your process to produce outputs.
  • Process: This is a list of all of the steps you need to take your inputs and transform them into your output(s).
  • Outputs: These are the outputs of your process; ideally each process should be mapped to one output, however every rule has its exceptions.
  • Customers: These are your customers; the people, organisations, departments, etc. who will benefit from the outputs.

Keep in mind that the SIPOC is not a tool meant to model your process; this tool's purpose is to have a broader conversation around the upstream and downstream aspects of your process, to understand the suppliers and their inputs, as well as the outputs and their customers.

IGOE

The IGOE (Inputs, Governance, Outputs, and Enablers) framework is a method used in systems thinking and analysis. It breaks down a system into four main components:
  • Inputs: These are the resources, materials, or information that are utilized by the system.
  • Governance: The rules, regulations, and decision-making structures that guide the system.
  • Outputs: These are the results or products generated by the system in pursuit of its goals.
  • Enablers: Factors that facilitate or support the achievement of the system's goals
The IGOE framework places the 'Process' at the centre and then looks to the left and right of it, as well as other factors which impact the process. 


This approach again has its shortcomings; however, the two analysis approaches together provide a strong understanding of the process.

Process scoping

Process scoping is a hybrid of the previous two approaches; it creates a holistic view of the particular process. The advantage of merging the two is a holistic overview in one model; the downside is that this model is complex to create as well as to understand. For this reason it may make more sense to use the previous two models to gain the necessary understanding, and then combine them into one model for a cohesive representation. It is made up of seven parts.
  1. Outcomes: the result of the process
  2. Process steps: the chain of granular steps which result in the outcome
  3. Triggers: anything which starts the process
  4. Participants: every individual or group whose input is required for the process
  5. Variants: Any edge or alternative flows to the main process. 
  6. Governance: The rules, regulations, and decision-making structures that guide the system.
  7. Enablers: Factors that facilitate or support the achievement of the system's goals


Generally, when creating a model such as this, one would work backwards from the outcomes through to the triggers, and then work out through the participants, variants, governance, and enablers. This provides a high-level overview of the process and all of the influencers surrounding it.

Establishing granularity

Regardless of scoping technique, you always end up asking yourself: is this a process of processes? That is, are any steps within my process processes themselves? Let's take a look at the following process steps:

  • Register a lead
  • Score a lead
  • Update status of a lead
  • Sign a contract with a lead
  • Register service request
  • Dispatch a field worker
  • Process payment
  • Assess Service quality
  • Apply correction
  • Produce report

We can ask ourselves: is this one process, or are there multiple processes here? Though all of the above are granular steps in an overall flow, they could be segmented into four different processes.

Token analysis

In token analysis a business process should only handle one thing at a time: the token. Each step should impact the token in some way, whether that is transforming it, capturing some information about it, or routing it. If there is a change in token between steps, then you are most likely dealing with a separate business process.

Process a lead
  • Register a lead
  • Score a lead
  • Update status of a lead
  • Sign a contract with a lead

Process a request
  • Register service request
  • Dispatch a field worker
  • Process payment

Assessment of services
  • Assess Service quality
  • Apply correction

Generation of report
  • Produce report

As you can see, the 'macro' flow deals with four different tokens; hence it can be segmented into multiple processes, one each time there is a change in token.
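
To make the idea a little more concrete, here is a minimal C# sketch of token analysis, assuming each step is labelled with the token it acts on (the labels below are my own illustration, not part of any formal notation): whenever the token changes between consecutive steps, a new process group begins.

// top-level program sketch, .NET 6+ with implicit usings assumed
var steps = new (string Step, string Token)[]
{
    ("Register a lead", "Lead"),
    ("Score a lead", "Lead"),
    ("Update status of a lead", "Lead"),
    ("Sign a contract with a lead", "Lead"),
    ("Register service request", "Service request"),
    ("Dispatch a field worker", "Service request"),
    ("Process payment", "Service request"),
    ("Assess service quality", "Assessment"),
    ("Apply correction", "Assessment"),
    ("Produce report", "Report")
};

var processes = new List<List<string>>();
string? currentToken = null;

foreach (var (step, token) in steps)
{
    // a change in token signals the start of a separate business process
    if (token != currentToken)
    {
        processes.Add(new List<string>());
        currentToken = token;
    }
    processes[^1].Add(step);
}

// processes now holds four groups: lead, service request, assessment, report

Run as a top-level program, this groups the ten steps into the same four processes listed above.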

Takeaway

As you may recall we have four parts to a business process:
  1. Events: Things that happen outside of the process, but have an effect on the process
    • Triggers: an action that starts the process
    • Timers: an interval which starts the process
    • Messages: information which the process receives
    • Error: a fault that impacts the process flow
  2. Outcomes: what does the process accomplish
    • Every process exists to deliver a specific repeatable outcome
    • A process must have at least one outcome if not several
  3. Actions: What are the steps in the process
    • Each action is an activity carried out by an agent; it may be a person, organisation or an automated solution.
    • Actions represent an activity used to produce a final or intermediate result
    • Actions may be standalone steps, or they may represent a sub-process
  4. Participants: Who or what performs the actions within the process
    • The executors as well as any relevant parties of the actions or process:
      • Supervisors
      • Informed party
      • Decision maker
      • Operator
At a macro level, every process should consist of the four key elements above.

Monday 8 January 2024

Docker repo

With our Azurite Docker image running, it's time to configure our DockerRepo. Let's start by adding a Docker profile to our 'launchSettings.json' file:


"Docker": {
"commandName": "Project",
"dotnetRunMessages": true,
"launchBrowser": true,
"launchUrl": "swagger",
"applicationUrl": "http://localhost:5050",
"environmentVariables": {
"ASPNETCORE_ENVIRONMENT": "Development"
}
},

As I've mentioned in the past, your naming conventions are very important; they are what let future you figure out what the hell you did. In this case all you're going to have to remember is to check your 'launchSettings.json' file, and you'll be able to follow the breadcrumb trail of what this profile is meant for.

Before we continue we're going to have to add a NuGet package to our project to work with Azure Blob Storage. Open a terminal at the root of your project and run the following command:

dotnet add package Azure.Storage.Blobs

You should see the following in your terminal:

We need to add one more NuGet package, and that is the 'Azure.Identity' package.
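
Assuming you want to pull it in the same way as before, the command, run from the project root, would be:

dotnet add package Azure.Identity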

Your .csproj should now have the "Azure.Storage.Blobs" package reference added:

<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>net7.0</TargetFramework>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Azure.Storage.Blobs" Version="12.19.1" />
    <PackageReference Include="Swashbuckle.AspNetCore" Version="6.5.0" />
  </ItemGroup>

</Project>

These will let us leverage Azure's prebuilt classes for interacting with our blob storage.

Before we dive into our DockerRepo class, let's open our appsettings file and add our Azurite development connection string:

{
  "flatFileLocation": "DefaultEndpointsProtocol=https;
                       AccountName=devstoreaccount1;
                       AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;
                       BlobEndpoint=https://127.0.0.1:10000/devstoreaccount1;"
}

I broke the string up onto multiple lines for ease of reading; however, you'll have to concatenate it onto one line.

Now we can finally start coding. Next, open up your 'DockerRepo.cs' file; it should look like the following:


using pav.mapi.example.models;

namespace pav.mapi.example.repos
{
    public class DockerRepo : IRepo
    {
        public Task<IPerson[]> GetPeopleAsync()
        {
            throw new NotImplementedException();
        }

        public Task<IPerson> GetPersonAsync(string id)
        {
            throw new NotImplementedException();
        }
    }
}

We configured the class but never implemented the methods, so let's take the opportunity to do so now; we're going to create a constructor that takes in a connection string, and add the logic for our two get functions.

using System.Text.Json;
using Azure.Storage.Blobs;
using pav.mapi.example.models;

namespace pav.mapi.example.repos
{
    public class DockerRepo : IRepo
    {
        private readonly BlobContainerClient _client;

        public DockerRepo(string storageUrl)
        {
            // point the client at the 'flatfiles' container in our Azurite storage account
            this._client = new BlobContainerClient(storageUrl, "flatfiles");
        }

        public async Task<IPerson[]> GetPeopleAsync()
        {
            // download the people.json blob and deserialize it into our Person array
            var peopleBlob = _client.GetBlobClient("people.json");
            var peopleJson = (await peopleBlob.DownloadContentAsync()).Value.Content.ToString();

            if (peopleJson != null)
            {
                var opt = new JsonSerializerOptions
                {
                    PropertyNameCaseInsensitive = true
                };

                var people = JsonSerializer.Deserialize<Person[]>(peopleJson, opt);
                if (people != null)
                    return people;
            }

            throw new Exception("Unable to load people.json from blob storage");
        }

        public async Task<IPerson> GetPersonAsync(string id)
        {
            var ppl = await this.GetPeopleAsync();
            if (ppl != null)
                return ppl.First(p => p.Id == id);
            throw new KeyNotFoundException();
        }
    }
}

Now if we take a look at our Main method:

using pav.mapi.example.models;
using pav.mapi.example.repos;

namespace pav.mapi.example
{
    public class Program
    {
        public static void Main(string[] args)
        {
            var builder = WebApplication.CreateBuilder(args);
            var flatFileLocation = builder.Configuration.GetValue<string>("flatFileLocation");

            if (String.IsNullOrEmpty(flatFileLocation))
                throw new Exception("Flat file location not specified");

            if (builder.Environment.EnvironmentName != "Production")
                switch (builder.Environment.EnvironmentName)
                {
                    case "Local":
                        builder.Services.AddScoped<IRepo, LocalRepo>(x => new LocalRepo(flatFileLocation));
                        goto default;
                    case "Development":
                        builder.Services.AddScoped<IRepo, DockerRepo>(x => new DockerRepo(flatFileLocation));
                        goto default;
                    default:
                        builder.Services.AddEndpointsApiExplorer();
                        builder.Services.AddSwaggerGen();
                        break;
                }

            var app = builder.Build();

            switch (app.Environment.EnvironmentName)
            {
                case "Local":
                case "Development":
                    app.UseSwagger();
                    app.UseSwaggerUI();
                    break;
            }

            app.MapGet("/v1/dataSource", () => flatFileLocation);
            app.MapGet("/v1/people", async (IRepo repo) => await GetPeopleAsync(repo));
            app.MapGet("/v1/person/{personId}", async (IRepo repo, string personId) => await GetPersonAsync(repo, personId));

            app.Run();
        }

        public static async Task<IPerson[]> GetPeopleAsync(IRepo repo)
        {
            return await repo.GetPeopleAsync();
        }

        public static async Task<IPerson> GetPersonAsync(IRepo repo, string personId)
        {
            var people = await GetPeopleAsync(repo);
            return people.First(p => p.Id == personId);
        }
    }
}

We pass our connection string to our DockerRepo service in the form of our flatFileLocation variable, which is populated based on the profile loaded.

Now if we run our application with the 'Local' profile, our static GetPersonAsync and GetPeopleAsync functions will receive the LocalRepo implementation of our IRepo interface, and if we run it with the 'Docker' profile, they will use the DockerRepo implementation.
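
If you'd rather launch from the terminal than from your IDE, dotnet run can select a profile explicitly; this assumes the profiles are named 'Docker' (as above) and 'Local' (from the earlier post) in your launchSettings.json:

dotnet run --launch-profile Docker
dotnet run --launch-profile Local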

Tuesday 2 January 2024

Azurite Powershell

In order to upload files to our Azurite storage, we're going to leverage PowerShell... on a Mac, so please read the Microsoft documentation on installing it, since it may change.


Though it may seem as if we are adding unnecessary complexity to our project, with PowerShell we can automate the process of uploading data to our blob storage. That may not seem very important now; however, in the future when we'll be deploying to the cloud, it will make life much simpler.

In your application create a 'powershell' folder if you haven't already. Let's start with a simple PowerShell script to copy our people.json file from our local hard drive to our Docker container.



if (0 -eq ((Get-Module -ListAvailable -Name Az).count)) {
    Write-Host "installing AZ module." -ForegroundColor Yellow
    Install-Module -Name Az -Repository PSGallery -Force
    Write-Host "AZ module installed." -ForegroundColor Yellow
}
else {
    Write-Host "AZ module already installed." -ForegroundColor Green
}

$cs = "DefaultEndpointsProtocol=https;"
$cs += "AccountName=devstoreaccount1;"
$cs += "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;"
$cs += "BlobEndpoint=https://127.0.0.1:10000/devstoreaccount1;"
$cs += "QueueEndpoint=https://127.0.0.1:10001/devstoreaccount1;"
$cs += "TableEndpoint=https://127.0.0.1:10002/devstoreaccount1;"

$context = New-AzStorageContext -ConnectionString $cs

function UploadFiles {
    Param(
        [Parameter(Mandatory=$true)][String]$containerName,
        [Parameter(Mandatory=$true)][String]$localPath)

    # upload all flat files to local azurite
    Get-ChildItem -Path $localPath -Recurse |
        Set-AzStorageBlobContent `
            -Container $containerName `
            -Context $context
}

function CreateContainer {
    Param(
        [Parameter(Mandatory=$true)][String]$containerName,
        [Parameter(Mandatory=$true)][Int32]$permission)

    $container = $context | Get-AzStorageContainer -Name $containerName -ErrorAction SilentlyContinue
    if ($null -ne $container) {
        Remove-AzStorageContainer -Name $containerName -Context $context
    }

    $container = $context | Get-AzStorageContainer -Name $containerName -ErrorAction SilentlyContinue
    if ($null -eq $container) {
        Write-Host "Creating local $containerName container" -ForegroundColor Magenta
        $context | New-AzStorageContainer -Name $containerName -Permission $permission
    }
    else {
        Write-Host "local $containerName container already exists" -ForegroundColor Yellow
    }
}

CreateContainer -containerName "flatfiles" -permission 0

UploadFiles -containerName "flatfiles" -localPath "/Volumes/dev/data/"

$sasToken = New-AzStorageContainerSASToken -Name "flatfiles" -Permission "rwdalucp" -Context $context

Write-Host "your SAS token is: $sasToken" -ForegroundColor Green
Write-Host "The full connection URL has been copied to your clipboard" -ForegroundColor Green
Write-Output "https://127.0.0.1:10000/devstoreaccount1/flatfiles/people.json?" $sasToken | Set-Clipboard

You should now have the URL to the people.json file, along with a SAS token to let you access it, copied to your clipboard; simply paste the URL into your browser, Postman, or Insomnia and you should be able to get your JSON file.
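
You can also test it from the terminal; here is a quick sketch using curl, where <your-sas-token> is a placeholder for the token the script printed, and -k skips certificate verification in case curl doesn't trust the mkcert local CA:

curl -k "https://127.0.0.1:10000/devstoreaccount1/flatfiles/people.json?<your-sas-token>"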

Next let's create a script that will list all of the blobs in our Azurite blob storage


if (0 -eq ((Get-Module -ListAvailable -Name Az).count)) {
    Write-Host "installing AZ module." -ForegroundColor Yellow
    Install-Module -Name Az -Repository PSGallery -Force
    Write-Host "AZ module installed." -ForegroundColor Yellow
}
else {
    Write-Host "AZ module already installed." -ForegroundColor Green
}

$cs = "DefaultEndpointsProtocol=https;"
$cs += "AccountName=devstoreaccount1;"
$cs += "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;"
$cs += "BlobEndpoint=https://127.0.0.1:10000/devstoreaccount1;"
$cs += "QueueEndpoint=https://127.0.0.1:10001/devstoreaccount1;"
$cs += "TableEndpoint=https://127.0.0.1:10002/devstoreaccount1;"

$context = New-AzStorageContext -ConnectionString $cs
$containerName = "flatfiles"

Get-AzStorageBlob -Container $containerName -Context $context | Select-Object -Property Name

And finally, a PowerShell script to delete files:


Param (
    [Parameter()][String]$blobToDelete,
    [Parameter()][Boolean]$deleteAllFiles)

if (0 -eq ((Get-Module -ListAvailable -Name Az).count)) {
    Write-Host "installing AZ module." -ForegroundColor Yellow
    Install-Module -Name Az -Repository PSGallery -Force
    Write-Host "AZ module installed." -ForegroundColor Yellow
}
else {
    Write-Host "AZ module already installed." -ForegroundColor Green
}

$cs = "DefaultEndpointsProtocol=https;"
$cs += "AccountName=devstoreaccount1;"
$cs += "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;"
$cs += "BlobEndpoint=https://127.0.0.1:10000/devstoreaccount1;"
$cs += "QueueEndpoint=https://127.0.0.1:10001/devstoreaccount1;"
$cs += "TableEndpoint=https://127.0.0.1:10002/devstoreaccount1;"

$context = New-AzStorageContext -ConnectionString $cs
$containerName = "flatfiles"

if ($true -ne ([string]::IsNullOrWhiteSpace($blobToDelete))) {
    Remove-AzStorageBlob -Blob $blobToDelete -Container $containerName -Context $context
}

if ($true -eq $deleteAllFiles) {
    Get-AzStorageBlob -Container $containerName -Context $context | Remove-AzStorageBlob
}


The above, unlike the other two scripts, takes in parameters letting us specify whether we want to delete a specific file or all files. It can be called several ways, but the following two are probably the easiest:

./deleteBlobs.ps1 people.json         
./deleteBlobs.ps1 -deleteAllFiles $true     

And that's it for now. Keep in mind that these scripts are specific to our Azurite environment; when it comes time to deploy to production, we'll cross that bridge when we get to it.

Saturday 30 December 2023

Azurite - docker


In our previous post we configured a local repository which can read flat files from our hard drive; next we're going to set up a Docker image with Azurite.

The Azurite open-source emulator provides a free local environment for testing your Azure Blob, Queue Storage, and Table Storage applications. When you're satisfied with how your application is working locally, switch to using an Azure Storage account in the cloud. The emulator provides cross-platform support on Windows, Linux, and macOS.


Though one can connect to Azurite using standard HTTP, we're going to opt for HTTPS. It adds a bit of complexity; however, it's worth the investment, since later on it will be simpler to create a production version of an AzureRepo.

We're going to have to install mkcert.
mkcert is a simple tool for making locally-trusted development certificates. It requires no configuration.
GitHub - FiloSottile/mkcert: A simple zero-config tool to make locally trusted development certificates with any names you'd like.

I always suggest you go to the official documentation, because it may very well change; to be honest, whenever I read my own blog posts, I usually go back to the source material. That said, at the time of this article you can install mkcert on a Mac using Homebrew:

brew install mkcert
brew install nss # if you use Firefox

Next, with mkcert installed, you'll have to install a local CA:

mkcert -install


You'll have to provide your password to install the local CA, but once it's set up you'll be able to easily make as many locally trusted development certificates as you please.

Now that you have the mkcert utility installed, create a folder in your project called _azuriteData. Realistically this folder can have any name you like; however, it's nice when folders and variables describe what they are used for; it makes future you not want to kill present you. Once you have your folder created, navigate into it and enter the following command:
 
mkcert 127.0.0.1


This will create two files inside your _azuriteData folder (or whatever directory you're currently in): a key.pem file and a .pem file. I'm not going to get into the deep details of what these two files are, partially because that's not the aim of this post and partially because I only have a passing, high-level awareness when it comes to security, with little practical experience; you know, like most consultants.

However, I can tell you that PEM stands for Privacy Enhanced Mail, though the format is widely used for various cryptographic purposes beyond email security. You can think of the two files as the certificate with its public key (127.0.0.1.pem) and the corresponding private key (127.0.0.1-key.pem) used for asymmetric (public-key) encryption.

In essence what happens is that the message is encrypted with the public key (127.0.0.1.pem), and then decrypted with the private one (127.0.0.1-key.pem); however you really don't need to worry too much about the details.
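
If you're curious, you can peek at the certificate from the terminal; a small sketch assuming openssl is available (it ships with macOS):

openssl x509 -in 127.0.0.1.pem -noout -subject -issuer -dates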

Your folder structure should look something like this:


Next we are going to create a Docker Compose file. For this to work you need to have Docker installed on your system; you can download the installer from https://www.docker.com/.

At the root of your project create a docker-compose.yaml file; this file will contain all the instructions you need to download and run the Azurite storage emulator locally for testing purposes.


version: '3.9'
services:
  azurite:
    image: mcr.microsoft.com/azure-storage/azurite
    container_name: "azurite-data"
    hostname: azurite
    restart: always
    command: "azurite \
      --oauth basic \
      --cert /workspace/127.0.0.1.pem \
      --key /workspace/127.0.0.1-key.pem \
      --location /workspace \
      --debug /workspace/debug.log \
      --blobHost 0.0.0.0 \
      --queueHost 0.0.0.0 \
      --tableHost 0.0.0.0 \
      --loose"
    ports:
      - "10000:10000"
      - "10001:10001"
      - "10002:10002"
    volumes:
      - ./_azuriteData:/workspace


One of the trickiest things about this compose file is that whatever local folder contains our public/private PEM files must be mapped to the '/workspace' directory of our Docker container. The volumes entry basically tells Docker to mount the _azuriteData folder as the /workspace volume for this container.

Now if we go to our terminal and enter the command

docker-compose up

The first thing Docker will do is download the Azurite image; once the download is complete, the container should be up and running, that is of course if you have Docker installed and running on your local machine.


You should see something like the above. Now if you navigate to your Docker dashboard, you should see your environment up and running.
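
If you prefer the terminal over the dashboard, a quick check works just as well; azurite-data is the container_name we set in the compose file:

docker ps                # the azurite-data container should be listed
docker logs azurite-data # shows the emulator's startup output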

Some caveats: you may get an error along the lines of

open /Users/[user]/.docker/buildx/current: permission denied

In that case run the following command, or commands:

sudo chown -R $(whoami) ~/.docker
sudo chown -R $(whoami) ./_azuriteData

The above sets the current user as the owner of the Docker application folder as well as of the folder which you map into your Azurite container; this should fix your Docker permission issues.