Mount Azure File Storage as Volume on Docker container in Azure Virtual Machine

Written by: Chirag (Srce Cde)

Containers are one of the preferred ways to deploy applications: a container bundles everything the application needs to run. However, data inside a container does not persist once the container is killed or removed. If data needs to persist, bind mounts or volumes can help. With bind mounts, a file or directory on the host machine is mounted into a container, but with limited functionality.

Azure File Storage Mount

Now consider running the container in the cloud, where the container needs to read/write data from one of the cloud provider’s storage services (e.g. Azure File Storage); in that case, Docker volumes are the way to go. You can read more about storage types in Docker here. Let’s start with the hands-on setup.

As part of this article, we will read a CSV file uploaded to Azure File Storage, print the data, make some modifications, and write it back to Azure File Storage via a Docker container for demo purposes.


Create Azure File Share storage account

Go to Azure Portal → search for Storage Accounts → Create the Azure File share storage account as below.

Create Azure File Storage

After creating the storage account, go to that resource → click File shares under Data storage → click File share, fill in the required details, and click Create.

Create File Share

Open the file share that you created and create two directories (i.e. input & output). Upload the CSV file that you want to transform into the input directory. I have uploaded a CSV file which we will access within the container for demo purposes.

Create directory

Create & setup Virtual Machine

Search for Virtual Machines on the Azure Portal and create an Azure Virtual Machine.

Create Virtual Machine

Next, connect to the VM via Visual Studio Code, since we are going to write some code and Docker config files. You can also SSH into the instance directly for the same.

After a successful remote connection, execute the below commands.

  • sudo apt-get update
  • sudo apt-get install docker-compose

Now it is time to write some code. Create a directory on your VM and copy the code from here.

The folder structure looks like this:

Directory structure

Within the service directory, we have an app directory (afs_app) containing main.py, which gets executed when the Docker container is up & running and holds the logic to read, modify & write the file from AFS.

The requirements.txt under service lists the Python packages to be installed; they are installed when the image is built.
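The source for main.py is linked rather than printed here, so the following is only a minimal sketch of what such a script could look like: it reads the CSV from the mounted data directory, applies a simple modification (adding a row-number column, chosen purely for illustration), and writes the result to the output directory with a unique name. The paths and the `process` helper are assumptions, not the author's exact code.

```python
# Sketch of an AFS-backed CSV transform (illustrative, not the original main.py).
import csv
import os
import uuid

# The compose file mounts the Azure File Share here.
DATA_DIR = os.environ.get("DATA_DIR", "/project/service/afs_app/data")

def process(input_path: str, output_dir: str) -> str:
    """Read a CSV, add a row_id column, and write it back with a unique name."""
    with open(input_path, newline="") as f:
        rows = list(csv.reader(f))
    header, body = rows[0], rows[1:]

    # Example modification: append a row-number column.
    header.append("row_id")
    for i, row in enumerate(body, start=1):
        row.append(str(i))

    out_path = os.path.join(output_dir, f"{uuid.uuid4()}_processed.csv")
    with open(out_path, "w", newline="") as f:
        csv.writer(f).writerows([header] + body)
    return out_path

if __name__ == "__main__":
    src = os.path.join(DATA_DIR, "input")
    dst = os.path.join(DATA_DIR, "output")
    for name in os.listdir(src):
        if name.endswith(".csv"):
            print("processed:", process(os.path.join(src, name), dst))
```

Because the file share is mounted as a volume, anything written to the output directory appears directly in Azure File Storage.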

The Dockerfile contains all the definitions/commands needed to build the image.
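The Dockerfile itself is in the linked repository; a minimal sketch could look like the following, where the base image and paths are assumptions matching the mount path used in docker-compose.yml:

```dockerfile
# Illustrative Dockerfile sketch (base image and layout are assumptions)
FROM python:3.9-slim

WORKDIR /project/service

# Install dependencies first so this layer is cached between code changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Run the app when the container starts
CMD ["python", "afs_app/main.py"]
```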

docker-compose.yml is used to configure the application service. It has a standard configuration except for the volumes part, where the Azure File Share is mounted as a volume on the data directory (under service) within the container via the Common Internet File System (CIFS).

version: '3'

services:
  afs_service:
    container_name: afs_service
    build:
      context: ./
      dockerfile: Dockerfile
    restart: always
    volumes:
      - AFSMount:/project/service/afs_app/data

volumes:
  AFSMount:
    driver: local
    driver_opts:
      type: cifs
      o: "mfsymlinks,vers=3.0,username=${AFS_NAME},password=${AFS_KEY},addr=${AFS_NAME}"
      device: "//${AFS_NAME}${AFS_CONTAINER}"

Once you have the code in place, create a .env file in the same directory as the docker-compose file and configure the environment variables below.


For AFS_KEY, go to the storage account and click Access keys under Security + networking. Reveal & copy the key and paste it as the value of AFS_KEY.
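The variable names below are taken from the docker-compose file; the placeholder values are illustrative and must be replaced with your own account details:

```
AFS_NAME=<storage-account-name>
AFS_KEY=<storage-account-access-key>
AFS_CONTAINER=<path-to-your-file-share>
```

Keep the .env file out of version control, since AFS_KEY grants full access to the storage account.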

Azure File Storage Key

After the environment variables are configured, execute the below command to build the image and run the container.

sudo docker-compose up --build

Once the container is up, the application reads the CSV file and uploads the modified file back to Azure File Storage, in the output folder, as unique_id_processed.csv.

azure file storage

I hope this is helpful and can cater to a number of use cases.

Thank you for reading!