Introduction
In my previous article, I walked through how to host a public website using Azure Blob Storage. While public access is useful for static websites and publicly available content, not every file stored in the cloud should be accessible to everyone.
In many real-world environments, organizations need a secure way to store internal documents, sensitive files, backups, and private company data while still maintaining scalability and availability in the cloud.
In this article, we will build on the same Azure environment from the previous lab by reusing the existing Resource Group and Storage Account setup. If you did not follow the previous article, you can access it here:
For this lab, we will configure private Azure Blob Storage, restrict anonymous access, generate secure Shared Access Signature (SAS) access, and implement lifecycle management and replication policies for better storage optimization and resiliency.
By the end of this guide, you will understand how to securely manage private documents in Azure Blob Storage using industry-standard cloud practices.
Scenario
A company requires secure cloud storage for internal office and departmental documents. Since the data contains private organizational information, it must not be publicly accessible without authorization.
The storage solution must also provide high availability in the event of a regional outage while supporting backup and replication for the company’s public website storage.
In this lab, we will configure Azure Blob Storage to meet these requirements using private containers, Shared Access Signatures (SAS), lifecycle management, and object replication.
Create a storage account and configure high availability.
Create a storage account for the internal private company documents.
- In the portal, search for and select Storage accounts.
- Select + Create.
- Select the Resource group created in the previous lab.
- Set the Storage account name to private and add an identifier to ensure the name is unique (I added eazi).
- Select Review, and then select Create to deploy the storage account.
- Wait for the storage account to deploy, and then select Go to resource.
This storage requires high availability if there’s a regional outage. Read access in the secondary region is not required. Configure the appropriate level of redundancy.
- In the storage account, in the Data management section, select the Redundancy blade.
- Ensure Geo-redundant storage (GRS) is selected.
- Refresh the page.
- Review the primary and secondary location information.
- Save your changes.
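For repeatability, the portal steps above can also be captured as the request body sent to Azure Resource Manager. This is a minimal sketch, not a definitive template; the region and the example account name privateeazi are assumptions from this lab, and `Standard_GRS` matches the requirement of geo-redundancy without read access in the secondary region:

```python
import json

# Sketch of an ARM-style request body for the storage account above.
# "eastus" and the GRS SKU are assumptions matching this lab's requirements.
storage_account = {
    "location": "eastus",                  # assumption: pick any supported region
    "kind": "StorageV2",
    "sku": {"name": "Standard_GRS"},       # geo-redundant, no secondary read access
    "properties": {
        "allowBlobPublicAccess": False     # block anonymous access account-wide
    },
}

payload = json.dumps(storage_account, indent=2)
print(payload)
```

If the partner later needed to read from the secondary region during an outage, the SKU would change to `Standard_RAGRS`; the lab's requirements do not call for that.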
Create a storage container, upload a file, and restrict access to the file.
Create a private storage container for the corporate data.
- In the storage account, in the Data storage section, select the Containers blade.
- Select + Container.
- Ensure the Name of the container is private.
- Ensure the Public access level is Private (no anonymous access).
- If you have time, review the Advanced settings, but keep the defaults.
- Select Create.
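The container created above maps to a small resource definition as well. A sketch, assuming the ARM blob-container schema where anonymous access levels are `None`, `Blob`, or `Container`:

```python
import json

# Sketch of the resource body for the "private" container.
# publicAccess "None" is what the portal's "Private (no anonymous access)"
# option corresponds to.
container = {
    "name": "private",
    "properties": {"publicAccess": "None"},
}

print(json.dumps(container, indent=2))
```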
For testing, upload a file to the private container. The type of file doesn’t matter; a small image or text file is a good choice. Then test to ensure the file isn’t publicly accessible.
- Select the container.
- Select Upload.
- Browse to files and select a file.
- Upload the file.
- Select the uploaded file.
- On the Overview tab, copy the URL.
- Paste the URL into a new browser tab.
- Verify the file doesn’t display and you receive an error.
An external partner requires read access to the file for the next 24 hours. Configure and test a shared access signature (SAS).
- Select your uploaded blob file and open the Generate SAS tab.
- In the Permissions drop-down, ensure the partner has only Read permissions.
- Verify the Start and expiry date/time covers the next 24 hours.
- Select Generate SAS token and URL.
- Copy the Blob SAS URL to a new browser tab.
- Verify you can access the file. If you uploaded an image file, it will display in the browser; other file types will be downloaded.
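It helps to know what the generated Blob SAS URL actually contains. The URL below is a hypothetical example in the shape the portal produces (the account name, file name, and sig value are placeholders, not real values); the query parameters carry the permissions (`sp`), start (`st`), and expiry (`se`) you configured:

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical Blob SAS URL; "PLACEHOLDER" stands in for the real signature.
sas_url = (
    "https://privateeazi.blob.core.windows.net/private/report.txt"
    "?sv=2022-11-02&st=2024-01-01T00%3A00%3A00Z&se=2024-01-02T00%3A00%3A00Z"
    "&sr=b&sp=r&sig=PLACEHOLDER"
)

query = parse_qs(urlsplit(sas_url).query)
print(query["sp"])   # permissions: "r" means read only
print(query["se"])   # expiry, 24 hours after the start time here
```

Because the permissions and expiry are signed into the token, the partner cannot extend access or escalate from read to write by editing the URL.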
Configure storage access tiers and content replication.
To save on costs, move blobs from the hot tier to the cool tier after 30 days.
- Return to the storage account.
- In the Overview section, notice the Default access tier is set to Hot.
- In the Data management section, select the Lifecycle management blade.
- Select Add rule.
- Set the Rule name to movetocool.
- Set the Rule scope to Apply rule to all blobs in the storage account.
- Select Next.
- Ensure Last modified is selected.
- Set More than (days ago) to 30.
- In the Then drop-down select Move to cool storage.
- If you have time, review the other lifecycle options in the drop-down.
- Add the rule.
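Behind the portal wizard, a lifecycle rule is stored as a JSON policy. A sketch of the policy the movetocool rule above produces, assuming the Microsoft.Storage management-policy schema:

```python
import json

# Sketch of the lifecycle management policy: move block blobs to the
# cool tier 30 days after they were last modified.
policy = {
    "rules": [
        {
            "enabled": True,
            "name": "movetocool",
            "type": "Lifecycle",
            "definition": {
                "filters": {"blobTypes": ["blockBlob"]},
                "actions": {
                    "baseBlob": {
                        "tierToCool": {"daysAfterModificationGreaterThan": 30}
                    }
                },
            },
        }
    ]
}

print(json.dumps(policy, indent=2))
```

Additional actions such as moving to the archive tier or deleting old blobs slot into the same `actions.baseBlob` object, which is what the other drop-down options correspond to.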
The public website files need to be backed up to another storage account.
- In your storage account, create a new container called backup. Use the default values. Refer back to Lab 02a if you need detailed instructions.
- Navigate to your publicwebsite storage account. This storage account was created in the previous exercise.
- In the Data management section, select the Object replication blade.
- Select Create replication rules.
- Set the Destination storage account to the private storage account.
- Set the Source container to public and the Destination container to backup.
- Create the replication rule.
- Optionally, upload a file to the public container. Then return to the private storage account and refresh the backup container; within a few minutes your public website file will appear in the backup container.
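The replication rule you just created can also be expressed as a policy document. A minimal sketch, assuming the object-replication policy shape with source and destination accounts and per-container rules; the account names here are example values from this lab:

```python
import json

# Sketch of an object replication policy: copy blobs from the "public"
# container on the website account to the "backup" container on the
# private account. Account names are assumptions from this lab.
replication_policy = {
    "properties": {
        "sourceAccount": "publicwebsiteeazi",   # assumption: your source account name
        "destinationAccount": "privateeazi",    # assumption: your destination account name
        "rules": [
            {"sourceContainer": "public", "destinationContainer": "backup"}
        ],
    }
}

print(json.dumps(replication_policy, indent=2))
```

Replication is asynchronous, which is why the optional test above can take a few minutes before the file shows up in the backup container.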
Conclusion
Congratulations on completing the lab.
In this exercise, we explored how to secure internal company documents using Azure Blob Storage by implementing private access controls and secure sharing mechanisms. We also looked at additional storage management features such as lifecycle policies and object replication to improve both cost optimization and data resiliency.
Some key takeaways from this lab include:
- Azure Blob Storage can securely store private organizational data.
- Containers can be configured to block anonymous public access.
- Shared Access Signatures (SAS) provide controlled and temporary access to files.
- Lifecycle management helps automate storage tier optimization.
- Object replication improves backup and disaster recovery capabilities.
This lab also demonstrates the importance of designing cloud storage solutions based on business requirements, balancing security, accessibility, scalability, and cost efficiency.
If you followed my previous article on hosting a public website using Azure Blob Storage, you have now seen both sides of Azure Storage: public access for web hosting and private access for secure internal storage.
See you in the next article.