Publishing RobotFramework reports in Azure Blob Storage
In this article I'm describing new functionality added to rf-service - publishing Robot Framework reports in Azure Blob Storage.
As part of the long-term goal of making rf-service cover everything between writing tests and viewing test results, I decided to work a little more on where the test reports could be published. Using a managed service for that seems to be a perfect choice, especially if you can handle it in a ‘send and forget’ manner. At my work we use Azure extensively, so Azure Blob Storage with its static website hosting is what I decided to use. In this article you will find a very basic introduction to static website hosting on Azure and how you can send Robot Framework reports there programmatically.
This article is part of a series connected with testing on Kubernetes. You can find more info in the following articles:
Robot Framework library for testing Kubernetes - in this part I’m describing a Robot Framework library (Python) that uses the Kubernetes client to get info about your cluster and turn it into actual test suites.
Testing on kubernetes - rf-service - this article describes a Python service executed in the form of a CronJob that actually runs the tests from KubeLibrary on a Kubernetes cluster.
Azure Blob Storage is a great way of keeping files in the cloud. It’s a simplified abstraction of a file system with an API that allows you to configure things like retention, access control, redundancy, etc. It is part of a more general concept - the storage account - which can be used for services like queues, tables or even big data scenarios like Data Lake Storage.
Creating blob storage and static website
Creating blob storage is pretty easy. As a prerequisite, you need to have a resource group in place as a container for your storage account. Depending on the requirements, you can select the performance tier, account type and replication level. I will go with the defaults.
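If you prefer to script this step instead of clicking through the portal, the sketch below shows roughly how the resource group and storage account could be created with the Azure Python management SDKs (azure-mgmt-resource and azure-mgmt-storage). The subscription id, resource group name, account name and location are placeholder values, not anything used by rf-service.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.storage import StorageManagementClient

subscription_id = "<your-subscription-id>"
credential = DefaultAzureCredential()

# Resource group acts as a container for the storage account
resource_client = ResourceManagementClient(credential, subscription_id)
resource_client.resource_groups.create_or_update(
    "rf-reports-rg", {"location": "westeurope"}
)

# Storage account with default-ish settings: general purpose v2,
# locally redundant storage
storage_client = StorageManagementClient(credential, subscription_id)
storage_client.storage_accounts.begin_create(
    "rf-reports-rg",
    "rfreportsstorage",
    {
        "location": "westeurope",
        "kind": "StorageV2",
        "sku": {"name": "Standard_LRS"},
    },
).result()
```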
After the storage account is ready to be used, just go to Static website and enable the feature. This is exactly what makes Robot Framework HTML reports visible as a website. You don’t need to set an index or error document in this case - we won’t really be serving any start page or anything like that, just the reports. It is good to take note of the primary endpoint; this is the URL under which your content will be served.
Enabling the static website will create the $web container, which is effectively the root of everything you want to publish.
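The same feature can also be turned on from Python instead of the portal. Below is a minimal sketch assuming you have the storage account connection string at hand; as mentioned above, index and error documents can be skipped.

```python
from azure.storage.blob import BlobServiceClient, StaticWebsite

connection_string = "<storage-account-connection-string>"
blob_service_client = BlobServiceClient.from_connection_string(connection_string)

# Enabling static website hosting is what creates the $web container;
# index/error documents are left unset since we only serve the reports
blob_service_client.set_service_properties(
    static_website=StaticWebsite(enabled=True)
)
```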
Now the important part: by default, static websites are public - even though blob storage is not publicly accessible by default, the $web container is handled differently (it is supposed to be serving a website). You can limit access to it in different ways; in a corporate environment you would probably have a private network connecting your work environment with Azure, a network of internal/public IPs, etc. There is a quick and easy way to limit access to just your PC though: go back to the storage account level, go to Networking, allow access from selected networks and use the button to add your client IP address. This will configure the firewall to let in only your IP (remember that this IP may change depending on how exactly you are connecting to the internet).
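If you want to apply the same firewall rule programmatically, a rough sketch using the management SDK could look like the one below. The resource group, account name and IP address are placeholders, and the exact rules you need will depend on how you access the storage account.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    IPRule,
    NetworkRuleSet,
    StorageAccountUpdateParameters,
)

storage_client = StorageManagementClient(
    DefaultAzureCredential(), "<your-subscription-id>"
)

# Deny traffic by default and allow only a single client IP,
# the same change the "add client IP address" button makes in the portal
storage_client.storage_accounts.update(
    "rf-reports-rg",
    "rfreportsstorage",
    StorageAccountUpdateParameters(
        network_rule_set=NetworkRuleSet(
            default_action="Deny",
            ip_rules=[IPRule(ip_address_or_range="203.0.113.10", action="Allow")],
        )
    ),
)
```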
It is worth going through the Python code responsible for sending the reports, in case you want to use it outside of rf-service.
Going from top to bottom: first of all, I’m writing the Robot Framework report to a file which will then be uploaded to Azure Blob Storage. Then there is the part containing the initialization of the blob service client and setting the target blob path and name. Remember to set the content settings to ‘text/html’ to make sure that the report is rendered as a static page. The last step is just sending the content and viewing the URL of the published report.
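The snippet below is a simplified sketch of that flow, not the exact rf-service code: it assumes the report has already been written to output/report.html and that the storage connection string is available in the AZURE_STORAGE_CONNECTION_STRING environment variable.

```python
import os
from datetime import datetime

from azure.storage.blob import BlobServiceClient, ContentSettings

# Report produced by the Robot Framework run
report_path = "output/report.html"

# Initialize the blob service client and point at the $web container
blob_service_client = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
blob_name = f"reports/{datetime.utcnow().strftime('%Y%m%d%H%M%S')}/report.html"
blob_client = blob_service_client.get_blob_client(container="$web", blob=blob_name)

# content_type='text/html' makes the report render as a page
# instead of being downloaded as a file
with open(report_path, "rb") as report:
    blob_client.upload_blob(
        report,
        overwrite=True,
        content_settings=ContentSettings(content_type="text/html"),
    )

# URL of the uploaded blob
print(blob_client.url)
```

Note that blob_client.url points at the blob endpoint; the same report is also served from the static website primary endpoint mentioned earlier, under the same path without the $web prefix.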
Azure Blob Storage is an interesting option for keeping Robot Framework reports. There are a couple of functionalities, like retention, tagging and access control, that can be configured for your reports and were not described in this article, but even without them, simple publishing in a managed service is very convenient. Microsoft makes sure that the Blob Storage API is accessible from different programming languages like Python, JavaScript, Ruby and others, so it can become the storage for different kinds of automation.