
Spyrosoft has built a microservice platform on Microsoft Azure

At Spyrosoft, we always use the latest technologies to help our customers build excellent products. This time, we’ve employed Microsoft Azure services to upgrade a set of mobile and web applications that use geospatial data.

According to Gartner's report 'Forecast: Public Cloud Services, Worldwide, 2018–2024, 2Q20 Update', worldwide spending on public cloud services is forecast to reach USD 306.9 billion in 2021 and USD 354.1 billion in 2022. This is hardly surprising if we look at the advantages of these platforms.

One of them is reliability: physical data centers are located worldwide, which means that data is dispersed among them and remains secure at all times, even if something unpredictable occurs. Another asset is how easy it is to scale cloud solutions, since virtual machines can be set up or shut down whenever necessary to match your project's current needs. It is also worth mentioning that cloud platforms such as Microsoft Azure are environmentally sustainable: Microsoft has declared that its data centers will be powered by 100% renewable energy by 2025 and be zero-waste by 2030, and it has introduced a number of programs to achieve these ambitious goals.

The developer teams at Spyrosoft have been using Microsoft Azure services for years, employing them in key projects that we’ve completed for our customers. The most frequently used services include:

  • Azure Active Directory B2C
  • Azure Kubernetes Service
  • Azure PostgreSQL
  • Azure Content Delivery Network
  • Azure Cognitive Services
  • Azure Service Bus
  • Azure Functions
  • Azure DevOps
  • Azure Cognitive Search
  • Azure Traffic Manager

You will find a case study for one of these key projects below. It shows how versatile Microsoft Azure products are and how they can be applied to almost any type of task.

The challenge

One of our customers is a British company that produces paper and digital maps. In their product portfolio, there is a set of web and mobile applications that users can employ for exploring their surroundings, either locally or further afield, as well as for ordering personalized maps and other products.

Over two years ago, the customer came to us with a project that included upgrading the existing web and mobile applications, creating several new ones, and moving the infrastructure to the Azure cloud. They also needed a new platform that could host the API and data for all the applications in a unified way. At that point, all of the products and infrastructure were hosted and maintained on a public cloud from another provider, and the platform comprised a few services written in Java and several more in PHP and JavaScript.

We decided to adapt the existing components, or rewrite them from scratch, in line with the Cloud Native approach, which would shorten the delivery time for new functionality and allow us to use the latest technologies.

The solution

To make the upgrade possible and allow new components to be introduced, we had to migrate all data and user profiles from the existing solution hosted with the previous provider. This step was crucial to increase the security of user data and to implement authentication and authorization using OAuth 2.0. The ecosystem at the time held data for over 1.5 million users in the UK. The most important thing for us was to transfer this information as securely as possible while avoiding any changes to the legacy applications that were using the system. 'We decided to use a one-at-a-time strategy,' says Łukasz Macuga, Tech Lead/Architect at Spyrosoft.


The migration was completed using Azure Active Directory B2C, and the strategy itself moved each user to the new public cloud at the moment they logged in. This technique was more time-consuming, but it made the change as transparent as possible for users.
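The migrate-on-login idea can be sketched as follows. This is a minimal illustration of the pattern, not the project's actual code: `LegacyStore` and `B2CStore` are hypothetical stand-ins for the old identity system and Azure AD B2C, and the real flow would go through B2C custom policies rather than plain Python classes.

```python
class LegacyStore:
    """Stand-in for the old identity provider (read-only during migration)."""
    def __init__(self, users):
        self._users = users  # {email: password}

    def check_credentials(self, email, password):
        return self._users.get(email) == password


class B2CStore:
    """Stand-in for the new identity provider (Azure AD B2C in the real system)."""
    def __init__(self):
        self._users = {}

    def exists(self, email):
        return email in self._users

    def create_user(self, email, password):
        self._users[email] = password

    def check_credentials(self, email, password):
        return self._users.get(email) == password


def sign_in(email, password, legacy, b2c):
    """Migrate each user transparently on their first successful login."""
    if b2c.exists(email):
        # Already migrated: the new store handles the login alone.
        return b2c.check_credentials(email, password)
    # Not migrated yet: validate against the legacy store, then copy the
    # account across using the password the user has just proved they know.
    if legacy.check_credentials(email, password):
        b2c.create_user(email, password)
        return True
    return False
```

The first login validates against the legacy system and silently creates the new account; every later login is served by the new store alone, so from the user's point of view nothing changes.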

The main application cluster was set up on Azure Kubernetes Service, where our developers configured four environments. The cluster currently hosts more than 30 services.

We use an Azure Storage Account for storing data, with Azure Blob Storage employed for data-import artifacts as well as users' audio, video, and image files. We also use the Azure Database for PostgreSQL service, Postgres as a service, for operational data, as it provides excellent capabilities for processing geospatial data. Static data served by the platform is delivered through Azure Content Delivery Network, which means lower latency and, in the future, the ability to extend data access to users worldwide.
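To illustrate why Postgres suits geospatial workloads, here is a hypothetical sketch of the kind of query the PostGIS extension makes possible; the `places` table, its columns, and the helper function are invented for illustration and are not taken from the project.

```python
def nearby_places_query(radius_m):
    """Build a parameterized SQL query finding points of interest within
    radius_m metres of a location.

    ST_DWithin on a geography value measures distance in metres and can
    use a spatial index, which is what makes Postgres/PostGIS a good fit
    for "what is around me" features.
    """
    return (
        "SELECT id, name "
        "FROM places "
        "WHERE ST_DWithin("
        "geom, ST_MakePoint(%(lon)s, %(lat)s)::geography, "
        f"{int(radius_m)})"
    )
```

The `%(lon)s` / `%(lat)s` placeholders would be bound by the database driver at execution time, keeping the coordinates out of the SQL string itself.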

One of the most interesting services we use is Azure Cognitive Services, which we employ to validate user-submitted content so that any text or images violating the platform's policy standards can be detected and removed early.
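The moderation gate works roughly like this. In this sketch a trivial keyword check stands in for the real Cognitive Services call, purely so the control flow is runnable; the function names and blocked-term list are invented for illustration.

```python
# Stand-in for the real moderation service; the actual platform would
# call Azure Cognitive Services here instead of matching keywords.
BLOCKED_TERMS = {"spam", "offensive"}

def moderate_text(text):
    """Return (allowed, reasons) for a piece of user-submitted text."""
    hits = sorted(t for t in BLOCKED_TERMS if t in text.lower())
    return (not hits, hits)

def publish(text, store):
    """Only content that passes moderation reaches the published store."""
    allowed, reasons = moderate_text(text)
    if allowed:
        store.append(text)
    return allowed, reasons
```

The key design point is that moderation sits in front of publication, so policy-violating content is rejected before it is ever visible, rather than cleaned up after the fact.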

As for the communication/message bus, we use Azure Service Bus. It lets us process data asynchronously and organizes communication between the microservices on which the whole system architecture is based.
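The decoupling a message bus provides can be shown with a small runnable sketch. Azure Service Bus offers durable queues and topics; here a thread-safe in-memory queue stands in so the pattern can run locally, and the message names are invented for illustration.

```python
import queue
import threading

def producer(bus, orders):
    """A 'sending' microservice: publish messages and return immediately."""
    for order in orders:
        bus.put(order)
    bus.put(None)  # sentinel meaning "no more messages"

def consumer(bus, processed):
    """A 'receiving' microservice: process messages at its own pace."""
    while True:
        msg = bus.get()  # blocks until a message arrives
        if msg is None:
            break
        processed.append(f"processed:{msg}")

bus = queue.Queue()
processed = []
worker = threading.Thread(target=consumer, args=(bus, processed))
worker.start()
producer(bus, ["order-1", "order-2"])
worker.join()
```

Because the producer never waits for the consumer, either side can be scaled, restarted, or slowed down independently, which is exactly what the bus buys a microservice architecture.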

For the services (APIs) that face unpredictable traffic spikes, we implemented Azure Functions, which scale dynamically and on demand.

Łukasz Macuga again: ‘Contrary to popular opinion, Azure services provide high-quality software development kits for Java that are developed in a transparent way. We did not have any problems using Java on Azure platforms.’

The results

After migrating the platform to Azure Kubernetes Service, our developers have been able to make it more efficient and reliable and have reduced downtime.

Another process that has been sped up is managing functionalities that users engage with daily:

  • The time needed to log in went from over 10 seconds to below 1 second.
  • The time needed to sign up went from 12 seconds to below 1 second.

Thanks to the excellent compatibility of the Azure APIs and Azure DevOps with open-source tools, we were able to configure the infrastructure using the Infrastructure as Code approach with Terraform and to manage the application cluster using the GitOps technique. We also introduced complete automation wherever we could, which means we can now deliver new functionality quickly and easily.
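As a flavour of the Infrastructure as Code approach, an AKS cluster like the one described above could be declared with the Terraform `azurerm` provider roughly as follows. The resource names, region, node count, and VM size here are illustrative assumptions, not the project's actual configuration.

```hcl
resource "azurerm_resource_group" "platform" {
  name     = "rg-platform"
  location = "uksouth"
}

resource "azurerm_kubernetes_cluster" "platform" {
  name                = "aks-platform"
  location            = azurerm_resource_group.platform.location
  resource_group_name = azurerm_resource_group.platform.name
  dns_prefix          = "platform"

  default_node_pool {
    name       = "default"
    node_count = 3
    vm_size    = "Standard_D2s_v3"
  }

  identity {
    type = "SystemAssigned"
  }
}
```

Because the whole environment is described in files like this, every change is reviewed and versioned in Git, which is what makes the GitOps workflow mentioned above possible.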

Next steps

One of the next steps in the development of the platform is expanding the offering to countries beyond the UK. To that end, our engineers have implemented a Global Location Search service using Azure Cognitive Search. By the end of 2020, we plan to deploy the infrastructure and applications in Australia, with the US and Canada to follow in 2021.

Thanks to the Azure Traffic Manager service, we can deliver and operate high-performance applications that will, in the end, be available globally. Azure Traffic Manager will be used to route web traffic, supporting our failover strategy and absorbing sudden spikes in traffic that could otherwise threaten the app's stability and reduce its availability for users.
