
A COMPREHENSIVE BUSINESS INTELLIGENCE ANALYTICS SOLUTION FOR A GLOBAL E-COMMERCE COMPANY

ABOUT
the project

Client:

global e-commerce platform

Location:


US

Company Size:

50+ Employees

Industry:

E-commerce

Solution:

About Project

The solution is a data integration workflow based on Azure Data Factory (ADF) with Azure Synapse as a data warehouse. It collects raw data on the customer’s sales and business operations from different sources, processes and aggregates it, and delivers the processed data to the reporting layer, providing the customer with Power BI reports for sales activity analytics and forecasting.

Customer

Our client is a U.S.-based e-commerce platform that sells a diverse range of products in locations across the world. The customer ran thousands of operations and had to process terabytes of sales-related data on a daily basis. With the growing data loads, they needed a solution that would optimize their data integration processes. The client also wanted to unlock the power of the most up-to-date business intelligence (BI) technologies.

Custom BI Solution for Ecommerce

Business Challenge

The customer was dealing with growing volumes of data and required a cost-efficient data integration solution that would help them put their data to work and leverage BI reports. The customer needed the solution to be efficient, secure, and based on the latest cloud technologies. They also needed a team with solid experience in BI technologies to create analytical and forecasting reports based on their sales data.

Why Leobit

Leobit attracted the customer’s attention with our clear vision of the data integration workflow architecture and strong cloud expertise. We also showed readiness to dive into their processes and create BI reports that bring maximum value to the customer’s team. Our specialists decided to use a Power BI integration on the reporting layer, which would fully meet the customer’s requirements. We also committed to covering all phases of the software development lifecycle (SDLC), including operational support for their system.

BI Architecture

Project
in detail

The project can be roughly divided into three stages that involve requirements gathering and planning, active development, and post-development activities, such as testing, deployment, and operational support.

The first stage of the project involved in-depth research of client requirements. We reviewed their existing data sources (Google Docs, MS SQL, BigQuery), along with their data stored in different formats, and gathered all the client’s functional and non-functional requirements. Our team proceeded with project planning, during which we defined the solution architecture, security measures, and particular ETL (Extract, Transform, Load) configurations. Our team decided to build the major part of the system on the Microsoft Azure tech stack, since Azure’s pricing plans were especially attractive to the customer and Azure’s tools would provide functionality covering all the customer’s needs. Upon planning the main features of the flow that we needed to develop, we provided the customer with estimates. Our team also prepared a statement of work (SOW) document and populated the project backlog with corresponding tasks. This stage also involved the planning of Scrum ceremonies and the definition of test plans and strategies.

During the next stage, we prepared the infrastructure and developed the solution in several iterations. This phase involved ETL configuration and the configuration of reports on the reporting layer. In particular, we created user console (UC) reports for sales activity and analytics. We also used Power BI Report Server to prepare extracts for sales analytics in different formats, such as CSV, PDF, and XLSX. Our team also configured the flow for the preparation of sales forecast reports based on analytics data prepared by ETL and stored in the Azure Synapse data warehouse. During this stage, we also configured the solution’s security measures, such as encryption for data at rest and in transit. Finally, our specialists defined policies and permissions for data stored at the reporting layer.

Once all the major development tasks were settled, we proceeded with solution testing. We ran several QA workflows and user acceptance testing (UAT) to ensure that the entire flow was working properly. After testing, we deployed the solution to production. Our team also helped the customer set up a custom QA framework built with Python and Azure DevOps, which they can use for continuous testing and monitoring of the system. This will help them keep the infrastructure productive and reliable in the long run.

BI Ecommerce solution development

Data Integration with Azure Data Factory

We created several ETL jobs using Azure Data Factory to ensure efficient data aggregation and processing. Our team leveraged strong experience in Azure services to help the customer benefit from the main advantages of ADF, such as cost-efficiency, simple scalability, and serverless architecture, which facilitates efficient system manageability. The data is retrieved from several sources, namely Google Docs, MS SQL Server, and BigQuery, and ADF handles several processes, such as:

1. Data Cleansing
2. Checking for constraints
3. Checking referential integrity
4. Pivot/aggregation of transactional data
5. Data preparation for reporting and analytics
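In ADF itself, these steps are configured as pipeline activities and data flows rather than hand-written code. Purely as an illustration of the transformations involved, the same steps can be sketched in Python with pandas; all column names and sample records below are hypothetical:

```python
import pandas as pd

# Hypothetical raw sales records; in the real pipeline these arrive
# from Google Docs, MS SQL Server, and BigQuery via ADF copy activities.
orders = pd.DataFrame({
    "order_id": [1, 2, 2, 3, 4],
    "customer_id": [10, 11, 11, 12, 99],
    "amount": [120.0, 80.0, 80.0, None, 45.0],
})
customers = pd.DataFrame({"customer_id": [10, 11, 12]})

# 1. Data cleansing: drop duplicates and rows missing a sale amount.
clean = orders.drop_duplicates().dropna(subset=["amount"])

# 2. Constraint check: sale amounts must be positive.
assert (clean["amount"] > 0).all(), "constraint violated: non-positive amount"

# 3. Referential integrity: keep only orders whose customer exists.
valid = clean[clean["customer_id"].isin(customers["customer_id"])]

# 4. Pivot/aggregation of transactional data per customer.
summary = valid.groupby("customer_id", as_index=False)["amount"].sum()

# 5. The aggregated frame is what would land in the warehouse for reporting.
print(summary)
```

In the real workflow, the equivalent of `summary` is written to Azure Synapse, from which the reporting layer reads.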


Detailed reports created with Power BI

Once the ETL jobs are executed, data is stored in the Azure Synapse data warehouse. This data is then used to build reports with Power BI. The main advantages of this platform include its connectivity with the Microsoft ecosystem, cost-efficiency, and simplicity of use, which allows even non-technical specialists to build informative and convenient reports. The user console can be used for creating sales activity and analytics reports.

The solution allows exporting sales analytics reports in different formats, such as PDF, XLSX, and HTML. There is also functionality for preparing sales forecast reports based on processed data analytics information.
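The extracts themselves are produced by Power BI Report Server; purely as an illustration of the kind of multi-format extract generation involved, a minimal pandas sketch with hypothetical data might look like this:

```python
import io
import pandas as pd

# Hypothetical aggregated sales analytics, as it would come back
# from the Azure Synapse warehouse.
sales = pd.DataFrame({
    "region": ["US", "EU"],
    "revenue": [125000.0, 98000.0],
})

# CSV extract, e.g. for downstream spreadsheet tools.
csv_buf = io.StringIO()
sales.to_csv(csv_buf, index=False)

# HTML extract, e.g. for embedding a table into a report page.
html_table = sales.to_html(index=False)

print(csv_buf.getvalue().splitlines()[0])  # header row: region,revenue
```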


Custom QA framework

In conjunction with this, our QA framework, based on Python and Azure DevOps, delivered a robust and scalable approach to quality assurance for the client. The framework incorporates automated testing, continuous monitoring, and comprehensive reporting, ensuring that every code change is rigorously tested before reaching production.

With Python's extensive libraries and ease of integration, we developed and executed efficient test cases. Coupled with Azure DevOps, the QA framework facilitates seamless collaboration between the development and QA teams, reducing bottlenecks and improving the overall product quality. This integrated approach ensures that the client’s applications exceed quality standards and provide excellent user experience.
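The framework’s internals are specific to the client, but the style of data-quality check it runs from an Azure DevOps pipeline can be sketched as follows; the check names, table contents, and thresholds here are hypothetical:

```python
# Minimal sketch of the kind of data-quality checks the QA framework runs.
# In the real framework, extracts come from the warehouse rather than a stub.

def check_no_null_keys(rows, key):
    """Fail if any row is missing its primary-key value."""
    return all(row.get(key) is not None for row in rows)

def check_row_count(rows, minimum):
    """Fail if an extract came back suspiciously small."""
    return len(rows) >= minimum

# Example run against a stubbed extract.
extract = [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 80.0}]
assert check_no_null_keys(extract, "order_id")
assert check_row_count(extract, minimum=1)
print("data-quality checks passed")
```

In practice, such checks would be collected by a test runner such as pytest and published as pipeline test results.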


Efficient continuous integration and continuous deployment

Our recently completed CI/CD solution, implemented using Azure DevOps, has transformed the software development lifecycle for our client, enabling faster and more reliable deployments. By integrating continuous integration and continuous delivery pipelines, we automated the entire build, test, and release process.

This approach not only accelerated the client’s time-to-market but also enhanced code quality and minimized human errors. The solution provided seamless integration with various tools and services, comprehensive reporting, and the flexibility to scale according to project needs. As a result, the development team was able to focus on innovation and coding, while the CI/CD pipelines managed the deployment process efficiently and effectively.
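As a rough illustration, a simplified Azure Pipelines definition for such a build-test-release flow might look like the following; the stage names, scripts, and file paths are hypothetical, not the client’s actual configuration:

```yaml
# Illustrative azure-pipelines.yml; names and paths are placeholders.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

stages:
  - stage: Build
    jobs:
      - job: BuildAndTest
        steps:
          - task: UsePythonVersion@0
            inputs:
              versionSpec: "3.11"
          - script: pip install -r requirements.txt
            displayName: Install dependencies
          - script: pytest tests/ --junitxml=results.xml
            displayName: Run QA framework tests
          - task: PublishTestResults@2
            inputs:
              testResultsFiles: results.xml

  - stage: Deploy
    dependsOn: Build
    jobs:
      - deployment: Release
        environment: production
        strategy:
          runOnce:
            deploy:
              steps:
                - script: echo "Deploy ADF pipelines and reports here"
```

Gating the Deploy stage on a green Build stage is what keeps untested changes out of production.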


A comprehensive and secure data integration workflow for BI

We created a productive and cost-efficient cloud-based system that relies on the power of Microsoft Azure services. All parts of the infrastructure integrate seamlessly with each other and ensure that only processed and aggregated data is delivered to the reporting layer, where it can be displayed to non-technical users as informative and detailed reports. Our team also paid close attention to security measures, as the system often deals with sensitive business data. In particular, we implemented encryption for data at rest and data in transit. We also established group/user management policies and permissions based on a single sign-on (SSO) authentication scheme for the reporting layer. This gives the client complete control over who can access their sensitive data, and when.

Technology Solutions

  • Efficient ETL jobs for data processing, pivot, aggregation, and preparation built with Azure Data Factory
  • Efficient CI/CD built with Azure DevOps that brings flexibility and accelerates the system’s time to market
  • Diverse reports for sales analytics and forecasting built with Power BI
  • Robust security measures achieved through data encryption and custom user/group access policies
  • Robust QA framework that ensures continuous monitoring, automated testing, and comprehensive reporting
  • Azure-based cloud infrastructure ensuring efficient data storage, processing, and management

Value Delivered

  • Seamless processing of terabytes of business-related data, including sensitive information
  • A flexible infrastructure ready for scale-ups and adjustments on demand
  • A diverse set of reports with the most relevant and up-to-date data for BI forecasting and analytics
  • Custom QA framework for continuous tests allowing the customer to keep the system reliable
  • Established operational practices for efficient system monitoring and analytics