Success Stories

OKR Platform. Powerful Machine Learning Always on Target.

The algorithm takes the lead.


We partnered with a company offering software combined with the Objectives and Key Results (OKRs) methodology to allow all levels of the organization to align, improve visibility, and enable an outcome-driven culture.


Rising digitalization and a relentless focus on operational excellence are driving product companies to innovate faster. In a modern niche market, consumers increasingly opt for data-driven products that deliver intelligent, seamless digital experiences.

To turn their product vision into reality, our client needed help with data-driven enhancements that make the most of customer insights and place them ahead of the competition in a multimillion-dollar market category of their own.


Our transformation journey started with agile, exploratory analysis across customer and product data segments to grasp key concepts and build knowledge of the product.

We developed a PostgreSQL database for the analytical workloads, connected to an automated data pipeline in Azure Cloud that leveraged Azure Data Lake and Azure Data Factory to source data from MongoDB. As development progressed, we adopted Azure Synapse to handle data orchestration and supersede Azure Data Factory.
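Moving data from a document store like MongoDB into relational analytical tables means flattening nested documents into rows. As an illustration only (the document shape and field names below are hypothetical, not the client's schema), a minimal flattening step might look like this:

```python
def flatten_document(doc, parent_key="", sep="_"):
    """Recursively flatten a nested MongoDB-style document into a flat
    dict whose keys map naturally onto relational (PostgreSQL) columns."""
    items = {}
    for key, value in doc.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Nested sub-document: recurse and prefix keys with the parent path.
            items.update(flatten_document(value, new_key, sep=sep))
        else:
            items[new_key] = value
    return items

# A hypothetical OKR document as it might arrive from MongoDB.
okr_doc = {
    "objective": "Improve onboarding",
    "owner": {"team": "Product", "lead": "jdoe"},
    "key_results": 3,
}
row = flatten_document(okr_doc)
# row == {"objective": "Improve onboarding", "owner_team": "Product",
#         "owner_lead": "jdoe", "key_results": 3}
```

In the actual pipeline, transformations of this kind run inside the orchestrated Azure activities rather than in standalone scripts.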

Once enough insights were gained, we proposed developing a long short-term memory (LSTM) deep learning model in TensorFlow and integrating it as a core natural language processing (NLP) feature of the product.
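An LSTM consumes fixed-length sequences of token ids, so raw text must first be tokenized, indexed against a vocabulary, and padded. The sketch below shows that preprocessing step in plain Python (the tokenization scheme and reserved ids are illustrative assumptions, not the client's actual pipeline):

```python
def build_vocab(texts, pad_token="<pad>", unk_token="<unk>"):
    """Assign an integer id to every unique token.
    Id 0 is reserved for padding, id 1 for unknown tokens."""
    vocab = {pad_token: 0, unk_token: 1}
    for text in texts:
        for token in text.lower().split():
            vocab.setdefault(token, len(vocab))
    return vocab

def encode(text, vocab, max_len):
    """Convert text to a fixed-length sequence of ids:
    unknown tokens map to 1, and the sequence is truncated
    or zero-padded to exactly max_len entries."""
    ids = [vocab.get(tok, 1) for tok in text.lower().split()][:max_len]
    return ids + [0] * (max_len - len(ids))

vocab = build_vocab(["increase trial conversions"])
encoded = encode("increase conversions", vocab, 4)
# encoded == [2, 4, 0, 0]
```

Sequences encoded this way would then feed an embedding layer followed by LSTM layers in the TensorFlow model.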

Our new feature provides real-time semantic suggestions inferred from customer-supplied text fields. Once the user finishes writing, the text is sent to the model API, and predictions for the form fields are returned to the front end. Because the product architecture is based on microservices, the model service was embedded in this context using gRPC.
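The round trip above follows a simple request/response contract. The stand-in below mirrors that shape in plain Python; the message and field names are illustrative placeholders (not the client's actual .proto definitions), and the keyword rule stands in for the LSTM call that would happen over gRPC in production:

```python
from dataclasses import dataclass, field

@dataclass
class SuggestRequest:
    # Free text entered by the user in an objective field.
    text: str

@dataclass
class SuggestResponse:
    # Suggested values for related form fields.
    suggestions: list = field(default_factory=list)

def suggest(request: SuggestRequest) -> SuggestResponse:
    """Stand-in for the model service handler: in production the text
    would be encoded and passed to the LSTM; here a trivial keyword
    rule illustrates the request/response flow."""
    text = request.text.lower()
    suggestions = []
    if "increase" in text or "grow" in text:
        suggestions.append("metric: growth rate")
    if "reduce" in text or "cut" in text:
        suggestions.append("metric: cost")
    return SuggestResponse(suggestions=suggestions)

response = suggest(SuggestRequest(text="Increase weekly active users"))
# response.suggestions == ["metric: growth rate"]
```

In the real service, these dataclasses correspond to protobuf messages, and the handler is a gRPC servicer method generated from the service definition.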

By leveraging technologies such as Docker, MLflow, Jenkins, and SonarQube, we fully automated the data science and machine learning operations workflow for seamless model training, testing, evaluation, and deployment. As part of the ongoing monitoring setup, we developed Grafana dashboards with KPIs and metrics to track the real-time accuracy and performance of the service in production.
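A core piece of such an automated workflow is the evaluation gate that decides whether a newly trained model may be deployed. The function below sketches one such gate; the threshold values and metric names are hypothetical examples, not the client's production criteria:

```python
def should_promote(candidate_metrics, production_metrics,
                   min_accuracy=0.90, max_regression=0.01):
    """CI/CD evaluation gate (illustrative thresholds): a candidate
    model is promoted only if it clears an absolute accuracy bar and
    does not regress the current production model by more than
    max_regression."""
    accuracy = candidate_metrics["accuracy"]
    if accuracy < min_accuracy:
        # Fails the absolute quality bar outright.
        return False
    # Allow at most a small regression versus the live model.
    return accuracy >= production_metrics["accuracy"] - max_regression

# A candidate at 92% accuracy passes against a 91% production model.
promote = should_promote({"accuracy": 0.92}, {"accuracy": 0.91})
```

In a pipeline like the one described, a check of this kind would run as a Jenkins stage after MLflow logs the candidate's evaluation metrics, blocking deployment on failure.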

Our model service runs in a live Kubernetes cluster, serving thousands of customer requests daily. Through robust code optimization and testing, we achieved a mean response time of 150 milliseconds per request and a model accuracy of 92%. We continue to bring value to the customer by researching, building, testing, and delivering new, data-driven product features.
Quick Facts
Duration: 11 months

Technology Stack: Python, R, TensorFlow, Prophet, H2O, Kubeflow, MLflow, Snowflake, Power BI, Tableau, Grafana

Team: 6 Data Science Engineers

Let's talk about your ideas.

Contact Us