This post originally appeared on the Myntra Engineering Blog.

Learn how Myntra gained real-time insights on rapidly growing data using their new processing and reporting framework.

Background

I got the opportunity to work extensively with big data and analytics at Myntra. Data-driven intelligence is one of Myntra's core values, so crunching and processing data and reporting meaningful insights back to the company is of utmost importance.

Every day, millions of users visit Myntra on our app or website, generating billions of clickstream events. This makes it critical for the data platform team to scale to such a huge number of incoming events, ingest them in real time with minimal or no loss, and process the unstructured/semi-structured data to generate insights.

We use a varied set of technologies and in-house products to achieve this, including but not limited to Go, Kafka, Secor, Spark, Scala, Java, S3, Presto and Redshift.

Motivation

As more and more business decisions came to be based on data and insights, batch and offline reporting was simply not enough. We needed real-time user behaviour analysis, real-time traffic, real-time notification performance and other metrics available with minimal latency so that business users could act on them. That meant ingesting and filtering/processing data in real time, and persisting it in a fast, write-optimized data store to support dashboarding and reporting on top of it.

Meterial is one such pipeline. It does exactly this and more, with a feedback loop that lets other teams take action on the data in real time.

Architecture

[Architecture diagram]

Meterial is powered by:
1. Apache Kafka
2. Data transformer based on Apache Spark
3. MemSQL real-time database
4. React.js-based UI

Deep Dive

Our event collectors, written in Go, sit behind Amazon ELB to receive events from our app and website; they add a timestamp to each incoming clickstream event and push it into Kafka.
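
The collectors themselves are Go services; purely as a sketch of that step (written here in Scala, with a hypothetical "clickstream" topic and broker address), the timestamp-and-publish logic looks roughly like this:

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

// Sketch only: the production collectors are written in Go. The topic name
// and broker address are placeholders, not our actual values.
object CollectorSketch {
  val props = new Properties()
  props.put("bootstrap.servers", "kafka:9092")
  props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
  val producer = new KafkaProducer[String, String](props)

  def handleEvent(rawJson: String): Unit = {
    // Stamp the event with a server-side receive time so downstream jobs can
    // window on it; assumes the payload is a JSON object ending in '}'.
    val stamped = rawJson.dropRight(1) + s""","received_at":${System.currentTimeMillis}}"""
    producer.send(new ProducerRecord[String, String]("clickstream", stamped))
  }
}
```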

From Kafka, the Meterial ingestion layer, built on Apache Spark Streaming, ingests around 4 million events/minute, filters and transforms the incoming events based on a configuration file, and persists them to a MemSQL rowstore every minute. MemSQL returns results for queries spanning millions of rows with sub-second latency.
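
As a minimal sketch of that ingestion layer, assuming a hypothetical "clickstream" topic, an events_rowstore table, and the configuration file reduced to a hard-coded set of allowed event types:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._
import org.apache.kafka.common.serialization.StringDeserializer
import java.sql.DriverManager

// Illustrative sketch: read clickstream events from Kafka, keep only the
// event types a config would name, and batch-insert them into a MemSQL
// rowstore once per minute. Connection details are placeholders.
object MeterialIngestSketch {
  val allowedEventTypes = Set("app-click", "page-view") // stand-in for the config file

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("meterial-ingest")
    val ssc  = new StreamingContext(conf, Seconds(60))  // one-minute micro-batches

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "kafka:9092",
      "key.deserializer"  -> classOf[StringDeserializer],
      "value.deserializer"-> classOf[StringDeserializer],
      "group.id"          -> "meterial-ingest")

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("clickstream"), kafkaParams))

    stream.map(_.value)
      .filter(json => allowedEventTypes.exists(json.contains)) // crude filter; real code parses JSON
      .foreachRDD { rdd =>
        rdd.foreachPartition { events =>
          // MemSQL speaks the MySQL wire protocol, so plain JDBC works.
          val conn = DriverManager.getConnection("jdbc:mysql://memsql:3306/meterial", "user", "pass")
          val stmt = conn.prepareStatement("INSERT INTO events_rowstore (payload) VALUES (?)")
          events.foreach { e => stmt.setString(1, e); stmt.addBatch() }
          stmt.executeBatch()
          conn.close()
        }
      }

    ssc.start()
    ssc.awaitTermination()
  }
}
```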

Our in-house dashboarding and reporting framework (UDP, the Universal Dashboarding Platform) has services that query MemSQL every minute and store the results in the UDP query cache, from where they are served to all connected clients over socket-based connections.
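
Conceptually, that refresh loop is a scheduled query plus an in-memory cache. The sketch below is illustrative only: the query, cache shape and connection details are made up, and the socket broadcast to clients is elided:

```scala
import java.sql.DriverManager
import java.util.concurrent.{Executors, TimeUnit}
import scala.collection.concurrent.TrieMap

// Hypothetical sketch of a UDP-style refresh loop: re-run a registered query
// against MemSQL every minute and keep the latest result in a cache, from
// which the (not shown) socket layer pushes updates to connected clients.
object UdpQueryCacheSketch {
  val cache = TrieMap.empty[String, Vector[(String, Long)]]

  def refresh(queryId: String, sql: String): Unit = {
    val conn = DriverManager.getConnection("jdbc:mysql://memsql:3306/meterial", "user", "pass")
    try {
      val rs = conn.createStatement().executeQuery(sql)
      val rows = Iterator.continually(rs)
        .takeWhile(_.next())
        .map(r => (r.getString(1), r.getLong(2)))
        .toVector
      cache.put(queryId, rows) // socket layer would broadcast this snapshot
    } finally conn.close()
  }

  def main(args: Array[String]): Unit = {
    val scheduler = Executors.newSingleThreadScheduledExecutor()
    scheduler.scheduleAtFixedRate(
      () => refresh("traffic-by-page", "SELECT page, COUNT(*) FROM events GROUP BY page"),
      0, 1, TimeUnit.MINUTES)
  }
}
```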

Results are displayed as graphs, charts, tables and the numerous other widgets supported by UDP.
The same UDP APIs are also used by Slack bots to post data into Slack channels in real time via Slack webhooks.
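
Posting into a channel comes down to an HTTP POST of a small JSON payload to a Slack webhook URL. A minimal sketch, with a placeholder URL and message:

```scala
import java.net.{HttpURLConnection, URL}
import java.nio.charset.StandardCharsets

// Hypothetical sketch of a bot pushing a UDP metric into a Slack channel.
// The webhook URL is a placeholder; Slack expects a JSON body with a "text" field.
object SlackPostSketch {
  def post(webhookUrl: String, text: String): Unit = {
    val conn = new URL(webhookUrl).openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("POST")
    conn.setRequestProperty("Content-Type", "application/json")
    conn.setDoOutput(true)
    conn.getOutputStream.write(s"""{"text": "$text"}""".getBytes(StandardCharsets.UTF_8))
    conn.getResponseCode // force the request; 200 means Slack accepted it
    conn.disconnect()
  }

  def main(args: Array[String]): Unit =
    post("https://hooks.slack.com/services/T000/B000/XXXX", "Traffic last minute: 3.9M events")
}
```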

All transactional data currently lives in Redshift, and some reports need commerce data combined with user data every 15 minutes, so Meterial also enables this ad-hoc analysis for our team of data analysts. Every fifteen minutes, the data in MemSQL for that interval is dumped into S3, from where it is loaded into Redshift using our S3-to-Redshift ETLs.
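
As an illustration of that hop (placeholder bucket, table names and credentials; a direct Redshift COPY standing in for our in-house ETLs), a Spark job doing the dump-and-load could look like:

```scala
import java.sql.DriverManager
import org.apache.spark.sql.SparkSession

// Hedged sketch of the 15-minute MemSQL -> S3 -> Redshift hop. All names and
// credentials below are hypothetical.
object IntervalDumpSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("meterial-interval-dump").getOrCreate()

    // 1. Pull the last 15 minutes of rows out of MemSQL over JDBC.
    val interval = spark.read.format("jdbc")
      .option("url", "jdbc:mysql://memsql:3306/meterial")
      .option("dbtable", "(SELECT * FROM events WHERE ts >= NOW() - INTERVAL 15 MINUTE) t")
      .option("user", "user")
      .option("password", "pass")
      .load()

    // 2. Land the interval in S3 as CSV (s3a:// for Hadoop writers, s3:// for Redshift).
    val prefix = "meterial-dumps/events/interval=latest/"
    interval.write.mode("overwrite").csv(s"s3a://$prefix")

    // 3. Ask Redshift to COPY the freshly written files in.
    val rs = DriverManager.getConnection("jdbc:redshift://redshift:5439/dw", "user", "pass")
    rs.createStatement().execute(
      s"COPY events_rt FROM 's3://$prefix' IAM_ROLE 'arn:aws:iam::111111111111:role/etl' CSV")
    rs.close()
  }
}
```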

We selected Spark as our streaming engine because of its proven scale, strong community support, the expertise within our team, and easy scalability with proper tuning.

For the real-time datastore, we ran POCs on multiple databases and narrowed down to MemSQL.
MemSQL is a high-performance, in-memory and disk-based database that combines the horizontal scalability of distributed systems with the familiarity of SQL.
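
Because MemSQL speaks the MySQL wire protocol, ordinary JDBC tooling works against it. Purely as an illustration (the schema and key choices below are hypothetical, not our actual tables), a minute-granularity rowstore table might be declared like this:

```scala
import java.sql.DriverManager

// Illustrative DDL only. SHARD KEY is MemSQL's way of picking the column the
// rows are distributed on; the secondary key on ts supports range scans over
// recent minutes. Names and credentials are placeholders.
object MemsqlDdlSketch {
  def main(args: Array[String]): Unit = {
    val conn = DriverManager.getConnection("jdbc:mysql://memsql:3306/meterial", "user", "pass")
    conn.createStatement().execute(
      """CREATE TABLE IF NOT EXISTS events (
        |  ts         DATETIME NOT NULL,
        |  event_type VARCHAR(64) NOT NULL,
        |  payload    JSON,
        |  KEY (ts),
        |  SHARD KEY (event_type)
        |)""".stripMargin)
    conn.close()
  }
}
```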

With proper tuning, we have seen MemSQL handle very high concurrent reads and writes smoothly at our scale.

We are currently exploring the MemSQL columnstore as an OLAP database for our A/B testing framework (Morpheus) and segmentation platform (Personify).

Sample UI Screenshots

[Screenshot: Traffic dashboard]

[Screenshot: Notification dashboard]

Future of Real-time Analytics at Myntra

Combining real-time data with predictive analytics, machine learning and artificial intelligence opens altogether new doors: understanding user behaviour, which paths and funnels lead to purchases, and more. Getting such information in real time can help us boost commerce and take corrective action as soon as something goes wrong, and we are constantly working on improving and enhancing the platform.