The Importance of Real-time Event Processing at DigiOutsource

Jacques Bezuidenhout is the Head of Software Development for Platforms and Shared Services at DigiOutsource. Jacques takes us through the scale, as well as the capabilities, of data processing at DigiOutsource.

Cutting Edge

The term "cutting edge" is used quite loosely, but at DigiOutsource we fully embrace it. Our approach and philosophy around real-time processing is no different. In the ever-changing world of a business fully enabled by technology, the one constant is the needs of the customer. With the growth of the internet from the late '90s and mid-2000s through to today, the world as we know it is more connected and demanding than ever before, and any delay in processing is simply no longer acceptable. Imagine booking an Uber but having to wait 30 minutes before anyone confirms the order: it would never suffice. Online gaming is certainly no different. Our customers are just as demanding as those in other industries, perhaps even more so. As technology improves, patience and tolerance diminish. That's why, here at DigiOutsource, we embrace technology to power real-time experiences that support our business strategies to the fullest.

Our Real-time Processing Journey

Transitioning from a traditional, database-driven monolithic architecture to an event-driven, microservice-based architecture was no small feat. Following our teams' philosophy of using agile principles, we embarked on a journey to reconstruct and rethink the way our systems operate. This resulted in designing an event-based data processing platform at a time when traditional CRUD ETL processes were still the norm.

Raw data is seldom useful for real-time analytics. The 'aha' moment was realising that we needed to capture data as events in real time and move it across the enterprise reliably, with minimal delay. It was key to be able to transform and enrich the data so that business decisions could be made at run-time.
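As a rough sketch of that idea, the Python below captures a raw record as an event and enriches it in-stream. The field names (player_id, deposit_amount, player_segment) and the enrichment rules are hypothetical, purely for illustration; the article does not describe DigiOutsource's actual schema or platform.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import uuid

@dataclass
class Event:
    """A raw business fact captured as an event the moment it happens."""
    event_type: str
    payload: dict
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    occurred_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

def enrich(event: Event, reference_data: dict) -> Event:
    """Add derived and reference fields so downstream consumers can act on the
    event at run-time without going back to the source database."""
    enriched_payload = dict(event.payload)
    player = reference_data.get(event.payload.get("player_id"), {})
    enriched_payload["player_segment"] = player.get("segment", "unknown")
    enriched_payload["is_large_deposit"] = event.payload.get("deposit_amount", 0) > 1000
    return Event(event_type=event.event_type, payload=enriched_payload,
                 event_id=event.event_id, occurred_at=event.occurred_at)

# Example: a deposit captured as an event and enriched in-stream.
raw = Event("deposit", {"player_id": "p-42", "deposit_amount": 2500})
ready = enrich(raw, {"p-42": {"segment": "vip"}})
print(ready.payload)
```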

As we created reliable and scalable data-movement pipelines, we had to look at best practices for stream processing and for giving data the right value.

A fundamental shift in technology and upskilling was required: in essence, taking a 'streaming-first' approach.

Real-time streaming data can often be used for more than one purpose. To optimize data flows and minimize resource usage, it was important that this data be collected only once, yet processed in different ways and delivered to multiple endpoints. The whole point of real-time data movement and stream processing is to deal with huge volumes of data with very low latency, so it was important for us to ensure that the entire end-to-end design was well thought through.
This did come with challenges and learnings that helped us along the way; learnings and challenges we saw as opportunities to rethink software engineering.
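As a rough illustration of the 'collect once, process many ways' idea above, the toy Python below delivers each ingested event to several independent consumers. In practice this role is played by a streaming platform with independent subscribers; the handler names and endpoints here are invented for the example.

```python
from collections import defaultdict
from typing import Callable, Dict, List

class EventBus:
    """Toy fan-out: each event is ingested once and delivered to every
    subscriber registered for its type, so one collection feeds many pipelines."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[event_type].append(handler)

    def publish(self, event_type: str, event: dict) -> None:
        # One ingestion point, many independent consumers.
        for handler in self._subscribers[event_type]:
            handler(event)

bus = EventBus()
bus.subscribe("deposit", lambda e: print("real-time analytics saw", e))
bus.subscribe("deposit", lambda e: print("fraud checks saw", e))
bus.subscribe("deposit", lambda e: print("warehouse loader saw", e))

bus.publish("deposit", {"player_id": "p-42", "deposit_amount": 2500})
```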

Key areas of consideration:

Embracing new technologies and working closely with other tech teams in the group allowed us to set ourselves apart from many other businesses at the time, pushing the boundaries in creating the best customer experience in real time. We now have an event-driven platform capable of performing complex event processing in real time for thousands of players simultaneously, running with full redundancy, recovery and isolated execution zones to ensure maximum up-time and fault tolerance.
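To give a feel for what a complex event processing rule can look like, here is a toy sketch in Python. The rule itself (three deposits from the same player within ten minutes) and all field names are invented for illustration; the article does not describe DigiOutsource's actual rules or engine.

```python
from collections import defaultdict, deque
from datetime import datetime, timedelta

class SlidingWindowRule:
    """Toy complex-event-processing rule: fire when a player produces
    `threshold` events of the watched type within `window`."""

    def __init__(self, event_type: str, threshold: int, window: timedelta) -> None:
        self.event_type = event_type
        self.threshold = threshold
        self.window = window
        self._history: dict[str, deque] = defaultdict(deque)

    def on_event(self, player_id: str, event_type: str, at: datetime) -> bool:
        if event_type != self.event_type:
            return False
        times = self._history[player_id]
        times.append(at)
        # Evict timestamps that have fallen out of the sliding window.
        while times and at - times[0] > self.window:
            times.popleft()
        return len(times) >= self.threshold

# Illustrative rule: three deposits from the same player within ten minutes.
rule = SlidingWindowRule("deposit", threshold=3, window=timedelta(minutes=10))
now = datetime(2019, 12, 13, 12, 0)
for offset in (0, 2, 4):
    fired = rule.on_event("p-42", "deposit", now + timedelta(minutes=offset))
print("rule fired:", fired)  # True on the third deposit
```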

The architecture allows new domains to be on-boarded seamlessly and is continually being enhanced with more services and data-science-based models, ever moving closer to the point of perfect individual customization. It gives complete insight into independent processes as a whole and allows for real-time monitoring and execution. Today, it handles over 20 million events per day, feeding more than 50 independent complex event processes stretching across our data centres all over the globe. Continuous validation of data movement from source to target, coupled with real-time monitoring, can provide peace of mind. This monitoring can incorporate intelligence, looking for anomalies in data formats, volumes or seasonal characteristics to support reliable, mission-critical data flows.
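As one illustration of that kind of volume monitoring, the sketch below flags a processing window whose event count deviates sharply from a recent baseline. It is a simple z-score check on made-up hourly counts, not a description of DigiOutsource's monitoring.

```python
from statistics import mean, stdev

def volume_anomaly(recent_counts, current_count, z_threshold=3.0):
    """Flag the current window's event count if it deviates from the recent
    baseline by more than `z_threshold` standard deviations."""
    baseline = mean(recent_counts)
    spread = stdev(recent_counts) or 1.0   # avoid division by zero on a flat history
    z_score = (current_count - baseline) / spread
    return abs(z_score) > z_threshold, z_score

# Hourly event counts for the recent past vs. the current hour (illustrative numbers).
history = [820_000, 790_000, 845_000, 810_000, 830_000, 805_000]
is_anomaly, z = volume_anomaly(history, current_count=120_000)
print(is_anomaly, round(z, 1))  # True: volume has collapsed, investigate the pipeline
```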

What's next?

More. Continual tech upgrades, onboarding more areas of the business, and using Machine Learning and AI to evolve our technical capabilities. All in real time.

Interested in a career with us? Head over to our careers page to see if we have the perfect role for you.