Category: LiveSync Automation

How to Implement Near Zero Downtime for Large Cloud Data Migration?

One of my clients is moving its SAP on-premise instances to the cloud. SAP offers Near Zero Downtime Technology (NZDT) to reduce the migration downtime from approximately one week to 6 to 60 hours (at most a weekend, from 6 p.m. Friday to 6 a.m. Monday). The purpose of applying NZDT is to secure business continuity. However, it is not cheap: a million-dollar bill for a single weekend! If you know how, building your own in-house NZDT component should cost less than $50K.

Please be aware that NZDT is neither a new technology nor an SAP invention. Discussions of how to implement it can be found all over the internet. Our architects have gained ample experience in this area over the past two decades, and we have designed and developed LiveSync, a sophisticated non-stop 24x7 data replication tool ...
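For readers who want to see the core idea, here is a minimal sketch of the delta-replication approach that underlies near-zero-downtime migrations in general: take an initial bulk copy while the source stays online, capture ongoing changes in a change-log table (populated, for example, by triggers), and replay deltas until a final short cutover. This is not SAP's NZDT or LiveSync code; the table names and the use of sqlite3 are hypothetical, chosen only to keep the sketch self-contained.

```python
# Minimal delta-replication sketch (hypothetical schema; sqlite3 for brevity).
# A real migration would use the actual source/target DBMS drivers, and
# change capture must be enabled BEFORE the initial snapshot starts.
import sqlite3

def initial_load(src: sqlite3.Connection, tgt: sqlite3.Connection) -> None:
    """Bulk-copy the table while the source stays online."""
    rows = src.execute("SELECT id, payload FROM orders").fetchall()
    tgt.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?)", rows)
    tgt.commit()

def replay_deltas(src: sqlite3.Connection, tgt: sqlite3.Connection) -> int:
    """Apply changes recorded in the change-log table; return backlog size."""
    changes = src.execute(
        "SELECT seq, op, id, payload FROM orders_changelog ORDER BY seq"
    ).fetchall()
    for seq, op, row_id, payload in changes:
        if op == "D":  # delete
            tgt.execute("DELETE FROM orders WHERE id = ?", (row_id,))
        else:          # insert or update
            tgt.execute("INSERT OR REPLACE INTO orders VALUES (?, ?)",
                        (row_id, payload))
    tgt.commit()
    if changes:  # trim the change log up to the last applied sequence number
        src.execute("DELETE FROM orders_changelog WHERE seq <= ?",
                    (changes[-1][0],))
        src.commit()
    return len(changes)

def migrate(src: sqlite3.Connection, tgt: sqlite3.Connection,
            backlog_threshold: int = 10) -> None:
    initial_load(src, tgt)
    # Business keeps running while we chase the backlog down.
    while replay_deltas(src, tgt) > backlog_threshold:
        pass
    # <-- here: block writes on the source (the short downtime window starts)
    replay_deltas(src, tgt)  # final catch-up, then switch traffic to the target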

Read More

Batch Data Processing Architecture for Visualization and Analytics

One of my municipal clients needs to handle batch data from multiple data sources for visualization and analytics. The data sources can be an ERP system, sensors, or internal databases. The sensors load data into a time series database continuously, and the BI dashboard must show charts based on the latest data in near real time.

Let’s see how we design the architecture to satisfy the requirements.

[Figure: Batch Data Processing for Visualization and Analytics]

First, we have the data sources listed on the left, including ERP, time-series-database-backed systems, other databases, and data flows from APIs;

Second, we have an ETL process to load data from the data sources into the DVA (data visualization and analytics) platform. You may choose any ETL tool you like, but we highly recommend Lionsgate Software's LiveSync Automation here...
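To make the ETL step concrete, here is a generic sketch of what one such batch load might look like. This is not LiveSync itself; the table names, the 0.1 scaling factor, and the use of sqlite3 are placeholder assumptions chosen to keep the example self-contained.

```python
# Generic batch ETL sketch (not LiveSync; schema and scale factor are assumed).
import sqlite3

def etl_batch(source: sqlite3.Connection, dva: sqlite3.Connection) -> int:
    """Extract new sensor readings, normalize units, load into the DVA store."""
    # Extract: pull only rows added since the last successful run
    # (high-water-mark pattern based on the max loaded timestamp).
    (last_ts,) = dva.execute(
        "SELECT COALESCE(MAX(ts), 0) FROM readings").fetchone()
    rows = source.execute(
        "SELECT ts, sensor_id, value_raw FROM sensor_readings WHERE ts > ?",
        (last_ts,)).fetchall()

    # Transform: convert raw values to engineering units (assumed 0.1 scale).
    transformed = [(ts, sid, raw * 0.1) for ts, sid, raw in rows]

    # Load: idempotent upsert, so an interrupted batch can safely be re-run.
    dva.executemany(
        "INSERT OR REPLACE INTO readings (ts, sensor_id, value) VALUES (?, ?, ?)",
        transformed)
    dva.commit()
    return len(transformed)
```

The high-water-mark extract plus idempotent upsert is what makes the batch safe to re-run on a tight schedule, which matters when the dashboard depends on the freshest load.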

Read More

What Can Confluent or Kafka Do for Data Synchronization?

Confluent is a streaming platform based on Apache Kafka. Apache Kafka is a distributed streaming platform that offers three key capabilities:

  1. It lets you publish and subscribe to streams of records. In this respect it is similar to a message queue or enterprise messaging system (see the minimal sketch after this list).
  2. It lets you store streams of records in a fault-tolerant way.
  3. It lets you process streams of records as they occur.
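To make the publish/subscribe capability concrete, here is a minimal sketch using the confluent-kafka Python client. The broker address, topic name, and group id are placeholder assumptions.

```python
# Minimal Kafka publish/subscribe sketch (confluent-kafka Python client).
# Broker address, topic, and group id below are placeholder assumptions.
from confluent_kafka import Producer, Consumer

# Publish a record to the "orders" topic.
producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("orders", key="order-42", value='{"amount": 99.5}')
producer.flush()  # block until the record is acknowledged

# Subscribe and read the record back.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "demo-group",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])
msg = consumer.poll(10.0)  # wait up to 10 s for a record
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())  # b'order-42' b'{"amount": 99.5}'
consumer.close()
```

The same client code runs unchanged against plain Apache Kafka or a Confluent Platform cluster; only the bootstrap address differs.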

So Kafka is good for:

  1. Building real-time streaming data pipelines that reliably get data between systems or applications
  2. Building real-time streaming applications that transform or react to the streams of data (see the consume-transform-produce sketch below)
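The second use case can be sketched as a tiny consume-transform-produce loop that reacts to records as they arrive. The "raw" and "clean" topic names and the uppercasing transform are illustrative assumptions only.

```python
# Tiny consume-transform-produce loop: reads from "raw", writes transformed
# payloads to "clean". Topic names and broker address are assumptions.
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "transformer",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["raw"])
producer = Producer({"bootstrap.servers": "localhost:9092"})

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        # Transform each record as it occurs, then forward it downstream.
        producer.produce("clean", value=msg.value().upper())
        producer.poll(0)  # serve delivery callbacks
except KeyboardInterrupt:
    pass
finally:
    consumer.close()
    producer.flush()
```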

Now let’s get back to Confluent. Confluent offers Open Source and Enterprise editions. We may see Confluent as a wrapper around Apache Kafka...

Read More