On September 9, 2021, Antoine Stelma teamed up with industry analyst Rick van der Lans and Sukki Sandhar from SingleStore for the live webinar “Rationalizing Data Architectures with a Unified Database”.
After a short introduction, Rick, who was ranked the sixth most influential BI analyst worldwide by onalytica.com in 2018, illustrated the need to simplify today’s data architectures.
The chain of databases in classic, but also in modern, data architectures requires a lot of data copies, Rick stated. We are used to a process in which copying data is self-evident: we copy data from source systems to staging areas, to the data warehouse and on to data marts. And even in a data lake or data hub architecture, we copy and duplicate data for analytics and reporting.
But copying data is not free, Rick said. The more often you copy your data, the older your data will be. And at the same time, he emphasized that repeatedly copying data is not just a matter of time: “Every time we copy data, we add complexity to our data management and data security organization.”
The database-by-workload approach has led to a proliferation of data infrastructure, which is complex and costly, Rick continued. For many years, special-purpose databases have served organizations very well. But for today’s data-intensive applications, good is no longer good enough.
The need has grown to rationalize data architectures so that they provide zero-latency analytics, scale elastically, and stay agile. By merging databases and bringing transactional and analytical workloads together in one unified and simplified architecture, Rick illustrated a new data delivery system for data processing with simplified data routes.
One unified platform to deliver the speed, scale, and agility that organizations need
To explain the concept of a unified database, Sukki provided insights into how SingleStore uniquely supports multiple workloads with Universal Storage, and how the SingleStore architecture supports Hybrid Transactional/Analytical Processing (HTAP), which “breaks the wall” between transaction processing (OLTP) and analytics (OLAP) by working directly on the existing, original data source.
Sukki went on to explain how the combination of an in-memory rowstore, with response times of a few milliseconds, and an on-disk columnstore, with a compression ratio of around 80%, gives you a groundbreaking new storage engine that helps consolidate and simplify your architecture while reducing operational cost and complexity.
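To see why a columnstore compresses so well, consider a toy sketch (plain Python, not SingleStore’s actual engine): laying a table out column by column puts similar values next to each other, so even a simple scheme like run-length encoding shrinks repetitive columns dramatically.

```python
# Toy illustration (not SingleStore's engine): row vs. column layout,
# and why the columnar layout compresses repetitive data well.

rows = [
    {"city": "Amsterdam", "status": "OK"},
    {"city": "Amsterdam", "status": "OK"},
    {"city": "Amsterdam", "status": "FAIL"},
    {"city": "Utrecht",   "status": "OK"},
]

# Column layout: one list per column, so similar values sit adjacent.
columns = {key: [r[key] for r in rows] for key in rows[0]}

def run_length_encode(values):
    """Collapse runs of identical values into (value, count) pairs."""
    encoded = []
    for v in values:
        if encoded and encoded[-1][0] == v:
            encoded[-1] = (v, encoded[-1][1] + 1)
        else:
            encoded.append((v, 1))
    return encoded

print(run_length_encode(columns["city"]))
# [('Amsterdam', 3), ('Utrecht', 1)]
```

The rowstore keeps the record-at-a-time layout that makes millisecond point lookups and updates cheap; the columnstore trades that for scan speed and compression, which is exactly the combination a unified engine needs.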
One single Database for Fast Analytics
Going a level deeper, Antoine, drawing on his extensive background as a data architect, talked about the capabilities of SingleStore and how to use SingleStore’s multi-model architecture and Universal Storage to enrich your data analytics.
The core features Antoine highlighted:
- NoSQL Key-Value Pairs
- Time Series
- Full-Text Search
- Procedures, Triggers and Functions
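To make the multi-model idea concrete, here is a minimal, engine-agnostic sketch (plain Python, not SingleStore syntax) of serving two of those access patterns, key-value lookup and full-text search, over the same records:

```python
# Minimal multi-model sketch (plain Python, not SingleStore syntax):
# the same records served via key-value lookup and full-text search.

records = {
    "order:1001": "customer reports delayed shipment to Rotterdam",
    "order:1002": "express shipment delivered on time",
}

# Key-value access: direct lookup by key.
note = records["order:1001"]

# Full-text access: a tiny inverted index mapping word -> record keys.
index = {}
for key, text in records.items():
    for word in text.split():
        index.setdefault(word, set()).add(key)

print(sorted(index["shipment"]))  # both records mention "shipment"
```

The point of a multi-model database is that such patterns run against one copy of the data instead of a separate key-value store and search engine.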
Finally, Antoine provided insight into SingleStore Pipelines, which let you extract, transform, and load external data in a robust and scalable way without any third-party software. This enables real-time ingestion of data with exactly-once semantics.
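Exactly-once semantics are typically achieved by committing the source offset together with the loaded data, so that a batch re-delivered after a failure is skipped rather than applied twice. A minimal sketch of that idea (illustrative only, not the Pipelines implementation):

```python
# Illustrative sketch of exactly-once ingestion (not SingleStore's
# Pipelines implementation): the last processed source offset is kept
# with the data, so re-delivered batches are detected and skipped.

state = {"offset": -1, "total": 0}  # committed atomically in a real system

def ingest(batch_offset, values, state):
    """Apply a batch only if it has not been processed before."""
    if batch_offset <= state["offset"]:
        return  # duplicate delivery after a retry: skip, don't re-apply
    state["total"] += sum(values)
    state["offset"] = batch_offset

ingest(0, [10, 20], state)
ingest(1, [5], state)
ingest(1, [5], state)   # retried batch is ignored
print(state["total"])   # 35, not 40
```

Without the offset check, every retry would inflate the totals, which is exactly the failure mode exactly-once ingestion rules out.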
Due to its scalability, ultra-fast data ingestion, super-low latencies, and high concurrency, SingleStore is the ideal unified (cloud-native) database for fast analytics and AI/ML-powered applications that require real-time access. And it delivers 10-100x the performance and flexibility at a third of the cost of legacy databases.