If you work in a large enterprise that’s been around for a while, you’ve probably experienced this yourself: traditional business intelligence (BI) systems are seriously struggling.
On-premises clients built on older relational databases are poorly equipped to cope with the explosion of semi-structured data from mobile devices, the web, and IoT. They simply can't compete with more modern web-based, API-enabled analytics tools that provide real-time analytics.
This leaves many organizations in conflict. They may understand the need to upgrade to a modern analytics platform, but find themselves trapped with legacy BI systems. Their tangled data transformation workflows make modernization prohibitively disruptive and expensive. Meanwhile, their existing BI system puts them at a serious competitive disadvantage by delivering poor-quality, outdated information that hardly deserves the name "business intelligence."
The sheer scale of today’s data operations makes upgrading imperative — yet even more daunting. IBM has estimated that people around the globe create approximately 2.5 quintillion bytes of data per day. In fact, the vast majority of available data is relatively new — 90 percent of it has been generated in just the past two years.
The critical concern, then, is how to take advantage of newer BI solutions without sacrificing the vital connections currently maintained through legacy BI systems. In this two-part series, we will examine the challenges of integrating new and old technologies and the role NoSQL databases can play.
Any new enterprise technology will create challenges. Just training employees on a different platform can strain the budget. New service-level agreements and technical support are other potential sources of trouble.
However, maintaining a legacy system indefinitely, though it might seem the safest approach, can be the most dangerous option of all. Organizations that rely on aging BI systems often find these infrastructures cannot adequately process real-time data. The existing architecture becomes brittle as user roles evolve and new needs emerge. Consider an application that must scale across multiple offices and time zones, serving users on desktops, laptops, phones, and tablets. Relational databases alone can't fulfill this need.
One reason has to do with the overhead from common database processes, such as the extract, transform, load (ETL) transactions that are the bread and butter of data warehousing. Others stem from dependence on inefficient data replication mechanisms. All of them obstruct the goal of obtaining actionable insights and using them to make educated management decisions.
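To make the ETL overhead concrete, here is a minimal sketch of the extract-transform-load pattern using Python's built-in sqlite3 module. The table names and data are hypothetical, and both databases are in-memory stand-ins; in a real warehouse these would be separate servers, and the batch round trip shown here is exactly the latency that keeps such reports from being real-time.

```python
import sqlite3

# Hypothetical source (OLTP) and warehouse (OLAP) stores, both in-memory
# here for illustration; real systems would be separate servers.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL, region TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 120.0, "east"), (2, 80.0, "west"), (3, 200.0, "east")])

wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE sales_by_region (region TEXT, total REAL)")

# Extract: read raw rows from the transactional store.
rows = src.execute("SELECT region, amount FROM orders").fetchall()

# Transform: aggregate in application code (a stand-in for heavier logic).
totals = {}
for region, amount in rows:
    totals[region] = totals.get(region, 0.0) + amount

# Load: write the summarized rows into the warehouse table.
wh.executemany("INSERT INTO sales_by_region VALUES (?, ?)", totals.items())
wh.commit()

print(dict(wh.execute("SELECT region, total FROM sales_by_region")))
```

Every step in this loop is batch work done before a single report can run, which is why organizations look for ways to trim or bypass it.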
It’s not always feasible or smart to replace a legacy BI system entirely. Fortunately, there are ways businesses can improve their analytics without unduly disrupting existing operations. One is to add NoSQL databases alongside the traditional relational databases that hold crucial items like transaction data. This combination can minimize the overhead associated with warehousing while facilitating the creation, delivery, and management of company information.
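The side-by-side approach can be sketched roughly as follows. This is a toy illustration, not a production pattern: the relational store remains the system of record for transactions, while a plain Python dict stands in for a NoSQL document store that serves fast, denormalized reads. The function and field names are hypothetical.

```python
import json
import sqlite3

# Relational store keeps the authoritative transaction records.
rdb = sqlite3.connect(":memory:")
rdb.execute(
    "CREATE TABLE transactions (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")

# A plain dict stands in for a NoSQL document store, keyed by customer,
# holding read-optimized documents for analytics queries.
doc_store = {}

def record_transaction(txn_id, customer, amount):
    # Write the transaction to the system of record first...
    rdb.execute("INSERT INTO transactions VALUES (?, ?, ?)",
                (txn_id, customer, amount))
    rdb.commit()
    # ...then update the denormalized document alongside it.
    doc = doc_store.setdefault(
        customer, {"customer": customer, "total": 0.0, "txns": []})
    doc["total"] += amount
    doc["txns"].append(txn_id)

record_transaction(1, "acme", 120.0)
record_transaction(2, "acme", 80.0)
print(json.dumps(doc_store["acme"]))
```

The point of the sketch is the division of labor: transactional integrity stays where it already lives, while analytics reads come from a store shaped for them, with no change to the relational schema.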
With in-memory engines like Spark and NoSQL databases backed by better SLAs than their relational counterparts, organizations can provide business users with near real-time metrics. In addition, their new databases can easily interact with legacy setups without disrupting dependencies or requiring substantial workflow changes.
Most organizations use separate databases for Online Transactional Processing (OLTP) and Online Analytical Processing (OLAP). The OLAP database typically pulls data from several OLTP systems to generate analytics reports, but it cannot operate in real time. Many organizations invest heavily in vendor-specific hardware and software tools to address this problem, yet still haven't achieved real-time capability.
NoSQL databases create new possibilities for reducing data warehousing overhead, accommodating fresh system architectures, and providing near real-time analytics via microsecond-to-millisecond SLAs. NoSQL databases complement existing solutions and enable proper handling of varied data sets, a core capability given that data variety is now a top issue for CIOs. Organizations must process information from a range of sources, including email, images, geospatial data, and text. NoSQL is well suited to this task in particular, and to BI systems in general, since it offers efficient indexing and search capabilities alongside flexible data models such as key-value and document stores.
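The indexing-plus-lookup pattern behind that claim can be shown with a toy key-value store. This is a simplified sketch with invented class and field names; real NoSQL engines implement the same idea at scale, with persistence and distribution.

```python
# A toy key-value store with a secondary index on tags, illustrating how
# NoSQL engines pair fast primary-key lookups with searchable indexes.
class KeyValueStore:
    def __init__(self):
        self.data = {}          # primary key -> document
        self.tag_index = {}     # tag -> set of primary keys

    def put(self, key, doc):
        self.data[key] = doc
        # Maintain the secondary index as documents are written.
        for tag in doc.get("tags", []):
            self.tag_index.setdefault(tag, set()).add(key)

    def get(self, key):
        return self.data.get(key)   # O(1) primary-key lookup

    def find_by_tag(self, tag):
        # Secondary-index search: no full scan over the data.
        return [self.data[k] for k in sorted(self.tag_index.get(tag, ()))]

store = KeyValueStore()
store.put("msg:1", {"body": "quarterly report", "tags": ["email", "finance"]})
store.put("img:7", {"body": "store layout", "tags": ["image", "geo"]})
store.put("msg:2", {"body": "budget draft", "tags": ["email"]})

print([d["body"] for d in store.find_by_tag("email")])
```

Because the index is updated on write, a search over millions of mixed documents (email, images, geospatial records) touches only the matching keys, which is what makes this model a good fit for varied BI data.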
In my next post, I’ll discuss some practical tips on how to make your NoSQL implementation work well.
Meanwhile, I’d be interested in your experience. Are you wrestling with a decision about next steps for a critical legacy system? Will you stay or will you go?
Chak Pakala is a principal technologist at Nexient. He has more than 20 years’ experience architecting, implementing and leading development projects. His specialties include microservices, cloud and grid computing strategy.