BI Cubed: Data Lineage on OLAP Anyone?

Octopai

This is how the Online Analytical Processing (OLAP) cube was born, which you might call one of the grooviest BI inventions developed in the 70s. It’s a snapshot of data at a specific point in time, at the end of a day, week, month, or year. Save time and headaches with an online analytical processing tool.
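
For a concrete picture of what a cube is, here is a minimal sketch using pandas. The sales table and its product/region/month/revenue columns are invented for illustration; a real cube would be built and served by an OLAP engine rather than in-memory pandas.

    # Minimal sketch of the OLAP-cube idea using pandas.
    # The sales data and column names (product, region, month, revenue)
    # are invented for illustration.
    import pandas as pd

    sales = pd.DataFrame({
        "product": ["Widget", "Widget", "Gadget", "Gadget"],
        "region":  ["East",   "West",   "East",   "West"],
        "month":   ["2024-01", "2024-01", "2024-02", "2024-02"],
        "revenue": [120.0, 95.0, 210.0, 180.0],
    })

    # "Cube" the data: pre-aggregate revenue along the product x region x month
    # dimensions so any slice (one product, all regions, one month) is a lookup
    # rather than a fresh scan of the raw records.
    cube = sales.pivot_table(
        index="product",
        columns=["region", "month"],
        values="revenue",
        aggfunc="sum",
        margins=True,  # adds roll-up totals, the "all" member of each dimension
    )
    print(cube)
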

Financial Intelligence vs. Business Intelligence: What’s the Difference?

Jet Global

This practice, together with powerful OLAP (online analytical processing) tools, grew into a body of practice that we call “business intelligence.” Such BI methodologies are built on a snapshot of what happened in the past.


What is business intelligence? Transforming data into business insights

CIO Business Intelligence

BI tools access and analyze data sets and present analytical findings in reports, summaries, dashboards, graphs, charts, and maps to provide users with detailed intelligence about the state of the business. Whereas BI studies historical data to guide business decision-making, business analytics is about looking forward.

Unleashing the power of Presto: The Uber case study

IBM Big Data Hub

But what most people don’t realize is that behind the scenes, Uber is not just a transportation service; it’s a data and analytics powerhouse. This blog takes you on a journey into the world of Uber’s analytics and the critical role that Presto, the open source SQL query engine, plays in driving their success.

Build an Amazon Redshift data warehouse using an Amazon DynamoDB single-table design

AWS Big Data

Deriving business insights by identifying year-on-year sales growth is an example of an online analytical processing (OLAP) query. In our analytic use case, if we are analyzing quarterly growth rates, we may only need a couple of years’ worth of data; the rest can be unloaded into the data lake.
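
As a rough sketch of what such an OLAP query computes, the snippet below derives year-on-year sales growth with pandas. The orders table and its order_date/amount columns are assumptions for illustration; in the warehouse itself the same aggregation would be expressed as SQL over the Redshift tables.

    # Rough sketch of a year-on-year sales growth calculation, the kind of
    # aggregation described above as an OLAP query. The orders table and its
    # order_date/amount columns are invented for illustration.
    import pandas as pd

    orders = pd.DataFrame({
        "order_date": pd.to_datetime(
            ["2022-03-14", "2022-11-02", "2023-03-20", "2023-12-05"]
        ),
        "amount": [1200.0, 800.0, 1500.0, 1100.0],
    })

    # Total sales per calendar year.
    yearly = (
        orders.groupby(orders["order_date"].dt.year)["amount"]
        .sum()
        .rename("sales")
    )

    # Year-on-year growth: each year's sales relative to the prior year's.
    growth = yearly.pct_change().rename("yoy_growth")

    print(pd.concat([yearly, growth], axis=1))
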