Conference Outline

The conference starts at 9:30 am and ends at 5:15 pm on both conference days. Registration commences at 8:30 am.

Wednesday, March 25, 2015

9:30 am Opening by the conference chairman Rick van der Lans
The Business Intelligence Paradox

Session 1

It’s Time for the Logical Data Warehouse
Rick van der Lans

Session 2

Creating a Business Analytics Center of Excellence
Wayne Eckerson
Case: Making the Logical Data Warehouse a Reality
Mark Pritchard

Session 3

Collaborative BI – What is it, Why should I Care, What does it take Technologically, and How do I Get Started?
Claudia Imhoff
Case: From reporting to transactional data discovery
Tjerk Bancken

Session 4

The Data Lake and How to Avoid Drowning in It
Barry Devlin


Thursday, March 26, 2015

9:30 am Opening by the conference chairman Rick van der Lans

Session 5

Overview of SQL-on-Hadoop Engines
Rick van der Lans

Session 6

Decision Making Support in the Intersection of Big Data and BI
Barry Devlin
Case: Using Big Data to Improve Profitability
Frederik Naessens

Session 7

Creating an Analytically-Driven Enterprise – How to Implement an Analytics Program
Claudia Imhoff

Session 8

Secrets of Analytical Leaders: Insights from Information Insiders
Wayne Eckerson


Daily schedule:

09:30 – 09:45 Opening by Conference Chairman
09:45 – 11:00 Session 1
11:00 – 11:15 Coffee break
11:15 – 12:30 Session 2
12:30 – 13:00 Case study
13:00 – 14:00 Lunch
14:00 – 15:15 Session 3
15:15 – 15:30 Coffee break
15:30 – 16:00 Case study
16:00 – 17:15 Session 4

Rick van der Lans

1. The Business Intelligence Paradox
Rick van der Lans, Managing Director, R20/Consultancy

Technology for storing, processing, and analyzing data keeps on improving. What was unthinkable ten years ago is easy today. Still, organizations are struggling with mundane BI issues such as data quality, information management, and “selling” BI. And they are still using the same technology in their BI environments that they acquired years ago. Where does this paradoxical situation come from? Why not adopt the technology that is available? This short, introductory session addresses this BI paradox and discusses the road to progress in BI.


It’s Time for the Logical Data Warehouse
Rick van der Lans, Managing Director, R20/Consultancy

The classic data warehouse architecture has had a long and successful run, but we’re starting to stretch its abilities to the limit. The logical data warehouse will take its place: an architecture with fewer physical data stores and less redundant storage of data, more suitable for operational BI, and much more flexible. Mature technology in the form of data virtualization servers exists for developing a logical data warehouse, and products from Cisco, Denodo, Informatica, and Red Hat have proven that large BI systems can be developed using data virtualization; a brief SQL sketch of the idea follows the topic list below.

In addition, now that more and more data is produced in a distributed fashion, it may no longer be smart to move the data to a centralized store for integration purposes. It’s time to move the integration process to the data; big data, especially, can be too big to move.

  • Working with virtual data marts increases flexibility
  • Use data virtualization to turn a data vault into a flexible environment
  • The logical data warehouse opens the way to operational BI
  • Streaming data in a logical data warehouse.
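
To make the idea concrete, here is a brief sketch of how a virtual data mart might look inside a data virtualization server. It is illustrative only: the schema names (crm, dwh) and tables are hypothetical, and the syntax is generic ANSI-style SQL rather than the dialect of any particular product named above.

    -- Hypothetical sketch: a virtual view joining two separately
    -- registered data sources. No data is copied; the data
    -- virtualization server resolves the join at query time.
    CREATE VIEW virtual_customer_sales AS
    SELECT c.customer_id,
           c.customer_name,
           s.order_date,
           s.order_amount
    FROM   crm.customers c        -- operational CRM source
    JOIN   dwh.sales s            -- existing warehouse source
      ON   s.customer_id = c.customer_id;

    -- Reporting tools then query the view as if it were one table:
    SELECT customer_name, SUM(order_amount) AS total_sales
    FROM   virtual_customer_sales
    GROUP  BY customer_name;

Because the view is only a definition, replacing or relocating an underlying source changes the view’s mapping, not the reports that depend on it; that is where the flexibility claimed above comes from.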

Wayne Eckerson

2. Creating a Business Analytics Center of Excellence
Wayne Eckerson, Principal Consultant, Eckerson Group

You can build the most elegant data architecture with the best tools, but if you don’t have the right people in the right positions, with the right leadership and business sponsorship, your business analytics (BA) program won’t get very far. This session shows how to build a business analytics program that will stand the test of time. It describes how to establish (or revitalize) the program, organize the team to maximize value, and manage interactions among team members and their business counterparts. It explains how to build a federated BA organization with matrix reporting relationships and how to establish a BA Council that guides and approves the work of the team. It also describes the vital role of the BA director and provides an organizational and career-pathways chart to guide the development of your team. You will learn how to:

  • Evolve a business analytics team
  • Set up a federated BA organization that maximizes business value
  • Manage handoffs between various constituencies in a BA organization
  • Establish a BA Council
  • Define the roles and responsibilities of the BA director
  • Establish career pathways for your team members.

Claudia Imhoff

3. Collaborative BI – What is it, Why should I Care, What does it take Technologically, and How do I Get Started?
Dr. Claudia Imhoff, President, Intelligent Solutions Inc.

Collaboration is a mechanism used by information workers to discover, access, and share corporate and external information and analyses for business decision making. The growing use of social computing and mobile technologies adds even more power to collaborative applications for information gathering and sharing. It is therefore crucial that BI applications and systems support and exploit the collaborative environment. The key question is, “How do organizations combine BI and collaborative technologies to provide the best benefits to their information workers?”
This presentation is the result of a research project on the role of collaboration in business intelligence. We found three key opportunities for business users to leverage the combined benefits of collaboration and business intelligence:

  • Collaborative interactions
  • Information enhancement
  • Collaborative decision-making

The presentation discusses these three aspects in detail along with how companies are using collaborative BI today. Also discussed are the technological requirements to support collaborative BI. Lastly, there is a section on how to get started in deploying a collaborative BI environment.

Barry Devlin

4. The Data Lake and How to Avoid Drowning in It
Dr. Barry Devlin, Founder and Principal of 9sight Consulting

“Leave the over-structured, complex Data Warehouse behind. Dive into the pure, sparkling waters of the Data Lake!” Over the past year or two, the Data Lake (or Reservoir) has become popular as a vision, or even an architecture, for big data information storage and processing. The simple view is to pour all raw data into Hadoop and enable all types of processing and use from there. As the Hadoop ecosystem evolves, this description will undoubtedly become more nuanced. However…
I suggest you enjoy the Instagram, but beware the hidden depths. The Data Lake is a misleading metaphor; it may become a watery grave for context, governance, and value. Today’s intricate information ecosystem demands a careful blend of architectures and technologies. The fluidity of the Data Lake allegedly improves on the inflexible, complicated Data Warehouse and its supporting processes, but that claim rests on an erroneous view of the rationale for the warehouse: it mistakes the warehouse’s common implementation as a reporting environment for its core principle, which is to create a consistent and reconciled core for the whole business. As currently defined, Data Lake thinking throws out the baby with the lake water. In this session, Dr. Barry Devlin will:
  • Explore the meaning, pros and cons of the Data Lake concept
  • Describe how the emerging biz-tech ecosystem requires both Data Lake and Warehouse
  • Provide the required architectural concepts such as a tri-domain information model and a logical architecture based on information pillars to deliver business value
  • Show how this approach supports reconciled and consistent data for governance, as well as the agility to act quickly and decisively in a rapidly changing world.

Rick van der Lans

5. Overview of SQL-on-Hadoop Engines
Rick van der Lans, Managing Director, R20/Consultancy

In the world of Big Data, Hadoop, and NoSQL, the spotlight is currently on SQL-on-Hadoop engines. In the beginning, only low-level technical interfaces, such as HDFS, MapReduce, and HBase, were available for accessing big data on Hadoop platforms. The drawbacks were clear: low productivity and high maintenance costs. SQL-on-Hadoop engines make it possible to access big data stored in Hadoop using the familiar SQL language, which opens big data up to a larger audience and a wider set of use cases: users can plug in almost any reporting or analytical tool to analyze and study the data. Today, many different engines are available, making it hard for organizations to choose. This session explains the technological challenges facing SQL-on-Hadoop engines and presents an overview of the current products, including Apache Hive and Drill, CitusDB, Cloudera Impala, Concurrent Lingual, Hadapt, IBM BigSQL, InfiniDB, JethroData, MemSQL, Pivotal HawQ, ScleraDB, Spark SQL, and Splice Machine. A brief HiveQL sketch follows the topic list below.

  • Architectural challenges of SQL-on-Hadoop engines
  • Handling non-relational data with SQL-on-Hadoop engines
  • Opening up big data to reporting and analytics using SQL-on-Hadoop
  • Integrating Hadoop data and data warehouse data
  • Overview of the market.
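
As an illustration of what “the familiar SQL language” over Hadoop looks like in practice, the sketch below uses HiveQL, the SQL dialect of Apache Hive (one of the engines listed above). The file path, column names, and file format are hypothetical, chosen purely for the example.

    -- Hypothetical HiveQL sketch: lay a table definition over raw
    -- tab-separated files that already sit in an HDFS directory.
    CREATE EXTERNAL TABLE clickstream (
      user_id    STRING,
      url        STRING,
      event_time TIMESTAMP
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/raw/clickstream';

    -- Any SQL-capable reporting or analytical tool can now ask
    -- ordinary SQL questions of the big data:
    SELECT url, COUNT(*) AS visits
    FROM   clickstream
    GROUP  BY url
    ORDER  BY visits DESC
    LIMIT  10;

Note that the table is EXTERNAL: the engine applies a schema on read and leaves the raw files in place, which is typical of the SQL-on-Hadoop approach.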

Barry Devlin

6. Decision Making Support in the Intersection of Big Data and BI
Dr. Barry Devlin, Founder and Principal of 9sight Consulting

The ongoing explosion of big data from social media and from sensors on the Internet of Things is providing opportunities for truly original business applications. Predictive analytics also gives insight into human behavioral patterns, making it possible to anticipate people’s needs and pre-empt their actions. With these exciting opportunities comes a growing belief that we can move into an era of fully data-driven decision making. While there is much room for improvement in how businesses use data, one of the key lessons we (should) have learned from traditional BI is that gathering data is only the first step of the journey.
What is needed now is a new set of concepts and steps to move from data collected to decisions made. This begins with defining the context of information and its relationships to data, knowledge, and meaning. This leads to an understanding that the deeper, non-rational aspects of human decision making must be incorporated into modern decision-making support systems. And recognizing that business is a fully social construct, we see the innovative value and opportunities offered by collaborative systems. In this session, Dr. Barry Devlin will:
  • Describe the modern meaning model as a new approach to linking data to human understanding and use
  • Explain why we must move from metadata to context-setting information
  • Show new approaches that enable collaborative working and define the adaptive decision cycle that finally bridges the gap between BI, analytics and today’s spreadsheet culture.

Claudia Imhoff

7. Creating an Analytically-Driven Enterprise – How to Implement an Analytics Program
Dr. Claudia Imhoff, President, Intelligent Solutions Inc.

Analytics have become the darling of vendors, consultants, and the press. In addition, data scientists are now highly sought after by many enterprises. Yet, the adoption rate for analytics and BI is still hovering between 20 and 30% in most enterprises. Why? What is the problem? How can we improve the adoption of these critical decision support functions? How do we implement an analytics program? These are some of the questions to be answered in this timely presentation by Dr. Claudia Imhoff. Getting an analytics program up and running requires the following considerations:
  • The need for analytics and enterprise strategy for acceptance
  • Education – not just training
  • The new way we work
  • The data scientist, data priest, data interpreter, data engineer – who wins?
  • Ultimate goal – comprehension by the executives on how analytics can impact them and their company.

Wayne Eckerson

8. Secrets of Analytical Leaders: Insights from Information Insiders
Wayne Eckerson, Principal Consultant, Eckerson Group

How do you bridge the worlds of business and technology? How do you harness big data for business gain? How do you deliver value from analytical initiatives? This session will unveil the success secrets of top information leaders from companies such as Zynga, Netflix, US Xpress, Nokia, Capital One, Kelley Blue Book, and Blue KC, among others. The session will cover both the “soft stuff” of people, processes, and projects and the “hard stuff” of architecture, tools, and data required to create and sustain a successful analytics program. You will learn how to:
  • Deliver value quickly
  • Span business and technology
  • Manage change
  • Translate insights into business impact
  • Create an agile data warehouse.

Cases:

Mark Pritchard

1. Making the Logical Data Warehouse a Reality
Mark Pritchard, Technical Director for Northern Europe, Denodo Technologies

Data Virtualization has earned its place in many enterprise architectures, driven by the Data Economy’s demand for truly disruptive and agile projects. The Logical Data Warehouse is no longer just a theoretical pattern; enabled by Data Virtualization, it has become a reality.
  • How does Data Virtualization do what it does?
  • How does Data Virtualization enable the Logical Data Warehouse?
  • Who is already using Data Virtualization in this context?

Tjerk Bancken

2. From reporting to transactional data discovery
Tjerk Bancken, IT Architect, Blokker

Explore how leading retailer Blokker is implementing a new ICT strategy across 15 different brands and over 2,800 stores across Europe. The retail giant has transformed its IT infrastructure with Teradata to create an environment that is accessible to people at all levels of the organisation. It has also utilised the power of the MicroStrategy analytics platform, giving users consistent, in-depth analysis of business results.

Frederik Naessens

3. Using Big Data to Improve Profitability
Frederik Naessens, Senior Solution Architect, WhereScape

Hear how WhereScape’s manufacturing customers have consolidated disparate data warehouses into single environments, and how big data is helping them to stand out from the competition and improve profitability.
The Challenge: How to combine test and sensor data with operational data to deliver faster and more accurate analysis to enable effective business decisions and improve commercial propositions.
The Solution: An integrated centralized environment with a single global modelling standard; automated and managed by WhereScape.
The Benefit: Users get value from their data, shortening analysis times and delivering BI solutions faster than ever before – cost effectively.