The evolution of reporting and analytics has seen dramatic changes in
recent years. Beginning with the static “green bar” reports of the
mid-to-late '70s, information was extracted from mainframe systems
and, often manually, transferred to spreadsheets where it could be
aggregated and analyzed. Data warehousing was the buzz of the '80s,
and while it did enable heterogeneous data sources to be centralized,
projects often ran grossly over budget and fell far below
expectations. As technologies have matured and services-based
architectures have become more prominent, data warehousing has
reinvented itself and emerged as what is now recognized as business
intelligence.

However, the recent advent of in-memory analysis means that
expectations of business intelligence have changed forever. Dealing
with overly complex software designed for a handful of power users,
with long deployment cycles and low project success rates, is no
longer acceptable. Today, smart companies are striving to spread fact-based
decision making throughout the organization, but they know they can’t
do it with expensive, hard-to-use tools that require extensive IT hand
holding. The pace of business now demands fast access to information
and easy analysis; if the tools aren’t fast and easy, business
intelligence will continue to have modest impact, primarily with
experts who have no alternative but to wait for an answer to a slow
query.

The success or failure of in-memory analysis does, however, rest to
some degree on the technology chosen as the delivery platform. The
fundamental requirement is that this platform is web-centric; beyond
that, there are some essential technology components that help
deliver the business benefits sought. These are:

Enterprise scalability and security

All BI solutions must include enterprise administrative features, such
as usage monitoring, single sign-on and change management, and this is
just as true for in-memory solutions. It is therefore critical that
you choose a solution such as Yellowfin business intelligence, with
its integrated in-memory database, that provides enterprise-class
infrastructure and enables you to scale your deployment as your user
base grows.

Integration with your existing data warehouse and OLAP cubes

While some vendors tout in-memory as a way of avoiding building a data
warehouse, this option usually applies to smaller organizations that
may only have a single source system. For larger companies that have
multiple source systems, the data warehouse continues to be the ideal
place to transform, model and cleanse the data for analysis.

Look for tools that are designed to integrate with and leverage
existing BI environments. An in-memory solution that is tightly
integrated into the visualization tool is critical. However, it is
equally important that the visualization tool can also access your
OLAP cubes and data warehouse tables without the need for an in-memory
middle layer. Without this option, a purely stand-alone in-memory
solution can lead to yet another version of the truth, adding
complexity to your BI environment.

Yellowfin takes a flexible approach whereby the system administrator
can configure the server to perform processing against the in-memory
database or, alternatively, to push processing down to the underlying
data store. The decision on which approach is optimal for a given
deployment will depend largely on the query performance
characteristics of the data store. For example, a traditional OLTP
data store may benefit significantly from in-memory processing,
whereas a query optimized analytic data store may provide performance
similar to or better than in-memory processing. Combining this
flexible architecture with the cost advantages of not using an OLAP
server gives customers choice and a BI platform that can grow as their
data and analysis requirements do.
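
To make this trade-off concrete, here is a minimal sketch of how a
query layer might route requests between an in-memory database and the
underlying data store. It is purely illustrative: the names, the
threshold and the routing rule are assumptions, not Yellowfin's actual
configuration or API.

```python
# Hypothetical sketch of query routing between an in-memory engine
# and the underlying data store; not Yellowfin's actual API.
from dataclasses import dataclass

@dataclass
class DataSource:
    name: str
    avg_query_ms: float   # observed query latency for this store
    is_analytic: bool     # True for a query-optimized analytic store

IN_MEMORY_THRESHOLD_MS = 500  # assumed tuning knob, set per deployment

def route_query(source: DataSource, in_memory_loaded: bool) -> str:
    """Decide where to execute a report query.

    A traditional OLTP store with slow analytic queries benefits from
    the in-memory copy; a fast analytic store can be queried directly,
    avoiding yet another 'version of the truth'.
    """
    if (in_memory_loaded and not source.is_analytic
            and source.avg_query_ms > IN_MEMORY_THRESHOLD_MS):
        return "in_memory"      # serve from the in-memory database
    return "push_down"          # push processing to the data store

# Example: an OLTP source averaging 2s per query is routed in-memory.
oltp = DataSource("orders_oltp", avg_query_ms=2000, is_analytic=False)
print(route_query(oltp, in_memory_loaded=True))  # -> in_memory
```

In practice this decision would be a per-source administrator setting
informed by observed query performance, which is what the latency
threshold stands in for here.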

Ensure real-time data refresh

Because reporting data is potentially extracted from a source system
or a data warehouse and then loaded into memory, data latency can be a
concern. Front-line workers in a customer service center, for example,
need near-real-time, highly granular (detailed) data. If an in-memory
tool contains last week’s product inventory data, it’s probably not of
use to customer service reps. Thus, the suitability of an in-memory
tool and the success of the deployment may hinge on the degree to
which the solution can automate scheduled incremental data loads.

One of the criticisms of some in-memory analysis tools is their lack
of incremental load. This means that whenever a data refresh is
required, the entire data set needs to be reloaded rather than just
the changed or new transactions. This increases load times and means
that refreshes cannot be frequent enough to enable near-real-time
reporting. This is not the case with Yellowfin’s in-memory
technology.
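
To illustrate why incremental load matters, the sketch below uses a
high-watermark timestamp so that each refresh pulls only new or
changed rows. The table and column names are hypothetical, and this is
a generic pattern rather than Yellowfin's implementation.

```python
# Generic high-watermark incremental refresh; table and column names
# are hypothetical, and this is not Yellowfin-specific code.
import sqlite3

def incremental_refresh(source: sqlite3.Connection,
                        cache: sqlite3.Connection) -> int:
    """Copy only rows changed since the last load into the cache."""
    # Assumes a one-row watermark table recording how far we loaded.
    row = cache.execute("SELECT last_loaded FROM watermark").fetchone()
    watermark = row[0] if row else "1970-01-01 00:00:00"

    # Pull only new or updated transactions, not the full data set.
    rows = source.execute(
        "SELECT id, amount, updated_at FROM transactions"
        " WHERE updated_at > ?",
        (watermark,),
    ).fetchall()

    # Assumes cache table transactions(id PRIMARY KEY, amount, updated_at).
    cache.executemany(
        "INSERT OR REPLACE INTO transactions VALUES (?, ?, ?)", rows
    )
    if rows:
        cache.execute("UPDATE watermark SET last_loaded = ?",
                      (max(r[2] for r in rows),))
    cache.commit()
    return len(rows)  # a frequent schedule keeps this number small
```

Because each run moves only the delta, refreshes stay cheap enough to
schedule every few minutes, which is what makes near-real-time
reporting from an in-memory copy feasible.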

Minimize administration overhead

In-memory analytic tools often introduce some of the same concerns
that OLAP stores create: namely, they usually create another data
source, with its own calculations and business definitions. This is
where tools such as Yellowfin differ from other in-memory approaches:
existing queries, reports and dashboards automatically take advantage
of the in-memory database, transparently to users. Administrators are
not adding calculations and business logic within another layer; they
reside within the existing metadata layer for reporting that is
already built.
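
A minimal sketch of that idea, assuming a deliberately simplified
metadata layer: the business calculation is defined once and rendered
for whichever engine serves the query. The structure is hypothetical
and not Yellowfin's actual metadata format.

```python
# Hypothetical single-definition metadata layer; one calculation is
# rendered for whichever engine serves the query, so no business
# logic is duplicated in an extra in-memory layer.
METRICS = {
    # the business definition lives in exactly one place
    "gross_margin": "(SUM(revenue) - SUM(cost)) / SUM(revenue)",
}

TABLES = {
    "in_memory": "mem.sales",      # in-memory copy
    "push_down": "dw.sales_fact",  # source data warehouse table
}

def render_query(metric: str, engine: str) -> str:
    """Build the same report query against either backend."""
    return f"SELECT {METRICS[metric]} AS {metric} FROM {TABLES[engine]}"

# Both engines compute the identical business definition.
print(render_query("gross_margin", "in_memory"))
print(render_query("gross_margin", "push_down"))
```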

Web-based development and deployment

Some in-memory tools are not nearly as web-enabled as their
conventional BI counterparts. This seems to reflect both technology
immaturity and a tendency toward niche deployments. However, for
successful adoption with minimal administrative overhead, web-based
development and deployment is critical. Both the visualization tool
and the in-memory database need to be server-based deployments so that
data access security and application upgrades can be easily managed.
Solutions such as Yellowfin provide a single web-based platform for
delivering your business intelligence needs. From connection through
to design, modeling and visualization, your users work within a fully
integrated browser application that encourages collaboration and an
iterative approach to report development, leading to analytical
applications that meet the needs of your end users.

Data security must be of paramount concern

In-memory applications have the potential to expose significantly more
data to end users than ever before. This raises security issues
regarding how data is accessed, where it is stored and who has access
to it.

In determining the best strategy for your in-memory deployment,
security needs to be foremost in your selection criteria. There are
two aspects to consider: first, the location of your data (where is it
stored, and is that storage secure?); and secondly, who has access to
that data store. In terms of storage, the most secure location for
your data is on a centralized server, whether hosted or internal. Not
only is this more secure, but it also maintains basic controls
regarding data governance.

To understand this, consider a scenario where users can run complex
queries by downloading up to 100 million rows of data to their
desktops from many data sources or data feeds from the Web. Sure, the
information can then be sliced and diced into reports, or users can
create BI applications on their desktops and share them with
colleagues. It sounds great in theory but is fraught with danger in
practice. With this volume of data on a laptop, it is free to leave
your premises, to be lost or stolen in the worst case, or to be
published without any form of governance at best.

In addition to centralized storage, your in-memory analysis needs to
conform to your data security measures as well. This means that the
data access profiles of your users need to be adhered to throughout
your reporting process. Organizations spend an enormous amount of
effort securing their transactional applications, so it is critical
that the same level of security is applied to the data those
applications contain. This means that users have access only to the
data they are authorized to access, and that this access changes as an
employee's role changes.
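
As a simple illustration of enforcing access profiles throughout
reporting, the sketch below derives a row-level filter from the user's
profile and applies it to every query. The names and profile structure
are hypothetical; this is a generic pattern, not a description of any
particular product.

```python
# Hypothetical row-level security filter applied at query time;
# names and profile structure are illustrative only.
from dataclasses import dataclass, field

@dataclass
class AccessProfile:
    user: str
    regions: list[str] = field(default_factory=list)  # rows user may see

def secure_query(base_sql: str, profile: AccessProfile):
    """Append the user's access profile as a WHERE filter.

    Because the filter comes from the central profile, a change to the
    employee's role immediately changes what every report returns.
    """
    if not profile.regions:
        raise PermissionError(f"{profile.user} has no data access")
    placeholders = ", ".join("?" for _ in profile.regions)
    return (f"{base_sql} WHERE region IN ({placeholders})",
            list(profile.regions))

# Example: the same report yields different rows per user profile.
sql, params = secure_query("SELECT region, amount FROM sales_fact",
                           AccessProfile("jane", ["EMEA", "APAC"]))
print(sql, params)
```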

In summary, when choosing an in-memory analysis tool set, you need to
consider how it will reside within your enterprise architecture.
Integration with your current business intelligence environment, the
security framework and the ability to deliver real-time reporting are
all critical aspects to weigh in your selection process.


About Yellowfin Business Intelligence

Yellowfin is passionate about making Business Intelligence easy.
Recently recognised among 25 rising companies that CIOs must know
about, Yellowfin is a leading web-based BI solution that can be easily
integrated into any third-party application or delivered as a stand-
alone enterprise platform. Yellowfin is an innovative and flexible
solution for reporting and analytics, providing a full range of data
access, presentation and information delivery capabilities.

www.yellowfinbi.com

