Business Intelligence

Business Intelligence (BI) is the practice of transforming raw, operational data into useful information for business analysis, supporting decisions that contribute to a company's growth.

Why is Business Intelligence needed?

"A data warehouse is a subject-oriented, integrated, time-variant and nonvolatile collection of data in support of management's decision-making process."

— Bill Inmon, the father of data warehousing

What is a Data Warehouse?

  • A central location where consolidated data from multiple sources (databases) is stored.
  • A DWH is maintained separately from an organization's operational database.
  • End users access it whenever information is needed.

Why build a data warehouse?

  1. Data collected from various sources & stored in various databases cannot be directly visualized.
  2. The data first needs to be integrated and then processed before visualization takes place.

Advantages of Data Warehouse

  • Strategic questions can be answered by studying trends.
  • Reporting from a data warehouse is faster and more accurate.

Note: a data warehouse is not a product that a company can simply go and purchase; it needs to be designed, and the design depends entirely on the company's requirements.


Key terminologies of DWH architecture
OLTP vs OLAP
ETL
Data Mart
Metadata

E – Extract data from the company's operational systems.
T – Transform the data (cleansing and integration).
L – Load the data into the Data Warehouse.
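As a rough, tool-agnostic sketch of these three steps (all record and field names here are invented for illustration):

```python
# Minimal ETL sketch: extract raw operational rows, transform (cleanse and
# integrate), then load into the warehouse store.
def extract(operational_rows):
    """E: pull raw records from an operational system."""
    return list(operational_rows)

def transform(rows):
    """T: cleanse (drop incomplete records) and integrate (normalize units)."""
    cleaned = [r for r in rows if r.get("amount") is not None]
    for r in cleaned:
        r["currency"] = r.get("currency", "USD").upper()
    return cleaned

def load(rows, warehouse):
    """L: append the consolidated records to the warehouse store."""
    warehouse.extend(rows)
    return warehouse

warehouse = []
raw = [{"amount": 100, "currency": "usd"}, {"amount": None}]
load(transform(extract(raw)), warehouse)
print(warehouse)  # [{'amount': 100, 'currency': 'USD'}]
```

Real ETL tools (Informatica, Talend, etc.) implement the same pipeline with far richer connectivity, scheduling and error handling.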

Data Warehousing and ETL
Informatica PowerCenter 10.x
Talend

Business Intelligence
MicroStrategy 10
IBM Cognos BI
Qlikview
Power BI – business analytics service provided by Microsoft
Talend – A Cloud Data Integration Leader (modern ETL)
MongoDB
Pentaho BI
Advanced Excel
VBA in Excel

Tableau
Exam: Tableau Qualified Associate Level
Core task: Creating and distributing interactive and shareable dashboards.
Concepts: Data Blending, the creation of Charts, Graphs and LOD expressions using different versions of Tableau including Tableau Desktop, Tableau Public and Tableau Reader.

https://www.edureka.co/tableau-training-for-data-visualization

https://www.edureka.co/blog/interview-questions/top-tableau-interview-questions-and-answers/

https://www.edureka.co/blog/power-bi-vs-tableau/

 

 

Performance Tuning

ABAP programming standards for performance tuning

    1. When reading data from a database table, never use SELECT *; always use SELECT with an explicit field list.
    2. Always specify key fields in the WHERE conditions of SELECT statements, because these are covered by the primary index.
    3. Sometimes we may need to use non-key fields in the WHERE conditions of SELECT statements; in that case, create secondary indexes and use them.
    4. Never use SELECT ... INTO CORRESPONDING FIELDS OF in SELECT statements.
    5. Always use READ TABLE with BINARY SEARCH, and make sure you have sorted the internal table in ascending order beforehand.
    6. Never use SELECT joins over more than 3 tables.
    7. Use FOR ALL ENTRIES instead when more than 3 tables are involved.
    8. Always check whether the driving internal table is initial before using FOR ALL ENTRIES.
    9. Never use nested loops; instead use the parallel cursor technique.
    10. Never use SELECT statements inside loops; instead use FOR ALL ENTRIES.
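The parallel-cursor technique in point 9 is language-independent: instead of scanning the full item table for every header row (O(n·m)), keep one moving index into the sorted item table (O(n+m)), just as the ABAP pattern does with a saved SY-TABIX. A minimal sketch in Python terms (example data invented; both tables assumed sorted by key):

```python
# Header and item tables, both sorted by the key field "vbeln".
headers = [{"vbeln": "001"}, {"vbeln": "002"}, {"vbeln": "003"}]
items = [
    {"vbeln": "001", "posnr": 10},
    {"vbeln": "001", "posnr": 20},
    {"vbeln": "003", "posnr": 10},
]

pairs = []
idx = 0  # the "parallel cursor" into the item table
for hdr in headers:
    # Skip items whose key is below the current header key.
    while idx < len(items) and items[idx]["vbeln"] < hdr["vbeln"]:
        idx += 1
    # Collect all items matching the current header, advancing the cursor.
    j = idx
    while j < len(items) and items[j]["vbeln"] == hdr["vbeln"]:
        pairs.append((hdr["vbeln"], items[j]["posnr"]))
        j += 1
    idx = j

print(pairs)  # [('001', 10), ('001', 20), ('003', 10)]
```

The item table is traversed exactly once, no matter how many headers there are.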

 

http://wiki.scn.sap.com/wiki/display/profile/Optimizing+Performance+In+ABAP

SAP BW – Extraction by Function Module

Generic Extractor – Extraction by Function Module

Under certain circumstances, the data to be extracted from an SAP ERP system is not available as a transparent table and cannot be compiled in a view.

The function module is called repeatedly (DataSource -> API program -> FM):
The first call initializes the function module and possibly checks the input parameters.
All further calls serve to read and transfer data.

Steps to create function module extractor:

    1. Extract Structure
    2. Function Module
    3. DataSource

The extract structure contains the fields extracted from the source system.
The transfer structure contains the fields transferred to BW.

Step 1] Create the extract structure in the source system (let's say ECC): go to SE11 -> Data Type -> Structure

Step 2] Create the function module; the interface of the function module is strictly defined and needs to look as follows.


The E_T_DATA table is filled with data and transferred to BW.

Step 3] Create the DataSource in RSO2

RSA3 Extractor Checker

By default the FM is called 10 times in RSA3, and if no cursor is implemented in the FM, the selected data is returned with every call (selected records × 10).

The SELECT statement cannot be controlled during extraction.


The package size is controlled by the cursor state:

OPEN CURSOR (with the WHERE clause)
FETCH NEXT CURSOR
CLOSE CURSOR
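The package-wise behaviour of such a cursor-based extractor can be sketched in Python (function and parameter names invented; this only mirrors the OPEN CURSOR / FETCH ... PACKAGE SIZE / CLOSE CURSOR control flow, not the BW API):

```python
# Sketch of package-wise extraction, as a generic extractor FM does.
def extractor(source_rows, package_size):
    """First call 'opens the cursor'; each later call fetches one package.
    When no rows remain, the generator ends (like the NO_MORE_DATA signal
    of a BW extractor)."""
    cursor = iter(source_rows)          # OPEN CURSOR (WHERE clause applied here)
    while True:
        package = []
        for _ in range(package_size):   # FETCH NEXT CURSOR ... PACKAGE SIZE
            try:
                package.append(next(cursor))
            except StopIteration:
                break
        if not package:                 # CLOSE CURSOR: nothing left to send
            return
        yield package

packages = list(extractor(range(10), package_size=4))
print(packages)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

Each yielded package corresponds to one data package handed over to BW; the cursor state between calls is what limits the package size.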

BW – Virtual Providers

Virtual Providers do not hold the data physically.

A MultiProvider is an InfoProvider that allows you to combine data from multiple InfoProviders and makes it available for reporting purposes.

MultiProvider 

It combines multiple InfoProviders using a union operation.

A MultiProvider consists of the following different combinations of InfoProvider types −

  • Master Data (InfoObject)
  • InfoCube
  • DataStore Object
  • InfoSets
  • Semantically Partitioned Object (SPO)
  • Virtual Provider

 

InfoSet

JOIN operation
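The difference between the two operations can be sketched with plain Python lists (provider names and key figures invented): a MultiProvider-style union stacks rows from its PartProviders, while an InfoSet-style join matches rows on a common key.

```python
# Two "providers" sharing the key field "cust".
sales = [{"cust": "C1", "revenue": 100}, {"cust": "C2", "revenue": 50}]
costs = [{"cust": "C1", "cost": 30}]

# Union (MultiProvider): all rows from both providers; key figures that a
# provider does not deliver stay empty.
union = [{**r, "cost": None} for r in sales] + [{**r, "revenue": None} for r in costs]

# Inner join (InfoSet): only customers present in both providers.
join = [{**s, "cost": c["cost"]}
        for s in sales for c in costs if s["cust"] == c["cust"]]

print(len(union))  # 3
print(join)        # [{'cust': 'C1', 'revenue': 100, 'cost': 30}]
```

This is why a union can never lose rows, while a join can both filter rows and combine key figures from different providers into one record.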

 

BW on HANA – Composite Provider

Introduction
In a SAP BW environment it is very common that you combine data coming from various sources. This combination of data can be done during the staging process by enriching data and by persistence of the result of the enrichment on database level (typically a DataStore object or InfoCube) or during runtime of the query by performing a union or join operation.

Typically you decide to persist the result of the operation when the data you want to join is not volatile (otherwise you would always have to realign the data) or when the runtime of the join is too high.

In case you want to execute the join during query execution, the metadata objects offered in releases before SAP BW 7.30 are Multiprovider for union operations and InfoSet for join operations. Beginning with SAP BW 7.30 on HANA the Composite Provider was introduced.

A Composite Provider is an InfoProvider in which you can combine data from BW InfoProviders such as InfoObjects, DataStore Objects, SPOs and InfoCubes, or SAP HANA views such as Analytical or Calculation Views using join or union operations to make the data available for reporting. The used union and join operations in the CompositeProvider are pushed down to HANA leveraging the power of the calculation engine inside HANA. BEx queries can be created on CompositeProviders as on any other BW Info Provider. SQL access is possible by creating an HANA view to the Composite Provider.

The CompositeProvider consolidates the number of InfoProviders types and harmonizes the modeling of mixed BW on HANA scenarios.
The role of the Composite Provider is to provide a metadata object that forms the data mart layer within BW. It provides the data for reporting and analysis in form of an outbound structure which is semantically rich. It abstracts the underlying data providers and provides an outbound interface which can be consumed by any kind of query by offering the option to generate a HANA view.

Recommendation: as of BW 7.4 SP5, create central Composite Providers only with the BW modeling tools and no longer with transaction RSLIMOBW. Only existing old central Composite Providers (SAP BW 7.3x based) should be maintained there.

In order to enable the consumption of SAP HANA views (Analytic or Calculation Views) in the Composite Provider, you have to attach the corresponding SAP HANA system to the BW project.

OpenODSViews which you want to use in CompositeProviders can only contain fields with InfoObject-compatible data types. These are the following data types: CHAR, CUKY, CURR, DEC, DATS, FLTP, INT4, NUMC, TIMS, QUAN, UNIT. Data type STRG is also supported for particularly long characteristic-type field values.

After the assignment of an InfoArea (the InfoArea has to be defined first in the BW backend system, as you are not able to create InfoAreas in the Eclipse based GUI) you have the option to add the CompositeProvider to your Favorites.

On the Overview tab in the area General, you can specify whether a SAP HANA view should be generated or not. Please mark that flag in case you intend to access the resulting structure via the SQL interface of SAP HANA. Setting the option leads to a generation of a Calculation View in the HANA database which can be accessed by SQL or reused in HANA native modeling.

Under the area Runtime Properties, you can specify settings for processing of BW queries defined on top of the CompositeProvider.

On the Scenario tab page, you can select the participating InfoProviders or SAP HANA models. You specify which fields from these PartProviders should be used as output columns of the CompositeProvider.

In the Output tab you can specify for each individual field in the output structure associations to other objects such as InfoObjects, OpenODSViews or you can assign constants.

As long as only union operations are modeled there is no run time difference between the Composite Provider and the Multi Provider as the analytic manager inside BW is well suited for this kind of operation and highly optimized for SAP HANA. Therefore there is no need to migrate existing Multi Providers or existing Composite Providers into the Composite Provider.

Possible InfoProviders are InfoObjects, SPOs, DSOs and InfoCubes. MultiProviders are not possible to include.

By defining the association to an existing InfoObject of type characteristic, for example, you inherit the technical properties as well as the attributes, navigational attributes and defined hierarchies to the field of your CompositeProvider.


Composite Providers can be created in the backend via transaction RSLIMOBW and also in the BW Modeling Tools (BWMT) in HANA Studio.
The main difference: in a Composite Provider built in BWMT, HANA views can be used directly in the model, whereas in RSLIMOBW those HANA views cannot be used directly; instead, HANA models can be wrapped in a Virtual Provider (VP), which can then be used in the Composite Provider.

A Composite Provider is an Info Provider which combines data from several analytic indexes or from other Info Providers (by join or union) and makes this data available for reporting and analysis. UNION and JOIN operations are executed in HANA, not on the application server. BEx queries can be created on Composite Providers as on any other BW Info Provider. This is done in transaction RSLIMOBW.

The main advantage of Composite Providers is that BW Info Providers can be combined using the JOIN operation, allowing us to create new scenarios that are not possible, or very expensive, with the standard techniques (MultiProvider, InfoSet).

The Joins and Unions are executed at the query execution time. UNION and JOIN operations are executed in HANA DB.

The SID generation option must be enabled in a DSO for it to be available as a data provider for a Composite Provider.

There are three different types of CompositeProviders
(Central) CompositeProvider
Ad hoc CompositeProvider
Local CompositeProvider

Prerequisites
You are using a SAP HANA database or a SAP BW Accelerator. The BW Modeling tools are only available if you are using SAP HANA.

Constraints
A CompositeProvider can only be transported if it consists solely of InfoProviders.

You cannot use non-cumulative key figures in a local CompositeProvider.

If you are using a BWA, the open request is not read in the query for real-time InfoCubes for local CompositeProviders, even if you have made the corresponding data integrity setting in Query Designer. If you are using an SAP HANA database, the open request is read.

https://archive.sap.com/documents/docs/DOC-52329

 

BW – InfoObjects

InfoObjects are the smallest units of BW. Using InfoObjects, information is mapped in a structured form. This is required for constructing InfoProviders.

InfoObjects with attributes or texts can themselves also be InfoProviders (if in a query).

An InfoArea is a folder used to organize InfoProviders. Maximum length 30 characters; the name must start with a letter.

An InfoObject Catalog is a folder used to organize characteristic or key figure InfoObjects.

When the metadata of an InfoObject is activated, BW automatically creates the data element and domain in the ABAP Dictionary.

Types of InfoObjects
Characteristics (Material, Employee) MD IO
Key Figures (Amount, Quantity)
Unit (Unit of weight, Currency)
Time (Year, Month, Week)
Technical (Request ID, Change ID)

Customer can create Characteristics, Key Figure and Unit InfoObjects.
Time and Technical InfoObjects are provided by BW; we do not need to create them.

All master data is represented with characteristics.
Unit gives proper meaning to the key figure.
Technical characteristics are generated internally.

An attribute is additional information given to a characteristic.

Types of attributes:
Display attributes
Navigational attributes
Transitive attribute
Compounding attribute

Reference
Changes are not possible; only the description can be changed.

Template
Changes are possible.

Characteristic

Data Types

CHAR – Numbers and letters, length 1-60; values starting with ! or # are never allowed

NUMC – Numbers only, length 1-60

DATS – Date, length 8

TIMS – Time, length 8

Permission of lowercase letters

In the case of characteristics of type CHAR, the question of how to deal with lowercase letters comes up.

The FM RSKC_CHAVL_CHECK can be used to verify the validity of a characteristic value.

TCode RSKC maintains the permitted extra characters in BW.

Conversion routine

The primary conversion routines are ALPHA, NUMCV and GJAHR.

Routine MATN1 can be customized via TCode OMSL.

SID Tables

SID is the master data identification number. (Data Element: RSSID)

The SID is always of type integer (4 bytes in length); despite the more complex data model, using SIDs has a positive impact on performance.
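The idea behind a SID table is a surrogate-key mapping: each alphanumeric master-data value gets a small integer ID. A minimal sketch (a plain dict stands in for the SID table; value formats are invented):

```python
# SID sketch: map alphanumeric master-data keys to integer surrogate IDs,
# as a BW SID table does (structure greatly simplified).
sid_table = {}

def get_sid(char_value):
    """Return the existing SID for a characteristic value, or assign a new one."""
    if char_value not in sid_table:
        sid_table[char_value] = len(sid_table) + 1  # next free integer SID
    return sid_table[char_value]

print(get_sid("MAT-000123"))  # 1
print(get_sid("MAT-000456"))  # 2
print(get_sid("MAT-000123"))  # 1  (same value -> same SID)
```

Joins and comparisons on the 4-byte integer are cheaper than on the original alphanumeric key, which is the performance benefit mentioned above.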

Time Characteristics 

We can only use the time characteristics that BW supplies with BI Content. Renaming/changing the description is allowed.

Hierarchies

A hierarchy indicates a parent-child relationship which consists of several nodes and leaves.

A hierarchy can be created manually or loaded from the SAP system or other non-SAP source systems. Hierarchies can be used to drill down or extract specific information about a business item.

Hierarchy Type
1] Standard
2] Local implementation

Ways of creating hierarchy
1] Manual
2] via program
3] Flat file load

Manual creation

Step 1: Right-click on the characteristic InfoObject -> Create Hierarchy

Step 2: Add the InfoObject (customer node), characteristic nodes (external), and text nodes


Characteristic nodes are used to add a foreign (external) characteristic, e.g. region, to the hierarchy.

The InfoObject first has to be added under the external characteristics of the hierarchy.

Flat File load

Step 1: Create the file in below header format
NODEID
IOBJNM
NODENAME
LINK
PARENTID
LANGU
TXTSH
TXTMD
TXTLG

Step 2: Create DataSource, InfoPackage, Transformation(Segments), DTP
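Hypothetical rows in the flat-file format above might look as follows (node IDs, InfoObject names and node names are invented; PARENTID references the NODEID of the parent node, which is how the file encodes the parent-child structure):

```python
# Sample hierarchy rows in the NODEID / IOBJNM / NODENAME / PARENTID layout.
rows = [
    {"NODEID": 1, "IOBJNM": "0HIER_NODE", "NODENAME": "WORLD", "PARENTID": 0},
    {"NODEID": 2, "IOBJNM": "ZREGION",    "NODENAME": "NORTH", "PARENTID": 1},
    {"NODEID": 3, "IOBJNM": "ZCUSTOMER",  "NODENAME": "C100",  "PARENTID": 2},
]

# Resolve a leaf's path to the root to make the parent-child chain visible.
by_id = {r["NODEID"]: r for r in rows}

def path(node_id):
    names = []
    while node_id in by_id:
        names.append(by_id[node_id]["NODENAME"])
        node_id = by_id[node_id]["PARENTID"]
    return "/".join(reversed(names))

print(path(3))  # WORLD/NORTH/C100
```

The LANGU/TXTSH/TXTMD/TXTLG columns (omitted here) carry the language-dependent node texts.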


SAP BW Hierarchy

Business Content is a set of pre-delivered, well-built objects developed by SAP, shipped with the system.

All the InfoObjects that come as part of Business Content start with '0'.

The technical name of a custom InfoObject must not start with '0' or contain any special character.

InfoObject names must be between 3 and 9 characters long.

We cannot start the technical name with a number, since leading numbers are reserved for SAP:
0 for SAP BW
9 for APO
6 for DB
8 for Data marts

There are two types of objects:
SAP delivered: /BI0/
Custom created: /BIC/

Fact table, dimension table, SID table, master data (attr, text, hier)

M – /BIC/M (IO tech name) – Master Data View
P – /BIC/P (IO tech name) – Master Data Table (time independent)
Q – /BIC/Q (IO tech name) – Master Data Table (time dependent)
X – /BIC/X (IO tech name) – Attribute SID table (time independent)
Y – /BIC/Y (IO tech name) – SID table for time dependent nav attr
S – /BIC/S (IO tech name) – SID table for char
T – /BIC/T (IO tech name) – Text Table
H – /BIC/H (IO tech name) – Hierarchy table
I – /BIC/I (IO tech name) – SID Hierarchy Structure
K – /BIC/K (IO tech name) – Hierarchy SID table
J – /BIC/J (IO tech name) – Hierarchy Interval Table

From BW 7.5 SP06 onwards, InfoObjects can only be created in the BW Modeler.



BW – Introduction

Technical names can't be changed, so be careful when choosing them.

A data warehouse stores data that is not used frequently.
An Enterprise DW is used to collect data from different sources, consolidate it, and cleanse it.
BW does not impact the operational system (insert, update, delete).
Data can come from multiple diverse systems.

OLTP (On-line Transaction Processing) is characterized by a large number of short on-line transactions (INSERT, UPDATE, and DELETE). The main emphasis for OLTP systems is very fast query processing, maintaining data integrity in multi-access environments, and effectiveness measured in transactions per second. An OLTP database contains detailed, current data, and the schema used to store transactional data is the entity model (usually 3NF).

OLAP (On-line Analytical Processing) is characterized by a relatively low volume of transactions. Queries are often very complex and involve aggregations. For OLAP systems, response time is the effectiveness measure. OLAP applications are widely used in data mining. An OLAP database contains aggregated, historical data, stored in multi-dimensional schemas (usually a star schema).

OLTP System (Online Transaction Processing – operational system) vs. OLAP System (Online Analytical Processing – data warehouse):

  • Source of data: OLTP – operational data; OLTPs are the original source of the data. OLAP – consolidated data; OLAP data comes from the various OLTP databases.
  • Purpose of data: OLTP – to control and run fundamental business tasks. OLAP – to help with planning, problem solving, and decision support.
  • What the data reveals: OLTP – a snapshot of ongoing business processes. OLAP – multi-dimensional views of various kinds of business activities.
  • Inserts and updates: OLTP – short and fast inserts and updates initiated by end users. OLAP – periodic long-running batch jobs refresh the data.
  • Queries: OLTP – relatively standardized and simple queries returning relatively few records. OLAP – often complex queries involving aggregations.
  • Processing speed: OLTP – typically very fast. OLAP – depends on the amount of data involved; batch data refreshes and complex queries may take many hours; query speed can be improved by creating indexes.
  • Space requirements: OLTP – can be relatively small if historical data is archived. OLAP – larger due to the existence of aggregation structures and history data; requires more indexes than OLTP.
  • Database design: OLTP – highly normalized with many tables. OLAP – typically de-normalized with fewer tables; use of star and/or snowflake schemas.
  • Backup and recovery: OLTP – back up religiously; operational data is critical to run the business, and data loss is likely to entail significant monetary loss and legal liability. OLAP – instead of regular backups, some environments may simply reload the OLTP data as a recovery method.

Data mining: the process of extracting patterns from data. It examines data for trends and patterns.
We can divide IT systems into transactional (OLTP) and analytical (OLAP). In general we can assume that OLTP systems provide source data to data warehouses, whereas OLAP systems help to analyze it.
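The contrast can be illustrated with a small, hypothetical example: the same table serving a short OLTP-style write and an OLAP-style aggregation over history (SQLite stands in for both systems here; table and column names are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, year INT, amount REAL)")

# OLTP: many short inserts/updates initiated by end users.
con.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                [("NORTH", 2023, 100.0), ("NORTH", 2024, 150.0),
                 ("SOUTH", 2024, 80.0)])

# OLAP: a complex query involving aggregation over historical data.
rows = con.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('NORTH', 250.0), ('SOUTH', 80.0)]
```

In a real landscape the two workloads run on separate systems precisely because the write-optimized (normalized) and read-optimized (de-normalized, indexed) designs pull in opposite directions.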

1.   Introduction & Evolution of SAP BW

A warehouse is subject-oriented, integrated, time-variant, and non-volatile collection of data in support of management’s decision making process (as defined by Bill Inmon).

Data Warehousing: the process of designing, building, and maintaining a data warehouse system.

Tools available in the market: SAP BW, Informatica, Hyperion, DataStage, Cognos, Teradata

Need for a BI system: to generate multi-dimensional reports, a multi-dimensional structure must be designed; this multi-dimensional structure is based on a schema.

Detailed level data to aggregated data.

The use of this information enables an organization to decide, measure, manage and optimize performance, achieving efficiency and financial benefits.

SAP Business Warehouse (BW) integrates data from different sources, transforms and consolidates the data, does data cleansing, and storing of data as well. It also includes data modeling, administration and staging area.

SAP Business Intelligence (BI) means analyzing and reporting of data from different heterogeneous data sources. It allows you to acquire data from multiple data sources, data staging, which can be further distributed to different BI systems. A SAP Business Intelligence system can work as a target system for data transfer or source system for distribution of data to different BI targets.


The Business Explorer (BEx) is a reporting and analysis tool that supports query, analysis and reporting functions in BI. Using BEx, you can analyze historical and current data at different degrees of detail.

BI objects consist of the following components −

  • Roles
  • Web templates and workbook
  • Queries
  • Info Provider
  • Update Rules
  • Info Source
  • Transfer Rules
  • Info Objects
  • Data Sources

 

BI Architecture

Data Acquisition

Fetches the data from various available sources

Data warehouse

This is the staging layer where processing of data is done in different layers like PSA, DSO, data warehouse

Analytical engine

Bex

BI Accelerator

Process Chain

SP8 for SAP BW 7.4 introduced BW's newest modelling object: the Advanced DSO. The Advanced DSO is BW's new major modelling object and plays a central role in any SAP BW on HANA data model. It can be used as a classical DSO, InfoCube, PSA or write-optimized DSO; the Advanced DSO replaces all of these objects by essentially being all of them. When creating the object, you decide which properties the InfoProvider shall have.

Sometimes we choose the classical DSO option for data provision to the next layer, and so the change log table comes into the picture. One thing missing at the moment is the possibility to delete the change log via a process type. According to SAP, this functionality is to come in the future (maybe in BW 7.5).

At the moment it is possible to delete the requests manually in the Manage UI, but there is no standard way to delete them in batch. However, SAP has recently come up with a workaround.

After implementing note 2253065, the program RSDSO_REMOVE_REQS_PC will be available in your BW on HANA system. This program internally calls the function module RSDSO_REMOVE_REQUESTS_API to delete change log requests in batch.


Activating Drag & Drop:

SM30 -> TM for RSADMINSV -> uncheck Drag & Drop

2.   Info Objects

Definition

Business evaluation objects are known in BI as Info Objects.

Info object name must be between 3 & 9 characters long

They are divided into:

  1. Characteristics
  • All master data is represented with characteristics.
  • For example: customers, material, plant
  2. Key figures
  • For example: revenue
  3. Units
  • A unit gives proper meaning to a key figure.
  • For example: currency, amount units like KG, centimeter
  4. Time characteristics
  • Provided by the system; we don't need to create them.
  • For example: fiscal year, fiscal period, calendar day
  5. Technical characteristics
  • Related to the system; generated internally; we don't need to create them.
  • For example: request number

Business Content: pre-delivered, well-built objects developed by SAP, shipped with the system.

Use

Info Objects are the smallest units of BI. Using Info Objects, information is mapped in a structured form. This is required for constructing Info Providers.

Info Objects with attributes or texts can themselves also be Info Providers (if in a query).

Info Area: used to organize Info Objects.

Max length 30 characters; must start with a letter.

Info Object Catalog: organizes characteristic and key figure Info Objects.


All the Info Objects that come as part of Business Content start with '0'.

The technical name of a custom Info Object must not start with '0' or contain any special character.

The length should be 3 to 9 characters.

Reference
Changes are not possible; only the description can be changed.

Template
Changes are possible.

Naming conventions:

  1. The length of the technical name should be 3 to 9 letters
  2. Generic (custom) objects are prefixed with Z or Y
  3. We cannot start the technical name with a number, since leading numbers are reserved for SAP

0 for SAP BW

9 for APO

6 for DB

8 for Data marts

Tabs in a characteristic Info Object:

  1. General:

Data type

Lowercase letters: "Wall Mart" allowed when checked; only "WALL MART" when unchecked

Conversion routine: add leading zeroes

Transfer routine: ABAP routine

Attribute only: can't have attributes of its own; usually not used

  2. Business Explorer:

Display: using this we can set the display property of a characteristic in reporting

Authorization relevant: security with controlled data access

  3. Master data/text:
  4. Attribute (display & navigational)
  5. Text
  6. Hierarchy

Attributes are further divided into:

  1. time dependent: Emp age, designation
  2. time independent: Emp name, DOB, DOJ

There are two types of objects:

SAP delivered: /BI0/

Custom created: /BIC/

Fact table, dimension table, SID table, master data (attr, text, hier)

P — /BIC/P (IO tech name) – time independent master data attr table

Q — /BIC/Q (IO tech name) – time dependent master data attr table

X — /BIC/X (IO tech name) – SID table for time independent nav attr

Y — /BIC/Y (IO tech name) – SID table for time dependent nav attr

S — /BIC/S (IO tech name) — SID table for char

T — /BIC/T (IO tech name) – text table

J — /BIC/J (IO tech name) – hier interval table

H – Hierarchy table

K – Hierarchy SID table

I – SID hierarchy structure

Attribute:

An attribute is additional information given to a characteristic.

Types of attributes:

  1. Display attributes
  2. Navigational attributes
  3. Transitive attribute
  4. Compounding attribute

3.   Data Store Object (DSO)

Operational Data Store (ODS-older name)

3.X ODS – now DSO

Data target, Info Provider, data container; a flat (two-dimensional) structure; works with overwrite functionality (can also be set to additive)

BW 3.X -> BI 7.X
Standard ODS -> Standard DSO
Transactional ODS -> Direct Update DSO
(new in 7.X) -> Write-Optimized DSO

 

Definition

A Data Store object serves as a storage location for consolidated and cleansed transaction data or master data on a document (atomic) level.

This data can be evaluated using a BEx query.

Staging: cleanse or consolidate the data before sending it to the target.

A Data Store object (DSO) is a two-dimensional storage unit which mainly stores consolidated and cleansed transaction data or master data at the lowest granularity.

  • It is a two-dimensional transparent table.
  • Data is stored at a detailed level.
  • A DSO has overwrite and additive functionalities: if all characteristics are the same, key figures are aggregated or overwritten depending on the functionality chosen.
  • Detailed-level reporting can be obtained from a DSO.
  • When we create a DSO/ODS, 3 tables are created in the backend:

Active data table /BIC/AXXXXXX00 – data moved here after request activation

New data table /BIC/AXXXXXX40 – data loaded here first

Change log table /BIC/B000* – records the complete changes in the DSO

Max key fields: 16

Max data fields: 749

Key figures cannot be used as key fields
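The overwrite-versus-additive behaviour mentioned above can be sketched with plain Python dicts standing in for the DSO active table and a cube fact table (order numbers and amounts invented):

```python
# Overwrite vs. additive handling of a key figure for the same key.
def apply(target, key, amount, mode):
    if mode == "overwrite":          # standard-DSO-style overwrite
        target[key] = amount
    else:                            # InfoCube-style additive aggregation
        target[key] = target.get(key, 0) + amount

dso, cube = {}, {}
# The same order arrives twice, e.g. after a document change.
for key, amount in [("ORD1", 100), ("ORD1", 120)]:
    apply(dso, key, amount, "overwrite")
    apply(cube, key, amount, "additive")

print(dso)   # {'ORD1': 120}  -> last record wins
print(cube)  # {'ORD1': 220}  -> values are summed
```

This is why changeable document data is first staged in a DSO (overwrite) before being loaded additively into a cube, with the change log supplying the correcting delta records.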

Standard DSO
  • Structure: three tables – activation queue, active data table, change log
  • Data supply: from a Data Transfer Process
  • SID generation: yes
  • Delta update; request activation required
  • Purpose: staging
  • Reporting possible: yes
  • Acts as/supports: data mart, InfoProvider, ETL supported
  • Keys: key fields and data fields
  • Data mart: yes, data can be loaded to the next level
  • Delete data: request-wise deletion, selective deletion, deletion of change log data

Write-Optimized DSO
  • Structure: active data table only
  • Data supply: from a Data Transfer Process
  • SID generation: no
  • Faster loading, no activation
  • Purpose: initial staging
  • Reporting possible: yes
  • Acts as/supports: ETL supported, data mart, delta mode
  • Keys: technical keys (request no., data packet, record no.); the semantic key is important to identify duplicates
  • Data mart: yes, data can be loaded to the next level

Direct Update DSO (real time)
  • Structure: active data table only
  • Data supply: from APIs; DTP not possible
  • SID generation: no
  • Reporting possible: yes
  • Acts as/supports: no ETL, no delta, data mart
  • Used in APD processes
Standard DSO:

PSA -> activation queue -> on activation: active table & change log

The activation queue is emptied after the request is activated.

When data is transferred from the PSA to the activation queue, it must be activated per request; on activation, the data is written to the active table and the change log.


0RECORDMODE

When a standard DSO is activated, SAP BW adds the 0RECORDMODE Info Object to the DSO and to all three tables of the standard DSO.

This Info Object is used internally by SAP BW.

Write optimized DSO:

The semantic key is the only key combination in the active table; it works like a secondary index.

Direct update DSO:

The Following APIs exists:

RSDRI_ODSO_INSERT: This API is used to insert new data which does not exist in the system.

RSDRI_ODSO_INSERT_RFC: This is similar to the API “RSDRI_ODSO_INSERT” except that this API can be called remotely.

RSDRI_ODSO_MODIFY:  This API is used to modify existing records. If a new record comes in, the record is inserted.

RSDRI_ODSO_MODIFY_RFC: This is similar to the API “RSDRI_ODSO_MODIFY” except that this API can be called remotely.

RSDRI_ODSO_UPDATE: This API is used to modify existing records.

RSDRI_ODSO_UPDATE_RFC: This is similar to the API “RSDRI_ODSO_UPDATE” except that this API can be called remotely.

RSDRI_ODSO_DELETE_RFC: This API is used to delete records.

4.   Info Cubes

Selective deletion Program: RSDRD_DELETE_FACTS

Cube is additive in nature. If all characteristics are same, key figures are aggregated.

1.    Schema

Logical arrangement of tables holding business data.

Schemas are used to define Info Cubes.

A fact is an event that can be counted or measured.

The fact table contains a maximum of 16 primary keys, hence we can attach up to 16 dimension tables to a fact table.

Star Schema

Simplest form of a dimensional model, in which data is organized into fact & dimensions.

Each dimension is joined to a single fact table. Each dimension is represented by only one dimension table, which is not further normalized. A dimension table contains a set of attributes that are used to analyze the data.

Each dimension table is connected to the fact table, as the fact table contains the primary key of each dimension table; these keys are used to join the tables.

The facts/measures in the fact table are used for analysis along with the attributes in the dimension tables.
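The fact/dimension split can be sketched as follows (all table, field and customer names are invented): the fact table holds foreign keys plus measures, and a report joins each fact row to its dimension attributes.

```python
# Star-schema sketch: one fact table joined to dimension tables via keys.
dim_customer = {1: {"name": "ACME"}, 2: {"name": "GLOBEX"}}
dim_product  = {10: {"desc": "Pump"}}

fact_sales = [  # foreign keys into the dimension tables + measures
    {"cust_id": 1, "prod_id": 10, "revenue": 500.0},
    {"cust_id": 2, "prod_id": 10, "revenue": 300.0},
]

# Analysis: measures from the fact table, attributes from the dimensions.
report = [
    (dim_customer[f["cust_id"]]["name"],
     dim_product[f["prod_id"]]["desc"],
     f["revenue"])
    for f in fact_sales
]
print(report)  # [('ACME', 'Pump', 500.0), ('GLOBEX', 'Pump', 300.0)]
```

In the extended star schema described below, an extra SID indirection sits between the dimension tables and the master data, so the master data lives outside the cube.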


Disadvantages of star schema:

Since the master data is inside the cube, reusability is reduced.

Alphanumeric keys are slower to process than numeric keys.

The analysis can be done for only 16 characteristics.

Extended Star Schema

In the extended star schema, fact tables are connected to dimension tables, each dimension table is further connected to SID tables, and the SID tables are connected to the master data tables. In an extended star schema, the fact and dimension tables are inside the cube, while the SID tables are outside the cube. When you load transactional data into the InfoCube, DIM IDs are generated based on the SIDs, and these DIM IDs are then used in the fact table.

In the extended star schema, one fact table can connect to 16 dimension tables, and each dimension table can be assigned a maximum of 248 SID tables. These SID tables correspond to characteristics, and each characteristic can have master data tables such as ATTR and Text.

  • ATTR − It is used to store all the attribute data.
  • Text − It is used to store description in multiple languages.

Master data resides outside the cube, so it can be reused by multiple cubes.

SID: the alphanumeric key is converted to a numeric key, which improves performance.

This improves the analysis possibilities of a report to around 4,000 ways (16 dimensions × 248 SID tables = 3,968).
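A SID table maps each characteristic value to a numeric surrogate ID. For the standard characteristic 0CUSTOMER the SID table follows the /BI0/S* naming convention, and a lookup can be sketched as follows (the customer value and field names are illustrative):

```abap
* Hedged sketch: look up the numeric SID for a characteristic value.
* /BI0/SCUSTOMER is the SID table of characteristic 0CUSTOMER;
* 'C100' is a hypothetical customer number.
SELECT SINGLE sid
  FROM /bi0/scustomer
  WHERE customer = 'C100'
  INTO @DATA(lv_sid).
```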

08.png

09.png

10.png

11.png

 

5.   Other Info Providers

1.    Multi providers

2.    Composite Provider

A Composite Provider is an Info Provider that combines data from several analytic indexes or from other Info Providers (by union, inner join, or left outer join) and makes this data available for reporting and analysis.

The authorizations for working with Composite Providers are assigned using authorization object S_RS_CPRO.

The analysis authorizations are created from a combination of the authorizations for the Composite Provider and the authorizations for its sub objects.

Restrictions

A Composite Provider can only be transported if it consists solely of Info Providers.

You cannot use non-cumulative key figures in a Composite Provider.

For real-time Info Cubes, the open request is not read in the query, even if you have made the corresponding data integrity setting in Query Designer.

3.    Info Sets

4.    Semantically Partitioned Objects

5.    Hybrid Providers

6.    Transient Providers & Composite Providers

6.   ETL process (Extraction, Transformation, & Data Loading)

1.    ETL capabilities in SAP BW

2.    Data Source

Extraction from ECC (generic data source):

RSO2

12.png

Create

13.png

14.png

RSA6

Check whether the data source is activated.

That completes the setup on the ECC side.

BW side:

RSA13

Check the connection

Double click on connection (application component) -> Replicate the data source

RSDS (replicate specific data source)

16.png

17.png

Create Info Package -> schedule -> monitor -> Create Cube/DSO -> Create transformation -> Create DTP

3.    Info Package (Source to Data source)

The Info Package is an entry point for SAP BI to request data from a source system.

An IP is a data loading scheduler from which you can execute the extraction of data from the source system. It contains information about when the data is to be loaded and from which data source.

In BW 3.5, an IP can be used to load data into the PSA as well as into data targets from the source system, but in BI 7.0 it is restricted to the PSA only; from there, a DTP is used.

It extracts data from the source system and loads it into the PSA (data source).

An IP is specific to one data source.

Data selection: fixed values or dynamic values (via an ABAP routine).

How to create?

Right-click on the data source & select Create Info Package.

The IP name starts with ZPAK_*, which is system generated; we can give only a description to it.

4.    Data Transfer Process (DTP): (DS to DSO, DSO to Cube, DSO to DSO etc.)

A DTP determines how data is transferred between two persistent objects (e.g. from the PSA to a data target such as a cube, DSO, Info Object, or external system), with respect to transformations and filters.

DTP replaced IP in BI 7.0 for loading the data to data targets and data marts.

DTP has separate Source to Target Delta handling mechanism.

Deleting a data mart request deletes the request itself, not just its status, as is the case with an IP.

A DTP filters the data based on the semantic key, which is not possible with an IP.

Breakpoints can be set at data packet level at all stages (before extraction, before data transfer, and after data transfer), whereas breakpoints do not exist in an IP.

Temporary Data Storage and Error Stack improve error handling in DTP which doesn’t exist in IP.

DTP is faster as DTP data loading is optimized by Parallel process.

Data loading through a DTP generates no tRFCs/LUWs at runtime, thus minimizing the loading time.

It’s possible to migrate from IP flow to DTP flow if you are in BI 7.0 or above.

Create a DTP and set the processing mode to “Serial Extraction, Immediate Parallel Processing” to switch from IP to DTP without data loss.

DTP can be used to load the data in the following situations:

  1. Loading data from DSO to DSO/CUBE, CUBE to CUBE
  2. Real time Data Acquisition
  3. For direct data access.
  4. Loading data from BI to External systems (for example Open Hub Destinations)

Advantages of DTP:

  1. Delta Management
  2. Handling duplicate records
  3. Filters
  4. Parallel processing
  5. Error Handling
  6. Debugging

5.    Transformation

6.    Routines

7.    Loading of master data & transaction data

7.   Transformation

The Transformation process is used to perform data consolidation, cleansing and data integration. When data is loaded from one BI object to other BI object, transformation is applied on the data. Transformation is used to convert a field of source into the target object format.

Each transformation consists of at least one transformation rule. Different rule types and routines are available, allowing you to create anything from simple to complex transformations.

Transformation Rules

Transformation rules are used to map source fields and target fields. Different rule types can be used for transformation.

  • Rule Type − A rule type is defined as an operation applied on the fields using a Transformation rule.
  • Rule Group − It is defined as the group of transformation rules and each key field in the target contains one transformation rule.
  • Transformation Type − It is used to determine the transformation values and how data is entered in the target.
  • Routines − Routines are used to perform complex transformations. Routines are defined as local ABAP classes and consist of a predefined definition area and an implementation area.

A routine is created in the implementation area, and inbound and outbound parameters are defined in the definition area. Routines can be defined as the transformation rule for a key figure, and they are available as rule types.
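The skeleton of a field routine looks roughly like the sketch below. The surrounding local class and the exact method signature are generated by the transformation editor; the target field ZREGION and the derivation logic are hypothetical, purely for illustration.

```abap
* Hedged sketch of a BW transformation field routine. Only the code
* between the comment markers is written by the developer; everything
* else is generated by the system.
METHOD compute_zregion.   " hypothetical target field ZREGION
*$*$ begin of routine - insert your code only below this line *-*
  " Illustrative logic: derive the region from the first two
  " characters of the source field PLANT.
  RESULT = SOURCE_FIELDS-plant(2).
*$*$ end of routine - insert your code only before this line  *-*
ENDMETHOD.
```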

  1. Real-Time Data Acquisition (RDA)

Real-time data acquisition is based on moving data to the Business Warehouse in real time. Data is sent to the delta queue or PSA table in real time. Real-time data acquisition is used when you transfer data frequently (hourly or every minute) and the data is refreshed at report level multiple times in a single time interval.

Real-time data acquisition is one of the key properties of a data source, and the data source must support it. Data sources that are configured to transfer data in real time cannot be used for standard data transfer.

Real time data acquisition can be achieved in two scenarios −

  • By using InfoPackage for real time data acquisition using Service API.
  • Using Web Service to load data to Persistent Storage Area (PSA) and then by using real time DTP to move the data to DSO.

Real-time Data Acquisition Background Process −

To run the InfoPackage and the data transfer process (DTP) at regular intervals, you can use a background process known as a daemon.

The daemon process gets from the InfoPackage and the DTP the information about which data is to be transferred and which PSA and data store objects are to be loaded with data.

8.   Data extraction from Source System

Loading data from ECC to the BW system using standard extractors

Custom or generic extractor

Data extraction from non-SAP sources using UD Connect (Universal) or DB (Database) Connect

9.   Creating queries using BEx Query Designer

Creating simple BEx query

Creation of OLAP variables

Creating filters

Creation of structures

Creating selection & formulas

Creating RKF & CKF

Creating conditions & exceptions

10.  BEx Analyzer & Report Designer

Running queries in BEx analyzer

Creating workbook

Local properties & formulas in BEx analyzer

11.  Web Application Designer (WAD)

WAS: Web Application Server

12.  Administration & Monitoring

1.    Process chains

Display component is the folder for Process chain.

Subcomponents cannot be created under generated components (unassigned nodes).

Definition

A process chain is a sequence of processes that are scheduled to wait in the background for an event. Some of these processes trigger a separate event that can, in turn, start other processes. It provides various connectors that allow alternative and parallel execution of process.

Process chains provide graphical scheduling & monitoring features to help in automation, visualization & monitoring of tasks / processes.

Use

In an operating BI system there are a multitude of processes that occur regularly. If you use process chains, you can:

  • Automate complex schedules in BW with the help of event-controlled processing,
  • Visualize the processes by using network graphics, and
  • Centrally control and monitor the processes.

Run in background

Process chains are grouped by components

Features −

  • Security
  • Flexibility
  • Openness

Process Chain involves three main steps:

RSPC -> create -> variant ->

Start Process: It describes when the chain will start (immediately, as a scheduled job, after an event, or via a meta chain or API)

Connector: It is the linking process; it lets you choose the condition under which the next process starts

Variant: The object on which the process is executed is called a variant. It is a set of parameters passed to the process, such as the name of the Info Package or Info Object

Transaction code:

RSPC is the single, central transaction used for process chain maintenance.

RSPCM: This is used to monitor daily process chains.

In this screen, existing process chains are sorted by “Application Components”.

There are two views available:

  1. Check View
  2. Planning View.

The default mode is Planning View.

If the data load is to be done via an Info Package, use the Process type “Execute Info Package”

If the data load is to be done via a DTP, use the Process type “Data Transfer Process”

For any subsequent step, we can choose whether the successor step is executed depending on how the predecessor ended:

  • Was successful: typically used in normal processing
  • Ended with errors: Typically used to send failure messages
  • Execute irrespective of success or failure of previous step

Steps to check the consistency of a process chain: Goto -> Checking View

Create an event in transaction SM62

BP_EVENT_RAISE is a standard SAP function module that triggers a background event.
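A process chain whose start process waits for an event can be triggered from ABAP like this; the event name Z_START_LOAD is hypothetical and must have been created in SM62 first:

```abap
* Trigger a background event so that an event-based process chain starts.
* Z_START_LOAD is a hypothetical event defined beforehand in SM62.
CALL FUNCTION 'BP_EVENT_RAISE'
  EXPORTING
    eventid                = 'Z_START_LOAD'
  EXCEPTIONS
    bad_eventid            = 1
    eventid_does_not_exist = 2
    eventid_missing        = 3
    raise_failed           = 4
    OTHERS                 = 5.
IF sy-subrc <> 0.
  " handle the error, e.g. write a message to the application log
ENDIF.
```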

Sequence of processes:

Loading data into DSO

Activating the DSO data

General services supported by process chain

Variant

A set of parameters used to execute a job in the background.

Without a variant, a background job cannot be started.

2.    Analysis authorizations

3.    Aggregations

4.    Compression

5.    Info Cube & DSO administration

 

13.  Advanced Features

Open Hub Destination (OHD): BW to other system

Analysis Process Designer (APD):

14.  SAP BW Powered by HANA

Disk-based & in-memory databases

Intro

In memory Info Cube

In memory DSO

Importing SAP BW models into HANA analytic view

Creating transient provider on HANA analytic view

Creating semantic partition objects (SPO’s)

LSA & LSA++ architecture (Layered Scalable Architecture)

Composite provider

Advanced DSOs, ODP, LSA++, BW objects exposed as HANA views

Eclipse based modeling tools

Schemas & tables on HANA analytical view, calculation view in HANA studio

SQL scripting procedures (SQLScript in HANA)

15.  SAP BW – BO Integration

BO intro

Integration landscape

Create OLAP connection using IDT

Connecting to Web intelligence

Connecting to Explorer

Connecting to Crystal Reports 2011 & Enterprise

Connecting to analysis office for OLAP & MS office

Connecting Dashboards

Analysis for Microsoft Office

OLAP Report with Analysis

As an OLAP tool, Analysis allows you to do the following:
  • Insert data from multiple sources and view the data from multiple perspectives.
  • Use SAP BEx queries, query views and SAP BW Info Providers as data sources.
  • Add multiple crosstabs to a workbook to insert and analyze data from different sources and systems.
  • Access the data and metadata of connected SAP BW systems with a dedicated set of Analysis functions in Microsoft Excel.
  • Connect to the SAP Business Objects BI platform to access data source
  • Change the view of displayed data in the report
  • Refine your analysis of OLAP data with conditional formatting, filters, prompts, calculations and display hierarchies.
  • Drag measures and characteristics objects into and out of the report to add and remove them.
  • Add charts for visual analysis of data.
  • Save the status of your navigation as an Analysis view that is available for reuse.
  • Use both the SAP Business Objects BI and the SAP Net Weaver platforms to save and share workbooks and perspectives.

The Analysis design panel consists of three tabs (Analysis -> Display -> Display Design Panel):

  1. Analysis: Data source, Columns, Rows, Background Filter
  2. Information: report metadata
  3. Components:

Navigation options in Analysis reports:

  • Pausing refresh
  • Prompting
  • Filtering data
  • Sorting (Ascending, Descending, Break Hierarchy)
  • Hierarchies – by default flat structure is displayed
  • Displaying members, measures, and totals

 

Analysis View vs. Analysis Workbook

  • Analysis View: contains one analysis; contains the navigation state; saved to an SAP BusinessObjects BI platform.
  • Analysis Workbook: can contain several analyses; contains the navigation state and Excel design elements; saved to an SAP BusinessObjects BI platform or to SAP BW.

User profiles – There are three user profiles for SAP Analysis for Microsoft Office:

  1. Workbook Creator: Users who create and maintain workbooks based on SAP BEx queries, query views and SAP BW InfoProvider.
  2. Data Analyst: Users who navigate through existing workbooks and analyze the data they contain. They can also include workbooks in a Microsoft PowerPoint presentation and continue the analysis there.
  3. Administrator: IT specialists who install, configure and administer SAP Analysis for Microsoft Office. They also assign security rights and authorizations to workbook creators and analyzers.

SAP Analysis for Microsoft Office is a Microsoft Office add-in that allows multidimensional analysis of OLAP sources. It consists of the following components:

  1. Analysis Plug-in
  2. Enterprise Performance Management Plug-in
  3. Business Planning and Consolidation Plug-in

The plug-ins include versions for Microsoft Excel and Microsoft PowerPoint.

In Analysis for Microsoft Excel, you can use the plug-ins in one workbook. The sheet type defines which plug-in is active. The available sheet types are: Analysis, EPM, Neutral and Non-COF. Empty sheets are defined as neutral. If you add a data source into a neutral sheet with one plug-in, the corresponding sheet type is assigned. If you switch within a workbook to a sheet of another type, the respective plug-in is enabled automatically. To reset a sheet to type neutral, you have to remove all inserted data sources from the sheet.

Sheets of type Non-COF are not checked by the Analysis add-in. It can be useful to assign this type to sheets that do not contain Analysis content, to improve performance.

In Microsoft Excel, Analysis is available in two separate tabs in the ribbon: Analysis and Analysis Design. In Microsoft PowerPoint, it is available in one tab: Analysis.

In the Analysis plug-in, you can use SAP BEx queries, query views and InfoProviders as data sources. The data is displayed in the workbook in crosstabs. You can insert multiple crosstabs in a workbook with data from different sources and systems. If the workbook will be used by different users, it is also helpful to add info fields with information on the data source and filter status.

Using the design panel, you can analyze the data and change the view on the displayed data. You can add and remove dimensions and measures to be displayed easily with drag and drop. To avoid single refreshes after each step, you can pause the refresh to build a crosstab. After ending the pause, all changes are applied at once.

You can refine your analysis using conditional formatting, filter, prompting, calculations and display hierarchies. You can also add charts to your analysis. If you want to keep a status of your navigation, you can save it as an analysis view. Other users can then reuse your analysis.

For more sophisticated workbook design, the Analysis plug-in contains a dedicated set of functions in Microsoft Excel to access data and meta data of connected BW systems. There are also a number of API functions available that you can use with the Visual Basic Editor, to filter data and set values for BW variables.

You can also plan business data based on the current data in your data source. You can enter the planning data manually and you can enter planning data automatically using planning functions and planning sequences of SAP BW Integrated Planning.

The Analysis plug-in must be installed on your local machine. You can connect directly to an SAP BW system, or you can connect via the SAP BusinessObjects BI platform to include data sources. You can use the following platforms to store and share workbooks and presentations: the SAP BusinessObjects business intelligence platform and SAP BW (SAP BW/4HANA).

Creating and Administrating Workbooks
You can insert SAP BEx Queries, query views and SAP BW InfoProvider as data sources into a workbook. These data sources are stored in a SAP BW system. You can add multiple crosstabs to a worksheet or workbook. The crosstabs can contain data from the same data source or from different sources. You can also use data sources that are stored in different systems in one workbook.

To add a crosstab with data to a workbook, you select a data source in a SAP BW system. You need the appropriate authorizations for the platform you want to use (SAP BW, SAP BW/4HANA or SAP BusinessObjects Business Intelligence), and the relevant SAP BW systems to insert a data source into a workbook.

You can save new workbooks either on the SAP BusinessObjects business intelligence platform, on an SAP BW platform or in an SAP BW/4HANA system. Stored workbooks can be opened from the corresponding platforms. They can also be renamed or deleted on the platforms.

You can also convert workbooks created with the SAP BEx Analyzer 3.5 and SAP BEx Analyzer 7.0 to an Analysis workbook.

You can also insert and analyze SAP HANA data sources and SAP Analytics Cloud models in an Analysis workbook.

Working with formulas
In Analysis for Microsoft Office, you can use the standard functions of Microsoft Excel to build formulas.

You can also use these functions in VBA macros.

Analysis functions
The following functions are available in the Analysis category:
SAPGetData
SAPGetDimensionDynamicFilter
SAPGetDimensionEffectiveFilter
SAPGetDimensionInfo
SAPGetDimensionStaticFilter
SAPGetDisplayedMeasures
SAPGetInfoLabel
SAPGetMeasureFilter
SAPGetMeasureInfo
SAPGetMember
SAPGetSourceInfo
SAPGetUniformScaling
SAPGetVariable
SAPGetWorkbookInfo
SAPListOf
SAPListOfDimensions
SAPListOfDynamicFilters
SAPListOfEffectiveFilters
SAPListOfMeasureFilters
SAPListOfMembers
SAPListOfMessages
SAPListOfStaticFilters
SAPListOfVariables
SAPSetData
SAPSetFilterComponent

SAP Analysis for Microsoft Office

SAP Analysis for Microsoft Office – Official Product Tutorials

SAP BusinessObjects Analysis, edition for Microsoft Office 2.2

https://help.sap.com/saphelp_boao22/helpdata/en/f2/6f8d126c9b1014bf2c9a7eb0e91070/frameset.htm