
Thursday, February 12, 2009

Functions in the SAP Source System in BI

Use

The BI Service API (SAPI) is a technology package in the SAP source system that enables the close integration of data transfer from SAP source systems into a BI system.

The SAPI allows you to:

make SAP application extractors available as a basis for data transfer into BI

carry out generic data extraction

use intelligent delta processes

access data in the source system directly from BI (VirtualProvider support)

With transaction SBIW, the SAPI provides an implementation guide in the SAP source system that includes the activities necessary for data extraction and data transfer from an SAP source system into BI.

Irrespective of the type of SAP source system, Customizing for extractors comprises activities that belong to the scope of SAPI:

general settings for data transfer from a source system into BI

the option of installing BI Content delivered by SAP

the option of maintaining generic DataSources

the option of postprocessing the application component hierarchy and DataSources on a source system level

In addition to the activities that are part of the scope of SAPI, Customizing for extractors for OLTP and further SAP source systems may contain source-system specific settings for application-specific DataSources.

Features

General Settings

General settings include the following activities:

Maintaining control parameters for data transfer

Restricting authorizations for extraction

Monitoring the delta queue

Installing BI Content Delivered by SAP

DataSources delivered by SAP with BI Content and those delivered by partners appear in a delivery version (D version). If you want to use a BI Content or partner DataSource to transfer data from a source system into BI, you need to transfer this DataSource from the D version into the active (A) version.

In the source system, the DataSources are assigned to specific application components. If you want to display the DataSources in BI in the DataSource tree of the Data Warehousing Workbench according to this application component hierarchy, you need to transfer them from the D version into the A version in the source system.

Note

Transferring data from an OLTP system or other SAP source systems

You need to make settings for some BI Content DataSources before you can transfer data into BI. These settings are listed in transaction SBIW in the Settings for Application-Specific DataSources section. This section only appears in those SAP source systems for which it is relevant.

The following activities are associated with installing BI Content:

Transferring application component hierarchies

Installing Business Content DataSources

Generic DataSources

Regardless of the specific application, you can use generic data extraction to extract data from any transparent tables, database views or SAP Query functional areas. You do not need to program in ABAP. You can also use function modules for generic data extraction.

In this way, you can use your own DataSources for transaction data, master data attributes or texts. The data for such DataSources is read generically and then transferred into BI.

Generic DataSources allow you to extract data which cannot be supplied to BI either with the DataSources delivered with BI Content or with customer-defined DataSources of the application.

For more information, see Maintaining Generic DataSources.

Postprocessing DataSources

You can adapt existing DataSources to suit your requirements as well as edit the application component hierarchy for the DataSources.

For more information, see Editing DataSources and Application Component Hierarchies .

Maintaining Control Parameters for Data Transfer in BI

Procedure

Maintain entries for the following fields:


1. Source system

Enter the logical system for your source client and assign a control parameter to it.

For information about source clients, see the source system under Tools → Administration → Management → Client Management → Client Maintenance.

2. Maximum size of the data package

When you transfer data into BI, the individual data records are sent to BI in packages of variable size. You use this parameter to control the typical size of a data package of this type.

If you do not maintain an entry, the data is transferred with the default setting of 10,000 kBytes per data package. However, the required memory depends not only on the data package size setting, but also on the width of the transfer structure, the required memory of the affected extractor, and, for large data packages, the number of data records in the package.

3. Maximum number of rows in a data package

For large data packages, the required memory mainly depends on the number of data records that are transferred with the package. You use this parameter to control the maximum number of data records that you want the data package to contain.

By default, the system transfers a maximum of 100,000 records per data package.

The maximum main memory required per data package is approximately 2 × 'Max. Rows' × 1000 bytes.
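For example, with the default maximum of 100,000 rows, this works out to roughly 2 × 100,000 × 1,000 bytes, that is, about 200 MB of main memory per data package.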

4. Frequency

By specifying a frequency you determine the number of data IDocs after which an info IDoc is to be sent. In other words, how many data IDocs are described by a single info IDoc.

The frequency is set to 1 by default. This means that an info IDoc follows after each data IDoc. You should choose a frequency between 5 and 10, but not greater than 20.

The larger the package size of a data IDoc, the lower you should set the frequency. This way, you receive information about the status of the data load at relatively short intervals during the upload.

Info IDocs contain information about whether the data IDocs were uploaded correctly. In the BI monitor, you can see from each info IDoc whether the load process was successful; if this is the case for all data IDocs described in an info IDoc, the traffic light in the monitor is green.

5. Maximum number of parallel processes for the data transfer

An entry in this field is only required as of Release 3.1I.

Enter a value greater than 0. The maximum number of parallel processes is set to 2 by default. The optimal choice of the parameter depends on the configuration of the application server that you are using for the data transfer.

6. Target system of a batch job

Enter the name of the application server on which you want to process the extraction job.

To get the name of the application server, choose Tools ® Administration ® Monitor ® System Monitoring ® Server. The Host column displays the name of the application server.

7. Maximum number of data packages in a delta request

You use this parameter to set the maximum number of data packages in a delta request or in the repeat of a delta request (repair).

Only use this parameter if you are expecting delta requests with a very large volume of data. In this case, you allow more than 1000 data packages to be generated in a request, while retaining an appropriate data package size.

As before, there are no limits for initial values or the value 0. A limit is only applied if the value entered is greater than 0. However, for consistency reasons this number is not always strictly adhered to: depending on how far the data in the qRFC queue is compressed, the actual limit can deviate by up to 100 from the specified value.
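As a rough illustration: with the default package size of 100,000 records, a limit of 1,000 data packages corresponds to about 100 million records in a single delta request.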

Restricting Authorizations for Extraction in BI

Use

You use this function to exclude DataSources from the extraction for individual BI systems. Data that is stored in these DataSources is then not transferred into BI.

If you want to exclude a DataSource from the extraction for all connected BI systems, choose Editing DataSources and Application Component Hierarchies in the postprocessing of DataSources and delete the DataSource there.

Procedure


1. Choose New Entries.

2. Choose the DataSource that you want to exclude from the extraction.

3. Choose the BI system into which you no longer want data from this DataSource to be extracted.

4. In the Extr. Off field, specify that the DataSource is to be excluded from the extraction.

5. Save your entries and specify a transport request.

Delta Queue Check in BI

Use

The delta queue is a data store in the source system into which data records are written automatically. The data records are written to the delta queue either using an update process in the source system (for example with FI documents) or are extracted using a function module when data is requested from BI (for example, LIS extraction prior to BW 2.0).

The data records are transferred into BI when the scheduler issues a delta request.

The data is stored in compressed form in the delta queue. It can be requested from several BI systems. The delta queue is also repeat enabled; it stores the data from the last extraction process. The repeat mode of the delta queue is specific to the target system.

If the extraction structure of a DataSource is changed after data has been written to the delta queue but before the queue data is read (for example, during an upgrade), you can tell from the data itself which structure it was written with: the queue monitor shows fields that were previously empty but are now filled, and/or fields that were previously filled but are no longer filled.

You use this function to check the delta queue.

Features

The status symbol shows whether an update into a delta queue is activated for a particular DataSource. The delta queue is active if the status symbol is green; it is filled with data records when there is an update process or data request from BI. The delta method has to be initialized successfully in the scheduler in BI before a delta update can take place.

You can carry out the following activities:

Display data records

Display the current status of the delta-relevant field

Refresh

Delete the queue

Delete queue data

Activities

Displaying Data Records

1. To check the amount and type of data in the delta queue, select the delta queue and choose Display Data Records.

2. A dialog box appears in which you can specify how you want to display the data records.

a. You can select the data packages that contain the data records you want to see.

b. You can display specific data records in the data package.

c. You can simulate the extraction parameters to select how you want to display the data records.

3. To display the data records, choose Execute.

Displaying Current Status of Delta-Relevant Field

For DataSources that support generic deltas, you can display the current value of the delta-relevant field in the delta queue. In the Status column, choose Detail. The value displayed is the highest value of the delta-relevant field transferred during the last extraction; it is the lower limit for the next extraction.

Refreshing

If you select refresh,

newly activated delta queues are displayed

new data records that have been written to the delta queue are displayed

data records that have been deleted by the time the system reads the data records are not displayed

Deleting Queue Data

To delete the data in a delta queue for a DataSource, select the delta queue and in the context menu, choose Delete Data.

If you delete data from the delta queue, you do not have to reinitialize the delta method to write the DataSource data records into the delta queue.

Note that data is also deleted that has not yet been read from the delta queue. As a result, any existing delta update is invalidated. Only use this function when you are sure that you want to delete all queue data.

Deleting Queues

You can delete the entire queue by choosing Queue → Delete Queue. You need to reinitialize the delta method before you can write data records for the related DataSource into the delta queue.

Installing Application Component Hierarchies in BI

You use this function to install and activate application component hierarchies delivered by SAP or by partners.

After the DataSources are replicated in BI, this application component hierarchy is displayed with the transferred DataSources in the source system view of the Data Warehousing Workbench – Modeling. In BI, choose the DataSource overview from the context menu (right-mouse click) for the source system.

When you install the BI Content version of the application component hierarchy, the active customer version is overwritten.

For information about changing the installed application component hierarchy, see Editing DataSources and Application Component Hierarchies.

Installing BI Content DataSources in BI

Use

You use this function to transfer and activate DataSources delivered with BI Content and, where applicable, partner DataSources delivered in their own namespaces. After installing BI Content DataSources you can extract data from all the active DataSources that you have replicated in BI and transfer this data to all connected BI systems.

Activities

The Install DataSources from BI Content screen displays the DataSources in an overview tree. This tree is structured in accordance with the application components to which the DataSources are assigned.


1. In the application component hierarchy, select the nodes for which you want to install DataSources in the active version. To do this, position the cursor on the node and choose Highlight Subtree.

The DataSources and subtrees below the node are selected.

2. Choose Select Delta.

DataSources where the system found differences between the active and the delivered version (due to changes to the extractor, for example) are highlighted in yellow.

3. To analyze the differences between active and delivered versions of a particular DataSource, select the DataSource and choose Version Comparison. The application log contains further information about the version comparison.

4. To transfer a DataSource from the delivery version to the active version, select it in the overview tree by choosing Highlight Subtree and choose Transfer DataSources.

If an error occurs, the error log appears.

Regardless of whether data has been successfully transferred into the active version, you can call the log by choosing Display Log.

With a metadata upload (when you replicate DataSources in BI), the active version of the DataSource is made known to BI.

When you activate BI Content DataSources, the system overwrites the active customer version with the SAP version.

You can only search for DataSources or other nodes in expanded nodes.

For information about changing the installed DataSources, see Editing DataSources and Application Component Hierarchies.

Maintaining Generic DataSources in BI

Use

Regardless of the application, you can create and maintain generic DataSources for transaction data, master data attributes or texts from any transparent table, database view or SAP Query InfoSet, or using a function module. This allows you to extract data generically.

Procedure

Creating Generic DataSources
1. Select the DataSource type and specify a technical name.

2. Choose Create.

The screen for creating a generic DataSource appears.

3. Choose the application component to which you want to assign the DataSource.

4. Enter the descriptive texts. You can choose any text.

5. Select the datasets from which you want to fill the generic DataSource.

a. Choose Extraction from View if you want to extract data from a transparent table or a database view. Enter the name of the table or the database view.

After you generate the DataSource, you have a DataSource with an extraction structure that corresponds to the database view or transparent table.

For more information about creating and maintaining database views and tables, see the ABAP Dictionary Documentation.

b. Choose Extraction from Query if you want to use an SAP Query InfoSet as the data source. Select the required InfoSet from the InfoSet catalog.

Notes on Extraction Using SAP Query

After you generate the DataSource, you have a DataSource with an extraction structure that corresponds to the InfoSet.

For more information about maintaining the InfoSet, see the System Administration documentation.

c. Choose Extraction Using FM if you want to extract data using a function module. Enter the function module and extraction structure.

The data must be transferred by the function module in an interface table E_T_DATA.

Interface Description and Extraction Process

For information about the function library, see the ABAP Workbench: Tools documentation.

d. For texts, you also have the option of extracting from the fixed values of domains.

6. Maintain the settings for delta transfer, as required.

7. Choose Save.

When performing extraction, note SAP Query: Assignment to a User Group.

Note

Note when extracting from a transparent table or view:

If the extraction structure contains a key figure field that references a unit of measure or a currency unit field, this unit field has to be included in the same extraction structure as the key figure field.

A screen appears on which you can edit the fields of the extraction structure.

8. Edit the DataSource:

Selection

When you schedule a data request in the BI scheduler, you can enter the selection criteria for the data transfer. For example, you can determine that data requests are only to apply to data from the previous month.

If you set the Selection indicator for a field within the extraction structure, the data for this field is transferred in correspondence with the selection criteria in the scheduler.

Hide field

You set this indicator to exclude an extraction structure field from the data transfer. The field is no longer available in BI when you set the transfer rules or generate the transfer structure.

Inversion

Reverse postings are possible for customer-defined key figures. Therefore inversion is only active for certain transaction data DataSources. These include DataSources that have a field that is marked as an inversion field, for example, the update mode field in DataSource 0FI_AP_3. If this field has a value, the data records are interpreted as reverse records in BI.

If you want to carry out a reverse posting for a customer-defined field (key figure), set the Inversion indicator. The value of the key figure is transferred to BI in inverted form (multiplied by –1).

Field only known in exit

You can enhance data by extending the extraction structure for a DataSource by adding fields in append structures.

The Field Only Known in Exit indicator is set for the fields of an append structure; by default these fields are not passed to the extractor from the field list and selection table.

Deselect the Field Only Known in Exit indicator to enable the Service API to pass on the append structure field to the extractor together with the fields of the delivered extract structures in the field list and in the selection table.

9. Choose DataSource → Generate.

The DataSource is saved in the source system.

Maintaining Generic DataSources

Change DataSource

To change a generic DataSource, in the initial screen of DataSource maintenance, enter the name of the DataSource and choose Change.

You can change the assignment of a DataSource to an application component as well as the texts of a DataSource. Double-click on the name of the table, view, InfoSet, or extraction structure to go to the appropriate maintenance screen, where you can make changes such as adding new fields. You can also completely swap transparent tables and database views, although this is not possible with InfoSets. Then return to DataSource maintenance and choose Create. The screen for editing the DataSource appears. To save the DataSource in the SAP source system, choose DataSource → Generate.

If you want to test extraction in the source system independently of a BI system, choose DataSource → Test Extraction.

Delete DataSource

On the Change Generic DataSource screen, you can delete any DataSources that are no longer relevant. If you are extracting data from an InfoSet, delete the corresponding query. If you want to delete a DataSource, make sure it is not connected to a BI system.

For more information about extraction using SAP Query, see Extraction Using SAP Query.

Delta Transfer to BI in BI

The following update modes are available in BI:

Full update

A full update requests all data that meets the selection criteria you set in the scheduler.

Delta update

A delta update only requests data that has appeared in the source system since the last load.

Initializing the delta process

You need to initialize a delta process before you can execute it. The selections made during initialization are used when the delta records are loaded.

With large volumes of data, you can only ensure a performance-optimized extraction from the source system if you use a delta process.

In the maintenance of the generic DataSource, you can set up a delta for master data attributes and texts. You can also set up a generic delta using a (delta-relevant) field with a monotonically increasing value.

Setting Up an ALE Delta for Master Data Attributes or Texts

Master data attributes or texts for which you want to use a delta transfer have to fulfill two prerequisites:

1. Data must be extracted generically using a transparent table or database view.

2. A change document object must be available that can update the complete key of the table (or view) used for data extraction in combination with one of the tables on which the change document object is based.

The required control entries are delivered for the most important master data attributes and texts. Because a maintenance interface for control entries is integrated into the maintenance of generic DataSources or InfoSources, you can use the delta transfer for other master data attributes or texts as well.

To generate the control entry for master data attributes or texts that is required for BI, proceed as follows:

1. For an attribute or text DataSource, choose DataSource → ALE Delta.

2. Enter the table and the change document object that you want to use as a basis for the delta transfer.

An input help for the Table Name field searches all possible tables for a suitable key.

3. Confirm your entries.

If you have entered a usable combination of table and change document object, the extraction structure fields are listed in the table below. The status in the first column shows whether changing the master data in this field causes the system to transfer the delta record.

4. Save the settings to generate the required control entry.

Delta transfer is now possible for master data and texts.

After the DataSource has been generated, you can see this on the DataSource: Edit Customer Version screen; the Delta Update field is selected.

Note

You need two separate entries if you want to transfer delta records for texts and master data attributes.

Generic Delta

If a field exists in the extraction structure of a DataSource and the field contains values that increase monotonically over time, you can define delta capability for this DataSource. If a delta-relevant field of this type exists in the extraction structure (a timestamp, for example), the system determines the data volume transferred in the delta mode by comparing the maximum value transferred with the last load with the amount of data that has since entered the system. Only the new data is transferred.

To get the delta, generic delta management translates the update mode into a selection criterion. The selections of the request are enhanced with an interval for the delta-relevant field. The lower limit of the interval is taken from the previous extraction. The upper limit is taken from the current value, for example, the timestamp at the time of extraction. You use security intervals to ensure that all data is taken into account during extraction (see below). After the data request is transferred to the extractor and the data is extracted, the extractor informs generic delta management that the pointer can be set to the upper limit of the previously determined interval.
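Expressed schematically, the selection interval for a delta request is determined as follows:

Lower limit = pointer from the last extraction - security interval for the lower limit
Upper limit = current value (for example, the current timestamp) - security interval for the upper limit

Once the extraction has finished, the pointer is set to this upper limit.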

Note

The delta for generic DataSources cannot be used with a BI system release prior to 3.0. In older SAP BW releases, the system does not replicate DataSources for master data and texts that were delta-enabled using the delta for generic DataSources.

Determining Generic Delta for a DataSource
1. Choose Generic Delta.

2. In the dialog box that appears, specify the delta-determining field and the type of this field.

3. a. Enter a security interval.

The purpose of a security interval is to ensure that records that appear during the extraction process but could not yet be extracted (for example, because they had not yet been saved) are extracted with the next extraction.

You can add a security interval to the upper limit or lower limit of the interval.

Caution

You should only specify a security interval for the lower limit if the delta process produces a new status for the changed records (that is, if the status is overwritten in BI). In this case, any duplicate data records that arise from a security interval of this type have no effect in BI.

b. Choose the delta type for the data that you want to extract.

You use the delta type to determine how the extracted data is interpreted in BI and the data targets to which it can be updated.

With the Additive Delta delta type, the record to be loaded for cumulative key figures only returns the change to the respective key figure. The extracted data is added into BI. DataSources with this delta type can write data to DataStore objects and InfoCubes.

With the New Status for Changed Records delta type, each record to be loaded returns the new status for all key figures and characteristics. The values in BI are overwritten. DataSources with this delta type can write data to DataStore objects and master data tables.

c. Specify whether the DataSource supports real-time data acquisition.

4. Save your entries.

Delta transfer is now possible for this DataSource.

After the DataSource has been generated, you can see this on the DataSource: Edit Customer Version screen; the Delta Update field is selected.

Note

In systems as of Basis Release 4.0B, you can display the current value of the delta-relevant field in the delta queue.

Example of Determining Selection Intervals with Generic Delta

Security interval upper limit

The delta-relevant field is a timestamp.

The timestamp that was read last is 12:00:00. Delta extraction begins at 12:30:00. The security interval for the upper limit is 120 seconds. The selection interval for the delta request is: 12:00:00 to 12:28:00. When the extraction is finished, the pointer is set to 12:28:00.

Security interval lower limit

The delta-relevant field is a timestamp. After-images are transferred: in BI, the record is overwritten with the post-change status, for example, for master data. Any duplicate data records have no effect in BI.

The last read timestamp is 12:28:00. Delta extraction begins at 13:00. The security interval for the lower limit is 180 seconds. The selection interval for the delta request is: 12:25:00 to 13:00:00. When the extraction is finished, the pointer is set to 13:00:00.

Function Module: Interface Description and Procedure in BI

A description of the interface for a function module that is used for generic data extraction:

Import Parameters

I_DSOURCE TYPE SRSC_S_IF_SIMPLE-DSOURCE (DataSource)

I_INITFLAG TYPE SRSC_S_IF_SIMPLE-INITFLAG (initialization call)

I_MAXSIZE TYPE SRSC_S_IF_SIMPLE-MAXSIZE (package size)

I_REQUNR TYPE SRSC_S_IF_SIMPLE-REQUNR (request number)

Tables

I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT (selection criteria)

I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS (field list)

E_T_DATA (extracted data, typed with the extraction structure)

Exceptions

NO_MORE_DATA

ERROR_PASSED_TO_MESS_HANDLER

Details on Individual Parameters

I_INITFLAG

This parameter is set to 'X' when the function module is called for the first time; in all subsequent calls it is ' ' (space).

I_MAXSIZE

This parameter contains the number of lines expected within a read call.

Extraction Process

The function module is called repeatedly during an extraction process:

1. Initialization call:

Only the request parameters are transferred to the module in this call; no data may be returned yet.

2. First read call:

The extractor returns the data, typed with the extraction structure, in an interface table. The number of rows expected is specified by the request parameter I_MAXSIZE.

3. Second read call:

The extractor returns the data that follows the first data package, in a further package of up to I_MAXSIZE rows.

4. The system calls the function module repeatedly until the module raises the exception NO_MORE_DATA. No data may be transferred in the call that raises this exception.

Example

An example of a function module that meets these requirements is RSAX_BIW_GET_DATA_SIMPLE. A simple way of creating a syntactically correct module is to copy this module into a function group of your own and then to copy the rows of the top include of function group RSAX (LRSAXTOP) into the top include of your own function group. Afterwards, adapt the copied function module to your individual requirements.
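The following is a minimal sketch of such a module, modeled on the pattern of RSAX_BIW_GET_DATA_SIMPLE. The function name Z_BIW_GET_DATA_SKETCH and the source table ZSOURCE_TAB are hypothetical placeholders, and the conversion of the selections in I_T_SELECT into a WHERE condition is omitted for brevity; the delivered example module shows the complete pattern.

FUNCTION z_biw_get_data_sketch.
*  Interface as described above:
*    IMPORTING  VALUE(I_DSOURCE)  TYPE SRSC_S_IF_SIMPLE-DSOURCE
*               VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG
*               VALUE(I_MAXSIZE)  TYPE SRSC_S_IF_SIMPLE-MAXSIZE
*               VALUE(I_REQUNR)   TYPE SRSC_S_IF_SIMPLE-REQUNR
*    TABLES     I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT
*               I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS
*               E_T_DATA   STRUCTURE ZSOURCE_TAB
*    EXCEPTIONS NO_MORE_DATA
*               ERROR_PASSED_TO_MESS_HANDLER

  STATICS: s_cursor    TYPE cursor,  " database cursor, kept across calls
           s_opened(1) TYPE c.       " 'X' once the cursor has been opened

* Initialization call: only the request parameters are passed;
* no data may be returned in this call.
  IF i_initflag = 'X'.
    EXIT.
  ENDIF.

* First read call: open a cursor on the source table.
* (Converting I_T_SELECT into a WHERE condition is omitted here.)
  IF s_opened = space.
    OPEN CURSOR WITH HOLD s_cursor FOR
      SELECT * FROM zsource_tab.
    s_opened = 'X'.
  ENDIF.

* Every read call returns at most I_MAXSIZE rows in E_T_DATA.
  FETCH NEXT CURSOR s_cursor
    APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
    PACKAGE SIZE i_maxsize.

* When no rows are left, close the cursor and raise NO_MORE_DATA;
* no data is transferred in this final call.
  IF sy-subrc <> 0.
    CLOSE CURSOR s_cursor.
    s_opened = space.
    RAISE no_more_data.
  ENDIF.
ENDFUNCTION.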

Testing Extraction in BI

Use

You can use this function to test extraction from DataSources that were created using the maintenance for the generic DataSource. After the test extraction, you can display the extracted data and the associated logs.

Procedure

...

1. Choose DataSource ® Test Extraction.

A screen appears in which you can set parameters and selections for the test extraction.

2. For a test extraction using a function module, enter a request number.

3. Enter how many data records are to be read with each extractor call.

4. The extractor is called by the Service API until no more data is available. In the Display Extr. Calls field, you can specify the maximum number of times the extractor is to be called. This enables you to restrict the number of data packages when testing the extraction. With a real extraction, the system transfers data packages until it is no longer able to find any more data.

5. Depending on the definition of the DataSource, you can test the extraction in various update modes.

For DataSources that support the delta method, you can also test deltas and repeats as well as the full update.

The modes delta and repeat are only available for testing when the extractor supports a mode in which the system reads the data but does not modify the delta management status tables.

Caution

To avoid errors in BW, the timestamp or pointer that was set in delta management must not be changed during testing.

Before you can test the extraction in a delta mode in the source system, you need to have carried out an initialization of the delta method, or a simulation of such an initialization, for this DataSource.

You can test the transfer of an opening balance for non-cumulative values.

6. Specify selections for the test extraction.

Only those extraction structure fields that you flagged for selection in DataSource maintenance can be used for selections.

To enter several selections for a field, insert new rows for this field into the selection table.

7. Choose whether you want to execute the test extraction in debug mode or with an authorization trace.

If you test the extraction in the debug mode, a breakpoint is set just before the extractor initialization call.

For information on the debugger, see ABAP Workbench: Tools.

If you set an authorization trace, you can call it after the test by choosing Display Trace.

8. Start the extraction.

Result

If the extraction was successful, a message appears that specifies the number of extracted records. The buttons Display List, Display Log and Display Trace (optional) appear on the screen. You can use Display List to display the data packages. By double-clicking on the number of records for a data package, you get to a display of the data records. Choose Display Log to display the application log.

Extraction Using SAP Query in BI

SAP Query is a comprehensive tool for defining reports that supports many different forms of reporting. It allows users to define and execute their own evaluations of data in the SAP system without requiring ABAP programming knowledge.

To define the structure of evaluations, you enter texts in SAP Query and select fields and options. InfoSets and functional groups allow you to easily select the relevant fields.

An InfoSet is a special view of a set of data (logical database, table join, table, sequential file). It serves as the data source for SAP Query. An InfoSet determines which tables or fields of these tables are referenced in an evaluation. InfoSets are usually based on logical databases.

The maintenance of InfoSets is one component of SAP Query. When an InfoSet is created, a data source is selected in an application system. Since a data source can have a large number of fields, fields can be combined into logical units: the functional groups. Functional groups are groups of several fields that form a logical unit within an InfoSet. Any fields that you want to use in an extraction structure have to be assigned to a functional group. In generic data extraction using an InfoSet, all the fields of all functional groups for this InfoSet are available.

The relevance of SAP Query to BI lies in the definition of the extraction structure by selecting fields of a logical database, a table join or other datasets in an InfoSet. This allows you to use generic data extraction for master or transaction data from any InfoSet. A query is generated for an InfoSet. The query gets the data and transfers it to the generic extractor.

InfoSets represent an additional, easily manageable data source for generic data extraction. They allow you to use logical databases from all SAP applications, table joins, and further datasets as data sources for BI. For more information about SAP Query, and InfoSets in particular, see the SAP Query documentation under System Administration.

In the following section, the terms SAP Query and InfoSet are used independently of the source system release. Depending on the source system release, SAP Query is the same as an ABAP Query or ABAP/4 query. The InfoSet is also called a functional area in some source system releases.

Notes on Extraction Using SAP Query in BI

Client Dependency

InfoSets are only available if you have created them globally, independent of a client. You set this global area in the initial screen of InfoSet maintenance under Environment → Work Areas.

Size Limits When Extracting Data Using SAP Query InfoSets

If you are using an InfoSet to extract data, the system first collects all data in the main memory. The data is transferred to the BI system in packages using the Service API interface. The size of the main memory is therefore important with this type of extraction. It is suitable for limited datasets only.

As of SAP Web Application Server 6.10, you can extract mass data using certain InfoSets (tables or table joins).

See also:

Extraction Using SAP Query

Editing DataSources and Application Component Hierarchies in BI

Use

To adapt existing DataSources to your requirements, you can edit them in this step before transporting them from a test system into a productive system.

In this step you can also postprocess the application component hierarchy.

Procedure

DataSource

Transporting DataSources

Select the DataSources that you want to transport from the test system into the productive system and choose Transport. Specify a development class and a transport request so that the DataSources can be transported.

Maintaining DataSources

To maintain a DataSource, select it and choose Maintain DataSource. The following editing options are available:

Selection

When you schedule a data request in the BI scheduler, you can enter the selection criteria for the data transfer. For example, you can determine that data requests are only to apply to data from the previous month.

If you set the Selection indicator for a field within the extraction structure, the data for this field is transferred in correspondence with the selection criteria in the scheduler.

Hide field

You set this indicator to exclude an extraction structure field from the data transfer. The field is no longer available in BI when you set the transfer rules or generate the transfer structure.

Inversion

Reverse postings are possible for customer-defined key figures. Therefore inversion is only active for certain transaction data DataSources. These include DataSources that have a field that is marked as an inversion field, for example, the update mode field in DataSource 0FI_AP_3. If this field has a value, the data records are interpreted as reverse records in BI.

Set the Inversion indicator if you want to carry out a reverse posting for a customer-defined field (key figure). The value of the key figure is transferred to BI in inverted form (multiplied by –1).

Field only known in exit

You can enhance data by extending the extraction structure for a DataSource by adding fields in append structures.

The Field Only Known in Exit indicator is set for the fields of an append structure; by default these fields are not passed to the extractor from the field list and selection table.

Deselect the Field Only Known in Exit indicator to enable the BI Service API to pass on the append structure field to the extractor together with the fields of the delivered extraction structures in the field list and in the selection table.

Enhancing the extraction structure

If you want to transfer additional information for an existing DataSource from a source system into BI, you first need to enhance the DataSource extraction structure by adding fields. To do this, create an append structure for the extraction structure (see Adding Append Structures).

1. Choose Enhance Extr. Str. to access field maintenance for the append structure. The name of the append structure is derived from the name of the extraction structure and lies in the customer namespace.

2. In the field list, enter the fields you want to add together with the data elements they are based on. You can use all the functions that are available for maintaining fields of tables and structures.

3. Save and activate your append.

4. Go back to the DataSource display and make sure that the Hide Field indicator is not selected for the newly added fields.

Function enhancement

To fill the append structure fields with data, you need to create a customer-specific function module. For information about enhancing the SAP standard with customer-specific function modules, see Enhancing the SAP Standard in SAP Library.

The SAP enhancement RSAP0001 is available for enhancing BI DataSources. This enhancement contains the following enhancement components (a minimal sketch of the transaction data exit follows after this list):

Transaction data

EXIT_SAPLRSAP_001

Master data attributes

EXIT_SAPLRSAP_002

Texts

EXIT_SAPLRSAP_003

Hierarchies

EXIT_SAPLRSAP_004

For more information, see Enhancing DataSources.
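As an illustration of the customer exit route, the following is a minimal sketch of the customer include ZXRSAU01, which is processed by EXIT_SAPLRSAP_001 for transaction data. The DataSource ZMY_DATASOURCE, its extraction structure ZOE_EXTRACT with the append field ZZMYFIELD, and the lookup table ZLOOKUP are hypothetical placeholders.

* Include ZXRSAU01: fill append structure fields for transaction data.
DATA: ls_data TYPE zoe_extract.  " extraction structure incl. append field

CASE i_datasource.
  WHEN 'ZMY_DATASOURCE'.
    LOOP AT c_t_data INTO ls_data.
*     Look up the value for the append field (hypothetical logic).
      SELECT SINGLE zzvalue FROM zlookup
        INTO ls_data-zzmyfield
        WHERE keyfld = ls_data-keyfld.
      MODIFY c_t_data FROM ls_data INDEX sy-tabix.
    ENDLOOP.
ENDCASE.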

As of Release 6.0, the Business Add-In (BAdI) RSU5_SAPI_BADI is available. You can display the BAdI documentation in the BAdI definition or BAdI implementation.

Application Component Hierarchy

To create a same-level or lower-level node for a particular node, place the cursor over this node and choose Object → Create Node. You can also create lower-level nodes by choosing Object → Create Children.

To rename, expand, or compress a node, place your cursor over the node and click on the appropriate button.

To move a node or subtree, select the node you want to move (by positioning the cursor over it and choosing Select Subtree), then position the cursor on the node under which you want to place it and choose Reassign.

If you select a node with the cursor and choose Set Segment, this node is displayed with its subnodes. You can go to the higher-level nodes for this subtree using the appropriate links in the row above the subtree.

If you select a node with the cursor and choose Position, the node is displayed in the first row of the view.

All DataSources for which a valid (assigned) application component could not be found are placed under the node NODESNOTCONNECTED. The node and its subnodes are only built at transaction runtime and are refreshed when the display is saved.

NODESNOTCONNECTED is not persistently saved to the database and is therefore not transferred in a particular state to other systems when you transport the application component hierarchy.

Note: Hierarchy nodes created under NODESNOTCONNECTED are lost when you save. After you save, the system only displays those nodes under NODESNOTCONNECTED that were moved to this node with DataSources.

Example: A DataSource is positioned under application component X. You transfer a new application component hierarchy from BI Content that does not contain application component X. In this case, the DataSource is automatically placed under the node NODESNOTCONNECTED.

Note: Changes to the application component hierarchy only apply until BI Content is installed again.

SAP Query: Assignment to a User Group in BI

If you want to extract your data from an InfoSet, the InfoSet must be assigned to a user group before the DataSource can be generated. This is necessary as the extraction is processed from an InfoSet using a query that comprises all fields of the InfoSet. In turn, this query can only be generated when the InfoSet is assigned to a user group.

Releases up to 3.1I

In releases up to 3.1I, a screen appears in which you have to specify a user group as well as a query name. The user group must be specified using the input help; in other words, it must already have been created. For more information about creating user groups, see the SAP Query documentation, in the section System Management → Functions for Managing User Groups.

A separate query is required for an InfoSet each time it is used in a DataSource. For this reason, enter a query name that does not yet exist in the system.

The query is generated after you confirm your entries.

Releases from 4.0A

In releases as of 4.0A, the InfoSet for the extraction structure of the new DataSource is automatically assigned to a user group generated by the system. A query is generated automatically by the system.


Sunday, February 8, 2009

BI DataStore Objects

Definition

A DataStore object serves as a storage location for consolidated and cleansed transaction data or master data on a document (atomic) level.

This data can be evaluated using a BEx query.

A DataStore object contains key fields (for example, document number and item) and data fields that, in addition to key figures, can also contain character fields (for example, order status, customer). The data from a DataStore object can be updated with a delta update into InfoCubes and/or other DataStore objects or master data tables (attributes or texts) in the same system or across different systems.

Unlike multidimensional data storage using InfoCubes, the data in DataStore objects is stored in transparent, flat database tables. The system does not create fact tables or dimension tables.

Use

The cumulative update of key figures is supported for DataStore objects, just as it is with InfoCubes, but with DataStore objects it is also possible to overwrite data fields. This is particularly important with document-related structures. If documents are changed in the source system, these changes include both numeric fields, such as the order quantity, and non-numeric fields, such as the ship-to party, status, and delivery date. To reproduce these changes in the DataStore objects in the BI system, you have to overwrite the relevant fields in the DataStore objects and set them to the current value. Furthermore, you can use overwriting together with the existing change log to make a source delta-enabled. This means that the delta that is further updated to InfoCubes, for example, is calculated from two successive after-images.
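For example, if an order item is first loaded with an order quantity of 10 and a subsequent document change delivers an after-image with a quantity of 12, the DataStore object overwrites the field. From the two successive after-images, the change log records a before-image of -10 and an after-image of +12, so that a delta update passes the net change of +2 on to a connected InfoCube.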

There are different types of DataStore object:


Standard (see Standard DataStore Objects)

For direct update (see DataStore Objects for Direct Update)

Write-optimized (see Write-Optimized DataStore Objects)

Differences between the DataStore object types:

Type | Structure | Data Supply | SID Generation | BEx Queries Possible?
Standard | Three tables: activation queue, table of active data, change log | From data transfer process | Yes | Yes
For direct update | Table of active data only | From APIs | No | Yes
Write-optimized | Table of active data only | From data transfer process | No | Yes

Integration

Integration with the Data Warehousing Workbench - Modeling
Metadata

DataStore objects are fully integrated with BI metadata. They are transported just like InfoCubes and are installed from BI Content (for more information, see Installing Business Content). DataStore objects are grouped with InfoCubes in the InfoProvider view of the Data Warehousing Workbench - Modeling, and are displayed in a tree. They also appear in the data flow display.

Update

Transformation rules define the rules that are used to write data to a DataStore object. They are very similar to the transformation rules for InfoCubes. The main difference is the behavior of data fields in the update. When you update requests into a DataStore object, you have an overwrite option as well as an addition option.

See also Transformation Type.

The delta process that is defined for the DataSource also influences the update. When loading files, the user must select a suitable delta process so that the correct transformation type is used.

Unit fields and currency fields operate just like normal key figures, meaning that they must be explicitly filled using a rule.

Scheduling and Monitoring

The processes for scheduling the data transfer process for updating data into InfoCubes and DataStore objects are identical.

It is also possible to schedule the activation of DataStore object data and the update from the DataStore object into the related InfoCubes or DataStore objects.

The individual steps, including processing the DataStore object, are logged in the monitor.

There is a separate detailed monitor for executed request operations (such as activation or rollback).

Loadable DataSources

In full-update mode, every transaction data DataSource can be updated into a DataStore object. In delta-update mode, only DataSources that are flagged as delta-enabled can be updated.

BW Scenario for SAP DataStore Objects

The diagram below shows how DataStore objects are used in this example of updating order and delivery information, and the status tracking of orders, meaning which orders are open, which are partially-delivered, and so on.

(Diagram: the data flows from the PSA through two levels of DataStore objects into an InfoCube.)

There are three main steps to the entire data process:


1. Loading the data into the BI system and storing it in the PSA

The data requested by the BI system is stored initially in the PSA. A PSA is created for each DataSource and each source system. The PSA is the storage location for incoming data in the BI system. The requested data is saved there unchanged, exactly as it was delivered by the source system.

2. Processing and storing the data in DataStore objects

In the second step, the DataStore objects are used on two different levels.

a. On level one, the data from multiple source systems is stored in DataStore objects. Transformation rules permit you to store the consolidated and cleansed data in the technical format of the BI system. On level one, the data is stored on the document level (for example, orders and deliveries) and constitutes the consolidated database for further processing in the BI system. Data analysis is therefore not usually performed on the DataStore objects at this level.

b. On level two, transformation rules subsequently combine the data from several DataStore objects into a single DataStore object in accordance with business-related criteria. The data is very detailed; for example, information such as the delivery quantity, the delivery delay in days, and the order status are calculated and stored per order item. Level two is used specifically for operative analysis issues, for example, determining which orders from the last week are still open. Unlike multidimensional analysis, where very large quantities of data are selected, here data is displayed and analyzed selectively.

3. Storing the data in the InfoCube

In the final step, the data is aggregated from the level-two DataStore object into an InfoCube. In this scenario, this means that the InfoCube does not contain the order number; instead, it saves the data, for example, at the level of customer, product, and month. Multidimensional analysis is performed on this data using a BEx query. Whenever you need to, you can also display the detailed document data from the DataStore object: use the report/report interface from a BEx query. In this way, you can analyze the aggregated data from the InfoCube and target the specific level of detail that you want to access in the data.

SAP Standard BI DataStore Object

Use

The standard DataStore object is filled with data during the extraction and load process in the BI system.

Structure

A standard DataStore object is represented on the database by three transparent tables:

Activation queue: Serves to save DataStore object data records that are to be updated, but that have not yet been activated. The data is deleted after the records have been activated.

Active data: A table containing the active data (A table).

Change log: Contains the change history for the delta update from the DataStore object into other data targets, such as DataStore objects or InfoCubes.

The tables of active data are built according to the DataStore object definition. This means that key fields and data fields are specified when the DataStore object is defined. The activation queue and the change log are almost identical in structure: the activation queue has the SID, package ID, and record number as its key; the change log has the request ID, package ID, and record number as its key.
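As a point of orientation (the exact names are generated from the object name and can vary by release): on the database, the table of active data and the activation queue are created in the /BIC/A* namespace (or /BI0/A* for SAP-delivered objects), while the change log is stored as a PSA table in the /BIC/B* namespace.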


The various tables of the DataStore object work together during the data load as follows:

Data can be loaded efficiently from several source systems simultaneously because a queuing mechanism enables parallel INSERTs. The key allows records to be labeled consistently in the activation queue.

The data arrives in the change log from the activation queue and is written to the table for active data upon activation. During activation, the requests are sorted according to their logical keys. This ensures that the data is updated to the table for active data in the correct request sequence.

See also: Example of Activating and Updating Data.

DataStore Objects for Direct Update

Definition

The DataStore object for direct update differs from the standard DataStore object in terms of how the data is processed. In a standard DataStore object, data is stored in different versions (active, delta, modified), whereas a DataStore object for direct update contains data in a single version. Therefore, data is stored in precisely the same form in which it was written to the DataStore object for direct update by the application. In the BI system, you can use a DataStore object for direct update as a data target for an analysis process. For more information, see Analysis Process Designer.

The DataStore object for direct update is also required by various applications, such as SAP Strategic Enterprise Management (SEM), as well as by other external applications.

Use

DataStore objects for direct update ensure that the data is available quickly. The data from this kind of DataStore object is accessed transactionally. The data is written to the DataStore object (possibly by several users at the same time) and reread as soon as possible.

It is not a replacement for the standard DataStore object. It is an additional function that can be used in special application contexts.

Structure

The DataStore object for direct update consists of a table for active data only. It retrieves its data from external systems via fill or delete APIs. See DataStore Data and External Applications.

The load process is not supported by the BI system. The advantage of this structure is that the data is easy to access: it is available for analysis and reporting immediately after it is written.


Creating DataStore Objects for Direct Update

When you create a DataStore object, you can change the DataStore object type under Settings in the context menu. The default setting is Standard. You can only switch between DataStore object types standard and direct update if data does not yet exist in the DataStore object.

Integration

Since DataStore objects for direct update cannot be filled with BI data using staging (DataSources do not provide the data), they are not displayed in the scheduler or in the monitor. However, you can update the data in DataStore objects for direct update to additional InfoProviders.

If you switch a standard DataStore object that already has update rules to direct update, the update rules are set to inactive and can no longer be processed.

Since no change log is generated, you cannot perform a delta update to the InfoProviders at the end of this process.

The DataStore object for direct update is available as an InfoProvider in the BEx Query Designer and can be used for analysis purposes.
