Archive for July, 2013

How to know everything about BW Extractors

July 25th, 2013

An extractor, in simple terms, is a program used to extract data from various sources into BW.
For this purpose we have SAP pre-defined extractors (LO extraction, etc.) and customized extractors (generic extractors).

Application-specific BW content extractors:
LO Extraction:
Logistics refers to the process of getting a product or service to its desired location upon request, which involves transportation, purchasing, warehousing, etc.

Main areas in logistics are:
Sales and Distribution (SD) : application 11, 13, 08 (in LBWE T-code)
Materials Management (MM) : application 03, 02
Logistics Execution (LE) : application 12
Quality Management : application 05
Plant Maintenance (PM) : application 04, 17
Customer Service (CS) : application 18
Project System (PS) : application 20
SAP Retail : application 40, 43, 44, 45

How does the data extraction happen?

Extraction can be done using either a full update or a delta update.

Full load: In the case of logistics applications, a full load/initialization extracts the data from the setup tables (which contain only historical data).

For a full update the data is taken from the setup tables, so to capture changes you would have to refill the setup tables every time, which is a laborious task.
It is therefore advisable to use delta loads, which make loading much easier.

Read the note below for details on delta loads:

Initialization: Data is fetched from the application tables into the setup tables (in LO extraction, the extractor does not allow direct communication with the application tables); from there, the data finally reaches the target (InfoCube/ODS). Remember that this is a one-time process.

Pre-requisites: Prior to initialization make sure the following steps are completed:

1. Maintain Extract Structure
2. Maintain data sources
3. Activate Extract Structure
4. Delete Setup tables
5. Fill setup tables

Delta load: After a successful initialization, we can use delta updates to capture the changed/new records.
Once a new transaction happens or an existing record is modified, it is written to the respective application table upon saving.
Pre-requisites: Prior to delta loads make sure the following steps are completed:
1. Define periodic V3 update jobs. 2. Set up the update mode (direct/queued/unserialized V3 update).
LO Delta Mode:
The InfoObject 0RECORDMODE helps in identifying the delta.
Check the field "DELTA" in the ROOSOURCE/RODELTAM tables.
In the case of LO extraction it is "ABR".
ABR: An after image shows the status after the change; a before image shows the status before the change with a negative sign; and a reverse image also shows a negative sign next to the record, marking it for deletion. This method serializes the delta packets and supports updates into an ODS object as well as an InfoCube.
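The ABR image logic above can be sketched in Python. This is illustrative only: the record-mode values '' (after image), 'X' (before image) and 'R' (reverse image) follow the standard BW 0RECORDMODE conventions, but the function and field names are invented.

```python
# Hypothetical sketch of the ABR delta logic (not actual SAP code).
# Key figures in a before/reverse image carry a negative sign, so adding
# all images into a cube nets out to the correct new value.

def abr_images(old_record, new_record):
    """Return the delta images for one record.

    old_record/new_record are dicts of key figures; None means the
    record did not exist before (new) or no longer exists (deleted).
    """
    images = []
    if old_record is not None and new_record is not None:
        # Change: before image (negated old values) + after image (new values)
        images.append({"0RECORDMODE": "X", **{k: -v for k, v in old_record.items()}})
        images.append({"0RECORDMODE": "", **new_record})
    elif old_record is None:
        # New record: after image only
        images.append({"0RECORDMODE": "", **new_record})
    else:
        # Deletion: reverse image (negated old values)
        images.append({"0RECORDMODE": "R", **{k: -v for k, v in old_record.items()}})
    return images

# An order quantity changed from 10 to 15:
imgs = abr_images({"quantity": 10}, {"quantity": 15})
```

Summing the images into an additive target yields -10 + 15 = +5, the correct net change, which is why ABR can feed an InfoCube directly.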
FI extraction:
FI Module deals with accounting and financial needs of an organization.
Financial Accounting is broken down into the following sub-modules:
• Accounts Receivables
• Accounts Payable
• Asset Accounting
• Bank Accounting
• Consolidation
• Funds Management
• General Ledger
• Special Purpose Ledger
• Travel Management
Note: Only the key areas (AP/AR/GL/SL) are discussed briefly because of the complexity of the area.
We can extract the financial data at totals level / line item level.
In general, we will use R/3 line item tables as the data source for extracting the data to allow drill down capability from summarized data to line-item details.
Financial Accounting data can be extracted directly from the tables.
Depending on the business requirement we can use either FI-SL or standard BW content extractors (FI-AR, FI-AP, and FI-GL) to fetch FI data.
FI-AR, FI-AP, and FI-GL:
General Ledger: All accounting postings will be recorded in General Ledger. These postings are real time to provide up-to-date visibility of the financial accounts.
Account Receivable: Accounts Receivables record all account postings generated as a result of Customer sales activity. These postings are automatically updated in the General Ledger
Accounts Payable: Accounts Payables record all account postings generated as a result of Vendor purchasing activity. Automatic postings are generated in the General Ledger as well.
Standard FI data sources:
0FI_GL_4 (G/L Accounts- line items)
Takes the data from the FI document tables (BKPF/BSEG) that are relevant to general ledger accounting (compare table BSIS).
0FI_AP_4 (AP - line items) and 0FI_AR_4 (AR - line items)
Selections are made from tables BSID/BSAD (Accounts Receivable) and BSIK/BSAK (Accounts Payable)
How does the data extraction happen?
In FI extraction, 0FI_AR_4 and 0FI_AP_4 are linked with 0FI_GL_4 in order to maintain a consistent data transfer from the OLTP system (this is called coupled data extraction; ref. OSS note 428571).
Note: "Uncoupled" extraction is possible with Plug-In PI 2002.2; see OSS note 551044.
0FI_GL_4 writes the entries into the time stamp table BWOM2_TIMEST in the SAP R/3 system with a new upper limit for the time stamp selection.
0FI_AP_4 and 0FI_AR_4 then copy this new upper limit for the time stamp selection during their next data extraction in the SAP R/3 system. This ensures the proper synchronization of accounts payable and accounts receivable accounting with respect to G/L accounting.
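The coupled time stamp handover can be sketched as a toy model (invented names; only the handover of the upper limit from 0FI_GL_4 to the AP/AR DataSources mirrors the behavior described above):

```python
# Illustrative sketch (not SAP code) of the coupled time stamp logic:
# 0FI_GL_4 sets a new upper limit, and 0FI_AP_4 / 0FI_AR_4 reuse that
# upper limit on their next extraction so all three stay in sync.

timest = {}  # stands in for table BWOM2_TIMEST: datasource -> (low, high)

def run_gl_delta(now):
    """0FI_GL_4 extraction: select from the last high up to 'now'."""
    low = timest.get("0FI_GL_4", (None, 0))[1]
    timest["0FI_GL_4"] = (low, now)          # new upper limit for the selection
    return low, now

def run_ap_or_ar_delta(source):
    """0FI_AP_4 / 0FI_AR_4 copy the upper limit written by 0FI_GL_4."""
    gl_high = timest["0FI_GL_4"][1]
    low = timest.get(source, (None, 0))[1]
    timest[source] = (low, gl_high)
    return low, gl_high

run_gl_delta(1000)
interval = run_ap_or_ar_delta("0FI_AP_4")   # selects up to the G/L upper limit
```

The design point: because AP/AR never advance past the G/L upper limit, subledger and G/L extractions always cover the same posting window.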
Full load: Not a valid choice because of the large volume of detailed R/3 transaction data.
Delta load:
Note: The delta identification process works differently for new financial records and for changed financial records.
New financial accounting line items posted in the SAP R/3 system are identified by the extractor using the time stamp in the document header (table BKPF, field CPUDT).
By scheduling an initialization InfoPackage, all the historical data can be loaded into BW from the application tables. This also sets the "X" indicator in the field LAST_TS (flag: "X" = last time stamp interval of the delta extraction), which marks the most recent extraction interval.

OLTPSOURCE   AEDAT/AETIM         UPD    DATE_LOW      DATE_HIGH     LAST_TS
0FI_GL_4     16 May 2013/21:05   Init   01 Jan 1990   15 May 2013
0FI_GL_4     24 May 2013/17:30   Delta  16 May 2013   23 May 2013
0FI_GL_4     21 Jun 2013/18:12   Delta  15 Jun 2013   20 Jun 2013   X
0FI_AP_4     18 May 2013/20:14   Init   01 Jan 1990   15 May 2013

After this, daily delta loads can be carried out based on the time stamp by scheduling delta InfoPackages.
During the delta load, the SAP R/3 system logs two time stamps that delimit a selection interval for a DataSource in table BWOM2_TIMEST (fields TS_LOW and TS_HIGH).
In the case of changed FI documents, selections are based on the tables
BWFI_AEDAT and (the time stamp table) BWOM2_TIMEST (see OSS note 401646 for more details).
Delta extraction using the delta queue method is also possible if we want:
• Serialization of the records
• To distribute delta records to multiple BW systems.
FI Delta Mode:
A time stamp on the line items serves to identify the status of the delta. Time stamp intervals that have already been read are stored in a time stamp table (BWOM2_TIMEST).
(The InfoObject 0RECORDMODE plays a vital role in deciding deltas. Check the field "DELTA" in the ROOSOURCE/RODELTAM tables to identify the image.)
The financial accounting line items are extracted from the SAP R/3 system in their most recent status (after-image delta method).
AIE: This delta method is not suitable for filling InfoCubes directly in the BW system. The line items must therefore first be loaded into an ODS object in the BW system, which identifies the changes made to individual characteristics and key figures within a delta data record. Other data targets (InfoCubes) can then be supplied with data from this ODS object.
It uses delta type E (pull), which means the delta data records are determined during the delta update by the DataSource extractor, written to the delta queue, and passed on to BI directly from there.
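Why the after-image method needs the ODS layer can be illustrated with a small sketch (hypothetical names, not SAP code): the ODS active table overwrites by key, while its change log emits the before/after pairs that an additive InfoCube needs.

```python
# Minimal sketch of the ODS role for after images (AIE): the ODS
# overwrites each document key with its latest status, and the change
# log turns every new after image into a before/after pair for cubes.

ods = {}         # active table: document key -> key figure
change_log = []  # what would be forwarded to InfoCubes

def post_after_image(doc_key, amount):
    old = ods.get(doc_key)
    if old is not None:
        change_log.append((doc_key, -old))   # before image, negated
    change_log.append((doc_key, amount))     # after image
    ods[doc_key] = amount                    # overwrite with the latest status

post_after_image("4711", 100.0)   # original posting
post_after_image("4711", 120.0)   # changed line item, most recent status
```

Loading the raw after images 100 and 120 straight into a cube would sum to 220; routed through the ODS change log the cube receives 100 - 100 + 120 = 120, the correct latest value.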

CRM extraction:
Customer relationship management (CRM) is broadly about managing the relationships with customers, and is useful to analyze customer, vendor, partner, and internal process information.
How does the data extraction happen?
We can do both full loads and delta loads, depending on the CRM extractor's behavior.
Initialization:
During the initialization, all data that can be extracted using a data source is transferred from SAP CRM into SAP BW.
• Execute the initialization of the delta process in SAP BW by creating and scheduling an Info Package.
• SAP BW calls up the BW Adapter using the Service API.
• The BW Adapter reads the data from the respective database.
• The selected BDoc data is converted into the extract structure by a mapping module that is also stored in the BW Adapter metadata.
• The type of Business Add-In (BAdI) called up by the BW Adapter depends on the BDoc type.
• The requested data package is transferred to SAP BW using the Service API.
• Any new postings or updates to existing postings on the source system (CRM) side are communicated via Middleware in the form of a BDoc.
• The flow controller for Middleware calls up the BW Adapter.
• The BW Adapter first checks whether the change communicated via the BDoc is relevant for SAP BW. A change is relevant if a Data Source for the BDoc is active.
• If the change is not relevant, it is not transferred to SAP BW and the process is complete.
• If it is relevant, the BW Adapter calls up the corresponding mapping module and BAdI (the type of BAdI that needs to be called up depends in turn on the type of BDoc).
• Finally, these convert the BDoc data into the extract structure.
Note: The mapping module and the BAdIs called up during the delta upload are the same as those called up during the initialization of the delta process.
The change is transferred to SAP BW using the Service API.
CRM Delta Mode:
The delta is identified and communicated via Middleware, in the form of a BDoc, to the BW Adapter.
CRM standard DataSources support the AIMD delta method (after images with deletion indicator, via the delta queue).
HR extraction:
The HR module enables customers to effectively manage information about the people in their organization, and to integrate that information with other SAP modules and external systems
HR broadly has the following modules:
PA (Personnel Administration) and Organization Management
Personnel Development
Payroll Accounting
Time Management
Compensation
Benefits
Training and Events
The Personnel Administration (PA) sub module helps employers to track employee master data, work schedules, salary and benefits information. Personnel Development (PD) functionality focuses on employees’ skills, qualifications and career plans. Finally, the Time Evaluation and Payroll sub modules process attendance and absences, gross salary and tax calculations, and payments to employees and third-party vendors
HR delivers a rich set of business content objects that covers all HR sub-functional areas.
How does the data extraction happen?
Before getting into how the data gets populated into HR InfoCubes,
let's understand the term infotype:
"An infotype is a collection of logical and/or business-related characteristics of an object or person."
Here the data is extracted from infotypes (PA, PD, time management, etc.), and for a few other applications from cluster tables (payroll, compensation, etc.).
HR is basically master-data centric because it always relates to people-related InfoObjects, such as Employee and Person. In most cases HR master data is defined as time-dependent to enable historical evaluation. The HR R/3 system records a specific validity period for each infotype.
Procedure to extract the HR data:
• Activate DataSources in the source system (R/3).
• Replicate DataSources in the BW system.
• Activate business content in BW.
• Populate HR cubes with data by scheduling InfoPackages.
Note: Master data should be loaded first.
Except for payroll and time management, all other sub-functional areas support only full loads.
In the case of full loads, old data needs to be deleted to avoid duplicate records in the target.
Application-specific, customer-generated extractors:
Controlling:
Controlling is broken down into following sub modules:
• Cost Element Accounting
• Cost Center Accounting
• Internal Orders
• Activity-Based Costing ( ABC)
• Product Cost Controlling
• Profitability Analysis
• Profit Center Accounting
Note: Only discussing (CO-PA) briefly because of the complexity of the area.
CO-PA:
Profitability analysis allows Management to review information with respect to the company’s profit or contribution margin by business segment.
It can be obtained by the following methods:
• Account-Based Analysis
• Cost-Based Analysis
Note: The details will be discussed after the CO-PA data flow is understood.
How does the data extraction happen?
When data is requested from SAP BW, the extractor determines which data source the data is to be read from. This depends on:
• the update mode (full, initialization of the delta method, or delta update)
• the definition of the DataSource (line item characteristics (apart from field REC_WAERS) or calculated key figures)
• the available summarization levels.
The extractor always tries to select the most suitable data source, that is, the one with the smallest data volume.
Once an InfoPackage is executed, the SAP BW Staging Engine calls the CO-PA transaction data interface. The CO-PA extraction program for SAP BW uses the same replication method as the update program that updates the CO-PA summarization levels. On the BW side, only data that is at least 30 minutes old is received. This secures data integrity, because the time stamps from different application servers can differ slightly.
This 30-minute retention period is often described as a "safety delta": the system only extracts data that is at least 30 minutes old.
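The safety delta selection can be sketched as follows (illustrative only; the record layout and function names are invented):

```python
# Sketch of a timestamp delta with a 30-minute "safety delta":
# only records older than 30 minutes are extracted, so clock skew
# between application servers cannot cause records to be skipped.

from datetime import datetime, timedelta

SAFETY_DELTA = timedelta(minutes=30)

def select_delta_records(records, last_upper_limit, now):
    """Pick records created after the previous upper limit but
    at least 30 minutes before 'now'."""
    upper = now - SAFETY_DELTA
    picked = [r for r in records if last_upper_limit < r["ts"] <= upper]
    return picked, upper   # 'upper' becomes the next run's lower bound

now = datetime(2013, 7, 25, 12, 0)
records = [
    {"id": 1, "ts": datetime(2013, 7, 25, 11, 0)},   # old enough: extracted
    {"id": 2, "ts": datetime(2013, 7, 25, 11, 45)},  # too fresh: waits for the next run
]
picked, new_limit = select_delta_records(records, datetime(2013, 7, 25, 0, 0), now)
```

The record posted at 11:45 is simply picked up by the next delta run, so nothing is lost; the only cost is a 30-minute latency.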
Account-Based Analysis
For account-based CO-PA extraction, only full update from summarization levels is supported in releases up to and including PI2001.1.
In this case we can carry out a delta using the pseudo-delta technique: do a selective full load based on selection conditions (e.g. fiscal period), then selectively drop the requests for the last period and reload the data that has changed.
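The pseudo-delta idea can be put roughly into code (hypothetical names; a real implementation would drop and reload BW requests rather than Python dicts):

```python
# Sketch of a pseudo delta: drop the requests for the open fiscal
# period and reload only that period with a selective full load.

loaded_requests = {}   # fiscal_period -> records currently in the target

def pseudo_delta(source, period):
    """Selective full load for one fiscal period."""
    loaded_requests.pop(period, None)   # drop the old requests for the period
    loaded_requests[period] = [r for r in source if r["period"] == period]

source = [
    {"period": "2013-06", "amount": 10},
    {"period": "2013-07", "amount": 20},
    {"period": "2013-07", "amount": 25},   # late posting in the open period
]
pseudo_delta(source, "2013-07")   # only the open period is reloaded
```

Closed periods are never touched, which keeps the load volume small even though no true delta mechanism exists.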
From Release PI2001.2 onwards, the delta method can also be used.
Initialization: The initialization must be performed from a summarization level.
Delta update: The delta is read from line items.
During the delta load, the controlling area and fiscal period fields should be mandatory selections.
Note: If the data needs to be read from a summarization level, the level must also contain all the characteristics that are to be extracted using the DataSource (entry * in maintenance transaction KEDV). Furthermore, the summarization level must have status ACTIVE.
Account-based CO-PA is part of the CO module. This means the data posted in account-based CO-PA is always in sync with the CO module (CCA, OPA, PA, PS, etc.).
The CO tables are COEP, COBK (for line items) COSS and COSP (for the totals).
Cost-Based Analysis:
In the case of costing-based CO-PA, data can only be read from a summarization level if no characteristics of the line item are selected apart from the Record Currency (REC_WAERS) field, which is always selected.
An extraction from the segment level, that is, from the combination of the tables CE3XXXX / CE4XXXX (where XXXX stands for the operating concern), is only performed for Full Updates if no line item characteristics are selected (as with summarization levels).
Initialization: There are two possible sources for the initialization of the delta method: summarization levels (if no characteristics of the line item are selected) and the line item level.
In the case of a summarization level, the system also records when the level was last updated/built.
If it is not possible to read data from a summarization level, the data is read from line items instead.
Delta update: Data is always read from line items.
Costing-based CO-PA data is statistical data. This means that what is updated in CO-PA is not always equal to what is stored in the CO modules or in Finance. The cost element is also not always updated, and additional key figures are used to store information about the type of costs or revenues.
To understand the various tables (CE1/CE2/CE3/CE4) involved in CO-PA extraction, please read "BW data extraction".
CO-PA Delta Mode:
Extraction is based on Timestamp.
When data is extracted from CO-PA, a “safety delta” of half an hour is used with the initialization and the delta upload. This always ensures that only records that are already half an hour old since the start of the upload are loaded into SAP BW. Half an hour was chosen as the safety delta to overcome any time differences between the clocks on the different application servers.
FI-SL:
There are two types of ledgers in the FI-SL system:
Standard ledgers: Delivered by SAP, e.g. General Ledger Accounting (FI-GL).
Special purpose ledgers: These are designed as per business needs (user-defined, e.g. FI-SL).
The FI-SL DataSource can supply the data both at totals record level and at line item level.
How does the data extraction happen?
Prerequisite:
Since FI-SL is a generating application, the DataSource, transfer structure, and assignment of the DataSource to the InfoSource must be created manually.
FI-SL line items:
Line item Data Source provides actual data at line item level.
Full and delta mode: FI-SL line items can be extracted in both full and delta upload mode. The time stamp (TIMESTAMP field in the extract structure) is used to identify the delta, supplied from the CPUDT and CPUTM fields in the line item table. It uses the safety delta concept, set to one hour; this means posted line items can be loaded into BW an hour after posting.
Constraint:
The extract structure does not contain the BALANCE field. Refer note 577644 to find out alternative ways to populate this field.
FI-SL Totals Records:
This DataSource can provide both actual and plan data at totals record level.
Full update: The full update DataSource can be used to determine the balance carried forward, since the line items DataSource does not supply this.
Usually plan data is transferred using the totals DataSource in full update mode.
Delta update: The delta method can only be used for actual data with the selection 0VTYPE = 010. The delta method is based on delta queue technology: after initialization, the relevant data is posted to the delta queue during updating.
Before running the delta, please check the restrictions documented for the Special Ledger delta.
Part 3: Cross-application generic extractors
When none of the SAP pre-defined extractors meets the business demand, the choice is to go for generic extraction.
We go for generic extraction when:
1. Business content does not include a DataSource for your application.
2. Business content requires additional enhancements that need data not supplied by SAP BW.
3. The application does not feature its own generic data extraction method.
4. The requirement demands using your own programs to fill your tables in SAP systems.
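A generic delta is typically driven by a delta-relevant field (a time stamp, calendar day, or numeric pointer) plus an optional safety interval. A rough sketch, with invented names:

```python
# Sketch of a generic delta on a timestamp field: the delta pointer
# stores the last value read, and a safety interval re-reads a small
# overlap so late postings are not lost.

delta_pointer = 0     # last timestamp handed over to BW
SAFETY_LOWER = 60     # seconds of overlap (the "safety interval")

def generic_delta(table, now):
    """Return rows newer than the pointer (minus the safety interval)."""
    global delta_pointer
    low = max(delta_pointer - SAFETY_LOWER, 0)
    rows = [r for r in table if low < r["ts"] <= now]
    delta_pointer = now
    return rows

table = [{"ts": 100, "val": "a"}, {"ts": 200, "val": "b"}]
first = generic_delta(table, 150)    # picks the row with ts=100
table.append({"ts": 260, "val": "c"})
second = generic_delta(table, 300)   # picks ts=100 (again), 200 and 260
```

Note that because of the safety interval the second run re-reads the ts=100 row; in a real setup such overlapping records are absorbed by loading into an ODS object with overwrite, so duplicates do no harm.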
Data recovery:
Scenario 1: The last delta run failed (not applicable to ALE-based DataSources).
Solution:
Set the QM status of the failed request to red and delete the request from all targets; the next delta request then repeats the failed delta.
Scenario 2: The delta was running fine every day, but you suddenly find the delta is missing for a certain period (the reason may be anything).
Solution:
1. Reload the data from the PSA.
2. Reload the data from an ODS object or an InfoCube (in a layered
architecture, the EDW approach).
Applicable to Logistics:
Please refer to "One stage stop to know all about BW Extractors - Part 1" to get an idea of logistics extraction.
Options 1 and 2 are not applicable here; the only choice is to extract the data from the source system.
Check OSS note 691721: Restoring lost data from a delta request.
Here again we have one more constraint:
As explained in the OSS note above, because of the huge data volume we cannot bear the downtime of a re-initialization, so we have a workaround:
1. In BW, transfer the existing target contents to an external source using open hub services.
2. Then selectively fill the setup tables for the missing data for the respective duration.
3. Run the initialization and schedule V3 jobs to enable delta postings.

Source: http://scn.sap.com

UNDERSTANDING OF NEW GENERAL LEDGER IN SAP FI

July 25th, 2013

Source: Internet

ADVANTAGES OF USING THE NEW SAP GENERAL LEDGER

•The New General Ledger covers all the functions that were previously implemented through separate ledgers such as the Cost of Sales Ledger, Special Purpose Ledger, Reconciliation Ledger, or Profit Centre Ledger. Businesses therefore no longer need to maintain separate ledgers in separate applications, which makes the recording and administration of transactional data easier.
•The user interface of the New General Ledger is similar to that of the Classic General Ledger, so users already familiar with the Classic General Ledger require very little training to migrate to the New General Ledger.
•The New General Ledger eliminates data redundancy by storing all the transactional data in a single totals table. This improves the efficiency of the database.
•With the New General Ledger, there is no need to use separate ledgers like the Special Purpose Ledger, Profit Center Ledger, etc. It eliminates the need of Profit & Loss and Balance Sheet adjustments with online splitting functionality. Due to this there is no need to carry out additional reconciliation activities with other applications during closing. This helps in saving a lot of time and effort on the part of the end user.
•The use of the new General Ledger makes it easier to include additional fields in order to provide flexible reporting as per the requirements of a business.
•Reconciliation Ledger postings are eliminated by the use of the New General Ledger. Previously, in the case of cross-company-code postings, the information was initially stored in Controlling and then transferred to the FI module at period close. The Controlling data was stored in the Reconciliation Ledger, and at period end this data was transferred from Controlling to create FI documents so that reconciliation between Controlling and FI could be carried out. The New General Ledger provides real-time reconciliation between the Controlling and FI modules.
•The Total Cost of Operations for a business is reduced due to the advantages provided by the New General Ledger.

CHARACTERISTICS OF THE NEW GENERAL LEDGER

The Classic General Ledger was primarily intended to provide a comprehensive mechanism for external reporting purposes by recording all the transactions of a business. For internal reporting purposes, however, data needs to flow to the various controlling modules like Profit Centre Accounting, Cost Element Accounting, etc. The linkage between the Classic General Ledger and the controlling modules was weak; due to this there was no automatic reconciliation between the Classic General Ledger and the various controlling applications. The New General Ledger provides the following features:
• PARALLEL ACCOUNTING: The New General Ledger allows several parallel ledgers to be maintained in order to record transactions and provide reports to meet different accounting requirements. This eliminates the need to use the Special Ledger application separately in order to fulfill these requirements.
• INTEGRATED STATUTORY AND MANAGEMENT REPORTING: The New General Ledger enables a business to perform internal management reporting along with supporting the traditional purpose of legal reporting. It allows financial statements to be generated for any dimension in the business. Therefore, the New General Ledger may be used to produce the balance sheet required for external reporting purposes or the profit centre analysis report required for internal reporting purposes.
• SEGMENT REPORTING: The New General Ledger allows reports to be prepared on the basis of segments, as required by the International Financial Reporting Standards (IFRS) and Generally Accepted Accounting Principles (GAAP).
• COST OF SALES ACCOUNTING: The New General Ledger allows the costs of sales accounting to be implemented.
• DOCUMENT SPLITTING: Document splitting helps to create balance sheets for entities that extend beyond the scope of the company code.
• NEW TABLES: The New General Ledger makes use of the tables FAGLFLEXT (totals), FAGLFLEXP (plan line items), FAGLFLEXA (actual line items) and FAGL_SPLINFO (splitting data).
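The document splitting feature listed above can be illustrated with a toy example (this is not SAP's actual splitting logic; the names are invented): a payable line that carries no profit center is distributed in proportion to the expense lines, so a balance sheet per profit center still balances.

```python
# Simplified sketch of document splitting: distribute a balance sheet
# line (the payable) across profit centers in proportion to the
# expense lines of the same document.

def split_document(expense_lines, payable_amount):
    total = sum(l["amount"] for l in expense_lines)
    return [
        {"profit_center": l["profit_center"],
         "amount": round(payable_amount * l["amount"] / total, 2)}
        for l in expense_lines
    ]

# A vendor invoice: 60 expense to PC01, 40 to PC02, one payable of -100.
expenses = [
    {"profit_center": "PC01", "amount": 60.0},
    {"profit_center": "PC02", "amount": 40.0},
]
split = split_document(expenses, payable_amount=-100.0)
```

After the split, each profit center nets to zero (60 - 60 and 40 - 40), which is what makes entity-level balance sheets below company code level possible.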

LEDGER CONCEPT IN THE NEW G/L

The New General Ledger uses Special Purpose Ledger techniques to save total values in its tables. All company codes are assigned to a leading ledger. For each client, additional ledgers can be added for each company code. These additional ledgers can be used for different purposes, such as parallel accounting or management reporting. With the New General Ledger it is now possible to perform postings that previously required several components. This is also true of the transfer postings between profit centers that were previously stored in the Special Ledger.
The New General Ledger makes use of additional tables and fields compared to the Classic General Ledger. Once the New G/L functionality is activated, the previous Financial Accounting menu is replaced by the Financial Accounting (New) and General Ledger (New) menus. New General Ledger Accounting must be activated so that the transactional data is written to the new tables rather than to the Classic General Ledger tables.

ACTIVATING THE NEW GENERAL LEDGER IN THE SYSTEM

For all new installations, the New General Ledger is activated by default, and SAP recommends its use because of the many advantages it offers. For existing customers, however, it is optional: they can continue to use the Classic General Ledger or migrate to the New General Ledger. The New General Ledger is not activated automatically during an upgrade.

How to prepare for a certification exam in 5 simple steps

July 10th, 2013

Source: Internet

Here are five simple steps that will help you to gather all relevant information you need for your exam preparation:

1. Find the right level of certification

You can either take an associate or professional certification exam. Associate certifications are for those who are new to SAP solutions. Professional certifications require proven project experience and a more detailed understanding of SAP solutions. Descriptions of each level are available here.

2. Find the right exam

Here are some useful tips for easy navigation in the web shop:
• Start your search directly in the certification web shop
• Select your country

Leave the search field empty, and then just run your search by browsing through our catalogue.

A list of all available SAP certifications will then be displayed; you can further refine your search with the "filter" or "sort by" function (e.g. filter by solution).

3. Consider training

Are you new to your chosen field? Training courses can help you to pass associate level exams, and SAP Education provides various training options. You can choose between classroom training (live or virtual), e-learning courses, and many more. Here you can find an overview of the different training options.
Not sure if you have taken all relevant training courses? For each exam, you can find the related training courses on the exam site.

Do you already have hands-on experience with a particular solution, or do you want to upgrade your certification to the latest release? SAP Education’s Associate and Professional Packages might be an option to help you to prepare for your certification exam at your own pace.

4. Locate the syllabus

In order to get an idea about which topics are important for your certification, you should review the syllabus. You can find an overview of the related topics on the certification site. Prepare for the exam based on the weighted value for each topic in the exam Topic Areas.

5. Sample questions

For most exams, SAP Education provides sample questions; these are actual questions asked during past exams, so they are representative of the questions asked during a live exam. You can find the sample exam questions under the Certification Details for each exam.

http://scn.sap.com/community/training-and-education/certification/blog/2012/12/06/how-to-prepare-for-a-certification-exam-in-5-simple-steps