
Sunday 26 February 2012

LO Extraction Q&A


1.  What is the difference between LIS and LO?
A:        1. Level of information: LO provides DataSources specific to each level of information, e.g. 2LIS_11_VAHDR (sales order header), 2LIS_11_VAITM (sales order item) and 2LIS_11_VASCL (sales order schedule line), whereas LIS provides information structures covering sales order information as a whole.
            2. LO uses the V3 update (an asynchronous background process), which gives the option of executing the LUWs in the background, whereas in LIS we use V1 or V2.
            3. In LO we fill the setup tables, and once the data has been completely extracted to the BW side we can delete the setup tables again using the T-code LBWG. In LIS we never delete the contents of the information structure once it is filled; on top of that, the information structure also keeps being filled with delta records, which is of no use.

2.                  What are the steps for generating a LO DataSource?
A:        LO DataSources are given as part of Business Content in the delivered version, so we first create a copy of the DataSource in the active version: go to T-code RSA5, select the DataSource and click the 'Transfer DataSources' button. To maintain the ready-made extract structure delivered with the DataSource, go to T-code LBWE and click the 'Maintenance' link, then move the required fields from the communication structures into the extract structure; if a required field is not available in any communication structure, we enhance the DataSource instead. Since the extract structure is re-generated, we must generate the DataSource by clicking the DataSource link and customize the extract structure using the 'Selection', 'Hide field', 'Inversion' and 'Field only known in exit' checkboxes, then save. Next choose the delta update mode, preferably Queued Delta among the four options (Direct Delta, Serialized V3 Update, Unserialized V3 Update and Queued Delta), and click the 'Inactive' link to activate the DataSource. Finally, replicate the DataSource on the BW side and start migrating the data.

3.         How do we migrate the data in LO?
A:        To be on the safe side, we proceed as follows:
            1. Delete the setup tables using the T-code LBWG, specifying the application component.
            2. Clear the delta queues using the T-code SMQ1 and the extractor queues using the T-code LBWQ.
            3. Lock the related T-codes.
            4. Run the statistical setup using the T-code OLI*BW to fill the setup tables.
            5. Run the init loads or full loads to extract all the data from the setup tables to the BW side.
            6. Once this has completed successfully, go back to LBWE in R/3 and set up the periodicity of the V3 job, which could be hourly or two-hourly depending on the number of transactions (the more transactions, the more frequently the V3 job should be scheduled).
            7. Finally, release the locks so that users can start recording transactions again.
  NOTE: These migrations are always done on weekends outside office hours, and the T-codes are usually locked and unlocked by the Basis team.


4.                  What are the different delta update modes? Explain each.
A.                Direct Delta: LUWs are posted directly to the delta queue (RSA7), and we extract the LUWs from the delta queue to SAP BW by running delta loads. Direct delta degrades OLTP system performance, because when LUWs are posted directly to the delta queue the application is kept waiting until all the enhancement code has executed.
B.                 Queued Delta: LUWs are posted to the extractor queue (LBWQ); by scheduling the V3 job we move the documents from the extractor queue (LBWQ) to the delta queue (RSA7), and we extract the LUWs from the delta queue to SAP BW by running delta loads. Queued delta is recommended by SAP; it maintains an extractor log, which helps us handle LUWs that are missed.
C.                 Serialized V3 Update: LUWs are posted to the update queue (SM13); the scheduled V3 job moves the documents from the update queue (SM13) to the delta queue (RSA7) in a serialized fashion, and we extract the LUWs from the delta queue to SAP BW by running delta loads. Since the LUWs are moved serially from the update queue to the delta queue, an erroneous document blocks all subsequent documents, and because the documents are sorted by creation time there is every possibility of frequent V3 job failures and missed delta records. It also degrades OLTP system performance, as it forms multiple segments with respect to changes in the logon language.
D.                Unserialized V3 Update: LUWs are posted to the update queue (SM13); the scheduled V3 job moves the documents from the update queue (SM13) to the delta queue (RSA7) without serialization, and we extract the LUWs from the delta queue to SAP BW by running delta loads. Since the LUWs are not moved serially, an erroneous document does not block the subsequent documents and there is no sorting of documents by creation time. It improves OLTP system performance, as it forms a single segment per language.
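In short, the queue flow per mode looks like this:

Direct Delta:                application -> delta queue (RSA7) -> BW delta load
Queued Delta:                application -> extractor queue (LBWQ) -> V3 job -> delta queue (RSA7) -> BW delta load
Serialized/Unserialized V3:  application -> update queue (SM13) -> V3 job -> delta queue (RSA7) -> BW delta load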

5.                  How do you enhance a LO DataSource?
A.                The enhancement procedure is the same as for any other DataSource; the only difference is that instead of appending the fields to the extract structure, we append them to the communication structure and then move them from the communication structure to the extract structure via the 'Maintenance' link in T-code LBWE. Deselect the 'Hide field' and 'Field only known in exit' checkboxes when generating the DataSource, then write the enhancement code to populate the enhanced fields using enhancement RSAP0001. This enhancement contains four function exits: EXIT_SAPLRSAP_001 (transaction data DataSources), EXIT_SAPLRSAP_002 (master data attributes), EXIT_SAPLRSAP_003 (master data texts) and EXIT_SAPLRSAP_004 (master data hierarchies). We work with the internal table C_T_DATA in the code.
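As an illustration, a minimal sketch of such exit code, placed in include ZXRSAU01 (the include called from EXIT_SAPLRSAP_001), is shown below. The appended field ZZREGION and the lookup against the customer master are hypothetical, and it is assumed that the extract structure of 2LIS_11_VAITM carries the sold-to party KUNNR:

* ZXRSAU01 - user exit include for transaction data DataSources
DATA: l_s_vaitm TYPE mc11va0itm.   " extract structure of 2LIS_11_VAITM

CASE i_datasource.                 " i_isource on older plug-in releases
  WHEN '2LIS_11_VAITM'.
    LOOP AT c_t_data INTO l_s_vaitm.
*     Fill the appended field ZZREGION (hypothetical) with the
*     region of the sold-to party from the customer master.
      SELECT SINGLE regio FROM kna1
        INTO l_s_vaitm-zzregion
        WHERE kunnr = l_s_vaitm-kunnr.
      IF sy-subrc = 0.
        MODIFY c_t_data FROM l_s_vaitm.
      ENDIF.
    ENDLOOP.
ENDCASE.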

6.                  What is the T-code for deleting setup tables?
A.                LBWG.


7.                  What is the importance of the delta queue?
A.                The delta queue (RSA7) maintains two images: the delta image and the repeat delta.
            When we run a delta load from the BW system it sends the delta records, and whenever a delta load fails and we request a repeat delta, it sends the repeat delta records (the previous delta once again).

8.                  What does the 'Total' column indicate in the RSA7 delta queue?
A.                It shows the number of LUWs in the queue; each set of records belonging to one LUW is counted once, whether it is available as delta only or as both delta and repeat delta.

9.                   What do you mean by LUW?
A.                LUW stands for Logical Unit of Work. When we create a new document it forms a new image 'N', and whenever an existing document is changed it forms a before image 'X' and an after image ' '; these before and after images together constitute one LUW. A before image is always followed by its after image.
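For example, if the quantity on an existing sales order item is changed from 10 to 15, the delta queue receives one LUW with two records: a before image ('X') carrying the old key figure value with reversed sign (-10) and an after image (' ') carrying the new value (+15); added together they give the net change of +5.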

10.                  When we are initializing the DataSource, do we need to lock postings in SAP R/3?
A.                Yes; that is the reason we normally perform these data migration steps on weekends during non-office hours, so as not to disturb the business.

11.              How do we re-initialize a LO DataSource?
A.          First we lock the transactions and check whether postings are still sitting in the extractor queue (LBWQ), in the case of queued delta, or in the update queue (SM13), in the case of serialized and unserialized V3 update. If we do not want the delta records in SM13 and LBWQ we can delete them; if we do want them in SAP BW, we schedule the V3 job to collect the LUWs into the delta queue and extract them from the delta queue to SAP BW by running delta loads. Once the queues are cleaned up, delete the setup tables using the T-code LBWG, specifying the application component, then run the statistical setup using the T-code OLI*BW to fill the setup tables, and run the init loads or full loads to extract all the data from the setup tables to the BW side. Once done successfully, go back to LBWE in R/3 and set up the periodicity of the V3 job, which could be hourly or two-hourly depending on the number of transactions (the more transactions, the more frequently the V3 job should be scheduled). Finally, release the locks so that users can start recording transactions again.

            NOTE: These migrations are always done on weekends outside office hours, and the T-codes are usually locked and unlocked by the Basis team.

Saturday 25 February 2012

BW Interview Q&A

1.   What are the extractor types?
• Application-Specific
o BW Content Extractors: FI, HR, CO, SAP CRM, LO Cockpit
o Customer-Generated Extractors: LIS, FI-SL, CO-PA
• Cross-Application (Generic Extractors)
o DB View, InfoSet, Function Module
2. What are the steps involved in LO Extraction?
• The steps are:
o RSA5 Select the DataSources
o LBWE Maintain DataSources and Activate Extract Structures
o LBWG Delete Setup Tables
o 0LI*BW Setup tables 
o RSA3 Check extraction and the data in Setup tables
o LBWQ Check the extraction queue
o LBWF Log for LO Extract Structures
o RSA7 BW Delta Queue Monitor
3. How to create a connection with LIS InfoStructures? 
• LBW0 Connecting LIS InfoStructures to BW
4. What is the difference between ODS and InfoCube and MultiProvider?
• ODS: Provides granular data, allows overwrite and data is in transparent tables, ideal for drilldown and RRI. 
• CUBE: Follows the star schema, we can only append data, ideal for primary reporting. 
• MultiProvider: Does not have physical data; it allows access to data from different InfoProviders (Cube, ODS, InfoObject). It is also preferred for reporting.
5. What are Start routines, Transfer routines and Update routines?
• Start Routines: The start routine is run for each DataPackage after the data has been written to the PSA and before the transfer rules have been executed. It allows complex computations for a key figure or a characteristic. It has no return value. Its purpose is to execute preliminary calculations and to store them in global DataStructures. This structure or table can be accessed in the other routines. The entire DataPackage in the transfer structure format is used as a parameter for the routine. 
• Transfer / Update Routines: They are defined at the InfoObject level and, like the start routine, are independent of the DataSource. We can use them to define global data and global checks.
6. What is the difference between start routine and update routine, when, how and why are they called? 
• The start routine works on the whole DataPackage before the update, while update routines are called per record while updating the data targets.
7. What is the table that is used in start routines?
• The table structure is always the structure of the data target (ODS or InfoCube). For example, if the target is an ODS, the structure of its active table is used.
8. Explain how you used Start routines in your project?
• Start routines are used for mass processing of records. In the start routine all the records of the DataPackage are available for processing, so we can process them together. In one scenario, we wanted to apply size percentages to forecast data. For example, if material M1 is forecast at 100 for May, then after applying the size split (Small 20%, Medium 40%, Large 20%, Extra Large 20%) we wanted 4 records against the single record coming in the InfoPackage. This was achieved in the start routine, as sketched below.
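A minimal sketch of such a start routine (in the BW 3.x update rules style, where the whole package is available as DATA_PACKAGE) might look as follows; the lookup table ZSIZE_PCT and the fields SIZE_CD and FORECAST_QTY are hypothetical:

* Start routine: explode each forecast record into one record per
* size, applying the size percentage to the forecast quantity.
DATA: lt_result LIKE DATA_PACKAGE OCCURS 0 WITH HEADER LINE,
      lt_sizes  TYPE STANDARD TABLE OF zsize_pct,
      ls_size   TYPE zsize_pct.

* Size split, e.g. S 20%, M 40%, L 20%, XL 20% (hypothetical table)
SELECT * FROM zsize_pct INTO TABLE lt_sizes.

LOOP AT DATA_PACKAGE.
  LOOP AT lt_sizes INTO ls_size.
    lt_result = DATA_PACKAGE.
    lt_result-size_cd      = ls_size-size_cd.
    lt_result-forecast_qty = DATA_PACKAGE-forecast_qty * ls_size-pct / 100.
    APPEND lt_result.
  ENDLOOP.
ENDLOOP.

* Replace the incoming package with the exploded records
DATA_PACKAGE[] = lt_result[].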
9. What are Return Tables?
• When we want to return multiple records instead of a single value, we use a return table in the update routine. Example: if we have the total telephone expense for a cost center, using a return table we can split it into an expense per employee.
10. How do start routine and return table synchronize with each other?
• The start routine runs first, once for the whole DataPackage; the update routine with a return table then runs per record and returns multiple result records in place of a single value, as sketched below.
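A minimal sketch of an update routine with a return table (BW 3.x style, where the generated routine receives COMM_STRUCTURE and fills RESULT_TABLE instead of a single RESULT); the mapping table ZEMP_CC and all field names are hypothetical:

* Update routine with return table: split the cost center's telephone
* expense equally across its employees, one result record per employee.
DATA: lt_emp  TYPE STANDARD TABLE OF zemp_cc,
      ls_emp  TYPE zemp_cc,
      l_count TYPE i.

SELECT * FROM zemp_cc INTO TABLE lt_emp
  WHERE costcenter = COMM_STRUCTURE-costcenter.
DESCRIBE TABLE lt_emp LINES l_count.
CHECK l_count > 0.                 " skip records without employees

LOOP AT lt_emp INTO ls_emp.
  CLEAR RESULT_TABLE.
  MOVE-CORRESPONDING COMM_STRUCTURE TO RESULT_TABLE.
  RESULT_TABLE-employee  = ls_emp-employee.
  RESULT_TABLE-phone_exp = COMM_STRUCTURE-phone_exp / l_count.
  APPEND RESULT_TABLE.
ENDLOOP.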

11. What is the difference between V1, V2 and V3 updates? 
• V1 Update: It is a Synchronous update. Here the Statistics update is carried out at the same time as the document update (in the application tables).
• V2 Update: It is an Asynchronous update. Statistics update and the Document update take place as different tasks. 
o V1 & V2 don't need scheduling.
• Serialized V3 Update: The V3 collective update must be scheduled as a job (via LBWE). Here, document data is collected in the order it was created and transferred into the BW as a batch job. The transfer sequence may not be the same as the order in which the data was created in all scenarios. V3 update only processes the update data that is successfully processed with the V2 update. 

12. What is compression?
• It is the process that moves the data of an InfoCube from the F fact table into the E fact table, deleting the request IDs in the process; this saves space and improves query performance.

13. What is Rollup?
• This is used to load new DataPackages (requests) into the InfoCube aggregates. If we have not performed a rollup then the new InfoCube data will not be available while reporting on the aggregate. 

14. What is table partitioning and what are the benefits of partitioning in an InfoCube?
• It is the method of dividing a table to enable quick access. SAP uses fact table partitioning to improve performance. We can partition only on 0CALMONTH or 0FISCPER. Table partitioning helps reports run faster, as data is read only from the relevant partitions; table maintenance also becomes easier. Oracle, Informix and IBM DB2/390 support table partitioning, while SAP DB, Microsoft SQL Server and IBM DB2/400 do not.

15. How many extra partitions are created and why?
• Two extra partitions are created: one for dates before the begin date and one for dates after the end date.

16. What are the options available in transfer rule?
• InfoObject 
• Constant
• Routine
• Formula

17. How would you optimize the dimensions?
• We should distribute the characteristics across as many dimensions as possible, taking care that no single dimension table grows beyond roughly 20% of the fact table size.

18. What are Conversion Routines for units and currencies in the update rule?
• Using this option we can write ABAP code for unit / currency conversion. If we enable this flag, the unit of the key figure appears in the ABAP code as an additional parameter. For example, we can convert units in pounds to kilos, as sketched below.
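A minimal sketch of such a routine body follows, assuming the key figure arrives in pounds and is to be stored in kilograms, and that 'LB' and 'KG' are the unit keys maintained in the system; RESULT is the key figure value and UNIT the additional unit parameter mentioned above, and UNIT_CONVERSION_SIMPLE is a standard conversion function module:

* Convert the incoming key figure from pounds to kilograms
* and adjust the unit parameter accordingly.
IF UNIT = 'LB'.
  CALL FUNCTION 'UNIT_CONVERSION_SIMPLE'
    EXPORTING
      input    = RESULT
      unit_in  = 'LB'
      unit_out = 'KG'
    IMPORTING
      output   = RESULT
    EXCEPTIONS
      OTHERS   = 1.
  IF sy-subrc = 0.
    UNIT = 'KG'.
  ENDIF.
ENDIF.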

19. Can an InfoObject be an InfoProvider, how and why?
• Yes, when we want to report on Characteristics or Master Data. We have to right click on the InfoArea and select "Insert characteristic as data target". For example, we can make 0CUSTOMER as an InfoProvider and report on it. 

20. What is Open Hub Service?
• The Open Hub Service enables us to distribute data from an SAP BW system into external Data Marts, analytical applications, and other applications. We can ensure controlled distribution using several systems. The central object for exporting data is the InfoSpoke. We can define the source and the target object for the data. BW becomes a hub of an enterprise data warehouse. The distribution of data becomes clear through central monitoring from the distribution status in the BW system. 
21. How do you transform Open Hub Data?
• Using a BADI we can transform Open Hub data according to the destination requirements.

22. What is ODS?
• ODS (Operational Data Store) is used for detailed storage of data. We can overwrite data in the ODS. The data is stored in transparent tables.

23. What are BW Statistics and what is its use?
• They are a group of Business Content InfoCubes which are used to measure performance for query and load monitoring. They also show the usage of aggregates, OLAP and warehouse management.

24. What are the steps to extract data from R/3?
• Replicate DataSources
• Assign InfoSources
• Maintain Communication Structure and Transfer rules
• Create an InfoPackage
• Load Data

25. What are the delta options available when you load from flat file?
• The 3 options for Delta Management with Flat Files:
o Full Upload
o New Status for Changed records (ODS Object only)
o Additive Delta (ODS Object & InfoCube)

BW Mock Questions - Certification


  1. Choose the statements relating to the “BW Star Schema”
    A. In BW, a distinction is made between the InfoCube and the master data/SID tables
    B. Master data tables are connected to an InfoCube by way of DIM IDs
    C. Dimension historization is made possible
    D. Supports multilingual capability
    E. All of the above

  2. The following are AWB functions within the BW system
    A. Translation
    B. Business Content
    C. Documents
    D. BEx Maps
    E. All of the above

  3. Choose the answers applicable to “InfoObjects”
    A. Time characteristics
    B. Characteristics
    C. Key figures
    D. Technical characteristics
    E. All of the above

  1. Validate & Choose the correct statements reflecting “Attributes”
    1. Attributes are Info objects, which can be formed by Characteristics or Key figures or Time Characteristics.
    2. Attributes are Info objects, which can be formed by Characteristics or Key figures
    3. Both Display & Navigation Attributes can be time-dependent
    4. Transitive Attributes can be used in BEx Query directly after infoobject is activated
    5. All of the above

  1. Choose the correct answers to the Tables
              A.    Master Data view Table is M
    1. Time Independent Characteristic Navigation table is “Y”
    2. Hierarchy Interval table “K
    3. Naming convention of Customer created tables start with “/BIC/…”
    4. All of the above

  1. The necessary objects for Data Flow in BW are
    1. Source Systems
    2. Extract Structure & Transfer Structure
    3. Communication Structure & Update Rules
    4. Necessary RFC connections with Plug in’s
    5. All of the above

  1. Choose the Update Methods for Characteristics
             A.    ABAP Routine
    1. Initial Value
    2. Master Data
    3. Formula Builder
    4. Constants

  8. One of the steps in using ETL tools to load data into BW is “Create InfoPackage with selection as 3rd party”.
    TRUE
    FALSE

  9. Using the “Manage” functions on an InfoCube, the following tasks can be performed:
    A. Display all requests which are aggregated or scheduled to be aggregated, as well as requests set for deletion and compressed requests.
    B. Perform a repair index
    C. Empty the E table and fill the F table with the request ID set to null
    D. Reconstruct the data even if the data is not available in the PSA
    E. All of the above

  10. Choose the correct statements with respect to aggregates
    A. When an aggregate is switched off, its data is still filled, but OLAP reporting from it is not possible
    B. Deactivating an aggregate deletes its data & definition.
    C. All aggregate properties are stored in table RSDDAGGRDIR
    D. Aggregate levels can be “inclusive”, “exclusive”, “fixed” & “hierarchy”
    E. All of the above

  11. Choose the relevant aspects relating to delta methods
    A. Delta methods are source system & extractor dependent
    B. The delta method components are delta types, record mode and serialization
    C. Delta methods are the primary key of table RODELTAM
    D. Delta methods are not included in table ROOSOURCE
    E. All of the above

  12. Select the correct statements with respect to flat file staging scenarios
    A. The flat file staging scenario requires “Set update method for each DataSource”.
    B. The supported delta loads are FIL0, ODS & FIL1
    C. No DataSource replication is required for flat file source systems
    D. Has the ability to check & simulate the data transfer both at communication structure level and at data target level
    E. All of the above

  13. Choose the correct answers referring to DB Connect
    A. One prerequisite of DB Connect is that both the DBSL and the DB client are installed on all BW application servers
    B. You can extract all kinds of data within the database using DB Connect
    C. Only tables & views from the database system can be extracted
    D. One of the steps is to map database types against BW DDIC types
    E. All of the above

  14. You are implementing Open Hub for your company. Validate the correct statements relating to the InfoSpoke
    A. The InfoSpoke is an alternative solution to the Data Mart
    B. All master data & transaction data are available for extraction from InfoCubes, ODS objects & InfoObjects
    C. The supported output types are a flat CSV file or a transparent table
    D. It supports both full & delta
    E. Delta determination is based on the request ID

  15. Validate & choose the relevant statements on Ascential DataStage
    A. Ascential DataStage is SAP's product
    B. It is used to interface with BW using BAPI technology
    C. It requires the BW InfoSource definition to be imported into the ETL tool
    D. The extractor job needs to be scheduled from the ETL tool
    E. Ascential DataStage is an SAP-recognized 3rd party product

  16. All XML data is transferred using SOAP RFC services & subsequently pushed into the BW delta queue using a BW-defined function module for later update into BW.
    TRUE
    FALSE

  17. Validate & choose the relevant information on Data Mart scenarios.
    A. The Data Mart is a solution designed primarily for data transfers among BW systems
    B. The types of data marts are Aggregated, Mixed, Myself and Replicating
    C. You do not need to generate an export DataSource for an InfoCube
    D. The Data Mart naming convention starts with “8”
    E. All of the above

  18. Choose the correct statements on CO-PA extraction scenarios
    A. You need to generate the DataSource in the source SAP system only once for both accounting-based & costing-based CO-PA
    B. The data extracted from CO-PA into BW differs by 30 minutes
    C. Both accounting-based & costing-based CO-PA support full and delta loads into BW
    D. When realignments/changes are posted in R/3, you need a realignment run or to set the DIRTY_FLAG option
    E. All of the above

  19. Choose the correct answers relating to LO Cockpit & LIS functions
    A. LIS extraction requires 2 additional tables, SxxxBIW1/SxxxBIW2, and 1 view, SxxxBIWS, to accommodate and supply the necessary data to BW
    B. The queued delta option in the LO Cockpit updates the data directly into the BW delta queue
    C. Deleting & filling the set-up tables is one of the critical safety aspects during a LIS to LO Cockpit conversion
    D. Setting up periodic V3 updates is a common step in all LO Cockpit/LIS & LIS to LO conversion jobs
    E. Activating the extract structure in the LO Cockpit can be done immediately after replicating the DataSource.

  20. Choose the correct steps in LIS extraction
    A. Maintain extract structure, generate DataSource, maintain InfoSource, assign DataSource to InfoSource, maintain communication structure/transfer rules, maintain InfoCube/ODS with update rules, generate update for LIS InfoSource, delta upload in LIS, InfoPackage for delta initialization, activate update, InfoPackage for delta
    B. Set up LIS environment, generate DataSource, maintain InfoSource, assign DataSource to InfoSource, maintain communication structure/transfer rules, maintain InfoCube/ODS with update rules, generate update for LIS InfoSource, delta upload in LIS, InfoPackage for delta initialization, activate update, InfoPackage for delta
    C. Set up LIS environment, generate DataSource, maintain InfoSource, assign DataSource to InfoSource, maintain communication structure/transfer rules, maintain InfoCube/ODS with update rules, generate update for LIS InfoSource, InfoPackage for delta initialization, activate update, InfoPackage for delta
    D. Set up LIS environment, generate DataSource, maintain InfoSource, assign DataSource to InfoSource, maintain communication structure/transfer rules, maintain InfoCube/ODS with update rules, delta upload in LIS, InfoPackage for delta initialization, InfoPackage for delta
    E. None of the above

  21. Choose the correct steps for LO Cockpit extraction
    A. Maintain extract structure, generate DataSource, maintain InfoSource, assign DataSource to InfoSource, maintain communication structure/transfer rules, maintain InfoCube/ODS with update rules, generate update for LIS InfoSource, delta upload in LIS, InfoPackage for delta initialization, activate update, InfoPackage for delta
    B. Maintain extract structure, maintain DataSource, replicate DataSource in BW, assign DataSource to InfoSource, maintain communication structure/transfer rules, maintain IC/ODS with update rules, activate extract structure, delete/fill set-up tables & set-up table extraction, InfoPackage for delta initialization, InfoPackage for delta
    C. Maintain extract structure, maintain DataSource, replicate DataSource in BW, assign DataSource to InfoSource, maintain communication structure/transfer rules, maintain IC/ODS with update rules, activate extract structure, delete/fill set-up tables & set-up table extraction, InfoPackage for delta initialization, set up periodic V3/queued/direct updates, InfoPackage for delta
    D. Maintain extract structure, maintain DataSource, activate extract structure, assign DataSource to InfoSource, maintain communication structure/transfer rules, maintain IC/ODS with update rules, delete/fill set-up tables & set-up table extraction, InfoPackage for delta initialization, set up periodic V3/queued/direct updates, InfoPackage for delta
    E. None of the above

  22. Choose the correct statements relating to the relevant subject
    A. You have 4 types of DataSources defined in the source system
    B. Business Content is delivered focusing both on roles and on industries.
    C. SAPI, BAPI, File & SOAP are some of the interfaces between BW and SAP & non-SAP systems
    D. The Service API controls the data transfer settings, the delta queue & the monitoring of the data load
    E. All of the above

  23. The SAP standard Business Content can be enhanced using an SAP or user exit during data push, or using an append structure along with a function module.
    TRUE
    FALSE

  24. Choose the correct answers relating to generic DataSources
    A. Generic DataSources can be used for master data (texts, attributes, hierarchies) as well as for transaction data
    B. Generic DataSources can be generated against transparent tables/views, SAP Query, InfoSet Query and function modules
    C. After image, additive and before image are the supported record modes for generic DataSources with delta capability
    D. Time stamp, calendar day & numerical pointer are the delta-determining parameters
    E. A safety interval is permitted

  25. Choose the update modes used against all data targets.
    A. Repeat delta
    B. Queued delta
    C. Direct delta
    D. Full update
    E. Delta initialization




Area    Qn No   Answers
BW310   1       A B D
BW310   2       A B C
BW310   3       A B C D E
BW310   4       B C
BW310   5       A D
BW310   6       A B C D E
BW310   7       A B C E
BW310   8       TRUE
BW310   9       A B
BW310   10      A C D
BW340   11      A B C
BW340   12      A C D
BW340   13      A C D
BW340   14      B C D E
BW340   15      B C D E
BW340   16      TRUE
BW340   17      A B D
BW350   18      B C D
BW350   19      A C D
BW350   20      E
BW350   21      C
BW350   22      A B C D E
BW350   23      TRUE
BW350   24      B D E
BW350   25      A D E