Tuesday, November 3, 2015

Tips for Merging Dimensions

 

Tip 1: Adding “Incompatible” dimensions to the block
Have you ever noticed that, when you try to add a dimension to a block that includes merged dimensions, you sometimes get an “Incompatible object” error? Let me explain why this happens, and then we’ll look at ideas on how to fix it.
Dimension objects and Detail objects have a fundamental difference: Dimension objects usually represent a different level of granularity. For example, State and City might be two dimension objects in your universe. City is a lower level of granularity than State. So, when you add City to an existing block that already includes State, the measure objects will be aggregated at a lower level of granularity.
Detail objects, on the other hand, typically do not represent a lower level of granularity when used with their related dimension. For example, if I have Sales Revenue broken down by Customer in a block, and then add the Eye Color detail object, Sales Revenue will not be aggregated at a lower level. It will stay at the level of Customer. This is how detail objects work.
So, if I have two queries merged on State, and try to display another dimension that is not merged, such as State Capital, Web Intelligence doesn’t know how to aggregate the measures at the lower level of State Capital, since that dimension doesn’t exist as a merged dimension. Of course, you and I both know that each State has only one State Capital, so it’s not really a lower level of granularity. But Web Intelligence doesn’t know that. So we have to tell it.
Here is how we tell it: create a detail variable. In this case, maybe we call it Capital. Make it a detail of the State merged dimension. The formula for this variable is:
=[State Capital]
We can then add the variable to the block, as Web Intelligence sees it as a detail of State, rather than a different level of granularity. Note that the detail variable must be a detail of a merged dimension. Otherwise, you still won’t be able to add it to the block.
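The granularity idea behind detail variables can be illustrated outside of Web Intelligence. Here is a minimal Python sketch with made-up sample data (this is not Webi formula syntax): an attribute that is 1:1 with its dimension, like a state's capital, can be displayed without changing the level at which the measure aggregates.

```python
# Illustrative sketch (hypothetical data): a detail/attribute that is 1:1
# with its dimension does not change the aggregation level of the measure.
sales = [
    {"state": "Texas", "city": "Austin", "revenue": 100},
    {"state": "Texas", "city": "Dallas", "revenue": 200},
    {"state": "Ohio", "city": "Columbus", "revenue": 50},
]
capitals = {"Texas": "Austin", "Ohio": "Columbus"}  # exactly one capital per state

# Aggregate revenue at the State level.
by_state = {}
for row in sales:
    by_state[row["state"]] = by_state.get(row["state"], 0) + row["revenue"]

# Attaching the capital (a 1:1 attribute of State) adds a column
# without splitting the totals into a finer granularity.
block = [(state, capitals[state], total) for state, total in by_state.items()]
print(block)  # [('Texas', 'Austin', 300), ('Ohio', 'Columbus', 50)]
```

Adding City, by contrast, would split Texas into two rows; that is the difference Web Intelligence cares about.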
Tip 2: Auto-Merge dimensions only works within a universe

Web Intelligence has a feature called “Auto-merge dimensions”. It’s in the document properties, and is turned on by default. However, not all dimensions will automatically merge with this feature. So let’s clear up the confusion and make it crystal clear when this feature works.
If you have two queries, from the same universe, that include the exact same dimension objects, those dimension objects will automatically merge. This is the only time when dimension objects automatically merge.
Here’s an example of a merge that will not happen automatically. Let’s say you have an object called Address, in a class called Vendor, and you have another object called Address, in a class called Customer. These two objects have the same name, and are from the same universe. Will they automatically merge? No. Web Intelligence is smart enough to know that they are not the same object. Of course, in this case, you probably won’t want them to merge. But if you do, you will need to manually merge them.
Tip 3: Values displayed depend on which object is used

Sometimes, the values between two merged dimensions don’t completely match. For example, you may have a list of product numbers from query 1, and a list of product numbers from query 2, and perhaps some of the product numbers in query 1 don’t show up in query 2. That’s OK. But which list of product numbers will appear on the report? Well, that depends on which Product Number object you use.
If you display the Product Number object from Query 1, you will see all the Product Numbers from Query 1. If you display the Product Number object from Query 2, you will see all the Product Numbers from Query 2. However, if you merge the two Product Number objects, and display the merged dimension, you will get all product numbers from both queries. For those of you familiar with SQL, this is the equivalent of a Full Outer Join.
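For readers who like to see that behavior spelled out, here is a minimal Python sketch with made-up product numbers (illustrating the rule, not Webi's internal implementation):

```python
# Which product numbers appear depends on which object you display.
query1 = {"P100", "P200", "P300"}   # product numbers from Query 1
query2 = {"P200", "P300", "P400"}   # product numbers from Query 2

only_q1 = sorted(query1)            # display the Query 1 object
only_q2 = sorted(query2)            # display the Query 2 object
merged = sorted(query1 | query2)    # display the merged dimension: the union

print(only_q1)  # ['P100', 'P200', 'P300']
print(only_q2)  # ['P200', 'P300', 'P400']
print(merged)   # ['P100', 'P200', 'P300', 'P400']
```

The union is exactly what a SQL full outer join on the key would give you.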
Tip 4: “Extend merged dimension values” has a similar effect to using the merged dimension

In the document properties, you will find a property called “Extend merged dimension values”. This is a fairly useless feature, as it has a similar effect to using the merged dimension. Therefore, I never use it. I just follow the rules in Tip 3, above, to determine which values will be displayed.
Tip 5: There are rules to merging dimensions
  • Only dimensions defined in the universe can be merged. You cannot merge using variables.
  • Objects must have the same data type. You cannot merge a number with a string, even if the values match.
  • Any number of queries can be merged. There is no limit.
  • Any number of dimension objects can be merged between two queries. Again, no limit.
  • Values are case-sensitive. So, if the values are the same, but of different case, they will not match. They will be shown as different values.
  • Watch out for trailing blanks. Even if the values look exactly the same, they won’t match if one has a trailing blank, and the other one doesn’t.
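The last two rules, case sensitivity and trailing blanks, cause most of the "why don't my values match?" questions. A small Python sketch with made-up values shows both failure modes:

```python
# Merged-dimension values must match exactly: matching is case-sensitive,
# and trailing blanks count. (Illustrative sketch with hypothetical values.)
q1_values = ["ACME", "Acme ", "Widget"]
q2_values = ["acme", "Acme", "Widget"]

matches = set(q1_values) & set(q2_values)
print(matches)  # {'Widget'} -- 'ACME' vs 'acme' (case) and 'Acme ' vs 'Acme' (blank) both fail

# Normalizing the values (e.g., in the universe or query) fixes both issues:
normalized = {v.strip().upper() for v in q1_values} & {v.strip().upper() for v in q2_values}
print(sorted(normalized))  # ['ACME', 'WIDGET']
```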
Conclusion
Merging dimensions is the only way to combine data from different data sources in the report. Therefore, it’s a very powerful feature, especially if you understand how it works, and how to make it work. If you’re trying to get your merged dimensions to work, and they just won’t cooperate, read through the tips above, and you’re likely to find the solution. If you have other tips, feel free to comment below.
Thanks for reading. :-)

Thursday, June 4, 2015

Tips for BW


Converting Standard InfoCubes to SAP HANA Optimized InfoCubes

1. Log in to the SAP NetWeaver BW system.

2. Call up transaction RSMIGRHANADB or run the program RSDRI_CONVERT_CUBE_TO_INMEMORY.

The job is executed in the background as a stored procedure. After the job is finished, the standard InfoCube is converted to an SAP HANA optimized InfoCube.

After the conversion, the InfoCube dimension tables are removed, and the master data tables are now directly linked with the F-fact table.

 
 
Modeling InfoProviders with the Composite Provider
 
You can avoid data redundancy or extensive coding when modeling data relations with the use of the composite provider and joins.
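The join-versus-union idea behind the composite provider can be illustrated with a minimal Python sketch (assumed sample data; this shows the modeling concept, not the actual CompositeProvider runtime):

```python
# Two existing providers, combined virtually instead of copied into a new store.
sales = [("P1", 100), ("P2", 200)]           # provider 1: (product, revenue)
stock = [("P1", 5), ("P2", 7), ("P3", 2)]    # provider 2: (product, quantity)

# Union: stack the rows, padding the columns the other provider lacks.
union_rows = [(p, r, None) for p, r in sales] + [(p, None, q) for p, q in stock]

# Join: match rows on the shared key (product).
stock_by_product = dict(stock)
join_rows = [(p, r, stock_by_product[p]) for p, r in sales if p in stock_by_product]
print(join_rows)  # [('P1', 100, 5), ('P2', 200, 7)]
```

Either way, no data is duplicated into a new persistent object; the combination is defined on top of the existing providers.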


Friday, May 8, 2015

Hierarchy to a Flat File

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/0403a990-0201-0010-38b3-e1fc442848cb?overridelayout=true

Using 0FI_GL_10 or 0FI_GL_14

http://mohammadusama.blogspot.com/2014/06/when-to-use-0figl10-and-0figl14.html

Use 0FI_GL_14:


0FI_GL_14 provides detail-level information on account numbers at the day level (i.e., posting date). All line items in FBL3N can be reported.

Daily debits, credits, and balances for the selected period can be reported in local and foreign currency.

Mainly applicable to expense line items (P&L line items).

Uses the AIED delta method, so a DSO is required. Install the 0FIGL_O14 DSO.



Use 0FI_GL_10:

Provides the cumulative balance at the posting period level.

The relevant report in ECC is S_PL0_86000030.

Mainly used to report balance sheet line items where a cumulative balance is to be reported, e.g., fixed assets, current assets, and current liabilities G/L accounts.

Uses the AIE delta method, so a DSO is required. Install the 0FIGL_O10 DSO.


Alternatively, 0FI_GL_12 can also be used, but the cumulative balance has to be calculated using a start routine in the transformation from the DataSource to the cube. No DSO is needed, as it works with the ADDD (additive delta) method, so the cube is loaded directly.

Wednesday, April 15, 2015

How to reset SAP* user in BW on HANA system

                          

09:07 // Mark Foerster
Hello Jack,
the table USR02 typically isn't buffered any more, so that change should be simple. Log on to the HANA server as sidadm.
hdbsql -i $$ -u SAPSR3 -p password_of_sapsr3
($$ is the instance number of the HANA database)
hdbsql=> update USR02 set uflag=0 where mandt='000' and bname='SAP*'

Monday, April 13, 2015

Creating Aggregates and Data Rollup 7.X

http://scn.sap.com/community/data-warehousing/bw/blog/2013/07/07/basics-of-cube-aggregates-and-data-rollup

Important tables in SAP BW 7.x

 

Listing of commonly used tables in SAP BI

Listed below are some of the tables commonly used in SAP BW 7.x. Note that the same has not been tested in NW2004s, so please make corrections if required. Some of this material has gone through a lot of eyeballs before making it to the final list and could still contain some older or incorrect references; however, most entries have been validated.
Transfer Structure
RSTS - Transfer Structure List
RSTSFIELD - Transfer Structure fields
RSTSRULES - Transfer Structure rules
RSAROUTT - Text name of Transfer Routine
DD03T - Text for R/3 Transfer structure Objects

Update Rules
RSUPDROUT - Update rules List
RSUPDDAT - Update rules with routines
RSUPDKEY - Update rule key fields
RSUPDINFO - InfoProvider to InfoSource correlation

Embedded ABAP coding for Transfer / Update Rules
RSAABAP - ABAP source code per object routine

InfoPackage
RSLDPIO - Links DataSource to InfoPackages
RSLDPIOT - InfoPackage Text Description
RSLDPRULE - ABAP source code for InfoPackages
RSLDPSEL - Hardcoded selections in InfoPackages
RSMONICDP - Contains the request ID number by data target
RSPAKPOS - List of InfoPackage Groups / InfoPackages

Process Chain
RSEVENTCHAIN - Event Chain Processing Event Table
RSEVENTHEAD - Header for the event chain
RSEVENTHEADT - Texts for the event chain header
RSPCCHAIN - Process chain details
RSPCCHAINATTR - Attributes for a Process Chain
RSPCCHAINEVENTS - Multiple Events with Process Chains
RSPCCHAINT - Texts for Chain
RSPCCOMMANDLOG - System Command Execution Logs (Process Chains)
RSPCLOGCHAIN - Cross-Table Log ID / Chain ID
RSPCLOGS - Application Logs for the Process Chains
RSPCPROCESSLOG - Logs for the Chain Runs
RSPCRUNVARIABLES - Variables for Process Chains for Runtime
RSPC_MONITOR - Monitor individual process chains

Queries
RSZELTDIR - Directory of the reporting component elements
RSZELTTXT - Texts of reporting component elements
RSZELTXREF - Directory of query element references
RSRREPDIR - Directory of all reports (Query GENUNIID)
RSZCOMPDIR - Directory of reporting components
RSZRANGE - Selection specification for an element
RSZSELECT - Selection properties of an element
RSZCOMPIC - Assignment of reusable component <-> InfoCube
RSZELTPRIO - Priorities with element collisions
RSZELTPROP - Element properties (settings)
RSZELTATTR - Attribute selection per dimension element
RSZCALC - Definition of a formula element
RSZCEL - Query Designer: Directory of Cells
RSZGLOBV - Global Variables in Reporting
RSZCHANGES - Change history of reporting components

Workbooks
RSRWBINDEX - List of binary large objects (Excel workbooks)
RSRWBINDEXT - Titles of binary objects (Excel workbooks)
RSRWBSTORE - Storage for binary large objects (Excel workbooks)
RSRWBTEMPLATE - Assignment of Excel workbooks as personal templates
RSRWORKBOOK - Where-used list for reports in workbooks

Web templates
RSZWOBJ - Storage of the Web Objects
RSZWOBJTXT - Texts for Templates/Items/Views
RSZWOBJXREF - Structure of the BW Objects in a Template
RSZWTEMPLATE - Header Table for BW HTML Templates

InfoObject
RSDIOBJ - Directory of all InfoObjects
RSDIOBJT - Texts of InfoObjects
RSDATRNAV - Navigation Attributes
RSDATRNAVT - Texts for Navigation Attributes
RSDBCHATR - Master Data Attributes
RSDCHABAS - Basic Characteristics (for Characteristics, Time Characteristics, and Units)
RSDCHA - Characteristics Catalog
RSDDPA - Data Package Characteristic
RSDIOBJCMP - Dependencies of InfoObjects
RSKYF - Key Figures
RSDTIM - Time Characteristics
RSDUNI - Units

InfoCube
RSDCUBE - Directory of InfoCubes
RSDCUBET - Texts on InfoCubes
RSDCUBEIOBJ - Objects per InfoCube (where-used list)
RSDDIME - Directory of Dimensions
RSDDIMET - Texts on Dimensions
RSDDIMEIOBJ - InfoObjects for each Dimension (Where-Used List)
RSDCUBEMULTI - InfoCubes involved in a MultiCube
RSDICMULTIIOBJ - MultiProvider: Selection/Identification of InfoObjects
RSDICHAPRO - Characteristic Properties Specific to an InfoCube
RSDIKYFPRO - Flag Properties Specific to an InfoCube
RSDICVALIOBJ - InfoObjects of the Stock Validity Table for the InfoCube

Aggregates
RSDDAGGRDIR - Directory of Aggregates
RSDDAGGRCOMP - Description of Aggregates
RSDDAGGRT - Text on Aggregates
RSDDAGGLT - Directory of the aggregates, texts

ODS Object
RSDODSO - Directory of all ODS Objects
RSDODSOT - Texts of all ODS Objects
RSDODSOIOBJ - InfoObjects of ODS Objects
RSDODSOATRNAV - Navigation Attributes for ODS Object
RSDODSOTABL - Directory of all ODS Object Tables

PSA
RSTSODS - Directory of all PSA Tables

DataSource (= OLTP Source)
ROOSOURCE - Header Table for SAP BW DataSources (SAP Source System/BW System)
RODELTAM - BW Delta Procedure (SAP Source System)
RSOLTPSOURCE - Replication Table for DataSources in BW

InfoSource
RSIS - Directory of InfoSources with Flexible Update
RSIST - Texts on InfoSources with Flexible Update
RSISFIELD - InfoObjects of an InfoSource

Communications Structure
RSKS - Communications Structure for InfoSources with Flexible Update
RSKS - Communications Structure (View) for Attributes for an InfoSource with Direct Update
RSKSFIELD - Texts on InfoSources with Flexible Update
RSISFIELD - InfoObjects of an InfoSource with Flexible Update

Transfer Structure
RSTS - Transfer Structure in SAP BW
ROOSGEN - Generated Objects for a DataSource (Transfer Structure, for example, in SAP Source System)

Mapping
RSISOSMAP - Mapping Between InfoSources and DataSources (= OLTP Sources)
RSOSFIELDMAP - Mapping Between DataSource Fields and InfoObjects

InfoSpoke
RSBSPOKESELSET - InfoSpoke Directory and Selection Options
RSBSPOKEVSELSET - InfoSpoke Directory with Selection Options and Versioning
RSBSPOKE - List of all InfoSpokes with attributes maintained with transaction RSBO, including the names of the Source & Target Structures
RSBSPOKET - List of all InfoSpokes with the Short & Long Descriptions (only one of these can be maintained)
RSBSTEPIDMESS - Contains all the messages that have been recorded during the execution of an InfoSpoke. This table can be added to using the ABAP Class/Method i_r_log->add_sy_message.

SAP BW Statistics
RSDDSTAT - Basic Table for InfoCubes/Queries
RSDDSTATAGGR - Detail Table for Aggregate Setup
RSDDSTATAGGRDEF - Detail Table of Navigation for each InfoCube/Query
RSDDSTATCOND - InfoCube Compression
RSDDSTATDELE - InfoCube Deletions
RSDDSTATWHM - Warehouse Management

Misc
RSFEC - BW Frontend Check. Useful for checking the installed SAP GUI versions on user machines.
RSSELDONE - InfoPackage selection and job program; the field UPDMODE holds the update status (INIT/DELTA/FULL)
RSPSADEL - PSA Table deletion
TBTCP - Job Schedule Definition
TBTCO - Job Schedule Result
RSMONMESS - Monitor Messages
RSERRORLOG - Check loading errors in table
V_RSZGLOBV - Report Variables view table
DEVACCESS - Developer Keys table
TSTC - All Transactions in the system
RSDDAGGRDIR - Directory of the aggregates
ROIDOCPRMS - Control parameters for data transfer from the source system
SMEN_BUFFC - Objects in User's Favorites
TSTCT - Transaction Codes with Text
DD03L - Field names and corresponding data element names
DD03LT - Description of each data element
DD02L - All SAP Defined Table Names
DD02LT - Description of All SAP Defined Table Names

Web Item
RSZWITEM - Header Table for BW Web Items
RSZWMDITEM - BW Web Metadata: Template Item (DataProvider, Item, ...)
RSZWITEMXREF - Cross Reference of Web Items
RSZWMIMEIOBUFFER - Buffer for Translation MIME Rep. to IO
RSZWOBJ - Storage of the Web Objects
RSZWOBJTXT - Texts of Templates/Items/Views
RSZWOBJXREF - Structure of the BW Objects in a Template
RSZWTEMPLATE - Header Table for BW HTML Templates

Archiving
RSARCHIPRO - BW Archiving: General Archiving Properties
RSARCHIPROIOBJ - BW Archiving: General Archiving Properties
RSARCHIPROLOC - BW Archiving: General Local Properties
RSARCHIPROLOCSEL - BW Archiving: Archived Data Area
RSARCHIPROPID - BW Archiving: Program References of InfoProvider
RSARCHREQ - BW Archiving: Archiving Request
RSARCHREQFILES - BW Archiving: Verified Archive Files
RSARCHREQSEL - BW Archiving: Request Selections

Open Hub Destination
RSBOHSERVICETP - Open Hub: Service Types
RSBREQUESTDELTA - Open Hub: Cross Reference Outbound/Inbound
RSBREQUESTMESS - Open Hub: Log for a Request
RSBREQUID - Open Hub: Requests
RSBREQUID3RD - Open Hub: Status of 3rd Party Requests
RSBREQUIDRUN - Open Hub: Table with Status for a Request
RSBSTRUCTURE - Open Hub: Generated Structures and Tables

Monday, April 6, 2015

BW and In-Memory Storage

Today I’m going to talk to you a bit about how data is stored in BW and why proper planning for storage is important. Let’s say, for example, you have a large project coming down the pipeline and are unsure how the new data being introduced into your production landscape will affect the BW environment. You probably have a mature SAP installation at your company, and chances are you are using a Business Warehouse Accelerator (BWA) appliance or HANA (don’t forget to factor these in). It’s essential to plan a strategy with the database administrators, as they are directly impacted by each and every new project. New projects mean more data being added, which means more space required within the database.
The 10% rule – BWA Sizing:
An easy way to ballpark how much memory is required to index BW data onto your BWA is to follow the 10% rule.  That means for every Gigabyte of BW data, around 100 MB will be needed for BWA storage.  Thankfully, the TREX engine that BWA runs is capable of efficiently compressing and storing of data in memory at 1/10th the size of what BW stores it at.  This allows you to fit more data in BWA without having to worry too much about space constraints.
Database storage vs in-memory storage
Data can be stored cheaply on a disk based database versus an in-memory database.  Each type has their pros and cons.  Disk based storage is cheap, but the catch is slow performance.  In-Memory is fast, but extremely expensive.  That’s why it’s important for companies to properly plan for and balance out their data distribution to only index into memory the best candidates.
BWA and the 50% rule
BWA is made up of a series of blades. Let’s use HP’s 36GB blades for example’s sake. If your appliance has 14 blades, you would have 504 GB of space for data storage, right? Wrong. BWA has its own 50% rule.
Per SAP’s best practices, only half of each blade can be filled with data, meaning only 18GB of each 36GB blade is usable for indexing: 18GB * 14 blades = 252GB. The other 18GB is used for processing and computing. As you creep over the 50% index storage mark, expect to see performance degrade exponentially.
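The two sizing rules above can be ballparked in a few lines. This is a minimal sketch using the example figures from this post (your BW data volume is the assumption you would swap in):

```python
# 10% rule: every GB of BW data needs roughly 100 MB of BWA memory.
bw_data_gb = 1000                      # assumed size of the BW data to index
bwa_needed_gb = bw_data_gb * 0.10      # -> BWA memory required

# 50% rule: only half of the installed blade RAM is usable for indexes.
blades = 14
blade_ram_gb = 36
total_ram_gb = blades * blade_ram_gb           # 504 GB installed
usable_for_index_gb = total_ram_gb * 0.50      # 252 GB usable for indexing

print(bwa_needed_gb)        # 100.0 -> GB of BWA memory needed for the project
print(usable_for_index_gb)  # 252.0 -> GB actually available for indexes
```

So a 14-blade appliance comfortably holds this example workload, but with only 252 GB of effective index space, not the 504 GB on the spec sheet.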
Our BWA is full, now what?
RAM within BWA is very, very expensive.  Unless everything is being used equally within the BWA, I would recommend performing an As-Is assessment of all indexed InfoCubes and remove the least utilized InfoCubes to free up space for more popular cubes.  RSDDBIASTATUSE is a handy BW table that stores the execution history of queries against the BWA by cube.  Take a look at query execution history to better identify the slackers.  You will be surprised by how much space you could easily regain by removing a few hogs.
How much space is a cube taking up on BWA memory?
Transaction TREXADMIN allows you to view exactly how much space a table is taking up on BWA.  Focus solely on the F table which consists of both the E and F tables from the BW side.  There is no E table on BWA, just F.
Firstly, we type in an InfoCube we know is indexed in BWA.  In this example I’m using cube 0PCA_C01.  Navigate to the Index Admin tab and type *0PCA_C01* in order to pull all relevant tables for this cube.  Now focus on the F table and the Memory Size column.  This cube is taking up 211,292 KB or 0.2 GB of BWA’s total memory.

How much space is a cube taking on BW disk?
Transaction DB02 allows you to view size at a database level.  Go into BW Analysis and access both the E and F cube object areas.  Below you can see each row from each area.  If you combine both of these, you will have around 3.5GB of data.
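As a quick sketch of the comparison, using the example figures from this post:

```python
# Compression ratio for the example cube: BW disk size vs. BWA memory size.
bw_size_gb = 3.5      # combined E + F table size from DB02
bwa_size_gb = 0.2     # F index memory size from TREXADMIN

ratio = bw_size_gb / bwa_size_gb
print(round(ratio, 1))  # 17.5 -- this cube compresses well beyond the 10% ballpark
```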

As you can see, the compression ratio can change drastically from cube to cube: 3.5GB of data on BW for this cube only takes up 0.2 GB on BWA! Feel free to leave comments below and share with your friends and colleagues.

Wednesday, March 25, 2015

This Content replaces the existing "SAP HANA Cookbook"

First Guidance
 
The First Guidance documents are the beginning of a series of documents that should help you better understand the various concepts of SAP BW powered by SAP HANA. The documents are still “work in progress”, so these guides are not intended to be exhaustive. The purpose of these documents is to deliver additional information, besides SAP Help and blogs, to give a better understanding of the concepts of SAP BW on SAP HANA.


 
 
SAP First Guidance - Business Warehouse on SAP HANA Installation (Updated version: now reflects the 7.31/7.40 releases as well, the usage of the SL toolset (SWPM, SUM, etc.), and the current SAP HANA 1.0 SP08 release and its actual revisions.)
BW on HANA has been available since mid-April 2012, and this document has been enhanced a few times since the first version.
On the basis of this information, this “SAP First Guidance” document is released to support Basis administrators in running and configuring a Business Warehouse on a SAP HANA system. SAP First Guidance - Business Warehouse on SAP HANA Installation provides answers to major questions and issues, as well as workarounds and additional details to complement the standard SAP Guides and SAP Notes. This SAP First Guidance document has no claim of completeness, but it is the most complete starting point for a successful implementation.
For more information please contact roland.kramer@sap.com
 
 
 
SAP First Guidance - Migration BW on HANA using the DMO option in SUM
The database migration option (DMO) is an option inside the Software Update Manager (SUM) for a combined update and migration: update an existing SAP system to a higher software release and migrate to the SAP HANA database in one technical step. As the technical SUM steps are the same, this “SAP First Guidance” document should make all customer-specific documentation obsolete. It is the complementary documentation to the existing Notes and SUM/DMO Upgrade Guides.
DMO can be used with every BW release from 7.0x onwards. It makes the two-step approach (upgrade/migration) and the usage of the BW post-copy automation (BW-PCA) obsolete. It can also be used within a release, e.g. to go from 7.40 SP06 on anyDB to 7.40 SP07 on HANA.
For more information please contact roland.kramer@sap.com
 
 


SAP First Guidance - Implementing BW-MT for BW-aDSO
SAP BW 7.4 SP08 powered by SAP HANA is the next milestone for enterprise data warehousing with BW on HANA and provides the next level of simplification for BW. In addition, the SAP BW on HANA Virtual Data Warehouse capabilities have been enhanced for even more flexibility. In the midterm, the advanced DSO shall replace the main BW InfoProviders with persistency (InfoCube, DSO, PSA). The classic InfoProviders are, of course, still supported and co-exist with the advanced DSOs.
For more information, please contact roland.kramer@sap.com
 
 


SAP First Guidance - SAP NLS Solution with Sybase IQ
This “SAP First Guidance” document provides all the necessary information to help quickly implement this newly released option:
Store historical BW data on an external IQ server to optimize the system size when preparing to migrate to SAP BW powered by SAP HANA.
The SAP NLS solution with Sybase IQ also helps to keep down the TCO of investing in SAP In-Memory, since historical (“frozen”) data is stored on a less sophisticated device, which acts like a “BW Accelerator on Disk”. SAP Sybase IQ is therefore the perfect smart store for this kind of data.
Please note that the SAP NLS solution can be used with all database versions supported by SAP BW 7.3x; SAP HANA is not necessary. The document is a “work in progress” and is not intended to be exhaustive. It does, however, contain all the information required for a successful implementation of the SAP-NLS solution with Sybase IQ. SAP-NLS can be used with every SAP-supported database, where SAP IQ will be the secondary database for data reallocation.
For more information, please contact roland.kramer@sap.com
 
 
SAP First Guidance - SEM/BW Modelling in SolMan 7.1 with MOPZ
Due to constant questions about the upgrade to NetWeaver 7.3x and 7.40 including SEM add-on components, we created a SAP First Guidance document which describes the successful definition of a SAP BW 7.0/7.01 system with the SEM add-on installed on top. With this information, you will be able to integrate the stack.xml into the SUM (Software Update Manager) process and the DMO (database migration option) included in SUM, as the first input is the location of the stack.xml, which defines the download directory for SUM. Furthermore, the use of the stack.xml in the upgrade process enables a smooth integration into the upgrade process included in DMO.
For more information please contact roland.kramer@sap.com
 
 
 
 
SAP First Guidance - BW Housekeeping and BW-PCA
The system copy procedure for SAP BW systems and landscapes is complex for a number of reasons. There is a large number of configuration settings (such as connections and delta queue handling for data loading) and system copy scenarios for SAP BW (each with different landscape aspects). These have to be handled as part of every system copy, regardless of whether the system copy is part of the migration to SAP HANA or whether you want to perform regular system copies of your SAP BW landscape. BW-PCA can be used from SAP BW 7.0 onwards, depending on the SPS level.
Additionally, see the usage and implementation of the BW Housekeeping Task List and the Pre/Post Task Lists for Upgrade/Migration. These are included in the database migration option (DMO) as part of the Software Update Manager (SUM). Additional details can also be found on the BW ALM page.
For more information please contact roland.kramer@sap.com
 
------------------------------------------------------------------------------------------------------------
 
 
SAP First Guidance - SAP BW 7.40 on HANA - HANA Analysis Processes
Starting with SAP BW 7.4 SP5 on HANA, a new feature is introduced to analyze data from certain perspectives, for example to calculate ABC classes or scoring classes. This new feature is called the SAP HANA Analysis Process and enables the data warehouse modeler to analyze the data using different predefined or self-written functions or scripts. HANA provides integration with numerous specialized libraries, like PAL, AFL, and R, to understand the correlations in the data in the existing EDW.
 
SAP First Guidance - SAP BW 7.40 on HANA - View Generation
The focus of this document is the exposure of BW data natively in HANA as HANA views that point directly to the data and tables managed by BW. This enables HANA-native consumption of BW data.
 
 
 
SAP First Guidance - SAP BW 7.30 on HANA - Inventory InfoCubes
This SAP First Guidance document describes the handling of non-cumulative InfoCubes on SAP HANA. As the handling of Inventory InfoCubes changed with SAP BW 7.30 based on SAP HANA, this document briefly describes the differences. For a better understanding, it is recommended to read the How-to Guide based on the previous release, How to Handle Inventory Management Scenarios in BW.
 
 
SAP First Guidance - SAP BW 7.40 SP05 on HANA - OpenODSView
In SAP BW 7.4 SP05 on HANA the new metadata object Open ODS View is introduced, which provides the data warehouse modeler with a flexible, easy to use tool to integrate external data in the EDW Core Data Warehouse.
 
 
SAP First Guidance - SAP BW 7.40 SP05 powered by SAP HANA - CompositeProvider
A CompositeProvider is an InfoProvider in which you can combine data from BW InfoProviders such as InfoObjects, DataStore Objects, SPOs and InfoCubes, or SAP HANA views such as Analytical or Calculation Views using join or union operations to make the data available for reporting. This paper explains how to create a CompositeProvider in SAP BW 7.4 SP5 using the BW modeling tools in SAP HANA Studio.

SAP Development Tools for Eclipse

ABAP Development Tools for SAP NetWeaver

This site describes how to install and update the front-end components of ABAP Development Tools for SAP NetWeaver (ADT).
It also provides you with detailed information on how to prepare the relevant ABAP back-end system for working with ADT.

Prerequisites

Eclipse Platform: Luna (4.4) or Kepler (4.3)
Operating System:
  • Windows 7 32- or 64-Bit, or
  • Apple Mac OS X 10.8, Universal 64-Bit, or
  • Linux distribution
* Compatibility is no longer tested by the Eclipse Community since Eclipse Kepler (4.3)
Java Runtime: JRE version 1.6 or higher, 32-Bit or 64-Bit
SAP GUI:
  • For Windows OS: SAP GUI for Windows 7.30
  • For Apple Mac or Linux OS: SAP GUI for Java 7.30
Microsoft VC Runtime: For Windows OS, the VS2010 runtime DLLs are required for communication with the back-end system.
NOTE: Install either the x86 or the x64 variant, according to your 32- or 64-Bit Eclipse installation.

Procedure

To install the front-end component of ADT, proceed as follows:
  1. Get an installation of Eclipse Luna (recommended) or Eclipse Kepler.
  2. In Eclipse, choose in the menu bar Help > Install New Software...
  3. For Eclipse Luna (4.4), add the URL https://tools.hana.ondemand.com/luna.
    For Eclipse Kepler (4.3), add the URL https://tools.hana.ondemand.com/kepler.
  4. Press Enter to display the available features.
  5. Select ABAP Development Tools for SAP NetWeaver and choose Next.
  6. On the next wizard page, you get an overview of the features to be installed. Choose Next.
  7. Confirm the license agreements and choose Finish to start the installation.

More Information

To configure the relevant ABAP back-end system, follow the steps in the how-to guide Configuring the ABAP Back-end for ABAP Development Tools.
For more information about ABAP Development Tools, see our community.
The downloads are provided under the terms of the SAP DEVELOPER LICENSE AGREEMENT.

Monday, March 2, 2015

Difference between MB52 and MB5B in Inventory

MB52 ONLY SHOWS THE CURRENT FINAL STOCK. MB5B, on the other hand, shows the stock at a given point in time X; for example, you can see the stock you had as of December 31, 2013. It also lets you see which movements occurred in a given time interval, for example, which movements were made for the material BOLIGRAFOS between September 1 and September 10.
That is the difference. Of course, MB5B demands a bit more processing from your server. Another transaction similar to MB5B, which I use more because its structure as a report is easier to understand, is MB51, which does almost the same thing.


The first one, MB52, only shows the inventory stock as of the current date, by plant or by storage location. MB5B, in contrast, shows the movements of materials (or of a specific material) within a date range. This transaction is more suited to analyzing accounting and logistics movements; that is, it includes, for example, the price updates performed in MR21, as well as the various adjustments that FI can make to a given material. That is why it becomes somewhat complex to understand: if you don't enter an appropriate date range, the opening and closing balances may not match those of MB52. I generally use it to analyze price behavior when the material uses price control V.

How to handle inventory management in BW

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328?QuickLink=index&overridelayout=true&5003637719724

jueves, 26 de febrero de 2015

Full/Delta/Initialize delta update methods


Introduction

The update method determines how data coming from the source system is transferred to the BI system at the InfoPackage level. We set the update method on the Update tab of the InfoPackage.

The update methods available in the InfoPackage are:

1. Full Update
2. Delta Update
3. Initialize Delta Process
    (I) Initialize with data transfer
    (II) Initialize without data transfer
    (III) Early Delta Initialization

1. Full Update

A full update extracts all of the data from the source system to the PSA in BI 7 every time it runs.

2. Delta Update

A delta update extracts only the delta records from the BW delta queue in the source system to the BI system.

We must initialize the delta first; otherwise it is not possible to load delta records.

The following are the four delta types for a DataSource:

F: Flat file provides the delta
E: Extractor determines the delta (e.g. LIS, CO-PA)
D: Application determines the delta (e.g. LO, FI-AR/AP)
A: Use ALE change log delta

Note: We can check the delta properties in the ROOSOURCE table in the source system using transaction SE16.
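As a small hedged sketch, the same lookup can be done programmatically instead of browsing in SE16. The field names below (OLTPSOURCE for the DataSource name, OBJVERS for the version, DELTA for the delta process) are the standard ROOSOURCE fields, and 2LIS_02_SDR is just an example DataSource; verify both in your own system before relying on this.

```abap
* Illustrative sketch: read the delta process of a DataSource
* directly from ROOSOURCE (active version 'A' only).
DATA lv_delta TYPE roosource-delta.

SELECT SINGLE delta
  FROM roosource
  INTO lv_delta
  WHERE oltpsource = '2LIS_02_SDR'
    AND objvers    = 'A'.

IF sy-subrc = 0.
  WRITE: / 'Delta process:', lv_delta.
ENDIF.
```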

3. Initialize Delta Process

To get delta records, we must first initialize the delta process. During initialization, the system sets an initialization flag for the source system (visible in the Scheduler menu of the InfoPackage in BI) and creates a BW delta queue entry per DataSource in the source system (RSA7). This enables the time-stamp mechanism.

Initialize with data transfer

If you select this option, the init request extracts the data from the source system to the BI system and enables delta functionality.

Steps for initialize with data transfer

1. Lock the users in the source system.
2. Delete the contents of the setup tables for the specific application component in the source system (transaction LBWG).
3. Fill the setup tables (SBIW, or OLI*BW with 1, 2, 3, ... in place of * according to the application).
4. Run the InfoPackage with "Initialize with data transfer".
5. Unlock the users in the source system.

Note: This is a very time-consuming process, because the users must remain locked until the data reaches the BI system, which affects the client's business.

Initialize without data transfer

In some cases the init was successful but someone has deleted the init flag. To set the flag again and enable delta loads without disturbing the data, we run the InfoPackage with this option.

Steps for initialize without data transfer

1. Lock the users in the source system.
2. Delete the setup table contents for the specific application component.
3. Fill the setup tables.
4. Run the InfoPackage with the option "Initialize without data transfer".
5. Unlock the users in the source system.
6. Load the data to the BI system using a repair full request InfoPackage.

Note: With this method, the users can be unlocked as soon as the setup tables have been filled, which makes it a better option than "Initialize with data transfer".

Early Delta Initialization

With this option, we can perform the delta initialization before filling the setup tables, so users can keep posting documents while the setup tables are being filled. The posted records are picked up in the next delta run.

Steps for early delta initialization

1. Run the InfoPackage with the Early Delta Initialization option. This enables the BW delta queue and sets the delta time stamp in the source system.
2. Delete the setup tables for the application component.
3. Fill the setup tables.
4. Load the setup table data using a repair full request InfoPackage (Scheduler menu option of the InfoPackage).

How do we check whether a DataSource supports early delta initialization?

1. Go to SE16 in ECC and enter the table name ROOSOURCE.
2. On the next screen, enter the DataSource name, e.g. 2LIS_02_SDR (purchasing document header DataSource).
3. If the field ZDD_ABLE has the value 'X', the DataSource supports early delta initialization.
4. If the field is blank, the DataSource does not support early delta initialization.
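The same ZDD_ABLE check can be sketched in ABAP rather than done manually in SE16. This is only an illustration under the assumptions above (ZDD_ABLE as the early-delta flag in ROOSOURCE, 2LIS_02_SDR as the example DataSource):

```abap
* Illustrative sketch: check the early-delta capability flag of a
* DataSource in ROOSOURCE (active version 'A').
DATA lv_early TYPE roosource-zdd_able.

SELECT SINGLE zdd_able
  FROM roosource
  INTO lv_early
  WHERE oltpsource = '2LIS_02_SDR'
    AND objvers    = 'A'.

IF lv_early = 'X'.
  WRITE: / 'Early delta initialization is supported.'.
ELSE.
  WRITE: / 'Early delta initialization is not supported.'.
ENDIF.
```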

miércoles, 25 de febrero de 2015

Authorizations in BW (how to bypass the check)

Function module RS_HDSYS_CALL_TC_VARIANT, in SE37:
I uncheck the Authority Check flag and enter the name of the transaction I need.
 
 

viernes, 20 de febrero de 2015

Tables for Composites


RSDDCOPR - composite provider Directory BW - OLAP Technology


USH04 - Change history for authorizations Basis - User and Authorization Management

RSDICMULTIIOBJ - Multiprovider: Selection/Identification of InfoObjects BW - Data Basis

V_CMP_JOIN - Generated table for a view BW - End User Technology

MYCOM - Bal. Sheet Valuation: Table of Database Commits Carried Out MM - Balance Sheet Valuation Procedures

MPROV - Generated table for a view Logistics - Material Master

SMENSAPT - Texts for Menu - SAP Basis - Session Manager


RSDDTREXADMIN - Additional Settings for Indexing an HPA Index BW - OLAP Technology

SSM_RFC - Variables for RFC Destinations in the Workplace Basis - ABAP Authorization and Role Management

T71JPR09 - Survey Job / Internal Job Matching Data Personnel Mgmt - Job Pricing


RSDDTPS - Date and Language for Polestar BW - OLAP Technology

SRMWFPATH - Workflow Process Route Basis - SAP Records Management

USUSER_AGR - GUM: Assignment of Role to User Basis - User and Authorization Management

RSQFOBJ - Field Objects in the InfoSet BW - Metadata (Repository)

AIND_STR5 - AS: Function Modules for Application-Dependent Reporting Basis - Archive Information System

RSDDTREXDIRTABL - Table of an HPA Index BW - OLAP Technology

Routines for Day, Month and Year


Routine for Day

PROGRAM trans_routine.

*---------------------------------------------------------------------*
*       CLASS routine DEFINITION
*---------------------------------------------------------------------*
CLASS lcl_transform DEFINITION.
  PUBLIC SECTION.

*   Attributes
    DATA:
      p_check_master_data_exist TYPE rsodsocheckonly READ-ONLY,
*-    Instance for getting request runtime attributes;
*     available information: refer to the methods of
*     interface 'if_rsbk_request_admintab_view'
      p_r_request TYPE REF TO if_rsbk_request_admintab_view READ-ONLY.

  PRIVATE SECTION.

    TYPE-POOLS: rsd, rstr.

*   Rule specific types
    TYPES:
      BEGIN OF _ty_s_sc_1,
*       Field: POSTING_DATE (time stamp).
        posting_date TYPE p LENGTH 8 DECIMALS 0,
*       Field: RECORD.
        record TYPE rsarecord,
      END OF _ty_s_sc_1.
    TYPES:
      BEGIN OF _ty_s_tg_1,
*       InfoObject: ZPDIA (day).
        /bic/zpdia TYPE /bic/oizpdia,
      END OF _ty_s_tg_1.

*$*$ begin of global - insert your declaration only below this line  *-*
    ... "insert your code here
*$*$ end of global - insert your declaration only before this line   *-*

    METHODS compute_zpdia
      IMPORTING
        request       TYPE rsrequest
        datapackid    TYPE rsdatapid
        source_fields TYPE _ty_s_sc_1
        segid         TYPE rsbk_segid
      EXPORTING
        result        TYPE _ty_s_tg_1-/bic/zpdia
        monitor       TYPE rstr_ty_t_monitor
      RAISING
        cx_rsrout_abort
        cx_rsrout_skip_record
        cx_rsrout_skip_val
        cx_rsbk_errorcount.

    METHODS invert_zpdia
      IMPORTING
        i_th_fields_outbound         TYPE rstran_t_field_inv
        i_r_selset_outbound          TYPE REF TO cl_rsmds_set
        i_is_main_selection          TYPE rs_bool
        i_r_selset_outbound_complete TYPE REF TO cl_rsmds_set
        i_r_universe_inbound         TYPE REF TO cl_rsmds_universe
      CHANGING
        c_th_fields_inbound          TYPE rstran_t_field_inv
        c_r_selset_inbound           TYPE REF TO cl_rsmds_set
        c_exact                      TYPE rs_bool.
ENDCLASS.                    "routine DEFINITION

*$*$ begin of 2nd part global - insert your code only below this line  *
... "insert your code here
*$*$ end of 2nd part global - insert your code only before this line   *

*---------------------------------------------------------------------*
*       CLASS routine IMPLEMENTATION
*---------------------------------------------------------------------*
CLASS lcl_transform IMPLEMENTATION.

*----------------------------------------------------------------------*
*       Method compute_ZPDIA
*----------------------------------------------------------------------*
*       This subroutine allows the mapping from source to target fields
*       of a transformation rule using ABAP for application-specific
*       coding.
*----------------------------------------------------------------------*
  METHOD compute_zpdia.

    DATA:
      monitor_rec TYPE rsmonitor.

*$*$ begin of routine - insert your code only below this line        *-*
*   Fill table MONITOR with values of structure MONITOR_REC to make
*   monitor entries. Raise CX_RSROUT_ABORT to cancel the update,
*   CX_RSROUT_SKIP_RECORD to skip a record, or CX_RSROUT_SKIP_VAL to
*   clear the target fields.

    DATA l_datetime(15) TYPE c.
    DATA l_date TYPE sy-datum.
    DATA dia TYPE /bi0/oicalmonth.

*   Take the first 8 characters (YYYYMMDD) of the posting-date time
*   stamp and keep the day part (DD)
    l_datetime = source_fields-posting_date.
    l_date     = l_datetime+0(8).
    dia        = l_date+6(2).

    result = dia.

*$*$ end of routine - insert your code only before this line         *-*
  ENDMETHOD.                    "compute_ZPDIA

*----------------------------------------------------------------------*
*       Inverse method invert_ZPDIA
*----------------------------------------------------------------------*
*       This subroutine needs to be implemented only for direct access
*       (for better performance) and for the Report/Report Interface
*       (drill through). The inverse routine should transform a
*       projection and a selection for the target into a projection and
*       a selection for the source. If the implementation remains
*       empty, all fields are filled and all values are selected.
*----------------------------------------------------------------------*
  METHOD invert_zpdia.
*$*$ begin of inverse routine - insert your code only below this line*-*
    ... "insert your code here
*$*$ end of inverse routine - insert your code only before this line *-*
  ENDMETHOD.                    "invert_ZPDIA
ENDCLASS.                    "routine IMPLEMENTATION


 

Routine for Month:




PROGRAM trans_routine.

*---------------------------------------------------------------------*
*       CLASS routine DEFINITION
*---------------------------------------------------------------------*
CLASS lcl_transform DEFINITION.
  PUBLIC SECTION.

*   Attributes
    DATA:
      p_check_master_data_exist TYPE rsodsocheckonly READ-ONLY,
*-    Instance for getting request runtime attributes;
*     available information: refer to the methods of
*     interface 'if_rsbk_request_admintab_view'
      p_r_request TYPE REF TO if_rsbk_request_admintab_view READ-ONLY.

  PRIVATE SECTION.

    TYPE-POOLS: rsd, rstr.

*   Rule specific types
    TYPES:
      BEGIN OF _ty_s_sc_1,
*       Field: POSTING_DATE (time stamp).
        posting_date TYPE p LENGTH 8 DECIMALS 0,
*       Field: RECORD.
        record TYPE rsarecord,
      END OF _ty_s_sc_1.
    TYPES:
      BEGIN OF _ty_s_tg_1,
*       InfoObject: ZPMES (month).
        /bic/zpmes TYPE /bic/oizpmes,
      END OF _ty_s_tg_1.

*$*$ begin of global - insert your declaration only below this line  *-*
    ... "insert your code here
*$*$ end of global - insert your declaration only before this line   *-*

    METHODS compute_zpmes
      IMPORTING
        request       TYPE rsrequest
        datapackid    TYPE rsdatapid
        source_fields TYPE _ty_s_sc_1
        segid         TYPE rsbk_segid
      EXPORTING
        result        TYPE _ty_s_tg_1-/bic/zpmes
        monitor       TYPE rstr_ty_t_monitor
      RAISING
        cx_rsrout_abort
        cx_rsrout_skip_record
        cx_rsrout_skip_val
        cx_rsbk_errorcount.

    METHODS invert_zpmes
      IMPORTING
        i_th_fields_outbound         TYPE rstran_t_field_inv
        i_r_selset_outbound          TYPE REF TO cl_rsmds_set
        i_is_main_selection          TYPE rs_bool
        i_r_selset_outbound_complete TYPE REF TO cl_rsmds_set
        i_r_universe_inbound         TYPE REF TO cl_rsmds_universe
      CHANGING
        c_th_fields_inbound          TYPE rstran_t_field_inv
        c_r_selset_inbound           TYPE REF TO cl_rsmds_set
        c_exact                      TYPE rs_bool.
ENDCLASS.                    "routine DEFINITION

*$*$ begin of 2nd part global - insert your code only below this line  *
... "insert your code here
*$*$ end of 2nd part global - insert your code only before this line   *

*---------------------------------------------------------------------*
*       CLASS routine IMPLEMENTATION
*---------------------------------------------------------------------*
CLASS lcl_transform IMPLEMENTATION.

*----------------------------------------------------------------------*
*       Method compute_ZPMES
*----------------------------------------------------------------------*
*       This subroutine allows the mapping from source to target fields
*       of a transformation rule using ABAP for application-specific
*       coding.
*----------------------------------------------------------------------*
  METHOD compute_zpmes.

    DATA:
      monitor_rec TYPE rsmonitor.

*$*$ begin of routine - insert your code only below this line        *-*
*   Fill table MONITOR with values of structure MONITOR_REC to make
*   monitor entries. Raise CX_RSROUT_ABORT to cancel the update,
*   CX_RSROUT_SKIP_RECORD to skip a record, or CX_RSROUT_SKIP_VAL to
*   clear the target fields.

    DATA l_datetime(15) TYPE c.
    DATA l_date TYPE sy-datum.
    DATA anomes TYPE /bi0/oicalmonth.

*   Take the first 8 characters (YYYYMMDD) of the posting-date time
*   stamp and keep the month part (MM)
    l_datetime = source_fields-posting_date.
    l_date     = l_datetime+0(8).
    anomes     = l_date+4(2).

    result = anomes.

*$*$ end of routine - insert your code only before this line         *-*
  ENDMETHOD.                    "compute_ZPMES

*----------------------------------------------------------------------*
*       Inverse method invert_ZPMES
*----------------------------------------------------------------------*
*       This subroutine needs to be implemented only for direct access
*       (for better performance) and for the Report/Report Interface
*       (drill through). The inverse routine should transform a
*       projection and a selection for the target into a projection and
*       a selection for the source. If the implementation remains
*       empty, all fields are filled and all values are selected.
*----------------------------------------------------------------------*
  METHOD invert_zpmes.
*$*$ begin of inverse routine - insert your code only below this line*-*
    ... "insert your code here
*$*$ end of inverse routine - insert your code only before this line *-*
  ENDMETHOD.                    "invert_ZPMES
ENDCLASS.                    "routine IMPLEMENTATION


 

Routine for Year


 



PROGRAM trans_routine.

*---------------------------------------------------------------------*
*       CLASS routine DEFINITION
*---------------------------------------------------------------------*
CLASS lcl_transform DEFINITION.
  PUBLIC SECTION.

*   Attributes
    DATA:
      p_check_master_data_exist TYPE rsodsocheckonly READ-ONLY,
*-    Instance for getting request runtime attributes;
*     available information: refer to the methods of
*     interface 'if_rsbk_request_admintab_view'
      p_r_request TYPE REF TO if_rsbk_request_admintab_view READ-ONLY.

  PRIVATE SECTION.

    TYPE-POOLS: rsd, rstr.

*   Rule specific types
    TYPES:
      BEGIN OF _ty_s_sc_1,
*       Field: POSTING_DATE (time stamp).
        posting_date TYPE p LENGTH 8 DECIMALS 0,
*       Field: RECORD.
        record TYPE rsarecord,
      END OF _ty_s_sc_1.
    TYPES:
      BEGIN OF _ty_s_tg_1,
*       InfoObject: ZPANIO (year).
        /bic/zpanio TYPE /bic/oizpanio,
      END OF _ty_s_tg_1.

*$*$ begin of global - insert your declaration only below this line  *-*
    ... "insert your code here
*$*$ end of global - insert your declaration only before this line   *-*

    METHODS compute_zpanio
      IMPORTING
        request       TYPE rsrequest
        datapackid    TYPE rsdatapid
        source_fields TYPE _ty_s_sc_1
        segid         TYPE rsbk_segid
      EXPORTING
        result        TYPE _ty_s_tg_1-/bic/zpanio
        monitor       TYPE rstr_ty_t_monitor
      RAISING
        cx_rsrout_abort
        cx_rsrout_skip_record
        cx_rsrout_skip_val
        cx_rsbk_errorcount.

    METHODS invert_zpanio
      IMPORTING
        i_th_fields_outbound         TYPE rstran_t_field_inv
        i_r_selset_outbound          TYPE REF TO cl_rsmds_set
        i_is_main_selection          TYPE rs_bool
        i_r_selset_outbound_complete TYPE REF TO cl_rsmds_set
        i_r_universe_inbound         TYPE REF TO cl_rsmds_universe
      CHANGING
        c_th_fields_inbound          TYPE rstran_t_field_inv
        c_r_selset_inbound           TYPE REF TO cl_rsmds_set
        c_exact                      TYPE rs_bool.
ENDCLASS.                    "routine DEFINITION

*$*$ begin of 2nd part global - insert your code only below this line  *
... "insert your code here
*$*$ end of 2nd part global - insert your code only before this line   *

*---------------------------------------------------------------------*
*       CLASS routine IMPLEMENTATION
*---------------------------------------------------------------------*
CLASS lcl_transform IMPLEMENTATION.

*----------------------------------------------------------------------*
*       Method compute_ZPANIO
*----------------------------------------------------------------------*
*       This subroutine allows the mapping from source to target fields
*       of a transformation rule using ABAP for application-specific
*       coding.
*----------------------------------------------------------------------*
  METHOD compute_zpanio.

    DATA:
      monitor_rec TYPE rsmonitor.

*$*$ begin of routine - insert your code only below this line        *-*
*   Fill table MONITOR with values of structure MONITOR_REC to make
*   monitor entries. Raise CX_RSROUT_ABORT to cancel the update,
*   CX_RSROUT_SKIP_RECORD to skip a record, or CX_RSROUT_SKIP_VAL to
*   clear the target fields.

    DATA l_datetime(15) TYPE c.
    DATA l_date TYPE sy-datum.
    DATA anio TYPE /bi0/oicalyear.

*   Take the first 8 characters (YYYYMMDD) of the posting-date time
*   stamp and keep the year part (YYYY)
    l_datetime = source_fields-posting_date.
    l_date     = l_datetime+0(8).
    anio       = l_date+0(4).

    result = anio.

*$*$ end of routine - insert your code only before this line         *-*
  ENDMETHOD.                    "compute_ZPANIO

*----------------------------------------------------------------------*
*       Inverse method invert_ZPANIO
*----------------------------------------------------------------------*
*       This subroutine needs to be implemented only for direct access
*       (for better performance) and for the Report/Report Interface
*       (drill through). The inverse routine should transform a
*       projection and a selection for the target into a projection and
*       a selection for the source. If the implementation remains
*       empty, all fields are filled and all values are selected.
*----------------------------------------------------------------------*
  METHOD invert_zpanio.
*$*$ begin of inverse routine - insert your code only below this line*-*
    ... "insert your code here
*$*$ end of inverse routine - insert your code only before this line *-*
  ENDMETHOD.                    "invert_ZPANIO
ENDCLASS.                    "routine IMPLEMENTATION
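The three generated routines differ only in the offset and length used to slice the converted posting date (YYYYMMDD). A minimal standalone sketch of that common pattern (variable names and the sample date are illustrative only):

```abap
* Slicing a YYYYMMDD date into year, month and day parts, as the
* three routines above do.
DATA: l_date    TYPE sy-datum VALUE '20150220',
      anio(4)   TYPE c,
      anomes(2) TYPE c,
      dia(2)    TYPE c.

anio   = l_date+0(4).   "year  -> '2015'
anomes = l_date+4(2).   "month -> '02'
dia    = l_date+6(2).   "day   -> '20'
```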
