Channel: SCN : Document List - SAP Planning and Consolidation, version for SAP NetWeaver

How to Create an .esf File and Change the Selections of a Data Manager Package


Hi,

 

This document explains how to manually create an .esf file for a Text/CSV load in order to fill the selections of a Data Manager Package.

This is helpful when you want to selectively load a large number of dimension member values while running a Data Manager Package.

 

1. Log on to the EPM Excel client. On the Data Manager tab, click Run Package.

1.jpg

2. Select the package you want to run. In our example, select the “Clear” package and click “Run”.

2.jpg

 

3. Select some members.

3.jpg

4. Click on Save

4.jpg

5. Give the .esf file a name; in our example, “Clear_Data.esf”.

5.jpg

6. Now log on to SAP NetWeaver and go to transaction UJFS.

6.jpg

7. Enter the name of your environment and click Execute.

7.jpg

8. Navigate to where your .esf file is stored (see the screenshot below); in our example, “clear_data.esf”.

Root -> webfolders -> (your environment name) -> (your model name) -> datamanager -> selectionfiles.

8.jpg

9. Right-click the file and choose Download Document.

9.jpg

10. Click No and save the file to your desktop.

10.jpg

11. Now open the “clear_data.esf” file stored on your desktop in Notepad; it will look like this:

11.jpg

12. Now add some materials and save the file.

12.jpg
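If you script this edit rather than doing it by hand in Notepad, the idea looks like the sketch below. Note that the real .esf layout is whatever the screenshot above shows and may differ; this example assumes a hypothetical plain-text format with a `MATERIAL=<members>` line, purely for illustration.

```python
# Purely illustrative: assumes a hypothetical .esf layout where the MATERIAL
# selection is a comma-separated list on a line of the form "MATERIAL=<members>".
# The actual file format is the one shown in the screenshot above.

def add_materials(esf_text, new_materials):
    """Append extra material IDs to the MATERIAL selection line."""
    out = []
    for line in esf_text.splitlines():
        if line.upper().startswith("MATERIAL="):
            existing = line.split("=", 1)[1].split(",")
            # keep existing members, append only the ones not already present
            merged = existing + [m for m in new_materials if m not in existing]
            line = "MATERIAL=" + ",".join(merged)
        out.append(line)
    return "\n".join(out)

sample = "CATEGORY=Actual\nMATERIAL=M100,M200"
print(add_materials(sample, ["M300", "M400"]))
# CATEGORY=Actual
# MATERIAL=M100,M200,M300,M400
```

The same append-without-duplicates idea applies whichever dimension line you edit.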

13. Go back to SAP NetWeaver and right-click the folder into which you want to upload (in our example, the “Clear” folder). Click Upload Document from PC, select the .esf file from your desktop, and click Open.

13.jpg

14. Go back to the Data Manager package, run the package again, then click “Load”.

14.jpg

15. Select the .esf file you uploaded via SAP NetWeaver and click Open.

15.jpg

16. You can see that the selection is filled with the members you added in Notepad.

16.jpg

17. To see the members in detail, click Add in the MATERIAL dimension; the extra members you added manually are also visible.

17.jpg

 

 

Hope this is helpful.

Thanks,

Rishi


Create an Excel based BPC Admin Consol Using VBA, ABAP and BPC10.x


Purpose:

As BPC admins, we switch a lot between the Excel EPM Add-in and the Web Client for various administration tasks. To improve productivity and save time, I developed an Excel-based BPC Admin Consol using ABAP, VBA, and BPC.

 

Features:

You get the BPC configuration and BPC tasks in the same workbook, with no need to switch to the Web Client for model details, secured dimensions, dimension structures, etc.

 

System Requirements:

BPC10.x on SAP NW

EPM Addin Version 10.0 SP15 or later

Excel 2007 or later

 

Configuration:

The workbook contains two sheets:

  • BPC_ADMIN_CONSOL - For common BPC Tasks
  • CONFIGURATION_DETAILS - For details regarding models and dimensions

There are three parts to the development.

  1. ABAP - Function Modules to extract BPC configuration
  2. VBA - Connecting and Extracting data from ABAP FMs and Formatting it
  3. Excel - For Name ranges to use in drop-downs

The BPC ADMIN CONSOL looks like the following:

Admin Consol.jpg

Let us check out the CONFIGURATION_DETAILS sheet first. This is how it looks:

Configuration_Details.jpg

As you can see, there are four FMs, one per table, each taking the Environment ID as input:

  • FM for Model List
  • FM for Package List
  • FM for Model Structure
  • FM for Dimension Structure

 

You can certainly apply your own ABAP and VBA expertise to extract more information; this is just a starting point. All these FMs will be remote-enabled function modules. I would advise creating a separate package for these ABAP developments, just to keep the work area separate.

 

Ensure that all FMs you create are Remote-Enabled as shown below:

FM Settings.jpg

 

Refer to FM Documentation.txt for the source code and import parameters along with output table structure.

 

Once these FMs are created, we will connect to them from the EPM Excel Add-in. Open a new Excel file and ensure that the following references are maintained:

VBA References.jpg

 

Some references may be unnecessary, but I added them to be on the safe side.

 

Create two sheets in the Excel file and name them "BPC ADMIN CONSOL" and "CONFIGURATION_DETAILS".

 

Press Alt+F11 to open the VBA editor. Insert a module and paste in the code from "Module Code.txt"; paste the code from "Sheet Code.txt" into the worksheet "BPC ADMIN CONSOL".

VBA Code.jpg

In your module code, replace "Your ID" with your BPC ID and "Your Password" with your BPC Password.


You have to do the following:

- Maintain system details in sheet "CONFIGURATION_DETAILS"

- Maintain BPC URL in sheet "BPC ADMIN CONSOL"

- Remove "Convert to formula" in cell B5 of the sheet "BPC ADMIN CONSOL"

- Use the cell references as shown in the screenshots so that the attached code works for you.

 

In the CONFIGURATION_DETAILS sheet, add five buttons as shown in the screenshot above and assign macros to them as follows:

Button | Macro Name
Connect to BW | LogOn
Get Model List | GetModelList
Get Package List | GetPackageList
Get Model Structure | GetModelStructure
Get Dimension Structure | GetDimensionStructure

 

Enter the Environment ID and click the "Connect to BW" button to connect to the backend BW system. Then click each of the other buttons to retrieve the various lists.

 

Once these lists are populated, we will create name ranges to be used in the BPC ADMIN CONSOL sheet for dynamic dropdowns.

 

Here are the name ranges you should be creating:

Name Ranges.jpg

Name Range ID | Formula
Model_List | =OFFSET(CONFIGURATION_DETAILS!$B$11,0,0,COUNTA(CONFIGURATION_DETAILS!$B$11:$B$20),1)
Model_Start | =CONFIGURATION_DETAILS!$E$10
NEW_PACKAGE_LIST | =CONFIGURATION_DETAILS!$F$11:$F$2000
PACKAGE_DETAILS | =OFFSET(CONFIGURATION_DETAILS!$E$11,0,0,COUNTA(CONFIGURATION_DETAILS!$E$11:$E$2000),1)
PACKAGE_GROUP_LIST | =OFFSET(CONFIGURATION_DETAILS!$F$11,0,0,COUNTA(CONFIGURATION_DETAILS!$F$11:$G$2000),1)
PACKAGE_LIST | =OFFSET(CONFIGURATION_DETAILS!$F$11,0,0,COUNTA(CONFIGURATION_DETAILS!$F$11:$F$2000),1)

 

Now we are ready for sheet "BPC ADMIN CONSOL"

 

You need to set up "Data Validation" for the following cells:

Data Validation.jpg

Choose Allow: List, and under Source enter the following for each cell:

 

Cell | Data Validation (Source)
K5 | =Model_List
F16 | =OFFSET(Model_Start,MATCH(K5,PACKAGE_DETAILS,0),1,COUNTIF(PACKAGE_DETAILS,K5),1)
F23 | Input Schedules,Reports
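The F16 source formula is the interesting one: MATCH locates the first row of the model chosen in K5 within PACKAGE_DETAILS, COUNTIF counts how many packages belong to that model, and OFFSET returns that contiguous block from the package column. The same lookup logic is sketched in Python below, with made-up model and package names; like the formula, it assumes each model's packages are listed contiguously in the details column.

```python
# Sketch of what the F16 data-validation formula computes: given the model
# chosen in K5, return the contiguous block of packages belonging to it.
# Assumes (as the OFFSET/MATCH/COUNTIF formula does) that the model column is
# sorted so each model's packages are contiguous. Names are illustrative.

def packages_for_model(package_details, package_list, model):
    start = package_details.index(model)      # like MATCH(K5, PACKAGE_DETAILS, 0)
    count = package_details.count(model)      # like COUNTIF(PACKAGE_DETAILS, K5)
    return package_list[start:start + count]  # like OFFSET(..., count, 1)

details = ["ModelA", "ModelA", "ModelB"]      # column E (model next to each package)
packages = ["Clear", "Import", "Copy"]        # column F (package names)
print(packages_for_model(details, packages, "ModelA"))  # ['Clear', 'Import']
```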

 

You can now use drop-downs of the models and packages extracted into the CONFIGURATION_DETAILS sheet; the package list filters dynamically and lets you run packages and open the reports and input schedules corresponding to a given model.

 

Remember to save your file as .xlsm.

 

This can be enhanced further depending on your ABAP and VBA expertise. It will take some work before you get it running, but I hope it proves a useful tool in your BPC projects.

 

Enjoy!

Lessons Learned: SAP ECC Systems Consolidation


Lessons Learned: SAP Systems Consolidation

 

For: SAP Application Delivery Professionals

By: Abhishek Srivastava | Deloitte | Consulting | SAP Package Technologies

 

 

Key Takeaways/Preface


The consolidation of two more-than-decade-old SAP environments is a complex undertaking that does not follow our standard methodology. There are various parameters to rethink and design that we normally do not need in a greenfield SAP implementation. In this article, we discuss lessons learned from consolidating two SAP landscapes for a C&IP industry client, where the Parts business running on SAP R/3 4.7 was migrated to the Appliances (a.k.a. Majors) business running on SAP ECC 6.0.

 

Easier design & build, but complex testing

Most system/business consolidation efforts do not call for process reengineering, so design and build go smoothly, limited to drawing the as-is process and defining the to-be for conflicting scenarios only. However, testing effort and complexity increase multifold, because we must ensure both that each migrated process runs in the new system (integration testing) and that nothing breaks in the processes already running in the target system (regression testing). A common oversight is to skip the requirements matrix on the grounds that no business process reengineering is being done, but testing coverage cannot be assured without requirement tracking.

 

Don’t underestimate Batch consolidation effort and testing

Our OTC processes are heavily dependent on a batch schedule involving 3-tier batch management applications. Unlike greenfield implementations, where we define each batch job's frequency, variants, etc. from scratch, we may be tempted to retrofit all jobs from one system to the other as-is. This is a mistake!


One of our biggest learnings was that batch job consolidation must be treated the same as a greenfield implementation for each process being migrated. You should map every migrated (Parts) process, perform a fit/gap analysis against the Majors system's batch jobs, and retrofit variants, schedules, and frequencies accordingly.


User Impact and training effort are minimal

Unlike greenfield implementations, a system consolidation changes little for end users beyond a new system, a few new transaction names, and limited process changes where the two processes conflicted. We can live with hand-outs rather than comprehensive classroom training.

 

Deployment Plan is the most interesting journey

Deployment plan complexity is multi-fold for system consolidation projects, where we must plan the system cutover and business freeze of two different OTC business processes merging into one system. This translates directly into a business freeze for a multi-billion-dollar company, so the plan had better be fool-proof and self-explanatory. Checks and balances need to be performed at all stages, with a controlled, throttled, multi-stage deployment. Each task and its duration will be questioned for its worthiness, and the most pressing question an integration lead should ask everyone, including him/herself, is which of those cutover tasks can be done without a business freeze.

 

Not all communications are the same

Communication is key to the success of any project, but system consolidation needs an additional communication layer for both impacted (Parts business) and non-impacted (Majors business) parties. Essentially, even the Majors business is impacted by the consolidation effort, but they won't know it until you approach them through every possible communication channel: walk-the-wall sessions covering new data elements, batch schedule changes, business freeze, etc.


Below, we discuss considerations across all technology areas and SDLC phases.

 

1.   System Landscape and Sync Considerations

 

This is a comparatively landscape-heavy engagement in terms of keeping two different production systems in sync with the project system landscape. We need to keep the production break-fix path intact while also maintaining separate development and QA landscapes for the duration of the consolidation project.


The project system landscape should be updated regularly with production break-fix changes from both the Parts and Majors production systems. Do not underestimate the manual effort of keeping the project landscape in sync with the old Parts production system: every change moving to the retiring Parts production system must be manually retrofitted to the project landscape, as changes cannot be transferred via a transport path due to the mismatch in SAP production version and level.

 

We mitigated this risk by introducing a change governance process: whoever adds a change to the production landscape is responsible for retrofitting the same change in the project landscape. We also introduced a hard freeze after testing completion, and only must-have changes/defects were addressed during the remainder of the project.

 

Below is an example of the bare-minimum system landscape and sync setup needed for a consolidation project.

Landscape.png

 

2. Design Considerations

 

Unlike a greenfield implementation, where you study the as-is and define the to-be for the entire business process framework, an ERP system consolidation project limits its scope to documenting the as-is and defining the to-be for conflicting scenarios only. This is not a process redesign; the focus is on how two different sets of processes, code, configuration, and data can co-exist in one system and blend seamlessly from both process and system parameter perspectives. For example: how can we have the delivery due list (DDL) running for both Parts and Majors? Do we keep the DDL job variant and schedule entirely separate, or should the Majors DDL variant and schedule be widened to cater to both processes?


Design should emphasize defining each of these:

  1. As-is process and requirements matrix for the Parts business process
  2. As-is process and requirements matrix for the Majors business process
  3. To-be definitions for conflicting processes (one process adopts the other, or an entirely new process for both)

 

You may wonder why so much effort should go into the requirements matrix when the processes are barely changing as part of the consolidation. The answer lies in the testing phase of the SDLC: unless you have the requirements matrices for both Parts and Majors, you cannot ensure the coverage, depth, and breadth of your test scenarios. The effort pays off when you can map your entire (or at least critical) process paths to test scripts and verify they are tested for all relevant permutations and combinations.


In system consolidation projects, the business is mostly interested in what is changing for them, so walk-the-wall sessions help: we can demonstrate the changes by region and process area. For example, we can segregate each area into the following buckets, which give a high-level idea of what could impact each business responsibility area and how to engage further.

 

Design.png

 

3.   Build Considerations

 

3.1 Scope Baseline

One of the biggest challenges we face is locking down the custom code migration scope, from both a Statement of Work (SoW) and a functionality perspective. Note that here we are retiring a 15-year-old ERP Parts system and migrating every usable (not just active) piece of code to another more-than-decade-old (but latest version and support pack) ERP system. Our findings revealed that at least 40% of the code and configuration was obsolete and no longer used in the retiring application. We used two different tools for the scope baseline:

  1. A well-renowned tool by 3rd party
  2. Another in-house Upgrade tool

 

Both tools produced almost the same results, which gave us a baseline for scoping, effort, and timeline. However, neither tells you which custom code repository objects are still active but no longer used from a process standpoint. This means we need another layer of scope filtering: tag each custom code object (user exits, reports, programs, workflows, etc.) to a current process and rule out the ones no longer tied to any process. This is troublesome but required for any system consolidation engagement.

 

3.2 Code Retrofit

Unlike a greenfield implementation, we categorize objects into three broad categories for code consolidation of ERP systems:

 


Category | Definition | Effort
Port Objects | No conflict between the processes running in the 2 systems; purely lift & shift | ~30% of standard effort, including documentation and unit testing
Leverage Objects | Conflict between the processes running in the 2 systems, as the objects reside in both systems with the same names | ~50% of standard effort, including documentation and unit testing
Test Relevant Objects | No build action required; already available in the target system | ~10% of standard effort, as only testing effort is required

Here, leverage object retrofit is the key to success: these objects are the most prone to errors and issues, and they impact both sides of the processes if not retrofitted carefully. They need to co-exist with the same names and address both the Parts and Majors processes as-is. Specifically for leverage objects, we must have a unique global filter to segregate code execution (such as sales org, sales area, document types, user role, etc.) so that the right code elements execute for the required process chains.

 

Be careful when categorizing leverage objects: most can very well be changed to port objects if migrated under different names. This may have a user training impact, but it is still a recommended approach, because each leverage object introduces risk to overall business operations and their number should be minimized.


3.3 Configuration Retrofit

There will always be a laundry list of items not considered in the initial effort estimate. We found that several IMG configuration nodes/values changed as part of the consolidation due to conflicts with the target system. This means the entire custom code repository must be scanned for hardcoded values or TVARVC entries, which must be replaced with the new values. Some IMG configuration values likely to change as part of a system consolidation are:

 


  • Document Text IDs
  • Pricing Procedure
  • Line Item Category
  • Line Item Cat Group
  • Document Types
  • Pricing tables
  • Pricing conditions
  • Custom table names
  • Delivery block
  • Billing block
  • Plant IDs
  • Order Reason
  • Channels
  • Status Codes
  • etc.

 

The last piece of the puzzle is table index retrofit. Over time, the support team may have created hundreds of DB indexes, but not all of them may still be used, or they may already be covered partially or fully by indexes available in the target system. A comprehensive analysis is required before moving DB indexes; otherwise it may adversely affect DB size and performance.
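One common check behind such an analysis can be sketched as follows, under a simplifying assumption: an index whose column list is a leading prefix of another index on the same table is usually redundant. A real assessment should also weigh DB usage statistics; the index and table names below are invented for illustration.

```python
# Minimal sketch of a prefix-redundancy check for DB indexes before migration.
# An index is flagged when another index on the same table starts with the
# same column list (two identical indexes flag each other).

def redundant_indexes(indexes):
    """indexes: {name: (table, [columns, ...])} -> names of redundant indexes."""
    redundant = set()
    for a, (tab_a, cols_a) in indexes.items():
        for b, (tab_b, cols_b) in indexes.items():
            # a is redundant if b is on the same table and begins with a's columns
            if a != b and tab_a == tab_b and cols_b[:len(cols_a)] == cols_a:
                redundant.add(a)
    return redundant

idx = {
    "Z1": ("VBAK", ["VKORG"]),
    "Z2": ("VBAK", ["VKORG", "AUART"]),  # covers Z1 as a prefix
    "Z3": ("VBAP", ["MATNR"]),
}
print(sorted(redundant_indexes(idx)))  # ['Z1']
```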

 

 

4. Data Load Considerations

 

We are dealing with different kinds of business data here: master data; supporting data objects such as sales areas, inventory, material substitutions, purchase info records, routes, and inclusions/exclusions; open transaction records; and historical records. Their migration needs to happen in multiple stages.

 

Let's categorize the data objects by phase and sync method. I will limit the discussion to how we mitigated risk by deploying them in multiple stages; you can adapt a different strategy based on the complexity of your consolidation engagement.

 


Phase | Data Object | Sync Frequency | Sync Schedule | Sync Method(s)
1 | Sales Area and Master Data | Recurring till Go-Live | At end of UAT | ALE
2 | Inventory, Material Subs, PIRs, Incl/Excl., etc. | Recurring till Go-Live | At end of UAT | LSMW, Custom Conversion
3 | Open Transaction Data – Sales Order | Initial, once | Business Go-Live | LSMW, BAPI, Custom Conversion, 3rd party tool
4 | Historical Records | Initial, once | Business Go-Live | BW, RFC
 

 

 

4.1     Data Reconciliation

The key to successful data transfer lies in data load dress rehearsals and the ability to reconcile between source and destination. Reconciliation can be multi-fold: at the header summary level as well as at line-item level. For example, open sales order migration reconciliation needs to be done at the header summary level (net value of all migrated orders, total number of line items, total units across line items), then drilled down to line-item level, where we must compare key attributes such as item status, profit segment, shipping point, and blocks between the source and target systems.

 

We must develop a reconciliation procedure for each data object, and a specially skilled task force is required for inventory and open sales order migrations. Inventory reconciliation in particular needs to be done at both quantity and accounting levels. It is more challenging if materials are shared between the two systems (such as accessories), where we have to wait for the Go-Live freeze window to sync the inventory of such materials. For the rest, the reconciliation mechanism can be as simple as extracts from both systems followed by Excel VLOOKUPs, MS Access database queries, custom programs, etc. You must account for utility development effort for data reconciliation and staff an adequate number of resources on the data steward team.
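As a rough illustration of the header-summary reconciliation described above, the sketch below compares net value, line-item count, and total units between source and target extracts. The record layout and field names are invented for the example, not actual SAP extract columns.

```python
# Hedged sketch of header-summary reconciliation between source and target
# order extracts: aggregate each side, then report any mismatched measure.
# The dict layout ("net_value", "items", "qty") is illustrative only.

def summarize(orders):
    return {
        "net_value": round(sum(o["net_value"] for o in orders), 2),
        "line_items": sum(len(o["items"]) for o in orders),
        "total_units": sum(i["qty"] for o in orders for i in o["items"]),
    }

def reconcile(source, target):
    src, tgt = summarize(source), summarize(target)
    # return only the measures that disagree, as (source, target) pairs
    return {k: (src[k], tgt[k]) for k in src if src[k] != tgt[k]}

source = [{"net_value": 100.0, "items": [{"qty": 5}, {"qty": 3}]}]
target = [{"net_value": 100.0, "items": [{"qty": 5}]}]
print(reconcile(source, target))  # {'line_items': (2, 1), 'total_units': (8, 5)}
```

An empty result means the two sides agree at the summary level; mismatches point at where the line-item drill-down should start.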

 

 

We adopted the following phases for defining the data reconciliation mechanism and sign-offs for each data structure.

Data.png

 

 

4.2     Historical Data Management

We must ask for the business reasons behind historical transaction data availability in the target system. Most of the time, it is limited to return/claim cross-referencing, display of transactions on a need basis, or legal obligation. In our project, we had use cases for the latter two, and we provided a separate link on the consumer/distributor websites to display historical transaction data from the retired system rather than bringing it into the target system. This is above and beyond the BW historical data already available to end users.

 

 

 

5.  Batch Setup Considerations

 

Batch is one of the most critical pieces of an ERP consolidation project. Batch setup involves the retrofit and reconciliation of multiple elements:

 

1. Batch Scope Identification: There could be many ways to identify the batch migration scope, but we followed what was already running in the legacy SAP 4.7 Parts system. We pulled all jobs that ran in the last 90 days and removed the duplicates; that gave us the scope baseline. Those jobs then need to be aligned to business processes, as you never know what jobs are running to no purpose in a more-than-decade-old system. We ruled out around 15% of the jobs by mapping each job to a business process, and removed duplicates of housekeeping jobs already running in the target system.
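The scope-baseline step above (pull 90 days of runs, dedupe, drop jobs already running in the target system) can be sketched as follows; the job records, names, and programs are simplified and invented for illustration.

```python
# Sketch of the batch scope baseline: keep jobs that ran within the window,
# dedupe repeated runs, and exclude jobs already present in the target system.

from datetime import date, timedelta

def batch_scope(job_runs, target_jobs, today, days=90):
    cutoff = today - timedelta(days=days)
    # set comprehension dedupes repeated runs of the same (name, program)
    recent = {(j["name"], j["program"])
              for j in job_runs if j["run_date"] >= cutoff}
    return sorted(recent - target_jobs)

runs = [
    {"name": "Z_DDL", "program": "RVV50R10C", "run_date": date(2015, 3, 1)},
    {"name": "Z_DDL", "program": "RVV50R10C", "run_date": date(2015, 3, 2)},  # duplicate run
    {"name": "Z_OLD", "program": "ZOLD", "run_date": date(2014, 1, 1)},       # outside 90 days
    {"name": "HOUSEKEEP", "program": "RSBTCDEL", "run_date": date(2015, 3, 1)},
]
already_in_target = {("HOUSEKEEP", "RSBTCDEL")}
print(batch_scope(runs, already_in_target, today=date(2015, 3, 10)))
# [('Z_DDL', 'RVV50R10C')]
```

The remaining list is only a baseline; each surviving job still has to be mapped to a business process by hand.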

 

2. SAP Variant Retrofit: We needed to be really creative here. Over time, the client may have accumulated thousands of batch variants belonging to multiple programs, so it is not easy to redefine each one. We developed a custom utility to pull all variants of all in-scope batch job programs from the Parts system to the Majors system as-is, then aligned them with the Majors variants with due diligence by process area. For example, DDL jobs can very well be merged into one if their variants are mapped correctly. We faced several challenges with variant migration and learned the hard way that the utility must consider the following scenarios:

 

2.1 Dynamic variants (date/time/user): dynamic parameters cannot be transferred due to the mismatch in SAP versions. You will have to identify and retrofit them manually.

2.2 File path changes: any file-based interface needs to refer to the directory of the target system.

2.3 Conflicting variants: add a 2-character suffix to a conflicting variant if its name is 12 characters or shorter; longer names must be retrofitted manually because of the 14-character variant name length limit.

2.4 Multi-tab variants: selection screen filters spanning multiple tabs have to be retrofitted manually (example: the delivery due list job program).
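The suffix rule in 2.3 is easy to mechanize. This is a sketch of that rule only (the variant names and the `_M` suffix are illustrative), not the actual utility we built:

```python
# Sketch of the conflicting-variant rule (2.3): a variant whose name collides
# in the target system gets a 2-character suffix when the name is short enough
# to stay within the 14-character limit; longer names are flagged for manual
# retrofit. The "_M" suffix is an invented example.

def resolve_variant(name, target_names, suffix="_M", max_len=14):
    if name not in target_names:
        return name, "ported as-is"
    if len(name) <= max_len - len(suffix):  # i.e. <= 12 for a 2-char suffix
        return name + suffix, "renamed with suffix"
    return name, "manual retrofit required"

existing = {"DDL_DAILY", "BILLING_RUN_XX"}
print(resolve_variant("DDL_DAILY", existing))       # ('DDL_DAILY_M', 'renamed with suffix')
print(resolve_variant("BILLING_RUN_XX", existing))  # ('BILLING_RUN_XX', 'manual retrofit required')
print(resolve_variant("NEW_VARIANT", existing))     # ('NEW_VARIANT', 'ported as-is')
```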

 

  

3. SAP Job Definitions: With programs and variants migrated, the next step is to define the jobs in SAP with the right dependencies and step lists. We developed a custom utility to pull all job definitions for all in-scope batch jobs from the Parts system to the Majors system as-is, then aligned them with the Majors job definitions with due diligence by process area.

 

4. Job Scheduling: We used a well-renowned 3rd party tool because of our heterogeneous but interdependent system landscape. We had a requirement to trigger a non-SAP system job once an SAP job finishes, hence the external tool; if you don't have such complexity, you can define the schedule and frequency within the SAP system itself. Here is a snapshot of how we aligned the Parts and Majors critical path for batch.

 

Batch.png

 

 

6.  Testing Considerations

 

This is the most important phase of a consolidation project. As mentioned earlier, the testing scope is doubled: we must test not only the processes we are migrating from one system to the other, but also, with equal focus, that we don't break anything in the processes already running in the target system. Let's discuss some of the lessons we learned during the testing stage.

 

6.1   Integration & Regression Testing

As in a greenfield project, we must have a requirements matrix to document the test scenarios and variations. A common oversight is to skip the requirements matrix on the grounds that no business process reengineering is being done, but testing coverage cannot be assured without requirement tracking.

 

Further, testing critical path lies with:

 

  1. Leverage-categorized code objects (see Build Considerations, section 3.2) must be translated into business processes, followed by test scripts that cover both sides of the process scenarios. Test scenarios for each leverage object should include positive and negative test cases for both the Parts and Majors businesses. A positive case for one side may or may not act as a negative case for the other; it depends entirely on how you migrated and blended the two processes in the same system.
  2. All conflicting scenarios that were redesigned to cater to both sides of the business must be tested with all variations of master data. Essentially, this is the change we are bringing to the client's business, and we must ensure the changes are appropriately addressed, accepted, and signed off by both the Parts and Majors sides.
  3. All interface connections with non-SAP systems, external systems, websites, etc. must be tested for all scenarios, as most of these connections may require redefining extractors, filters, URLs, new network ports, etc. You will never be sure of what changes are required (and on which side of the interface connection) unless you test all variations. For example, CRM middleware adapter objects must be adjusted so they pull/push data as before, rather than opening wider channels and impacting call center operations.

 

6.2   Batch Testing

This is another critical piece of the testing puzzle. When we migrate thousands of variants, batch definitions, and schedules from one system to another live system, we introduce a great deal of uncertainty about how they will blend and work together. Depending on the complexity of your business, you may have from a few hundred to several thousand batch jobs running across the source and target systems. You may not be able to test every variation of every job, but you must focus on the critical OTC process runs that are heavily dependent on batch jobs.

 

Batch testing can very well share the performance testing landscape: you create a true production-like environment for performance testing by running the consolidated, cleansed batch and letting orders flow through the ship-bill-accounting cycle. It is worth mentioning that a non-critical job followed by a critical one also becomes critical for your testing. For example, the credit hold removal job is essential for the critical delivery due list runs. Here is a snapshot of how we tracked the batch schedule baseline and the testing of the critical path.

 

BatchTest.png

 

 

7. Deployment Plan Considerations

 

As noted in the key takeaways, deployment plan complexity is multi-fold for system consolidation projects: we must plan the system cutover and business freeze of two different OTC business processes merging into one system, and every task and its duration will be questioned for its worthiness, above all whether it can be done without a business freeze. Let's discuss the main components of the deployment plan:

 

7.1   Cutover Plan – System and Business Cutover

The primary goal of any cutover plan is to make the system business-ready with minimal impact on business operations. We recommend a staggered 5-stage deployment specifically for consolidation projects. This helps minimize business impact, mitigate and stagger risk, and sharpen focus on key areas, especially when dealing with two different ends of the business.

 

Cutover.png

 

Stage 1 – Cutover Preparation: In this stage we perform activities that do not impact the business, i.e. mostly out-of-system activities such as master data cleansing, cutover logistics, socialization, and finalization of transports and their sequence. This stage runs on a standard 8-hour calendar.

 

Stage 2 – Technical Go-Live: We migrate all code, configuration, indexes, and manual configuration in this stage. It should occur at least a couple of weeks before Business Go-Live, giving us enough time for the next stage of data load and batch setup. This stage also lets us deal upfront with Majors business issues that may have slipped through regression testing, so the system stabilizes before the Parts business goes live in the target system. This stage runs on a 24x7 calendar and requires a Majors business freeze and a system backup as a rollback point.

 

Stage 3 – Parts Data Load/Batch Setup: This stage runs in parallel with the Majors stabilization phase and takes comparatively the longest time. You must find the right slots for data loads, as you will be loading Parts data into an already-live Majors business system, usually outside online business hours and on weekends. You can load everything except open sales orders and inventory, as these two data objects keep changing every minute in the source system. This stage runs on an available-slot-based calendar.

 

Stage 4 – Business Go-Live: This is the final stage of deployment, when we freeze both the Parts and Majors ends of the business to establish a clear rollback point in case of catastrophe. We focus on clearing the warehouse pipeline and closing the finance books of the retiring source system, followed by the open data object loads. We could go a lot deeper on this stage, but this is the moment when we flip the switch and retire the Parts ERP system. All data queues up at the PI, EDI, and other middleware ends during this stage, and we must be careful before opening the floodgates, introducing a controlled and throttled data flow.

 

Stage 5 – Stabilization: This one is self-explanatory.

 

The key to any cutover plan's success lies in dress rehearsals. The more you rehearse, the more ready you are. We held multiple dress rehearsals of each activity (including smoke testing) planned for Go-Live stages 2 and 4.

 

8.  Communication Plan Considerations

 

As mentioned earlier, not all communications are the same. We have to adapt methods and procedures to the situation and audience. For example, a kick-off must be planned with the wider client team, all 3rd parties, and the project team before starting any stage such as a cutover dress rehearsal, Go-Live cutover, or testing. Similarly, a corporate intranet post is appropriate when communicating with the entire organization.

 

We adopted the following communication methods. They are not specific to system consolidation projects and can be adapted to any project based on its nature and audience.

 

Comm.png

 

 

9. Governance

 

Governance is key to the successful delivery of an ERP consolidation project. It matters even more in this kind of engagement because you don't want to deal with large, unaccounted-for changes while shifting a business's core to an entirely new system. I will split governance into the following two categories:

 

 

9.1   Data Governance

Data governance is absolutely required, and it must be driven by project phase and change severity levels. For example, we had to stop accepting international orders a couple of weeks before Go-Live because of pick/pack/ship lead times. Similarly, the master data freeze should start from the end of UAT, as you will be migrating master data to the target system by that stage.

 

 

9.2   Technical changes Governance

As mentioned in section 1.0, all technical code or configuration changes need to be manually retrofitted from the retiring Parts production landscape to the project landscape. We must draw a line on break-fixes/changes moving to production from the start of the project test phase, as every new break-fix retrofit will warrant repeating the testing of all scenarios impacted by the change. The exponential effort around testing and manual retrofit, and the associated risks, may not be worth it unless the change addresses a must-have severity 1 issue. We introduced a toll-gate for every change moving to the retiring production system, ensured only severity 1 changes were permitted, and had the person responsible explain each change to the project and testing teams so the relevant scenarios could be tested/re-tested in the consolidated environment.

 

 

 

10.  Hyper-care / Post Go-Live Considerations

 

Hyper-care, as the name states, means utmost care: we need an extra pair of eyes and monitoring across all possible system and business dimensions. Business issues will be addressed through the stabilization channels anyway, so I would like to focus on the topics that don't seem important until they start creating problems.

 

10.1   Operational Metrics

One of the questions we get at organization CXO level is how we ensure that the migration did not impact business throughput. The only way to address this concern is to start tracing the operational metrics before and after business Go-Live and publish the average order, delivery and billing net values against a 4-week window. Some of the operational metrics that we gathered and reported on:

 

  1. How many orders were created every day, and through which intake channels
  2. How many orders were converted to delivery and PGI'd
  3. How many units were shipped
  4. How many deliveries were converted to billing
  5. etc.
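The metrics above can be computed from a simple daily extract of order records. The sketch below is an illustrative assumption, not the project's actual reporting tool: the record fields (channel, delivery flag, units, billing flag) are hypothetical.

```python
from collections import Counter

# Hypothetical daily order records:
# (order_id, intake_channel, converted_to_delivery, shipped_units, billed)
orders = [
    ("O1", "EDI",   True,  10, True),
    ("O2", "Web",   True,   4, False),
    ("O3", "EDI",   False,  0, False),
    ("O4", "Phone", True,   7, True),
]

# 1. Orders created, broken down by intake channel
intake_by_channel = Counter(ch for _, ch, *_ in orders)

# 2. Orders converted to delivery (PGI'd)
delivered = sum(1 for o in orders if o[2])

# 3. Units shipped
units_shipped = sum(o[3] for o in orders)

# 4. Deliveries converted to billing
billed = sum(1 for o in orders if o[4])

print(intake_by_channel, delivered, units_shipped, billed)
```

Running the same computation on extracts taken before and after Go-Live gives the before/after averages mentioned above.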

op.png

 

 

10.2  System Monitoring

System monitoring becomes essential, as most of the time we react to issues rather than taking a proactive monitoring and correction approach. Various parameters can be tracked at regular intervals, such as CPU utilization, memory consumption, work process availability, and central instance health. We tracked the following every 2 hours:
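A two-hourly check like this can be reduced to a simple threshold comparison. The parameter names and limits below are illustrative assumptions, not SAP-recommended values:

```python
# Hypothetical health thresholds for a two-hourly snapshot check.
THRESHOLDS = {"cpu_pct": 85, "memory_pct": 90, "free_work_processes": 2}

def health_alerts(snapshot):
    """Return the list of parameters breaching their thresholds."""
    alerts = []
    if snapshot["cpu_pct"] > THRESHOLDS["cpu_pct"]:
        alerts.append("cpu_pct")
    if snapshot["memory_pct"] > THRESHOLDS["memory_pct"]:
        alerts.append("memory_pct")
    if snapshot["free_work_processes"] < THRESHOLDS["free_work_processes"]:
        alerts.append("free_work_processes")
    return alerts

snapshot = {"cpu_pct": 92, "memory_pct": 71, "free_work_processes": 1}
print(health_alerts(snapshot))
```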

 

sys.png

 

Note: These practices worked well for my project and may not hold true in your case. Please use your own discretion while applying any of these lessons learned to your project.

Top 10 viewed SAP KB/NOTE for September 2014


Purpose

The purpose of this document is to provide a list of the top ten most viewed SAP KBs/Notes for SAP Planning and Consolidation, version for SAP NetWeaver, in the month of September 2014.

 

Overview

Below are the top ten most viewed SAP KB/NOTE for SAP Planning and Consolidation version for SAP NetWeaver for the month of September 2014.

 

KB/Note Number  KB/Note Title
2053377  No active links in BPC 10.1 web client after upgrading SAP G
2010964  BPC parameters list and how they impact system behavior- BPC
1966663  BPC 10.1 Frequently Asked Questions
1673094  Error in EPM add-in when running reports: UJO_READ_EXCEPTIO
1817817  How to configure RFCs used by BPC NW
1665544  UJBR Error occured while generating BUI persistence data - B
1667091  BPC 10 Transport error: Errors occurred during post-handlin
1722516  BPC 10 NW The resource for alias is missing
1708660  BPC10 Transport error UJT_TLOGO_AFTER_IMPORT
1744310  Get Error "500 - Server error" when login to BPC 10 Web Admi

 

Please note, in order to view the contents of the SAP KBAs, you will need to be logged into Service Marketplace.

 

SAP EPM Design Council


Be a part of SAP EPM's Future.

 

We aspire to share features, capture feedback, and co-innovate with customers, consistently communicating that feedback to the SAP development team throughout the development process, from concept to go-live and through continuous improvement.

 

 

Request to Join EPM DC - Americas

Request to Join EPM DC - EMEA

Request to Join EPM DC - APJ

Request to Join EPM DC - Greater China

Be sure to include:

FULL NAME, COMPANY, INDUSTRY, ROLE, EMAIL, PHONE,

and any planning solution implemented or plan to implement.

 

 

What Happens in EPM DC and Why Should I Join?

Understanding a user’s needs is crucial for delivering successful and relevant SAP products. Customer feedback is key and most valuable when development can still be influenced. One way to engage with customers and gain this feedback is via Design Councils. All parties benefit from being involved: The customers get an exclusive insight into upcoming products and features while being an active part of the development process. SAP learns about the needs and requirements of their customers and receives feedback on products before they are broadly launched. The current solutions under focus for EPM DC are SAP Cloud for Planning and SAP BPC 10.1 NW.

 

 

Member Benefits

 

whistle.png

relationships.png

roadmap.png

 

Coaching

 

An assigned coach from our EPM Product Management (PM) team will help you every step of the way


Build Relationships

 

Build a close relationship with the EPM Product Management and Development Team


Influence Roadmaps

 

The chance to influence the EPM Roadmap


strategy.png

 

network.png

calendar.png

Strategize

 

Align with SAP's intended strategy, from the get-go

 

Network

 

Meet and learn from participants and SAP employees

Plan Ahead

 

Learn about future events, special invites, and further details

 

Who are we looking for?

 

Customers who are involved with Departmental Planning or Enterprise Planning and Analysis such as:

1) Design & support of solution architecture and strategy

2) Forecast, planning, and analysis - Finance, Human Capital Management, Sales, or other operational areas

 

 

Prerequisites:

  • Customer has documented their top business use cases
  • SAP Cloud for Planning:
    • Customer is open to use SAP HANA Cloud, and has planning needs.
    • Customer who has implemented or would like to implement BPC planning and would like to extend departmental/LOB planning on public cloud.
  • BPC 10.1 NW:
    • Customer who has implemented or would like to implement BPC planning and would like to have integrated planning  either on-premise or with HANA Enterprise Cloud.

 

Commitments:

  • A signed legal feedback agreement is in place
  • Provide key business use cases
  • Attend a 1 day onsite workshop (attendees must cover their own travel costs) and a quarterly 1-hour remote webinar
  • Provide active feedback

Request to Join EPM DC - Americas

Request to Join EPM DC - EMEA

Request to Join EPM DC - APJ

Request to Join EPM DC - Greater China

Be sure to include:

FULL NAME, COMPANY, INDUSTRY, ROLE, EMAIL, PHONE,

and any planning solution implemented or plan to implement.

Top 10 viewed SAP KBAs for December 2014


Purpose

The purpose of this document is to provide a list of the top ten most viewed SAP Knowledge Base Articles for SAP Planning and Consolidation version for SAP NetWeaver in the month of December 2014.

 

Overview

Below are the top 10 most viewed SAP KBAs for SAP Planning and Consolidation version for SAP NetWeaver for the month of December 2014.

 

Note Number  Note Title
2107965  Issues in EPM Add-In after installing Microsoft updates
2081345  BPCNW - Troubleshooting Tips for Transport Issues
2010964  BPC parameters list and how they impact system behavior- BPC
2053377  No active links in BPC 10.1 web client after upgrading SAP G
1817817  How to configure RFCs used by BPC NW
1665544  UJBR Error occured while generating BUI persistence data - B
1966663  BPC 10.1 Frequently Asked Questions
1667091  BPC 10 Transport error: Errors occurred during post-handlin
1708660  BPC10 Transport error UJT_TLOGO_AFTER_IMPORT
1673094  Error in EPM add-in when running reports: UJO_READ_EXCEPTIO

 

Please note, in order to view the contents of the SAP KBAs, you will need to be logged into Service Marketplace.

 

Related Content

Top 10 viewed  SAP KBAs for November 2014

Top 10 viewed  SAP KBAs for October 2014

Top 10 viewed  SAP KBAs for September 2014

Planning and Consolidations for NW: Lifecycle Management


Life Cycle Management

What it is,
Who can do it,
Where it can be done,
and, When?

 

What is Life Cycle Management?

Life Cycle Management is the process of managing the BPC environment to evolve with the business needs for consolidations and planning.  Over time, changes will be required to allow BPC to continue to meet the needs of the business.  However, there are rules to be followed regarding who can make changes, what types of changes they can make, and how they go about making the changes.

 

This document should help the business developers to understand clearly what changes they can make directly in Production versus what needs to start in the Development instance and be promoted through to Production.

 

Who can do what, and where?

SAP BPC is meant to be a business-driven solution to enable agility in the planning and financial close processes.  Consequently, the life cycle management is intended to follow a methodology that supports this.


The tables below define whether actions can be performed in Production by the Business or by IT.  In short, if the change requires a change to the structure of a model or underlying tables in BW, then these will be transported from DEV to PROD following the transport landscape defined by IT.  If the changes are purely changes to data (master data or transaction data), then these can be executed in Production directly, by the appropriate business users.

 


Changes to be Made Directly in PROD (after DEV)

(Performed by the Business Developers, according to Security Profiles)



Environment Objects


  • Dimension Master Data
    • Add members to the dimension
    • Change descriptions or any other property values for existing members
  • Member Formulas

Model Objects


  • Context Defaults (web)
  • Report Files
  • Input Templates
  • Transaction Data
  • Book Templates
  • BPF Templates
  • Data Manager Package Link
  • Journal Template
  • Web Documents
  • Audit Configuration
  • Data Manager Conversion File
  • Data Manager Data File
  • Data Manager Transformation File
  • Drill Through
  • Report Templates
  • Work Status Updates

 

NOTE:  Individual user security changes can be made by a combination of the SAP Security team and the SAP BPC Admin team, at any time, directly in Production.

 

 

 


Changes to be Transported from DEV to PROD

(Performed by the SAP BPC Admin Team and the Basis Team)

Environment Objects


  • Business Rule
  • Configuration
  • Dimension
    • Structure changes (i.e. the structure section of the dimension in BPC admin)
    • Adding a new hierarchy
  • Model
    • Any changes to a model (i.e. any configuration changes to the model in BPC admin)
  • Script Logic
  • Task Profile
  • Data Access Profile
  • Team

Model Objects


  • Business Rule
  • Configuration
  • Control
  • Data Manager Package
  • Data Manager Package Group
  • Script Logic
  • Work Status Settings / Configuration

BW, Basis, and Other Considerations


  • All back-end development needs to follow the transport landscape, with the exception of the following objects, which may be updated directly in BPC Production:
    • Process Chains – create, modify and schedule
    • Info Packages – create, modify and schedule
    • DTPs (Data Transfer Process) – create, modify and schedule
    • APDs (Analysis Process Design) – create, modify and schedule
  • Some types of back-end development include:
    • OSS Notes
    • Process Chains
    • ABAP Programs
    • BADI’s

Items for Deletion (rarely ever used or necessary)


  • Models
  • Dimensions
  • Data Manager Packages
  • Data Manager Package Groups
  • Data Manager Package Links

 

Changes to be transported from Development to Production can be developed and/or tested by either business resources or IT resources.  It is important to note, however, that these changes must be made in Development (and currently tested in Development, in lieu of a QA / Test instance) prior to being transported to Production.

 

NOTE:  ABAP changes would be managed as they traditionally are in BW, including version comparisons between DEV and PROD

 

Timing of Changes

 

BW-Layer Transports
Transports that do not impact the business user community can be executed at any point in time, as the Basis team deems appropriate.  Currently, this takes place on Thursday afternoons, after approval in the Change CAB.

 

BPC-Layer Transports
Transports that modify BPC content will require that the target environment (formerly referred to as Appset, prior to SAP BPC 10.0) be taken offline, prior to the deployment of the transport.  Once the transport has been moved successfully, the target environment needs to be put back online.

 

Because of this impact to the business user community, these types of transports are currently deployed, starting at 3a Central, by our offshore Basis resources, typically on Friday, after approval in Thursday’s Change CAB.

 

Other Factors to Consider
Currently, the only other jobs scheduled in the SAP BPC solution that impact the ability to move transports are the Light Optimization routines that run each day, starting at 1a Central.  They generally conclude by 3a Central.

 

Also, other maintenance activities, such as the archival of audit data or routine scheduled backups of the Production instance (via the UJBR transaction code in BW, specifically to backup or restore BPC content) may alter the window of availability.

 

Any specific instructions required to move a transport to Production will be explicitly documented in the JSOX document required by Change Management.

 

Emergency changes will be executed as needed, according to the nature of the emergency.

How to Transport a New Environment (Appset) that Does Not Exist in the Target Instance


Collecting the Transport (Part 1 - all except measures)

 

Follow these steps to create the transport, which is the first part of migrating the new environment to the target instance.

 

Please, follow the steps below:

 

1. Open SAP GUI, and connect to the source instance.

 

2. Go to RSA1

 

3. Click on Transport Connection

 

4. Select More Types, then select Environment, and then double-click Select Objects:

1.png

 

5. A pop-up window appears to choose the appropriate environment.  Select the appropriate environment, and then click Transfer Selections:
2.png

NOTE:  This may take several minutes, depending on the size of the content in the environment.

 

 

6. Once this step has completed, you should see something that looks like:

3.png

Notice the name of the environment under collected objects and the grayed out checkbox under the Transport (2nd) column.

 

7. Right-click on the grayed out checkbox, and then select Do Not Transport Any Below, as shown below:

4.png

 

8. Then, Select the Grouping button, and select Only Necessary Objects, as shown below:

5.png

NOTE:  You will see some activity very briefly, once you select this.

 

9. Make sure the Collection Mode is set to Collect Automatically:

6.png


10. Then, click on the “truck” icon to Transport Objects, and follow the remaining prompts to name and ultimately release the tasks and transport.

 
 

Collecting the Transport (Part 2 - measures)
 
 

The standard measures do not always get collected, automatically, so this step ensures the measures are included in the transport.

 

Please, follow the steps below:

 

1. Open SAP GUI, and connect to the source instance.

 

2. Go to SE16

 

3. Enter UJA_FORMULA_APP in the Table Name field, as shown below:

7.png

Press Enter.

 

 

 

4. Enter the Appset ID and then click on the Execute button:

8.png

 

5. You can either right-click one of the check boxes and click on Select All, if appropriate (as shown below), or you can individually select each measure that should be transported:

9.png

 

 

 

6. Click on the Table Entry menu option at the top of the window, and then select Transport Entries:

10.png

 

7. Add the entries to the existing transport created in the previous section, and follow the prompts, accordingly.

 

 

 

Releasing the Transport

 

The two main things to keep in mind, regarding releasing the transport are:

 

1. Make sure the tasks in the transport request have objects associated with them.  If there are no objects associated with a particular task, then you will need to re-collect the objects.

 

2. Make sure the environment (or appset) you are collecting from is currently set offline in the BPC Web Admin Client before releasing the transport.

 

 

 

SAP Basis Team to Move the Transport

 

When the SAP Basis team moves the transport, they need to use a profile that has SAP_ALL.

 

http://scn.sap.com/thread/3267007

 

 

 

Update Security in the BPC Web Admin Client

 

Once the transport has been successfully moved, make sure either that the BPC team has been assigned to the Admin team in the newly transported environment, via the BPC Web Admin Client, or that the BPC team has SAP_ALL access to do this for themselves. Once the Admin assignment is complete, SAP_ALL should no longer be required.

 

If the above step is not completed, then it is possible to see the new environment in SAP GUI but not see the environment in the BPC Web Admin Client.

 

Now that the master data is in the environment, it is possible to make sure the users are pulled into the environment in BPC and assigned to the appropriate security profiles, via the BPC Web Admin Client.

 

 

 

 

Restore Remaining Data

 

 

Once you are able to confirm that the environment exists in the BPC Web Admin Client, then it is time to restore the remaining data (transaction data).

 

Please, perform the following steps:

 

1. Open SAP GUI, and connect to the target instance.

 

2. Go to UJBR.  The following screen appears:

11.png


a. Select Execute Restore.

 

b. Select Background.

 

c. Select the appropriate Environment ID.

 

d. Paste the path for the zip file created in the archive process into the Archive File Name field.

 

e. Select only Restore Transaction Data.

 

f. Change the Record Count to “0”.

 

3. Execute the Restore.

 

4. Use SM37 to check the status of the restore.


How to load files from BW Application server to BPC 10.0


This document helps you move a file from the BW application server to BPC 10.0. I came upon this requirement when I had to load files from a DataStage server into BPC. For this, we could use the Data Manager packages available in BPC, but they only support FTP (File Transfer Protocol). My requirement was to load using the SFTP protocol, which is not possible using Data Manager packages in BPC. Below are the steps to follow to achieve this requirement:

 

1) First, place the file from DataStage on the BW application server. I first copied the CSV file from the DataStage server to the presentation server and used a custom-built program to move the file from the presentation server to the BW application server. You need to create a folder on the BW application server in which to place the file.


2) Log on to BPC and to the EPM Office Add-in for Excel, then go to "Run Package" and select the Data Manager package "/CPMB/IMPORT_MASTER" (Import Master Data). This DMP is available in the "System administration" package group.


 

 

3) Select "RUN". The DMP then asks for the "Import file"; here, provide the import file name from the BW application server, e.g. "/apps/sap/xfer/BWBPC....", then click "NEXT".

 

4) Here, provide the transformation file that you have already created.

To do so, go to Excel, Data Manager, choose the transformation file, then validate and process the transformation file, as shown below.

  

5) Select the dimension name in BPC to which you want to load the file. Specify the "Write mode" as per your requirement and then click "RUN".

 

Your file will now be loaded from the BW application server to BPC.
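Step 1 above (pushing the file to the application server over SFTP) can also be scripted outside of SAP. The sketch below only composes an scp (SFTP-family) command; the host, user, and target directory are placeholders, not the actual landscape from this article, and the author's actual solution used a custom program instead.

```python
import subprocess

def build_scp_command(local_file, host, user, target_dir):
    """Compose an scp command that pushes a local file to a directory on
    the BW application server over SSH. All connection details here are
    hypothetical placeholders."""
    filename = local_file.rsplit("/", 1)[-1]
    return ["scp", local_file, f"{user}@{host}:{target_dir}/{filename}"]

cmd = build_scp_command("extract.csv", "bwhost", "dsuser", "/apps/sap/xfer")
# To actually execute the transfer: subprocess.run(cmd, check=True)
print(cmd)
```

Once the file lands in the application server folder, the Data Manager steps above can pick it up as the "Import file".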

Dimension Override based on dynamic selection with multiple properties


As EPMDimensionOverride is a widely used function, and a common requirement is to pass multiple properties, I decided to describe a scenario for this function with dynamic selection of multiple properties and property values.

 

The function and its parameters:

 

EPMDimensionOverride("reportID","Dimension","Member")

 

The logical operators supported by this function are as follows:

AND  ;
OR  ,

 

Background

 

The PRODUCT dimension contains three properties with the values shown in the table below.

 

AREA  GROUP  FIELD
510   901    1A

 

In my given case, these are dimensions as well.

The user selects "AREA", "GROUP" and "FIELD", and these values are used to override the PRODUCT dimension.


As the third parameter of the function is "Member", concatenate all the desired properties and property values in a separate cell and give a reference to that cell as the third parameter of the function.

 

As shown in the first example below, the properties and property values are populated in cells "C7:E8" along with the logical operator. The desired combination of properties and operators is concatenated in cell "I8". The formula of cell "I8" is displayed in cell "I7" (Formula 2); an alternate formula is provided in cell "I6" above it (Formula 1). The EPMDimensionOverride function is written in cell "I4", and its formula is displayed in cell "I3" above it. Cell "I8" is used as the third parameter of the function.
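The concatenation performed in cell "I8" can be sketched as a small helper. This is an illustrative assumption: the function name, the "PROPERTY=VALUE" filter form, and the sample values are hypothetical, and the exact member-selection syntax your EPM Add-in version accepts may differ; only the ";" (AND) and "," (OR) operators come from the table above.

```python
def override_member_string(pairs, pair_op=";", value_op=","):
    """Build the third 'Member' argument for EPMDimensionOverride from
    (property, values) pairs: ';' acts as AND between property filters,
    ',' as OR between alternative values of one property."""
    parts = []
    for prop, values in pairs:
        parts.append(value_op.join(f"{prop}={v}" for v in values))
    return pair_op.join(parts)

# Hypothetical selection: AREA 510 AND GROUP 901 AND FIELD 1A
member = override_member_string([("AREA", ["510"]), ("GROUP", ["901"]), ("FIELD", ["1A"])])
print(member)
```

The returned string plays the role of the concatenated value in "I8" that the function call in "I4" references.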

 

EPMDimensionOverride.PNG

 

Tested on EPM Add-In SP 20, .NET 3.5 , BPC 10.0 NW.


Comments and suggestions are highly appreciated.


Best Regards,

Shrikant

Detailed explanation about automatic copy opening of BPC consolidation


Some customers do not know that when the BPC consolidation package is executed, an automatic 'copy opening' is performed. Other customers might know about this but are not sure about the detailed process.

Because of this, many customers have doubts about the data after they execute the consolidation data package, especially where 'copy opening' is concerned.

 

This article explains the detailed process.

 

An automatic 'copy opening' for data in an AuditID (data source) of type 'A' is executed during the consolidation run (either through a BPC Data Manager package or through the consolidation monitor).

 

In fact it is not a simple copy process.

 

Let’s take an example to explain.

 

In 2010.DEC, a customer has the record below under entity E14003, group C14001, currency GBP. The data source is 'DS11140', which is an 'I' type data source.

 

1) 2010.DEC / 10505111 / F_999 / DS11140 / 3,250,000

 

After the consolidation run for 2010.DEC, the system generates the record below to reverse record 1) above, under audit ID DS12100:

 

2) 2010.DEC / 10505111 / F_999 / DS12100 / -3,250,000

 

To execute consolidation for 2011.AUG, the first step is to run the balance carry-forward business rule, which performs the 'copy opening' (carry-forward) for 'I' and 'M' type data sources.

 

 

After that, we can see a record under the same account, flow F_100, in 2011.AUG, as below:

 

3) 2011.AUG / 10505111 / F_100 / DS11140 / 3,250,000

 

That is the only record on account 10505111 and flow F_100 in 2011.AUG.

 

And the audit ID is DS11140.

 

The second step in executing consolidation for 2011.AUG is to run the consolidation data package for 2011.AUG. At this point, the copy opening for data in audit IDs of type 'A' occurs.

 

It is not a simple copy process.

 

a) First, the system looks for data that has been carried forward into flow F_100. That is record 3).

 

b) The consolidation engine tries to calculate the result for record 3) according to the business rules. Normally the system will generate the following record, just as it does when generating record 2):

 

4) 2011.AUG / 10505111 / F_100 / DS12100 / -3,250,000

 

You may have already noticed that step b) is in fact doing the carry-forward. That is how the system is designed.

 

You may ask why the system is designed this way just to copy the balance in an audit ID of type 'A'. Well, let's imagine that the ownership of entity E14003 decreases in 2011.AUG. Although the opening balance record 3) is not changed, the final result should be calculated according to the latest ownership.

 

4-1) 2011.AUG / 10505111 / F_100 / DS12100 / -3,000,000

 

Record 4-1) is the result according to the latest ownership.

 

Comparing it with 4), the difference is 250,000. Instead of generating record 4-1), the system comes up with the records below:

 

5) 2011.AUG / 10505111 / F_100 / DS12100 / -3,250,000

 

 

6) 2011.AUG / 10505111 / F_VAR / DS12100 / 250,000

 

5) tells the end-user that this record is copied from 2010.DEC.

 

6) tells the end-user that the ownership percentage has changed and that the difference is 250,000.

 

The sum of 5) and 6) is the actual result for 2011.AUG.
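The arithmetic behind records 5) and 6) can be verified with a small sketch of the split described above (function name and signature are my own illustration, not an SAP API):

```python
def copy_opening(opening_amount, latest_ownership_result):
    """Sketch of the automatic copy-opening split described above:
    the straight reversal of the opening balance goes to flow F_100
    (record 5), and the ownership-driven difference goes to F_VAR
    (record 6), so F_100 + F_VAR equals the result at the latest
    ownership percentage."""
    f_100 = -opening_amount                   # record 5): straight copy/reversal
    f_var = latest_ownership_result - f_100   # record 6): ownership change effect
    return f_100, f_var

# Worked example from the text: opening 3,250,000, latest-ownership result -3,000,000
f_100, f_var = copy_opening(3_250_000, -3_000_000)
print(f_100, f_var)
```

With no ownership change, the F_VAR amount comes out to zero and only the plain copy remains.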

 

Next, I will share a bad configuration that prevents the copy opening from working.

 

A customer has configured an ICELIM rule with a detail line whose SOURCE FLOW is set to 'F_999' (the closing flow).

 

That prevents the system from performing step b) above, because the opening flow is not included.

 

As a result, the balance is not carried forward.

Top 10 viewed SAP KBAs for January 2015


Purpose

The purpose of this document is to provide a list of the top ten most viewed SAP Knowledge Base Articles for SAP Planning and Consolidation version for SAP NetWeaver in the month of January 2015.

 

Overview

Below are the top 10 most viewed SAP KBAs for SAP Planning and Consolidation version for SAP NetWeaver.

 

Note Number  Note Title
2107965  Issues in EPM Add-In after installing Microsoft updates
2010964  BPC parameters list and how they impact system behavior- BPC
2081345  BPCNW - Troubleshooting Tips for Transport Issues
1966663  BPC 10.1 Frequently Asked Questions
2053377  No active links in BPC 10.1 web client after upgrading SAP G
1817817  How to configure RFCs used by BPC NW
1673094  Error in EPM add-in when running reports: UJO_READ_EXCEPTIO
1708660  BPC10 Transport error UJT_TLOGO_AFTER_IMPORT
1667091  BPC 10 Transport error: Errors occurred during post-handlin
1744310  Get Error "500 - Server error" when login to BPC 10 Web Admi

 

Please note, in order to view the contents of the SAP KBAs, you will need to be logged into Service Marketplace.

 

Related Content

Top 10 viewed  SAP KBAs for December 2014

Top 10 viewed  SAP KBAs for November 2014

Top 10 viewed  SAP KBAs for October 2014

Top 10 viewed  SAP KBAs for September 2014

Top 10 viewed SAP KBAs for February 2015


Purpose

The purpose of this document is to provide a list of the top ten most viewed SAP Knowledge Base Articles for  SAP Planning and Consolidation version for SAP NetWeaver in the month of February 2015.

 

Overview

Below are the top 10 most viewed SAP KBAs for SAP Planning and Consolidation version for SAP NetWeaver.

 

Note Number  Note Title
2107965  Issues in EPM Add-In after installing Microsoft updates
2010964  BPC parameters list and how they impact system behavior- BPC
1966663  BPC 10.1 Frequently Asked Questions
2081345  BPCNW - Troubleshooting Tips for Transport Issues
1673094  Error in EPM add-in when running reports: UJO_READ_EXCEPTIO
2053377  No active links in BPC 10.1 web client after upgrading SAP G
1667091  BPC 10 Transport error: Errors occurred during post-handlin
1817817  How to configure RFCs used by BPC NW
1708660  BPC10 Transport error UJT_TLOGO_AFTER_IMPORT
1664950  Error in Excel add-in 10: Error while communicating with th

 

Please note, in order to view the contents of the SAP KBAs, you will need to be logged into Service Marketplace.

 

Related Content

Top 10 viewed  SAP KBAs for January 2015

Top 10 viewed  SAP KBAs for December 2014

Top 10 viewed  SAP KBAs for November 2014

Top 10 viewed  SAP KBAs for October 2014

Top 10 viewed  SAP KBAs for September 2014

Restrict Data saving in input form for specific members/intersection SAP BPC 10


Scenario:


There is a requirement in which the user wants to view data for some accounts, while for other accounts he wants to plan and enter values to be saved in the backend. No data should be saved against the accounts that are for viewing only. If we create an input form, then by default all cell values can be saved to the backend. The procedure to lock some cells, so that data is not entered against those accounts, is explained below.


Example

In the example below, we will take two account hierarchy nodes. Of these, one node and its descendants will be available for data saving, and the other will be available only for viewing data.


Step by Step Procedure:


Step 1: Create a report using report editor.

 

Select members for Accounts.


 

Here we have selected 2 parent nodes (2114, 2115) and their descendants. For one parent node and its descendants, we will lock the cells for input.


Step 2: To lock this data


Go to -- Read only data tab -- Click on select members -- Select the dimension for which you want to restrict the data.

 

Select the member for which you don’t want the user to save any data. Here we select 2115 and its descendants.


 

      Move those members to the right side -- click Ok


Step 3: Enter some data and test


 

As we have locked this cell, no data can be saved to the backend for this account, so while saving data we will get the “No data to save” message.

 

 

Enter the data against the accounts (2114) that can be saved.

It will allow you to save the data.

 

Intersection only method (for multiple dimensions)


Using this option, we can restrict data saving based on the intersection of different dimensions.

In the example shown below we are considering intersection of Account and Time.

Scenario:  User should not be able to enter the data for hierarchy 2115 corresponding to time Feb-08.

Note: To make this work, the “At Intersection only” check box should be checked.


We have already restricted Accounts; now select Time to implement the intersection. Select the Time dimension -- Click on select members -- select the time value.

 

After this, the report looks like:


 

Data input will be restricted only at the intersection of the member selected.


Read-only data will be at the intersection of Feb-08 and hierarchy 2115, as we have selected these in the Read-only data tab. So we cannot save any data here.


 

If you save data against Jan-08 and hierarchy node 2115, it will allow saving, as this is not within the read-only intersection.
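The behaviour of the “At Intersection only” check box can be sketched as a simple boolean rule. This is an illustrative model, not EPM Add-in code; the member sets are flattened placeholders for “node 2115 and its descendants” and “Feb-08”.

```python
def is_read_only(account, period, ro_accounts, ro_periods, at_intersection_only):
    """Sketch of the 'Read-only data' evaluation: with 'At Intersection
    only' checked, a cell is locked only when ALL selected dimensions
    match; otherwise a match on any one dimension locks it."""
    in_acct = account in ro_accounts
    in_time = period in ro_periods
    return (in_acct and in_time) if at_intersection_only else (in_acct or in_time)

ro_accounts = {"2115"}   # hierarchy node 2115 and descendants (flattened here)
ro_periods = {"Feb-08"}

print(is_read_only("2115", "Feb-08", ro_accounts, ro_periods, True))  # locked
print(is_read_only("2115", "Jan-08", ro_accounts, ro_periods, True))  # input allowed
```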


 

 

Conclusion

 

Data can be restricted from being stored in the backend based on a single dimension as well as on multiple dimensions, using the Intersection only option.


Thanks.

Security Process (Functional) – Add, Modify, and Remove Users


Before You Begin

 

Who Should Use this Document?

This document is intended for technical and business resources who have an interest in the security process for the SAP BPC solution. 

 

How Should You Use this Document?

This document details the process flow and other related content associated with the process and approvals for managing security for SAP BPC content.

 

Add a New User

 


Process Flow

 

Add_User_Process.png



Initiating and Approving the Request



  1. The business makes a request to have access to either SAP Business Planning and Consolidation (BPC) and/or the EPM Add-In for MS Excel
    1. The user, their manager, a peer, or a content owner may make the request
  2. The request should come through the Enterprise Service Desk (ESD) – there is no formal document or form required
    1. The request, however, should denote the following things, although the content owner(s) may need to provide assistance
      • The environment in BPC the user needs access to
      • Where the user can access data (data access profile)
      • What the user can do (task profile)
      • Oftentimes, the easiest approach is to communicate whose rights to mirror
      • Enough information to ensure the correct user is being added to the system
  3. The ESD creates a ticket and routes it to the SAP Enterprise Performance Management group.
  4. The BPC Administrator reviews the request and clarifies any questions with the business.
  5. At this point, the BPC Administrator determines whether there are any licenses available to extend.
  6. In addition, the BPC Administrator gains approval from the content owner in the business



Processing the Request



  1. Once all required approvals are obtained, the BPC Administrator proceeds with processing the request
  2. The BPC Administrator creates two new, related tickets:
    1. One to the ESD to assign the new user to the appropriate Active Directory (AD) group for access to the Citrix connection for the EPM Add-In for MS Excel
      • ESD notifies the BPC Administrator that their task is complete
    2. One to SAP Security to add the user to the production instance under one of three profiles in the back-end so the BPC Administrator can assign the user to the appropriate rights in the front-end
      • Z1:BPC_GENERAL_ALLUSER – for most all business users
      • Z3:BPC_DEVELOPER – requires the above and adds the ability to run ZUJBR (business-initiated backup in BPC, but not restore functionality)
      • Z4:BPC_ADMINISTRATOR – for IT resources who support and develop in the BPC solution
      • SAP Security notifies the BPC Administrator that their task is complete
  3. Once SAP Security adds the user to the back-end, the BPC Administrator can pull the user in to the front-end security in the BPC web administration and assign the user to the appropriate team(s).



Communications and Closing the Request



  1. SAP Security notifies the user of their temporary password, directly
  2. The BPC Administrator notifies the new user that the security request has been completed and provides first time user instructions for changing their temporary password and accessing the content via the web and the EPM Add-In for MS Excel via Citrix
  3. The BPC Administrator will add the user to the appropriate Outlook distribution list:
    1. *SAP_BPC-KeyStakeholders – used to communicate to power users and content owners in the solution
    2. *SAP_BPC-Users – used to communicate to all users from report viewers only to business developers and their managers
  4. The BPC Administrator will update the BPC Security spreadsheet for managing licenses.
  5. The BPC Administrator will close out the original Remedy incident associated with the user’s request for access to SAP BPC / EPM Add-In for MS Excel
  6. SAP Security will close out the Remedy incident associated with adding the user to the back-end of SAP BPC
  7. ESD will close out the Remedy incident associated with adding the user to the AD group



Modify an Existing User



Process Flow



Modify_User_Process.png


Initiating and Approving the Request


  1. The business makes a request to modify access to either SAP Business Planning and Consolidation (BPC) and/or the EPM Add-In for MS Excel
    1. The user, their manager, a peer, or a content owner may make the request
  2. The request should come through the Enterprise Service Desk (ESD) – there is no formal document or form required
    1. The request, however, should denote the following things, although the content owner(s) may need to provide assistance
      • The environment in BPC the user needs access modifications to
      • Modifications related to where the user can access data (data access profile)
      • Modifications related to what the user can do (task profile)
      • Oftentimes, the easiest approach is to communicate whose rights to mirror
      • Enough information to ensure the correct modification(s) is(are) being made to the system
  3. The ESD creates a ticket and routes it to the SAP Enterprise Performance Management group.
  4. The BPC Administrator reviews the request and clarifies any questions with the business.
  5. The BPC Administrator gains approval from the content owner in the business



Processing the Request


  1. Once all required approvals are obtained, the BPC Administrator proceeds with processing the request
  2. The BPC Administrator possibly creates one new, related ticket:
    1. One to SAP Security to modify the user’s access in the production instance, provided the modification to rights in the front-end warrants a change in the back-end. This is rare, and would typically only happen when a business user is changing between the Business Developer role and the General User role.
      • Z1:BPC_GENERAL_ALLUSER – for most all business users
      • Z3:BPC_DEVELOPER – requires above and adds the ability to run ZUJBR (business initiated backup in BPC but not restore functionality)
      • Z4:BPC_ADMINISTRATOR – for IT resources who support and develop in the BPC solution
      • SAP Security notifies the BPC Administrator that their task is complete
  3. Once SAP Security modifies the user in the back-end, the BPC Administrator can modify the user’s rights in the front-end security in the BPC web administration by modifying the user assignment to the appropriate team(s).



Communications and Closing the Request


  1. The BPC Administrator notifies the user that the security modification has been completed.
  2. The BPC Administrator will modify the user enrollment for the appropriate Outlook distribution list, if necessary:
    1. *SAP_BPC-KeyStakeholders – used to communicate to power users and content owners in the solution
    2. *SAP_BPC-Users – used to communicate to all users from report viewers only to business developers and their managers
  3. The BPC Administrator will update the BPC Security spreadsheet for managing licenses, if necessary.
  4. The BPC Administrator will close out the original Remedy incident associated with the user’s request to modify access to SAP BPC / EPM Add-In for MS Excel
  5. SAP Security will close out the Remedy incident associated with modifying the access for the user in the back-end of SAP BPC



Delete a User



Process Flow



Delete_User_Process.png


Initiating and Approving the Request


  1. The business makes a request to have a user removed from SAP Business Planning and Consolidation (BPC) and/or the EPM Add-In for MS Excel
    1. The user, their manager, a peer, or a content owner may make the request
  2. The request should come through the Enterprise Service Desk (ESD) – there is no formal document or form required
    1. The request, however, should denote the following things, although the content owner(s) may need to provide assistance
      • Enough information to ensure the correct user is being removed from the system
  3. The ESD creates a ticket and routes it to the SAP Enterprise Performance Management group.
  4. The BPC Administrator reviews the request and clarifies any questions with the business.



Processing the Request


  1. Beyond clarifying / confirming the request to remove a user, there are no approvals required.
  2. The BPC Administrator proceeds with processing the request.
  3. The BPC Administrator creates two new, related tickets:
    1. One to the ESD to remove the user from the appropriate Active Directory (AD) group for access to the Citrix connection for the EPM Add-In for MS Excel
      • ESD notifies the BPC Administrator that their task is complete
    2. One to SAP Security to remove or disable (consistent with SAP Security’s current processes) the user from the production instance
      • SAP Security notifies the BPC Administrator that their task is complete
  4. Once SAP Security removes the user from the back-end, the BPC Administrator can confirm that the user has been removed from the front-end security in the BPC web administration.



Communications and Closing the Request


  1. The BPC Administrator will remove the user from the appropriate Outlook distribution list(s):
    1. *SAP_BPC-KeyStakeholders – used to communicate to power users and content owners in the solution
    2. *SAP_BPC-Users – used to communicate to all users from report viewers only to business developers and their managers
  2. The BPC Administrator will update the BPC Security spreadsheet for managing licenses.
  3. The BPC Administrator notifies the Content Owner of the license recovery
  4. The BPC Administrator will close out the original Remedy incident associated with the request to remove the user from SAP BPC / EPM Add-In for MS Excel
  5. SAP Security will close out the Remedy incident associated with removing the user from the back-end of SAP BPC
  6. ESD will close out the Remedy incident associated with removing the user from the AD group



Master data loading from Flat file


Hello every one,

The main intention of this document is to describe the procedure for loading master data from a flat file. It covers only the basics and is aimed at beginners. Please share your comments or suggestions below.

There are many more points that could be added, but this is for beginners; feel free to add more points below.


Assumptions:

  • Dimensions and models are already available.

  • The connection is already established.

  • The number of columns in the source and destination is the same.

  • The master data is already available in a flat file, in CSV format.
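Before uploading, it can help to sanity-check the CSV layout. The sketch below (Python, purely illustrative; the column names and member IDs are assumptions, not a fixed schema) builds a file whose header row matches the destination properties so BPC can auto-map the columns:

```python
import csv
import io

# Build a sample master data file for an Account-type dimension.
# The column set (ID, Description, ACCTYPE, RATETYPE) mirrors the mapping
# used later in this article but is an assumption -- adjust it to the
# properties of your own dimension.
fieldnames = ["ID", "Description", "ACCTYPE", "RATETYPE"]
rows = [
    {"ID": "SALES", "Description": "Sales revenue", "ACCTYPE": "INC", "RATETYPE": "AVG"},
    {"ID": "COGS", "Description": "Cost of goods sold", "ACCTYPE": "EXP", "RATETYPE": "AVG"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fieldnames)
writer.writeheader()   # a header row lets BPC auto-map matching columns
writer.writerows(rows)

print(buf.getvalue())
```

Save the same content to a `.csv` file and it is ready for the Upload Data step below.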


Step 1: Log on to the BPC Excel client

           Open Excel, go to the EPM tab, and click the Log On button.

2015-04-14_14h53_12.png

After you click the Log On button, you are prompted for a connection name; choose the connection, and you are then asked for credentials.

2015-04-14_14h53_12.png



Step 2: Upload data    

In this step we upload the master data file to the application server.


Click on the Data Manager tab --> Upload Data --> browse to the file and click Upload.

 

Once you click Upload, you are prompted to save the file. Give it a name and click Save.

 

2015-04-14_16h02_56.png

2015-04-14_16h04_08.png

 

Step 3: Create a transformation file.

After uploading the master data file to the application server, we need to create a transformation file, which maps the source columns to the destination columns.

To do this, click Transformation File and choose New Transformation File.

2015-04-14_16h05_36.png

If the fields in the source and destination are the same and in the same order, no mapping is needed here; BPC automatically maps the source fields to the destination.

The transformation file looks like the one below.

2015-04-14_16h07_30.png


The reason for using EVDESCRIPTION = Description is that without this mapping the description is sometimes missing; mapping it explicitly avoids that.

In the mapping section we have 3 scenarios.

  1. The data columns in the source and destination are the same (our example)
  2. The data columns in the source and destination are the same but in a different order, so we map accordingly

DESTINATION --> SOURCE

ID = ID

EVDESCRIPTION = DESCRIPTION

ACCTYPE = ACCTYPE

RATETYPE= RATETYPE

 

  3. The data columns in the source and destination are different and the header is missing

DESTINATION --> SOURCE

ID = *COL(1)

EVDESCRIPTION = *COL(2)

ACCTYPE = *NEWCOL(INC)

RATETYPE= *COL(3)

Here ACCTYPE is not available in the source, so we hard-code the value to type INC using *NEWCOL.
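Putting the pieces together, a complete transformation file for scenario 1 might look like the following sketch. The *OPTIONS values shown are typical settings for a comma-delimited file with a header row, not your exact file's settings; verify them against your own source file:

```text
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = ,

*MAPPING
ID=ID
EVDESCRIPTION=Description
ACCTYPE=ACCTYPE
RATETYPE=RATETYPE

*CONVERSION
```

The *CONVERSION section is left empty here; it would hold references to conversion files if source values needed translating.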

 

Next click on transformation file --> validate and process transformation file.


A screen pops up: choose the data type Master Data from Flat File, browse in the Data File field and select the data file placed on the application server in Step 2, then select the dimension into which you want to load the data.


Next click Save; a screen pops up for saving the transformation file. Type a name and save.

2015-04-14_16h10_05.png

Once validation is successful, a log like the one below appears; any errors are shown here.

2015-04-14_16h11_03.png

 

Step 4: Run the DM package


Click on DM tab --> Run package

Choose the Data Management folder and choose the master data import from flat file DM package.

Next click on RUN

  1. It will ask for the data file; browse to it (saved in Step 2)
  2. Next, select the transformation file (saved in Step 3)
  3. Choose your dimension (the one into which you want to load the master data)
  4. Write mode: Update Hierarchy
  5. Click on Run
  6. Next, click on View Status
  7. Once the package is successful, go to the web admin and check whether the data is available.

 

Thank you..

 

Regards,

Saida Reddy G

How to archive the data change audit and clearing/deleting the archived data change audit in SAP BPC 10.0 NW


Hi All,

 

This document is intended for beginners and covers housekeeping of the data change audit.

 

 

Why the Data Audit feature should be turned on for models

 

Benefits:

  1. Keeps track of data changes made through various tasks (e.g., input schedules, Data Manager packages, script logic execution, etc.)
  2. Easy to identify the data loss issue (keeps track of data changes made by users based on their user ids)

 

Now, why should we archive the data change audit? In a production system with data auditing enabled, large volumes of audit records accumulate in the audit tables, which significantly impacts performance when running the “Data Changes” audit report in the administration.

So, it is recommended to archive the Data Audit frequently.

Assuming Data Audit feature is already turned ON for Models.

 

Steps to be followed for Archiving the Data change Audit

Logon to BPC Administration>>Features>>Audit

In the “Data Audit Configuration by Model “section click on Edit and do the audit configuration category wise for the models

In the screenshot below set audited tasks and the Frequency of Data Audit Purge for the categories in scope.

 

audit1.png

 

Please note that setting the Data Audit purge frequency (days) does not archive the data audit automatically.


The standard Data Manager package has to be run; it moves the data audit from the current audit table to the archive audit table.

 

Process chain: /CPMB/ARCHIVE_DATA    ---Archive Data Change Audit

 

To find the audit tables for a specific model, go to SE38 and run the program UJ0_GET_GEN_TABNAME.

Search for the keywords AUDDATA and AUDDATA_A (archive).

Below is a sample screenshot; table names vary by model.

 

AUDIT2.jpg

 

How to build BPC audit reports on Archived audit data

Execute the transaction code SPRO

SAP Reference IMG >> Planning and Consolidation >> Set Global Parameters

Set “ENABLE_REPORT_ARCHIVED_AUDIT” = “X”

Logon to SAP BPC web>>Home>>Audit>>Data Changes

A new section appears after the global parameter is set

Report on: Archived, Current, or Current and Archived.

SAP Note 1695526 has to be applied if the BPC system is lower than SP07.

 

bpcaudirrep.jpg

 

Steps to be followed for deleting /clearing the archived data audit.


Go to SE38 and run the program UJU_DELETE_AUDIT_DATA

Test Run “X” only displays the records to be deleted; no records are actually deleted.

Test Run “ ” (blank) deletes the audit data records.

If the BPC system is lower than CPMBPC 800 SP10 or CPMBPC 801 SP03, SAP Note 1827262 has to be implemented to use the delete feature.

 

 

audit3.jpg

Thanks,

Dinesh.

Overview of BPF in Unified model


Hello All,

 

The basic idea of this document is to give a brief overview of BPF in the unified model.

 

Business process flows in the unified model work very similarly to the previous release.

There are three main differences:

  •       Some changes for the UI of BPF design time and run time
  •       Internal and external dimensions
  •       The available target actions have been adjusted accordingly

 

  1. Some changes for the UI of BPF design time and run time:

See the image below for reference on how it has changed.

 

 

The process template activities available in versions 10.0 and 10.1, respectively:

 

 

 

 

2. Internal and external dimensions:


When assigning a driver dimension in a BPF template, the user has to select one dimension from the “Internal Dimensions” or the “External Dimensions”.

 

 

In the unified model, users define the model directly on existing InfoProviders. These InfoProviders use SAP-delivered InfoObjects such as 0CALYEAR and 0CURRENCY (InfoObject names starting with zero).

 

When assigning these kinds of InfoObjects as the driver dimension, in most cases they do not have the Owner and Reviewer attributes like the ones we use in the classic model.

Normally SAP does not recommend that customers modify standard SAP-delivered InfoObjects by adding the attributes directly. So “External Dimensions” were introduced, and each external dimension is mapped to its respective internal dimension.

 

3. Hyperlink design:

As discussed in point one, the UI screens have changed in this section as well.

Some new actions are introduced, such as Analysis for Office.

 

The UI changes according to the target action and further selections.

 

 

Please add more valuable points here.

 

References: SAP Ramp-up material.


Regards,

Saida Reddy G

Issue deleting multiple members from a dimension - BPC 10.1 NW SP5


Warning: delete only a single member before processing a dimension when using BPC 10.1 Standard (NetWeaver) SP5.

 

Only a single record in the BW table is deleted when deleting multiple dimension members through the BPC dimension admin.

 

We managed to resolve this issue by applying SAP Note 2164233.

 

 

I have tested this at 3 different customer installations of BPC.  This has been raised as a message with SAP, but I thought it worth sharing.

 

Scenario 1: Delete multiple members of a dimension through the BPC dimension maintenance.

 

7 members have been added to a dummy dimension

 

Starting position - members appear OK in both BPC and BW

BPC Members Pre deletion.pngBW Members Pre deletion.png

 

Delete 4 members from BPC then process the dimension

BPC Members deletion 1.png

Only a single member is removed from BW

BW Members deletion 1.png

 

Refreshing the BPC dimension using UJA_REFRESH_DIM_CACHE results in the members reappearing in BPC.

BPC Members cache refresh.png

 

Scenario 2: Reuse a member ID but with a different case.

DELTEST5 was previously deleted as part of a multi-delete and had been left in the BW database.

BPC Members scenario 3.png

A new member DELTESt5 is added

BW Members scenario 3.png

Both DELTEST5 and DELTESt5 now appear in BW.

Distribute template using Email notification


Business Scenario:

 

In planning scenarios, reports or input schedules may need to be distributed to multiple recipients. BPC allows you to distribute data to a distribution list. The recipients in the distribution list can change the data and send it back.

 

To achieve this, we have to configure the email process as described below:

 

Step 1: Configure the SMTP service with the help of a Basis consultant.

 

Step 2: After configuring the SMTP service, we have to configure the SMTP server to distribute using email. For this we have to set the application parameters SMTPAUTH, SMTPPASSWORD, SMTPPORT, SMTPSERVER, and SMTPUSER in BPC administration. (Please refer to the steps below to set the application parameters.)

 

2.1 Go to T-code: SPRO

2.2 Click on SAP Reference IMG as shown in the screenshot below.

 

Untitled1.png

 

2.3 Go to Set Environment Parameters.

 

Untitled.png

 

2.4 Select Environment.

 

Untitled.png

 

2.5 Create the SMTPPORT, SMTPSERVER, and SMTPUSER parameters and click Save.
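As an illustration, the five parameters might be set along these lines. Every value below is a placeholder, not a default; use the host, port, and credentials of your own mail infrastructure:

```text
SMTPSERVER   = smtp.example.com   (hypothetical mail host)
SMTPPORT     = 25
SMTPAUTH     = 1                  (enable authentication, if your server requires it)
SMTPUSER     = bpc_mailer         (hypothetical service account)
SMTPPASSWORD = ********
```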

 

After configuring the environment parameters, users can distribute BPC templates using email notification.

Untitled.png
