Wednesday, October 21, 2015

FDM to FDMEE Migration Utility - Part 3


Before I go through the execution of the ODI scenarios for the migration, I would like to emphasize that any comment or opinion you read here should not drive your final decision at all. Using the migration utility or not is a decision that needs to be made based on many different factors.
I just want to shed some light on it and see how far I can get with it.

That being said, here you have the third part of the FDM to FDMEE Migration Utility posts.

To summarize the other parts:
- Part 1: Introduction to the migration utility and overview of its main capabilities
- Part 2: Installing the migration utility

Let's start the rock n' roll!

Artifacts of my legacy FDM application
The FDM application I'm trying to migrate is a complete one and may have some particularities which will be discussed during this series.
 - 2 target adapters but only one being used by current locations
 - Several locations
 - Logic Groups (Complex types)
 - No Validation Entity Groups
 - No Check Rule Groups
 - Import format for delimited files
 - Import scripts
 - Mappings
 - ...

When my target application is an EPMA application
When the target application is deployed with EPMA, it must be registered manually in FDMEE before extracting the FDM artifacts.
Our target application is Classic, so there is no need to register it as the utility will create it in FDMEE.
Obviously, the migration utility assumes your target application(s) exist in your system :-)

Executing the migration: extracting FDM artifacts
We just have to execute the ODI scenario FDMC_EXTRACT_SETUP to extract legacy FDM artifacts and generate the new ones in FDMEE.
We open ODI Studio and log on to our FDMEE repository. The ODI scenario can be executed from Designer as below:
When executing the scenario, the prompt window for entering parameter values will be displayed:
- p_application_name: our target application name. This must be the same name as in our source environment. This is not the place where we rename our HFM application, so names will be the same.
- p_application_type: valid application types supported are CUSTOM, ESSBASE, HPL and HFM (upper case)
- p_application_db_name: not applicable to HFM. This is where we specify Planning/Essbase cube names.
For Essbase, we need to run the scenario once for each cube.
For Planning, we specify a comma-separated list of 6 plans (Ex: BalS,IncS,,,,Plan6)
- p_prefix: although this parameter is optional, it is strongly recommended. When migrating multiple applications, there could be issues with duplicate names (Ex: two locations from two apps having same name).
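Putting the parameters together, a run for an HFM migration might look like this (the application name and prefix below are made-up values for illustration only):

```
p_application_name    = HFMAPP    (must match the name in the source environment)
p_application_type    = HFM
p_application_db_name =           (not applicable to HFM)
p_prefix              = FDM1_
```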
Once values are entered, the following window will be prompted: 
This is where we select our context, agent running the scenario, and the log level.
The context will be the one linked to the FDM application (database) we want to migrate. Although you can use the local agent, I prefer to use the FDMEE agent. And the log level? Always 5, so you get the maximum detail in ODI Operator and in the log produced by the utility.

After executing the ODI scenario, I'm sure you go to ODI Operator and start pushing Refresh, don't you? :-)

And what can we see here? All the steps executed in the ODI scenario.
Summarizing the different steps, the migration process executes the following tasks:
1. Refresh ODI variables (global adapter key, sub-query for target adapters, etc.)
2. Create target application in FDMEE
    - Configure application options
    - Configure application dimensions
3. Create source adapters
4. Extract Categories
5. Extract Global and application periods
6. Extract Logic Account Groups
    - Logic Account items and item criteria
7. Extract Check Entity Groups (Validation Entity Groups in FDMEE)
    - Entity Group items
8. Extract Check Rule Groups (Validation Rule Groups in FDMEE)
    - Check Rule Group items
9. Extract Import Formats
    - Import format definitions
    - Import format for source adapters
10. Extract locations
11. Extract data mappings

I must say that I was surprised that I did not get any error when executing the scenario but...
Does having green ticks mean the migration succeeded?

A glimpse of FDMEE after the migration
Let's start checking artifacts in FDMEE. I will start with the target application. Why? Because it's the first object being created (ODI step 15). That makes sense as all other objects in FDMEE depend on it.
The creation of the target application is done through steps 15 to 23; it basically creates the application and adds the application options and dimensions based on the target adapter configuration in legacy FDM.
Cool! the HFM application has been created in FDMEE and dimensions have been added. Do you see anything special? 
Dimension names are created based on the dimension aliases defined in FDM Classic, so if you never updated them from the common "FM dimension" pattern, your dimensions will look like that. You would have to update the aliases in FDM Classic before running the migration.
I'm not going through application options right now so I will assume they were correctly imported from adapter options in FDM classic.

Let's have a look to other artifacts like locations or import formats:
Mayday, Mayday! We have a problem :-( No import formats were imported from FDM, so something actually went wrong. Unfortunately, having green ticks does not mean success.

Let's dive into the details! We may shed some light on it.

The log: fdmClassicUpgrade.log
Before making further investigations you should know that the migration utility produces a log which is stored in the temporary directory: 
In that log we can see the values of the ODI variables used during the migration. The funny thing is that the log stops after importing categories. Where is the rest then? ODI Operator is your new best friend when you need to troubleshoot objects that were not migrated.
Did you also get the ODI scenario successfully executed but can't see your FDMEE artifacts?
You can either troubleshoot, or switch your approach and recreate them manually.
If you chose the first option, please keep reading.

Next stop: digging into ODI!

Tuesday, October 20, 2015

FDM to FDMEE Migration Utility - Part 2

Did you already have the chance to play with the new migration utility?
In my previous post I described the basics of this new ODI built-in tool, which is supposed to speed up your migration process by moving your legacy FDM artifacts to FDMEE through the database layer.

Today, I will go through the installation steps needed to perform the migration of one classic FDM application running in SQL Server 2008 into new FDMEE running on SQL Server 2012.
The application has two target adapters for HFM. Actually, one is an old version which was synchronized with the new version of the adapter but kept in the FDM application.

What do I need to start my migration?
  • a running version of FDM - not necessarily. I have actually restored the SQL Server database of an FDM application in the same database server that FDMEE uses. Reason? Well, I just have one virtual machine on my laptop :-) What about you? The migration is done through the database layer, so I don't see any problem having them in the same database server. However, if you are also migrating your database platform (ex: from Oracle to SQL Server), you may want to keep it in your old database server. A backup of the original database is always recommended, as the migration utility creates some objects in it.
  • ODI Studio - don't try to install ODI Studio from the EPM installer, it fails. I still keep the full ODI installer. I think Oracle just keeps the last 11g release. If you don't have the original installer files, I would suggest you raise an SR with Oracle so they can provide you with the links to download it.
  • FDMEE - got it!
As expected with this kind of automation tool, there are many restrictions, some of them due to its own functionality, others due to FDMEE's functionality. For this reason I encourage you to read the migration guide carefully. It may save you a headache :-)
For example, the utility guide shows the following note:
You can read above that "FDMEE does not support parent locations across different target applications".
What do they mean by this? That you cannot have one parent location for two child locations, each of them pointing to a different target application, whether or not they have the same dimensionality?
Let's have a quick look:
I can create one parent location for two children pointing to different target applications: 
- Location LOC_PARENT_GLOBAL has an import format pointing to an HFM application
- Location LOC_CHILD_A has an import format pointing to an HFM application
- Location LOC_CHILD_B has an import format pointing to a Planning application
But what about if I want to maintain mappings for Planning dimensions?
There is no way of showing Planning dimensions, as the parent location points to the HFM application.
I will be able to map Planning dimensions only if LOC_CHILD_B has a parent pointing to Planning, which, as I said, makes total sense:
To me, it does not make sense that FDMEE allows this configuration, so, as an enhancement, it would be great if you could only select parent locations having the same target application as the child location you are configuring.

Step 1: installing the patch
Let's start with the easy ones:
1) Create a working directory
2) Download patch 21379349 (Migration Utility)
3) Unzip the patch to your working directory

Step 2: installing DB schema
Before moving forward, please call your DBA colleague if you don't feel comfortable with SQL.
We need to run some SQL scripts coming with the patch for both FDMEE and FDM schema (if you work with Oracle) or database (if you work with SQL Server).
These scripts will create SQL temporary objects used by the migration process.

Update the FDMEE Schema/Database
1) Logon to the FDMEE database using a tool like SQL Management Studio
2) Run appropriate SQL script. The one for SQL Server in my case:
Command is completed successfully:
What is this script doing?
- Add columns prefixed by FDMC to tables:
- Alter attribute columns in TDATASEG and TDATASEG_T to support ICT module integration.

Update the FDM Schema/Database
1) Logon to the FDM database using a tool like SQL Management Studio
2) Run SQL script aif_fdmc_create_common.sql (valid for Oracle/SQL Server)
We get error below when running the script in SQL Mgmt Studio:
The code above is valid for Oracle. We have replaced the semi-colon between the two CREATE VIEW statements with the reserved word GO:
What is this script doing? It basically creates two Views:
- View TDATASEG_ALL (union of all TDATASEGx tables)
- View TDATAMAPSEG_ALL (union of all TDATAMAPSEGx tables)

Note: bear in mind that you will need to adjust this script if you use more than 50 data segments in legacy FDM (probably when having more than 250 locations).
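To picture what the script creates, here is a minimal sketch of the SQL Server flavour of the two views. This is an illustration, not the shipped script: the real one enumerates all 50 segment tables, and the column lists may differ.

```sql
-- Sketch only: union the per-segment tables into a single view.
-- The shipped script enumerates TDATASEG1..TDATASEG50; extend the
-- UNION ALL list if your legacy FDM application uses more segments.
CREATE VIEW TDATASEG_ALL AS
SELECT * FROM TDATASEG1
UNION ALL
SELECT * FROM TDATASEG2
-- ... one SELECT per segment table, up to TDATASEG50 ...
GO

CREATE VIEW TDATAMAPSEG_ALL AS
SELECT * FROM TDATAMAPSEG1
UNION ALL
SELECT * FROM TDATAMAPSEG2
-- ... up to TDATAMAPSEG50 ...
GO
```

Note the GO batch separator between the two CREATE VIEW statements: in SQL Server, CREATE VIEW must be the first statement in its batch, which is why the Oracle-style semi-colon is not enough there.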

Install and configure the ODI objects
In ODI Studio:

1) Configure topology for FDMC Source

    - Logical schema: FDMC_SOURCE
    - Data server: FDMC_SOURCE
      JDBC connection

    - Physical server for FDM database: FDMC_SOURCE
   - Context FDMC_CMFDMDEV

You can use GLOBAL context or create a new one. I will create a new one FDMC_CMFDMDEV as a duplicate of existing one GLOBAL. It is recommended to use one context for each FDM application to be migrated:
After updating logical schema for FDM and FDMEE:
2) Import ODI Scenarios in INSERT_UPDATE mode:

     - Scenario FDMC_EXTRACT_DATA_001
     - Scenario FDMC_EXTRACT_SETUP_001

To import the ODI scenarios:
Select them from the working directory:
Note: click Yes in all warnings

Besides SQL objects needed for the migration, we now have our ODI topology configured:
  • Physical/Logical architecture for our legacy FDM database
  • The two scenarios which perform the migration for both Setup (artifacts) and Historical data.
Next stop... executing the migration! and maybe troubleshooting...or maybe not :-)

Wednesday, October 7, 2015

HFM as a source, when Value is other than <Entity Currency>

Hi folks!

I'm currently working on a migration for a common ERP > EPM > EPM data integration using legacy FDM. You have probably already found a similar situation in your organization or at some of the customers you work with.

Making it simple: actuals data is typically loaded from our ERP system(s) into our consolidation system (1). Then, after consolidation happens (2), the consolidated data is exported into our planning system (3), where budget and forecast are calculated (4). Finally, the calculated data is exported back to the consolidation system (5). The following picture shows the process above:

AS_IS solution in legacy FDM
There were many different ways of implementing the flow above using legacy FDM. The one my customer has basically uses integration scripts to load data from SQL tables/views into target systems through FDM. When moving data from HFM to Planning, Extended Analytics (EA) is used to export HFM data into a table so FDM can access it. In a similar way, when moving data from Planning to HFM, Business Rules (BR) are used to execute a DATAEXPORT that exports (a good pun) data from Essbase cubes into a SQL table:
If you take a glimpse at the title, you may think what I described above has nothing to do with it. Actually, it doesn't, but I thought it would also be nice to go through the overall solution so you can have a better understanding of how new functionality in FDMEE replaces custom solutions implemented in legacy FDM.

TO_BE solution in FDMEE
We can clearly define 3 interfaces here:

1) MS Dynamics (one of the ERPs of Microsoft)  > HFM
FDMEE provides new functionality called the Open Interface Adapter, which lets you import data into FDMEE from any source that is not natively supported. This sounds like a good candidate for this interface :-)
You can find in ODTUG a webinar I delivered about Open Interface Adapter.

2) HFM to Planning
Before, we did not have any built-in functionality to move data between EPM applications. Luckily, EPM data synchronization was one of the best features coming with the new release :-)
Being able to extract HFM data at any level adds a great value to the tool and definitely makes our life easier.

3) Planning to HFM
Same as above: in the new release, "everything" can be a source or a target. With still some limitations, FDMEE can extract data from Planning (actually, Essbase plan types) in the same way it does from HFM.

That being said, what about the solution depicted below? Does it make sense?
The main goal is to replace most of the customization and give more control to end-users, as they will be able to create synchronization loads as needed by defining filters on source dimensions, in the same way they would do in application forms, for example (Ex: @IDESCENDANTS, .Base, etc.)
The only piece where we need a bit of customization is the Open Interface Adapter, as the open interface table (AIF_OPEN_INTERFACE) will be populated by an event script in Jython which runs a dynamic remote SQL query based on user-defined filters. End to end in one click!
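To make the idea concrete, here is a minimal, hypothetical sketch of the query-building part of such an event script. This is not the actual script: the source table/view name, the column names, and the helper function are all made up for illustration; only AIF_OPEN_INTERFACE is a real FDMEE table. A production script would also parameterize values properly instead of concatenating them.

```python
# Hypothetical sketch: build the dynamic SQL a Jython event script could run
# remotely before inserting the result set into AIF_OPEN_INTERFACE.
# Everything except the AIF_OPEN_INTERFACE target is an assumption.

def build_open_interface_query(source_table, filters):
    """Build a SELECT whose WHERE clause comes from user-defined filters.

    filters: dict mapping a source column to the value the user entered,
             e.g. {"SCENARIO": "Actual"}.
    """
    # Sort so the generated SQL is deterministic regardless of dict order.
    where = " AND ".join(
        "%s = '%s'" % (col, val) for col, val in sorted(filters.items())
    )
    sql = "SELECT * FROM %s" % source_table
    if where:
        sql += " WHERE " + where
    return sql

# The event script would execute this against the remote source and load
# the rows into AIF_OPEN_INTERFACE before the import step fires.
query = build_open_interface_query("V_HFM_EXPORT", {"SCENARIO": "Actual"})
```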

And now the issue...
One of the requirements was to export consolidated data. For that we had to set the filter on Value dimension as [Contribution Total] (value aggregated to the parent entity). Just an aside, I strongly recommend visiting Thano's blog for a good explanation of Value dimension.
When running the data load rule, we noticed that data was successfully exported to the staging file but there was an issue when importing it into FDMEE. As always, log level was set to 5 in Development:
A SQL exception was raised: EPMFDM-140274:Message - [FMWGEN][SQLServer JDBC Driver][SQLServer]String or binary data would be truncated.

FDMEE exports HFM data into a DAT file located in the \data folder of our target application. Then this file is imported into FDMEE in order to be mapped and loaded into the target.
We can check that the DAT file has the HFM data load format. Entity is exported as Parent.Children pair and Value dimension is [Contribution Total] as expected.
The file exported from HFM is imported into FDMEE temporary table TDATASEG_T so the error seems to be related to some issue importing the file into this table. 

After some research, I got confirmation from Oracle. The Value dimension is mapped to Currency. Typical exports are done at <Entity Currency> level, which actually exports the currency code. Currency codes are then imported into column CURKEY of TDATASEG_T (and then into TDATASEG after all data is mapped).
Currently CURKEY is declared as NVARCHAR(10) in SQL Server and CHAR(10) in Oracle.
When importing currency codes into that column we don't have issues, as they fit, but in our case [Contribution Total] exceeds that size, so we get the truncation error.
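A quick sanity check makes the problem obvious: the member name simply does not fit in a 10-character column.

```python
# CURKEY is NVARCHAR(10) / CHAR(10), so anything longer than 10 characters
# raises "String or binary data would be truncated" on insert.
CURKEY_SIZE = 10

currency_code = "USD"                  # typical <Entity Currency> export
value_member = "[Contribution Total]"  # our consolidated-data export

print(len(currency_code) <= CURKEY_SIZE)  # fits (3 characters)
print(len(value_member) <= CURKEY_SIZE)   # does not fit (20 characters)
```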

The workaround...
Although Oracle confirmed this has been fixed in PSU100 (most probably coming in early 2016), it is good to know that you can apply a workaround at database level in order to perform your HFM exports as expected.
Basically we will have to alter column size. For SQL Server:
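A sketch of the kind of statement involved is below. The new size of 30 is my assumption, not Oracle's official fix; any size that fits the longest Value member you export will do, and as with any schema change, back up first.

```sql
-- Workaround sketch: widen CURKEY so Value members like
-- [Contribution Total] (20 characters) fit. Size 30 is an assumption.
ALTER TABLE TDATASEG_T ALTER COLUMN CURKEY NVARCHAR(30);
ALTER TABLE TDATASEG   ALTER COLUMN CURKEY NVARCHAR(30);
```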
After applying it, we run our data load rule again, and voilà! our HFM data has been exported and imported into FDMEE:
And what happened in TDATASEG?
Value [Contribution Total] has been successfully imported into CURKEY column.