Friday, January 24, 2014

FDMEE Custom Solution to Load Multiple Periods Through Single POV - Part 1

I recently received a question regarding the following scenario:
Our file has periods in rows, how can we load it through a single POV? 

In Classic FDM, everything was a bit more flexible as you had control over almost everything. You could easily add a custom dimension for your periods and then adjust the Export Action script to export the custom period column instead of the POV period.

In FDMEE, you also have several approaches. Today I'm going to share one solution that will show:
- How to think in FDMEE
- Using Lookup dimensions
- Jythoning (slice notation, exceptions, loops, file operations, list operations, Unicode files, using the continue statement...)
- Avoiding modifications to Adapter action scripts (not supported, and possibly to be removed soon)
- All you want to see...

I'm going to load data into HFM using the following source file:

Turning it over in your head
So, let's apply my favorite methodology... Divide & Conquer!
My aim is to import the source column with periods (1st column) into FDMEE and then replace the POV Period with the mapped values for the imported periods in the .DAT file exported to HFM.

Adding a Lookup dimension for Source Period
Lookup dimensions can be added to target applications and assigned to data columns. They exist only for FDMEE purposes, so they are not exported. They can be used in many different ways. Today we will use a Lookup dimension to import the source period, map it, and export it. How?
I have added the Lookup dimension Period_Custom and assigned it to data column UD10.

Importing Source Period into Period_Custom dimension
Lookup dimensions can be configured in Import Formats like any other dimension:
When I run my Data Load Rule for POV March-2006, I can see a column for the Period_Custom dimension:
In this example, my mappings are * to *, although you can create any kind of mapping for the lookup dimension.
If we export this data into HFM, we can see that the POV Period March-2006 is exported... we don't want this!
Obviously HFM will show 6000 for March:

Replacing POV Period with Source Period in the DAT file
If you take a look at the export file, you don't see the source period anywhere, so it would be very difficult to replace: we would not find it when going through the lines.
The first thing to figure out is how to export the source period without modifying the HFM_EXPORT.vbs action script :-)
One approach would be:
- Before the file is exported... concatenate the Source Period to the Target Account
- After the file is exported... go through the file and replace the POV Period with the Source Period
- Don't forget to remove the Source Period prefix from the Target Account in the DAT file...
I'm sure you can find other approaches but this one lets me show more :-)

How do we concatenate Source Period to Target Account?
At database level, this means concatenating two columns of table TDATASEG: UD10X and ACCOUNTX. Why UD10X? We said we can also have mappings for the source period, so let's take the target (mapped) value.
The update statement would be something like:
UPDATE TDATASEG SET ACCOUNTX = UD10X  || '_' || ACCOUNTX WHERE LOADID = ?  AND ACCOUNTX IS NOT NULL AND UD10X IS NOT NULL
How do we perform this operation before the .DAT file is exported?
We can use the event script BefExportToDat.
You can also see that I have added some debug information to my process log (Outbox\Logs). This is highly recommended, as it makes debugging your code much easier.

Note: I found an issue when running the UPDATE statement using LOADID as a parameter. I'm still figuring out whether this is a bug. You can use PARTITIONKEY instead. I will keep you posted.

Update: the issue is not related to the SQL query itself but to the fact that, if we run the EXPORT step separately from the IMPORT step, the LOADID column in table TDATASEG is only updated during the EXPORT and is therefore not available in the BefExportToDat event script.
Basically, if we run the steps separately (from the Workbench), LOADID will be different for each step. If we run the DLR from the Data Load Rule page and select Import + Export, then both steps share the same LOADID and the SQL query with LOADID in the WHERE clause works as expected.
An easier solution is to modify the SQL UPDATE and use POV information in the WHERE clause (RULE_ID, PARTITIONKEY, CATKEY, PERIODKEY):
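A minimal sketch of what that BefExportToDat event script could look like (a sketch only: it assumes the fdmAPI.executeDML call and the standard fdmContext POV keys; || is Oracle concatenation syntax, use + on SQL Server):

# BefExportToDat.py - prefix the mapped source period (UD10X) to the target account
query = """UPDATE TDATASEG
           SET ACCOUNTX = UD10X || '_' || ACCOUNTX
           WHERE RULE_ID = ? AND PARTITIONKEY = ? AND CATKEY = ? AND PERIODKEY = ?
           AND ACCOUNTX IS NOT NULL AND UD10X IS NOT NULL"""
params = [fdmContext["RULEID"], fdmContext["LOCKEY"],
          fdmContext["CATKEY"], fdmContext["PERIODKEY"]]
fdmAPI.logDebug("Prefixing UD10X to ACCOUNTX for the current POV")
fdmAPI.executeDML(query, params, True)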

How do we replace the POV Period and remove the prefix added to Target Account?
We can go through all the lines in the .DAT file and use split + Python slice notation to strip the prefix and build the final line with the custom period and the target account in the correct format.
Actually, we will be creating a new .DAT file (with the same name), so we first back up the original one.
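Here is a minimal sketch of the backup step in AftExportToDat. The file name is an assumption (FDMEE typically writes <TargetAppName>_<LOADID>.dat to the outbox), so adjust the path to your environment:

import os
import shutil

# Assumed naming convention for the exported file - verify it in your outbox
datFile = os.path.join(fdmContext["OUTBOXDIR"],
                       "%s_%s.dat" % (fdmContext["TARGETAPPNAME"], fdmContext["LOADID"]))
shutil.copy(datFile, datFile + ".bak")  # keep the original before rewriting it
fdmAPI.logDebug("Backed up DAT file to %s.bak" % datFile)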
Once the file is secured we will go through all lines and build the new line:
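A sketch of the rewrite loop. The line layout is an assumption based on the HFM DAT format (Scenario;Year;Period;View;Entity;Value;Account;...;Amount, so the period sits at index 2 and the prefixed account at index 6), codecs handles the Unicode file, and the '_' separator is assumed not to appear in the period names:

import codecs

newLines = []
for line in codecs.open(datFile + ".bak", "r", "UTF-8"):
    parts = line.rstrip("\r\n").split(";")
    # keep headers, section markers and lines without the prefix untouched
    if len(parts) < 7 or "_" not in parts[6]:
        newLines.append(line)
        continue
    srcPeriod, account = parts[6].split("_", 1)  # strip the prefix added before export
    parts[2] = srcPeriod                         # replace the POV period
    parts[6] = account                           # restore the clean target account
    newLines.append(";".join(parts) + "\n")

datOut = codecs.open(datFile, "w", "UTF-8")      # same name as the original file
datOut.writelines(newLines)
datOut.close()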
And finally, after the file is produced we have to remove the prefix from ACCOUNTX:
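A sketch of that cleanup, again with Oracle syntax (SUBSTR/INSTR chop off everything up to the first '_', which assumes the separator does not appear in account names):

# Remove the temporary prefix from ACCOUNTX so the workbench shows clean accounts
cleanup = """UPDATE TDATASEG
             SET ACCOUNTX = SUBSTR(ACCOUNTX, INSTR(ACCOUNTX, '_') + 1)
             WHERE RULE_ID = ? AND PARTITIONKEY = ? AND CATKEY = ? AND PERIODKEY = ?
             AND INSTR(ACCOUNTX, '_') > 0"""
params = [fdmContext["RULEID"], fdmContext["LOCKEY"],
          fdmContext["CATKEY"], fdmContext["PERIODKEY"]]
fdmAPI.executeDML(cleanup, params, True)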
Now it's time to place our scripts in FDMEE:

Running the FDMEE workflow... exporting the new file
All the operations shown above are transparent to the end user, as they happen during the Export step. The Workbench grid will therefore look the same as before, but we can see the differences in the .DAT file being exported and in the final data in HFM:
Our new DAT file now has source periods instead of the POV Period, as expected, and we can check the data in HFM:

Well, as I said, this is just one of multiple possible solutions, so take your time to think it through. Also take into consideration any other integration requirements you may have.

Enjoy it!





11 comments:

  1. Hi,

    this looks like a really good idea :) and might work for HFM. But sadly this does not work for Planning. I tried to copy the steps and create a multi-load for Planning. The problem is that the .dat file is created after the import to Planning. It seems it's only used for "statistical" purposes. From my perspective you need to alter the export script.

    Best regards

    Tim

    Replies
    1. Hi Tim,

      thanks for your feedback. The solution for HP/Essbase is not literally the same but slightly different. There are two methods for loading data into HP/Essbase: SQL or File. For both, an Essbase rule file is generated by FDMEE. If you want to load multiple periods through the same POV, I would suggest using the SQL method and modifying the SQL view the rule file gets its data from. That way you will not have to generate the file and then replace values in it.
      In any case, there is an unpublished bug where lookup dimensions are not mapped in HP.
      I will actually publish it on the blog soon.

    2. In addition to what was stated above, the file is generated before data is loaded into HP, as it contains the data being loaded. There is no EXPORT action script as there is with HFM. If you wanted to replace the exported file you would have to build AftExportToDat. This is something I would not recommend for HP/Essbase, as the data volume is usually large.

    3. Hi Francisco,

      Yeah, you're right, the file might be too big. Just for your information, I used the File "method" while following your steps. If I run the AftExportToDat script you described above, the .dat file has not been created at that step. (Reading your post again, I think this is what you were telling me :)) So the script can't find the .dat file and therefore cannot alter it. The upload stops.
      Do you know where the SQL view is created?
      The bug with the lookup dimension is no problem for me right now. Target and source period are the same.

    4. Hi Tim,

      you can see the SQL Views in the FDMEE database with any SQL tool.

  2. Hi Francisco,

    I tested this method and found that drill-through from HFM to FDM does not work when data is loaded this way.

    Do you have any idea how we can get the drill-through function to work?

    Thanks.

    James Zhang

    Replies
    1. Hi James,

      I put some notes in http://akafdmee.blogspot.ca/2014/02/fdmee-custom-solution-to-load-multiple.html (Part 2)

      This solution has a drawback regarding drill-through: you will only be able to drill to the POV from which you loaded the multiple periods.

  3. Love this idea. I tried implementing it and got all the way to the after-.dat script, but I keep getting the error below. I tried debugging a bit to see which line it's trying to write. There's only one line in the test file, and it comes through correctly in the debug line. I understand the out-of-range error but am not sure where it's coming from. The FDMEE error is below:

    2014-08-25 22:22:27,172 DEBUG [AIF]: Comm.executeVBScript - END
    2014-08-25 22:22:27,173 DEBUG [AIF]: Comm.executeScript - START
    2014-08-25 22:22:27,183 INFO [AIF]: Executing the following script: \\VMHODGILTP8\Custom\FDMEE/data/scripts/event/AftExportToDat.py
    2014-08-25 22:22:27,204 DEBUG [AIF]: Process: Replace POV with custom Period Year
    2014-08-25 22:22:27,208 DEBUG [AIF]: Line working: STAT;2014;Aug;PERIODIC;310;;CancelOrders;CC_0000;BT_35;PJ_000000;R_000000;FDMEE_LOAD;10000

    2014-08-25 22:22:27,218 ERROR [AIF]: The script has failed to execute:
    2014-08-25 22:22:27,256 FATAL [AIF]: Error in CommData.exportData
    Traceback (most recent call last):
    File "<string>", line 4425, in exportData
    File "<string>", line 420, in executeScript
    File "\\VMHODGILTP8\Custom\FDMEE/data/scripts/event/AftExportToDat.py", line 46, in <module>
    lstlinereplaced = lstlinesplit[:1] + [lstlinesplit[6].split("_")[1]] + [lstlinesplit[6].split("_")[0]] + lstlinesplit[3:6] + [lstlinesplit[6].split("_")[2]] + lstlinesplit[-6:]
    File "\\VMHODGILTP8\Custom\FDMEE/data/scripts/event/AftExportToDat.py", line 46, in <module>
    lstlinereplaced = lstlinesplit[:1] + [lstlinesplit[6].split("_")[1]] + [lstlinesplit[6].split("_")[0]] + lstlinesplit[3:6] + [lstlinesplit[6].split("_")[2]] + lstlinesplit[-6:]
    IndexError: index out of range: 6

    2014-08-25 22:22:27,256 DEBUG [AIF]: CommData.updateWorkflow - START
    2014-08-25 22:22:27,256 DEBUG [AIF]:
    UPDATE TLOGPROCESS
    SET PROCESSENDTIME = CURRENT_TIMESTAMP

    ,PROCESSSTATUS = 22
    ,PROCESSEXP = 0
    ,PROCESSEXPNOTE = 'Export Error'
    WHERE PARTITIONKEY = 11
    AND CATKEY = 7
    AND PERIODKEY = '2007-01-31'
    AND RULE_ID = 22

    2014-08-25 22:22:27,261 DEBUG [AIF]: CommData.updateWorkflow - END
    2014-08-25 22:22:27,261 DEBUG [AIF]: Comm.executeScript - START
    2014-08-25 22:22:27,270 INFO [AIF]: Executing the following script: \\VMHODGILTP8\Custom\FDMEE/data/scripts/event/AftExportToDat.py
    2014-08-25 22:22:27,281 DEBUG [AIF]: Process: Replace POV with custom Period Year
    2014-08-25 22:22:27,292 ERROR [AIF]: The script has failed to execute:
    2014-08-25 22:22:27,319 FATAL [AIF]: Error in CommData.exportData
    Traceback (most recent call last):
    File "<string>", line 4441, in exportData
    File "<string>", line 420, in executeScript
    File "\\VMHODGILTP8\Custom\FDMEE/data/scripts/event/AftExportToDat.py", line 46, in <module>
    lstlinereplaced = lstlinesplit[:1] + [lstlinesplit[6].split("_")[1]] + [lstlinesplit[6].split("_")[0]] + lstlinesplit[3:6] + [lstlinesplit[6].split("_")[2]] + lstlinesplit[-6:]
    File "\\VMHODGILTP8\Custom\FDMEE/data/scripts/event/AftExportToDat.py", line 46, in <module>
    lstlinereplaced = lstlinesplit[:1] + [lstlinesplit[6].split("_")[1]] + [lstlinesplit[6].split("_")[0]] + lstlinesplit[3:6] + [lstlinesplit[6].split("_")[2]] + lstlinesplit[-6:]
    IndexError: index out of range: 1

    2014-08-25 22:22:27,460 DEBUG [AIF]: Comm.finalizeProcess - START



  4. These errors mean the script is trying to access position 1 or 6 of a list that doesn't have that many elements.
    Have a look at your code and debug the lists to see their elements.
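    For example, a defensive check like this sketch would log and skip lines that do not match the expected layout instead of failing (the field count of 13 and the two "_" pieces are taken from your log above):

    parts = line.rstrip("\r\n").split(";")
    if len(parts) < 13 or parts[6].count("_") < 2:
        fdmAPI.logDebug("Skipping line with unexpected layout: %s" % line)
        continue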

  6. Hi Francisco,

    This is a lovely idea! Many thanks for this.

    While implementing this, I did notice that this would only work if the 'Accumulate in File' option is set to 'No' in the Application Settings.

    Will you please let me know if this is correct?

    Regards,
    S


Thanks for the feedback!