Monday, July 25, 2016

Using On-Prem Planning REST API to Import Metadata

I finally found some time to blog after KScope16 :-)

KScope is always an inspiring event for me. I actually enjoy attending sessions other than the FDMEE ones. And this is what I did!

It's been some time since I got curious about REST (this is something I wish I could explore more, but it's not easy with the little one running around all the time and trying to destroy whatever he finds...)
For that reason I tried to attend REST sessions at KScope, even in the DB track!

I have to highlight the one delivered by Jake Turrell. A very interesting session about using REST API in Groovy to import metadata/data from external sources into Planning. That was a good session, clear, concise and very well presented.

This post is not about introducing REST, which I think is very well covered in other blogs. This post just provides some ideas about how we can invoke REST API methods from FDMEE in order to communicate with other parties, either as a source or as a target.

However, I found this descriptive image answering someone asking about REST vs. SOAP. I could not skip it in today's post :-)
The Requirement
Going back to what gave me the idea for this post: it all started some time ago when I had a chat with GJG about a requirement where REST could be a good solution to build a custom process that allowed importing Planning metadata from FDMEE.

What was the requirement? We had a customer using FDM Classic to load new accounts and entities into Planning before data was loaded. That way they didn't get errors when target members didn't exist in the application. In other words, no partial loads in Planning.

Let's have a look at the old solution first:
As you can see above, FDM Classic launched an ODI scenario which performed some operations and finally loaded the new metadata into Planning. We could have also used the OLU (Outline Load Utility), but there were many security restrictions, and they liked ODI as other processes were also in place (if you are wondering whether they had a full ODI license, yes they did).

So, time for migration, time for FDMEE. Keep the same solution?
  • High level? YES
  • Technical solution? NO
Why keep the high-level solution? Seamless metadata import, robust error capturing, transparent for the end user, etc.
End users are only interested in seeing the gold fishes along the workflow: my data is imported, my data is mapped, my data is valid, my data is in Planning. That's all.

Looking at the technical solution, do we really need ODI to load metadata from a flat file? Even with a full ODI license, we agreed to remove that piece of the cake. ODI was not being used as expected, so there was no reason to keep it.

Was the OLU a good replacement? Maybe, maybe not. A couple of points didn't make me happy at all:
  • The FDMEE and Planning servers were different, so the Planning utilities had to be executed remotely or installed on the FDMEE server.
  • It was not a clean solution in terms of error capturing, so it was hard to integrate with FDMEE as I wished.
John showed in his blog how to use REST and how the REST API is also available for on-prem Planning. It was not clear at that moment whether this was supported, but we got confirmation from Oracle, so green light! Thanks John for laying the first stone.

Decision made, time to play!

- Francisco: Hello REST, my name is Francisco and I want to use you from FDMEE. 
- REST: Hello Francisco, we are going to have fun for the next couple of days. 

REST API for On-Prem Planning
I'm not going to reinvent the wheel, so visiting John's post about the Planning REST API is a must to understand this post.
Already read it? Then you are probably now familiar with the Planning REST API and how to use it. I certainly was.

As always, I recommend starting from a high-level approach and diving into lower levels until you finally have the technical solution. And that is what I did, starting from the high-level solution design:
The diagram above does not show all the details and looks simple at this stage.
When you start developing the orange boxes, several questions come up:

  • How do I invoke a REST API method to execute a Planning job?
  • How do I check the job status?
  • How do I read the response from the REST API?
  • How do I parse the response to get the job id, result, etc.?

If we focus on today's topic, we can shed some light by defining the process as:

1) Invoke REST API method to run a Planning job to import metadata
2) Read and parse JSON response to get job details
3) Invoke REST API method to check status of job 1) (until it is completed)
4) If job succeeded, invoke REST API method to run a Planning job to refresh the cubes
5) Read and parse JSON response to get job details
6) Invoke REST API method to check status of job 4) (until it is completed)

Basically a game of invoking & reading response. So let's play that game.

Planning Jobs in the SUI
New to the SUI (Simplified User Interface)? Have a look; this is how the cloud services interface looks. The good thing is that the SUI is also available on-prem (for Planning so far). You can feel like you're in the Cloud :-)
It took me some time to get used to how things are done now but once you get it, it's easy to remember.

I just need to create one job for my testing: a job to import new entities into Planning from a CSV file. The idea is to have a metadata file in the Planning Inbox/Outbox folder so the job can import it.
The metadata file is simple:
Playing around with REST Web Client
Now that we have created our job, it's time to play with REST. This is essential to get familiar with it. I wouldn't start coding before understanding how it works.

For that purpose, I'm using the RESTClient add-on for Firefox: easy to use and very helpful.
Invoking REST API method to run a Job
API methods for PBCS accept a JSON body with parameters, but on-prem does not so far. You can easily create the following request and see the error returned:
A workaround is to pass the body as part of the POST URL:
As we can see, the JSON response contains the job status and job id (132). This last value will be used to invoke the GET method that returns the job status:
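The workaround boils down to URL-encoding the parameters into the query string of the POST URL. A minimal sketch (the base URL, port and parameter names are assumptions taken from my own environment; adjust them to yours):

```python
try:
    from urllib import urlencode, quote        # Jython 2.5 / Python 2
except ImportError:
    from urllib.parse import urlencode, quote  # Python 3 fallback

# Hypothetical base URL: adjust host, port, API version and app name
# to your own environment.
BASE_URL = "http://planningserver:8300/HyperionPlanning/rest/v3/applications"

def build_job_post_url(app_name, job_type, job_name):
    """On-prem workaround: since the JSON body is rejected, pass the
    job parameters in the query string of the POST URL itself."""
    params = urlencode({"jobType": job_type, "jobName": job_name})
    return "%s/%s/jobs?%s" % (BASE_URL, quote(app_name), params)
```

Using urlencode (rather than string concatenation) keeps job names with spaces or special characters safe.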
The new base entity FDMEE_Entity_REST has been created in Planning: 

Using FDMEE as the REST Client
John showed how to invoke REST API from PowerShell scripts, Jake did the same from Groovy and now it's time for Jython.

I first need to highlight that we are restricted to Jython 2.5.1 in the current FDMEE release. Luckily, from Python 2.6 onward there were several enhancements. As you can imagine, some of them were related to HTTP requests and JSON.

Unluckily we have to stick to 2.5.1, so we can't use any of these enhancements. The sample code below shows a specific method using the native urllib2 module and Jyson, a third-party JSON codec for Jython 2.5.1.

Steps to run a Planning Job
1) Import necessary libraries/modules
2) Build URL
3) Build the body (parameters as an encoded dictionary)
4) Add Basic Authentication Header
5) Invoke REST API Method (open URL)
6) Read and parse the JSON response
7) Invoke REST API method to check the job status
I have logged all output so you can see details like JSON Response.
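The steps above can be condensed into a small sketch. The Jyson codec import is the one documented for Jython 2.5.1 (with a std-library fallback so the snippet also runs elsewhere), while the response field names (jobId, status) are assumptions based on the JSON shown earlier; inspect your own response first:

```python
import base64
try:
    import urllib2 as urlreq                   # native module in Jython 2.5.1
except ImportError:
    import urllib.request as urlreq            # fallback so it runs on Python 3
try:
    from com.xhaus.jyson import JysonCodec as json  # Jyson on Jython
except ImportError:
    import json                                # std library elsewhere

def basic_auth_header(user, password):
    """Step 4: Planning REST uses HTTP Basic authentication."""
    token = base64.b64encode(("%s:%s" % (user, password)).encode("utf-8"))
    return "Basic " + token.decode("ascii")

def run_planning_job(url, user, password):
    """Steps 2-6: POST to the job URL (empty body, parameters already in
    the URL), then read and parse the JSON response."""
    request = urlreq.Request(url, "".encode("utf-8"))   # data present -> POST
    request.add_header("Authorization", basic_auth_header(user, password))
    response = urlreq.urlopen(request)
    try:
        payload = json.loads(response.read().decode("utf-8"))
    finally:
        response.close()
    # Field names are assumptions; check them against your own response
    return payload.get("jobId"), payload.get("status")
```

From an event or custom script you would call run_planning_job with the URL built earlier and log both returned values.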
If you want to see more details about job execution you can navigate to Import and Export Status in SUI:
Refreshing the cube follows a similar approach, so I'll let you play with that.

This is all for today. I hope you found this post useful and got the basics to build your own solution.
I must say that with FDMEE now providing hybrid integration, REST is a very good approach to make Cloud and on-prem interact, and it will help you meet most of your requirements.

You just need to be creative, REST will do the rest.

Take care!

Monday, June 20, 2016

Loading Cell Texts into HFM from Flat Files

Definitely, one of the topics that I had in my TO-DO list...

After trying to organize my daily life, I have finally found a good opportunity to work on this. 
Before I start I would like to say thanks to my colleague Michael van der Merwe. He had a challenge and I got the opportunity to collaborate with someone that helped me a lot when I started fishing in the early Classic FDM times. If you come from old Upstream/FDM, I'm sure you know this guy.

Before we dive into this fuzzy concept, I think I should briefly describe what I will be showing today so we are all on the same page.

As you may know, HFM allows adding a text description to any valid cell. 
This description is known as Cell Text and, luckily, FDMEE can load it by attaching something called a Memo Item to data lines.
However, out of the box, FDMEE only allows adding memo items manually once data has been imported. Today we will focus on how to import memo items from the source (a file or any other source) and load them as cell texts into HFM.
That's the goal :-)

Memo Items: attaching text descriptions to FDMEE data
I'm sure you have asked yourself (at least once) what that post-it icon in the very first column of your Data Load Workbench grid is. Have you ever clicked on the "--" link of any data line? If you do so, you will see a new window popping up. Welcome to memo items :-)
As we have already shown in this blog, everything is stored in the database. So the better we know how it is stored, the better positioned we are to control it. It sounds interesting, doesn't it?

Let's then have a look at what happens in the FDMEE database when I just click Add (all db screenshots are for MSSQL):
The picture above is self-explanatory, but I would like to highlight some key points:
  • Memo items are stored in table TDATAMEMOITEMS
  • Memo items and data lines are linked through the column STATICDATAKEY
  • I don't know why, but column MEMOITEMKEY reuses the sequence used by Data Load Rules (I can see potential issues when adding memo items and data load rules at the same time; maybe not a common case, but something that can really happen)
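Putting the first two points together, a query along these lines lists each data line next to its memo items. This is only a sketch: STATICDATAKEY is the documented link, but verify every name against your own FDMEE schema before using it:

```python
# Sketch only: list data lines together with their memo items, joined
# on the STATICDATAKEY column described above. Verify table and column
# names against your own FDMEE schema.
MEMO_ITEMS_SQL = """
SELECT d.DATAKEY, m.MEMOITEMKEY
FROM   TDATASEG d
JOIN   TDATAMEMOITEMS m ON m.STATICDATAKEY = d.STATICDATAKEY
"""
```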

Once we have added our memo item, we need to type a name and a description. Optionally, we can attach up to 10 documents. At this point you must know that the current functionality does not support loading attached documents into the HFM cell text. In any case, you can use this feature in FDMEE if you want to attach any file to data lines (e.g., invoices).
What happens when we click Update? The existing memo item is updated with the details entered:
Are you curious? Of course you are. So if you have a look at the columns where attached documents are stored, you won't see any path, just an identifier. Pointing to what? As with many other files in FDMEE, the information for attached documents is stored in table TDATAARCHIVE:

Loading Memo Items as Cell Texts
To show this functionality I don't need to do any magic, just enable one target option and click Export.
In this case, I will enable Cell Text loading at the data load rule level. Remember that we can override global target application options at the data load rule level if needed. When you see the option value highlighted in yellow-cream, the global option is being overridden:
Now click Export, get all the gold fishes and have a look at your HFM data. There it is: a not really well-formatted cell text has been loaded from FDMEE.

Note: HFM allows having multiple cell texts assigned to the same data intersection (Enhanced Cell Text). This can be done by using different Cell Text Labels. Unfortunately, FDMEE is limited in this regard as it only loads cell texts to the [Default] label.
If you try to have multiple memo items on the same data intersection and you export them as cell texts, you will see something like this:
Therefore, if you have multiple labels and load through FDMEE, you need to follow another approach. I will not cover this today, but as a tip you should know that you can load cell texts by having a section in your DAT file labeled !Description. In this picture taken from the HFM documentation you can see the syntax for enhanced cell texts:
If a load using an append mode is run and new cell text is added to an intersection that already has cell text, the old cell text is replaced by the new cell text and not appended.

How are Cell Texts actually loaded?
Cell texts are loaded through the HFM adapter API. The adapter scripts are available in Jython, so if you open the script you will see a function to load cell texts:
Adapter scripts can be found in %EPM_MIDDLEWARE%\EPMSystem11R1\products\FinancialDataQuality\bin folder.

Have a look at the entire script and you will better understand how it works. If you are not happy with how the cell text is formatted (author...), you can change that code in the same script (not supported...)

If the appropriate log level is configured, the FDMEE process log will show all actions related to cell texts:

Loading Cell Texts from Flat Files
Now that we better understand how cell text loading from memo items works, it's time to address the common requirement of importing memo items from the source.

As an example we will import the following delimited file:
As you can see, I have added an additional column with the value for the memo item (Memo Title:Memo Text). To keep it simple, I have entered NoMemo for data lines without cell texts.

If we think about the high level solution, our plan would look as follows:

  1. Import memo items into FDMEE column (typically attribute or description columns)
  2. Add memo items to the database
  3. Use the standard cell text loading option

That sounds like a good plan then, doesn't it?

Importing Memo Items into Description column
We are all familiar with Import Formats, right? Description and Attribute columns are not available when you create the import format, but they can be added later by clicking the Add button:
The image above shows how the 14th column has been assigned to the Description 1 field (column DESC1 in TDATASEG).
After importing data we can show description column to see text imported (View > Columns menu):
The first step was easy. As you can imagine, the art & science is entirely in the second step.

Creating the Memo Items Behind the Scenes
I always like to speak about customization from the WHAT-WHERE-HOW standpoint. We already covered the WHAT in step 1. Let's go through the other two.

This is a common customization where you would probably start with the HOW without thinking about the WHERE until the last minute, just when you start getting unexpected results :-)

And this is how I'm going to approach it in this blog entry.

The HOW...
We already showed what happens in the database when a new memo item is created. Now it's time to emulate via scripting what FDMEE does when actions are performed in the UI:

1. Insert into table TDATAMEMOITEMS

We just have to execute an INSERT statement into table TDATAMEMOITEMS. The easiest way is an INSERT INTO table (columns) SELECT columns FROM table statement.

2. Update columns HASMEMOITEM and STATICDATAKEY in data table

We need to update the DATAKEYs that have memo items assigned:
3. Update sequence value in table AIF_SEQUENCES

The last step is to update the sequence used to generate the MEMOITEMKEY. We can use the number of memo items inserted to update it:
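The three operations can be sketched as statements prepared for an event script. Everything below is a sketch: the column lists are deliberately left as comments (take them from the screenshots and your own schema), the HASMEMOITEM flag value and the exact AIF_SEQUENCES row are assumptions, and the fdmAPI.executeDML signature shown in the comment should be checked against the FDMEE API documentation:

```python
NO_MEMO = "NoMemo"  # marker used in the source file for lines without memos

def memo_item_sql(loadid):
    """Return the three statements, in order. Column lists are left as
    comments on purpose: take them from your own FDMEE schema."""
    where = "LOADID = %d AND DESC1 <> '%s'" % (loadid, NO_MEMO)
    insert_memos = ("INSERT INTO TDATAMEMOITEMS (STATICDATAKEY /*, ... */) "
                    "SELECT DATAKEY /*, next MEMOITEMKEY, ... */ "
                    "FROM TDATASEG WHERE " + where)
    link_lines = ("UPDATE TDATASEG SET HASMEMOITEM = 1 /* flag value: assumption */, "
                  "STATICDATAKEY = DATAKEY WHERE " + where)
    bump_sequence = ("UPDATE AIF_SEQUENCES /* row backing MEMOITEMKEY */ "
                     "SET /* current value + number of memos inserted */ NULL")
    return [insert_memos, link_lines, bump_sequence]

# In the actual event script you would then run something like:
#   for sql in memo_item_sql(fdmContext["LOADID"]):
#       fdmAPI.executeDML(sql, None, False)  # check the exact signature
```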

The WHERE ...
One of the first thoughts would probably be to insert the new memo items after data has been imported into the FDMEE staging table (TDATASEG_T). This happens in the AftImport event script.
That solution actually works with no errors, but then you get something unexpected when trying to view the memo items in the Data Load Workbench:
The memo item has been added, but no details are shown. What do you do next? Go to the Oracle community and ask? Come on, don't be shy and use your SQL skills:
Yup, the memo item is not in TDATAMEMOITEMS, although it was indeed inserted. Ok, my SQL skills were a good starting point, but it's time to use my log auditing skills:
After searching for the text "TDATAMEMOITEMS" we notice that there is a DELETE statement and... voilà! It is executed after the AftImport event script.
Does this mean that the memo item was actually inserted but then deleted because the assigned DATAKEY did not exist in TDATASEG yet? Elementary, my dear Watson. Data is copied from TDATASEG_T to the TDATASEG table when mappings happen.

Mmm... then going back to the WHERE, we need to perform the SQL operations when data is already in TDATASEG... and that is? The AftProcMap event script, simply after mappings happen.

Let's give it a try and do some reverse engineering from HFM to see the detailed process:
Our "custom" FDMEE process log shows the results of the 3 SQL operations:

What happened now in the database?
And what about FDMEE?
We can now see the memo item details as well :-)

A known Bug
Something you need to know: Cell Texts and Data Protection are not the best of friends. If you are using the Data Protection feature, your cell texts will not be protected when loading data in Replace mode. The workaround would be to extract the cell texts before loading data and then load them back after data has been loaded into HFM.

Loading through Excel TB Templates
I don't want to leave this post without performing one more test. In old Classic FDM times, we could import memo items from Excel TB files. There were two additional metadata tags, M1 and M2, to import both the memo item text and description:

So I had to test this in FDMEE... unfortunately, it is not supported.

Cell texts are very useful for keeping additional information on your HFM data intersections. This information may be available in your source data, so being able to load it through FDMEE is definitely a +1.

Unluckily, we don't have OOTB functionality to import them, but we do have a nice way of extending FDMEE via scripting, and that is what we did!


Thursday, March 31, 2016

I can't believe my eyes! that script has a hard-coded password!

Hi folks!

Raise your hands! How many of you have seen or used hard-coded passwords in FDM Classic or FDMEE scripts? When I say passwords, I mean all type of passwords but especially database passwords.

If you are still reading this post, most probably... you have!

Hopefully what you are going to read below can help you avoid showing the world how insecure your scripts are :-)

Why do we need to use passwords in scripts?
There are many situations where you would need to use passwords in a script.
For example, you may want to open an Essbase connection to check outline members.
But among all of scenarios, probably the one you are more familiar with is opening database connections.
Do we really need to open a database connection from a script? There is enough material to write a new post about this topic, so I will focus on situations where you really need it.

Why don't we want them to be hard-coded?
Code is not pretty if you can read a password...
Administrators can get furious...
Password can be context/environment dependent...
Change the password, change the script(s)...
See the script, see the password...
I don't think it complies with any security policy...
bla bla bla

Our starting point
Maybe you don't know what a hard-coded password is, but I'm sure you have used one.

Let's see an example:
As you can see, when I need to open a JDBC connection with a database, I must know:
  • JDBC connection string
  • DB user
  • DB password
I'm going to use the FDMEE database as an example. However, it's quite important to remark that most of the time you don't need to open a connection with the FDMEE database, as it's already open when the script starts.
In my case, I need it. Why? It's a long story, but let's summarize it... the database has some custom stored procedures returning cursors, which can only be executed through callable statements as they are not supported by standard fdmAPI functions like executeDML. Don't ask me why these stored procedures are used. Somebody else built them due to specific requirements and we were forced to re-use them. Simply that.

Multiple approaches to avoid hard-coding
This is not new in FDMEE. We already had to open database connections in FDM Classic.
There are many approaches to avoid hard-coding passwords:

  • Use a System Environmental Variable to store the password and read it from the script
  • Use encrypted configuration files to store passwords
  • Encrypt password and decrypt in the script
  • Use db links/linked servers from FDMEE database to source databases
  • ...
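As an illustration of the first approach in the list, a tiny helper can read the password from a system environment variable instead of the script body. The variable name FDMEE_DB_PWD is made up, and the env parameter is injectable purely for testing:

```python
import os

def db_password(var_name="FDMEE_DB_PWD", env=None):
    """Read the password from a system environment variable instead of
    hard-coding it. env is injectable so the lookup can be exercised
    without touching the real environment."""
    env = os.environ if env is None else env
    password = env.get(var_name)
    if not password:
        raise RuntimeError("Environment variable %s is not set" % var_name)
    return password
```

Failing loudly when the variable is missing beats silently opening a connection with an empty password.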
Any of these solutions could easily suffer from code injection. What's that? For example, if I have access to the script, I can insert some code to write the decrypted password into a file. Even the solution I will show below can be "injected".
In any case, if you require one extra level of security, you can always implement a Java class to perform a secured database connection and use it in your FDMEE scripts.

Today I will focus only on a clean solution to avoid hard-coding passwords in a very flexible way.

1st Step - Storing passwords for database connections in ODI

Let's be clear:
FDMEE uses ODI > ODI stores database connections in Topology > Let's use ODI!
As you can see, we can store any database connection in Topology, either through ODI Studio or ODI Console (web). You may want to connect to a data warehouse to get data, or export your Planning/Essbase data into a reporting system.

Luckily, if we are going to open a connection against the FDMEE database, we won't need to create it, as it is created with the default installation. However, any other connection can be created as needed.

ODI uses its own encoding/decoding, so why not take advantage of it? By inserting the password in the ODI Topology, we get it already encrypted and stored somewhere.

But where is the password actually stored? 
The FDMEE database has a set of tables prefixed with SNP for the ODI repository. Among all these tables there is one which stores the physical connections configured in Topology. This table is SNP_CONNECT.

If we run this simple query then we will see the encrypted password:
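In case the screenshot is hard to read, the query is essentially the following. CON_NAME and PASS are the SNP_CONNECT columns as I found them in an 11g repository; verify them in your own repository version:

```python
# Encrypted password of one physical data server from the ODI master
# repository. The '?' bind variable is the data server name exactly as
# it appears in Topology.
ENCRYPTED_PWD_SQL = ("SELECT PASS FROM SNP_CONNECT "
                     "WHERE CON_NAME = ?")
```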
What do we do now with that password? Can we pass it directly to the database connection? No, that's the answer.

2nd Step - Decrypting the Database Password

From my ODI times (I still do a lot with ODI, but TBH a lot more with FDMEE) I used to play with the ODI SDK (Software Development Kit). After some investigation I found that the SDK API for ODI 11g (the one FDMEE uses so far, although ODI 12c has similar ones) has a class called com.sunopsis.dwg.DwgObject which has a static method (easily called without creating any instance of the class) called snpsDecypher. This method accepts a parameter as a string (the encrypted password) and returns the decrypted password.

You can easily see it by opening the odi-core.jar file with Eclipse:
Let's give it a try!

Testing the solution with a FDMEE Custom Script
I want to keep things simple, so I'm just going to show a Jython script which decrypts the password stored in the ODI repository. Besides, it gets the FDMEE JDBC connection string and database user from System Settings in FDMEE. Another option would be to get the JDBC URL and user from the ODI repository as well :-)

As you already know, one of the strong points of Jython is that we can easily use Java classes to extend FDMEE functionality and decypher our database password:

       decryptedPwd = DwgObject.snpsDecypher(encryptedPwd)

If we run the custom script through Script Execution in FDMEE:
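Putting it all together, the flow of that custom script can be sketched with the decypher function injected, so the logic can be exercised without odi-core.jar on the classpath. The wiring comment shows how I would plug in the real SDK class; treat the import path as an assumption to verify against your own odi-core.jar:

```python
def get_connection_password(fetch_encrypted, decypher, con_name):
    """fetch_encrypted(con_name) returns the encrypted string from
    SNP_CONNECT; decypher is DwgObject.snpsDecypher when running
    inside FDMEE/Jython."""
    encrypted = fetch_encrypted(con_name)
    if encrypted is None:
        raise RuntimeError("No connection named %s in Topology" % con_name)
    return decypher(encrypted)

# Inside FDMEE (Jython) the wiring would look like this (not executed here):
#   from com.sunopsis.dwg import DwgObject
#   pwd = get_connection_password(query_snp_connect, DwgObject.snpsDecypher,
#                                 "FDMEE Data Server")
```

Injecting both the query and the decypher keeps the script portable between environments where the connection name differs.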

I hope you found this post helpful. It doesn't matter which method you use, but remember...

No hard-coded passwords any more!!!