Friday, May 14, 2021

FDMEE 11.2.x...I can't believe my eyes! That script has a hard-coded password!

Ey there!

A long time ago, I showed one of the different methods to avoid hard-coding passwords in FDMEE scripts. You can see the original post here.

I have upgraded FDMEE to 11.2.x, and now what?
If you run the same code in 11.2.x, you will get the following error:

TypeError: snpsDecypher(): expected 2 args; got 1

Why? FDMEE 11.1.2.x uses ODI 11g, while FDMEE 11.2.x uses ODI 12c, and the ODI APIs have changed slightly. Some methods are now deprecated.

Luckily, there is a workaround. Let's use the ODI 12c API :-)
The code below also shows how to use other EPM APIs such as the EPM Registry API.

Let me highlight something I have seen in different implementations. If you want to execute a SQL statement against the FDMEE database, you don't need to open a JDBC connection yourself. There are API methods to execute SQL statements against it, and they manage the DB connection for you.
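For example, a minimal sketch (the table and columns queried are illustrative; the API methods are the standard fdmAPI ones used throughout this blog):

# Query the FDMEE database through the API: no JDBC plumbing needed
sqlLocation = "SELECT PARTNAME FROM TPOVPARTITION WHERE PARTITIONKEY = ?"
params = [fdmContext["LOCKEY"]]
rsLocation = fdmAPI.executeQuery(sqlLocation, params)
while rsLocation.next():
    fdmAPI.logInfo("Location name: %s" % rsLocation.getString("PARTNAME"))
# the API manages the connection; you just close the result set
fdmAPI.closeResultSet(rsLocation)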

Enjoy the code!

# ***************************************
# ODI 12c Password Decrypter
# ***************************************

''' 
	ODI 11g Code
	--------------------------------------------
	# Import ODI API class
        from com.sunopsis.dwg import DwgObject
	# Decrypt pwd
	connPwdDec = DwgObject.snpsDecypher(connPwdEnc)
	
	Execution in ODI 12c
	--------------------------------------------
	Traceback (most recent call last):
	File "<string>", line 529, in executeJythonScript
	File "\\WIN19\EPMSHARE\FDMEE/data/scripts/custom/odi12c_decrypy_pwd.py", line 20, in <module>
        connPwdDec = DwgObject.snpsDecypher(connPwdEnc)
	TypeError: snpsDecypher(): expected 2 args; got 1
	
	Cause
	--------------------------------------------
	Function (deprecated) is defined as:
	public static String snpsDecypher(String pPass, OdiInstance pOdiInstance)
	
'''

# Import libraries
from oracle.odi.core.config import MasterRepositoryDbInfo
from oracle.odi.core.config import OdiInstanceConfig
from oracle.odi.core.config import PoolingAttributes
from oracle.odi.core.config import WorkRepositoryDbInfo
from oracle.odi.core import OdiInstance
from com.sunopsis.dwg import DwgObject

from com.hyperion.aif.util import RegistryUtilCore
from com.hyperion.hit.registry import DBTypeComponentImpl
from com.hyperion.hit.registry import ComponentType

# Encrypted password (you can get it from ODI SNP tables as you would do in 11g)
connPwdEnc = "xxxxxxxxxxxxxxxxxxxx"

# Get MR Connection details
# ----------------------------
aifDbComponent = RegistryUtilCore.getAIFDatabaseComponent()
jdbcDriver = aifDbComponent.getPropertyValue("dbJDBCDriverProperty")
jdbcUrl = aifDbComponent.getJdbcUrl()
jdbcUserName = aifDbComponent.getUserName()
jdbcPwd = aifDbComponent.getPassword()
fdmAPI.logInfo("Jdbc Driver -> " + str(jdbcDriver))
fdmAPI.logInfo("Jdbc Url -> " + str(jdbcUrl))
fdmAPI.logInfo("Jdbc User -> " + str(jdbcUserName))
fdmAPI.logInfo("Jdbc Pwd -> " + str(jdbcPwd))

# Create MR/WR Info
# ----------------------------
workRep = "FDMEE"
masterInfo = MasterRepositoryDbInfo(jdbcUrl, jdbcDriver, jdbcUserName, jdbcPwd, PoolingAttributes())
workInfo = WorkRepositoryDbInfo(workRep, PoolingAttributes())

# Create ODI instance
# ----------------------------
odiInstance = OdiInstance.createInstance(OdiInstanceConfig(masterInfo, workInfo))

# Decrypt password
# ----------------------------
connPwdDec = DwgObject.snpsDecypher(connPwdEnc, odiInstance)

# Log decrypted password
fdmAPI.logInfo("Jdbc Decrypted Pwd -> " + str(connPwdDec))

# Destroy objects
odiInstance.close()

Have a good weekend!

Friday, May 8, 2020

Building your own Monitoring Solution for Data Changes in HFM Phase Submission Loads - Part 2

In the last blog post, Building your own Monitoring Solution for Data Changes in HFM Phase Submission Loads - Part 1, we described the FDMEE automation requested by our customer. One of the requirements was to include a data change tracker to monitor data changes along the different HFM submission phases.

The MUST-HAVE list for this custom functionality is:
  • Get the HFM Process Management details (Review Level for the different phases)
  • When in Soft Close, be able to compare the new data set against the last data set loaded, so that all data load changes are monitored
  • Auto-map missing mappings so the workflow can be completed
  • Auto-map for invalid HFM intersections and identify the CANNOT WRITE intersections (Soft Close)
  • Keep audit in FDMEE so missing mappings and invalid intersections can be easily filtered in the Data Load Workbench
  • Send email notification with detailed results 
Top-Down Solution Design, that's the key
There are multiple ways of approaching the design of a solution. If you ask me, my preferred approach is to start by drawing the solution at a high level. Why? Basically, with this Top-Down methodology, I can decompose the solution into smaller parts to better understand its different components. I can then refine each part in greater detail.
Definitely, with this "Divide and Conquer" approach, I will have a better understanding of what my solution will look like, and I will reduce the impact of additional requirements.
Let's start drawing the solution!

Solution Diagram
As I usually say, "A picture paints a thousand words". There will always be a solution design diagram in every project I work on:
Basically:
  • We get the HFM Process Management details before data is loaded into HFM
  • We perform the data monitoring after data is loaded into HFM
Let's use reverse engineering to add more explanation:
  • After data has been loaded into HFM, the data monitoring is initiated for all HFM entities in Soft Close for any of the different phases. 
  • As we first need to know the status for all phases, FDMEE will get this information from HFM and store it in the TDATASEG table. This action happens before data is loaded into HFM (BefLoad event script)
  • Then, the data monitoring will compare the new data being loaded with the data previously loaded. But where is the data previously loaded? It's not in FDMEE, because data is wiped out when the same POV (DLR and Period) is re-processed. Data must be accessible somewhere else. To work around this, FDMEE data is copied to a custom table (Z_BALANCES_PRIOR) every time it is loaded into HFM (with no errors). This action happens after data is loaded into HFM (AftLoad event script)
In a nutshell, the first time data is loaded for a specific POV (DATA 0), there won't be data to compare against. The second time data is loaded into HFM (DATA 1), it will be compared against DATA 0, and so on.
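As a sketch of that snapshot step (an AftLoad event script; Z_BALANCES_PRIOR comes from the design above, while its column list here is illustrative and trimmed to a few dimensions):

# Copy the data set just loaded into HFM from TDATASEG into the custom
# snapshot table so the next load can be compared against it
sqlCopy = """INSERT INTO Z_BALANCES_PRIOR (LOADID, ENTITY, ACCOUNT, AMOUNT)
             SELECT LOADID, ENTITYX, ACCOUNTX, AMOUNTX
             FROM TDATASEG
             WHERE LOADID = ?"""
# the boolean asks the API to commit the DML
fdmAPI.executeDML(sqlCopy, [fdmContext["LOADID"]], True)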

What do users get?
A very nice email notification with results :-)
The email above shows that one intersection has been updated. Results are attached as a CSV file where users can see the data differences for all phases. They can also see when previously loaded data is missing from the new load:

Taking the notifications to the Next Level
This solution for the automated process followed the pattern "This is very cool! Could we also have...?". I like challenges and delivering solutions that meet requirements. That makes everyone happy, doesn't it?

There are missing maps
We want to see which source values have not been mapped:

There are invalid intersections
We want to see the invalid intersections in the same way we did in FDM Classic!
There are invalid intersections, but why?
Most of the time, the Cannot Write intersections are due to process level. Can we get the process level for the different HFM process units with invalid intersections?


What's next?
Next post will cover details of implementation:
  • How to get the HFM Process Management details
  • SQL Query to get data differences (new, updated, or deleted data)
Cheers

Friday, December 20, 2019

Building your own Monitoring Solution for Data Changes in HFM Phase Submission Loads - Part 1

Back to blog!

From time to time I get some cool requirements from customers. That makes me think and do what I like to do: analyze, design, build and make everyone happy :-)

In this case, I got a challenging requirement for an automated integration between several source systems and HFM. Of course, using our favorite fishing tool, FDMEE. Yes, I know, everybody is now thinking about Cloud and more Cloud. However, FDMEE is still a very strong EPM integration tool, supporting Hybrid, and it survives in 11.2.

I won't go into much detail so I can keep it simple.

The customer already had all the automation built using open batch files and different direct data extracts. However, there was no email notification solution implemented, so it was hard for them to get results without accessing FDMEE.

We started to gather the requirements and we came up with the following list:
  • Bypass missing mappings and send summary by email
  • Bypass invalid HFM intersections and send the Intersection Check report by email (html format)
  • Identify the CANNOT WRITE intersections (the blue ones) and include them in the email body
  • Show Process Management information about the HFM Process Units where data cannot be written. Basically, show Review Level so the users are aware of what is happening
  • Build a data-tracking solution for HFM Phased Submission
The goal is to cover all requirements in future blog posts, but today I will start with the key one: data tracking. Personally, I think this would be a great functionality for Cloud Data Management as well. It fills a gap in the FDMEE data audit (you can see how cell data changes in HFM).

Requirement - Data Tracking for HFM Phased Submission
This is the requirement word by word:
To maintain control over the phased close process, a data tracking service is required to track data changes impacting Gross Margin and EBITDA from WD2 and WD4 respectively.  
We need the following key controls:
  • The phased submission process should allow for a soft Gross Margin and EBITDA close whilst tracking changes to the underlying data impacting Gross Margin and EBITDA from WD2 and WD4 respectively
  • The data load monitoring service must be relevant i.e. we DO NOT want to track all financial statement data changes and report them to users
  • Appropriate stakeholders must be notified of the data changes once the deadlines have been passed
  • This process must not disrupt the source trial balance data load to HFM i.e. the trial balance loaded must be complete and accurate
  • Sufficient provision should be built into the design to switch over to a hard WD2/WD4 close if desired
Is it clear? I know it is not a conventional solution, but it will help establish the desired control over the phased close process without "hard" locking components of the income statement. This is what the customer wants to achieve.

Let's give some shape to these words, as we want to keep everything simple. HFM will be using phased submissions to manage the data flow, with review levels acting as a trigger event for data tracking.
  • Review Level 1 (RL1): both the automated and manual FDMEE load processes are running and exporting data to HFM.
  • Review Level 2 (RL2) Soft Close: the promotion to RL2 will now act as a trigger to activate the data monitoring script that will track data changes when data is exported to HFM.
  • Review Level 3 (RL3) Hard Close: at RL3, the FDMEE Global user system account no longer has access to load data to HFM.
Phased Submissions at a glance
The term phased submissions in HFM is synonymous with managing data through stages of the close cycle. In a typical organization this would involve:
  • Phase 1 – Promote inter-company data
  • Phase 2 – Promote Trial Balance data
  • Phase 3 – Promote Balance Sheet movement data/cash-flow
  • Phase 4 – Promote disclosure notes 
The process encourages a phased hard close and is designed for locking data submissions defined by accounting processes (i.e. when Phase 1 is promoted to Review Level 3, inter-company data cannot be changed by the trial balance loads in Phase 2).

Our Customer
The following summarizes the different phases and their details:

WD2 - Phase 1 (Soft close - RL2)
  • Promote Gross Margin data
  • Enable data monitoring on accounts that impact Gross Margin
  • Automated notifications informing users of changes to Gross Margin data

WD4 - Phase 2 (Soft close - RL2)
  • Promote EBITDA data
  • Enable data monitoring on accounts that impact EBITDA
  • Automated notifications informing users of changes to EBITDA data

WD6 - Phase 3 (Hard close - RL3)
  • Promote Trial Balance data
  • Lock trial balance data
  • Stop all email notifications

WD6 - Phase 4 (Hard close - RL3)
  • Promote USGAAP/Disclosure Notes
  • Lock USGAAP adjustments and changes to disclosure notes
  • Journals/Offline (Excel) submissions

Using a Custom HFM Dimension to Link Accounts and Phases
A conscious decision was made to avoid tagging submission groups on the account dimension, as accounts invariably end up being incorrectly tagged (or Group Finance change their mind!), which results in the need for an application metadata update to correct the submission group assignment (used to profile phased submissions).

Instead, a spare custom dimension was leveraged, in this case Custom6 (Custom5 is used to evaluate constant currency), with the intention of using FDMEE and SQL mapping scripts to load data to each of the Custom6 dimension members. This means that we can now adjust the FDMEE mapping for phased submission purposes without the need to apply HFM metadata updates.
Please note that this approach will only work dynamically where the chart of accounts adheres to a rational numbering sequence, e.g. Revenue accounts are the 1* series, Cost of Sales accounts are the 2* series, OPEX accounts are the 3* series, etc. Other approaches are also valid.
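As an illustration, a SQL mapping script for Custom6 might look like the sketch below. The target member names (GM_FDM, EBITDA_FDM, OTHER_FDM) are hypothetical; the expression assigns the Custom6 target column (UD6X) in TDATASEG, following the account series described above:

-- Assign the Custom6 target member based on the account numbering sequence
-- (1*/2* series roll into Gross Margin, 3* series into EBITDA)
UD6X =
    CASE
        WHEN ACCOUNT LIKE '1%' OR ACCOUNT LIKE '2%' THEN 'GM_FDM'
        WHEN ACCOUNT LIKE '3%' THEN 'EBITDA_FDM'
        ELSE 'OTHER_FDM'
    END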

Translating the Requirements for the Integration Guy
Although the solution seems complex, design is the key. Basically, we need FDMEE to:
  • Get the HFM Process Management details (Review Level for the different phases)
  • When in Soft Close, be able to compare the new data set against the last data set loaded, so that all data load changes are monitored
  • Auto-map missing mappings so the workflow can be completed
  • Auto-map for invalid HFM intersections and identify the CANNOT WRITE intersections (Soft Close)
  • Keep audit in FDMEE so missing mappings and invalid intersections can be easily filtered in the Data Load Workbench
  • Send email notification with detailed results 
In the next chapter we will go through the solution design in FDMEE, and then we will continue with some examples where you can see how it works and the outcome of this data monitoring solution.

Merry Xmas!

Tuesday, May 28, 2019

Code Snippet: Getting ODI Details for Source/Target System

There are different scenarios where you might need to interact with your source system. For example, say we want to perform a delta extract. For that purpose, we have a timestamp column in our source table/view. In our solution, we need to execute a query against the source system to get only the entities which have data generated after a specific timestamp.

Also, you may want to export your data to a target table. You will need to connect and execute an insert statement.

For source systems, the connection details are stored in the ODI tables of the FDMEE database; basically, what you set up in ODI Topology. If you are using a relational database as a target system, you may want to store the connection details in ODI as well, to avoid hard-coding them and to make the integration more dynamic.

This version uses the current context you set up in the source system page. We also wanted to have the logical schema as a parameter.

Let's have a look!

SQL query to get the ODI details for a specific source system:


def get_odi_source_details(fdmAPI, fdmContext, sourceSystemName, logicalSchema):
    '''
     Snippet:      Get ODI Source details for Source System Name 
     Author:       Francisco Amores
     Date:         21/05/2019
           
     Parameters:   
                   - fdmAPI: FDMEE API object
                   - fdmContext: FDM Context object
                   - sourceSystemName: Source System name 
                   - logicalSchema: ODI Logical Schema
                   
     Notes:         This snippet can be pasted in any event script. The function
                    returns a map object with the different properties
     
     FDMEE Version: 11.1.2.3 and later
               
     ----------------------------------------------------------------------
     Change:
     Author:
     Date:
    '''
    
    # *******************************************
    # Import section
    # *******************************************
    from java.sql import SQLException
    
       
    # *******************************************
    # Get ODI Details for Source System
    # *******************************************
    
    # log
    logMsg = "Getting ODI details for Source System %s" % (sourceSystemName)
    fdmAPI.logInfo(logMsg)
    
    sqlOdiDetails = """SELECT
                        S.SOURCE_SYSTEM_NAME,
                        C.CONTEXT_CODE AS ODI_CONTEXT,
                        CO.CON_NAME AS DATA_SERVER_NAME,
                        L.LSCHEMA_NAME AS LOGICAL_SCHEMA,
                        P.SCHEMA_NAME AS PHYSICAL_SCHEMA,
                        TXT.FULL_TXT AS JAVA_URL,
                        CO.JAVA_DRIVER,
                        CO.USER_NAME,
                        CO.PASS AS ENCRYPTED_PWD
                    FROM 
                        AIF_SOURCE_SYSTEMS S INNER JOIN SNP_CONTEXT C
                            ON S.ODI_CONTEXT_CODE = C.CONTEXT_CODE
                        INNER JOIN SNP_LSCHEMA L
                            ON L.LSCHEMA_NAME = ?
                        INNER JOIN SNP_PSCHEMA_CONT PC
                            ON PC.I_CONTEXT = C.I_CONTEXT AND
                               PC.I_LSCHEMA = L.I_LSCHEMA
                        INNER JOIN SNP_PSCHEMA P
                            ON P.I_PSCHEMA = PC.I_PSCHEMA
                        INNER JOIN SNP_CONNECT CO
                            ON P.I_CONNECT = CO.I_CONNECT
                        INNER JOIN SNP_MTXT TXT
                            ON CO.I_TXT_JAVA_URL = TXT.I_TXT
                        LEFT OUTER JOIN SNP_CONNECT_PROP CP
                            ON CP.I_CONNECT = CO.I_CONNECT
                    WHERE S.SOURCE_SYSTEM_NAME = ?"""
    
    # params
    params = [logicalSchema, sourceSystemName]
    
    try:
        # execute SQL query
        rsOdiDetails = fdmAPI.executeQuery(sqlOdiDetails, params)
        
        # initialize map
        mapOdiDetails = {}
        
        # loop
        if rsOdiDetails.isBeforeFirst():
            while rsOdiDetails.next():
                # get ODI details
                mapOdiDetails["ODI_CONTEXT"] = rsOdiDetails.getString("ODI_CONTEXT")
                mapOdiDetails["DATA_SERVER_NAME"]= rsOdiDetails.getString("DATA_SERVER_NAME")
                mapOdiDetails["PHYSICAL_SCHEMA"]= rsOdiDetails.getString("PHYSICAL_SCHEMA")
                mapOdiDetails["JAVA_URL"] = rsOdiDetails.getString("JAVA_URL")
                mapOdiDetails["JAVA_DRIVER"] = rsOdiDetails.getString("JAVA_DRIVER")
                mapOdiDetails["USER_NAME"] = rsOdiDetails.getString("USER_NAME")
                mapOdiDetails["ENCRYPTED_PWD"] = rsOdiDetails.getString("ENCRYPTED_PWD")
               
                # log
                fdmAPI.logInfo("ODI Details: %s" % mapOdiDetails)
        else:
            # ODI Details not found                        
            errMsg = "ODI Details not found for Source System name %s (Logical Schema %s)" % (sourceSystemName, logicalSchema)
            fdmAPI.logFatal(errMsg)
            raise RuntimeError(errMsg)
        # close rs
        fdmAPI.closeResultSet(rsOdiDetails)
    except SQLException, ex:
        errMsg = "Error executing the SQL statement: %s" % ex
        fdmAPI.logFatal(errMsg)
        raise RuntimeError(errMsg)
        
    # return
    return mapOdiDetails
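A hypothetical call from an event script (the source system name and logical schema below are illustrative). Note that the password comes back encrypted, so you would still decrypt it before connecting:

# Get the ODI connection details for the source system and log the JDBC URL
odiDetails = get_odi_source_details(fdmAPI, fdmContext, "ERP_SOURCE", "LSCHEMA_ERP")
fdmAPI.logInfo("Source JDBC URL: %s" % odiDetails["JAVA_URL"])
# odiDetails["ENCRYPTED_PWD"] must be decrypted before opening a connection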

Code snippets for FDMEE can be downloaded from GitHub.

Monday, April 29, 2019

Kscope19 - there we go!

Every year it gets more and more complicated to get a slot in this amazing event.
I'm more than happy to be there again. 

Sharing knowledge is something I really enjoy, so I will try to do my best as always.
This is the summary for you:
We all know that the administration guides are a good starting point to learn how to use a product. But, is everything written in them? Dark arts are not taught in books. I have been writing my own magic potions for many years, and it is a pleasure to be able to show you some dark magic in FDMEE.

The cloud is already a reality which increasingly takes center stage in our data integration solutions. That's why we won't forget about it, and we will share the best tips and tricks with you. You will also see the new Data Integration SUI!

If you would like to discover the best spells for your data integration requirements, this is undoubtedly your platform 9¾. We will be waiting for you!



Looking forward to seeing you in Seattle!

Wednesday, April 17, 2019

Data Protection with Multiple Global Application Users

Dear colleagues,
I'm back! It has been a hard few months with a lot of work, especially at home, where the large family has required 200% of my attention.

Many people asked me where I was, although those who know me know that I have not been to the Caribbean :-)

That being said...
We all know that the Cloud continues to grow with great force and that the role of data integration is fundamental in the architecture of any solution. Some other people, including my colleague John Goodwin, have covered many topics of maximum interest. I strongly recommend visiting the different blogs out there, although I'm sure you already have :-)

Today, I come back to show you a solution that we have been implementing for multiple customers. How many of you have had data protection problems in HFM when loading from FDMEE? Do you know all the solutions? I'm not going to cover all of them, but I will introduce the one that is not fully documented.

As usual, I'm not stating this is the best solution for your requirement. This is just to share a new idea that I found very useful in some of my implementations.

The requirement
Let's start with a common question from customers:
When we load HFM data with FDMEE in Replace mode, some accounts are wiped out. Controllers type that data manually, so we need to protect it. Can we?
Then, you start thinking about different approaches.
  • Maybe, we can use Merge mode instead...
  • Or Replace by Security...
  • FDMEE has built-in functionality for Data Protection...
  • Etc.
During your analysis, all options should be evaluated. You need to understand the pros and cons, and whether they have any impact on existing integration flows.

Let's now take that requirement to another level of complexity. Multiple FDMEE interfaces loading different sets of data for the same HFM sub-cube (Entity, Scenario, Year, and Value). For example: 
  • ERP data / Supplemental data
  • Statutory accounts / IFRS16 accounts
The two examples above have something in common: if you execute the Data Load Rules (same category) in Replace export mode, each DLR will delete the data of the other one. This is how the Replace load method works in HFM: data for the sub-cube is deleted before the new data set is loaded. For example:
  1. DLR GL_DATA loads actual data in Replace mode for Entity NY, Period Mar-2019
  2. DLR IFRS16_DATA loads actual data in Replace mode for Entity NY, Period Mar-2019
There can be many different sequences and scenarios, but if we focus on the one above, the second execution will delete all the data previously loaded for NY/Mar-2019 (assuming there is no data protection mechanism).

Can I use HFM Data Protection functionality available in FDMEE?
I must admit that I have not been a big fan of the Data Protection functionality. The main reasons are the way it works, its limitations, and the performance impact it can have. Basically, FDMEE (legacy FDM Classic too) extracts the data to be protected and then appends it to the DAT file that will be loaded.
For the sub-cube being loaded, you can protect all data that has a specific member name in any dimension of the data intersections, or all data that does not have that member.
Some of the limitations I refer to: you cannot protect multiple dimension members OOTB. Also, protecting data with the "<>" operator can result in FDMEE extracting large volumes of data. Definitely something to take into account.

Therefore, yes, data protection is an option to be evaluated but, in addition to understanding how it works, you need to take it into account in your HFM application design. If you want to protect manual data inputs or different data sets loaded through FDMEE, you may want to consider using an HFM custom dimension (typically Custom4), so the different data sets use different custom dimension members. Each FDMEE interface should protect all data different from the custom protection member it is loading into. As you can imagine, simply extracting the HFM data requires time and resources. Also, several re-loads can happen depending on the solution design.
BTW, I haven't mentioned Cell Texts, but they are included in the data protection process.

What about Merging data instead of Replacing?
If you are thinking that changing your load method can be the solution for your data protection issue, you may be introducing more issues. As we usually say, the cure is worse than the disease. E.g., if you have re-allocations in your data, you might be protecting your data but also leaving wrong data in the system. There is a well-known custom solution called Smart-Merge, but it is not in the scope of this post, so I leave that for your research.

Replace by Security Load Method and Global Application Users
Let's have a look at a load method different from Merge and Replace. The HFM documentation defines Replace by Security as:
In a nutshell, write and clear the data cells that you have access to. This seems to be a good way of protecting data, doesn't it? As you know, FDMEE leverages the HFM load methods, so Replace by Security is available when exporting data:
So if the user loading data has specific security classes assigned, the secured data will be protected as it will not be cleared. Good, we are on the right path.

Now let's go with users. Let's say the user loading into HFM is an admin user. Restricting access for the admin might not be a good idea. The same applies to non-admin users: assigning security classes for data protection requirements only might conflict with their role.

What about having specific Shared Services native users for data protection only? That would fit, but we don't want everyone using the same user in FDMEE, as that is not compatible with other FDMEE features such as Security by Location.
Luckily, FDMEE has something called Global Application Users. This feature is not new, as it was already available in legacy FDM Classic as the "Global Logon" option in the target adapter.

In FDMEE, this option is available in the Target Application Options:
Therefore, if we join the two concepts, we have global users loading data in Replace by Security mode. Still on the right path. However, this option is only available at target application level. We cannot override it at Data Load Rule level as we can with other target options:
One target application can only have one global user assigned. This is not good if we want to have multiple global users for multiple data flows.
You may come up with the option of registering target applications multiple times, which is possible from FDMEE 11.1.2.4.220. In my opinion, that's not a good idea, as most of the FDMEE artifacts are defined at target application level and you would end up duplicating them as well.

This post makes sense when we must find a solution to work around this limitation :-)

Multiple Global Application Users at Data Load Rule level
If we cannot have global users at DLR level OOTB, how can we get them?
Global Users, like many other options, are target application options. Therefore, as we already showed in other blog posts, they are stored at run-time in the table AIF_BAL_RULE_LOAD_PARAMS when the DLR is executed. The technical name for the global user option is GLOBAL_USER_FOR_APP_ACCESS:
Then, this little hack is easy. We can update the table at run-time based on the logic we define to get the right global user. The script below inserts the global application user for the current DLR only if it does not already exist (this is done in case you want to place the script in different event scripts, as shown in section At which step do we update the global user?)
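A minimal sketch of that logic for Oracle (assuming the column layout LOADID/PARAMETER_NAME/PARAMETER_VALUE for AIF_BAL_RULE_LOAD_PARAMS; the global user name and the logic to derive it are yours to define):

# Hypothetical global user for this DLR execution
globalUser = "IntegrationUser_IFRS16_FDM"
loadId = fdmContext["LOADID"]

# Update the option if a line already exists for this execution
# (it does when a global user is defined at Target Application level)
sqlUpd = """UPDATE AIF_BAL_RULE_LOAD_PARAMS
            SET PARAMETER_VALUE = ?
            WHERE LOADID = ? AND PARAMETER_NAME = 'GLOBAL_USER_FOR_APP_ACCESS'"""
fdmAPI.executeDML(sqlUpd, [globalUser, loadId], True)

# Insert the option if no line exists yet (no global user defined at
# Target Application level)
sqlIns = """INSERT INTO AIF_BAL_RULE_LOAD_PARAMS (LOADID, PARAMETER_NAME, PARAMETER_VALUE)
            SELECT ?, 'GLOBAL_USER_FOR_APP_ACCESS', ?
            FROM DUAL
            WHERE NOT EXISTS (
                SELECT 1 FROM AIF_BAL_RULE_LOAD_PARAMS
                WHERE LOADID = ? AND PARAMETER_NAME = 'GLOBAL_USER_FOR_APP_ACCESS')"""
fdmAPI.executeDML(sqlIns, [loadId, globalUser, loadId], True)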
As you can see, the SQL statement above includes an INSERT/UPDATE combination. Why? Because if you define a global user at Target Application level, there will be an existing line in the table, so we just have to update it. If you don't define it, we just have to insert it. It also avoids issues if you want to use the global user in the Validate step to run the Intersection Check report, and you then try to insert it again in the Load step (as long as you execute the DLR in one go).

The code I show is valid for Oracle Database, but it is similar for SQL Server. You might consider a MERGE statement as well. Whatever you implement, the result must be the same :-)

At which step do we update the global user?
The key question is when do we update the global user in our process? The answer depends on your specific requirements, but the table below summarizes the main scenarios:
How do we configure the global users?
It depends on the requirements. In our example, we need two global users:
  • User for GL data: IntegrationUser_INP_FDM
  • User for IFRS16 data: IntegrationUser_IFRS16_FDM
We have to perform the following steps:
  1. Create the security classes in HFM: C4_INP_FDM and C4_IFRS16_FDM
  2. Activate the security class for Custom4 (App settings in HFM metadata)
  3. Assign the security classes to the Custom4 members: INP_FDM and IFRS16_FDM
  4. Assign the Default role to the global user so it shows up in the matrix below
  5. Assign user access to the security classes
Let's see an end-to-end example
We will first show the data protection issue:
  1. Load GL data first in Replace mode
  2. Input data manually
  3. Load IFRS16 data in Replace mode.
To show how the solution works, we will repeat the process but loading IFRS16 data in Replace by Security mode with new global user IntegrationUser_IFRS16_FDM.
  1. As a starting point, we load the GL data in Replace mode. Data is loaded to Custom4 member INP_FDM

  2. The HFM data form shows the right data being loaded:
  3. After loading GL data, we input value 500 manually (Custom4 member INP):
  4. Finally, we load IFRS data in Replace mode:
As you can see, all data previously loaded has been wiped out after loading IFRS16 data in Replace mode. Therefore, neither the manual nor the GL data has been protected.
Now we have a data protection issue and a solution for it. Once we apply our event script with the code to update the global user at run-time, data looks good after loading in Replace by Security mode. The three values have been protected as expected:
Conclusion
As usually happens for any custom solution, you need to take into account different considerations:
  • If you run the data load rule manually, you can select Replace mode, as there is no functionality to hide load methods. To avoid that issue, you can always have a BefLoad script which checks the export mode before loading data (see the sketch after this list).
  • As you are loading with the global user, that's the user you see in the HFM Audit.
  • The solution applies to many different requirements. For example, we had a customer with different security requirements at HFM scenario level. In that solution, we used global users at category level.
  • Consider location security. If user A has access to location A, note that the HFM security applied will be that of the global user, not of user A. Therefore, if user A tries to load data for an entity he has no access to in HFM but the global user has, data will be loaded for that entity. As a good practice, I usually build a control when locations/data load rules are defined for unique entities. In this way, I prevent users from loading data for other entities, especially when they can manipulate the source files :-)
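For the first bullet, a BefLoad guard could look like the sketch below (assuming the export mode is exposed in fdmContext under the EXPORTMODE key):

# Block plain Replace loads for this DLR: only Replace by Security is allowed
exportMode = fdmContext["EXPORTMODE"]
if str(exportMode).upper() == "REPLACE":
    errMsg = "Replace mode is not allowed for this Data Load Rule"
    fdmAPI.logFatal(errMsg)
    raise RuntimeError(errMsg)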
Today, I tried to show you a new solution for data protection that leverages global users, HFM security classes and Replace by Security mode.

I hope you found it useful. Now it is the turn for your creativity.

That's all my folks.

Friday, November 24, 2017

Code Snippet: Executing PL/SQL Stored Procedures with IN/OUT Parameters

Do you need to execute a stored procedure from your script? Maybe to populate the Open Interface table? You can make use of the FDMEE API method executePLSQL, but only if the stored procedure does not return OUT parameters. If you need to return any value, then you can use the Java classes for SQL.

The following snippet shows how to execute a procedure remotely via a dblink. Executing a procedure in the database you connect to follows the same approach.

Let's have a look!

Executing a stored procedure with IN/OUT parameters

'''
 Snippet:       Execute a PL/SQL stored procedures IN/OUT params
 Author:        Francisco Amores
 Date:          24/11/2017
 Blog:          http://fishingwithfdmee.blogspot.com
 
 Notes:         This snippet can be pasted in any event script.
                Content of fdmContext object will be logged in the
                FDMEE process log (...\outbox\logs\)
                
                This snippet executes the stored procedure via dblink
                Local stored procedures are executed in a similar way
                
 Instructions:  Set log level (global or application settings) to > 4 
 Hints:         You can also implement code to get DB connection details
                instead of hard-coding them
               
 FDMEE Version: 11.1.2.3 and later
 ----------------------------------------------------------------------
 Change:
 Author:
 Date:
'''
try:
    # Import Java libraries
    import java.sql.SQLException as SQLException
    import java.sql.DriverManager as SQLDriverMgr
    import java.sql.CallableStatement as SQLCallableStmt
    import java.sql.Types as SQLTypes
    import java.sql.Date as SQLDate # needed for DATE parameters
    import java.text.SimpleDateFormat as SimpleDateFormat
    import java.math.BigDecimal as BigDecimal # needed for NUMBER parameters
    
    # Note: import any other class you need
    
except ImportError, err:
    errMsg = "Error importing libraries: %s" % err
    fdmAPI.logFatal(errMsg)
    raise RuntimeError(errMsg)

# ----------------------------------------
# Connect to FDMEE or External database
# ----------------------------------------

# Connection details
dbConn = "the jdbc url"
dbUser = "the db user"
dbPasswd = "the db password"

try:        
    # get connection to database for callable statements        
    conn = SQLDriverMgr.getConnection(dbConn, dbUser, dbPasswd)
    fdmAPI.logInfo("Connected to the database")
except SQLException, ex:
    errMsg = "Error executing SQL: %s" % (ex)
    raise RuntimeError("Error generated from FDMEE script\n%s" % errMsg)

# ----------------------------------------
# Execute PL/SQL Stored Procedure
# ----------------------------------------

# Get dblink
dbLink = "your dblink"

# PL/SQL Block Code (via DBLINK)
'''
    Procedure implemented as:
    PROCEDURE CARGA_TABLA(P1            OUT VARCHAR2,
                          P2            OUT NUMBER,
                          P3            IN NUMBER,
                          P4            IN DATE,
                          P5            IN VARCHAR2,
                          P6            IN VARCHAR2,
                          P7            IN VARCHAR2,
                          P8            IN VARCHAR2,                          
                          P9            IN VARCHAR2)
'''

# Each ? represents one stored proc parameter
# Ex: schema.package.storedproc if your stored proc is in a package
plSqlBlock = "{CALL schema.package.storedproc@%s(?, ?, ?, ?, ?, ?, ?, ?, ?)}" % dbLink

# Get parameters for the statement
# parameter p3 must be passed as java.math.BigDecimal (NUMBER)
p3 = BigDecimal("0") # replace with your NUMBER value
# parameter p4 must be passed as java.sql.Date
sdf = SimpleDateFormat("dd/MM/yyyy")
dtParsed = sdf.parse("date value") # e.g. "31/12/2017" (dd/MM/yyyy)
p4 = SQLDate(dtParsed.getTime())
p5 = "this param is passed as null"
p6 = "valuep6"
p7 = "valuep7"
p8 = "valuep8"
p9 = "valuep9"

# Prepare and execute call
try:
    # Callable Statement    
    callableStmt = conn.prepareCall(plSqlBlock)
    fdmAPI.logInfo("Callable statement successfully prepared")
    
    # Set IN parameters
    callableStmt.setBigDecimal("p3", p3)    
    callableStmt.setDate("p4", p4)
    callableStmt.setNull("p5", SQLTypes.VARCHAR) # NULL
    callableStmt.setString("p6", p6)
    callableStmt.setString("p7", p7)
    callableStmt.setString("p8", p8)
    callableStmt.setString("p9", p9)
    fdmAPI.logInfo("Parameters IN set")
    
    # Register OUT parameters
    callableStmt.registerOutParameter("p1", SQLTypes.VARCHAR)    
    callableStmt.registerOutParameter("p2", SQLTypes.NUMERIC)
    fdmAPI.logInfo("Parameters OUT registered")
        
    # Execute PL/SQL Stored Procedure
    result = callableStmt.execute()
    conn.commit()
    fdmAPI.logInfo("Stored Procedure successfully executed: %s" % result)
    
    # Get OUT parameters
    p1 = callableStmt.getString("p1")
    p2 = callableStmt.getInt("p2")

    # Log OUT parameters
    fdmAPI.logInfo("OUT p1:  %s" % p1)
    fdmAPI.logInfo("OUT p2:  %s" % p2)
    
except (Exception, SQLException), ex:
    errMsg = "Error when executing the stored procedure: %s" % ex
    fdmAPI.logFatal(errMsg)
    if len(errMsg) <= 1000:
        fdmAPI.showCustomMessage(errMsg)
    raise RuntimeError(errMsg)

# ----------------------------------------
# Close connection
# ----------------------------------------
if callableStmt is not None:
    callableStmt.close()
if conn is not None:
    conn.close()
    fdmAPI.logInfo("DB connection closed")



Code snippets for FDMEE can be downloaded from GitHub.