Tuesday, May 28, 2019

Code Snippet: Getting ODI Details for Source/Target System

There are different scenarios where you might need to interact with your source system. For example, suppose we want to perform a delta extract. For that purpose, we have a timestamp column in our source table/view. In our solution, we need to execute a query against the source system to get only the entities which have data generated after a specific timestamp.

Also, you may want to export your data to a target table. You will need to connect to it and execute an insert statement.

For source systems, the connection details are stored in the ODI tables of the FDMEE database: basically, what you set up in ODI Topology. If you use a relational database as a target system, you may want to store the connection details in ODI as well, to avoid hard-coding them and to make your solution more dynamic.

This version uses the current context you set up in the Source System page. We also wanted to have the logical schema as a parameter.

Let's have a look!

Function and SQL query to get the ODI details for a specific source system:


def get_odi_source_details(fdmAPI, fdmContext, sourceSystemName, logicalSchema):
    '''
     Snippet:      Get ODI Source details for Source System Name 
     Author:       Francisco Amores
     Date:         21/05/2019
           
     Parameters:   
                   - fdmAPI: FDMEE API object
                   - fdmContext: FDM Context object
                   - sourceSystemName: Source System name 
                   - logicalSchema: ODI Logical Schema
                   
     Notes:         This snippet can be pasted in any event script. The function
                    returns a map object with the different properties
     
     FDMEE Version: 11.1.2.3 and later
               
     ----------------------------------------------------------------------
     Change:
     Author:
     Date:
    '''
    
    # *******************************************
    # Import section
    # *******************************************
    from java.sql import SQLException
    
       
    # *******************************************
    # Get ODI Details for Source System
    # *******************************************
    
    # log
    logMsg = "Getting ODI details for Source System %s" % (sourceSystemName)
    fdmAPI.logInfo(logMsg)
    
    sqlOdiDetails = """SELECT
                        S.SOURCE_SYSTEM_NAME,
                        C.CONTEXT_CODE AS ODI_CONTEXT,
                        CO.CON_NAME AS DATA_SERVER_NAME,
                        L.LSCHEMA_NAME AS LOGICAL_SCHEMA,
                        P.SCHEMA_NAME AS PHYSICAL_SCHEMA,
                        TXT.FULL_TXT AS JAVA_URL,
                        CO.JAVA_DRIVER,
                        CO.USER_NAME,
                        CO.PASS AS ENCRYPTED_PWD
                    FROM 
                        AIF_SOURCE_SYSTEMS S INNER JOIN SNP_CONTEXT C
                            ON S.ODI_CONTEXT_CODE = C.CONTEXT_CODE
                        INNER JOIN SNP_LSCHEMA L
                            ON L.LSCHEMA_NAME = ?
                        INNER JOIN SNP_PSCHEMA_CONT PC
                            ON PC.I_CONTEXT = C.I_CONTEXT AND
                               PC.I_LSCHEMA = L.I_LSCHEMA
                        INNER JOIN SNP_PSCHEMA P
                            ON P.I_PSCHEMA = PC.I_PSCHEMA
                        INNER JOIN SNP_CONNECT CO
                            ON P.I_CONNECT = CO.I_CONNECT
                        INNER JOIN SNP_MTXT TXT
                            ON CO.I_TXT_JAVA_URL = TXT.I_TXT
                        LEFT OUTER JOIN SNP_CONNECT_PROP CP
                            ON CP.I_CONNECT = CO.I_CONNECT
                    WHERE S.SOURCE_SYSTEM_NAME = ?"""
    
    # params
    params = [logicalSchema, sourceSystemName]
    
    try:
        # execute SQL query
        rsOdiDetails = fdmAPI.executeQuery(sqlOdiDetails, params)
        
        # initialize map
        mapOdiDetails = {}
        
        # loop
        if rsOdiDetails.isBeforeFirst():
            while rsOdiDetails.next():
                # get ODI details
                mapOdiDetails["ODI_CONTEXT"] = rsOdiDetails.getString("ODI_CONTEXT")
                mapOdiDetails["DATA_SERVER_NAME"]= rsOdiDetails.getString("DATA_SERVER_NAME")
                mapOdiDetails["PHYSICAL_SCHEMA"]= rsOdiDetails.getString("PHYSICAL_SCHEMA")
                mapOdiDetails["JAVA_URL"] = rsOdiDetails.getString("JAVA_URL")
                mapOdiDetails["JAVA_DRIVER"] = rsOdiDetails.getString("JAVA_DRIVER")
                mapOdiDetails["USER_NAME"] = rsOdiDetails.getString("USER_NAME")
                mapOdiDetails["ENCRYPTED_PWD"] = rsOdiDetails.getString("ENCRYPTED_PWD")
               
                # log
                fdmAPI.logInfo("ODI Details: %s" % mapOdiDetails)
        else:
            # ODI details not found: close rs, log error, and raise exception
            fdmAPI.closeResultSet(rsOdiDetails)
            errMsg = "ODI Details not found for Source System name %s (Logical Schema %s)" % (sourceSystemName, logicalSchema)
            fdmAPI.logFatal(errMsg)
            raise RuntimeError(errMsg)
        # close rs
        fdmAPI.closeResultSet(rsOdiDetails)
    except SQLException, ex:
        errMsg = "Error executing the SQL Statement: %s" % ex
        raise RuntimeError(errMsg)
        
    # return
    return mapOdiDetails
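
How would you call it? A quick usage example follows. The source system name and logical schema below are hypothetical, so replace them with the ones registered in your Source System page and ODI Topology. Note that the password is returned as stored in ODI (encrypted); decrypting it is out of the scope of this snippet.

# Sample usage (hypothetical names)
mapOdiDetails = get_odi_source_details(fdmAPI, fdmContext, "ERP_NAV", "UDA_MSSQL")

# Use the returned map to build your source connection
javaUrl = mapOdiDetails["JAVA_URL"]
javaDriver = mapOdiDetails["JAVA_DRIVER"]
userName = mapOdiDetails["USER_NAME"]
fdmAPI.logInfo("Connecting to %s as user %s (driver %s)" % (javaUrl, userName, javaDriver))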

Code snippets for FDMEE can be downloaded from GitHub.

Monday, April 29, 2019

Kscope19 - there we go!

Every year it gets more and more complicated to get a slot in this amazing event.
I'm more than happy to be there again. 

Sharing knowledge is something I really enjoy, so I will try to do my best as always.
This is the summary for you:
We all know that the administration guides are a good starting point to learn how to use a product. But, is everything written in them? Dark arts are not taught in books. I have been writing my own magic potions for many years, and it is a pleasure to be able to show you some dark magic in FDMEE.

The cloud is already a reality which increasingly takes center stage in our data integration solutions. That's why we won't forget about it, and we will share the best tips and tricks with you. You will also see the new Data Integration SUI!

If you would like to discover the best spells for your data integration requirements, this is undoubtedly your platform 9¾. We wait for you!



Looking forward to seeing you in Seattle!

Wednesday, April 17, 2019

Data Protection with Multiple Global Application Users

Dear colleagues,
I'm back! It has been a hard few months with a lot of work, especially at home, where the large family has required 200% of my attention.

Many people asked me where I was, although those who know me know that I have not been to the Caribbean :-)

That being said...
We all know that the Cloud continues to grow with great force and that the role of data integration is fundamental in the architecture of any solution. Other people, including my colleague John Goodwin, have covered many topics of maximum interest. I strongly recommend visiting the different blogs out there, although I'm sure you already did :-)

Today, I come back to show you a solution that we have been implementing for multiple customers. How many of you have had data protection problems in HFM when loading from FDMEE? Do you know all the solutions? I'm not going to cover all of them, but I will introduce one that is not fully documented.

As usual, I'm not stating this is the best solution for your requirement. I just want to share a new idea that I found very useful in some of my implementations.

The requirement
Let's start with a common question from customers:
When we load HFM data with FDMEE in Replace mode, some accounts are wiped out. Controllers type them, so we need to protect that data. Can we?
Then, you start thinking about different approaches.
  • Maybe, we can use Merge mode instead...
  • Or Replace by Security...
  • FDMEE has built-in functionality for Data Protection...
  • Etc.
During your analysis, all options should be evaluated. You need to understand the pros and cons and if they have any impact on existing integration flows.

Let's now take that requirement to another level of complexity. Multiple FDMEE interfaces loading different sets of data for the same HFM sub-cube (Entity, Scenario, Year, and Value). For example: 
  • ERP data / Supplemental data
  • Statutory accounts / IFRS16 accounts
The two examples above have something in common: if you execute the Data Load Rules (same category) with the Replace export mode, each DLR will delete the data of the other one. This is how the Replace load method works in HFM: data for the sub-cube is deleted before the new data set is loaded. For example:
  1. DLR GL_DATA loads actual data in Replace mode for Entity NY, Period Mar-2019
  2. DLR IFRS16_DATA loads actual data in Replace mode for Entity NY, Period Mar-2019
There can be many different sequences and scenarios but if we focus on the above one, the second execution will delete all the data previously loaded for NY/Mar/2019 (assuming there is no data protection mechanism).

Can I use HFM Data Protection functionality available in FDMEE?
I must admit that I have not been a big fan of the Data Protection functionality. The main reasons are the way it works, its limitations, and the performance impact it can have. Basically, FDMEE (legacy FDM Classic too) extracts the data to be protected and then appends it to the DAT file that will be loaded.
For the sub-cube being loaded, you can protect either all data intersections that have a specific member name in any dimension, or all data intersections that do not have that member.
Some of the limitations I refer to: you cannot protect multiple dimension members OOTB. Also, protecting data with the operator "<>" can result in FDMEE extracting big volumes of data. Definitely something to be taken into account.

Therefore, yes, data protection is an option to be evaluated but, in addition to understanding how it works, you need to take it into account in your HFM application design. If you want to protect manual data inputs or different data sets loaded through FDMEE, you may want to consider using an HFM custom dimension (typically Custom4), so the different data sets use different custom dimension members. Each FDMEE interface should then protect all data different than the custom protection member it is loading into. As you can imagine, simply extracting the HFM data requires time and resources. Also, several re-loads can happen depending on our solution design.
BTW, I haven't mentioned Cell Texts but they are included in the data protection process.

What about Merging data instead of Replacing?
If you are thinking that changing your load method can be the solution for your data protection issue, you may be introducing more issues. As we usually say, the cure is worse than the disease. E.g., if you have re-allocations in your data, you might be protecting your data but also leaving wrong data in the system. There is a well-known custom solution called Smart-Merge, but it is not in the scope of this post, so I leave that for your research.

Replace by Security Load Method and Global Application Users
Let's have a look at a load method different from Merge and Replace. The HFM documentation defines Replace by Security as:
In a nutshell: write and clear only the data cells that you have access to. This seems to be a good way of protecting data, doesn't it? As you know, FDMEE leverages the HFM load methods, so Replace by Security is available when exporting data:
So if the user loading data has specific security classes assigned, the secured data will be protected as it will not be cleared. Good, we are on the right path.

Now let's go with users. Let's say the user loading into HFM is an admin user. Restricting access for the admin might not be a good idea. The same applies to non-admin users: assigning security classes for data protection requirements only might conflict with their role.

What about having specific Shared Services native users for data protection only? That would fit, but we don't want everyone using the same user in FDMEE as that is not compatible with other FDMEE features such as Security by Location.
Luckily, FDMEE has something called Global Application Users. This feature is not new as it was already available in legacy FDM Classic as the "Global Logon" option in the target adapter.

In FDMEE, this option is available in the Target Application Options:
Therefore, if we join the two concepts, we have global users loading data in Replace by Security mode. Still on the right path. However, this option is only available at target application level. We cannot override it at Data Load Rule level as we can do with other target options:
One target application can only have one global user assigned. This is not good if we want to have multiple global users for multiple data flows.
You may come up with the option of registering target applications multiple times, which is possible from FDMEE 11.1.2.4.220. In my opinion, that's not a good idea, as most of the FDMEE artifacts are defined at target application level and you would end up duplicating them as well.

This post makes sense when we must find a solution to work around this limitation :-)

Multiple Global Application Users at Data Load Rule level
If we cannot have global users at DLR level OOTB, how can we get them?
Global Users, like many other options, are target application options. Therefore, as we already showed in other blog posts, they are stored at run-time in the table AIF_BAL_RULE_LOAD_PARAMS when the DLR is executed. The technical name for the global user option is GLOBAL_USER_FOR_APP_ACCESS:
Then, this little hack is easy. We can update the table at run-time based on the logic we define to get the right global user. The script below inserts the global application user for the current DLR only if it does not already exist (this is done in case you want to place the script in different event scripts, as shown in section At which step do we update the global user?).
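Since the script itself is shown as an image, here is a minimal sketch of the idea. The column names (LOADID, PARAMETER_NAME, PARAMETER_VALUE) and the fdmContext key LOADID are assumptions based on my repository, so verify them against your own FDMEE database first:

# Minimal sketch: set the global user for the current DLR execution
# Assumption: AIF_BAL_RULE_LOAD_PARAMS stores one row per parameter
# with columns LOADID, PARAMETER_NAME and PARAMETER_VALUE
globalUser = "IntegrationUser_IFRS16_FDM" # derive it from your own logic
loadId = fdmContext["LOADID"]

# Insert the parameter only if it does not exist yet (Oracle syntax)
sqlInsert = """INSERT INTO AIF_BAL_RULE_LOAD_PARAMS
               (LOADID, PARAMETER_NAME, PARAMETER_VALUE)
               SELECT ?, 'GLOBAL_USER_FOR_APP_ACCESS', ? FROM DUAL
               WHERE NOT EXISTS
                   (SELECT 1 FROM AIF_BAL_RULE_LOAD_PARAMS
                    WHERE LOADID = ?
                    AND PARAMETER_NAME = 'GLOBAL_USER_FOR_APP_ACCESS')"""
fdmAPI.executeDML(sqlInsert, [loadId, globalUser, loadId], False)

# Update it in case the row already existed (global user defined
# at Target Application level)
sqlUpdate = """UPDATE AIF_BAL_RULE_LOAD_PARAMS
               SET PARAMETER_VALUE = ?
               WHERE LOADID = ?
               AND PARAMETER_NAME = 'GLOBAL_USER_FOR_APP_ACCESS'"""
fdmAPI.executeDML(sqlUpdate, [globalUser, loadId], True)
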
As you can see, the SQL statement above includes an INSERT/UPDATE combination. Why? Because if you define a global user at Target Application level, there will be an existing row in the table, so we just have to update it. If you don't define it, we have to insert it. It also avoids issues if you want to use the global user in the Validate step to run the intersection check report and then try to insert it again in the Load step (as long as you execute the DLR in one go).

The code I show is valid for Oracle Database, but it is similar for SQL Server. You might consider a MERGE statement as well. Whatever you implement, the result must be the same :-)
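For reference, a MERGE-based variant for Oracle (same assumed table layout as the sketch above) could look like this:

MERGE INTO AIF_BAL_RULE_LOAD_PARAMS T
USING (SELECT ? AS LOADID,
              'GLOBAL_USER_FOR_APP_ACCESS' AS PARAMETER_NAME,
              ? AS PARAMETER_VALUE
       FROM DUAL) S
ON (T.LOADID = S.LOADID AND T.PARAMETER_NAME = S.PARAMETER_NAME)
WHEN MATCHED THEN
    UPDATE SET T.PARAMETER_VALUE = S.PARAMETER_VALUE
WHEN NOT MATCHED THEN
    INSERT (LOADID, PARAMETER_NAME, PARAMETER_VALUE)
    VALUES (S.LOADID, S.PARAMETER_NAME, S.PARAMETER_VALUE)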

At which step do we update the global user?
The key question is when do we update the global user in our process? The answer depends on your specific requirements, but the table below summarizes the main scenarios:
How do we configure the global users?
It depends on the requirements. For our example, we need two global users:
  • User for GL data: IntegrationUser_INP_FDM
  • User for IFRS16 data: IntegrationUser_IFRS16_FDM
We have to perform the following steps:
  1. Create the security classes in HFM: C4_INP_FDM and C4_IFRS16_FDM
  2. Activate the security class for Custom4 (App settings in HFM metadata)
  3. Assign the security classes to the Custom4 members: INP_FDM and IFRS16_FDM
  4. Assign role Default to the global user so it shows up in the matrix below
  5. Assign user access to the security classes
Let's see an end-to-end example
We will first show the data protection issue:
  1. Load GL data first in Replace mode
  2. Input data manually
  3. Load IFRS16 data in Replace mode.
To show how the solution works, we will repeat the process but loading IFRS16 data in Replace by Security mode with new global user IntegrationUser_IFRS16_FDM.
  1. As a starting point, we load the GL data with Replace Mode. Data is loaded to Custom4 member INP_FDM

  2. The HFM data form shows the right data being loaded:
  3. After loading GL data, we input value 500 manually (Custom4 member INP):
  4. Finally, we load IFRS data in Replace mode:
As you can see, all data previously loaded has been wiped out after loading IFRS16 data in Replace mode. Therefore, both manual and GL data have not been protected.
Now we have a data protection issue and a solution for it. Once we apply our event script with the code to update the global user at run-time, data looks good after loading in Replace by Security mode. The three values have been protected as expected:
Conclusion
As usually happens for any custom solution, you need to take into account different considerations:
  • If you run the data load rule manually, you can select Replace mode as there is no functionality to hide load methods. To avoid that issue, you can always have a BefLoad script which checks the export mode before loading data (see the sketch after this list).
  • As you are loading with the global user, that's the user you see in the HFM Audit.
  • The solution applies to many different requirements. For example, we had a customer with different security requirements at HFM scenario level. In that solution, we used global users at category level.
  • Consider location security. If user A has access to location A, note that the HFM security applied will be the global user's, not user A's. Therefore, if user A tries to load data for an entity he has no access to in HFM but the global user has, data will be loaded for that entity. As a good practice, I usually build a control when locations/data load rules are defined for unique entities. In this way, I prevent users from loading data for other entities, especially when they can manipulate the source files :-)
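
As promised, here is a minimal BefLoad sketch to block plain Replace mode. The fdmContext key EXPORTMODE and the literal value "Replace" are assumptions; dump fdmContext to your process log to confirm the exact key and values in your environment:

# BefLoad sketch: abort the load if the user selected Replace mode
# Assumption: the export mode is exposed as fdmContext["EXPORTMODE"]
exportMode = fdmContext["EXPORTMODE"]
if exportMode == "Replace":
    errMsg = "Replace mode is not allowed for this target. Use Replace by Security."
    fdmAPI.logFatal(errMsg)
    raise RuntimeError(errMsg)
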
Today, I tried to show you a new solution for data protection that leverages global users, HFM security classes and Replace by Security mode.

I hope you found it useful. Now it's your turn to get creative.

That's all my folks.

Friday, November 24, 2017

Code Snippet: Executing PL/SQL Stored Procedures with IN/OUT Parameters

Do you need to execute a stored procedure from your script? Maybe to populate the Open Interface Table? You can make use of the FDMEE API method executePLSQL, but only if the stored procedure does not return OUT parameters. If you need to return any value, then you can use the Java classes for SQL.

The following snippet shows how to execute a procedure remotely via a dblink. Executing a procedure in the database you connect to follows the same approach.

Let's have a look!

Executing a stored procedure with IN/OUT parameters

'''
 Snippet:       Execute a PL/SQL stored procedure with IN/OUT params
 Author:        Francisco Amores
 Date:          24/11/2017
 Blog:          http://fishingwithfdmee.blogspot.com
 
 Notes:         This snippet can be pasted in any event script.
                The OUT parameters returned by the procedure will be
                logged in the FDMEE process log (...\outbox\logs\)
                
                This snippet executes the stored procedure via dblink
                Local stored procedures are executed in a similar way
                
 Instructions:  Set log level (global or application settings) to > 4 
 Hints:         You can implement also code to get db connection details
                instead of hard-coding
               
 FDMEE Version: 11.1.2.3 and later
 ----------------------------------------------------------------------
 Change:
 Author:
 Date:
'''
try:
    # Import Java libraries
    import java.sql.SQLException as SQLException
    import java.sql.DriverManager as SQLDriverMgr
    import java.sql.CallableStatement as SQLCallableStmt
    import java.sql.Types as SQLTypes
    import java.sql.Date as SQLDate # needed for DATE parameters
    import java.text.SimpleDateFormat as SimpleDateFormat
    import java.math.BigDecimal as BigDecimal # needed for NUMBER parameters
    
    # Note: import any other class you need
    
except ImportError, err:
    errMsg = "Error importing libraries: %s" % err
    fdmAPI.logFatal(errMsg)
    raise RuntimeError(errMsg)

# ----------------------------------------
# Connect to FDMEE or External database
# ----------------------------------------

# Connection details
dbConn = "the jdbc url"
dbUser = "the db user"
dbPasswd = "the db password"

try:        
    # get connection to database for callable statements        
    conn = SQLDriverMgr.getConnection(dbConn, dbUser, dbPasswd)
    fdmAPI.logInfo("Connected to the database")
except SQLException, ex:
    errMsg = "Error executing SQL: %s" % (ex)
    raise RuntimeError("Error generated from FDMEE script\n%s" % errMsg)

# ----------------------------------------
# Execute PL/SQL Stored Procedure
# ----------------------------------------

# Get dblink
dbLink = "your dblink"

# PL/SQL Block Code (via DBLINK)
'''
    Procedure implemented as:
    PROCEDURE CARGA_TABLA(P1            OUT VARCHAR2,
                          P2            OUT NUMBER,
                          P3            IN NUMBER,
                          P4            IN DATE,
                          P5            IN VARCHAR2,
                          P6            IN VARCHAR2,
                          P7            IN VARCHAR2,
                          P8            IN VARCHAR2,                          
                          P9            IN VARCHAR2)
'''

# Each ? represents one stored proc parameter
# Ex: schema.package.storedproc if your stored proc is in a package
plSqlBlock = "{CALL schema.package.storedproc@%s(?, ?, ?, ?, ?, ?, ?, ?, ?)}" % dbLink

# Get parameters for the statement
# parameter p3 is a NUMBER so it is passed as java.math.BigDecimal
p3 = BigDecimal("100") # sample value
# parameter p4 must be passed as java.sql.Date
sdf = SimpleDateFormat("dd/MM/yyyy")
dtParsed = sdf.parse("date value")
p4 = SQLDate(dtParsed.getTime())
# parameter p5 is passed as NULL (see setNull below)
p6 = "valuep6"
p7 = "valuep7"
p8 = "valuep8"
p9 = "valuep9"

# Prepare and execute call
try:
    # Callable Statement    
    callableStmt = conn.prepareCall(plSqlBlock)
    fdmAPI.logInfo("Callable statement successfully prepared")
    
    # Set IN parameters
    callableStmt.setBigDecimal("p3", p3)    
    callableStmt.setDate("p4", p4)
    callableStmt.setNull("p5", SQLTypes.VARCHAR) # NULL
    callableStmt.setString("p6", p6)
    callableStmt.setString("p7", p7)
    callableStmt.setString("p8", p7)
    callableStmt.setString("p9", p7)      
    fdmAPI.logInfo("Parameters IN set")
    
    # Register OUT parameters
    callableStmt.registerOutParameter("p1", SQLTypes.VARCHAR)    
    callableStmt.registerOutParameter("p2", SQLTypes.NUMERIC)
    fdmAPI.logInfo("Parameters OUT registered")
        
    # Execute PL/SQL Stored Procedure
    result = callableStmt.execute()
    conn.commit()
    fdmAPI.logInfo("Stored Proceedure successfully executed: %s" % result)
    
    # Get OUT parameters
    p1 = callableStmt.getString("p1")
    p2 = callableStmt.getInt("p2")

    # Log OUT parameters
    fdmAPI.logInfo("OUT p1:  %s" % p1)
    fdmAPI.logInfo("OUT p2:  %s" % p2)
    
except (Exception, SQLException), ex:
    errMsg = "Error when executing the stored procedure: %s" % ex
    fdmAPI.logFatal(errMsg)
    if len(errMsg) <= 1000:
        fdmAPI.showCustomMessage(errMsg)
    raise RuntimeError(errMsg)

# ----------------------------------------
# Close connection
# ----------------------------------------
if callableStmt is not None:
    callableStmt.close()
if conn is not None:
    conn.close()
    fdmAPI.logInfo("DB connection closed")



Code snippets for FDMEE can be downloaded from GitHub.

Thursday, October 5, 2017

Universal Data Adapter - Making it Simple for Multiple Databases

Hi folks!

It's been a long time since my last post, but several things have happened lately. Anyway, sorry about that.

Before I dive into today's topic I'd like to summarize my life in the last weeks:
  • Kscope17 was a great event as always. It's a very good opportunity for those of us living on the other side of the pond. Meeting lots of people, partners and customers is always great. San Antonio was impressive. I spent one day visiting the city with my colleague Henri (The Finnish Hyperion Guy). We had sun and rain. And what do you do when it's raining? Shopping! In addition, I got the "EPM Data Integration Top Speaker" award! That was awesome. I didn't expect it, so I can only say thanks to all the community. 
  • Heading towards a larger family. This is an easy one: if all goes fine, next year we will be 2 more in the family :-)
  • New apartment. I've been very busy assembling IKEA furniture. For those who would like to visit Malaga (Spain), we bought a new apartment for rental. Feel free to visit it!
OK, now that you know what I have been doing...time for some FDMEE content!

Universal Data Adapter (UDA)
If you are not familiar with the UDA yet, it may be a good idea to visit my previous entries about it first.
The Requirement - Multiple Source Databases with same Source View Layout
One of the drawbacks of the UDA is the configuration and maintenance. That's something we cannot change. It has been designed like that. 

Why configuration?
UDA requires configuration in FDMEE and Oracle Data Integrator (ODI).

In FDMEE, UDA requires the typical artifacts plus some specific ones:
  • Source System with an ODI context assigned
  • Source Adapter
  • Source Period Mappings
  • Import Formats
  • Location and Data Load Rule
In ODI, UDA requires new artifacts to be created in addition to the ones imported during the initial configuration (Projects and Model Folders):
  • Manually in ODI
    • Data Server
    • Physical Schema
    • Context
  • Generated in ODI from FDMEE
    • Datastore for Table/View definition
    • Package and Interface
    • Scenario for the Import Format
Why maintenance? There are many events that require re-generating the ODI objects created from FDMEE. I'm not going to list all of them, but I will explain the main ones.
  • Migrating across environments. LCM doesn't migrate the ODI objects. Besides, you can't perform a native ODI export/import: FDMEE has been designed to have the same ODI repository ID across environments, so the object export/import will fail.
  • Applying patches. Some patches may require re-importing some default ODI objects. This will probably delete the objects you generated from FDMEE.
  • Changes in Tables/Views. Think about adding a new column to a Table/View. You have to re-import the table definition, regenerate the package and interface, adjust the Import Format, and regenerate the ODI scenario.
What about multiple databases? All the configuration mentioned above multiplies as well. Why? Everything is chained. Each Source System has an ODI Context assigned. If you have multiple sources of the same database type, you can't use the Global context, as it can only be assigned once to the Logical Schema UDA_MSSQL. Then, as you need multiple Source Systems and Source Adapters, you will need multiple Import Formats as they are assigned to adapters, and so on...

I know what you are thinking... a lot of manual work to do!

Today, I will show you the solution we implemented for an integration with Microsoft Navision (SQL Server). The customer had 30 different source databases on the same server (including different collations).

The Solution - Moving complexity to the DB makes UDA simpler!
As part of the Analysis & Design workshop, we explained the drawbacks of having 30 source databases. They understood that we had to find another solution immediately.
For 2 databases, the solution architecture would look like this:
Then, I told them that the design could be significantly simplified. However, that required some additional work on the SQL side. They wanted to keep things simple in FDMEE, so they were happy to go for it. Actually, they did :-)

Basically, my advice was to move the complexity to SQL. That would make the UDA configuration and maintenance simpler.

In a nutshell:
  1. Create a DB Link (Oracle)/Linked Server (SQL Server) from FDMEE database server to the source database server
  2. Create a view in the FDMEE database. This view has an additional column identifying the source database. It queries the source views using remote queries (e.g., OPENQUERY in SQL Server), which perform quite well as they leverage the source DB engine (see the sketch after this list)
  3. Configure UDA in ODI
  4. Configure UDA FDMEE
  5. In addition to the required parameters, define an additional one for the column having the source database
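
To make this more tangible, here is a minimal sketch of such a view for SQL Server. The linked server name (NAV_SRV), the database names and the source view name (dbo.V_GL_BALANCES) are all hypothetical, so adjust them to your environment:

-- Consolidated view in the FDMEE database (one UNION ALL branch per source DB)
-- SOURCE_DB is the additional column identifying the source database
CREATE VIEW dbo.V_UDA_GL_BALANCES AS
SELECT 'NAV_DB1' AS SOURCE_DB, Q.*
FROM OPENQUERY(NAV_SRV, 'SELECT * FROM NAV_DB1.dbo.V_GL_BALANCES') AS Q
UNION ALL
SELECT 'NAV_DB2' AS SOURCE_DB, Q.*
FROM OPENQUERY(NAV_SRV, 'SELECT * FROM NAV_DB2.dbo.V_GL_BALANCES') AS Q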

Note that if the source databases are in different servers, the solution would be slightly different but still doable.

Also, a similar approach could be taken if you have multiple views with different layouts: you could merge them into one with a common layout.

I know you may have concerns about performance, but if the views are correctly designed and network latency is not a bottleneck, everything should be fine. Indeed, ODI usually executes the SQL extract queries in the source database. Don't forget to play a bit with the Data Server tuning settings to get the best performance.

As usual, I'm not telling you this is the solution, but it is definitely a good practice. I hope you found it interesting, as it may help you simplify your design.

Have a good summer!

FDMEE & Java APIs, more than friends

Hi folks!

Finally the day arrived.

Some years ago, FDMEE was introduced into our lives with lots of nice new functionality. Jython is probably one of the most important additions. Why? That's an easy one: with Jython, FDMEE opened itself to the Java world. 
If you haven't read it yet, there is a must-read chapter about Jython and Java Integration at Jython's site. I'd like to highlight the following phrase:

Java integration is the heart of Jython application development... The fact is that most Jython developers are using it so that they can take advantage of the vast libraries available to the Java world, and in order to do so there needs to be a certain amount of Java integration in the application.

Most of the key Oracle EPM and non-EPM products have their own Java API (JAPI). During this blog series, I'm going to focus on the EPM ones. In a nutshell, the integration of FDMEE with the Java APIs for products like Essbase or HFM, gives us freedom to implement additional functionality to enhance our EPM integration solutions.

Using Java from within Jython
One of the goals of Jython is to make using existing Java libraries straightforward. It's as simple as using external Jython modules (PY files) within a Jython script:
  1. Import the required Java classes and use them directly in your code
  2. Call the Java methods or functions you need
What about types? Well, here comes the good part. Usually, you don't need to worry about them at all. There is some automatic type coercion (conversion of one type of object to a new object of a different type with similar content), both for the parameters passed and for the value returned by the Java method.

Basically, when it gets a Java numeric type, or a Java string, Jython automatically converts it into one of its primitive types.

Let's have a look at the following example:
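A minimal sketch of that kind of example (reconstructed, as the original is a screenshot):

# Import the Java class and use it directly in Jython
from java.util import ArrayList

javaList = ArrayList()
javaList.add("Foo")
javaList.add("Bar")

# The ArrayList can be iterated as if it were a Jython list
for item in javaList:
    print item
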
As you can see, the ArrayList object (which is an object from the Java Collections Framework) has been coerced into a Jython list. We can use methods of the ArrayList class (like add) and iterate the object as if it were a proper Jython list.

We will see more examples for coercion when using the Essbase and HFM JAPIs.

BTW, what is Foo?

Using Java from within FDMEE Scripts (Jython 2.5.1 and Java 1.6)
When writing your Jython scripts, don't forget that:
  • FDMEE latest version (11.1.2.4) uses Jython 2.5.1
  • FDMEE uses Java 1.6 (as EPM system)
In other words, you are restricted to the classes available in Java 1.6. Also, if you use 3rd-party Java libraries, they must be compatible with Java 1.6.

Regarding the different approaches to implement the Jython script:
  • Build the custom functionality in a Java library that you can later import into your scripts
  • Write the Java-style code directly in Jython within your script
Option 1 requires a deeper knowledge on Java programming. I'd recommend this option only if you know Java programming and your customization is a good candidate for being reused in other implementations. On the other hand, option 2 is quicker and probably a better option for one-time customization.

Essbase Java API
FDMEE comes with functionality that is commonly used:
  • Extract data 
  • Run calculation scripts before/after loading data
  • Pass parameters to the scripts
  • Create drill regions
  • Among others...
But, what about?
  • Run calculation scripts before extracting data
  • Validate target data before it is loaded
  • Load new metadata before loading data
  • Execute MaxL scripts
  • Using substitution variables in FDMEE artifacts like period mappings 
  • Among others...
I wish the product provided this functionality, but unfortunately it doesn't. However, it provides a powerful scripting engine which enables us to extend its functionality.

Going back to the list above, you have probably met some of these requirements in one of your projects. What did you do? Create a MaxL script and run it from your script using the subprocess module? Or did you leverage the Essbase JAPI?

That probably depends on many other factors... Do we have time for the implementation? Do we know how to do it? Are there existing batches already doing the work?

To me, using the Essbase JAPI is not only about having seamless integration but also about capturing errors in an elegant way, something you can hardly get by running batches from external scripts.

Spoiler!!! See how simple it would be to execute a MaxL script or statement:
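The screenshot aside, a minimal sketch with the Essbase JAPI would look like this. It follows the MaxL sample shipped with the JAPI; the connection details are made up, and no explicit cast to IEssMaxlSession is needed in Jython since method lookup happens at run-time:

# Sketch: execute a MaxL statement with the Essbase JAPI
from com.essbase.api.session import IEssbase

# Hypothetical connection details
essUser = "admin"
essPwd = "password"
essProvider = "http://epmserver:13080/aps/JAPI"
essServer = "EssbaseCluster-1"

# Sign on and get the OLAP server (it also implements IEssMaxlSession)
ess = IEssbase.Home.create(IEssbase.JAPI_VERSION)
dom = ess.signOn(essUser, essPwd, False, None, essProvider)
maxlSess = dom.getOlapServer(essServer)
maxlSess.connect()

# Execute the MaxL statement and log any returned messages
maxlSess.execute("alter database Sample.Basic reset data")
msgs = maxlSess.getMessages()
for i in range(msgs.getCount()):
    fdmAPI.logInfo("MaxL message: %s" % msgs.getAt(i))

# Disconnect and sign off
maxlSess.disconnect()
ess.signOff()
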
I will cover more details about using the Essbase JAPI and some examples in upcoming posts.

HFM Java API
What about HFM?
  • How can we extract Cell Texts?
  • Extract and Load Metadata?
  • Translate data before extracting it?
  • Run custom consolidation/calculation/translation logic?
  • Among others
HFM also has a JAPI! Actually, just as with the Essbase integration, FDMEE uses these APIs behind the scenes.

Spoiler again!!! Extracting cell texts:
Other Java APIs
Besides the HFM and Essbase JAPIs, there are other products and components with their own API. Some of them, such as LCM's, are documented; some others are not, for example the Outline Load Utility (OLU) API.

In the next posts, I will show some examples of customization implemented with the Essbase and HFM APIs. If you can't wait, my colleague John already published a very cool one.

I haven't forgotten about Planning. It does not have a published Java API, but you should have a look at the REST API.

Take care!

Tuesday, May 23, 2017

Code Snippet: merging files by writing chunks

Do you have multiple source files that have to be merged so you can import them as a single one? This requirement is quite common, especially in automated data loads.

For example, our ERP system is exporting two files for Balance Sheet and Profit & Loss. We want to import them as a single file under the same POV. Merging the source files is the solution.

The approach taken is to merge the files by writing chunks into the target file. In this way, we avoid memory issues with large source files.

Let's have a look!

Merging a list of files by writing chunks


'''
 Snippet:       Merge a list of files
 Author:        Francisco Amores
 Date:          23/05/2017
 Blog:          http://fishingwithfdmee.blogspot.com
 
 Notes:         This snippet can be pasted in any event script.                
                
 Instructions:  Set log level (global or application settings) to > 4 
 Hints:         Use this snippet to merge multiple single files into
                one.
                It writes chunks to avoid memory issues with large files
               
 FDMEE Version: 11.1.2.3 and later
 ----------------------------------------------------------------------
 Change:
 Author:
 Date:
'''

# initialize
srcFolder = r"C:\temp"
tgtFolder = r"C:\temp"
listSrcFilename = ["file1.txt", "file2.txt", "file3.txt"]
tgtFilename = "merge.txt"

# import section
import os
import shutil

try:
    # Open Target File in write mode
    tgtFilepath = os.path.join(tgtFolder, tgtFilename)
    tgtFile = open(tgtFilepath, "w")
    # Log
    fdmAPI.logInfo("File created: %s" % tgtFilepath)    
    
    # Loop source files to merge
    for srcFilename in listSrcFilename:
    
        # file path
        filepath = os.path.join(srcFolder, srcFilename)          
        # Log
        fdmAPI.logInfo("Merging file: %s" % filepath)                    
        # Open file in read mode
        srcFile = open(filepath, "r")        
        # Copy source file into target
        # 10 MB per writing chunk to avoid big file into memory
        shutil.copyfileobj(srcFile, tgtFile, 1024*1024*10)
        # Add new line char in the target file
        # to avoid issues if source files don't have end-of-line chars
        tgtFile.write(os.linesep)
        # Close source file
        srcFile.close()
        # Debug
        fdmAPI.logInfo("File merged: %s" % file)
        
    # Close target file
    tgtFile.close()
                    
except (IOError, OSError), err:
    raise RuntimeError("Error concatenating source files: %s", err)

Code snippets for FDMEE can be downloaded from GitHub.