Thursday, February 18, 2016

Universal Data Adapter (UDA), direct DB integration - Part 3

In the last two posts we have analyzed the new Universal Data Adapter (UDA) thoroughly. Now it's time for the final part of the series: importing data from our source database table and view!

If you haven't already, you can visit the other two parts:
  • Part 1: Introduction to UDA, initial ODI configuration
  • Part 2: Configuring source system and source adapter in FDMEE
As any other source, the UDA needs the following artifacts to be configured in FDMEE:
  1. Import Format: maps UDA columns to dimensions
  2. Location: data entry point 
  3. Data Load Mappings: conversion rules for source data
  4. Data Load Rule: data extract definition
  5. Source Period Mappings: period filters for data extract
1. Import Format
The import format is where we will link our source columns (imported from the source adapter page) with our HFM dimensions, including any potential lookup dimensions defined in FDMEE.
As usual, we have to select the source system and the source adapter:
Before going through the details of the import format, I would like to highlight that, in our case, the source tables were designed to match our FDM/HFM dimensions. In your case, you may have to combine source columns in order to get the source value into FDMEE.

What can we select in the source columns?
We can select any column available in the UDA. In order to make column names understandable for anyone, the import format shows the "Display Name" property set up in the source adapter:
What can we select in the target dimensions?
By default you will see the list of dimensions of your target application including amount and lookup dimensions.
If you need to concatenate multiple source columns into one target dimension, you can add the target dimension again. Each source column assigned to the same target dimension will be concatenated using the concatenation character set in the import format definition:
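The concatenation logic can be sketched in a few lines. This is just an illustration of what FDMEE does when several source columns map to the same target dimension; the column names and the "_" concatenation character are made up for the example:

```python
# Sketch: how multiple source columns mapped to one target dimension
# are joined with the concatenation character from the import format.
# Column names here are illustrative, not real FDMEE metadata.
def concat_sources(row, columns, concat_char="_"):
    """Join the values of the mapped source columns in order,
    using the concatenation character from the import format."""
    return concat_char.join(str(row[c]) for c in columns)

row = {"COMPANY": "BL001", "DEPARTMENT": "SALES"}
print(concat_sources(row, ["COMPANY", "DEPARTMENT"]))  # BL001_SALES
```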
I'm missing some things here:
  • We cannot select the Currency dimension (Oracle confirmed it will be available in PSU200)
  • We cannot select description columns (DESC1 and DESC2)
  • We cannot select Attribute columns
For description and attribute columns, the only workaround is to adjust the ODI interface generated by FDMEE. However, as I explained in the previous post, this change would have to be replicated in every environment as we won't be able to export and import the ODI objects.

Back to the import format configuration, we map source columns to target dimensions for each adapter:
Regenerating the ODI Scenario
When we create new import formats or change the column mappings, we need to re-generate the ODI scenario (FDMEE executes this scenario when you execute the DLR).
If everything runs fine we will get a success message. However, that wouldn't be exciting, so we got an error:
We troubleshoot the issue by starting with aif-WebApp.log:
I don't see any error except an issue with the interface, but that could be normal because FDMEE just generates the ODI scenario (it does not update the interface). And if you review the interface generated in the package, it doesn't have any reference to the source table (just variables and hard-coded values).

[2016-02-18T14:23:29.948-08:00] [ErpIntegrator0] [NOTIFICATION] [EPMFDM-140274] [oracle.apps.erpi.model] [tid: 22] [userId: <anonymous>] [ecid: 00iYe^P7VpJFw0zsRG007z2FtVU1zYIxL0000NS0000g1,0:1] [APP: AIF#] [SRC_CLASS: com.hyperion.aif.odi.common.util.OdiConstants] [SRC_METHOD: logDebugInfo] Message - Issues: LKM is set for SourceSet or Staging Area but TargetDatastore column mappings do not reference any columns from a SourceDatastore of Dataset Default. Either delete SourceDatastore(s) or add source column references in one or more TargetDatastore column mappings for Dataset Default.

In fact, if I look at the status icon I can see a green tick, which means that the ODI scenario was successfully generated. We can even go to ODI and see it:
Weird. I have seen this issue happening randomly in some projects, so I decided to run the re-generation again. Now it worked. Don't ask me why, just be happy like me.
After re-generating the two ODI scenarios we are ready to create the other artifacts:

2. Locations
Nothing special to remark here; we just assign the import format, as we don't have any other artifacts (logic group, check group...).

3. Data Load Mappings
I won't cover mappings in this post. I just want to execute the DLR and see source data in FDMEE :-)

4. Source Period Mappings
The source period mappings define the values for the columns Period, Period Number, and Year. These values will be passed as parameters when we set up our DLR with the explicit period mapping type.
We can define multiple calendars and configure adjustment periods if needed.
Source period mappings are defined at source system level. Therefore, if we have more than one table for the same source system and they use different calendars, we will have to configure multiple calendars. Calendars are then assigned to the data load rules as shown below.
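Conceptually, with the explicit period mapping type the values from the source period mapping table end up as filters on the classified period columns. A minimal sketch with sqlite and made-up table/column values, just to show the shape of the filter:

```python
import sqlite3

# In-memory sketch: a source table with classified period columns.
# Table name, column names and values are illustrative.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE SAMPLE_DATA (ACCOUNT TEXT, AMOUNT REAL, "
            "PERIOD TEXT, PERIODNUMBER INTEGER, YEAR INTEGER)")
con.executemany("INSERT INTO SAMPLE_DATA VALUES (?,?,?,?,?)",
                [("1000", 100.0, "Jan-16", 1, 2016),
                 ("1000", 200.0, "Feb-16", 2, 2016)])

# Values that would come from the source period mapping for the
# POV period being processed:
mapping = {"PERIOD": "Jan-16", "PERIODNUMBER": 1, "YEAR": 2016}

rows = con.execute(
    "SELECT ACCOUNT, AMOUNT FROM SAMPLE_DATA "
    "WHERE PERIOD = ? AND PERIODNUMBER = ? AND YEAR = ?",
    (mapping["PERIOD"], mapping["PERIODNUMBER"], mapping["YEAR"])
).fetchall()
print(rows)  # only the Jan-16 record is extracted
```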

Update (26/Feb/2016): If you have only one period column with alphanumeric values like JAN-16, you will need to enter those values in the GL Period Column and classify that column as Period class in the source adapter page. Then this column will be used as a filter when running the data load rule with the explicit period type. Tested and working at a customer :-)

5. Data Load Rules
If you are familiar only with file-based loads you will see some differences in DLRs for the UDA:
We select the calendar defined for table SAMPLE_DATA and set period mapping type to Explicit.
We defined a parameter for the source adapter. This is used to filter data by column ERP_SOURCE. To show that this works, we have added a new line to the table with a different value from the one we want to extract data for:
Time to execute the DLR...3, 2, 1, go!
It's important to select a period or range of periods (the UDA supports multi-period loads) with data in the table; in other words, the Period, Period Number, and Year columns must all be correctly populated in the table and configured in the source period mappings.

Once it is finished we navigate to the Data Load Workbench and:
Voilà! Our source data has been successfully imported and filtered.

Time to test the Oracle View. Will this also work like a charm?
For Oracle adapter, we defined a LIKE condition on column ERP_COMPANY:
Again we have added a new line to be filtered:
Running it...and no data imported!
What did I do wrong? Let's have a look at the FDMEE process log. I can see the SQL query executed. Actually, I can't :-)
To troubleshoot source adapters you will have to visit the ODI Operator (either in ODI Studio or ODI Console).
It seems that we forgot to add the % sign in the parameter value, so the new value is BL%, which will extract all data for companies starting with BL:
Let's execute it again:
Now it's time for you to explore other options.

Customization of UDA
There are many scenarios where our source adapter needs to be customized. As opposed to the Open Interface Adapter, the ODI package and interface are generated from FDMEE. This has some drawbacks:
  • Any customization performed in the ODI objects will be deleted when we click on "Generate Template Package"
  • Also, it will have to be replicated across environments
For that reason I would prefer to customize on the source side. For example, if we need to join two tables, we could create a SQL view instead of using the tables as a source.
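For example, the join of two hypothetical tables can be pushed into a view so the UDA only ever sees one object. A sketch using sqlite just to illustrate the idea; the table, view, and column names are all made up:

```python
import sqlite3

# Two hypothetical source tables that need to be joined.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE GL_BALANCES "
            "(ACCOUNT TEXT, COMPANY_ID INTEGER, AMOUNT REAL)")
con.execute("CREATE TABLE COMPANIES "
            "(COMPANY_ID INTEGER, COMPANY_CODE TEXT)")
con.execute("INSERT INTO GL_BALANCES VALUES ('1000', 1, 250.0)")
con.execute("INSERT INTO COMPANIES VALUES (1, 'BL001')")

# The join lives in the database, not in the generated ODI interface,
# so regenerating the template package cannot wipe it out.
con.execute("""
CREATE VIEW V_GL_FOR_FDMEE AS
SELECT b.ACCOUNT, c.COMPANY_CODE AS ERP_COMPANY, b.AMOUNT
FROM GL_BALANCES b
JOIN COMPANIES c ON c.COMPANY_ID = b.COMPANY_ID
""")

result = con.execute("SELECT * FROM V_GL_FOR_FDMEE").fetchall()
print(result)
```

The UDA would then be pointed at the view instead of the base tables, keeping the FDMEE-generated ODI objects untouched.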

At the moment, I can only think of customizing the interface in order to import data for description and attribute columns, but this is something that Oracle will probably fix, so I wouldn't worry.

Migration to other environments
When we talk about migration, LCM has been enhanced a lot, so most of the FDMEE artifacts can now be migrated. But what about ODI objects?
  • Topology: the topology needs to be re-created manually in the target environment
  • Designer Objects: the packages/interfaces/scenarios need to be re-created from FDMEE in the target environment once LCM migration is done.
Why? Basically, as already said, because the ODI master and work repositories have the same internal IDs across environments.

Hopefully in the future, Oracle adds a button "Generate all ODI objects" :-)

And the key question now that I have the UDA: should I use OIA for my database integration?
My answer is that it depends on your requirements. In a straightforward integration you wouldn't need OIA. However, if you have complex logic for data extracts, you may prefer OIA over UDA.

What could be enhanced?
Of course, as a new functionality, the Universal Data Adapter could be enhanced. From what I saw, it would be great to have the following functionality, besides the items I mentioned above:
  • Filtering capabilities. For example, we can't add a filter to extract data where a column value is different from the parameter we passed. We also cannot add complex nested filter conditions.
  • We cannot select import mode. It is always imported in replace mode.
  • We cannot use import scripts. If we need any import operation on source fields that would need to be done in a SQL view.
  • Being able to apply SQL functions on source fields. These functions would be available in the import format as expressions.
Can you think of any others? I'm sure you can, so feel free to add your comments!

With this post I'm done with discovering the new Universal Data Adapter. At least until I find something interesting to show ;-)

Update 22/4/2016: it seems that table/view names must be in upper case. Otherwise the table import definition fails.


Wednesday, February 17, 2016

Universal Data Adapter (UDA), direct DB integration - Part 2

If you just landed on this post, you may want to visit Part 1 first to get the basics of the new Universal Data Adapter (UDA).

Configuration steps in a nutshell:
  1. Configuration of ODI components (topology, projects and models)
  2. Configuration of FDMEE components (source system, source adapter...)
In this second part of the series we will cover the configuration of the UDA in FDMEE.

Source System
In the same way we do with the OIA, we need to configure a source system in FDMEE.
When we register a new source system we need to select the source system type. One of the changes is that we don't have the type "Others" any more. This was the source system type for the Open Interface Adapter before PSU100. We now have one type for each supported source database (besides the new one for Open Interface):
As we have two source databases (MSSQL and Oracle) we need to register 2 source systems:
After saving our source systems we will see the two of them in the source system summary panel:

ODI Context Code
As you can see, the ODI context code is a mandatory field when registering a source system. If you have one source system of each type, you would probably use the GLOBAL context, which is the default one for FDMEE.
The context code links our source system with the physical architecture defined in ODI.

Think then about the following scenario: what would happen if we had 2 different source systems for Oracle databases? Which context would you use? GLOBAL? We can't use one context, as ODI wouldn't know which database to point to when extracting data. Therefore we would need two contexts.

Technically, at run-time, ODI points to the data server/physical schema mapped to specific logical schema (based on our source type) through the context set in FDMEE.

For example, if our source system type is "MSSQLServer Data Source" and the context code is "UDA_MSSQL", ODI will extract data from the database (physical schema) mapped to the logical schema SRC_ERP through that context.

In order to provide a flexible configuration, I always like to set up different contexts when using source systems other than File.

Source Adapters
In the same way the source system is the link between FDMEE and the configuration of our physical connections in ODI, the source adapter is the link between FDMEE and the ODI objects which will perform the data extract.

When adding a new source adapter, we can select neither the ODI Package Name nor the ODI Project Code. This is because the UDA only uses one project. This project has a package template which is used to generate new packages for each table we extract data from.
This approach is different from the one used for the Open Interface Adapter but similar to the one used for the SAP BW adapter.

Don't worry if you did not understand what I just said. I will explain it graphically below.

1. Creating the source adapters

How does the source adapter for UDA work?
Once we have created the new source adapter, we need to:

1. Bring the table definition into FDMEE so we can configure filters and the import format
2. Generate the ODI package which will extract data from the table and import it into TDATASEG_T

Do we have to perform these operations manually? Noooooo! I can't imagine we would have to do that :-) So then how do we do it? Luckily, FDMEE will do it for you when you click 2 buttons :-)
  • Import Table Definition: imports the source table/view columns into the source adapter in FDMEE
  • Generate Template Package: generates a package in ODI which extracts data from the table. This package will contain an ODI interface using the filters we define in the source adapter

2. Importing the table definition
Going back to the configuration, we have already added the two source adapters. Now we will import the table (MSSQL) and view (Oracle) definitions into FDMEE:
When importing the columns we need to select the source system name so ODI can go to the physical table and reverse it.
Hopefully you get a success message like this:
Cube definition? I said the UDA source adapter was similar to SAP BW, too similar I would say :-) Obviously, this message will have to be changed by Oracle so we see "Table definition has been imported successfully" (Update 22/4/16: already fixed in

You may not be familiar with the reverse-engineering concept if you don't know ODI. Let's try to clarify what happens internally when you click that button.

If we navigate to the ODI Operator we will see that an ODI process was executed. This process reverses the table, which basically means it creates a datastore in ODI for that table.
This datastore will then be used in the ODI interface as a source datastore. If we navigate to the ODI model folder that we imported when configuring ODI, we can see how a new datastore has been created below the MSSQL Source model:
I'm not going to test this today, but I can anticipate a potential issue here. What would happen if we had two source databases of the same type with tables of the same name?
I guess the import table definition process would fail. If you have this issue, you may have to create views on top of the tables and name each one differently.

Going back to the table columns: once they are imported, we have to classify some columns and optionally type a user-friendly name for each column. This is the name that we will see in the import format configuration:

3. Classifying amount and period columns
We need to classify the following columns:
- Amount (mandatory): so FDMEE knows which column is imported as the amount
- Period (optional): used to filter data if the DLR is configured with period type "Explicit"
- Period Number (optional): same as Period
- Year (optional): same as Period

If we assign only Amount, then we can only use period type "None" in the data load rules. In other words, FDMEE assumes that the content of the table/view has data only for the period being processed. If we classify the period columns, then we can use period type "Explicit", which passes the values we configure in the source period mappings as extract filters.

Update (26/Feb/2016): We can find source tables/views where we only have one period column. This column could have values like JAN-16, FEB-16, etc. In this case we just have to classify that period column as Period class. It is then quite important that we set correct values in the source period mapping table for the adapter, as this will be the value passed as a filter when the explicit period type is selected in the data load rule.
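With a single alphanumeric period column, the explicit filter boils down to one predicate on that column. A minimal sketch with made-up table and values:

```python
import sqlite3

# Illustrative source table with a single Period-class column.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE SRC (ACCOUNT TEXT, AMOUNT REAL, PERIOD TEXT)")
con.executemany("INSERT INTO SRC VALUES (?,?,?)",
                [("1000", 100.0, "JAN-16"), ("1000", 75.0, "FEB-16")])

# With explicit period type, the GL Period value from the source period
# mapping (e.g. JAN-16) is passed as the filter on the Period column.
rows = con.execute("SELECT ACCOUNT, AMOUNT FROM SRC WHERE PERIOD = ?",
                   ("JAN-16",)).fetchall()
print(rows)  # only the JAN-16 record
```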

Data types must match with the following table:
Time to change my source table and view because I forgot to include period columns :-)
Once they are added I can re-import the table definition:
We perform the same steps for the Oracle View:
If you have new columns in the table, don't add them manually in the source columns panel, as the table needs to be reversed in ODI. Otherwise the column cannot be used in the interface and your configuration will fail when generating the template.

4. Defining the source parameters (extract filters)
In a typical data extract from a database we usually use filters. Think of a SQL query like this:

WHERE X = 'red'

We are using a filter on column X so only records having X with value "red" will be extracted.
The same applies to the UDA. We can define filters on any column of the source adapter. These parameters will be available as text boxes in the data load rule.
  • Parameter name: in order to be aligned with other source adapters I like using the following naming convention for parameter names 
    • p_param_name (lower case and words delimited by underscore)
  • Data types: char (text), number or date
  • Conditions: condition used to filter the column with the value set in the data load rule
    • Explicit (WHERE COLUMN='x')
    • Between (WHERE COLUMN BETWEEN 'x' AND 'y')
    • In (WHERE COLUMN IN ('x','y'))
    • Like (WHERE COLUMN LIKE 'x%')
  • Default value: value used as a filter when you don't pass a value
  • Parameter prompt: parameter name you see in the data load rule (user-friendly)
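The four condition types above map to SQL predicates. A sketch of the WHERE fragment each one would produce (FDMEE builds the actual filter inside the generated ODI interface; the column and values here are illustrative):

```python
# Sketch: WHERE fragment for each source-parameter condition type.
def where_fragment(column, condition, values):
    if condition == "Explicit":
        return f"{column} = '{values[0]}'"
    if condition == "Between":
        return f"{column} BETWEEN '{values[0]}' AND '{values[1]}'"
    if condition == "In":
        quoted = ",".join(f"'{v}'" for v in values)
        return f"{column} IN ({quoted})"
    if condition == "Like":
        return f"{column} LIKE '{values[0]}'"
    raise ValueError(f"Unknown condition: {condition}")

print(where_fragment("ERP_COMPANY", "Like", ["BL%"]))
# ERP_COMPANY LIKE 'BL%'
```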

We have created one parameter for each source adapter:

I have created one of them with LIKE condition as I want to stress test it :-)

5. Generate template package
As I explained before, once ODI has the datastore for our table/view (imported after clicking "Import Table Definition"), it needs to generate the package with an interface to pull data from our source into FDMEE staging table TDATASEG_T.

Before clicking any button, I'd suggest you visit ODI Studio and review the ODI project structure. You will see something like this:
I will go back to that structure once we generate the new package.
Let's click on "Generate Template Package":
Whoops! An "error occurred" message. We didn't expect this one :-(
Let's do some troubleshooting!

The 1st place where I will look is the aif-WebApp.log:
There is one error:

Error encountered while computing Source Sets :ODI-15562: Source sets cannot be computed because one execution area doesn't have a physical schema. Check the errors on the interface

It says something about physical schema so I will review my topology.
I don't see anything wrong here as my context UDA_MSSQL is correctly configured.

Then I remembered when I was struggling with the SAP BW adapter, where I had a similar issue. I found out that even if you are using contexts other than GLOBAL (we have configured UDA_MSSQL and UDA_ORACLE), the GLOBAL context cannot have the logical schema for the UDA mapped to the <Undefined> physical schema.

Why? I guess this is needed because the ODI interface generated by FDMEE is using GLOBAL as the optimization context (it has no impact as we will be using our own contexts):
So if we go to ODI Studio and configure GLOBAL so that the logical schemas are mapped to ANY physical schema (the specific physical schema doesn't matter, just that it isn't <Undefined>):
Then the package template will be successfully generated:

Finally, if you are curious and you want to fully understand what happened, then you must visit ODI again.

You will notice a couple of new items:
  • Package UDAMSSQL (this is the name of my source adapter)
  • Interface UDAMSSQL

ODI Interface generated
The interface extracts data from SAMPLE_UDA datastore and loads into TDATASEG_T.
You can also see how the parameter we defined has been added as a filter on the source datastore.
Don't be surprised if you don't see all the mappings in the target datastore, as they will be dynamically created when we define our import format and generate the ODI scenario:
For Oracle:

Note: you can apply your ODI knowledge and adjust the interface as required. The only thing you need to take into consideration is that changes on generated interfaces/packages need to be recreated manually in each environment, as you may have issues when exporting/importing ODI objects between environments having the same repository ID (FDMEE repositories have the same IDs across environments by default).

ODI Package generated
The package has been generated based on the existing template and contains the interface that pulls data.
The ODI scenario that we will generate from the import format will be built from this package:
For Oracle:

After the ODI objects have been generated we can see that the ODI package has been updated in the source adapter accordingly:

OMG! I just noticed that I used the Oracle table rather than the view :-)
I don't want to take all the screenshots again, so I will copy the existing adapter and create a new one for the view:
Then I change the table name (the field actually should say Table/View):
Don't think we are done! Copying the adapter doesn't mean the ODI objects are copied, so we need to import the view definition and generate the package:

6. Drill-through URL
I'm not covering this topic today as I'm not going to use any drill to source.

Some considerations
  • You need one source adapter for each source database table
  • Don't change the project code in ODI for any UDA
  • Don't change the ODI Package name in ODI for any UDA
  • Documentation says that we should enclose the table name in double quotes to preserve capitalization, but when I type the table name as "SAMPLE_UDA" and move to another field, the double quotes are automatically wiped out. Also, if I enter any text in lowercase in any of the source adapter fields, that text is capitalized automatically. I won't take this into account for the moment :-)
  • You could have issues if you have the same table name for different source systems of the same type
  • You can copy existing adapters to create new ones with similar configuration
  • Click Save after every step :-)
  • When changes are made in the source adapter setup, you need to import the table definition and regenerate the template package again.

Enough for today!