FDMEE Custom Solution to Load Multiple Periods Through Single POV - Part 1
I recently got a question about the following scenario:
"Our file has periods in rows. How can we load it through a single POV?"
In Classic FDM, everything was a bit more flexible, as you had control over almost everything. You could easily add a custom dimension for your periods and then adjust the Export action script to export the custom period column instead of the POV period.
In FDMEE, you also have several approaches. Today I'm going to share one solution that will show:
- How to think in FDMEE
- Using Lookup dimensions
- Jython scripting (slice notation, exceptions, loops, file operations, list operations, Unicode files, the continue statement...)
- How to avoid modifying adapter action scripts (not supported, and possibly removed soon)
- All you want to see...
I'm going to load data into HFM using the following source file:
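The screenshot of the source file is not reproduced here, but a hypothetical delimited file with the same shape (source period in the first column, one row per period; member names and amounts are invented for illustration) could look like this:

January;Sales;GolfBalls;1000
February;Sales;GolfBalls;2000
March;Sales;GolfBalls;6000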
Turning it over in your head
So, let's apply my favorite methodology... Divide & Conquer! My aim is to import the source period column (the 1st column) into FDMEE and then replace the POV Period with the mapped values of the imported periods in the .DAT file exported to HFM.
Adding a Lookup dimension for Source Period
Lookup dimensions can be added to target applications and assigned to data columns. They exist only for FDMEE purposes, so they are not exported to the target. They can be used in many different ways. Today we will use a Lookup dimension to import the source period, map it, and export it. How?
I have added the Lookup dimension Period_Custom and assigned it to data column UD10.
Importing Source Period into Period_Custom dimension
Lookup dimensions can be configured in Import Formats just like any other dimension:
When I run my Data Load Rule for POV March-2006, I can see a column for the Period_Custom dimension:
In this example, my mappings are * to *, although you can create any kind of mapping for the Lookup dimension.
If we export this data to HFM, we can see that the POV Period March-2006 is exported... we don't want this!
Obviously HFM will show 6000 for March:
Replacing POV Period by Source Period in the DAT file
If you take a look at the export file, you won't see the source period anywhere, so it would be very difficult to replace: we would not find it when going through the lines.
The first thing we need to figure out, then, is how to export the source period without modifying the HFM_EXPORT.vbs action script :-)
One approach would be:
- Before the file is exported... concatenate the Source Period to the Target Account
- After the file is exported... go through the file and replace the POV Period with the Source Period
- Don't forget to remove the Source Period from the Target Account in the DAT file...
I'm sure you can find other approaches, but this one lets me show more :-)
How do we concatenate the Source Period to the Target Account?
At database level, this means concatenating two columns of table TDATASEG: UD10X and ACCOUNTX. Why UD10X? As we said, we can also have mappings for the source period, so let's take the target (mapped) period.
The update statement would be something like:
UPDATE TDATASEG SET ACCOUNTX = UD10X || '_' || ACCOUNTX WHERE LOADID = ? AND ACCOUNTX IS NOT NULL AND UD10X IS NOT NULL
How do we perform this operation before the .DAT file is exported?
We can use the event script BefExportToDat.
I have also added some debug information to the process log (Outbox\Logs); this is highly recommended, as it makes debugging your code much easier.
Note: I found an issue when running the UPDATE statement using LOADID as a parameter; at first I suspected a bug. You can use PARTITIONKEY instead.
Update: the issue is not related to the SQL query itself but to the fact that, if we run the Export step separately from the Import step, the LOADID column in table TDATASEG is only updated during the Export and is therefore not yet usable in the BefExportToDat event script.
Basically, if we run the steps separately (from the Workbench), LOADID will be different for each step. If we run the DLR from the Data Load Rule page and select Import + Export, both steps share the same LOADID and the SQL query with LOADID in the WHERE clause works as expected.
An easier solution is to modify the SQL UPDATE and use POV information in the WHERE clause (RULE_ID, PARTITIONKEY, CATKEY, PERIODKEY):
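As a sketch, the core of the BefExportToDat event script could then look like the following. It uses the standard fdmAPI/fdmContext objects available in FDMEE event scripts; the exact context keys (for example LOCKEY for the partition key) and the commit flag are assumptions you should verify against your environment and the FDMEE documentation.

# BefExportToDat - sketch only, adjust to your environment
# Concatenate the mapped lookup period (UD10X) to the mapped account (ACCOUNTX)
# for the current POV, so the source period travels inside the .DAT file.
fdmAPI.logDebug("BefExportToDat: prefixing ACCOUNTX with UD10X")

sqlUpdate = """UPDATE TDATASEG
               SET ACCOUNTX = UD10X || '_' || ACCOUNTX
               WHERE RULE_ID = ?
                 AND PARTITIONKEY = ?
                 AND CATKEY = ?
                 AND PERIODKEY = ?
                 AND ACCOUNTX IS NOT NULL
                 AND UD10X IS NOT NULL"""

# POV information taken from the script context
params = [fdmContext["RULEID"],
          fdmContext["LOCKEY"],      # partition (location) key
          fdmContext["CATKEY"],
          fdmContext["PERIODKEY"]]

fdmAPI.executeDML(sqlUpdate, params, True)
fdmAPI.logDebug("BefExportToDat: ACCOUNTX prefixed for current POV")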
How do we replace the POV Period and remove the prefix added to the Target Account?
We can go through all the lines in the .DAT file and use split + Python slice notation to get the prefix, and then build the final line with the custom period and the target account in the correct format.
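As a tiny illustration of the split + slice idea on a single data line (the line layout, field positions, and member names here are made up; check the actual column order in your own .DAT file):

# Illustration only: hypothetical HFM .DAT data line with the prefixed account
line = "Actual;2006;March;Periodic;EastSales;<Entity Currency>;February_Sales;[ICP None];GolfBalls;[None];[None];[None];2000"

fields = line.split(";")              # split the line into its columns
acctField = fields[6]                 # prefixed account, e.g. "February_Sales"

sepPos = acctField.find("_")
srcPeriod = acctField[:sepPos]        # slice notation -> "February"
account = acctField[sepPos + 1:]      # slice notation -> "Sales"

fields[2] = srcPeriod                 # replace the POV period with the source period
fields[6] = account                   # restore the clean target account
newLine = ";".join(fields)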
We will actually be creating a new .DAT file (with the same name), so we first back up the original one.
Once the original file is safely backed up, we go through all its lines and build the new ones:
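A minimal sketch of what this could look like in the AftExportToDat event script is shown below. The export file name pattern, the UTF-8 encoding, and the field positions are assumptions to verify in your environment; the sketch uses codecs for Unicode files and skips non-data lines with continue.

# AftExportToDat - sketch only, adjust paths, encoding and positions
import os
import shutil
import codecs

# Assumption: the .DAT file sits in the outbox; adjust the name pattern to your environment
datFile = os.path.join(fdmContext["OUTBOXDIR"],
                       "%s_%s.dat" % (fdmContext["TARGETAPPNAME"], fdmContext["RULEID"]))
bakFile = datFile + ".bak"

# 1) Back up the original file before rewriting it
shutil.copy(datFile, bakFile)
fdmAPI.logDebug("Backup created: %s" % bakFile)

# 2) Read the backup and write a new .DAT file with the source period
try:
    fIn = codecs.open(bakFile, "r", "utf-8")
    fOut = codecs.open(datFile, "w", "utf-8")
    for line in fIn:
        fields = line.strip().split(";")
        # Skip section headers, blank lines, and anything that does not look like a data line
        if line.startswith("!") or len(fields) < 7:
            fOut.write(line)
            continue
        acctField = fields[6]                 # assumed position of the account column
        sepPos = acctField.find("_")
        if sepPos == -1:
            fOut.write(line)
            continue
        fields[2] = acctField[:sepPos]        # source period replaces the POV period
        fields[6] = acctField[sepPos + 1:]    # clean target account
        fOut.write(";".join(fields) + "\n")
    fIn.close()
    fOut.close()
except Exception, err:
    fdmAPI.logError("Error rebuilding the .DAT file: %s" % err)
    raise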
And finally, after the new file is produced, we have to remove the prefix from ACCOUNTX:
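One way to do this is another parameterized UPDATE executed from the same event script. The sketch below strips exactly the UD10X prefix plus the underscore; the SUBSTR/LENGTH syntax is Oracle, so adjust it if your FDMEE repository runs on SQL Server.

# Sketch: strip the "<period>_" prefix from ACCOUNTX after the file is rebuilt
sqlRestore = """UPDATE TDATASEG
                SET ACCOUNTX = SUBSTR(ACCOUNTX, LENGTH(UD10X) + 2)
                WHERE RULE_ID = ?
                  AND PARTITIONKEY = ?
                  AND CATKEY = ?
                  AND PERIODKEY = ?
                  AND ACCOUNTX IS NOT NULL
                  AND UD10X IS NOT NULL"""

params = [fdmContext["RULEID"], fdmContext["LOCKEY"],
          fdmContext["CATKEY"], fdmContext["PERIODKEY"]]

fdmAPI.executeDML(sqlRestore, params, True)
fdmAPI.logDebug("ACCOUNTX prefix removed for current POV")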
Now it's time to place our scripts in FDMEE:
Running the FDMEE workflow... exporting the new file
All the operations shown above are transparent to the end user, as they happen during the Export step. The Workbench grid therefore looks the same as before, but we can see the difference in the .DAT file being exported and in the final data in HFM:
Our new .DAT file now has the source periods instead of the POV Period, as expected, and we can check the data in HFM:
Well, as I said, this is just one of several possible solutions, so take your time to think it through. Also take into consideration any other integration requirements you may have.
Enjoy it!