Archive for ODI Architecture

Let’s Join DEVEPM @ KSCOPE 16

Posted in ACE, EPM, Essbase, ETL, Hyperion Essbase, Hyperion Planning, InfraStructure, Kscope 16, ODI, ODI 10g, ODI 11g, ODI 12c, ODI Architecture, ODTUG, Oracle Database, OS Command, Performance, Tips and Tricks with tags , , , , , , , , , , , , on April 5, 2016 by RZGiampaoli

Hi guys, how are you?

Just a quick post about this year's KSCOPE. This year we'll have two excellent sessions:

Take a Peek at Dell’s Smart EPM Global Environment:

Ricardo Giampaoli, TeraCorp

Co-presenter(s): Rodrigo Radtke de Souza, Dell

When: Jun 27, 2016, Session 2, 10:15 am – 11:15 am

Topic: EPM Applications – Subtopic: Planning

In a fast-moving business environment, finance leaders are successfully leveraging technology advancements to transform their finance organizations and generate value for the business.
Oracle's Enterprise Performance Management (EPM) applications are an integrated, modular suite that supports a broad range of strategic and financial performance management tools that help businesses unlock their potential.

Dell’s global financial environment contains over 10,000 users around the world and relies on a range of EPM tools such as Hyperion Planning, Essbase, Smart View, DRM, and ODI to meet its needs.

This session shows the complexity of this environment, describing all the relationships between those tools, the techniques used to keep such a large environment in sync, and how the most varied needs of different businesses and laws around the world are met to create a complete and powerful business decision engine that takes Dell to the next level.

Incredible ODI Tips to Work with Hyperion Tools

Ricardo Giampaoli, TeraCorp

Co-presenter(s): Rodrigo Radtke de Souza, Dell

When: Jun 27, 2016, Session 6, 4:30 pm – 5:30 pm

Topic: EPM Platform – Subtopic: EPM Data Integration

ODI is an incredible and flexible development tool that goes beyond simple data integration. But most of its development power comes from outside-the-box ideas.

  • Did you ever want to dynamically run any number of “OS” commands using a single ODI component?
  • Did you ever want to have only one data store and loop different sources without the need of different ODI contexts?
  • Did you ever want to have only one interface and loop any number of ODI objects with a lot of control?
  • Did you ever need to have a “third command tab” in your procedures or KMs to improve ODI powers?
  • Do you still use an old version of ODI and miss a way to know the values of the variables in a scenario execution?
  • Did you know ODI has four “substitution tags”? And do you know how useful they are?
  • Do you use “dynamic variables” and know how powerful they can be?
  • Do you know how to have control over your ODI priority jobs automatically (stop, start, and restart scenarios)?

If you want to know the answers to all these questions, please join us in this session to learn the special secrets of ODI that will take your development skills to the next level.

Join us at KSCOPE 16 and add our two sessions to your schedule. They will be very good sessions, and I'm sure you'll learn some new stuff that will help you in your EPM environment!

SpeakerSquare (1)

Remotely Zipping files with ODI

Posted in 11.1.1.9.0, ACE, Configuration, EPM, Essbase, ETL, Hacking, Hyperion Essbase, InfraStructure, ODI, ODI 10g, ODI 11g, ODI 12c, ODI Architecture, OS Command, Performance, Remotely, Tips and Tricks, Zip Files with tags , , , , , , , , , , , on April 5, 2016 by RZGiampaoli

Hi guys, how are you? It has been a long time since I last wrote something, but it was for a good reason: we were working on our two Kscope sessions! Yes, this year we will have two sessions, and I think they will be great!

Anyway, let us get to the point!

Today I want to talk about something that should be very simple to do but in the end is a nightmare: zipping a file on a remote server.

A little bit of context! I was working on a backup interface for a client and, because their cubes are very big, I was trying to improve performance as much as I could.

Part of the backup was to copy the .ind and .pag files as well as the data extract files. For one app we are talking about 30 GB of .pag files and 40 GB of data extract files.

Their ODI infrastructure is like this:

Infrastructure

Basically, I need to extract/copy data from the Essbase server to the disaster recovery server (DR server). Nothing special here. The problem is that, because of the size of the files, I wanted to zip them first and then send them to the DR server.

If you use the ODI tools to zip the files, what they do is bring all the files to the ODI agent server, zip everything, and then send it back. I really did not want all this traffic on the network and all the time lost in this process (also, the agent server is a LOT less powerful than the Essbase server).

Regular odi tools zip process

Then I started to research how I could do that (and thank you, my colleague and friend Luis Fernando Cairo, who helped me a lot by running many tests on this).

First of all we have three main options here:

  1. Create a .bat file and run it remotely: I did not like it because I do not want a lot of .bat files all over the place.
  2. Use the Windows Invoke-Command cmdlet: I would need a program on the server, like 7-Zip, which I do not have access to install freely, and I do not want to install zip programs all over the place either.
  3. Use PsExec to execute a program on the server: same issue as the previous one.

OK, I figured out that in the end I would need to create/install something on the server… and I hated it. Well, let's at least optimize the problem, right?

Then I was thinking: what do all Hyperion servers have in common? The answer is JAVA.

Then I thought, I can use the JAR command to zip a file:

jar cfM file.zip *.pag *.ind

Where:

c: Creates a new archive file named jarfile (if f is specified) or sends it to standard output (if f and jarfile are omitted), adding to it the files and directories specified by inputfiles.

f: Specifies the file jarfile to be created (c), updated (u), extracted (x), indexed (i), or viewed (t). The -f option and filename jarfile are a pair — if present, they must both appear. Omitting f and jarfile accepts a “jar file” from standard input (for x and t) or sends the “jar file” to standard output (for c and u).

M: Do not create a manifest file entry (for c and u), or delete a manifest file entry if one exists (for u).

Hmm, things started to look better. Now I had to decide whether I would use Invoke-Command or PsExec.

I started trying Invoke-Command, but after some time I figured out that I could not execute the jar command using it.

Then my last alternative was PsExec.

The good thing about PsExec is that it comes as a zip file that you just need to unzip on the agent server, add to the PATH environment variable, and you are good to go.

It works amazingly.

You can run anything remotely with this, and it is a centralized, non-invasive solution as well (which is what I liked).

You just need to:

psexec \\Server -accepteula -w "work dir" javapath\jar cfM file.zip *.pag *.ind

Where:

-w: Set the working directory of the process (relative to remote computer).

-accepteula: This flag suppresses the display of the license dialog.

There is one catch: for some unknown reason, the ODI agent does not pick up the PATH correctly, so you need to use the complete path where PsExec was "installed". The ODI command looks like this:

OdiOSCommand "-OUT_FILE=Log_Path/Zip_App_Files-RUM-PNL.Log" "-ERR_FILE=Log_Path/Zip_App_Files-RUM-PNL.err"

D:\Oracle\PSTools\psexec \\server -accepteula -w \\arborpath\APP\RUM\PNL\ JAVA_PATH\jdk160_35\bin\jar cfM App_Files-RUM-PNL.zip *.pag *.ind

With this, we will have a process like this:

Remotly Zip Process

This should not be something complicated, but it is. And believe me, the resulting process is very fast and the client is very happy.

I hope you guys enjoy it and see you soon.

Dynamically exporting objects from ODI

Posted in 11.1.1.9.0, ACE, ODI, ODI Architecture, Tips and Tricks with tags , on March 1, 2016 by Rodrigo Radtke de Souza

Hi all!

Today we will be talking about how we can export any object from ODI in a dynamic way. But first, why would we want to do that? One good example is to figure out which ODI objects changed during a period range and export their XML to be stored in a code versioning repository. Another one could be to export all ODI scenarios with a certain marker, or from specific projects/folders, in an automated way. Exporting Load Plans is yet another: few people realize it, but there is no easy way in ODI to export several Load Plans at once (you may move the desired Load Plans to a folder and then export the entire folder with "Child components export" selected, but that would be considered cheating 🙂 ). Or maybe you just want to do it for the sake of doing something in a dynamic way (if you have already read some of our posts, you know that we like dynamic coding!).

First, let's take a look at the OdiExportObject command from the Toolbox.

1

From Oracle Documentation:
Use this command to export an object from the current repository. This command reproduces the behavior of the export feature available in the user interface.

Great, that is what we want: export any object (even Load Plans, which are not listed in the Oracle documentation) from the current repository. You may read about all its parameters here:

https://docs.oracle.com/middleware/11119/odi/develop/appendix_a.htm#ODIDG805

An example of a Load Plan export would look like this:

2

Two important parameters here: first we have the Object ID, which indicates the object you are about to export. This ID can be found by double-clicking the ODI object and checking its Version tab:

3

The other parameter is the Classname. You may check this one in the Oracle documentation but, as I said before, some class names may be missing from the documentation, like SnpLoadPlan. So the easiest way to check the correct Classname for any ODI object is to export it using the user interface, like below:

4

5

Go to the folder and open the XML file in a text editor. The Classname will be the Object Class right at the beginning of the XML file:

6

OK, but it does not seem very dynamic yet, since we need to pass the Object ID/Classname in order to export the correct object. So here we will use two of our favorite techniques to make it dynamic: Command on Source/Target and ODI metadata repository SQL. This is how it works: we will create an ODI procedure whose "Command on Source" tab contains a SQL query against the ODI metadata repository (returning all the objects that we want to export), and whose "Command on Target" tab contains the OdiExportObject command that actually exports the objects.

Let's begin with the "Command on Source" tab. First, create a connection to your ODI work repository and define a Logical Schema for it. In the Command, add the SQL that meets your requirement (in this example, retrieve all Load Plans that were created/modified since last week):

7

Our query needs to return three columns: OBJECT_ID, OBJECT_CLASS, and FILE_NAME. This information will be passed to the "Command on Target" tab to identify which objects need to be exported.
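The actual query is in the screenshot above; as a sketch of what it might look like, here is one possibility. Treat the repository column names (I_LOAD_PLAN, LOAD_PLAN_NAME, FIRST_DATE, LAST_DATE) as assumptions to verify against your own work repository, since they can vary by ODI version:

```sql
SELECT lp.i_load_plan              AS object_id,
       'SnpLoadPlan'               AS object_class,
       lp.load_plan_name || '.xml' AS file_name
  FROM snp_load_plan lp
 WHERE lp.first_date >= SYSDATE - 7   -- created since last week
    OR lp.last_date  >= SYSDATE - 7   -- or modified since last week
```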

Now we need to add OdiExportObject to the "Command on Target" tab, and this is pretty simple to do. Every ODI tool found in the Toolbox can be added to an ODI procedure and called under the "ODI Tools" technology. If you are not sure how to do it, a good tip is to add the ODI tool to an ODI package, set its parameters as you normally would, and click the "Command" tab, like below:

8

9

Now just copy the command text and add it to your procedure in the “Command on Target” tab, selecting “ODI Tools” as its Technology:

10

As you can see, we have added three # variables here that will receive the information from the "Command on Source" tab. When you run this procedure, if 10 Load Plans were created/modified since last week, all 10 will be exported to the EXPORT_DIR folder.
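For reference, the "Command on Target" text ends up looking something like this (the EXPORT_DIR path here is just an example; check the OdiExportObject documentation for the full parameter list):

```
OdiExportObject "-CLASS_NAME=#OBJECT_CLASS" "-I_OBJECT=#OBJECT_ID"
 "-EXPORT_DIR=D:\EXPORT_DIR" "-FILE_NAME=#FILE_NAME" "-FORCE_OVERWRITE=YES"
```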

In this example we queried the SNP_LOAD_PLAN table in order to get all the Load Plan information. Luckily, the ODI table names are very similar to their Classnames, so they should not be hard to find. Here is a list of the most common objects that you will likely export from ODI:

Capture

That’s it guys. I hope you liked it! See ya!

Tips and Tricks: Working with ODI Variables and Global Parameters

Posted in ODI Architecture with tags , , , , , , , , , , , , on September 7, 2015 by RZGiampaoli

Hi guys, today we'll talk about a very simple but powerful technique that we always use in our integrations. It joins two concepts together and makes our lives a lot easier and our integrations a lot more dynamic. We are talking about variables and the concept of "global" parameters.

In our integrations we never, ever have anything hard coded. Every time you hard code something, it will come back to bite you in the future, that is for sure.

So the first thing we do in a project is to create a table that we call ODI_PARAMETER. This table will contain all the configurations and parameters that would otherwise need to be hard coded, and so on.

I like to create this table in our work schema (to make it easier to use), and it looks like this:

ODI_PARAMETER Table

The "SESSION_NM" column is used to make a variable reusable in all the scenarios that we want, meaning we'll have only one variable for all packages in the project, or even for all projects (if we make the variable global in ODI).
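The screenshot above shows our actual layout; as a sketch, a minimal version of the table could be created like this (the column sizes are just a suggestion, and the columns match the ones used later in this post):

```sql
CREATE TABLE odi_parameter (
  session_nm      VARCHAR2(400),   -- session name, or a global key as shown later
  parameter_type  VARCHAR2(100),   -- grouping of the parameter (e.g. LOG)
  parameter_name  VARCHAR2(100),   -- name of the parameter (e.g. LOG_PATH)
  parameter_value VARCHAR2(4000)   -- the value the variable will return
);
```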

How does it work? First of all, we need to get the "Session Name" for our "Scenario/Package". Why did I say "Scenario/Package"? Because the result can change depending on whether you are running a scenario or a package. Let me explain.

To get the "Session Name" in ODI we use an ODI substitution method called "odiRef.getSession". This method has other parameters that can return the session ID and other information, but what matters for us is the "SESS_NAME" parameter, which returns the name of the session: the same name that appears in the Operator when we run any object in ODI.

Why did I say object? Because if you run a variable, the session name will be the variable name. If you run an interface, the session name will be the interface name; the same goes for procedures, packages, and scenarios. That is why I wrote "Scenario/Package": if we do not pay attention, the name of the package will be different from the name of the scenario, causing a problem when we run one of them.

Let me show you how it works. First, we'll create a global ODI variable called SESSION_NM (it could be whatever name you want, I just like to call it like this) and we'll put this code inside it:

SESSION_NM Variable
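The screenshot holds the real code; a minimal sketch of such a refresh query (assuming an Oracle logical schema for the variable) would be:

```sql
SELECT UPPER('<%=odiRef.getSession("SESS_NAME")%>') FROM dual
```

The UPPER will matter in a moment, when we compare package runs with scenario runs.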

After that, we will run this variable to see the results:

SESSION_NM Results

As we can see, the value of the variable was the name of the variable itself. Now let us create a package, put this variable inside it, and see what happens:

Package test 1

Here is what it looks like, along with its results:

Package test 1 results

As we can see, the result of the variable is the same as the session name, but in UPPER case, since I created the variable like this. But why did I do that? Let me create a scenario from this package to show you why:

Scenario Creation

And this is why I make the variable return its result in UPPER case, and why I said we need to worry about some peculiarities regarding scenarios and packages. When you create a scenario, it will have the name of the package in UPPER case and also NO SPACES. Now, if we run the just-created scenario, we will have:

Scenario results

Meaning, if we use the result of this variable to look up data in a table, we'll have a problem, because it will not find the same row when you run the package as when you run the scenario of that package.

The easiest way to resolve that is to name the main scenario (the scenario that will contain all the other scenarios) with no spaces and no special characters (ODI also transforms special characters, like %, into _).
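As a rough illustration of that naming rule (this is just our sketch of the behavior described above, not an official ODI API), the transformation ODI applies to a package name when generating a default scenario name looks roughly like this:

```python
import re

def scenario_name(package_name: str) -> str:
    # Upper-case the name and replace anything that is not a letter, digit,
    # or underscore with an underscore, mimicking (roughly) how ODI derives
    # a default scenario name from a package name.
    return re.sub(r"[^A-Z0-9_]", "_", package_name.upper())
```

So a package called "Package test 1" becomes a scenario called "PACKAGE_TEST_1", which is exactly why the SESSION_NM variable also upper-cases its result.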

Doing that, we are good to continue, as we can see below:

Package results

Now we have the same results if we run the package or the scenario.

OK, next let us create another variable to return the LOG_PATH: the path where we will store all the logs from our integrations. The code that we will use for this variable is:

Query ODI_PARAMETER
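The screenshot contains the real query; a sketch of what it might look like follows. The PARAMETER_TYPE/PARAMETER_NAME values are just the ones we chose for this example:

```sql
SELECT parameter_value
  FROM odi_parameter
 WHERE session_nm     = '#SESSION_NM'
   AND parameter_type = 'LOG'
   AND parameter_name = 'LOG_PATH'
```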

As we can see, we are using the result of the "SESSION_NM" variable inside this "LOG_PATH" variable. This is what makes the variable reusable in all "Packages/Scenarios/Procedures". Let us insert a value into our ODI_PARAMETER table and run the package to see the results:

Insert Test 1

Package 1 Results

Now let us create a new package with a different name, use the same variable as above, and insert a new line in our ODI_PARAMETER table for the new interface:

Package 2 results

See, same code, two different results. That means 90% of the interfaces just need to be duplicated, the parameters inserted in ODI_PARAMETER for the new interface, and it is done. Also, we do not need a ton of variables to get different results. And there is more.

The code of the variable also does not change much. For a new variable, we just need to duplicate the LOG_PATH variable and change the PARAMETER_TYPE, PARAMETER_NAME, and PARAMETER_VALUE filters to get any other information from ODI_PARAMETER. That means it is easier to maintain.

But let us not stop here. In this example, we are getting the LOG_PATH for the logs of our integrations. Normally this path does not change from integration to integration; what changes is the name of the integration that we are logging, right? So, with our SESSION_NM variable, we could store just the root of our log folder in the LOG_PATH variable and then use it like this:

#LOG_PATH\#SESSION_NM

This would make the LOG_PATH the same for all integrations. However, with the way we created our variable, we would still need to insert one line for each integration in our ODI_PARAMETER table.

Well, we just need to change our variable's code a little bit to create the concept of GLOBAL parameters. Here is how it works.

First, we will delete the two lines we just created and insert just one line in the ODI_PARAMETER table:

Insert Global

Now we just need to change the code from our LOG_PATH variable to this:

Query ODI_PARAMETER global
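Again, the actual code is in the screenshot; one way to implement the fallback it describes (a sketch, assuming the same columns and a row whose SESSION_NM is the literal 'GLOBAL') is to prefer the session-specific row and fall back to the global one:

```sql
SELECT parameter_value
  FROM (SELECT parameter_value
          FROM odi_parameter
         WHERE session_nm IN ('#SESSION_NM', 'GLOBAL')
           AND parameter_type = 'LOG'
           AND parameter_name = 'LOG_PATH'
         ORDER BY CASE session_nm WHEN '#SESSION_NM' THEN 1 ELSE 2 END)
 WHERE ROWNUM = 1
```

The ORDER BY plus ROWNUM = 1 means a session-specific row, when present, always wins over the GLOBAL row.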

And here we go:

Global results

We have one global parameter that can be used by all integrations. And the cool thing is that the code above first tests whether we have a parameter for the actual SESSION_NM; if not, it gets the value from the GLOBAL parameter. Meaning, if any integration needs a special LOG_PATH or anything else, you just need to insert a new line in ODI_PARAMETER to override the value for that integration alone:

Global results exceptions

This will guarantee that you never, ever need to touch your code again to test or change anything the business asks you for.

As I said, it is a simple but very powerful technique to use.

Hope you guys enjoy and see you soon.

Do you know about ODI project on java.net?

Posted in EPM, ODI, ODI 12c, ODI Architecture with tags , , , on January 8, 2015 by Rodrigo Radtke de Souza

Hi all, today's post is just a quick tip about the ODI project on java.net. I am often asked if I know a place with different KMs for ODI besides the ones that come within its installation folder. Every time I mention the ODI project on java.net, people's reaction is always the same: "Wow, that's very nice! I didn't know that!".

The ODI project on java.net is a collaboration site that contains a lot of code samples made for ODI. It contains docs, KMs, Open Tools, procedures, samples, user functions, utilities, and so on. Also, if you have something that you think would be useful for others, you may collaborate with the community by adding your code there. Here is the link:

https://java.net/projects/oracledi

You just need to create an account and you are good to go! Enjoy!

Follow us on Twitter: @RodrigoRadtke @RZGiampaoli @DEVEPM

ODI Substitution Tags demystified

Posted in ODI, ODI Architecture with tags , , , , on September 16, 2014 by Rodrigo Radtke de Souza

Hi all, I bet every ODI developer has already gotten confused when working with ODI substitution tags. Every time I need to work with them, I have to repeat all my previous tests because I forget which tag is the right one to use (even after many years working with ODI…). That is one of the reasons I decided to write this post: so I can use it myself as a future reference (memory, memory, where are you???). Also because it is nice to share our experiences with others, so they can benefit from it (at least I hope so 🙂 ) if they face similar issues.

For all those who are not familiar with them: what are ODI substitution tags? Substitution tags are those characters that you find when you open a Knowledge Module or a procedure in ODI, like these: <%>, <?>, <@>, and <$>. Here is one example:

1

Those tags are part of the ODI Substitution API, and they are used to produce dynamic code based on several things, like ODI variables, procedure options, data models, the interface diagram, and so on. Basically, the Substitution API is the greatest development power in ODI (that is why I love ODI so much), since it can produce dynamic code based on the existing information around it. Oracle has good documentation about it here: Oracle Doc

Although Oracle talks about the Substitution API in its documentation, it does not say much about the tags themselves, and its examples only use the most common tag, <%>. But in reality we have four types of tags, each one behaving in its own way. Basically, each one of them is executed in a specific parse order, which affects their results more than you could imagine.

I'll explain each one of them using a real situation that happened to me some time ago. I was working on a project that contained several different servers (ODI agents) with different operating systems on them. I was developing a dynamic component that had to change its behavior based on the OS it was going to be executed on: if the server OS was Linux, I would execute one particular command, and if it was Windows, another. To test which OS the component was running on, I used the following Substitution API command with the following tag (the only one that I knew back then):

<%= System.getProperty("os.name") %>

It did not work as I expected. The result of this command was always the OS name of the machine from which I was launching the execution (the ODI client), not the one where the command was going to be executed (the ODI agent). It puzzled me, so I started to study which options I could use… and that is when I learned about the <?>, <@>, and later on <$> tags. Each one of them has a different effect on ODI code generation, because each one of them has its own parse phase! This is a very simple concept, and it truly affects the way ODI generates its dynamic code.

So let me show you what I did to test all those options. First, consider that my ODI client is on a Windows 2003 OS and the ODI agent is on a Windows Server 2008 R2 OS (the Linux OS was no longer available to me, but Windows 2008 will suffice for this example, since we just need different types of OS). I created the following procedure:

2

It is a simple procedure with Jython code that will "raise" the name of the OS as an error. I also marked "Ignore Errors", so I could add more steps to this procedure. I did the same for all substitution tags:
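The step body is essentially one line per tag; for the <%> version it looks something like this (a sketch of the step in the screenshot; the other steps just swap the tag characters):

```
raise '<%=System.getProperty("os.name")%>'
```

By the time the Jython step runs, ODI's parser has already replaced the tag with the actual OS name, so raising it surfaces the value as the step's error message in the Operator.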

3

Then we execute the procedure, making sure that it uses the ODI agent (which resides on a different OS):

4

Let’s begin with <%>. After you execute the procedure and go to “Code” tab, we have the following:

5

Very interesting, and it matches my previous experience. The <%> tag is returning the ODI client OS, which indicates that the Substitution API is generating the command when it is sent to the agent, but before it gets to the agent! Obviously, if we go to the "Definition" tab, we will get the execution of this command, which will be "Windows 2003":

6

Let's see what we get using the <?> substitution tag:

7

Nice! This is exactly what I wanted before: get the ODI agent OS and decide which command was going to be executed based on that OS. So ODI is generating the code on the agent server, before the command is sent to the Operator and before ODI variables are substituted (what, Operator, variables??? More on those topics in a few moments…). If we go to the "Definition" tab, we will get the execution of this command, which will be "Windows Server 2008 R2":

8

Ok, let’s see <@>:

9

Hmm… more interesting stuff, and that is why I mentioned the Operator/ODI variables before. This code is being generated on the agent, but after the command appears in the Operator and after ODI variables are substituted. When we go to the "Definition" tab, we will get the execution of this command, which will be "Windows Server 2008 R2" again:

10

The <@> tag is great for evaluating ODI variable contents and, depending on the variable value, doing something different in your code, like the example below:

11

The interface filter in this example will change according to the #V_COND variable. If #V_COND is 1, it will filter all records with BANK_ACCOUNT_TYPE = '1'. If #V_COND is different from 1, it will load all records.
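The filter in the screenshot follows the usual <@> pattern of embedding a small Java snippet that decides which SQL text gets emitted; along these lines (a sketch of the idea, not the exact code from the image):

```
<@ if ("#V_COND".equals("1")) { @>BANK_ACCOUNT_TYPE = '1'<@ } else { @>1 = 1<@ } @>
```

Because <@> is parsed after ODI variables are substituted, #V_COND already holds its runtime value when the if is evaluated.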

Those three examples show us perfectly that each tag executes in a different parse phase and each one has precedence over the others. And now you must be thinking: does it mean that we can mix and match those tags and create even more dynamic code??? Oh yes! Here you can see great examples from the Sonra guys:

Sonra Part 1

Sonra Part 2

They show us how we can create "loops" using one tag and apply the result to another tag that gets executed right after, which gives us an extremely dynamic tool for development. I already told you that I love ODI, right?

OK, but what about <$>? I left this one for the end because this tag was added later to the game (it seems to be available from version 11.1.1.6 on). This tag resides between <?> and <@>, which gives us the unique ability to generate the code on the agent server, before the command is sent to the Operator but AFTER ODI variables are substituted. OK, this is crazy stuff 🙂 Why do we need it? The key here is that the code will contain all ODI variables already evaluated (meaning their corresponding refresh/set values) and will be displayed in the Operator already parsed with the correct ODI variable logic, which makes your code much more readable for Operator users. Besides that, having a fourth parse phase is great for doing even crazier dynamic self-constructed code (long live dynamic coding!!!). Here is "the same" example using <$> in an ODI interface from the Oracle ODI blog (the place where I learned about the existence of <$>):

Oracle Blog

In our OS example, the <$> result will not be different from what we got using <?>:

12

And in Definition tab:

13

That's it, folks. So, in summary, here is the list of substitution tags and their parse phase order:

<%>: generates the command when it is sent to the agent, but BEFORE it gets to the agent;

<?>: generates the code in the agent server, BEFORE the command is sent to Operator and BEFORE ODI variables are substituted;

<$> (available from 11.1.1.6 on): generates the code in the agent server, BEFORE the command is sent to Operator but AFTER ODI variables are substituted;

<@>: generates the code in the agent server, AFTER the command appears in the Operator and AFTER ODI variables are substituted.

Hope you have enjoyed it! See you next time!

Really using ODI 12c for Data Integration with Oracle Hyperion Planning/Essbase

Posted in EPM, Hacking, Hyperion Essbase, Hyperion Planning, Kscope 14, ODI, ODI 12c, ODI Architecture with tags , , , , , on July 18, 2014 by RZGiampaoli

At Kscope 14, Oracle announced that ODI 12c would not support Planning/Essbase and HFM. As you can imagine, this bomb made a lot of noise in the EPM world.

Because we did not like the ODI 12c interface, we did not care too much about this announcement, but after Oracle posted this: https://blogs.oracle.com/dataintegration/entry/using_odi_12c_for_data, we heard a lot of users complaining about it.

We also thought a little about our dear users and imagined what our beautiful ODI environment, which orchestrates the entire EPM environment, would look like if we had to create a lot of extra steps to export data from tables to files and use CMD commands to load them into Planning/Essbase/HFM.

We decided to run some tests to see if we could use the ODI 11 KMs with minimal changes (in order to increase the acceptance of this method), and do our part to help the EPM world.

After a couple of hours (it was more like 6 hours…), we figured out how to make the KMs from ODI 11 work with minimal changes in ODI 12. Let us stop talking and start showing what we need to change to make ODI 12 compatible with EPM.

Technologies

We do not need to import or change anything in the Technologies tree, because Oracle left all the technologies for the EPM tools there.

Jars

We do not need to change or copy any JAR, because Oracle was kind enough to keep them there. The only thing we need to do is import the EPM KMs from ODI 11 into ODI 12.

KMs

We have seven KMs that we need to import from ODI 11.

For Essbase:

RKM Hyperion Essbase

LKM Hyperion Essbase DATA to SQL

LKM Hyperion Essbase METADATA to SQL

IKM SQL to Hyperion Essbase (DATA)

IKM SQL to Hyperion Essbase (METADATA)

For Planning:

RKM Hyperion Planning

IKM SQL to Hyperion Planning

After we import the KMs, we need to change a few things in them.

RKMs

In the RKMs we see the first changes Oracle made to the way ODI works. In version 11, ODI used the <%=odiRef.getModel("ID")%> substitution method to get the ID of the model that we would like to reverse. If we take a look at the master repository in ODI 12, we will see a new column in the table:

Master Repository

This is the first thing we need to update in the RKM. ODI 11 used the ID, and if you try to use it as is, a validation Oracle created makes the interface fail (it was a good error message, because we could figure out the problem by reading it).

Global ID Error

So, basically, we need to update it to the new GLOBAL_ID.

RKM Hyperion Essbase

Step Reset MetaData:

We need to update the code:

SnpsReverseResetTable -MODEL=<%=odiRef.getModel("ID")%>

To

SnpsReverseResetTable -MODEL=<%=odiRef.getModel("GLOBAL_ID")%>

Step Start the Reverse:

We need to update the code to the new standard and also enclose the substitution method in double quotes, because it now returns a string instead of a numeric value.

imod = <%=snpRef.getModel("ID")%>

To

imod = "<%=snpRef.getModel("GLOBAL_ID")%>"

Step Set MetaData:

We need only to update to GLOBAL_ID.

SnpsReverseSetMetaData -MODEL=<%=snpRef.getModel("ID")%>

To

SnpsReverseSetMetaData -MODEL=<%=snpRef.getModel("GLOBAL_ID")%>

RKM Hyperion Planning

For the Planning KM, we just need to update the code in exactly the same way we did for the Essbase KM.

Step Reset MetaData:

SnpsReverseResetTable -MODEL=<%=odiRef.getModel("ID")%>

To

SnpsReverseResetTable -MODEL=<%=odiRef.getModel("GLOBAL_ID")%>

Step Start the Reverse:

imod = <%=snpRef.getModel("ID")%>

To

imod = "<%=snpRef.getModel("GLOBAL_ID")%>"

Step Set MetaData:

SnpsReverseSetMetaData -MODEL=<%=snpRef.getModel("ID")%>

To

SnpsReverseSetMetaData -MODEL=<%=snpRef.getModel("GLOBAL_ID")%>

After this, just create a model and reverse something to test it. RKM Execution

Results are these:

Planning Reverse

Essbase Reverse

With this, the RKM part is ready and we can start to create interfaces. Sweet.

IKMs

The IKMs were the easiest part. Oracle created some categories to group similar KMs, and we just need to choose one of the three possibilities to make the KM show up in the Mappings (the new interface object of ODI 12): Integration Type

We chose "Control Append" since the EPM IKMs do not have any integration strategy, but it really does not matter which one you choose (I just like things organized).

IKM SQL to Hyperion Essbase (DATA)

We need to double click the KM and select the integration type: KM Integration type selection

IKM SQL to Hyperion Essbase (METADATA)

For the metadata we need to do the same thing: add the integration type (we chose "Control Append" again).

IKM SQL to Hyperion Planning

Same thing for the Planning IKM: add the integration type (we chose "Control Append" again). IKMs executions

We are done with the IKM part.

LKMs

LKM Hyperion Essbase DATA to SQL

There is no change in this KM. Works as is. I like it.

LKM Hyperion Essbase METADATA to SQL

This was the tricky one, and not because it's too different from the others. In fact, if it were not for a BUG in the API, it would not need any changes at all to work (on Monday we will open an SR with Oracle about this bug).

The thing is: in the step "Begin Essbase Metadata Extract" we have a line with the following command: stagCols = '<%=snpRef.getColList("", "[CX_COL_NAME]=[COL_NAME] ", ", ", "", "")%>'

This command basically creates a mapping between the temporary table and Essbase to start the extract. Essbase and its API are case sensitive with this information, and the pattern COL_NAME in this version seems to be bugged or to have changed its behavior (which I do not believe is the case, as I will explain later).

In this version, instead of bringing something like this:

stagCols = 'PARENTNAME=ParentName, MEMBERNAME=MemberName, ALIAS=Alias

It brings something like this:

stagCols = 'PARENTNAME=PARENTNAME, MEMBERNAME=MEMBERNAME, ALIAS=ALIAS

The pattern always returns uppercase, even when you change the execution to the Stage or Target areas, and because of this we get an error saying that the member PARENTNAME is not valid for the dimension XXXXX (this does not happen in ODI 11 if you set the execution to the Stage or Target areas).
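The behavior can be reproduced with a small mock in plain Python (`get_col_list` and the column list are made-up stand-ins for `snpRef.getColList`, which actually reads the repository metadata):

```python
# Made-up mock of snpRef.getColList() to illustrate the bug; the real ODI
# substitution API works on the repository metadata, not on a Python list.
columns = [("PARENTNAME", "ParentName"), ("MEMBERNAME", "MemberName"), ("ALIAS", "Alias")]

def get_col_list(pattern, sep, preserve_case):
    # expand the pattern once per column, like getColList does
    out = []
    for cx_name, col_name in columns:
        name = col_name if preserve_case else col_name.upper()
        out.append(pattern.replace("[CX_COL_NAME]", cx_name).replace("[COL_NAME]", name))
    return sep.join(out)

# what ODI 11 produced (case preserved) vs. what ODI 12 produces (uppercased)
expected = get_col_list("[CX_COL_NAME]=[COL_NAME]", ", ", preserve_case=True)
buggy = get_col_list("[CX_COL_NAME]=[COL_NAME]", ", ", preserve_case=False)
print(expected)  # PARENTNAME=ParentName, MEMBERNAME=MemberName, ALIAS=Alias
print(buggy)     # PARENTNAME=PARENTNAME, MEMBERNAME=MEMBERNAME, ALIAS=ALIAS
```

The second string is what the LKM generates in ODI 12, and it is exactly the mapping that Essbase rejects as an invalid member name.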

Anyway, we started to test the behavior and found out that in ODI 12 it is impossible (for now) to create an Oracle table like this: Oracle Model test

The pattern COL_NAME always returns uppercase, no matter what, and it removes the double quotes from the expression. This is why we think it is a bug: it also removes a functionality from the Oracle DB.

For a quick workaround (yes, we know it's ugly, but Oracle will fix this in no time and we'll be able to get rid of it) we changed the code from:

stagCols = '<%=snpRef.getColList("", "[CX_COL_NAME]=[COL_NAME] ", ", ", "", "")%>'

To

stagCols = '<%=snpRef.getColList("", "[CX_COL_NAME]=[COL_DESC] ", ", ", "", "")%>'

After this, we just need to copy each column name to its description inside the attributes of the model: Models Details

Double clicking a column opens its properties; in the Description tab we only need to copy the name of the column into its description, because COL_DESC will return exactly what is written there (with one exception: a single or double quote is turned into a space, but we will not use quotes here). Description workaround

This workaround does the trick: LKMs Executions

Yes, we know we could use a lot of other methods to do this, Flexfields for example, or even make some code changes in the KM, but the idea here is to show that it is possible to use ODI 12 with minor changes. With this we can use ODI 12 in our EPM environment just like we do with version 11.
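The workaround relies on COL_DESC echoing whatever is typed in the Description field. A minimal mock of that behavior (made-up Python, not the real substitution API) shows why copying the column names there restores the case-sensitive mapping:

```python
# Made-up mock of the COL_DESC workaround; not the real substitution API.
# COL_DESC returns the text typed in the column's Description field, and ODI
# turns any single or double quote in that text into a space.

def col_desc(description):
    return description.replace('"', " ").replace("'", " ")

# after copying each column name into its Description field in the model:
descriptions = [("PARENTNAME", "ParentName"), ("MEMBERNAME", "MemberName"), ("ALIAS", "Alias")]

stag_cols = ", ".join("%s=%s" % (cx, col_desc(desc)) for cx, desc in descriptions)
print(stag_cols)                # PARENTNAME=ParentName, MEMBERNAME=MemberName, ALIAS=Alias
print(col_desc('Parent"Name'))  # Parent Name  (quotes become spaces)
```

Since the description text passes through untouched (quotes aside), the generated stagCols string keeps the mixed case that Essbase expects.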

We didn't test HFM here because we don't have it installed in our infrastructure, but the amount of changes for HFM is probably the same as for Essbase and Planning.

We hope you guys like this, and please give us feedback if you implement this solution. As far as we have tested it, it should work exactly like ODI 11.

See you next time.

——————Edited on 07/22/2014——————-

About the case-sensitivity issue of [COL_NAME]:

We installed the ODI 12.1.2.3 today and repeated the tests we did in the previous version.

The [COL_NAME] for the IKM is working, which means the issue of creating an Oracle table with two columns of the same name but different case is solved.

But the [COL_NAME] for the LKM still doesn't work. All LKMs have the same behavior, and we can see that in the logs when we try to replicate the IKM behavior in an LKM.

LKM test / IKM test

As we can see, the behavior was fixed in the IKM but not in the LKM. This happens because KMs are executed in different ways depending on their type (IKM, LKM, CKM), as we can see here:

http://docs.oracle.com/middleware/1212/odi/ODIKD/odiref_reference.htm#CIADFCFJ

For now, please stick with our workaround until Oracle fixes this in the LKMs.

See you next time.