Archive for the Oracle Category

Automating Essbase Copy Outline Operation using Java API

Posted in ACE, BSO, Cubes, Essbase, Hacking, Hyperion Essbase, Java, Migration, Oracle on August 9, 2017 by RZGiampaoli

Hi guys, how are you? Have you ever tried to automate the process of copying a cube outline from one application to another?

Well, there’s an easy way to do that: basically, you copy the .otl file from the server file system over the one in the other cube. The problem is that if the target cube is not empty, the database becomes corrupted, since we just replaced its .otl file with a foreign .otl file (no restructure happened).

So if you want to copy the outline to an existing cube that has data, this is not a solution.

The thing is, the only two ways to do what we want are the EAS “Save As” operation and the Migration Wizard. Both operations work because they copy the .otl file as an .otn file and then run a restructure on the database. The restructure synchronizes the cube with the new outline, making the process safe for a cube that has data in it.

The problem is that neither of these can be automated, and there’s no way to do this operation using MaxL or ESSCMD.

In fact, even using the Java API it’s hard to figure out how to do it, because all the copy methods seem to copy every kind of object but the outline.

The good news is, after hours of frustration and tears, we figured out a way to replicate the “Save As” operation using the Java API…

Here we go:

Save As Java code

The code is really simple. We need to connect to the Essbase server, lock the target outline (the one we’ll overwrite) and then copy the outline from one application to the other. To do that we are going to use the methods “lockOlapFileObject” and “copyOlapFileObjectToServer”.

The process we just described will create an .otn file in the target cube. Now comes the big catch of this code (which is not documented anywhere):

If we open the target outline in EAS at this point, we will still see the old metadata. To commit the changes, we need to perform a restructure that merges the new outline (.otn) with the old one (.otl), updating the metadata.

To do that we are going to use the methods of the “IEssCubeOutline” class: “open”, “restructureCube” and “close” on the target outline.
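Putting it all together, here is a minimal sketch of the whole routine. The credentials, provider URL and application/database names are placeholders, and the exact signatures of lockOlapFileObject and copyOlapFileObjectToServer should be double-checked against the JAPI documentation for your Essbase version:

import com.essbase.api.base.EssException;
import com.essbase.api.datasource.IEssCube;
import com.essbase.api.datasource.IEssCubeOutline;
import com.essbase.api.datasource.IEssOlapFileObject;
import com.essbase.api.datasource.IEssOlapServer;
import com.essbase.api.session.IEssbase;

public class CopyOutline {
    public static void main(String[] args) throws EssException {
        IEssbase ess = IEssbase.Home.create(IEssbase.JAPI_VERSION);
        try {
            // placeholder credentials, provider URL and server name
            IEssOlapServer srv = ess.signOn("admin", "password", false, null,
                    "http://server:13080/aps/JAPI", "essbaseServer");

            // lock the target outline and copy the source outline over it;
            // this creates an .otn file on the target database
            srv.lockOlapFileObject("TGT_APP", "TGT_DB",
                    IEssOlapFileObject.TYPE_OUTLINE, "TGT_DB");
            srv.copyOlapFileObjectToServer("TGT_APP", "TGT_DB",
                    IEssOlapFileObject.TYPE_OUTLINE, "TGT_DB",
                    "SRC_APP", "SRC_DB", "SRC_DB", false);

            // the restructure merges the new .otn into the .otl, keeping all data
            IEssCubeOutline otl = srv.getApplication("TGT_APP")
                                     .getCube("TGT_DB").openOutline();
            otl.restructureCube(IEssCube.EEssRestructureOption.KEEP_ALL_DATA);
            otl.close();
        } finally {
            ess.signOff();
        }
    }
}

The restructure with KEEP_ALL_DATA is the piece that merges the .otn into the .otl and makes the operation safe for a database that already contains data.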

That’s it. This process does exactly what the “Save As” in EAS does, which means you can copy outlines from one application to another even when the target database contains data.

I hope you guys enjoy and see you soon.


Kscope 17 is approaching fast!!! And we’ll be there!

Posted in ACE, Data Warehouse, Essbase, Hyperion Essbase, Java, Kscope 17, ODI, ODI Architecture, Oracle, Performance, Tips and Tricks, Uncategorized on June 8, 2017 by RZGiampaoli

Hi guys, how are you? We are sorry for being away for so long, but this year we have a lot of exciting things going on, so let’s start with what we’ll be doing at Kscope 17!

This year we’ll present 2 sessions:

Essbase Statistics DW: How to Automatically Administrate Essbase Using ODI (Jun 28, 2017, Wednesday, Session 12, 9:45 am – 10:45 am)

In order to have a well-performing Essbase cube, we must stay vigilant and follow its growth and its data movements so we can distribute caches and adjust the database parameters accordingly. But this is a very difficult task to achieve, since Essbase statistics are not temporal and only tell you what the cube statistics are at that specific point in time.

This session will present how ODI can be used to create a historical statistics DW containing the Essbase cubes’ information, and how to identify trends and patterns, giving us the ability to programmatically tune our Essbase databases.

And…

Data Warehouse 2.0: Master Techniques for EPM Guys (Powered by ODI) (Jun 26, 2017, Monday, Session 2, 11:45 am – 12:45 pm)

EPM environments are generally supported by a Data Warehouse; however, we often see that those DWs are not optimized for the EPM tools. Over the years, we have seen that modeling a DW with the EPM tools in mind may greatly increase the overall architecture performance.

The most common situation found in several projects is that the people who develop the data warehouse do not have great knowledge of EPM tools, and vice versa. This can create a big gap between the two concepts, which may severely impact performance.

This session will show several techniques for modeling the right Data Warehouse for EPM tools. We will discuss how to improve performance using partitioned tables, how to create hierarchical queries with “Connect by Prior”, the correct way to use multi-period tables for block data loads using Pivot/Unpivot, and more. And if you want to go even further, we will show you how to leverage all those techniques using ODI, which creates the perfect mix to perform any process between your DW and EPM environments.

In these presentations you can expect a lot of technical content, some very good tips and some very good ideas to improve your EPM environment!

Also, I’ll be graduating from this year’s leadership program, and we’ll be all over the place with the K-Team, a special team created to make newcomers feel more welcome and help them get the most out of Kscope.

Rodrigo will also be at the Tuesday Lunch and Learn for the EPM Data Integration track in Cibolo 2/3/4.

And of course we will be around, having fun and gathering new ideas for next year!!!

And last but not least, this year we’ll have a friend of ours making his first appearance at Kscope with the presentation OBIEE Going Global! Getting Ready for More Than +140k Users (Jun 26, 2017, Monday, Session 4, 3:15 pm – 4:15 pm).

A standard Oracle Business Intelligence (OBIEE) reporting application can hold roughly 1,200 users. This may be a reasonable number of users for the majority of the companies out there, but what happens when an IT leader like Dell decides to acquire another IT giant like EMC and all of their combined 140,000-plus users need access to an HR OBIEE instance? What does that setup look like? What kind of architecture do we need to support those users in a fast and reliable way?
This session shows the complexity of Dell’s OBIEE environment, describing all the processes and steps performed to create such an environment, meeting the most varied needs from business demands and L2 support, always aiming to improve environment stability. The architecture relies on a range of different technologies to support that huge number of end users, such as LDAP, Kerberos, SSO, SSL, BigIP, shared folders using NAS, and WebLogic running in a cluster across four application servers.
If the challenge was not hard enough already, all of this setup also needed to consider Dell’s legacy OBIEE upgrade from v11.1.1.6.9 to v11.1.1.7.160119, so we will explain the pain points, considerations and orchestration needed to do all of this in parallel.

Thank you guys and see you there!


OTN Article: Building a 100% Cloud Solution with Oracle Data Integrator

Posted in ACE, ArchBeat, BICS, DBCS, DEVEPM, EPM, EPM Automate, InfraStructure, ODI, ODI 11g, ODI Architecture, Oracle, Oracle Database, OS Command, OTN, PBCS, Tips and Tricks on January 23, 2017 by RZGiampaoli

Hi guys, how are you? Today I want to share our new OTN article, Building a 100% Cloud Solution with Oracle Data Integrator.
The article covers how to integrate BICS, PBCS, DBCS and ODI, and explains step by step how to create a 100% cloud solution using ODI (everything in the cloud, including ODI :)).

This is a perfect article for companies that are thinking about going cloud and still have some doubts, or that are wondering how to integrate their current infrastructure with the cloud services.

I hope you guys enjoy and see you soon.

ODI 12c new features: Dimension and Cubes! Part 4 (Loading using Surrogate Keys)

Posted in Dimensions, ETL, ODI 12c, ODI Architecture, ODI Mapping, Oracle, Tips and Tricks on December 16, 2016 by RZGiampaoli

Hi guys how are you?

Today we’ll continue the dimensions and cubes series (Part 1, Part 2 and Part 3 here) and we’ll see how to load data using surrogate keys.

After all the setup done in the last post, the only thing left is to create the interfaces and map everything. For the surrogate key version, the interface and the mapping are exactly the same as for the no-surrogate version (as we saw in the previous posts) for both dimensions and facts, which is very nice.

times-surrogate-interface

The interesting part here is what ODI does behind the scenes. In the no-surrogate version, ODI created one mapping for each hierarchy and, at the end, merged everything together into a single table.

no-surrogate-time-operator

For the surrogate key version, ODI also generates one mapping for each hierarchy, but the main difference is that after each one it merges the result with the others. This happens because it needs to get the surrogate key for each level.

time-surrogate-operator

For each level, ODI automatically generates an insert into that level’s stage table, selecting only the rows that do not already exist in the target table (it does that to decrease the amount of data for the merge step, since the merge would insert or update everything and would take more time than necessary).

After the stage table is loaded, the next step is to merge the stage table into the target table. For that, ODI just creates a “Merge”: when matched, it updates the descriptions or attributes, and when not matched, it inserts the new rows with the sequence-generated values for the SK.

In the next level of the hierarchy, ODI repeats the process but joins the Year with the Quarter. ODI keeps doing this for each mapped level until the last one, where instead of a merge with matched and not-matched clauses, it just does a merge with a matched clause (since it knows everything will already be there).

The results will be this:

time-surrogate-table-results

It’s nice that ODI already creates the dimension with an aggregated fact in mind, since we can see that it has some rows with just the year, others with the year and quarter, and the last ones with all the information.

One thing to notice is that the PK is the same as the Month SK. This is because ODI is ready to create SCD type 2 (we’ll do another post to show how it works).

For the fact, the mapping is still the same as in the no-surrogate version, and again the difference is in the results.

fact-surrogate-interface

We can see that in the operator ODI does something really neat this time.

fact-surrogate-operator

MERGE INTO EPM_HPT_ODI_RUN.S_FACT FACT_SURROGATE1_FACT_SURROGATE USING
(SELECT TIME_SURROGATE_FACT_SURROGAT_1.MONTH_SK AS ID_TIME ,
PRODUCT_SURROGATE_FACT_SURRO_1.PRODUCT_SK AS ID_PRODUCTS ,
REGIONS_SURROGATE_FACT_SURRO_1.CITY_SK AS ID_REGIONS ,
SRC_ERP.SALES AS METRIC
FROM ((EPM_HPT_ODI_RUN.SRC_ERP SRC_ERP
LEFT OUTER JOIN
(SELECT TIME_SURROGATE_FACT_SURROGATE.ID_MONTH AS ID_MONTH ,
TIME_SURROGATE_FACT_SURROGATE.MONTH_SK AS MONTH_SK ,
TIME_SURROGATE_FACT_SURROGATE.TIME_PK AS TIME_PK
FROM EPM_HPT_ODI_RUN.S_TIME TIME_SURROGATE_FACT_SURROGATE
WHERE ((TIME_SURROGATE_FACT_SURROGATE.TIME_PK = TIME_SURROGATE_FACT_SURROGATE.MONTH_SK)
AND (TIME_SURROGATE_FACT_SURROGATE.MONTH_SK IS NOT NULL) )
) TIME_SURROGATE_FACT_SURROGAT_1
ON (SRC_ERP.ID_MONTH = TIME_SURROGATE_FACT_SURROGAT_1.ID_MONTH) )
LEFT OUTER JOIN
(SELECT PRODUCT_SURROGATE_FACT_SURROGA.ID_PRODUCT AS ID_PRODUCT ,
PRODUCT_SURROGATE_FACT_SURROGA.PRODUCT_SK AS PRODUCT_SK ,
PRODUCT_SURROGATE_FACT_SURROGA.PRODUCTS_PK AS PRODUCTS_PK
FROM EPM_HPT_ODI_RUN.S_PRODUCTS PRODUCT_SURROGATE_FACT_SURROGA
WHERE ((PRODUCT_SURROGATE_FACT_SURROGA.PRODUCTS_PK = PRODUCT_SURROGATE_FACT_SURROGA.PRODUCT_SK)
AND (PRODUCT_SURROGATE_FACT_SURROGA.PRODUCT_SK IS NOT NULL) )
) PRODUCT_SURROGATE_FACT_SURRO_1
ON (SRC_ERP.ID_PRODUCT = PRODUCT_SURROGATE_FACT_SURRO_1.ID_PRODUCT) )
LEFT OUTER JOIN
(SELECT REGIONS_SURROGATE_FACT_SURROGA.ID_CITY AS ID_CITY ,
REGIONS_SURROGATE_FACT_SURROGA.CITY_SK AS CITY_SK ,
REGIONS_SURROGATE_FACT_SURROGA.REGIONS_PK AS REGIONS_PK
FROM EPM_HPT_ODI_RUN.S_REGIONS REGIONS_SURROGATE_FACT_SURROGA
WHERE ((REGIONS_SURROGATE_FACT_SURROGA.REGIONS_PK = REGIONS_SURROGATE_FACT_SURROGA.CITY_SK)
AND (REGIONS_SURROGATE_FACT_SURROGA.CITY_SK IS NOT NULL) )
) REGIONS_SURROGATE_FACT_SURRO_1
ON (SRC_ERP.ID_CITY = REGIONS_SURROGATE_FACT_SURRO_1.ID_CITY)
) MERGE_SUBQUERY ON ( FACT_SURROGATE1_FACT_SURROGATE.ID_TIME = MERGE_SUBQUERY.ID_TIME AND FACT_SURROGATE1_FACT_SURROGATE.ID_PRODUCTS = MERGE_SUBQUERY.ID_PRODUCTS AND FACT_SURROGATE1_FACT_SURROGATE.ID_REGIONS = MERGE_SUBQUERY.ID_REGIONS )
WHEN NOT MATCHED THEN
INSERT
(
ID_TIME ,
ID_PRODUCTS ,
ID_REGIONS ,
METRIC
)
VALUES
(
MERGE_SUBQUERY.ID_TIME ,
MERGE_SUBQUERY.ID_PRODUCTS ,
MERGE_SUBQUERY.ID_REGIONS ,
MERGE_SUBQUERY.METRIC
)
WHEN MATCHED THEN
UPDATE SET METRIC = MERGE_SUBQUERY.METRIC

ODI automatically joins all our dimensions at level zero (since the higher levels of the dimensions exist for the aggregated facts) to get the surrogate key information and use it in the fact table. This is very nice because in large DWs we’ll have tons of dimensions, and mapping/joining everything manually is very time consuming. The final result is this:

fact-surrgoate-sql-results

A perfect DW created using surrogate keys; in other words, instead of having the dimensions’ PKs in the fact table, we have the SKs (which were generated by a sequence in the dimensions).

To sum up, we think that if you are going to create simple dimensions and simple facts (without surrogate keys or SCD type 2), it’s still nice to use this new feature, since it’s a good way to document and standardize your DW; but if we measure by development time, it’s not worth it, since it’s very time consuming for a simple DW.

Now, if you want to create a DW using surrogate keys or SCD type 2, we found this new feature extremely useful for both documentation and standardization, and because it is a lot faster than doing it manually.

Thanks and see you soon.

PBCS, BICS, DBCS and ODI!!! Is that possible???

Posted in 11.1.1.9.0, 11.1.2.4, ACE, BICS, DBCS, EPM, EPM Automate, ODI, ODI 10g, ODI 11g, ODI 12c, ODI Architecture, Oracle, OS Command, PBCS, Performance, Uncategorized on August 15, 2016 by RZGiampaoli

Hey guys, today I’ll talk a little bit about architecture, cloud architecture.

I just finished a very exciting project in Brazil and I would like to share how we put everything together for a 100% cloud solution that includes PBCS, BICS, DBCS and ODI. Yes, ODI, and still 100% cloud.

Now you may be thinking: how could it be 100% cloud if ODI isn’t cloud yet? Well, it can be!

This client doesn’t have a big IT infrastructure; in fact, almost all of the client’s databases are supported and hosted by providers. But still, the client has the right to a good forecasting and BI tool with a strong ETL process behind it, right?

Thanks to the cloud solutions, we don’t need to worry about infrastructure anymore (well, almost); the only problem is… ODI.

We still don’t have a KM for cloud services, or a cloud version of ODI, so basically we can’t use ODI to integrate cloud tools…

Or can we? Yes we can 🙂

The design is simple:

  1. PBCS: Basically, we work the same way we would if it were a standalone PBCS implementation.
  2. BICS: Same thing here, but instead of using the database that comes with BICS, we contract a DBCS as well and point the DW schema to it.
  3. DBCS: Here’s the trick. Oracle’s DBCS is nothing more than a Linux machine hosted on a server. That means we can install other things on that server, things like ODI and VPNs.
  4. ODI: We just install it the same way we would in an on-premises environment, including the agent.
  5. VPNs: The final touch. We just create VPNs between the DBCS and the client DBs; this way ODI has access to everything it needs.

Yes you read it right, we can install ODI in the DBCS, and that makes ODI a “cloud” solution.

cloud solution

The solution looks like this:

BICS: It’ll read directly from its DW schema in the DBCS.

PBCS: There’s no direct integration between PBCS and DBCS (where the ODI agent is installed), but I found it a lot better and easier to integrate them using EPM Automate.

EPM Automate: With EPM Automate we can do anything we want: extract data and metadata, load data and metadata, execute business rules and more. For now, the easiest way to go is to create a script and call it from ODI, passing it anything you need.
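Just to make the idea concrete, below is a minimal sketch of such a wrapper in Java (which could, for instance, be packaged as a small utility and called from an ODI procedure or OS command step). The epmautomate path, service URL, identity domain and job name are hypothetical, and the exact commands and arguments should be checked against the EPM Automate documentation for your version:

import java.io.IOException;

public class EpmAutomateWrapper {

    // runs one EPM Automate command and fails fast on a non-zero exit code
    private static void run(String... cmd) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.inheritIO(); // stream the EPM Automate output to the caller's log
        int exit = pb.start().waitFor();
        if (exit != 0) {
            throw new IllegalStateException("EPM Automate failed: " + String.join(" ", cmd));
        }
    }

    public static void main(String[] args) throws Exception {
        String epm = "/u01/epmautomate/bin/epmautomate.sh"; // assumed install path on the DBCS box
        run(epm, "login", args[0], args[1],
                "https://planning-mydomain.pbcs.oraclecloud.com", "myIdentityDomain");
        run(epm, "uploadfile", "/tmp/data.csv");   // extract previously produced by ODI
        run(epm, "importdata", "MyDataImportJob"); // data import job defined in PBCS
        run(epm, "logout");
    }
}

The design choice here is simply to fail fast: if any EPM Automate command returns a non-zero exit code, the whole step fails and the ODI scenario can react accordingly.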

VPNs: For each server we need to integrate, we’ll need one VPN created. With the VPNs between the DBCS and the hosts working, using ODI is extremely straightforward: we just need to create the topology as always, reverse-engineer anything we need and work on the interfaces.

And that’s it. With this design you can have everything in the cloud and still have your ODI behind the scenes! By the way, you can do exactly the same thing with ODI on premises, and as a bonus you can get rid of all the VPNs.

In another post I’ll give more detail about the integration between ODI and PBCS using EPM Automate, but I can say it works extremely well and, as far as I know, it is a lot easier than FDMEE (at least for me).

Thanks guys and see you soon.

 

Why you should apply for the ODTUG Leadership Program

Posted in Career, Kscope, ODTUG, Oracle on July 13, 2016 by radk00

Hi all! Everybody that reads our blog knows that we are passionate about ODTUG. We have already written about how it changed our lives, we have a lot of great friends there, we are frequent speakers at sessions/webinars, and Kscope has already become the “must-go” event of the year for us. But since last year it got even more special, because I was selected to participate in the ODTUG Leadership Program, from which I graduated during Kscope16. It was an AMAZING experience, and since the “2017 ODTUG Leadership Class Application” is now open, I highly encourage all of you to apply as well! Let me talk a little bit about my experience there.

The leadership program is a nine-month program where the participants have the opportunity to join monthly webinars covering a range of leadership topics, like leading a user group, being an effective mentor/mentee and work-life balance, to name a few. The participants also join a “project” that is different for each leadership class. Our class’s project was the creation of a new ODTUG community aimed at attracting young professionals to an Oracle career. The result of our project was the brand-new community called “Career Track”, which I wrote a post about here.

But this description covers just the “technical” side of the program: you participate over nine months, have webinars to attend and one project to build. This, in my opinion, is not the most important feature of the program. The most important one is the people you will meet there. You will have contact with such an extraordinary, talented group of people that you will be amazed and honored to be part of such a great team. The networking experience is one of the most powerful I have ever had. Over nine months I met great leaders in the technology field, the entire ODTUG board, Oracle folks and some great new friends from around the globe (yes, the leadership program is international!).


Another great thing about the program is exposure. There are many opportunities within ODTUG to show your work, like webinars, meetups, technical papers and even Kscope sessions. The leadership program will put you in touch with the people who run those opportunities, so you will understand how they work and how you may participate in them as well. The program makes you feel like you “belong” to something greater than your normal job, so you get inspired to do greater things and expose your work to other people.

This is just a very condensed description of what I lived during those nine months, but I really hope it may somehow encourage you to apply for the next class. I totally guarantee that it is worth it and your life will never be the same again!

I’m grateful for all the people I met during the program (all of you are amazing!!!), but I must write here a special THANK YOU to Sarah Zumbrum, our class leader, who guided us through the entire period. Few times in my life have I seen someone so dedicated and passionate about what they were doing. Her actions and words were always those of a true leader, and I’ll always carry her teachings with me in both my personal and professional lives. Thank you again, Sarah!

Applications are now being accepted and the deadline to apply is 8/14/16. Apply here: ODTUG 2016-2017 Leadership Program Application


See ya!

… Kscope 16 has finished and it was amazing!!!

Posted in Kscope 16, ODTUG, Oracle on July 4, 2016 by radk00

Hi all, how are you all doing? Man, it went by too fast… This year’s Kscope was great, but before we realized it, it was already over. We got so involved and busy there that our plans to do some “live blogging” just failed, so we will go with a condensed version of it today (sorry about that!).

This year, for me, was the “networking” year. We spoke with great new people, met with old friends, recorded an OTN interview, and went to happy hours, parties, after-parties, receptions, graduations and so on. It was a great social event for us this year. Luckily, we have the option to watch all the Kscope 16 recorded sessions afterwards (you too, if you attended Kscope 16 or if you are an ODTUG full member), because there were some great sessions that I really wanted to attend but had no feasible time to do so.

Our two sessions were just amazing! We got a very good number of attendees (thanks to all who were there!) and the feedback about them during the event was just great! We are very happy that we could deliver some good information to all of you, and this kind of feedback motivates us to keep going and bring more and more content to you all.

If you want to take a look at our sessions’ slides, check the links below:

Take a peek at Dell’s smart EPM global environment


Incredible ODI tips to work with Hyperion tools that you ever wanted to know


This Kscope was also very special to me because I was graduating from ODTUG’s leadership program. It was an amazing experience, and later this week I’ll write a post talking about my experiences there and how to apply if you are interested in doing so. It was a nine-month program with some great people whom I was blessed to meet face to face at Kscope 16.

So, are you wondering where Kscope 17 will be? Check it out!

20160627_174501.jpg

Texas!!! I have already had the opportunity to go to San Antonio, and it is really a nice venue. It will be great to see the Alamo once again and walk down the River Walk. A great thing about Kscope is that you get inspired by such great content that you keep thinking about what you are going to present next. We are already taking notes on some ideas for next year, so you may be sure that we will do our best to be present at Kscope 17 as well.

That’s it, folks; thanks again to all of those who were present at our sessions! I’m honored to be part of such a great and well-organized event!

Hope to see you all again at Kscope 17!