Archive for the Oracle Category

ORACLE SQL for EPM tips and tricks S01EP04!

Posted in ACE, Connect By, Oracle, Oracle Database, Performance, REGEXP, SQL, Tips and Tricks on April 9, 2019 by RZGiampaoli

Continuing the Oracle SQL for EPM series, today we’ll extend the “Connect By” post a little bit to see some neat things we can do when we mix regular expressions with Connect By.

First of all, let’s start by explaining the basics of this approach. We can use Connect By and Level to generate data in Oracle even when there is no source table to select from.
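A minimal sketch of the idea, selecting straight from DUAL, would be something like this:

-- Generate rows out of thin air: LEVEL goes from 1 up to the limit
SELECT LEVEL AS lvl
  FROM DUAL
CONNECT BY LEVEL <= 2;

This returns two rows, with lvl = 1 and lvl = 2.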

This is very useful when you need to generate 2 rows for each row you have in a query: you do a Connect By Level <= 2 and cross join it with the source query (no join condition), duplicating everything in the source. Then you can use a Decode or Case to say: Case when level = 1 then x when level = 2 then y….
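Just to illustrate the pattern (the table and column names here are hypothetical), a sketch would look like this:

-- Duplicate every row of a hypothetical SRC_TABLE and pick a different
-- column on each copy, based on the generated LEVEL
SELECT s.member_name,
       CASE WHEN d.lvl = 1 THEN s.amount_debit
            WHEN d.lvl = 2 THEN s.amount_credit
       END AS amount
  FROM src_table s,
       (SELECT LEVEL AS lvl FROM DUAL CONNECT BY LEVEL <= 2) d;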

But for us this is just the basic concept that we need to understand for our trick. Let’s get back to our metadata table, the one with the parent-child relationships and with all metadata from all apps and all dims that we have. Let’s say we need to create a hierarchical table based on that parent-child table. Some people do one union for each level in the source table (too much work), others filter all over the place. I don’t; I just use Connect By with regular expressions. Let’s take a look.

As we can see, we have 17 levels or generations for the Account dimension. Quite big. First of all, let’s use SYS_CONNECT_BY_PATH to flatten the entire hierarchy into one row (the full path) and get rid of the parent-child relationship.
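Assuming a hypothetical METADATA_TABLE with PARENT_MEMBER and CHILD_MEMBER columns (the real table and column names will vary), a sketch of that query would be:

-- Flatten the parent/child hierarchy: one row per member with its full path
SELECT child_member,
       LEVEL AS gen,
       SYS_CONNECT_BY_PATH(child_member, '|') AS full_path
  FROM metadata_table
 START WITH parent_member IS NULL
CONNECT BY PRIOR child_member = parent_member;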

Now, as we can see, if we just do a Connect By Prior with SYS_CONNECT_BY_PATH, the query generates 1 row for each level of data. We don’t want that, unless you want the ability to input data in the upper levels of the hierarchy, which is usually not the case. What we want is to have the entire path for each leaf member, so we need to filter just the leaves in our queries. In my case my table already has a column that says whether a member has children or not, but I’ll proceed as if I didn’t have it.

If we don’t have the leaf information we can generate it by using CONNECT_BY_ISLEAF, which basically returns 0 if the member is a parent and 1 if it is a leaf. Again, we are filtering this outside the Connect By Prior, and now we have just the leaves and the entire hierarchy above each one. The next step is to split this path back into columns.
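Using the same hypothetical table, the leaf-only version would be something like:

-- CONNECT_BY_ISLEAF returns 1 for leaves and 0 for parents; we filter it
-- outside the hierarchical query to keep only the full path of each leaf
SELECT full_path
  FROM (SELECT SYS_CONNECT_BY_PATH(child_member, '|') AS full_path,
               CONNECT_BY_ISLEAF AS is_leaf
          FROM metadata_table
         START WITH parent_member IS NULL
        CONNECT BY PRIOR child_member = parent_member)
 WHERE is_leaf = 1;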

REGEXP_SUBSTR is the perfect tool to split the string we have into new columns. As you can see, we can use the following expression: REGEXP_SUBSTR(Column, '[^|]+', 1, 1). Basically the regexp uses what is inside the [] to match, and the ^ negates it, so [^|]+ matches a sequence of characters that are not |, in other words everything between two pipes. The last two parameters (1, 1) mean start at position 1 and return the first occurrence, so we get the first word of the path. Then I have another REGEXP_SUBSTR where I change the 1, 1 to 1, 2 to get the second word, and I keep doing this until I get to the 17th generation (as we saw in the beginning).
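A quick sketch against a literal path shows how the occurrence parameter walks through the tokens:

-- Each call returns the Nth sequence of characters between the pipes
SELECT REGEXP_SUBSTR('|Account|BS|Assets|Cash', '[^|]+', 1, 1) AS gen01,  -- 'Account'
       REGEXP_SUBSTR('|Account|BS|Assets|Cash', '[^|]+', 1, 2) AS gen02,  -- 'BS'
       REGEXP_SUBSTR('|Account|BS|Assets|Cash', '[^|]+', 1, 4) AS gen04   -- 'Cash'
  FROM DUAL;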

Basically, if you have 17 generations you need to have 17 columns in the table, and therefore 17 REGEXP_SUBSTR expressions, one for each column; you just need to change the occurrence parameter: 1, 1, then 1, 2, then 1, 3… until 1, 17.
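Putting it together (FLAT_HIERARCHY here is a hypothetical name for the flattened result of the previous step), the split looks like this:

-- One REGEXP_SUBSTR per generation; only the occurrence parameter changes
SELECT REGEXP_SUBSTR(full_path, '[^|]+', 1, 1)  AS gen01,
       REGEXP_SUBSTR(full_path, '[^|]+', 1, 2)  AS gen02,
       REGEXP_SUBSTR(full_path, '[^|]+', 1, 3)  AS gen03,
       -- ...one call per generation, all the way up to...
       REGEXP_SUBSTR(full_path, '[^|]+', 1, 17) AS gen17
  FROM flat_hierarchy;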

Here’s where ODI comes in handy, because if you use ODI you just need to do a select on the source to figure out how many generations you will have, and then generate the 17 REGEXP_SUBSTR expressions dynamically and pass them to the target. It’s very easy, since the code itself is always the same and the only thing that changes is the occurrence parameter, which is sequential.
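A sketch of that select, again over the hypothetical METADATA_TABLE, could be as simple as:

-- The deepest LEVEL reached is the number of generations, which tells us
-- how many REGEXP_SUBSTR columns to generate
SELECT MAX(LEVEL) AS generations
  FROM metadata_table
 START WITH parent_member IS NULL
CONNECT BY PRIOR child_member = parent_member;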

I hope this can be as useful for you as it is for me, and see you soon.


BI Community Lvl up! Enter Analytics Community

Posted in ACE, Career, DEVEPM, EPM, ODTUG, Oracle on February 22, 2019 by RZGiampaoli

Hey guys how are you?

Today I have very exciting news for all of us. ODTUG just announced the re-branding of the BI Community to the Analytics Community.
The change is not in name alone—it’s an effort to create a more dynamic and inclusive community to better serve the needs of ODTUG members. What does this change mean? It means we encompass more than just BI. It means if you are interested in any of the tools that are used to analyze data and turn it into information, we are here for you! Have a question? Want to share knowledge you gained from a project? Connect with other like-minded analysts who explore data in myriad ways, both on premises and in the cloud. Join the Analytics Community to share your passion for all things analytics, whether you are using OAC, OBIEE, ML/AI, R, Python, Essbase, DV, Big Data, Data Warehousing, or other platforms. This page is your portal to all things Analytics: a blog feed, a list of upcoming Analytics-related events, and a look at the Analytics content located in the Technical Database. Check out the most recent blogs from industry experts. Like what you read? Click on the title to visit the author’s blog. Would you like to see your blog featured here? Drop a note to Haleigh@odtug.com. Let’s explore analytics together!

Check the Twitter, LinkedIn, and Facebook pages to learn more, join us, and help us create a better Analytics Community.


ORACLE SQL for EPM tips and tricks S01EP01!

Posted in DEVEPM, ETL, Oracle, Oracle Database, Performance, SQL, Tips and Tricks, Uncategorized, WITH Clause on January 21, 2019 by RZGiampaoli

Hey guys how are you? I decided to start a new series called ORACLE SQL for EPM tips and tricks. The idea here is to show the most useful SQL commands for EPM, how to improve performance, tips, tricks and everything that can be useful from a SQL point of view!

And to start well, I’ll show something very old but very useful that I don’t see too many people using these days: the “WITH” clause.

I love to use “WITH” in my code. It helps organize the code, helps to optimize it and, more importantly, makes it more efficient.

When you use “WITH”, Oracle treats the query inside it as an inline view or resolves it as a temporary table, making it easier and faster for Oracle to access that data if you need it again.

Simply put, every time you need to write a query that uses the same table over and over, it’ll probably be way more efficient if you use “WITH”.

The “WITH” clause works a little bit differently from regular SQL. We can say that we split the query in 2: one part is the “WITH” declaration (that will behave like a table) and the other is the SQL that will query the “WITH”.

WITH name_of_temp_table_here AS
(
    YOUR QUERY HERE
),
name_of_temp_table_here2 AS
(
    SELECT *
    FROM name_of_temp_table_here, another_table...
)
SELECT *
FROM name_of_temp_table_here, name_of_temp_table_here2;

In the “WITH” you can have any kind of query you want. You can do joins, group bys, you can have more than one “WITH” subquery, you can use the result of one “WITH” in the next “WITH”, you can do a lot of things.

But for now, let’s take a look at a more realistic example. Let’s say that you have a table like I do, that contains all metadata from all your applications:

Let’s say you want to get the parent of an attribute that is associated with your Entity dimension. You would probably be doing something like this:
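The column names below (MEMBER_NAME, PARENT_MEMBER, ATTRIBUTE_MEMBER, DIMENSION) are just illustrative, but the shape of the query would be:

-- Self-join on METADATA_EXTRACT: one pass for the Entity members,
-- one pass for the associated attribute members
SELECT ent.member_name    AS entity,
       attr.parent_member AS attribute_parent
  FROM metadata_extract ent
  LEFT JOIN metadata_extract attr
    ON  attr.member_name = ent.attribute_member
    AND attr.dimension   = 'PHYSICAL_GEOGRAPHY'
 WHERE ent.dimension = 'ENTITY';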

In the “FROM” we call the table 2 times and we join and filter everything we need. Since we don’t have attribute associations at all levels, we do a “Left Join” to make sure all Entities come back in the query. If we run an Explain Plan now we’ll get something like this:

As you can see, Oracle is querying the METADATA_EXTRACT table twice, and each time it’s doing a FULL scan of one partition (the ‘ENTITY’ and ‘PHYSICAL_GEOGRAPHY’ partitions).

Now, if we change the query (and we can do it in different ways, this is just one of them) to use a “WITH” clause, we’ll have something like this:
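Keeping the same illustrative column names, one possible “WITH” version is:

-- Hit METADATA_EXTRACT only once, for both partitions, and reuse the result
WITH metadata AS
(
    SELECT *
      FROM metadata_extract
     WHERE dimension IN ('ENTITY', 'PHYSICAL_GEOGRAPHY')
)
SELECT ent.member_name    AS entity,
       attr.parent_member AS attribute_parent
  FROM metadata ent
  LEFT JOIN metadata attr
    ON  attr.member_name = ent.attribute_member
    AND attr.dimension   = 'PHYSICAL_GEOGRAPHY'
 WHERE ent.dimension = 'ENTITY';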

As you can see, we achieved the same results with slightly different code. Now I have all my filters in the “WITH” query, and at the bottom I just call the “WITH” query 2 times and do what needs to be done.

If we run an Explain Plan now we will have:

As you can see, Oracle now queries the METADATA_EXTRACT table just once and then queries the SYS_TEMP table (the materialized “WITH” result) twice. The only problem with this query, the way I did it, is that since we are creating a temporary table filtering 2 partitions and then filtering again later, it’s basically doing 2 FULL scans on the same TEMP table; even so, it’s a few seconds faster than the original query.

But this is just an example of how we can reduce the number of times that Oracle needs to query a table. “WITH” is not a miracle clause or anything like that; it’s just another tool that we can use to optimize our code, and its performance needs to be evaluated on a case-by-case basis.

And even if the performance doesn’t change, I believe using the “WITH” clause makes any query easier to read, to test, to update and to write, since you can divide your huge query into small bits and then join everything in the bottom query.

“WITH” is a huge subject and we’ll be talking more about it in the next post, where we’ll also be improving performance by using “WITH” together with “CONNECT BY”.

All about Meetups!

Posted in ACE, MeetUp, ODTUG, Oracle, Uncategorized on January 23, 2018 by RZGiampaoli

Hey guys how are you doing? It has been a long long LONG time since our last post…. and we are sorry for that. We have been very busy with a lot of projects (both work and personal) and we had to take some time for ourselves!

But we are finally getting back to action! And I’ll do that by posting about something that everybody loves…. Meetups.

First of all I would like to talk about the next ODTUG Meetup, which will happen in 2 days in Orlando, inside the Walt Disney World Dolphin Resort…. I don’t think I need to say anything else about it other than that it’ll be awesome…. but anyway, here’s the link for you guys to attend:

All Community Social Meetup with the ODTUG Board of Directors

Thursday, Jan 25, 2018, 5:30 PM

Walt Disney World Dolphin Resort- Phins Lounge (Lobby Bar)
1500 Epcot Resorts Blvd, Orlando, FL


Join us on Thursday, January 25, 2018, at 5:30 p.m. for an ODTUG social Meetup in the Phins Lounge at the Walt Disney World Dolphin Resort. Come enjoy a drink with the ODTUG Board of Directors and mingle with fellow ODTUGers! Date: Thursday, January 25, 2018, at 5:30 – 7:00 p.m. Location: Walt Disney World Dolphin Resort – Phins Lounge (lobby bar) …

Check out this Meetup →

Now that this is sorted out, did you know that if you want to do your own Meetup you can do it together with ODTUG? And that you can even get financial support to do it????

Well, it’s true. You just need to plan your Meetup and then fill out the Meetup Planning form. This will provide ODTUG with all the information they need in order to move forward.

It’s also helpful to tell them whether you are seeking financial support (and if so, how much), as well as your estimated attendee count. Another advantage is that there’s a list of services that ODTUG will provide as a sponsor, along with the qualifications for sponsorship.

  • Promotion for your local Meetup:
    • ODTUG will post your event on the ODTUG Meetup page.
    • Email Campaigns: ODTUG will invite members to your event.
    • Social Media: ODTUG will promote your event on Twitter, Facebook, and LinkedIn.
    • ODTUG will publish the success of your meeting in your community’s quarterly newsletter.
  • Source an Oracle speaker if needed.
  • Ship ODTUG swag for you to distribute at your event.

Of course, there are a few Sponsorship Requirements: Communicate ODTUG Resources and Events:

  • Educational resources
    • Provide an overview of ODTUG’s memberships and benefits
    • Provide ODTUG Kscope18 deadlines
      • Early bird rate deadline – March 29, 2018

If you got excited by this piece of information and want to do your own Meetup, here are some useful tips and strategies for planning a Community Meetup.

 1- Choose a Date and Location

Be mindful of holidays and tech conference dates when planning a meetup. A free venue is best—workplaces and universities tend to be free, while private rooms at restaurants and hotels usually have a rental fee. When securing a space, be sure to ask what the room capacity is and monitor your RSVPs accordingly. If you need assistance with securing a venue please be sure to fill out the Meetup Planning form.

 2- Schedule Speaker(s)

Education and content should be a part of a Meetup. Even if presentations are brief, education helps to engage members. If you need assistance with securing presenters, please be sure to fill out the Meetup Planning form.

 3- Include Time for Networking with Food and/or Drinks

This could be as simple as having pizza delivered to an office after a speaker presentation. If there is a nearby bar/restaurant, the group may want to go for a Happy Hour.

 4- Promote ODTUG Membership and ODTUG Kscope18 Conference

Share the benefits of ODTUG membership, encourage others to volunteer with the community, and promote ODTUG Kscope18.

 Tips for a successful Meetup:

 Scheduling

  • Schedule quarterly Meetups if possible. Five-week months result in higher attendance and commitment. (Customers have an extra week to plan for maintenance, testing, and some downtime.)
  • Wednesdays and Thursdays are the best days for scheduling meetings, whereas on Fridays you will be competing with families and/or social lives.

 Location is important

  • Provide an environment that is fun and inviting. Seek popular local bars or restaurants that provide private rooms for formal meetings.

 Budget

  • Everything is negotiable! Find creative ways to customize menus for your meetings. Order appetizers for a small group or buffets for a larger group.
  • The best option for savings is scheduling meetings in the workplace if possible.

 Greeting your guests

  • Have someone available to greet your guests. Provide a sign-up sheet and name tags, and pass out ODTUG/Kscope18 swag.

 Engaging your guests is KEY!

  • A creative ice breaker activity is a great way for people to engage and find common ground.

 Follow-up is vital for success

  • Feedback is important – it gives you an opportunity to thank your guests and provide a questionnaire (limited to 10 questions or fewer) to help improve and plan future Meetups.

 Non-solicitation

  • The user community is interested in learning how other organizations are leveraging the tools today within their environment, not in a sales pitch. Solicitation is not allowed.

 

I hope you guys enjoy all this info and let’s start to plan our meetups 🙂

Automating Essbase Copy Outline Operation using Java API

Posted in ACE, BSO, Cubes, Essbase, Hacking, Hyperion Essbase, Java, Migration, Oracle on August 9, 2017 by RZGiampaoli

Hi guys how are you? Have you guys ever tried to automate the process of copying a cube outline from one application to another?

Well, there’s an easy way to do that. Basically you copy the .otl file from the server file system over the other cube’s .otl. The problem is that if the cube is not empty, the database becomes corrupted, since we just replaced one .otl file with another, foreign .otl file (no restructure happened).

So if you want to copy the outline to an existing cube (one that has data), this is not a solution.

The thing is, the only two possible ways to do what we want are the EAS “Save As” operation and the Migration Wizard. Both operations work because they copy the .otl file as an .otn file and then run a restructure on the database. The restructure “synchronizes” the cube with the new outline, making the process safe for a cube that has data in it.

The problem is, neither of these can be automated, and there’s no way to do this operation using MaxL or ESSCMD.

In fact, even using the Java API, it’s hard to figure out how to do that, because all the copy methods seem to copy every kind of object except the outline.

The good news is, we figured out a way to replicate the “Save as” operation using the Java API after hours of frustration and tears…

Here we go:

Save As Java code

The code is really simple. We need to connect to the Essbase server, lock the target outline (the one we’ll overwrite) and then copy the outline from one application to the other. To do that we are going to use the methods “lockOlapFileObject” and “copyOlapFileObjectToServer”.

The process that we just described will create an .otn file in the target cube. Now comes the great catch of this code (which is not documented anywhere):

If we open the target outline in EAS we will still see the old metadata. To commit the changes, we need to perform a restructure to merge the new outline (.otn) with the old one (.otl), updating the metadata.

To do that we are going to use the methods in the class “IEssCubeOutline”: “open”, “restructureCube” and “close” on the target outline.

That is it. This process will do exactly what the “Save As” in EAS does, which means that you can copy outlines from one application to another even when the target database contains data.

I hope you guys enjoy and see you soon.

Kscope 17 is approaching fast!!! And we’ll be there!

Posted in ACE, Data Warehouse, Essbase, Hyperion Essbase, Java, Kscope 17, ODI, ODI Architecture, Oracle, Performance, Tips and Tricks, Uncategorized on June 8, 2017 by RZGiampaoli

Hi guys how are you? We are sorry for being away for so long, but this year we have a lot of exciting things going on, so let’s start with what we’ll be doing at Kscope 17!

This year we’ll present 2 sessions:

Essbase Statistics DW: How to Automatically Administrate Essbase Using ODI (Jun 28, 2017, Wednesday Session 12 , 9:45 am – 10:45 am)

In order to have a well-performing Essbase cube, we must keep vigilant and follow up on its growth and its data movements so we can distribute caches and adjust the database parameters accordingly. But this is a very difficult task to achieve, since Essbase statistics are not temporal and only tell you what the cube statistics are at that specific point in time.

This session will present how ODI can be used to create a historical statistical DW containing the Essbase cubes’ information, and how to identify trends and patterns, giving us the ability to programmatically tune our Essbase databases automatically.

And…

Data Warehouse 2.0: Master Techniques for EPM Guys (Powered by ODI)  (Jun 26, 2017, Monday Session 2 , 11:45 am – 12:45 pm)

EPM environments are generally supported by a Data Warehouse; however, we often see that those DWs are not optimized for the EPM tools. Over the years, we have witnessed that modeling a DW with the EPM tools in mind may greatly increase the overall architecture performance.

The most common situation found in several projects is that the people who develop the data warehouse do not have great knowledge of EPM tools and vice versa. This may create a big gap between those two concepts which may severely impact performance.

This session will show a lot of techniques to model the right Data Warehouse for EPM tools. We will discuss how to improve performance using partitioned tables, create hierarchical queries with “Connect by Prior”, the correct way to use multi-period tables for block data load using Pivot/Unpivot, and more. And if you want to go even further, we will show you how to leverage all those techniques using ODI, which will create the perfect mix to perform any process between your DW and EPM environments.

In these presentations you can expect a lot of technical content, some very good tips and some very good ideas to improve your EPM environment!

Also, I’ll be graduating from this year’s leadership program, and this year we’ll be all over the place with the K-Team, a special team created to make the newcomers feel more welcome and help them get the most out of Kscope.

Also, Rodrigo will be at the Tuesday Lunch and Learn for the EPM Data Integration track in Cibolo 2/3/4.

And of course we will be around having fun and gathering new ideas for next year!!!

And last but not least, this year a friend of ours will be making his first appearance at Kscope with the presentation OBIEE Going Global! Getting Ready for More Than +140k Users (Jun 26, 2017, Monday Session 4, 3:15 pm – 4:15 pm).

A standard Oracle Business Intelligence (OBIEE) reporting application can hold more or less 1,200 users. This may be a reasonable number of users for the majority of the companies out there, but what happens when an IT leader like Dell decides to acquire another IT giant like EMC and all of their combined 140,000-plus users need to have access to an HR OBIEE instance? What does that setup look like? What kind of architecture do we need to have to support those users in a fast and reliable way?
This session shows the complexity of Dell’s OBIEE environment, describing all the processes and steps performed to create such an environment, meeting the most varied needs from business demands and L2 support, always aiming to improve environment stability. This architecture relies on a range of different technologies to support that huge number of end users, such as LDAP & SSL, Kerberos, SSO, BigIP, shared folders using NAS, and WebLogic running in a cluster across 4 application servers.
If the challenge was not hard enough already, all of this setup also needed to consider Dell’s legacy OBIEE upgrade from v11.1.1.6.9 to v11.1.1.7.160119, so we will explain what the pain points were, and the considerations and orchestration needed to do all of this in parallel.

Thank you guys and see you there!


OTN Article: Building a 100% Cloud Solution with Oracle Data Integrator

Posted in ACE, ArchBeat, BICS, DBCS, DEVEPM, EPM, EPM Automate, InfraStructure, ODI, ODI 11g, ODI Architecture, Oracle, Oracle Database, OS Command, OTN, PBCS, Tips and Tricks on January 23, 2017 by RZGiampaoli

Hi guys how are you? Today I want to share our new OTN article Building a 100% Cloud Solution with Oracle Data Integrator.
The article will cover how to integrate BICS, PBCS, DBCS and ODI and will explain step by step how to create a 100% cloud solution using ODI (everything on the cloud including ODI :)).

This is a perfect article for companies that are thinking about going to the cloud and have some doubts, or that are even wondering how to integrate/use their current infrastructure with the cloud services.

I hope you guys enjoy and see you soon.