Saturday, March 17, 2018

Kscope18 - Here I Come!!!

I first came to know about ODTUG in 2009 and have been following it ever since. Every year when Kscope happens, I assure myself that someday I will present, but I didn't have the confidence to dare submit an abstract. What if it doesn't get selected?

I decided that this year I will overcome that fear and work on four things that will enhance and improve my career:

  1. Submit an abstract to Kscope18. I submitted two abstracts and both of them got selected. I thank the ODTUG selection committee from the bottom of my heart for giving me the opportunity to present. I will try my best to do justice to the opportunity you have given me
  2. Continue blogging. I took a break for two years when my wife became pregnant in 2015, first to take care of her and then to take care of both my wife and daughter. The first year was really tough handling the kid by ourselves, and I decided to dedicate that time completely to my family
  3. Contribute enough to become eligible for Oracle ACE Associate
  4. Become knowledgeable in Machine Learning (there's so much to the field, and Machine Learning is where I would like to start)
Getting selected for ODTUG Kscope18 is one feat, as this is going to be my first Kscope. But being the first person from Genpact is the one I will cherish for a lifetime. This has been well received by senior management at Genpact, and I feel really happy and proud about it. This wouldn't have been possible without ODTUG and the selection committee. I thank them from the bottom of my heart for making this a memorable experience for me.

What will I be presenting?

An alternate approach to ASO Calcs - MDX Scripts
This session focuses on how you can use MDX queries for calculations. Procedural calcs have been around for a very long time and have become the de-facto standard for calculations in ASO. Their limitations have been addressed using a mix of MDX member formulas and procedural calcs.
With the patch, it has become very easy to format the data export from MDX queries. Gary Crisci did a session at Kscope17 titled "I've got an MDX script for that". In this session, I will talk about how you can use this feature to build calculations using MDX queries. I will also talk about a few paradigms / best practices covering how you would do things in BSO and how you can do the same in ASO.
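To make the idea concrete, here is a small sketch of my own (not from the session) of the kind of MDX such a calculation could use: a WITH MEMBER formula that computes a variance on the fly, assembled in Python. The cube and member names are the classic Sample.Basic ones, used purely for illustration.

```python
# Build an MDX query with a calculated member: the ASO-friendly
# alternative to a BSO procedural calc for simple derived values.
def build_variance_mdx(columns, rows):
    """Return an MDX query computing Variance% between Actual and Budget."""
    return "\n".join([
        "WITH",
        "  MEMBER [Scenario].[Variance%] AS",
        "    '(([Scenario].[Actual] - [Scenario].[Budget])",
        "      / [Scenario].[Budget]) * 100'",
        f"SELECT {{{columns}}} ON COLUMNS,",
        f"       {{{rows}}} ON ROWS",
        "FROM Sample.Basic",
    ])

query = build_variance_mdx("[Scenario].[Variance%]", "[Product].Children")
print(query)
```

The same query text could then be submitted through MaxL or your scheduling tool of choice; the point is that the calculation lives in the query, not in a stored procedural calc.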

Intro to Machine Learning using Oracle Data Viz
This one was a surprise to me as well. I wanted to get out of my comfort zone and present something I have started learning during my free time. It's very difficult to keep up with office, family, EPM Cloud and Machine Learning. This session focuses on how you can kick-start yourself with Machine Learning using Oracle DVD. It will cover what Machine Learning is, the Machine Learning process from model building to prediction and the steps involved in it, and a live demo using Oracle DVD.
Sneak peek!!! Oracle DVD's Machine Learning capability uses Python. So whether or not you are familiar with Python, the Machine Learning process and approach will be fairly similar to how you would do it in Python or any other programming language.
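As a teaser of that process, here is a minimal sketch of the model-building-to-prediction flow using only the Python standard library. A real DVD script would use Python ML libraries, but the steps (prepare data, build a model, predict) are the same; the toy data below is invented for illustration.

```python
# The ML workflow in miniature: fit a model on training data, then predict.
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (the 'model building' step)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# 1. Prepare training data (a perfectly linear toy series: y = 2x + 1)
train_x = [1, 2, 3, 4, 5]
train_y = [3, 5, 7, 9, 11]

# 2. Build the model
a, b = fit_line(train_x, train_y)

# 3. Predict on unseen input
prediction = a * 6 + b
print(round(prediction, 2))  # -> 13.0
```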

See you all at Kscope18!!!

Sunday, February 18, 2018

Upskill to Oracle EPM Cloud

When Oracle decided to push Oracle EPM to the cloud, it was a very big decision. There were always questions in my mind back then about whether it would be an integrated "Oracle EPM Cloud" platform or individual EPM product components. I didn't have a clear picture / vision, but I was definitely sure that an integrated platform might work for big organizations that have most of the EPM products in place. But cloud is different. You need to be a differentiator and a competitor, and yet provide all the needed functionality.
Most organizations are faced with the challenge of managing not just cost but also speed.

Q: How quickly can I get on to the Cloud?
A: Immediately. Sign a deal, choose the number of users, and you are up and running

Q: How quickly can I have my application in the cloud?
A: As-is, all the basic functionality within a week or two. Keep in mind that if you are using Workforce / Capital / Project planning and have customized it to your needs, it will take more time

Q: How secure is my data?
A: If you look at and understand what Oracle is doing to provide a highly secure environment, you will not have any doubt that your data is safe. Think about the risk and severity a breach would carry for Oracle, which supports many clients

Q: Do I really need cloud? Can't I do it in my own environment?
A: You definitely can. But think about it: spending so much on infrastructure, and managing, updating and keeping it running across multiple products and databases, is a huge task. It is better to leave it to the experts who know it a lot better, and let us focus more on the application side. Cloud environments come ready, and you can get up and running in days.

Oracle has really changed the game of EPM, not just by bringing every single product to the cloud, but also by being very aggressive with the patches that are released. When you were on-premise, when was the last time patches were released back to back within a month (unless a security flaw was identified)? I don't remember.

The point I am trying to make is that the ability to manage and deliver patches for every single Cloud EPM product is a big feat, and I commend their team for doing that. Patches don't mean just "bug fixes"; they also include new features, workaround solutions, tips, best practices and more.
In the world of connected devices and connected systems, we now have connected EPM, all in the cloud.
So, what exactly do you get with Oracle EPM Cloud?

Oracle has pushed itself to the next level by collecting all the best practices accumulated over time in the world of EPM, coming up with industry standards, and putting all of that together as a framework (OOTB – out-of-the-box functionality) enabling rapid deployment. You get all the required components (they cannot be deleted, though, and only limited modifications to the pre-built content are allowed).
When Hyperion was an independent company, way back at the beginning of 2007, I didn't have access to any resources. I was working at a client, and for 1 year all I had was Application Manager 6.5 and the DBAG. I was new to Essbase. It took 4 months, without any formal training, to be able to understand and read through the whole DBAG. I am not sure how many people have actually read the whole DBAG, but I probably read it 2-3 times in that 1 year. When Oracle acquired Hyperion in 2007, it was a game changer. Slowly, things started to change. Hyperion System 9 documentation became available. Slowly, the software was also made available in developer versions and people started using it. Lots of blogs started floating around with expert advice, and Oracle forums came into existence.

Note: Whatever I have mentioned above is how, day by day, I came to know about the new things happening around Hyperion. They may have existed many years before, but that was when I came to know about them.

When cloud got released, a few things remained the same and a few things changed, but in a good way. If you are on-premise, you can install Oracle EPM on your machine as soon as a new version is available and explore all the new features. When it comes to cloud, you do not have access to the EPM cloud environment unless you take out a subscription.

Oracle is aware of this and has curated a lot of videos with detailed walkthroughs of the process. The documentation has also been updated and aligned to the list of tasks and activities that you would work on. Another good thing is that these videos are part of the documentation, which keeps the flow going.

For those who are part of the Oracle Partner Network, you have access to a few additional resources which are not available to the general audience. Here are a few that I am curating; I will keep updating this as the list grows.
  • Oracle demo cloud environments

As it has become extremely difficult to get access to cloud environments (you cannot install them on your own machine), Oracle is providing free cloud environments for partners. Keep in mind that these are only accessible to Oracle partners, with Silver level as the starting point. Having access to these environments will give you a head start when you demo cloud capabilities to clients.
Along with EPM, you also get access to other cloud environments as well.
  • Oracle Partner Training

If your company is part of the Oracle Partner Network, reach out to your Oracle counterpart to get registered for partner trainings. If you are unsure whom to reach out to, you can send your nominations to the contact list provided on the workshop registration page. These training sessions are delivered in-class across various regions. However, there are a few which you can attend online if you cannot be physically present. You can look at the full schedule here

  • Cloud Customer Connect

Oracle is bringing more and more customers together on a central platform where they can connect, answer other customers' issues / problems in the forums, and use the Idea Lab to share new product ideas / improvements for a better customer experience. Similar to the above two, you need to be part of the Oracle Partner Network to be able to access Customer Connect.
What I like about Oracle Cloud Customer Connect is the various events related to new features and functionalities that are released / will be released in upcoming Oracle EPM cloud products. You will also see some best-practice sessions. My focus is only on EPM Cloud, but Cloud Customer Connect does include the other cloud applications. It's definitely worth checking out.
You can access Cloud Customer Connect here

Note: The Cloud Customer Connect link is also made available in all the EPM cloud applications. You can access it from the dropdown next to your user ID when you log in to any EPM Cloud SaaS application.

Keep learning, and share any additional information that you think would add value; I will update this post with it.

Friday, February 16, 2018

Oracle EPM Cloud Updates & Patches

Oracle EPM Cloud Feb 2018 updates are out
If you are unsure of where and when this information is made available, take a look at "Oracle Proactive Support EPM". As this is a blog, it includes a post for each Cloud EPM update. If you are interested in previous patch updates, you can look at the older posts and you will be able to find them.

You can also refer to "Cloud Readiness" and select the appropriate tool. Watch out for the "What's New" document, but remember that it will only include the latest update.

PATCH UPDATE SCHEDULE (Information collected from the patch documents)

Oracle will apply the latest updates to your test environment on the first Friday of the month and to your production environment on the third Friday of the month.
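If you want to compute those dates for any month yourself, here is a quick sketch using Python's standard library:

```python
# Find the day-of-month of the nth Friday, matching the patch cadence:
# n=1 for test environments, n=3 for production.
import calendar

def nth_friday(year, month, n):
    """Return the day-of-month of the nth Friday of the given month."""
    fridays = [d for d in calendar.Calendar().itermonthdays(year, month)
               if d != 0 and calendar.weekday(year, month, d) == calendar.FRIDAY]
    return fridays[n - 1]

print(nth_friday(2018, 2, 1))  # -> 2   (test environments patched Feb 2, 2018)
print(nth_friday(2018, 2, 3))  # -> 16  (production patched Feb 16, 2018)
```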

I have taken some time and liberty and collated direct links to the Feb 2018 updates for you.

PS: You can also see the Jan 2018 updates at the same links.

Note of appreciation

I appreciate Oracle for doing really great work in terms of documentation. The patch documentation is very well written.

  1. EDMCS February 2018 Updates
  2. PBCS/EPBCS February 2018 Updates
  3. FCCS February 2018 Updates
  4. EPRCS February 2018 Updates
  5. ARCS February 2018 Updates
  6. TRCS February 2018 Updates
  7. PCMCS February 2018 Updates

The first cloud product was released on Feb 14th 2014 and was PBCS.

And there are different timelines for each product's releases. Considering that, what if you would like to know about past patch updates?

The Oracle team follows a standard nomenclature for the patch update links:

{base link}/{product short code}/{year}-{product short code}-wn.htm

Let's understand the above nomenclature with the EDMCS example.

{base link}

This is the starting point of the link that is constant

{product short codes}: edmcs, pbcs, fccs, eprcs, arcs, trcs, pcmcs are the product short codes. I hope you know what those short names stand for

{year}: 2018, 2017,2016 and so on

If you want 2017 for any product that was released in FY17 or before, you can use the above format and get all the patch updates of that particular product for that year (all periods).
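A small helper of my own (for illustration only) that assembles the link from the nomenclature; the `{base link}` placeholder stands in for the constant prefix mentioned above, which I have not spelled out here:

```python
# Assemble a "What's New" patch-document URL from the standard
# nomenclature: {base link}/{product short code}/{year}-{product short code}-wn.htm
BASE = "{base link}"  # constant prefix of the patch document links

def whats_new_url(product, year):
    return f"{BASE}/{product}/{year}-{product}-wn.htm"

print(whats_new_url("edmcs", 2018))  # -> {base link}/edmcs/2018-edmcs-wn.htm
print(whats_new_url("pbcs", 2017))   # -> {base link}/pbcs/2017-pbcs-wn.htm
```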

Try out this for PBCS:

The above link approach will only work for Oracle EPM SaaS based applications. For Oracle Analytics Cloud (OAC) which is PaaS, use this link 

That's all for today. Now, with all the EPM products live in the cloud, I am trying to collate and collect data across all the Cloud EPM products in terms of features, defects, enhancements and workarounds, and build some visualizations around it.

As mentioned in my previous posts, though I am pursuing my interests in Machine Learning, AI and NLP, my primary skill set is still Oracle EPM for now

Happy Learning!!!

Sunday, January 28, 2018

EDMCS is Here!!!

For all those folks who have worked / are working in DRM and have been waiting to see EDMCS in the cloud: it's here!

Every time I looked at the product page, I always saw the "COMING SOON" banner.

I don't see it today. I believe it's officially launched. I am not the first one to spot it; I would like to give Juan the credit for his tweet. Click on the "Quick Tour" on the right side to see a few slides of what EDMCS is all about.

There is no Pricing tab alongside "Overview", "Features" and "Learn More". I would expect that this is available to a privileged few for now and may be open to everyone in due course (I have no idea when that is going to be).

I don't see the documentation link under "Learn More". Thankfully, I was able to dig a little and find the documentation here. This will be your starting point.

Oracle is doing a great job providing lots of video tutorials, and EDMCS is no exception. I have personally learned a lot about the Oracle EPM cloud tools even though I do not have access to many of them. "Get Started" will be your starting point.
If you are unsure how to get more out of the videos and documents, follow my instructions below

Overview: Take a look at the overview / tour of EDMCS (link here) to get a view of what EDMCS is and what you can do with it

The documentation is organized so that you can navigate based on the list of tasks that you will be performing in EDMCS. Below is the order of tasks you would perform with EDMCS, and it is worth exploring each step one by one

Click on "Configure Applications" and take a look at the Getting Started section

In contrast to DRM, EDMCS follows an application-based approach

You have to create an application first, followed by dimensions, nodes and the required properties
Thanks to the Oracle EPM cloud team for adding PBCS and EPBCS directly as application types. You can also select "Custom" and define a custom application.

I will have to explore more to understand the terminology that is used here.
Information Model, Data Chain and Viewpoints are a few things which I came across while looking at the documentation.
I am not a DRM expert, so I will have to connect with my colleagues to get a deeper understanding of how it works in EDMCS.

If you have already explored EDMCS, I would appreciate it if you could share more information about it. Till then, Happy Learning!!!

Sunday, October 8, 2017

Quick Tips on data recast and data validation

I was working on a data recast process which involves applying some complex mappings to the source data before it can be loaded to the target cube. This is quite a common exercise in applications related to Essbase, Planning and HFM.

I used a database to hold all those mappings along with the source data. I would recommend this approach, as you have full control over which mappings you apply, and you can keep track of the mapping changes you make, since the data recast process is a repeatable one.

After the mappings had been applied, I had to manually add data to a few accounts and subtract data from a few others. I am not an expert in SQL, so Excel pivot tables to the rescue. All good, and the data is ready to be loaded to the cube.
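To illustrate the database-driven mapping approach, here is a minimal sketch using SQLite; the table and column names are invented for illustration and are not from the actual project:

```python
# Keep source data and account mappings in tables and let SQL apply the
# mapping; rerunning the same query is what makes the recast repeatable.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE source_data (src_account TEXT, amount REAL);
    CREATE TABLE account_map (src_account TEXT, tgt_account TEXT);
    INSERT INTO source_data VALUES ('4000', 100.0), ('4010', 50.0);
    INSERT INTO account_map VALUES ('4000', 'Revenue'), ('4010', 'Revenue');
""")

# Apply the mapping and aggregate to the target account in one pass
rows = con.execute("""
    SELECT m.tgt_account, SUM(s.amount)
    FROM source_data s
    JOIN account_map m ON m.src_account = s.src_account
    GROUP BY m.tgt_account
""").fetchall()
print(rows)  # -> [('Revenue', 150.0)]
```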

Tip #1: Standardization is the key
Always use a standard load rule and a standard data load format for the data file (sort the dimension columns from least dense to most dense, with the dense dimension last, across the columns). Also, use a standard delimiter in the rule file to prevent future modifications.

Tip #2: Header is the identity
Have a header record to identify the dimension for each column. The dimension is identifiable if its members have prefixes; otherwise, always have a header record

Tip #3: Know your data first
Often I get requests from my friends and others (somehow they reach out to me through a common list of friends) saying that they get the error "Essbase encountered a field in the data source that it did not recognize". This generally happens when you have a number in a member column where Essbase expects a member name. There are instances where the member name is actually a number, and when you use Excel to do some transformation, Excel converts the number to the exponential format 1E+10. When you import the data into Excel / transform the data using "Text to Columns", define the column data format as "Text"
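To see why the "Text" format matters, here is a small illustration of the conversion problem and the fix:

```python
# A numeric member name round-tripped through a float becomes scientific
# notation, which Essbase then rejects as an unrecognized field.
member = "10000000000"             # a member name that happens to be numeric

as_number = float(member)          # what an auto-conversion does
print("{:.0E}".format(as_number))  # -> 1E+10  (no longer a valid member name)

# Reading the column as text keeps the name intact. With the csv module
# every field is already a string, so no conversion ever happens:
import csv, io
row = next(csv.reader(io.StringIO("10000000000,Jan,500")))
print(row[0])                      # -> 10000000000
```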

Refer to the error document here for the list of possible error messages that you might encounter

Tip #4: Special characters are not special
If you have special characters in your member names, ensure that the delimiter in the rule file is not one of those special characters. If you have spaces in your member names, ensure that you have quotes around them, like "Opening Inventory", and that your delimiter is not SPACE. If you are saving the file as a CSV and your member names contain ",", then follow the same principle as above for spaces.
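A quick illustration of the quoting rule, using Python's csv module; the member names here are made up:

```python
# Member names containing the delimiter (a comma) are wrapped in quotes
# on write and come back intact on read.
import csv, io

buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL)
writer.writerow(["Opening Inventory", "Sales, Net", "500"])

line = buf.getvalue().strip()
print(line)  # -> Opening Inventory,"Sales, Net",500

# Reading it back splits on the delimiter only outside the quotes
fields = next(csv.reader(io.StringIO(line)))
print(fields)  # -> ['Opening Inventory', 'Sales, Net', '500']
```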

Tip #5: Avoid Copy Paste
Do not copy the data from Excel into a file and update the delimiter by hand.

All the above tips act as checkpoints to ensure that you have a repeatable process and are following a standard methodology. (I used a relational database to store the mappings. It's not mandatory to have a database; whatever process you are comfortable with works. Just ensure that you don't change the process each time you have to run the mappings and load the data again.) Have a broader vision of how the process can be replicated for any future data recast processes. I do not have access to ODI / FDMEE, which is another option, but time consuming.

I was all set to get the data out and had the right delimiter set. I got the data out of Excel, loaded it to the cube and started validation. The data doesn't tie. But data doesn't lie. Where did it go wrong?

Tip #6: Data doesn't lie
(Recommended for small data sets.) Validation always happens at the top level. Since ours is an ASO cube, I don't have to run any aggregations. I love ASO. I really love ASO (Awesome Storage Option :) ). I quickly copied the data file into Excel, added the missing members in the file and tadaaaaa, I realized what went wrong. I was loading data to 5 periods (4 periods and BegBal), the BegBal column was completely empty, and the load rule shifted the other 4 period columns to the left. Q1 data got loaded to BegBal, Q2 to Q1, and so on. I added #Mi in the BegBal column and all was good.
This was due to a violation of Tip #5.
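A simple guard against the empty-column trap described above is to pad empty fields with #Mi before loading; a sketch:

```python
# Replace any empty field in a delimited record with #Mi so that load
# rule columns can never shift left.
def pad_missing(record, delimiter=","):
    fields = record.split(delimiter)
    return delimiter.join(f if f.strip() else "#Mi" for f in fields)

# BegBal (the 2nd field here) is empty in the exported record
print(pad_missing("Sales,,100,200,300,400"))
# -> Sales,#Mi,100,200,300,400
```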
That's all I have for today. Let me know if you have any more tips to add. You can post them in the comments and I will add them as updates to this blog post.

Happy Learning!!!

Monday, October 2, 2017

I am Back!!!

Hi Everyone

It's been a long time, probably 2 years, and a lot of things happened during this time.
My wife got pregnant in 2015, and we had a lovely daughter born in December 2015. Being in a new place far away from home with no second hand to help, I had to take complete care of my wife and daughter due to the complications she had during pregnancy.

Things are settling down slowly, and I have decided to move back to India so that my family can recover from the health issues. It was a hard decision to make, given the scope and opportunity available in the Oracle EPM space in the US. I am trying to get back in pace with all the latest advancements that are happening.

Having worked primarily on Essbase for the past 10 years, my focus is completely on Oracle Analytics Cloud, which was released on 17-Mar-2017. Seasoned consultants who had early access to OAC have published various posts on OAC and what it has to offer. I will not repeat what other experts have already posted, but rather collate all those various posts in one place for all of you to look at before you start reading my detailed posts on OAC.

I am very late to the game, as I only recently got access to the demo instance which Oracle provides to partners (stay tuned for my next post) and started looking at OAC. I also had the privilege to attend the OAC training which was conducted by the Oracle Development Team exclusively for partners. I loved the training, and it was completely opposite to my initial impression of OAC, specifically about Essbase.

More posts to follow.

Happy Learning!!!

Saturday, November 21, 2015

Planning (BSO) to Reporting(ASO) Replicated Partition Issue (Not really an issue) - Solution

A user reported an issue where a small sheet with 15 rows and 6 columns from the reporting cube (ASO) was taking around 6 minutes to retrieve and getting timed out.

We tried from our end a couple of times, identified that it was sporadic, and tried all possible options to fix it. Sometimes it retrieves very quickly and sometimes it takes a huge amount of time.

Option 1: The outline size was around 200 MB, and we compacted the outline, which brought the size down to 22 MB. But this didn't work.

Option 2: We cleared aggregations (we were fairly sure that this would not improve performance). It didn't work.

Option 3: We have a database where we capture all our metadata and data loads with start time, end time and elapsed time. We noted down the times when the retrievals were faster and looked in the database to find which processes were run before those retrievals.

We found that whenever a dimension build process completed, the retrievals were faster, but after that the retrieval times increased gradually. We have a process which runs every 10 minutes, does currency conversion in Planning (BSO), and refreshes a replicated partition to push data to the reporting (ASO) cube. We kept retrieving after each refresh completed and could see that the retrieval times were increasing gradually. We also noticed that the Plan-to-Report process time was increasing gradually.

We identified that the Plan-to-Report process was the issue. After a lot of digging and analysis, we found that every time the partition is refreshed, a new slice gets created in the reporting (ASO) cube. If a user then retrieves any combination, Essbase needs to check both the main slice and the incremental slices, and then do some internal processing to get you the right numbers.

As a solution (it's not an issue but rather Essbase behavior), we had to merge the slices. This improved the retrieval times from 6 minutes to 30 seconds. It has also made our Plan-to-Report process very consistent at 6 minutes, where it previously took somewhere between 6 and 15 minutes depending on the number of slices.

We also had to look at it from a performance standpoint to decide which type of slice merge to choose:

  • Merge all slices into the main cube
  • Merge all incremental slices into a single slice
Merging into the main cube gave us two issues, for which we dropped the option:

  • The merge was taking more time, as we have aggregations on the cube. We would have had to drop the aggregations to make the merge faster, which was not an option
  • We tried merging without the aggregations, and it still took more than a minute

Merging into a single slice was the way to go. Since it creates a new slice, everything went fine, and the merge takes around 25 seconds with or without the cube aggregations.
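For reference, the two merge options discussed above map to standard MaxL statements; here is a small sketch that builds them (the app/db names are placeholders, not from the actual environment):

```python
# Build the MaxL statement that merges ASO data slices after each
# partition refresh.
def merge_statement(app, db, into_main=False):
    """into_main=False -> merge incremental slices into one slice (our pick);
    into_main=True  -> merge everything into the main database slice."""
    target = "all data" if into_main else "incremental data"
    return f"alter database {app}.{db} merge {target};"

print(merge_statement("Rpt", "Sales"))
# -> alter database Rpt.Sales merge incremental data;
print(merge_statement("Rpt", "Sales", into_main=True))
# -> alter database Rpt.Sales merge all data;
```

Scheduling the first statement right after the Plan-to-Report refresh is what kept our retrieval times consistent.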

It was very difficult to identify what was causing the retrievals to be slower, but now everything works as normal.