Sunday, July 26, 2020

EPM Cloud Tips & Tricks - #1

The first EPM Cloud product was released in 2014, so it has been six years to date. I was recently part of an FCCS implementation project. I know what you might be thinking: coming from a pure Essbase and Planning background, having worked on them for almost 12 years, and now doing an FCCS project? Well, it turned out that way. It was a change from my comfort zone, and I took it as a challenge.

I have been very busy over the past year and didn't really have the chance or time to get back to blogging and sharing my knowledge. The project finally went live, and I am going to share some of my learnings. Some of them you might already know.

The first tip is an easy one, and those who have worked with FDMEE / Data Management in the cloud might already know it. But it is very important when it comes to FCCS, as zeroes are valid balances from a Balance Sheet standpoint.


Data Management by default doesn't load zeroes. Below is an excerpt from the documentation:

Disabling Zero Suppression (NZP)

The NZP expression is used to disable zero suppression during the data-load process. By default, Data Management bypasses accounts in the trial balance that have zero balances. In certain circumstances, you may want to load all accounts, to ensure that values that should be zero are replaced. You enter NZP in the Expression column of the Amount field to disable zero suppression.

The parameter is NZP.

Link: https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/diepm/integrations_expressions_disabling_zero_144xd9f89fe4.html

The NZP expression can be used both for the data load and for the data extract. There could be many reasons why you would want to load zero balances. Below are a few use cases that we encountered in our project where we had to enable NZP.
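To make the behavior concrete, here is a tiny Python sketch of what zero suppression means for a trial balance load. The row layout and the `nzp` flag are illustrative assumptions for this sketch, not Data Management internals.

```python
# Toy simulation of Data Management's zero suppression (illustrative only).
def rows_to_load(rows, nzp=False):
    """Return the rows that would be loaded.

    By default, rows with a zero amount are suppressed. With NZP entered in
    the Amount field's Expression column (modeled here as nzp=True), zero
    balances load as well.
    """
    if nzp:
        return list(rows)
    return [r for r in rows if r["amount"] != 0]

trial_balance = [
    {"account": "1100", "amount": 2500.0},
    {"account": "2100", "amount": 0.0},    # balance reclassed to zero during the close
    {"account": "3100", "amount": -700.0},
]

print(len(rows_to_load(trial_balance)))            # prints 2: the zero row is skipped
print(len(rows_to_load(trial_balance, nzp=True)))  # prints 3: all rows load
```

The second call is the behavior you want in FCCS, where a balance that has genuinely become zero must overwrite the previously loaded value.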

Scenario #1

For Balance Sheet accounts, zero balances are meaningful and you have to load them. As the default load method in FCCS is Periodic, this is not a problem within FCCS itself, since your YTD numbers will be right. But loading zero balances matters when you push the data to downstream systems. For example, if you are pushing periodic numbers to a system like Hyperion Planning / PBCS where YTD is calculated dynamically with Skip Missing enabled, then your YTD numbers would be wrong without the zeroes.

Scenario #2

We send the consolidated Closing Balance for all Balance Sheet accounts from FCCS to the account reconciliation tool. We push the data on a daily basis during the close process, and there will be balances that have become zero over time, or a reclass may have happened that brought a balance to zero.

For example, the Intercompany accounts may have balances at the beginning of the close, and as the close progresses, those balances may become zero.

Our account reconciliation tool uses an insert/update process: new balances are inserted and changed balances are updated. If you don't send zero balances, the previous balances would remain in the account reconciliation tool.

Scenario #3

Any journals in the EBS primary ledgers that cause the periodic balances to become zero during the close. We load STAT units, which get reversed on day-5 of the close and posted for the current period on day+1 of the close. If the periodic units become zero, they will not get loaded, and the reversal of units from the prior period remains. For this reason, we enabled NZP to load the zeroes.






Thursday, January 30, 2020

Kscope19 - My experiences & Learnings

Kscope18 happened at the Disney World resorts in Orlando, Florida, and it was a blast. That was my first Kscope experience, and it inspired me a lot. I was the only one from Genpact to present (two sessions), and I took back all the learnings and experiences and inspired a few more colleagues to submit abstracts for the Kscope19 conference.

Kscope19 happened in Seattle, and I couldn't have asked for more. It was again a great experience that can only be, well, experienced. Though my abstracts were not selected, I am really happy that a few of my colleagues' abstracts were. Kudos to Kavibharathi, Megha Chopra, Mohit Jain (who couldn't make it to Kscope due to a visa issue), Yilmaz, and Zulma, who had their first-time experience and a chance to present at the conference.

Pre-conference day - 22nd Jun 2019 - Community Service Day

Every year, ODTUG kicks off with a community service day before the conference. I couldn't make it last year, so this year I made a point to attend and be part of it. The Kscope volunteers worked at the Pike Place Market food bank to pack lunch bags for the homeless and those in need. We were told about the history of Pike Place Market, and all the volunteers were split into five groups of 15 people each and given various activities. Our group was given a tour of Pike Place Market first and was then involved in packing the lunch bags.

Community service day in full effect


Neviana is really good on social media platforms, and no one can beat her. I credit Neviana for these updates, taken from her tweets.



We have a new Stack Master

If you see the below tweet, it's an ACE-only picture, and I finally get to be part of every ACE event. More to come. Stay tuned.

The Sunday Symposiums started on Sunday, 23rd Jun 2019. There were a lot of surprise announcements from Oracle in the world of Analytics, EPM, and EPM Cloud; that requires another post, and I will not be able to cover it here. Many of the announcements seemed to be strategic decisions from Oracle, but we all have to wait and see whether any of them will change with respect to the existing customer base, as the announcements have raised a lot of questions.

24th Jun 2019 - Kscope19 officially started off with the general session. Hats off to all the Board of Directors, the conference committee, and all the volunteers who spent many hours making Kscope19 happen.



Congratulations to the newly added ACEs and those who got promoted. I am pretty happy that I have been accepted as an Oracle ACE Associate.






To put some stats behind why the Oracle ACE program is important:

The announcement was followed by the Leadership Graduation Ceremony. It was a great honor and privilege for the entire leadership program team to receive their certificates in front of 1400+ attendees at the general session.

Opal Alapat completed her term as conference chair, and Kevin McGinley will be taking over as conference chair for the next two Kscope conferences. Kscope20 will be in Boston, MA, and Kscope21 in Nashville, TN. I think this is the first time Kscope is coming to Boston, with Nashville to follow. Opal has written an awesome blog on what it means to be part of ODTUG. We all thank Opal from the bottom of our hearts for her contributions. For those who do not know: though Kscope happens once a year, it takes a full year for the conference committee to plan and make sure it delivers above and beyond what everyone expects.

The keynote speaker was <> and everyone was hooked by his magic. You have to look at Teal Sexton's expression. It was mind-blowing.


Last year at Kscope18, I was the only person from Genpact to present. This year at Kscope19, we had a total of 6 sessions, 4 speakers, and 2 attendees (including me :)).
Below are the sessions the Genpact team presented. The presentations are available to all attendees on the ODTUG website.
A couple of pictures and the Twitter feed for the Genpact team:

Tuesday
For the first time ever, I was part of the ACE dinner. It's a completely different feeling sitting with the ACEs and talking about stuff you don't generally get to talk about anywhere else. Thanks, Jennifer, for all the work you have put in to recognize the Oracle ACEs and the value that the ODTUG community brings to the Oracle community.

On Wednesday, the Leadership Class of 2019 presented their project to the board. I am thrilled to be part of such an amazing class: Natalie Brunamonte, Nick Boronski, Jackie Mcllroy, Faraz Rahim, Edith Villareal, and myself. A special thanks to Wendy Wilson for being our leadership program coordinator, and to Karen and Veronica for your support, for guiding us, and for providing information whenever we needed it. A big thanks to all the presenters of our monthly sessions on leadership; they really helped a lot.

Here are a couple of pics and the Twitter feed of our leadership class and our presentation to the board (I made sure to pick pictures where you will not be able to read what our project was :P).

I met a lot of the friends I made last year at Kscope18, and having a reunion is always great. A couple of pictures:

Wednesday night is when all the fun starts. Kscope is not just about education, networking, and making connections; it's also about relaxing and having some fun. Every year, Kscope makes it interesting by coming up with a theme. Last year's theme was Disney, and this year it was Pop Culture. Great costumes, great fun. This is one of the reasons people keep coming back to Kscope. Kscope isn't an a-la-carte menu; it's a buffet.

Finally, the deep dive sessions on Thursday and the closing session. It's sad to hear that Natalie Delemar's term will come to an end in December. It's sad to see two iron ladies step away, but they are still with ODTUG, are within reach, and will be helping in the background. Thanks for all your hard work and dedication; now is the time to get back to your personal goals (there might be a huge list of things to do).
The closing session concluded with the award ceremony, and below are the people who won the awards. Worth mentioning: Pete () won the top speaker award. YooHoo!!!! Congrats, Pete. I recommend everyone take a look at Pete's presentations from Kscope19 and past Kscopes. They are tremendous and have a depth of knowledge you cannot get anywhere else.




Friday, April 12, 2019

EPM Cloud (PBCS/EPBCS/FCCS) - Report Bursting & Reports scheduling

The first Oracle EPM Cloud product was launched in 2014, and over the course of these five years Oracle has brought to the cloud every EPM product that is available on-premise. With so many products under the EPM belt, each with its own functionality, there is a real need for a single unified reporting tool that can handle the reporting requirements across all your EPM products in one place. EPRCS (Enterprise Performance Reporting Cloud Service) is Oracle's answer for the reporting needs of any organization of any size. If your team is responsible for management, narrative, and external reporting, with the ability to author and collaborate plus a solid approval process, you should definitely consider implementing EPRCS at your organization.

EPRCS can connect to your EPM Cloud products, Essbase Cloud, and also your Fusion Applications Essbase app. It addresses all your financial, management, narrative, and disclosure reporting. I am not going to talk much about EPRCS, as Opal Alapat has done a great job writing a lot of content around it. Below is the list of blog posts Opal has written on EPRCS. Opal has also written a book, which you can buy here.

What I am going to talk about in this post is how to achieve an on-premise functionality in the EPM Cloud products. Every EPM Cloud product (PBCS, EPBCS, FCCS) provides the ability to create FR reports in its own instance. This strategic decision to include reporting capability within each individual cloud product gives small organizations, whose reporting requirements can be handled from the PBCS instance alone, a real opportunity.

One of the nicest features of Hyperion Financial Reports in the on-premise world is the ability to combine reports into books and to burst a report to users at a scheduled time. If you look at the Jobs section in a PBCS instance, for example, you don't see an option to schedule reports; you can only open them over the web or in Smart View. I am not sure if anyone has noticed, but there is an option to schedule the reports. This lets you build reports within the instance itself, and the ability to schedule those jobs can give clients an edge. Think of the HFR capability in the individual cloud product as analogous to CDM (Cloud Data Management) when compared to EPRCS.

Before you start any migration project or kick off an implementation in the cloud, you cannot sideline the reporting requirements.

Log in to your cloud instance, click the hamburger icon (Navigator) at the top left, and click Explore Repository under Reporting. Depending on which cloud features are enabled in the individual application, the location of Reporting may vary.




The below screen opens with the list of folders and reports.

Step 1: Create a batch for Scheduling. Select "Batch Reports for Scheduling" and click "Next"


Step 2: Add the reports to the batch. [1] Select the report "TBC_PnL_Report". [2] Click to move the report to the right. [3] Click Finish to add the report to the batch.


In the next screen, give the batch a name and click "Save".



Step 3: Follow the screenshot below along with the instructions to create a new batch scheduler

Step 4: Follow the instructions in the screenshot below for the batch scheduler job setup. Here, I have set the "Frequency" to "Perform Now", but you can schedule it for whatever time you want.
Step 5: Follow the rest of the screenshots to configure the output and e-mail options for your batch scheduler.






Step 6: You should see the batch reports sent to your mail.


I hope this gives you an idea of how to bring the most loved feature of HFR reports from your on-premise environment to the cloud. Before you make any decision to move to the cloud, it is really important to look at:
  1. Who are your end users?
  2. What type of processes do you have on-premise, and how are you going to implement / reproduce them in the cloud?
  3. Are there any customizations built outside of your EPM on-premise environment, and how will they be affected once you move to the cloud?
  4. What are your end-user reporting capabilities, and will you be able to provide / complement them with another solution if they are not available in the cloud?
Watch out for more blog posts on helping you move to the cloud / get to know the cloud better. Till then,
Happy Learning!!!










Friday, January 25, 2019

Moving an Essbase outline from On-Prem to Cloud (OAC) works

I am part of the EPM competency at my organization, and I often get pulled into discussions. There was one instance this week where I was pulled into a meeting for a simple yet typical problem.

Problem statement
The client had an Essbase ASO application that was migrated to Oracle Analytics Cloud and had not been used for a long time. Recently, they loaded the data and started validating. There is a currency conversion process that was not producing any results. They wanted to check what had changed / what was missed. Unfortunately, they only had the OTL and data files, and they did not have any on-prem instance where they could easily open the OTL in EAS.

We had a lab instance where we opened the outline and figured out that the Entity dimension was missing its UDAs. Now we wanted an easy way to get the UDAs so we could make the change quickly.

There is no way to simply view an OTL file, and they reached out asking how we could extract the dimension from the OTL so they could make the change.

A couple of things we tried:

  • I had the Razza Outline Extractor, but for some reason it gives a read violation error on ASO outlines. Ruled out.
  • I created a sample app in our lab environment, used the DBX tool to export it to a workbook, and shared it.

This got me thinking: Essbase Cloud is just Essbase. Why can't we create an app.db and just copy over the outline? There would be version differences, but in the end it worked out very well, without any issues.

Sometimes the simple things slip our minds. Glad that it all worked out...

Happy Learning!!!




Monday, January 7, 2019

PBCS/EPBCS - ASO exclude shared & Dynamic

As you are all aware, Oracle releases patches for EPM Cloud every month (EDMCS is released every two months), and the patches are applied on the Friday of the first week in Dev and the Friday of the third week in Prod.

I did a post a long time back about a challenge I faced on-premise and how I addressed it.

New functions were released in the Nov-2018 update of PBCS. Below is an excerpt from the readiness document. You can find the document here.

New Aggregate Storage Functions in Calculation Manager
The following Aggregate Storage functions have been added to Calculation Manager. These functions filter the shared and dynamic members from the Point of View (POV). The functions can be used in the POV of an Aggregate Storage custom calculation or allocation.

  • @FilterDynamic(Dimension Name, Member Name) removes all dynamic members from the list of members
  • @FilterShared(Dimension Name, Member Name) removes all shared members from the list of members
  • @FilterSharedAndDynamic(Dimension Name, Member Name) removes all dynamic and shared members from the list of members
These are very helpful functions. Previously, you didn't have a way to exclude shared or dynamic members. Remember that these have only been added for running custom calculations on ASO. I wish these functions were available as part of data maps as well and worked across the board for both BSO and ASO.

Previously, you would have to address this by using UDAs to exclude / include the members used in custom calculations.
In the on-premise world, there is a workaround where you can use a mix of Filter and member properties, and I have used it very heavily in the ASO applications I have built so far:

Filter(Filter(Descendants([NetIncome], [Account].Levels(0)),NOT [Account].CurrentMember.Shared_Flag),[Account].CurrentMember.MEMBER_TYPE <> 2)

The above logic excludes the shared members and dynamic calc members. It can be used in clears, procedural calcs, and MDX queries.

However, in the cloud world, you do not have access to run MDX queries / MaxL scripts the way you can on-prem. If you have to run procedural calcs on ASO, you have to use Calculation Manager, and you are limited compared to what you can actually do with MaxL and MDX scripts.

Oracle has heard this and has come up with these functions to give you easy-peasy ways of addressing the never-ending challenges you face in the ASO world. This is just the beginning. There is a lot more you can do with procedural calcs and MDX compared to what you can do with Calculation Manager on ASO. I wish they would add the ability to execute MDX queries and store the result in the inbox/outbox folder.

I haven't personally tried them. If you have, do let me know and I will link your blog here.

Saturday, July 14, 2018

Essbase Cloud - REST API

REST APIs are not something new in the Oracle EPM world; since they were unveiled, they are what runs behind most of the Oracle EPM product portfolio. That is the direction Oracle is pointing, and you should probably listen and start looking into it.

When OAC was released in March 2017 and I took a look at it three months later, I wondered whether there were REST APIs powering EssCli, and I was right. Oracle did mention it at the Sunday Symposium and showed a demo of how the REST APIs can be used for dynamic data loading, and I was thrilled.

I was even more thrilled when my good friend Vijay Kurian gave a session on the Essbase REST API. You can download his presentation once it is available to everyone from ODTUG. As presenters and attendees, we get early access to the presentations, and I feel very privileged about that.

If you want to get started right away and go look at the documentation, it's not available yet. This is where it becomes difficult to know which operations you can perform with the REST API and what the payload format should be for your POST requests. What I am going to show in this post is how to dig in and take a look at the endpoints and their format. Suresh T, on the essbaselabs blog, has shared the REST endpoints and JSON for outline editing. This gives a sneak peek at every operation you can perform using the REST API.

Kumar Ramiyer hinted that every click in OAC is a REST API call. Considering that, every single operation you can perform in the UI can be customized / extended, or you can even build your own operational UI.
This is very exciting. But how and where do you start? John Goodwin (inlumi), Jason Jones (Applied OLAP), and Victor (Oracle) have shared information about the REST API here, here, and here. Also, look at Suresh T's post on essbaselabs.

If you have past experience with Essbase, you may have heard about the C API, VB API, and Java API, which extend the existing functionality and give you the ability to build custom applications that would not have been possible otherwise.

There are many tools available to help you get started with REST APIs. Postman is one such tool and can be very helpful. If your organization restricts installing third-party software, you can use the various REST clients available as add-ons for most browsers. I personally use RESTClient with Firefox Quantum Ver 60.0.2 (64-bit), and it works beautifully.

With no documentation available, how will I ever know my REST endpoints? That's a valid question. There are two ways to find out, and neither of them is my discovery.

  1. You can use the URL format 'http://your_essbase_url/essbase/rest/v1/application.wadl'. If you don't know what WADL is, check out this link from Wikipedia. This was shared by Vijay Kurian at ODTUG Kscope18 in his presentation.
  2. You can use Fiddler to track every single action of yours, and based on that you can capture and define your REST endpoints. This is cool, right?? Thanks to Kumar Ramiyer for hinting at this.
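Option [1] is simple enough to sketch in a few lines of Python. The host below is a placeholder for your instance, and the actual request is left commented out so nothing is fetched.

```python
# Option [1]: the WADL document describes every REST resource the server exposes.
# "<your_essbase_url>" is a placeholder for your instance's host.

def wadl_url(base_url):
    """Build the URL of the WADL that lists all REST endpoints."""
    return base_url.rstrip("/") + "/essbase/rest/v1/application.wadl"

print(wadl_url("http://<your_essbase_url>"))

# Fetching it would look like this (commented out to stay offline):
# import requests
# wadl_xml = requests.get(wadl_url("http://<your_essbase_url>"),
#                         auth=("user", "password")).text
```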
I am going to kick-start with [2] now and show you how to use Fiddler to track the OAC operations and use that to build your own end-to-end process with your choice of scripting language and the REST API.

You can install Fiddler from this link. Installation is pretty straightforward.
When you open Fiddler, it will automatically track the traffic. You can turn that off first by clicking File > Capture Traffic.

Once you have turned off "Capture Traffic", make sure you have "Stream" and "Decode" on.

Minimize Fiddler and the Firefox browser, click on "Any Session", select the Firefox browser, and enable the traffic via File > Capture Traffic.
Before


After
Somehow, it didn't work. So I closed all the browsers and opened only Chrome. I believe it probably has to do with the default browser setting. If you get it to work with Firefox, let me know.

Open your OAC instance and you will be directed to the usual link. Look at my previous post here, where I have shown how to switch from the regular UI to the JET UI (modern interface). If you are all set, you should see something like below.

Marked in the top block is the REST API link tracked by Fiddler. At the bottom is the list of applications returned. The link captured is the home screen of OAC. You might still doubt whether this is real....

Let's create an application in the UI and see what is tracked in Fiddler.
Click on "Create Application".
Provide the application name and database name, expand the Advanced options, and change anything you would like to.
Once the application is created, you should see it reflected on the home screen.


Below is what is tracked in Fiddler.


Creating an application is a POST request. Marked at the top is the POST request.

URL Link
http:///rest/v1/applications
JSON Payload

{
   "applicationName":"test1",
   "databaseName":"test1",
   "databaseType":"B",
   "allowDuplicates":"false",
   "enableScenario":"false"
}
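The same POST can be reproduced outside the browser. Here is a hedged Python sketch: the host and credentials are placeholders, and the request itself is commented out so the snippet only builds the payload captured above.

```python
import json

def build_create_payload(app_name, db_name, db_type="B"):
    """Mirror the JSON body Fiddler captured for application creation.

    db_type "B" produces a BSO cube; see the note further down about
    what the trace shows for ASO.
    """
    return {
        "applicationName": app_name,
        "databaseName": db_name,
        "databaseType": db_type,
        "allowDuplicates": "false",
        "enableScenario": "false",
    }

payload = build_create_payload("test1", "test1")
print(json.dumps(payload, indent=2))

# Sending it (commented out so nothing is created by accident):
# import requests
# resp = requests.post(
#     "http://<your_essbase_url>/essbase/rest/v1/applications",
#     json=payload,
#     auth=("user", "password"),
#     headers={"Accept": "application/json"},
# )
# print(resp.status_code)
```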

Isn't it cool? I would say this is awesome.

Now let's try the same using RESTClient in Firefox and see if we can create another application.
Just like in any other REST client, you can set up the authentication and any custom headers (I have set Accept to application/json) and use the same URL displayed in Fiddler to show that both are the same. Below is how it looks in RESTClient in Firefox.


Scroll down, click on "Response", and you should see the applications.

If I scroll even further, it shows the "test1" application I created.




Now, let's try to create an application using the same format tracked in Fiddler and see if the application gets created.
For application creation, you have to change the method to POST.
You have to provide the JSON payload in the body section.
It should look something like below.


You would get an "Unsupported Media Type" error if you do not specify "Content-Type" as "application/json".

If you get a 200 OK response, then the application creation was successful. You should see the application once you refresh your home screen.

Something seems weird, right? In my JSON payload, I specified the databaseType as A, assuming it would create an ASO cube, similar to databaseType B for BSO. But, unfortunately, it created a BSO cube.
I was curious, so I manually created an ASO application from OAC and looked at the Fiddler trace.
Strange, isn't it? The databaseType for BSO is B, but for ASO it is ASO. I wish there were a standard code for this.

How do we delete an application? For deletion, you use the DELETE method and the same URL as above, with your application name appended:
http:///rest/v1/applications/test2
You do not need a JSON payload to delete an application.

Don't get confused if the response is 204 No Content, as DELETE does not come back with any output. If you get the response code "204 No Content", you should be good.
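A minimal Python sketch of the delete call, with the 204 check described above. The host and credentials are placeholders, and the request is left commented out.

```python
def delete_succeeded(status_code):
    """DELETE returns 204 No Content on success -- there is no body to parse."""
    return status_code == 204

# import requests
# resp = requests.delete(
#     "http://<your_essbase_url>/essbase/rest/v1/applications/test2",
#     auth=("user", "password"),
# )
# print(delete_succeeded(resp.status_code))

print(delete_succeeded(204))  # prints True
print(delete_succeeded(404))  # prints False, e.g. the application does not exist
```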
Let's refresh the OAC home screen and see
This is just the beginning.


The only question I have now is whether it will only be the REST API, or whether there will also be support for a Java API. I am not a programmer, so I don't know how it works under the covers. Does the REST API hit the Java API internally? I do not know.

Take a look at essbaselabs, as the Oracle team is doing brilliant work sharing information there.
I will stop here. You can use the approach I showed to track any request; let me know if you hit a showstopper or if I have said something wrong.

It took my whole Saturday to write this post. I should have planned it better, and I have to speed up my writing.

In the next post, I am going to talk about something outside of Essbase by giving an introduction to Jupyter notebooks. There is a reason I want to cover it, but it's a surprise, and you will have to wait till my next post.

Happy Learning and Sharing!!!

Sunday, June 24, 2018

OAC - Switch Classic to Modern - Trick

Back-to-back posts today. But I am excited and wanted to share this cool hack.... Not sure if anyone else has figured it out....

I did a blog post on the OAC new / modern interface and a few features that I really liked; you can find it here. A good friend (we became friends at Kscope18), Vijay Kurian, posted on his blog, theunlockedcube, about the Essbase Cloud features (Part I and Part II), and he has shown how to switch from the classic interface to the modern interface.

For some reason, it didn't work for me in the promotional OAC instance I was using. However, I was able to switch based on what I posted in my previous blog post.

In general, if you want to switch from Classic to Modern, follow the below screenshot.

However, for some reason that didn't work for me, and I didn't know why.

What I did might sound pretty silly, but now I have the whole new modern interface that I can use without having to rely on the Deep Dive environment I was using earlier. (That's an Oracle environment, and I don't know how long it will stay live.)

The OAC classic link will be https:///essbase/ui/ as shown in the below screenshot.



To get to new interface replace essbase/ui with essbase/jet.

https:///essbase/jet/ will be your new modern interface
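If you flip between instances often, the swap is trivial to script. A tiny sketch (the host below is a made-up placeholder, not a real instance):

```python
def to_modern(classic_url):
    """Swap the classic UI path (/essbase/ui) for the JET UI path (/essbase/jet)."""
    return classic_url.replace("/essbase/ui", "/essbase/jet")

print(to_modern("https://myoac.example.com/essbase/ui/"))
# prints https://myoac.example.com/essbase/jet/
```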


Isn't it cool? It's awesome!!!