Friday, April 12, 2019

EPM Cloud (PBCS/EPBCS/FCCS) - Report Bursting & Reports scheduling

The first Oracle EPM Cloud product was launched in 2014, and over the course of these five years Oracle has brought every on-premise EPM product to the cloud. With so many products under the EPM belt, each with its own functionality, there is a real need for a single unified reporting tool that can handle all your reporting requirements across all your EPM products in one place. EPRCS (Enterprise Performance Reporting Cloud Service) is Oracle's direction for addressing the reporting needs of any organization of any size. If your team is responsible for management, narrative and external reporting, with the ability to author, collaborate and run a solid approval process, you should definitely consider implementing EPRCS at your organization.

EPRCS can connect to your EPM Cloud products, Essbase Cloud and also to your Fusion Applications Essbase app. It addresses all your financial, management, narrative and disclosure reporting. I am not going to talk much about EPRCS, as Opal Alapat has done a great job writing a lot of content around it. Below is the list of blog posts that Opal has written on EPRCS. Opal has also written a book, which you can buy from here

What I am going to talk about in this post is how to achieve an on-premise functionality in the cloud EPM products. Every EPM cloud product (PBCS, EPBCS, FCCS) provides the ability to create FR reports in its own instance. Having this reporting capability within each individual cloud product is a strategic decision: it gives small organizations an opportunity to handle their reporting requirements from the PBCS instance alone.

One of the nicest features of Hyperion Financial Reports in the on-premise world is the ability to combine reports into books, and to burst a report to users at a scheduled time. If you look at the Jobs section in a PBCS instance, for example, you don't see an option to schedule the reports; you can only open them over the web / using SmartView. I am not sure if anyone has noticed, but there is an option to create a schedule for the reports. This gives you an opportunity to build reports within the instance itself, and the ability to schedule those jobs can give an edge to clients. Think of the HFR capability in the individual cloud product as similar to CDM (Cloud Data Management) compared to EPRCS

Before you start any migration project or kick off an implementation in the cloud, you cannot sideline the reporting requirements.

Log in to your cloud instance, click on the hamburger icon (navigator) on the top left and click on Explore Repository under Reporting. Depending upon which cloud features are enabled in the individual cloud application, the location of Reporting is going to change

The screen below opens with the list of folders and reports

Step 1: Create a batch for Scheduling. Select "Batch Reports for Scheduling" and click "Next"

Step 2: Add the reports to the batch. [1] Select the report "TBC_PnL_Report". [2] Click to move the report to the right. [3] Click "Finish" to add the report to the batch

In the next screen, give a name to the batch and click "Save"

Step 3: Follow the screenshot below along with the instructions to create a new batch scheduler

Step 4: Follow the instructions in the screenshot below for the batch scheduler job setup. Here, I have set the "Frequency" to "Perform Now", but you can schedule it for whatever time you want.
Step 5: Follow the rest of the screenshots to configure the output and e-mail options for your batch scheduler

Step 6: You should see the batch reports sent to your mail.

I hope this gives you an idea of how to bring the most loved feature of HFR reports in your on-premise environment to the cloud. Before you make any decisions to move to the cloud, it is really important to look at:
  1. Who are your end-users?
  2. What type of processes do you have on-premise, and how are you going to implement / reproduce the same in the cloud?
  3. Are there any customizations built outside of your EPM on-premise environment, and how would these be affected once you move to the cloud?
  4. What are your end-user reporting capabilities, and will you be able to provide / complement with another solution for the users if one is not available in the cloud?
Watch out for more blog posts on helping you move to the cloud / get to know the cloud better. Till then,
Happy Learning!!!

Friday, January 25, 2019

Moving an Essbase outline from On-Prem to Cloud (OAC) works

I am part of the EPM competency at my organization and I often get pulled into discussions. There was one instance this week where I was pulled into a meeting for a simple yet typical problem

Problem statement
The client had an Essbase ASO application that was migrated to Oracle Analytics Cloud and was not used for a long time. Recently, they loaded the data and started validating. There is a currency conversion process which was not producing any results. They wanted to check what had changed / what was missed. Unfortunately, they only had the OTL and the data files, and they did not have any on-prem instance where they could easily open the OTL in EAS.

We had a lab instance where we opened the outline and figured out that the Entity dimension was missing the UDAs. Now, we wanted an easy way to get the UDAs so that we could make the change very quickly

There was no way they could view an OTL file directly, so they reached out asking how we could extract the dimension from the OTL so they could make the change

Couple of things we tried

  • I had the Razza Outline Extractor, but for some reason it gives a read violation error on ASO outlines. Ruled out
  • I created a sample app in our lab environment, used the DBX tool to export it to a workbook and shared it

This got me thinking. Essbase Cloud is just Essbase. Why can't we create an app.db and just copy over the outline? There would be version differences, but in the end, it worked out very well without any issues

Sometimes, the simple things slip from our minds. Glad that it all worked out...

Happy Learning!!!

Monday, January 7, 2019

PBCS/EPBCS - ASO exclude shared & Dynamic

As you are all aware, Oracle releases patches to EPM Cloud every month (EDMCS is released every 2 months), and the patches are applied on the first Friday of the month in Dev and the third Friday in Prod

I did a post long back about a challenge that I faced on-premise and how I addressed it.

New functions were released in the Nov-2018 release of PBCS. Below is an excerpt from the readiness document. You can find the document here

New Aggregate Storage Functions in Calculation Manager
The following Aggregate Storage functions have been added to Calculation Manager. These functions filter the shared and dynamic members from the Point of View (POV). The functions can be used in the POV of an Aggregate Storage custom calculation or allocation.

  • @FilterDynamic(Dimension Name, Member Name) removes all dynamic members from the list of members
  • @FilterShared(Dimension Name, Member Name) removes all shared members from the list of members
  • @FilterSharedAndDynamic(Dimension Name, Member Name) removes all dynamic and shared members from the list of members
These are very helpful functions. Previously, you didn't have a way of excluding shared members / dynamic members. Remember that these have only been added for running custom calculations on ASO. I wish these functions were available as part of data maps as well, working across the board for both BSO and ASO
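For illustration, the POV of an ASO custom calculation might call one of these functions like this (a sketch only; "Account" and "NetIncome" are example names borrowed from the workaround below, and I have not run these functions myself):

```
/* POV member set: members under NetIncome in the Account
   dimension, with shared and dynamic members filtered out */
@FilterSharedAndDynamic("Account", "NetIncome")
```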

Previously, you would have had to address this by using UDAs to exclude / include the members used in custom calculations.
In the on-premise world, you have a workaround where you can use a mix of Filter and member properties to address this, and I have used it very heavily in the ASO applications that I have built so far:

Filter(Filter(Descendants([NetIncome], [Account].Levels(0)),NOT [Account].CurrentMember.Shared_Flag),[Account].CurrentMember.MEMBER_TYPE <> 2)

The above logic will exclude the shared members and dynamic calc members. It can be used in clears, procedural calcs and MDX queries

However, in the cloud world, you do not have access to run MDX queries / MaxL scripts the way you can on-prem. If you have to run procedural calcs on ASO, you have to use Calculation Manager, and you are limited compared to what you can actually do using MaxL and MDX scripts

Oracle has heard this and has come up with these functions to give you easy ways of addressing the never-ending challenges you face in the ASO world. This is just the beginning. There is a lot more you can do with procedural calcs and MDX compared to what you can do with Calculation Manager on ASO. I wish they would add the functionality to execute MDX queries, with the ability to store the result in the inbox/outbox folder

I haven't personally tried them. If you have, do let me know and I will link your blog here.

Saturday, July 14, 2018

Essbase Cloud - REST API

The REST API is not something new in the Oracle EPM world. It is what runs behind most of the Oracle EPM product portfolio, it is the direction Oracle says it is heading, and you should probably hear that and start looking into it

When OAC was released in March 2017, and when I took a look at it three months later, I always wondered if there were REST APIs powering EssCli, and I was right. Oracle did mention that at the Sunday Symposium and showed a demo of how the REST APIs can be used for dynamic data loading, and I was thrilled.

I was even more thrilled when my good friend Vijay Kurian gave a session on the Essbase REST API. You can download his presentation once it is available for everyone from ODTUG. Being a presenter and attendee, we get early access to the presentations, and I feel so privileged about it

If you want to get started right away and go look at the documentation, it's not available yet. This is where it becomes difficult to know what operations I can perform with the REST API and what the payload format should be for my POST requests. What I am going to show in this post is how to dig down and take a look at the endpoints and the format of the REST endpoints. Suresh T on the essbaselabs blog has shared the REST endpoints and JSON for outline editing. This gives a sneak peek at all the operations (every operation) you can perform using the REST API

Kumar Ramiyer hinted that every click in OAC is a REST API call. Considering that, every single operation that you can perform in the UI can be customized / extended, and you can build your own operational UI.
This is very exciting. But how do you start, and where? John Goodwin (inlumi), Jason Jones (Applied OLAP) and Victor (Oracle) have shared their information about the REST API here, here and here. Also, look at Suresh T's post on essbaselabs.

If you have past experience with Essbase, you may have heard about the C API, VB API and Java API, which extend the existing functionality and give you the ability to build a custom application that would not have been possible otherwise.

There are many tools available to get started with REST APIs. Postman is one such tool which can be very helpful. If your organization has restrictions on installing third-party software, you can use the various REST clients available as add-ons for most browsers. I personally use RESTClient with Firefox Quantum Ver 60.0.2 (64-bit), and it works beautifully.

With no documentation available, how will I ever know my REST endpoints? That's a valid question. There are two ways you can find out. Neither of them is my discovery

  1. You can use the URL format 'http://your_essbase_url/essbase/rest/v1/application.wadl'. If you don't know what WADL is, check out this link from Wikipedia. This was shared by Vijay Kurian at ODTUG Kscope18 in his presentation
  2. You can use Fiddler to track every single action of yours, and based on that you can capture and define your REST endpoints. This is cool, right?? Thanks to Kumar Ramiyer for hinting at this
I am going to start now with option [2] and show you how you can use Fiddler to track the OAC operations, then use that to build your own end-to-end process with your choice of scripting language and the REST API

You can install fiddler from this link. Installation is pretty straightforward.
When you open Fiddler, it will automatically track the traffic. You can turn that off first by clicking File > Capture Traffic

Once you have turned off "Capture Traffic", make sure that you have "stream" and "Decode" on.

Minimize your Fiddler and Firefox browser windows, click on "Any Session", select the Firefox browser and enable the traffic via File > Capture Traffic.

Somehow, it didn't work. So I closed all the browsers and opened only Chrome. It probably has to do with the default browser setting, I believe. If you get it to work with Firefox, let me know.

Open your OAC instance and you will be directed to the usual link. Look at my previous post here, where I have shown how to switch from the regular UI to the JET UI (Modern Interface). If you are all set, you should see something like below

Marked in the top block is the REST API link that is tracked with Fiddler. At the bottom is the list of applications returned. The link captured is the home screen of the OAC. You might doubt whether this is really real....
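Once you have the home-screen URL from the Fiddler trace, you can replay the same call from a script. Below is a minimal sketch in Python; the host name, the credentials, and the exact /applications path are placeholders / assumptions reconstructed from the captured URL, not documented values.

```python
# Sketch: replay the endpoint Fiddler captures when the OAC home
# screen loads. Host, user and password below are placeholders.
import base64
import json
import urllib.request

BASE = "https://your_essbase_url/essbase/rest/v1"  # placeholder host

def basic_auth_header(user: str, password: str) -> str:
    # Same Basic auth header that the browser / RESTClient sends
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

def build_list_request(user: str, password: str) -> urllib.request.Request:
    # GET on the applications collection -- the call behind the home screen
    return urllib.request.Request(
        f"{BASE}/applications",
        headers={"Accept": "application/json",
                 "Authorization": basic_auth_header(user, password)},
        method="GET")

# To actually run it against a live instance:
# with urllib.request.urlopen(build_list_request("admin", "password")) as r:
#     print(json.load(r))
```

The same request can of course be issued from Postman or the Firefox RESTClient, as shown in the screenshots.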

Let's create an application in the UI and see what is tracked in fiddler
Click on "Create Application"
Provide the application name, database name, expand the Advanced options and change if you like to change anything
Once the application is created, you should see that application reflected in the home screen

Below is what is tracked in fiddler

Creation of application is a POST request. Marked in the top is the POST request

URL Link
JSON Payload


Isn't it cool? I would say this is awesome.

Now let's try the same using the REST Client in Firefox and see if we can create another application.
Just like in any other REST client, you can set up the authentication and any custom headers (I have set Accept to application/json) and use the same URL that is displayed in Fiddler to show that both are the same. Below is how it would look in RESTClient in Firefox

Scroll down and click on "Response" and you should see the applications

 If I scroll even further, it will show the "test1" application that I have created

Now, let's try to create an application using the same format that is tracked in Fiddler and see if the application gets created.
For application creation, you have to change the method to POST.
You have to provide the JSON payload in the body section.
It should look something like below.

You will get an "unsupported media type" error if you do not specify "Content-Type" as "application/json"
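Scripted, the create call looks something like the sketch below. The payload key names and the "B" database-type literal are assumptions reconstructed from the Fiddler trace described in this post, not documented values; the host is a placeholder.

```python
# Sketch: the "Create Application" POST as captured by Fiddler.
import json
import urllib.request

BASE = "https://your_essbase_url/essbase/rest/v1"  # placeholder host

def build_create_request(app: str, db: str, db_type: str = "B"):
    # POST /applications with a JSON body; db_type "B" produced a BSO
    # cube in my test (ASO showed up as a different literal in the trace)
    payload = {"applicationName": app,
               "databaseName": db,
               "databaseType": db_type}
    return urllib.request.Request(
        f"{BASE}/applications",
        data=json.dumps(payload).encode(),
        # Omitting Content-Type is what triggers "unsupported media type"
        headers={"Content-Type": "application/json",
                 "Accept": "application/json"},
        method="POST")

# urllib.request.urlopen(build_create_request("test1", "db1"))
```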

If you get a 200 OK response, then the application creation is successful. You should see the application once you refresh your home screen

Something sounds weird, right? In my JSON payload, I specified the databaseType as A, assuming it would create an ASO cube, similar to databaseType B for BSO. But, unfortunately, it created a BSO cube only.
I was curious, so I manually created the application from OAC and looked at the Fiddler trace.
Strange, isn't it? The databaseType for BSO is B, but for ASO it is ASO. I wish there were a standard code for this

How do we delete an application? For deletion, you use the DELETE method and the same URL as above, with your application name appended.
You do not need a JSON payload to delete an application

Don't get confused if the response is 204 No Content, as DELETE will not come back with an output. If you get the response code "204 No Content", you should be good
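The delete step above can be sketched the same way; again the host and application name are placeholders, and the URL shape is an assumption based on the trace.

```python
# Sketch: deleting an application is a DELETE on the applications URL
# plus the application name, with no request body.
import urllib.request

BASE = "https://your_essbase_url/essbase/rest/v1"  # placeholder host

def build_delete_request(app_name: str) -> urllib.request.Request:
    # No JSON payload is needed for a delete
    return urllib.request.Request(f"{BASE}/applications/{app_name}",
                                  method="DELETE")

def delete_succeeded(status: int) -> bool:
    # A successful DELETE answers 204 No Content -- no output to parse
    return status == 204

# urllib.request.urlopen(build_delete_request("test1"))
```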
Let's refresh the OAC home screen and see
This is just the beginning.

The only question I have now is: will it only be a REST API, or will there be support for the Java API? I am not a programmer, so I don't know how it works under the covers. Does the REST API hit the Java API internally? I do not know

Take a look at essbaselabs, as the Oracle team is doing brilliant work in sharing information
I will stop here. You can use the approach I showed here to track any request, and let me know if you hit a showstopper / if I have said something wrong

It took my whole Saturday to write this post. I should have planned it better, and I have to speed up my writing

In the next post, I am going to talk about something outside of Essbase by giving an introduction to Jupyter notebooks. There is a reason why I would like to cover it, and it's a surprise; you have to wait till my next post.

Happy Learning and Sharing!!!

Sunday, June 24, 2018

OAC - Switch Classic to Modern - Trick

Back-to-back posts today. But I am excited and wanted to share this cool hack... Not sure if anyone has figured it out....

I did a blog post on the OAC new / modern interface and a few features that I really liked. You can find my blog post here. A good friend (we became friends at Kscope18), Vijay Kurian, posted on his blog at theunlockedcube about the Essbase Cloud features, Part I and Part II, and he has shown how to switch from the classic interface to the modern interface

For some reason it didn't work for me in the promotional OAC instance that I was using. However, I was able to switch based on what I posted in my previous blog post

In general, if you want to switch from Classic to Modern, follow the below screenshot

However, for some reason that didn't work for me, and I didn't know why

What I did might sound pretty silly, but now I have the whole new Modern Interface that I can use without having to rely on the Deep Dive environment that I was using earlier. (That's an Oracle environment, and I don't know till when that env will be live)

The OAC classic link will be https:///essbase/ui/ as shown in below screenshot

To get to new interface replace essbase/ui with essbase/jet.

https:///essbase/jet/ will be your new modern interface
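The swap above is simple enough to automate; here is a tiny helper sketch (the host in the example is made up):

```python
# Rewrite a classic OAC UI link into its JET (modern) counterpart
# by swapping the path segment, per the trick described above.
def classic_to_jet(url: str) -> str:
    return url.replace("/essbase/ui", "/essbase/jet")

print(classic_to_jet("https://myinstance.example.com/essbase/ui/"))
```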

 Isn't it cool...It's awesome!!!

OAC - New Interface

At ODTUG Kscope18, Oracle revealed new features and groundbreaking updates that are going to come in the next release of OAC. Have you ever wondered what it's going to be?

I was so excited that I kicked off the promotional offer and fired up my OAC instance. But I was surprised and felt sad after comparing the instance that is in the public cloud with the one that is going to come in the next release. Not sure if I am missing anything here which would enable me to access the newer interface / newer instance

The new interface looks sleek, a lot better than the previous one. I am using the instance that was part of the Hands-On Lab at ODTUG Kscope18. Not sure till when this environment will be available. I will update this blog and make it more comparative with what is available right now in the public cloud

The new JET UI. I don't know if that is the right way to refer to it, but that's what it says when you fire up the new OAC interface

Less clutter and way better. When you click on Actions at a database level, you will notice two new items: "Inspect" and "Analyze".
"Inspect" gives information about your database, the most important being the audit trail, statistics, partitions, filters and many more

What I really liked is the ability to execute MDX scripts. With the ability to perform calculations using MDX INSERT and export data using MDX EXPORT, it really makes sense to be able to execute MDX scripts.

"Analyze" on the other hand can be considered as smartview lite. You would have understood it by looking at the error message that popped up below.

This is not a fully functional SmartView; it can be considered a lite version for doing some basic testing, with the ability to execute MDX scripts. If you are on a Mac and don't want to install SmartView, this could be one way. However, it was announced at Kscope18 that Office 365 will be supported for SmartView. This is really good news.

If you look at Jobs, the types of jobs that you can execute have been expanded. You have the ability to run an MDX script, a Groovy script (this is new), and to build and clear aggregations. Take a look at the list below

I have to explore more on "Run Groovy" and where that would be applicable. I believe this is the first time Groovy has been added to Essbase. Till now, Groovy has only been available in Planning Cloud

Now let's look at files. Files could be anything from outlines (otl), calc scripts (csc), report scripts, MaxL scripts, MDX scripts and Groovy scripts to text files (csv, txt).
I believe the extensions of some of these files depend on the type of application that you are trying to build.

There are two ways that you can upload the files to an OAC.

  • You can upload at an APP/DB level by navigating from "Inspect" > the "Files" tab > "Upload files" and selecting the files that you want to upload
  • Another way is to click on "Files" at the top and upload the files from there. I would prefer this way, as it gives more control over where I can upload the files to. You will notice that there is a folder called "Gallery", and you can probably upload templates for reference and use them later

Note of Caution
Remember that these features are not live yet, and things may change between the time I am writing this and when it is actually available for you to patch up and see all these neat features

Now, let's take a look at one example. Below is the main screen. We will focus only on the applications for now

Navigate to the applications folder and drill down to the database level. "Upload Files" will let you upload files. When you select the files, wait for them to show as per the screen below and then proceed further. A file will come up with an X mark if its format is not supported; only the supported formats get uploaded. I am still trying to figure out the format for Groovy scripts

You would see below screen after the files are uploaded

Have you noticed the file type? Maxl, MDX, Groovy. Isn't it amazing!!! I really love it...

Things I felt would have been better

  • There is no way to cancel. Once you click "Close", the files get uploaded
  • If a file already exists, it shows an X mark and doesn't get uploaded
  • If you want to delete any files, you have to do it one by one. There is no way to select multiple files and delete them
I wish all the above limitations would eventually go away

In the next blog posts, I will try to migrate the application I used for my Kscope18 presentation into OAC and see how things work out. I am going to look at more use cases of MDX and how it can become a standard language in OAC

Thursday, June 21, 2018

OAC - Finally Up and Running

I had a few hiccups while setting up my OAC instance. I learned a lot from these mistakes and would always recommend reading the documentation first and understanding the pre-requisites before you get your OAC up and running

There are various blog posts that talk about how you can get started with OAC. The blogs that helped me get started are Blog 1 (a blog post at Rittman Mead, which also has information about how to set up https for your OAC instance) and Blog 2 (this particular blog helped me with the access rules that need to be enabled to get things to work)

Apart from these, there were two things where I got stuck, which will probably help others if you face the same issues

  • Even though I had created DBaaS and started the service, my OAC still popped up with the message that the pre-requisites were not met. All I did was open up the ports (http, https, dbconsole, dblistener), and after that I was able to create the OAC instance.
From your Dashboard, do as shown in the below screenshot

Click on "Open Service Console" and perform the step as shown in the below screenshot

Click on "Access Rules" and ensure that all those marked are enabled as per the below screenshot

Once this is done, you should be able to launch the "Create Analytics Service"
  • While creating the OAC instance, remember to use the REST endpoint URL for storage copied directly from the Accounts section. The format would be {REST endpoint URL}/{container}. I have named my container "beyondessbase". So, my storage container format would be {REST endpoint URL}/beyondessbase. Remember that the user id and password would be your cloud account.
You can get the REST endpoint of your storage container navigating as below
Click on the hamburger icon and click on "Dashboard". You would see all your created instances below

Click on "Open Service Console" and you would land to as below

Click on the "Accounts" and you would get the REST endpoint. This is the rest URL that you should use followed by the container. in my case, it's the {REST endpoint}/beyondessbase1. Though I have two containers, I was using beyondessbase1 for my OAC instance

I also had another issue of using http for my OAC instance, which I didn't like. Becky helped with a link to a post on how to fix that, which can be found in the blog post mentioned before as Blog 1

After all that was done, I was finally up and running with OAC. I will shut down the instances for now, to use them tomorrow. Not sure what challenges I will encounter once I fire up my OAC instance tomorrow. Till then, goodbye for now...

Remember to select the type of OAC that you would like to have. It can be Standard, Data Lake or Enterprise Edition

Happy Learning!!!