
OAC - Finally Up and Running

I had a few hiccups while setting up my OAC instance. I learned a lot from these mistakes and would always recommend reading the documentation first and understanding the prerequisites before you get your OAC up and running.

There are various blog posts that talk about how you can get started with OAC. The blogs that helped me get started are Blog 1 (a post at Rittman Mead, which also has information about how to set up HTTPS for your OAC instance) and Blog 2 (this particular blog helped me with the access rules that need to be enabled to get things to work).

Apart from these, here are the two things where I was stuck; they will probably help others who face the same issues.

  • Even though I had created a DBaaS instance and started the service, my OAC still popped up with the message that the prerequisites were not met. All I did was open up the ports (http, https, dbconsole, dblistener), and after that I was able to create the OAC instance (a quick way to verify the ports are reachable is sketched at the end of this step). From your Dashboard, do as shown in the screenshot below

Click on "Open Service Console" and perform the step as shown in the below screenshot


Click on "Access Rules" and ensure that all those marked are enabled as per the below screenshot

Once this is done, you should be able to launch the "Create Analytics Service"
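If you want to confirm that the ports are actually reachable before retrying the prerequisite check, a quick test from your own machine can save a round trip. The sketch below is my own (not Oracle tooling); the hostname is a placeholder and the port numbers are the usual defaults, so adjust them to whatever your DBaaS instance actually uses.

```python
# A quick sanity check to confirm the DBaaS ports are reachable after
# enabling the access rules. Hostname and ports below are assumptions
# (common defaults) -- replace them with your own values.
import socket

DBAAS_HOST = "my-dbaas-instance.example.com"   # assumption: public IP/hostname of your DBaaS VM

PORTS = {
    "http": 80,
    "https": 443,
    "dbconsole": 1158,    # 11g dbconsole default; 12c EM Express is typically 5500
    "dblistener": 1521,
}

for name, port in PORTS.items():
    try:
        # Try to open a TCP connection; success means the access rule is working
        with socket.create_connection((DBAAS_HOST, port), timeout=5):
            print(f"{name} ({port}): open")
    except OSError as exc:
        print(f"{name} ({port}): NOT reachable ({exc})")
```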
  • While creating the OAC instance, remember to use the REST endpoint URL for storage copied directly from the Accounts section. The format is {REST endpoint URL}/{container}. I have named my container "beyondessbase", so my storage container value would be {REST endpoint URL}/beyondessbase. Remember that the user ID and password would be your cloud account credentials.
You can get the REST endpoint of your storage container by navigating as below
Click on the hamburger icon and click on "Dashboard". You will see all your created instances below



Click on "Open Service Console" and you would land to as below


Click on the "Accounts" and you would get the REST endpoint. This is the rest URL that you should use followed by the container. in my case, it's the {REST endpoint}/beyondessbase1. Though I have two containers, I was using beyondessbase1 for my OAC instance



I also had another issue: my OAC instance was using HTTP, which I didn't like. Becky helped me with a link to a post on how to fix that, which can be found in the blog post mentioned before as Blog 1.

After all that was done, I am finally up and running with OAC. I will shut down the instances for now, to use them tomorrow. Not sure what challenges I will encounter once I fire up my OAC instance tomorrow. Till then, goodbye for now...

Remember to select the type of OAC that you would like to have. It can be Standard, Data Lake or Enterprise Edition



Happy Learning!!!
