Exporting Dynamic Calc Members… FAST! Via the Oracle EPM Cloud Planning REST API

I was recently brought into a small engagement where the client wanted to automate the export of budget data / updates from an Oracle EPM Cloud Planning BSO application (side note: I guess it is no longer PBCS? Can we call it EPM CP?) and import it into a relational database to be used in other applications. I have done similar things countless times, so it sounds easy, right? In my experience, many things seem easy on their face, but once you start to dig into the requirements they can get much more complicated.

Well, as I worked with the client, a few items of note came out that made this a little more difficult:

  • The ability to control the POV being exported (fair enough)
  • Allowing functional administrators to execute the process on-demand, but allow for it to be scheduled regularly in the future (again, fair enough, I expected this)
  • They wanted to export all levels of their sparse dimensions: parent level Entities / Departments, etc. (OK, these exports are going to be larger than expected, but still not an issue)
  • They wanted to export dynamic statistics and dynamic parent level accounts, as well as Quarters / Year Totals (and there is the catch)

OK, I do love a good challenge. Now with EPM CP there are a few options that I am aware of for exporting data:

  • DATAEXPORT – Business Rules
    • Setting DataExportDynamicCalc ON in DATAEXPORTOPTIONS
  • REST API – Export Data Slices
    • Done via external scripting (PowerShell in this case, but it could be Python / Groovy etc.)
    • Not to ruin the story, but this is what we ended up going with, and we had some pretty fantastic results. If you don’t care about the journey I took to get here, skip to the bottom of the page where I walk through the solution
    • This required significantly more effort and should not be taken lightly.
      • Shameless plug here -> in future posts (over the next few weeks) I will walk through a new personal project I have been working on: a PowerShell automation framework using a combination of EPMAutomate & the REST API (think of it as a “Starter Kit”)
      • This framework will be publicly available on GitHub for all to use!
      • This helped speed up development greatly
  • For clarity’s sake, there are other options, but I believe these run DATAEXPORT under the covers:
    • Data Export Jobs
      • Essbase Exports
      • Planning Exports (i.e. Outline Load Utility)
    • Data Management – Export to Flat File
      • Again, I believe this runs the DATAEXPORT command when exporting from a BSO

DATAEXPORT

So to begin, I started out with a tried and true business rule using DATAEXPORT. I know from past experience that performance has been miserable when you include dynamic calculations (setting “DataExportDynamicCalc ON” in DATAEXPORTOPTIONS), but I have a few tricks up my sleeve that I have used to help speed these up, namely FIXPARALLEL (post on this coming soon as well).

As I have advised every consultant I have worked with: start with a very small subset, get that running efficiently and accurately, and then scale up. The problem here was that even with my small subset of data, the performance was abysmal. For a single entity, with some of the other dimensions limited, the export was taking upwards of 15-20 minutes (this was running without FIXPARALLEL). This wasn’t a particularly large application, so this didn’t inspire any confidence.

So I ran it with FIXPARALLEL and tried several different dimensions to parallelize on, but I couldn’t get it to run faster than a few minutes per entity (even with the limited scope on the other dimensions). This wasn’t going to be my answer.

The Solution – REST API & Exporting Data Slices

So to give some background: I have exported dynamic calculations in the past on-prem, and to do that I used MaxL + MDX (as MDX can export dynamic calcs at lightning speed… even sparse dynamics!).

Unfortunately, at the time I was not aware of any way to run MDX on EPM CP to export data. So I sat there thinking: OK, if I did just want to run MDX, how could I do that in ANY way in Planning? The options I was aware of (before this solution):

  • Groovy DataGrids
  • SmartView Smart Slices (I believe this is available in Planning?)

Alright, short list, and in this case the client did not have Enterprise, so Groovy wasn’t an option. Well, back to the drawing board. I started perusing what options we might have via the REST API. I have used it several times in the past, but it is constantly being updated, so I wanted to investigate.

I was looking at Exporting Data Slices, and that looked somewhat promising. Then I noticed that in the request you have to provide a DataGridDefinition. DataGridDefinition… sounds just like what you do in Groovy… and as far as I am aware, the Groovy method essentially runs MDX under the covers. Very promising.

Now this had a few pretty large challenges:

  • MDX has a query limit (I believe it’s 2^32 cells), and having worked with it in the past, it’s a major pain. So we will likely have to loop through some of the dimensions and execute several exports
  • The export is returned as JSON, which thankfully PowerShell has some nifty ways of handling to get it into a CSV-type format
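As a quick illustration of that JSON handling (the response string here is a hypothetical, trimmed-down version of what the export returns, not a real API response), ConvertFrom-Json plus a join gets you most of the way to a delimited line:

```powershell
# Minimal sketch: turn an exportdataslice-style JSON response into pipe-delimited lines.
# The $json string is a hypothetical, trimmed-down response used purely for illustration.
$json = '{ "rows": [ { "headers": ["FY20","E1001"], "data": ["100.00","200.00"] } ] }'
$response = $json | ConvertFrom-Json
$lines = $response.rows | ForEach-Object {
    # Quote the member headers, pipe-delimit them, then append the data cells
    '"' + ($_.headers -join '"|"') + '"|' + ($_.data -join '|')
}
$lines   # -> "FY20"|"E1001"|100.00|200.00
```

This is essentially the same transformation the full export loop later in this post performs against the real response object.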

Getting Started

To get set up and start exploring the API, I first wanted to nail down the payload (think: our FIX / the dimension members we are requesting), how it needs to be formatted, and what the potential responses might be. I highly suggest using Postman (it’s free for a dev license). I won’t get into the details of setting that up in this post, but basically it got me to the point where I knew a few things:

  • The URI (think : URL)
  • The format of the Payload

I also wanted to confirm that it was fast, so I ran some tests with the same POV as the DATAEXPORT and it completed in under a second. Alrighty then, let’s get this set up.

The URI was :

$BASE_URL = "https://$YourURL/HyperionPlanning/rest/$APIVersion"
$URI = "$BASE_URL/applications/$YourApp/plantypes/$YourPlanType/exportdataslice"

The payload needed to be set up similar to the below (obviously it needed to be dynamic, as I would be running this in a loop).

{
	"exportPlanningData" : false, 
	"gridDefinition" : {
		"suppressMissingBlocks" : true,
		"suppressMissingRows" : true,
		"suppressMissingColumns" : true,
		"pov" : {
			"dimensions" : [ "Scenario","Version","Currency"],
			"members" : [ ["Budget"], ["Final"], ["USD"]]
		},
		"columns" : [{
			"dimensions" : ["Periods"],
			"members": [[ "BegBalance","IDescendants(YearTotal)" ]]
		}],
		"rows" : [{
			"dimensions" : ["Years","Layer","Entity","Department","Account"],
			"members" : [ ["&BudYr"], ["Descendants(Post_Allocated)"], ["E1001"], ["IDescendants(Total_Departments)"], ["IDescendants(INCOMESTMT)","Descendants(RATIO)","Descendants(STATS)"] ]
		}]
	}
}

This returns JSON like the following:

{
    "pov": [
        "Budget",
        "Final",
        "USD"
    ],
    "columns": [
        [
            "Jan",
            "Feb",
            "Mar",
            "Q1",
            "Apr",
            "May",
            "Jun",
            "Q2",
            "Jul",
            "Aug",
            "Sep",
            "Q3",
            "Oct",
            "Nov",
            "Dec",
            "Q4",
            "YearTotal"
        ]
    ],
    "rows": [
        {
            "headers": [
                "FY20",
                "PreAlloc",
                "E1001",
                "D001",
                "Account01"
            ],
            "data": [
                "0.00",
                "100.00",
                "100.00",
                "100.00",
                "300.00",
                "200.00",
                "200.00",
                "200.00",
                "600.00",
                "100.00",
                "100.00",
                "100.00",
                "300.00",
                "200.00",
                "200.00",
                "200.00",
                "600.00",
                "1800.00"
            ]
        },
        {
            "headers": [
                "FY20",
                "PreAlloc",
                "E1001",
                "D001",
                "Account02"
            ],
            "data": [
                "0.00",
                "100.00",
                "100.00",
                "100.00",
                "300.00",
                "200.00",
                "200.00",
                "200.00",
                "600.00",
                "100.00",
                "100.00",
                "100.00",
                "300.00",
                "200.00",
                "200.00",
                "200.00",
                "600.00",
                "1800.00"
            ]
        }
    ]
}

The Plan

OK, so we know how to request the data and how it comes back to us. Now let’s put together the plan:

  • Build a list of dimensional members that we are going to loop through
  • For each one of those members we need to export the data, then convert the exported data from JSON to a CSV friendly format and finally save it to a flat file on our local machine/server
  • At the end of the process we will loop through all the flat files and merge them into one

Getting the Dimensional Loop

As I mentioned previously, one issue with using the API is the limit on how many cells of data you can query. Based on past experience, I knew we couldn’t come close to grabbing all the data in one go; we would have to piece this out into lots of little queries.

To do that, I needed some dimensional information so I knew which members to feed into the payload. Thankfully, the REST API allows you to query dimensions.

It is up to you to determine whether one or more dimensions require looping through, and which dimensions to use. As a suggestion, I would start with a small dimension and keep testing until you get something sustainable.
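A rough way to decide whether a given POV needs to be split further is to estimate the cell count as the product of the member counts per dimension and compare it against the limit. Here is a sketch of that idea (the member counts and the 2^32 limit are illustrative assumptions, not values pulled from the API):

```powershell
# Rough sketch: estimate the cell count of a query as the product of the member
# counts per dimension, and compare against an assumed cell threshold.
# The counts below are illustrative; in practice you would derive them from the
# dimension query shown below.
$memberCounts = @{
    "Period"     = 18
    "Years"      = 1
    "Entity"     = 600
    "Department" = 50
    "Account"    = 400
}
$cellLimit = [math]::Pow(2, 32)   # assumed query limit, per the note above
$estimatedCells = 1
$memberCounts.Values | ForEach-Object { $estimatedCells *= $_ }
if ($estimatedCells -gt $cellLimit) {
    Write-Host "Query too large ($estimatedCells cells) - split on another dimension"
} else {
    Write-Host "Estimated cells: $estimatedCells"
}
```

With these sample counts the estimate comes out around 216 million cells, comfortably under the assumed limit, so a single query per entity would be fine.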

Here is how you can grab the dimensional information:

$DIM_URI = "$BASE_URL/internal/applications/$YourApp/plantypes/$YourPlanType/dimensions/$YourDimension"

This returns a JSON response with all the members in the dimension and their properties, like so (this example shows down through Q1 and has been slimmed down for clarity’s sake):

{
    "dimName": "Period",
    "name": "Period",
    "children": [
        {
            "dimName": "Period",
            "dataStorage": "Store Data",
            "name": "BegBalance",
            "parentName": "Period",
            "children": null
        },
        {
            "dimName": "Period",
            "name": "YearTotal",
            "parentName": "Period",
            "children": [
                {
                    "dimName": "Period",
                    "name": "Q1",
                    "parentName": "YearTotal",
                    "children": [
                        {
                            "dimName": "Period",
                            "name": "Jan",
                            "parentName": "Q1",
                            "children": null
                        },
                        {
                            "dimName": "Period",
                            "name": "Feb",
                            "parentName": "Q1",
                            "children": null
                        },
                        {
                            "dimName": "Period",
                            "name": "Mar",
                            "parentName": "Q1",
                            "children": null
                        }
                    ]
                }
            ]
        }
    ]
}

It basically resolves to the entire hierarchy, where you can navigate down through the “children” nodes. You can keep going until you reach a spot where “children” is null, which represents a level 0 member.

OK, so we can query a dimension and return the tree. Now we need to build the list of members we are going to loop through. Thankfully, while perusing something else in Oracle’s documentation, I found a few pieces of PowerShell code they used to do something similar, and with a little modification we can get what we need.

The first function finds the parent node we want to start from (so we can grab all of its descendants) and returns ONLY that piece of the hierarchy.

function Find-EPMMember {
    param(
        #Current Tree
        $tree,
        #Member to Find
        $memberName
    )
    $subtree = ""
    if ($tree.name -eq $memberName){
        return $tree
    } else {
        $tree.children.ForEach({
            if ($subtree -eq ""){ $subtree = Find-EPMMember $_ $memberName}
        })
        return $subtree
    }
}

Now that we have the relevant hierarchy, we can build a list of all members from there. This function has a switch that makes it return only level 0 members; otherwise it returns all descendants. It is basically a recursive function that runs through all the children in the hierarchy you provide (above) and builds a list of the members.

function Get-EPMMembers {
    param(
        #Parent member to start from to grab all descendants
        $parent,
        #Switch to get level 0 only
        [Switch]$returnBaseLevel
    )

    $parent.children.ForEach({
        if ($returnBaseLevel){
            #Only add members with no children (level 0)
            if ($_.children.count -eq 0){
                $script:listMembers += $_.name
            }
        } else {
            $script:listMembers += $_.name
        }
        #Recurse into the child, carrying the switch through so level-0-only mode works at every depth
        Get-EPMMembers -parent $_ -returnBaseLevel:$returnBaseLevel
    })
}

Bringing it all together for the dimension loop

Now let’s bring it all together and grab all descendants of “Total_Consolidated” in the Entity dimension (we want the parent entities in the export too, so we are not limiting this to level 0).

NOTE: The $API_HEADER provides the credentials to log in to the REST API. See John Goodwin’s post on how to set up the header.

#Build Dimensional Request
$DIM_URI = "$BASE_URL/internal/applications/$YourApp/plantypes/$YourPlanType/dimensions/Entity"
#Invoke REST API to get Entity Dimension
$dimEntity = Invoke-RestMethod -Uri $DIM_URI -Headers $API_HEADER -Method GET -ContentType "application/json" -UseBasicParsing
#Grab Tree Related to "Total_Consolidated"
$treeEntity = Find-EPMMember -tree $dimEntity -memberName "Total_Consolidated"
#Ensure we have a blank list to start
$script:listMembers = @()
#Grab all descendants of "Total_Consolidated"
Get-EPMMembers -parent $treeEntity

Run the Export and write to flat files

Now that we have the members we are going to loop through, it’s time to export the data. For each member in the loop, we need to:

  • Build the Payload using that respective member
  • Make the API Request and return the results to a JSON object
  • Convert that JSON Object to a CSV friendly format
  • Write the results to a flat file

$EXPORT_URI = "$BASE_URL/applications/$YourApp/plantypes/$YourPlanType/exportdataslice"
#POV members (fixed for this export), used when writing each output row
$mbrScenario = "Budget"
$mbrVersion = "Final"
ForEach ($Entity in $script:listMembers) {

    #Build Payload
    $Payload = @"
    {
        "exportPlanningData" : false, 
        "gridDefinition" : {
            "suppressMissingBlocks" : true,
            "suppressMissingRows" : true,
            "suppressMissingColumns" : true,
            "pov" : {
                "dimensions" : [ "Scenario","Version","Currency"],
                "members" : [ ["Budget"], ["Final"], ["USD"]]
            },
            "columns" : [{
                "dimensions" : ["Periods"],
                "members": [[ "BegBalance","IDescendants(YearTotal)" ]]
            }],
            "rows" : [{
                "dimensions" : ["Years","Layer","Entity","Department","Account"],
                "members" : [ ["&BudYr"], ["IDescendants(Post_Allocated)"], ["$Entity"], ["IDescendants(Total_Departments)"], ["IDescendants(INCOMESTMT)","Descendants(RATIO)","Descendants(STATS)"] ]
            }]
        }
    }
"@

    #Output File
    $OutFile = "$PathToExports\EXPORT-$Entity.txt"

    try {
        #Execute API Request
        $RestResponse = Invoke-RestMethod -Uri $EXPORT_URI -Body $Payload -Headers $API_HEADER -Method POST -ContentType "application/json" -TimeoutSec 6000
        #Check if rows were returned, if so write to file
        if ($RestResponse.rows.count -gt 0) {
            #Open a stream writer with append
            $stream = [System.IO.StreamWriter]::new($OutFile,$true)
            # Loop through each row returned and write it to the file
            $RestResponse.Rows | ForEach-Object {
                #Quote and pipe-delimit the row headers, then append the data cells,
                #stripping the stray spaces and newlines the API can return in the data array
                $stream.WriteLine( ("$mbrScenario|$mbrVersion|`"$(($_.headers) -join('"|"'))`"|$( (($_.data -join ('|')).replace(' |','|')).replace("`n",''))") )
            }
        }
    } catch [System.Net.WebException] {
        # Catch this Web Ex specifically to see if we have queried too many cells
        $rStream = $_.Exception.Response.GetResponseStream()
        $rReader = New-Object System.IO.StreamReader($rStream)
        #Grab the body of the exception so we can get the error message
        $rBody = $rReader.ReadToEnd() | ConvertFrom-JSON
        $rStream.close()
        $rReader.close()
        if ( ($rBody.Message -like "*not enough resources*") -or ($rBody.Message -like "*data entry cells exceeded the threshold*") ) {
            #The query for this entity returned too many cells, need to break down the query even further
            # by getting more granular with the query
            Write-Host "The Entity $Entity returned too many cells | $($rBody.Message)"
        } else {
            #This is a Web Ex error, but not related to too many cells
            Write-Host "Error for Entity $Entity | $($rBody.Message)"
        }
    } catch {
        #Catch all other errors
        Write-Host "Non Web Ex Error for Entity $Entity | $_"
    } finally {
        #Ensure we close any streams that were opened, to avoid leaking file handles
        if($stream) {$stream.close()}
        if($rReader) {$rReader.close()}
        if($rStream) {$rStream.close()}
    }
}

Combine Results of all flat files into one

Now that we have exported the data for each entity into its own flat file, you will have one EXPORT-<Entity>.txt file per entity in the directory you are exporting to.

Now we need to loop through all of these and build the combined file:

#Combined Output File
$CombinedOutFile = "$PathToExports\COMBINED_EXPORT.txt"
#Header for File
$Header = "Scenario|Version|Year|Layer|Entity|Department|Account|BegBalance|Jan|Feb|Mar|Q1|Apr|May|Jun|Q2|Jul|Aug|Sep|Q3|Oct|Nov|Dec|Q4|YearTotal"
Set-Content "$CombinedOutFile" -Value $Header
#Loop through each file and combine into one
ForEach ($file in (Get-ChildItem "$PathToExports" -Exclude (Split-Path $CombinedOutFile -Leaf))) {
    Get-Content $file.FullName | Add-Content "$CombinedOutFile"
    Remove-Item $file.FullName
}

In Summary

This whole process ended up being a lot more effort than I originally anticipated, but the results were amazing and as I have said before, I do love a good challenge.

Remember, we originally tried the DATAEXPORT command, and even with FIXPARALLEL it took roughly 4-5 minutes for a SINGLE entity with a limited scope on the other sparse dimensions (I did try it on a larger group of roughly 50 entities and killed it after 3 hours of running, so I couldn’t get an exact timing on how much faster this was). With the API query method we were able to run ALL entities (roughly 500-600) at the full scope of all the other dimensions, and it took about an hour to process.

I also built in quite a bit more error handling than shown here. For example, if a query returned too many cells, I would re-run the query for that entity with a secondary loop through another dimension to get more granular. This post was intended more to give you some insight into how you might approach this.
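To sketch what that secondary loop might look like (everything here is illustrative: Export-EPMSlice is a hypothetical stand-in for the payload-building and Invoke-RestMethod code in the main loop, and the department members are made up):

```powershell
# Sketch: fall back to a per-Department loop when one query for all departments
# exceeds the cell threshold. Export-EPMSlice is a hypothetical stand-in that
# just records what it was asked for and simulates the API rejecting the broad
# query; the real version would POST the payload via Invoke-RestMethod.
$script:queriesRun = @()
function Export-EPMSlice {
    param($Entity, $Department)
    #Simulate the "too many cells" rejection for the broad all-departments query
    if ($Department -like "IDescendants(*") {
        throw "data entry cells exceeded the threshold"
    }
    $script:queriesRun += "$Entity/$Department"
}

$departments = @("D001", "D002", "D003")   # hypothetical member list
try {
    Export-EPMSlice -Entity "E1001" -Department "IDescendants(Total_Departments)"
} catch {
    #Too many cells: get more granular by querying one department at a time
    ForEach ($dept in $departments) {
        Export-EPMSlice -Entity "E1001" -Department $dept
    }
}
```

In the real process, the department list would come from the same dimension-query approach used earlier for entities, and the retry would write into the same per-entity flat file.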