Data Load Rules - Use of data load field names

This post is proof of my laziness. I'm just kidding. You know that I'm not lazy :)

A friend of mine came up with a question: "How can I create a dynamic load rule?" Well, that's not the exact question, but let's pretend for a moment that it was and see how we can solve it. (His actual question was how to export data from BSO and load it into ASO.)

I do know that there are similar threads on the Oracle forums, and the obvious answer is to create a DATAEXPORT calc script.

Get the data out of the source and then use a load rule to import it into the target.

The question many of us had/have is how it determines the rows and columns. If you don't specify DataExportColHeader, Essbase makes the first dense dimension the column dimension.

If you look at the explanation, you'll find that it needs a dense dimension. It is easier to use a static dimension like Period as the column dimension, since it won't change.

A static dimension is one where you are not adding members and the DATAEXPORT calc script is not referring to a member set which changes.

For example, if I've got @RELATIVE(YearTotal,0) in my calc script, I still know that it is 12/13 members, starting from BegBalance/Jan through Dec.
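
So, assuming a dense Period dimension, the options could pin it to the columns like this (a minimal sketch; the export path is made up):

SET DATAEXPORTOPTIONS
{
 DataExportLevel "LEVEL0";
 DataExportColHeader "Period";
};
FIX("Actual")
 DATAEXPORT "File" "\t" "C:\Temp\export.txt";
ENDFIX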

What if, in my cube, Period is not dense and I don't have a static dimension to select?

Just for the context of this blog post, imagine that Sample Basic has only one dense dimension - Measures - and a similar structure.

If I create a calc script similar to the one given below,
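
Something along these lines (a sketch; the FIX members are guesses, and the export path matches the file used with sed later in this post):

SET DATAEXPORTOPTIONS
{
 DataExportLevel "ALL";
};
FIX("100-10", "New York", "Actual")
 DATAEXPORT "File" "\t" "C:\Temp\100-10.txt";
ENDFIX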

I'll get Measures as the column dimension.

Let's imagine that next month I add more members to the parent "Profit".

My data file will change and I'll have to change my load rule to reflect those changes.

Yes, I can do that, but can I make it dynamic so that I don't need to edit the load rule (or verify that it is correct) every time?

I know that there is an option in the load rule where I can specify the record which contains the field names - "Record containing data load field names".

Oh yeah, we know that one, but what are you going to do with that?

Let's look at the load rule. Well it's just like any other rule.

Now let's open the data file.

See how it moved line 1 up to be the field names; however, it is not correct, because those are my data fields.

How can I get the row dimensions to show up in the export file? (See, we asked you earlier, "Where are you going with that option?" ;) )

If you look at SET DATAEXPORTOPTIONS, there is an option to get the dimension names as a header - DataExportDimHeader ON;

Let's add that to our script and see how that works.
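
The options block from the sketch above now becomes:

SET DATAEXPORTOPTIONS
{
 DataExportLevel "ALL";
 DataExportDimHeader ON;
};
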
Hmm, that's interesting - I got that one; however, I lost the column field names.

How can I get the headers to come out as a single line?

Not to worry, here comes the friendly neighborhood Spiderman - sorry, the sed command :)

sed -i ":a;N;$!ba;s/\n/\t/;s/\"Measures\"\t//" C:\Temp\100-10.txt

If you notice, the dimension name Measures also gets displayed when we use DataExportDimHeader. We don't need that, because we already have the member names from Measures in the columns. What the sed command does is join line 2 onto line 1 (replacing the newline with a tab) and then remove "Measures" along with its trailing tab.
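
To illustrate with a made-up header (the export is tab-delimited; your row dimensions will differ), the first two lines go from

"Product" "Market" "Year" "Scenario" "Measures"
"Sales" "COGS" "Marketing"

to a single header line:

"Product" "Market" "Year" "Scenario" "Sales" "COGS" "Marketing"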

You can replace "Measures" with your dimension name, and if the delimiter is a comma, change \t to ,

sed -i ":a;N;$!ba;s/\n/,/;s/\"Measures\",//" C:\Temp\100-10.txt

Open the data file in the load rule again.

Ta-da, a dynamic load rule!!!!

For the Windows users, you can download the sed editor from here. (Download both files.)

Note: if you open the rule without the data file, it'll not show the field names :)

EPMA queries for member properties and more - Understanding EPMA tables Part III

It's been a long time since I blogged. I do owe an apology to you and two other people in the EPM space.

Cameron Lackpour and Dan Pressman

The three of us were working on developing some EPMA SQL queries (or, let's just say, trying to make sense of the EPMA backend tables).

Cameron was kind enough to share the queries he and Dan developed, and I was supposed to start/continue my EPMA series.

I failed to do so. I could say that I had a tight schedule (which is a fact) and all sorts of valid excuses; however, I'm not going to use them.

Better late than Never :)

Take a look at the posts given below, where a few EPMA tables are explained, along with two hacks:

Getting Member formulas from an EPMA Application - Understanding EPMA tables Part I
Change History Years in Planning on an EPMA application - Understanding EPMA tables Part II

I'm not sure whether we'll be seeing EPMA in the near future (data synchronization is going away soon; who knows what's next). For the EPMA die-hard fans, here we go.

If you would like to see all the EPMA applications and their Application Class (Planning, Essbase, HFM, ...), you can use SQL along the lines of the sketch given below.
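
A sketch of that kind of query - the join keys and column names are assumptions, so verify them against your own repository schema:

SELECT a.C_APPLICATION_NAME,
       p.X_PROPERTY_VALUE AS APPLICATION_CLASS
FROM   DS_APPLICATION a
       JOIN DS_PROPERTY_APPLICATION_ARRAY p
         ON p.I_APPLICATION_ID = a.I_APPLICATION_ID
WHERE  p.I_DIMENSION_ID = 4; -- Application Class dimension (ID 4, see below)
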
DS_PROPERTY_APPLICATION_ARRAY stores the property information of an application. Application Class is a dimension in EPMA and has a value of 4. Different Application Classes have their own values.

Let's look at the IDs of the different applications.

X_PROPERTY_VALUE gets loaded from a physical location when you start EPMA and it holds some surprises. (more to follow in coming blog posts)

Now that you know what applications you have :) let's look at their properties.

Here is the SQL.
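
Again a sketch rather than the exact query - the column names are assumptions:

SELECT a.C_APPLICATION_NAME,
       p.I_PROPERTY_ID,
       p.X_PROPERTY_VALUE
FROM   DS_APPLICATION a
       JOIN DS_PROPERTY_APPLICATION_ARRAY p
         ON p.I_APPLICATION_ID = a.I_APPLICATION_ID
ORDER  BY a.C_APPLICATION_NAME, p.I_PROPERTY_ID;
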
You'll be surprised to see the information returned by that SQL (I was surprised that it didn't return the StartYear ;) )

More fun: here is a query which will give you all the "SmartList" members.
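
A hedged sketch of such a query (DS_PROPERTY_MEMBER, DS_MEMBER, DS_PROPERTY and their columns are assumptions; note the "Property" alias, which the WHERE clause refers to):

SELECT *
FROM  (SELECT m.C_MEMBER_NAME,
              p.C_PROPERTY_NAME AS "Property",
              pm.X_PROPERTY_VALUE
       FROM   DS_PROPERTY_MEMBER pm
              JOIN DS_MEMBER m
                ON m.I_MEMBER_ID = pm.I_MEMBER_ID
              JOIN DS_PROPERTY p
                ON p.I_PROPERTY_ID = pm.I_PROPERTY_ID)
WHERE "Property" = 'SmartList';
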
The same query can be used to pull all members which have a Time Balance setting. Just change the WHERE clause to WHERE "Property" = 'TimeBalance';

If you read Part I, I talked about a SQL query which will give you all the members which have a formula in EPMA. However, that is not the only information in the DS_PROPERTY_MEMBER_MEMO table; you'll have to wait for the next post to know more ;)

Essbase String to Number CDF

I was looking at the comments on my post about String Essbase CDF functions and found an odd request there.

I couldn't find a reason why someone would try doing that. However, I ended up creating 4 functions to fulfill the request!!!

The request was to create a String to Number function.

Let's say that you want to convert some of your metadata to data in Essbase - no built-in function exists which can do this. (I'm not sure whether there are some CDFs out there.)

Let's see how it works.

I took Sample Basic and created two members which will hold the integer and double values.

StringInteger and StringDouble
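
Under the hood an Essbase CDF is just a public static Java method. Here is a minimal sketch of what the two functions might boil down to - the class name, method names, and error handling are my assumptions, and the actual download may differ:

public class StringToNumber {
    // Convert a string (for example a member name) to a double.
    // Assumption: return 0 when the string is not numeric.
    public static double stringToDouble(String text) {
        try {
            return Double.parseDouble(text.trim());
        } catch (NumberFormatException e) {
            return 0d;
        }
    }

    // Convert a string to an integer value, returned as a double
    // because Essbase cell values are doubles.
    public static double stringToInteger(String text) {
        try {
            return (double) Long.parseLong(text.trim());
        } catch (NumberFormatException e) {
            return 0d;
        }
    }

    // The "wReplace" variants strip a given string first,
    // so "100-10" with "-" removed becomes 10010.
    public static double stringToDoubleReplace(String text, String remove) {
        return stringToDouble(text.replace(remove, ""));
    }

    public static double stringToIntegerReplace(String text, String remove) {
        return stringToInteger(text.replace(remove, ""));
    }
}
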
For the next two functions, you can pass the replacement character (for example, the product name contains "-" in it, and I would like to remove it and store the numbers).

StringDoublewReplace and StringIntegerwReplace members were added to hold these numbers.
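
As a usage sketch (the registered function names are hypothetical - they are whatever you register via the MaxL shown below):

FIX("New York", "Actual")
 "StringIntegerwReplace" = @JStringToIntegerwReplace(@NAME(@CURRMBR("Product")), "-");
ENDFIX

This would turn a product name like "100-10" into the number 10010.
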
You can register the functions using MaxL statements like the ones given below.
You can download the CDF from here. For installation instructions, please look here.
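
A sketch of the registration - the class, method signatures, and registered names are assumptions matching the sketches above:

create or replace function '@JStringToDouble'
as 'StringToNumber.stringToDouble(String)'
spec '@JStringToDouble(strValue)'
comment 'Converts a string to a double';

create or replace function '@JStringToIntegerwReplace'
as 'StringToNumber.stringToIntegerReplace(String,String)'
spec '@JStringToIntegerwReplace(strValue,removeStr)'
comment 'Removes removeStr and converts the string to an integer';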

Oracle EPM Workspace - HTTP 404.13 Request exceeds content length

I don't think you will get this error, but if you did and it brought you to this post, you now have an answer to solve it :)

Now you'll be thinking "How did this guy get the error?"

I always try to break things (and fix them later). I was trying to add a MIME type for MP4 files so that I could upload video files to Workspace. If you are interested in knowing what MIME is, have a look here.

You can add a new MIME type by navigating to Administer -> Reporting and Analysis -> MIME Types in Workspace.

I did add one and then tried to load an MP4 file. During the upload process I got an error (one which I'd never seen before):

"HTTP Error 404.13 - Not Found
The request filtering module is configured to deny a request that exceeds the request content length."

The good thing is it provides the article and even lists a solution :) (didn't have to test my Google-fu :))

The solution is to edit applicationHost.config under C:\Windows\System32\inetsrv\Config and update <requestLimits maxAllowedContentLength="yournumber" />
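
For reference, that element lives in the request-filtering section of applicationHost.config, and the value is in bytes; here is a sketch of the relevant fragment:

<system.webServer>
  <security>
    <requestFiltering>
      <!-- value is in bytes; 100000000 is roughly 100 MB -->
      <requestLimits maxAllowedContentLength="100000000" />
    </requestFiltering>
  </security>
</system.webServer>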

The bad thing is it won't work :(

I'm not sure why it doesn't update the setting in IIS; you have to change the content length manually in IIS Manager.

Log in to IIS Manager and click "Request Filtering".

Click "Edit Feature Settings"













Update "Maximum allowed content length" to a big number. (I updated it to 100000000 bytes, which is roughly 100 MB.)

Once the update was done, I could upload a big MP4 file and share it with my users :)

Calculation Manager Execute MaxL and MDX Data Copy

It's been a long time, dear reader. My apologies for not updating this blog.

There were many reasons.

  • I bought a house and was busy settling in.
  • I picked up a new hobby - gardening and landscaping (which was kind of obvious). I was, and still am, fascinated by gardening. When I was a kid I used to pull up plants to see whether the roots were coming or not :)
  • I was busy preparing for KScope - my first KScope, and it was a great experience to meet all the people you know from forums, blogs, ... You should sign up for Kscope15.
For all those who thanked me during the conference for this blog (and for those who are reading this), I would like to thank you instead - you are the reason behind this blog.

I wanted to write about this long back, when the feature was introduced, and it was just another post (like the other 25 of them) in my drafts folder. Now you might be thinking why I'm posting about this all of a sudden. Everything happens for a reason, and I'll talk about that real soon :)

CalcMgrExecuteEncryptMaxLFile and CalcMgrExecuteMaxLEnScript were introduced in 11.1.2.3.000 - what do they do?

They do exactly what the names suggest - they can execute an encrypted MaxL script or a MaxL file.

Imagine a scenario where you've got a Planning application and an ASO reporting cube. To move data, we have to run a batch which will export the data and then use MaxL scripts to load it into the ASO cube.

Now imagine what such a function can do. You can write a rule in Planning which will export only the data the user is working on (using variables - POV, pages) and load it into the ASO cube. Isn't that wonderful?
Yeah, I know what you are thinking - what happens if multiple users execute the same script? For now, DATAEXPORT won't allow the file name to be parameterized. However, think about the possibilities!!!!

I didn't use Planning (I love Sample Basic) in my example; however, it'll work the same way.

I've got a new database, BasicT (this one has more members in Market). I'm extracting "New York" data and uploading it to "India", and here is what my calc script looks like.
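
Roughly along these lines (a sketch; the export path and options are placeholders):

SET DATAEXPORTOPTIONS
{
 DataExportLevel "LEVEL0";
};
FIX("New York")
 DATAEXPORT "File" "\t" "C:\Temp\NewYork.txt";
ENDFIX
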
This is my MaxL statement
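
Reconstructed as a sketch (the rules file, which would map "New York" to "India", and the paths are hypothetical):

import database Sample.BasicT data
from data_file 'C:\Temp\NewYork.txt'
using server rules_file 'NYtoInd'
on error write to 'C:\Temp\NewYork.err';
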
Execute the script and there you go, data loaded to the target.

I thought about using RUNJAVA - just to see the syntax - however, I will advise you all to always use RUNJAVA. There is a reason behind it :)

The syntax for RUNJAVA is different from the function's, and here is how it looks.
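
Something like this - treat the argument list as an assumption and check the Calc Manager documentation for the exact parameters:

RUNJAVA com.hyperion.calcmgr.common.cdf.MaxLFunctions
 "true"                /* asynchronous flag (assumption) */
 "C:\Temp\NewYork.mxl" /* the MaxL script to run (hypothetical path) */
 "admin" "password";
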
As you can see, RUNJAVA doesn't need a FIX statement, and it will not execute the statement for all members. We'll talk about this in detail when we look at the next, undocumented, functions.

Now, if you ask me what true and false do in the function, the only answer I've got is: it runs asynchronously when true. I've seen that true is fast.

Now let's talk about two undocumented functions. They are the reason why I'm writing this blog post. I was excited to figure out the syntax for them.

CalcMgrMDXDataCopy
CalcMgrMDXDynDataCopy

Oh, what do they do?

  • The first function accepts an MDX query, exports the data from a BSO cube, and loads it into a target cube (ASO/BSO)
  • The second function - if you are not good at MDX - will write an MDX query for you, then export the data from a BSO cube and load it into a target cube (ASO/BSO)
Why was I excited about it?

Think about performing a data copy in ASO - right now we have to depend upon execute allocation (using share). Yeah, I know it works; however, think about just writing an MDX query and loading the results into a different application, or think about a DATAEXPORT-like script for ASO.

Yes, the function generates a neatly formatted report from the MDX query. I had a feeling that the function was using the Grid API, and Sree Menon (PM for Calc Manager) confirmed that they are using the Grid API to export data.

For all the folks who are interested: Sree Menon mentioned that it was a POC in which the dev team wanted to move data from BSO to ASO. Well, we just got lucky, didn't we? ;)

I wrote a calc script to export data from Sample Basic and load it into Sample BasicT.
What was it doing? It kept exporting/loading the same data for every member it processed. (I didn't have a clue - Sree informed me about this.) Here is the reason why I asked you to stick with RUNJAVA always.

Following are Sree's words on the syntax:
Since this was just a POC, source/target mapping does not work like partitioning. It is just source member to target member (Apr to April). No functions are supported.
Since the MDX can return large data sets, for performance, maxRowsPerPage could be set to say 1000, which means as soon as the grid has 1000 rows an update to ASO is done.
At the end, the remaining rows are also pushed. If maxRowsPerPage is set to "-1", then all the data is filled into one grid.
sourceMemberMappings could be something like @List("Apr", "Qtr1") and targetMemberMappings could be @List("April", "Quarter1").
After following his advice, this is how the script looks.
Here is the syntax from the specs: @CalcMgrMDXDataCopy(user, password, useToken, sourceApplicaionName, srcDatabaseName, targetApplicationName, targetDatabaseName, mdxQuery, loggingFileName, sourceMemberMappings, targetMemberMappings, maxRowsPerPage, tgtPOVCols, server)
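
For illustration, a call matching that signature might look like the sketch below. The credentials, file names, and MDX are placeholders, and since this is a member formula it fires once per member in the FIX - which is why RUNJAVA is the safer route:

FIX("New York")
 "Sales" = @CalcMgrMDXDataCopy("admin", "password", "false",
  "Sample", "Basic", "Sample", "BasicT",
  "SELECT {[Jan],[Feb],[Mar]} ON COLUMNS, {[Sales]} ON ROWS FROM Sample.Basic",
  "C:\Temp\mdxcopy.log", @List("Apr","Qtr1"), @List("April","Quarter1"),
  "-1", "", "localhost");
ENDFIX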

I didn't use tgtPOVCols here, but if you have different dimensionality and you want to load to a particular POV, then you can mention those columns here. Let's look at the output now.
I didn't have to do anything here - that is the entire output. Isn't that really neat?

I had some trouble getting the syntax for CalcMgrMDXDynDataCopy

I think the definitions are a little messed up in the specs.

This is from the specs

@CalcMgrMDXDynDataCopy(user, password, useToken, sourceApplicaionName, srcDatabaseName, targetApplicationName, targetDatabaseName, logFileName, sourceMemberMappings, targetMemberMappings, maxRowsPerPage, slicerAxisQuery, axisQuery, povCols, server)

and this is the one which works.
So I think this is what the spec should really look like:

@CalcMgrMDXDynDataCopy(user, password, useToken, sourceApplicaionName, srcDatabaseName, targetApplicationName, targetDatabaseName, logFileName, sourceMemberMappings, targetMemberMappings, maxRowsPerPage, povCols, slicerAxisQuery, axisQuery);

You don't need a server for this one.
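
A sketch following that corrected ordering (placeholders throughout; the last two arguments are the slicer and axis MDX set expressions):

FIX("New York")
 "Sales" = @CalcMgrMDXDynDataCopy("admin", "password", "false",
  "Sample", "Basic", "Sample", "BasicT",
  "C:\Temp\mdxdyncopy.log", @List("Apr","Qtr1"), @List("April","Quarter1"),
  "-1", "",
  "{[Actual]}", "Crossjoin({[New York]},{[Sales]})");
ENDFIX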

What will this do? It generates the MDX for you!!!!!

The first line is the MDX generated by the function.
That was a really lengthy post, and I hope the Calc Manager team will expand these functions and give us an ASO DATAEXPORT and more :)

The views discussed in this blog are my personal views.
They do not represent the views of the organizations, businesses, or institutions that I'm part of.