DATAEXPORT - FIX on DynamicCalc Sparse Member

You might have seen this and may even have a workaround for it. Still, it comes as a surprise that it doesn't work.

Now what am I talking about?

Long gone are the days when we used JExport; all of us moved to DATAEXPORT. (I still sometimes use JExport, it is a cool CDF.)

This one is about a call I got from a colleague: "I've an issue with the DATAEXPORT command; no matter what I FIX on, it exports all the children."

I tried this on a modified version of Sample Basic.

I've Years in my outline, and Actual is a parent with two children.

You really don't need to mention DataExportLevel ALL; that is the default. I got the required output.
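A minimal sketch of the kind of calc I ran (the file path, delimiter, and missing marker are my assumptions; the member name is from Sample Basic):

```
/* Sketch only - export path and options are hypothetical */
SET DATAEXPORTOPTIONS
{
DataExportLevel ALL;
};

FIX ("Actual")
DATAEXPORT "File" "," "c:\export\actual.txt" "#MI";
ENDFIX
```

On a dense Scenario dimension this exports exactly what the FIX asks for.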

I gave him a call and told him, "It's working fine for me." He informed me that Scenario is a sparse dimension on his cube. I modified Sample Basic, ran the calc, and what do I see?

It is exporting the values of Actual's children.

Not only does it do that, it completely ignores the FIX statement and exports everything for Scenario. Is this documented somewhere?

DataExportDynamicCalc ON | OFF

ON—(Default) Dynamically calculated values are included in the export.

OFF—No dynamically calculated values are included in the report.

Specifies whether a text data export excludes dynamically calculated data.


Text data exports only. If DataExportDynamicCalc ON is encountered with a binary export (DATAEXPORT BINFILE …) it is ignored. No dynamically calculated data is exported.
The DataExportDynamicCalc option does not apply to attribute values.
If DataExportLevel INPUT is also specified and the FIX statement range includes sparse Dynamic Calc members, the FIX statement is ignored.

It says that if DataExportLevel INPUT is specified and the range includes sparse Dynamic Calc members, the FIX statement is ignored. Is that what I'm seeing? But I've set it to ALL (which is why you saw DataExportLevel set to ALL earlier). Could this be a bug? Maybe.

A dynamically calculated sparse member doesn't drive block creation, so if I try this with DataExportNonExistingBlocks ON; will it work?

The explanation for DataExportNonExistingBlocks comes right after DataExportDynamicCalc.

DataExportNonExistingBlocks ON | OFF

ON—Data from all possible data blocks, including all combinations in sparse dimensions, are exported.

OFF—(Default) Only data from existing data blocks is exported.

Specifies whether to export data from all possible data blocks. For large outlines with a large number of members in sparse dimensions, the number of potential data blocks can be very high. Exporting Dynamic Calc members from all possible blocks can significantly impact performance.

I think I just got lucky (I was operating under the impression that it was a bug). It works with NonExistingBlocks.
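For reference, the option block that finally behaved looked like this (a sketch; DataExportDynamicCalc ON is the default and is shown only for clarity):

```
SET DATAEXPORTOPTIONS
{
DataExportLevel ALL;
DataExportDynamicCalc ON;
DataExportNonExistingBlocks ON;
};
```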


I was working on this blog post yesterday and saw Glenn's update on a similar (same?) issue this morning (I'm still going ahead with this post). Seems like we both got the call :)

Further testing reveals that with DataExportNonExistingBlocks ON and a Dynamic Calc sparse parent member, it won't export missing blocks.

Oh wait, you don't believe me. Let's prove it.

My export must have 1941 lines (and should have a row with all #Missing).

Nope, I have just 1938 lines and nothing with all #Missing.

Nope, no #Missing rows :) (It's strange, I know; I think it is a bug which happened to creep in - a good bug)

Did I prove Glenn wrong? (No, I didn't.) However, I like to think that I did, so that I can complete a virtuous cycle (Glenn tortures Cameron, Cameron tortures me, and, if and only if I can manage it, I torture Glenn). Well, that's a lot to ask :) and there's a lot going on.

.5 Million and counting

Yes, you read it right: the day before yesterday marked 0.5 million page views for this blog. I'm truly humbled by those numbers. (I never expected to see such figures.)

This is also the year which celebrates the 5th birthday of this blog. It is all because of you, my dear reader.

It's time to sit back and analyze some of the details.


Now that's the stat; as you can see, the most-read article is the one on Workspace customization. I still get emails on customization, and I've a blog post coming up on the same topic.

You'll see a slight difference in the figures from the next picture onwards (they're taken from Google Analytics); the reason is that sometimes I forgot to add the tracking code when I changed the blog's look and feel ;)

Audience Overview

Now this one tells me that you read 1.75 pages per session (I'm glad that some of you are landing here and reading more than one post).

I'm also glad that 40% of you come back to see what I've written; and thank you too, new visitors.


Now, I'm kind of sad seeing the 16.1% representation there. (I hope to see that number growing)


The US takes the first spot on the top-ten list (which was kind of obvious). No, I don't speak French, nor does this blog have a French version :)

I'm really glad that people around the world read what I'm writing. (Not just me, myself, and I.)

Thank you my dear reader.

Traffic Sources

Google, the demigod, ranks top.
Next is direct, which tells me that you come directly to this place searching for answers; I'll try to improve the quality of the posts I write from now on :)


Well, this doesn't truly represent the keywords because I forgot to enable that feature :( However, it still gives a picture of what you search for and how you end up reading this blog. I'm surprised that people search for the blog title :)

Thank you

I'm humbled and I don't have anything to say other than "Thank you".

Real Time data push from Planning to ASO

This is a follow-up post on what was covered here.

A lot of us might have asked this: "How can I move data from a Planning application to ASO in real time?"

Well, this post tries to answer that same old query. You'll have to be on a Calc Manager version greater than

There was an issue: when Planning sends a member from the Page/POV, it encloses the member name in double quotes. This was the reason I didn't cover this in the previous blog.

Now you'll ask what is wrong with that. It's fine for a BSO application; however, ASO works differently (it uses double quotes only for literal strings, like in a UDA function or a string compare). :) "" to enclose a BSO member, [] for ASO, isn't it?

Sree Menon and team worked on a patch, and that issue is now solved.

The below given example is on Planning + Calc Manager.

I'm using the Sample Vision application and I've created an ASO replica of it (the exception being a new Country dimension).

I created a Calc Manager rule as shown below.

As you can see I'm going to use Variables in this rule. It is easier to create this rule by looking at the form definition. This way you know what is in POV/Page/Row/Column.

Entity, Scenario, Version can be variables in Calc Manager rule.

I'm writing this using RUNJAVA (yes, I know Cameron is going to object to this method ;))

As you can see, it is a bit of a mess if the variable needs to be treated as a set in MDX. (Note: the extra dimension member is covered in the target POV.)

An MDX set uses '{}' to denote a set, and Calc Manager also uses '{}' to denote a variable. So double brackets to the rescue: {[{Scenario}]} is parsed correctly.
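To make the brace nesting concrete, here is a hypothetical fragment (assuming a Calc Manager variable named Scenario holding the value Forecast):

```
/* Calc Manager substitutes {Scenario} first, giving [Forecast],     */
/* so {[{Scenario}]} becomes {[Forecast]} - a valid one-member set   */
{[{Scenario}]}
```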

Note: Do not use NON EMPTY functions in the MDX; this makes sure that data clears from Planning are accurately captured.

Now if you are wondering about the encryption of Calc Manager rule, here is how it is done.

There are some characters which cannot be used in the key. I did receive an error about the AES key length not being met.

Let's attach this rule to the corresponding form.

Now let's change some data and see if it copies to ASO.

This is my ASO retrieval before the save.

Changing webform data.

Retrieval after data submit.

I did notice two bugs while preparing for writing this post.
  1. While creating an application-level variable in Calc Manager using the Dimension Name option, it won't let you type in a custom dimension name (it just behaves like Dimension Type). I can go back to my hack till this one is fixed. :)
  2. Now this one is an issue if you are planning to implement this using (I couldn't test it on a version between and ). The MDX script that extracts the data gets cached somewhere, and it won't extract the correct data if the Page member is changed.
If I recycle the services and do a data submit, it correctly moves data using the member selected in the page; however, after that first data movement, the member gets cached. I would hold on for some time till the Calc Manager team fixes this issue.


Update 03/04/2015

The Calc Manager team updated me about the member-caching issue; they have fixed it internally, so we might see it coming in a patch soon :)

Block Creation in ASO!!!

It's been a while since I first planned to write about this. Lots of things happened in between (personal and professional) and I couldn't finish writing it.

Now you must be thinking, why is he talking about blocks in ASO? There are no blocks in ASO; we all know that.

I just coined that term because everyone in the BSO world understands it :) Now that ASO has procedural calculations, it is high time someone came up with a term ;)

If you don't like "block creation", then how about "efficient procedural calculation in ASO" :)

Now, if you are wondering what a procedural calc is (and where to use one), let me try to explain it in BSO fashion.

Think about a dynamic formula in BSO: it gets executed irrespective of which level you are at (level 0 or higher). Now if I have to make the formula work for level 0 only (and just roll up the higher members), I'll have to write a calc script to achieve it (or just load the calculated number).

Now in ASO, till lately, the only option was to load the calculated number; with the release of 11.1.2.x you've two options.
  1. Load the calculated values
  2. Run a procedural calc on ASO cube
I was working on a requirement which calculates the residual forecast. Simple formula.

Take the sum of Forecast starting from the first month in the cube to the current period. I have a calc member (Residual_Forecast_Calc) which has a formula, and a stored member (Residual Forecast) to hold the procedural results.
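A hedged MDX sketch of such a formula (dimension and member names are assumptions, not the actual outline; PeriodsToDate walks from the first period of the year down to the current member):

```
/* Hypothetical formula on Residual_Forecast_Calc:
   sum Forecast from the first month up to the current period */
Sum(
  PeriodsToDate([Time].Generations(1), [Time].CurrentMember),
  ([Scenario].[Forecast])
)
```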

Everything looked good till I ran the calc. The calc just kept running.

Yes, I did wait more than 15 hours for it to finish. Oh, not on the production box.

This is on my test machine, and it didn't finish (this is on and a modified version of ASOSamp).

Now what? "Oh, you can use NONEMPTYMEMBER to restrict it", is that your answer? OK, let's try it!!!!

Formula updated, now look at the results.
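The update amounts to prefixing the formula with a NONEMPTYMEMBER hint; something along these lines (member names are again assumptions):

```
/* Hypothetical - hint the engine to evaluate only where Forecast has data */
NONEMPTYMEMBER [Forecast]
Sum(
  PeriodsToDate([Time].Generations(1), [Time].CurrentMember),
  ([Scenario].[Forecast])
)
```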

$%$^$%# nothing. Yes, and why would it work? I don't have Forecast in Feb (which is the current month, so it becomes Actual).

Wait, what if I can trick it? That's where block creation (or the efficient use of NONEMPTY/.......) comes into the picture. I know that there is forecast there for some months. What if I load those forecast combinations against the current month? (Don't load the real numbers.)

I created a SQL load rule.

If you look at the SQL, I'm grabbing all the distinct member combinations which have forecast till the current month. (Note the absence of Time.)
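The SQL behind it was along these lines (table and column names here are made up for illustration):

```sql
-- Hypothetical fact table; note there is no Time column selected,
-- and the data column is a literal zero
SELECT DISTINCT Product, Customer, Geography, 0 AS Data
FROM   fcst_fact
WHERE  Scenario = 'Forecast';
```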

Now look at the Data column: it is all zeros.

Let's load this to the database.

So what are we doing here?

We are loading zeros against current-month Forecast (ZfcstLd); then come the real actual and real forecast data.

I'm using "Overwrite existing data", so if I've real forecast data coming in (sqlld2), it'll overwrite the zeros that we loaded earlier. Also, I didn't check "Ignore zero values".

Now, if you are using MaxL statements, you can create different buffers for these.
  1. A buffer which allows zero loads, for ZfcstLd
  2. A buffer which ignores zeros and missing values, for sqlld1 and sqlld2
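A hedged MaxL sketch of the two-buffer approach (application, database, credentials, and rules-file names are all made up):

```
/* Buffer 1 keeps zeros (for ZfcstLd); buffer 2 ignores zeros and missings */
alter database ASOSamp.Basic initialize load_buffer with buffer_id 1
    property ignore_missing_values;
alter database ASOSamp.Basic initialize load_buffer with buffer_id 2
    property ignore_missing_values, ignore_zero_values;

import database ASOSamp.Basic data connect as hypuser identified by 'hyppwd'
    using server rules_file 'ZfcstLd' to load_buffer with buffer_id 1 on error abort;
import database ASOSamp.Basic data connect as hypuser identified by 'hyppwd'
    using server rules_file 'sqlld2' to load_buffer with buffer_id 2 on error abort;

/* Commit both buffers in one transaction */
import database ASOSamp.Basic data from load_buffer with buffer_id 1, 2;
```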
Let's look at the results after the load.

The zeros are in and the calc is working. Let's run our allocation and see how much time it takes.

Yes, 0.375 seconds (no, I won't even do the performance math against 15 hours and still running).

Now, if you want to remove the zeros that you loaded to perform the calculation, you can run a merge statement to remove the zero cells.
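In MaxL that is a one-liner (app/db names assumed):

```
alter database ASOSamp.Basic merge all data remove_zero_cells;
```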

Results after calc and merge


Update 02/19/2015

I got a question in a comment from Amarnath asking what to do if the source is just a flat file. In my case I'm using a relational database, and it is easy to find the distinct values.

You can use OS scripting tools to find the distinct list. In my example I had to figure out the distinct Forecast rows.

Here is the command that I used. As usual, I like using the *nix variants (you can find them on the UnxUtils site).

What is it doing?

I'm telling gawk that my delimiter is tab (-F) and asking it to use the same delimiter for the output (-v OFS).

It searches for the keyword "Forecast" and prints all columns except the 2nd and the last (the Date and Data fields); stopping at NF-1 drops the data column.
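Here is a runnable sketch of the same idea end to end (the sample rows and file name are made up; I'm using plain awk, which behaves the same here):

```shell
# Sample data: Scenario<TAB>Date<TAB>Product<TAB>Region<TAB>Data
printf 'Forecast\t01/15\tP100\tEast\t42\nActual\t01/15\tP100\tEast\t10\nForecast\t02/15\tP100\tEast\t7\n' > fcst.txt

# Keep rows containing "Forecast"; print every column except the 2nd (Date)
# and the last (Data); append a literal 0; sort -u gives the distinct list
awk -F '\t' -v OFS='\t' '/Forecast/ {
    printf "%s", $1
    for (i = 3; i <= NF - 1; i++) printf "%s%s", OFS, $i
    print OFS 0
}' fcst.txt | sort -u
```

Both Forecast rows collapse to one distinct, tab-separated line ending in 0, ready to load against the current month.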

printf $1; for (i=3; i <= NF-1; i++) printf FS$i; print NL,0

(NL,0 adds a zero at the end)

released and first impressions

Today got released!!!!!

I was eagerly waiting for this release. However I don't think I'm as excited after installing it.

A couple of changes: the world's largest service name is no longer there, and the installer won't prompt for your support ID anymore.

Workspace gets new look and feel (more Fusion feel)

New feature in Shared Services: you can run a provisioning report for a user by right-clicking the user.

You can run diagnostics in Planning.

There are multiple artifacts available for monitoring. I ran diagnostics for Forms and found that it gives a status report.

You can search for specific forms.

What it is not providing is how the form got a "Good" rating.

However, "Grid Diagnostics" provides a bit more information for forms.

It has information on how many unsuppressed columns/rows there are in the form, how many pages, and what the load time was. That is good information :)

I did see Custom Styles.

I did see Smart View saying there is a new version out there; however, the Smart View page still shows

I couldn't figure out where Sandbox is, so I decided to use the help file, and it seems like either the help file is not updated or the feature is not there :(

No sandboxing (or I'm sleepy; maybe it is there and I cannot find it).

I didn't see Valid Combinations, and I didn't see saving Excel formulas back to Planning.

Oh my god, I'm sleepy.

Will update you all later :)

Update 02/06/2015

I had to perform an uninstall and noticed that the uninstaller takes a bottom-up approach: Foundation Services is the last to be uninstalled. I've not uninstalled prior versions (11.1.2.x), so I'm not sure when this change was introduced.

Views discussed in this blog are my personal views.
They do not represent the view of organizations, businesses or institutions that I'm part of.