CloudMap - your integration mapping system

It's very unusual for me to write a blog post while attending a conference; I know my older brother can even do real-time blogging :(

This post will introduce you to a quick tool that was built for integration between cloud and on-prem systems.

The beta now supports Salesforce-to-PBCS integration. I'm planning to add more CloudMaps for PBCS to Salesforce and other CRM/on-prem systems in a later release.

Installing CloudMap

CloudMap can be installed in three clicks.

The installer checks whether Excel is running and, if it is, lets you close Excel and resume the installation.

By default CloudMap gets installed to C:\Huron\Utils\CloudMap

Using CloudMap

You can configure Salesforce and PBCS connection information in Options. This is going away; in a future release, login information will appear in the task pane (more Smart View-ish).

Change the SOQL for the data pull and specify the Datacolumn field.

If there are extra dimensions in PBCS, you can add them in the Header Definition.

Select the job to run for the data update. This will show all the jobs from your PBCS environment.

That's it! The data will be pulled from Salesforce and updated in PBCS.

In a similar way, metadata (Customers) can be updated from Salesforce.

You can update up to three dimensions!

The dimensions and jobs are pulled from PBCS.

I'll be changing the look and feel to make it more aligned to Smart View in the future releases.

Here is the integration point from PBCS.

If you have ideas, please let me know and I'll try to add them to the product.

CloudMap uses the REST API for the integration, so no more EPM Automate :)
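To give a feel for what "uses the REST API" means, here is a minimal sketch of launching a PBCS job over the v3 jobs endpoint. The pod URL, application name, job name, and credentials below are all made-up placeholders, not CloudMap's actual internals:

```python
import json
from urllib import request

# Hypothetical values -- substitute your own pod URL and application name.
BASE = "https://planning-test-mydomain.pbcs.us2.oraclecloud.com"
APP = "Vision"

def launch_job(job_type, job_name, auth_header):
    """Build a POST request that launches a PBCS job via the v3 REST API."""
    url = f"{BASE}/HyperionPlanning/rest/v3/applications/{APP}/jobs"
    payload = json.dumps({"jobType": job_type, "jobName": job_name}).encode()
    req = request.Request(url, data=payload, method="POST")
    req.add_header("Content-Type", "application/json")
    req.add_header("Authorization", auth_header)
    return req  # the caller would execute it with request.urlopen(req)

req = launch_job("DATA_IMPORT", "SalesforceActuals", "Basic <encoded-credentials>")
print(req.full_url)
```

Because everything goes through HTTPS calls like this one, there is no need to install or script EPM Automate on the client machine.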

How to log off Inactive Sessions from Essbase

This post is about how to programmatically log off sessions that become inactive.

The question came from a friend of mine, and I let him know that the sessions you see in EAS that are not attached to an application are not all inactive.

In the screenshot, the session for Huron is an active session (that's me); that user is an admin who might have executed a MaxL statement that is server specific.

However, I thought it would be fun to do this anyway and created a batch file for it. You can download it from here

The script first gets all sessions using the MaxL statement display session all and spools the output to a file.

A second MaxL statement, display application all, gives me all the applications in the system.

Once I have the spool files, I use a for loop to extract the required information.
The script grabs the 2nd and 4th tokens and skips the first 4 lines, which are the header.

Here is the output of the above command

0 0 is the session you logged in through MaxL (the one that triggered the display session command)
- - is a byproduct of our for loop (it comes from the line OK/INFO - 1241044 - Records returned: [27])
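The token extraction above can be sketched in Python instead of a batch for /f loop. The sample rows and the meaning of the columns are assumptions based on the description (skip 4 header lines, keep the 2nd and 4th whitespace-separated tokens):

```python
# A sketch of the batch "for /f tokens=2,4 skip=4" logic: skip the 4-line
# header of the "display session all" spool, then keep the 2nd and 4th
# whitespace-separated tokens of each row (column positions are assumed).
def parse_spool(lines, tokens=(2, 4), skip=4):
    rows = []
    for line in lines[skip:]:
        parts = line.split()
        if len(parts) >= max(tokens):
            rows.append(tuple(parts[t - 1] for t in tokens))
    return rows

sample = [
    "header1", "header2", "header3", "header4",
    "admin 12345 Srvr Sample active",
    "admin 0 Srvr 0 active",
]
print(parse_spool(sample))  # [('12345', 'Sample'), ('0', '0')]
```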

Now to get the cubes; I guess you get the idea by now.
Again a for loop, this time grabbing the first token.

Now comes the comparison, where I have to check column 2 of App_Session.txt against Cubes.txt and get the ones that don't match.
This is achieved using gawk and a for loop.
gawk compares the files and gives us the session IDs that are not associated with an application; in this case we'll get 0 0, - -, and the real ones.

Bring the for loop into the picture with eol=-, which removes - -, and add an if condition that checks whether the session ID is not equal to 0, adding the ones that pass to the MaxL.

Here is the MaxL that gets created, along with the spool output.
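The whole filter-and-generate step can be condensed into a few lines of Python. The session tuples and cube names are invented sample data; the generated statement uses standard MaxL logout syntax:

```python
# Sketch of the gawk comparison: keep sessions whose application column does
# not appear in the cube list, drop the "0 0" (our own MaxL login) and "- -"
# (for-loop debris) rows, then emit one logout statement per surviving ID.
def orphan_session_maxl(app_sessions, cubes):
    maxl = []
    for session_id, app in app_sessions:
        if app in cubes:               # attached to a real application
            continue
        if session_id in ("0", "-"):   # skip our own session and debris
            continue
        maxl.append(f"alter system logout session {session_id};")
    return maxl

sessions = [("12345", "Srvr"), ("0", "0"), ("-", "-"), ("67890", "Sample")]
print(orphan_session_maxl(sessions, {"Sample", "Demo"}))
# ['alter system logout session 12345;']
```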

One issue I found while playing with this: you cannot use an encrypted MaxL script here, because the display command doesn't work with encrypted MaxL.

Essbase Import Text List - Encryption

This is an update to the utility referred to over here

I had to implement the solution at a client and had to use the tool I developed to import text lists into Essbase.

Maybe this is also an after effect of the project that I recently completed.

The client (an earlier one) had a requirement to present the lineage (hierarchy) and some other metadata (call it attributes, if you want) to the user in the web form; however, they didn't want users to navigate through the entire hierarchy (I would call that vertically traversing).

They were looking for a way to traverse horizontally (which would otherwise mean creating around 23 dimensions). The next option was to use a text measure in Planning and load the values against level 0 products. Ta-da, they got the horizontal traversing they were looking for.

I could get away with it, as the users were not consuming the datalized (well, that is not even a word ;)) metadata. This was in Planning and was easy enough to handle.

Coming back to the problem at hand: the client needs to show the store address in an FR report, this is a pure Essbase implementation, and the only option was to use text lists.

The issue with text lists is that prepopulating the mapping is not exposed through a load rule or MaxL.

Essbase deals with text lists in a different way than Planning does.

You have to create the mappings first (whereas Planning creates the mapping when you load a new text value), and that method (prepopulating the mapping) is only available through the API.

That's what this tool does. You can use the jar file to encrypt the data. (I don't believe in encrypting just the username and password.)

Here is how you can encrypt the strings.

Use that encrypted text and pass it to the jar file with the -D option.
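The jar's actual cipher isn't shown in the post, so here is only a toy illustration of the idea: encrypt every connection string, then hand the ciphertext to the tool (e.g. as a -D system property). Do not use XOR for real secrets:

```python
import base64

# Toy round-trip cipher, purely to illustrate the encrypt-then-pass idea.
# The real utility's algorithm is not documented here.
def xor_encrypt(text, key):
    data = bytes(c ^ key[i % len(key)] for i, c in enumerate(text.encode()))
    return base64.b64encode(data).decode()

def xor_decrypt(token, key):
    data = base64.b64decode(token)
    return bytes(c ^ key[i % len(key)] for i, c in enumerate(data)).decode()

key = b"secret"
token = xor_encrypt("essbase_admin", key)
assert xor_decrypt(token, key) == "essbase_admin"
print(token)
```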

It's SQL Server, and to create the file with the ID and address I'm using sqlcmd.

The cool thing was that I could remove the headers and the row count from the sqlcmd output using a switch and a SQL statement:
-h -1 - removes the headers
set nocount on; - removes the row count information from the end of the file.

ROW_NUMBER gives me the ID (I'm fine with random IDs)
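Putting those pieces together, the extract command can be sketched like this. The server, database, table, and column names are hypothetical; the -h -1 switch and SET NOCOUNT ON are the two tricks described above:

```python
# Sketch of the sqlcmd extract: SET NOCOUNT ON suppresses the
# "(n rows affected)" footer, -h -1 suppresses the column headers, and
# ROW_NUMBER() manufactures the text-list IDs (random IDs are fine here).
query = (
    "SET NOCOUNT ON; "
    "SELECT ROW_NUMBER() OVER (ORDER BY StoreName) AS Id, Address "
    "FROM Stores;"
)
cmd = [
    "sqlcmd", "-S", "myserver", "-d", "StoreDB",
    "-h", "-1", "-Q", query, "-o", "stores.txt",
]
print(" ".join(cmd))  # the command line you would actually run
```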

After execution

You can download the utility from here

Conditionally trigger a Data transfer and Aggregation on a different cube - Calc Manager CDFs Part II

If you've not read Part I of this series, read it here.

Let's recap the steps so far.

Depending on the user selection, we set a substitution variable and exported data from the cube we are on (App A).

Now the rule looks like this.
This is from my spool file.

I'm using a subvar as my file name! (After writing this, I found that Mike Henderson has used entire calc commands as subvars.) All that ugliness in the calc script (the one above) is gone in the logs :)

If you look at the file name, it is named according to the user selection.

The highlighted part is the timestamp; if you look at it in Excel, it is 20150924165756.

Now I've got my file; on to Step 2.

Again, I'm staying in App A and going to trigger a clear in App B/App C, depending on the user selection.

Below is my clear script.

Now that the data is cleared, I have to import the data into the app.

Here is the script that does it. I'm using a load rule so that I can remove the Product dimension.

Data cleared, data loaded; now it's time to aggregate the data in App B/App C, all while we are in App A!

Here is the spool file from Step 2.

Now the entire Calc.

See those IF conditions? They tell me which cube to go to, depending on the user selection.

I think the next patch release of Calc Manager will (hopefully?) include two new functions, @CalcMgrMDXDataCopyRTP and @CalcMgrMDXExportRTP. They might make it in, they might not (so don't hold me to it).


Yes, that has a WHERE clause.


Conditionally trigger a Data transfer and Aggregation on a different cube - Calc Manager CDFs Part I

This is really cool and you'll see why.

I'll go ahead and describe the scenario, so that you can think about different ways to do this.

A friend of mine has a unique scenario involving three Planning applications (there are more).

App A has an Entity dimension, which has divisions
App B has a Division dimension.
App C has a Division dimension.

When a user enters data in App A, depending on the entity against which the data was entered, the data needs to be transferred to App B or App C, and then a rule needs to run on App B/App C.

I can transfer data using @XWRITE; however, I'm not going that route, considering you might have a mixed BSO/ASO environment.

I started on this kind of wild journey to give him a solution. A couple of calls to and from Sree Menon (I'm amused by his passion for the product) to discuss some future enhancements and some geeky talk, and here we are with a solution (there could be a different one).

I know that many of you now know and have used @CalcMgrExecuteEncryptMaxLFile; however, I'm sure you completely ignored @CalcMgrExecuteMaxLScript. It is cooler than its cousin, and the "why?" will be revealed in this post.

Things that I know

  1. I need to clear data from App B/App C depending on the user selection (Entity)
  2. Now transfer data from App A to App B/ App C depending on the user selection (Entity)
  3. Run an agg on App B/App C depending on the user selection (Entity)
My first choice was to go with @CalcMgrExecuteEncryptMaxLFile and then try to pass the members from the web form as variables in the calc string.
However, that will not work; MaxL won't substitute anything inside single quotes.

This is where @CalcMgrExecuteMaxLScript is powerful.
Yeah, it is sort of ugly (not readable), but hey, it is powerful (didn't you hear that the first time?). I'm generating a calc script from a calc script :)

You might be wondering why those &quot& and &scolon& are there; that is a trick Sree's team added to get double quotes and semicolons into the MaxL.

Everything after exist will be substituted.
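The placeholder trick is easy to picture as a couple of string replacements: the calc string can't carry literal double quotes or semicolons, so the script is written with &quot& and &scolon& and expanded before execution. The sample statement below is invented for illustration:

```python
# Sketch of the &quot& / &scolon& expansion done for @CalcMgrExecuteMaxLScript:
# the placeholders are swapped for the real characters before MaxL runs.
def expand_maxl(script):
    return script.replace("&quot&", '"').replace("&scolon&", ";")

raw = "spool on to &quot&c:/temp/out.log&quot&&scolon&"
print(expand_maxl(raw))  # spool on to "c:/temp/out.log";
```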

Step 1 is done

Now for the data transfer. I was so excited that I could use @CalcMgrMDXDataCopy and be done with it; however, that cannot happen in this scenario. This function cannot perform variable substitution; RUNJAVA, on the other hand, can. I guess Sree and team have a hack in place for RUNJAVA (for both MDX copy and export) that does the substitution. I cannot use RUNJAVA, as I have to check whether the entity belongs to App B or App C, and that check has to be inside a FIX statement.

I'm using DATAEXPORT and then a load rule to load the file. You can use @XWRITE if you are transferring the data to another BSO cube.

A load rule is only needed if there is a difference in dimensionality; otherwise you can do a free-form load.

The trick is to export the data into different files for different users; enter another custom Calc Manager CDF function :)

I can get the current date-time stamp and attach that to my file. Milliseconds would be great, but the function gives only seconds. I guess two users hitting the same intersection in the same second would be an extremely rare event.

@CalcMgrExecuteMaxLScript again.
This is what I'm doing in the above script.
  • Set an Essbase variable to the current timestamp
  • Using the Essbase variable, set a substitution variable (the subvar will contain Scenario+Version+Department+CurrentTimeStamp)
  • Run an aggregation on Product (this is all based on the Vision app); this is done because App B and App C don't have the Product dimension
  • Export data using the sub var
Did you see the last substitution?


Isn't that cool :)
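The file-naming scheme from the steps above can be sketched in a few lines. The function name and member names are hypothetical; the timestamp format matches the seconds-precision value seen earlier (20150924165756):

```python
from datetime import datetime

# Sketch of the subvar-based file name: the user's Scenario, Version, and
# Department are concatenated with a YYYYMMDDHHMMSS timestamp, so each
# user's export lands in its own file (collisions need the same second).
def export_file_name(scenario, version, department, now=None):
    ts = (now or datetime.now()).strftime("%Y%m%d%H%M%S")
    return f"{scenario}{version}{department}{ts}.txt"

name = export_file_name("Plan", "Working", "Dept100",
                        datetime(2015, 9, 24, 16, 57, 56))
print(name)  # PlanWorkingDept10020150924165756.txt
```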

I guess this is going to be way too lengthy to fit into one blog post, so we'll split it into two parts ;)

Wait for the next part where you can see the full script in play along with the MaxL logs.

Views discussed in this blog are my personal views.
They do not represent the view of organizations, businesses or institutions that I'm part of.