Execute a truncate statement from Scribe

Well, this has nothing to do with EPM... hmm, that's not 100% true.

I was doing a project which involved Microsoft Dynamics and Essbase. The idea was to pull data from CRM into an on-premises Essbase cube.

The integration tool the company already owned was Scribe, and I had to get stuff done ASAP.

The main challenge was to clear the tables before loading them using Scribe Solutions.

One way is to run a scheduled job on-premises before the Scribe job to truncate the tables.
The other is to have a map (oh, I'll talk about maps) which runs a truncate statement.

Well, I don't know why, but I was stuck on the idea of doing it the second way. (Maybe I didn't like giving up ;))

Either my Google-fu is bad, or all the links were talking about placing the SQL in a file location and calling that #@#$%#@$%.

To be honest, first of all I didn't have a clue what they were talking about, and second of all I didn't like it.

Maps in Scribe are a cool visual way (I like it, it is simple) to get data from source to target. You can define which table you want to get data from, join it to another table, and perform the mapping to target tables.

Oh, I forgot to mention: when you try to join, it will show you only the tables to which you can create a join. Once you select the table, it'll then populate the fields which can be used for the join.

So let's look at a map, shall we?



I call it making a burger :)

First you query the source, then you decide what you want to do with each result: create, update, delete, ... against the target table.


If you keep the names the same (source columns same as target columns), it can perform an autolink for you!

That's about it for the map: create it, execute it, and data from the cloud will be sitting in your on-premises SQL Server in no time.

My challenge was to perform a truncate operation before running the data pull.

As you can see I've two maps in my solution.

  1. Truncate map
  2. Data pull map

Let's look at the truncate map.


Only the source is defined, and as you can see, since my source is SQL Server, I get a Native Query (the bun ;)).


So what happens when you write just a truncate statement? Scribe will start complaining that the query returned a null schema table.

Hmm, now how to overcome that? Easy peasy.


:) Yup, that's what I did. Just return the current date and alias it to a column.
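For illustration, the native query would look something along these lines (the table and column alias names here are made up):

```sql
-- Clear the staging table first.
TRUNCATE TABLE dbo.STG_CRM_DATA;

-- Return the current date aliased to a column, so the query yields a
-- non-null schema table and Scribe stops complaining.
SELECT GETDATE() AS TRUNCATE_RAN_AT;
```

GETDATE() is SQL Server specific; the point is simply to return any one-column result set after the truncate.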

There you go a much easier way to run Native SQL queries in Scribe.

NUMSys Web Edition + Essbase Security Viewer

First post of 2016 and I didn't want to disappoint you.

You might be thinking "What took you so long?" I can definitely answer that question.

Try fighting migraines which come back every 5~10 days, working on a tight schedule, learning a new framework, and finishing NUMSys (yes, learning a new framework was part of NUMSys Web Edition). I've really put a lot of time into this pet project of mine and I'm really proud of the results.

I take this opportunity to introduce to you the latest release of NUMSys with Essbase Reports. Even though the blog post calls it a Security Viewer, it is more than that, and you'll know why.

I did receive a lot of feedback on the first version; however, I've not added all of it to this version. The reason why I couldn't add them all was because I wanted to keep up with my promise of "World needs a Security Viewer" :)

I'm keeping track of all the requests and will be incorporating them into forthcoming releases.

From this version onwards you'll see another reporting component added to NUMSys - "Essbase Reports". Thank you, Kishore Mukkamala, for planting the idea of Essbase Stats Reports.


It'll pull the existing Essbase applications registered with Shared Services.

Essbase Reports

  • License and Server Report
  • Application Report
  • Database Report
  • Dimension Report

License and Server Report

This will extract license information and some server statistics, like security file fragmentation, Essbase CFG errors, and some CFG settings.

Application Reports

This extracts information about your applications (the ones which you can view in EAS, and more).
Some of the information includes log file size and the count of databases.



Database Reports

This is similar to the one above; it provides statistics, cache, and compression information for the databases.


Dimension Reports

Information on all dimensions: how many members are stored, UDAs, dimension type, ...

Keep in mind that this will not provide a dimension extract.



Filter Report

You can view the existing filters and their assignment by selecting the applications.

Filter definition can be populated by selecting the filter.


You can export the filters as MaxL statements along with their assignments.
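The exported statements would look something like this (the application, filter, member, and user names below are made up for illustration):

```
create or replace filter Sample.Basic.ReadEast read on '@IDESCENDANTS("East")';
grant filter Sample.Basic.ReadEast to JohnDoe;
```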


Access Control Reports

The Access Control report is similar to the provisioning reporting option in Shared Services, except that this one shows you the filter assignments and the count of users/groups (direct/indirect) provisioned against that application. You also get the option to filter user(s)/group(s).




Log Analyzer

Now comes the most fun component. I've written a post on "Monitor the last login time of a user", which I think is one of my most popular posts (lots of comments) till date. I had to spend a lot of time figuring out an easier way to perform this. (Well, it is not as easy as it seems.)

I ended up deciding to add a database to NUMSys and load the log files into that DB. Then came the question of which one to use. I first chose Derby, as I'd be able to ship it along with the software; however, I went against it, as its query language is slightly different from what we all know.

I had to replicate what the EPM configuration tool does :) while installing NUMSys, it'll configure the required databases. You've to add the required JDBC jar files to the lib location of the installer:

  • ojdbc6.jar if using Oracle
  • sqljdbc.jar if using MS SQL Server

NUMSys will create the following tables:

  1. SERVERLOG
  2. SERVERODLLOG
  3. APPLICATIONLOG
  4. APPLICATIONODLLOG
  5. LASTLOGIN

You can drag and drop the log files into NUMSys (or use the Browse Logs button) to upload them.

ODL logs will go to the *ODL* tables (I still don't know why we've both legacy and ODL logs).

You've the option to select multiple log files.


You can also choose to keep the old logs; if not, NUMSys will truncate the tables while uploading the files.

I didn't test this on MS SQL Server.

Once the logs are uploaded, you can query them from NUMSys; yes, you don't need to start SQL Server Management Studio, TOAD, or SQL Developer :)




I had a hard time figuring out how to name the columns (like, what does Local mean?), so I turned to the experts (Cameron Lackpour, Tim German, and John Booth - thank you for your thoughts).

Cameron was of the opinion that all log files are local to Essbase - so what is it?!

Anyway, this is what I came up with; if you've suggestions, let me know and I'll try to update the column names in coming versions.

You can export the query results if needed using the "Export Results" icon.


Once you upload the log files, you can run the inbuilt reports (only one for now).

The Last Login report will provide you with a list of users, their last login time, and where they accessed it from.


In coming releases I'll add a command-line version where you can schedule the log upload and run the log analyzer reports.

So now you know what took me so long :)

Adding the database now allows me to do more fun stuff, like enabling auditing for User/Group operations in NUMSys.

The next planned release will have an option to get reports from Workspace...

You can find NUMSys documentation here.

You can download NUMSys Web Edition (3.2) from here.

NUMSys Web Edition

It's been a long time since my last post. You might be wondering what took me this long to come up with a blog post.

I was busy with my project work and side project :). It took a long time (I've to admit it) and I'm happy with the results :)

So here is my Christmas gift for you all :)

There were a couple of occasions when I enhanced NUMSys; however, creating a Web version of it was never on my mind.

It all started with a discussion on Network54. For NUMSys to be a Security Viewer, there is a long way to go (or maybe a short one); however, this is a start.



You can drag and drop users/groups to parent groups while importing groups.

You can create reports from Shared Services.


Watch out for that section - I'm planning to add Security reports (Essbase/Planning/Workspace) into Reports Section of NUMSys.

All the features of the NUMSys client version are ported to the Web release.

You can use filters while exporting Users/Groups. You can also remove users/groups once the information is populated.



I've two users whose names end with *karan; however, I only want to export ckattookaran. I can then right-click on gkattookaran and clear that row.

NUMSys comes with an installer, which can automatically configure it against an existing WebLogic domain.

A Windows service will be installed at the end of the configuration.

For *nix users, start and stop scripts will be created in the bin folder. Please follow the documentation here on how to install.

You can find related documentation and videos at the NUMSys Wiki page.

You can download NUMSys web version from here.

CloudMap - your integration mapping system

It is very unusual for me to write a blog post while I'm attending a conference; I know my older brother can even do real-time blogging :(

This post will introduce you to a quick tool that was built for integration between cloud and on-prem systems.

The beta now works for Salesforce to PBCS integration. I'm planning to add more CloudMaps for PBCS to Salesforce and other CRM/on-prem systems in a later release.

Installing CloudMap

CloudMap can be installed in 3 clicks.



The installer checks whether Excel is running; it'll allow you to close Excel and resume the installation.


By default CloudMap gets installed to C:\Huron\Utils\CloudMap


Using CloudMap



You can configure Salesforce and PBCS connection information in Options. This is going away; login information will instead appear in the task pane (more Smart View-ish).

Change the SOQL for the data pull, and mention the Datacolumn field.


If there are extra dimensions in PBCS, you can add them in Header Definition.

Select the job to run for the data update. This will show all the jobs from your PBCS environment.


That's it! The data will be pulled from Salesforce and updated in PBCS.

In a similar way, metadata (Customers) can be updated from Salesforce.


You can update up to 3 dimensions!

The dimensions and jobs are pulled from PBCS.

I'll be changing the look and feel to make it more aligned with Smart View in future releases.

Here is the integration point from PBCS.


If you've ideas, please let me know and I can try adding that to the product.

CloudMap uses the REST API for the integration, so no more EPM Automate :)
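For the curious, launching a PBCS job over REST boils down to a single authenticated POST. Here is a minimal Python sketch, assuming the documented Planning REST endpoint shape; the server, application, and job names are placeholders, and this is not CloudMap's actual code:

```python
import base64
import json
from urllib import request

def build_job_request(base_url, app, job_type, job_name, user, password):
    """Build an (unsent) urllib request that launches a Planning job.

    Endpoint shape: POST {base}/HyperionPlanning/rest/v3/applications/{app}/jobs
    """
    url = f"{base_url}/HyperionPlanning/rest/v3/applications/{app}/jobs"
    payload = json.dumps({"jobType": job_type, "jobName": job_name}).encode()
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = request.Request(url, data=payload, method="POST")
    req.add_header("Authorization", "Basic " + token)
    req.add_header("Content-Type", "application/json")
    return req

# Example: build (but do not send) a data import job request.
req = build_job_request("https://planning.example.com", "Vision",
                        "IMPORT_DATA", "LoadSalesforceData", "user", "pw")
```

In practice you'd pass the request to urllib.request.urlopen and poll the job status the response points at.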

How to log off Inactive Sessions from Essbase

This post is about how to programmatically log off sessions that become inactive.

The question came from a friend of mine, and I did let him know that the sessions you see in EAS which are not attached to an application are not all inactive.


In the screenshot, the session for Huron is an active session (that's me); that user is an admin who might have executed a MaxL statement which is server specific.

However, I thought it would be fun to do this purely from a fun perspective, and created a batch file for it. You can download it from here.

What I'm doing in that script is first getting all sessions using the display session all MaxL statement and spooling that to a file.

A second MaxL statement, display application all, gives me all the applications that are there in the system.
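The spooling part of the MaxL looks roughly like this (credentials, server, and spool file names are placeholders):

```
login admin password on localhost;
spool on to 'App_Session.txt';
display session all;
spool off;
spool on to 'Cubes.txt';
display application all;
spool off;
logout;
```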

Once I got the spool files, I'm using a for loop to get the required information.
What the script does is get the 2nd and 4th tokens; I also asked it to skip 4 lines, which is my header.


Here is the output of the above command


0 0 is the session you've logged in through MaxL (the one which triggered the display session command).
- - is a byproduct of our for loop (it comes from this line: OK/INFO - 1241044 - Records returned: [27]).

Now to get the cubes - I guess you get the idea by now.
Again, a for loop to get the first token.


Now comes the comparison, where I've to check column 2 from App_Session.txt against Cubes.txt and get the ones which do not match.
This is achieved using gawk and a for loop.
gawk compares and gives us the session ids which are not associated with an application; in this case we'll get 0 0, - -, and the real ones.

Bring the for loop into the picture with eol=- (this removes the - - line), and add an if condition which checks whether the session id is equal to 0, adding the ones which are not to the MaxL script.
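The filtering steps above can be sketched in POSIX shell, with awk standing in for gawk. The file names, column layout, and session ids below are simplified assumptions, not the real MaxL spool format:

```shell
# Stand-in for the spooled 'display session all' output:
# columns: user, application, session id.
cat > App_Session.txt <<'EOF'
admin Sample 12345
admin - -
admin 0 0
reader Orphan 99999
EOF

# Stand-in for the spooled 'display application all' output.
cat > Cubes.txt <<'EOF'
Sample
Demo
EOF

# Keep session ids whose application column matches no known application,
# then drop the '- -' by-product rows and our own '0 0' MaxL session.
orphans=$(awk 'NR==FNR { apps[$1]; next }
               !($2 in apps) && $2 != "-" && $2 != "0" { print $3 }' \
              Cubes.txt App_Session.txt)
echo "$orphans"
```

Each surviving session id would then be fed into an alter system logout session MaxL statement.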

Here is the MaxL that has been created, and the spool output.



One issue that I found while playing with this was that you cannot use an encrypted MaxL script for this.

You cannot use encrypted MaxL with display command.

Views discussed in this blog are my personal views.
They do not represent the view of organizations, businesses or institutions that I'm part of.