Tuesday, April 28, 2009

Essbase Security Filter Samples via Maxl

Here are some examples of other types of filters you may need to create:


/*-------------------------------------------------------------------------------------------------------------*/
/* WRITE ACCESS TO BUDGET */
/*-------------------------------------------------------------------------------------------------------------*/
create or replace filter Sample.Basic.'FilterName' WRITE on '"Budget"';


/*-------------------------------------------------------------------------------------------------------------*/
/* WRITE ACCESS TO BUDGET & LEVEL 0 MARKET */
/*-------------------------------------------------------------------------------------------------------------*/
create or replace filter Sample.Basic.'FilterName' WRITE on '"Budget", @LEVMBRS("Market",0)';
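
A single filter can also combine rows with different access levels in one statement. Here is a minimal sketch (the filter name and member specs are illustrative):

/*-------------------------------------------------------------------------------------------------------------*/
/* READ ACCESS TO ACTUAL & WRITE ACCESS TO BUDGET (MULTI-ROW FILTER) */
/*-------------------------------------------------------------------------------------------------------------*/
create or replace filter Sample.Basic.'FilterName' READ on '"Actual"', WRITE on '"Budget"';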

Essbase Security Maxl Script

Here is a simple example of how to maintain Essbase security in a MaxL script:

spool on to Security.log;
login admin password on localhost;
set timestamp on;


/*-------------------------------------------------------------------------------------------------------------*/
/* READ ACCESS TO WEST AND ITS DESCENDANTS ONLY */
/*-------------------------------------------------------------------------------------------------------------*/
create or replace filter Sample.Basic.'Read_WestMarket_Filter' READ on '@IDESCENDANTS("West")';


/*-------------------------------------------------------------------------------------------------------------*/
/* CREATE GROUP(S) */
/*-------------------------------------------------------------------------------------------------------------*/
create or replace group 'Read_WestMarket_Group';


/*-------------------------------------------------------------------------------------------------------------*/
/* GRANT FILTER ACCESS TO GROUP(S) */
/*-------------------------------------------------------------------------------------------------------------*/
grant filter Sample.Basic.'Read_WestMarket_Filter' to 'Read_WestMarket_Group';


/*-------------------------------------------------------------------------------------------------------------*/
/* ADD USERS TO GROUP(S) */
/*-------------------------------------------------------------------------------------------------------------*/
alter user testuser1 add to group 'Read_WestMarket_Group';


logout;
spool off;
exit;

Create all of your filters in one section, then create all of your groups, then grant the filters to the groups and finally put the users in those groups.

Sunday, April 26, 2009

Essbase - Export Data (Post 3 of 3)

DATAEXPORT Calculation function:
I have been working with Essbase for a long time, and I wish this function had come out years ago.

* Only available in BSO applications, as calculation functions are not supported in ASO.

* You can export data to a text file while specifying the delimiter and file path/name (I have used this often since its release).

* You can export data to a binary file (haven't needed to go down this path yet, but looking forward to testing with it)

* You can export data to a relational database (like SQL Server) using an ODBC connection. Be careful with this if you are using SQL Server 2005; I had some issues about a year ago and the help desk really didn't help...but I've had no issues with SQL Server 2000.

* The best part about this function is that it works like any other calculation function, so you can FIX on any subset of data. If you only want to fix on "Actual", for the month of January, for a specific product...YOU CAN!!! Performance is very fast (assuming your dense/sparse settings are set up properly).

* You have the ability to export data based on certain conditions, like "Sales" > 500.

* Not only can you export using a FIX statement, you also have the ability to control the format of the text file (similar to report script functionality). Here are the options, with a worked sketch after the list:

DataExportLevel ALL | LEVEL0 | INPUT;
DataExportDynamicCalc ON | OFF;
DataExportDecimal n;
DataExportPrecision n;
DataExportColFormat ON | OFF;
DataExportColHeader dimensionName;
DataExportDimHeader ON | OFF;
DataExportRelationalFile ON | OFF;
DataExportOverwriteFile ON | OFF;
DataExportDryRun ON | OFF;
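
Putting these options together, here is a minimal calc script sketch (the file path and FIX members are illustrative) that exports level-0 January Actuals for one product to a comma-delimited file:

/* set the export format, then FIX on the slice to export */
SET DATAEXPORTOPTIONS
{
DataExportLevel "LEVEL0";
DataExportColFormat ON;
DataExportOverwriteFile ON;
};
FIX ("Actual", "Jan", "Cola")
DATAEXPORT "File" "," "C:\Temp\JanActuals.txt" "#MI";
ENDFIX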

Essbase - Export Data (Post 2 of 3)

Report Scripts:

* You can export data using a report script. Prior to version 9.3.1, this was the only way to get a subset of data out of an Essbase database. I'm not a big fan of them, but if you are running a version prior to 9.3.1, you need to use them.


/*--------------------------------------------------------------------------*/
/* EXPORT DATA USING REPORT SCRIPT */
/*--------------------------------------------------------------------------*/

export database ASOsamp.Sample using report_file "'$ARBORPATH/app/ASOsamp/Sample/XptData.rep'" to data_file 'C:\Temp\ReportExport.txt';
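
For reference, the report script itself might look something like this minimal sketch (member names are illustrative and borrowed from Sample.Basic; a real ASOsamp script would use its own dimensions):

// XptData.rep - illustrative data-export report script
{TABDELIMIT ROWREPEAT SUPBRACKETS SUPCOMMAS SUPMISSINGROWS}
<PAGE ("Measures")
"Sales"
<COLUMN ("Year")
<CHILDREN "Year"
<ROW ("Market", "Product")
<IDESCENDANTS "Market"
<IDESCENDANTS "Product"
!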

Essbase - Export Data (Post 1 of 3)

There are a couple of ways to export data from an Essbase cube: the old-school way, and, as of version 9.3, the calculation function "DATAEXPORT". Let's discuss each in detail:

Export Data (the capability that has been around all along):

* For block storage applications, you can export all data, level-0 data, or input-level data. The 'all data' option is nice, but it assumes your hierarchies don't change and no level-0 member gets remapped to a new level-1 parent. So, if your hierarchies don't change and it takes too long to aggregate (CALC ALL) your cube, the 'all data' option is the way to go. If your hierarchies change often, then you need the level-0 option (this is the option I prefer most of the time, as the hierarchies will change at some point and it's easier to reload from level-0 than from all data). I rarely use the input-level option, as my input data is usually my level-0 data.

* You can export the data in column format or non-column format. Which works better depends on what you are trying to accomplish. Column-format files are larger than non-column-format files, but you can use a load rule to load that data into another cube. Say you have to perform some sort of data manipulation to get that data into your new cube; then it is advantageous to use a load rule and do the manipulation there. If you simply want to reload after updating your hierarchies, you can use the non-column format (which is not pretty), but the file size is smaller. A smaller file means a faster export and a faster import, but you can't use a load rule for any data manipulation.
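
To make the difference concrete, here are illustrative fragments of each format (not actual output; the exact layout depends on your outline):

/* Column format: member names repeated on every row, dense members across columns */
"Actual" "New York" "Cola" 678 645 675
/* Native (non-column) format: a sparse member combination header, then rows of values */
"New York" "Cola" "Actual"
"Sales" 678 645 675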

* You can export to multiple files at one time, like this:


/*--------------------------------------------------------------------------*/
/* EXPORT DATABASE TO 8 FILES */
/*--------------------------------------------------------------------------*/
export database Sample.Basic level0 data in columns
to data_file 'C:\Temp\ExportData1.0.Txt',
'C:\Temp\ExportData2.0.Txt',
'C:\Temp\ExportData3.0.Txt',
'C:\Temp\ExportData4.0.Txt',
'C:\Temp\ExportData5.0.Txt',
'C:\Temp\ExportData6.0.Txt',
'C:\Temp\ExportData7.0.Txt',
'C:\Temp\ExportData8.0.Txt';

* When you run the level-0 export, you will get upper-level data on all dense dimensions. Don't be alarmed; that's just how the functionality works, even if the member properties are set as dynamic calcs.

* The export will NOT include values for dynamic calc members, only stored members (unless it's a dense dimension, as noted above).

* In version 9 (I'm not sure which release), users gained the ability to retrieve against the database during the export process. Previous versions locked the database against any read-only activity.

Thursday, April 23, 2009

Additional Maxl Samples

/*--------------------------------------------------------------------------*/
/* SET VARIABLE FOR CURRENTMONTH */
/*--------------------------------------------------------------------------*/
alter database Sample.Basic set variable "CurrentMonth" "July";
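
Once set, the substitution variable can be referenced with an ampersand, for example in a calc script:

/* FIX on the substitution variable rather than a hard-coded month */
FIX (&CurrentMonth)
    CALC DIM ("Measures");
ENDFIX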


/*--------------------------------------------------------------------------*/
/* EXPORT DATA USING REPORT SCRIPT */
/*--------------------------------------------------------------------------*/
export database ASOsamp.Sample using report_file "'$ARBORPATH/app/ASOsamp/Sample/XptData.rep'" to data_file 'C:\Temp\ReportExport.txt';


/*--------------------------------------------------------------------------*/
/* EXECUTE CALCULATION */
/*--------------------------------------------------------------------------*/
execute calculation Sample.Basic.CALCALL;
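
If you don't have a saved calc script on the server, you can also pass the calc string inline:

/*--------------------------------------------------------------------------*/
/* EXECUTE ANONYMOUS CALCULATION */
/*--------------------------------------------------------------------------*/
execute calculation 'CALC ALL;' on database Sample.Basic;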


/*--------------------------------------------------------------------------*/
/* KILL ALL REQUESTS TO DATABASE */
/*--------------------------------------------------------------------------*/
alter system kill request on application Sample;


/*--------------------------------------------------------------------------*/
/* LOGOUT ALL SESSIONS TO DATABASE */
/*--------------------------------------------------------------------------*/
alter system logout session on application Sample;


/*--------------------------------------------------------------------------*/
/* DISABLE CONNECTIONS TO DATABASE */
/*--------------------------------------------------------------------------*/
alter application Sample disable connects;


/*--------------------------------------------------------------------------*/
/* ENABLE CONNECTIONS TO DATABASE */
/*--------------------------------------------------------------------------*/
alter application Sample enable connects;


/*--------------------------------------------------------------------------*/
/* CLEAR APPLICATION LOG FILE */
/*--------------------------------------------------------------------------*/
alter application Sample clear logfile;


/*--------------------------------------------------------------------------*/
/* CLEAR ESSBASE LOG FILE */
/*--------------------------------------------------------------------------*/
alter system clear logfile;

/*--------------------------------------------------------------------------*/
/* LOAD DATA TO ASO SLICE */
/*--------------------------------------------------------------------------*/
import database ASOsamp.Sample data from load_buffer with buffer_id 1 override values create slice;
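
Slices speed up incremental loads, but you will eventually want to fold them back into the main database. A hedged example (9.3.1+ syntax, as I understand it):

/*--------------------------------------------------------------------------*/
/* MERGE INCREMENTAL DATA SLICES INTO MAIN DATABASE */
/*--------------------------------------------------------------------------*/
alter database ASOsamp.Sample merge all data;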

Export Database to Multiple files

/*--------------------------------------------------------------------------*/
/* EXPORT DATABASE TO 8 FILES */
/*--------------------------------------------------------------------------*/
export database Sample.Basic level0 data in columns
to data_file 'C:\Temp\ExportData1.0.Txt',
'C:\Temp\ExportData2.0.Txt',
'C:\Temp\ExportData3.0.Txt',
'C:\Temp\ExportData4.0.Txt',
'C:\Temp\ExportData5.0.Txt',
'C:\Temp\ExportData6.0.Txt',
'C:\Temp\ExportData7.0.Txt',
'C:\Temp\ExportData8.0.Txt';

NOTE: This process is useful when the export files are very large and would exceed the standard 2GB file-size limit, and it gives you control over how many files are created (here, always 8 files)...which matters if you have automated processes that rebuild your databases each night and expect a fixed set of files.

Other commonly used Maxl Script commands

/*--------------------------------------------------------------------------*/
/* LOAD APPLICATION AND DATABASE */
/*--------------------------------------------------------------------------*/
alter system load application ASOsamp;
alter application ASOsamp load database Sample;


/*--------------------------------------------------------------------------*/
/* COPY BSO OTL TO ASO APPLICATION */
/*--------------------------------------------------------------------------*/
create or replace outline on aggregate_storage database ASOsamp.Sample as outline on database Sample.Basic;

/*--------------------------------------------------------------------------*/
/* BUILD ACCOUNTS DIMENSION */
/*--------------------------------------------------------------------------*/
import database ASOsamp.Sample dimensions
from local data_file 'C:\Temp\Accounts.txt'
using server rules_file 'Accounts'
on error write to 'DimAcct.err';

NOTE: The syntax is the same for block storage and aggregate storage databases.

Maxl Script Sample

Here are some examples of Essbase MaxL scripts, based on the Sample.Basic and ASOsamp.Sample databases.


Load data into the ASOsamp.Sample database. The framing statements (spool on, login, set timestamp, logout, spool off, and exit) should be used in all MaxL scripts.

spool on to LoadData.Log;
login admin password on localhost;
set timestamp on;

/*--------------------------------------------------------------------------*/
/* UNLOAD APPLICATION */
/*--------------------------------------------------------------------------*/
alter system unload application ASOsamp;

/*--------------------------------------------------------------------------*/
/* LOAD APPLICATION */
/*--------------------------------------------------------------------------*/
alter system load application ASOsamp;

/*--------------------------------------------------------------------------*/
/* CLEAR DATA FROM ASO APPLICATION */
/*--------------------------------------------------------------------------*/
alter database ASOSamp.Sample reset;

/*--------------------------------------------------------------------------*/
/* INITIALIZE ASO BUFFER */
/*--------------------------------------------------------------------------*/
alter database ASOSamp.Sample initialize load_buffer with buffer_id 1;

/*--------------------------------------------------------------------------*/
/* LOAD DATA TO BUFFER 1 */
/*--------------------------------------------------------------------------*/
import database ASOSamp.Sample data
from local data_file "'$ARBORPATH/app/ASOsamp/Sample/dataload.txt'"
using server rules_file 'dataload' to load_buffer with buffer_id 1
on error write to 'dataload.err';

/*--------------------------------------------------------------------------*/
/* LOAD DATA FROM BUFFER */
/*--------------------------------------------------------------------------*/
import database ASOSamp.Sample data from load_buffer with buffer_id 1;

/*--------------------------------------------------------------------------*/
/* AGGREGATE DATABASE */
/*--------------------------------------------------------------------------*/
execute aggregate process on database ASOSamp.Sample stopping when total_size exceeds 1.5;

logout;
spool off;
exit;


NOTE: You can put the spool on statement after the login so that your log file will not capture the login information.

Wednesday, April 22, 2009

MDX Formula - Missing Quotes???

One odd thing I've come across in an aggregate storage cube in version 9.3.1 is adding an MDX formula to a member via a load rule, using a text file as the source. If the MDX formula contains a statement like IsUda(Time.CurrentMember, "OPEN_MONTH"), Essbase will not build the formula in the outline with the quotes intact. Instead, it will create the formula as IsUda(Time.CurrentMember, OPEN_MONTH). This causes a verification error, since OPEN_MONTH is not a member name but a UDA, and without the quotes Essbase will not recognize it as a UDA.
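
For reference, here is the kind of formula that needs to end up in the outline; the IIF wrapper around IsUda is illustrative, and the point is the quotes around the UDA name:

/* intended: the UDA name must stay quoted */
IIF(IsUda([Time].CurrentMember, "OPEN_MONTH"), [Sales], Missing)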

Data Manipulation

Load rules can be used to load data, modify outlines, or both. I've always been a big believer in making any "data manipulations" in your source system. For instance, say you are performing a dimension build that needs to combine two or more columns to create a new field. This can be done with a load rule, but when in doubt, perform this step in SQL or your source system to limit the amount of data manipulation that needs to occur in a load rule. Believe it or not, over time a load rule can become corrupt, and if you don't have a backup (you should always have one, but just in case you don't), it will be hard to recreate all the data manipulations you made in the load rule.
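
For example, if the new field must combine two source columns, doing it in the source query keeps the load rule simple. A hedged SQL Server sketch (the table and column names are hypothetical):

-- build the member name in the source system, not in the load rule
SELECT dept_code + '_' + region_code AS member_name,
       sales_amount
FROM   staging.sales_extract;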

That being said, I've come across numerous clients that don't have the ability to change the source dimension files (for example, the files come directly from SAP), so you will need to make data modifications in a load rule at some point. Here are some examples of things you can do:

--------------------------------

Add Prefix or Suffix
As I noted on the Field Properties page, you can add a prefix or suffix to any field. This is really straightforward. Let's say your data comes in as 2009, 2010, 2011, but the members in your Year dimension are YR2009, YR2010, YR2011. Instead of changing your data files, simply add a prefix of YR on the Global Properties tab of the Field Properties dialog. After you make the change, the preview of your data will show YR2009, YR2010, YR2011 instead of 2009, 2010, 2011.