Reduce RAM required to process OLAP cube

by Cenderze   Last Updated August 26, 2017 19:06

I have a cube that I created in Visual Studio via New Project -> Import from Server (Multidimensional and Data Mining).

This cube is therefore created from the metadata of an already existing cube in database Database A.

The difference between the original cube and the newly created cube, NewCube, is that NewCube has many dimensions set to Visible -> False (the old cube has the same dimensions set as visible). NewCube is also deployed to a different database, Database B, on the same SSAS server.

In NewCube, two of the three biggest tables, SaleStock and CentralStock, are replaced by the tables SaleStockYesterday and CentralStockYesterday, which, instead of holding the stock balances for every article recorded every day for the past five years, only contain the balances for each article from the most recent day. The tables SaleStock and CentralStock have been deleted from the data source view.

When I use Database -> Process, set the options so that processing is not terminated when there are duplicate key values etc., and set the option to not count errors, the cube starts to process.
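
For reference, I believe the XMLA that the Process dialog generates with those error options looks roughly like the sketch below (the DatabaseID is a placeholder for my actual database ID, and the exact element values are my guesses from the dialog settings):

    <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Process>
        <Object>
          <DatabaseID>Database B</DatabaseID>
        </Object>
        <Type>ProcessFull</Type>
        <ErrorConfiguration>
          <!-- do not stop on key errors, and do not count them -->
          <KeyErrorLimit>-1</KeyErrorLimit>
          <KeyErrorAction>ConvertToUnknown</KeyErrorAction>
          <KeyDuplicate>IgnoreError</KeyDuplicate>
          <KeyNotFound>IgnoreError</KeyNotFound>
          <NullKeyConvertedToUnknown>IgnoreError</NullKeyConvertedToUnknown>
          <NullKeyNotAllowed>IgnoreError</NullKeyNotAllowed>
        </ErrorConfiguration>
      </Process>
    </Batch>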

My issue is that while the cube processes, after approximately 2 hours the RAM required drives the SQL Server machine on which SSAS is installed up to 99-100 % memory usage. This results in other jobs not executing because there is simply not enough RAM for them to start (e.g. various jobs run by the SQL Agent).

My questions are:

The original cube takes approximately 1 hour to process, while this one takes more than 2 hours. Is this because it is being processed for the first time? The original cube uses partitions; is its shorter processing time a result of not being forced to process the whole cube?
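
If I have understood partitions correctly, the original cube's job probably only has to reprocess its most recent partition each night, roughly like the sketch below, rather than everything (the cube, measure group and partition IDs here are made up, I do not know the real ones):

    <Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Object>
        <DatabaseID>Database A</DatabaseID>
        <CubeID>OriginalCube</CubeID>
        <MeasureGroupID>SaleStock</MeasureGroupID>
        <!-- only the latest partition, not the whole measure group -->
        <PartitionID>SaleStock_2017</PartitionID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>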

The original cube is processed by a SQL Agent job. NewCube is processed directly against SSAS (via Database -> Process). Could that have any effect on the RAM required?

When I set dimensions to not visible, they still seem to be processed. Would I get better RAM performance if I simply removed the dimensions instead of setting them to not visible? (I have them as not visible at the moment to hopefully prevent some errors caused by all the connections in the cube.)

Are there any other ways to process the cube that could reduce the RAM needed? I use Database -> Process to process NewCube. The old cube is processed by a SQL Agent job and uses ProcessFull. Could that have any impact? (I haven't dared to try this yet because the server is shared with another department, and I do not know whether they have any jobs, or during which hours they run, so I want to avoid their jobs being terminated because of too much RAM load.)
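
One option I have read about but not yet tried is to script the processing as XMLA, limit the parallelism, and run ProcessData first and ProcessIndexes afterwards instead of a single ProcessFull, which is supposed to keep the memory peak lower. A sketch of what I mean (Database B as a placeholder ID, and MaxParallel="2" just as an example value):

    <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <!-- limit how many objects are processed at the same time -->
      <Parallel MaxParallel="2">
        <Process>
          <Object>
            <DatabaseID>Database B</DatabaseID>
          </Object>
          <!-- read and store the data only -->
          <Type>ProcessData</Type>
        </Process>
      </Parallel>
    </Batch>

followed by a second, otherwise identical batch with <Type>ProcessIndexes</Type> once the data step has finished.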

I do not know how much RAM the original cube requires to process, because of the odd hours at which it is processed. Is there a way to see the RAM that was required many, many hours later?
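
While processing is running I believe I can take a snapshot of the memory SSAS is using with a Discover against the memory usage rowset, something like the sketch below, but that only tells me about the moment I run it, not what happened during last night's job:

    <Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
      <!-- snapshot of current SSAS memory allocations -->
      <RequestType>DISCOVER_MEMORYUSAGE</RequestType>
      <Restrictions>
        <RestrictionList />
      </Restrictions>
      <Properties>
        <PropertyList />
      </Properties>
    </Discover>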


