Category Archives: Analysis Services

How to process a cube in a SQL Agent job

It can be done, but it’s not well documented, and it’s complicated by the fact that you can’t easily get at the result of the processing. Unless you actively check the result and manually raise an error if there’s been a problem, the job will report success no matter how screwed up the cube processing is. The checking part is all about the XML results that the processing step writes out.

Here’s how I do it (and I’d be very interested if you have any suggestions to improve it):

Connect to the Analysis Services server in SQL Server Management Studio.

Right click on the database and select Process.

Configure all the options and then use the Script button to grab the XML that defines the processing job. It will look something like this:

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
    <ErrorConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2" xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2">
        <KeyErrorLimit>-1</KeyErrorLimit>
        <KeyErrorLogFile>\\rep01\joblogs\ProcessASOnBI1KeyErrors.log</KeyErrorLogFile>
        <KeyNotFound>ReportAndContinue</KeyNotFound>
        <KeyErrorAction>ConvertToUnknown</KeyErrorAction>
        <KeyErrorLimitAction>StopProcessing</KeyErrorLimitAction>
        <KeyDuplicate>IgnoreError</KeyDuplicate>
        <NullKeyConvertedToUnknown>IgnoreError</NullKeyConvertedToUnknown>
        <NullKeyNotAllowed>ReportAndContinue</NullKeyNotAllowed>
    </ErrorConfiguration>
    <Process xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2" xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2">
        <Object>
            <DatabaseID>BI1</DatabaseID>
        </Object>
        <Type>ProcessFull</Type>
        <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
    </Process>
</Batch>

Create a new SQL Agent job.
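
(As an aside, if you prefer scripting to clicking through the GUI, the whole job can be built with the msdb stored procedures instead. This is only a rough sketch, with made-up job and server names, and with placeholders standing in for the step commands shown below, but it shows how the three steps hang together.)

use msdb;

-- Placeholder job name; the step commands are the scripts described in the steps below
exec dbo.sp_add_job @job_name = N'Process BI1 cube';

-- Step 1: delete any old output file (CmdExec)
exec dbo.sp_add_jobstep
    @job_name = N'Process BI1 cube'
    , @step_name = N'Delete old log'
    , @subsystem = N'CmdExec'
    , @command = N'if exist "\\rep01\joblogs\ProcessASOnBI1Log.xml" (del "\\rep01\joblogs\ProcessASOnBI1Log.xml")'
    , @on_success_action = 3;  -- go to the next step

-- Step 2: run the XMLA against the Analysis Services instance and log the results
exec dbo.sp_add_jobstep
    @job_name = N'Process BI1 cube'
    , @step_name = N'Process cube'
    , @subsystem = N'ANALYSISCOMMAND'
    , @server = N'MyASServer'  -- made-up Analysis Services server name
    , @command = N'<Batch ...>paste the scripted XMLA here</Batch>'
    , @output_file_name = N'\\rep01\joblogs\ProcessASOnBI1Log.xml'
    , @on_success_action = 3;

-- Step 3: check the output file and fail the job if it reports errors (T-SQL)
exec dbo.sp_add_jobstep
    @job_name = N'Process BI1 cube'
    , @step_name = N'Check result'
    , @subsystem = N'TSQL'
    , @database_name = N'master'
    , @command = N'paste the checking script from the last step here';

-- Make the job runnable on this server
exec dbo.sp_add_jobserver @job_name = N'Process BI1 cube', @server_name = N'(local)';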

Add a step to get rid of any old output file(s). Change the step type to CmdExec and paste in this code, modifying the file name:

if exist "\\rep01\joblogs\ProcessASOnBI1Log.xml" (del "\\rep01\joblogs\ProcessASOnBI1Log.xml")
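
If you’d rather keep everything in T-SQL, the same cleanup can be done from a T-SQL step through xp_cmdshell, which the checking step further down relies on anyway. A minimal sketch, using the same example path:

-- Delete the old output file via xp_cmdshell (needs xp_cmdshell enabled)
declare @CmdLine varchar(300);
set @CmdLine = 'if exist "\\rep01\joblogs\ProcessASOnBI1Log.xml" (del "\\rep01\joblogs\ProcessASOnBI1Log.xml")';
exec master.dbo.xp_cmdshell @CmdLine, no_output;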

Add a step for the processing. Change the step type to SQL Server Analysis Services Command and paste in the XML. Go to the Advanced page and give it an XML output file – with the same name you used in the previous step – to write its results to.

Add a step to check the output XML file to see whether the process was successful. Change the step type to T-SQL and paste in this code:

-- Check the result files from the cube processing steps

set nocount on;

-- The XML output files to check (put your own file names here)
create table #File
(
    ResultFileName varchar(255)
);

insert #File (ResultFileName) values ('\\rep01\joblogs\ProcessASOnBI1Log.xml');
insert #File (ResultFileName) values ('\\rep01\joblogs\ProcessASOnBI2Log.xml');

-- Holds the lines of the file currently being checked
create table #FileContents
(
    LineNumber int identity
    , LineContents nvarchar(4000)
);

declare @FileContents nvarchar(max)  -- nvarchar(max) so a long result file isn't truncated
    , @NewLine char(2)
    , @FileName varchar(255)
    , @CubesFailed bit
    , @CmdLine varchar(300);

set @NewLine = char(13) + char(10);

set @CubesFailed = 0;

-- Loop through the result files
declare file_cursor cursor
local forward_only
for
select ResultFileName
from #File;

open file_cursor;

fetch next from file_cursor
into @FileName;

while @@fetch_status = 0
begin

    -- Read the file via the command shell
    set @CmdLine = 'type ' + @FileName;

    insert #FileContents (LineContents)
    exec master.dbo.xp_cmdshell @CmdLine;

    -- Rebuild the file as a single string, starting afresh for each file
    set @FileContents = '';

    select @FileContents = @FileContents + @NewLine + isnull(LineContents, '')
    from #FileContents;

    -- Echo the file name and contents into the step output
    select @FileName;

    select @FileContents;

    set nocount off;

    -- Crude but effective: any mention of an error means the processing failed
    if @FileContents like '%error%'
    begin
        set @CubesFailed = 1;
    end

    delete #FileContents;

    -- Get the next result file
    fetch next from file_cursor
    into @FileName;

end

close file_cursor;
deallocate file_cursor;

drop table #FileContents;
drop table #File;

if @CubesFailed = 1
begin
    raiserror('Cubes failed to build.', 16, 1);
end

Modify it to put your own file name(s) into #File at the top. This step raises an error if the processing failed; without it, the job reports success whether or not the cube actually processed.
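
One thing to watch: the checking script shells out with xp_cmdshell to read the files, and xp_cmdshell is disabled by default on SQL Server 2005 and later. If it isn’t already enabled on the server, a sysadmin can turn it on like this:

-- Enable xp_cmdshell (it's off by default); needs sysadmin
exec sp_configure 'show advanced options', 1;
reconfigure;
exec sp_configure 'xp_cmdshell', 1;
reconfigure;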


Column length in SQL 2005 Analysis Services cube

I’ve been dark for a while as I’ve been on holiday in Turkey on the ‘Turquoise Coast’. Great people, hot and sunny, amazing diving water, thoroughly recommended.
Anyway. I had a new one today with a cube. A few weeks ago, in the underlying database, a varchar column’s maximum length was increased from 50 to 100. The cube continued to build fine until somebody created a row with a value longer than 50 characters, at which point processing failed with a binding error. Annoyingly, refreshing the data source view wasn’t enough: I also had to go into the dimension and update the attribute’s column properties, which were still holding the old length.
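
If you suspect the same thing is about to bite you, a quick query against the source table shows whether any rows have outgrown the length the dimension was built with. The table and column names here are made up; substitute your own:

-- Placeholder table/column names; 50 is the length the attribute binding still expects
select CustomerName, len(CustomerName) as ValueLength
from dbo.DimCustomer
where len(CustomerName) > 50;
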
Worth bearing in mind.