Dear readers,

I am glad to announce that SQLBG.WORDPRESS.COM has moved to a new and improved site, where you can find all my new articles and where I am aiming to invite featured bloggers in order to enrich the content and your experience. Please continue reading me at www.mssqlinsider.com. All content from this blog has already been transferred there 🙂 


SQL Server and some random notes on Error Logs

Recently I have been trying to build my own virtual playground with Active Directory and a couple of SQL Server machines, including one cluster, so I can try some of that new stuff I have only been reading about in SQL Server Denali (officially announced as SQL Server 2012). It’s been quite a journey, but I am on the final stretch. Only one SQL Server instance is left to be built and then I can start playing freely. In the meanwhile I brought my first Denali CTP 3 instance online last week, and despite some troubles with the installation itself, I finally got it running. Not without further issues, but I will probably write about those later. In this post I will share some insights I noted about SQL Server error logs.

So we have all heard of the SQL Server error log. We all (hopefully) use it and refer to it as a first stop when resolving issues with our SQL Server. Recently a couple of things came to my attention: how we configure the number of SQL Server error logs that are kept before recycling, what happens behind the scenes when we configure that number, where the values are stored, and whether anything special goes on in the background when we browse the error log through SQL Server Management Studio.
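As a quick refresher before going further, here is how the error log can be read, enumerated and recycled straight from T-SQL – a sketch (the second parameter of xp_readerrorlog selects the SQL Server log as opposed to the SQL Agent one):

```sql
-- Read archive #0, i.e. the current SQL Server error log (1 = SQL Server log, 2 = SQL Agent log)
EXEC master.dbo.xp_readerrorlog 0, 1

-- List all available error log archives with their dates and sizes
EXEC master.dbo.sp_enumerrorlogs

-- Close the current error log and start a new one; the archives are shifted by one
EXEC master.dbo.sp_cycle_errorlog
```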

We can configure the number of error log files that are kept before recycling in basically two ways: through SSMS (http://msdn.microsoft.com/en-us/library/ms177285.aspx) or by manually editing the Windows registry (http://support.microsoft.com/kb/196909). The interesting part here is that if you use SQL Server 2008 R2 SSMS, you cannot enter more than 99 files:


Keeping in mind the second option of manually editing the registry, you can easily set the number of error logs kept before recycling to more than 99:
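For completeness, the same registry value can be read and changed from T-SQL through the undocumented xp_instance_regread/xp_instance_regwrite procedures – a sketch only, to be tried on a test instance first:

```sql
-- Read the current NumErrorLogs value (NULL means the default of 6 error logs is in effect)
DECLARE @NumErrorLogs int
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'SOFTWARE\Microsoft\MSSQLServer\MSSQLServer',
    N'NumErrorLogs',
    @NumErrorLogs OUTPUT
SELECT @NumErrorLogs AS [NumErrorLogs]

-- Set a value above the 99-file limit that the SSMS dialog enforces
EXEC master.dbo.xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'SOFTWARE\Microsoft\MSSQLServer\MSSQLServer',
    N'NumErrorLogs',
    N'REG_DWORD',
    150
```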


So that is strange, right?! Through the GUI I cannot enter more than 99 files, but manually I can set such a number.

While I was digging around this, I sniffed a couple of interesting queries that SSMS executes against SQL Server:

– when I go to SQL Server Error Logs –> Configure, behind the scenes SSMS executes the query below to capture the values:

        declare @HkeyLocal nvarchar(18)
        declare @ServicesRegPath nvarchar(34)
        declare @SqlServiceRegPath sysname
        declare @BrowserServiceRegPath sysname
        declare @MSSqlServerRegPath nvarchar(31)
        declare @InstanceNamesRegPath nvarchar(59)
        declare @InstanceRegPath sysname
        declare @SetupRegPath sysname
        declare @NpRegPath sysname
        declare @TcpRegPath sysname
        declare @RegPathParams sysname
        declare @FilestreamRegPath sysname

        select @HkeyLocal=N'HKEY_LOCAL_MACHINE'

        -- Instance-based paths
        select @MSSqlServerRegPath=N'SOFTWARE\Microsoft\MSSQLServer'
        select @InstanceRegPath=@MSSqlServerRegPath + N'\MSSQLServer'
        select @FilestreamRegPath=@InstanceRegPath + N'\Filestream'
        select @SetupRegPath=@MSSqlServerRegPath + N'\Setup'
        select @RegPathParams=@InstanceRegPath+'\Parameters'

        -- Services
        select @ServicesRegPath=N'SYSTEM\CurrentControlSet\Services'
        select @SqlServiceRegPath=@ServicesRegPath + N'\MSSQLSERVER'
        select @BrowserServiceRegPath=@ServicesRegPath + N'\SQLBrowser'

        -- InstanceId setting
        select @InstanceNamesRegPath=N'SOFTWARE\Microsoft\Microsoft SQL Server\Instance Names\SQL'

        -- Network settings
        select @NpRegPath=@InstanceRegPath + N'\SuperSocketNetLib\Np'
        select @TcpRegPath=@InstanceRegPath + N'\SuperSocketNetLib\Tcp'

        declare @SmoAuditLevel int
        exec master.dbo.xp_instance_regread @HkeyLocal, @InstanceRegPath, N'AuditLevel', @SmoAuditLevel OUTPUT

        declare @NumErrorLogs int
        exec master.dbo.xp_instance_regread @HkeyLocal, @InstanceRegPath, N'NumErrorLogs', @NumErrorLogs OUTPUT

        declare @SmoLoginMode int
        exec master.dbo.xp_instance_regread @HkeyLocal, @InstanceRegPath, N'LoginMode', @SmoLoginMode OUTPUT

        declare @SmoMailProfile nvarchar(512)
        exec master.dbo.xp_instance_regread @HkeyLocal, @InstanceRegPath, N'MailAccountName', @SmoMailProfile OUTPUT

        declare @BackupDirectory nvarchar(512)
        exec master.dbo.xp_instance_regread @HkeyLocal, @InstanceRegPath, N'BackupDirectory', @BackupDirectory OUTPUT

        declare @SmoPerfMonMode int
        exec master.dbo.xp_instance_regread @HkeyLocal, @InstanceRegPath, N'Performance', @SmoPerfMonMode OUTPUT

        if @SmoPerfMonMode is null
        set @SmoPerfMonMode = 1000

        declare @InstallSqlDataDir nvarchar(512)
        exec master.dbo.xp_instance_regread @HkeyLocal, @SetupRegPath, N'SQLDataRoot', @InstallSqlDataDir OUTPUT

        declare @Arg sysname
        declare @Param sysname
        declare @MasterPath nvarchar(512)
        declare @LogPath nvarchar(512)
        declare @ErrorLogPath nvarchar(512)
        declare @n int

        select @n=0
        select @Param='dummy'
        while(not @Param is null)
        begin
            select @Param=null
            select @Arg='SqlArg'+convert(nvarchar,@n)

            exec master.dbo.xp_instance_regread @HkeyLocal, @RegPathParams, @Arg, @Param OUTPUT
            if(@Param like '-d%')
            begin
                select @Param=substring(@Param, 3, 255)
                select @MasterPath=substring(@Param, 1, len(@Param) - charindex('\', reverse(@Param)))
            end
            else if(@Param like '-l%')
            begin
                select @Param=substring(@Param, 3, 255)
                select @LogPath=substring(@Param, 1, len(@Param) - charindex('\', reverse(@Param)))
            end
            else if(@Param like '-e%')
            begin
                select @Param=substring(@Param, 3, 255)
                select @ErrorLogPath=substring(@Param, 1, len(@Param) - charindex('\', reverse(@Param)))
            end

            select @n=@n+1
        end

        declare @SmoRoot nvarchar(512)
        exec master.dbo.xp_instance_regread @HkeyLocal, @SetupRegPath, N'SQLPath', @SmoRoot OUTPUT

        declare @SmoDefaultFile nvarchar(512)
        exec master.dbo.xp_instance_regread @HkeyLocal, @InstanceRegPath, N'DefaultData', @SmoDefaultFile OUTPUT

        declare @SmoDefaultLog nvarchar(512)
        exec master.dbo.xp_instance_regread @HkeyLocal, @InstanceRegPath, N'DefaultLog', @SmoDefaultLog OUTPUT

        declare @ServiceStartMode int
        EXEC master.sys.xp_instance_regread @HkeyLocal, @SqlServiceRegPath, N'Start', @ServiceStartMode OUTPUT

        declare @ServiceAccount nvarchar(512)
        EXEC master.sys.xp_instance_regread @HkeyLocal, @SqlServiceRegPath, N'ObjectName', @ServiceAccount OUTPUT

        declare @NamedPipesEnabled int
        exec master.dbo.xp_instance_regread @HkeyLocal, @NpRegPath, N'Enabled', @NamedPipesEnabled OUTPUT

        declare @TcpEnabled int
        EXEC master.sys.xp_instance_regread @HkeyLocal, @TcpRegPath, N'Enabled', @TcpEnabled OUTPUT

        declare @InstallSharedDirectory nvarchar(512)
        EXEC master.sys.xp_instance_regread @HkeyLocal, @SetupRegPath, N'SQLPath', @InstallSharedDirectory OUTPUT

        declare @SqlGroup nvarchar(512)
        exec master.dbo.xp_instance_regread @HkeyLocal, @SetupRegPath, N'SQLGroup', @SqlGroup OUTPUT

        declare @FilestreamLevel int
        exec master.dbo.xp_instance_regread @HkeyLocal, @FilestreamRegPath, N'EnableLevel', @FilestreamLevel OUTPUT

        declare @FilestreamShareName nvarchar(512)
        exec master.dbo.xp_instance_regread @HkeyLocal, @FilestreamRegPath, N'ShareName', @FilestreamShareName OUTPUT

        declare @cluster_name nvarchar(128)
        declare @quorum_type tinyint
        declare @quorum_state tinyint
        BEGIN TRY
            SELECT @cluster_name = cluster_name,
                @quorum_type = quorum_type,
                @quorum_state = quorum_state
            FROM sys.dm_hadr_cluster
        END TRY
        BEGIN CATCH
            IF(ERROR_NUMBER() != 297)
            BEGIN
                THROW
            END
        END CATCH

SELECT
@SmoAuditLevel AS [AuditLevel],
ISNULL(@NumErrorLogs, -1) AS [NumberOfLogFiles],
(case when @SmoLoginMode < 3 then @SmoLoginMode else 9 end) AS [LoginMode],
ISNULL(@SmoMailProfile,N'') AS [MailProfile],
@BackupDirectory AS [BackupDirectory],
@SmoPerfMonMode AS [PerfMonMode],
ISNULL(@InstallSqlDataDir,N'') AS [InstallDataDirectory],
CAST(@@SERVICENAME AS sysname) AS [ServiceName],
@ErrorLogPath AS [ErrorLogPath],
@SmoRoot AS [RootDirectory],
CAST(case when 'a' <> 'A' then 1 else 0 end AS bit) AS [IsCaseSensitive],
@@MAX_PRECISION AS [MaxPrecision],
CAST(FULLTEXTSERVICEPROPERTY('IsFullTextInstalled') AS bit) AS [IsFullTextInstalled],
SERVERPROPERTY(N'ProductVersion') AS [VersionString],
CAST(SERVERPROPERTY(N'Edition') AS sysname) AS [Edition],
CAST(SERVERPROPERTY(N'ProductLevel') AS sysname) AS [ProductLevel],
CAST(SERVERPROPERTY('IsSingleUser') AS bit) AS [IsSingleUser],
CAST(SERVERPROPERTY('EngineEdition') AS int) AS [EngineEdition],
convert(sysname, serverproperty(N'collation')) AS [Collation],
CAST(SERVERPROPERTY('IsClustered') AS bit) AS [IsClustered],
CAST(SERVERPROPERTY(N'MachineName') AS sysname) AS [NetName],
ISNULL(@SmoDefaultFile,N'') AS [DefaultFile],
ISNULL(@SmoDefaultLog,N'') AS [DefaultLog],
@LogPath AS [MasterDBLogPath],
@MasterPath AS [MasterDBPath],
SERVERPROPERTY(N'ResourceVersion') AS [ResourceVersionString],
SERVERPROPERTY(N'ResourceLastUpdateDateTime') AS [ResourceLastUpdateDateTime],
SERVERPROPERTY(N'CollationID') AS [CollationID],
SERVERPROPERTY(N'ComparisonStyle') AS [ComparisonStyle],
SERVERPROPERTY(N'SqlCharSet') AS [SqlCharSet],
SERVERPROPERTY(N'SqlCharSetName') AS [SqlCharSetName],
SERVERPROPERTY(N'SqlSortOrder') AS [SqlSortOrder],
SERVERPROPERTY(N'SqlSortOrderName') AS [SqlSortOrderName],
SERVERPROPERTY(N'ComputerNamePhysicalNetBIOS') AS [ComputerNamePhysicalNetBIOS],
SERVERPROPERTY(N'BuildClrVersion') AS [BuildClrVersionString],
@ServiceStartMode AS [ServiceStartMode],
ISNULL(@ServiceAccount,N'') AS [ServiceAccount],
CAST(@NamedPipesEnabled AS bit) AS [NamedPipesEnabled],
CAST(@TcpEnabled AS bit) AS [TcpEnabled],
ISNULL(@InstallSharedDirectory,N'') AS [InstallSharedDirectory],
ISNULL(suser_sname(sid_binary(ISNULL(@SqlGroup,N''))),N'') AS [SqlDomainGroup],
case when 1=msdb.dbo.fn_syspolicy_is_automation_enabled() and exists (select * from msdb.dbo.syspolicy_system_health_state  where target_query_expression_with_id like 'Server%' ) then 1 else 0 end AS [PolicyHealthState],
@FilestreamLevel AS [FilestreamLevel],
ISNULL(@FilestreamShareName,N'') AS [FilestreamShareName],
-1 AS [TapeLoadWaitTime],
CAST(SERVERPROPERTY(N'IsHadrEnabled') AS bit) AS [IsHadrEnabled],
SERVERPROPERTY(N'HADRManagerStatus') AS [HadrManagerStatus],
ISNULL(@cluster_name, '') AS [ClusterName],
ISNULL(@quorum_type, 4) AS [ClusterQuorumType],
ISNULL(@quorum_state, 3) AS [ClusterQuorumState],
CAST(serverproperty(N'Servername') AS sysname) AS [Name],
CAST(serverproperty(N'InstanceName') AS sysname) AS [InstanceName],
CAST(0x0001 AS int) AS [Status],
0 AS [IsContainedAuthentication],
CAST(null AS int) AS [ServerType]

The output from that query is quite interesting to me. It includes the number of error log files kept, the root directory, instance directory, backup directory, default collation, product level, edition and so on. Cool, huh? 🙂

– when I browse the SQL Server Error Logs folder in SSMS, the following query is executed:

create table #err_log_tmp(ArchiveNo int, CreateDate nvarchar(24), Size int)

insert #err_log_tmp exec master.dbo.sp_enumerrorlogs

SELECT
CAST(er.ArchiveNo AS sysname) AS [Name],
'Server[@Name=' + quotename(CAST(serverproperty(N'Servername') AS sysname),'''') + ']' + '/ErrorLog[@ArchiveNo=''' + CAST(er.ArchiveNo AS sysname) + ''']' AS [Urn],
er.ArchiveNo AS [ArchiveNo],
CONVERT(datetime, er.CreateDate, 101) AS [CreateDate]
FROM #err_log_tmp er
ORDER BY [ArchiveNo] ASC

drop table #err_log_tmp

The output from the query looks like this:


The important part here is the stored procedure that sits at the bottom of everything, and that is master.dbo.sp_enumerrorlogs. This stored procedure is very similar to the extended stored procedure xp_enumerrorlogs. If you execute both of them in a normal query window, you get identical output:


So yet another mystery to me 🙂 I dug deeper because I was curious which stored procedure comes first – the extended one or the regular one in master. I ran sp_helptext against each of them and the output made it clear:
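You can repeat that check yourself – sp_helptext returns the T-SQL definition of the sp_ procedure, while the xp_ one is implemented inside the server binary and has no T-SQL text to show:

```sql
USE master
GO
-- Shows that sp_enumerrorlogs is just a thin wrapper around xp_enumerrorlogs
EXEC sp_helptext 'sp_enumerrorlogs'
```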


The extended stored procedure is used to implement the sp_ one with one and only one intention – security: if you are not a member of the securityadmin fixed server role, you are not able to run the stored procedure! And here is the error you get if you try to browse the SQL Server Error Logs folder in SSMS:
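To reproduce the error outside of SSMS, you can impersonate a less privileged principal – a sketch (the login name below is hypothetical; substitute any login that is not a member of securityadmin):

```sql
-- Assumes a login named LimitedLogin that is NOT in the securityadmin role
EXECUTE AS LOGIN = 'LimitedLogin'
-- This fails, complaining that only securityadmin members may execute the procedure
EXEC master.dbo.sp_enumerrorlogs
REVERT
```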


I hope you enjoyed reading some of my geeky findings 🙂

Querying multiple instances for database file information

Today I was assigned a task to query multiple instances and gather space usage details on all databases. At first I thought it would be quite an easy task, but as my final goal was to summarize the statistics per database and not per database file, I had to modify the standard query (which turned out to be one of the most popular suggestions Google and Bing provided). Anyway, if you want to query multiple instances and gather space usage statistics on database and file level, here is the script:

declare @temp table (
     instance_name varchar(100)
    ,database_name varchar(100)
    ,database_file varchar(100)
    ,file_type varchar(5)
    ,all_space decimal(12,2)
    ,space_used decimal(12,2)
    ,space_free decimal(12,2)
    ,physical_path varchar(256)
)

insert into @temp
exec sp_MSforeachdb '
use [?]
select
     CAST(SERVERPROPERTY(''MachineName'') AS VARCHAR(100)) + ''\''
        + ISNULL(CAST(SERVERPROPERTY(''InstanceName'') AS VARCHAR(100)), ''MSSQLSERVER'')
    ,DB_NAME() AS DatabaseName
    ,name
    ,case when type = 1 then ''LOG'' else ''DATA'' end
    ,CAST(size/128.0 AS decimal(12,2))
    ,CAST(FILEPROPERTY(name, ''SpaceUsed'')/128.0 AS decimal(12,2))
    ,CAST((size - FILEPROPERTY(name, ''SpaceUsed''))/128.0 AS decimal(12,2))
    ,physical_name
from sys.database_files'

select * from @temp

The @temp table variable is used so I can easily copy the results to Excel, for example, as sp_MSforeachdb otherwise returns a separate result set per database. Also, based on that table you can summarize database size info if you do not feel confident doing it in Excel.
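For example, the per-database summary I was after boils down to a GROUP BY over the captured rows – a sketch, assuming it runs in the same batch as the script above so that @temp is still in scope:

```sql
-- One row per database: total, used and free space across all of its files (values in MB)
select
     instance_name
    ,database_name
    ,sum(all_space)  as total_mb
    ,sum(space_used) as used_mb
    ,sum(space_free) as free_mb
from @temp
group by instance_name, database_name
order by total_mb desc
```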

The script can be run at the instance level or at a Registered Server Group level (for SQL Server 2008 – querying multiple instances).

Software-NUMA and SQL Server misconceptions

While digging around NUMA (software and hardware), I found something very interesting. According to BOL, soft-NUMA directly affects the number of lazy writer processes per memory node, so if you experience a bottleneck in lazy writes, you should play with this configuration. Well, according to a blog post by Microsoft's CSS engineers, this is not exactly true. There is one lazy writer process per hardware NUMA node, i.e. a single process per memory node. What soft-NUMA actually affects is the number of I/O completion ports.
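You can check how many memory nodes (and therefore lazy writers) your own instance has directly from the DMVs – a quick sketch (the dedicated admin connection has its own hidden node, filtered out below):

```sql
-- Memory nodes visible to SQL Server, excluding the DAC node
SELECT node_id, memory_node_id, online_scheduler_count
FROM sys.dm_os_nodes
WHERE node_state_desc <> 'ONLINE DAC'

-- One LAZY WRITER request should show up per hardware NUMA node
SELECT session_id, command
FROM sys.dm_exec_requests
WHERE command = 'LAZY WRITER'
```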

The next question is: what is an I/O completion port? I/O completion ports provide an efficient threading model for processing multiple asynchronous I/O requests on a multiprocessor system. When a process creates an I/O completion port, the system creates an associated queue object whose sole purpose is to service these requests. Processes that handle many concurrent asynchronous I/O requests can do so more quickly and efficiently by using I/O completion ports in conjunction with a pre-allocated thread pool than by creating threads at the time they receive an I/O request.


1. Microsoft CSS Engineers blog post on soft-NUMA

2. MSDN article on I/O completion ports

What can happen in one week's time (off-topic)


Over the last couple of days some really big things happened to me: within a week's time I got certified in SQL Server 2008 Installation and Maintenance and in SQL Server Database Development. With those two certificates I am getting closer to my goal of earning all MCITPs for SQL Server 2008. Another very important event happened yesterday – I attended the regular meeting of the Bulgarian BI and SQL user group. As it was my first meeting I was quite nervous and inactive in the session, but I hope it won't be like that next time. I have to admit that we have some great specialists in the group, all gathered around a common interest – SQL Server. There is definitely a lot to learn and a lot to see in the field of SQL Server, but I am not afraid of going into deep waters. I am eager to attend the next meeting, where we will talk about SQL Server 2008 R2 MDS (Master Data Services) and PowerPivot (the latter is quite interesting and my passion, if you will, as you might have noticed in my PowerPivot blog series). Speaking of the PowerPivot series, there is still one article to be written, so stay tuned. In the following days I expect to have some extra free time, so I will (hopefully) be able to complete what I have started.

PowerPivot for Microsoft Excel 2010 – part two

This is the second part of my series dedicated to a brief and clear introduction to PowerPivot for Excel. The idea of the series is to help any user start using the product without reading tons of articles and documentation, and at the same time to introduce the full range of possibilities of the tool.

Today I will write on the matter of importing data into PowerPivot: I will show how to import data from plain text files, create relationships between imported tables in PowerPivot, and browse through the other data sources that PowerPivot supports.

So let’s begin. Last time I showed how to navigate to the PowerPivot window. In order to import data from a file (text or Excel), you go to the From Files drop-down and choose From Text (for our example).


Then, as usual when importing a text file, you choose the file, the separator, and whether the first row of the file should be used to define the column headers. The different part here is that you can filter the data you are importing. For example, if I want to import only the rows whose French product category name is Velo, I just click on the filter drop-down and uncheck all unwanted values. This is very powerful when you are interested in only a part of the source data.


After that the import process is carried out and you have a new table imported into your PowerPivot solution.


So much for importing data. But when you import several tables and want to prepare a report on them, there is something you should always take care of – relationships. If you do not define them, your data will not have the desired integrity and, of course, the report will not show correct data. In the next few lines I will explain and show how to create relationships between tables.

So far I have imported two tables – FactInternetSales and DimProductCategory (both part of the Adventure Works data warehouse database).


To create a relationship between the two tables, you go to the Table ribbon (right next to Home) and then to the Create Relationship button.


For those of you who have basic knowledge of relational databases and database design, creating a relationship here is like creating a foreign key constraint on a table. In PowerPivot you first point out the table and the field that is common between both tables: first you select the master table and then the related (lookup) one. Be careful with the data types of the columns you are referencing, because if they are not the same an error will be raised and the relationship creation will fail.

After hitting the Create button, the relationship is created. If you want to edit or delete a relationship, you can go to the Manage Relationships button that is right next to Create Relationship.

Now I want to talk about what kind of data sources you can use to import data into PowerPivot. Basically you can import from almost anything – starting from MS SQL Server and ending with Sybase, Teradata and Informix. You can also import from Oracle, IBM DB2 and, most interesting to me (and what will make the difference for PowerPivot), from SQL Azure and RSS feeds (including an SSRS report feed). I will dedicate a special article to importing data from the RSS feed of an SSRS report, so please stay tuned.

PowerPivot for Microsoft Excel 2010 – part one

PowerPivot for Excel is a very powerful tool that enables users to create, store and present large amounts of data without any exceptional SQL writing skills or knowledge of databases and coding, while at the same time using the friendly and already familiar GUI of Excel. In this series of posts I will show you the basics of PowerPivot: how to install it, how to import data, how to create your first Gemini report, how to manipulate data (adding new columns, using basic DAX – Data Analysis Expressions), etc. I just want to state at the beginning that I will explore only PowerPivot for Excel and will not refer to PowerPivot for SharePoint, which is basically the same thing but has its own specifics in delivering and designing reports.

So I will start with the installation. Before installing, consider the minimum requirements (these are for the current CTP):

  • Requires Microsoft Office Professional Plus 2010 Beta
  • PowerPivot for Excel supports 32-bit or 64-bit machines
  • PowerPivot requires a minimum of 1 GB of RAM (2 GB or more recommended)
    The amount of memory needed will depend on the PowerPivot solution you are designing
  • Requires Windows XP with SP3, Windows Vista with SP1 or Windows 7
  • If you are running a version of Windows other than Windows 7, you will need to download and install the .NET Framework 3.5 SP1

After making sure you meet those requirements, you can download and install PowerPivot for Excel (download link). The installation is pretty straightforward, and after a bunch of “Next” clicks you are done. PowerPivot is in its nature an add-in for Excel and, as such, is displayed on the ribbon after starting the application:


The PowerPivot menu is simple and serves as an entry point to the “real” tool in which you manipulate the raw data. To open it, click the PowerPivot Window button, which is in the upper left corner of the PowerPivot ribbon.


The PowerPivot window is as simple as Excel, and most of the functionality and options are well known from Excel (such as importing from Excel, SQL Server, Analysis Services, etc.). One of the most exciting things here is that you can import data from an RSS feed, but you will find further details on that later in this series.

This is what the PowerPivot window looks like:


In part two you can expect me to show you different options for importing data, creating relationships between imported tables, adding a new column and using DAX.