Daily Archives: November 11, 2010

New SQL 2005/2008 installation

When accepting such a role in a company, you should be aware that you will not always be seen as the nicest person, the one everyone wants to chat with over a coffee and befriend. The DBA is usually the man or woman who solves problems as we speak, but at the same time he is the first one held responsible, without any analysis of other factors, when something goes wrong with the database or server administration. The DBA is the one who must discover what went wrong, what to do to solve the problem immediately, and how to prevent such situations from coming back. It’s frustrating, but mostly it’s nice and fun, and quite often you get a chance to demonstrate your intelligence and technical skills and receive appropriate feedback when you are really good at what you’re doing. Now, this part with “really good” is hard to get. You don’t learn how to be a good DBA from books, the internet or universities. There are not many people who can teach you the best practices. And generally there aren’t many DBAs in a company, given the big responsibility and trust offered. Basically you just need to have a global vision of the entire system that you manage and treat every problem very seriously. That’s what it is. A DBA deals with problems, anticipates them and solves them. Funny? I say yes …

Although there are many articles out there, a DBA really learns by practicing on a daily basis, bravely facing any problem that arises or is brought up by the technical or functional team. Nobody is really made for it, but some feel that it suits them. I opened this blog to write about the problems I face or have faced and lost time finding the solution for. I have practiced this job for almost five years, I have run into many difficult situations, and I was helped a lot by other bloggers with similar problems, to whom I am thankful. Although no problem that looks just like another will have the same solution, there is always one for yours.

For starters I want to describe what you should do if you’re working on a new installation of SQL 2005/2008. It always matters who did the installation and how the server is configured. Some bad configurations, unfortunately, can only be safely undone with a re-install.

If you’re on a new installation of SQL 2005 or SQL 2008: the installation process is quite different between 2005 and 2008, but it is important to consider the following for both these installations:

1. Installation options

  1. Be careful about which edition of SQL you plan to install and on what type of hardware (32-bit or 64-bit). As a rule, Enterprise is installed for production environments, Developer for development environments, and Standard for production but with a lot of limitations compared to the Enterprise features. It’s good to check which edition is required before the application is deployed, because the costs differ substantially.
  2. Check with the DEV team whether you need all the components proposed in the kit. SQL Server is resource-hungry software – especially where memory is concerned. For example: if you do not need Analysis Services (for data warehousing), do not install it. If needed, you will be able to install it later.
  3. If you want to install multiple instances on the same machine, you should give a different name to your instance instead of using the DEFAULT one. The default instance ID will be MSSQLSERVER and a Named Instance will be whatever you want it to be – e.g. MYINSTANCE. A quick way to check which instance you are connected to is shown below.
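
Once connected, a minimal sketch of checking which instance answered, using standard server properties available since SQL 2005:

-- @@SERVERNAME returns SERVER or SERVER\INSTANCE;
-- InstanceName is NULL when you are on the default instance
SELECT @@SERVERNAME AS server_instance,
       SERVERPROPERTY('InstanceName') AS instance_name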

2. Startup Accounts

  1. The 2005 installer will ask you to insert the details of the service account used to run SQL Server. You can use the Local System account; a Domain account, if the server is joined to a domain and you have previously created a service account specifically for this purpose in Active Directory; or the same Domain account option with a local user created for this purpose with the Administrator role on the server. After installation you can change the service accounts of the installed components from SQL Server Configuration Manager or from Services in Administrative Tools.
  2. In the 2008 installer you can select in the same window: the service accounts for all installed components, their startup type (Manual/Automatic/Disabled), as well as the collation used at the server level (this step is very important). To find out which collation you need, check the documentation of the application being deployed or consult the DEV team. If no collation is specified, use the default one. You can verify the server collation afterwards, as shown below.
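
A quick check of the collation the server ended up with:

-- Returns the server-level collation, e.g. SQL_Latin1_General_CP1_CI_AS
SELECT SERVERPROPERTY('Collation') AS server_collation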

3. Security Models

The next step is to select an authentication mode supported by SQL Server: Mixed mode (SQL + Windows) or Windows. The authentication type is very important in determining the way your users access the system. Microsoft’s recommendation is to use Windows authentication mode as much as possible, because it is the safest mode. With this type of authentication, a user’s access is controlled by a Windows user or group, which in turn is authenticated on the client machine or by Active Directory. SQL authentication, by contrast, opens a door for malicious users who know the username and password and can log into the system. Also, the SQL username & password cross the network unencrypted unless network encryption is enabled (SSL certificate) and can be intercepted by attackers. You can check the configured mode afterwards with the query below.
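
A small sketch for checking the authentication mode:

-- IsIntegratedSecurityOnly: 1 = Windows authentication only, 0 = Mixed mode
SELECT SERVERPROPERTY('IsIntegratedSecurityOnly') AS windows_auth_only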

4. Database Files Location

  1. In SQL 2005 it was only possible to select a single location for the User Databases (data files and log files), a single location for the System Databases (data files and log files) and a single location for the SQL installation files.
  2. In SQL 2008 you can set up the storage layout better, so that you perform fewer post-installation changes. You can choose separate locations for: the System databases, tempdb (which should be placed on a separate, well-sized disk), the User Database data files and the User Database log files. Performance will be enhanced if data files are separated from log files onto different disks, because the I/O operations on log files are sequential while those on data files are random. You can review the current file locations with the query below.
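
A quick sketch for reviewing where the files of all databases currently live, using the sys.master_files catalog view (available since SQL 2005):

-- Lists every database file with its type and physical location
SELECT DB_NAME(database_id) AS database_name,
       type_desc,
       physical_name
FROM sys.master_files
ORDER BY database_name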

After you install SQL Server, it is always very important to set a few configuration options and server properties, such as:

  1. Memory minimum and maximum values – I wrote a special post about setting the memory and it can be read here.
  2. The model database recovery option (Simple if the SQL Server is for a DEV environment) and its growth options – the default of 1 MB and 10% growth for Data/Log files is a big mistake that most leave behind and later face the problems caused by it; a sketch of these changes follows this list.
  3. Set firewall rules to allow access to the SQL port.
  4. Check the “Allow Remote connection to this server” option: right-click the SQL Server –> Properties –> Connections page.
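
A minimal sketch of the model database changes from point 2 – the growth sizes are only examples to adapt to your environment, and modeldev/modellog are the default logical file names of model:

-- Simple recovery for a DEV server, so the log does not grow unchecked
ALTER DATABASE model SET RECOVERY SIMPLE
GO
-- Replace the 1 MB / 10% growth defaults with fixed-size increments
ALTER DATABASE model MODIFY FILE (NAME = modeldev, FILEGROWTH = 256MB)
GO
ALTER DATABASE model MODIFY FILE (NAME = modellog, FILEGROWTH = 128MB)
GO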

SQL 2005/2008 – Configure Memory (part 1)

I promised to come back with the checklist of things to be done after the installation of SQL 2005/2008 is completed. This post will mainly focus on the memory settings. Even though memory is not the only worry, it is the principal one in my mind.

For starters, check that you installed the correct version of SQL on the right platform, with the right Service Pack and edition. The following commands will show you all the basic information related to the server properties. This information can be stored, if you want, in a dedicated DBA table on a dedicated DBA server. If you have a lot of servers under your administration, collecting this kind of information is mandatory.

-- Version number, e.g. 9.00.4035 / 10.0.2531
select serverproperty ('productversion')
-- Service Pack level, e.g. SP1
select serverproperty ('productlevel')
-- Edition: Enterprise/Standard/Developer etc.
select serverproperty ('edition')
-- Full version string, including the 32/64-bit platform
select @@version
-- Detailed build and platform information
exec master.dbo.xp_msver

1. Memory min and max values:

As with other applications, SQL Server needs memory to run. However, unlike most applications, it allows you to decide how much memory it can use. That’s good, because SQL Server likes lots of memory and will take as much as is available on the server. If this option is left at the default value, you will soon face out-of-memory problems for other applications or even for the OS. The funny part is that the memory acquired by SQL is not released back to the system. Say the system has 16 GB of physical memory. If at some point SQL needs 15.5 GB of RAM to function and that amount of memory is available, SQL will grow its memory usage to 15.5 GB and will not give it back to the system when the transaction is over. If SQL Server uses all of the memory in the server and Windows doesn’t have enough memory to function, SQL Server will run as if it is short on memory: query response times will go up, CPU usage will go up and disk I/O will go up as Windows begins paging more and more RAM to the hard drive. You don’t want to reach this situation, because the complications can be disastrous for your databases.
You can set the minimum and maximum memory values by issuing sp_configure from SQL or from the “Properties” window of the SQL Server instance in the GUI. Both min and max values are extremely important for SQL to function properly.

Set minimum memory for SQL Server:

The minimum memory requirements are specified on the MSDN website and in the installation files of each SQL edition, in the hardware requirements section.
Here are the memory requirements for the SQL 2005/2008 editions:

SQL 2005:
Minimum: 512 MB
Recommended: 1 GB or more
 
SQL 2008:
Minimum: 1 GB
Recommended: 4 GB or more
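
A minimal sketch of setting the minimum value with sp_configure – 1024 MB is just an example; as with ‘max server memory’, the unit is MB:

-- 'min server memory' is expressed in MB
sp_configure 'min server memory', 1024
RECONFIGURE
GO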

Set the maximum memory for SQL Server:

-- to allow to see the advanced options of SQL:
sp_configure 'show advanced options', 1
RECONFIGURE
GO

-- To set the maximum server memory value (the unit is MB, so 6144 = 6 GB):
sp_configure 'max server memory', 6144
RECONFIGURE
GO

 
After you set the maximum and minimum values, there are a few things to be checked or considered.
 
2. Memory considerations if you run SQL Server on a 32-bit platform:
 
Standard 32-bit addresses can map a maximum of 4 GB of memory. The standard address spaces of 32-bit processes are therefore limited to 4 GB. By default, on 32-bit Microsoft Windows operating systems, 2 GB are reserved for the operating system, and 2 GB are made available to the application – in our case to SQL Server.
 
2.1. /3GB switch.
 
If you have a total of 4 GB of physical memory and you need to give SQL access to more than 2 GB, you have to specify the /3GB parameter in the Boot.ini file of Windows 2000/2003 Server and reboot the machine. With the /3GB parameter, the operating system reserves only 1 GB for its processes and SQL Server can access up to 3 GB instead of the default 2 GB.
 
2.2. PAE (Physical Address Extension) and AWE (Address Windowing Extensions)
 
Physical Address Extension (PAE) is a processor feature that enables x86 processors to access more than 4 GB of physical memory on capable versions of Windows. Certain 32-bit versions of Windows Server running on x86-based systems can use PAE to access up to 64 GB or 128 GB of physical memory, depending on the physical address size of the processor. However, PAE does not change the amount of virtual address space available to a process. Each process running in 32-bit Windows is still limited to a 4 GB virtual address space.  At this point we need to consider the AWE facility.

AWE is a set of extensions to the memory management functions of Windows that allow applications to address more memory than the 2-3 GB that is available through standard 32-bit addressing. AWE lets applications acquire physical memory, and then dynamically map views of the nonpaged memory to the 32-bit address space. Although the 32-bit address space is limited to 4 GB, the nonpaged memory can be much larger. This enables memory-intensive applications, such as large database systems, to address more memory than can be supported in a 32-bit address space.

Steps to configure the operating system for AWE:

  1. To support more than 4 GB of physical memory on 32-bit operating systems, you must first add the /PAE parameter to the Boot.ini file and reboot the computer.
  2. Remove the /3GB parameter from the Boot.ini file IF there is more than 16 GB of physical memory available on the computer. The operating system requires 2 GB of virtual address space for system purposes and therefore can support only a 2 GB user-mode virtual address space.
  3. Enable AWE. Go to SQL Server Management Studio > Properties dialog > Memory > select ‘Use AWE to allocate memory’, or alternatively issue sp_configure 'awe enabled', 1 against the target server, as shown in the sketch after this list.
  4. Ensure that you add the user that the SQL Server service runs under to the ‘Lock Pages in Memory’ Local Security Policy (see Microsoft KB 811891 for the exact details on how to do this). If you don’t update the local security policy, SQL Server will not actually use AWE and will continue to use only 2 GB of memory; furthermore, you’re likely to see the following in the SQL Server log: "Cannot use Address Windowing Extensions because lock memory privilege was not granted."
  5. Restart the server machine.
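
The T-SQL for step 3, as a minimal sketch (‘awe enabled’ is an advanced option, so it must be made visible first):

-- 'awe enabled' is hidden until advanced options are shown
sp_configure 'show advanced options', 1
RECONFIGURE
GO
-- Let SQL Server use AWE to address memory beyond the 32-bit limit
sp_configure 'awe enabled', 1
RECONFIGURE
GO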

 
3. Memory considerations if you run SQL Server on a 64-bit platform:
 
In today’s 64-bit platforms, some great improvements have been made in the memory arena. While 32-bit platforms require you to use AWE and PAE to access more than 2 GB of RAM, the 64-bit platforms don’t have this limitation. In 64-bit platforms, all memory is available to the applications, as long as they are compiled as 64-bit applications; 32-bit applications running in Windows on Windows (WOW) have the same memory limits that they have running on a 32-bit platform.

While SQL Server only provides a few simple memory settings, setting them correctly is extremely important. Correct memory settings will keep SQL Server running smoothly for a long time to come. Memory settings should be reviewed regularly to ensure that the original settings are still appropriate. After all, the amount of memory installed last year may no longer be enough, or may no longer be allocated correctly.

You can find out and monitor how much memory SQL needs and whether the Operating System is comfortable with the memory left. Check out my next post related to SQL Memory needs.



SQL ERROR Fix: sp_MS_marksystemobject: Invalid object name ‘[cdc].[cdc_TABLE1]’

Hi! One of the new features in Microsoft SQL Server 2008 is the ability to track changes on a table, which you can enable using the Change Data Capture (cdc) feature. It is extremely useful for BI (Business Intelligence) processes because it allows tracking only the changes made to a table instead of importing the whole table each day for processing. Today a DEV guy and I faced a curious problem while trying to enable an additional table for cdc on a database already enabled for the Change Data Capture feature. Just FYI, you need to be sysadmin or have the db_owner role to manage the cdc feature.

Normally when you run:

USE [DB1]
GO
EXEC sys.sp_cdc_enable_table_change_data_capture
@source_schema = 'dbo',
@source_name = 'TABLE1',
@role_name = 'cdc_TABLE1'
GO

Results:

Job 'cdc.Test_capture' started successfully.
Job 'cdc.Test_cleanup' started successfully.

SQL creates 2 new SQL Agent jobs and a new table, named as you specified in the @role_name parameter, which will reflect the changes made on the tracked table.
The table will be marked as a system table and found under the “System Tables” folder in the Database/Tables tree in SQL Management Studio.

In my case the cdc table was created inside the master database with a few warnings and NOT as a system table. It was inserted under the User Tables leaf. At first look it was working as a cdc table, but I didn’t want to take the risk.

These were the warnings:

sp_MS_marksystemobject: Invalid object name '[cdc].[cdc_TABLE1]'
sp_MS_marksystemobject: Invalid object name '[cdc].[sp_insdel_605961235]'
sp_MS_marksystemobject: Invalid object name '[cdc].[sp_upd_605961235]'
sp_MS_marksystemobject: Invalid object name '[cdc].[sp_batchinsert_605961235]'

After digging a little on the Internet I found out that, when the stored procedure sys.sp_cdc_enable_table_change_data_capture is executed, the undocumented procedure sp_MS_marksystemobject tries to mark the table as a system table. In our case that step failed and the table was marked as a user table, which was not what we wanted.
So the first thing was to try running manually the command which I thought could do the trick and mark the table as system:

sp_MS_marksystemobject '[cdc].[cdc_TABLE1]'

Same warning. Not what I expected…

After another couple of hours of research, I found the issue. The resolution came from a totally unsuspected area. It turned out that the cdc schema, under which all cdc tables are created, had a different owner than the cdc SQL user. Somehow, in the past, it had been changed to another SQL user. The reason it failed with the “Invalid object name” warning became clear to me: the user who owned the schema didn’t have the rights to see/change the cdc objects. I had to change the cdc schema owner back to cdc, as it is supposed to be.

This command solved the issue and let me enable the cdc feature on the table without warnings:

USE [DATABASE]
GO
ALTER AUTHORIZATION ON SCHEMA::[cdc] TO [cdc]
GO
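
If you want to check the cdc schema owner before and after the change, here is a quick sketch against the standard catalog views:

-- Shows who owns the cdc schema; after the fix it should be the cdc user
SELECT s.name AS schema_name,
       dp.name AS owner_name
FROM sys.schemas s
JOIN sys.database_principals dp ON dp.principal_id = s.principal_id
WHERE s.name = 'cdc'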

SQL ERROR Fix : Error : Msg 9514, Level 16, State 1, Line 1 Xml data type is not supported in distributed queries. Remote object ‘LINKED_SERVER.Database.dbo.Table’ has xml column(s).

I ran into a rather senseless restriction imposed by MS when using INSERT INTO…SELECT FROM to transfer a few rows through a Linked Server from a table that has an XML data type column, even when that column is not listed in the “insert into” statement. See below a detailed reproduction of this error.

The insert statement from this example uses a variable for filtering options.

Statement:

DECLARE @maxid as nvarchar (max)
SET @maxid=(SELECT TOP 1 col1 from Table_Local ORDER BY col1 DESC)
Insert into Database.dbo.Table_Local (col1, col2, col3, XMLcol4)
select col1, col2, col3, XMLcol4
from [Linked_Server].Database.dbo.Table_Remote t2 where col1 > @maxid

Error:

Msg 9514, Level 16, State 1, Line 4
Xml data type is not supported in distributed queries.
Remote object 'LINKED_SERVER.Database.dbo.Table' has xml column(s).

Resolution:

1. Change the statement to use the OPENQUERY function instead of the linked server four-part name.
2. Cast the XML column retrieved to varchar(max).

Basically, the data is queried on the remote server, which converts the XML data to varchar and sends it to the requesting server, where it is converted back to XML.

Here is the right statement:

-- OPENQUERY accepts only a literal string, so the whole statement is built dynamically
DECLARE @TSQL varchar(8000)
DECLARE @maxid as nvarchar (max)
SET @maxid=(SELECT TOP 1 col1 from Table_Local ORDER BY col1 DESC)
-- The remote side ships XMLcol4 as varchar(max); it is cast back to XML locally.
-- The doubled/quadrupled quotes embed @maxid as a literal inside the remote query.
SELECT  @TSQL = 'insert into Database.dbo.Table_Local
                          (col1, col2, col3, XMLcol4)
select col1, col2, col3, Cast(t2.XMLcol4 as XML) as XMLcol4
from OPENQUERY([LINKED_SERVER_NAME],''SELECT col1, col2, col3,
Cast(XMLcol4 as Varchar(max)) as XMLcol4
FROM
DATABASE.dbo.Table_Remote where col1 >  ''''' + @maxid + ''''''') t2'
EXEC (@TSQL)

DiskPart – How to find out which disk is allocated to a logical volume

There is a fast way to check which physical disk a logical disk is allocated to. This way you can better distribute the available storage between data files, log files and tempdb files. You should know that these are better separated onto different disks.

You should consider applying this method only to servers which don’t have SAN disks as physical storage. The distribution of logical disks on a SAN is very different and must be provided by the network admins.

The following commands don’t change anything, but play carefully with diskpart because it is a partitioning tool.

Diskpart.exe command-line utility basic commands:

Open a command prompt and type diskpart. Depending on the Windows version, diskpart will open in the same window or in a new one. Once inside the utility you can use the following commands.

list disk – lists the disks known by Windows. At this point you will know how many disks there are to play with.

Diskpart - List Volume Partitions

select disk (number) – sets the focus on the specified disk. You need to set the focus on one disk in order to check the partitions or logical volumes created on it. All the commands below need the focus set on a single disk.

detail disk – after setting the focus, you will see all the volumes created on the specified disk.

list partition – you can guess what that is – lists the partitions of the disk in focus.

Diskpart - List Volume Partitions

It is now obvious that Disk 1, with a capacity of 931 GB, hosts two logical volumes – 683 GB and 243 GB.
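
Putting it all together, a typical session looks like this (disk 1 is just the example from the screenshot above):

C:\> diskpart
DISKPART> list disk
DISKPART> select disk 1
DISKPART> detail disk
DISKPART> exit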

A little about this blog…

I started this blog with the wish of expanding and consolidating the knowledge I have acquired over the past 5 years of working with SQL Server as a Database Administrator. Also, if some of the information presented here serves as help or as a starting point for a discussion, I will be happy to get in touch with my readers through comments and questions.

This blog is designed for DBAs and people technically interested in DBA work. Here you will find a small portion of the problems/daily work/solutions which I hope you will find useful and interesting.