Saturday, 28 June 2008

Load Balancers

There are two ways to distribute the load on your network: hardware or software load balancers.

A hardware load balancer has some advantages over a software one. The first is that it is independent of the operating system and therefore has better throughput than a software-based one. The second is that it can run checks against the returned status code; for example, if it sees a 404 error page you can configure it to remove that server from the web farm.

The best hardware load balancer is the Cisco Application Control Engine (ACE) module in a Cisco 7600 series router.

Software-based load balancers are cheaper to use, as they require no Cisco training or special hardware. However, since they do not have active probes, if a web server or other balanced service starts to return an error message you are unaware of it, and it may be issued to customers for a while before you find out.

So from a design point of view, to have the highest level of uptime you need two hardware load balancers, in either active/standby or active/active mode, both with probes configured to remove unresponsive or error-generating servers from the balanced farm.
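To make the probe idea concrete, here is a minimal sketch in Python of the kind of HTTP health check a balancer runs against its farm. The addresses and pool structure are invented for illustration; a real hardware balancer does this internally in firmware.

```python
import urllib.request

# Hypothetical pool of backend web servers (addresses are illustrative).
POOL = ["http://10.0.0.11/", "http://10.0.0.12/", "http://10.0.0.13/"]

def probe(url):
    """Return True if the server answers with HTTP 200, False otherwise."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status == 200
    except Exception:  # timeout, connection refused, HTTP 4xx/5xx, ...
        return False

def healthy_servers(servers, probe_fn=probe):
    """Keep only the servers that pass the probe; the rest leave the farm."""
    return [s for s in servers if probe_fn(s)]
```

A server returning a 404 fails the probe, so it drops out of the list the balancer sends traffic to, which is exactly what the hardware probe automates.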

One last thing to add: if you have geographical server farms, you can use round-robin DNS so that content is provided from the nearest physical server farm.
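Plain round-robin DNS simply hands out the A records in rotation; here is a minimal sketch (the addresses are invented):

```python
from itertools import cycle

# Invented A records for the same hostname; a round-robin DNS server
# answers with these addresses in rotation.
RECORDS = ["203.0.113.10", "198.51.100.10", "192.0.2.10"]
_rotation = cycle(RECORDS)

def next_answer():
    """Return the next address in round-robin order."""
    return next(_rotation)
```

Geo-aware DNS goes one step further and orders the answers by the client's location, so users hit the nearest farm.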

Saturday, 21 June 2008

What impact can a poorly designed database have?

I was asked why I ask whether you have a risk assessment for each database.

This is one of the standard questions I ask businesses: do they know what databases they have on their systems, and are they well maintained?
Because a single database can have an impact on the whole SQL Server, but just how much?

To answer that question I first have to give you a real-world example. Microsoft talks about a "performance hit" when a database is not designed or indexed well; let me translate this into technical English, as Microsoft is very cagey about the subject.

What it really means is that an Itanium 2 server with 4 physical CPUs and 64 GB of RAM (those of you who know your hardware might see where this is going), running SQL Server 2005 SP2, can go from an average CPU load of 20% to 90% when loaded with a badly indexed 3 GB database.

Yes, you read that right: 90% CPU load from a 3 GB database. Do not underestimate the effect a database can have on your server. This is why you should have a risk assessment for each database.

So it's important to look at each database and where it came from, as well as to have a staging area to test the load before moving it to a production environment.

Surprisingly few companies have a database administrator on their team; most have just the support team and someone who knows how to install SQL Server.

So when something happens, like SQL Server starting to put a heavy load on the CPU, they don't know where to look. Most if not all of what you need is just regular maintenance tasks in SQL Server: updating statistics, reorganising or rebuilding fragmented indexes, shrinking the databases, and regular transaction log backups.
However, this can't be done blindly on every database; some databases need special indexing and maintenance to keep them running.
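As a sketch of how those routine tasks can be scripted, here is a small Python helper that builds the T-SQL maintenance statements for a list of databases. The database names and the backup path are placeholders, and as noted above the exact statements should be adapted for databases that need special handling.

```python
def maintenance_script(databases):
    """Build a list of routine T-SQL maintenance statements per database.

    The templates cover the tasks mentioned above: update statistics,
    rebuild indexes, and back up the transaction log. The backup path is
    a placeholder, not a recommendation.
    """
    templates = [
        "USE [{db}]; EXEC sp_updatestats;",
        "USE [{db}]; EXEC sp_MSforeachtable 'ALTER INDEX ALL ON ? REBUILD';",
        "BACKUP LOG [{db}] TO DISK = N'E:\\Backups\\{db}_log.trn';",
    ]
    return [t.format(db=db) for db in databases for t in templates]
```

Each database gets the same three statements here; in practice you would vary them for the databases that need special indexing or maintenance.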

So make a list of the databases you have and how best to maintain them, and remember to run these jobs at quiet times: re-indexing, for example, will impact the performance of the database, may impact the whole server depending on its physical resources, and can take hours to run depending on the size of the database.

Don't forget to have some form of alert if your maintenance job fails; there's no point in having the job if it doesn't run.

Saturday, 14 June 2008

Distributed File System

Surprisingly few networks have DFS replication in place. This is something you should not overlook, as on large networks with many sites it can save both time and resources.

Here are some of the benefits. You can replicate files to another server, giving you some fault tolerance without the expense of a cluster.
Another plus is that it makes it easy to perform offline work on a server, as end users are unaware of the downtime. You can also place profiles on the DFS and have them replicated to all sites, so that end users always work from the nearest server for the quickest response rather than loading files over the WAN.

Namespaces also let you link many network shares into a single network drive, despite them being on separate servers and/or in separate locations.

It can also help you move resources more easily: by addressing a resource name rather than a server name, hardware and locations can be changed without downtime and without remapping resources.
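A namespace is essentially a level of indirection between a resource name and its physical targets; here is a toy sketch in Python of the idea (all server, site, and path names are invented):

```python
# Illustrative sketch of how a DFS namespace decouples a resource name from
# its physical servers: one namespace folder, several replicated targets,
# and the client is referred to a target in its own site.
NAMESPACE = {
    r"\\corp.example.com\dfs\Profiles": [
        {"server": r"\\LON-FS1\Profiles", "site": "London"},
        {"server": r"\\NYC-FS1\Profiles", "site": "NewYork"},
    ],
}

def refer(path, client_site):
    """Return the replicated target in the client's site, else any target."""
    targets = NAMESPACE[path]
    for t in targets:
        if t["site"] == client_site:
            return t["server"]
    return targets[0]["server"]  # fall back to the first available target
```

Because clients only ever see the namespace path, you can swap the underlying servers in and out without anyone remapping drives.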

Another plus is that on DFS-replicated drives you can back up the namespace just once, so you don't take multiple backups of the same information, allowing you to save money on backup hardware.

But remember, as always, there are some downsides too.
The main one is that you will have to make sure you either have enough bandwidth to support the replication or throttle DFS replication back so it doesn't overload your site links.

Fault Tolerant DHCP

This week I heard some stories about DHCP network faults that could have been avoided by better design.

The most common points of failure are networks that have just one DHCP server, or where the second DHCP server has to be brought online manually to restore service back to normal.

Branch offices are also hard: you want to provide the best fault tolerance you can, but you don't have the same hardware budget there that you have at headquarters.

I've seen people try to get around this by providing DHCP only from head office, but that still leaves a point of failure on the connection between the sites, and as we know, when communications do fail it quite often takes the provider hours or even days to get them working again.

So the best way around this I have found for the moment is to use a hybrid of local and remote DHCP servers, so that if the link goes down you have a local resource to provide leases, and if the local resource fails you still have the remote one.
By providing DHCP from more than one location you get around the single point of failure; however, this does mean the scope needs to be larger. Microsoft best practice uses an 80/20 rule, meaning 20% of the addresses are provided by the standby, and you need to remove the exclusions manually when the primary DHCP server fails, which takes time. If you are using Cisco routers and switches, you can provide a hot-standby option with the right hardware.
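The 80/20 arithmetic itself is simple; here is a minimal Python sketch that splits a scope's usable addresses between the primary and the standby server (the network range and the exact ratio are illustrative):

```python
import ipaddress

def split_scope(network, primary_share=0.8):
    """Split a DHCP scope's usable addresses between two servers.

    Returns (primary_range, standby_range) per the 80/20 guideline
    described above: the primary serves 80% of the leases, the standby
    the remaining 20%. Purely illustrative.
    """
    hosts = list(ipaddress.ip_network(network).hosts())
    cut = int(len(hosts) * primary_share)
    return hosts[:cut], hosts[cut:]
```

On each server the other server's portion is configured as an exclusion range, which is exactly the exclusion you have to remove by hand when the primary fails.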

But most likely you're using Microsoft Windows, so your options are a little more limited. However, if you run larger subnets on network-address-translated ranges you can get around this; you may be wasting half of your IP range this way, but you don't have to wait for someone to change over the scope options to get clients connected.

This diagram shows the most fault-tolerant model currently available, using a split DHCP between local and remote servers and clustered resources at critical sites such as head office.

Saturday, 7 June 2008

IIS 401 and 403 Errors

Back by demand: IIS errors. But unlike most, I'm going to tell you where to look to solve your error.

IIS adds a sub-status code that narrows down the specific cause of the error. These specific error codes are displayed in the browser but are not displayed in the IIS log:

401.1 - Logon failed.
The client gave the wrong username/password (including none at all). This could come from an incorrectly cached auto-login attempt by the browser, or from a user login dialog in the browser.

Invalid Kerberos configuration: on IIS6, if you have a customized Application Pool Identity AND Integrated Authentication is used AND the web server is in a domain, you will mysteriously get 401.1 unless you configure SETSPN *or* change Integrated Authentication to favor NTLM. See the following URLs on Application Pool Identity, Integrated Authentication in IIS, and Constrained Delegation configuration, as well as this URL on additional Kerberos-related troubleshooting, for more information.

You enabled Anonymous authentication, yet you still get 401.1 for all requests. One common cause is that the anonymous user credentials stored in the IIS metabase configuration file are DIFFERENT from the real user principal's credentials (i.e. a mismatched password). In all cases, the preferred solution is to manually synchronise the username/password of the anonymous user principal in IIS with that of the real user principal. I have seen many amazing variations of this cause.

401.2 - Logon failed due to server configuration.
This means access was denied by the server's configuration, typically the authentication methods enabled on the directory; check the directory's authentication settings in IIS Manager.

401.3 - Unauthorized due to ACL on resource.
This error is almost the same as 401.2, but in this case it is the file itself you don't have permission to; check the NTFS permissions on the file at the drive level to resolve this one.

401.4 - Authorization failed by filter.
An ISAPI filter that IIS loaded denied the request. This can be a long or short piece of troubleshooting depending on how much information the filter outputs to the Windows application log.

401.5 - Authorization failed by ISAPI/CGI application.
This error indicates that some ISAPI Extension or CGI Web Application sent back a structured 401 response of some sort. The reasons why the CGI/ISAPI are returning such 401 responses are completely arbitrary and uncontrollable by IIS. You will need to determine WHICH CGI/ISAPI is returning the response and obtain support for it.
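To keep the codes above handy, here is a small Python lookup table that paraphrases this post's notes into a "first place to look" per sub-status code:

```python
# Quick lookup of the 401.x sub-status codes discussed above, mapping each
# to a first place to look. Descriptions paraphrase the notes in this post.
IIS_401_SUBSTATUS = {
    "401.1": "Logon failed: bad or missing credentials, Kerberos/SPN "
             "misconfiguration with a custom app pool identity, or "
             "mismatched anonymous-user credentials.",
    "401.2": "Denied by server configuration: check the authentication "
             "methods enabled on the directory in IIS Manager.",
    "401.3": "Denied by ACL on the resource: check NTFS permissions "
             "on the file itself.",
    "401.4": "Denied by an ISAPI filter: check the Windows application log.",
    "401.5": "Denied by an ISAPI extension or CGI application: identify "
             "which one returned the 401 and get support for it.",
}

def first_check(code):
    """Return the suggested starting point for a 401.x sub-status code."""
    return IIS_401_SUBSTATUS.get(code, "Unknown sub-status; check the IIS logs.")
```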

Monday, 2 June 2008

Stored Procedure sp_SDS

SQL Server has quite a few stored procedures. One of them, sp_spaceused, is useful for seeing the space used by databases; however, it's a bit basic.

Luckily there is a much-extended version that gives far more detail and is useful for a number of reasons, one being that it gives the percentages used by the data, not just the total size.

So here it is: one lovely procedure, written by Mr Ding.

USE master;
GO

CREATE PROCEDURE dbo.sp_SDS
@TargetDatabase sysname = NULL, -- NULL: all dbs
@Level varchar(10) = 'Database', -- or 'File'
@UpdateUsage bit = 0, -- default no update
@Unit char(2) = 'MB' -- Megabytes, Kilobytes or Gigabytes
AS

/*
** author: Richard Ding
** date: 4/8/2008
** usage: list db size AND path w/o SUMmary
** test code: sp_SDS -- default behavior
** sp_SDS 'maAster'
** sp_SDS NULL, NULL, 0
** sp_SDS NULL, 'file', 1, 'GB'
** sp_SDS 'Test_snapshot', 'Database', 1
** sp_SDS 'Test', 'File', 0, 'kb'
** sp_SDS 'pfaids', 'Database', 0, 'gb'
** sp_SDS 'tempdb', NULL, 1, 'kb'
*/


IF @TargetDatabase IS NOT NULL AND DB_ID(@TargetDatabase) IS NULL
BEGIN
RAISERROR(15010, -1, -1, @TargetDatabase);
RETURN (-1);
END

IF OBJECT_ID('tempdb.dbo.##Tbl_CombinedInfo', 'U') IS NOT NULL
DROP TABLE dbo.##Tbl_CombinedInfo;

IF OBJECT_ID('tempdb.dbo.##Tbl_DbFileStats', 'U') IS NOT NULL
DROP TABLE dbo.##Tbl_DbFileStats;

IF OBJECT_ID('tempdb.dbo.##Tbl_ValidDbs', 'U') IS NOT NULL
DROP TABLE dbo.##Tbl_ValidDbs;

IF OBJECT_ID('tempdb.dbo.##Tbl_Logs', 'U') IS NOT NULL
DROP TABLE dbo.##Tbl_Logs;

CREATE TABLE dbo.##Tbl_CombinedInfo (
DatabaseName sysname NULL,
[type] VARCHAR(10) NULL,
LogicalName sysname NULL,
T dec(10, 2) NULL,
U dec(10, 2) NULL,
[U(%)] dec(5, 2) NULL,
F dec(10, 2) NULL,
[F(%)] dec(5, 2) NULL,
PhysicalName sysname NULL );

CREATE TABLE dbo.##Tbl_DbFileStats (
Id int identity,
DatabaseName sysname NULL,
FileId int NULL,
FileGroup int NULL,
TotalExtents bigint NULL,
UsedExtents bigint NULL,
Name sysname NULL,
FileName varchar(255) NULL );

CREATE TABLE dbo.##Tbl_ValidDbs (
Id int identity,
Dbname sysname NULL );

CREATE TABLE dbo.##Tbl_Logs (
DatabaseName sysname NULL,
LogSize dec (10, 2) NULL,
LogSpaceUsedPercent dec (5, 2) NULL,
Status int NULL );

DECLARE @Ver varchar(10),
@DatabaseName sysname,
@Ident_last int,
@String varchar(2000),
@BaseString varchar(2000);

SELECT @DatabaseName = '',
@Ident_last = 0,
@String = '',
@Ver = CASE WHEN @@VERSION LIKE '%9.0%' THEN 'SQL 2005'
WHEN @@VERSION LIKE '%10.0%' THEN 'SQL 2008'
ELSE 'SQL 2000' END;

SELECT @BaseString =
' SELECT ' +
CASE WHEN @Ver = 'SQL 2000' THEN 'CASE WHEN status & 0x40 = 0x40 THEN ''Log'' ELSE ''Data'' END'
ELSE ' CASE type WHEN 0 THEN ''Data'' WHEN 1 THEN ''Log'' WHEN 4 THEN ''Full-text'' ELSE ''reserved'' END' END +
', name, ' +
CASE WHEN @Ver = 'SQL 2000' THEN 'filename' ELSE 'physical_name' END +
', size*8.0/1024.0 FROM ' +
CASE WHEN @Ver = 'SQL 2000' THEN 'sysfiles' ELSE 'sys.database_files' END +
' WHERE ' + CASE WHEN @Ver = 'SQL 2000' THEN 'HAS_DBACCESS(DB_NAME()) = 1' ELSE 'state_desc = ''ONLINE''' END;

SELECT @String = 'INSERT INTO dbo.##Tbl_ValidDbs SELECT name FROM ' +
CASE WHEN @Ver = 'SQL 2000' THEN 'master.dbo.sysdatabases'
WHEN @Ver IN ('SQL 2005', 'SQL 2008') THEN 'master.sys.databases' END +
' ORDER BY name ASC';
EXEC (@String);


-- For data part
IF @TargetDatabase IS NOT NULL
BEGIN
SELECT @DatabaseName = @TargetDatabase;
IF @UpdateUsage <> 0 AND DATABASEPROPERTYEX (@DatabaseName, 'Status') = 'ONLINE'
AND DATABASEPROPERTYEX (@DatabaseName, 'Updateability') <> 'READ_ONLY'
BEGIN
SELECT @String = 'USE [' + @DatabaseName + '] DBCC UPDATEUSAGE (0)';
PRINT '*** ' + @String + ' *** ';
EXEC (@String);
END

SELECT @String = 'INSERT INTO dbo.##Tbl_CombinedInfo (DatabaseName, type, LogicalName, PhysicalName, T) ' + @BaseString;
EXEC ('USE [' + @DatabaseName + '] ' + @String);

INSERT INTO dbo.##Tbl_DbFileStats (FileId, FileGroup, TotalExtents, UsedExtents, Name, FileName)
EXEC ('USE [' + @DatabaseName + '] DBCC SHOWFILESTATS WITH NO_INFOMSGS');

UPDATE dbo.##Tbl_DbFileStats SET DatabaseName = @DatabaseName;
END
ELSE
WHILE 1 = 1
BEGIN
SELECT TOP 1 @DatabaseName = Dbname FROM dbo.##Tbl_ValidDbs WHERE Dbname > @DatabaseName ORDER BY Dbname ASC;
IF @@ROWCOUNT = 0
BREAK;
IF @UpdateUsage <> 0 AND DATABASEPROPERTYEX (@DatabaseName, 'Status') = 'ONLINE'
AND DATABASEPROPERTYEX (@DatabaseName, 'Updateability') <> 'READ_ONLY'
BEGIN
SELECT @String = 'DBCC UPDATEUSAGE (''' + @DatabaseName + ''') ';
PRINT '*** ' + @String + '*** ';
EXEC (@String);
END

SELECT @Ident_last = ISNULL(MAX(Id), 0) FROM dbo.##Tbl_DbFileStats;

SELECT @String = 'INSERT INTO dbo.##Tbl_CombinedInfo (DatabaseName, type, LogicalName, PhysicalName, T) ' + @BaseString;
EXEC ('USE [' + @DatabaseName + '] ' + @String);

INSERT INTO dbo.##Tbl_DbFileStats (FileId, FileGroup, TotalExtents, UsedExtents, Name, FileName)
EXEC ('USE [' + @DatabaseName + '] DBCC SHOWFILESTATS WITH NO_INFOMSGS');

UPDATE dbo.##Tbl_DbFileStats SET DatabaseName = @DatabaseName WHERE Id BETWEEN @Ident_last + 1 AND @@IDENTITY;
END

INSERT INTO dbo.##Tbl_Logs
EXEC ('DBCC SQLPERF (LOGSPACE) WITH NO_INFOMSGS');

-- set used size for data files (the totals already come from sys.database_files/sysfiles)
UPDATE dbo.##Tbl_CombinedInfo
SET U = s.UsedExtents*8*8/1024.0
FROM dbo.##Tbl_CombinedInfo t JOIN dbo.##Tbl_DbFileStats s
ON t.LogicalName = s.Name AND s.DatabaseName = t.DatabaseName;

-- set used size and % values for log files:
UPDATE dbo.##Tbl_CombinedInfo
SET [U(%)] = LogSpaceUsedPercent,
U = T * LogSpaceUsedPercent/100.0
FROM dbo.##Tbl_CombinedInfo t JOIN dbo.##Tbl_Logs l
ON l.DatabaseName = t.DatabaseName
WHERE t.type = 'Log';

UPDATE dbo.##Tbl_CombinedInfo SET F = T - U, [U(%)] = U*100.0/T;

UPDATE dbo.##Tbl_CombinedInfo SET [F(%)] = F*100.0/T;

IF @Unit = 'KB'
UPDATE dbo.##Tbl_CombinedInfo
SET T = T * 1024, U = U * 1024, F = F * 1024;

IF @Unit = 'GB'
UPDATE dbo.##Tbl_CombinedInfo
SET T = T / 1024, U = U / 1024, F = F / 1024;

SELECT DatabaseName AS 'Database',
type AS 'Type',
T AS 'Total',
U AS 'Used',
[U(%)] AS 'Used (%)',
F AS 'Free',
[F(%)] AS 'Free (%)'
FROM dbo.##Tbl_CombinedInfo
WHERE DatabaseName LIKE ISNULL(@TargetDatabase, '%')
ORDER BY DatabaseName ASC, type ASC;

SELECT SUM (T) AS 'TOTAL', SUM (U) AS 'USED', SUM (F) AS 'FREE' FROM dbo.##Tbl_CombinedInfo;

DECLARE @Tbl_Final TABLE (
DatabaseName sysname NULL,
TOTAL dec (10, 2),
[=] char(1),
used dec (10, 2),
[used (%)] dec (5, 2),
[+] char(1),
free dec (10, 2),
[free (%)] dec (5, 2),
[==] char(2),
Data dec (10, 2),
Data_Used dec (10, 2),
[Data_Used (%)] dec (5, 2),
Data_Free dec (10, 2),
[Data_Free (%)] dec (5, 2),
[++] char(2),
Log dec (10, 2),
Log_Used dec (10, 2),
[Log_Used (%)] dec (5, 2),
Log_Free dec (10, 2),
[Log_Free (%)] dec (5, 2) );

INSERT INTO @Tbl_Final
SELECT x.DatabaseName,
x.Data + y.Log AS 'TOTAL',
'=' AS '=',
x.Data_Used + y.Log_Used AS 'U',
(x.Data_Used + y.Log_Used)*100.0 / (x.Data + y.Log) AS 'U(%)',
'+' AS '+',
x.Data_Free + y.Log_Free AS 'F',
(x.Data_Free + y.Log_Free)*100.0 / (x.Data + y.Log) AS 'F(%)',
'==' AS '==',
x.Data,
x.Data_Used,
x.Data_Used*100/x.Data AS 'D_U(%)',
x.Data_Free,
x.Data_Free*100/x.Data AS 'D_F(%)',
'++' AS '++',
y.Log,
y.Log_Used,
y.Log_Used*100/y.Log AS 'L_U(%)',
y.Log_Free,
y.Log_Free*100/y.Log AS 'L_F(%)'
FROM ( SELECT d.DatabaseName,
SUM(d.T) AS 'Data',
SUM(d.U) AS 'Data_Used',
SUM(d.F) AS 'Data_Free'
FROM dbo.##Tbl_CombinedInfo d WHERE d.type = 'Data' GROUP BY d.DatabaseName ) AS x
JOIN ( SELECT l.DatabaseName,
SUM(l.T) AS 'Log',
SUM(l.U) AS 'Log_Used',
SUM(l.F) AS 'Log_Free'
FROM dbo.##Tbl_CombinedInfo l WHERE l.type = 'Log' GROUP BY l.DatabaseName ) AS y
ON x.DatabaseName = y.DatabaseName;

IF @Unit = 'KB'
UPDATE @Tbl_Final SET TOTAL = TOTAL * 1024,
used = used * 1024,
free = free * 1024,
Data = Data * 1024,
Data_Used = Data_Used * 1024,
Data_Free = Data_Free * 1024,
Log = Log * 1024,
Log_Used = Log_Used * 1024,
Log_Free = Log_Free * 1024;

IF @Unit = 'GB'
UPDATE @Tbl_Final SET TOTAL = TOTAL / 1024,
used = used / 1024,
free = free / 1024,
Data = Data / 1024,
Data_Used = Data_Used / 1024,
Data_Free = Data_Free / 1024,
Log = Log / 1024,
Log_Used = Log_Used / 1024,
Log_Free = Log_Free / 1024;

DECLARE @GrantTotal dec(11, 2);
SELECT @GrantTotal = SUM(TOTAL) FROM @Tbl_Final;

SELECT CONVERT(dec(10, 2), TOTAL*100.0/@GrantTotal) AS 'WEIGHT (%)',
DatabaseName AS 'DATABASE',
CONVERT(VARCHAR(12), used) + ' (' + CONVERT(VARCHAR(12), [used (%)]) + ' %)' AS 'USED (%)',
CONVERT(VARCHAR(12), free) + ' (' + CONVERT(VARCHAR(12), [free (%)]) + ' %)' AS 'FREE (%)',
CONVERT(VARCHAR(12), Data) + ' (' + CONVERT(VARCHAR(12), Data_Used) + ', ' +
CONVERT(VARCHAR(12), [Data_Used (%)]) + '%)' AS 'DATA (used, %)',
CONVERT(VARCHAR(12), Log) + ' (' + CONVERT(VARCHAR(12), Log_Used) + ', ' +
CONVERT(VARCHAR(12), [Log_Used (%)]) + '%)' AS 'LOG (used, %)'
FROM @Tbl_Final
WHERE DatabaseName LIKE ISNULL(@TargetDatabase, '%')
ORDER BY DatabaseName ASC;

IF @TargetDatabase IS NULL
SELECT SUM (TOTAL) AS 'TOTAL',
SUM (used) AS 'USED',
SUM (free) AS 'FREE',
SUM (Data) AS 'DATA',
SUM (Log) AS 'LOG'
FROM @Tbl_Final;
GO



This will show you far more detail on usage and runs against all the databases at once, whereas the standard sp_spaceused has to be run many times.

I just wanted to share this with you, as it's saved me lots of time over the past month.