Thursday, November 6, 2008

When Visual Studio IntelliSense is not intelligent


Ever tried using the DataSource, DisplayMember and ValueMember properties of your CheckedListBox control, only to find they don't pop up in IntelliSense? Well, it looks like we can't count on IntelliSense 100%. I saw this post on the MSDN forums and tried it out for myself. Surprisingly, IntelliSense hides the properties as though they were not supported, but the code compiles and just works. I guess it pays to dig into other resources rather than relying on IntelliSense alone. Don't get me wrong - I love the IntelliSense feature as it makes me more productive (it's one of the reasons I switched from VB to C# as a part-time developer). But it makes you wonder why the product team did not expose class properties like these in the tool


Saturday, November 1, 2008

Using LINQ with SQL Server Compact Edition

I was planning to use MS Access for one of the applications I was writing for a non-profit organization, but I was concerned about the security of the database file. That being said, I opted for SQL Server Compact Edition since this is just a small application that needs a database. This lets the database be embedded with the application and avoids configuring a database engine as a service, as you would with SQL Server Express. Since I wanted to use LINQ, I was quite surprised when the Design Surface threw an error saying that the data provider for SQL Server Compact was not supported. Good thing there's SqlMetal. I use SqlMetal to automate generating DBML files for multiple SQL Server databases. Even though the Design Surface does not support SQL Server Compact, you can simply import the generated DBML file into your Visual Studio project. You can generate the DBML file for the SQL Server Compact database by running the SqlMetal command below:

SqlMetal /dbml:northwind.dbml northwind.sdf

Run the Windows SDK tool (<drive>:\Program Files\Microsoft Visual Studio 9.0\VC\SqlMetal.exe) against your generated .sdf file. Make sure you navigate to the folder containing the SDF file, typically the Visual Studio project folder. You can also generate the .vb or .cs code directly, depending on your preference, as shown below.
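For example, to skip the DBML step and emit C# directly, the call would look something like this (same tool, just the /code and /language switches instead of /dbml):

SqlMetal /code:northwind.cs /language:cs northwind.sdf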

Check out this blog post from the SQL Server Compact Team on using LINQ with SQL Server Compact

Thursday, October 30, 2008

Another one-liner in Windows PowerShell

I did mention that I have been trying to convert my VBScript files to PowerShell scripts due to ease of coding. One of them happened to be a script that iterates a folder containing logs, reads the contents of the log files and appends them to a text file (I was actually trying to consolidate all the RADIUS logs for parsing and storage in a database). In VBScript, here's how it is written

Const ForReading = 1
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set folder = objFSO.GetFolder("d:\a")
Set outfile = objFSO.CreateTextFile("d:\testout.txt")

For Each file In folder.Files
    Set testfile = objFSO.OpenTextFile(file.Path, ForReading)
    Do While Not testfile.AtEndOfStream
        line = testfile.ReadLine
        'write to a single output file
        outfile.WriteLine(line)
    Loop
    testfile.Close
Next

outfile.Close
Set objFSO = Nothing
Set folder = Nothing
Set outfile = Nothing

Here's how you can do it in PowerShell - with the cmdlet aliases

ls d:\a | gc | ac d:\testout.txt
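For the uninitiated, that's Get-ChildItem piped to Get-Content piped to Add-Content; spelled out without the aliases it reads:

Get-ChildItem d:\a | Get-Content | Add-Content d:\testout.txt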

This should be more than enough reason for any system administrator to learn PowerShell. More to come

Monday, October 20, 2008

Renaming your domain controller hostname

This is not really recommended if you think about it. How many of you eventually decided that one day, you need to rename your domain controller? I bet no one. But there are unusual cases where you do not have any other choice but to do it.

That's what happened to me when I was building my virtual image of a Windows Server 2003 domain controller. I accidentally plugged in my automated installation CD and, to my surprise, the installation generated a random name for my machine. Although I could demote the domain controller, change the hostname, and promote it back, that would be very time-consuming with all those reboots and manual steps. Searching thru Google gave me this site, which recommends the Netdom.exe tool. You need to install the Windows Support Tools to use it. To change the hostname of the domain controller, open the command prompt associated with the Windows Support Tools. Then, run the following command

netdom computername oldhostname.domainname /add:newhostname.domainname

Although there are other instructions mentioned in the article, I only needed this command. After rebooting the server, I ran dcdiag.exe (Domain Controller Diagnostics Tool) and netdiag.exe (Network Connectivity Tester) in the Windows Support Tools command prompt to validate connectivity to the domain controllers. Knowing that you have the right tools available makes life a bit easier for any professional - whether in IT or not

Thursday, October 9, 2008

Windows Update Explained

For those who administer or manage WSUS or simply do patch management, here's documentation on TechNet written by the Windows Update product team to help you better understand updating behavior and what Microsoft is trying to do with Windows Update. While I rely heavily on the WindowsUpdate.log file and/or the WSUS reporting feature to troubleshoot patch deployment issues, it's best to understand what is happening under the hood to get a better picture

Wednesday, September 17, 2008

Why Windows Instant File Initialization for SQL Server 2005 matters

SQL Server 2005 supports database instant file initialization, which skips zeroing out data pages and can reduce the time needed for operations like creating databases, adding files to an existing database, increasing the size of an existing database file (manually or thru autogrowth), or restoring a database or filegroup. I've written an article on MSSQLTips.com highlighting its importance in both disaster recovery and operational performance scenarios. I also ran a few tests to answer a query in the forum regarding the topic. I ran my tests on a VMWare image on a DELL PowerEdge 2650 with 4 x 2.8 GHz CPUs and 3GB RAM. The image is running Windows Server 2003 Enterprise Edition with SQL Server 2005 Enterprise Edition, although the feature is not edition-specific - it requires Windows XP/Server 2003 or later with the SQL Server service account holding the Perform Volume Maintenance Tasks privilege. Creating a 10GB database took only 0.1 seconds with instant file initialization; without it, the same operation took about 2 minutes and 10 seconds. That difference definitely matters when you're dealing with very large databases - imagine restoring a 500GB database, or waiting on file autogrowth during peak load. You'll probably agree after trying it out for yourself, perhaps with something like the sketch below
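A minimal repro sketch (the path and sizes are arbitrary; time it once with the privilege granted to the service account and once without, restarting the service in between):

--Only data files benefit from instant file initialization; log files are always zeroed
CREATE DATABASE TestIFI
ON PRIMARY (NAME = TestIFI_Data, FILENAME = 'D:\TestIFI.mdf', SIZE = 10GB)
LOG ON (NAME = TestIFI_Log, FILENAME = 'D:\TestIFI_Log.ldf', SIZE = 100MB)
GO
DROP DATABASE TestIFI
GO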

Monday, September 1, 2008

Can't run your batch files from Scheduled Tasks? Try this

I was asked by a friend to check why a specific batch file was not being called by a job in Task Scheduler. The first thing I did was check whether the account used to run the job had the appropriate permissions. The account was a domain account with limited privileges. The same account could run jobs on other servers but not on this one. Making it a member of the local Administrators group gave it the permissions to run the job, but making it a member of the Power Users group did not. Since the former was not an option, I compared the ACLs with those on the other servers and found that the account did not have permissions on the CMD.EXE file in the %windir%\system32 folder. After granting Read and Execute permissions to this account, it was able to run the job using Task Scheduler.

I've seen systems where the Everyone group is granted the same permissions on the CMD.EXE file; that is not a recommended practice due to security vulnerabilities. I'll work on a PowerShell script to check a machine for this and remove the Everyone group if it exists.
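Until then, here's a rough sketch of what that script might look like (untested; run it with administrative rights and test it before deploying anywhere):

# Sketch: find and remove any CMD.EXE permissions granted to Everyone
$cmd = "$env:windir\system32\cmd.exe"
$acl = Get-Acl $cmd
$everyone = $acl.Access | Where-Object { $_.IdentityReference -eq "Everyone" }
if ($everyone) {
    $everyone | ForEach-Object { [void]$acl.RemoveAccessRule($_) }
    Set-Acl -Path $cmd -AclObject $acl  # requires administrative rights
}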

Thursday, August 28, 2008

Need multiple CPUs for your VMWare image?

I normally create VMWare images on my laptop for testing and then deploy them on servers for production. My laptop only has an Intel Centrino CPU, but the target servers have multiple CPUs. The problem is that an image configured on my single-CPU laptop would not take advantage of the multiple CPUs on the production server. Initially, I thought it was just a matter of changing the numvcpus variable in the VMX file associated with my image. But then the extra processors were only visible in the Computer Management console and not in Task Manager. I found this article on how to change the kernel and hardware abstraction layer on a virtual machine running Windows Server 2003 with Service Pack 2. I downloaded Windows Server 2003 Service Pack 2 to extract the following files

halmacpi.dl_
ntkrnlmp.ex_

Next, I expanded the files needed by the HAL and MP kernel by running the following commands at the command prompt

expand halmacpi.dl_ halmacpi.dll
expand ntkrnlmp.ex_ ntkrnlmp.exe

This extracts the halmacpi.dll and ntkrnlmp.exe files. Next, copy these files into your %windir%\system32 folder (type %windir%\system32 in the Run command to open it). After copying the files, modify boot.ini using the System startup options, as you won't be able to edit it directly in Notepad. Simply add the following items to the active boot option line. Mine looks something like this

multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003, Enterprise, ACPI MPS" /fastdetect /NoExecute=OptOut /KERNEL=ntkrnlmp.exe /HAL=halmacpi.dll

I've added ",ACPI MPS" to the partition(1)\WINDOWS parameter (the comma included, to separate it from the original "Windows Server 2003, Enterprise" value) plus the /KERNEL=ntkrnlmp.exe /HAL=halmacpi.dll switches. This allows Windows to detect that it is running on an ACPI multiprocessor machine when it reboots and install the new MPS HAL. Reboot the virtual machine; once you manage to log in, it will finish installing the new MPS HAL and prompt you to reboot again. After the second reboot, check Task Manager and you should see the additional processor, as the virtual Windows now recognizes the configuration change. I wonder if I can still do this even though I only have a single physical processor on my machine. Although that doesn't have much value, it's worth trying out.

And so I need a GRANT IMPERSONATE permission

SQL Server 2005 has given us more granular permissions, and one of the related features is the EXECUTE AS clause. This gives a user the ability to impersonate another user should there be a need to do so. But before a user can do impersonation, it must be granted permission to impersonate. This is where the GRANT IMPERSONATE permission comes in - typically where a lower-privileged user needs to impersonate a higher-privileged user and not the other way around, I presume.

I've written an article on MSSQLTips.com to demonstrate how to use the EXECUTE AS clause with the REVERT clause to impersonate users
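Here's a minimal sketch of the idea (User1 and User2 are hypothetical database users):

--User1 must be granted permission before it can impersonate User2
GRANT IMPERSONATE ON USER::User2 TO User1
GO

--Later, running as User1:
EXECUTE AS USER = 'User2'  --switch the security context to User2
SELECT USER_NAME()         --returns User2
REVERT                     --switch back to User1
GO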

Monday, August 18, 2008

Most important lesson learned at TechEd Asia 2008

This was supposed to go on the other blog I maintain but I reckon this is something that everyone would appreciate even though it isn't a technical post.

After coming back to Singapore from a week of TechEd Asia 2008 mania, I've spent a lot of time thinking about the most important lesson I learned at the event. While a lot of attendees may talk about the great sessions they attended or the parties they went to, and speakers about the sessions they did (I had two breakout sessions and one instructor-led lab) and how they gave a great (or screwed-up) presentation, mine was a non-technical session done by the people from Microsoft Learning entitled "If I can be a Microsoft Certified Trainer, so can you. MCT Speaks!" I am only a part-time MCT these days, as I no longer do full-time training. But teaching is my passion - I'd do it for free. Almost a decade ago, I decided to pursue the MCT certification because I knew it would be my gateway to pursuing a passion in teaching (I've got other dreams about teaching, but this was one of my earlier goals). One of the statements made by the moderator reminded me why I was at TechEd Asia. He said, "being an MCT gives you the fulfillment that you have made a difference in someone else's life." On the first day, after the keynote session, a guy with a camera called me while I was on my way to the presenters' room. He said he saw the video which included me promoting Microsoft certification. I was trying to recall how he knew me when he interrupted me by saying, "I still do what you taught me about SQL Server 2000 in my work." That hit me hard. He was my trainee some five years back in a SQL Server 2000 course I did for Accenture Philippines, and now he's a SQL Server consultant. He was with a few of his colleagues from Accenture and Avanade attending the week-long TechEd Asia event. I even saw one of his colleagues who happened to be one of my students way back when I was still teaching at a university. Now that's what I call fulfillment. A few years back, while they were in my classroom, I never imagined they would be who they are now. But everybody has the potential to become great. I just happened to be the avenue for these people.

It's great to know that you are making a difference in someone else's life. For me, it's simply fulfilling a purpose. It just so happens that I am an MCT while fulfilling that purpose. And that is basically why I volunteer to speak at these types of events - TechEd Asia, PASS Community Summit, MSDN TechTalks, etc. I believe that one of the people in the crowd will become somebody great someday. That in itself would make a whole lot of difference for that person. And that would be my reward

Saturday, August 9, 2008

Level 5 Limited Event - FREE Microsoft Videos

Want to have access to free Microsoft videos? Check out The BlogCast Repository. The owner of the portal has made the videos available from 10:00 AM to 12:00 PM on Wednesday, August 13th for people in the USA (CST, GMT -6) and Eastern Europe time zones. The site will also be open from 10:00 PM to 12:00 AM the same day to accommodate users on the other side of the world. You will be able to view as many of the 82 Level 5 videos as you can during the Open House. Below are the topics you can choose from:

System Center Configuration Manager
System Center Operations Manager (MVP)
Windows Server 2008
WSUS 3.0 SP1
SQL 2008 (MVP)
Word 2007 - Office Professional 2007


So, take advantage of this free offer and spread the word to those who are interested. You'll definitely enjoy the videos.

Friday, August 1, 2008

So, how do you uninstall Windows PowerShell v1.0?

This was a long-awaited post as I have been concentrating more on disaster recovery stuff for SQL Server

If you want to install the PowerShell v2.0 CTP, the first thing you need to do is uninstall v1.0. Apparently, though, you won't find any of the usual uninstall options - no entry in the Add/Remove Programs console, no Uninstall shortcut under Program Files. Have no fear. You can always use the registry. Simply look for the UninstallCommand value under HKLM\Software\Microsoft\Updates\Windows PowerShell 1.0. The exact path depends on your operating system's setup, such as which service pack you are currently running. In my case, I am running Windows XP Service Pack 3, so the registry key for me is HKLM\Software\Microsoft\Updates\Windows PowerShell 1.0\SP3\KB926139-v2. Run the value data from the UninstallCommand value to uninstall Windows PowerShell v1.0. Once you're done, you can install v2.0. You'll be surprised that it now creates an entry in the Add/Remove Programs console. Remember, this is still a CTP, so be careful not to deploy your scripts in production until it's released - or just bite the bullet
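If you'd rather not hunt around in regedit, here's one way to pull that value with PowerShell itself before you remove it (the key path assumes Windows XP SP3, as in my case):

Get-ItemProperty 'HKLM:\Software\Microsoft\Updates\Windows PowerShell 1.0\SP3\KB926139-v2' |
    Select-Object UninstallCommand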

Thursday, July 31, 2008

Another reason to be at TechEd Asia 2008: After Hours

There's more to TechEd Asia 2008 than the regular technical sessions. After Hours lets you mingle casually with the speakers and other TechEd Asia delegates. See great demos, watch your peers get launched as conference speakers in a competition, or simply listen to talks about certification and IT trends. These fun-packed activities will make your attendance at TechEd Asia 2008 an unforgettable experience. Visit TechEd Asia 2008 After Hours for more information so you can plan your schedule way ahead of time

So, if you haven't registered yet, it's not too late. Register now!

Tuesday, July 29, 2008

Broken BACKUP and RESTORE sequence?

Have you ever had to scratch your head because your database backups won't work even if you are practically sure you tested them properly? Imagine this - you have regular FULL and DIFFERENTIAL database backups working together as part of your backup process. Your FULL backups run every Sunday and your DIFFERENTIAL backups run every day. Then, all of a sudden on a particular Wednesday, your developers decide to restore a FULL backup of your production database into your test environment to test a functionality. What happens? The backup sequence is broken. DIFFERENTIAL backups contain the changes since the last FULL backup. If you were following the sequence and working with the backups generated by your backup process, you would have tested using the Sunday FULL backup and the latest DIFFERENTIAL backup. Not in this case. You would need the FULL backup created on that specific Wednesday (which unfortunately was deleted after being restored on the test environment). Let me show you some sample code to demonstrate


USE master
GO

--FULL DATABASE BACKUP on Sunday
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\AdventureWorks_FULL.BAK'
WITH INIT, FORMAT, STATS = 10
GO

--FULL DATABASE BACKUP on Wednesday CREATED BY YOUR DEVELOPERS
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\AdventureWorksD.BAK'
WITH INIT, FORMAT, STATS = 10
GO

--DIFFERENTIAL DATABASE BACKUP AFTER THE LAST FULL BACKUP
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\AdventureWorks_DIFF.BAK'
WITH DIFFERENTIAL, INIT, FORMAT, STATS = 10
GO


If you're the DBA, you might be thinking that you still have your backup sequence intact and this is what you might have done should you need to restore



--RESTORE SEQUENCE STARTS
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\AdventureWorks_FULL.BAK'
WITH REPLACE, NORECOVERY, STATS = 10
GO

--FAILS DUE TO INCONSISTENT RESTORE SEQUENCE
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\AdventureWorks_DIFF.BAK'
WITH REPLACE, NORECOVERY, STATS = 10
GO

Now, there are a couple of ways to solve this problem. One is to create a trigger on your msdb database to trap events written to the backupset system table and check the type column for 'D' values - at least you get notified of any FULL database backups taken outside of your normal backup process. In SQL Server 2005, it's as easy as using the COPY_ONLY option in the BACKUP command. This specifies that the backup is a copy-only backup, which does not affect the normal sequence of backups. Make sure that your operational DBAs and developers who work on SQL Server 2005 know how to use this option whenever they run ad-hoc backups. Here's how to use the COPY_ONLY clause in your BACKUP commands (with emphasis on COPY_ONLY), followed by a sketch of the trigger idea

--FULL DATABASE BACKUP on Wednesday
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\AdventureWorksD.BAK'
WITH INIT, FORMAT, STATS=10, COPY_ONLY
GO
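And here's a rough sketch of the msdb trigger mentioned above (the trigger name is made up, and the notification logic is left as a PRINT - swap in sp_send_dbmail or a logging table as you see fit):

USE msdb
GO
CREATE TRIGGER trg_watch_adhoc_full_backups
ON dbo.backupset
AFTER INSERT
AS
BEGIN
    --type = 'D' in backupset means a full database backup
    IF EXISTS (SELECT * FROM inserted WHERE type = 'D')
        PRINT 'Full database backup detected - verify it is part of the normal schedule'
END
GO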

Monday, July 28, 2008

So you can recover from database snapshots...but...

SQL Server 2005 has given us the option to create database snapshots - point-in-time images of a database. This gives us the option to recover from user errors, like accidentally truncating a table. MSDN provides a procedural approach on how to restore from database snapshots. But what concerns me most is that I would have already lost the data added between the time the snapshot was created and the point in time the error occurred. And this is where your understanding of what happened and how you implement your disaster recovery plan come into the picture. As far as Microsoft is concerned, restoring a database from a snapshot overwrites the original source database. But here's what I'd do. I'd first back up the tail of the log before reverting to the database snapshot. After reverting, I'd restore my backups into another copy of the database and apply the tail of the log; that copy reflects the state of the database just after the accidental error occurred. Since I am concerned about the new records that were added after the accidental error, I would retrieve those from the database copy and insert them into the reverted database. This takes a lot of work, as you will be comparing tables between the original and the copy, but at least you get the records back faster than restoring from a FULL backup. There is no native way to solve this, though there are third-party utilities that can roll back a specific command from the transaction log. You can use the tablediff utility to help with the comparison, although it wasn't meant to solve this type of problem
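A rough sketch of the first two steps (the database and snapshot names are hypothetical, and reverting assumes only one snapshot exists on the database):

--1. Save the tail of the log so the post-error transactions are not lost
BACKUP LOG SalesDB TO DISK = 'D:\SalesDB_tail.trn' WITH NO_TRUNCATE
GO
--2. Revert the source database to the snapshot
USE master
GO
RESTORE DATABASE SalesDB FROM DATABASE_SNAPSHOT = 'SalesDB_Snapshot'
GO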

Friday, July 25, 2008

Did you check your MSDB.dbo.suspect_pages after a database restore?

In SQL Server 2005, the default behavior of a RESTORE command is to simply continue even if there are corrupted pages in your backups. The only way you find out about the corruption is when a user calls saying they cannot query some records and probably gets an error similar to the one below

Msg 824, Level 24, State 2, Line 1
SQL Server detected a logical consistency-based I/O error:
incorrect checksum (expected: 0xf93a020c; actual: 0xf93a820c).
It occurred during a read of page (1:69) in database ID 10 at offset 0x0000000013c000 in file
'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\SalesDB.mdf'.
Additional messages in the SQL Server error log or system event log may provide more detail.
This is a severe error condition that threatens database integrity and must be corrected immediately.


The first thing you need to do after a database restore is to check MSDB.dbo.suspect_pages to see if there are suspect pages that need to be repaired. Books Online describes the structure of this system table and gives you an idea of what to restore should there be suspect pages. Bear in mind, though, that this system table holds a maximum of 1,000 rows, after which new errors are no longer recorded, so you need to do your own housekeeping.
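The check itself is a simple query:

--Anything returned here deserves attention; per Books Online, event_type 1-3
--indicate bad pages (823/824 errors, bad checksums and torn pages)
SELECT database_id, file_id, page_id, event_type, error_count, last_update_date
FROM msdb.dbo.suspect_pages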

The Enterprise Edition gives you the option to do an online page-level restore should you have corrupt pages. A more low-level discussion on fixing damaged pages can be found on the SQL Server Storage Engine blog
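A minimal sketch of a page-level restore, reusing the damaged page (1:69) from the sample error above (the database and backup file names are hypothetical):

RESTORE DATABASE SalesDB
PAGE = '1:69'
FROM DISK = 'D:\SalesDB_FULL.BAK'
WITH NORECOVERY
GO
--...then restore subsequent log backups and recover the database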

Tuesday, July 22, 2008

You don't trust your database backups? Use mirrored backup media sets in SQL Server 2005

What could be more frustrating than finding out that your database backups went missing? This is especially true with transaction log backups, which depend on log sequence numbers: you don't want to lose a single transaction log backup in the chain. In previous versions of SQL Server, we just ran a copy command (or even ROBOCOPY) to copy the transaction log backups to a different location. In SQL Server 2005, we have the MIRROR TO clause. This specifies a set of one or more backup devices that will mirror the backup devices specified in the TO clause, which could be a tape, disk or network location.

To use the MIRROR TO clause, see the sample script below

BACKUP DATABASE AdventureWorks
TO DISK = 'D:\AdventureWorksDB2.BAK'
MIRROR TO DISK = 'F:\AdventureWorksDB.BAK'
WITH INIT, FORMAT, STATS=10
GO

This creates a mirrored copy of the AdventureWorks database backup on the F: drive. Although this gives you a mirrored copy of the backup, it will definitely take longer to complete as it writes to every mirrored media set, thereby increasing your backup window. I tested this by using the MIRROR TO DISK clause with the AdventureWorks database and it took something like 10 times longer to do the backup with this option - the speed, of course, depends on where you mirror your backups and its IO performance

I'd probably still stick to using the copy command in my backups as long as I get the same result. For more information on using mirrored backup media sets, check out this MSDN article

Wednesday, July 16, 2008

You need to backup the mssqlsystemresource database files

So, you think you have all the database backups you need? Well, think again. This might be your mindset after moving from SQL Server 2000 to SQL Server 2005. SQL Server 2005 includes a read-only, hidden database that contains all the system objects. It comes in the form of the mssqlsystemresource.mdf and mssqlsystemresource.ldf files located in <drive>:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\ by default. This database is required by the master database and will definitely affect how your SQL Server service behaves during restarts. If you move the master database, you should also move these files to the same location. Since it is a hidden database, SQL Server can't back it up as part of your database maintenance plan. One approach is to xcopy the files to the location of your database backups on a regular basis, along the lines of the sketch below.
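Something like this would do (the source path assumes a default instance on the C: drive and the destination is hypothetical; schedule it alongside your backup jobs, and stop the service first if the files are in use):

xcopy "C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\mssqlsystemresource.*" "D:\Backups\" /Y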

MSDN has documentation on what this is all about, but let me tell you one thing - you MUST include this in your backup procedures. If you want an idea of how critical these files are, stop your SQL Server 2005 service and rename the mssqlsystemresource.mdf file. You won't be able to restart your SQL Server service afterwards

Tuesday, July 15, 2008

Microsoft Source Code Analyzer for SQL Injection

With the recent mass SQL injection attacks, Microsoft has developed a new static code analysis tool for finding SQL injection vulnerabilities in ASP code. Web developers can run the tool against their ASP source code to identify the root cause of the attack and address it, reducing their exposure to future attacks. You can download the tool from the Microsoft Download Center and test it against your own code. If you find any bugs or whatnot, just go to this site to report them.

Now, this doesn't mean only web applications are prone to SQL injection attacks. Windows-based applications are, too, so make sure you check them as well.

Wednesday, July 9, 2008

A few more days left to take advantage of TechEd Asia 2008's promotions

Wanna save money for your TechEd Asia 2008 registration? Register on or before July 15, 2008 and pay only RM 999 (US$323). Compare that to RM 1,299 (US$419) which is the regular price. What's more, if you register in a group of 5 or more, you save an additional RM 100 (US$33). You only pay RM 899 (US$291). So hurry because you only have a few days left before the 15th. For more details, check out the TechEd Asia 2008 site

Tuesday, July 8, 2008

Extract user's last password set in Active Directory using PowerShell

Just a follow-up on my previous post, here's the script to do just that in PowerShell. It extracts each user's name and the last time the password was changed and displays them in the host.


$strFilter = "(&(objectCategory=User))"
$Dom = 'LDAP://DC=yourDomain,DC=LOCAL'


$objDomain = New-Object System.DirectoryServices.DirectoryEntry $Dom

$objSearcher = New-Object System.DirectoryServices.DirectorySearcher
$objSearcher.SearchRoot = $objDomain
$objSearcher.PageSize = 1000
$objSearcher.Filter = $strFilter
$objSearcher.SearchScope = "Subtree"

$colProplist = "name", "pwdlastset"
foreach ($i in $colPropList)
{$objSearcher.PropertiesToLoad.Add($i) | Out-Null}
$colResults = $objSearcher.FindAll()
foreach ($objResult in $colResults)
{
$objItem = $objResult.Properties
$objItem.name
[datetime]::FromFileTimeUTC($objItem.pwdlastset[0])
}

One-liner in PowerShell? Convert a 64-bit (Integer8) value to date time

As I was converting my VBScripts to PowerShell, I reviewed one that checks for the password expiration of users in Active Directory. A blog post I wrote sometime last year extracts object properties, one of them being the pwdLastSet property, which stores a 64-bit value for when the user last changed their password. This is necessary if you need to know how many days are left before a password expires. One of the things I did was use a function that converts the 64-bit value to a valid date/time value - thanks to Richard Mueller, of course.

Function Integer8Date(ByVal objDate, ByVal lngBias)
    ' Function to convert Integer8 (64-bit) value to a date, adjusted for
    ' local time zone bias.
    Dim lngAdjust, lngDate, lngHigh, lngLow
    lngAdjust = lngBias
    lngHigh = objDate.HighPart
    lngLow = objDate.LowPart
    ' Account for bug in IADsLargeInteger property methods.
    If (lngLow < 0) Then
        lngHigh = lngHigh + 1
    End If
    If (lngHigh = 0) And (lngLow = 0) Then
        lngAdjust = 0
    End If
    lngDate = #1/1/1601# + (((lngHigh * (2 ^ 32)) _
        + lngLow) / 600000000 - lngAdjust) / 1440
    Integer8Date = CDate(lngDate)
End Function


I pass the value from the pwdLastSet property to this function to get the date/time value. After a couple of Google searches, I chanced upon a forum post by Brandon Shell, Microsoft PowerShell MVP. The entire function can be summed up in just one line

[datetime]::FromFileTimeUTC($objItem.pwdlastset[0])

This works if you are using the System.DirectoryServices.DirectorySearcher class (rather than System.DirectoryServices.DirectoryEntry) from the .NET Framework, which is available in PowerShell. The DateTime.FromFileTimeUtc method converts the specified Windows file time to an equivalent UTC time. So much for using conversion functions in VBScript to do this.

Thursday, July 3, 2008

Simple PING test: VBScript vs PowerShell

As preparation for my TechEd Asia 2008 session on Windows PowerShell, I've been converting a few of my administrative scripts to demonstrate how easily tasks can be done using PowerShell. Here's one I just did: a simple PING test. Below is a sample VBScript that checks whether a PING test succeeds. I call the VBScript and pass it a parameter, which is the hostname or IP address of the computer I want to ping

VBScript:
strComputer=Wscript.Arguments.Item(0) 'parameter passed = hostname

wmiQuery = "Select * From Win32_PingStatus Where Address = '" & strComputer & "'"


Set objWMIService = GetObject("winmgmts:\\.\root\cimv2")
Set objPing = objWMIService.ExecQuery(wmiQuery)

For Each objStatus In objPing
    If IsNull(objStatus.StatusCode) Or objStatus.StatusCode <> 0 Then
        Reachable = False 'if computer is unreachable, return false
    Else
        Reachable = True 'if computer is reachable, return true
    End If
Next

Wscript.Echo Reachable

Here's the equivalent script using PowerShell

Param($hostname)
$ping = new-object System.Net.NetworkInformation.Ping
$Reply = $ping.Send($hostname)

$Reply.status

You can see how much easier it is to accomplish a simple task in PowerShell. It makes an administrator's life much better, especially if you have to write scripts that automate repetitive tasks. I'll post more PowerShell conversions of the VBScript code you've probably seen in this blog

Tuesday, July 1, 2008

So your PowerShell script won't run?

In case you're wondering why on earth your PowerShell scripts don't run, it's because script execution in Windows PowerShell is not enabled by default - think of your custom .NET assemblies being blocked until you register them in the global assembly cache (GAC). This is to make sure you don't run just about any script on your system. It is referred to as the restricted execution policy. You'll probably encounter something like this the first time you run a PowerShell script



You can find out more about execution policy in Windows PowerShell by typing Get-Help Set-ExecutionPolicy in your PowerShell window; one common adjustment is shown below. You should also make sure that you have permissions to run scripts on your machine - check either your local or your domain security policy to be sure. Plus, unlike VBScript, where you can just double-click a file and run it, PowerShell scripts don't behave that way. You need to call PowerShell before running the script. This makes sure you don't accidentally run a script by double-clicking it, which makes sense especially if your machine gets exploited by a virus or a worm.
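If the default is too strict for your environment, a common choice is to allow locally created scripts while still requiring downloaded ones to be signed; from a PowerShell prompt running with administrative rights:

Set-ExecutionPolicy RemoteSigned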

Thursday, June 26, 2008

TSQL Debugger is now in SQL Server 2008 RC0

I have long waited for this feature, as I got used to Visual Studio's debugging experience where you can simply put a breakpoint beside a line of code and hit the debug button. SQL Server Management Studio now has a Debug button next to the Execute button (in Visual Studio, F5 is the Debug command, whereas in SQL Server Management Studio it is Run/Execute).










This allows you to debug your code - whether it is a stored procedure call or a TSQL script - and step thru it a line at a time or run it in one shot. You can even step into nested calls, like a stored procedure that calls another stored procedure; it opens a new tab displaying the code for the procedure or function being called from your TSQL. I'm just glad it's already in RC0. Check out my video on BlogCastRepository.com to see how this works

Monday, June 23, 2008

I'm going to TechEd Asia 2008

I've just confirmed my hotel booking for TechEd Asia 2008 and have already started working on my sessions. I will be doing two breakout sessions - one on SQL Server 2005 disaster recovery and one on Windows PowerShell. Check this blog regularly to find out more about what I will be covering at TechEd. There are definitely a thousand and one reasons to go, so register now to enjoy the discounts.

Wednesday, June 18, 2008

SQL Server 2008 Videos from BlogCastRepository.com

I was informed by the owner of the site that my videos have just gone live. I've spent quite a number of days working on the videos to make sure that subscribers get as much as they can in their pursuit of learning SQL Server 2008. I would appreciate your feedback on what else you would like to see on the video list

Check out the BlogCastRepository website for my SQL Server 2008 videos. I'll come up with SQL Server 2005 videos as well in the future

Saturday, June 7, 2008

Exploring Database Backup Compression in SQL Server 2008

Database backup size is always an issue with very large databases, which is why we plan our backup and restore procedures to include differential and filegroup backups. Most of the time, we use backup compression tools like LiteSpeed for SQL Server from Quest or SQL Backup from Red Gate. SQL Server 2008 has a built-in compression engine for compressing your database backups - see the sample below. I have written an article about this feature on MSSQLTips.com and created a video at BlogCastRepository.com
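Here's roughly what it looks like in the CTP builds I've played with (the syntax could still change before RTM):

BACKUP DATABASE AdventureWorks
TO DISK = 'D:\AdventureWorks_Compressed.BAK'
WITH COMPRESSION, INIT, STATS = 10
GO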

Monday, June 2, 2008

So, you think your secured, patched and updated servers are secure?

So, we patch our servers, subject them to numerous security scans, implement security best practices and expect them to be secure? That's ridiculous.

I have been trying to convince developers that a functional application can still be vulnerable if they do not follow security best practices when writing code. I come from a developer background as well, and with tight and unreasonable deadlines, developers are often concerned with functionality and nothing more. But with the increasing incidents of websites being defaced because of SQL injection attacks, there's no doubt that as long as applications are running on servers, there will always be security vulnerabilities. SQL injection attacks are no respecter of platforms or database engines. So have a look at your application code.

A Microsoft blog post on SQL injection attack is available here.

Friday, May 30, 2008

Encrypting your database backups in SQL Server 2008

I was asked how to make sure that your SQL Server database backups are secured, and I ended up saying, "go get a third-party tool like LiteSpeed for SQL Server from Quest Software." While it is true that third-party tools can provide an additional layer of security for your database backups, budget is always a constraint. Good thing SQL Server 2008 includes a feature called transparent data encryption which you can use to secure your database backups. I have written an article on MSSQLTips.com and created a video at BlogCastRepository.com about this feature, which is worth a look.

Monday, May 26, 2008

Moving the WSUS database to a new location

I woke up to a call from our support engineers telling me that a drive had less than 10% free space and needed attention. This drive happens to host my WSUS server together with the backend database. There was only one thing to do: move the database to a new location. This approach works for any SQL Server/MSDE database you want to move to a different location.

Step 1: Stop any service that is accessing the database


Since this is a WSUS database, we need to stop the services accessing it, such as the Update Services service and the WWW service. You can do so using the Services applet or the NET STOP command
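For example (double-check the exact service names in the Services applet first, as they vary between WSUS versions):

net stop wsusservice
net stop w3svc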
Step 2: Detach the database using the sp_detach_db command
Since MSDE does not ship with Enterprise Manager, we will stick to the command-line tool, osql. The sp_detach_db command detaches the specified database from SQL Server/MSDE. If you didn't stop every service or application accessing the database, this command will fail. Below is the syntax for the sp_detach_db command (see MSDN as well)


sp_detach_db [ @dbname= ] 'database_name'
[ , [ @skipchecks= ] 'skipchecks' ]

To run sp_detach_db against the SUSDB database using osql, execute this at the command line


osql -E -S %computername%\wsus -Q "exec sp_detach_db 'SUSDB'"


I am more comfortable executing TSQL scripts while logged in, which is why I make it a point to secure a logged-in connection first before executing them.


Step 3: Move the database files to a new location

Now that you have detached the database from the server, you can treat its files like any others in your file system. Move the SUSDB.mdf and SUSDB_Log.LDF files to the new location (for any other database, be sure you know which files belong to it by executing the sp_helpdb command)

Step 4: Re-attach the database using the sp_attach_db command

Execute the sp_attach_db in osql to re-attach the database files you have moved to a new location.

osql -E -S %computername%\wsus -Q "exec sp_attach_db @dbname=N'SUSDB', @filename1=N'E:\WSUS\MSSQL$SUS\Data\SUSDB.mdf', @filename2=N'E:\WSUS\MSSQL$SUS\Data\SUSDB_log.ldf'"

Step 5: Start the services which you stopped in Step 1


Resume the services or applications you stopped. Check whether the application still works fine and that it can still access the database.

This is a generic approach which can be used for any database running on MSDE or SQL Server, whether it's WSUS or not. Note that if you are dealing with WSUS 3.0 which, by default, uses the Windows Internal Database (SQL Server 2005 Embedded Edition), check out this blog post to connect to this instance.

Thursday, May 15, 2008

Undocumented extended stored procedure - xp_fileexist

This post was originally published in this blog entry

It's quite shocking to hear the word "undocumented" in SQL Server. It simply means you can break your neck trying to find these topics in Books Online or on Microsoft.com and you won't find anything - but there's always Google to take your chances with. I was searching for a VBScript to check for the existence of a file in the file system (it would be a lot easier in .NET, but I'm limited to what I have) when I chanced upon this extended stored procedure - xp_fileexist. DBAs do file system checks whenever they do data transformation or data loading from a delimited text file, an Excel spreadsheet or anything similar. If this is done by a batch job, you need to check first whether a file exists based on a specific filename format (such as Test_yyyymmdd.xls). You can create a simple stored procedure to check the formatting of the filename, but checking whether the file exists before you do your task is not something you can do with plain TSQL. Good thing there is such a stored procedure - xp_fileexist. It verifies whether a specified file exists. The stored procedure lives in the master database, and you can execute it with the following syntax:

xp_fileexist 'C:\autoexec.bat'


This returns a result set with three columns indicating whether the file exists, whether it is a directory, and whether its parent directory exists. You may want to use it in a script for functional checks, such as the example below

DECLARE @result INT
EXEC master..xp_fileexist 'c:\autoexec.bat', @result OUTPUT
--@result returns 1 if the specified file exists, 0 if it doesn't
IF (@result = 1)
    PRINT 'file exists'
ELSE
    PRINT 'file does NOT exist'

This is also helpful when you want to check for the existence of a database backup file recorded in the msdb database. One of the challenges I have is checking whether a transaction log backup still exists when a log shipping job starts failing.

Just remember that using this stored procedure is a bit risky, as Microsoft reserves the right to change or remove it in future versions or even service packs without any documentation (well, it really doesn't have any after all), so make sure to test your scripts after applying service packs or upgrading.

Thursday, May 8, 2008

Changing VMWare image settings

And so I found out.

I was deploying an image to a server and wanted to try VMWare Player instead. I am using version 2.0.3. Since the server has 4 processors, I was thinking of configuring the image to use as many processors as are available. There are two ways to do this. First, you can use VMWare Workstation or Server to change the configuration of the VMX file through the graphical user interface. In my case, since my laptop does not have multiple processors, I encountered an error stating that the image would not start if I used more than one CPU. But it doesn't matter, since I am not going to start the image on my laptop anyway - I just need to change the settings so I can use them on a different machine.

Second, similar to how you edit the VMC file in Microsoft Virtual PC, you can edit the VMX file directly by searching for the numvcpus field and changing the number, say from 1 to 2.


Oh, and there's one more thing: VMWare Player does not support more than 2 processors. And so I found out

Tuesday, April 29, 2008

Automating SQL Server 2005 Express Backups and Deletion of Older Backup Files

I've compiled a couple of my scripts - both TSQL and VBScript - into a tip for MSSQLTips.com on automating backups and deleting older backup files for SQL Server 2005 Express databases, mimicking a database maintenance plan. You can check out the article for more details. Feel free to post comments on the MSSQLTips.com forum site and hopefully you'll find the tip very useful. I'll write a few more tips for them in the long run just for the fun of it.

Monday, April 21, 2008

Multiple connections to a server or shared resource by the same user, using more than one user name, are not allowed

I got this error when I tried to access a remote share from one of our DR servers. Apparently, the DR server configuration was not designed properly, which is probably why I am getting this error. The production and DR servers both have the same configuration (of course, they are on different network segments, which is why this is possible) and I can only access one or the other using a separate VPN profile. I was trying to copy a database backup from the production server and move it to the DR server. The resolution: just run a net use * /delete command. It may sound simple, but it works, and that's what matters. Having the same configuration (hostname and IP address, in my case) confuses my Windows XP workstation, so a simple purge of the cached share connections fixes it. I have to do this quite often as I need to connect to both servers on a regular basis

Wednesday, April 16, 2008

Follow-up on Policy-based Management in SQL Server 2008 v1

I asked Dan Jones about getting policies created in SQL Server 2008 evaluated against SQL Server 2005 and SQL Server 2000. You can create policies on SQL Server 2008 instances and use them to evaluate - and even enforce - settings on SQL Server 2005 and SQL Server 2000 instances, with a couple of things to consider. One, if you are targeting facets which only exist in SQL Server 2008, the policies won't get evaluated on previous versions. An example is a policy which checks whether the FILESTREAM option is enabled on a database; this, of course, won't get evaluated on previous-version instances. Two, you cannot run those policies on a schedule or on demand against previous versions unless you use a PowerShell workaround. I'll work on a couple of samples where I create a policy in SQL Server 2008 and evaluate previous-version instances against it.

Tuesday, April 8, 2008

Thinking of cloning your workstations and servers? Think NewSID.exe

How many times have I mentioned in my blog posts that I am indeed a lazy guy? Probably a lot. That is because I always think of ways to make my work, especially the repetitive parts, as easy as possible. This case is one of them. I regularly work with virtualization - whether Microsoft Virtual PC/Virtual Server or VMWare Server/Workstation - for tests and simulations. And whenever I need to work with a few servers and workstations, I simply create copies of a virtual machine to eliminate the time and effort of installing the operating system and applying service packs on each image. Many organizations use disk-image cloning software such as Norton Ghost or Acronis both for mass deployment of operating systems and for disaster recovery. But if you simply use these tools (or my virtualization method) for cloning, you are in for a surprise if you do not change the computer security identifier (SID) of the cloned images. To see what I mean, create two cloned images and deploy them on a network. Then try logging in with a domain account on either machine and you'll get the error below

The name or security ID (SID) of the domain specified is inconsistent with the trust information for that domain

Fortunately, Sysinternals has a tool called NewSID v4.10 that changes a computer's SID. It is free and is a Win32 program, meaning it can easily be run on systems that have been previously cloned. NewSID works on Windows NT 4, Windows 2000, Windows XP and Windows Server 2003. You can download the tool from Microsoft and run it on the cloned machine to change its SID. It is recommended, though, that you run it on a machine that is not yet a member of a domain. You can choose to randomly generate a SID, copy a SID from another computer, or simply specify one (not really recommended). You can also rename the machine (in my case, I still needed to run the tool even though I had already renamed the cloned image, since both images were going to be part of the same domain). A reboot is required after the SID change has been applied. Download the tool and try it out for yourself

Monday, March 31, 2008

...and I found SqlMetal.exe on my machine...

I was reading the online version of the SQL Server Magazine April 2008 edition when I chanced upon an article on LINQ which talks about this tool. It was the first time I'd read about it, so I read further. SqlMetal.exe is a command-line tool that generates code and mapping for the LINQ to SQL component of the .NET Framework. When I started working with LINQ, I was only using Visual Studio 2008 to generate the object-relational diagram by dragging and dropping database objects onto the designer surface. You can find the tool in the \Program Files\Microsoft SDKs\Windows\v6.0a\bin folder. It generates the necessary data entity and DataContext objects for LINQ as either a .vb or .cs source file. You can run it from the Visual Studio 2008 command prompt, or navigate to the folder in a regular command prompt and execute it from there. A sample SqlMetal.exe invocation is shown below

sqlmetal /server:localhost /database:Northwind /language:cs /code:Northwind.cs

The command generates a .cs file from the Northwind database running on my SQL Server 2000 instance. When you open the code in Visual Studio 2008, it will seem familiar if you have used the designer to create LINQ objects. The code includes all the database entities and their corresponding attributes - the tables and their fields with constraints and data types. If you need to include stored procedures and functions, you simply add the /sprocs and /functions switches, as shown below. If you're thinking of ignoring this tool because the designer is already available, think again. It took me about 10 seconds to type the command and it generated the code for me; compare that to dragging and dropping database objects in the designer, which would take me at least a minute or two. This is a great time saver if you need to generate code for 50 or so databases scattered across different SQL Server instances in your environment. Cool tool for developers
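For instance, the earlier command with those two switches added:

sqlmetal /server:localhost /database:Northwind /language:cs /code:Northwind.cs /sprocs /functions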

Thursday, March 27, 2008

Policy-Based Administration in SQL Server 2008 (formerly known as Declarative Management Framework)

This is a long-awaited blog post, dating back to November 2007 when I did a session on this topic at the Singapore SQL Server User Group. One of the most difficult tasks DBAs face in managing complex enterprise database environments is ensuring that all elements of the system conform to their company's established security and usage policy. Administrators need to constantly monitor the security surface area of the entire database server along with permissions and settings for every database and server object. DMF (as it was called in the previous CTPs) is a policy-based system for managing one or more instances of SQL Server 2008. To use it, SQL Server policy administrators use SQL Server Management Studio to create policies that manage entities on the server, such as the instance of SQL Server, databases, and other SQL Server objects. DBAs select one or more managed targets and check that the targets comply with the policies, or explicitly force the targets to comply. Let's take a typical scenario. In the past, you might have used the SQL Server Best Practices Analyzer (there is also a version for SQL Server 2005) to check whether your SQL Server instances comply with Microsoft's best practices. The problem is that you need to run the tool manually, it only generates a report telling you which settings are not at best practice, and you then have to manually reconfigure those settings yourself. Imagine doing this over and over for every SQL Server instance you manage - a very laborious task for DBAs working at an enterprise hosting services company like mine. With Policy-Based Management, you can create policies, similar to how you create policies in Active Directory, based on your company's established security and usage policy, and use them to inspect a server or even change a group of server instances all at the same time with a click of a button. Amazing how a DBA can do these maintenance tasks with ease using SQL Server 2008.

For more information on SQL Server 2008 Policy-Based Management, check out the virtual hands-on lab at the TechNet Virtual Lab site so you can play around with how it works. The SQL Server Manageability Team also produced a Level 300 webcast in February 2008 which digs deeper into this new SQL Server 2008 feature. Plus, I've written an article for MSSQLTips.com and created a video at BlogCastRepository.com on this as well - more on evaluating policies against multiple instances at the same time.

Sunday, March 23, 2008

What is this DCEXEC.EXE running on my server?

If you have been playing around with SQL Server 2008, chances are you have already seen this in Task Manager's running processes. DCEXEC.EXE is the data collector run-time component; it manages data collection based on the definitions provided in a collection set and can accept any collection set as input. What is it for? The concept of a performance data warehouse is not new. Almost every system administrator out there probably has a methodology for capturing system performance data for analysis, forecasting and capacity planning - DBAs included. But the effort and time needed to build a solution that collects the data, stores it in a data warehouse, and adds reporting and analysis on top is no joke. Good thing Microsoft included this out-of-the-box feature in SQL Server 2008, which lets DBAs create a performance data warehouse in as few wizard clicks as possible. It creates a new database named MDW and the corresponding SQL Server Agent jobs, SSIS packages and SSRS reports, which makes monitoring your SQL Server 2008 database systems a bit easier (I still have to find out whether I can monitor SQL Server 2005 systems as well, since this feature is only available in SQL Server 2008). You can use it to look at historical data about your system's performance and pinpoint what might have caused an issue. Say end users are complaining that their application is slow: you can identify which queries were running in that specific time frame and even extract their query execution plans. You get CPU, memory, disk I/O, network usage and a whole bunch of other data related to server activity, query statistics and disk usage. To learn more about the new performance monitoring and troubleshooting features in SQL Server 2008 Management Studio, check out this TechNet webcast

Friday, March 14, 2008

Prove them wrong: Using Dynamic Management Views and Functions in SQL Server 2005

I will spend the next couple of blog posts looking at performance tuning and troubleshooting in SQL Server 2005, although I may insert a few posts about SQL Server 2008 and probably SharePoint 2007.

Working for an organization that provides enterprise hosting services, I regularly deal with customers who blame our infrastructure whenever their application runs slow. Having just upgraded some servers to new hardware and a new platform - from Windows 2000/SQL Server 2000 to Windows Server 2003/SQL Server 2005 - I have to prepare a few bullets to fire back when customers start complaining about performance. My take: no amount of hardware upgrade can compensate for a poorly written application (scripts included). Let me start by introducing SQL Server 2005's dynamic management views (DMVs) and functions (DMFs). DMVs (not the Department of Motor Vehicles) return server state information that can be used to monitor the health of a server instance, diagnose problems, and tune performance. In SQL Server 2000, you would have needed a ton of tools just to get this information; not so with SQL Server 2005. Let me start with the sys.dm_os_schedulers view. Note that DMVs belong to the sys schema and have dm_ as their prefix. This view returns one row per scheduler in SQL Server, and each scheduler is mapped to an individual processor (on multi-core or hyper-threaded processors, the count is per core/thread). You can use this view to monitor the condition of the schedulers or to identify runaway tasks. When do you use it? When you suspect the CPU is your bottleneck. To illustrate, let's take a look at the sample script below

SELECT scheduler_id, current_tasks_count, runnable_tasks_count, active_workers_count, work_queue_count, current_workers_count, is_idle, pending_disk_io_count
FROM sys.dm_os_schedulers
WHERE scheduler_id < 255

Some pointers about the script: since schedulers map to the processors in your server, you should see one scheduler per logical processor. On an 8-processor, dual-core server, you will have 16 schedulers. The reason we limit our query to scheduler_id values less than 255 is that IDs greater than or equal to 255 are used internally by SQL Server, such as the dedicated administrator connection scheduler. How can we use the information returned by this query? A non-zero runnable_tasks_count indicates that tasks have to wait for their time slice to run. Imagine having to wait for your turn just to get some ice cream: performance-wise, the ice cream vendor (your scheduler, in this case) is the bottleneck. When runnable_tasks_count returns 0, no tasks are waiting on that scheduler, although active sessions may still exist - a dedicated administrator connection, for example. High values for this counter are a symptom of a CPU bottleneck. "High value" is relative here, as some say it means greater than 2 and others say greater than 10; whatever the number, the recommendation is to come up with a baseline, as different systems have different baseline values. There is other information you can get out of this DMV. For example, you can look at the pending_disk_io_count column to determine if you have a disk I/O problem. Similar to runnable_tasks_count, the higher this value, the more likely that I/O is your bottleneck. A whitepaper on troubleshooting performance issues in SQL Server 2005 is available from TechNet and provides deeper insight. We'll look at the other DMVs and DMFs* in future posts

*Note: DMF was also the former acronym for SQL Server 2008's Declarative Management Framework. To minimize confusion, Microsoft now calls that feature Policy-Based Management in SQL Server 2008
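
Since everyone's "high value" is different, a simple way to build that baseline is to sample the view on a schedule and keep the averages. A rough sketch, with an entirely made-up baseline table (dbo.SchedulerBaseline is my own name, not anything shipped with SQL Server):

-- Hypothetical baseline table; one row per sample
CREATE TABLE dbo.SchedulerBaseline (
    capture_time datetime NOT NULL DEFAULT GETDATE(),
    avg_runnable_tasks float NOT NULL,
    avg_pending_disk_io float NOT NULL
)

-- Sample the schedulers; run this from a SQL Server Agent job every few minutes
INSERT INTO dbo.SchedulerBaseline (avg_runnable_tasks, avg_pending_disk_io)
SELECT AVG(CAST(runnable_tasks_count AS float)),
       AVG(CAST(pending_disk_io_count AS float))
FROM sys.dm_os_schedulers
WHERE scheduler_id < 255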

Wednesday, March 12, 2008

Diagnose Authentication and Authorization issues in IIS 5/6 using AuthDiag 1.0

I was stuck with authorization issues in a web application trying to call COM+ applications in IIS. When the security hops go beyond two levels of access, it becomes much more difficult to find the root cause. Good thing I found this tool from Microsoft called AuthDiag 1.0, which is designed to aid customers in effectively troubleshooting and determining the root cause of authentication and authorization problems. It works with IIS 5.x and 6.0 (sad to say it doesn't work with IIS 4.0, which runs on top of Windows NT 4.0). It analyzes metabase configuration and system-wide policies, warns users of possible points of failure and guides them to resolving the problem. AuthDiag 1.0 also includes a robust monitoring tool called AuthMon, designed to capture a snapshot of the problem as it occurs in real time. AuthMon is specially designed for IIS servers, removing any information not pertinent to the authentication or authorization process.

To know more about this tool, you can download a Microsoft webcast that focuses on how to use this tool

Saturday, March 8, 2008

The importance of an account's SID

I keep highlighting to developers that an account's SID is more important than its other attributes. This is something that any IT professional, developer or DBA should take seriously. Take for instance a Windows account that was created for a user. The user logs in to a workstation, creates his or her documents and saves them in a personal folder. Then the account accidentally gets deleted. Your first instinct will be to recreate the account so that the user can log back in. Logging back in is one thing, but accessing the user's documents is another. You see, even if you recreate the account with the same name and password, the system sees a different SID. The user will not be able to access the documents unless an administrator takes ownership of them and assigns the user as the new owner. Another application of this concept is in databases. Imagine backing up a database and restoring it on a different server. You may be able to restore the database, but the users assigned to access it won't be able to do so. Again, it's a SID issue: the database users still carry the SIDs of the logins on the old server. This is why part of a database disaster recovery plan is to script out these logins together with their SIDs so that you won't have difficulty restoring and accessing those databases when needed
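
To see how this plays out after a restore, SQL Server ships with sp_change_users_login for re-linking these orphaned users. A quick sketch ('AppUser' and 'AppLogin' are placeholders for your own names):

-- Report database users whose SIDs no longer match any server login
EXEC sp_change_users_login @Action = 'Report'

-- Re-link an orphaned user to an existing login
-- ('AppUser' and 'AppLogin' are placeholder names)
EXEC sp_change_users_login @Action = 'Update_One',
     @UserNamePattern = 'AppUser',
     @LoginName = 'AppLogin'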

Cleaning up your Maintenance Plans in SQL Server 2005

Being the lazy guy that I am, I try to find ways to make my life easier. When we finished fixing all the issues in our test server, I recommended that we simply back up the MSDB database and restore it on the production server. This way, I wouldn't have to recreate the jobs on the production server. Well, that's just what happened - I didn't have to recreate the jobs. Unfortunately, a database maintenance plan which I created on the test server got restored to my production server as well. I needed to either modify the maintenance plan to point to the correct server - which is my production server - or delete it and recreate another. I tried the easiest path: deleting the maintenance plan. When I did that, it threw an error similar to the one below

TITLE: Microsoft SQL Server Management Studio
------------------------------
Exception has been thrown by the target of an invocation. (mscorlib)
------------------------------
ADDITIONAL INFORMATION:
An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections.(provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server) (Microsoft SQL Server, Error: 53)

I checked the maintenance plan's connections and saw that it was still pointing to the instance name of my test server. I checked the SQL Server logs on my test server and noticed that my production server's SQL Server service account was trying to connect. Good thing I applied best practice here, or I would have ended up deleting the database maintenance plan on the test server from the production server. So, how do I do it? The workaround is to check the dbo.sysmaintplan_plans table in the MSDB database. The ID column gives us the value we need to delete the corresponding records in the dbo.sysmaintplan_log, dbo.sysmaintplan_subplans and dbo.sysmaintplan_plans tables. Once those records have been deleted, you can refresh your Maintenance Plans folder in SQL Server Management Studio and the maintenance plan disappears. This does not delete the jobs created by the maintenance plan; you still have to delete those jobs manually.
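
Here's a rough sketch of that cleanup, assuming the msdb tables behave as described above (the plan name is a placeholder, and the usual warning about hand-editing msdb applies - check what the SELECT returns before running the deletes):

USE msdb
GO
DECLARE @plan_id uniqueidentifier

-- 'TestServerMaintenancePlan' is a placeholder; verify the name first
SELECT @plan_id = id
FROM dbo.sysmaintplan_plans
WHERE name = N'TestServerMaintenancePlan'

-- Delete the child records before the plan itself
DELETE FROM dbo.sysmaintplan_log WHERE plan_id = @plan_id
DELETE FROM dbo.sysmaintplan_subplans WHERE plan_id = @plan_id
DELETE FROM dbo.sysmaintplan_plans WHERE id = @plan_id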

Sunday, February 24, 2008

Modifying SQL Server 2005 Port Numbers

And so the tests continue... One of the things that I needed to test during the upgrade was database connectivity. We know the standard osql, sqlcmd, isql and Query Analyzer tests from a remote client, where you supply either a Windows or SQL Server account to log in to the database server. Once connected, we are practically sure that database connectivity works fine. But it doesn't stop there. We need to make sure that other applications can connect to the database server as well. This is where the importance of SQL Server port numbers comes into the picture. We usually recommend changing the default port numbers 1433 and 1434 for security reasons, as anybody who knows SQL Server has a high probability of knowing these ports. How do we change the port number that SQL Server uses? Open SQL Server Configuration Manager -> SQL Server 2005 Network Configuration -> Protocols for InstanceName. Open the TCP/IP protocol properties either by right-clicking and selecting Properties or by double-clicking. Go to the IP Addresses tab of the TCP/IP Properties window, scroll all the way down to the IPAll section, change TCP Dynamic Ports to be blank and put your chosen SQL Server port number, say 1234, into the TCP Port box (I just used 1433 in my example for ease of explanation). Make sure you restart your SQL Server service for the changes to take effect.



NOTE: If the value of Listen All on the Protocol tab is yes, the TCP/IP port number for this instance of SQL Server 2005 is the value of the TCP Dynamic Ports item under IPAll. If the value of Listen All is no, the TCP/IP port number for this instance of SQL Server 2005 is the value of the TCP Dynamic Ports item for a specific IP address.
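
Once the service has been restarted, a quick way to confirm the change is to force a TCP connection to the new port from a remote client (ServerName and 1234 below are placeholders for your own server name and chosen port):

sqlcmd -S tcp:ServerName,1234 -E

If sqlcmd connects, applications that specify the new port in their connection strings should be fine as well.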