Access is Denied Error when Crawling SharePoint 2010

When trying to get search working correctly, the full crawl on our SharePoint 2010 server kept coming back with “Access is denied. Check that the Default Content Access Account has access to this content, or add a crawl rule to crawl this content”. This error is due to Microsoft implementing a loopback check to prevent reflection attacks. To disable the loopback check, do the following (a Powershell equivalent is sketched after the steps):

  1. Open Regedit
  2. Navigate to HKLM\SYSTEM\CurrentControlSet\Control\Lsa
  3. Create a new DWORD (32-bit) value named DisableLoopbackCheck and set it to 1
  4. Kick off a crawl
  5. Profit $$$
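
If you’d rather script it, here’s a minimal Powershell sketch of the same change (run it from an elevated session; the key path and value are exactly the ones from the steps above):

# Disable the loopback check by creating the DisableLoopbackCheck DWORD value
New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa" -Name "DisableLoopbackCheck" -Value 1 -PropertyType DWord -Force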

Getting the bloody Sql Server provider to work with Start-Job

I was trying to use Invoke-Sqlcmd in a script block in Powershell to kick off a job, but I kept running into the following error:

“The term ‘Invoke-Sqlcmd’ is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.”

When you use the Start-Job cmdlet to start a job, you have to re-initialize the session with whatever snap-ins or modules the script block references. I had assumed the job would more or less inherit the profile of the session it was started from, which is apparently not the case. It took me a bit to get it all figured out, but you have to pass what you want loaded to Start-Job in the -InitializationScript parameter. Here’s an example:

$scriptBlock = {
	Invoke-Sqlcmd -Database master -Query "SELECT * FROM dbo.TestTable"
}

$init = {
	Add-PSSnapin SqlServerCmdletSnapin100
	Add-PSSnapin SqlServerProviderSnapin100
}

cls
# note: the SQLSERVER: drive only exists if the provider snap-in is loaded in the current session as well
sl SQLSERVER:\SQL\localhost\DEFAULT\Databases\master
$job = Start-Job -InitializationScript $init -ScriptBlock $scriptBlock

# wait for the job to finish before collecting its output
Wait-Job -Job $job | Out-Null
Receive-Job -Job $job

Creating your Profile in Powershell

Here’s something I don’t do often enough to commit it to memory: creating your profile. If you have anything you want to run when you fire up Powershell (Sql Server & SQLPSX modules, etc.), you first need to create your profile. You can do so with the following command:

New-Item -path $profile -type file -force

You can then edit your profile by typing in:

notepad $PROFILE
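
For example, a profile that loads the Sql Server snap-ins used elsewhere in these posts might look something like this (adjust to whatever snap-ins or modules you actually use):

# Sample profile contents: load the Sql Server cmdlets and provider at startup
Add-PSSnapin SqlServerCmdletSnapin100
Add-PSSnapin SqlServerProviderSnapin100
# import SQLPSX or any other modules you want available here as well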

Moving Large Amounts of Data

We have this database, which is about 300 GB. A substantial portion of this data is logging data, and a substantial portion of the logging data is simply heartbeat records. Why on earth the home page logs to the database is beyond me, but moving on…

There are roughly 280 million records in one table in particular that need to be moved to an archiving database on the same server. Usually, my approach is to BCP out the data that I want to keep and BULK INSERT it back into a new table, but I decided to try a new approach based upon Henk van der Valk’s article here.

This approach uses multiple parallel streams in SSIS to SELECT the data into the new table. The streams are split up by applying the MODULO operator (%) to an IDENTITY field in the table, so each stream handles the rows with one particular remainder. Here’s an example:

CREATE TABLE dbo.TestModulo(
	ID		INT IDENTITY(1,1),
	VAL		VARCHAR(10)
)
GO

INSERT INTO dbo.TestModulo(Val)
SELECT 'Test'
GO 100

SELECT TOP 5 * FROM dbo.TestModulo
WHERE (ID%4)=3
OPTION (MAXDOP 1)

SELECT TOP 5 * FROM dbo.TestModulo
WHERE (ID%4)=2
OPTION (MAXDOP 1)

SELECT TOP 5 * FROM dbo.TestModulo
WHERE (ID%4)=1
OPTION (MAXDOP 1)

SELECT TOP 5 * FROM dbo.TestModulo
WHERE (ID%4)=0
OPTION (MAXDOP 1)

As you can see in the image below, this partitions the table into sets based upon the remainder of the ID field divided by 4 (the modulo divisor), so each query returns a disjoint slice of the table.

[image: results of the four modulo queries]

So, in SSIS I created a Data Flow Task. In that Data Flow Task I created four OLEDB Sources, and in each source I put one of the Sql SELECT statements that use the MODULO operation to divide up the sets of data. I then dragged the output of all four to a Union All transformation, and the output of the Union All goes to the Sql Server Destination. Of note, you can only use the Sql Server Destination if the destination table is on the local server; if not, you must use an OLEDB Destination.

[image: the SSIS data flow, with four OLEDB Sources feeding a Union All and then the Sql Server Destination]

The result of this was moving 227 million rows in about an hour.  Not bad!  When doing this via BCP/BULK INSERT, the BCP out alone took over an hour, and the import took another 45 minutes on top of that.  This essentially cut down the loading time by about half!

Another thing I did to speed up the transfer was to increase the network packet size on the connections from the Sql Server default of 4096 bytes to 32767 (also mentioned in Henk van der Valk’s excellent post).
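
For reference, here’s a rough sketch of what the bigger packet size looks like in a plain connection string (the server and database names below are placeholders; in SSIS the same setting lives on the connection manager itself):

# Illustrative only: an ADO.NET connection string requesting a 32767-byte network packet size
$connectionString = "Data Source=MyServer;Initial Catalog=MyArchiveDb;Integrated Security=SSPI;Packet Size=32767"
$conn = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$conn.Open()
$conn.PacketSize   # reports the packet size actually negotiated with the server
$conn.Close()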

Testing Database Connectivity

Here’s a tip I’ve been using for many years now. To check connectivity from a disparate server to your database server, simply right-click on the desktop, select New, then Text Document. Name the file test.udl and hit Enter. Now double-click the newly created .udl file and a dialog asking for connection information will appear. Fill in the details and hit Test Connection. It’s a quick and dirty way to track down connectivity problems.

[image: the Data Link Properties dialog]
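
If you don’t feel like clicking through the desktop, the same thing can be scripted; the sketch below just creates an empty test.udl on the desktop and opens it, which should bring up the same Data Link Properties dialog:

# Create an empty UDL file on the desktop and open it to get the connection dialog
$udl = Join-Path ([Environment]::GetFolderPath('Desktop')) 'test.udl'
New-Item -Path $udl -ItemType File -Force | Out-Null
Invoke-Item $udl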

Drop Users & Logins from Sql Server with Powershell

With our trust being severed, we now have a bunch of Windows logins from the untrusted domain that need to be removed. This script will iterate through all the databases and drop every user specified in the array, then drop the logins from the server. Use at your own risk!

[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
$Error.Clear()
cls

$servers = Get-Content c:\Test\Servers.txt
$logins = @("aaLogin","TestLogin")

foreach($server in $servers)
{
	$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $server
	
	#drop database users
	foreach($database in $srv.Databases)
	{
		foreach($login in $logins)
		{
			if($database.Users.Contains($login))
			{
				$database.Users[$login].Drop();
			}
		}
	}
	
	#drop server logins
	foreach($login in $logins)
	{
		if ($srv.Logins.Contains($login)) 
		{ 
			$srv.Logins[$login].Drop(); 
		}
	}
}

Add Login to Servers & Databases with Powershell

I have trust issues. No really, the trust between our domains was severed this morning.

Here is a script that will add a login to each server, create users in all the user databases on those servers (in the db_datareader & db_datawriter roles, but that’s configurable), and then grant VIEW DEFINITION on all the procs, views, UDFs, user-defined aggregates and database triggers.

[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
$Error.Clear()
cls

$servers = Get-Content c:\Test\Servers.txt
$roles = @("db_datareader","db_datawriter")

[string]$LoginName = "aaLogin"; 
[string]$Password = "BenderIsGreat!&";

foreach($server in $servers)
{

	$srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $server

	#careful, if the login exists this will drop it
	if ($srv.Logins.Contains($LoginName)) 
	{ 
		$srv.Logins[$LoginName].Drop(); 
	}

	$login = New-Object ('Microsoft.SqlServer.Management.Smo.Login') $srv, $LoginName 
	$login.LoginType = [Microsoft.SqlServer.Management.Smo.LoginType]::SqlLogin; 
	$login.Create($Password);

	# refresh login so it contains the SID
	$srv.Logins.Refresh(); 
	$login = $srv.Logins[$LoginName]

	foreach($database in $srv.Databases)
	{
		if($database.IsSystemObject -eq $false -and $database.Name -ne "LiteSpeedLocal")
		{
			#careful, this will drop the user from the db if it exists
			if($database.Users.Contains($LoginName))
			{
				$database.Users[$LoginName].Drop();
			}
		
			$user = New-Object('Microsoft.SqlServer.Management.Smo.User') $database, $login.Name
			$user.Login = $login.Name
			$user.Create();
			
			#add the new user to each of the configured database roles
			foreach($roleName in $roles)
			{
				$role = $database.Roles[$roleName]
				$role.AddMember($user.Name)
			}
			
			foreach($proc in $database.StoredProcedures | where{$_.IsSystemObject -eq $false})
			{
				$perm = New-Object('Microsoft.SqlServer.Management.Smo.ObjectPermissionSet')
				$perm.ViewDefinition = $true;
				$proc.grant($perm, $LoginName);
			}
			
			foreach($view in $database.Views | where{$_.IsSystemObject -eq $false})
			{
				$perm = New-Object('Microsoft.SqlServer.Management.Smo.ObjectPermissionSet')
				$perm.ViewDefinition = $true;
				$view.grant($perm, $LoginName);
			}
			
			foreach($udf in $database.UserDefinedFunctions | where{$_.IsSystemObject -eq $false})
			{
				$perm = New-Object('Microsoft.SqlServer.Management.Smo.ObjectPermissionSet')
				$perm.ViewDefinition = $true;
				$udf.grant($perm, $LoginName);
			}
			
			foreach($uda in $database.UserDefinedAggregates | where{$_.IsSystemObject -eq $false})
			{
				$perm = New-Object('Microsoft.SqlServer.Management.Smo.ObjectPermissionSet')
				$perm.ViewDefinition = $true;
				$uda.grant($perm, $LoginName);
			}
			
			foreach($trigger in $database.Triggers | where{$_.IsSystemObject -eq $false})
			{
				$perm = New-Object('Microsoft.SqlServer.Management.Smo.ObjectPermissionSet')
				$perm.ViewDefinition = $true;
				$trigger.grant($perm, $LoginName);
			}
			
			
		}
	}

}

Find Servers with large sysjobhistory tables

This script will return a list of servers where the rowcount of msdb.dbo.sysjobhistory is >= 500 (the threshold is configurable). Much like the previous post, it’s used for finding servers that have no history cleanup jobs. You will need SQLPSX installed to run this.

cls
$servers = Get-Content c:\Test\Servers.txt
$countThreshold = 500

foreach($server in $servers)
{
	$rows = Get-SqlData -dbname msdb -sqlserver $server -qry "SELECT COUNT(*) AS HistCount FROM dbo.sysjobhistory" #| Format-Table -AutoSize
	$nRowCount = [int]$rows[0]
	if($nRowCount -ge $countThreshold)
	{
		Write-Host "Server $server has a rowcount of $nRowCount in sysjobhistory.  There is probably no history cleanup job on this server."
	}
}
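
When a server does turn up with a bloated sysjobhistory, one option (just a sketch, not part of the script above, and assuming the Sql Server snap-ins from the earlier posts are loaded so Invoke-Sqlcmd is available) is to purge history older than some cutoff with msdb’s sp_purge_jobhistory; the 30-day cutoff here is only an example:

# Hypothetical cleanup: purge job history older than 30 days on a single server
$query = "DECLARE @cutoff DATETIME = DATEADD(DAY, -30, GETDATE()); EXEC dbo.sp_purge_jobhistory @oldest_date = @cutoff;"
Invoke-Sqlcmd -ServerInstance $server -Database msdb -Query $query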

Find Servers with No History Cleanup

Quick Powershell script to show servers that don’t have a history cleanup job. This script looks for jobs with ‘clean’ and ‘hist’ in the name, but without ‘distribution’ (to avoid false positives from the ‘Distribution clean up: distribution’ job, if your server is a distributor). I’m sure there is a better way of checking for this (like checking the row count of the msdb.dbo.sysjobhistory table), but this should work just as well, as long as your jobs are aptly named.

You’ll need SQLPSX installed to run this.  Usual warnings, run at your own risk.

cls
$servers = Get-Content c:\Test\Servers.txt

foreach($server in $servers)
{
	$hashist = $false
	$jobs = Get-AgentJob $server
	foreach($job in $jobs)
	{
		#Write-Host $job.Name
		if($job.Name -like "*clean*" -and $job.Name -like "*hist*" -and $job.Name -notlike "*distribution*")
		{
			$hashist = $true;
			break;
		}
	}
	if($hashist -eq $false)
	{
		Write-Host "Server $server does not have a history clean up job"
	}
}

Powershell Script to check for low drive space

Just what it says. This will email you when a drive’s free space falls below the desired threshold.


function sendMail($srv, $DriveName, $free) {
	$subject = "Low Drive Space on $srv"
	$body = "<p style='font-family: calibri;'>The <B>$DriveName</B> on server <B>$srv</B> has only <B>$free % </B> free.<BR/>
	Please try to free up some space so it is above $percentThreshold% free space.
	</p>"
	
	# Init Mail address objects
	$emailFrom = New-Object system.net.Mail.MailAddress "donotreply@yourcompany.com" , "Low Drive Space on $srv"
	$emailTo = New-Object system.net.Mail.MailAddress $alertEmail , "Low Drive Space"
	$smtp = new-object Net.Mail.SmtpClient($mailServer)
	$MailMessage = new-object Net.Mail.MailMessage($emailFrom, $emailTo, $subject, $body)
	$MailMessage.IsBodyHtml = $true
	
	$smtp.Send($MailMessage)
}

function CheckDriveSpace($servername)
{
	$drives = gwmi win32_logicaldisk -filter "drivetype=3" -ComputerName ($servername) -ErrorAction SilentlyContinue
	foreach($drive in $drives)
	{
		# skip drives that report zero size to avoid a divide-by-zero
		if([Int64]$drive.Size -eq 0) { continue }
		
		$PercentFree = [int]([Int64]$drive.FreeSpace / [Int64]($drive.Size) * 100) 
		
		#SaveDriveinfo $ServerName $Drive.DeviceID $Drive.FreeSpace $drive.Size $PercentFree
		
		if($PercentFree -lt $percentThreshold)
		{	
			sendMail $servername $drive.DeviceID $PercentFree
		}
	}
}

function Main()
{
	try
	{
		$servers = Get-Content("C:\Servers.txt")
		foreach($server in $servers)
		{
			CheckDriveSpace $server
		}
		return 0;
	}
	catch
	{
		# Handle the error
		$err = $_.Exception
		write-host $err.Message
		while( $err.InnerException ) 
		{
			$err = $err.InnerException
			write-output $err.Message
		}
		return 1;
	}
}


Clear-Host
$percentThreshold = 15
$alertEmail = "dba@yourcompany.com"
$mailServer = "mail.yourcompany.com"
Main