PowerShell script path limitations

Having discovered that SQL Agent PowerShell steps use PowerShell 2.0, I found you can use the latest version of PowerShell on your server by invoking PowerShell and passing it the path to a script file:

powershell.exe -file "\\servername\path\filename.ps1"

I thought it would make the job easier to port to a new server one day if, instead of \\servername, I used \\, but PowerShell said that the current execution policy prevented running remote unsigned scripts. (Our execution policy was RemoteSigned.)
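For what it's worth, powershell.exe can override the effective policy for a single invocation with the -ExecutionPolicy switch. Whether that's acceptable in your environment is a judgement call, and the path below is the same placeholder as above:

```
rem Run one script under a relaxed policy for this invocation only
powershell.exe -ExecutionPolicy Bypass -File "\\servername\path\filename.ps1"

rem Check what the effective policy and scopes currently are
powershell.exe -Command "Get-ExecutionPolicy -List"
```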

I don’t understand why it treats the names this way. According to my reading, you can spoof \\machinename simply by adding it to the hosts file and redirecting it elsewhere. But although you can do that, it doesn’t work here, so I would have thought PowerShell would treat such a script as local.

I’m not sure what’s going on here but it’s one to remember.



SQL 2014 upgrade experience

I upgraded our servers to SQL 2014 and found it a pretty smooth experience. Thomas LaRock has a nice set of advice here – thanks Tom – with explanations of why and links to further reading.

We started from a mix of mostly SQL 2012 but a few stragglers on SQL 2008R2 that, for various reasons, I hadn’t got round to. I kept the compatibility levels as they were for a while to bed things in. There were no problems at all.

So then I changed the compatibility levels to 120 (SQL 2014) which invokes the new cardinality estimator. As far as I can see, performance is at least as good as before but two query plans did go bad. One needed a little refactoring but the other was very stubborn. After running out of other ideas I used the query hint OPTION (QUERYTRACEON 9481) to force the old cardinality model and that fixed it. I don’t like query hints but in this case I’ll make an exception.
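For reference, the hint goes at the end of the statement. The query below is a made-up illustration, not the real one:

```
-- Hypothetical query; OPTION (QUERYTRACEON 9481) forces the
-- pre-2014 (legacy) cardinality estimator for this statement only
SELECT o.CustomerID, COUNT(*) AS OrderCount
FROM dbo.Orders AS o
GROUP BY o.CustomerID
OPTION (QUERYTRACEON 9481);
```

One thing to watch, as I understand it: QUERYTRACEON normally requires sysadmin-level permissions, which can matter if the query runs under an application login.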

I also notice a distinct improvement in the missing index recommendations in the Performance Dashboard report. For several years it’s been recommending two or three indexes that we already have. Those are gone now and that’s a good sign in my book.


SQL Agent CmdExec gotcha

I was writing a new SQL Agent job for somebody else so I thought I should comment it a bit more thoroughly than I usually do. There’s an Operating System (CmdExec) step that calls sqlcmd to run a script file:

sqlcmd -S localhost -E -d master -i "\\\joblogs\somescript.sql" -b

This pattern works fine elsewhere. Here though, I added a comment just to point out something:

rem A comment
sqlcmd -S localhost -E -d master -i "\\\joblogs\somescript.sql" -b

The job ran successfully but it was suspiciously fast. It turns out that’s because it didn’t do anything. Apparently SQL Agent interprets this differently. I know the DOS shell language is a bit arcane sometimes, but rem means ‘ignore the rest of this line’ and execution carries on with the next one. This works in a .bat file. I just tested it to make sure I wasn’t going mad.

Time to dig deeper. Books Online (https://msdn.microsoft.com/en-GB/library/ms190264.aspx) says:

This topic describes how to create and define a Microsoft SQL Server Agent job step in SQL Server 2016 that uses an executable program or operating system command

That ‘an’ is a subtle clue. I had always assumed this worked like a batch file, but it seems only the first command gets executed. In my case the first line is commented out, which effectively means ‘do nothing’. I tested with multiple statements and again only the first one executed.

So don’t confuse it with batch files. Put one command per step and – a bit of lateral thinking here – put your comments in the line(s) underneath:

sqlcmd -S localhost -E -d master -i "\\\joblogs\somescript.sql" -b
rem A comment
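Alternatively, if a step genuinely needs several commands, you could put them all in a batch file and have the CmdExec step run just that one file. I haven't tested this particular job that way, and the batch file name is made up:

```
rem Contents of RunSomeScript.bat - inside a real batch file,
rem rem behaves as expected and every line runs
rem A comment
sqlcmd -S localhost -E -d master -i "\\\joblogs\somescript.sql" -b
```

The CmdExec step then contains the single command RunSomeScript.bat (with its full path).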

SQL download file names

When you go to download SQL Server cumulative updates the options can be pretty baffling. Which file(s) do you need? Here’s some help.

For example, if you go to SQL Server 2014 CU1 you are offered these files:

  • Update_SQL2014_RTM_CU1_2931693_12_0_2342_x64
  • 2014_RTM_AOrcleDes_CU1_2931693_12_0_2342_x64
  • 2014_RTM_AOrcleSer_CU1_2931693_12_0_2342_x64

There’s no explanation on the page but presumably it’s explained somewhere. It is, but it’s not easy to find. I asked on the MSDN forum and was pointed to this page, which explains it quite clearly. The short version: the first file is the update for SQL Server itself; the AOrcleDes and AOrcleSer files are for the Change Data Capture Designer and Service for Oracle by Attunity, which you only need if you use those components.

PowerShell in SQL Agent: not what I’d hoped for

Like many DBAs, I suspect, I just haven’t found much need for PowerShell so far. There were lots of articles about how you could back up a database, run a query, get a list of databases and so on, but I wasn’t seeing anything I couldn’t already do just fine in T-SQL. Still, with many Microsoft server products PowerShell is taking over as the main management tool, with the GUI serving as just a thin skin over the PowerShell API. Could this be coming our way?

So I’ve been meaning to give it a try, and I just had a request to set up a daily FTP upload of some data to a partner organisation. I don’t know the command-line tools for FTP, so this looked like a good excuse to learn a bit of PowerShell. After a while I had a script that worked nicely – until I put it into a PowerShell step in a SQL Agent job. Then it failed, complaining that I was specifying non-existent arguments for a cmdlet. WTF?
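My script isn't reproduced here, but the core of an FTP upload in PowerShell can be done through System.Net.WebClient. A minimal sketch, with the server name, credentials and paths all made up:

```
# Hypothetical details throughout - replace with your own
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential('ftpuser', 'ftppassword')
$client.UploadFile('ftp://ftp.partner.example/upload/data.csv', 'C:\Exports\data.csv')
```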

After a lot of digging, I discovered that SQL Agent shells your script out to PowerShell 2.0. This is SQL Server 2014, and PowerShell 2.0 was released back in 2009. We’re up to PowerShell 5.0 now.
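An easy way to see this for yourself is a PowerShell job step that does nothing but report its version:

```
# In a SQL Agent PowerShell step this reports the version the step
# actually runs under - 2.0 on my SQL 2014 server
$PSVersionTable.PSVersion
```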

I imagine there are sound technical reasons for this but it’s a bit disappointing after all the hype.

Buffer pool extension gotcha

Don’t get caught out by this one.

We just got a big new server for our upgrade to SQL 2014, and I specially added a pair of SSDs to the spec so we would have a super-fast RAID 1 partition for the tempdb files and so we could use the new buffer pool extension feature. The server vendor insisted we had to use their approved, server-grade SSDs to be properly supported. These disks are far from cheap so we got the 120GB drives.

All was going well until I tried to enable the buffer pool extension:

Msg 868, Level 16, State 1, Line 2
Buffer pool extension size must be larger than the current memory allocation threshold 117894 MB. Buffer pool extension is not enabled.

I didn’t see that in any of the material announcing this feature. It’s in the (really) small print in Books Online, so technically we were warned: the extension file has to be bigger than the buffer pool’s memory allocation threshold, and a 120GB drive formats out to less than the 117894 MB in the error message. We have plenty of RAM, so we would need to buy bigger SSDs to use the buffer pool extension.
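For the record, this is the shape of the command that produced the error; the file path and size here are placeholders:

```
-- SIZE must exceed the threshold quoted in the error message,
-- which is why our 120GB SSDs fell short
ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION ON
    (FILENAME = 'S:\BPE\bpe.bpe', SIZE = 128 GB);
```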

Still, at least anything that spills to disk in tempdb is going to go way faster, which is better than nothing.

Missing SQL PerfMon counters

A server has just lost all the SQL Server PerfMon counters. They’ve all disappeared into thin air. It plays havoc with our monitoring tool! We’ve had a couple of instances of this lately. I haven’t found the cause and it’s starting to get on my nerves. Luckily the answer is fairly simple once you know. NB: This involves restarting the SQL Server service so you might want to do it out of hours.

The steps are:

  1. Run cmd.exe as Administrator.
  2. Go to the Binn directory. On my SQL 2014 instance this was cd C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Binn.
  3. unlodctr MSSQLSERVER (or unlodctr MSSQLSERVER$INSTANCENAME for a named instance).
  4. lodctr perf-MSSQLSERVERsqlctr.ini (or lodctr perf-MSSQLSERVER$INSTANCENAMEsqlctr.ini for a named instance).
  5. Restart the Remote Registry service if running.
  6. Restart the Performance Logs & Alerts service if running.
  7. Restart the SQL Server service.
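The command-line part of the steps above collapses to a few lines from an elevated prompt (default instance shown; the Binn path is from my server and may differ on yours):

```
rem Unload and reload the SQL Server PerfMon counter definitions
cd /d "C:\Program Files\Microsoft SQL Server\MSSQL12.MSSQLSERVER\MSSQL\Binn"
unlodctr MSSQLSERVER
lodctr perf-MSSQLSERVERsqlctr.ini

rem Then restart Remote Registry, Performance Logs & Alerts
rem and the SQL Server service as in steps 5 to 7
```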

There are some other strange things that can happen in this area: the Remote Registry service stopped, or all the PerfMon counters – or just the SQL counters – disabled in the Registry.