Reading from C:\ProgramData without requiring UAC elevation

When trying to read some user settings from C:\ProgramData in my .NET app, I was getting an Access Denied exception, even though I was only attempting to read the configuration file, not write anything:

System.UnauthorizedAccessException: Access to the path ‘C:\ProgramData\YourApp\1.0.0.0\settings.xml’ is denied.

Even though I was only reading the file, and not writing anything, it still wanted elevation before it would let me read it. It turns out that opening a FileStream with just FileMode.Open requests read/write access by default, so I need to signify my intent not to write anything when I open the stream. This code generated the exception (the “Using” statement is what actually threw it):

Using f As New FileStream(MySettingsFilePath, FileMode.Open)
    Dim formatter As New Formatters.Binary.BinaryFormatter
    MySettings = formatter.Deserialize(f)
    f.Close()
End Using

However, by changing the “FileStream” to a “StreamReader”, I signify my intent to read and not write – a StreamReader opens its underlying stream with FileAccess.Read – so the code runs without an issue (there are two changes; passing FileAccess.Read as a third argument to the FileStream constructor would work just as well):

Using f As New StreamReader(MySettingsFilePath)
    Dim formatter As New Formatters.Binary.BinaryFormatter
    MySettings = formatter.Deserialize(f.BaseStream)
    f.Close()
End Using
MORAL – Elevation isn’t required to read common application settings, only to write them, but you need to be clear about what you intend to do!
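The same principle applies outside .NET. Here’s a minimal Python sketch of declaring read-only intent when opening a file – the file name and location are invented for the demo:

```python
import os
import tempfile

# A stand-in settings file (hypothetical path, just for this demo).
path = os.path.join(tempfile.gettempdir(), "demo_settings.xml")
with open(path, "w") as f:
    f.write("<settings/>")

# os.O_RDONLY states up front that we only intend to read, much like
# FileAccess.Read in .NET - the OS only has to grant read permission.
fd = os.open(path, os.O_RDONLY)
data = os.read(fd, 1024).decode()
os.close(fd)

os.remove(path)  # clean up the demo file
```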

Periodic timeouts with local WCF endpoint

I have two applications – a Windows service and a client WinForm – that run on the same box, and the client needs to check on the status of the service every few seconds. It worked well most of the time, but every fourth or fifth call would time out – I’d receive a System.TimeoutException, no matter how long the timeout was actually set for. It didn’t make sense – the two apps are on the same box, it happens with both TCP and named pipe endpoints, and it can happen even when the service isn’t busy.

It turns out that WCF endpoints only handle a very limited number of simultaneous connections, and though I was creating a new ChannelFactory each time, I wasn’t closing the channel when I was finished with it. I assumed it would automatically close when it passed out of scope, but no dice. Since the channels remained open, it didn’t take long to fill them all up, and then the app would appear to be unresponsive until my channels started to time out. So here’s what I ended up doing – the critical line is the last one:

Dim tcpBinding As New NetTcpBinding
Dim pipeFactory As ChannelFactory(Of WCF_Class.IServiceRemoting) = New  _
    ChannelFactory(Of WCF_Class.IServiceRemoting)(tcpBinding, "net.tcp://localhost:4079")
Dim ServiceWCFConnection As WCF_Class.IServiceRemoting = pipeFactory.CreateChannel
MessageBox.Show(ServiceWCFConnection.Responsive)
pipeFactory.Close() ' This is what's important!

That’s it – make sure you close your Channel when you’re done with it! If you leave it open, you quickly hit the server’s connection limit and new requests will fail. Closing the channel when you’re done frees it up, instead of letting it time out and close on its own.
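The failure mode is easy to reproduce with any fixed-size pool of handles. A minimal Python sketch – the ChannelPool class and channel names are invented for illustration, not part of WCF:

```python
import queue

class ChannelPool:
    """A tiny stand-in for a server's limited pool of simultaneous channels."""

    def __init__(self, size):
        self._free = queue.Queue()
        for i in range(size):
            self._free.put(f"channel-{i}")

    def open(self, timeout=0.1):
        # Blocks until a channel is free; raises queue.Empty (our "timeout")
        # if every channel is still held by someone who never closed it.
        return self._free.get(timeout=timeout)

    def close(self, channel):
        # Returning the channel makes it available to the next caller.
        self._free.put(channel)

pool = ChannelPool(2)
a = pool.open()
b = pool.open()          # pool is now exhausted
pool.close(a)            # release one handle...
c = pool.open()          # ...and the next open() succeeds immediately
try:
    pool.open()          # both remaining handles are leaked -> times out
    timed_out = False
except queue.Empty:
    timed_out = True
```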

Removing an arbitrary number of spaces from a string in SQL Server

When I was concatenating some fields to create an address, I ended up with unnecessary spaces in the result. For example, I had this:

SELECT StreetNumber + ' ' + Direction + ' ' + StreetName + ' ' + StreetType as Address

However, when an address didn’t have a direction, I ended up with a double-space in the middle of my address, and I wanted a way to clean it up. Enter the code I found at http://www.itjungle.com/fhg/fhg101106-story02.html:

SELECT name,
       REPLACE(REPLACE(REPLACE(name,' ','<>'),'><',''),'<>',' ')
  FROM SomeTable

This shortens any run of spaces in the string into a single space – sneaky! It works in any language that supports a function like REPLACE, which scans one string for instances of a second string, and swaps them out for something else.
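For instance, here’s a rough Python equivalent of the same three-replace trick (the function name is mine):

```python
def collapse_spaces(text):
    # 1. Turn every space into the two-character token "<>".
    # 2. Deleting every "><" collapses each run of tokens down to one "<>".
    # 3. Turn the surviving "<>" tokens back into single spaces.
    return text.replace(" ", "<>").replace("><", "").replace("<>", " ")
```

Note that the trick assumes the string contains no literal “<” or “>” characters, since those would collide with the placeholder tokens.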

Visual Studio solutions don’t load when you double-click on them

When I installed Windows 7, double-clicking on .SLN files to load them in Visual Studio stopped working – I would get the hourglass for a few moments, and then nothing. It turns out that I had set Visual Studio to always run as an Administrator, but when you double-click a .SLN file, you’re not actually running Visual Studio – you’re running the “Visual Studio Version Selector”, even if you only have one version of Visual Studio installed.

To resolve the problem, you’ll need to set the Version Selector to also run as an Administrator – to do this, find the Version Selector EXE:

C:\Program Files\Common Files\microsoft shared\MSEnv\VSLauncher.EXE (or “Program Files (x86)” if you have an x64 installation)

Right-click on it, go to the “Compatibility” tab, and check the box that says “Run this program as an administrator”. Since I have multiple users on my laptop that I use for development, I had to click “Change Settings for all users” before checking the “Run as admin” box.
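If you’d rather script the change, the same “run as administrator” flag is just a value under the AppCompatFlags registry key. A sketch as a .reg file – this assumes the x86 path above; use the HKEY_LOCAL_MACHINE hive instead of HKEY_CURRENT_USER to apply it for all users:

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers]
"C:\\Program Files (x86)\\Common Files\\microsoft shared\\MSEnv\\VSLauncher.EXE"="RUNASADMIN"
```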

Voila! The VS Version Selector will now properly load the solution files when you double-click on them – enjoy!

Migrate database indexes to a new file group

I recently had to mass-migrate all the indexes from a database to a new file group since we’d added some additional storage to our database server. I found this article at SQL Server Central (unfortunately, registration required, so I’ve included a copy of the original script in the download at the end). While it worked okay, there were some things I didn’t like about it:

  • Assumed 90% index fill rate
  • “Moved” indexes were all created as non-unique, regardless of the original
  • A failure during index creation left you without an index (drop and then create, with no rollback)
  • The table was un-indexed during the move (index dropped and then created)
  • The script re-created indexes without any “Included” columns, even if the original index had them

To address these limitations, I rebuilt the process using that script as a starting point. The new script:

  • Uses 90% fill rate by default, but if the original index had a different rate specified, it will use that
  • Re-creates indexes as unique if the source index was unique
  • Rollback problem resolved – new index is created with different name, old index is dropped, and then new index is renamed, all in a TRY-CATCH block
  • Since the new index is created and then the old one dropped, table indexing remains “online” during the move
  • Migrates “Included” columns in index
  • Updated the script to use SYS views (breaks compatibility with SQL 2000, since SYS is 2005/2008/beyond only)

I welcome any feedback on the script, and would love to know if you see any improvements that should be made.
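To illustrate the rename-based rollback safety, here’s a hand-written sketch of the pattern for a single index – the table, column, index, and filegroup names are hypothetical, and the real script generates the equivalent statements dynamically for every index:

```sql
BEGIN TRY
    -- Build the replacement index on the new filegroup first, under a
    -- temporary name, so the original index stays available throughout.
    CREATE UNIQUE NONCLUSTERED INDEX IX_Orders_CustomerId_new
        ON dbo.Orders (CustomerId)
        INCLUDE (OrderDate)
        WITH (FILLFACTOR = 90)
        ON [NewFileGroup];

    -- Only once the new index exists do we drop the old one...
    DROP INDEX IX_Orders_CustomerId ON dbo.Orders;

    -- ...and take over its name.
    EXEC sp_rename N'dbo.Orders.IX_Orders_CustomerId_new',
                   N'IX_Orders_CustomerId', N'INDEX';
END TRY
BEGIN CATCH
    -- If creation fails, the original index is untouched - nothing to roll back.
    PRINT ERROR_MESSAGE();
END CATCH;
```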

Download .SQL scripts (contains both Original and Modified scripts)