Fix: Internet Explorer 9 Freezes After Resuming from Sleep

IE started freezing on my Win8 x64 PC whenever I resumed the machine from sleep.  The GUI would load completely, but IE would freeze as soon as I tried to load a web page.  It would lock up for about 60 seconds and then come back like nothing was wrong.

  1. Try disabling all add-ons:  This didn’t work for me, but it is always the first step for troubleshooting IE.
  2. If that doesn’t work, try resetting IE settings:  This will require you to restart the computer.  It worked like a charm for me.
  3. If that doesn’t work, check the error logs.  Reproduce the error and make note of the exact time.  Go to start > run and type eventvwr.msc.  Run through the different error lists and look for errors which occurred at the same time as IE froze.  If you can find a suspicious error, try searching for that.
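Step 3 can also be scripted.  Here is a minimal sketch using Get-WinEvent; the timestamp is a placeholder for the freeze time you noted, and the two-minute window is an arbitrary choice:

```powershell
# Placeholder: replace with the time you noted when IE froze.
$freezeTime = Get-Date "2013-01-15 09:30"
$window = New-TimeSpan -Minutes 2

# Pull errors (Level 2) from the System and Application logs
# that were recorded within two minutes of the freeze.
foreach ($log in "System", "Application") {
    Get-WinEvent -FilterHashtable @{
        LogName   = $log
        Level     = 2
        StartTime = $freezeTime - $window
        EndTime   = $freezeTime + $window
    } -ErrorAction SilentlyContinue |
        Format-Table TimeCreated, ProviderName, Id, Message -AutoSize
}
```

Anything that shows up in that window is a candidate to search for.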

How to automatically update TFS workitems from the command line

A lot of my life is lived according to bugs filed on a Team Foundation Server somewhere.  One of my particular projects of late has been requiring weekly updates to a number of TFS workitems.  I guess they want to make sure we’re still paying attention to them.

I set out to find a solution where I could automatically bulk update my work items at the click of a button.  Stretch goal: I wanted to have this happen automatically without any interaction.

Step 1: Download the TFS Power Tools (make sure you get the version which matches your copy of Visual Studio).

Command line tool documentation is located at: C:\Program Files (x86)\Microsoft Team Foundation Server 2012 Power Tools\Help\TFPTCommandLineTool.mht

NOTE: I will write this assuming you are using VS 2012.  The paths will be different for other versions.

Step 2: Make sure your connection to TFS is configured.

cd "C:\Program Files (x86)\Microsoft Team Foundation Server 2012 Power Tools"
.\TFPT.exe connections

Verify that the correct TFS server is listed.

If the correct version of TFS is not configured:

  1. Open Visual Studio (2012)
  2. Click Team > Connect To Team Foundation Server
  3. Click Servers
  4. Click Add
  5. Configure the server

Step 3:

Create a query for the items you want to update.

The scheduled script will automatically update anything returned by a particular query.  You could just as easily specify work item IDs on the command line instead of using a query.  I decided against that because I wanted the flexibility to change which work items get edited without having to change my code.

  1. Navigate to your TFS server in the browser (you could do this from Visual Studio, too)
  2. Click New > Query
  3. Build the query you want
  4. Save the query and make note of the location.
    In this case: Shiproom/My Queries/WeeklyAutoUpdate

NOTE: the last clause of the query should read Changed Date < @Today - 6 (less-than, not equals).  I wrote it that way so that any work item I’ve updated within the last week isn’t touched by this query.
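For reference, the query described above comes out to something like the following in WIQL (the assigned-to clause is my guess at a sensible filter; adapt it to whatever you actually need):

```sql
SELECT [System.Id], [System.Title]
FROM WorkItems
WHERE [System.AssignedTo] = @Me
  AND [System.ChangedDate] < @Today - 6
```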

Step 4:

Verify that you are getting the correct work item IDs on the command line:  (replace http://tfsserver:8080/tfs with the server’s URL and the query name with your full query path)

PS C:\...> .\TFPT.exe query /collection:http://tfsserver:8080/tfs /format:id "Shiproom/My Queries/WeeklyAutoUpdate"
PS C:\...>

Step 5:

Try editing an item through the command line.

Replace 942 with one of your work item IDs and once again replace http://tfsserver:8080/tfs with your server’s URL.

PS ...> .\TFPT.exe workitem /collection:http://tfsserver:8080/tfs /update 942 /fields:"History=."
Work item 942 updated.

Check to make sure your work item was updated.  It should now show a new entry in the history which merely contains a period.

Step 6:

Put it all together into a nice PowerShell script

$SERVER = "http://server:port/path" #Full URL to the server
$QUERY = "PATH/My Queries/QueryName" #Query of items to update
$UPDATECMD = '/fields:"History=."' #Update you want to run against them
$TFSPTPATH = "C:\Program Files (x86)\Microsoft Team Foundation Server 2012 Power Tools"

Set-Location $TFSPTPATH

$idlist = .\TFPT.exe query /collection:$SERVER /format:id $QUERY

#could avoid the loop and send all IDs at once
foreach ($id in $idlist) {
    $result = .\TFPT.exe workitem /collection:$SERVER /update $id $UPDATECMD
    Write-Host $result
}

pause  #remove this to run without an end prompt

I chose to nest the work item update in a foreach loop.  This isn’t necessary: TFPT will take a list of work item IDs and perform the same operation on all of them.  I generally feel safer when I wrap something like that in my own loop.  That way one failure doesn’t necessarily kill everything.  Additionally, you get instant progress if there are a large number of work items which must be updated.

The program as it is above has a call to pause on the last line.  If you want to run this as a scheduled task, that line should be commented out.

Step 7:

Add this as a scheduled task so that you don’t have to remember to update your bugs.
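One way to do that is with schtasks.  This is a sketch; the task name, script path, and schedule are all placeholders you’ll want to adjust:

```powershell
# Hypothetical example: run the script (saved here as
# C:\Scripts\Update-WorkItems.ps1) every Monday at 9 AM.
schtasks /Create /TN "TFS Weekly Update" /SC WEEKLY /D MON /ST 09:00 `
    /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Update-WorkItems.ps1"
```

Remember to remove the pause from the script first, or the task will hang waiting for a keypress.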


Thanks to:

This guy: for giving me a lot of the concept.

Securing THE CLOUD

One of the primary purposes for this blog is to talk about information security.  An appropriate first post on the topic is Securing THE CLOUD.

Cloud is one of those words I don’t like to use.  It was drummed up by marketing gurus to make an old idea sound new.  All the cloud means to me is putting your stuff on somebody else’s servers.  Generally that stuff is accessible from anywhere on the internet, although they have coined the term ‘private cloud’ to describe a scenario where access is more limited.  I digress.

As a developer for the cloud, the stakes are high for you.  Customers are entrusting you to protect their stuff, which is logically connected to every internet user on the planet (potentially sans N. Korea, Iran, China, et al.).

Here is a checklist that I like to think of when I’m evaluating the security of a cloud product.  I suppose you could turn it around and use it as a how-to guide.  Just a quick caveat: I am not a programmer.  I stopped programming as soon as I realized just how bad I am at it.  You know what they say, “those who can’t do, manage”.

  1. Guard the front gates with everything you have.  Every restricted-access cloud service sits behind an authentication layer.  That is your first line of defense.  The first line of code on every page (figuratively) should be an authentication/authorization check.
  2. Don’t even think of writing your own authentication system.  Let somebody else do that for you.  Trust me, they are much better at it than you are.  Use Microsoft ID (aka LiveID), OpenID, Google ID, or Facebook ID.  Better yet, let them pick.  Your customers will thank you because that is one less password they have to remember.
  3. Don’t let the bad guy pretend to be a legitimate user.  How do they do that?  Cross-site scripting (XSS) is the most common vulnerability I run across in the wild.  That is followed closely by cross-site request forgery (CSRF).  Both of these can be leveraged by an attacker to cause a user to shoot themselves in the foot.   This is a huge topic which probably warrants its own post.  I’ll try to break it down.
    • sanitize! sanitize! sanitize! Consider every piece of data which is ever in a user’s possession as evil.  This includes cookies, form variables, URL parameters, and uploaded files.  They are all evil and must be stopped.  I do mean everything.  Use somebody else’s input sanitization library.  Again, they have thought of things you haven’t.
    • Canaries (anti-CSRF tokens) are your friend.  This is the only effective way to prevent cross-site request forgery.  Multi-step forms don’t work.  Cookies don’t work.  Use a challenge-response system with canaries to validate that a user really just came from one of your pages.
    • Use HTTP commands as designed.  Never change state with an HTTP GET.  Use a POST.  It may take some extra JavaScript foo, but it is much harder to attack.
    • Scan every version of your code.  There are a number of automated XSS scanners on the market.  They are pretty good at finding script kiddie vulnerabilities.  If you are a juicy enough target to warrant attention from ‘advanced’ researchers, you may want to invest in a second set of eyes.  Trust me, an expert pentest crew is a whole lot cheaper than getting horribly owned.
  4. Don’t do dumb things with SQL.  It never ceases to amaze me that SQL injection still works.  Use stored procedures for everything.  Also, building an SQL string in a stored procedure and EXECing it is just as bad as not parameterizing at all.  Don’t do it.
  5. SSL is good.  If the data is important enough to put behind a login page, it is important enough to protect with SSL.  Period.
  6. Protect data in storage.  This is a topic I am planning on writing a dedicated post about.  The news is full of companies whose data has been stolen and put on pastebin or similar.  If it shouldn’t be shown to the world, it should be encrypted.  Stay tuned for the specifics of this.
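To make item 4 concrete, here is a minimal sketch of the difference between concatenated and parameterized SQL, in PowerShell since that’s what I use elsewhere on this blog.  The connection string, table, and column names are all made up for illustration:

```powershell
# Any attacker-controlled value; note the embedded quote.
$userInput = "O'Brien"

$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = "Server=dbserver;Database=Shop;Integrated Security=True"
$cmd = $conn.CreateCommand()

# BAD: user input is spliced into the SQL text and can rewrite the query.
#   $cmd.CommandText = "SELECT Price FROM Products WHERE Name = '$userInput'"

# GOOD: user input travels as a typed parameter, never as SQL text.
$cmd.CommandText = "SELECT Price FROM Products WHERE Name = @name"
$null = $cmd.Parameters.AddWithValue("@name", $userInput)
```

The same principle applies inside a stored procedure: pass values as parameters rather than gluing them into a string you EXEC.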

As I write this, you may notice that none of these items are specific to the cloud.  These are all best practices for developing on the internet in general.  That is because they are the same thing.  Cloud == internet, internet == cloud.  Marketing will say what marketing will say.  For the rest of us, nothing has changed.  The web is still the web.  If you can secure one, you can secure the other.