Tuesday, December 24, 2013

Running Ubuntu full screen under Hyper-V 2012

There is no way to configure the screen resolution of an Ubuntu guest through the GUI so that it runs full screen under Hyper-V 2012 (on Server 2012 / 2012 R2 or Windows 8 / 8.1).

One way to get Ubuntu running full screen is to hard-code the resolution of your monitor as a Linux kernel parameter. My monitor runs at 1680x1050, so I made the following changes to my Ubuntu 13.10 guest:


  1. Edit the grub configuration file, for example:
    sudo vi /etc/default/grub
     
  2. Find the line starting with GRUB_CMDLINE_LINUX_DEFAULT, and add "video=hyperv_fb:1680x1050" (or your custom resolution) between the existing quotes. For example: 
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash video=hyperv_fb:1680x1050"
  3. Save and exit 
  4. Run 
    sudo update-grub
  5. Restart the virtual machine


If you get an error dialog warning that the video mode could not be applied when you log into the GUI, use Ubuntu's display settings to change the screen resolution to your chosen custom resolution.

Wednesday, November 20, 2013

System.Xml.XmlException: Data at the root level is invalid. Line 1, position 1

I ran into a problem recently that seemed very familiar, because I am sure I have encountered it before. So, once I had re-solved it, I decided to write a blog entry in the hope that I will find this post when I next run into the issue in a few years' time.

The problem manifested itself in the simple scenario below: I serialised a simple object into UTF-8 XML and then tried to parse the result using XDocument.Parse.

    public class SimpleClass
    {
        public string SomeProperty { get; set; }
        public int AnotherProperty { get; set; }
    }

    [TestMethod]
    public void Given_a_simple_object_When_serialised_to_XML_and_deserialised_Then_it_should_not_throw_an_exception()
    {
        var someObject = new SimpleClass  { SomeProperty = "Abc", AnotherProperty = 42 };

        using (var memoryStream = new MemoryStream())
        using (var xmlTextWriter = new XmlTextWriter(memoryStream, Encoding.UTF8))
        {
            var serialiser = new XmlSerializer(someObject.GetType());
            serialiser.Serialize(xmlTextWriter, someObject);

            var utf8Xml = Encoding.UTF8.GetString(memoryStream.ToArray());
            XDocument.Parse(utf8Xml);   //fails at this point with exception System.Xml.XmlException: Data at the root level is invalid. Line 1, position 1
        }
    }

The document in utf8Xml seems fine:

<?xml version="1.0" encoding="utf-8"?><SimpleClass xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"><SomeProperty>Abc</SomeProperty><AnotherProperty>42</AnotherProperty></SimpleClass>
However, the test fails when trying to parse the XML.

I found the cause of the problem to be the byte order mark (BOM) added to the start of the UTF-8 string. The solution was to construct a new instance of UTF8Encoding rather than using the static Encoding.UTF8. One of the UTF8Encoding constructor overloads takes a parameter encoderShouldEmitUTF8Identifier. Set this to false, and everything works!

This is the passing test:

    public class SimpleClass
    {
        public string SomeProperty { get; set; }
        public int AnotherProperty { get; set; }
    }

    [TestMethod]
    public void Given_a_simple_object_When_serialised_to_XML_and_deserialised_Then_it_should_not_throw_an_exception()
    {
        var someObject = new SimpleClass  { SomeProperty = "Abc", AnotherProperty = 42 };

        using (var memoryStream = new MemoryStream())
        using (var xmlTextWriter = new XmlTextWriter(memoryStream, new UTF8Encoding(false, true)))
        {
            var serialiser = new XmlSerializer(someObject.GetType());
            serialiser.Serialize(xmlTextWriter, someObject);

            var utf8Xml = Encoding.UTF8.GetString(memoryStream.ToArray());
            XDocument.Parse(utf8Xml);
        }
    }

Thursday, November 14, 2013

Get a list of missing updates on Server Core

If you want to get a list of Windows Updates that haven't been installed on Server Core (or, for that matter, any machine where you want to use the command line), the following PowerShell one-liner will do the job:
(New-Object -ComObject Microsoft.Update.Searcher).Search("IsInstalled=0 and Type='Software'").Updates | select LastDeploymentChangeTime, Title | ft -AutoSize
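If you want to keep a record of the results, the same query pipes naturally into a CSV file (the file name is just an example):

(New-Object -ComObject Microsoft.Update.Searcher).Search("IsInstalled=0 and Type='Software'").Updates | select LastDeploymentChangeTime, Title | Export-Csv MissingUpdates.csv -NoTypeInformation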

Friday, November 8, 2013

Installing Jekyll on Windows

Installing Jekyll on Windows is trivially easy if you already have Chocolatey on your machine. Run the following PowerShell one-liner and you’ll be set up in no time!
cup python; cup ruby1.9; cup ruby.devkit.ruby193; gem install jekyll; gem install wdm; gem uninstall pygments.rb; gem install pygments.rb --version "=0.5.0"
Now you just need to learn Jekyll by starting here: http://jekyllrb.com/docs/usage/. Bear in mind that there are some issues with Jekyll on Windows because it is not an officially supported platform.
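Once the gems are installed, scaffolding and previewing a site uses the standard Jekyll commands:

jekyll new myblog
cd myblog
jekyll serve

Then browse to http://localhost:4000 to see the generated site.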

Saturday, October 26, 2013

AngularJS directive documentation

It looks like the Angular documentation for directives has changed, but unfortunately a lot of valuable information has been lost in the process. I reproduce part of the original documentation below:
(source: http://code.angularjs.org/1.0.7/docs/guide/directive)



Directive Definition Object

The directive definition object provides instructions to the compiler. The attributes are:
  • name - Name of the current scope. Optional and defaults to the name at registration.
  • priority - When there are multiple directives defined on a single DOM element, sometimes it is necessary to specify the order in which the directives are applied. The priority is used to sort the directives before their compile functions get called. Higher priority goes first. The order of directives within the same priority is undefined.
  • terminal - If set to true then the current priority will be the last set of directives which will execute (any directives at the current priority will still execute as the order of execution on same priority is undefined).
  • scope - If set to:
    • true - then a new scope will be created for this directive. If multiple directives on the same element request a new scope, only one new scope is created. The new scope rule does not apply for the root of the template since the root of the template always gets a new scope.
    • {} (object hash) - then a new 'isolate' scope is created. The 'isolate' scope differs from normal scope in that it does not prototypically inherit from the parent scope. This is useful when creating reusable components, which should not accidentally read or modify data in the parent scope.
      The 'isolate' scope takes an object hash which defines a set of local scope properties derived from the parent scope. These local properties are useful for aliasing values for templates. Locals definition is a hash of local scope property to its source:
      • @ or @attr - bind a local scope property to the value of a DOM attribute. The result is always a string since DOM attributes are strings. If no attr name is specified then the attribute name is assumed to be the same as the local name. Given <widget my-attr="hello {{name}}"> and widget definition of scope: { localName:'@myAttr' }, then widget scope property localName will reflect the interpolated value of hello {{name}}. As the name attribute changes so will the localName property on the widget scope. The name is read from the parent scope (not component scope).
      • = or =attr - set up bi-directional binding between a local scope property and the parent scope property of name defined via the value of the attr attribute. If no attr name is specified then the attribute name is assumed to be the same as the local name. Given <widget my-attr="parentModel"> and widget definition of scope: { localModel:'=myAttr' }, then widget scope property localModel will reflect the value of parentModel on the parent scope. Any changes to parentModel will be reflected in localModel, and any changes in localModel will be reflected in parentModel.
      • & or &attr - provides a way to execute an expression in the context of the parent scope. If no attr name is specified then the attribute name is assumed to be the same as the local name. Given <widget my-attr="count = count + value"> and widget definition of scope: { localFn:'&myAttr' }, then isolate scope property localFn will point to a function wrapper for the count = count + value expression. Often it's desirable to pass data from the isolated scope via an expression to the parent scope; this can be done by passing a map of local variable names and values into the expression wrapper fn. For example, if the expression is increment(amount) then we can specify the amount value by calling localFn as localFn({amount: 22}).
  • controller - Controller constructor function. The controller is instantiated before the pre-linking phase and it is shared with other directives if they request it by name (see require attribute). This allows the directives to communicate with each other and augment each other's behavior. The controller is injectable with the following locals:
    • $scope - Current scope associated with the element
    • $element - Current element
    • $attrs - Current attributes object for the element
    • $transclude - A transclude linking function pre-bound to the correct transclusion scope: function(cloneLinkingFn).
    To avoid errors after minification the bracket notation should be used:
    controller: ['$scope', '$element', '$attrs', '$transclude', function($scope, $element, $attrs, $transclude) { ... }]
  • require - Require another controller be passed into current directive linking function. The require takes a name of the directive controller to pass in. If no such controller can be found an error is raised. The name can be prefixed with:
    • ? - Don't raise an error. This makes the require dependency optional.
    • ^ - Look for the controller on parent elements as well.
  • restrict - String of subset of EACM which restricts the directive to a specific directive declaration style. If omitted directives are allowed on attributes only.
    • E - Element name: <my-directive></my-directive>
    • A - Attribute: <div my-directive="exp"></div>
    • C - Class: <div class="my-directive: exp;"></div>
    • M - Comment: <!-- directive: my-directive exp -->
  • template - replace the current element with the contents of the HTML. The replacement process migrates all of the attributes / classes from the old element to the new one. See the Creating Components section below for more information.
  • templateUrl - Same as template but the template is loaded from the specified URL. Because the template loading is asynchronous the compilation/linking is suspended until the template is loaded.
  • replace - if set to true then the template will replace the current element, rather than append the template to the element.
  • transclude - compile the content of the element and make it available to the directive. Typically used with ngTransclude. The advantage of transclusion is that the linking function receives a transclusion function which is pre-bound to the correct scope. In a typical setup the widget creates an isolate scope, but the transclusion is not a child, but a sibling of the isolate scope. This makes it possible for the widget to have private state, and the transclusion to be bound to the parent (pre-isolate) scope.
    • true - transclude the content of the directive.
    • 'element' - transclude the whole element including any directives defined at lower priority.
  • compile - This is the compile function described in the section below.
  • link - This is the link function described in the section below. This property is used only if the compile property is not defined.

Tuesday, August 20, 2013

Installing meld or p4merge as diff / merge tool for Git on Windows

Meld

The following PowerShell uses Chocolatey to install meld and configures it as the diff and merge tool for Git. Note that meld is an officially supported diff tool in Git for Windows.
cinst meld

git config --global diff.tool meld
git config --global merge.tool meld

git config --global mergetool.meld.keeptemporaries false
git config --global mergetool.meld.trustExitCode false
git config --global mergetool.meld.keepbackup false

P4Merge (Perforce Merge)

The following PowerShell uses Chocolatey to install p4merge and configures it as the diff and merge tool for Git. Note that p4merge is an officially supported diff tool in Git for Windows.
cinst p4merge

git config --global diff.tool p4merge
git config --global merge.tool p4merge

git config --global mergetool.p4merge.keeptemporaries false
git config --global mergetool.p4merge.trustExitCode false
git config --global mergetool.p4merge.keepbackup false

Semantic Merge

If you're coding in C#, it may be worth using Semantic Merge (also installable via Chocolatey). Note that this set of commands sets up the PlasticSCM merge tool as the fall-back when not merging C#.
cinst SemanticMerge

git config --global diff.tool semantic
git config --global merge.tool semantic

$semanticMergeUserPath = Resolve-Path ~\AppData\local\PlasticSCM4\semanticmerge\semanticmergetool.exe
git config --global difftool.semantic.cmd ('''' + $semanticMergeUserPath + ''' -d=\"$LOCAL\" -s=\"$REMOTE\" -edt=\"mergetool.exe -d=\"\"@sourcefile\"\" -dn=\"\"@sourcesymbolic\"\" -s=\"\"@destinationfile\"\" -sn=\"\"@destinationsymbolic\"\" -t=\"\"@filetype\"\" -i=\"\"@comparationmethod\"\" -e=\"\"@fileencoding\"\"\"')
git config --global mergetool.semantic.cmd ('''' + $semanticMergeUserPath + ''' -b=\"$BASE\" -d=\"$LOCAL\" -s=\"$REMOTE\" -r=\"$MERGED\" -l=csharp -emt=\"mergetool.exe -b=\"\"@basefile\"\" -bn=\"\"@basesymbolic\"\" -s=\"\"@sourcefile\"\" -sn=\"\"@sourcesymbolic\"\" -d=\"\"@destinationfile\"\" -dn=\"\"@destinationsymbolic\"\" -r=\"\"@output\"\" -t=\"\"@filetype\"\" -i=\"\"@comparationmethod\"\" -e=\"\"@fileencoding\"\"\" -edt=\"mergetool.exe  -s=\"\"@sourcefile\"\" -sn=\"\"@sourcesymbolic\"\" -d=\"\"@destinationfile\"\" -dn=\"\"@destinationsymbolic\"\" -t=\"\"@filetype\"\" -i=\"\"@comparationmethod\"\" -e=\"\"@fileencoding\"\"\"')

git config --global mergetool.semantic.keeptemporaries false
git config --global mergetool.semantic.trustExitCode false
git config --global mergetool.semantic.keepbackup false

Other merge tools

You can see which diff tools are supported natively by git by running the following command:

git difftool --tool-help

You'll see something like the following result:

'git difftool --tool=<tool>' may be set to one of the following:
  gvimdiff
  gvimdiff2
  vimdiff
  vimdiff2

The following tools are valid, but not currently available:
  araxis
  bc3
  codecompare
  defaults
  deltawalker
  diffuse
  ecmerge
  emerge
  kdiff3
  kompare
  meld
  opendiff
  p4merge
  tkdiff
  vim
  xxdiff

Some of the tools listed above only work in a windowed
environment. If run in a terminal-only session, they will fail.

Saturday, August 3, 2013

Multiple recipients with SmtpClient in .NET

When sending a quick email, for example to notify of a PowerShell task failure, the simplest System.Net.Mail.SmtpClient method to use is usually this one:
public void Send(
string from,
string recipients,
string subject,
string body
)

The parameter name recipients implies that it is possible to send the email to more than one recipient - but the documentation does not say how. I tried separating the addresses with a semicolon, but that failed with the following error:
"Send" with "4" argument(s): "An invalid character was found in the mail header: ';'."
The correct separator to use, which I only found out by trawling through the .NET source with ILSpy, is a comma (,).
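For example, this PowerShell snippet (the server name and addresses are placeholders) sends one email to two recipients:

$client = New-Object System.Net.Mail.SmtpClient 'smtp.example.com'
$client.Send('alerts@example.com', 'alice@example.com,bob@example.com', 'Task failed', 'The overnight task failed.')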

Monday, July 29, 2013

Finding the quietest time to do disruptive maintenance with IIS

Some changes to production environments may be much simpler if a little downtime is acceptable (such as migrating hosting providers). Your IIS logs and the LogParser tool give you hard data with which to find the quietest time and so minimise the impact on customers.

  1. Get hold of the IIS logs from all production servers and copy them into a single folder (see the sketch after this list for one way to do this), e.g. 
    • Server1
      • W3SVC1
        • ex12345.log
    • Server2
      • W3SVC1
        • ex12345.log
  2. Download and install LogParser, if you don't have it.
  3. Run the following PowerShell from within the log folder root:
    & 'C:\Program Files (x86)\Log Parser 2.2\LogParser.exe' -i:iisw3c -recurse:-1 "select to_localtime(to_timestamp(date, quantize(time, 3600))) as LocalTimeBlock, count(*) as Count into Logs.csv from *.log where date is not null and `"cs(User-Agent)`" not like '%pingdom%' group by LocalTimeBlock"
    This query converts all dates and times into one-hour blocks in local time and excludes requests from Pingdom, which we use for uptime monitoring.
  4. Load the generated Logs.csv in Excel and create a scatter plot from the two columns. 
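Step 1 can itself be scripted; here is a minimal sketch, assuming the default IIS 7+ log location and admin shares on each server (server names and paths are illustrative):

'Server1','Server2' | % {
    $dest = Join-Path (Get-Location) "$_\W3SVC1"
    New-Item -ItemType Directory -Force $dest | Out-Null
    Copy-Item "\\$_\c$\inetpub\logs\LogFiles\W3SVC1\*.log" $dest
}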

Saturday, July 20, 2013

Latency timings for various Azure datacentres

I have published a tool that 'pings' various Azure datacentres around the world from your browser to see which have the lowest and highest latency. This is obviously most useful if most of your customers are in a single location, but I suspect that scenario is quite common.

The tool can be found at http://azureping.info.

Wednesday, July 17, 2013

Disable the "Attach Security Warning" dialog box

When remote debugging with Visual Studio, you may get an "Attach Security Warning" dialog for every process you want to attach to. There is no obvious way to stop this from popping up, and it can be quite frustrating when remote debugging several application pools. Microsoft's rationale for the security warning is here.

For those who want to silence this dialog forever, run the following PowerShell one-liner:
taskkill /IM devenv.exe; while(get-process -ea 0 devenv) { Write-Host "Waiting for Visual Studio to shut down..."; Start-Sleep -sec 1 }; ls HKCU:\Software\Microsoft\VisualStudio | % { $_.PSPath + "\Debugger" } | % { sp -ea 0 $_ DisableAttachSecurityWarning -Type DWORD -Value 1 }
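To confirm the value was written for each installed Visual Studio version, read it back:

ls HKCU:\Software\Microsoft\VisualStudio | % { Get-ItemProperty -ea 0 ($_.PSPath + "\Debugger") DisableAttachSecurityWarning }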

Tuesday, July 16, 2013

Format a drive from PowerShell

After adding a new drive to a Windows Server 2012 Core machine, you will need to initialise and format it and assign a drive letter. This is how to do it:

# Find the uninitialised (raw) disk and bring it online with a partition table.
$disk = Get-Disk | ? PartitionStyle -eq Raw
$disk | Initialize-Disk

# Create a single partition spanning the whole disk, format it, and mount it as D:\.
$part = New-Partition -DiskNumber $disk.Number -UseMaximumSize
$part | Format-Volume -FileSystem ReFS
Add-PartitionAccessPath -DiskNumber $disk.Number -PartitionNumber $part.PartitionNumber -AccessPath D:\

Thursday, July 4, 2013

Overhead of using async and await in .NET 4.5

Although asynchronous code is touted as allowing for more scalable architectures through fewer threads (à la Node.js), there must be a latency overhead to this technology. While the total throughput of the system may be higher, an individual request must be slowed down by the increased context switching.

I was interested in seeing how much of a performance impact there is to using the new asynchronous pattern in .NET (simplified by the fantastic async / await keywords in C#).

I wrote an application that compares the FileStream.Read and FileStream.ReadAsync calls in a tight loop, reading one byte into a byte array 1 million times. Tweaking the file size and other parameters resulted in very different results, although the synchronous call was always at least five times faster than the asynchronous call.

Results

File length: 1000000 bytes.
Running 100 loops of Preload...
Done in 0 ms.
Running 1000000 loops of Asynchronous...
Done in 2452 ms.
Running 1000000 loops of Synchronous...
Done in 17 ms.

Conclusion

In this single test scenario, using this particular method of file access, the overhead of using asynchronous calls for very fast operations is significant: an operation that takes around 20 ms synchronously takes around 2500 ms asynchronously (125 times slower). In practice, this is an overhead of roughly 2.5 μs per asynchronous call. Whether this overhead is acceptable, and whether the reduction in system resources that asynchronous calls are supposed to yield is worth the slowdown, is application dependent.

Source code

Wednesday, June 19, 2013

Ping using PowerShell

To 'ping' a machine using PowerShell, use Test-Connection:
Test-Connection -Quiet machine
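The -Quiet switch makes Test-Connection return $true or $false, so it drops straight into a condition:

if (Test-Connection -Quiet -Count 1 machine) { 'machine is up' } else { 'machine is down' }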

Friday, May 17, 2013

Why can't I copy and paste a URL from a Gmail subject line?

Sometimes I send myself quick emails with only the URL of a website pasted into the subject line. Unfortunately, I have been frustrated by not being able to copy those URLs from Gmail in Internet Explorer 10 and paste them into a browser's address bar. This does not seem to affect Chrome.


TL;DR

If you are having trouble copying and pasting a URL from a Gmail subject line, paste it here and click Go:


Detail


Steps to reproduce:

  1. Visit a website.
  2. Copy the URL and send it to your Gmail account as the subject line.
  3. Open your Gmail account and copy the URL from the email you just sent yourself.
  4. Paste this URL into your browser's address bar.
  5. See the result: the browser fails to load the page.


Why does this happen?

It seems that Google must be inserting a special character called a Zero Width Space (Unicode 8203, or U+200B) at certain positions in the URL to prevent copying and pasting. I can only presume that this is some kind of security measure. For example, the URL I used in this example actually looked like this behind the scenes: http://www.iana.org/?domains/re?served, where the question marks (?) stand in for the invisible zero width space characters.

Point to note: the DNS protocol seems to ignore this character. Pasting http://example.iana.org/ (which mapped to http://exa?mple.iana.?org/ when I tried it) into the browser will work fine.

The value is not a copy / paste artefact; it is in the DOM node value:
document.getElementById(':np').firstChild.nodeValue 
"http://www​.iana.org/​domains/re​served"
document.getElementById(':np').firstChild.nodeValue.length 
39

'http://www.iana.org/domains/reserved'.length 
36

Monday, April 29, 2013

Retrospectively timing long-running operations in PowerShell

Sometimes I run an operation that takes longer to execute than I expect, but once it has finished, the only way to see how long it took is to run it again inside Measure-Command or some other timing mechanism. This PowerShell prompt preserves your existing prompt (for example PoshGit) and tacks an execution time onto each and every command you run.
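The original prompt function is not reproduced here, but a minimal sketch along the same lines, wrapping whatever prompt function is already defined and reading the timing properties that Get-History exposes, might look like this:

# Capture the existing prompt function (for example the one PoshGit installed).
$existingPrompt = $function:prompt
function prompt {
    # Print the duration of the last command before rendering the usual prompt.
    $last = Get-History -Count 1
    if ($last) { Write-Host ($last.EndExecutionTime - $last.StartExecutionTime) }
    & $existingPrompt
}

With the function in place, every command reports its duration: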
C:\Demo [master]> Start-Sleep 3
00:00:03.0048920
C:\Demo [master]> Start-Sleep 5
00:00:04.9974939
C:\Demo [master]>
00:00:00.0004274
C:\Demo [master]>

Thursday, April 25, 2013

Old SysInternals source code

The source code for SysInternals tools is no longer published, although it was for some of the tools before Microsoft purchased them.

The source code for those tools can be downloaded from the Internet Archive.

Monday, April 22, 2013

Launch NUnit GUI with multiple assemblies

The NUnit GUI does not support loading multiple assemblies from the command line, so this PowerShell function creates an NUnit project file that can be passed as a command line argument instead. Just provide a file name for the new project file (it must end in .nunit or NUnit will barf) and an array of assembly files to load.

As a bonus, this project file will load your assemblies in multiple process mode, which means that any associated assembly configuration files will be correctly loaded.
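The original function is not shown here, but a sketch that emits a project file in the NUnit 2.x format (the function name and parameters are my own) might look like this:

function New-NUnitProject([string]$ProjectFile, [string[]]$Assemblies) {
    # One <assembly> element per test assembly; processModel="Multiple" gives
    # each assembly its own process so its .config file is honoured.
    $assemblyElements = ($Assemblies | % { "    <assembly path=`"$_`" />" }) -join "`r`n"
    @"
<NUnitProject>
  <Settings activeconfig="Default" processModel="Multiple" />
  <Config name="Default">
$assemblyElements
  </Config>
</NUnitProject>
"@ | Set-Content $ProjectFile
}

Usage: New-NUnitProject Tests.nunit 'Foo.Tests.dll','Bar.Tests.dll', then pass Tests.nunit to nunit.exe.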

Monday, March 4, 2013

"I believe you're the owner of..." spam

I recently received the following email (with domain names changed to protect the innocent), which is notable due to the absence of the usual grammar mistakes that give away most scams. The most obvious giveaway is that the site owner already owns both domains - including the one that Faheem is so generously offering to sell them.

I am curious to see whether "Faheem" is able to upload an HTML file as he claims, which would indicate that the site has been compromised.

Hello,

I believe you're the owner of [adomain.org]. I've got a proposition concerning your website. Would you be interested in acquiring [adomain.com]?

I can upload an HTML file temporarily to verify ownership of the domain, in case you're concerned. Let me know what you think to discuss further.

PS: I'm only emailing you because I believe you can benefit from this. I do not intend to email you again unless you respond to this inquiry.

Regards,
Faheem.

Friday, February 15, 2013

Debugging NUnit tests

I generally use NCrunch or ReSharper to run my NUnit tests, but it seems each test runner has its own quirks. Sometimes integration tests fail only on the build machine, because there they run in the command-line runner. Debugging these is a little trickier.

The solution I found that works well is to launch the NUnit GUI and run the tests from there, attaching to the correct process by running the following command in the Package Manager Console (i.e. PowerShell in VS):

($dte.Debugger.LocalProcesses | ? { $_.Name.EndsWith("nunit-agent.exe") }).Attach()

Wednesday, January 2, 2013

Copying Windows Azure SQL Database to a local server

The following script will download a Windows Azure SQL Database and import it into a locally running instance. It assumes the local database does not exist. The import step could also be used to create an off-site backup.

This script uses the SqlPackage tool, which is part of the Microsoft SQL Server Data-Tier Application Framework (DACFx) and can be installed using the Web Platform Installer.

You will need to replace the values surrounded by asterisks (*) for the script to work correctly.

param([string][parameter(mandatory)]$DatabaseAdministratorPassword)

$sqlPackage = 'C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\sqlpackage.exe'

# Export the Azure database to a temporary .bacpac file, then import it into the local instance.
$bacPacFile = join-path $env:TEMP 'Import.bacpac'
try 
{
    & $sqlPackage /a:Export /ssn:*****AZURE_DATABASE_SERVER***** /sdn:*****DATABASE_NAME***** /su:*****AZURE DATABASE ADMIN USER NAME***** /sp:$DatabaseAdministratorPassword /tf:$bacPacFile
    if(!$?) {
        throw "Failed to export database."
    }
    & $sqlPackage /a:Import /tdn:*****DATABASE_NAME***** /tsn:localhost\sqlexpress /sf:$bacPacFile
    if(!$?) {
        throw "Failed to import database."
    }
}
finally {
    del $bacPacFile
}
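Assuming you save the script as Copy-AzureDatabase.ps1 (a file name chosen here for illustration), run it like this:

.\Copy-AzureDatabase.ps1 -DatabaseAdministratorPassword 'your-admin-password'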