Friday, 20 July 2012

Syncing Mercurial repositories between computers using hg serve

Occasionally I will copy a Mercurial repository onto another computer using a USB key (e.g. to work from home, or when I am giving a presentation). What happens when the two computers get out of sync? How do I bring them back in line? As always with a DVCS, there are several ways to deal with it.

  • If you know that changes have only been made in one repo, you could simply use the USB key again to copy the most up-to-date repo over the older one. This isn’t a recommended way to work, but in simple cases it does the trick.
  • Slightly safer would be to copy the newer repo onto the old computer into a different folder, then set that folder up as a new remote location and pull from it (a rough example is shown after this list). If changes were made on both computers, you’d need to do this in both directions, which would be cumbersome.
  • If you have a public server you can create repos on, then you can push to the server and pull from the other computer. Bitbucket is excellent for this, with its unlimited free private repos.
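
As a sketch of that second option, suppose the copy from the USB key had been dropped into a temporary folder (the paths here are just for illustration); pulling from it would look something like this:

cd C:\Users\Mark\Code\MyApp
hg pull C:\Temp\MyApp-usb-copy
hg update

hg pull happily accepts a plain folder path as its source, so nothing needs to be configured for a one-off pull like this.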

However, Mercurial also comes with a handy feature that lets you sync repos between two computers without needing a USB key or a central server. All you need is for the two machines to be on the same network.

If you have TortoiseHg installed (which I highly recommend for Mercurial development), then you can simply right-click on any Mercurial repository and select TortoiseHg | Web Server.

[Screenshot: selecting TortoiseHg | Web Server from the right-click menu]

This launches a very simple web server, which under the hood uses the hg serve command (e.g. hg serve --port 8000 --debug -R C:\Users\Mark\Code\MyApp).

[Screenshot: the hg serve web server window running on port 8000]
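
If you don’t have TortoiseHg installed, you can get much the same result straight from the command line; a rough equivalent of what that dialog runs (adjust the repository path to your own) would be:

hg serve -R C:\Users\Mark\Code\MyApp --port 8000

By default hg serve listens on port 8000 and keeps running until you stop it with Ctrl+C.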

You can visit this web site in your browser and view the changesets or browse the code.

[Screenshot: browsing the repository’s changesets in a web browser]

Now, on your other computer, all you need to do is open the .hg\hgrc file in your repository with a text editor (you might need to create it) and set up a new path pointing at the first machine. For example:

[paths]
default = http://cpu-1927482d:8000/

Now you can run hg pull -u to bring in all the changes from your other computer. It is a very simple and quick way to sync a repo between two development computers.
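
If editing hgrc feels like overkill for a one-off sync, you can also pass the URL straight to the commands (using the same machine name and port as above):

hg incoming http://cpu-1927482d:8000/
hg pull -u http://cpu-1927482d:8000/

hg incoming just previews the changesets that would be pulled, which is a handy sanity check before the pull itself.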

Another great use for this feature is for two developers on a team to share in-progress changes with one another before they are ready to push them to the central repository.

Wednesday, 18 July 2012

Accessing XML Data using WebMatrix

Although I’ve been a .NET developer for around a decade, the majority of my work has not been on websites, so it is only occasionally that I get to dabble in ASP.NET. However, I do try to keep up with the new technologies by building a few small websites from time to time. Recently, I decided to try learning a bit of Azure and Razor syntax by porting the first ASP.NET site I made ages ago over to WebMatrix. I’m using the release candidate of WebMatrix 2.

The website I am porting did a very simple job: it displayed football fixtures and results from XML files I kept up to date manually, and it could also keep a running total of the top scorers for the season. The main challenge in porting it was getting to grips with the Razor syntax, but how to access XML data was also not immediately obvious to me. Of course, it’s simple once you know how, so here are the steps:

1. Put your XML files into App_Data

Just copy them into the folder (create App_Data yourself if it doesn’t exist).
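
For illustration, a file like Fixtures2003-2004.xml might look something like this (the data and the names of the root and fixture elements are made up; the child element names are the ones queried in step 4 below):

<Fixtures>
  <Fixture>
    <Competition>League</Competition>
    <Date>13 Sep 2003</Date>
    <Venue>Home</Venue>
    <Opponents>Rovers</Opponents>
    <Result>Won</Result>
    <For>3</For>
    <Against>1</Against>
    <Goalscorers>Smith (2), Jones</Goalscorers>
  </Fixture>
</Fixtures>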

2. Reference the System.Xml.Linq namespace

This can be done at the top of your Razor cshtml file with the @using keyword:

@using System.Xml.Linq

3. Load the XDocument

Loading the XDocument is straightforward enough, but you also need to use Server.MapPath to get the real path to your XML file. I have a different XML file for each season, so I used the query string to specify which one; you can read the query string via the Request object:

@{
    var season = Request["season"] ?? "2003-2004";
    var file = XDocument.Load(Server.MapPath(@"/App_Data/Fixtures" + season + ".xml"));
}

4. Perform a Query

This is straightforward LINQ to XML. I used a strongly typed class called Fixture, which I put into a C# file in the App_Code folder (you have to create this yourself).

var fixtures = from e in file.Root.Elements()
               select new Fixture
               {
                   Competition = e.Element("Competition").Value,
                   Date = e.Element("Date").Value,
                   Venue = e.Element("Venue").Value,
                   Opponents = e.Element("Opponents").Value,
                   Result = e.Element("Result").Value,
                   For = e.Element("For").Value,
                   Against = e.Element("Against").Value,
                   Goalscorers = e.Element("Goalscorers").Value
               };
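
The Fixture class itself is nothing special; a minimal sketch (with every property kept as a string, to match the .Value calls above) would be:

// Fixture.cs in App_Code - a simple container for one row of the fixtures table
public class Fixture
{
    public string Competition { get; set; }
    public string Date { get; set; }
    public string Venue { get; set; }
    public string Opponents { get; set; }
    public string Result { get; set; }
    public string For { get; set; }
    public string Against { get; set; }
    public string Goalscorers { get; set; }
}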

5. Iterate it using foreach

For some reason it took me a few goes to work out how you did a foreach in Razor. It’s actually a very clean and simple syntax:

<tbody>
@foreach (var f in fixtures) {
    <tr>
        <td>@f.Competition</td>
        <td>@f.Date</td>
        <td>@f.Venue</td>
        <td>@f.Opponents</td>
        <td>@f.Result</td>
        <td>@f.For</td>
        <td>@f.Against</td>
        <td>@f.Goalscorers</td>
    </tr>
}
</tbody>    

And that’s it. Probably painfully obvious if you are a seasoned ASP.NET developer. I’m hoping to try deploying this to Azure, as well as experimenting with moving the data into an sdf file, so I might blog about how easy that is to do.