Saturday, 25 November 2017

Product Review – Dell XPS 13

I left my old job recently, and consequently found myself in the market for a new laptop. I hadn’t bothered replacing my old Acer Aspire V3-571 (with custom 1TB Samsung SSD) when it failed within two years, because the Toshiba Satellite Pro P50-C-12Z my old job supplied worked fine as both a home and work machine, especially once I upgraded it with the SSD from my home laptop. However, once I left that job, I found myself relying on a 10-year-old Dell Inspiron for a couple of weeks:


That Inspiron is a loyal old workhorse that has served me well as both a dev laptop and file server for a decade, and has been my go-to machine whenever other, newer, higher-spec machines have failed. It's had its screen and hinges replaced, and its hard drive upgraded several times with various SSDs, so saying it has lasted 10 years is a wee bit Ship of Theseus-y. But the motherboard, fans, etc. have all stood the test of time. However, it just isn’t fast enough any more for modern development, so I needed a new machine.

After some research, I ended up opting for the Dell XPS 13, with 16GB RAM, a 512GB PCIe SSD and the new 8th generation Kaby Lake processor (i7-8550U). I initially ordered the previous model, which was identical in all respects to the above spec except the processor (it was an i7-7550U) and the price tag: the old spec was £300 cheaper. However, after reading several reviews that all suggested the improvement in performance with the 8th gen processor is significant, I cancelled and ordered the new spec. It cost £1649 in early November 2017. At the time of writing, in late November 2017, it is retailing at £1547 on Amazon. That’s quite a price change in only a few weeks. Not waiting a few weeks to see how the price settled is my only regret so far about buying this machine.

The main thing that makes the 8th generation processor so much better is that it is quad-core, compared to the previous generation’s dual-core design. So, even with a lower clock speed, the newer processor burns through tasks like building Visual Studio solutions about 60% faster than its predecessor. That’s pretty significant: performance gains between adjacent generations are usually on the order of 10-20%. As the newer processors have a lower clock speed, they also run cooler, so, except when you’re building solutions, you almost never hear the fan kick in. I've found that browsing the internet, watching videos, and even using MS Office are all silent experiences on this machine.

I had heard that coil whine (a high-pitched noise that accompanies processor-intensive tasks like watching videos) was a problem in the previous model, but I haven’t experienced it myself in the three weeks I’ve been using my new machine. Similarly, I had read reports of the wireless network card being unreliable, but my machine has performed perfectly. It hasn't dropped the connection to my home wireless even once.

The main things that drew me to this model were the form factor (a 13.3-inch screen in a machined aluminium chassis that would usually only be large enough to hold an 11-inch screen in other models) and the battery life (Dell reports 16 hours; I’m finding more like 9 or 10, but I do have the QHD screen, which is more battery-intensive). Every time I’ve bought a laptop before, I’ve gone for power over convenience. But to be honest, the sort of 15-inch laptops I’ve opted for in the past aren’t great for commuting (small tables on trains), and they typically don’t have good battery life. (I’d have been lucky to get an hour or so on battery on my old Satellite Pro.) By comparison, this machine runs and runs. If I take it to bed at night, I have to be careful not to surf (do people still say "surf"?) or work too long, because that battery could easily see me through till dawn. As it is, if I browse the internet or watch a video for 3 hours or so at night on battery, I find I still have 70% battery the next morning. That, to me, is better than having a 10% faster machine that doesn’t last as long.

As a development machine

Honestly, I’ve not used this laptop heavily as a development machine yet, though I have experimented with running old Visual Studio solutions on it. A solution with 36 projects builds in 1-2 seconds in Visual Studio 2015 Professional and VS2017 Community from a cold start, and rebuilds in about 15 seconds. That's amazing: better than any laptop or desktop I've ever had, full-size or not, with or without an SSD. On my old full-size Satellite Pro with its 1TB Samsung SSD, that rebuild would have taken about 20 seconds. I should warn that there are some gotchas in VS2017. For example, if you leave the Lightweight Solution Load option on (as it is by default), the initial build time for that same solution is about 40 seconds in VS2017 Community, and sometimes the solution doesn't build at all. So my advice is to switch Lightweight Solution Load off completely; it's not stable enough at the current time to be of any use.

I’m always dubious about using versions of Visual Studio in the same year as they are named for, but VS2017 appears to be particularly bad in terms of bugginess and generally poor design decisions. For example, I found that the JavaScript Language Service (the part that should make Intellisense work for TypeScript and JS files) has been arbitrarily turned off in VS2017 for some of my existing solutions, seemingly because making JavaScript/TypeScript work properly in VS2017 had been put in the "too hard" pile. Note to Microsoft developers: if you can’t handle more than 20MB of JavaScript/TypeScript files in a single solution, you’re just wasting developers’ time. Turning off features that worked perfectly well in previous versions of VS is pretty inexcusable. The main reason for using TypeScript over JavaScript is that it provides object-oriented capabilities, but you can only leverage those features meaningfully if you have Intellisense. Disabling such a key feature at such a stupidly low threshold is like having a car whose doors fall off if you go over 30 MPH. And closing bugs about it on the basis that you meant to do something that stupid is even more stupid than the design decision was in the first place. FWIW, I fixed the problems with JS/TypeScript Intellisense that are evident in VS2017 by disabling the new Language Service completely using this option:

and including the following settings in a file named "tsconfig.json" in the root of my web project (the exact folders listed in "exclude" will vary per project; "node_modules" and the output folder are the usual candidates):

          {
            "compilerOptions": {
              "disableSizeLimit": true,
              "module": "commonjs",
              "allowJs": true,
              "outDir": "out"
            },
            "exclude": [
              "node_modules",
              "out"
            ],
            "compileOnSave": true,
            "typeAcquisition": {
              "enable": true
            }
          }

Anyway, this review is about my XPS 13, not the poor design decisions of the Visual Studio 2017 development team. I mention these issues with VS2017 vs VS2015 purely to note how hard it is to assess new hardware when you are also running new software: sometimes it’s not the hardware that's to blame for any failings observed. Overall, like for like, my XPS 13 performs better than my old Satellite Pro with its SATA 3 Samsung SSD, even though that machine was no slouch. I’m glad I bought it, and will continue to use it as my main development laptop.

Battery Life

In terms of battery life, the XPS 13 is a world away from any laptop I’ve owned before. Realistically, I get about 10 hours out of it if I’m just browsing the internet or watching videos. As noted, I haven’t used it for development in anger yet, but going by the way the fan kicks in when I build VS solutions, I’d suspect I’d get around 4-5 hours max out of it at full throttle, possibly less. All previous laptops I’ve used have only managed around 1-1.5 hours on battery, even if I were only browsing. So, whilst I have to be careful this doesn’t make me sit up too long at night, it is a huge improvement. That battery life is the main reason I bought the XPS 13 over its big brother, the XPS 15. Every time I’ve bought a laptop in the past 10 years, my ‘sensible’ head has kicked in and coaxed me to go for raw power over portability and battery life. With this machine, I don’t need to compromise: it provides both ultra-portability and processing power in one package. Whilst the latest generation XPS 15 could no doubt out-perform this model in sheer processing time, you can’t argue with sub-5-second builds in VS2015 combined with a full day’s battery life for commuting, or for lighter tasks in the evening.

Other features

The XPS 13 has two features that I particularly like. Firstly, it has a nice, carbon fibre, rubberised keyboard surface. The keys themselves are pretty tactile, chiclet-style; as a touch typist, I find they suit me very well. And the palm rest is rubberised, which makes the keyboard pleasant to use when the laptop is cold. Had Dell opted for aluminium all round, I think that would have made for some pretty cold hands when typing a quick email first thing in the morning, or when transferring the laptop from a cold car boot to a warm office.

Secondly, the keyboard is well lit, with differing levels of white lighting available, including "off." In a dark room, that backlighting makes positioning your hands far easier. The keyboard light also only comes on when you type, so it isn't distracting if you generally want a keyboard light but don't want it catching your eye when watching a video.

Minor Quibbles

My old Inspiron and new XPS 13 both have one design feature that I find annoying: namely, a battery charging light right on the front of the machine that can't be disabled. It goes off when the battery is fully charged, but it'd have been nice to be able to switch it off electively. My old Inspiron's battery has reached a stage where it doesn't hold a charge any more; the light is therefore even more annoying, as it flashes orange to warn that the battery needs replacing. My new XPS 13 is too new to tell whether it does the same thing, though my understanding is that it will once the battery is too old to charge any more. It'd be nice not to have to use black electrical tape to switch this feature "off."

Secondly, the webcam is badly placed. I don't use it anyway, so it's not an issue for me. But if you do a lot of web conferencing, be aware that it sits at the bottom-left of the screen. This is because the 13.3 inch display takes up nearly all of the height and width available. But it means that any WebEx you join will involve participants looking right up your nose. Not pleasant.

Lastly, the hinge on the lid is very strong. I personally like this, as it means the screen doesn't move when you use the touchscreen. But some people have complained about having to use a whole two hands instead of one to open the lid. The main issue I do have with the lid is that it doubles as a very effective set of pliers if you place your fingers on the hinge whilst opening it. (Just as well you need both hands to open it then, really, isn't it?)

Other options

Other options I considered included the Razer Blade Stealth. (In the end, the lack of an 8th gen processor, combined with the fact that I could only get the “gamer” version with a green logo and rainbow-coloured keyboard lighting, was a deal breaker for me. Plus, support is US-based whilst I’m in the UK.) I also liked the HP Spectre very much indeed; it seems a very nice machine, just not quite as capable as the XPS 13 in terms of power or battery life. Beautiful, though.

Thursday, 17 August 2017

Product Review - Wago Connectors

I was re-wiring my garage recently, when I got fed up screwing wires into choc blocks. I figured someone must have come up with a better way of connecting wires together, got Googling, and found these guys - Wago Connectors:

Wago make lots of different kinds of connectors, some of which are re-usable. Those are the ones I went for. For historical reasons, there are two kinds of re-usable connectors. The 222s:

And newer 221s:

Both kinds come in 2-way, 3-way and 5-way forms (for connecting the respective number of wires together). The 221s are slightly more expensive and take up about 40% less space, but they do the same job of letting you join wires together, potentially wires of different gauges (such as when connecting solid-core twin and earth to the multi-core flex used by most appliances in the UK).

I can highly-recommend these useful little guys. They sped up the job considerably, and have proven very reliable in use.

I don’t have a fidget spinner, so I kept a few of these connectors on my desk over the next month or so to footer with whilst coding, opening and closing the levers repeatedly. From that unscientific "test", I can say that the 222s are quite a bit more robust than the 221s. After a few hundred opening-and-closing operations on their levers, the more expensive 221 wouldn’t stay open fully any more. It is still usable, and I could hold it open whilst inserting a wire if I really needed to, but then it becomes just as fiddly to use as a choc block. So if you're going to be installing, uninstalling and re-building a lot, I'd say go for the 222s. If weight is a primary concern (e.g., building a drone), then use the 221s, or just solder and accept that greater build time and reduced ability to disassemble is the price you pay for less weight.

On the upside, the levers on the 221s are considerably easier to open, though neither is particularly difficult. There is a dedicated tool for opening them that costs over £100, but really it's a ridiculously over-engineered solution that I can't imagine anybody needing. Even people installing these all day would have no difficulty opening them with just their fingers.

The first time you open one of the 222s, you’ll be unsure if it’s broken. Because its jaws initially open to about half way quite easily, then you need to use substantially more force to open the lever all the way. It can also give you a nasty “mouse trap” snap on your fingers if you’re not careful whilst you close the lever to clamp your wire in place.

Overall, I think I’ll be using the cheaper 222s where space isn’t a consideration. To that end, I bought a box of the 3-way and 2-way 222s, and a box of the 5-way 221s. (When I need to connect 5 wires together, that’s usually when space is tightest.)

With regard to their ratings, I'm honestly not quite sure what amperage / voltage they can take. The problem is there are two ratings on each model. (Presumably to satisfy more than one set of tests for different markets.) 

The 222s are rated at "20A 300V" on one side and "600V" on the other. The 221s have labels showing they are variously rated at "450V 32A" or "20A 300V". Confused? You will be. Here is a YouTube video of someone actually burning the things out to test their limits.

In practical use, I've had no problems having about 10 of these things in the same switch. I've also used three in series on the same circuit.

2-way 222 connectors: £13.23 for a pack of 50 @ Screwfix 

3-way 222 connectors: £15.13 for a pack of 50 @ Screwfix

5-way 221 connectors: £13.80 for a pack of 25 @ Screwfix 

Addendum: Thelma quite enjoyed these little devices too. She reports that the 222s, being rounder, are 50% “more chasy” than the “boring” more square 221s. They therefore fly faster when she bats them with her paws to simulate spontaneous movement.

Friday, 25 December 2015

Building a Total Quality Software environment, with Continuous Integration, Unit Testing, and Dependency Injection. And Futurama.

Recently at work, I’ve been working with my colleagues to set up a Total Quality software environment. I’ve been learning a lot from my peers about topics such as VMware, RTI and Code-First EF. (I’d previously used Schema-First, but Code First brings its own advantages and challenges). What I brought to the party was some project experience in: 

  • Continuous Integration platforms (specifically in this case, TeamCity.)
  • Unit Testing and Test-Driven Development techniques.
  • Dependency Injection to support writing testable code.
  • NAnt scripting.
  • Futurama.

We’ll get to that last one in a minute. Let’s go through the others in order first.

Continuous Integration (CI)

Everygeek who’s anynerd is using it these days. But lots of development teams and companies still avoid it, imagining it to be too difficult, too time-consuming, or just not worth the hassle. (For that matter, those same fallacious criticisms can be levelled at every other item in the list above too. Except Futurama.) A decade ago people used to say the same things about Source Control; thankfully there aren’t too many teams I encounter these days that haven’t got their head around how important that is.

Some teams aren’t even sure what CI is, what it does, or what advantages it brings. They’ve always worked by developers just producing software on their own PCs, and they deal with any time-consuming fallout when it comes to making that software work in the real world as part of the cost of doing business.

OK, so here’s the unique selling point if you’re trying to make the case for introducing this where you work. Are you ready? What CI adds to your team’s game is simply this: repeatable, verifiable deployment. 

Unit Testing and Test-Driven Development techniques 

Unit Testing has been around for a Very Long Time. I know a lot of people who are otherwise very good developers but who “don’t see the point” of unit testing. And I have been such a developer myself in the murky past. 

The misconception that unit testing is pointless generally comes down to a few fallacies:

  • They believe that their own code always works.
  • The wider team and stakeholders place more value on quantity of new features than upon quality of existing features.
  • They believe that they will always personally be around to ensure that their code doesn’t get broken in the future.

Like most good fallacies, there’s just enough truth in most of these to preserve the illusion that unit testing doesn’t provide enough advantages to the person that has to implement it. Not when compared to the opportunity costs of them learning how to do it, or the kudos of pushing out new features (that don’t work as intended.)

Part of the reason more developers don’t give it a go is that you have to change the way you write code. Most code I’ve seen in the wild is tightly coupled. This is a phrase that many developers are familiar with, but in my experience vanishingly few know what it means. Basically, it means that if you are writing Class A, and your class depends upon Class B to do its job, your class will instantiate a new instance of Class B itself. This means that if Class B stops working, all you (and your users) know is that your class “doesn’t work.” They won't care that your code is perfect, and that it's just that damn Class B that let you down.
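As a concrete (and deliberately contrived) illustration of that hard-wiring, sketched here in TypeScript for brevity (the class names are taken from the Class A / Class B discussion above, not from any real codebase):

```typescript
// Tightly coupled: ClassA constructs its own ClassB, so a fault in ClassB
// surfaces to users as "ClassA doesn't work", and ClassA cannot be unit
// tested in isolation from ClassB.
class ClassB {
  doJob(): string {
    return "result from B";
  }
}

class ClassA {
  private b = new ClassB(); // the dependency is hard-wired right here

  run(): string {
    return this.b.doJob().toUpperCase();
  }
}

console.log(new ClassA().run()); // "RESULT FROM B"
```

There is no seam here: any test of `ClassA.run()` is unavoidably also a test of `ClassB.doJob()`.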

So, when doing test-driven development, developers need to add another couple of skills to their arsenal. Which brings us to… 

Dependency Injection (DI)

One type of tight coupling is defined above. Code is also tightly coupled when it is too closely tied to one UI. So, if you’re a developer who puts all their business logic in code-behind files or controller actions, your code won’t be testable, because it needs the UI to do its job before it can be verified.

Fortunately, there are frameworks and coding styles out there that help developers implement loose coupling, to make their code independently testable. 

The basic idea behind all of these is that instead of your Class A consuming Class B directly to perform some function, it consumes Interface B instead. That is, some object that Class A doesn’t instantiate itself satisfies some interface that represents the job Class B was doing for Class A. Typically this is achieved by making the constructor of Class A look like this:
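The original screenshot of that constructor hasn't survived here, so below is a minimal sketch of the idea, written in TypeScript for brevity (the interface and class names are illustrative, not from the original; in a .Net project the shape is the same, just in C#):

```typescript
// "Interface B": the job Class A needs done, without naming who does it.
interface IPriceCalculator {
  priceFor(quantity: number): number;
}

// The real implementation ("Class B" in the text above).
class StandardPriceCalculator implements IPriceCalculator {
  priceFor(quantity: number): number {
    return quantity * 10;
  }
}

// "Class A" receives its dependency through its constructor,
// rather than instantiating it itself.
class OrderProcessor {
  constructor(private readonly calculator: IPriceCalculator) {}

  totalFor(quantity: number): number {
    return this.calculator.priceFor(quantity);
  }
}

// In a unit test, a mock implementation isolates OrderProcessor:
class MockPriceCalculator implements IPriceCalculator {
  priceFor(_quantity: number): number {
    return 42; // a canned response, so failures point at OrderProcessor alone
  }
}

const live = new OrderProcessor(new StandardPriceCalculator());
const underTest = new OrderProcessor(new MockPriceCalculator());
console.log(live.totalFor(3));      // 30
console.log(underTest.totalFor(3)); // 42
```

Because `OrderProcessor` never news-up its dependency, a test can hand it a mock and know that any failure lies in `OrderProcessor` itself.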

The above pattern is known as Constructor Injection. What it gives you is the ability to swap out whatever is implementing Interface B when it comes to unit testing Class A. So, instead of the object that really does implement Interface B in live use, you can use what is called a mock instance of Interface B: typically some object that always gives you anticipated responses, so that you can concentrate on testing Class A. That way, any errors you see can be wholly attributed to Class A.

When you write your classes using the Constructor Injection pattern demonstrated above, DI frameworks provide concrete implementations of objects that implement interfaces at runtime. So, you 'magically' find a usable implementation of Interface B available in Class A's constructor. As the developer of Class A, you don't particularly care where that implementation of Interface B comes from; that is the responsibility and concern of the developer of Interface B and your chosen DI framework.
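To make the 'magic' a little less magical, here is a toy resolver sketched in TypeScript (hand-rolled and hypothetical; real frameworks such as Castle Windsor layer lifetimes, nested dependency resolution and configuration on top of this basic idea):

```typescript
// A factory knows how to produce one concrete implementation.
type Factory<T> = () => T;

// A toy container: maps a token to a factory, the way a DI framework
// maps an interface to a concrete implementation at runtime.
class TinyContainer {
  private registrations = new Map<string, Factory<unknown>>();

  register<T>(token: string, factory: Factory<T>): void {
    this.registrations.set(token, factory);
  }

  resolve<T>(token: string): T {
    const factory = this.registrations.get(token);
    if (!factory) throw new Error(`Nothing registered for ${token}`);
    return factory() as T;
  }
}

interface IGreeter {
  greet(name: string): string;
}

class PoliteGreeter implements IGreeter {
  greet(name: string): string {
    return `Good day, ${name}`;
  }
}

// Registration happens once, at application start-up...
const container = new TinyContainer();
container.register<IGreeter>("IGreeter", () => new PoliteGreeter());

// ...and consumers ask for the interface; the container picks the class.
const greeter = container.resolve<IGreeter>("IGreeter");
console.log(greeter.greet("Fry")); // "Good day, Fry"
```

Swapping `PoliteGreeter` for a mock in tests is then a one-line change at registration time, with no change to any consumer.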

This is just one of the techniques that developers moving from code that "just works" need to learn if they want their code to be verifiable. It is difficult to embrace, partly because writing code that "just works" is frankly hard enough, and partly because using these techniques opens up the possibility of developers having to recognise errors in their own code. But unit testing also brings a huge number of advantages: the ability to prove that a given piece of code works, not just at the time of writing but every single time you build, and protection against your work being modified in adverse ways by subsequent developers.

Unit testing and Dependency Injection are whole topics on their own, so I won't say more about them here. (I'll perhaps save that for future blogs.) With regard to understanding tight and loose coupling, though, I'll leave you with an analogy. If a traveller wants to get to some destination, they don’t need to know what the bus driver’s name will be, the vehicle registration, what type of fuel the bus uses, etc. They just need to know what bus stop to be at, what time, and what is the correct bus number to get on. Similarly, Class A doesn’t need to know everything about Class B or where it comes from. It just needs to know that when it requires an object to do some job, one will be available at an agreed time. Class A instantiating Class B itself is analogous to a traveller trying to build their own bus.

Last time I checked, there were something like 22 DI frameworks you can use with .Net. The one I implemented at work recently is Castle Windsor, which I’ve been using for a few years. In benchmark tests it’s not the fastest, it’s not the simplest, and it’s not the most customisable or powerful. But it is the one that, for my money, strikes the right balance between those competing factors. And it integrates particularly well with ASP.Net MVC and Entity Framework.

NAnt Scripting 

Continuous Integration platforms on their own give you a powerful way of automating builds and deployments. However, there are advantages to be gained by farming out some of that work to a more specialised tool. NAnt is one such tool.

For any system that gets developed, there are typically 10-25 individual “jobs” involved in setting up a copy of the system that testers and ultimately users can access. For example, for a web app you might need to:

  • Create some Virtual Directories in IIS.
  • Copy the files that the website is made of into the folders those VDs point at.
  • Customise a web config that tells the site how to access the underlying database.
  • Create the underlying database in SQL Server.
  • Populate the database with data.
  • Create an App Pool in IIS under which the site will run.
  • Grant the relevant App Pool access to the database.

You’d also be well-advised to have steps that involve:

  • Running unit tests, so you don’t deploy broken code.
  • Updating Assembly Information so that each build has an identifying number. That way, bugs can be reported against specific builds.
  • Backing up any prior version so that you can rollback any of the above steps if the deployment fails.

If you put these in a script that lives in your project instead of in build steps on your CI server, you can more easily mirror steps between different branches in your builds. 
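To give a flavour, here is a hedged sketch of what a few of those steps might look like as NAnt targets (the target names, paths and solution file are all illustrative, not from a real project; the `<msbuild>` task comes from the NAntContrib extension):

```xml
<?xml version="1.0"?>
<project name="MyWebApp" default="deploy">

  <!-- Give each build an identifying number (illustrative property;
       a CI server like TeamCity would normally pass this in). -->
  <property name="build.number" value="0.0.0.0" overwrite="false" />

  <target name="build">
    <!-- Compile the solution. -->
    <msbuild project="MyWebApp.sln">
      <property name="Configuration" value="Release" />
    </msbuild>
  </target>

  <target name="test" depends="build">
    <!-- Run the unit tests, so broken code never gets deployed. -->
    <exec program="nunit3-console.exe">
      <arg value="MyWebApp.Tests\bin\Release\MyWebApp.Tests.dll" />
    </exec>
  </target>

  <target name="deploy" depends="test">
    <!-- Copy the site files into the folder IIS points at. -->
    <copy todir="C:\inetpub\MyWebApp">
      <fileset basedir="MyWebApp\bin\Release">
        <include name="**/*" />
      </fileset>
    </copy>
  </target>

</project>
```

Because the script lives in source control alongside the code, every branch carries its own deployment recipe, and the CI server's job shrinks to "check out, run NAnt".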


Futurama

One of the things that motivates me is getting to have a bit of fun whilst I work. In the team I joined a few months ago, there has been one common theme tying all of the above threads together: Futurama.

My colleagues and I have set up about 10 Windows Server 2012 machines that perform various jobs. For example, one of them is a Domain Controller; another is our CI server. Several more act as paired web and SQL servers that can be temporarily allocated to testing, by internal testers or by end users, or used by developers to test the deployment process.

Each of our VMs is named after a Futurama character and has its own distinct colour scheme. (NB: They have a fully-qualified name too, like DVL-SQLALPHA, that describes their actual role.) This helps developers stay oriented when RDP-ing around what would otherwise be nearly-identical machines. It’s also fun.  

You saw how TeamCity / Professor Farnsworth looked above. This is how one of our Web Servers, characterised after Zapp Brannigan, looks. As you can see, it's easy to tell which VM you're on, even from a distance:


There are Futurama-themed Easter Eggs hidden in other parts of our build process too. For example, each CI build produces a log file, at the end of which the build gets reported as “Successful” or “Failed” for some detailed reason. One recent evening, in my own time, I wanted to test implementing custom NAnt functions. (NAnt is written in C#, and you can write functions in C# to augment what it does.) In order to test this with something non-critical, I augmented that custom “Success” or “Failure” method thus:

The exact piece of ASCII art that gets rendered reflects whether the build was successful or not, and is semi-random. So, you might get Hermes with a brain slug saying something dumb if the build is broken. Or you might get Professor Farnsworth announcing “Good news, everyone!” if all went as planned.

These 'features' are of course whimsical. But at worst they give developers a smile during some of the tougher moments of the job. And at best they give you a chance to test out new techniques on non-critical features. As well as giving your brain a rest between more intensive tasks.

The best teams I’ve worked with all knew their onions on a technical level, but also knew when to have fun. I'm glad to be working in such a team at present. For example, I recently implemented the following function:

My colleague Ian made me chuckle when I discovered this in our code repository a few weeks later:

Saturday, 10 October 2015

Product Review - Wrappz Laptop Decals and Custom Phone Skins

Like many developers, I get my money's worth out of the laptops I buy. Sometimes it seems I use them every minute of the day. And, over the years, I've accumulated quite a collection of physical machines in addition to the various VMs I have carrying out miscellaneous tasks around the house.

I secretly love the obviously-marketed-at-women ones that have cases made of pink brushed aluminium and the like. But, also being a professional developer, I have to say that almost always that fancy case comes accompanied by last year's technology (or older). It simply isn't a good business decision to buy them once you review the spec.

As a wise philosopher once said, I'm a Barbie Girl in a Barbie World.

So, I often end up buying machines that have phenomenally fast dual/quad processors with acres of RAM, capable of running lots of memory-intensive applications concurrently. And I switch out the standard platter drive for a 1TB SSD. (I usually also swap out the optical drive for a second 1TB SSD. 2TB SSDs have now become available too, though I've not yet had a chance to get my hands on one, so my next dev machine will have 4TB total; but that's really a different review.)

Anyway, for some reason the most performant laptops always seem to come in boring black boxes. When you acquire a few of these over the years, it becomes difficult to tell them apart. So for a few years now I've been putting decals on the back and naming the machine according to which decal adorns it. I also make the login screen and desktop background of these machines match the decal. This all helps keep you oriented when you're navigating around, RDP-ing from one machine to another.

TaylorHe Decals

Up to now, my go-to decal manufacturer has been TaylorHe. They do some very nice pre-made patterns that suit almost every taste. With my new work laptop, however, I fancied doing something a bit more bespoke. I'm a huge Breaking Bad fan, so I wanted a machine that had a theme related to that. 

Since the device that I usually take pictures with is in this photo, my friend Ian O'Friel kindly helped me take this, which made it a much better photo than it would otherwise have been, as he has a real eye for photography. (You can see more of Ian's fab photos here. I particularly like the one of the old rusty gate and the South Side At Night.)

Looking around, I found a company called Wrappz that provides exactly this type of product. Not only do they produce decals with custom designs, but they also print them on a custom-sized sheet, so you don't have to trim them to fit your machine. This may seem like a small advantage, but it was nice to be able to use them straight out of the box rather than messing around with a scalpel or scissors.

Like TaylorHe, Wrappz also do custom phone cases. So I got one of those to match the decal. (Not that I actually own a phone, incidentally - I'm one of the few people I know that doesn't use one, and doesn't miss it. I have a Samsung Galaxy S2 'phone', but it acts as a personal organiser rather than as a communication device. I only put a temporary SIM in it when I have a reason to, which is almost never.)

If you want to order some of these decals / phone cases for yourself, here are some discount codes you can use to get them more cheaply. NB: I've got no commercial relationship with Wrappz, and I haven't benefitted in any way from this review. Also, I won't know whether anyone has used these codes:
Wrappz discount codes

More Wrappz discount codes

Last small tip for those who, like me, have multiple laptops in their network to access. You can place the name of each machine in the Task Bar by creating a new Toolbar, and calling it "\\%computername%", as described here. It makes it amazingly easy to see which machine you're on, even if you have a full-screen program running, and even if you're accessing it from another physical device.

Computer Name on Task Bar

Sunday, 14 December 2014

Product Review - LED Lenser LED7299R H14R.2 Rechargeable Head Torch

I bought one of these for running during the Winter months, when you inevitably find yourself having to make some runs in the dark or twilight.

There are plenty of options out there - ranging from an offering at £5 from Tesco, right the way through to Hollis Canister diving head torches at £800. Obviously, there’s a trade off between getting what you pay for, choosing a light that’s suitable to your purpose, and not spending more than you need to.

After checking out other reviews for several different options, I opted for the LED Lenser LED7299R H14R.2 Rechargeable Head Torch. You can spend anything from £90 to £130 depending on where and when you choose to buy this model. There’s also a similar-but-cheaper model in the same range that isn’t rechargeable. (No reason that you couldn’t buy separate rechargeable batteries of course.) However, I liked the convenience of having the recharging unit built in. It can alternatively take four conventional AA batteries, which you can use as a backup.

For running, it was important that the torch had enough light output to see in pitch darkness on unlit trails with occasional tree cover that blocks ambient light. It was also important that it was comfortable to run with. A lot of runners recommended the Petzl range of head torches, and I can see why: they're a lot lighter than the one I chose (whilst at the same time being a lot dimmer - typically about a third to a quarter of the light output.) My main criticism of the LED Lenser H14R.2 is that it can feel a bit hard and uncomfortable on your head, particularly the front torch holder. A softer, more padded material behind the lamp would have made it much more usable. As it is, it's more comfortable with a beanie hat underneath, but I wouldn't fancy trying to run with it overnight in the Summer, when a hat would make you overheat.

In terms of light output, it was difficult to find reliable information. The minimum light output was fairly consistently reported by various sources to be 60 Lumens. The product box and the site where I bought it both say the maximum output is 850 Lumens, but other sources quoted figures as low as 260 to 350 Lumens. There appears, therefore, to be some confusion about what is meant by "maximum". The torch has a 'boost' setting that increases brightness for 10 seconds at a time; the alternative definition is the maximum brightness that the torch can maintain continuously. I suspect this distinction accounts for many of the differences reported by different sources.

60 Lumens is about as good as the majority of the Petzl range. The brightest setting of the H14R.2, whatever its real value in Lumens, produces a very bright light that is uncomfortable to look at directly. The very highest setting (the "boost" setting) only stays on for 10 seconds at a time, so most of the time I used it at the highest 'stable' setting.

On that highest constant-current setting, the light can be diffused over an area about 5m wide and 10m deep directly in front of you. You can also elect to have a narrower but more intense beam. The specs say it will project light up to about 260m. I found that not to be the case, though I did stick to the "wide and bright" setting throughout my run. Perhaps the boost setting, combined with the narrowest beam, would momentarily illuminate the quoted 260m for 10 seconds at a time; I didn't test that, because such a brief, narrow burst of brightness isn't relevant for my use case, or for many others I can imagine. I did test the range on the maximum consistent setting combined with a wide beam when I returned to my car. Whilst that setting is quite good enough for running or walking in the pitch dark - you can see what's immediately in front of you - the light didn't even make it across to the trees at the far end of the 100m or so car park I was in. I'll try it again on the "narrow beam, temporary boost" setting during my next night run. I suspect the specs are technically correct and that objects can be illuminated at that distance, albeit briefly, and only with a beam that's about 1m wide. It's for the reader to decide whether that performance meets their actual needs.

I found the light was good enough for my use case. I ran during astronomical twilight (the third darkest phase of the night; pretty much pitch black for the purposes of this test.) Without the torch, I would just about have been able to see my hand in front of my face in open ground, but not the path I was running on. On stretches covered by trees, it would have been completely dark. As it was, I missed a pothole in a forested section (once on the way out, and once on the way back.) I couldn't see how I'd done this at the time, as I felt I'd been seeing the path well enough to run at a normal pace. However, I stumbled at the exact same spot the very next day, in daylight. So it just appears to be a particularly well-camouflaged pothole, rather than a failing of the torch.

The final lighting feature of note is the rear red light that you can turn on to let traffic and cyclists see you more easily. I thought that was a nice little safety feature, although there's no real way to tell whether it's on or off once you're wearing the torch, and the button is very sensitive. Other non-lighting features include a battery-power indicator (the rear LED glows red, amber or green for five seconds when you switch it on, to let you know how charged the battery is.) I've used mine for less than an hour so far, and it's still in the green from its first charge. I'll update this review with how long a full charge lasts when I've gone through a full cycle. Lastly, you can detach the battery pack (and the front torch itself, if you want) and wear them as a belt attachment. I personally prefer the light being cast wherever I'm looking, and didn't find the battery pack intrusive where it was, so haven't used this option.

The last point I want to note about this product isn't about the torch itself. It's about the user manual that comes with it. For a top-of-the-range piece of kit, the quality of the instruction manual translation leaves a lot to be desired. It's some of the worst Deutsch-glish I've ever seen. Take this excerpt for example:

It's so bad that at first I thought I might have been sent a fake item, since I couldn't imagine any self-respecting manufacturer allowing such a poorly-translated document to accompany their product. But the bona fides of the supplier I used checked out. And, checking LED Lenser's own website, it seems they've simply done a very bad job of translating the user manual of an otherwise very good product. You can read the full manual (downloaded from LED Lenser's US site) for yourself here.

All-in-all, I'm glad I bought this piece of kit. It's good enough for what I need it for. The head harness could be a little more comfortable, but it's very usable for its intended purpose nonetheless. I feel the Petzl and other cheaper options would probably not have been bright enough for my needs, and the more expensive options, while brighter still, are designed for diving rather than for use out of the water.

Not a bad purchase : 7/10