Monday, September 30, 2013

7867.2

Being the cumulative mileage at the end of the month -- 255.7 miles for September despite the damp patch in the second week; 2275 (2707) miles year-to-date and just under 1200 miles on the current chain (which is getting close to needing replacement).

So 3000 miles total for the year looks easy, but 3000 on my bike (or even 8500 on the clock) would require keeping up that rate for the rest of the year, which isn't on with the clocks going back in under four weeks.

Sunday, September 29, 2013

C++ enum.toString() and then some -- or "preprocessor metaprogramming is like, 'wow!', man"

So I have a simple enough problem. I have a series of named data objects

class Base {
    const wchar_t * _name;
public:
    Base(const wchar_t * name) : _name(name) {}
    const wchar_t * name(void) const { return _name; }
};

template<class T>
class D : public Base {
public:
    D(const wchar_t * name, const T &init) : Base(name), value(init) {}
    T value;
};

and a container class

#include <array>
#include <memory>

class Items
{
public:
    Items(void);
    virtual ~Items(void);
    void Initialise(void);
    // COUNT comes from the ENUMS list via COUNT_ENUMS (see below)
    std::array<std::shared_ptr<Base>, COUNT> values;
};

where there are COUNT of these items, each with a name, a type and a default value. The Initialise method exists so that heap allocations are done outside the constructor -- it simply fills the array with appropriately typed named values holding their defaults -- and I want client code to be able to index the array with symbolic names.

Plain enums don't have a to-string (though there are tricks to auto-associate names with strings by simple preprocessor metaprogramming); to get a compile-time count there's the old stand-by of putting an extra dummy member at the end and a big fat comment about inserting new values above it.
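Both of those stand-bys have a familiar shape, something like this (a minimal sketch with illustrative names -- Item, ItemNames, ItemCount and so on are not from the project):

#define ITEM_LIST \
    X(First)      \
    X(Second)     \
    X(Third)

enum Item {
#define X(name) name,
    ITEM_LIST
#undef X
    ItemCount // dummy member -- insert new values ABOVE this line
};

// the same list, re-expanded to auto-associate names with the enumerators
static const char * const ItemNames[] = {
#define X(name) #name,
    ITEM_LIST
#undef X
};
// ItemNames[Second] == "Second", ItemCount == 3

That gives names and a count, but nothing ties a type and a default value to each entry.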

C++11 enum classes don't give me anything over the older enums here. There are various enum-like classes out there, in varying degrees of complexity; these give type-safety (which I don't really need in this context), could be extended with extra data, can be iterated over -- but don't give a compile-time count.

Time to break out Boost for some industrial grade tooling, and be very glad that my compiler supports variadic macros.

I want to define my items one time, one place, no extras like

#define ENUMS \
((First, int, 0), \
(Second, bool, false), \
(Third, int, 42))

Let's start with basing our enum representation off the Boost MPL wrapper for size_t

#define MAKE_ENUM(N, NAME, T, DEFAULT) \
class NAME : public boost::mpl::size_t<N> { \
public: \
typedef T rep_type; \
static const wchar_t * const label; \
static const T default_value; \
};

with static initialization

#define _L(NAME) \
BOOST_PP_CAT(L, NAME)
#define INIT_ENUM_STATIC(N, NAME, T, DEFAULT) \
const wchar_t * const NAME::label = _L(BOOST_PP_STRINGIZE(NAME)); \
const T NAME::default_value = DEFAULT;

and from there it's just a case of using preprocessor metaprogramming to put the bits together, iterating over the definitions to generate all the types, their static initialization and the array initialization:

#define MAKE_ENUM_I(R, DATA, N, ENUM_TUPLE) \
MAKE_ENUM(N, BOOST_PP_TUPLE_ELEM(0, ENUM_TUPLE), BOOST_PP_TUPLE_ELEM(1, ENUM_TUPLE), BOOST_PP_TUPLE_ELEM(2, ENUM_TUPLE))
#define INIT_ENUM_STATIC_I(R, DATA, N, ENUM_TUPLE) \
INIT_ENUM_STATIC(N, BOOST_PP_TUPLE_ELEM(0, ENUM_TUPLE), BOOST_PP_TUPLE_ELEM(1, ENUM_TUPLE), BOOST_PP_TUPLE_ELEM(2, ENUM_TUPLE))
#define INIT_ENUM(R, ARRAY, ELEM) \
ARRAY[ELEM()] = std::shared_ptr<D<ELEM::rep_type>>(new D<ELEM::rep_type>(ELEM::label, ELEM::default_value));
#define INIT_ENUM_T(R, ARRAY, ELEM) \
INIT_ENUM(R, ARRAY, BOOST_PP_TUPLE_ELEM(0, ELEM))
#define MAKE_ENUMS(ENUM_TUPLE) BOOST_PP_LIST_FOR_EACH_I(MAKE_ENUM_I, nullptr, BOOST_PP_TUPLE_TO_LIST(ENUM_TUPLE))
#define INIT_ENUMS(ENUM_TUPLE) BOOST_PP_LIST_FOR_EACH_I(INIT_ENUM_STATIC_I, nullptr, BOOST_PP_TUPLE_TO_LIST(ENUM_TUPLE))
#define COUNT_ENUMS(ENUM_TUPLE) BOOST_PP_TUPLE_SIZE(ENUM_TUPLE)
#define INITIALISE(ENUM_TUPLE, ARRAY) BOOST_PP_LIST_FOR_EACH(INIT_ENUM_T, ARRAY, BOOST_PP_TUPLE_TO_LIST(ENUM_TUPLE))
#define INITIALISE_ENUMS(ARRAY) INITIALISE(ENUMS, ARRAY)

Then in the Items.h header file where I defined D and Items, realise the common parts:

#define COUNT COUNT_ENUMS(ENUMS)
MAKE_ENUMS(ENUMS)
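With the ENUMS list above, that boils down to roughly the following (hand-expanded for illustration, and tidied from the one-line-per-macro preprocessor output):

#define COUNT 3 // COUNT_ENUMS(ENUMS), i.e. BOOST_PP_TUPLE_SIZE(ENUMS)

class First : public boost::mpl::size_t<0> {
public:
    typedef int rep_type;
    static const wchar_t * const label;
    static const int default_value;
};
class Second : public boost::mpl::size_t<1> {
public:
    typedef bool rep_type;
    static const wchar_t * const label;
    static const bool default_value;
};
class Third : public boost::mpl::size_t<2> {
public:
    typedef int rep_type;
    static const wchar_t * const label;
    static const int default_value;
};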

and the Items.cpp file realises the initialisations

// define enum statics
INIT_ENUMS(ENUMS)

void Items::Initialise(void)
{
    // default-fill array
    INITIALISE_ENUMS(values)
}
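Hand-expanded in the same way, the statics and the body of Initialise come out as roughly:

// INIT_ENUMS(ENUMS)
const wchar_t * const First::label = L"First";
const int First::default_value = 0;
const wchar_t * const Second::label = L"Second";
const bool Second::default_value = false;
const wchar_t * const Third::label = L"Third";
const int Third::default_value = 42;

// INITIALISE_ENUMS(values) -- the boost::mpl::size_t base supplies the conversion
// to size_t, so First() indexes slot 0, Second() slot 1, and so on
values[First()] = std::shared_ptr<D<First::rep_type>>(new D<First::rep_type>(First::label, First::default_value));
values[Second()] = std::shared_ptr<D<Second::rep_type>>(new D<Second::rep_type>(Second::label, Second::default_value));
values[Third()] = std::shared_ptr<D<Third::rep_type>>(new D<Third::rep_type>(Third::label, Third::default_value));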

and consumers have tolerable symbolic-constant access to the values

auto x = *(D<int> *)(items.values[Third()].get());
std::wcout << x.name() << L" = " << x.value << std::endl;

while extending the list only has to happen in one place.
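If the raw cast in client code grates, one possible convenience -- not part of the original, just a sketch -- is a typed getter keyed on the tag class:

// hypothetical helper, not in the code above: fetch the typed item by tag class
template <typename E>
D<typename E::rep_type> & item(Items & items)
{
    return *static_cast<D<typename E::rep_type> *>(items.values[E()].get());
}

// so the consumer code above becomes e.g.
// std::wcout << item<Third>(items).name() << L" = " << item<Third>(items).value << std::endl;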

I'm just not sure if it's a bit too magic for the purpose.

Wednesday, September 25, 2013

Five finger exercise -- a more complex cmdlet in C++/CLI

Moving on from the previous post, this ports the main cmdlet example to C++/CLI -- adapting it to later PowerShell behaviour (the original code assumed that provider path resolution would not throw on non-existent items, so some exception handling needed to be added ahead of the fallback check for the file existing), making it pass the MSFT FxCop rules for PowerShell (fixing the verb, class and namespace in the main), allowing it to be localized (through resource files as well as through the PowerShell look-up mechanism), and so forth...

I let Visual Studio generate the class outline, so there's a slightly pointless split into header file and implementation, but that aside, it's only the little bits of baroque syntax that really distinguish it from the C# equivalent. So perhaps I shouldn't forget about the language quite so much.

#pragma once
using namespace System;
using namespace System::IO;
using namespace System::Management::Automation;
namespace PSBook { namespace Commands
{
[Cmdlet(VerbsCommon::Set, "FileTouchTime", DefaultParameterSetName = "Path", SupportsShouldProcess = true, ConfirmImpact = ConfirmImpact::Medium)]
public ref class SetFileTouchTimeCommand :
public PSCmdlet
{
public:
SetFileTouchTimeCommand(void);
[Parameter(ParameterSetName = "Path", Mandatory = true, Position = 1,
ValueFromPipeline = true, ValueFromPipelineByPropertyName = true)]
property String ^ Path;
[Parameter(ParameterSetName = "FileInfo", Mandatory = true, Position = 1,
ValueFromPipeline = true)]
property FileInfo ^ FileInfo;
[Parameter]
property DateTime Date;
protected:
virtual void ProcessRecord(void) override;
private:
void TouchFile(System::IO::FileInfo ^ fileInfo);
void HandleFileNotFound(String ^ path, Exception ^ exception);
Resources::ResourceManager ^ rm;
};
}}

#include "TouchFileCommand.h"
namespace PSBook { namespace Commands
{
SetFileTouchTimeCommand::SetFileTouchTimeCommand(void)
{
Date = DateTime::Now;
rm = gcnew Resources::ResourceManager(L"PowershellCpp.Messages", GetType()->Assembly);
}
void SetFileTouchTimeCommand::ProcessRecord(void)
{
if (FileInfo != nullptr)
{
TouchFile(FileInfo);
// the FileInfo and Path parameter sets are mutually exclusive, so we are done
return;
}
ProviderInfo ^ provider = nullptr;
try
{
auto resolvedPaths = GetResolvedProviderPathFromPSPath(Path, provider);
for each (String ^ path in resolvedPaths)
{
if (File::Exists(path))
{
auto info = gcnew System::IO::FileInfo(path);
TouchFile(info);
}
else
{
HandleFileNotFound(path, nullptr);
return;
}
}
}
catch (ItemNotFoundException ^ nf)
{
HandleFileNotFound(Path, nf);
}
}
void SetFileTouchTimeCommand::HandleFileNotFound(String ^ path, Exception ^ exception)
{
auto message = String::Format(
System::Globalization::CultureInfo::CurrentCulture,
rm->GetString("FileNotFound"), path);
auto ae = gcnew ArgumentException(message, exception);
auto error = gcnew ErrorRecord(ae, "FileNotFound", ErrorCategory::ObjectNotFound, path);
WriteError(error);
}
void SetFileTouchTimeCommand::TouchFile(System::IO::FileInfo ^ fileInfo)
{
if(ShouldProcess(fileInfo->FullName, String::Format(
System::Globalization::CultureInfo::CurrentCulture,
rm->GetString("ConfirmString"), Date)))
{
try
{
fileInfo->LastWriteTime = Date;
}
catch (UnauthorizedAccessException ^ uae)
{
auto error = gcnew ErrorRecord(uae, "UnauthorizedFileAccess", ErrorCategory::PermissionDenied, fileInfo->FullName);
auto detail = String::Format(
System::Globalization::CultureInfo::CurrentCulture,
rm->GetString("AccessDenied"),
fileInfo->FullName);
error->ErrorDetails = gcnew ErrorDetails(detail);
WriteError(error);
return;
}
WriteObject(fileInfo);
}
}
}}

<?xml version="1.0" encoding="utf-8"?>
<root>
<!--
Microsoft ResX Schema
Version 2.0
The primary goals of this format is to allow a simple XML format
that is mostly human readable. The generation and parsing of the
various data types are done through the TypeConverter classes
associated with the data types.
Example:
... ado.net/XML headers & schema ...
<resheader name="resmimetype">text/microsoft-resx</resheader>
<resheader name="version">2.0</resheader>
<resheader name="reader">System.Resources.ResXResourceReader, System.Windows.Forms, ...</resheader>
<resheader name="writer">System.Resources.ResXResourceWriter, System.Windows.Forms, ...</resheader>
<data name="Name1"><value>this is my long string</value><comment>this is a comment</comment></data>
<data name="Color1" type="System.Drawing.Color, System.Drawing">Blue</data>
<data name="Bitmap1" mimetype="application/x-microsoft.net.object.binary.base64">
<value>[base64 mime encoded serialized .NET Framework object]</value>
</data>
<data name="Icon1" type="System.Drawing.Icon, System.Drawing" mimetype="application/x-microsoft.net.object.bytearray.base64">
<value>[base64 mime encoded string representing a byte array form of the .NET Framework object]</value>
<comment>This is a comment</comment>
</data>
There are any number of "resheader" rows that contain simple
name/value pairs.
Each data row contains a name, and value. The row also contains a
type or mimetype. Type corresponds to a .NET class that support
text/value conversion through the TypeConverter architecture.
Classes that don't support this are serialized and stored with the
mimetype set.
The mimetype is used for serialized objects, and tells the
ResXResourceReader how to depersist the object. This is currently not
extensible. For a given mimetype the value must be set accordingly:
Note - application/x-microsoft.net.object.binary.base64 is the format
that the ResXResourceWriter will generate, however the reader can
read any of the formats listed below.
mimetype: application/x-microsoft.net.object.binary.base64
value : The object must be serialized with
: System.Runtime.Serialization.Formatters.Binary.BinaryFormatter
: and then encoded with base64 encoding.
mimetype: application/x-microsoft.net.object.soap.base64
value : The object must be serialized with
: System.Runtime.Serialization.Formatters.Soap.SoapFormatter
: and then encoded with base64 encoding.
mimetype: application/x-microsoft.net.object.bytearray.base64
value : The object must be serialized into a byte array
: using a System.ComponentModel.TypeConverter
: and then encoded with base64 encoding.
-->
<xsd:schema id="root" xmlns="" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata">
<xsd:import namespace="http://www.w3.org/XML/1998/namespace" />
<xsd:element name="root" msdata:IsDataSet="true">
<xsd:complexType>
<xsd:choice maxOccurs="unbounded">
<xsd:element name="metadata">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="value" type="xsd:string" minOccurs="0" />
</xsd:sequence>
<xsd:attribute name="name" use="required" type="xsd:string" />
<xsd:attribute name="type" type="xsd:string" />
<xsd:attribute name="mimetype" type="xsd:string" />
<xsd:attribute ref="xml:space" />
</xsd:complexType>
</xsd:element>
<xsd:element name="assembly">
<xsd:complexType>
<xsd:attribute name="alias" type="xsd:string" />
<xsd:attribute name="name" type="xsd:string" />
</xsd:complexType>
</xsd:element>
<xsd:element name="data">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="value" type="xsd:string" minOccurs="0" msdata:Ordinal="1" />
<xsd:element name="comment" type="xsd:string" minOccurs="0" msdata:Ordinal="2" />
</xsd:sequence>
<xsd:attribute name="name" type="xsd:string" use="required" msdata:Ordinal="1" />
<xsd:attribute name="type" type="xsd:string" msdata:Ordinal="3" />
<xsd:attribute name="mimetype" type="xsd:string" msdata:Ordinal="4" />
<xsd:attribute ref="xml:space" />
</xsd:complexType>
</xsd:element>
<xsd:element name="resheader">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="value" type="xsd:string" minOccurs="0" msdata:Ordinal="1" />
</xsd:sequence>
<xsd:attribute name="name" type="xsd:string" use="required" />
</xsd:complexType>
</xsd:element>
</xsd:choice>
</xsd:complexType>
</xsd:element>
</xsd:schema>
<resheader name="resmimetype">
<value>text/microsoft-resx</value>
</resheader>
<resheader name="version">
<value>2.0</value>
</resheader>
<resheader name="reader">
<value>System.Resources.ResXResourceReader, System.Windows.Forms, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089</value>
</resheader>
<resheader name="writer">
<value>System.Resources.ResXResourceWriter, System.Windows.Forms, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089</value>
</resheader>
<data name="AccessDenied" xml:space="preserve">
<value>Unable to touch file '{0}'; check whether it is read-only or otherwise inaccessible to you.</value>
</data>
<data name="ConfirmString" xml:space="preserve">
<value>Touch file last write time to {0}</value>
</data>
<data name="FileNotFound" xml:space="preserve">
<value>File '{0}' was not found</value>
</data>
</root>

Monday, September 23, 2013

Five finger exercise -- simple cmdlet in C++/CLI

I tend to forget that there are other .net languages than F# (oh, and the one that pays my salary); in particular, after some traumatic encounters with the original Managed C++, that C++/CLI is there and isn't too bad, especially when significant native interop is required (where even F#'s P/Invoke starts to drown in attributes). But for hardcore managed stuff...?

So I've been playing with some native code again, and was thinking of surfacing the functionality in PowerShell, since GUI programming is pretty tedious (I've set projects aside for half a year or more when the next step is to include a tree view control). My first thought was to do that in F#, as I did ages ago, back when you needed snap-ins and there were glitches in how F# compiled against that particular object inheritance tree.

But then, I thought, there's no need to be so polyglot, even if C++/CLI is a bit like Geordie when compared to Standard C++'s RP. So, to practice with the somewhat distractingly different keyword placement, here's the very first example from the old (and out of print) Wrox book

using namespace System;
using namespace System::Management::Automation;
using namespace System::ComponentModel;
namespace PSBook { namespace Commands
{
[RunInstaller(true)]
public ref class PSBookChapter2MySnapIn : PSSnapIn
{
public:
// Name for the PowerShell snap-in.
virtual property String ^ Name
{
String ^ get() override
{
return "...";
}
}
// Vendor information for the PowerShell snap-in.
virtual property String ^ Vendor
{
String ^ get() override
{
return "...";
}
}
// Description of the PowerShell snap-in
virtual property String ^ Description
{
String ^ get() override
{
return "This is a sample PowerShell snap-in";
}
}
};
// Code to implement cmdlet Write-Hi
[Cmdlet(VerbsCommunications::Write, "HI")]
[System::Diagnostics::CodeAnalysis::SuppressMessage("Microsoft.PowerShell", "PS1101:AllCmdletsShouldAcceptPipelineInput", Justification = "No valid input")]
public ref class WriteHICommand : Cmdlet
{
protected:
virtual void ProcessRecord() override
{
WriteObject("Hi, World!");
}
};
// Code to implement cmdlet Write-Hello
[Cmdlet(VerbsCommunications::Write, "Hello")]
[System::Diagnostics::CodeAnalysis::SuppressMessage("Microsoft.PowerShell", "PS1101:AllCmdletsShouldAcceptPipelineInput", Justification = "No valid input")]
public ref class WriteHelloCommand : Cmdlet
{
protected:
virtual void ProcessRecord() override
{
WriteObject("Hello, World!");
}
};
}}

and, yes, snap-ins are old-tech, but they provide examples of how to override property definitions (in the simplest form, without mapping names). This has also been tweaked to be FxCop clean, including for the MSFT PowerShell rules.

Of course this was the point where I discovered that C++/CLI in VS2010 will only target .net 4, and I was writing this on my old Vista laptop, without space for back-version installs, and had to use the old familiar hack. But, that aside, it all went through most gratifyingly.

Next, to port one of the real examples, to see what other joy is involved.

Sunday, September 22, 2013

.net under the covers — using/Using/use

Something I had occasion to look at the other day was exactly how the auto-disposal mechanism actually works in the out-of-the-box .net languages, and there are some interesting quirks to be found. Take the equivalent sample code fragments

using (var r = new StringReader(text))
{
Console.WriteLine(r.ReadToEnd());
}
Using r As New StringReader(text)
Console.WriteLine(r.ReadToEnd())
End Using
use r = new StringReader(text)
printfn "%A" <| r.ReadToEnd()

and for completeness, C++/CLI stack-based disposal

{
StringReader r(text);
Console::WriteLine(r.ReadToEnd());
}

All languages and configurations render this as a try {} finally { Dispose() } construction (apart from stack-based C++/CLI semantics, which generate a try {} fault { Dispose() } followed by a plain Dispose() call -- what ILSpy renders as try {} catch { Dispose(); throw; } plus Dispose()), but there are devils in the details. In release mode, C# and VB compile to the same IL -- just dispose if the used value is not null

finally
{
IL_001a: ldloc.1
IL_001b: brfalse.s IL_0023
IL_001d: ldloc.1
IL_001e: callvirt instance void [mscorlib]System.IDisposable::Dispose()
IL_0023: endfinally
} // end handler

whereas F# compiles to

finally
{
IL_002c: ldloc.0
IL_002d: isinst [mscorlib]System.IDisposable
IL_0032: stloc.3
IL_0033: ldloc.3
IL_0034: brfalse.s IL_003f
IL_0036: ldloc.3
IL_0037: callvirt instance void [mscorlib]System.IDisposable::Dispose()
IL_003c: ldnull
IL_003d: pop
IL_003e: endfinally
IL_003f: ldnull
IL_0040: pop
IL_0041: endfinally
} // end handler
// which ILSpy decompiled to
finally
{
IDisposable disposable = r as IDisposable;
if (disposable != null)
{
disposable.Dispose();
}
}

which is unexpected, as the compiler enforces that the subject of a use binding is an IDisposable anyway -- but, importantly, the null test is made on the result of the isinst, not on the original reference.


In debug mode, all three languages differ; C# flips the test around

finally
{
IL_001e: ldloc.1
IL_001f: ldnull
IL_0020: ceq
IL_0022: stloc.2
IL_0023: ldloc.2
IL_0024: brtrue.s IL_002d
IL_0026: ldloc.1
IL_0027: callvirt instance void [mscorlib]System.IDisposable::Dispose()
IL_002c: nop
IL_002d: endfinally
} // end handler

VB does something a bit more like F#

finally
{
IL_001f: ldloc.1
IL_0020: ldnull
IL_0021: ceq
IL_0023: ldc.i4.0
IL_0024: ceq
IL_0026: stloc.2
IL_0027: ldloc.2
IL_0028: brfalse.s IL_0031
IL_002a: ldloc.1
IL_002b: callvirt instance void [mscorlib]System.IDisposable::Dispose()
IL_0030: nop
IL_0031: nop
IL_0032: endfinally
} // end handler
// which ILSpy decompiled to (not recognising it as a Using block even when decompiling to VB!)
finally
{
bool flag = r != null;
if (flag)
{
((IDisposable)r).Dispose();
}
}

and F# just adds one of its usual bursts of gratuitous branching in the middle

finally
{
IL_0029: ldloc.0
IL_002a: isinst [mscorlib]System.IDisposable
IL_002f: stloc.s 4
IL_0031: ldloc.s 4
IL_0033: brfalse.s IL_0037
IL_0035: br.s IL_0039
IL_0037: br.s IL_0043
IL_0039: ldloc.s 4
IL_003b: callvirt instance void [mscorlib]System.IDisposable::Dispose()
IL_0040: ldnull
IL_0041: pop
IL_0042: endfinally
IL_0043: ldnull
IL_0044: pop
IL_0045: endfinally
} // end handler

The C++/CLI code can rely on the stack-based object not being null, so eschews any checks

fault
{
IL_0022: ldloc.0
IL_0023: callvirt instance void [mscorlib]System.IDisposable::Dispose()
IL_0028: endfinally
} // end handler
IL_0029: ldloc.0
IL_002a: callvirt instance void [mscorlib]System.IDisposable::Dispose()
// which ILSpy decompiled to
catch
{
((IDisposable)r).Dispose();
throw;
}
((IDisposable)r).Dispose();

Tuesday, September 17, 2013

Harvest Home

The last of the plums have been gathered in, and the freezer isn't quite bursting at the seams. The peak of the crop came a couple of weeks after the usual Bank Holiday weekend date, showing how the late spring has affected things. OTOH, the fruit were all on the large side, with few of the thumb-size specimens that usually form the last vestiges of the crop.

Saturday, September 14, 2013

Should.BeAvailableInF#

Since I first heard of it, I've found the Should Assertions library convenient for the driver programs that stand in for more formal unit tests for code fragments that I want to blog. However, such main programs have to be written in C#, because you can't access C# extension methods on generic open types from F# -- which is exactly what the general run of Should extensions are.

So, you could write F# code like

Should.ObjectAssertExtensions.ShouldEqual(outstring, instring)

but that is pretty ugly. And F# type extensions are defined against a specific class, so won't be any use. Which leaves us with generic functions as the best way to wrap the C# extension methods, but

shouldEqual outstring instring

isn't much of an improvement.

The approach that brings us closer to the C# feel would be to make these methods infix operators.

In an ideal world, we'd use ⊦ (U+22A6 ASSERTION) as a lead character, but that's not yet available without forking the compiler, so I chose to write the functions in the form |-{op}, doubling the first character of the {op} for the versions that take custom comparison objects, adding a trailing % for variants taking a message, and using !- as an introducer for unary operations like !-/, where / alludes to ∅ (U+2205 EMPTY SET) for ShouldBeNull.

Apart from the object/generic assertions, the other type that needs transformation across the language barrier is the ShouldThrow<T> extension on Action, which can be wrapped as a generic function taking a unit -> unit and using it to create a delegate for the C# world.

Putting it all together we have module fhould, a pun on f for F# and on the old-fashioned cursive long 's', as in ſhould:

namespace Tinesware.FSharp

open System
open System.Collections.Generic
open Should

module fhould =
    let inline (|->) x y =
        ObjectAssertExtensions.ShouldBeGreaterThan(x, y)
    let inline (|->>) (x:'a) (y:'a) (z:IComparer<'a>) =
        ObjectAssertExtensions.ShouldBeGreaterThan(x, y, z)
    let inline (|->=) x y =
        ObjectAssertExtensions.ShouldBeGreaterThanOrEqualTo(x, y)
    let inline (|->>=) (x:'a) (y:'a) (z:IComparer<'a>) =
        ObjectAssertExtensions.ShouldBeGreaterThanOrEqualTo(x, y, z)
    let inline (|-<>) x (y, z) =
        ObjectAssertExtensions.ShouldBeInRange(x, y, z)
    let inline (|-<<>>) (x:'a) (y:'a * 'a) (z:IComparer<'a>) =
        ObjectAssertExtensions.ShouldBeInRange(x, fst y, snd y, z)
    let inline (|-<) x y =
        ObjectAssertExtensions.ShouldBeLessThan(x, y)
    let inline (|-<<) (x:'a) (y:'a) (z:IComparer<'a>) =
        ObjectAssertExtensions.ShouldBeLessThan(x, y, z)
    let inline (|-<=) x y =
        ObjectAssertExtensions.ShouldBeLessThanOrEqualTo(x, y)
    let inline (|-<<=) (x:'a) (y:'a) (z:IComparer<'a>) =
        ObjectAssertExtensions.ShouldBeLessThanOrEqualTo(x, y, z)
    let inline (!-/) x =
        ObjectAssertExtensions.ShouldBeNull(x)
    let inline (|-===) x y =
        ObjectAssertExtensions.ShouldBeSameAs(x, y)
    let inline (|-@) x y =
        ObjectAssertExtensions.ShouldBeType(x, y)
    let inline (|-<@) x (y:Type) =
        ObjectAssertExtensions.ShouldImplement(x, y)
    let inline (|-<@%) x y z =
        ObjectAssertExtensions.ShouldImplement(x, y, z)
    let inline (|-=) x y =
        ObjectAssertExtensions.ShouldEqual(x, y)
    let inline (|-=%) x y (s:String) =
        ObjectAssertExtensions.ShouldEqual(x, y, s)
    let inline (|-==) (x:'a) (y:'a) (z:IEqualityComparer<'a>) =
        ObjectAssertExtensions.ShouldEqual(x, y, z)
    let inline (|-><) x (y, z) =
        ObjectAssertExtensions.ShouldNotBeInRange(x, y, z)
    let inline (|->><<) (x:'a) (y:'a * 'a) (z:IComparer<'a>) =
        ObjectAssertExtensions.ShouldNotBeInRange(x, fst y, snd y, z)
    let inline (!-?) x =
        ObjectAssertExtensions.ShouldNotBeNull(x) |> ignore
    let inline (|-?%) x y =
        ObjectAssertExtensions.ShouldNotBeNull(x, y) |> ignore
    let inline (|-!==) x y =
        ObjectAssertExtensions.ShouldNotBeSameAs(x, y)
    let inline (|-!@) x y =
        ObjectAssertExtensions.ShouldNotBeType(x, y)
    let inline (|-!=) x y =
        ObjectAssertExtensions.ShouldNotEqual(x, y)
    let inline (|-!!=) (x:'a) (y:'a) (z:IEqualityComparer<'a>) =
        ObjectAssertExtensions.ShouldNotEqual(x, y, z)
    let shouldThrow<'a when 'a :> Exception> f =
        ActionAssertionExtensions.ShouldThrow<'a>(new Should.Core.Assertions.Assert.ThrowsDelegate(f))

Note that there is no order checking in the range tuples (they are just passed as (low, high) to the corresponding arguments of the underlying code); and that 3-ary methods (custom comparers or message strings) have to be invoked in one or other of these styles to get the association correct

one |->> 0 <| Comparer<int>.Default
(one |->> 0) Comparer<int>.Default

and F# syntax doesn't permit us to define unary/generic ShouldBeType or ShouldImplement operators in the style of (!-@)<'a>.


Tuesday, September 03, 2013

Season's end

By the start of this month, the sunrise is late enough that even though it's clear daylight when the alarm goes off, the sun has yet to clear the house across the road and stream into the bedroom. In the evening, we've lost two hours of daylight since the peak, in terms of how long it stays light enough to sit outdoors and read on a clear evening.


Sunday, September 01, 2013

It was a good summer

At the start of the season, the TWO forecast was

The TWO summer forecast covering June, July and August has been issued and the headline is for a mixed summer with the best weather probably during the first half, with a deterioration later on. August is expected to bring the worst weather, but on the whole the summer isn't likely to be as bad as last year's.

which was upset by the heatwave in July -- even the August Bank Holiday weekend was mostly fine.

There was rain on the Thursday before, when we took Karen's mum out to lunch at the Willow Tree in Bourn, but it was fine when we returned there on Sunday afternoon for their barbecue garden party. While there, some of the players from the production of the Dream at Burwash Manor turned up in costume after the matinee, handing out flyers.

So we went to the Tuesday evening performance on a beautiful clear evening that only just started to feel cool after the stars were well out; and despite a forecast of cloud, Wednesday when we went to Orford for lunch at the Crown and Castle was almost cloudless, and lovely weather for sitting out on the terrace for lunch.

Thursday did the same for my long bike ride, then again on Friday for an afternoon at Linton Zoo, as did Saturday. Only today did it stay rather hazy.

As a result of the weather, and with Karen being rather washed out by the heat, I did a lot more about house and garden than I expected, getting around to bits of tidying that had needed doing since we got the adaptations done in '06 (the last time we had a semi-decent summer), and put more miles on my bike, at 100 miles a week -- including a lot of simple shopping runs.

OTOH, there are things I didn't do as a result -- code much, cook very much (when salads were often the order of the day), upgrade my phone or my home network, do anything shocking like get a tattoo.

But what I did accomplish was to shed a lot of stress, and decide that yes, I will be able to handle retirement if I can ever afford it. The lowered stress levels may last me until all of Tuesday, but it's still a decade until I can get my bus pass.