When does x/2 != x>>1 for integer x?
As I discovered when porting the Ruby big-integer unit tests to C# to test a BigInteger class --
Answer: sometimes, but not always: when x is negative and odd, and the language rounds integer division towards zero.
Python and Ruby -- native and DLR -- do maintain the identity:
IronPython 1.1.1 (1.1.1) on .NET 2.0.50727.1433
Copyright (c) Microsoft Corporation. All rights reserved.
>>> i = -57
>>> a = i/2
>>> b = i>>1
>>> print a,b
-29 -29
The IronPython 2.0β2, IronRuby pre-alpha, Python 2.5.2, and Ruby 1.8.6 equivalents are all omitted; they give the same result.
Indeed, the Ruby big-integer unit tests contain a couple of tests that explicitly assert this identity. But doing the same thing in C#:
using System;
using System.Collections.Generic;
using System.Text;

namespace negatives
{
    class Program
    {
        static void Main(string[] args)
        {
            int i = -57;
            int a = i/2;
            int b = i>>1;
            Console.WriteLine("/2 -> " + a + "; >>1 -> " + b);
        }
    }
}
/2 -> -28; >>1 -> -29
The /2 result follows the 'C' family "round towards zero" approach, whereas >>1 is an arithmetic shift that effectively rounds towards negative infinity, hence the difference for negative odd values.
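If you want C#'s division to agree with the shift (and with Python/Ruby), the quotient has to be floored rather than truncated. Here is a minimal sketch of such a helper (my own illustration, not part of the original tests; the name FloorDiv is just a placeholder):

// Illustration: a floor-dividing helper so that FloorDiv(x, 2) agrees with x >> 1
// for every int x, including negative odd values.
static int FloorDiv(int x, int y)
{
    int q = x / y;                          // C# truncates towards zero
    if (x % y != 0 && (x < 0) != (y < 0))
        q--;                                // step down when signs differ and there is a remainder
    return q;
}

// FloorDiv(-57, 2) == -29, the same as -57 >> 1 (and Python/Ruby's -57/2).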
This is a potential peril of polyglot programming within an application: the "same" arithmetic expression can yield different results depending on the language being used locally. Merely being on the CLR, for example, does not mandate your arithmetic -- the language definition does. It is not something one would expect at first.
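To make the point concrete, here is a minimal sketch (again my own, assuming the IronPython 2.x hosting assemblies, IronPython.dll and Microsoft.Scripting.dll, are referenced) showing the same expression evaluated two ways within one CLR process:

using System;
using IronPython.Hosting;

class PolyglotDivision
{
    static void Main()
    {
        // The hosted IronPython engine applies Python's floor-division semantics...
        var engine = Python.CreateEngine();
        int fromPython = engine.Execute<int>("(-57)/2");   // -29

        // ...while the C# expression in the same process truncates towards zero.
        int fromCSharp = -57 / 2;                           // -28

        Console.WriteLine(fromPython + " vs " + fromCSharp);
    }
}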