Setting Default Integer and Real Types

Setting Default Integer and Real Types

Lorenzo Isella
Dear All,
I am quite new to Haskell and planning to see where it will lead me in
scientific computing (I will be looking into hmatrix soon).
Unless there are real memory problems, I would like to make sure that
all real numbers are of type Double and all integers are of type Integer.
Now, I understand that Haskell in most cases is smart enough to
infer the type of a variable (integer, real, etc.), but can I also set
its default precision once and for all?
I.e. if I write a = 5.2, I want a to be used as a double-precision real
everywhere in the code.
Cheers

Lorenzo

Setting Default Integer and Real Types

Brent Yorgey
On Wed, Sep 08, 2010 at 12:58:48PM +0200, Lorenzo Isella wrote:

> Dear All,
> I am quite new to Haskell and planning to see where it will lead me
> in scientific computing (I will be looking into hmatrix soon).
> Unless there are real memory problems, I would like to make sure that
> all real numbers are of type Double and all integers are of type
> Integer.
> Now, I understand that Haskell in most cases is smart enough
> to infer the type of a variable (integer, real, etc.), but can I
> also set its default precision once and for all?
> I.e. if I write a = 5.2, I want a to be used as a double-precision
> real everywhere in the code.

Sure, just give a type signature:

a :: Double
a = 5.2
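
A quick check in GHCi illustrates this (the exact output may vary a bit
between GHC versions):

Prelude> let a = 5.2 :: Double
Prelude> :type a
a :: Double
Prelude> :type 5.2
5.2 :: Fractional a => a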

-Brent

Setting Default Integer and Real Types

Daniel Fischer
In reply to this post by Lorenzo Isella
On Wednesday 08 September 2010 12:58:48, Lorenzo Isella wrote:
> Dear All,
> I am quite new to Haskell and planning to see where it will lead me in
> scientific computing (I will be looking into hmatrix soon).
> Unless there are real memory problems, I would like to make sure that
> all real numbers are of type Double and all integers are of type Integer.
> Now, I understand that Haskell in most cases is smart enough to
> infer the type of a variable (integer, real, etc.), but can I also set
> its default precision once and for all?

Haskell has default declarations for that:
http://www.haskell.org/onlinereport/haskell2010/haskellch4.html#x10-790004.3.4

The default default is

default (Integer, Double)

which means that an ambiguous numeric type is instantiated as Integer if
possible; if that's not possible (due to a Fractional or Floating constraint,
for example), Double is tried. If neither works, you get a compile error
(ambiguous type variable ...).
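
A minimal, compilable sketch (module and literals invented for illustration)
of a per-module default declaration:

module Main where

-- This is also the built-in default, spelled out explicitly: ambiguous
-- numeric types in this module are tried as Integer first, then Double.
default (Integer, Double)

main :: IO ()
main = do
  print 5     -- only a Num constraint: defaults to Integer
  print 5.2   -- Fractional rules out Integer: defaults to Double

Only one default declaration is allowed per module, and its effect is
limited to that module.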

> I.e. if I write a = 5.2, I want a to be used as a double-precision real
> everywhere in the code.

There's a slight catch in that.
The inferred type of the binding a = 5.2 is

a :: Fractional n => n

and the binding is really

a = fromRational (26 % 5)

Depending on whether the monomorphism restriction applies and on the
optimisation level, it may be that, without a type signature, the call to
fromRational is evaluated at each use, which can hurt performance.
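
As a small illustration (the names aPoly and aMono are made up), compare a
binding left polymorphic with an annotated monomorphic one:

-- Polymorphic: a Fractional dictionary is passed at every use site, so
-- fromRational may be re-evaluated for each use (depending on optimisation).
aPoly :: Fractional n => n
aPoly = 5.2

-- Monomorphic: the literal is converted to a Double once and shared.
aMono :: Double
aMono = 5.2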

But basically, you get your desired behaviour automatically.

However, it's good practice to use plenty of type signatures even though the
compiler's type inference makes them unnecessary (apart from the
documentation value of type signatures, specifying a monomorphic type
instead of the inferred polymorphic type can give significant performance
gains).

> Cheers
>
> Lorenzo