Why is type "Integer -> Integer" and not "(Num a) => a -> a"?

5 messages

Why is type "Integer -> Integer" and not "(Num a) => a -> a"?

Hi!

I have a problem with understanding some types given by ghc and hugs. The file loaded is:

    f1 = \x -> x * 2
    f2 x = x * 2

After they are loaded I get the following from ghci:

    *Main> :t f1
    f1 :: Integer -> Integer
    *Main> :t f2
    f2 :: (Num a) => a -> a
    *Main> :t \x -> x * 2
    \x -> x * 2 :: (Num a) => a -> a

I do not understand why f1 is given Integer -> Integer as a type and not the polymorphic (Num a) => a -> a. I believed that f1, f2 and the lambda expression should all have the same type. Similar output to that above is given by Hugs.

Thanks,

Dag Hovland

Why is type "Integer -> Integer" and not "(Num a) => a -> a"?

Hi!

That's very weird, I don't have a good answer (and fortunately there are far smarter people on this list who will have a better answer...) but my instinct says it has to do with defaulting.

When GHC sees a numeric literal, it tries to work out what its type should be, first via inference; but if that yields a polymorphic type (like `Num a => a`), it will "default" it to a particular concrete type. For literal whole numbers, it defaults to Integer. My guess is that, when defining in GHCi

    > let f x = x * 2
    > let g = \x -> x * 2

the former doesn't default to anything (it just does inference) since it's a function definition, while the latter defaults the '2' to an Integer because it's a value -- or some suitable analog of that situation.

What will really blow your mind: try having GHCi inspect the type of

    > :t \x -> x * 2

(the definition of `g` without the let...)

Short answer: I have no idea, but I think it has to do with defaulting.

/Joe

On Nov 12, 2009, at 3:37 AM, Dag Hovland wrote:

> Hi!
>
> I have a problem with understanding some types given by ghc and hugs.
> The file loaded is:
>
> f1 = \x -> x * 2
> f2 x = x * 2
>
> After they are loaded I get the following from ghci
>
> *Main> :t f1
> f1 :: Integer -> Integer
> *Main> :t f2
> f2 :: (Num a) => a -> a
> *Main> :t \x -> x * 2
> \x -> x * 2 :: (Num a) => a -> a
>
> I do not understand why f1 is given Integer -> Integer as a type and
> not the polymorphic (Num a) => a -> a. I believed that f1, f2 and the
> lambda expression should all have the same type. Similar output to that
> above is given by Hugs.
>
> Thanks,
>
> Dag Hovland
> _______________________________________________
> Beginners mailing list
> [hidden email]
> http://www.haskell.org/mailman/listinfo/beginners
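[Editorial sketch, not part of the original message: the defaulting Joe describes can be seen directly. A bare numeric expression like `2 * 3` has the polymorphic type `(Num a) => a`, and GHC only commits to a concrete type when forced to (e.g. by `print`), choosing Integer for a plain Num constraint.]

```haskell
-- Defaulting sketch: `2 * 3` is polymorphic; when print forces a
-- concrete choice, GHC's default rules pick Integer.
main :: IO ()
main = do
  print (2 * 3)            -- defaulted to Integer
  print (2 * 3 :: Double)  -- an explicit annotation overrides the default
```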

Why is type "Integer -> Integer" and not "(Num a) => a -> a"?

In reply to this post by Dag Hovland

On Thu, Nov 12, 2009 at 9:37 AM, Dag Hovland <[hidden email]> wrote:

> Hi!
>
> I have a problem with understanding some types given by ghc and hugs.
> The file loaded is:
>
> f1 = \x -> x * 2
> f2 x = x * 2
>
> After they are loaded I get the following from ghci
>
> *Main> :t f1
> f1 :: Integer -> Integer
> *Main> :t f2
> f2 :: (Num a) => a -> a
> *Main> :t \x -> x * 2
> \x -> x * 2 :: (Num a) => a -> a

This is called the monomorphism restriction: a rule stating that a binding _without_parameters_ is inferred to have a monomorphic type unless an explicit signature is given. There are several reasons for it, some of efficiency (polymorphism has a cost) and some of a more technical nature; refer to the Haskell Report for a more detailed explanation.

Some Haskellers think this restriction is no longer needed: GHC is now often intelligent enough to alleviate the cost of polymorphism, the technical reasons are not really all that pertinent, and the default should be to infer the more general type in all cases rather than confuse beginners and oblige experts to write explicit signatures. It is already possible to deactivate the restriction with the -XNoMonomorphismRestriction flag (or by putting the equivalent language pragma in the code itself or in the cabal description file), and making this the default is under discussion for Haskell' (the future Haskell standard).

In the meantime, it is a good idea to put ":set -XNoMonomorphismRestriction" in your .ghci file, since most GHCi usage only hits the disadvantages of this rule and reaps no benefits from it.

--
Jedaï
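[Editorial sketch, not part of the original message: a minimal file showing the restriction and the pragma Jedaï mentions. With the pragma enabled, both bindings get the general type; without it (and without a signature), f1 would be defaulted to Integer -> Integer and the call at Double would not type-check.]

```haskell
{-# LANGUAGE NoMonomorphismRestriction #-}
module Main where

-- No parameter on the left-hand side: without the pragma above (or an
-- explicit type signature), the monomorphism restriction applies and
-- GHC defaults this binding to Integer -> Integer.
f1 = \x -> x * 2

-- A parameter on the left-hand side: the restriction never applies,
-- so the general type (Num a) => a -> a is inferred either way.
f2 x = x * 2

main :: IO ()
main = do
  print (f2 (1.5 :: Double))  -- always fine: f2 is polymorphic
  print (f1 (2.5 :: Double))  -- type-checks only thanks to the pragma
```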