Proposal: Explicitly require "Data.Bits.bit (-1) == 0" property

Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

John Lato-2
On Mon, Feb 24, 2014 at 11:21 PM, Sven Panne <[hidden email]> wrote:

Regarding upper bounds: I never understood what their advantage should
be, IMHO they only lead to a version hell where you can't compile your
stuff anymore *only* because of the bounds, not because of anything
else.

IMHO this is a bad enough outcome, but it can also allow you to compile code in a way that behaves incorrectly (if a function's behavior has changed but its type has not).  It also leads to situations where cabal generates what it thinks is an acceptable dependency solution, but that solution fails to build, forcing the user to solve the dependency tree themselves and specify constraints on the cabal command line.

This is the reason the PVP specifies upper bounds on versions: it makes that work the responsibility of the developer rather than the user.  At the time the PVP was introduced, users often experienced serious hardships when installing various combinations of packages, and IIRC it was widely perceived that developers should shoulder the load of making sure their packages would work together as specified.  However, I think the PVP may have been a victim of its own success; user complaints about botched installs and invalid install plans seem quite rare these days, and some developers are pushing back against this extra workload.  (or maybe there are no Haskell users?)

John L.


_______________________________________________
Libraries mailing list
[hidden email]
http://www.haskell.org/mailman/listinfo/libraries

Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Daniel Trstenjak-2
In reply to this post by Sven Panne-2

Hi Sven,

> o_O Dumb question: Can somebody please explain why this doesn't
> conform to the PVP? I have a very hard time reading that out of
> http://www.haskell.org/haskellwiki/Package_versioning_policy. Perhaps
> I'm looking at the wrong document or this interpretation is just
> wishful thinking...

If I'm getting it right, you don't have to increase the major version
number if you're e.g. just adding another function.

But if the user of your library imports it unqualified or implicitly,
then they will also get your added function, and it might
conflict with functions in their own code base.
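To illustrate with a hypothetical sketch (the library module and its new export are invented for the example): qualified or explicit imports insulate a build from newly added exports, while an open import does not.

```haskell
-- Hypothetical sketch: suppose a minor release of some library's
-- Data.Foo module starts exporting a function named 'process'. A
-- module doing "import Data.Foo" (open/unqualified) that defines its
-- own 'process' would then fail with an ambiguous-occurrence error,
-- even though only the minor version changed. Qualified or explicit
-- imports, as below, are immune to such additions.
module Main where

import qualified Data.List as L   -- qualified: new exports can't clash
import Data.Char (toUpper)        -- explicit list: also safe

process :: String -> String       -- a local name that a new library
process = map toUpper . L.sort    -- export could otherwise clash with

main :: IO ()
main = putStrLn (process "cba")   -- prints "ABC"
```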


Greetings,
Daniel

Re: qualified imports, PVP and so on

Herbert Valerio Riedel
In reply to this post by Michael Snoyman
On 2014-02-25 at 07:44:45 +0100, Michael Snoyman wrote:

[...]

> * I know that back in the base 3/4 transition there was good reason for
> upper bounds on base. Today, it makes almost no sense: it simply prevents
> cabal from even *trying* to perform a compilation. Same goes with libraries
> like array and template-haskell, which make up most of the issue with
> testing of GHC 7.8 Stackage builds[3]. Can any PVP proponent explain why
> these upper bounds still help us on corelibs?

I assume by 'corelibs' you mean the set of non-upgradeable libs,
i.e. those tied to the compiler version? (E.g. `bytestring` would be
upgradeable, as opposed to `base` or `template-haskell`.)

Well, `base` (together with the other few non-upgradeable libs) is
indeed a special case; also, in `base` usually care is taken to avoid
semantic changes (not visible at the type-signature level), so an upper
bound doesn't gain that much in terms of protecting against semantic
breakages.

Otoh, the situation changes if you have a library where you have
different versions, which are tied to different version ranges of base,
where you want Cabal to select the matching version. Admittedly, this is
a special case for when use of MIN_VERSION_base() wouldn't suffice, but
I wanted to give an example exploiting upper-bounds on the `base` lib.
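(A rough sketch of the MIN_VERSION_base() approach mentioned here. The fallback #define exists only so the fragment compiles outside a cabal build; under cabal, the macro is generated automatically.)

```haskell
{-# LANGUAGE CPP #-}
-- Sketch: one release adapts to several base versions at compile time
-- via the cabal-generated MIN_VERSION_base macro, instead of shipping
-- separate releases selected through tight upper bounds on base.
#ifndef MIN_VERSION_base
-- Fallback for plain GHC builds; cabal normally generates this macro.
#define MIN_VERSION_base(x,y,z) 1
#endif
module Main where

#if MIN_VERSION_base(4,6,0)
import Control.Exception (catch, evaluate, ArithException)
#else
import Prelude            -- base < 4.6: catch came from the Prelude
#endif

main :: IO ()
main = (evaluate (1 `div` 0 :: Int) >>= print)
         `catch` \e -> putStrLn ("caught " ++ show (e :: ArithException))
```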

There's one other possible minor benefit I can think of that upper
bounds give over compile errors: a more user-friendly message that
points to the reason for the failure, instead of requiring you to guess
what the actual cause of the compile error was. But for non-upgradeable
packages such as `base`, which do big major version jumps for almost
every release (mostly due to changes in GHC modules exposing internals
or adding type-class instances[1]), erring on the
confusing-compile-error side seems to provide more value.

So, as for `base` I mostly agree, that there seems to be little benefit
for upper bounds, *unless* a base3/4 situation comes up again in the
future. So, I'd suggest (for those who don't want to follow PVP with
`base`) to keep using at least a "super"-major upper bound, such as
'base < 5' to leave a door open for such an eventuality.
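A build-depends section following that suggestion might look like this (package names and versions illustrative):

```cabal
-- In the .cabal file: no full PVP upper bound on base, only a
-- "super"-major cap to guard against a future base-3/4-style split.
build-depends:
    base >= 4.5 && < 5
  , bytestring >= 0.10 && < 0.11   -- upgradeable lib: normal PVP bound
```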


Cheers,
  hvr


 [1]: I'd argue (but I'd need to research this, to back it up with
      numbers) that we're often suffering from the PVP because it
      requires us to perform major-version jumps mostly due to
      typeclasses, in order to protect against conflicts with possible
      non-hideable orphan instances; and that (as some have suggested
      in the past already) we might want to reconsider requiring only
      a minor bump on instance additions, and discourage the
      orphan-instance business by requiring those packages to have
      tighter-than-major upper bounds.

Re: qualified imports, PVP and so on

Michael Snoyman



On Tue, Feb 25, 2014 at 11:12 AM, Herbert Valerio Riedel <[hidden email]> wrote:
On 2014-02-25 at 07:44:45 +0100, Michael Snoyman wrote:

[...]

> * I know that back in the base 3/4 transition there was good reason for
> upper bounds on base. Today, it makes almost no sense: it simply prevents
> cabal from even *trying* to perform a compilation. Same goes with libraries
> like array and template-haskell, which make up most of the issue with
> testing of GHC 7.8 Stackage builds[3]. Can any PVP proponent explain why
> these upper bounds still help us on corelibs?

I assume by 'corelibs' you mean the set of non-upgradeable libs,
i.e. those tied to the compiler version? (E.g. `bytestring` would be
upgradeable, as opposed to `base` or `template-haskell`.)


Yes, that's what I meant. I realize now corelibs wasn't the right term, but I don't think we *have* a correct term for this. I like your usage of upgradeable.
 
Well, `base` (together with the other few non-upgradeable libs) is
indeed a special case; also, in `base` usually care is taken to avoid
semantic changes (not visible at the type-signature level), so an upper
bound doesn't gain that much in terms of protecting against semantic
breakages.

Otoh, the situation changes if you have a library where you have
different versions, which are tied to different version ranges of base,
where you want Cabal to select the matching version. Admittedly, this is
a special case for when use of MIN_VERSION_base() wouldn't suffice, but
I wanted to give an example exploiting upper-bounds on the `base` lib.


That situation is technically possible, but highly unlikely to ever occur in practice. Consider what would have to happen:

foo-1 is released, which works with base 4.5 and 4.6. It has a version bound base >= 4.5 && < 4.7.
foo-2 is released, which only works with base 4.5. It changes its version bound to base >= 4.5 && < 4.6.

In other words, a later release of the package would have to drop support for newer GHCs. The far more likely scenario is that foo-1 simply didn't include upper bounds, and foo-2 adds them in. In that case, cabal will try to use foo-1, even though it won't build anyway.

Does anyone have an actual example of base or template-haskell upper bounds that provided benefit?
 
There's one other possible minor benefit I can think of that upper
bounds give over compile errors: a more user-friendly message that
points to the reason for the failure, instead of requiring you to guess
what the actual cause of the compile error was. But for non-upgradeable
packages such as `base`, which do big major version jumps for almost
every release (mostly due to changes in GHC modules exposing internals
or adding type-class instances[1]), erring on the
confusing-compile-error side seems to provide more value.


I'd actually argue that this is a disadvantage. It's true that we want users to have a good experience, but the *best* experience would be to let upstream packages get fixed. Imagine a common build error caused by the removal of `catch` from Prelude in base 4.6. With upper bounds, a user gets the error message "doesn't work with base 4.6" and reports to the package maintainer. The package maintainer then needs to download GHC and try to compile his package before getting any idea what the problem is (if there even *is* a problem!).

With more verbose errors, a user could give a meaningful error message and, in many cases, a maintainer would be able to fix the problem without even needing to download a new version of the compiler.
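For instance, the `catch` breakage usually reduces to a one-line import change that a maintainer can make from the error message alone (a sketch; the failing program is invented for illustration):

```haskell
-- Code that compiled with base < 4.6 via the Prelude's 'catch' now
-- needs an explicit import; the rest of the program is unchanged.
import Control.Exception (catch, SomeException)

main :: IO ()
main = (readFile "/no/such/file" >>= putStrLn) `catch` handler
  where
    -- Annotating the handler fixes the exception type to catch.
    handler :: SomeException -> IO ()
    handler _ = putStrLn "recovered"
```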
 
So, as for `base` I mostly agree, that there seems to be little benefit
for upper bounds, *unless* a base3/4 situation comes up again in the
future. So, I'd suggest (for those who don't want to follow PVP with
`base`) to keep using at least a "super"-major upper bound, such as
'base < 5' to leave a door open for such an eventuality.


Cheers,
  hvr


 [1]: I'd argue (but I'd need to research this, to back it up with
      numbers) that we're often suffering from the PVP because it
      requires us to perform major-version jumps mostly due to
      typeclasses, in order to protect against conflicts with possible
      non-hideable orphan instances; and that (as some have suggested
      in the past already) we might want to reconsider requiring only
      a minor bump on instance additions, and discourage the
      orphan-instance business by requiring those packages to have
      tighter-than-major upper bounds.

+1, forcing major version bumps for each new instance just in case someone has an orphan instance is complete overkill.

Michael


Re: [Mid-discussion Summary] Proposal: add new Data.Bits.Bits(bitZero) method

Casey McCann-2
In reply to this post by Gershom Bazerman
On Tue, Feb 25, 2014 at 1:23 AM, Gershom Bazerman <[hidden email]> wrote:

> The issue isn't about qualified or unqualified names at all. It is about
> names which express intent clearly and evocatively, and names which are
> unacceptably ambiguous.
>
> As such, I propose
>
> zero --> whereDidTheBitsGo
>
> and conversely,
>
> allBits --> iHaveAllTheBits
>
> It seems to me that these are expressive names with unmistakable meanings.

Well, for names in that vein, I'd suggest notOneBit and everyLastBit.
This avoids reinventing the wheel by relying on standard English
idioms, and it's well-known that imitating the flawless logical
structure of the English language is the highest goal for any
programming language.

- C.

Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Brandon Allbery
In reply to this post by Michael Snoyman
On Tue, Feb 25, 2014 at 1:44 AM, Michael Snoyman <[hidden email]> wrote:
But that's only one half of the "package interoperability" issue. I face this first hand on a daily basis with my Stackage maintenance. I spend far more time reporting issues of restrictive upper bounds than I do with broken builds from upstream changes. So I look at this as purely a game of statistics: are you more likely to have code break because version 1.2 of text changes the type of the map function and you didn't have an upper bound, or because two dependencies of yours have *conflicting* version bounds on a package like aeson[2]? In my experience, the latter occurs far more often than the former.

I have a question for you.

Is it better to save a developer some work, or is it better to force that work onto end users?

Because we keep constantly seeing examples where saving the developer some upper bounds PVP work forces users to deal with unexpected errors, but since Haskell developers don't see that user pain it is considered irrelevant/nonexistent and certainly not any justification for saving developers some work.

Personally, I think any ecosystem which strongly prefers pushing versioning pain points onto end users instead of developers is doing itself a severe disservice.

Are there things that could be improved about versioning policy? Absolutely. But pushing problems onto end users is not an improvement.

--
brandon s allbery kf8nh                               sine nomine associates
[hidden email]                                  [hidden email]
unix, openafs, kerberos, infrastructure, xmonad        http://sinenomine.net


Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Brandon Allbery
In reply to this post by Sven Panne-2
On Tue, Feb 25, 2014 at 2:21 AM, Sven Panne <[hidden email]> wrote:
Regarding upper bounds: I never understood what their advantage should
be, IMHO they only lead to a version hell where you can't compile your
stuff anymore *only* because of the bounds, not because of anything
else.

A couple months ago we had yet another example of "that will never happen" caused by people ignoring upper bounds. Developers never saw any problem, of course; and who cares about all the users who had compiles explode with unexpected errors? I think it took less than two weeks after someone patched up the most visibly affected packages before developers were shouting to remove upper bounds from the PVP again, because the affected users are just users and apparently not important enough to consider when setting versioning policy.

--
brandon s allbery kf8nh                               sine nomine associates
[hidden email]                                  [hidden email]
unix, openafs, kerberos, infrastructure, xmonad        http://sinenomine.net


Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Edward Kmett-2
In reply to this post by Daniel Trstenjak-2
On Tue, Feb 25, 2014 at 2:51 AM, Daniel Trstenjak <[hidden email]> wrote:

Hi Sven,

> o_O Dumb question: Can somebody please explain why this doesn't
> conform to the PVP? I have a very hard time reading that out of
> http://www.haskell.org/haskellwiki/Package_versioning_policy. Perhaps
> I'm looking at the wrong document or this interpretation is just
> wishful thinking...

If I'm getting it right, you don't have to increase the major version
number if you're e.g. just adding another function.

But if the user of your library imports it unqualified or implicitly,
then they will also get your added function, and it might
conflict with functions in their own code base.
 
Note: this particular concern would be much lessened at least for local definitions, had we done anything with Lennart's perfectly reasonable suggestion to change the scoping rules to let local definitions win over imports. When I mentioned above that I had had half a dozen problems in five years, four of them would have been resolved successfully by that proposal.

-Edward


Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Omari Norman-2
In reply to this post by Brandon Allbery
On Tue, Feb 25, 2014 at 10:16 AM, Brandon Allbery <[hidden email]> wrote:
> because the affected users are just
> users and apparently not important enough to consider when setting
> versioning policy.

Users are important enough to consider, but their needs should not
trump all others.  In particular, (nearly?) all software on Hackage is
given to users at no charge.  Developers invest their time.  Their
needs are important too.  If policies make it too troublesome for
developers to maintain software or publicly post it on Hackage, they
will just stop posting it.

Obviously there is a balance to be struck, as if you make things too
hard for users then there will be no users.  The problem is that the
PVP is putting a considerable maintenance burden on developers but
it's not even clear there is commensurate benefit to users.  Often
it's hard to get different packages to work together because upper
bounds are too tight.

Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Twan van Laarhoven
In reply to this post by Brandon Allbery
On 25/02/14 16:12, Brandon Allbery wrote:

> On Tue, Feb 25, 2014 at 1:44 AM, Michael Snoyman <[hidden email]
> <mailto:[hidden email]>> wrote:
>
>     But that's only one half of the "package interoperability" issue. I face
>     this first hand on a daily basis with my Stackage maintenance. I spend far
>     more time reporting issues of restrictive upper bounds than I do with broken
>     builds from upstream changes. So I look at this as purely a game of
>     statistics: are you more likely to have code break because version 1.2 of
>     text changes the type of the map function and you didn't have an upper
>     bound, or because two dependencies of yours have *conflicting* versions
>     bounds on a package like aeson[2]? In my experience, the latter occurs far
>     more often than the former.
>
>
> I have a question for you.
>
> Is it better to save a developer some work, or is it better to force that work
> onto end users?
>
> Because we keep constantly seeing examples where saving the developer some upper
> bounds PVP work forces users to deal with unexpected errors, but since Haskell
> developers don't see that user pain it is considered irrelevant/nonexistent and
> certainly not any justification for saving developers some work.
>
> Personally, I think any ecosystem which strongly prefers pushing versioning pain
> points onto end users instead of developers is doing itself a severe disservice.
>
> Are there things that could be improved about versioning policy? Absolutely. But
> pushing problems onto end users is not an improvement.

Strict upper bounds are horrible when a new version of, say, the base library
comes out. In reality 90% of the code will not break, it will just require a new
release with increased version bounds. These upper bounds actually *hurt* users,
because they suddenly couldn't use half of Hackage.

This reminds me of the situation of Firefox extensions. In earlier versions of
the browser these came with strict upper bounds, saying "I work in Firefox 11 up
to 13". But then every month or so when a new version came out, all extensions
would stop working. Newer versions of the browser have switched to an 'assume it
works' model, where problems are reported and only then will the extension be
disabled.

So, violating upper-bounds should be a warning at most, perhaps for some kind of
loose 'tested-with' upper bound. Additionally, we need a way to report build
successes and failures to Hackage, and automatically update these 'tested-with'
upper bounds.

In other words, make a distinction between upper bounds violations that mean
"not known to work with versions >X" and "known not to work with versions >X".


Twan

Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Edward Kmett-2
In reply to this post by Brandon Allbery
On Tue, Feb 25, 2014 at 10:16 AM, Brandon Allbery <[hidden email]> wrote:
On Tue, Feb 25, 2014 at 2:21 AM, Sven Panne <[hidden email]> wrote:
Regarding upper bounds: I never understood what their advantage should
be, IMHO they only lead to a version hell where you can't compile your
stuff anymore *only* because of the bounds, not because of anything
else.

A couple months ago we had yet another example of "that will never happen" caused by people ignoring upper bounds. Developers never saw any problem, of course; and who cares about all the users who had compiles explode with unexpected errors? I think it took less than two weeks after someone patched up the most visibly affected packages before developers were shouting to remove upper bounds from the PVP again, because the affected users are just users and apparently not important enough to consider when setting versioning policy.

I tried living without upper bounds. My attempt was not motivated out of disdain for users, but by the fact that all of the complaints I had received had been about the opposite: constraints that were too tight. The change was motivated largely by a desire to improve the end-user experience.

However, after removing the bounds, the situations users wound up in were very hard to fix. From a POSIWID ("the purpose of a system is what it does") perspective, the purpose of removing upper bounds is to make Haskell nigh unusable without sandboxing or --force.

Consequently, I reverted to PVP compliance within a month. Yes, compliance: despite Henning's attempt to grab the moral high ground there, the PVP does not require the use of qualified imports to depend on minor versions; it merely indicates that doing so without them is a slight risk.

To minimize breakage when new package versions are released, you can use dependencies that are insensitive to minor version changes (e.g. foo >= 1.2.1 && < 1.3). However, note that this approach is slightly risky: when a package exports more things than before, there is a chance that your code will fail to compile due to new name-clash errors. The risk from new name clashes may be small, but you are on the safe side if you import identifiers explicitly or using qualification. 

-Edward


Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Daniel Trstenjak-2
In reply to this post by Brandon Allbery

Hi Brandon,

On Tue, Feb 25, 2014 at 10:12:29AM -0500, Brandon Allbery wrote:
> Is it better to save a developer some work, or is it better to force that work
> onto end users?

What is an end user? Someone installing a package containing an executable?
Then the package is an end point in the dependency graph and the PVP can
work pretty well for this case.

But if the package contains a library, then the end user is also the
developer, so you can only choose which kind of pain you prefer.

> Because we keep constantly seeing examples where saving the developer some
> upper bounds PVP work forces users to deal with unexpected errors, but since
> Haskell developers don't see that user pain it is considered irrelevant/
> nonexistent and certainly not any justification for saving developers some
> work.

I think that in most cases it doesn't really make much difference to the
end user whether they're seeing a package version mismatch or a
compile error.

Sure, the package version mismatch is more telling, but in most cases they
will be equally lost and have to ask for help.


Greetings,
Daniel

Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Michael Snoyman
In reply to this post by Brandon Allbery



On Tue, Feb 25, 2014 at 5:12 PM, Brandon Allbery <[hidden email]> wrote:
On Tue, Feb 25, 2014 at 1:44 AM, Michael Snoyman <[hidden email]> wrote:
But that's only one half of the "package interoperability" issue. I face this first hand on a daily basis with my Stackage maintenance. I spend far more time reporting issues of restrictive upper bounds than I do with broken builds from upstream changes. So I look at this as purely a game of statistics: are you more likely to have code break because version 1.2 of text changes the type of the map function and you didn't have an upper bound, or because two dependencies of yours have *conflicting* versions bounds on a package like aeson[2]? In my experience, the latter occurs far more often than the former.

I have a question for you.

Is it better to save a developer some work, or is it better to force that work onto end users?

Because we keep constantly seeing examples where saving the developer some upper bounds PVP work forces users to deal with unexpected errors, but since Haskell developers don't see that user pain it is considered irrelevant/nonexistent and certainly not any justification for saving developers some work.

Personally, I think any ecosystem which strongly prefers pushing versioning pain points onto end users instead of developers is doing itself a severe disservice.

Are there things that could be improved about versioning policy? Absolutely. But pushing problems onto end users is not an improvement.


I think it's a false dichotomy. I've received plenty of complaints from users about being unable to install newer versions of some dependency because a library that Yesod depends on has an unnecessary strict upper bound. Are there situations where the PVP saves a user some pain? Yes. Are there situations where the PVP causes a user some pain? Yes.

It's disingenuous to frame this as a black-and-white "developer vs user" issue; it's far more complex than that. After a lot of experience, I believe the PVP, or at least strict adherence to it, is a net loss.

And I think the *real* solution is something like Stackage, where curators have taken care of the versioning pain points instead of either developers or end users. Linux distributions have been doing this for a long time. 

Michael


Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Brandon Allbery
In reply to this post by Daniel Trstenjak-2
On Tue, Feb 25, 2014 at 10:38 AM, Daniel Trstenjak <[hidden email]> wrote:
On Tue, Feb 25, 2014 at 10:12:29AM -0500, Brandon Allbery wrote:
> Is it better to save a developer some work, or is it better to force that work
> onto end users?

What is an end user? Someone installing a package containing an executable?
Then the package is an end point in the dependency graph and the PVP can
work pretty well for this case.

But if the package contains a library, then the end user is also the
developer, so you can only choose which kind of pain you prefer.

*A* developer, but not the developer of the package with the loose upper bound or the package that refused to compile with incomprehensible errors because of it, and generally not in a position to recognize the reason for the errors because they don't know the internals of the package they're trying to use. And I am certain of this because I'm sitting in #haskell fielding questions from them multiple times a day when some package gets broken by an overly lax or missing upper bound.

Also note that overly strict versioning certainly also leads to breakage --- but it's reported clearly by cabal as a version issue, not as ghc vomiting up unexpected errors from something that is presented as a curated package that should build without problems.

-- 
brandon s allbery kf8nh                               sine nomine associates
[hidden email]                                  [hidden email]
unix, openafs, kerberos, infrastructure, xmonad        http://sinenomine.net


Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Carter Schonwald
In reply to this post by Michael Snoyman
This thread is kinda missing an important point.  

Namely that on hackage now, the admins and trustees have the power to edit the cabal files to fix broken constraints. (As do maintainers of their own packages)

Whether relaxing incorrectly conservative constraints or tightening overly lax ones, this no longer requires a re-release to "fix". (There are valid provenance reasons why that might not make sense in all cases.)

Unless I'm missing the point, doesn't that solve most of the matter?
-carter

On Tuesday, February 25, 2014, Michael Snoyman <[hidden email]> wrote:



On Tue, Feb 25, 2014 at 5:12 PM, Brandon Allbery <[hidden email]> wrote:
On Tue, Feb 25, 2014 at 1:44 AM, Michael Snoyman <[hidden email]> wrote:
But that's only one half of the "package interoperability" issue. I face this first hand on a daily basis with my Stackage maintenance. I spend far more time reporting issues of restrictive upper bounds than I do with broken builds from upstream changes. So I look at this as purely a game of statistics: are you more likely to have code break because version 1.2 of text changes the type of the map function and you didn't have an upper bound, or because two dependencies of yours have *conflicting* versions bounds on a package like aeson[2]? In my experience, the latter occurs far more often than the former.

I have a question for you.

Is it better to save a developer some work, or is it better to force that work onto end users?

Because we keep constantly seeing examples where saving the developer some upper bounds PVP work forces users to deal with unexpected errors, but since Haskell developers don't see that user pain it is considered irrelevant/nonexistent and certainly not any justification for saving developers some work.

Personally, I think any ecosystem which strongly prefers pushing versioning pain points onto end users instead of developers is doing itself a severe disservice.

Are there things that could be improved about versioning policy? Absolutely. But pushing problems onto end users is not an improvement.


I think it's a false dichotomy. I've received plenty of complaints from users about being unable to install newer versions of some dependency because a library that Yesod depends on has an unnecessary strict upper bound. Are there situations where the PVP saves a user some pain? Yes. Are there situations where the PVP causes a user some pain? Yes.

It's disingenuous to frame this as a black-and-white "developer vs user" issue; it's far more complex than that. After a lot of experience, I believe the PVP, or at least strict adherence to it, is a net loss.

And I think the *real* solution is something like Stackage, where curators have taken care of the versioning pain points instead of either developers or end users. Linux distributions have been doing this for a long time. 

Michael

_______________________________________________
Libraries mailing list
[hidden email]
http://www.haskell.org/mailman/listinfo/libraries

Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Michael Snoyman



On Tue, Feb 25, 2014 at 6:38 PM, Carter Schonwald <[hidden email]> wrote:
This thread is kinda missing an important point.  

Namely that on hackage now, the admins and trustees have the power to edit the cabal files to fix broken constraints. (As do maintainers of their own packages)

Whether relaxing incorrectly conservative constraints or strengthening overly lax ones, this no longer requires a re-release to "fix". There are valid provenance reasons why that might not make sense in all cases.

Unless I'm missing the point, doesn't that solve most of the matter?
-carter



The question would still remain: who's responsible for making those changes, and what is the default position for the version bounds? We could default to leaving version bounds off, and add them after the fact as necessary. This would reduce developer and Hackage maintainer overhead, but some users may get the "scary" error messages[1]. Or we could default to the PVP approach, and then increase the work on developers/maintainers, with the flip side that (1) users will never get the "scary" error messages, and (2) until developers/maintainers make the change, users may be blocked from even attempting to compile packages together.

There's also the issue that, currently, Hackage2 has turned off developers' ability to change version bounds, so all of the version-tweaking onus would fall to admins and trustees.

Overall, I don't see this as a big improvement over the PVP status quo. It's not any harder for me to upload version 1.0.2.1 of a package with a tweaked version bound than to go to the Hackage web interface and manually edit version 1.0.2's cabal file. What I see the editing feature as very useful for is if we want to add upper bounds after the fact.

Michael

[1] Which I still think have value, since they are far more informative to a package author.


Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Erik Hesselink
In reply to this post by Carter Schonwald

I believe this feature is currently turned off, because it comes with its own set of problems.

Erik

On Feb 25, 2014 5:38 PM, "Carter Schonwald" <[hidden email]> wrote:
This thread is kinda missing an important point.  

Namely that on hackage now, the admins and trustees have the power to edit the cabal files to fix broken constraints. (As do maintainers of their own packages)

Whether relaxing incorrectly conservative constraints or strengthening overly lax ones, this no longer requires a re-release to "fix". There are valid provenance reasons why that might not make sense in all cases.

Unless I'm missing the point, doesn't that solve most of the matter?
-carter



Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Vincent Hanquez
In reply to this post by Brandon Allbery
On 2014-02-25 15:12, Brandon Allbery wrote:

> On Tue, Feb 25, 2014 at 1:44 AM, Michael Snoyman <[hidden email]> wrote:
>
>     But that's only one half of the "package interoperability" issue.
>     I face this first hand on a daily basis with my Stackage
>     maintenance. I spend far more time reporting issues of restrictive
>     upper bounds than I do with broken builds from upstream changes.
>     So I look at this as purely a game of statistics: are you more
>     likely to have code break because version 1.2 of text changes the
>     type of the map function and you didn't have an upper bound, or
>     because two dependencies of yours have *conflicting* version
>     bounds on a package like aeson[2]? In my experience, the latter
>     occurs far more often than the former.
>
>
> I have a question for you.
>
> Is it better to save a developer some work, or is it better to force
> that work onto end users?
>
As a *user* of many libraries, I have had more problems with libraries
that follow the PvP religiously than the other way around. I usually
like to have the latest and greatest libraries, especially text, aeson,
and such, and there I have to manually bump the dependencies of
packages I depend on until the developers get around to updating the
package on Hackage (which sometimes takes many weeks).

As a *developer*, following the PvP would cost me a lot of my *free*
time. This is particularly true when the surface of contact with a
library is small: it's very unlikely that I will run into an API
change. When I do, I quickly release a new package that accounts for
the API change, or I can put an upper bound in place if I can't make
the necessary changes quickly enough. With Stackage nowadays I usually
find out quite quickly, most of the time before any users get bitten.

At other times, I'm testing a development GHC or some new unreleased
libraries, and I need to remove upper bounds from packages so that I
can test something.

Anyway, there are lots of reasons the PvP doesn't work fully. It
solves some problems for sure, but sadly sweeps all the other problems
under the carpet. One of them is that a single set of numbers doesn't
properly account for the API complexity and stability that can differ
between modules of the same package.
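For reference, the versioning scheme at issue: under the PvP, a version
is A.B.C(.D), where any breaking API change must bump A.B and a
backwards-compatible addition need only bump C, so conforming bounds
are typically written like this (package names and numbers below are
illustrative only):

```cabal
build-depends:
  -- accepts any 1.1.x release of text: protected from breaking changes
  -- under the PvP, but must be manually relaxed once 1.2 is released
  text  >= 1.1 && < 1.2,
  -- one bound covers the whole package, even if only a single small,
  -- stable module of it is actually used
  aeson >= 0.7 && < 0.8
```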

--
Vincent

Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Carter Schonwald
Indeed.

So let's think about how to add module types, or some approximation thereof, to GHC? (Seriously, that's the only sane "best solution" I can think of, but it's not something that can be done casually.) There's also the fact that any module system design will have to deal with type class instances more explicitly than we've done thus far.
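One concrete design in this direction is Backpack (at the time of this thread a POPL 2014 paper by Kilpatrick et al., later implemented in GHC 8.2): a library is compiled against a module *signature* rather than a concrete module, so an incompatible upgrade shows up as a type error against the signature instead of a bounds violation. A rough sketch in Backpack's signature syntax; the interface below is invented for illustration:

```haskell
-- Str.hsig (hypothetical): a "module type" the library is checked
-- against; any implementing module must match this interface, and
-- instantiation is wired up in the .cabal file via mixins.
signature Str where

data Str
empty  :: Str
append :: Str -> Str -> Str
```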



Re: qualified imports, PVP and so on (Was: add new Data.Bits.Bits(bitZero) method)

Edward Kmett-2
In reply to this post by Carter Schonwald
This is currently disabled if you go to try it.


On Tue, Feb 25, 2014 at 11:38 AM, Carter Schonwald <[hidden email]> wrote:
This thread is kinda missing an important point.  

Namely that on hackage now, the admins and trustees have the power to edit the cabal files to fix broken constraints. (As do maintainers of their own packages)

Whether relaxing incorrectly conservative constraints or strengthening overly lax ones, this no longer requires a re-release to "fix". There are valid provenance reasons why that might not make sense in all cases.

Unless I'm missing the point, doesn't that solve most of the matter?
-carter


