I don't know if you saw the following linked off /.
http://www.itwire.com.au/content/view/13339/53/

An amazon link for the book is here:

http://www.amazon.com/Computer-Science-Reconsidered-Invocation-Expression/dp/0471798142

The basic claim appears to be that discrete mathematics is a bad
foundation for computer science. I suspect the subscribers to this
list would beg to disagree.

Enjoy,
T.

--
Dr Thomas Conway
[hidden email]
Silence is the perfectest herald of joy:
I were but little happy, if I could say how much.

_______________________________________________
Haskell-Cafe mailing list
[hidden email]
http://www.haskell.org/mailman/listinfo/haskell-cafe
drtomc:
> I don't know if you saw the following linked off /.
>
> http://www.itwire.com.au/content/view/13339/53/
>
> An amazon link for the book is here:
>
> http://www.amazon.com/Computer-Science-Reconsidered-Invocation-Expression/dp/0471798142
>
> The basic claim appears to be that discrete mathematics is a bad
> foundation for computer science. I suspect the subscribers to this
> list would beg to disagree.
>
> Enjoy,

:-)

And he's patented it...

http://www.patentstorm.us/patents/5355496-description.html

SUMMARY OF THE INVENTION

A method and system for process expression and resolution is
described. A first language structure comprising a possibility
expression having at least one definition which is inherently and
generally concurrent is provided. Further, a second language structure
comprising an actuality expression including a fully formed input data
name to be resolved is provided. Furthermore, a third language
structure comprising an active expression initially having at least
one invocation, the invocation comprising an association with a
particular definition and the fully formed input data name of the
actuality expression is provided. Subsequently, the process of
resolving invocations begins in the active expression with fully
formed input data names in relation to their associated definition to
produce at least one or both of the following: (1) an invocation with
a fully formed input data name and (2) a result data name.

Interesting...

-- Don
On 7/9/07, Donald Bruce Stewart <[hidden email]> wrote:
> And he's patented it...
>
> http://www.patentstorm.us/patents/5355496-description.html

Clearly a winner then. :-)

T.
On Monday 09 July 2007 17:42, Thomas Conway wrote:
> I don't know if you saw the following linked off /.
>
> http://www.itwire.com.au/content/view/13339/53/
>
> An amazon link for the book is here:
>
> http://www.amazon.com/Computer-Science-Reconsidered-Invocation-Expression/dp/0471798142
>
> The basic claim appears to be that discrete mathematics is a bad
> foundation for computer science. I suspect the subscribers to this
> list would beg to disagree.

I wouldn't want to comment on the validity of his claim, maybe he's
wrong, or maybe he's... well, anyway... what I will say is I got a
chuckle out of the 'Citations' that Amazon lists. I especially like it
that Mr. Fant's book is apparently cited in 'The Essential Guide to
Psychiatric Drugs: Includes The Most Recent Information On:
Antidepressants, Tranquilizers and Antianxiety Drugs, Antipsychotics,
...'

I shudder to think of the creative processes involved in the creation
of the book.
On 7/9/07, Daniel McAllansmith <[hidden email]> wrote:
> I wouldn't want to comment on the validity of his claim, maybe he's
> wrong, or maybe he's... well, anyway... what I will say is I got a
> chuckle out of the 'Citations' that Amazon lists.

As amusing as that thought is, it seems that this is regrettably an
error on Amazon's part. After looking at the actual page images where
the alleged citations occur, there is nowhere any mention of this
book. (How could there be? It was just published.)

It looks like Amazon's citation database is mistakenly using the index
for the book _Beating Depression_ by John Rush (Toronto: John Wiley &
Sons, Canada Ltd., 1983).

Steve
> It looks like Amazon's citation database is mistakenly using the index
> for the book _Beating Depression_ by John Rush (Toronto: John Wiley &
> Sons, Canada Ltd., 1983).

Yes, it is so. Amazon.com mistakenly thinks that the given book is a
new edition of the book titled "Beating Depression". Amazon also links
hardcover and softcover editions of "Beating Depression" just below
where the price and availability of the book are mentioned.

Regards
Doesn't Haskell already implement the 3-valued logic (True, False,
NULL) that Karl Fant proposes (see papers at
http://www.theseusresearch.com/invocation%20model.htm) as an
alternative to centralised clock-based coordination, by postulating
that every data type includes the bottom value?

I like his concept that: "concurrency is simple and primitive and
sequentiality is a complex and risky derivative of concurrency."

Can someone remind me why, in a language like Haskell that is
referentially transparent and therefore inherently 'concurrent', we
need explicit concurrency (threads, etc.)?

titto

On Monday 09 July 2007 06:48:03 Donald Bruce Stewart wrote:
> drtomc:
> > I don't know if you saw the following linked off /.
> >
> > http://www.itwire.com.au/content/view/13339/53/
> [..]
>
> And he's patented it...
>
> http://www.patentstorm.us/patents/5355496-description.html
>
> [..]
>
> Interesting...
>
> -- Don
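[A minimal sketch of the "bottom in every type" point titto raises; the names here are invented for illustration and the code is not from Fant's model. Because Haskell is non-strict, a bottom value can sit unevaluated inside a structure without ever being forced.]

```haskell
-- Every Haskell type implicitly contains a bottom element, written
-- here as 'undefined'. Laziness means we can carry a bottom around
-- indefinitely, as long as nothing ever forces it.
lazyFst :: (a, b) -> a
lazyFst (x, _) = x

-- ghci> lazyFst (42 :: Int, undefined)
-- 42  (the second component, which is bottom, is never evaluated)
```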
On 15:42 Mon 09 Jul , Thomas Conway wrote:
> I don't know if you saw the following linked off /.
>
> http://www.itwire.com.au/content/view/13339/53/

I read that earlier, and his comments, such as "This concept of
'process expression' is, he says, a common thread running through the
various disciplines of computer science", made me think of arrows and
category theory.

And I wonder what kind of aberration a Monte Carlo algorithm would be
if this excerpt is to be taken seriously: "Any program utilising
random input to carry out its process, such...is not an algorithm."

Cheers,
Asumu Takikawa
Hi all
On 9 Jul 2007, at 06:42, Thomas Conway wrote:
> I don't know if you saw the following linked off /.
>
> http://www.itwire.com.au/content/view/13339/53/
[..]
> The basic claim appears to be that discrete mathematics is a bad
> foundation for computer science. I suspect the subscribers to this
> list would beg to disagree.

It's true that some systems are better characterised as corecursive
"coprograms", rather than as recursive "programs". This is not a
popular or well-understood distinction. In my career as an advocate
for total programming (in some carefully delineated fragment of a
language) I have many times been gotcha'ed thus: "but an operating
system is a program which isn't supposed to terminate". No, an
operating system is supposed to remain responsive. And that's what
total coprograms do.

By the looks of this article, the program versus coprogram distinction
seems to have occasioned an unprecedented degree of existential angst
for this individual. Even so, I'd say that it's worth raising
awareness of it. Haskell's identification of inductive data with
coinductive data, however well motivated, has allowed people to be
lazy. People aren't so likely to be thinking "do I mean inductive or
coinductive here?", "is this function productive?" etc. The usual
style is to write as if everything is inductive, and if it still works
on infinite data, to pat ourselves on the back for using Haskell
rather than ML. I'm certainly guilty of that.

I'd go as far as to suggest that "codata" be made a keyword, at
present doubling for "data", but with the documentary purpose of
indicating that a different mode of (co)programming is in order. It
might also be the basis of better warnings, optimisations, etc.
Moreover, it becomes a necessary distinction if we ever need to
identify a total fragment of Haskell. Overkill, perhaps, but I often
find it's something I want to express.

Just a thought

Conor
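[A sketch of the distinction Conor describes, in today's Haskell, where a single `data` declaration must play both roles; the `Stream` type and function names are illustrative, not from any proposal. A "coprogram" stays responsive by being productive: each corecursive step yields a constructor before recursing.]

```haskell
-- A stream has no nil case: it is really codata, deconstructed by
-- case analysis but built by productive corecursion.
data Stream a = Cons a (Stream a)

-- Productive: every finite observation is answered in finite time,
-- even though the whole value is infinite.
nats :: Stream Integer
nats = go 0 where go n = Cons n (go (n + 1))

-- A finite observation of an infinite object.
takeS :: Int -> Stream a -> [a]
takeS n _ | n <= 0   = []
takeS n (Cons x xs)  = x : takeS (n - 1) xs

-- ghci> takeS 5 nats
-- [0,1,2,3,4]
```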
On 7/8/07, Thomas Conway <[hidden email]> wrote:
> The basic claim appears to be that discrete mathematics is a bad
> foundation for computer science. I suspect the subscribers to this
> list would beg to disagree.

Wearing my tin foil hat for the moment, I think that there is a
conspiracy by some computer scientists to drive a wedge between
mathematicians and computer scientists. You can see hints of it in
many places where both mathematicians and computer scientists hang
out, and there have been quite a few recent articles setting up
mathematics and computer science as somehow in competition with each
other.

Many of the structures studied by mathematicians are algebraic. Many
of the structures studied by computer scientists are coalgebraic (eg.
the web itself can be seen as a vast coalgebraic structure). Sometimes
I wonder if the only difference between mathematicians and computer
scientists is the direction of their arrows.
Conor McBride wrote:
> Hi all
>
> It's true that some systems are better characterised as corecursive
> "coprograms", rather than as recursive "programs".
[..]
> Just a thought

Erm... Wait a sec... coroutines, comonads, coprograms, codata... what
in the name of goodness does "co" actually *mean* anyway??
On Tue, Jul 10, 2007 at 08:08:52PM +0100, Andrew Coppin wrote:
> Erm... Wait a sec... coroutines, comonads, coprograms, codata... what
> in the name of goodness does "co" actually *mean* anyway??

Nothing.

When mathematicians find a new thing completely unlike an OldThing, but
related by some symmetry, they often call the new thing a CoOldThing.

Data can only be constructed using constructors, but can be
deconstructed using recursive folds;
Codata can only be deconstructed using case analysis, but can be
constructed using recursive unfolds.

Monads keep things inside.
Comonads keep things outside.

Homology theory studies the boundaries of shapes.
Cohomology theory studies the insides of curves.

...

Stefan
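[Stefan's first pair of dualities can be seen directly in the Prelude and Data.List, for what it's worth: a fold tears a (finite) list down, while an unfold builds a list up from a seed. A minimal sketch with invented example functions:]

```haskell
import Data.List (unfoldr)

-- Data: consumed by a fold.
total :: [Int] -> Int
total = foldr (+) 0

-- Codata-flavoured: produced by an unfold (this one happens to stop,
-- but nothing in unfoldr's type requires that).
countdown :: Int -> [Int]
countdown = unfoldr step
  where
    step 0 = Nothing
    step n = Just (n, n - 1)

-- ghci> countdown 4
-- [4,3,2,1]
-- ghci> total (countdown 4)
-- 10
```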
On 7/9/07, Dan Piponi <[hidden email]> wrote:
[..]

Okay Mr. Piponi, as a math geek I can't let that comment about the web
slide without further explanation. Is it just the idea that coalgebras
can capture the idea of side effects (a -> F a) or is there something
more specific that you're thinking of?
On 7/9/07, Conor McBride <[hidden email]> wrote:
[..]

I'm sorry, but can you expand a little further on this? I guess I
don't understand how corecursion => responsive to input but not
terminating. Where does the idea of waiting for input fit into
corecursion?
On 7/10/07, Creighton Hogg <[hidden email]> wrote:
> Okay Mr. Piponi, as a math geek I can't let that comment about the
> web slide without further explanation. Is it just the idea that
> coalgebras can capture the idea of side effects (a -> F a) or is
> there something more specific that you're thinking of?

First a quick bit of background on algebras.

If F is a functor, an F-algebra is an arrow FX->X. For example, if we
choose FX = 1+X+X^2 (using + to mean disjoint union) then an F-algebra
is a function 1+X+X^2->X. The 1->X part just picks out a constant, the
image of 1. The X^2->X part defines a binary operator and the X->X
part is an endomorphism. A group has a constant element (the
identity), an endomorphism (the inverse) and a binary operator
(multiplication). So a group is an example of an F-algebra (with some
extra equations added in so a group isn't *just* an F-algebra).

An F-coalgebra is an arrow X->FX. As an example, let's pick
FX=(String,[X]). So an F-coalgebra is a function X->(String,[X]). We
can view this as two functions, 'appearance' of type X->String and
'links' of type X->[X]. If X is the type of web pages, then interpret
'appearance' as the rendering (as plain text) of the web page and
'links' as the function that gives the list of links in the page. So
the web forms a coalgebra. (Though you'll need some extra work to deal
with persistent state like cookies.)

The theme is that mathematicians often like to study objects with some
kind of 'combination' operation like (generalised) addition or
multiplication. These form algebras with maps FX->X. Computer
scientists often like to study things that generate more stuff (eg.
when you press a button or input something). So you end up with
something of the form X->FX. This includes many familiar things like
web pages, state machines and formal languages. This isn't a sharp
divide (of course) but I think it reflects a real difference in
emphasis.

A great source for this stuff is the book 'Vicious Circles' by Barwise
and Moss. It's full of computer sciencey stuff but it seems to be
written for an audience that includes mathematicians and computer
scientists. (It has quite a few typos and more serious errors however.)
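[A toy Haskell rendering of Dan's FX = (String, [X]) example; the "pages" and their contents are entirely made up for illustration. The point is just that one arrow X -> (String, [X]) packages up both the appearance and the outgoing links.]

```haskell
-- An F-coalgebra for FX = (String, [X]): a single arrow X -> FX.
-- Pages are modelled as plain Ints; the site is hypothetical.
type Page = Int

web :: Page -> (String, [Page])
web 0 = ("home",    [1, 2])
web 1 = ("about",   [0])
web _ = ("contact", [0])

-- The two views Dan describes are just the projections of that arrow.
appearance :: Page -> String
appearance = fst . web

links :: Page -> [Page]
links = snd . web

-- ghci> appearance 0
-- "home"
-- ghci> links 0
-- [1,2]
```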
Dan Piponi wrote:
> First a quick bit of background on algebras.
>
> If F is a functor, an F-algebra is an arrow FX->X.
[..]
> So the web forms a coalgebra. (Though you'll need some extra work to
> deal with persistent state like cookies.)

...wooooosh...

...and now I know what normal people must feel like when *I* open my
mouth. o_O
Stefan O'Rear wrote:
> On Tue, Jul 10, 2007 at 08:08:52PM +0100, Andrew Coppin wrote:
>> Erm... Wait a sec... coroutines, comonads, coprograms, codata...
>> what in the name of goodness does "co" actually *mean* anyway??
>
> Nothing.
>
> When mathematicians find a new thing completely unlike an OldThing,
> but related by some symmetry, they often call the new thing a
> CoOldThing.
[..]

...so it's similar to the term "normal"? As in

Normal vector - a vector having unit length.
Normal distribution - a common monomodal distribution following a
characteristic Gaussian bell curve.
Normal subgroup - a subset of a group such that all elements of it
commute with all elements of the whole group.
...
On 7/10/07, Andrew Coppin <[hidden email]> wrote:
> Stefan O'Rear wrote:
> > When mathematicians find a new thing completely unlike an OldThing,
> > but related by some symmetry, they often call the new thing a
> > CoOldThing.
[..]

(I got lost somewhere with the levels of quotation there...)

It's more specific than this. Coalgebra, cohomology, codata, comonads
and so on derive their name from the fact that they can be described
using category theory. In category theory you draw lots of diagrams
with arrows in them. When you flip all the arrows round you get a
description of something else. Pairs of concepts connected in this way
often differ by the prefix "co-". Often theorems you prove about
objects have analogous theorems about the respective co-objects. In
fact, often the proof is the same, just written with all the arrows
pointing the other way.

This carries over to Haskell too. You can sometimes write functional
(as in useful) code simply by taking an already existing piece of code
and figuring out what flipping the arrows means. It often means
something very different, but it still makes sense. A really cool
example is the relationship between fold and unfold. But I'll leave
that for someone else.

-- Dan
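[Since Dan leaves the fold/unfold relationship as an exercise, here is one way to spell out the flipped arrows for lists; the helper names are invented. An algebra Maybe (a, b) -> b is what a fold consumes; reversing the arrow gives a coalgebra b -> Maybe (a, b), which is exactly the argument type of Data.List's unfoldr.]

```haskell
import Data.List (unfoldr)

-- A list F-algebra: Maybe (a, b) -> b. Nothing plays nil, Just plays cons.
sumAlg :: Maybe (Int, Int) -> Int
sumAlg Nothing         = 0
sumAlg (Just (x, acc)) = x + acc

-- The same arrow flipped: a coalgebra b -> Maybe (a, b), ready for unfoldr.
countCoalg :: Int -> Maybe (Int, Int)
countCoalg 0 = Nothing
countCoalg n = Just (n, n - 1)

-- foldr repackaged so that it consumes an algebra in exactly that shape.
foldList :: (Maybe (a, b) -> b) -> [a] -> b
foldList alg []       = alg Nothing
foldList alg (x : xs) = alg (Just (x, foldList alg xs))

-- Unfold then fold: sums 1..n.
sumTo :: Int -> Int
sumTo = foldList sumAlg . unfoldr countCoalg

-- ghci> sumTo 4
-- 10
```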
Dan Piponi wrote:
> It's more specific than this. Coalgebra, cohomology, codata, comonads
> and so on derive their name from the fact that they can be described
> using category theory.
[..]
> A really cool example is the relationship between fold and unfold.
> But I'll leave that for someone else.

Sounds a lot like the Boolean duality principle. (If a statement works
one way, if you flip all the true/false and and/or stuff, you get a
brand new statement, which also works.)
On 7/10/07, Andrew Coppin <[hidden email]> wrote:
> Sounds a lot like the Boolean duality principle.

That is, in fact, very closely related.

-- Dan