I am looking to continue to learn Haskell while working on something that might eventually be useful to others and get posted on Hackage. I have written quite a bit of Haskell code now, some useful and a lot just throwaway for learning. In the past, others have expressed interest in having a native Haskell sparse matrix and linear algebra library available (not just bindings to a C lib). This, in combination with FEM, is one of my interests. So my questions: is anyone currently working on a project like this? Does it seem like a good project/addition to the community? I'm also interested if anyone has any other project ideas, maybe even to collaborate on. Thanks

Hi Mark,
I might become your user. Currently, for Agda, I have rolled my own sparse matrix implementation; see http://hackage.haskell.org/packages/archive/Agda/latest/doc/html/src/AgdaTerminationSparseMatrix.html

Cheers,
Andreas

Andreas Abel <>< Du bist der geliebte Mensch.
Theoretical Computer Science, University of Munich
Oettingenstr. 67, D-80538 Munich, GERMANY
http://www2.tcs.ifi.lmu.de/~abel/

_______________________________________________
HaskellCafe mailing list
[hidden email]
http://www.haskell.org/mailman/listinfo/haskellcafe
Hi Mark,
For my bachelor thesis I am doing something somewhat in that direction: I am developing an Echo State Neural Network (ESN) (http://minds.jacobs-university.de/esn_research) library in Haskell. I haven't worked on it for a while, since I was reading related literature over the last few months. It will be used to classify motion capture data. Feel free to check it out: https://github.com/netogallo/LambdaNN. Sparse matrices are used to build the ESNs; I have basic functions to produce sparse matrices but nothing fancy at the moment.
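Purely as an illustration of the kind of structure involved, and not LambdaNN's actual API: a sparse matrix stored as an IntMap of sparse rows, filled deterministically the way an ESN reservoir's sparsely connected weight matrix might be. The connectivity parameter and the toy hash function standing in for a PRNG are my own assumptions.

```haskell
import qualified Data.IntMap.Strict as IM

-- A sparse matrix as a map from row index to a sparse row.
type SparseRow    = IM.IntMap Double
type SparseMatrix = IM.IntMap SparseRow

-- Toy deterministic "hash" standing in for a real PRNG.
hash :: Int -> Int -> Int
hash i j = (i * 7919 + j * 104729 + 12345) `mod` 1000

-- Build an n x n reservoir-style matrix, keeping an entry only when
-- its hash is divisible by k (so roughly 1/k of entries are stored).
sparseReservoir :: Int -> Int -> SparseMatrix
sparseReservoir n k =
  IM.fromList
    [ (i, row)
    | i <- [0 .. n - 1]
    , let row = IM.fromList
            [ (j, weight i j)
            | j <- [0 .. n - 1]
            , hash i j `mod` k == 0 ]
    , not (IM.null row) ]
  where
    -- Map the hash to a weight in [-0.5, 0.5).
    weight i j = fromIntegral (hash i j) / 1000 - 0.5

-- Count of stored (nonzero) entries.
nnz :: SparseMatrix -> Int
nnz = sum . map IM.size . IM.elems
```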
Cheers,

Ernesto Rodriguez
Bachelor of Computer Science, Class of 2013
Jacobs University Bremen
Hi Mark
I can work on a couple of algorithms if you have anything specific in mind. Maybe first start with how to represent the matrix and then continue with the algorithms.

Mukesh Tiwari
Thanks for all the replies,
It sounds like there is enough interest and even some potential collaborators out there. I have created a few data structures to represent sparse vectors and matrices. The vector was a simple binary tree and the matrix a quadtree. As I suspected, a standard IntMap was around 3x as fast as my binary tree, so I have switched to the IntMap for now. I was hoping to hold on to my quadtree for the matrix representation because I like the elegance of recursive algorithms like Strassen's for multiplication. In the end, it will most likely be best to have a few different matrix representations anyway; for instance, I know that compressed row form is most efficient for certain algorithms. I have a Gauss iterative solver working currently and am thinking of moving on to a parallel conjugate gradient or a direct solver using LU decomposition. I'm no expert in Haskell or numeric methods, but I do my best.

I'll clean up what I have and open up the project on GitHub or Bitbucket. Any other thoughts or ideas are welcome. I'm currently using Matrix Market files for testing. It would be nice to benchmark this against an industry-standard solver in C or Fortran without having to do a lot of work to set it up. Does anyone know of an easy way to do this? I'm just trying to get results that compare in orders of magnitude for now. It would be motivating to see these comparisons.
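For concreteness, the IntMap-based sparse representation together with one Jacobi-style sweep (a close cousin of the Gauss iteration mentioned above) might be sketched like this; the types and names are my own illustration, not the actual library code:

```haskell
import qualified Data.IntMap.Strict as IM

type SparseVec = IM.IntMap Double
type SparseMat = IM.IntMap SparseVec   -- row index -> sparse row

-- Sparse dot product: only the shared index set contributes.
dot :: SparseVec -> SparseVec -> Double
dot u v = sum (IM.elems (IM.intersectionWith (*) u v))

-- One Jacobi sweep for A x = b:
--   x_i <- (b_i - sum_{j /= i} a_ij * x_j) / a_ii
jacobiStep :: SparseMat -> SparseVec -> SparseVec -> SparseVec
jacobiStep a b x = IM.mapWithKey update a
  where
    update i row =
      let aii = IM.findWithDefault (error "zero diagonal") i row
          off = dot (IM.delete i row) x   -- off-diagonal contribution
          bi  = IM.findWithDefault 0 i b
      in (bi - off) / aii

-- Iterate a fixed number of sweeps from the zero vector.
jacobi :: Int -> SparseMat -> SparseVec -> SparseVec
jacobi n a b = iterate (jacobiStep a b) IM.empty !! n
```

This converges for diagonally dominant systems, which covers many FEM stiffness matrices after suitable scaling.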
I look forward to seeing what comes of this!
Hi Mark
To continue with the library, I just wrote a quick recursive matrix multiplication. Since you mentioned Strassen's algorithm, I went to Wikipedia (http://en.wikipedia.org/wiki/Strassen_algorithm) and wrote the recursive algorithm using the straightforward four-quadrant decomposition, but it's not very hard to modify this to Strassen's algorithm. You can see the code here: https://github.com/mukeshtiwari/Puzzles/blob/master/recursivematrixmultiplication/Matmultiplication.hs. It's just quick code, and it has a lot of room for improvement: using Data Parallel Haskell, some parallelism using Simon Marlow's monad-par library or distributed parallelism using Cloud Haskell, and consideration of the sparse matrix representation.

Mukesh Tiwari
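To make the quadtree idea from earlier in the thread concrete, here is a sketch (my own toy types, not the code linked above) of block-recursive multiplication, where a Zero constructor lets whole empty quadrants be skipped for free:

```haskell
-- A quadtree matrix: Zero compresses an all-zero block; Leaf is a
-- scalar (1x1 block); Node splits a 2^k x 2^k matrix into quadrants
-- (top-left, top-right, bottom-left, bottom-right).
data Quad a = Zero
            | Leaf a
            | Node (Quad a) (Quad a) (Quad a) (Quad a)
  deriving (Eq, Show)

-- Addition: Zero is the identity, so sparse blocks cost nothing.
add :: Num a => Quad a -> Quad a -> Quad a
add Zero y = y
add x Zero = x
add (Leaf x) (Leaf y) = Leaf (x + y)
add (Node a b c d) (Node e f g h) =
  Node (add a e) (add b f) (add c g) (add d h)
add _ _ = error "shape mismatch"

-- Block-recursive multiplication: 8 recursive products per node.
-- Strassen's trick would cut this to 7 at the cost of extra additions.
mul :: Num a => Quad a -> Quad a -> Quad a
mul Zero _ = Zero
mul _ Zero = Zero
mul (Leaf x) (Leaf y) = Leaf (x * y)
mul (Node a b c d) (Node e f g h) =
  Node (add (mul a e) (mul b g)) (add (mul a f) (mul b h))
       (add (mul c e) (mul d g)) (add (mul c f) (mul d h))
mul _ _ = error "shape mismatch"
```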
Hello,
I started a FEM library, funfem [1], but I have stopped it for the moment; Haskell is my hobby and I work on FEM all day long, so I prefer to focus on orthogonal problems for home projects. It is a very naive implementation, and far from a version 0.0.1 too, i.e. unusable at the moment. I did not test the performance as it was not my main goal, so the following may not be completely relevant to your purpose.

I define a type Tensor [2], which is a sparse matrix based on Data.Map. Not sure how efficient it is; I chose to start simple. The resulting conjugate gradient [3] is very clear.

Please let us know how it goes. It's good to see more traction towards Haskell from our field, and I'll be glad to use your library!

Best regards,
Adrien

[1] http://www.funfem.org
[2] https://github.com/adrienhaxaire/funfem/blob/master/Numeric/Funfem/Algebra/Tensor.hs
[3] https://github.com/adrienhaxaire/funfem/blob/master/Numeric/Funfem/Algebra/Solver/CG.hs
Adrien Haxaire
www.adrienhaxaire.org | @adrienhaxaire
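Since funfem's Tensor itself isn't reproduced here, a minimal sketch of the same idea, a Data.Map-backed sparse matrix feeding a conjugate gradient solver, might read as follows (types and names are my own, not funfem's):

```haskell
import qualified Data.Map.Strict as M

type Vec = M.Map Int Double
type Mat = M.Map (Int, Int) Double   -- keyed by (row, column)

dot :: Vec -> Vec -> Double
dot u v = sum (M.elems (M.intersectionWith (*) u v))

-- a*x + y, the usual BLAS-style update.
axpy :: Double -> Vec -> Vec -> Vec
axpy a x y = M.unionWith (+) (M.map (a *) x) y

-- Sparse matrix-vector product: fold over the stored entries only.
mulV :: Mat -> Vec -> Vec
mulV a x = M.foldrWithKey step M.empty a
  where
    step (i, j) aij acc =
      case M.lookup j x of
        Nothing -> acc
        Just xj -> M.insertWith (+) i (aij * xj) acc

-- Conjugate gradient for symmetric positive-definite A, from x = 0.
cg :: Int -> Mat -> Vec -> Vec
cg maxIter a b = go maxIter M.empty b b
  where
    go 0 x _ _ = x
    go n x r p
      | rr < 1e-20 = x                  -- residual small enough: done
      | otherwise  = go (n - 1) x' r' p'
      where
        rr    = dot r r
        ap    = mulV a p
        alpha = rr / dot p ap
        x'    = axpy alpha p x
        r'    = axpy (-alpha) ap r
        beta  = dot r' r' / rr
        p'    = axpy beta p r'
```

In exact arithmetic CG terminates in at most n iterations for an n x n system, which makes small cases easy to sanity-check.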
Whoops, forgot I switched to darcs after some time. The latest version can be found here: http://www.funfem.org/browser/Numeric/Funfem/Algebra

Adrien
I've also been working haphazardly on some similar stuff lately. However, my focus is rather different [1], so I'm afeared not much code sharing could happen at the moment. While I'm certainly no expert on numerical methods, I seem to have acquired some experience in that domain, so I may be able to lend a hand from time to time.

[1] In particular, my goal has been to revive some old ideas about making linear algebra well-typed. The vast majority (if not all) of extant linear algebra systems are poorly typed and will do stupid things to "resolve" type errors (e.g., automatically padding, truncating, and reshaping things). Because of the use case I have in mind, this project also involves setting up a proper numerical typeclass hierarchy (i.e., one which expresses semirings and other things ignored by the numerical hierarchies out there today). My goal for all this is in setting up the type system, not performance.
I figure there are other folks who know and care a lot more about the numerical tricks of giving the actual implementations.

Live well,
~wren
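As a sketch of what "expressing semirings" might buy (a minimal class of my own, not wren's actual hierarchy): matrix multiplication only needs semiring structure, so one definition also covers Boolean reachability, tropical shortest paths, and similar interpretations that Num's forced negation rules out.

```haskell
import Data.List (transpose)

-- A semiring: an additive monoid and a multiplicative monoid on the
-- same carrier, with (.*.) distributing over (.+.). Num forces
-- negation on you; Bool, naturals, and min-plus weights lack it.
class Semiring a where
  zero  :: a
  one   :: a
  (.+.) :: a -> a -> a
  (.*.) :: a -> a -> a

instance Semiring Bool where
  zero = False; one = True
  (.+.) = (||); (.*.) = (&&)

instance Semiring Double where
  zero = 0; one = 1
  (.+.) = (+); (.*.) = (*)

-- Matrix product over any semiring: over Double it is the usual
-- product; over Bool it is relation composition / graph reachability.
mmul :: Semiring a => [[a]] -> [[a]] -> [[a]]
mmul xs ys =
  [ [ foldr (.+.) zero (zipWith (.*.) row col)
    | col <- transpose ys ]
  | row <- xs ]
```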
On Sun, Dec 2, 2012 at 10:52 AM, wren ng thornton <[hidden email]> wrote:
> My goal for all this is in setting up the type system, not performance. I figure there are other folks who know and care a lot more about the numerical tricks of giving the actual implementations.

You've got my support: old-school optimizations, numerical, compiler, or otherwise, are deadly boring. Death to them, I say! If we don't explore uncharted waters, who will?
-- Kim-Ee
On 12/1/12 11:58 PM, Kim-Ee Yeoh wrote:
> On Sun, Dec 2, 2012 at 10:52 AM, wren ng thornton <[hidden email]> wrote:
>> My goal for all this is in setting up the type system, not performance. I figure there are other folks who know and care a lot more about the numerical tricks of giving the actual implementations.
>
> You've got my support: old-school optimizations, numerical, compiler, or otherwise, are deadly boring. Death to them, I say! If we don't explore uncharted waters, who will?

Well, there are interesting things in optimization [1]; it's just that that's not my main area, and I have few enough round tuits as it is. I also don't spend much time thinking about hardware, but I'm terribly glad there're other folks who really care about it.

[1] For example, while matrix multiplication is associative, how exactly you associate things has a major impact on performance. Performance-minded compilers for linear algebra thus choose how to associate things by running an algorithm which is essentially the same as the chart-parsing algorithms in NLP. As an NLPer, I think that's awesome; and since I'm not sure if anyone else has made that connection before, it'd be nice to see what each side could learn from the other. One thing I'd like to get out of the type classes I'm working on is the ability to define a DSL which allows this sort of optimization.

Live well,
~wren
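The association-order optimization wren describes is the classic matrix-chain dynamic program, which fills a table of spans bottom-up exactly the way a CKY chart parser does. A sketch of the standard algorithm (my own code, not wren's):

```haskell
import qualified Data.Map as M

-- Minimum scalar-multiplication cost over all parenthesizations of a
-- chain of matrices with dimensions d0xd1, d1xd2, ..., d(n-1)xdn,
-- given as [d0, d1, ..., dn]. The table over spans (i, j) is built
-- lazily and tied into a knot: the same O(n^3) shape as CKY parsing.
chainCost :: [Int] -> Int
chainCost dims = table M.! (0, n - 1)
  where
    n = length dims - 1
    table = M.fromList
      [ ((i, j), best i j) | i <- [0 .. n - 1], j <- [i .. n - 1] ]
    best i j
      | i == j    = 0                       -- a single matrix is free
      | otherwise = minimum
          [ table M.! (i, k) + table M.! (k + 1, j)
            + dims !! i * dims !! (k + 1) * dims !! (j + 1)
          | k <- [i .. j - 1] ]             -- try every split point
```

For dimensions [10, 30, 5, 60], multiplying as (AB)C costs 10*30*5 + 10*5*60 = 4500 scalar multiplications, while A(BC) costs 27000, so the table picks 4500.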
Thanks to all for the feedback. As I investigate the structures for organizing a library of sparse matrix representations a bit more and look into the repa 3 library, I can't help but wonder if these sparse matrix types could just be additional instances of Source and Target in repa. Does anyone know of any reason why this would be a bad idea? I see that the lib was designed to be extended with new matrix representations. Just a thought.
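Whether repa 3's actual Source and Target classes accommodate sparse formats is worth checking against its documentation; purely to illustrate the general shape of the idea, one indexing interface with per-representation array types, here is a hypothetical class of our own (nothing below is repa's API):

```haskell
{-# LANGUAGE TypeFamilies #-}
import qualified Data.IntMap.Strict as IM

-- Hypothetical analogue of a repa-style representation class: each
-- representation tag r carries its own array type plus load/index.
class Repr r where
  data Mat r
  fromAssocs :: Int -> Int -> [((Int, Int), Double)] -> Mat r
  index      :: Mat r -> (Int, Int) -> Double

-- Dense representation: a flat row-major list (a toy, not efficient).
data D
instance Repr D where
  data Mat D = Dense Int Int [Double]
  fromAssocs r c as =
    Dense r c [ maybe 0 id (lookup (i, j) as)
              | i <- [0 .. r - 1], j <- [0 .. c - 1] ]
  index (Dense _ c xs) (i, j) = xs !! (i * c + j)

-- Sparse representation: nested IntMaps; missing entries are zero.
data S
instance Repr S where
  data Mat S = Sparse (IM.IntMap (IM.IntMap Double))
  fromAssocs _ _ as =
    Sparse (IM.fromListWith IM.union
             [ (i, IM.singleton j v) | ((i, j), v) <- as ])
  index (Sparse m) (i, j) =
    maybe 0 (IM.findWithDefault 0 j) (IM.lookup i m)
```

Algorithms written against the class then run unchanged over either representation, which is the property that makes the repa approach attractive here.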
