
digitalmars.D - Scientific computing with D

reply Lars Kyllingstad <public kyllingen.NOSPAMnet> writes:
I think D is, or at least could be, the scientific programming language 
of the future. Here's why -- and possibly how:

A couple of years ago, I took a university class called Numerical 
Physics. After finishing the course, I was left with the impression that 
numerical computing was all about squeezing every last bit of 
performance out of the computer -- that unnecessary operations and 
function calls should be avoided at any cost, even if the resulting code 
is full of nasty hacks and tricks, making it completely illegible and 
utterly unmaintainable. And of course, in many cases this is true.

Now, however, I have a bit more experience in the field and I know that 
it is not always so. In numerics, as in other areas of programming, it 
is a trade-off between development time and execution time. 
Traditionally, if one has a desperate need for speed (sorry), one uses 
FORTRAN, C or C++. The programs run very fast, but can be hard to 
develop, debug and maintain. For less processor-intensive tasks one uses 
Matlab, Mathematica, etc. which have a lot of built-in functionality and 
make for rapid development, but programs run at a snail's pace.

With D one has the best of both worlds. I've used both C++ and 
Mathematica for numerics in the past, but now I use D almost 
exclusively. I find it a lot easier (and more fun) to code in than C++, 
and I spend a LOT less time debugging my programs. On the other hand, 
calculations that would take an entire day in Mathematica are finished 
in a matter of minutes using D.

It's a fact that D programs don't have the performance of C(++) ones, 
but that, I think, is just a matter of time (pun not intended). It's a 
relatively new language, and the compilers are still somewhat immature.

The one thing I miss the most, however, and which I think is necessary 
for D to "take off" as a scientific language, is a native D scientific 
library.

Searching dsource, I find that many nice modules and libraries have been 
made already:
  - MultiArray (Bill, Fawzi)
  - dstat (dsimcha)
  - blip (Fawzi)
  - MathExtra, BLADE (Don)
  - Scrapple/backmath (BCS)
  - Scrapple/units (BCS)
  - bindings to GSL, BLAS, etc.
  - ...and probably more

Myself, I've written/ported some routines for numerical differentiation 
and integration, one- and multi-dimensional root-finding and some very 
basic linear algebra, but so far only for personal use. Currently, I'm 
thinking of porting QUADPACK to D.
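To give a flavour of what I mean, here is a minimal sketch of one such routine (all names are my own invention, not from any existing library): a central-difference approximation to the derivative.

```d
import std.math : abs, sin;

// Sketch only: central-difference approximation to f'(x).
// The step size h trades truncation error against round-off error.
real diffCentral(F)(F f, real x, real h = 1e-6L)
{
    return (f(x + h) - f(x - h)) / (2.0L * h);
}

void main()
{
    // d/dx sin(x) at x = 0 is cos(0) = 1.
    auto d = diffCentral((real x) => sin(x), 0.0L);
    assert(abs(d - 1.0L) < 1e-6);
}
```

Thanks to templates, the same routine accepts function pointers, delegates and lambdas alike, with no virtual-call overhead.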

I think it would be really nice if many or all of the above-mentioned
things could be collected in a single library, together with all kinds 
of other stuff. Something like the GSL, only written in D. (In my head 
it's called SciD. At first I thought of DSL - D Scientific Library - but 
that acronym is used all over the place.) I haven't the time, nor the 
skills, to write an entire such library myself, but I'd be happy to 
contribute where I can.

Here are some design goals I find important:
  - sensible, logical and tidy package hierarchy
  - access to high-level functionality for rapid development
  - access to low-level functionality for performance
  - make use of D's awesome compile-time functionality
  - reusability, pluggability and extensibility

(By the last point I mean that if one method doesn't work it should be 
quickly and easily replaceable in code with something else. This is 
achievable through the use of templates and interfaces, and ties in with 
the second point.)
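As a rough sketch of what I mean by pluggability (all names invented for illustration): if the integration rule is a compile-time parameter, swapping one method for another is a one-word change at the call site.

```d
import std.math : abs;

// Sketch: an integrator parameterized on its quadrature rule.
real integrate(alias rule, F)(F f, real a, real b, size_t n = 1000)
{
    real sum = 0.0L;
    immutable h = (b - a) / n;
    foreach (i; 0 .. n)
        sum += rule(f, a + i * h, h);
    return sum;
}

// Midpoint rule as one interchangeable choice...
real midpoint(F)(F f, real x, real h) { return f(x + h / 2) * h; }

// ...and the trapezoid rule as another.
real trapezoid(F)(F f, real x, real h) { return (f(x) + f(x + h)) * h / 2; }

void main()
{
    // Integrate x^2 over [0, 1]; the exact answer is 1/3.
    auto m = integrate!midpoint((real x) => x * x, 0.0L, 1.0L);
    auto t = integrate!trapezoid((real x) => x * x, 0.0L, 1.0L);
    assert(abs(m - 1.0L / 3.0L) < 1e-4);
    assert(abs(t - 1.0L / 3.0L) < 1e-4);
}
```

Since the rule is an alias parameter, the compiler can inline it, so the high-level interface costs nothing at run time.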

So, what do you think? Am I making any sense? Am I the only one 
interested in these things?

All of the above are, of course, my personal opinions. What are yours?


-Lars
Jan 30 2009
next sibling parent reply dsimcha <dsimcha yahoo.com> writes:
== Quote from Lars Kyllingstad (public kyllingen.NOSPAMnet)'s article
 [...]
I think you're definitely onto something. My other problem with Matlab, R, etc. besides that they're slow is that they're _too_ domain specific. They're very good at what they're good at, but the minute you try to do any more general purpose programming with them the deficiencies become very obvious. A lot of engineers I know try to use Matlab as a general purpose language b/c they don't want to learn anything else. I think that, in addition to speed, D is a good language for this kind of stuff because it's general purpose, but has enough features (operator overloading, templates, garbage collection, etc.) to reimplement a lot of Matlab, etc. as a plain old library with decent syntax and ease of use. This way, when your domain specific language isn't enough for some subproblem, you have a _real, full-fledged_ general purpose language standing behind it.
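For instance, here's a rough sketch (types and names invented, not any real library) of how operator overloading lets a plain D library type read almost like Matlab:

```d
// Sketch: a vector type whose element-wise + and * come from
// a single templated operator.
struct Vec
{
    double[] data;

    Vec opBinary(string op)(Vec rhs) if (op == "+" || op == "*")
    {
        auto r = new double[data.length];
        foreach (i; 0 .. data.length)
            mixin("r[i] = data[i] " ~ op ~ " rhs.data[i];");
        return Vec(r);
    }
}

void main()
{
    auto a = Vec([1.0, 2.0, 3.0]);
    auto b = Vec([4.0, 5.0, 6.0]);
    // Reads like Matlab, but it's just library code:
    auto c = a + b * b;
    assert(c.data == [17.0, 27.0, 39.0]);
}
```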
Jan 30 2009
parent reply Bill Baxter <wbaxter gmail.com> writes:
On Fri, Jan 30, 2009 at 10:54 PM, dsimcha <dsimcha yahoo.com> wrote:
 == Quote from Lars Kyllingstad (public kyllingen.NOSPAMnet)'s article
 I think you're definitely onto something.  My other problem with Matlab, R,
etc.
 besides that they're slow is that they're _too_ domain specific.  They're very
 good at what they're good at, but the minute you try to do any more general
 purpose programming with them the deficiencies become very obvious.
You should check out NumPy/SciPy. That's exactly their mantra. All the flexibility and ease of Matlab/R, etc. BUT backed by a real, solid general purpose language.
 A lot of
 engineers I know try to use Matlab as a general purpose language b/c they don't
 want to learn anything else.  I think that, in addition to speed, D is a good
 language for this kind of stuff because it's general purpose, but has enough
 features (operator overloading, templates, garbage collection, etc.) to
 reimplement a lot of Matlab, etc. as a plain old library with decent syntax and
 ease of use.  This way, when your domain specific language isn't enough for
some
 subproblem, you have a _real, full-fledged_ general purpose language standing
 behind it.
I use NumPy often for its interactive capabilities: plotting and exploring data at the Python prompt. That's hard to do with a compiled language; a static language like D can't easily satisfy that kind of use case. Maybe Sci-MiniD there? :-) But for fixed, compiled stuff, D is certainly the biz. I really wish there were a good plotting package for D. That would eliminate about half of my trips over to Python-land, which are just to get a quick peek at what the data generated in my D program looks like. --bb
Jan 30 2009
next sibling parent reply Don <nospam nospam.com> writes:
Bill Baxter wrote:
 [...]
I use NumPy often for it's interactive capabilities. Plotting and exploring data at the Python prompt. That's hard to do with a compiled language. A static language like D cannot satisfy that kind of use-case easily. Maybe Sci-MiniD there? :-) But fixed, compiled stuff, D is certainly the biz. I really wish there were a good plotting package for D. That would eliminate about half of my trips over to Python-land, which are just to get a quick peek at what the data generated in my D program looks like.
I agree. I imagine that even something fairly basic which could just write to a png file, or pop up an OpenGL window (ie, not publication quality), would cover a big chunk of the use cases.
Jan 30 2009
next sibling parent Fawzi Mohamed <fmohamed mac.com> writes:
On 2009-01-30 20:52:05 +0100, Don <nospam nospam.com> said:

 Bill Baxter wrote:
 On Fri, Jan 30, 2009 at 10:54 PM, dsimcha <dsimcha yahoo.com> wrote:
 == Quote from Lars Kyllingstad (public kyllingen.NOSPAMnet)'s article
 [...]
I use NumPy often for it's interactive capabilities. Plotting and exploring data at the Python prompt. That's hard to do with a compiled language. A static language like D cannot satisfy that kind of use-case easily. Maybe Sci-MiniD there? :-)
I am thinking of using Xpose and/or the serialization interface I defined to get there in the future, but for the moment it is more an idea than a reality.
 But fixed, compiled stuff, D is certainly the biz.  I really wish
 there were a good plotting package for D.  That would eliminate about
 half of my trips over to Python-land, which are just to get a quick
 peek at what the data generated in my D program looks like.
I agree. I imagine that even something faily basic which could just write to a png file, or pop up an OpenGL window (ie, not publication quality), would cover a big chunk of the use cases.
Yes, it would be very nice indeed, and it is exactly in those things that keeping full cross-compatibility becomes difficult. Pure algorithms can more easily stay independent, but as soon as you begin to need input/output, logging, parallelizing, ... you have to make a choice about what you use :(
Jan 30 2009
prev sibling parent reply Chad J <gamerchad __spam.is.bad__gmail.com> writes:
Don wrote:
 Bill Baxter wrote:
 But fixed, compiled stuff, D is certainly the biz.  I really wish
 there were a good plotting package for D.  That would eliminate about
 half of my trips over to Python-land, which are just to get a quick
 peek at what the data generated in my D program looks like.
I agree. I imagine that even something faily basic which could just write to a png file, or pop up an OpenGL window (ie, not publication quality), would cover a big chunk of the use cases.
Consider generating an svg image. The potentially infinite granularity is nice and it may also allow you to offload the rendering work to something else. All you have to do is describe where some lines, numbers, and points need to go, while you can safely ignore working with pixels, anti-aliasing, drawing lines, drawing circles, drawing dotted lines, etc etc. As it's somewhat related, I'll mention that I'm working on a library in D that renders svg images using OpenGL. It can currently trace lines, rectangles, and circles, and more or less understands svg's nesting hierarchy and mathematical transformations. I'll try to finish off all of the filling, blending, stroking, and such things this Spring quarter. The license will be something liberal (zlib/freebsd/etc).
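To illustrate how little is needed, here's a minimal sketch (names invented): plotting a data series is just describing a polyline, and the SVG viewer does all the rasterizing, anti-aliasing and scaling.

```d
import std.stdio : File;
import std.format : format;

// Sketch: dump a data series as a single SVG polyline.
// Assumes at least two points and y-values within [0, h].
void writeSvgPlot(string path, double[] ys, double w = 400, double h = 200)
{
    auto f = File(path, "w");
    f.writefln(`<svg xmlns="http://www.w3.org/2000/svg" width="%s" height="%s">`, w, h);
    string pts;
    foreach (i, y; ys)
        pts ~= format("%s,%s ", i * w / (ys.length - 1), h - y);  // flip y-axis
    f.writefln(`<polyline fill="none" stroke="black" points="%s"/>`, pts);
    f.writeln(`</svg>`);
}

void main()
{
    // Four data points become one polyline element.
    writeSvgPlot("plot.svg", [10.0, 80.0, 40.0, 120.0]);
}
```

No pixels, no line-drawing code: just a description of where the lines go.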
Jan 30 2009
parent reply Bill Baxter <wbaxter gmail.com> writes:
On Sat, Jan 31, 2009 at 7:32 AM, Chad J
<gamerchad __spam.is.bad__gmail.com> wrote:
 [...]
Consider generating an svg image. The potentially infinite granularity is nice and it may also allow you to offload the rendering work to something else. All you have to do is describe where some lines, numbers, and points need to go, while you can safely ignore working with pixels, anti-aliasing, drawing lines, drawing circles, drawing dotted lines, etc etc. As it's somewhat related, I'll mention that I'm working on a library in D that renders svg images using OpenGL. It can currently trace lines, rectangles, and circles, and more or less understands svg's nesting hierarchy and mathematical transformations. I'll try to finish off all of the filling, blending, stroking, and such things this Spring quarter. The license will be something liberal (zlib/freebsd/etc).
Nice. I wrote something to do the opposite :-) To go from OpenGL to an SVG file. Actually I just cobbled together a couple of existing projects: OGLE (a non-intrusive OpenGL32.dll replacement that can output jpegs and wavefront .objs) and gl2ps (an intrusive API for outputting GL to PS/SVG/etc.) -- resulting in a non-intrusive solution for capturing GL output to an SVG. I've been using that heavily to make figures for papers lately. I haven't actually made the code available anywhere... should do that eventually. Back to your project, how are you handling A) filled polygons (gluTess?) B) antialiasing? Some day I'd love to try to port AGG to D. I bet it's possible to make some really slick and elegant improvements to AGG given D's templates. --bb
Jan 30 2009
parent reply Chad J <gamerchad __spam.is.bad__gmail.com> writes:
Bill Baxter wrote:
 [...]
Nice. I wrote something to do the opposite :-) To go from OpenGL to an SVG file. Actually I just cobbled together a couple of existing projects OGLE (non-intrusive OpenGL32.dll replacement that can output jpegs and wavefront .objs ) and gl2ps (intrusive API for outputing GL to PS/SVG/etc) -- resulting in a non-intrusive solution for capturing GL output to an SVG. Been using that heavily to make figures for papers lately.
Huh. That sounds rather impressive and useful.
 I haven't actually made the code available anywhere... should do that
 eventually.
 
 Back to your project, how are you handling A) filled polygons
 (gluTess?)   B) antialiasing?
 
A.) Hand rolled tessellation (working on it, actually getting a bit distracted by it since it's an interesting problem and fun to play with.) B.) Antialiasing - let OpenGL handle it.
 Some day I'd love to try to port AGG to D.  I bet its possible to make
 some really slick and elegant improvements to AGG given D's templates.
 
 --bb
Ah AGG. Looked good, though I didn't use it because of the GPL in 2.5 (semi-maintained-maybe branch) and because I tend to avoid things that aren't maintained (2.4 branch). Oh and yeah wrapping C++ isn't fun. Maybe port the 2.4? ;)
Jan 30 2009
parent reply Bill Baxter <wbaxter gmail.com> writes:
On Sat, Jan 31, 2009 at 9:14 AM, Chad J
<gamerchad __spam.is.bad__gmail.com> wrote:
 [...]
Ah AGG. Looked good, though I didn't use it because of the GPL in 2.5 (semi-maintained-maybe branch) and because I tend to avoid things that aren't maintained (2.4 branch). Oh and yeah wrapping C++ isn't fun. Maybe port the 2.4? ;)
Yeah, definitely. I'm not even going to look at the 2.5 GPL version. And wrapping that is just out of the question. Well, you could, but I don't see the point when D's templates are so rockin'. As for stagnation, the 2.4 version is actually very mature. The mailing list is still pretty active, and some folks are maintaining it over on SourceForge, I think. Anyway, it seems Mr. AGG himself isn't really working on AGG either these days, since he got a job where he gets paid to do similar stuff. Probably not enough time, and too much legal murkiness about his work product vs. his free-time effort. There was just a big debate on the AGG mailing list yesterday where people were saying AGG is withering away, and a half dozen people sprang up to say it wasn't dead, it just works fine and doesn't need much maintenance. So that's a pretty ideal situation for porting to D. If it's pretty much a finished lib, then there won't even be much in the way of headaches merging in new updates, since there probably won't be many. Anyway, the antialiasing of AGG looks very nice. :-) I did my previous (C++) project using AGG for rendering, but this time I just decided to go with GL. GL is way easier to work with, but I do miss the quality of AGG rendering. MSAA/FSAA just can't hold a candle to genuine primitive-level antialiasing (2~16 gradations vs 256). Still, most of us have pretty decent graphics hardware on our desks these days. It's sad that we still can't really use it to get decent-quality 2D graphics. So really I think the future is not AGG but some clever shaders to do AA with graphics hardware. I think DX10-class hardware opens up new possibilities here. And probably the distant, distant future is back to just brute-force 256x oversampling :-) --bb
Jan 30 2009
parent reply Chad J <gamerchad __spam.is.bad__gmail.com> writes:
Bill Baxter wrote:
 On Sat, Jan 31, 2009 at 9:14 AM, Chad J
 <gamerchad __spam.is.bad__gmail.com> wrote:
 ...
 Maybe port the 2.4?  ;)
 [...]
Interesting. So AGG is not dead and that looks like it'd be a nice project. A high quality software rendering SVG lib complementing a GL project. It would also be neat to see shaders/GPGPU stuff applied to an SVG renderer. Not just for the AA but doing all of the filters quickly too would be great. Right now though I'm just trying to write a reference render path without shaders or anything--something that even the really old hardware will run. I think I might require a stencil buffer. Oh dear. Hopefully this also makes porting to portable devices easier. Eventually shader/GPGPU trickery can be applied to optimize and add awesome, though it's probably not worth enough to me personally. I'd happily watch other people do it though :)
Jan 30 2009
next sibling parent reply Bill Baxter <wbaxter gmail.com> writes:
On Sat, Jan 31, 2009 at 10:41 AM, Chad J
<gamerchad __spam.is.bad__gmail.com> wrote:
 [...]
You may know about these things already but... The two things I had my eyes on when I was last looking into 2D rendering were Amanith (www.amanith.org) and OpenVG (http://www.khronos.org/openvg/). It looks like Amanith has morphed into a commercial OpenVG implementation... hmm, I think it was open source (GPL?) for a while. It was originally just meant to be a nice 2D rendering lib on top of OpenGL. There's also the "glitz" backend for Cairo, though I think it's also a little dead. It wasn't really functional last I checked. And Cairo wasn't very Windows-friendly then either. --bb
Jan 30 2009
parent reply Chad J <gamerchad __spam.is.bad__gmail.com> writes:
Bill Baxter wrote:
 
 You may know about these things already but...
 
Yep :)
 The two things I had my eyes on when I was last looking into 2D
 rendering were Amanith (www.amanith.org) and OpenVG
 (http://www.khronos.org/openvg/)
 
 It looks like Amanith has morphed into a commercial OpenVG
 implementation... hmm I think it was open source (GPL?) for a while.
 It was original just meant to be a nice 2D rendering lib on top of
 OpenGL.
 
Yeah, commercial/GPL is undesirable. As for OpenVG, that looked cool, but I wonder if it has any implementations besides the reference one. It was soooo similar to SVG but had that caveat that there might be subtle differences every now and then to bite you. Most importantly though I'd like it to actually be implemented everywhere with hardware backing.
 There's also the "glitz" backend for Cairo, though I think it's also a
 little dead.  It wasn't really functional last I checked.  And Cairo
 wasn't very Windows-friendly then either.
 
 --bb
Cairo... ugh... for svg rendering it pulls in RSVG (IIRC), and that pulled in some Gnome deps (?!). So it had its fair share of violations of my criteria of no unreasonable dependencies and liberal licensing. I think we concur that this one is not on the list :/
Jan 30 2009
parent reply Daniel Keep <daniel.keep.lists gmail.com> writes:
Chad J wrote:
 [...]
Cairo... ugh... for svg rendering it pulls in RSVG (IIRC) and that pulled in some Gnome deps (?!). So it had it's fair share of violating my criterion of no unreasonable dependencies and liberal licensing. I think we concur that this one is not on the list :/
Well, Cairo isn't an SVG library; it's a rendering API like OpenGL. If you want to read or write SVG, you need another library for that. As for RSVG pulling in Gnome dependencies, that's because it's a Gnome library. Honestly, I've used Cairo from D a number of times, and it's very, VERY nice to work with. If all you're doing is 2D, it beats the pants off OpenGL. As for acceleration, I believe that it *is* accelerated under Windows provided you're rendering to an actual GDI surface (and provided the video card drivers accelerate GDI calls). But if you're rendering to an image buffer then no, it isn't. Which sucks, but there you go. Incidentally, I've got a visualisation project underway that's using Cairo to render the output; works just fine. [ This message brought to you by "that guy" who wrote the Cairo bindings and feels compelled to defend his choices so he doesn't look like a berk. ] -- Daniel
Jan 30 2009
parent Chad J <gamerchad __spam.is.bad__gmail.com> writes:
Daniel Keep wrote:
 
 [...]
Alright, well that makes sense. I suppose I should back off a bit and say that using Cairo as a lead for doing SVG graphics didn't go anywhere nice. I can certainly see Cairo working for other procedural graphics just fine though.
Jan 30 2009
prev sibling parent reply Bill Baxter <wbaxter gmail.com> writes:
On Sat, Jan 31, 2009 at 10:54 AM, Bill Baxter <wbaxter gmail.com> wrote:
 On Sat, Jan 31, 2009 at 10:41 AM, Chad J
 <gamerchad __spam.is.bad__gmail.com> wrote:
 Still, most of us have pretty decent graphics hardware on our desks
 these days.  It's sad that we still can't really use it to get decent
 quality 2D graphics.  So really I think the future is not AGG but some
 clever shaders to do AA with graphics hardware.  I think DX10 class
 hardware opens up new possibilities here.  And probably the distant
 distant future is back to just brute force 256x oversampling :-)

 --bb
Interesting. So AGG is not dead and that looks like it'd be a nice project. A high quality software rendering SVG lib complementing a GL project. It would also be neat to see shaders/GPGPU stuff applied to an SVG renderer. Not just for the AA but doing all of the filters quickly too would be great.

Right now though I'm just trying to write a reference render path without shaders or anything -- something that even the really old hardware will run. I think I might require a stencil buffer. Oh dear. Hopefully this also makes porting to portable devices easier.

Eventually shader/GPGPU trickery can be applied to optimize and add awesome, though it's probably not worth enough to me personally. I'd happily watch other people do it though :)
You may know about these things already, but... The two things I had my eyes on when I was last looking into 2D rendering were Amanith (www.amanith.org) and OpenVG (http://www.khronos.org/openvg/). It looks like Amanith has morphed into a commercial OpenVG implementation... hmm. I think it was open source (GPL?) for a while. It was originally just meant to be a nice 2D rendering lib on top of OpenGL.

There's also the "glitz" backend for Cairo, though I think it's also a little dead. It wasn't really functional last I checked. And Cairo wasn't very Windows-friendly then either.
Oh, another one worth checking out is Thatcher Ulrich's GameSWF. It's very open and free code, so it's a good reference for robust triangulation and such. http://tulrich.com/geekstuff/gameswf.html

Robust triangulation is pretty hard, actually. So I would encourage you to try to reuse something that already exists, like Thatcher Ulrich's code. Or taking the MESA version of GLU Tess and refactoring it till its API doesn't suck. In fact I think Thatcher said over on the gdalgorithms mailing list that if he had known about the open Mesa implementation of GLU Tess he would have hacked on that instead of rolling his own triangulator. And according to some of the other old-timers on the gdalgorithms list, GLU Tess is the only freely available triangulator out there that isn't a fragile piece of junk.

--bb
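The "robust triangulation is pretty hard" point is easy to demonstrate. Below is a hedged sketch (in Python for illustration; the libraries discussed in the thread are C/C++) of the textbook ear-clipping algorithm. It handles a simple counter-clockwise polygon, but the assumptions baked into it (simple, CCW, no degenerate collinear runs) are exactly what robust libraries like GLU Tess or Shewchuk's Triangle exist to get rid of:

```python
# Naive ear-clipping triangulator for a simple CCW polygon -- a hedged
# sketch, NOT robust: self-intersecting or degenerate input breaks it,
# which is the thread's point about needing GLU Tess / Triangle.

def cross(o, a, b):
    # z-component of (a - o) x (b - o); positive = left turn for CCW input
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def point_in_tri(p, a, b, c):
    # p is inside (or on the edge of) CCW triangle abc
    return cross(a, b, p) >= 0 and cross(b, c, p) >= 0 and cross(c, a, p) >= 0

def ear_clip(poly):
    verts = list(range(len(poly)))
    tris = []
    while len(verts) > 3:
        for i in range(len(verts)):
            prev, cur, nxt = verts[i - 1], verts[i], verts[(i + 1) % len(verts)]
            a, b, c = poly[prev], poly[cur], poly[nxt]
            if cross(a, b, c) <= 0:
                continue  # reflex vertex: not an ear
            if any(point_in_tri(poly[v], a, b, c)
                   for v in verts if v not in (prev, cur, nxt)):
                continue  # another vertex inside the candidate ear
            tris.append((prev, cur, nxt))
            verts.pop(i)
            break
        else:
            raise ValueError("degenerate input: no ear found")
    tris.append(tuple(verts))
    return tris

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(ear_clip(square))  # a quad always splits into two triangles
```

An n-vertex simple polygon always yields n-2 triangles, so the square gives two; the fragility only shows up on the nasty inputs real SVG data is full of.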
Jan 30 2009
next sibling parent reply Chad J <gamerchad __spam.is.bad__gmail.com> writes:
Bill Baxter wrote:
 
 Oh, another one worth checking out is Thatcher Ulrich's GameSWF.  It's
 very open and free code, so it's a good reference for robust
 triangulation and such.  http://tulrich.com/geekstuff/gameswf.html
 
 Robust triangulation is pretty hard, actually.  
Yep, noticed :)
 So I would encourage
 you to try to reuse something that already exists like Thatcher
 Ulrich's code.  Or taking the MESA version of GLUTess and refactoring
 it till its API doesn't suck.   In fact I think Thatcher said over
 on the gdalgorithms mailing list that if he had known about the open
 Mesa implementation of GLU Tess he would have hacked on that instead
 of rolling his own triangulator.   And according to some of the other
 old-timers on gdalgorithms list, GLU tess is the only freely available
 triangulator out there that isn't a fragile piece of junk.
 
 --bb
Now these are some things I have not run across. I did not know about the open Mesa implementation of GLU Tess. I may just fall back on that or Thatcher's code once I've had my fun.

I'm also shooting for very minimal dependencies, so I'd like to avoid linking against GLU, even if avoiding it is kinda silly. Being able to just port the open code to D would be nice. I think so far my renderer only requires Tango (for XML parsing and some odds and ends) and OpenGL (the whole point).

Thanks for the info!
Jan 30 2009
parent Bill Baxter <wbaxter gmail.com> writes:
On Sat, Jan 31, 2009 at 11:43 AM, Chad J
<gamerchad __spam.is.bad__gmail.com> wrote:
 Now these are some things I have not run across.  I did not know about
 the open Mesa implementation of GLU Tess.  I may just fall back on that
 or Thatcher's code once I've had my fun.
 I'm also shooting for very minimal dependencies, so I'd like to avoid
 linking against GLU, even if avoiding it is kinda silly.  Being able to
 just port the open code to D would be nice.  I think so far my renderer
 only requires Tango (for XML parsing and some odds and ends) and OpenGL
 (the whole point).
Yeh, using GLU Tess from the GLU lib is better than nothing (and probably better than rolling your own) if you don't have time to do better. But from what I understand, while the algorithms in GLU Tess are quite good and solid, the C interface is just horrid, requiring lots of extra unnecessary internal allocations and such.

Someone in that GDAlgorithms conversation said they had ripped out the Mesa GLU code, put a C++ interface on it, eliminated excess allocations, and used an efficient pool allocator for the rest. After that they said it performed pretty well. But it was for a company, so unfortunately he couldn't share the code.

I started looking into doing that myself, but I ended up just using Shewchuk's Triangle library for the time being. I don't have a stringent time requirement on my triangulations. Anything under half a sec or so is ok for me.

--bb
Jan 30 2009
prev sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
 So AGG is not dead and that looks like it'd be a nice project.  A high
 quality software rendering SVG lib complementing a GL project.
I have yet to find something that produces a quality higher than AGG. It's the graphics of the future :-) (Despite being done by the CPU)

Bill Baxter:
 Oh, another one worth checking out is Thatcher Ulrich's GameSWF.
I have sometimes used this one, in C++: http://cimg.sourceforge.net/ I may find it useful to have the same in D.

Bye,
bearophile
Jan 30 2009
parent Daniel Keep <daniel.keep.lists gmail.com> writes:
bearophile wrote:
 So AGG is not dead and that looks like it'd be a nice project.  A high
 quality software rendering SVG lib complementing a GL project.
I have yet to find something that produces a quality higher than AGG. It's the graphics of the future :-) (Despite being done by the CPU)
Funnily enough, it's not inconceivable: http://en.wikipedia.org/wiki/Larrabee_(GPU) -- Daniel
Jan 31 2009
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Bill Baxter wrote:
 I use NumPy often for its interactive capabilities.  Plotting and
 exploring data at the Python prompt.   That's hard to do with a
 compiled language.    A static language like D cannot satisfy that
 kind of use-case easily.  Maybe Sci-MiniD there? :-)
The D compiler is fast enough that this should be quite doable. On my 6 year old XP machine, compiling and linking a program a few lines long takes less than half a second.
Jan 31 2009
next sibling parent reply Bill Baxter <wbaxter gmail.com> writes:
On Sat, Jan 31, 2009 at 7:51 PM, Walter Bright
<newshound1 digitalmars.com> wrote:
 Bill Baxter wrote:
 I use NumPy often for its interactive capabilities.  Plotting and
 exploring data at the Python prompt.   That's hard to do with a
 compiled language.    A static language like D cannot satisfy that
 kind of use-case easily.  Maybe Sci-MiniD there? :-)
The D compiler is fast enough that this should be quite doable. On my 6 year old XP machine, compiling and linking a program a few lines long takes less than half a second.
It's more the interactive prompt part that's hard to do with a compiled language. Where I have a big array of data and don't really know what it looks like. So I plot it. Then I see some funny bumps. So I do an FFT on it and plot that to see if there are any suspicious spikes. And I notice something there which gives me an idea to try something else. Or I figure out the component that I'm not interested in and subtract that off. Etc. Having to recompile and rerun after every one of those changes just isn't quite as direct.

On the other hand, sometimes the thing you're doing gets too complicated for one-liners at the command prompt and you have to move to a script. For that, D would potentially be able to hold its own.

--bb
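The exploratory loop described above -- look at the data, transform it, spot the suspicious spike -- can be made concrete. A hedged sketch in Python (the prompt language under discussion), using a tiny pure-stdlib DFT in place of numpy.fft so it is self-contained; the synthetic signal is invented for the example:

```python
# Hedged sketch of the exploratory workflow: a slow trend hides an
# oscillation; transforming to the frequency domain makes it obvious.
# A tiny O(n^2) DFT stands in for numpy.fft to keep this stdlib-only.
import cmath
import math

def dft(xs):
    n = len(xs)
    return [sum(x * cmath.exp(-2j * math.pi * k * i / n)
                for i, x in enumerate(xs))
            for k in range(n)]

# Synthetic "data": a slow linear trend plus a hidden oscillation at bin 8.
n = 64
signal = [0.02 * i + math.sin(2 * math.pi * 8 * i / n) for i in range(n)]

# "Plot the FFT": here we just find the dominant non-DC component.
spectrum = [abs(c) for c in dft(signal)]
peak = max(range(1, n // 2), key=lambda k: spectrum[k])
print(peak)  # the oscillation shows up as a spike at bin 8
```

At an interactive prompt each of those steps is one throwaway line you can revise on sight; in a compiled language each tweak costs an edit-compile-run cycle, which is exactly the directness gap being described.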
Jan 31 2009
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Bill Baxter wrote:
 Having to recompile and rerun after every one of those changes just
 isn't quite as direct.
If it can be done in under half a second, isn't that direct enough? Of course, I'm talking about a shell that does it for you.
Jan 31 2009
parent reply Daniel Keep <daniel.keep.lists gmail.com> writes:
Walter Bright wrote:
 Bill Baxter wrote:
 Having to recompile and rerun after every one of those changes just
 isn't quite as direct.
If it can be done in under half a second, isn't that direct enough? Of course, I'm talking about a shell that does it for you.
$ int a = 42;
$ writefln("a = %s", a);
$ double a = 3.0; // rounded to 1 sf

How would you write a prompt that does that with D? Either you store each successive line in a source file and choke on the third one, or you compile each line separately and choke on the second. Or you could examine each line to look for things like redefining of symbols... but at that point you're half way to writing an interpreter anyway.

-- Daniel
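The first failure mode Daniel describes -- append every line to one source buffer and recompile the whole thing -- is worth sketching. This is a hedged toy in Python: the "compiler" is a regex check standing in for dmd's redeclaration error, and only the three session lines from the post are fed through it.

```python
# Hedged toy model of an accumulate-and-recompile shell. compile_buffer is
# a stand-in for dmd: it rejects any symbol declared twice, as a real
# compiler would when the whole buffer is recompiled as one module.
import re

DECL = re.compile(r"^\s*(?:int|double)\s+(\w+)\s*=")

def compile_buffer(lines):
    seen = set()
    for line in lines:
        m = DECL.match(line)
        if m:
            name = m.group(1)
            if name in seen:
                raise SyntaxError("redeclaration of '%s'" % name)
            seen.add(name)

session = ['int a = 42;',
           'writefln("a = %s", a);',
           'double a = 3.0;']

buffer, results = [], []
for line in session:
    buffer.append(line)
    try:
        compile_buffer(buffer)   # recompile everything entered so far
        results.append("ok")
    except SyntaxError as e:
        results.append("error: %s" % e)

print(results)  # the third line chokes, exactly as predicted
```

Compiling each line separately instead would fail one line earlier, on the `writefln` call's reference to `a` -- hence the "half way to writing an interpreter" conclusion.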
Jan 31 2009
next sibling parent Bill Baxter <wbaxter gmail.com> writes:
On Sat, Jan 31, 2009 at 8:39 PM, Daniel Keep
<daniel.keep.lists gmail.com> wrote:
 Walter Bright wrote:
 Bill Baxter wrote:
 Having to recompile and rerun after every one of those changes just
 isn't quite as direct.
If it can be done in under half a second, isn't that direct enough? Of course, I'm talking about a shell that does it for you.
$ int a = 42;
$ writefln("a = %s", a);
$ double a = 3.0; // rounded to 1 sf

How would you write a prompt that does that with D? Either you store each successive line in a source file and choke on the third one, or you compile each line separately and choke on the second. Or you could examine each line to look for things like redefining of symbols... but at that point you're half way to writing an interpreter anyway.
[/me discards half-written message saying the same thing half as well with twice as many words]

*If* you could invoke the compiler as a library and have it return you a pointer to a freshly compiled function in memory somewhere, then you might have a shot at something that's a usable interactive prompt. Hmm, compiler as a DLL... sounds familiar. :-)

--bb
Jan 31 2009
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Daniel Keep wrote:
 
 Walter Bright wrote:
 Bill Baxter wrote:
 Having to recompile and rerun after every one of those changes just
 isn't quite as direct.
If it can be done in under half a second, isn't that direct enough? Of course, I'm talking about a shell that does it for you.
$ int a = 42;
$ writefln("a = %s", a);
$ double a = 3.0; // rounded to 1 sf

How would you write a prompt that does that with D? Either you store each successive line in a source file and choke on the third one, or you compile each line separately and choke on the second. Or you could examine each line to look for things like redefining of symbols... but at that point you're half way to writing an interpreter anyway.
It's the shell's responsibility to decide what semantics to present to the user, I'm just saying that the process of turning a code snippet into an executable is fast and should not be a barrier.
Jan 31 2009
next sibling parent Daniel Keep <daniel.keep.lists gmail.com> writes:
Walter Bright wrote:
 Daniel Keep wrote:
 Walter Bright wrote:
 Bill Baxter wrote:
 Having to recompile and rerun after every one of those changes just
 isn't quite as direct.
If it can be done in under half a second, isn't that direct enough? Of course, I'm talking about a shell that does it for you.
$ int a = 42;
$ writefln("a = %s", a);
$ double a = 3.0; // rounded to 1 sf

How would you write a prompt that does that with D? Either you store each successive line in a source file and choke on the third one, or you compile each line separately and choke on the second. Or you could examine each line to look for things like redefining of symbols... but at that point you're half way to writing an interpreter anyway.
It's the shell's responsibility to decide what semantics to present to the user, I'm just saying that the process of turning a code snippet into an executable is fast and should not be a barrier.
Perhaps, but the above example shows why interactive prompts are so useful, and why simply feeding them to a standard compiler isn't likely to work very well. Speed isn't the issue in this particular case.

Here's another fun one:

$ void hello() { writefln("Hello, World!"); }
$ hello();
Hello, World!
$ import enAU : hello; hello();
G'day, World!

An interactive prompt, to be really useful, should have slightly different semantics to the language proper. I personally think that turning DMD into a library won't help; you really need a purpose-built interpreter.

-- Daniel
Jan 31 2009
prev sibling parent reply Christopher Wright <dhasenan gmail.com> writes:
Walter Bright wrote:
 Daniel Keep wrote:
 Walter Bright wrote:
 Bill Baxter wrote:
 Having to recompile and rerun after every one of those changes just
 isn't quite as direct.
If it can be done in under half a second, isn't that direct enough? Of course, I'm talking about a shell that does it for you.
$ int a = 42;
$ writefln("a = %s", a);
$ double a = 3.0; // rounded to 1 sf

How would you write a prompt that does that with D? Either you store each successive line in a source file and choke on the third one, or you compile each line separately and choke on the second. Or you could examine each line to look for things like redefining of symbols... but at that point you're half way to writing an interpreter anyway.
It's the shell's responsibility to decide what semantics to present to the user, I'm just saying that the process of turning a code snippet into an executable is fast and should not be a barrier.
If bash took 0.5 seconds to execute anything, I wouldn't use it. If it were something that I used infrequently, I'd tolerate that.
Jan 31 2009
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Christopher Wright wrote:
 Walter Bright wrote:
 Daniel Keep wrote:
 Walter Bright wrote:
 Bill Baxter wrote:
 Having to recompile and rerun after every one of those changes just
 isn't quite as direct.
If it can be done in under half a second, isn't that direct enough? Of course, I'm talking about a shell that does it for you.
$ int a = 42;
$ writefln("a = %s", a);
$ double a = 3.0; // rounded to 1 sf

How would you write a prompt that does that with D? Either you store each successive line in a source file and choke on the third one, or you compile each line separately and choke on the second. Or you could examine each line to look for things like redefining of symbols... but at that point you're half way to writing an interpreter anyway.
It's the shell's responsibility to decide what semantics to present to the user, I'm just saying that the process of turning a code snippet into an executable is fast and should not be a barrier.
If bash took 0.5 seconds to execute anything, I wouldn't use it. If it were something that I used infrequently, I'd tolerate that.
I think it's considerably less than 0.5 seconds for many small programs. dmd is surprisingly fast at rummaging through files -- in fact so fast that I discovered that an intricate caching scheme I'd implemented in rdmd yielded no detectable improvement. Note that this is under Linux, which creates new processes way faster than Windows, so take this with a grain of salt.

I don't have the time to read this but I vaguely recall it might be related:

http://members.shaw.ca/burton-radons/The%20Joy%20and%20Gibbering%20Terror%20of%20Custom-Loading%20Executables.html

Andrei
Jan 31 2009
prev sibling parent Jarrett Billingsley <jarrett.billingsley gmail.com> writes:
On Sat, Jan 31, 2009 at 5:51 AM, Walter Bright
<newshound1 digitalmars.com> wrote:
 Bill Baxter wrote:
 I use NumPy often for its interactive capabilities.  Plotting and
 exploring data at the Python prompt.   That's hard to do with a
 compiled language.    A static language like D cannot satisfy that
 kind of use-case easily.  Maybe Sci-MiniD there? :-)
The D compiler is fast enough that this should be quite doable. On my 6 year old XP machine, compiling and linking a program a few lines long takes less than half a second.
Or, you know, they could use MiniD! <_<
Jan 31 2009
prev sibling next sibling parent reply Don <nospam nospam.com> writes:
Lars Kyllingstad wrote:
 I think D is, or at least could be, the scientific programming language 
 of the future. Here's why -- and possibly how:
 
 A couple of years ago, I took a university class called Numerical 
 Physics. After finishing the course, I was left with the impression that 
 numerical computing was all about squeezing every last bit of 
 performance out of the computer -- that unnecessary operations and 
 function calls should be avoided at any cost, even if the resulting code 
 is full of nasty hacks and tricks, making it completely illegible and 
 utterly unmaintainable. And of course, in many cases this is true.
 
 Now, however, I have a bit more experience in the field and I know that 
 it is not always so. In numerics, as in other areas of programming, it 
 is a trade-off between development time and execution time. 
 Traditionally, if one has a desperate need for speed (sorry), one uses 
 FORTRAN, C or C++. The programs run very fast, but can be hard to 
 develop, debug and maintain. For less processor-intensive tasks one uses 
 Matlab, Mathematica, etc. which have a lot of built-in functionality and 
 make for rapid development, but programs run at a snail's pace.
 
 With D one has the best of both worlds. I've used both C++ and 
 Mathematica for numerics in the past, but now I use D almost 
 exclusively. I find it a lot easier (and more fun) to code in than C++, 
 and I spend a LOT less time debugging my programs. On the other hand, 
 calculations that would take an entire day in Mathematica are finished 
 in a matter of minutes using D.
I completely agree.
 It's a fact that D programs don't have the performance of C(++) ones, 
 but that, I think, is just a matter of time (pun not intended). It's a 
 relatively new language, and the compilers are still somewhat immature.
 
 The one thing I miss the most, however, and which I think is necessary 
 for D to "take off" as a scientific language, is a native D scientific 
 library.
 
 Searching dsource, I find that many nice modules and libraries have been 
 made already:
  - MultiArray (Bill, Fawzi)
  - dstat (dsimcha)
  - blip (Fawzi)
  - MathExtra, BLADE (Don)
  - Scrapple/backmath (BCS)
  - Scrapple/units (BCS)
  - bindings to GSL, BLAS, etc.
  - ...and probably more
 
 Myself, I've written/ported some routines for numerical differentiation 
 and integration, one- and multi-dimensional root-finding and some very 
 basic linear algebra, but so far only for personal use. Currently, I'm 
 thinking of porting QUADPACK to D.
 
 I think it would be really nice if many or all of the above mentioned 
 things could be collected in a single library, together with all kinds 
 of other stuff.
Yes. I have put all of my completed code into tango.math, and there's a fair bit of Fawzi's code in there as well now. There's a vague plan to integrate Bill's stuff eventually, too.

Something that I think is true of all the libraries above is that almost none of the code cares if it is Tango or Phobos. But currently, you have to choose, because there's nowhere to put common code.
 Something like the GSL, only written in D. (In my head 
 it's called SciD. At first I thought of DSL - D Scientific Library - but
 that acronym is used all over the place.) I haven't the time, nor the
 skills, to write an entire such library myself, but I'd be happy to 
 contribute where I can.
 
 Here are some design goals I find important:
  - sensible, logical and tidy package hierarchy
  - access to high-level functionality for rapid development
  - access to low-level functionality for performance
  - make use of D's awesome compile-time functionality
  - reusability, pluggability and extensibility
 
 (By the last point I mean that if one method doesn't work it should be 
 quickly and easily replaceable in code with something else. This is 
 achievable through the use of templates and interfaces, and ties in with 
 the second point.)
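The replaceability goal in that last point can be sketched quickly. A hedged illustration in Python rather than D (in D the method would typically be a template parameter); the function names here are invented for the sketch, not from any library:

```python
# Hedged sketch of the "pluggable method" design goal: the solver takes
# the algorithm as a parameter, so a method that fails on some problem
# can be swapped for another without touching the call site.

def bisect(f, a, b, tol=1e-12):
    # assumes f(a) and f(b) bracket a root
    fa = f(a)
    while b - a > tol:
        m = (a + b) / 2
        if fa * f(m) <= 0:
            b = m
        else:
            a, fa = m, f(m)
    return (a + b) / 2

def newton(f, df, x, tol=1e-12):
    # fast but needs a derivative and a decent starting point
    for _ in range(100):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("no convergence")

def solve(f, method, **kw):
    return method(f, **kw)

f = lambda x: x * x - 2.0
root1 = solve(f, bisect, a=0.0, b=2.0)
root2 = solve(f, newton, df=lambda x: 2 * x, x=1.0)
print(root1, root2)  # both approximate sqrt(2)
```

In D the same shape falls out of passing the method as a template alias parameter, which also lets the compiler inline the chosen algorithm -- the high-level/low-level combination the design goals ask for.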
 
 So, what do you think? Am I making any sense? Am I the only one 
 interested in these things?
 
 All of the above are, of course, my personal opinions. What are yours?
I agree with pretty much everything you've said. I think all that's lacking is a bit of organisation. There are quite a lot of scientific programmers here, including an impressive number of library developers. We could really use a rallying point.

I wonder if it would make sense for Walter to create a NG dedicated to scientific programming. digitalmars.D.sci or digitalmars.D.scientific or digitalmars.D.math or similar.
Jan 30 2009
next sibling parent Fawzi Mohamed <fmohamed mac.com> writes:
On 2009-01-30 16:14:23 +0100, Don <nospam nospam.com> said:

 Lars Kyllingstad wrote:
 I think D is, or at least could be, the scientific programming language 
 of the future. Here's why -- and possibly how:
 [...]
I agree with pretty much everything you've said. I think all that's lacking is a bit of organisation. There are quite a lot of scientific programmers here, including an impressive number of library developers. We could really use a rallying point.
I also agree that D is an excellent language for scientific computation, and an extra rallying point would be nice. The various splits of D make it more difficult than it should be (D1.0 vs D2.0, Tango vs Phobos), but it would be something nice to have.

As a practical example of the difficulties: at the moment I am working on NUMA optimizations, and you need some system support, which will go into Tango. Will it come to Phobos? I hope so, but I don't know.

I am interested in making my code available to as many persons as possible, and in profiting from the code of others, so anything that goes in that direction is good in my opinion. But on the other hand I am interested in developing and using my code, not in testing it in other situations that are not relevant for me. For the moment this means D1.0 and Tango for me. I am willing to make some effort to be as independent as possible, but I am not sure how well it will work. A scientific package has to be tested: you need to trust your results, and having something that might not compile for others is not exactly so attractive...
 I wonder if it would make sense for Walter to create a NG dedicated to 
 scientific programming. digitalmars.D.sci or digitalmars.D.scientific 
 or digitalmars.D.math or similar.
I think that it is a good idea. I wonder about the number of users, but I would for sure join it.

Fawzi
Jan 30 2009
prev sibling next sibling parent BCS <none anon.com> writes:
Hello Don,

[...]
 Searching dsource, I find that many nice modules and libraries have
 been
 made already:
[...]
 - Scrapple/backmath (BCS)
 - Scrapple/units (BCS)
[...]
 Yes. I have put all of my completed code into tango.math, and there's
 a fair bit of Fawzi's code in there as well, now. There's a vague plan
 to integrate Bill's stuff eventually, too.
 
I'd be willing to let some of my libs be copied over as well.
Jan 30 2009
prev sibling parent Lars Kyllingstad <public kyllingen.NOSPAMnet> writes:
Don wrote:
 Lars Kyllingstad wrote:
 [...]

 Searching dsource, I find that many nice modules and libraries have 
 been made already:
  - MultiArray (Bill, Fawzi)
  - dstat (dsimcha)
  - blip (Fawzi)
  - MathExtra, BLADE (Don)
  - Scrapple/backmath (BCS)
  - Scrapple/units (BCS)
  - bindings to GSL, BLAS, etc.
  - ...and probably more

 Myself, I've written/ported some routines for numerical 
 differentiation and integration, one- and multi-dimensional 
 root-finding and some very basic linear algebra, but so far only for 
 personal use. Currently, I'm thinking of porting QUADPACK to D.

 I think it would be really nice if many or all of the above mentioned 
 things could be collected in a single library, together with all kinds 
 of other stuff.
Yes. I have put all of my completed code into tango.math, and there's a fair bit of Fawzi's code in there as well now. There's a vague plan to integrate Bill's stuff eventually, too.
Nice. There were some things in tango.math which I hadn't noticed. :) But I guess at some point one has to draw the line as to what actually belongs in a "standard library", and what should be put into a library of its own. I'm not sure where that line goes, though.
 Something that I think is true of all the libraries above, is that 
 almost none of the code cares if it is Tango or Phobos. But currently, 
 you have to choose, because there's nowhere to put common code.
One reason for that is probably that the math modules of Phobos and Tango are written by you, and are more or less identical. ;) That fact would also help to make a full-fledged scientific library independent of whether Phobos or Tango is used.
 [...]

 So, what do you think? Am I making any sense? Am I the only one 
 interested in these things?

 All of the above are, of course, my personal opinions. What are yours?
I agree with pretty much everything you've said. I think all that's lacking is a bit of organisation. There are quite a lot of scientific programmers here, including an impressive number of library developers. We could really use a rallying point. I wonder if it would make sense for Walter to create a NG dedicated to scientific programming. digitalmars.D.sci or digitalmars.D.scientific or digitalmars.D.math or similar.
I would certainly join it. -Lars
Jan 31 2009
prev sibling next sibling parent reply BCS <none anon.com> writes:
Hello Lars,
 Myself, I've written/ported some routines for numerical
 differentiation and integration, one- and multi-dimensional
 root-finding and some very basic linear algebra, but so far only for
 personal use. Currently, I'm thinking of porting QUADPACK to D.
If you want, I can get you access to scrapple to host what you have.
 I think it would be really nice if many or all of the above mentioned
 things could be collected in a single library, together with all kinds
 of other stuff.
[...]
 
 So, what do you think? Am I making any sense? Am I the only one
 interested in these things?
 
The libs you list are rather scattered and I don't think they are likely to be merged any time soon. OTOH some sort of snapshot pack that collects stable versions that are known to work together might be practical.
 All of the above are, of course, my personal opinions. What are yours?
 
 -Lars
 
Jan 30 2009
parent Lars Kyllingstad <public kyllingen.NOSPAMnet> writes:
BCS wrote:
 Hello Lars,
 Myself, I've written/ported some routines for numerical
 differentiation and integration, one- and multi-dimensional
 root-finding and some very basic linear algebra, but so far only for
 personal use. Currently, I'm thinking of porting QUADPACK to D.
If you want, I can get you access to scrapple to host what you have.
At some point, maybe. As it is now, what I've written requires a bit more work and structure to be useful to others as a standalone library. But thanks anyway, I'll let you know if I need access. :)
 I think it would be really nice if many or all of the above mentioned
 things could be collected in a single library, together with all kinds
 of other stuff.
[...]
 So, what do you think? Am I making any sense? Am I the only one
 interested in these things?
The libs you list are rather scattered and I don't think they are likely to be merged any time soon. OTOH some sort of snapshot pack that collects stable versions that are known to work together might be practical.
I think you're right that a direct merge isn't feasible. But the libs I mentioned still illustrate the fact that there are people in the D community with both the talent and the interest for these things. My hope was just to get all these people to work together on a single, coherent library. :)

-Lars
Jan 31 2009
prev sibling next sibling parent Jonathan Crapuchettes <jcrapuchettes gmail.com> writes:
I agree with you. I use D for complex economic calculations that are run from a 
web site. The great part about D is that other people in the company can easily 
read the code and it is fast enough for what I need to do. I ported a few ATLAS 
C functions and wrapped them with a class, offering high speed matrix 
calculations with excellent usability. Not many other languages allow for this.
JC

Lars Kyllingstad wrote:
 I think D is, or at least could be, the scientific programming language 
 of the future. Here's why -- and possibly how:
 
 A couple of years ago, I took a university class called Numerical 
 Physics. After finishing the course, I was left with the impression that 
 numerical computing was all about squeezing every last bit of 
 performance out of the computer -- that unnecessary operations and 
 function calls should be avoided at any cost, even if the resulting code 
 is full of nasty hacks and tricks, making it completely illegible and 
 utterly unmaintainable. And of course, in many cases this is true.
 
 Now, however, I have a bit more experience in the field and I know that 
 it is not always so. In numerics, as in other areas of programming, it 
 is a trade-off between development time and execution time. 
 Traditionally, if one has a desperate need for speed (sorry), one uses 
 FORTRAN, C or C++. The programs run very fast, but can be hard to 
 develop, debug and maintain. For less processor-intensive tasks one uses 
 Matlab, Mathematica, etc. which have a lot of built-in functionality and 
 make for rapid development, but programs run at a snail's pace.
 
 With D one has the best of both worlds. I've used both C++ and 
 Mathematica for numerics in the past, but now I use D almost 
 exclusively. I find it a lot easier (and more fun) to code in than C++, 
 and I spend a LOT less time debugging my programs. On the other hand, 
 calculations that would take an entire day in Mathematica are finished 
 in a matter of minutes using D.
 
 It's a fact that D programs don't have the performance of C(++) ones, 
 but that, I think, is just a matter of time (pun not intended). It's a 
 relatively new language, and the compilers are still somewhat immature.
 
 The one thing I miss the most, however, and which I think is necessary 
 for D to "take off" as a scientific language, is a native D scientific 
 library.
 
 Searching dsource, I find that many nice modules and libraries have been 
 made already:
  - MultiArray (Bill, Fawzi)
  - dstat (dsimcha)
  - blip (Fawzi)
  - MathExtra, BLADE (Don)
  - Scrapple/backmath (BCS)
  - Scrapple/units (BCS)
  - bindings to GSL, BLAS, etc.
  - ...and probably more
 
 Myself, I've written/ported some routines for numerical differentiation 
 and integration, one- and multi-dimensional root-finding and some very 
 basic linear algebra, but so far only for personal use. Currently, I'm 
 thinking of porting QUADPACK to D.
 
 I think it would be really nice if many or all of the above mentioned 
 things could be collected in a single library, together with all kinds 
 of other stuff. Something like the GSL, only written in D. (In my head 
 it's called SciD. At first I thought of DSL - D Scientific Library - but 
 that acronym is used all over the place.) I haven't the time, nor the 
 skills, to write an entire such library myself, but I'd be happy to 
 contribute where I can.
 
 Here are some design goals I find important:
  - sensible, logical and tidy package hierarchy
  - access to high-level functionality for rapid development
  - access to low-level functionality for performance
  - make use of D's awesome compile-time functionality
  - reusability, pluggability and extensibility
 
 (By the last point I mean that if one method doesn't work it should be 
 quickly and easily replaceable in code with something else. This is 
 achievable through the use of templates and interfaces, and ties in with 
 the second point.)
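A rough sketch of what that pluggability could look like in D: swapping one quadrature rule for another by changing a single template argument. All names here (Trapezoid, Midpoint, solve) are hypothetical illustrations, not from any existing library:

```d
import std.stdio;

// Two hypothetical quadrature "methods". Each exposes the same static
// template function, so user code can swap one for the other just by
// changing a template argument -- the kind of pluggability meant above.
struct Trapezoid
{
    static double integrate(alias f)(double a, double b, int n)
    {
        immutable h = (b - a) / n;
        double sum = 0.5 * (f(a) + f(b));
        foreach (i; 1 .. n)
            sum += f(a + i * h);
        return sum * h;
    }
}

struct Midpoint
{
    static double integrate(alias f)(double a, double b, int n)
    {
        immutable h = (b - a) / n;
        double sum = 0.0;
        foreach (i; 0 .. n)
            sum += f(a + (i + 0.5) * h);
        return sum * h;
    }
}

// Generic driver: the integration method is itself a template parameter,
// so replacing it in code is quick and easy.
double solve(Method, alias f)(double a, double b, int n)
{
    return Method.integrate!f(a, b, n);
}

void main()
{
    static double sq(double x) { return x * x; }
    // Integrate x^2 over [0, 1]; the exact answer is 1/3.
    writefln("%.6f", solve!(Trapezoid, sq)(0.0, 1.0, 1000));
    writefln("%.6f", solve!(Midpoint, sq)(0.0, 1.0, 1000));
}
```

Here the method resolves at compile time, so the indirection costs nothing at run time, which ties the pluggability goal back to the performance goal.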
 
 So, what do you think? Am I making any sense? Am I the only one 
 interested in these things?
 
 All of the above are, of course, my personal opinions. What are yours?
 
 
 -Lars
Jan 30 2009
prev sibling next sibling parent reply Bill Baxter <wbaxter gmail.com> writes:
On Fri, Jan 30, 2009 at 10:23 PM, Lars Kyllingstad
<public kyllingen.nospamnet> wrote:
 I think D is, or at least could be, the scientific programming language of
 the future.
Agreed! Numerics is one place where D fits nicely.
 [...]
 The one thing I miss the most, however, and which I think is necessary for D
 to "take off" as a scientific language, is a native D scientific library.
 I think it would be really nice if many or all of the above mentioned things
 could be collected in a single library, together with all kinds of other
 stuff. Something like the GSL, only written in D.
...and not GPL, preferably. :-)

The goal sounds great. I'm certainly willing to help out.

To me it seems one of the first things you need to get nailed down is how to represent multidimensional data. Both dense, strided, and -- for 2D -- common sparse and BLAS formats. Probably this can be done at the concept (compile-time interface) level without having to go into implementation details. But it might also be nice to have some concrete implementation nailed down too, so not everything is built as a giant house of template cards. Once you have that, then any package that can work with that concept or array format will be able to share data pretty easily.

But it's not so easy. Even now I use 3 different libraries at different times for different things.
 * I have MatrixT/VectorT, fixed-size dense matrix and vector, whose 2D and 3D versions have some special operations like "cross product"
 * I have Dflat, which provides dynamically-sized Matrix types that match what BLAS accepts. And also a few dynamically-sized sparse formats that work with various sparse libs.
 * I have MultiArray, which is a generic container for N-dimensional strided data, with dynamic number of dimensions and size.

On the last one ... I consider that one a bit of a failed experiment. Dynamic number of dimensions causes too many headaches and is not really useful very often. I think Fawzi is brewing a fix-dim N-d array of his own, or was at one point.

But anyway, uniting all these things under a single library or even a single concept would be significant work. --bb
Jan 30 2009
next sibling parent Fawzi Mohamed <fmohamed mac.com> writes:
On 2009-01-30 20:55:06 +0100, Bill Baxter <wbaxter gmail.com> said:

 On Fri, Jan 30, 2009 at 10:23 PM, Lars Kyllingstad
 <public kyllingen.nospamnet> wrote:
 I think D is, or at least could be, the scientific programming language of
 the future.
Agreed! Numerics is one place where D fits nicely.
 [...]
 The one thing I miss the most, however, and which I think is necessary for D
 to "take off" as a scientific language, is a native D scientific library.
 I think it would be really nice if many or all of the above mentioned things
 could be collected in a single library, together with all kinds of other
 stuff. Something like the GSL, only written in D.
...and not GPL, preferably. :-) The goal sounds great. I'm certainly willing to help out. To me it seems one of the first things you need to get nailed down is how to represent multidimensional data. Both dense, strided, and -- for 2D -- common sparse and BLAS formats. Probably this can be done at the concept (compile-time interface) level without having to go into implementation details. But it might also be nice to have some concrete implementation nailed down too, so not everything is built as a giant house of template cards. Once you have that, then any package that can work with that concept or array format will be able to share data pretty easily. But it's not so easy.
indeed, probably one might need:

A: access
 1) access to any element by index
 1.b) setting any element
 2) subslicing
 3) looping on elements + indexes
 3.b) looping on rows, then cols
 3.c) looping on cols, then rows

B: vector space structure
 - zero and unit vectors, and diagonal matrices
 - scalar multiplication
 - dot product
 - in some cases maybe even wedge product

A and B are rather separate. Depending on the exact structure, different ways to do this might be more or less efficient, or not possible at all (setting an element outside the sparsity pattern, row looping for some sparse formats, ...).

For A I think that one could have an interface for 1-3 that works, and a dot product could be implemented in a generic way on top of it, but this interface will probably not be the most efficient (though probably OK).

For B, maybe templates, with a class wrapper that exposes all the stuff (one could also use only templates, but it might be more cumbersome).

In any case, not easy...
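As a hedged illustration of the access requirement ("A" above): a generic dot product written only against element access and a length property, usable by any container that satisfies that implicit compile-time interface. The type and function names are invented for the example:

```d
import std.stdio;

// Generic dot product built only on opIndex and length -- the "works
// but probably not the most efficient" interface described above. A
// format-aware implementation (BLAS dense, sparsity-pattern aware, ...)
// would usually beat it.
double dot(V, W)(V a, W b)
{
    assert(a.length == b.length);
    double sum = 0.0;
    foreach (i; 0 .. a.length)
        sum += a[i] * b[i];
    return sum;
}

// One possible concrete type; the name is made up for the example.
// Any other type exposing length and opIndex would work just as well.
struct DenseVector
{
    double[] data;
    size_t length() const { return data.length; }
    double opIndex(size_t i) const { return data[i]; }
}

void main()
{
    auto u = DenseVector([1.0, 2.0, 3.0]);
    auto v = DenseVector([4.0, 5.0, 6.0]);
    writeln(dot(u, v)); // 1*4 + 2*5 + 3*6 = 32
}
```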
   Even now I use 3 different libraries at
 different times for different things.
 * I have MatrixT/VectorT,  fixed-size dense matrix and vector, whose
 2D and 3D versions have some special operations like "cross product"
 * I have Dflat, which provides dynamically-sized Matrix types that
 match what BLAS accepts.  And also a few dynamically-sized sparse
 formats that work with various sparse libs.
 * I have MultiArray which is a generic container for N-dimensional
 strided data, with dynamic number of dimensions and size.
 
 On the last one ... I consider that one a bit of a failed experiment.
 Dynamic number of dimensions causes too many headaches and is not
 really useful very often.  I think Fawzi is brewing a fix-dim N-d
 array of his own, or was at one point.
yes, it is in blip, and it works quite well: it uses a compile-time rank (as you had suggested) and has a nice interface to LAPACK. I haven't announced it officially yet because I wanted to make the SMP parallelization better and add binary serialization, but anyway here is what blip has:
 - N-dimensional arrays with arbitrary strides, subslicing without copying, a nice interface to LAPACK, and fair performance
 - random testing framework (that does tests in parallel)
 - serialization (at the moment to/from json format, but very extensible to other formats; can be activated with minimal effort if one uses Xpose), which is usable also as an input file format
 - SMP parallelization (that is used by the random testing framework, and should improve drastically soon, and be used also by NArray)

it is usable already now, so if you want to give a look... Fawzi
 
 But anyway uniting all these things under a single library or even a
 single concept would be significant work.
 
 --bb
Jan 31 2009
prev sibling parent reply Lars Kyllingstad <public kyllingen.NOSPAMnet> writes:
Bill Baxter wrote:
 On Fri, Jan 30, 2009 at 10:23 PM, Lars Kyllingstad
 <public kyllingen.nospamnet> wrote:
 I think D is, or at least could be, the scientific programming language of
 the future.
Agreed! Numerics is one place where D fits nicely.
 [...]
 The one thing I miss the most, however, and which I think is necessary for D
 to "take off" as a scientific language, is a native D scientific library.
 I think it would be really nice if many or all of the above mentioned things
 could be collected in a single library, together with all kinds of other
 stuff. Something like the GSL, only written in D.
...and not GPL, preferably. :-) The goal sounds great. I'm certainly willing to help out. To me it seems one of the first things you need to get nailed down is how to represent multidimensional data. Both dense, strided, and -- for 2D -- common sparse and BLAS formats. Probably this can be done at the concept (compile-time interface) level without having to go into implementation details. But it might also be nice to have some concrete implementation nailed down too, so not everything is built as a giant house of template cards. Once you have that, then any package that can work with that concept or array format will be able to share data pretty easily.
Agreed. Unfortunately, linear algebra is not my area of expertise. As I said, I have written some basic linear algebra code -- very simplistic vector/matrix classes, LU decomposition, etc. -- but just because they're needed for my numerical calculus modules. I haven't put any work into tweaking the performance of this code.

For my use, and I expect for lots of other areas in numerics, very simple vector/matrix interfaces suffice. All I need is opIndex, opAdd, opSub and opMul. And then you linear algebra people can take care of the details. ;)

But that's the nice thing about D -- by using templates and interfaces, I think it is possible to create a library that's user-friendly, but which has super-speedy functions and classes that are tailored for very specific use cases.
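A minimal sketch of the kind of interface meant here. The post names the D1-era operator methods opAdd/opSub/opMul; in present-day D the arithmetic operators are spelled via opBinary instead, which is what this entirely hypothetical Vec uses:

```d
import std.stdio;

// A throwaway vector type of the kind described: element access plus
// arithmetic, nothing more. opAdd/opSub/opMul are the D1-era names;
// current D expresses the same operators through opBinary, as below.
struct Vec
{
    double[] data;

    double opIndex(size_t i) const { return data[i]; }

    // Element-wise + and - via one template.
    Vec opBinary(string op)(Vec rhs) const
        if (op == "+" || op == "-")
    {
        assert(data.length == rhs.data.length);
        auto result = new double[](data.length);
        foreach (i; 0 .. data.length)
            mixin("result[i] = data[i] " ~ op ~ " rhs.data[i];");
        return Vec(result);
    }

    // Scalar multiplication.
    Vec opBinary(string op : "*")(double s) const
    {
        auto result = new double[](data.length);
        foreach (i; 0 .. data.length)
            result[i] = data[i] * s;
        return Vec(result);
    }
}

void main()
{
    auto a = Vec([1.0, 2.0]);
    auto b = Vec([3.0, 4.0]);
    auto c = (a + b) * 2.0;
    writeln(c.data); // [8, 12]
}
```

A library could later replace the naive loops with BLAS calls behind the same operators, which is exactly the "simple interface, tuned internals" split the paragraph argues for.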
 But it's not so easy.  Even now I use 3 different libraries at
 different times for different things.
 * I have MatrixT/VectorT,  fixed-size dense matrix and vector, whose
 2D and 3D versions have some special operations like "cross product"
 * I have Dflat, which provides dynamically-sized Matrix types that
 match what BLAS accepts.  And also a few dynamically-sized sparse
 formats that work with various sparse libs.
 * I have MultiArray which is a generic container for N-dimensional
 strided data, with dynamic number of dimensions and size.
 
 On the last one ... I consider that one a bit of a failed experiment.
 Dynamic number of dimensions causes too many headaches and is not
 really useful very often.  I think Fawzi is brewing a fix-dim N-d
 array of his own, or was at one point.
 
 But anyway uniting all these things under a single library or even a
 single concept would be significant work.
...and, I think, would draw even more people to D. -Lars
Jan 31 2009
parent Bill Baxter <wbaxter gmail.com> writes:
On Sun, Feb 1, 2009 at 3:06 AM, Lars Kyllingstad
<public kyllingen.nospamnet> wrote:
 Bill Baxter wrote:
 On Fri, Jan 30, 2009 at 10:23 PM, Lars Kyllingstad
 <public kyllingen.nospamnet> wrote:
 I think D is, or at least could be, the scientific programming language
 of
 the future.
Agreed! Numerics is one place where D fits nicely.
 [...]
 The one thing I miss the most, however, and which I think is necessary
 for D
 to "take off" as a scientific language, is a native D scientific library.
 I think it would be really nice if many or all of the above mentioned
 things
 could be collected in a single library, together with all kinds of other
 stuff. Something like the GSL, only written in D.
...and not GPL, preferably. :-) The goal sounds great. I'm certainly willing to help out. To me it seems one of the first things you need to get nailed down is how to represent multidimensional data. Both dense, strided, and -- for 2D -- common sparse and BLAS formats. Probably this can be done at the concept (compile-time interface) level without having to go into implementation details. But it might also be nice to have some concrete implementation nailed down too, so not everything is built as a giant house of template cards. Once you have that, then any package that can work with that concept or array format will be able to share data pretty easily.
Agreed. Unfortunately, linear algebra is not my area of expertise. As I said, I have written some basic linear algebra code -- very simplistic vector/matrix classes, LU decomposition, etc. -- but just because they're needed for my numerical calculus modules. I haven't put any work into tweaking the performance of this code. For my use, and I expect for lots of other areas in numerics, very simple vector/matrix interfaces suffice. All I need is opIndex, opAdd, opSub and opMul. And then you linear algebra people can take care of the details. ;) But that's the nice thing about D -- by using templates and interfaces, I think it is possible to create a library that's user-friendly, but which has super-speedy functions and classes that are tailored for very specific use cases.
 But it's not so easy.  Even now I use 3 different libraries at
 different times for different things.
 * I have MatrixT/VectorT,  fixed-size dense matrix and vector, whose
 2D and 3D versions have some special operations like "cross product"
 * I have Dflat, which provides dynamically-sized Matrix types that
 match what BLAS accepts.  And also a few dynamically-sized sparse
 formats that work with various sparse libs.
 * I have MultiArray which is a generic container for N-dimensional
 strided data, with dynamic number of dimensions and size.

 On the last one ... I consider that one a bit of a failed experiment.
 Dynamic number of dimensions causes too many headaches and is not
 really useful very often.  I think Fawzi is brewing a fix-dim N-d
 array of his own, or was at one point.

 But anyway uniting all these things under a single library or even a
 single concept would be significant work.
...and, I think, would draw even more people to D.
...and it sounds like Andrei's got plans for creating basically that: a matrix lib that unites all those different needs, or at least BLAS, sparse types and multidimensional arrays. Not sure if he has any plan for fixed-size arrays in that mix, too (float2, float2x2, etc.). --bb
Jan 31 2009
prev sibling next sibling parent reply Trass3r <mrmocool gmx.de> writes:
Lars Kyllingstad schrieb:
 For less processor-intensive tasks one uses 
 Matlab, Mathematica, etc. which have a lot of built-in functionality and 
 make for rapid development, but programs run at a snail's pace.
 
So it would probably be cool to also have certain functions for creating and manipulating figures like in Matlab.
Jan 30 2009
next sibling parent reply Bill Baxter <wbaxter gmail.com> writes:
On Sat, Jan 31, 2009 at 12:37 PM, Trass3r <mrmocool gmx.de> wrote:
 Lars Kyllingstad schrieb:
 For less processor-intensive tasks one uses Matlab, Mathematica, etc.
 which have a lot of built-in functionality and make for rapid development,
 but programs run at a snail's pace.
So it would probably be cool to also have certain functions for creating and manipulating figures like in Matlab.
Once Qt is ported we could port this: http://qwt.sourceforge.net/ :-) --bb
Jan 30 2009
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Bill Baxter wrote:
 On Sat, Jan 31, 2009 at 12:37 PM, Trass3r <mrmocool gmx.de> wrote:
 Lars Kyllingstad schrieb:
 For less processor-intensive tasks one uses Matlab, Mathematica, etc.
 which have a lot of built-in functionality and make for rapid development,
 but programs run at a snail's pace.
So it would probably be cool to also have certain functions for creating and manipulating figures like in Matlab.
Once Qt is ported we could port this: http://qwt.sourceforge.net/ :-)
Speaking of which, is it just me, or are there quite a lot more stars getting aligned for D than I'd anticipated? I haven't done GUI programming in a while, but my understanding is that Qt is canine testes when it comes to solid portable GUI programming, so I see QtD as absolutely huge for D. Then we have the continuously improved Descent, Sean's awesome core lib, various language interface libraries, and now up-and-coming support in SlickEdit. Something is in the air! Andrei
Jan 30 2009
parent reply Bill Baxter <wbaxter gmail.com> writes:
On Sat, Jan 31, 2009 at 1:00 PM, Andrei Alexandrescu
<SeeWebsiteForEmail erdani.org> wrote:
 Bill Baxter wrote:
 On Sat, Jan 31, 2009 at 12:37 PM, Trass3r <mrmocool gmx.de> wrote:
 Lars Kyllingstad schrieb:
 For less processor-intensive tasks one uses Matlab, Mathematica, etc.
 which have a lot of built-in functionality and make for rapid
 development,
 but programs run at a snail's pace.
So it would probably be cool to also have certain functions for creating and manipulating figures like in Matlab.
Once Qt is ported we could port this: http://qwt.sourceforge.net/ :-)
Speaking of which, is it me or there are quite a lot more stars than I'd anticipated getting aligned for D? I haven't done GUI programming in a while, but my understanding is that Qt is canine testes when it comes about solid portable GUI programming, so I see QtD as absolutely huge for D. Then we have the continuously improved Descent, Sean's awesome core lib, various language interface libraries, and now up-and-coming support in SlickEdit. Something is in the air!
Yeh and on top of all that, I heard that some C++ bigwig is working on the D standard library now, and he's even working on a book about D! --bb
Jan 30 2009
parent reply "Saaa" <empty needmail.com> writes:
:D 
Jan 31 2009
parent reply Christopher Wright <dhasenan gmail.com> writes:
Saaa wrote:
 :D 
Is that some sort of successor to D? C++ is still alive and people are already trying to beat D!
Jan 31 2009
parent Jarrett Billingsley <jarrett.billingsley gmail.com> writes:
On Sat, Jan 31, 2009 at 12:25 PM, Christopher Wright <dhasenan gmail.com> wrote:
 Saaa wrote:
 :D
Is that some sort of successor to D? C++ is still alive and people are already trying to beat D!
D: could be a de-successor to D. D without all the cool stuff.
Jan 31 2009
prev sibling parent Lars Kyllingstad <public kyllingen.NOSPAMnet> writes:
Trass3r wrote:
 Lars Kyllingstad schrieb:
 For less processor-intensive tasks one uses Matlab, Mathematica, etc. 
 which have a lot of built-in functionality and make for rapid 
 development, but programs run at a snail's pace.
So it would probably be cool to also have certain functions for creating and manipulating figures like in Matlab.
I tend to use gnuplot for all my plotting needs, and I recently discovered a project called dplot. I haven't tried it yet, though. Does anyone have any experience with this? http://www.dsource.org/projects/dplot -Lars
Jan 31 2009
prev sibling next sibling parent Brian Palmer <d brian.codekitchen.net> writes:
Walter Bright Wrote:

 Bill Baxter wrote:
 Having to recompile and rerun after every one of those changes just
 isn't quite as direct.
If it can be done in under half a second, isn't that direct enough? Of course, I'm talking about a shell that does it for you.
If anybody is really serious about doing this, they might want to look at the implementation of a similar shell for C at http://neugierig.org/software/c-repl/. Oddly, much of it is written in Haskell, but there's also an early Ruby prototype that might be useful. Essentially, each line is wrapped in a function, compiled into a library, and then dlopen'ed in the parent process and executed. Assignments are translated into globals. This will be a little more complex in D because there isn't one global namespace. Actually, I think this has been brought up on this list before; I can't find the original reference, though.
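The load-and-execute half of that scheme can be sketched in D with the POSIX dlopen/dlsym bindings from druntime. For simplicity this loads a symbol from the system math library instead of a freshly compiled snippet; the "libm.so.6" name is a Linux-specific assumption:

```d
import core.sys.posix.dlfcn;
import std.stdio;

void main()
{
    // In the c-repl scheme, the library would be one just compiled from
    // the user's input line; here we load the system math library only
    // to show the dlopen/dlsym/call sequence itself.
    void* lib = dlopen("libm.so.6", RTLD_NOW);
    assert(lib !is null, "dlopen failed");

    // Look up a symbol and call it through a C function pointer,
    // exactly as the shell would call the wrapper function it emitted.
    alias CosFn = extern(C) double function(double);
    auto cosFn = cast(CosFn) dlsym(lib, "cos");
    assert(cosFn !is null, "dlsym failed");

    writeln(cosFn(0.0)); // cos(0) = 1

    dlclose(lib);
}
```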
Jan 31 2009
prev sibling parent reply Don <nospam nospam.com> writes:
Lars Kyllingstad wrote:
 I think D is, or at least could be, the scientific programming language 
 of the future. Here's why -- and possibly how:
 
 A couple of years ago, I took a university class called Numerical 
 Physics. After finishing the course, I was left with the impression that 
 numerical computing was all about squeezing every last bit of 
 performance out of the computer -- that unnecessary operations and 
 function calls should be avoided at any cost, even if the resulting code 
 is full of nasty hacks and tricks, making it completely illegible and 
 utterly unmaintainable. And of course, in many cases this is true.
 
 Now, however, I have a bit more experience in the field and I know that 
 it is not always so. In numerics, as in other areas of programming, it 
 is a trade-off between development time and execution time. 
 Traditionally, if one has a desperate need for speed (sorry), one uses 
 FORTRAN, C or C++. The programs run very fast, but can be hard to 
 develop, debug and maintain. For less processor-intensive tasks one uses 
 Matlab, Mathematica, etc. which have a lot of built-in functionality and 
 make for rapid development, but programs run at a snail's pace.
 
 With D one has the best of both worlds. I've used both C++ and 
 Mathematica for numerics in the past, but now I use D almost 
 exclusively. I find it a lot easier (and more fun) to code in than C++, 
 and I spend a LOT less time debugging my programs. On the other hand, 
 calculations that would take an entire day in Mathematica are finished 
 in a matter of minutes using D.
 
 It's a fact that D programs don't have the performance of C(++) ones, 
 but that, I think, is just a matter of time (pun not intended). It's a 
 relatively new language, and the compilers are still somewhat immature.
 
 The one thing I miss the most, however, and which I think is necessary 
 for D to "take off" as a scientific language, is a native D scientific 
 library.
 
 Searching dsource, I find that many nice modules and libraries have been 
 made already:
  - MultiArray (Bill, Fawzi)
  - dstat (dsimcha)
  - blip (Fawzi)
  - MathExtra, BLADE (Don)
  - Scrapple/backmath (BCS)
  - Scrapple/units (BCS)
  - bindings to GSL, BLAS, etc.
  - ...and probably more
 
 Myself, I've written/ported some routines for numerical differentiation 
 and integration, one- and multi-dimensional root-finding and some very 
 basic linear algebra, but so far only for personal use. Currently, I'm 
 thinking of porting QUADPACK to D.
 
 I think it would be really nice if many or all of the above mentioned 
 things could be collected in a single library, together with all kinds 
 of other stuff. Something like the GSL, only written in D. (In my head 
 it's called SciD. At first I thought of DSL - D Scientific Library - but 
 that acronym is used all over the place.) I haven't the time, nor the 
 skills, to write an entire such library myself, but I'd be happy to 
 contribute where I can.
 
 Here are some design goals I find important:
  - sensible, logical and tidy package hierarchy
  - access to high-level functionality for rapid development
  - access to low-level functionality for performance
  - make use of D's awesome compile-time functionality
  - reusability, pluggability and extensibility
 
 (By the last point I mean that if one method doesn't work it should be 
 quickly and easily replaceable in code with something else. This is 
 achievable through the use of templates and interfaces, and ties in with 
 the second point.)
 
 So, what do you think? Am I making any sense? Am I the only one 
 interested in these things?
 
 All of the above are, of course, my personal opinions. What are yours?
 
 
 -Lars
As a first step, I created a wiki page and copied your list to it. http://www.prowiki.org/wiki4d/wiki.cgi?ScientificLibraries I think it would be really useful to fill in the contents of each of these libraries (as I've done for mine), so that we can get a good idea of what's out there.
Feb 04 2009
next sibling parent Daniel Keep <daniel.keep.lists gmail.com> writes:
Don wrote:
 As a first step, I created a wiki page and copied your list to it.
 
 http://www.prowiki.org/wiki4d/wiki.cgi?ScientificLibraries
 
 I think it would be really useful to fill in the contents of each of
 these libraries (as I've done for mine), so that we can get a good idea
 of what's out there.
Regarding the wishlist: it might be an idea to write down what you would like from such a graphing package. I'm just thinking that there might be people out there who enjoy graphics programming, but don't necessarily know what such a library should contain... -- Daniel
Feb 04 2009
prev sibling parent reply dsimcha <dsimcha yahoo.com> writes:
== Quote from Don (nospam nospam.com)'s article
 As a first step, I created a wiki page and copied your list to it.
 http://www.prowiki.org/wiki4d/wiki.cgi?ScientificLibraries
 I think it would be really useful to fill in the contents of each of
 these libraries (as I've done for mine), so that we can get a good idea
 of what's out there.
Good idea. Stupid question, though: Where the heck is the account creation page, so I can create a wiki4d account and post some info to this page?
Feb 04 2009
parent reply Stewart Gordon <smjg_1998 yahoo.com> writes:
dsimcha wrote:
<snip>
 Good idea.  Stupid question, though:  Where the heck is the account creation
page
 so I can create wiki4d account and post some info to this page?
There isn't one. You just give yourself a username in the preferences and then you're ready to go. http://www.prowiki.org/wiki4d/wiki.cgi?action=editprefs Stewart.
Feb 05 2009
parent reply dsimcha <dsimcha yahoo.com> writes:
== Quote from Stewart Gordon (smjg_1998 yahoo.com)'s article
 dsimcha wrote:
 <snip>
 Good idea.  Stupid question, though:  Where the heck is the account creation
page
 so I can create wiki4d account and post some info to this page?
There isn't one. You just give yourself a username in the preferences and then you're ready to go. http://www.prowiki.org/wiki4d/wiki.cgi?action=editprefs Stewart.
I kind of got the impression that that's how it's supposed to work, but every time I enter a username, I get an invalid username error message: Invalid username dsimcha not stored. server time: February 5, 2009 8:54 local time: February 5, 2009 16:54 The preferences have been saved. No matter what I do I can't get wiki4d to actually let me edit anything.
Feb 05 2009
parent Stewart Gordon <smjg_1998 yahoo.com> writes:
dsimcha wrote:
<snip>
 Invalid username dsimcha not stored.
 server time: February 5, 2009 8:54
 local time: February 5, 2009 16:54
 The preferences have been saved.
 
 No matter what I do I can't get wiki4d to actually let me edit anything.
AIUI your username needs to be BiCapitalised, like the PeterSmith example it gives. Could be clearer, I agree. Most of us use our real names with the space dropped - StewartGordon, DonClugston, etc. Stewart.
Feb 05 2009