digitalmars.D - Things I Learned from ACCU 2010
- Walter Bright (11/11) Apr 23 2010 * Not all functional programming languages do concurrency well. Haskell ...
- Walter Bright (3/4) Apr 23 2010 For example, this hilarious video was shown by James Bach, who created i...
- bearophile (13/18) Apr 23 2010 What kind of problems have you seen in Haskell?
- Walter Bright (13/30) Apr 23 2010 It wasn't me, it was Russell Wider. He wrote a parallel pi calculating p...
- Walter Bright (2/3) Apr 23 2010 That's Russel Winder.
- bearophile (4/14) Apr 23 2010 You have just the illusion to have learned something about this. Trying ...
- Walter Bright (9/19) Apr 23 2010 Fair enough, but in order to dismiss the results I'd need to know *why* ...
- bearophile (8/11) Apr 23 2010 Being easy to learn to use is not one of the qualities of Haskell. If yo...
- Walter Bright (4/7) Apr 23 2010 It's statements like this (and I've heard this repeatedly) that makes me...
- Nick Sabalausky (6/14) Apr 23 2010 Not that I have an opinion either way, but FWIW, very similar things cou...
- Walter Bright (2/15) Apr 23 2010 I know, and that creates an opportunity for other languages!
- Nick Sabalausky (9/24) Apr 23 2010 Definitely :)
- Justin Johansson (6/25) Apr 29 2010 Good one, retard; that's really funny and surprising that Andrei didn't
- retard (6/26) Apr 23 2010 Why not? Do you think parallelism is simple to manage (efficiently)? No
- Steven Schveighoffer (9/17) Apr 23 2010 I think his point was that a person who *does* understand parallelism an...
- BCS (6/8) Apr 24 2010 Very good point. If a design with no blatant flaws performs like that wi...
- Petr Kalny (7/31) Apr 29 2010 IIRC Haskell's problems with concurrency have roots in its 100% lazy
- Andrei Alexandrescu (6/41) Apr 29 2010 Which specific papers are you referring to?
- Petr Kalny (11/58) Apr 29 2010 Right, I couldn't find the paper, I have read about concurrency in
- Clemens (7/14) Apr 23 2010 Do you have a reference on that? I'll produce one to the contrary:
- Walter Bright (4/12) Apr 23 2010 All I've got is Russel Winder's talk on it, Parallelism: The Functional
- Clemens (3/17) Apr 23 2010 Ah, ok. As bearophile noted, that person seems to have not much experien...
- Walter Bright (9/17) Apr 23 2010 D is meant to give good results even for people who are not experts at i...
- Michael Rynn (6/20) Apr 23 2010 OK where's the naive version of the D Pi program that scales up with
- Walter Bright (2/5) Apr 23 2010 Nobody's written a library function to parallelize a map/reduce yet.
- Robert Jacques (6/11) Apr 23 2010 Dave Simcha has.
- Walter Bright (2/17) Apr 23 2010 Cool!
- Walter Bright (2/20) Apr 23 2010 Unfortunately, it currently fails to compile with D2.
- dsimcha (8/28) May 08 2010 Can you tell me what errors you're getting? I realize that map and redu...
- sybrandy (6/11) Apr 23 2010 Funny you mention that. I actually started a map/reduce library that I
- Andrei Alexandrescu (3/21) Apr 23 2010 And the correct way.
- Walter Bright (4/9) Apr 23 2010 Yes.
- Clemens (15/34) Apr 23 2010 Someone coming from C++ might think the following program entirely reaso...
- Walter Bright (2/4) Apr 23 2010 I'll see if Russel will email me the code.
- Robert Jacques (3/6) Apr 23 2010 Try http://www.russel.org.uk:8080/Bazaar/Pi_Quadrature/changes
- Leandro Lucarella (11/25) Apr 23 2010 Is very easy to make naive programs that have serious performance proble...
- Walter Bright (6/24) Apr 23 2010 Relatively inexperienced D programmers should be able to apply straightf...
- Andrej Mitrovic (1/1) Apr 23 2010 If only multicore programming was all about finding fibonnaci numbers or...
- Leandro Lucarella (11/25) Apr 23 2010 Like using one of the corner cases where the GC really sucks. =)
- Andrei Alexandrescu (3/8) Apr 23 2010 I hope that trend has been definitively reversed.
- so (9/16) Apr 24 2010 Haskell is cool but I am puzzled with that Haskell vs C example.
- Pelle (4/6) Apr 29 2010 Didn't Walter implement templates without grokking them? I think I read
- Walter Bright (5/13) Apr 29 2010 I also passed the quantum mechanics final in physics without understandi...
- bearophile (4/5) Apr 23 2010 I think 5 days of serious use are enough for Walter to learn some of the...
- BLS (5/10) Apr 23 2010 Good to know that you are able to estimate how much time people need to
- Nick Sabalausky (13/26) Apr 23 2010 When the academic researchers keep their work squirreled away in academi...
- Andrei Alexandrescu (18/30) Apr 24 2010 The style of academic papers is not convoluted on purpose. The good
- Nick Sabalausky (10/12) Apr 24 2010 Heh. I've surprised a lot of laymen, after telling them I'm a programmer...
- Walter Bright (6/12) Apr 24 2010 I share your opinion that most software and consumer electronics is terr...
- Gareth Charnock (4/21) Apr 25 2010 That's nothing. My laptop power supply crashes every so often and starts...
- Andrei Alexandrescu (14/35) Apr 25 2010 Well actually that's a different matter altogether. Power supplies are
- div0 (16/21) Apr 25 2010 -----BEGIN PGP SIGNED MESSAGE-----
- Walter Bright (5/12) Apr 25 2010 My current one doesn't either; it has no digital inputs. But it clearly ...
- Nick Sabalausky (4/15) Apr 25 2010 We had to get by with Bob Sagat saying vaguely amusng things overtop an
- Walter Bright (3/5) Apr 25 2010 Bob Sagat was never amusing. Though I felt sorry for him, how many jokes...
- Bernard Helyer (4/7) Apr 26 2010 http://www.youtube.com/watch?v=0HW4mPZmKPM (NSFW)
- NUBIE (6/30) Apr 24 2010 This a failure to read the manual, and arguably bad defaults by GHC here...
- Clemens (3/31) Apr 26 2010 Oh well. Thanks for proving my point.
* Not all functional programming languages do concurrency well. Haskell and OCaml in particular have severe fundamental problems with it such that parallelizing your code makes it slower.

* The future of multi-core hardware is to not have any shared memory; each core will have its own address space. Message passing looks like the future.

* Monads have nothing in particular to do with I/O. All a monad is is a way to insert code to pre-process arguments going to a function, and to post-process the result coming out of that function.

* Probably nobody understands how to use C++0x atomics correctly, or ever will.

* People really understand and get testing and how it improves programming.
Apr 23 2010
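The "pre-process the arguments, post-process the result" view of monads described above can be sketched in a few lines. This is a rough illustration in Python rather than Haskell, using a Maybe-style `bind`; the names and the helper `safe_div` are invented for this example and are not anyone's actual library code:

```python
# A minimal sketch of monadic "bind" chaining, Maybe-style, in Python.
# This illustrates the idea discussed above; it is not Haskell's actual
# definitions, and the names here are invented for the example.

def bind(value, fn):
    """Apply fn to value, but short-circuit if value is None."""
    if value is None:      # pre-process: inspect the incoming argument
        return None        # failure propagates without calling fn
    return fn(value)

def safe_div(x, y):
    return None if y == 0 else x / y

# Chaining: each step only runs if the previous one succeeded.
result = bind(safe_div(10, 2), lambda v: bind(safe_div(v, 5), lambda w: w + 1))
ok = result                                    # 10/2 = 5.0, 5.0/5 = 1.0, +1 -> 2.0
bad = bind(safe_div(10, 0), lambda v: v + 1)   # None propagates, fn never runs
```

Whether this narrow framing covers all monads is exactly what is debated later in the thread; the Maybe case above is the simplest one that already stretches it.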
Walter Bright wrote:
> * People really understand and get testing and how it improves programming.

For example, this hilarious video was shown by James Bach, who created it: http://www.youtube.com/watch?v=M37VOKIaDUw
Apr 23 2010
Walter Bright:
> * Not all functional programming languages do concurrency well. Haskell and OCaml in particular have severe fundamental problems with it such that parallelizing your code makes it slower.

What kind of problems have you seen in Haskell? I have read several articles about parallel code written in Haskell, and its situation doesn't look so bad.

> Erlang and Clojure parallelize well, in that performance scales up proportionally as cores are added.

There is no way for this to be true in general. "Scalability" is largely a characteristic of the algorithm (and of the way its subparts exchange data), not of the language. And if you care about performance, Erlang is not the best: http://proliferationofniches.blogspot.com/2008/07/multi-core-problem.html Erlang is good for other things, like reliability.

> Message passing looks like the future.

Message passing is one future. There is no single silver bullet for the concurrency/parallelism problems. Different algorithms will need different solutions: message passing (actors, agents), data parallelism (AVX registers, GPU cores, vector operations, parallel loops, etc.), dataflow programming (http://en.wikipedia.org/wiki/Dataflow_programming), etc. D will need several solutions in its core language, standard library, and external libraries.

> * Monads have nothing in particular to do with I/O.

Right. They are used for I/O in Haskell, but they are a rather more general concept that can be used for other purposes too.

> * People really understand and get testing and how it improves programming.

And D unit testing is not good enough yet :-) I think dynamic languages have shown why testing is so useful (for statically compiled languages too).

Bye,
bearophile
Apr 23 2010
bearophile wrote:
> Walter Bright:
>> * Not all functional programming languages do concurrency well. Haskell and OCaml in particular have severe fundamental problems with it such that parallelizing your code makes it slower.
>
> What kind of problems have you seen in Haskell? I have read several articles about parallel code written in Haskell, and its situation doesn't look so bad.

It wasn't me, it was Russell Wider. He wrote a parallel pi calculating program in several languages, and then tried it with 1, 2, 4, and 8 cores. The more cores, the longer the Haskell program took. Charting the core use showed that only one core would run at a time. Same with OCaml. OCaml has a global interpreter lock, which explains its behavior. Russell didn't know why the Haskell behavior was so bad. He allowed that it was possible he was misusing it.

> > Erlang and Clojure parallelize well, in that performance scales up proportionally as cores are added.
>
> There is no way for this to be true in general. "Scalability" is largely a characteristic of the algorithm (and of the way its subparts exchange data), not of the language. And if you care about performance, Erlang is not the best: http://proliferationofniches.blogspot.com/2008/07/multi-core-problem.html Erlang is good for other things, like reliability.

With Erlang and Clojure and the parallel pi program, doubling the number of cores doubled the speed. Graphing the core use showed they were utilizing the cores simultaneously. Erlang was slow in general, but it *did* scale well with the number of cores.
Apr 23 2010
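The benchmark being discussed is pi by quadrature: pi equals the integral of 4/(1+x^2) over [0,1], with the slices split across workers. Russel Winder's actual code is not reproduced in the thread, so the following is only a hedged Python sketch of the shape of such a benchmark; the slice count, worker count, and function names are arbitrary choices for illustration:

```python
# Sketch of a parallel pi-by-quadrature benchmark: the [0,1] interval is
# split into N slices and the partial sums are computed by a pool of
# worker processes. Constants and names here are illustrative only.
from multiprocessing import Pool

N = 100_000      # total number of quadrature slices (arbitrary)
WORKERS = 4      # arbitrary pool size; the talk tested 1, 2, 4, 8 cores

def partial_sum(bounds):
    """Midpoint-rule sum of 4/(1+x^2) over this worker's slice range."""
    start, end = bounds
    delta = 1.0 / N
    return sum(4.0 / (1.0 + ((i + 0.5) * delta) ** 2)
               for i in range(start, end)) * delta

if __name__ == "__main__":
    chunk = N // WORKERS
    ranges = [(w * chunk, (w + 1) * chunk) for w in range(WORKERS)]
    with Pool(WORKERS) as pool:
        pi = sum(pool.map(partial_sum, ranges))   # embarrassingly parallel
    print(pi)   # close to 3.14159...
```

Because the slices are independent, a language with working data parallelism should scale this almost linearly with cores, which is exactly the behavior the thread is comparing across Haskell, OCaml, Erlang, and Clojure.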
Walter Bright wrote:
> It wasn't me, it was Russell Wider.

That's Russel Winder.
Apr 23 2010
Walter Bright:
> It wasn't me, it was Russell Wider. He wrote a parallel pi calculating program in several languages, and then tried it with 1, 2, 4, and 8 cores. The more cores, the longer the Haskell program took. Charting the core use showed that only one core would run at a time. Same with OCaml. OCaml has a global interpreter lock, which explains its behavior. Russell didn't know why the Haskell behavior was so bad. He allowed that it was possible he was misusing it.

You have just the illusion of having learned something about this. Trying to read too much from this single example is very wrong. A single benchmark, written by a person not expert in the language, means nearly nothing. You need at least a suite of good benchmarks, written by people who know the respective languages. And even then, you have just an idea of the situation.

Bye,
bearophile
Apr 23 2010
bearophile wrote:
> Walter Bright:
>> OCaml has a global interpreter lock, which explains its behavior. Russell didn't know why the Haskell behavior was so bad. He allowed that it was possible he was misusing it.
>
> You have just the illusion of having learned something about this. Trying to read too much from this single example is very wrong. A single benchmark, written by a person not expert in the language, means nearly nothing. You need at least a suite of good benchmarks, written by people who know the respective languages. And even then, you have just an idea of the situation.

Fair enough, but in order to dismiss the results I'd need to know *why* the Haskell version failed so badly, and why such a straightforward attempt at parallelism is the wrong solution for Haskell. You shouldn't have to be an expert in a language that is supposedly good at parallelism in order to get good results from it. (Russel may or may not be an expert, but he is certainly not a novice at FP or parallelism.)

Basically, I'd welcome an explanatory riposte to Russel's results.
Apr 23 2010
Walter Bright:
> You shouldn't have to be an expert in a language that is supposedly good at parallelism in order to get good results from it.

Being easy to learn and use is not one of the qualities of Haskell. If you want to write efficient programs in Haskell you need a lot of brains; you can see it also from the large amount of discussion here: http://www.haskell.org/haskellwiki/Great_language_shootout So I think you need experience and knowledge to do anything significant in Haskell, not just highly parallel programs.

> Basically, I'd welcome an explanatory riposte to Russel's results.

If you want an explanation then I think you have to ask in (for example) a Haskell newsgroup, etc.

> Fair enough, but in order to dismiss the results I'd need to know *why* the Haskell version failed so badly, and why such a straightforward attempt at parallelism is the wrong solution for Haskell.

Haskell not being easy, it's even possible that I won't understand the explanation if some Haskell expert eventually explains to me why that Haskell program was slow :-)

Bye,
bearophile
Apr 23 2010
bearophile wrote:
> Haskell not being easy, it's even possible that I won't understand the explanation if some Haskell expert eventually explains to me why that Haskell program was slow :-)

It's statements like this (and I've heard this repeatedly) that make me wonder what the value of Haskell actually is for conventional programming tasks and regular programmers.
Apr 23 2010
"Walter Bright" <newshound1 digitalmars.com> wrote in message news:hqsn2j$1s29$1 digitalmars.com...
> bearophile wrote:
>> Haskell not being easy, it's even possible that I won't understand the explanation if some Haskell expert eventually explains to me why that Haskell program was slow :-)
>
> It's statements like this (and I've heard this repeatedly) that make me wonder what the value of Haskell actually is for conventional programming tasks and regular programmers.

Not that I have an opinion either way, but FWIW, very similar things could probably be said for C++.

-------------------------------
Not sent from an iPhone.
Apr 23 2010
Nick Sabalausky wrote:
> "Walter Bright" <newshound1 digitalmars.com> wrote in message news:hqsn2j$1s29$1 digitalmars.com...
>> bearophile wrote:
>>> Haskell not being easy, it's even possible that I won't understand the explanation if some Haskell expert eventually explains to me why that Haskell program was slow :-)
>>
>> It's statements like this (and I've heard this repeatedly) that make me wonder what the value of Haskell actually is for conventional programming tasks and regular programmers.
>
> Not that I have an opinion either way, but FWIW, very similar things could probably be said for C++.

I know, and that creates an opportunity for other languages!
Apr 23 2010
"Walter Bright" <newshound1 digitalmars.com> wrote in message news:hqsuac$29rh$2 digitalmars.com...
> Nick Sabalausky wrote:
>> [...]
>>
>> Not that I have an opinion either way, but FWIW, very similar things could probably be said for C++.
>
> I know, and that creates an opportunity for other languages!

Definitely :)

Although, I guess what I meant by that was that if there were someone inexperienced in imperative systems languages, C++ would probably have a few things to teach them, even though it may present them in a much-less-than-ideal form.

-------------------------------
Not sent from an iPhone.
Apr 23 2010
retard wrote:
> Fri, 23 Apr 2010 10:57:31 -0700, Walter Bright wrote:
>> bearophile wrote:
>>> Haskell not being easy, it's even possible that I won't understand the explanation if some Haskell expert eventually explains to me why that Haskell program was slow :-)
>>
>> It's statements like this (and I've heard this repeatedly) that make me wonder what the value of Haskell actually is for conventional programming tasks and regular programmers.
>
> Regular programmers just die away. At some point we don't need crappy results anymore. Software engineering is often about reimplementing things. If a level 1 novice writes a blog engine, you need level 2..20 programmers to fix all the SQL injection / XSS bugs and caching issues. After that, even better programmers finally write maintainable and readable code. But it doesn't scale. That's why companies like Facebook hire guys like Andrei to fix the bugs caused by the 1st generation PHP newbies.

Good one, retard; that's really funny, and surprising that Andrei didn't bite :-)

Hard to imagine Andrei doing maintenance programming in some infidel programming language that doesn't have decent metaprogramming facilities though!!!
Apr 29 2010
Fri, 23 Apr 2010 06:23:22 -0700, Walter Bright wrote:
> bearophile wrote:
>> Walter Bright:
>>> OCaml has a global interpreter lock, which explains its behavior. Russell didn't know why the Haskell behavior was so bad. He allowed that it was possible he was misusing it.
>>
>> You have just the illusion of having learned something about this. Trying to read too much from this single example is very wrong. A single benchmark, written by a person not expert in the language, means nearly nothing. You need at least a suite of good benchmarks, written by people who know the respective languages. And even then, you have just an idea of the situation.
>
> Fair enough, but in order to dismiss the results I'd need to know *why* the Haskell version failed so badly, and why such a straightforward attempt at parallelism is the wrong solution for Haskell. You shouldn't have to be an expert in a language that is supposedly good at parallelism in order to get good results from it.

Why not? Do you think parallelism is simple to manage (efficiently)? No offence, but a total novice has zero understanding of e.g. threads or 3rd party libraries. The best he can do is to come up with something using the stdlib Thread classes. Usually it fails miserably due to deadlocks or other synchronization issues with locks.
Apr 23 2010
On Fri, 23 Apr 2010 19:09:55 -0400, retard <re tard.com.invalid> wrote:
> Fri, 23 Apr 2010 06:23:22 -0700, Walter Bright wrote:
>> You shouldn't have to be an expert in a language that is supposedly good at parallelism in order to get good results from it.
>
> Why not? Do you think parallelism is simple to manage (efficiently)? No offence, but a total novice has zero understanding of e.g. threads or 3rd party libraries. The best he can do is to come up with something using the stdlib Thread classes. Usually it fails miserably due to deadlocks or other synchronization issues with locks.

I think his point was that a person who *does* understand parallelism and threading couldn't get it right. Not being an expert in the *language* does not make you a novice at threading.

Of course, someone who does not understand threading/parallelism is bound to have trouble no matter what the language until he/she gains more experience. You almost have to experience a deadlock-after-2-weeks problem to really get how important threading issues are (I did).

-Steve
Apr 23 2010
Hello Walter,

> You shouldn't have to be an expert in a language that is supposedly good at parallelism in order to get good results from it.

Very good point. If a design with no blatant flaws performs like that with no easy-to-spot cause, I'd say there is a problem in the language, even if the problem is in the program.

-- 
... <IXOYE><
Apr 24 2010
On Fri, 23 Apr 2010 15:23:22 +0200, Walter Bright <newshound1 digitalmars.com> wrote:
> bearophile wrote:
>> [...]
>
> Fair enough, but in order to dismiss the results I'd need to know *why* the Haskell version failed so badly, and why such a straightforward attempt at parallelism is the wrong solution for Haskell. You shouldn't have to be an expert in a language that is supposedly good at parallelism in order to get good results from it. (Russel may or may not be an expert, but he is certainly not a novice at FP or parallelism.) Basically, I'd welcome an explanatory riposte to Russel's results.

IIRC Haskell's problems with concurrency have roots in its 100% lazy evaluation. Anyone wanting more details may find this page useful:

http://www.haskell.org/haskellwiki/Research_papers/Parallelism_and_concurrency

Petr
Apr 29 2010
On 04/29/2010 06:03 AM, Petr Kalny wrote:
> On Fri, 23 Apr 2010 15:23:22 +0200, Walter Bright <newshound1 digitalmars.com> wrote:
>> [...]
>
> IIRC Haskell's problems with concurrency have roots in its 100% lazy evaluation. Anyone wanting more details may find this page useful:
> http://www.haskell.org/haskellwiki/Research_papers/Parallelism_and_concurrency

Which specific papers are you referring to?

BTW, I wonder how current the page is. It features no paper from 2009 or 2010, one from 2008, none from 2007, and six from 2006. Of those, three links are broken.

Andrei
Apr 29 2010
On Thu, 29 Apr 2010 16:02:03 +0200, Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> wrote:
> On 04/29/2010 06:03 AM, Petr Kalny wrote:
>> IIRC Haskell's problems with concurrency have roots in its 100% lazy evaluation. Anyone wanting more details may find this page useful:
>> http://www.haskell.org/haskellwiki/Research_papers/Parallelism_and_concurrency
>
> Which specific papers are you referring to?
>
> BTW, I wonder how current the page is. It features no paper from 2009 or 2010, one from 2008, none from 2007, and six from 2006. Of those, three links are broken.

Right, I couldn't find the paper; I had read about concurrency in Haskell there as well. (But I hoped there might be some other useful information :o). After more searching I located the paper at:

http://research.microsoft.com/en-us/um/people/simonpj/papers/parallel/index.htm

Runtime Support for Multicore Haskell
http://research.microsoft.com/en-us/um/people/simonpj/papers/parallel/multicore-ghc.pdf

HTH
Petr
Apr 29 2010
Walter Bright Wrote:
> * Not all functional programming languages do concurrency well. Haskell and OCaml in particular have severe fundamental problems with it such that parallelizing your code makes it slower.

Do you have a reference on that? I'll produce one to the contrary: http://cgi.cse.unsw.edu.au/~dons/blog/2007/11/29#smoking-4core

> * Monads have nothing in particular to do with I/O.

Right.

> All a monad is is a way to insert code to pre-process arguments going to a function, and to post-process the result coming out of that function.

That's a much too narrow view. While it may apply roughly to some uses of monads, even something as simple as the Maybe monad doesn't fit into this mental model anymore. I'd really recommend spending a few days with Haskell. Even if it may not be the language you'll want to spend the rest of your life with, there's no denying that a lot of interesting ideas and research is going into Haskell.

(As an aside, I'm generally a bit put off by the hostility towards programming language research and theory in the D community. "We don't need no stinking theory, we'll just roll our own ad-hoc solution which will work much better because ivory-tower academics are completely out of touch with reality anyway." Bleh.)

If you try to put ideas of pure functional programming into D, I think it would be a good idea to at least be somewhat familiar with the way the reigning king of that particular niche does it.

-- Clemens
Apr 23 2010
Clemens wrote:
> Walter Bright Wrote:
>> * Not all functional programming languages do concurrency well. Haskell and OCaml in particular have severe fundamental problems with it such that parallelizing your code makes it slower.
>
> Do you have a reference on that? I'll produce one to the contrary: http://cgi.cse.unsw.edu.au/~dons/blog/2007/11/29#smoking-4core

All I've got is Russel Winder's talk on it, Parallelism: The Functional Imperative, with the code and benchmarks. He ran them in real time. http://www.russel.org.uk/
Apr 23 2010
Walter Bright Wrote:
> Clemens wrote:
>> Do you have a reference on that? I'll produce one to the contrary: http://cgi.cse.unsw.edu.au/~dons/blog/2007/11/29#smoking-4core
>
> All I've got is Russel Winder's talk on it, Parallelism: The Functional Imperative, with the code and benchmarks. He ran them in real time. http://www.russel.org.uk/

Ah, ok. As bearophile noted, that person seems to have not much experience with Haskell, to put it politely. Obviously I didn't see the presentation and don't want to judge too harshly, but if your summary is an accurate representation of its take-away points, that reeks badly of intellectual dishonesty and FUD. See my link.

Or put another way: would you like someone who has never used D before to do a live presentation on it and come to premature conclusions like this?
Apr 23 2010
Clemens wrote:
> Ah, ok. As bearophile noted, that person seems to have not much experience with Haskell, to put it politely. Obviously I didn't see the presentation and don't want to judge too harshly, but if your summary is an accurate representation of its take-away points, that reeks badly of intellectual dishonesty and FUD. See my link. Or put another way, would you like someone who has never used D before to do a live presentation on it and come to premature conclusions like this?

D is meant to give good results even for people who are not experts at it. If someone wrote a straightforward D app and it gave such poor results, I'd take it (and have taken such) as a problem that D needs to improve upon. For example, Andrei has expended a great deal of effort on making the naive use of stdio also the fast way.

I will send your link to Russel, I'm sure he'd be interested. I am also interested in *why* Russel's Pi program is a bad example of Haskell programming; it's not enough to dismiss it because Russel is not a Haskell expert.
Apr 23 2010
On Fri, 23 Apr 2010 06:30:13 -0700, Walter Bright wrote:
> D is meant to give good results even for people who are not experts at it. If someone wrote a straightforward D app and it gave such poor results, I'd take it (and have taken such) as a problem that D needs to improve upon. For example, Andrei has expended a great deal of effort on making the naive use of stdio also the fast way.
>
> I will send your link to Russel, I'm sure he'd be interested. I am also interested in *why* Russel's Pi program is a bad example of Haskell programming; it's not enough to dismiss it because Russel is not a Haskell expert.

OK, where's the naive version of the D Pi program that scales up with 1, 2, 4 cores? How far off are we? Is the concurrency module working with it yet?

-3.-1-4-1-5-9..
Michael Rynn
Apr 23 2010
Michael Rynn wrote:
> OK, where's the naive version of the D Pi program that scales up with 1, 2, 4 cores? How far off are we? Is the concurrency module working with it yet?

Nobody's written a library function to parallelize a map/reduce yet.
Apr 23 2010
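The library function being asked for, a parallel map feeding a reduce, can be sketched briefly. This is a hedged illustration in Python; the name `parallel_map_reduce` and its signature are invented for this example and are not the API of any existing D or Python library (in particular, not of the parallelFuture library mentioned below):

```python
# Sketch of a parallel map/reduce helper: the map step is fanned out
# across worker processes, the reduce step folds the results serially.
# Names and signatures here are invented for illustration.
from functools import reduce
from multiprocessing import Pool

def parallel_map_reduce(map_fn, reduce_fn, data, workers=4):
    """Apply map_fn to each item in parallel, then fold with reduce_fn."""
    with Pool(workers) as pool:
        mapped = pool.map(map_fn, data)   # parallel map step
    return reduce(reduce_fn, mapped)      # serial reduce step

def square(x):
    return x * x

if __name__ == "__main__":
    total = parallel_map_reduce(square, lambda a, b: a + b, range(10))
    print(total)   # 285, the sum of squares 0..9
```

The pi quadrature discussed earlier in the thread fits this shape directly: map each slice index to its quadrature term, then reduce with addition.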
On Fri, 23 Apr 2010 11:10:48 -0300, Walter Bright <newshound1 digitalmars.com> wrote:
> Michael Rynn wrote:
>> OK, where's the naive version of the D Pi program that scales up with 1, 2, 4 cores? How far off are we? Is the concurrency module working with it yet?
>
> Nobody's written a library function to parallelize a map/reduce yet.

Dave Simcha has.

Code: http://dsource.org/projects/scrapple/browser/trunk/parallelFuture/parallelFuture.d
Docs: http://cis.jhu.edu/~dsimcha/parallelFuture.html
Apr 23 2010
Robert Jacques wrote:
> On Fri, 23 Apr 2010 11:10:48 -0300, Walter Bright <newshound1 digitalmars.com> wrote:
>> Nobody's written a library function to parallelize a map/reduce yet.
>
> Dave Simcha has.
> Code: http://dsource.org/projects/scrapple/browser/trunk/parallelFuture/parallelFuture.d
> Docs: http://cis.jhu.edu/~dsimcha/parallelFuture.html

Cool!
Apr 23 2010
Walter Bright wrote:
> Robert Jacques wrote:
>> Dave Simcha has.
>> Code: http://dsource.org/projects/scrapple/browser/trunk/parallelFuture/parallelFuture.d
>> Docs: http://cis.jhu.edu/~dsimcha/parallelFuture.html
>
> Cool!

Unfortunately, it currently fails to compile with D2.
Apr 23 2010
== Quote from Walter Bright (newshound1 digitalmars.com)'s article
> Robert Jacques wrote:
>> Dave Simcha has.
>> Code: http://dsource.org/projects/scrapple/browser/trunk/parallelFuture/parallelFuture.d
>> Docs: http://cis.jhu.edu/~dsimcha/parallelFuture.html
>
> Unfortunately, it currently fails to compile with D2.

Can you tell me what errors you're getting? I realize that map and reduce are slightly brittle due to a combination of severe abuse of templates and subtle differences in the way different compiler releases handle IFTI, but for me all the unittests still compile and run successfully on 2.045. Also, I eat my own dogfood regularly and haven't noticed any problems with this lib, though the vast majority of my uses are the parallel foreach loop, not map and reduce.
May 08 2010
On 04/23/2010 10:10 AM, Walter Bright wrote:
> Michael Rynn wrote:
>> OK where's the naive version of the D Pi program that scales up with 1,2,4 cores? How far off are we? Is the concurrency module working with it yet?
> Nobody's written a library function to parallelize a map/reduce yet.
Funny you mention that. I actually started a map/reduce library that I was planning on having run in parallel on a single machine. I didn't get very far as I had to divert my attention elsewhere. I really need to get back to it because it was an interesting little problem to work on.

Casey
Apr 23 2010
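The "naive parallel Pi" benchmark the thread keeps returning to is a quadrature: pi = integral from 0 to 1 of 4/(1+x^2) dx, split across workers and reduced with a sum. The thread's actual versions are in D, Haskell, Erlang, and Clojure; the following is only an illustrative Python sketch of the map/reduce pattern under discussion, with function names and the chunking scheme invented here:

```python
import multiprocessing
from functools import reduce

def chunk_sum(args):
    """Midpoint-rule partial sum of 4/(1+x^2) over indices [start, stop) of an n-point grid on [0, 1]."""
    start, stop, n = args
    h = 1.0 / n
    return sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2) for i in range(start, stop)) * h

def parallel_pi(n=1_000_000, workers=4):
    """Map chunk_sum over disjoint index slices, then reduce the partial sums."""
    bounds = [(w * n // workers, (w + 1) * n // workers, n) for w in range(workers)]
    with multiprocessing.Pool(workers) as pool:
        return reduce(lambda a, b: a + b, pool.map(chunk_sum, bounds))

if __name__ == "__main__":
    print(parallel_pi())  # approximately 3.14159...
```

The point of the benchmark is that this decomposition is embarrassingly parallel, so a "naive" version in a language with good concurrency support should scale close to linearly with cores.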
On 04/23/2010 08:30 AM, Walter Bright wrote:
> Clemens wrote:
>> Ah, ok. As bearophile noted, that person seems to have not much experience with Haskell, to put it politely. Obviously I didn't see the presentation and don't want to judge too harshly, but if your summary is an accurate representation of its take-away points, that reeks badly of intellectual dishonesty and FUD. See my link. Or put another way, would you like someone who has never used D before to do a live presentation on it and come to premature conclusions like this?
> D is meant to give good results even for people who are not experts at it. If someone wrote a straightforward D app and it gave such poor results, I'd take it (and have taken such) as a problem that D needs to improve upon. For example, Andrei has expended a great deal of effort on making the naive use of stdio also the fast way.
And the correct way.

Andrei
Apr 23 2010
Andrei Alexandrescu wrote:
> On 04/23/2010 08:30 AM, Walter Bright wrote:
>> For example, Andrei has expended a great deal of effort on making the naive use of stdio also the fast way.
> And the correct way.
Yes. BTW, if it isn't obvious, the Erlang and Clojure versions of the Pi program were the naive approach, and produced expected multicore results.
Apr 23 2010
Walter Bright Wrote:
> Clemens wrote:
>> Ah, ok. As bearophile noted, that person seems to have not much experience with Haskell, to put it politely. Obviously I didn't see the presentation and don't want to judge too harshly, but if your summary is an accurate representation of its take-away points, that reeks badly of intellectual dishonesty and FUD. See my link. Or put another way, would you like someone who has never used D before to do a live presentation on it and come to premature conclusions like this?
> D is meant to give good results even for people who are not experts at it. If someone wrote a straightforward D app and it gave such poor results, I'd take it (and have taken such) as a problem that D needs to improve upon.
> I will send your link to Russel, I'm sure he'd be interested. I am also interested in *why* Russel's Pi program is a bad example of Haskell programming; it's not enough to dismiss it because Russel is not a Haskell expert.

Someone coming from C++ might think the following program entirely reasonable (and I did indeed make this mistake when starting with D):

    class A {
        this() { /* initialize me */ }
        void foo() { /* do smth */ }
    }

    void main() {
        A a;       // declares a null class reference; no object is created
        a.foo();   // blam - segfault right here
    }

This is about the level of understanding that seems to have been applied to Haskell in that example.

I tried to have a look at it (not that I'm anything near a Haskell expert), but this link just gives me an empty directory: http://www.russel.org.uk/Bazaar/Pi_Quadrature
Apr 23 2010
Clemens wrote:
> I tried to have a look at it (not that I'm anything near a Haskell expert), but this link just gives me an empty directory: http://www.russel.org.uk/Bazaar/Pi_Quadrature
I'll see if Russel will email me the code.
Apr 23 2010
On Fri, 23 Apr 2010 11:16:29 -0300, Clemens <eriatarka84 gmail.com> wrote:
[snip]
> I tried to have a look at it (not that I'm anything near a Haskell expert), but this link just gives me an empty directory: http://www.russel.org.uk/Bazaar/Pi_Quadrature
Try http://www.russel.org.uk:8080/Bazaar/Pi_Quadrature/changes
Apr 23 2010
Walter Bright, on April 23 at 06:30 you wrote to me:
> Clemens wrote:
>> Ah, ok. As bearophile noted, that person seems to have not much experience with Haskell, to put it politely. [...] Or put another way, would you like someone who has never used D before to do a live presentation on it and come to premature conclusions like this?
> D is meant to give good results even for people who are not experts at it. If someone wrote a straightforward D app and it gave such poor results, I'd take it (and have taken such) as a problem that D needs to improve upon.
It is very easy to write naive programs that have serious performance problems because of the GC. You have to be almost an expert to tune those programs to make them run fast. Ask bearophile and dsimcha for examples =)

-- 
Leandro Lucarella (AKA luca) http://llucax.com.ar/
GPG Key: 5F5A8D05 (F8CD F9A7 BF00 5431 4145 104C 949E BFB6 5F5A 8D05)
Hey you, don't help them to bury the light
Don't give in without a fight.
Apr 23 2010
Leandro Lucarella wrote:
> It is very easy to write naive programs that have serious performance problems because of the GC. You have to be almost an expert to tune those programs to make them run fast. Ask bearophile and dsimcha for examples =)
Relatively inexperienced D programmers should be able to apply straightforward solutions to common programming problems and expect correct behavior and reasonably acceptable performance. I doubt we will always be able to achieve that, but we should always be working towards it.
Apr 23 2010
If only multicore programming were all about finding Fibonacci numbers or the Nth digit of Pi, then maybe some of the claims in this thread would be true. :)
Apr 23 2010
Clemens, on April 23 at 09:06 you wrote to me:
> Walter Bright Wrote:
>> All I've got is Russel Winder's talk on it, Parallelism: The Functional Imperative, with the code and benchmarks. He ran them in real time. http://www.russel.org.uk/
> Ah, ok. As bearophile noted, that person seems to have not much experience with Haskell, to put it politely. [...] Or put another way, would you like someone who has never used D before to do a live presentation on it and come to premature conclusions like this?
Like using one of the corner cases where the GC really sucks. =)

-- 
Leandro Lucarella (AKA luca) http://llucax.com.ar/
Apr 23 2010
On 04/23/2010 07:00 AM, Clemens wrote:
> (As an aside, I'm generally a bit put off by the hostility towards programming language research and theory in the D community. "We don't need no stinking theory, we'll just roll our own ad-hoc solution which will work much better because ivory-tower academics are completely out of touch with reality anyway." Bleh.)
I hope that trend has been definitively reversed.

Andrei
Apr 23 2010
On Fri, 23 Apr 2010 16:00:32 +0400, Clemens <eriatarka84 gmail.com> wrote:
> Walter Bright Wrote:
>> * Not all functional programming languages do concurrency well. Haskell and OCaml in particular have severe fundamental problems with it such that parallelizing your code makes it slower.
> Do you have a reference on that? I'll produce one to the contrary: http://cgi.cse.unsw.edu.au/~dons/blog/2007/11/29#smoking-4core
Haskell is cool, but I am puzzled by that Haskell vs C example. What is he comparing? Parallel Haskell vs what? Also, he is right to use the word "naive", but more likely it is "naive use of the language" for a given algorithm, the very thing you are arguing against? I'd like to see comparisons of "non-naive" implementations too :)

-- 
Using Opera's revolutionary e-mail client: http://www.opera.com/mail/
Apr 24 2010
On 04/24/2010 01:13 AM, retard wrote:
> Maybe Walter is trying to break the world record for implementing things without understanding them first?
Didn't Walter implement templates without grokking them? I think I read that somewhere around here. That's quite a respectable feat, if you ask me.
Apr 29 2010
Pelle wrote:
> Didn't Walter implement templates without grokking them?
Yes.
> I think I read that somewhere around here. That's quite a respectable feat, if you ask me.
I also passed the quantum mechanics final in physics without understanding QM. I still understood how to apply the rules, though. On the other hand, I "got" Newtonian mechanics.
Apr 29 2010
Clemens:
> I'd really recommend spending a few days with Haskell.
I think 5 days of serious use are enough for Walter to learn some of the ideas of Haskell.

Bye,
bearophile
Apr 23 2010
On 23/04/2010 18:37, bearophile wrote:
> Clemens:
>> I'd really recommend spending a few days with Haskell.
> I think 5 days of serious use are enough for Walter to learn some of the ideas of Haskell.
Good to know that you are able to estimate how much time people need to understand functional languages. Well, I am not that smart; I already have problems understanding i = i + 1; in gpl.
Apr 23 2010
"retard" <re tard.com.invalid> wrote in message news:hqt94i$2sgv$2 digitalmars.com...
> Fri, 23 Apr 2010 08:57:54 -0500, Andrei Alexandrescu wrote:
>> On 04/23/2010 07:00 AM, Clemens wrote:
>>> (As an aside, I'm generally a bit put off by the hostility towards programming language research and theory in the D community. "We don't need no stinking theory, we'll just roll our own ad-hoc solution which will work much better because ivory-tower academics are completely out of touch with reality anyway." Bleh.)
>> I hope that trend has been definitively reversed.
> Instead of hostility we now have blissful ignorance. Maybe I should post here more often again..
When the academic researchers keep their work squirreled away in academic circles, written in such a convoluted style that only other long-term ivory-tower residents can get far enough past the language to see the actual meaning, it's a wonder that *anyone* finds it surprising that programmers are ignorant of it. And that's just the researchers who actually *do* know what they're doing. Let's not fool ourselves into thinking that the *majority* of academia actually knows its head from its ass (yea, that's right - I've brought it back to hostility).

-------------------------------
Not sent from an iPhone.
Apr 23 2010
On 04/23/2010 10:40 PM, Nick Sabalausky wrote:
> When the academic researchers keep their work squirreled away in academic circles, written in such a convoluted style that only other long-term ivory-tower residents can get far enough past the language to see the actual meaning, it's a wonder that *anyone* finds it surprising that programmers are ignorant of it.
The style of academic papers is not convoluted on purpose. The good papers discuss new solutions to difficult problems and therefore must be very precise so as to convey a trove of information in a short space. Preparing a good academic paper may take six months or more. A magazine article of the same length may take an afternoon.
> And that's just the researchers who actually *do* know what they're doing. Let's not fool ourselves into thinking that the *majority* of academia actually knows its head from its ass (yea, that's right - I've brought it back to hostility).
That's a truism. Clearly there will be many foot soldiers and few generals in any field. The majority of developers can also be considered to not quite know what they're doing. I've had an email diatribe with an acquaintance who had the same "academia sucks" stance. He kept going on about how pretentious and fake it was, and I couldn't figure out where he was coming from, until one day he mentioned he'd been an academic himself, so he had first-hand experience. The problem was he was in the outer academic circles that go through the motions of research (author papers, hold conferences, publish proceedings and journals) but aren't quite doing research. At that point I agreed with him.

Andrei
Apr 24 2010
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message news:hquqfm$2vse$1 digitalmars.com...
> The majority of developers can also be considered to not quite know what they're doing.
Heh. I've surprised a lot of laymen, after telling them I'm a programmer, with my opinions that most programmers are incompetent and most software and consumer electronics are terrible. It seems hugely ironic to those unfamiliar with the field, but being around it and (at the risk of narcissism) knowing what I'm doing puts me (along with many of the people on this board) in a prime position to notice flaws and steps backwards.
Apr 24 2010
Nick Sabalausky wrote:
> Heh. I've surprised a lot of laymen, after telling them I'm a programmer, with my opinions that most programmers are incompetent and most software and consumer electronics are terrible. [...]
I share your opinion that most software and consumer electronics is terrible. Of course, I've produced my share of terrible software, but I won't make any excuses for doing so. For example, my TV set crashes every once in a while and must be power cycled :-( The old analog sets never did that!
Apr 24 2010
Walter Bright wrote:
> I share your opinion that most software and consumer electronics is terrible. [...] For example, my TV set crashes every once in a while and must be power cycled :-( The old analog sets never did that!
That's nothing. My laptop power supply crashes every so often and starts failing to report itself to my laptop. It then has to be power cycled. I am not making this up!
Apr 25 2010
On 04/25/2010 04:55 AM, Gareth Charnock wrote:
> That's nothing. My laptop power supply crashes every so often and starts failing to report itself to my laptop. It then has to be power cycled. I am not making this up!
Well, actually that's a different matter altogether. Power supplies are switching devices that, when old, fail to maintain oscillation. When you power cycle them they usually re-prime themselves because there's some simple electronics that does it. If you listen carefully to the source, you may hear a high-pitched sound when it's working. The louder the noise, the older the source. Failure to report comes from the third wire that connects the source to the laptop. That wire is quite thin and is the first to break on an older source. The manifestation is that the laptop intermittently fails to figure out that it is connected to a correct power source. Time to change the power brick. Many go for under $10 on ebay, free shipping. (How the heck do they make money off them?)

Andrei
Apr 25 2010
Walter Bright wrote:
> For example, my TV set crashes every once in a while and must be power cycled :-( The old analog sets never did that!
Yeah, but your old telly didn't play youtube. How on earth we survived without an infinite number of low quality videos of cats doing vaguely amusing things will forever be a mystery.

-- 
My enormous talent is exceeded only by my outrageous laziness. http://www.ssTk.co.uk
Apr 25 2010
div0 wrote:
> Walter Bright wrote:
>> For example, my TV set crashes every once in a while and must be power cycled :-( The old analog sets never did that!
> Yeah, but your old telly didn't play youtube.
My current one doesn't either; it has no digital inputs. But it clearly has a computer internally. I bought it right before the collapse of LCD TV prices; it's the last of the flat-screen tube jobs.
Apr 25 2010
"div0" <div0 users.sourceforge.net> wrote in message news:hr14hk$2de0$1 digitalmars.com...
> Yeah, but your old telly didn't play youtube. How on earth we survived without an infinite number of low quality videos of cats doing vaguely amusing things will forever be a mystery.
We had to get by with Bob Saget saying vaguely amusing things over top of an infinite number of slightly-less-low-quality videos of people falling down.
Apr 25 2010
Nick Sabalausky wrote:
> We had to get by with Bob Saget saying vaguely amusing things over top of an infinite number of slightly-less-low-quality videos of people falling down.
Bob Saget was never amusing. Though I felt sorry for him; how many jokes could you make about the same pratfalls, over and over, week after week?
Apr 25 2010
On 26/04/10 07:56, Walter Bright wrote:
> Bob Saget was never amusing. Though I felt sorry for him; how many jokes could you make about the same pratfalls, over and over, week after week?
http://www.youtube.com/watch?v=0HW4mPZmKPM (NSFW) If you don't know, Bob Saget is the most blue comic I have ever heard. Which was surprising to me, only knowing him from AFV and Full House.
Apr 26 2010
Walter Bright Wrote:
> bearophile wrote:
>> Walter Bright:
>>> OCaml has a global interpreter lock which explains its behavior. Russell didn't know why the Haskell behavior was so bad. He allowed that it was possible he was misusing it.
>> You have just the illusion to have learned something about this. Trying to read too much from this single example is very wrong. A single benchmark, written by a person not expert in the language, means nearly nothing. You need at least a suite of good benchmarks, written by people that know the respective languages. And even then, you have just an idea of the situation.
> Fair enough, but in order to dismiss the results I'd need to know *why* the Haskell version failed so badly, and why such a straightforward attempt at parallelism is the wrong solution for Haskell. You shouldn't have to be an expert in a language that is supposedly good at parallelism in order to get good results from it. (Russel may or may not be an expert, but he is certainly not a novice at FP or parallelism.) Basically, I'd welcome an explanatory riposte to Russel's results.
This is a failure to read the manual, and arguably bad defaults by GHC here. The Haskell parMap versions work fine when you compile with -threaded and run with: +RTS -N

-- NUBIE
Apr 24 2010
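Concretely, the poster's point is about GHC's build and run flags: without them, a GHC binary runs everything on a single OS thread no matter how many sparks the program creates. A sketch of the invocation (the source file name here is hypothetical; the flags are the ones the post names):

```shell
# -threaded links the threaded runtime; +RTS -N tells it to use all available cores.
ghc -O2 -threaded PiQuadrature.hs -o pi_quadrature
./pi_quadrature +RTS -N
```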
Nick Sabalausky Wrote:
> When the academic researchers keep their work squirreled away in academic circles, written in such a convoluted style that only other long-term ivory-tower residents can get far enough past the language to see the actual meaning, it's a wonder that *anyone* finds it surprising that programmers are ignorant of it.
I have no formal education in computer science; in fact, I'm a self-taught programmer, and I had no problem understanding several papers on Haskell which I was pointed towards. It might take some dedication in some cases, but as Andrei pointed out: the problems are hard.
> And that's just the researchers who actually *do* know what they're doing. Let's not fool ourselves into thinking that the *majority* of academia actually knows its head from its ass (yea, that's right - I've brought it back to hostility).
Oh well. Thanks for proving my point.
Apr 26 2010