
digitalmars.D.announce - A safer/better C++?

reply "Walter Bright" <newshound digitalmars.com> writes:
Here's a lively debate over in comp.lang.c++.moderated people might be
interested in:

http://groups.google.com/group/comp.lang.c++.moderated/browse_thread/thread/60117e9c1cd1c510/c92f7fd0dc9fedd1?lnk=st&q=safer+better+c%2B%2B&rnum=1&hl=en#c92f7fd0dc9fedd1
Dec 11 2005
Sean Kelly <sean f4.ca> writes:
Walter Bright wrote:
 Here's a lively debate over in comp.lang.c++.moderated people might be
 interested in:
 
 http://groups.google.com/group/comp.lang.c++.moderated/browse_thread/thread/60117e9c1cd1c510/c92f7fd0dc9fedd1?lnk=st&q=safer+better+c%2B%2B&rnum=1&hl=en#c92f7fd0dc9fedd1
Wow. That thread has positively exploded in the past few days. Looks like I've
got some catching up to do :-)

Sean
Dec 11 2005
pragma <pragma_member pathlink.com> writes:
In article <dnhvoi$dja$1 digitaldaemon.com>, Walter Bright says...
Here's a lively debate over in comp.lang.c++.moderated people might be
interested in:

http://groups.google.com/group/comp.lang.c++.moderated/browse_thread/thread/60117e9c1cd1c510/c92f7fd0dc9fedd1?lnk=st&q=safer+better+c%2B%2B&rnum=1&hl=en#c92f7fd0dc9fedd1
That didn't format correctly in my browser, so here's a tiny version for folks
who have the same problem: http://tinyurl.com/9hl3r

- EricAnderton at yahoo
Dec 11 2005
pragma <pragma_member pathlink.com> writes:
In article <dnhvoi$dja$1 digitaldaemon.com>, Walter Bright says...
Here's a lively debate over in comp.lang.c++.moderated people might be
interested in:

http://groups.google.com/group/comp.lang.c++.moderated/browse_thread/thread/60117e9c1cd1c510/c92f7fd0dc9fedd1?lnk=st&q=safer+better+c%2B%2B&rnum=1&hl=en#c92f7fd0dc9fedd1
Interesting! One choice quote, if I may:

----------------------
From: Kai-Uwe Bux <jkherci... gmx.net>
Date: 7 Dec 2005 11:00:04 -0500
----------------------

Peter Most wrote:

 Hello everybody,
 I would like to get some opinions about whether it would make sense to
 change the C++ design principles, so it would become safer for beginners?
[snipped: suggestion to swap at() and operator[] in std::vector]

I do not think this is a good idea. I would rather have operator[] do
something like:

    reference operator[]( size_type pos )
    {
        assert( pos < this->size() );
        return ...;
    }

so that if DEBUG is defined I will have bug detection, and I do not have a
performance penalty in production code.
----------------------

The rest of the thread reminds me of the debate of '04, where folks* were
going back and forth over Error vs Exception and whether or not the current
approach is valid. (*well, okay, it was mostly Matthew)

- EricAnderton at yahoo
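P.S. For comparison, D bakes that sort of debug-only checking right into the
language: array indexing is bounds-checked by default, and the checks are
removed with -release, much like assert() under NDEBUG. A minimal sketch of my
own (not from the quoted thread; assumes a recent DMD):

    void main()
    {
        int[] a = new int[3];

        // Checked by default: an out-of-range index throws at runtime.
        // Compile with -release and the check disappears, along with
        // its cost.
        a[5] = 1;
    }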
Dec 11 2005
Dave <Dave_member pathlink.com> writes:
In article <dnhvoi$dja$1 digitaldaemon.com>, Walter Bright says...
Here's a lively debate over in comp.lang.c++.moderated people might be
interested in:

http://groups.google.com/group/comp.lang.c++.moderated/browse_thread/thread/60117e9c1cd1c510/c92f7fd0dc9fedd1?lnk=st&q=safer+better+c%2B%2B&rnum=1&hl=en
#c92f7fd0dc9fedd1
After reading some of the threads, and seeing mention of D in there basically
just get ignored, it becomes apparent that one could come up with the better
mousetrap and still the very people who should take notice won't even
acknowledge the existence of something that quite probably is not only better
but readily available.

Take me for example: I first noticed D two years ago while heavily involved in
C++ development. Primarily four thoughts came to mind back then, and it took
me a year of ignorance before I finally started to look seriously at D:

1) Garbage collected, ergo slow (wrong!).
2) The OOP support seems a lot like Java, ergo slow (wrong!).
3) If I already know (or think I know) a lot about C++, and D doesn't offer
any features that I can't hack together in some way in C++, why switch?

Ultimately the first two assumptions ended up being just plain wrong (on
several levels), and are probably more important to me than to many people
anyway. But the last two I think still have some merit w.r.t. why people will
not take the leap and try D. The arguments in favor of D for 3) and 4), such
as:

- productivity
- power
- speed
- improved safety
- unittest
- DbC
- C library compatibility
- built-in AA's, first class arrays, foreach, etc., etc...
- C-like support for naked pointers
- built-in inline assembler spec.

all rolled into one language, only become apparent after trying D.

So it seems D is stuck in the proverbial catch-22 here, where the real merits
of the language only become apparent after learning how to use it, yet those
who need to try it won't because of their (understandable) assumptions...

How can D break out of this situation?
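P.S. To give a tiny taste of a few items on that list - unittest, DbC,
built-in AA's and foreach - here's a minimal sketch of my own (nothing
official; assumes a recent DMD and Phobos):

    import std.string;   // for split()

    // Count how often 'key' appears in 'words'.
    int count(char[][] words, char[] key)
    in { assert(key.length > 0); }   // DbC precondition
    body
    {
        int[char[]] freq;            // built-in associative array
        foreach (char[] w; words)    // built-in foreach
        {
            if (int* p = w in freq)  // 'in' yields a pointer to the value
                (*p)++;
            else
                freq[w] = 1;
        }
        int* p = key in freq;
        return p ? *p : 0;
    }

    unittest   // built-in unit tests, run with: dmd -unittest
    {
        assert(count(split("the quick the"), "the") == 2);
    }

    void main() {}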
Dec 11 2005
next sibling parent reply "Kris" <fu bar.com> writes:
"Dave" <Dave_member pathlink.com> wrote

 How can D break out of this situation?
Years ago I would have waxed on about the merits of grass-roots, technical
superiority, clarity of focus and so on. That's all so wrong ;-)

When D is deemed "ready", what it needs is widespread, persistent, and
memorable ... marketing.

I don't know how that can be done on a small budget, even with the web. Do you
know? Or is there some obvious way to get that sort of thing funded? Through
licensing perhaps?
Dec 11 2005
clayasaurus <clayasaurus gmail.com> writes:
Kris wrote:
 "Dave" <Dave_member pathlink.com> wrote
 
 
How can D break out of this situation?
Years ago I would have waxed on about the merits of grass-roots, technical superiority, clarity of focus and so on. That's all so wrong ;-) When D is deemed "ready", what it needs is widespread, persistent, and memorable ... marketing I don't know how that can be done on a small budget, even with the web. Do you know? Or is there some obvious way to get that sort of thing funded? Through licensing perhaps?
It would be nice if everyone just 'got it,' but most people want to stay with
the crowd. Grass roots works, but it is slow because it is not mass spoon-fed
into the population like marketing is.

I don't think it is time to worry about publicity now; if people started
taking D seriously today they might be put off by the lack of compiler options
and an 'incomplete' language. Better to leave it to the adventurous and those
who do understand the benefits.

Here are some things I foresee as having a big enough impact for the masses to
start noticing D:

1) A stable 1.0 compiler, for those who know about D but don't want the risk
of a beta compiler
2) When the time is ready, another Slashdot article
3) Acceptance among Linux distros
4) A book or two

If D doesn't start catching fire after that, then I'd start to worry about why
people are not accepting it. Of course, let's hope this slightly modified
Matrix quote isn't true :o

"Did you know that the first version of C++ was designed to be perfect? Where
none suffered, where everyone would be happy. It was a disaster. No one would
accept it. Entire programmers were lost. Some believed we lacked the means to
describe your perfect language. But I believe that, as a species, human beings
define their reality through suffering and misery. The perfect language was a
dream that your primitive cerebrum kept trying to wake up from. Which is why
C++ was redesigned to this: the peak of its popularity." - Agent Smith
Dec 11 2005
Anders F Björklund <afb algonet.se> writes:
clayasaurus wrote:

 Here are some things I forsee as having a big enough impact for the 
 masses to start noticing D.
See also: http://prowiki.org/wiki4d/wiki.cgi?HelpDProgress

--anders
Dec 11 2005
BCS <BCS_member pathlink.com> writes:
In article <dni880$11rv$1 digitaldaemon.com>, Kris says...
"Dave" <Dave_member pathlink.com> wrote

 How can D break out of this situation?
Years ago I would have waxed on about the merits of grass-roots, technical superiority, clarity of focus and so on. That's all so wrong ;-) When D is deemed "ready", what it needs is widespread, persistent, and memorable ... marketing I don't know how that can be done on a small budget, even with the web. Do you know? Or is there some obvious way to get that sort of thing funded? Through licensing perhaps?
How about a programming challenge? Develop a programming problem that requires the contestant to program against a D language API.
Dec 11 2005
Knud Sørensen <12tkvvb02 sneakemail.com> writes:
On Sun, 11 Dec 2005 23:28:12 +0000, BCS wrote:

 In article <dni880$11rv$1 digitaldaemon.com>, Kris says...
"Dave" <Dave_member pathlink.com> wrote

 How can D break out of this situation?
Years ago I would have waxed on about the merits of grass-roots, technical superiority, clarity of focus and so on. That's all so wrong ;-) When D is deemed "ready", what it needs is widespread, persistent, and memorable ... marketing I don't know how that can be done on a small budget, even with the web. Do you know? Or is there some obvious way to get that sort of thing funded? Through licensing perhaps?
How about a programming challenge? Develop a programming problem that requires the contestant to program against a D language API.
A programming challenge is a good idea. What about setting up some bounties
for writing some code modules for the D std library? Or maybe make a website
where people can make bounty jars for their favorite code module, and then let
everyone interested in the module contribute with donations.
Dec 11 2005
prev sibling next sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"BCS" <BCS_member pathlink.com> wrote in message
news:dnicmc$1c07$1 digitaldaemon.com...
 How about a programming challenge? Develop a programming problem that
requires
 the contestant to program against a D language API.
I've thought about this from time to time, but a reasonable way to implement it always seemed to escape me.
Dec 11 2005
Knud Sørensen <12tkvvb02 sneakemail.com> writes:
On Sun, 11 Dec 2005 15:41:40 -0800, Walter Bright wrote:

 
 "BCS" <BCS_member pathlink.com> wrote in message
 news:dnicmc$1c07$1 digitaldaemon.com...
 How about a programming challenge? Develop a programming problem that
requires
 the contestant to program against a D language API.
I've thought about this from time to time, but a reasonable way to implement it always seemed to escape me.
How about this: the challenge is to write the module they would like to see in
the D std lib (Phobos, Ares?).

1) Set a deadline three months out.
2) Then all the entries are published for criticism and suggestions from the D
community.
3) A month later there is a deadline for the final version.
4) Then the winners are chosen and they get their prizes.
5) Then the modules that have the quality and functionality needed are added
to the std lib.

Knud
Dec 11 2005
pragma <pragma_member pathlink.com> writes:
In article <dnif07$1h3a$1 digitaldaemon.com>, Walter Bright says...
"BCS" <BCS_member pathlink.com> wrote in message
news:dnicmc$1c07$1 digitaldaemon.com...
 How about a programming challenge? Develop a programming problem that
requires
 the contestant to program against a D language API.
I've thought about this from time to time, but a reasonable way to implement it always seemed to escape me.
Would the opposite also apply? Say, a programming contest that is open to
compiled languages and doesn't exclude D? For example, take this
speed-programming contest I stumbled onto today:

http://www.ludumdare.com/

(code a working game within 48 hours - use anything you want)

While not an outstanding example, it does demonstrate that there are other
grass-roots niches out there that are *easily* within reach of D.

------------

Assuming that we don't have the manpower and resources to mount the kind of ad
campaign that D needs, the best we can do is improve the product itself to
make it *the* most obvious choice for its niche. And it has to beat years of
mindshare and momentum behind C++ and Java to do that. This means writeups,
tutorials, FAQs (we don't even have one for the NG), manuals, etc. Lots of
un-fun, non-paying, non-programming work.

Honestly, we're still building the toolchain and assembling 1st-tier
applications (IDEs, shell utils, kernels, servers, daemons), so I think the
current mode is about where I'd expect us to be. Once D has something on par
with JDK 1.0, marketing becomes easier; we're quite close to that now. Once
DMD manages to shed its beta status, and the associated ABIs and documentation
are finalized, continued development on the compiler and spec becomes a
*selling point* rather than a fault. It would allow all of us to say that D is
"being actively improved" rather than left to languish with any faults it may
have.

And not to beat a dead horse, but I still think there's more that can be done
on the web front that is directly under the community's control. There is a
*huge* vacancy in the D community for press, publications, and editorials that
could be available as a daily/weekly/monthly website. The current crop of
sites have the "what" and "where", but nothing has the "who" and, most
importantly, the "why". The "how" is scattered about, which is fine, but
there's no authoritative source to lend coherence to the community in this
regard (IMO, the wiki isn't groomed enough to count).

- EricAnderton at yahoo
Dec 11 2005
prev sibling parent reply "Lionello Lunesu" <lio remove.lunesu.com> writes:
 How about a programming challenge? Develop a programming problem that 
 requires
 the contestant to program against a D language API.
There's always the language shoot-out, but I've recently noticed that Intel's
C compiler took over the first spot!?

L.
Dec 12 2005
parent reply "Dave" <Dave_member pathlink.com> writes:
"Lionello Lunesu" <lio remove.lunesu.com> wrote in message 
news:dnjo96$17tr$1 digitaldaemon.com...
 How about a programming challenge? Develop a programming problem that 
 requires
 the contestant to program against a D language API.
There's always the language shoot-out, but I've recently noticed that intel's C compiler took over the first spot!? L.
Yes, and SmartEiffel just took over the lead from both...

There are some problems with a couple of tests that run well on my system, but
don't on theirs. The tests are "Reverse Complement" and "Cheap-Concurrency".
More is discussed here:

http://www.digitalmars.com/drn-bin/wwwnews?digitalmars.D.learn/2287

(Basically, Reverse Complement runs (and should) in half a second on my
machine, but takes 10 secs. on their equally matched machine.) Anybody running
Debian unstable they could use to check these problems out?

Another problem with "Cheap-Concurrency" is that the number of threads that
Phobos supports on Linux is artificially low (hence the loop that re-creates
threads).

Also, we're missing two tests and I don't have the time to figure them out:
Regex DNA and Chameneos. Anyone want to take a stab at those? Make sure to
follow the "rules" for each test; the FAQ for the site is here:

http://shootout.alioth.debian.org/faq.php

Thanks,

- Dave
Dec 12 2005
Oskar Linde <oskar.lindeREM OVEgmail.com> writes:
Dave wrote:

 Also, we're missing two tests and I don't have the time to figure them out: 
 Regex DNA and Chameneos. Anyone want to take a stab at those? Make sure to 
 follow the "rules" for each test; the FAQ for the site is here: 
I implemented the Regex DNA test a while back. Really straightforward, looking
at other solutions :). I never submitted it because my program was painfully
slow. I guess either the D regex implementation is extremely slow or I'm not
using it correctly.

/Oskar
Dec 12 2005
prev sibling next sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Dave" <Dave_member pathlink.com> wrote in message
news:dni7a9$viu$1 digitaldaemon.com...
 After reading some of the threads, and seeing mention of D in there
basically
 just get ignored, it becomes apparent that one could come up with the
better
 mousetrap and still the very people who should take notice won't even
 acknowledge the existance of something that quite probably is not only
better
 but readily available.
There is some curious psychology going on with this. I've met many well-known
C++ experts who won't say anything positive about D in public, but in private
to me they are quite enthusiastic about it. It's almost as if they fear
censure or loss of face?

One thing D does is defy the conventional wisdom about C++. There are two
pieces of it that you mention; some others are:

3) C++'s non-safety is necessary to get its power and flexibility
4) The solution to C++ programming language problem X is to educate the
programmers better
 So it seems D is stuck in the proverbial 'catch 22' situation here, where
the
 real merits of the language only become apparent after learning how to use
it,
 yet those who need to try it won't because of their (understandable)
 assumptions...

 How can D break out of this situation?
The biggest thing is to write articles about D and get them published. Present
papers on D at conferences. Continue to bring up D in contexts where it is
appropriate.

Dave - an article about performance? Kris - an article about Mango? Don - an
article about D templates? Thomas - an article about your automated testing
system? David - an article about implementing GDC? Everyone else, too, who has
made important contributions to D and this n.g. You guys all know your stuff,
and these would be important, interesting articles.

Me, I'll be presenting at Amazon's developer conference in January and at
SDWest in March. But D can't succeed if it's just me writing articles and
presenting.

Would it help if I offered a bounty for articles? <g>
Dec 11 2005
next sibling parent reply "Kris" <fu bar.com> writes:
"Walter Bright" <newshound digitalmars.com> wrote
 Would it help if I offered a bounty for articles? <g>
People are often financially motivated when there's no other truly driving
reason ;-)

The problem with articles is how do you "reject" contributions that are really
poor? Ones which might actually cause more harm than good? It's tough to do
that in an open-source environment without upset. Thus, a financial reward for
articles deemed "worthy" might be something useful, but you'd probably have to
make someone else "responsible" for such filtering :-)

Still, those would be nice problems to have!
Dec 11 2005
next sibling parent "Kris" <fu bar.com> writes:
On second thoughts, this is bullshit. Just ignore it.

I should have noted that "I'd be a bit upset if an article written by me 
were rejected". It wasn't meant to be the kind of poxy generalization it 
sounds like :-(


"Kris" <fu bar.com> wrote in message news:dnif29$1h7k$1 digitaldaemon.com...
 "Walter Bright" <newshound digitalmars.com> wrote
 Would it help if I offered a bounty for articles? <g>
People are often financially motivated when there's no other truly driving reason ;-) The problem with articles is how do you "reject" contributions that are really poor? Ones which might actually cause more harm than good? It's tough to do that in an open-source environment without upset. Thus, a financial reward for articles deemed "worthy" might be something useful, but you'd probably have to make someone else "responsible" for such filtering :-) Still, those would be nice problems to have!
Dec 11 2005
Brad Anderson <brad dsource.dot.org> writes:
Kris wrote:
 "Walter Bright" <newshound digitalmars.com> wrote
 Would it help if I offered a bounty for articles? <g>
People are often financially motivated when there's no other truly driving reason ;-) The problem with articles is how do you "reject" contributions that are really poor? Ones which might actually cause more harm than good? It's tough to do that in an open-source environment without upset. Thus, a financial reward for articles deemed "worthy" might be something useful, but you'd probably have to make someone else "responsible" for such filtering :-) Still, those would be nice problems to have!
Kris,

Think more outside the box... The bounty could be compiler/language features
that you've been talking about in this NG. Give an article, get some Walter
love into DMD and the spec...

BA
Dec 11 2005
parent "Kris" <fu bar.com> writes:
"Brad Anderson" <brad dsource.dot.org> wrote
 Kris wrote:
 Still, those would be nice problems to have!
Kris, Think more outside the box... The bounty could be compiler/language features that you've been talking about in this NG. Give an article, get some Walter love into DMD and the spec...
I'll just pray that some don't write good articles :-D

To play devil's advocate for the moment: I'm not at all sure that a
potentially "wonderful" article from me on "how D array-slicing can notably
reduce Google's cluster-expenses and utility-bills" is gonna' result in
read-only arrays for the language ~ the things I want to see appear to be
overly controversial :-)

This (above) is all very tongue-in-cheek though, and you certainly make a
valid point :: my focus is too narrow.
Dec 11 2005
prev sibling next sibling parent "Walter Bright" <newshound digitalmars.com> writes:
"Kris" <fu bar.com> wrote in message news:dnif29$1h7k$1 digitaldaemon.com...
 "Walter Bright" <newshound digitalmars.com> wrote
 Would it help if I offered a bounty for articles? <g>
People are often financially motivated when there's no other truly driving reason ;-) The problem with articles is how do you "reject" contributions that are really poor? Ones which might actually cause more harm than good? It's
tough
 to do that in an open-source environment without upset. Thus, a financial
 reward for articles deemed "worthy" might be something useful, but you'd
 probably have to make someone else "responsible" for such filtering :-)

 Still, those would be nice problems to have!
I have thought of this from time to time, and that's primarily why I haven't
done it. More than once, a valuable contributor to D has left because they
were upset that I didn't agree with a feature they wanted in. Sigh.

How about I pay the bounty if the article gets published in Dr. Dobb's or CUJ?
Then I'm not making the decision, the magazine editors are, and I have no
influence over them <g>.
Dec 11 2005
Sean Kelly <sean f4.ca> writes:
Kris wrote:
 "Walter Bright" <newshound digitalmars.com> wrote
 Would it help if I offered a bounty for articles? <g>
People are often financially motivated when there's no other truly driving reason ;-) The problem with articles is how do you "reject" contributions that are really poor? Ones which might actually cause more harm than good? It's tough to do that in an open-source environment without upset. Thus, a financial reward for articles deemed "worthy" might be something useful, but you'd probably have to make someone else "responsible" for such filtering :-)
I'm sure Walter would be willing to review any proposed articles :-) Writing
doesn't occur in a vacuum, after all.

Sean
Dec 12 2005
prev sibling next sibling parent reply "John C" <johnch_atms hotmail.com> writes:
"Walter Bright" <newshound digitalmars.com> wrote in message 
news:dniddb$1dk0$1 digitaldaemon.com...
 "Dave" <Dave_member pathlink.com> wrote in message
 news:dni7a9$viu$1 digitaldaemon.com...
 After reading some of the threads, and seeing mention of D in there
basically
 just get ignored, it becomes apparent that one could come up with the
better
 mousetrap and still the very people who should take notice won't even
 acknowledge the existance of something that quite probably is not only
better
 but readily available.
There is some curious psychology going on with this. I've met many well known C++ experts who won't say anything positive about D in public, but in private to me they are quite enthusiastic about it. It's almost as if they fear censure or loss of face? One thing D does is defy the conventional wisdom about C++. There are two that you mention; some others are: 3) C++'s non-safety is necessary to get its power and flexibility 4) The solution to C++ programming language problem X is to educate the programmers better
 So it seems D is stuck in the proverbial 'catch 22' situation here, where
the
 real merits of the language only become apparent after learning how to 
 use
it,
 yet those who need to try it won't because of their (understandable)
 assumptions...

 How can D break out of this situation?
The biggest thing is to write articles about D and get them published. Present papers on D at conferences. Continue to bring up D in contexts where it is appropriate. Dave - an article about performance? Kris - an article about Mango? Don - an article about D templates? Thomas - an article about your automated testing system? David - an article about implementing GDC? Everyone else, too, who has made important contributions to D and this n.g. You guys all know your stuff and these would be important, interesting articles.
What about a place where all this expertise can be shared, where people can
publish articles, tips and code? I'm thinking of an MSDN for D - a D
Developers Network, if you like. Community blogs are also pretty effective.
Artima (http://www.artima.com/) might be a good model to follow.

I'm not sure getting into CUJ is the best way - it's primarily a C++ magazine,
and while there are occasional series on other languages, how many articles
about D will its editors take?
 Me, I'll be presenting at Amazon's developer conference in January and in
 SDWest in March. But D can't succeed if it's just me writing articles and
 presenting.

 Would it help if I offered a bounty for articles? <g>
Rhetorical, surely.
Dec 12 2005
pragma <pragma_member pathlink.com> writes:
In article <dnjmmu$14mp$1 digitaldaemon.com>, John C says...
What about a place where all this expertise can be shared, where people can 
publish articles, tips and code? I'm thinking of an MSDN for D, a D 
Developers Network if you like. Community blogs are also pretty effective. 
Artima (http://www.artima.com/) might be a good model to follow.
:) Glad I'm not the only one thinking about this:

http://www.digitalmars.com/drn-bin/wwwnews?digitalmars.D.announce/2082

I've never seen Artima, but I'll check it out.
I'm not sure getting into CUJ is the best way - it's primarily a C++ 
magazine, and while there are occasional series on other languages, how many 
articles about D will its editors take?
You hit the nail right on the head. I think D will always make great "guest
appearances" there, but ultimately, it's not our scene. D has its own
mentality, momentum and community apart from C++ now, and it will continue to
grow that way. Better to build things up much like the Python, PHP and web
standards folks have, rather than seek exposure in a community that has too
much invested in how things are done without D.

- EricAnderton at yahoo
Dec 12 2005
next sibling parent "John C" <johnch_atms hotmail.com> writes:
"pragma" <pragma_member pathlink.com> wrote in message 
news:dnk5q0$1ofo$1 digitaldaemon.com...
 In article <dnjmmu$14mp$1 digitaldaemon.com>, John C says...
What about a place where all this expertise can be shared, where people 
can
publish articles, tips and code? I'm thinking of an MSDN for D, a D
Developers Network if you like. Community blogs are also pretty effective.
Artima (http://www.artima.com/) might be a good model to follow.
:) Glad I'm not the only one thinking about this. http://www.digitalmars.com/drn-bin/wwwnews?digitalmars.D.announce/2082
So I see. If I had some spare cash for a web host and domain, I'd get it off
the ground.

John.
 I've never seen artima, but I'll check it out.

I'm not sure getting into CUJ is the best way - it's primarily a C++
magazine, and while there are occasional series on other languages, how 
many
articles about D will its editors take?
You hit the nail right on the head. I think D will always make great "guest appaearances" there but utlimately, its not our scene. D has its own mentality, momentum and community apart from C++ now, and it will continue to grow that way. Better to build things up much like the Python, PHP and web standards folks have rather than seek exposure in a community that has too much invested in how things are done without D. - EricAnderton at yahoo
Dec 12 2005
John Reimer <terminal.node gmail.com> writes:
pragma wrote:
 In article <dnjmmu$14mp$1 digitaldaemon.com>, John C says...
 What about a place where all this expertise can be shared, where people can 
 publish articles, tips and code? I'm thinking of an MSDN for D, a D 
 Developers Network if you like. Community blogs are also pretty effective. 
 Artima (http://www.artima.com/) might be a good model to follow.
:) Glad I'm not the only one thinking about this. http://www.digitalmars.com/drn-bin/wwwnews?digitalmars.D.announce/2082 I've never seen artima, but I'll check it out.
I think that's the site that Matthew Wilson often referred to. I think it's
fairly respected for its high-quality technical articles/blogs on software
development topics.

-JJR
Dec 12 2005
prev sibling parent "Walter Bright" <newshound digitalmars.com> writes:
"John C" <johnch_atms hotmail.com> wrote in message
news:dnjmmu$14mp$1 digitaldaemon.com...
 "Walter Bright" <newshound digitalmars.com> wrote in message
 news:dniddb$1dk0$1 digitaldaemon.com...
 What about a place where all this expertise can be shared, where people
can
 publish articles, tips and code? I'm thinking of an MSDN for D, a D
 Developers Network if you like. Community blogs are also pretty effective.
 Artima (http://www.artima.com/) might be a good model to follow.
Artima is interested in D articles. That said, there's already a D developers
network - it's this n.g.
 I'm not sure getting into CUJ is the best way - it's primarily a C++
 magazine, and while there are occasional series on other languages, how
many
 articles about D will its editors take?
CUJ is interested in D articles. They're not going to do D cover to cover, but that's not necessary. A good article on D now and then is all that's required.
Dec 12 2005
John Reimer <terminal.node gmail.com> writes:
Walter Bright wrote:

 There is some curious psychology going on with this. I've met many well
 known C++ experts who won't say anything positive about D in public, but in
 private to me they are quite enthusiastic about it. It's almost as if they
 fear censure or loss of face?
 
Very interesting that you mention this... such "peer pressure" is noticeable
in the scientific community as well.

Every so often some bright but hapless scientist presents research that
counters prior evidence on a politically loaded issue: for example, anything
that even hints of influencing opinion on ethics or morality or metaphysics;
or anything that might remotely influence popularly accepted industry -- think
pharmacology or immunology; or anything contrary to politically popular themes
-- think "greenhouse" gas emission sources. No matter that his "unusual"
results be right or true, he is derided severely by his peers who support the
popular opinion, whatever that may be at the time (and oh how that changes).
Many of these, though few they be, lose jobs, face, and position in the
scientific community merely for presenting objectively relevant material. The
scientific community is not always the objective giant that the general public
is made to believe it is.

Ever get tired of reading the "scientists say..." quotes? It's as if we are
meant to believe there's always a wonderful consensus on such matters.

It's the same with D versus C++. It's all about staying in the groove. People
ignore it for two major reasons: (1) fear of moving away from what's familiar
and (2) fear of moving away from what's popular. They don't want to lose face
in front of their peers.

Very few people have the guts to shake off the grip of "false", old ways in
order to boldly proclaim the new and the true... all alone.

Cheers, Walter, for being willing to take the lone way yourself.

-JJR
Dec 12 2005
Dave <Dave_member pathlink.com> writes:
In article <dnk9mk$1ur7$1 digitaldaemon.com>, John Reimer says...
Walter Bright wrote:

 There is some curious psychology going on with this. I've met many well
 known C++ experts who won't say anything positive about D in public, but in
 private to me they are quite enthusiastic about it. It's almost as if they
 fear censure or loss of face?
 
Very interesting that you mention this... such "peer pressure" is noticeable in the scientific community as well. Every so often some bright but hapless scientist presents research that counters prior evidence on a politically loaded issue: for example, anything that even hints of influencing opinion on ethics or morality or metaphysics; or anything that might remotely influence popularly accepted industry -- think pharmacology or immunology; or anything contrary to politically popular themes -- think "green house" gas emission sources.
Yea - I can remember when the big threat was a new ice age back in the '70s.
Guess what, that was also attributed to man-made pollution. Then there was the
CFC/ozone thing in the '80s/'90s, but now most of the new evidence suggests
(after billions were spent making the refrigeration industry a lot of cash on
replacements) that it is probably mostly due to a natural cycle. Part of this
natural cycle is cooling around the South Pole, which, by the way, flies in
the face of one of the greatest 'threats' of global warming - melting polar
ice. I could go on - eggs and cholesterol, etc., etc., etc. - but you get the
picture.
No matter that his "unusual" results be right or true, he is derided 
severely by his peers who support the popular opinion, whatever that may 
be at the time (and oh how that changes).  Many of these, though few 
they be, lose jobs, face, and position in the scientific community 
merely for presenting objectively relevant material.  The scientific 
community is not always the objective giant that the general public is 
made to believe it to be.

 Ever get tired of reading the "scientists say..." quotes?

It's as if we are meant to believe there's always a wonderful consensus 
on such matters.

 It's the same with D versus C++.  It's all about staying in the groove. 
   People ignore it for two major reasons: (1) fear of moving away from 
what's familiar and (2) fear of moving away from what's popular. They 
don't want to lose face in front of their peers.

Very few people have the guts to shake off the grip of "false", old ways 
in order to boldly proclaim the new and the true... all alone.

 Cheers, Walter, for being willing to take the lone way yourself.
What's refreshing about the D crowd is their pragmatism, and it's led by a master pragmatist.
-JJR
Dec 12 2005
John Reimer <terminal.node gmail.com> writes:
Dave wrote:
 In article <dnk9mk$1ur7$1 digitaldaemon.com>, John Reimer says...
 Walter Bright wrote:

 There is some curious psychology going on with this. I've met many well
 known C++ experts who won't say anything positive about D in public, but in
 private to me they are quite enthusiastic about it. It's almost as if they
 fear censure or loss of face?
Very interesting that you mention this... such "peer pressure" is noticeable in the scientific community as well. Every so often some bright but hapless scientist presents research that counters prior evidence on a politically loaded issue: for example, anything that even hints of influencing opinion on ethics or morality or metaphysics; or anything that might remotely influence popularly accepted industry -- think pharmacology or immunology; or anything contrary to politically popular themes -- think "green house" gas emission sources.
Yea - I can remember when the big threat was a new ice-age back in the '70's. Guess what, that was also attributed to man-made pollution. Then there was the CFC/Ozone thing in the '80's/'90's, but now most of the new evidence suggests (after billions were spent making the refridgeration industry a lot of cash on replacements) that it is probably mostly due to a natural cycle. Part of this natural cycle is cooling around the South Pole, which, by-the-way, flies in the face of one of the greatest 'threats' of global warming - melting polar ice.
Exactly. But you didn't hear me agree with you just now... ;-D
 I could go on - eggs and chloresterol, etc., etc., etc., but you get the
 picture.
 
 No matter that his "unusual" results be right or true, he is derided 
 severely by his peers who support the popular opinion, whatever that may 
 be at the time (and oh how that changes).  Many of these, though few 
 they be, lose jobs, face, and position in the scientific community 
 merely for presenting objectively relevant material.  The scientific 
 community is not always the objective giant that the general public is 
 made to believe it to be.

 Ever get tired of reading the "scientists say..." quotes?

 It's as if we are meant to believe there's always a wonderful consensus 
 on such matters.

 It's the same with D versus C++.  It's all about staying in the groove. 
   People ignore it for two major reasons: (1) fear of moving away from 
 what's familiar and (2) fear of moving away from what's popular. They 
 don't want to lose face in front of their peers.

 Very few people have the guts to shake off the grip of "false", old ways 
 in order to boldly proclaim the new and the true... all alone.

 Cheers, Walter, for being willing to take the lone way yourself.
What's refreshing about the D crowd is their pragmatism, and it's led by a master pragmatist.
Very true. -JJR
Dec 12 2005
parent "Dave" <Dave_member pathlink.com> writes:
"John Reimer" <terminal.node gmail.com> wrote in message 
news:dnkgqi$270u$1 digitaldaemon.com...
 Dave wrote:

 Exactly.  But you didn't hear me agree with you just now... ;-D
I hereby rescind my comments, as I got carried away and don't want to start
one of those debates, at least here <g>
Dec 12 2005
prev sibling parent "Walter Bright" <newshound digitalmars.com> writes:
"John Reimer" <terminal.node gmail.com> wrote in message
news:dnk9mk$1ur7$1 digitaldaemon.com...
 Cheers, Walter, for being willing to take lone way yourself.
I'm old enough to not care if I'm not on the bandwagon <g>.
Dec 12 2005
Sean Kelly <sean f4.ca> writes:
Walter Bright wrote:
 "Dave" <Dave_member pathlink.com> wrote in message
 news:dni7a9$viu$1 digitaldaemon.com...
 After reading some of the threads, and seeing mention of D in there
basically
 just get ignored, it becomes apparent that one could come up with the
better
 mousetrap and still the very people who should take notice won't even
 acknowledge the existance of something that quite probably is not only
better
 but readily available.
There is some curious psychology going on with this. I've met many well known C++ experts who won't say anything positive about D in public, but in private to me they are quite enthusiastic about it. It's almost as if they fear censure or loss of face?
I think it must be something like that. I tend to agree with most of the c.l.c++.m group on C++ issues, but it's not often you'll see many of them saying anything negative about the language. Andrei tends to be a refreshing exception to this apparent trend.
 How can D break out of this situation?
The biggest thing is to write articles about D and get them published. Present papers on D at conferences. Continue to bring up D in contexts where it is appropriate. Dave - an article about performance? Kris - an article about Mango? Don - an article about D templates? Thomas - an article about your automated testing system? David - an article about implementing GDC? Everyone else, too, who has made important contributions to D and this n.g. You guys all know your stuff and these would be important, interesting articles. Me, I'll be presenting at Amazon's developer conference in January and in SDWest in March. But D can't succeed if it's just me writing articles and presenting.
CUJ seems willing to accept them, so finding a forum shouldn't be difficult.
And Artima may be a good place to get started for those a bit hesitant about
trying for a print publication--it's fairly well regarded, and Chuck Allison
has already written that he thinks D has a lot of promise.
 Would it help if I offered a bounty for articles? <g>
I'd most like to see an article on template metaprogramming in D. It's a hot
topic in the C++ world, and D is much more elegant and powerful in many ways.
Free time seems to be the biggest issue for most of the folks on this forum,
but a short article shouldn't take that long...

Sean
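P.S. The sort of thing I mean: the canonical compile-time factorial, which
takes a page of angle brackets in C++, is a handful of lines in D. A minimal
sketch of my own (assumes a reasonably recent DMD):

    template Factorial(int n)
    {
        static if (n <= 1)
            const int Factorial = 1;
        else
            const int Factorial = n * Factorial!(n - 1);
    }

    static assert(Factorial!(5) == 120);   // evaluated at compile time

    void main() {}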
Dec 12 2005
Don Clugston <dac nospam.com.au> writes:
Walter Bright wrote:
 There is some curious psychology going on with this. I've met many well
 known C++ experts who won't say anything positive about D in public, but in
 private to me they are quite enthusiastic about it. It's almost as if they
 fear censure or loss of face?
 
 One thing D does is defy the conventional wisdom about C++. There are two
 that you mention; some others are:
 3) C++'s non-safety is necessary to get its power and flexibility
 4) The solution to C++ programming language problem X is to educate the
 programmers better
 
So it seems D is stuck in the proverbial 'catch 22' situation here, where
the
real merits of the language only become apparent after learning how to use
it,
yet those who need to try it won't because of their (understandable)
assumptions...

How can D break out of this situation?
The biggest thing is to write articles about D and get them published. Present papers on D at conferences. Continue to bring up D in contexts where it is appropriate. Dave - an article about performance? Kris - an article about Mango? Don - an article about D templates? Thomas - an article about your automated testing system? David - an article about implementing GDC? Everyone else, too, who has made important contributions to D and this n.g. You guys all know your stuff and these would be important, interesting articles.
I intend to write a D template article, but I don't have much time, so it will
take a while.

I'll also update my delegate/member function pointer article on CodeProject
and put a big reference to D at the end of it. Since it seems to have been the
most popular programming article of 2004 (the web page is now approaching
200,000 hits), it might direct a few people here.

I don't think D needs any marketing. It just needs more visibility. No one
will use it if they've never heard of it. Personally, I only discovered D when
trying to create a comprehensive list of C++ compilers, and it was only a
remark on a newsgroup by Matthew Wilson that convinced me it was worth a look.
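P.S. For anyone wondering why a member function pointer article would end with
a plug for D: the machinery that article spends pages emulating in C++ is a
built-in, one-line feature here. A minimal sketch of my own (assumes a recent
DMD):

    class Counter
    {
        int n;
        void bump() { n++; }
    }

    void main()
    {
        Counter c = new Counter;

        // Taking the address of a method on an object yields a delegate:
        // the object reference and the function pointer, bound together.
        void delegate() dg = &c.bump;

        dg();
        assert(c.n == 1);
    }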
 Me, I'll be presenting at Amazon's developer conference in January and in
 SDWest in March. But D can't succeed if it's just me writing articles and
 presenting.
 
 Would it help if I offered a bounty for articles? <g>
 
 
Dec 13 2005
next sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Don Clugston" <dac nospam.com.au> wrote in message
news:dnm162$gni$1 digitaldaemon.com...
 I'll also update my delegate/member function pointer article on
 CodeProject and put a big reference to D at the end of it. Since it
 seems to have been the most popular programming article of 2004 (the web
 page is now approaching 200 000 hits), it might direct a few people here.
Thanks!
 I don't think D needs any marketing. It just needs more visibility. No
 one will use it if they've never heard of it.
It's my experience that most programmers won't look at D the first time they run across it, or the second time, or the tenth time. But seeing reference to it constantly eventually convinces people that it is real, and worth looking at.
Dec 13 2005
Tom <Tom_member pathlink.com> writes:
In article <dnm6pb$lps$1 digitaldaemon.com>, Walter Bright says...
"Don Clugston" <dac nospam.com.au> wrote in message
news:dnm162$gni$1 digitaldaemon.com...
 I'll also update my delegate/member function pointer article on
 CodeProject and put a big reference to D at the end of it. Since it
 seems to have been the most popular programming article of 2004 (the web
 page is now approaching 200 000 hits), it might direct a few people here.
Thanks!
 I don't think D needs any marketing. It just needs more visibility. No
 one will use it if they've never heard of it.
It's my experience that most programmers won't look at D the first time they run across it, or the second time, or the tenth time. But seeing reference to it constantly eventually convinces people that it is real, and worth looking at.
I got here when one day I wondered: isn't there any language compiled to
native assembler other than C/C++ (not counting some others that I don't like
so much, like Pascal)? Is it possible that we could have been stuck in time
with the same old languages forever? Isn't anyone correcting the horrors of
C++ or enhancing it? So I started to google for other languages (imperative
ones, of course), I found D, and it attracted me from the first moment.

When I first read the spec and presentation, I couldn't believe that *at last*
someone had done what I'd dreamed of for a long time (not so long, since I am
24). D is (more so every day) what I've wished for in a true compiled
language. Just like my precious C++, but much, much nicer. I think I'll cry :')

Then, when I saw a compiler that works as well as DMD does, I thought it
couldn't be real, and that it couldn't be possible that D wasn't famous yet! I
still can't believe the opinions of the guys who will die with C++ because
they think it can solve anything (consider the work it takes to make STL
streams and strings work with Unicode. That really stinks!).

That's why I promote D so much in the academic and work circles I live in (CS
at UBA - Buenos Aires University - and the Economics Department of Argentina,
Informatics Project). Among my colleagues and workmates (who are mostly the
same people), everyone knows that if they hear about D they'll remember me and
my insane obsession :D

I'm just waiting for D to become mature enough to be used in a serious project
(and for a real, mature IDE that could make D coding a complete pleasure - I
know this is rather a secondary issue). Also, I can't deny that each day I
learn so much from the people of the NG, and I want to help hurry this whole
maturing process along.

Thanks to everybody (especially to the one with the courage to make my dream
possible, Walter)... I'm going to cry :')... :P

Tom

PS: I'll keep promoting D among my people till the end!
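PPS: The kind of thing I mean about Unicode: D's strings are UTF-encoded by
definition, and foreach will decode code points for you - no facets, no locale
gymnastics, no separate wide-string class. A minimal sketch of my own (assumes
a recent DMD and Phobos):

    import std.stdio;

    void main()
    {
        char[] s = "mañana";   // char[] is UTF-8 by definition

        // foreach decodes the UTF-8 sequence into code points on the fly.
        foreach (dchar c; s)
            writefln("U+%04X", cast(uint) c);
    }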
Dec 13 2005
Sean Kelly <sean f4.ca> writes:
Tom wrote:
 
 That's why I promote D so much in the academic and work fields in which I live
 (CS in UBA - Buenos Aires University and Economics Department of Argentina,
 Informatics Project). Among my colleagues and work bodies (which are the most
 the same people), everyone knows that if they hear about D they'll remember me
 and my insane obsession :D 
Personally, I think D would be an ideal teaching language. It has a fairly low
barrier to entry, with garbage collection as the default allocation method,
and it allows the programmer to "drill down" to whatever level of behavior he
desires. By contrast, Java only allows relatively high-level interaction,
while C++ does a fairly bad job of hiding nuanced implementation issues that a
student may not be ready for. If I ever end up teaching procedural
programming, there's little question about which language I'll employ.

Sean
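P.S. A concrete illustration of that "drill down" point - garbage collection
by default, raw pointers and manual memory when the student is ready for them.
A minimal sketch of my own (assumes a recent DMD and Phobos):

    import std.c.stdlib;   // malloc/free, straight from the C runtime

    void main()
    {
        // Week one: garbage-collected arrays, nothing to free.
        int[] nums;
        nums ~= 42;

        // Later in the course: manual memory, in the same language.
        int* p = cast(int*) malloc(int.sizeof);
        *p = 42;
        free(p);
    }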
Dec 13 2005
prev sibling parent "Walter Bright" <newshound digitalmars.com> writes:
"Tom" <Tom_member pathlink.com> wrote in message
news:dnn9h0$1sa4$1 digitaldaemon.com...
 PS: I'll keep promoting D among my people till the end!
Thanks!
Dec 13 2005
prev sibling parent "Ameer Armaly" <ameer_armaly hotmail.com> writes:
"Walter Bright" <newshound digitalmars.com> wrote in message 
news:dnm6pb$lps$1 digitaldaemon.com...
 "Don Clugston" <dac nospam.com.au> wrote in message
 news:dnm162$gni$1 digitaldaemon.com...
 I'll also update my delegate/member function pointer article on
 CodeProject and put a big reference to D at the end of it. Since it
 seems to have been the most popular programming article of 2004 (the web
 page is now approaching 200 000 hits), it might direct a few people here.
Thanks!
 I don't think D needs any marketing. It just needs more visibility. No
 one will use it if they've never heard of it.
It's my experience that most programmers won't look at D the first time they run across it, or the second time, or the tenth time. But seeing reference to it constantly eventually convinces people that it is real, and worth looking at.
I found it worthwhile when I discovered char[]... it just put me on cloud 9 as
far as strings go. Then, when I discovered easy-to-use classes, a really good
(compared to everything else) library, and easy sockets? I fell in love.
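P.S. What did it for me with char[] was slicing and built-in concatenation. A
minimal sketch of my own (assumes a recent DMD and Phobos):

    import std.stdio;

    void main()
    {
        char[] s = "hello, world".dup;

        char[] hello = s[0 .. 5];   // slicing: a view, no copy, no strlen
        s ~= "!";                   // concatenation is built in

        writefln("%s / %s", hello, s);   // prints: hello / hello, world!
    }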
 
Dec 13 2005
BCS <BCS_member pathlink.com> writes:
I'm not taking a side on this question, but: "Do we, or do we not, WANT
lots of users of D before it goes 1.0?"

This is a request for comment.

pros:
--The sooner we get more users, the sooner it can go mainstream.
--More users = more testers / critics / experimenters / etc.
...

cons:
--The more D code that is written, the harder it becomes to change the spec.
--If something major has to be changed and, as a result, axes a major
project, this would adversely affect the "image" of the language.
...
Dec 13 2005
Anders F Björklund <afb algonet.se> writes:
BCS wrote:


 I'm not taking a side on this question but "Do we, or do we not WANT 
 lots of users of D before it goes 1.0?"
When is that? The D spec was conceived in Dec 1999. That's 6 years ago. The
first DMD alpha was released in Dec 2001, 4 years ago.

So if I do find someone who would be willing to try D, should I tell them to
wait "a little longer" while the language specification and reference compiler
are being worked on? Or should I ask them to help out meanwhile? I don't know
about you, but I prefer the collaborative approach, and try to help out with
the open parts of it...
 pros:
 --The sooner we get more users the sooner it can go mainstream.
For it to go "mainstream", it just needs marketing. Maybe support?
 --More users = more testers / critics / experiment / etc.
This is both good and bad of course, especially the critics... ;-)
 cons:
 --The more D code that is written, the harder it becomes to change the 
 spec.
IMHO, this has already happened... as D seems to be pretty fixed?
 --If something major has to be changed and as a result, axes a major 
 project, this would adversely effect the "image" of the language.
Maybe one should just draw a line in the sand and call it "1.0", and fix the
shortcomings in "the first service pack" thereafter. Seems to be working for
other companies and languages? (e.g. Java)

And if it went final sooner, it would save me from doing C++...

--anders
Dec 13 2005
next sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Anders F Björklund" <afb algonet.se> wrote in message
news:dnn82k$1qtu$1 digitaldaemon.com...
 Maybe one should just draw a line in the sand and call it "1.0",
 and fix the shortcomings in "the first service pack" thereafter.
That's my feeling, as well.
Dec 13 2005
clayasaurus <clayasaurus gmail.com> writes:
Walter Bright wrote:
 "Anders F Björklund" <afb algonet.se> wrote in message
 news:dnn82k$1qtu$1 digitaldaemon.com...
 
Maybe one should just draw a line in the sand and call it "1.0",
and fix the shortcomings in "the first service pack" thereafter.
That's my feeling, as well.
Christmas/New Year's gift? : )

I wouldn't mind D going 1.0 either, but I understand that it will offend a lot
of people; just look what happened with the 'D user poll' thread I made, where
for the most part everyone said no to 1.0. Naturally, most D users are nothing
short of perfectionists. You should at least get the opinions of the newsgroup
first, lest you alienate any contributors.

And please keep separate 1.0 and 2.0 compiler branches, only adding new
features to the experimental D 2.0, and bug fixes for 1.0. That way we have a
very stable compiler and development platform for big projects, as new
features tend to introduce bugs and break things.

My $.02
Dec 13 2005
BCS <BCS_member pathlink.com> writes:
Anders F Björklund wrote:
 BCS wrote:
 
 
 I'm not taking a side on this question but "Do we, or do we not WANT 
 lots of users of D before it goes 1.0?"
When is that ? The D spec was conceived in Dec 1999. That's 6 years ago. The first DMD alpha was released in Dec 2001, 4 years ago. So if I do find someone that would be willing to try D, should tell them to wait "a little longer" while the language specification and reference compiler is being worked on ? Or should I ask them to help out meanwhile ? I don't know about you, but I prefer the collaborative approach and try to help out with the open parts of it...
[...]
 
 --anders
No, we shouldn't turn anyone away, but should we ACTIVELY seek out users quite yet?
Dec 13 2005
Anders F Björklund <afb algonet.se> writes:
BCS wrote:

 No, we shouldn't turn any one away, but should we
 ACTIVELY seek out users quite yet?
I would love it for the D language specification to be released in a manner
suitable for "standardization"... But beyond that and a book or two, plus some
pet peeves, I don't have much hesitation recommending it to people. Not that
they move from C++ or Objective-C, the ingrates, but anyway. ;-)

--anders

PS: Now, if only I had the 80,000 lines of C headers that make up the "Carbon"
framework translated, I could write some Mac apps...
See http://developer.apple.com/carbon/
and http://dsource.org/projects/carbonheaders/
Dec 13 2005
Sean Kelly <sean f4.ca> writes:
Anders F Björklund wrote:
 BCS wrote:
 
 No, we shouldn't turn any one away, but should we
 ACTIVELY seek out users quite yet?
I would love it for the D language specification to be released in a manner suitable for "standardization"... But beyond that and a book or two, plus some pet peeves, I don't have much hesitation recommending it to people ?
Same here. Actually, I already recommend D to people--either they'll like the design philosophy or they won't. Sean
Dec 13 2005
prev sibling next sibling parent reply "Dave" <Dave_member pathlink.com> writes:
"Anders F Björklund" <afb algonet.se> wrote in message 
news:dnn82k$1qtu$1 digitaldaemon.com...
 BCS wrote:


 I'm not taking a side on this question but "Do we, or do we not WANT lots 
 of users of D before it goes 1.0?"
When is that ? The D spec was conceived in Dec 1999. That's 6 years ago. The first DMD alpha was released in Dec 2001, 4 years ago.
I know and understand what you're getting at, but the timeframe is about par
with how much time it took for other languages to start taking hold too,
perhaps even less compared to C++ and Java. Of course, now we are dealing with
'internet time' when it comes to anything new in IT (and we wonder why there
is so much crap out there <g>).

What D has done in four years is remarkable when you consider how much was
done by one person, mostly on free time. C++ had a lot of help from what was
then probably the premier CS research organization in the world, and Java had
a large and rapidly growing tech giant behind it. (I'm not forgetting the
major contributions of others here, but no one can argue about who's done the
great majority of the work.)
 So if I do find someone that would be willing to try D,
 should tell them to wait "a little longer" while the
 language specification and reference compiler is being
 worked on ? Or should I ask them to help out meanwhile ?
I'm telling people about D all of the time, but I do add the caveat that it's
'young' yet - *but* that's usually right before I also add that the language,
tools and library are pretty stable, and are great for writing utilities where
script just doesn't cut it. The primary problem is that I work for clients who
either A) drink the Microsoft Kool-Aid or B) use proprietary Unix systems and
tools.
 --If something major has to be changed and as a result, axes a major 
 project, this would adversely effect the "image" of the language.
Maybe one should just draw a line in the sand and call it "1.0", and fix the shortcomings in "the first service pack" thereafter.
I've got to ask myself "Is the language ready for 1.0, and would this really help the language grow right now?". I'm not sure, but I'm leaning towards "yes" for the first part and "no" for the 2nd at this point. And there's no turning back from the big v1.0 release, being that "1.0" seems to carry so much weight with people regarding what decisions are made for a programming language from that point on. That said, I'm often wrong in such matters, otherwise I wouldn't be here, but in Bermuda or somewhere warm sipping a beer on the beach. I just had to add my $0.02 worth for consideration <g>
Dec 13 2005
next sibling parent reply "Kris" <fu bar.com> writes:
Since we're now discussing a 1.0 release, I'd like to toss something into 
the pot:

1) consider calling it 2.0 ~ we all know the ramifications, and D has been 
around long enough to deserve that status.

2) whenever the first official, non-beta release is made, it should have 
addressed all or most of the little niggly things that would be hard to 
clean up later (once people start to use them). This is quite different from 
the various features people wish to see included ~ instead, it's a little 
bit of insurance that the /following/ release will not upset users in some 
manner. It helps to smooth the release cycle and generates confidence.

It's up to Walter, of course, but it might be useful to collect a list of 
these little niggly things? In case something is forgotten?

- Kris



"Dave" <Dave_member pathlink.com> wrote in message 
news:dnndcn$20rh$1 digitaldaemon.com...
 "Anders F Björklund" <afb algonet.se> wrote in message 
 news:dnn82k$1qtu$1 digitaldaemon.com...
 BCS wrote:


 I'm not taking a side on this question but "Do we, or do we not WANT 
 lots of users of D before it goes 1.0?"
When is that ? The D spec was conceived in Dec 1999. That's 6 years ago. The first DMD alpha was released in Dec 2001, 4 years ago.
I know and understand what you're getting at, but the timeframe is about par with how much time it took for other languages to start taking hold too, perhaps even less compared to C++ and Java. Of course, now we are dealing with 'internet time' when it comes to anything new in IT (and we wonder why there is so much crap out there <g>). What D has done in four years is remarkable when you consider how much was done by one person mostly on free time. C++ had a lot of help from what was then probably the premier CS research organization in the world, and Java had a large and rapidly growing tech. giant behind it. (I'm not forgetting the major contributions of others here, but no one can argue who's done the great majority of the work)
 So if I do find someone that would be willing to try D,
 should tell them to wait "a little longer" while the
 language specification and reference compiler is being
 worked on ? Or should I ask them to help out meanwhile ?
I'm telling people all of the time about D, but I do add the caveat that it's 'young' yet, *but* that's usually right before I also add that the language, tools and library are pretty stable, and are great for writing utlities where script just doesn't cut it. The primary problem is that A) I work for clients who drink the Microsoft koolaid or B) work for clients using proprietary Unix systems and tools.
 --If something major has to be changed and as a result, axes a major 
 project, this would adversely effect the "image" of the language.
Maybe one should just draw a line in the sand and call it "1.0", and fix the shortcomings in "the first service pack" thereafter.
I've got to ask myself "Is the language ready for 1.0, and would this really help the language grow right now?". I'm not sure, but I'm leaning towards "yes" for the first part and "no" for the 2nd at this point. And there's no turning back from the big v1.0 release, being that "1.0" seems to carry so much weight with people regarding what decisions are made for a progamming language from that point on. That said, I'm often wrong in such matters otherwise I wouldn't be here, but in Bermuda or somewhere warm sipping a beer on the beach. I just had to add my $0.02 worth for consideration <g>
Dec 13 2005
next sibling parent reply clayasaurus <clayasaurus gmail.com> writes:
Kris wrote:
 Since we're now discussing a 1.0 release, I'd like to toss something into 
 the pot:
 
 1) consider calling it 2.0 ~ we all know the ramifications, and D has been 
 around long enough to deserve that status.
Some people disagree about calling it 1.0, and you want to call it 2.0? People are going to wonder where D 1.0 disappeared to, and then promptly laugh at our hubris.
 
 2) whenever the first official, non-beta release is made, it should have 
 addressed all or most of the little niggly things that would be hard to 
 clean up later (once people start to use them). This is quite different from 
 the various features people wish to see included ~ instead, it's a little 
 bit of insurance that the /following/ release will not upset users in some 
 manner. It helps to smooth the release cycle and generates confidence.
 
 It's up to Walter, of course, but it might be useful to collect a list of 
 these little niggly things? In case something is forgotten?
Like http://www.wikiservice.at/wiki4d/wiki.cgi?PendingPeeves ?
 - Kris
 
Dec 13 2005
next sibling parent "Kris" <fu bar.com> writes:
"clayasaurus" <clayasaurus gmail.com> wrote
 Kris wrote:
 Since we're now discussing a 1.0 release, I'd like to toss something into 
 the pot:

 1) consider calling it 2.0 ~ we all know the ramifications, and D has 
 been around long enough to deserve that status.
Some people disagree about calling it 1.0, and you want to call it 2.0? People are going to wonder where D 1.0 disappeared to, and then promptly laugh at our hubris.
Well ~ the comment was made somewhat tongue-in-cheek :-) Yet, it feels like I've been using an "official" release for ages <g>
Dec 13 2005
prev sibling parent reply =?ISO-8859-1?Q?Anders_F_Bj=F6rklund?= <afb algonet.se> writes:
clayasaurus wrote:

 Since we're now discussing a 1.0 release, I'd like to toss something 
 into the pot:

 1) consider calling it 2.0 ~ we all know the ramifications, and D has 
 been around long enough to deserve that status.
Some people disagree about calling it 1.0, and you want to call it 2.0? People are going to wonder where D 1.0 disappeared to, and then promptly laugh at our hubris.
I think it should follow the lead of another celestial-body-named company, and rename the next release "DMD 142". That's sure to leave Java in the mud, as they've only gotten to 5 in all this time... :-P Seriously, it's not so much about numbers as about alpha/beta/release ? --anders
Dec 14 2005
parent reply clayasaurus <clayasaurus gmail.com> writes:
Kris wrote:
 Well ~ the comment was made somewhat tongue-in-cheek  :-)

 Yet, it feels like I've been using an "official" release for ages <g>
Hehe, ok, I understand :) I'm just afraid of D being thought of as the AOL version number jumper of the programming languages.

Anders F Björklund wrote:
 clayasaurus wrote:
 I think it should follow the lead of another celestial-body-named 
 company, and rename the next release "DMD 142". That's sure to leave
 Java in the mud, as they've only gotten to 5 in all this time... :-P
 
 Seriously, it's not so much about numbers as about alpha/beta/release ?
 
 --anders
The problem is, maybe it is just me, that version 2.0 denotes it as the /second/ release. If we aren't going to care about version numbers, here are a few more suggestions...

D 360, because we all know 360 is equated with fun!
D 2014, who cares if it was released in 2005-6, it is just that much ahead of its time
D 9.0 Optimized Edition, with full programmer protection
D 3.0, followed by 3.1, followed by 3.14, etc.
D XP, Ultimate Edition, Starter Edition, Enterprise Edition, collect them all
D Cruise Control With A Vengeance

Pick your poison :-P
Dec 14 2005
parent =?ISO-8859-1?Q?Anders_F_Bj=F6rklund?= <afb algonet.se> writes:
clayasaurus wrote:

 If we aren't going to care about version numbers, here are a few more 
 suggestions...
We do care about version numbers. We just don't think that 5.0 is necessarily better than 1.5.0... :-) --anders
Dec 14 2005
prev sibling parent Georg Wrede <georg.wrede nospam.org> writes:
Hmm. D 2.0 ?

Actually, it's been done before. Anybody ever heard of Solaris-1? Or 
dBase-I?

The guys at Sun wanted to rename SunOS to something cooler, but knew 
that whatever-1.0 won't cut it.

George Tate, the owner of Ashton-Tate, which released dBase-II, was a keen 
marketer too. First, he decided off-hand to skip 1.0. Then he also 
renamed the company to a two-person name, so as to hide the fact that it 
really wasn't a big company. Like something that's already done a major 
corporate merger. Also, he thought for a good while about the other 
name. He wanted it to sound Aristocratic, possibly British, or Old Money 
like.

By the time IBM got their PC out and Bill became a juggernaut, dBase-II 
was running on a bigger percentage of existing computers than any other 
database before or since.

By the time I got to hear this, I was using dBase-IV.

None of us could possibly become adept at marketing. We're just too 
honest.


Jan 23 2006
prev sibling parent reply =?ISO-8859-1?Q?Anders_F_Bj=F6rklund?= <afb algonet.se> writes:
Dave wrote:

 What D has done in four years is remarkable when you consider how much was 
 done by one person mostly on free time. C++ had a lot of help from what was 
 then probably the premier CS research organization in the world, and Java 
 had a large and rapidly growing tech. giant behind it.
I think it could have gotten even further in the "bazaar", but also that the D design and implementation is more coherent as it is now... But if Walter hadn't opened up the implementation (i.e. GPL), then I for one wouldn't be here. Mostly since there would not *be* a gdcmac ?
 And there's no turning back from the big v1.0 release, being that 
 "1.0" seems to carry so much weight with people regarding what
 decisions are made for a  progamming language from that point on.
Seems like most software these days go through the stages "beta", "1.0" (a.k.a. Public Beta) and then end up being somewhat final in 1.1 or so. Then again, some - like Google - make it a sport to just stay in Beta...
 That said, I'm often wrong in such matters otherwise I wouldn't be here, but 
 in Bermuda or somewhere warm sipping a beer on the beach. 
Well, this is the internet... Who says you couldn't be, and still read digitalmars.D.announce ? :-) --anders
Dec 14 2005
parent reply "Dave" <Dave_member pathlink.com> writes:
"Anders F Björklund" <afb algonet.se> wrote in message 
news:dnojo2$caq$1 digitaldaemon.com...
 Dave wrote:

 What D has done in four years is remarkable when you consider how much 
 was done by one person mostly on free time. C++ had a lot of help from 
 what was then probably the premier CS research organization in the world, 
 and Java had a large and rapidly growing tech. giant behind it.
I think it could have gotten even further in the "bazaar", but also that the D design and implementation is more coherent as it is now... But if Walter hadn't opened up the implementation (i.e. GPL), then I for one wouldn't be here. Mostly since there would not *be* a gdcmac ?
Oh yea - those are things that are so cool about the 'D model' - coherence *and* sharing. Many good things about the language have come about from GDC, the ability to run it on a Mac, new library submissions, implementation ideas, etc. One of the things that truly frightened me a while back was the widespread talk of "forking" language development, but I'm certainly not opposed to the free flow of ideas that the 'bazaar' brings - it has been huge in D's favor. Ben Hinkle, for example, did the right thing IMHO: he took his ideas and started a new language instead of forking D (I'm not implying that Ben even considered forking D, but no matter what his intentions, he did it the right way. My hat is off to him and anyone else willing to make a commitment like that, that's for sure.).
 And there's no turning back from the big v1.0 release, being that "1.0" 
 seems to carry so much weight with people regarding what
 decisions are made for a  progamming language from that point on.
Seems like most software these days go through the stages "beta", "1.0" (a.k.a. Public Beta) and then end up being somewhat final in 1.1 or so. Then again, some - like Google - make it a sport to just stay in Beta...
 That said, I'm often wrong in such matters otherwise I wouldn't be here, 
 but in Bermuda or somewhere warm sipping a beer on the beach.
Well, this is the internet... Who says you couldn't be, and still read digitalmars.D.announce ? :-) --anders
Dec 14 2005
parent reply John Reimer <terminal.node gmail.com> writes:
Dave wrote:

 Ben Hinkle for example, did the right thing IMHO, he took his ideas and 
 started a new language instead of forking D (I'm not implying that Ben even 
 considered forking D, but no matter what his intentions, he did it the right 
 way. My hat is off to him and anyone else willing to make a commitment like 
 that, that's for sure.).
 
He did? Where, when, how, what? I didn't know he made a new language? Where did you read about that? I was certainly around when people were talking about forking, but I had no idea Ben went and made a new language? -JJR
Dec 14 2005
next sibling parent "Dave" <Dave_member pathlink.com> writes:
"John Reimer" <terminal.node gmail.com> wrote in message 
news:dnpmqj$1ggl$1 digitaldaemon.com...
 Dave wrote:

 Ben Hinkle for example, did the right thing IMHO, he took his ideas and 
 started a new language instead of forking D (I'm not implying that Ben 
 even considered forking D, but no matter what his intentions, he did it 
 the right way. My hat is off to him and anyone else willing to make a 
 commitment like that, that's for sure.).
He did? Where, when, how, what? I didn't know he made a new language? Where did you read about that?
More info. is available on his web site. AFAICT, it isn't a direct 'competitor' to D or C++; more of the "C extended" idea, but it does seem to borrow some ideas from D.
 I was certainly around when people were talking about forking, but I had 
 no idea Ben went and made a new language?

 -JJR 
Dec 14 2005
prev sibling next sibling parent J C Calvarese <technocrat7 gmail.com> writes:
In article <dnpmqj$1ggl$1 digitaldaemon.com>, John Reimer says...
Dave wrote:

 Ben Hinkle for example, did the right thing IMHO, he took his ideas and 
 started a new language instead of forking D (I'm not implying that Ben even 
 considered forking D, but no matter what his intentions, he did it the right 
 way. My hat is off to him and anyone else willing to make a commitment like 
 that, that's for sure.).
 
He did? Where, when, how, what? I didn't know he made a new language? Where did you read about that?
It's called "Cx". I don't know that it's based on the DMD front end at all. See http://home.comcast.net/~benhinkle/cx.html jcc7
Dec 14 2005
prev sibling parent reply "Ben Hinkle" <bhinkle mathworks.com> writes:
"John Reimer" <terminal.node gmail.com> wrote in message 
news:dnpmqj$1ggl$1 digitaldaemon.com...
 Dave wrote:

 Ben Hinkle for example, did the right thing IMHO, he took his ideas and 
 started a new language instead of forking D (I'm not implying that Ben 
 even considered forking D, but no matter what his intentions, he did it 
 the right way. My hat is off to him and anyone else willing to make a 
 commitment like that, that's for sure.).
He did? Where, when, how, what? I didn't know he made a new language? Where did you read about that? I was certainly around when people were talking about forking, but I had no idea Ben went and made a new language? -JJR
I'm still goofing around with it so it'll take a while to get anything worth a damn. It's hacking the tinycc compiler to do D-like things with some twists. I'm having fun which currently is my only goal. :-) I hadn't considered forking D - that somehow feels rude to me since D is so new. I'm still very much paying attention to D stuff though. I'm not posting as much but I try to keep up with things.
Dec 14 2005
parent reply John Reimer <terminal.node gmail.com> writes:
Ben Hinkle wrote:

 
 I'm still goofing around with it so it'll take a while to get anything worth 
 a damn. It's hacking the tinycc compiler to do D-like things with some 
 twists. I'm having fun which currently is my only goal. :-) I hadn't 
 considered forking D - that somehow feels rude to me since D is so new.
 I'm still very much paying attention to D stuff though. I'm not posting as 
 much but I try to keep up with things.
 
 
 
Ah, thanks for explaining, Ben! The context just sounded a little too sinister for you! :D

I did actually manage to find Cx on your website after it was mentioned here.

-JJR
Dec 14 2005
parent reply Dave <Dave_member pathlink.com> writes:
In article <dnq4h0$1sm5$1 digitaldaemon.com>, John Reimer says...
Ben Hinkle wrote:

Ah, thanks for explaining, Ben!  The context just sounded a little too 
sinister for you! :D
I thought I emphasized that I wasn't implying anything nasty about Ben's motivation <g> If it came across differently, I apologize Ben.
I did actually manage to find Cx on your website after it was mentioned 
here.

-JJR
Dec 14 2005
parent John Reimer <terminal.node gmail.com> writes:
Dave wrote:
 In article <dnq4h0$1sm5$1 digitaldaemon.com>, John Reimer says...
 Ben Hinkle wrote:

 Ah, thanks for explaining, Ben!  The context just sounded a little too 
 sinister for you! :D
I thought I emphasized that I wasn't implying anything nasty about Ben's motivation <g> If it came across differently, I apologize Ben.
No, you were fine, Dave. I was just being facetious. Still, I was surprised I had missed (or possibly forgotten about) his little experiment. -JJR
Dec 14 2005
prev sibling parent reply Niko Korhonen <niktheblak hotmail.com> writes:
Anders F Björklund wrote:
 IMHO, this has already happened... As D seems to be pretty fixed ?
My view is that D's had some positively *huge* breaking changes in the last couple of months, including but not limited to:

* Type inference (the most major change; IMO should have been in D 2.0)
* $ symbol in arrays
* New features in Phobos
* Refactoring and other changes in Phobos
* String postfix literals
* === and !== replaced with is and !is
* .size replaced with .sizeof

All of these changes are non-backwards compatible, and an even more major change is coming (stack allocation). To me this hardly seems 'fixed'. If every language changed at this pace, nobody would dare to use them.

The pace of breaking changes in D is *fast*, even when comparing against other alpha-stage languages. This is visible in e.g. GDC's constant struggle to keep up with DMD.

Given all this I don't blame anyone for not daring to use D in any major project. Projects whose timespan may be several years need such a level of fixture that D cannot offer in the foreseeable future.

--
Niko Korhonen
SW Developer
Dec 14 2005
next sibling parent reply =?ISO-8859-1?Q?Anders_F_Bj=F6rklund?= <afb algonet.se> writes:
Niko Korhonen wrote:

 IMHO, this has already happened... As D seems to be pretty fixed ?
My view is that D's had some positively *huge* breaking changes in the last couple of months, including but not limited to:

* Type inference (the most major change; IMO should have been in D 2.0)
* $ symbol in arrays
* New features in Phobos
* Refactoring and other changes in Phobos
* String postfix literals
* === and !== replaced with is and !is
* .size replaced with .sizeof

All of these changes are non-backwards compatible, and an even more major change is coming (stack allocation). To me this hardly seems 'fixed'.
AAs have also been through some dramatic changes, with the "in" etc ? And there's still a few "major annoyances" left in the D language...

Actually I meant that it doesn't seem like any of the changes I suggested earlier are going to make it into D, as it has been pretty "decided"... But you are right of course, there have been a lot of such new changes. And no way to make the new code backwards-compatible (e.g. #if/#endif)

I just check in every 6 months or so, to see if it has been released :-) Hasn't been in the last year, but maybe in another 6 months it could ?

 If every language changed at this pace, nobody would dare to use them.
Java has done a few such major changes, but it has been over time... And each time, you could still use the previous version while migrating.

I think the idea and spirit of what I was trying to say was that what we have now in D should be good enough to "freeze features" on and start fixing the bugs and get it released once - *then* move on to the 2.0 ? At least get the specification finalized, while fixing compiler bugs.

But maybe that has already happened, only that it was called e.g. 0.110 (haven't really kept track of when the major changes were implemented, anyone feel like writing a summary history of D from the DMD changelog?)

I started last year with DMD 0.102, so I wasn't here in "the two-digits"
 The pace of breaking changes in D is *fast*, even when comparing against 
 other alpha-stage languages. This is visible in e.g. GDC's constant 
 struggle keep up with DMD.
True. And GDC has also had some *major* new features added, like new garbage collection, inline assembler, GCC 4.0, new platforms, etc... Another incredible one-man accomplishment! (GDC, by David Friedman)
 Given all this I don't blame anyone for not daring to use D in any major 
 project. Projects whose timespan may be several years need such a level 
 of fixture that D cannot offer in the foreseeable future.
That, and linking to some libraries, has forced me to use C++ as well. :-( I will still try to move my old C/C++ library over to D, but it doesn't use too many advanced D features so it hasn't been bitten by the above. --anders
Dec 14 2005
parent reply Niko Korhonen <niktheblak hotmail.com> writes:
Anders F Björklund wrote:
 AAs have also been through some dramatic changes, with the "in" etc ?
 And there's still a few "major annoyances" left in the D language...
Yes, that is very much a major change, thanks for pointing that one out!
 I think the idea and spirit of what I was trying to say that what we
 have now in D should be good enough to "freeze features" on and start
 fixing the bugs and get it released once - *then* move on to the 2.0 ?
 At least get the specification finalized, while fixing compiler bugs.
I'm still not sure whether the current standard library, GC scheme, AA's and type inference are ready for prime time. Plus there are a couple of uncertain features, such as bit arrays, AA's, array initialization etc.

IMHO D should/must have a "pinned pointer" syntax in order to allow compacting GC's, or explicitly state in the spec that "a compacting GC is not, will not and cannot be used in D, never ever ever". Otherwise this will be a *major* limitation in compiler/runtime implementation and it's going to cause *gargantuan* changes in existing code if it's implemented later on.

I also would like to see bit arrays and AA's in the standard library instead of the language; IMO D isn't high-enough level language for that. Not to mention that I downright abhor the "whisper syntax". In the last couple of months I've seen D gradually mutating towards C++ with new features that increase complexity. Please don't make it any more complicated than it already is.

--
Niko Korhonen
SW Developer
Dec 14 2005
parent reply Walter Bright <Walter_member pathlink.com> writes:
In article <dnoru3$mgr$1 digitaldaemon.com>, Niko Korhonen says...
I also would like to see bit arrays and AA's in the standard library 
instead of the language; IMO D isn't high-enough level language for 
that.
While I agree that bit arrays in the core language, in retrospect, were probably a mistake, they are in the language, are used, and so need to stay. I'll disagree with the AA's. I use them and like them a lot. It's a well used data structure, and having them in the core makes them just sweet.
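(A small illustrative sketch, with names and values invented here, of the kind of zero-setup use that makes the built-in AAs convenient:)

import std.stdio;

void main()
{
    // A built-in table needs no setup beyond its type:
    int[char[]] age;
    age["walter"] = 47;     // invented values, purely illustrative
    age["sean"] = 30;

    // iterate keys and values directly
    foreach (char[] name, int a; age)
        writefln("%s is %d", name, a);
}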
Not to mention that I downright abhor the "whisper syntax". In the 
last couple of months I've seen D gradually mutating towards C++ with 
new features that increase complexity. Please don't make it any more 
complicated than it already is.
The whisper syntax was necessary at the time because D didn't support typesafe variadic argument lists. It does now (which is why writefln() works).
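(A minimal sketch of such a typesafe variadic, assuming the std.stdarg helpers that shipped alongside the feature; illustrative only, not Phobos source:)

import std.stdio;
import std.stdarg;

// _arguments carries a TypeInfo per argument, so the callee can check
// types at runtime instead of trusting a format string.
int sum(...)
{
    int total = 0;
    for (int i = 0; i < _arguments.length; i++)
    {
        assert(_arguments[i] == typeid(int));  // reject anything but int
        total += va_arg!(int)(_argptr);
    }
    return total;
}

void main()
{
    writefln("%d", sum(1, 2, 3));  // prints 6
}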
Dec 15 2005
next sibling parent reply Sean Kelly <sean f4.ca> writes:
Walter Bright wrote:
 In article <dnoru3$mgr$1 digitaldaemon.com>, Niko Korhonen says...
 I also would like to see bit arrays and AA's in the standard library 
 instead of the language; IMO D isn't high-enough level language for 
 that.
 While I agree that bit arrays in the core language, in retrospect, were probably a mistake, they are in the language, are used, and so need to stay.
But how often are they used in a way that would be incompatible with arrays of 1-byte bits? I'll admit that I do like being able to use packed bit arrays from time to time (their use with slicing can be pretty cool), but just as often I don't want them. Most recently, it was when I wanted to do this:

bit[] set;
foreach( inout bit; set ) { ... }

Obviously impossible with a packed array so I was forced to choose between double lookups (not a big deal with arrays, but annoying) and using another data type... I chose to use an array of ubytes, instead, and suffer the slight lack of value restrictions. This almost has me wishing there were a bool type in addition to the bit type, simply so both options were available.

Sean
Dec 15 2005
parent reply "Kris" <fu bar.com> writes:
Since Walter feels that bit-arrays in the core-language were (in retrospect) 
probably a mistake, I think there's a really good opportunity to heal a 
wart.

Suppose there were a nice library-based bitset ~ just how much effort would 
be needed to clean up now rather than later? Is most of the usage actually 
within Phobos? Would those people who actually use bit-arrays kick and 
scream all the way to hell and back?

How about it?

- Kris



"Sean Kelly" <sean f4.ca> wrote in message 
news:dnsch6$enq$1 digitaldaemon.com...
 Walter Bright wrote:
 In article <dnoru3$mgr$1 digitaldaemon.com>, Niko Korhonen says...
 I also would like to see bit arrays and AA's in the standard library 
 instead of the language; IMO D isn't high-enough level language for 
 that.
While I agree that bit arrays in the core language, in retrospect, were probably a mistake they are in the language, are used, and so need to stay.
But how often are they used in a way that would be incompatible with arrays of 1-byte bits? I'll admit that I do like being able to use packed bit arrays from time to time (their use with slicing can be pretty cool), but just as often I don't want them. Most recently, it was when I wanted to do this: bit[] set; foreach( inout bit; set ) { ... } Obviously impossible with a packed array so I was forced to choose between double lookups (not a big deal with arrays, but annoying) and using another data type... I chose to use an array of ubytes, instead, and suffer the slight lack of value restrictions. This almost has me wishing there were a bool type in addition to the bit type, simply so both options were available. Sean
Dec 15 2005
parent reply Derek Parnell <derek psych.ward> writes:
On Thu, 15 Dec 2005 10:40:45 -0800, Kris wrote:

 Since Walter feels that bit-arrays in the core-language were (in retrospect) 
 probably a mistake, I think there's a really good opportunity to heal a 
 wart.
 
 Suppose there were a nice library-based bitset ~ just how much effort would 
 be needed to clean up now rather than later? Is most of the usage actually 
 within Phobos? Would those people who actually use bit-arrays kick and 
 scream all the way to hell and back?
 
 How about it?
I don't use bit arrays but I do use bool associative arrays.

--
Derek Parnell
Melbourne, Australia
16/12/2005 6:32:53 AM
Dec 15 2005
parent reply "Kris" <fu bar.com> writes:
"Derek Parnell" <derek psych.ward> wrote in message 
news:1ltc1ir9rny3r.1vwocglyyu3uu.dlg 40tude.net...
 On Thu, 15 Dec 2005 10:40:45 -0800, Kris wrote:

 Since Walter feels that bit-arrays in the core-language were (in 
 retrospect)
 probably a mistake, I think there's a really good opportunity to heal a
 wart.

 Suppose there were a nice library-based bitset ~ just how much effort 
 would
 be needed to clean up now rather than later? Is most of the usage 
 actually
 within Phobos? Would those people who actually use bit-arrays kick and
 scream all the way to hell and back?

 How about it?
 I don't use bit arrays but I do use bool associative arrays.
Me too, but I think that's implemented akin to a byte-AA instead? That is, I doubt it would be impacted.
Dec 15 2005
parent reply Oskar Linde <oskar.lindeREM OVEgmail.com> writes:
In article <dnssku$r3g$1 digitaldaemon.com>, Kris says...
"Derek Parnell" <derek psych.ward> wrote in message 
news:1ltc1ir9rny3r.1vwocglyyu3uu.dlg 40tude.net...
 On Thu, 15 Dec 2005 10:40:45 -0800, Kris wrote:

 Since Walter feels that bit-arrays in the core-language were (in 
 retrospect)
 probably a mistake, I think there's a really good opportunity to heal a
 wart.

 Suppose there were a nice library-based bitset ~ just how much effort 
 would
 be needed to clean up now rather than later? Is most of the usage 
 actually
 within Phobos? Would those people who actually use bit-arrays kick and
 scream all the way to hell and back?

 How about it?
 I don't use bit arrays but I do use bool associative arrays.
me too, but I think that's implemented akin to a byte-AA instead? That is, I doubt it would be impacted.
I've used bit arrays where I in C would use bit fields. (But IMHO, the C standard has done the wrong thing by allowing the order of fields in bit fields to become endian dependent.)

I tried implementing a BitArray class in D, just seeing what would turn up. This is how it works:

BitArray - bit[] replacement
BitField!(member1,size1,member2,size2,...) - C bit field replacement

BitArray supports unaligned slices, concatenation etc... BitField also supports named fields: I'm not sure why dmd requires () after templated member functions...

I can post the code if anyone is interested.

/Oskar
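(Oskar's code was not included in the post. Purely as an illustration of the idea, with all names invented here, a hand-written equivalent of what such a BitField template might generate:)

// Named fields packed into a uint, with property-style accessors doing
// the masking and shifting a BitField template would generate.
struct PixelFormat
{
    uint data;

    // "depth" occupies bits 0..4, "alpha" occupies bit 5
    uint depth()       { return data & 0x1F; }
    void depth(uint v) { data = (data & ~0x1Fu) | (v & 0x1F); }
    bit alpha()        { return (data & 0x20) != 0; }
    void alpha(bit v)  { data = v ? (data | 0x20) : (data & ~0x20u); }
}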
Dec 18 2005
parent MicroWizard <MicroWizard_member pathlink.com> writes:
For some months I have a thought... And I just found the answer.

 Since Walter feels that bit-arrays in the core-language were (in 
 retrospect)
 probably a mistake, I think there's a really good opportunity to heal a
 wart.
Yes. Bit-arrays should not be embedded into the language itself. There are a lot of problems:

- Returning a bit from a function, bit-type out function parameters and so on.
- Bit-arrays are not able to replace C-style bit fields.
- Bit-arrays cannot be used as normal sets; the basic operations (in, or, and, xor) are missing.

And finally I have found some posts in D newsgroups that explained to me that most of these problems can be solved elegantly with templates/mixins in a convenient library.
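(A hedged sketch of that library direction, with all names invented here: the missing set operations map naturally onto D's operator overloads, with contains() standing in for "in".)

// A small fixed-size set backed by a uint.
struct SmallSet
{
    uint bits;

    void add(int i)      { bits |= 1u << i; }
    bit contains(int i)  { return (bits & (1u << i)) != 0; }  // the "in" test

    // a & b, a | b, a ^ b give intersection, union, symmetric difference
    SmallSet opAnd(SmallSet o) { SmallSet r; r.bits = bits & o.bits; return r; }
    SmallSet opOr(SmallSet o)  { SmallSet r; r.bits = bits | o.bits; return r; }
    SmallSet opXor(SmallSet o) { SmallSet r; r.bits = bits ^ o.bits; return r; }
}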
I've used bit arrays where I in C would use bit fields. (but IMHO, the C
standard has done the wrong thing by allowing the order of fields in bit fields
to become endian dependent)
...
I can post the code if anyone is interested.
I am interested. Please send a link if possible. Thanks, Tamas Nagy
Jan 02 2006
prev sibling next sibling parent Hasan Aljudy <hasan.aljudy gmail.com> writes:
Walter Bright wrote:
 In article <dnoru3$mgr$1 digitaldaemon.com>, Niko Korhonen says...
 
I also would like to see bit arrays and AA's in the standard library 
instead of the language; IMO D isn't high-enough level language for 
that.
While I agree that bit arrays in the core language, in retrospect, were probably a mistake, they are in the language, are used, and so need to stay. I'll disagree with the AA's. I use them and like them a lot. It's a well used data structure, and having them in the core makes them just sweet.
Why? I find them very convenient (and cool), although I haven't used them in a serious project/program. It's kind of a nice way to market D, you know .. looks attractive to me!
Dec 15 2005
prev sibling parent Bruno Medeiros <daiphoenixNO SPAMlycos.com> writes:
Walter Bright wrote:
 In article <dnoru3$mgr$1 digitaldaemon.com>, Niko Korhonen says...
 
I also would like to see bit arrays and AA's in the standard library 
instead of the language; IMO D isn't high-enough level language for 
that.
While I agree that bit arrays in the core language, in retrospect, were probably a mistake, they are in the language, are used, and so need to stay.
Whoa, am I reading right? Isn't that against the philosophy of D? I mean, we complain about C++'s nuisances related to backwards compatibility, yet when we find a certain feature in D that is regrettable, it's decided it can't be taken off, more so even though D is still in development?

--
Bruno Medeiros - CS/E student
"Certain aspects of D are a pathway to many abilities some consider to be... unnatural."
Dec 16 2005
prev sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Niko Korhonen" <niktheblak hotmail.com> wrote in message 
news:dnolt3$eeo$1 digitaldaemon.com...
 My view is that D's had some positively *huge* breaking changes in the 
 last couple of months, including but not limited to:

 * Type inference (the most major change; IMO should have been in D 2.0)
 * $ symbol in arrays
 * New features in Phobos
 * Refactoring and other changes in Phobos
 * String postfix literals
 * === and !== replaced with is and !is
 * .size replaced with .sizeof

 All of these changes are non-backwards compatible,
I don't agree: the type inference, $, new Phobos functions, and string postfix don't break existing code. The others are minor textual edits, and the compiler has good support for identifying the changes, and suggesting the fix. With this, fixing the code for them is not difficult or time consuming - there were no structural changes or rewrites necessary.
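(For scale, the two renames amount to mechanical substitutions like these; illustrative only:)

void main()
{
    Object a, b;

    // formerly: if (a === b) ...  and  if (a !== b) ...
    if (a is b) { }
    if (a !is b) { }

    // formerly: int.size
    auto n = int.sizeof;
}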
 and an even more major change is coming (stack allocation).
This won't break existing code.

 If every language changed at this pace, nobody would dare to use them.

 The pace of breaking changes in D is *fast*, even when comparing against 
 other alpha-stage languages. This is visible in e.g. GDC's constant 
 struggle keep up with DMD.

 Given all this I don't blame anyone for not daring to use D in any major 
 project. Projects whose timespan may be several years need such a level of 
 fixture that D cannot offer in the foreseeable future.
Any project whose timespan is several years should archive the tools used to build the project, along with that project. Heck, gcc and dmc++ have seen breaking changes in the last year, and all along.
Dec 14 2005
next sibling parent =?ISO-8859-1?Q?Anders_F_Bj=F6rklund?= <afb algonet.se> writes:
Walter Bright wrote:

 I don't agree, the type inference, $, new Phobos functions, and string 
 postfix don't break existing code. The others are minor textual edits, and 
 the compiler has good support for identifying the changes, and suggesting 
 the fix. With this, fixing the code for them is not difficult or time 
 consuming - there were no structural changes or rewrites necessary.
The only breakage that I saw was that the new features wouldn't work with an older compiler, so I would have to wait for GDC to catch up. But in the end (and pretty darn quick too), it always has done so... http://www.prowiki.org/wiki4d/wiki.cgi?HistoryRoadmap --anders
Dec 14 2005
prev sibling next sibling parent Tom S <h3r3tic remove.mat.uni.torun.pl> writes:
Walter Bright wrote:
 "Niko Korhonen" <niktheblak hotmail.com> wrote in message 
 news:dnolt3$eeo$1 digitaldaemon.com...
 
My view is that D's had some positively *huge* breaking changes in the 
last couple of months, including but not limited to:

* Type inference (the most major change; IMO should have been in D 2.0)
* $ symbol in arrays
* New features in Phobos
* Refactoring and other changes in Phobos
* String postfix literals
* === and !== replaced with is and !is
* .size replaced with .sizeof

All of these changes are non-backwards compatible,
I don't agree, the type inference, $, new Phobos functions, and string postfix don't break existing code. The others are minor textual edits, and the compiler has good support for identifying the changes, and suggesting the fix. With this, fixing the code for them is not difficult or time consuming - there were no structural changes or rewrites necessary.
True :) I haven't been forced to commit any major changes in my code (to account for a new dmd release) ever since I've been using D.

--
-----BEGIN GEEK CODE BLOCK-----
Version: 3.1
GCS/M d-pu s+: a-->----- C+++$>++++ UL P+ L+ E--- W++ N++ o? K? w++ !O !M V?
PS- PE- Y PGP t 5 X? R tv-- b DI- D+ G e>+++ h>++ !r !y
------END GEEK CODE BLOCK------
Tomasz Stachowiak /+ a.k.a. h3r3tic +/
Dec 14 2005
prev sibling parent reply Niko Korhonen <niktheblak hotmail.com> writes:
Walter Bright wrote:
 I don't agree, the type inference, $, new Phobos functions, and string 
 postfix don't break existing code. The others are minor textual edits, and 
 the compiler has good support for identifying the changes, and suggesting 
 the fix. With this, fixing the code for them is not difficult or time 
 consuming - there were no structural changes or rewrites necessary.
True, only === and AA are actually breaking changes, but the other ones are still not backwards compatible.
and an even more major change is coming (stack allocation).
This won't break existing code.
True, but it changes the language semantics in a major way, thus requiring re-education of D programmers. Think of the "accidentally return a pointer to a stack-allocated object from a function" C++ scenario, please!

In a previous reply I mentioned that D is gradually mutating into C++; the heap/stack allocation separation that you are going to implement in D is IMHO a major step in that direction, especially since the syntax is almost like in C++.
 Any project whose timespan is several years should archive the tools used to 
 build the project, along with that project. Heck, gcc and dmc++ have seen 
 breaking changes in the last year, and all along. 
Yes, but for argument's sake consider a situation where a project has 5M lines of DMD 0.141 code. Assume that it turns out that DMD 0.141 has a nasty compiler bug that forces the project to use some pretty ugly workarounds. Suppose this bug was fixed in DMD 0.143, but 0.142 added a breaking change which would need tens or possibly hundreds of man-hours to track down the needed changes in the source. The customer doesn't want to pay for these hours so the project management refuses the compiler update. Bang! The project is forever stuck with DMD 0.141 and the programmers are likewise stuck using these ugly workarounds and can only dream about the nifty new features that DMD 0.150 brought; say bye-bye to employee morale.

I've seen a project at my workplace that has been stuck on Java 1.3.1_b24 forever because of some nasty breaking changes in the immediately following JDK version. It's horrible when that happens. Probably the worst thing that can happen in software development.

--
Niko Korhonen
SW Developer
Dec 14 2005
next sibling parent reply Tom S <h3r3tic remove.mat.uni.torun.pl> writes:
Niko Korhonen wrote:
 Walter Bright wrote:
 and an even more major change is coming (stack allocation).
This won't break existing code.
True, but it changes the language semantics in a major way thus requiring re-education of D programmers. Think of the "accidentally return a pointer to a stack-allocated object from a function" C++ scenario, please!
DMD.141
-------

Object foo() {
    auto Object x = new Object;
    return x;        // error: escaping reference to auto local x
}

There you go :) DMD won't let you hurt yourself now and it won't let you when stack allocation is added.
 In a previous reply I mentioned that D is gradually mutating into C++; 
 the heap/stack allocation separation that you are going to implement in 
 D is IMHO a major step in that direction, especially since the syntax is 
 almost like in C++.
No, it's a step towards better performance of D code so that former C/C++ programmers won't complain too much. As Walter said, it's not going to break existing code. Unless it's buggy, because then its behaviour might be undefined anyway.
 Yes, but for argument's sake consider a situation where a project has 5M 
 lines of DMD 0.141 code. Assume that it turns out that DMD 0.141 has a 
 nasty compiler bug that forces the project to use some pretty ugly 
 workarounds. Suppose this bug was fixed in DMD 0.143, but 0.142 added a 
 breaking change which would need tens or possibly hundreds of man-hours 
 to track down the needed changes in the source. The customer doesn't 
 want to pay for these hours so the project management refuses the 
 compiler update. Bang! The project is forever stuck with DMD 0.141 and 
 the programmers are likewise stuck using these ugly workarounds and can 
 only dream about the nifty new features that DMD 0.150 brought; say 
 bye-bye to employer morale.
Your assumptions are wrong. There is no D project I've ever heard of with 5M lines of code. As D matures, I expect fewer and fewer features being added to the 'stable' compiler branch. I can't speak for Walter, but DMD will probably branch off into a 'beta' / 'development' / 'research' compiler which will adopt new features until it becomes the 'stable' release. People are already starting to do this with GDC.
 I've seen a project at my work place that has stuck into Java 1.3.1_b24 
 forever because of some nasty breaking changes in the immediately 
 following JDK version. It's horrible when that happens. Probably the 
 worst thing that can happen in software development.
Since D is so easy to parse, perhaps we could have an automatic 'conversion tool' between DMD versions X and Y ?

--
-----BEGIN GEEK CODE BLOCK-----
Version: 3.1
GCS/M d-pu s+: a-->----- C+++$>++++ UL P+ L+ E--- W++ N++ o? K? w++ !O !M V?
PS- PE- Y PGP t 5 X? R tv-- b DI- D+ G e>+++ h>++ !r !y
------END GEEK CODE BLOCK------
Tomasz Stachowiak /+ a.k.a. h3r3tic +/
Dec 14 2005
next sibling parent reply Niko Korhonen <niktheblak hotmail.com> writes:
Tom S wrote:
 DMD.141
 -------
 
 Object foo() {
     auto Object x = new Object;
     return x;        // error: escaping reference to auto local x
 }
 
 There you go :) DMD won't let you hurt yourself now and it won't let you 
 when stack allocation is added.
Object* f() {
    auto Object obj = new Object();
    Object* ptr = &obj;
    return ptr;
}

// The pointer is no longer valid
int main() {
    Object* ptr = f();
    return 0;
}

Compiles OK and warns only "warning - rettest.d(5): function rettest.f no return at end of function". Boo-yeah :)

Actually I can argue that when a language allows pointers it's impossible to determine at compile time whether a local object reference has escaped. A nice casting round-trip should do the job.
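(A hypothetical snippet, not from the thread, making the "casting round-trip" concrete: once the address passes through an integer, no compile-time escape check can follow it.)

size_t stash;  // invented name, module-level scratch variable

Object* leak()
{
    auto Object obj = new Object();
    stash = cast(size_t) cast(void*) &obj;  // launder the stack address
    return cast(Object*) stash;             // the escape is now invisible
}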
 No, it's a step towards better performance of D code so that former 
 C/C++ programmers won't complain too much. As Walter said, it's not 
 going to break existing code. Unless it's buggy, because then its 
 behaviour might be undefined anyway.
Well, I just feel that:

auto obj = new Object();
auto obj = Object();

is kind of ugly in the way of being very C++'ish. I would prefer the same syntax for both heap and stack-allocated objects. Here's my adaptation of the concept for D (your mileage may vary):

auto class StackObj;
class HeapObj;

StackObj obj0; // Stack-allocated, default-value initialized
StackObj obj1 = new StackObj(); // Stack-allocated
HeapObj obj2 = new HeapObj(); // Heap-allocated

This doesn't solve the 'return &obj1' problem, but then again, nothing does.
 Since D is so easy to parse, then perhaps we could have an automatic 
 'conversion tool' between DMD version X and Y ?
Hmm, preferably not. Besides, who's going to do that? Walter, Derek and Anders are already busy!

--
Niko Korhonen
SW Developer
Dec 14 2005
parent reply Tom S <h3r3tic remove.mat.uni.torun.pl> writes:
Niko Korhonen wrote:
 Tom S wrote:
 
 DMD.141
 -------

 Object foo() {
     auto Object x = new Object;
     return x;        // error: escaping reference to auto local x
 }

 There you go :) DMD won't let you hurt yourself now and it won't let 
 you when stack allocation is added.
Object* f() { auto Object obj = new Object(); Object* ptr = &obj; return ptr; } // The pointer is no longer valid int main() { Object* ptr = f(); return 0; } Compiles OK and warns only "warning - rettest.d(5): function rettest.f no return at end of function". Boo-yeah :)
My point was that you get the same level of security when using stack allocation as when using heap allocation :P
 Actually I can argue that when a language allows pointers it's 
 impossible to determine whether a local object reference is escaped 
 during compile time. A nice casting round-trip should do the job.
/me agrees
 Well, I just feel that:
 
 auto obj = new Object();
 auto obj = Object();
 
 is kind of ugly in the way of being very C++'ish. I would prefer the same 
 syntax for both heap and stack-allocated objects. Here's my adaptation 
 of the concept for D (your mileage may vary):
 
 auto class StackObj;
 class HeapObj;
 
 StackObj obj0; // Stack-allocated, default-value initialized
 StackObj obj1 = new StackObj(); // Stack-allocated
 HeapObj obj2 = new HeapObj(); // Heap-allocated
Sometimes I'd like to have a class object defined on the stack, sometimes on the heap (without RAII), would I have then to make 2 classes ? Is the D heap allocation syntax already 'frozen' ?
 Since D is so easy to parse, then perhaps we could have an automatic 
 'conversion tool' between DMD version X and Y ?
Hmm, preferably not. Besides, who's going to do that? Walter, Derek and Anders are already busy!
If I weren't busy with my stuff, I'd do it :/ Any volunteers ? ;)

--
-----BEGIN GEEK CODE BLOCK-----
Version: 3.1
GCS/M d-pu s+: a-->----- C+++$>++++ UL P+ L+ E--- W++ N++ o? K? w++ !O !M V?
PS- PE- Y PGP t 5 X? R tv-- b DI- D+ G e>+++ h>++ !r !y
------END GEEK CODE BLOCK------
Tomasz Stachowiak /+ a.k.a. h3r3tic +/
Dec 14 2005
parent reply Niko Korhonen <niktheblak hotmail.com> writes:
Tom S wrote:
 auto class StackObj;
 class HeapObj;

 StackObj obj0; // Stack-allocated, default-value initialized
 StackObj obj1 = new StackObj(); // Stack-allocated
 HeapObj obj2 = new HeapObj(); // Heap-allocated
Sometimes I'd like to have a class object defined on the stack, sometimes on the heap (without RAII), would I have then to make 2 classes ?
Yes, this is true, I actually didn't think about that very much. In the scheme I sketched, the allocation model cannot be decided during instantiation. This is rather limiting, and can already be done in D with structs.

I guess that we really want to allow deciding between allocation models during the instantiation phase, as is done in C++. This is kind of a difficult issue for me since it's a very useful feature but adds a significant amount of complexity to the language. And I fear that it necessarily leads to other changes, like 'new' returning a pointer as suggested by others in another thread. I already have nightmares about that!

I guess a *really* nice syntax would do the trick, something like the 'using' pattern maybe?

using (Object obj = Object())
{
  // ... work with obj
} // obj doesn't exist anymore
 Is the D heap allocation syntax already 'frozen' ?
It's probably going to make it into DMD 0.142 as such and then get modified in a major, and especially breaking, way in a later version :)

--
Niko Korhonen
SW Developer
Dec 14 2005
next sibling parent reply Tom S <h3r3tic remove.mat.uni.torun.pl> writes:
Niko Korhonen wrote:
 I guess that we really want to allow deciding between allocation models 
 during the instantiation phase, as is done in C++. This is kind of a 
 difficult issue for me since it's very useful feature but adds a 
 significant amount of complexity to the language. And I fear that it 
 necessarily leads to other changes, like 'new' returning a pointer as 
 suggested by others in another thread. I already have nightmares about 
 that!
What complexity ? We already have it. It's all about the syntactic sugar.
 I guess a *really* nice syntax would do the trick, something where the 

 'using' pattern maybe?
 
 using (Object obj = Object())
 {
   // ... work with obj
 } // obj doesn't exist anymore
:o I don't like it... Simple things should be simple, remember that. With the proposed D syntax, it's actually pretty obvious... When you use 'new', you get the memory from the heap. When not using it, you get it from the stack.
 Is the D heap allocation syntax already 'frozen' ?
It's probably going to make into DMD 0.142 as such and then get modified in a a major, and especially breaking, way in a later version :)
lol

--
-----BEGIN GEEK CODE BLOCK-----
Version: 3.1
GCS/M d-pu s+: a-->----- C+++$>++++ UL P+ L+ E--- W++ N++ o? K? w++ !O !M V?
PS- PE- Y PGP t 5 X? R tv-- b DI- D+ G e>+++ h>++ !r !y
------END GEEK CODE BLOCK------
Tomasz Stachowiak /+ a.k.a. h3r3tic +/
Dec 14 2005
parent reply Niko Korhonen <niktheblak hotmail.com> writes:
Tom S wrote:
 What complexity ? We already have it. It's all about the syntactic sugar.
No, not as such IMO. Struct semantics are different enough from class semantics for them not to get mixed in that manner, and the current stack allocation scheme for classes (using alloca) is so ugly and obscure that no one uses it unless it's absolutely necessary.

After the new stack allocation syntax addition we have a simple, viable way of creating class instances both on the heap and on the stack. So now we have to *choose whether we want stack or heap instances whenever we are creating a class instance*.

Each time you fork the language construct in two, you get two schools of thought. These two schools of thought will wage war on each other indefinitely. This is exactly what happened with C++ (in an extreme manner) and nowadays every C++ programmer knows and uses a different subset of the language and wages war with the other programmers. Consider:

int x = 5;
int x(5);

Which is better?

MyClass a;
MyClass a = MyClass();

Again, which is better?

void func(const MyClass& myClass);
void func(const MyClass* myClass);

Again, which is better?

C++ has created millions of ways to do the same thing, i.e. the language is full of redundant constructs. A couple of recent additions to D (type inference and stack allocation) have forked/will fork the language in two; now we have to choose between:

auto c = new MyClass();
MyClass c = new MyClass(); // Which is better?

and soon:

auto c = MyClass();
auto c = new MyClass();

People are bound to create Coding Guidelines for D with one saying 'Always use type inference' and the other saying 'Never use type inference'. I've read three different company internal style guides for C++ and all of them are completely different. I don't want the same to happen to D.

The point that I'm desperately (and rather poorly, come to think of it) trying to make is that we should keep the number of language constructs to a bare minimum and absolutely ban any redundant constructs. This helps to keep the language shareable, clean, easy to understand and easy to parse. Whenever there is a choice between language constructs programmers will fight about which one is better. The only historically proven way to alleviate this issue is to reduce the number of choices. The C++ gurus' standard answer of 'educating the programmers better' hasn't worked in the real world. If and only if there are no choices between language constructs to fight about will programmers not fight about them. Otherwise they will.

--
Niko Korhonen
SW Developer
Dec 15 2005
next sibling parent reply Hasan Aljudy <hasan.aljudy gmail.com> writes:
Right on dude!!
I didn't really like type inference, but oh well, it can come in handy 
sometimes with long class names ..

However, I'm very worried about this stack allocation thing (when and 
where did Walter say that he was gonna implement it?!), IMO it'll be a 
step in the wrong direction (C++).

One of the things I hate most about C++ is the fact that you can 
allocate objects (class instances) on the stack OR on the heap, you have 
this choice, and it's always hard to make a decision, if it was up to 
me, I'd always create objects on the heap, but the problem is, the STL 
has alot of objects which are meant to be allocated on the stack, i.e. 
string, vector, list, map ... etc. The problem is, these classes have 
much of their functionality in overloaded operators, i.e. the 
implementation relies on operator overloading, and if you allocate these 
objects on the heap, you'd have to use pointers, in which case you won't 
be able to use the overloaded operators without some nasty, ugly 
work-arounds.

I liked Java because there's only one way to create objects, and one way 
to use them!!
One of the main things that attracted me to D was that it's (C++)-like, 
yet it borrowed the right things from java (class syntax).
Another thing that attracted me is the fact that classes are written in 
one block, no separate declaration and implementation (.h and .cpp) crap.
One more thing is that D has no confusing things like
 void func(const MyClass& myClass);
 void func(const MyClass* myClass);
etc... Just my two cents, but the way I'm seeing it (or that's how it seems to me) is that D is somehow moving in the wrong direction.
 Tom S wrote:
 
 What complexity ? We already have it. It's all about the syntactic sugar.
<snip full quote of the parent post>
Dec 15 2005
next sibling parent reply "Regan Heath" <regan netwin.co.nz> writes:
On Thu, 15 Dec 2005 11:11:44 -0700, Hasan Aljudy <hasan.aljudy gmail.com>  
wrote:
 One of the things I hate most about C++ is the fact that you can  
 allocate objects (class instances) on the stack OR on the heap, you have  
 this choice, and it's always hard to make a decision, if it was up to  
 me, I'd always create objects on the heap, but the problem is, the STL  
has a lot of objects which are meant to be allocated on the stack, i.e.  
 string, vector, list, map ... etc. The problem is, these classes have  
 much of their functionality in overloaded operators, i.e. the  
 implementation relies on operator overloading, and if you allocate these  
 objects on the heap, you'd have to use pointers, in which case you won't  
 be able to use the overloaded operators without some nasty, ugly  
 work-arounds.
It's my impression (because Walter has been rather closed-mouthed about the idea) that heap or stack allocation will have no effect (and/or should have no effect) on how a class reference behaves, eg:

MyClass a = MyClass();     //stack
MyClass b = new MyClass(); //heap
MyClass c;

a.foo();
b.foo();
c = a * b;

Therefore none of your above comments will apply to D :)

Regan
Dec 15 2005
parent Hasan Aljudy <hasan.aljudy gmail.com> writes:
Regan Heath wrote:
 On Thu, 15 Dec 2005 11:11:44 -0700, Hasan Aljudy 
 <hasan.aljudy gmail.com>  wrote:
 
 One of the things I hate most about C++ is the fact that you can  
 allocate objects (class instances) on the stack OR on the heap, you 
 have  this choice, and it's always hard to make a decision, if it was 
 up to  me, I'd always create objects on the heap, but the problem is, 
 the STL  has a lot of objects which are meant to be allocated on the 
 stack, i.e.  string, vector, list, map ... etc. The problem is, these 
 classes have  much of their functionality in overloaded operators, 
 i.e. the  implementation relies on operator overloading, and if you 
 allocate these  objects on the heap, you'd have to use pointers, in 
 which case you won't  be able to use the overloaded operators without 
 some nasty, ugly  work-arounds.
It's my impression (because Walter has been rather closed-mouthed about the idea) that heap or stack allocation will have no effect (and/or should have no effect) on how a class reference behaves, eg:

MyClass a = MyClass();     //stack
MyClass b = new MyClass(); //heap
MyClass c;

a.foo();
b.foo();
c = a * b;

Therefore none of your above comments will apply to D :)

Regan
I don't mind if the question of "where to allocate objects" is implementation-dependent, so that an implementation can create RAII objects on the stack if the compiler author wishes to do so .. However, doing it in a cpp-like way /may be/ a step in the wrong direction. You're right, this alone wouldn't be so bad, but I'm worried a bit about the future :P
Dec 15 2005
prev sibling parent reply Walter Bright <Walter_member pathlink.com> writes:
In article <dnsbkv$e59$1 digitaldaemon.com>, Hasan Aljudy says...
One of the things I hate most about C++ is the fact that you can 
allocate objects (class instances) on the stack OR on the heap, you have 
this choice, and it's always hard to make a decision, if it was up to 
me, I'd always create objects on the heap, but the problem is, the STL 
has alot of objects which are meant to be allocated on the stack, i.e. 
string, vector, list, map ... etc. The problem is, these classes have 
much of their functionality in overloaded operators, i.e. the 
implementation relies on operator overloading, and if you allocate these 
objects on the heap, you'd have to use pointers, in which case you won't 
be able to use the overloaded operators without some nasty, ugly 
work-arounds.
You raise a very important issue. I think the answer is that, although a class object can be located on the stack or on the heap, the syntax with which you access it always remains the same. You don't have to worry about whether to use '.', '->', or '&' (by reference). Not only is the syntax all the same for D (the .), but the *semantics* are all the same. The difference will be that stack objects will go away when the function exits.

The reason to have the stack allocated objects is that it opens up a large avenue of applications for templates.
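For illustration, something along these lines (just a sketch - the class is made up, and the stack syntax follows the proposal being discussed in this thread rather than anything final):

class Foo
{
    int x;
    void bar() { }
}

void test()
{
    Foo a = new Foo(); // heap: reclaimed by the GC
    Foo b = Foo();     // stack (proposed syntax): goes away when test() exits
    a.bar();           // identical access syntax...
    b.bar();           // ...and identical semantics
}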
Dec 15 2005
parent reply Niko Korhonen <niktheblak hotmail.com> writes:
Walter Bright wrote:
 The reason to have the stack allocated objects is that it opens up a large
 avenue of applications for templates.
I couldn't help noticing that a very large portion of the recent D feature additions have to do with template (meta)programming. Does allowing templates in a language necessarily lead to gradually increasing complexity until the language's template mechanism and supporting features start to look like C++?

-- 
Niko Korhonen
SW Developer
Dec 15 2005
next sibling parent clayasaurus <clayasaurus gmail.com> writes:
Niko Korhonen wrote:
 Walter Bright wrote:
 
 The reason to have the stack allocated objects is that it opens up a 
 large
 avenue of applications for templates.
I couldn't help but noticing that a very large portion of the recent D feature additions have to do with template (meta)programming. Does allowing templates in a language necessarily lead to gradually increasing complexity until the language's template mechanism and supporting features start to look like C++?
We have hindsight on our side, so I hope not.
Dec 16 2005
prev sibling parent "Walter Bright" <newshound digitalmars.com> writes:
"Niko Korhonen" <niktheblak hotmail.com> wrote in message 
news:dntrhf$1g0e$1 digitaldaemon.com...
 Walter Bright wrote:
 The reason to have the stack allocated objects is that it opens up a 
 large
 avenue of applications for templates.
I couldn't help but noticing that a very large portion of the recent D feature additions have to do with template (meta)programming. Does allowing templates in a language necessarily lead to gradually increasing complexity until the language's template mechanism and supporting features start to look like C++?
That's a very legitimate concern. But I think what is wrong with C++ templates is not the idea, but the expression of the idea. C++ templates broke a lot of new ground, and just like early airplanes, once we know where we want to get to, we can devise a better design.
Dec 19 2005
prev sibling parent reply Georg Wrede <georg.wrede nospam.org> writes:
Niko Korhonen wrote:

...

 Each time you fork the language construct in two, you get two schools of 
 thought. These two schools of thought will wage war on each other 
 indefinitely. This is exactly what happened with C++ (in an extreme 
 manner) and nowadays every C++ programmer knows and uses a different 
 subset of the language and wages war with the other programmers.
 
 Consider:
 
 int x = 5;
 int x(5);
 
 Which is better?
 
 MyClass a;
 MyClass a = MyClass();
 
 Again, which is better?
 
 void func(const MyClass& myClass);
 void func(const MyClass* myClass);
 
 Again, which is better?
 
 C++ has created millions of ways to do the same thing, i.e. the language 
 is full of redundant constructs. A couple of recent additions to D (type 
 inference and stack allocation) have forked/will fork the language in 
 two, now we have to choose between:
 
 auto c = new MyClass();
 MyClass c = new MyClass(); // Which is better?
 
 and soon:
 
 auto c = MyClass();
 auto c = new MyClass();
 
 People are bound to create Coding Guidelines for D with one saying 
 'Always use type inference' and the other saying 'Never use type 
 inference'. I've read three different company internal style guides for 
 C++ and all of them are completely different. I don't want the same to 
 happen to D.
 
 The point that I'm desperately (and rather poorly come to think of it) 
 trying to make is that we should keep the number of language constructs 
 to a bare minimum and absolutely ban any redundant constructs. This 
 helps to keep the language shareable, clean and easy to understand and 
 easy to parse.
 
 Whenever there is a choice between language constructs programmers will 
 fight about which one is better. The only historically proven way to 
 alleviate this issue is to reduce the number of choices. The C++ gurus 
 standard answer of 'educating the programmers better' hasn't worked in 
 the real world.
 
 If and only if there are no choices between language constructs to 
 fight about, programmers will not fight about them. Otherwise they will.
I tend to agree. Parallel ways of doing things confuse all who start with the language. They create war camps and religions. (The more trivial the difference, the harder the fight.)

Later, when more changes get into the language, one often finds that one of the parallel ways is changed and the other is not ("hey, we have these two ways, let's change only one of them, thus creating more choice for the programmer!"), leading ultimately to additional confusion for newcomers, to not-well-thought-out features, and to a loss of coherency and focus.

Language design should remember to prune and not just add. "A language is like a garden. Let it grow without weeding, and you'll end up using a machete just to get through."
Jan 23 2006
parent BCS <BCS_member pathlink.com> writes:
 Niko Korhonen wrote:
 
 ...
 
 A couple of recent additions 
 to D (type inference and stack allocation) have forked/will fork the 
 language in two, now we have to choose between:

 auto c = new MyClass();
 MyClass c = new MyClass(); // Which is better?
...
 The point that I'm desperately (and rather poorly come to think of it) 
 trying to make is that we should keep the number of language 
 constructs to a bare minimum and absolutely ban any redundant 
 constructs. This helps to keep the language shareable, clean and easy 
 to understand and easy to parse.
These constructs are not redundant. The auto construct is needed in templates, where the type of an expression is not known until compile time, and the explicit typing is needed when the type of the variable must be something other than the type of the expression. (OK, you can do this with casts, but that's just plain ugly.)
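Roughly, for illustration (the template and names here are made up):

template Scale(T)
{
    void Scale(T x)
    {
        auto y = x * x; // inference: works whatever type x * x turns out to have
        real z = x * x; // explicit typing: when the variable must be a specific
                        // type (only sensible here for numeric T, of course)
    }
}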
Jan 23 2006
prev sibling next sibling parent reply "Dave" <Dave_member pathlink.com> writes:
"Niko Korhonen" <niktheblak hotmail.com> wrote in message 
news:dnpd6a$174q$1 digitaldaemon.com...

<snip regarding stack allocation syntax>

At the risk of muddling things more, the proposed new sytax (last I saw) 
was:

MyClass mc = new MyClass; // heap
MyClass mc2 = MyClass; // RAII (stack or heap, depending on implementor 
choice (i.e.: GC and machine architecture))

auto MyClass mc3 = new MyClass; // heap and RAII, this syntax will be 
deprecated but allowed for backward compatibility for a while.
auto MyClass mc4 = MyClass; // RAII, this syntax will be deprecated but 
allowed for backward compatibility for a while.

auto mc5 = new MyClass; // heap
auto mc6 = MyClass; // RAII (stack or heap, depending on implementor choice 
(i.e.: GC and machine architecture))

The reasoning is that if you need RAII anyway (destructor fires when it goes 
out of the current scope) it will be allocated in whichever way the 
implementor sees fit. From this perspective it is simply just "fast 
allocated" and the implementation details are left up to the compiler 
developers, but the compiler still has to do the same "escaping reference" 
checks it does now for auto.
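By "escaping reference" checks I mean roughly this kind of thing - a sketch only, assuming the rules stay as they are for today's auto classes:

class MyClass { }
MyClass global;

void f()
{
    auto MyClass c = new MyClass; // RAII: dtor runs when f() exits
    // global = c; // this kind of escaping reference is what the
    //             // compiler has to catch and reject
}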

The other important consideration here is that perhaps the D GC can quit 
checking for a finalizer for every heap allocated class during every GC 
sweep including that class reference, making the GC much faster too (This 
would be because, if the class is RAII and those are always stack allocated 
by the implementor's choice, then the GC does not have to be built to do the 
RAII; it has to do that now because RAII is currently heap allocated).

 It's probably going to make it into DMD 0.142 as such and then get modified 
 in a major, and especially breaking, way in a later version :)
Maybe I'm misunderstanding you but that sounds pretty definitive; where are you getting this information? Walter made no such commitments that I've ever seen for v0.142.
 -- 
 Niko Korhonen
 SW Developer 
Dec 14 2005
parent reply Niko Korhonen <niktheblak hotmail.com> writes:
Dave wrote:
 MyClass mc = new MyClass; // heap
 MyClass mc2 = MyClass; // RAII (stack or heap, depending on implementor 
 choice (i.e.: GC and machine architecture))
So, is 'new MyClass;' equivalent to 'new MyClass();' now? Why not add 'int x(5);' in addition to 'int x = 5;' as well...
 Maybe I'm misunderstanding you but that sounds pretty definitive; where are 
 you getting this information? Walter made no such commitments that I've ever 
 seen for v0.142.
No, I just pulled the version number out of my hat :) Let's call it v0.1xy instead.

-- 
Niko Korhonen
SW Developer
Dec 15 2005
parent "Dave" <Dave_member pathlink.com> writes:
"Niko Korhonen" <niktheblak hotmail.com> wrote in message 
news:dnrcn8$2n9f$1 digitaldaemon.com...
 Dave wrote:
 MyClass mc = new MyClass; // heap
 MyClass mc2 = MyClass; // RAII (stack or heap, depending on implementor 
 choice (i.e.: GC and machine architecture))
So, is 'new MyClass;' equivalent to 'new MyClass();' now? Why not add
IIRC, it has been that way for as long as I've been playing with D (over a year).
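That is, these two do the same thing (illustrative names):

MyClass a = new MyClass;   // no parens...
MyClass b = new MyClass(); // ...same as this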
 'int x(5);' in addition to 'int x = 5;' as well...
No, but maybe 'int x = int(5);', because along with the new RAII syntax for classes, I think that would be very useful for templates to cut down on 'static if(is(...))' and still be able to use the same templates. For example:

// T could now be a native data type, or
// a struct with a static opCall() and opMul(), or
// a class with a copy ctor and opMul() returning a new'd (on the heap) class.
// In other words, truly 'generic' in that the same template could be used
// for all without using conditional compilation.
template Foo(T)
{
    T Foo(T i)
    {
        T x = T(i);
        return x * x;
    }
}

One other thing to fill that out would be ctors for structs, but the new RAII syntax for classes should go a long way in this area I think.

I see the argument you're making w.r.t. different coding standards, but IMHO, one of the big reasons C++ is still as popular as it is, is because one can allocate objects on the stack with it and Java can't, making it much better performing for a lot of problem domains out there.
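To be concrete about the template example above, usage would then look the same for all three kinds of T (MyStruct and MyClass here are hypothetical types with the members described in the comments, and the class case assumes the proposed syntax):

int a = Foo!(int)(5);                     // native type
MyStruct s = Foo!(MyStruct)(MyStruct(5)); // struct with static opCall()/opMul()
MyClass c = Foo!(MyClass)(MyClass(5));    // class, under the proposed RAII syntax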
 Maybe I'm misunderstanding you but that sounds pretty definitive; where 
 are you getting this information? Walter made no such commitments that 
 I've ever seen for v0.142.
No, I just pulled the version number out of my hat :) Let's call it v0.1xy instead. -- Niko Korhonen SW Developer
Dec 15 2005
prev sibling parent reply Bruno Medeiros <daiphoenixNO SPAMlycos.com> writes:
Niko Korhonen wrote:
 Tom S wrote:
 
 auto class StackObj;
 class HeapObj;

 StackObj obj0; // Stack-allocated, default-value initialized
 StackObj obj1 = new StackObj(); // Stack-allocated
 HeapObj obj2 = new HeapObj(); // Heap-allocated
Sometimes I'd like to have a class object defined on the stack, sometimes on the heap (without RAII), would I have then to make 2 classes ?
Yes, this is true, I actually didn't think about that very much. In that scheme the allocation strategy is part of the class declaration and cannot be decided during instantiation. This is rather limiting, and can already be done in D with structs.
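For illustration, a rough sketch of what I mean with structs in today's D (made-up names):

struct StackObj  // value type: lives on the stack (or inline)
{
    int x;
}

class HeapObj    // reference type: instances live on the GC heap
{
    int x;
}

void f()
{
    StackObj s;                // stack-allocated, default-initialized
    HeapObj h = new HeapObj(); // heap-allocated
}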
As far as I know there is no type inference ('auto') in C# 2.0, and from what I've searched, neither in 3.0.

-- 
Bruno Medeiros - CS/E student
"Certain aspects of D are a pathway to many abilities some consider to be... unnatural."
Dec 15 2005
next sibling parent Ivan Senji <ivan.senji_REMOVE_ _THIS__gmail.com> writes:
Bruno Medeiros wrote:
 Niko Korhonen wrote:
 
 Tom S wrote:

 auto class StackObj;
 class HeapObj;

 StackObj obj0; // Stack-allocated, default-value initialized
 StackObj obj1 = new StackObj(); // Stack-allocated
 HeapObj obj2 = new HeapObj(); // Heap-allocated
Sometimes I'd like to have a class object defined on the stack, sometimes on the heap (without RAII), would I have then to make 2 classes ?
 Yes, this is true, I actually didn't think about that very much. In that 
 scheme the allocation strategy is part of the class declaration and cannot 
 be decided during instantiation. This is rather limiting, and can already 
 be done in D with structs.

 As far as I know there is no type inference ('auto') in C# 2.0, and from 
 what I've searched, neither in 3.0.
C# 2.0 no, but 3.0 yes; it is called 'var' instead of 'auto'.
Dec 15 2005
prev sibling parent Niko Korhonen <niktheblak hotmail.com> writes:
Bruno Medeiros wrote:
 Yes, this is true, I actually didn't think about that very much. In that 
 scheme the allocation strategy is part of the class declaration and cannot 
 be decided during instantiation. This is rather limiting, and can already 
 be done in D with structs.

 As far as I know there is no type inference ('auto') in C# 2.0, and from 
 what I've searched, neither in 3.0.
That's also how C# does it: if you declare your type as 'struct' it is always stack-allocated (even when it is instantiated with the 'new' syntax), and if you declare it as 'class' it is always heap-allocated. Like such:

struct MyStruct { }
class MyClass { }

MyStruct m = new MyStruct(); // stack allocation
MyClass c = new MyClass();   // heap allocation

using (MemoryStream m = new MemoryStream())
{
} // m.Dispose() is called at the end of the scope

-- 
Niko Korhonen
SW Developer
Dec 15 2005
prev sibling parent reply Niko Korhonen <niktheblak hotmail.com> writes:
Tom S wrote:
 Your assumptions are wrong. There is no D project I've ever heard of to 
 have 5M lines of code. As D matures, I expect less and less features 
 being added to the 'stable' compiler branch. I can't speak for Walter, 
 but DMD will probably branch off into the 'beta' / 'development' / 
 'research' compiler which will adopt new features until it becomes the 
 'stable' release. People are already starting to do this with GDC.
But that's not /now/, and given that the language spec/compiler has been in alpha stage for five years, I imagine it's not going to be so for a while. Someone might want to start a 5MLoC D project now. It would be pretty great if they could, since no one wants to start a 5MLoC project before there are any successful 5MLoC projects...

-- 
Niko Korhonen
SW Developer
Dec 14 2005
parent Tom S <h3r3tic remove.mat.uni.torun.pl> writes:
Niko Korhonen wrote:

 Tom S wrote:
 
 Your assumptions are wrong. There is no D project I've ever heard of 
 to have 5M lines of code. As D matures, I expect less and less 
 features being added to the 'stable' compiler branch. I can't speak 
 for Walter, but DMD will probably branch off into the 'beta' / 
 'development' / 'research' compiler which will adopt new features 
 until it becomes the 'stable' release. People are already starting to 
 do this with GDC.
But that's not /now/, and given that the language spec/compiler has been in alpha stage for five years, I imagine it's not going to be so for a while.
As I said, I don't believe D will change that much in the coming year or so. There aren't any drastic changes planned AFAICS.
 Someone might want to start a 5MLoC D project now. It would be pretty 
 great if they could since no one wants to start a 5MLoC project before 
 there are any successful 5MLoC projects...
Ye, but it won't be a 5MLoC project from the beginning. It will have to grow. As it does, DMD will as well :P Maybe I'm too optimistic about D... But hell... No other language suits my needs :D -- -----BEGIN GEEK CODE BLOCK----- Version: 3.1 GCS/M d-pu s+: a-->----- C+++$>++++ UL P+ L+ E--- W++ N++ o? K? w++ !O !M V? PS- PE- Y PGP t 5 X? R tv-- b DI- D+ G e>+++ h>++ !r !y ------END GEEK CODE BLOCK------ Tomasz Stachowiak /+ a.k.a. h3r3tic +/
Dec 14 2005
prev sibling parent reply Walter Bright <Walter_member pathlink.com> writes:
In article <dnos3h$mp9$1 digitaldaemon.com>, Niko Korhonen says...
Yes, but for argument's sake consider a situation where a project has 5M 
lines of DMD 0.141 code. Assume that it turns out that DMD 0.141 has a 
nasty compiler bug that forces the project to use some pretty ugly 
workarounds. Suppose this bug was fixed in DMD 0.143, but 0.142 added a 
breaking change which would need tens or possibly hundreds of man-hours 
to track down the needed changes in the source. The customer doesn't 
want to pay for these hours so the project management refuses the 
compiler update. Bang! The project is forever stuck with DMD 0.141 and 
the programmers are likewise stuck using these ugly workarounds and can 
only dream about the nifty new features that DMD 0.150 brought; say 
bye-bye to employer morale.

I've seen a project at my work place that has stuck into Java 1.3.1_b24 
forever because of some nasty breaking changes in the immediately 
following JDK version. It's horrible when that happens. Probably the 
worst thing that can happen in software development.
I agree this can happen - and it does happen with every language. The flip side is that if D doesn't constantly improve as a language, it will die. Show me a language that is unchanging and I'll show you a dead language. D has no choice but to move forward. That said, there is good reason to fork D once 1.0 is set - one fork will be purely bug fixes, the language improvements go into the other fork.
Dec 15 2005
next sibling parent John Reimer <terminal.node gmail.com> writes:
Walter Bright wrote:

 That said, there is good reason to fork D once 1.0 is set - one fork will be
 purely bug fixes, the language improvements go into the other fork.
 
 
This is a truly wonderful statement! -JJR
Dec 15 2005
prev sibling next sibling parent reply "Kris" <fu bar.com> writes:
"Walter Bright" <Walter_member pathlink.com> wrote
 In article <dnos3h$mp9$1 digitaldaemon.com>, Niko Korhonen says...
I've seen a project at my work place that has stuck into Java 1.3.1_b24
forever because of some nasty breaking changes in the immediately
following JDK version. It's horrible when that happens. Probably the
worst thing that can happen in software development.
I agree this can happen - and it does happen with every language. The flip side is that if D doesn't constantly improve as a language, it will die. Show me a language that is unchanging and I'll show you a dead language. D has no choice but to move forward.
The great grand-daddy of OOP is Simula. For those who don't know anything about it, you might be rather surprised at just how little things have changed since 1967 (yes, almost 40 years). Simula died because it was deliberately and explicitly fixed in stone.
Dec 15 2005
parent Lars Ivar Igesund <larsivar igesund.net> writes:
Kris wrote:

 "Walter Bright" <Walter_member pathlink.com> wrote
 In article <dnos3h$mp9$1 digitaldaemon.com>, Niko Korhonen says...
I've seen a project at my work place that has stuck into Java 1.3.1_b24
forever because of some nasty breaking changes in the immediately
following JDK version. It's horrible when that happens. Probably the
worst thing that can happen in software development.
I agree this can happen - and it does happen with every language. The flip side is that if D doesn't constantly improve as a language, it will die. Show me a language that is unchanging and I'll show you a dead language. D has no choice but to move forward.
 The great grand-daddy of OOP is Simula. For those who don't know anything
 about it, you might be rather surprised at just how little things have
 changed since 1967 (yes, almost 40 years).
 
 Simula died because it was deliberately and explicitly fixed in stone.
In the first year of my studies (in '98), I was so lucky to attend a lecture by Kristen Nygaard, one of the two original designers of Simula. Inspirational beyond belief. A man for the programming legends, even compared to Walter ;)
Dec 15 2005
prev sibling next sibling parent Niko Korhonen <niktheblak hotmail.com> writes:
Walter Bright wrote:
 I agree this can happen - and it does happen with every language. The flip 
 side is that if D doesn't constantly improve as a language, it will die. 
 Show me a language that is unchanging and I'll show you a dead language. D 
 has no choice but to move forward.
C++ has been unchanging for almost eight years now, unless you count that the compiler vendors have just recently started to implement *all* of the C++98 specification <g>. C has been theoretically unchanging for six years since the C99 standard, but practically for a whopping 16 years since the C89 standard!

But really, I agree with you. I'm basically just hoping that the D 1.0 spec will happen.
 That said, there is good reason to fork D once 1.0 is set - one fork will be
 purely bug fixes, the language improvements go into the other fork.
Sweet :)

-- 
Niko Korhonen
SW Developer
Dec 16 2005
prev sibling next sibling parent John Reimer <John_member pathlink.com> writes:
I agree this can happen - and it does happen with every language. The flip side
is that if D doesn't constantly improve as a language, it will die. Show me a
language that is unchanging and I'll show you a dead language. D has no choice
but to move forward.

That said, there is good reason to fork D once 1.0 is set - one fork will be
purely bug fixes, the language improvements go into the other fork.
This admission alone is an indicator that D should be turning 1.0 anytime soon. I'm certainly for it. -JJR
Dec 16 2005
prev sibling parent clayasaurus <clayasaurus gmail.com> writes:
Walter Bright wrote:
 That said, there is good reason to fork D once 1.0 is set - one fork will be
 purely bug fixes, the language improvements go into the other fork.
 
This is exactly what I wanted to hear, thanks.
Dec 16 2005
prev sibling next sibling parent Gordon James Miller <gmiller bittwiddlers.com> writes:
 
 How can D break out of this situation?
 
 
After a year or so of dabbling in D I've finally decided that I'm actually going to use it on a few production scale projects. Here are some of the things that have helped out significantly in letting me do this:

1) I'm in a unique situation as Tech Overlord of my group, so I can pretty much dictate what language we use, force people to learn it, and make my customer accept it. I have a particularly sharp, handpicked group that knows C++ and Java pretty well, so that's helpful.

2) GDC becoming stable and having binary builds on multiple platforms (many hundreds of thanks to the GDC guys). Having an open source implementation of the language is necessary on this front. I can check in a copy of the GDC source and remove any fear of not having the language in the future.

3) I've started what I call D-Distilled, which is essentially an in-house version of "D In a Nutshell". I'm using a Wiki to do this, so it's very easy for my team to cross link back and forth between the different sections. Having programmer-level (not specification-level) documentation helps for those people that get intimidated by specifications. (I find most self-taught programmers fall into this category.)

My particular belief is that when people start seeing projects that are useful to them, that scratch their itch, they will start using the language. As I've seen in the government contracting circles I run in, the main motivator in choosing the language is which languages the developers want to use. If I want to use Ruby to do scripting, I use it (which I did, and it turned out great). I now want to use D for my system level stuff, so I'm going to use it.

I don't know if "marketing campaigns" or anything targeted at managers helps in the environment I run in. Getting stable compilers that are easy to install, convincing people that the language will not go away, and developing software that people actually use are the three key things.
Dec 11 2005
prev sibling parent reply JT <JT_member pathlink.com> writes:
I am waiting for the game industry to discover D. Many studios have libraries
and macros or custom preprocessors with built in ice makers and all sorts of
bells and whistles just to construct idioms within C++ to make it something akin
to D. I agree that there is some interesting psychology regarding languages. It's
as if the value of designing a new language and that of using Duct Tape v.2005
is completely out of whack. The game industry is going to see a monumental
increase in size and complexity of data before we know it. Some development
tools are going to crush under their own weight. My current position is D
*should* be the future of the game industry. It's just not worth the cost benefit
whatchamacallit to continue doing things the way they are now vs redesigning the
language which creates the foundation to the tools and engines. Which reminds me
of a thought I was having while porting the D front end to D. C/C++ was
originally as much a language spec as a compiler the same way D is as much a
language spec and a compiler. The complexity of the language is somewhat
relative to the complexity of the language the compiler is written in. C was
built using assembler, the C++ preprocessor using c? and D using C++. This is a
fundamental advance that should have occurred a long time ago. The fact that the
software industry in general has not begun down this path before has as much to
do with collective psychology as money, management, legacy whatever. I had many
of the same ideas as dave when I first found D a couple years ago, D is cool but
lets just tweak C++ to hell and back. On a side note I think a lot of this has
to do with lack of compiler experience and this is where walter has become our
hero. So for whatever reason here we are today in 2005 with this situation. I
very much think if people in the game industry understood D they would fall in
love with it. One thing to mention however, most every game studio holds their
internal procedures, libraries, tools, and cup cake recipes as guarded as the
1985 Bears with their playbook. It would be interesting to see how many
developers would even want to advertise their use of something like D. Not that
I would have any problem with it but its interesting community. I would like to
write an article for game developer magazine but im too busy. im planning a
studio around a brand new engine and suite of tools built in D.
Dec 11 2005
parent reply Lucas Goss <lgoss007 gmail.com> writes:
JT wrote:
 ...One thing to mention however, most every game studio holds their
 internal procedures, libraries, tools, and cup cake recipies as guarded as the
 1985 bears with their playbook. It would be interesting to see how many
 developers would even want to advertise their use of something like D. Not that
 I would have any problem with it but its interesting community. I would like to
 write an article for game developer magazine but im too busy. im planning a
 studio around a brand new engine and suite of tools built in D.
 
I also think D is an ideal game programming language. In fact, game programming is how I found D in the first place. I was looking for a language that took all of the intentions of C++ but got rid of all of the bad (or difficult) things of trying to be compatible with C (or use C as a subset).

I'm still trying to form an opinion on keeping libraries and tools secret, as there are good and bad sides. I see many reinvented game libraries that just seem like a waste of time (a Matrix, Vector, etc. should be in a standard math/physics library). Also, if you want to use a different game engine you have to switch all of your libraries (more waste of time). But at the same time, if a company (or person) put many hours into a tool, they should be able to get paid for it.

Well, I won't go on a long rant, but in high school and college it was hard to find an editor or a 3D modeller that didn't cost a fortune. Fortunately, some companies now make personal editions or cheaper editions for aspiring developers. I'll just stop there... My opinion is beginning to be swayed towards libraries being open with a type of FreeBSD license, while tools should be priced accordingly (for students/indies and companies).

But anyways, I've used Torque, OGRE, GeometricTools (used to be Wild-Magic), and lately I've been using OSG. My main problem is that all of those are in C++, which means I would have to redo all (or parts) of those libraries. Guess I better get coding...

PS: I like the reference to the 1985 Bears :) And if your studio needs any programmers for this brand new engine... let me know.
Dec 15 2005
next sibling parent reply JT <JT_member pathlink.com> writes:
In article <dnrn8u$2vcs$1 digitaldaemon.com>, Lucas Goss says...
JT wrote:
 ...One thing to mention however, most every game studio holds their
 internal procedures, libraries, tools, and cup cake recipies as guarded as the
 1985 bears with their playbook. It would be interesting to see how many
 developers would even want to advertise their use of something like D. Not that
 I would have any problem with it but its interesting community. I would like to
 write an article for game developer magazine but im too busy. im planning a
 studio around a brand new engine and suite of tools built in D.
 
I also think D is an ideal game programming language. In fact, game programming is how I found D in the first place. I was looking for a language that took all of the intentions of C++ but got rid of all of the bad (or difficult) things of trying to be compatible with C (or use C as a subset). I'm still trying to form an opinion on keeping libraries and tools secret, as there are good and bad sides. I see many reinvented game libraries that just seem like a waste of time (a Matrix, Vector, etc. should be in a standard math/physics library). Also if you want to use a different game engine you have to switch all of your libraries (more waste of time). But at the same time if a company (or person), put many hours into a tool they should be able to get paid for it. Well I won't go on a long rant, but in high school and college it was hard to find an editor or a 3D modeller that didn't cost a fortune. Fortunately, some companies now make personal editions or cheaper editions for aspiring developers. I'll just stop there... My opinion is beginning to be swayed towards libraries being open with a type of FreeBSD license, while tools should be priced accordingly (for students/indies and companies). But anyways I've used Torque, OGRE, GeometricTools (used to be Wild-Magic), and lately I've been using OSG. My main problem is that all of those are in C++, which means I would have to redo all (or parts) of those libraries. Guess I better get coding... PS: I like the reference to the 1985 Bears :) And if your studio needs any programmers for this brand new engine... let me know.
Yeah, I agree somewhat with what you are saying. The culture of the industry has ultimately helped cause the situation where there is soooo much bad code floating around. Most of it is private, and you never see it until you are getting paid to fix it :/ But even the commercial engines are a joke - they are mostly old engines with generation after generation of layers over them.

I'm still split between having a fully open engine or not, but I believe the core tools should be open. Ultimately I would love to be able to publish an open source engine with licensed artwork, but I'm not sure how realistic that is. Most of the engines you mentioned are nowhere near production quality, although I think bits like the Open Dynamics Engine can save a company a LOT of money. Ideally more and more companies would be using open libraries and contributing to them. Then, maybe we can spend more time focusing on gameplay and artwork....
Dec 15 2005
parent Lucas Goss <lgoss007 gmail.com> writes:
 ...Most of the engines you mentioned are nowhere near production quality, 
 although I think bits like the Open Dynamics Engine can save a company a 
 LOT of money.
True, though Torque was used to create Tribes (yeah that code is a mess), and Dave Eberly worked for NetImmerse (which is now Gamebryo I believe), so I imagine his design and coding style would be similar since he was the lead engineer there. Also OSG is used quite a bit in the VIS/SIM industry for production, though not as much for games. But Pirates of the XXI Century is using OSG: http://www.openscenegraph.org/osgwiki/pmwiki.php/Screenshots/DIOSoftPirates But alas, I don't quite have enough money for something like the Unreal Engine or Gamebryo.
Dec 16 2005
prev sibling next sibling parent reply Bruno Medeiros <daiphoenixNO SPAMlycos.com> writes:
Lucas Goss wrote:
 JT wrote:
 
 I also think D is an ideal game programming language. In fact, game 
 programming is how I found D in the first place. I was looking for a 
 language that took all of the intentions of C++ but got rid of all of 
 the bad (or difficult) things of trying to be compatible with C (or use 
 C as a subset).
 
Same here. I first heard about D in a slashdot article some years ago. I looked at it at a glance and found some of the basic features (i.e. C++ fixes) very cool (no header files, defined primitive type sizes, etc.) but then left it at that.

It was only a year later, after working on my first "real" C++ project (and by real, I mean one big enough to get some good perspective on C++, not just some 2-3 cpp file projects) - an academic semester-length project, which was a game - that I immediately and irrevocably got sick of C++. How could anyone ever program in *that*? It fraking gives me shivers.. :/

So I then started looking into D, and I've since stopped any hobbyist multimedia app/game learning and instead started learning about D, participating in the newsgroup, etc.

-- 
Bruno Medeiros - CS/E student
"Certain aspects of D are a pathway to many abilities some consider to be... unnatural."
Dec 15 2005
parent Sean Kelly <sean f4.ca> writes:
Bruno Medeiros wrote:
 It was only a year later, after working on my first "real" (and by real,
 I mean a big enough one to get some good perspective on C++, not just 
 some 2-3 cpp files projects) C++ academic semester-length project (which
 was a game), that I immediately and irrevocably got sick of C++. How
 could any one ever program in *that*? I fraking gives me shivers.. :/
That's the big problem with C++: it supports so many different programming techniques that maintaining a coherent design philosophy in a team can be extremely difficult. I found that with D my productivity quite rapidly exceeded what it is in C++ (and I'm no slouch at C++), and the code was more readable and easier to maintain. That alone makes D a big win in my book. Sean
Dec 15 2005
prev sibling parent reply clayasaurus <clayasaurus gmail.com> writes:
Lucas Goss wrote:
 I also think D is an ideal game programming language. 
Even a newbie like myself can concur on that note :-P
 But anyways I've used Torque, OGRE, GeometricTools (used to be 
 Wild-Magic), and lately I've been using OSG. My main problem is that all 
 of those are in C++, which means I would have to redo all (or parts) of 
 those libraries. Guess I better get coding...
Hrm.. well it isn't that bad. You can help out the Sinbad folks once that gets rolling ( http://www.dsource.org/projects/sinbad/ ). Also, you don't necessarily have to convert all C++ code to D, you can create a C++ --> C --> D wrapper.
Dec 15 2005
parent Lucas Goss <lgoss007 gmail.com> writes:
clayasaurus wrote:
 Hrm.. well it isn't that bad. You can help out the Sinbad folks once 
 that gets rolling ( http://www.dsource.org/projects/sinbad/ ).
 
 Also, you don't necessarily have to convert all C++ code to D, you can 
 create a C++ --> C --> D wrapper.
Yeah I've looked at Sinbad and I was somewhat interested in it, but then I switched from OGRE to OSG for various reasons. But if the Sinbad team really does do a rewrite I'd be up for that. Wrappers for some reason rub me the wrong way (but I do them when I must).
Dec 16 2005
prev sibling next sibling parent reply Dave <Dave_member pathlink.com> writes:
In article <dnhvoi$dja$1 digitaldaemon.com>, Walter Bright says...
Here's a lively debate over in comp.lang.c++.moderated people might be
interested in:

http://groups.google.com/group/comp.lang.c++.moderated/browse_thread/thread/60117e9c1cd1c510/c92f7fd0dc9fedd1?lnk=st&q=safer+better+c%2B%2B&rnum=1&hl=en
#c92f7fd0dc9fedd1
I don't know whether this would have any bearing on that group, but maybe you could mention your experiences implementing DMDScript, since this is publicly available, non-trivial code that they could look over?

On my machines, the D version is 33% (AMD) to 50% (P4) faster than the C++ version running the included sieve.ds benchmark, and I believe you said that it was implemented with a 10% reduction in lines of code along with other advantages.

BTW - Does the D version also build faster?

Thanks,

- Dave
Dec 12 2005
parent "Walter Bright" <newshound digitalmars.com> writes:
"Dave" <Dave_member pathlink.com> wrote in message
news:dnk67g$1otg$1 digitaldaemon.com...
 I don't know whether this would have any bearing on that group, but maybe 
 you could mention your experiences implementing DMDScript, since this is 
 publicly available, non-trivial code that they could look over?
Yes, I do use this as a case study.
 On my machines, the D version is 33% (AMD) to 50% (P4) faster than the C++ 
 version running the included sieve.ds benchmark, and I believe you said 
 that it was implemented with a 10% reduction in lines of code along with 
 other advantages.
It's about a 30% reduction in source code size.
 BTW - Does the D version also build faster?
Yes, much, though I haven't measured it.
Dec 12 2005
prev sibling parent Derek Parnell <derek psych.ward> writes:
On Sun, 11 Dec 2005 11:40:53 -0800, Walter Bright wrote:

 Here's a lively debate over in comp.lang.c++.moderated people might be
 interested in:
"The Debate" is also continuing in other forums, in one way or another. Here is a quote from the Euphoria Language forum ...
 There is no way D could be as fast as highly optimized C. D is
 loaded with features, including garbage collection (which slows
 things down a little). I suppose D looks superior to C and C++,
 but looks more like feces against Euphoria in most cases.
I don't regard this as a well-thought-out opinion, but it does show that D is in front of other audiences as well as the C/C++ crowd.

For the record, I think Euphoria *and* D are brilliant languages, but they address different problem spaces. Comparing them with each other is a pointless exercise; a bit like comparing Mack trucks with Ferrari racing cars - both are great, but you wouldn't want one trying to do the job of the other.

Euphoria also uses garbage collection, has static and dynamic typing, and is interpreted; but as far as interpreters go, it is extremely fast. It takes Euphoria about 3-4 times longer than D to run D's Word Count sample program.

-- 
Derek
(skype: derek.j.parnell)
Melbourne, Australia
"A learning experience is one of those things that says,
'You know that thing you just did? Don't do that.'" - D.N. Adams
14/12/2005 1:37:15 PM
Dec 13 2005