
digitalmars.D - Bug or plain misunderstanding?

reply "Bob W" <nospam aol.com> writes:
Hi,

after a couple of months of absence from D and
this forum, I am officially still busy reducing
the backlog of work I was accumulating due to
some unforeseen events. So please don't tell
anybody I was here!

D seems to be thriving and is probably right on
track - my old programs still compile with 0.145.

However, I have noticed one little dmd behaviour,
which I was not expecting:

When warnings are enabled with '-w' and if
the program contains unreachable code (e.g.
due to a casual insertion of a return statement
for debugging purposes), the compiler will
issue the proper warning but it will not
generate any code (i.e. exe file).

Should I understand the compiler usage info
"-w ... enable warnings" as
"-w ... enable warnings and treat them as errors"
or is this compiler behaviour unintentional?


For further clarification I have included this
little demo program:



///////////////////////////////////////////////////////////
//
// Possible bug in dmd 0.145:
//
// When attempting to disable a portion of code with an
// early return statement, dmd properly issues a warning
// if the program is compiled with '-w'.
// But it apparently treats the warning as an error, so
// it does not generate any code. If '-w' is omitted,
// everything works as expected.
//
///////////////////////////////////////////////////////////


import std.stdio;


int main() {
  writefln();
  writefln("Sample program demonstrating '-w' bug:");
  writefln();

/*
  // Uncomment this section and compile program with '-w'
  // to experience warning without code generation.
  // (You might want to erase an eventually existing old
  // .exe file first in order to verify this.)

  writefln("Attempt to stop program after this message.");
  return(0);  // <<<< stop the nonsense here!
*/

  writefln();
  transfertoSCO();
  installMSDOS();
  emailCongrats();
  return(0);
}


void transfertoSCO() {
  writefln("    Donating $100 000 to SCO Group Inc ...");
  /* sophisticated code goes here */
  writefln("... debiting Mr. Torvalds' bank account ...");
  /* sophisticated code goes here */
  writefln("... transaction successfully completed.");
  writefln();  writefln();
}

void installMSDOS() {
  writefln("    Erasing Windows XP and/or Linux ...");
  /* sophisticated code goes here */
  writefln("... installing MSDOS 2.01 ...");
  /* sophisticated code goes here */
  writefln("... MSDOS successfully installed.");
  writefln();  writefln();
}

void emailCongrats() {
  writefln("    Preparing email to GWB ...");
  /* sophisticated code goes here */
  writefln("... 'congrats to success finding WMD' ...");
  /* sophisticated code goes here */
  writefln("... attempting to send email ...");
  writefln("... reply from operating system:");
  writefln("    what is 'email' ???");
  writefln("    Abort, Retry, Fail?");
  writefln();  writefln();
}
Feb 05 2006
parent reply "Derek Parnell" <derek psych.ward> writes:
On Mon, 06 Feb 2006 06:38:50 +1100, Bob W <nospam aol.com> wrote:

 Hi,

 after a couple of months of absence from D and
 this forum, I am officially still busy reducing
 the backlog of work I was accumulating due to
 some unforeseen events. So please don't tell
 anybody I was here!

 D seems to be thriving and is probably right on
 track - my old programs still compile with 0.145.

 However, I have noticed one little dmd behaviour,
 which I was not expecting:

 When warnings are enabled with '-w' and if
 the program contains unreachable code (e.g.
 due to a casual insertion of a return statement
 for debugging purposes), the compiler will
 issue the proper warning but it will not
 generate any code (i.e. exe file).

 Should I understand the compiler usage info
 "-w ... enable warnings" as
 "-w ... enable warnings and treat them as errors"
 or is this compiler behaviour unintentional?
As far as Walter is concerned, a 'warning' is really an error too. So yes, DMD will display warnings but treat them as errors such that the compiler doesn't generate code.

--
Derek Parnell
Melbourne, Australia
Feb 05 2006
parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Derek Parnell" <derek psych.ward> wrote in message 
news:op.s4imylg86b8z09 ginger.vic.bigpond.net.au...
 As far as Walter is concerned, a 'warning' is really an error too. So yes, 
 DMD will display warnings but treat them as errors such that the compiler 
 doesn't generate code.
That's right. If the programmer cares about warnings, he'll want them to be treated as errors. Otherwise, the warning message could scroll up and all too easily be overlooked.
Feb 05 2006
next sibling parent reply "Derek Parnell" <derek psych.ward> writes:
On Mon, 06 Feb 2006 07:34:37 +1100, Walter Bright  
<newshound digitalmars.com> wrote:

 "Derek Parnell" <derek psych.ward> wrote in message
 news:op.s4imylg86b8z09 ginger.vic.bigpond.net.au...
 As far as Walter is concerned, a 'warning' is really an error too. So yes, DMD will display warnings but treat them as errors such that the compiler doesn't generate code.
That's right. If the programmer cares about warnings, he'll want them to be treated as errors. Otherwise, the warning message could scroll up and all too easily be overlooked.
However, not everyone thinks like Walter. I, for example, regard a warning as different from an error. A warning tells me about an issue where the compiler has had to make an assumption in order to generate code. An error tells me about an issue where the compiler has decided that there are no assumptions that can be made, and thus it can't generate valid code.

A warning alerts me to the fact that the compiler has made an assumption on my behalf, and thus I might want to do something to avoid a 'wrong' compiler assumption. These assumptions are not, per se, errors. Thus if "-w" is not used the compiler generates code, and so I can't see why the compiler still can't generate code when "-w" is used. All I want to know is where the compiler has assumed things I didn't actually intend.

--
Derek Parnell
Melbourne, Australia
Feb 05 2006
next sibling parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"Derek Parnell" <derek psych.ward> wrote in message
news:op.s4iq6vrk6b8z09 ginger.vic.bigpond.net.au...
 On Mon, 06 Feb 2006 07:34:37 +1100, Walter Bright
 <newshound digitalmars.com> wrote:

 "Derek Parnell" <derek psych.ward> wrote in message
 news:op.s4imylg86b8z09 ginger.vic.bigpond.net.au...
 As far as Walter is concerned, a 'warning' is really an error too. So yes, DMD will display warnings but treat them as errors such that the compiler doesn't generate code.
That's right. If the programmer cares about warnings, he'll want them to be treated as errors. Otherwise, the warning message could scroll up and all too easily be overlooked.
However, not everyone thinks like Walter. I, for example, regard a warning as different from an error. A warning tells me about an issue where the compiler has had to make an assumption in order to generate code. An error tells me about an issue where the compiler has decided that there are no assumptions that can be made, and thus it can't generate valid code.

A warning alerts me to the fact that the compiler has made an assumption on my behalf, and thus I might want to do something to avoid a 'wrong' compiler assumption. These assumptions are not, per se, errors. Thus if "-w" is not used the compiler generates code, and so I can't see why the compiler still can't generate code when "-w" is used. All I want to know is where the compiler has assumed things I didn't actually intend.
Pretty much agree. There should be a complementary "warnings as errors" flag as well.

Walter's sentiments are agreeable when one is dealing with production builds, under which only a fool would not invoke "warnings as errors", but when experimenting, or working on something new, or inserting returns for debugging (something Walter strongly advocates, btw), or myriad other "messing around" purposes, it's going to be more hindrance than help.

Sorry to start off the week with an insult, but this is another instance where Walter's personal prejudices impact negatively on the "sensibleness" of the language/compiler.
Feb 05 2006
parent reply "Anders F Björklund" <afb algonet.se> writes:
Matthew wrote:

 Walter's sentiments are agreeable when one is dealing with production
 builds, under which only a fool would not invoke "warnings as errors", but
 when experimenting, or working on something new, or inserting returns for
 debugging (something Walter strongly advocates, btw), or myriad other
 "messing around" purposes, it's going to be more hindrance than help.
Phobos doesn't use warnings, though? (i.e. it doesn't compile with -w)

I used to ship some of my libraries with a CFLAGS of "-Wall -Werror", but it only seemed to trip on the *system* headers on the client end and made the library less portable in the end. So I changed to -Wall.

--anders
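The gcc behaviour mentioned here can be sketched directly; this is a minimal illustration (file names made up) of the difference the thread is arguing about: -Wall alone still produces a binary, while adding -Werror turns the same diagnostic into a hard stop, which is effectively what dmd's -w does.

```shell
# Write a file that trips a -Wall warning (unused variable).
cat > demo.c <<'EOF'
int main(void) {
    int unused = 42;   /* -Wall warns: unused variable */
    return 0;
}
EOF

# -Wall alone: the warning is printed, but a binary is still produced.
gcc -Wall -o demo demo.c && echo "built with -Wall"

# -Wall -Werror: the same warning becomes an error; no binary appears.
rm -f demo
if ! gcc -Wall -Werror -o demo demo.c 2>/dev/null; then
    echo "refused with -Werror"
fi
```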
Feb 06 2006
parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"Anders F Björklund" <afb algonet.se> wrote in message
news:ds7mir$30tk$1 digitaldaemon.com...
 Matthew wrote:

 Walter's sentiments are agreeable when one is dealing with production builds, under which only a fool would not invoke "warnings as errors", but when experimenting, or working on something new, or inserting returns for debugging (something Walter strongly advocates, btw), or myriad other "messing around" purposes, it's going to be more hindrance than help.
Phobos doesn't use warnings, though ? (i.e. doesn't compile with -w)
LOL! I didn't know that. I'm still waiting for Walter to explain the paradox between the bug-finding activities that he recommends to just about everyone that ever raises a compiler problem and not being able to generate code and link with something that emits a warning. (I may be waiting some time ...)
Feb 06 2006
parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Matthew" <matthew hat.stlsoft.dot.org> wrote in message 
news:ds8jkh$qf8$1 digitaldaemon.com...
 I'm still waiting for Walter to explain the paradox between the bug-finding activities that he recommends to just about everyone that ever raises a compiler problem and not being able to generate code and link with something that emits a warning. (I may be waiting some time ...)
If you can show an example where being able to execute code that generates a warning will make the cause of the warning less mysterious, I'd be very interested.
Feb 06 2006
parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"Walter Bright" <newshound digitalmars.com> wrote in message
news:ds8lvv$s2k$1 digitaldaemon.com...
 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message
 news:ds8jkh$qf8$1 digitaldaemon.com...
 I'm still waiting for Walter to explain the paradox between the bug-finding activities that he recommends to just about everyone that ever raises a compiler problem and not being able to generate code and link with something that emits a warning. (I may be waiting some time ...)
If you can show an example where being able to execute code that generates a warning will make the cause of the warning less mysterious, I'd be very interested.
Are you affecting obtuseness as a means of stimulating debate for the people reading but not participating? (Not rhetoric. I'm genuinely confused.)

Anyway, here's the situation as I see it, as I can most plainly express it:

1. You recommend that people chop down code in order to isolate bugs. Such bugs can include compile-time and runtime bugs.

2. You offer a compiler that does not allow a user to generate code while observing warnings.

These two are in obvious conflict. Anyone with any problem compiling any compiled language will, in following your stipulation to the chop-and-test approach, have to remove code. Those with a version control system, a huge Undo buffer and perfect faith in their IDDE not to crash may genuinely _remove_ code. Those more careful/careworn might instead put in an early return statement here and there. Since the last thing one wants to do when debugging is to introduce new bugs, such users might sensibly wish to continue to be informed of potential problems by having their compiler emit as many warnings as it can.

This is so obvious to anyone that's ever even touched the hem of a debugger that no example should be necessary. I'm left grasping for a credible explanation as to why you suggest one is necessary, other than perhaps to try and wear me down in fatuous effort.

Please point out the flaw in my reasoning of your paradox, and, further, explain to me how, in following this technique in many circumstances where debugging has been necessitated in C/C++/D/Ruby/Java/.NET, I have gone so astray in my practice. What should I have been doing instead? This is not rhetoric for the sake of it: please do elucidate.

If you cannot, please explain how DMD's current contradiction of 1 and 2 is not an arbitrary shackle to its users, one based solely on your own work practices and prejudices that, I hope, you would have to concede may not map to those of all other developers.
Feb 06 2006
next sibling parent reply Sean Kelly <sean f4.ca> writes:
I'm not sure I see the problem.  Since the warnings mode is optional, 
what's to stop you from building with warnings until you fix everything 
it flags that you care about then turn the option off and build to test?

My only issue is that with "warnings as errors" as the only 
warnings-enabled option, there must be a code-level way to fix anything 
the compiler warns about.  Otherwise, shops that build with warnings 
enabled as the release mode would be SOL if the compiler warns about 
something intended.

This is typically how I do things with VC++: require no messages at WL:4 
and preprocessor directives to situationally disable any warnings about 
utterly stupid things (like the "standard function is deprecated" 
business in VC 2005).  Things like implicit narrowing conversions are 
flagged by the compiler and casts are added where appropriate, etc.  To 
me, fixing warnings is mostly a matter of either fixing the bug or 
telling the compiler "yes, I meant to do that."  I can't think of a 
situation where I'd want to ship with some warning messages still 
displaying, assuming I cared enough to enable them.


Sean
Feb 06 2006
parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"Sean Kelly" <sean f4.ca> wrote in message
news:ds927o$156u$1 digitaldaemon.com...
 I'm not sure I see the problem.  Since the warnings mode is optional,
 what's to stop you from building with warnings until you fix everything
 it flags that you care about then turn the option off and build to test?

 My only issue is that with "warnings as errors" as the only
 warnings-enabled option, there must be a code-level way to fix anything
 the compiler warns about.  Otherwise, shops that build with warnings
 enabled as the release mode would be SOL if the compiler warns about
 something intended.

 This is typically how I do things with VC++: require no messages at WL:4
 and preprocessor directives to situationally disable any warnings about
 utterly stupid things (like the "standard function is deprecated"
 business in VC 2005).  Things like implicit narrowing conversions are
 flagged by the compiler and casts are added where appropriate, etc.  To
 me, fixing warnings is mostly a matter of either fixing the bug or
 telling the compiler "yes, I meant to do that."  I can't think of a
 situation where I'd want to ship with some warning messages still
 displaying, assuming I cared enough to enable them.
You've replied to my last post, but none of this seems to address anything I raised. Did you reply to the wrong part?

If this is a reply to my post, then I'm stumped. I cannot express my point - that seems to me to be one of the most obvious things ever discussed on this NG - any more clearly. Maybe I just have to assume I'm mad. (Not entirely without potential, as a hypothesis.)

For the record, I'm not talking about shipping with warnings still firing. That's madness to do, and I suspect my opinion on that, and practice in not doing so, to be identical to your own.

(Interestingly, regarding the reluctance to allow compiler flags for warning suppression: the only warnings I ever get, and which I am dangerously forced to ignore wholesale, are from linkers, since there seems to be no mechanism to advise them to selectively shut up. I know that's only real-world C and C++ experience, and therefore of little consequence to the Decision-making process, but I think it makes its point pretty clearly.)

Anyway, my entire point is tightly focused on the contradiction between Walter's stock technique for debugging, which he advises all comers to adopt whenever they report a problem, and the restrictions imposed by the DMD compiler that directly defeat that advice. I await that point being addressed, rather than some deflectory digression, with extreme patience. :-)
Feb 06 2006
parent Sean Kelly <sean f4.ca> writes:
Matthew wrote:
 "Sean Kelly" <sean f4.ca> wrote in message
 news:ds927o$156u$1 digitaldaemon.com...
 I'm not sure I see the problem.  Since the warnings mode is optional,
 what's to stop you from building with warnings until you fix everything
 it flags that you care about then turn the option off and build to test?

 My only issue is that with "warnings as errors" as the only
 warnings-enabled option, there must be a code-level way to fix anything
 the compiler warns about.  Otherwise, shops that build with warnings
 enabled as the release mode would be SOL if the compiler warns about
 something intended.

 This is typically how I do things with VC++: require no messages at WL:4
 and preprocessor directives to situationally disable any warnings about
 utterly stupid things (like the "standard function is deprecated"
 business in VC 2005).  Things like implicit narrowing conversions are
 flagged by the compiler and casts are added where appropriate, etc.  To
 me, fixing warnings is mostly a matter of either fixing the bug or
 telling the compiler "yes, I meant to do that."  I can't think of a
 situation where I'd want to ship with some warning messages still
 displaying, assuming I cared enough to enable them.
You've replied to my last post, but none of this seems to address anything I raised. Did you reply to the wrong part?
No, but perhaps this was somewhat of a tangent. I was mostly suggesting that you compile with warnings on to ensure no new errors are introduced, and if you still can't produce a linkable object for some reason then turn warnings off and make sure the run-time bug still exists. Warnings are optional, after all.
 For the record, I'm not talking about shipping with warnings still firing.
 That's madness to do, and I suspect my opinion on that, and practice in not
 doing so, to be identical to your own.
I actually do release builds with warnings enabled for shipping applications, but the warnings I'm used to are compile-time ones. If D has run-time warnings, that's new to me (I've never actually used "-w" in D--shame on me). To me, the benefit of warnings is to point out potential coding mistakes that are syntactically valid--implicit narrowing conversions being the big one IMO--and to do so at compile-time.
 Anyway, my entire point is tightly focused on the contradiction between
 Walter's stock technique for debugging, which he advises all comers to adopt
 whenever they report a problem, and the restrictions imposed by the DMD
 compiler that directly defeat that advice. I await that point being
 addressed, rather than some deflectory digression, with extreme patience.
I apologize. I think I'm simply missing the point. Sean
Feb 07 2006
prev sibling next sibling parent reply "Regan Heath" <regan netwin.co.nz> writes:
On Tue, 7 Feb 2006 13:32:40 +1100, Matthew <matthew hat.stlsoft.dot.org>  
wrote:
 "Walter Bright" <newshound digitalmars.com> wrote in message
 news:ds8lvv$s2k$1 digitaldaemon.com...
 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message
 news:ds8jkh$qf8$1 digitaldaemon.com...
 I'm still waiting for Walter to explain the paradox between the bug-finding activities that he recommends to just about everyone that ever raises a compiler problem and not being able to generate code and link with something that emits a warning. (I may be waiting some time ...)
If you can show an example where being able to execute code that generates a warning will make the cause of the warning less mysterious, I'd be very interested.
Are you affecting obtuseness as a means of stimulating debate for the people reading but not participating? (Not rhetoric. I'm genuinely confused.)

Anyway, here's the situation as I see it, as I can most plainly express it:

1. You recommend that people chop down code in order to isolate bugs. Such bugs can include compile-time and runtime bugs.

2. You offer a compiler that does not allow a user to generate code while observing warnings.

These two are in obvious conflict. Anyone with any problem compiling any compiled language will, in following your stipulation to the chop-and-test approach, have to remove code. Those with a version control system, a huge Undo buffer and perfect faith in their IDDE not to crash may genuinely _remove_ code.
Or copy the file(s) involved and chop the cop(y|ies).
 Those more careful/careworn might instead put in an early return statement here and there. Since the last thing one wants to do when debugging is to introduce new bugs, such users might sensibly wish to continue to be informed of potential problems by having their compiler emit as many warnings as it can.
True, but not strictly required to produce a test case. Regan
Feb 06 2006
parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
 Anyway, here's the situation as I see it, as I can most plainly express it:

 1. You recommend that people chop down code in order to isolate bugs. Such bugs can include compile-time and runtime bugs.

 2. You offer a compiler that does not allow a user to generate code while observing warnings.

 These two are in obvious conflict. Anyone with any problem compiling any compiled language will, in following your stipulation to the chop-and-test approach, have to remove code. Those with a version control system, a huge Undo buffer and perfect faith in their IDDE not to crash may genuinely _remove_ code.
Or copy the file(s) involved and chop the cop(y|ies).
Indeed. But I think you'd agree that that adds considerable overhead on to what is usually a very chop-intensive activity. Chop-and-test is itself a horribly arduous task. To have to be making potentially 10s or even 100s of copies of a given file, and keeping track of what changes go where, is going to turn it into a really time-consuming and horribly arduous task.

I don't want to always be coming across as this nasty sarcastic English cynic, who just refuses to buy into the Dream, but D is, for one thing, touted as being superior in productivity to C/C++. And then Walter imposes restrictions on the dominant (and largely defining) compiler based on two things: his needs/wants/experience as a compiler writer, and his experience/practices as a C/C++ coder. While I have huge respect for his achievements in both these fields, they do not represent *all* knowledge/experience/practice (nor even the best, perhaps). _In this particular case_, Walter's ideas/prejudices/whatever are a complete contradiction of the claims of D, hindering rather than helping.

As my mother always said (and continues to say) to me: "Watch that hubris".
 Those more careful/careworn might instead put in an early return statement here and there. Since the last thing one wants to do when debugging is to introduce new bugs, such users might sensibly wish to continue to be informed of potential problems by having their compiler emit as many warnings as it can.
True, but not strictly required to produce a test case.
Not at all. There are many ways. But why proscribe a more productive one in favour of less productive ones in the name of one person's practice?

And, before anyone says it: yes, I know it's Walter's language, and he can do what he likes. My radical hypothesis is that by facilitating the (different) practices of other developers, more developers are likely to be attracted to the language.
Feb 06 2006
parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Matthew" <matthew hat.stlsoft.dot.org> wrote in message 
news:ds9b89$1bq2$1 digitaldaemon.com...
 Or copy the file(s) involved and chop the cop(y|ies).
Indeed. But I think you'd agree that that adds considerable overhead on to what is usually a very chop intensive activity. chop-and-test is itself a horrible arduous task.
??
 To have to be making potentially 10s or even 100s of copies of a given file,
No need for more than two. I have to wonder what you are doing.
 and keeping track of what changes go where,
No need for that. It's a binary thing - does this chop make the problem disappear or the problem stay? If it stays, forget about the previous changes. If it goes away, copy the previous version back and chop something else. No need to keep a 'stack' of more than one change.
 is going to turn it into a really time-consuming and horribly arduous task.
If it is, you're going about it inefficiently. I use an editor that automatically makes a backup copy when I save an edited file. Then, if the edit is unsuccessful, I just copy the backup copy back over the edited copy. Nothing arduous.
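The backup-and-restore loop described here can be sketched in a few lines of shell. The "does the bug still reproduce?" step is a stand-in (just a grep for a marker line in a made-up file); in real use it would be a compile-and-run of the test case:

```shell
# Start from a failing case; the goal is the smallest file that still fails.
printf 'keep\nBUG\nkeep\n' > case.txt

cp case.txt case.txt.bak          # backup before each chop (the editor's .bak)
sed '1d' case.txt.bak > case.txt  # the chop: delete a chunk (here, line 1)

if grep -q BUG case.txt; then
    # The problem stays: keep the smaller case, forget the backup.
    echo "chop kept the bug"
else
    # The problem disappeared: restore the backup and chop elsewhere.
    cp case.txt.bak case.txt
    echo "restored, chop elsewhere"
fi
```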
Feb 06 2006
next sibling parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"Walter Bright" <newshound digitalmars.com> wrote in message
news:ds9dn9$1ddr$1 digitaldaemon.com...
 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message
 news:ds9b89$1bq2$1 digitaldaemon.com...
 Or copy the file(s) involved and chop the cop(y|ies).
Indeed. But I think you'd agree that that adds considerable overhead on to what is usually a very chop intensive activity. chop-and-test is itself a horrible arduous task.
??
 To have to be making potentially 10s or even 100s of copies of a given file,
No need for more than two. I have to wonder what you are doing.
 and keeping track of what changes go where,
No need for that. It's a binary thing - does this chop make the problem disappear or the problem stay? If it stays, forget about the previous changes. If it goes away, copy the previous version back and chop something else. No need to keep a 'stack' of more than one change.

 is going to turn it into a really time-consuming and horrible arduous task.
If it is, you're going about it inefficiently. I use an editor that automatically makes a backup copy when I save an edited file. Then, if the edit is unsuccessful, I just copy the backup copy back over the edited copy. Nothing arduous.
I'd love to be able to poo-poo your statements as laughable fantasy, but I have no doubt that you believe what you're saying and that you're speaking the truth about your own experiences. What continues to sadden me is the apparent narrow-mindedness in that you project a certain belief that your experiences are enough of a sample to make pronouncements and direct tool strategy for everyone else.

I can tell you that there are plenty of times when a binary approach such as you describe has not proved enough for me, perhaps as a consequence of my working with non-trivial C++ code (templates, etc.). There are often instances where a combinatorial effect on the compiler cannot be reduced in the manner you describe, and many more where the effort involved to bring everything into one file is far more than necessary compared with modifying several in parallel. I know you don't believe me about that, but that doesn't make it not so. Perhaps the answer is that I should not have the temerity to stretch the language to its limits? But I think not.

I continue to support DMC++ and D and you personally, and I want all to prosper highly, but you do seem to live in a world that does not relate to me and my experiences, and I am not yet sufficiently deluded to believe that I'm unique. Common sense therefore compels me to think that you have an overly narrow and prescriptive view of software engineering, and that that influences the progress of D to its detriment.

Matthew

btw, I use chop-and-test, and recognise its value without reservation. But the same applies for flossing one's teeth. Doesn't stop either from being a vile activity. None of which has much if anything to do with my specific point, to which I continue to await any response.
Feb 07 2006
parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Matthew" <matthew hat.stlsoft.dot.org> wrote in message 
news:ds9tq1$1r8n$1 digitaldaemon.com...
 I'd love to be able to poo-poo your statements as laughable fantasy, but I have no doubt that you believe what you're saying and that you're speaking the truth about your own experiences. What continues to sadden me is the apparent narrow-mindedness in that you project a certain belief that your experiences are enough of a sample to make pronouncements and direct tool strategy for everyone else.

 I can tell you that there are plenty of times when a binary approach such as you describe has not proved enough for me, perhaps as a consequence of my working with non-trivial C++ code (templates, etc.). There are often instances where a combinatorial effect on the compiler cannot be reduced in the manner you describe, and many more where the effort involved to bring everything in one file is far more than necessary compared with modifying several in parallel. I know you don't believe me about that, but that doesn't make it not so.
I know that you find this to not work, and you've often sent me those cases. None of them resisted reduction to a few lines.

I've been working on solving template related bugs for 10 years :-(. At the end of the process, each and every one of the minimal examples goes into the DMC++ test suite, so I have quite an extensive test suite (maybe a thousand examples just for templates). Every one was reduced using the process I described, and they are all just a few lines of code. And, as a pretty general rule, none of the problems made any sense until they were so reduced, and they often did start out as incomprehensibly complex templates.

There is a very solid factual basis for what I said works in isolating problems. The proof is not only in the DMC++ (and DMD) test suite, but in the irreducible cases you sent me that I reduced and sent back. If you want to dismiss such a successful track record, including success on your cases, as laughable fantasy, what can I say?

P.S. I might uselessly add that none of this was aided or hindered by -w.
Feb 07 2006
parent "Matthew" <matthew hat.stlsoft.dot.org> writes:
"Walter Bright" <newshound digitalmars.com> wrote in message
news:dsancf$2k92$1 digitaldaemon.com...
 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message
 news:ds9tq1$1r8n$1 digitaldaemon.com...
 I'd love to be able to poo-poo your statements as laughable fantasy, but I have no doubt that you believe what you're saying and that you're speaking the truth about your own experiences. What continues to sadden me is the apparent narrow-mindedness in that you project a certain belief that your experiences are enough of a sample to make pronouncements and direct tool strategy for everyone else.

 I can tell you that there are plenty of times when a binary approach such as you describe has not proved enough for me, perhaps as a consequence of my working with non-trivial C++ code (templates, etc.). There are often instances where a combinatorial effect on the compiler cannot be reduced in the manner you describe, and many more where the effort involved to bring everything in one file is far more than necessary compared with modifying several in parallel. I know you don't believe me about that, but that doesn't make it not so.
I know that you find this to not work, and you've often sent me those cases.
 None of them resisted reduction to a few lines.
More assumption: that I've needed to consult you on all the cases that I've found, and that I've done so in all cases. Neither of these is true.
 I've been working on solving template related bugs for 10 years :-(. At the
 end of the process, each and every one of the minimal examples goes into the
 DMC++ test suite, so I have quite an extensive test suite (maybe a thousand
 examples just for templates). Every one was reduced using the process I
 described, and they are all just a few lines of code.
None of that has the slightest relation to my point.
 And, as a pretty general rule, none of the problems made any sense until
 they were so reduced, and they often did start out as incomprehensibly
 complex templates.

 There is a very solid factual basis for what I said works in isolating
 problems. The proof is not only in the DMC++ (and DMD) test suite, but in
 the irreducible cases you sent me that I reduced and sent back. If you want
 to dismiss such a successful track record, including success on your cases,
 as laughable fantasy, what can I say?
Another furphy again. I don't poo-poo your track record. I point out that your software engineering is not monotonic, corresponding to only your experiences/opinions/practices, and that this detracts from D's general appeal and raises several manifest contradictions with your stated aims for D. Seems that several others are pointing that out now, so I'll leave it before it degenerates any further.
Feb 08 2006
prev sibling parent Derek Parnell <derek psych.ward> writes:
On Mon, 6 Feb 2006 21:55:56 -0800, Walter Bright wrote:

 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message 
 news:ds9b89$1bq2$1 digitaldaemon.com...
 Or copy the file(s) involved and chop the cop(y|ies).
Indeed. But I think you'd agree that that adds considerable overhead on to what is usually a very chop intensive activity. chop-and-test is itself a horrible arduous task.
??
 To have to be making potentially 10s or even 100s of
 copies of a given file,
No need for more than two. I have to wonder what you are doing.
 and keeping track of what changes go where,
No need for that. It's a binary thing - does this chop make the problem disappear or the problem stay? If it stays, forget about the previous changes. If it goes away, copy the previous version back and chop something else. No need to keep a 'stack' of more than one change.
 is going to turn it into a really time-consuming and horrible arduous 
 task.
If it is, you're going about it inefficiently. I use an editor that automatically makes a backup copy when I save an edited file. Then, if the edit is unsuccessful, I just copy the backup copy back over the edited copy. Nothing arduous.
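The chop-and-test procedure described above is language-agnostic, so it can be sketched in code. Here is a minimal, purely illustrative Python version (the names `chop_and_test` and `still_fails` are made up; the predicate stands in for "recompile the chopped copy and check whether the bug still reproduces"):

```python
# A minimal sketch of chop-and-test (divide-and-conquer reduction)
# applied to a list of source lines. `still_fails` is a hypothetical
# predicate standing in for "recompile and see if the bug remains".
def chop_and_test(lines, still_fails):
    """Repeatedly delete chunks while the failure persists."""
    chunk = len(lines) // 2
    while chunk >= 1:
        i = 0
        while i < len(lines):
            candidate = lines[:i] + lines[i + chunk:]  # the "chop"
            if still_fails(candidate):   # bug stays: keep the chop
                lines = candidate
            else:                        # bug gone: restore, chop elsewhere
                i += chunk
        chunk //= 2
    return lines

# Toy usage: the "bug" reproduces whenever the line "X" is present.
reduced = chop_and_test(["a", "X", "b", "c"], lambda ls: "X" in ls)
print(reduced)  # prints ['X']
```

Note that the loop only ever needs the current copy and the candidate chop, which matches the "at most two copies" point made elsewhere in this thread.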
I'm backing Walter's case here. Although doing the chop-retest can be boring to do, there is no need to make it hard to do and it always seems to give you a small code example of the bug.

-- 
Derek
(skype: derek.j.parnell)
Melbourne, Australia
"Down with mediocracy!"
8/02/2006 11:56:49 AM
Feb 07 2006
prev sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Matthew" <matthew hat.stlsoft.dot.org> wrote in message 
news:ds90n7$1411$1 digitaldaemon.com...
 "Walter Bright" <newshound digitalmars.com> wrote in message
 news:ds8lvv$s2k$1 digitaldaemon.com...
 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message
 news:ds8jkh$qf8$1 digitaldaemon.com...
 I'm still waiting for Walter to explain the paradox between the bug-finding
 activities that he recommends to just about everyone that ever raises a
 compiler problem and not being able to generate code and link with something
 that emits a warning. (I may be waiting some time ...)
If you can show an example where being able to execute code that generates a warning will make the cause of the warning less mysterious, I'd be very interested.
Are you affecting obtuseness as a means of stimulating debate for the people reading but not participating? (Not rhetoric. I'm genuinely confused.)
I genuinely have no idea what the problem is you're talking about.
 Anyway, here's the situation as I see it, as I can most plainly express it:
 1. You recommend that people chop down code in order to isolate bugs. Such
 bugs can include compile-time and runtime bugs.
 2. You offer a compiler that does not allow a user to generate code while
 observing warnings.

 These two are in obvious conflict.
No, they aren't. If the person doesn't understand why the warning is being generated, executing the code won't help in the slightest. If it isn't about the warning, just turn off warnings. Turning off the warnings isn't any more work than turning on "warn but keep going".
 Anyone with any problem compiling any
 compiled-language will, in following your stipulation to the chop-and-test
 approach, have to remove code.
You write as if there's something wrong with the chop and test approach, or that it's something unique. It's not unique (it's taught at better universities). It's a generally and widely used problem solving technique, also known as divide-and-conquer. And it's the only one that has a reasonable chance of success with a complex system. I've spent 30 years in engineering, and those people who do not know how to reduce the problem domain spent hours, days, *months* futzing about chasing phantoms, wasting time, trying things essentially at random, and never figuring out what's going wrong.
 Those with a version control system, a huge
 Undo buffer and perfect faith in their IDDE not to crash may genuinely
 _remove_ code. Those more careful/careworn might instead put in an early
 return statement here and there. Since the last thing one wants to do when
 debugging is to introduce new bugs, such users might sensibly wish to
 continue to be informed of potential problems by having their compiler 
 emit
 as many warnings as it can.
The way to do chop-and-test is to make a copy of your project, and work from the copy. Or at least make a full backup first. If you do not make a copy first, you are nearly guaranteed to screw up your source code, whether or not you are using CVS or have undo buffers. Warnings aren't going to help in the slightest. There's no reason whatsoever to worry about introducing new bugs doing this. If you are in a position where you are somehow unable to back up your project or work from a copy of it, then for whatever reason you've completely lost control over the project and have much bigger problems than warnings.
 Please point out the flaw in my reasoning of your paradox, and, further,
 explain to me how, in following this technique in many circumstances where
 debugging has been necessitated in C/C++/D/Ruby/Java/.NET, I have gone so
 astray in my practice. What should I have been doing instead?
Make a full copy of your project, and work from that.
 This is not
 rhetoric for the sake of it: please do elucidate. If you cannot, please
 explain how DMD's current contradiction of 1 and 2 is not an
 arbitrary shackle to its users,
You're worried about screwing up your project by applying divide-and-conquer, so you want warnings to continue. That is a misunderstanding of how to go about it. Make a copy of your project. Then hack and slash the copy until you've isolated the problem. Turn off the warnings. Stop worrying about introducing a bug - it's a sacrificial copy, not the original.
 one based solely on your own work practices and prejudices that,
 I hope, you would have to concede may not map to those of all other 
 developers.
Trying to use warnings as a substitute for making a backup is like a carpenter trying to hammer a nail with a screwdriver. It's the wrong answer, and trying to make a screwdriver that can be used as a hammer is also the wrong answer.
Feb 06 2006
parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"Walter Bright" <newshound digitalmars.com> wrote in message
news:ds9a6g$1b2m$1 digitaldaemon.com...
 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message
 news:ds90n7$1411$1 digitaldaemon.com...
 "Walter Bright" <newshound digitalmars.com> wrote in message
 news:ds8lvv$s2k$1 digitaldaemon.com...
 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message
 news:ds8jkh$qf8$1 digitaldaemon.com...
 I'm still waiting for Walter to explain the paradox between the bug-finding
 activities that he recommends to just about everyone that ever raises a
 compiler problem and not being able to generate code and link with something
 that emits a warning. (I may be waiting some time ...)
If you can show an example where being able to execute code that generates a warning will make the cause of the warning less mysterious, I'd be very interested.
Are you affecting obtuseness as a means of stimulating debate for the people reading but not participating? (Not rhetoric. I'm genuinely confused.)
I genuinely have no idea what the problem is you're talking about.
 Anyway, here's the situation as I see it, as I can most plainly express it:
 1. You recommend that people chop down code in order to isolate bugs. Such
 bugs can include compile-time and runtime bugs.
 2. You offer a compiler that does not allow a user to generate code while
 observing warnings.

 These two are in obvious conflict.
No, they aren't. If the person doesn't understand why the warning is being generated, executing the code won't help in the slightest. If it isn't about the warning, just turn off warnings. Turning off the warnings isn't any more work than turning on "warn but keep going".

 Anyone with any problem compiling any
 compiled-language will, in following your stipulation to the chop-and-test
 approach, have to remove code.
You write as if there's something wrong with the chop and test approach, or
 that it's something unique. It's not unique (it's taught at better
 universities). It's a generally and widely used problem solving technique,
 also known as divide-and-conquer. And it's the only one that has a
 reasonable chance of success with a complex system. I've spent 30 years in
 engineering, and those people who do not know how to reduce the problem
 domain spent hours, days, *months* futzing about chasing phantoms, wasting
 time, trying things essentially at random, and never figuring out what's
 going wrong.

 Those with a version control system, a huge
 Undo buffer and perfect faith in their IDDE not to crash may genuinely
 _remove_ code. Those more careful/careworn might instead put in an early
 return statement here and there. Since the last thing one wants to do when
 debugging is to introduce new bugs, such users might sensibly wish to
 continue to be informed of potential problems by having their compiler emit
 as many warnings as it can.
The way to do chop-and-test is to make a copy of your project, and work from the copy. Or at least make a full backup first. If you do not make a copy first, you are nearly guaranteed to screw up your source code, whether or not you are using CVS or have undo buffers. Warnings aren't going to help in the slightest.

 There's no reason whatsoever to worry about introducing new bugs doing this.
 If you are in a position where you are somehow unable to back up your
 project or work from a copy of it, then for whatever reason you've
 completely lost control over the project and have much bigger problems than
 warnings.


 Please point out the flaw in my reasoning of your paradox, and, further,
 explain to me how, in following this technique in many circumstances where
 debugging has been necessitated in C/C++/D/Ruby/Java/.NET, I have gone so
 astray in my practice. What should I have been doing instead?
Make a full copy of your project, and work from that.
 This is not
 rhetoric for the sake of it: please do elucidate. If you cannot, please
 explain how DMD's current contradiction of 1 and 2 is not an
 arbitrary shackle to its users,
You're worried about screwing up your project by applying divide-and-conquer, so you want warnings to continue. That is a misunderstanding of how to go about it. Make a copy of your project. Then hack and slash the copy until you've isolated the problem. Turn off the warnings. Stop worrying about introducing a bug - it's a sacrificial copy, not the original.
 one based solely on your own work practices and prejudices that,
 I hope, you would have to concede may not map to those of all other
 developers.
Trying to use warnings as a substitute for making a backup is like a carpenter trying to hammer a nail with a screwdriver. It's the wrong answer, and trying to make a screwdriver that can be used as a hammer is also the wrong answer.
I get all of the above - some of which does _not_ address my point, and rather lights little adjacent campfires for us to worry about without addressing the point (what's new?) - but that's all from the position of _your own experience_. It does not accord with mine. If I'm unique, and uniquely wrong, then don't worry about it. But if I'm not, then you're making life harder than necessary for some potential users. Since life using D is hard enough already, I think it's a bad strategy for getting wider acceptance of D.

I don't expect you to change your mind now any more than I did when I posted my original response on this thread, since you so rarely do. I just have the unfortunate personality trait of being unable to resist bursting bubbles.

One of the things you claim is that D helps people work faster. This is an example where that claim is invalid.
Feb 06 2006
parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Matthew" <matthew hat.stlsoft.dot.org> wrote in message
news:ds9bs7$1c6t$1 digitaldaemon.com...
 I get all of the above - some of which does _not_ address my point,
I have no idea what your point was, then. Your point sounded like you were worried about introducing bugs during chop-and-test. I recommended making a copy first, then no bugs can be introduced.
 One of the things you claim is that D helps people work faster. This is an
 example where that claim is invalid.
I expect that whatever you're doing that results in 100's of copies of a single file is going to be slow going in any language, but that is nothing I ever recommended. Chop-and-test requires at most 2 copies.
Feb 07 2006
parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"Walter Bright" <newshound digitalmars.com> wrote in message
news:ds9o1o$1ml8$2 digitaldaemon.com...
 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message
 news:ds9bs7$1c6t$1 digitaldaemon.com...
 I get all of the above - some of which does _not_ address my point,
I have no idea what your point was, then. Your point sounded like you were worried about introducing bugs during chop-and-test. I recommended making a copy first, then no bugs can be introduced.
I made the point quite clearly.
 One of the things you claim is that D helps people work faster. This is an
 example where that claim is invalid.
I expect that whatever you're doing that results in 100's of copies of a single file is going to be slow going in any language, but that is nothing I ever recommended. Chop-and-test requires at most 2 copies.
Feb 07 2006
parent reply Don Clugston <dac nospam.com.au> writes:
Matthew wrote:
 "Walter Bright" <newshound digitalmars.com> wrote in message
 news:ds9o1o$1ml8$2 digitaldaemon.com...
 
"Matthew" <matthew hat.stlsoft.dot.org> wrote in message
news:ds9bs7$1c6t$1 digitaldaemon.com...

I get all of the above - some of which does _not_ address my point,
I have no idea what your point was, then. Your point sounded like you were worried about introducing bugs during chop-and-test. I recommended making a copy first, then no bugs can be introduced.
I made the point quite clearly.
It's not very clear to me. Was the point actually made by Derek's post? (Your first post seemed to assume that the point had already been made.)

That is, you enable warnings so that you can see what assumptions the compiler is making. You add an early return statement in order to quickly simplify the situation, with minimal side-effects. When you do this, you want to see how the compiler's assumptions have changed. But unfortunately, the warnings-as-errors makes this impossible to do. Right? Or have I misunderstood, too?

BTW, something that I've never seen anyone mention is the value of the -cov option for debugging. Absolutely fantastic for debugging, being able to see a 'snail trail' of where the program went. I'll never single-step through code again.
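For readers who haven't tried -cov: as I understand it, dmd writes a listing annotated with per-line execution counts. The same "snail trail" idea can be sketched with Python's standard tracing hook — an illustrative stand-in for the mechanism, not dmd itself, with `tracer` and `branchy` as made-up names:

```python
import sys

# Record which lines of one function actually executed, i.e. a tiny
# "snail trail" in the spirit of dmd's -cov listings.
executed = set()

def tracer(frame, event, arg):
    if event == "line" and frame.f_code.co_name == "branchy":
        executed.add(frame.f_lineno)
    return tracer

def branchy(x):
    if x > 0:
        return "positive"
    return "non-positive"   # not reached for x=5, so no trail entry here

sys.settrace(tracer)
branchy(5)
sys.settrace(None)

print(len(executed))  # 2: the 'if' line and the taken 'return' line
```

The trail shows which branch ran without single-stepping, which is exactly the appeal Don describes.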
 
 
One of the things you claim is that D helps people work faster. This is
an
example where that claim is invalid.
I expect that whatever you're doing that results in 100's of copies of a single file is going to be slow going in any language, but that is nothing I ever recommended. Chop-and-test requires at most 2 copies.
Feb 07 2006
parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"Don Clugston" <dac nospam.com.au> wrote in message
news:dsa2t0$1vhn$1 digitaldaemon.com...
 Matthew wrote:
 "Walter Bright" <newshound digitalmars.com> wrote in message
 news:ds9o1o$1ml8$2 digitaldaemon.com...

"Matthew" <matthew hat.stlsoft.dot.org> wrote in message
news:ds9bs7$1c6t$1 digitaldaemon.com...

I get all of the above - some of which does _not_ address my point,
I have no idea what your point was, then. Your point sounded like you were worried about introducing bugs during chop-and-test. I recommended making a copy first, then no bugs can be introduced.
I made the point quite clearly.
It's not very clear to me. Was the point actually made by Derek's post? (your first post seemed to assume that the point had already been made),
I said:

"1. You recommend that people chop down code in order to isolate bugs. Such bugs can include compile-time and runtime bugs.

2. You offer a compiler that does not allow a user to generate code while observing warnings.

These two are in obvious conflict. Anyone with any problem compiling any compiled-language will, in following your stipulation to the chop-and-test approach, have to remove code. Those with a version control system, a huge Undo buffer and perfect faith in their IDDE not to crash may genuinely _remove_ code. Those more careful/careworn might instead put in an early return statement here and there. Since the last thing one wants to do when debugging is to introduce new bugs, such users might sensibly wish to continue to be informed of potential problems by having their compiler emit as many warnings as it can."
 That is, you enable warnings so that you can see what assumptions the
 compiler is making. You add an early return statement in order to
 quickly simplify the situation, with minimal side-effects. When you do
 this, you want to see how the compiler's assumptions have changed.
 But unfortunately, the warnings-as-errors makes this impossible to do.

 Right?
Pretty much. Walter's point includes:

"No, they aren't. If the person doesn't understand why the warning is being generated, executing the code won't help in the slightest. If it isn't about the warning, just turn off warnings. Turning off the warnings isn't any more work than turning on "warn but keep going"."

Since I know he's very smart, I sigh when I read this, as I don't know whether he's trying to quench debate before he has to actually address my point, or that he really can't see it. Either way it's sighworthy.

It ignores the clear and very important issue that one might want to continue to have warnings emitted, in order to keep the "mental picture" of the compilation output consistent, in order to ensure that any changes to try and stop the compiler from its ICEs, e.g. reordering seemingly unrelated statements, have not done anything that's a _new_ problem, in order to see if a commented out block highlights a new warning, and so on and so forth. Of course, Walter can counter that each and every change should be atomic, but it's not living in the real world, or at least not in my real world. I want to work how I work best, not how Walter works best. And anyway, that only answers one of these three that I've just tossed up.

Blah blah blah blah. I could go on, but this is all soooo obvious and so puerile, it blows my stack that we even have to discuss it. D continues to have a foot firmly in each of the two idiotic camps of "the compiler knows better than the user" (so you can't choose to generate object code with something containing warnings however much you know you need/want to, and so on) and "the language knows better than the user" (so there aren't even any warnings of any kind for implicit numeric conversions). Like making someone at medical school do thought experiments for 7 years, and then making them do unassisted brain surgery as soon as they graduate. 
That we continue to have to debate this crap year on year, when far more interesting and important issues remain unresolved, is a bad joke.
Feb 07 2006
next sibling parent Sean Kelly <sean f4.ca> writes:
Matthew wrote:
 "Don Clugston" <dac nospam.com.au> wrote in message
 news:dsa2t0$1vhn$1 digitaldaemon.com...
 
 That is, you enable warnings so that you can see what assumptions the
 compiler is making. You add an early return statement in order to
 quickly simplify the situation, with minimal side-effects. When you do
 this, you want to see how the compiler's assumptions have changed.
 But unfortunately, the warnings-as-errors makes this impossible to do.

 Right?
Pretty much.
Oh I see. I wasn't aware of the dead code warning. Your complaint makes a lot more sense now.
 D continues to have a foot firmly in each of the two idiotic camps of "the
 compiler knows better than the user" (so you can't choose to generate object
 code with something containing warnings however much you know you need/want
 to, and so on)
I agree that it should be possible to disable the aforementioned warning, as I occasionally do the same thing. Still, there's nothing to stop you from disabling warnings altogether if those are the only ones you're seeing. Sean
Feb 07 2006
prev sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Matthew" <matthew hat.stlsoft.dot.org> wrote in message 
news:dsabfk$282a$1 digitaldaemon.com...

Walter wrote:
 Your point sounded like you were
worried about introducing bugs during chop-and-test. I recommended
 making a copy first, then no bugs can be introduced.
It ignores the clear and very important issue that one might want to continue to have warnings emitted, in order to keep the "mental picture" of the compilation output consistent, in order to ensure that any changes to try and stop the compiler from its ICEs, e.g. reordering seemingly unrelated statements, have not done anything that's a _new_ problem, in order to see if a commented out block highlights a new warning, and so on and so forth.
So I summed up your point accurately.
 Of course, Walter can counter that each and every change should be atomic,
Nope. The only goal with the chop is "does the problem I am trying to isolate remain, or does it go away?" It's binary, yes or no. If you're introducing all kinds of other considerations, that's likely why you find chop-and-test to be so unusably arduous.
 but it's not living in the real world, or at least not in my real world. I
 want to work how I work best, not how Walter works best.
You often send me very large bug reports with a comment that you've been unable to chop it down further and that you've no idea what's going wrong. Every one of them I've reduced, using chop-and-test, to 10 lines or so after a few minutes. None of them hinged on -w in any way or required more than 2 copies of any file. Your way isn't working (in the sense that it isn't finding the cause of the problem), why is it best?

This would probably be helped a lot if I could look over your shoulder while you're doing chop-and-test, so I could show you how to do it.
Feb 07 2006
parent reply "Matthew" <matthew hat.stlsoft.dot.org> writes:
"Walter Bright" <newshound digitalmars.com> wrote in message
news:dsaljv$2ikm$1 digitaldaemon.com...
 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message
 news:dsabfk$282a$1 digitaldaemon.com...

Walter wrote:
 Your point sounded like you were
worried about introducing bugs during chop-and-test. I recommended
 making a copy first, then no bugs can be introduced.
It ignores the clear and very important issue that one might want to continue to have warnings emitted, in order to keep the "mental picture" of the compilation output consistent, in order to ensure that any changes to try and stop the compiler from its ICEs, e.g. reordering seemingly unrelated statements, have not done anything that's a _new_ problem, in order to see if a commented out block highlights a new warning, and so on and so forth.
 So I summed up your point accurately.

 Of course, Walter can counter that each and every change should be atomic,
 Nope. The only goal with the chop is "does the problem I am trying to
 isolate remain, or does it go away?" It's binary, yes or no. If you're
 introducing all kinds of other considerations, that's likely why you find
 chop-and-test to be so unusably arduous.

 but it's not living in the real world, or at least not in my real world. I
 want to work how I work best, not how Walter works best.
You often send me very large bug reports with a comment that you've been unable to chop it down further and that you've no idea what's going wrong. Every one of them I've reduced, using chop-and-test, to 10 lines or so after a few minutes. None of them hinged on -w in any way or required more than 2 copies of any file. Your way isn't working (in the sense that it isn't finding the cause of the problem), why is it best?

 This would probably be helped a lot if I could look over your shoulder while
 you're doing chop-and-test, so I could show you how to do it.
Good one. At junior debating college they taught us that when your opponent starts to play the man not the ball you know you have him on the ropes.
Feb 08 2006
parent "Regan Heath" <regan netwin.co.nz> writes:
On Wed, 8 Feb 2006 22:16:24 +1100, Matthew <matthew hat.stlsoft.dot.org>  
wrote:
 "Walter Bright" <newshound digitalmars.com> wrote in message
 news:dsaljv$2ikm$1 digitaldaemon.com...
 "Matthew" <matthew hat.stlsoft.dot.org> wrote in message
 news:dsabfk$282a$1 digitaldaemon.com...

Walter wrote:
 Your point sounded like you were
worried about introducing bugs during chop-and-test. I recommended
 making a copy first, then no bugs can be introduced.
It ignores the clear and very important issue that one might want to continue to have warnings emitted, in order to keep the "mental picture" of the compilation output consistent, in order to ensure that any changes to try and stop the compiler from its ICEs, e.g. reordering seemingly unrelated statements, have not done anything that's a _new_ problem, in order to see if a commented out block highlights a new warning, and so on and so forth.
 So I summed up your point accurately.

 Of course, Walter can counter that each and every change should be atomic,
 Nope. The only goal with the chop is "does the problem I am trying to
 isolate remain, or does it go away?" It's binary, yes or no. If you're
 introducing all kinds of other considerations, that's likely why you find
 chop-and-test to be so unusably arduous.

 but it's not living in the real world, or at least not in my real world. I
 want to work how I work best, not how Walter works best.
You often send me very large bug reports with a comment that you've been unable to chop it down further and that you've no idea what's going wrong. Every one of them I've reduced, using chop-and-test, to 10 lines or so after a few minutes. None of them hinged on -w in any way or required more than 2 copies of any file. Your way isn't working (in the sense that it isn't finding the cause of the problem), why is it best?

 This would probably be helped a lot if I could look over your shoulder while
 you're doing chop-and-test, so I could show you how to do it.
Good one. At junior debating college they taught us that when your opponent starts to play the man not the ball you know you have him on the ropes.
It is called "Attacking the person": http://www.datanation.com/fallacies/attack.htm But I don't agree that Walter is guilty of it here. Regan
Feb 08 2006
prev sibling next sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Derek Parnell" <derek psych.ward> wrote in message 
news:op.s4iq6vrk6b8z09 ginger.vic.bigpond.net.au...
 On Mon, 06 Feb 2006 07:34:37 +1100, Walter Bright 
 <newshound digitalmars.com> wrote:
 That's right. If the programmer cares about warnings, he'll want them to be
 treated as errors. Otherwise, the warning message could scroll up and all
 too easily be overlooked.
However, not everyone thinks like Walter. I, for example, regard a warning as different from an error. A warning tells me about an issue about which the compiler has had to make an assumption in order to generate code. An error tells me about an issue for which the compiler has decided there are no assumptions that can be made, and thus it can't generate valid code.

A warning alerts me to the fact that the compiler has made an assumption on my behalf and thus I might want to do something to avoid a 'wrong' compiler assumption. These assumptions are not, per se, errors. Thus if "-w" is not used the compiler generates code, and so I can't see why the compiler still can't generate code when "-w" is used. All I want to know is where the compiler has assumed things I didn't actually intend.
The assumption is that people who enable warnings by throwing "-w" care about any generated warnings and want to fix them before proceeding. If the warnings are irrelevant to them, they can just not throw "-w" and the code will compile successfully. I don't see a practical reason for deliberately throwing "-w" but not wanting to deal with the results. I know this isn't how C++ compilers deal with warnings by default, but my experience is that people who care about warnings from their C++ compiler also throw the switch "treat warnings as errors" for the express purpose of not missing them. Think of them as "optional errors" instead of warnings <g>.
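The "optional errors" idea is not unique to dmd: Python's standard warnings module can likewise promote any warning to a hard error on request. A small illustrative sketch (the `risky` function is a made-up example, not anything from this thread):

```python
import warnings

# A hypothetical function that triggers a warning, standing in for
# code that dmd's -w would complain about (e.g. unreachable code).
def risky():
    warnings.warn("unreachable code", UserWarning)
    return 0

# By default the warning is reported and execution continues.
with warnings.catch_warnings(record=True):
    warnings.simplefilter("always")
    assert risky() == 0

# "Treat warnings as errors": the same warning now stops execution,
# analogous to dmd refusing to generate code under -w.
warnings.simplefilter("error")
try:
    risky()
    stopped = False
except UserWarning:
    stopped = True

print(stopped)  # True

warnings.simplefilter("default")  # restore normal behaviour afterwards
```

As with -w, the promotion is opt-in: leave the filter alone and the warning is merely reported.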
Feb 05 2006
parent reply Derek Parnell <derek psych.ward> writes:
On Sun, 5 Feb 2006 13:51:55 -0800, Walter Bright wrote:

 "Derek Parnell" <derek psych.ward> wrote in message 
 news:op.s4iq6vrk6b8z09 ginger.vic.bigpond.net.au...
 On Mon, 06 Feb 2006 07:34:37 +1100, Walter Bright 
 <newshound digitalmars.com> wrote:
 That's right. If the programmer cares about warnings, he'll want them to be
 treated as errors. Otherwise, the warning message could scroll up and all
 too easily be overlooked.
However, not everyone thinks like Walter. I, for example, regard a warning as different from an error. A warning tells me about an issue about which the compiler has had to make an assumption in order to generate code. An error tells me about an issue for which the compiler has decided there are no assumptions that can be made, and thus it can't generate valid code. A warning alerts me to the fact that the compiler has made an assumption on my behalf and thus I might want to do something to avoid a 'wrong' compiler assumption. These assumptions are not, per se, errors. Thus if "-w" is not used the compiler generates code, and so I can't see why the compiler still can't generate code when "-w" is used. All I want to know is where the compiler has assumed things I didn't actually intend.
The assumption is that people who enable warnings by throwing "-w" care about any generated warnings and want to fix them before proceeding. If the warnings are irrelevant to them, they can just not throw "-w" and the code will compile successfully.
I really do understand your point of view here. The current dmd behaviour is not a problem for me as I use the "-w" switch to highlight the areas I want to remove the compiler's assumptions, *prior* to running tests of the executable. However, I can also see that some other people might really want to do both - see where the assumptions are *and* allow the compiler to make those assumptions.
 I don't see a practical reason for deliberately throwing "-w" but not 
 wanting to deal with the results. 
That may be the case, but the fact that you "don't see a practical reason" does not necessarily rule out the possibility that other people might truly see a valid reason. One has to accept that some other people's thinking is valid, though thoroughly weird when seen from one's own point of view.
 I know this isn't how C++ compilers deal 
 with warnings by default, but my experience is that people who care about 
 warnings from their C++ compiler also throw the switch "treat warnings as 
 errors" for the express purpose of not missing them.
I'm quite sure that other people can also quote from their C++ experience with instances to the contrary - so what does that prove? Only that not everyone thinks the same as you do.
 Think of them as "optional errors" instead of warnings <g>.
Like I say, the current behaviour is not a problem for me. For example, given the example of the OP, I would do this instead ...

int main() {
  writefln();
  writefln("Sample program demonstrating '-w' bug:");
  writefln();

  debug {
    writefln("Attempt to stop program after this message.");
    return(0);
  } else {
    writefln();
    transfertoSCO();
    installMSDOS();
    emailCongrats();
    return(0);
  }
}

-- 
Derek
(skype: derek.j.parnell) Melbourne, Australia
"Down with mediocracy!"
6/02/2006 10:04:16 AM
Feb 05 2006
parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Derek Parnell" <derek psych.ward> wrote in message 
news:t4q17cntutnl$.1tpdi4h6n15md$.dlg 40tude.net...
 That may be the case, but the fact that you "don't see a practical reason"
 does not necessarily rule out the possibility that other people might 
 truly
 see a valid reason. One has to accept that some other people's thinking is
 valid, though thoroughly weird when seen from one's own point of view.
I have many years of experience with warnings, and none of it finds any use for warnings that do not stop the compiler. If someone believes there is a valid reason, they need to present a very convincing case. Just because C compilers do it that way by default is not a compelling reason, since a lot of C compiler defaults are that way in order to not break existing makefiles rather than them being a good idea.
 I'm quite sure that other people can also quote from their C++ experience
 with instances to the contrary -
If there's a compelling case to be made, bring it on <g>.
 so what does that prove? Only that not everyone thinks the same as you do.
Of course. But at the end of the day I have to make some decisions about how D will and will not work.
Feb 05 2006
next sibling parent Sean Kelly <sean f4.ca> writes:
Walter Bright wrote:
 "Derek Parnell" <derek psych.ward> wrote in message 
 news:t4q17cntutnl$.1tpdi4h6n15md$.dlg 40tude.net...
 That may be the case, but the fact that you "don't see a practical reason"
 does not necessarily rule out the possibility that other people might 
 truly
 see a valid reason. One has to accept that some other people's thinking is
 valid, though thoroughly weird when seen from one's own point of view.
I have many years of experience with warnings, and none of it finds any use for warnings that do not stop the compiler. If someone believes there is a valid reason, they need to present a very convincing case.
So long as the warnings can be disabled, I'm fine with this behavior. I may not consider all warnings important (Visual C++ tends to warn about a handful of things that have nothing to do with code correctness), but for those I do, I typically want to fix them before proceeding. Sean
Feb 05 2006
prev sibling parent reply John Demme <me teqdruid.com> writes:
Walter Bright wrote:
 I have many years of experience with warnings, and none of it finds any
 use for warnings that do not stop the compiler. If someone believes there
 is a valid reason, they need to present a very convincing case.
How does one ignore just one specific warning or one type of warning while still compiling with -w? It'd be nice if there was some way to tell the compiler that a specific line or area of code is indeed correct and to not spit out a warning (or, more importantly, halt on it). ~John Demme
Feb 05 2006
parent reply "Walter Bright" <newshound digitalmars.com> writes:
"John Demme" <me teqdruid.com> wrote in message 
news:ds6ioh$227b$1 digitaldaemon.com...
 Walter Bright wrote:
 I have many years of experience with warnings, and none of it finds any
 use for warnings that do not stop the compiler. If someone believes there
 is a valid reason, they need to present a very convincing case.
How does one ignore just one specific warning or one type of warning while still compiling with -w? It'd be nice if there was some way to tell the compiler that a specific line or area of code is indeed correct and to not spit out a warning (or, more importantly, halt on it).
Each warning has an associated source code modification that will get rid of it.
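[To make Walter's point concrete with the OP's unreachable-code warning: the disabled section can be kept in the source yet statically excluded, e.g. with a version block. This is one possible modification, sketched here, not the only one:]

```d
import std.stdio;

int main() {
    writefln("Attempt to stop program after this message.");
    version (none) {
        // statically excluded, so no unreachable statements remain
        writefln("disabled section");
    }
    return 0;
}
```

This compiles cleanly under -w while keeping the disabled code in place.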
Feb 05 2006
parent reply John Demme <me teqdruid.com> writes:
Walter Bright wrote:

 
 "John Demme" <me teqdruid.com> wrote in message
 news:ds6ioh$227b$1 digitaldaemon.com...
 Walter Bright wrote:
 I have many years of experience with warnings, and none of it finds any
 use for warnings that do not stop the compiler. If someone believes
 there is a valid reason, they need to present a very convincing case.
How does one ignore just one specific warning or one type of warning while still compiling with -w? It'd be nice if there was some way to tell the compiler that a specific line or area of code is indeed correct and to not spit out a warning (or, more importantly, halt on it).
Each warning has an associated source code modification that will get rid of it.
I assume you mean the user's source code - but what if the user's code is right? Or, more importantly, not right, but how they want it? The compiler's not always right... if it were always right, and the warnings always pointed out code that was wrong, then they'd be errors. ~John
Feb 07 2006
parent reply "Walter Bright" <newshound digitalmars.com> writes:
"John Demme" <me teqdruid.com> wrote in message 
news:dsaaab$26tt$1 digitaldaemon.com...
 Walter Bright wrote:
 Each warning has an associated source code modification that will get rid
 of it.
I assume you mean the user's source code- but what if the user's code is right? Or more importantly, not right- but how they want it. The compiler's not always right... if it was always right and the warnings always pointed out code that was wrong, then they'd be errors.
This goes back to why having warnings at all is a bad idea, as it implies a wishy-washy language specification. Is code legal or is it not?
Feb 07 2006
next sibling parent reply "Bob W" <nospam aol.com> writes:
"Walter Bright" <newshound digitalmars.com> wrote in message 
news:dsapis$2mfo$1 digitaldaemon.com...

 This goes back to why having warnings at all is a bad idea, as it implies 
 a wishy-washy language specification. Is code legal or is it not?
Are you talking about a real-world PL which is evolving over time, and where eventual syntax changes are deemed desirable or even necessary? Or are you talking about a perfect and therefore fictional language which is released as V1.0 and stays there and shines forever? - Then I'd have less trouble agreeing.

The second scenario does not apply to D, because it seems to be more the evolving type - and I am glad for it. So you might consider keeping warnings for several reasons. I give you 2 examples:

1) Let us - just theoretically - assume that at one stage you want to disallow the optional C-like syntax of declaring arrays. If such a major move is going to happen in the not so near future, you would be well advised to introduce warnings in several compiler releases instead of moving radically and forcing even newbies to search for an "allow deprecated features" switch to get their job done.

2) One feature which usually prevents otherwise unnecessary debugging sessions is to warn the less-than-perfect user about declared but otherwise unused variables. They might be there in error, or might be intentionally inserted to serve a future purpose. Your compiler has no way to know, but it could politely issue a warning, which would be far better than ignoring the fact or refusing to compile altogether.
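[A sketch of the kind of code Bob's second example refers to. Note this is hypothetical: dmd of this era did not actually implement an unused-variable warning; the point is what such a warning could look like without blocking code generation:]

```d
import std.stdio;

int main() {
    int futureUse;  // declared for a future purpose, never read:
                    // a compiler could warn here yet still generate code
    writefln("done");
    return 0;
}
```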
Feb 07 2006
parent "Walter Bright" <newshound digitalmars.com> writes:
"Bob W" <nospam aol.com> wrote in message 
news:dsb8qr$5r8$1 digitaldaemon.com...
 1) Let us - just theoreticaly - assume that at one stage
 you want to disallow the optional C-like syntax of declaring
 arrays. If such a major move is going to happen in the not
 so near future you would be well advised to introduce
 warnings in several compiler releases instead of moving
 radically and forcing even newbies to search for an
 "allow deprecated features" switch to get their job done.
D has already done this a couple times in the past. It'll issue an error about a "deprecated" feature, suggesting the corrected approach. The -d will allow it to continue to compile (I'm not sure why -d would be hard to find for newbies, dmd doesn't have many switches). After a while, it becomes a hard error that can't be overridden with -d, though the fix is still suggested. Eventually, even that is removed and it's just invalid syntax. I think this approach has worked well as long as plenty of time accompanies each stage; at least nobody has complained about it <g>.
Feb 13 2006
prev sibling parent Derek Parnell <derek psych.ward> writes:
On Tue, 7 Feb 2006 10:04:54 -0800, Walter Bright wrote:

 "John Demme" <me teqdruid.com> wrote in message 
 news:dsaaab$26tt$1 digitaldaemon.com...
 Walter Bright wrote:
 Each warning has an associated source code modification that will get rid
 of it.
I assume you mean the user's source code- but what if the user's code is right? Or more importantly, not right- but how they want it. The compiler's not always right... if it was always right and the warnings always pointed out code that was wrong, then they'd be errors.
This goes back to why having warnings at all is a bad idea, as it implies a wishy-washy language specification. Is code legal or is it not?
NO NO NO! May I be so bold as to reword John's paragraph according to how I see the issues.

" I assume you mean the user's source code - but what if the user's code is what the coder intended? Or, more importantly, not what the coder really intended? The compiler's assumptions are not always the same as the coder's intentions... if they were always the same, and the warnings always pointed that out, then they'd be useless. "

The "-w" switch is useful for seeing where my understanding of what the code says differs from what the compiler thinks it says.

-- 
Derek
(skype: derek.j.parnell) Melbourne, Australia
"Down with mediocracy!"
8/02/2006 12:20:01 PM
Feb 07 2006
prev sibling parent reply Bruno Medeiros <daiphoenixNO SPAMlycos.com> writes:
Derek Parnell wrote:
 On Mon, 06 Feb 2006 07:34:37 +1100, Walter Bright 
 <newshound digitalmars.com> wrote:
 
 "Derek Parnell" <derek psych.ward> wrote in message
 news:op.s4imylg86b8z09 ginger.vic.bigpond.net.au...
 As far as Walter is concerned, a 'warning' is really an error too. So 
 yes,
 DMD will display warnings but treat them as errors such that the 
 compiler
 doesn't generate code.
That's right. If the programmer cares about warnings, he'll want them to be treated as errors. Otherwise, the warning message could scroll up and all too easily be overlooked.
However, not everyone thinks like Walter. I, for example, regard a warning as different from an error. A warning tells me about an issue where the compiler has had to make an assumption in order to generate code. An error tells me about an issue where the compiler has decided that no assumption can be made, and thus it can't generate valid code. A warning alerts me to the fact that the compiler has made an assumption on my behalf, and thus I might want to do something to avoid a 'wrong' compiler assumption. These assumptions are not, per se, errors. If "-w" is not used the compiler generates code, so I can't see why the compiler can't still generate code when "-w" is used. All I want to know is where the compiler has assumed things I didn't actually intend.

--Derek Parnell
Melbourne, Australia
I too agree with this view, and we have voiced these concerns before (recall news://news.digitalmars.com:119/dn5en0$a4j$1 digitaldaemon.com or http://www.digitalmars.com/d/archives/digitalmars/D/bugs/5743.html ). So let me just make some comments in light of Walter's comments.

Why we want warnings: we want to be notified of code segments that are possibly incorrect.

Why we don't want halt-compilation-on-warnings (as the only option available): because during development it is quite expected that, at any given moment, many parts of the code will be incorrect and the program as a whole will be incomplete, but we still want to be able to execute it, as it is nonetheless meaningful to do so. There may be certain parts of the code that we *know* are incorrect, and we don't care (at the moment) about warnings that come from there, but we do care about warnings that come from *another* part of the code. Having warnings disabled doesn't give us the warnings about the part of the code we care about, and having warnings-as-errors doesn't allow us to compile the program without first fixing the part of the code we don't care about.

One example of this situation is the mentioned test-and-chop method, which I too use often. It is quite common when chopping (or perhaps just commenting out, which is what I do) that a certain segment of code ends up with an unused variable or an invalid code path, yet the whole program structure is still meaningfully executable. When I'm stubbing out a certain part AA of the code for testing purposes, I don't want to have to change another part BB (and then later change BB back when AA is fixed or stubbed in) just because part BB annoys the compiler, even though I know the program is meaningful with BB unchanged. It renders the code slightly less agile (i.e., harder to change).
One can say: oh, but then you can disable warnings-as-errors when you're doing the test-chop runs, and enable it when you're doing 'non-chop' development. But that is plain idiotic. I'm not going to change my build config files every time I want to change from one mode to another. In fact, since different parts can be under work by different people, this idea is just not workable.

I find that this situation/rationale reminds me a lot of the checked exceptions issue (although not as bad). People imagine a clean, pristine, utopian world where code always compiles without warnings [the checked throwing of exceptions]. Yet reality/practice doesn't work that way, and people are annoyed by the forcing of the pristine idea. So people either put up with the lost agility, like cleaning up warnings on every compile [writing extra try/catch/throws], or find workarounds, like disabling warnings altogether [wrapping normal exceptions in unchecked exceptions].

Walter Bright wrote:
 That's right. If the programmer cares about warnings, he'll want them 
to be
 treated as errors. Otherwise, the warning message could scroll up and 
all
 too easilly be overlooked.
I think the days of "scrolling up" are over. It's in-editor error and warning reporting now. Walter Bright wrote:
 I'm quite sure that other people can also quote from their C++ experience
 with instances to the contrary -
If there's a compelling case to be made, bring it on <g>.
I hope I made one. And it is confirmed by my practical coding experience.

Walter Bright wrote:
 Think of them as "optional errors" instead of warnings <g>.
Then can we (by 'we' I mean DMD and the docs) at least not call them warnings? Because they are not. The option could be called strict, pedantic, or something like that - just not 'warnings'. That alone would alleviate a great deal of my contention with this issue.

-- 
Bruno Medeiros - CS/E student
"Certain aspects of D are a pathway to many abilities some consider to be... unnatural."
Feb 07 2006
parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Bruno Medeiros" <daiphoenixNO SPAMlycos.com> wrote in message 
news:dsaijo$2f0r$1 digitaldaemon.com...
 One can say: oh but then you can disable warnings-as-errors when you're 
 doing the test chop runs, and enable it when your doing 'non-chop' 
 development. But that is plain idiotic. I'm not going to change my build 
 config files every time I wanna change from on mode to another.
 In fact, since different parts can be under work by different people, this 
 idea is just not considerable.
Chop-and-test is not normal development. It is done with a local copy of the project, including the makefiles, not code that you or other people are working on. The whole technique of chop-and-test implies being able to hack and slash away at the project source with abandon. You don't worry about breaking the build, warnings, introducing bugs, or screwing up someone else's part. This is all possible because one is working with a local copy that will be deleted as soon as the problem is found and a solution identified. The solution is then folded into the real development branch.
Feb 07 2006
parent reply Bruno Medeiros <daiphoenixNO SPAMlycos.com> writes:
Walter Bright wrote:
 "Bruno Medeiros" <daiphoenixNO SPAMlycos.com> wrote in message 
 news:dsaijo$2f0r$1 digitaldaemon.com...
 One can say: oh but then you can disable warnings-as-errors when you're 
 doing the test chop runs, and enable it when your doing 'non-chop' 
 development. But that is plain idiotic. I'm not going to change my build 
 config files every time I wanna change from on mode to another.
 In fact, since different parts can be under work by different people, this 
 idea is just not considerable.
Chop-and-test is not normal development. It is done with a local copy of the project, including the makefiles, not code that you or other people are working on. The whole technique of chop-and-test implies being able to hack and slash away at the project source with abandon. You don't worry about breaking the build, warnings, introducing bugs, or screwing up someone else's part. This is all possible because one is working with a local copy that will be deleted as soon as the problem is found and a solution identified. The solution is then folded into the real development branch.
I think I might have misused the term chop-and-test. I meant any kind of test run (or runs) where you remove some part of the code, either by cutting or commenting. Your definition implies a destructive and more abragent version of this, which indeed mustn't happen often. So I correct my previous statements: by chop-and-test I meant not the proper chop-and-test, but rather one or more *temporary, short-lived* edits of the code, made with the intention of running/testing the program afterwards. Such an edit will probably have the life span of only the most immediate task the programmer is doing. This situation does occur often and is part of normal development. And -w annoys the hell out of it.

-- 
Bruno Medeiros - CS/E student
"Certain aspects of D are a pathway to many abilities some consider to be... unnatural."
Feb 08 2006
parent Bruno Medeiros <daiphoenixNO SPAMlycos.com> writes:
Bruno Medeiros wrote:
 Your definition implies a destructive and more 
 abragent version of this, which indeed, musn't happen often.
Sorry, "abrangent" doesn't exist in english. I forgot to fix it before posting. "abrangent" means something like "encompassing", "far-reaching", "wide", etc. . -- Bruno Medeiros - CS/E student "Certain aspects of D are a pathway to many abilities some consider to be... unnatural."
Feb 08 2006
prev sibling parent reply "Bob W" <nospam aol.com> writes:
"Walter Bright" <newshound digitalmars.com> wrote in message 
news:ds5p5i$1e2j$1 digitaldaemon.com...
 "Derek Parnell" <derek psych.ward> wrote in message 
 news:op.s4imylg86b8z09 ginger.vic.bigpond.net.au...
 As far as Walter is concerned, a 'warning' is really an error too. So 
 yes, DMD will display warnings but treat them as errors such that the 
 compiler doesn't generate code.
That's right. If the programmer cares about warnings, he'll want them to be treated as errors. Otherwise, the warning message could scroll up and all too easilly be overlooked.
That is cool with me. I just needed to know.

However, there is a philosophical side to it: what sense does it make to allow a specific program to compile without trouble whenever one forgets to turn on warnings, if the same program would fail to compile otherwise? If the punishment for warnings is the same as that for errors, why still call them warnings? Correct me if I am wrong, but my perception is that "warnings in D are morally less severe errors, but still inexcusable if enabled to be caught", right?

Whatever - may I just suggest the following modification to the usage screen of dmd?

------------- From ----------------
Usage: dmd files.d ... { -switch }
................
  -version=ident  compile in version code identified by ident
  -w              enable warnings

-------------- change to -------------
Usage: dmd files.d ... { -switch }
................
  -version=ident  compile in version code identified by ident
  -w              enable warnings and treat them as errors
Feb 05 2006
parent Sebastián E. Peyrott <as7cf yahoo.com> writes:
I think the current treatment of warnings reduces the chances of writing 
invalid code in the future, and enforces correct use of the language 
specification. It may be annoying in some cases, but I think it's a good 
tradeoff. 
On the other hand, I think Bob had a good idea and "show warnings and treat 
them as errors" should be printed in DMD's help.

--Sebastián.
Feb 06 2006