digitalmars.D - Why D doesn't have warnings
- Walter (16/16) Jun 28 2004 Check out this excerpt from:
- J C Calvarese (8/26) Jun 28 2004 I much prefer how D makes me fix irregularities in my code by using only...
- Daniel Horn (15/47) Jun 28 2004 I agree for the most part... There's one exception to your steadfast
- Walter (6/20) Jun 28 2004 The difficulty with your approach is that when you pass on the code to
- Arcane Jill (24/28) Jun 28 2004 The problem, Walter, is that we don't all agree what is or is not an err...
- Sean Kelly (8/13) Jun 29 2004 I tend to think of Walter's compiler as a reference implementation. You...
- Walter (28/55) Jun 29 2004 compile,
- Rex Couture (6/9) Jun 29 2004 I'm shocked. Is this another C/C++ compiler? I thought it was supposed...
- Walter (13/23) Jun 29 2004 suboptimal
- Rex Couture (3/28) Jun 30 2004 Pardon me. I guess I'll have to defer to you on that one.
- Rex Couture (3/6) Jun 30 2004 Oh, and thanks for replying. Sorry I didn't see your reply sooner. The...
- Walter (4/12) Jun 30 2004 are a
- Derek Parnell (31/49) Jun 28 2004 In the code below, is the non-use of the function argument 'a' an error ...
- Walter (17/24) Jun 28 2004 It is not an error. There are many legitimate cases where one would have
- Derek Parnell (14/43) Jun 28 2004 I think at best one could say that is not necessarily an error. Yes ther...
- Mike Swieton (21/28) Jun 28 2004 I'll throw in my opinion here, because I know you're all dying to hear i...
- Walter (4/10) Jun 29 2004 same.
- Regan Heath (12/46) Jun 29 2004 From what you've said I think the best soln is that the compiler doesn'...
- Sean Kelly (5/11) Jun 29 2004 I had thought that D treated unused variables as errors. Is this not th...
- J C Calvarese (17/36) Jun 29 2004 DMD compiles either unused variables or unused arguments without complai...
- Rex Couture (12/13) Jun 29 2004 I think that's just plain nuts. Unused variables usually implies a prog...
- Derek Parnell (9/26) Jun 29 2004 I'm with you, Rex. I'm a firm believer in peer reviews and automated too...
- Regan Heath (12/43) Jun 29 2004 I think you should give Walter a chance to give you an example where you...
- Rex Couture (10/53) Jun 29 2004 By all means, if there is some guaranteed mechanism to warn you of unuse...
- Arcane Jill (11/11) Jun 30 2004 Not in reply to anyone in particular....
- Rex Couture (4/15) Jun 30 2004 An elegant solution.
- Russ Lewis (6/15) Jul 01 2004 After defining an API, you may change it in the future, and choose to
- Regan Heath (18/33) Jul 01 2004 Don't you then end up with an API that does not behave the same as it us...
- Derek Parnell (41/82) Jul 01 2004 I have an real-world example where occasionally it is right to ignore a
- Regan Heath (5/95) Jul 02 2004 Ahh.. yes, good example. :)
- Rex Couture (14/21) Jun 29 2004 Good point. You convinced me about warnings. But see Daniel Horn's poi...
- Regan Heath (23/30) Jun 29 2004 Can you give us one or two?
- Russ Lewis (27/44) Jul 01 2004 Ok, I'm not going to take sides on the warning issue, because frankly I
- Derek Parnell (7/60) Jul 01 2004 I like this idea a lot. It is brief, explicit, permissive, parsible, and
- Sean Kelly (7/20) Jul 02 2004 The equivalent thing in C/C++ would be to define foo this way:
- ANT (3/25) Jul 02 2004 and we can make intellisense for IDEs aware of it.
- Russ Lewis (12/38) Jul 02 2004 I agree that it is desirable to avoid a new keyword. However, let me
- Bruno A. Costa (9/27) Jun 29 2004 In part I agree with you, but I think there are some special cases where
- Sam McCall (8/44) Jun 29 2004 Hmm, in my code, unused member variables and local variables are usually...
- Regan Heath (7/54) Jun 29 2004 I agree, they are 2 different processes, they are only linked in that th...
- Walter (4/7) Jun 29 2004 It wouldn't be hard to use the existing DMD front end source as a starti...
- Juliano Ravasi Ferraz (29/37) Jun 29 2004 I can't take this seriously enough. I won't hire a programmer who
- Sean Kelly (13/35) Jun 29 2004 It sounds like you agree, but the D compiler always runs as if you had -...
- Stewart Gordon (34/37) Jul 05 2004
- Regan Heath (22/54) Jul 05 2004 It's my impression that compilation and lint-like processing are 2
- Arcane Jill (15/33) Jul 05 2004 An interesting idea, but it sounds like solving the problem just by call...
- Regan Heath (30/77) Jul 05 2004 It is.. and it isn't. I think the core idea is that to be called a D
Check out this excerpt from: http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160

"Of course, this has been tried before-most compilers generate various warnings when they encounter questionable code. Old-time Unix/C programmers will certainly recall lint(1), a code-checker that did cross-file error checking and parameter type matching. These tools have existed for years but are not popular. Why? Because they generate a lot of warnings, and, as countless software engineers have pointed out, it's time-consuming to sift through the spurious warnings looking for the ones that really matter. I've got news for them: there is no such thing as a warning that doesn't matter. That's why it warns you. Anyone who has worked with enough code will tell you that, generally, software that compiles without warnings crashes less often. As far as I'm concerned, warnings are for wimps. Tools such as lint(1) and DevStudio should not issue warnings: they should decide if they've found an error and stop the build process, or they should shut up and generate code."
Jun 28 2004
Walter wrote:Check out this exerpt from: http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160 "Of course, this has been tried before-most compilers generate various warnings when they encounter questionable code. Old-time Unix/C programmers will certainly recall lint(1), a code-checker that did cross-file error checking and parameter type matching. These tools have existed for years but are not popular. Why? Because they generate a lot of warnings, and, as countless software engineers have pointed out, it's time-consuming to sift through the spurious warnings looking for the ones that really matter. I've got news for them: there is no such thing as a warning that doesn't matter. That's why it warns you. Anyone who has worked with enough code will tell you that, generally, software that compiles without warnings crashes less often. As far as I'm concerned, warnings are for wimps. Tools such as lint(1) and DevStudio should not issue warnings: they should decide if they've found an error and stop the build process, or they should shut up and generate code."I much prefer how D makes me fix irregularities in my code by using only errors and no warnings. During the little bit of time that I've worked with C, I didn't often have the self-discipline to start fixing the warnings until they started to scroll off the screen. (tsk! tsk!) -- Justin (a/k/a jcc7) http://jcc_7.tripod.com/d/
Jun 28 2004
I agree for the most part... There's one exception to your steadfast rule if the following two conditions apply: a) the warning may be turned off (and probably is off by default) b) the compiler errors on said warning. Then it's just the compiler helping out the user who WANTS to be helped. Case in point of course is boolean logic. We should have an optional warning flag passed into the compiler when the user uses a boolean expression as an int or vice versa... the warning may be off for most devels, and for those who deign to turn it on, it would error on said warning. Of course libs that shipped with the compiler would have it enabled for compatability... the flag could look like -Wstrong-boolean -Werror ;-) J C Calvarese wrote:Walter wrote:Check out this exerpt from: http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160 "Of course, this has been tried before-most compilers generate various warnings when they encounter questionable code. Old-time Unix/C programmers will certainly recall lint(1), a code-checker that did cross-file error checking and parameter type matching. These tools have existed for years but are not popular. Why? Because they generate a lot of warnings, and, as countless software engineers have pointed out, it's time-consuming to sift through the spurious warnings looking for the ones that really matter. I've got news for them: there is no such thing as a warning that doesn't matter. That's why it warns you. Anyone who has worked with enough code will tell you that, generally, software that compiles without warnings crashes less often. As far as I'm concerned, warnings are for wimps. Tools such as lint(1) and DevStudio should not issue warnings: they should decide if they've found an error and stop the build process, or they should shut up and generate code."I much prefer how D makes me fix irregularities in my code by using only errors and no warnings. During the little bit of time that I've worked with C, I didn't often have the self-discipline to start fixing the warnings until they started to scroll off the screen. (tsk! tsk!)
Jun 28 2004
The difficulty with your approach is that when you pass on the code to someone else, they are faced with "is it a bug or is it ok". Compilation should be a binary pass/fail, not something that sort of seems to compile, but who knows if it really did <g>. "Daniel Horn" <hellcatv hotmail.com> wrote in message news:cbq6rb$2ckl$1 digitaldaemon.com...I agree for the most part... There's one exception to your steadfast rule if the following two conditions apply: a) the warning may be turned off (and probably is off by default) b) the compiler errors on said warning. Then it's just the compiler helping out the user who WANTS to be helped. Case in point of course is boolean logic. We should have an optional warning flag passed into the compiler when the user uses a boolean expression as an int or vice versa... the warning may be off for most devels, and for those who deign to turn it on, it would error on said warning. Of course libs that shipped with the compiler would have it enabled for compatability... the flag could look like -Wstrong-boolean -Werror ;-)
Jun 28 2004
In article <cbqauv$2ilu$1 digitaldaemon.com>, Walter says...The difficulty with your approach is that when you pass on the code to someone else, they are faced with "is it a bug or is it ok". Compilation should be a binary pass/fail, not something that sort of seems to compile, but who knows if it really did <g>.

The problem, Walter, is that we don't all agree what is or is not an error. I believe that the following SHOULD be an error: Now, we all know that a strong boolean type would render that an error, but even /that/ wouldn't catch ALL boolean type errors. I wrote a line of code the other day which contained a subtle bug. I wrote: Now, the second == should have read =, obviously, but it compiled fine. And EVEN IF we'd have had a strong boolean type, it would STILL have compiled fine. But it /was/ a bug, and I think that it COULD have been spotted by a sufficiently intelligent compiler.

Now, suppose, along comes a second, rival, D-compiler, which is intelligent enough to spot this circumstance and point it out to the user. Should it do so? Should it say "warning - using an equality test as a statement is a really dumb idea"? Or should it go with the idea that DMD's behavior is definitive, because "conflicting standards are bad"?

So long as there exists only one D compiler, there is no use for warnings. But, enter Jill's imagined, hypothetical super-strict compiler designed to catch as many bugs as possible at compile-time - it would be quite reasonable for that hypothetical compiler to offer an "allow anything that DMD allows" option, in which case, that would be a good time to issue warnings. No?

Jill
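Jill's code snippets were lost in the archiving, so the following is only a hypothetical sketch (the names are invented, not hers) of the kind of line she describes, an equality test typed where an assignment was meant:

    void resetCounter()
    {
        int limit = 10;
        int count;

        // Intended: count = limit;   (an assignment)
        // Typed:    count == limit;  (a comparison whose result is silently discarded)
        count == limit;
    }

Per the thread, the DMD of the time accepted such a no-effect comparison statement without complaint; a stricter or later compiler might well flag it.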
Jun 28 2004
In article <cbr37b$l31$1 digitaldaemon.com>, Arcane Jill says...So long as there exists only one D compiler, there is no use for warnings. But, enter Jill's imagined, hypothetical super-strict compiler designed to catch as many bugs as possible at compile-time - it would be quite reasonable for that hypothetical compiler to offer a "allow anything that DMD allows" option, in which case, that would be a good time to issue warnings. No?I tend to think of Walter's compiler as a reference implementation. Your example is a completely legitimate extension for a third-party compiler, provided it defaults to off. The problem in my mind is when third-party compilers display different behavior from one another. This is the case with current C++ compilers, and as someone who treats all warnings as errors this drives me crazy :) Sean
Jun 29 2004
"Arcane Jill" <Arcane_member pathlink.com> wrote in message news:cbr37b$l31$1 digitaldaemon.com...In article <cbqauv$2ilu$1 digitaldaemon.com>, Walter says...compile,The difficulty with your approach is that when you pass on the code to someone else, they are faced with "is it a bug or is it ok". Compilation should be a binary pass/fail, not something that sort of seems toerror. Ibut who knows if it really did <g>.The problem, Walter, is that we don't all agree what is or is not anbelieve that the following SHOULD be an error:We obviously disagree, because I don't consider that an error. There's no way any of us will agree 100% on the feature set of D.Now, we all know that a strong boolean type would render that an error,but even/that/ wouldn't catch ALL boolean type errors. I wrote a line of code theotherwhich contained a subtle bug. I wrote: Now, the second == should have read =, obviously, but it compiled fine.And EVENIF we'd have had a strong boolean type, it would STILL have compiled fine.Butit /was/ a bug, and I think that it COULD have been spotted by asufficientlyintelligent compiler. Now, suppose, along comes a second, rival, D-compiler, which isintelligentenough to spot this circumstance and point it out to the user. Should itdo so?Should it say "warning - using an equality test as a statement is a reallydumbidea"? Or should it go with the idea that DMD's behavior is definitive,because"conflicting standards are bad"?I'll go with my experience implementing the C and C++ standards - it's better to conform to the standards. Fixing what I consider to be suboptimal decisions in those standards has turned out to be a failure. Changing semantics from one compiler to the next will cause no end of grief and will impede the adoption of D - look at what happened with the varying interpretations of template rules in C++. That said, having an optional 'lint mode' offered by a particular D implementation can be an appealing feature for that implementation, as long as it is both optional and compiles a strict subset of D.So long as there exists only one D compiler, there is no use for warnings.But,enter Jill's imagined, hypothetical super-strict compiler designed tocatch asmany bugs as possible at compile-time - it would be quite reasonable forthathypothetical compiler to offer a "allow anything that DMD allows" option,inwhich case, that would be a good time to issue warnings. No?
Jun 29 2004
In article <cbsfns$2n3g$1 digitaldaemon.com>, Walter says...I'll go with my experience implementing the C and C++ standards - it's better to conform to the standards. Fixing what I consider to be suboptimal decisions in those standards has turned out to be a failure.I'm shocked. Is this another C/C++ compiler? I thought it was supposed to be something better. And pardon my ignorance, but as a mere mortal programmer, I haven't the slightest idea why should compile. Sooner or later, someone is going to die over errors like that.
Jun 29 2004
"Rex Couture" <Rex_member pathlink.com> wrote in message news:cbt6f5$l92$1 digitaldaemon.com...In article <cbsfns$2n3g$1 digitaldaemon.com>, Walter says...suboptimalI'll go with my experience implementing the C and C++ standards - it's better to conform to the standards. Fixing what I consider to beto bedecisions in those standards has turned out to be a failure.I'm shocked. Is this another C/C++ compiler? I thought it was supposedsomething better.Yup - here I was referring not to C/C++ in particular, but to my experience with the utility of implementing slightly non-standard compilers.And pardon my ignorance, but as a mere mortal programmer, I haven't the slightest idea why should compile. Sooner or later, someone is going to die over errors likethat. If a==c actually calls a function opCmp, and one is writing code that wants to 'tickle' that function. This comes up sometimes in writing test coverage code, or in profiling code. This can also come up in generic code if one wants to verify that a and b are "comparable", but not care what the result is. It can also come up as the result of the combination of various optimizations and function inlining.
Jun 29 2004
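As a hedged sketch of the "tickling" Walter describes above (the class and names are invented for illustration), a coverage or profiling harness might evaluate a comparison purely to force the operator function to run, discarding the result. The sketch assumes the permissive behaviour discussed in this thread; a stricter compiler could reject the no-effect statement.

    class Score
    {
        int points;
        this(int p) { points = p; }

        // The comparison function the coverage run is meant to exercise.
        int opCmp(Object o)
        {
            Score s = cast(Score) o;
            return points - s.points;
        }
    }

    unittest
    {
        Score a = new Score(10);
        Score b = new Score(20);

        // Evaluated only to exercise opCmp; the result is deliberately ignored.
        a < b;
    }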
In article <cbti1b$1765$1 digitaldaemon.com>, Walter says..."Rex Couture" <Rex_member pathlink.com> wrote in message news:cbt6f5$l92$1 digitaldaemon.com...Oh. Sorry, I misunderstood.In article <cbsfns$2n3g$1 digitaldaemon.com>, Walter says...suboptimalI'll go with my experience implementing the C and C++ standards - it's better to conform to the standards. Fixing what I consider to beto bedecisions in those standards has turned out to be a failure.I'm shocked. Is this another C/C++ compiler? I thought it was supposedsomething better.Yup - here I was referring not to C/C++ in particular, but to my experience with the utility of implementing slightly non-standard compilers.Pardon me. I guess I'll have to defer to you on that one.And pardon my ignorance, but as a mere mortal programmer, I haven't the slightest idea why should compile. Sooner or later, someone is going to die over errors likethat. If a==c actually calls a function opCmp, and one is writing code that wants to 'tickle' that function. This comes up sometimes in writing test coverage code, or in profiling code. This can also come up in generic code if one wants to verify that a and b are "comparable", but not care what the result is. It can also come up as the result of the combination of various optimizations and function inlining.
Jun 30 2004
In article <cbti1b$1765$1 digitaldaemon.com>, Walter says..."Rex Couture" <Rex_member pathlink.com> wrote in message news:cbt6f5$l92$1 digitaldaemon.com...Oh, and thanks for replying. Sorry I didn't see your reply sooner. There are a lot of messages here. Some of my later messages are a little tart.In article <cbsfns$2n3g$1 digitaldaemon.com>, Walter says...
Jun 30 2004
"Rex Couture" <Rex_member pathlink.com> wrote in message news:cbtvc7$23pn$1 digitaldaemon.com...In article <cbti1b$1765$1 digitaldaemon.com>, Walter says...are a"Rex Couture" <Rex_member pathlink.com> wrote in message news:cbt6f5$l92$1 digitaldaemon.com...Oh, and thanks for replying. Sorry I didn't see your reply sooner. ThereIn article <cbsfns$2n3g$1 digitaldaemon.com>, Walter says...lot of messages here. Some of my later messages are a little tart.No worries. I'm at least a couple thousand messages behind :-(
Jun 30 2004
On Mon, 28 Jun 2004 12:49:56 -0700, Walter wrote:Check out this exerpt from: http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160 "Of course, this has been tried before-most compilers generate various warnings when they encounter questionable code. Old-time Unix/C programmers will certainly recall lint(1), a code-checker that did cross-file error checking and parameter type matching. These tools have existed for years but are not popular. Why? Because they generate a lot of warnings, and, as countless software engineers have pointed out, it's time-consuming to sift through the spurious warnings looking for the ones that really matter. I've got news for them: there is no such thing as a warning that doesn't matter. That's why it warns you. Anyone who has worked with enough code will tell you that, generally, software that compiles without warnings crashes less often. As far as I'm concerned, warnings are for wimps. Tools such as lint(1) and DevStudio should not issue warnings: they should decide if they've found an error and stop the build process, or they should shut up and generate code."In the code below, is the non-use of the function argument 'a' an error or not? If its an error then why does D allow it? If its not an error, wouldn't it be 'nice' to inform the coder of a POTENTIAL error or not? I work with a language that, by default, informs me about this type of coding. I have the option to turn the warning off (on a per function basis if required). -- Derek Melbourne, Australia 29/Jun/04 10:50:48 AM
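The code in Derek's post did not survive the archive. A minimal stand-in that raises the same question (the function name is invented; the unused argument 'a' is the point) would be:

    // 'a' is accepted but never used in the body.
    int addTen(int a, int b)
    {
        return b + 10;
    }

    int main()
    {
        return addTen(1, 2);   // builds and runs without any complaint about 'a'
    }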
Jun 28 2004
"Derek Parnell" <derek psych.ward> wrote in message news:cbqep2$2nut$1 digitaldaemon.com...In the code below, is the non-use of the function argument 'a' an error or not? If its an error then why does D allow it?It is not an error. There are many legitimate cases where one would have unused arguments.If its not an error, wouldn't it be 'nice' to inform the coder of a POTENTIAL error or not?Let's say you know it is not an error in a particular case, and turn off or ignore the warning messages. Now you pass the code on to the maintainers, post it on the internet, sell it to a customer. They try to compile the code, and get the warning. What do they do now? From personal experience, I've found it leaves a bad impression, coupled with confusion and uncertainty, when users of the code compile it and it generates warnings. That leaves one's best option as "compile with warnings flagged as errors" so the customer doesn't see them, which is essentially what D does. (Another thing D does is adjust the syntax in a few cases so that many typical C warnings can't happen.)I work with a language that, by default, informs me about this type of coding. I have the option to turn the warning off (on a per function basis if required).I've shipped a lot of code with funky pragmas that turn off specific warnings in specific parts of the code. It's a kludge at best. I'm trying to do better with D.
Jun 28 2004
On Mon, 28 Jun 2004 19:01:52 -0700, Walter wrote:"Derek Parnell" <derek psych.ward> wrote in message news:cbqep2$2nut$1 digitaldaemon.com...I think at best one could say that is not necessarily an error. Yes there are many legit cases, and many non-legit cases too.In the code below, is the non-use of the function argument 'a' an error or not? If its an error then why does D allow it?It is not an error. There are many legitimate cases where one would have unused arguments.I would prefer that by default 'D', or any language for that matter, behaves like D does now. That is, allow coders to do this sort of thing. However, I'd also like to be able to turn on such a checking process once in a while, to make sure I didn't *accidentally* forget a piece of poor coding. So maybe a lint-like application is not such a silly idea. It is an additional review of my code; one that has better "eyes" than me or my colleagues. -- Derek Melbourne, Australia 29/Jun/04 2:07:46 PMIf its not an error, wouldn't it be 'nice' to inform the coder of a POTENTIAL error or not?Let's say you know it is not an error in a particular case, and turn off or ignore the warning messages. Now you pass the code on to the maintainers, post it on the internet, sell it to a customer. They try to compile the code, and get the warning. What do they do now? From personal experience, I've found it leaves a bad impression, coupled with confusion and uncertainty, when users of the code compile it and it generates warnings. That leaves one's best option as "compile with warnings flagged as errors" so the customer doesn't see them, which is essentially what D does. (Another thing D does is adjust the syntax in a few cases so that many typical C warnings can't happen.)I work with a language that, by default, informs me about this type of coding. I have the option to turn the warning off (on a per function basis if required).I've shipped a lot of code with funky pragmas that turn off specific warnings in specific parts of the code. It's a kludge at best. I'm trying to do better with D.
Jun 28 2004
On Tue, 29 Jun 2004 14:16:00 +1000, Derek Parnell wrote:I would prefer that by default 'D', or any language for that matter, behaves like D does now. That is, allow coders to do this sort of thing. However, I'd also like to be able to turn on such a checking process once in a while, to make sure I didn't *accidentally* forget a piece of poor coding. So maybe a lint-like application is not such a silly idea. It is an additional review of my code; one that has better "eyes" than me or my colleagues.I'll throw in my opinion here, because I know you're all dying to hear it ;) My problem with warnings in most languages is that they are not all the same. That is, if I have a product which needs to build on several GCC versions along with VC6 (this is the case right now, actually), there are several places where, due to differences in the warnings/errors of the compiler, certain C++ code simply cannot be done the same way on both compilers, and that's ridiculous. I don't mind the concept of warnings, because it really can tell you when you've done something that may be wrong. After all, a compiler shouldn't be so pedantic as to croak on *every* little thing. One consideration is this: the D standard could specify 'official' warnings, which are the only ones competing implementations may throw. Optionally, I feel a lint program could be valuable. Just as long as my different compilers work the same. Note: even some Java compilers/VMs can be bitchy, even with the same language version. Just some brain food for yah. Mike Swieton __ Things won are done, joy's soul lies in the doing. - William Shakespeare
Jun 28 2004
"Mike Swieton" <mike swieton.net> wrote in message news:pan.2004.06.29.04.31.05.718520 swieton.net...My problem with warnings in most languages is that they are not all thesame.That is, if I have a product which needs to build on several GCC versions along with VC6 (this is the case right now, actually), there are several places where, due to differences in the warnings/errors of the compiler, certain C++ code simply cannot be done the same way on both compilers, and that's ridiculous.I've run into that, too <g>.
Jun 29 2004
On Tue, 29 Jun 2004 00:31:06 -0400, Mike Swieton <mike swieton.net> wrote:On Tue, 29 Jun 2004 14:16:00 +1000, Derek Parnell wrote:From what you've said I think the best soln is that the compiler doesn't ever throw warnings, just errors. If you think it is and want to be 'extra safe' you run a lint-like program after it compiles. To me they are seperate and distinct things, one is compiling a program from instructions, either it will go, or it won't. The other is double checking potential mistakes in those instructions. A lint-like program, or your development ide or whatever can do this step. Regan -- Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/I would prefer that by default 'D', or any language for that matter, behaves like D does now. That is, allow coders to do this sort of thing. However, I'd also like to be able to turn on such a checking process once in a while, to make sure I didn't *accidentally* forget a piece of poor coding. So maybe a lint-like application is not such a silly idea. It is an additional review of my code; one that has better "eyes" than me or my colleagues.I'll throw in my opinion here, because I know you're all dying to hear it ;) My problem with warnings in most languages is that they are not all the same. That is, if I have a product which needs to build on several GCC versions along with VC6 (this is the case right now, actually), there are several places where, due to differences in the warnings/errors of the compiler, certain C++ code simply cannot be done the same way on both compilers, and that's ridiculous. I don't mind the concept of warnings, because it really can tell you when you've done something that may be wrong. After all, a compiler shouldn't be so pedantic as to croak on *every* little thing. One consideration is this: the D standard could specify 'official' warnings, which are the only ones competing implementations may throw. Optionally, I feel a lint program could be valuable. Just as long as my different compilers work the same. Note: even some Java compilers/VMs can be bitchy, even with the same language version. Just some brain food for yah.
Jun 29 2004
In article <cbqmu9$2e7$1 digitaldaemon.com>, Walter says..."Derek Parnell" <derek psych.ward> wrote in message news:cbqep2$2nut$1 digitaldaemon.com...I had thought that D treated unused variables as errors. Is this not the case for unused arguments? Not that I'm complaining--I agree that there are legitimate uses for this technique. SeanIn the code below, is the non-use of the function argument 'a' an error or not? If its an error then why does D allow it?It is not an error. There are many legitimate cases where one would have unused arguments.
Jun 29 2004
Sean Kelly wrote:In article <cbqmu9$2e7$1 digitaldaemon.com>, Walter says...DMD compiles either unused variables or unused arguments without complaint: int whatever(int i, int j, int k) { return 1; } void main() { int m; int n; n = whatever(n, n, n); } That's fine with me. I might not want to use all of the variables or arguments. -- Justin (a/k/a jcc7) http://jcc_7.tripod.com/d/"Derek Parnell" <derek psych.ward> wrote in message news:cbqep2$2nut$1 digitaldaemon.com...I had thought that D treated unused variables as errors. Is this not the case for unused arguments? Not that I'm complaining--I agree that there are legitimate uses for this technique. SeanIn the code below, is the non-use of the function argument 'a' an error or not? If its an error then why does D allow it?It is not an error. There are many legitimate cases where one would have unused arguments.
Jun 29 2004
In article <cbtc94$toq$1 digitaldaemon.com>, J C Calvarese says...DMD compiles either unused variables or unused arguments without complaint:I think that's just plain nuts. Unused variables usually implies a programming error, and a warning is most appropriate. Of course, sometimes it means you have just commented out some code for debugging. Sometimes it means that you have no use for a returned argument, but I have rarely seen code with enough warnings to be a problem. I give up. For a language that's supposed to be not for purists, it seems like D is getting to be a very strange mixture of fire and purity. Typesafe conditional statements are too pure, but warnings are not pure enough. I guess real programmers don't need no stinkin' warnings to tell them they screwed up. I'm pretty sure by that criterion I'll never qualify as a real programmer.
Jun 29 2004
On Wed, 30 Jun 2004 04:12:11 +0000 (UTC), Rex Couture wrote:In article <cbtc94$toq$1 digitaldaemon.com>, J C Calvarese says...I'm with you, Rex. I'm a firm believer in peer reviews and automated tools to *assist* the coder. A compiler, to me, is supposed to be a tool to help us poor humans. By all means, don't have the compiler /stop/ us doing stupid things, but at least let us know when we might be doing such. -- Derek Melbourne, Australia 30/Jun/04 2:44:28 PMDMD compiles either unused variables or unused arguments without complaint:I think that's just plain nuts. Unused variables usually implies a programming error, and a warning is most appropriate. Of course, sometimes it means you have just commented out some code for debugging. Sometimes it means that you have no use for a returned argument, but I have rarely seen code with enough warnings to be a problem. I give up. For a language that's supposed to be not for purists, it seems like D is getting to be a very strange mixture of fire and purity. Typesafe conditional statements are too pure, but warnings are not pure enough. I guess real programmers don't need no stinkin' warnings to tell them they screwed up. I'm pretty sure by that criterion I'll never qualify as a real programmer.
Jun 29 2004
On Wed, 30 Jun 2004 14:46:38 +1000, Derek Parnell <derek psych.ward> wrote:On Wed, 30 Jun 2004 04:12:11 +0000 (UTC), Rex Couture wrote:I think you should give Walter a chance to give you an example where you'd want to have an un-used parameter, I suspect the times you'd want one have all been solved by having default function parameters, see my post asking walter for examples, for an example if this. To be fair an un-used parameter does not cause a crash, it might not cause the desired behaviour, as it's not being used to do whatever it is supposed to do, but, you should notice this either in a DBC out block OR in a unittest OR the first time you run your code. Regan. -- Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/In article <cbtc94$toq$1 digitaldaemon.com>, J C Calvarese says...I'm with you, Rex. I'm a firm believer in peer reviews and automated tools to *assist* the coder. A compiler, to me, is supposed to be a tool to help us poor humans. By all means, don't have the compiler /stop/ us doing stupid things, but at least let us know when we might be doing such.DMD compiles either unused variables or unused arguments without complaint:I think that's just plain nuts. Unused variables usually implies a programming error, and a warning is most appropriate. Of course, sometimes it means you have just commented out some code for debugging. Sometimes it means that you have no use for a returned argument, but I have rarely seen code with enough warnings to be a problem. I give up. For a language that's supposed to be not for purists, it seems like D is getting to be a very strange mixture of fire and purity. Typesafe conditional statements are too pure, but warnings are not pure enough. I guess real programmers don't need no stinkin' warnings to tell them they screwed up. I'm pretty sure by that criterion I'll never qualify as a real programmer.
Jun 29 2004
In article <opsad6jgkc5a2sq9 digitalmars.com>, Regan Heath says...On Wed, 30 Jun 2004 14:46:38 +1000, Derek Parnell <derek psych.ward> wrote:By all means, if there is some guaranteed mechanism to warn you of unused variables -- whatever it is -- that's fine. But I wonder if there is any practical difference between this and a compiler warning. It probably has to warn you every time to be of any significant use. I think Walter has a different objective than most programmers. He wants to sell software to someone, and doesn't want it to ever generate a warning. Most of us need the compiler to tell us about unused variables, most of the time. There are simple ways of suppressing warnings if you really must. By the way, the strict boolean issue is not going to go away.On Wed, 30 Jun 2004 04:12:11 +0000 (UTC), Rex Couture wrote:I think you should give Walter a chance to give you an example where you'd want to have an un-used parameter, I suspect the times you'd want one have all been solved by having default function parameters, see my post asking walter for examples, for an example if this. To be fair an un-used parameter does not cause a crash, it might not cause the desired behaviour, as it's not being used to do whatever it is supposed to do, but, you should notice this either in a DBC out block OR in a unittest OR the first time you run your code. Regan.In article <cbtc94$toq$1 digitaldaemon.com>, J C Calvarese says...I'm with you, Rex. I'm a firm believer in peer reviews and automated tools to *assist* the coder. A compiler, to me, is supposed to be a tool to help us poor humans. By all means, don't have the compiler /stop/ us doing stupid things, but at least let us know when we might be doing such.DMD compiles either unused variables or unused arguments without complaint:I think that's just plain nuts. Unused variables usually implies a programming error, and a warning is most appropriate. Of course, sometimes it means you have just commented out some code for debugging. Sometimes it means that you have no use for a returned argument, but I have rarely seen code with enough warnings to be a problem. I give up. For a language that's supposed to be not for purists, it seems like D is getting to be a very strange mixture of fire and purity. Typesafe conditional statements are too pure, but warnings are not pure enough. I guess real programmers don't need no stinkin' warnings to tell them they screwed up. I'm pretty sure by that criterion I'll never qualify as a real programmer.
Jun 29 2004
Not in reply to anyone in particular.... In C and C++, the function: int foo(int x) { (void) x; return 1; } will compile without error or warning, as I believe it should. That strange void line tells the compiler *I DON'T WANT TO USE THIS VARIABLE*. I'm in favor of unused variables being a compile-error, unless explicitly indicated by the programmer, as above (or using some other, D-specific, syntax). Arcane Jill
Jun 30 2004
In article <cbtqrd$1lhq$1 digitaldaemon.com>, Arcane Jill says...Not in reply to anyone in particular.... In C and C++, the function: will compile without error or warning, as I believe it should. That strange void line tells the compiler *I DON'T WANT TO USE THIS VARIABLE*. I'm in favor of unsused variables being a compile-error, unless explicitly indicated by the programmer, as above (or using some other, D-specific, syntax). Arcane JillAn elegant solution. Can I assume a similar void statement would also work outside the function, to discard an unwanted parameter?
Jun 30 2004
Regan Heath wrote:I think you should give Walter a chance to give you an example where you'd want to have an un-used parameter, I suspect the times you'd want one have all been solved by having default function parameters, see my post asking walter for examples, for an example if this. To be fair an un-used parameter does not cause a crash, it might not cause the desired behaviour, as it's not being used to do whatever it is supposed to do, but, you should notice this either in a DBC out block OR in a unittest OR the first time you run your code.After defining an API, you may change it in the future, and choose to ignore one or more arguments. Or, consider the situation where you have an API which had different implementations. Some of the implementations might make use of certain parameters which are ignored in other implementations.
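A hedged sketch of the situation Russ describes (the interface and class names are invented): one API, two implementations, and one of the implementations has no use for the parameters the other needs.

    interface Logger
    {
        void log(char[] message, int severity);
    }

    // This implementation uses both parameters.
    class MemoryLogger : Logger
    {
        char[] lastMessage;
        int worstSeverity;

        void log(char[] message, int severity)
        {
            lastMessage = message;
            if (severity > worstSeverity)
                worstSeverity = severity;
        }
    }

    // This implementation deliberately ignores both parameters.
    class NullLogger : Logger
    {
        void log(char[] message, int severity)
        {
            // nothing to do: logging is switched off
        }
    }

NullLogger is a legitimate implementation even though it uses neither argument, which is the kind of case where a blanket "unused parameter" error would get in the way.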
Jul 01 2004
On Thu, 01 Jul 2004 21:53:18 -0700, Russ Lewis <spamhole-2001-07-16 deming-os.org> wrote:Regan Heath wrote:Don't you then end up with an API that does not behave the same as it used to? i.e. int doStuff(bool sort, bool interleave, bool capitalise) { .. } if you start to ignore one or more of those, then the function will be doing either one or the other (sort or not sort ..etc..), and not what is specified.I think you should give Walter a chance to give you an example where you'd want to have an un-used parameter, I suspect the times you'd want one have all been solved by having default function parameters, see my post asking walter for examples, for an example if this. To be fair an un-used parameter does not cause a crash, it might not cause the desired behaviour, as it's not being used to do whatever it is supposed to do, but, you should notice this either in a DBC out block OR in a unittest OR the first time you run your code.After defining an API, you may change it in the future, and choose to ignore one or more arguments.Or, consider the situation where you have an API which had different implementations. Some of the implementations might make use of certain parameters which are ignored in other implementations.In this case, unless you can choose the implementation then it's the same as above, you get different results and have no control over when/where so the API will be incosistent. Basically if you start to ignore a parameter you change the behaviour, and that's bad.. right? Do you have a specific example where it doesn't change the behaviour? Regan. -- Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
Jul 01 2004
On Fri, 02 Jul 2004 17:47:02 +1200, Regan Heath wrote:On Thu, 01 Jul 2004 21:53:18 -0700, Russ Lewis <spamhole-2001-07-16 deming-os.org> wrote:I have an real-world example where occasionally it is right to ignore a parameter. I use a library that is a Windows GUI development tool (BTW, I must think about porting it to D) and the way it handles events is that the application sets up an event handler for a control/event combination. The library calls your event handler with exactly three parameters : The ID of the control the event happened to, the id of the event type, and a dynamic array of parameters specific to the event. Not every event handler needs all this information, but it is given to each and every event handler. So frequently, the code in the event handler ignores one or more of the parameters supplied to it. An example might be useful (not D code) ... Here the 'event' parameter is not needed even though it is supplied by the GUI library. It would be nice to tell the compiler that I'm deliberately not using that parameter. The 'expire' idea would suffice. -- Derek Melbourne, Australia 2/Jul/04 3:53:09 PMRegan Heath wrote:Don't you then end up with an API that does not behave the same as it used to? i.e. int doStuff(bool sort, bool interleave, bool capitalise) { .. } if you start to ignore one or more of those, then the function will be doing either one or the other (sort or not sort ..etc..), and not what is specified.I think you should give Walter a chance to give you an example where you'd want to have an un-used parameter, I suspect the times you'd want one have all been solved by having default function parameters, see my post asking walter for examples, for an example if this. To be fair an un-used parameter does not cause a crash, it might not cause the desired behaviour, as it's not being used to do whatever it is supposed to do, but, you should notice this either in a DBC out block OR in a unittest OR the first time you run your code.After defining an API, you may change it in the future, and choose to ignore one or more arguments.Or, consider the situation where you have an API which had different implementations. Some of the implementations might make use of certain parameters which are ignored in other implementations.In this case, unless you can choose the implementation then it's the same as above, you get different results and have no control over when/where so the API will be incosistent. Basically if you start to ignore a parameter you change the behaviour, and that's bad.. right? Do you have a specific example where it doesn't change the behaviour? Regan.
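Derek's example was not D code and did not survive the archive in any case. A rough D rendering of the pattern he describes (all names here are invented) is a handler that receives the control id, the event id and the event parameters, and simply ignores the ones it does not need:

    // Every handler is called with the same three arguments, needed or not.
    alias void function(int controlId, int eventId, int[] parms) EventHandler;

    void onCloseClicked(int controlId, int eventId, int[] parms)
    {
        // Only the fact that the click happened matters here; 'controlId',
        // 'eventId' and 'parms' are all deliberately unused.
        shutDownApplication();
    }

    void shutDownApplication()
    {
        // tidy up and exit (details left out of this sketch)
    }

    // Registered with the (hypothetical) GUI library elsewhere.
    EventHandler onClose = &onCloseClicked;

Something like the 'expire' idea, or any explicit way of saying "these parameters are ignored on purpose", would let such a handler document its intent without inventing dummy uses for each argument.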
Jul 01 2004
On Fri, 2 Jul 2004 16:17:39 +1000, Derek Parnell <derek psych.ward> wrote:On Fri, 02 Jul 2004 17:47:02 +1200, Regan Heath wrote:Ahh.. yes, good example. :) ReganOn Thu, 01 Jul 2004 21:53:18 -0700, Russ Lewis <spamhole-2001-07-16 deming-os.org> wrote:I have an real-world example where occasionally it is right to ignore a parameter. I use a library that is a Windows GUI development tool (BTW, I must think about porting it to D) and the way it handles events is that the application sets up an event handler for a control/event combination. The library calls your event handler with exactly three parameters : The ID of the control the event happened to, the id of the event type, and a dynamic array of parameters specific to the event. Not every event handler needs all this information, but it is given to each and every event handler. So frequently, the code in the event handler ignores one or more of the parameters supplied to it.Regan Heath wrote:Don't you then end up with an API that does not behave the same as it used to? i.e. int doStuff(bool sort, bool interleave, bool capitalise) { .. } if you start to ignore one or more of those, then the function will be doing either one or the other (sort or not sort ..etc..), and not what is specified.I think you should give Walter a chance to give you an example where you'd want to have an un-used parameter, I suspect the times you'd want one have all been solved by having default function parameters, see my post asking walter for examples, for an example if this. To be fair an un-used parameter does not cause a crash, it might not cause the desired behaviour, as it's not being used to do whatever it is supposed to do, but, you should notice this either in a DBC out block OR in a unittest OR the first time you run your code.After defining an API, you may change it in the future, and choose to ignore one or more arguments.Or, consider the situation where you have an API which had different implementations. Some of the implementations might make use of certain parameters which are ignored in other implementations.In this case, unless you can choose the implementation then it's the same as above, you get different results and have no control over when/where so the API will be incosistent. Basically if you start to ignore a parameter you change the behaviour, and that's bad.. right? Do you have a specific example where it doesn't change the behaviour? Regan.An example might be useful (not D code) ... parms) Here the 'event' parameter is not needed even though it is supplied by the GUI library. It would be nice to tell the compiler that I'm deliberately not using that parameter. The 'expire' idea would suffice.-- Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
Jul 02 2004
Good point. You convinced me about warnings. But see Daniel Horn's point below. What about a compromise? A standard compiler directive (or metacode, or whatever you wish to call it), which can turn strict booleans off and on for those who need that feature. If you put it right in the code before and after the line(s) in question, it's right in the code, and they will not be in doubt. In article <cbqmu9$2e7$1 digitaldaemon.com>, Walter says...Let's say you know it is not an error in a particular case, and turn off or ignore the warning messages. Now you pass the code on to the maintainers, post it on the internet, sell it to a customer. They try to compile the code, and get the warning. What do they do now?I've shipped a lot of code with funky pragmas that turn off specific warnings in specific parts of the code. It's a kludge at best. I'm trying to do better with D.============================== On Monday, June 28, Daniel Horn wrote: "Case in point of course is boolean logic. "We should have an optional warning flag passed into the compiler when the user uses a boolean expression as an int or vice versa... the warning may be off for most devels, and for those who deign to turn it on, it would error on said warning."
Jun 29 2004
On Mon, 28 Jun 2004 19:01:52 -0700, Walter <newshound digitalmars.com> wrote:"Derek Parnell" <derek psych.ward> wrote in message news:cbqep2$2nut$1 digitaldaemon.com...Can you give us one or two? The only ones I can think of are for example... void doSomething(char *foo, int bar, int reserved) { ..use foo and bar.. } where reserved is for a future possible extension to this function. Default function parameters *solve* this case IMO. Instead of the above you have... void doSomething(char *foo, int bar) { ..use foo and bar.. } then you can extend it... void doSomething(char *foo, int bar, long[] baz = null) { ..use foo and bar and baz.. } Regan -- Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/In the code below, is the non-use of the function argument 'a' an error or not? If its an error then why does D allow it?It is not an error. There are many legitimate cases where one would have unused arguments.
Jun 29 2004
Derek Parnell wrote:On Mon, 28 Jun 2004 12:49:56 -0700, Walter wrote: In the code below, is the non-use of the function argument 'a' an error or not? If its an error then why does D allow it? If its not an error, wouldn't it be 'nice' to inform the coder of a POTENTIAL error or not?

Ok, I'm not going to take sides on the warning issue, because frankly I agree with both sides here. But I am going to jump in and say that you could treat this as an error, and still handle the legitimate cases, if you had an 'expire' construct, that worked like this:

    int foo(int a)
    {
        expire a;
        return 1;
    }

'expire' would simply make a contract that a variable must not be used later in the function. So, it would be an error to have a function which did not use one of its arguments - unless it explicitly expired them. It is an explicit contract of "I don't care about this value." Or, perhaps, maybe expire should be simply a statement that tells the compiler "pretend as though this is uninitialized data." So try out this code:

    void bar(int b)
    {
        int arg_save = b;
        printf("Old value of b = %d\n", b);
        expire b;
        int a = b;              // SYNTAX ERROR, b is expired
        b = arg_save*arg_save;
        printf("New value of b = %d\n", b);  // OK, b is valid again
    }

Thoughts?
Jul 01 2004
On Thu, 01 Jul 2004 17:34:15 -0700, Russ Lewis wrote:Derek Parnell wrote:I like this idea a lot. It is brief, explicit, permissive, parsible, and doesn't break any existing code. -- Derek Melbourne, Australia 2/Jul/04 10:39:43 AMOn Mon, 28 Jun 2004 12:49:56 -0700, Walter wrote: In the code below, is the non-use of the function argument 'a' an error or not? If its an error then why does D allow it? If its not an error, wouldn't it be 'nice' to inform the coder of a POTENTIAL error or not?Ok, I'm not going to take sides on the warning issue, because frankly I agree with both sides here. But I am going to jump in and say that you could treat this as an error, and still handle the legitimate cases, if you had an 'expire' construct, that worked like this: int foo(int a) { expire a; return 1; } 'expire' would simply make a contract that a variable must not be used later in the function. So, it would be an error to have a function which did not use one of its arguments - unless it explicitly expired them. It is an explicit contract of "I don't care about this value." Or, perhaps, maybe expire should be simply a statement that tells the compiler "pretend as though this is uninitialized data." So try out this code: void bar(int b) { int arg_save = b; printf("Old value of b = %d\n", b); expire b; int a = b; // SYNTAX ERROR, b is expired b = arg_save*arg_save; printf("New value of b = %d\n", b); // OK, b is valid again } Thoughts?
Jul 01 2004
In article <cc2ai7$25r$1 digitaldaemon.com>, Russ Lewis says...Ok, I'm not going to take sides on the warning issue, because frankly I agree with both sides here. But I am going to jump in and say that you could treat this as an error, and still handle the legitimate cases, if you had an 'expire' construct, that worked like this: int foo(int a) { expire a; return 1; } 'expire' would simply make a contract that a variable must not be used later in the function. So, it would be an error to have a function which did not use one of its arguments - unless it explicitly expired them. It is an explicit contract of "I don't care about this value."The equivalent thing in C/C++ would be to define foo this way: int foo(int) { return 1; } Thus indicating to the compiler that the parameter is not used in the function body. D does not allow this syntax, but I favor it over the "expire" idea since it avoids the creation of a new keyword. Sean
Jul 02 2004
In article <cc4bb3$ap1$1 digitaldaemon.com>, Sean Kelly says...In article <cc2ai7$25r$1 digitaldaemon.com>, Russ Lewis says...and we can make intellisense for IDEs aware of it. AntOk, I'm not going to take sides on the warning issue, because frankly I agree with both sides here. But I am going to jump in and say that you could treat this as an error, and still handle the legitimate cases, if you had an 'expire' construct, that worked like this: int foo(int a) { expire a; return 1; } 'expire' would simply make a contract that a variable must not be used later in the function. So, it would be an error to have a function which did not use one of its arguments - unless it explicitly expired them. It is an explicit contract of "I don't care about this value."The equivalent thing in C/C++ would be to define foo this way: int foo(int) { return 1; } Thus indicating to the compiler that the parameter is not used in the function body. D does not allow this syntax, but I favor it over the "expire" idea since it avoids the creation of a new keyword.
Jul 02 2004
Sean Kelly wrote:In article <cc2ai7$25r$1 digitaldaemon.com>, Russ Lewis says...I agree that it is desirable to avoid a new keyword. However, let me point out another use for expire that is hard (sometimes impossible) to do currently: void bar() { int rc = GoDoStuff(); if(rc != 0) throw SomeSortOfException; expire rc; // we're not going to use this value again ...more code... } Thus, 'expire' would help us write more self-documenting code.Ok, I'm not going to take sides on the warning issue, because frankly I agree with both sides here. But I am going to jump in and say that you could treat this as an error, and still handle the legitimate cases, if you had an 'expire' construct, that worked like this: int foo(int a) { expire a; return 1; } 'expire' would simply make a contract that a variable must not be used later in the function. So, it would be an error to have a function which did not use one of its arguments - unless it explicitly expired them. It is an explicit contract of "I don't care about this value."The equivalent thing in C/C++ would be to define foo this way: int foo(int) { return 1; } Thus indicating to the compiler that the parameter is not used in the function body. D does not allow this syntax, but I favor it over the "expire" idea since it avoids the creation of a new keyword.
Jul 02 2004
In part I agree with you, but I think there are some special cases where warnings are wellcome. A simple example: In big projects, unused variables may happen and they tend to obfuscate the code, IMHO. In that cases we could have an option like "-Wall" from gcc to instruct the compiler to generate warnings. And of course, bugs in the library should not be treated by the compiler. Greetings, Bruno. Walter wrote:Check out this exerpt from: http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160 "Of course, this has been tried before-most compilers generate various warnings when they encounter questionable code. Old-time Unix/C programmers will certainly recall lint(1), a code-checker that did cross-file error checking and parameter type matching. These tools have existed for years but are not popular. Why? Because they generate a lot of warnings, and, as countless software engineers have pointed out, it's time-consuming to sift through the spurious warnings looking for the ones that really matter. I've got news for them: there is no such thing as a warning that doesn't matter. That's why it warns you. Anyone who has worked with enough code will tell you that, generally, software that compiles without warnings crashes less often. As far as I'm concerned, warnings are for wimps. Tools such as lint(1) and DevStudio should not issue warnings: they should decide if they've found an error and stop the build process, or they should shut up and generate code."
Jun 29 2004
Bruno A. Costa wrote:In part I agree with you, but I think there are some special cases where warnings are wellcome. A simple example: In big projects, unused variables may happen and they tend to obfuscate the code, IMHO. In that cases we could have an option like "-Wall" from gcc to instruct the compiler to generate warnings.Hmm, in my code, unused member variables and local variables are usually bugs, and unused parameters are usually not bugs, there are exceptions to both. However I don't see why searching for these should happen at compile time, the two processes interfere with each other. Perhaps we need a lint and a compiler, likely sharing some code? SamAnd of course, bugs in the library should not be treated by the compiler. Greetings, Bruno. Walter wrote:Check out this exerpt from: http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160 "Of course, this has been tried before-most compilers generate various warnings when they encounter questionable code. Old-time Unix/C programmers will certainly recall lint(1), a code-checker that did cross-file error checking and parameter type matching. These tools have existed for years but are not popular. Why? Because they generate a lot of warnings, and, as countless software engineers have pointed out, it's time-consuming to sift through the spurious warnings looking for the ones that really matter. I've got news for them: there is no such thing as a warning that doesn't matter. That's why it warns you. Anyone who has worked with enough code will tell you that, generally, software that compiles without warnings crashes less often. As far as I'm concerned, warnings are for wimps. Tools such as lint(1) and DevStudio should not issue warnings: they should decide if they've found an error and stop the build process, or they should shut up and generate code."
Jun 29 2004
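Sam's distinction is easy to show with a sketch (not from the thread; EventHandler, CodeLogger and onEvent are invented names): an unused parameter that is not a bug, because the signature is dictated by the interface being implemented.

    // Sketch: the parameter 'timestamp' is deliberately ignored; the signature is
    // fixed by the interface, so "unused parameter" is not a bug here.
    interface EventHandler
    {
        void onEvent(int eventCode, int timestamp);
    }

    class CodeLogger : EventHandler
    {
        int lastCode;

        void onEvent(int eventCode, int timestamp)
        {
            // only the event code matters to this handler
            lastCode = eventCode;
        }
    }

    void main()
    {
        EventHandler h = new CodeLogger();
        h.onEvent(7, 123456);
    }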
On Wed, 30 Jun 2004 03:52:48 +1200, Sam McCall <tunah.d tunah.net> wrote:Bruno A. Costa wrote:I agree, they are 2 different processes; they are only linked in that they look at the same input. I think the best solution is to separate them. It paves the way for a competitive market for good lint-like applications. ReganIn part I agree with you, but I think there are some special cases where warnings are wellcome. A simple example: In big projects, unused variables may happen and they tend to obfuscate the code, IMHO. In that cases we could have an option like "-Wall" from gcc to instruct the compiler to generate warnings.Hmm, in my code, unused member variables and local variables are usually bugs, and unused parameters are usually not bugs, there are exceptions to both. However I don't see why searching for these should happen at compile time, the two processes interfere with each other. Perhaps we need a lint and a compiler, likely sharing some code?-- Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/And of course, bugs in the library should not be treated by the compiler. Greetings, Bruno. Walter wrote:Check out this exerpt from: http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160 "Of course, this has been tried before-most compilers generate various warnings when they encounter questionable code. Old-time Unix/C programmers will certainly recall lint(1), a code-checker that did cross-file error checking and parameter type matching. These tools have existed for years but are not popular. Why? Because they generate a lot of warnings, and, as countless software engineers have pointed out, it's time-consuming to sift through the spurious warnings looking for the ones that really matter. I've got news for them: there is no such thing as a warning that doesn't matter. That's why it warns you. Anyone who has worked with enough code will tell you that, generally, software that compiles without warnings crashes less often. As far as I'm concerned, warnings are for wimps. Tools such as lint(1) and DevStudio should not issue warnings: they should decide if they've found an error and stop the build process, or they should shut up and generate code."
Jun 29 2004
"Regan Heath" <regan netwin.co.nz> wrote in message news:opsadrjmq55a2sq9 digitalmars.com...I agree, they are 2 different processes, they are only linked in that they look at the same imput. I think the best soln is to seperate them. It paves the way for a competitive market for good lint-like applications.It wouldn't be hard to use the existing DMD front end source as a starting point for such an app.
Jun 29 2004
Walter wrote:"...Why? Because they generate a lot of warnings, and, as countless software engineers have pointed out, it's time-consuming to sift through the spurious warnings looking for the ones that really matter..."I think it is a matter of when and how you want to spend that time."...As far as I'm concerned, warnings are for wimps..."I can't take this seriously enough. I won't hire a programmer who ignores or disables compiler warnings if they are there.Tools such as lint(1) and DevStudio should not issue warnings: they should decide if they've found an error and stop the build process, or they should shut up and generate code."There are many warnings that are real life-savers. I usually compile my own programs with gcc's -Wall -Werror (enable all common warnings and turn them into erros). They were countless times those warnings saved me. I know that D aims to fix the syntax where it may lead to bugs (like not allowing assignment expressions where a boolean result is expected), but this is still far away from "safe". There is an ancient programmer wisdom that says that if you manage to create an idiot-proof application, an enhanced type of idiot will be born on the day after. BTW, both dmd and lastest version of gcc with `-W -WallŽ compiles this without any warning (nested functions are a gcc extension): int main() { int a, b; void f(int a) { if (a) { int a = a+1; b == a++; } } f(a); return 0; } <fun> how much time for someone poping up with a www.iodcc.org? :-D </fun> -- Juliano
Jun 29 2004
In article <cbspu4$427$1 digitaldaemon.com>, Juliano Ravasi Ferraz says...There are many warnings that are real life-savers. I usually compile my own programs with gcc's -Wall -Werror (enable all common warnings and turn them into erros). They were countless times those warnings saved me. I know that D aims to fix the syntax where it may lead to bugs (like not allowing assignment expressions where a boolean result is expected), but this is still far away from "safe".It sounds like you agree, but the D compiler always runs as if you had -Wall -Werror set. I think this is fantastic. Working with third-party libraries in C++ is a headache for folks like me who compile with these options set in C++. It's often not feasible to fix errors in third-party code, and wrapping everything in a million pragmas is not a fun way to code :)There is an ancient programmer wisdom that says that if you manage to create an idiot-proof application, an enhanced type of idiot will be born on the day after.I'm in the "safety through documentation" camp. If the user compiles without DBC enabled and didn't read the docs then it's not my problem if he's being an idiot.BTW, both dmd and lastest version of gcc with `-W -Wall` compiles this without any warning (nested functions are a gcc extension): int main() { int a, b; void f(int a) { if (a) { int a = a+1; b == a++; } } f(a); return 0; }Kind of an odd construct, but I don't see anything wrong with an expression as a statement. And this one has a side-effect so a compiler might not consider it unnecessary code anyway. Sean
Jun 29 2004
Walter wrote:Check out this exerpt from: http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160<snip> Arcane Jill has half taken the words out of my mouth, but I'll say it anyway. The idea of a _language_ having warnings or not having warnings makes little or no sense to me. Indeed, does the C specification say anything about whether the compiler should complain about an unused function parameter, a comparison with an inherently constant truth value, or the common typo of if (qwert = yuiop) { ... } ...? Different compiler writers, and hence different compilers, have different conceptions of what is probably a coding error and what isn't. For example, Borland C++ has warnings that GCC doesn't, and vice versa. Is this meant to be a decree that any D compiler written by anybody should never output a warning in its life? This has its own problems. Three options remain: (a) let the dodgy code silently slip through, no matter how dodgy it is. This would hinder the principle of eliminating common bugs from the start, and consequently reduce the scope for competition in implementation quality. Since D aims to help get rid of common bugs, it seems silly to try and stop different compilers from helping further. (b) take anything the implementer feels is dodgy as an error. This could seriously break the other bit of D philosophy: portability. (c) have (a) and (b) as separate compilation modes. (a) would be used to compile programs written by someone else, (b) would be used to compile your own stuff. But what about using libraries? You'd end up mixing the two compilation modes in the process of building a project, even within a single module and its imports. I for one don't know what I'd do if I decided to write a D compiler.... Stewart. -- My e-mail is valid but not my primary mailbox, aside from its being the unfortunate victim of intensive mail-bombing at the moment. Please keep replies on the 'group where everyone may benefit.
Jul 05 2004
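As a concrete aside (not from the thread; the variable names follow Stewart's example): the typo Stewart mentions is exactly the kind of construct that an earlier post credits D with turning into a hard error rather than a warning, by not allowing an assignment where a boolean result is expected. A C compiler typically accepts it, at best with a warning; the typo line below is therefore left as a comment so the sketch still compiles.

    void main()
    {
        int qwert = 0;
        int yuiop = 1;

        // if (qwert = yuiop) { }   // the typo: assignment used as a condition.
        //                          // C accepts this; D is described above as rejecting it.
        if (qwert == yuiop)         // the comparison that was almost certainly intended
        {
            // ...
        }
    }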
On Mon, 05 Jul 2004 11:53:13 +0100, Stewart Gordon <smjg_1998 yahoo.com> wrote:Walter wrote:It's my impression that compilation and lint-like processing are 2 different steps. The compiler does the first; it does not do the second. What this means is that regardless of which D *compiler* you use, you will get the exact same errors. This gives the code greater portability: no more weird errors on system X. In addition, imagine you are writing a cross-platform app, compiling it on Windows, Linux, FreeBSD, MacOSX, etc. Why do the lint-like process X times (where X is the number of operating systems you compile for) when you only *need* to do it once? Furthermore, you want to be able to choose which lint-like program to run, with your favourite config options, and you want to run it on the fastest hardware, not that old Mac you use for your MacOSX compilations. Walter has implied that it would be easy to write a lint-like program using the DMD front end. I suggest that anyone who is really worried about catching these sorts of 'possibly a coding error' errors starts a lint-like project using the dmd front end. Regan. -- Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/Check out this exerpt from: http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=160<snip> Arcane Jill has half taken the words out of my mouth, but I'll say it anyway. The idea of a _language_ having warnings or not having warnings makes little or no sense to me. Indeed, does the C specification say anything about whether the compiler should complain about an unused function parameter, a comparison with an inherently constant truth value, or the common typo of if (qwert = yuiop) { ... } ...? Different compiler writers, and hence different compilers, have different conceptions of what is probably a coding error and what isn't. For example, Borland C++ has warnings that GCC doesn't, and vice versa. Is this meant to be a decree that any D compiler written by anybody should never output a warning in its life? This has its own problems. Three options remain: (a) let the dodgy code silently slip through, no matter how dodgy it is. This would hinder the principle of eliminating common bugs from the start, and consequently reduce the scope for competition in implementation quality. Since D aims to help get rid of common bugs, it seems silly to try and stop different compilers from helping further. (b) take anything the implementer feels is dodgy as an error. This could seriously break the other bit of D philosophy: portability. (c) have (a) and (b) as separate compilation modes. (a) would be used to compile programs written by someone else, (b) would be used to compile your own stuff. But what about using libraries? You'd end up mixing the two compilation modes in the process of building a project, even within a single module and its imports. I for one don't know what I'd do if I decided to write a D compiler....
Jul 05 2004
In article <opsany5h0b5a2sq9 digitalmars.com>, Regan Heath says...It's my impression that compilation and lint-like processing are 2 different steps. The compiler does the first, it does not do the second. What this means is that regardless of which D *compiler* you use, you will get the exact same errors. This gives the code greater portability, no more weird errors on system X. In addition, imagine you are writing a cross platform app, compiling it on Windows, Linux, FreeBSD, MacOSX, ..etc.. why do the lint-like process X times (where X is the number of operating systems you compile for), you only *need* to do it once. Furthermore, you want to be able to choose which lint-like program to run, with your favourite config options and you want to run it on the fastest hardware, not that old Mac you use for your MacOSX compilations. Walter has implied that it would be easy to write a lint-like program using the DMD front end. I suggest that anyone who is really worried about catching these sorts of 'possibly a coding error' errors starts a lint-like project using the dmd front end. Regan.An interesting idea, but it sounds like solving the problem just by calling things by different names. Which is okay, of course. What I mean is, if a third party brought out the ACME-D compiler which generated warnings, it could get away with it just by calling itself, not a compiler, but a lint-tool and compiler combined. It could simply state that it was the lint-component, not the compiler-component, which was generating the warnings, and everyone would be happy. Well, I'd buy it. However, I imagine that those purists who argue that D is so inherently well-defined that there should never be any such thing as a warning (and I am not one of them) might see this as just re-introducing warnings through the back door. Well, whatever. I'd be happy to call it a lint add-on if it did the job and kept everyone happy. Arcane Jill
Jul 05 2004
On Mon, 5 Jul 2004 20:51:12 +0000 (UTC), Arcane Jill <Arcane_member pathlink.com> wrote:In article <opsany5h0b5a2sq9 digitalmars.com>, Regan Heath says...It is.. and it isn't. I think the core idea is that to be called a D compiler it must behave as do all other D compilers (if only by default). This gives all the benefits I have mentioned above: - portability of code - speed/efficiency of compilation - flexibility to choose (the lint process/options) This does not mean a D compiler cannot have a non-conformant 'mode'; that mode should not be the default, however.It's my impression that compilation and lint-like processing are 2 different steps. The compiler does the first, it does not do the second. What this means is that regardless of which D *compiler* you use, you will get the exact same errors. This gives the code greater portability, no more weird errors on system X. In addition, imagine you are writing a cross platform app, compiling it on Windows, Linux, FreeBSD, MacOSX, ..etc.. why do the lint-like process X times (where X is the number of operating systems you compile for), you only *need* to do it once. Furthermore, you want to be able to choose which lint-like program to run, with your favourite config options and you want to run it on the fastest hardware, not that old Mac you use for your MacOSX compilations. Walter has implied that it would be easy to write a lint-like program using the DMD front end. I suggest that anyone who is really worried about catching these sorts of 'possibly a coding error' errors starts a lint-like project using the dmd front end. Regan.An interesting idea, but it sounds like solving the problem just by calling things by different names. Which is okay, of course.What I mean is, if a third party brought out the ACME-D compiler which generated warnings, it could get away with it just by calling itself, not a compiler, but a lint-tool and compiler combined. It could simply state that it was the lint-component, not the compiler-component, which was generating the warnings, and everyone would be happy. Well, I'd buy it.As long as ACME-D did not do the lint-processing by default, no problem. Making the 2 processes part of the same executable does have benefits: - smaller than 2 separate executables containing the same dmd front-end or similar syntax-processing code. - you only need one executable to do both things, so it's more likely to be installed on system X. - you can do both in 1 step. So as long as lint processing is not the default behaviour, you only get benefits from doing it this way.However, I imagine that those purists who argue that D is so inherently well-defined that there should never be any such thing as a warning (and I am not one of them)I don't believe this is even possible. The compiler can and does know the syntax and can verify that. The compiler cannot and does not know your intent; it only knows what you wrote, not what you meant to write.might see this as just re-introducing warnings through the back door.Not the secretive back door; rather, the more obvious and well-signposted service entrance.Well, whatever. I'd be happy to call it a lint add-on if it did the job and kept everyone happy.An 'add-on' by its very nature is not 'default', so by renaming it we have defined behaviour, which I believe is the best way to handle this situation. Regan -- Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
Jul 05 2004