digitalmars.D - Porting D2 code to D1
- Jason House (8/8) Jul 16 2008 It may sound backwards, but I think it may be the best way to get things...
- Steven Schveighoffer (10/29) Jul 16 2008 As someone who has worked on trying to port Tango to D2 (as recently as ...
- Jason House (3/9) Jul 16 2008 Knowing what kind of code can be converted from D1 to D2 and back to D1 ...
- Jarrett Billingsley (31/50) Jul 16 2008 Just removing const might not work if i.e. you have two methods that do ...
- Jason House (8/55) Jul 16 2008 Yeah, given a conflict, it's probably correct to go with the non-const v...
- Jarrett Billingsley (8/13) Jul 16 2008 I've always been dubious about using the version construct for various
- Jason House (3/18) Jul 17 2008 If that construct is restricted to the converter (ie. ignore eliminate v...
- Jarrett Billingsley (4/26) Jul 17 2008 Oh, oh, I see what you're saying now. Those version blocks would just b...
- Jason House (2/34) Jul 17 2008 Done right, the code could compile as D2 code out of the box. Using my ...
- Ryan Bloomfield (11/48) Jul 17 2008 Could a language construct be created in D1 and D2 to allow for a "lazy"...
- Jarrett Billingsley (4/22) Jul 17 2008 To be honest, I kind of wish this were the case with version statements ...
- Ryan Bloomfield (3/13) Jul 17 2008 I disagree. I think it's good for all version blocks to compile unde...
- Bill Baxter (16/30) Jul 17 2008 But it doesn't really solve the problem. So you have syntactically
- Ryan Bloomfield (6/21) Jul 17 2008 I thought about the limitations of syntactic validity when I was writing...
- Bill Baxter (12/37) Jul 18 2008 I think I get your gist. The syntactic validity requirement is not so
- Ryan Bloomfield (14/27) Jul 18 2008 That would solve the portability issue with D version standards. What i...
- Bill Baxter (5/36) Jul 18 2008 I think preventing the proliferation of vendor-specific language
- Leandro Lucarella (14/35) Jul 18 2008 That's true, but I think this opens a new posibility to D to introduce n...
- Bill Baxter (21/33) Jul 16 2008 I was going to suggest using some sort of pre-processor but I think that...
- Ryan Bloomfield (2/8) Jul 18 2008 "pragma" won't allow a new keyword. What if someone comes up with an ex...
- Ryan Bloomfield (2/12) Jul 18 2008 But alas, C99 doesn't have exceptions. I was wrong, must of read someth...
- Yigal Chripun (24/36) Jul 18 2008 IMHO, vendor extensions are the wrong way to add features to the
It may sound backwards, but I think it may be the best way to get things like Tango to move forward to D2. Tango officially supports D1. If it were to add D2 support, someone would have to port the D1 code to D2. Then, with each change, a similar change must occur in the D2 code base. From a maintenance standpoint, this really shouldn't be acceptable.

Since adding concepts such as const to D1 code in an automated fashion is essentially impossible, it seems the best approach is to convert D2 code into D1 code. Programmatically, it should be pretty easy to remove const-awareness. That'd allow Tango to convert to D2 once and then (more or less) maintain one code base. Maybe this would require a few special cases with D1-specific and D2-specific code, but I'd hope that wouldn't be very common.

I guess I have a few questions:
1. Besides const removal, what else must get done to convert D2 code to D1 code?
2. How can D version-specific code be mixed into a single code base?
3. Any thoughts on how to programmatically do all the conversions?
4. Would this be enough for D1 library maintainers to move to D2?
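A before/after sketch of what such a mechanical const-stripping converter might do. This is illustrative only: the class names `BufferD2` and `BufferD1` are made up, and differ only so both versions can be shown side by side.

```d
// D2 original (what the library author would maintain):
class BufferD2
{
    private char[] data;

    // D2-only: a const method returning a read-only view of the data
    const(char)[] slice() const
    {
        return data;
    }
}

// What a hypothetical D2-to-D1 converter would emit:
// every const qualifier is mechanically stripped, leaving plain D1
class BufferD1
{
    private char[] data;

    char[] slice()
    {
        return data;
    }
}
```

The point of the sketch is that removal is purely syntactic here; going the other direction (inferring where const belongs in D1 code) would require real analysis, which is why the D2-to-D1 direction is the one proposed.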
Jul 16 2008
"Jason House" wrote:
> It may sound backwards, but I think it may be the best way to get things like Tango to move forward to D2. Tango officially supports D1. If it were to add D2 support, someone would have to port the D1 code to D2. Then, with each change, a similar change must occur in the D2 code base. From a maintenance standpoint, this really shouldn't be acceptable.
> Since adding concepts such as const to D1 code in an automated fashion is essentially impossible, it seems the best approach is to convert D2 code into D1 code. Programmatically, it should be pretty easy to remove const-awareness. That'd allow Tango to convert to D2 once and then (more or less) maintain one code base. Maybe this would require a few special cases with D1-specific and D2-specific code, but I'd hope that wouldn't be very common.
> I guess I have a few questions:
> 1. Besides const removal, what else must get done to convert D2 code to D1 code?
> 2. How can D version-specific code be mixed into a single code base?
> 3. Any thoughts on how to programmatically do all the conversions?
> 4. Would this be enough for D1 library maintainers to move to D2?

As someone who has worked on trying to port Tango to D2 (as recently as last week), I believe there is more work to be done on D2 before anything like this is considered. As of now, I have at least one critical bug that must be solved before Tango on D2 can be done:

http://d.puremagic.com/issues/show_bug.cgi?id=1644

Unless Tango actually builds on D2, there is no point in looking for ways to port changes back to D1 :)

-Steve
Jul 16 2008
Steven Schveighoffer Wrote:
> As of now, I have at least one critical bug that must be solved before Tango on D2 can be done: http://d.puremagic.com/issues/show_bug.cgi?id=1644

Wow, I didn't realize anyone was still actively trying to port Tango to D2. I notice that bug is nearly a year old and has had no activity on it (besides you adding how important it is to Tango).

> Unless Tango actually builds on D2, there is no point in looking for ways to port changes back to D1 :)

Knowing what kind of code can be converted from D1 to D2 and back to D1 isn't a bad exercise. My interest really is in getting Tango to port to D2, but I'm sure there are other D1 library developers that are also holding back.
Jul 16 2008
"Jason House" <jason.james.house gmail.com> wrote in message news:g5ld20$1jbt$1 digitalmars.com...
> It may sound backwards, but I think it may be the best way to get things like Tango to move forward to D2. Tango officially supports D1. If it were to add D2 support, someone would have to port the D1 code to D2. Then, with each change, a similar change must occur in the D2 code base. From a maintenance standpoint, this really shouldn't be acceptable.
> Since adding concepts such as const to D1 code in an automated fashion is essentially impossible, it seems the best approach is to convert D2 code into D1 code. Programmatically, it should be pretty easy to remove const-awareness. That'd allow Tango to convert to D2 once and then (more or less) maintain one code base. Maybe this would require a few special cases with D1-specific and D2-specific code, but I'd hope that wouldn't be very common.
> I guess I have a few questions:
> 1. Besides const removal, what else must get done to convert D2 code to D1 code?

Just removing const might not work if, e.g., you have two methods that do the same thing, but one's const and one's non-const, though the converter might be able to pick those out.

Code that depends on D2's closure support would not work in D1, and you would have to do semantic analysis to know how to automatically convert from a D2 closure to a D1 struct.

Some minor syntactic differences, like "invariant {}" in D1 vs. "invariant() {}" in D2, as well as lexical differences, like the new string literals in D2. Numerical foreach, but that can be replaced by a simple for loop. The (laughable) "enum" syntax for constants. Struct postblits and dtors (and eventually ctors). Stuff that depends on D2's overload sets would be very tricky to convert. __traits just cannot be converted for most cases. D2 allows you to overload opStar and opDot; both could simply be called manually as methods in the D1 code but you'd need semantic analysis to know when to.

> 2. How can D version-specific code be mixed into a single code base?

You almost answered it: mixins. String mixins, that is. It's somewhat frustrating that invariants for example must be handled something like this:

    // Or would this be 'enum' in D2? lol
    const invariantBody = "{ some code blah blah blah }";
    version(D2)
        mixin("invariant() " ~ invariantBody);
    else
        mixin("invariant " ~ invariantBody);

It's times like this when you kind of wish for a "stupid" preprocessor.

> 3. Any thoughts on how to programmatically do all the conversions?

Start with DParser, I'd guess. Or one of the other D parser projects. You'd need semantic analysis for more than the simplest tasks, though.

> 4. Would this be enough for D1 library maintainers to move to D2?

Not for me, no. I won't move to D2 until it's actually complete.
Jul 16 2008
Jarrett Billingsley Wrote:
> "Jason House" <jason.james.house gmail.com> wrote in message news:g5ld20$1jbt$1 digitalmars.com...
>> It may sound backwards, but I think it may be the best way to get things like Tango to move forward to D2. Tango officially supports D1. If it were to add D2 support, someone would have to port the D1 code to D2. Then, with each change, a similar change must occur in the D2 code base. From a maintenance standpoint, this really shouldn't be acceptable. Since adding concepts such as const to D1 code in an automated fashion is essentially impossible, it seems the best approach is to convert D2 code into D1 code. Programmatically, it should be pretty easy to remove const-awareness. That'd allow Tango to convert to D2 once and then (more or less) maintain one code base. Maybe this would require a few special cases with D1-specific and D2-specific code, but I'd hope that wouldn't be very common. I guess I have a few questions:
>> 1. Besides const removal, what else must get done to convert D2 code to D1 code?

I was implicitly assuming something like a port of D1 code to D2 and then back to D1. Obviously, D2 has cool functionality that would be tough to port to D1. I probably should have also asked which subset of D2 can be easily converted to D1. Postblits and such are obviously not portable to D1.

> Just removing const might not work if, e.g., you have two methods that do the same thing, but one's const and one's non-const, though the converter might be able to pick those out.

Yeah, given a conflict, it's probably correct to go with the non-const version.

> Code that depends on D2's closure support would not work in D1, and you would have to do semantic analysis to know how to automatically convert from a D2 closure to a D1 struct.

In the clarified context, I think this is an ignorable difference.

> Some minor syntactic differences, like "invariant {}" in D1 vs. "invariant() {}" in D2, as well as lexical differences, like the new string literals in D2. Numerical foreach, but that can be replaced by a simple for loop. The (laughable) "enum" syntax for constants.

Those should be easy to convert over.

> Struct postblits and dtors (and eventually ctors). Stuff that depends on D2's overload sets would be very tricky to convert. __traits just cannot be converted for most cases. D2 allows you to overload opStar and opDot; both could simply be called manually as methods in the D1 code but you'd need semantic analysis to know when to.

Those are probably all tougher to convert over, but are hopefully out of context. I'm sure a previous D1 library maintainer wouldn't mind sticking with D1-like syntax :)

>> 2. How can D version-specific code be mixed into a single code base?
> You almost answered it: mixins. String mixins, that is. It's times like this when you kind of wish for a "stupid" preprocessor.

I can't imagine any library maintainer being willing to do that just for portability between D versions. It'd be nice if D2 code could have something like version(d1){} and version(d2){} for this purpose. That'd then allow the d2 compiler to ignore version(d1){} and allow the converter to strip out version(d2){}.
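A sketch of how that version(d1)/version(d2) scheme might look in source. The identifiers d1 and d2 are not predefined by any compiler; this assumes the file is compiled as D2 with -version=d2 while the hypothetical converter strips the version(d2) branch and keeps the version(d1) one:

```d
struct Person
{
    char[] nameData;

    version(d2)
    {
        // D2-only branch: const method returning a read-only view
        const(char)[] name() const { return nameData; }
    }
    version(d1)
    {
        // D1 fallback the converter would leave in place.
        // Note: a real D2 compiler still requires this block to be
        // syntactically valid, which is the catch raised below.
        char[] name() { return nameData; }
    }
}
```

The caveat in the second comment is exactly the objection that comes up next in the thread: version blocks must parse even when not taken, so this only works if the converter, not the compiler, handles the d1 branch.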
Jul 16 2008
"Jason House" <jason.james.house gmail.com> wrote in message news:g5lib0$23ju$1 digitalmars.com...
> I can't imagine any library maintainer being willing to do that just for portability between D versions. It'd be nice if D2 code could have something like version(d1){} and version(d2){} for this purpose. That'd then allow the d2 compiler to ignore version(d1){} and allow the converter to strip out version(d2){}

I've always been dubious about using the version construct for various language versions. It's great for program options, but since the stuff in the version block has to be syntactically legal, it makes it worthless for supporting multiple versions of D. It's almost like another construct is needed. I like what you're thinking but I don't know if using the version construct is the right way to do it.
Jul 16 2008
Jarrett Billingsley Wrote:
> "Jason House" <jason.james.house gmail.com> wrote in message news:g5lib0$23ju$1 digitalmars.com...
>> I can't imagine any library maintainer being willing to do that just for portability between D versions. It'd be nice if D2 code could have something like version(d1){} and version(d2){} for this purpose. That'd then allow the d2 compiler to ignore version(d1){} and allow the converter to strip out version(d2){}
> I've always been dubious about using the version construct for various language versions. It's great for program options, but since the stuff in the version block has to be syntactically legal, it makes it worthless for supporting multiple versions of D.

Who says what's in version(d1) has to be legal D1 code? ;) I'd vote that the code would still look like D2 code and would be converted in the same way as all other code.

> It's almost like another construct is needed. I like what you're thinking but I don't know if using the version construct is the right way to do it.

If that construct is restricted to the converter (i.e. ignore/eliminate version(d2){}), I think nobody will complain.
Jul 17 2008
"Jason House" <jason.james.house gmail.com> wrote in message news:g5nhb4$r9d$1 digitalmars.com...
> Jarrett Billingsley Wrote:
>> "Jason House" <jason.james.house gmail.com> wrote in message news:g5lib0$23ju$1 digitalmars.com...
>>> I can't imagine any library maintainer being willing to do that just for portability between D versions. It'd be nice if D2 code could have something like version(d1){} and version(d2){} for this purpose. That'd then allow the d2 compiler to ignore version(d1){} and allow the converter to strip out version(d2){}
>> I've always been dubious about using the version construct for various language versions. It's great for program options, but since the stuff in the version block has to be syntactically legal, it makes it worthless for supporting multiple versions of D.
> Who says what's in version(d1) has to be legal D1 code? ;) I'd vote that the code would still look like D2 code and would be converted in the same way as all other code.

Oh, oh, I see what you're saying now. Those version blocks would just be dealt with by the converter, not the compiler.
Jul 17 2008
Jarrett Billingsley Wrote:
> "Jason House" <jason.james.house gmail.com> wrote in message news:g5nhb4$r9d$1 digitalmars.com...
>> Jarrett Billingsley Wrote:
>>> "Jason House" <jason.james.house gmail.com> wrote in message news:g5lib0$23ju$1 digitalmars.com...
>>>> I can't imagine any library maintainer being willing to do that just for portability between D versions. It'd be nice if D2 code could have something like version(d1){} and version(d2){} for this purpose. That'd then allow the d2 compiler to ignore version(d1){} and allow the converter to strip out version(d2){}
>>> I've always been dubious about using the version construct for various language versions. It's great for program options, but since the stuff in the version block has to be syntactically legal, it makes it worthless for supporting multiple versions of D.
>> Who says what's in version(d1) has to be legal D1 code? ;) I'd vote that the code would still look like D2 code and would be converted in the same way as all other code.
> Oh, oh, I see what you're saying now. Those version blocks would just be dealt with by the converter, not the compiler.

Done right, the code could compile as D2 code out of the box. Using my ad hoc example, that'd require version=d2 to be included in the source. The version(d1) would silently (and correctly) be ignored. I don't know if this is best, but it certainly seems simple enough to do.
Jul 17 2008
Jason House Wrote:
> Jarrett Billingsley Wrote:
>> "Jason House" <jason.james.house gmail.com> wrote in message news:g5nhb4$r9d$1 digitalmars.com...
>>> Jarrett Billingsley Wrote:
>>>> "Jason House" <jason.james.house gmail.com> wrote in message news:g5lib0$23ju$1 digitalmars.com...
>>>>> I can't imagine any library maintainer being willing to do that just for portability between D versions. It'd be nice if D2 code could have something like version(d1){} and version(d2){} for this purpose. That'd then allow the d2 compiler to ignore version(d1){} and allow the converter to strip out version(d2){}
>>>> I've always been dubious about using the version construct for various language versions. It's great for program options, but since the stuff in the version block has to be syntactically legal, it makes it worthless for supporting multiple versions of D.
>>> Who says what's in version(d1) has to be legal D1 code? ;) I'd vote that the code would still look like D2 code and would be converted in the same way as all other code.
>> Oh, oh, I see what you're saying now. Those version blocks would just be dealt with by the converter, not the compiler.
> Done right, the code could compile as D2 code out of the box. Using my ad hoc example, that'd require version=d2 to be included in the source. The version(d1) would silently (and correctly) be ignored. I don't know if this is best, but it certainly seems simple enough to do.

Could a language construct be created in D1 and D2 to allow for a "lazy" version? As far as I understand, the compiler would have to lexically analyze the code after the version is evaluated, but string mixins already have to be lexed after evaluation. I imagine the "{" and "}" operators would be ambiguous, so new operators would have to be created to signify lazy syntax. "{{" and "}}", just for example:

    version(d1)
    {{
        ...
    }}
    version(d2)
    {{
        ...
    }}

The code within the blocks would be treated something like string literals at the lexical stage, and kept as static data until the version statement is evaluated, then code would be inserted much like string mixins already do, but without the need to stringify the code.
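To make the lazy-version proposal concrete, here is a hypothetical sketch. The "{{ }}" delimiters do not exist in any real D compiler; the bodies are filled in with the D1-vs-D2 invariant syntax difference mentioned earlier in the thread, since that is exactly the kind of cross-version syntax a lazy block would permit:

```d
class Stack
{
    private int[] items;

    // Hypothetical syntax: the {{ }} body is only lexed, not parsed,
    // until the version identifier is resolved, so each branch may use
    // syntax that the other language version would reject.
    version(d1)
    {{
        invariant { assert(items.length < 10_000); }    // D1 spelling
    }}
    version(d2)
    {{
        invariant() { assert(items.length < 10_000); }  // D2 spelling
    }}
}
```

This would remove the need for the string-mixin workaround shown earlier, at the cost of version-block contents no longer being checkable by tools that don't know which version identifiers are set.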
Jul 17 2008
"Ryan Bloomfield" <_sir_maniacREMOVE_ME yahoo.com> wrote in message news:g5oct3$2vpl$1 digitalmars.com...
> Could a language construct be created in D1 and D2 to allow for a "lazy" version? As far as I understand, the compiler would have to lexically analyze the code after the version is evaluated, but string mixins already have to be lexed after evaluation. I imagine the "{" and "}" operators would be ambiguous, so new operators would have to be created to signify lazy syntax. "{{" and "}}", just for example: version(d1) {{ ... }} version(d2) {{ ... }} The code within the blocks would be treated something like string literals at the lexical stage, and kept as static data until the version statement is evaluated, then code would be inserted much like string mixins already do, but without the need to stringify the code.

To be honest, I kind of wish this were the case with version statements all the time.
Jul 17 2008
Jarrett Billingsley Wrote:
> "Ryan Bloomfield" <_sir_maniacREMOVE_ME yahoo.com> wrote in message news:g5oct3$2vpl$1 digitalmars.com...
>> Could a language construct be created in D1 and D2 to allow for a "lazy" version?
> To be honest, I kind of wish this were the case with version statements all the time.

I disagree. I think it's good for all version blocks to compile under normal circumstances. If it didn't do that, uncompilable code could creep in. It would be especially bad if it involves machine-specific code. Coding between different versions of D is clearly going to be used less than the other intended uses of 'version'.

Nevertheless, in my humble opinion, I do think the ability to ignore or relax syntax rules needs to happen. Without it, any compiler- or version-specific feature could never be included in portable code. And compiler-specific features seem to be the norm for C and C++ compilers.
Jul 17 2008
Ryan Bloomfield wrote:
> Jarrett Billingsley Wrote:
>> "Ryan Bloomfield" <_sir_maniacREMOVE_ME yahoo.com> wrote in message news:g5oct3$2vpl$1 digitalmars.com...
>>> Could a language construct be created in D1 and D2 to allow for a "lazy" version?
>> To be honest, I kind of wish this were the case with version statements all the time.
> I disagree. I think it's good for all version blocks to compile under normal circumstances. If it didn't do that, uncompilable code could creep in. It would be especially bad if it involves machine-specific code. Coding between different versions of D is clearly going to be used less than the other intended uses of 'version'.

But it doesn't really solve the problem. So you have syntactically correct code. Yippee. You still can't say whether it's semantically valid until you actually compile it. So it provides protection from code rot some of the time. But if you're worried about code rot, syntactic validity is not enough. There's no substitute for actually compiling and testing the code you want to have work. If you're not worried about code rot, then it's just a nuisance that gets in the way of legitimate goals like making code portable between major versions of D.

> Nevertheless, in my humble opinion, I do think the ability to ignore or relax syntax rules needs to happen. Without it, any compiler- or version-specific feature could never be included in portable code. And compiler-specific features seem to be the norm for C and C++ compilers.

In light of what I said above, I don't really see any point in forcing versioned-out code to be syntactically valid like D does. It doesn't protect you from what is probably the biggest class of code rot issues, which are things like variables getting renamed or functions changing their number of parameters, etc.

--bb
Jul 17 2008
Bill Baxter Wrote:
> Ryan Bloomfield wrote:
>> Jarrett Billingsley Wrote:
>>> To be honest, I kind of wish this were the case with version statements all the time.
>> I disagree. I think it's good for all version blocks to compile under normal circumstances. If it didn't do that, uncompilable code could creep in. It would be especially bad if it involves machine-specific code. Coding between different versions of D is clearly going to be used less than the other intended uses of 'version'.
> But it doesn't really solve the problem. So you have syntactically correct code. Yippee. You still can't say whether it's semantically valid until you actually compile it. So it provides protection from code rot some of the time. But if you're worried about code rot, syntactic validity is not enough. There's no substitute for actually compiling and testing the code you want to have work.

I thought about the limitations of syntactic validity when I was writing it. I suppose I should have included that thought. I think the following is still an issue: nothing within version blocks could be guaranteed to be syntactically correct for any form of syntax analysis without semantically determining the actual values set with 'version=...'. This also appears to break the rule "a source file can be syntactically analyzed without needing any semantic information" from the following interview:

http://www.bitwisemag.com/copy/programming/d/interview/d_programming_language.html

I may just be rambling on here; does this make sense?
Jul 17 2008
Ryan Bloomfield wrote:
> Bill Baxter Wrote:
>> But it doesn't really solve the problem. So you have syntactically correct code. Yippee. You still can't say whether it's semantically valid until you actually compile it. So it provides protection from code rot some of the time. But if you're worried about code rot, syntactic validity is not enough. There's no substitute for actually compiling and testing the code you want to have work.
> I thought about the limitations of syntactic validity when I was writing it. I suppose I should have included that thought. I think the following is still an issue: nothing within version blocks could be guaranteed to be syntactically correct for any form of syntax analysis without semantically determining the actual values set with 'version=...'. This also appears to break the rule "a source file can be syntactically analyzed without needing any semantic information" from the following interview: http://www.bitwisemag.com/copy/programming/d/interview/d_programming_language.html
> I may just be rambling on here; does this make sense?

I think I get your gist. The syntactic validity requirement is not so much for catching programming errors as it is for keeping parsing simple for tools. That makes sense, and I think I've actually heard that argument before but just forgot about it.

I don't like special cases in general, but maybe an exception can be made here. Version blocks that specify a version of the D language would be treated specially. That is, the D 1 and 2 specs could be amended to say that any version matching the pattern "D_Version[0-9]+" is special, and the contents of the following block should be ignored by a compiler or tool that does not support that version of the language.

--bb
Jul 18 2008
Bill Baxter Wrote:
> I think I get your gist. The syntactic validity requirement is not so much for catching programming errors as it is for keeping parsing simple for tools. That makes sense, and I think I've actually heard that argument before but just forgot about it. I don't like special cases in general, but maybe an exception can be made here. Version blocks that specify a version of the D language would be treated specially. That is, the D 1 and 2 specs could be amended to say that any version matching the pattern "D_Version[0-9]+" is special, and the contents of the following block should be ignored by a compiler or tool that does not support that version of the language.
> --bb

That would solve the portability issue with D version standards. What if a standards-compliant compiler adds an additional feature that is syntactically incompatible? Would a special type of version statement be more useful? As an example:

    dversion(__GNUD__) { ... }

or maybe:

    version(D_VERSION,__GNUD__) { ... }

This could even be used to conditionally compile specifically by feature:

    version(D_VERSION,__forany__)
    {
        forany(...) ...
    }
    else
    {
        foreach(...) ...
    }

just my 2 cents
Jul 18 2008
Ryan Bloomfield wrote:
> Bill Baxter Wrote:
>> I think I get your gist. The syntactic validity requirement is not so much for catching programming errors as it is for keeping parsing simple for tools. That makes sense, and I think I've actually heard that argument before but just forgot about it. I don't like special cases in general, but maybe an exception can be made here. Version blocks that specify a version of the D language would be treated specially. That is, the D 1 and 2 specs could be amended to say that any version matching the pattern "D_Version[0-9]+" is special, and the contents of the following block should be ignored by a compiler or tool that does not support that version of the language.
>> --bb
> That would solve the portability issue with D version standards. What if a standards-compliant compiler adds an additional feature that is syntactically incompatible? Would a special type of version statement be more useful? As an example 'dversion(__GNUD__) { ... }', or maybe 'version(D_VERSION,__GNUD__){ ... }'. This could even be used to conditionally compile specifically by feature: version(D_VERSION,__forany__) { forany(...) ... } else { foreach(...) ... } just my 2 cents

I think preventing the proliferation of vendor-specific language extensions is considered a feature of the current system. There is always "pragma" for non-language extensions.

--bb
Jul 18 2008
Bill Baxter wrote to me on July 18 at 17:50:
>> That would solve the portability issue with D version standards. What if a standards-compliant compiler adds an additional feature that is syntactically incompatible? Would a special type of version statement be more useful? As an example 'dversion(__GNUD__) { ... }', or maybe 'version(D_VERSION,__GNUD__){ ... }'. This could even be used to conditionally compile specifically by feature: version(D_VERSION,__forany__) { forany(...) ... } else { foreach(...) ... } just my 2 cents
> I think preventing the proliferation of vendor-specific language extensions is considered a feature of the current system. There is always "pragma" for non-language extensions.

That's true, but I think this opens a new possibility for D to introduce new small backward-incompatible features in a stable version, giving the user some time to update (or keep the old behavior), like Python's __future__ module[1]. That would make D's evolution much easier...

[1] http://www.python.org/dev/peps/pep-0236/

--
Leandro Lucarella (luca) | Blog colectivo: http://www.mazziblog.com.ar/blog/
----------------------------------------------------------------------------
GPG Key: 5F5A8D05 (F8CD F9A7 BF00 5431 4145 104C 949E BFB6 5F5A 8D05)
----------------------------------------------------------------------------
SATANAS EN COMISARIA -- Crónica TV
Jul 18 2008
Jason House wrote:
> It may sound backwards, but I think it may be the best way to get things like Tango to move forward to D2. Tango officially supports D1. If it were to add D2 support, someone would have to port the D1 code to D2. Then, with each change, a similar change must occur in the D2 code base. From a maintenance standpoint, this really shouldn't be acceptable.
> Since adding concepts such as const to D1 code in an automated fashion is essentially impossible, it seems the best approach is to convert D2 code into D1 code. Programmatically, it should be pretty easy to remove const-awareness. That'd allow Tango to convert to D2 once and then (more or less) maintain one code base. Maybe this would require a few special cases with D1-specific and D2-specific code, but I'd hope that wouldn't be very common.
> I guess I have a few questions:
> 1. Besides const removal, what else must get done to convert D2 code to D1 code?
> 2. How can D version-specific code be mixed into a single code base?
> 3. Any thoughts on how to programmatically do all the conversions?
> 4. Would this be enough for D1 library maintainers to move to D2?

I was going to suggest using some sort of pre-processor, but I think that is only a reasonable solution if everyone is on the same page that supporting both D1 and D2 is important. In practice I think the current situation is more that there are devs who aren't interested in D2 and there are those who aren't interested in D1. And neither group will be satisfied having to tart up the source code with pre-processor macros in order to make it work with the "other" version.

Also, having to make the same code base compatible with both is increasingly going to mean crippling the D2 version, as D2 accumulates more and more tricks that are just impossible with D1. Like partial IFTI, which now works in D2 and is used heavily in Andrei's std.algorithm. So I think forking is probably more realistic. That way the people who are wild about D2 are free to come up with solutions that make the most sense in D2 without having to worry about whether it will work in D1 or not. Advances in D1 Tango can be ported forward as seems fit. Maybe if you wait until Tango is officially 1.0 then the number of such changes will begin to taper off. But I think it may be best to not put *too* much effort into making a D2 Tango "compatible" with D1 Tango. Of course, making it wildly different for no reason is also not a good idea.

--bb
Jul 16 2008
Bill Baxter Wrote:
> I think preventing the proliferation of vendor-specific language extensions is considered a feature of the current system. There is always "pragma" for non-language extensions.
> --bb

"pragma" won't allow a new keyword. What if someone comes up with an experimental feature that later proves useful enough to become part of a standard? In C and C++, this happens all the time. C exceptions are a good example.
Jul 18 2008
Ryan Bloomfield Wrote:
> Bill Baxter Wrote:
>> I think preventing the proliferation of vendor-specific language extensions is considered a feature of the current system. There is always "pragma" for non-language extensions.
>> --bb
> "pragma" won't allow a new keyword. What if someone comes up with an experimental feature that later proves useful enough to become part of a standard? In C and C++, this happens all the time. C exceptions are a good example.

But alas, C99 doesn't have exceptions. I was wrong, must have read something wrong. Ok, how about the "long long" type.
Jul 18 2008
Ryan Bloomfield wrote:
> Bill Baxter Wrote:
>> I think preventing the proliferation of vendor-specific language extensions is considered a feature of the current system. There is always "pragma" for non-language extensions.
>> --bb
> "pragma" won't allow a new keyword. What if someone comes up with an experimental feature that later proves useful enough to become part of a standard? In C and C++, this happens all the time. C exceptions are a good example.

IMHO, vendor extensions are the wrong way to add features to the language. This promotes vendor lock-in, or pollutes the code with lots of code like:

    if compiler vendor is X
        do A
    else if ...
    ...
    else
        do Z

Anyway, a much better way is to allow extensibility without the need to write a whole compiler, via a plug-in mechanism. This also allows people who are not compiler experts to add features that do not actually require compiler expertise. Extensions for Firefox and plug-ins for Eclipse are good examples of such a mechanism that brought a lot of innovation and new features. The best of those add-ons can also be folded into the main software itself, as is done in Firefox (only where it makes sense, of course).

There are other languages that already provide this. To implement it you'd need a way to interface with the compiler via an API, and use AST macros (currently planned for D v3) or something similar to define the new features. This would allow you to provide language extensions as libraries instead of requiring a specific compiler.
Jul 18 2008