digitalmars.D - why to (not) support "older" compiler versions
- yawniek (20/20) Nov 03 2015 i have seen many PR's and also Forum entries that deal with the
- Johannes Pfau (4/18) Nov 03 2015 I guess it's to be compatible with the latest DMD, LDC and GDC. GDC
- drug (3/5) Nov 03 2015 A bit offtopic - will the situation change with ddmd accepted? I mean
- Daniel Murphy (4/11) Nov 03 2015 While DDMD does not have any direct effect on our ability to keep the
- Iain Buclaw via Digitalmars-d (4/17) Nov 03 2015 Whilst other clean-up work has destroyed years of stable compatibility
- drug (4/20) Nov 03 2015 Hmm, I asked because I've heard that using ddmd would help with keeping
- Iain Buclaw via Digitalmars-d (24/54) Nov 03 2015 Well, how would that work? :-)
- drug (2/24) Nov 03 2015 I see. Thank you for your answer!
- Johannes Pfau (24/90) Nov 03 2015 I'd like to see some statistics how many DMD pull requests are
- Daniel Murphy (6/13) Nov 18 2015 It's not a crazy idea at all. The problem is that we will need to get
- Iain Buclaw via Digitalmars-d (8/22) Nov 18 2015 There are many factors at play here from my side. Some technical blocke...
- yawniek (8/10) Nov 03 2015 this makes sense.
- Sebastiaan Koppe (9/13) Nov 03 2015 For end-users it is always good to support a lot of versions.
- Mathias Lang (18/38) Nov 04 2015 Why do we keep backward compatibility ? The answer is dead
I have seen many PRs and also forum entries that deal with the problem of newer compiler features not being usable, and then patching or working around that to support older compiler versions. Since it is really easy to keep up with compiler versions and even switch between them (and not many features are being removed from dmd), what are good reasons to keep backward compatibility?

The latest example I saw was replacing groupBy by a loop to keep compatibility with 2.066. While not a big thing, it adds up. Since a lot of useful features still get added to Phobos at a fairly fast pace, would it not be better to target just the two most recent versions and move the ecosystem a little bit further along?

For people entering the world of D it would be much more encouraging to read a lot of concise code using all the nice features we have instead of just lipstick'd C.
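To make the groupBy example concrete, here is a minimal sketch of the kind of workaround meant above (illustrative only, not the actual change from that PR): groupBy (later renamed chunkBy) is not available in 2.066, so staying compatible means spelling the grouping out by hand.

import std.stdio : writeln;

void main()
{
    auto data = [1, 1, 2, 2, 2, 3, 1];

    // With a newer frontend one could simply write:
    //   import std.algorithm.iteration : chunkBy;
    //   auto groups = data.chunkBy!((a, b) => a == b);

    // 2.066-compatible replacement: group adjacent equal values by hand.
    int[][] groups;
    foreach (x; data)
    {
        if (groups.length && groups[$ - 1][0] == x)
            groups[$ - 1] ~= x; // same value as the current group: extend it
        else
            groups ~= [x];      // value changed: start a new group
    }
    writeln(groups); // [[1, 1], [2, 2, 2], [3], [1]]
}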
Nov 03 2015
On Tue, 03 Nov 2015 08:08:26 +0000, yawniek <dlang srtnwz.com> wrote:
> I have seen many PRs and also forum entries that deal with the problem
> of newer compiler features not being usable, and then patching or
> working around that to support older compiler versions. Since it is
> really easy to keep up with compiler versions and even switch between
> them (and not many features are being removed from dmd), what are good
> reasons to keep backward compatibility? The latest example I saw was
> replacing groupBy by a loop to keep compatibility with 2.066. While not
> a big thing, it adds up.

I guess it's to be compatible with the latest DMD, LDC and GDC. GDC currently only provides the 2.066.1 frontend.
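As an aside, a tiny sketch of how code can tell the three compilers and their frontend versions apart; the predefined version identifiers (DigitalMars, GNU, LDC) and __VERSION__ are part of the language, everything else here is just illustrative.

// Report which compiler and frontend version a build is using.
// __VERSION__ is the frontend version as an integer, e.g. 2066 for 2.066.
import std.stdio : writefln;

void main()
{
    version (DigitalMars) enum compiler = "DMD";
    else version (GNU)    enum compiler = "GDC";
    else version (LDC)    enum compiler = "LDC";
    else                  enum compiler = "unknown";

    writefln("compiler: %s, frontend: %d", compiler, __VERSION__);
}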
Nov 03 2015
On 03.11.2015 11:22, Johannes Pfau wrote:
> I guess it's to be compatible with the latest DMD, LDC and GDC. GDC
> currently only provides the 2.066.1 frontend.

A bit off-topic - will the situation change once ddmd is accepted? I mean the situation with different frontend versions in different compilers.
Nov 03 2015
On 3/11/2015 7:52 PM, drug wrote:
> On 03.11.2015 11:22, Johannes Pfau wrote:
>> I guess it's to be compatible with the latest DMD, LDC and GDC. GDC
>> currently only provides the 2.066.1 frontend.
>
> A bit off-topic - will the situation change once ddmd is accepted? I
> mean the situation with different frontend versions in different
> compilers.

While DDMD does not have any direct effect on our ability to keep the three compilers synced, some of the cleanup work that has been done does help.
Nov 03 2015
On 3 November 2015 at 11:35, Daniel Murphy via Digitalmars-d <digitalmars-d puremagic.com> wrote:
> On 3/11/2015 7:52 PM, drug wrote:
>> On 03.11.2015 11:22, Johannes Pfau wrote:
>>> I guess it's to be compatible with the latest DMD, LDC and GDC. GDC
>>> currently only provides the 2.066.1 frontend.
>>
>> A bit off-topic - will the situation change once ddmd is accepted? I
>> mean the situation with different frontend versions in different
>> compilers.
>
> While DDMD does not have any direct effect on our ability to keep the
> three compilers synced, some of the cleanup work that has been done
> does help.

Whilst other clean-up work has destroyed years of stable compatibility between different 'ends'. ;-)
Nov 03 2015
On 03.11.2015 14:11, Iain Buclaw via Digitalmars-d wrote:
> On 3 November 2015 at 11:35, Daniel Murphy via Digitalmars-d
> <digitalmars-d puremagic.com> wrote:
>> On 3/11/2015 7:52 PM, drug wrote:
>>> On 03.11.2015 11:22, Johannes Pfau wrote:
>>>> I guess it's to be compatible with the latest DMD, LDC and GDC. GDC
>>>> currently only provides the 2.066.1 frontend.
>>>
>>> A bit off-topic - will the situation change once ddmd is accepted? I
>>> mean the situation with different frontend versions in different
>>> compilers.
>>
>> While DDMD does not have any direct effect on our ability to keep the
>> three compilers synced, some of the cleanup work that has been done
>> does help.
>
> Whilst other clean-up work has destroyed years of stable compatibility
> between different 'ends'. ;-)

Hmm, I asked because I've heard that using ddmd would help with keeping the compilers synced and we would have the same version of the frontend everywhere...
Nov 03 2015
On 3 November 2015 at 12:57, drug via Digitalmars-d <digitalmars-d puremagic.com> wrote:
> On 03.11.2015 14:11, Iain Buclaw via Digitalmars-d wrote:
>> On 3 November 2015 at 11:35, Daniel Murphy via Digitalmars-d
>> <digitalmars-d puremagic.com> wrote:
>>> On 3/11/2015 7:52 PM, drug wrote:
>>>> On 03.11.2015 11:22, Johannes Pfau wrote:
>>>>> I guess it's to be compatible with the latest DMD, LDC and GDC. GDC
>>>>> currently only provides the 2.066.1 frontend.
>>>>
>>>> A bit off-topic - will the situation change once ddmd is accepted? I
>>>> mean the situation with different frontend versions in different
>>>> compilers.
>>>
>>> While DDMD does not have any direct effect on our ability to keep the
>>> three compilers synced, some of the cleanup work that has been done
>>> does help.
>>
>> Whilst other clean-up work has destroyed years of stable compatibility
>> between different 'ends'. ;-)
>
> Hmm, I asked because I've heard that using ddmd would help with keeping
> the compilers synced and we would have the same version of the frontend
> everywhere...

Well, how would that work? :-)

What you've probably misheard is half of a phrase. Moving towards ddmd is not to be confused with moving towards a shared 'frontend' codebase, and that is only the first half of the correct sentence. The second half is that even then, there is no guarantee of keeping things in sync without also integrating the other 'ends' into the CI process.

This requires that we set up an infrastructure where:

- New PRs are tested against all compilers before merging. This is not to be confused with our current set-up, where all compilers build DMD. Specifically, new changes upstream must:
  1. Apply cleanly in each compiler's local repository.
  2. Let each compiler build itself without error.
- We then need another process in place to keep each end in sync after changes upstream are applied.

It was hoped that moving towards ddmd would force a lot of the ABI-specific code to be moved into Target or Port (host) interfaces that are agnostic to the backend. There are still many target-specific areas where this is not the case, and on top of that there are regressions in the host-specific interfaces.

In short, there will always be a heavy maintenance burden, regardless of what language we're written in. :-)

Iain
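To make the Target/Port idea a bit more concrete, here is a hypothetical sketch (the names and numbers are illustrative, this is not DMD's actual interface) of how frontend code can stay backend-agnostic by routing target-specific questions through one place that each backend fills in differently.

// Hypothetical backend-agnostic target description. Each compiler
// (DMD backend, GCC, LLVM) would supply its own values and hooks;
// shared frontend code only ever asks the TargetInfo.
struct TargetInfo
{
    uint ptrSize;   // size of a pointer on the target, in bytes
    uint realSize;  // size of `real`, which differs per backend/ABI

    // Alignment of a field of the given size. Backends may answer
    // differently, so the frontend must not hard-code this.
    uint fieldAlignment(uint typeSize)
    {
        return typeSize >= ptrSize ? ptrSize : typeSize;
    }
}

// Frontend code written against the interface instead of a backend:
size_t structSize(uint[] fieldSizes, TargetInfo target)
{
    size_t offset = 0;
    foreach (sz; fieldSizes)
    {
        immutable a = target.fieldAlignment(sz);
        offset = (offset + a - 1) / a * a; // round up to the field's alignment
        offset += sz;
    }
    return offset;
}

unittest
{
    // An x86_64-like target where pointers are 8 bytes.
    auto t = TargetInfo(8, 16);
    assert(structSize([4u, 8u], t) == 16); // 4-byte field at 0, 8-byte field at 8
}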
Nov 03 2015
On 03.11.2015 15:50, Iain Buclaw via Digitalmars-d wrote:
> Well, how would that work? :-)
>
> What you've probably misheard is half of a phrase. Moving towards ddmd
> is not to be confused with moving towards a shared 'frontend' codebase,
> and that is only the first half of the correct sentence. The second
> half is that even then, there is no guarantee of keeping things in sync
> without also integrating the other 'ends' into the CI process.
>
> This requires that we set up an infrastructure where:
>
> - New PRs are tested against all compilers before merging. This is not
>   to be confused with our current set-up, where all compilers build
>   DMD. Specifically, new changes upstream must:
>   1. Apply cleanly in each compiler's local repository.
>   2. Let each compiler build itself without error.
> - We then need another process in place to keep each end in sync after
>   changes upstream are applied.
>
> It was hoped that moving towards ddmd would force a lot of the
> ABI-specific code to be moved into Target or Port (host) interfaces
> that are agnostic to the backend. There are still many target-specific
> areas where this is not the case, and on top of that there are
> regressions in the host-specific interfaces.
>
> In short, there will always be a heavy maintenance burden, regardless
> of what language we're written in. :-)
>
> Iain

I see. Thank you for your answer!
Nov 03 2015
On Tue, 3 Nov 2015 13:50:55 +0100, Iain Buclaw via Digitalmars-d <digitalmars-d puremagic.com> wrote:
> On 3 November 2015 at 12:57, drug via Digitalmars-d
> <digitalmars-d puremagic.com> wrote:
>> On 03.11.2015 14:11, Iain Buclaw via Digitalmars-d wrote:
>>> On 3 November 2015 at 11:35, Daniel Murphy via Digitalmars-d
>>> <digitalmars-d puremagic.com> wrote:
>>>> On 3/11/2015 7:52 PM, drug wrote:
>>>>> On 03.11.2015 11:22, Johannes Pfau wrote:
>>>>>> I guess it's to be compatible with the latest DMD, LDC and GDC.
>>>>>> GDC currently only provides the 2.066.1 frontend.
>>>>>
>>>>> A bit off-topic - will the situation change once ddmd is accepted?
>>>>> I mean the situation with different frontend versions in different
>>>>> compilers.
>>>>
>>>> While DDMD does not have any direct effect on our ability to keep
>>>> the three compilers synced, some of the cleanup work that has been
>>>> done does help.
>>>
>>> Whilst other clean-up work has destroyed years of stable
>>> compatibility between different 'ends'. ;-)
>>
>> Hmm, I asked because I've heard that using ddmd would help with
>> keeping the compilers synced and we would have the same version of
>> the frontend everywhere...
>
> Well, how would that work? :-)
>
> What you've probably misheard is half of a phrase. Moving towards ddmd
> is not to be confused with moving towards a shared 'frontend' codebase,
> and that is only the first half of the correct sentence. The second
> half is that even then, there is no guarantee of keeping things in sync
> without also integrating the other 'ends' into the CI process.
>
> This requires that we set up an infrastructure where:
>
> - New PRs are tested against all compilers before merging. This is not
>   to be confused with our current set-up, where all compilers build
>   DMD. Specifically, new changes upstream must:
>   1. Apply cleanly in each compiler's local repository.
>   2. Let each compiler build itself without error.
> - We then need another process in place to keep each end in sync after
>   changes upstream are applied.
>
> It was hoped that moving towards ddmd would force a lot of the
> ABI-specific code to be moved into Target or Port (host) interfaces
> that are agnostic to the backend. There are still many target-specific
> areas where this is not the case, and on top of that there are
> regressions in the host-specific interfaces.
>
> In short, there will always be a heavy maintenance burden, regardless
> of what language we're written in. :-)
>
> Iain

I'd like to see some statistics on how many DMD pull requests are frontend-only vs. how many also touch the backend*.

A crazy idea: once gdc supports the latest frontend version, we could theoretically adjust the dmd pull request testing to also merge dmd pull requests into the gdc frontend and test gdc with these frontend-only requests. We would then only merge dmd pull requests that build for gdc as well. Then we would need some hooks to also automatically pull these into gdc. Or we could set up the frontend as a submodule.

The main problem is that even frontend-only changes will depend on earlier backend changes, so we'd need to keep the compilers somehow in sync: every request touching the backend would have to be ported to GDC before merging into dmd. This would keep GDC/DMD 100% in sync, but it would also slow down DMD development.

So the interesting question now is what the frontend/backend* pull request ratio looks like. If only very few pull requests touch the backend, this approach could work. The situation for druntime is similar, although I guess there are fewer compiler-specific pull requests for druntime. Phobos should be mostly compiler independent.

* DMD backend improvements are fine. The critical pull requests are those which affect the explicit and implicit frontend/backend interface.
Nov 03 2015
On 4/11/2015 3:12 AM, Johannes Pfau wrote:
> A crazy idea: once gdc supports the latest frontend version, we could
> theoretically adjust the dmd pull request testing to also merge dmd
> pull requests into the gdc frontend and test gdc with these
> frontend-only requests. We would then only merge dmd pull requests
> that build for gdc as well. Then we would need some hooks to also
> automatically pull these into gdc. Or we could set up the frontend as
> a submodule.

It's not a crazy idea at all. The problem is that we will need to get the compilers in sync first, and I'm not sure that's getting any closer to being reality. I think the number of pull requests touching the glue layer is low enough that it would work, once the CI system is set up to enforce it.
Nov 18 2015
On 18 November 2015 at 09:24, Daniel Murphy via Digitalmars-d <digitalmars-d puremagic.com> wrote:
> On 4/11/2015 3:12 AM, Johannes Pfau wrote:
>> A crazy idea: once gdc supports the latest frontend version, we could
>> theoretically adjust the dmd pull request testing to also merge dmd
>> pull requests into the gdc frontend and test gdc with these
>> frontend-only requests. We would then only merge dmd pull requests
>> that build for gdc as well. Then we would need some hooks to also
>> automatically pull these into gdc. Or we could set up the frontend as
>> a submodule.
>
> It's not a crazy idea at all. The problem is that we will need to get
> the compilers in sync first, and I'm not sure that's getting any closer
> to being reality.

There are many factors at play here from my side. Some are technical blockers (like the use of floating point in the front-end), some are design questions (where will we store the data that associates the backend symbol with the frontend), and some fall into the something-else category. It may be due to the latter that the only way forward is to take the current front-end HEAD and take a much more ground-up approach.
Nov 18 2015
On Tuesday, 3 November 2015 at 08:22:37 UTC, Johannes Pfau wrote:
> I guess it's to be compatible with the latest DMD, LDC and GDC. GDC
> currently only provides the 2.066.1 frontend.

This makes sense. Unfortunately it often happens that I pull in one or the other library that just happens not to work on LDC or GDC, and it's over my head to fix it => stuck with DMD.

I like your "crazy idea"; I think the ecosystem could greatly benefit from tackling this problem at its root.
Nov 03 2015
On Tuesday, 3 November 2015 at 08:08:28 UTC, yawniek wrote:
> I have seen many PRs and also forum entries that deal with the problem
> of newer compiler features not being usable, and then patching or
> working around that to support older compiler versions.

For end-users it is always good to support a lot of versions. For me it's the opposite: that handwritten loop I wrote to replace groupBy - while only being 6 LOC - had a bug. I had to install dvm to compile with 2.066, and it didn't work in Cygwin / MinGW, so I had to manually edit environment variables.

Maybe D needs a compatibility library that has backports for all the new fancy stuff. Then again, I'd rather write the occasional classic loop than double/triple the work on new features.
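A rough sketch of what such a backport shim could look like (purely hypothetical - no such library exists, and the module name is made up): gate on __VERSION__ and fall back to a hand-rolled implementation on older frontends.

module compat.groupby; // hypothetical module name

static if (__VERSION__ >= 2068L)
{
    // Newer Phobos already provides it (groupBy was renamed to chunkBy,
    // assumed here to be available from 2.068 on).
    public import std.algorithm.iteration : chunkBy;
}
else
{
    // Minimal fallback for older frontends: groups adjacent elements for
    // which the predicate holds. Eager and array-based, unlike the lazy
    // Phobos range, but good enough as a drop-in for simple uses.
    auto chunkBy(alias pred, T)(T[] input)
    {
        T[][] result;
        foreach (x; input)
        {
            if (result.length && pred(result[$ - 1][$ - 1], x))
                result[$ - 1] ~= x;
            else
                result ~= [x];
        }
        return result;
    }
}

unittest
{
    import std.algorithm : equal; // works before and after the package split

    auto groups = [1, 1, 2, 3, 3].chunkBy!((a, b) => a == b);

    // Both branches produce the grouping [1, 1], [2], [3, 3].
    int[][] expected = [[1, 1], [2], [3, 3]];
    size_t i;
    foreach (g; groups)
        assert(equal(g, expected[i++]));
    assert(i == expected.length);
}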
Nov 03 2015
On Tuesday, 3 November 2015 at 08:08:28 UTC, yawniek wrote:
> I have seen many PRs and also forum entries that deal with the problem
> of newer compiler features not being usable, and then patching or
> working around that to support older compiler versions. Since it is
> really easy to keep up with compiler versions and even switch between
> them (and not many features are being removed from dmd), what are good
> reasons to keep backward compatibility? The latest example I saw was
> replacing groupBy by a loop to keep compatibility with 2.066. While not
> a big thing, it adds up.

Why do we keep backward compatibility? The answer is dead simple: people need it. The assumption that it's easy to upgrade is totally false. Upgrading to a newer version is costly. You need to test it, maybe repackage / redeploy applications and libraries, and monitor that everything still runs smoothly. This has a cost, and the bigger you are, the higher the cost.

> Since a lot of useful features still get added to Phobos at a fairly
> fast pace, would it not be better to target just the two most recent
> versions and move the ecosystem a little bit further along?

Unless the new release has a definitive advantage for you, like a much-needed feature, that cost isn't justified, and you're better off spending time / money on things that matter to you (like new features).

> For people entering the world of D it would be much more encouraging
> to read a lot of concise code using all the nice features we have
> instead of just lipstick'd C.

If 2.066 is just lipstick'd C to you, you have already spent too much time having fun with D and not enough using C ;)

One thing that is important for people entering the D world (or any world) is as little friction as possible. And if things don't work out of the box, that is a lot of friction. Backward compatibility helps with that as well.
Nov 04 2015