digitalmars.D - D's tail
- Brian Tiffin (46/46) Aug 09 2021 Being a COBOL guy (for fun), I have opinions on this, but I'll
- Adam D Ruppe (19/26) Aug 09 2021 Most my code compiles with compilers up to 5 years old. Some will
- evilrat (7/13) Aug 09 2021 There was also tons of changes around 2015-2016, very often
- Guillaume Piolat (5/8) Aug 10 2021 I have some code that builds with DMD 2.076.0, that's 1 Sep 2017.
- Dukc (16/17) Aug 10 2021 Normally the programs use the standard library and/or DUB
- Vladimir Panteleev (9/12) Aug 10 2021 Old D versions do not receive platform compatibility updates.
- Dominikus Dittes Scherkl (10/16) Aug 10 2021 Forever.
- Bastiaan Veelo (6/8) Aug 11 2021 This. We successfully use
- jmh530 (3/8) Aug 11 2021 I don't recall seeing that there. Thanks for pointing it out.
- Johan (23/25) Aug 11 2021 Very much depends on the type of project. How much do you depend
- FeepingCreature (17/46) Aug 11 2021 Funkwerk: this is what we do as well. All tools involved in a
- Bastiaan Veelo (7/11) Aug 11 2021 That’s what we plan to do as well. Is that accomplished with off
- Johan (8/18) Aug 11 2021 It is scripted in-house.
- Brian Tiffin (19/25) Aug 11 2021 :-)
- FeepingCreature (16/43) Aug 12 2021 Personally speaking, I am strongly against this.
- Brian Tiffin (67/95) Aug 12 2021 I lean to stable, but don't disagree with your sentiment,
- Dukc (4/13) Aug 12 2021 When you want as long tail as possible for your code, GDC is
- Andrei Alexandrescu (8/35) Aug 12 2021 FWIW the situation is similar with large-scale C++ projects as well.
Being a COBOL guy (for fun), I have opinions on this, but I'll try and only let them leak out a ~~little bit~~ lot, as this is meant as a question, not a judgement.

How long do you expect your D programs to last? How long do you think the D system should support your source code?

D is 20+ now, but how long is its tail? Meaning, if you found D code from 19 years ago, would you expect it to compile today, with a current compiler? How far back in time should D source age? The length of the tail. Is it better to cut off the growing tail for progress, or should progress allow for long-tailed source code and binaries?

To be fair, a 1.0 to 2.0 bump is major enough to warrant lopping off a tail. (Or is it?) Being new to D, that seems to put D2 at 12-ish years old. Does anyone have 12 year old code that would recompile without touching the sources? Would you expect it to? Or is it ok to write programs with a fixed (with a randomness factor) life expectancy?

The idea to ask this question came up from reading the `std.datetime` article, and how it deprecated `std.date`. Would it be reasonable to freeze code rather than deprecate and remove? Leave the tail, even though it may be wrinkled and dry at the ends, rather than lop it off?

On one hand, a long tail means hauling luggage, an ever growing, possibly useless (to most) pile of baggage. On that hand, a long tail would mean that D only accumulates, ever growing. On the other, D becomes a smaller in-toto codebase that slides through time, leaving its past behind as it pursues perfection.

Leaking bias: I'm amazed that sophisticated Report Writer code from 1969 (pre structured programming) still compiles in a 2020+1 COBOL compiler. Has that code been superseded by events? Not really; dollars are still dollars, pennies are still pennies, page breaks are still page breaks and sums are still sums. The output format is archaic, but the code is still useful, and cups-pdf will gladly convert a plain old text file to PDF.

But I also realize that IBM charged customers a high premium to advertise and maintain that backward compatibility. Maybe only Computer Business needs those kinds of timelines and guarantees. Perhaps Computer Science *can* slide, allowing old code to become worthless without surgery after a reasonably short life span, as Science evolves faster in general. Lots of life forms leave old shells behind to grow and mature.

So it's a question. What is a reasonable life expectancy for the projects I start in D today?

Have good, make well.
Aug 09 2021
On Tuesday, 10 August 2021 at 00:53:29 UTC, Brian Tiffin wrote:
> How far back in time should D source age?

Most of my code compiles with compilers up to 5 years old. Some will go back as far as 12 years, though that takes a fairly significant effort that I don't bother with on anything except maintaining code from that era.

> Does anyone have 12 year old code that would recompile without touching the sources?

I have some small modules that still will, but the vast majority of things do require an active effort to maintain compatibility from back then (especially if you want to support both new and old compilers). The majority of these things are due to library changes. There was a period around 2011 when tons of little things changed: tolower became toLower, std.date got removed, and stuff like that. Or of course around 2008 when strings changed their type to immutable... None of it was too hard to update, but it did force updates.

> Would it be reasonable to freeze code rather than deprecate and remove?

Yeah, that's been closer to the policy in recent years.

> So it's a question. What is a reasonable life expectancy for the projects I start in D today?

It so depends on what you use. If you avoid outside libraries (or at least copy them into your project), you can realistically keep it going for many, many years. But it does take some active effort.
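[For readers wondering what "supporting both new and old compilers" looks like in practice: D exposes the frontend version as the integer `__VERSION__` (e.g. 2068 for 2.068), which can be branched on at compile time. A minimal, hypothetical sketch - the cutoff version below is illustrative, not the historical one:]

```d
// Hypothetical compatibility shim for a renamed Phobos symbol.
// __VERSION__ is the frontend version as an integer (2068 == 2.068);
// the 2050 cutoff here is made up for illustration.
module compat;

static if (__VERSION__ >= 2050)
{
    // Modern compilers: the capitalized name exists.
    import std.uni : toLower;
}
else
{
    // Very old compilers: bridge the old spelling to the modern
    // name so the rest of the codebase compiles unchanged.
    import std.string;
    alias toLower = tolower;
}
```

Calling code then imports `compat` and always uses `toLower`, regardless of which compiler builds it.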
Aug 09 2021
On Tuesday, 10 August 2021 at 01:57:49 UTC, Adam D Ruppe wrote:
> The majority of these things are due to library changes. There was a period around 2011 where tons of little things changed, like tolower became toLower and std.date got removed, and stuff like that. Or of course in 2008ish when strings changed their type to immutable... None was too hard to update but it did force updates.

There were also tons of changes around 2015-2016; very often packages older than 2016 simply won't work, and some language changes from that period make it non-trivial to upgrade such code. No examples, sorry, but one thing I remember was module ctors and shared-variable issues when upgrading an old package to a 2019 compiler.
Aug 09 2021
On Tuesday, 10 August 2021 at 00:53:29 UTC, Brian Tiffin wrote:
> So it's a question. What is a reasonable life expectancy for the projects I start in D today? Have good, make well.

I have some code that builds with DMD 2.076.0; that's 1 Sep 2017. You can probably go much earlier with some effort. Ultimately I hope an event as breaking as D2 circa 2010 won't happen again; many people departed.
Aug 10 2021
On Tuesday, 10 August 2021 at 00:53:29 UTC, Brian Tiffin wrote:
> How long do you expect your D programs to last?

Normally the programs use the standard library and/or DUB libraries. In that usual case, in my experience it takes around a year or so for code to stop compiling. However, it's almost always easy to migrate - just find-replace a library function call, add an `alias` for it, or stuff like that.

If you copy-paste the library code instead, or don't use it, your code will usually keep compiling for multiple years. DMD is a good example - it may well compile with a version of itself 20 minor releases older.

But all in all, in D one has to be prepared to use older compiler versions if updating old code is not possible or not worth the effort. On the other hand, I think one should always be prepared to do so in any language. Even in standardised languages, it happens that code accidentally uses non-standard features, or that new compiler versions include regressions.
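[The "add an `alias` for it" migration Dukc mentions is a one-liner in D, and it works from the library side too: an old name can be frozen as a deprecated alias instead of being deleted. A hypothetical sketch - all names here are invented for illustration:]

```d
// Hypothetical example of the alias-shim migration: a library
// renamed oldName to newName, and one alias keeps every existing
// call site compiling.
module shim;

// Stand-in for the library's renamed function.
int newName(int x) { return x * 2; }

// Bridge the old spelling to the new one. The deprecated
// attribute makes the compiler warn at each use of the old name,
// nudging callers toward newName without breaking them.
deprecated("use newName instead")
alias oldName = newName;
```

This is essentially the "freeze rather than remove" policy from the opening post, applied one symbol at a time.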
Aug 10 2021
On Tuesday, 10 August 2021 at 12:19:26 UTC, Dukc wrote:
> But all in all, in D one has to be prepared to use older compiler versions if updating old code is not possible or worth the effort.

Old D versions do not receive platform compatibility updates. E.g., it's not possible to build a program with D 2.060 on recent versions of macOS, and old versions of macOS don't run on new hardware, so you actually have to keep the old hardware around. (Or at least that's how I understand it - not a mac user.)

Old versions of D also do not compile on recent systems. I maintain Digger, which attempts to solve this problem, though. (This doesn't help solve the problem above.)
Aug 10 2021
On Tuesday, 10 August 2021 at 00:53:29 UTC, Brian Tiffin wrote:
> How long do you expect your D programs to last?

Forever.

> How long do you think the D system should support your source code?

Forever.

> D is 20+ now, but how long is its tail? Meaning, if you found D code from 19 years ago, would you expect it to compile today, with a current compiler?

No. But with a compiler from then. -> your code should mention the version of D it is supposed to be compiled with. If it's not too old, I would expect to be able to update the source code to compile with a current compiler without much effort. Of course it gets more complicated with every library involved...
Aug 10 2021
On Tuesday, 10 August 2021 at 14:18:47 UTC, Dominikus Dittes Scherkl wrote:
> -> your code should mention the version of D it is supposed to be compiled with.

This. We successfully use https://dub.pm/package-format-json.html#toolchain-requirements for this.

-- Bastiaan.
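[For reference, the dub package format linked above lets a package declare which toolchain versions it was written for, directly in its recipe. A minimal sketch of a `dub.json` - the package name and version bounds here are illustrative:]

```json
{
    "name": "myproject",
    "description": "Hypothetical package pinning its toolchain",
    "toolchainRequirements": {
        "dub": ">=1.14.0",
        "frontend": ">=2.086 <2.095"
    }
}
```

With this in place, dub refuses to build with a compiler outside the stated frontend range, so the "which D version is this for?" question is answered by the repository itself.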
Aug 11 2021
On Wednesday, 11 August 2021 at 10:05:19 UTC, Bastiaan Veelo wrote:
> [snip] This. We successfully use https://dub.pm/package-format-json.html#toolchain-requirements for this.

I don't recall seeing that there. Thanks for pointing it out.
Aug 11 2021
On Tuesday, 10 August 2021 at 00:53:29 UTC, Brian Tiffin wrote:
> So it's a question. What is a reasonable life expectancy for the projects I start in D today?

Very much depends on the type of project. How much do you depend on new features (likely to change), how much do you depend on library code, how much do you depend on ABI details, how much are you working around current language/compiler issues...?

At Weka, thus far any new compiler means compilation failures. It is a demanding codebase. Some reasons for failures: workarounds for compiler bugs, ABI dependencies, template attribute deduction deltas, dependence on druntime internals, new compiler bugs, changes of language semantics (sometimes intended, but also often unintended), no way to disable/enable specific warnings/deprecations, ... On top of that, performance may change with compiler version too, so even if the build succeeds, that does not mean the program still functions within the required performance constraints.

Keeping the code compatible with multiple frontends is a large burden, currently avoided by specifying the compiler version in the codebase and using that exact compiler on build systems and developers' machines (i.e. each dev will have multiple compilers on his machine, automatically downloaded and used by the build system). Updating the compiler to a new version is a lengthy process. Weka is currently at LDC 1.24 (with Weka modifications).

-Johan
Aug 11 2021
On Wednesday, 11 August 2021 at 11:33:42 UTC, Johan wrote:
> Very much depends on the type of project. [...] Keeping the code compatible with multiple frontends is a large burden, currently avoided by specifying the compiler version in the codebase and using that exact compiler on build systems and developer's machines.

Funkwerk: this is what we do as well. All tools involved in a build are pinned at a fixed version per repository. Anyone doing enterprise development in D should do this as a matter of course; without reproducible builds, how can you be sure you can react effectively when a deployed system breaks? You don't want to mess around with deprecations when a customer's system is down.

We generally update DMD about twice a year, in jumps of four minor versions, to reduce the cost of fixing compiler issues and deprecations across lots of projects. Current DMD versions used in repos:

2.094 = 37
2.090 = 14
2.086 = 16
2.082 = 11
2.077 = 8

The older repos are generally not actively maintained, though.
Aug 11 2021
On Wednesday, 11 August 2021 at 11:33:42 UTC, Johan wrote:
> … specifying the compiler version in the codebase and using that exact compiler on build systems and developer's machines (i.e. each dev will have multiple compilers on his machine, automatically downloaded and used by build system).

That’s what we plan to do as well. Is that accomplished with off-the-shelf tooling, or largely scripted in house? I’d like to make this accessible with as little effort as possible. Imagine checking out any project at any revision and it builds without intervention at any time.

— Bastiaan.
Aug 11 2021
On Wednesday, 11 August 2021 at 14:22:31 UTC, Bastiaan Veelo wrote:
> That’s what we plan to do as well. Is that accomplished with off the shelf tooling, or largely scripted in house? I’d like to make this accessible with as little effort as possible. Imagine checking out any project at any revision and it builds without intervention at any time.

It is scripted in-house. There's some extra trickery involved because of the modified LDC version, and to give some small freedom in slightly different versions on devs' machines and the CI/build system (although unused atm).

-Johan
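[For projects without Weka's modified-compiler constraint, a bare-bones version of this pinning can be scripted around the official install script from dlang.org, which caches toolchains under ~/dlang. A hypothetical sketch - the pinned version and the `.compiler-version` file name are invented for illustration:]

```shell
#!/bin/sh
# Hypothetical sketch: build with the exact compiler pinned in the
# repository, so any checkout at any revision uses the same toolchain.
# Assumes a one-line .compiler-version file at the repo root, e.g.:
#   ldc-1.24.0
set -e

COMPILER=$(cat .compiler-version)

# The official install script caches toolchains under ~/dlang,
# so repeated runs only download a given version once.
curl -fsS https://dlang.org/install.sh | bash -s "$COMPILER"

# Put the pinned compiler on PATH for this shell, then build.
. ~/dlang/"$COMPILER"/activate
dub build
```

Committing `.compiler-version` alongside the source is what makes the build reproducible: updating the compiler becomes an ordinary, reviewable change to the repository.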
Aug 11 2021
On Wednesday, 11 August 2021 at 14:22:31 UTC, Bastiaan Veelo wrote:
> … specifying the compiler version in the codebase and using... Imagine checking out any project at any revision and it builds without intervention at any time. — Bastiaan.

:-)

As mentioned at the top, I'm a COBOL guy; source code from 1972 *just works*. COBOL programmers live in that *imaginary* world. Ok, that's exaggerating the truth, but the disappointing exception is just that, an exception, not the rule. The rule is: code it now and it compiles and works, in perpetuity. And thanks for the hints on how to go about increasing the life expectancy of D source codes.

To be honest, I'm hoping to be able to assume that D source will be as resilient to the ever changing surrounding environments for as long as I'd assume C source to be. Source code should age gracefully, not fearing each and every revision of build tools or run-time support libraries. Yes, you retest after upgrades, but the assumption should be more life in a newly renovated house, not assuming you have to count the number of broken fingers that need attention before getting back to work. :-)

Have good, make well.
Aug 11 2021
On Wednesday, 11 August 2021 at 19:40:36 UTC, Brian Tiffin wrote:
> To be honest, I'm hoping to be able to assume that D source will be as resilient to the ever changing surrounding environments for as long as I'd assume C source to be. Source code should age gracefully, not fearing each and every revision of build tools or run-time support libraries.

Personally speaking, I am strongly against this.

I think that works for a compact, well-defined language that is "done", that generally works well, or at least good enough that you can live with the corner cases. Like C. In comparison, D is to some large extent also an *exploration* of language design, an attempt to see what works and what doesn't, and when you lean that far out of the window you have to be able and willing to course correct, to say "well, we tried that but it didn't work out", or you get stuck in mediocrity. D is too big and too speculative to support true long-term stability.

As a compromise, we have the deprecation mechanism, which gives you early warning about which parts are going away in future versions. I am very glad for this mechanism, and honestly think the rate of change of D could even be a bit higher than it is today, maybe up to twice as fast.
Aug 12 2021
On Thursday, 12 August 2021 at 08:52:29 UTC, FeepingCreature wrote:
> Personally speaking, I am strongly against this. [...] D is too big and too speculative to support true long-term stability. As a compromise, we have the deprecation mechanism, which gives you early warning for which parts are going away in future versions.

I lean to stable, but don't disagree with your sentiment, FeepingCreature.

COBOL is one of the very few Computer Business programming languages. It has a domain of strength, and has owned that domain since the inception of writing programs. Big monies ride on it, and computers are not the purpose of those businesses, just a tool to help run said business. Computer Science needs and wants to evolve faster than Computer Business. But there does need to be stability, or a promise of stability, if you want a business to invest in a technology. Except for a very few fields, computers are not the purpose of a business, just a tool to automate some tasks or speed up calculations to the point that making those calculations can happen in a reasonable time frame.

Think about D replacing the user land of a Linux distro. GNU doesn't replace `cat` every 4 years because a build tool change forced it on the project. coreutils is maintained, yes, mostly by volunteers, but it wouldn't be GNU/Linux if the GNU part failed on every other kernel update.

The keenest of programmers age out. If a team were to try and replace coreutils with D, there would likely be an initial round of totally keen programmers, eager to try. If they succeeded and completed all the commands, the amount of keen to go back and fix things later would be greatly diminished. coreutils-d would be a very cool thing to see. But 7 years later it would likely be in a state of bitrot. It might not entice enough keen programmers to tweak and improve, if all the time is spent on forced maintenance due to constant upstream changes to tooling. When you can't entice the keen programmers, the keen programmers do something else. coreutils-d would start to fray, and would likely become an effort of a few critical maintainers, who will age out.

coreutils will stay written in C and C++, and will compile with gcc for the foreseeable future. Decades of future added to the decades of past.

I'm loving exploring D and can see the absolutely immense potential. But I'm doing this in my spare time, at no one's expense. I'm not doing D as a person in need of hiring a team to work on a business problem or new product. Programmers are expensive; people only want to have to pay them once for each feature coded. Then pat them on the back for a job well done, and punt them from the payroll as soon as possible.

If D is ok with being used in small system programming fields with short-range life expectancy, then it could very well own the domain and be the world class tool. In that reality, world domination is off the table. I'd be ok with that form of D in the long term, as a hobbyist writing little throwaway programs, while having a lot of fun doing it. But I would not yet want to publish anything that might become famous, only to be chained to maintenance on a schedule randomly dictated by others. I'm a computer programmer; my attention span is counted in seconds, intermixed with long periods of deep focus that end when they end. And then it is on to the next thing. I don't like being forced by circumstance to redo a deep focus; that's supposed to be my decision of when and where. Unless I've already traded my time for someone else's money, then they get to decide the where, the when, and the how.

So, I guess I'm not yet strongly for or against stable, long term, source level D. Leaning to for, I will opine as an old guy: as-is, D is not yet destined for world domination. Which is ok. A fraction of the millions of human hours spent programming every hour around the world is still a lot of hours. Redoing hours is no fun though, and this thread is helping set expectations.

Have good, make well.
Aug 12 2021
On Wednesday, 11 August 2021 at 19:40:36 UTC, Brian Tiffin wrote:
> To be honest, I'm hoping to be able to assume that D source will be as resilient to the ever changing surrounding environments for as long as I'd assume C source to be. Source code should age gracefully, not fearing each and every revision of build tools or run-time support libraries. Yes, you retest after upgrades, but the assumption should be more life in a newly renovated house, not assuming you have to count the number of broken fingers that need attention before getting back to work. :-)

When you want as long a tail as possible for your code, GDC is recommended. See https://forum.dlang.org/post/oixwnblhdxyehongfvss@forum.dlang.org
Aug 12 2021
On 8/11/21 7:33 AM, Johan wrote:
> At Weka, thus far any new compiler means compilation failures. It is a demanding codebase. [...] Updating the compiler to a new version is a lengthy process. Weka is currently at LDC 1.24 (with Weka modifications).

FWIW the situation is similar with large-scale C++ projects as well. Whenever I teach, I first ask people what version of their C++ compiler they're on - does it support C++14, C++17, etc. Many mention some really outdated release and invariably invoke the difficulty of upgrading due to compilation errors and performance issues. Also, most of big tech hack into their C++ compiler (and some into their Linux, too).
Aug 12 2021
On Thursday, 12 August 2021 at 13:42:27 UTC, Andrei Alexandrescu wrote:
> FWIW the situation is similar with large-scale C++ projects as well. Whenever I teach, I first ask people what version of their C++ compiler they're on - does it support C++14, C++17, etc. Many mention some really outdated release and invariably invoke the difficulty of upgrading due to compilation errors and performance issues. Also, most of big tech hack into their C++ compiler (and some their Linux, too).

May I please request you to share how often you hear that people are still using C++98/03? Maybe using C++03, but with clang++'s non-standard extensions that allow access to C++11 features like move semantics?
Aug 12 2021
On Thursday, 12 August 2021 at 13:42:27 UTC, Andrei Alexandrescu wrote:
> FWIW the situation is similar with large-scale C++ projects as well. [...] Many mention some really outdated release and invariably invoke the difficulty of upgrading due to compilation errors and performance issues. Also, most of big tech hack into their C++ compiler (and some their Linux, too).

Hi Andrei,

Indeed, good to include that perspective. Do you think that industrial C++ programming has become more standard compliant in the last ~10 years? E.g. due to the availability of clang as a viable alternative to gcc, which makes you notice reliance on compiler specifics; and also perhaps due to the compiler community putting emphasis on standard compliance (even MSVC). Regardless of standard compliance, performance deltas will remain an issue.

cheers,
  Johan
Aug 12 2021