
digitalmars.D - Dicebot on leaving D: It is anarchy driven development in all its

reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
Just found by chance, if someone is interested [1] [2].

/Paolo

[1] 
https://gitlab.com/mihails.strasuns/blog/blob/master/articles/on_leaving_d.md
[2] 
https://blog.mist.global/articles/My_concerns_about_D_programming_language.html
Aug 22 2018
next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 22 August 2018 at 11:59:37 UTC, Paolo Invernizzi 
wrote:
 Just found by chance, if someone is interested [1] [2].

 /Paolo

 [1] 
 https://gitlab.com/mihails.strasuns/blog/blob/master/articles/on_leaving_d.md
 [2] 
 https://blog.mist.global/articles/My_concerns_about_D_programming_language.html
Pretty positive overall, and the negatives he mentions are fairly obvious to anyone paying attention. D would really benefit from a project manager, which I think Martin Nowak has tried to be, and which the companies using D and the community should get together and fund as a paid position. Maybe it could be one of the funding targets for the Foundation. If the job were well-defined, so I knew exactly what we'd be getting by hiring that person, I'd contribute to that.
Aug 22 2018
next sibling parent reply Ali <fakeemail example.com> writes:
On Wednesday, 22 August 2018 at 17:42:56 UTC, Joakim wrote:
 Pretty positive overall, and the negatives he mentions are 
 fairly obvious to anyone paying attention.
Yea, I agree, the negatives are not really negatives. Walter, no matter how smart he is, is one man who can't work on so many things at the same time. It's a chicken-and-egg situation: D needs more core contributors, and to get more contributors it needs more users, and to get more users it needs more core contributors.
Aug 22 2018
parent reply Shachar Shemesh <shachar weka.io> writes:
On 22/08/18 21:34, Ali wrote:
 On Wednesday, 22 August 2018 at 17:42:56 UTC, Joakim wrote:
 Pretty positive overall, and the negatives he mentions are fairly 
 obvious to anyone paying attention.
Yea, I agree, the negatives are not really negative Walter not matter how smart he is, he is one man who can work on the so many things at the same time Its a chicken and egg situation, D needs more core contributors, and to get more contributors it needs more users, and to get more users it need more core contributors
No, no and no.

I was holding out on replying to this thread to see how the community would react. The vibe I'm getting, however, is that the people who see D's problems have given up on effecting change.

It is no secret that when I joined Weka, I was the sole D detractor in a company quite enamored with the language. I used to have quite heated water-cooler debates about that point of view.

Every single one of the people rushing to defend D at the time has since come around. There is still some debate on whether, points vs. counter-points, choosing D was a good idea, but the overwhelming consensus inside Weka today is that D has *fatal* flaws and no path to fixing them.

And by "fatal", I mean flaws that are likely to literally kill the language.

And the thing that brought them around is not my power of persuasion. The thing that brought them around was spending a couple of years working with the language on an everyday basis.

And you will notice this in the way Weka employees talk on this forum: except for me, they have all disappeared. You used to see Idan, Tomer and Eyal post here. Where are they?

This forum is hostile to criticism, and generally tries to keep everyone using D the same way. If you're doing cutting-edge D, the forum is almost no help at all. The consensus among former posters here is that it is generally a waste of time, so almost everyone left, and those who didn't stopped posting.

And it's not just Weka. I've had a chance to talk in private with some other developers. Quite a lot have serious, fundamental issues with the language. You will notice that none of them speaks up in this thread.

They don't see the point.

No technical project is born great. If you want a technical project to be great, the people working on it have to focus on its *flaws*. The D community just doesn't do that.

To sum it up: fatal flaws + no path to fixing + no push from the community = inevitable eventual death.

With great regrets,
Shachar
Aug 22 2018
next sibling parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
wrote:
 Every single one of the people rushing to defend D at the time 
 has since come around. There is still some debate on whether, 
 points vs. counter points, choosing D was a good idea, but the 
 overwhelming consensus inside Weka today is that D has *fatal* 
 flaws and no path to fixing them.

 And by "fatal", I mean literally flaws that are likely to 
 literally kill the language.
How so? If he's right about those issues, they can definitely prevent D from becoming mainstream, but how would they kill D? I mean, won't there always be some existing users who have no need or wish to move on?
Aug 22 2018
parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 07:35, Dukc wrote:
 On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh wrote:
 Every single one of the people rushing to defend D at the time has 
 since come around. There is still some debate on whether, points vs. 
 counter points, choosing D was a good idea, but the overwhelming 
 consensus inside Weka today is that D has *fatal* flaws and no path to 
 fixing them.

 And by "fatal", I mean literally flaws that are likely to literally 
 kill the language.
How so? If he's right with those issues, they can definitely prevent D from becoming mainstream, but how would they kill D? I mean, will not there always be some existing users who have no need or wish to move on?
Maintaining a language requires a lot of work. The "payback" for that work comes from the people who actually use it. If the D community starts to contract, it will become more and more difficult to find people willing to work on D's core features, which will lead to stagnation, which is the same as death.

But, again, it is interesting to see what you took from my mail. I'd be much more worried about the fact that it was working with D that caused people to recognize the problems as fundamental than about what "death" means in this context.

Shachar
Aug 22 2018
parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 23 August 2018 at 04:44:47 UTC, Shachar Shemesh 
wrote:
 But, again, it is interesting to see what you took from my mail.
I think the biggest problem is the lack of reviewers for PRs. The fact that we have the D Language Foundation, the State of D survey, an extensive autotester and a regular release schedule seems to imply, to me, that much more than ADD is being done. But then again, my D projects so far are too small for me to really know where the problems are. It may be that in time, if they grow, I will start to agree with you.
Aug 22 2018
parent Eugene Wissner <belka caraus.de> writes:
On Thursday, 23 August 2018 at 04:59:47 UTC, Dukc wrote:
 On Thursday, 23 August 2018 at 04:44:47 UTC, Shachar Shemesh 
 wrote:
 But, again, it is interesting to see what you took from my 
 mail.
I think the biggest problem is lack of reviewers when making PR:s. The fact that we have D language foundation, state of D survey, extensive autotester and regular release schelude seem to imply, for me, that much more than ADD is being done. But then again, my D projects so far are too small that I could really know where the problems are. It may be that in time, if they grow, I start to agree with you.
In another thread, JinShil referenced a PR where Walter and Andrei just ignored the review and merged the pull request (I had to laugh). A valid merge-stopper (missing/wrong documentation) was called "bureaucracy". https://github.com/dlang/dmd/pull/8346
Aug 22 2018
prev sibling next sibling parent reply Eugene Wissner <belka caraus.de> writes:
On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
wrote:
 No, no and no.

 I was holding out on replying to this thread to see how the 
 community would react. The vibe I'm getting, however, is that 
 the people who are seeing D's problems have given up on 
 affecting change.

 It is no secret that when I joined Weka, I was a sole D 
 detractor among a company quite enamored with the language. I 
 used to have quite heated water cooler debates about that point 
 of view.

 Every single one of the people rushing to defend D at the time 
 has since come around. There is still some debate on whether, 
 points vs. counter points, choosing D was a good idea, but the 
 overwhelming consensus inside Weka today is that D has *fatal* 
 flaws and no path to fixing them.

 And by "fatal", I mean literally flaws that are likely to 
 literally kill the language.

 And the thing that brought them around is not my power of 
 persuasion. The thing that brought them around was spending a 
 couple of years working with the language on an every-day basis.

 And you will notice this in the way Weka employees talk on this 
 forum: except me, they all disappeared. You used to see Idan, 
 Tomer and Eyal post here. Where are they?

 This forum is hostile to criticism, and generally tries to keep 
 everyone using D the same way. If you're cutting edge D, the 
 forum is almost no help at all. Consensus among former posters 
 here is that it is generally a waste of time, so almost 
 everyone left, and those who didn't, stopped posting.

 And it's not just Weka. I've had a chance to talk in private to 
 some other developers. Quite a lot have serious, fundamental 
 issues with the language. You will notice none of them speaks 
 up on this thread.

 They don't see the point.

 No technical project is born great. If you want a technical 
 project to be great, the people working on it have to focus on 
 its *flaws*. The D's community just doesn't do that.

 To sum it up: fatal flaws + no path to fixing + no push from 
 the community = inevitable eventual death.

 With great regrets,
 Shachar
"anarchy driven development" is a pearl. It is also mood-driven development. Yesterday, scope and -dip1000 were super important; today betterC is very hot and everyone works on a betterC druntime, betterC Phobos, betterC libraries. Maybe -dip1000 will be made the default at some point and the language will get another well-intentioned but only half-working feature.

And I'm beginning to doubt that the real problem is that the community doesn't help. Don't get me wrong, I do development in absolutely the same, anarchy-driven :), way. Sometimes I can't work long enough on the same thing; sometimes I lose interest. It is also great for research and trying out new ideas, since D tries to be innovative and offer a better developer experience. And I can also understand that the language authors want to control the evolution of the language and try to make it better by testing new ideas.

But this kind of development no longer works that well for commercial customers who aren't (only) interested in research. From this perspective D becomes an over-complicated, half-finished language. And nobody can tell what will be "in" tomorrow.
Aug 22 2018
parent reply Guillaume Piolat <spam smam.org> writes:
On Thursday, 23 August 2018 at 04:46:25 UTC, Eugene Wissner wrote:
 But this kind of development doesn't work anymore that well for 
 commercial customers that aren't (only) interested in research. 
 From this perspective D becomes over-complicated, half-finished 
 language. And nobody can tell what will be "in" tomorrow.
My point of view (as a tiny commercial user) is that D has everything you need:
- a way to convert money into bugfixes with the new Foundation deals
- it's pretty boring and predictable with upgrades, few surprises. Each release brings improvements, you get CI and docs for free, etc.
- future-proof. It's not a weaponized language, people love it, etc. It cannot be wiped out by a giant because it doesn't fit a corporate agenda.

Of the real problems I think I see - and everyone sees those differently:

A - D needs to keep generating drama (like this thread) to storify its development and keep people active. It creates engagement. A lot of the D marketing is very reasonable, and that doesn't cut it that much. Where is superdan?

B - sometimes bizarre allocation of effort; that's probably part of some unexplainable evil plan, but it can at times feel counterproductive wrt marketing. For example: why implement AVX in the DMD backend? Who are the users that will be delighted by that? Those interested in performance already use some other back-end; it's imo a completely useless development, since _no one_ uses D_SIMD seriously apart from compiler tests (https://github.com/search?l=D&p=4&q=D_SIMD&type=Code)

C - _absurd_ focus on the language, and to a lesser extent on Phobos, instead of on the _community_ and DUB things (ie: libraries, making sure they exist and make some sense while not putting more work on core developers). What if the language was "good enough" while the ecosystem wasn't? The best example of this is std.experimental:
* "to develop something for std.experimental, put it on DUB - it will be able to be used and upgraded!"
* "to update something in std.experimental, put it back on DUB - it will be able to be upgraded while keeping backward-compatibility!"

The solution is (still subjectively) obviously to offload as much as possible to the community, simply because there are more people making libraries than core developers, so why should core developers take ownership of an ever-growing amount of "big Phobos"? Phobos shouldn't even be a thing; we always read "hopefully this will be moved into Phobos", which is entirely wrong. People should be pushed to use the community to their advantage. SemVer is where it's at.
Aug 23 2018
parent reply Mike Franklin <slavo5150 yahoo.com> writes:
On Friday, 24 August 2018 at 00:32:59 UTC, Guillaume Piolat wrote:

   For example: why implement AVX in DMD backend? Who are the 
 users that will be delighted by that? Those interested in 
 performance already use some other back-end, it's imo a 
 completely useless development since _no one_ use D_SIMD 
 seriously apart from compiler tests 
 (https://github.com/search?l=D&p=4&q=D_SIMD&type=Code)
But I need it to implement `memcpy` and `memcmp` in D, so we can remove the dependency on the D standard library :-) https://github.com/JinShil/memcpyD I know. I'm weird.
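For the curious, the idea can be illustrated with a minimal sketch (my illustration only, not the actual memcpyD implementation, which specializes on type and uses vector operations for performance):

```d
// Naive illustrative memcpy written in D, freestanding (no C library).
// A real replacement like memcpyD would specialize on size and use SIMD;
// this byte loop only shows the shape of replacing the libc symbol.
extern(C) void* memcpy(void* dst, const(void)* src, size_t n) @system
{
    auto d = cast(ubyte*) dst;
    auto s = cast(const(ubyte)*) src;
    foreach (i; 0 .. n)
        d[i] = s[i];
    return dst;
}
```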
 Phobos shouldn't even be a thing, we always read "hopefully 
 this will be moved into Phobos" which is entirely wrong. People 
 should be pushed to use the community to their advantage. 
 SemVer is where it's at.
Totally agree. It seems, to someone without much historical perspective, that Phobos was intended to be something like the .NET Framework for D. Perhaps there are a few fundamentals (std.algorithm, std.allocator, etc.) to keep, but as for the others... move 'em to Dub and let the "free market" sort it out.

Mike
Aug 23 2018
next sibling parent reply Mike Franklin <slavo5150 yahoo.com> writes:
On Friday, 24 August 2018 at 00:46:14 UTC, Mike Franklin wrote:

 But I need it to implement `memcpy` and `memcmp` in D, so we 
 can remove the dependency on the D standard library :-)
Gah! What a typo. I mean the C standard library.
Aug 23 2018
parent reply Guillaume Piolat <spam smam.org> writes:
On Friday, 24 August 2018 at 00:47:18 UTC, Mike Franklin wrote:
 On Friday, 24 August 2018 at 00:46:14 UTC, Mike Franklin wrote:

 But I need it to implement `memcpy` and `memcmp` in D, so we 
 can remove the dependency on the D standard library :-)
Gah! What a typo. I mean the C standard library.
Do you also mean to reimplement everything related to FILE*? floating-point parsing and conversion to string? multithreaded malloc?
Aug 23 2018
parent reply Mike Franklin <slavo5150 yahoo.com> writes:
On Friday, 24 August 2018 at 00:53:20 UTC, Guillaume Piolat wrote:

 Do you also mean to reimplement everything related to FILE*?
 floating-point parsing and conversion to string?
 multithreaded malloc?
Only what's needed for druntime. That would include multi-threaded malloc, but not the FILE* or string-related stuff.
Aug 23 2018
parent reply Guillaume Piolat <spam smam.org> writes:
On Friday, 24 August 2018 at 00:56:10 UTC, Mike Franklin wrote:
 On Friday, 24 August 2018 at 00:53:20 UTC, Guillaume Piolat 
 wrote:

 Do you also mean to reimplement everything related to FILE*?
 floating-point parsing and conversion to string?
 multithreaded malloc?
Only what's need for druntime. That would include multi-threaded malloc, but not the FILE* string-related stuff.
D programs tend to use the C runtime directly, and quite a lot of it: https://github.com/search?l=D&q=%22import+core.stdc%22&type=Code
Aug 23 2018
parent reply Mike Franklin <slavo5150 yahoo.com> writes:
On Friday, 24 August 2018 at 00:58:35 UTC, Guillaume Piolat wrote:

 D programs tend to use the C runtime directly, and quite a lot 
 of it:
 https://github.com/search?l=D&q=%22import+core.stdc%22&type=Code
I know. They should get that from https://github.com/D-Programming-Deimos/libc or perhaps even Dub.
Aug 23 2018
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, August 23, 2018 7:01:41 PM MDT Mike Franklin via Digitalmars-d 
wrote:
 On Friday, 24 August 2018 at 00:58:35 UTC, Guillaume Piolat wrote:
 D programs tend to use the C runtime directly, and quite a lot
 of it:
 https://github.com/search?l=D&q=%22import+core.stdc%22&type=Code
I know. They should get that from https://github.com/D-Programming-Deimos/libc or perhaps even Dub.
Unless you're trying to argue for folks dropping Phobos, that's just not going to fly. Phobos uses libc heavily, and it really can't do what it needs to do without it (e.g. file operations). Divorcing druntime from libc may help folks focused on embedded development and who don't want to use Phobos, but for most D programs, it really doesn't provide any real benefit to try to make druntime not use libc. So, while such an effort may provide some benefits, I don't see how it could really be for anything other than a niche part of the community. - Jonathan M Davis
Aug 23 2018
parent reply Mike Franklin <slavo5150 yahoo.com> writes:
On Friday, 24 August 2018 at 04:12:42 UTC, Jonathan M Davis wrote:

 Unless you're trying to argue for folks dropping Phobos, that's 
 just not going to fly. Phobos uses libc heavily, and it really 
 can't do what it needs to do without it (e.g. file operations). 
 Divorcing druntime from libc may help folks focused on embedded 
 development and who don't want to use Phobos, but for most D 
 programs, it really doesn't provide any real benefit to try to 
 make druntime not use libc. So, while such an effort may 
 provide some benefits, I don't see how it could really be for 
 anything other than a niche part of the community.
It's not a problem for Phobos to depend on the C standard library. My goals have to do with making D, the language, freestanding (a.k.a. nimble-D).

If and when druntime no longer depends on the C standard library, the bindings can be moved to a separate repository (e.g. Deimos). A compatibility shim can also be created in a separate repository to forward `core.stdc` names to `c.std` or whatever name the new repository chooses. That compatibility shim could be marked deprecated in favor of the new name, and then many years down the line it can be removed (or kept, I don't care). It then becomes part of the toolchain packaging process to add Deimos-libc and the compatibility shim to the dmd.conf file and include them in the distribution. Users won't even know it happened. Users coding in D, the language (no Phobos), will no longer have to obtain a C toolchain to generate their binaries.

We're at least half a decade away from any of this, and there's a good chance it will never even happen, so don't sweat it.

And about niche use cases: they're niche right now because D doesn't provide good support for them, and no one's writing D software for them. If I have my druthers, that's going to change, those use cases will become major considerations when making language design choices, and it will become obvious that C is more of a liability than an asset.

Mike
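Such a shim might look something like this (a sketch only; the `c.std` package name is hypothetical, nothing with that name exists today):

```d
// Hypothetical compatibility shim living outside druntime.
// It keeps old `import core.stdc.string;` code compiling while
// steering users to the new (illustrative) home of the bindings.
deprecated("import c.std.string instead")
module core.stdc.string;

// Re-export every symbol from the relocated bindings so existing
// code sees the same names it always did.
public import c.std.string;
```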
Aug 23 2018
parent reply Dave Jones <dave jones.com> writes:
On Friday, 24 August 2018 at 04:50:34 UTC, Mike Franklin wrote:
 On Friday, 24 August 2018 at 04:12:42 UTC, Jonathan M Davis 
 wrote:


 It's not a problem for Phobos to depend on the C standard 
 library.  My goals have to do with making D, the language, 
 freestanding (a.k.a nimble-D).
If the poster feature for D in the upcoming years is memory safety then how can Walter seriously consider continued dependency on libc?
Aug 24 2018
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Friday, August 24, 2018 2:46:06 AM MDT Dave Jones via Digitalmars-d 
wrote:
 On Friday, 24 August 2018 at 04:50:34 UTC, Mike Franklin wrote:
 On Friday, 24 August 2018 at 04:12:42 UTC, Jonathan M Davis
 wrote:


 It's not a problem for Phobos to depend on the C standard
 library.  My goals have to do with making D, the language,
 freestanding (a.k.a nimble-D).
If the poster feature for D in the upcoming years is memory safety then how can Walter seriously consider continued dependency on libc?
For any kind of normal operating system, you _have_ to use libc. It's part of the OS. Some pieces could be done without it, but on the whole, you use libc if you want to talk to the OS. That's just life. The only exceptions I'm aware of are embedded devices, and my understanding is that, if anything, such devices are more and more likely to run a full-blown OS, making it that much less likely that you'd ever do anything without libc.

Sure, we don't need to call C functions like strcmp, but if you want to do things like I/O, you have to use the OS' API, and that's libc. And yes, that means that we depend on code that's not vetted via @safe, but at this point, the major OSes are written in C, and they present a C API. So, accessing that functionality means depending on the OS devs to have gotten it right, much as it would be nice if it were verified with something like @safe. The same goes for plenty of existing libraries that are almost certainly never going to have replacements in D (e.g. postgres, ffmpeg, etc.). We're never going to completely escape the safety issues introduced by C. Ultimately, the best that we can do is make sure that the actual D code is safe as long as any libraries it uses are safe.

- Jonathan M Davis
Aug 24 2018
next sibling parent reply Mike Franklin <slavo5150 yahoo.com> writes:
On Friday, 24 August 2018 at 09:46:08 UTC, Jonathan M Davis wrote:

 For any kind of normal operating system, you _have_ to use 
 libc. It's part of the OS. Some pieces could be done without 
 it, but on the whole, you use libc if you want to talk to the 
 OS. That's just life. The only exceptions I'm aware of to that 
 are embedded devices, and my understanding is that if anything, 
 such devices are more and more likely to run a fullblown OS, 
 making it that much less likely that you'd ever do anything 
 without libc.
That is not true. You can write your own system calls. Here's "hello world" with no libc:

---object.d
module object;

alias immutable(char)[] string;

private long __d_sys_write(long arg1, in void* arg2, long arg3)
{
    long result;
    asm
    {
        mov RAX, 1;     // sys_write
        mov RDI, arg1;
        mov RSI, arg2;
        mov RDX, arg3;
        syscall;
        mov result, RAX;
    }
    return result;
}

void write(string text)
{
    __d_sys_write(2, text.ptr, text.length);
}

private void __d_sys_exit(long arg1)
{
    asm
    {
        mov RAX, 60;    // sys_exit
        mov RDI, arg1;
        syscall;
    }
}

extern void main();

private extern(C) void _start()
{
    main();
    __d_sys_exit(0);
}

---main.d
module main;

void main()
{
    write("Hello, World\n");
}

$ dmd -c -lib -conf= object.d main.d -of=main.o
$ ld main.o -o main
$ size main
   text    data     bss     dec     hex filename
    176       0       0     176      b0 main
$ ./main
Hello, World

You just need to re-implement what you need in D. For Phobos, that might be a big job (or maybe not, I haven't looked into it). For druntime, it's not so bad -- it's mostly just memcpy, memcmp, malloc, free, and maybe a couple of others. Those are not trivial implementations, but they're not out of reach, and I think there is opportunity, with CTFE, templates, __traits, inline asm, and a number of other D features, to do even better than C, and make it more type- and memory-safe while we're at it.

Implementing those building blocks in D would be good for the language, would solve a number of issues I'm currently having with druntime updates, and would be a fun project for those who are interested in that kind of stuff.

Mike
Aug 24 2018
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Friday, August 24, 2018 4:18:31 AM MDT Mike Franklin via Digitalmars-d 
wrote:
 On Friday, 24 August 2018 at 09:46:08 UTC, Jonathan M Davis wrote:
 For any kind of normal operating system, you _have_ to use
 libc. It's part of the OS. Some pieces could be done without
 it, but on the whole, you use libc if you want to talk to the
 OS. That's just life. The only exceptions I'm aware of to that
 are embedded devices, and my understanding is that if anything,
 such devices are more and more likely to run a fullblown OS,
 making it that much less likely that you'd ever do anything
 without libc.
That is not true. You can write your own system calls. Here's "hello world" with no libc:
Linux is the only OS I'm aware of that considers the syscall layer to be something that anything outside the OS would normally call. Other OSes consider libc to be part of the OS. In theory, you could call the syscalls directly in the BSDs (and probably on Mac OS), but the expectation is that you're going to use libc. Calling them directly would be way more error-prone, since you'd basically have to reimplement portions of libc and have to deal with any changes they make which normally would be hidden by libc. You're basically trying to bypass the OS' public API if you're trying to bypass libc. And of course, that's definitely not how things are done on Windows. Honestly, I don't see how it's at all reasonable to be trying to access syscalls directly rather than using libc under any kind of normal circumstances - especially if you're not on Linux. - Jonathan M Davis
Aug 24 2018
parent reply Mike Franklin <slavo5150 yahoo.com> writes:
On Friday, 24 August 2018 at 11:15:21 UTC, Jonathan M Davis wrote:

 Linux is the only OS I'm aware of that considers the syscall 
 layer to be something that anything outside the OS would 
 normally call.
I think Linux considers system calls the OS API.
 Other OSes consider libc to be part of the OS.
Not Windows. Windows has its own API, and when interfacing with the OS, I never use libc.
 In theory, you could call the syscalls directly in the BSDs 
 (and probably on Mac OS), but the expectation is that you're 
 going to use libc.
That's the expectation for application programming, not systems programming.
 Calling them directly would be way more error-prone, since 
 you'd basically have to reimplement portions of libc and have 
 to deal with any changes they make which normally would be 
 hidden by libc.
Most of what we need is already implemented in the language, druntime, and Phobos. We just need a few fundamental building blocks that are currently implemented in C and a few calls into the OS APIs (syscalls for Linux, the Windows API for Windows, not sure about Mac...probably syscalls). And just like DMD benefited from being written in D, so would those building blocks, and the code that calls into them.
 You're basically trying to bypass the OS' public API if you're 
 trying to bypass libc.
No, I'm trying to bypass libc and use the OS API directly.
 Honestly, I don't see how it's at all reasonable to be trying 
 to access syscalls directly rather than using libc under any 
 kind of normal circumstances - especially if you're not on 
 Linux.
I think it'd be nice if D were freestanding and portable without requiring libraries written in other languages. Purity, safety, CTFE, introspection, etc... all the way down. Mike
Aug 24 2018
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Friday, August 24, 2018 6:05:40 AM MDT Mike Franklin via Digitalmars-d 
wrote:
 You're basically trying to bypass the OS' public API if you're
 trying to bypass libc.
No I'm trying to bypass libc and use the OS API directly.
And my point is that most OSes consider libc to be their OS API (Linux doesn't, but it's very much abnormal in that respect). Trying to bypass it means reimplementing core OS functionality and risking all of the bugs that go with it. It's also _far_ less portable - especially on *nix systems, where the POSIX API gives you what you need in a mostly cross-platform manner (whereas syscalls aren't cross-platform at all).

Trying to skip libc to call syscalls directly means that D programs risk not acting like programs written in other languages doing the same thing, and it seriously increases the maintenance cost, because then we have to worry about implementing core libc functionality that we currently get simply by linking against libc. I honestly don't see how attempting to divorce druntime from libc does anything but increase the amount of work that we have to do and increase the likelihood that basic OS functionality is going to be buggy, since we will have then reimplemented it rather than using the same core OS functionality that everyone else is using.

If you're talking about avoiding libc functions like strcmp, that's one thing, but if you're talking about reimplementing stuff that uses syscalls, then honestly, I think that you're crazy. Even if we were overflowing with extra manpower, I would think that that was a terrible idea, and given that we're _not_ overflowing with extra manpower, it's an even worse idea.

- Jonathan M Davis
Aug 24 2018
next sibling parent reply Mike Franklin <slavo5150 yahoo.com> writes:
On Friday, 24 August 2018 at 13:21:25 UTC, Jonathan M Davis wrote:

 I think that you're crazy.
No, I just see more potential in D than you do. Mike
Aug 24 2018
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Friday, August 24, 2018 7:46:57 AM MDT Mike Franklin via Digitalmars-d 
wrote:
 On Friday, 24 August 2018 at 13:21:25 UTC, Jonathan M Davis wrote:
 I think that you're crazy.
No, I just see more potential in D than you do.
To be clear, I'm not calling you crazy in general. I'm calling the idea of bypassing libc to call syscalls directly under any kind of normal circumstances crazy.

There is tons of work to be done around here to improve D, and IMHO, reimplementing OS functions just because they're written in C is a total waste of time and an invitation for bugs - in addition to making the druntime code that much less portable, since it bypasses the API layer that was standardized for POSIX systems. It's the kind of thing that's going to cause us way more work, more bugs, and make us that much less compatible with existing libraries. And for what? To _maybe_ get slightly better performance (which you probably won't get)?

I honestly think that trying to bypass libc to talk to the kernel directly is actively worse than just using libc, as much as it would be great if we somehow lived in a world where every library we used was written in D. But the reality of the matter is that there is a _lot_ out there already written in C that it simply makes no sense to try to replace. We're always going to need to interoperate with C unless we somehow convince all of the C developers to at least switch to -betterC (which obviously isn't happening).

- Jonathan M Davis
Aug 24 2018
next sibling parent reply Dominikus Dittes Scherkl <dominikus scherkl.de> writes:
On Friday, 24 August 2018 at 22:16:25 UTC, Jonathan M Davis wrote:
 On Friday, August 24, 2018 7:46:57 AM MDT Mike Franklin via 
 Digitalmars-d wrote:
 On Friday, 24 August 2018 at 13:21:25 UTC, Jonathan M Davis 
 wrote:
 I think that you're crazy.
No, I just see more potential in D than you do.
To be clear, I'm not calling you crazy in general. I'm calling the idea of bypassing libc to call syscalls directly under any kind of normal circumstances crazy. There is tons of work to be done around here to improve D, and IMHO, reimplementing OS functions just because they're written in C is a total waste of time and an invitation for bugs - in addition to making the druntime code that much less portable, since it bypasses the API layer that was standardized for POSIX systems. It's the kind of thing that's going to cause us way more work, more bugs, and make us that much less compatible with existing libraries. And for what? To _maybe_ get slightly better performance (which you probably won't get)? I honestly think that trying to bypass libc to talk to the kernel directly is actively worse than just using libc much as it would be great if we somehow lived in a world where every library we used was written in D. But the reality of the matter is that there is a _lot_ out there already written in C where it simply makes no sense to try to replace it. We're always going to need to interoperate with C unless we somehow convince all of the C developers to at least switch to -betterC (which obviously isn't happening). - Jonathan M Davis
You're underestimating the benefits. It's not just to be eventually slightly faster. It makes @safe versions possible, this in turn avoids a lot of @trusted calls, so reduces review effort. It also allows developing your own kernels (for maybe new hardware) without needing a C toolchain, and it makes D more self-contained. There are certainly more advantages. And if you don't like it, the C stuff remains there for you to use.
Aug 24 2018
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Friday, August 24, 2018 4:44:31 PM MDT Dominikus Dittes Scherkl via 
Digitalmars-d wrote:
 You're underestimating the benefits. It's not just to be
 eventually slightly faster. It makes @safe versions possible,
 this in turn avoids a lot of @trusted calls, so reduces review
 effort. It allows also to develop own kernels (for maybe new
 hardware) without needing a C toolchain and it makes D more self
 contained. There are certainly more advantages. And if you don't
 like it, the c stuff remains there for you to use.
It doesn't reduce the number of @trusted calls at all. Best case, you're talking about using @trusted with syscall instead of a function like stat, and for many of the C functions, it's perfectly reasonable to just mark their bindings as @trusted, eliminating the need to use @trusted yourself entirely. You're not reducing the number of uses of @trusted. All you get out of it is that then you've written the OS code using @system rather than relying on the C programmers to do their job when writing basic OS stuff. You then might catch a problem writing that code that someone writing the same code in C wouldn't have caught as easily, but that's it. You're bypassing a _heavily_ used and tested piece of code written by experts just because you want to be able to have @safe verify it, or because you want to avoid it simply because it's C. And because of the prevalence of pointers to local addresses in such code, there's a pretty good chance that a lot of it will have to be hand-vetted and marked with @trusted anyway instead of being able to take advantage of @safe.

And if someone wants to write an OS in D, then fine. They can do it. There's nothing about our current approach that stops them. As I understand it, there have already been a couple of projects to do exactly that, but you're not going to replace the major OSes with D any time soon (or likely ever), and the vast majority of D code is going to be interacting with those OSes - most of which provide their public APIs via C (many via the same POSIX API).

By using libc like everyone else, we get to take advantage of that work and work with a more portable API, risking fewer bugs in the process. Right now, we don't have to test all of the bindings in druntime to death, because they're just bindings, and we can rely on the libc guys to have done their job, whereas we would then be doing their jobs if we insisted on bypassing libc. It's a maintenance nightmare for little to no benefit. 
I don't want to have to deal with it as a maintainer, and I don't want programs that I write to be bypassing libc using a custom implementation just because someone decided that they didn't like the fact that it was in C instead of D. - Jonathan M Davis
Aug 24 2018
prev sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 8/24/18 6:16 PM, Jonathan M Davis wrote:
 On Friday, August 24, 2018 7:46:57 AM MDT Mike Franklin via Digitalmars-d
 wrote:
 On Friday, 24 August 2018 at 13:21:25 UTC, Jonathan M Davis wrote:
 I think that you're crazy.
No, I just see more potential in D than you do.
To be clear, I'm not calling you crazy in general. I'm calling the idea of bypassing libc to call syscalls directly under any kind of normal circumstances crazy. There is tons of work to be done around here to improve D, and IMHO, reimplementing OS functions just because they're written in C is a total waste of time and an invitation for bugs - in addition to making the druntime code that much less portable, since it bypasses the API layer that was standardized for POSIX systems.
Let me say that I both agree with Jonathan and with Mike. I think we should reduce Phobos' dependence on the user-library part of libc, while at the same time not re-inventing how the OS bindings are called. For example, using Martin's std.io library instead of <stdio.h>.

I really don't want to see dlang have to maintain POSIX system calls on all supported OSes when that's already being done for us.

Windows makes this simpler -- the system calls are separate from the C runtime. It would be nice if POSIX systems were that way, but it's both silly to reinvent the system calls (they are on every OS anyways, and in shared-library form) and a maintenance nightmare.

For platforms that DON'T have an OS abstraction, or where it's split out from the user-library part of libc, it would be perfectly acceptable to write a shim there if needed. I'd be surprised if it's not already present in C form.

-Steve
Aug 24 2018
parent Mike Franklin <slavo5150 yahoo.com> writes:
On Friday, 24 August 2018 at 22:52:07 UTC, Steven Schveighoffer 
wrote:

 I really don't want to see dlang have to maintain posix system 
 calls on all supported OSes when that's already being done for 
 us.

 Windows makes this simpler -- the system calls are separate 
 from the C runtime. It would be nice if Posix systems were that 
 way, but it's both silly to reinvent the system calls (they are 
 on every OS anyways, and in shared-library form), and a 
 maintenance nightmare.
Keep in mind that we only need to implement the system calls that we need. I haven't looked into Phobos, and probably never will. My interest is mostly in druntime. At this time, I think we only need 2: `sbrk` and `mmap` for `malloc`. I don't consider that much of a maintenance burden, and `malloc` and friends are my least concern at the moment.

We're disproportionately leveraging libc in druntime; there are only a few things needed from libc for druntime, and I think I can demonstrate benefit writing them in D (or if someone else wants to, please do, I may never even get to it). If I even stick around in the D community long enough to pursue this, the change will be incremental and I'll demonstrate benefit each step of the way.

Mike
Aug 24 2018
prev sibling next sibling parent Guillaume Piolat <spam smam.org> writes:
On Friday, 24 August 2018 at 13:21:25 UTC, Jonathan M Davis wrote:
 I honestly don't see how attempting to divorce druntime from 
 libc does anything but increase the amount of work that we have 
 to do and increase the likelihood that basic OS functionality 
 is going to be buggy, since we will have then reimplemented it 
 rather than using the same core OS functionality that everyone 
 else is using.
+1

Now that the leadership talks about leverage points: http://donellameadows.org/archives/leverage-points-places-to-intervene-in-a-system/

When rated from 1 to 9, this is a 10; it likely brings negative value. This is even worse if a D_SIMD version of AVX is implemented because of that, losing precious Walter-time (an illiquid asset) for a non-existent fraction of users: also a 10. Whereas merely adding colors to DUB would be a 7.

I feel even more uneasy about all that @safe focus, as if native programmers started to care overnight. No, we still don't. People say they do.
Aug 24 2018
prev sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 24 August 2018 at 13:21:25 UTC, Jonathan M Davis wrote:
 On Friday, August 24, 2018 6:05:40 AM MDT Mike Franklin via 
 Digitalmars-d wrote:
 You're basically trying to bypass the OS' public API if 
 you're trying to bypass libc.
No I'm trying to bypass libc and use the OS API directly.
And my point is that most OSes consider libc to be their OS API (Linux doesn't, but it's very much abnormal in that respect).
Well, it used to be the case that it was normal to call the OS directly by using traps, but since the context switch is so expensive on modern CPUs, we have a situation where the calling stub is a fraction of the calling cost these days. Thus most don't bother with it. What usually can happen if you don't use the C stubs with dynamic linkage is that your precompiled program won't work with new versions of the OS. But that can also happen with static linkage.
 Trying to bypass it means reimplementing core OS functionality 
 and risking all of the bugs that go with it.
It is the right thing to do for a low level language. Why have libc as a dependency if you want to enable hardware-oriented programming? Using existing libraries also puts limits on low level language semantics.
 If you're talking about avoiding libc functions like strcmp 
 that's one thing, but if you're talking about reimplementing 
 stuff that uses syscalls, then honestly, I think that you're 
 crazy.
No, it isn't a crazy position, so why the hostile tone? Libc isn't available in many settings. Not even in WebAssembly.
Sep 04 2018
prev sibling parent reply Petar Kirov [ZombineDev] <petar.p.kirov gmail.com> writes:
On Friday, 24 August 2018 at 09:46:08 UTC, Jonathan M Davis wrote:
 On Friday, August 24, 2018 2:46:06 AM MDT Dave Jones via 
 Digitalmars-d wrote:
 On Friday, 24 August 2018 at 04:50:34 UTC, Mike Franklin wrote:
 On Friday, 24 August 2018 at 04:12:42 UTC, Jonathan M Davis 
 wrote:


 It's not a problem for Phobos to depend on the C standard 
 library.  My goals have to do with making D, the language, 
 freestanding (a.k.a nimble-D).
If the poster feature for D in the upcoming years is memory safety then how can Walter seriously consider continued dependency on libc?
For any kind of normal operating system, you _have_ to use libc. It's part of the OS. Some pieces could be done without it, but on the whole, you use libc if you want to talk to the OS. That's just life. The only exceptions I'm aware of to that are embedded devices, and my understanding is that if anything, such devices are more and more likely to run a full-blown OS, making it that much less likely that you'd ever do anything without libc.
Another notable exception is WebAssembly. Others include the whole distroless container trend, going down to unikernels for use as a slim base for microservices. Why ship a container with full libc if you're only going to use a handful of syscalls? That's simply a larger-than-necessary attack surface.
 Sure, we don't need to call C functions like strcmp, but if you 
 want to do things like I/O, you have to use the OS' API, and 
 that's libc. And yes, that means that we depend on code that's 
 not vetted via @safe, but at this point, the major OSes are 
 written in C, and they present a C API. So, accessing that 
 functionality means depending on the OS devs to have gotten it 
 right, much as it would be nice if it were verified with 
 something like @safe. The same is going to happen with plenty 
 of existing libraries that are almost certainly not going to 
 have replacements in D (e.g. postgres, ffmpeg, etc). We're 
 never going to completely escape the  safety issues introduced 
 by C. Ultimately, the best that we can do is make sure that the 
 actual D code is @safe as long as any libraries it uses are 
 @safe.

 - Jonathan M Davis
One of the things that makes Go successful is the quality/ease of use of its toolchain. They have full cross-compilation support out of the box because they don't rely on anything from the C toolchain (libc, linker, etc.). They implement everything themselves, from the syscall layer up through the whole stack (e.g. https://golang.org/pkg/syscall , https://golang.org/pkg/os/ ). Rust is also taking the same path.

In recent times it seems that every new systems programming language is trying to prove that while it can interact with existing C libraries effortlessly, it has to be fully independent in order to be taken seriously. You can't replace C/C++ if you depend on them.

One small data point: changes to the C toolchain on Linux broke druntime's backtrace support several times. If we didn't depend on it, this probably wouldn't have happened.
Aug 24 2018
parent Elronnd <elronnd elronnd.net> writes:
On Friday, 24 August 2018 at 11:55:47 UTC, Petar Kirov 
[ZombineDev] wrote:
 One of the things that makes Go successful is the quality/ease 
 of use of its toolchain. They have full cross-compilation 
 support out of the box because they don't rely on anything from 
 the C toolchain (libc, linker, etc.). They implement implement 
 everything themselves, from the syscall layer
This is something that has caused breakage on newer versions of macOS, and broadly makes it difficult to port to new OSes. Something that might be worth pursuing, though, is implementing some of the things in core.stdc in pure D, like printf or strcmp, but printf *would* ultimately forward to the actual libc fwrite.
Aug 24 2018
prev sibling parent reply Jon Degenhardt <jond noreply.com> writes:
On Friday, 24 August 2018 at 00:46:14 UTC, Mike Franklin wrote:
 It seems, from someone without much historical perspective, 
 that Phobos was intended to be something like the .Net 
 Framework for D.  Perhaps there are a few fundamentals 
 (std.algorithm, std.allocator, etc.) to keep, but for the 
 others... move 'em to Dub and let the "free market" sort it out.
That might work for some use cases, but not for others. For my use cases, a rock-solid standard library is a basic requirement (think STL, Boost, etc). These don't normally come out of a loose-knit community of individuals; there needs to be some sort of organizational presence involved to ensure quality, consistency, completeness, etc. If Phobos or an equivalent wasn't available at its present level of quality, then D wouldn't be in the consideration set.

On the other hand, my use cases don't have the requirements that drive other folks towards removing dependence on druntime and similar. An individual or organization's prioritization preferences will depend on their goals.

--Jon
Aug 23 2018
parent Guillaume Piolat <spam smam.org> writes:
On Friday, 24 August 2018 at 01:50:53 UTC, Jon Degenhardt wrote:
 quality, consistency, completeness
My point is that it's more important to have useful, easy to change, messy libraries. If you restrict yourself to whatever is in Phobos then the burden on core developers is only ever increasing.
Aug 24 2018
prev sibling next sibling parent reply Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
wrote:
 No, no and no.

 I was holding out on replying to this thread to see how the 
 community would react. The vibe I'm getting, however, is that 
 the people who are seeing D's problems have given up on 
 effecting change.

 It is no secret that when I joined Weka, I was a sole D 
 detractor among a company quite enamored with the language. I 
 used to have quite heated water cooler debates about that point 
 of view.

 Every single one of the people rushing to defend D at the time 
 has since come around. There is still some debate on whether, 
 points vs. counter points, choosing D was a good idea, but the 
 overwhelming consensus inside Weka today is that D has *fatal* 
 flaws and no path to fixing them.
A list, please? Now that I actually have time to fix things, I intend to do so.
 And by "fatal", I mean literally flaws that are likely to 
 literally kill the language.

 And the thing that brought them around is not my power of 
 persuasion. The thing that brought them around was spending a 
 couple of years working with the language on an every-day basis.

 And you will notice this in the way Weka employees talk on this 
 forum: except me, they all disappeared. You used to see Idan, 
 Tomer and Eyal post here. Where are they?

 This forum is hostile to criticism, and generally tries to keep 
 everyone using D the same way. If you're cutting edge D, the 
 forum is almost no help at all. Consensus among former posters 
 here is that it is generally a waste of time, so almost 
 everyone left, and those who didn't, stopped posting.

 And it's not just Weka. I've had a chance to talk in private to 
 some other developers. Quite a lot have serious, fundamental 
 issues with the language. You will notice none of them speaks 
 up on this thread.

 They don't see the point.
That reminds me, what happened to our conversation with Ali Çehreli about splitting General into Technical and less technical? Not to imply that the problems listed are purely technical. There is a distinct lack of well-documented direction beyond incremental improvements.
 No technical project is born great. If you want a technical 
 project to be great, the people working on it have to focus on 
 its *flaws*. The D's community just doesn't do that.

 To sum it up: fatal flaws + no path to fixing + no push from 
 the community = inevitable eventual death.

 With great regrets,
 Shachar
Indeed. It is time to push, then. Nic
Aug 22 2018
next sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 08:20, Nicholas Wilson wrote:
 On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh wrote:
 No, no and no.

 I was holding out on replying to this thread to see how the community 
 would react. The vibe I'm getting, however, is that the people who are 
 seeing D's problems have given up on effecting change.

 It is no secret that when I joined Weka, I was a sole D detractor 
 among a company quite enamored with the language. I used to have quite 
 heated water cooler debates about that point of view.

 Every single one of the people rushing to defend D at the time has 
 since come around. There is still some debate on whether, points vs. 
 counter points, choosing D was a good idea, but the overwhelming 
 consensus inside Weka today is that D has *fatal* flaws and no path to 
 fixing them.
A list, please? Now that I actually have time to fix things, I intend to do so.
Let's start with this one: https://issues.dlang.org/show_bug.cgi?id=14246#c6

The problems I'm talking about are not easily fixable. They stem from features not playing well together.

One that hurt me lately was the lack of a way to pass a scoped lazy argument (i.e. - to specify that the implicit delegate need not allocate its frame, because it is not used outside the function call).

Shachar
Aug 22 2018
next sibling parent reply Jacob Carlborg <doob me.com> writes:
On Thursday, 23 August 2018 at 05:37:12 UTC, Shachar Shemesh 
wrote:

 One that hurt me lately was a way to pass a scoped lazy 
 argument (i.e. - to specify that the implicit delegate need not 
 allocate its frame, because it is not used outside the function 
 call).
I don't see why we just can't add support for scoped lazy parameters. It's already in the language just with a different syntax (delegates). That would probably be an easy fix (famous last words :)). I guess it would be better if it could be inferred. -- /Jacob Carlborg
Aug 22 2018
parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 09:17, Jacob Carlborg wrote:
 On Thursday, 23 August 2018 at 05:37:12 UTC, Shachar Shemesh wrote:
 
 One that hurt me lately was a way to pass a scoped lazy argument (i.e. 
 - to specify that the implicit delegate need not allocate its frame, 
 because it is not used outside the function call).
I don't see why we just can't add support for scoped lazy parameters. It's already in the language just with a different syntax (delegates). That would probably be an easy fix (last famous words :)). I guess it would be better if it could be inferred. -- /Jacob Carlborg
Here's the interesting question, though: is this *going* to happen? We've known about this problem for ages now. No movement.

Some of the other problems are considerably less easy to fix. Examples:

A struct may have @disable this(this), @disable this() and/or @disable init. Can you say that libraries.. Actually, strike that. Can you say that the *standard* libraries work with all 8 combinations?

Shachar
Aug 22 2018
next sibling parent reply Iain Buclaw <ibuclaw gdcproject.org> writes:
On Thursday, 23 August 2018 at 06:34:04 UTC, Shachar Shemesh 
wrote:
 On 23/08/18 09:17, Jacob Carlborg wrote:
 On Thursday, 23 August 2018 at 05:37:12 UTC, Shachar Shemesh 
 wrote:
 
 One that hurt me lately was a way to pass a scoped lazy 
 argument (i.e. - to specify that the implicit delegate need 
 not allocate its frame, because it is not used outside the 
 function call).
I don't see why we just can't add support for scoped lazy parameters. It's already in the language just with a different syntax (delegates). That would probably be an easy fix (last famous words :)). I guess it would be better if it could be inferred. -- /Jacob Carlborg
Here's the interesting question, though: is this *going* to happen? We've known about this problem for ages now. No movement.
It's on my todo list, however I've instead been doomed to work on higher priority things. More generally though, some time should be spent on trying out things in the spirit of "will it blend" just to see what happens. Putting effort towards having a more homogeneous environment in the language should in the long run pay its dividends.
 Some of the other problems are considerably less easy to fix. 
 Examples:

 A struct may have @disable this(this), @disable this() and/or 
 @disable init. Can you say that libraries..

 Actually, strike that.

 Can you say that the *standard* libraries work with all 8 
 combinations?
The same goes for using shared, immutable and const against the standard library. Iain
Aug 23 2018
parent reply Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Thursday, 23 August 2018 at 07:00:01 UTC, Iain Buclaw wrote:
 On Thursday, 23 August 2018 at 06:34:04 UTC, Shachar Shemesh 
 wrote:
 On 23/08/18 09:17, Jacob Carlborg wrote:
 I don't see why we just can't add support for scoped lazy 
 parameters. It's already in the language just with a 
 different syntax (delegates). That would probably be an easy 
 fix (last famous words :)). I guess it would be better if it 
 could be inferred.
Here's the interesting question, though: is this *going* to happen? We've known about this problem for ages now. No movement.
It's on my todo list, however I've instead been doomed to work on higher priority things. More generally though, some time should be spent on trying out things in the spirit of "will it blend" just to see what happens. Putting effort towards having a more homogeneous environment in the language should in the long run pay its dividends.
Is there even any way to escape a lazy? If not, then lazy is identical to scope lazy. E.g. https://run.dlang.io/is fails to compile
Aug 23 2018
parent Iain Buclaw <ibuclaw gdcproject.org> writes:
On Thursday, 23 August 2018 at 09:29:30 UTC, Nicholas Wilson 
wrote:
 On Thursday, 23 August 2018 at 07:00:01 UTC, Iain Buclaw wrote:
 On Thursday, 23 August 2018 at 06:34:04 UTC, Shachar Shemesh 
 wrote:
 On 23/08/18 09:17, Jacob Carlborg wrote:
 I don't see why we just can't add support for scoped lazy 
 parameters. It's already in the language just with a 
 different syntax (delegates). That would probably be an easy 
 fix (last famous words :)). I guess it would be better if it 
 could be inferred.
Here's the interesting question, though: is this *going* to happen? We've known about this problem for ages now. No movement.
It's on my todo list, however I've instead been doomed to work on higher priority things. More generally though, some time should be spent on trying out things in the spirit of "will it blend" just to see what happens. Putting effort towards having a more homogeneous environment in the language should in the long run pay its dividends.
Is there even any way to escape a lazy? If no, then lazy is identical to scope lazy. E.g. https://run.dlang.io/is fails to compile
Link not found. I assume you tried something like returning a delegate that references a lazy parameter?
Aug 28 2018
prev sibling parent Jacob Carlborg <doob me.com> writes:
On Thursday, 23 August 2018 at 06:34:04 UTC, Shachar Shemesh 
wrote:

 Here's the interesting question, though: is this *going* to 
 happen?
I didn't know about the issue until you recently brought it up. -- /Jacob Carlborg
Aug 23 2018
prev sibling next sibling parent reply nkm1 <t4nk074 openmailbox.org> writes:
On Thursday, 23 August 2018 at 05:37:12 UTC, Shachar Shemesh 
wrote:
 Let's start with this one:
 https://issues.dlang.org/show_bug.cgi?id=14246#c6

 The problems I'm talking about are not easily fixable. They 
 stem from features not playing well together.

 One that hurt me lately was a way to pass a scoped lazy 
 argument (i.e. - to specify that the implicit delegate need not 
 allocate its frame, because it is not used outside the function 
 call).
The only real problem with D is that it's a language designed with GC in mind, yet there are numerous attempts to use it without GC. Also, supporting GC-less programming gets in the way of improving D's GC (which is pretty damn bad by modern standards). That's the only real technical problem.

For example, the "bug" above just means that D doesn't support RAII (in the C++ sense). That's hardly a *fatal flaw*. Lots of languages don't. And yes, most of those just use GC to dispose of memory - other resources are rarely used (compared to memory) and it's not a problem to manage them manually. You also mentioned lazy parameters allocating... GC thing again. Just allocate then? No?

IMO, if getting the maximum number of users is the main goal, D is indeed going the wrong way. It would be better to get rid of @nogc, betterC, dip1000, implement write barriers and use them to improve the GC. Martin Nowak (I think) mentioned that write barriers will decrease performance of D programs by 1-5%. Seems like a small price to pay for better GC with shorter pauses. It would also probably be simpler technically than stuff like dip1000 and rewriting Phobos.

Of course, maximizing the number of users is not the only goal, or even the main one. My understanding is that Walter wants a "systems language" with "zero cost abstractions". Well, it's very well possible that D's design precludes that. Other than memory management, I don't see any real fundamental problems.
Aug 22 2018
next sibling parent reply JN <666total wp.pl> writes:
On Thursday, 23 August 2018 at 06:34:01 UTC, nkm1 wrote:
 The only real problem with D is that it's a language designed 
 with
 GC in mind, yet there are numerous attempts to use it without 
 GC.
 Also, supporting GC-less programming gets in the way of 
 improving
 D's GC (which is pretty damn bad by modern standards).
 That's the only real technical problem.
I think a large part is defining what kind of users D wants to attract. There are two main groups of programmers, and there is a vast rift between those groups.

One group is people who come from languages like Javascript. These people are OK with things like garbage collectors and, in cases where it matters, have learned to work around them (avoid allocations in hot loops, etc.). I feel like D1 was attractive for these people for having the convenience they are used to from their languages (batteries-included standard library, automatic memory management), with additional features that their languages/environments struggle with (C interop, native binaries), everything packed with a very clean syntax.

The second group are the C/C++ programmers, the 'zero cost abstraction' group. For this group of programmers, any overhead is a disadvantage, and the garbage collector is unusable for most use cases (whether true or not, that's the perception). D1 appealed to those people too, for having a clean syntax and the features they know without having to include the monster that is Boost.

The battlefield was different back then too. Around D2 came the competition, be it Rust, Go, or C++17. Go appeals more to the first group of programmers, since it has a GC and mostly sticks to webservice usage. Rust is heavily appealing to the zero-cost-abstraction group, and C++17 obviously appeals to C++ folks.

Is it possible to make a language that both groups would be happy to use? Perhaps, or perhaps the gap is too wide. Is adding features like dip1000 and betterC spreading ourselves too thin? Perhaps. Perhaps there are features that aren't really used, and should be reworked or cut from the language instead (has anyone ever used contracts?). D's not UNIX (DNU?), but the first rule of UNIX philosophy is "Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new 'features'." It may or may not be relevant here.

BTW, on an offtopic note: the thread title doesn't look too good. Imagine being a newcomer, and the first thread you see on the forum is titled "D is dead".
Aug 23 2018
next sibling parent Laeeth Isharc <laeeth laeeth.com> writes:
On Thursday, 23 August 2018 at 07:27:56 UTC, JN wrote:
 On Thursday, 23 August 2018 at 06:34:01 UTC, nkm1 wrote:
 The only real problem with D is that it's a language designed 
 with
 GC in mind, yet there are numerous attempts to use it without 
 GC.
 Also, supporting GC-less programming gets in the way of 
 improving
 D's GC (which is pretty damn bad by modern standards).
 That's the only real technical problem.
I think a large part is defining what kind of users D wants to attract. There are two main groups of programmers, and there is a vast rift between those groups.

One group is people coming from languages like Java or JavaScript. These people are OK with things like garbage collectors, and in cases where it matters, they have learned to work around them (avoid allocations in hot loops, etc.). I feel like D1 was attractive to these people for having the convenience they are used to from their languages (batteries-included standard library, automatic memory management), with additional features that their languages/environments struggle with (C interop, native binaries), everything packed in a very clean syntax.

The second group are the C/C++ programmers, the 'zero-cost abstraction' group. For these programmers, any overhead is a disadvantage, and a garbage collector is unusable for most use cases (whether true or not, that's the perception). D1 appealed to these people too, for having a clean syntax and the features they know without having to include the monster that is Boost.

The battlefield was different back then, too. Around D2 came the competition, be it Rust, Go, or C++17. Go appeals more to the first group of programmers, since it has a GC and mostly sticks to webservice usage. Rust appeals heavily to the zero-cost abstraction group, and C++17 obviously appeals to C++ folks.

Is it possible to make a language that both groups would be happy to use? Perhaps, or perhaps the gap is too wide. Is adding features like dip1000 and betterC spreading ourselves too thin? Perhaps. Perhaps there are features that aren't really used, and should be reworked or cut from the language instead (has anyone ever used contracts?). D's not UNIX (DNU?), but the first rule of the UNIX philosophy is "Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new 'features'." It may or may not be relevant here.

BTW, on the offtopic note - the thread title doesn't look too good. Imagine being a newcomer, and the first thread you see on the forum is titled "D is dead".
Not sure about your taxonomy. There are quite a few people involved in D who are native-code C/C++ developers, spiritually and by prior work focus, who moved to D and don't mind, in fact maybe relish, the GC. One might include Walter and Andrei in this. I don't want to speak for Atila, but I would like to see his face if you tell him that he is deep down a Java programmer at heart, as I don't believe that's quite right. I don't think he despises the GC, even though he has written libraries that make it easier not to use it.

It's silly to think that languages are in a death match against each other in some zero-sum game, because there's no justification for it. There have never been so many programmers in the world as today, and in twenty years I doubt there will be fewer. There have also never been as many practitioners who write code as part of a job doing something else as today. Furthermore, the amount of code written depends on the extent to which one can express one's ideas efficiently in code. I personally would have programmed much less since 2014 if I had had to write in a language I didn't find agreeable, and I would have hired fewer programmers too - quite a lot of code exists now that simply wouldn't exist without D.

How can you possibly form an opinion on the different sorts of programmer? Most programmers aren't active on social media and don't work for companies that talk in public about their work. Go, for example, just isn't a relevant alternative to what I am doing, except perhaps for DevOps. It solves a different kind of problem and is intended to be used by a different sort of person - Google hire a lot of programmers without much experience and have to find a way to get useful work out of them. That's different from my own challenges. For me the relevant alternatives might be Julia and Python, and even then they aren't very close substitutes.

Implicitly D is even a competitor for VBA, because our little DSL written in D solves many of the problems VBA and Excel are used for in a much better way. One could have done that in many other languages too, but it is in D because it was originally written for another purpose and turns out to solve this problem as well. There's a chap here with a family business where it's PHP versus D. For Bastian it was a modern Pascal dialect vs D vs Ada.

Most code is written in business and academe. The beautiful thing about being up-and-coming is that you don't need to take on the contender; you just need to find a few more adherents from diverse sources. By no means do you want to be thought of as a threat until it's too late. You want most people to write you off whilst having a very high appeal to those to whom you do appeal.

Does D have a very high appeal to certain kinds of programmer? "So I can work from home and write D? Sounds good, when should I start?" Another guy: "What are your compensation expectations?" "I earn X, but if it's working in D then we can talk about it - I could take a pay cut." These aren't callow youths filled with unrealistic ideas about what it's like to be a programmer or what it's like working in D.

I think the niche of D is defined differently from the conventional categories that people have come up with. It's especially suited to some domains, yes (these happen to be rapidly growing domains - they should make an ETF based on the share prices of companies using D, if only they were public), but I think the appeal is based on values, taste and virtues. As regards the latter, you have to be able to endure a bit of discomfort to get very far with D. So for what I am doing personally, the very things that people grumble about on Reddit are net positives for me. I need people who know how to learn and can figure things out without being instructed step by step what to do.

Many businesses are quite different from each other, particularly the SMEs that create the jobs. Even in my little part of the world there's a massive difference between companies in the sector, and they do things in quite different ways because they have different problems, goals and people. The world of social media is a very different place from the world of principals making decisions about how to solve their business challenges. And I tell you that, faced with an understanding of these problems, the same people that might write on social media about the problems of D may have a different assessment when it comes to the practical commercial question of whether D is indeed a good answer to the problem they face.

It's very hard to generalise. If one is in the business of making predictions, one ought to make them concrete and back them with money as a wager; otherwise it's just talk, and talk is much less interesting than concrete actions. In open source, even more than in business, problems are sometimes also opportunities, and it's not deluded happy talk to suggest there's some benefit in trying to be the change in the world that you wish to see.
Aug 23 2018
prev sibling parent reply rjframe <dlang ryanjframe.com> writes:
On Thu, 23 Aug 2018 07:27:56 +0000, JN wrote:

 I think a large part is defining what kind of users D wants to attract.
I've begun wondering whether "pragmatism" is sometimes used as a code word for indecision.
 Is it possible to make a language that both groups would be happy to
 use? Perhaps, or perhaps the gap is too wide. Is adding features like
 dip1000 and betterC spreading ourselves too thin? Perhaps. Perhaps there
 are features that aren't really used, and should be reworked or cut from
 the language instead
I do think that D can do it. And I think D is the only language I've looked at that can do it. But I think it's going to take Walter and Andrei, in conversation with the core team, putting together a real list of priorities and setting a high-level direction. Look at what the end goal really is and what it will take to get there. The current high level document tends to read as a list of what's already being worked on, but piecemeal improvements probably aren't going to cut it -- this goes back to the leverage conversation Andrei started earlier.
 (has anyone ever used contracts?).
I do. It's a shame D doesn't take them seriously. As it is, I generally use them solely to express intent, which you don't get by placing asserts in the function body. I often read the function signature of functions I'm calling without reading the body, so separating the asserts from the body is helpful. And they're often useful on interfaces.
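As a concrete illustration of that use (a hypothetical function, not from any real codebase), D's in/out contracts put the preconditions and postconditions right next to the signature, where a caller reads them:

```d
// Integer square root: the contracts state intent in the signature area,
// rather than burying asserts in the body.
uint isqrt(uint n)
in { assert(n <= 1_000_000); }  // precondition: keeps the arithmetic below in range
out (r) { assert(r * r <= n && (r + 1) * (r + 1) > n); }  // postcondition ties result to input
do
{
    uint r = 0;
    while ((r + 1) * (r + 1) <= n)
        ++r;
    return r;
}
```

With contracts compiled in (the default in non-release builds), a reader of just the signature and contracts learns what the asserts in the body would otherwise have to convey.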
Sep 01 2018
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sat, Sep 01, 2018 at 01:26:01PM +0000, rjframe via Digitalmars-d wrote:
 On Thu, 23 Aug 2018 07:27:56 +0000, JN wrote:
[...]
 (has anyone ever used contracts?).
I do.
Me too. They are very useful to express intent, even if the current implementation leaves some things to be desired.
 It's a shame D doesn't take them seriously.
Not sure what you mean by that. Care to elaborate?
 As it is, I generally use them solely to express intent, which you
 don't get by placing asserts in the function body. I often read the
 function signature of functions I'm calling without reading the body,
 so separating the asserts from the body is helpful.
[...] Yes. The current implementation of contracts leaves some things to be desired, but I'm hopeful after the recent syntax revamp was accepted and merged into git master.

The next milestone to fight for is pushing the checking of contracts to the caller, rather than the callee. This will be important to solve the currently very annoying (and debilitating) problem with binary shared libraries, in that it will allow the same shared binaries to be used when compiling both with and without contracts. It should be the end user's build script that decides whether or not contracts are compiled in, but currently this is not the case.

This change may also address the current hackish implementation of subclass contracts (which involves catching Errors, an arguably dangerous thing to do), though I'm not 100% sure.

T

-- 
Старый друг лучше новых двух.
Sep 01 2018
prev sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Saturday, September 1, 2018 8:31:36 AM MDT H. S. Teoh via Digitalmars-d 
wrote:
 This change may also address the current hackish implementation of
 subclass contracts (which involves catching Errors, an arguably
 dangerous thing to do), though I'm not 100% sure.
AFAIK, there's absolutely nothing required to fix that other than just implementing it. As I understand it, the assertions act normally and thus throw AssertErrors, but there should be no technical reason why they couldn't be transformed into something else. e.g.

    in
    {
        assert(foo < 7);
    }

could be lowered to something like

    in
    {
        if(!(foo < 7)) return false;
        return true;
    }

The only differences would then be if any functions called in the contract resulted in an AssertError (since that would no longer count as a contract failure - which is arguably a bug fix) and that explicitly throwing an AssertError wouldn't work anymore - but I expect that that's rare enough that it wouldn't be all that big a deal.

- Jonathan M Davis
Sep 01 2018
prev sibling next sibling parent reply Ali <fakeemail example.com> writes:
On Thursday, 23 August 2018 at 06:34:01 UTC, nkm1 wrote:
 The only real problem with D is that it's a language designed 
 with
 GC in mind, yet there are numerous attempts to use it without 
 GC.
 Also, supporting GC-less programming gets in the way of 
 improving
 D's GC (which is pretty damn bad by modern standards).
 That's the only real technical problem.
I completely agree with this opinion
 For example, the "bug" above just means that D doesn't support 
 RAII
 (in the C++ sense). That's hardly a *fatal flaw*. Lots of 
 languages don't
 support RAII.
RAII is nowadays described as C++'s nicest, most important and most powerful feature.

D, again, missed the opportunity on that one.

D needs a more serious and ambitious roadmap. The pragmatic approach described in the main blog post is not good enough for a language that has a small community like D's.
Aug 23 2018
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, August 23, 2018 9:03:24 AM MDT Ali via Digitalmars-d wrote:
 RAII is nowadays described as C++ nicest , more important and
 most powerful feature
 D again, missed the opportunity on that one
D was designed to have RAII, and it does. It's just that the implementation is buggy (which is obviously a serious problem), so depending on what your program is doing, it's not going to work correctly. It's not like we opted to not have RAII in D, and I'm sure that it will be fixed at some point. So, while it can certainly be argued that we've dropped the ball by not getting it fully fixed by now, I don't really see how it could be termed a missed opportunity. - Jonathan M Davis
Aug 23 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 9:19 AM, Jonathan M Davis wrote:
 D was designed to have RAII, and it does. It's just that the implementation
 is buggy (which is obviously a serious problem), so depending on what your
 program is doing, it's not going to work correctly. It's not like we opted
 to not have RAII in D, and I'm sure that it will be fixed at some point. So,
 while it can certainly be argued that we've dropped the ball by not getting
 it fully fixed by now, I don't really see how it could be termed a missed
 opportunity.
As far as I know, the only known bug in RAII is https://issues.dlang.org/show_bug.cgi?id=14246 where constructors are not properly unwinding partially constructed objects. My personal opinion is that constructors that throw are an execrable programming practice, and I've wanted to ban them. (Andrei, while sympathetic to the idea, felt that too many people relied on it.) I won't allow throwing constructors in dmd or any software I have authority over. (Having throwing destructors is even worse, it's just madness. Although it is allowed in C++, it doesn't actually work.) My personal opinion aside, it is a feature in D, and should work correctly, and is a serious issue for those that use throwing constructors. (It's not a disaster, though, as scope(failure) can be used to work around it.)
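The scope(failure) workaround mentioned here can look something like the following sketch (a hypothetical two-resource type; the second failure is simulated with a flag rather than a real acquisition):

```d
import core.stdc.stdlib : free, malloc;
import std.exception : enforce;

struct TwoBuffers
{
    void* a;
    void* b;

    this(size_t n, bool simulateFailure)
    {
        a = enforce(malloc(n), "first allocation failed");
        // If anything below throws, release `a` by hand, since the
        // partially constructed object is not unwound (issue 14246).
        scope(failure) { free(a); a = null; }
        enforce(!simulateFailure, "second acquisition failed");
        b = enforce(malloc(n), "second allocation failed");
    }

    ~this()
    {
        free(b);
        free(a);
    }
}
```

If the constructor throws after acquiring `a`, the scope(failure) block compensates for the destructor that will never run, so nothing leaks.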
Aug 23 2018
next sibling parent reply David Nadlinger <code klickverbot.at> writes:
On Thursday, 23 August 2018 at 21:31:41 UTC, Walter Bright wrote:
 My personal opinion is that constructors that throw are an 
 execrable programming practice, and I've wanted to ban them. 
 (Andrei, while sympathetic to the idea, felt that too many 
 people relied on it.) I won't allow throwing constructors in 
 dmd or any software I have authority over.
Throwing constructors are fundamental for making RAII work in a composable fashion. If constructors are not allowed to throw and you want to avoid manually creating a "uninitialized" state – which is error-prone and defeats much of the point of an RAII strategy –, all dependencies need to be injected externally, that is, constructed independently and then passed into the constructor. Sometimes, inversion of control is of course the right call – cf. the hype around DI –, but sometimes you'd rather cleanly abstract the implementation details away. Banning them from the language only pushes the complexity of handling semi-constructed objects into ad-hoc user code solutions, which I'd argue is worse in terms of usability and potential for bugs. I suppose you view this as advantageous because you place more weight on the language not having to explicitly deal with this scenario in the text of the specification? — David
Aug 23 2018
next sibling parent reply Ethan <gooberman gmail.com> writes:
On Thursday, 23 August 2018 at 22:12:17 UTC, David Nadlinger 
wrote:
 Throwing constructors are fundamental for making RAII work in a 
 composable fashion.
Is that actually true, or would handling exceptions within the constructor allow one to initialise the object to an invalid state and thus still follow RAII paradigms correctly? The amount of bugs and pain I've had to deal with in production code because of throwing constructors makes me lean more towards Walter's viewpoint here. Even one of our most basic datatypes - the IEEE float - initialises to an invalid state. Why can't RAII objects do the same if it performs operations it *knows* throw exceptions?
Aug 23 2018
parent David Nadlinger <code klickverbot.at> writes:
On Thursday, 23 August 2018 at 23:06:00 UTC, Ethan wrote:
 Is that actually true, or would handling exceptions within the 
 constructor allow one to initialise the object to an invalid 
 state and thus still follow RAII paradigms correctly?
If you end up needing to check for that uninitialised state in all the other member functions for correctness (to avoid operating on invalid pointers/handles/…), that's imho a bastardisation of RAII that destroys most of the benefits. Also, how do you inform the client code that something went wrong internally? Most likely, you'd want to construct the objects externally and pass them into the constructor to get around that; hence the composability argument.
 The amount of bugs and pain I've had to deal with in production 
 code because of throwing constructors makes me lean more 
 towards Walter's viewpoint here.
What was the root cause of these issues?
 Why can't RAII objects do the same if it performs operations it 
 *knows* throw exceptions?
Apart from the above argument, this would require making all constructors implicitly nothrow to avoid degrading into faith-based programming. — David
Aug 23 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 3:12 PM, David Nadlinger wrote:
 On Thursday, 23 August 2018 at 21:31:41 UTC, Walter Bright wrote:
 My personal opinion is that constructors that throw are an execrable 
 programming practice, and I've wanted to ban them. (Andrei, while sympathetic 
 to the idea, felt that too many people relied on it.) I won't allow throwing 
 constructors in dmd or any software I have authority over.
Throwing constructors are fundamental for making RAII work in a composable fashion.
I understand my opinions on this diverge from the conventional wisdom.
 If constructors are not allowed to throw and you want to avoid manually
creating 
 a "uninitialized" state – which is error-prone and defeats much of the point
of 
 an RAII strategy –, all dependencies need to be injected externally, that
is, 
 constructed independently and then passed into the constructor. Sometimes, 
 inversion of control is of course the right call – cf. the hype around DI
–, but 
 sometimes you'd rather cleanly abstract the implementation details away.
 
 Banning them from the language only pushes the complexity of handling 
 semi-constructed objects into ad-hoc user code solutions, which I'd argue is 
 worse in terms of usability and potential for bugs.
 
 I suppose you view this as advantageous because you place more weight on the 
 language not having to explicitly deal with this scenario in the text of the 
 specification?
It's easy to specify. That's not an issue at all. It's also easy to implement - where my PR for it failed was that it broke existing code that should never have compiled anyway (for example, a nothrow constructor would then call a throwing destructor, or an @safe constructor would now call an un-@safe destructor). Dealing with this likely means a compiler switch so an upgrade path is easier.

Let's deal first with the easy case - throwing destructors. The problem is that unwinding the stack to deal with the exception means calling destructors. You then have the infamous "double fault exception", and C++ deals with it by terminating the program, which is hardly useful. D deals with it via "chained exceptions", which is terrifyingly difficult to understand. If you believe it is understandable, just try to understand the various devious test cases in the test suite. I regard D's chained exceptions as an utter failure.

Back to throwing constructors.

1) They are expensive, adding considerable hidden bloat in the form of finally blocks, one for each constructed field. These unwinding frames defeat optimization. The concept of "zero-cost exception handling" is a bad joke. (Even Chandler Carruth stated that LLVM basically gives up trying to optimize in the presence of exception handlers.) Herb Sutter has a recent paper out proposing an alternative, completely different error handling scheme for C++ because of this issue.

2) The presence of constructors that throw makes code hard to reason about. (I concede that maybe that's just me.) I like looking at code and knowing the construction is guaranteed to succeed. Somehow, I've been able to use C++ for decades without needing throwing constructors.

Let's take the canonical example, a mutex:

    {
        Mutex a;   // acquires a mutex, throws if it fails
        ... locked code ...
    }  // mutex is released

My suggestion:

    {
        Mutex a;       // creates a mutex
        a.acquire();   // does the obvious thing
        ... locked code ...
    }  // mutex is released

It's still RAII, as the destructor checks to see if the Mutex is acquired, and if so, releases it. You might argue "my code cannot handle that extra check in the destructor." Fair point, but stack that against the cost of the EH bloat in the constructor, and (for me) the inherently unintuitive nature of a constructor that tries to do far too much.

3) Much of the utility of throwing constructors in C++ comes from "what if the constructor fails to allocate memory". In D, out-of-memory errors are fatal, so no recovery is necessary. That pushes any requirement for throwing constructors to the fringes in the first place.
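A sketch of that two-phase pattern (the Mutex type here is hypothetical; a module-level counter stands in for a real OS mutex so the control flow is visible):

```d
int liveLocks;  // stands in for the state of a real OS mutex

struct Lock
{
    private bool held;

    // Construction is trivial and cannot fail; the operation that
    // can fail is an ordinary member function.
    void acquire()
    {
        ++liveLocks;  // a real implementation would lock here, and may throw
        held = true;
    }

    ~this()
    {
        if (held)  // RAII: release only if acquire() actually succeeded
        {
            --liveLocks;
            held = false;
        }
    }
}
```

Usage then reads as in the suggestion above: construct (never throws), call acquire() (may throw), and the destructor releases iff the flag is set, so a failed acquire leaves nothing to undo.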
Aug 23 2018
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
One takeaway from this is despite my loathing of throwing constructors, they
are 
a part of D and I need to make them work properly.

I.e. I am not dictating everything in D based on my personal opinions. I try to 
pick my battles.
Aug 23 2018
prev sibling next sibling parent reply David Nadlinger <code klickverbot.at> writes:
On Thursday, 23 August 2018 at 23:27:51 UTC, Walter Bright wrote:
 D deals with it via "chained exceptions", which is terrifyingly 
 difficult to understand. If you believe it is understandable, 
 just try to understand the various devious test cases in the 
 test suite.
I don't think that assessment is accurate. Yes, I ported EH to a few new targets and wrote the first correct implementation of exception chaining for LDC, so I'm probably a couple of standard deviations off the average D user in that regard. But said average D user doesn't care about most of the nuances of this problem, like the details of implementing exception chaining without allocating too much, or which exceptions take precedence in various intentionally twisted test cases.

What they do care about is that the fact that an error has occurred is propagated sensibly in all cases without requiring extra attention, and that information as to the origin is not lost (hence chaining rather than just replacement). Heck, the fact that we still don't have default backtrace handlers that consistently work on all platforms is probably a much bigger problem than the minutiae of exception chaining behaviour.

All this is not to say that nothrow constructors aren't a good idea, though.

———
 1) They are expensive, adding considerable hidden bloat in the 
 form of finally blocks, one for each constructing field. These 
 unwinding frames defeat optimization. […]
This cost is inherent to the problem, at least as long as exceptions are used to represent the error conditions in question in the first place. Whether the potentially throwing operations are performed in the constructor or in another method, either way the object will need to be destructed and all preceding initialisation steps undone when an error occurs.
 2) The presence of constructors that throw makes code hard to 
 reason about. (I concede that maybe that's just me.) I like 
 looking at code and knowing the construction is guaranteed to 
 succeed.
This might be just you indeed. How are constructors different than other method calls in that regard? (Granted, constructors can be called implicitly in C++.)
 Somehow, I've been able to use C++ for decades without needing 
 throwing constructors.
How much of what would be considered "modern" C++ code have you written in that time, as opposed to C-with-classes style?
 It [having a separate initialisation method]'s still RAII
One might wonder about the acronym then, but whether your example should be called RAII is an entirely different debate, and one I'm not particularly interested in (I've always thought something like "Resource Release Is Destruction" or just "Scope-based resource management" would be a better name anyway).
 You might argue "my code cannot handle that extra check in the 
 destructor."
It's not just one extra check in the destructor. It's an extra check in every member function, to throw an exception if called on an object that has not been initialised. You might argue that these should be errors/assertions instead, and hence are not as expensive. True, but this points to another problem: Splitting up the initialisation procedure invites a whole class of bugs that would otherwise be syntactically impossible to write. There is considerable theoretical and practical beauty to the idea that as an object is accessible, it is in a valid state. (Linear/affine types, cf. Rust, show the power of this notion extended to ownership.) RAII in the conventional sense (acquisition in the constructor) effectively provides a way to encode 1 bit of typestate. By blurring the line between object initialisation and the phase where it is fully initialised, that descriptive power is lost.
 […] (for me) the inherently unintuitive nature of a constructor 
 that tries to do far too much.
I suppose our opinions just differ here then. To me, this is exactly what I intuitively expect a constructor to do: Create an instance of an object in a normal state. Drawing the line anywhere else, for example (implicitly) allocating memory but not initialising it to a fully usable state, seems artificial to me.
 3) Much of the utility of throwing constructors in C++ comes 
 from "what if the constructor fails to allocate memory". In D, 
 out of memory errors are fatal, no recovery is necessary.
This is unrelated to the discussion at hand, but std.experimental.allocator does allow handling allocation failure, and with good reason: it doesn't make sense to abort the program when a fixed-size local cache or a static global object pool (common in embedded programming) is exhausted. — David
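For instance, with the fixed-size Region building block from std.experimental.allocator (buffer and request sizes here are arbitrary, chosen only for illustration), exhaustion is reported as a null slice that the caller can handle, not a fatal error:

```d
import std.experimental.allocator.building_blocks.region : InSituRegion;

// Returns true iff exhausting the pool yields a recoverable null
// slice instead of aborting the program.
bool exhaustionIsRecoverable()
{
    InSituRegion!128 pool;        // small fixed-size in-place buffer
    auto a = pool.allocate(64);   // fits in the pool
    auto b = pool.allocate(128);  // 64 + 128 > 128: cannot fit
    return a !is null && b is null;
}
```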
Aug 23 2018
next sibling parent reply David Nadlinger <code klickverbot.at> writes:
On Friday, 24 August 2018 at 03:53:38 UTC, David Nadlinger wrote:
 […]
 All this is not to say that nothrow constructors aren't a good 
 idea, though.
This was meant to say nothrow DEstructors, as hopefully obvious from context. —David
Aug 24 2018
parent David Gileadi <gileadisNOSPM gmail.com> writes:
On 8/24/18 10:02 AM, David Nadlinger wrote:
 On Friday, 24 August 2018 at 03:53:38 UTC, David Nadlinger wrote:
 […]
 All this is not to say that nothrow constructors aren't a good idea, 
 though.
This was meant to say nothrow DEstructors, as hopefully obvious from context. —David
I was about to throw down some constructive criticism, but you caught it early ;)
Aug 24 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 8:53 PM, David Nadlinger wrote:
 On Thursday, 23 August 2018 at 23:27:51 UTC, Walter Bright wrote:
 D deals with it via "chained exceptions", which is terrifyingly difficult to 
 understand. If you believe it is understandable, just try to understand the 
 various devious test cases in the test suite.
I don't think that assessment is accurate. Yes, I ported EH to a few new targets and wrote the first correct implementation of exception chaining for LDC, so I'm probably a couple of standard deviations off the average D user in that regard. But said average D user doesn't care about most of the nuances of this problem, like the details of implementing exception chaining without allocating too much,
I find myself unable to explain the rationale of the behavior exhibited by the current chaining system. I dared not change it, as I presumed somebody surely built a store around it. It does not simply chain exceptions.
 or which exceptions take precedence in various intentionally twisted test
cases.
The only documentation for this is the test suite itself, which does not have any documentation or rationale either, just tests. I would appreciate it if you did document what it's supposed to do and why, as likely nobody else knows. Maybe if I understood why I'd be more forgiving of it :-)
 What they do care about is that the fact that an error has occurred is 
 propagated sensibly in all cases without requiring extra attention, and that 
 information as to the origin is not lost (hence chaining rather than just 
 replacement). Heck, the fact that we still don't have default backtrace
handlers 
 that consistently work on all platforms is probably a much bigger problem than 
 the minutiae of exception chaining behaviour.
I wish the minutiae was documented somewhere :-( as I care about the nuances of it, if only because I'm ultimately responsible for keeping it working correctly.
 All this is not to say that nothrow constructors aren't a good idea, though.
Not much point to debating that, as they're here to stay.
Aug 27 2018
parent reply Don <prosthetictelevisions teletubby.medical.com> writes:
On Monday, 27 August 2018 at 07:34:37 UTC, Walter Bright wrote:
 On 8/23/2018 8:53 PM, David Nadlinger wrote:
 On Thursday, 23 August 2018 at 23:27:51 UTC, Walter Bright 
 wrote:
 D deals with it via "chained exceptions", which is 
 terrifyingly difficult to understand. If you believe it is 
 understandable, just try to understand the various devious 
 test cases in the test suite.
I don't think that assessment is accurate. Yes, I ported EH to a few new targets and wrote the first correct implementation of exception chaining for LDC, so I'm probably a couple of standard deviations off the average D user in that regard. But said average D user doesn't care about most of the nuances of this problem, like the details of implementing exception chaining without allocating too much,
I find myself unable to explain the rationale of the behavior exhibited by the current chaining system. I dared not change it, as I presumed somebody surely built a store around it. It does not simply chain exceptions.
 or which exceptions take precedence in various intentionally 
 twisted test cases.
The only documentation for this is the test suite itself, which does not have any documentation or rationale either, just tests. I would appreciate it if you did document what it's supposed to do and why, as likely nobody else knows. Maybe if I understood why I'd be more forgiving of it :-)
 What they do care about is that the fact that an error has 
 occurred is propagated sensibly in all cases without requiring 
 extra attention, and that information as to the origin is not 
 lost (hence chaining rather than just replacement). Heck, the 
 fact that we still don't have default backtrace handlers that 
 consistently work on all platforms is probably a much bigger 
 problem than the minutiae of exception chaining behaviour.
I wish the minutiae was documented somewhere :-( as I care about the nuances of it, if only because I'm ultimately responsible for keeping it working correctly.
I can explain this, since I did the original implementation.

When I originally implemented this, I discovered that the idea of "chained exceptions" was hopelessly naive. The idea was that while processing one exception, if you encounter a second one, you chain them together. Then you get a third, fourth, etc.

The problem is that it's much more complicated than that. Each of the exceptions can be a chain of exceptions themselves. This means that you don't end up with a chain of exceptions, but rather a tree of exceptions. That's why there are those really nasty test cases in the test suite. The examples in the test suite are very difficult to understand if you expect it to be a simple chain!

On the one hand, I was very proud that I was able to work out the barely-documented behaviour of Windows SEH, and it was really thorough. In the initial implementation, all the complexity was covered. It wasn't the bugfix-driven development which dmd usually operates under <g>.

But on the other hand, once you can see all of the complexity, exception chaining becomes much less convincing as a concept. Sure, the full exception tree is available in the final exception which you catch. But is it of any use? I doubt it very much. It's pretty clearly a net loss to the language: it increases complexity with negligible benefit. Fortunately in this case, the cost isn't really high.

-Don.
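The simple, non-tree case of chaining can be seen in a few lines (illustrative only; the devious cases arise when the colliding exceptions are themselves chains):

```d
void f()
{
    try
        throw new Exception("first");
    finally
        // Thrown while "first" is in flight: D appends this to the
        // in-flight exception's chain instead of terminating, and
        // "first" continues to propagate.
        throw new Exception("second");
}

void demo()
{
    try
        f();
    catch (Exception e)
    {
        assert(e.msg == "first");        // the original exception is caught...
        assert(e.next.msg == "second");  // ...carrying the collided one via .next
    }
}
```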
Aug 27 2018
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/27/2018 7:02 AM, Don wrote:
 I can explain this, since I did the original implementation.
 [...]
Thank you, Don. And you do have my mad respect for figuring out Windows SEH.
Aug 28 2018
prev sibling parent reply Trass3r <un known.com> writes:
On Thursday, 23 August 2018 at 23:27:51 UTC, Walter Bright wrote:
 Back to throwing constructors.

 1) They are expensive, adding considerable hidden bloat in the 
 form of finally blocks, one for each constructing field. These 
 unwinding frames defeat optimization. The concept of "zero-cost 
 exception handling" is a bad joke. (Even Chandler Carruth 
 stated that the LLVM basically gives up trying to optimize in 
 the presence of exception handlers.) Herb Sutter has a recent 
 paper out proposing an alternative, completely different, error 
 handling scheme for C++ because of this issue.
Are you referring to http://wg21.link/P0709 ?
Aug 24 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/24/2018 1:45 AM, Trass3r wrote:
 Are you referring to http://wg21.link/P0709 ?
Yes. (please don't use link shorteners, they tend to go poof) http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p0709r1.pdf
Aug 24 2018
parent Trass3r <un known.com> writes:
On Friday, 24 August 2018 at 09:52:20 UTC, Walter Bright wrote:
 On 8/24/2018 1:45 AM, Trass3r wrote:
 Are you referring to http://wg21.link/P0709 ?
Yes. (please don't use link shorteners, they tend to go poof) http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p0709r1.pdf
I expect it to always point to the latest revision. Not sure if it's an official WG21 service though. Anyway, very interesting paper and approach. I'm eager to see how this will work out.
Aug 24 2018
prev sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, August 23, 2018 3:31:41 PM MDT Walter Bright via Digitalmars-d 
wrote:
 On 8/23/2018 9:19 AM, Jonathan M Davis wrote:
 D was designed to have RAII, and it does. It's just that the
 implementation is buggy (which is obviously a serious problem), so
 depending on what your program is doing, it's not going to work
 correctly. It's not like we opted to not have RAII in D, and I'm sure
 that it will be fixed at some point. So, while it can certainly be
 argued that we've dropped the ball by not getting it fully fixed by
 now, I don't really see how it could be termed a missed opportunity.
As far as I know, the only known bug in RAII is https://issues.dlang.org/show_bug.cgi?id=14246 where constructors are not properly unwinding partially constructed objects.
Yeah. I've used RAII plenty in D without problems, but the fact remains that certain uses of it are very broken right now thanks to the constructor issue. I suspect that Shachar's as negative about this as he is in part because having RAII go wrong with the kind of low-level stuff Weka does would be a serious problem. If you're not dealing with some sort of resource management in the constructor, having it not unwind properly on failure isn't likely to be a big deal (much as it's still not good), because it just leaves the partially constructed object in a bad state, and that object isn't going to be used anyway, because the constructor failed.
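A hypothetical sketch of the failure mode tracked in issue 14246 (the types and names here are invented for illustration): when a later step of construction throws, the destructors of already-constructed fields should run as part of unwinding the partially constructed object, and the bug is that this unwinding does not happen correctly.

```d
import std.stdio;

struct Resource
{
    this(int id) { writeln("acquire ", id); }
    ~this()      { writeln("release"); }
}

struct Holder
{
    Resource a;

    this(int dummy)
    {
        a = Resource(1);  // succeeds; 'a' now owns a resource
        throw new Exception("ctor failed");
        // With correct RAII, unwinding here should destroy the
        // already-constructed 'a'. Issue 14246 is that partially
        // constructed objects are not properly unwound.
    }
}

void main()
{
    try
    {
        auto h = Holder(0);
    }
    catch (Exception e)
    {
        writeln("caught: ", e.msg);
    }
}
```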
 My personal opinion is that constructors that throw are an execrable
 programming practice, and I've wanted to ban them. (Andrei, while
 sympathetic to the idea, felt that too many people relied on it.) I won't
 allow throwing constructors in dmd or any software I have authority over.
Wow. I'm surprised by this. I definitely agree with David on this one. Without being able to throw from a constructor, you can't really have it fail, and there are times when it needs to be able to fail. Not being able to have throwing constructors basically means having to do two-part initialization, which I would have thought was almost universally considered bad. I would consider constructors and exceptions to go absolutely hand-in-hand and wouldn't expect a language without exceptions to have constructors, unless maybe it constructed all objects on the heap such that the result could be null and could thus be checked for failure afterwards.
 (Having throwing destructors is even worse, it's just madness. Although it
 is allowed in C++, it doesn't actually work.)
Yeah. We probably should have required that destructors be nothrow and forced destructor failures to be treated as Errors. I recall Herb Sutter wanting to make default destructors in C++11 noexcept, but the committee decided not to because of some stray code base that was doing something insane with exceptions and destructors. We could probably still fix it in D (particularly since I think that it is generally accepted that throwing from destructors is a bad idea), but it would undoubtedly break code simply because of functions being called which aren't nothrow but won't actually throw - thus forcing stuff like try { ... } catch(Exception) assert(0); I expect that you'd have a riot on your hands though if you actually tried to push for getting rid of throwing constructors. - Jonathan M Davis
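The workaround Jonathan alludes to, sketched for a hypothetical destructor that calls a function whose signature isn't nothrow but which is known never to actually throw (all names here are invented):

```d
struct Wrapper
{
    // Illustrative: the signature permits exceptions even though
    // we know this function never actually throws.
    static void mayThrow() { }

    ~this() nothrow
    {
        // To call it from a nothrow destructor, the (dead) exception
        // path has to be swallowed explicitly.
        try
            mayThrow();
        catch (Exception e)
            assert(0);
    }
}

void main()
{
    Wrapper w;  // nothrow destructor runs at end of scope
}
```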
Aug 23 2018
next sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 24/08/18 05:33, Jonathan M Davis wrote:
 
 Yeah. I've used RAII plenty in D without problems, but the fact remains that
 certain uses of it are very broken right now thanks to the constructor
 issue. I suspect that Shachar's as negative about this as he is in part
 because having RAII go wrong with the kind of low-level stuff Weka does
 would be a serious problem
Yes. I will point out that I was never bitten by this bug either. We found it while trying to figure out whether we want to start relying on destructors internally. The thing is, when a destructor doesn't run, it costs you a *lot* of time to find out why. We actually have stuff that is downright weird as a result of not trusting destructors. That stuff is so weird that for Mecca I essentially said I'm going to rely on them. Sadly, this means that this bug has become a bigger blocker than it was.
 (Having throwing destructors is even worse, it's just madness. Although it
 is allowed in C++, it doesn't actually work.)
Yeah. We probably should have required that destructors be nothrow and force destructor failures to be treated as Errors.
I'm sorry, but I'm not following your logic. If you're willing to have an error raised by a destructor abort the whole program, isn't the C++ solution preferable (abort the program only on double errors, which hardly ever happens)? Shachar
Aug 24 2018
next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Friday, August 24, 2018 1:31:21 AM MDT Shachar Shemesh via Digitalmars-d 
wrote:
 Yeah. We probably should have required that destructors be nothrow and
 force destructor failures to be treated as Errors.
I'm sorry, but I'm not following your logic. If you're willing to have an error raised by a destructor abort the whole program, isn't the C++ solution preferable (abort the program only on double errors, which hardly ever happens)?
The C++ code bases that I've worked on have typically marked destructors with throw() or noexcept, which effectively kills your program whenever an exception is thrown from a destructor, and I'm not sure I've ever seen it triggered. It's just not a typical bug in my experience. Either way, since I would consider it a serious bug for an exception to be thrown from a destructor, if it happens, I'd rather have the program just die so that the bug can be noticed and fixed. - Jonathan M Davis
Aug 24 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/24/2018 12:31 AM, Shachar Shemesh wrote:
 If you're willing to have an error raised by a destructor abort the whole 
 program, isn't the C++ solution preferable (abort the program only on double 
 errors, which hardly ever happens)?
C++ has changed: https://akrzemi1.wordpress.com/2013/08/20/noexcept-destructors/
Aug 24 2018
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 7:33 PM, Jonathan M Davis wrote:
 Wow. I'm surprised by this. I definitely agree with David on this one.
 Without being able to throw from a constructor, you can't really have it
 fail, and there are times when it needs to be able to fail. Not being able
 to have throwing constructors basically means having to do two-part
 initialization which I would have thought was almost universally considered
 bad.
Not really. You can use a factory function which can do it for you.
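A sketch of the factory-function alternative Walter mentions (the type and function names here are illustrative assumptions, not from dmd): the fallible work happens in the factory and failure is reported through the return value, so the constructor itself never needs to throw.

```d
import std.typecons : Nullable, nullable;

struct File
{
    private int fd;

    // The constructor only binds an already-valid handle; it cannot fail.
    private this(int fd) { this.fd = fd; }

    // The factory does the fallible work and reports failure in the
    // return value instead of throwing from a constructor.
    static Nullable!File open(string path)
    {
        int fd = osOpen(path);           // hypothetical low-level call
        if (fd < 0)
            return Nullable!File.init;   // empty: open failed
        return nullable(File(fd));
    }
}

// Stand-in for a real OS call, so the sketch is self-contained.
int osOpen(string path) { return path.length ? 3 : -1; }
```

A caller then checks `File.open(p).isNull` rather than wrapping construction in try/catch.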
 I expect that you'd have a riot on your hands though if you actually tried
 to push for getting rid of throwing constructors.
Maybe not. C++11 tries hard to discourage their use, and we know how hard they try to not disrupt existing code. I'd be favorably disposed towards a DIP getting rid of them.
Aug 24 2018
prev sibling parent reply John Carter <john.carter taitradio.com> writes:
On Friday, 24 August 2018 at 02:33:31 UTC, Jonathan M Davis wrote:
 Walter Bright wrote:
 My personal opinion is that constructors that throw are an 
 execrable programming practice, and I've wanted to ban them. 
 (Andrei, while sympathetic to the idea, felt that too many 
 people relied on it.) I won't allow throwing constructors in 
 dmd or any software I have authority over.
Wow. I'm surprised by this. I expect that you'd have a riot on your hands though if you actually tried to push for getting rid of throwing constructors.
A generation of programmers have been misled down a deep rabbit hole, thinking that "Constructors" are things that "Construct" objects. This has led to a generation of vaguely smelly code that "does too much work in the constructor" (of which throwing exceptions is evidence). The last few years I have told myself (and anyone who doesn't back away fast enough) that "Constructors" do _not_ construct objects; they are "Name Binders." (Sort of like lisp's "let" macro.) They bind instance variable names to pre-existing sub-objects. This attitude, coupled with a rule of thumb, "make it immutable unless I prove to myself that I _need_ it to be mutable", has led to a major improvement in my code.
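A small sketch of the style John describes (the types are invented for illustration): the sub-objects are built up front, and the constructor only binds immutable names to them, doing no work of its own.

```d
struct Socket { string host; }
struct Logger { string prefix; }

struct Service
{
    // Immutable unless mutability is proven necessary.
    immutable Socket socket;
    immutable Logger logger;

    // A "name binder": it ties instance names to pre-existing
    // sub-objects and nothing more, so it has no way to fail.
    this(Socket s, Logger l)
    {
        socket = s;   // first assignment in a ctor initializes the field
        logger = l;
    }
}

void main()
{
    auto svc = Service(Socket("example.org"), Logger("svc: "));
}
```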
Aug 26 2018
next sibling parent John Carter <john.carter taitradio.com> writes:
Or to put it another way....

RAII should be

"Taking Ownership of a Resource is Initialization, and 
relinquishing ownership is automatic at the object life time end, 
but Failure to Acquire a Resource Is Not An Exceptional 
Circumstance"

Not as catchy, but far less problematic.
Aug 26 2018
prev sibling parent reply Guillaume Piolat <spam smam.org> writes:
On Monday, 27 August 2018 at 03:06:17 UTC, John Carter wrote:
 The last few years I have told myself (and anyone who doesn't 
 back away fast enough) that "Constructors" do _not_ construct 
 objects, they are "Name Binders." (Sort of like lisp's "let" 
 macro)

 They bind instance variable names to pre-existing sub-objects.
One could say there is "storage" and "instantiation" of an object. C++ binds the two in the same operation. D does not, T.init must be a valid object. This is a major cultural change, though I believe the D way is superior on the efficiency stand-point (you can create large arrays of valid objects quite fast).
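A sketch of the `T.init` point (the example type is illustrative): every D struct has a compile-time default value, so storage can be filled with valid objects by blitting, without running any per-element constructor.

```d
struct Point
{
    int x = 1;   // field defaults are baked into Point.init
    int y = 2;
}

void main()
{
    // No constructor runs per element; each slot is a copy of Point.init.
    Point[1000] pts;
    assert(pts[0].x == 1 && pts[999].y == 2);

    // C++-style "must construct" semantics can be opted into instead:
    //   struct NoDefault { @disable this(); }
    //   NoDefault n;   // error: default construction is disabled
}
```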
Aug 27 2018
parent FeepingCreature <feepingcreature gmail.com> writes:
On Monday, 27 August 2018 at 11:02:14 UTC, Guillaume Piolat wrote:
 C++ binds the two in the same operation.

 D does not, T.init must be a valid object. This is a major 
 cultural change, though I believe the D way is superior on the 
 efficiency stand-point (you can create large arrays of valid 
 objects quite fast).
I mean, then again, @disable this() ...
Aug 27 2018
prev sibling parent Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Thursday, 23 August 2018 at 06:34:01 UTC, nkm1 wrote:
 On Thursday, 23 August 2018 at 05:37:12 UTC, Shachar Shemesh 
 wrote:
 Let's start with this one:
 https://issues.dlang.org/show_bug.cgi?id=14246#c6

 The problems I'm talking about are not easily fixable. They 
 stem from features not playing well together.

 One that hurt me lately was a way to pass a scoped lazy 
 argument (i.e. - to specify that the implicit delegate need 
 not allocate its frame, because it is not used outside the 
 function call).
The only real problem with D is that it's a language designed with GC in mind, yet there are numerous attempts to use it without GC. Also, supporting GC-less programming gets in the way of improving D's GC (which is pretty damn bad by modern standards). That's the only real technical problem.

For example, the "bug" above just means that D doesn't support RAII (in the C++ sense). That's hardly a *fatal flaw*. Lots of languages don't have those. And yes, most of those just use GC to dispose of memory - other resources are rarely used (compared to memory) and it's not a problem to manage them manually. You also mentioned lazy parameters allocating... GC thing again. Just allocate then? No?

IMO, if getting the maximum number of users is the main goal, D is indeed going the wrong way. It would be better to get rid of @nogc, betterC, dip1000, implement write barriers and use them to improve GC. Martin Nowak (I think) mentioned that write barriers will decrease performance of D programs by 1-5%. Seems like a small price to pay for better GC with shorter pauses. It would also probably be simpler technically than stuff like dip1000 and rewriting Phobos.

Of course, maximizing the number of users is not the only goal, or even the main one. My understanding is that Walter wants a "systems language" with "zero cost abstractions". Well, it's very well possible that D's design precludes that. Other than memory management, I don't see any real fundamental problems.
+1 Making D a "true" C++ competitor is not going to happen soon. Even Rust, which IS by definition a true C++ competitor (no GC, etc), will still find it very hard to replace C++ in its current niche markets, like embedded and game development. While putting all the "funded" efforts in making D a *direct* would be an achievable goal, IMHO...
Sep 04 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/22/2018 10:37 PM, Shachar Shemesh wrote:
 Let's start with this one:
 https://issues.dlang.org/show_bug.cgi?id=14246#c6
https://github.com/dlang/dmd/pull/8697
Sep 15 2018
prev sibling parent reply =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 08/22/2018 10:20 PM, Nicholas Wilson wrote:

 That reminds me, what happened to our conversation with Ali Çehreli 
 about splitting general [newsgroup/forum] into Technical and less technical?
Even I remember that conversation. :) I don't remember who were involved but as soon as I opened the discussion at the next dinner table, I was discouraged; I think people were against the idea. Ali
Aug 29 2018
next sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Wednesday, August 29, 2018 10:38:48 AM MDT Ali ehreli via Digitalmars-d 
wrote:
 On 08/22/2018 10:20 PM, Nicholas Wilson wrote:
 That reminds me, what happened to our conversation with Ali ehreli
 about splitting general [newsgroup/forum] into Technical and less
 technical?
Even I remember that conversation. :) I don't remember who were involved but as soon as I opened the discussion at the next dinner table, I was discouraged; I think people were against the idea.
As it is, we're lucky if we can even get folks to post in D.Learn when their post should be there instead of the main newsgroup... Also, dlang-study was already basically an attempt to get more focused, technical discussions separate from the main newsgroup, and it didn't work. - Jonathan M Davis
Aug 29 2018
parent rikki cattermole <rikki cattermole.co.nz> writes:
On 30/08/2018 5:20 AM, Jonathan M Davis wrote:
 Also, dlang-study was already basically an attempt to get more focused,
 technical discussions separate from the main newsgroup, and it didn't work.
It wasn't as simple as posting to D.General. It required subscription (based upon what I did), and it couldn't just be posted to the way D.General can, even from e.g. Thunderbird. So there was a barrier to actually posting there.
Aug 29 2018
prev sibling parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Wednesday, 29 August 2018 at 16:38:48 UTC, Ali Çehreli wrote:
 On 08/22/2018 10:20 PM, Nicholas Wilson wrote:

 That reminds me, what happened to our conversation with Ali 
 Çehreli about splitting general [newsgroup/forum] into 
 Technical and less technical?
Even I remember that conversation. :) I don't remember who were involved but as soon as I opened the discussion at the next dinner table, I was discouraged; I think people were against the idea. Ali
It was you, me, Matthias Lang, Shachar, Liran, Don(?) and maybe some more. That is a pity; it would be nice to have a list with a better SNR than General. Greater industry participation on the technical matters would be a nice thing to have.
Aug 29 2018
prev sibling next sibling parent reply Mike Franklin <slavo5150 yahoo.com> writes:
On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
wrote:

 And it's not just Weka. I've had a chance to talk in private to 
 some other developers. Quite a lot have serious, fundamental 
 issues with the language. You will notice none of them speaks 
 up on this thread.

 They don't see the point.

 No technical project is born great. If you want a technical 
 project to be great, the people working on it have to focus on 
 its *flaws*. The D's community just doesn't do that.

 To sum it up: fatal flaws + no path to fixing + no push from 
 the community = inevitable eventual death.
The D Foundation has an Open Collective page (https://opencollective.com/dlang) with a $12,000 annual "Corporate Bronze" option that includes 3 priority bug fixes per month. Is that not a worthwhile investment for Weka or other organizations invested in D to help address some of the problems you're encountering? If not, is there an option that would be? Mike
Aug 22 2018
parent Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 09:04, Mike Franklin wrote:
 On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh wrote:
 
 And it's not just Weka. I've had a chance to talk in private to some 
 other developers. Quite a lot have serious, fundamental issues with 
 the language. You will notice none of them speaks up on this thread.

 They don't see the point.

 No technical project is born great. If you want a technical project to 
 be great, the people working on it have to focus on its *flaws*. The 
 D's community just doesn't do that.

 To sum it up: fatal flaws + no path to fixing + no push from the 
 community = inevitable eventual death.
The D Foundation has an Open Collective page (https://opencollective.com/dlang) with a $12,000 annual "Corporate Bronze" option that includes 3 priority bug fixes per month.  Is that not a worthwhile investment for Weka or other organizations invested in D to help address some of the problems you're encountering?  If not, is there an option that would be?
I will definitely pass it on. Shachar
Aug 22 2018
prev sibling next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
wrote:
 On 22/08/18 21:34, Ali wrote:
 On Wednesday, 22 August 2018 at 17:42:56 UTC, Joakim wrote:
 Pretty positive overall, and the negatives he mentions are 
 fairly obvious to anyone paying attention.
Yea, I agree, the negatives are not really negative Walter not matter how smart he is, he is one man who can work on the so many things at the same time Its a chicken and egg situation, D needs more core contributors, and to get more contributors it needs more users, and to get more users it need more core contributors
No, no and no.

I was holding out on replying to this thread to see how the community would react. The vibe I'm getting, however, is that the people who are seeing D's problems have given up on effecting change.

It is no secret that when I joined Weka, I was the sole D detractor in a company quite enamored with the language. I used to have quite heated water cooler debates about that point of view. Every single one of the people rushing to defend D at the time has since come around. There is still some debate on whether, points vs. counterpoints, choosing D was a good idea, but the overwhelming consensus inside Weka today is that D has *fatal* flaws and no path to fixing them. And by "fatal", I mean flaws that are likely to literally kill the language.

And the thing that brought them around is not my power of persuasion. The thing that brought them around was spending a couple of years working with the language on an every-day basis.

And you will notice this in the way Weka employees talk on this forum: except me, they all disappeared. You used to see Idan, Tomer and Eyal post here. Where are they?

This forum is hostile to criticism, and generally tries to keep everyone using D the same way. If you're cutting edge D, the forum is almost no help at all. Consensus among former posters here is that it is generally a waste of time, so almost everyone left, and those who didn't stopped posting.

And it's not just Weka. I've had a chance to talk in private to some other developers. Quite a lot have serious, fundamental issues with the language. You will notice none of them speaks up on this thread. They don't see the point.

No technical project is born great. If you want a technical project to be great, the people working on it have to focus on its *flaws*. The D community just doesn't do that.

To sum it up: fatal flaws + no path to fixing + no push from the community = inevitable eventual death.
Can you list what you or other Weka devs believe those fatal flaws to be? Because you've not listed any here, which makes you no better than some noob that comes in here, says D has to get better or it will die, then can't articulate what they mean by "better" or, worse, mentions something trivial. Of course, you've actually used the language for years, so presumably you've got some real concerns, but do you really think the bug you just posted is "fatal" to the language? If you think there are fatal flaws, you might as well list them, whether technical or in the development process, or you will just be ignored like any other noob who talks big and can't back it up. You may be ignored anyway ;) but at least you'll have made a case that shows you know what you're talking about.
Aug 22 2018
next sibling parent reply Iain Buclaw <ibuclaw gdcproject.org> writes:
On Thursday, 23 August 2018 at 06:58:13 UTC, Joakim wrote:
 [...]
Can you list what you or other Weka devs believe those fatal flaws to be? Because you've not listed any here, which makes you no better than some noob that comes in here, says D has to get better or it will die, then can't articulate what they mean by "better" or worse, mentions something trivial. Of course, you've actually used the language for years, so presumably you've got some real concerns, but do you really think the bug you just posted is "fatal" to the language? If you think there are fatal flaws, you might as well list them, whether technical or the development process, or you will just be ignored like any other noob who talks big and can't back it up. You may be ignored anyway, ;) but at least you'll have made a case that shows you know what you're talking about.
I'd define fatal as something that can be fixed, but only by breaking 100% of everyone's code, even if the change is net positive all round. However, how big a problem really is is in the eye of the beholder.

An example:

Symptom: The compiler can't discard unused symbols at compile time, and so it will spend a lot of time pointlessly optimising code.

Problem: D has no notion of symbol visibility.

Possible Solution: Make all globals hidden by default unless 'export'.

Side effects: Everyone will be spending weeks to months fixing their libraries in order to only mark what should be visible outside the current compilation unit as 'export'.

Benefits: Faster compile times. In the most extreme example, I've built one project on github with gdc -O2 and build time went from 120 seconds to just 3!

Iain.
Aug 23 2018
next sibling parent reply Eugene Wissner <belka caraus.de> writes:
On Thursday, 23 August 2018 at 07:37:07 UTC, Iain Buclaw wrote:
 Symptom: The compiler can't discard unused symbols at compile 
 time, and so it will spend a lot of time pointlessly optimising 
 code.

 Problem: D has no notion of symbol visibility.

 Possible Solution: Make all globals hidden by default unless 
 'export'.

 Side effects: Everyone will be spending weeks to months fixing 
 their libraries in order to only mark what should be visible 
 outside the current compilation unit as 'export'.

 Benefits: Faster compile times, as in, in the most extreme 
 example I've built one project on github with gdc -O2 and build 
 time went from 120 seconds to just 3!

 Iain.
You can probably solve it like haskell does: Export all symbols by default. So this module exports everything: ----------- module MyModule where -- My code goes here ----------- this one not: ----------- module MyModule (symbol1, symbol2) where -- My code goes here -----------
Aug 23 2018
parent rikki cattermole <rikki cattermole.co.nz> writes:
On 23/08/2018 7:47 PM, Eugene Wissner wrote:
 
 You can probably solve it like haskell does: Export all symbols by default.
 
 So this module exports everything:
 
 -----------
 module MyModule where
 
 -- My code goes here
 
 -----------
 
 this one not:
 
 -----------
 module MyModule (symbol1, symbol2) where
 
 -- My code goes here
 
 -----------
We already have the solution, and is export as an attribute. --------- module bar; export: int foo; --------- --------- module bar; export int foo; ---------
Aug 23 2018
prev sibling next sibling parent Trass3r <un known.com> writes:
On Thursday, 23 August 2018 at 07:37:07 UTC, Iain Buclaw wrote:
 Possible Solution: Make all globals hidden by default unless 
 'export'.
Same mess as in C++. But there you have -fvisibility=hidden at least to fix it.
 Side effects: Everyone will be spending weeks to months fixing 
 their libraries in order to only mark what should be visible 
 outside the current compilation unit as 'export'.
Shouldn't it be sufficient to put an export: at the top of each module to roll back to the old behavior and do the real fix later?
 Benefits: Faster compile times, as in, in the most extreme 
 example I've built one project on github with gdc -O2 and build 
 time went from 120 seconds to just 3!
So why not add such an opt-in flag to all compilers, deprecate the old behavior and do the switch at some point.
Aug 23 2018
prev sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 23 August 2018 at 07:37:07 UTC, Iain Buclaw wrote:
 [...]
I'd define as fatal something that can be fixed, but breaks 100% of everyone's code, even if the change is a net positive all round. However, how big a problem really is is in the eye of the beholder. An example:

Symptom: The compiler can't discard unused symbols at compile time, and so it will spend a lot of time pointlessly optimising dead code.

Problem: D has no notion of symbol visibility.

Possible solution: Make all globals hidden by default unless marked 'export'.

Side effects: Everyone will spend weeks to months fixing their libraries in order to mark only what should be visible outside the current compilation unit as 'export'.

Benefits: Faster compile times. In the most extreme example, I've built one project on GitHub with gdc -O2 and build time went from 120 seconds to just 3!
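To make the proposal concrete, here is a minimal sketch (names hypothetical) of what a module would look like under hidden-by-default visibility; note that today D makes all non-private module-level symbols visible, so the optimizer and linker must assume anything may be referenced externally:

```d
// Hypothetical sketch: under the proposed rule, only 'export' symbols
// would be visible to other compilation units; everything else could be
// discarded when unused, instead of being optimised "just in case".
export int apiEntry(int x) { return helper(x) + 1; } // the intended public interface
int helper(int x) { return x * 2; }                  // discardable if nothing exported uses it

void main()
{
    assert(apiEntry(3) == 7);
}
```

The `export` keyword already exists in D (currently tied to DLL export semantics); the sketch only illustrates the proposed default, not current compiler behaviour.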
So your example of a fatal flaw is that D could be 100X faster at compilation instead of just 10X than most every other native language out there?! C'mon. On Thursday, 23 August 2018 at 09:09:40 UTC, Shachar Shemesh wrote:
 On 23/08/18 09:58, Joakim wrote:
 Because you've not listed any here, which makes you no better 
 than some noob
Here's one: the forum does not respond well to criticism.
Sounds more like you don't respond well to criticism, as the point stands that your original post was content-free.
 Here's an incredibly partial list:

 * Features not playing well together.

 Despite what Joakim seems to think, I've actually brought up an 
 example in this thread.
Despite what you seem to think, perhaps because you didn't read what I wrote very closely, I noted your bugzilla link in my post you're quoting and asked you if you really thought it was fatal.
 Here is another one:

 functions may be @safe, nothrow, @nogc, pure. If it's a method 
 it might also be const/inout/immutable, static. The number of 
 libraries that support all combinations is exactly zero (e.g. - 
 when passing a delegate in).
Yes, this is a known problem with D: why do you think it's fatal?
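For readers unfamiliar with the complaint, a small sketch (function names hypothetical) of the combinatorics problem with delegate parameters:

```d
// A library function taking a delegate must commit to which attributes
// the delegate parameter carries. This version only accepts callbacks
// that are themselves @safe and pure; a caller holding a @system or
// impure delegate needs a separate overload, and covering every
// attribute combination multiplies the API surface.
int applyTwice(scope int delegate(int) @safe pure dg, int x) @safe pure
{
    return dg(dg(x));
}

void main() @safe
{
    // Lambdas get their attributes inferred, so this one qualifies.
    assert(applyTwice(n => n + 1, 0) == 2);
}
```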
 * Language complexity

 Raise your hand if you know how a class with both opApply and 
 the range primitives (empty/front/popFront) behaves when you 
 pass it to foreach. How about a struct? Does it matter if it 
 allows copying or not?

 The language was built because C++ was deemed too complex! 
 Please see the thread about lazy [1] for a case where a 
 question actually has an answer, but nobody seems to know it 
 (and the person who does know it is hard pressed to explain the 
 nuance that triggers this).
By this rationale, C++ should be dead by now. Why do you think it's fatal to D?
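As an aside, the foreach question above does have a spec-defined answer; as I read the spec, foreach tries opApply before the range primitives. A small runnable check (type name hypothetical):

```d
// A struct defining both opApply and the input-range primitives.
// When both are present, foreach rewrites to opApply; the range
// interface is only consulted when no opApply exists.
struct Both
{
    int i;
    int opApply(scope int delegate(int) dg)
    {
        foreach (n; 0 .. 3)
            if (auto r = dg(100 + n)) return r;
        return 0;
    }
    bool empty() const { return i >= 3; }
    int front() const { return i; }
    void popFront() { ++i; }
}

void main()
{
    int[] seen;
    foreach (n; Both()) seen ~= n;
    assert(seen == [100, 101, 102]); // opApply ran, not empty/front/popFront
}
```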
 * Critical bugs aren't being solved

 People keep advertising D as supporting RAII. I'm sorry, but 
 "supports RAII" means "destructors are always run when the 
 object is destroyed". If the community (and in this case, this 
 includes Walter) sees a bug where that doesn't happen as not 
 really a bug, then there is a deep problem, at least, 
 over-promising. Just say you don't support RAII and destructors 
 are unreliable and live with the consequences.

 BTW: Python's destructors are unworkable, but they advertise it 
 and face the consequences. The D community is still claiming 
 that D supports RAII.
Maybe they're not critical to everyone else? How much time or money exactly has Weka spent on getting this issue and other "critical" bugs fixed?

It is fairly laughable for a company that raised $42 million to complain that a bunch of unpaid volunteers aren't fixing bugs fast enough for them: https://www.crunchbase.com/organization/weka-io
 * The community

 Oh boy.

 Someone who carries weight needs to step in when the forum is 
 trying to squash down on criticism. For Mecca, I'm able to do 
 that [2], but for D, this simply doesn't happen.
As Walter pointed out, "the forum" is not some D hive mind: it's a community filled with differing opinions. As I've faced what I felt to be unthinking criticism on another issue I raised only indirectly related to D, I know something of what you're talking about, but honestly, there is a lot of criticism of D and its process here too. Anybody on the internet is free to criticize any opinion you post online; you need to get a thicker skin.

I really don't see much "squashing" going on, and I've previously argued against deleting/banning even those who voiced their criticism of D in uncouth ways: https://forum.dlang.org/post/tzubjgtuiyfnmnglwywu@forum.dlang.org

Walter generally has a light touch with deletions/bans.
 This is a partial list, but it should give you enough to not 
 accusing me of making baseless accusations. The simple point of 
 the matter is that anyone who's been following what I write 
 should already be familiar with all of the above.
I see, so every single bug you've ever posted to the forum in the past was a fatal flaw? If so, I question your judgement. If not, I was simply looking for some prioritization, which flaws were "fatal" and why.
 The main thing for me, however, is how poorly the different D 
 features fit together (my first point above). The language 
 simply does not feel like it's composed of building blocks I 
 can use to assemble whatever I want. It's like a Lego set where 
 you're not allowed to place a red brick over a white brick if 
 there is a blue brick somewhere in your building.
This is a common criticism of D. You may be right that it will be "fatal." All I was looking for was substantive claims like this. On Thursday, 23 August 2018 at 09:16:23 UTC, Mihails wrote:
 On Wednesday, 22 August 2018 at 17:42:56 UTC, Joakim wrote:
 Pretty positive overall, and the negatives he mentions are 
 fairly obvious to anyone paying attention. D would really 
 benefit from a project manager, which I think Martin Nowak has 
 tried to do, and which the companies using D and the community 
 should get together and fund as a paid position. Maybe it 
 could be one of the funding targets for the Foundation.

 If the job was well-defined, so I knew exactly what we're 
 getting by hiring that person, I'd contribute to that.
Didn't intend to chime in, but no, that was not what I meant at all.
I never said that's what you "meant," it was my own suggestion to make things better.
Aug 23 2018
next sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 18:35, Joakim wrote:
 
 So your example of a fatal flaw is that D could be 100X faster at 
 compilation instead of just 10X than most every other native language 
 out there?! C'mon.
Have you tried Stephan's example yet?

    static foreach(i; 0..16384) {}

Do give it a shot, tell me what you think of its compilation time.
 On Thursday, 23 August 2018 at 09:09:40 UTC, Shachar Shemesh wrote:
 * Features not playing well together.

 Despite what Joakim seems to think, I've actually brought up an 
 example in this thread.
Despite what you seem to think, perhaps because you didn't read what I wrote very closely, I noted your bugzilla link in my post you're quoting and asked you if you really thought it was fatal.
Each problem on its own, maybe not. All together? Most definitely. A language needs to be coherent. A programmer needs to be able to look at code and know what the compiler will make of that code. The less that can happen, the less useful the language is. This is, in fact, precisely the criticism the D community levels against C++.
 Yes, this is a known problem with D: why do you think it's fatal?
See above.
 
 By this rationale, C++ should be dead by now. Why do you think it's 
 fatal to D?
C++ does not suffer from this *kind* of complexity. For the most part, C++'s complexity is feature-centric. You use a feature, you need to really learn that feature in order to get it to work. D's complexity is waiting to pounce on you around street corners. You use a feature, and all's well. And then, when you're doing stuff completely tangential to your old code, things suddenly break. You can avoid C++'s complexity by not using features. The same is not true of D.

With that said, who said these problems don't affect C++? Had C++ not been plagued by these problems, D (and Rust, and Go, and so on) would probably never have been born. These are languages written with the explicit hope of killing C++. They do not seem likely to, but D lacks quite a few things that C++ has going for it. To name a few:

* Large community
* Excellent tooling
* Large user base
 
 * Critical bugs aren't being solved

 People keep advertising D as supporting RAII. I'm sorry, but "supports 
 RAII" means "destructors are always run when the object is destroyed". 
 If the community (and in this case, this includes Walter) sees a bug 
 where that doesn't happen as not really a bug, then there is a deep 
 problem, at least, over-promising. Just say you don't support RAII and 
 destructors are unreliable and live with the consequences.

 BTW: Python's destructors are unworkable, but they advertise it and 
 face the consequences. The D community is still claiming that D 
 supports RAII.
Maybe they're not critical to everyone else?
Maybe. Just don't lie to users.
 How much time or money 
 exactly has Weka spent on getting this issue and other "critical" bugs 
 fixed?
Weka is paying prominent D developers as contractors. We've had David Nadlinger and currently employ Johan Engelen. Both said they cannot fix this particular bug. If you can, feel free to contact me off-list, and I'm fairly sure we can get the budget for you to work on it. The same goes for anyone else on this list.

We also contribute our own workarounds for D's shortcomings for everyone to enjoy. This includes DIP-1014 and Mecca, as well as the less obvious upstreaming of bugs our contractors fix. This is beyond the fact that our "fork" of the compiler is itself public (https://github.com/weka-io/ldc). I think claiming that Weka is leeching off the community is simply unwarranted.
 It is fairly laughable for a company that raised $42 million to 
 complain that a bunch of unpaid volunteers aren't fixing bugs fast 
 enough for them:
First of all, and this is important, I do not speak for Weka. I can pass recommendations along, and I can sometimes estimate in advance what will and what will not be approved, but that's it. As *I* don't have 42 million dollars, I find that particular criticism irrelevant (not to mention downright incorrect, as pointed out above).

With that said, I am not complaining about anything. I am stating the situation as I see it. I understand it is uncomfortable to hear, and hence the aggressiveness of your response. I can only point out the obvious: shaming me will not make me change my mind. At best, it will make me not say it publicly.

Shachar
Aug 23 2018
next sibling parent Joakim <dlang joakim.fea.st> writes:
On Thursday, 23 August 2018 at 17:02:12 UTC, Shachar Shemesh 
wrote:
 On 23/08/18 18:35, Joakim wrote:
 
 So your example of a fatal flaw is that D could be 100X faster 
 at compilation instead of just 10X than most every other 
 native language out there?! C'mon.
Have you tried Stephan's example yet? static foreach(i; 0..16384) {}
I don't see any posts by a "Stephan" in this thread. I don't doubt that there are degenerate cases in D's compile-time features, particularly the new ones.
 Each problem on its own, maybe not. All together? Most 
 definitely.

 A language needs to be coherent. A programmer needs to be able 
 to look at code and know what the compiler will make of that 
 code. The less that can happen, the less useful the language is.

 This is, in fact, precisely the criticism the D community 
 levels against C++.
Ah, I see none of these are "fatal flaws," but they all combine together to create a fatal situation. That's an argument that can actually be defended, but not the one you started off making.
 By this rationale, C++ should be dead by now. Why do you think 
 it's fatal to D?
C++ does not suffer from this *kind* of complexity. For the most part, C++'s complexity is feature centric. You use a feature, you need to really learn that feature in order to get it to work. D's complexity is waiting to pounce you behind street corners. You use a feature, and all's well. And then, when you're doing stuff completely tangential to your old code, things suddenly break. You can avoid C++'s complexity by not using features. The same is not true of D.
Sorry, it sounds like you're simply saying you prefer one type of complexity to another. I'm not a C++ developer, but my understanding is it has many of the same problems as D, more because of backwards compatibility.
 With that said, who said these problems don't affect C++?
Nobody, that's my point in bringing it up.
Had C++ not been plagued by these problems, D (and Rust, and Go,
 and so on) would probably never have been born. These are 
 languages written with the explicit hope of killing C++.

 They do not seem like they are going to, but D lacks quite a 
 few things that C++ has going for it. To name a few:

 * Large community
 * excellent tooling
 * large use base
My point was that C++ has always had what you believe to be the "fatal flaw" of incoherence, yet was able to build all those up. Walter has been there from the beginning and seen how C++ did it, perhaps his anarchy-driven development isn't so misguided after all.
 * Critical bugs aren't being solved

 People keep advertising D as supporting RAII. I'm sorry, but 
 "supports RAII" means "destructors are always run when the 
 object is destroyed". If the community (and in this case, 
 this includes Walter) sees a bug where that doesn't happen as 
 not really a bug, then there is a deep problem, at least, 
 over-promising. Just say you don't support RAII and 
 destructors are unreliable and live with the consequences.

 BTW: Python's destructors are unworkable, but they advertise 
 it and face the consequences. The D community is still 
 claiming that D supports RAII.
Maybe they're not critical to everyone else?
Maybe. Just don't lie to users.
Is it a lie if you don't know about some bug in the implementation?
 How much time or money exactly has Weka spent on getting this 
 issue and other "critical" bugs fixed?
Weka is paying prominent D developers as contractors. We've had David Nadlinger and currently employ Johan Engelen. Both said they cannot fix this particular bug. If you can, feel free to contact me off-list, and I'm fairly sure we can get the budget for you to work on it. The same goes for anyone else on this list.
I'm sorry, I wouldn't know how to and am not interested in learning.
 We also contribute our own workarounds for D's shortcomings for 
 everyone to enjoy. This include DIP-1014 and Mecca, as well as 
 the less obvious upstreaming of bugs our contractors fix. This 
 is beyond the fact that our "fork" of the compiler is, itself, 
 public (https://github.com/weka-io/ldc). I think claiming that 
 Weka is leaching off the community is simply unwarranted.
Sure, but nobody claimed you're just leeching: I know you've contributed in various ways. But the question stands: were you able to apply time/money to get any _critical_ bugs fixed and upstreamed? If so, why don't you believe you could get more fixed? Nobody's asking you to do it all yourself, you could work with Sociomantic and the community to raise bounties on those issues.
 It is fairly laughable for a company that raised $42 million 
 to complain that a bunch of unpaid volunteers aren't fixing 
 bugs fast enough for them:
First of all, and this is important, I do not speak for Weka. I can pass recommendations, and I can sometime estimate in advance what will and what will not be approved, but that's it. As *I* don't have 42 million dollars, I find that particular criticism irrelevant (not to mention downright incorrect, as pointed above).
I don't think it's irrelevant or incorrect, as that's essentially what you're doing. You may not officially represent Weka, but you're basically complaining that you can't get your work done properly with these bugs, which is done at Weka. That reflects on Weka, whether you like it or not.
 With that said, I am not complaining about anything. I am 
 stating the situation as I see it.
Yes, you're stating a giant complaint about the way things are being done.
 I understand it is uncomfortable to hear, and thus the 
 aggressiveness of your response.
The only part that's uncomfortable is your continual rhetorical strategy of exaggerating, then, when that's pointed out, claiming you're being "squashed" or "shamed." I want accurate, substantive claims about D and feel no discomfort from them; you've made some. I'm only aggressive in trying to understand the underlying roots of your unhappiness and in pointing out ways to fix them. You seem intent on casting that as something else.
 I can only point out the obvious: you shaming me will not make 
 me change my mind. At best, it will make me not say it publicly.
I think you misinterpret my questions. I've been trying to get you to add substance to your initial broad claims with all my posts and questions. That is an attempt to elicit real, substantial criticism, not silence or shame it.
Aug 23 2018
prev sibling next sibling parent reply bachmeier <no spam.net> writes:
On Thursday, 23 August 2018 at 17:02:12 UTC, Shachar Shemesh 
wrote:

 If you can, feel free to contact me off-list, and I'm fairly 
 sure we can get the budget for you to work on it. The same goes 
 for anyone else on this list.
I don't think Kenji will see your message, but he may be able to help given the right financial incentive.
Aug 23 2018
parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 20:57, bachmeier wrote:
 On Thursday, 23 August 2018 at 17:02:12 UTC, Shachar Shemesh wrote:
 
 If you can, feel free to contact me off-list, and I'm fairly sure we 
 can get the budget for you to work on it. The same goes for anyone 
 else on this list.
I don't think Kenji will see your message, but he may be able to help given the right financial incentive.
I have no idea who that is. Can you contact me off-list (either email or the DLang slack)? Shachar
Aug 23 2018
next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, August 23, 2018 12:05:53 PM MDT Shachar Shemesh via 
Digitalmars-d wrote:
 On 23/08/18 20:57, bachmeier wrote:
 On Thursday, 23 August 2018 at 17:02:12 UTC, Shachar Shemesh wrote:
 If you can, feel free to contact me off-list, and I'm fairly sure we
 can get the budget for you to work on it. The same goes for anyone
 else on this list.
I don't think Kenji will see your message, but he may be able to help given the right financial incentive.
I have no idea who that is. Can you contact me off-list (either email or the DLang slack)?
He's talking about Kenji Hara, who was an extremely prolific developer for dmd and Phobos for several years but who left a couple of years ago after he was unhappy with some of what was happening with his PRs for dmd. AFAIK, he's completely disappeared from the general D community (though he may be involved with the Japanese D community, since they kind of have their own thing going as I understand it). I have no clue if he'd be willing to do anything to help you or not, but this is his github account: https://github.com/9rnsr - Jonathan M Davis
Aug 23 2018
prev sibling next sibling parent bachmeier <no spam.net> writes:
On Thursday, 23 August 2018 at 18:05:53 UTC, Shachar Shemesh 
wrote:
 On 23/08/18 20:57, bachmeier wrote:
 On Thursday, 23 August 2018 at 17:02:12 UTC, Shachar Shemesh 
 wrote:
 
 If you can, feel free to contact me off-list, and I'm fairly 
 sure we can get the budget for you to work on it. The same 
 goes for anyone else on this list.
I don't think Kenji will see your message, but he may be able to help given the right financial incentive.
I have no idea who that is. Can you contact me off-list (either email or the DLang slack)? Shachar
Others probably have better information than I do. Here's his Github account: https://github.com/9rnsr All I really know is that he was a prolific and brilliant contributor to the compiler when I started using D. Pretty much Walter and him as I understood it. Then he had a disagreement with the leadership and disappeared.
Aug 23 2018
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Aug 23, 2018 at 09:05:53PM +0300, Shachar Shemesh via Digitalmars-d
wrote:
 On 23/08/18 20:57, bachmeier wrote:
 On Thursday, 23 August 2018 at 17:02:12 UTC, Shachar Shemesh wrote:
 
 If you can, feel free to contact me off-list, and I'm fairly sure
 we can get the budget for you to work on it. The same goes for
 anyone else on this list.
I don't think Kenji will see your message, but he may be able to help given the right financial incentive.
I have no idea who that is. Can you contact me off-list (either email or the DLang slack)?
[...] He used to be one of the core dmd developers, and easily the most prolific and productive. Unfortunately, he left about 2 or so years ago after a sharp disagreement with some other dmd devs. T -- IBM = I'll Buy Microsoft!
Aug 23 2018
prev sibling next sibling parent reply David Nadlinger <code klickverbot.at> writes:
On Thursday, 23 August 2018 at 17:02:12 UTC, Shachar Shemesh 
wrote:
 On 23/08/18 18:35, Joakim wrote:
 […]
 How much time or money exactly has Weka spent on getting this 
 issue and other "critical" bugs fixed?
Weka is paying prominent D developers as contractors. We've had David Nadlinger and currently employ Johan Engelen. Both said they are cannot fix this particular bug.
Not to put too fine a point on this, but I don't think I've ever said I couldn't fix the bug; although I probably did mention it would be better to fix in the upstream DMD frontend than in LDC. I would be happy to have a look at it for Weka at this point, actually. — David
Aug 23 2018
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 3:20 PM, David Nadlinger wrote:
 Not to put too fine a point on this, but I don't think I've ever said I
couldn't 
 fix the bug; although I probably did mention it would be better to fix in the 
 upstream DMD frontend than in LDC.
 
 I would be happy to have a look at it for Weka at this point, actually.
I don't believe it's unfixable, either.
Aug 23 2018
prev sibling parent reply Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Thursday, 23 August 2018 at 17:02:12 UTC, Shachar Shemesh 
wrote:
 How much time or money exactly has Weka spent on getting this 
 issue and other "critical" bugs fixed?
Weka is paying prominent D developers as contractors. We've had David Nadlinger and currently employ Johan Engelen. Both said they are cannot fix this particular bug. If you can, feel free to contact me off-list, and I'm fairly sure we can get the budget for you to work on it. The same goes for anyone else on this list.
I'll be taking a good stab at it. I am currently wading through the swamps of semantic analysis, up against a ~thousand-headed hydra~ thousand-line function.
Aug 23 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 5:02 PM, Nicholas Wilson wrote:
 am currently up against a ~thousand headed hydra~ thousand 
 line function .
Good times :-)
Aug 24 2018
parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Friday, 24 August 2018 at 09:58:13 UTC, Walter Bright wrote:
 On 8/23/2018 5:02 PM, Nicholas Wilson wrote:
 am currently up against a ~thousand headed hydra~ thousand 
 line function .
Good times :-)
Tell me about it. I'm now wading through its carcass in review. I will split it up but it takes a lot of effort.
Aug 24 2018
prev sibling next sibling parent reply Iain Buclaw <ibuclaw gdcproject.org> writes:
On Thursday, 23 August 2018 at 15:35:45 UTC, Joakim wrote:
 On Thursday, 23 August 2018 at 07:37:07 UTC, Iain Buclaw wrote:
 On Thursday, 23 August 2018 at 06:58:13 UTC, Joakim wrote:
 On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
 wrote:
 [...]
Can you list what you or other Weka devs believe those fatal flaws to be? Because you've not listed any here, which makes you no better than some noob that comes in here, says D has to get better or it will die, then can't articulate what they mean by "better" or worse, mentions something trivial. Of course, you've actually used the language for years, so presumably you've got some real concerns, but do you really think the bug you just posted is "fatal" to the language? If you think there are fatal flaws, you might as well list them, whether technical or the development process, or you will just be ignored like any other noob who talks big and can't back it up. You may be ignored anyway, ;) but at least you'll have made a case that shows you know what you're talking about.
I'd define fatal as some that can be fixed, but breaks 100% of everyone's code, even if the change is net positive all round. However how big a problem really is is in the eye of the beholder. An example: Symptom: The compiler can't discard unused symbols at compile time, and so it will spend a lot of time pointlessly optimising code. Problem: D has no notion of symbol visibility. Possible Solution: Make all globals hidden by default unless 'export'. Side effects: Everyone will be spending weeks to months fixing their libraries in order to only mark what should be visible outside the current compilation unit as 'export'. Benefits: Faster compile times, as in, in the most extreme example I've built one project on github with gdc -O2 and build time went from 120 seconds to just 3!
So your example of a fatal flaw is that D could be 100X faster at compilation instead of just 10X than most every other native language out there?! C'mon.
But that's not true. D isn't a fast language to compile; dmd is just a fast compiler. You may get a little leading edge with codebases that are effectively C. Once you throw templates into the mix, though, your problems become exponential. Spending 4 seconds in the front end and codegen, only to wait 2 minutes in the optimizer, is horrific. The alternative of discarding what seem to be unused symbols only results in linker errors of the obscure-edge-case sort.

Template emission strategy is a mess; we're better off just instantiating all templates in all compilation units and letting the compiler decide what to discard. Even -allinst does not instantiate enough to allow the compiler to make the kind of decisions that C++ has no problem with (most of the time).
Aug 28 2018
next sibling parent Joakim <dlang joakim.fea.st> writes:
On Tuesday, 28 August 2018 at 13:39:40 UTC, Iain Buclaw wrote:
 On Thursday, 23 August 2018 at 15:35:45 UTC, Joakim wrote:
 On Thursday, 23 August 2018 at 07:37:07 UTC, Iain Buclaw wrote:
 On Thursday, 23 August 2018 at 06:58:13 UTC, Joakim wrote:
 On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
 wrote:
 [...]
Can you list what you or other Weka devs believe those fatal flaws to be? Because you've not listed any here, which makes you no better than some noob that comes in here, says D has to get better or it will die, then can't articulate what they mean by "better" or worse, mentions something trivial. Of course, you've actually used the language for years, so presumably you've got some real concerns, but do you really think the bug you just posted is "fatal" to the language? If you think there are fatal flaws, you might as well list them, whether technical or the development process, or you will just be ignored like any other noob who talks big and can't back it up. You may be ignored anyway, ;) but at least you'll have made a case that shows you know what you're talking about.
I'd define fatal as some that can be fixed, but breaks 100% of everyone's code, even if the change is net positive all round. However how big a problem really is is in the eye of the beholder. An example: Symptom: The compiler can't discard unused symbols at compile time, and so it will spend a lot of time pointlessly optimising code. Problem: D has no notion of symbol visibility. Possible Solution: Make all globals hidden by default unless 'export'. Side effects: Everyone will be spending weeks to months fixing their libraries in order to only mark what should be visible outside the current compilation unit as 'export'. Benefits: Faster compile times, as in, in the most extreme example I've built one project on github with gdc -O2 and build time went from 120 seconds to just 3!
So your example of a fatal flaw is that D could be 100X faster at compilation instead of just 10X than most every other native language out there?! C'mon.
But that's not true. D isn't a fast language to compile, dmd is just a fast compiler. You may get a little leading edge with codebases that are effectively C. Once you throw templates into the mix though, your problems become exponential. Spending 4 seconds in the front end and codegen, only to wait 2 minutes in the optimizer is horrific. The alternative of discarding what seem to be unused symbols only results in linker error of the obscure edge cases sort. Template emission strategy is a mess, we're better off just instantiating all templates in all compilation units, and let the compiler decide whatever to discard. Even -allinst does not instantiate enough to allow the compiler to make such decisions that C++ has no problem with (most of the time).
I think I've hit a variation of this problem before, where pulling in a single selective import from Phobos somewhere meant the entire module was compiled into the executable (though I suppose that could be a linker issue?): https://forum.dlang.org/thread/gmjqfjoemwtvgqrtdsdr@forum.dlang.org

I guess this is why scoped/selective imports didn't help that much in disentangling Phobos. I figured it wasn't a big deal if it was just causing bigger executables, but even though I mentioned compilation speed there, I didn't think of how it's slowing down the compiler too, as you now note.

Pruning what's evaluated by the compiler based on scoped/selective imports, rather than apparently including the whole module, and getting D compilers to compile in parallel without separately invoking each module/package, i.e. a -j flag for the compiler when you invoke it with all your source at once, might be good projects for us to crowdfund, as discussed in this and my earlier Nim thread. Separate parallel compilation works wonders on my octa-core Android/AArch64 phone, where I mostly build D now; it would be good to combine that with invoking ldc with all source at once.
Aug 28 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/28/2018 6:39 AM, Iain Buclaw wrote:
 Template emission strategy is a mess, we're better off just instantiating all 
 templates in all compilation units, and let the compiler decide whatever to 
 discard. Even -allinst does not instantiate enough to allow the compiler to
make 
 such decisions that C++ has no problem with (most of the time).
Martin and I proposed a simple strategy for that, but Kenji implemented a different algorithm that nobody understands, and it has proved inadequate. There are a couple of unresolved bug reports on that.
Aug 28 2018
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Tuesday, August 28, 2018 11:31:40 PM MDT Walter Bright via Digitalmars-d 
wrote:
 On 8/28/2018 6:39 AM, Iain Buclaw wrote:
 Template emission strategy is a mess, we're better off just
 instantiating all templates in all compilation units, and let the
 compiler decide whatever to discard. Even -allinst does not instantiate
 enough to allow the compiler to make such decisions that C++ has no
 problem with (most of the time).
Martin and I proposed a simple strategy for that, but Kenji implemented a different algorithm that nobody understands, and has proved inadequate. There are a couple unresolved bug reports on that.
Would it make sense then to change it to work more like what you and Martin were thinking of doing? - Jonathan M Davis
Aug 28 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/28/2018 11:52 PM, Jonathan M Davis wrote:
 Would it make sense then to change it to work more like what you and Martin
 were thinking of doing?
Yes, it would. But it would be a non-trivial effort to remove Kenji's design.
Aug 29 2018
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Wednesday, August 29, 2018 2:37:17 AM MDT Walter Bright via Digitalmars-d 
wrote:
 On 8/28/2018 11:52 PM, Jonathan M Davis wrote:
 Would it make sense then to change it to work more like what you and
 Martin were thinking of doing?
Yes, it would. But it would be a non-trivial effort to remove Kenji's design.
Well, then at least it should be an issue of time and manpower rather than it being something that we can't reasonably fix from a technical perspective. So, then presumably, it's a question of priority and whether it's a pressing enough issue to merit the pain of sorting it out - not that the todo list is ever really getting shorter around here... - Jonathan M Davis
Aug 29 2018
prev sibling parent reply rjframe <dlang ryanjframe.com> writes:
On Thu, 23 Aug 2018 15:35:45 +0000, Joakim wrote:

 * Language complexity

 Raise your hand if you know how a class with both opApply and the
 get/next/end functions behaves when you pass it to foreach.
 How about a struct? Does it matter if it allows copying or not?

 The language was built because C++ was deemed too complex! Please see
 the thread about lazy [1] for a case where a question actually has an
 answer, but nobody seems to know it (and the person who does know it is
 hard pressed to explain the nuance that triggers this).
By this rationale, C++ should be dead by now. Why do you think it's fatal to D?
It's worth noting that C++ isn't always chosen for its technical merits. It's a well-known language whose more or less standard status in certain domains means it's the default choice; C++ is sometimes used for projects in which Stroustrup would say it's obviously the wrong language for the job. D is far more likely to require justification based on technical merit. If D becomes another C++, why bother taking a chance with D when you can just use C++, use a well-supported, commonly-used compiler, and hire from a bigger pool of jobseekers?
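For the record, the foreach question above does have an answer - as I read the spec's lowering rules, opApply takes precedence over the range primitives for structs and classes. A minimal sketch:

```d
import std.stdio;

// A struct offering BOTH the range primitives and opApply.
struct Both
{
    // Range interface: an empty range.
    bool empty() { return true; }
    int front() { return 0; }
    void popFront() {}

    // opApply yields 0, 1, 2. Per my reading of the spec,
    // foreach prefers this over the range interface.
    int opApply(scope int delegate(int) dg)
    {
        foreach (i; 0 .. 3)
            if (auto r = dg(i))
                return r;
        return 0;
    }
}

void main()
{
    // Should iterate via opApply rather than terminating
    // immediately via the empty range interface.
    foreach (i; Both())
        writeln(i);
}
```

Whether that answer is widely known is, of course, exactly Dicebot's point.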
Sep 01 2018
next sibling parent reply TheSixMillionDollarMan <smdm outlook.com> writes:
On Saturday, 1 September 2018 at 12:33:49 UTC, rjframe wrote:
 C++ is sometimes used for projects in which Stroustrup would 
 say it's obviously the wrong language for the job.

 D is far more likely to require justification based on 
 technical merit. If D becomes another C++, why bother taking a 
 chance with D when you can just use C++, use a well-supported, 
 commonly-used compiler, and hire from a bigger pool of 
 jobseekers?
Stroustrup also said that "achieving any degree of compatibility [with C/C++] is very hard, as the C/C++ experience shows" (reference: http://stroustrup.com/hopl-almost-final.pdf, 2007, which btw refers to D on page 42 - that was 11 years ago now). And yet, D is very intent on doing just that, while also treading its own path. I personally think this is why D has not taken off, as many would hope. It's hard. I think it's also why D won't take off, as many hope. It's hard. Stroustrup was correct (back in the 90's). Yes, it really is hard. And it's made even harder now, since C++ has evolved into a 'constantly' moving target...
Sep 01 2018
parent Chris <wendlec tcd.ie> writes:
On Saturday, 1 September 2018 at 18:35:30 UTC, 
TheSixMillionDollarMan wrote:
 On Saturday, 1 September 2018 at 12:33:49 UTC, rjframe wrote:
 [...]
Stroustrup also said that "achieving any degree of compatibility [with C/C++] is very hard, as the C/C++ experience shows" (reference: http://stroustrup.com/hopl-almost-final.pdf, 2007, which btw refers to D on page 42 - that was 11 years ago now). And yet, D is very intent on doing just that, while also treading its own path. I personally think this is why D has not taken off, as many would hope. It's hard. I think it's also why D won't take off, as many hope. It's hard. Stroustrup was correct (back in the 90's). Yes, it really is hard. And it's made even harder now, since C++ has evolved into a 'constantly' moving target...
D++
Sep 02 2018
prev sibling parent reply Laeeth Isharc <Laeeth laeeth.com> writes:
On Saturday, 1 September 2018 at 12:33:49 UTC, rjframe wrote:
 On Thu, 23 Aug 2018 15:35:45 +0000, Joakim wrote:

 * Language complexity

 Raise your hand if you know how a class with both opApply and 
 the
 get/next/end functions behaves when you pass it to foreach.
 How about a struct? Does it matter if it allows copying or 
 not?

 The language was built because C++ was deemed too complex! 
 Please see the thread about lazy [1] for a case where a 
 question actually has an answer, but nobody seems to know it 
 (and the person who does know it is hard pressed to explain 
 the nuance that triggers this).
By this rationale, C++ should be dead by now. Why do you think it's fatal to D?
It's worth noting that C++ isn't always chosen for its technical merits. It's a well-known language whose more or less standard status in certain domains means it's the default choice; C++ is sometimes used for projects in which Stroustrup would say it's obviously the wrong language for the job. D is far more likely to require justification based on technical merit. If D becomes another C++, why bother taking a chance with D when you can just use C++, use a well-supported, commonly-used compiler, and hire from a bigger pool of jobseekers?
That's why the people that adopt D will inordinately be principals, not agents, in the beginning. They will either be residual claimants on earnings or will have acquired the authority to make decisions without persuading a committee that makes decisions on the grounds of social factors.

If D becomes another C++? C++ was ugly from the beginning (in my personal subjective assessment) whereas D was designed by people with good taste. That's why it appeals inordinately to people with good taste.

In Hong Kong we had some difficulty hiring a support person for a trading floor. In some cases we spoke to the most senior person in HK at even large and well-known funds (a small office, in this case) and they simply were not good enough. Thanks to someone from the D community I met a headhunter who used to be at Yandex but realized the money was better in headhunting. They don't have many financial clients, I think, and don't have connections on the talent side in finance. But the runners-up were by far better than anyone we had found through other sources, and the best was outstanding. Good job, I said. It's funny that the person we hired came from a big bank when other headhunters are looking in the same place and know that world better. By the way, how many people did you interact with to find X? In London if a headhunter puts 10 people before you and you are really pretty happy, then that's a good day. He said two hundred! And they had to come up with a hiring test too.

So the basic reason they could find good people in technology in finance when others couldn't is that they have much better taste. Do you see? The others knew many more people and had experience doing it, and somebody who had to persuade a committee would have found it hard to justify.
Programming ability follows a Pareto curve. The incidence of outstanding programmers elsewhere is lower than in the D community, for the very reason that only someone obtuse or very smart will learn D for career reasons - intrinsic motivation draws the highest talent.

It depends if your model of people doing work is an army of intelligent trained monkeys or a force made up of small elite groups of the best people you have ever worked with. Of course the general of the trained-monkey army is going to be difficult to persuade. And so? On the other hand, consider someone who is smart and has good taste and has earned the right to decide. D is a less popular language that has fewer tutorials and less shiny IDE and debugger support. Well, if you're a small company and you are directly or in effect a proxy owner of the residual (ie an owner of some kind), it's a pragmatic question, and saying nobody got fired for buying IBM is missing the point, because the success is yours and the failure is yours and you can't pass the buck.

The beauty of being the underdog is that it's easy to succeed. You don't need to be the top dog, and in fact it's not strategically wise to do anything that might make the top dog think you stand a chance - let them think what they want. The underdog just needs to keep improving and keep getting more adoption, which I don't have much doubt is happening.

Modern people can be like children in their impatience sometimes! I've only been programming since 1983, so I had the benefit of high level languages like BBC BASIC, C, a Forth I wrote myself, and Modula 3. And although I had to write a disassembler, at least I had assemblers built in. Programming using a hex keypad is not that satisfying after a while. It takes a long time to develop a language, its ecosystem and community. An S curve is quite flat in the beginning. D is a very ambitious language, so of course in any one domain it seems like nobody is using it, but this is deceptive.
It isn't like Go, which is more used for a particular purpose.

Anyone that thinks they are a strong programmer by comparison with others in their world, can keep up here, and would like to write D should contact me (if you can't see my email then Michael Parker will have it).

Dicebot wrote, in the old post that triggered this discussion, something that people interpreted as meaning he was leaving D. Around the same time as his old post was discovered, he released his version of dtoh that he intends to have integrated into the compiler. So he may or may not have changed his mind about making upstream contributions, but this is something pretty close to that, even if it's not the same. It's not for me to say, but I believe he continues to work in D professionally also.

It would be wonderful if we could get back to figuring out what constructive steps we could take to make a better world. People who make predictions should back their views with something - they should have skin in the game. If someone wants to make a prediction about D, I am happy to take the other side of a wager if I disagree strongly. I play for high stakes - usually cups of coffee. I have a bet with my economist that there won't be another referendum, whether about leave or about soft vs hard, before we leave the EU. I win two coffees if there is no referendum and lose one if there is. Since it takes six months to prepare the question and the Brexit date is March, I think that's not bad for me.
Sep 02 2018
next sibling parent reply lurker <lurker aol.com> writes:
after the beta i tried the final again - just to be fair.

1.) install d, install visual d.
2.) trying to look at options under visual d without a project crashes VS2017 - latest service pack.
3.) VS2017 displays a problem on startup.
4.) creating the dummy project - compile for x64: error, something is missing.
5.) deinstall everything and wait for another year.

this crap does not even work out of the box - what else is not tested in D?

i guess you don't intend to draw crowds to D and just keep talking about how to do this and that a little better in the compiler pet project.

is D that dead that the releases are not tested, or do you want to keep all windows users out?
Sep 02 2018
next sibling parent Everlast <Everlast For.Ever> writes:
On Sunday, 2 September 2018 at 14:48:34 UTC, lurker wrote:
 after the beta i tried it the final again - just to be fair.

 1.) install d, install visual d.
 2.) trying to to look at options under visual d without a 
 project crashes VS2017 - latest
     service pack.
 3.) VS2017 - displays a problem on startup
 4.) creating the dummy project - compile for x64. error 
 something is missing.
 5.) deinstall everything and wait for another year

 this crap does not even work out of the box - what else is not 
 tested in D?

 i guess you don't intend to draw crowds to D and just keep 
 talking on how to this and that a little better in the compiler 
 pet project.

 is D that dead that the releases are not tested or do you want 
 to keep all windows users out?
Don't worry, that is just the beginning... It's like that in all aspects. The design is fundamentally flawed and no one seems to care (or recognize it). Software is far too complex nowadays to be approached with the mentalities of the 60's and 70's.
Sep 02 2018
prev sibling next sibling parent reply Andre Pany <andre s-e-a-p.de> writes:
On Sunday, 2 September 2018 at 14:48:34 UTC, lurker wrote:
 after the beta i tried it the final again - just to be fair.

 1.) install d, install visual d.
 2.) trying to to look at options under visual d without a 
 project crashes VS2017 - latest
     service pack.
 3.) VS2017 - displays a problem on startup
 4.) creating the dummy project - compile for x64. error 
 something is missing.
 5.) deinstall everything and wait for another year

 this crap does not even work out of the box - what else is not 
 tested in D?

 i guess you don't intend to draw crowds to D and just keep 
 talking on how to this and that a little better in the compiler 
 pet project.

 is D that dead that the releases are not tested or do you want 
 to keep all windows users out?
There are a lot of motivated people here willing to help you get your issue solved if you provide the details. I can confirm that DMD is working like a charm for me (different Visual Studio versions on build servers, MS build tools on my local PC). I use IntelliJ instead of Visual Studio, but that is only my personal preference.

Kind regards
Andre
Sep 02 2018
parent Manu <turkeyman gmail.com> writes:
On Sun, 2 Sep 2018 at 16:05, Andre Pany via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Sunday, 2 September 2018 at 14:48:34 UTC, lurker wrote:
 after the beta i tried it the final again - just to be fair.

 1.) install d, install visual d.
 2.) trying to to look at options under visual d without a
 project crashes VS2017 - latest
     service pack.
 3.) VS2017 - displays a problem on startup
 4.) creating the dummy project - compile for x64. error
 something is missing.
 5.) deinstall everything and wait for another year

 this crap does not even work out of the box - what else is not
 tested in D?

 i guess you don't intend to draw crowds to D and just keep
 talking on how to this and that a little better in the compiler
 pet project.

 is D that dead that the releases are not tested or do you want
 to keep all windows users out?
There are a lot of motivated people here willing to help you to get your issue solved if you provide the details. I can confirm that DMD is working like a charm for me (different visual studio versions on build servers, MS build tools on local pc). I use IntelliJ instead of Visual Studio, but that is only my personal preferation. Kind regards Andre
I'm currently installing VS2017 to test your anecdote (been on 2015 for ages)... although I use VS2017 at work and haven't had any problems.
Sep 02 2018
prev sibling next sibling parent bauss <jj_1337 live.dk> writes:
On Sunday, 2 September 2018 at 14:48:34 UTC, lurker wrote:
 after the beta i tried it the final again - just to be fair.

 1.) install d, install visual d.
 2.) trying to to look at options under visual d without a 
 project crashes VS2017 - latest
     service pack.
 3.) VS2017 - displays a problem on startup
 4.) creating the dummy project - compile for x64. error 
 something is missing.
 5.) deinstall everything and wait for another year

 this crap does not even work out of the box - what else is not 
 tested in D?

 i guess you don't intend to draw crowds to D and just keep 
 talking on how to this and that a little better in the compiler 
 pet project.

 is D that dead that the releases are not tested or do you want 
 to keep all windows users out?
Visual D is not official, remember that. Most people would go with VS Code anyway.
Sep 02 2018
prev sibling parent bauss <jj_1337 live.dk> writes:
On Sunday, 2 September 2018 at 14:48:34 UTC, lurker wrote:
 after the beta i tried it the final again - just to be fair.

 1.) install d, install visual d.
 2.) trying to to look at options under visual d without a 
 project crashes VS2017 - latest
     service pack.
 3.) VS2017 - displays a problem on startup
 4.) creating the dummy project - compile for x64. error 
 something is missing.
 5.) deinstall everything and wait for another year

 this crap does not even work out of the box - what else is not 
 tested in D?

 i guess you don't intend to draw crowds to D and just keep 
 talking on how to this and that a little better in the compiler 
 pet project.

 is D that dead that the releases are not tested or do you want 
 to keep all windows users out?
Oh, and also: your error is probably that you're missing the C++ build tools, which come with the correct linker for x64, as far as I remember.
Sep 02 2018
prev sibling next sibling parent reply Pjotr Prins <pjotr.public12 thebird.nl> writes:
On Sunday, 2 September 2018 at 12:07:17 UTC, Laeeth Isharc wrote:
 I've only been programming since 1983 so I had the benefit of 
 high level languages like BBC BASIC, C, a Forth I wrote myself, 
 and Modula 3.  And although I had to write a disassembler at 
 least I has assemblers built in.  Programming using a hex 
 keypad is not that satisfying after a while.  It takes a long 
 time to develop a language, its ecosystem and community.
Hear, hear!

Even though some languages like Julia, Rust and Go are much better funded than D - and their creators have excellent taste in different ways - they still have to go through similar evolutionary steps. There is no fast path. Whatever design decision you make, you always end up fixing bugs and corner cases. I was amazed how far behind Rust's debugger support was last year (I witnessed a talk at FOSDEM). They are catching up, but it just goes to show...

One thing I want to add is that we ought to be appreciative of the work people put in - much of it in their spare time. I wonder if W&A and others sometimes despair at the lack of appreciation they get. Guido van Rossum burning out (W, notably, was the one to post that here first) is a shame. Even though he created a language which I find less tasteful, he did not deserve to be treated like that. Simple.
Sep 02 2018
next sibling parent Chris <wendlec tcd.ie> writes:
On Monday, 3 September 2018 at 06:29:02 UTC, Pjotr Prins wrote:

 Hear, hear!

 Even though some languages like Julia, Rust and Go are much 
 better funded than D - and their creators have excellent taste 
 in different ways - they still have to go through similar 
 evolutionary steps. There is no fast path. Whatever design 
 decision you make, you always end up fixes bugs and corner 
 cases. I was amazed how behind Rust's debugger support was last 
 year (I witnessed a talk at FOSDEM). They are catching up, but 
 it just goes to show...
No programming language is ever finished. But most programming languages try to get the basics right first and then add new features. If you want to run, you have to learn how to walk first. Languages take time to evolve, but we shouldn't be in a situation where the fixing of basic bugs and flaws is considered part of the "long term goals".
 One thing I want to add that we ought to be appreciative of the 
 work people put in - much of it in their spare time. I wonder 
 if W&A and others sometimes despair for the lack of 
 appreciation they get. Guido van Rossum burning out (W, 
 notably, was the one to post that here first) is a shame. Even 
 though he created a language which I find less tasteful he did 
 not deserve to be treated like that. Simple.
I hold both Walter and Andrei (and all the other great contributors) in high esteem, and D was the right tool for me back in the day. Without it, things would have been a lot harder. But I think D is past the laboratory stage, and as a user I feel that our actual experience matters less than design experiments. Respect goes both ways; after all, it's the users who keep a programming language alive. If there weren't something fundamentally wrong in the communication between the leadership / language developers and the users, why would we get posts like this:

"Thanks! Please add anything you think is missing to https://github.com/dlang/dlang.org/pull/2453 since Walter doesn't seem to be interested."

https://forum.dlang.org/post/mxgyoflrsibeyavvmjuq forum.dlang.org

Not good.
Sep 03 2018
prev sibling parent walker <growup_wei qq.com> writes:
On Monday, 3 September 2018 at 06:29:02 UTC, Pjotr Prins wrote:
 One thing I want to add that we ought to be appreciative of the 
 work people put in - much of it in their spare time. I wonder 
 if W&A and others sometimes despair for the lack of 
 appreciation they get. Guido van Rossum burning out (W, 
 notably, was the one to post that here first) is a shame. Even 
 though he created a language which I find less tasteful he did 
 not deserve to be treated like that. Simple.
I feel the same. There is no need to put a huge burden on them, even if there is something that cannot be fixed. A good subset of a language is still a good language, I think. Powerful, expressive, precise - that's what's important to me.
Sep 03 2018
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Sunday, 2 September 2018 at 12:07:17 UTC, Laeeth Isharc wrote:

 That's why the people that adopt D will inordinately be 
 principals not agents in the beginning. They will either be 
 residual claimants on earnings or will have acquired the 
 authority to make decisions without persuading a committee that 
 makes decisions on the grounds of social factors.

 If D becomes another C++ ?  C++ was ugly from the beginning (in 
 my personal subjective assessment) whereas D was designed by 
 people with good taste.

 That's why it appeals inordinately to people with good taste.
[snip]

Be that as it may, however, you forget the fact that people "with good taste" who have (had) an intrinsic motivation to learn D are also very critical people who take no bs, else they wouldn't have ended up using D in the first place. Since they've already learned a lot of concepts etc. with D over the years, it's technically easy for them to move on to either an easier language or one that offers more or less the same features as D. So once they're no longer happy with the way things are, they can dive into any language fast enough for the cost of transition to be low.

One has to be practical too. Programming involves more than just features and concepts. Good, out-of-the-box system integration (e.g. Android, iOS) is important too, and he who ignores this simple truth will have to pay a high price. That's why developers of new languages are so keen on giving users a smooth experience when it comes to app development and cross compilation, which leads me to the next point: IDEs.

No, you don't need an IDE to develop in D. However, an IDE can a) make coding comfortable and b) boost your productivity. As to a): maybe you just grow tired of the text editor & CLI approach and want to click a few buttons to fix imports or correct typos and be done with it. As to b): all this helps, especially when you can easily set up an app or a web service with a few mouse clicks.

In D, if you want to do something with ARM/Android, you will invariably end up with a potpourri of build scripts and spaghetti lines full of compiler flags etc. Not smooth; it takes a lot of time to set up manually and it's not easily maintainable. Doable, yes, but just because something is doable doesn't mean it's advisable, nor that people will actually bother with doing it.
I'm under the impression that the D Foundation doesn't pay much attention to these things once they are kind of "doable" and somebody has volunteered to "look into it", with no guarantee whatsoever as to if and when it will be available to users. And if there are complaints: hey, it is not "official", ask the guy who's looking into it. Not very professional.

See, that doesn't really give you confidence in D and it gives you an uneasy feeling. There is nothing worse in software development than programming while thinking "Am I actually wasting my time here?", and of course you become reluctant to start anything new in D - which is only natural.
Sep 03 2018
next sibling parent reply Laeeth Isharc <laeeth laeeth.com> writes:
On Monday, 3 September 2018 at 11:32:42 UTC, Chris wrote:
 On Sunday, 2 September 2018 at 12:07:17 UTC, Laeeth Isharc 
 wrote:

 That's why the people that adopt D will inordinately be 
 principals not agents in the beginning. They will either be 
 residual claimants on earnings or will have acquired the 
 authority to make decisions without persuading a committee 
 that makes decisions on the grounds of social factors.

 If D becomes another C++ ?  C++ was ugly from the beginning 
 (in my personal subjective assessment) whereas D was designed 
 by people with good taste.

 That's why it appeals inordinately to people with good taste.
[snip] Be that as it may, however, you forget the fact that people "with good taste" who have (had) an intrinsic motivation to learn D are also very critical people who take no bs, else they wouldn't have ended up using D in the first place. Since they've already learned a lot of concepts etc. with D over the years, it's technically easy for them to move on to either an easier language or one that offers more or less the same features as D. So once they're no longer happy with the way things are, they can dive into a any language fast enough for the cost of transition to be low.
 One has to be practical too.
Yes! And being practical involves recognising that different objectives, starting points and considerations apply to different situations and contexts.
 Programming involves more than just features and concepts. 
 Good, out of the box system integration (e.g. Android, iOS) is 
 important too and he who ignores this simple truth will have to 
 pay a high price.
Important for whom? It depends a lot! Ask Sociomantic, Bastian, or Weka if the lack of Android or iOS integration is a big problem for them, and I don't think you will get the answer that it is important. For what I am doing, Android or iOS would be nice, but it doesn't need to be out of the box, and you can do quite a lot on Android already. I compiled ldc on my Huawei watch, which I never expected to be possible, though given it has 4 Gig of RAM it's not that surprising. JNI is not that bad, though it could certainly be made easier with a bit of work. And I haven't tried, but I guess you could write the GUI stuff in Python or Lua for a simple app and do the heavy lifting with D. Of course, for the ecosystem generally, yes, it matters.
 why developers of new languages are so keen on giving users a 
 smooth experience when it comes to app development and cross 
 compilation which leads me to the next point: IDEs.
D has never been about smooth experiences! That's a commercial benefit if you think that hormesis brings benefits and you are not looking for programmers of the trained-monkey, strap-a-few-APIs-together type.

It's a question of developmental stages too. I was a late developer as a person, but then I continued to develop into my 30s and perhaps my 40s too. For human beings there are different kinds of people, implicit life strategies and natural fits with niches. Some are quick to grow up but stop developing sooner; others mature more slowly, but this process may continue long after others are done. I'm not saying a computer language is like a human being, but it is in part an organic phenomenon, and social institutions develop according to their own logic and rhythm, in my experience of studying them. D is a late developer, and I think that's because it is a tremendously ambitious language.

What use case is D intended to target? Well, it's not like that - it's a general purpose programming language at a time when people have given up on that idea and think that you simply must pick one tool for the job and couldn't possibly have a tool that does many different kinds of things reasonably well. So the kind of use cases D is suited for depends much more on the capabilities and virtues of the people using it than is the case for other languages. (In truth, in a negative sense that's true also of other languages - Go was designed to be easy to learn and to use for people who didn't have much programming experience.)
 No. You don't need an IDE to develop in D
Indeed, and much less so than with some other languages because you can understand the code that's out of focus more easily and hold more of it in your head and reason about it. I personally use Sublime and vim, but tools are very personal because problems are different and people think differently and there's not much upside in engaging in a holy war about tools.
 However, an IDE can a) make coding comfortable and b) boost 
 your productivity.
Sure - it can do, for some people in some cases.
 to a): maybe you just grow tired of the text editor & cli 
 approach and you just want to click a few buttons to fix 
 imports or correct typos and be done with it, and as to b): all 
 this helps to boost your productivity, especially when you can 
 easily set up an app or a web service with a few mouse clicks.
Sure. I would agree with what you write, but say that it's a case of capabilities and hormesis too, sometimes. Nassim Taleb told a story about checking into a hotel and seeing a guy in a suit tip the bellboy to carry his bags upstairs. Later on he saw the same guy in a gym lifting weights (and I think on a Nautilus-type machine, which is much inferior to free weights). So any tool can make you lazy, and yet any tool - no matter how shiny, polished, and expensive - will sometimes break, and then, if you are afraid of the command line or just very out of practice, you can end up utterly helpless. It's a matter of balance, to be sure.
 In D, if you want to do something with ARM/Android you will 
 invariably end up with a potpourri of build scripts and 
 spaghetti lines full of compiler flags etc. Not smooth, it 
 takes a lot of time to set it up manually and it's not easily 
 maintainable.
The last time I tried, I didn't find the experience worse than just going through the Android C/C++ native SDK instructions. The first time I tried, it was quite tough, as I struggled to even build the compiler because the instructions weren't quite right. I disagree about it not being maintainable, since it's much easier to keep something you understand and can reason about working, but it's harder to use in the beginning, for sure.

I think that the sticking point for Android and ARM is not the build process but integration with Java APIs. If you can't figure out a build process that, when I tried it, mostly just worked and doesn't have too much dark magic, I fear for how you are going to find JNI. (JNI is fine, but building a D project on Android requires less demanding technical capabilities.)
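Since JNI keeps coming up, here is a hedged illustration (class and symbol names hypothetical, druntime initialisation elided): a Java native method can be backed by a D function exported under JNI's C symbol-naming convention:

```d
// Java side (hypothetical):
//   package com.example;
//   class Native { public static native int addOne(int x); }
//
// JNI resolves that native method to this C-mangled symbol.
// The first two parameters are the JNIEnv* and the jclass,
// left as void* here to avoid pulling in JNI headers.
extern (C) int Java_com_example_Native_addOne(void* jniEnv, void* jclass, int x)
{
    return x + 1;
}
```

The heavy lifting then lives in ordinary D, with Java used only for the GUI shell.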
 Doable, yes, but just because something is doable doesn't mean 
 it's recommendable nor that people will actually bother with 
 doing it.
You had one or two people who stubbornly devoted considerable parts of their lives to getting D to build on Android. And instead of saying what a remarkable achievement, and thank you so much for this work, and this is very cool but we really should consider in a constructive manner how to make this easy to use, you are saying: I want more! Fair enough - it's a free society, although I don't think you were ever promised that the Android experience would be something different from what it is.

But I really am not surprised that people burn out doing open source. It's very odd to see, because I came back to this world after a long break. My first 'open source' contribution was to part of Tom Jennings' work on FidoNet in 1989 - an improvement to some node routing table - and in those days people used to be pretty appreciative. Same thing with Chuck Forsberg, who invented ZModem and came to that same conference: people then didn't talk about all the deficiencies; they understood this was a labour of love, and the kind of attitude one sees so commonly today would have been unimaginable.
 I'm under the impression that the D Foundation doesn't pay much 
 attention to these things once they are kind of "doable" and 
 somebody has volunteered to "look into it" with no guarantee 
 whatsoever if and when it will be available to users.
Dude - it's open-source and a community-developed language with some - and increasing - commercial support. I'm not saying that the things you ask for might not be valuable things. But I'm curious to know from a rational means-ends perspective how you think your chosen means will be helpful in achieving your desired ends. Do you think that complaining without taking the smallest step towards making things a reality (if you have done so, then I apologise - but your message would have been more effective had you articulated what those steps were) will change things?
 And if there are complaints, hey, it is not "official" ask the 
 guy who's looking into it. Not very professional.
Gesellschaft and gemeinschaft, and open-source is something new. One can pick only from the options available and those one can imaginatively create. Suppose we made you dictator of D, but subject to the same constraints that currently exist. What steps would you take to achieve the ends you desire?
 See, that doesn't really give you confidence in D and it gives 
 you an uneasy feeling. Nothing worse in software development 
 than to be programming thinking "Am I actually wasting my time 
 here?", and of course, you become reluctant to start anything 
 new in D - which is only natural.
Don't use D if you don't want to. Almost certainly it's not suitable for everyone. But the opposite of love is indifference. Somehow you still choose to spend your time here for now. And since that's the case, I strongly encourage you to think about what little baby steps in concrete ways you can take to be the change you wish to become. I just spoke with Dicebot about work stuff. He incidentally mentioned what I said before based on my impressions. The people doing work with a language have better things to do than spend a lot of time on forums. And I think in open source you earn the right to be listened to by doing work of some kind. He said (which I knew already) it was an old post he didn't put up in the end - somebody discovered it in his repo. He is working fulltime as a consultant with me for Symmetry and is writing D as part of that role. I don't think that indicates he didn't mean his criticisms, and maybe one could learn from those. But a whole thread triggered by this is quite entertaining.
Sep 03 2018
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Monday, 3 September 2018 at 14:26:46 UTC, Laeeth Isharc wrote:
 On Monday, 3 September 2018 at 11:32:42 UTC, Chris wrote:
 [...]
 D has never been about smooth experiences!  That's a commercial 
 benefit if you think that hormesis brings benefits and you are 
 not looking for programmers of the trained-monkey, strap a few 
 APIs together type.
It's high time it got a bit smoother if you want people to use it. Is everybody who doesn't use the CLI and doesn't know all the compiler flags by heart a coding monkey? Has it ever occurred to you that people want a smooth experience so they can concentrate on a job and get it done?
 It's a question of developmental stages too.  I was a late 
 developer as a person, but then I continued to develop into my 
 30s and perhaps 40s too.  For human beings there are different 
 kinds of people and implicit life strategies and natural fits 
 with niches.  Some are quick to grow up, but stop developing 
 sooner and others mature more slowly but this process may 
 continue long after others are done.  I'm not saying a computer 
 language is like a human being, but it is in part an organic 
 phenomenon and social institutions develop according to their 
 own logic and rhythm in my experience of studying them.
This is not a yoga class.
 D is a late developer, and I think that's because it is a 
 tremendously ambitious language.  What use case is D intended 
 to target?  Well it's not like that - it's a general purpose 
 programming language at a time when people have given up on 
 that idea and think that it simply must be that you pick one 
 tool for the job and couldn't possibly have a tool that does 
 many different kind of things reasonably well.  So the kind of 
 use cases D is suited for depends much more on the capabilities 
 and virtues of the people using it than is the case for other 
 languages.  (In truth in a negative sense that's true also of 
 other languages - Go was designed to be easy to learn and to 
 use for people who didn't have much programming experience).
It's not D's usefulness I'm concerned with; it can do a lot of things. It's just a bit awkward to use in production, and there's no reason why things should still be the way they were in 2010.
 [...]
 Sure. I would agree with what you write, but say that it's sometimes a case of capabilities and hormesis too. Nassim Taleb told a story about checking into a hotel and seeing a guy in a suit tip the bellboy to carry his bags upstairs. Later on he saw the same guy in a gym lifting weights (and I think on a Nautilus-type machine, which is much inferior to free weights). So any tool can make you lazy, and yet any tool - no matter how shiny, polished, and expensive - will sometimes break, and then if you are afraid of the command line or just very out of practice you can end up utterly helpless. It's a matter of balance, to be sure.
Funny story, but this is not the place for esoteric contemplations.
 [...]
 Last time I tried, I didn't find the experience worse than just going through the Android C/C++ native SDK instructions. The first time I tried it was quite tough, as I struggled to even build the compiler because the instructions weren't quite right. I disagree about it not being maintainable, since it's much easier to keep something working that you understand and can reason about, but it's harder to use in the beginning, for sure. I think that the real sticking point for Android and ARM is not the build process but integration with Java APIs. If you can't figure out a build process that mostly just worked when I tried it, and that doesn't involve too much dark magic, I fear for how easy you are going to find JNI. (JNI is fine, but building a D project on Android requires less demanding technical capabilities.)
I know JNI, I've connected D with Java (and vice versa) a few times.
 [...]
 You had one or two people who stubbornly devoted considerable parts of their lives to getting D to build on Android. And instead of saying what a remarkable achievement, and thank you so much for this work, and this is very cool but we really should consider in a constructive manner how to make this easy to use, you are saying I want more! Fair enough - it's a free society, although I don't think you were ever promised that the Android experience would be something different from what it is.
I never gave out about the guys (I think one of them is Joakim) who made it possible in the end, because without their efforts we wouldn't have anything. I'm just surprised they don't get more full time support to wrap it up nicely.
 But I really am not surprised that people burn out doing open 
 source.  It's very odd to see, because I came back to this 
 world after a long break.  My first 'open source' contribution 
 was to part of Tom Jenning's work on FidoNet in 1989 - an 
 improvement to some node routing table, and in those days 
 people used to be pretty appreciative.  Same thing with Chuck 
 Forsberg who invented ZModem and came to that same conference - 
 people then didn't talk about all the deficiencies but they 
 understood this was a labour of love and the kind of attitude 
 one sees so commonly today I couldn't have imagined.

 [...]
 Dude - it's open-source and a community-developed language with some - and increasing - commercial support. I'm not saying that the things you ask for might not be valuable things. But I'm curious to know from a rational means-ends perspective how you think your chosen means will be helpful in achieving your desired ends. Do you think that complaining without taking the smallest step towards making things a reality (if you have done so, then I apologise - but your message would have been more effective had you articulated what those steps were) will change things?
The default answer: it's open source therefore it's under-resourced.
 [...]
 Gesellschaft and gemeinschaft, and open-source is something new. One can pick only from the options available and those one can imaginatively create. Suppose we made you dictator of D, but subject to the same constraints that currently exist. What steps would you take to achieve the ends you desire?
 [...]
 Don't use D if you don't want to. Almost certainly it's not suitable for everyone. But the opposite of love is indifference. Somehow you still choose to spend your time here for now. And since that's the case, I strongly encourage you to think about what little baby steps in concrete ways you can take to be the change you wish to become. I just spoke with Dicebot about work stuff. He incidentally mentioned what I said before based on my impressions. The people doing work with a language have better things to do than spend a lot of time on forums. And I think in open source you earn the right to be listened to by doing work of some kind. He said (which I knew already) it was an old post he didn't put up in the end - somebody discovered it in his repo. He is working fulltime as a consultant with me for Symmetry and is writing D as part of that role. I don't think that indicates he didn't mean his criticisms, and maybe one could learn from those. But a whole thread triggered by this is quite entertaining.
So the essence of your message is what I've been hearing for years.

1. It's open source, so it cannot be smooth / reliable / streamlined / consistent.
2. If you complain and want any of the features listed in 1., you're probably just a well-trained monkey. D is only for the chosen few. And maybe that was true in 2010. But it's 2018 and D is losing the edge it had on other languages.
3. Shame anyone who complains - ungrateful little brats!

But it's my own fault. When the D Foundation was founded I really thought that we would soon have something that resembles a good product. I didn't expect the endless discussions about features, and the breakages etc. to continue like back in the day. I thought more effort would be put into packaging D into a good product. Mea maxima culpa.
Sep 03 2018
parent reply Laurent Tréguier <laurent.treguier.sink gmail.com> writes:
On Monday, 3 September 2018 at 15:23:12 UTC, Chris wrote:
 On Monday, 3 September 2018 at 14:26:46 UTC, Laeeth Isharc 
 wrote:
 On Monday, 3 September 2018 at 11:32:42 UTC, Chris wrote:
 [...]
 D has never been about smooth experiences!  That's a 
 commercial benefit if you think that hormesis brings benefits 
 and you are not looking for programmers of the trained-monkey, 
 strap a few APIs together type.
 It's high time it got a bit smoother if you want people to use it. Is everybody who doesn't use the CLI and doesn't know all the compiler flags by heart a coding monkey? Has it ever occurred to you that people want a smooth experience so they can concentrate on a job and get it done?
Yes. It almost sounds like a smooth experience would be a bad thing to have, especially with the classic "you don't need an IDE anyway" speech. The editing experience often seems to be dismissed as unimportant, when it's one of the first things new users will come across when trying out D. First impressions can matter a lot.
Sep 03 2018
next sibling parent reply RhyS <sale rhysoft.com> writes:
On Monday, 3 September 2018 at 15:41:48 UTC, Laurent Tréguier 
wrote:
 Yes. It almost sounds like a smooth experience would be a bad 
 thing to have, especially with the classic "you don't need an 
 IDE anyway" speech. Editing experience seems often dismissed as 
 unimportant, when it's one of the first things new users will 
 come across when trying out D. First impressions can matter a 
 lot.
It's the same issue that has kept Linux as a desktop stuck with almost no growth. It's easy to break things (the nvidia graphics driver *lol*), and so much is focused on the CLI that people who do have an issue and are not system users quickly run into a flooding swamp.

Too many resources split among too many distributions, graphical desktops etc. Choice is good, but too much choice means projects are starved for resources, compatibility becomes an issue, bugs are even more present, ...

A good example being the resources going into DMD, LDC, GDC... 3 compilers for one language, when even well-funded languages stick to one compiler. And now some people think it's a good idea to have DMD also cross-compile because "it's not hard to do". No, maybe not, but who will do all the testing? What resources are going to be spent when things do not work for some users (and the negative impact on their experience)... It's a long list, but people do not look past this. It sounds like fun, so let's think about it / do it.

It's just so frustrating that a lot of people here do not understand. Most programmers are not open-source developers; they are not coding gods; they are simply people who want things to go smoothly. Install the compiler, install a well-supported graphical IDE (and no, VIM does not count!), read up some simple documentation, and off we go... We are not looking to be bug testers, core code implementers, etc...

Selfish, ... sure ... but this is how D gains more people. The more people work with your language, the more potential people you have who slowly become interested in helping out. But when D puts the carrot in front of the cart instead of the mule, do not be so surprised that a lot of people find D extremely frustrating and have a love-hate relationship with it.
Sep 03 2018
next sibling parent reply Iain Buclaw <ibuclaw gdcproject.org> writes:
On 3 September 2018 at 18:07, RhyS via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 Too much resources split among too many distributions, graphical desktops
 etc. Choice is good but too much choice means projects are starved for
 resources, comparability are issues, bugs are even more present, ...

 A good example being the resources going into DMD, LDC, GDC... 3 Compilers
 for one language, when even well funded languages stick to one compiler.
This is an argument that has been batted to death and rebutted for nearly 2 decades now. 15 years ago, people were complaining that there was only one D compiler. It is ironic that people now complain that there's too many.
Sep 03 2018
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 3 September 2018 at 16:41:32 UTC, Iain Buclaw wrote:
 15 years ago, people were complaining that there was only one D 
 compiler.  It is ironic that people now complain that there's 
 too many.
One needs multiple implementations to confirm the accuracy of the language specification. D still has one implementation, i.e. one compiler with multiple backends, distributed as multiple executables (with tweaks). Anyway, I think people complained about the first and only compiler being non-free. That's not relevant now, of course.
Sep 04 2018
prev sibling parent reply Laeeth Isharc <laeeth laeeth.com> writes:
On Monday, 3 September 2018 at 16:07:21 UTC, RhyS wrote:
 On Monday, 3 September 2018 at 15:41:48 UTC, Laurent Tréguier 
 wrote:
 Yes. It almost sounds like a smooth experience would be a bad 
 thing to have, especially with the classic "you don't need an 
 IDE anyway" speech. Editing experience seems often dismissed 
 as unimportant, when it's one of the first things new users 
 will come across when trying out D. First impressions can 
 matter a lot.
I didn't give a "you don't need an IDE" speech, and I didn't say a smooth experience was a bad thing. But in my experience a strong reality orientation leads to good things coming out of life, and telling the universe it should be something different from what it is is a recipe for misery and suffering - and why would you do that to yourself? So if you want the world to be different, come up with a plan. It could be "I am going to donate X dollars a month to the Foundation to fund IDE development", or it could be figuring out how you can help with the work in whatever way. But just grumbling - I really think that mistakes the nature of the situation, not to mention human psychology. You can accomplish things with a vision that's thought through and inspires others. Negativity is part of being creative, but not if you stop there.
 Its the same issue why Linux as a Desktop has been stuck with 
 almost no growth. Its easy to break things ( nvidia graphical 
 driver *lol* ), too much is so focused on the Cli that people 
 who do have a issue and are not system users quick run into a 
 flooding swamp.

 Too much resources split among too many distributions, 
 graphical desktops etc. Choice is good but too much choice 
 means projects are starved for resources, comparability are 
 issues, bugs are even more present, ...
Chromebooks and Android seem to be doing okay. I run Linux on the desktop and have done full-time since 2014. Maybe you're right that it's not for everyone at this point. And so? There just wasn't a path for people to put effort into making it utterly easy for non-technical people beyond a certain point. Does that mean Linux, or Linux on the desktop, has failed? I don't think so. It's just not for everyone. It's interesting to see Microsoft making it possible to run Linux on Windows - it turns out a minority audience can be an important audience.
 A good example being the resources going into DMD, LDC, GDC... 
 3 Compilers for one language, when even well funded languages 
 stick to one compiler. And now some people think its a good 
 idea to have DMD also cross compile because "its not hard to 
 do". No, maybe not but who will do all the testing, what 
 resources are going to spend when things do not work for some 
 users ( and the negative impact on their experience )... Its a 
 long list but people do not look past this. It sounds like fun, 
 lets think / or do it.
What resources do you think go into GDC? I think Iain would love to hear about all these resources, because I am not sure he has been made aware of them: they don't exist beyond him and possibly a tiny number of others helping in part at certain stages.
 Its just so frustrating that a lot of people here do not 
 understand. Most programmers are not open-source developers, 
 they are not coding gods, they are simply people who want 
 things to good smooth. Install compiler, install good supported 
 graphical IDE ( and no, VIM does not count! ), read up some 
 simple documentation and off we go... We are not looking to be 
 bug testers, core code implementer's, etc...
Sure, and probably most people would be better off at this point using a language that makes getting started easy. One doesn't need to appeal to most people to succeed. That's just a pragmatic statement of the obvious. In time it will change, but I don't see how recognising your observation could rationally lead anyone to do something differently from what they would have done before. To change the world you need a goal and a first cut at a plan for getting there. Whether the goal is entirely realistic is much less important than having a plan to begin. And I speak from experience here, having at certain points had not much more than that.
 Selfish, ... sure ... but this is how D gain more people. The 
 more people work with your language, the more potential people 
 you have that slowly are interested in helping out.
I disagree. At this point the way for D to appeal to more people is to increase its appeal just a bit more to those who are already on the cusp of using D or would be if they had looked into it and to those who use D already in some way but could use it more. The way for D to appeal to more people is not to address the complaints of those who spend more time writing on the forum grumbling but don't use it much, because in my experience you do much better appealing to the people who are your best customers than to those who tell you if only you could do X there would be huge demand. I think that has been Walter's experience too. Like I say, the advantage of being the underdog is that you don't need to appeal to everyone to continue to grow.
 But when D puts the carrot in front of the cart instead of the 
 mule. Then do not be so surprised that a lot of people find D 
 extreme frustrating and have a love-hate relationship with it.
Sure. That's not surprising. It's surprising that people think by complaining rather than taking action they will achieve much of a change in the world. But with human nature it has always been thus, I suppose.
Sep 03 2018
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/3/2018 7:19 PM, Laeeth Isharc wrote:
 The way for D to appeal to more people is not to address
 the complaints of those who spend more time writing on the forum grumbling but
 don't use it much, because in my experience you do much better appealing to the
 people who are your best customers than to those who tell you if only you could
 do X there would be huge demand.  I think that has been Walter's experience too.
I've been in this business a long time. Fun anecdotes:

---

I was out jogging with a colleague in the 1990's one day. He said what the world really needs, and what he really needed, was a Java implementation that generated native code. It would set the world on fire! I told him I wrote one, he could get it today from Symantec. He never said another word on the subject. It turns out nobody wanted a native Java compiler.

---

Back in the old Datalight days in the 1980s, a big customer said they couldn't use Datalight C because it didn't have Feature X. If only it had X, they'd place a Big Order. So I implemented X, and excitedly showed it to them and asked for the Big Order. They hemmed and hawed, then said what they really needed was Feature Y! After that, I was a lot less credulous of dangling promises of a Big Order. I'd often say sure, and ask for an advance on the order, which worked well at filtering out the chain-jerking.

---

Related to me by a friend: X told me that what he really wanted in a C++ compiler was compile speed. It was the most important feature. He went on and on about it. I laughed and said that compile speed was at the bottom of his list. He looked perplexed, and asked how could I say that? I told him that he was using Cfront, a translator, with Microsoft C as the backend, a combination that compiled 4 times slower than Zortech C++, and didn't have critical (for DOS) features like near/far pointers. What he really regarded as the most important feature was being a name brand.

---

Henry Ford said that his market research suggested that people wanted a faster horse.

---

Trying to figure out where we should allocate our scarce resources is probably the most difficult task I face. I know it looks easy, but it is all too easy to wind up with a faster horse when everyone else developed a car.
Sep 03 2018
prev sibling next sibling parent reply Iain Buclaw <ibuclaw gdcproject.org> writes:
On 4 September 2018 at 04:19, Laeeth Isharc via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Monday, 3 September 2018 at 16:07:21 UTC, RhyS wrote:
 A good example being the resources going into DMD, LDC, GDC... 3 Compilers
 for one language, when even well funded languages stick to one compiler. And
 now some people think its a good idea to have DMD also cross compile because
 "its not hard to do". No, maybe not but who will do all the testing, what
 resources are going to spend when things do not work for some users ( and
 the negative impact on their experience )... Its a long list but people do
 not look past this. It sounds like fun, lets think / or do it.
 What resources do you think go into GDC? I think Iain would love to hear about all these resources, because I am not sure he has been made aware of them: they don't exist beyond him and possibly a tiny number of others helping in part at certain stages.
*Looks behind self* *Looks under desk* *Looks under keyboard* There must be resources somewhere, but none appear to be within reach. :-)
Sep 03 2018
parent Laeeth Isharc <laeeth laeeth.com> writes:
On Tuesday, 4 September 2018 at 05:38:49 UTC, Iain Buclaw wrote:
 On 4 September 2018 at 04:19, Laeeth Isharc via Digitalmars-d 
 <digitalmars-d puremagic.com> wrote:
 On Monday, 3 September 2018 at 16:07:21 UTC, RhyS wrote:
 A good example being the resources going into DMD, LDC, 
 GDC... 3 Compilers for one language, when even well funded 
 languages stick to one compiler. And now some people think 
 its a good idea to have DMD also cross compile because "its 
 not hard to do". No, maybe not but who will do all the 
 testing, what resources are going to spend when things do not 
 work for some users ( and the negative impact on their 
 experience )... Its a long list but people do not look past 
 this. It sounds like fun, lets think / or do it.
 What resources do you think go into GDC? I think Iain would love to hear about all these resources because I am not sure he has been made aware of them because they don't exist beyond him and possibly a tiny number of others helping in part at certain stages.
 *Looks behind self* *Looks under desk* *Looks under keyboard* There must be resources somewhere, but none appear to be within reach. :-)
If Iain had a beer for every person that complained about the effort spent by team GDC without having first thanked him and his vast team then... People are sometimes quite disconnected from reality. At least I have no other explanation for people demanding others do this or do that without doing the minimum necessary to make it appealing for others to work on it. I mean, my experience is that you can pay people a lot of money and ask them beforehand whether they want to work on X, and it's no guarantee they actually will be willing to when it comes to it. Programmers in general can be very independent-minded people, and if somebody is looking for especially meek and compliant people, then if you have come to the D forums you are in the wrong place!

One can be much more persuasive with positive words than with complaints. Most people are well aware of that, so if they are complaining it's in my judgement because they want to complain. People with high standards will do that when they feel powerless. I'm not talking here about notorious highly intelligent trolls like Ola and sock-puppets who never seem to actually write code in D. But nobody who can keep up here is powerless. It's possible to change the world, you know, and from the most unpromising start. Forget about what's realistic, and focus on what you want to achieve. Believe me, you can achieve an awful lot from the most unpromising start.

People talk about how most people are not super-hackers and one shouldn't expect them to manage without polish. Well, hacker is a state of mind, a way of being in the world. Ask Iain if his self-conception is as a super-hacker with l33t skillz that a mere professional programmer couldn't match, and you might be surprised (I think his self-conception might be wrong, but that's Dunning-Kruger in action for you). It's really much more about values and virtues than capabilities. Are you able to tolerate discomfort and the accurate initial feeling of conscious incompetence?
Because that's what real learning feels like once you leave the spoon-feeding stream of education. D is a gift to the world from Walter, Andrei, and those who contributed after it was begun. Just demanding people do stuff for you without doing anything to contribute back - that's not how life works. I don't think I have ever seen this degree of entitlement in my life! And I've been working in finance since 1993.

If someone doesn't want to pay money towards the development of IDE integration and doesn't want to do any work themselves, then the least they could do is draw up a feature list of what's missing and find a way to help from time to time with the organisation of the work. That's the only way things ever get done anyway. Have you noticed how the documentation has gotten much better? Runnable examples too. Did that happen because people complained? No - it happened because Seb Wilzbach (and maybe others) took the initiative to make it happen and did the work themselves.

A little money goes a long way in open source. So if you're a company and you're complaining and not donating money to the Foundation, then what exactly do you expect? We have a few support contracts with MongoDB (a choice made before I got involved); the legal fees alone were 20k and we pay about 30k USD a year. If a few companies contributed at that scale to the Foundation, that's at least a couple of full-time developers. And if you disagree with Andrei and Walter's choices about priorities, you know you can just direct where the money should be spent, as we are with SAoC.
Sep 04 2018
prev sibling parent Laurent Tréguier <laurent.treguier.sink gmail.com> writes:
On Tuesday, 4 September 2018 at 02:19:20 UTC, Laeeth Isharc wrote:
 On Monday, 3 September 2018 at 16:07:21 UTC, RhyS wrote:
 On Monday, 3 September 2018 at 15:41:48 UTC, Laurent Tréguier 
 wrote:
 Yes. It almost sounds like a smooth experience would be a bad 
 thing to have, especially with the classic "you don't need an 
 IDE anyway" speech. Editing experience seems often dismissed 
 as unimportant, when it's one of the first things new users 
 will come across when trying out D. First impressions can 
 matter a lot.
I didn't give a you don't need an IDE speech, and I didn't say a smooth experience was a bad thing.
I know, I know. But it's always the same story: whenever people wonder about the state of D's IDE integration, others will just say that vim + a terminal is enough.
 But in my experience a strong reality orientation leads to good 
 things coming out of life and telling the universe it should be 
 something different from what it is is a recipe for misery and 
 suffering, and why would you do that to yourself?

 So if you want the world to be different, come up with a plan.  
 It could be I am going to donate X dollars a month to the 
 Foundation to fund IDE development, or if could be figuring out 
 how you can help with the work in whatever way.  But just 
 grumbling - I really think that mistakes the nature of the 
 situation, and not to mention human psychology.  You can 
 accomplish things with a vision that's thought through and 
 inspires others.  Negativity is part of being creative but not 
 if you stop there.
I stated it in an earlier post, I've been working on editor/IDE integration myself because of this. I don't just grumble, although I do like grumbling a lot :)
Sep 03 2018
prev sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Monday, September 3, 2018 9:41:48 AM MDT Laurent Tréguier via 
Digitalmars-d wrote:
 On Monday, 3 September 2018 at 15:23:12 UTC, Chris wrote:
 On Monday, 3 September 2018 at 14:26:46 UTC, Laeeth Isharc

 wrote:
 On Monday, 3 September 2018 at 11:32:42 UTC, Chris wrote:
 [...]
 D has never been about smooth experiences! That's a commercial benefit if you think that hormesis brings benefits and you are not looking for programmers of the trained-monkey, strap a few APIs together type.
 It's high time it got a bit smoother if you want people to use it. Is everybody who doesn't use cli and knows all compiler flags by heart a coding monkey? Has it ever occurred to you that people want a smooth experience so they can concentrate on a job and get done with it?
 Yes. It almost sounds like a smooth experience would be a bad thing to have, especially with the classic "you don't need an IDE anyway" speech. Editing experience seems often dismissed as unimportant, when it's one of the first things new users will come across when trying out D. First impressions can matter a lot.
Most of the work that gets done is the stuff that the folks contributing think is the most important - frequently what is most important for them for what they do - and very few (if any) of the major contributors use or care about IDEs for their own use. And there's tons to do that has nothing to do with IDEs. There are folks who care about it enough to work on it, which is why projects such as VisualD exist at all, and AFAIK, they work reasonably well, but the only two ways that they're going to get more work done on them than is currently happening is if the folks who care about that sort of thing contribute or if they donate money for it to be worked on. Not long ago, the D Foundation announced that they were going to use donations to pay someone to work on his plugin for Visual Studio Code:

https://forum.dlang.org/post/rmqvglgccmgoajmhynog forum.dlang.org

So, if you want stuff like that to get worked on, then donate or pitch in.

The situation with D - both with IDEs and in general - has improved greatly over time, even if it may not be where you want it to be. But if you're ever expecting IDE support to be a top priority of many of the contributors, then you're going to be sorely disappointed. It's the sort of thing that we care about because we care about D being successful, but it's not the sort of thing that we see any value in whatsoever for ourselves, and selfish as it may be, when we spend the time to contribute to D, we're generally going to work on the stuff that we see as having the most value for getting done what we care about. And there's a lot to get done which impacts pretty much every D user, not just those who want something that's IDE-related.

- Jonathan M Davis
Sep 03 2018
next sibling parent reply Laurent Tréguier <laurent.treguier.sink gmail.com> writes:
On Monday, 3 September 2018 at 16:55:10 UTC, Jonathan M Davis 
wrote:
 Most of the work that gets done is the stuff that the folks 
 contributing think is the most important - frequently what is 
 most important for them for what they do, and very few (if any) 
 of the major contributors use or care about IDEs for their own 
 use. And there's tons to do that has nothing to do with IDEs. 
 There are folks who care about it enough to work on it, which 
 is why projects such as VisualD exist at all, and AFAIK, they 
 work reasonably well, but the only two ways that they're going 
 to get more work done on them than is currently happening is if 
 the folks who care about that sort of thing contribute or if 
 they donate money for it to be worked on. Not long ago, the D 
 Foundation announced that they were going to use donations to 
 pay someone to work on his plugin for Visual Studio Code:

 https://forum.dlang.org/post/rmqvglgccmgoajmhynog forum.dlang.org

 So, if you want stuff like that to get worked on, then donate 
 or pitch in.

 The situation with D - both with IDEs and in general - has 
 improved greatly over time even if it may not be where you want 
 it to be. But if you're ever expecting IDE support to be a top 
 priority of many of the contributors, then you're going to be 
 sorely disappointed. It's the sort of thing that we care about 
 because we care about D being successful, but it's not the sort 
 of thing that we see any value in whatsoever for ourselves, and 
 selfish as it may be, when we spend the time to contribute to 
 D, we're generally going to work on the stuff that we see as 
 having the most value for getting done what we care about. And 
 there's a lot to get done which impacts pretty much every D 
 user and not just those who want something that's IDE-related.

 - Jonathan M Davis
The complaints I have are exactly why I'm myself maintaining plugins for VSCode, Atom, and others soon. Don't worry, I still think D is worth putting some time and effort into, and I know actions generally get more things done than words. I also know that tons of stuff is yet to be done in regards to the actual compilers and such.

It just baffles me a bit to see the state of D in this department, when languages like Go or Rust (hooray for yet another comparison to Go and Rust) are a lot younger, but already have what looks like very good tooling. Then again, they do have major industry players backing them though...
Sep 03 2018
next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Monday, September 3, 2018 11:15:03 AM MDT Laurent Tréguier via 
Digitalmars-d wrote:
 It just baffles me a bit to see the state of D in this
 department, when languages like Go or Rust (hooray for yet
 another comparison to Go and Rust) are a lot younger, but already
 have what looks like very good tooling.
 Then again they do have major industry players backing them
 though...
The dynamics are fundamentally different when you're paying someone to work on something. As I understand it, in addition to whatever volunteer work is done, Google and Mozilla pay people to work on those languages. And when you're doing that, it's trivial enough to say that you think that something matters enough to pay someone to work on it, even if it's not something that anyone contributing actually wants to do or really cares about having for themselves.

Relatively little time has been spent contributing to D by people who are paid to work on it. Even if both Walter and Andrei agreed that something should be treated as top priority, aside from paying someone to work on it through the D Foundation, they really can't make anyone work on it. What gets done is usually what the contributors care about.

That's one reason why donations could end up being a game changer over time. It makes it possible to pay someone to do something that no contributors want to spend their free time doing.

- Jonathan M Davis
Sep 03 2018
prev sibling parent reply Laeeth Isharc <laeeth laeeth.com> writes:
On Monday, 3 September 2018 at 17:15:03 UTC, Laurent Tréguier 
wrote:
 On Monday, 3 September 2018 at 16:55:10 UTC, Jonathan M Davis 
 wrote:
 Most of the work that gets done is the stuff that the folks 
 contributing think is the most important - frequently what is 
 most important for them for what they do, and very few (if 
 any) of the major contributors use or care about IDEs for 
 their own use. And there's tons to do that has nothing to do 
 with IDEs. There are folks who care about it enough to work on 
 it, which is why projects such as VisualD exist at all, and 
 AFAIK, they work reasonably well, but the only two ways that 
 they're going to get more work done on them than is currently 
 happening is if the folks who care about that sort of thing 
 contribute or if they donate money for it to be worked on. Not 
 long ago, the D Foundation announced that they were going to 
 use donations to pay someone to work on his plugin for Visual 
 Studio Code:

 https://forum.dlang.org/post/rmqvglgccmgoajmhynog forum.dlang.org

 So, if you want stuff like that to get worked on, then donate 
 or pitch in.

 The situation with D - both with IDEs and in general - has 
 improved greatly over time even if it may not be where you 
 want it to be. But if you're ever expecting IDE support to be 
 a top priority of many of the contributors, then you're going 
 to be sorely disappointed. It's the sort of thing that we care 
 about because we care about D being successful, but it's not 
 the sort of thing that we see any value in whatsoever for 
 ourselves, and selfish as it may be, when we spend the time to 
 contribute to D, we're generally going to work on the stuff 
 that we see as having the most value for getting done what we 
 care about. And there's a lot to get done which impacts pretty 
 much every D user and not just those who want something that's 
 IDE-related.

 - Jonathan M Davis
The complaints I have is exactly why I'm myself maintaining plugins for VSCode, Atom, and others soon. Don't worry, I still think D is worth putting some time and effort into and I know actions generally get more things done than words. I also know that tons of stuff is yet to be done in regards to the actual compilers and such. It just baffles me a bit to see the state of D in this department, when languages like Go or Rust (hooray for yet another comparison to Go and Rust) are a lot younger, but already have what looks like very good tooling. Then again they do have major industry players backing them though...
Why is Go's IDE support baffling? It was a necessity to achieve Google's commercial aims, I should think.

"The key point here is our programmers are Googlers, they're not researchers. They're typically, fairly young, fresh out of school, probably learned Java, maybe learned C or C++, probably learned Python. They're not capable of understanding a brilliant language but we want to use them to build good software. So, the language that we give them has to be easy for them to understand and easy to adopt." – Rob Pike

I don't know the story of Rust, but if I were working on a project as large as Firefox I guess I would want an IDE too! Whereas it doesn't seem like it's so important to some of D's commercial users because they have a different context.

I don't think it's overall baffling that D hasn't got the best IDE support of emerging languages. The people that contribute to it, as Jonathan says, seem to be less interested in IDEs, and no company has found it important enough to pay someone else to work on it. So far anyway, but as adoption grows maybe that will change.
Sep 03 2018
parent reply Manu <turkeyman gmail.com> writes:
On Mon, 3 Sep 2018 at 18:45, Laeeth Isharc via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Monday, 3 September 2018 at 17:15:03 UTC, Laurent Tréguier
 wrote:
 On Monday, 3 September 2018 at 16:55:10 UTC, Jonathan M Davis
 wrote:
 Most of the work that gets done is the stuff that the folks
 contributing think is the most important - frequently what is
 most important for them for what they do, and very few (if
 any) of the major contributors use or care about IDEs for
 their own use. And there's tons to do that has nothing to do
 with IDEs. There are folks who care about it enough to work on
 it, which is why projects such as VisualD exist at all, and
 AFAIK, they work reasonably well, but the only two ways that
 they're going to get more work done on them than is currently
 happening is if the folks who care about that sort of thing
 contribute or if they donate money for it to be worked on. Not
 long ago, the D Foundation announced that they were going to
 use donations to pay someone to work on his plugin for Visual
 Studio Code:

 https://forum.dlang.org/post/rmqvglgccmgoajmhynog forum.dlang.org

 So, if you want stuff like that to get worked on, then donate
 or pitch in.

 The situation with D - both with IDEs and in general - has
 improved greatly over time even if it may not be where you
 want it to be. But if you're ever expecting IDE support to be
 a top priority of many of the contributors, then you're going
 to be sorely disappointed. It's the sort of thing that we care
 about because we care about D being successful, but it's not
 the sort of thing that we see any value in whatsoever for
 ourselves, and selfish as it may be, when we spend the time to
 contribute to D, we're generally going to work on the stuff
 that we see as having the most value for getting done what we
 care about. And there's a lot to get done which impacts pretty
 much every D user and not just those who want something that's
 IDE-related.

 - Jonathan M Davis
The complaints I have is exactly why I'm myself maintaining plugins for VSCode, Atom, and others soon. Don't worry, I still think D is worth putting some time and effort into and I know actions generally get more things done than words. I also know that tons of stuff is yet to be done in regards to the actual compilers and such. It just baffles me a bit to see the state of D in this department, when languages like Go or Rust (hooray for yet another comparison to Go and Rust) are a lot younger, but already have what looks like very good tooling. Then again they do have major industry players backing them though...
Why is Go's IDE support baffling? It was a necessity to achieve Google's commercial aims, I should think. " The key point here is our programmers are Googlers, they’re not researchers. They’re typically, fairly young, fresh out of school, probably learned Java, maybe learned C or C++, probably learned Python. They’re not capable of understanding a brilliant language but we want to use them to build good software. So, the language that we give them has to be easy for them to understand and easy to adopt." – Rob Pike I don't know the story of Rust, but if I were working on a project as large as Firefox I guess I would want an IDE too! Whereas it doesn't seem like it's so important to some of D's commercial users because they have a different context. I don't think it's overall baffling that D hasn't got the best IDE support of emerging languages. The people that contribute to it, as Jonathan says, seen to be leas interested in IDEs and no company has found it important enough to pay someone else to work on it. So far anyway but as adoption grows maybe that will change.
It's been a key hurdle for as long as I've been around here. I've been saying for 10 years that no company I've ever worked at can take D seriously without industry standard IDE support.

My feeling is that we have recently reached MVP status... that's a huge step, 10 years in the making ;) I think it's now at a point where more people *wouldn't* reject it on contact than those who would. But we need to go much further to make developers genuinely comfortable, and thereby go out of their way to prefer using D over C++ and pitch it as such to their managers.

Among all developers I've demo-ed or introduced recently, I can say for certain that developer enthusiasm is driven by their perception of the tooling on the order of 10x more than the language.
Sep 03 2018
parent reply Laeeth Isharc <laeeth laeeth.com> writes:
On Tuesday, 4 September 2018 at 02:24:25 UTC, Manu wrote:
 On Mon, 3 Sep 2018 at 18:45, Laeeth Isharc via Digitalmars-d 
 <digitalmars-d puremagic.com> wrote:
 On Monday, 3 September 2018 at 17:15:03 UTC, Laurent Tréguier 
 wrote:
 On Monday, 3 September 2018 at 16:55:10 UTC, Jonathan M 
 Davis wrote:
 Most of the work that gets done is the stuff that the folks 
 contributing think is the most important - frequently what 
 is most important for them for what they do, and very few 
 (if any) of the major contributors use or care about IDEs 
 for their own use. And there's tons to do that has nothing 
 to do with IDEs. There are folks who care about it enough 
 to work on it, which is why projects such as VisualD exist 
 at all, and AFAIK, they work reasonably well, but the only 
 two ways that they're going to get more work done on them 
 than is currently happening is if the folks who care about 
 that sort of thing contribute or if they donate money for 
 it to be worked on. Not long ago, the D Foundation 
 announced that they were going to use donations to pay 
 someone to work on his plugin for Visual Studio Code:

 https://forum.dlang.org/post/rmqvglgccmgoajmhynog forum.dlang.org

 So, if you want stuff like that to get worked on, then 
 donate or pitch in.

 The situation with D - both with IDEs and in general - has 
 improved greatly over time even if it may not be where you 
 want it to be. But if you're ever expecting IDE support to 
 be a top priority of many of the contributors, then you're 
 going to be sorely disappointed. It's the sort of thing 
 that we care about because we care about D being 
 successful, but it's not the sort of thing that we see any 
 value in whatsoever for ourselves, and selfish as it may 
 be, when we spend the time to contribute to D, we're 
 generally going to work on the stuff that we see as having 
 the most value for getting done what we care about. And 
 there's a lot to get done which impacts pretty much every D 
 user and not just those who want something that's 
 IDE-related.

 - Jonathan M Davis
The complaints I have is exactly why I'm myself maintaining plugins for VSCode, Atom, and others soon. Don't worry, I still think D is worth putting some time and effort into and I know actions generally get more things done than words. I also know that tons of stuff is yet to be done in regards to the actual compilers and such. It just baffles me a bit to see the state of D in this department, when languages like Go or Rust (hooray for yet another comparison to Go and Rust) are a lot younger, but already have what looks like very good tooling. Then again they do have major industry players backing them though...
Why is Go's IDE support baffling? It was a necessity to achieve Google's commercial aims, I should think. " The key point here is our programmers are Googlers, they’re not researchers. They’re typically, fairly young, fresh out of school, probably learned Java, maybe learned C or C++, probably learned Python. They’re not capable of understanding a brilliant language but we want to use them to build good software. So, the language that we give them has to be easy for them to understand and easy to adopt." – Rob Pike I don't know the story of Rust, but if I were working on a project as large as Firefox I guess I would want an IDE too! Whereas it doesn't seem like it's so important to some of D's commercial users because they have a different context. I don't think it's overall baffling that D hasn't got the best IDE support of emerging languages. The people that contribute to it, as Jonathan says, seen to be leas interested in IDEs and no company has found it important enough to pay someone else to work on it. So far anyway but as adoption grows maybe that will change.
It's been a key hurdle for as long as I've been around here. I've been saying for 10 years that no company I've ever worked at can take D seriously without industry standard IDE support. My feeling is that we have recently reached MVP status... that's a huge step, 10 years in the making ;) I think it's now at a point where more people *wouldn't* reject it on contact than those who would. But we need to go much further to make developers genuinely comfortable, and thereby go out of their way to prefer using D than C++ and pitch as such to their managers. Among all developers I've demo-ed or introduced recently, I can say for certain that developer enthusiasm is driven by their perception of the tooling in the order of 10x more than the language.
That's only because you insist on working for companies where people use IDEs and think the ones that don't must be in boring industries :)

Kidding aside, would you care to enumerate what capabilities are missing that would tip the balance for such people were they to be there? And then would you care to estimate the degree of work involved in implementing them? For decent and motivated people, how many man-years? Knowing the full scope of a problem is sometimes one step towards solving it.

And how would you rate the importance of tooling vs. finishing C++ integration?
Sep 03 2018
parent Manu <turkeyman gmail.com> writes:
On Mon, 3 Sep 2018 at 19:35, Laeeth Isharc via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Tuesday, 4 September 2018 at 02:24:25 UTC, Manu wrote:
 On Mon, 3 Sep 2018 at 18:45, Laeeth Isharc via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Monday, 3 September 2018 at 17:15:03 UTC, Laurent Tréguier
 wrote:
 On Monday, 3 September 2018 at 16:55:10 UTC, Jonathan M
 Davis wrote:
 Most of the work that gets done is the stuff that the folks
 contributing think is the most important - frequently what
 is most important for them for what they do, and very few
 (if any) of the major contributors use or care about IDEs
 for their own use. And there's tons to do that has nothing
 to do with IDEs. There are folks who care about it enough
 to work on it, which is why projects such as VisualD exist
 at all, and AFAIK, they work reasonably well, but the only
 two ways that they're going to get more work done on them
 than is currently happening is if the folks who care about
 that sort of thing contribute or if they donate money for
 it to be worked on. Not long ago, the D Foundation
 announced that they were going to use donations to pay
 someone to work on his plugin for Visual Studio Code:

 https://forum.dlang.org/post/rmqvglgccmgoajmhynog forum.dlang.org

 So, if you want stuff like that to get worked on, then
 donate or pitch in.

 The situation with D - both with IDEs and in general - has
 improved greatly over time even if it may not be where you
 want it to be. But if you're ever expecting IDE support to
 be a top priority of many of the contributors, then you're
 going to be sorely disappointed. It's the sort of thing
 that we care about because we care about D being
 successful, but it's not the sort of thing that we see any
 value in whatsoever for ourselves, and selfish as it may
 be, when we spend the time to contribute to D, we're
 generally going to work on the stuff that we see as having
 the most value for getting done what we care about. And
 there's a lot to get done which impacts pretty much every D
 user and not just those who want something that's
 IDE-related.

 - Jonathan M Davis
The complaints I have is exactly why I'm myself maintaining plugins for VSCode, Atom, and others soon. Don't worry, I still think D is worth putting some time and effort into and I know actions generally get more things done than words. I also know that tons of stuff is yet to be done in regards to the actual compilers and such. It just baffles me a bit to see the state of D in this department, when languages like Go or Rust (hooray for yet another comparison to Go and Rust) are a lot younger, but already have what looks like very good tooling. Then again they do have major industry players backing them though...
Why is Go's IDE support baffling? It was a necessity to achieve Google's commercial aims, I should think. " The key point here is our programmers are Googlers, they’re not researchers. They’re typically, fairly young, fresh out of school, probably learned Java, maybe learned C or C++, probably learned Python. They’re not capable of understanding a brilliant language but we want to use them to build good software. So, the language that we give them has to be easy for them to understand and easy to adopt." – Rob Pike I don't know the story of Rust, but if I were working on a project as large as Firefox I guess I would want an IDE too! Whereas it doesn't seem like it's so important to some of D's commercial users because they have a different context. I don't think it's overall baffling that D hasn't got the best IDE support of emerging languages. The people that contribute to it, as Jonathan says, seen to be leas interested in IDEs and no company has found it important enough to pay someone else to work on it. So far anyway but as adoption grows maybe that will change.
It's been a key hurdle for as long as I've been around here. I've been saying for 10 years that no company I've ever worked at can take D seriously without industry standard IDE support. My feeling is that we have recently reached MVP status... that's a huge step, 10 years in the making ;) I think it's now at a point where more people *wouldn't* reject it on contact than those who would. But we need to go much further to make developers genuinely comfortable, and thereby go out of their way to prefer using D than C++ and pitch as such to their managers. Among all developers I've demo-ed or introduced recently, I can say for certain that developer enthusiasm is driven by their perception of the tooling in the order of 10x more than the language.
That's only because you insist on working for companies where people use IDEs and think the ones that don't must be in boring industries :) Kidding aside, would you care to enumerate what capabilities are missing that would tip the balance for such people were they to be there? And then would you care to estimate the degree of work involved in implementing them. For decent and motivated people, how many man years ? Knowing the full scope of a problem is sometimes one step towards solving it. And how would you rate the importance of tooling Vs finishing C++ integration ?
I'm working to move the bar on C++ integration, and Atila, yourself, etc. also seem to have active work on that front... so I'm fairly confident at this stage that that's on course towards where it needs to be. There's a couple of hard problems though that are significant DMD developments.
- Cross-language class hierarchies require that extern(C++) construction semantics mimic C++, which I think is reasonable. There's been plenty of discussion, stakeholders mostly know what to do, but it's a significant job.

I can't estimate the work involved in IDE tooling; I have no experience in that space. Rainer's probably the only person who can comment with authority. For VisualD, which is the only environment that really matters to my industry, I'd suggest an expected feature set would look something like (in no particular order):
- Auto-complete suggestions should be fast and accurate. (Currently, suggestions tend to be incomplete, and there also seems to be a lot of junk in the suggestion list that shouldn't be there)
  * Auto-complete is the _primary_ method of discovering API's that they don't often interact with. They will use the suggestion data to quickly learn the API and move on with their task without a break in productivity.
  * Most corp code is not documented, has no or poor comments, and the expert might be in another office (or gone). Auto-complete suggestions save heaps of time breaking flow and trawling through foreign code.
- Additional support for plugging in cross-compilers (Android, consoles).
  * Rename symbols throughout code.
  * Suggest import module when calling a function or using a type that's not in scope.
  * Cleanup imports; remove imports with no references. (they are the things they will miss)
- Goto definition must *always* work.
- Syntax colouring that's comprehensive and *accurate*; semantically correct. (VS actually sets a low bar on this; practically everyone uses Visual Assist for syntax highlighting, and we should aim for that)
- Performance feels clunky at times. It shouldn't feel sluggish at any time if it wants to inspire confidence.

A lot of these issues lean on general problems with the semantic analysis; it seems that it has trouble with certain constructs and then just stops or gives up. So as soon as code gets complex, or there's some meta creating some indirection to symbols, it's possible that things stop working (goto definition, correct colouring, auto-complete).

My work right now presents a very interesting opportunity; there's lambdas, delegates, GC, etc. But... those guys have, and expect, arguably the very best IDE tooling experience of any language that exists. D could nail it though. If we gave them their tooling, I reckon they'd give us their loyalty :P
Sep 03 2018
prev sibling next sibling parent reply Chris <wendlec tcd.ie> writes:
On Monday, 3 September 2018 at 16:55:10 UTC, Jonathan M Davis 
wrote:
 Most of the work that gets done is the stuff that the folks 
 contributing think is the most important - frequently what is 
 most important for them for what they do, and very few (if any) 
 of the major contributors use or care about IDEs for their own 
 use. And there's tons to do that has nothing to do with IDEs. 
 There are folks who care about it enough to work on it, which 
 is why projects such as VisualD exist at all, and AFAIK, they 
 work reasonably well, but the only two ways that they're going 
 to get more work done on them than is currently happening is if 
 the folks who care about that sort of thing contribute or if 
 they donate money for it to be worked on. Not long ago, the D 
 Foundation announced that they were going to use donations to 
 pay someone to work on his plugin for Visual Studio Code:

 https://forum.dlang.org/post/rmqvglgccmgoajmhynog forum.dlang.org

 So, if you want stuff like that to get worked on, then donate 
 or pitch in.

 The situation with D - both with IDEs and in general - has 
 improved greatly over time even if it may not be where you want 
 it to be. But if you're ever expecting IDE support to be a top 
 priority of many of the contributors, then you're going to be 
 sorely disappointed. It's the sort of thing that we care about 
 because we care about D being successful, but it's not the sort 
 of thing that we see any value in whatsoever for ourselves, and 
 selfish as it may be, when we spend the time to contribute to 
 D, we're generally going to work on the stuff that we see as 
 having the most value for getting done what we care about. And 
 there's a lot to get done which impacts pretty much every D 
 user and not just those who want something that's IDE-related.

 - Jonathan M Davis
Dear Jonathan, you've just said it. There is no real plan, and only problems that someone deems interesting or challenging at a given moment are tackled. If they solve a problem for a lot of users, it's only a side effect. The advent of a D Foundation hasn't changed anything in this regard, and it seems not to be just a financial issue. It's the mentality.

In other words, D is still unreliable, and if that's what the community wants, fine, but instead of promoting it as a substitute for C/C++, Java etc. it should come with a warning label that says "D is in many parts still at an experimental stage and ships with no guarantees whatsoever. Use at your own risk." That would save both the language developers and (potential) users a lot of headaches.

I think this sort of misunderstanding is the source of a lot of friction on this forum. Some users think (or in my case: thought) that D will be a sound and stable language one day, a language they can use for loads of stuff, while the leadership prefers to keep it at a stage where they can test ideas to see what works and what doesn't - wait, let me rephrase this: where the user can test other people's ideas with every new release.
Sep 03 2018
next sibling parent reply Laurent Tréguier <laurent.treguier.sink gmail.com> writes:
On Monday, 3 September 2018 at 18:26:57 UTC, Chris wrote:
 it should come with a warning label that says "D is in many 
 parts still at an experimental stage and ships with no 
 guarantees whatsoever. Use at your own risk."
Well it comes with the Boost license that says: `THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND`
Sep 03 2018
parent reply Chris <wendlec tcd.ie> writes:
On Monday, 3 September 2018 at 18:52:45 UTC, Laurent Tréguier 
wrote:
 On Monday, 3 September 2018 at 18:26:57 UTC, Chris wrote:
 it should come with a warning label that says "D is in many 
 parts still at an experimental stage and ships with no 
 guarantees whatsoever. Use at your own risk."
Well it comes with the Boost license that says: `THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND`
You know exactly what I mean, don't you?
Sep 03 2018
parent Laurent Tréguier <laurent.treguier.sink gmail.com> writes:
On Monday, 3 September 2018 at 22:30:47 UTC, Chris wrote:
 On Monday, 3 September 2018 at 18:52:45 UTC, Laurent Tréguier 
 wrote:
 On Monday, 3 September 2018 at 18:26:57 UTC, Chris wrote:
 it should come with a warning label that says "D is in many 
 parts still at an experimental stage and ships with no 
 guarantees whatsoever. Use at your own risk."
Well it comes with the Boost license that says: `THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND`
You know exactly what I mean, don't you?
I think I know what you mean. But licenses are not decorative. If it says "WITHOUT WARRANTY OF ANY KIND", it means that it actually does come without warranty of any kind.
Sep 03 2018
prev sibling next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Monday, September 3, 2018 12:26:57 PM MDT Chris via Digitalmars-d wrote:
 There is no real plan and
 only problems that someone deems interesting or challenging at a
 given moment are tackled. If they solve a problem for a lot of
 users, it's only a side effect. The advent of a D Foundation
 hasn't changed anything in this regard, and it seems not to be
 just a financial issue. It's the mentality. In other words, D is
 still unreliable, and if that what the community wants, fine, but
 instead of promoting it as a substitute for C/C++, Java etc. it
 should come with a warning label that says "D is in many parts
 still at an experimental stage and ships with no guarantees
 whatsoever. Use at your own risk." That would save both the
 language developers and (potential) users a lot of headaches.

 I think this sort of misunderstanding is the source of a lot of
 friction on this forum. Some users think (or in my case: thought)
 that D will be a sound and stable language one day, a language
 they can use for loads of stuff, while the leadership prefers to
 keep it at a stage where they can test ideas to see what works
 and what doesn't, wait let me rephrase this, where the user can
 test other people's ideas with every new release.
Plenty of people - whole companies included - use D for real projects and products. It is an extremely powerful tool which can be used for real work. Is it as polished as some other languages? Maybe not, but it's plenty stable for real world use. And it's continually improving.

All programming languages and tools are "used at your own risk." They all come with their own sets of pros and cons. If what you want is a language that doesn't change much, then there are plenty of other choices, just like there are plenty of languages that change all the time. Over time D has become more stable, and it doesn't change anywhere near as rapidly as it used to, but if you don't like how it works or is developed, then feel free to go elsewhere. Those of us who stick around find that its pros outweigh its cons. Plenty of folks disagree with us, and they've chosen different languages, which is just fine.

In any case, I have better things to do than argue about whether D is a solid, useful language or not. It's the language that I prefer. I'm going to use it as much as I can, and I'm going to continue to contribute to it. If you don't like where D is, and you don't think that it's worth your time to contribute to it, then that's perfectly fine, but it's a waste of my time to continue to argue about it. I spend too much of my time in this newsgroup as it is, and this sort of argument doesn't contribute anything to improving D.

- Jonathan M Davis
Sep 03 2018
prev sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Monday, 3 September 2018 at 18:26:57 UTC, Chris wrote:

 I think this sort of misunderstanding is the source of a lot of 
 friction on this forum. Some users think (or in my case: 
 thought) that D will be a sound and stable language one day, a 
 language they can use for loads of stuff, while the leadership 
 prefers to keep it at a stage where they can test ideas to see 
 what works and what doesn't, wait let me rephrase this, where 
 the user can test other people's ideas with every new release.
D is not a petri dish for testing ideas. It's not an experiment. It's a serious language. Walter, Andrei, and everyone involved in maintaining and developing it want to see it succeed. They aren't just treading water, wasting everyone's time. And I know you keep hearing this, but I'll say it anyway: most of the development is based on volunteer work, and trying to get volunteers to do anything they don't want to do is like asking them to voluntarily have their teeth pulled when they don't need to. Walter has said that people come to him and ask what they should work on. He provides them with a list of priority tasks. Then they go off and work on something else. That's the nature of unsponsored open-source development and has been the biggest challenge for D for years. I have high hopes that some of this can be turned around by raising more money and I have long-term plans to try and facilitate that over the coming months. With more money, we can get people to work on targeted tasks even when they have no vested interest in it. We can pull in full-time coders, maybe get someone to oversee PR reviews so they don't stay open so long, fund development of broader ecosystem projects. There isn't anyone involved in the core D development who isn't aware of the broader issues in the community or the ecosystem, and they are working to bring improvements. I have been around this community since 2003. From my perspective, it has been one continuous march of progress. Sometimes slow, sometimes fast, but always moving, always getting better. And right now there are more people involved than ever in moving it forward. Unfortunately, there are also more demands on more fronts than ever. 
There are longtime users who become increasingly frustrated when the issues that matter to them still aren't resolved, newcomers who have no context of all the progress that has been made and instead hold D in comparison to Rust, Go, and other languages that have stronger backing and more manpower. That's perfectly legitimate. And of course, low manpower and funding aren't the complete picture. Management also plays a role. Both Walter and Andrei have freely admitted they are not managers and that they're learning as they go. Mistakes have been made. In hindsight, some decisions should have gone a different way. But that is not the same as not caring, or not understanding.

So please, don't attribute any disingenuous motives to any of the core team members. They all want D to succeed. Identifying core problems and discussing ways to solve them is a more productive way to spend our bandwidth.
Sep 03 2018
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 4 September 2018 at 01:36:53 UTC, Mike Parker wrote:
 D is not a petri dish for testing ideas. It's not an experiment.
Well, the general consensus for programming languages is that a language is experimental (or proprietary) until it is fully specced out as a stable formal standard with multiple _independent_ implementations...
Sep 04 2018
prev sibling next sibling parent Chris <wendlec tcd.ie> writes:
On Tuesday, 4 September 2018 at 01:36:53 UTC, Mike Parker wrote:
 On Monday, 3 September 2018 at 18:26:57 UTC, Chris wrote:

 I think this sort of misunderstanding is the source of a lot 
 of friction on this forum. Some users think (or in my case: 
 thought) that D will be a sound and stable language one day, a 
 language they can use for loads of stuff, while the leadership 
 prefers to keep it at a stage where they can test ideas to see 
 what works and what doesn't, wait let me rephrase this, where 
 the user can test other people's ideas with every new release.
First of all, thanks a lot for your answer. I appreciate it. I have to know where I'm standing in order to be able to plan ahead. I don't think this is unreasonable.
 D is not a petri dish for testing ideas. It's not an 
 experiment. It's a serious language. Walter, Andrei, and 
 everyone involved in maintaining and developing it want to see 
 it succeed. They aren't just treading water, wasting everyone's 
 time. And I know you keep hearing this, but I'll say it anyway: 
 most of the development is based on volunteer work, and trying 
 to get volunteers to do anything they don't want to do is like 
 asking them to voluntarily have their teeth pulled when they 
 don't need to.
I have no doubt at all that Walter, Andrei et al are a 100% serious about D, as in "a professional tool" and I do not question their expertise and experience. However, for a bit more than a year I've been under the impression that scarce resources are spent on features and details that are not critical to production when you use D, while more basic things that are sometimes not related to D as such are put on the long finger.
 Walter has said that people come to him and ask what they 
 should work on. He provides them with a list of priority tasks. 
 Then they go off and work on something else. That's the nature 
 of unsponsored open-source development and has been the biggest 
 challenge for D for years.
I can imagine that. This is why volunteers are not the way to go when it comes to core development and the ecosystem. This is why foundations with a lot of funding and IT companies spend a lot of resources on these two aspects.
 I have high hopes that some of this can be turned around by 
 raising more money and I have long-term plans to try and 
 facilitate that over the coming months. With more money, we can 
 get people to work on targeted tasks even when they have no 
 vested interest in it. We can pull in full-time coders, maybe 
 get someone to oversee PR reviews so they don't stay open so 
 long, fund development of broader ecosystem projects.

 There isn't anyone involved in the core D development who isn't 
 aware of the broader issues in the community or the ecosystem, 
 and they are working to bring improvements. I have been around 
 this community since 2003. From my perspective, it has been one 
 continuous march of progress. Sometimes slow, sometimes fast, 
 but always moving, always getting better. And right now there 
 are more people involved than ever in moving it forward.

 Unfortunately, there are also more demands on more fronts than 
 ever. There are longtime users who become increasingly 
 frustrated when the issues that matter to them still aren't 
 resolved, newcomers who have no context of all the progress 
 that has been made and instead hold D in comparison to Rust, 
 Go, and other languages that have stronger backing and more 
 manpower. That's perfectly legitimate.
Which is what I've been talking about in this thread. D is too old to live with its parents :) Too many people already use it in production or are interested in it. There are a) longtime users who still have to put up with OSS style hacks and are growing tired of it (after years of putting in a lot of effort). There are b) new users who are put off by the lack of a smooth ecosystem. And both groups are told that "that's the way we do things around here".
 And of course, low manpower and funding aren't the complete 
 picture. Management also play a role. Both Walter and Andrei 
 have freely admitted they are not managers and that they're 
 learning as they go. Mistakes have been made. In hindsight, 
 some decisions should have gone a different way. But that is 
 not the same as not caring, or not understanding/
Exactly. As I said in an earlier message, it's not just the money (or the lack thereof), it's the approach. In my opinion both Walter and Andrei should hire a manager who is not involved in the core development. If you're involved in the core development you cannot be managing things at the same time and "learn as you go along". That's a recipe for disaster. It's not a good idea to mix management and development because they are two completely different things, and you might end up not being good at either of them. A manager could lay down a practical road map based on what users need _most_, secure funding and direct the funding towards the most urgent issues. Everyone should do what they're good at; only in this way can you get optimal results.
 So please, don't attribute any disingenuous motives to any of 
 the core team members. They all want D to succeed. Identifying 
 core problems and discussing ways to solve them is a more 
 productive way to spend our bandwidth.
Never would I attribute "disingenuous motives" to the members of the core team. But sometimes perceptions are different and one might not see the wood for the trees, so it's good to get input from the outside now and again. The reason I got a bit sarcastic or polemical at times is that I didn't get a real answer (until now), especially at a moment when I'm reassessing my programming options for the future. And I don't seem to be the only one who asks himself "to D or not to D?". Other (new) languages now have many features that made me pick up D in the first place, and they offer a smoother and more comprehensive experience on top of that. I don't think this should be taken lightly by the core team / D Foundation. Don't forget, for various reasons developers are keen on testing new stuff at the moment. I only posted here to raise awareness of the issues, I could have just left silently as many others have. And yes, I have better things to do than to be on this forum, like testing my future options.
Sep 04 2018
prev sibling parent reply TheSixMillionDollarMan <smdm outlook.com> writes:
On Tuesday, 4 September 2018 at 01:36:53 UTC, Mike Parker wrote:
 On Monday, 3 September 2018 at 18:26:57 UTC, Chris wrote:

 And of course, low manpower and funding aren't the complete 
 picture. Management also play a role. Both Walter and Andrei 
 have freely admitted they are not managers and that they're 
 learning as they go. Mistakes have been made. In hindsight, 
 some decisions should have gone a different way. But that is 
 not the same as not caring, or not understanding/

 So please, don't attribute any disingenuous motives to any of 
 the core team members. They all want D to succeed. Identifying 
 core problems and discussing ways to solve them is a more 
 productive way to spend our bandwidth.
I think D's 'core' problem is that it's trying to compete with what are now widely used, powerful, and well supported languages, with sophisticated ecosystems in place already. Then it's also trying to compete with startup languages (Go, Rust ....) - and some of those languages have billion dollar organisations behind them, not to mention the talent levels of their *many* designers and contributors.

C++ is much more than just a language. It's an established, international treaty on what the language must be. Java is backed by Oracle (one of the largest organisations in the world). Go is backed by Google...Rust by Mozilla...(both billion dollar global companies). So one has to wonder what would motivate a person (or an organisation) to focus their attention on D. That is not a statement about the quality of D. It's a statement about the competitive nature of programming languages.

If you've ever read 'No Contest - the case against competition' by Alfie Kohn, then you'd know (or at least you might agree with that statement) that competition is not an inevitable part of human nature. "It warps recreation by turning the playing into a battlefield." I wonder if that has already happened to D. D should, perhaps, focus on being a place for recreation, where one can focus on technical excellence, instead of trying to compete in the battlefield. I just do not see how D can ever defeat its major competitors. Instead D could be a place where those competitors come to look for great ideas (which, as I understand it, does occur .. ranges for example).

In any case, you have to work out what it is that is going to motivate people to focus their attention on D. You seem to be saying that raising money so you can pay people is enough. But I wonder about that.
Sep 04 2018
next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Tuesday, 4 September 2018 at 13:34:03 UTC, 
TheSixMillionDollarMan wrote:
 On Tuesday, 4 September 2018 at 01:36:53 UTC, Mike Parker wrote:
 On Monday, 3 September 2018 at 18:26:57 UTC, Chris wrote:

 And of course, low manpower and funding aren't the complete 
 picture. Management also play a role. Both Walter and Andrei 
 have freely admitted they are not managers and that they're 
 learning as they go. Mistakes have been made. In hindsight, 
 some decisions should have gone a different way. But that is 
 not the same as not caring, or not understanding/

 So please, don't attribute any disingenuous motives to any of 
 the core team members. They all want D to succeed. Identifying 
 core problems and discussing ways to solve them is a more 
 productive way to spend our bandwidth.
I think D's 'core' problem is that it's trying to compete with what are now widely used, powerful, and well supported languages, with sophisticated ecosystems in place already. Then it's also trying to compete with startup languages (Go, Rust ....) - and some of those languages have billion dollar organisations behind them, not to mention the talent levels of their *many* designers and contributors. C++ is much more than just a language. It's an established, international treaty on what the language must be. Java is backed by Oracle (one of the largest organisations in the world). Go is backed by Google...Rust by Mozilla...(both billion dollar global companies). So one has to wonder what would motivate a person (or an organisation) to focus their attention on D. That is not a statement about the quality of D. It's a statement about the competitive nature of programming languages. If you've ever read 'No Contest - the case against competition' by Alfie Kohn, then you'd know (or at least you might agree with that statement) that competition is not an inevitable part of human nature. "It warps recreation by turning the playing into a battlefield." I wonder if that has already happened to D. D should, perhaps, focus on being a place for recreation, where one can focus on technical excellence, instead of trying to compete in the battlefield. I just do not see how D can ever defeat its major competitors. Instead D could be a place where those competitors come to look for great ideas (which, as I understand it, does occur .. ranges for example). In any case, you have to work out what it is that is going to motivate people to focus their attention on D. You seem to be saying that raising money so you can pay people is enough. But I wonder about that.
That's a good question, let me see if I can answer it. Do you know what the first search engine for the web was and when it was created? It wasn't Yahoo, google, or Bing: https://en.m.wikipedia.org/wiki/Web_search_engine#History The first search engines were created in 1993, google came along in 1998 after at least two dozen others in that list, and didn't make a profit till 2001. Some of those early competitors were giant "billion dollar global companies," yet it's google that dominates the web search engine market today. Why is that? Well, for one, resources don't matter for software on the internet as much as ideas. It's not that resources don't matter, but that they take a back seat to your fundamental design and the ideas behind it. And coming up with that design and ideas takes time, the "developmental stage" that Laeeth refers to above. In that incubation stage, you're better off _not_ having a bunch of normal users who want a highly polished product, just a bunch of early adopters who can give you good feedback and are okay with rough edges. For D, that means all the advanced features don't fully play together well yet, and there are various bugs here and there. To use it, you have to be okay with that. Now, it's a fair question to ask when D will leave that developmental stage and get more resources towards that polish, as Chris asks, and I'm not saying I know the answers to those questions. And let me be clear: as long as you don't push the envelope with mixing those advanced D features and are okay working around some bugs here and there, you're probably good now. But simply asserting that others are rushing full-speed ahead with more resources and therefore they will win completely misunderstands how the game has changed online. Resources do matter, but they're not the dominant factor like they used to be for armies or manufacturing. Ideas are now the dominant factor, and D has plenty of those. ;)
Sep 04 2018
next sibling parent Neia Neutuladh <neia ikeran.org> writes:
On Tuesday, 4 September 2018 at 14:23:33 UTC, Joakim wrote:
 The first search engines were created in 1993, google came 
 along in 1998 after at least two dozen others in that list, and 
 didn't make a profit till 2001. Some of those early competitors 
 were giant "billion dollar global companies," yet it's google 
 that dominates the web search engine market today.

 Why is that? Well, for one, resources don't matter for software 
 on the internet as much as ideas. It's not that resources don't 
 matter, but that they take a back seat to your fundamental 
 design and the ideas behind it.
Google had a $100k angel round in 1998 and a $25 million Series A in 1999. The difference between Google and the $12 billion-ish valued Lycos of the time was not insurmountable, yes, but $25 million was enough to hire dozens of developers, lease offices, and buy the hardware they needed. Similarly, we don't need Google-level funding to produce a developer ecosystem that's sufficiently polished not to be a blocker for corporate VS-only types who rely on autocomplete. But we need a bit more than $4k for that, or it's always going to be someone's personal project that's mostly complete but might be abandoned in six months.
Sep 04 2018
prev sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
 The first search engines were created in 1993, google came 
 along in 1998 after at least two dozen others in that list, and 
 didn't make a profit till 2001. Some of those early competitors 
 were giant "billion dollar global companies," yet it's google 
 that dominates the web search engine market today.

 Why is that?
Their original page-rank algorithm. Basically, they found an efficient way of emulating random clicks to all outgoing links from a page and thus got better search result rankings. It was a matter of timing.
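The "emulating random clicks to all outgoing links" idea can be sketched as a power iteration over the link graph. The four-page graph below is purely hypothetical, and this is a minimal sketch of the classic random-surfer formulation, not Google's actual implementation:

```python
def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iters):
        # teleport term: random surfer jumps to any page with prob (1 - damping)
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:  # follow one of the outgoing links uniformly at random
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# hypothetical link graph: every page ends up pointing (directly or not) at "c"
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(links)
# "c" gets the highest rank: three pages link to it; "d" the lowest: none link to it
```

The ranking falls out of the structure of the graph alone, which is the timing advantage the post refers to: no keyword heuristics, just link topology.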
Sep 04 2018
prev sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 4 September 2018 at 13:34:03 UTC, 
TheSixMillionDollarMan wrote:
 I think D's 'core' problem, is that it's trying to compete 
 with, what are now, widely used, powerful, and well supported 
 languages, with sophisticate ecosystems in place already. 

Yes, I believe there was an academic early on that allowed students to use D, but when C++17 (etc) came about he held the view that it would be better for D to align its semantics with C++. He was met with silence, except me, who supported that view. D is too much like C++ for a skilled modern C++ programmer to switch over, but D semantics are also too different to compile to C++ in a useful manner.
 Then it's also trying to compete with startup languages (Go, 
 Rust ....) - and some of those languages have billion dollar 
 organisations behind them, not to mention the talent levels of 
 their *many* designers and contributors.
Ok, so my take on this is that Rust is in the same group as D right now, and I consider it experimental as I am not convinced that it is sufficient for effective low level programming. Although Rust has more momentum, it depends too much on a single entity (with unclear profitability) despite being open sourced, just like D. Too much singular ownership. Go is also open source in theory, but if we put legalities aside then I think it has the traits of a proprietary language. They are building up a solid runtime, and it has massive momentum within services, but the language itself is somewhat primitive and messy. Go could be a decent compilation target for a high level language.

That said, I think most languages don't compete directly with other languages, but compete within specific niches.

Rust: for writing services and command line programs where C++ would have been a natural candidate, but for people who want a higher level language or dislike C++.

Go: for writing services where interpreted languages are expected to be too resource-intensive.

D: based on what seems to be recurring themes in the forums, D seems to be used by independent programmers (personal taste?) and programmers in finance who find interpreted languages too slow and aren't willing to adopt C++.
 C++ is much more than just a langauge. It's an established, 
 international treaty on what the language must be.
Yes, it is an ISO-standard, and evolve using a global standardisation community as input. As such it evolves with the feedback from a wide range of user groups by the nature of the process.
 That is not a statement about the quality of D. It's a 
 statement about the competitive nature of programming languages.
It kinda is both, but the issue is really what you aim to be supporting and what you do to move in that direction. When there is no focus on any particular use case, just language features, then it becomes very difficult to move and difficult to engage people in a way that makes them pull in the same direction.
 I wonder has already happened to D.
No, it mostly comes down to a lack of focus and a process to back it up. Also, memory management should be the first feature to nail down, should come before language semantics...
 I just do not see, how D can even defeat its' major competitors.
But are they really competitors? Is D predominantly used for writing web-services? What is D primarily used for? Fast scripting-style programming?
 Instead D could be a place where those competitors come to look 
 for great ideas (which, as I understand it, does occur .. 
 ranges for example).
No, there are thousands of such languages. Each university has a handful of languages that they create in order to back their comp.sci. research. No need to focus on performance in that setting.
 You seem to be saying that, raising money so you can pay 
 people, is enough.

 But I wonder about that.
There has to be a focus based on analysis of where you can be a lot better for a specific use scenario, define the core goals that will enable something valuable for that scenario, then cut back on secondary ambitions and set up a process to achieve those core goals (pertaining to a concrete usage scenario). Being everything for everybody isn't really a strategy. Unless you are Amazon, and not even then. Without defining a core usage scenario you cannot really evaluate the situation or the process that has to be set up to change the situation... Well, I've said this stuff many times before.
Sep 04 2018
prev sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Monday, 3 September 2018 at 16:55:10 UTC, Jonathan M Davis 
wrote:
 But if you're ever expecting IDE support to be a top priority 
 of many of the contributors, then you're going to be sorely 
 disappointed. It's the sort of thing that we care about because 
 we care about D being successful, but it's not the sort of 
 thing that we see any value in whatsoever for ourselves
Why is that? I've never used an IDE much, but I wonder why you don't and what your impressions are of why many other core D users don't either.
Sep 03 2018
next sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Monday, September 3, 2018 12:55:01 PM MDT Joakim via Digitalmars-d wrote:
 On Monday, 3 September 2018 at 16:55:10 UTC, Jonathan M Davis

 wrote:
 But if you're ever expecting IDE support to be a top priority
 of many of the contributors, then you're going to be sorely
 disappointed. It's the sort of thing that we care about because
 we care about D being successful, but it's not the sort of
 thing that we see any value in whatsoever for ourselves
Why is that? I've never used an IDE much, but I wonder why you don't and what your impressions are of why many other core D users don't either.
Because they can't hold a candle to vim. As far as text editing goes, there simply is no comparison. The same goes for emacs. Most of the other capabilities that a typical IDE has are either geared towards dealing with boilerplate stuff that D tries to avoid or can be made to work in programs like vim or emacs if you want them. Personally, I have pretty much all I need with just vim, grep, and gdb. And I lose out on so much with any IDE that there's no point in even considering using one. vim and emacs (especially vim) have high learning curves, which scares off plenty of programmers, but in general, it seems like most folks who actually take to the time to really learn one of them won't go back to using an IDE if they can help it, because IDEs are very, very poor text editors, and vim and emacs (especially emacs) are far more flexible. Whether your choice is vim or emacs, they both are total powerhouses as code editors, whereas IDEs really are not. This is an answer that I gave to a similar question on SO several years ago: https://stackoverflow.com/questions/2695919/why-do-people-use-command-line-instead-of-ide/2695956#2695956 - Jonathan M Davis
Sep 03 2018
parent User <user tmp.com> writes:
On Monday, 3 September 2018 at 19:31:58 UTC, Jonathan M Davis 
wrote:

 Because they can't hold a candle to vim. As far as text editing 
 goes, there simply is no comparison.
All these arguments, especially the above, make me sad. Maybe this is the nature of open source: volunteers will work only on things that they like, and that may not always be aligned with all the users' needs.

D was born almost two decades ago, when IDEs and tools that make the user experience smooth as defined by current standards didn't exist that freely. The major competition then was C++ and Java. D was a breath of fresh air. It was as fast as C++ and as clean as Java. No wonder many people loved D. Nowadays the programming language landscape is much different. With Go, Rust, etc. the competition is not only catching up but even surpassing D in popularity. I wonder why. I sometimes feel D is still stuck in the previous era.

At least in my experience this smoothness factor has a heavy weight. I abandoned Java's wonderful ecosystem for D's native and fast compilation and fast startup. I wrote D programs in Notepad++ etc. I endured the lack of so many wonderful features of a mature IDE like Eclipse or NetBeans. Now, after 20ish years, a mature and smooth ecosystem is still nowhere in sight. D did find some success with expert programmers, good for them, but I couldn't take it any more. Funnily, I went back to Java. The improvements in the Java language, the JVM, and hardware in general lessened the pain of Java very much. It was amazing how much an easy and smooth experience matters to productivity.

I still keep an eye on D; the ecosystem seems to be getting better, although at a glacial pace. Every time I read a comment like the above, this comes to mind: https://imgs.xkcd.com/comics/supported_features.png
Sep 03 2018
prev sibling next sibling parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 09/03/2018 02:55 PM, Joakim wrote:
 On Monday, 3 September 2018 at 16:55:10 UTC, Jonathan M Davis wrote:
 But if you're ever expecting IDE support to be a top priority of many 
 of the contributors, then you're going to be sorely disappointed. It's 
 the sort of thing that we care about because we care about D being 
 successful, but it's not the sort of thing that we see any value in 
 whatsoever for ourselves
Why is that? I've never used an IDE much, but I wonder why you don't and what your impressions are of why many other core D users don't either.
I used to use them all the time, but it got too frustratingly difficult to find ones that didn't take forever to start up, and didn't lag like crazy while trying to get my work done. Plus, I've done so much development on so many platforms that (even if only at the time) didn't have much in the way of either IDE or debugger support (or good support for *my* current IDE and required me to use *their* IDE), that I just learned how to be productive with basic editor + file manager + command line. With those, I can do pretty much anything I need for just about any platform/language. Whenever I've relied on an IDE, I was constantly dealing with bad support for X in language Z, no support for Y when deving for platform W, trying to do Q was a big series of steps that could NOT be easily automated, etc...It was an endless mess, and trying to add support for XYZ was always a major project in and of itself. And there was ALWAYS something new I needed support for, but couldn't get and didn't have the time to build. It was a series of prisons. Weaning myself off IDEs freed me. I'll tell you, it's REALLY nice being able to get my work done, *improve* my workflow as I see fit(!!), in just about any language I need, for just about any target platform I need, without ever having to whine about "I can't use your tech unless you build better integration with this one particular IDE!" (Sound similar to anything often heard around here? ;)) Plus, plain old non-IDE editors have come a LONG, long way in the last 20 years. (For example, syntax highlighting used to be something you mainly just got with the IDEs. Not anymore!)
Sep 03 2018
parent reply ShadoLight <ettienne.gilbert gmail.com> writes:
On Tuesday, 4 September 2018 at 00:16:16 UTC, Nick Sabalausky 
(Abscissa) wrote:
 On 09/03/2018 02:55 PM, Joakim wrote:
 On Monday, 3 September 2018 at 16:55:10 UTC, Jonathan M Davis 
 wrote:
 But if you're ever expecting IDE support to be a top priority 
 of many of the contributors, then you're going to be sorely 
 disappointed. It's the sort of thing that we care about 
 because we care about D being successful, but it's not the 
 sort of thing that we see any value in whatsoever for 
 ourselves
Why is that? I've never used an IDE much, but I wonder why you don't and what your impressions are of why many other core D users don't either.
I used to use them all the time, but it got too frustratingly difficult to find ones that didn't take forever to start up, and didn't lag like crazy while trying to get my work done. Plus, I've done so much development on so many platforms that (even if only at the time) didn't have much in the way of either IDE or debugger support (or good support for *my* current IDE and required me to use *their* IDE), that I just learned how to be productive with basic editor + file manager + command line. With those, I can do pretty much anything I need for just about any platform/language.

Whenever I've relied on an IDE, I was constantly dealing with bad support for X in language Z, no support for Y when deving for platform W, trying to do Q was a big series of steps that could NOT be easily automated, etc... It was an endless mess, and trying to add support for XYZ was always a major project in and of itself. And there was ALWAYS something new I needed support for, but couldn't get and didn't have the time to build. It was a series of prisons. Weaning myself off IDEs freed me.

I'll tell you, it's REALLY nice being able to get my work done, *improve* my workflow as I see fit(!!), in just about any language I need, for just about any target platform I need, without ever having to whine about "I can't use your tech unless you build better integration with this one particular IDE!" (Sound similar to anything often heard around here? ;)) Plus, plain old non-IDE editors have come a LONG, long way in the last 20 years. (For example, syntax highlighting used to be something you mainly just got with the IDEs. Not anymore!)
I fully understand this. I think most of the IDE users get this. But there is another _fact_of_life_ that some of us (most of us..?) still have to deal with - we don't have a choice in this _at_work_!

We work full-time for employers which, in my case, employs thousands of engineers - and as a result engineering principles are applied to everything - including tools. So all SW dev teams here use standardized tooling/processes/coding standards/etc. - you simply do not have a choice to use your own editor of choice. In my team it is Visual Studio; for other teams (say web-based portals, etc.) it is maybe Java on Eclipse, etc. In my team we often still have to do support/bug fixes/upgrades on legacy projects from people that are not even working here anymore. Most of the time migrating a complex project to your favorite IDE/editor/build system/whatever is simply not an option.

Nick - I suspect you work for yourself or on occasion as a contractor in different environments (and for different customers) - so it is perfectly OK for you to have that view and opinion. But please be aware it is not the full story for others!

I don't get to use D at work so am strictly a D 'hobbyist' at home. So at home I can play with D, and for some other toys (Raspberry Pi!) I can use Sublime Text 3 or cross-compile C++ using Code::Blocks or any other editor of my choice. (I just never made the jump to Vim/Emacs.) But the same choices are not available at work.

I really miss the appreciation of this fact in these incessant 'use Vim/Emacs' answers to people's queries on IDE support in the forum. This is not the reality for many people at work - this article [1] describes the reasons why businesses prefer IDEs quite nicely. Most of my colleagues are not interested in hobby coding at home - they consider their family life separate to their professional lives - and that is perfectly OK. It is their choice. 
But it makes it impossible for people like me to even try to push their managers to "try D" if it does not fit into the workflow/processes that are already followed. For me (like for Manu [2]) this absolutely necessitates that it supports Visual Studio integration _out_of_the_box_!

When I read answers like yours and Jonathan's it always makes me wonder: does D want to cater for the kind of businesses I describe as well? If not, OK - that is a perfectly valid answer and D can, as a consequence, remain the slightly obscure language it has been up to now - used by enthusiasts that are willing to go the extra mile to get stuff done, and can hack around any limitation. That is perfectly fine. But if the D community wants to achieve the big critical-mass breakthrough into mainstream programming with lots of commercial customers, I suspect that Manu's views [2] need to be realized.

I know the "we use Vim/Emacs, why don't you pitch in and help on VisualD if you want to use it" view is a valid opinion, but it will not bring the masses since it will never happen - the critical mass is composed of devs that want to _use_ VS/eclipse/etc - not _develop_ to enable them. Besides, they are not coding at home, and there is very little incentive for said enterprises to assist with this - they see it simply as a cost if D does not offer sufficient benefit - so it will not happen except as an effort inside the D community itself.

[1] https://codecraft.co/2014/05/13/why-you-should-use-an-ide-instead-of-vim-or-emacs/
[2] https://forum.dlang.org/post/mailman.3572.1536035692.29801.digitalmars-d puremagic.com
Sep 04 2018
next sibling parent rjframe <dlang ryanjframe.com> writes:
On Tue, 04 Sep 2018 11:56:54 +0000, ShadoLight wrote:

 I know the "we use Vim/Emacs, why don't you pitch in and help on VisualD
 if you want to use it" view is valid opinion, but it will not bring the
 masses since it will never happen - the critical mass is composed of
 devs that want to _use_ VS/eclipse/etc - not _develop_ to enable them.
 Besides, they are not coding at home,
 and there is very little incentive for said enterprises to assist with
 this - they see it simply as a cost if D does not offer sufficient
 benefit - so it will not happen except as an
 effort inside the D community itself.
I would be surprised if VisualD isn't on the D Foundation's list (see [1]). They just have other things that (right or wrong) are higher priority right now. And Rainer Schuetze has been making major improvements, both to VisualD and to D on Windows in general. It is getting better. [2]

Until there's money behind VisualD's development, this constant back and forth isn't going to change much. People that don't use VisualD aren't likely to work on it, and won't be able to test their work if they do. It really is going to take people who use it and decide it's worth improving.

--Ryan

[1]: https://dlang.org/blog/2018/07/13/funding-code-d/

[2]: I was going to add "just at the pace of one person's spare time," then realized he's actually offering much better support than a well-known vendor I've spent $10,000+ with on one of my major problems...
Sep 04 2018
prev sibling next sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Tuesday, September 4, 2018 5:56:54 AM MDT ShadoLight via Digitalmars-d 
wrote:
 We work full-time for employers which, in my case, employs
 thousands of engineers - and as a result engineering principles
 are applied to everything - including tools. So all SW dev teams
 here use standardized tooling/processes/coding standards/etc. -
 you simply do not have a choice to use your own editor of choice.
[snip]
 I really miss the appreciation of this fact in these incessant
 'use Vim/Emacs' answers to people's queries on IDE support is the
 forum. This is not the reality for many people at work - this
 article [1] describes the reason why businesses prefer IDE's
 quite nicely.

 Most of my colleagues are not interested in hobby coding at home
 - they consider their family life separate to their professional
 lives - and that is perfectly OK. It is their choice. But it
 makes it impossible for people like me to even try to push their
 managers to "try D" if it does not fit into the
 workflow/processes that are already followed. For me (like for
 Manu [2]) this absolutely necessitates that it supports Visual
 Studio integration _out_of_the_box_!

 When I read answers like yours and Jonathan's it always makes me
 wonder: does D want to cater for the kind of businesses I
 describe as well? If not, ok - that is a perfectly valid answer
 and D can, as a consequence remain the slightly obscure language
 it has been up to now - used by enthusiasts that are willing to
 go the extra mile to get stuff done, and can hack around any
 limitation. That is perfectly fine.
[snip]
 I know the "we use Vim/Emacs, why don't you pitch in and help on
 VisualD if you want to use it" view is valid opinion, but it will
 not bring the masses since it will never happen - the critical
 mass is composed of devs that want to _use_ VS/eclipse/etc - not
 _develop_ to enable them. Besides, they are not coding at home,
 and there is very little incentive for said enterprises to assist
 with this - they see it simply as a cost if D does not offer
 sufficient benefit - so it will not
 happen except as an effort inside the D community itself.
Honestly, I don't understand why it would make any sense to require that all of the programmers use a particular code editor. Standardizing the build tools makes perfect sense (in fact, it would be crazy not to), and I've certainly worked at places that have required that a specific tool like visual studio or eclipse be used, because it's used for building, but they've never then disallowed using a tool like vim or emacs for code editing. And if an employer did, I'd almost certainly be looking for a new job (though finding a job that sucks less than your current one is frequently far from easy).

I completely disagree that IDEs are a better tool (at least as long as you're willing to put in the time to actually learn a tool like vim or emacs), but I'm not against someone using an IDE if they want to. And even if I think that it's stupid for a company to say that you must use editor X or IDE Y (regardless of what that program may be), I do think that folks should choose whatever code editor / IDE works best for them. If a whole bunch of folks want to use VisualD, then great, more power to them. I certainly don't agree with their choice, but they're the ones writing their code, not me. I'm not looking to force vim or emacs on anyone any more than I want to be forced to write code in Visual Studio. And much as I think that IDEs are generally inferior, given how many folks insist on using them, I do think that it's good for D to have good IDE support, even if I don't want to ever use it.

That being said, I'm not about to spend my time working on IDE support. While I want D to succeed, I don't spend my time on D doing things geared towards getting more people to use D. I spend my time on things that make D better as a language or which improve its libraries. That should then make it more desirable for folks to use D, and in some cases, it does get rid of obstacles that prevent folks from using D. 
So, I expect that that will help increase D's user base, but my entire focus is on making D better, not on trying to pave the way for others to use it - especially if it's an issue like you describe where an employer is really picky about the tools that programmers use simply to edit code. I honestly wouldn't expect a company like that to be interested in D anyway. So, I'm definitely not going to spend my free time on things aimed at making them happy, much as I sympathize with your situation.

Personally, I'm only able to work in D now because I work as a contractor. It was a lost cause to get D into any of my previous workplaces. They just weren't the kind of places where that was going to fly, regardless of the current state of D. Most companies and most programmers are not looking for a better programming language to use, and the reasons that they pick a particular language often have little to do with how good a language is.

But ultimately, regardless of the reasons why someone might want to use an IDE, if the current state of IDEs for D is not where it needs to be for them to use D with an IDE, and the amount of effort currently being put into improving IDEs for D is not going to get those IDEs to that point soon, then someone who actually cares about the issue is going to need to either pitch in or donate so that someone else can be paid to work on it; otherwise IDE support isn't going to improve enough. Regardless of the perceived value in IDE support, there are just too many other things that need doing for most of us to want to put our free time into working on an issue that isn't going to benefit us. 
And while a number of us do do at least some work on D-related stuff that we don't care about aside from wanting to improve the D ecosystem for others, when the vast majority of the time being put in is on a volunteer basis, the reality of the matter is that most of the effort is going to go towards things that those contributing care about, and not what the community at large might care about or what folks who may join the community might care about.

That's just how it goes with open source and is part of why it can be critical for individuals or companies to donate money towards improving aspects of a project that no one wants to work on. And that's part of why it can be a big deal for there to be a big company backing a project. While it can be frustrating for someone to be told that they need to either pitch in or donate to get something that they want done, if it's something that isn't a priority for those who are spending their free time to do the work, it's often the cold reality that that thing isn't going to get done any time soon.

- Jonathan M Davis
Sep 04 2018
parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 09/04/2018 04:00 PM, Jonathan M Davis wrote:
 On Tuesday, September 4, 2018 5:56:54 AM MDT ShadoLight via Digitalmars-d
 wrote:
 We work full-time for employers which, in my case, employs
 thousands of engineers - and as a result engineering principles
 are applied to everything - including tools. So all SW dev teams
 here use standardized tooling/processes/coding standards/etc. -
 you simply do not have a choice to use your own editor of choice.
Honestly, I don't understand why it would make any sense to require that all of the programmers use a particular code editor. Standardizing the build tools makes perfect sense (in fact, it would be crazy not to), and I've certainly worked at places that have required that a specific tool like visual studio or eclipse be used, because it's used for building, but they've never then disallowed using a tool like vim or emacs for code editing. And if an employer did, I'd almost certainly be looking for a new job (though finding a job that sucks less than your current one is frequently far from easy).
Yes, exactly. Out of all the actual employment-based jobs I've had writing code, not a single one would've cared what editor I was using, as long as I was getting my work done and not causing problems.
Sep 04 2018
parent reply ShadoLight <ettienne.gilbert gmail.com> writes:
On Tuesday, 4 September 2018 at 22:38:08 UTC, Nick Sabalausky 
(Abscissa) wrote:
 On 09/04/2018 04:00 PM, Jonathan M Davis wrote:
 On Tuesday, September 4, 2018 5:56:54 AM MDT ShadoLight via 
 Digitalmars-d
 wrote:
 We work full-time for employers which, in my case, employs
 thousands of engineers - and as a result engineering 
 principles
 are applied to everything - including tools. So all SW dev 
 teams
 here use standardized tooling/processes/coding standards/etc. 
 -
 you simply do not have a choice to use your own editor of 
 choice.
Honestly, I don't understand why it would make any sense to require that all of the programmers use a particular code editor. Standardizing the build tools makes perfect sense (in fact, it would be crazy not to), and I've certainly worked at places that have required that a specific tool like visual studio or eclipse be used, because it's used for building, but they've never then disallowed using a tool like vim or emacs for code editing. And if an employer did, I'd almost certainly be looking for a new job (though finding a job that sucks less than your current one is frequently far from easy).
Yes, exactly. Out of all the actual employment-based jobs I've had writing code, not a single one would've cared what editor I was using, as long as I was getting my work done and not causing problems.
I in fact share your (and Jonathan's) view on this i.t.o. editing. But it is not as simple as you make it sound - the fly in the ointment is often debugging, not coding.

For example, all devs in my team, besides participating in 1 or 2 projects' _development_ at any one time, also have to do support: most of this entails fixing some reported issue in applications in production that you (often) did not write or even contribute to (we have a lot of legacy projects still in production - a few go as far back as VC++ 6 solutions!). Let's say it is a legacy VS2008 C++ solution (1x EXE project with many (often 10+) DLL projects). Often with these kinds of projects the EXE/DLL/LIB output is directed to a specific folder (where the necessary INI/CFG files, test files, etc. are all present) as well as scripts to update the registry, etc. - i.e. to set up everything to allow testing.

It is quite quick (and VS is normally very reliable with this) to import the legacy VS2008 solution into VS2015, rebuild the whole thing, set some breakpoints and start debugging (and I echo Manu's sentiments here - VS has the best integrated debugger in the business, bar none!). Find the bug. Fix the bug. Run unit/CI tests. (All from within VS.) Update the release note. Release the new updated version to Deployment. Check the fixed sources into version control. Move on to the next bug or back to normal dev...

Are you really telling me you are going to port each of these VS solutions with all the project details into the equivalent Vim/Emacs structure for each and every project you have to fix? Every time? The actual coding part (fixing the bug) most of the time takes much less time than the rest of this process. Whether you believe your Vim/Emacs can be tuned with the necessary plugins to achieve the same result is actually irrelevant here - the question is whether you can fix each bug as quickly as I can by just staying in the original VS ecosystem. 
It is also not a question of whether your employer/organization allows you to code in your editor of choice (we are in fact allowed to do this - I frequently use (and love!) Notepad++, and I even have VisualD installed at work to toy with D over lunch). The real question is whether your organization has/needs a standard for project solutions under source control. So in our case, irrespective of what you use to develop the project, under source control it has to be a VS solution - because that is the standard dev tool in the team.

It is simply not viable to imagine that everyone can check out and convert each project solution (be it VS/eclipse/whatever) into their own favorite editor equivalent (mostly through a host of different plugins, with no standard way in sight!), and then have to update the (let's imagine VS) solution under source control with the Vim-equivalent solution (for future Vim users), just for the next guy to add the Sublime way.

As much as I too would have liked unfettered freedom to do things "my way", I can fully appreciate why there needs to be a "standard way" (at least in serious engineering organizations where multiple developers can and do work on the same code bases). Note that I do not think my organization "sucks" because of this requirement, or that it necessitates finding another job "if you have an ounce of dev self-respect left"! (I know, I exaggerate a bit here, but you get the drift!) I think you should consider yourself quite privileged if you can dictate the choice of all the tools you use - this is obviously easier for new projects, or if you work alone on a project, or if other developers will not need to access/change/fix your code base.

Also note that this is not much different to why, even for D, there is still an insistence that contributors follow D's coding standards. Both are intended simply to make it easier for multiple devs/contributors to code/fix/contribute. It is the same principle.
Sep 05 2018
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Wednesday, September 5, 2018 6:35:59 AM MDT ShadoLight via Digitalmars-d 
wrote:
 On Tuesday, 4 September 2018 at 22:38:08 UTC, Nick Sabalausky

 (Abscissa) wrote:
 On 09/04/2018 04:00 PM, Jonathan M Davis wrote:
 On Tuesday, September 4, 2018 5:56:54 AM MDT ShadoLight via
 Digitalmars-d

 wrote:
 We work full-time for employers which, in my case, employs
 thousands of engineers - and as a result engineering
 principles
 are applied to everything - including tools. So all SW dev
 teams
 here use standardized tooling/processes/coding standards/etc.
 -
 you simply do not have a choice to use your own editor of
 choice.
Honestly, I don't understand why it would make any sense to require that all of the programmers use a particular code editor. Standardizing the build tools makes perfect sense (in fact, it would be crazy not to), and I've certainly worked at places that have required that a specific tool like visual studio or eclipse be used, because it's used for building, but they've never then disallowed using a tool like vim or emacs for code editing. And if an employer did, I'd almost certainly be looking for a new job (though finding a job that sucks less than your current one is frequently far from easy).
Yes, exactly. Out of all the actual employment-based jobs I've had writing code, not a single one would've cared what editor I was using, as long as I was getting my work done and not causing problems.
 I in fact share your (and Jonathan's) view on this i.t.o. editing. But it is not as simple as you make it sound - the fly in the ointment is often debugging, not coding.
[snip]
 Are you really telling me you are going to port each of these VS
 Solutions with all the project details into the equivalent
 Vim/Emacs structure just for each and every of the projects you
 have to fix? Every time? The actual coding part (fixing the bug)
 actually most of the time takes much less time than the rest of
 this process.
Except that you don't have projects or solutions with something like vim or emacs. There is no structure specific to them. You can set them up to do the build from inside them, and with emacs, you can run gdb inside it if you're on an appropriate platform, but you're not going to have a "vim" project or an "emacs" project. That whole concept is an IDE thing. They edit files, and they can do that perfectly fine regardless of what's being used to run the build or whatever other tools are necessary for the development process.

If I'm in a situation like you describe, then I usually set it up so that I can just run the build and tests from the command line and not even bother opening up Visual Studio. VS projects actually have a way to do that. You don't actually have to open up VS to do any building. And if I really need to open up VS to run the debugger, then I'll do that, but I won't use VS for anything that I don't have to use it for. And in my experience, the debugger is pretty much the only thing that would typically require actually opening up VS.

There is no reason to muck with the build process or source control stuff in order to use vim or emacs. That stuff can pretty much always be done from the command line using all of the standard tools that everyone else is using. Just because most developers would use the IDE to run the build doesn't mean that it's actually required for it. If it were, then stuff like automated builds wouldn't be possible.

Regardless, I use vim for editing code. And if I'm actually forced to have an IDE like VS or Eclipse open because of some tool that has to be run from inside for some reason (which aside from the debugger is rarely the case), then I'll have the IDE open for whatever it has to be open for. But I don't use the IDE for editing code, because that would be a horribly inefficient way to go about it.

- Jonathan M Davis
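To illustrate what "run the build and tests from the command line" looks like for a VS solution (the solution name, platform, and test-assembly path below are placeholders, not from any real project), MSBuild can drive the whole thing from a Developer Command Prompt:

```shell
REM Rebuild the entire solution without ever opening the IDE
REM (MySolution.sln and /p:Platform value are placeholders)
msbuild MySolution.sln /t:Rebuild /p:Configuration=Debug /p:Platform=x64

REM Run the solution's unit tests from the command line too
REM (test DLL path is a placeholder)
vstest.console.exe Tests\bin\Debug\MyTests.dll
```

These are the same entry points an automated build server would use, which is why the IDE itself is never a hard requirement for building.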
Sep 05 2018
next sibling parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
 Except that you don't have projects or solutions with something 
 like vim or emacs. There is no structure specific to them. You 
 can set them up to do the build from inside them, and with 
 emacs, you can run gdb inside it if you're on an appropriate 
 platform, but you're not going to have a "vim" project or an 
 "emacs" project. That whole concept is an IDE thing. They edit 
 files, and they can do that perfectly fine regardless of what's 
 being used to run the build or whatever other tools are 
 necessary for the development process.

 If I'm in a situation like you describe, then I usually set it 
 up so that I can just run the build and tests from the command 
 line and not even bother opening up Visual Studio. VS projects 
 actually have a way to do that. You don't actually have to open 
 up VS to do any building. And if I really need to open up VS to 
 run the debugger, then I'll do that, but I won't use VS for 
 anything that I don't have to use it for. And in my experience, 
 the debugger is pretty much the only thing that would typically 
 require actually opening up VS.

 There is no reason to muck with the build process or source 
 control stuff in order to use vim or emacs. That stuff can 
 pretty much always be done from the command-line using all of 
 the standard tools that everyone else is using. Just because 
 most developers would use the IDE to run the build doesn't mean 
 that it's actually required for it. If it were, then stuff like 
 automated builds wouldn't be possible.

 Regardless, I use vim for editing code. And if I'm actually 
 forced to have an IDE like VS or Eclipse open because of some 
 tool that has to be run from inside for some reason (which 
 aside from the debugger is rarely the case), then I'll have the 
 IDE open for whatever it has to be open for. But I don't use 
 the IDE for editing code, because that would be a horribly 
 inefficient way to go about it.

 - Jonathan M Davis
+1

What must be absolutely standardized is what is *shared* across the members of the team (code presentation, tabs, naming conventions, build process, versioning, test and deployment procedures, etc. etc.). But as long as the coding standard is followed, obviously any code editor should be fine if it makes you more productive.

For instance, even for contract work, I use Geany for all my developments. And a portable IDE like Geany is especially useful when developing *crossplatform* C++ multimedia applications which must be edited and tested both on Windows, macOS and Linux. It is the perfect companion to cmake, behaving exactly the same whatever the platform (editing, find and replace, find in files, macros, settings, etc). And indeed you can still open your project in Visual Studio when you need to use a Windows debugger.

Personally I use Geany even for Unity game development, as Unity allows you to define which editor should be used to open the code when double clicking on an error message. Geany is great for that too, as it often opens much faster than other IDEs...

So my point is, as long as all the shared team standard procedures are respected, I don't see why any company should decide which code editor *must* be used by all its developers...
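That cmake-centric workflow can be sketched with a few commands (assuming a project with a CMakeLists.txt at its root; the directory names are just conventions, not from any specific project) - the exact same invocations work on Windows, macOS and Linux:

```shell
# Configure an out-of-source build; cmake selects the native toolchain
# (Visual Studio, Xcode, or Makefiles) for the current platform
cmake -S . -B build -DCMAKE_BUILD_TYPE=Release

# Compile with the same command everywhere, regardless of the generator
cmake --build build

# Run the project's tests (the --test-dir flag needs CMake 3.20+;
# on older versions: cd build && ctest)
ctest --test-dir build
```

On Windows, the generated .sln can still be opened in Visual Studio whenever the debugger is needed, without disturbing the command-line workflow.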
Sep 05 2018
parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 09/05/2018 01:05 PM, Ecstatic Coder wrote:
 
 For instance, even for contract work, I use Geany for all my developments.
 
 And a portable IDE like Geany is especially useful when developing 
 *crossplatform* C++ multimedia applications which must be edited and 
 tested both on Windows, MacOS and Linux.
Man, I wish SOO much that was true of my favorite editor (Programmer's Notepad 2). I love it, but it's a Windows thing and has some issues under wine. :( Closest I've found since I moved to Linux is a custom-configured KDevelop, but it's still just not as good. :(

Heck, maybe I'll give Geany a go...
 Personally I use Geany even for Unity game development, as Unity allows 
 you to define which editor should be used to open the 
 code when double clicking onto an error message.
I didn't know Unity could do that! I've just been manually going to the file/line of the error. I'll have to check into that!

(Unity's relative unfriendliness to non-Unity-oriented workflows has always been one of my biggest pain points with it. Most normal IDEs aren't as problematic as Unity in those ways. Even Unity's own team had to go to some significant lengths just to make automated builds possible, and even then, I'm not sure you can use just any off-the-shelf CI system. I really hate vertical integration.)
Sep 26 2018
parent reply Shachar Shemesh <shachar weka.io> writes:
On 27/09/18 04:54, Nick Sabalausky (Abscissa) wrote:
 Man, I wish SOO much, that was true of my favorite editor (Programmer's 
 Notepad 2). I love it, but it's a windows thing and has some issues 
 under wine.
Can you elaborate on what issues? Merely downloading and installing seem to work fine.
Sep 26 2018
parent "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 09/26/2018 10:33 PM, Shachar Shemesh wrote:
 On 27/09/18 04:54, Nick Sabalausky (Abscissa) wrote:
 Man, I wish SOO much, that was true of my favorite editor 
 (Programmer's Notepad 2). I love it, but it's a windows thing and has 
 some issues under wine.
Can you elaborate on what issues? Merely downloading and installing seem to work fine.
The main big one was that the option to use MRU order for Ctrl-Tab file-switching doesn't work. Instead, it ends up being some kind of semi-random order. I know that doesn't sound like a big deal, but it turned out to be a HUGE drain on both my mental focus and my productivity. (I habitually rely *very* heavily on Ctrl-Tabbing between two main documents I'm focusing on - or occasionally three. And the MRU behavior is very deeply ingrained in my muscle memory. Technically, I could save-and-close all other documents, but that also messes with my workflow and mental processes.) I love the program so much that I tried to work through it (and filed bug reports, etc.), but ultimately it proved to be too much of a problem.

Another, smaller, issue was that it doesn't start up nearly as fast under wine (still WAY faster than most IDEs though). That's admittedly very minor, and not a deal-breaker, although its near-instant startup was one of the things I loved about it back on Windows.

I seem to remember there being one other quirk I ran into (maybe related to find or replace?), but it's been awhile so I don't remember exactly what that was.
Sep 26 2018
prev sibling parent reply ShadoLight <ettienne.gilbert gmail.com> writes:
On Wednesday, 5 September 2018 at 13:11:18 UTC, Jonathan M Davis 
wrote:

[snip]

 Except that you don't have projects or solutions with something 
 like vim or emacs. There is no structure specific to them. You 
 can set them up to do the build from inside them, and with 
 emacs, you can run gdb inside it if you're on an appropriate 
 platform, but you're not going to have a "vim" project or an 
 "emacs" project. That whole concept is an IDE thing.
True, which is the reason I was referring to the "equivalent Vim/Emacs structure" provided (possibly) by plugins in Vim [1] and Emacs [2] to manage projects/solutions. It anyway appears that Vim/Emacs are often extended by plugins, and this will be the only way to have some project management features.

[1] https://stackoverflow.com/questions/1119585/vim-is-there-an-easy-way-to-manage-visual-studio-solutions-makefile-projects
[2] https://github.com/bbatsov/projectile

I maintain that it is not practical to try to duplicate this in your editor of choice unless the amount of time you will save (from increased productivity) exceeds the time taken to do it, and that for bug fixing/support in a big organization this will hardly ever be the case. But even if you avoid this step and can build/run/test from the command line, it may not be optimal in certain debugging scenarios. See next point.
 If I'm in a situation like you describe, then I usually set it 
 up so that I can just run the build and tests from the command 
 line and not even bother opening up Visual Studio. VS projects 
 actually have a way to do that. You don't actually have to open 
 up VS to do any building. And if I really need to open up VS to 
 run the debugger, then I'll do that, but I won't use VS for 
 anything that I don't have to use it for. And in my experience, 
 the debugger is pretty much the only thing that would typically 
 require actually opening up VS.
Right, but depending on your type of debugging there are some things that just make more sense to do from right inside the debugger. If you hit a data-value breakpoint or the like in an attached debugger, you can just double-click the line in the stack trace to go to the corresponding line in the IDE editor. No need to switch tasks to Vim/Emacs and do a go-to or whatever to get to the same place. The type of debugging I'm talking about is not your 'single step' variety. I sometimes wonder if the Vim/Emacs 'aficionados' spend so much time mastering their editors (which by all accounts have a steep learning curve), that they forgot that IDE development did not stagnate after they left!
 There is no reason to muck with the build process or source 
 control stuff in order to use vim or emacs. That stuff can 
 pretty much always be done from the command-line using all of 
 the standard tools that everyone else is using.
Agreed.
 Regardless, I use vim for editing code. And if I'm actually 
 forced to have an IDE like VS or Eclipse open because of some 
 tool that has to be run from inside for some reason (which 
 aside from the debugger is rarely the case), then I'll have the 
 IDE open for whatever it has to be open for. But I don't use 
 the IDE for editing code, because that would be a horribly 
 inefficient way to do go about it.
Again, it depends on what you mean by 'editing'. If you are referring to coding where you are developing from scratch, then sure - I agree. You will be doing a lot of coding before building the 1st time. And then the build will fail for the 1st few times with initial bugs. And then (unit/CI) testing will show up some more bugs, which will necessitate more changes/fixes. Rinse and repeat until the project is finally delivered. In this phase the editor of your choice is really nice/handy (if you can handle everything from the command line). I suspect this is your (and Nick's) primary use-case. And it is maybe the primary use case for the majority of D contributors. But the whole point of my post was to point out that this is not the only use-case for some of us. And in some of these other use-cases IDEs are actually superior to editors. For another example, IDEs are also in some ways a 'standard' inside big organizations in a way that any editor cannot be - the lowest barrier of entry to get new members up to speed in a team. And for some languages the appeal/power of the languages is in many ways directly related to the IDE and the style of development it enables... I'm sure you get my drift!
Sep 05 2018
next sibling parent reply Atila Neves <atila.neves gmail.com> writes:
On Wednesday, 5 September 2018 at 17:34:17 UTC, ShadoLight wrote:
 On Wednesday, 5 September 2018 at 13:11:18 UTC, Jonathan M 
 Davis wrote:

 It anyway appears that Vim/Emacs are often extended by plugins, 
 and this will be the only way to have some project manage 
 features.
I'm an Emacs user. I have never needed project management features. If I want to edit a new file, I do that. You might be confusing "project management" with a build system. I'm not sure, but then I just use a build system such as CMake.
 I maintain that it is not practical trying to duplicate this in 
 your editor of choice except if the amount of time you will 
 save (from increased productivity) exceed the time taken to do 
 this. I maintain that for bug fixing/support in a big 
 organization this will hardly ever be the case.
True, but why would anyone want to duplicate it? The only reason I can think of is if the team is using Visual Studio and the .sln file is the agreed-upon build system. I know this happens in real life, but it shouldn't. And even then... open VS, add a file, go back to editing in Emacs/vim/whathaveyou. Or edit the XML directly.
 But even if you avoid this step and can build/run/test from the 
 command-line it may not be optimal in certain debugging 
 scenarios. See next point.
You don't have to build/run/test from the command-line, you can do it in-editor.
 Right, but depending on your type of debugging there is some 
 things which just make more sense to do from right inside the 
 debugger. If you hit a data value break-point or such on an 
 attached debugger you can just double-click the line in the 
 stack trace to go to the appropriate line in the IDE editor. No 
 need to switch tasks to Vim/Emacs, do a go-to or whatever to 
 get to the same place. The type of debugging I'm talking about 
 is not your 'single step' variety.
No need to switch tasks to Emacs either, just run the debugger in Emacs and you can double-click if you want to. Although, if you're an Emacs user you're probably not going to want to use the mouse.
 I sometimes wonder if the Vim/Emacs 'aficionados' spend so 
 much time mastering their editors (which by all accounts have a 
 steep learning curve), that they forgot that IDE development 
 did not stagnate after they left!
It's not a question of forgetting what IDEs can do. It's a question of either not needing those features or having them in the editor. I've used Visual Studio, Eclipse, IDEA, etc. I just don't like them. This is what I need from an IDE: autocompletion, go to definition, on-the-fly syntax checking. I have all of that in Emacs.
 Again, it depends on what you mean by 'editing'.
I think he means... editing. Cutting, pasting, replacing, that kind of thing.
 If you are referring to coding where you are developing from 
 scratch, then sure - I agree.
That's not editing, that's writing. In that case, notepad is enough, or cat. There's a reason why vim's normal mode is about editing, not writing (inserting).
 But the whole point of my post was to point out that this is 
 not the only use-case for some of us. And in some of these 
 other use-cases IDEs are actually superior to editors.
That's your opinion, you're entitled to it, and I'm not going to try to change your mind. Mine is that no IDE gets close to the power of a good editor. In your favourite IDE, can you set up any key combination you want to:

1. Jump to the end of the current line
2. Check to see if there's a semicolon there
3. If not, add one
4. Open a new line beneath

No? I don't learn how to use Emacs, Emacs learns *me*. And that was just a simple example.
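For illustration, the four steps above can be sketched in Emacs Lisp (the function name and the C-c ; binding are made up for this example):

```elisp
;; A minimal sketch of the four steps, bound to a key of your choosing.
(defun ensure-semicolon-then-open-line ()
  "Jump to end of line, add a `;' if one is missing, open a line below."
  (interactive)
  (end-of-line)                   ; 1. jump to the end of the current line
  (unless (eq (char-before) ?\;)  ; 2. check for a semicolon there
    (insert ";"))                 ; 3. if not, add one
  (newline-and-indent))           ; 4. open a new line beneath

;; Any key combination you want:
(global-set-key (kbd "C-c ;") #'ensure-semicolon-then-open-line)
```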
 For another example IDEs are also in some ways a 'standard' 
 inside big organizations in a way that any editor cannot be - 
 the lowest barrier of entry to get new members up to speed in a 
 team. And for some languages the appeal/power of the languages 
 is in many ways directly related to the IDE and the style of 
 development it enables... I'm sure you get my drift!
Most of what I'd need an IDE for in Java (I'd probably use IDEA if I were to write Java) I don't need for D.
Sep 12 2018
parent Manu <turkeyman gmail.com> writes:
On Wed, 12 Sep 2018 at 04:45, Atila Neves via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Wednesday, 5 September 2018 at 17:34:17 UTC, ShadoLight wrote:
 On Wednesday, 5 September 2018 at 13:11:18 UTC, Jonathan M
 Davis wrote:

 It anyway appears that Vim/Emacs are often extended by plugins,
 and this will be the only way to have some project manage
 features.
I'm an Emacs user. I have never needed project management features. If I want to edit a new file, I do that. You might be confusing "project management" with a build system. I'm not sure, but then I just use a build system such as CMake.
 I maintain that it is not practical trying to duplicate this in
 your editor of choice except if the amount of time you will
 save (from increased productivity) exceed the time taken to do
 this. I maintain that for bug fixing/support in a big
 organization this will hardly ever be the case.
True, but why would anyone want to duplicate it? The only reason I can think of is if the team is using Visual Studio and the .sln file is the agreed-upon build system. I know this happens in real life, but it shouldn't. And even then... open VS, add a file, go back to editing in Emacs/vim/whathaveyou. Or edit the XML directly.
Or use globs in the XML.
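For the record, MSBuild project files do accept wildcard includes, so files added on disk get picked up without touching Visual Studio. A sketch for a C++ .vcxproj (the paths are made up; note that some Visual Studio versions expand these globs back into explicit entries if you later edit the project through the IDE):

```xml
<!-- Pull in every .cpp/.h under src\ via MSBuild wildcards. -->
<ItemGroup>
  <ClCompile Include="src\**\*.cpp" />
  <ClInclude Include="src\**\*.h" />
</ItemGroup>
```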
Sep 12 2018
prev sibling parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 09/05/2018 01:34 PM, ShadoLight wrote:
 
 I sometimes wonder if the Vim/Emacs 'aficionados' spend so much time 
 mastering their editors (which by all accounts have a steep learning 
 curve), that they forgot that IDE development did not stagnate after 
 they left!
I sometimes wonder similar things about Vim/Emacs users, too ;) But don't forget, not all non-IDE people are Vim/Emacs users. And just like IDE development, plain-editor development didn't stagnate either. Many non-IDE users (like me) use editors that are far more contemporary than Vim/Emacs and *don't* have that learning curve. And for that matter, sometimes I get the impression that IDE users think non-IDE editors are far less capable than they really are. For the most part, "IDE" just means: editor + GUI-based buildsystem + debugger.
 If you are referring to 
 coding where you are developing from scratch, then sure - I agree. You 
 will be doing a lot of coding before building the 1st time. 
No offence, but if that's how someone's developing a new project, then they're doing things very, VERY wrong. *Always* start a new project with some kind of "Hello world" or some such which builds and runs *right from the start*, and then grow it from there. I'm speaking from decades of experience doing things BOTH ways. Ultimately, any time you do a large amount of coding (writing and/or editing) in between working builds (no matter if it's the beginning or middle of development) then you're just asking for problems.
 For another example IDEs are also in some ways a 'standard' inside big 
 organizations in a way that any editor cannot be - the lowest barrier of 
 entry to get new members up to speed in a team. And for some languages 
 the appeal/power of the languages is in many ways 
 directly related to the IDE!
I did a lot of Java myself back in the day (and I still don't really hate it or anything). But speaking from experience here: It's not so much that the IDEs are a great feature of those languages, it's more like (especially with Java) the IDEs are used as a crutch to help mitigate major faults in the languages. But that said, there are examples of IDEs that really do provide a genuine benefit beyond mitigating language problems. These tend to be domain-specific to at least some extent. Some examples that come to mind are the old "RAD"-style tools for GUI apps (like Delphi and VB6). Or Unity3D for either games or Flash-like multimedia.
Sep 26 2018
parent Neia Neutuladh <neia ikeran.org> writes:
On 09/26/2018 08:23 PM, Nick Sabalausky (Abscissa) wrote:
 On 09/05/2018 01:34 PM, ShadoLight wrote:
 I sometimes wonder if the Vim/Emacs 'aficionados' spend so much time 
 mastering their editors (which by all accounts have a steep learning 
 curve), that they forgot that IDE development did not stagnate after 
 they left!
I sometimes wonder similar things about Vim/Emacs users, too ;)
A lot of people use Vim/Emacs plus a full IDE. I use IntelliJ for work. I also use Vim. Vim is much better when I know my APIs, and it's exceptional at applying transformations to a block of text. IntelliJ is much better when I'm using an API I'm unfamiliar with. Sometimes I'll switch back and forth editing the same file -- I'll hack something together in Vim and then use IntelliJ to quickly find and fix errors. For D, unfortunately, I haven't gotten an IDE to work yet. Not with any appreciable degree of autocomplete. So I stick with Vim pretty much entirely.
 But don't forget, not all non-IDE people are Vim/Emacs. And just like 
 IDE development, plain-editor development didn't stagnate either. Many 
 non-IDE users (like me) use editors that are far more contemporary than 
 Vim/Emacs and *don't* have that learning curve.
Pretty much all advanced features in a text editor have a learning curve. Kind of unavoidable; we're asking text editors to do complex things. GUI editors can offer *less* of a learning curve, and they can offer advice better, but they can't eliminate it entirely.
 And for that matter, sometimes I get the impression that IDE users think 
 non-IDE editors are far less capable than they really are. For the most 
 part, "IDE" mostly just means: editor + GUI-based buildsystem + debugger.
Autocomplete, highlighting errors, semantic code navigation, and displaying extra semantic information are other IDE features that text editors tend to lack. On the other hand, I've seen projects billing themselves as IDEs when they were pretty much just a tree view for files in the project, a GtkSourceView, and a build button.
Sep 26 2018
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Sep 04, 2018 at 02:00:37PM -0600, Jonathan M Davis via Digitalmars-d
wrote:
[...]
 And while a number of us do do at least some work on D-related stuff
 that we don't care about aside from wanting to improve the D ecosystem
 for others, when the vast majority of the time being put in is on a
 volunteer basis, the reality of the matter is that most of the effort
 is going to go towards things that those contributing care about and
 not what the community at large might care about or what folks who may
 join the community might care about.
[...]
 While it can be frustrating for someone to be told that they need to
 either pitch in or donate to get something that they want done, if
 it's something that isn't a priority for those who are spending their
 free time to do the work, it's often the cold reality that that 
 thing isn't going to get done any time soon.
[...] Yes, that's just the cold hard reality. Demanding something in the forums rarely has the desired effect of making that thing happen. In fact, it may have the opposite effect of turning off would-be volunteers, because they get tired of hearing said demands and decide to just ignore them. Similarly, getting mad at the current state of things, while it may be a good way to vent one's frustrations, rarely results in any actual change in the status quo. Realistically speaking, there are really only two ways to make a change happen: (1) Do it yourself, then (optionally) contribute the code to the community so that everyone else can reap the benefits; or (2) Convince someone to do it for you -- which usually means pay them to do the work, which can be hiring someone yourself to do it, or donating to the D Foundation so that they can pay someone to do it. Demanding that volunteers do something they aren't really interested in rarely has the desired effect. One could argue that this state of things sucks, and I might even agree. But that still won't change anything. Like it or not, nothing is going to happen until either (1) or (2) happens. Things aren't going to materialize out of thin air just because people demand it loudly enough, even if we'd like for that to happen. *Somebody* has to do the work. That's just how the universe works. T -- Those who've learned LaTeX swear by it. Those who are learning LaTeX swear at it. -- Pete Bleackley
Sep 04 2018
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 4 September 2018 at 20:45:53 UTC, H. S. Teoh wrote:
 happens.  Things aren't going to materialize out of thin air 
 just because people demand for it loudly enough, even if we'd 
 like for that to happen.  *Somebody* has to do the work. That's 
 just how the universe works.
Human beings are social in nature and follow a group-mentality, so when there are people willing to follow and you have a persuasive leader willing to lead and outline a bright and vivid future, then things can "materialize" out of thin air. But it is not going to happen without focused leadership and the right timing. There certainly have been people in the forums over the years looking for things to do. So I really don't believe the whole "nothing will happen because nothing happens by itself" mantra. What is almost always the case is that if you base an open source sub-project on only 1-2 people then that work is mostly wasted, as they most likely will leave it unmaintained. So leadership is critical, even when things do "materialize" out of thin air.
Sep 04 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/3/2018 11:55 AM, Joakim wrote:
 On Monday, 3 September 2018 at 16:55:10 UTC, Jonathan M Davis wrote:
 But if you're ever expecting IDE support to be a top priority of many of the 
 contributors, then you're going to be sorely disappointed. It's the sort of 
 thing that we care about because we care about D being successful, but it's 
 not the sort of thing that we see any value in whatsoever for ourselves
Why is that? I've never used an IDE much, but I wonder why you don't and what your impressions are of why many other core D users don't either.
If you're going to start a new thread, which is a good idea in this giant thread, just changing the Subject isn't enough. You can't mark it as a "followup".
Sep 04 2018
prev sibling next sibling parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Monday, 3 September 2018 at 14:26:46 UTC, Laeeth Isharc wrote:

 I just spoke with Dicebot about work stuff.  He incidentally 
 mentioned what I said before based on my impressions.  The 
 people doing work with a language have better things to do than 
 spend a lot of time on forums.  And I think in open source you 
 earn the right to be listened to by doing work of some kind.  
 He said (which I knew already) it was an old post he didn't put 
 up in the end - somebody discovered it in his repo.  He is 
 working fulltime as a consultant with me for Symmetry and is 
 writing D as part of that role.  I don't think that indicates 
 he didn't mean his criticisms, and maybe one could learn from 
 those.  But a whole thread triggered by this is quite 
 entertaining.
I'm the person who found the post, and I'm enjoying the reading... and I'm learning something too! I'm amused by the number of different topics, minus one, the original: why feature branches are not an option in DLangLand. /Paolo
Sep 03 2018
prev sibling parent Meta <jared771 gmail.com> writes:
On Monday, 3 September 2018 at 14:26:46 UTC, Laeeth Isharc wrote:
 I just spoke with Dicebot about work stuff.  He incidentally 
 mentioned what I said before based on my impressions.  The 
 people doing work with a language have better things to do than 
 spend a lot of time on forums.  And I think in open source you 
 earn the right to be listened to by doing work of some kind.  
 He said (which I knew already) it was an old post he didn't put 
 up in the end - somebody discovered it in his repo.  He is 
 working fulltime as a consultant with me for Symmetry and is 
 writing D as part of that role.  I don't think that indicates 
 he didn't mean his criticisms, and maybe one could learn from 
 those.  But a whole thread triggered by this is quite 
 entertaining.
Interesting, I did not realize that he had left Sociomantic. Even if he did not release the article, I think it's a good idea that we take some of his criticisms to heart. I, at the very least, agree with at least a few of them, and as we've seen, so do others.
Sep 03 2018
prev sibling parent Laeeth Isharc <laeeth laeeth.com> writes:
On Monday, 3 September 2018 at 11:32:42 UTC, Chris wrote:
 On Sunday, 2 September 2018 at 12:07:17 UTC, Laeeth Isharc 
 wrote:

 That's why the people that adopt D will inordinately be 
 principals not agents in the beginning. They will either be 
 residual claimants on earnings or will have acquired the 
 authority to make decisions without persuading a committee 
 that makes decisions on the grounds of social factors.

 If D becomes another C++ ?  C++ was ugly from the beginning 
 (in my personal subjective assessment) whereas D was designed 
 by people with good taste.

 That's why it appeals inordinately to people with good taste.
[snip] Be that as it may, however, you forget the fact that people "with good taste" who have (had) an intrinsic motivation to learn D are also very critical people who take no bs, else they wouldn't have ended up using D in the first place. Since they've already learned a lot of concepts etc. with D over the years,
 it's technically easy for them to move on to either an easier 
 language or one that offers more or less the same features as D.
I don't think so. If we are talking about the set of technically very capable people with an aesthetic sense, then I don't think "easier", or the same feature set in a less beautiful form, is appealing. This is based on revealed preference, because the conversations I have with technically very capable people that know many other languages as well as or better than D go like "what compensation are you expecting? X. But if it's to write D, I can be flexible" and so on. Template meta-programming in D is quite simple. C++ has many of the features that D has. Therefore it's easy to do template meta-programming in C++, and just as easy for others to read your code in C++ as D? I don't think so. Having learnt the concepts in D, and that it can be beautiful and easy, kind of ruins you for inferior approaches. It talks to a C-style API (connected to an internal C++ code base). You have on one side a declaration of the C function that returns an exception string, and a function that throws an exception if the exception string is not empty. Then you have a layer on top that puts the class back together. Then you have a high-level wrapper layer. Then you have the bit that talks to Excel. I thought surely there must be decent code generation for this; I looked it up. Microsoft say use HTML templates. Well, okay... but I'm not sure I like the trade-off of having to do stuff like that versus having to deal with some pain at the command-line now and then.
 So once they're no longer happy with the way things are, they 
 can dive into a any language fast enough for the cost of 
 transition to be low.
You're making an implicit empirical statement that I don't believe to be accurate based on my experience. I would say if a representative programmer from the D community decides the costs no longer offset the benefits then sure they can learn another language because the representative programmer here is pretty talented. But so what?
Sep 03 2018
prev sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 09:58, Joakim wrote:
 Because you've not listed any here, which makes you no better than some noob
Here's one: the forum does not respond well to criticism. Here's an incredibly partial list:

* Features not playing well together.

Despite what Joakim seems to think, I've actually brought up an example in this thread. Here is another one: functions may be @safe, nothrow, @nogc, pure. If it's a method it might also be const/inout/immutable, static. The number of libraries that support all combinations is exactly zero (e.g. - when passing a delegate in).

* Language complexity

Raise your hand if you know how a class with both opApply and the get/next/end functions behaves when you pass it to foreach. How about a struct? Does it matter if it allows copying or not? The language was built because C++ was deemed too complex! Please see the thread about lazy [1] for a case where a question actually has an answer, but nobody seems to know it (and the person who does know it is hard pressed to explain the nuance that triggers this).

* Critical bugs aren't being solved

People keep advertising D as supporting RAII. I'm sorry, but "supports RAII" means "destructors are always run when the object is destroyed". If the community (and in this case, this includes Walter) sees a bug where that doesn't happen as not really a bug, then there is a deep problem of, at the least, over-promising. Just say you don't support RAII and destructors are unreliable, and live with the consequences. BTW: Python's destructors are unworkable, but they advertise it and face the consequences. The D community is still claiming that D supports RAII.

* The community

Oh boy. Someone who carries weight needs to step in when the forum is trying to squash criticism. For Mecca, I'm able to do that [2], but for D, this simply doesn't happen.

------

This is a partial list, but it should give you enough to stop accusing me of making baseless accusations. The simple point of the matter is that anyone who's been following what I write should already be familiar with all of the above. 
The main thing for me, however, is how poorly the different D features fit together (my first point above). The language simply does not feel like it's composed of building blocks I can use to assemble whatever I want. It's like a Lego set where you're not allowed to place a red brick over a white brick if there is a blue brick somewhere in your building. Shachar 1 - https://forum.dlang.org/thread/pjp2ef$310c$1 digitalmars.com 2 - https://forum.dlang.org/post/pctsgk$182l$1 digitalmars.com
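The attribute-combination point can be made concrete with a small sketch (the `List` type here is hypothetical, not from the post): a non-template method taking a delegate must hard-code one specific attribute combination, locking out every other caller, while the usual workaround is a template so the attributes are inferred per call site.

```d
struct List
{
    int[] items;

    // Non-template: to be callable from @safe pure nothrow @nogc code,
    // the delegate parameter must be spelled with exactly those
    // attributes. Supporting every combination would need one overload
    // per combination - which no library does.
    void each(scope void delegate(int) @safe pure nothrow @nogc dg)
        @safe pure nothrow @nogc
    {
        foreach (i; items)
            dg(i);
    }

    // Template workaround: the attributes of eachT are inferred from
    // whatever delegate is actually passed in.
    void eachT(DG)(scope DG dg)
    {
        foreach (i; items)
            dg(i);
    }
}

void main() @safe
{
    auto l = List([1, 2, 3]);
    l.each((int i) {}); // OK: the literal infers all four attributes
    // l.each((int i) { throw new Exception("no"); });
    //   ^ rejected: a throwing delegate doesn't match the hard-coded type
    l.eachT((int i) { throw new Exception("fine"); }); // inferred instead
}
```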
Aug 23 2018
next sibling parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 23/08/2018 9:09 PM, Shachar Shemesh wrote:
 On 23/08/18 09:58, Joakim wrote:
 Because you've not listed any here, which makes you no better than 
 some noob
Here's one: the forum does not respond well to criticism. Here's an incredibly partial list: * Features not playing well together. Despite what Joakim seems to think, I've actually brought up an example in this thread. Here is another one: functions may be @safe, nothrow, @nogc, pure. If it's a method it might also be const/inout/immutable, static. The number of libraries that support all combinations is exactly zero (e.g. - when passing a delegate in).
Indeed, that combination is horrible. It does deserve a rethink, but it probably doesn't warrant changing for a little while, since it's more a matter of polishing than anything else (IMO anyway).
 * Language complexity
 
 Raise your hand if you know how a class with both opApply and the 
 get/next/end functions behaves when you pass it to foreach. How about a 
 struct? Does it matter if it allows copying or not?
get/next/end functions, what?
 The language was built because C++ was deemed too complex! Please see 
 the thread about lazy [1] for a case where a question actually has an 
 answer, but nobody seems to know it (and the person who does know it is 
 hard pressed to explain the nuance that triggers this).
 
 * Critical bugs aren't being solved
 
 People keep advertising D as supporting RAII. I'm sorry, but "supports 
 RAII" means "destructors are always run when the object is destroyed". 
 If the community (and in this case, this includes Walter) sees a bug 
 where that doesn't happen as not really a bug, then there is a deep 
 problem, at least, over-promising. Just say you don't support RAII and 
 destructors are unreliable and live with the consequences.
 
 BTW: Python's destructors are unworkable, but they advertise it and face 
 the consequences. The D community is still claiming that D supports RAII.
 
 * The community
 
 Oh boy.
 
 Someone who carries weight needs to step in when the forum is trying to 
 squash down on criticism. For Mecca, I'm able to do that [2], but for D, 
 this simply doesn't happen.
The N.G. by and large is self-regulating. If you see behavior that isn't acceptable, you say so. Anybody can do this. Only when it gets really bad does Walter step in and say to stop. If we need to move away from assuming people are good and will be professional given the chance, it will destroy the community. But I can understand if we move to a more private channel for regulars and go for a more regulated option for everybody else. Would that suit you?
Aug 23 2018
next sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, August 23, 2018 3:26:46 AM MDT rikki cattermole via 
Digitalmars-d wrote:
 * Language complexity

 Raise your hand if you know how a class with both opApply and the
 get/next/end functions behaves when you pass it to foreach. How about a
 struct? Does it matter if it allows copying or not?
get/next/end functions, what?
I'm sure that he meant front, popFront, and empty. And yes, the fact that we have both opApply (the D1 solution for supporting foreach) and ranges (the D2 solution for foreach) definitely complicates things. Those two features do theoretically interact properly, because work was done several years ago to ensure that they did, but while they may interact properly from a technical standpoint, it's a mess from a social standpoint. I'm sure that some of the folks here could correctly answer whether opApply or the range API wins with foreach, but it's really not the sort of thing that the average D programmer is going to know, and while IIRC, there is very good reasoning as to which one wins, it's not immediately obvious to most folks. I _think_ that it's opApply that wins, but I'd have to either look it up or test it, and while I certainly don't know everything there is to know about D, given how much I've been involved with D, if I don't know something about it, the odds are pretty good that a lot of D programmers don't. D does have a problem in general of having a lot of great features that work really well in isolation but don't necessarily work well in concert (and it doesn't help that some features have really never been properly finished). And frequently, the answer that folks go with is to simply not use sections of the language (e.g. it's _very_ common for folks to just give up on a lot of attributes like pure, nothrow, or @safe). A number of the issues do get worked out over time, but not all of them do, and sometimes the solutions cause a lot of problems. For instance, DIP 1000 may end up being great for @safe and will help solve certain issues, but it results in yet another attribute that has to be pasted all over your code and which most folks simply won't use. So, it's helping to fix a real problem, but is it making things better overall? I don't know. 
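For reference, the opApply-vs-range question above can be checked with a short test (a sketch, not from the thread): per the language spec, when a struct or class defines both opApply and the range primitives, foreach is rewritten in terms of opApply.

```d
import std.stdio;

struct Both
{
    int i;

    // Range primitives: would yield 0, 1, 2
    bool empty() const { return i >= 3; }
    int front() const { return i; }
    void popFront() { ++i; }

    // opApply: yields 10, 20, 30
    int opApply(scope int delegate(int) dg)
    {
        foreach (v; [10, 20, 30])
            if (auto r = dg(v))
                return r;
        return 0;
    }
}

void main()
{
    foreach (v; Both())
        writef("%s ", v); // opApply takes precedence: 10 20 30
}
```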
And while I definitely think that D is easier to understand than C++ (in spite of the increase in D's complexity over time), it's also very much true that D continues to get more and more complicated as we add more stuff. Generally, each solution is solving a real problem, and at least some of time, the solution actually interacts quite well with the rest of the language, but it all adds up. And honestly, I don't think that there's a real solution to that. Languages pretty much always get more complicated over time, and unless we're willing to get rid of more stuff, it's guaranteed to just become more complicated over time rather than less. D definitely improves over time, but certain classes of issues just never seem to be fixed for some reason (e.g. the issue with RAII and destructors really should have been fixed ages ago), and some of the major design decisions don't get fully sorted out for years, because they're not a high enough priority (e.g. shared). I don't really agree that D is in much danger of dying at this point, but I completely agree that we as a group are not doing a good enough job getting some of the key things done (much of which comes down to an issue of manpower, though some of it is also likely due to organizational issues). - Jonathan M Davis
Aug 23 2018
next sibling parent reply Mike Franklin <slavo5150 yahoo.com> writes:
On Thursday, 23 August 2018 at 10:41:03 UTC, Jonathan M Davis 
wrote:

 Languages pretty much always get more complicated over time, 
 and unless we're willing to get rid of more stuff, it's 
 guaranteed to just become more complicated over time rather 
 than less.
"A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away." -- Antoine de Saint-Exupery I think that's actually a mistranslation from what he actually said, but it's still quite good. I think that's a important point of focus. We should be trying to get rid of stuff. It's one of the reasons I've been trying to move forward on some of the deprecations. Once the decision to deprecate is made it takes at least 2 years to get it done. The longer we wait, the longer we have to carry its baggage and risk its poor interaction with new features. It's a good question to ask: What can we get rid of? I know it's radical, but I'd like to see if we could enhance structs a little and get rid of classes and interfaces. See https://theartofmachinery.com/2018/08/13/inheritance_and_polymorphism_2.html for what I mean. Mike
Aug 23 2018
parent Wyatt <wyatt.epp gmail.com> writes:
On Thursday, 23 August 2018 at 11:02:31 UTC, Mike Franklin wrote:
 On Thursday, 23 August 2018 at 10:41:03 UTC, Jonathan M Davis 
 wrote:

 Languages pretty much always get more complicated over time, 
 and unless we're willing to get rid of more stuff, it's 
 guaranteed to just become more complicated over time rather 
 than less.
"A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away." -- Antoine de Saint-Exupery I think that's actually a mistranslation from what he actually said, but it's still quite good.
Liberties were taken there, but it's probably more applicable to this situation than a lot of the times C/Unix beards try to play it as though their tech of choice is beyond culpability.

For context, he's talking about the process of aeronautical engineering, and the thrust of this statement is really commentary on effort and elegance. A little before that, he talks about the grand irony that so much thoughtful effort and design goes into refining things so they're as simple as possible. But "simple" is relative to the thing and the task (my understanding is that "simple" kind of conflates "reliable" here, too). So this is where he rightly acknowledges that the process of refinement isn't a waste for what it removes, even though it's often much greater than the effort to create something in the first place. It's wrapped in a broader understanding that you have to have something that works at all before you can streamline it.

-Wyatt
Aug 24 2018
prev sibling parent reply Abdulhaq <alynch4047 gmail.com> writes:
On Thursday, 23 August 2018 at 10:41:03 UTC, Jonathan M Davis 
wrote:
 D does have a problem in general of having a lot of great 
 features that work really well in isolation but don't 
 necessarily work well in concert (and it doesn't help that some 
 features have really never been properly finished). And 
 frequently, the answer that folks go with is to simply not use 
 sections of the language (e.g. it's _very_ common for folks to 
 just give up on a lot of attributes like pure, nothrow, or 
 @safe). A number of the issues do get worked out over time, but 
 not all of them do, and sometimes the solutions cause a lot of 
 problems. For instance, DIP 1000 may end up being great for 
 @safe and will help solve certain issues, but it results in yet 
 another attribute that has to be pasted all over your code and 
 which most folks simply won't use. So, it's helping to fix a 
 real problem, but is it making things better overall? I don't 
 know.

 And while I definitely think that D is easier to understand 
 than C++ (in spite of the increase in D's complexity over 
 time), it's also very much true that D continues to get more 
 and more complicated as we add more stuff. Generally, each 
 solution is solving a real problem, and at least some of time, 
 the solution actually interacts quite well with the rest of the 
 language, but it all adds up. And honestly, I don't think that 
 there's a real solution to that. Languages pretty much always 
 get more complicated over time, and unless we're willing to get 
 rid of more stuff, it's guaranteed to just become more 
 complicated over time rather than less.

 D definitely improves over time, but certain classes of issues 
 just never seem to be fixed for some reason (e.g. the issue 
 with RAII and destructors really should have been fixed ages 
 ago), and some of the major design decisions don't get fully 
 sorted out for years, because they're not a high enough 
 priority (e.g. shared). I don't really agree that D is in much 
 danger of dying at this point, but I completely agree that we 
 as a group are not doing a good enough job getting some of the 
 key things done (much of which comes down to an issue of 
 manpower, though some of it is also likely due to 
 organizational issues).

 - Jonathan M Davis
This is a great summary of the situation; thanks for such a good and honest appraisal. From a technical POV I'd say it could replace the whole thread. But there is a social/psychological aspect to the whole thing.

Shachar's comment is obviously the cry of pain of someone whose back has just been broken by a last straw. He is being told, 'the straw you are complaining about is nothing'.

There is a class of developers who expect things to Just Work TM, especially if they are told that it Just Works. Each time they discover some combination of features that doesn't work, they have to refactor their code and remember not to try that again. Ultimately the developer painfully learns the things they should not attempt to use, or they give up before the process is complete and leave. I expect the pain caused by this is much more acute in a commercial environment where the pressure is on.

Long-term D developers have learnt not to bother with certain features or combinations of features and forget all the pain they went through to get that knowledge. They are the ones saying, come on in, the water's lovely.

For anyone considering using D for a commercial project, the situation you describe is cause for concern. The issues can be fixed, but it will take some brave and ruthless decisions, I suspect.
Aug 23 2018
parent rjframe <dlang ryanjframe.com> writes:
On Thu, 23 Aug 2018 19:34:46 +0000, Abdulhaq wrote:

 There is a class of developers who expect things to Just Work TM,
 especially if they are told that it Just Works. Each time that they
 discover some combination of features that doesn't work they have to
 refactor their code and remember not to try that again. Ultimately the
 developer painfully learns the things that they should not attempt to
 use, or they give up before the process is complete and leave. I expect
 the pain caused by this is much more acute in a commercial environment
 where the pressure is on.
 
 Long term D developers have learnt not to bother with certain features
 or combinations of features and forget all the pain they went through to
 get that knowledge. They are ones saying, come in the water's lovely.
+1

It's easy to recommend D to someone because it does X, Y, and Z so well, not realizing they need X, Y, and B. And D has a honeymoon period - it's so awesome and will solve all our problems... until you dig deeper, trying to get more and more out of it, and struggle to make sense of how to make it all fit together. The pragmatic approach to language design has its downsides.
Sep 01 2018
prev sibling parent Jesse Phillips <Jesse.K.Phillips+D gmail.com> writes:
On Thursday, 23 August 2018 at 09:26:46 UTC, rikki cattermole 
wrote:
 On 23/08/2018 9:09 PM, Shachar Shemesh wrote:
 functions may be @safe, nothrow, @nogc, pure. If it's a method 
 it might also be const/inout/immutable, static. The number of 
 libraries that support all combinations is exactly zero (e.g. 
 - when passing a delegate in).
Indeed that combination is horrible. It does deserve a rethink, but it probably doesn't warrant changing for a little while, since it's more of a polishing concern than anything else (IMO anyway).
I think that's where D's biggest failing tends to be: the polish. I started using D before these features existed, and I continue to use it despite them; like many, I try to use them but end up falling back to not using them. But these things tend to be a big part of the D marketing.

I would classify the --dip1000 work as a polishing effort, but it also opens the doors to more areas that need polish, and most likely won't get it before --dip1000 is considered done.

I would also disagree with the community being hostile to criticism. Explanations and workarounds are usually given, not hostility. There is a huge resistance to change, but that should be expected and will continue to increase. It is usually just sad how long it takes for change to happen when it needs to.

Tangent: semantic versioning may be easy to implement, but it has no value without practice, and practice isn't so easy to manage. We may not be following the SemVer spec, but following it wouldn't add value to current practice: there is a patch release, and there is a breaking-changes-and-features release. If we don't get the management of breaking changes correct, SemVer is broken anyway.
Aug 23 2018
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 2:09 AM, Shachar Shemesh wrote:
 On 23/08/18 09:58, Joakim wrote:
 Because you've not listed any here, which makes you no better than some noob
Here's one: the forum does not respond well to criticism.
Not sure what you mean by that.
 * Features not playing well together.
 
 Despite what Joakim seems to think, I've actually brought up an example in
this 
 thread. Here is another one:
 
 functions may be @safe, nothrow, @nogc, pure. If it's a method it might also
be 
 const/inout/immutable, static. The number of libraries that support all 
 combinations is exactly zero (e.g. - when passing a delegate in).
If, for example, a library function allocates with the GC, then it can't work with @nogc code. But still, fair enough - if there are combinations which should work but do not, please submit bug reports. If you have already, is there a list of them?
 * Language complexity
 
 Raise your hand if you know how a class with both opApply and the get/next/end 
 functions behaves when you pass it to foreach.
 How about a struct?
I presume you meant empty/front/popFront. This is in the language spec: https://dlang.org/spec/statement.html#foreach-with-ranges

"If the aggregate expression is a struct or class object, but the opApply for foreach, or opApplyReverse foreach_reverse do not exist, then iteration over struct and class objects can be done with range primitives."

This was done for backward compatibility, since empty/front/popFront came later. I actually tried to deprecate opApply at one point, but that would have broken too much existing code. But I'm puzzled why one would write a struct with both iteration mechanisms.
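A minimal sketch of the precedence rule quoted above (the struct and its members are made-up names for illustration): when a type defines both opApply and the range primitives, foreach goes through opApply.

```d
import std.stdio;

struct Both
{
    // Range primitives: would yield 0, 1, 2
    int i;
    bool empty() { return i >= 3; }
    int front() { return i; }
    void popFront() { ++i; }

    // opApply: yields 10, 11, 12 -- and takes precedence over
    // the range primitives when both exist
    int opApply(scope int delegate(int) dg)
    {
        foreach (x; 10 .. 13)
            if (auto r = dg(x))
                return r;
        return 0;
    }
}

void main()
{
    // Iterates via opApply, not via empty/front/popFront
    foreach (x; Both.init)
        write(x, ' ');
    writeln();
}
```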
 Does it matter if it allows copying or not?
For the preference for opApply, no.
 The language was built because C++ was deemed too complex! Please see the
thread 
 about lazy [1] for a case where a question actually has an answer, but nobody 
 seems to know it (and the person who does know it is hard pressed to explain
the 
 nuance that triggers this).
C++ doesn't have lazy. Lazy in D is a rarely used feature, so I'm not surprised there is less institutional knowledge about it, and it hasn't been thrashed as much as other features.
 * Critical bugs aren't being solved
 
 People keep advertising D as supporting RAII. I'm sorry, but "supports RAII" 
 means "destructors are always run when the object is destroyed". If the 
 community (and in this case, this includes Walter) sees a bug where that
doesn't 
 happen as not really a bug, then there is a deep problem, at least, 
 over-promising. Just say you don't support RAII and destructors are unreliable 
 and live with the consequences.
pretending it isn't a problem. It is.
 * The community
 
 Oh boy.
 
 Someone who carries weight needs to step in when the forum is trying to squash 
 down on criticism. For Mecca, I'm able to do that [2], but for D, this simply 
 doesn't happen.
If someone is trying to squash criticism, I would like to see what you're referring to. If a post contains unprofessional behavior, like using f--k or harassing people, it will get removed. Simply being critical is not removed (if it were, this thread would have disappeared).
Aug 23 2018
next sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 14:02, Walter Bright wrote:
 On 8/23/2018 2:09 AM, Shachar Shemesh wrote:
 functions may be @safe, nothrow, @nogc, pure. If it's a method it 
 might also be const/inout/immutable, static. The number of libraries 
 that support all combinations is exactly zero (e.g. - when passing a 
 delegate in).
If, for example, a library functions allocates with the gc, then it can't work with nogc code. But still, fair enough - if there are combinations which should work, but do not, please submit bug reports. If you have already, is there a list of them?
None of that continues to work once delegates are involved. I have yet to see a library that can accept a delegate and correctly create a function around it that matches its attributes. And yes, I do include Mecca in this. I tried. It is too difficult to get right, so I gave up.

Attribute inference was supposed to solve this, but attribute inference is completely broken with separate compilation.
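A sketch of the problem being described (function names here are hypothetical): a non-template wrapper must commit to one attribute combination up front, while a template wrapper gets its attributes inferred per instantiation from the delegate it receives - which only works when the compiler can see the body, i.e. not across separate compilation.

```d
// Non-template: must pick ONE attribute combination. A @safe
// nothrow caller cannot pass a delegate through this wrapper
// without losing its guarantees.
void runFixed(scope void delegate() dg)
{
    dg();
}

// Template: attributes are inferred per instantiation, so this
// wrapper is @safe/nothrow/@nogc/pure exactly when dg is.
void runInferred(DG)(scope DG dg)
{
    dg();
}

void caller() @safe nothrow
{
    int x;

    // OK: this instantiation of runInferred is inferred
    // @safe nothrow, matching the delegate literal.
    runInferred(() { ++x; });

    // Would not compile: runFixed is neither @safe nor nothrow.
    // runFixed(() { ++x; });
}
```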
 
 
 * Language complexity

 Raise your hand if you know how a class with both opApply and the 
 get/next/end functions behaves when you pass it to foreach.
> How about a struct? I presume you meant empty/front/popFront.
Indeed.
 
 This is in the language spec:
How many people know that without resorting to the spec?
 
 Does it matter if it allows copying or not?
For the preference for opApply, no.
But it does for empty/front/popFront, which is exactly my point.
 
 * Critical bugs aren't being solved

 People keep advertising D as supporting RAII. I'm sorry, but "supports 
 RAII" means "destructors are always run when the object is destroyed". 
 If the community (and in this case, this includes Walter) sees a bug 
 where that doesn't happen as not really a bug, then there is a deep 
 problem, at least, over-promising. Just say you don't support RAII and 
 destructors are unreliable and live with the consequences.
that is pretending it isn't a problem. It is.
When I first reported this, about 3 and a half years ago, the forum explained to me that this is working as expected. However, it broke (I'm not sure why it broke, but I did notice it deliberately would not work with @disable init structs; see my interoperability comment from above). Since then (over a year), no progress has been made except to revert the changelog entry that claims it was resolved.

When I talked to you about it at the last DConf, I got a reply that could be largely summarized as "yeah, we should probably do some flow analysis, sometimes".

The only time I got anyone to take this problem seriously was when someone on the forum would claim that D supports RAII, to which I tend to reply with "no, it doesn't". So you will excuse me, but I don't think this bug is being taken as seriously as I think it should be.

I get it, it is not a simple bug to solve. It is an unfortunate truth that some important bugs are not going to be easy. C++ went a different path with this (strictly setting when members are initialized). I get that this is a very unpopular feature of C++, and I can see why, but seeing how D struggles to resolve this issue, I can't say it was a mistake on C++'s behalf.

Shachar
Aug 23 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 4:31 AM, Shachar Shemesh wrote:
 None of that continues to work once delegates are involved. I am yet to see a 
 library that can accept a delegate and correctly create a function around it 
 that matches its attributes.
 
 And yes, I do include Mecca in this. I tried. It is too difficult to get
right, 
 so I gave up.
 
 Attribute inference was supposed to solve this, but attribute inference is 
 completely broken with separate compilation.
I'd like to see an example so I can understand exactly what you're having trouble with.
 This is in the language spec:
How many people know that without resorting to the specs.
This is a little unfair. It's plainly stated in the documentation for foreach. Heck, I wrote a C compiler and the library for it, and yesterday I had to look up again how strncmp worked. I refer to the documentation regularly. Back when I designed digital circuits, I had a well-worn TTL data book on my desk, too. If it wasn't documented, or documented confusingly, it would be a fair point.
 Does it matter if it allows copying or not?
For the preference for opApply, no.
But it does for empty/front/popFront, which is exactly my point.
If front() returns by ref, then no copying happens. If front() returns by value, then a copy is made. This should not be surprising behavior.

 pretending it isn't a problem. It is.
When I first reported this, about 3 and a half years ago, the forum explained to me that this is working as expected.
The forum can be anyone saying anything. A more reliable answer would be the bugzilla entry being closed as "invalid", which did not happen.

however, 
 it broke (I'm not sure why it broke, but did notice it deliberately would not 
 work with @disable init structs. See my interoperability comment from above). 
 Since then (over a year), no progress has been made except to revert the 
 changelog that claims it was resolved.
The problem was calling other destructors that had different attributes, such as the constructor being @safe but calling a destructor that was not. It's not an intractable problem, but I work on critical problems every day. It's just that everybody has a different issue that is critical to them.
 So you will excuse me, but I don't think this bug is being taken as seriously
as 
 I think it should.
It is a serious problem. (There are workarounds available, like using scope(failure).) I'd appreciate a list of bugzilla issues you regard as critical to your operations.
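The scope(failure) workaround mentioned above can be sketched like this (Resource, acquire, release, and mayThrow are hypothetical stand-ins, not anything from Phobos or Mecca): release the resource explicitly on failure instead of relying on the destructor of a partially constructed object.

```d
struct Resource { int id; }

Resource acquire(int id) { return Resource(id); }
void release(ref Resource r) { r.id = 0; }
void mayThrow(int id) { if (id < 0) throw new Exception("bad id"); }

struct Wrapper
{
    Resource res;

    this(int id)
    {
        res = acquire(id);
        // If mayThrow() throws, running ~this() on the partially
        // constructed Wrapper is the part that has been unreliable,
        // so clean up explicitly on failure instead of relying on
        // the destructor.
        scope(failure) release(res);
        mayThrow(id);
    }

    ~this() { release(res); }
}
```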
Aug 23 2018
next sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 8/23/18 8:03 AM, Walter Bright wrote:
 On 8/23/2018 4:31 AM, Shachar Shemesh wrote:
 This is in the language spec:
How many people know that without resorting to the specs.
This is a little unfair. It's plainly stated in the documentation for foreach. Heck, I wrote a C compiler and the library for it, and yesterday I had to look up again how strncmp worked. I refer to the documentation regularly. Back when I designed digital circuits, I had a well-worn TTL data book on my desk, too. If it wasn't documented, or documented confusingly, it would be a fair point.
On the point of opApply, the choice is quite obvious. Why would you put opApply in an aggregate if you didn't want to control foreach behavior? Once you think about it, there shouldn't really be any more discussion.
 Does it matter if it allows copying or not?
For the preference for opApply, no.
But it does for empty/front/popFront, which is exactly my point.
If front() returns by ref, then no copying happens. If front() returns by value, then a copy is made. This should not be surprising behavior.
I think he means, if the range ITSELF doesn't allow copying, it won't work with foreach (because foreach makes a copy), but it will work with opApply.
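That distinction can be made concrete with a minimal sketch (made-up struct names): a range with copying disabled can't be consumed by foreach via the range interface, because foreach operates on a copy, while an opApply-based aggregate works, because opApply is called on the original by reference.

```d
struct NoCopyRange
{
    @disable this(this); // copying disabled

    int i;
    bool empty() { return i >= 3; }
    int front() { return i; }
    void popFront() { ++i; }
}

struct NoCopyOpApply
{
    @disable this(this); // copying disabled here too

    int opApply(scope int delegate(int) dg)
    {
        foreach (x; 0 .. 3)
            if (auto r = dg(x))
                return r;
        return 0;
    }
}

void main()
{
    NoCopyRange a;
    // foreach (x; a) {} // error: foreach needs a copy of the range

    NoCopyOpApply b;
    foreach (x; b) {}    // fine: b is passed by reference to opApply
}
```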

 that is pretending it isn't a problem. It is.
When I first reported this, about 3 and a half years ago, the forum explained to me that this is working as expected.
The forum can be anyone saying anything. A more reliable answer would be the bugzilla entry being closed as "invalid", which did not happen.
There have been several people I have spoken with in person, and also seen posted here, who say the forum is unfriendly or not open to criticism of D. I feel it's the opposite (in fact, most of the die-hard supporters are very critical of D), but everyone has their own experiences. There are many people who post short, curt answers, maybe even cynical ones. But those aren't necessarily the authoritative answer. Where I see this happening, I usually try to respond with a more correct answer (even though my voice isn't exactly authoritative), but the sad truth is that we can't spend all day making sure we have a super-pleasant forum where every answer is valid and nobody is rude.

In reply to Shachar's general point: this whole thread seems very gloomy and final, but I feel like the tone does not match, in my mind, how D is progressing. "Every single one of the people [at Weka] rushing to defend D at the time has since come around." Seems like you all have decided to either ditch D internally, maybe moving forward, or accepted that Weka will fail eventually due to the choice of D? It sure reads that way. This is in SHARP contrast to the presentation that Liran gave at DConf this year, touting D as a major reason Weka was able to develop what they did, and, to some degree, your showcase of how Mecca works.

My experience with D is that it has gotten much better over the years. I suppose that having worked with the earlier versions, and seeing what has happened, gives me a different perspective. I guess I just don't have that feeling that there are some unfixable problems that will "kill" the language. Everything in a programming language is fixable; it just matters how much pain you are willing to deal with to fix it. If we get to a point where there really is a sticking point, D3 can be born.

I do feel that we need, in general, more developers working on the compiler itself. So many of the problems need compiler changes, and the learning curve to me just seems so high to get into it.

-Steve
Aug 23 2018
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 24/08/2018 12:55 AM, Steven Schveighoffer wrote:
 
 I do feel that we need, in general, more developers working on the 
 compiler itself. So many of the problems need compiler changes, and the 
 learning curve to me just seems so high to get into it.
It depends; some parts are very easy to get into, but development in general definitely isn't welcoming. The bar is very high, and it isn't possible for a regular to jump in and take ownership of a part that doesn't have anybody interested in it, and that's a real problem.
Aug 23 2018
prev sibling next sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 15:55, Steven Schveighoffer wrote:
 This whole thread seems very gloomy and final, but I feel like the tone 
 does not match in my mind how D is progressing. "Every single one of the 
 people [at Weka] rushing to defend D at the time has since come around." 
 Seems like you all have decided to either ditch D internally, maybe 
 moving forward, or accepted that Weka will fail eventually due to the 
 choice of D? It sure reads that way.
I'll clarify, in order not to create the wrong impression. No, Weka is neither ditching D, nor is it banking on failing. We're doing pretty well as a company, and D will not change that.

What I did mean by that is that the enthusiasm for D has *greatly* diminished. Many (but not all) developers will definitely not choose D for our next project, should the choice be ours to make.

Like I said in my original post, it is not even in consensus whether picking D to begin with had been a mistake. Some think it was; some think, even in hindsight and after being more or less disillusioned, that it was still better than picking another language. As such, it is not even universally true that engineers at Weka regret going with D.

I think Mecca is a great library, and I'm very proud of writing it. There are certainly aspects of it that would not be possible (or, at least, would be highly impractical) in any other language I know. With that said, it is also true that there are aspects of Mecca where D was holding me back from doing stuff I knew I could do much more easily in C++. There were also areas where run-time bugs were discovered during integration with the main Weka code base (integration that is still ongoing) that were difficult to diagnose at best (and yes, I have it on my todo list to submit a PR to fix some aspects of some of them).

Example of something I couldn't/wouldn't do in any other language: the second form of spawnFiber: https://weka-io.github.io/mecca/docs/mecca/reactor/Reactor.spawnFiber.html

The fact that the arguments for spawnFiber are the arguments for F, and that's verified by the compiler, no casts, no inference, no code bloat, is huge. I am sorely going to miss it on my next project (which, like I said, will not be written in D).

On the other hand, look at ConnectedSocket.connect: https://weka-io.github.io/mecca/docs/mecca/reactor/io/fd/ConnectedSocket.connect.html

Why do I need two forms? What good is that? Why is the second form a template?

Answer: Because in D, structs can't inherit, and I cannot define an implicit cast. What I'd really want is for SockAddrIPv4 to be implicitly castable to SockAddr, so that I can pass a SockAddrIPv4 to any function that expects SockAddr.

Except what I'd _really_ like is for them to be the same thing. I'd like inheritance. Except I can't do that for structs, and if I defined SockAddr as a class, I'd mandate allocating it on the GC, violating the whole point behind writing Mecca to begin with.

----

To summarize: Weka isn't ditching D, and people aren't even particularly angry about it. It has problems, and we've learned to live with them, and that's that. The general consensus, however, is that these problems will not be resolved (we used to file bugs in Bugzilla; we stopped doing that because we saw nothing happens with them), and as far as the future of the language goes, that's bad news.

Shachar
Aug 23 2018
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 24/08/2018 1:22 AM, Shachar Shemesh wrote:
 Except what I'd _really_ like to do is for them to be the same thing. 
 I'd like inheritance. Except I can't do that for structs, and if I 
 defined SockAddr as a class, I'd mandate allocating it on the GC, 
 violating the whole point behind writing Mecca to begin with.
Between multiple alias this, DIP1000 and potentially signatures, that should be possible. Although you won't need to use override at least ;)
Aug 23 2018
prev sibling next sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 8/23/18 9:22 AM, Shachar Shemesh wrote:

 On the other hand, look at ConnectedSocket.connect:
 https://weka-io.github.io/mecca/docs/mecca/reactor/io/fd/ConnectedSocket.connect.html
 
 
 Why do I need two forms? What good is that? Why is the second form a 
 template? Answer: Because in D, structs can't inherit, and I cannot 
 define an implicit cast. What I'd really want to do is to have 
 SockAddrIPv4 be implicitly castable to SockAddr, so that I can pass a 
 SockAddrIPv4 to any function that expects SockAddr.
 
 Except what I'd _really_ like to do is for them to be the same thing. 
 I'd like inheritance. Except I can't do that for structs, and if I 
 defined SockAddr as a class, I'd mandate allocating it on the GC, 
 violating the whole point behind writing Mecca to begin with.
So interestingly, you are accepting the sockaddr by VALUE, which eliminates any possibility of using inheritance meaningfully anyway (except that, depending how you define SockAddr, it may include all the data of the full derived address; sockaddr is quirky that way, and NOT like true inheritance).

You CAN use inheritance, just like you would with classes, but you have to pass by reference for it to make sense:

struct SockAddr
{
    int addressFamily; // forget what this really is called
    ...
}

struct SockAddrIPv4
{
    SockAddr base;
    ref SockAddr getBase() { return base; }
    alias getBase this;
    ...
}

Now, you can pass SockAddrIPv4 into a ref SockAddr, check the address family, and cast to the correct thing. Just like you would with classes and inheritance. You can even define nice mechanisms for this, e.g. (opCast here, since cast is a keyword and can't be a method name):

struct SockAddr
{
    ...
    ref T opCast(T)() if (isSomeSockaddr!T)
    {
        assert(addressFamily == T.requiredAddressFamily);
        return *cast(T*)&this;
    }
}
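A hedged, self-contained sketch of the alias this pattern in use (SockAddr, SockAddrIPv4, and familyOf are simplified stand-ins for illustration, not Mecca's actual types):

```d
struct SockAddr
{
    int addressFamily;
}

struct SockAddrIPv4
{
    SockAddr base = SockAddr(2 /* AF_INET, hypothetically */);
    uint addr;
    ushort port;

    // Returning the "base" by ref lets alias this bind to
    // ref SockAddr parameters, like base-class reference binding.
    ref SockAddr getBase() return { return base; }
    alias getBase this;
}

// A function written against the "base" type...
int familyOf(ref SockAddr sa) { return sa.addressFamily; }

void main()
{
    SockAddrIPv4 v4;
    // ...accepts the "derived" struct via alias this, by reference.
    assert(familyOf(v4) == 2);
}
```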
 To summarize: Weka isn't ditching D, and people aren't even particularly 
 angry about it. It has problems, and we've learned to live with them, 
 and that's that.
This sounds more like what I would have expected, so thank you for clarifying.
 The general consensus, however, is that these problems 
 will not be resolved (we used to file bugs in Bugzilla. We stopped doing 
 that because we saw nothing happens with them), and as far as the future 
 of the language goes, that's bad news.
Bugs do get fixed; there is just no assigned timeframe for having them fixed. An all-volunteer workforce has this issue. It took 10 (I think) years for bug 304 to get fixed. It was a huge pain in the ass, but it did get fixed.

I wouldn't stop filing them; definitely file them. If they are blocking your work, complain about them loudly, every day. But not filing them doesn't help anyone. I'm not saying all bugs you file will be fixed, but all bugs you *don't* file will definitely not be fixed.

-Steve
Aug 23 2018
next sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 17:01, Steven Schveighoffer wrote:

 If they are blocking 
 your work, complain about them loudly, every day. But not filing them 
 doesn't help anyone.
The economics don't add up. If a bug is blocking my work, there are two options:

1. I work around it, at which point it is no longer blocking my work (grep Mecca for DMDBUG).
2. Work actively (in our case, get Johan to do so) until it no longer blocks me.

Waiting for the community to fix a bug in D that is blocking Weka would get Weka kicked out of the market. There is no value proposition; we simply have to work faster than that.

The problem is that once I do work around a bug, I no longer have the resources to continue complaining about it. I need to move on. My main job is to develop for Weka, not to develop D itself. So telling me to keep filing them is simply a non-starter. I've got bugs that simply don't reproduce in watered-down examples. I will not spend two days just to create a test case that demonstrates the bug outside the Weka code base. If nothing else, my boss won't allow me to spend that time.

Oh, and our code base is over 300,000 lines. Don't say "dustmite"; it is unable to process the code.
 
 I'm not saying all bugs you file will be fixed, but all bugs you *don't* 
 file will definitely not be fixed.
So far, my experience is that it has about the same chances of being fixed both ways, and not filing takes less effort. I'm reminded of a friend of mine, who kept hoping to win the lottery despite never buying a ticket. His reasoning was that the chances of winning are not much changed by buying a ticket. Shachar
Aug 23 2018
next sibling parent reply RhyS <sale rhysoft.com> writes:
On Thursday, 23 August 2018 at 16:22:54 UTC, Shachar Shemesh 
wrote:
 Waiting for the community to fix a bug in D that is blocking 
 Weka will get Weka kicked out of the market. There is no value 
 proposition. We simply have to work faster than that.

 Oh, and our code base over 300,000 lines. Don't say "dustmite". 
 It is unable to process the code.
A quick question, if Weka did not have the current 300k backlog of code, what language of choice is more likely to be picked by the team at Weka?
Aug 23 2018
parent Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 19:58, RhyS wrote:
 
 A quick question, if Weka did not have the current 300k backlog of code, 
 what language of choice is more likely to be picked by the team at Weka?
I don't know. Like I said, while the feeling that D has completely lost its way is fairly universal, the claim that picking D was a mistake is not. There are some unique D features we are using to great effect, and people have grown used to the weaknesses. I don't think, at this point, picking a different direction would have happened even if it were technically feasible.

Shachar
Aug 23 2018
prev sibling next sibling parent reply Ali <fakeemail example.com> writes:
On Thursday, 23 August 2018 at 16:22:54 UTC, Shachar Shemesh 
wrote:
 On 23/08/18 17:01, Steven Schveighoffer wrote:
 My main job is to develop for Weka, not develop D itself.
Weka, at some point, made the strategic decision to use a non-mainstream language.

I don't think Weka has a choice; they have to invest in the development of D itself.
Aug 23 2018
next sibling parent Radu <rad.racariu gmail.com> writes:
On Thursday, 23 August 2018 at 17:19:41 UTC, Ali wrote:
 On Thursday, 23 August 2018 at 16:22:54 UTC, Shachar Shemesh 
 wrote:
 On 23/08/18 17:01, Steven Schveighoffer wrote:
 My main job is to develop for Weka, not develop D itself.
Weka, at some point, made the strategic decision to use a non mainstream language I dont think Weka, have a choice, they have to invest in the development of D itself
This sounds about right. Obviously the solution is to have the paid consultants file those bugs for you if you think your time is not best spent at this. The only issue I can see here is that the D BDFL would need to be in the loop so they can expedite the fix/review of those bugs. Or design some kind of process for this. I know the Dlang foundation has a sponsor package that has some level of support. I'd be interested to see if this can work for Weka, as others might be in the same boat at some point.
Aug 23 2018
prev sibling parent reply bachmeier <no spam.net> writes:
On Thursday, 23 August 2018 at 17:19:41 UTC, Ali wrote:
 On Thursday, 23 August 2018 at 16:22:54 UTC, Shachar Shemesh 
 wrote:
 On 23/08/18 17:01, Steven Schveighoffer wrote:
 My main job is to develop for Weka, not develop D itself.
Weka, at some point, made the strategic decision to use a non mainstream language I dont think Weka, have a choice, they have to invest in the development of D itself
I hope a startup can choose D without having to do that. Otherwise D is not really a viable option for startups because they need to focus on survival rather than language development.
Aug 23 2018
next sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 20:52, bachmeier wrote:
 On Thursday, 23 August 2018 at 17:19:41 UTC, Ali wrote:
 On Thursday, 23 August 2018 at 16:22:54 UTC, Shachar Shemesh wrote:
 On 23/08/18 17:01, Steven Schveighoffer wrote:
 My main job is to develop for Weka, not develop D itself.
Weka, at some point, made the strategic decision to use a non mainstream language I dont think Weka, have a choice, they have to invest in the development of D itself
I hope a startup can choose D without having to do that. Otherwise D is not really a viable option for startups because they need to focus on survival rather than language development.
This! Maybe Weka can afford it, but being all smug about it is a destructive attitude to have. I know that some of Weka's leadership are uncomfortable about the fact that we, almost by definition, are facing language-related issues that no one in the community has faced before us.

Weka is in a good place, and is going in a good direction, but don't forget that we are up against giants, and are selling a product where 0.1% failure is considered the same as utter failure. Being able to trust the compiler was supposed to be a given.

Yes, Weka is, at this point, committed. The next start-up isn't.

Shachar
Aug 23 2018
parent reply rjframe <dlang ryanjframe.com> writes:
On Thu, 23 Aug 2018 21:04:36 +0300, Shachar Shemesh wrote:

 On 23/08/18 20:52, bachmeier wrote:
 On Thursday, 23 August 2018 at 17:19:41 UTC, Ali wrote:
 On Thursday, 23 August 2018 at 16:22:54 UTC, Shachar Shemesh wrote:
 On 23/08/18 17:01, Steven Schveighoffer wrote:
 My main job is to develop for Weka, not develop D itself.
Weka, at some point, made the strategic decision to use a non mainstream language I dont think Weka, have a choice, they have to invest in the development of D itself
I hope a startup can choose D without having to do that. Otherwise D is not really a viable option for startups because they need to focus on survival rather than language development.
This! Maybe Weka can afford it, but being all smug about it is a destructive attitude to have. I know that some of Weka's leadership are uncomfortable about the fact that we, almost by definition, are facing language related issues that no-one in the community has before us. Weka is in a good place, and is going in a good direction, but don't forget that we are up against giants, and are selling a product where 0.1% failure is considered the same as utter failure. Being able to trust the compiler was supposed to be a given. Yes, Weka is, at this point, committed. The next start-up isn't. Shachar
I don't really understand this reasoning; a compiler is a dependency, much like a third party library. When a dependency gets in the way of your product, you have to make a choice. If you can't afford 0.1% failure, then if the compiler is holding you back, the choice seems to be fix the compiler, replace the compiler/ language, or don't do what you want to do. Should you have to fix the bugs you run into? No. But if they keep you from doing your work, it seems like the economics of fixing D's bugs can make sense. If Weka were to assign its own priorities to D's bugs*, and have one person, once a week, fix the largest-priority bug, how big of an investment would that be, and would the return be worth it? Many bugs will definitely not be worth your time, but others might. * I don't know that it's common, but I have maintained third-party bugs in my own tracker; this makes it easy to check their changelog against the bugs I care about, especially when I don't subscribe to the bug in their tracker for one reason or other. Being able to prioritize their bugs against bugs in my own project also helps me decide whether to spend my time fixing the third-party library.
Sep 01 2018
parent rjframe <dlang ryanjframe.com> writes:
On Sat, 01 Sep 2018 11:25:31 +0000, rjframe wrote:

 Should you have to fix the bugs you run into? No. But if they keep you
 from doing your work, it seems like the economics of fixing D's bugs can
 make sense. If Weka were to assign its own priorities to D's bugs*, and
 have one person, once a week, fix the largest-priority bug, how big of
 an investment would that be, and would the return be worth it? Many bugs
 will definitely not be worth your time, but others might.
You've answered this already; my apologies for the noise. --Ryan
Sep 01 2018
prev sibling parent Joakim <dlang joakim.fea.st> writes:
On Thursday, 23 August 2018 at 17:52:54 UTC, bachmeier wrote:
 On Thursday, 23 August 2018 at 17:19:41 UTC, Ali wrote:
 On Thursday, 23 August 2018 at 16:22:54 UTC, Shachar Shemesh 
 wrote:
 On 23/08/18 17:01, Steven Schveighoffer wrote:
 My main job is to develop for Weka, not develop D itself.
Weka, at some point, made the strategic decision to use a non mainstream language I dont think Weka, have a choice, they have to invest in the development of D itself
I hope a startup can choose D without having to do that. Otherwise D is not really a viable option for startups because they need to focus on survival rather than language development.
What a joke: are you really arguing that every startup should have all their suppliers give them everything for free? Most startups pay a ton of money for their critical tools, money that pays for further development of those tools, including for bugfixes and features in the OSS projects they use (which they don't always open source). I'd wager that whatever Weka has spent on Johan to fix D is much less. Maybe Weka is simply learning they can't get away with that anymore. What you _could_ argue is that the cost/benefit ratio of D ends up being too high compared to some mooted alternative with a bigger community, say C++ or Rust, but I think you'd have a tough time making that case.
Aug 23 2018
prev sibling next sibling parent Steven Schveighoffer <schveiguy gmail.com> writes:
On 8/23/18 12:22 PM, Shachar Shemesh wrote:
 On 23/08/18 17:01, Steven Schveighoffer wrote:
 I'm not saying all bugs you file will be fixed, but all bugs you 
 *don't* file will definitely not be fixed.
So far, my experience is that it has about the same chances of being fixed both ways, and not filing takes less effort.
I have had much better success with bugs being fixed for issues that I file vs. hoping someone fixes it without a report, not just in D's ecosystem, but pretty much anywhere. But that's the choice you make. We'll have to disagree on that one. It's hard to fix bugs without reports, but hey, maybe you will get lucky and someone fixes them by accident. -Steve
Aug 23 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 9:22 AM, Shachar Shemesh wrote:
 So telling me to keep filing them is simply a non-starter. I've got bugs that 
 simply don't reproduce in watered down examples. I will not spend two days
just 
 to create a test case that demonstrates the bug outside the Weka code base. If 
 nothing else, my boss won't allow me to spend that time.
In my experience with debugging code, if drilling down to find the cause of the problem is not done, there is no way to conclude whether it is a compiler bug or a user bug. (Of course, compiler internal errors are always compiler bugs.)

Drilling down and finding it to be a compiler problem also usually suggests a practical workaround for it.
Aug 23 2018
parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 23:46, Walter Bright wrote:
 In my experience with debugging code, if drilling down to find the cause 
 of the problem is not done, there is no way to conclude whether 
 it is a compiler bug or a user bug.
 
 (Of course, compiler internal errors are always compiler bugs.)
 
 Drilling down and finding it to be a compiler problem also usually 
 suggests a practical workaround for it.
Consider the following line from the weka code base:

        _trustedData.opIndex(diskIdx) &= NotBitmap(toDistrust);

That's strange. Why didn't Shachar just do?

        _trustedData[diskIdx] &= NotBitmap(toDistrust);

Answer: Because the compiler decided that it needs to call _trustedData.opIndexAssign, and then failed the compilation because it has none. There is no way that that is a bug in the code. This is a compiler bug (maybe since fixed. This is fairly old code).

So, where's the issue number, I hear you ask? There is none. This problem only happens inside the code base. Once I tried to isolate it, I couldn't reproduce.

At this point I can either use the work-around I already have and (try to, obviously unsuccessfully) forget about it, file a bug report that will be (justifiably) ignored because no-one else can reproduce it, or spend an unknown amount of time (two days would probably be low-balling at this point) in trying to get this to reproduce on a watered down version of the code.

Which would you pick?

Shachar
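[For reference, the behavior Shachar expected is exactly what a by-ref opIndex is supposed to give. A minimal, hypothetical reduction (Bitmap, Table and the field names are invented stand-ins for the Weka types) would be:]

```d
// Hypothetical stand-ins for the Weka types. Since opIndex returns by ref,
// `t[i] &= x` should rewrite to `t.opIndex(i) &= x` and never need
// opIndexAssign at all.
struct Bitmap
{
    ulong bits;
    ref Bitmap opOpAssign(string op : "&")(Bitmap rhs)
    {
        bits &= rhs.bits;
        return this;
    }
}

struct Table
{
    Bitmap[4] data;
    ref Bitmap opIndex(size_t i) { return data[i]; }
}

void main()
{
    Table t;
    t.data[1].bits = 0b1111;
    t[1] &= Bitmap(0b0101); // the form that failed to compile at Weka
    assert(t.data[1].bits == 0b0101);
}
```

[If the compiler instead demands opIndexAssign for that last line, as it did inside the Weka code base, that is indeed a compiler bug.]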
Aug 23 2018
next sibling parent Ali <fakeemail example.com> writes:
On Friday, 24 August 2018 at 01:57:03 UTC, Shachar Shemesh wrote:
 At this point I can either use the work-around I already have 
 and (try to, obviously unsuccessfully) forget about it, file a 
 bug report that will be (justifiably) ignored because no-one 
 else can reproduce it, or spend an unknown amount of time (two 
 days would probably be low-balling at this point) in trying to 
 get this to reproduce on a watered down version of the code.
1. use the work-around, then
2. spend two man-days, spread over two weeks 30-60 minutes at a time, trying to reproduce on a watered-down version of the code, i.e. trying to isolate the problem
3. refactor once the real problem is found

From this nice article written by one of the more visionary developers, John Ousterhout:
 If you don't know what the problem was, you haven't fixed it
http://web.stanford.edu/~ouster/cgi-bin/sayings.php
Aug 23 2018
prev sibling next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 24/08/2018 1:57 PM, Shachar Shemesh wrote:
 Consider the following line from the weka code base:
 
          _trustedData.opIndex(diskIdx) &= NotBitmap(toDistrust);
 
 That's strange. Why didn't Shachar just do?
 
          _trustedData[diskIdx] &= NotBitmap(toDistrust);
 
 Answer: Because the compiler decided that it needs to call 
 _trustedData.opIndexAssign, and then failed the compilation because it 
 has none. There is no way that that is a bug in the code. This is a 
 compiler bug (maybe since fixed. This is fairly old code).
From now on, please report it as-is and tag it as e.g. unknown_weka or something. Give us a report once a month or so on what is open.

It'll be easier to figure out what is going on if we can see the symptom. Even if it isn't the root problem, it's a step in the right direction.
Aug 23 2018
prev sibling next sibling parent reply FeepingCreature <feepingcreature gmail.com> writes:
On Friday, 24 August 2018 at 01:57:03 UTC, Shachar Shemesh wrote:
 That's strange. Why didn't Shachar just do?

         _trustedData[diskIdx] &= NotBitmap(toDistrust);

 Answer: Because the compiler decided that it needs to call 
 _trustedData.opIndexAssign, and then failed the compilation 
 because it has none. There is no way that that is a bug in the 
 code. This is a compiler bug (maybe since fixed. This is fairly 
 old code).

 So, where's the issue number, I hear you ask? There is none. 
 This problem only happens inside the code base. Once I tried to 
 isolate it, I couldn't reproduce.
Have you tried to use the excellent Dustmite tool? It's never failed to reduce a bug for me.
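[For readers unfamiliar with the tool, a DustMite reduction of a compile-time bug is roughly the following; the directory name and the error text here are placeholders, not anything from the Weka code base:]

```shell
# DustMite keeps deleting pieces of a copy of `src` as long as the test
# command still exits 0, i.e. as long as the bug still reproduces.
dustmite src 'dmd -c -o- main.d 2>&1 | grep -qF "Error:"'
# The minimized tree is left in src.reduced/ when it finishes.
```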
Aug 24 2018
parent reply Shachar Shemesh <shachar weka.io> writes:
On 24/08/18 10:43, FeepingCreature wrote:
 Have you tried to use the excellent Dustmite tool? It's never failed to 
 reduce a bug for me.
Dustmite might be excellent. I wouldn't know. It cannot swallow the Weka code base. Shachar
Aug 24 2018
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Friday, August 24, 2018 4:08:52 AM MDT Shachar Shemesh via Digitalmars-d 
wrote:
 On 24/08/18 10:43, FeepingCreature wrote:
 Have you tried to use the excellent Dustmite tool? It's never failed to
 reduce a bug for me.
Dustmite might be excellent. I wouldn't know. It cannot swallow the Weka code base.
Dustmite can be fantastic, but it's far from a silver bullet. It's often difficult to give it stopping criteria in a way that gets you what you want, and you don't have to have all that much code before it takes hours to finish (one of my coworkers was recently trying to use dustmite to find an issue, and he had it running for days). I would expect that with a code base of any real size, you would have to be able to take a modular piece of it and test it directly in order to have much hope of it getting the job done. - Jonathan M Davis
Aug 24 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 6:57 PM, Shachar Shemesh wrote:
 At this point I can either use the work-around I already have and (try to, 
 obviously unsuccessfully) forget about it, file a bug report that will be 
 (justifiably) ignored because no-one else can reproduce it, or spend an
unknown 
 amount of time (two days would probably be low-balling at this point) in
trying 
 to get this to reproduce on a watered down version of the code.
 
 Which would you pick?
You should file a bug report, even with no example. It'll still be a clue; sometimes I can find problems without an example.

But still, assuming it is a compiler bug is a dodgy practice. I've drilled down on a lot of bug reports that the submitter was absolutely sure were compiler bugs, that turned out to be invalid.
Aug 24 2018
prev sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 17:01, Steven Schveighoffer wrote:
 So interestingly, you are accepting the sockaddr by VALUE.
Indeed. The reason is that I cannot accept them by reference, as then you wouldn't be able to pass lvalues in. Another controversial decision by D. Had that been C++, I'd definitely get a const ref instead. Shachar
Aug 23 2018
parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 8/23/18 12:27 PM, Shachar Shemesh wrote:
 On 23/08/18 17:01, Steven Schveighoffer wrote:
 So interestingly, you are accepting the sockaddr by VALUE.
Indeed. The reason is that I cannot accept them by reference, as then you wouldn't be able to pass lvalues* in. Another controversial decision by D.
*rvalues you meant. One that is actively being addressed, at least by the community: https://github.com/dlang/DIPs/blob/master/DIPs/DIP1016.md No guarantees it gets through, but this is probably further than anyone has ever gotten before on this topic (and it's a very old topic).
 
 Had that been C++, I'd definitely get a const ref instead.
If you want to use inheritance this is a given, in D or in C++. What this simply means is your identification of the problem is simply wrong -- it's not that you can't make subtypes with structs (you can), it's that you can't accept rvalues by reference, and accepting by reference is required for inheritance. -Steve
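[The rvalue restriction under discussion fits in a few lines; SockAddr here is a hypothetical stand-in, and `auto ref` is the usual template-only workaround:]

```d
// Hypothetical stand-in type; the point is which arguments each
// parameter-passing style accepts.
struct SockAddr
{
    ushort family;
}

void byRef(ref const SockAddr a) {}            // lvalues only
void byValue(const SockAddr a) {}              // lvalues and rvalues, but copies
void byAutoRef()(auto ref const SockAddr a) {} // template-only: ref for lvalues,
                                               // by-value for rvalues

void main()
{
    SockAddr lv;
    byRef(lv);               // ok: lvalue binds to ref
    // byRef(SockAddr(2));   // error: rvalue cannot bind to ref
    byValue(SockAddr(2));    // ok, at the cost of a copy
    byAutoRef(lv);           // ok, passed by ref
    byAutoRef(SockAddr(2));  // ok, rvalue accepted
}
```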
Aug 23 2018
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 11:29 AM, Steven Schveighoffer wrote:
 What this simply means is your identification of the problem is simply wrong
-- 
 it's not that you can't make subtypes with structs (you can), it's that you 
 can't accept rvalues by reference, and accepting by reference is required for 
 inheritance.
You're quite right that correct identification of the problem is critical before designing a solution.
Aug 23 2018
prev sibling next sibling parent Mike Franklin <slavo5150 yahoo.com> writes:
On Thursday, 23 August 2018 at 13:22:45 UTC, Shachar Shemesh 
wrote:

 Because in D, structs can't inherit,
Forgive me if I'm not helping, but if you are willing to create a little infrastructure, I think you can create polymorphic structs with the technique described at https://theartofmachinery.com/2018/08/13/inheritance_and_polymorphism_2.html. See https://gitlab.com/sarneaud/xanthe/blob/master/src/game/player.d#L18 for a concrete example.
 I can't do that for structs, and if I defined SockAddr as a 
 class, I'd mandate allocating it on the GC, violating the whole 
 point behind writing Mecca to begin with.
There are other ways to allocate memory for classes, without the GC. Do any of the techniques described at https://wiki.dlang.org/Memory_Management#Explicit_Class_Instance_Allocation give you an alternative?
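[As a sketch of that second suggestion (all names here are invented, and error handling is omitted), a class instance can live on the C heap via std.conv.emplace:]

```d
import core.stdc.stdlib : free, malloc;
import std.conv : emplace;

class Greeter
{
    int x;
    this(int x) { this.x = x; }
}

// Allocate and construct a class instance with malloc instead of the GC.
T heapAlloc(T, Args...)(Args args)
{
    enum size = __traits(classInstanceSize, T);
    void* mem = malloc(size);
    return emplace!T(mem[0 .. size], args);
}

// Run the destructor, then release the C-heap memory.
void heapFree(T)(T obj)
{
    destroy(obj);
    free(cast(void*) obj);
}

void main()
{
    auto g = heapAlloc!Greeter(42);
    assert(g.x == 42);
    heapFree(g);
}
```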
 The general consensus, however, is that these problems will not 
 be resolved (we used to file bugs in Bugzilla. We stopped doing 
 that because we saw nothing happens with them), and as far as 
 the future of the language goes, that's bad news.
I've fixed 4 bugs in the past 2 weeks: https://github.com/pulls?utf8=%E2%9C%93&q=is%3Apr+author%3AJinShil+archived%3Afalse+is%3Aclosed+Fix But I admit they were quite simple. I agree, the more difficult bugs tend to not get fixed. I've tried to fix a few of them, but they were beyond my current abilities. Again, you might have more success if you put some financial incentive behind them. Mike
Aug 23 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 6:22 AM, Shachar Shemesh wrote:
 (we used to file bugs in Bugzilla. We stopped doing that because we saw
nothing 
 happens with them)
Quite a number of Weka filed bugs have been fixed. Here's the 2.080 list of 46 bugs fixed: https://dlang.org/changelog/2.080.0.html#bugfix-list The ones that don't get fixed are the ones not posted to Bugzilla. This is true of every major language I know of. Furthermore, *anyone* with a mind to can fix any bug and post a PR for it.
Aug 23 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/23/2018 5:55 AM, Steven Schveighoffer wrote:
 If front() returns by ref, then no copying happens. If front() returns by 
 value, then a copy is made. This should not be surprising behavior.
I think he means, if the range ITSELF doesn't allow copying, it won't work with foreach (because foreach makes a copy), but it will work with opApply.
foreach (ref v; collection) does not make a copy. It's up to the programmer whether copying is done with it. After all, if one is passing a struct instance around by value instead of by ref, there is no escaping copying it. This should not be surprising. Designing a non-copyable struct is a more advanced technique, and it is reasonable to expect that someone using it should understand the consequences.
 There are many people who post short curt answers, maybe even cynical. But
this 
 isn't necessarily the authoritative answer. Where I see this happening, I 
 usually try to respond with a more correct answer (even though my voice isn't 
 authoratative exactly), but the sad truth is that we can't spend all our day 
 making sure we have a super-pleasant forum where every answer is valid and 
 nobody is rude.
The D forums are deliberately set up to allow people the freedom to say what they wish as much as possible, we only put minimal restrictions on it, such as we expect professional demeanor. What we aren't going to do is vet every post for consisting only of approved opinions. That wouldn't even be a forum - it would be an echo chamber.
 In reply to Shachar's general point:
 This whole thread seems very gloomy and final, but I feel like the tone does
not 
 match in my mind how D is progressing. "Every single one of the people [at
Weka] 
 rushing to defend D at the time has since come around." Seems like you all
have 
 decided to either ditch D internally, maybe moving forward, or accepted that 
 Weka will fail eventually due to the choice of D? It sure reads that way.
 
 This is in SHARP contrast to the presentation that Liran gave at Dconf this 
 year, touting D as a major reason Weka was able to develop what they did, and
to 
 some degree, your showcase of how Mecca works.
 
 My experience with D is that it has gotten much better over the years. I
suppose 
 that having worked with the earlier versions, and seeing what has happened
gives 
 me a different perspective. I guess I just don't have that feeling that there 
 are some unfixable problems that will "kill" the language. Everything in a 
 programming language is fixable, it just matters how much pain you are willing 
 to deal with to fix it. If we get to a point where there really is a sticking 
 point, D3 can be born.
 
 I do feel that we need, in general, more developers working on the compiler 
 itself. So many of the problems need compiler changes, and the learning curve
to 
 me just seems so high to get into it.
I've been programming for 40 years. Every language, and every implementation, has had problems, often requiring significant effort to deal with them. The issue for me is, can I get my work done?

For example, @safe isn't 100%. But simply having buffer overflow checking is valuable: dmd's source code does not use @safe, and I've recently noticed string handling code in it that had buffer overflow bugs as a result.

Nobody is going to guarantee the airplane you're flying on won't crash. But that doesn't mean all the efforts made to approach such a guarantee are pointless or doomed. The faults that result in an accident steadily get rarer and weirder.

Some have suggested that proofs could be constructed to prove @safe is correct in all cases. It's a good idea, but I have no idea how to construct such a proof, nor do I have any idea how to show that the proof itself covers every issue. The proof would inevitably be as complex as D itself is, putting this right back where we are now.

Criticism of @safe would be much more damning if using it caused unsafe behavior, but I am unaware of any such bugs. Note also that the whole point of dip1000 is expanding the coverage of @safe.
Aug 23 2018
parent ag0aep6g <anonymous example.com> writes:
On 08/23/2018 10:11 PM, Walter Bright wrote:
 On 8/23/2018 5:55 AM, Steven Schveighoffer wrote:
[...]
 I think he means, if the range ITSELF doesn't allow copying, it won't 
 work with foreach (because foreach makes a copy), but it will work 
 with opApply.
    foreach (ref v; collection) does not make a copy.
It makes a copy of `collection`.
 It's up to the programmer whether copying is done 
 with it. After all, if one is passing a struct instance around by value 
 instead of by ref, there is no escaping copying it. This should not be 
 surprising.
It's not obvious that using `collection` in a `foreach` means passing it around by value.
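[Both statements are consistent; a small sketch (the range type is invented) shows what is and is not copied:]

```d
// A minimal input range over a slice. foreach consumes a COPY of the
// struct, so the original is not advanced; `ref v` still refers to the
// underlying elements, so no element copies are made.
struct Counting
{
    int[] data;
    bool empty() const { return data.length == 0; }
    ref int front() { return data[0]; }
    void popFront() { data = data[1 .. $]; }
}

void main()
{
    auto r = Counting([1, 2, 3]);
    foreach (ref v; r)       // iterates over a copy of the struct r
        v *= 2;
    assert(!r.empty);            // the original range was not consumed...
    assert(r.data == [2, 4, 6]); // ...but `ref v` mutated the shared elements
}
```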
Aug 23 2018
prev sibling next sibling parent burjui <bytefu gmail.com> writes:
On Thursday, 23 August 2018 at 12:03:59 UTC, Walter Bright wrote:
 I'd appreciate a list of bugzilla issues you regard as critical 
 to your operations.
Sorry to weasel in, but https://issues.dlang.org/show_bug.cgi?id=2043

Sorry for the childish behaviour in Bugzilla (my last comment), but this bug makes me radiate gamma-rays, because the current D behaviour is stupid and invalid, the workarounds (hacks) are incredibly ugly, and, on top of that, the bug is 10 years old. There are also tons of other bugs, inconsistencies, etc. that I stumble upon regularly; they all were in Bugzilla long before my "discoveries".

Ok, let me be brutally honest. These types of things really piss me off in D. IMHO the top-tier devs are not focused on the really important things. For me, a long-term D user, proper closures are much more relevant than the D/C++ Kama Sutra. I don't have any C++ code and I don't intend to write any. I want to, and do, write D code, and I want it to work *at least* correctly. But Walter, our compiler-fu black belt master (not making fun of him, I really respect his expertise), is more concerned about "poor" C++ users (who have all the luxuries) than about peasants like me getting a kick (and maybe a re-stomp of the groin).

But I get it, it's a community-driven project, so "if you want it, DIY or get lost". I choose the latter, because my compiler-fu is rather weak, if it exists at all. I still love D, but more like a good book than a reliable tool. It's funny and sad at the same time that I regularly find nightly Rust more reliable than stable D. So the obvious move for me is to stop arguing with the Party and simply choose another party that better represents and addresses my needs.

Don't get me wrong, I'm not ditching D or saying it's crap. I'm still using it from time to time, but I will likely not choose it for any "serious" project (even a hobby one). I'm just tired of stomping on all the occasional caltrops, which are well-known, marked on the maps, but not being removed because children starve in Africa.
Aug 23 2018
prev sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 23/08/18 15:03, Walter Bright wrote:
 So you will excuse me, but I don't think this bug is being taken as 
 seriously as I think it should.
It is a serious problem. (There are workarounds available, like using scope(failure).)
I don't think you understand how unworkable this workaround is.

struct A {
    this(something);
    ~this();
}

struct B {
    A a;

    this(something) {
        a = A(something);
        // Safeguard against 14246
        scope(failure) destroy(a);

        // more code
    }
}

struct C {
    A a;

    this(something) {
        a = A(something);
        // Also needs a safeguard against 14246
        scope(failure) destroy(a);

        // more code
    }
}

struct D {
    this(something) {
        // Do nothing
    }
}

struct E {
    B b;
    D d;

    this(something) {
        b = B(something);
        // B doesn't even have an explicit destructor, but we are supposed to
        // look at its implementation and understand that it contains A, so:
        scope(failure) destroy(b);

        d = D(something);
        // D doesn't even contain A, but it might one day, so:
        scope(failure) destroy(d);
    }
}

The chances of this scheme actually working without errors are, more or less, zero.
Aug 24 2018
parent reply nkm1 <t4nk074 openmailbox.org> writes:
On Friday, 24 August 2018 at 10:16:34 UTC, Shachar Shemesh wrote:
 On 23/08/18 15:03, Walter Bright wrote:
 So you will excuse me, but I don't think this bug is being 
 taken as seriously as I think it should.
It is a serious problem. (There are workarounds available, like using scope(failure).)
I don't think you understand how unworkable this workaround is.
I think Walter was talking more about "scope (failure) destroy(this)" at the top of all your structs? I don't know if it has some gotchas, though (as I don't use RAII in D...).
Aug 24 2018
parent reply Shachar Shemesh <shachar weka.io> writes:
On 24/08/18 13:43, nkm1 wrote:
 
 I think Walter was talking more about "scope (failure) destroy(this)" at 
 the top of all your structs? I don't know if it has some gotchas, though 
 (as I don't use RAII in D...).
 
No, unlike what I suggest, that doesn't work without carefully reviewing every single place you put it to see whether the constructor actually supports destructing a partially constructed object. Shachar
Aug 24 2018
next sibling parent nkm1 <t4nk074 openmailbox.org> writes:
On Friday, 24 August 2018 at 13:34:57 UTC, Shachar Shemesh wrote:
 On 24/08/18 13:43, nkm1 wrote:
 
 I think Walter was talking more about "scope (failure) 
 destroy(this)" at the top of all your structs? I don't know if 
 it has some gotchas, though (as I don't use RAII in D...).
 
No, unlike what I suggest, that doesn't work without carefully reviewing every single place you put it to see whether the constructor actually supports destructing a partially constructed object. Shachar
So I guess you're saying you also use "= void" as the default initializer for some things. Otherwise, it's already a requirement that all default-initialized things should be destructible...

Yeah, I do agree that these kinds of things are not very well supported by D. I do not agree that it means the language is doomed (from a technical standpoint, at least). Moving in a more Java-esque direction ("Fast like C++ but with good GC" (so, in reality, slower than C++ but faster than JVM)) would be a reasonable strategy for D. Seems like it's not going to happen, though.
Aug 24 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/24/2018 6:34 AM, Shachar Shemesh wrote:
 No, unlike what I suggest, that doesn't work without carefully reviewing every 
 single place you put it to see whether the constructor actually supports 
 destructing a partially constructed object.
All D objects are default-initialized before the constructor sees it (unlike C++). A destructor should be able to handle a default-initialized object.
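[A sketch of what that guarantee demands of a destructor; the Resource type here is invented for illustration:]

```d
import core.stdc.stdlib : free, malloc;

struct Resource
{
    int* handle; // null in Resource.init

    this(int v)
    {
        handle = cast(int*) malloc(int.sizeof);
        *handle = v;
    }

    // Must tolerate running on a default-initialized instance,
    // i.e. on one where handle is still null.
    ~this()
    {
        if (handle !is null)
        {
            free(handle);
            handle = null;
        }
    }
}

void main()
{
    {
        Resource r; // never constructed beyond .init
    }               // destructor still runs here and must handle null
    {
        auto r = Resource(5);
        assert(*r.handle == 5);
    }               // destructor frees the allocation
}
```

[The follow-up dispute is about objects whose constructor ran partway, which need not be in the .init state.]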
Aug 25 2018
next sibling parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Saturday, 25 August 2018 at 07:56:55 UTC, Walter Bright wrote:
 On 8/24/2018 6:34 AM, Shachar Shemesh wrote:
 No, unlike what I suggest, that doesn't work without carefully 
 reviewing every single place you put it to see whether the 
 constructor actually supports destructing a partially 
 constructed object.
All D objects are default-initialized before the constructor sees it (unlike C++). A destructor should be able to handle a default-initialized object.
Then we should add a switch to inject a unittest that runs the destructor on a default-initialisable object, i.e. if it has a static opCall or a @disable this(); then don't, otherwise do.

Otherwise this is a well-disguised instance of faith-based programming.
Aug 25 2018
prev sibling parent reply Shachar Shemesh <shachar weka.io> writes:
On 25/08/18 10:56, Walter Bright wrote:
 On 8/24/2018 6:34 AM, Shachar Shemesh wrote:
 No, unlike what I suggest, that doesn't work without carefully 
 reviewing every single place you put it to see whether the constructor 
 actually supports destructing a partially constructed object.
All D objects are default-initialized before the constructor sees it (unlike C++). A destructor should be able to handle a default-initialized object.
I'm not talking about a default-initialized object. I'm talking about an object where the constructor started running, but not to completion.

With that said, this statement is, I think, representative of the Lego problem D has. All D objects? Really? Even this one?

struct A {
    int a;

    @disable this();
    @disable init;

    this(int number);
    ~this();
}

If you allow a feature to be disabled, you really need to keep in mind that feature might be well and truly disabled.
Aug 25 2018
next sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Saturday, August 25, 2018 7:33:47 AM MDT Shachar Shemesh via Digitalmars-
d wrote:
 On 25/08/18 10:56, Walter Bright wrote:
 On 8/24/2018 6:34 AM, Shachar Shemesh wrote:
 No, unlike what I suggest, that doesn't work without carefully
 reviewing every single place you put it to see whether the constructor
 actually supports destructing a partially constructed object.
All D objects are default-initialized before the constructor sees it (unlike C++). A destructor should be able to handle a default-initialized object.
I'm not talking about a default initialized object. I'm talking about an object where the constructor started running, but not to completion.

With that said, this statement is, I think, representative of the Lego problem D has. All D objects? Really? Even this one?

struct A {
    int a;

    @disable this();
    @disable init;

    this(int number);
    ~this();
}

If you allow a feature to be disabled, you really need to keep in mind that feature might be well and truly disabled.
As I understand it, it's not actually possible to disable the init value. Even if bodies are provided for the constructor and destructor (so that they don't give you errors), you just end up with an error like

q.d(14): Error: no identifier for declarator init

You could replace that with something like

@disable void init();

but that's only legal because declaring a member named init has never been made illegal, even though it's generally been agreed upon (by most of the core devs anyway) that it should be. The way D is designed, there _must_ be an init value. The closest that you can get to disabling it is the

@disable this();

line, which just disables default initialization. The init value still gets used when constructing the object, and it can still be used explicitly.

If void initialization of member variables worked the way that some folks think that it should - including Andrei and Walter:

https://issues.dlang.org/show_bug.cgi?id=11331
https://issues.dlang.org/show_bug.cgi?id=11817

then I think that that would definitely be an example that would fit the point that you're trying to make (since then you have to worry about the constructor partially setting values on top of garbage and then trying to destroy that correctly), but at least for the moment, it doesn't actually work. Having to worry about destructors running on void-initialized objects is a similar problem, but not quite the same, since that doesn't involve an exception being thrown from a constructor. Regardless, even if a type is designed such that its init value can be properly destroyed, and you don't have to worry about void initialization, it isn't all that hard to design it such that the destructor won't work properly if the constructor doesn't complete properly.

What all of this makes me think of, though, is a similar problem that FeepingCreature loves to bring up and complain about, which is that invariants that consider the init value to be invalid (e.g. if the invariant checks that a member variable is non-null) blow up in your face if the type has a destructor, because the invariant gets run before the destructor, and member variables that are pointers will of course be null in the init value. And while disabling default initialization helps, it doesn't fix the problem, because of code that explicitly uses the init value. To work around this, he's come up with some weird solution using unions that's probably going to break on him at some point (though if I understand correctly, he's changed std.typecons.Nullable to use it, which makes it a lot less likely that it's going to break). But really, having an invariant that fails for the init value is generally a recipe for disaster, much as it's easy to see why it would be desirable for stuff like pointers. What's worse is that once void initialization is involved, an invariant is almost certainly going to fail, because the invariant gets called before opAssign. And that's the number one reason that I never use invariants in structs anymore.

In any case, all of that is really a tangent with regards to init values, but it's definitely a sign of how some of the stuff around init values, constructors, and destructors doesn't really play well together. And sadly, it's really not that hard to get into a state where your destructor is going to have serious problems if it's run. In general, any place where D was designed around the idea that something would _always_ be there (e.g. init values and default initialization) but we've then later added the ability to get around it (e.g. void initialization or @disable) has tended to not play well with everything else and has caused some fun problems.

- Jonathan M Davis
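For readers who haven't hit this, a minimal sketch of the invariant-vs-init problem described above (the Handle type is hypothetical, made up for illustration):

```d
// Sketch of the problem: an invariant that rejects the .init state
// fires when the destructor runs on a default-initialized value,
// because invariants are checked before ~this().
struct Handle
{
    int* p;

    this(int v)
    {
        p = new int;
        *p = v;
    }

    invariant
    {
        assert(p !is null); // always false for Handle.init
    }

    ~this() {}
}

void main()
{
    auto h = Handle(42); // fine: the invariant holds
    // Handle bad;       // default-initialized; at scope exit the
    //                   // invariant runs before ~this() and asserts,
    //                   // since bad.p is null
}
```

Uncommenting the `Handle bad;` line makes the program abort with an invariant failure at scope exit, even though no user code ever touched `bad`.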
Aug 25 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/25/2018 7:37 AM, Jonathan M Davis wrote:
 In general, any place where D was
 designed around the idea that something would _always_ be there (e.g. init
 values and default initialization) but we've then later added the ability to
 get around it (e.g. void initialization or @disable) has tended to not play
 well with everything else and has caused some fun problems.
It's why that stuff isn't allowed in @safe code, and hence one should realize one is taking responsibility from the compiler for ensuring correctness. D expects someone writing system code to have a much greater awareness of how the language and the machine work.

If you remove the blade guards from the table saw, more things can be done with it, but you'll need to take much greater care using it.
Aug 25 2018
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Saturday, August 25, 2018 2:16:16 PM MDT Walter Bright via Digitalmars-d 
wrote:
 On 8/25/2018 7:37 AM, Jonathan M Davis wrote:
 In general, any place where D was
 designed around the idea that something would _always_ be there (e.g.
 init values and default initialization) but we've then later added the
 ability to get around it (e.g. void initialization or @disable) has
 tended to not play well with everything else and has caused some fun
 problems.
It's why that stuff isn't allowed in @safe code, and hence one should realize one is taking responsibility from the compiler for ensuring correctness. D expects someone writing system code to have a much greater awareness of how the language and the machine work. If you remove the blade guards from the table saw, more things can be done with it, but you'll need to take much greater care using it.
Definitely, but some of the features don't play well together even with @system - e.g. if you have a struct with an invariant, you can't use void initialization, because the invariant will be checked when you assign the struct an actual value later. I was forced to remove the invariant that SysTime had years ago because of issues related to that, and I've basically given up on using invariants in structs as a result. If I'm in full control of all of the code, it's not a problem, but as soon as I'm providing library code to someone else, struct invariants become land mines. And just in general, it becomes pretty problematic to try to skip or disable something that the language was built around (e.g. default initialization) - though fortunately, it usually interacts better than invariants tend to when you do something like void initialization. - Jonathan M Davis
Aug 27 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/25/2018 6:33 AM, Shachar Shemesh wrote:
 If you allow a feature to be disabled, you really need to keep in mind that 
 feature might be well and truly disabled.
Disabling default initializations is not safe, and that means taking responsibility for it not being default initialized. It's like turning off array bounds checking: then it's up to you to ensure no buffer overflows are happening.

On a pragmatic note, what you're asking for is a set of nested try blocks, one for each field with a throwing constructor, as opposed to one try block around the entire function. This is an expensive proposition in terms of performance. You'll need to weigh that very carefully against saving the default zero initialization of the struct, which is very efficient.
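For illustration, here is roughly what that per-field exception handling looks like when written out by hand (the Resource and Wrapper types are hypothetical); each scope(failure) lowers to one of the nested try blocks described above:

```d
// Sketch: manual rollback of already-constructed fields when a later
// field's construction throws. scope(failure) compiles down to the
// nested try/catch blocks mentioned in the post.
struct Resource
{
    int id = -1; // -1 marks "never constructed" (the .init state)
    this(int id) { this.id = id; }
    ~this() { /* release only when id != -1 */ }
}

struct Wrapper
{
    Resource a, b;

    this(int x)
    {
        a = Resource(x);
        scope (failure) destroy(a); // undo a if constructing b throws
        b = Resource(x + 1);        // one guard per throwing field
    }
}

void main()
{
    auto w = Wrapper(1);
    assert(w.a.id == 1 && w.b.id == 2);
}
```

If `Resource(x + 1)` threw, `destroy(a)` would run a's destructor and reset it to `Resource.init`, so the partially constructed Wrapper never escapes with a live `a` and a dead `b`.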
Aug 25 2018
prev sibling parent rjframe <dlang ryanjframe.com> writes:
On Thu, 23 Aug 2018 04:02:37 -0700, Walter Bright wrote:

 On 8/23/2018 2:09 AM, Shachar Shemesh wrote:
 * The community
 
 Oh boy.
 
 Someone who carries weight needs to step in when the forum is trying to
 squash down on criticism. For Mecca, I'm able to do that [2], but for
 D, this simply doesn't happen.
If someone is trying to squash criticism, I would like to see what you're referring to. If a post contains unprofessional behavior, like using f--k or harassing people, it will get removed. Posts that are simply critical are not removed (if they were, this thread would have disappeared).
One problem is that "professional" has a broad range of meanings, and a variety of cultures are represented here. I've had bosses who thought that kind of behavior was fine if the person "deserved" it. It would be nice to have a published code of conduct; it doesn't need to be large or formal, just a simple definition of professional, respectful behavior. Anybody would be able to point to it, rather than having to hope that you, or someone who's around enough to feel comfortable calling someone out, is on the NG at the time.
Sep 01 2018
prev sibling next sibling parent reply Matheus <a a.com> writes:
On Thursday, 23 August 2018 at 09:09:40 UTC, Shachar Shemesh 
wrote:
 On 23/08/18 09:58, Joakim wrote:
 Because you've not listed any here, which makes you no better 
 than some noob
Here's one: the forum does not respond well to criticism.
Well, I'm a D hobbyist and of course it's not a perfect language, and you have some valid points, but on the other hand I think it's very disrespectful to come into a community and say the product that people are working on, mainly as volunteers and without any payment, "is dead". Matheus.
Aug 23 2018
parent rjframe <dlang ryanjframe.com> writes:
On Thu, 23 Aug 2018 15:34:34 +0000, Matheus wrote:

 Well, I'm a D hobbyist and of course it's not a perfect language, and you
 have some valid points, but on the other hand I think it's very
 disrespectful to come into a community and say the product that people
 are working on, mainly as volunteers and without any payment, "is dead".
 
 Matheus.
Not necessarily. If you see somebody's about to drive off a cliff, warning them is a good thing to do. If someone thinks the direction D is headed will lead to the end of the language as a good/viable choice for programmers, then trying to change that direction is the best thing s/he can do.
Sep 01 2018
prev sibling parent reply Nick Treleaven <nick geany.org> writes:
On Thursday, 23 August 2018 at 09:09:40 UTC, Shachar Shemesh 
wrote:
 Please see the thread about lazy [1] for a case where a 
 question actually has an answer, but nobody seems to know it
I updated the spec for lazy parameters to add a link to lazy variadic functions at the end, and for the latter I added a simpler version of Steven Schveighoffer's examples of both. So at least in the future, someone with a similar problem might find out that lazy variadics are another option.

https://dlang.org/spec/function.html#lazy-params
https://dlang.org/spec/function.html#lazy_variadic_functions

I hadn't understood the rationale for lazy variadic functions just from reading the spec. If I learn something that should be mentioned in the spec, I try to make a pull request. Failing that, we can file bugzilla issues for missing documentation, even if it's only an enhancement for clarification.
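To make the option concrete, a small sketch of a lazy variadic function (an array-of-delegates parameter, per the second spec link; the function and variable names here are made up):

```d
import std.stdio : writeln;

int count;
int bump() { return ++count; }

// Lazy variadic function: each argument that isn't already a delegate
// is implicitly wrapped in one, so it is only evaluated on demand.
int firstNonZero(int delegate()[] dgs...)
{
    foreach (dg; dgs)
    {
        auto v = dg();
        if (v != 0)
            return v;
    }
    return 0;
}

void main()
{
    auto v = firstNonZero(0, bump(), bump());
    writeln(v);
    // The third argument's bump() was never evaluated:
    assert(v == 1 && count == 1);
}
```

Unlike a plain lazy parameter, this lets the callee decide per-argument whether (and how many times) to evaluate, which is the rationale the spec text is trying to convey.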
Aug 31 2018
parent Basile B. <b2.temp gmx.com> writes:
On Friday, 31 August 2018 at 08:36:27 UTC, Nick Treleaven wrote:
 I hadn't understood the rationale for lazy variadic functions
https://dlang.org/spec/function.html#lazy_variadic_functions

I don't know if this has been updated too, but this sentence makes no sense:

"Then each of the arguments whose type does not match that of the delegate is converted to a delegate."
Aug 31 2018
prev sibling next sibling parent reply bachmeier <no spam.net> writes:
On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
wrote:

 To sum it up: fatal flaws + no path to fixing + no push from 
 the community = inevitable eventual death.

 With great regrets,
 Shachar
I want to jump in for the sake of someone from the outside coming in and reading this to say that I disagree. I don't know a whole lot about the type of development you're doing, because that's not my line of work, and you obviously know that area well. However, for scripting tasks (as Dicebot mentioned) D is great. It is also great for real but smaller projects (5,000-20,000 lines). I don't think there's a better choice right now for data science/scientific programming, where you have many small jobs going on repeatedly, and those are large and growing areas. Weka is an awesome project, but I don't know that most people considering D should use your experience as the basis of their decision. At least in my areas, I expect considerable growth in the usage of D over the next 10 years. Maybe it won't see much traction as a C++ replacement for large projects like Weka.
Aug 23 2018
parent rjframe <dlang ryanjframe.com> writes:
On Thu, 23 Aug 2018 14:29:23 +0000, bachmeier wrote:

 Weka is an awesome project, but I don't know that most people
 considering D should use your experience as the basis of their decision.
 At least in my areas, I expect considerable growth in the usage of D
 over the next 10 years. Maybe it won't see much traction as a C++
 replacement for large projects like Weka.
As long as D calls itself a systems language (which I believe is still the case), the experience of organizations building large systems is extremely important -- for organizations that want to build large systems.
Sep 01 2018
prev sibling next sibling parent Everlast <Everlast For.Ever> writes:
On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
wrote:
 On 22/08/18 21:34, Ali wrote:
 On Wednesday, 22 August 2018 at 17:42:56 UTC, Joakim wrote:
 Pretty positive overall, and the negatives he mentions are 
 fairly obvious to anyone paying attention.
Yea, I agree, the negatives are not really negative. Walter, no matter how smart he is, is one man who can only work on so many things at the same time. It's a chicken-and-egg situation: D needs more core contributors, and to get more contributors it needs more users, and to get more users it needs more core contributors.
No, no and no. I was holding out on replying to this thread to see how the community would react. The vibe I'm getting, however, is that the people who are seeing D's problems have given up on effecting change.

It is no secret that when I joined Weka, I was the sole D detractor in a company quite enamored with the language. I used to have quite heated water-cooler debates about that point of view. Every single one of the people rushing to defend D at the time has since come around. There is still some debate on whether, points vs. counter-points, choosing D was a good idea, but the overwhelming consensus inside Weka today is that D has *fatal* flaws and no path to fixing them. And by "fatal", I mean flaws that are likely to literally kill the language.

The thing that brought them around was not my power of persuasion. It was spending a couple of years working with the language on an everyday basis.

And you will notice this in the way Weka employees talk on this forum: except me, they all disappeared. You used to see Idan, Tomer and Eyal post here. Where are they?

This forum is hostile to criticism, and generally tries to keep everyone using D the same way. If you're doing cutting-edge D, the forum is almost no help at all. The consensus among former posters here is that it is generally a waste of time, so almost everyone left, and those who didn't, stopped posting.

And it's not just Weka. I've had a chance to talk in private to some other developers. Quite a lot have serious, fundamental issues with the language. You will notice none of them speaks up on this thread. They don't see the point.

No technical project is born great. If you want a technical project to be great, the people working on it have to focus on its *flaws*. The D community just doesn't do that.

To sum it up: fatal flaws + no path to fixing + no push from the community = inevitable eventual death.

With great regrets,
Shachar
I agree with this. I no longer program in D, except for minor things, because of this type of approach. D, as a language, is the best. D as an actual practical tool is a deadly pit of snakes... any one of which can bite you, and that won't stop the others. Of course, in the pit is where all the gold is at...

D's ecosystem is the problem, not the language (although the bugs in the implementation are a problem, they seem to be generally solved, and since the source is open they can be fixed when needed). It is obviously the mentalities of the leaders... it always is. That is why they are called leaders: because they lead, and whatever mentalities they have will shape who and what they lead. That would be a very impressive language and ecosystem! Especially if it had both .net and native compilation and fixed some of the major language issues D has.

My feeling is D is and will stay stagnant for the majority of the world. It doesn't seem to have enough momentum to break out, and the "leaders" don't seem to know much about actually leading... programming? Yes, but leading? No, not really... (obviously they know something, but I'm talking about what is required... a Bill Gates-like character, say, not that we want another one of those!)

D is one of those languages where something little and stupid is always getting in the way of staying in the zone, constantly tripping you up... it becomes a big drag after a while. I'd rather program in a language that has its shit together, where I can write large projects in a tenth of the time and with a fourth of the trouble. Since time is money, you know these types of issues will stop businesses from adopting D. The typical answer from the D community is "Implement it in a library!" or "It has bindings!", as if these are the solutions someone trying to get shit done wants to hear. Usually the libraries are defunct in some way (bit rot, version issues, shitty design, some deal breaker (e.g., uses gc), shitty documentation, etc).
Since D is mainly community driven, this means that the community will cobble shit together and then it becomes part of D. This is good for the sheer amount of code generation but terrible for unity of design. Everyone does it their way, which results in many different approaches to many different things (and this even gets into the libraries and compiler design). There has to be a sense of balance in anything, and this includes leadership, compiler design, language features, etc... D does not have that balance, and people recognize that in whatever way they see it and generally choose not to use it. The people that use it have a de facto need to project D as a balanced language (oh, it has this, this and that! It has this and that and that over there too, if you jump through hoops A, B and C). Very few people have the intelligence to be able to admit they are going down the wrong path and it's time to turn around. It's human nature to dig and dig and dig and dig and dig and dig...
Aug 23 2018
prev sibling parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Thursday, 23 August 2018 at 03:50:44 UTC, Shachar Shemesh 
wrote:
 On 22/08/18 21:34, Ali wrote:
 On Wednesday, 22 August 2018 at 17:42:56 UTC, Joakim wrote:
 Pretty positive overall, and the negatives he mentions are 
 fairly obvious to anyone paying attention.
Yea, I agree, the negatives are not really negative. Walter, no matter how smart he is, is one man who can only work on so many things at the same time. It's a chicken-and-egg situation: D needs more core contributors, and to get more contributors it needs more users, and to get more users it needs more core contributors.
No, no and no. I was holding out on replying to this thread to see how the community would react. The vibe I'm getting, however, is that the people who are seeing D's problems have given up on effecting change.

It is no secret that when I joined Weka, I was the sole D detractor in a company quite enamored with the language. I used to have quite heated water-cooler debates about that point of view. Every single one of the people rushing to defend D at the time has since come around. There is still some debate on whether, points vs. counter-points, choosing D was a good idea, but the overwhelming consensus inside Weka today is that D has *fatal* flaws and no path to fixing them. And by "fatal", I mean flaws that are likely to literally kill the language.

The thing that brought them around was not my power of persuasion. It was spending a couple of years working with the language on an everyday basis.

And you will notice this in the way Weka employees talk on this forum: except me, they all disappeared. You used to see Idan, Tomer and Eyal post here. Where are they?

This forum is hostile to criticism, and generally tries to keep everyone using D the same way. If you're doing cutting-edge D, the forum is almost no help at all. The consensus among former posters here is that it is generally a waste of time, so almost everyone left, and those who didn't, stopped posting.

And it's not just Weka. I've had a chance to talk in private to some other developers. Quite a lot have serious, fundamental issues with the language. You will notice none of them speaks up on this thread. They don't see the point.

No technical project is born great. If you want a technical project to be great, the people working on it have to focus on its *flaws*. The D community just doesn't do that.

To sum it up: fatal flaws + no path to fixing + no push from the community = inevitable eventual death.

With great regrets,
Shachar
Same feeling here btw. I regularly have to face strange bugs while updating the compiler or its libraries. For instance, my Resync tool used to work fine both on Windows and Linux. But it seems that the latest version of "std.file.copy" now completely ignores the "PreserveAttributes.no" argument on Windows, which made recent Windows builds of Resync fail on read-only files. Very typical... While D remains my favorite file scripting language, I must admit that this is quite disappointing for such an old language, compared to similar languages like Crystal.
Sep 04 2018
next sibling parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 04/09/2018 9:40 PM, Ecstatic Coder wrote:
 But it seems that the latest version of "std.file.copy" now completely 
 ignores the "PreserveAttributes.no" argument on Windows, which made 
 recent Windows builds of Resync fail on read-only files.
What??? There is nothing in the changelog between 2.080.0 and 2.082.0 for changes to std.file. Version from July 2017[0]. Version from 2.082.0[1]. They look the same to me. [0] https://github.com/dlang/phobos/blob/d8959320e0c47a1861e32bbbf6a3ba30a019798e/std/file.d#L3430 [1] https://github.com/dlang/phobos/blob/v2.082.0/std/file.d#L4216
Sep 04 2018
parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Tuesday, 4 September 2018 at 09:56:13 UTC, rikki cattermole 
wrote:
 On 04/09/2018 9:40 PM, Ecstatic Coder wrote:
 But it seems that the latest version of "std.file.copy" now 
 completely ignores the "PreserveAttributes.no" argument on 
 Windows, which made recent Windows builds of Resync fail on 
 read-only files.
What??? There is nothing in the changelog between 2.080.0 and 2.082.0 for changes to std.file. Version from July 2017[0]. Version from 2.082.0[1]. They look the same to me. [0] https://github.com/dlang/phobos/blob/d8959320e0c47a1861e32bbbf6a3ba30a019798e/std/file.d#L3430 [1] https://github.com/dlang/phobos/blob/v2.082.0/std/file.d#L4216
Maybe I'm wrong, but what I can say is that I've recently updated DMD and compiled a Windows build of Resync, and that I *HAD* to make Windows-specific code that removes the "read-only" attributes only on Windows.

attributes = source_file_path.getAttributes();
source_file_path.getTimes( access_time, modification_time );

version ( Windows )
{
    if ( target_file_path.exists() )
    {
        target_file_path.setAttributes( attributes & ~1 );
    }

    source_file_path.copy( target_file_path, PreserveAttributes.no );
    target_file_path.setAttributes( attributes & ~1 );
    target_file_path.setTimes( access_time, modification_time );
    target_file_path.setAttributes( attributes );
}
else
{
    if ( target_file_path.exists() )
    {
        target_file_path.setAttributes( 511 );
    }

    source_file_path.copy( target_file_path, PreserveAttributes.no );
    target_file_path.setAttributes( attributes );
    target_file_path.setTimes( access_time, modification_time );
}

Honestly, I don't see why I have to make this ugly fix on Windows, while the Linux version has always worked fine on read-only files.
Sep 04 2018
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 04/09/2018 10:27 PM, Ecstatic Coder wrote:
 On Tuesday, 4 September 2018 at 09:56:13 UTC, rikki cattermole wrote:
 On 04/09/2018 9:40 PM, Ecstatic Coder wrote:
 But it seems that the latest version of "std.file.copy" now 
 completely ignores the "PreserveAttributes.no" argument on Windows, 
 which made recent Windows builds of Resync fail on read-only files.
What??? There is nothing in the changelog between 2.080.0 and 2.082.0 for changes to std.file. Version from July 2017[0]. Version from 2.082.0[1]. They look the same to me.

[0] https://github.com/dlang/phobos/blob/d8959320e0c47a1861e32bbbf6a3ba30a019798e/std/file.d#L3430
[1] https://github.com/dlang/phobos/blob/v2.082.0/std/file.d#L4216
Maybe I'm wrong, but what I can say is that I've recently updated DMD and compiled a Windows build of Resync, and that I *HAD* to make Windows-specific code that removes the "read-only" attributes only on Windows.

attributes = source_file_path.getAttributes();
source_file_path.getTimes( access_time, modification_time );

version ( Windows )
{
    if ( target_file_path.exists() )
    {
        target_file_path.setAttributes( attributes & ~1 );
    }

    source_file_path.copy( target_file_path, PreserveAttributes.no );
    target_file_path.setAttributes( attributes & ~1 );
    target_file_path.setTimes( access_time, modification_time );
    target_file_path.setAttributes( attributes );
}
else
{
    if ( target_file_path.exists() )
    {
        target_file_path.setAttributes( 511 );
    }

    source_file_path.copy( target_file_path, PreserveAttributes.no );
    target_file_path.setAttributes( attributes );
    target_file_path.setTimes( access_time, modification_time );
}

Honestly, I don't see why I have to make this ugly fix on Windows, while the Linux version has always worked fine on read-only files.
Hang on a second.

assert(preserve == Yes.preserveAttributes);

Something is smelling an awful lot here.

Up to Windows 7, CopyFileW, which is what we use on Windows, didn't copy the attributes over[0], but it does now. This is a bug on our end; the fix should include a fallback that copies the file contents over manually.

[0] https://docs.microsoft.com/en-us/windows/desktop/api/winbase/nf-winbase-copyfilew
Sep 04 2018
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
https://issues.dlang.org/show_bug.cgi?id=19221
Sep 04 2018
prev sibling parent Ecstatic Coder <ecstatic.coder gmail.com> writes:
 Hang on a second.

 assert(preserve == Yes.preserveAttributes);

 Something is smelling an awful lot here.

 Up to Windows 7 CopyFileW which is used for Windows didn't copy 
 the attributes over[0] but it does now.

 This is a bug on our end, which should include a fallback to 
 copying manually the file contents over.

 [0] 
 https://docs.microsoft.com/en-us/windows/desktop/api/winbase/nf-winbase-copyfilew
Yeah, keeping exactly the same behavior on every supported platform is never easy. And when you need to support Windows too, I know from experience that it can quickly become a pain in the *ss...
prev sibling parent Laurent =?UTF-8?B?VHLDqWd1aWVy?= <laurent.treguier.sink gmail.com> writes:
On Tuesday, 4 September 2018 at 09:40:23 UTC, Ecstatic Coder 
wrote:
 But it seems that the latest version of "std.file.copy" now 
 completely ignores the "PreserveAttributes.no" argument on 
 Windows, which made recent Windows builds of Resync fail on 
 read-only files.

 Very typical...

 While D remains my favorite file scripting language, I must 
 admit that this is quite disappointing for such an old 
 language, compared to similar languages like Crystal.
Windows can simply be a pain to work with though. Look at Crystal itself, it doesn't support Windows natively as far as I know, so of course you won't have Windows-specific bugs in Crystal...
Sep 04 2018
prev sibling parent reply Mihails <nope nope.nope> writes:
On Wednesday, 22 August 2018 at 17:42:56 UTC, Joakim wrote:
 Pretty positive overall, and the negatives he mentions are 
 fairly obvious to anyone paying attention. D would really 
 benefit from a project manager, which I think Martin Nowak has 
 tried to do, and which the companies using D and the community 
 should get together and fund as a paid position. Maybe it could 
 be one of the funding targets for the Foundation.

 If the job was well-defined, so I knew exactly what we're 
 getting by hiring that person, I'd contribute to that.
Didn't intend to chime in, but no, that was not what I meant at all. My stance is that as long as the current leadership remains in charge and keeps the same attitude, no amount of money or developer time will fix D. What is the point in hiring someone to manage things if Walter can still do stuff like -dip1000? For me, the moment of understanding this was the exact point of no return.
Aug 23 2018
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 23/08/2018 9:16 PM, Mihails wrote:
 On Wednesday, 22 August 2018 at 17:42:56 UTC, Joakim wrote:
 Pretty positive overall, and the negatives he mentions are fairly 
 obvious to anyone paying attention. D would really benefit from a 
 project manager, which I think Martin Nowak has tried to do, and which 
 the companies using D and the community should get together and fund 
 as a paid position. Maybe it could be one of the funding targets for 
 the Foundation.

 If the job was well-defined, so I knew exactly what we're getting by 
 hiring that person, I'd contribute to that.
Didn't intend to chime in, but no, that was not what I meant at all. My stance is that as long as the current leadership remains in charge and keeps the same attitude, no amount of money or developer time will fix D. What is the point in hiring someone to manage things if Walter can still do stuff like -dip1000? For me, the moment of understanding this was the exact point of no return.
Whoever acts as a project manager would need to be given the ability to override W&A as far as process is concerned. They shouldn't be able to create said process, but should be able to enforce it.

Remember, their job would be to facilitate communication within the community, not to deal with burn-down charts. Ensuring DIPs and the spec remain up to date is a big part of that. That is why I suggested it originally. It minimizes a lot of the existing problems we have had for years.
Aug 23 2018
parent reply Mihails <nope nope.nope> writes:
On Thursday, 23 August 2018 at 09:30:45 UTC, rikki cattermole 
wrote:
 Whoever acts as a project manager would need to be given the 
 ability to override W&A as far as process is concerned. They 
 shouldn't be able to create said process, but should be able to 
 enforce it.
Good luck getting W&A to agree to it, especially when there is yet another "critical D opportunity" on the table ;)
Aug 23 2018
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 23/08/2018 9:45 PM, Mihails wrote:
 On Thursday, 23 August 2018 at 09:30:45 UTC, rikki cattermole wrote:
 Whoever acts as a project manager would need to be given the ability 
 to override W&A as far as process is concerned. They shouldn't be able 
 to create said process, but should be able to enforce it.
Good luck getting W&A to agree to it, especially when there is yet another "critical D opportunity" on the table ;)
No. They have power for as long as we the community say that they do. We are at the point where they need a check and balance to keep everybody going smoothly. And I do hope that they listen to us before somebody decides it's forkin' time.
Aug 23 2018
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Aug 23, 2018 at 09:51:43PM +1200, rikki cattermole via Digitalmars-d
wrote:
 On 23/08/2018 9:45 PM, Mihails wrote:
 On Thursday, 23 August 2018 at 09:30:45 UTC, rikki cattermole wrote:
 Whoever acts as a project manager would need to be given the
 ability to override W&A as far as process is concerned. They
 shouldn't be able to create said process, but should be able to
 enforce it.
Good luck getting W&A to agree to it, especially when there is yet another "critical D opportunity" on the table ;)
No. They have power for as long as we the community say that they do. We are at the point where they need a check and balance to keep everybody going smoothly. And I do hope that they listen to us before somebody decides its forkin' time.
And the nice thing about D being open source is that should the situation escalate to the point where the community simply cannot get along with W&A anymore, forking is always an option. (Hopefully it won't come to that, though. AFAIK there was a fork during the D1 fiasco with Tango vs. Phobos, but that fork has since basically died. Maintaining a language is a lot of work, and not everyone has the persistence to stick with it for the long haul.) T -- Guns don't kill people. Bullets do.
Aug 23 2018
parent reply ag0aep6g <anonymous example.com> writes:
On 08/23/2018 05:17 PM, H. S. Teoh wrote:
 And the nice thing about D being open source is that should the
 situation escalate to the point where the community simply cannot get
 along with W&A anymore, forking is always an option.
But forking only happens when the dissenters have enough motivation to do it. If they don't, they might just fade away one by one, looking for greener pastures elsewhere. That would probably be the worst outcome for D: the community shrinks until it's just Walter and Andrei wondering where everybody went.

Now, I don't know if the amount of (quality) contributors is actually growing, shrinking, or stagnating. But I do know that I feel the pull (push?) away from D myself.

It feels like `@safe` and `shared` won't ever be solid. Auto-decoding won't go away. Unsound conversions (e.g., `char` -> `dchar`) won't go away. Regressions aren't being fixed (78 open). Wrong-code bugs don't get fixed (170!). Etc., and so on. So why even bother?

Currently, I'm toying with ideas for a hobby language of my own. I know that it most likely won't go anywhere, but wasting time on that starts to feel more rewarding than wasting it on D.
Aug 23 2018
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 24/08/2018 4:39 AM, ag0aep6g wrote:
 Currently, I'm toying with ideas for a hobby language of my own. I know 
 that it most likely won't go anywhere, but wasting time on that starts 
 to feel more rewarding than wasting it on D.
I've been doing that for a few years now. My parser generator is slowly coming together. I'll probably start by trying to spec out D in it.
Aug 23 2018
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Aug 23, 2018 at 06:39:32PM +0200, ag0aep6g via Digitalmars-d wrote:
 On 08/23/2018 05:17 PM, H. S. Teoh wrote:
 And the nice thing about D being open source is that should the
 situation escalate to the point where the community simply cannot
 get along with W&A anymore, forking is always an option.
But forking only happens when the dissenters have enough motivation to do it. If they don't, they might just fade away one by one, looking for greener pastures elsewhere. That would probably be the worst outcome for D: the community shrinks until it's just Walter and Andrei wondering where everybody did go.
If that's what it takes, then maybe that will be the point where they start addressing issues that drove people away? :-P
 Now, I don't know if the amount of (quality) contributors is actually
 growing, shrinking, or stagnating. But I do know that I feel the pull
 (push?) away from D myself.
 
 It feels like `@safe` and `shared` won't ever be solid.
To be honest, I've been skeptical of `@safe` since the first time I read about it in TDPL. It feels like something tacked on, rather than integrated into the language design, and it's exactly the kind of feature that needs to be designed into the language from day one in order to have a chance of actually being successful. Over the years, the holes in the implementation haven't made my confidence much stronger, and as a result, in my own code I rarely, if ever, use `@safe`. Maybe in self-contained modules I'd do it, and then mostly only via automatic attribute inference and annotated unittests to prevent regressions.

`shared`... TBH, I haven't really needed to use it, and from the little I know of it, it seems so cumbersome to use and bug-ridden in the current implementation that I don't think I'll ever use it. Sometimes it's just not worth the trouble to fight with a language feature that TBH isn't designed thoroughly enough to fit in with other language features seamlessly. I'd rather find a different way of approaching my programming problem that wouldn't require `shared`.
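For what it's worth, the inference-plus-unittest pattern I mean looks roughly like this (a minimal sketch; `twice` is a made-up example function):

```d
// Templates get their attributes inferred, so a template function
// can be @safe without being annotated. An explicitly @safe
// unittest then fails to compile if the function ever loses its
// inferred @safe-ty -- catching the regression at test time.
T twice(T)(T x)
{
    return x + x; // inferred @safe, pure, nothrow, @nogc
}

@safe unittest
{
    assert(twice(21) == 42);
}
```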
 Auto-decoding won't go away.
Yeah, this is a big one for me. Ironically, Walter actually agrees that autodecoding was a bad idea. But byCodePoint / byCodeUnit have so far been a usable (if not as pleasant) workaround for the most part.
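For the record, the workaround looks roughly like this (a minimal sketch; the string contents are just an example):

```d
import std.algorithm.searching : count;
import std.utf : byCodeUnit, byDchar;

void main()
{
    string s = "héllo"; // 'é' takes 2 UTF-8 code units

    // Range algorithms over a bare string autodecode to dchar,
    // so this counts code points:
    assert(s.count == 5);

    // byCodeUnit iterates the raw char code units, no decoding:
    assert(s.byCodeUnit.count == 6);

    // byDchar decodes explicitly, only where it's actually wanted:
    assert(s.byDchar.count == 5);
}
```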
 Unsound conversions (e.g., `char` -> `dchar`) won't go away.
Yeah, another annoying holdover from C/C++ that doesn't make any sense. This is one issue where I'd be happy for breaking changes that will clean up the core language.
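A quick sketch of what the unsound conversion does in practice:

```d
void main()
{
    string s = "héllo";

    // s[1] is a single UTF-8 code unit (char), not a character:
    // it's the first byte of the two-byte encoding of 'é'.
    char c = s[1]; // 0xC3

    // char implicitly converts to dchar, silently reinterpreting
    // a code unit as a full code point -- a garbage "character":
    dchar d = c;
    assert(d == '\u00C3'); // 'Ã', not 'é'
}
```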
 Regressions aren't being fixed (78 open). Wrong-code bugs don't get fixed (170!). Etc., and so on. So why even bother?
I think this is unfair. Regressions *are* being fixed... but they're getting reported at a faster rate than they're being fixed. Wrong-code bugs also *are* getting fixed... just not as fast as we'd like, but I think it's unfair to say they aren't getting fixed. Overall, I think with the number of regressions being introduced every release, we really got into language-freeze mode way too early. The fear of breaking changes has gotten to the level of paranoia, yet at the same time regressions keep happening. IMNSHO, if there are going to be so many regressions anyway, why not take the chance to pull off breaking changes like killing autodecoding, killing unsound conversions, and a number of other small but cumulatively annoying rough edges that have been thorns in our sides for years. Unfortunately, I don't see this happening in the near future.
 Currently, I'm toying with ideas for a hobby language of my own. I
 know that it most likely won't go anywhere, but wasting time on that
 starts to feel more rewarding than wasting it on D.
I've felt the temptation myself, actually. The thing that's stopping me is that I simply don't have the time it takes to actually pull it off.

To put things in perspective, though: despite all of D's rough edges and certain dark corners where you run into implementation bugs, features that clash with each other, and other such unpleasant things, I still find D much more pleasant to use than C/C++ or Java. Go isn't even on my radar because it lacks generics. And Rust... honestly, I really cannot see myself going back to the dark world of worrying about memory management in every piece of code I write. That's simply the wrong level of abstraction to be working at, IMO, when your programming problem isn't to implement a memory manager. And I'm not interested in non-compiled languages, so Python isn't really under consideration either. D is far from perfect, but I haven't yet found something closer to my personal ideal of what a programming language should be like. So for the time being, given the choice, I'd still choose D over any other language. T -- Do not reason with the unreasonable; you lose by definition.
Aug 23 2018
parent ag0aep6g <anonymous example.com> writes:
On 08/23/2018 07:28 PM, H. S. Teoh wrote:
 On Thu, Aug 23, 2018 at 06:39:32PM +0200, ag0aep6g via Digitalmars-d wrote:
[...]
 Regressions aren't being fixed (78 open). Wrong-code bugs don't get fixed (170!). Etc., and so on. So why even bother?
I think this is unfair. Regressions *are* being fixed... but they're getting reported at a faster rate than they're being fixed. Wrong-code bugs also *are* getting fixed... just not as fast as we'd like, but I think it's unfair to say they aren't getting fixed.
Yeah, it was hyperbole for sure. Of course bugs are being fixed. But it feels like it takes longer and longer. Elsewhere in this thread, Kenji Hara was mentioned. I had a look at my bug report history, and before Kenji left, he had managed to fix a third of the DMD bugs I had reported by that time. After Kenji's leaving, the fix rate dropped significantly and Walter had to step in more often. So it's very possible that I'm just missing Kenji.
Aug 23 2018
prev sibling parent reply Abdulhaq <alynch4047 gmail.com> writes:
On Thursday, 23 August 2018 at 09:51:43 UTC, rikki cattermole 
wrote:
 Good luck getting W&A to agree to it, especially when there is 
 yet another "critical D opportunity" on the table ;)
No. They have power for as long as we the community say that they do. We are at the point where they need a check and balance to keep everybody going smoothly. And I do hope that they listen to us before somebody decides its forkin' time.
No fork of D can be successful; it won't have the manpower, skills, or willpower to draw on. Even with W and A, manpower is already short. 'Threatening' W and A with a fork is an empty threat that just p***es them off. Bad move on your part.
Aug 23 2018
next sibling parent Joakim <dlang joakim.fea.st> writes:
On Thursday, 23 August 2018 at 18:27:27 UTC, Abdulhaq wrote:
 On Thursday, 23 August 2018 at 09:51:43 UTC, rikki cattermole 
 wrote:
 Good luck getting W&A to agree to it, especially when there 
 is yet another "critical D opportunity" on the table ;)
No. They have power for as long as we the community say that they do. We are at the point where they need a check and balance to keep everybody going smoothly. And I do hope that they listen to us before somebody decides its forkin' time.
No fork of D can be successful, it won't have the manpower, skills or willpower to draw on. Even with W and A it's already short. 'Threatening' W and A with a fork is an empty threat that just p***es them off. Bad move on your part.
Agreed. W&A's "power" is that they have done, or are doing, much of the work, and that enough people trust their way of working over that of whoever would fork. That would be a very high bar for any forking team to surpass.
Aug 23 2018
prev sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 24/08/2018 6:27 AM, Abdulhaq wrote:
 On Thursday, 23 August 2018 at 09:51:43 UTC, rikki cattermole wrote:
 Good luck getting W&A to agree to it, especially when there is yet 
 another "critical D opportunity" on the table ;)
No. They have power for as long as we the community say that they do. We are at the point where they need a check and balance to keep everybody going smoothly. And I do hope that they listen to us before somebody decides its forkin' time.
No fork of D can be successful, it won't have the manpower, skills or willpower to draw on. Even with W and A it's already short. 'Threatening' W and A with a fork is an empty threat that just p***es them off. Bad move on your part.
Oh relax. It isn't anywhere close to a threat, and they themselves have said in the past what I did: that their power only exists as long as the community says so.
Aug 23 2018
prev sibling next sibling parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Wednesday, 22 August 2018 at 11:59:37 UTC, Paolo Invernizzi 
wrote:
 Just found by chance, if someone is interested [1] [2].

 /Paolo
After having seen all the discussions around Mihails' post these days, I'm puzzled by one fact: there was no discussion around one paragraph:

"You can't assume there is any control over how declared vision documents get executed in practice. You can't trust any promises from language authors because they don't keep any track of those."

I think that this is one of the central points of the post, so why?

/Paolo
Aug 24 2018
parent Eugene Wissner <belka caraus.de> writes:
On Friday, 24 August 2018 at 12:25:58 UTC, Paolo Invernizzi wrote:
 On Wednesday, 22 August 2018 at 11:59:37 UTC, Paolo Invernizzi 
 wrote:
 Just found by chance, if someone is interested [1] [2].

 /Paolo
After having seen all the discussions around Mihails post in these days, I'm puzzled by one fact. There was no discussions around one paragraph: "You can't assume there is any control over how declared vision documents get executed in practice. You can't trust any promises from language authors because they don't keep any track of those." I think that this is one of the central points of the post, so why? /Paolo
I think I touched on it indirectly. Every time there is "the feature" that will make D the most popular language in the world, be it safety (which will kill C) or betterC (which will kill C?), RC instead of GC, or whatever. A lot of work is done, but after some time everyone loses interest and the written code becomes a mess. Look at how Phobos containers are implemented: there is no consistent memory model.
Aug 24 2018
prev sibling next sibling parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 22 August 2018 at 11:59:37 UTC, Paolo Invernizzi 
wrote:
 Just found by chance, if someone is interested [1] [2].

 /Paolo

 [1] 
 https://gitlab.com/mihails.strasuns/blog/blob/master/articles/on_leaving_d.md
 [2] 
 https://blog.mist.global/articles/My_concerns_about_D_programming_language.html
Two things:

1. from the blog: "You can't assume that next compiler upgrade won't suddenly break your project or any of its transitive dependencies."
2. it took till 2018 to fix this: https://issues.dlang.org/show_bug.cgi?id=16739

As to 1: this is my biggest fear and chagrin. With every new version my code might break, and new versions come quite frequently. Having to spend time fixing what wasn't broken a week ago is a nightmare.

As to 2: it just keeps you from writing code.

For about a year I've had the feeling that D is moving too fast and going nowhere at the same time. D has to slow down and get stable. D is past the experimental stage. Too many people use it for real-world programming, and programmers value and _need_ both stability and consistency.

I've been working with Java recently, and although it is not an exciting language, it does the job and it does it well. You can rely on it to get the job done, and get it done fast. And you know that your code will still work next week, next month, or in 5 years. In everyday programming life you don't care about the latest fancy features. Imo, D should slow down, take inventory, do some spring cleaning, and work on useful libraries and a sound ecosystem. I don't care what color the bike shed is as long as there are bikes in there that actually work.

Atm, I'm not considering D for any important and/or big projects.
Aug 24 2018
next sibling parent reply Dejan Lekic <dejan.lekic gmail.com> writes:
On Friday, 24 August 2018 at 13:04:28 UTC, Chris wrote:
 I've been working with Java recently and although it is not an 
 exciting language, it does the job and it does it well. You can 
 rely on it to get the job done - and get it done fast. And you 
 know that your code will still work next week, month or in 5 
 years. In everyday programming life you don't care about the 
 latest fancy features. Imo, D should slow down, take inventory, 
 do some spring cleaning and work on useful libraries and a 
 sound ecosystem. I don't care what color the bike shed is as 
 long as there are bikes in there that actually work.

 Atm, I'm not considering D for any important and or big 
 projects.
That is exactly where I am: I am using Java (and more recently Python) for serious stuff. I am, however, in favour of D moving fast (that is why many Java programmers moved to Kotlin/Scala!). The only problem with D is that there should be a stable release of D2 (two times a year, like Fedora, for example), and this stable release should get only security updates and bug fixes! I know this would require someone to maintain all this (it is a full-time job!)...
Aug 24 2018
parent Chris <wendlec tcd.ie> writes:
On Friday, 24 August 2018 at 13:17:11 UTC, Dejan Lekic wrote:
 On Friday, 24 August 2018 at 13:04:28 UTC, Chris wrote:

 There is exactly where I am - I am using Java (and more 
 recently Python) for serious stuff.
So I'm not alone.
 I am however in favour of D moving fast (that is why many Java 
 programmers moved to Kotlin/Scala!).
[snip] Yes, but D is already there in many ways. Other languages (like Java) are trying to catch up with D. And like in Java, new features should be a bonus, something you may use later; they should not break existing code.
Aug 24 2018
prev sibling next sibling parent reply bachmeier <no spam.net> writes:
On Friday, 24 August 2018 at 13:04:28 UTC, Chris wrote:

 For about a year I've had the feeling that D is moving too fast 
 and going nowhere at the same time. D has to slow down and get 
 stable. D is past the experimental stage. Too many people use 
 it for real world programming and programmers value and _need_ 
 both stability and consistency.
I've started moving some things to other languages myself. The problem is that D, in its current form, has a process that is specially optimized to make it as unusable as possible. 1. There will be no D version 3. 2. There will be no major breaking changes like autodecoding unless we think they're important (and there are no guidelines on what's important, just whatever comes to someone's mind on a particular day). 3. There are many trivial breaking changes made, and they can come in any release. 4. The more releases the better. You simply can't share a D program with anyone else. It's an endless cycle of compiler upgrades and figuring out how to fix code that stops compiling. It doesn't work for those of us that are busy. Why there is not a stable branch with releases once a year is quite puzzling. (And no, "just use the old compiler" is not an answer.)
Aug 24 2018
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 25/08/2018 4:00 AM, bachmeier wrote:
 On Friday, 24 August 2018 at 13:04:28 UTC, Chris wrote:
 
 For about a year I've had the feeling that D is moving too fast and 
 going nowhere at the same time. D has to slow down and get stable. D 
 is past the experimental stage. Too many people use it for real world 
 programming and programmers value and _need_ both stability and 
 consistency.
I've started moving some things to other languages myself. The problem is that D, in its current form, has a process that is specially optimized to make it as unusable as possible. 1. There will be no D version 3. 2. There will be no major breaking changes like autodecoding unless we think they're important (and there are no guidelines on what's important, just whatever comes to someone's mind on a particular day). 3. There are many trivial breaking changes made, and they can come in any release. 4. The more releases the better. You simply can't share a D program with anyone else. It's an endless cycle of compiler upgrades and figuring out how to fix code that stops compiling. It doesn't work for those of us that are busy. Why there is not a stable branch with releases once a year is quite puzzling. (And no, "just use the old compiler" is not an answer.)
Hmm, would an LTS every 2 years be reasonable? We're currently doing about one major release every 2 months. This could be used for bootstrapping purposes too, while keeping the number of compilers required minimal.
Aug 24 2018
prev sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Friday, 24 August 2018 at 16:00:10 UTC, bachmeier wrote:
 

 You simply can't share a D program with anyone else. It's an 
 endless cycle of compiler upgrades and figuring out how to fix 
 code that stops compiling. It doesn't work for those of us that 
 are busy. Why there is not a stable branch with releases once a 
 year is quite puzzling. (And no, "just use the old compiler" is 
 not an answer.)
...hmm... I can't recall anyone ever suggesting a stable branch. It's a good idea. That being said, I see forward progress on reducing breakage. The CI infrastructure has improved a lot, and there are a number of dub projects that also get checked.
Aug 24 2018
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Aug 24, 2018 at 04:58:12PM +0000, jmh530 via Digitalmars-d wrote:
 On Friday, 24 August 2018 at 16:00:10 UTC, bachmeier wrote:
 
 You simply can't share a D program with anyone else. It's an endless
 cycle of compiler upgrades and figuring out how to fix code that
 stops compiling.
I got bitten by this just yesterday. Update dmd git master, update vibe.d git master, and now my vibe.d project doesn't compile anymore due to some silly string.d error somewhere in one of vibe.d's dependencies. :-/ In the past this appeared to be a problem with out-of-date build targets, which was fixed by rebuilding everything from scratch. But this time it appears to be something else...
 It doesn't work for those of us that are busy. Why there is not a
 stable branch with releases once a year is quite puzzling. (And no,
 "just use the old compiler" is not an answer.)
...hmm...I can't recall anyone ever suggesting to have a stable branch. It's a good idea. That being said, I see forward progress on reducing breakage. The CI infrastructure has improved a lot and there are a number of dub projects that also get checked.
This is probably completely unrealistic, but I've been thinking about the possibility of adding *all* D codebases to the CI infrastructure, including personal projects and what-not. Set it up such that any breakages send a notification to the author(s) in advance of a PR being checked in, so that they have time to respond. I'm not sure how this would work in practice, since you have to deal with dead / unmaintained projects and/or slow/unresponsive authors, and some PRs you might want to push through regardless of breakage. But it would be nice to know exactly how much code we're breaking out there.

Part of this is my suspicion that certain big, scary breaking changes may not actually have as big an impact as we imagine, and could be worthwhile in the long run to get rid of wrinkles in the language and end up with a better, cleaner design that will last longer into the future.

But I'm probably just dreaming. T -- VI = Visual Irritation
Aug 24 2018
next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Friday, 24 August 2018 at 17:12:53 UTC, H. S. Teoh wrote:
 [snip]

 This is probably completely unrealistic, but I've been thinking 
 about the possibility of adding *all* D codebases to the CI 
 infrastructure, including personal projects and what-not.  Set 
 it up such that any breakages send a notification to the 
 author(s) in advance of a PR being checked in, so that they 
 have time to respond.  I'm not sure how this would work in 
 practice since you have to deal with dead / unmaintained 
 projects and/or slow/unresponsive authors, and some PRs you 
 might want to push through regardless of breakage.  But it 
 would be nice to know exactly how much code we're breaking out 
 there.
A worthy goal. If you could get some download statistics from dub (i.e. like total downloads past month), then you could probably create a few buckets and rules so you could make sure that there aren't breakages in the most downloaded projects while not worrying about dead projects that aren't being downloaded anyway.
Aug 24 2018
prev sibling next sibling parent reply Meta <jared771 gmail.com> writes:
On Friday, 24 August 2018 at 17:12:53 UTC, H. S. Teoh wrote:
 I got bitten by this just yesterday.  Update dmd git master, 
 update vibe.d git master, now my vibe.d project doesn't compile 
 anymore due to some silly string.d error somewhere in one of 
 vibe.d's dependencies. :-/
While we're airing grievances about code breakages, I hit this little gem the other day, and it annoyed me enough to complain about it: https://github.com/dlang/phobos/pull/5291#issuecomment-414196174

What really gets me is the actual removal of the symbol. If it had been left there with a deprecation message, I would've caught the problem immediately at the source and fixed it in a few minutes. Instead, I spent an hour or so tracing "execution" paths through a codebase that I'm unfamiliar with to figure out why a static if branch was no longer being taken.

On the topic of this thread... I was a bit confused by Dicebot's decision to leave at the time, because he seemed to like dip1000 but then later had a heel turn and left. Looking back through newsgroup threads, it seems like it was mostly that he disagreed with the project-management side of things (which he also brings up in his article): an incomplete version of the feature was merged, and code in the main branch had to be adjusted to support it. People have complained about this before, and it's distressingly common in D. Why it's not done in a feature branch and then merged in, I don't know, but I do agree with his objections.
Aug 24 2018
next sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 8/24/18 5:12 PM, Meta wrote:
 On Friday, 24 August 2018 at 17:12:53 UTC, H. S. Teoh wrote:
 I got bitten by this just yesterday.  Update dmd git master, update 
 vibe.d git master, now my vibe.d project doesn't compile anymore due 
 to some silly string.d error somewhere in one of vibe.d's 
 dependencies. :-/
While we're airing grievances about code breakages, I hit this little gem the other day, and it annoyed me enough to complain about it: https://github.com/dlang/phobos/pull/5291#issuecomment-414196174 What really gets me is the actual removal of the symbol. If it had been left there with a deprecation message, I would've caught the problem immediately at the source and fixed it in a few minutes. Instead, I spent an hour or so tracing "execution" paths through a codebase that I'm unfamiliar with to figure out why a static if branch is no longer being taken.
According to this comment: https://github.com/dlang/phobos/pull/5291#issuecomment-360929553 There was no way to get a deprecation to work. When we can't get a deprecation to work, we face a hard decision -- actually break code right away, print lots of crappy errors, or just leave the bug unfixed. -Steve
Aug 24 2018
parent reply Meta <jared771 gmail.com> writes:
On Friday, 24 August 2018 at 21:43:45 UTC, Steven Schveighoffer 
wrote:
 According to this comment: 
 https://github.com/dlang/phobos/pull/5291#issuecomment-360929553

 There was no way to get a deprecation to work.

 When we can't get a deprecation to work, we face a hard 
 decision -- actually break code right away, print lots of 
 crappy errors, or just leave the bug unfixed.

 -Steve
Ah, that's unfortunate. Damned if you do, damned if you don't. I still don't agree with making a breaking change to Phobos.
Aug 24 2018
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Aug 24, 2018 at 09:48:33PM +0000, Meta via Digitalmars-d wrote:
 On Friday, 24 August 2018 at 21:43:45 UTC, Steven Schveighoffer wrote:
 According to this comment:
 https://github.com/dlang/phobos/pull/5291#issuecomment-360929553
 
 There was no way to get a deprecation to work.
 
 When we can't get a deprecation to work, we face a hard decision --
 actually break code right away, print lots of crappy errors, or just
 leave the bug unfixed.
 
 -Steve
Ah, that's unfortunate. Damned if you do, damned if you don't. I still don't agree with making a breaking change to Phobos.
I'm kinda on the fence about this, actually. OT1H I actually *want* D to have breaking changes that will give longer-term benefits, like removing cruft that shouldn't have been there in the first place, or straightening out decisions that in retrospect were poorly made. But OTOH I have also experienced the annoyance of random code breaking upon compiler upgrades, especially if the breakage happens in old code that I don't quite remember the intricacies of, or worse, in 3rd party code whose upstream has become non-responsive or has abandoned the project, or it's just too urgent to wait for upstream to address the problem, which I then have to debug and fix myself. I don't know how to reconcile these two. Perhaps if we had the manpower, we could maintain older versions for long enough to allow users to gradually rewrite to work with newer compilers, while the development branch can be bolder in making breaking changes that ultimately will result in a better, cleaner language. But I doubt we have the kind of manpower it takes to maintain something like that. T -- People demand freedom of speech to make up for the freedom of thought which they avoid. -- Soren Aabye Kierkegaard (1813-1855)
Aug 24 2018
parent Dukc <ajieskola gmail.com> writes:
On Friday, 24 August 2018 at 22:04:49 UTC, H. S. Teoh wrote:
 I don't know how to reconcile these two.  Perhaps if we had the 
 manpower, we could maintain older versions for long enough to 
 allow users to gradually rewrite to work with newer compilers, 
 while the development branch can be bolder in making breaking 
 changes that ultimately will result in a better, cleaner 
 language.  But I doubt we have the kind of manpower it takes to 
 maintain something like that.
In theory, it should be done so that there would be longer-term unstable and stable major branches. The stable major branch would behave mainly like we do now: new features allowed and breaking changes also allowed, but only with proper deprecation processes.

In the unstable major branch, you would do breaking changes, like removing autodecoding and exception throwing in general-purpose Phobos functions. No additional features here unless they depend on the breakages, to ease transitioning between the two. They would be merged perhaps every ten versions.

I'm not saying this would necessarily work, but in theory it's the only way to get rid of historical baggage without becoming a moving target.
Aug 25 2018
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Aug 24, 2018 at 09:12:40PM +0000, Meta via Digitalmars-d wrote:
 On Friday, 24 August 2018 at 17:12:53 UTC, H. S. Teoh wrote:
 I got bitten by this just yesterday.  Update dmd git master, update
 vibe.d git master, now my vibe.d project doesn't compile anymore due
 to some silly string.d error somewhere in one of vibe.d's
 dependencies. :-/
While we're airing grievances about code breakages, I hit this little gem the other day, and it annoyed me enough to complain about it: https://github.com/dlang/phobos/pull/5291#issuecomment-414196174 What really gets me is the actual removal of the symbol. If it had been left there with a deprecation message, I would've caught the problem immediately at the source and fixed it in a few minutes. Instead, I spent an hour or so tracing "execution" paths through a codebase that I'm unfamiliar with to figure out why a static if branch is no longer being taken.
Ironically, I was the one who pushed the merge button on that PR. :-/ Mea culpa. But we had discussed this particular change at length before, and it was clear that there was no clean way to effect the change; every approach would lead to a mess. I forget the details now, but I think Jonathan's approach was the least of the evils among the options we had. Though you do have an extremely good point about deprecating it first, or somehow warning the user in some way, so that when things do break, the solution is clear and doesn't require hours of tracing through 3rd party code. I'm not sure if it was actually possible for deprecation to work on this particular change, but in any case, we should have tried harder to communicate the cause (and possible solution(s)) of the breakage to users.
 On the topic of this thread... I was a bit confused with Dicebot's
 decision to leave at the time, because he seemed to like dip1000 but
 then later had a heel turn and left. Looking back through newsgroup
 threads, it seems like it was mostly that he disagreed with the
 project management side of things (which he also brings up in his
 article); an incomplete version of the feature being merged with code
 in the main branch having to be adjusted to support it. People have
 complained about it before, and it's distressingly common in D. Why
 it's not done in a feature branch and then merged in, I don't know,
 but I do agree with his objections.
I think it's clear by now that most of D's woes are not really technical in nature, but managerial. I'm not sure how to improve this situation, since I'm no manager type either. It's a known problem among techies that we tend to see all problems as technical in nature, or solvable via technical solutions, when in reality what's actually needed is someone with real management skills. Hammer and nail, and all that, y'know. Unfortunately, we techies also tend to resist non-technical "interference", especially from non-techies (like manager types). I used to have that attitude too (and probably still do to some extent), and only with age did I begin realizing this about myself. It's not an easy problem to fix in practice, especially in a place like here, where we're driven primarily by the technical aspects of D, and for the most part outside of any financial or other motivations that might have served to moderate things a little. T -- Some days you win; most days you lose.
Aug 24 2018
parent reply Meta <jared771 gmail.com> writes:
On Friday, 24 August 2018 at 21:53:18 UTC, H. S. Teoh wrote:
 I think it's clear by now that most of D's woes are not really 
 technical in nature, but managerial.
Agreed.
 I'm not sure how to improve this situation, since I'm no 
 manager type either.
Money is the only feasible solution IMO. How many people posting on this forum would quit their job tomorrow and solely contribute to OSS and/or work on their own projects if money wasn't an issue? The majority of people don't like being told what to do, and only want to work on what they're interested in. The only way to get them to work on something they're not interested in is to pay them.
Aug 24 2018
parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Friday, 24 August 2018 at 21:57:55 UTC, Meta wrote:
 On Friday, 24 August 2018 at 21:53:18 UTC, H. S. Teoh wrote:
 I think it's clear by now that most of D's woes are not really 
 technical in nature, but managerial.
 Agreed.
 I'm not sure how to improve this situation, since I'm no 
 manager type either.
 Money is the only feasible solution IMO. How many people 
 posting on this forum would quit their job tomorrow and solely 
 contribute to OSS and/or work on their own projects if money 
 wasn't an issue? The majority of people don't like being told 
 what to do, and only want to work on what they're interested 
 in. The only way to get them to work on something they're not 
 interested in is to pay them.
As the discussion seed is a post from Mihails, I want to recall the author [1]: "Didn't intend to chime in, but no, that was not what I have meant at all. My stance is that as long as current leadership remains in charge and keep sames attitude, no amount of money or developer time will fix D. What is the point in hiring someone to manage things if Walter still can do stuff like -dip1000? For me moment of understanding this was exact point of no return." What are the opinions on that, specifically on the attitude? [1] https://forum.dlang.org/post/yadddavkoopieykhaczx forum.dlang.org /Paolo
Aug 25 2018
prev sibling next sibling parent RhyS <sale rhysoft.com> writes:
On Friday, 24 August 2018 at 17:12:53 UTC, H. S. Teoh wrote:
 I got bitten by this just yesterday.  Update dmd git master, 
 update vibe.d git master, now my vibe.d project doesn't compile 
 anymore due to some silly string.d error somewhere in one of 
 vibe.d's dependencies. :-/
Welcome to my life with D for the past 2 years. You can not rely on D, as new features break old ones or create regressions. You can also not rely on its packages, because new features or changes break packages, or packages that depend on each other break. In the end, the answer is simple: you can not rely on D, unless you want to stick with one compiler version and write every feature yourself. Other languages also suffer from issues like this, but they get fixed so fast that in general the impact is rarely noticed. With D you can be stuck waiting days or weeks(!), or spending hours fixing it yourself. Again and again... So your time doing actual work is absorbed by constantly fixing D issues. Some will say that contributing to an open source program is the cost to pay, but when you have the choice between well-established and stable languages and D... that cost very fast becomes: let's use C/C++/Rust/Go/... And it is saying a lot when young languages like Rust and Go gave me less trouble than D. D has potential, but this push for BetterC, better C++ integration, more DIPs down the pipeline... When is enough, enough! It feels like D is more some people's personal playground to push and try out new features than an actually well-supported and stable language. You can play around with D at home or for small projects, but for long-term projects, where you bank your company's future on D, you need to be crazy.
Aug 25 2018
prev sibling parent Vladimir Panteleev <thecybershadow.lists gmail.com> writes:
On Friday, 24 August 2018 at 17:12:53 UTC, H. S. Teoh wrote:
 This is probably completely unrealistic, but I've been thinking 
 about the possibility of adding *all* D codebases to the CI 
 infrastructure, including personal projects and what-not.
You mean more than what's already covered by the project tester? https://ci.dlang.io/blue/organizations/jenkins/dlang-org%2Fci/detail/master/159/pipeline/ Anyone can add their project: https://github.com/dlang/ci/blob/38f10275e56b046acad1b9a9b4ecc8bd771e096d/vars/runPipeline.groovy#L457 Some issues for why we can't add *all* D codebases: - We also care about generated code, not just whether it compiles; that means, running the project's tests. However, some tests are flaky (they access network resources or have race conditions). - When we want to deprecate language/library features, they need to be removed from tested code. That means that the project author/maintainer needs to be in the loop and update their code when we "break it on purpose". - Some code or tests are just outright broken, i.e. depending on undefined behavior, like order of iteration of associative arrays. (Guilty of that one!)
Aug 26 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/24/2018 6:04 AM, Chris wrote:
 For about a year I've had the feeling that D is moving too fast and going 
 nowhere at the same time. D has to slow down and get stable. D is past the 
 experimental stage. Too many people use it for real world programming and 
 programmers value and _need_ both stability and consistency.
Every programmer who says this also demands new (and breaking) features.
Aug 24 2018
next sibling parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:
 On 8/24/2018 6:04 AM, Chris wrote:
 For about a year I've had the feeling that D is moving too 
 fast and going nowhere at the same time. D has to slow down 
 and get stable. D is past the experimental stage. Too many 
 people use it for real world programming and programmers value 
 and _need_ both stability and consistency.
Every programmer who says this also demands new (and breaking) features.
There's also who demands less (and may be breaking) features.
Aug 24 2018
prev sibling next sibling parent reply tide <tide tide.tide> writes:
On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:
 On 8/24/2018 6:04 AM, Chris wrote:
 For about a year I've had the feeling that D is moving too 
 fast and going nowhere at the same time. D has to slow down 
 and get stable. D is past the experimental stage. Too many 
 people use it for real world programming and programmers value 
 and _need_ both stability and consistency.
Every programmer who says this also demands new (and breaking) features.
Some problems require new features. For example, taking the address of a member function without an object returns a function pointer, but calling it requires a delegate; where C++ has member function pointers, D just has broken, unusable code. There are also old features that were implemented poorly (C++ mangling, for example).
Aug 24 2018
next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 24.08.2018 21:42, tide wrote:
 
 Some problems require new features like how taking the address of a 
 member function without an object returns a function pointer, but 
 requires a delegate
That is indeed broken behavior (and I think there is a DIP to fix it), but member function pointers are not really a necessary language feature.
 where C++ has member function pointers, D just has 
 broken unusable code.
You can use a T function(ref S, Args) in place of a T (S::*)(Args) (for S a value type; otherwise you don't need the ref). Then, instead of

    auto mptr = &S::foo;

you write

    auto mptr = (ref S s, Args args) => s.foo(args);

(This is a bit more typing, but it can be automated, such that you only write getMPtr!(S.foo) or similar.) Instead of

    s->*mptr(args)

you write

    mptr(s, args)

The syntax is more obvious, it is more general, and at least some C++ compilers will do the same thing under the hood.
Aug 24 2018
parent Timon Gehr <timon.gehr gmx.ch> writes:
On 24.08.2018 22:46, Timon Gehr wrote:
 
 s->*mptr(args)
 
 you write
 
 mptr(s, args)
Oops. Either the first code sample should be s.*mptr(args) or the second code sample should be mptr(*s, args)
Aug 24 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/24/2018 12:42 PM, tide wrote:
 Some problems require new features like how taking the address of a member 
 function without an object returns a function pointer, but requires a delegate 
 where C++ has member function pointers, D just has broken unusable code. Or
old 
 features that were implemented poorly (C++ mangling for example).
How to do member function pointers in D: https://www.digitalmars.com/articles/b68.html
Aug 24 2018
parent reply tide <tide tide.tide> writes:
On Friday, 24 August 2018 at 22:42:19 UTC, Walter Bright wrote:
 On 8/24/2018 12:42 PM, tide wrote:
 Some problems require new features like how taking the address 
 of a member function without an object returns a function 
 pointer, but requires a delegate where C++ has member function 
 pointers, D just has broken unusable code. Or old features 
 that were implemented poorly (C++ mangling for example).
How to do member function pointers in D: https://www.digitalmars.com/articles/b68.html
struct SomeStruct
{
    void foo() {
        // use SomeStruct
    }
}

void broken()
{
    void function() foo = &SomeStruct.foo;
    foo(); // runtime error; isn't actually safe, uses wrong calling convention as well
}

Not really a lack of a feature so much as that there exists broken code. This has been valid code for god knows how long. At some point it was usable in @safe, but it looks like you can't take the address of a member function without "this" in @safe anymore.
Aug 24 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/24/2018 4:22 PM, tide wrote:
 struct SomeStruct
 {
      void foo() {
          // use SomeStruct
      }
 }
 
 
 void broken()
 {
      void function() foo = &SomeStruct.foo;
      foo(); // runtime error, isn't actually safe; uses wrong calling convention as well
 }
 
 Not really lack of feature so much as there exists broken code. This has been 
 valid code for god knows how long. At some point it was usable in @safe, but it 
 looks you can't take an address of a member function without "this" as well in 
 @safe anymore.
That's because it isn't safe. But being able to take the address is important for system work.
Aug 24 2018
next sibling parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Saturday, 25 August 2018 at 01:43:19 UTC, Walter Bright wrote:
 On 8/24/2018 4:22 PM, tide wrote:
 struct SomeStruct
 {
      void foo() {
          // use SomeStruct
      }
 }
 
 
 void broken()
 {
      void function() foo = &SomeStruct.foo;
      foo(); // runtime error, isn't actually safe uses wrong 
 calling convention as well
 }
 
 Not really lack of feature so much as there exists broken 
 code. This has been valid code for god knows how long. At some 
 point it was usable in @safe, but it looks you can't take an 
 address of a member function without "this" as well in @safe 
 anymore.
That's because it isn't safe. But being able to take the address is important for system work.
The stupid thing is you _have_ to cast (which is unsafe) the return type to be correct. This could be solvable with DIP1011 to make &SomeStruct.foo return `extern(delegate) void function(ref Foo)` although it makes no explicit mention other than "member functions be implicitly convertible to extern(delegate) functions".
Aug 25 2018
prev sibling next sibling parent tide <tide tide.tide> writes:
On Saturday, 25 August 2018 at 01:43:19 UTC, Walter Bright wrote:
 On 8/24/2018 4:22 PM, tide wrote:
 struct SomeStruct
 {
      void foo() {
          // use SomeStruct
      }
 }
 
 
 void broken()
 {
      void function() foo = &SomeStruct.foo;
      foo(); // runtime error, isn't actually safe uses wrong 
 calling convention as well
 }
 
 Not really lack of feature so much as there exists broken 
 code. This has been valid code for god knows how long. At some 
 point it was usable in  safe, but it looks you can't take an 
 address of a member function without "this" as well in safe 
 anymore.
That's because it isn't safe. But being able to take the address is important for system work.
Which is my point. Why did you link that article then? It's not safe due to the inherent flaw of D: it shouldn't return a function() type. This is just outright invalid code; the type system could easily be used to prevent this kind of mistake. But instead it relies on the user knowing about the bug in D. Hell, like someone else mentioned, if it returned a delegate that would make more sense. But it doesn't, for whatever reason. There's a lot of little things like this in D, and from your response you obviously don't give a flying shit about fixing it, as you don't even see it as a problem. Just disable it in @safe, and anyone that needs to write @system code will have to deal with insanity instead of having something reasonable.
Aug 25 2018
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 25.08.2018 03:43, Walter Bright wrote:
 On 8/24/2018 4:22 PM, tide wrote:
 struct SomeStruct
 {
      void foo() {
          // use SomeStruct
      }
 }


 void broken()
 {
      void function() foo = &SomeStruct.foo;
      foo(); // runtime error, isn't actually safe uses wrong calling 
 convention as well
 }

 Not really lack of feature so much as there exists broken code. This 
 has been valid code for god knows how long. At some point it was 
 usable in @safe, but it looks you can't take an address of a member 
 function without "this" as well in @safe anymore.
That's because it isn't safe. But being able to take the address is important for system work.
So is taking the address of an `int` variable. The analogous point is that the type of `&x` for `x` an `int` should be `int*` (and not e.g. `string*`). D gets this right, and it should be equally obvious that it should get it right for the member function pointer case. (Or at least, not wrong. Using e.g. `void*` instead of an incompatible type would already be an improvement.)
Aug 25 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/25/2018 6:32 AM, Timon Gehr wrote:
 (Or at least, not wrong. Using e.g. `void*` 
 instead of an incompatible type would already be an improvement.)
Making it void* is a reasonable idea.
Aug 25 2018
parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Saturday, 25 August 2018 at 20:19:44 UTC, Walter Bright wrote:
 On 8/25/2018 6:32 AM, Timon Gehr wrote:
 (Or at least, not wrong. Using e.g. `void*` instead of an 
 incompatible type would already be an improvement.)
Making it void* is a reasonable idea.
If/when (I really hope the latter) DIP 1011 gets in it should be extern(delegate) with `ref typeof(this)` as the first parameter.
Aug 25 2018
prev sibling next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:
 On 8/24/2018 6:04 AM, Chris wrote:
 For about a year I've had the feeling that D is moving too 
 fast and going nowhere at the same time. D has to slow down 
 and get stable. D is past the experimental stage. Too many 
 people use it for real world programming and programmers value 
 and _need_ both stability and consistency.
Every programmer who says this also demands new (and breaking) features.
Heh, thought this proggit comment thread was funny given this complaint, some C++ users feel it's moving too fast now: "In the last few years it has basically become a different language, the feature creep is insane. I stopped caring about new features since C++11, and progressively used the language less and less." Another user: "I remember being really excited about C++11 - and I think it really did add some much needed features. But it's been getting more and more out of hand since then..." https://www.reddit.com/r/programming/comments/99rnuq/comment/e4q8iqn
Aug 24 2018
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Friday, August 24, 2018 6:54:38 PM MDT Joakim via Digitalmars-d wrote:
 On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:
 On 8/24/2018 6:04 AM, Chris wrote:
 For about a year I've had the feeling that D is moving too
 fast and going nowhere at the same time. D has to slow down
 and get stable. D is past the experimental stage. Too many
 people use it for real world programming and programmers value
 and _need_ both stability and consistency.
Every programmer who says this also demands new (and breaking) features.
Heh, thought this proggit comment thread was funny given this complaint, some C++ users feel it's moving too fast now: "In the last few years it has basically become a different language, the feature creep is insane. I stopped caring about new features since C++11, and progressively used the language less and less." Another user: "I remember being really excited about C++11 - and I think it really did add some much needed features. But it's been getting more and more out of hand since then..." https://www.reddit.com/r/programming/comments/99rnuq/comment/e4q8iqn
LOL. Yeah. Basically, we all somehow want stuff to be completely stable and never break any of our code, but we also want new stuff that improves the language - which frequently requires breaking existing code (and even if it doesn't require breaking existing code, adding features always risks breaking existing ones - even simply fixing bugs risks introducing new ones). There are better and worse ways to handle change, but I think that we're pretty much all fundamentally conflicted in what we want. You simply can't have everything stay the same and have it change at the same time, and yet, that's basically what everyone wants. Pretty much the only way to get around that would be to have a perfect language with no bugs, and that's obviously not happening even if we could all agree on what the "perfect" language would entail (which we're clearly not going to do). - Jonathan M Davis
Aug 24 2018
prev sibling next sibling parent reply Chris <wendlec tcd.ie> writes:
On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:
 On 8/24/2018 6:04 AM, Chris wrote:
 For about a year I've had the feeling that D is moving too 
 fast and going nowhere at the same time. D has to slow down 
 and get stable. D is past the experimental stage. Too many 
 people use it for real world programming and programmers value 
 and _need_ both stability and consistency.
Every programmer who says this also demands new (and breaking) features.
"Every programmer who..." Really? Sorry, but this is not an answer. The fact remains that D is in danger of becoming unusable for real world programming. Earlier this year I had to "unearth" old Python code from 2009 (some parts of the code were even older). And you know what? It still worked! The same goes for Java code I wrote for Java 1.5. If you want to achieve something similar with D you have to write code that is basically C code, i.e. you shouldn't use any of the nicer or more advanced features, because they might break with the next dmd release - which kind of defeats the purpose. Also, a. adding new features doesn't necessarily mean that old code has to stop working and b. the last breaking change I would've supported was to get rid of autodecode, but that was never done and now it seems too late, yet it would have been a change of utmost importance because string handling is everywhere these days. But maybe it would have been too much tedious work and no real intellectual challenge, so why bother. Other languages do bother, however. You may brush our concerns aside with a throw away comment like the one above, but I'm not the only one who doesn't consider D for serious stuff anymore. As has been said before, none of the problems are unfixable - but if your answer is indicative of the D leadership's attitude towards concerned (longtime) users, then don't be surprised that we go back to Java and other languages that offer more stability. I still have maximum respect for everything you, Andrei and the community have achieved. But please don't throw it all away now.
Aug 25 2018
next sibling parent reply Laeeth Isharc <laeeth laeeth.com> writes:
On Saturday, 25 August 2018 at 10:52:04 UTC, Chris wrote:
 On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:
 On 8/24/2018 6:04 AM, Chris wrote:
 For about a year I've had the feeling that D is moving too 
 fast and going nowhere at the same time. D has to slow down 
 and get stable. D is past the experimental stage. Too many 
 people use it for real world programming and programmers 
 value and _need_ both stability and consistency.
Every programmer who says this also demands new (and breaking) features.
"Every programmer who..." Really? Sorry, but this is not an answer. The fact remains that D is in danger of becoming unusable for real world programming. Earlier this year I had to "unearth" old Python code from 2009 (some parts of the code were even older). And you know what? It still worked! The same goes for Java code I wrote for Java 1.5. If you want to achieve something similar with D you have to write code that is basically C code, i.e. you shouldn't use any of the nicer or more advanced features, because they might break with the next dmd release - which kind of defeats the purpose. Also, a. adding new features doesn't necessarily mean that old code has to stop working and b. the last breaking change I would've supported was to get rid of autodecode, but that was never done and now it seems too late, yet it would have been a change of utmost importance because string handling is everywhere these days. But maybe it would have been too much tedious work and no real intellectual challenge, so why bother. Other languages do bother, however. You may brush our concerns aside with a throw away comment like the one above, but I'm not the only one who doesn't consider D for serious stuff anymore. As has been said before, none of the problems are unfixable - but if your answer is indicative of the D leadership's attitude towards concerned (longtime) users, then don't be surprised that we go back to Java and other languages that offer more stability. I still have maximum respect for everything you, Andrei and the community have achieved. But please don't throw it all away now.
And yet some of the heaviest users of D have said in public "please break our code". I wonder why that could be. It's also not terribly surprising that D2 code from 2009 doesn't always compile when you consider the release date of the language. Do you think it's a bad thing that imports were fixed, for example? That broke a lot of old code.

 "If you want to achieve something similar with D you have to 
 write code that is basically C code, i.e. you shouldn't use any 
 of the nicer or more advanced features, because they might 
 break with the next dmd release - which kind of defeats the 
 purpose."

I don't think this is true. Have slices, arrays, associative arrays and so on broken? On the other hand, D written like C that didn't get the imports right would have broken when the module system was corrected. This is a good thing.
 "Every programmer who..." Really? Sorry, but this is not an 
 answer. The fact remains that D is in danger of becoming 
 unusable for real world programming."
I don't think this is true either. It doesn't fit with my own experience and it doesn't fit with the growing enterprise adoption. That may be your personal perspective, but it's really hard to put yourself in the shoes of somebody in a very different situation that you have never encountered. There's intrinsically a tradeoff between different kinds of problems. Nassim Taleb writes about hormesis. I'm not sure that breakage of a non-serious kind is necessarily terrible. It might be terrible for you personally - that's not for me to judge. But it has the effect of building capabilities that have value in other ways. There are quite a few different sorts of concerns raised on this thread, and they are linked by how people feel, not by logic. I have a lot of respect for Shachar technically, but I personally found the way he expressed his point of view a bit odd and unlikely to be effective in achieving whatever his goal was, also bearing in mind he doesn't speak for Weka. It might be helpful to go through the concerns and organise them based on logical ordering, because an outburst of emotion won't translate in itself into any kind of solution.
Aug 25 2018
next sibling parent Chris <wendlec tcd.ie> writes:
On Saturday, 25 August 2018 at 12:16:06 UTC, Laeeth Isharc wrote:

 Nassim Taleb writes about hormesis.  I'm not sure that breakage 
 of a non-serious kind is necessarily terrible.  It might be 
 terrible for you personally - that's not for me to judge.  But 
 it has the effect of building capabilities that have value in 
 other ways.
Unless you can afford to spend a lot of time on just fixing things, even small changes are annoying. I can understand that if a change is for the better in the long run it's worth breaking code, especially if you warn users so they know what's coming. But being "afraid" of every new version is not exactly a thing that encourages you to use D. Mind you, it's not just one program you have to maintain. Usually you have libraries and modules (and third party libs) that you haven't touched for a year or so and all of a sudden they don't compile anymore. You basically have to fix everything. And then there's vibe.d that has a hard time catching up with everything (Sönke is doing a great job, btw.) So you have a service based on vibe.d and somebody asks you to implement a trivial change (say sorting). You add it, compile it with the latest version of dmd and it gives you an error "XYZ is not supported when the moon is full". So you have to go back to a compiler version that works and you cannot benefit from e.g. the latest GC optimizations of D. You basically end up with a mess of different compilers for different code until you have fixed everything (which you don't have time for all the time). If your code is older than 2 or 3 versions of dmd, you are already in trouble and given the release frequency it happens quite fast. Imagine, to fix a trivial bug or implement a simple feature you may have to stick to an outdated version of dmd or you have a ratio of 10% time spent on your own code 90% time spent on fixing what the latest release broke. Not good. I am not being emotional, as you suggest. I'm talking about my practical experience and the challenges I face when I use D. And the prevalent attitude of "It's for your own good in the long run, you may not understand it now, but you will one day once you've followed the discussions on the forum and read the new specs!" is not helpful either when you need to get a job done. In the long run this might kill off D. 
And I'm someone who talks about it; imagine all the frustrated users who silently leave. I have never encountered any such problems with any other language so far. Something tells me that there's something wrong with how D is managed.
Aug 25 2018
prev sibling next sibling parent Dave Jones <dave jones.com> writes:
On Saturday, 25 August 2018 at 12:16:06 UTC, Laeeth Isharc wrote:
 On Saturday, 25 August 2018 at 10:52:04 UTC, Chris wrote:
 On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:
There are quite a few different sorts of concerns raised on this thread and they are linked by how people feel not by logic. I have a lot of respect for Shachar technically but I personally found the way he expressed his point of view a bit odd and unlikely to be effective in achieving whatever it is his goal was, also bearing in mind he doesn't speak for Weka. It might be helpful to go through the concerns and organise them based on logical ordering because an outburst of emotion won't translate in itself into any kind of solution.
You can approach things rationally, make a list of the issues, have a big discussion again, and nothing will change.
Aug 25 2018
prev sibling parent reply RhyS <sale rhysoft.com> writes:
On Saturday, 25 August 2018 at 12:16:06 UTC, Laeeth Isharc wrote:
 And yet some of the heaviest users of D have said in public 
 'please break our code".  I wonder why that could be.
My answer to that is simply: Break stuff so it becomes STABLE! Remove junk and clutter. Do NOT break stuff to add more unneeded features inside a rotten carcass. Be honest, how many people will use BetterC in production? Not as some homework or side projects with a few dozen lines. Very few, and that is the issue... That is development time that could have gone into documentation, bug fixes...
Aug 25 2018
next sibling parent bpr <brogoff gmail.com> writes:
On Saturday, 25 August 2018 at 22:55:05 UTC, RhyS wrote:
 Be honest, how many people will use BetterC in production!
Much, MUCH more likely than that I would ever use full D with GC in production. Really, if I want a language with a GC, D is not that good. Why wouldn't I use a JVM language (Java, Kotlin, Scala) or Go or something else? Notice how they all have precise GCs? Or maybe I'd use a functional language like OCaml or Haskell. Same deal, precise GC. The truth is that D is by design NOT a replacement for C++ when a low level systems programming language must be used. DasBetterC is close to what I'd like when I have to use C++, but not yet ideal. I'm starting to think that Rust (or, C++ 17 and beyond) will win this battle because every other language shows up with stuff I don't want.
Aug 25 2018
prev sibling parent reply Radu <void null.pt> writes:
On Saturday, 25 August 2018 at 22:55:05 UTC, RhyS wrote:
 On Saturday, 25 August 2018 at 12:16:06 UTC, Laeeth Isharc 
 wrote:
 And yet some of the heaviest users of D have said in public 
 'please break our code".  I wonder why that could be.
My answer to that is simply: Break stuff so it becomes STABLE! Remove junk and clutter. Do NOT break stuff to add more unneeded features inside a rotten carcass. Be honest, how many people will use BetterC in production! Not as some homework or sided projects with a few dozen lines. Very few and that is the issue ... That is development time that can have gone into documentation, bug fixes...
I think you need to look at Dlang as what it is - still WIP and mostly *community driven*. I got used to the occasional breakage or regression, and the best I can advise is to try to report or fix them if you can. There are still lots of things to be removed, added, or fixed in the language and the standard libraries - breakage will appear, and it looks like most users expect some kind of breakage. As for DasBetterC, you might underestimate the potential it has for migrating old C code fully or partially to D, or the nice thing that it enabled WebAssembly targeting. But for me it is important in the way that it acted as a catalyst for people to look at the issues Dlang and Druntime had, and made them better by making them more modular. This is a win for Dlang in the long run; maybe the betterC flag will be removed at some point because the compiler and runtime will be smart enough to enable pay-as-you-go intrinsically.
Aug 25 2018
parent Chris <wendlec tcd.ie> writes:
On Saturday, 25 August 2018 at 23:46:54 UTC, Radu wrote:

 I think you need to look at Dlang as what it is - still WIP and 
 mostly *community driven*.

 I got used to the occasional breaking or regression, and the 
 best I can advise is to try to report or fix them if you can. 
 There are still lots of things to be removed/added/or fixed in 
 the language and the standard libraries - breakage will appear, 
 and it looks like most users expect some kind of breakage.
This is not good enough. Yes, D users expect "some kind of breakage" - all the time, and this is exactly the problem. In D people have put up with breakages for too long believing things will improve in the long run, and because of this it is assumed that users will put up with it forever and ever and ever. Has it never occurred to anyone that many users have got sick and tired of this? I mean, there are other languages out there and you can get things done in no time and your program will still compile next week. I say it again, D has come a long way and things were looking good, but it's going nowhere. Look at this thread. Very intelligent people, top programmers, top engineers are arguing over details, while users (of D) who are programming for other users are trying to say "Stop!". There's a huge difference between developing D and developing _in_ D.
Aug 25 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/25/2018 3:52 AM, Chris wrote:
 On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:
 Every programmer who says this also demands new (and breaking) features.
"Every programmer who..." Really?
You want to remove autodecoding (so do I) and that will break just about every D program in existence. For everyone else, it's something else that's just as important to them. For example, Shachar wants partially constructed objects to be partially destructed, a quite reasonable request. Ok, but consider the breakage:

  struct S
  {
      ~this() {}
  }

  class C
  {
      S s;
      this() nothrow {}
  }

I.e. a nothrow constructor now must call a throwing destructor. This is not some made up example, it breaks existing code:

  https://github.com/dlang/dmd/pull/6816

If I fix the bug, I break existing code, and apparently a substantial amount of existing code. What's your advice on how to proceed with this?
Aug 25 2018
next sibling parent Boris-Barboris <ismailsiege gmail.com> writes:
On Saturday, 25 August 2018 at 20:52:06 UTC, Walter Bright wrote:
 I.e. a nothrow constructor now must call a throwing destructor. 
 This is not some made up example, it breaks existing code:

   https://github.com/dlang/dmd/pull/6816

 If I fix the bug, I break existing code, and apparently a 
 substantial amount of existing code. What's your advice on how 
 to proceed with this?
Deprecation message and allow it (nothrow and @safe/@system checks) for a year or two? Guaranteed destructor calls are important enough to actually bother with this IMO. Nothrow and other attribute deduction could also be a temporary option.
Aug 25 2018
prev sibling next sibling parent reply David Nadlinger <code klickverbot.at> writes:
On Saturday, 25 August 2018 at 20:52:06 UTC, Walter Bright wrote:
 If I fix the bug, I break existing code, and apparently a 
 substantial amount of existing code. What's your advice on how 
 to proceed with this?
At least for the transition period, I'd have attributes only apply to the user-specified code and infer them for the actual full constructor. (We can still print a deprecation warning if they don't match.) —David
Aug 25 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/25/2018 2:46 PM, David Nadlinger wrote:
 On Saturday, 25 August 2018 at 20:52:06 UTC, Walter Bright wrote:
 If I fix the bug, I break existing code, and apparently a substantial amount 
 of existing code. What's your advice on how to proceed with this?
At least for the transition period, I'd have attributes only apply to the user-specified code and infer them for the actual full constructor. (We can still print a deprecation warning if they don't match.) —David
Inferring is not good enough, for example:

  https://github.com/dlang/dmd/pull/6816#issuecomment-307972790

There the @safe constructor is calling a destructor that calls free(). It can't be inferred as @safe. Sure, we can do warnings and deprecations, but the user will sooner or later have to change his code, and Chris' code from 2009 is not going to compile without changes. --- A further issue with inferring for destructors is: why do it for destructors and not other member functions? It's a good question, without obvious answers. --- The larger issue here is there is no solution where someone's ox doesn't get gored. I have to make a judgement call on what is overall best for the future of D. And the ox inevitably gets gored. Even C, probably the slowest moving major language, has these issues.
Aug 25 2018
parent reply David Nadlinger <code klickverbot.at> writes:
On Saturday, 25 August 2018 at 22:53:44 UTC, Walter Bright wrote:
 On 8/25/2018 2:46 PM, David Nadlinger wrote:
 At least for the transition period, I'd have attributes only 
 apply to the user-specified code and infer them for the actual 
 full constructor. (We can still print a deprecation warning if 
 they don't match.) —David
Inferring is not good enough, for example: https://github.com/dlang/dmd/pull/6816#issuecomment-307972790 There the @safe constructor is calling a destructor that calls free(). It can't be inferred as @safe.
How did you interpret "only apply to the user-specified code"? In this example, the `@safe` in `this() @safe {}` would only apply to `{}`. This necessitates being careful with the error message in a case like this:

  struct UnsafeDtor
  {
      @system ~this() {}
  }

  struct Foo
  {
      UnsafeDtor ud;
      this(int a) @safe
      {
          if (!a) throw new Exception("boo");
      }
  }

  void bar() @safe
  {
      auto f = Foo(1);
  }

as just printing "cannot call @system constructor" would be a bit misleading. It should work without surprises otherwise, though, and can be supplemented with deprecation warnings if the specified "inner" and inferred "outer" attributes don't match. —David
Aug 25 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/25/2018 5:52 PM, David Nadlinger wrote:
 On Saturday, 25 August 2018 at 22:53:44 UTC, Walter Bright wrote:
 On 8/25/2018 2:46 PM, David Nadlinger wrote:
 At least for the transition period, I'd have attributes only apply to the 
 user-specified code and infer them for the actual full constructor. (We can 
 still print a deprecation warning if they don't match.) —David
Inferring is not good enough, for example: https://github.com/dlang/dmd/pull/6816#issuecomment-307972790 There the @safe constructor is calling a destructor that calls free(). It can't be inferred as @safe.
How did you interpret "only apply to the user-specified code"? In this example, the `@safe` in `this() @safe {}` would only apply to `{}`.
I'm not sure what you're referring to. I'm referring to the specified message, and the example:

  struct Array
  {
      int[] _payload;
      ~this() // (2)
      {
          import core.stdc.stdlib : free;
          free(_payload.ptr); // (3)
      }
  }

  class Scanner
  {
      Array arr;
      this() @safe {}  // (1)
  }

In order for (1) to be @safe, the destructor it calls for arr (2) must also be @safe. But the destructor calls free() (3), which is not @safe. Therefore, the compilation fails. Inference does not solve this problem, because (2) is inferred as @system.
 can be supplemented with 
 deprecation warnings if the specified "inner" and inferred "outer" attributes 
 don't match.
Yes, and that's probably the only viable way forward with this. But Chris' ox will get gored, as 2009 code will not compile anymore without modification after the deprecation period expires.
Aug 26 2018
parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Sunday, 26 August 2018 at 10:17:51 UTC, Walter Bright wrote:
 I'm not sure what you're referring to. I'm referring to the 
 specified message, and the example:

 struct Array
 {
     int[] _payload;
     ~this() // (2)
     {
         import core.stdc.stdlib : free;
         free(_payload.ptr); // (3)
     }
 }

 class Scanner
 {
     Array arr;
      this() @safe {}  // (1)
 }

  In order for (1) to be @safe, the destructor it calls for 
  arr (2) must also be @safe. But the destructor calls free() 
  (3), which is not @safe. Therefore, the compilation fails. 
  Inference does not solve this problem, because (2) is inferred 
  as @system.
Yes, but if Scanner's constructor is nothrow then all is fine, since it won't unwind unless an Error is thrown, in which case it's game over anyway.
Aug 26 2018
prev sibling next sibling parent reply Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Saturday, 25 August 2018 at 20:52:06 UTC, Walter Bright wrote:
 On 8/25/2018 3:52 AM, Chris wrote:
 On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:
 Every programmer who says this also demands new (and 
 breaking) features.
"Every programmer who..." Really?
You want to remove autodecoding (so do I) and that will break just about every D program in existence. For everyone else, it's something else that's just as important to them. For example, Shachar wants partially constructed objects to be partially destructed, a quite reasonable request. Ok, but consider the breakage:

  struct S
  {
      ~this() {}
  }

  class C
  {
      S s;
      this() nothrow {}
  }

I.e. a nothrow constructor now must call a throwing destructor. This is not some made up example, it breaks existing code:

  https://github.com/dlang/dmd/pull/6816

If I fix the bug, I break existing code, and apparently a substantial amount of existing code. What's your advice on how to proceed with this?
Run semantic3 on the constructor independent of the requirement to destruct already constructed objects. If the constructor is nothrow then there is no need to have the destructors run or the EH code at all, because no Exceptions can be thrown (an Error may be thrown, but that will kill the program). This is how I intend to fix it after I refactor semantic3.
Aug 25 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/25/2018 4:49 PM, Nicholas Wilson wrote:
 Run semantic3 on the constructor independent of the requirement to destruct 
 already constructed objects. If the constructor is nothrow then there is no 
 need to have the destructors run or the eh code at all, because no Exceptions 
 can be thrown (an Error may be thrown but that will kill the program). This is 
 how I intend to fix it after I refactor semantic3.
A function can be made nothrow by:

  try { ... }
  catch (Exception e) { ... handle it locally ... }

Also, your proposal is ignoring the destructors, which is literally what the compiler does now.
Aug 30 2018
parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Thursday, 30 August 2018 at 23:03:57 UTC, Walter Bright wrote:
 On 8/25/2018 4:49 PM, Nicholas Wilson wrote:
 Run semantic3 on the constructor independent of the 
 requirement to destruct already constructed objects. If the 
 constructors is nothrow then there is no need to have the 
 destructors run or the eh code at all, because no Exceptions 
 can be thrown (an Error may be thrown but that will kill the 
 program). This is how I intend to fix it after I refactor 
 semantic3.
A function can be made nothrow by:

  try { ... }
  catch (Exception e) { ... handle it locally ... }
Then I should have said: no exceptions can propagate, which is the real problem.
 Also, your proposal is ignoring the destructors, which is 
 literally what the compiler does now.
It was implicit in that the throwing case would call the destructors in the event of an exception (otherwise the bug ain't fixed). This formulation is to reduce the amount of breakage, which was the problem last time. Yes, this will break (as in code breakage) @safe ctors calling @system dtors, but such is life. The ctor probably shouldn't be throwing in the first place. I'll probably add -vthrowingctor and -vthrowingdtor as well, since this will be a perf hit in the case of a throwing ctor. Sorry for any confusion.
Aug 30 2018
prev sibling next sibling parent reply Andre Pany <andre s-e-a-p.de> writes:
On Saturday, 25 August 2018 at 20:52:06 UTC, Walter Bright wrote:
 On 8/25/2018 3:52 AM, Chris wrote:
 On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:
 Every programmer who says this also demands new (and 
 breaking) features.
"Every programmer who..." Really?
You want to remove autodecoding (so do I) and that will break just about every D program in existence. For everyone else, it's something else that's just as important to them. For example, Shachar wants partially constructed objects to be partially destructed, a quite reasonable request. Ok, but consider the breakage: struct S { ~this() {} } class C { S s; this() nothrow {} } I.e. a nothrow constructor now must call a throwing destructor. This is not some made up example, it breaks existing code: https://github.com/dlang/dmd/pull/6816 If I fix the bug, I break existing code, and apparently a substantial amount of existing code. What's your advice on how to proceed with this?
In the whole discussion I miss 2 really important things. If your product compiles fine with a dmd version, no one forces you to update to the next dmd version. In the company I work for, we set for each project the DMD version in the build settings. The speed of DMD releases or breaking changes doesn't affect us at all. Maybe I do not know a lot of open source products, but the amount of work which goes into code quality is extremely high for the compiler, runtime, phobos and related products. I love to see how much work is invested in unit tests and also code style. DMD (and LDC and GDC) has greatly improved in the last years in various aspects. But I also see that there is a lot of work to be done. There are definitely problems to be solved. It is sad that people like Dicebot are leaving the D community. Kind regards Andre
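The per-project pinning Andre describes can be sketched as a build step. This is a hypothetical CI fragment, assuming the official dlang.org install script; the version number is purely illustrative:

```shell
# Pin the compiler per project rather than tracking the latest release.
# (Hypothetical CI step; 2.081.2 is an illustrative version.)
curl -fsS https://dlang.org/install.sh | bash -s dmd-2.081.2

# Activate the pinned toolchain for this build only, then build.
source ~/dlang/dmd-2.081.2/activate
dub build
```

With the version fixed in the build configuration, new compiler releases cannot break the project until the team deliberately bumps the pin.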
Aug 26 2018
next sibling parent reply Peter Alexander <peter.alexander.au gmail.com> writes:
On Sunday, 26 August 2018 at 08:40:32 UTC, Andre Pany wrote:
 In the whole discussion I miss 2 really important things.

 If your product compiles fine with a dmd version, no one forces 
 you to update to the next dmd version. In the company I work 
 for, we set for each project the DMD version in the build 
 settings. The speed of DMD releases or breaking changes doesn't 
 affect us at all.
If your product is a library then your customers dictate which dmd version you build with.
Aug 26 2018
parent aliak <something something.com> writes:
On Sunday, 26 August 2018 at 09:59:37 UTC, Peter Alexander wrote:
 On Sunday, 26 August 2018 at 08:40:32 UTC, Andre Pany wrote:
 In the whole discussion I miss 2 really important things.

 If your product compiles fine with a dmd version, no one 
 forces you to update to the next dmd version. In the company I 
 work for, we set for each project the DMD version in the build 
 settings. The speed of DMD releases or breaking changes 
 doesn't affect us at all.
If your product is a library then your customers dictate which dmd version you build with.
Why is this a problem? I have the exact same thought. This is not an unsolvable problem. Package managers have solved this ages ago with a min-version flag. The compiler can do the same if D is against just embracing the package manager as the way to do things. If D has an LTS version and cutting edge then I don't see the problem:

a) You broke me lib! => Set a min-compilation version flag, or use LTS (you have both options)
   - Qt does this.
   - Node does this.
   - iOS Foundation even gets rid of crap, and their user base is HUGE.
   - Safari is completely revamping how cookies and storage APIs work. That's *universal*. Programmers are dealing with it. Yes, their user base is much bigger - so they can survive - is probably one subjective argument. But then if you have an LTS, what's the argument?

b) Why you no update D?! => use cutting edge.

The only problem I see is manpower.

Cheers,
- Ali
Aug 26 2018
prev sibling next sibling parent reply Chris <wendlec tcd.ie> writes:
On Sunday, 26 August 2018 at 08:40:32 UTC, Andre Pany wrote:
 On Saturday, 25 August 2018 at 20:52:06 UTC, Walter Bright 
 wrote:

 In the whole discussion I miss 2 really important things.

 If your product compiles fine with a dmd version, no one forces 
 you to update to the next dmd version. In the company I work 
 for, we set for each project the DMD version in the build 
 settings. The speed of DMD releases or breaking changes doesn't 
 affect us at all.
No. Nobody forces you to use the latest version that may have an improved GC, new library functions or bug fixes. In fact, why bother with improving the language at all? But how do you feel about code that you've been compiling with, say, dmd 2.071.2 for years now - including workarounds for compiler bugs? Doesn't the thought of having to upgrade it one day bother you at all? What if your customer said that 2.08++ had better features and asked you to use them? The burden of finding paths to handle deprecations etc. is on the user, not the language developers.

And this is where the psychological factor that Laeeth was talking about comes in. If you're constantly programming thinking "Whatever I write today might break tomorrow, uh, and what about the code I wrote in 2016? Well, I'll have to upgrade it one day, when I have time. I'll just keep on using an older version of dmd for now. Yeah, no, I cannot benefit from the latest improvements, but at least it compiles with dmd2.st0neage. But why worry, I'll just have to get used to the fact that I have different code for different versions, for now... and forever."

You can get used to anything until you find out that it doesn't need to be this way. You write unexciting Java code and hey, it works and it always will. It took me a while to understand why Java has been so successful, but now I know. It's not write once, run everywhere. It's write once, run forever. Stability, predictability. And maybe that's why Java, Go and once C++ prefer a slower pace.

I just don't understand why it is so hard to understand the points I and others have made. It's not rocket science, but maybe this is the problem, because I already see the point to take home is: there are no real problems, we are just imagining them. Real world experience doesn't count, because we just don't see the bigger picture, which is the eternal glory of academic discussions about half-baked features of an eternally unfinished language that keeps changing randomly. Not practical, but intellectually satisfying.
Aug 26 2018
next sibling parent reply nkm1 <t4nk074 openmailbox.org> writes:
On Sunday, 26 August 2018 at 13:40:17 UTC, Chris wrote:
 You can get used to anything until you find out that it doesn't 
 need to be this way. You write unexciting Java code and hey, it 
 works and it always will. It took me a while to understand why 
 Java has been so successful, but now I know.
A week ago or so I was considering a programming language for my new project. JVM (Kotlin) was one of the alternatives and scored high on my list. The (big) problem was that the JVM doesn't have structs. So I investigated (I don't actually know Java nor Kotlin). There was some library that apparently brings structs to Java, which seemed a bit dubious to me. Also, I found out there was an effort (at Oracle) to hack structs into Java. Interestingly, it seemed that 14 people worked full time on that. 14 people to make it possible to use structs. So, yeah. It's not difficult to understand why Java is more "industrial-strength" than D. I don't know what exactly you expected. I chose D in the end, as this project is something where it's reasonable to use D. And yes, I expect some problems that I just wouldn't have with Java/Kotlin or even with C++ (well, the latter has some pretty severe problems, IMO).
Aug 26 2018
parent reply Chris <wendlec tcd.ie> writes:
On Sunday, 26 August 2018 at 14:00:56 UTC, nkm1 wrote:
 [...]
What did I expect? Better: What do I expect now. I've been using D for years now. I think it's time for D to offer users the same stability as other languages do. Simple as.
Aug 26 2018
parent reply lurker <lurker aol.com> writes:
On Sunday, 26 August 2018 at 14:17:33 UTC, Chris wrote:
 On Sunday, 26 August 2018 at 14:00:56 UTC, nkm1 wrote:
 [...]
What did I expect? Better: What do I expect now. I've been using D for years now. I think it's time for D to offer users the same stability as other languages do. Simple as.
lurking around this board for a long time and gave up on d2 a long time ago. it is too scripty. i can not convince anybody at work to use it even for small things under windows. some tried it and they say it is too buggy, misses windows essentials and there seems to be no chance of betterment via management or the compiler enthusiasts that rather implement any fancy fart instead of getting the compiler stable, bug free and usable. it seems like this is a language experiment, unusable for serious development. i just downloaded current beta 2 and visual D. installs ok, no detection of visual studio or any of the associated paths. visual D installed ok, but a click on menu options killed visual studio. i uninstalled successfully - hallelujah. so i lurk around for another year and see if the D experiment is still around and/or usable.
Aug 26 2018
next sibling parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 27/08/2018 4:09 AM, lurker wrote:
 On Sunday, 26 August 2018 at 14:17:33 UTC, Chris wrote:
 On Sunday, 26 August 2018 at 14:00:56 UTC, nkm1 wrote:
 [...]
What did I expect? Better: What do I expect now. I've been using D for years now. I think it's time for D to offer users the same stability as other languages do. Simple as.
lurking around this board for a long time and gave up on d2 a long time ago. it is too scripty. i can not convince anybody at work to use it even for small things under windows. some tried it and they say it is too buggy, misses windows essentials and there seems to be no chance of betterment via management or the compiler enthusiasts that rather implement any fancy fart instead of getting the compiler stable, bug free and usable. it seems like this is a language experiment, unusable for serious development. i just downloaded current beta 2 and visual D. installs ok, no detection of visual studio or any of the associated paths. visual D installed ok, but a click on menu options killed visual studio. i uninstalled successfully - hallelujah. so i lurk around for another year and see if the D experiment is still around and/or usable.
Both VisualD and dmd should work out of the box with MSVC and Visual Studio. In the last year I have not seen any reports of either failing outright without something being wrong with the user's environment making it problematic. So please report any issues you're having, because they are not the regular user experience.
Aug 26 2018
next sibling parent lurker <lurker aol.com> writes:
On Sunday, 26 August 2018 at 16:25:31 UTC, rikki cattermole wrote:
 On 27/08/2018 4:09 AM, lurker wrote:
 On Sunday, 26 August 2018 at 14:17:33 UTC, Chris wrote:
 On Sunday, 26 August 2018 at 14:00:56 UTC, nkm1 wrote:
 So please report any issues you're having. Because they are not 
 regular user experience.
i did - and lost interest. right now i'd rather use a modern BASIC, because the install works and the compiler too.
Aug 26 2018
prev sibling parent reply Sjoerd Nijboer <dlang sjoerdnijboer.com> writes:
On Sunday, 26 August 2018 at 16:25:31 UTC, rikki cattermole wrote:
 On 27/08/2018 4:09 AM, lurker wrote:
 On Sunday, 26 August 2018 at 14:17:33 UTC, Chris wrote:
 lurking around this board for a long time and gave up on d2 
 along time ago. it is to scripty. i can not convince anybody 
 at work to use it even for small things under windows. some 
 tried it and they say it is to buggy, misses windows 
 essentials and there seems to be no chance of betterment via 
 management or the compiler enthusiasts that rather implement 
 any fancy fart instead of getting the compiler stable, bug 
 free and usable.
 it seems like this is a language experiment, unusable for 
 serious development.
 i just downloaded current beta 2 and visual D. installs ok, no 
 detection of visual studio or any of the associated paths. 
 visual D installed ok, but a click on menu options killed 
 visual studio.
 i uninstalled successfully - hallelujah.
 so i lurk around for an other year an see if D experiment is 
 still around and/or usable.
Both VisualD and dmd should work out of the box with MSVC and Visual Studio. In the last year I have not seen any reports of either failing outright without something being wrong with the user's environment making it problematic. So please report any issues you're having, because they are not the regular user experience.
The first five minutes of VisualD and DUB are rough! Repeatedly they put me off. I'm also lurking on the forums for all of D's promises, but it doesn't seem productively usable outside of isolated projects without tightly locked down dependencies and the ability to maintain your own compiler and libraries. It might be in practice, but it certainly doesn't look to be so. https://forum.dlang.org/post/ydggepqkufeqaauoicsz@forum.dlang.org
Aug 26 2018
parent reply drug <drug2004 bk.ru> writes:
On 26.08.2018 19:45, Sjoerd Nijboer wrote:
 
 The first five minutes of VisualD and DUB are rough!
 Consecutively they shun me away a lot of the time.
 
 I'm also lurking on the forums for all of D's promises, but it doesn't 
 seem productively usable outside of isolated projects without tightly 
 locked down dependencies and the ability to maintain your own compiler 
 and libraries.
 It might be in practice, but it certainly doesn't look to be so.
 
 https://forum.dlang.org/post/ydggepqkufeqaauoicsz@forum.dlang.org
I have used D heavily for 6+ years and, based on my experience, I can state it is a highly productive language compared to the C/C++ I work with too. In D I can do more than in C/C++. C is too low level and verbose; C++ lacks some metaprogramming features and is less consistent than D. It's rather funny to see how one man who is forced to program in a programming language he doesn't like can trigger comments from lurkers that they don't like D either. No offense. D is in great form and is getting better and better, and I'd like to ask the D community to continue their good work and make D great again.
Aug 26 2018
parent reply RhyS <sale rhysoft.com> writes:
On Sunday, 26 August 2018 at 18:18:04 UTC, drug wrote:
 It's rather funny to see how one man who is forced to program in 
 a programming language he doesn't like can trigger comments from 
 lurkers that they don't like D either. No offense.
 D is in great form and is getting much better and better and 
 I'd like to ask D community to continue their good work and 
 make D great again.
Most people lurking here are people that WANT to use D but are put off by the issues. D is not bad as a language, but it has issues. There are issues at every step in the D eco system, and each of those creates a barrier. It's those same issues that never seem to get solved and are secondary citizens compared to adding more "future" features or trying to one-up C++... It's not BetterC or static if or whatever new feature of the month that brings in new people. You can advertise D as much as you want, but when people download D and very few people stay, is that not a hint... Only recently did the D poll point out that most people are using VSC and not VS. I am like "what, you only figure that out now". Given the mass popularity of VSC... That alone tells you how much the mindset of D is stuck in a specific eco space.
Aug 26 2018
parent reply Manu <turkeyman gmail.com> writes:
On Sun, 26 Aug 2018 at 12:10, RhyS via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Sunday, 26 August 2018 at 18:18:04 UTC, drug wrote:
 It's rather funny to see how one man who forced to program in
 programming language he doesn't like can triggers comments from
 lurkers that they don't like D too. No offense.
 D is in great form and is getting much better and better and
 I'd like to ask D community to continue their good work and
 make D great again.
Most people lurking here are people that WANT to use D but are put off by the issues. D is not bad as a language, but it has issues. There are issues at every step in the D eco system, and each of those creates a barrier. It's those same issues that never seem to get solved and are secondary citizens compared to adding more "future" features or trying to one-up C++... It's not BetterC or static if or whatever new feature of the month that brings in new people. You can advertise D as much as you want, but when people download D and very few people stay, is that not a hint... Only recently did the D poll point out that most people are using VSC and not VS. I am like "what, you only figure that out now". Given the mass popularity of VSC... That alone tells you how much the mindset of D is stuck in a specific eco space.
Industry tends to use VS, because they fork out for the relatively expensive licenses. I work at a company with a thousand engineers, all VS users. D could find a home there if some rough edges were polished, but they *absolutely must be polished* before it would be taken seriously. It is consistently expressed that poor VS integration is an absolute non-starter. While a majority of people (hobbyists?) that take an online poll in an open-source community forum might be VSCode users, that doesn't mean VS is a poor priority target. Is D a hobby project, or an industry solution? I vote the latter. I don't GAF about people's hobbies, I just want to use D to _do my job_. Quality VS experience is critical to D's adoption in that sector. Those 1000 engineers aren't reflected in your poll... would you like them to be?
Aug 26 2018
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/26/2018 12:34 PM, Manu wrote:
 I work at a company with a thousand engineers, all VS users, D could
 find home there if some rough edges were polished, but they
 *absolutely must be polished* before it would be taken seriously.
 It is consistently expressed that poor VS integration is an absolute
 non-starter.
I will tiresomely ask again, do you have a list of each and every aspect of the poor integration?
Aug 26 2018
next sibling parent reply Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Sunday, 26 August 2018 at 20:55:04 UTC, Walter Bright wrote:
 On 8/26/2018 12:34 PM, Manu wrote:
 I work at a company with a thousand engineers, all VS users, D 
 could
 find home there if some rough edges were polished, but they
 *absolutely must be polished* before it would be taken 
 seriously.
 It is consistently expressed that poor VS integration is an 
 absolute
 non-starter.
I will tiresomely ask again, do you have a list of each and every aspect of the poor integration?
Not to put words in his mouth, but: * rvalue references: see recent DIP * https://github.com/ldc-developers/ldc/issues/2800 Really a DMD issue * https://issues.dlang.org/show_bug.cgi?id=19179 probably more.
Aug 26 2018
parent Manu <turkeyman gmail.com> writes:
On Sun, 26 Aug 2018 at 15:55, Nicholas Wilson via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Sunday, 26 August 2018 at 20:55:04 UTC, Walter Bright wrote:
 On 8/26/2018 12:34 PM, Manu wrote:
 I work at a company with a thousand engineers, all VS users, D
 could
 find home there if some rough edges were polished, but they
 *absolutely must be polished* before it would be taken
 seriously.
 It is consistently expressed that poor VS integration is an
 absolute
 non-starter.
I will tiresomely ask again, do you have a list of each and every aspect of the poor integration?
Not to put words in his mouth, but: * rvalue references: see recent DIP * https://github.com/ldc-developers/ldc/issues/2800 Really a DMD issue * https://issues.dlang.org/show_bug.cgi?id=19179 probably more.
I feel like the question was about tooling specifically... but yes, all of those! ;)
Aug 26 2018
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/26/2018 1:55 PM, Walter Bright wrote:
 I will tiresomely ask again, do you have a list of each and every aspect of
the 
 poor integration?
I know you don't like filing bug reports. I'll make it easy for you. Every time someone you work with says: "I can't use D because ..." "I'm abandoning D because ..." "D sux because ..." Just append a note of it to a text file. It doesn't matter what the reason is, any information is valuable, including: "... because Walter killed and ate my dog" "... because my apartment contract stipulates no pets, no smokers, no D programming" "... because I can't take D jokes anymore" "... because Walter won't stop droning on with his boring Boeing anecdotes" Now and then, just email me the file.
Aug 26 2018
next sibling parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Sunday, 26 August 2018 at 22:54:18 UTC, Walter Bright wrote:
 On 8/26/2018 1:55 PM, Walter Bright wrote:
 I will tiresomely ask again, do you have a list of each and 
 every aspect of the poor integration?
I know you don't like filing bug reports. I'll make it easy for you. Every time someone you work with says: "I can't use D because ..." "I'm abandoning D because ..." "D sux because ..." Just append a note of it to a text file. It doesn't matter what the reason is, any information is valuable, including: "... because Walter killed and ate my dog" "... because my apartment contract stipulates no pets, no smokers, no D programming" "... because I can't take D jokes anymore" "... because Walter won't stop droning on with his boring Boeing anecdotes" Now and then, just email me the file.
It's 2.30 AM here, and you've made me smile :-P /P
Aug 26 2018
prev sibling parent reply Manu <turkeyman gmail.com> writes:
On Sun, 26 Aug 2018 at 15:55, Walter Bright via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On 8/26/2018 1:55 PM, Walter Bright wrote:
 I will tiresomely ask again, do you have a list of each and every aspect of the
 poor integration?
I know you don't like filing bug reports. I'll make it easy for you.
I file shit-loads of bug reports! >_<
 Every time someone you work with says:

 "I can't use D because ..."
That's not what we're up against. We need to be focused on the question "It's okay to consider trying something other than C++ because [boxes that we care about are all ticked]". We're not determining why we can't use D, we need to establish why we can *consider trying* D.
 "I'm abandoning D because ..."
I've talked about this quite a bit already. One of my colleagues spent a few weeks trying it out in his home game engine; he abandoned it because the effort required to interact with C++ was much greater than he had imagined upon first inspection of the binding features available.
His key criticisms were:
 - He expected a tool to output compatible .h/.di files from the complementary file.
 - He complained he felt colour-blind in the editor.
 - Bugs (mostly with extern(C++)) reduced his confidence that it was production-ready.
(I attempted to address the list of issues he encountered a few weeks back, but some remain)
He likes the language, and would support it if he were satisfied the tooling was where it needs to be. Tooling maturity was his biggest concern.
By contrast, another colleague tried writing a small game in his own time. His feedback was that it felt 'fine', but he didn't encounter anything that made it "simpler than C++", and claimed readability improvements were tenuous.
He wouldn't show us his code. I'm sure he wrote basically what he would have written in C++, and that's not how to get advantages out of D... but his experience is still relevant. It demonstrates that C++ programmers won't be convinced without clear demonstration of superior expressive opportunity.
What I know is, it all starts with a direct comparison to C++, and THAT starts with extern(C++)... which still kinda sucks right now. I've been working on it.
Aug 26 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/26/2018 5:40 PM, Manu wrote:
 By contrast, another colleague tried writing a small game in his own
 time. His feedback was that it felt 'fine', but he didn't encounter
 anything that made it "simpler than C++", and claimed readability
 improvements were tenuous.
 He wouldn't show us his code. I'm sure he wrote basically what he
 would have written in C++, and that's not how to get advantages out of
 D... but his experience is still relevant. It demonstrates that C++
 programmers won't be convinced without clear demonstration of superior
 expressive opportunity.
Actually, I understand that one. If you look at my conversions of C++ code to D, like this one: https://github.com/dlang/dmd/commit/8322559195c28835d61c99877ea7c344cb0e1c91#diff-1be391ebabb9f6e11079e1ea4ef1158b The code looks the same, and in fact, is about 98% the same. I first learned programming in BASIC. Outgrew it, and switched to Fortran. Amusingly, my early Fortran code looked just like BASIC. My early C code looked like Fortran. My early C++ code looked like C. The productivity gains of D won't happen until one stops writing C++ code in it, and stops thinking in C++ terms. In fact, one gets irked because one is deep in the rut of how C++ does things, and it's annoying when one is forced to do things a different way in D. Like, why can't I have member function pointers, or friend declarations? Going the other way, though, is even worse - what do you mean, I have to write forward declarations? Why can't I pass a string literal to a template? No user defined attributes? Why doesn't CTFE work? Who can actually get work done in this language?
Aug 26 2018
next sibling parent reply Manu <turkeyman gmail.com> writes:
On Sun, 26 Aug 2018 at 21:50, Walter Bright via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On 8/26/2018 5:40 PM, Manu wrote:
 By contrast, another colleague tried writing a small game in his own
 time. His feedback was that it felt 'fine', but he didn't encounter
 anything that made it "simpler than C++", and claimed readability
 improvements were tenuous.
 He wouldn't show us his code. I'm sure he wrote basically what he
 would have written in C++, and that's not how to get advantages out of
 D... but his experience is still relevant. It demonstrates that C++
 programmers won't be convinced without clear demonstration of superior
 expressive opportunity.
Actually, I understand that one. If you look at my conversions of C++ code to D, like this one: https://github.com/dlang/dmd/commit/8322559195c28835d61c99877ea7c344cb0e1c91#diff-1be391ebabb9f6e11079e1ea4ef1158b The code looks the same, and in fact, is about 98% the same.
This code appears to be a mechanical translation. That's not what happened in this case; he wrote his game in D from scratch. It was just that he arrived at mostly the same place. He was googling for styling and sample material, but I suspect the problem was a lack of topical material demonstrating how he might write his D code differently. It's also the case that the significant difference between C++ and D (in my experience) mostly come down to: D has modules, tidier meta, UDA's, slices, and ranges/UFCS. In trade, D struggles with const, and ref is broken. If your code doesn't manifest some gravity towards one of those features, it will tend to be quite same-ey, and advantage may not be particularly apparent. In my current project, we stand to make substantial gains from D's meta and UDA's in particular. I think UFCS could make a big play too. Tidier lambda syntax will also be attractive; however, controlling closures with respect to @nogc appears to be a challenge which C++ doesn't suffer. It's all contingent on fighting through outstanding C++-related issues, and making the tooling as good as we can get it, though.
Aug 26 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/26/2018 11:16 PM, Manu wrote:
 The code looks the same, and in fact, is about 98% the same.
This code appears to be a mechanical translation.
It's not. It's by hand. But I had a specific goal of minimizing the diffs, so that if the translation didn't work, it reduced the number of places to look for the mistake. And in fact, this has saved me a LOT of grief :-)
 That's not what
 happened in this case; he wrote his game in D from scratch.
 It was just that he arrived at mostly the same place. He was googling
 for styling and sample material, but I suspect the problem was a lack
 of topical material demonstrating how he might write his D code
 differently.
It takes time to learn how to write idiomatic D effectively. I'm still learning how to do it right, too.
 It's also the case that the significant difference between C++ and D
 (in my experience) mostly come down to: D has modules, tidier meta,
 UDA's, slices, and ranges/UFCS. In trade, D struggles with const, and
 ref is broken.
 If your code doesn't manifest some gravity towards one of those
 features, it will tend to be quite same-ey, and advantage may not be
 particularly apparent.
I suspect that is still a bit stuck on looking at individual instruments and not seeing the orchestra. Let's take the much-maligned D const. It isn't C++ const (let's call that "head-const", because that's what it is). Head-const for a function parameter tells us very little about what may happen to it in the function. You can pass a head-const reference to a container, and have the function add/change/delete every element of that container, all without a peep from any C++ tool. Looking at the function signature, you've really got no clue whatsoever. The reason people have trouble with transitive-const is that they are still programming in C++, where they *do* add/change/delete every member of the "const" container. That includes me. I try to add transitive-const, and it won't compile, because I as well am used to replacing the engine and tail lights in my head-const car. In order to use transitive-const, it's forcing me to fundamentally re-think how I organize code into functions. For example, dmd is full of functions that combine data-gathering with taking-action. I've been reorganizing to separate data-gathering and taking-action into separate functions. The former can be transitive-const, maybe even pure. And I like the results, the code becomes much easier to understand.
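To make the contrast concrete, here is a minimal sketch (the Container type and sum function are hypothetical, not from the thread): in C++, a const reference whose member is a pointer still lets you write through that pointer; in D, const is transitive, so the commented-out mutation is rejected at compile time.

```d
struct Container
{
    int[] elems;
}

// Transitive const: through `c`, neither the struct itself nor anything
// reachable from it (the slice's elements) may be modified.
int sum(const ref Container c)
{
    int total = 0;
    foreach (e; c.elems)
        total += e;
    // c.elems[0] = 42;  // error: cannot modify const expression
    return total;
}

void main()
{
    auto c = Container([1, 2, 3]);
    assert(sum(c) == 6);
}
```

The function signature alone now tells a caller that the container's contents are untouched, which is exactly the guarantee C++'s head-const cannot give.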
Aug 28 2018
next sibling parent reply Eugene Wissner <belka caraus.de> writes:
On Tuesday, 28 August 2018 at 07:53:34 UTC, Walter Bright wrote:
 Let's take the much-maligned D const. It isn't C++ const (let's 
 call that "head-const", because that's what it is). Head-const 
 for a function parameter tells us very little about what may 
 happen to it in the function. You can pass a head-const 
 reference to a container, and have the function 
 add/change/delete every element of that container, all without 
 a peep from any C++ tool. Looking at the function signature, 
 you've really got no clue whatsoever.

 The reason people have trouble with transitive-const is that 
 they are still programming in C++, where they *do* 
 add/change/delete every member of the "const" container.

 That includes me. I try to add transitive-const, and it won't 
 compile, because I as well am used to replacing the engine and 
 tail lights in my head-const car. In order to use 
 transitive-const, it's forcing me to fundamentally re-think how 
 I organize code into functions.

 For example, dmd is full of functions that combine 
 data-gathering with taking-action. I've been reorganizing to 
 separate data-gathering and taking-action into separate 
 functions. The former can be transitive-const, maybe even pure. 
 And I like the results, the code becomes much easier to 
 understand.
There are still valid use cases where const should be "broken". One of them is a mutex (another one is caching). I have very little experience in multi-threaded programming, but what do you think about "mutable" members, despite the object being const?
Aug 28 2018
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 28, 2018 at 08:18:57AM +0000, Eugene Wissner via Digitalmars-d
wrote:
[...]
 There are still valid use cases where const should be "broken". One of
 them is mutex (another one caching). I have very little experiance in
 multi-threaded programming, but what do you think about "mutable"
 members, despite the object is const?
The problem with compromising const is that it would invalidate any guarantees const may have provided. Const in D is not the same as const in languages like C++; const in D means *physical* const, as in, the data might reside in ROM where it's physically impossible to modify. Allowing the user to bypass this means UB if the data exists in ROM. Plus, the whole point of const in D is that it is machine-verifiable, i.e., the compiler checks that the code does not break const in any way and therefore you are guaranteed (barring compiler bugs) that the data does not change. If const were not machine-verifiable, it would be nothing more than programming by convention, since it would guarantee nothing. Allowing const to be "broken" somewhere would mean it's no longer machine-verifiable (you need a human to verify whether the semantics are still correct). Many of D's const woes can actually be solved if we had a language-supported way of declaring the equivalence between const(U!T) and U!(const(T)), AKA head-mutable. The language already supports a (very) limited set of such conversions, e.g., const(T*) is assignable to const(T)*, because you're just making a copy of the pointer, but the target is still unchangeable. However, because there is no way to specify such a conversion in a user-defined type, that means things like RefCounted, or caches, or mutexes, cannot be made to work without either ugly workarounds or treading into UB territory by casting away const. But if there is a way for a user-defined template U to define a conversion from const(U!T) to U!(const(T)) (the conversion code, of course, would have to be const-correct and verifiable by the compiler), then we could make it so that U!(const(T)) contained a mutable portion (e.g., the refcount, mutex, cache, etc.) and an immutable portion (the reference to the const object). T -- In order to understand recursion you must first understand recursion.
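A small sketch of the gap described above: the built-in head-mutable conversion exists for raw pointers, but there is no way to declare the equivalent for a user-defined template (`Box` here is a hypothetical stand-in for RefCounted and friends).

```d
struct Box(T)
{
    T* payload;
}

void main()
{
    int x = 5;
    const(int*) cp = &x;

    // Built-in head-mutable conversion: copying the pointer yields a
    // mutable pointer to const data.
    const(int)* head = cp;    // OK: const(int*) -> const(int)*
    assert(*head == 5);

    const(Box!int) cb;
    // No user-defined equivalent exists; this line would not compile:
    // Box!(const(int)) hb = cb;  // error: cannot implicitly convert
}
```

A language-supported hook for that last conversion is what would let RefCounted-style types keep a mutable refcount alongside a const payload without casting.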
Aug 28 2018
next sibling parent reply tide <tide tide.tide> writes:
On Tuesday, 28 August 2018 at 17:02:46 UTC, H. S. Teoh wrote:
 On Tue, Aug 28, 2018 at 08:18:57AM +0000, Eugene Wissner via 
 Digitalmars-d wrote: [...]
 There are still valid use cases where const should be 
 "broken". One of them is mutex (another one caching). I have 
 very little experiance in multi-threaded programming, but what 
 do you think about "mutable" members, despite the object is 
 const?
The problem with compromising const is that it would invalidate any guarantees const may have provided. Const in D is not the same as const in languages like C++; const in D means *physical* const, as in, the data might reside in ROM where it's physically impossible to modify. Allowing the user to bypass this means UB if the data exists in ROM.
For such a narrow use case, wouldn't you just use something like immutable instead?
 Plus, the whole point of const in D is that it is 
 machine-verifiable, i.e., the compiler checks that the code 
 does not break const in any way and therefore you are 
 guaranteed (barring compiler bugs) that the data does not 
 change.  If const were not machine-verifiable, it would be 
 nothing more than programming by convention, since it would 
 guarantee nothing.  Allowing const to be "broken" somewhere 
 would mean it's no longer machine-verifiable (you need a human 
 to verify whether the semantics are still correct).
This is still not true; it is not machine-verifiable as it is. It can be bypassed quite easily, as a const reference can be made to a non-const object. There's no way to offer that guarantee.

import std.format : format;

struct Type
{
    int value;
}

void test(const ref Type type, int* ptr)
{
    int first = type.value;
    *ptr = first + 1;
    assert(type.value == first, format!"%d != %d"(type.value, first));
}

void main()
{
    Type type = Type(10);
    test(type, &type.value);
}
Aug 28 2018
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 28, 2018 at 07:39:20PM +0000, tide via Digitalmars-d wrote:
 On Tuesday, 28 August 2018 at 17:02:46 UTC, H. S. Teoh wrote:
 On Tue, Aug 28, 2018 at 08:18:57AM +0000, Eugene Wissner via
 Digitalmars-d wrote: [...]
 There are still valid use cases where const should be "broken".
 One of them is mutex (another one caching). I have very little
 experiance in multi-threaded programming, but what do you think
 about "mutable" members, despite the object is const?
The problem with compromising const is that it would invalidate any guarantees const may have provided. Const in D is not the same as const in languages like C++; const in D means *physical* const, as in, the data might reside in ROM where it's physically impossible to modify. Allowing the user to bypass this means UB if the data exists in ROM.
I feel that such a narrow use case, wouldn't you just use something like immutable instead.
The problem is that immutable implicitly converts to const. Basically, const means "I guarantee I will never modify this data (though someone else might)", and immutable means "nobody will ever modify this data". You cannot allow const to mutate without risking breakage with immutable. If the original data came from a mutable reference, you can probably get away with casting const away. But if it came from an immutable object, casting const away is UB. Allowing const to be "sometimes" modified is also UB.
 Plus, the whole point of const in D is that it is
 machine-verifiable, i.e., the compiler checks that the code does not
 break const in any way and therefore you are guaranteed (barring
 compiler bugs) that the data does not change.  If const were not
 machine-verifiable, it would be nothing more than programming by
 convention, since it would guarantee nothing.  Allowing const to be
 "broken" somewhere would mean it's no longer machine-verifiable (you
 need a human to verify whether the semantics are still correct).
This is still not true, it is not machine verifiable as it is. It can be bypassed quite easily, as a const object can be assigned from an non-const one. There's no way to offer that guarantee.
You misunderstand. Const means "this code cannot modify this object no matter what". It does not guarantee somebody else can't modify it (you want immutable for that). Both mutable and immutable implicitly convert to const, therefore it is imperative that code that handles const never modifies the data, because you don't know the provenance of the data: it could have come from an immutable object. Allowing const to "sometimes" modify stuff will violate immutable and cause UB. Whether a piece of code modifies the data is certainly machine-verifiable -- but only if there are no backdoors to const. If there are, then the compiler cannot feasibly verify const, since it would need to transitively examine all code called by the code in question, but the source code may not be always available. Even if the data came from a mutable object, it does not make it any less machine-verifiable, since what we're verifying is "this code does not modify this data", not "this data never changes". For the latter, immutable provides that guarantee, not const. It is possible, for example, to obtain a const reference to a mutable object, and have one thread modify the object (via the mutable reference) while another thread reads it (via the const reference). You cannot guarantee that the data itself won't change, but you *can* guarantee that the code holding the const reference (without access to the mutable reference) isn't the one making the changes. T -- A program should be written to model the concepts of the task it performs rather than the physical world or a process because this maximizes the potential for it to be applied to tasks that are conceptually similar and, more important, to tasks that have not yet been conceived. -- Michael B. Allen
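The single-threaded version of that last point fits in a few lines (a minimal sketch, not code from the thread):

```d
void main()
{
    int x = 1;
    const(int)* view = &x;  // const view of mutable data
    x = 2;                  // mutation through the mutable name is legal
    assert(*view == 2);     // the const view observes the change...
    // *view = 3;           // ...but cannot make changes itself: compile error
}
```

The const reference never promised the data wouldn't change, only that *it* would not be the one changing it.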
Aug 28 2018
next sibling parent reply tide <tide tide.tide> writes:
On Tuesday, 28 August 2018 at 20:32:29 UTC, H. S. Teoh wrote:
 On Tue, Aug 28, 2018 at 07:39:20PM +0000, tide via 
 Digitalmars-d wrote:
 On Tuesday, 28 August 2018 at 17:02:46 UTC, H. S. Teoh wrote:
 On Tue, Aug 28, 2018 at 08:18:57AM +0000, Eugene Wissner via 
 Digitalmars-d wrote: [...]
 There are still valid use cases where const should be 
 "broken". One of them is mutex (another one caching). I 
 have very little experiance in multi-threaded programming, 
 but what do you think about "mutable" members, despite the 
 object is const?
The problem with compromising const is that it would invalidate any guarantees const may have provided. Const in D is not the same as const in languages like C++; const in D means *physical* const, as in, the data might reside in ROM where it's physically impossible to modify. Allowing the user to bypass this means UB if the data exists in ROM.
I feel that such a narrow use case, wouldn't you just use something like immutable instead.
The problem is that immutable implicitly converts to const. Basically, const means "I guarantee I will never modify this data (though someone else might", and immutable means "nobody will ever modify this data". You cannot allow const to mutate without risking breakage with immutable. If the original data came from a mutable reference, you can probably get away with casting const away. But if it came from an immutable object, casting const away is UB. Allowing const to be "sometimes" modified is also UB.
 Plus, the whole point of const in D is that it is 
 machine-verifiable, i.e., the compiler checks that the code 
 does not break const in any way and therefore you are 
 guaranteed (barring compiler bugs) that the data does not 
 change.  If const were not machine-verifiable, it would be 
 nothing more than programming by convention, since it would 
 guarantee nothing.  Allowing const to be "broken" somewhere 
 would mean it's no longer machine-verifiable (you need a 
 human to verify whether the semantics are still correct).
This is still not true, it is not machine verifiable as it is. It can be bypassed quite easily, as a const object can be assigned from an non-const one. There's no way to offer that guarantee.
You misunderstand. Const means "this code cannot modify this object no matter what". It does not guarantee somebody else can't modify it (you want immutable for that). Both mutable and immutable implicitly convert to const, therefore it is imperative that code that handles const never modifies the data, because you don't know the provenance of the data: it could have come from an immutable object. Allowing const to "sometimes" modify stuff will violate immutable and cause UB. Whether a piece of code modifies the data is certainly machine-verifiable -- but only if there are no backdoors to const. If there are, then the compiler cannot feasibly verify const, since it would need to transitively examine all code called by the code in question, but the source code may not be always available. Even if the data came from a mutable object, it does not make it any less machine-verifiable, since what we're verifying is "this code does not modify this data", not "this data never changes". For the latter, immutable provides that guarantee, not const. It is possible, for example, to obtain a const reference to a mutable object, and have one thread modify the object (via the mutable reference) while another thread reads it (via the const reference). You cannot guarantee that the data itself won't change, but you *can* guarantee that the code holding the const reference (without access to the mutable reference) isn't the one making the changes. T
Point being, there is a huge difference between what you were saying, and what you are saying now. "This data never changes" is a much better guarantee and check than "this code does not modify this data". You use const to make sure the data doesn't change, if you can't guarantee it doesn't change from any other code then I wouldn't say it is machine-verifiable. So we would need another qualifier "tantamount" to be implemented then it seems.
Aug 28 2018
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Aug 29, 2018 at 01:02:54AM +0000, tide via Digitalmars-d wrote:
[...]
 Point being, there is a huge difference between what you were saying,
 and what you are saying now. "This data never changes" is a much
 better guarantee and check than "this code does not modify this data".
 You use const to make sure the data doesn't change, if you can't
 guarantee it doesn't change from any other code then I wouldn't say it
 is machine-verifiable.
You appear to be still misunderstanding how it works. In D, if you want to make sure the data never changes, you use immutable. Const is for when you want to make sure a piece of code doesn't modify the data (even if the data is mutable elsewhere). Both are machine-verifiable. As in, the compiler can verify that the code never touches the data. Immutable provides the strongest guarantee (no code anywhere modifies this data), while const provides a weaker guarantee (this code doesn't modify this data, but somebody else might). The usefulness of const is that you can safely pass *both* mutable and immutable data through it, and you're guaranteed there will be no problems, because const does not allow the code to touch the data. If the code does not need to touch the data, then it could take the data as const, and you could use the same code to handle both mutable and immutable data. All of this breaks down if you allow const to be overridden anywhere. That's why it's UB to cast away const.
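A minimal sketch of that property (the `count` function is hypothetical): because const accepts both mutable and immutable arguments, one signature serves both kinds of caller.

```d
// `count` promises not to modify its argument, so both mutable and
// immutable arrays can pass through the same function safely.
size_t count(const(int)[] data)
{
    return data.length;
}

void main()
{
    int[] m = [1, 2, 3];        // mutable
    immutable int[] i = [4, 5]; // immutable
    assert(count(m) == 3);      // mutable -> const: OK
    assert(count(i) == 2);      // immutable -> const: OK
}
```

This only stays sound because `count` cannot write through `data`; a const backdoor would silently break the immutable caller.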
 So we would need another qualifier "tantamount" to be implemented then
 it seems.
I don't understand what you mean by this. Could you clarify? T -- Живёшь только однажды.
Aug 29 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
Thanks, that's a good explanation of the point of the differences between const 
and immutable.
Aug 28 2018
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 28.08.2018 19:02, H. S. Teoh wrote:
 On Tue, Aug 28, 2018 at 08:18:57AM +0000, Eugene Wissner via Digitalmars-d
wrote:
 [...]
 There are still valid use cases where const should be "broken". One of
 them is mutex (another one caching). I have very little experiance in
 multi-threaded programming, but what do you think about "mutable"
 members, despite the object is const?
The problem with compromising const is that it would invalidate any guarantees const may have provided.
No. You start with the set of allowed program rewrites, then require code with __mutable to not break under them. Code using __mutable is unsafe.
 Const in D is not the same as const
 in languages like C++; const in D means*physical*  const, as in, the
 data might reside in ROM where it's physically impossible to modify.
 Allowing the user to bypass this means UB if the data exists in ROM.
 
 Plus, the whole point of const in D is that it is machine-verifiable,
 i.e., the compiler checks that the code does not break const in any way
 and therefore you are guaranteed (barring compiler bugs) that the data
 does not change.  If const were not machine-verifiable, it would be
 nothing more than programming by convention, since it would guarantee
 nothing.  Allowing const to be "broken" somewhere would mean it's no
 longer machine-verifiable (you need a human to verify whether the
 semantics are still correct).
It is not unusual to need a human to verify that your code does what it was intended to do.
Aug 29 2018
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Aug 29, 2018 at 06:58:16PM +0200, Timon Gehr via Digitalmars-d wrote:
 On 28.08.2018 19:02, H. S. Teoh wrote:
 On Tue, Aug 28, 2018 at 08:18:57AM +0000, Eugene Wissner via Digitalmars-d
wrote:
 [...]
 There are still valid use cases where const should be "broken".
 One of them is mutex (another one caching). I have very little
 experiance in multi-threaded programming, but what do you think
 about "mutable" members, despite the object is const?
The problem with compromising const is that it would invalidate any guarantees const may have provided.
No. You start with the set of allowed program rewrites, then require code with __mutable to not break under them. Code using __mutable is unsafe.
Currently, immutable implicitly converts to const. If const is allowed to be overridden, then you could violate immutable, which is UB.
 Const in D is not the same as const in languages like C++; const in
 D means*physical*  const, as in, the data might reside in ROM where
 it's physically impossible to modify.  Allowing the user to bypass
 this means UB if the data exists in ROM.
 
 Plus, the whole point of const in D is that it is
 machine-verifiable, i.e., the compiler checks that the code does not
 break const in any way and therefore you are guaranteed (barring
 compiler bugs) that the data does not change.  If const were not
 machine-verifiable, it would be nothing more than programming by
 convention, since it would guarantee nothing.  Allowing const to be
 "broken" somewhere would mean it's no longer machine-verifiable (you
 need a human to verify whether the semantics are still correct).
It is not unusual to need a human to verify that your code does what it was intended to do.
And it is not unusual for humans to make mistakes and certify code that is not actually correct. Automation provides much stronger guarantees than human verification. Besides, this is missing the point. What I meant was that if const could be arbitrarily overridden anywhere down the call chain, then the compiler could no longer feasibly verify that a particular piece of code doesn't violate const. The code could be calling a function for which the compiler has no source code, and who knows what that function might do. It could override const and modify the data willy-nilly, and if the const reference is pointing to an immutable object, you're in UB land. Not allowing const to be overridden (without the user deliberately treading into UB land by casting it away) allows the compiler to statically check that the code doesn't actually modify a const object. You appear to be thinking I was making a statement about verifying program correctness in general, which is taking what I said out of context. T -- It is not the employer who pays the wages. Employers only handle the money. It is the customer who pays the wages. -- Henry Ford
Aug 29 2018
next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 29.08.2018 19:15, H. S. Teoh wrote:
 On Wed, Aug 29, 2018 at 06:58:16PM +0200, Timon Gehr via Digitalmars-d wrote:
 On 28.08.2018 19:02, H. S. Teoh wrote:
 On Tue, Aug 28, 2018 at 08:18:57AM +0000, Eugene Wissner via Digitalmars-d
wrote:
 [...]
 There are still valid use cases where const should be "broken".
 One of them is mutex (another one caching). I have very little
 experience in multi-threaded programming, but what do you think
 about "mutable" members, despite the object is const?
The problem with compromising const is that it would invalidate any guarantees const may have provided.
No. You start with the set of allowed program rewrites, then require code with __mutable to not break under them. Code using __mutable is unsafe.
Currently, immutable implicitly converts to const. If const is allowed to be overridden, then you could violate immutable, which is UB. ...
__mutable fields are __mutable also in the immutable instance. You might get into trouble with shared if you are not careful because of the unfortunate "implicit shared" semantics of immutable, but it is up to the programmer to get this right.
 
 Const in D is not the same as const in languages like C++; const in
 D means*physical*  const, as in, the data might reside in ROM where
 it's physically impossible to modify.  Allowing the user to bypass
 this means UB if the data exists in ROM.

 Plus, the whole point of const in D is that it is
 machine-verifiable, i.e., the compiler checks that the code does not
 break const in any way and therefore you are guaranteed (barring
 compiler bugs) that the data does not change.  If const were not
 machine-verifiable, it would be nothing more than programming by
 convention, since it would guarantee nothing.  Allowing const to be
 "broken" somewhere would mean it's no longer machine-verifiable (you
 need a human to verify whether the semantics are still correct).
It is not unusual to need a human to verify that your code does what it was intended to do.
And it is not unusual for humans to make mistakes and certify code that is not actually correct. Automation provides much stronger guarantees than human verification. ...
Absolutely. But D only strives to provide such automation in safe code. For system code, we need a formal specification of what is allowed. (And it needs to include all things that the GC and language do; no magic.) Note that such a formal specification is a prerequisite for any (possibly language-external) automated verification approaches.
 Besides, this is missing the point.  What I meant was that if const
 could be arbitrarily overridden anywhere down the call chain, then the
 compiler could no longer feasibly verify that a particular piece of code
 doesn't violate const. The code could be calling a function for which
 the compiler has no source code, and who knows what that function might
 do. It could override const and modify the data willy-nilly, and if the
 const reference is pointing to an immutable object, you're in UB land.
 
 Not allowing const to be overridden (without the user deliberately
 treading into UB land by casting it away) allows the compiler to
 statically check that the code doesn't actually modify a const object.
 
 You appear to be thinking I was making a statement about verifying
 program correctness in general, which is taking what I said out of
 context.
 
 
 T
 
I was thinking you were making a statement about __mutable fields.
Aug 29 2018
next sibling parent reply Dave Jones <dave jones.com> writes:
On Wednesday, 29 August 2018 at 18:02:16 UTC, Timon Gehr wrote:
 On 29.08.2018 19:15, H. S. Teoh wrote:
 On Wed, Aug 29, 2018 at 06:58:16PM +0200, Timon Gehr via 
 Digitalmars-d wrote:
 On 28.08.2018 19:02, H. S. Teoh wrote:
 On Tue, Aug 28, 2018 at 08:18:57AM +0000, Eugene Wissner via 
 Digitalmars-d wrote:
Currently, immutable implicitly converts to const. If const is allowed to be overridden, then you could violate immutable, which is UB. ...
__mutable fields are __mutable also in the immutable instance. You might get into trouble with shared if you are not careful because of the unfortunate "implicit shared" semantics of immutable, but it is up to the programmer to get this right.
So you can't cast away const, but you can specify that a field stays mutable even if the aggregate is const or immutable?
Aug 29 2018
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Aug 29, 2018 at 07:02:42PM +0000, Dave Jones via Digitalmars-d wrote:
 On Wednesday, 29 August 2018 at 18:02:16 UTC, Timon Gehr wrote:
 On 29.08.2018 19:15, H. S. Teoh wrote:
 On Wed, Aug 29, 2018 at 06:58:16PM +0200, Timon Gehr via
 Digitalmars-d wrote:
 On 28.08.2018 19:02, H. S. Teoh wrote:
 On Tue, Aug 28, 2018 at 08:18:57AM +0000, Eugene Wissner via
 Digitalmars-d wrote:
Currently, immutable implicitly converts to const. If const is allowed to be overridden, then you could violate immutable, which is UB. ...
__mutable fields are __mutable also in the immutable instance. You might get into trouble with shared if you are not careful because of the unfortunate "implicit shared" semantics of immutable, but it is up to the programmer to get this right.
So you can't cast away const, but you can specify that a field stays mutable even if the aggregate is const or immutable?
That appears to be the case. But it scares me that const(T) would no longer guarantee you can't modify anything in T. I fear it will break some subtle assumptions about how const/immutable works, and introduce hidden bugs into existing code.

T

-- 
Doubt is a self-fulfilling prophecy.
Aug 29 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/29/2018 11:02 AM, Timon Gehr wrote:
 Absolutely. But D only strives to provide such automation in  safe code. For 
  system code, we need a formal specification of what is allowed. (And it needs 
 to include all things that the GC and language do; no magic.) Note that such a 
 formal specification is a prerequisite for any (possibly language-external) 
 automated verification approaches.
I don't think that system code is amenable to formal verification. After all, you can do UB in it, and it is the programmer's responsibility to ensure it works.
Aug 29 2018
parent Timon Gehr <timon.gehr gmx.ch> writes:
On 29.08.2018 21:58, Walter Bright wrote:
 On 8/29/2018 11:02 AM, Timon Gehr wrote:
 Absolutely. But D only strives to provide such automation in  safe 
 code. For  system code, we need a formal specification of what is 
 allowed. (And it needs to include all things that the GC and language 
 do; no magic.) Note that such a formal specification is a prerequisite 
 for any (possibly language-external) automated verification approaches.
I don't think that system code is amenable to formal verification. After all, you can do UB in it, and it is the programmer's responsibility to ensure it works.
If it's amenable to informal verification, it is also amenable to formal verification. Computers can check mathematical proofs, and if the code is proven correct it does not contain UB. This is independent of whether D classifies the code as safe or system.
Aug 30 2018
prev sibling parent tide <tide tide.tide> writes:
On Wednesday, 29 August 2018 at 17:15:15 UTC, H. S. Teoh wrote:
 Besides, this is missing the point.  What I meant was that if 
 const could be arbitrarily overridden anywhere down the call 
 chain, then the compiler could no longer feasibly verify that a 
 particular piece of code doesn't violate const. The code could 
 be calling a function for which the compiler has no source 
 code, and who knows what that function might do. It could 
 override const and modify the data willy-nilly, and if the 
 const reference is pointing to an immutable object, you're in 
 UB land.

 Not allowing const to be overridden (without the user 
 deliberately treading into UB land by casting it away) allows 
 the compiler to statically check that the code doesn't actually 
 modify a const object.

 You appear to be thinking I was making a statement about 
 verifying program correctness in general, which is taking what 
 I said out of context.


 T
You keep saying it has to be machine-verifiable, but honestly I don't see the benefit of being machine-verifiable. It can't verify that the object doesn't change at all; it can only verify that the code in scope doesn't modify it. I'd rather have C++ const and be useful than avoid const almost completely.
Aug 29 2018
prev sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Wednesday, August 29, 2018 11:15:15 AM MDT H. S. Teoh via Digitalmars-d 
wrote:
 On Wed, Aug 29, 2018 at 06:58:16PM +0200, Timon Gehr via Digitalmars-d 
wrote:
 On 28.08.2018 19:02, H. S. Teoh wrote:
 On Tue, Aug 28, 2018 at 08:18:57AM +0000, Eugene Wissner via
 Digitalmars-d wrote: [...]

 There are still valid use cases where const should be "broken".
 One of them is mutex (another one caching). I have very little
 experience in multi-threaded programming, but what do you think
 about "mutable" members, despite the object is const?
The problem with compromising const is that it would invalidate any guarantees const may have provided.
No. You start with the set of allowed program rewrites, then require code with __mutable to not break under them. Code using __mutable is unsafe.
Currently, immutable implicitly converts to const. If const is allowed to be overridden, then you could violate immutable, which is UB.
If I understand correctly, the main reason behind looking to add __mutable is to be able to do stuff with containers that can't currently be done, where you have a way of knowing whether the actual data is truly immutable or not and thus can avoid mutating data that's actually immutable. That's still undefined behavior right now, but presumably, if __mutable were added, it would then be defined behavior that was highly  system and not intended for normal code.

However, even allowing that much does make it so that the compiler can't then do any optimizations based on const, since while it may be possible in some cases to avoid mutating immutable data when casting away const and mutating, I don't see how it would be possible to guarantee that it would be done in a way that could not possibly be screwed up by optimizations made at a higher level based on the fact that the objects in question are typed as const.

Basically, it seems that Andrei really wants a backdoor in const for certain use cases, so he's looking to find a way to enable it without really putting a backdoor in const (IIRC it was discussed as part of this year's dconf talk about the new containers that one of the students is working on). I guess that he's managed to talk Timon into working on the issue for him, given Timon's excellent knowledge of the type system and of the related computer science concepts.

We'll see what they come up with, but it's going to be _very_ difficult to make it so that you can actually rely on const's guarantees if it has any kind of backdoors at all. However, given some of the technical issues that they've run into with allocators and containers, Andrei has been rather motivated to change the status quo. We'll see what happens.

- Jonathan M Davis
Aug 29 2018
prev sibling next sibling parent reply Manu <turkeyman gmail.com> writes:
On Tue, 28 Aug 2018 at 00:55, Walter Bright via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On 8/26/2018 11:16 PM, Manu wrote:
 The code looks the same, and in fact, is about 98% the same.
This code appears to be a mechanical translation.
It's not. It's by hand. But I had a specific goal of minimizing the diffs, so that if the translation didn't work, it reduced the number of places to look for the mistake. And in fact, this has saved me a LOT of grief :-)
 That's not what
 happened in this case; he wrote his game in D from scratch.
 It was just that he arrived at mostly the same place. He was googling
 for styling and sample material, but I suspect the problem was a lack
 of topical material demonstrating how he might write his D code
 differently.
It takes time to learn how to write idiomatic D effectively. I'm still learning how to do it right, too.
 It's also the case that the significant difference between C++ and D
 (in my experience) mostly come down to: D has modules, tidier meta,
 UDA's, slices, and ranges/UFCS. In trade, D struggles with const, and
 ref is broken.
 If your code doesn't manifest some gravity towards one of those
 features, it will tend to be quite same-ey, and advantage may not be
 particularly apparent.
I suspect that is still a bit stuck on looking at individual instruments and not seeing the orchestra.

Let's take the much-maligned D const. It isn't C++ const (let's call that "head-const", because that's what it is). Head-const for a function parameter tells us very little about what may happen to it in the function. You can pass a head-const reference to a container, and have the function add/change/delete every element of that container, all without a peep from any C++ tool. Looking at the function signature, you've really got no clue whatsoever.

The reason people have trouble with transitive-const is that they are still programming in C++, where they *do* add/change/delete every member of the "const" container.
We understand... really. I've spent a decade digesting this, and I'm not one of those that has ever really complained about D's const. I've always mostly bought into it philosophically.

The reality is though, that D's const is not actually very useful, and C++'s const is. D has no way to express head-const, and it turns out it's a tremendously useful concept. As I said, I tend to create a head-const hack to use in its place, and that gets me out of jail... but it's specified as undefined behaviour, which isn't great.
 That includes me. I try to add transitive-const, and it won't compile, because I as well am used to replacing the engine and tail lights in my head-const car. In order to use transitive-const, it's forcing me to fundamentally re-think how I organize code into functions.
I often walk the same path, but sometimes it doesn't yield success, and in many cases, it just doesn't actually model the problem.

If the problem is that I want a const container class; I don't want a function I pass a vector to mutate the container structure, that is, add/remove/reorder/reallocate the array, but I DO intend it to interact with mutable elements. That's a perfectly valid problem structure, and it turns out, it's very common. I would attribute the given element type as const if I wanted the const-ness to propagate to the elements; that's obvious and convenient.

It might be that we can sufficiently rearrange all manner of conventional wisdom to interact successfully with D's const, but I've been watching this space for 10 years, and nobody has produced any such evidence, or articles that we can read and understand how to wrangle successful solutions.

This particular class of problem reeks of the typical criticism of Rust... that is, I have better things to be doing with my time than trying to find awkward alternative code structures to pacify the const-checker, when in reality, const-container-of-mutable-elements is simply the correct conceptual modeling of the problem.

Anyway, I'm not fighting that battle. I have enough of my own.
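In C++ terms, the const-container-of-mutable-elements shape described here falls out naturally, because C++ const stops at the head. A minimal sketch (the function and variable names are mine, purely illustrative):

```cpp
#include <vector>

// The container is const: no push_back/erase/reallocation will compile.
// The elements it refers to stay mutable, because C++ const is head-const.
void scaleAll(const std::vector<int*>& xs, int factor) {
    // xs.push_back(nullptr);   // error: xs is const
    for (int* p : xs)
        *p *= factor;           // fine: the pointees are non-const
}
```

This is exactly the signature-level guarantee under discussion: the callee can't restructure the container, but the elements remain fair game.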
 For example, dmd is full of functions that combine data-gathering with taking-action. I've been reorganizing to separate data-gathering and taking-action into separate functions. The former can be transitive-const, maybe even pure. And I like the results, the code becomes much easier to understand.
I've also had occasional success refactoring to support const, but it's certainly the case that success is not guaranteed. And it's always time consuming regardless.
Aug 28 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
There's been some talk of adding a "mutable" qualifier for fields, which would stop the transitivity of const at that point. But it has problems, such as what happens with opaque types. The compiler can no longer check them, and hence will have to assume they contain mutable members.
Aug 28 2018
next sibling parent Manu <turkeyman gmail.com> writes:
On Tue, 28 Aug 2018 at 19:00, Walter Bright via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 There's been some talk of adding a "mutable" qualifier for fields, which would stop the transitivity of const at that point. But it has problems, such as what happens with opaque types. The compiler can no longer check them, and hence will have to assume they contain mutable members.
Exactly. And you arrive at C++. 'c-const' and 'turtles-const' probably need to be specified differently from the top, not broken along the way with the likes of mutable.
Aug 28 2018
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 29.08.2018 03:59, Walter Bright wrote:
 There's been some talk of adding a "mutable" qualifier for fields, which 
 would stop the transitivity of const at that point. But it has problems, 
 such as what happens with opaque types. The compiler can no longer check 
 them, and hence will have to assume they contain mutable members.
This is a misunderstanding. The __mutable DIP will define the set of allowed program rewrites based on const/immutable/pure. Then code that uses __mutable must remain correct when they are applied. This achieves two things: it clearly defines the semantics of const/immutable/pure and (the possibility of) __mutable will not be an optimization blocker. I'll get back to this once I have finished the tuple DIP implementation.
Aug 29 2018
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/29/2018 10:05 AM, Timon Gehr wrote:
 This is a misunderstanding. The __mutable DIP will define the set of allowed program rewrites based on const/immutable/pure. Then code that uses __mutable must remain correct when they are applied. This achieves two things: it clearly defines the semantics of const/immutable/pure and (the possibility of) __mutable will not be an optimization blocker.

 I'll get back to this once I have finished the tuple DIP implementation.
This is good news. I'm looking forward to both of them.
Aug 29 2018
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 28, 2018 at 10:20:06AM -0700, Manu via Digitalmars-d wrote:
[...]
 The reality is though, that D's const is not actually very useful, and
 C++'s const is.
Actually, I think C++ const is not very useful, because it guarantees nothing. At the most, it's just a sanity checker to make sure the programmer didn't accidentally do something dumb. But given an opaque C++ function that takes const parameters, there is ZERO guarantee that it doesn't actually modify stuff behind your back, and do so legally (per spec).

I mean, how many times have you written const_cast<...> just to get a piece of code to compile? I know I've been guilty of this in many places, because it simply isn't worth the effort to track down all the places of the code that you need to fix to make it const-correct. So basically, C++ const is nothing more than an annotation that isn't really enforced.

But you're spot on about D's const, though. While D's const *does* provide real guarantees (unless you tread into UB territory by casting it away), that also limits its scope so much that it's rarely useful outside of rather narrow confines. Yet because it's so strict, using it requires investing significant effort. So you end up with the unfortunate situation of "a lot of effort" + "limited usefulness" which for many people equals "not worth using".
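For what it's worth, the "guarantees nothing" point can be shown without const_cast at all: a C++ const member function may legally write through `mutable` fields. A small sketch (the type and field names are made up for illustration):

```cpp
#include <cstddef>

// A const member function that still mutates state -- legally --
// via `mutable` fields, e.g. for caching a computed length.
struct CachedLength {
    const char* s;
    mutable std::size_t cached = 0;
    mutable bool valid = false;

    std::size_t length() const {   // const, yet it writes two members
        if (!valid) {
            std::size_t n = 0;
            while (s[n]) ++n;
            cached = n;            // allowed: `mutable` opts out of const
            valid = true;
        }
        return cached;
    }
};
```

Even a fully const-qualified CachedLength object is mutated by calling length() on it, and no C++ tool will complain.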
 D has no way to express head-const, and it turns out it's a
 tremendously useful concept.
I can live without head-const... but what *really* makes const painful for me is the lack of head-mutable. I.e., given a const container (which implies const objects), there is no universal way to obtain a mutable reference to said const objects, unless you tread into UB territory by forcefully casting it away. This makes const so limited in applicability that, for the most part, I've given up using const at all, in spite of having tried quite hard to use it as much as possible for years. [...]
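A C++-flavoured sketch of what "head-mutable" buys you (the view type here is hand-rolled and illustrative; C++20's std::span plays the same role): from a const container you can build a view whose head, the pointer and length, is freely mutable, while the elements stay read-only. The empty/front/popFront names deliberately mirror D's range API.

```cpp
#include <cstddef>
#include <vector>

// A mutable view over const elements: the iteration state (ptr, len)
// can change, so popFront works, but the elements cannot be written.
struct ConstIntView {
    const int* ptr;
    std::size_t len;

    bool empty() const { return len == 0; }
    const int& front() const { return *ptr; }
    void popFront() { ++ptr; --len; }   // only the head mutates
};

// Building the view from a const container is trivially legal.
ConstIntView headMutable(const std::vector<int>& v) {
    return {v.data(), v.size()};
}
```

The view object itself must stay mutable for iteration to work, which is precisely the head-mutable conversion D's const has no universal way to express.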
 I've also had occasional success refactoring to support const, but
 it's certainly the case that success is not guaranteed. And it's
 always time consuming regardless.
Yes, it's time-consuming. And takes significant effort. In spite of being rather limited in applicability.

In my experience, it's useful for isolated pieces of code near the bottom of the program's call chain, where there are few or no additional dependencies. But it's just too cumbersome to use at any higher level, and a royal pain in generic code (which I'm quite heavy on). It probably *can* be made to work in most cases, but it falls under my umbrella category of "too much effort needed, only marginal benefits, therefore not worth it".

T

-- 
A linguistics professor was lecturing to his class one day. "In English," he said, "A double negative forms a positive. In some languages, though, such as Russian, a double negative is still a negative. However, there is no language wherein a double positive can form a negative." A voice from the back of the room piped up, "Yeah, yeah."
Aug 28 2018
next sibling parent reply aliak <something something.com> writes:
On Tuesday, 28 August 2018 at 17:53:36 UTC, H. S. Teoh wrote:
 On Tue, Aug 28, 2018 at 10:20:06AM -0700, Manu via
 D has no way to express head-const, and it turns out it's a 
 tremendously useful concept.
I can live without head-const... but what *really* makes const painful for me is the lack of head-mutable. I.e., given a const container (which implies const objects), there is no universal way to obtain a mutable reference to said const objects, unless you tread into UB territory by forcefully casting it away. This makes const so limited in applicability that, for the most part, I've given up using const at all, in spite of having tried quite hard to use it as much as possible for years.
Simen's opHeadMutable [0] was pretty good solution to this const range stuff, but for some reason (not specified by anyone in the thread) it didn't seem to catch on :/ [0] https://forum.dlang.org/post/zsaqtmvqmfkzhrmrmrju forum.dlang.org
Aug 28 2018
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 28, 2018 at 06:44:37PM +0000, aliak via Digitalmars-d wrote:
 On Tuesday, 28 August 2018 at 17:53:36 UTC, H. S. Teoh wrote:
 On Tue, Aug 28, 2018 at 10:20:06AM -0700, Manu via
 D has no way to express head-const, and it turns out it's a
 tremendously useful concept.
I can live without head-const... but what *really* makes const painful for me is the lack of head-mutable. I.e., given a const container (which implies const objects), there is no universal way to obtain a mutable reference to said const objects, unless you tread into UB territory by forcefully casting it away. This makes const so limited in applicability that, for the most part, I've given up using const at all, in spite of having tried quite hard to use it as much as possible for years.
Simen's opHeadMutable [0] was pretty good solution to this const range stuff, but for some reason (not specified by anyone in the thread) it didn't seem to catch on :/ [0] https://forum.dlang.org/post/zsaqtmvqmfkzhrmrmrju forum.dlang.org
[...] Probably because nobody pushed it hard enough to make it happen. T -- It only takes one twig to burn down a forest.
Aug 28 2018
prev sibling parent Tobias Müller <troplin bluewin.ch> writes:
H. S. Teoh <hsteoh quickfur.ath.cx> wrote:
 On Tue, Aug 28, 2018 at 10:20:06AM -0700, Manu via Digitalmars-d wrote:
 [...]
 Actually, I think C++ const is not very useful, because it guarantees
 nothing. At the most, it's just a sanity checker to make sure the
 programmer didn't accidentally do something dumb. But given an opaque
 C++ function that takes const parameters, there is ZERO guarantee that
 it doesn't actually modify stuff behind your back, and do so legally
 (per spec).
No, casting away const on pointers and references is only legal if the object pointed to is actually mutable (not const). Everything else is UB. Casting away const of a function parameter that is not under your control will sooner or later lead to UB. Tobi
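Tobi's rule fits in a couple of lines (the helper function is mine, for illustration): const_cast is well-defined only when the referenced object was declared mutable; writing through it to a genuinely const object is UB.

```cpp
// Well-defined ONLY if r actually refers to a non-const int;
// on a truly const object this write would be undefined behavior.
int bumpThroughConstRef(const int& r) {
    return ++const_cast<int&>(r);
}
```

Calling it on `int x = 1;` is fine because x is mutable; calling it on `static const int y = 1;` would be UB, since y may even live in read-only memory, which is exactly the scenario D's transitive const is designed to rule out statically.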
Aug 28 2018
prev sibling parent Manu <turkeyman gmail.com> writes:
On Tue, 28 Aug 2018 at 10:54, H. S. Teoh via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Tue, Aug 28, 2018 at 10:20:06AM -0700, Manu via Digitalmars-d wrote:
 [...]
 The reality is though, that D's const is not actually very useful, and
 C++'s const is.
Actually, I think C++ const is not very useful, because it guarantees nothing. At the most, it's just a sanity checker to make sure the programmer didn't accidentally do something dumb.
I'd rate that as "pretty damn useful"™!
 But given an opaque
 C++ function that takes const parameters, there is ZERO guarantee that
 it doesn't actually modify stuff behind your back, and do so legally
 (per spec).
Well it can't modify the head-object... that's the point of head-const!
 I mean, how many times have you written const_cast<...>
 just to get a piece of code to compile?
Never in my life. That's a heinous crime. If it were removed from C++ and declared UB, I'd be fine with that.
 I know I've been guilty of this
 in many places, because it simply isn't worth the effort to track down
 all the places of the code that you need to fix to make it
 const-correct.  So basically, C++ const is nothing more than an
 annotation that isn't really enforced.
It could be enforced though. const_cast<> doesn't have to exist, and `mutable` doesn't have to exist either. That would strengthen C++'s design to make it more meaningful while retaining a generally useful semantic.

That said, D's transitive const is a nice thing to be able to express... I just recognise that it's mostly useless, and from that perspective, I think being able to express the C++ meaning would be useful, and certainly MORE useful. I wonder if there's a design that could allow to express both options selectively?
 But you're spot on about D's const, though.  While D's const *does*
 provide real guarantees (unless you tread into UB territory by casting
 it away), that also limits its scope so much that it's rarely useful
 outside of rather narrow confines.  Yet because it's so strict, using it
 requires investing significant effort.  So you end up with the
 unfortunate situation of "a lot of effort" + "limited usefulness" which
 for many people equals "not worth using".
And then that case of not being used (even if it could have) blocks use somewhere else, and not(/unable-to)-const spreads like a virus >_<
 D has no way to express head-const, and it turns out it's a
 tremendously useful concept.
I can live without head-const... but what *really* makes const painful for me is the lack of head-mutable. I.e., given a const container (which implies const objects), there is no universal way to obtain a mutable reference to said const objects,
... I think we're talking about the same thing. In this context, the container is the 'head', and the elements would be mutable beneath that unless declared const themselves.
 unless you tread into UB territory by
 forcefully casting it away.  This makes const so limited in
 applicability that, for the most part, I've given up using const at all,
 in spite of having tried quite hard to use it as much as possible for
 years.
Right. This appears to be the accepted recommendation for quite some time, and no change in sight.

Tragically, the more people resign to this recommendation (and it's practically official at this stage), the harder it becomes to use even if you want to; any library code that you interact with that didn't use const because 'recommendation' creates interaction blockages for your own code, propagating can't-use-const into your client code, despite your best intentions.

D's const is an objective failure. I don't think anyone could argue otherwise with a straight face. It's sad but true; the surface area and complexity of the feature absolutely doesn't justify its limited (and actively waning) usefulness.
Aug 28 2018
prev sibling next sibling parent Manu <turkeyman gmail.com> writes:
On Sun, 26 Aug 2018 at 21:50, Walter Bright via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 Going the other way, though, is even worse - what do you mean, I have to write
 forward declarations? Why can't I pass a string literal to a template? No user
 defined attributes? Why doesn't CTFE work? Who can actually get work done in
 this language?
And yes... this. This is the truth that has ruined my career. There was a time when I used to be happy; ignorance is bliss ;)

That said, most of my problems I personally care about are purely artificial. We could make ref work, we could make namespaces work, it wouldn't hurt anybody. The power is yours, and yours alone. Everyone else is behind it... you're the only gatekeeper ;)

const however... that's another kettle of fish, and I'm slowly becoming sympathetic to the complaining that I've been watching here for years, which I've somehow managed to resist being sucked into for a very long time.
Aug 26 2018
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sun, Aug 26, 2018 at 11:27:57PM -0700, Manu via Digitalmars-d wrote:
 On Sun, 26 Aug 2018 at 21:50, Walter Bright via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 Going the other way, though, is even worse - what do you mean, I
 have to write forward declarations? Why can't I pass a string
 literal to a template? No user defined attributes? Why doesn't CTFE
 work? Who can actually get work done in this language?
And yes... this. This is the truth that has ruined my career. There was a time when I used to be happy; ignorance is bliss ;)
Me too(tm)! D has officially ruined my life (and my career). I can't stand writing code in any other language anymore. [...]
 const however... that's another kettle of fish, and I'm slowly
 becoming sympathetic to the complaining that I've been watching here
 for years, which I've somehow managed to resist being sucked into for
 a very long time.
IMNSHO, const is only a "problem" when you have to interoperate with C++ const, where I can see how the disparity would cause endless woe when translating / interoperating across the C++/D boundary. Const in D makes sense as-is. Though, granted, its infectiousness means its scope is actually very narrow, and as a result, we ironically can't use it in very many places, and so its touted benefits only rarely apply. :-( Which also means that it's taking up a lot of language design real estate with not many benefits to show for it. T -- Perhaps the most widespread illusion is that if we were in power we would behave very differently from those who now hold it---when, in truth, in order to get power we would have to become very much like them. -- Unknown
Aug 27 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/27/2018 10:08 AM, H. S. Teoh wrote:
 Const in D makes sense as-is.  Though, granted, its infectiousness means
 its scope is actually very narrow, and as a result, we ironically can't
 use it in very many places, and so its touted benefits only rarely
 apply. :-(  Which also means that it's taking up a lot of language
 design real estate with not many benefits to show for it.
D const is of great utility if you're interested in functional programming. Using it has forced me to rethink how I separate tasks into functions, and the result is for the better. I agree that D const has little utility if you try to program in C++ style.
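An illustrative sketch (not Walter's actual dmd code) of the separation he describes, written here in C++ for brevity: a const, side-effect-free "gather" pass that only computes what to do, and a separate "act" pass that performs all mutation.

```cpp
#include <cstddef>
#include <vector>

// Gather: inspects const data and returns a plan; mutates nothing.
std::vector<std::size_t> gatherNegatives(const std::vector<int>& xs) {
    std::vector<std::size_t> idx;
    for (std::size_t i = 0; i < xs.size(); ++i)
        if (xs[i] < 0) idx.push_back(i);
    return idx;
}

// Act: all mutation is confined to this one small function.
void zeroOut(std::vector<int>& xs, const std::vector<std::size_t>& idx) {
    for (auto i : idx) xs[i] = 0;
}
```

In D, the gather half could be marked const (or even pure), and the type system would enforce that the decision-making code never mutates its input.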
Aug 27 2018
next sibling parent tide <tide tide.tide> writes:
On Tuesday, 28 August 2018 at 01:11:14 UTC, Walter Bright wrote:
 On 8/27/2018 10:08 AM, H. S. Teoh wrote:
 Const in D makes sense as-is.  Though, granted, its 
 infectiousness means
 its scope is actually very narrow, and as a result, we 
 ironically can't
 use it in very many places, and so its touted benefits only 
 rarely
 apply. :-(  Which also means that it's taking up a lot of 
 language
 design real estate with not many benefits to show for it.
D const is of great utility if you're interested in functional programming. Using it has forced me to rethink how I separate tasks into functions, and the result is for the better. I agree that D const has little utility if you try to program in C++ style.
It doesn't play well with templates or anything of the like either, so even if you do template programming it's just better not to use it. I'm curious what an example of this D const for functional programming would look like.
Aug 27 2018
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 27, 2018 at 06:11:14PM -0700, Walter Bright via Digitalmars-d wrote:
 On 8/27/2018 10:08 AM, H. S. Teoh wrote:
 Const in D makes sense as-is.  Though, granted, its infectiousness means
 its scope is actually very narrow, and as a result, we ironically
 can't use it in very many places, and so its touted benefits only
 rarely apply. :-(  Which also means that it's taking up a lot of
 language design real estate with not many benefits to show for it.
D const is of great utility if you're interested in functional programming. Using it has forced me to rethink how I separate tasks into functions, and the result is for the better. I agree that D const has little utility if you try to program in C++ style.
I am very interested in functional programming, yet ironically, one of D's top functional programming selling points, range-based programming, interacts badly with const. Just ask Jonathan about using ranges with const, and you'll see what I mean. :-)

The very design of ranges in D requires that the range be mutable. However, because const is infectious, this makes it a royal pain to use in practice.

Take, for example, a user-defined container type; let's call it L. For argument's sake, let's say it's a linked list. And let's say the list elements are reference-counted -- we'll write that as RefCounted!Elem, even though this argument isn't specific to the current Phobos implementation of RefCounted.

As experience has shown in the past, it's usually a good idea to separate the container from the range that iterates over it, so an obvious API choice would be to define, say, an .opSlice method for L that returns a range over its elements. Now, logically speaking, iterating over L shouldn't modify it, so it would make sense that .opSlice should be const. So we have:

	struct L {
		private RefCounted!Elem head, tail;
		auto opSlice() const { ... }
	}

The returned range, however, must be mutable, since otherwise you couldn't use .popFront to iterate over it (and correspondingly, Phobos' isInputRange would evaluate to false). But here's the problem: because opSlice is declared const, that means `this` is also const, which means this.head and this.tail are also const. But since this.head is const, that means you couldn't do this:

	auto opSlice() const {
		struct Result {
			RefCounted!(const(Elem)) current;
			... // rest of range API
			void popFront() {
				// Error: cannot assign const(RefCounted!Elem)
				// to RefCounted!(const(Elem))
				current = current.next;
			}
		}
		// Error: cannot assign const(RefCounted!Elem)
		// to RefCounted!(const(Elem))
		return Result(head);
	}

This would have worked had we used pointers instead, because the compiler knows that it's OK to assign const(Elem*) to const(Elem)*.
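The pointer-vs-user-type asymmetry described here can be checked directly with a stripped-down stand-in (my sketch, not Phobos' RefCounted):

```d
// Minimal stand-in (not Phobos' RefCounted) for a pointer-wrapping template.
struct RC(T)
{
    T* payload;
}

void main()
{
    // Raw pointers get a built-in head-mutable conversion:
    static assert(is(const(int*) : const(int)*));

    // But the compiler sees const(RC!int) and RC!(const(int)) as two
    // unrelated struct types, so no such conversion exists:
    static assert(!is(const(RC!int) : RC!(const(int))));
}
```

The second static assert is precisely the missing "head-mutable" conversion for user-defined types.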
However, in this case, the compiler has no way of knowing that it is safe to assign const(RefCounted!Elem) to RefCounted!(const(Elem)). Indeed, they are different types, and the language currently has no way of declaring the head-mutable construct required here.

This is only the tip of the iceberg, of course. If you then try to add a method to RefCounted to make it convert const(RefCounted!T) to RefCounted!(const(T)), you'll be led down a rabbit hole of further problems with const (e.g., how to implement ref-counting with const objects in a way that doesn't violate the type system), until you reach the point where it's impossible to proceed without casting away const somehow. Unfortunately, the spec says that's Undefined Behaviour. So you're on your own.

This is just one example among many of const being hard to use in the general case. It works fairly well for a narrow number of cases, such as for built-in types, but once you start generalizing your code, you'll find brick walls in undesired places, the workarounds for which require so much effort as to offset any benefits that const may have brought.

TL;DR: const is beautiful in theory, but hard to use in practice. So hard that it's often not worth the trouble, despite the benefits that it undoubtedly does provide.

P.S. If D had the concept of head-mutable, a lot of this pain (though not all) would have been alleviated.


T

-- 
"I'm running Windows '98."  "Yes."  "My computer isn't working now."  "Yes, you already said that."  -- User-Friendly
Aug 28 2018
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 28.08.2018 03:11, Walter Bright wrote:
 On 8/27/2018 10:08 AM, H. S. Teoh wrote:
 Const in D makes sense as-is.  Though, granted, its infectiousness means
 its scope is actually very narrow, and as a result, we ironically can't
 use it in very many places, and so its touted benefits only rarely
 apply. :-(  Which also means that it's taking up a lot of language
 design real estate with not many benefits to show for it.
D const is of great utility if you're interested in functional programming.
D const/immutable is stronger than immutability in Haskell (which is usually _lazy_).
Aug 29 2018
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Aug 29, 2018 at 07:50:57PM +0200, Timon Gehr via Digitalmars-d wrote:
 On 28.08.2018 03:11, Walter Bright wrote:
 On 8/27/2018 10:08 AM, H. S. Teoh wrote:
 Const in D makes sense as-is. Though, granted, its infectiousness
 means its scope is actually very narrow, and as a result, we
 ironically can't use it in very many places, and so its touted
 benefits only rarely apply. :-( Which also means that it's taking
 up a lot of language design real estate with not many benefits to
 show for it.
D const is of great utility if you're interested in functional programming.
D const/immutable is stronger than immutability in Haskell (which is usually _lazy_).
This makes me wonder: is it possible to model a lazy immutable value in D? Likely not, if we were to take the immutability literally, since once the variable is marked immutable and initialized, you couldn't change it afterwards (without casting and UB).

We *might* be able to get away with a head-mutable reference to the data, though. Say something like this:

	struct LazyImmutable(T, alias initializer)
	{
		immutable(T)* impl;

		@property T get()
		{
			if (impl is null)
				impl = initializer();
			return *impl;
		}

		alias get this;
	}

Seems rather cumbersome to use in practice, though. And it adds indirection overhead to by-value types. One could possibly use emplace to alleviate that, but still, the variable itself cannot be marked immutable without breaking its functionality. Which means you couldn't rely on such a wrapper type to work transitively in complex types, unlike how immutable applies transitively to all aggregate members: if T were an aggregate type, its members couldn't be LazyImmutable, but must be actually immutable.

Maybe this can be made to work, but at the sacrifice of being unable to use built-in type qualifiers like const/immutable.


T

-- 
Fact is stranger than fiction.
Aug 29 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/29/2018 10:50 AM, Timon Gehr wrote:
 D const/immutable is stronger than immutability in Haskell (which is usually 
 _lazy_).
I know Haskell is lazy, but don't see the connection with a weaker immutability guarantee. In any case, isn't immutability a precept of FP?
Aug 29 2018
next sibling parent "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 08/29/2018 04:01 PM, Walter Bright wrote:
 On 8/29/2018 10:50 AM, Timon Gehr wrote:
 D const/immutable is stronger than immutability in Haskell (which is 
 usually _lazy_).
I know Haskell is lazy, but don't see the connection with a weaker immutability guarantee. In any case, isn't immutability a precept of FP?
I think the point is that it disallows less, and permits more, all without breaking immutability. Ie, lazy immutable *can* be changed, albeit once and only once, in a very specific circumstance: when transitioning from uninitialized to initialized.

AIUI, D only has this "the immutable is in-scope, but can still be initialized" state within constructors, whereas (it sounds like) Haskell allows it anywhere.

It's like strong-pure vs weak-pure: both enforce the same purity guarantees, but weak-pure is less restrictive and more expressive.
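Both points can be sketched in D (my illustration, not Nick's code): an immutable field is assignable exactly once, inside a constructor, and a weakly pure function may mutate its mutable arguments while still upholding the no-global-state guarantee.

```d
// My illustration (not from the post) of the two points above.
struct S
{
    immutable int x;

    this(int v) pure
    {
        x = v;   // first assignment in a constructor initializes x;
                 // assigning x again here would be a compile-time error
    }
}

// Weakly pure: touches no globals, but may write through its mutable slice.
void fill(int[] a, int v) pure
{
    foreach (ref e; a)
        e = v;
}

// Strongly pure: all parameters are immutable or values, so two calls
// with the same arguments are interchangeable.
int sum(immutable(int)[] a) pure
{
    int total = 0;
    foreach (e; a)
        total += e;
    return total;
}

void main()
{
    auto s = S(5);
    assert(s.x == 5);

    auto buf = new int[3];
    fill(buf, 2);
    assert(buf == [2, 2, 2]);

    immutable(int)[] data = [1, 2, 3];
    assert(sum(data) == 6);
}
```

Both `fill` and `sum` carry the same `pure` guarantee; `fill` is simply less restricted in what it accepts.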
Aug 30 2018
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 29.08.2018 22:01, Walter Bright wrote:
 On 8/29/2018 10:50 AM, Timon Gehr wrote:
 D const/immutable is stronger than immutability in Haskell (which is 
 usually _lazy_).
I know Haskell is lazy, but don't see the connection with a weaker immutability guarantee.
In D, you can't have a lazy value within an immutable data structure (__mutable will fix this).
 In any case, isn't immutability a precept of FP?
Yes, but it's at a higher level of abstraction.

The important property of a (lazy) functional programming language is that a language term can be deterministically assigned a value for each concrete instance of an environment in which it is well-typed (i.e., values for all free variables of the term). Furthermore, the language semantics can be given as a rewrite system such that each rewrite performed by the system preserves the semantics of the rewritten term. I.e., terms change, but their values are preserved (immutable). [1]

To get this property, it is crucially important that the functional programming system does not leak reference identities of the underlying value representations. This is sometimes called referential transparency. Immutability is a means to this end. (If references allow mutation, you can detect reference equality by modifying the underlying object through one reference and observing that the data accessed through some other reference changes accordingly.)

Under the hood, functional programming systems simulate term rewriting in some way, ultimately using mutable data structures. Similarly, in D, the garbage collector is allowed to change data that has been previously typed as immutable, and it can type-cast data that has been previously typed as mutable to immutable. However, it is impossible to write a GC or Haskell-like programs in D with pure functions operating on immutable data, because of constraints the language puts on user code that druntime is not subject to.

Therefore, D immutable/pure are both too strong and too weak: they prevent @system code from implementing value representations that internally use mutation (therefore D cannot implement its own runtime system, or alternatives to it), and it does not prevent pure @safe code from leaking reference identities of immutable value representations:

	pure @safe long naughty(immutable(int[]) xs){
	    return cast(long)xs.ptr;
	}

(In fact, it is equally bad that @safe weakly pure code can depend on the address of mutable data.)

[1] E.g.:

	(λa b. a + b) 2 3
	10 `div` 2

are two terms whose semantics are given as the mathematical value 5. During evaluation, terms change:

	(λa b. a + b) 2 3 ⇝ 2 + 3 ⇝ 5
	10 `div` 2 ⇝ 5

However, each intermediate term still represents the same value.
Sep 04 2018
parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/4/2018 12:59 PM, Timon Gehr wrote:
  [...]
Thanks for the great explanation! Not sure I thoroughly understand it, though.
 Therefore, D immutable/pure are both too strong and too weak: they prevent 
 @system code from implementing value representations that internally use 
 mutation (therefore D cannot implement its own runtime system, or alternatives 
 to it), and it does not prevent pure @safe code from leaking reference 
 identities of immutable value representations:
 
 pure @safe long naughty(immutable(int[]) xs){
      return cast(long)xs.ptr;
 }
 
 (In fact, it is equally bad that @safe weakly pure code can depend on the 
 address of mutable data.)
Would it make sense to disallow such casts in pure code? What other adjustments would you suggest?
Sep 04 2018
prev sibling next sibling parent reply Manu <turkeyman gmail.com> writes:
On Sun, 26 Aug 2018 at 14:00, Walter Bright via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On 8/26/2018 12:34 PM, Manu wrote:
 I work at a company with a thousand engineers, all VS users, D could
 find home there if some rough edges were polished, but they
 *absolutely must be polished* before it would be taken seriously.
 It is consistently expressed that poor VS integration is an absolute
 non-starter.
I will tiresomely ask again, do you have a list of each and every aspect of the poor integration?
Tooling specifically? Just to be clear, when I say "poor VS integration is an absolute non-starter", I'm not necessarily saying that's where we are... I'm just saying that's a fact.

Where I think we are today is at MVP, which is a hard-fought victory in itself. I think we've reached the point with VS integration that people won't question it and reject it on the spot... but we have entered a place where people feel that it's lacking quality compared to the tooling they're used to.

Debug experience is worlds better than it's been; it's very usable today, and you'll notice I haven't been complaining about it... but it's still imperfect. You should try using VisualD to debug DMD some time. You'll quickly discover edge cases trying to evaluate all the relevant state while stepping around. C++ RTTI is a problem (can't identify the derived type in the debugger), globals are a problem, TLS is a problem. The minor edge cases emerge frequently enough that they convey a sense of immaturity. I'll start taking note of every minor debugging issue I encounter.

One specific issue I frequently encounter is with scope closure. I'd really like to see the line of the closing '}' being emitted as the line where all destructors are called. The workflow in C++, for example:
 - Place a breakpoint on the line of the closing '}' for a scope.
 - The debugger stops at that line prior to execution of the scope's closing destructors; you can use step-in to enter the destructors that are scheduled to run at end of scope.
 - It's very awkward to break and step through destruction paths without being able to place such breakpoints at end-of-scope.
 - Developers expect this, and are surprised when it "doesn't work".
I added a bugzilla for this issue 6 or 7 years ago... but I can't find it.

We could really use robust tools to produce a matching C++ .h file from the extern symbols in a D module, and likewise a .di file from a C/C++ header.
This category of tooling has many existing instances; it's overwhelming. As far as I can tell, they're all experimental, or broken, or designed only to target a specific application (DMD), etc. I think it would be an advantage if there were *one* tool, that works, in both directions, which is advertised, recommended, and maintained by the core group. I've evaluated them all before at various points in time, but the landscape keeps shifting. Someone, please give me a prominent hyperlink to the solution that *works*, and instructions. I don't have the energy to re-do comprehensive evaluations on this front; I'm spent. Agree on one, make it the official one, and then track bugs against it in bugzilla?

FWIW, my recent flurry of extern(C++)-related PRs on DMD was only made possible by the current state of the VS tooling. There is no way I would have taken the time to do that work if the VS tooling wasn't where it is. Those PRs are a direct result of VS tooling reaching relative maturity; they would not have happened if the VS project didn't just load and the debugger just work. The bar for tooling is very high, and we VS developers are quite precious, and soft ;)

VisualD:
 - Installer needs to be signed alongside the DMD installer.
 - Autocomplete/Intellisense is very unreliable. I don't think it has clear visibility through enough meta constructs. Today's solutions appear ad-hoc; I suspect they could receive a lot more assistance from DMD.
 - Go-to-definition must _always_ work.
 - Syntax highlighting only performs a fraction of what VisualAssist does. Colleagues that have been experimenting with D consistently complain that they feel colour-blind.
 - Find-all-references should locate all symbol references in a project.
 - Refactoring tools (rename symbol throughout project, etc).

Rainer has done amazing work; he's tirelessly helpful, but it feels unfair to dump this whole load onto him. I'm not sure who or how to make these things a priority for development.
A robust 'intellisense' solution (MS nomenclature) in the form of a library would help everyone. It should support:
 * Comprehensive syntax highlighting which is *correct*, not just best-guess.
 * Auto-complete: when I press '.', show all valid identifiers, and no spurious entries that aren't. Current tooling misses heaps of stuff, and also tends to pollute the list with a lot of junk.
 * Go-to definition (ie, locate symbol) must reliably work.
 * Refactor tools; eg, rename identifier throughout code.
 * Suggest imports when calling functions that aren't in scope.
 * Clean up spurious imports.

People learn to lean on their tooling very heavily, and they'll feel a strong sense of loss. This puts a further burden on 'D the language' to be _that much better_ to compensate for any such sense of loss in tooling. The last few experiences I list are imperfect in C++; D can theoretically (and should) do a much better job, since scoping and language structure are better, and there are no textual includes and macros.

Many of my colleagues are C# programmers being forced to use C++. They *hate* C++ even more than C++ programmers do. This is actually a really interesting opportunity to show them how much better it can be; C# developers set the bar for tooling very high. If we were able to respond with similarly quality tooling, we'd have a real opportunity with them.

There are more things than that though. I'll dump a list of things that are on the front of my mind recently.

Language:

There's the namespace thing and the ref DIP. Please just add support for a non-scoped string namespace beside the existing solution. We know I can't reason this with you... just... please do it, for all of us that actually use it. I'll buy you a beer.

Andrei's copy-ctor and the move-ctor DIP would be super-welcome conclusions. Copy construction is a problem. There's a set of language issues in this space that Andrei knows well, and they're very important.

I'm finding that 'const' is a practical problem. I don't know what to do.
 - Conventional wisdom, "don't use const in D", doesn't work when interacting with C++, which uses const successfully and aggressively.
 - I use a head-const hack: a shim that wraps pointers and forcefully casts away const (technically undefined behaviour; in practice, it works).
 - C++ people lose their minds when they see this. It would be possible to pacify them with a demonstration that D's const is superior, but I don't know how to produce that evidence.
 - I used to buy into D's const philosophy, but I've come to conclude that C++ got it right. Radical solutions should not be off the table...

ARC? Whatever happened to the opAddRef/opDecRef proposal? Was it rejected? Is it canned, or is it just back on the bench? (GC is absolutely off the table for my project; I have no influence on this.)

On DMD:

My project is 100% DLL-based. DLLs must work flawlessly.

For STL interaction (which I'm concluding now):
 - This ABI crash is blocking me: https://issues.dlang.org/show_bug.cgi?id=19179
 - We need a version to detect which CRT was selected at compile time, so we can emit matching data structures in our code.
 - MS .obj files have the _ITERATOR_DEBUG_LEVEL definition emitted to the object file. I think we need a pragma to emit such tokens to the object. I would make the STL modules emit the _ITERATOR_DEBUG_LEVEL the code was generated with, using said pragma.
 - https://issues.dlang.org/show_bug.cgi?id=18999

For this project, I would ban GC, perhaps attempting to use -betterC. I have little practical experience here... but reports are often unhappy. I'll find out, I guess.
This issue: https://issues.dlang.org/show_bug.cgi?id=18896

Less important:

This emerged again recently: https://issues.dlang.org/show_bug.cgi?id=18906
Workarounds for this are problematic: https://issues.dlang.org/show_bug.cgi?id=18845

I wasn't able to conclude some mangling issues during my recent effort; they interacted awkwardly with the build/test scripts and DMC:
 * https://issues.dlang.org/show_bug.cgi?id=18997
 * https://issues.dlang.org/show_bug.cgi?id=18958

I don't think this is exhaustive... but they're at close reach.
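A head-const shim of the kind described earlier might look roughly like this (my sketch, not Manu's actual code; as he notes, mutating truly-const data through it is undefined behaviour):

```d
// Hypothetical sketch of a head-const hack: stores a const pointer but
// hands back a mutable one by casting const away. Only defined behaviour
// if the pointee was not originally immutable.
struct HeadConst(T)
{
    private const(T)* p;

    this(const(T)* ptr) { p = ptr; }

    // Expose the pointee as mutable; relies on the caller knowing the
    // underlying object is actually mutable.
    @property T* get() const
    {
        return cast(T*) p;
    }

    alias get this;   // let the wrapper stand in for the raw pointer
}

void main()
{
    int x = 41;                    // genuinely mutable storage
    const(int)* cp = &x;
    auto h = HeadConst!int(cp);
    *h.get = 42;                   // mutation through the shim
    assert(x == 42);
}
```

This is exactly the kind of cast the spec flags as UB for actually-immutable data, which is why C++ people balk at it.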
Aug 26 2018
next sibling parent reply Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Sunday, 26 August 2018 at 23:39:32 UTC, Manu wrote:
 We could really use robust tools to produce a matching C++ .h 
 file
 from the extern symbols in a D module, and likewise a .di file 
 from a
 C/C++ header.
 This category of tooling has many existing instances, it's
 overwhelming. As far as I can tell, they're all experimental, or
 broken, or designed only to target a specific application 
 (DMD), etc.
 I think it would be an advantage if there were *one* tool, that 
 works,
 in both directions, which is advertised, recommended, and 
 maintained
 by the core group.
 I've evaluated them all before at various points in time, but 
 the
 landscape keeps shifting. Someone, please give me a prominent
 hyperlink to the solution that *works*, and instructions. I 
 don't have
 energy to re-do comprehensive evaluations on this front, I'm 
 spent.
 Agree on one, make it the official one, and then track bugs 
 against it
 in bugzilla?
I know this isn't quite what you asked for, but you should get in contact with Iain to generalise https://github.com/dlang/dmd/pull/8591/files As for the other direction, I'd suggest talking to Atila to get dpp working for your use cases. This is about as official as it's going to get.
Aug 26 2018
parent reply Manu <turkeyman gmail.com> writes:
On Sun, 26 Aug 2018 at 17:20, Nicholas Wilson via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Sunday, 26 August 2018 at 23:39:32 UTC, Manu wrote:
 We could really use robust tools to produce a matching C++ .h
 file
 from the extern symbols in a D module, and likewise a .di file
 from a
 C/C++ header.
 This category of tooling has many existing instances, it's
 overwhelming. As far as I can tell, they're all experimental, or
 broken, or designed only to target a specific application
 (DMD), etc.
 I think it would be an advantage if there were *one* tool, that
 works,
 in both directions, which is advertised, recommended, and
 maintained
 by the core group.
 I've evaluated them all before at various points in time, but
 the
 landscape keeps shifting. Someone, please give me a prominent
 hyperlink to the solution that *works*, and instructions. I
 don't have
 energy to re-do comprehensive evaluations on this front, I'm
 spent.
 Agree on one, make it the official one, and then track bugs
 against it
 in bugzilla?
I know this isn't quite what you asked for, but you should get in contact with Iain to generalise https://github.com/dlang/dmd/pull/8591/files As for the other direction, I'd suggest talking to Atila to get dpp working for your use cases. This is about as official as it's going to get.
Yeah, I've been following both those efforts. I don't have any free time to motivate this stuff on my own right now. I'm just listing all the things (because Walter asked me to).

Incidentally, what's the state of DCompute stuff lately? Did the front-end ever get a polish pass? That's actually another really high-value ticket that I could use to gain a lot of leverage, if it's at a place where you'd want to show it to developers... Mostly, just a good set of step-by-step docs would make all the difference.
Aug 26 2018
parent reply Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Monday, 27 August 2018 at 00:26:35 UTC, Manu wrote:
 Yeah, i've been following both those efforts.
 I don't have any free time to motivate this stuff on my own 
 right now.
 I'm just listing all the things (because Walter asked me to).
Fair enough. I suppose you'd need namespaces working first anyway to get much out of those tools.
 Incidentally, what's the state of DCompute stuff lately? Did the
 front-end ever get a polish pass?
 That's actually another really high-value ticket that I could 
 use to
 gain a lot of leverage, if it's at a place where you'd want to 
 show it
 to developers...
 Mostly, just a good set of step-by-step docs would make all the 
 difference.
Mostly I've been stupidly busy with uni but I'm now FREEEEEEE! I'm going to wait for the LLVM 7 release (very soon) and then get things going again: testing, atomics, docs, API contracts etc. I'll post to Announce when I've done all that. I'm still waiting for the Khronos folks to get back to me to get the SPIRV backend into upstream LLVM. You're mostly interested in CUDA, right? That should be much easier to get shipshape.
Aug 26 2018
next sibling parent Manu <turkeyman gmail.com> writes:
On Sun, 26 Aug 2018 at 17:50, Nicholas Wilson via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Monday, 27 August 2018 at 00:26:35 UTC, Manu wrote:
 Yeah, i've been following both those efforts.
 I don't have any free time to motivate this stuff on my own
 right now.
 I'm just listing all the things (because Walter asked me to).
Fair enough. I suppose you'd need namespaces working first anyway to get much out of those tools.
 Incidentally, what's the state of DCompute stuff lately? Did the
 front-end ever get a polish pass?
 That's actually another really high-value ticket that I could
 use to
 gain a lot of leverage, if it's at a place where you'd want to
 show it
 to developers...
 Mostly, just a good set of step-by-step docs would make all the
 difference.
Mostly I've been stupidly busy with uni but I'm now FREEEEEEE! I'm going to wait for the LLVM 7 release (very soon) and then get things going again: testing, atomics, docs, API contracts etc. I'll post to Announce when I've done all that. I'm still waiting for the Khronos folks to get back to me to get the SPIRV backend into upstream LLVM. You're mostly interested in CUDA, right? That should be much easier to get shipshape.
Actually, I'm really mostly interested in DX12 shader output right now... I think there are tools that can convert LLVM to DX shaders? I haven't looked into it yet, but it's on my backlog. Next would be SPIRV for Vulkan.
Aug 26 2018
prev sibling parent reply Manu <turkeyman gmail.com> writes:
On Sun, 26 Aug 2018 at 18:08, Manu <turkeyman gmail.com> wrote:
 On Sun, 26 Aug 2018 at 17:50, Nicholas Wilson via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Monday, 27 August 2018 at 00:26:35 UTC, Manu wrote:
 Yeah, i've been following both those efforts.
 I don't have any free time to motivate this stuff on my own
 right now.
 I'm just listing all the things (because Walter asked me to).
Fair enough. I suppose you'd need namespaces working first anyway to get much out of those tools.
 Incidentally, what's the state of DCompute stuff lately? Did the
 front-end ever get a polish pass?
 That's actually another really high-value ticket that I could
 use to
 gain a lot of leverage, if it's at a place where you'd want to
 show it
 to developers...
 Mostly, just a good set of step-by-step docs would make all the
 difference.
Mostly I've been stupidly busy with uni but I'm now FREEEEEEE! I'm going to wait for the LLVM 7 release (very soon) and then get things going again: testing, atomics, docs, API contracts etc. I'll post to Announce when I've done all that. I'm still waiting for the Khronos folks to get back to me to get the SPIRV backend into upstream LLVM. You're mostly interested in CUDA, right? That should be much easier to get shipshape.
Actually, I'm really mostly interested in DX12 shader output right now... I think there are tools that can convert LLVM to DX shaders? I haven't looked into it yet, but it's on my backlog. Next would be SPIRV for Vulkan.
This looks promising: https://blogs.msdn.microsoft.com/directx/2017/01/23/new-directx-shader-compiler-based-on-clangllvm-now-available-as-open-source/
Aug 26 2018
parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Monday, 27 August 2018 at 01:10:17 UTC, Manu wrote:
 On Sun, 26 Aug 2018 at 18:08, Manu <turkeyman gmail.com> wrote:
 Actually, I'm really mostly interested in DX12 shader output 
 right
 now... I think there are tools that can convert LLVM to DX 
 shaders? I
 haven't looked into it yet, but it's on my backlog.
 Next would be SPIRV for Vulkan.
This looks promising: https://blogs.msdn.microsoft.com/directx/2017/01/23/new-directx-shader-compiler-based-on-clangllvm-now-available-as-open-source/
I would love to do it, but that repo is _horribly_ organised, looks nothing like a backend, and appears to lack relevant LLVM IR tests. SPIRV for Vulkan would be doable if there is support from the Vulkan folks when the SPIRV backend is upstreamed to LLVM.
Aug 26 2018
prev sibling next sibling parent reply tide <tide tide.tide> writes:
On Sunday, 26 August 2018 at 23:39:32 UTC, Manu wrote:
 You should try using VisualD to debug DMD some time. You'll 
 quickly
 discover edge cases trying to evaluate all the relevant state 
 while
 stepping around. C++ RTTI is a problem (can't identify derived 
 type in
 debugger), globals are a problem, TLS is a problem.
 The minor edge cases emerge frequently enough that they convey 
 a sense
 of immaturity. I'll start taking note everything I encounter a 
 minor
 debugging issue.
It's not just VisualD; the debug info DMD produces just doesn't include things like global variables for some reason. I use VS Code to debug; I get around it by using -gc (which is now deprecated) and adding the variable to the watch list with its fully qualified name. It's a pain in the ass though.
Aug 26 2018
next sibling parent Manu <turkeyman gmail.com> writes:
On Sun, 26 Aug 2018 at 17:40, tide via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Sunday, 26 August 2018 at 23:39:32 UTC, Manu wrote:
 You should try using VisualD to debug DMD some time. You'll
 quickly
 discover edge cases trying to evaluate all the relevant state
 while
 stepping around. C++ RTTI is a problem (can't identify derived
 type in
 debugger), globals are a problem, TLS is a problem.
 The minor edge cases emerge frequently enough that they convey
 a sense
 of immaturity. I'll start taking note everything I encounter a
 minor
 debugging issue.
It's not just VisualD, the debug info DMD produces just doesn't include things like global variables for some reason. I use VS Code to debug, I get around it by using -gc (which is now deprecated) and adding the variable to the watch list with the full name. It's a pain in the ass though.
Right. That's not even remotely acceptable.
Aug 26 2018
prev sibling parent Andre Pany <andre s-e-a-p.de> writes:
On Monday, 27 August 2018 at 00:35:12 UTC, tide wrote:
 On Sunday, 26 August 2018 at 23:39:32 UTC, Manu wrote:
 [...]
It's not just VisualD, the debug info DMD produces just doesn't include things like global variables for some reason. I use VS Code to debug, I get around it by using -gc (which is now deprecated) and adding the variable to the watch list with the full name. It's a pain in the ass though.
Please see this SAoC thread: https://forum.dlang.org/thread/wnjccfdfcptsfefafplr forum.dlang.org

Kind regards
Andre
Aug 27 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
Thanks for the list! This is good stuff.
Aug 26 2018
prev sibling next sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Sunday, August 26, 2018 5:39:32 PM MDT Manu via Digitalmars-d wrote:
   ARC? What ever happened to the opAddRef/opDecRef proposal? Was it
 rejected? Is it canned, or is it just back on the bench? (GC is
 absolutely off the table for my project, I have no influence on this)
I don't know what Walter's current plans are for what any built-in ref-counting solution would look like, but it's my understanding that whatever he was working on was put on hold, because he needed something like DIP 1000 in order to make it work with @safe - which is what then triggered his working on DIP 1000 like he has been. So, presumably, at some point after DIP 1000 is complete and ready, he'll work on the ref-counting stuff again. So, while we may very well get it, I expect that it will be a while. - Jonathan M Davis
Aug 26 2018
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/26/2018 6:09 PM, Jonathan M Davis wrote:
 I don't know what Walter's current plans are for what any built-in
 ref-counting solution would look like, but it's my understanding that
 whatever he was working on was put on hold, because he needed something like
 DIP 1000 in order to make it work with @safe - which is what then triggered
 his working on DIP 1000 like he has been. So, presumably, at some point
 after DIP 1000 is complete and ready, he'll work on the ref-counting stuff
 again. So, while we may very well get it, I expect that it will be a while.
DIP 1000 is needed to make ref counting memory safe. Andrei is working on ref counting, and he's concluded that copy constructors are a key feature to make it work. Postblits are, sadly, a great idea that just turned out to be a failure. Hence, DIP 1000 and copy constructors (Razvan is working on that) are key technologies.
Aug 29 2018
prev sibling parent reply Manu <turkeyman gmail.com> writes:
On Sun, 26 Aug 2018 at 18:09, Jonathan M Davis via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Sunday, August 26, 2018 5:39:32 PM MDT Manu via Digitalmars-d wrote:
   ARC? What ever happened to the opAddRef/opDecRef proposal? Was it
 rejected? Is it canned, or is it just back on the bench? (GC is
 absolutely off the table for my project, I have no influence on this)
I don't know what Walter's current plans are for what any built-in ref-counting solution would look like, but it's my understanding that whatever he was working on was put on hold, because he needed something like DIP 1000 in order to make it work with @safe - which is what then triggered his working on DIP 1000 like he has been. So, presumably, at some point after DIP 1000 is complete and ready, he'll work on the ref-counting stuff again. So, while we may very well get it, I expect that it will be a while.
I'm sure I recall experimental patches where those operators were available to try out... was I dreaming? :/
Aug 26 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/26/2018 6:11 PM, Manu wrote:
 I'm sure I recall experimental patches where those operators were
 available to try out... was I dreaming? :/
Andrei has worked on this for a long while, and finally came to the conclusion that ref counting won't work without copy constructors. Hence copy constructors are a priority.
Aug 26 2018
parent Stefam Koch <uplink.coder googlemail.com> writes:
On Monday, 27 August 2018 at 04:56:22 UTC, Walter Bright wrote:
 On 8/26/2018 6:11 PM, Manu wrote:
 I'm sure I recall experimental patches where those operators 
 were
 available to try out... was I dreaming? :/
Andrei has worked on this for a long while, and finally came to the conclusion that ref counting won't work without copy constructors. Hence copy constructors are a priority.
I'd like to know how he arrived at that conclusion.
Aug 27 2018
prev sibling parent reply Laeeth Isharc <Laeeth laeeth.com> writes:
On Sunday, 26 August 2018 at 19:34:39 UTC, Manu wrote:
 On Sun, 26 Aug 2018 at 12:10, RhyS via Digitalmars-d 
 <digitalmars-d puremagic.com> wrote:
 On Sunday, 26 August 2018 at 18:18:04 UTC, drug wrote:
 It's rather funny to see how one man who forced to program in
 programming language he doesn't like can triggers comments 
 from
 lurkers that they don't like D too. No offense.
 D is in great form and is getting much better and better and
 I'd like to ask D community to continue their good work and
 make D great again.
Most people lurking here are people that WANT to use D but are put off by the issues. D is not bad as a language, but it has issues. There are issues at every step in the D ecosystem, and each of those creates a barrier. It's those same issues that never seem to get solved and are second-class citizens compared to adding more "future" features or trying to one-up C++... It's not BetterC or static if or whatever new feature of the month that brings in new people. You can advertise D as much as you want, but when people download D and very few people stay, is that not a hint... The fact that only recently the D poll pointed out that most people are using VSC and not VS. I am like "what, you only figured that out now?". Given the mass popularity of VSC... That alone tells you how much the mindset of D is stuck in a specific eco space.
Industry tends to use VS, because they fork out for the relatively expensive licenses. I work at a company with a thousand engineers, all VS users. D could find a home there if some rough edges were polished, but they *absolutely must be polished* before it would be taken seriously. It is consistently expressed that poor VS integration is an absolute non-starter. While a majority of people (hobbyists?) that take an online poll in an open-source community forum might be VSCode users, that doesn't mean VS is a poor priority target. Is D a hobby project, or an industry solution? I vote the latter. I don't GAF about people's hobbies, I just want to use D to _do my job_. A quality VS experience is critical to D's adoption in that sector. Those 1000 engineers aren't reflected in your poll... would you like them to be?
Do you see a path from here to there that's planned? I think it's very difficult winning over people that expect to see the same degree of polish as in a project that's older and has much more commercial support. In other words, as a thought experiment: if everyone in the community were to stop and work only on VS and debugging polish, how many years would it be before your colleagues were willing to switch? I think it might be a while. I'm not suggesting that polish isn't worth working on, but one might be realistic about what may be achieved. I think D is a classic example of Clayton Christensen's Innovator's Dilemma. In the beginning a certain kind of innovation starts at the fringe. It's inferior along some dimensions compared to the products with high market share and so it gets kind of ignored. But for some particular reasons it has a very high appeal to some groups of people, and so it keeps growing mostly unnoticed and over time expands the niches where it is used. This can keep going for a long time. And then something in the environment changes and it's like it becomes an overnight success. For American cars it was the oil price shock of the 1970s. Japanese cars then might have been seen as inferior, but they were energy efficient and they worked. I think it's possible that for D this will arise from the interaction of data set sizes growing - storage prices drop at 40% a year and somehow people find a way to use that cheaper storage - whilst processing power and memory latency and bandwidth are a sadder tale. But it might be something else. So people who say that there is no place for D in the kind of work they do might sometimes be right. Frustrating, because if only the polish were there; but polish is a lot of work and not everyone is interested in it. 
They might not be right about broader adoption because the world is a very big place,most people don't talk about their work, and because some of the factors that present huge obstacles in some environments simply don't apply in others. Thinking about frustrations as an entrepreneurial challenge may be ultimately more generative than just hoping someone will do something. I do wonder if there isn't an opportunity in organising people from the community to work on projects that enterprise users would find valuable but that won't get done otherwise. Organising the work might not be difficult, but it takes time and attention, which enterprise users are not long on.
Aug 26 2018
parent Sjoerd Nijboer <dlang sjoerdnijboer.com> writes:
On Monday, 27 August 2018 at 01:45:37 UTC, Laeeth Isharc wrote:
 I think D is a classic example of Clayton Christensen's Innovator's Dilemma. In the beginning a certain kind of innovation starts at the fringe. It's inferior along some dimensions compared to the products with high market share and so it gets kind of ignored.
Those inferior dimensions for me are productivity in the form of code navigation, proper autocompletion, a package manager with a wizard, stability of the toolchain, auto C/C++/D header translation and a project manager with a wizard. All with proper GUI support, since I'm IDE challenged. I literally can't work without one. I guess I could, but someone would have to invest serious time in helping me out with the issues I face for getting up to a level of skill where I would be productive in D without an IDE. I can't create or improve this tooling by myself because I don't have the necessary knowledge and skill required for said tooling, but I'm willing to back some kind of crowdfunded kickstarter if needed. Don't get me wrong, every few months when I check D's ecosystem out I see massive improvements, and I acknowledge that there is a good amount of tooling for D already; it just isn't up to a standard for me where I can say that D is a good alternative for the use cases where it should shine brightly.
Aug 26 2018
prev sibling parent reply Radu <rad.racariu gmail.com> writes:
On Sunday, 26 August 2018 at 16:09:35 UTC, lurker wrote:
 [...]
Yeah right... I'm sure this is what everyone experienced. This thread has become the trollers trolling playground.
Aug 26 2018
next sibling parent lurker <lurker aol.com> writes:
On Sunday, 26 August 2018 at 19:28:30 UTC, Radu wrote:
 On Sunday, 26 August 2018 at 16:09:35 UTC, lurker wrote:
 [...]
Yeah right... I'm sure this is what everyone experienced. This thread has become the trollers trolling playground.
no, i used to use d1 a lot and then d2 changed everything with its promise. but it turned into a script-like monster that doesn't work very well, with tons of features. d2 is unreliable, doesn't install correctly, dll problems ..... no company i ever worked at used linux. they all use vs with windows. they want to make money and not go for the experiments of some compiler freaks. it is important that you can have a quality experience on windows with vs, be able to use dlls and have a stable compiler with predictable use; if possible be able not to use the gc, but the gc wouldn't even be the hang-up. well, you are correct, i have a different opinion and experience with d2 - so i must be a troll. anyway, radu, i wish you good luck on a larger commercial program on windows with d2. over and out.
Aug 26 2018
prev sibling parent reply Ali <fakeemail example.com> writes:
On Sunday, 26 August 2018 at 19:28:30 UTC, Radu wrote:
 On Sunday, 26 August 2018 at 16:09:35 UTC, lurker wrote:
 [...]
Yeah right... I'm sure this is what everyone experienced. This thread has become the trollers trolling playground.
Radu, when I started using this forum, one of the first questions I asked was: How do you use D? https://forum.dlang.org/thread/vyrjquwvwohapjnlmvvv forum.dlang.org And the majority replied that they are using it for personal and hobby projects. Who is really trolling who?
Aug 26 2018
parent Radu <rad.racariu gmail.com> writes:
On Sunday, 26 August 2018 at 20:21:07 UTC, Ali wrote:
 On Sunday, 26 August 2018 at 19:28:30 UTC, Radu wrote:
 On Sunday, 26 August 2018 at 16:09:35 UTC, lurker wrote:
 [...]
Yeah right... I'm sure this is what everyone experienced. This thread has become the trollers trolling playground.
Radu, when I started using this forum, one of the first questions I asked was: How do you use D? https://forum.dlang.org/thread/vyrjquwvwohapjnlmvvv forum.dlang.org And the majority replied that they are using it for personal and hobby projects. Who is really trolling who?
The experience described by the OP makes one question why the hell this language still exists and still has users. Plus, I found it extremely offensive to the people working to make Dlang better. On your little divergent point: not everyone answers forum polls, and not everyone working on commercial projects wants to or can comment on them. I hope that at least 10 percent of the energy spent on venting on the forums can be spent on bug reports, PRs and some donations.
Aug 26 2018
prev sibling parent Andre Pany <andre s-e-a-p.de> writes:
On Sunday, 26 August 2018 at 13:40:17 UTC, Chris wrote:
 On Sunday, 26 August 2018 at 08:40:32 UTC, Andre Pany wrote:
 [...]
No. Nobody forces you to use the latest version that may have an improved GC, new library functions or bug fixes. In fact, why bother with improving the language at all? But how do you feel about code that you've been compiling with, say, dmd 2.071.2 for years now - including workarounds for compiler bugs? Doesn't the thought of having to upgrade it one day bother you at all? What if your customer said that 2.08++ had better features, asking you to use them? The burden of finding paths to handle deprecations etc. is on the user, not the language developers. And this is where the psychological factor that Laeeth was talking about comes in. If you're constantly programming thinking "Whatever I write today might break tomorrow, uh, and what about the code I wrote in 2016? Well, I'll have to upgrade it one day, when I have time. I'll just keep on using an older version of dmd for now. Yeah, no, I cannot benefit from the latest improvements but at least it compiles with dmd2.st0neage. But why worry, I'll just have to get used to the fact that I have different code for different versions, for now...and forever." You can get used to anything until you find out that it doesn't need to be this way. You write unexciting Java code and hey, it works and it always will. It took me a while to understand why Java has been so successful, but now I know. It's not write once, run everywhere. It's write once, run forever. Stability, predictability. And maybe that's why Java, Go and, at one time, C++ prefer a slower pace. I just don't understand why it is so hard to understand the points I and others have made. It's not rocket science, but maybe this is the problem, because I already see, the point to take home is: There are no real problems, we are just imagining them. Real world experience doesn't count, because we just don't see the bigger picture, which is the eternal glory of academic discussions about half-baked features of an eternally unfinished language that keeps changing randomly. 
Not practical, but intellectually satisfying.
I really like new features; for new projects I also consider using the latest stable dmd version (2.xx.1 or 2.xx.2) if there aren't any known issues. For legacy coding I do the math: are the new features, GC improvements, ... worth the time (= money)? I can also decide to upgrade every 5 releases, but only if it's worth the investment. I want to stress that the upgrade is fully in the hands of the developer and the decision can be made on costs and benefits. My opinion might be very optimistic, but I feel some opinions in this thread are rather pessimistic. Kind regards Andre
Aug 26 2018
prev sibling parent reply Laeeth Isharc <Laeeth laeeth.com> writes:
On Sunday, 26 August 2018 at 08:40:32 UTC, Andre Pany wrote:
 On Saturday, 25 August 2018 at 20:52:06 UTC, Walter Bright 
 wrote:
 On 8/25/2018 3:52 AM, Chris wrote:
 On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright 
 wrote:
 Every programmer who says this also demands new (and 
 breaking) features.
"Every programmer who..." Really?
You want to remove autodecoding (so do I) and that will break just about every D program in existence. For everyone else, it's something else that's just as important to them. For example, Shachar wants partially constructed objects to be partially destructed, a quite reasonable request. Ok, but consider the breakage:

struct S {
    ~this() {}
}

class C {
    S s;
    this() nothrow {}
}

I.e. a nothrow constructor now must call a throwing destructor. This is not some made-up example; it breaks existing code: https://github.com/dlang/dmd/pull/6816 If I fix the bug, I break existing code, and apparently a substantial amount of existing code. What's your advice on how to proceed with this?
In the whole discussion I miss two really important things. If your product compiles fine with a dmd version, no one forces you to update to the next dmd version. In the company I work for, we set for each project the DMD version in the build settings. The speed of DMD releases or breaking changes doesn't affect us at all. Maybe I do not know a lot of open source products, but the amount of work which goes into code quality is extremely high for the compiler, runtime, phobos and related products. I love to see how much work is invested in unit tests and also code style. DMD (and LDC and GDC) has greatly improved in the last years in various aspects. But I also see that there is a lot of work to be done. There are definitely problems to be solved. It is sad that people like Dicebot are leaving the D community. Kind regards Andre
Dicebot should speak for himself as he wishes. But I was entertained by the simultaneous posting by someone else of a blog post from a while back with him asking for comments on the early release of dtoh, a tool intended in time to be integrated into DMD given its design. I don't think he was very happy about the process around DIP1000, but I am not myself well placed to judge. In any case, languages aren't in a death match where there can be only one survivor. Apart from anything else, does anyone really think less code will be written in the future than in the past, or that there will be fewer people who write code as part of what they do but aren't career programmers? I probably have an intermediate experience between you and Jon Degenhardt on the one hand and those complaining about breakage. Some of it was self-inflicted, because on the biggest project we have 200k SLoC, a good part of which I wrote myself pretty quickly, and the build system has been improvised since and could still be better. The Linux builder is a docker container created nightly; it's taking longer to do something similar on the Windows side, where, funnily enough, the bigger problems are. Often in the past it was little things, like dub turning relative paths into absolute ones, creating huge paths that broke DMD's path limit till we got fed up and decided to fix it ourselves. (Did you know there are six extant ways of handling paths on Windows?) Dub dependency resolution has been tough. It might be better now. I appreciate it's a tough problem, but somehow e.g. Maven is quick (it might well cheat by solving an easier problem). And quite a lot of breakage in vibe. But nobody forces you to use vibe and there do exist other options for many things. Overall though, it's not that bad depending on your use case. Everything has problems, but also everyone has a different kind of sensitivity to different kinds of problems. 
For me, DPP makes a huge difference because I now know it's pretty likely I can just #include a C library if that's the best option, and in my experience it mostly just works. The plasticity, coherence and readability of D code dominates the difficulties for quite a few things I am doing. Might not be the case for others, because everyone is different. The cost of my time in the present context dominates the cost of several programmers' time, but I don't think that's a necessary part of why D makes sense for some things for us. I think by the end of this year we might have eleven people including me writing D at least sometimes, from only me about 18 months ago. That's people working from the office, including practitioners who write code, and add a handful of remote consultants who only write D to that. There's no question from my perspective that D is much better than a year ago and unimaginably better from when I first picked it up in 2014. One can't be serious suggesting that D isn't flourishing as far as adoption goes. The forum really isn't the place to assess what people using the language at work feel. Almost nobody working from our offices is active in the forums, and that's the impression I get speaking to other enterprise users. People have work to do, unfortunately! I wonder, if the budget was there, whether it would be possible to find someone even half as productive as Seb Wilzbach to help full-time, because whilst some of the problems raised are very difficult ones, others might just be a matter of (very high quality) manpower. Michael Parker's involvement has also made a huge difference to the public profile of D. I definitely think a stable version with fixes backported would be great if feasible. I don't really get the hate for betterC, even though I don't directly use it myself in code I write. 
It's useful directly for lots of things like web assembly and embedded, and a side effect of Seb's work on betterC testing for Phobos will, I guess, be that it's much clearer how much can be used without depending on the GC, because that's what betterC also implies. Is it really such an expensive effort? Beyond real factors it also helps with perception and social factors. I feel like I came across more HFT programmers, or people who claim to be such, on Reddit, who could, they say, never even stare too closely at a GC language, than in the industry itself! Would be great if Manu's work on STL and extern (C++) comes to fruition. I think DPP will work for much more of C++ in time, though it might be quite some time. I wonder if we are approaching the point where enterprise crowd-funding of missing features or capabilities in the ecosystem could make sense. If you look at how Liran managed to find David Nadlinger to help him, it could just be in part a matter of lacking social organisation preventing the market from addressing unfulfilled mutual coincidences of wants. Lots of capable people would like to work full time programming in D. Enough firms would like some improvements made. It takes work to organise these things. If I were a student I might be trying to see if there was an opportunity there.
Aug 26 2018
next sibling parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Monday, 27 August 2018 at 01:15:49 UTC, Laeeth Isharc wrote:

It's simple, I went to GitLab to see the code of the tool, and I 
found the articles among the other projects of the author.

 I don't think he was very happy about the process around 
 DIP1000 but I am not myself well placed to judge.
The whole story is pretty simple [1]. From my perspective, the request was to confine feature development to a separate branch, so as not to impact language adopters. Pay-as-you-go all the way down, also for the compiler/rt/phobos codebase itself.
 I definitely think a stable version with backfixes ported would 
 be great if feasible.
The other way round: "keep it in sync with specification document, design set of acceptance tests and do all the development in a separate branch until is verified to both have desired semantics and don't cause any breakage in existing projects." [2] I would like your opinion on that specific request, "keep it in sync with specification document" versus "bureaucracy" [3]
 I wonder if we are approaching the point where enterprise 
 crowd-funding of missing features or capabilities in the 
 ecosystem could make sense.  If you look at how Liran managed 
 to find David Nadlinger to help him, it could just be in part a 
 matter of lacking social organisation preventing the market 
 addressing unfulfilled mutual coincidences of wants.  Lots of 
 capable people would like to work full time programming in D.  
 Enough firms would like some improvements made.  If takes work 
 to organise these things.  If I were a student I might be 
 trying to see if there was an opportunity there.
That would be great for the ecosystem, for the language... [4] [1] https://forum.dlang.org/thread/o62rml$mju$1 digitalmars.com [2] https://forum.dlang.org/post/o6fih1$2b14$1 digitalmars.com [3] https://github.com/dlang/dmd/pull/8346 [4] https://forum.dlang.org/post/detxilaksggqsrdaogri forum.dlang.org /Paolo
Aug 27 2018
prev sibling next sibling parent reply Chris <wendlec tcd.ie> writes:
On Monday, 27 August 2018 at 01:15:49 UTC, Laeeth Isharc wrote:

 I wonder if we are approaching the point where enterprise 
 crowd-funding of missing features or capabilities in the 
 ecosystem could make sense.  If you look at how Liran managed 
 to find David Nadlinger to help him, it could just be in part a 
 matter of lacking social organisation preventing the market 
 addressing unfulfilled mutual coincidences of wants.  Lots of 
 capable people would like to work full time programming in D.  
 Enough firms would like some improvements made.  If takes work 
 to organise these things.  If I were a student I might be 
 trying to see if there was an opportunity there.
I think D has reached the point where that'd make perfect sense. Move from the garage to a proper factory :)
Aug 27 2018
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 27 August 2018 at 09:36:43 UTC, Chris wrote:
 On Monday, 27 August 2018 at 01:15:49 UTC, Laeeth Isharc wrote:

 [...]
I think D has reached the point where that'd make perfect sense. Move from the garage to a proper factory :)
Who's going to pay for the factory? -Alex
Aug 27 2018
parent reply Chris <wendlec tcd.ie> writes:
On Monday, 27 August 2018 at 13:48:42 UTC, 12345swordy wrote:
 On Monday, 27 August 2018 at 09:36:43 UTC, Chris wrote:
 On Monday, 27 August 2018 at 01:15:49 UTC, Laeeth Isharc wrote:

 [...]
I think D has reached the point where that'd make perfect sense. Move from the garage to a proper factory :)
Who's going to pay for the factory? -Alex
That's for the D Foundation to figure out. There's a reason we have a D Foundation now, isn't there?
Aug 27 2018
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 27 August 2018 at 14:26:08 UTC, Chris wrote:
 On Monday, 27 August 2018 at 13:48:42 UTC, 12345swordy wrote:
 On Monday, 27 August 2018 at 09:36:43 UTC, Chris wrote:
 On Monday, 27 August 2018 at 01:15:49 UTC, Laeeth Isharc 
 wrote:

 [...]
I think D has reached the point where that'd make perfect sense. Move from the garage to a proper factory :)
Who's going to pay for the factory? -Alex
That's for the D Foundation to figure out. There's a reason we have a D Foundation now, isn't there?
The annual monthly budget is around 4K$. https://opencollective.com/dlang# -Alex
Aug 27 2018
parent reply Joakim <dlang joakim.fea.st> writes:
On Monday, 27 August 2018 at 16:15:37 UTC, 12345swordy wrote:
 On Monday, 27 August 2018 at 14:26:08 UTC, Chris wrote:
 On Monday, 27 August 2018 at 13:48:42 UTC, 12345swordy wrote:
 On Monday, 27 August 2018 at 09:36:43 UTC, Chris wrote:
 On Monday, 27 August 2018 at 01:15:49 UTC, Laeeth Isharc 
 wrote:

 [...]
I think D has reached the point where that'd make perfect sense. Move from the garage to a proper factory :)
Who's going to pay for the factory? -Alex
That's for the D Foundation to figure out. There's a reason we have a D Foundation now, isn't there?
The annual monthly budget is around 4K$. https://opencollective.com/dlang# -Alex
"annual monthly?" Look again: https://wiki.dlang.org/Vision/2018H1#H2_2017_Review
Aug 27 2018
parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 27 August 2018 at 16:32:15 UTC, Joakim wrote:
 On Monday, 27 August 2018 at 16:15:37 UTC, 12345swordy wrote:
 On Monday, 27 August 2018 at 14:26:08 UTC, Chris wrote:
 On Monday, 27 August 2018 at 13:48:42 UTC, 12345swordy wrote:
 On Monday, 27 August 2018 at 09:36:43 UTC, Chris wrote:
 [...]
Who's going to pay for the factory? -Alex
That's for the D Foundation to figure out. There's a reason we have a D Foundation now, isn't there?
The annual monthly budget is around 4K$. https://opencollective.com/dlang# -Alex
"annual monthly?" Look again: https://wiki.dlang.org/Vision/2018H1#H2_2017_Review
I was merely using the value that the Open Collective site gave me. Regardless, $1605 is far from enough money to hire full-time workers, as Chris has suggested. -Alex
Aug 27 2018
parent reply Chris <wendlec tcd.ie> writes:
On Monday, 27 August 2018 at 18:02:21 UTC, 12345swordy wrote:
 On Monday, 27 August 2018 at 16:32:15 UTC, Joakim wrote:
 On Monday, 27 August 2018 at 16:15:37 UTC, 12345swordy wrote:
 On Monday, 27 August 2018 at 14:26:08 UTC, Chris wrote:
 On Monday, 27 August 2018 at 13:48:42 UTC, 12345swordy wrote:
 On Monday, 27 August 2018 at 09:36:43 UTC, Chris wrote:
 [...]
Who's going to pay for the factory? -Alex
That's for the D Foundation to figure out. There's a reason we have a D Foundation now, isn't there?
The annual monthly budget is around 4K$. https://opencollective.com/dlang# -Alex
"annual monthly?" Look again: https://wiki.dlang.org/Vision/2018H1#H2_2017_Review
I was merely using the value that the Open Collective site gave me. Regardless, $1605 is far from enough money to hire full-time workers, as Chris has suggested. -Alex
Then the D Foundation should work on it. Get companies on board etc. All I hear is "we don't have enough money, we depend on the good will of our community members..." Then leave it. There's no way D can compete with languages that are backed by companies and that have additional benefits like targeting Android, iOS and the Web, e.g. Kotlin. What does D have to offer? Sure, nice features, but what's the point if you cannot use the language anywhere you like and have broken basics like autodecode? Also, some of the nice features (the more useful ones) are adopted by other languages as time goes by. So there will remain no compelling reason to choose D over other languages - if it goes on like this.
Aug 27 2018
next sibling parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 27 August 2018 at 18:20:04 UTC, Chris wrote:
 Then the D Foundation should work on it.
Easier said than done. You can't go around demanding people build factories without addressing the issues that come with building factories, such as the big question of how it is going to be paid for. -Alex
Aug 27 2018
next sibling parent Chris <wendlec tcd.ie> writes:
On Monday, 27 August 2018 at 19:51:52 UTC, 12345swordy wrote:
 On Monday, 27 August 2018 at 18:20:04 UTC, Chris wrote:
 Then the D Foundation should work on it.
Easier said than done. You can't go around demanding people build factories without addressing the issues that come with building factories, such as the big question of how it is going to be paid for. -Alex
But it needs to be done, right? I suppose we agree on this one. Now that we have the D Foundation, its efforts should also go into the "business" side of things, not just features.
Aug 27 2018
prev sibling parent reply Ali <fakeemail example.com> writes:
On Monday, 27 August 2018 at 19:51:52 UTC, 12345swordy wrote:
 On Monday, 27 August 2018 at 18:20:04 UTC, Chris wrote:
 Then the D Foundation should work on it.
Easier said than done. You can't go around demanding people build factories without addressing the issues that come with building factories, such as the big question of how it is going to be paid for. -Alex
No one is (and no one should be) demanding anything; hoping, maybe. Walter wants to build D, and he is doing what he can to continue building it. Andrei and many others joined him. If we are sharing our opinion, it's not coming from any sense of entitlement; we are sharing our opinion because the builders provided the platform for us to voice it. And again, because I keep repeating this: if they want more donations, I think talking more about future plans will help. D currently has neither a large user base nor an ambitious future plan, so it makes sense that they are not getting a lot of donations; they are not really making it attractive to donate. I think that most current donors are probably incentivized by negative factors, or negatively motivated: they are probably afraid D's development will stop, or they feel guilty for using the language and not providing much back. I don't think many donors are doing so because they are excited about the future. There is nothing seriously wrong with negative motivation, it works, but positive motivation is better.
Aug 27 2018
parent Laeeth Isharc <Laeeth laeeth.com> writes:
On Monday, 27 August 2018 at 20:15:03 UTC, Ali wrote:
 On Monday, 27 August 2018 at 19:51:52 UTC, 12345swordy wrote:
 On Monday, 27 August 2018 at 18:20:04 UTC, Chris wrote:
 Then the D Foundation should work on it.
Easier said than done. You can't go around demanding that people build factories without addressing the issues that come with building factories, such as the big question of how it is going to be paid for. -Alex
No one is (and no one should be) demanding anything; hoping, maybe. Walter wants to build D, and he is doing what he can to continue building it. Andrei and many others joined him. If we are sharing our opinion, it's not coming from any sense of entitlement; we are sharing our opinion because the builders provided the platform for us to voice it. And again, because I keep repeating this: if they want more donations, I think talking more about future plans will help. D currently has neither a large user base nor an ambitious future plan, so it makes sense that they are not getting a lot of donations; they are not really making it attractive to donate. I think that most current donors are probably incentivized by negative factors, or negatively motivated: they are probably afraid D's development will stop, or they feel guilty for using the language and not providing much back. I don't think many donors are doing so because they are excited about the future. There is nothing seriously wrong with negative motivation, it works, but positive motivation is better.
I donate to the D Foundation via my personal consulting company, though it is listed under the name of Symmetry Investments. I see that I am the second biggest donor after Andrei. I think I can have more insight into my motivations than you can, and I can say that I am motivated by enthusiasm about commercial benefits; it wouldn't have occurred to me to donate out of fear, as you suggest. I am in a business where the custom is that if one makes a mistake, one fixes the mistake and moves on. Suppose it were to turn out to have been a mistake to use D. Well, I have made costlier mistakes than that this year, and it's only August. And, as it happens, I don't think it was a mistake. So you may think what you wish about the motivations of donors, but I think you might do well to base your views on evidence, not imaginings, if you wish to be taken seriously :)
Aug 29 2018
prev sibling parent reply RhyS <sale rhysoft.com> writes:
On Monday, 27 August 2018 at 18:20:04 UTC, Chris wrote:
 Then the D Foundation should work on it. Get companies on board 
 etc. All I hear is "we don't have enough money, we depend on 
 the good will of our community members..." Then leave it. 
 There's no way D can compete with languages that are backed by 
 companies and that have additional benefits like targeting 
 Android, iOS and the Web, e.g. Kotlin.
My question becomes: how is it possible that D supposedly has an income of only 3.2K (opencollective)? When I look at Crystal, they have 2.5k (bountysource) and another 2k from a single company. By that measure Crystal, which is at a 0.26 release, is out-funding D by 50%... Julia, on the other hand, raised $4.6 million in funding in 2017...
Aug 27 2018
parent Sjoerd Nijboer <dlang sjoerdnijboer.com> writes:
On Monday, 27 August 2018 at 21:34:53 UTC, RhyS wrote:
 My question becomes, how is it possible that D supposedly only 
 has a income of 3.2K ( opencollective ).
Well, one could acquire grants from Mozilla through MOSS or The Linux Foundation for development. One could market D as the very next JavaScript through LLVM and WebAssembly, for which it's actually really good. Or, one could go with a few benchmarks of productivity, code quality and operation cost to one of the big companies with a lot of custom software running on cloud servers. Being able to say that your solution can approach C/C++ speeds while having Java-like development speeds and in general better maintainability than Java is a huge plus for them. They might want to invest in D so that they might have a better tool in one or two years, plus dedication from the community. Granted, it's not as simple as that. Goals need to be set, and the roadmap must become clear. And Andrei and Walter need to say yes or no to a plan without knowing whether it'll necessarily work out. But for exactly this, a manager for D would be perfect. Because the point is that the language must become better, even if that results in postponing desperately wanted features. (Preferably not)
Aug 27 2018
prev sibling parent Basile B. <b2.temp gmx.com> writes:
On Monday, 27 August 2018 at 01:15:49 UTC, Laeeth Isharc wrote:
 There's no question from my perspective that D is much better 
 than a year ago and unimaginably better from when I first 
 picked it up in 2014.  One can't be serious suggesting that D 
 isn't flourishing as far as adoption goes.
I agree, things have got better since then, despite what's called "anarchy". We can see that the global objectives are being reached and worked on, despite the many small 'out of context' contributions, which are however necessary and painful work required to fix what was done when the development process was less mature (i.e. the technical debt accumulated when CI and reviews were less rigorous).
Aug 27 2018
prev sibling next sibling parent reply Chris <wendlec tcd.ie> writes:
On Saturday, 25 August 2018 at 20:52:06 UTC, Walter Bright wrote:
 You want to remove autodecoding (so do I) and that will break 
 just about every D program in existence. For everyone else, 
 it's something else that's just as important to them.
I wanted to get rid of autodecode and I even offered to test it on my string-heavy code to see what breaks (and maybe write guidelines for the transition), but somehow the whole idea of getting rid of autodecode was silently abandoned. What more could I do? I am fully aware of the fact that it's always something else to someone else. But you can have a list of common-sense priorities. For example, fast and efficient string handling should be up there, because in times of data mining and machine learning (cf. translation tools) string handling is _very_ important. My suggestions are:

- breakage only where it is absolutely necessary
- new features as opt-in, not as breaking features
- new versions of dmd should be more tolerant of older code. Say, let's introduce a _guaranteed_ backwards compatibility of at least N versions. In this way one could work with the latest version (and benefit from new features, bug fixes, optimizations) and bit by bit upgrade old code as you go along.
- maybe introduce a feature freeze for a while and have a poll about which features are really being used on a day-to-day basis. In this way you can spend time on fixing stuff that is really needed instead of wasting time on features that are not really used that often (as seems to be the case with @safe)
- stick to a list of clear goals instead of taking on board random requests, interesting as they may be. In other words: more focused work, harden your heart a little bit ;)
 For example, Shachar wants partially constructed objects to be 
 partially destructed, a quite reasonable request. Ok, but 
 consider the breakage:

   struct S {
     ~this() {}
   }

   class C {
     S s;

     this() nothrow {}
   }

 I.e. a nothrow constructor now must call a throwing destructor. 
 This is not some made up example, it breaks existing code:

   https://github.com/dlang/dmd/pull/6816

 If I fix the bug, I break existing code, and apparently a 
 substantial amount of existing code. What's your advice on how 
 to proceed with this?
I have no opinion on this. But it's indicative of the way D is being developed atm. A half-baked feature (partially constructed objects with no partial destructor) that, if you fix it, breaks existing code. But what was it introduced for at all, and why wasn't it thought through properly right from the start? See, if I donate (a humble amount of) money to the D Foundation I don't want to see it used on experimental container library optimizations. I want it to be spent on D being turned into a sound and stable language with certain guarantees.
Aug 26 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/26/2018 8:43 AM, Chris wrote:
 I wanted to get rid of autodecode and I even offered to test it on my string 
 heavy code to see what breaks (and maybe write guidelines for the transition), 
 but somehow the whole idea of getting rid of autodecode was silently
abandoned. 
 What more could I do?
It's not silently abandoned. It will break just about every D program out there. I have a hard time with the idea that breakage of old code is inexcusable, so let's break every old program?
 But what for was it 
 introduced at all and why wasn't it thought through properly right from the
start?
Because nobody thought about that issue before. A lot of things only become apparent in hindsight.
Aug 26 2018
next sibling parent reply FeepingCreature <feepingcreature gmail.de> writes:
On Sunday, 26 August 2018 at 22:44:05 UTC, Walter Bright wrote:
 On 8/26/2018 8:43 AM, Chris wrote:
 I wanted to get rid of autodecode and I even offered to test 
 it on my string heavy code to see what breaks (and maybe write 
 guidelines for the transition), but somehow the whole idea of 
 getting rid of autodecode was silently abandoned. What more 
 could I do?
It's not silently abandoned. It will break just about every D program out there. I have a hard time with the idea that breakage of old code is inexcusable, so let's break every old program?
Can I just throw in here that I like autodecoding and I think it's good? If you want ranges that iterate over bytes, then just use arrays of bytes. If you want Latin1 text, use Latin1 strings. If you want Unicode, you get Unicode iteration. This seems right and proper to me. Hell I'd love if the language was *more* aggressive about validating casts to strings.
Aug 26 2018
next sibling parent reply Neia Neutuladh <neia ikeran.org> writes:
On Sunday, 26 August 2018 at 23:12:10 UTC, FeepingCreature wrote:
 On Sunday, 26 August 2018 at 22:44:05 UTC, Walter Bright wrote:
 On 8/26/2018 8:43 AM, Chris wrote:
 I wanted to get rid of autodecode and I even offered to test 
 it on my string heavy code to see what breaks (and maybe 
 write guidelines for the transition), but somehow the whole 
 idea of getting rid of autodecode was silently abandoned. 
 What more could I do?
It's not silently abandoned. It will break just about every D program out there. I have a hard time with the idea that breakage of old code is inexcusable, so let's break every old program?
Can I just throw in here that I like autodecoding and I think it's good? If you want ranges that iterate over bytes, then just use arrays of bytes. If you want Latin1 text, use Latin1 strings. If you want Unicode, you get Unicode iteration. This seems right and proper to me. Hell I'd love if the language was *more* aggressive about validating casts to strings.
Same here. I do make unicode errors more often than I'd care to admit (someString[$-1] being the most common; I need to write a lastChar helper function), but autodecoding means I can avoid that class of errors.
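To make that failure mode concrete: `someString[$-1]` yields a single UTF-8 code unit, not a character. Below is a sketch of such a `lastChar` helper (the name comes from the post above; the implementation here is only an illustration, using Phobos's `std.uni.graphemeStride`):

```d
import std.uni : graphemeStride;

/// Sketch of the lastChar helper mentioned above: returns the last
/// user-perceived character (grapheme) of the string as a slice.
string lastChar(string s)
{
    size_t i = 0, start = 0;
    while (i < s.length)
    {
        start = i;
        i += graphemeStride(s, i);  // code units in the grapheme at i
    }
    return s[start .. i];
}

void main()
{
    string s = "caf\u00e9";
    // Plain indexing gives a UTF-8 code unit, here the trailing
    // byte of the two-byte encoding of 'é', not the character:
    assert(s[$ - 1] != '\u00e9');
    assert(lastChar(s) == "\u00e9");
}
```

Note that this walks the whole string; for data known to be ASCII, plain indexing is of course fine.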
Aug 26 2018
next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Sunday, August 26, 2018 7:25:12 PM MDT Neia Neutuladh via Digitalmars-d 
wrote:
 Can I just throw in here that I like autodecoding and I think
 it's good?
 If you want ranges that iterate over bytes, then just use
 arrays of bytes. If you want Latin1 text, use Latin1 strings.
 If you want Unicode, you get Unicode iteration. This seems
 right and proper to me. Hell I'd love if the language was
 *more* aggressive about validating casts to strings.
Same here. I do make unicode errors more often than I'd care to admit (someString[$-1] being the most common; I need to write a lastChar helper function), but autodecoding means I can avoid that class of errors.
Except that it doesn't really. It just makes it so that you make those errors at the code point level instead of at the code unit level, where the error is less obvious, because it works correctly for more characters. But it's still wrong in general. e.g. code using auto-decoding is generally going to be horribly broken when operating on Hebrew text because of all of the combining characters that get used there. - Jonathan M Davis
Aug 26 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/26/2018 6:25 PM, Neia Neutuladh wrote:
 Same here. I do make unicode errors more often than I'd care to admit 
 (someString[$-1] being the most common; I need to write a lastChar helper 
 function), but autodecoding means I can avoid that class of errors.
Autodecoding doesn't prevent you from incorrectly using indices like [$-1].
Aug 27 2018
prev sibling next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Sunday, August 26, 2018 5:12:10 PM MDT FeepingCreature via Digitalmars-d 
wrote:
 On Sunday, 26 August 2018 at 22:44:05 UTC, Walter Bright wrote:
 On 8/26/2018 8:43 AM, Chris wrote:
 I wanted to get rid of autodecode and I even offered to test
 it on my string heavy code to see what breaks (and maybe write
 guidelines for the transition), but somehow the whole idea of
 getting rid of autodecode was silently abandoned. What more
 could I do?
It's not silently abandoned. It will break just about every D program out there. I have a hard time with the idea that breakage of old code is inexcusable, so let's break every old program?
Can I just throw in here that I like autodecoding and I think it's good? If you want ranges that iterate over bytes, then just use arrays of bytes. If you want Latin1 text, use Latin1 strings. If you want Unicode, you get Unicode iteration. This seems right and proper to me. Hell I'd love if the language was *more* aggressive about validating casts to strings.
The problem is that auto-decoding doesn't even give you correct Unicode handling. At best, it's kind of like using UTF-16 instead of ASCII but assuming that a UTF-16 code unit can always contain an entire character (which is frequently what you get in programs written in languages like Java); such programs handle more text correctly than ASCII-only code would, but they still don't handle Unicode correctly. It's just a lot harder to realize it, because it's far from fail-fast. In general, doing everything at the code point level with Unicode (as auto-decoding does) is very much broken. It's just that it's a lot less obvious, because that much more works - and it comes with the bonus of being far less efficient. If you wanted everything to "just work" out of the box without having to worry about Unicode, you could probably do it if everything operated at the grapheme cluster level, but that would be horribly inefficient. The sad reality is that if you want your string-processing code to be at all fast while still being correct, you have to have at least a basic understanding of Unicode and use it correctly - and that rarely means doing much of anything at the code point level. It's much more likely that it needs to be at either the code unit or grapheme level. But either way, without a programmer understanding the details and programming accordingly, the code is just plain going to be wrong somewhere. The idea that we can have string-processing "just work" without the programmer having to worry about the details of Unicode is unfortunately largely a fallacy - at least if you care about efficiency. By operating at the code point level, we're just generating code that looks like it works when it doesn't really, and it's less efficient. It certainly works in more cases than just using ASCII would, but it's still broken for Unicode handling, just like if the code were assuming that char was always an entire character. As such, I don't really see how there can be much defense for auto-decoding.
It was done on the incorrect assumption that code points actually represented characters (for that you actually need graphemes) and that the loss in speed was worth the correctness, with the idea that anyone wanting the speed could work around the auto-decoding. We could get something like that if we went to the grapheme level, but that would hurt performance that much more. Either way, operating at the code point level everywhere is just plain wrong. This isn't just a case of "it's annoying" or "we don't like it." It objectively results in incorrect code. - Jonathan M Davis
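As an illustration of how code-point-level operations silently produce wrong results (a sketch, not from the original post): reversing a string that contains a combining mark reattaches the accent to the wrong base letter.

```d
import std.algorithm : equal;
import std.range : retro;

void main()
{
    // "á" written as 'a' + U+0301 (combining acute accent), then 'b':
    string s = "a\u0301b";
    // Auto-decoding makes the string a bidirectional range of code
    // points, so retro reverses code points and the accent, which
    // belonged to 'a', now follows (and modifies) 'b':
    assert(s.retro.equal("b\u0301a"));
}
```

A correct reversal would have to operate on graphemes (e.g. via std.uni.byGrapheme), which is exactly the extra cost being discussed.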
Aug 26 2018
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sun, Aug 26, 2018 at 11:12:10PM +0000, FeepingCreature via Digitalmars-d
wrote:
[...]
 Can I just throw in here that I like autodecoding and I think it's
 good?  If you want ranges that iterate over bytes, then just use
 arrays of bytes.  If you want Latin1 text, use Latin1 strings. If you
 want Unicode, you get Unicode iteration. This seems right and proper
 to me. Hell I'd love if the language was *more* aggressive about
 validating casts to strings.
Actually, this is exactly the point that makes autodecoding so bad, because it *looks like* correct Unicode iteration over characters, but it actually isn't. It's iteration over Unicode *code points*, which is not the same thing as iteration over what people would think of as "characters", which in Unicode are called graphemes (cf. byGrapheme). So iterating over strings like "a\u301" will give you two codepoints, even though it actually renders as a single grapheme. Unfortunately, most of the time the iteration will look correct -- in most European languages, at least -- so the programmer will suspect nothing wrong. Until the code is then given a non-European Unicode string. Then it starts getting wrong behaviour. Not to mention that this incomplete solution represents an across-the-board performance hit on all string-processing code (unless it was explicitly written to bypass autodecoding with something like byCodeUnit), even if the code in question doesn't even care about Unicode and treats the strings as opaque byte sequences. The illusion of simplicity and correctness that autodecoding gives is misleading, and makes programmers think their code is OK, when the fact of the matter is that to handle Unicode correctly, you *have* to actually know what Unicode is and how it works. You simply cannot pretend that it bears any resemblance to the ASCII days of one code unit per character (no, not even with UTF-32) and expect your code to behave correctly with all valid Unicode input strings. In fact, this very illusion was what made Andrei choose to go with autodecoding in the first place, thinking that it would default to correct behaviour. Unfortunately, the reality didn't match up with that expectation. The ideal solution would have been to make strings non-iterable by default, and only iterable when the programmer chooses the mode of iteration (explicitly specify byCodeUnit, byCodePoint, or byGrapheme). T -- What do you call optometrist jokes? Vitreous humor.
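The three iteration modes give three different answers even for this tiny example (a sketch; `\u0301` is the combining acute accent):

```d
import std.range : walkLength;
import std.uni : byGrapheme;
import std.utf : byCodeUnit;

void main()
{
    string s = "a\u0301";                  // one grapheme: 'a' + combining acute
    assert(s.byCodeUnit.walkLength == 3);  // UTF-8 code units (1 + 2 bytes)
    assert(s.walkLength == 2);             // code points (what autodecoding yields)
    assert(s.byGrapheme.walkLength == 1);  // user-perceived characters
}
```

Only the last count matches what the user sees on screen, which is why defaulting to the middle one satisfies nobody.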
Aug 27 2018
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Sunday, 26 August 2018 at 22:44:05 UTC, Walter Bright wrote:
 On 8/26/2018 8:43 AM, Chris wrote:
 I wanted to get rid of autodecode and I even offered to test 
 it on my string heavy code to see what breaks (and maybe write 
 guidelines for the transition), but somehow the whole idea of 
 getting rid of autodecode was silently abandoned. What more 
 could I do?
It's not silently abandoned. It will break just about every D program out there. I have a hard time with the idea that breakage of old code is inexcusable, so let's break every old program?
We all know that, but we also know that autodecode is a broken feature - which makes me shudder. So we are - once again - in the absurd situation where we have to say "If it's broke, don't fix it!" However, given the importance of string handling these days and given how basic a feature of any programming language it is, I'm surprised that nobody ever came up with a strategy to fix it. It was just said - ah, too dangerous, too complicated, let's leave bad enough alone. Do you think that companies that deal with languages would be happy to learn that D's string handling is a mess? That'd be an immediate showstopper. For these reasons I supported this particular breaking change. It is unrealistic to assume that code will never break. But as I said in my post above, dmd should give guarantees of backward compatibility of at least N versions. Then we could be more relaxed about our code. But breakage should be avoided wherever possible to begin with. To break code because you have to fix a badly implemented feature like partially constructed objects is just ridiculous. You often say we have to report bugs etc., but a lot of D's issues are common-sense issues that don't need massive community input, i.e. get the basics right before adding new features for the heck of it: reliability, improved tools. Do my suggestions really sound so unreasonable?
 But what for was it introduced at all and why wasn't it 
 thought through properly right from the start?
Because nobody thought about that issue before. A lot of things only become apparent in hindsight.
QED. With this approach you do more harm than good. I have a bad feeling about the way things are going atm.
Aug 27 2018
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/27/2018 2:14 AM, Chris wrote:
 On Sunday, 26 August 2018 at 22:44:05 UTC, Walter Bright wrote:
 Because nobody thought about that issue before. A lot of things only become 
 apparent in hindsight.
QED. With this approach you do more harm than good. I have a bad feeling about the way things are going atm.
I can quote you a loooong list of problems that are obvious only in hindsight, by world leading development teams. Start by watching the documentary series "Aviation Disasters", look at Challenger, Deepwater Horizon, Fukushima, Apollo 1, Apollo 13, the World Trade Centers, etc. Of course, there are a number of them in C, C++, Java, Javascript, basically every language I've worked with. I'll guarantee every non-trivial project you've worked on has problems that are obvious only in hindsight, too. If you wait till it's perfect, you'll never ship, and yet it'll *still* have problems. I'm not making excuses for mistakes - just don't have unworkable requirements. The end of the day is, does D get the job done for you better than other languages? That's a decision only you can make.
Aug 28 2018
parent reply Chris <wendlec tcd.ie> writes:
On Tuesday, 28 August 2018 at 07:30:01 UTC, Walter Bright wrote:
 On 8/27/2018 2:14 AM, Chris wrote:
 bad feeling about the way things are going atm.
I can quote you a loooong list of problems that are obvious only in hindsight, by world leading development teams. Start by watching the documentary series "Aviation Disasters", look at Challenger, Deepwater Horizon, Fukushima, Apollo 1, Apollo 13, the World Trade Centers, etc. Of course, there are a number of them in C, C++, Java, Javascript, basically every language I've worked with. I'll guarantee every non-trivial project you've worked on has problems that are obvious only in hindsight, too. If you wait till it's perfect, you'll never ship, and yet it'll *still* have problems. I'm not making excuses for mistakes - just don't have unworkable requirements.
This is all well and good, and I know that anyone who develops software shoots him/herself in the foot sooner or later. But this is not the same situation. If you have to ship something by date X, then you are under pressure and naturally make mistakes that are obvious only in hindsight. But D is not under pressure to include new features so frequently. There's absolutely no reason to rush into something that eats up a lot of your time (better spent on more urgent problems) and in so doing produce possible breakage.
 The end of the day is, does D get the job done for you better 
 than other languages? That's a decision only you can make.
It has done a better job until recently. The problem is not things like @safe, `const` and whatnot; the problems are very practical issues such as fear of breakage / time spent fixing things, running the code on ARM, and integration into other technologies (webasm). Since the D Foundation was founded I really thought that part of the effort would go into stabilizing the language and developing better tools for various aspects of programming (not just language features). Programming is so much more than just language features, and languages that offer the "so much more" part are usually the ones people adopt. But somehow D still seems to be in its hobby hacker days. Features are first and foremost, everything else comes second. But features get "ripped" by other programming languages, and they can pick and choose, because they know what really worked in D, while D has to struggle with the things that didn't work or only half worked. Laeeth was talking about being analytical about the whole thing. Why not find out what features are really being used? I.e. does the majority really need - for practical purposes - partially constructed objects? When people choose a programming language, there are several boxes that have to be ticked, like for example:

- what's the future of language X? (guarantees, stability)
- how easy is it to get going (from "Hello world" to a complete tool chain)
- will it run on ARM?
- will it be a good choice for the Web (e.g. webasm)?
- how good is it at data processing / number grinding
- etc.

I think the D Foundation should focus on the more "trivial" things too. If a company is asked to develop a data-grinding web application along with a smartphone app - will it choose D? If a company offers localization services and translations - will it choose D (autodecode)? The D community / leadership is acting as if they had all the time in the world. But other languages are moving fast, and they learn from D what _not_ to do.
Last but not least, if it's true that the D Foundation has raised only 3.2K, then there's something seriously wrong.
Aug 28 2018
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Tuesday, 28 August 2018 at 08:44:26 UTC, Chris wrote:
 When people choose a programming language, there are several 
 boxes that have to be ticked, like for example:

 - what's the future of language X? (guarantees, stability)
 - how easy is it to get going (from "Hello world" to a complete 
 tool chain)
 - will it run on ARM?
 - will it be a good choice for the Web (e.g. webasm)?
 - how good is it at data processing / number grinding
 - etc.
I don't know if all their claims are 100% true, but let that sink in for a while: https://julialang.org/.
Aug 28 2018
parent reply Laeeth Isharc <Laeeth laeeth.com> writes:
On Tuesday, 28 August 2018 at 08:51:27 UTC, Chris wrote:
 On Tuesday, 28 August 2018 at 08:44:26 UTC, Chris wrote:
 When people choose a programming language, there are several 
 boxes that have to be ticked, like for example:

 - what's the future of language X? (guarantees, stability)
 - how easy is it to get going (from "Hello world" to a 
 complete tool chain)
 - will it run on ARM?
 - will it be a good choice for the Web (e.g. webasm)?
 - how good is it at data processing / number grinding
 - etc.
I don't know if all their claims are 100% true, but let that sink in for a while: https://julialang.org/.
Julia is great. I don't see it as a competitor to D but, for us, as one way researchers might access libraries written in D. One could do quite a lot in it, but I don't much fancy embedding Julia in Excel, for example, though you could. Or doing DevOps in Julia. Perhaps more of a Matlab substitute. Look around and you can find people grumpy about any language that's used. http://www.zverovich.net/2016/05/13/giving-up-on-julia.html Languages really aren't in a battle to the death with each other. I find this zero-sum mindset quite peculiar.
Aug 29 2018
parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 29 August 2018 at 23:47:11 UTC, Laeeth Isharc wrote:
 On Tuesday, 28 August 2018 at 08:51:27 UTC, Chris wrote:
 Julia is great.  I don't see it as a competitor to D but for us 
 one way researchers might access libraries written in D.  One 
 could do quite a lot in it, but I don't much fancy embedding 
 Julia in Excel for example, though you could.  Or doing DevOps 
 in Julia.  Perhaps more of a Matlab substitute.

 Look around and you can find people grumpy about any language 
 that's used.
 http://www.zverovich.net/2016/05/13/giving-up-on-julia.html

 Languages really aren't in a battle to the death with each 
 other.
  I find this zero-sum mindset quite peculiar.
I'm old enough to a) not become enthusiastic about a language and b) know that you can find fault with any language. It's not about "life or death". D was promising and I liked it and it did things for me no other language could do for me - back in the day. Nowadays many languages have similar features, especially the useful ones that have proven to be, well, useful and not the latest fad. But D has some major issues that have become clear to me after using it for quite a while:

1. unsolved issues like autodecode that nobody seems to care about
2. obvious facepalm moments all over the place (see 1.)
3. moving the goal posts all the time and forcing you into a new paradigm every 1 1/2 years (first it was "ranges", then "templates" and now it's "functional", wait OOP will come back one day). Yeah, a language that doesn't come with a paradigm or ideology, no, a language that only nudges you into a certain direction and makes your code look old and just soooo "not modern" according to the latest CS fashion of the day. "Why do you complain? If you think C++ (as the D leadership did for a long time), of course your code will break, you knob! If it breaks it's for your own good (for now)."
4. nitpicking over details of half-baked features that shouldn't be there in the first place, but hey! let's break valid existing code to fix them - or not - or, what about volatileSafeUB (it's sooo not C++)? Yeah, sounds great. We'll just have to issue a compiler message "error: cannot assign `size_t` to `size_t`"
5. complete and utter negligence of developer reality (ARM, Android, iOS, tools etc.). It's all left to spare-time enthusiasts - and their code will break in 4 weeks too. Just you wait and see
6. the leadership doesn't address the issues and gives evasive answers as in "Programmers who..." or "in hindsight you're always wiser", "other engineers have made mistakes too"
7. I've seen it all before, many times, and it's a sign of a sinking ship, rearranging the deck chairs on the Titanic
8. what a pity
9. I hope D will be great again
Aug 31 2018
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Aug 31, 2018 at 09:37:55AM +0000, Chris via Digitalmars-d wrote:
[...]
 3. moving the goal posts all the time and forcing you into a new
 paradigm every 1 1/2 years (first it was "ranges", then "templates"
 and now it's "functional", wait OOP will come back one day).
[...] Wait, what? Since when has this ever been a "choose one paradigm among many" deal? Templates are what enables range-based idioms to succeed, and ranges are what makes it possible to write functional-like code in D. Since when have they become mutually exclusive?! T -- Three out of two people have difficulties with fractions. -- Dirk Eddelbuettel
Aug 31 2018
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 31 August 2018 at 14:38:36 UTC, H. S. Teoh wrote:
 On Fri, Aug 31, 2018 at 09:37:55AM +0000, Chris via 
 Digitalmars-d wrote: [...]
 3. moving the goal posts all the time and forcing you into a 
 new paradigm every 1 1/2 years (first it was "ranges", then 
 "templates" and now it's "functional", wait OOP will come back 
 one day).
[...] Wait, what? Since when has this ever been a "choose one paradigm among many" deal? Templates are what enables range-based idioms to succeed, and ranges are what makes it possible to write functional-like code in D. Since when have they become mutually exclusive?! T
I wasn't talking about that, but about the fact that users are slowly but surely nudged into a certain direction. And yes, D was advertised as a "no ideology language".
Aug 31 2018
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Aug 31, 2018 at 03:13:30PM +0000, Chris via Digitalmars-d wrote:
 On Friday, 31 August 2018 at 14:38:36 UTC, H. S. Teoh wrote:
 On Fri, Aug 31, 2018 at 09:37:55AM +0000, Chris via Digitalmars-d wrote:
 [...]
 3. moving the goal posts all the time and forcing you into a new
 paradigm every 1 1/2 years (first it was "ranges", then
 "templates" and now it's "functional", wait OOP will come back one
 day).
[...] Wait, what? Since when has this ever been a "choose one paradigm among many" deal? Templates are what enables range-based idioms to succeed, and ranges are what makes it possible to write functional-like code in D. Since when have they become mutually exclusive?!
[...]
 I wasn't talking about that, but about the fact that users are slowly
 but surely nudged into a certain direction. And yes, D was advertised
 as a "no ideology language".
Sorry, "slowly but surely nudged" sounds very different from "forcing you into a new paradigm every 1 1/2 years". So which is it? A nudge, presumably from recommended practices which you don't really have to follow (e.g., I don't follow all D recommended practices in my own code), or a strong coercion that forces you to rewrite your code in a new paradigm or else? T -- IBM = I'll Buy Microsoft!
Aug 31 2018
parent Chris <wendlec tcd.ie> writes:
On Friday, 31 August 2018 at 15:43:13 UTC, H. S. Teoh wrote:

 [...]
 I wasn't talking about that, but about the fact that users are 
 slowly but surely nudged into a certain direction. And yes, D 
 was advertised as a "no ideology language".
Sorry, "slowly but surely nudged" sounds very different from "forcing you into a new paradigm every 1 1/2 years". So which is it? A nudge, presumably from recommended practices which you don't really have to follow (e.g., I don't follow all D recommended practices in my own code), or a strong coercion that forces you to rewrite your code in a new paradigm or else? T
Ah yeah, fair play to you. I knew someone would see the force / nudge thing. You're nudged over the years until you end up being forced to use a certain paradigm. There's nothing wrong with languages "forcing" you to use certain paradigms as long as it's clear from the start and you know what you're in for. But moving the goalposts as you go along is a bit meh. I remember that Walter said that once he didn't care about (or even understand) templates. Then it was all templates, now it's functional programming (which I like). What will be next? Forced `assert` calls in every function? I can already see it... But, again, it's this attitude of nitpicking over words (nudge / force) instead of addressing the issues that alarms me. It's not a good sign.
Sep 01 2018
prev sibling parent reply Laeeth Isharc <Laeeth laeeth.com> writes:
On Friday, 31 August 2018 at 09:37:55 UTC, Chris wrote:
 On Wednesday, 29 August 2018 at 23:47:11 UTC, Laeeth Isharc 
 wrote:
 On Tuesday, 28 August 2018 at 08:51:27 UTC, Chris wrote:

 9. I hope D will be great again
Are you someone who lives by hope and fears about things that have a meaning for you? Or do you prefer to take action? If the latter, what do you think might be some small step you could take to move the world towards the direction in which you think it should head. My experience of life is that in the end one way and another everything one does, big or small, turns out to matter and also that great things can have quite little beginnings. So what could you do towards the end you hope for ?
Aug 31 2018
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 31 August 2018 at 18:24:40 UTC, Laeeth Isharc wrote:
 On Friday, 31 August 2018 at 09:37:55 UTC, Chris wrote:
 On Wednesday, 29 August 2018 at 23:47:11 UTC, Laeeth Isharc 
 wrote:
 On Tuesday, 28 August 2018 at 08:51:27 UTC, Chris wrote:

 9. I hope D will be great again
Are you someone who lives by hope and fears about things that have a meaning for you? Or do you prefer to take action? If the latter, what do you think might be some small step you could take to move the world towards the direction in which you think it should head. My experience of life is that in the end one way and another everything one does, big or small, turns out to matter and also that great things can have quite little beginnings. So what could you do towards the end you hope for ?
Hope is usually the last thing to die. But one has to be wise enough to see that sometimes there is nothing one can do. As things are now, for me personally D is no longer an option, because of simple basic things, like autodecode, a flaw that will be there forever, poor support for industry technologies (Android, iOS) and the constant "threat" of code breakage. The D language developers don't seem to understand the importance of these trivial matters. I'm not just opinionating, by now I have no other _choice_ but to look for alternatives - and I do feel a little bit sad.
Sep 01 2018
next sibling parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 09/01/2018 07:12 AM, Chris wrote:
 
 Hope is usually the last thing to die. But one has to be wise enough to 
 see that sometimes there is nothing one can do. As things are now, for 
 me personally D is no longer an option, because of simple basic things, 
 like autodecode, a flaw that will be there forever, poor support for 
 industry technologies (Android, iOS)
Much as I hate to agree, that IS one thing where I'm actually in the same boat: My primary current paid project centers around converting some legacy Flash stuff to...well, to NOT Flash obviously. I *want* to use D for this very badly. But I'm not. I'm using Unity3D because:

1. For our use right now: It has ready-to-go out-of-the-box WebAsm support (or is it asm.js? Whatever...I can't keep up with the neverending torrent of rubble-bouncing from the web client world.)

2. For our use later: It has ready-to-go out-of-the-box iOS/Android support (along with just about any other platform we could ever possibly hope to care about).

3. It has all the robust multimedia functionality we need ready-to-go on all platforms (actually, its capabilities are totally overkill for us, but that's not a bad problem to have).

I will be migrating the server back-end to D, but I *really* wish I could be doing the client-side in D too, even if that meant having to build an entire 2D engine off nothing more than SDL. Unfortunately, I just don't feel I can trust the D experience to be robust enough on those platforms right now, and I honestly have no idea when or even if it will get there (Maybe I'm wrong on that. I hope I am. But that IS my impression even as the HUUUGE D fan I am.)
Sep 01 2018
parent Chris <wendlec tcd.ie> writes:
On Saturday, 1 September 2018 at 21:18:27 UTC, Nick Sabalausky 
(Abscissa) wrote:
 On 09/01/2018 07:12 AM, Chris wrote:
 
 Hope is usually the last thing to die. But one has to be wise 
 enough to see that sometimes there is nothing one can do. As 
 things are now, for me personally D is no longer an option, 
 because of simple basic things, like autodecode, a flaw that 
 will be there forever, poor support for industry technologies 
 (Android, iOS)
Much as I hate to agree, that IS one thing where I'm actually in the same boat: My primary current paid project centers around converting some legacy Flash stuff to...well, to NOT Flash obviously. I *want* to use D for this very badly. But I'm not. I'm using Unity3D because: 1. For our use right now: It has ready-to-go out-of-the-box WebAsm support (or is it asm.js? Whatever...I can't keep up with the neverending torrent of rubble-bouncing from the web client world.) 2. For our use later: It has ready-to-go out-of-the-box iOS/Android support (along with just about any other platform we could ever possibly hope to care about). 3. It has all the robust multimedia functionality we need ready-to-go on all platforms (actually, its capabilities are totally overkill for us, but that's not a bad problem to have). I will be migrating the server back-end to D, but I *really* wish I could be doing the client-side in D too, even if that meant having to build an entire 2D engine off nothing more than SDL.
 Unfortunately, I just don't feel I can trust the D experience 
 to be robust enough on those platforms right now, and I 
 honestly have no idea when or even if it will get there (Maybe 
 I'm wrong on that. I hope I am. But that IS my impression even 
 as the HUUUGE D fan I am.)
"when or even if" I'm in the same situation but I can't wait anymore. Apps are everywhere these days and if you can't provide some sort of app, you're not in a good position. It's the realty of things, it's not a game, for many of us our jobs depend on it. Btw, why did I get this message yesterday: "Your message has been saved, and will be posted after being approved by a moderator." My message hasn't shown up yet as it hasn't been approved yet ;)
Sep 02 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2018 4:12 AM, Chris wrote:
 Hope is usually the last thing to die. But one has to be wise enough to see
that 
 sometimes there is nothing one can do. As things are now, for me personally D
is 
 no longer an option, because of simple basic things, like autodecode, a flaw 
 that will be there forever, poor support for industry technologies (Android, 
 iOS) and the constant "threat" of code breakage. The D language developers
don't 
 seem to understand the importance of these trivial matters. I'm not just 
 opinionating, by now I have no other _choice_ but to look for alternatives -
and 
 I do feel a little bit sad.
Autodecode - I've suffered under that, too. The solution was fairly simple. Append .byCodeUnit to strings that would otherwise autodecode. Annoying, but hardly a showstopper.

Android, iOS - Contribute to help make it better.

Breakage - I've dealt with this, too. The language changes have been usually just some minor edits. The more serious problems were the removal of some Phobos packages. I dealt with this by creating the undeaD library:

https://github.com/dlang/undeaD
Sep 04 2018
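As an aside, the code-unit vs. decoded-code-point distinction behind `.byCodeUnit` can be illustrated outside D as well. A minimal Python sketch (Python chosen only because the thread keeps comparing against it; D's actual range machinery differs in detail, so treat this as an analogy, not D's behavior):

```python
s = "\u00e1"  # 'á': one code point, two UTF-8 code units

# Iterating a str yields decoded code points -- roughly analogous to
# what D's autodecoding does when a string is used as a range
assert [ord(c) for c in s] == [0xE1]

# Iterating the UTF-8 encoding walks the raw code units -- roughly
# analogous to what appending .byCodeUnit exposes
assert list(s.encode("utf-8")) == [0xC3, 0xA1]
```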
next sibling parent tide <tide tide.tide> writes:
On Tuesday, 4 September 2018 at 21:36:16 UTC, Walter Bright wrote:
 On 9/1/2018 4:12 AM, Chris wrote:
 Hope is usually the last thing to die. But one has to be wise 
 enough to see that sometimes there is nothing one can do. As 
 things are now, for me personally D is no longer an option, 
 because of simple basic things, like autodecode, a flaw that 
 will be there forever, poor support for industry technologies 
 (Android, iOS) and the constant "threat" of code breakage. The 
 D language developers don't seem to understand the importance 
 of these trivial matters. I'm not just opinionating, by now I 
 have no other _choice_ but to look for alternatives - and I do 
 feel a little bit sad.
Android, iOS - Contribute to help make it better.
It would help if the main official compiler supported those operating systems. That would mean adding ARM support to DMD. Or a much simpler solution, use an existing backend that has ARM support built in to it and is maintained by a much larger established group of individuals. Say like how some languages, like Rust, do.
Sep 04 2018
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Tuesday, 4 September 2018 at 21:36:16 UTC, Walter Bright wrote:
 Autodecode - I've suffered under that, too. The solution was 
 fairly simple. Append .byCodeUnit to strings that would 
 otherwise autodecode. Annoying, but hardly a showstopper.
import std.array : array;
import std.stdio : writefln;
import std.uni : byCodePoint, byGrapheme;
import std.utf : byCodeUnit;

void main()
{
    string first = "á";
    writefln("%d", first.length);  // prints 2

    auto firstCU = "á".byCodeUnit;  // type is `ByCodeUnitImpl` (!)
    writefln("%d", firstCU.length);  // prints 2

    auto firstGr = "á".byGrapheme.array;  // type is `Grapheme[]`
    writefln("%d", firstGr.length);  // prints 1

    auto firstCP = "á".byCodePoint.array;  // type is `dchar[]`
    writefln("%d", firstCP.length);  // prints 1

    dstring second = "á";
    writefln("%d", second.length);  // prints 1 (That was easy!)

    // DMD64 D Compiler v2.081.2
}

Welcome to my world!

[snip]
Sep 05 2018
next sibling parent reply aliak <something something.com> writes:
On Wednesday, 5 September 2018 at 07:48:34 UTC, Chris wrote:
 On Tuesday, 4 September 2018 at 21:36:16 UTC, Walter Bright 
 wrote:
 Autodecode - I've suffered under that, too. The solution was 
 fairly simple. Append .byCodeUnit to strings that would 
 otherwise autodecode. Annoying, but hardly a showstopper.
import std.array : array; import std.stdio : writefln; import std.uni : byCodePoint, byGrapheme; import std.utf : byCodeUnit; void main() { string first = "á"; writefln("%d", first.length); // prints 2 auto firstCU = "á".byCodeUnit; // type is `ByCodeUnitImpl` (!) writefln("%d", firstCU.length); // prints 2 auto firstGr = "á".byGrapheme.array; // type is `Grapheme[]` writefln("%d", firstGr.length); // prints 1 auto firstCP = "á".byCodePoint.array; // type is `dchar[]` writefln("%d", firstCP.length); // prints 1 dstring second = "á"; writefln("%d", second.length); // prints 1 (That was easy!) // DMD64 D Compiler v2.081.2 } Welcome to my world! [snip]
The dstring is only ok because the 2 code units fit in a dchar right? But all the other ones are as expected right? Seriously... why is it not graphemes by default for correctness whyyyyyyy!
Sep 05 2018
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Sep 05, 2018 at 09:33:27PM +0000, aliak via Digitalmars-d wrote:
[...]
 The dstring is only ok because the 2 code units fit in a dchar right?
 But all the other ones are as expected right?
And dstring will be wrong once you have non-precomposed diacritics and other composing sequences.
 Seriously... why is it not graphemes by default for correctness
 whyyyyyyy!
Because grapheme decoding is SLOW, and most of the time you don't even need it anyway. SLOW as in, it will easily add a factor of 3-5 (if not worse!) to your string processing time, which will make your natively-compiled D code a laughing stock of interpreted languages like Python. It will make autodecoding look like an optimization(!).

Grapheme decoding is really only necessary when (1) you're typesetting a Unicode string, and (2) you're counting the number of visual characters taken up by the string (though grapheme counting even in this case may not give you what you want, thanks to double-width characters, zero-width characters, etc. -- though it can form the basis of correct counting code). For all other cases, you really don't need grapheme decoding, and being forced to iterate over graphemes when unnecessary will add a horrible overhead, worse than autodecoding does today.

//

Seriously, people need to get over the fantasy that they can just use Unicode without understanding how Unicode works. Most of the time, you can get the illusion that it's working, but actually 99% of the time the code is actually wrong and will do the wrong thing when given an unexpected (but still valid) Unicode string. You can't drive without a license, and even if you try anyway, the chances of ending up in a nasty accident is pretty high. People *need* to learn how to use Unicode properly before complaining about why this or that doesn't work the way they thought it should work.

T

--
Gone Chopin. Bach in a minuet.
Sep 05 2018
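The three counting levels under debate here (code units, code points, graphemes) can be made concrete with a small Python sketch. Grapheme segmentation itself needs a dedicated library in most languages, but normalization alone already shows that one visible character need not be one code point:

```python
import unicodedata

nfc = "\u00e1"    # 'á' precomposed: a single code point
nfd = "a\u0301"   # 'á' decomposed: 'a' plus U+0301 combining acute

# len() on a Python str counts code points
assert len(nfc) == 1
assert len(nfd) == 2

# UTF-8 code units (bytes) give yet another count
assert len(nfc.encode("utf-8")) == 2
assert len(nfd.encode("utf-8")) == 3

# Both spellings render as one grapheme on screen; NFC normalization
# recomposes the decomposed form into the precomposed one
assert unicodedata.normalize("NFC", nfd) == nfc
```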
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 5 September 2018 at 22:00:27 UTC, H. S. Teoh wrote:

 //

 Seriously, people need to get over the fantasy that they can 
 just use Unicode without understanding how Unicode works.  Most 
 of the time, you can get the illusion that it's working, but 
 actually 99% of the time the code is actually wrong and will do 
 the wrong thing when given an unexpected (but still valid) 
 Unicode string.  You can't drive without a license, and even if 
 you try anyway, the chances of ending up in a nasty accident is 
 pretty high.  People *need* to learn how to use Unicode 
 properly before complaining about why this or that doesn't work 
 the way they thought it should work.


 T
Python 3 gives me this:

print(len("á"))
1

and so do other languages. Is it asking too much to ask for `string` (not `dstring` or `wstring`) to behave as most people would expect it to behave in 2018 - and not like Python 2 from days of yore? But of course, D users should have a "Unicode license" before they do anything with strings. (I wonder is there a different license for UTF8 and UTF16 and UTF32, Big / Little Endian, BOM? Just asking.)

So again, for the umpteenth time, it's the users' fault. I see. Ironically enough, it was the language developers' lack of understanding of Unicode that led to string handling being a nightmare in D in the first place. Oh lads, if you were politicians I'd say that with this attitude you're gonna lose the next election. I say this, because many times the posts by (core) developers remind me so much of politicians who are completely detached from the reality of the people. Right oh!
Sep 06 2018
next sibling parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Thursday, 6 September 2018 at 07:23:57 UTC, Chris wrote:

 Seriously, people need to get over the fantasy that they can 
 just use Unicode without understanding how Unicode works.  
 Most of the time, you can get the illusion that it's working, 
 but actually 99% of the time the code is actually wrong and 
 will do the wrong thing when given an unexpected (but still 
 valid) Unicode string.
 Is it asking too much to ask for `string` (not `dstring` or 
 `wstring`) to behave as most people would expect it to behave 
 in 2018 - and not like Python 2 from days of yore?
I agree with Chris. That boat has sailed, so D2 should just go full throttle with the original design and auto decode to graphemes, regardless of the performance.
Sep 06 2018
prev sibling next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 6 September 2018 at 07:23:57 UTC, Chris wrote:
 On Wednesday, 5 September 2018 at 22:00:27 UTC, H. S. Teoh 
 wrote:

 //

 Seriously, people need to get over the fantasy that they can 
 just use Unicode without understanding how Unicode works.  
 Most of the time, you can get the illusion that it's working, 
 but actually 99% of the time the code is actually wrong and 
 will do the wrong thing when given an unexpected (but still 
 valid) Unicode string.  You can't drive without a license, and 
 even if you try anyway, the chances of ending up in a nasty 
 accident is pretty high.  People *need* to learn how to use 
 Unicode properly before complaining about why this or that 
 doesn't work the way they thought it should work.


 T
Python 3 gives me this: print(len("á")) 1 and so do other languages.
The same Python 3 that people criticize for having unintuitive unicode string handling? https://learnpythonthehardway.org/book/nopython3.html
 Is it asking too much to ask for `string` (not `dstring` or 
 `wstring`) to behave as most people would expect it to behave 
 in 2018 - and not like Python 2 from days of yore? But of 
 course, D users should have a "Unicode license" before they do 
 anything with strings. (I wonder is there a different license 
 for UTF8 and UTF16 and UTF32, Big / Little Endian, BOM? Just 
 asking.)
Yes and no, unicode is a clusterf***, so every programming language is having problems with it.
 So again, for the umpteenth time, it's the users' fault. I see. 
 Ironically enough, it was the language developers' lack of 
 understanding of Unicode that led to string handling being a 
 nightmare in D in the first place. Oh lads, if you were 
 politicians I'd say that with this attitude you're gonna the 
 next election. I say this, because many times the posts by 
 (core) developers remind me so much of politicians who are 
 completely detached from the reality of the people. Right oh!
You have a point that it was D devs' ignorance of unicode that led to the current auto-decoding problem. But let's have some nuance here, the problem ultimately is unicode.
Sep 06 2018
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 06/09/2018 7:54 PM, Joakim wrote:
 On Thursday, 6 September 2018 at 07:23:57 UTC, Chris wrote:
 On Wednesday, 5 September 2018 at 22:00:27 UTC, H. S. Teoh wrote:

 //

 Seriously, people need to get over the fantasy that they can just use 
 Unicode without understanding how Unicode works. Most of the time, 
 you can get the illusion that it's working, but actually 99% of the 
 time the code is actually wrong and will do the wrong thing when 
 given an unexpected (but still valid) Unicode string.  You can't 
 drive without a license, and even if you try anyway, the chances of 
 ending up in a nasty accident is pretty high.  People *need* to learn 
 how to use Unicode properly before complaining about why this or that 
 doesn't work the way they thought it should work.


 T
Python 3 gives me this: print(len("á")) 1 and so do other languages.
The same Python 3 that people criticize for having unintuitive unicode string handling? https://learnpythonthehardway.org/book/nopython3.html
 Is it asking too much to ask for `string` (not `dstring` or `wstring`) 
 to behave as most people would expect it to behave in 2018 - and not 
 like Python 2 from days of yore? But of course, D users should have a 
 "Unicode license" before they do anything with strings. (I wonder is 
 there a different license for UTF8 and UTF16 and UTF32, Big / Little 
 Endian, BOM? Just asking.)
Yes and no, unicode is a clusterf***, so every programming language is having problems with it.
 So again, for the umpteenth time, it's the users' fault. I see. 
 Ironically enough, it was the language developers' lack of 
 understanding of Unicode that led to string handling being a nightmare 
 in D in the first place. Oh lads, if you were politicians I'd say that 
 with this attitude you're gonna the next election. I say this, because 
 many times the posts by (core) developers remind me so much of 
 politicians who are completely detached from the reality of the 
 people. Right oh!
You have a point that it was D devs' ignorance of unicode that led to the current auto-decoding problem. But let's have some nuance here, the problem ultimately is unicode.
Let's also be realistic here: when D was being designed, UTF-16 was touted as being 'the' solution you should support; Java, for example, had it retrofitted shortly before D. So it isn't anyone's fault on D's end.
Sep 06 2018
prev sibling parent Chris <wendlec tcd.ie> writes:
On Thursday, 6 September 2018 at 07:54:09 UTC, Joakim wrote:
 On Thursday, 6 September 2018 at 07:23:57 UTC, Chris wrote:
 On Wednesday, 5 September 2018 at 22:00:27 UTC, H. S. Teoh 
 wrote:

 //

 Seriously, people need to get over the fantasy that they can 
 just use Unicode without understanding how Unicode works.  
 Most of the time, you can get the illusion that it's working, 
 but actually 99% of the time the code is actually wrong and 
 will do the wrong thing when given an unexpected (but still 
 valid) Unicode string.  You can't drive without a license, 
 and even if you try anyway, the chances of ending up in a 
 nasty accident is pretty high.  People *need* to learn how to 
 use Unicode properly before complaining about why this or 
 that doesn't work the way they thought it should work.


 T
Python 3 gives me this: print(len("á")) 1 and so do other languages.
The same Python 3 that people criticize for having unintuitive unicode string handling? https://learnpythonthehardway.org/book/nopython3.html
 Is it asking too much to ask for `string` (not `dstring` or 
 `wstring`) to behave as most people would expect it to behave 
 in 2018 - and not like Python 2 from days of yore? But of 
 course, D users should have a "Unicode license" before they do 
 anything with strings. (I wonder is there a different license 
 for UTF8 and UTF16 and UTF32, Big / Little Endian, BOM? Just 
 asking.)
Yes and no, unicode is a clusterf***, so every programming language is having problems with it.
 So again, for the umpteenth time, it's the users' fault. I 
 see. Ironically enough, it was the language developers' lack 
 of understanding of Unicode that led to string handling being 
 a nightmare in D in the first place. Oh lads, if you were 
 politicians I'd say that with this attitude you're gonna the 
 next election. I say this, because many times the posts by 
 (core) developers remind me so much of politicians who are 
 completely detached from the reality of the people. Right oh!
You have a point that it was D devs' ignorance of unicode that led to the current auto-decoding problem. But let's have some nuance here, the problem ultimately is unicode.
Yes, Unicode is a beast that is hard to tame. But there is, afaik, not even a proper plan to tackle the whole thing in D, just patches. D has autodecoding which slows things down but doesn't even work correctly at the same time. However, it cannot be removed due to massive code breakage. So you sacrifice speed for security (fine) - but the security doesn't even exist. So what's the point? Also, there aren't any guidelines about how to use strings in different contexts. So after a while your code ends up being a mess of .byCodePoint / .byGrapheme / string / dstring whatever, and you never know if you really got it right or not (performance wise and other). We're talking about a basic functionality like string handling. String handling is very important these days (data harvesting, translation tools) and IT is used all over the world where you have to deal with different alphabets that are outside the ASCII range. And because it's such a basic functionality, you don't want to waste time having to think about it.
Sep 06 2018
prev sibling parent reply ag0aep6g <anonymous example.com> writes:
On 09/06/2018 09:23 AM, Chris wrote:
 Python 3 gives me this:
 
 print(len("á"))
 1
Python 3 also gives you this:

print(len("á"))
2

(The example might not survive transfer from me to you if Unicode normalization happens along the way.)

That's when you enter the 'á' as 'a' followed by U+0301 (combining acute accent). So Python's `len` counts in code points, like D's std.range does (auto-decoding).
Sep 06 2018
parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 6 September 2018 at 10:22:22 UTC, ag0aep6g wrote:
 On 09/06/2018 09:23 AM, Chris wrote:
 Python 3 gives me this:
 
 print(len("á"))
 1
Python 3 also gives you this: print(len("á")) 2 (The example might not survive transfer from me to you if Unicode normalization happens along the way.) That's when you enter the 'á' as 'a' followed by U+0301 (combining acute accent). So Python's `len` counts in code points, like D's std.range does (auto-decoding).
To avoid this you have to normalize and recompose any decomposed characters. I remember that Mac OS X used (and still uses?) decomposed characters by default, so when you typed 'á' into your cli, it would automatically decompose it to 'a' + acute. `string` however returns len=2 for composed characters too. If you do a lot of string handling it will come back to bite you sooner or later.
Sep 06 2018
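The normalize-and-recompose fix described above can be sketched with Python's standard unicodedata module (shown in Python for illustration; the same idea applies in D through std.uni's normalization routines):

```python
import unicodedata

def nfc(s):
    """Recompose decomposed sequences so equal-looking strings compare equal."""
    return unicodedata.normalize("NFC", s)

precomposed = "\u00e1"   # 'á' as one code point
decomposed = "a\u0301"   # 'á' as 'a' + combining acute (macOS-style NFD)

assert precomposed != decomposed             # naive comparison fails
assert nfc(precomposed) == nfc(decomposed)   # normalized comparison succeeds

# Same story for substring search
haystack = decomposed * 7
assert precomposed not in haystack
assert nfc(precomposed) in nfc(haystack)
```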
parent reply ag0aep6g <anonymous example.com> writes:
On 09/06/2018 12:40 PM, Chris wrote:
 To avoid this you have to normalize and recompose any decomposed 
 characters. I remember that Mac OS X used (and still uses?) decomposed 
 characters by default, so when you typed 'á' into your cli, it would 
 automatically decompose it to 'a' + acute. `string` however returns 
 len=2 for composed characters too. If you do a lot of string handling it 
 will come back to bite you sooner or later.
You say that D users shouldn't need a '"Unicode license" before they do anything with strings'. And you say that Python 3 gets it right (or maybe less wrong than D). But here we see that Python requires a similar amount of Unicode knowledge. Without your Unicode license, you couldn't make sense of `len` giving different results for two strings that look the same. So both D and Python require a Unicode license. But on top of that, D also requires an auto-decoding license. You need to know that `string` is both a range of code points and an array of code units. And you need to know that `.length` belongs to the array side, not the range side. Once you know that (and more), things start making sense in D. My point is: D doesn't require more Unicode knowledge than Python. But D's auto-decoding gives `string` a dual nature, and that can certainly be confusing. It's part of why everybody dislikes auto-decoding. (Not saying that Python is free from such pitfalls. I simply don't know the language well enough.)
Sep 06 2018
parent Chris <wendlec tcd.ie> writes:
On Thursday, 6 September 2018 at 11:43:31 UTC, ag0aep6g wrote:
 You say that D users shouldn't need a '"Unicode license" before 
 they do anything with strings'. And you say that Python 3 gets 
 it right (or maybe less wrong than D).

 But here we see that Python requires a similar amount of 
 Unicode knowledge. Without your Unicode license, you couldn't 
 make sense of `len` giving different results for two strings 
 that look the same.

 So both D and Python require a Unicode license. But on top of 
 that, D also requires an auto-decoding license. You need to 
 know that `string` is both a range of code points and an array 
 of code units. And you need to know that `.length` belongs to 
 the array side, not the range side. Once you know that (and 
 more), things start making sense in D.
You'll need some basic knowledge of Unicode, if you deal with strings, that's for sure. But you don't need a "license" and it certainly shouldn't be used as an excuse for D's confusing nature when it comes to strings. Unicode is confusing enough, so you don't need to add another layer of complexity to confuse users further. And most certainly you shouldn't blame the user for being confused. Afaik, there's no warning label with an accompanying user manual for string handling.
 My point is: D doesn't require more Unicode knowledge than 
 Python. But D's auto-decoding gives `string` a dual nature, and 
 that can certainly be confusing. It's part of why everybody 
 dislikes auto-decoding.
D should be clear about it. I think it's too late for `string` to change its behavior (i.e. "á".length = 1). If you wanna change `string`'s behavior now, maybe a compiler switch would be an option for the transition period: -autodecode=off. Maybe a new type of string could be introduced that behaves like one would expect, say `ustring` for correct Unicode handling. Or `string` does that and you introduce a new type for high performance tasks (`rawstring` would unfortunately be confusing).

The thing is that even basic things like string handling are complicated and flawed so that I don't want to use D for any future projects and I don't have the time to wait until it gets fixed one day, if it ever will get fixed that is. Neither does it seem to be a priority as opposed to other things that are maybe less important for production. But at least I'm wiser after this thread, since it has been made clear that things are not gonna change soon, at least not soon enough for me. This is why I'll file for D-vorce :)

Will it be difficult? Maybe at the beginning, but it will make things easier in the long run. And at the end of the day, if you have to fix and rewrite parts of your code again and again due to frequent language changes, you might as well port it to a different PL altogether. But I have no hard feelings, it's a practical decision I had to make based on pros and cons.

[snip]
Sep 06 2018
prev sibling parent reply aliak <something something.com> writes:
On Wednesday, 5 September 2018 at 22:00:27 UTC, H. S. Teoh wrote:
 Because grapheme decoding is SLOW, and most of the time you 
 don't even need it anyway.  SLOW as in, it will easily add a 
 factor of 3-5 (if not worse!) to your string processing time, 
 which will make your natively-compiled D code a laughing stock 
 of interpreted languages like Python.  It will make 
 autodecoding look like an optimization(!).
Hehe, it's already a bit laughable that correctness is not preferred.

// Swift
let a = "á"
let b = "á" // decomposed: a + combining acute (U+0301)
let c = "\u{200B}" // zero width space
let x = a + c + a
let y = b + c + b

print(a.count) // 1
print(b.count) // 1
print(x.count) // 3
print(y.count) // 3

print(a == b) // true
print("ááááááá".range(of: "á") != nil) // true

// D
auto a = "á";
auto b = "á"; // decomposed: a + combining acute (U+0301)
auto c = "\u200B";
auto x = a ~ c ~ a;
auto y = b ~ c ~ b;

writeln(a.length); // 2 wtf
writeln(b.length); // 3 wtf
writeln(x.length); // 7 wtf
writeln(y.length); // 9 wtf

writeln(a == b); // false wtf
writeln("ááááááá".canFind("á")); // false wtf

Tell me which one would cause the giggles again? If speed is the preference over correctness (which I very much disagree with, but for argument's sake...) then code points are still the wrong choice. So speed was obviously (??) not the reason to prefer code points as the default.

Here's a read on how Swift 4 strings behave. Absolutely amazing work there: https://oleb.net/blog/2017/11/swift-4-strings/
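For what it's worth, the D numbers above are exactly the UTF-8 code-unit counts, since D's `.length` is the array length in bytes. A small Python check (Python used here just to count the bytes) reproduces them:

```python
# D's string.length counts UTF-8 code units (bytes). The "wtf" values
# above fall out directly from the byte lengths of each literal.
a = "\u00e1"   # precomposed á: 2 bytes in UTF-8
b = "a\u0301"  # a + combining acute: 1 + 2 = 3 bytes
c = "\u200b"   # zero-width space: 3 bytes

print(len(a.encode("utf-8")))            # 2
print(len(b.encode("utf-8")))            # 3
print(len((a + c + a).encode("utf-8")))  # 7
print(len((b + c + b).encode("utf-8")))  # 9
```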
 Grapheme decoding is really only necessary when (1) you're 
 typesetting a Unicode string, and (2) you're counting the 
 number of visual characters taken up by the string (though 
 grapheme counting even in this case may not give you what you 
 want, thanks to double-width characters, zero-width characters, 
 etc. -- though it can form the basis of correct counting code).
Yeah nah. Those are not the only 2 cases *ever* where grapheme decoding is correct. I don't think one can list all the cases where grapheme decoding is the correct behavior. Off the top of my head, you've already forgotten comparisons. And on top of that, comparing and counting have a bajillion* use cases.

* number is an exaggeration.
 For all other cases, you really don't need grapheme decoding, 
 and being forced to iterate over graphemes when unnecessary 
 will add a horrible overhead, worse than autodecoding does 
 today.
As opposed to being forced to iterate with incorrect results? I understand that it's slower. I just don't think that justifies incorrect output. I agree with everything you've said next though, that people should understand unicode.
 //

 Seriously, people need to get over the fantasy that they can 
 just use Unicode without understanding how Unicode works.  Most 
 of the time, you can get the illusion that it's working, but 
 actually 99% of the time the code is actually wrong and will do 
 the wrong thing when given an unexpected (but still valid) 
 Unicode string.  You can't drive without a license, and even if 
 you try anyway, the chances of ending up in a nasty accident is 
 pretty high.  People *need* to learn how to use Unicode 
 properly before complaining about why this or that doesn't work 
 the way they thought it should work.
I agree that you should know about unicode. And maybe you can't be correct 100% of the time, but you can get much closer than where D is right now. And yeah, you can't drive without a license, but most cars hopefully don't show you an incorrect speedometer reading just because it produces faster drivers.
 T
 --
 Gone Chopin. Bach in a minuet.
Lol :D
Sep 06 2018
next sibling parent Laurent Tréguier <laurent.treguier.sink gmail.com> writes:
On Thursday, 6 September 2018 at 14:17:28 UTC, aliak wrote:
 Hehe, it's already a bit laughable that correctness is not 
 preferred.

 // Swift
 let a = "á"
 let b = "á"
 let c = "\u{200B}" // zero width space
 let x = a + c + a
 let y = b + c + b

 print(a.count) // 1
 print(b.count) // 1
 print(x.count) // 3
 print(y.count) // 3

 print(a == b) // true
 print("ááááááá".range(of: "á") != nil) // true

 // D
 auto a = "á";
 auto b = "á";
 auto c = "\u200B";
 auto x = a ~ c ~ a;
 auto y = b ~ c ~ b;

 writeln(a.length); // 2 wtf
 writeln(b.length); // 3 wtf
 writeln(x.length); // 7 wtf
 writeln(y.length); // 9 wtf

 writeln(a == b); // false wtf
 writeln("ááááááá".canFind("á")); // false wtf
writeln(cast(ubyte[]) a); // [195, 161]
writeln(cast(ubyte[]) b); // [97, 204, 129]

At least for equality, it doesn't seem far-fetched to me that the two are not considered equal if they are not the same.
Sep 06 2018
prev sibling parent reply Dukc <ajieskola gmail.com> writes:
On Thursday, 6 September 2018 at 14:17:28 UTC, aliak wrote:
 // D
 auto a = "á";
 auto b = "á";
 auto c = "\u200B";
 auto x = a ~ c ~ a;
 auto y = b ~ c ~ b;

 writeln(a.length); // 2 wtf
 writeln(b.length); // 3 wtf
 writeln(x.length); // 7 wtf
 writeln(y.length); // 9 wtf

 writeln(a == b); // false wtf
 writeln("ááááááá".canFind("á")); // false wtf
I had to copy-paste that because I wondered how the last two can be false. They are because á is encoded differently. If you replace all occurrences of it with a grapheme that fits in one code point, the results are:

2
2
7
7
true
true
Sep 06 2018
next sibling parent Daniel Kozak <kozzi11 gmail.com> writes:
On Thu, Sep 6, 2018 at 4:45 PM Dukc via Digitalmars-d <digitalmars-d puremagic.com> wrote:

 On Thursday, 6 September 2018 at 14:17:28 UTC, aliak wrote:
 // D
 auto a = "á";
 auto b = "á"; // decomposed: a + combining acute (U+0301)
 auto c = "\u200B";
 auto x = a ~ c ~ a;
 auto y = b ~ c ~ b;

 writeln(a.length); // 2 wtf
 writeln(b.length); // 3 wtf
 writeln(x.length); // 7 wtf
 writeln(y.length); // 9 wtf

 writeln(a == b); // false wtf
 writeln("ááááááá".canFind("á")); // false wtf

 I had to copy-paste that because I wondered how the last two can
 be false. They are because á is encoded differently. If you
 replace all occurrences of it with a grapheme that fits in one
 code point, the results are:

 2
 2
 7
 7
 true
 true

import std.stdio;
import std.algorithm : canFind;
import std.uni : normalize;

void main()
{
    auto a = "á".normalize;
    auto b = "á".normalize; // decomposed: a + combining acute (U+0301)
    auto c = "\u200B".normalize;
    auto x = a ~ c ~ a;
    auto y = b ~ c ~ b;

    writeln(a.length); // 2
    writeln(b.length); // 2
    writeln(x.length); // 7
    writeln(y.length); // 7

    writeln(a == b); // true
    writeln("ááááááá".canFind("á".normalize)); // true
}
Sep 06 2018
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Sep 06, 2018 at 02:42:58PM +0000, Dukc via Digitalmars-d wrote:
 On Thursday, 6 September 2018 at 14:17:28 UTC, aliak wrote:
 // D
 auto a = "á";
 auto b = "á";
 auto c = "\u200B";
 auto x = a ~ c ~ a;
 auto y = b ~ c ~ b;
 
 writeln(a.length); // 2 wtf
 writeln(b.length); // 3 wtf
 writeln(x.length); // 7 wtf
 writeln(y.length); // 9 wtf
[...]

This is an unfair comparison. In the Swift version you used .count, but here you used .length, which is the length of the array, NOT the number of characters or whatever you expect it to be. You should rather use .count and specify exactly what you want to count, e.g., byCodePoint or byGrapheme.

I suspect the Swift version will give you unexpected results if you did something like compare "á" to "a\u301", for example (which, in case it isn't obvious, are visually identical to each other, and as far as an end user is concerned, should only count as 1 grapheme). Not even normalization will help you if you have a string like "a\u301\u302": in that case, the *only* correct way to count the number of visual characters is byGrapheme, and I highly doubt Swift's .count will give you the correct answer in that case. (I expect that Swift's .count will count code points, as is the usual default in many languages, which is unfortunately wrong when you're thinking about visual characters, which are called graphemes in Unicode parlance.)

And even in your given example, what should .count return when there's a zero-width character? If you're counting the number of visual places taken by the string (e.g., you're trying to align output in a fixed-width terminal), then *both* versions of your code are wrong, because zero-width characters do not occupy any space when displayed. If you're counting the number of code points, though, e.g., to allocate the right buffer size to convert to dstring, then you want to count the zero-width character as 1 rather than 0. And that's not to mention double-width characters, which should count as 2 if you're outputting to a fixed-width terminal.

Again I say, you need to know how Unicode works. Otherwise you can easily deceive yourself to think that your code (both in D and in Swift and in any other language) is correct, when in fact it will fail miserably when it receives input that you didn't think of.

Unicode is NOT ASCII, and you CANNOT assume there's a 1-to-1 mapping between "characters" and display length. Or a 1-to-1 mapping between any of the various concepts of string "length", in fact. In ASCII, array length == number of code points == number of graphemes == display width. In Unicode, array length != number of code points != number of graphemes != display width.

Code written by anyone who does not understand this is WRONG, because you will inevitably end up using the wrong value for the wrong thing: e.g., array length for number of code points, or number of code points for display length. Not even .byGrapheme will save you here; you *need* to understand that zero-width and double-width characters exist, and what they imply for display width. You *need* to understand the difference between code points and graphemes. There is no single default that will work in every case, because there are DIFFERENT CORRECT ANSWERS depending on what your code is trying to accomplish. Pretending that you can just brush all this detail under the rug of a single number is just deceiving yourself, and will inevitably result in wrong code that will fail to handle Unicode input correctly.


T

-- 
It's amazing how careful choice of punctuation can leave you hanging:
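To make the "array length != code points != graphemes != display width" point concrete, here's a small Python sketch (Python only for illustration; the properties queried are plain Unicode character data):

```python
import unicodedata

s = "a\u0301\u4e16"  # decomposed á followed by the CJK character 世

# Three different "lengths" for the same string:
print(len(s.encode("utf-8")))  # 6 UTF-8 code units (1 + 2 + 3 bytes)
print(len(s))                  # 3 code points
# The grapheme count would be 2: the accent combines with the "a".

# Display width is yet another axis: 世 has the East Asian "Wide"
# property and takes two columns in a fixed-width terminal.
print(unicodedata.east_asian_width("\u4e16"))  # 'W' (wide)
```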
Sep 06 2018
next sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 6 September 2018 at 16:44:11 UTC, H. S. Teoh wrote:
 On Thu, Sep 06, 2018 at 02:42:58PM +0000, Dukc via 
 Digitalmars-d wrote:
 On Thursday, 6 September 2018 at 14:17:28 UTC, aliak wrote:
 // D
 auto a = "á";
 auto b = "á";
 auto c = "\u200B";
 auto x = a ~ c ~ a;
 auto y = b ~ c ~ b;
 
 writeln(a.length); // 2 wtf
 writeln(b.length); // 3 wtf
 writeln(x.length); // 7 wtf
 writeln(y.length); // 9 wtf
 [...] This is an unfair comparison. In the Swift version you 
 used .count, but here you used .length, which is the length of 
 the array, NOT the number of characters or whatever you expect 
 it to be. [...] (I expect that Swift's .count will count code 
 points, as is the usual default in many languages, which is 
 unfortunately wrong when you're thinking about visual 
 characters, which are called graphemes in Unicode parlance.)
No, Swift counts grapheme clusters by default, so it gives 1. I suggest you read the linked Swift chapter above. I think it's the wrong choice for performance, but they chose to emphasize intuitiveness for the common case. I agree with most of the rest of what you wrote about programmers having no silver bullet to avoid Unicode's and languages' complexity.
Sep 06 2018
parent RhyS <sale rhysoft.com> writes:
On Thursday, 6 September 2018 at 17:19:01 UTC, Joakim wrote:
 No, Swift counts grapheme clusters by default, so it gives 1. I 
 suggest you read the linked Swift chapter above. I think it's 
 the wrong choice for performance, but they chose to emphasize 
 intuitiveness for the common case.
I'd like to point out that Swift spent a lot of time reworking how strings are handled. If my memory serves me well, they reworked strings from version 2 to 3 and finalized them in version 4.
 Swift 4 includes a faster, easier to use String implementation 
 that retains Unicode correctness and adds support for creating, 
 using and managing substrings.
That took them somewhere along the lines of two years to get string handling to an acceptable and predictable state. And it annoyed the Swift user base greatly, but a lot of changes got made to reach a stable API. Being honest, I personally find Swift an easier language, despite it lacking IDE support on several platforms and having no official Windows compiler.
Sep 06 2018
prev sibling parent reply aliak <something something.com> writes:
On Thursday, 6 September 2018 at 16:44:11 UTC, H. S. Teoh wrote:
 On Thu, Sep 06, 2018 at 02:42:58PM +0000, Dukc via 
 Digitalmars-d wrote:
 On Thursday, 6 September 2018 at 14:17:28 UTC, aliak wrote:
 // D
 auto a = "á";
 auto b = "á";
 auto c = "\u200B";
 auto x = a ~ c ~ a;
 auto y = b ~ c ~ b;
 
 writeln(a.length); // 2 wtf
 writeln(b.length); // 3 wtf
 writeln(x.length); // 7 wtf
 writeln(y.length); // 9 wtf
 [...] This is an unfair comparison. In the Swift version you 
 used .count, but here you used .length, which is the length of 
 the array, NOT the number of characters or whatever you expect 
 it to be. [...]

 And even in your given example, what should .count return when 
 there's a zero-width character? If you're counting the number of 
 visual places taken by the string (e.g., you're trying to align 
 output in a fixed-width terminal), then *both* versions of your 
 code are wrong, because zero-width characters do not occupy any 
 space when displayed. [...]

 Again I say, you need to know how Unicode works. Otherwise you 
 can easily deceive yourself to think that your code (both in D 
 and in Swift and in any other language) is correct, when in fact 
 it will fail miserably when it receives input that you didn't 
 think of. [...] There is no single default that will work in 
 every case, because there are DIFFERENT CORRECT ANSWERS 
 depending on what your code is trying to accomplish. [...]
It's a totally fair comparison. .count in Swift is the equivalent of .length in D; you use that to get the size of an array, etc. They've just implemented string.length as string.byGrapheme.walkLength. So it's intuitively correct (and yes, slower). If you didn't want the default, though, you could also specify what "view" over the characters you want. E.g.

let a = "á̂"
a.count // 1 <-- Yes, it is exactly as expected.
a.unicodeScalars.count // 3
a.utf8.count // 5

I don't really see any issues with a zero-width character. If you want to deal with screen width (i.e. pixel space), that's not the same as how many characters are in a string. And it doesn't matter whether you go byGrapheme or byCodePoint or byCodeUnit, because none of those represent a single column on screen. A zero-width character is 0 *width* but it's still *one* character. There's no .length/size/count in any language (that I've heard of) that'll give you your screen space from their string type. You query the font API for that, as it depends on font size, kerning, style and face.

And again, I agree you need to know how unicode works. I don't argue that at all. I'm just saying that having the default be incorrect for application logic is just silly, and when people have to do things like string.representation.normalize.byGrapheme or whatever to search for a character in a string *correctly*... well, just, ARGH!

D makes the code-point case the default and hence that becomes the simplest to use. But unfortunately, the only thing I can think of that requires code point representations is when dealing specifically with unicode algorithms (normalization, etc). Here's a good read on code points: https://manishearth.github.io/blog/2017/01/14/stop-ascribing-meaning-to-unicode-code-points/ - tl;dr: application logic does not need or want to deal with code points. For speed, code units work, and for correctness, graphemes work.

Yes, you will fail miserably when you receive input you did not expect. That's always true.
That's why we have APIs that make it easier or harder to fail. Expecting people to be unicode experts before using unicode is also unreasonable - or rather, it just makes it easier to fail, much easier. I sit next to one of the guys who worked on unicode in Qt and he couldn't explain the difference between a grapheme and an extended grapheme cluster... I'm not saying I can btw... I'm just saying unicode is frikkin hard. And we don't need APIs making it harder to get right - which is exactly what non-correct-by-default APIs do.

To boil it down to one sentence: I think it's silly to have a string type that is advertised as unicode but optimized for latin1-ish text, because people will use it for unicode and get incorrect results from its naturally intuitive usage.

Cheers,
- Ali
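The grapheme-first counting argued for above can be roughly sketched in a few lines of Python. This is a deliberate simplification, not full UAX #29 grapheme clustering (it ignores ZWJ sequences, Hangul jamo, and regional indicators); it only merges combining marks with their base character, which is enough for the accent examples in this thread:

```python
import unicodedata

def approx_graphemes(s: str) -> int:
    # Count a new user-perceived character at every code point that is
    # not a combining mark. A simplification of UAX #29 grapheme
    # clustering, but enough to handle combining accents.
    return sum(1 for ch in s if unicodedata.combining(ch) == 0)

print(approx_graphemes("\u00e1"))         # 1: precomposed á
print(approx_graphemes("a\u0301"))        # 1: decomposed á
print(approx_graphemes("a\u0301\u0302"))  # 1: á with two accents
# "á" + zero-width space + "á" counts as 3, matching Swift's x.count:
print(approx_graphemes("a\u0301\u200ba\u0301"))  # 3
```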
Sep 06 2018
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, September 6, 2018 1:04:45 PM MDT aliak via Digitalmars-d wrote:
 D makes the code-point case default and hence that becomes the
 simplest to use. But unfortunately, the only thing I can think of
 that requires code point representations is when dealing
 specifically with unicode algorithms (normalization, etc). Here's
 a good read on code points:
 https://manishearth.github.io/blog/2017/01/14/stop-ascribing-meaning-to-unicode-code-points/ -

 tl;dr: application logic does not need or want to deal with code
 points. For speed units work, and for correctness, graphemes work.
I think that it's pretty clear that code points are objectively the worst level to be the default. Unfortunately, changing it to _anything_ else is not going to be an easy feat at this point. But if we can first ensure that Phobos in general doesn't rely on it (i.e. in general, it can deal with ranges of char, wchar, dchar, or graphemes correctly rather than assuming that all ranges of characters are ranges of dchar), then maybe we can figure something out.

Unfortunately, while some work has been done towards that, what's mostly happened is that folks have complained about auto-decoding without doing much to improve the current situation. There's a lot more to this than simply ripping out auto-decoding, even if every D user on the planet agreed that outright breaking almost every existing D program to get rid of auto-decoding was worth it. But as with too many things around here, there's a lot more talking than working. And actually, as such, I should probably stop discussing this and go do something useful.

- Jonathan M Davis
Sep 06 2018
next sibling parent reply aliak <something something.com> writes:
On Thursday, 6 September 2018 at 20:15:22 UTC, Jonathan M Davis 
wrote:
 On Thursday, September 6, 2018 1:04:45 PM MDT aliak via 
 Digitalmars-d wrote:
 D makes the code-point case default and hence that becomes the
 simplest to use. But unfortunately, the only thing I can think 
 of
 that requires code point representations is when dealing
 specifically with unicode algorithms (normalization, etc). 
 Here's
 a good read on code points:
 https://manishearth.github.io/blog/2017/01/14/stop-ascribing-meaning-to-unicode-code-points/ -

 tl;dr: application logic does not need or want to deal with 
 code points. For speed units work, and for correctness, 
 graphemes work.
 I think that it's pretty clear that code points are objectively 
 the worst level to be the default. Unfortunately, changing it to 
 _anything_ else is not going to be an easy feat at this point. 
 But if we can first ensure that Phobos in general doesn't rely 
 on it (i.e. in general, it can deal with ranges of char, wchar, 
 dchar, or graphemes correctly rather than assuming that all 
 ranges of characters are ranges of dchar), then maybe we can 
 figure something out. [...]
Is there a unittest somewhere in Phobos that one can be pointed to which shows the handling of these 4 variations you say should be dealt with first? Or maybe a PR that did some of this work that one could investigate? I ask so I can see in code what it means to make something not rely on autodecoding and deal with ranges of char, wchar, dchar or graphemes.

Or maybe a current "easy" bugzilla issue that one could try a hand at?
Sep 06 2018
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, September 6, 2018 3:15:59 PM MDT aliak via Digitalmars-d wrote:
 On Thursday, 6 September 2018 at 20:15:22 UTC, Jonathan M Davis

 wrote:
 On Thursday, September 6, 2018 1:04:45 PM MDT aliak via

 Digitalmars-d wrote:
 D makes the code-point case default and hence that becomes the
 simplest to use. But unfortunately, the only thing I can think
 of
 that requires code point representations is when dealing
 specifically with unicode algorithms (normalization, etc).
 Here's
 a good read on code points:
 https://manishearth.github.io/blog/2017/01/14/stop-ascribing-meaning-to-unicode-code-points/ -

 tl;dr: application logic does not need or want to deal with
 code points. For speed units work, and for correctness,
 graphemes work.
 I think that it's pretty clear that code points are objectively 
 the worst level to be the default. [...] But as with too many 
 things around here, there's a lot more talking than working.
Is there a unittest somewhere in phobos you know that one can be pointed to that shows the handling of these 4 variations you say should be dealt with first? Or maybe a PR that did some of this work that one could investigate? I ask so I can see in code what it means to make something not rely on autodecoding and deal with ranges of char, wchar, dchar or graphemes. Or a current "easy" bugzilla issue maybe that one could try a hand at?
Not really. The handling of this has generally been too ad-hoc. There are plenty of examples of handling different string types, and there are a few handling different ranges of character types, but there's a distinct lack of tests involving graphemes. And the correct behavior for each is going to depend on what exactly the function does - e.g. almost certainly, the correct thing for filter to do is to not do anything special for ranges of characters at all and just filter on the element type of the range (even though it would almost always be incorrect to filter a range of char unless it's known to be all ASCII), while on the other hand, find is clearly designed to handle different encodings. So, it needs to be able to find a dchar or grapheme in a range of char. And of course, there's the issue of how normalization should be handled (if at all).

A number of the tests in std.utf and std.string do a good job of testing Unicode strings of varying encodings, and std.utf does a good job overall of testing ranges of char, wchar, and dchar which aren't strings, but I'm not sure that anything in Phobos outside of std.uni currently does anything with ranges of graphemes. std.conv.to does have some tests for ranges of char, wchar, and dchar due to a bug fix. e.g.

// bugzilla 15800
@safe unittest
{
    import std.utf : byCodeUnit, byChar, byWchar, byDchar;

    assert(to!int(byCodeUnit("10")) == 10);
    assert(to!int(byCodeUnit("10"), 10) == 10);
    assert(to!int(byCodeUnit("10"w)) == 10);
    assert(to!int(byCodeUnit("10"w), 10) == 10);

    assert(to!int(byChar("10")) == 10);
    assert(to!int(byChar("10"), 10) == 10);
    assert(to!int(byWchar("10")) == 10);
    assert(to!int(byWchar("10"), 10) == 10);
    assert(to!int(byDchar("10")) == 10);
    assert(to!int(byDchar("10"), 10) == 10);
}

but there are no grapheme tests, and no Unicode characters are involved (though I'm not sure that much in std.conv really needs to worry about Unicode characters).
So, there are tests scattered all over the place which do pieces of what they need to be doing, but I'm not sure that there are currently any that test the full range of character ranges that they really need to be testing. As with testing reference type ranges, such tests have generally been added only when fixing a specific bug, and there hasn't been a sufficient effort to just go through all of the affected functions and add appropriate tests. And unfortunately, unlike with reference type ranges, the correct behavior of a function when faced with ranges of different character types is going to be highly dependent on what it does. Some functions shouldn't be doing anything special for processing ranges of characters; some shouldn't be doing anything special for processing arbitrary ranges of characters, but they still need to do something special for strings because of efficiency issues caused by auto-decoding; and yet others need to actually take Unicode into account and operate on each range type differently depending on whether it's a range of code units, code points, or graphemes. So, completely aside from auto-decoding issues, it's a bit of a daunting task.

I keep meaning to take the time to work on it. I've done some of the critical work for supporting arbitrary ranges of char, wchar, and dchar rather than just string types (as have some other folks), but I haven't spent the time to start going through the functions one by one and adding the appropriate tests and fixes, and no one else has gone that far either. So, I can't really point towards a specific set of tests and say "here, do what these do." And even if I could, whether what those tests do would be correct for another function would depend on what the functions do. So, sorry that I can't be more helpful.
Actually, if you're looking for something related to this to do, and you don't feel that you know enough to just start adding tests, you could try byCodeUnit, byDchar, and byGrapheme with various functions and see what happens. If the function doesn't even compile (which will probably be the case at least some of the time), then that's an easy bug report. If the function does compile, then it will require a greater understanding to know whether it's doing the right thing, but in at least some cases, it may be obvious, and if the result is obviously wrong, you can create a bug report for that.

Ultimately though, a pretty solid understanding of ranges and Unicode is going to be required to write a lot of these tests. And worse, a pretty solid understanding of ranges and Unicode is going to be required to use any of these functions correctly, even if they all work correctly and have all of the necessary tests to prove it. Unicode is just plain too complicated, and trying to make things "just work" with it is frequently difficult - especially if efficiency matters, but even when efficiency doesn't matter, it's not always obvious how to make it "just work." :(

- Jonathan M Davis
Sep 08 2018
prev sibling parent reply Laeeth Isharc <laeeth laeeth.com> writes:
On Thursday, 6 September 2018 at 20:15:22 UTC, Jonathan M Davis 
wrote:
 On Thursday, September 6, 2018 1:04:45 PM MDT aliak via 
 Digitalmars-d wrote:
 D makes the code-point case default and hence that becomes the
 simplest to use. But unfortunately, the only thing I can think 
 of
 that requires code point representations is when dealing
 specifically with unicode algorithms (normalization, etc). 
 Here's
 a good read on code points:
 https://manishearth.github.io/blog/2017/01/14/stop-ascribing-meaning-to-unicode-code-points/ -

 tl;dr: application logic does not need or want to deal with 
 code points. For speed units work, and for correctness, 
 graphemes work.
 I think that it's pretty clear that code points are objectively 
 the worst level to be the default. Unfortunately, changing it to 
 _anything_ else is not going to be an easy feat at this point. 
 [...] There's a lot more to this than simply ripping out 
 auto-decoding even if every D user on the planet agreed that 
 outright breaking almost every existing D program to get rid of 
 auto-decoding was worth it. But as with too many things around 
 here, there's a lot more talking than working. [...]
A tutorial page linked from the front page with some examples would go a long way to making it easier for people. If I had time and understood strings enough to explain to others I would try to make a start, but unfortunately neither are true. And if we are doing things right with RCString, then isn't it easier to make the change with that first - which is new so can't break code - and in some years when people are used to working that way update Phobos (compiler switch in beginning and have big transition a few years after that). Isn't this one of the challenges created by the tension between D being both a high-level and low-level language. The higher the aim, the more problems you will encounter getting there. That's okay. And isn't the obstacle to breaking auto-decoding because it seems to be a monolithic challenge of overwhelming magnitude, whereas if we could figure out some steps to eat the elephant one mouthful at a time (which might mean start with RCString) then it will seem less intimidating. It will take years anyway perhaps - but so what?
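As a sketch of what the opening example of such a tutorial page might look like (illustrative only, no such docs page exists): the one fact it has to land first is that a single user-perceived character can simultaneously be three code units, two code points, and one grapheme.

```d
// One user-perceived character at D's three iteration levels.
import std.range : walkLength;
import std.uni : byGrapheme;
import std.utf : byCodeUnit;

void main()
{
    auto s = "a\u0301";                    // 'a' + combining acute, renders as "á"

    assert(s.byCodeUnit.walkLength == 3);  // UTF-8 code units (storage)
    assert(s.walkLength == 2);             // auto-decoded code points (the default)
    assert(s.byGrapheme.walkLength == 1);  // graphemes (what the user sees)
}
```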
Sep 08 2018
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Saturday, September 8, 2018 8:05:04 AM MDT Laeeth Isharc via Digitalmars-d wrote:
 On Thursday, 6 September 2018 at 20:15:22 UTC, Jonathan M Davis

 wrote:
 On Thursday, September 6, 2018 1:04:45 PM MDT aliak via

 Digitalmars-d wrote:
 D makes the code-point case default and hence that becomes the
 simplest to use. But unfortunately, the only thing I can think
 of
 that requires code point representations is when dealing
 specifically with unicode algorithms (normalization, etc).
 Here's
 a good read on code points:
 https://manishearth.github.io/blog/2017/01/14/stop-ascribing-meaning-to-unicode-code-points/ -

 tl;dr: application logic does not need or want to deal with
 code points. For speed units work, and for correctness,
 graphemes work.
I think that it's pretty clear that code points are objectively the worst level to be the default. Unfortunately, changing it to _anything_ else is not going to be an easy feat at this point. But if we can first ensure that Phobos in general doesn't rely on it (i.e. in general, it can deal with ranges of char, wchar, dchar, or graphemes correctly rather than assuming that all ranges of characters are ranges of dchar), then maybe we can figure something out. Unfortunately, while some work has been done towards that, what's mostly happened is that folks have complained about auto-decoding without doing much to improve the current situation. There's a lot more to this than simply ripping out auto-decoding even if every D user on the planet agreed that outright breaking almost every existing D program to get rid of auto-decoding was worth it. But as with too many things around here, there's a lot more talking than working. And actually, as such, I should probably stop discussing this and go do something useful.
A tutorial page linked from the front page with some examples would go a long way to making it easier for people. If I had time and understood strings enough to explain to others I would try to make a start, but unfortunately neither are true.
Writing up an article on proper Unicode handling in D is on my todo list, but my todo list of things to do for D is long enough that I don't know when I'm going to get to it.
 And if we are doing things right with RCString, then isn't it
 easier to make the change with that first - which is new so can't
 break code - and in some years when people are used to working
 that way update Phobos (compiler switch in beginning and have big
 transition a few years after that).
Well, I'm not actually convinced that what we have for RCString right now _is_ doing the right thing, but even if it is, that doesn't fix the issue that string doesn't do the right thing, and code needs to take that into account - especially if it's generic code. The better job we do at making Phobos code work with arbitrary ranges of characters, the less of an issue that is, but you're still pretty much forced to deal with it in a number of cases if you want your code to be efficient or if you want a function to be able to accept a string and return a string rather than a wrapper range. Using RCString in your code would reduce how much you had to worry about it, but it doesn't completely solve the problem. And if you're doing stuff like writing a library for other people to use, then you definitely can't just ignore the issue. So, an RCString that handles Unicode sanely will definitely help, but it's not really a fix. And plenty of code is still going to be written to use strings (especially when -betterC is involved). RCString is going to be another option, but it's not going to replace string. Even if RCString became the most common string type to use (which I question is going to ever happen), dynamic arrays of char, wchar, etc. are still going to exist in the language and are still going to have to be handled correctly. Phobos won't be able to assume that all of the code out there is using RCString and not string. The combination of improving Phobos so that it works properly with ranges of characters in general (and not just strings or ranges of dchar) and having an alternate string type that does the right thing will definitely help and need to be done if we have any hope of actually removing auto-decoding, but even with all of that, I don't see how it would be possible to really deprecate the old behavior. 
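A sketch of what "work with arbitrary ranges of characters" means in practice for a library author (my own illustration, not Phobos code): constrain on the element type instead of assuming string, so plain strings, byCodeUnit views, and RCString-like ranges are all accepted.

```d
import std.range.primitives : ElementEncodingType, isInputRange;
import std.traits : isSomeChar;
import std.utf : byCodeUnit;

// Accepts any input range of char/wchar/dchar instead of assuming
// that all character ranges are ranges of dchar.
size_t countOccurrences(R)(R r, dchar needle)
    if (isInputRange!R && isSomeChar!(ElementEncodingType!R))
{
    size_t n;
    foreach (c; r)       // iterates at whatever level the range provides
        if (c == needle) // note: at the code-unit level this only
            ++n;         // matches ASCII needles correctly
    return n;
}

void main()
{
    assert(countOccurrences("hello", 'l') == 2);            // plain string
    assert(countOccurrences("hello".byCodeUnit, 'l') == 2); // raw code units
}
```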
We _might_ be able to do something if we're willing to deprecate std.algorithm and std.range (since std.range gives you the current definitions of the range primitives for arrays, and std.algorithm publicly imports std.range), but you still then have the problem of two different definitions of the range primitives for arrays and all of the problems that that causes (even if it's only for the deprecation period). So, strings would end up behaving drastically differently with range-based functions depending on which module you imported. I don't know that that problem is insurmountable, but it's not at all clear that there is a path to fixing auto-decoding that doesn't outright break old code. If we're willing to break old code, then we could definitely do it, but if we don't want to risk serious problems, we really need a way to have a more gradual transition, and that's the big problem that no one has a clean solution for.
 Isn't this one of the challenges created by the tension between D
 being both a high-level and low-level language.  The higher the
 aim, the more problems you will encounter getting there.  That's
 okay.

 And isn't the obstacle to breaking auto-decoding because it seems
 to be a monolithic challenge of overwhelming magnitude, whereas
 if we could figure out some steps to eat the elephant one
 mouthful at a time (which might mean start with RCString) then it
 will seem less intimidating.  It will take years anyway perhaps -
 but so what?
Well, I think that it's clear at this point that before we can even consider getting rid of auto-decoding, we need to make sure that Phobos in general works with arbitrary ranges of code units, code points, and graphemes. With that done, we would have a standard library that could work with strings as ranges of code units if that's what they were. So, in theory, at that point, the only issue would be how on earth to make strings work as ranges of code units without just pulling the rug out from under everyone. I'm not at all convinced that that's possible, but I am very much convinced that unless we first improve Phobos so that it's fully correct in spite of the auto-decoding issues, we definitely can't remove auto-decoding. And as a group, we haven't done a good enough job with that. Most of us agree that auto-decoding was a huge mistake, but there hasn't been enough work done towards fixing what we have, and there's plenty of work there that needs to be done whether we later try to remove auto-decoding or not. - Jonathan M Davis
Sep 08 2018
prev sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, September 6, 2018 10:44:11 AM MDT H. S. Teoh via Digitalmars-d 
wrote:
 On Thu, Sep 06, 2018 at 02:42:58PM +0000, Dukc via Digitalmars-d wrote:
 On Thursday, 6 September 2018 at 14:17:28 UTC, aliak wrote:
 // D
 auto a = "á";
 auto b = "á";
 auto c = "\u200B";
 auto x = a ~ c ~ a;
 auto y = b ~ c ~ b;

 writeln(a.length); // 2 wtf
 writeln(b.length); // 3 wtf
 writeln(x.length); // 7 wtf
 writeln(y.length); // 9 wtf
[...] This is an unfair comparison. In the Swift version you used .count, but here you used .length, which is the length of the array, NOT the number of characters or whatever you expect it to be. You should rather use .count and specify exactly what you want to count, e.g., byCodePoint or byGrapheme. I suspect the Swift version will give you unexpected results if you did something like compare "á" to "a\u0301", for example (which, in case it isn't obvious, are visually identical to each other, and as far as an end user is concerned, should only count as 1 grapheme). Not even normalization will help you if you have a string like "a\u0301\u0302": in that case, the *only* correct way to count the number of visual characters is byGrapheme, and I highly doubt Swift's .count will give you the correct answer in that case. (I expect that Swift's .count will count code points, as is the usual default in many languages, which is unfortunately wrong when you're thinking about visual characters, which are called graphemes in Unicode parlance.) And even in your given example, what should .count return when there's a zero-width character? If you're counting the number of visual places taken by the string (e.g., you're trying to align output in a fixed-width terminal), then *both* versions of your code are wrong, because zero-width characters do not occupy any space when displayed. If you're counting the number of code points, though, e.g., to allocate the right buffer size to convert to dstring, then you want to count the zero-width character as 1 rather than 0. And that's not to mention double-width characters, which should count as 2 if you're outputting to a fixed-width terminal. Again I say, you need to know how Unicode works. Otherwise you can easily deceive yourself to think that your code (both in D and in Swift and in any other language) is correct, when in fact it will fail miserably when it receives input that you didn't think of. 
Unicode is NOT ASCII, and you CANNOT assume there's a 1-to-1 mapping between "characters" and display length. Or 1-to-1 mapping between any of the various concepts of string "length", in fact. In ASCII, array length == number of code points == number of graphemes == display width. In Unicode, array length != number of code points != number of graphemes != display width. Code written by anyone who does not understand this is WRONG, because you will inevitably end up using the wrong value for the wrong thing: e.g., array length for number of code points, or number of code points for display length. Not even .byGrapheme will save you here; you *need* to understand that zero-width and double-width characters exist, and what they imply for display width. You *need* to understand the difference between code points and graphemes. There is no single default that will work in every case, because there are DIFFERENT CORRECT ANSWERS depending on what your code is trying to accomplish. Pretending that you can just brush all this detail under the rug of a single number is just deceiving yourself, and will inevitably result in wrong code that will fail to handle Unicode input correctly.
Indeed. And unfortunately, the net result is that a large percentage of the string-processing code out there is going to be wrong, and I don't think that there's any way around that, because Unicode is simply too complicated for the average programmer to understand it (sad as that may be) - especially when most of them don't want to have to understand it. Really, I'd say that there are only three options that even might be sane if you really have the flexibility to design a proper solution:

1. Treat strings as ranges of code units by default.
2. Don't allow strings to be ranges, to be iterated, or indexed. They're opaque types.
3. Treat strings as ranges of graphemes.

If strings are treated as ranges of code units by default (particularly if they're UTF-8), you'll get failures very quickly if you're dealing with non-ASCII and you screw up the Unicode handling. It's also by far the most performant solution and in many cases is exactly the right thing to do. Obviously, something like byCodePoint or byGrapheme would then be needed in the cases where code points or graphemes are the appropriate level to iterate at. If strings are opaque types (with ways to get ranges over code units, code points, etc.), that mostly works in that it forces you to at least try to understand Unicode well enough to make sane choices about how you iterate over the string. However, it doesn't completely get away from the issue of the default, because of ==. It would be a royal pain if == didn't work, and if it does work, you then have the question of what it's comparing. Code units? Code points? Graphemes? Assuming that the representation is always the same encoding, comparing code points wouldn't make any sense, but you'd still have the question of code units or graphemes. As such, I'm not sure that an opaque type really makes the most sense (though it's suggested often enough). 
If strings are treated as ranges of graphemes, then that should be correct for everything that doesn't care about the visual representation (and thus doesn't care about the display width of characters), but it would be highly inefficient to do most things at the grapheme level, and it would likely have many of the same problems that we have with strings now with regards to stuff like them not being able to be random-access and how they don't really work as output ranges. So, if we were doing things from scratch, and it were up to me, I would basically go with what Walter originally tried to do and make strings be arrays of code units but with them also being ranges of code units - thereby avoiding all of the pain that we get with trying to claim that strings don't have capabilities that they clearly do have (such as random-access or length). And then of course, all of the appropriate helper functions would be available for the different levels of Unicode handling. I think that this is the solution that quite a few of us want - though some have expressed interest in an opaque string type, and I think that that's the direction that RCString (or whatever it's called) may be going. Unfortunately, right now, it's not looking like we're going to be able to implement what we'd like here because of the code breakage issues in removing auto-decoding. RCString may very well end up doing the right thing, and I know that Andrei wants to then encourage it to be the default string for everyone to use (much as we don't all agree with that idea), but we're still stuck with auto-decoding with regular strings and having to worry about it when writing generic code. _Maybe_ someone will be able to come up with a sane solution for moving away from auto-decoding, but it's not seeming likely at the moment. 
Either way, what needs to be done first is making sure that Phobos in general works with ranges of char, wchar, dchar, and graphemes rather than assuming that all ranges of characters are ranges of dchar. Fortunately, some work has been done towards that, but it's not yet true of Phobos in general, and it needs to be. Once it is, then the impact of auto-decoding is reduced in general, and with Phobos depending on it as little as possible, it then makes it saner to discuss how we might remove auto-decoding. I'm not at all convinced that it would make it possible to sanely remove it, but until that work is done, we definitely can't remove it regardless. And actually, until that work is done, the workarounds for auto-decoding (e.g. byCodeUnit) don't work as well as they should. I've done some of that work (as have some others), but I really should figure out how to get through enough of my todo list that I can get more done towards that goal - particularly since I don't think that anyone is actively working the problem. For the most part, it's only been done when someone ran into a problem with a specific function, whereas in reality, we need to be adding the appropriate tests for all of the string-processing functions in Phobos and then ensure that they pass those tests. - Jonathan M Davis
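The kind of test coverage being asked for can be sketched like this (illustrative, not actual Phobos unittests): exercise one string-processing function at each iteration level and assert that the levels that should agree do agree.

```d
// One function, several levels; any line that fails to compile or
// returns the wrong answer is exactly the kind of bug the missing
// tests would catch.
import std.algorithm.searching : startsWith;
import std.utf : byCodeUnit, byDchar;

void main()
{
    auto s = "żółć rocks";

    assert(s.startsWith("żółć"));                       // auto-decoded (default)
    assert(s.byCodeUnit.startsWith("żółć".byCodeUnit)); // code units
    assert(s.byDchar.startsWith("żółć".byDchar));       // code points
}
```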
Sep 06 2018
prev sibling next sibling parent reply nkm1 <t4nk074 openmailbox.org> writes:
On Wednesday, 5 September 2018 at 07:48:34 UTC, Chris wrote:
 On Tuesday, 4 September 2018 at 21:36:16 UTC, Walter Bright 
 wrote:
 Autodecode - I've suffered under that, too. The solution was 
 fairly simple. Append .byCodeUnit to strings that would 
 otherwise autodecode. Annoying, but hardly a showstopper.
import std.array : array; import std.stdio : writefln; import std.uni : byCodePoint, byGrapheme; import std.utf : byCodeUnit; void main() { string first = "á"; writefln("%d", first.length); // prints 2 auto firstCU = "á".byCodeUnit; // type is `ByCodeUnitImpl` (!) writefln("%d", firstCU.length); // prints 2 auto firstGr = "á".byGrapheme.array; // type is `Grapheme[]` writefln("%d", firstGr.length); // prints 1 auto firstCP = "á".byCodePoint.array; // type is `dchar[]` writefln("%d", firstCP.length); // prints 1 dstring second = "á"; writefln("%d", second.length); // prints 1 (That was easy!) // DMD64 D Compiler v2.081.2 }
And this has what to do with autodecoding?
 Welcome to my world!
TBH, it looks like you're just confused about how Unicode works. None of that is something particular to D. You should probably address your concerns to the Unicode Consortium. Not that they care.
Sep 06 2018
parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 6 September 2018 at 08:44:15 UTC, nkm1 wrote:
 On Wednesday, 5 September 2018 at 07:48:34 UTC, Chris wrote:
 On Tuesday, 4 September 2018 at 21:36:16 UTC, Walter Bright 
 wrote:
 Autodecode - I've suffered under that, too. The solution was 
 fairly simple. Append .byCodeUnit to strings that would 
 otherwise autodecode. Annoying, but hardly a showstopper.
import std.array : array; import std.stdio : writefln; import std.uni : byCodePoint, byGrapheme; import std.utf : byCodeUnit; void main() { string first = "á"; writefln("%d", first.length); // prints 2 auto firstCU = "á".byCodeUnit; // type is `ByCodeUnitImpl` (!) writefln("%d", firstCU.length); // prints 2 auto firstGr = "á".byGrapheme.array; // type is `Grapheme[]` writefln("%d", firstGr.length); // prints 1 auto firstCP = "á".byCodePoint.array; // type is `dchar[]` writefln("%d", firstCP.length); // prints 1 dstring second = "á"; writefln("%d", second.length); // prints 1 (That was easy!) // DMD64 D Compiler v2.081.2 }
And this has what to do with autodecoding?
Nothing. I was just pointing out how awkward some basic things can be. Autodecoding just adds to it in the sense that it's a useless overhead but will keep string handling in a limbo forever and ever and ever.
 TBH, it looks like you're just confused about how Unicode 
 works. None of that is something particular to D. You should 
 probably address your concerns to the Unicode Consortium. Not 
 that they care.
I'm actually not confused since I've been dealing with Unicode (and encodings in general) for quite a while now. Although I'm not a Unicode expert, I know what the operations above do and why. I'd only expect a modern PL to deal with Unicode correctly and have some guidelines as to the nitty-gritty. And once again, it's the user's fault as in having some basic assumptions about how things should work. The user is just too stoooopid to use D properly - that's all. I know this type of behavior from the management of pubs and shops that had to close down, because nobody would go there anymore. Do you know the book "Crónica de una muerte anunciada" (Chronicle of a Death Foretold) by Gabriel García Márquez? "The central question at the core of the novella is how the death of Santiago Nasar was foreseen, yet no one tried to stop it."[1] [1] https://en.wikipedia.org/wiki/Chronicle_of_a_Death_Foretold#Key_themes
Sep 06 2018
parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 6 September 2018 at 09:35:27 UTC, Chris wrote:
 On Thursday, 6 September 2018 at 08:44:15 UTC, nkm1 wrote:
 On Wednesday, 5 September 2018 at 07:48:34 UTC, Chris wrote:
 On Tuesday, 4 September 2018 at 21:36:16 UTC, Walter Bright 
 wrote:
 Autodecode - I've suffered under that, too. The solution was 
 fairly simple. Append .byCodeUnit to strings that would 
 otherwise autodecode. Annoying, but hardly a showstopper.
import std.array : array; import std.stdio : writefln; import std.uni : byCodePoint, byGrapheme; import std.utf : byCodeUnit; void main() { string first = "á"; writefln("%d", first.length); // prints 2 auto firstCU = "á".byCodeUnit; // type is `ByCodeUnitImpl` (!) writefln("%d", firstCU.length); // prints 2 auto firstGr = "á".byGrapheme.array; // type is `Grapheme[]` writefln("%d", firstGr.length); // prints 1 auto firstCP = "á".byCodePoint.array; // type is `dchar[]` writefln("%d", firstCP.length); // prints 1 dstring second = "á"; writefln("%d", second.length); // prints 1 (That was easy!) // DMD64 D Compiler v2.081.2 }
And this has what to do with autodecoding?
Nothing. I was just pointing out how awkward some basic things can be. autodecoding just adds to it in the sense that it's a useless overhead but will keep string handling in a limbo forever and ever and ever.
 TBH, it looks like you're just confused about how Unicode 
 works. None of that is something particular to D. You should 
 probably address your concerns to the Unicode Consortium. Not 
 that they care.
I'm actually not confused since I've been dealing with Unicode (and encodings in general) for quite a while now. Although I'm not a Unicode expert, I know what the operations above do and why. I'd only expect a modern PL to deal with Unicode correctly and have some guidelines as to the nitty-gritty.
Since you understand Unicode well, enlighten us: what's the best default format to use for string iteration? You can argue that D chose the wrong default by having the stdlib auto-decode to code points in several places, and Walter and a host of the core D team would agree with you, and you can add me to the list too. But it's not clear there should be a default format at all, other than whatever you started off with, particularly for a programming language that values performance like D does, as each format choice comes with various speed vs. correctness trade-offs. Therefore, the programmer has to understand that complexity and make his own choice. You're acting like there's some obvious choice for how to handle Unicode that we're missing here, when the truth is that _no programming language knows how to handle unicode well_, since handling a host of world languages in a single format is _inherently unintuitive_ and has significant efficiency tradeoffs between the different formats.
 And once again, it's the user's fault as in having some basic 
 assumptions about how things should work. The user is just too 
 stoooopid to use D properly - that's all. I know this type of 
 behavior from the management of pubs and shops that had to 
 close down, because nobody would go there anymore.

 Do you know the book "Crónica de una muerte anunciada" 
 (Chronicle of a Death Foretold) by Gabriel García Márquez?

 "The central question at the core of the novella is how the 
 death of Santiago Nasar was foreseen, yet no one tried to stop 
 it."[1]

 [1] 
 https://en.wikipedia.org/wiki/Chronicle_of_a_Death_Foretold#Key_themes
You're not being fair here, Chris. I just saw this SO question that I think exemplifies how most programmers react to Unicode: "Trying to understand the subtleties of modern Unicode is making my head hurt. In particular, the distinction between code points, characters, glyphs and graphemes - concepts which in the simplest case, when dealing with English text using ASCII characters, all have a one-to-one relationship with each other - is causing me trouble. Seeing how these terms get used in documents like Matthias Bynens' JavaScript has a unicode problem or Wikipedia's piece on Han unification, I've gathered that these concepts are not the same thing and that it's dangerous to conflate them, but I'm kind of struggling to grasp what each term means. The Unicode Consortium offers a glossary to explain this stuff, but it's full of "definitions" like this: Abstract Character. A unit of information used for the organization, control, or representation of textual data. ... ... Character. ... (2) Synonym for abstract character. (3) The basic unit of encoding for the Unicode character encoding. ... ... Glyph. (1) An abstract form that represents one or more glyph images. (2) A synonym for glyph image. In displaying Unicode character data, one or more glyphs may be selected to depict a particular character. ... Grapheme. (1) A minimally distinctive unit of writing in the context of a particular writing system. ... Most of these definitions possess the quality of sounding very academic and formal, but lack the quality of meaning anything, or else defer the problem of definition to yet another glossary entry or section of the standard. So I seek the arcane wisdom of those more learned than I. How exactly do each of these concepts differ from each other, and in what circumstances would they not have a one-to-one relationship with each other?" 
https://stackoverflow.com/questions/27331819/whats-the-difference-between-a-character-a-code-point-a-glyph-and-a-grapheme Honestly, unicode is a mess, and I believe we will all have to dump the Unicode standard and start over one day. Until that fine day, there is no neat solution to how to handle it, no matter how much you'd like to think so. Also, much of the complexity actually comes from the complexity of the various language alphabets, so that cannot be waved away no matter what standard you come up with, though Unicode certainly adds more unneeded complexity on top, which is why it should be dumped.
Sep 06 2018
parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 6 September 2018 at 10:44:45 UTC, Joakim wrote:
[snip]
 You're not being fair here, Chris. I just saw this SO question 
 that I think exemplifies how most programmers react to Unicode:

 "Trying to understand the subtleties of modern Unicode is 
 making my head hurt. In particular, the distinction between 
 code points, characters, glyphs and graphemes - concepts which 
 in the simplest case, when dealing with English text using 
 ASCII characters, all have a one-to-one relationship with each 
 other - is causing me trouble.

 Seeing how these terms get used in documents like Matthias 
 Bynens' JavaScript has a unicode problem or Wikipedia's piece 
 on Han unification, I've gathered that these concepts are not 
 the same thing and that it's dangerous to conflate them, but 
 I'm kind of struggling to grasp what each term means.

 The Unicode Consortium offers a glossary to explain this stuff, 
 but it's full of "definitions" like this:

 Abstract Character. A unit of information used for the 
 organization, control, or representation of textual data. ...

 ...

 Character. ... (2) Synonym for abstract character. (3) The 
 basic unit of encoding for the Unicode character encoding. ...

 ...

 Glyph. (1) An abstract form that represents one or more glyph 
 images. (2) A synonym for glyph image. In displaying Unicode 
 character data, one or more glyphs may be selected to depict a 
 particular character.

 ...

 Grapheme. (1) A minimally distinctive unit of writing in the 
 context of a particular writing system. ...

 Most of these definitions possess the quality of sounding very 
 academic and formal, but lack the quality of meaning anything, 
 or else defer the problem of definition to yet another glossary 
 entry or section of the standard.

 So I seek the arcane wisdom of those more learned than I. How 
 exactly do each of these concepts differ from each other, and 
 in what circumstances would they not have a one-to-one 
 relationship with each other?"
 https://stackoverflow.com/questions/27331819/whats-the-difference-between-a-character-a-code-point-a-glyph-and-a-grapheme

 Honestly, unicode is a mess, and I believe we will all have to 
 dump the Unicode standard and start over one day. Until that 
 fine day, there is no neat solution to how to handle it, no 
 matter how much you'd like to think so. Also, much of the 
 complexity actually comes from the complexity of the various 
 language alphabets, so that cannot be waved away no matter what 
 standard you come up with, though Unicode certainly adds more 
 unneeded complexity on top, which is why it should be dumped.
One problem imo is that they mixed the terms up: "Grapheme: A minimally distinctive unit of writing in the context of a particular writing system." In linguistics a grapheme is not a single character like "á" or "g". It may also be a combination of characters like in English spelling <sh> ("s" + "h") that maps to a phoneme (e.g. ship, shut, shadow). In German this sound is written as <sch> as in "Schiff" (ship) (but not always, cf. "s" in "Stange"). Since Unicode is such a difficult beast to deal with, I'd say D (or any PL for that matter) needs, first and foremost, a clear policy about what's the default behavior - not ad hoc patches. Then maybe a strategy as to how the default behavior can be turned on and off, say for performance reasons. One way _could_ be a compiler switch to turn the default behavior on/off -unicode or -uni or -utf8 or whatever, or maybe better a library solution like `ustring`. If you need high performance and checks are no issue for the most part (web crawling, data harvesting etc), get rid of autodecoding. Once you need to check for character/grapheme correctness (e.g. translation tools) make it available through something like `to!ustring`. Whichever way: be clear about it. But don't let the unsuspecting user use `string` and get bitten by it.
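To make the `to!ustring` idea concrete, here is one hypothetical sketch of such a type. The name `UString`, its API, and the grapheme-by-default choice are all illustrative; nothing like this exists in Phobos.

```d
import std.range : walkLength;
import std.uni : byGrapheme;

// Hypothetical: a wrapper whose default length/iteration is
// grapheme-correct, with the raw UTF-8 still reachable when
// performance matters.
struct UString
{
    string raw;   // underlying UTF-8, zero-cost access

    size_t length() { return raw.byGrapheme.walkLength; }
    auto graphemes() { return raw.byGrapheme; }
}

void main()
{
    auto s = UString("a\u0301");   // 'a' + combining acute ("á")
    assert(s.raw.length == 3);     // three UTF-8 code units
    assert(s.length == 1);         // one user-perceived character
}
```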
Sep 06 2018
parent Chris <wendlec tcd.ie> writes:
On Thursday, 6 September 2018 at 11:19:14 UTC, Chris wrote:

 One problem imo is that they mixed the terms up: "Grapheme: A 
 minimally distinctive unit of writing in the context of a 
 particular writing system." In linguistics a grapheme is not a 
 single character like "á" or "g". It may also be a combination 
 of characters like in English spelling <sh> ("s" + "h") that 
 maps to a phoneme (e.g. ship, shut, shadow). In German this 
 sound is written as <sch> as in "Schiff" (ship) (but not 
 always, cf. "s" in "Stange").
Sorry, this should read "In linguistics a grapheme is not _necessarily_ _only_ a single character like "á" or "g"."
Sep 06 2018
prev sibling parent reply Guillaume Piolat <spam smam.org> writes:
On Wednesday, 5 September 2018 at 07:48:34 UTC, Chris wrote:
 import std.array : array;
 import std.stdio : writefln;
 import std.uni : byCodePoint, byGrapheme;
 import std.utf : byCodeUnit;

 void main() {

   string first = "á";

   writefln("%d", first.length);  // prints 2

   auto firstCU = "á".byCodeUnit; // type is `ByCodeUnitImpl` (!)

   writefln("%d", firstCU.length);  // prints 2

   auto firstGr = "á".byGrapheme.array;  // type is `Grapheme[]`

   writefln("%d", firstGr.length);  // prints 1

   auto firstCP = "á".byCodePoint.array; // type is `dchar[]`

   writefln("%d", firstCP.length);  // prints 1

   dstring second = "á";

   writefln("%d", second.length);  // prints 1 (That was easy!)

   // DMD64 D Compiler v2.081.2
 }
So Unicode in D works EXACTLY as expected, yet people in this thread act as if the house is on fire. D dying because of auto-decoding? Who can possibly think that in its right mind? The worst part of this forum is that suddenly everyone, by virtue of posting in a newsgroup, is an anointed language design expert. Let me break that to you: core developers are language experts. The rest of us are users, and yes, that doesn't necessarily make us qualified to design a language.
Sep 06 2018
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 6 September 2018 at 11:01:55 UTC, Guillaume Piolat 
wrote:
 Let me break that to you: core developers are language experts. 
 The rest of us are users, and that doesn't necessarily make us 
 qualified to design a language.
Who?
Sep 06 2018
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 6 September 2018 at 11:01:55 UTC, Guillaume Piolat 
wrote:

 So Unicode in D works EXACTLY as expected, yet people in this 
 thread act as if the house is on fire.
Expected by who? The Unicode expert or the user?
 D dying because of auto-decoding? Who can possibly think that 
 in its right mind?
Nobody, it's just another major issue to be fixed.
 The worst part of this forum is that suddenly everyone, by 
 virtue of posting in a newsgroup, is an anointed language 
 design expert.
 
 Let me break that to you: core developers are language experts. 
 The rest of us are users, and that doesn't necessarily make us 
 qualified to design a language.
Calm down. I for my part never said I was an expert on language design. Number one: experts do make mistakes too, there is nothing wrong with that. And autodecode is a good example of experts getting it wrong, because, you know, you cannot be an expert in all fields. I think the problem was that it was discovered too late. Number two: why shouldn't users be allowed to give feedback? Engineers and developers need feedback, else we'd still be using CLI, wouldn't we. The user doesn't need to be an expert to know what s/he likes and doesn't like and developers / engineers often have a different point of view as to what is important / annoying etc. That's why IT companies introduced customer service, because the direct interaction between developers and users would often end badly (disgruntled customers).
Sep 06 2018
parent reply Guillaume Piolat <spam smam.org> writes:
On Thursday, 6 September 2018 at 13:30:11 UTC, Chris wrote:
 And autodecode is a good example of experts getting it wrong, 
 because, you know, you cannot be an expert in all fields. I 
 think the problem was that it was discovered too late.
There are very valid reasons not to talk about auto-decoding again:

- it's too late to remove because breakage
- attempts at removing it were _already_ tried
- it has been debated to DEATH
- there is an easy work-around

So any discussion _now_ would have the very same structure of the discussion _then_, and would lead to the exact same result. It's quite tragic. And I urge the real D supporters to let such conversation die (topics debated to death) as soon as they appear.
 why shouldn't users be allowed to give feedback?
Straw-man. If we don't get over _some_ technical debate, the only thing that is achieved is a loss of time for everyone involved.
Sep 06 2018
next sibling parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 07/09/2018 2:30 AM, Guillaume Piolat wrote:
 On Thursday, 6 September 2018 at 13:30:11 UTC, Chris wrote:
 And autodecode is a good example of experts getting it wrong, because, 
 you know, you cannot be an expert in all fields. I think the problem 
 was that it was discovered too late.
There are very valid reasons not to talk about auto-decoding again:

- it's too late to remove because breakage
- attempts at removing it were _already_ tried
- it has been debated to DEATH
- there is an easy work-around

So any discussion _now_ would have the very same structure of the discussion _then_, and would lead to the exact same result. It's quite tragic. And I urge the real D supporters to let such conversation die (topics debated to death) as soon as they appear.
+1

Either decide a list of conditions before we can break to remove it, or yes, let's let this idea go. It isn't helping anyone.
Sep 06 2018
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 6 September 2018 at 14:33:27 UTC, rikki cattermole 
wrote:
 Either decide a list of conditions before we can break to 
 remove it, or yes lets let this idea go. It isn't helping 
 anyone.
Can't you just mark it as deprecated and provide a library compatibility range (100% compatible)? Then people will just update their code to use the range... This should be possible to achieve using automated source-to-source translation in most cases.
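A compatibility range of that sort already roughly exists in Phobos: std.utf.byDchar lazily yields the same dchar-by-dchar view that auto-decoding provides implicitly. A sketch of what migrated code could look like (the deprecation itself is hypothetical):

```d
// Sketch: auto-decoding replaced by the explicit library range
// std.utf.byDchar, which decodes UTF-8 to dchar lazily.
import std.stdio : writeln;
import std.utf : byDchar;

void main()
{
    string s = "Schiff";

    // Today, `foreach (dchar c; s)` decodes implicitly; after a
    // deprecation the same behaviour stays one explicit call away:
    foreach (dchar c; s.byDchar)
        writeln(c);
}
```

Since the explicit range is behaviour-compatible, the rewrite is mechanical, which is what makes the automated source-to-source translation plausible.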
Sep 06 2018
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 6 September 2018 at 14:30:38 UTC, Guillaume Piolat 
wrote:
 On Thursday, 6 September 2018 at 13:30:11 UTC, Chris wrote:
 And autodecode is a good example of experts getting it wrong, 
 because, you know, you cannot be an expert in all fields. I 
 think the problem was that it was discovered too late.
There are very valid reasons not to talk about auto-decoding again:

- it's too late to remove because breakage
- attempts at removing it were _already_ tried
- it has been debated to DEATH
- there is an easy work-around

So any discussion _now_ would have the very same structure of the discussion _then_, and would lead to the exact same result. It's quite tragic. And I urge the real D supporters to let such conversation die (topics debated to death) as soon as they appear.
The real supporters? So it's a religion? For me it's about technology and finding a good tool for a job.
 why shouldn't users be allowed to give feedback?
Straw-man.
I meant in _general_, not necessarily autodecode ;)
 If we don't get over _some_ technical debate, the only thing 
 that is achieved is a loss of time for everyone involved.
Translation: "Nothing to see here, move along!" Usually a sign to move on...
Sep 06 2018
next sibling parent Guillaume Piolat <spam smam.org> writes:
On Thursday, 6 September 2018 at 14:42:14 UTC, Chris wrote:
 Usually a sign to move on...
You have said that at least 10 times in this very thread. Doomsayers are as old as D. It will be doing OK.
Sep 06 2018
prev sibling parent reply Laeeth Isharc <laeeth laeeth.com> writes:
On Thursday, 6 September 2018 at 14:42:14 UTC, Chris wrote:
 On Thursday, 6 September 2018 at 14:30:38 UTC, Guillaume Piolat 
 wrote:
 On Thursday, 6 September 2018 at 13:30:11 UTC, Chris wrote:
 And autodecode is a good example of experts getting it wrong, 
 because, you know, you cannot be an expert in all fields. I 
 think the problem was that it was discovered too late.
There are very valid reasons not to talk about auto-decoding again: - it's too late to remove because breakage - attempts at removing it were _already_ tried - it has been debated to DEATH - there is an easy work-around So any discussion _now_ would have the very same structure of the discussion _then_, and would lead to the exact same result. It's quite tragic. And I urge the real D supporters to let such conversation die (topics debated to death) as soon as they appear.
The real supporters? So it's a religion? For me it's about technology and finding a good tool for a job.
Religions have believers but not supporters - in fact saying you are a supporter says you are not a member of that faith or community. I support the Catholic Church's efforts to relieve poverty in XYZ country - you're not a core part of that effort directly.

Social institutions need support to develop - language is a very old human institution, and programming languages have more similarity with natural languages along certain dimensions (I'm aware that NLP is your field) than some recognise. So, why shouldn't a language have supporters? I give some money to the D Foundation - this is called providing support. Does that make me a zealot, or someone who confuses a computer programming language with a religion? I don't think so.

I give money to the Foundation because it's a win-win. It makes me happy to support the development of things that are beautiful, and it's commercially a no-brainer because of the incidental benefits it brings. Probably I would do so without those benefits, but on the other hand the best choices in life often end up solving problems you weren't even planning on solving and maybe didn't know you had.

Does that make me a monomaniac who thinks D should be used everywhere, and only D - the one true language? I don't think so. I confess to being excited by the possibility of writing web applications in D, but that has much more to do with Javascript and the ecosystem than it does D. And on the other hand - even though I have supported the development of a Jupyter kernel for D (something that conceivably could make Julia less necessary) - I'm planning on doing more with Julia, because it's a better solution for some of our commercial problems than anything else I could find, including D. Does using Julia mean we will write less D? No - being able to do more work productively means quite the opposite.

I suggest the problem is in fact the entitlement of people who expect others to give them things for free without recognising that some appreciation would be in order, and that helping in whatever way possible is probably the right thing to do, even if it's in a small way in the beginning. This is of course a well-known challenge of open-source projects in general, but it's my belief it's a fleeting period already passing for D.

You know, sometimes it's clear from the way someone argues that it isn't about what they say. If the things they claim were problems were in fact anti-problems (merits), they would make different arguments but with the same emotional tone.

It's odd - if something isn't useful for me then either I just move on and find something that is, or I try to directly act myself or organise others to improve it so it is useful. I don't stand there grumbling at the toolmakers whilst taking no positive action to make that change happen.
Sep 08 2018
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Saturday, 8 September 2018 at 14:20:10 UTC, Laeeth Isharc 
wrote:
 Religions have believers but not supporters - in fact saying 
 you are a supporter says you are not a member of that faith or 
 community.
If you are a supporter of Jesus Christ's efforts, then you most certainly are a Christian. If you are a supporter of the Pope, then you may or may not be Catholic, but you most likely are Christian or sympathise with the faith.

Programming languages are more like power tools. You may be a big fan of Makita and dislike using other power tools like Bosch and DeWalt, or you may have different preferences based on the situation, or you may accept whatever you have at hand. Being a supporter is stretching it, though... although I am sure that people who only have Makita in their toolbox feel that they are supporting the company.
 Social institutions need support to develop - language is a 
 very old human institution, and programming languages have more 
 similarity with natural languages alongst certain dimensions 
 (I'm aware that NLP is your field) than some recognise.
Sounds like a fallacy.
 So, why shouldn't a language have supporters?  I give some 
 money to the D Foundation - this is called providing support.
If you hope to gain some kind of return for it or consequences that you benefit from then it is more like obtaining support and influence through providing funds. I.e. paying for support...
 It's odd - if something isn't useful for me then either I just 
 move on and find something that is, or I try to directly act 
 myself or organise others to improve it so it is useful.  I 
 don't stand there grumbling at the toolmakers whilst taking no 
 positive action to make that change happen.
Pointing out that there is a problem that needs to be solved in order to reach a state where the tool is applicable in a production line... is not grumbling. It is healthy. Whether that leads to positive actions (changes in policies) can only be affected through politics, not "positive action". It doesn't help to buy a new, bigger and better motor if the transmission is broken.
Sep 08 2018
prev sibling parent Mike Parker <aldacron gmail.com> writes:
On Tuesday, 28 August 2018 at 08:44:26 UTC, Chris wrote:

 Last but not least, if it's true that the D Foundation has 
 raised only 3.2K, then there's something seriously wrong.
The Foundation has significantly more than 3.2k. The Open Collective account is relatively new and is but one option. People also donate via PayPal and other means [1], with several monthly contributors. The Foundation is paying two full-time developers, pays me for part-time work, pays out bounties for guest posts on the D Blog, pays out bounties for specific coding tasks, contributes to the funding of DConf, and more. Soon there will be more fundraising drives for targeted initiatives, like the test drive we did with the VS Code plugin, to make up for the lack of donations of time. Some of them will be for paying people to fix onerous issues in Bugzilla. It's a long-term project for me. [1] https://dlang.org/foundation/donate.html
Aug 28 2018
prev sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 27.08.2018 11:14, Chris wrote:
 
 It is unrealistic to assume that code will never break. But as I said in 
 my post above, dmd should give guarantees of backward compatibility of 
 at least N versions. Then we could be more relaxed about our code.
Each breaking change occurs between two adjacent compiler versions.
Aug 29 2018
prev sibling parent reply John Carter <john.carter taitradio.com> writes:
On Saturday, 25 August 2018 at 20:52:06 UTC, Walter Bright wrote:

 If I fix the bug, I break existing code, and apparently a 
 substantial amount of existing code. What's your advice on how 
 to proceed with this?
https://forum.dlang.org/post/ioiglnwckjsdrukpxbvn forum.dlang.org

I've been updating some pre-C++11 code to C++14 and above. It's actually been pretty trivial, because gcc has nice warnings that tell me exactly which file and line I need to fix, and I can turn them on one by one, clean up all warnings in the code base, and move on to the next. If you look at my deltas, 99% of them are deletions rather than modifications, and quite a few of them could have been automated (if it wasn't for the horrible complexity of the preprocessor).

I have also been shifting a bunch of Ruby 1.8.6-era code to 2.3 and beyond. A lot of the fixes I could let rubocop autofix for me, review for peace of mind's sake, run unit tests, done. Yes, autofix. Sounds very scary... but it has been amazingly rock solid so far.

Language development needs to leave behind the notion that a language is an API fixed for all eternity. Rather the assumption must be, a language processor eats source, it can (re)write source as well. To evolve, a language processor must learn to evolve the code base it supports.
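The workflow described above, sketched as commands (the specific warning flag, paths, and rake task are illustrative assumptions, not taken from the post):

```shell
# C++: enable one diagnostic at a time, clean it up, then add the next.
g++ -std=c++14 -Werror=deprecated-declarations -c src/*.cpp

# Ruby: let rubocop rewrite old idioms, review the diff, run the tests.
rubocop --auto-correct lib/
git diff     # review for peace of mind
rake test    # confirm behaviour is unchanged
```

The key property is that each step is small and mechanically checkable, which is what makes the migration "pretty trivial" in practice.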
Aug 26 2018
parent John Carter <john.carter taitradio.com> writes:
On Monday, 27 August 2018 at 04:00:18 UTC, John Carter wrote:

 Rather the assumption must be, a language processor eats 
 source, it can (re)write source as well.
And before anyone mentions halting problems and the impossibility of a compiler understanding whether a refactoring is behaviour altering...

My empirical experience with this is: most recent advances in language design have been around making it harder to make stupid mistakes. And whenever I have found a chunk of pre-existing code that required substantial modification to update it to the new language paradigm, it was because it was buggy already, and attempting to express the current behaviour in the latest preferred idiomatic way made that obvious.

i.e. It was hard to move that code to the latest preferred idiom because it required a bug fix AND a syntactic tweak. i.e. That code was working "by accident" or not at all.

The older I get, the less sympathy and mercy I have for code that works "by accident", and the more I encourage language designers to evolve our tools faster to give us more safety from our own inexhaustible supply of stupidity.
Aug 26 2018
prev sibling parent reply bachmeier <no spam.net> writes:
On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:
 On 8/24/2018 6:04 AM, Chris wrote:
 For about a year I've had the feeling that D is moving too 
 fast and going nowhere at the same time. D has to slow down 
 and get stable. D is past the experimental stage. Too many 
 people use it for real world programming and programmers value 
 and _need_ both stability and consistency.
Every programmer who says this also demands new (and breaking) features.
I realize I'm responding to this discussion after a long time, but this is the first chance I've had to return to this thread...

What you write is correct. There's nothing wrong with wanting both change and stability, because there are right ways to change the language and wrong ways to change the language. If you have a stable compiler release for which you know there will be no breaking changes for the next two years, you can distribute your code to someone else and know it will work. It's not unreasonable to say "Your compiler is three years old, you need to upgrade it." You will not receive a phone call from someone that doesn't know anything about D in the middle of your workday inquiring about why the program no longer compiles.

Having to deal with the possibility that others might have any of twelve different compiler versions installed just isn't sustainable.
Sep 04 2018
parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/4/2018 5:37 PM, bachmeier wrote:
 Having to deal with the 
 possibility that others might have any of twelve different compiler versions 
 installed just isn't sustainable.
Back in the bad old DOS days, my compiler depended on the Microsoft linker, which was helpfully included on the DOS distribution disks (!). The problem, however, was Microsoft kept changing the linker, and every linker was different. At one point I had my "linker disk" which was packed with every version of MS-Link I could find. Now that was unsustainable.

The eventual solution was Bjorn Freeman-Benson wrote a linker (BLINK) which we then used. When it had a bug, we fixed it. When we shipped a compiler, it had a predictable linker with it. It made all the difference in the world.

Hence my penchant for "controlling our destiny" that I've remarked on now and then. It's also why the DMD toolchain is Boost licensed - nobody is subject to our whims.
Sep 05 2018
prev sibling next sibling parent Yuxuan Shui <yshuiv7 gmail.com> writes:
On Wednesday, 22 August 2018 at 11:59:37 UTC, Paolo Invernizzi 
wrote:
 Just found by chance, if someone is interested [1] [2].

 /Paolo

 [1] 
 https://gitlab.com/mihails.strasuns/blog/blob/master/articles/on_leaving_d.md
 [2] 
 https://blog.mist.global/articles/My_concerns_about_D_programming_language.html
I find Dicebot's article resonates quite strongly with me. I have been using D for hobby projects (i.e. not a lot of code) for about 3 years. During that time I found a handful of compiler bugs. An average programmer like me shouldn't be able to find bugs in the compiler so frequently. And there are other problems, like language features interacting weirdly and unhelpful/misleading error messages. All of this really gives me the impression that D is an immature language.

I think this is a pretty big problem, and I think it has not been given enough attention (it never appeared in bold in the vision documents), probably until now. So what if we just forget about @safe, @nogc, and stuff like that for a while, do a feature freeze, and try our best to fix all the bugs and rough corners?
Aug 25 2018
prev sibling next sibling parent reply Ali <fakeemail example.com> writes:
On Wednesday, 22 August 2018 at 11:59:37 UTC, Paolo Invernizzi 
wrote:
 Just found by chance, if someone is interested [1] [2].

 /Paolo

 [1] 
 https://gitlab.com/mihails.strasuns/blog/blob/master/articles/on_leaving_d.md
 [2] 
 https://blog.mist.global/articles/My_concerns_about_D_programming_language.html
My summary of this discussion:

* D as a language has issues, issues serious enough that they cannot be fixed without introducing breaking changes, which more or less means creating a spinoff language, or languages
* D as a project has issues that cannot be fixed, because Andrei and Walter, D's main attraction, are both the problem and the solution - again, no real solution unless there is a spinoff language owned by a different group

I think that a lot of the complainers are expecting that either Andrei or Walter will create a new version of D that is basically a new language, or that they will explicitly create a new language that has all of D's good points but none of the bad.

I don't think that Andrei or Walter are interested in creating a new language; I think they will keep trying to improve D incrementally.

As a realistic short term fix, I think both Andrei and Walter need to streamline and be more vocal about long term plans, because this is obviously a source of confusion for many, and a source for a lot of rants.
Aug 25 2018
parent reply Pjotr Prins <pjotr.public12 thebird.nl> writes:
On Sunday, 26 August 2018 at 03:17:06 UTC, Ali wrote:
 As a realistic short term fix, I think both Andrei and Walter, 
 need to streamline and be more vocal about long term plans, 
 because this is obviously a source of confusion for many, and a 
 source for a lot of rants
My summary is that D means different things to different people. D has put in the kitchen sink. It tries to please everyone, which means it is a complex toolbox. One thing I have learnt by lurking in this project is how much effort goes into compiler/library development to make it great. NoGC and safe just show how hard that can be. Lots of work on corner cases. Maybe with hindsight D should have been less OOP and more FP (destructuring data, anyone?), but then you lose all those who want/are used to that paradigm.

I use quite a few languages. For me D is the most powerful language I have for getting performance. Artem wrote Sambamba as a student

    https://github.com/biod/sambamba

and it is now running around the world in sequencing centers. Many many CPU hours and a resulting huge carbon footprint. The large competing C++ samtools project has been trying for 8 years to catch up with an almost unchanged student project and they are still slower in many cases.

    https://groups.google.com/forum/#!topic/sambamba-discussion/z1U7VBwKfgs

Just saying. Much better to choose D over C++. I also work on a C++ project and I find it to be a royal pain compared to writing software in D. Note that Artem used the GC and only took the GC out for critical sections in parallel code. I don't buy these complaints about GC.

The complaints about breaking code I don't see that much either. Sambamba pretty much kept compiling over the years, and with the latest LDC/LLVM we see a 20% performance increase. For free (at least from our perspective). Kudos to the LDC/LLVM efforts!! Very excited to see gdc pick up too. We need the GNU projects.

So, do we need change? You can always try and improve process, and over the last years W&A have been pushing for that. Let me state here that D is anarchy driven development in all its glory (much like the Linux kernel). I believe it is great.

I think that, in addition to standard packaging in Linux distros (which is coming), D could use more industry support (much like the Linux kernel). D being a performance language for software engineers, I would look at the extremes of HPC and mobile to succeed. How do we wake those companies up? Especially those with large investments in C++. Those we should invite to DConf.

I remember one guy at Google telling me that every time someone would bring up "Why don't we write this in D instead?". That was 10 years ago. Google invested in Python and Go instead - but still write heaps of code in C++. Go figure.
Aug 25 2018
parent Jon Degenhardt <jond noreply.com> writes:
On Sunday, 26 August 2018 at 05:55:47 UTC, Pjotr Prins wrote:
 Artem wrote Sambamba as a student

     https://github.com/biod/sambamba

 and it is now running around the world in sequencing centers. 
 Many many CPU hours and a resulting huge carbon foot print. The 
 large competing C++ samtools project has been trying for 8 
 years to catch up with an almost unchanged student project and 
 they are still slower in many cases.
 
 [snip]

 Note that Artem used the GC and only took GC out for critical 
 sections in parallel code. I don't buy these complaints about 
 GC.

 The complaints about breaking code I don't see that much 
 either. Sambamba pretty much kept compiling over the years and 
 with LDC/LLVM latest we see a 20% perfomance increase. For free 
 (at least from our perspective). Kudos to LDC/LLVM efforts!!
This sounds very similar to my experiences with the tsv utilities, on most of the same points (development simplicity, comparative performance, GC use, LDC). Data processing apps may well be a sweet spot. See my DConf talk for an overview (https://github.com/eBay/tsv-utils/blob/master/docs/dconf2018.pdf). Though not mentioned in the talk, I also haven't had any significant issues with new compiler releases. That may be related to the type of code being written.

Regarding the GC: the throughput-oriented nature of data processing tools like the tsv utilities looks like a very good fit for the current GC. Applications where low GC latency is needed may have different results. It'd be great to hear an experience report from development of an application where GC was used and low GC latency was a priority.

--Jon
Aug 26 2018
prev sibling next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Sunday, August 26, 2018 7:11:30 PM MDT Manu via Digitalmars-d wrote:
 On Sun, 26 Aug 2018 at 18:09, Jonathan M Davis via Digitalmars-d

 <digitalmars-d puremagic.com> wrote:
 On Sunday, August 26, 2018 5:39:32 PM MDT Manu via Digitalmars-d wrote:
   ARC? What ever happened to the opAddRef/opDecRef proposal? Was it

 rejected? Is it canned, or is it just back on the bench? (GC is
 absolutely off the table for my project, I have no influence on this)
I don't know what Walter's current plans are for what any built-in ref-counting solution would look like, but it's my understanding that whatever he was working on was put on hold, because he needed something like DIP 1000 in order to make it work with @safe - which is what then triggered his working on DIP 1000 like he has been. So, presumably, at some point after DIP 1000 is complete and ready, he'll work on the ref-counting stuff again. So, while we may very well get it, I expect that it will be a while.
I'm sure I recall experimental patches where those operators were available to try out... was I dreaming? :/
I have no idea. AFAIK, Walter didn't make any changes related to ref-counting public, and I have no idea whether he got very far along with it at all, but I also haven't been watching everything he does in anticipation for such a feature. So, I could easily have missed something. - Jonathan M Davis
Aug 26 2018
prev sibling next sibling parent Alexander Nicholi <alex arqadium.com> writes:
On Thursday, 23 August 2018 at 09:16:23 UTC, Mihails wrote:
 Didn't intend to chime in, but no, that was not what I have 
 meant at all. My stance is that as long as current leadership 
 remains in charge and keep sames attitude, no amount of money 
 or developer time will fix D.

 What is the point in hiring someone to manage things if Walter 
 still can do stuff like -dip1000? For me moment of 
 understanding this was exact point of no return.
Just as power is not something to be grasped or abolished, leadership is not something you can consider as an attribute or property of the D endeavour or its community. You criticise the leadership as if them being leaders is part of the problem, leaving no event or cause to credit for why their stewardship is bad! You think -dip1000 is disadvisable? What grounds are those on? How does that tie into the leadership at hand here?

When I ask about leadership I ask not about leaders, but rather the actions leaders take, since those count for everything in evaluating their skill.

On Thursday, 23 August 2018 at 14:29:23 UTC, bachmeier wrote:
 Weka is an awesome project, but I don't know that most people 
 considering D should use your experience as the basis of their 
 decision. At least in my areas, I expect considerable growth in 
 the usage of D over the next 10 years. Maybe it won't see much 
 traction as a C++ replacement for large projects like Weka.
It’s as much to consider as any other project, in that you shouldn’t base your opinions or decisions on it alone. There are other aspects of this I will comment on later.

On Thursday, 23 August 2018 at 13:22:45 UTC, Shachar Shemesh wrote:
 What I did mean by that is that the enthusiasm from D has 
 *greatly* diminished. Many (but not all) developer will 
 definitely not choose D for our next project, should the choice 
 be ours to make.

 Like I said in my original post, it is not even in consensus 
 whether picking D to begin with had been a mistake. Some think 
 it was, some think, even in hind sight and after being more or 
 less disillusioned, that it was still better than picking 
 another language. As such, it is not even universally true that 
 engineers at Weka regret going with D.
Perhaps your team’s mistake is deciding from their enthusiasm instead of a solid analysis of what the language will demand from your team. It is no secret that D is not as ready for production as C++, so why did they not evaluate the shortcomings of the ecosystem beforehand? Was the honest consensus that D would carry your project as well as Go, Rust, or C++? A cursory examination shows the former two languages have a corporate backbone absent from D, and C++ is self-explanatory with ISO and OS code.

On Thursday, 23 August 2018 at 13:22:45 UTC, Shachar Shemesh wrote:
 To summarize: Weka isn't ditching D, and people aren't even 
 particularly angry about it. It has problems, and we've learned 
 to live with them, and that's that. The general consensus, 
 however, is that these problems will not be resolved (we used 
 to file bugs in Bugzilla. We stopped doing that because we saw 
 nothing happens with them), and as far as the future of the 
 language goes, that's bad news.

 Shachar
Imagine you are on a state-of-the-art boat in the bronze age that needs many many people to move it by oar. You aren’t the crew of the ship, but you have cargo that needs pulled for trade. This boat is short-staffed, and by freemen to boot, and your merchantry has insisted on going with this boat over the bigger boats, because maybe they use slave drivers, or they’re really old and the sea won’t take them well, or whathaveyou.

Can you, in reasonable conscience, expect this noble new boat to carry your load the same as older, rustier, or more backwards ships? It may be able to, but at a greatly reduced speed, or split into multiple trips, or some other serious concession. But you are a merchant, you roll the nickels in the mediterranean and if you really wanted to you could push to hire more oarmen and have it both ways.

In a way, this is the decision your team needed to make with a language like this, and it puzzles me why you haven’t mentioned a discussion of this sort yet. And it is even more bemusing that, somehow, you find the people running the boat to be at fault for not meeting your expectations. Were you hoping to find the holy grail aboard this ship?

On Thursday, 23 August 2018 at 17:19:41 UTC, Ali wrote:
 On Thursday, 23 August 2018 at 16:22:54 UTC, Shachar Shemesh 
 wrote:
 On 23/08/18 17:01, Steven Schveighoffer wrote:
 My main job is to develop for Weka, not develop D itself.
Weka, at some point, made the strategic decision to use a non-mainstream language. I don't think Weka has a choice; they have to invest in the development of D itself.
Arguably, this should have been an expectation from the start. I have not spent a lot of time here, yet I am plenty prepared to fork over a lot of my own resources to make things happen in D for my company because of what I’ve seen of the ecosystem so far. I don’t know of Weka’s leadership at all, but since I lead my company I can say that it will almost certainly transition from my time as a developer into my money as I make more things and make more money.

It’s no small favour to ask, but it’s the only realistic expectation to have besides the current state of affairs that result when you do nothing or sit idly hoping for things to come to you. They say Rome was not built in a day, but what they don’t tell you is Rome was not built from anything less than a global fortune either. It’s paramount to really know what you’re up to task for.

On Thursday, 23 August 2018 at 16:22:54 UTC, Shachar Shemesh wrote:
 I'm reminded of a friend of mine, who kept hoping to win the 
 lottery despite never buying a ticket. His reasoning was that 
 the chances of winning are not much changed by buying a ticket.

 Shachar
Is using D tantamount to a lottery for you? What did you expect in your original enthusiasm if not this? On Thursday, 23 August 2018 at 18:27:27 UTC, Abdulhaq wrote:
 On Thursday, 23 August 2018 at 09:51:43 UTC, rikki cattermole 
 wrote:
 Good luck getting W&A to agree to it, especially when there 
 is yet another "critical D opportunity" on the table ;)
No. They have power only for as long as we, the community, say that they do. We are at the point where they need a check and balance to keep everybody moving smoothly. And I do hope that they listen to us before somebody decides it's forkin' time.
No fork of D can be successful, it won't have the manpower, skills or willpower to draw on. Even with W and A it's already short. 'Threatening' W and A with a fork is an empty threat that just p***es them off. Bad move on your part.
The solution is obviously not hard power, then. Ever wonder why the US has unquestioned global military dominance? It’s all in soft power. Most of the military isn’t tasked with punching down on everyone on Earth, but rather with protecting all of the trade routes on the planet, in exchange for the concessions countries give for that protection. You have a much better time doing what you want if you convince people to go along with you instead of forcing them to your will, and that’s true at every scale, from global geopolitics to programming language communities. Funding the D Foundation with solid, clear expectations is an easy example of this, but there are lots of ways to go about it that make everyone happy.

On Thursday, 23 August 2018 at 23:06:03 UTC, Everlast wrote:
 I agree with this. I no longer program in D, except for minor 
 things, because of this type of approach. D, as a language, is 
 the best. D as an actual practical tool is a deadly pit of 
 snakes... anyone of which can bite you, and that won't stop the 
 others. Of course, in the pit is where all the gold is at...
So then, you need a snake charmer to distract them all while you run off with the gold. See below for a more concise explanation of this.

On Thursday, 23 August 2018 at 23:06:03 UTC, Everlast wrote:
 My feeling is D is and will stay stagnate for the majority of 
 the world. It doesn't seem to have enough momentum to break out 
 and the "leaders" don't seem to know much about actually 
 leading... programming? Yes, but leading? No, not 
 really...(obviously they know something but I'm talking about 
 what is required... a bill gate like character, say, not that 
 we want another one of those!)
It’s interesting you mention Bill Gates! Like Steve Jobs, one thing he was notoriously brilliant at was leadership. In the joint interview they did together (I believe it was some time in 2007), both of them mentioned how critical it was that they had so many people around them to make their companies happen, and how lucky they were for it. They kept it simple, but luck doesn’t earn you workmates; leadership does. It’s one of the most thankless jobs around, and far more people grossly overestimate their aptitude at it than many programmers can imagine.

On Thursday, 23 August 2018 at 23:06:03 UTC, Everlast wrote:
 Since time is money, you know these types of issues will stop 
 businesses from adopting D. The typical answer from the D 
 community is "Implement it in a library!" or "It has bindings!" 
 as if these are the solutions someone trying to get shit done 
 wants to hear. Usually the libraries are defunct in some 
 way(bit rot, version issues, shitty design, some deal 
 breaker(e.g., uses gc), shitty documentation, etc).
The default modus operandi for businesses is total indifference. If they shy away from D over these issues, it is almost always because they didn’t care much at all in the first place and were likely only eyeing it to see what it even is, having never heard of it before. “Time is money” is not wrong, but there’s a much less confusing way to put it: priorities. They usually don’t care, and in that case, what did you expect them to do? I’d sooner see them take a vacation to Greenland than adopt D.

On Friday, 24 August 2018 at 21:53:18 UTC, H. S. Teoh wrote:
 I think it's clear by now that most of D's woes are not really 
 technical in nature, but managerial.  I'm not sure how to 
 improve this situation, since I'm no manager type either.  It's 
 a known problem among techies that we tend to see all problems 
 as technical in nature, or solvable via technical solutions, 
 when in reality what's actually needed is someone with real 
 management skills.  Hammer and nail, and all that, y'know.

 Unfortunately, we techies also tend to resist non-technical 
 "interference", especially from non-techies (like manager 
 types). I used to have that attitude too (and probably still do 
 to some extent), and only with age did I begin realizing this 
 about myself.  It's not an easy problem to fix in practice, 
 especially in a place like here, where we're driven primarily 
 by the technical aspects of D, and for the most part outside of 
 any financial or other motivations that might have served to 
 moderate things a little.
Yes, yes. As I said before, management is very valuable, and the time is nigh for more of it, because it is apparent D has plenty of technical rigour in its alumni. The most critical thing in management is that it comes from people who are themselves technically rigorous, because an incompetent manager is, in D’s case at least, worse for everyone than a competent non-manager. It’s basically a matter of the folks who already kick ass and take names stepping up to bat together as a team, bringing more cohesion into things. This is anything but easy, but if accomplished it would make D shine a lot for enterprise onlookers, particularly the kinds of wealthy interests that very much care about things even when it doesn’t really make them money (think Gabe Newell with his insistence on opening PC gaming up to Linux).

On Monday, 27 August 2018 at 01:15:49 UTC, Laeeth Isharc wrote:
 I definitely think a stable version with backfixes ported would 
 be great if feasible.
Agreed as well. Once I have the ability, I will see to this myself with LDC if it hasn’t happened already.

On Monday, 27 August 2018 at 01:15:49 UTC, Laeeth Isharc wrote:
 I wonder if we are approaching the point where enterprise 
 crowd-funding of missing features or capabilities in the 
 ecosystem could make sense.  If you look at how Liran managed 
 to find David Nadlinger to help him, it could just be in part a 
 matter of lacking social organisation preventing the market 
 addressing unfulfilled mutual coincidences of wants.  Lots of 
 capable people would like to work full time programming in D.  
 Enough firms would like some improvements made.  If takes work 
 to organise these things.  If I were a student I might be 
 trying to see if there was an opportunity there.
We’re well into the timeframe where it makes sense, imo. The problem is actually getting it to happen, and the community holds few cards to influence that outcome, many of which can easily backfire anyway. The best strategy is to maintain cohesion and mend divides, avoiding sentiments that sow them in the first place. I’m sure pretty much no one here sees anything less than great potential, if not an outright great language, in front of us, so it’s in everyone’s interest to keep one’s head straight and grounded in reality. There are plenty of concrete problems to solve as it is ;)

-----

All things considered, I see a lot of valid concerns from several angles here. The company aspect of the discussion seems misguided as to why expectations weren’t met, and as far as D itself is concerned, stronger and smarter leadership is in the biggest demand from what I see.

Another thing I want to mention about corporate involvement and sponsorship dynamics: as a company, your involvement is totally elastic. You can dictate and harmoniously fulfill any level of involvement you desire; the only questions are whether your negotiation skills are decent enough and whether your expectations of others are grounded in reality. You can fund the Foundation, pay developers of your choosing who are already vets in the ecosystem, hire your own IOI-style squadron of FOSS mercenaries, or get down and dirty and literally do it yourself. They don’t call head honchos ‘executives’ for nothing! ;)
Aug 26 2018
prev sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Tuesday, September 4, 2018 2:45:53 PM MDT H. S. Teoh via Digitalmars-d 
wrote:
 Those who've learned LaTeX swear by it. Those who are learning LaTeX swear
 at it. -- Pete Bleackley
Well, that's a weirdly appropriate quote. The primary reason that I've done as much with LaTeX as I have is so that I can use vim to write documents instead of having to use a word processor. - Jonathan M Davis
Sep 04 2018