
digitalmars.D - It is the year 2020: why should I use / learn D?

reply lagfra <me fragal.eu> writes:
https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/

By 2020 C++ is planning to introduce:

* Ranges
* Contracts
* Concepts (`__traits`)
* Proper constexpr
* Modules
* Reflections
* Green threads

Right now it already has:

* `auto` variables
* Ranged for (`foreach`)
* Lambda expressions and closures
* `nothrow` attributes
* Proper containers
* Proper RAII

In no way is this the usual troll post (I am a participant in SAoC). What 
bugs me is the shrinking distance between what D has to offer and what 
C++ offers. While D certainly has far better syntax (thinking of template 
declarations, `immutable`, UDAs) and a GC, what are the advantages of 
using D over C++ if my goal is to build a complex system / product?

TL;DR: what will D offer with respect to C++ when almost all key 
features of D are present in C++20(+)?
Nov 14 2018
next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/
According to https://medium.com/wrongway4you/brief-article-on-c-modules-f58287a6c64: "Compilation times are promised to become smaller up to 20%"

I think D's compilation times will still be significantly faster than C++'s, especially when you consider that some library authors might even choose not to use C++ modules. I think this is one place where D will still come out ahead.
Nov 14 2018
prev sibling next sibling parent Joakim <dlang joakim.fea.st> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/

 By 2020 C++ is planning to introduce:

 * Ranges
 * Contracts
 * Concepts (`__traits`)
 * Proper constexpr
 * Modules
 * Reflections
 * Green threads

 Right now it already has:

 * `auto` variables
 * Ranged for (`foreach`)
 * Lambda expressions and closures
 * `nothrow` attributes
 * Proper containers
 * Proper RAII

 In no way this is the usual trollpost (I am a participant of 
 SAoC). What bugs me is the shortening distance regarding what D 
 has to offer with respect to C++. While D for sure has a way 
 better syntax (thinking of template declarations, `immutable`, 
 UDAs) and a GC, what are the advantages of using D vs C++ if my 
 goal is to build a complex system / product?

 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
Not as many legacy features, such as the C preprocessor; much better syntax as you say, especially templates; no need to navigate enormously complex rules to make sure everything works; stuff they haven't added yet, like UFCS; and almost everything you list is available now, in 2018, so you get a giant head-start. :D
Nov 14 2018
prev sibling next sibling parent Guillaume Piolat <first.last gmail.com> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/

 By 2020 C++ is planning to introduce:

 * Ranges
 * Contracts
 * Concepts (`__traits`)
 * Proper constexpr
 * Modules
 * Reflections
 * Green threads
Not all implementations are equal.

Regarding `constexpr`: D's CTFE works by having certain places where things are forced to be evaluated at compile time. No extra syntax. In C++, you currently have `constexpr`, and soon you'll have `consteval` alongside it (http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p1073r2.html). No new keyword was needed for the D version of CTFE.

Similarly, `if constexpr` is strictly less useful than `static if`, since it introduces a scope. See the workaround here: https://stackoverflow.com/questions/46880578/scope-of-variables-declared-inside-if-constexpr-blocks, which takes us back to where we were with enable_if in terms of code size.

Regarding modules, they were proposed and rejected in two C++ standards already. Same for concepts.

Ultimately all this doesn't matter much, since:
- the C++ culture of using cmake/makefiles over a declarative build system will not change (see the demise of biicode)
- avoiding the latest language additions to keep compatibility is necessary in C++, with divergent frontends
- existing codebases are still C++98 for the most part

D has the chance to have a community:
- for which template meta-programming is simple, well understood, and almost boring
- which is willing to build an ecosystem (but not enough)
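To make the scope problem concrete, here is a minimal C++17 sketch (the function and names are illustrative, not from the linked thread): a declaration inside `if constexpr` dies at the closing brace, so the usual workaround is to hoist it out and compute its type separately. D's `static if` needs no such dance because it introduces no scope.

```cpp
#include <type_traits>

// Because `if constexpr` opens a scope, the result variable must be
// declared *outside* the branch to survive it; std::conditional_t is
// used to pick its type up front.
template <typename T>
auto widen_and_double(T value) {
    std::conditional_t<std::is_integral_v<T>, long, double> result{};
    if constexpr (std::is_integral_v<T>) {
        result = static_cast<long>(value) * 2;  // integral path
    } else {
        result = value * 2.0;                   // floating-point path
    }
    return result;
}
```

In D the two branches of a `static if` would simply share the enclosing scope, with no type gymnastics needed.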
 what are the advantages of using D vs C++ if my goal is to 
 build a complex system / product?
Easy: your struggle to learn the D language will have an ending. Which leads to reduced mental load forever, without compromising anything in power.
 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
Sanity.
Nov 14 2018
prev sibling next sibling parent Bastiaan Veelo <Bastiaan Veelo.net> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
Nested functions.
Nov 14 2018
prev sibling next sibling parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 15/11/2018 4:07 AM, lagfra wrote:
 https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/
 
 
 By 2020 C++ is planning to introduce:
 
 * Ranges
Really butchered. From what I can see they never mentioned D in any of the documents (kinda glad tbh). Those documents even question what it should be doing... And the example code... yikes. No way that is going to be used.
 * Contracts
 * Concepts (`__traits`)
 * Proper constexpr
 * Modules
 * Reflections
 * Green threads
I'm skeptical, especially when 2023 is listed as a conservative estimate in the Reddit post.
Nov 14 2018
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 14 November 2018 at 15:33:49 UTC, rikki cattermole 
wrote:
 [snip]

 Really butchered. From what I can see they never mentioned D in 
 any of the documents (kinda glad tbh). Those documents even 
 question what it should be doing...
I recall D being briefly mentioned in the Range specification. They rejected D's approach because they wanted to build on existing iterator-based code.
Nov 14 2018
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 14 November 2018 at 15:49:48 UTC, jmh530 wrote:
 [snip]

 I recall D being briefly mentioned in the Range specification. 
 They rejected D's approach because they wanted to build on 
 existing iterator-based code.
It's actually quite a bit more than I remembered: https://ericniebler.github.io/std/wg21/D4128.html#iterator-operations-are-primitive
Nov 14 2018
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/14/2018 7:57 AM, jmh530 wrote:
 It's actually quite a bit more than I remembered:
 
 https://ericniebler.github.io/std/wg21/D4128.html#iterator-operations-are-primitive
The trouble with the iterator-pair approach, not mentioned in that article, is that it is not checkable for memory safety. The article mentions as a defect that D ranges can only shrink, and cannot grow. But that's fundamental to memory safety.
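As a rough illustration of why shrink-only enables checking, here is a minimal sketch (names are mine) of a D-style input range in C++: because the range can only shrink, every access can assert non-emptiness, whereas walking one iterator of an iterator pair past the other is silent undefined behavior.

```cpp
#include <cassert>
#include <vector>

// A D-style shrink-only range over contiguous storage. It can never
// grow past its original bounds, so every operation is checkable.
template <typename T>
struct InputRange {
    const T* first;
    const T* last;  // one past the final element

    bool empty() const { return first == last; }
    const T& front() const { assert(!empty()); return *first; }
    void popFront() { assert(!empty()); ++first; }  // can only shrink
};

template <typename T>
InputRange<T> rangeOf(const std::vector<T>& v) {
    return {v.data(), v.data() + v.size()};
}

// Consume the range; every step is bounds-checked by the asserts above.
template <typename T>
T sum(InputRange<T> r) {
    T total{};
    for (; !r.empty(); r.popFront())
        total += r.front();
    return total;
}
```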
Nov 14 2018
prev sibling parent =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 11/14/2018 07:57 AM, jmh530 wrote:
 On Wednesday, 14 November 2018 at 15:49:48 UTC, jmh530 wrote:
 [snip]

 I recall D being briefly mentioned in the Range specification. They 
 rejected D's approach because they wanted to build on existing 
 iterator-based code.
It's actually quite a bit more than I remembered: https://ericniebler.github.io/std/wg21/D4128.html#iterator-operations-are-primitive
Eric appeared on the D forums after his range presentation, in the following thread, where Walter responded with the history of ranges: https://forum.dlang.org/thread/hatpfdftwkycjxwxcthe forum.dlang.org

Ali
Nov 16 2018
prev sibling parent reply Eugene Wissner <belka caraus.de> writes:
On Wednesday, 14 November 2018 at 15:49:48 UTC, jmh530 wrote:
 On Wednesday, 14 November 2018 at 15:33:49 UTC, rikki 
 cattermole wrote:
 [snip]

 Really butchered. From what I can see they never mentioned D 
 in any of the documents (kinda glad tbh). Those documents even 
 question what it should be doing...
I recall D being briefly mentioned in the Range specification. They rejected D's approach because they wanted to build on existing iterator-based code.
No, that wasn't the reason. Some algorithms cannot be implemented as efficiently with ranges as with iterators:

"In other words, by converting the is_word_boundary from iterators to D-style ranges, the algorithm goes from O(1) to O(N). From this we draw the following conclusion: D’s choice of algorithmic basis operations is inherently less efficient than C++’s."

C++ iterators are more flexible. I think of things like rotate and bringToFront: in C++ you need begin, end, and middle iterators. In D you can't have something like a "middle"; you have to pass two independent ranges.

But since most algorithms require only begin and end iterators, I like D ranges because they aren't that verbose. Still, D ranges aren't always nicer; for example, SList.insertAfter requires a hack, accepting Take!Range instead of just Range.
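std::rotate is the canonical three-iterator algorithm being described here. A small illustrative sketch (the wrapper function is my own, not from the thread): it moves [middle, end) in front of [begin, middle), something a single D range cannot express without being split into two ranges.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Rotate a copy of the vector so that the element at `middle` becomes
// the first element: std::rotate needs begin, middle, AND end.
std::vector<int> rotated(std::vector<int> v, std::size_t middle) {
    std::rotate(v.begin(),
                v.begin() + static_cast<std::ptrdiff_t>(middle),
                v.end());
    return v;
}
```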
Nov 14 2018
next sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 11/14/18 11:09 AM, Eugene Wissner wrote:
 On Wednesday, 14 November 2018 at 15:49:48 UTC, jmh530 wrote:
 On Wednesday, 14 November 2018 at 15:33:49 UTC, rikki cattermole wrote:
 [snip]

 Really butchered. From what I can see they never mentioned D in any 
 of the documents (kinda glad tbh). Those documents even question what 
 it should be doing...
I recall D being briefly mentioned in the Range specification. They rejected D's approach because they wanted to build on existing iterator-based code.
No, it wasn't the reason. Some algorithms cannot be implemented with ranges as efficient as with iterators. "In other words, by converting the is_word_boundary from iterators to D-style ranges, the algorithm goes from O(1) to O(N). From this we draw the following conclusion: D’s choice of algorithmic basis operations is inherently less efficient than C++’s." C++ iterators are more flexible. I think of things like rotate and bringToFront - in C++ you need begin, end and middle iterators. In D you can't have something like a "middle", you have to pass two independent ranges. But since most algorithms require begin and end iterators I like D ranges because they aren't that verbose. But D ranges aren't always nicer, for example SList.insertAfter requires a hack with accepting Take!Range instead of just Range.
The solution is cursors: a "range" of a single element, which points at that element. You can iterate it just like a range (it has front, popFront, and empty), but it's used basically to mark the location you are interested in.

dcollections used cursors, which I found much nicer. For example, you can run a search on a tree and get a reference to a single node, instead of a range. You can then compose ranges using cursors for any desired subsection:

```
auto allElementsBefore = tree[tree.begin .. tree.find("someValue")];
auto allElementsAfter = tree[tree.find("someValue") .. $];
auto elementsBetween = tree[tree.find("value1") .. tree.find("value2")];
```

insert just takes a cursor, and everything is good.

-Steve
Nov 14 2018
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 14 November 2018 at 17:33:27 UTC, Steven 
Schveighoffer wrote:
 [snip]

 The solution is cursors -- a "range" of a single element, which 
 points at that element. You can iterate it just like a range 
 (it has front, popFront, and empty), but it's used basically to 
 mark the location you are interested in.

 dcollections used cursors, which I found much nicer. [snip]
I always thought the cursor approach you've discussed before was interesting, but I never played around with it much myself. It looks to me like your cursor implementation operates similarly to Optional types (as in the optional package). Does that make sense?
Nov 14 2018
parent Steven Schveighoffer <schveiguy gmail.com> writes:
On 11/14/18 1:33 PM, jmh530 wrote:
 On Wednesday, 14 November 2018 at 17:33:27 UTC, Steven Schveighoffer wrote:
 [snip]

 The solution is cursors -- a "range" of a single element, which points 
 at that element. You can iterate it just like a range (it has front, 
 popFront, and empty), but it's used basically to mark the location you 
 are interested in.

 dcollections used cursors, which I found much nicer. [snip]
I always thought the cursor approach you've discussed before was interesting, but I never played around with it much myself. It looks to me like your cursor implementation operates similar to Optional types (as in the optional package). Does that make sense?
I haven't looked at it in depth, but the concept is probably the same. A cursor necessarily points at an element, and contains a boolean to denote emptiness.

The idea was a compromise to try to get the dcollections concept into Phobos. Originally, dcollections used iterators (and was D1). Ranges were added for the D2 port. Andrei wouldn't let it be considered for Phobos, so I came up with the idea of making the iterators safe (and it actually is a nice concept too) by making them tiny ranges, while retaining the utility of an iterator for referencing.

In the end, Andrei built his own thing anyway, but the red-black tree in Phobos came from dcollections.

-Steve
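A hypothetical sketch of that "tiny range" in C++ terms (the names here are invented, not dcollections' actual API): a one-element range with front/popFront/empty plus the emptiness boolean, usable as a safe marker into a container without dragging a second endpoint around.

```cpp
#include <cassert>

// A cursor: a "range" of at most one element. It iterates like any
// range, but exists mainly to mark a single location safely.
template <typename T>
struct Cursor {
    T* ptr = nullptr;
    bool exhausted = true;  // the boolean denoting emptiness

    bool empty() const { return exhausted; }
    T& front() { assert(!empty()); return *ptr; }
    void popFront() { assert(!empty()); exhausted = true; }
};
```

Any generic algorithm written against the front/popFront/empty protocol consumes a cursor as a one-element range, which is what made the compromise workable.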
Nov 14 2018
prev sibling next sibling parent reply Basile B. <b2.temp gmx.com> writes:
On Wednesday, 14 November 2018 at 16:09:32 UTC, Eugene Wissner 
wrote:
 No, it wasn't the reason. Some algorithms cannot be implemented 
 with ranges as efficient as with iterators.

 "In other words, by converting the is_word_boundary from 
 iterators to D-style ranges, the algorithm goes from O(1) to 
 O(N). From this we draw the following conclusion: D’s choice of 
 algorithmic basis operations is inherently less efficient than 
 C++’s."

 C++ iterators are more flexible.
You meant D ?
 I think of things like rotate and bringToFront ...
Nov 14 2018
next sibling parent reply Eugene Wissner <belka caraus.de> writes:
On Wednesday, 14 November 2018 at 17:47:10 UTC, Basile B. wrote:
 On Wednesday, 14 November 2018 at 16:09:32 UTC, Eugene Wissner 
 wrote:
 No, it wasn't the reason. Some algorithms cannot be 
 implemented with ranges as efficient as with iterators.

 "In other words, by converting the is_word_boundary from 
 iterators to D-style ranges, the algorithm goes from O(1) to 
 O(N). From this we draw the following conclusion: D’s choice 
 of algorithmic basis operations is inherently less efficient 
 than C++’s."

 C++ iterators are more flexible.
You meant D ?
C++. An iterator points to some element in a range, and you can mix them as you need. In D you always have two iterators: the beginning and the end of the range.

The feature I miss most is a bidirectional iterator. In C++ you can go back and forth with such an iterator. In D, if you do popBack there is no way back (or front :)); with popFront you can't reach the element after back, even if there are such elements; you have to take another range from the container.

D ranges are more compact, because in most cases you need only the beginning and the end, so such functions just take one range instead of two iterators. But there are still a lot of cases where D ranges don't work.

Steven Schveighoffer mentioned an interesting solution, but it adds complexity: implement Range, ConstRange, Cursor, ConstCursor?, algorithms that accept ranges or cursors, and so forth...
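A small example of that back-and-forth (my own function, purely illustrative): a C++ bidirectional iterator can step forward and then back to revisit an element, while popFront on a D range discards the element for good.

```cpp
#include <list>

// Walk forward to the second element, then step back to the first.
// A D range that had popFront'd past the first element could not
// "un-pop" it; you would need a fresh range from the container.
int secondMinusFirst(const std::list<int>& xs) {
    auto it = xs.begin();
    ++it;                 // forward to the second element
    int second = *it;
    --it;                 // back to the first element again
    return second - *it;
}
```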
 I think of things like rotate and bringToFront ...
Nov 14 2018
parent Steven Schveighoffer <schveiguy gmail.com> writes:
On 11/14/18 1:08 PM, Eugene Wissner wrote:
 On Wednesday, 14 November 2018 at 17:47:10 UTC, Basile B. wrote:
 On Wednesday, 14 November 2018 at 16:09:32 UTC, Eugene Wissner wrote:
 No, it wasn't the reason. Some algorithms cannot be implemented with 
 ranges as efficient as with iterators.

 "In other words, by converting the is_word_boundary from iterators to 
 D-style ranges, the algorithm goes from O(1) to O(N). From this we 
 draw the following conclusion: D’s choice of algorithmic basis 
 operations is inherently less efficient than C++’s."

 C++ iterators are more flexible.
You meant D ?
C++. An iterator points to some element in a range, and you can mix it as you need. In D you always have two iterators: beginning and the end of the range. The feature I miss most, is a bidirectional iterator. In C++ you can go back and forth with this iterator. In D if you do popBack there is no way back (or front :)); with popFront you can't reach the element after back, even if there such elements, you have to take an another range from the container.
What you need is an index into a range. The inherent problem with a naked iterator (bidirectional or otherwise) is that it's not safe. It's super easy to get into UB territory.
 D ranges are more compact, because you need the begining and the end in 
 the most cases, so such functions just take one range instead of two 
 iterators. But there are still a lot of cases where D ranges don't work.
It's also much easier to avoid invalid situations with a range vs. carrying around two iterators. Indeed, there are very few algorithms that need three reference points. But where iterators are great is simply as a pointer to an item in a container (see below).
 
 Steven Schveighoffer mentioned an interesting solution but it adds 
 complexity: Implement Range, ConstRange, Cursor, ConstCursor?, 
 algorithms that accept ranges or cursors and so forth...
Ugh, that's an issue with D and tail-const. You really should never need multiple functions for those things; it should just be Range and Cursor, and the tail-modified versions should be handled by the language.

But one thing cursors (at least the ones I implemented) don't solve is the bidirectional nature: a cursor is inherently not really movable. It needs boundaries to prevent Bad Things, and existentially you don't *want* boundaries for a cursor, because boundaries change.

The most common use case I had for a cursor was keeping a reference to an element in a container. At various points I wanted to refer to that element to use it, possibly to restructure the container, and I didn't want to have to search for it again. But if you use an actual range for this, you now have *two* elements you are pointing at (the beginning and the end). It's entirely possible for that relationship to change or become invalid. With a cursor, it can't become invalid, because you are only pointing at one spot.

-Steve
Nov 14 2018
prev sibling parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Wednesday, 14 November 2018 at 17:47:10 UTC, Basile B. wrote:
 On Wednesday, 14 November 2018 at 16:09:32 UTC, Eugene Wissner 
 wrote:
 No, it wasn't the reason. Some algorithms cannot be 
 implemented with ranges as efficient as with iterators.

 "In other words, by converting the is_word_boundary from 
 iterators to D-style ranges, the algorithm goes from O(1) to 
 O(N). From this we draw the following conclusion: D’s choice 
 of algorithmic basis operations is inherently less efficient 
 than C++’s."

 C++ iterators are more flexible.
You meant D ?
No, more flexible and prone to contortion. More flexible, less useful.
Nov 14 2018
prev sibling parent Neia Neutuladh <neia ikeran.org> writes:
On Wed, 14 Nov 2018 16:09:32 +0000, Eugene Wissner wrote:
 No, it wasn't the reason. Some algorithms cannot be implemented with
 ranges as efficient as with iterators.
 
 "In other words, by converting the is_word_boundary from iterators to
 D-style ranges, the algorithm goes from O(1) to O(N). From this we draw
 the following conclusion: D’s choice of algorithmic basis operations is
 inherently less efficient than C++’s."
The code in question (with a stray duplicate declaration of `r` removed):

```
bool is_word_boundary(Range1, Range2)(Range1 front, Range2 back)
if (isBidirectionalRange!Range1 && isInputRange!Range2)
{
    bool is_word_prev = front.empty ? false : isword(front.back);
    bool is_word_this = back.empty ? false : isword(back.front);
    return is_word_prev != is_word_this;
}

size_t n = 0;
for (auto r = myrange; !r.empty; r.popFront(), ++n)
    if (is_word_boundary(takeExactly(myrange, n), r))
        break;
```

This is fine for a random-access range (takeExactly yields a random-access range from it). But with a bidirectional range like a doubly linked list, takeExactly gives you a forward range; you can't efficiently construct `.back()` for it.

If we had a notion of splittable ranges, that would work. (All deterministic ranges are splittable, but unless the range provides a special implementation or is a random-access range, it's O(N) to split.) If subranges were a dedicated concept, that would also work; you could use a subrange to iterate through the outer range. And as Steven Schveighoffer mentioned, generalizing indexes allows a lot of range types to be close enough to random access to make this use case efficient. I'm pretty sure that would be on par with C++ iterators.
Nov 14 2018
prev sibling next sibling parent reply sepiasisy <sepia665 density2v.com> writes:
On Wednesday, 14 November 2018 at 15:33:49 UTC, rikki cattermole 
wrote:
 On 15/11/2018 4:07 AM, lagfra wrote:
 https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/
 
 
 By 2020 C++ is planning to introduce:
 
 * Ranges
Really butchered. From what I can see they never mentioned D in any of the documents (kinda glad tbh). Those documents even question what it should be doing... And the example code... yikes. No way that is going to be used.
 * Contracts
 * Concepts (`__traits`)
 * Proper constexpr
 * Modules
 * Reflections
 * Green threads
Skepticism especially when there is 2023 being listed as a conservative estimate on the Reddit post.
In no way this is the usual trollpost (I am a participant of SAoC). What bugs me is the shortening distance regarding what D has to offer with respect to C++. While D for sure has a way better syntax (thinking of template declarations, `immutable`, UDAs) and a GC, what are the advantages of using D vs C++ if
Nov 14 2018
parent reply Dukc <ajieskola gmail.com> writes:
On Wednesday, 14 November 2018 at 17:08:20 UTC, sepiasisy wrote:
 What bugs me is the shortening distance regarding what D has to 
 offer with respect to C++.
I doubt the distance is shortening. While C++ does advance, and D isn't moving as fast as it was around 2010 (I think), I still believe C++ isn't the faster evolver of the two. When the next C++ standard comes out, D will have improved too. Examples of what might be there by then:

- a stable and reasonably documented DIP1000 implementation
- Dpp able to compile real-world C++ headers
- the ability to script web pages and call the Web API without needing JavaScript glue code for that
- a generally more stable implementation
- @nogc nothrow pure @safe containers that work with allocators and const/immutable

And if they're really going to add all that you listed, and do it well, I don't think it will happen by 2020. Even if the standard makes that year, the implementations are unlikely to.
Nov 14 2018
next sibling parent reply Laeeth Isharc <Laeeth laeeth.com> writes:
On Wednesday, 14 November 2018 at 18:47:22 UTC, Dukc wrote:
 On Wednesday, 14 November 2018 at 17:08:20 UTC, sepiasisy wrote:
 What bugs me is the shortening distance regarding what D has 
 to offer with respect to C++.
I doubt the shortening distance. While C++ does advance and D isn't moving as fast as it was at 2010 (I think), I still believe C++ isn't the faster evolver of the two. When the next C++ standard comes out, D has improved too. Examples of what might be there by then: -stable and reasonably documented DIP1000 implementation -Dpp able to compile real-world c++ headers -ability to script web pages and call the Web API without needing JavaScript glue code for that -generally more stable implementation - nogc nothrow pure safe containers that work with allocators and const/immutable And if they're really going to add all that you listed and do it well, I don't think it will happen by 2020. Even if the standard catches that year, the implementations are unlikely to.
If somebody says they are a C++ expert, they are in for an interesting time in an interview in some places, like with the people I work with, and it usually ends up with "well, not a world expert, I just meant I know C++ reasonably well".

If they deprecate old features and you never need to know them, then maybe D eventually has a harder challenge to appeal to some. But backwards compatibility is a basic value of the C++ community, as I understand it as an observer. There are benefits to that, but it doesn't come for free.

Would you recommend that somebody who knows scripting and used to program in Z80 assembler as a kid learn C++ as a practitioner, when their job involves programming but also involves other things that create more value? Maybe, but it would definitely be a choice one could reasonably question. Demonstrably it's not hard for such a person to learn D, and they can be reasonably productive in it.

Again, it's a big wide world, and the margin of language comparison is not always what you might think. I never thought a significant commercial choice would be whether to replace Extended Pascal with D or Ada, but life is full of surprises, and it turns out that is the case.

Languages are about more than narrowly technical questions too. You can argue that some things are mere consequences of compiler and standard library implementations, but it's unavoidably true that the ecosystem and community are about values too. There are broader consequences of language choice that result from those differences, as Joyent discovered with node.js.

It doesn't do much good, I think, to worry about threats from perceived mighty competitors. Steal ideas, giving due scholarly credit (to do so is a sign of strength and confidence too), from wherever one may find them. But it makes more sense to worry about making D the best language it can be, for a broad interpretation of language.
It's easy being the underdog, because to grow at a decent compounded rate you can just gain users from wherever you find them, and it really doesn't need to be a battle to the death. Life isn't a zero-sum game, and I doubt the number of people writing native code as part of their job has peaked.

DPP already helps, I think. As does dstep. Don't forget also the work to get STL types into druntime.

Well, you're right, D does have fewer libraries: only 1400 on code.dlang. But you also have all of C. And I guess being able to use most of C++ will be a gain again, also because it helps with incremental transitions of larger code bases. There might still be some problems where you can't represent C++ types in D, but ask Atila about that. It's easier for D than for Rust in this respect.

Dmitry Olshansky told me about Graal. For interop on the JVM that doesn't look too bad. No promises, but I think something useful shouldn't be too bad to get to. Won't really know till we try.

D in a pleasant enough way, that would be okay, wouldn't it?

Remaining questions are stability, quality and tooling. The first two are already a lot better, and I guess will be pretty good in time? I don't know, on the tooling, how much it might cost or how it would be accomplished.
Nov 14 2018
next sibling parent reply Russel Winder <russel winder.org.uk> writes:
On Wed, 2018-11-14 at 19:39 +0000, Laeeth Isharc via Digitalmars-d wrote:
[…]
 If somebody says they are a C++ expert, they are in for an=20
 interesting time in an interview in some places, like with the=20
 people I work with and it usually ends up with well not like a=20
 world expert- I just meant I know C++ reasonably well.
At the ACCU conference (https://conference.accu.org) if someone says they are a C++ expert, they are probably a member of WG21. ;-)

(For the uninitiated, WG21 is the C++ standards committee.)

--
Russel.

========================================
Dr Russel Winder          t: +44 20 7585 2200
41 Buckmaster Road        m: +44 7770 465 077
London SW11 1EN, UK       w: www.russel.org.uk
Nov 14 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/14/2018 12:01 PM, Russel Winder wrote:
 At the ACCU conference (https://conference.accu.org ) if someone says they are
 a C++ expert, they are probably a member of WG21. ;-)
 (For the uninitiated WG21 is the C++ standards committee.)
AFAIK, most WG21 members specialize in certain areas of the language. The list of people who thoroughly understand C++ is:

1. Herb Sutter
2. can't think of anyone else :-)
Nov 14 2018
parent Russel Winder <russel winder.org.uk> writes:
On Wed, 2018-11-14 at 15:59 -0800, Walter Bright via Digitalmars-d wrote:
 On 11/14/2018 12:01 PM, Russel Winder wrote:
 At the ACCU conference (https://conference.accu.org) if someone says they
 are a C++ expert, they are probably a member of WG21. ;-)
 (For the uninitiated WG21 is the C++ standards committee.)
AFAIK, most WG21 members specialize in certain areas of the language. The list of people who thoroughly understand C++ is: 1. Herb Sutter 2. can't think of anyone else :-)
Jonathan Wakely, Roger Orr, Kate Gregory, etc. I can think of quite a few more. But yes, WG21 is about fragmentation as much as integration.

A couple of potentially boring facts:

1. C++ has no std::string strip function. I started an effort, Anthony Williams wrote an implementation, the proposal got started, and it is now going nowhere for lack of someone to push it through the WG21 system.

2. C++ has no thread-safe queue, and no thread-safe channel system in the standard library. For a language with std::thread this is unbelievable. WG21 as a whole, it seems, thinks shared-memory multi-threading is the right way of doing concurrency and parallelism. Oh well. As I now use Rust, not C++, I no longer actually care.

Whilst GtkD and GStreamerD are lovely, the GStreamer community is shifting C → Rust, so Rust it has to be for the stuff I am doing now: D isn't really an option in this community, though many know of it.

Rust has a wonderful amalgam of futures, channels, and executors, generally and especially in gtk-rs, that makes multi-threaded, reactive programming trivially easy. It would be great if this got added to D and GtkD, but I can't see it happening: this is not the same thing as vibe.d. JavaScript and Python have their single-threaded version of effectively the same thing, which probably is a bit like vibe.d. Kotlin has its coroutines, Go its goroutines, and Java will likely get something similar in OpenJDK 13 if not 12, to supersede use of the Rx extensions: more like what Rust has, not as JavaScript and Python have.

My feeling, agreed with by many whose judgement I trust, is that this is not "me too" as a fashion, but a genuine move forward for asynchronous computing. Though many opine that Python's current offering is not of good enough quality.

--
Russel.
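The missing-queue point is easy to illustrate: everyone ends up rolling their own on top of std::mutex and std::condition_variable. A minimal sketch (the class and member names are mine, not from any WG21 proposal):

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>

// The thread-safe queue the standard library lacks, in miniature:
// push() under a lock and wake one consumer; pop() blocks until an
// element is available.
template <typename T>
class ConcurrentQueue {
public:
    void push(T value) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            items_.push(std::move(value));
        }
        ready_.notify_one();  // wake one waiting consumer
    }

    // Blocks until an element is available, then removes and returns it.
    T pop() {
        std::unique_lock<std::mutex> lock(mutex_);
        ready_.wait(lock, [this] { return !items_.empty(); });
        T value = std::move(items_.front());
        items_.pop();
        return value;
    }

private:
    std::mutex mutex_;
    std::condition_variable ready_;
    std::queue<T> items_;
};
```

The predicate overload of `wait` guards against spurious wakeups; a real channel abstraction would add close/shutdown semantics on top.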
Nov 14 2018
prev sibling next sibling parent lagfra <me fragal.eu> writes:
On Wednesday, 14 November 2018 at 19:39:42 UTC, Laeeth Isharc 
wrote:
 DPP already helps I think.  As does dstep.  Don't forget also 
 the work to get STL types into druntime.
I agree. In my opinion, an issue with DPP and related work is that they are difficult to follow from outside the D community. Most of the time this forum is the only source of information regarding the language. It's not that I don't follow what goes on in this forum (I check almost daily), yet I don't know what the current progress on DPP (or newCTFE) is.

 D in a pleasant enough way that would be okay, wouldn't it ?

 Remaining questions are stability, quality and tooling.  First 
 two are already a lot better and I guess will be pretty good in 
 time?  I don't know on the tooling how much it might cost or 
 how it would be accomplished.
Pretty much agree.
Nov 14 2018
prev sibling parent reply Francesco Mecca <me francescomecca.eu> writes:
On Wednesday, 14 November 2018 at 19:39:42 UTC, Laeeth Isharc 
wrote:

 DPP already helps I think.  As does dstep.  Don't forget also 
 the work to get STL types into druntime.
Do any of you have any information regarding the STL types in druntime? I'd like to know more.
 Well you're right,D does have fewer libraries.  Only 1400 on 
 code.dlang.  But you also have all of C.  And I guess being 
 able to use most of C++ will be a gain again also because it 
 helps with incremental transitions of larger code bases.  Might 
 still be some problems where you can't represent C++ types in 
 D, but ask Atila about that.
 It's easier for D than for Rust in this respect.
Agreed
Nov 16 2018
parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Friday, 16 November 2018 at 09:03:48 UTC, Francesco Mecca 
wrote:
 On Wednesday, 14 November 2018 at 19:39:42 UTC, Laeeth Isharc 
 wrote:

 DPP already helps I think.  As does dstep.  Don't forget also 
 the work to get STL types into druntime.
Do any of you have any information regarding the STL types in druntime? I'd like to know more.
https://github.com/dlang/druntime/tree/master/src/core/stdcpp std::string is waiting on implementation of DIP1014 due to an interior pointer in the GNU implementation.
Nov 16 2018
prev sibling next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 14 November 2018 at 18:47:22 UTC, Dukc wrote:
 [snip]

 -stable and reasonably documented DIP1000 implementation
 -Dpp able to compile real-world c++ headers
 -ability to script web pages and call the Web API without 
 needing JavaScript glue code for that
 -generally more stable implementation
 - nogc nothrow pure  safe containers that work with allocators 
 and const/immutable
A lot of the stuff that Walter seems focused on like DIP1000 and converting the backend to D seems like stuff that is very valuable, but doesn't necessarily have a big immediate payoff. More like an investment in the future with nebulous (probably large) payoffs down the road. That's probably contributing to some people's sense of frustration.
Nov 14 2018
prev sibling next sibling parent Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Wednesday, 14 November 2018 at 18:47:22 UTC, Dukc wrote:
 On Wednesday, 14 November 2018 at 17:08:20 UTC, sepiasisy wrote:
 What bugs me is the shortening distance regarding what D has 
 to offer with respect to C++.
I doubt the shortening distance. While C++ does advance and D isn't moving as fast as it was at 2010 (I think), I still believe C++ isn't the faster evolver of the two. When the next C++ standard comes out, D has improved too. Examples of what might be there by then:
To me, C++ is reminiscent more and more of the Borg from Star Trek TNG. Resistance is futile; your features will be assimilated. And assimilated they will be, into a big pile of ducts and things going everywhere, ugly as hell and with no apparent logical purpose. It also has the aura of invincibility, but is defeated every time by a small glitch that no one, least of all the collective itself, foresaw because of the weight of the complexity. Sorry for that not quite professional jab, but that's what it reminds me of.
Nov 14 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/14/2018 10:47 AM, Dukc wrote:
 I doubt the shortening distance. While C++ does advance and D isn't moving as 
 fast as it was at 2010 (I think), I still believe C++ isn't the faster evolver 
 of the two. When the next C++ standard comes out, D has improved too. Examples 
 of what might be there by then:
C++ is adding lots of new features. But the trouble is, the old features remain, and people will still use them, and suffer. Examples: 1. The preprocessor remains. There has never been a concerted effort to find replacements for it, then deprecate it. It's like allowing horse-drawn carts on the road. 2. Strings are still 0-terminated. This is a performance problem, memory consumption problem, and is fundamentally memory unsafe. 3. Arrays still decay to pointers, losing all bounds information. 4. `char` is still optionally signed. What a lurking disaster that is. 5. What size is an `int`?
Nov 14 2018
next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Wednesday, November 14, 2018 4:25:07 PM MST Walter Bright via 
Digitalmars-d wrote:
 On 11/14/2018 10:47 AM, Dukc wrote:
 I doubt the shortening distance. While C++ does advance and D isn't
 moving as fast as it was at 2010 (I think), I still believe C++ isn't
 the faster evolver of the two. When the next C++ standard comes out, D
 has improved too. Examples
 of what might be there by then:
C++ is adding lots of new features. But the trouble is, the old features remain, and people will still use them, and suffer. Examples: 1. The preprocessor remains. There has never been a concerted effort to find replacements for it, then deprecate it. It's like allowing horse-drawn carts on the road. 2. Strings are still 0-terminated. This is a performance problem, memory consumption problem, and is fundamentally memory unsafe. 3. Arrays still decay to pointers, losing all bounds information. 4. `char` is still optionally signed. What a lurking disaster that is.
All of those are definitely problems, and they're not going away - though occasionally, the preprocessor does come in handy (as much as I agree that on the whole it's better that it not be there).
 5. What size is an `int`?
While I agree with the point that you're trying to make, that particular type actually isn't really a problem on modern systems in my experience, since it's always 32 bits. Maybe with ARM it's a problem (thus far I've only seriously programmed on x86 and x86-64 machines), and certainly, if you have to deal with 16-bit machines, it's a problem, but for most applications on modern systems at this point, int is always 32 bits. It's long that shoots you in the foot, because that still varies from system to system, and as such, I've always considered long to be bad practice in any C++ code base I've worked on. On the better teams that I've worked on, int has been fine when the size of a type really didn't matter, but otherwise, an integer type has been one of the int*_t types. But even then, when you're dealing with something like printf, you're screwed, because it doesn't understand the types with fixed sizes. So, you're constantly fighting the language and libraries. D's approach of fixing the size of most integer and floating point types is vastly superior, and the problems that we do have there are from the few places where we _didn't_ make them fixed, but since that mostly relates to the size of the memory space, I'm not sure that we really had much choice there. The main outlier is real, though most of the controversy there seems to have to do with arguments about the efficiency of the x87 stuff rather than having to do with differences across systems like you get with the integers. - Jonathan M Davis
Nov 14 2018
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Nov 14, 2018 at 03:25:07PM -0800, Walter Bright via Digitalmars-d wrote:
[...]
 C++ is adding lots of new features. But the trouble is, the old
 features remain, and people will still use them, and suffer.
To be fair, the same could be said of D too. E.g., will autodecoding ever be deprecated? I seriously doubt it, even though it's a significant performance problem. And it's default behaviour, so newcomers will suffer. Just like in C++ it's *possible* to write in a way that avoids the problems of old features, but newcomers still use old features because they're still there, and usually are what you get when you write code in the most straightforward way (i.e., it's "default behaviour"). Will C integer promotion rules ever be replaced? Probably never. Treating 'bool' as a non-integral type has already been rejected, in spite of multiple cases of counterintuitive behaviour when resolving overloads. Similarly for char/int interconversions, even though the whole point of having separate char/wchar/dchar types is to *not* treat UTF encoding units as mere numerical values. Similarly with many other features and language changes that would have benefitted D, but will probably never happen because of the almost paranoid fear of breaking existing code, which isn't all that different from C++'s backward-compatibility mantra.
 Examples:
 
 1. The preprocessor remains. There has never been a concerted effort
 to find replacements for it, then deprecate it. It's like allowing
 horse-drawn carts on the road.
Has there been a concerted effort to find a way to migrate existing D codebases away from autodecoding so that it can be eventually removed?
 2. Strings are still 0-terminated. This is a performance problem,
 memory consumption problem, and is fundamentally memory unsafe.
 
 3. Arrays still decay to pointers, losing all bounds information.
 
 4. `char` is still optionally signed. What a lurking disaster that is.
 
 5. What size is an `int`?
All good points, except for one fly in the ointment: 'real'. :-D The only D type with variable size, it's also a hidden performance degrader (significantly slower on new hardware because x87 hardware hasn't been updated for decades, whereas IEEE standard floats like float/double have seen continuous improvement in hardware support over the last decade). Especially when code carefully crafted to use only float/double still gets implicitly converted to/from real when calling certain functions in std.math, causing significant performance degradation. Not saying that D isn't superior to C++ in many ways (if it wasn't, I wouldn't be here), but one ought to be careful not to end up being the proverbial pot calling the kettle black. T -- It is of the new things that men tire --- of fashions and proposals and improvements and change. It is the old things that startle and intoxicate. It is the old things that are young. -- G.K. Chesterton
Nov 14 2018
prev sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Nov 14, 2018 at 04:47:16PM -0700, Jonathan M Davis via Digitalmars-d
wrote:
 On Wednesday, November 14, 2018 4:25:07 PM MST Walter Bright via 
 Digitalmars-d wrote:
[...]
 5. What size is an `int`?
While I agree with the point that you're trying to make, that particular type actually isn't really a problem on modern systems in my experience, since it's always 32 bits.
[...]
 It's long that shoots you in the foot, because that still varies from
 system to system, and as such, I've always considered long to be bad
 practice in any C++ code base I've worked on. [...] But even then,
 when you're dealing with something like printf, you're screwed,
 because it doesn't understand the types with fixed sizes. So, you're
 constantly fighting the language and libraries.
Haha yeah, using printf with variable-sized integral types like long is a nightmare in C/C++. Though in my current project, a hack solution has been devised by making use of the otherwise-evil preprocessor:

    long l = ...;
    printf("Long value is: %"PRIlong"\n", l);

where PRIlong is a macro defined in a suitable global header file that defines the correct format specifier for 'long'. It's a workaround as ugly as sin, but it does save you from the utter headache of trying to figure out which of %d, %ld, %lld you need to use. (Unfortunately, it doesn't stop people from continuing to write %ld where they really should be writing %"PRIlong"... programming by convention rears its head again with all of its ugly consequences.)
 D's approach of fixing the size of most integer and floating point
 types is vastly superior, and the problems that we do have there are
 from the few places where we _didn't_ make them fixed, but since that
 mostly relates to the size of the memory space, I'm not sure that we
 really had much choice there.
Yeah, recently I've had size_t bite me in the rear: I had code cross-compiled to 32bit Android/ARM and my size_t's were working nicely with int parameters... then I wrote a test driver for testing non-OS specific modules on the host PC, and suddenly I have a whole bunch of compile errors because size_t no longer converts to int.
 The main outlier is real, though most of the controversy there seems
 to have do with arguments about the efficiency of the x87 stuff rather
 than having to do with differences across systems like you get with
 the integers.
[...] Yeah, 'real' is a blemish in D's otherwise beautifully-designed basic types. (Just don't get me started on int promotion rules, and I'll pretend everything else is beautiful too. :-D) Confession: I used to love 'real'... but after researching more thoroughly into its performance characteristics recently (esp. on recent hardware), I've started to realize it wasn't quite what I had expected. T -- What's a "hot crossed bun"? An angry rabbit.
Nov 14 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/14/2018 7:33 AM, rikki cattermole wrote:
 Really butchered. From what I can see they never mentioned D in any of the 
 documents (kinda glad tbh). Those documents even question what it should be 
 doing...
The C++ community insists they invented ranges independently from D. It's obviously not true, but they say that for all the D features they've implemented (except for static if).
Nov 14 2018
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Nov 14, 2018 at 03:09:33PM -0800, Walter Bright via Digitalmars-d wrote:
 On 11/14/2018 7:33 AM, rikki cattermole wrote:
 Really butchered. From what I can see they never mentioned D in any
 of the documents (kinda glad tbh). Those documents even question
 what it should be doing...
The C++ community insists they invented ranges independently from D.
[...] That's ridiculous. Didn't the guy who wrote the C++ range proposal copy the code example from my article on component programming in D? T -- The early bird gets the worm. Moral: ewww...
Nov 14 2018
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/14/2018 3:22 PM, H. S. Teoh wrote:
 That's ridiculous.  Didn't the guy who wrote the C++ range proposal copy
 the code example from my article on component programming in D?
:-)
Nov 14 2018
prev sibling parent reply Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Wednesday, 14 November 2018 at 23:22:44 UTC, H. S. Teoh wrote:
 On Wed, Nov 14, 2018 at 03:09:33PM -0800, Walter Bright via 
 Digitalmars-d wrote:
 On 11/14/2018 7:33 AM, rikki cattermole wrote:
 Really butchered. From what I can see they never mentioned D 
 in any of the documents (kinda glad tbh). Those documents 
 even question what it should be doing...
The C++ community insists they invented ranges independently from D.
[...] That's ridiculous. Didn't the guy who wrote the C++ range proposal copy the code example from my article on component programming in D? T
Yep, the calendar one. TBF, you were in the list of citations for that talk.
Nov 14 2018
parent Norm <norm.rowtree gmail.com> writes:
On Thursday, 15 November 2018 at 06:55:12 UTC, Nicholas Wilson 
wrote:
 On Wednesday, 14 November 2018 at 23:22:44 UTC, H. S. Teoh 
 wrote:
 On Wed, Nov 14, 2018 at 03:09:33PM -0800, Walter Bright via 
 Digitalmars-d wrote:
 On 11/14/2018 7:33 AM, rikki cattermole wrote:
 Really butchered. From what I can see they never mentioned 
 D in any of the documents (kinda glad tbh). Those documents 
 even question what it should be doing...
The C++ community insists they invented ranges independently from D.
[...] That's ridiculous. Didn't the guy who wrote the C++ range proposal copy the code example from my article on component programming in D? T
Yep, the calendar one. TBF, you were in the list of citations for that talk.
https://www.youtube.com/watch?v=mFUXNMfaciE&feature=youtu.be?t=123
Nov 14 2018
prev sibling parent reply aliak <something something.com> writes:
On Wednesday, 14 November 2018 at 23:09:33 UTC, Walter Bright 
wrote:
 On 11/14/2018 7:33 AM, rikki cattermole wrote:
 Really butchered. From what I can see they never mentioned D 
 in any of the documents (kinda glad tbh). Those documents even 
 question what it should be doing...
The C++ community insists they invented ranges independently from D. It's obviously not true, but they say that for all the D features they've implemented (except for static if).
Actually Eric Niebler did mention in his talk (I think it was a keynote) at CppCon 2015 that a lot of the work was done by the D community :)... Or "thanks to the D community" or something along those lines.
Nov 15 2018
parent reply NoMoreBugs <NoMoreBugs NoMoreBugs.com> writes:
On Friday, 16 November 2018 at 06:51:56 UTC, aliak wrote:
 Actually Eric Niebler did mention in his (i think it was a 
 keynote) in cppcon 2015 that a lot of the work was done by the 
 D community :)...  Or "thanks to the D community" or something 
 along those line.
Let's be more precise, shall we ;-) (on the bottom of the slide) 'The idea for this talk was taken from the article "Component programming with ranges" on the D language wiki.' Then a few seconds later, he says that it was 'stolen from the D community'.
Nov 15 2018
parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Friday, 16 November 2018 at 07:14:26 UTC, NoMoreBugs wrote:
 On Friday, 16 November 2018 at 06:51:56 UTC, aliak wrote:
 Actually Eric Niebler did mention in his (i think it was a 
 keynote) in cppcon 2015 that a lot of the work was done by the 
 D community :)...  Or "thanks to the D community" or something 
 along those line.
lets be more precise shall we ;-) (on bottom of slide) 'The idea for this talk was taken from the article "Component programming with ranges" on the D language wiki.' Then a few seconds later, says that it was 'stolen from the D community'.
As a member of the D community 'stolen from the D community' is fine by me, it's their fault they didn't steal it properly ;)
Nov 15 2018
prev sibling next sibling parent bachmeier <no spam.net> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:


 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
Maybe nothing for some programmers. If C++ suddenly becomes a superior language in the future, I know I can start writing C++ and call my existing D code. That's one of the things that makes me comfortable writing D code today. I'll worry about your scenario when C++ is too tempting to avoid - something that is not true for me today.
Nov 14 2018
prev sibling next sibling parent reply kinke <kinke libero.it> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 By 2020 C++ is planning to introduce:

 * Ranges
Still no slicing, right? array[1 .. $-1] is just so handy.
 [...]
 Right now it already has:

 * `auto` variables
I hate having to type `const auto bla = ...`. With D, it's just `const bla = ...`.
 * Ranged for (`foreach`)
But not an implicit 2nd version with index. `foreach (i, e; range) { ... }` comes in handy if you need the index too.
 * Lambda expressions and closures
const auto lambda = [...](T param) { ... }; In D, that can be `(T param) => <expression>` or a nested function, both much more readable.
 * `nothrow` attributes
 * Proper containers
 * Proper RAII
But no distinction between value and reference types, i.e., D's distinction between structs and classes.
 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
Some other things coming to mind OTOH: * *Much* saner templates, no std::enable_if<> crap etc., and of course static if. static foreach may also be interesting in some cases. * Builtin associative arrays. std::unordered_map<Key, Value> => Value[Key]. * Sane basic types. No need to include a header for the definition of `size_t`, and no overlapping distinct basic types (C++ long, wchar_t etc.; nor 4 distinct types for a byte - C++ `char`, `signed char`, `unsigned char`, and the planned `utf8_t`). * `alias this` for simple decorators without any boilerplate at all. * Pretty good interop with C++, and that's constantly getting better. * User-defined attributes. * Built-in unittests and documentation. I surely missed some other cool features.
Nov 14 2018
next sibling parent kinke <kinke libero.it> writes:
On Wednesday, 14 November 2018 at 17:22:50 UTC, kinke wrote:
 I surely missed some other cool features.
I forgot to mention the unified function call syntax, which allows for much more readable and compact code; that's definitely worth mentioning too.
Nov 14 2018
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/14/2018 9:22 AM, kinke wrote:
 I surely missed some other cool features.
Unicode! Functional programming support Arrays of types No preprocessor
Nov 14 2018
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/14/2018 9:22 AM, kinke wrote:
 * Sane basic types. No need to include a header for the definition of
`size_t`, 
 no distinct basic types overlap (C++ long, wchar_t etc.; not 4 distinct types 
 for a byte - C++ `char`, `signed char`, `unsigned char`, and planned `utf8_t`].
It's amazing how much time goes into dealing with C++'s fluid notions of the sizes of those types. "unsigned long" is just awful, varying from memory model to memory model, from platform to platform, especially when C++ uses it for name mangling. I've wasted endless hours with this, and I know what I'm doing with it. I see others doing it, too. Look at all the countless C++ .h files which start with something like: #include <stdint.h> /* ELF file format */ typedef uint16_t Elf32_Half; typedef uint32_t Elf32_Word; typedef int32_t Elf32_Sword; typedef uint32_t Elf32_Addr; typedef uint32_t Elf32_Off; typedef uint32_t elf_u8_f32; or those windows.h "WORD" declarations. Or in phobos\etc\c\zlib\zconf.h: typedef unsigned char Byte; /* 8 bits */ typedef unsigned int uInt; /* 16 bits or more */ typedef unsigned long uLong; /* 32 bits or more */ Every C++ project reinvents this. Just not having to deal with "is `char` signed or unsigned? will that affect my existing code? does my code still work if `int` is 64 bits?" is a big win.
Nov 14 2018
prev sibling next sibling parent reply Stefan Koch <uplink.coder googlemail.com> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/

 By 2020 C++ is planning to introduce:

 * Ranges
 * Contracts
 * Concepts (`__traits`)
 * Proper constexpr
 * Modules
 * Reflections
 * Green threads

 Right now it already has:

 * `auto` variables
 * Ranged for (`foreach`)
 * Lambda expressions and closures
 * `nothrow` attributes
 * Proper containers
 * Proper RAII

 In no way this is the usual trollpost (I am a participant of 
 SAoC). What bugs me is the shortening distance regarding what D 
 has to offer with respect to C++. While D for sure has a way 
 better syntax (thinking of template declarations, `immutable`, 
 UDAs) and a GC, what are the advantages of using D vs C++ if my 
 goal is to build a complex system / product?

 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
newCTFE will blow away constexpr in performance! And when I have time for it, D will get type-manipulation abilities which are much more efficient than what you can do with templates. It'll be faster to compile.
Nov 14 2018
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Nov 14, 2018 at 05:44:35PM +0000, Stefan Koch via Digitalmars-d wrote:
[...]
 newCTFE will blow away constexpr in perfomance!
Can't wait for newCTFE to be ready... any updates on the schedule?
 and when I have time for it, D will get type-manipulation abilities
 which are much more efficient than what you can do with templates.
 
 it'll be faster to compile.
I want to hear more about this. D's templates are darned powerful, and the syntax is far more readable (and writable!) than C++'s, but for certain use cases, like manipulating AliasSeq's (aka "type tuples"), it's just not the right idiom, and often leads to quadratic or even exponential template expansion, which causes template bloat and is often dog-slow. Sometimes I find myself wishing that types were first-class compile-time "variables", and that you could write code that manipulates them directly, e.g., append a type to an array of types from inside a foreach loop, which is much more readable (and writable!) than a recursive template that you have to use today. T -- Answer: Because it breaks the logical sequence of discussion. / Question: Why is top posting bad?
Nov 14 2018
parent Jacob Carlborg <doob me.com> writes:
On 2018-11-14 20:23, H. S. Teoh wrote:

 I want to hear more about this.  D's templates are darned powerful, and
 the syntax is far more readable (and writable!) than C++'s, but for
 certain use cases, like manipulating AliasSeq's (aka "type tuples") it's
 just not the right idiom, and often leads to quadratic or even
 exponential template expansion, which causes template bloat and often
 dog-slow.
 
 Sometimes I find myself wishing that types were first-class compile-time
 "variables", and that you could write code that manipulates them
 directly, e.g., append a type to an array of types from inside a foreach
 loop, which is much more readable (and writable!) than a recursive
 template that you have to use today.
That's exactly what he's referring to (correct me if I'm wrong). -- /Jacob Carlborg
Nov 14 2018
prev sibling next sibling parent sighoya <sighoya gmail.com> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 * Modules
Not really: all C++ will achieve is leaving out header files, but no namespacing is supported, due to backward compatibility. C++ modules aren't mapped to directories/files as they are in Java, and you still get fu**ed with the wonderful C/C++ toolchain of cmake, make and bash.
D will get type-manipulation abilities which are much more 
efficient than what you can do with templates
In which regard?
Sometimes I find myself wishing that types were first-class 
compile-time "variables"
Types as values would be a nice option as can be seen by idris or the zig language.
e.g., append a type to an array of types from inside a foreach 
loop, which is much more readable (and writable!) than a 
recursive template that you have to use today.
Absolutely.
Nov 14 2018
prev sibling next sibling parent Isaac S. <spam-no-reply-isaac outlook.com> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
1. I personally love D's really clean syntax, semantics, and standard library. I also love most of the type system; especially that almost all types have clearly defined sizes. D's overall great design is the reason why I picked it. I don't think it's possible for C++ to ever attain that, as doing so would practically mean making a whole new language. 2. One thing that I don't think gets enough love in D is the fact that _everything_ is initialized with a default value (unless you ask it not to be). It's frustrating the amount of time I've wasted debugging code in C++ due to forgetting to initialize one variable. 3. Unless I missed something, C++ still doesn't have anything like inout, which removes the need to duplicate property functions for const and non-const. 4. I like Dub. While it could use a few improvements, I find it still _far_ more usable than make or cmake. The thing with the features you listed being added to C++ is that most of them are already in D and in the libraries programmed in D. I remember hearing about modules in C++ last year, and they still are not usable. Having to maintain a separate header file is one of the big reasons (but not the only one) I decided against C++. Even when these features are available, many code bases won't use them for a while because they want to maintain compatibility. This is especially true on Linux, where different distros will have different major versions of GCC and Clang.
Nov 14 2018
prev sibling next sibling parent MrTypeSafe <MrTypeSafe gmail.com> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 ...what are the advantages of using D vs C++ if my goal is to 
 build a complex system / product?
That 'goal' really needs further refinement ;-) What is a 'complex system/product' anyway? Are you building an o/s kernel? Are you building a relational database system?
 what will D offer with respect to C++ when almost all key 
 features of D are present in C++20(+)?
So you've moved from a pretty blurry goal, to comparing 'language features'..already? You really need to start at a higher level of thinking first. What paradigm will best suit the 'complexity' you mention? (and remember, multi-paradigm could just increase that complexity and place extra burden on programmers) Do you want/need garbage collection? Do you want/need exception handling? Do you want/need a unified type system? Do you want/need cross-platform (whatever that means these days). How stable is the language? How many developers are you likely to need? How experienced will they need to be? How easy can you obtain their services, and at what cost? Can you retain such programmers over the lifetime of the product? Comparing language features in order to discover which language best suits your goal, first depends on having a clear goal in the first place. Ask the right questions first. That's my advice.
Nov 14 2018
prev sibling next sibling parent aberba <karabutaworld gmail.com> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/

 By 2020 C++ is planning to introduce:

 * Ranges
 * Contracts
 * Concepts (`__traits`)
 * Proper constexpr
 * Modules
 * Reflections
 * Green threads

 Right now it already has:

 * `auto` variables
 * Ranged for (`foreach`)
 * Lambda expressions and closures
 * `nothrow` attributes
 * Proper containers
 * Proper RAII

 In no way this is the usual trollpost (I am a participant of 
 SAoC). What bugs me is the shortening distance regarding what D 
 has to offer with respect to C++. While D for sure has a way 
 better syntax (thinking of template declarations, `immutable`, 
 UDAs) and a GC, what are the advantages of using D vs C++ if my 
 goal is to build a complex system / product?

 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
I've been in the community since 2014 and occasionally monitor the kind of developers using D. D might not be the hype out there, but it's a very solid language from a software engineering perspective. People from very important companies (I mean well known companies people depend on and whose software they use) are starting to use D day by day. It's slowly moving. We can count features, but at the end of the day, what conveniently gets the job done is what matters. It's impractical for companies to leave their legacy code since C++ is already in use, hence their attention to those languages. But D is maturing both in tools, libraries/packages and stability/reliability... making it a valid alternative to consider when starting afresh, or where it makes sense to even port or integrate D into legacy code... D is powerful enough for all that. End of the day, I believe having a community is D's greatest strength. Aberba, Ghana, West Africa. Github.com/aberba
Nov 15 2018
prev sibling next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 11/14/18 4:07 PM, lagfra wrote:
 https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/
 
 
 By 2020 C++ is planning to introduce:
 
 * Ranges
 * Contracts
 * Concepts (`__traits`)
 * Proper constexpr
 * Modules
 * Reflections
 * Green threads
 
 Right now it already has:
 
 * `auto` variables
 * Ranged for (`foreach`)
 * Lambda expressions and closures
 * `nothrow` attributes
 * Proper containers
 * Proper RAII
 
 In no way this is the usual trollpost (I am a participant of SAoC). What 
 bugs me is the shortening distance regarding what D has to offer with 
 respect to C++. While D for sure has a way better syntax (thinking of 
 template declarations, `immutable`, UDAs) and a GC, what are the 
 advantages of using D vs C++ if my goal is to build a complex system / 
 product?
 
 TL;DR: what will D offer with respect to C++ when almost all key 
 features of D are present in C++20(+)?
Thanks for asking. Saw several good answers, to which I feel compelled to add the following.

I just delivered the opening keynote for Meeting C++ (https://meetingcpp.com/2018/Schedule.html). The video will come about in a few days. There's a bit of a Twitter storm going about.

I think C++ is not properly equipped for the next big thing, which is Design by Introspection.

C++ has a history of poorly copying features from D while missing their core point, which makes the imported feature much more difficult to use. The example I give in the talk is that C++ initially rejected everything and anything about static if, then implemented it under a different name ("whatever it is, make sure it's not static if!") while entirely missing the point by having if constexpr insert a new scope (whereby NOT introducing the scope is the entire point of the feature in the first place).

So going through the motions is far from achieving real parity. At the same time, C++ is spending a lot of real estate on language features of minor impact (concepts) or mere distractions (metaclasses), both of which aim squarely not at solving difficult issues in problem space, but at patching over difficulties created by the language itself.

Andrei
Nov 15 2018
parent reply Joakim <dlang joakim.fea.st> writes:
On Thursday, 15 November 2018 at 13:29:47 UTC, Andrei 
Alexandrescu wrote:
 On 11/14/18 4:07 PM, lagfra wrote:
 https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/
 
 
 By 2020 C++ is planning to introduce:
 
 * Ranges
 * Contracts
 * Concepts (`__traits`)
 * Proper constexpr
 * Modules
 * Reflections
 * Green threads
 
 Right now it already has:
 
 * `auto` variables
 * Ranged for (`foreach`)
 * Lambda expressions and closures
 * `nothrow` attributes
 * Proper containers
 * Proper RAII
 
 In no way this is the usual trollpost (I am a participant of 
 SAoC). What bugs me is the shortening distance regarding what 
 D has to offer with respect to C++. While D for sure has a way 
 better syntax (thinking of template declarations, `immutable`, 
 UDAs) and a GC, what are the advantages of using D vs C++ if 
 my goal is to build a complex system / product?
 
 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
Thanks for asking. Saw several good answers, to which I feel compelled to add the following. I just delivered the opening keynote for Meeting C++ (https://meetingcpp.com/2018/Schedule.html). The video will come about in a few days. There's a bit of a twitter storm going about.
Heh, slides like that will do it: https://mobile.twitter.com/JamesMcNellis/status/1063000460280377344
 I think C++ is not properly equipped for the next big thing, 
 which is Design by Introspection.
That was a great talk, finally clicked for me on the overview slides and checkedint example in that keynote.
 C++ has a history of poorly copying features from D while 
 missing their core point, which makes the import much more 
 difficult to use. The example I give in the talk is that C++ 
 initially rejected everything and anything about static if, to 
 then implement it under a different name ("whatever it is, make 
 sure it's not static if!") and entirely missing the point by 
 having if constexpr insert a new scope (whereby NOT introducing 
 the scope is the entire point of the feature in the first 
 place).

 So going through the motions is far from achieving real parity. 
 At the same time C++ is spending a lot of real estate on 
 language features of minor impact (concepts) or mere 
 distractions (metaclasses), both of which aim squarely not at 
 solving difficult issues in problem space, but to patch for 
 difficulties created by the language itself.
Let them keep digging deeper into their hole. If you're right about how D is better, someone will build the next great software with D and prove you right. Speaking of which, Weka might already be it: I'm editing together an interview with Liran for the D blog, should be up soon.
Nov 15 2018
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 11/15/18 9:20 AM, Joakim wrote:
 Speaking of which, Weka might already be it: I'm editing together an 
 interview with Liran for the D blog, should be up soon.
Can't wait!
Nov 18 2018
prev sibling next sibling parent reply Erik van Velzen <erik evanv.nl> writes:
Some in this thread have pointed out that D's templates are 
easier.

But I believe Bjarne is involved in making it so that you can 
just put "auto" or a concept like Integral in a function argument 
and it will automatically become a template.

If that's true even templates will be more concise in C++.
Nov 15 2018
parent reply Ali Çehreli <acehreli yahoo.com> writes:
On 11/15/2018 06:04 AM, Erik van Velzen wrote:

 If that's true even templates will be more concise in C++.
We had similar hopes with C++11's 'auto' keyword and the move semantics with 'auto&&'. Anybody who hasn't read explanations of those concepts like the ones in Scott Meyers's "Effective Modern C++" is fooling themselves about the simplicity of such concepts.

Another example: can anyone memorize which special member functions the compiler generates in the absence, presence, explicit-defaulting, and explicit-deletion (and more? I can't be sure...) states of the same functions? No.

One of the prominent members of the C++ community is local here in Silicon Valley. He hinted that the goal is to keep C++ improved to avoid it becoming like COBOL, where very few experts remain, who are paid $1M salaries: "We don't want C++ to become like COBOL." My answer is, C++ is heading exactly the same place not through natural death but through those improvements.

Ali
Nov 15 2018
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 15, 2018 at 11:17:43AM -0800, Ali Çehreli via Digitalmars-d wrote:
 On 11/15/2018 06:04 AM, Erik van Velzen wrote:
 If that's true even templates will be more concise in C++.
We had similar hopes with C++11's 'auto' keyword and the move semantics with 'auto&&'. Anybody who hasn't read explanations of those concepts like the ones in Scott Meyers's "Effective Modern C++" are fooling themselves about the simplicity of such concepts. Another example: Can anyone memorize what C++ special functions does the compiler generate in the absence, presence, explicit-defaulting, and explicit-deletion (and more? I can't be sure..) states of the same functions? No. One of the prominent members of the C++ community is local here in Silicon Valley. He hinted that the goal is to keep C++ improved to avoid it becoming like COBOL, where very few experts remain, who are paid $1M salaries. "We don't want C++ become like COBOL." My answer is, C++ is heading exactly the same place not through natural death but through those improvements.
[...]

And that's the problem with C++: because of the insistence on backward compatibility, the only way forward is to engineer extremely delicate and elaborate solutions to work around fundamental language design flaws. Consequently, very few people actually understand all the intricate and subtle rules that the new constructs obey, and I daresay pretty much nobody fully understands the implications of these intricate and subtle rules when used together with other equally intricate and subtle features.

The result is an extremely convoluted, hard-to-understand, and fragile minefield where every feature interacts with every other feature in a complex way most people don't fully comprehend, and every two steps have a pretty high chance of resulting in an unexpected explosion somewhere in the code. Writing C++ code therefore becomes an exercise in navigating the obstacle course of an overly complex and fragile language, rather than the language being a tool for facilitating the programmer's conveying his intent to the machine.

It may be thrilling when you successfully navigate the obstacle course, but when I'm trying to get work done in the problem domain rather than wrestle with language subtleties, I really would rather throw out the obstacle course altogether and hire a better translator of my intent to the machine. Like D.

T

--
What do you call optometrist jokes? Vitreous humor.
Nov 15 2018
parent reply Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Thursday, 15 November 2018 at 19:54:06 UTC, H. S. Teoh wrote:
 On Thu, Nov 15, 2018 at 11:17:43AM -0800, Ali Çehreli via 
 Digitalmars-d wrote:
 "We don't want C++ become like COBOL." My answer is, C++ is 
 heading exactly the same place not through natural death but 
 through those improvements.
[...] And that's the problem with C++: because of the insistence on backward compatibility, the only way forward is to engineer extremely delicate and elaborate solutions to work around fundamental language design flaws...
Funny you should say that, as that same problem already holds D back quite a lot. The Argument No.1 against pretty much any change in recent years is "it will break too much code".
 Writing C++ code therefore becomes an exercise in navigating 
 the obstacle course of an overly-complex and fragile language...
Same will happen to D. Or rather, it already has.
Nov 15 2018
next sibling parent reply NoMoreBugs <NoMoreBugs gmail.com> writes:
On Thursday, 15 November 2018 at 22:29:56 UTC, Stanislav Blinov 
wrote:
 Writing C++ code therefore becomes an exercise in navigating 
 the obstacle course of an overly-complex and fragile 
 language...
Same will happen to D. Or rather, it already has.
++1

That's my impression of D too, lately. It's seeking stability, but at what cost?

C++, at least, is boldly going where... perhaps it should not go ;-) ... but at least it's moving in a direction that is (speaking for myself) making programming in C++ a little less brittle (*if* you stick to particular features, and learn them well).

Trying to learn all of C++ is just complete nonsense (as it is for almost any language). It would take decades just to learn it all (and it's a constant moving target now, making it even more difficult - i.e. Scott Meyers)... and nobody needs to use all of the language anyway. D is no exception to this: it is also a rather complex language, with far more features than any single programmer would need in totality. Pick a subset, get good at using it. Preferably the subset that can best provide guarantees of software correctness ;-)

As for the next 'paradigm', it won't be 'unbridled freedom', I guarantee that. The programmers may certainly want that freedom (I certainly do), but the institutions/corporations who will be impacted by that 'unbridled freedom' will want better guarantees around software correctness, not more freedom. So in my opinion, the language that can best provide such guarantees (with consideration to other constraints that might apply) is the language that people will flock to.

D provides a lot in that area (which is what attracted me to it), but it breaks awfully in other areas (I'm thinking implicit conversions (so old school), no concept of private state *within* a module (what! really!), no appetite at all for addressing many issues, ...etc., etc.).

C++ is the Bear. Poke it at your risk.
Nov 15 2018
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 15, 2018 at 11:03:56PM +0000, NoMoreBugs via Digitalmars-d wrote:
[...]
 C++, at least, is boldly going where..perhaps it should not go ;-)
 
 .. but at least it's moving in a direction that is (speaking for
 myself) making programming in C++ at little less brittle (*if* you
 stick to particular features, and learn them well).
That's ridiculous. So you're saying, to use C++ effectively you basically have to use only a subset of it? So why do we keep having to haul the rest of that baggage around? Not to mention, you cannot enforce this sort of thing in a project of any meaningful scale, short of draconian code review institutions. As long as a feature is there, SOMEBODY is bound to use it someday, and it will cause headaches to everyone else.
 Trying to learn all of C++ is just complete nonsense (as it is for
 almost any language). It would take decades just learning it all (and
 it's a constant moving target now, making it even more difficult - i.e
 Scott Meyers).... and nobody needs to use all of the language anyway.
Patently ridiculous. What's the point of having something in a language if nobody's going to use it, or if nobody *should* use it? A language should be a toolbox for the programmer to draw upon, not a minefield of dangerous explosives that you have to very carefully avoid touching in the wrong way. You should be able to combine any language feature with any other language feature, and it should work without any surprising results or unexpected interactions. (Not even D fully meets this criterion, but C++ hasn't even made it on the chart yet!)

Which also means that every new feature or change added to a language not only brings the intended benefits, but also comes at the price of technical debt. The larger the language, the less people can fully comprehend it, and the less you can comprehend it, the more chances you will not use something correctly, resulting in bugs that are hard to find, reduced productivity, and wasted time. A language that cannot be fully comprehended by any one person is far too large, and has the stench of bad language design.
 D is no exception to this - it is also a rather complex language with
 far too many features that any single programmer would need in
 totality. Pick a subset, get good at using it. Preferably the subset
 that can best provide guarantees of software correctness ;-)
Actually, I've used most of D's features myself, even just in my own projects. Won't lie, though: some parts look dark and dirty enough that I wouldn't go beyond reading about it, if I had the choice.
 As for the next 'paradigm', it won't be 'unbridled freedom', I
 guarantee that.
 
 The programmers may certainly want that freedom (I certainly do), but
 the institutions/corporations who will be impacted by that 'unbridled
 freedom', will want better guarantees around software correctness -
 not more freedom.
Then screw the institution. Programming is about facilitating the programmer to express what he wants to the computer, not about binding him in a straitjacket and making him jump through hoops just to express the simplest of concepts (*ahem*cough*Java*sneeze*).

It's a big lie and a great fallacy that language restrictions lead to software correctness. Restrictions only reduce productivity and sweep the problem under the rug -- you may get rid of the superficial mistakes, but fundamental errors committed through inexperience or a faulty paradigm will continue to propagate.

Real support for correctness in a language comes from a careful, well-thought-out design of language features such that the paradigm itself leads you to think about your programming problem in a correct way that results in correct code. Most correctness problems stem from thinking in the wrong way about something, which manifests itself as the symptoms of leaky (or outright unsafe) abstractions, programming by convention, needless boilerplate, and so on. These in turn lead to a proliferation of bugs.
 So in my opinion, the language that can best provide such guarantees
 (with consideration to other constraints that might apply), is the
 language that people will flock too.
Common fallacy: popular == good, unpopular == bad.

I couldn't care less which language people flock to. In fact, in my book, people flocking to the language is a strong indication that I should probably avoid it. People flocked to Java back in the day -- I was skeptical. And recently, after having gotten over my skepticism, I finally conceded to try it out. Did not last long. In the process, although I did find one or two pleasant surprises, my overall experience was that 85% of my time was wasted fighting with a crippled, straitjacketed language, rather than making progress in the problem domain.

Same thing with C++ a year or two ago, when I went back to try to fix up one of my old C++ projects. Found myself fighting with the language more than getting anything done in the problem domain. Ended up spending about half a year to rewrite the whole thing in D (which is a small fraction of the effort it took to do it in C++), with far better results.
 D provides a lot in that area (which is what attracted me to it), but,
 it breaks awfully in other areas ( I'm thinking implicit conversions
 (so old school),
Yawn. Another common fallacy: old school == bad, bandwagon == good. (Not saying that implicit conversions in D are any good -- in fact, I'm in favor of killing off outdated C-style implicit conversions altogether, even if this will only ever happen over Walter's dead body -- but seriously, "old school" as the reason? Wow. Never seen a more rational argument.)
 no concept of private state *within* a module (what! really!), no
 appetite at all for addressing many issues, ...etc..etc).
Yawn. Still not over the hangup over 'private', I see. Missing the forest for the trees. There are far more important issues in programming than such trivialities. But hey, such things require too much thought to comprehend, so why bother?
 C++ is the Bear. Poke it at your risk.
*snort* The Bear, eh? :-D A sickly and overweight one, no doubt, with a deteriorating heart condition dangerously close to a stroke, still clawing the edge of the cliff in memory of its former strength even as it slowly but surely falls off the stage of relevance into the dusts of history, dragged down by its own weight of backward compatibilities and layers of bandages upon patches upon bandages that paper over fundamental design problems that will never be truly fixed. Yawn. T -- Your inconsistency is the only consistent thing about you! -- KD
Nov 15 2018
next sibling parent NoMoreBugs <NoMoreBugs gmail.com> writes:
On Friday, 16 November 2018 at 00:42:05 UTC, H. S. Teoh wrote:
 That's ridiculous.  So you're saying, to use C++ effectively 
 you basically have to use only a subset of it?
Yes. That's exactly what I am saying. Although I would replace 'effectively' with 'safely'.

A programmer writing secure, industrial-strength code must understand the features they are using. No exception. This is why Java is so popular: its limited feature set corresponds well to how much most programmers can continually hold in their heads.
 So why do we keep having to haul the rest of that baggage 
 around?
So all the programmers in the world don't have to program in Java ;-)
 Not to mention, you cannot enforce this sort of thing in a 
 project of any meaningful scale, short of draconian code review 
 institutions.  As long as a feature is there, SOMEBODY is bound 
 to use it someday, and it will cause headaches to everyone else.
Well, you *can* restrict yourself to hiring quality, experienced programmers though.
 Patently ridiculous.  What's the point of having something in a 
 language if nobody's going to use it, or if nobody *should* use 
 it?  A language should be a toolbox for the programmer to draw 
 upon, not a minefield of dangerous explosives that you have to 
 very carefully avoid touching in the wrong way.  You should be 
 able to combine any language feature with any other language 
 feature, and it should work without any surprising results or 
 unexpected interactions. (Not even D fully meets this 
 criterion, but C++ hasn't even made it on the chart yet!)
Which statement is more 'patently ridiculous'... I mean, really. Library implementers need tools most other programmers will not need. The tools needed to write a driver, or a kernel, are not the tools most programmers will need. What you don't need to use, you don't need to know about.
 Which also means that every new feature or change added to a 
 language not only brings the intended benefits, but also comes 
 at the price of technical debt.  The larger the language, the 
 less people can fully comprehend it, and the less you can 
 comprehend it, the more chances you will not use something 
 correctly, resulting in bugs that are hard to find, reduce 
 productivity, and waste time.  A language that cannot be fully 
 comprehended by any one person is far too large, and has the 
 stench of bad language design.
So, it follows, from your arguments, that we should all use Java. Is even that too complex for you?
 Actually, I've used most of D's features myself, even just in 
 my own projects.  Won't lie, though: some parts look dark and 
 dirty enough that I wouldn't go beyond reading about it, if I 
 had the choice.
A return to common sense. Hooray!
 Then screw the institution.  Programming is about facilitating 
 the programmer to express what he wants to the computer, not 
 about binding him in a straitjacket and making him jump through 
 hoops just to express the simplest of concepts 
 (*ahem*cough*Java*sneeze*).
Well, a programmer would say that. But you want to be employed? Now you seem to be arguing for a language that has more complexity? C'mon... make up your mind.
 It's a big lie and a great fallacy that language restrictions 
 lead to software correctness.  Restrictions only reduce 
 productivity, and sweep the problem under the rug -- you may 
 get rid of the superficial mistakes, but fundamental errors 
 committed through inexperience or a faulty paradigm will 
 continue to promulgate.
Well, a programmer would say that. But you want to be employed? Start writing correct code - even if that means more effort.
 Real support for correctness in a language comes from a 
 careful, well thought-out design of language features such that 
 the paradigm itself leads you to think about your programming 
 problem in a correct way that results in correct code. Most 
 correctness problems stem from thinking in the wrong way about 
 something, which manifests itself as the symptoms of leaky (or 
 outright unsafe) abstractions, programming by convention, 
 needless boilerplate, and so on.  These in turn leads to a 
 proliferation of bugs.
If there is 'thinking' involved, then bugs will follow. It can't be helped.

What's important to me is being able to very clearly state my intent, so that the compiler knows it, others know it, and most of all, I know it. How many features the language has is irrelevant to me. It is only necessary that the language allows you to clearly express your intent.

The language should serve me. Not I serve the language.
 Common fallacy: popular == good, unpopular == bad.

 I couldn't care less which language people flock to.  In fact, 
 in my book, people flocking to the language is a strong 
 indication that I should probably avoid it.  People flocked to 
 Java back in the day -- I was skeptical.  And recently, after 
 having gotten over my skepticism, I finally conceded to try it 
 out.  Did not last long.  In the process, although I did find 
 one or two pleasant surprises, my overall experience was that 
 85% of my time was wasted fighting with a crippled, 
 straitjacketed language, rather than making progress in the 
 problem domain.
Again? I cannot determine which side of your two arguments you are on.
 (Not saying that implicit conversions in D are any good -- in 
 fact, I'm in favor of killing off outdated C-style implicit 
 conversions altogether, even if this will only ever happen over 
 Walter's dead body -- but seriously, "old school" as the 
 reason? Wow. Never seen a more rational argument.)
It's old school to have the language deciding for you what your intent is. I agree that D2 will not change that, nor should it. But it could provide 'options' that programmers could 'opt into'. But then people will say 'no, change is too complex for us to handle'. Or (more likely), Walter won't like it. So let's not even bother thinking about it anymore.
 Yawn.  Still not over the hangup over 'private', I see.  
 Missing the forest for the trees.  There are far more important 
 issues in programming than such trivialities.
Really? Private state is a trivial notion? Who'd have guessed. Anyway, the D module is really just a 'God' class (or struct, depending on your point of view).
 *snort* The Bear, eh? :-D  A sickly and overweight one, no 
 doubt, with a deteriorating heart condition dangerously close 
 to a stroke, still clawing the edge of the cliff in memory of 
 its former strength even as it slowly but surely falls off the 
 stage of relevance into the dusts of history, dragged down by 
 its own weight of backward compatibilities and layers of 
 bandages upon patches upon bandages that paper over fundamental 
 design problems that will never be truly fixed.  Yawn.
Well, keep poking it ...you'll get its attention soon enough ;-)
Nov 15 2018
prev sibling parent reply Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Friday, 16 November 2018 at 00:42:05 UTC, H. S. Teoh wrote:
 A language should be a toolbox for the programmer to draw upon, 
 not a minefield of dangerous explosives that you have to very 
 carefully avoid touching in the wrong way.
That is a most beautiful turn of phrase. You should add that to the list of random quotes at the bottom of your posts.
Nov 15 2018
next sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, November 15, 2018 9:41:31 PM MST Nicholas Wilson via 
Digitalmars-d wrote:
 On Friday, 16 November 2018 at 00:42:05 UTC, H. S. Teoh wrote:
 A language should be a toolbox for the programmer to draw upon,
 not a minefield of dangerous explosives that you have to very
 carefully avoid touching in the wrong way.
That is a most beautiful turn of phrase. You should add that to the list of random quotes at the bottom of your posts.
It's definitely good, but it would be a bit funny for him to randomly quote himself. - Jonathan M Davis
Nov 15 2018
parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Friday, 16 November 2018 at 05:02:32 UTC, Jonathan M Davis 
wrote:
 On Thursday, November 15, 2018 9:41:31 PM MST Nicholas Wilson 
 via Digitalmars-d wrote:
 On Friday, 16 November 2018 at 00:42:05 UTC, H. S. Teoh wrote:
 A language should be a toolbox for the programmer to draw 
 upon, not a minefield of dangerous explosives that you have 
 to very carefully avoid touching in the wrong way.
That is a most beautiful turn of phrase. You should add that to the list of random quotes at the bottom of your posts.
It's definitely good, but it would be a bit funny for him to randomly quote himself.
It would be hilarious, especially if it was in context (and would add quite a bit to the confirmation bias that it is not at all random).
Nov 15 2018
prev sibling parent NoMoreBugs <NoMoreBugs NoMoreBugs.com> writes:
On Friday, 16 November 2018 at 04:41:31 UTC, Nicholas Wilson 
wrote:
 On Friday, 16 November 2018 at 00:42:05 UTC, H. S. Teoh wrote:
 A language should be a toolbox for the programmer to draw 
 upon, not a minefield of dangerous explosives that you have to 
 very carefully avoid touching in the wrong way.
That is a most beautiful turn of phrase. You should add that to the list of random quotes at the bottom of your posts.
Except, that one does not really need to stay away from them. One just needs to understand them. If one needs to stay away from them, they shouldn't be there.
Nov 15 2018
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 15, 2018 at 10:29:56PM +0000, Stanislav Blinov via Digitalmars-d
wrote:
 On Thursday, 15 November 2018 at 19:54:06 UTC, H. S. Teoh wrote:
 On Thu, Nov 15, 2018 at 11:17:43AM -0800, Ali Çehreli via Digitalmars-d
 wrote:
 "We don't want C++ become like COBOL." My answer is, C++ is
 heading exactly the same place not through natural death but
 through those improvements.
[...] And that's the problem with C++: because of the insistence on backward compatibility, the only way forward is to engineer extremely delicate and elaborate solutions to work around fundamental language design flaws...
Funny you should say that, as that same problem already holds D back quite a lot. The Argument No.1 against pretty much any change in recent years is "it will break too much code".
Yes, this obsession with not breaking existing code is a thorn in D's side. D is not quite at the point of C++, where *nothing* legacy can change (like getting rid of that evil preprocessor); we still have a process for deprecating old stuff. A slow process, perhaps slower than many would like, but it has been happening over the years, and has cleaned up some of the uglier parts of the language / standard library.

Still, it's sad to see that bad decisions like autodecoding probably won't ever be fixed, because it "breaks too much code".
 Writing C++ code therefore becomes an exercise in navigating the
 obstacle course of an overly-complex and fragile language...
Same will happen to D. Or rather, it already has.
I won't pretend D doesn't have its dark corners... but as of right now, it's still orders of magnitude better than C++. It lets me express complex computations with a minimum of fuss and red tape, and I can get a lot done in a short time far better than in C/C++/Java. Especially Java. :-P

So far, at least, I haven't found another language that stays out of my way the way D does. D is far from perfect, but I haven't seen a better alternative yet.

T

--
If you look at a thing nine hundred and ninety-nine times, you are perfectly safe; if you look at it the thousandth time, you are in frightful danger of seeing it for the first time. -- G. K. Chesterton
Nov 15 2018
prev sibling next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, November 15, 2018 4:44:00 PM MST H. S. Teoh via Digitalmars-d 
wrote:
 On Thu, Nov 15, 2018 at 10:29:56PM +0000, Stanislav Blinov via 
Digitalmars-d wrote:
 On Thursday, 15 November 2018 at 19:54:06 UTC, H. S. Teoh wrote:
 On Thu, Nov 15, 2018 at 11:17:43AM -0800, Ali Çehreli via
 Digitalmars-d

 wrote:
 "We don't want C++ become like COBOL." My answer is, C++ is
 heading exactly the same place not through natural death but
 through those improvements.
[...] And that's the problem with C++: because of the insistence on backward compatibility, the only way forward is to engineer extremely delicate and elaborate solutions to work around fundamental language design flaws...
Funny you should say that, as that same problem already holds D back quite a lot. The Argument No.1 against pretty much any change in recent years is "it will break too much code".
Yes, this obsession with not breaking existing code is a thorn in D's side. D is not quite at the point of C++, where *nothing* legacy can change (like getting rid of that evil preprocessor); we still have a process for deprecating old stuff. A slow process, perhaps slower than many would like, but it has been happening over the years, and has cleaned up some of the uglier parts of the language / standard library. Still, it's sad to see that bad decisions like autodecoding probably won't ever be fixed, because it "breaks too much code".
C++ doesn't really have a process for deprecating old features or code. They sort of do, but not really. We do have that ability and use it, which helps, but we still have the basic tension between code continuing to work and being able to move the language and standard library forward. No one likes coming back to their code and finding that it doesn't work anymore. They want it to work as-is forever. And yet everyone also wants new features. And they want the old cruft gone. Balancing all of that is difficult. I'm not saying that we're necessarily doing a fantastic job of that, but there isn't an easy answer, and no matter what answer you pick, some portion of the user base is going to be annoyed with you. We can't both be completely stable and completely flexible. A balance of some kind must be struck. It remains to be seen whether we're managing to strike a good balance or whether we'll do so in the future.

As for features like auto-decoding, the core problem is how much they have their tendrils in everything. When it's a straightforward process to deprecate something, then it's a pretty simple question of whether we think the change is worth the breakage that it causes, but when it affects as much as auto-decoding does, it becomes very difficult to even figure out _how_ to do it. Honestly, I think that that's the biggest thing that's prevented removing auto-decoding thus far. It's not that someone proposed a plan to Walter and Andrei and they rejected it because it was going to break too much code. No one has even proposed a plan that was feasible - at least not if you care about having any kind of transition process as opposed to immediately breaking tons of code (potentially in silent ways in some cases). AFAIK, pretty much the only plan that we have right now that would work would amount to creating D3 - or at least would basically mean throwing out the version of Phobos that we have, which isn't much different. 
Either way, it would mean completely forking the existing code base and community, and we really don't want to do that. We want a plan that involves removing auto-decoding in D2 in place. The first step in that whole mess, which really has not been done anywhere near the level that it needs to be done, is to ensure that Phobos in general doesn't care whether a range of characters is of char, wchar, dchar, or graphemes. Many of the specializations for strings do still need to be there to avoid auto-decoding (at least as long as auto-decoding is there), but a lot of the code assumes that ranges of characters are ranges of dchar, and it _all_ needs to work with ranges of char, wchar, dchar, and graphemes. Once it's reasonably character-type agnostic (and what that means exactly is going to vary depending on what the function does), then we can sit back and see how much in the way of auto-decoding tendrils are left. At minimum, at that point, code like byCodeUnit is then going to work as well as it can, even if we can't actually remove auto-decoding. Until that's done, byCodeUnit and its ilk are going to keep running into problems in various places when you try to use them with Phobos. As it stands, when using them with Phobos, they work sometimes, and other times, they don't. So, that work is necessary regardless of what happens with auto-decoding. But once that work is done, it may very well be that we can finally find a way to get rid of auto-decoding. I don't know. I still question it given how it's tied into arrays and UFCS, making it so that we can't properly use the module system to deal with conflicts, but at least at that point, we'll have done the work that needs to be done that surrounds the problem, and it's work that needs to be done whether we get rid of auto-decoding or not. *sigh* Honestly, auto-decoding is almost a perfect storm of issues for us being able to actually get rid of it. 
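The auto-decoding behaviour under discussion can be seen in a few lines of D. This is a minimal sketch (counts assume a UTF-8 source string): range algorithms over a `string` iterate decoded `dchar` code points, while `byCodeUnit` opts back into raw code units.

```d
import std.algorithm.searching : count;
import std.utf : byCodeUnit;

void main()
{
    string s = "résumé"; // 6 code points, 8 UTF-8 code units ('é' is 2 bytes)

    // Range algorithms auto-decode strings: they see dchar code points.
    assert(s.count == 6);

    // The underlying array is still raw char code units.
    assert(s.length == 8);

    // byCodeUnit opts out of auto-decoding and iterates code units directly.
    assert(s.byCodeUnit.count == 8);
}
```

The decoding step is exactly the hidden cost (and the hidden assumption about element type) that the specializations in Phobos exist to avoid.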
So, while I agree with you that we'd ideally fix the problem, it's _not_ an easy one to fix, and really the only "easy" way to fix it is to pretty much literally say "D3" and hard break all code. I think that the reality of the matter is that there are issues in every language that you can't fix without either creating a new language or creating a new version of the language that's not backwards compatible with the old one (which then forks the language and community). So, while we'd very much like to fix everything, there are going to be some things we simply can't fix if we're not willing to create D3, and talking about D3 creates a whole other can of worms, which I don't think we're even vaguely ready for yet.

Maybe auto-decoding will turn out to be fixable, maybe it won't, but I think that it's going to be inevitable that _some_ things will be unfixable. I love D, but it's never going to be perfect. No programming language will be, much as I would love to use one. We should do the best that we can to approach perfection, but we're going to miss in some places, and as it stands, we definitely missed when it comes to auto-decoding.

- Jonathan M Davis
Nov 15 2018
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 15, 2018 at 06:03:37PM -0700, Jonathan M Davis via Digitalmars-d
wrote:
[...]
 *sigh* Honestly, auto-decoding is almost a perfect storm of issues for
 us being able to actually get rid of it. So, while I agree with you
 that we'd ideally fix the problem, it's _not_ an easy one to fix, and
 really the only "easy" way to fix it is to pretty much literally say
 "D3" and hard break all code. I think that the reality of the matter
 is that there are issues in every language that you can't fix without
 either creating a new language or creating a new version of the
 language that's not backwards compatible with the old one (which then
 forks the language and community).  So, while we'd very much like to
 fix everything, there are going to be some things we simply can't fix
 if we're not willing to create D3, and talking about D3 creates a
 whole other can of worms, which I don't think we're even vaguely ready
 for yet.
Talking about D3 has sorta become taboo around here, for understandable reasons -- splitting the community now might very well be the death of D after that Tango vs. Phobos fiasco. Python survived such a transition, and Perl too AIUI. But D currently does not have nearly the size of Python or Perl to be able to bear the brunt of such a drastic change. Nevertheless I can't help wondering if it would be beneficial to one day sit down and sketch out D3, even if we never actually implement it. It may give us some insights on the language design we should strive to reach, based on the experience we have accumulated thus far. Autodecoding, even though it's a commonly mentioned example, actually is only a minor point as far as language design is concerned. More fundamental issues could be how to address the can of worms that 'shared' has become, for example, or what the type system might look like if we were to shed the vestiges of C integer promotion rules.
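As a concrete illustration of those C promotion vestiges, the following D sketch shows `ubyte` arithmetic being silently promoted to `int`, so the result can't be assigned back to a `ubyte` without an explicit cast:

```d
void main()
{
    ubyte a = 200;
    ubyte b = 100;

    // Inherited C rule: operands narrower than int are promoted to int,
    // so the sum has type int and holds 300 rather than wrapping.
    auto sum = a + b;
    static assert(is(typeof(sum) == int));
    assert(sum == 300);

    // ubyte c = a + b; // error: cannot implicitly convert int to ubyte
}
```

Whether a hypothetical D3 type system would keep, refine, or drop these rules is exactly the kind of question a design sketch could explore.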
 Maybe auto-decoding will turn out to be fixable, maybe it won't, but I
 think that it's going to be inevitable that _some_ things will be
 unfixable. I love D, but it's never going be perfect. No programming
 language will be, much as I would love to use one. We should do the
 best that we can to approach perfection, but we're going to miss in
 some places, and as it stands, we definitely missed when it comes to
 auto-decoding.
[...] It's true that at some point, you just have to dig in and work with what you have, rather than wish for an ideal language that never arrives. Still, one can't help wondering what the ideal language might look like. A crazy wishful thought of mine is versioned support of the language, analogous to versioned library dependencies. Imagine if we could specify a D version in source files, and the compiler switches to "compatibility" mode for D2 and "native" mode for D3. The compiler would emit any necessary shunt code to make code compiled in D2 mode binary-compatible with code compiled in D3 mode, for example (modulo incompatible language changes, of course). This way, existing codebases could slowly transition to D3, module-by-module. Sorta like the -dipxxxx switches, but embedded in the source file, and on a larger scale. (I know this is probably not feasible in practice, because of the massive amount of work it will take to maintain parallel language versions in the same compiler, which will require manpower we simply do not have. But one can dream.) T -- My father told me I wasn't at all afraid of hard work. I could lie down right next to it and go to sleep. -- Walter Bright
Nov 15 2018
next sibling parent reply AlCaponeJr <a c.com> writes:
On Friday, 16 November 2018 at 02:02:26 UTC, H. S. Teoh wrote:
 ... after that Tango vs. Phobos fiasco....
Do you mind explaining this for newcomers? Al.
Nov 15 2018
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, November 15, 2018 7:30:15 PM MST AlCaponeJr via Digitalmars-d 
wrote:
 On Friday, 16 November 2018 at 02:02:26 UTC, H. S. Teoh wrote:
 ... after that Tango vs. Phobos fiasco....
Do you mind to explain this for newcomers?
Walter has always been focused primarily on dmd, not the standard library or its runtime. During D1, Andrei was not involved, and Walter was the main person behind everything. Phobos was the standard library, and it included druntime. They weren't separate, and the standard library was very small. The process was also less open at the time, since none of it was on github. It was in a subversion repo that not many folks had access to. So, it was much harder to get changes committed, and Walter's focus was on the language, not the library. There were folks in the D community who created a library called Tango which grew to dwarf Phobos. At some point in the process, it got its own runtime, which meant that it was incompatible with Phobos. So, anyone using D1 had to choose whether they wanted to use the standard library, Phobos, or to use Tango. Almost everyone chose Tango, since it was much, much larger and was actively developed. This did sort of fork the community, and there was talk of two standard libraries (though officially, there was only Phobos), since you had to pick one or the other. However, since most everyone used Tango, I think that the biggest problem here was really a PR problem if anything. The fact that you couldn't use the standard library with Tango was definitely bad, but almost everyone used Tango, so I don't think that there was a huge split in reality (though I didn't use D until early in D2, so my understanding could be off). It did make the community look bad though. The bigger fork in the community came with D2. Folks had been using D1 in production, and when Walter was going to add const in a serious way, it was going to break enough code that he decided to fork the language so that folks could continue to use D1, and he could continue to develop the language with D2. Andrei had joined up around then. So, he took over the design of Phobos and started to influence the direction of the language (e.g. 
making everything thread-local by default). And since Walter no longer had to worry about avoiding breaking production code, he was a lot more willing to make massive changes. So, for a while, D2 was very volatile and not at all viable for production. During that process, Phobos grew quite a bit. The runtime was split out from Phobos to try and fix the problem that had existed between Phobos and Tango in D1, and in fact, druntime comes from Tango's runtime. But as D2 grew and matured, it became a very different language from D1. So, eventually, we ended up with a shrinking community using D1 with Tango, and a growing community using D2 with Phobos (some of whom came from the D1 community). A lot of the folks using D1 did not like D2. Phobos was _very_ different in its approach from Tango. D1 and Tango were more Java-like if anything, and Phobos and D2 are closer to C++ and the STL in their approach. Tango did eventually get ported to D2, and thanks to druntime being separate from Phobos, Phobos and Tango could be used together, but they were so different that most folks probably still wouldn't for the most part. And at this point, Tango is unmaintained. So, the fact that we had two major libraries in D1 that were viewed as the "standard" library (even though only one was actually the standard library) was a divider in the community, and the switch to D2 was an even bigger divider in the community. I think that the switch to D2 was by far the bigger divider in reality, but outside of the community, folks still sometimes bring up how D is divided by having two standard libraries. - Jonathan M Davis
Nov 15 2018
prev sibling parent reply =?UTF-8?B?QXVyw6lsaWVu?= Plazzotta <iam gmail.com> writes:
On Friday, 16 November 2018 at 02:02:26 UTC, H. S. Teoh wrote:
 Talking about D3 has sorta become taboo around here, for 
 understandable reasons -- splitting the community now might 
 very well be the death of D after that Tango vs. Phobos fiasco.
  Python survived such a transition, and Perl too AIUI.  But D 
 currently does not have nearly the size of Python or Perl to be 
 able to bear the brunt of such a drastic change.

 Nevertheless I can't help wondering if it would be beneficial 
 to one day sit down and sketch out D3, even if we never 
 actually implement it. It may give us some insights on the 
 language design we should strive to reach, based on the 
 experience we have accumulated thus far. Autodecoding, even 
 though it's a commonly mentioned example, actually is only a 
 minor point as far as language design is concerned.  More 
 fundamental issues could be how to address the can of worms 
 that 'shared' has become, for example, or what the type system 
 might look like if we were to shed the vestiges of C integer 
 promotion rules.


 T
I can't help but think D3 is the one true way to go to become significantly adopted by more companies. Even though D2 is used by several companies, it still remains a very marginal language, and a migration with no backward compatibility wouldn't be much of a problem to deal with. Maybe ten companies are using D for minor projects; that doesn't justify stalling or slowing down the whole language spec.

The language is still in a phase where it can allow itself breaking changes to significantly improve its performance or reduce frictions and asymmetries. However, I wouldn't gamble a penny on D3, for more practical reasons. The creators and the main contributors of D are all full-time C++ developers, and some of them are even members of C++ committees. For example, Walter distributes and commercializes a C++ compiler, and Andrei contributes to C++ meetings in order to identify and improve the weaknesses of C++... So why should they care about a D3 branch?

No offense, but I don't think anyone important here believes in D becoming an industry-proof language in any timeline. In my opinion and with due respect, I am convinced that D is treated more or less like a research laboratory, to test and implement new features that then improve the C++ spec and its standard libraries. The D project, despite its elegant design, isn't in pursuit of becoming a successful language itself, but a successful sandbox. I hope I am wrong, but after 2 years toying with the language for home projects and reading the changelogs and the forums, it is my overall impression that something is wrong and cannot be ignored.

D cannot grow and develop its own identity if the main focus is C/C++ compatibility. Make no mistake: nobody in the D community will abandon his C++ job to persuade an employer to hire him full-time for D (and thereby contribute to the success of D), if D doesn't do anything other than interoperate with the C++ they already use. 
History has shown us that trying to tease and please the C++ community does not help D get adopted in industry. I hope I haven't hurt anyone's feelings, but the D project lacks a bit of long-term vision. And it scares me that I may never get to use D in my professional life.
Nov 15 2018
parent reply Laeeth Isharc <Laeeth laeeth.com> writes:
On Friday, 16 November 2018 at 03:50:55 UTC, Aurélien Plazzotta 
wrote:
 On Friday, 16 November 2018 at 02:02:26 UTC, H. S. Teoh wrote:
 Talking about D3 has sorta become taboo around here, for 
 understandable reasons -- splitting the community now might 
 very well be the death of D after that Tango vs. Phobos fiasco.
  Python survived such a transition, and Perl too AIUI.  But D 
 currently does not have nearly the size of Python or Perl to 
 be able to bear the brunt of such a drastic change.

 Nevertheless I can't help wondering if it would be beneficial 
 to one day sit down and sketch out D3, even if we never 
 actually implement it. It may give us some insights on the 
 language design we should strive to reach, based on the 
 experience we have accumulated thus far. Autodecoding, even 
 though it's a commonly mentioned example, actually is only a 
 minor point as far as language design is concerned.  More 
 fundamental issues could be how to address the can of worms 
 that 'shared' has become, for example, or what the type system 
 might look like if we were to shed the vestiges of C integer 
 promotion rules.


 T
I can't help but think D3 is the one true way to go to become significantly adopted by more companies.
Maybe it would be a good thing, but I think it's maturity of compilers, ecosystem and tooling that's the biggest obstacle for adoption - and perceptions are lagging and where we are today is a consequence of decisions taken some time back. Start to improve things and it's a while before you see results and longer again before people notice.
 Maybe ten companies are using D for minor projects, it doesn't 
 justify to stall or slow down the whole language specs.
Come on - there are some quite significant companies where D is critical, and others where it is used for important projects. Starting again from scratch may or may not be a good idea, but it won't exactly help with maturity, or, for many, with adoption. "I think I will wait till D3 is ready". I don't know, but any fixing of past mistakes seems to me a much smaller change than D1 to D2, so I don't know if that is the ideal framing. It's more like D 2.1, although then the versioning of releases gets a bit confusing.
 The language is still in a phase where it can allow himself 
 breaking changes to significantly improve its performances or 
 reduce the frictions and asymetries.
Sure.
 However, I wouldn't gamble a penny on D3 for more practical 
 reasons.
 The creators and the main contributors of D are all C++ 
 full-time developers
? I don't think so. Walter is a full-time developer of the D language and he has just put a lot of work into moving both dmd and DMC from C++ to D. Andrei quit Facebook to work full-time on the D Foundation. Maybe they do some other consulting on the side. If you were CEO of D and they worked for you, you would want them to do this because it's good for the language that they do. Other contributors do all kinds of different things. I work with a decent number of them and mostly they work on D projects full-time.
  some of them even members of C++ commities.
Traitors! We must hunt them down and expel them! Seriously, how can this be a bad thing? Some people are even members of non-native code communities also! Why wouldn't we want to have the benefit of the idea interchange that results?
 For example, Walter distribute and commercialize a C++ compiler
That's his old gig and he keeps something going, I guess, but it doesn't look to me like he is putting much time into DMC. When is the Cpp 2017 version coming out? My guess is never.

 And Andrei contribute to C++ meeting in order to
 identify and improve the weaknesses of C++...
And they often don't listen, and mess it up when they do, just like with static if. How is this a bad thing? If C++ gets better, I really don't think it's bad for D.
 No offense, but I don't think anyone important here believe in 
 D becoming a industry-proof language in any timeline.
Each must think what they will, and I never worried much myself about who is important, but rather who is doing good and interesting work. Of those people, some seem to be doing remarkably well using D. I guess they aren't themselves bothered whether you consider them important either!

 In my
 opinion and with due respect, I am convinced that D is more or 
 less processed like a research laboratory to test and implement 
 new features to then improve C++ specs and its standard 
 librairies.
Interesting opinion. I personally disagree - seems clearly wrong to me and some here might say "if only! Would that it were true!"
 D cannot grow and develop its own identity if the main focus is 
 C/C++ compatibility.
Do you really think that's the case that it's the main focus?
 Make no mistake, nobody will abandon his job in C++ among the D 
 community to persuade a employer to hire him for a D full-time 
 job
:) There are no jobs in D :) I'm pretty sure you are mistaken both on the supply and demand side. And we are still hiring.
 I wish I haven't hurt anyone's feelings but D project lack a 
 bit of long-term vision.
I think right now the vision is clear enough, as far as it needs to be articulated. The biggest constraint is that the D Foundation hasn't been in existence for long; it's quite a lot of work to create something from nothing, and it takes time from the beginning before you start to see results. Though if one pays attention, I don't think it's hard to see plenty of results already. Everything is an S curve - very flat in the beginning.
Nov 15 2018
parent reply =?UTF-8?B?QXVyw6lsaWVu?= Plazzotta <cmoi gmail.com> writes:
On Friday, 16 November 2018 at 07:20:25 UTC, Laeeth Isharc wrote:

 Starting again from scratch may or may not be a good idea, but 
 it won't exactly help with maturity or for many adoption.  "I 
 think I will wait till D3 is ready".
Maybe a tool like Python's 2to3 would facilitate the transition from D2 to D3? Or perhaps a D2 branch maintained for a limited amount of time, like a year. In general, the longer the migration phase, the less likely an actual migration effort is to occur.
 I don't know but any fixing of past mistakes seems to me a much 
 smaller change than D1 to D2 and so I don't know if it is the 
 ideal framing.  It's more like D 2.1 although then the 
 versioning of releases gets a bit confusing.
I think the currently broken semantic versioning is going to find a simple fix when D 2.100.0 is eventually released, at which point the leading "0" character won't matter anymore.
 The creators and the main contributors of D are all C++ 
 full-time developers
  some of them even members of C++ commities.
Traitors! We must hunt them down and expel them! Seriously,how can this be a bad thing? Some people are even members of non-native code communities also! Why wouldn't we want to have the benefit of the idea interchange that results ?
Please, don't turn me into a parody :/ I don't see them as traitors, of course; I know they put a lot of effort into the D language, and it's a wonderful thing for our community to have several demigods among us. But my point is, I'm persuaded they believe much more in C++ than they believe in a future for D; the latter being rather a technological demo of something they would have hoped for at the beginning of their career.
 For example, Walter distribute and commercialize a C++ compiler
That's his old gig and he keeps something going I guess, but it doesn't look to me like he is putting much time into DMC. When is the Cpp 2017 version coming out? My guess is never. > And Andrei contribute to C++ meeting in order to
 identify and improve the weaknesses of C++...
And they often don't listen and mess it up when they do, just like with static if. How is this a bad thing? If C++ gets better,I really don't think it's bad for D.
I am not sure it is a good thing either. At best, the original idea from D is pointed out in the C++ changelog, but that provides no visibility or credit at all for D's design, or for D's market more broadly.
 D cannot grow and develop its own identity if the main focus 
 is C/C++ compatibility.
Do you really think that's the case that it's the main focus?
I admit I have exaggerated; there is a lot of effort committed towards PR management, DIP processing, DMD optimization, newCTFE, and probably more. But Andrei told us about 18 months ago that he had a female student on a scholarship from the D Foundation who was working on D3, and we have had no news since that teaser. So, I would like to know whether this kind of project will happen :)
 Make no mistake, nobody will abandon his job in C++ among the 
 D community to persuade a employer to hire him for a D 
 full-time job
:) There are no jobs in D :) I'm pretty sure you are mistaken both on the supply and demand side.
I don't understand; what are you implying?
 And we are still hiring.
Nice, but for which tech? To be honest, 2 years ago I was hoping D would be ready to work with full-time, but to the best of my knowledge there are no D jobs in France, and it's a great frustration.
 I wish I haven't hurt anyone's feelings but D project lack a 
 bit of long-term vision.
I think right now the vision is clear enough as far as it needs to be articulated and the biggest constraint is that the D Foundation hasn't been in existence for long and it's quite a lot of work to create something from nothing and it takes time from beginning to start to see results though if one pays attention I don't think it's hard to see plenty of results already. Everything is an S curve - very flat in the beginning.
Agreed, but the beginning was in 2001 for D, you know... :x It's not like it was just last year. Anyway, thanks for your input, Laeeth Isharc :)
Nov 16 2018
next sibling parent =?UTF-8?Q?Ali_=c3=87ehreli?= <acehreli yahoo.com> writes:
On 11/16/2018 12:00 PM, Aurélien Plazzotta wrote:

 But my point is, I'm persuaded they believe much more in C++ than they
 believe in a future for D; the latter being rather a technological demo
 for something they would have hoped for at the beginning of their career.
I don't know any person in the D community that fits that description. To respond to an earlier comment of yours, I don't know any D person who is a member of the C++ standardization committee either. Ali
Nov 16 2018
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Nov 16, 2018 at 08:00:30PM +0000, Aurélien Plazzotta via Digitalmars-d
wrote:
 On Friday, 16 November 2018 at 07:20:25 UTC, Laeeth Isharc wrote:
[...]
 The creators and the main contributors of D are all C++ full-time
 developers some of them even members of C++ commities.
Traitors! We must hunt them down and expel them! Seriously,how can this be a bad thing? Some people are even members of non-native code communities also! Why wouldn't we want to have the benefit of the idea interchange that results ?
Please, don't turn me into a parody :/ I don't see them as traitors of course, I know they put a lot of effort into the D language and it's a wonderful thing for our community to have several semi-gods among us. But my point is, I'm persuaded they believe much more in C++ than they believe in a future for D; the latter being rather a technological demo for something they would have hope at the beginning of their career.
[...] You have been grossly misinformed. Walter works full-time on D, and Andrei *quit* his career at Facebook in order to work full-time on D. What else do you expect them to do to prove their commitment to D?? T -- "A man's wife has more power over him than the state has." -- Ralph Emerson
Nov 16 2018
parent reply NoMoreBugs <NoMoreBugs gmail.com> writes:
On Friday, 16 November 2018 at 21:40:06 UTC, H. S. Teoh wrote:
 You have been grossly misinformed.  Walter works full-time on 
 D, and Andrei *quit* his career at Facebook in order to work 
 full-time on D. What else do you expect them to do to prove 
 their commitment to D??
It's hard to be critical of Walter and Andrei, it really is... but.... I would suggest being more open to change, perhaps. My impression of D at the moment is that it just wants to remain stable, and that it has become too difficult to get change through the two gatekeepers (Walter and Andrei), and so people are not even bothering to try anymore. Of course my impression could be wrong, but it is what it is.

But being stable in an environment where change is now 'the norm' for programming languages could be the death knell for D. Every other language is on the move...and it looks like it will stay that way for the foreseeable future. little way to go ;-)

The future of programming languages is about safer, more correct code, where the programmer can clearly show their intent - and all by default. You'll have to make the effort to write less safe, more obtuse code - whereas these days, you have to make the effort to create safe, correct code! (e.g. private state within a module - nope. you have to re-architect just to get private state)

Anyway, as a developer (of anything), the no.1 rule is: if you don't listen to your users, then you won't have any users soon enough. I would like to quote Stroustrup (where he is speaking about the language he initially designed): "We have two ways of going forward..one is to find a better alternative..and two..is to transform the old crud into the new stuff.. that's where the future is." D has lots of 'crud' too. It needs transformation (one way or another).

Most importantly, remind Walter and Andrei that D needs to serve its users (not the other way around). Because there are alternatives to D.
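The complaint about private state within a module refers to D's module-level encapsulation: `private` restricts access to the enclosing module, not the enclosing type. A minimal sketch (both definitions deliberately in one module):

```d
module example;

struct Counter
{
    private int n; // 'private' means module-private in D, not type-private

    void bump() { ++n; }
}

void main()
{
    auto c = Counter();
    c.bump();

    // Compiles fine: main lives in the same module, so 'private' is no barrier.
    c.n = 42;
    assert(c.n == 42);
}
```

Getting type-private state therefore means moving `Counter` into its own module, which is the re-architecting the post complains about.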
Nov 16 2018
parent reply Laeeth Isharc <laeeth kaleidic.io> writes:
On Saturday, 17 November 2018 at 00:19:34 UTC, NoMoreBugs wrote:
 On Friday, 16 November 2018 at 21:40:06 UTC, H. S. Teoh wrote:
 You have been grossly misinformed.  Walter works full-time on 
 D, and Andrei *quit* his career at Facebook in order to work 
 full-time on D. What else do you expect them to do to prove 
 their commitment to D??
It's hard to be critical of Walter and Andrei, it really is... but.... I would suggest, being more open to change, perhaps. My impression of D at the moment, is that it just wants to remain stable, and that it has become too difficult to get change through the two gatekeepers (Walter and Andrei), and so people are not even bothering to try anymore. Of course my impression could be wrong, but it is what it is. But being stable in an environment where change is now 'the norm' for programming languages, could be the death nail for D. Every other language is on the move...and it looks like it will stay that way for the foreseeable future. little way to go ;-) The future of programming languages is about safer, more correct code, where the programmer can clearly show their intent - and all by default. You'll have to make the effort to write less safe, more obtuse code - whereas these days, you have to make the effort to create safe, correct code! (e.g. private state within a module - nope. you have to re-architect just to get private state) Anyway, as a developer (of anything), no.1 rule is: If you don't listen to your users, then you won't have any users soon enough. I would like to quote Stroustrup (where he is speaking about the language he initially designed): "We have two ways of going forward..one is to find a better alternative..and two..is to transform the old crud into the new stuff.. that's where the future is." D has lots of 'crud' too. It needs transformation (one way or another). Most importantly, is to remind Walter and Andrei, that D needs to serve it's users (not the other way around). Because there are alternatives to D.
Listening to users may often be a good idea. Politely doing what you believe to be best, whatever users may think, may well on occasion be an even better idea. Not that I see any hint that's how the language leadership think - that's just my personal opinion about the limitations of democracy in a popular sense, and the role of leadership in creative endeavours. It's quite often the case that users don't know what they will want. Was the introduction of attributes universally welcomed?

More importantly than that, I think there's an implicit unexamined assumption in what you wrote: that what people on the forum write (it's fascinating on these kinds of threads the number of accounts that pop up who I never recall having posted before, whatever that may mean) is in any way generally representative of committed users of D. It's a particular subset that post on the forums at all, and plenty of people active on the forums don't express themselves on these kinds of threads. A subset of a subset. I know a few people active in the development of the language and library who don't say much on the forums because they feel making pull requests is more productive. The larger commercial users of D that I have met personally (I am a commercial user myself) barely post at all, because they have work to do.

And to return to an old point. It's much better to focus on people that like what you are doing and are already using your product than those who say "if only you would do X, D would be huge". That's the nature of the innovator's dilemma. Also, if one is to be persuasive, it's helpful to remember that talk is cheap, whereas making a closely reasoned argument accompanied by skin in the game - now that is much more persuasive.
Nov 16 2018
next sibling parent reply NoMoreBugs <NoMoreBugs gmail.com> writes:
On Saturday, 17 November 2018 at 02:07:45 UTC, Laeeth Isharc 
wrote:
 And to return to an old point.  It's much better to focus on 
 people that like what you are doing and already using your 
 product than those who say "if only you would do X, D would be 
 huge".
Given D's very small user base, that's probably not the mindset you want the foundation to be in ;-)

That mindset will ensure your small user base never grows, and will likely get smaller and smaller. Even the C++ Committee cannot risk being in that mindset - not anymore, anyway.

To grow the user base, you need to listen and respond to *their* needs - ask any streamer ;-)
 That's the nature of the innovator's dilemma and also if one is 
 to be persuasive then it's helpful to remember that talk is 
 cheap, whereas making a closely reasoned argument accompanied 
 by skin in the game - now that is much more persuasive.
As long as I use D, I have 'skin in the game'.

When I stop using D, I'll have no interest in trying to voice my opinion about how it can better serve my needs. And you don't need to be sending pull requests to have skin in the game.

Also, you seem to be saying that D is more of an incubation language, for creative ideas? But it seems to want to strut itself on the world stage as a genuine competitor to industrial-strength languages. Can it be both? (I'd argue that it cannot, and that it needs to choose - soon.) This duality of purpose is just confusing - and in my opinion, is holding back the language.

A language with only 2 gatekeepers makes sense for a language that wants to be an incubation language for creative ideas - even more so when it's for the creative ideas of those 2 gatekeepers. But a language that wants to compete against industrial-strength languages had better listen to its users.

I know of no widely used industrial-strength language that has only 2 gatekeepers. Because how can only 2 people be representative of its vast number of users? It's not possible.
Nov 16 2018
parent reply Laeeth Isharc <laeeth kaleidic.io> writes:
On Saturday, 17 November 2018 at 04:21:50 UTC, NoMoreBugs wrote:
 On Saturday, 17 November 2018 at 02:07:45 UTC, Laeeth Isharc 
 wrote:
 And to return to an old point.  It's much better to focus on 
 people that like what you are doing and already using your 
 product than those who say "if only you would do X, D would be 
 huge".
 Given D's very small user base, that's probably not the mindset 
 you want the foundation to be in ;-)
I understand the words, but not the commercial logic. That's exactly the mindset I personally think the Foundation ought to have - just my opinion, though. Talk is cheap, and one succeeds by doing more of what's working, at least a little in my experience (whilst generating enough cheap experiments to have little successes to build on - something in this case more up to users than the Foundation). I definitely don't think it's a profitable strategy to listen to advice, well meaning or not, from people who aren't involved about what would be conceptually wonderful.
 That mindset, will ensure your small user base never grows, and 
 will likely get smaller and smaller.
Evidently, from your phrasing, you don't see yourself as part of the D community. That's an assertion, and we are all entitled to our opinions, but to be persuasive, reasoned arguments are often more effective. What you say is the opposite of my experience as well as of basic commercial common sense.
 Even the C++ Committee cannot risk being in that mindset - not 
 anymore anyway.
People who run big organisations are often quite clueless about the challenges involved in running small organisations. And the same is true of language communities. I don't really see that it's a matter of "even the C++ committee" - their context is just so utterly different.
 To grow the user base, you need to listen and respond to 
 *their* needs - ask any streamer ;-)
Yes - the needs of people who are using D in a serious way already, and prominent contributors to the community. People like Manu, for example, if we are speaking of individuals.
 As long as I use D, I have 'skin in the game'.

 When I stop using D, I'll have no interest in trying to voice 
 my opinion about how it can better serve my needs.
What projects are you working on? And how do your needs flow out of your context? How many lines of code have you written in D? What's your github handle? What's been your biggest success in D?
 Also, you seem to be saying that D is more of an incubation 
 language, for creative ideas?
Where did I suggest more of that and less of industrial strength? It's certainly good for creative projects because it's quite productive and because it's so plastic. Seems pretty good for running industrial-strength projects too. It's just perhaps not the first choice if you want pretty refactoring tools. Is there anything at all controversial or surprising about any of these observations?
 This duality of purpose, is just confusing - and in my opinion, 
 is holding back the language.
Duality of purpose? I think you mistake a snapshot in time for something planned or essential in some Platonic sense.
 A language with only 2 gatekeepers makes sense for a language 
 that wants to be an incubation language for creative ideas - 
 even more so when its for the creative ideas of those 2 
 gatekeepers.
Yes and that's why Apple never went anywhere until they told Jobs to stick to relationships and PR and let the grown ups on the committee design things. Or not. Or perhaps it wasn't that but the focus groups that were truly responsible for the success of Apple. A very small number of minds working closely together can be creative and design something beautiful, or have a chance to do so. A committee, notoriously, is a machine for suppressing creativity.
 I know of no widely used industrial strength language that has 
 only 2 gatekeepers. Because how can only 2 people be 
 representative of its vast number of users? It's not possible.
You're right. Python was such a promising language but it never stood a chance with not even two but just one BDFL. Now that they have moved to some kind of committee structure, I am sure that bodes well for the future flourishing of the language. Or is in fact the opposite true?
Nov 20 2018
parent reply NoMoreBugs <NoMoreBugs gmail.com> writes:
On Tuesday, 20 November 2018 at 23:50:56 UTC, Laeeth Isharc wrote:
 Evidently you don't see yourself as part of the D community 
 from your phrasing.  That's an assertion and we are all 
 entitled to our opinions but to be persuasive reasoned 
 arguments are often more effective. What you say is the 
 opposite of my experience as well as basic commercial common 
 sense.
I understand the psychological basis of that assertion and the reaction you want to get from those who read it (to dismiss/ignore me because I'm an outsider). But your logic and your assertion are misguided.

I don't see myself as a part of any 'language' community. This is where we seem to differ, a lot. A programming language for me is a tool to an end. It serves me. I do not serve it - or its community. It's just a tool - that's all it is. You don't build communities around a 'hammer' or a 'spanner'. It's not unreasonable that I give feedback on how that tool can better serve me.

We also seem to differ on what a 'contributor' is. To me, the focus is always on the user and their needs, not on the language and its needs. I think our views really differ here too.

I also believe the language designer(s) are ultimately responsible for the mistakes that programmers continually make (and yes, here I'm paraphrasing someone who is well known and well respected in the computer science field). I don't want a faulty hammer or spanner in my toolbox.
 A very small number of minds working closely together can be 
 creative and design something beautiful, or have a chance to do 
 so.  A committee, notoriously, is a machine for suppressing 
 creativity.
The proof of your argument needs evidence. D has had 10 years (since D2) of 'creativity time', and much longer than that in reality. Look at what the C++ committee has been able to accomplish in the same amount of time.

I don't object to creative endeavours. They're what makes life worthwhile. But after 18 years, is that what D (still) is? Or is it a serious tool that serious programmers should take seriously?

And which perspective is the foundation committed to?
Nov 20 2018
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 21 November 2018 at 01:09:30 UTC, NoMoreBugs wrote:
 On Tuesday, 20 November 2018 at 23:50:56 UTC, Laeeth Isharc 
 wrote:
 Evidently you don't see yourself as part of the D community 
 from your phrasing.  That's an assertion and we are all 
 entitled to our opinions but to be persuasive reasoned 
 arguments are often more effective. What you say is the 
 opposite of my experience as well as basic commercial common 
 sense.
 I understand the psychological basis of that assertion and the 
 reaction you want to get from those who read it (to 
 dismiss/ignore me because I'm an outsider). But your logic and 
 your assertion are misguided.

 I don't see myself as a part of any 'language' community. This 
 is where we seem to differ, a lot. A programming language for 
 me is a tool to an end. It serves me. I do not serve it - or 
 its community. It's just a tool - that's all it is. You don't 
 build communities around a 'hammer' or a 'spanner'. It's not 
 unreasonable that I give feedback on how that tool can better 
 serve me.
Yes, a programming language is a tool, not a religion. What has happened to D's "pragmatic approach"?

It is sad, but interesting at the same time (from a sociological point of view), to see how intelligent and creative people, experienced and outstanding engineers, who started out to create a good tool, have adopted a quasi-religious mindset, encouraging each other in their faith while taking offence at any criticism, calling critics trolls or alleging ulterior motives. Mind you, this doesn't only happen to newcomers, but also to people who have used and invested in D for years. It's a shame, because D has a lot of potential - things that other languages have only recently caught up on.

This is more to my liking:

Pragmatic language: https://www.youtube.com/watch?v=PsaFVLr8t4E&feature=youtu.be?t=116

Evolution of language: https://www.youtube.com/watch?v=PsaFVLr8t4E&feature=youtu.be?t=1651

(And please spare me the comments about "industry backed", "vested interest", "D is a community effort" - a lot of this has to do with the mindset, not with D being a community effort that isn't backed up by big industry.)
 We also seem to differ on what a 'contributor' is.

 To me, the focus is always on the user, and their needs, not on 
 the language and its needs. I think our views really differ 
 here too.
PRs are only used as an argument to smother any criticism. You don't contribute, so shut up! But see what happens to a lot of PRs.. [snip]
 D has had 10 years (since D2) of 'creativity time', and much 
 longer than that in reality.

 Look at what the C++ committee has been able to accomplish in 
 the same amount of time.

 I don't object to creative endeavors. It's what makes life 
 worthwhile.

 But after 18 years, is that what D (still) is?
Er, yeah, I tend to ask myself the same question.
 Or is it a serious tool that serious programmers should take 
 seriously.

 And which perspective is the foundation committed too?
Good question. But I think we know the answer.
Nov 21 2018
parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 21 November 2018 at 11:17:41 UTC, Chris wrote:
 On Wednesday, 21 November 2018 at 01:09:30 UTC, NoMoreBugs 
 wrote:
 [...]
 Yes, a programming language is a tool, not a religion. What 
 has happened to D's "pragmatic approach"?

 It is sad, but interesting at the same time (from a 
 sociological point of view), to see how intelligent and 
 creative people, experienced and outstanding engineers, who 
 started to create a good tool, have adopted a quasi-religious 
 mindset, encouraging each other in their faith while taking 
 offence at any criticism, calling critics trolls or alleging 
 ulterior motives. Mind you, this doesn't only happen to 
 newcomers, but also to people who have used and invested in D 
 for years. It's a shame, because D has a lot of potential, 
 things that other languages have only recently caught up on.

 This is more to my liking:

 Pragmatic language: 
 https://www.youtube.com/watch?v=PsaFVLr8t4E&feature=youtu.be?t=116

 Evolution of language: 
 https://www.youtube.com/watch?v=PsaFVLr8t4E&feature=youtu.be?t=1651

 (And please spare me the comments about "industry backed", 
 "vested interest", "D is a community effort" - a lot of things 
 have to do with the mindset, not with D being a community 
 effort that isn't backed up by big industry.)
What did you think about this bit? "There's one thing that we don't really have and I don't really want it in the language: it's meta-programming... instead we had a very good experience doing compiler plugins." https://www.youtube.com/watch?v=PsaFVLr8t4E?t=2126 Also, no "first-class immutability."
Nov 21 2018
parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 21 November 2018 at 13:26:34 UTC, Joakim wrote:

 What did you think about this bit?

 "There's one thing that we don't really have and I don't really 
 want it in the language: it's meta-programming... instead we 
 had a very good experience doing compiler plugins."
 https://www.youtube.com/watch?v=PsaFVLr8t4E?t=2126

 Also, no "first-class immutability."
I watched the whole keynote. Well, to begin with, it's still a very young language (not 18+ years old) and keeps getting better and better. Things that were a bit tricky just recently are now much easier and part and parcel of the language. It shows that they listen to their user base and make things as easy as possible. In many ways it's already miles ahead of D in terms of what you need as a programmer to get things done fast, e.g. tooling, interop, multi-platform, handling of deprecations etc. There are useful features (which I already knew from D) that make life easier (e.g. lambdas).

And as for meta-programming (I knew this would come up ;)), I don't really miss it / use it anymore. There was only one case where I said it would have been nice, but it wasn't _really_ necessary (it was just an old D habit, really). In fact, meta-programming in D can cause a lot of unnecessary headaches (cryptic compiler warnings galore, code breakage) and stall the whole development process unnecessarily - and often for very little extra value. It says a lot that Adam D. Ruppe stated that if you don't want your code to break, you should use non-idiomatic D. So what's the point of it then? It's just absurd.

D could have been a real contender here (e.g. C interop), but instead of investing in good infrastructure / tooling and cleaning up and stabilizing the language, the community has turned D into a "feature laboratory" where ideas are discussed to death and really important issues are swept under the rug. Other new languages focus on tooling and interop from the very beginning, as they realize that this is very important these days - more so than fancy features (which can be added later).

Then, of course, the inevitable "X doesn't have feature Y, but D does! That's why X sucks." Only: are all these super features indispensable for production? Why hasn't D got backing from big players yet? Because of the community's elitist and parochial mindset and the overall lack of consistency.
Joakim, you have done some great work as regards Android / iOS, and I believe we are on the same page here. But the D Foundation didn't pick up on it and say "Let's take this and create some sound tooling for ARM cross-compilation." If it's not about fancy super-sophisticated allocators, contributors are completely on their own. This is no way to treat people who make an effort.
Nov 21 2018
next sibling parent reply Laeeth Isharc <laeeth kaleidic.io> writes:
On Wednesday, 21 November 2018 at 14:38:07 UTC, Chris wrote:
 
 I watched the whole keynote. Well, to begin with it's still a 
 very young language (not 18+ years old) and keeps getting 
 better and better. Things that were a bit tricky just recently 
 are now much easier and part and parcel of the language. It 
 shows that they listen to their user base and make things as 
 easy as possible. In many ways it's already miles ahead of D in 
 terms of what you need as a programmer to get things done fast, 
 e.g. tooling, interop, multi-platform, handling of deprecations 
 etc. There are useful features (I already knew from D) that 
 make life easier (e.g. lambdas).
Life isn't a zero-sum competition. It's good for D if other emerging languages flourish. It's bad for D if people worry about what others might be doing rather than thinking about how to make D the best language it can be.
 And as for meta-programming (I knew this would come up ;)), I 
 don't really miss it / use it anymore. There was only one case 
 where I said it would have been nice, but it wasn't _really_ 
 necessary (it was just an old D habit, really). In fact, 
 meta-programming in D can cause a lot of unnecessary headaches 
 (cryptic compiler warnings galore, code breakage) and stall the 
 whole development process unnecessarily - and often for very 
 little extra value. It says a lot that Adam D. Ruppe stated 
 that if you don't want your code to break, use non-idiomatic D. 
 So what's the point of it then? It's just absurd.
Maybe D isn't the right language for you if you feel it doesn't bring anything special. That's okay - it's a big wide world with plenty of room for different ways to be successful.

Everything has a cost to it. Sometimes it's not worth the benefit. But the dollar value of those things very much depends on your context and what you are trying to achieve. One doesn't need to use metaprogramming if one doesn't want to, after all. My own experience is that it has been quite useful.
 D could have been a real contender here (e.g. C interop) but 
 instead of investing in a good infrastructure / tooling, 
 cleaning up and stabilizing the language,
You know, Mrs Thatcher once said there is no such thing as society, only individual men and women and families. That's an overstatement - it's not one or the other, but society doesn't act as some mysterious social organism. And it's the same with community.

I've been investing a bit in tooling. Dpp is pretty convenient for using C libraries, and dstep also has improved by leaps and bounds. Dpp has already saved me an incredible amount of time personally, and it's made it possible to explore more options. It's Atila's personal project that I sponsor in a modest way.

I open sourced a wrapper to create Excel workbooks in D. Also a D wrapper for reliable UDP - UDT. It sped up file transfers between London and HK by between 100x and 350x. Then there is autowrap, which was developed for in-house use but I open sourced. It now couldn't be easier to write D libraries for Python, or Excel addins, and more is coming quite soon (it's there, but the PR is still pending).

Manu has been working on including the STL in D runtime. And by the way, I think dpp isn't so very far from being able to #include vector, and more of C++ will follow in time. So no idea when, but in time there's a decent chance you can #include C++ libraries and find it just works in many cases, and in others it's minimal work.

Based on a suggestion from Dmitry Olshansky I was playing around with GraalVM. Polyglot.h just works with dpp. It takes some work to write a wrapper, but you can easily compile D to LLVM bitcode and have it run on the JVM, where the tracing JIT will do its magic, and you can quite easily talk to JS, Java, Python and R via polyglot. If I make any progress with the wrapper I will open source it.

I give a little money to the D Foundation, and it will increase in time. Symmetry sponsored the first Symmetry Autumn of Code with a couple of quite ambitious projects - fork GC and HTTP/2.

You're part of the D community, I think. What have you been contributing?
I presume something, because otherwise why so frustrated that other people are not doing what you would like them to?

 the community has turned D into a "feature laboratory" where 
 ideas are discussed to death and really important issues are 
 swept under the rug.
People say a lot of things on forums. I think that's why lots of people don't bother and spend more time discussing pull requests. Talk is cheap. Personally, I've not been unhappy with the direction things have moved in over the past few years. Do you really not recall what things were like in 2014?
 Other new languages focus on tooling and interop from the very 
 beginning as they realize that this is very important these 
 days, more so than fancy features (that can be added later).
It's based around voluntary cooperation. How do you propose to make people work on things they personally aren't interested in doing?

If IDEs are that important to you, it might not be the right choice - I don't know. I have the impression things keep getting better there, but I personally don't use an IDE so I wouldn't know. People I work with don't seem miserable about VS support; it's not one of the things I have heard anyone grumble about. If they are important to you, what steps did you personally take to move the world just a little in the direction you wish it to go? I'm presuming you did something, otherwise it's funny to grumble.

What more would you reasonably like others (for example me) to do on interop? (The big thing is the improvement in C++ interop, especially mangling.) It also does not seem to me like there's a dramatic amount to do - C++ will get there in time. C is easy, and the reverse direction will follow. JNI isn't hard, and polyglot is even easier and maybe will be made as easy as anything can be in time. Honestly, what's missing? Node.js?! Julia? (At least Julia.h just works with dpp.)
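As a concrete illustration of the dpp workflow mentioned above: the sketch below is mine, not from this thread. The choice of zlib, the file name, and the build commands are assumptions, and details may differ across dpp versions.

```d
// zlib_demo.dpp - a minimal sketch of dpp's "#include a C header" idea.
// Assumes the dpp tool (Atila Neves' project mentioned above) and the
// zlib development headers are installed; neither is guaranteed by
// anything in this thread.
#include <zlib.h>

void main()
{
    import std.stdio : writeln;
    import std.string : fromStringz;
    // zlibVersion() is the C function declared in zlib.h; dpp makes it
    // callable without any hand-written binding.
    writeln("zlib version: ", fromStringz(zlibVersion()));
}
```

Built with something like `d++ zlib_demo.dpp` (or `dub run dpp -- zlib_demo.dpp`), dpp first translates the C declarations into D and then hands the result to the D compiler.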
 Then, of course, the inevitable "X doesn't have feature Y, but 
 D does! That's why X sucks." Only: are all these super features 
 indispensable for production?
I am surprised you think the benefit of D is a matter of ticking off features. I was just explaining to a colleague, who was intrigued enough to take Andrei's book home, that truly D brings nothing new if you look at it from a tick-box point of view. Don't even bother trying if you're going to think of it that way.

But actually D does bring something quite unique, and whilst it may not exactly be Christopher Alexander's "quality without a name", it does relate to that. In his paper on harmony-seeking computations (which is what characterizes life, in contrast with the deadness of mere emergence) he discusses a little experiment in which he shows the learnt inability of his Radcliffe students to notice patterns that really did exist but required a gestalt conception and couldn't be appreciated from considering the parts.

So it's my personal view that if D brings something valuable, you really shouldn't expect it to be a matter of this feature or that. It's what it's like working on a project over time in D, and what it's like brings pretty valuable commercial benefits. I speak of my own experience, but what I have heard from Liran at Weka is consonant with that.

Plasticity is something pretty valuable if you are in a business where requirements intrinsically evolve and can in any case never be sufficiently fully understood upfront. Readability and expressiveness too. That the deeper conceptual understanding of what plasticity is has yet to be worked out does not mean it's not something very real and important. I think it has to do with certain ideas from Christopher Alexander, Goethe and others, but I'm still figuring that out.
 why hasn't D got backing from big players yet?
Because it's solving different problems from the ones they have. Since, at least in the US, more than 100% of net jobs are created by small and medium-sized businesses, and since big tech have plenty of choices that solve their problems, my personal view is that it's a wonderful thing that D isn't backed by big tech. I'm not even talking about the potential impact on funding if the bubble ends and we see a rotation of capital into other sectors, something that could weigh on funding generally.

 Because of the community's elitist and parochial mindset and 
 the overall lack of consistency.
What did you personally do to make things less parochial and more consistent? And by the way, parochial??

My own experience hiring D programmers from the community (two of them I met at the first meetup I went to) has been that one of the biggest benefits has been an opening up of the group to different ideas. There are so many more varied and different discussions today than before. That's a consequence of hiring people from the D community.

If by "the community" you meant people who post on the forum, then you are making an identification error. They are by no means the same thing.
 Joakim, you have done some great work as regards Android / iOS
Indeed.
 and I believe we are on the same page here. But see that the D 
 Foundation didn't pick up on it and say "Let's take this and 
 create some sound tooling for ARM cross-compilation."
You remind me of conversations with a senior guy from an investment bank, with me being on the hedge fund side. "You should do this or do that." Well, maybe, but how do you think I ought to make it happen in the context I am in? It's a small company with only 140-odd people. How is one to execute? Because a grand strategic plan by itself won't change a thing.

You seem to be assuming the D Foundation has not just a gazillion dollars but a big staff to go with it. This isn't the case, I think. It wasn't set up long ago and it's just beginning to get going. It's awfully hard creating something from nothing (and it takes ages before you see results, since it's an S-curve), though of course it's easy to give helpful advice. But talk is cheap and it really doesn't change much.

If you would like to change the world, you know it's really up to you. It's pretty easy to make an impact. But to my eyes it's reminiscent of the behaviour of a spoilt child to insist others do things unless one is also doing what one can, given one's situation, to help. Are you?

And by the way, my personal experience is that the small contributions I have made, or that have been made commercially, have been some of the highest-return investments for me of the past decade. Yet I was never really thinking too much about the ROI, just making sure that if I spent a modest amount of someone else's money I was acting as a good steward of it. It's a stunning arbitrage to contribute to the community from where I sit. That won't be true of everyone, because contexts differ. But it might be true of more people than realise it.
Nov 21 2018
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 22 November 2018 at 04:07:32 UTC, Laeeth Isharc 
wrote:

[snip]

So I see, for you D is some sort of esoteric group therapy.

 What did you personally do to make things less parochial and 
 more consistent?
Yawn. Same strategy again: "Contribute or shut up!". Groundhog Day.
 It's pretty easy to make an impact.
It's actually not. Not in D.
 But to my eyes it's reminiscent of the behaviour of a spoilt 
 child to insist others do
 things unless they are also doing what they can given their 
 situation to help.
Again, condescension. Using a posh word ("reminiscent") and asking a cowardly question at the end ("Are you?") to make it look as if it were not a personal attack! Jesus!

To be honest, I've used D for years and promoted it among people I know. I've written loads of software in D that is being used by others. I think that could also be seen as a contribution. And if I mention / request certain things that are common in other languages, or I see that certain things are being neglected, of course I am _the ungrateful child_, or I don't understand how the universe works, or both.

My sarcasm is a reaction to the fact that I (and others) usually get no reaction regarding the points we make - only derision, nitpicking and a sermon about the general philosophy of D that only the high priests really understand.
 You seem to be assuming the D Foundation has not just a 
 gazillion dollars but a big staff to
 go with it.  This isn't the case, I think.  It wasn't set up 
 long ago and it's just beginning to start to get going.  It's 
 awfully hard creating something from nothing (and it takes ages 
 before you see results since it's an S curve) though of course 
 its easy to give helpful advice.
The thing is that - as you mentioned yourself - there's loads of stuff out there already: Dpp, DStep, Joakim's ARM work, Polyglot.h etc. Is it so unreasonable to expect the D Foundation to focus on collecting all the brilliant work that's been done by volunteers so far (fair play to them!) and packaging it as a nice product that can be extended as needed? That'd be a killer package and a good selling point for D. It's only common sense.

The D Foundation initially gave me hope that something like that would happen, and that you could then get medium to big players on board (what's a million dollars to IBM or the like?). However, it got worse than before. While the development of D used to be a bit chaotic, it has now become an autocratic chaos. The focus has shifted from improving D as a language (including tooling, packaging etc.) to obsessing over the pet feature of the day. It's more of a hobby project now than ever before. And mind you, the D Foundation is young, but D is 18+ years old.

But hey, that's all right. If that's what the D leadership wants, fine. But stop pretending that D is a useful language people should use in the real world.
Nov 22 2018
next sibling parent reply Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Thursday, 22 November 2018 at 10:08:06 UTC, Chris wrote:
 The thing is that - as you mentioned yourself - there's loads 
 of stuff out there already: Dpp, DStep, Joakims ARM stuff, 
 Polyglot.h etc. Is it so unreasonable to expect the D 
 Foundation to focus on collecting all the brilliant work that's 
 been done by volunteers so far (fair play to them!) and package 
 it as a nice product that can be extended as needed? That'd be 
 a killer package and a good selling point for D. It's only 
 common sense.

 The D Foundation initially gave me hope that something like 
 that would happen, and then you could get medium to big players 
 on board (what's a million dollars to IBM or the like?) 
 However, it got worse than before. While the development of D 
 used to be a bit chaotic it has now become an autocratic chaos. 
 The focus has shifted from improving D as a language (including 
 tooling, packaging etc) to obsessing over the pet feature of 
 the day. It's more of a hobby project now than ever before. And 
 mind you, the D Foundation is young, but D is 18+ years old.
The lack of vision is problematic indeed, and I am doing something about it. The foundation is due to meet with industry users in a few weeks to resolve issues and set a clear direction, which will cover many of those points (hopefully a better replacement for those vision documents that are long overdue). Watch this space...
Nov 22 2018
parent Chris <wendlec tcd.ie> writes:
On Thursday, 22 November 2018 at 11:26:03 UTC, Nicholas Wilson 
wrote:
 On Thursday, 22 November 2018 at 10:08:06 UTC, Chris wrote:
 [...]
 The lack of vision is problematic indeed, and I am doing 
 something about it. The foundation is due to meet with 
 industry users in a few weeks to resolve issues and set a 
 clear direction, which will cover many of those points 
 (hopefully a better replacement for those vision documents 
 that are long overdue).
Finally!
 Watch this space...
I will.
Nov 22 2018
prev sibling parent welkam <wwwelkam gmail.com> writes:
On Thursday, 22 November 2018 at 10:08:06 UTC, Chris wrote:

 Yawn. Same strategy again: "Contribute or shut up!". Groundhog 
 Day.
I have been on this forum/mailing list long enough to have seen the same thing happen over and over again. Just like you say - Groundhog Day - but I see a different thing. I see people who don't understand how few resources D has available, and who think that things are the way they are not because of a lack of resources but because people are stupid. They come to the forum/mailing list with the wrong preconceptions, thinking: things are bad because of stupid people, and if I enlighten those stupid people things will get better. So they make a post. Then people from the community try to explain to them that it is the lack of resources that is responsible, and that if you want things to change you should either contribute your own work or donate money. Sadly this kind of response is a minority of the posts in any thread, and most people get carried away by secondary topics. People don't like this response, get frustrated, and leave the forum after a few more posts of their own, but a few begin to understand that D doesn't live up to its potential because people would rather use D than contribute to the project, and when there is a problem they just complain instead of working on solving it. Now here things get funny - well, at least to me. In essence they are complaining that other people act just like them, without realizing that they are complaining about themselves. The D project has a huge surface area - from low level to web development, from game consoles to mobile. People don't want things to merely run; they want the whole experience to be excellent. That's not possible with just $1605 per month; we need more than 10x that. "D foundation expenses for H1 2017 have averaged at $1605 per month."
Nov 24 2018
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 22 November 2018 at 04:07:32 UTC, Laeeth Isharc 
wrote:

[snip]

By the way, you keep mentioning that you use D for your own 
internal stuff, and as far as I can see a lot of companies that 
use D do the same. They have their own in-house ecosystem, and 
that's fine. Of course, for this kind of usage D might be OK 
(apart from the facepalm flaws it still has) - or any language 
for that matter.

However, a lot of IT companies (small, medium and big) also have 
to adapt to the market in terms of third party products like 
Android and iOS and other technologies (including those that do 
not yet exist). Once that's the case, D is one of the worst 
choices possible. Everything takes years, anything that is not 
directly related to (some specific features of) the language is 
treated as lowest priority, and a small or medium sized company 
may not be able "to roll its own" all the time, especially if 
everything is still raw and has a lot of loose ends as in the 
case of D. And you know what, customers don't care about your 
problems. They simply go somewhere else, if you can't deliver. So 
what you need is a language that provides adaptability out of the 
box. This is why a lot of new languages invest in exactly this 
and try to open as many "ports" as possible, to make the language 
as useful as possible. Only in this way will they be adopted by 
users. Your "cosmos" is D and your own company. But most other 
people have to cater for third party software as well and adapt 
to an ever changing market. I think this is a fact you're not 
really aware of. You can talk all you like about the cosmos and 
the universe, but all you see are your own needs for which D 
might be fine. But reality is different for other people.
Nov 22 2018
parent reply Laeeth Isharc <laeeth kaleidic.io> writes:
On Thursday, 22 November 2018 at 11:14:54 UTC, Chris wrote:
 On Thursday, 22 November 2018 at 04:07:32 UTC, Laeeth Isharc 
 wrote:

 [snip]

 By the way, you keep mentioning that you use D for your own 
 internal stuff, and as far as I can see a lot of companies that 
 use D do the same. They have their own in-house ecosystem, and 
 that's fine. Of course, for this kind of usage D might be OK 
 (apart from the facepalm flaws it still has) - or any language 
 for that matter.

 However, a lot of IT companies (small, medium and big) also 
 have to adapt to the market in terms of third party products 
 like Android and iOS and other technologies (including those 
 that do not yet exist). Once that's the case, D is one of the 
 worst choices possible.
D doesn't have the best GUI libraries, that's quite true. JNI is not that bad, so if you wanted everything but the front end in D, would it be so tough? You might be right about the situation on iOS, as I haven't heard of people doing much there in D - last I heard the hard work was done except for the last stage, something relating to bitcode if I recall right.
 Everything takes years
True. Why does it bother you? Did somebody suggest it would be otherwise? If one puts some energy behind making something happen it's not that bad, I think. E.g. autowrap for JNI shouldn't take more than a couple of months of work to be mostly useful, I think.
 anything that is not directly related to (some specific 
 features of) the language is treated as lowest priority
Who is treating, and what is this word priority I keep hearing? It's open source, ordered anarchy. It's hard enough persuading a creative person that you pay to do something he doesn't want to do. How do you suppose it is that Walter and Andrei can assign priorities more to your liking and make people work on what you would wish? They cannot. I think you're applying rules from your work domain to an environment where they don't apply. That's simply not how volunteer open source works. If there's a single corporate backer of course it's different, if by solving their problems they solve yours too, or if you can influence them. That's intrinsically not the kind of community D is, so if you want to achieve a generative result here then your means do not suit your ends.

 and a small or medium sized company may not be able "to roll its own" all the time,
Indeed, not unless they have fairly talented and creative programmers and the management is technical. Most companies probably shouldn't introduce D in a top-down way. They're just not cut out for it. Elsewhere I called D a secret weapon that is only available to the few (because you have to have the culture, capabilities and will to be willing to trade off less ease today for more simplicity tomorrow). That's okay, isn't it? It's a big wide world and I think there won't be fewer people writing native code in a decade but rather more. You don't need to appeal to everyone if you are the challenger, and it is pointless to do so.
 especially if everything is still raw and has a lot of loose 
 ends as in the case of D. And you know what, customers don't 
 care about your problems. They simply go somewhere else, if you 
 can't deliver.
Yes. For example, if you have a storage system that is not performant or does not add features sufficiently quickly, you will lose customers. It's tough having customers, but I think it's a matter of picking your poison after deciding what's important to you. If you don't want to use D, I wouldn't dream of trying to change your mind. Maybe it's not for you, or not for you right now. But really one must consider one's ends and pick means to serve those. The means of complaining on the forum doesn't actually accomplish anything. If there's something you want in D that you would use, then probably the most effective route will be to make a small beginning yourself and try to persuade others to be involved. Or you could just pay for it yourself, or try to persuade others to club together to pay for it.
 So what you need is a language that provides adaptability out 
 of the box.
D notably is adaptable. It just doesn't come ready adapted.
 this is why a lot of new languages invest in exactly this
Where do you think the money will come from to invest, exactly? Grumble enough on the forum until Google decides to sponsor the language and invest in the things you want? I don't think it works like that. Being an entrepreneur is quite difficult. How does one create something from nothing? In my experience, when you are ready for it you figure out where you want to go, don't worry too much if your goal seems audacious or unrealistic, and make a humble little start and persist, correcting course along the way as new information arrives. I think that's the way in general to push the world in a direction you think would be a good one. An honest, thought-through proposal by far beats complaining. And working code beats a proposal.
 Only in this way will they be adopted by users.
That's not how it works for a language like D. Read The Innovator's Dilemma by Christensen as well as Thiel's Zero to One.

 Your "cosmos" is D and your own company. But most other people have to cater for third 
 party software as well and adapt to an ever changing market.
I don't have time to look but I'm curious. My guess would be most software is written within the enterprise for internal use and then add software as a service and I should think that's a very significant chunk of the total market. So maybe D isn't right for some users or not sufficiently mature in ecosystem and tooling right now. Okay. Do you find that especially surprising?
 I think this is a fact you're not really aware of. You can talk 
 all you like about the cosmos and the universe, but all you see 
 are your own needs for which D might be fine. But reality is 
 different for other people.
My impression is the opposite. I don't doubt that D isn't right for many, but you seem to be suggesting that what you feel is true for you must inevitably be true for the whole world. I think that's contradicted by observed experience. And as far as my world goes, finance is quite a decent employer of programmers and market for technology. If you're the challenger you really don't need to appeal to everyone to grow. You just need to increase awareness and appeal a bit to people who are already poised on the edge. It's better to not be taken seriously if you are the challenger and that camouflage may well be a matter of survival. You do not want the dominant players to take you seriously and try to imitate your advantages or to fight dirty until you are more than ready for it. Not that it's calculated, but it's how natural waves of adoption unfold.
Nov 22 2018
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 22, 2018 at 10:04:36PM +0000, Laeeth Isharc via Digitalmars-d wrote:
 On Thursday, 22 November 2018 at 11:14:54 UTC, Chris wrote:
[...]
 However, a lot of IT companies (small, medium and big) also have to
 adapt to the market in terms of third party products like Android
 and iOS and other technologies (including those that do not yet
 exist). Once that's the case, D is one of the worst choices
 possible.
D doesn't have the best GUI libraries, that's quite true. JNI is not that bad so if you wanted everything but the front end in D would it be so tough? You might be right about the situation on ios as I haven't heard of people doing much there in D - last I heard the hard work was done but the last stage - something relating to bitcode if I recall right.
[...] Joakim has done an amazing job making LDC cross-compile to Android. Currently I'm working on an Android project that has Java talking to D code via JNI. It works fairly well, though with some amount of boilerplate that at some point I'm expecting to factor out into a template that auto-generates JNI wrappers. So far I have the majority of my code in D, so the need for JNI wrappers has been pretty minimal, but at some point I'll probably be wanting more convenient ways of interacting with Java, and D's metaprogramming power will be brought to bear. T -- Let X be the set not defined by this sentence...
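The wrapper generation Teoh anticipates can be sketched with D's compile-time facilities. This is only a hypothetical illustration: the `Java_Demo_` naming, `jniExport`, and the example functions are made up, and real JNI glue would also thread a `JNIEnv*` and map Java types properly.

```d
// Hypothetical sketch: auto-generate C-linkage forwarders with D
// metaprogramming - the shape of a JNI glue generator, minus the
// actual JNIEnv plumbing.
import std.traits : Parameters, ReturnType;

int add(int a, int b) { return a + b; }
int twice(int x) { return x * 2; }

// For each wrapped function, emit an extern(C) forwarder whose name
// follows the JNI Java_<class>_<method> convention.
mixin template jniExport(alias fn, string name)
{
    mixin("extern(C) ReturnType!fn Java_Demo_" ~ name ~
          "(Parameters!fn args) { return fn(args); }");
}

mixin jniExport!(add, "add");
mixin jniExport!(twice, "twice");

void main()
{
    // The generated forwarders are callable like ordinary functions.
    assert(Java_Demo_add(2, 3) == 5);
    assert(Java_Demo_twice(4) == 8);
}
```

Because `extern(C)` functions keep their plain name in the object file, the generated symbols are exactly what a Java VM would look up - which is why this pattern keeps the hand-written boilerplate small.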
Nov 22 2018
prev sibling parent Chris <wendlec tcd.ie> writes:
On Thursday, 22 November 2018 at 22:04:36 UTC, Laeeth Isharc 
wrote:
 On Thursday, 22 November 2018 at 11:14:54 UTC, Chris wrote:
 On Thursday, 22 November 2018 at 04:07:32 UTC, Laeeth Isharc 
 wrote:

 [snip]

 By the way, you keep mentioning that you use D for your own 
 internal stuff, and as far as I can see a lot of companies 
 that use D do the same. They have their own in-house 
 ecosystem, and that's fine. Of course, for this kind of usage 
 D might be OK (apart from the facepalm flaws it still has) - 
 or any language for that matter.

 However, a lot of IT companies (small, medium and big) also 
 have to adapt to the market in terms of third party products 
 like Android and iOS and other technologies (including those 
 that do not yet exist). Once that's the case, D is one of the 
 worst choices possible.
D doesn't have the best GUI libraries, that's quite true. JNI is not that bad so if you wanted everything but the front end in D would it be so tough? You might be right about the situation on ios as I haven't heard of people doing much there in D - last I heard the hard work was done but the last stage - something relating to bitcode if I recall right.
 Everything takes years
True. Why does it bother you? Did somebody suggest it would be otherwise? If one puts some energy behind making something happen it's not that bad, I think. Eg autowrap for JNI shouldn't take more than a couple of months of work to be mostly useful I think.
 anything that is not directly related to (some specific 
 features of) the language is treated as lowest priority
Who is treating, and what is this word priority I keep hearing. It's open source, ordered anarchy. It's hard enough persuading a creative person that you pay to do something he doesn't want to do. How do you suppose it is that Walter and Andrei can assign priorities more to your liking and make people work on what you would wish? They cannot.
It's not what _I_ wish. It's common sense. Sound string handling and tooling.
 I think you're applying rules from your work domain to an 
 environment where they don't apply.   That's simply not how 
 volunteer open source works.
This is no longer true of D. 18+ years and a D Foundation. The "hobby project / open source community" argument is no longer valid. You cannot aspire to be a big player on the one hand and then, on the other hand, when it gets a bit rough, say "Ah, we're just a small community". It's one or the other. If I were an investor I'd say "No!" after listening to you. You need to be realistic. While everything has a spiritual dimension, one has to be practical too.
Nov 23 2018
prev sibling parent reply Joakim <dlang joakim.fea.st> writes:
On Wednesday, 21 November 2018 at 14:38:07 UTC, Chris wrote:
 On Wednesday, 21 November 2018 at 13:26:34 UTC, Joakim wrote:

 What did you think about this bit?

 "There's one thing that we don't really have and I don't 
 really want it in the language: it's meta-programming... 
 instead we had a very good experience doing compiler plugins."
 https://www.youtube.com/watch?v=PsaFVLr8t4E?t=2126

 Also, no "first-class immutability."
I watched the whole keynote. Well, to begin with it's still a very young language (not 18+ years old) and keeps getting better and better. Things that were a bit tricky just recently are now much easier and part and parcel of the language. It shows that they listen to their user base and make things as easy as possible. In many ways it's already miles ahead of D in terms of what you need as a programmer to get things done fast, e.g. tooling, interop, multi-platform, handling of deprecations etc. There are useful features (I already knew from D) that make life easier (e.g. lambdas).
You would expect that from a language coming out of a tools provider. I thought it was interesting how he mentioned that one of the main reasons he didn't want meta-programming in Kotlin is because it's extremely hard for an IDE like theirs to deal with. That's potentially one of the big advantages of D's string-based mixin approach: the IDE can tell the compiler to dump them to files and then just treat them as regular D source, though I don't know that anybody is doing that yet.
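The dump-to-file idea works because a string mixin is just ordinary D source held in a string. A minimal sketch (the `property` helper here is hypothetical, not a library function):

```d
// Minimal sketch of a string mixin: the code for a getter/setter
// pair is assembled as an ordinary string at compile time, so a
// compiler switch could write it out as plain D source for an IDE.
string property(string type, string name)
{
    return type ~ " _" ~ name ~ ";\n"
         ~ type ~ " " ~ name ~ "() const { return _" ~ name ~ "; }\n"
         ~ "void " ~ name ~ "(" ~ type ~ " v) { _" ~ name ~ " = v; }\n";
}

struct Point
{
    // Each mixin pastes the generated declarations into the struct.
    mixin(property("int", "x"));
    mixin(property("int", "y"));
}

void main()
{
    Point p;
    p.x = 3; // calls the generated setter
    p.y = 4;
    assert(p.x == 3 && p.y == 4); // calls the generated getters
}
```

Since the generated text never contains anything but regular D, a tool that captures it needs no special understanding of the metaprogram that produced it.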
 And as for meta-programming (I knew this would come up ;)), I 
 don't really miss it / use it anymore. There was only one case 
 where I said it would have been nice, but it wasn't _really_ 
 necessary (it was just an old D habit, really). In fact, 
 meta-programming in D can cause a lot of unnecessary headaches 
 (cryptic compiler warnings galore, code breakage) and stall the 
 whole development process unnecessarily - and often for very 
 little extra value. It says a lot that Adam D. Ruppe stated 
 that if you don't want your code to break, use non-idiomatic D. 
 So what's the point of it then? It's just absurd.
It's probably D's biggest differentiator, the one feature that people always say blows their mind when they use D, both from its power and relative ease of use. Of course, it's not for everyone, just like lisp macros, so I asked you what you thought, without saying "Kotlin sucks" or whatever.
 D could have been a real contender here (e.g. C interop) but 
 instead of investing in a good infrastructure / tooling, 
 cleaning up and stabilizing the language, the community has 
 turned D into a "feature laboratory" where ideas are discussed 
 to death and really important issues are swept under the rug. 
 Other new languages focus on tooling and interop from the very 
 beginning as they realize that this is very important these 
 days, more so than fancy features (that can be added later).
This is a common theme in open source: OSS devs are often much more interested in trying their hand at writing some whizbang tech, which they may not get to do at their day job, than fixing bugs and taking out the trash. Another is that they simply write a clone of some expensive proprietary software that only has some strict subset of its features, say Linus writing the linux kernel because he wanted to run UNIX on his home desktop. Neither are conducive to production, stable environments: it took many companies coming in and polishing up the linux kernel or LLVM before they became core components at many companies today. The same will have to be done for D.
 Then, of course, the inevitable "X doesn't have feature Y, but 
 D does! That's why X sucks." Only: are all these super features 
 indispensable for production? Why hasn't D got backing from big 
 players yet? Because of the community's elitist and parochial 
 mindset and the overall lack of consistency.
Why hasn't ruby/rails, Rust, or Nim gotten backing from big players yet? Most things don't get backing from big players, especially initially. What you hope is to create a superior tool that helps small companies grow into the big players someday.
 Joakim, you have done some great work as regards Android / iOS 
 and I believe we are on the same page here. But see that the D 
 Foundation didn't pick up on it and say "Let's take this and 
 create some sound tooling for ARM cross-compilation." If it's 
 not about fancy super-sophisticated allocators, contributors 
 are completely on their own. This is no way to treat people who 
 make an effort.
I agree that D has had perhaps too much of an emphasis on breaking new ground rather than the boring nuts-and-bolts of adding new platforms and bug-fixing. Now that multiple multi-million-valued and several small companies are built on D, let's hope they get together and make a push with the community in that direction, as Nicholas indicates. In the meantime, you are a commercial user of D, one who has always been _extremely_ positive about the language over the years in these forums. What changed your mind recently? I'm asking for specifics: bugs not fixed or concrete incidents that indicate the wrong mindset. We can all wave our hands in the air over perceived generalities, but nothing can change without specifics.
Nov 23 2018
next sibling parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Friday, 23 November 2018 at 10:25:57 UTC, Joakim wrote:
 On Wednesday, 21 November 2018 at 14:38:07 UTC, Chris wrote:
 I watched the whole keynote. Well, to begin with it's still a 
 very young language (not 18+ years old) and keeps getting 
 better and better. Things that were a bit tricky just recently 
 are now much easier and part and parcel of the language. It 
 shows that they listen to their user base and make things as 
 easy as possible. In many ways it's already miles ahead of D 
 in terms of what you need as a programmer to get things done 
 fast, e.g. tooling, interop, multi-platform, handling of 
 deprecations etc. There are useful features (I already knew 
 from D) that make life easier (e.g. lambdas).
You would expect that from a language coming out of a tools provider. I thought it was interesting how he mentioned that one of the main reasons he didn't want meta-programming in Kotlin is because it's extremely hard for an IDE like theirs to deal with. That's potentially one of the big advantages of D's string-based mixin approach: the IDE can tell the compiler to dump them to files and then just treat them as regular D source, though I don't know that anybody is doing that yet.
-debug-mixin=file was recently added so that should be possible. I'd bet money that Manu will pester Rainer for it at some point :)
 And as for meta-programming (I knew this would come up ;)), I 
 don't really miss it / use it anymore. There was only one case 
 where I said it would have been nice, but it wasn't _really_ 
 necessary (it was just an old D habit, really). In fact, 
 meta-programming in D can cause a lot of unnecessary headaches 
 (cryptic compiler warnings galore, code breakage) and stall 
 the whole development process unnecessarily - and often for 
 very little extra value. It says a lot that Adam D. Ruppe 
 stated that if you don't want your code to break, use 
 non-idiomatic D. So what's the point of it then? It's just 
 absurd.
It's probably D's biggest differentiator, the one feature that people always say blows their mind when they use D, both from its power and relative ease of use. Of course, it's not for everyone, just like lisp macros, so I asked you what you thought, without saying "Kotlin sucks" or whatever.
What I found really interesting from that was that they reckoned people would write compiler plugins as a substitute. Having dealt with both, I can say that doing what you can in the library is immensely easier. I've seen what clang does to support CUDA: it ain't pretty or simple. Obviously not all things are going to be that complicated, but I'm not particularly convinced that it will be a common thing to do.
Nov 23 2018
prev sibling next sibling parent reply Chris <wendlec tcd.ie> writes:
On Friday, 23 November 2018 at 10:25:57 UTC, Joakim wrote:


 You would expect that from a language coming out of a tools 
 provider. I thought it was interesting how he mentioned that 
 one of the main reasons he didn't want meta-programming in 
 Kotlin is because it's extremely hard for an IDE like theirs to 
 deal with.
 That's potentially one of the big advantages of D's 
 string-based mixin approach: the IDE can tell the compiler to 
 dump them to files and then just treat them as regular D 
 source, though I don't know that anybody is doing that yet.
 It's probably D's biggest differentiator, the one feature that 
 people always say blows their mind when they use D, both from 
 its power and relative ease of use. Of course, it's not for 
 everyone, just like lisp macros, so I asked you what you 
 thought, without saying "Kotlin sucks" or whatever.
No, in fairness, you didn't say Kotlin sucks. To be honest, meta-programming is really hard on both the compiler and the D user. And when you look back at the code after all the headache, you wonder whether it was really worth it. Also, an IDE / compiler that immediately tells you what's wrong is _very_ handy.
 This is a common theme in open source: OSS devs are often much 
 more interested in trying their hand at writing some whizbang 
 tech, which they may not get to do at their day job, than 
 fixing bugs and taking out the trash. Another is that they 
 simply write a clone of some expensive proprietary software 
 that only has some strict subset of its features, say Linus 
 writing the linux kernel because he wanted to run UNIX on his 
 home desktop.

 Neither are conducive to production, stable environments: it 
 took many companies coming in and polishing up the linux kernel 
 or LLVM before they became core components at many companies 
 today. The same will have to be done for D.
Exactly, so what is the D Foundation waiting for? D is 18 years old.
 Why hasn't ruby/rails, Rust, or Nim gotten backing from big 
 players yet? Most things don't get backing from big players, 
 especially initially. What you hope is to create a superior 
 tool that helps small companies grow into the big players 
 someday.
D could be a superior tool. But D is drinking all its money.
 I agree that D has had perhaps too much of an emphasis on 
 breaking new ground rather than the boring nuts-and-bolts of 
 adding new platforms and bug-fixing. Now that multiple 
 multi-million-valued and several small companies are built on 
 D, let's hope they get together and make a push with the 
 community in that direction, as Nicholas indicates.
It's 2018. The IT world has realized that you need to be compatible. And it has to be as easy and smooth as possible.
 In the meantime, you are a commercial user of D, one who has 
 always been _extremely_ positive about the language over the 
 years in these forums. What changed your mind recently?

 I'm asking for specifics: bugs not fixed or concrete incidents 
 that indicate the wrong mindset. We can all wave our hands in 
 the air over perceived generalities, but nothing can change 
 without specifics.
Thanks for asking. It's little things and big things that happened over the years. I think what really made me go "Hm" was https://issues.dlang.org/show_bug.cgi?id=16739 and that it took so long to fix it. I was starting to write a new text parser and came across this stupid bug.

Then there was the whole issue of string handling and autodecode and the way the leadership / big guys in the community dealt with it. I was of the opinion that such an essential issue had to be fixed immediately (of course with a proper path to fix it). I even offered to be the guinea pig and document the transition. But no.

Then there were the dreaded dmd updates. "Shit, what will break now?" - this question would be my constant companion. Sometimes nothing, sometimes quite a few things. How would fixing it affect my code? Yes, I updated my code, but... Once my code broke and I was told to use the `-allinst` flag (Why? It had worked before!), then a few releases later my code broke completely again, and on a whim I removed the `-allinst` flag and it compiled again. Just don't ask why! It's Ahhhrrrrrrrr! The thing is, in D normal features break, not the experimental ones. Ordinary idiomatic D code just breaks and then works again and breaks again. WT*?

And, being a vibe.d user, the compiler would spit out loads of warnings related to vibe.d or just break it. I would have to wait for poor Sönke to fix it, but in the meantime there would be a new dmd version and again... poor Sönke. You can put up with this for a while until you realize that it need not be this way.

Then there was the whole issue of ARM. Walter said he had no experience with it and kinda didn't care either, while everything was going in the direction of mobile (talking about priorities and signs of the times)! If it wasn't for you we'd have nothing, zero, nada. 5 years ago I raised the question. 5 years. D is 18 years old.
What makes my blood boil is when I see lengthy discussions about semi-demented constructors and how they are soooo useful - and then they forgot the semi-demented deconstructor, and it wouldn't work anyway, because it would break some sh*t and conflict with the PR for the new loco-allocator library... and the guys don't even have a new XML parser you can safely use... but who needs XML, it sucks anyway... ah no, lads, it's just crazy. And the advent of the D Foundation didn't help either. Quite the opposite. All I know is that I get things working fast in other languages (even if they don't have semi-demented constructors), I'm more productive than in D... and I don't need to be afraid of the compiler - afraid of the compiler!
Nov 23 2018
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/23/2018 4:59 AM, Chris wrote:
 Then there was the whole issue of string handling and autodecode and the way
the 
 leadership / big guys in the community dealt with it. I was of the opinion
that 
 such an essential issue had to be fixed immediately (of course with a proper 
 path to fix it). I even offered to be the guinea pig and document the 
 transition. But no.
 
 Then there were the dreaded dmd updates. "Shit, what will break now?" this 
 question would be my constant companion.
Fixing autodecode will break about everything. But you don't like breaking changes. What do you suggest?
Nov 23 2018
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Friday, 23 November 2018 at 23:49:16 UTC, Walter Bright wrote:
 On 11/23/2018 4:59 AM, Chris wrote:
 Fixing autodecode will break about everything. But you don't 
 like breaking changes.

 What do you suggest?
If you put it this way, as you tend to, of course I look like a lunatic. The breaking changes I don't like (and no-one does) are changes that seem completely random to the user. Normal, "innocent" code just breaks. It may be because you're preparing feature X that will be available in a year or so, and in order for it to work old code has to be sacrificed. If changes need to be made, let's do them in a sensible manner and not just walk over ordinary users' code. I'm under the impression that a fancy feature counts more than code stability ("Let the rabble clean up their code, we are pursuing a higher goal.")

autodecode, however, is a different beast. It is an essential flaw and thus needs to be tackled (or D will be forever flawed). I've proposed a transition via a dual system (legacy / new). It could be a switch like "-autodecode=off". No code needs to break, but those who are willing to make the transition could go ahead. The transition could be documented, and there could be an automated tester that tells users later where / how much of their code would break and how to fix it if they turn autodecode off. But autodecode is not even open for discussion anymore.
Nov 24 2018
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/24/2018 6:52 AM, Chris wrote:
 On Friday, 23 November 2018 at 23:49:16 UTC, Walter Bright wrote:
 On 11/23/2018 4:59 AM, Chris wrote:
 Fixing autodecode will break about everything. But you don't like breaking 
 changes.

 What do you suggest?
If you put it this way, as you tend to, of course, I look like a lunatic. The breaking changes I don't like (and no-one does) are changes that seem completely random to the user. Normal, "innocent" code just breaks.
Your last statement will come true if autodecode is removed. I'm not saying you're wrong, but I've been bombarded with messages for 15 years that consist of: 1. stop doing enhancements, just fix bugs 2. add in my favorite enhancement 1. don't break existing code 2. add in my favorite enhancement that breaks existing code usually in the same message from the same person.
 But autodecode is not even open for discussion anymore.
It is open. Nobody has shut down discussion of it. The trouble is nobody yet has come up with a reasonable migration path that doesn't result in normal, innocent code silently breaking. BTW, you can defeat autodecode for your own programs by using the .byChar filter: https://dlang.org/phobos/std_utf.html#byChar
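The difference `.byChar` makes can be seen in a few lines: with autodecode, ranging over a `string` yields decoded `dchar`s, while `.byChar` walks the raw UTF-8 code units. (The sample string is made up for illustration.)

```d
// Sketch of sidestepping autodecode with std.utf.byChar.
import std.algorithm.searching : count;
import std.utf : byChar;

void main()
{
    string s = "héllo";          // 'é' takes two UTF-8 code units
    assert(s.count == 5);        // autodecoded range: 5 dchars
    assert(s.byChar.count == 6); // raw code units: 6 chars
}
```

The catch, of course, is that every range-based call site has to opt out individually, which is why the thread keeps circling back to the migration question.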
Nov 24 2018
next sibling parent Chris <wendlec tcd.ie> writes:
On Sunday, 25 November 2018 at 04:23:37 UTC, Walter Bright wrote:
 On 11/24/2018 6:52 AM, Chris wrote:

 Your last statement will come true if autodecode is removed.
 I'm not saying you're wrong, but I've been bombarded with 
 messages for 15 years that consist of:

 1. stop doing enhancements, just fix bugs
 2. add in my favorite enhancement

 1. don't break existing code
 2. add in my favorite enhancement that breaks existing code

 usually in the same message from the same person.
You're mixing apples and oranges again. `autodecode` is an important change. You and Andrei have had no problems with breaking valid code for less. You seem to be in complete denial about `autodecode`. `autodecode` is not the same as a random request by a random user. `autodecode` is essential but you tend to conflate it with random stuff and argue as if it were a random issue, a pet feature. Ironically enough, a lot of random stuff and pet features get more attention / higher priority than basic issues like autodecode.
 But autodecode is not even open for discussion anymore.
It is open. Nobody has shut down discussion of it. The trouble is nobody yet has come up with a reasonable migration path that doesn't result in normal, innocent code silently breaking.
Well, this is not the impression I get. Several people have suggested migration paths, reasonable ones, and there are people who want to help out (I used to be one of them), and yet the paths are not even discussed. All we hear is "It will break a lot of code, and isn't that what you wanted to avoid?" In every thread about autodecode it becomes clear that people want to get rid of it, yet nothing is ever done. I've been vocal about this and other things, but how many more users are there who don't air their opinions but want to see some change?
 BTW, you can defeat autodecode for your own programs by using 
 the .byChar filter:

 https://dlang.org/phobos/std_utf.html#byChar
Ordinary D code already has enough workarounds. Maybe this is something nobody ever brings up, but I've seen vibe.d code that has them, and they often go unnoticed because the fix takes so long that you simply forget about it, and when you finally see it you may not have the time to "fix the fix". It's not good for code hygiene.
Nov 25 2018
prev sibling parent Steven Schveighoffer <schveiguy gmail.com> writes:
On 11/24/18 11:23 PM, Walter Bright wrote:
 BTW, you can defeat autodecode for your own programs by using the 
 .byChar filter:
 
 https://dlang.org/phobos/std_utf.html#byChar
No, use byCodeUnit instead. byChar will convert other character types to char (in fact, byChar just returns r.byCodeUnit if it detects it's the same width). byCodeUnit is what you want if your only goal is to get around auto-decoding. Of course, if you need conversion to a specific width, byChar, byWchar, or byDchar will be useful. -Steve
Nov 26 2018
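Steve's distinction can be seen in a few lines of D (a minimal, illustrative sketch): byCodeUnit hands back the string's own code units unchanged, while byChar additionally transcodes wider encodings down to char.

```d
import std.utf : byChar, byCodeUnit;
import std.algorithm.comparison : equal;

void main()
{
    string s  = "héllo";   // UTF-8: 'é' takes two code units
    wstring w = "héllo"w;  // UTF-16: every character here is one code unit

    // byCodeUnit iterates s's own code units as-is -- no decoding at all.
    assert(s.byCodeUnit.length == 6); // 5 characters, 'é' counted twice

    // byCodeUnit on a wstring yields UTF-16 code units instead.
    assert(w.byCodeUnit.length == 5);

    // byChar transcodes: on a wstring it converts UTF-16 down to UTF-8.
    assert(w.byChar.equal(s.byCodeUnit));
}
```

Either adapter sidesteps auto-decoding; byCodeUnit is the lighter-weight choice when no width conversion is needed.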
prev sibling next sibling parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 11/23/18 6:49 PM, Walter Bright wrote:
 On 11/23/2018 4:59 AM, Chris wrote:
 Then there was the whole issue of string handling and autodecode and 
 the way the leadership / big guys in the community dealt with it. I 
 was of the opinion that such an essential issue had to be fixed 
 immediately (of course with a proper path to fix it). I even offered 
 to be the guinea pig and document the transition. But no.

 Then there were the dreaded dmd updates. "Shit, what will break now?" 
 this question would be my constant companion.
Fixing autodecode will break about everything. But you don't like breaking changes. What do you suggest?
In fact, fixing autodecode will break very little. It will break some things, and I shudder to think of possible "correct" deprecation paths. But for the most part, code only cares about strings, and not the individual items inside them. 1. foreach(x; someString) stays the same. In fact it gets better, because it will be more consistent with phobos 2. someString.front changes, breaking some code, but most of it is already written to deal with it. Take for example std.algorithm.splitter -- there are already cases to handle ranges of char, it will just move from auto-decoding to... intentional decoding. 3. Probably the WORST part is that char implicitly casts to dchar, so things that expect dchar will still compile with strings whose element types switch to char, but instead of decoding, it will promote. Where you will find problems is where code assumes char[].front is dchar instead of using the Phobos ElementType in constraints. 4. The longer we wait, the worse it will be. There is no way to fix autodecoding without breaking code. So we have to make that sacrifice to make progress. The most painful thing I think would be a possible deprecation path where you have to be explicit about how you want things decoded in order to avoid messages. -Steve
Nov 24 2018
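The behavior Steve's points refer to can be sketched in a few lines of today's D (illustrative only; the numbered comments match his list):

```d
import std.range.primitives : front;

void main()
{
    string s = "é"; // one code point, two UTF-8 code units

    // (1) foreach over a string already iterates code units -- no decoding:
    size_t units;
    foreach (char c; s) ++units;
    assert(units == 2);

    // (2) ...but the range primitives auto-decode: front is a dchar today.
    static assert(is(typeof(s.front) == dchar));
    assert(s.front == 'é');

    // (3) char implicitly converts to dchar, so code written against dchar
    // would keep compiling -- silently -- if front were changed to char.
    char c = 'a';
    dchar d = c; // implicit promotion, no error
    assert(d == 'a');
}
```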
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sat, Nov 24, 2018 at 10:25:23AM -0500, Steven Schveighoffer via
Digitalmars-d wrote:
 On 11/23/18 6:49 PM, Walter Bright wrote:
[...]
 Fixing autodecode will break about everything. But you don't like
 breaking changes.
 
 What do you suggest?
In fact, fixing autodecode will break very little. It will break some things, and I shudder to think of possible "correct" deprecation paths. But for the most part, code only cares about strings, and not the individual items inside them. 1. foreach(x; someString) stays the same. In fact it gets better, because it will be more consistent with phobos 2. someString.front changes, breaking some code, but most of it is already written to deal with it. Take for example std.algorithm.splitter -- there are already cases to handle ranges of char, it will just move from auto-decoding to... intentional decoding.
Unfortunately, this will break range-based code that expect a range over a string to return dchar. And thanks to item (3) below, this breakage will be a *silent* one, which is the worst possible kind of breakage. The code will still compile, but the semantics will be different from what the code was originally expecting. And to top it off, it will mostly work if your code deals primarily with ASCII strings. Only the poor end user on a foreign locale will even notice the breakage, and by then it's far too late. I'm all for killing autodecoding, but we need some way to inform user code that the code needs to be updated. Silent breakage is unacceptable. Better yet, have a tool to automatically convert the code to the new idiom, possibly with some user intervention required but preferably none besides "run this command to fix your code, then recompile".
 3. Probably the WORST part is that char implicitly casts to dchar, so
 things that expect dchar will still compile with strings whose element
 types switch to char, but instead of decoding, it will promote. Where
 you will find problems is where code assumes char[].front is dchar
 instead of using the Phobos ElementType in constraints.
Yes, this is a big problem because it's a silent breakage.
 4. The longer we wait, the worse it will be.
 
 There is no way to fix autodecoding without breaking code. So we have
 to make that sacrifice to make progress. The most painful thing I
 think would be a possible deprecation path where you have to be
 explicit about how you want things decoded in order to avoid messages.
[...] This isn't actually as bad as it sounds. Just make strings non-ranges (temporarily), and require .byCodePoint or .byCodeUnit for iteration. Then once the deprecation cycle is over and autodecoding is no more, allow strings as ranges again. It will be painful, yes, but painful is better than silent subtle breakage that nobody knows until it explodes in the customer's face. T -- Bare foot: (n.) A device for locating thumb tacks on the floor.
Nov 24 2018
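Under the path H. S. Teoh sketches, call sites would have to state their intent explicitly. Here is a hypothetical example of what such code would look like, using adapters that already exist in std.utf (today they are opt-in, not required):

```d
import std.utf : byCodeUnit, byDchar;
import std.range : walkLength;

void main()
{
    string s = "héllo";

    // Explicit: iterate the raw UTF-8 code units.
    assert(s.byCodeUnit.walkLength == 6);

    // Explicit: decode to code points -- what auto-decoding does implicitly.
    assert(s.byDchar.walkLength == 5);
}
```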
parent Jon Degenhardt <jond noreply.com> writes:
On Saturday, 24 November 2018 at 16:26:43 UTC, H. S. Teoh wrote:
 On Sat, Nov 24, 2018 at 10:25:23AM -0500, Steven Schveighoffer 
 via Digitalmars-d wrote:
 4. The longer we wait, the worse it will be.
 
 There is no way to fix autodecoding without breaking code. So 
 we have to make that sacrifice to make progress. The most 
 painful thing I think would be a possible deprecation path 
 where you have to be explicit about how you want things 
 decoded in order to avoid messages.
[...] This isn't actually as bad as it sounds. Just make strings non-ranges (temporarily), and require .byCodePoint or .byCodeUnit for iteration. Then once the deprecation cycle is over and autodecoding is no more, allow strings as ranges again. It will be painful, yes, but painful is better than silent subtle breakage that nobody knows until it explodes in the customer's face.
I'm also on the side of deprecating autodecoding. It's a significant deficit for systems doing high-performance string manipulation, which is unfortunate, as otherwise D has many nice facilities for these applications. I'd be happy to provide additional feedback on this topic, as this is an application area where I have experience. --Jon
Nov 24 2018
prev sibling parent reply welkam <wwwelkam gmail.com> writes:
On Friday, 23 November 2018 at 23:49:16 UTC, Walter Bright wrote:
 On 11/23/2018 4:59 AM, Chris wrote:
 Then there was the whole issue of string handling and 
 autodecode and the way the leadership / big guys in the 
 community dealt with it. I was of the opinion that such an 
 essential issue had to be fixed immediately (of course with a 
 proper path to fix it). I even offered to be the guinea pig 
 and document the transition. But no.
 
 Then there were the dreaded dmd updates. "Shit, what will 
 break now?" this question would be my constant companion.
Fixing autodecode will break about everything. But you don't like breaking changes. What do you suggest?
The real reason he is upset is because reality didn't match his internal view (expectations). This creates anger, disturbance (I don't know better words). Then the "left brain" tries to come up with reasons to explain this feeling, and that's what you get in his posts: the left brain coming up with reasons. People in neuroscience have known for some time now that the brain comes up with plausible-sounding explanations that are not the real reasons. Here is an example from split-brain research; the part about the left brain explaining stuff starts at 1:16 https://www.youtube.com/watch?v=wfYbgdo8e-8 Another example from Sam Harris, 19:55-20:10: https://youtu.be/gfpq_CIFDjg?t=1195 If you want these kinds of posts to go away, start managing expectations (almost impossible) or improve D (the thing you are already doing).
Nov 24 2018
next sibling parent Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Saturday, 24 November 2018 at 18:02:59 UTC, welkam wrote:

 This creates anger, disturbance(dont know better words).
Frustration. In this case, of the defenestration-inducing variety.
Nov 24 2018
prev sibling next sibling parent aliak <something something.com> writes:
On Saturday, 24 November 2018 at 18:02:59 UTC, welkam wrote:
 On Friday, 23 November 2018 at 23:49:16 UTC, Walter Bright 
 wrote:
 On 11/23/2018 4:59 AM, Chris wrote:
 Then there was the whole issue of string handling and 
 autodecode and the way the leadership / big guys in the 
 community dealt with it. I was of the opinion that such an 
 essential issue had to be fixed immediately (of course with a 
 proper path to fix it). I even offered to be the guinea pig 
 and document the transition. But no.
 
 Then there were the dreaded dmd updates. "Shit, what will 
 break now?" this question would be my constant companion.
Fixing autodecode will break about everything. But you don't like breaking changes. What do you suggest?
The real reason he is upset is because reality didnt match with his internal view(expectations). This creates anger, disturbance(dont know better words). Then "left brain" tries to come up with reasons to explain this feeling and thats what you get in his posts. Left brain coming up with reasons. People in neuroscience know for some time now that brain comes up with possible sounding explanations that are not the real reasons. Here are example from split brain. Talking about left explaining stuff start at 1:16 https://www.youtube.com/watch?v=wfYbgdo8e-8 Another example from Sam Harris. 19:55-20:10 https://youtu.be/gfpq_CIFDjg?t=1195 If you want these kinds of post to go away start managing expectation (almost impossible) or improve D (the thing you already doing)
You guys do realize that when he said "Then there were the dreaded dmd updates." he was talking about unexpected breakages right? Fixing autodecoding would be a very welcome and expected breaking change. These are not contradictory statements. Paraphrasing it as "Fixing autodecode will break about everything. But you don't like breaking changes." is just twisting his words.
Nov 25 2018
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Saturday, 24 November 2018 at 18:02:59 UTC, welkam wrote:

 The real reason he is upset is because reality didnt match with 
 his internal view(expectations). This creates anger, 
 disturbance(dont know better words). Then "left brain" tries to 
 come up with reasons to explain this feeling and thats what you 
 get in his posts. Left brain coming up with reasons.
I don't have a "left brain" - I'm not a mutant. I have one brain (that has two halves). My unreasonable expectations are surprisingly reasonable in other languages / communities.
 People in neuroscience know for some time now that brain comes 
 up with possible sounding explanations that are not the real 
 reasons. Here are example from split brain. Talking about left 
 explaining stuff start at 1:16
 https://www.youtube.com/watch?v=wfYbgdo8e-8
 Another example from Sam Harris. 19:55-20:10
 https://youtu.be/gfpq_CIFDjg?t=1195
Plausible-sounding explanation of an unreasonable person: - bad leadership, chaos, lack of vision / strategy, lack of understanding of how business works Real explanation: - a stupid user who doesn't understand D's philosophy and higher goals.
 If you want these kinds of post to go away start managing 
 expectation (almost impossible) or improve D (the thing you 
 already doing)
Once these kinds of posts are gone, it will mean one of two things: either: 1. D has become a sound and reliable language with a good ecosystem. or: 2. D has been abandoned by everyone but the zealots.
Nov 25 2018
parent welkam <wwwelkam gmail.com> writes:
On Sunday, 25 November 2018 at 14:26:06 UTC, Chris wrote:
 I don't have a "left brain" - I'm not a mutant. I have one 
 brain (that has two halves).
Those quotes are there for a reason. It's not to be taken literally. The real explanation would take too much typing. Also, did you watch the CGP Grey video in my post?
 My unreasonable expectations are surprisingly reasonable in 
 other languages / communities.
You mean languages with smaller scope and bigger resources? Like Kotlin or Rust? They both have a smaller scope than D and more resources. The closest to D in terms of scope and resources is Nim, and it's rough around the edges too. I would argue even rougher than D. So what do D and Nim have in common? Lack of resources.
 Possible sounding explanation of unreasonable person:

 - bad leadership, chaos, lack of vision / strategy, lack of 
 understanding how business works

 Real explanation:

 - stupid user who doesn't understand D's philosophy and higher 
 goals.
In that post I treated you as human, a human just like me and billions of others. When you look at the human brain closely enough, you will find strange things. I don't know how you can look at $1605 per month and say "yeah, you can hire loads of highly skilled developers to do an awesome job" (I'm exaggerating to make a point). Look at that number as long as you need until you get it. I am in 80-90% agreement with what is wrong with D, but I don't act like you. That number is the reason. P.S. If you want people to think of you as a reasonable person, don't compare a non-profit open source project to a for-profit organization.
Nov 25 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/23/2018 4:59 AM, Chris wrote:
 Then there was the whole issue of ARM. Walter said he had no experience with
it 
 and kinda didn't care either,
Actually, a person in the D community was working for a couple years on an ARM back end, but he eventually lost interest and it was abandoned. Building a backend is something usually teams of people work on exclusively. It's a little unfair to expect me to do one spending an hour or so a day on it. I can't order someone to work on it, I can't hire someone to work on it. It had to wait until someone both competent and self-motivated stepped up to do it.
Nov 23 2018
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Nov 23, 2018 at 03:56:31PM -0800, Walter Bright via Digitalmars-d wrote:
 On 11/23/2018 4:59 AM, Chris wrote:
 Then there was the whole issue of ARM. Walter said he had no
 experience with it and kinda didn't care either,
Actually, a person in the D community was working for a couple years on an ARM back end, but he eventually lost interest and it was abandoned.
LDC is already well able to target ARM -- I've been using it to write Android apps, and while it takes a bit of work to set up, once it's setup it works very well. Frankly, I would not be particularly interested in an ARM target for dmd: dmd's weak optimizer, sorry to say, makes it a rather unattractive option compared to LDC. And now that LDC is keeping up with DMD releases, I'm quite tempted to just start using LDC for all of my D projects, or at least all the performance-sensitive ones, since it would be keeping up with the latest features / bugfixes.
 Building a backend is something usually teams of people work on
 exclusively.  It's a little unfair to expect me to do one spending an
 hour or so a day on it. I can't order someone to work on it, I can't
 hire someone to work on it.
 
 It had to wait until someone both competent and self-motivated stepped
 up to do it.
I would much rather Walter spend his time on higher-level, more important D issues than writing another dmd backend. Though I wouldn't mind if he gave a bit more love to the dmd optimizer. ;-) (When will we get loop unrolling?) T -- Obviously, some things aren't very obvious.
Nov 23 2018
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/23/2018 4:11 PM, H. S. Teoh wrote:
  Though I wouldn't
 mind if he gave a bit more love to the dmd optimizer. ;-)  (When will we
 get loop unrolling?)
It's already there: https://github.com/dlang/dmd/blob/master/src/dmd/backend/gloop.d#L3767
Nov 23 2018
prev sibling next sibling parent reply Laeeth Isharc <laeeth kaleidic.io> writes:
On Saturday, 24 November 2018 at 00:11:37 UTC, H. S. Teoh wrote:
 On Fri, Nov 23, 2018 at 03:56:31PM -0800, Walter Bright via 
 Digitalmars-d wrote:
 On 11/23/2018 4:59 AM, Chris wrote:
 Then there was the whole issue of ARM. Walter said he had no 
 experience with it and kinda didn't care either,
Actually, a person in the D community was working for a couple years on an ARM back end, but he eventually lost interest and it was abandoned.
LDC is already well able to target ARM -- I've been using it to write Android apps, and while it takes a bit of work to set up, once it's setup it works very well.
Care to say a bit more about what you are using it for? Do you write the gui in D too? Using jni or dlangui if so?
Nov 23 2018
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sat, Nov 24, 2018 at 01:38:07AM +0000, Laeeth Isharc via Digitalmars-d wrote:
 On Saturday, 24 November 2018 at 00:11:37 UTC, H. S. Teoh wrote:
 On Fri, Nov 23, 2018 at 03:56:31PM -0800, Walter Bright via
 Digitalmars-d wrote:
 On 11/23/2018 4:59 AM, Chris wrote:
 Then there was the whole issue of ARM. Walter said he had no
 experience with it and kinda didn't care either,
Actually, a person in the D community was working for a couple years on an ARM back end, but he eventually lost interest and it was abandoned.
LDC is already well able to target ARM -- I've been using it to write Android apps, and while it takes a bit of work to set up, once it's setup it works very well.
Care to say a bit more about what you are using it for? Do you write the gui in D too? Using jni or dlangui if so?
Right now, it's still an experimental project to test out the waters... Things are looking good and it might turn into a "real" project soon, but as of now, it's still experimental. Basically I have the Java code handle interacting with Android OS (it's possible to do this via NDK but too troublesome and not necessary at the moment), basically just a thin wrapper to forward most of the logic to the D code via JNI. There's an error popup screen purely written in Java, but it's more a development tool for displaying uncaught D exceptions than an actual end-user UI. The bulk of the code is in D, and renders geometric models via OpenGL ES. At some point there will be text, and menus, and other UI elements, but right now, just some animated geometric shapes. It communicates with the Java code via JNI. Presently, the GL context creation is still handled by the Java code (haven't found a pressing need to manually do it yet, so why not leverage what the Android Java API already provides). The geometric models are converted by a D utility running on the host system (i.e., not on Android) that converts a bunch of models I have handy into the required format for OpenGL as a bunch of D arrays that I just embed into the executable. There's also a bunch of GLSL shaders that are stripped and turned into D string literals and embedded into the executable as well -- no need to load resource files at runtime. (This may or may not change as the project gets bigger, though.) This is also done by a build-time D utility, which also auto-generates nice D interfaces for binding shader inputs. So instead of manually writing a bunch of glUniformxxx calls (with the associated high risk of typos that causes crashes at runtime), I can just write: useProgram(...); myshader.worldmatrix = FMatrix(...); myshader.worldposition = FVec(1, 2, 3, 1); ... glDrawArrays(...); and it automatically issues the correct glUniformxxx calls for me. 
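The generated uniform setters described above could plausibly be built on opDispatch. The sketch below is purely illustrative -- the GL call is mocked with a log, and names such as ShaderParams are invented here, not taken from the actual project:

```d
import std.format : format;

string[] glLog; // stands in for the real glUniform* calls

struct ShaderParams
{
    int[string] locations; // uniform name -> GL uniform location

    // `shader.worldposition = value;` is rewritten by the compiler into
    // `shader.opDispatch!"worldposition"(value)`.
    void opDispatch(string name, T)(T value)
    {
        // A real implementation would pick glUniform4fv, glUniformMatrix4fv,
        // etc. based on T; here we just record the call.
        glLog ~= format("glUniform(loc=%s, %s)", locations[name], value);
    }
}

void main()
{
    auto shader = ShaderParams(["worldposition": 3]);
    shader.worldposition = [1f, 2f, 3f, 1f]; // reads like plain assignment
    assert(glLog.length == 1);
}
```

Because the dispatch happens at compile time, a typo in the uniform name fails at the lookup rather than silently binding the wrong location at runtime.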
Currently vertex attribute bindings still have to be done manually, but with just a little more infrastructure in place, I could auto-generate those too. Then I wouldn't have to worry about those crash-prone fiddly details that the OpenGL API requires ever again. The D code obtains the app-specific data directory from the Java code via JNI, and handles its own saving/restoring of UI state via a home-made (somewhat hamfisted) serializer. Eventually it will probably be replaced by a real serialization system / library, but right now, it's still in the initial experimental stage so just a crude homebrew system. But the point is that the D code is well able to handle Android app lifetime events. One interesting aspect of this project is that I designed the D code to be mostly independent of the Android API -- the API is still more or less structured according to an Android app's lifetime callbacks, but is agnostic to the actual implementation. The OpenGL code being also a subset of desktop OpenGL APIs, I was able to write an X11 driver that runs the same D backend (without any version blocks!), except on the PC instead of an Android device. This has been very useful for ironing out silly mistakes by allowing development as a local PC program before deploying to the actual Android device. I suppose I *could* use the Android simulator for this, but it's slow, needlessly resource-hungry, and far too klunky to use for fast development. Plus, it opens up the possibility of deploying this program on PCs too, in addition to Android. D's malleability really helped to pull this off -- by templating away Android-specific types that can be substituted by PC-specific types / wrappers, I can pretty much just reuse the code as-is without the major refactoring into a labyrinth of interfaces and other boilerplate that would have been required in Java to abstract away OS-specific components.
Like I've already said, I haven't gotten to the point of autogenerating JNI interfaces to be directly callable from D, but if the code turns out to need to cross the JNI boundary much more, that's what I'll do. I also setup my build system (a custom one based on SCons -- Gradle is far too heavyweight for my tastes) to compile and run unittests on the local PC, which will abort if any unittests fail. So the same source files are compiled twice -- once with -unittest targeting x86_64 (using dmd, no less! :-P), and once without -unittest targeting ARM (using LDC cross-compiler). If the build succeeds at all, I know that all unittests pass, and the APK can be deployed to the device to be further tested there. The entire build takes about 10-11 seconds to build everything from a clean workspace (includes generating model data, generating shader D APIs, compiling and running unittests, compiling X11 executables, compiling Java code, cross-compiling D code, and building and signing an APK), and about 5-6 seconds on incremental builds, depending on what was changed. Not as fast as Atila would like, ;-) but reasonably fast and with absolutely minimal system requirements -- I literally did all this with just a text editor, SCons, and SSH. Didn't even fire up Android Studio once. tl;dr: Thanks to Joakim's efforts and the LDC team, it is quite possible to write Android apps in D today. This project of mine still has a Java component to it, but I'd wager that doing a completely native app with the NDK would work just as well (Joakim's github repo has simple examples of this). I won't lie, though -- setting up LDC to cross-compile to Android/ARM does require some effort. But it's not unduly hard, only has to be done once, and your build system takes care of it thereafter (assuming you use a reasonable build system, that is). T -- People walk. Computers run.
Nov 23 2018
parent reply Joakim <dlang joakim.fea.st> writes:
On Saturday, 24 November 2018 at 02:45:46 UTC, H. S. Teoh wrote:
 On Sat, Nov 24, 2018 at 01:38:07AM +0000, Laeeth Isharc via 
 Digitalmars-d wrote:
 [...]
Right now, it's still an experimental project to test out the waters... Things are looking good and it might turn into a "real" project soon, but as of now, it's still experimental. [...]
Interesting, good to hear it's working well. I plan to write up a post for the D blog about the Android port next month: this would make a good demo to link to, particularly if the source is available. Any chance you'll make it available, at least as a demo app if not the source code? If you have commercial aims for this app and don't want to release either, that's fine and I understand.
Nov 23 2018
parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sat, Nov 24, 2018 at 07:50:30AM +0000, Joakim via Digitalmars-d wrote:
 On Saturday, 24 November 2018 at 02:45:46 UTC, H. S. Teoh wrote:
 On Sat, Nov 24, 2018 at 01:38:07AM +0000, Laeeth Isharc via
 Digitalmars-d wrote:
[...]
 Right now, it's still an experimental project to test out the
 waters...  Things are looking good and it might turn into a "real"
 project soon, but as of now, it's still experimental.
 [...]
Interesting, good to hear it's working well. I plan to write up a post for the D blog about the Android port next month: this would make a good demo to link to, particularly if the source is available. Any chance you'll make it available, at least as a demo app if not the source code? If you have commercial aims for this app and don't want to release either, that's fine and I understand.
I would release the code... except it still has a long way to go before it's ready for public consumption. It's definitely not going to be ready by next month, unfortunately. But perhaps a demo APK might not be out-of-place. It's not quite presentable yet in terms of UI, but it does work, and displays nice eye-candy suitable for a simple demo. The biggest problem with releasing the code right now is that out of convenience, I reused some geometric models from a different project of mine, which is otherwise completely unrelated to this one (and which I'm not ready to release quite yet), and as a quick hack to be able to access the data I pulled it in as a git submodule, and the model generation utility depends on some of its source files. Before releasing the code I really should just pull out the model data directly into independent data files, instead of the "dirty" cross-dependency. (The main reason I haven't done anything about this yet, is because I hope to eventually generate the model data independently of this other project -- it's really just a quick and dirty way of obtaining some interesting geometry for testing OpenGL ES rendering, and was never meant to be a permanent dependency.) T -- Just because you can, doesn't mean you should.
Nov 24 2018
prev sibling parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Saturday, 24 November 2018 at 00:11:37 UTC, H. S. Teoh wrote:
 On Fri, Nov 23, 2018 at 03:56:31PM -0800, Walter Bright via 
 Digitalmars-d wrote:
 On 11/23/2018 4:59 AM, Chris wrote:
 Then there was the whole issue of ARM. Walter said he had no 
 experience with it and kinda didn't care either,
Actually, a person in the D community was working for a couple years on an ARM back end, but he eventually lost interest and it was abandoned.
LDC is already well able to target ARM -- I've been using it to write Android apps, and while it takes a bit of work to set up, once it's setup it works very well. Frankly, I would not be particularly interested in an ARM target for dmd: dmd's weak optimizer, sorry to say, makes it a rather unattractive option compared to LDC. And now that LDC is keeping up with DMD releases, I'm quite tempted to just start using LDC for all of my D projects, or at least all the performance-sensitive ones, since it would be keeping up with the latest features / bugfixes.
 Building a backend is something usually teams of people work 
 on exclusively.  It's a little unfair to expect me to do one 
 spending an hour or so a day on it. I can't order someone to 
 work on it, I can't hire someone to work on it.
 
 It had to wait until someone both competent and self-motivated 
 stepped up to do it.
I would much rather Walter spend his time on higher-level, more important D issues than writing another dmd backend. Though I wouldn't mind if he gave a bit more love to the dmd optimizer. ;-) (When will we get loop unrolling?) T
I can add that the long-standing presence of the LLVM backend in LDC is considered a positive point from the customer's point of view. LLVM has a strong reputation and simply adds value to the delivered product, along with better acceptance of the fact that the source code is written in D. DMD adds value for us during the development phase, as it's still faster than LDC (even if it's not as much faster as it used to be years ago). Paolo
Nov 24 2018
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Friday, 23 November 2018 at 23:56:31 UTC, Walter Bright wrote:
 On 11/23/2018 4:59 AM, Chris wrote:
 Actually, a person in the D community was working for a couple 
 years on an ARM back end, but he eventually lost interest and 
 it was abandoned.

 Building a backend is something usually teams of people work on 
 exclusively. It's a little unfair to expect me to do one 
 spending an hour or so a day on it. I can't order someone to 
 work on it, I can't hire someone to work on it.
 It had to wait until someone both competent and self-motivated 
 stepped up to do it.
Of course you cannot do everything alone. I never expected that. But ARM was never really high on the agenda. It need not be dmd, ldc is fine, but it was never really pushed. Maybe it's not "challenging" enough for the core devs, I don't know. But it is essential enough that it should have gotten a higher priority than just waiting until someone stepped up. I see a lot of other things happening, like the re-implementation of the D compiler in D. Fine. But do I as a user really care whether it's written in D or C++? I can see that it's a prestigious thing to have, but when I see where D/ARM is, I just wonder if the priorities are right. Maybe the D Foundation could pay (or could have paid) someone to set up an LDC-ARM toolchain (it need not be a dmd backend). 6 months to a year would be a realistic time frame, I think. What do you reckon? If you develop software for ordinary people to use (not in-house frameworks for ads or time tables), they do ask you if there's an app for Android or iOS. And with D it's still too much of a gamble. It's quite a chore to set it up, and then it might break with every new compiler release, and it might or might not cater for various Android (and iOS) releases. And then it's wait, wait, wait... work around, etc. I don't understand how things are prioritized in D. Basic and important things seem to be at the bottom of the list (XML parser), while other things get huge attention though they are of dubious value to many users. This is why I don't completely buy the "we don't have enough resources" argument. The scarce resources you have are not used wisely, in my opinion. And it is a pity, when I see that D has loads of potential (C/C++ interop, Objective-C interop, etc.), that other new languages overtake D because they focus on practical issues too.
Nov 24 2018
next sibling parent reply welkam <wwwelkam gmail.com> writes:
On Saturday, 24 November 2018 at 15:19:26 UTC, Chris wrote:
 I don't understand how things are prioritized in D.
Oh, that's very easy. They don't. Not really. Well, there is a wish list (https://wiki.dlang.org/Wish_list) and there are vision documents (https://wiki.dlang.org/Vision/2018H1), but most people ignore them and work on what they want to work on. They scratch their own itch, so to speak, and if you try to force them to do something else, they won't work at all. If you could figure out a way to make people work for free on the things that we want, that would be great.
 Basic and important things seem to be at the bottom of the list 
 (XML parser)
There was an attempt at rewriting the xml module, but it never crossed the finish line. It's from Aug 22, 2016: https://github.com/dlang/phobos/pull/4741
 The scarce resources you have are not used wisely in my opinion.
OK, try for a month to control contributors on an open source project, then lecture us on how to manage resources. I will wait here to laugh at your failure.
Nov 24 2018
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sat, Nov 24, 2018 at 04:46:02PM +0000, welkam via Digitalmars-d wrote:
 On Saturday, 24 November 2018 at 15:19:26 UTC, Chris wrote:
[...]
 Basic and important things seem to be at the bottom of the list (XML
 parser)
There was an attempt at rewriting the xml module, but it never crossed the finish line. It's from Aug 22, 2016: https://github.com/dlang/phobos/pull/4741
[...] Actually, if you're interested in XML, you should check out Jonathan Davis' dxml module:
 Documentation: http://jmdavisprog.com/docs/dxml/0.2.0/
 Github: https://github.com/jmdavis/dxml/tree/v0.2.0
 Dub: http://code.dlang.org/packages/dxml
It doesn't handle DTDs due to design constraints, but if you can live with that, it should work much better than std.xml. (I'm hoping Jonathan pushes for this to be merged into Phobos, as it's far superior to std.xml IMO. But that has yet to happen.) T -- I am a consultant. My job is to make your job redundant. -- Mr Tom
Nov 24 2018
parent welkam <wwwelkam gmail.com> writes:
On Saturday, 24 November 2018 at 17:09:08 UTC, H. S. Teoh wrote:
 On Sat, Nov 24, 2018 at 04:46:02PM +0000, welkam via 
 Digitalmars-d wrote:
 On Saturday, 24 November 2018 at 15:19:26 UTC, Chris wrote:
[...]
 Basic and important things seem to be at the bottom of the 
 list (XML
 parser)
There was an attempt at rewriting the xml module, but it never crossed the finish line. It's from Aug 22, 2016: https://github.com/dlang/phobos/pull/4741
[...] Actually, if you're interested in XML, you should check out Jonathan Davis' dxml module:
 Documentation: http://jmdavisprog.com/docs/dxml/0.2.0/
 Github: https://github.com/jmdavis/dxml/tree/v0.2.0
 Dub: http://code.dlang.org/packages/dxml
It doesn't handle DTDs due to design constraints, but if you can live with that, it should work much better than std.xml. (I'm hoping Jonathan pushes for this to be merged into Phobos, as it's far superior to std.xml IMO. But that has yet to happen.) T
You didn't know this, but I have watched almost all DConf videos and all of Andrei's and Walter's presentations on D, and I know about Jonathan's work. I just wanted to refute the claim that people don't care about rewriting xml.
Nov 24 2018
prev sibling parent reply bachmeier <no spam.net> writes:
On Saturday, 24 November 2018 at 16:46:02 UTC, welkam wrote:

 But most people ignore them and work on what they want to work 
 on. They scratch their own itch so to speak and if you try to 
 force them to do something else they wont work at all. If you 
 could figure out a way to make people work for free and on 
 things that we want that be great.
That's definitely a problem. It's not completely true though. Some people answer questions on Stack Overflow even though it doesn't benefit them in any way to do so - they get (more or less) meaningless points for answering questions. Some people put in long hours for minor Linux distributions, doing things that don't benefit them at all. Some people answer newbie Linux questions for hours at a time. Heck, I wasted time getting one of my projects to work on Windows, even though I would rather change jobs than use Windows.

I think the problem with D is that the work that needs to be done is not clearly defined, you don't get credit in any way, and even if you do it, there's no reason to think you've done it "right", so it might all be for nothing. If you can fix those three things, you might be able to get people to contribute to things they don't care about.

There are other solutions too. If you can partner with a department in a university, you can give students internship credit for working on certain projects.
Nov 24 2018
parent welkam <wwwelkam gmail.com> writes:
On Saturday, 24 November 2018 at 17:43:05 UTC, bachmeier wrote:
 I think the problem with D is that the work that needs to be 
 done is not clearly defined, you don't get credit in any way, 
 and even if you do it, there's no reason to think you've done 
 it "right", so it might all be for nothing. If you can fix 
 those three things, you might be able to get people to 
 contribute to things they don't care about.
That is a good observation. D would benefit if someone implemented systems from video games. Games are really good at defining goals, guiding players toward achieving them, tracking progress, and rewarding them for any small improvement. The problem is that people in the D community are developers, not psychologists. The second problem is that implementing such systems requires work, and we already have too many tasks that need to be done.
Nov 24 2018
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sat, Nov 24, 2018 at 03:19:26PM +0000, Chris via Digitalmars-d wrote:
[...]
 I don't understand how things are prioritized in D. Basic and
 important things seem to be at the bottom of the list (XML parser),
 other things get huge attention while they are of dubious value to
 many users. This is why I don't completely buy the "we don't have
 enough resources" argument. The scarce resources you have are not used
 wisely in my opinion. And it is a pity when I see that D has loads of
 potential (C/C++ interop, Objective-C interop etc.) but other new
 languages overtake D because they focus on practical issues too.
[...] You have to understand that D is an open source project run by volunteers, not a for-profit organization that can afford to pay people to tell them what to do. Well, with the D Foundation set up now, I suppose we could begin to pay some people to work on stuff (and we have). But with a budget of barely over 1K per month, the bulk of the work is still done by unpaid volunteers who contribute purely out of their own initiative.

Demanding that volunteers work on tasks that you deem important is about as effective as herding cats. This is not to excuse the state of things in any way; it's just a realistic evaluation of the actual situation. Even though Walter and Andrei serve as BDFL and visionary leaders, they can no more order any of us to do anything than a random stranger from the street can dictate to you how you ought to spend your free time. They can't just "use our resources" however they want, because this isn't a top-down organization where the higher-ups assign tasks to the lower-downs. This is a gathering of like-minded peers who contribute as equals, according to their interest and capacity, because they believe in the product.

This is really what we mean when we say "if you want X to change, do it yourself" or "be the change that you want to see". It should not be misconstrued as writing anyone off, an excuse for laziness, or being dismissive of newcomers. Rather, it's an open invitation to participate in the gathering of peers, to have a hand in producing something we hope and believe will be wonderful. Whether you accept the invitation or not is really up to you -- it's not a demand, but just an invitation. If you see the value in D, and if you feel you can contribute something useful, then you will be welcomed. 
But if you expect to tell others what to do while not contributing anything yourself, then don't be surprised if you get the same reactions you might give when a random stranger walks up to you and starts dictating how you ought to be spending your free time. T -- What's a "hot crossed bun"? An angry rabbit.
Nov 24 2018
next sibling parent reply Grumpy <grumpy gmail.com> writes:
On Saturday, 24 November 2018 at 17:37:36 UTC, H. S. Teoh wrote:
 On Sat, Nov 24, 2018 at 03:19:26PM +0000, Chris via 
 Digitalmars-d wrote: [...]
 I don't understand how things are prioritized in D. Basic and 
 important things seem to be at the bottom of the list (XML 
 parser), other things get huge attention while they are of 
 dubious value to many users. This is why I don't completely 
 buy the "we don't have enough resources" argument. The scarce 
 resources you have are not used wisely in my opinion. And it 
 is a pity when I see that D has loads of potential (C/C++ 
 interop, Objective-C interop etc.) but other new languages 
 overtake D because they focus on practical issues too.
[...] You have to understand that D is an open source project run by volunteers, not a for-profit organization that can afford to pay people to tell them what to do. Well, with the D foundation setup now, I suppose we could begin to pay some people to work on stuff (and we have). But with a budget of barely over 1K per month, the bulk of the work is still done by unpaid volunteers who contribute purely out of their own initiative. Demanding that volunteers work on tasks that you deem important is about as effective as herding cats. This is not to excuse the state of things in any way, but it's just a realistic evaluation of the actual situation.
This concept has been repeated ad nauseam...
 Even though Walter and Andrei serve as BDFL and visionary 
 leaders,
AHAHAHAHAHAHAHAHAHAH This is the point! They are considered leaders and ARE NOT leaders! Nobody asks that they give orders, a leader is asked to give a direction, a VISION!
 they can no more order any of us to do anything than a random 
 stranger from the street can dictate to you how you ought to 
 spend your free time. They can't just "use our resources" 
 however they want, because this isn't a top-down organization 
 where the higher ups assign tasks to the lower downs.
 This is really what we mean when we say "if you want X to 
 change, do it yourself" or "be the change that you want to 
 see".  It should not be misconstrued as writing anyone off, an 
 excuse for laziness, or being dismissive of newcomers.  Rather, 
 it's an open invitation to participate in the gathering of 
 peers, to have a hand in producing something we hope and 
 believe will be wonderful.
 Whether you accept the invitation or not is really up to you -- 
 it's not a demand, but just an invitation. If you see the value 
 in D, and if you feel you can contribute something useful, then 
 you will be welcomed.  But if you expect to tell others what to 
 do while not contributing anything yourself, then don't be 
 surprised if you get the same reactions you might give when a 
 random stranger walks up to you and starts dictating how you 
 ought to be spending your free time.
Nothing is more effective at motivating than example! Since the transparency of the leadership and the Foundation is very bad, let's look a little more at what secret strategy they are concentrating on:

Walter: months spent converting the backend from C++ to D - mind you, in a compiler in which you cannot even turn on the GC. What a show, pride for the D programming language! Should we talk about the DIP1000 documentation?

Andrei: https://github.com/andralex Need I add more? More than a gatekeeper, this is a solid brick wall.

With these examples, is the problem really the lack of contributions? This is pure collective madness.
Nov 24 2018
parent reply Chris <wendlec tcd.ie> writes:
On Saturday, 24 November 2018 at 18:00:44 UTC, Grumpy wrote:
 On Saturday, 24 November 2018 at 17:37:36 UTC, H. S. Teoh wrote:
 Since the transparency of the leaders and the foundation is 
 very bad, let's look a little bit more about what secret 
 strategy they are concentrating on:

 Walter, months to convert the backend from C ++ to D, be 
 careful, in a compiler in which you can not even turn on the 
 GC, what a show, pride for the D programming language! Should 
 we talk about the DIP1000 documentation?

 Andrei: https://github.com/andralex
 Need to add more? More than a gatekeeper, this is a solid brick 
 wall.

 With these examples, the problem is the lack of contributions?

 This is pure collective madness.
I called it an "autocratic chaos". Since my brain (at least the left half) has been analyzed in this thread, let's take this a bit further and talk about psychology. I may well be that Walter and other core devs really feel that they are making great progress when porting DMD to D and stuff like that. Indeed, their own projects might be emotionally rewarding and trigger feelings of euphoria. So the projects they're emotionally involved in are much more important to them than dreadful stuff like fixing `autodecode` - naturally. This causes a discrepancy between what users see / want / need and what the leadership sees / wants. If the above is true, it has to be changed. A project like D cannot survive if it's only driven by personal preferences. In my own job I sometimes work on interesting and emotionally rewarding stuff, but I also have to do the head wrecking and boring stuff that may not even be related to writing code - boring but necessary. It's not just a question of the - by now famous - $1,600 a month. I've seen other open source projects thrive because of a different community culture. "Do what you like or is important to you personally" will only get you so far. If people who are willing to volunteer see why they should spend time doing chore X or Y, they will do it. But if there's no clear vision and it takes ages to get a review / accepted, then why bother?
Nov 26 2018
next sibling parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Monday, 26 November 2018 at 10:19:14 UTC, Chris wrote:

 In my own job I sometimes work on interesting and emotionally 
 rewarding stuff, but I also have to do the head wrecking and 
 boring stuff that may not even be related to writing code - 
 boring but necessary.
That is because you are being paid to do so!
Nov 26 2018
parent Chris <wendlec tcd.ie> writes:
On Monday, 26 November 2018 at 14:28:58 UTC, 12345swordy wrote:
 On Monday, 26 November 2018 at 10:19:14 UTC, Chris wrote:

 In my own job I sometimes work on interesting and emotionally 
 rewarding stuff, but I also have to do the head wrecking and 
 boring stuff that may not even be related to writing code - 
 boring but necessary.
 That is because you are being paid to do so!
Ah, no! Please. It works in other open source projects too. People volunteer to do head-wrecking nitty-gritty stuff, if they have the feeling it's worth it. Isn't LDC an example of just this? You just have to encourage people and give them some sort of reassurance. If, however, the leadership doesn't lead by example and acts more or less in a random manner, what do you expect? Wasn't it Walter himself who once stated that the attitude of the management trickles down? It actually does. There's even a proverb "Like master, like man."
Nov 26 2018
prev sibling parent reply welkam <wwwelkam gmail.com> writes:
On Monday, 26 November 2018 at 10:19:14 UTC, Chris wrote:
 It may well be that Walter and other core devs really feel that 
 they are making great progress when porting DMD to D and stuff 
 like that.
Actually, D postponed porting from C++ to D for a long time. Other languages like Nim and Jai had their implementations written in themselves from the beginning. Secondly, the translation was done with a tool and, my guess is, didn't take much work for the DMD backend - at least compared to what it would take to make Android app development easy. Thirdly, the compiler being in D means more people from the community can participate in compiler development. I started twiddling with the compiler because it was mostly in D, and I would not have touched it if it were in C++.
 Indeed, their own projects might be emotionally rewarding and 
 trigger feelings of euphoria.
It's clear you haven't read a single whitepaper on behavioral psychology or neuroscience.
 A project like D cannot survive if it's only driven by personal 
 preferences.
The good thing is that it's not only driven by personal preference. Now we have the D Foundation and companies like Symmetry Investments who sponsor work on libraries. This trend will only increase, but not at the speed you or I want.
 In my own job I sometimes work on interesting and emotionally 
 rewarding stuff, but I also have to do the head wrecking and 
 boring stuff that may not even be related to writing code - 
 boring but necessary.
You compare open source with a for-profit company again. We all do boring but necessary stuff in paid jobs. You're not an exception. What matters here is that you expect other people who have worked 8 hours of boring stuff to go home and work on more boring stuff for free. These kinds of people are rare, and you are not one of them yourself.
 It's not just a question of the - by now famous - $1,600 a 
 month. I've seen other open source projects thrive because of a 
 different community culture.
Stop being vague and start naming open source projects that are as big in scope as D and thrive without corporate sponsorship. We might learn something.
 it takes ages to get a review / accepted, then why bother?
You would be surprised what a little bit of money can change, and Nicholas is already doing good work: https://www.flipcause.com/secure/cause_pdetails/NDUwNTY=

Oh, and about fixing autodecode: https://youtu.be/Lo6Q2vB9AAg?t=4044
Nov 26 2018
parent reply Chris <wendlec tcd.ie> writes:
On Monday, 26 November 2018 at 18:01:08 UTC, welkam wrote:
 On Monday, 26 November 2018 at 10:19:14 UTC, Chris wrote:
 It may well be that Walter and other core devs really feel that 
 they are making great progress when porting DMD to D and stuff 
 like that.
Actually, D postponed porting from C++ to D for a long time. Other languages like Nim and Jai had their implementations written in themselves from the beginning. Secondly, the translation was done with a tool and, my guess is, didn't take much work for the DMD backend - at least compared to what it would take to make Android app development easy. Thirdly, the compiler being in D means more people from the community can participate in compiler development. I started twiddling with the compiler because it was mostly in D, and I would not have touched it if it were in C++.
Fair enough, but again it's a very narrow focus, and people interested in it can "twiddle around". For deployment we have LDC anyway. A language needs a broader focus, i.e. infrastructure, tools and, yes, an IDE.
 Indeed, their own projects might be emotionally rewarding and 
 trigger feelings of euphoria.
Its clear you havent read single whitepaper on behavioral psychology or neuroscience.
If you have worked with people (and you know a bit about yourself) you don't need to read all that to know how people feel and what motivates them.
 A project like D cannot survive if it's only driven by 
 personal preferences.
The good thing is that its not only driven by personal preference. Now we have d foundation and companies who sponsor work on libraries like Symmetry Investments. This trend will only increase but not at the speed you or I want.
Again, often a very narrow focus. Bits and pieces here and there, special interest.
 In my own job I sometimes work on interesting and emotionally 
 rewarding stuff, but I also have to do the head wrecking and 
 boring stuff that may not even be related to writing code - 
 boring but necessary.
You compare open source with for profit company again. We all do boring but necessary stuff in paid jobs. Your not exception. What matters here is that you expect other people who worked 8h of boring stuff to go home and work on more boring stuff for free. These kind of people are rare and you are not one of them yourself.
Well, I used to contribute a bit: bug reports, the odd PR for dlang, and other simple stuff. I donated money twice (I think, or was it three times?). I was seriously thinking about contributing more, but then:

1) I had to spend time fixing my own code due to dmd updates.
2) I saw the whole PR / priorities culture (or lack thereof).
3) I thought: if my own code breaks randomly or is no longer "state-of-the-art" due to a new paradigm being introduced, what happens if I make a contribution? Will I have to re-write the code forever and ever and ever like poor Sisyphus?
4) Regarding stuff like ARM and JNI, I would have helped to improve them; my own experience / setup would have trickled into the project. However, it took too long until they became a real option, and other technologies emerged in the meantime, technologies that were closing the gap ignored by D.

Yes, I would have worked on boring stuff too. People contribute for various reasons, e.g. self-interest, glory, gratitude (giving something back), or because they have / share a vision or an idea. Usually it's a mix of all of them. (Except for gratitude, everything else is more or less self-interest.)

Now you may discard all of my points above as nonsense, because maybe they are not mentioned in papers about behavioral neuroscience and psychology (I've actually read quite a bit about behavioral biology), but those were my thoughts.
 It's not just a question of the - by now famous - $1,600 a 
 month. I've seen other open source projects thrive because of 
 a different community culture.
Stop being vague and start naming open source projects that are as big in scope as D and thrive without corporate sponsorship. We might learn something.
There are loads; look at Linux distros and libraries that are still used everywhere.
 it takes ages to get a review / accepted, then why bother?
You would be surprised what a little bit of money can change and Nicholas already doing good work https://www.flipcause.com/secure/cause_pdetails/NDUwNTY=
Nicholas is a legend. Fair play to him. I wish him luck and hope he'll succeed!
 Oh and about fixing autodecode 
 https://youtu.be/Lo6Q2vB9AAg?t=4044
Nov 27 2018
parent reply welkam <wwwelkam gmail.com> writes:
On Tuesday, 27 November 2018 at 08:52:07 UTC, Chris wrote:
 Fair enough, but again it's a very narrow focus, and people 
 interested in it can "twiddle around". For deployment we have 
 LDC anyway. A language needs a broader focus, i.e. 
 infrastructure, tools and, yes, an IDE.
Let's say A, B, and C need to be done, but the core team can only work on one thing. If they work on A, people complain that B and C are not worked on. If they work on B, people complain that A and C are not worked on. And if they work on C, people complain that A and B are not being worked on.

In principle D could be the best language for many things, and people want their use case to be prioritized. But D doesn't have enough resources to work on all of that. Heck, we don't have people working on an std.io library, and some std libraries are sub-par. If you write your project from scratch and it doesn't depend on external libraries, then D is awesome, but for everything else it's just meh.

And about tools: this kind of captain-obvious advice is not welcome. First, it implies astronomical incompetence on the part of everyone who works on D, and second, it just wastes time. If you asked on the forum whether better tooling is useful for language growth, you would get almost everyone agreeing. It's not just that they agree with words; they agree with actions: https://dlang.org/blog/2018/07/13/funding-code-d/

From my personal experience, tooling is noticeably better than the last time I tried D.
 If you have worked with people (and you know a bit about 
 yourself) you don't need to read all that to know how people 
 feel and what motivates them.
Until you dive deep and find out that the last person you can trust is yourself. I have found that most people project their thinking and feelings onto others and call it understanding. If we narrow the scope to what motivates people to work for free, then one single reason might be that people work on things that they think are worthwhile. I don't know how to express that better. English is not my native tongue.
 What matters here is that you expect other people who worked 
 8h of boring stuff to go home and work on more boring stuff 
 for free. These kind of people are rare and you are not one of 
 them yourself.
Well, I used to contribute a bit. Bug reports, the odd PR for dlang and other simple stuff. I donated money twice (I think, or was it three times?). I was seriously thinking about contributing more
So you haven't worked on boring stuff after 8 hours of potentially boring stuff. That's what I tried to convey.
 but then

 1) I had to spend time fixing my own code due to dmd updates
From what I can tell, DMD has improved on this. I used to hear about new updates breaking people's code, and now most of what I hear is that updates used to break code in the past. My guess is that unit tests improved this, and they came mostly because the compiler was converted to D.
 Now you may discard all of my points above as nonsense, because 
 maybe they are not mentioned in papers about behavioral 
 neuroscience and psychology (I've actually read quite a bit 
 about behavioral biology), but those were my thoughts.
My intention with that post was not to discredit your points. It was a response to Walter's question. I have seen many game devs listen too much to surface problems without looking at the underlying core issue. Then they release a patch that doesn't make the game better. What I want people to take from your posts is that D needs more people working on more stuff. We don't need people pointing out issues that are already known; it just wastes people's time. What we need is a concrete action plan that is grounded in reality and doesn't have big flaws.
 There's loads, look at Linux distros and libraries that are 
 still used everywhere.
Like Ubuntu, Red Hat, and SUSE? Red Hat's annual revenue is $2.9 billion. That's with a B. Now, I can't find it, but I remember reading that pacman was maintained by a single person, and some people were angry that for a long time pacman didn't check download validity or something. If my examples were not what you had in mind, then stop being vague and be more concrete.

P.s. we might be the only ones left talking in this thread. We could talk about hentai and no one would be the wiser :D
Nov 28 2018
next sibling parent Aliak <something something.com> writes:
On Wednesday, 28 November 2018 at 20:09:20 UTC, welkam wrote:
 P.s. we might be the only one left talking in this thread. We 
 could talk about hentai and no one would be wiser :D
I heard that! :p
Nov 28 2018
prev sibling next sibling parent bauss <jj_1337 live.dk> writes:
On Wednesday, 28 November 2018 at 20:09:20 UTC, welkam wrote:
 P.s. we might be the only one left talking in this thread. We 
 could talk about hentai and no one would be wiser :D
Back in the conversation just for this.
Nov 28 2018
prev sibling parent Chris <wendlec tcd.ie> writes:
On Wednesday, 28 November 2018 at 20:09:20 UTC, welkam wrote:
 On Tuesday, 27 November 2018 at 08:52:07 UTC, Chris wrote:

 Lets say A, B and C needs to be done but core team can only 
 work on one thing. If they work on A people complain that B and 
 C is not worked on. If they worked on B people complain that A 
 and C is not worked on and if they worked on C people would 
 complain that A and B is not being worked on.

 In principle D could be best language for many things and 
 people want their use case to be prioritized. But D doesnt have 
 enough resources to work on all of that. Hack we dont have 
 people working on std.io library, some std libraries are 
 sub-par. If you write your project from scratch and it doesn't 
 depend on external libraries then D is awesome but for 
 everything else its just meh.

 And about tools. These kinds of captain obvious advises are not 
 welcomed. First they imply astronomical incompetence of 
 everyone who works on D and second they just waste time. If you 
 asked on forum if better tooling was useful for language growth 
 you will get almost everyone agreeing. Its not that they just 
 agree with words but they agree with actions
 https://dlang.org/blog/2018/07/13/funding-code-d/

 From my personal experience tooling is noticeably better than 
 last time I tired D.
My suggestion is one year of rehab and detox for D (open a DTox fork if you like):

- feature freeze
- fix old bugs
- do a general clean-up
- develop sound and stable tooling
- improve integration with existing technologies (Java / GraalVM / Android / iOS)
- release a stable version of D with LTS - i.e. establish a long-term contract between users and creators

The first three bullet points above are normal in software development. And the rest is common sense. Unfortunately, D is all about features and "showing C++" and whatnot. It sometimes seems to me like the boys want to play with whatever strikes their fancy at a given moment.
 So you havent worked on boring stuff after 8h of potential 
 boring stuff. Thats what I tried to convey.
Cos it wasn't worth it.
 but then

 1) I had to spend time fixing my own code due to dmd updates
From what I can tell DMD improved on this. I use to hear about new updates breaking peoples code and now most of what I hear is that updates used to break code in the past. My guess unit tests improved this and they came mostly because compiler was converted to D.
The problem is that you never know when it will hit you (again) as long as the core devs and community are trigger happy when it comes to new features. Your code must go to make room for feature X.
 Like Ubuntu, red hat and suse? Red hat annual revenue is $2.9 
 billion. That with a B. Now I cant find it but I remember 
 reading that pacman was maintained by single person and some 
 people were angry that for long time pacman didnt check 
 download validity or something. If my examples were not what 
 you thought than stop being vague and be more concrete.
Linux: Arch Linux, Manjaro. Manjaro, especially, started as a one-man project. SoX: to this day used in audio software all over the world, by professionals. Especially in the audio world there are many successful projects / libs.
 P.s. we might be the only one left talking in this thread. We 
 could talk about hentai and no one would be wiser :D
We've been found out!
Nov 29 2018
prev sibling parent JN <666total wp.pl> writes:
On Saturday, 24 November 2018 at 17:37:36 UTC, H. S. Teoh wrote:
 Even though Walter and Andrei serve as BDFL and visionary 
 leaders, they can no more order any of us to do anything than a 
 random stranger from the street can dictate to you how you 
 ought to spend your free time. They can't just "use our 
 resources" however they want, because this isn't a top-down 
 organization where the higher ups assign tasks to the lower 
 downs. This is a gathering of like-minded peers who contribute 
 as equals according to their interest and capacity because they 
 believe in the product.
The Vision document is overdue (https://wiki.dlang.org/Vision/2018H1 - "This document discusses the high-level vision for D with semestrial granularity. It is released in January and July of each year. Note that the goals presented are those the D leadership works on, explicitly fosters, or strongly believes are important for the success of the D language").

I'm sure more people would contribute if the vision and goals were better defined. Imagine someone has a great idea for improving the GC, but they aren't certain that the GC is going to stay in D in the future. Perhaps it will transition towards reference counting, a Rust-like solution, or manual memory management. Without design and vision, you might scare away contributors, because they don't want to put any larger effort into something that would get thrown into the trash a few months down the road.
Nov 24 2018
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/24/2018 7:19 AM, Chris wrote:
 Of course you cannot do everything alone. I never expected that. But ARM was 
 never really high on the agenda.
It cannot be high on the agenda without a self-motivated, competent person to work on it.
 But it is essential enough that it should have gotten a higher 
 priority than just to wait until someone stepped up.
I have a lot of history with people asking me "what can I work on?" I give them a list of suggestions, and they work on something not on that list, i.e. what they want to work on. The D community is a volunteer one. I cannot order anyone to work on anything. For example, you can step up and work to improve any aspect of D that matters to you.
 I see a lot of other things happening like the re-implementation of the D 
 compiler in D. Fine. But do I as a user really care if it's written in D or
C++?
Not directly, no. But there are a number of problems working on that code due to it being half in D, half in C++, such as:

1. C/C++ is just a clumsy language to work in when one is used to D.

2. Half in one language, half in the other, means you have to maintain two sets of declarations of the data structures, and there's hell to pay if they get out of sync.

3. There's lots of technical debt in the old backend. Redoing it in D makes that a lot easier to improve.

4. I don't have to deal with C/C++'s portability problems when it's in D. I've spent a stupid amount of time just dealing with size_t.
 I don't understand how things are prioritized in D. Basic and important things 
 seem to be at the bottom of the list (XML parser), other things get huge 
 attention while they are of dubious value to many users. This is why I don't 
 completely buy the "we don't have enough resources" argument. The scarce 
 resources you have are not used wisely in my opinion. And it is a pity when I 
 see that D has loads of potential (C/C++ interop, Objective-C interop etc.)
but 
 other new languages overtake D because they focus on practical issues too.
It's fairly straightforward. Things that get attention do so because a self-motivated and competent person decides to solve it. Commercial organizations that rely on D tend to spend money solving their problems with D, not others. The D Foundation does have some money to spend, but you may not realize just how much a competent compiler engineer costs!
Nov 24 2018
next sibling parent Joakim <dlang joakim.fea.st> writes:
On Sunday, 25 November 2018 at 06:56:10 UTC, Walter Bright wrote:
 On 11/24/2018 7:19 AM, Chris wrote:
 Of course you cannot do everything alone. I never expected 
 that. But ARM was never really high on the agenda.
It cannot be high on the agenda without a self-motivated, competent person to work on it.
 But it is essential enough that it should have gotten a higher 
 priority than just to wait until someone stepped up.
I have a lot of history with people asking me "what can I work on?" I give them a list of suggestions, and they work on something not on that list, i.e. what they want to work on. The D community is a volunteer one. I cannot order anyone to work on anything.
Sure, you don't have that hard power, but you have plenty of soft power, which you don't seem to be using well: https://en.m.wikipedia.org/wiki/Soft_power

Two big communication mistakes I've seen from you and Andrei that I've been banging on about for years now, which I think others are also alluding to:

1. Little to no clear communication on what the roadmap moving forward is, though the Vision statements have filled that gap somewhat. Of course, they seem to have stopped right when they were getting useful and less vague. You're right that many will just disregard your roadmap, but this ignores that there are others who are simply looking for a way to pitch in, like junior devs, and will gladly pick off something on a list you make. This is why it's important to provide concrete tasks, rather than vague goals like "improve safety," so they can dive in easily.

2. Little notice of what you two are actually working on now; PRs often just show up out of the blue. This prevents others from either avoiding work you're already doing or pitching in to help you out. Presumably you two have some kind of TODO list that you could just share publicly?

Another possibility in this vein is putting together a list of things you either don't want or would like to see in the language, like Swift's list of commonly rejected changes: https://github.com/apple/swift-evolution/blob/master/commonly_proposed.md

This is actually the hard power you do have, as the gatekeepers for dmd and Phobos. I understand that as an engineer this may all seem like unnecessary overhead, but if this project is ever to scale beyond where it's at now, it will have to be done.
Nov 25 2018
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Sunday, 25 November 2018 at 06:56:10 UTC, Walter Bright wrote:
 On 11/24/2018 7:19 AM, Chris wrote:
 Of course you cannot do everything alone. I never expected 
 that. But ARM was never really high on the agenda.
It cannot be high on the agenda without a self-motivated, competent person to work on it.
 But it is essential enough that it should have gotten a higher 
 priority than just to wait until someone stepped up.
I have a lot of history with people asking me "what can I work on?" I give them a list of suggestions, and they work on something not on that list, i.e. what they want to work on. The D community is a volunteer one. I cannot order anyone to work on anything. For example, you can step up and work to improve any aspect of D that matters to you.
 I see a lot of other things happening like the 
 re-implementation of the D compiler in D. Fine. But do I as a 
 user really care if it's written in D or C++?
Not directly, no. But there are a number of problems working on that code due to it being half in D, half in C++, such as: 1. C/C++ is just a clumsy language to work in when one is used to D. 2. Half in one language, half in the other, means you have to maintain two sets of declarations of the data structures, and there's hell to pay if they get out of sync. 3. There's lots of technical debt in the old backend. Redoing it in D makes that a lot easier to improve. 4. I don't have to deal with C/C++'s portability problems when it's in D. I've spent a stupid amount of time just dealing with size_t.
 I don't understand how things are prioritized in D. Basic and 
 important things seem to be at the bottom of the list (XML 
 parser), other things get huge attention while they are of 
 dubious value to many users. This is why I don't completely 
 buy the "we don't have enough resources" argument. The scarce 
 resources you have are not used wisely in my opinion. And it 
 is a pity when I see that D has loads of potential (C/C++ 
 interop, Objective-C interop etc.) but other new languages 
 overtake D because they focus on practical issues too.
It's fairly straightforward. Things that get attention do so because a self-motivated and competent person decides to solve it. Commercial organizations that rely on D tend to spend money solving their problems with D, not others. The D Foundation does have some money to spend, but you may not realize just how much a competent compiler engineer costs!
Sorry, but D tends to D-motivate people. One example is std.xml, another issue I've brought up over the years. You talk about highly motivated volunteers, and when you get one, what does s/he get? Review limbo forever, because everything else is more important than that volunteer's work: DIP1000, the C++-to-D transition, whatever. And when people ask about std.xml or any other work in limbo, the answer is "The person didn't push hard enough!" That's cynical. How is that supposed to motivate anyone?

Do you and Andrei sometimes think about how this sort of behavior is perceived by users? Believe me, it doesn't go unnoticed, and it makes you think twice about contributing. It's normal psychology. And it's a rinse-and-repeat pattern in the D world.

Guess why - and this speaks volumes - Jonathan M. Davis couldn't be bothered to put dxml through a review process? Jonathan told us that he wasn't "too enthusiastic" about it. Why wasn't the new std.xml ever reviewed + accepted back in the day?

Contribute! You may not know, but D users are usually too busy finding workarounds for bugs, fixing broken code and coming up with their own solutions for basic things. I wrote a small XML parser for my own specific needs, because I couldn't wait anymore. We've kept your and our dream alive by working around difficulties ourselves in the hope that some day things would improve. But in vain.
Nov 25 2018
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 26/11/2018 1:58 AM, Chris wrote:
 Guess why - and this speaks volumes - Jonathan M. Davis couldn't be 
 bothered to put dxml through a review process? Jonathan told us that he 
 wasn't "too enthusiastic" about it. Why wasn't the new std.xml ever 
 reviewed + accepted back in the day?
If you're referring to std.experimental.xml, simply: it was horrible code that wasn't complete, and I did try to improve upon it. It just wasn't good D code. It wouldn't pass review even if it were feature-complete.
Nov 25 2018
parent reply Chris <wendlec tcd.ie> writes:
On Sunday, 25 November 2018 at 13:04:41 UTC, rikki cattermole 
wrote:
 On 26/11/2018 1:58 AM, Chris wrote:
 Guess why - and this speaks volumes - Jonathan M. Davis 
 couldn't be bothered to put dxml through a review process? 
 Jonathan told us that he wasn't "too enthusiastic" about it. 
 Why wasn't the new std.xml ever reviewed + accepted back in 
 the day?
If you're referring to std.experimental.xml, simply: it was horrible code that wasn't complete, and I did try to improve upon it. It just wasn't good D code. It wouldn't pass review even if it were feature-complete.
I don't know if it's the same code that was written for std.xml(2). But why does it take so long to even reject something? Normally you would say "Not good enough" and announce that you need a better module (with specs). Or why not take Jonathan's stuff (which I understand is very good) and integrate it without him having to "push it"? Successful companies do that: they take promising stuff, clean it up and improve it. You cannot wait until a volunteer has got it "perfect". It might happen that you have something that's 90% there and then it's just binned / abandoned, because the volunteer didn't put in the last 10%. And then the volunteer is to blame. And maybe the poor volunteer couldn't finish the work because s/he had to wait for feature X to be implemented.
Nov 25 2018
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 26/11/2018 3:04 AM, Chris wrote:
 On Sunday, 25 November 2018 at 13:04:41 UTC, rikki cattermole wrote:
 On 26/11/2018 1:58 AM, Chris wrote:
 Guess why - and this speaks volumes - Jonathan M. Davis couldn't be 
 bothered to put dxml through a review process? Jonathan told us that 
 he wasn't "too enthusiastic" about it. Why wasn't the new std.xml 
 ever reviewed + accepted back in the day?
If you're referring to std.experimental.xml, simply: it was horrible code that wasn't complete, and I did try to improve upon it. It just wasn't good D code. It wouldn't pass review even if it were feature-complete.
I don't know if it's the same code that was written for std.xml(2). But why does it take so long to even reject something? Normally you would say "Not good enough" and you'd announce that you need a better module (with specs). Or why not take Jonathan's stuff (which I understand is very good) and integrate it without him having to "push it". Successful companies do that. They take promising stuff, clean it up and improve it. You cannot wait until a volunteer has got it "perfect". It might happen that you have something that's 90% there and then it's just binned / abandoned, because the volunteer didn't put in the last 10%. And then the volunteer is to blame. And maybe the poor volunteer couldn't finish the work, because s/he had to wait for feature X to be implemented.
It doesn't matter who does it. It's still all volunteer work, even if there is a bit of compensation to go with it. Jonathan's XML library is good within its scope. But it definitely isn't XML-compliant, because its scope says so. There is nothing wrong with this, but there are problems for this to go into Phobos as it is today. We have been bitten in the past by code that wasn't scoped properly, so it should be a lot harder to have things accepted. Just a shame we don't really have the money to throw at a team to do libraries like this, to give people the options they need to fit both with D and with their external requirements (like specs).
Nov 25 2018
parent bachmeier <no spam.net> writes:
On Sunday, 25 November 2018 at 14:20:25 UTC, rikki cattermole 
wrote:

 Just a shame we don't really have the money to throw at a team 
 to do libraries like this. To give people the options they need 
 to both fit with D and with their external requirements (like 
 specs).
Thing is, there is already a team of good developers willing to volunteer their time. What would be different, and would cause the work to get done if it were paid out of limited funds, is that there would be a decision made to do X, Y, and Z, and when it was done, it would become part of D. The current system is "go ahead and do something and if we like it we might ask you to make a bunch of changes and in a few years there's a small chance it'll get accepted." Lack of money is not the problem.
Nov 25 2018
prev sibling next sibling parent reply Neia Neutuladh <neia ikeran.org> writes:
On Fri, 23 Nov 2018 10:25:57 +0000, Joakim wrote:
 Why hasn't ruby/rails, Rust, or Nim gotten backing from big players yet?
Mozilla's 2016 revenue was half a billion dollars. I would certainly hope that's big enough to count as a big player.
Nov 23 2018
parent reply Joakim <dlang joakim.fea.st> writes:
On Friday, 23 November 2018 at 15:48:13 UTC, Neia Neutuladh wrote:
 On Fri, 23 Nov 2018 10:25:57 +0000, Joakim wrote:
 Why hasn't ruby/rails, Rust, or Nim gotten backing from big 
 players yet?
Mozilla's 2016 revenue was half a billion dollars. I would certainly hope that's big enough to count as a big player.
Sociomantic was over $100 million in revenue in 2013 before they got bought, according to this 2014 press release: "Sociomantic Labs GmbH... employs more than 200 professionals in 16 offices worldwide with over $100 million in revenue in 2013" https://www.dunnhumby.com/dunnhumby-acquires-sociomantic-revolutionise-digital-advertising So do we already have a big player backing D? ;) Of course, both are tiny compared to Google or Apple, who're backing Go and Swift.
Nov 23 2018
next sibling parent Dukc <ajieskola gmail.com> writes:
On Friday, 23 November 2018 at 15:57:27 UTC, Joakim wrote:
 "Sociomantic Labs GmbH... employs more than 200 professionals 
 in 16 offices worldwide with over $100 million in revenue in 
 2013"
 https://www.dunnhumby.com/dunnhumby-acquires-sociomantic-revolutionise-digital-advertising

 So do we already have a big player backing D? ;)
And, isn't Symmetry Investments even bigger?
Nov 23 2018
prev sibling parent reply Neia Neutuladh <neia ikeran.org> writes:
On Fri, 23 Nov 2018 15:57:27 +0000, Joakim wrote:
 On Friday, 23 November 2018 at 15:48:13 UTC, Neia Neutuladh wrote:
 On Fri, 23 Nov 2018 10:25:57 +0000, Joakim wrote:
 Why hasn't ruby/rails, Rust, or Nim gotten backing from big players
 yet?
Mozilla's 2016 revenue was half a billion dollars. I would certainly hope that's big enough to count as a big player.
Sociomantic was over $100 million in revenue in 2013 before they got bought, according to this 2014 press release: "Sociomantic Labs GmbH... employs more than 200 professionals in 16 offices worldwide with over $100 million in revenue in 2013" https://www.dunnhumby.com/dunnhumby-acquires-sociomantic-revolutionise-digital-advertising
 
 So do we already have a big player backing D? ;) Of course, both are
 tiny compared to google or Apple, who're backing Go and Swift.
Sociomantic is an ad company making 0.5% as much as Google, which is an ad company plus a lot of other things that suggest a much broader developer focus. If Sociomantic does spend as much proportionately on developer / community stuff as Google, we should expect 0.5% as much benefit. Mozilla is a company that serves developers as a huge portion of its purpose. So while it's got about 2% of the revenue of Google, it spends a lot more proportionately on developer-oriented stuff.
Nov 23 2018
parent Joakim <dlang joakim.fea.st> writes:
On Friday, 23 November 2018 at 17:16:42 UTC, Neia Neutuladh wrote:
 On Fri, 23 Nov 2018 15:57:27 +0000, Joakim wrote:
 On Friday, 23 November 2018 at 15:48:13 UTC, Neia Neutuladh 
 wrote:
 On Fri, 23 Nov 2018 10:25:57 +0000, Joakim wrote:
 Why hasn't ruby/rails, Rust, or Nim gotten backing from big 
 players yet?
Mozilla's 2016 revenue was half a billion dollars. I would certainly hope that's big enough to count as a big player.
Sociomantic was over $100 million in revenue in 2013 before they got bought, according to this 2014 press release: "Sociomantic Labs GmbH... employs more than 200 professionals in 16 offices worldwide with over $100 million in revenue in 2013" https://www.dunnhumby.com/dunnhumby-acquires-sociomantic-revolutionise-digital-advertising
 
 So do we already have a big player backing D? ;) Of course, 
 both are tiny compared to google or Apple, who're backing Go 
 and Swift.
Sociomantic is an ad company making 0.5% as much as Google, which is an ad company plus a lot of other things that suggest a much broader developer focus. If Sociomantic does spend as much proportionately on developer / community stuff as Google, we should expect 0.5% as much benefit. Mozilla is a company that serves developers as a huge portion of its purpose. So while it's got about 2% of the revenue of Google, it spends a lot more proportionately on developer-oriented stuff.
That makes no sense: all three are primarily geared towards their customers, i.e. advertisers who want to sell stuff to you. But of the three, Google is the one that serves devs the most as a proportion of their business, not Mozilla, with their development platforms and SDKs like Android, Google Cloud, Google Assistant, etc.
Nov 23 2018
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 23 November 2018 at 10:25:57 UTC, Joakim wrote:
 On Wednesday, 21 November 2018 at 14:38:07 UTC, Chris wrote:
 On Wednesday, 21 November 2018 at 13:26:34 UTC, Joakim wrote:
[...] Why hasn't ruby/rails, Rust, or Nim gotten backing from big players yet? Most things don't get backing from big players, especially initially. What you hope is to create a superior tool that helps small companies grow into the big players someday. [...]
They have - unless you haven't been paying attention to the news, it seems.

Sun invested into JRuby and NetBeans support; while NetBeans support was dropped under Oracle's stewardship, they kept investing into JRuby. In fact JRuby is what drives most of the Graal optimizations regarding compilation of dynamic languages, and it was the genesis of Project Panama, JNI's replacement project.

Besides being sponsored by Mozilla, Rust is now used in Visual Studio Code, the IoT Core native layer and an internal distributed service at Microsoft. There are also ongoing projects from Oracle and Dropbox. Khronos is working together with Mozilla on a multi-platform 3D API framework, which is built with Rust. Four game studios, namely Ready at Dawn, Chucklefish, SEED and Embark (former DICE/EA devs), have announced that they are building their future tooling and engine improvements with Rust. Rust was given the spotlight alongside C and C++ at the Chrome Developer Summit 2018 regarding the languages currently mature for WebAssembly development. GNOME is adopting Rust and collaborating with Mozilla to improve the overall development experience when dealing with the GObject OOP model.

Nim just got some support from Status, one of the companies behind Ethereum, but I guess it isn't a major player. So there is some support going on from them.

-- Paulo
Nov 23 2018
next sibling parent Chris <wendlec tcd.ie> writes:
On Friday, 23 November 2018 at 19:51:23 UTC, Paulo Pinto wrote:
 On Friday, 23 November 2018 at 10:25:57 UTC, Joakim wrote:
 [...]
They have, unless you aren't paying attention to the news it seems. [...]
Quod erat demonstrandum. Obrigado.
Nov 23 2018
prev sibling parent Joakim <dlang joakim.fea.st> writes:
On Friday, 23 November 2018 at 19:51:23 UTC, Paulo Pinto wrote:
 On Friday, 23 November 2018 at 10:25:57 UTC, Joakim wrote:
 On Wednesday, 21 November 2018 at 14:38:07 UTC, Chris wrote:
 On Wednesday, 21 November 2018 at 13:26:34 UTC, Joakim wrote:
[...] Why hasn't ruby/rails, Rust, or Nim gotten backing from big players yet? Most things don't get backing from big players, especially initially. What you hope is to create a superior tool that helps small companies grow into the big players someday. [...]
They have - unless you haven't been paying attention to the news, it seems. Sun invested into JRuby and NetBeans support; while NetBeans support was dropped under Oracle's stewardship, they kept investing into JRuby. In fact JRuby is what drives most of the Graal optimizations regarding compilation of dynamic languages, and it was the genesis of Project Panama, JNI's replacement project. Besides being sponsored by Mozilla, Rust is now used in Visual Studio Code, the IoT Core native layer and an internal distributed service at Microsoft. There are also ongoing projects from Oracle and Dropbox. Khronos is working together with Mozilla on a multi-platform 3D API framework, which is built with Rust. Four game studios, namely Ready at Dawn, Chucklefish, SEED and Embark (former DICE/EA devs), have announced that they are building their future tooling and engine improvements with Rust. Rust was given the spotlight alongside C and C++ at the Chrome Developer Summit 2018 regarding the languages currently mature for WebAssembly development. GNOME is adopting Rust and collaborating with Mozilla to improve the overall development experience when dealing with the GObject OOP model. Nim just got some support from Status, one of the companies behind Ethereum, but I guess it isn't a major player. So there is some support going on from them. -- Paulo
I was aware of most of those efforts, but I wouldn't call any of them big players like Chris wanted, nor those Microsoft efforts much of a "backing." We all pay attention when giant corporations really back an OSS language, as Apple has done with Swift, because that means we get a lot of free dev tools to play with ;) and these languages don't have that.

Anyway, my intention was not to put those languages down, only to point out that even fairly successful or well-known languages like these never got a big player really backing them. As I've pointed out before with my Linus quotes about how Linux was built, I think it's much better for OSS tech to have many small or medium backers putting it to many different kinds of uses rather than one or two big backers, who tend to use it in a certain way; as Linus says, that tends to make the tech less specialized and more likely to survive over the long haul: https://yarchive.net/comp/evolution.html
Nov 23 2018
prev sibling parent reply Laeeth Isharc <laeeth kaleidic.io> writes:
On Wednesday, 21 November 2018 at 01:09:30 UTC, NoMoreBugs wrote:
 On Tuesday, 20 November 2018 at 23:50:56 UTC, Laeeth Isharc 
 wrote:
 Evidently you don't see yourself as part of the D community 
 from your phrasing.  That's an assertion and we are all 
 entitled to our opinions but to be persuasive reasoned 
 arguments are often more effective. What you say is the 
 opposite of my experience as well as basic commercial common 
 sense.
I understand the psychological basis of that assertion and the reaction you want to get from those who read it (to dismiss/ignore me because I'm an outsider)
I'm just observing what is implied by your words.
 But your logic and your assertion is misguided.

 I don't see myself as a part of any 'language' community.
There we go. So we agree about your self-conception. It's a positive statement, not a normative one.
 This is where we seem to differ, a lot.

 A programming language for me is a tool to an end.

 Its serves me. I do not serve it - or its community.
Fair enough. We each have different kinds of relationships with different entities in the world. Some people don't want a relationship with their bookstore; others in different contexts may feel differently. And it's the same with the other users and contributors to a language. A perfectly voluntarist society, where each may choose as she wishes. That being said, one also cannot avoid considering that different choices may bring different consequences.
 Its just a tool - that all it is.

 You don't build communities around a 'hammer' or a 'spanner'.
Because these are rather mature sorts of tools that don't benefit much from user engagement in their development, and people who use hammers really don't have that much in common or much to say to each other. On the other hand, there are certainly communities around open-source machine tools, and those are fairly simple things compared to D.
 It's not unreasonable that I give feedback on how that tool can 
 better serve me.
Indeed. And as I say, we can choose how we wish to engage with different entities in the world, but choices bring consequences. Most likely your attitude will lead to polite engagement to the extent that you say something insightful or interesting; but to the extent that you are just insisting the world adapt itself to what you personally want, the most likely ultimate consequence, in time, is that people politely ignore you. In my experience the best way to get value is to give value. Feel free to ignore this view, but I really wonder why you think others would want to do things without making it interesting in some way for them to do so.
 We also seem to differ on what a 'contributor' is.
I use the word in the ordinary English sense. Did you contribute to Phobos, dmd or code.dlang.org? I'm really curious. If not, how is it that you believe you contribute?
 To me, the focus is always on the user, and their needs, not on 
 the language and its needs. I think our views really differ 
 here too.
I can see that you feel people should be more focused on what you want and what you believe others may want. That's only human, but you know, to persuade others you need ethos, pathos or logos. The only way you can achieve anything in open source is via persuasion. People have quite different needs. I'm a reasonable and growing commercial user of D, yet I wouldn't dream of telling anyone here that D needs to do this or that to satisfy its users, because who am I to know for sure what other users might find valuable, and because it's a terribly ineffective way to persuade people. It's by far easier to start with what's concrete and explain what one is trying to do and how the absence of some feature is holding one back. I personally see myself as being part of a language community even though I am involved in quite a tough and hard-nosed business. Or I should say because. At one level, I like to be involved, and at another, I've found it profitable in life to do the things I like to do, because emotions, well-trained, are a guidance system. And it turns out it's also commercially smart to do so, although I never really planned for that. Ethos, pathos and logos. If people aren't as engaged with what you want as you wish, it's just possible that your chosen means don't suit your ends.
 I also believe the language designer(s) are ultimately 
 responsible for the mistakes that programmers continually make 
 (and yes, here I'm paraphrasing someone who is well known and 
 well respected in the computer science field).
 I don't want a faulty hammer or spanner in my toolbox.
Wow, who would want to be a language designer? Did you try returning Visual Studio to Microsoft on the grounds that it had bugs? Somebody made some terrible decisions some years back with C++, and even decades later these most basic mistakes have never been fixed, even though Microsoft sponsors talks about all the problems created by them. Describing a programming language as faulty is not even wrong.
 A very small number of minds working closely together can be 
 creative and design something beautiful, or have a chance to 
 do so.  A committee, notoriously, is a machine for suppressing 
 creativity.
The proof of your argument needs evidence.
No, it doesn't, because I'm not making an argument - just sharing some insights that you may do with as you will. One may lead a horse-puppet to water, but not make it drink.
 D has had 10 years (since D2) of 'creativity time', and much 
 longer than that in reality.
Well, yes, going back decades to BCPL and before. But I really don't see your point, because I think your categories are wrong. I don't think languages have "creativity time". They develop at their own rate, and the pace at which other, quite different languages develop is an interesting piece of information; but I really think one must make an explicit argument rather than just assume that things should all develop at the same pace. There is no reason whatsoever that should be true.
 Look at what the C++ committee has been able to accomplish in 
 the same amount of time.
Haha. That's very entertaining. Where are modules? CTFE and static if? Deprecation of mistakes with a schedule for removing them?
 I don't object to creative endeavors. It's what makes life 
 worthwhile.

 But after 18 years, is that what D (still) is?

 Or is it a serious tool that serious programmers should take 
 seriously.
There are two kinds of people - those who think the world is characterised by black-and-white dichotomies, and everyone else, haha. I think you would need to make an argument for why these two categories are opposed, but you have yet to do so. Yes, contributing to D can be a creative endeavour, if that's what you want. Yes, D is a tool that you can build a serious business on, if that's what you want; some people did want that and have done so. It's rather sad to see someone thinking that a serious business can't be creative - don't tell anyone, but it really can, you know. I'm presuming you have studied the commercial uses of D that are in the public domain and that you will come back with something intelligent; otherwise we are both wasting each other's time, since no serious conversation is possible with someone oblivious to reality.
Nov 21 2018
next sibling parent reply NoMoreBugs <NoMoreBugs gmail.com> writes:
On Thursday, 22 November 2018 at 02:56:14 UTC, Laeeth Isharc 
wrote:
 And most likely your attitude will lead to polite engagement to 
 the extent you say something insightful or interesting but to 
 the extent you are just insisting the world adapt itself to 
 what you personally want, the most likely ultimate consequence 
 in time is that people politely ignore you.
I just expressed an opinion. That's all I did. You are turning this into a completely *absurd* discussion *about me*. Lay off!
Nov 21 2018
parent reply Laeeth Isharc <laeeth kaleidic.io> writes:
On Thursday, 22 November 2018 at 03:06:17 UTC, NoMoreBugs wrote:
 On Thursday, 22 November 2018 at 02:56:14 UTC, Laeeth Isharc 
 wrote:
 And most likely your attitude will lead to polite engagement 
 to the extent you say something insightful or interesting but 
 to the extent you are just insisting the world adapt itself to 
 what you personally want, the most likely ultimate consequence 
 in time is that people politely ignore you.
I just expressed an opinion. That's all I did. You are turning this into an completely *absurd* discussion *about me*. Lay off!
I didn't insult you or attack you, but simply observed that your means weren't suited to your seeming ends and explained why. It's really your choice as to what to make of one perspective on the effectiveness of your behaviour.
Nov 21 2018
parent reply NoMoreBugs <NoMoreBugs gmail.com> writes:
On Thursday, 22 November 2018 at 04:18:25 UTC, Laeeth Isharc 
wrote:
 I didn't insult you or attack you, but simply observed that 
 your means weren't suited to your seeming ends and explained 
 why.  It's really your choice as to what to make of one 
 perspective on the effectiveness of your behavior.
Your posts (which are there for anyone to read) sought to undermine an opinion that I expressed, by explicitly encouraging others who read that opinion to not take it seriously, and to just ignore it, simply on the basis of whether or not I was really part of the 'D community' and one of the 'committed users of D'.

If that is not a personal attack, what is? And is that really the threshold for stating an opinion on these forums?

As I said, next time, you can save yourself a lot of typing, and just type "Wow".

In the 2 weeks so far, I've been accused (by a select few - always the same few though) of being a 'non-contributor', of misrepresenting facts, and of being a sock puppet.

Welcome to the D forum.
Nov 21 2018
next sibling parent reply Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Thursday, 22 November 2018 at 04:40:37 UTC, NoMoreBugs wrote:

 In the 2 weeks so far, I've been accused (by a select few - 
 always the same few though) of being a 'non-contributor', of 
 misrepresenting facts, and being a sock puppet.
- Honey, be careful. They say on the news there's a lunatic speeding in the wrong lane. - A lunatic?!? There's loads of them!..
Nov 21 2018
parent NoMoreBugs <NoMoreBugs gmail.com> writes:
On Thursday, 22 November 2018 at 05:19:25 UTC, Stanislav Blinov 
wrote:
 On Thursday, 22 November 2018 at 04:40:37 UTC, NoMoreBugs wrote:

 In the 2 weeks so far, I've been accused (by a select few - 
 always the same few though) of being a 'non-contributor', of 
 misrepresenting facts, and being a sock puppet.
- Honey, be careful. They say on the news there's a lunatic speeding in the wrong lane. - A lunatic?!? There's loads of them!..
That's funny, 'cause every day I drive to work and the vast majority of people seem to speed past me (even though I drive at the speed limit). Either sane people are now the minority, or the minority are the actual lunatics. i.e. I can't work out whether I'm the sane driver (for not speeding), or the lunatic (for not speeding like everyone else).

I might do on these forums what I have to do every day, and just pull over for a while ;-)
Nov 22 2018
prev sibling next sibling parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Thursday, 22 November 2018 at 04:40:37 UTC, NoMoreBugs wrote:
 On Thursday, 22 November 2018 at 04:18:25 UTC, Laeeth Isharc 
 wrote:
 I didn't insult you or attack you, but simply observed that 
 your means weren't suited to your seeming ends and explained 
 why.  It's really your choice as to what to make of one 
 perspective on the effectiveness of your behavior.
Your posts (which are there for anyone to read) sought to undermine an opinion that I expressed, by explicitly encouraging others who read that opinion to not take it seriously, and to just ignore it, simply on the basis of whether or not I was really part of the 'D community' and one of the 'committed users of D'. If that is not a personal attack, what is? And is that really the threshold for stating an opinion on these forums? As I said, next time, you can save yourself a lot of typing, and just type "Wow". In the 2 weeks so far, I've been accused (by a select few - always the same few though) of being a 'non-contributor', of misrepresenting facts, and being a sock puppet. Welcome to the D forum.
I'm sorry this has been your introduction and perception of the D forums, and I suppose you deserve a better explanation of some of the initial hostilities you may have faced.

We have recently had someone who has expressed some ... rather strong opinions ... about the ideals of encapsulation of the class vis-à-vis the module, who was posting under multiple pseudonyms and impersonating others and generally acting in a manner the community deemed inappropriate. After a few weeks you start posting on this forum (IIRC) expressing some similar opinions (although not as strongly and not belligerently) and I presume that others on the forums thought you were "that guy" again (I did, and I'm pretty sure Neia thinks you are as well, hence the "4-5 aliases").

It is now clear to me that you are not (because you are saying things that are sensible!), although you are getting a bit more worked up than I think you should (not that it's less than par for this forum unfortunately...) and I would encourage you to take a less personal attitude to the responses of others as they are (mostly) trying to help. Laeeth's responses are certainly not meant to come across as condescending; he just has a slightly different view of the world (being in finance and all) and usually has quite interesting viewpoints and things to say.
Nov 21 2018
prev sibling parent reply Dave Jones <not me.again> writes:
On Thursday, 22 November 2018 at 04:40:37 UTC, NoMoreBugs wrote:
 On Thursday, 22 November 2018 at 04:18:25 UTC, Laeeth Isharc 
 wrote:
 I didn't insult you or attack you, but simply observed that 
 your means weren't suited to your seeming ends and explained 
 why.  It's really your choice as to what to make of one 
 perspective on the effectiveness of your behavior.
Your posts (which are there for anyone to read) sought to undermine an opinion that I expressed, by explicitly encouraging others who read that opinion to not take it seriously, and to just ignore it, simply on the basis of whether or not I was really part of the 'D community' and one of the 'committed users of D'. If that is not a personal attack, what is?
That's not what he did. He commented on what you said, he didn't personally attack you.
 And is that really the threshold for stating an opinion on 
 these forums?
You can say what you want no problem. But if you want people to listen to you then usually you have to earn that, it's normal human behaviour. We put more and less value on what people say for a variety of reasons.
 As I said, next time, you can save yourself a lot of typing, 
 and just type "Wow".
Laeeth is a nice guy, he tries to politely engage with newbies and ease them into a more broad minded understanding of how things are around here (by here I mean the cosmos). I think he's wasting his time but he's a nice guy so he keeps trying.
 In the 2 weeks so far, I've been accused (by a select few - 
 always the same few though) of being a 'non-contributor', of 
 misrepresenting facts, and being a sock puppet.
You are a non-contributor. You have misrepresented things (at the least Laeeth's last post). You might be a sock puppet. I am, although I'm a sock puppet of a sock puppet of a sock puppet. In fact it's socks all the way down man!
 Welcome to the D forum.
Welcome to the internet dude. Chill out, stop taking everything personally. And get to grips with the fact that D is a community effort, most people have their own furrow to plough, there ain't really anyone here whose job it is to tuck the newbies in at night.

Also you have to accept that like who the fuck are you? I mean really, you pop up, weird name, anonymous, easily offended (like how old are you?), can't handle people disagreeing with you, and you still expect people to value your opinion in spite of all that?

Look mate most people will ignore you, some of the nice guys will try to be nice to you, some of the bored loonies will poke you to get a reaction.. (wait that's me!), but at the end of the day wtf do you expect?
Nov 22 2018
parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 22/11/2018 10:44 PM, Dave Jones wrote:
 Welcome to the internet dude. Chill out, stop taking everything 
 personally. And get to grips with the fact that D is a community effort, 
 most people have their own furrow to plough, there aint really anyone 
 here who's job it is to tuck the newbies in at night.
 
 Also you have to accept that like who the fuck are you? i mean really, 
 you pop up, weird name, anonymous, easily offended (like how old are 
 you?), cant handle people disagreeing with you, and you still expect 
 people to value your opinion in spite of all that?
 
 Look mate most people will ignore you, some of the nice guys will try to 
 be nice to you, some of the bored loonies will poke you to get a 
 reaction.. (wait thats me!), but at the end of the day wtf do you expect?
That is quite enough, no need to respond if you won't be civil in response.
Nov 22 2018
parent Dave Jones <not me.again> writes:
On Thursday, 22 November 2018 at 10:24:44 UTC, rikki cattermole 
wrote:
 On 22/11/2018 10:44 PM, Dave Jones wrote:
 Welcome to the internet dude. Chill out, stop taking Look mate 
 most people will ignore you, some of the nice guys will try to 
 be nice to you, some of the bored loonies will poke you to get 
 a reaction.. (wait thats me!), but at the end of the day wtf 
 do you expect?
That is quite enough, no need to respond if you won't be civil in response.
Jeez I bet you're a hoot after a few beers. ;)
Nov 22 2018
prev sibling parent NoMoreBugs <NoMoreBugs gmail.com> writes:
On Thursday, 22 November 2018 at 02:56:14 UTC, Laeeth Isharc 
wrote:
 .....
and btw. the next time you disagree with something someone has said, instead of attacking them personally, you could try "Wow", or "Ok". Try it next time..it's actually easy and requires much less typing. https ... youtu.be/Lo6Q2vB9AAg?t=1460 How do you insert a link in these posts anyway?
Nov 21 2018
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sat, Nov 17, 2018 at 02:07:45AM +0000, Laeeth Isharc via Digitalmars-d wrote:
[...]
 (it's fascinating on these kinds of threads the number of accounts
 that pop up who I never recall having posted before, whatever that may
 mean)
When an unusually large number of new accounts suddenly pop up, all pretty much echoing the same opinions with the same tone and writing style, it's usually an indication that sockpuppeteering is going on. *shrug* That's just what some people do with the convenient anonymity of the internet. It's a practice as old as Usenet, and I've learned to just ignore it. [...]
 Its a particular subset that post on the forums at all, and plenty of
 people active on the forums don't express themselves on these kinds of
 threads.  A subset of a subset.
 
 I know a few people active in the development of the language and
 library that don't say much on the forums because they feel making
 pull requests are more productive.
Also, sometimes these forum discussions just devolve into mud-flinging and/or never-ending reiteration of opinions with no forward progress, so after a while it becomes clear that it's just a waste of time and energy that could have been better spent elsewhere. Like submitting PRs to make D better. [...]
 And to return to an old point.  It's much better to focus on people
 that like what you are doing and already using your product than those
 who say "if only you would do X, D would be huge".  That's the nature
 of the innovator's dilemma and also if one is to be persuasive then
 it's helpful to remember that talk is cheap, whereas making a closely
 reasoned argument accompanied by skin in the game - now that is much
 more persuasive.
+1. When it becomes clear to me that a particular debate participant has no vested interest in D, I usually find myself very disinclined to respond. Talk is cheap for a bystander who can just idly point out all of your flaws yet without lifting a finger himself. Words mean so much more when the person is actually writing D code and facing actual issues in actual, for-real code. Or better yet, actually contributing to D and pointing out issues that he discovered in the process. T -- This sentence is false.
Nov 16 2018
next sibling parent NoMoreBugs <NoMoreBugs NoMoreBugs.com> writes:
On Saturday, 17 November 2018 at 07:05:58 UTC, H. S. Teoh wrote:
 Talk is cheap for a bystander who can just idly point out all 
 of your flaws yet without lifting a finger himself.
This is a really awful statement. I recall Walter getting offended by something recently, but this is far worse.

What you are essentially saying is that the people who seem like 'new users' that have been involved in this discussion are just 'bystanders unwilling to lift a finger', and perhaps even all one person creating lots of fake accounts?

First, a user is someone who spends their time learning to use and understand D. That is time taken from their life, which cannot be taken back. So just calling them 'bystanders' because they might not be involved in PRs is really offensive - but is pretty typical of what 'new users' have to experience on these forums.

Second, having read back over all the posts in this thread, I see no evidence at all of sockpuppeting. What is the basis for your wild accusation? New users expressing their opinion - or ones that you don't like?

If anyone has ruined this discussion, it was you, with your post.
Nov 16 2018
prev sibling parent NoMoreBugs <NoMoreBugs gmail.com> writes:
On Saturday, 17 November 2018 at 07:05:58 UTC, H. S. Teoh wrote:
 When an unusually large number of new accounts suddenly pop up, 
 all pretty much echoing the same opinions with the same tone 
 and writing style, it's usually an indication that 
 sockpuppeteering is going on. *shrug*  That's just what some 
 people do with the convenient anonymity of the internet.  It's 
 a practice as old as Usenet, and I've learned to just ignore it.
btw. do you know what 'meatpuppetry' is ?
Nov 17 2018
prev sibling parent Bastiaan Veelo <Bastiaan Veelo.net> writes:
On Friday, 16 November 2018 at 20:00:30 UTC, Aurélien Plazzotta 
wrote:
 There are no jobs in D :)
I'm pretty sure you are mistaken both on the supply and demand side.
I don't understand, what do you imply?
 And we are still hiring.
Nice but for which tech?
https://forum.dlang.org/post/dseedvywcitpzysxysme forum.dlang.org
Nov 16 2018
prev sibling next sibling parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Thursday, November 15, 2018 7:02:26 PM MST H. S. Teoh via Digitalmars-d 
wrote:
 On Thu, Nov 15, 2018 at 06:03:37PM -0700, Jonathan M Davis via
 Digitalmars-d wrote: [...]

 *sigh* Honestly, auto-decoding is almost a perfect storm of issues for
 us being able to actually get rid of it. So, while I agree with you
 that we'd ideally fix the problem, it's _not_ an easy one to fix, and
 really the only "easy" way to fix it is to pretty much literally say
 "D3" and hard break all code. I think that the reality of the matter
 is that there are issues in every language that you can't fix without
 either creating a new language or creating a new version of the
 language that's not backwards compatible with the old one (which then
 forks the language and community).  So, while we'd very much like to
 fix everything, there are going to be some things we simply can't fix
 if we're not willing to create D3, and talking about D3 creates a
 whole other can of worms, which I don't think we're even vaguely ready
 for yet.
Talking about D3 has sorta become taboo around here, for understandable reasons -- splitting the community now might very well be the death of D after that Tango vs. Phobos fiasco. Python survived such a transition, and Perl too AIUI. But D currently does not have nearly the size of Python or Perl to be able to bear the brunt of such a drastic change.

Nevertheless I can't help wondering if it would be beneficial to one day sit down and sketch out D3, even if we never actually implement it. It may give us some insights on the language design we should strive to reach, based on the experience we have accumulated thus far.

Autodecoding, even though it's a commonly mentioned example, actually is only a minor point as far as language design is concerned. More fundamental issues could be how to address the can of worms that 'shared' has become, for example, or what the type system might look like if we were to shed the vestiges of C integer promotion rules.
Honestly, over time, I've become increasingly convinced that the more radical ideas would be incredibly undesirable (e.g. making const or immutable the default).

shared has some rough edges that need to be sorted out, but I don't think that it's fundamentally broken as things stand. I think that the issue is more that it's misunderstood, and its proper use has not really been messaged well - with the related problem being that the core synchronization components in druntime have not been entirely properly updated to take shared into account like they should have been, mostly because no one wanted to mess with them, because they had the idea that shared was largely unfinished and might change drastically later. So, while shared's implementation needs some tweaks to be sure, I'm not the least bit convinced that it needs a serious overhaul on the language front so much as some work on the library front and an overhaul on the PR front.

In any case, I think that _most_ of the things that should go in something like D3 can be done in D2 so long as Walter and Andrei can be convinced. For instance, we totally _could_ fix the nonsense about treating bool as an integer type in D2. There's nothing about that that requires D3. Unfortunately, of course, Walter and Andrei weren't convinced by the DIP that would effectively have fixed that by removing the implicit conversions from integer literals to bool, so that's not happening in D2 unless something drastic changes, and as such, I see no reason to expect that it would happen in D3.

And I think that most of the potential changes are like that. Maybe Walter and Andrei can be convinced of a particular change, maybe not. But each change can be tackled individually, and for the most part, I think that the ones that have real value can be made in D2 (though obviously, convincing Walter and Andrei of any particular change is rarely easy).
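The bool-as-integer nonsense mentioned above can be seen in a small sketch (the function name `pick` is just an illustration, not from the original posts): because the literals 0 and 1 implicitly convert to bool, and bool is the more specialized overload, an innocent-looking call resolves to the bool version.

```d
// Hypothetical overload pair to show D's bool/long overload resolution.
string pick(bool b) { return "bool"; }
string pick(long l) { return "long"; }

void main()
{
    // The literal 1 fits in a bool, and bool is the more specialized
    // overload, so this resolves to pick(bool) rather than pick(long):
    assert(pick(1) == "bool");
    // 2 cannot implicitly convert to bool, so the long overload wins:
    assert(pick(2) == "long");
}
```

This is exactly the kind of surprise the rejected DIP (removing the literal-to-bool implicit conversion) would have eliminated.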
It's just the real earth-shattering ones like reversing the defaults of things that can't be, and I'm not at all convinced that any of those would actually be a good idea anyway, much as folks like to talk about them. In a large program, it can very much be worth going to the extra effort of making your program work with a lot of extra attributes, but they often just get in the way, and forcing them on all programs by default would easily risk making D miserable to work in by default.

LOL. Walter's comment at dconf this year that he wished D had const as the default definitely makes me that much more leery of D3 ever arriving, since I increasingly avoid const in D.

Honestly, the only thing I can think of where I'd love the opportunity to be able to sit down and start from scratch would be ranges - and not just for auto-decoding. I'd want to rework them so that save wasn't a thing, and I'd want to figure out how to rework them so that the reference and value semantics were cleaner. What we have works really well in the common case but ends up as a bit of a mess in the corner cases, and I don't think that it's entirely fixable. It's mostly a library issue (though it's also partly an issue with foreach), but it's pervasive enough that we can't really change it at this point. It's one of those areas where we introduced a revolutionary idea, and we didn't get it quite right. So, it's great, but it has problems, and we're stuck with at least some of them. Personally, that's probably the only thing that I'd be looking to change that I don't think could be changed without effectively doing D3.
In general, the kind of changes that I can think of that I'd like to see are things that can definitely be done in D2 - assuming that Walter and Andrei can be convinced, which is rarely easy, and the question of D2 vs D3 probably wouldn't change that much (some, since backwards compatibility would be less of an issue, but that doesn't mean that they'd then be easy to convince of major changes in general). - Jonathan M Davis
Nov 15 2018
prev sibling next sibling parent bauss <jj_1337 live.dk> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/

 By 2020 C++ is planning to introduce:

 * Ranges
 * Contracts
 * Concepts (`__traits`)
 * Proper constexpr
 * Modules
 * Reflections
 * Green threads

 Right now it already has:

 * `auto` variables
 * Ranged for (`foreach`)
 * Lambda expressions and closures
 * `nothrow` attributes
 * Proper containers
 * Proper RAII

 In no way this is the usual trollpost (I am a participant of 
 SAoC). What bugs me is the shortening distance regarding what D 
 has to offer with respect to C++. While D for sure has a way 
 better syntax (thinking of template declarations, `immutable`, 
 UDAs) and a GC, what are the advantages of using D vs C++ if my 
 goal is to build a complex system / product?

 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
Because C++ still has terrible support for proper web development.

This is a joke, but not really. There is nothing in C++ that's equivalent to anything currently available in D when it comes to web development. D is pretty close to what other ecosystems offer (.NET, Rails (Ruby), Node (JS), etc.). C++ doesn't really have anything that's at an equivalent level. The closest thing is probably something like CppCMS, but not only is it old, it's also way behind some of the D frameworks :)

Also D will still have a much more beautiful syntax; C++ is never going away from the "ugly" syntax it has. Templates will always be superior in D. CTFE will still be superior to constexpr in C++, because D doesn't really have a difference between a runtime function and a compile-time function. I reckon that C++ will be much more limited.

There are a lot more than just language features that make D a better choice than C++ in my opinion.

Also when C++ announces what they plan to implement for a version, usually half of the things don't make it and are brought up in the next version again, where they may or may not make it. An example is modules in C++, which were planned for C++17 too as far as I remember.
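The CTFE point can be shown with a minimal sketch: in D an ordinary, unannotated function is evaluated at compile time simply by using it where a compile-time value is required, with no `constexpr`-style split between runtime and compile-time functions.

```d
// An ordinary function - no special annotation needed for CTFE.
ulong fib(ulong n)
{
    return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

void main()
{
    enum compileTime = fib(20); // enum forces CTFE: computed by the compiler
    auto runTime = fib(20);     // the exact same function, run normally
    assert(compileTime == runTime);
    assert(compileTime == 6765);
}
```

The same function body serves both worlds, which is the asymmetry with `constexpr` the post is pointing at.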
Nov 15 2018
prev sibling next sibling parent reply Atila Neves <atila.neves gmail.com> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/

 By 2020 C++ is planning to introduce:

 * Ranges
 * Contracts
 * Concepts (`__traits`)
 * Proper constexpr
 * Modules
 * Reflections
 * Green threads

 Right now it already has:

 * `auto` variables
 * Ranged for (`foreach`)
 * Lambda expressions and closures
 * `nothrow` attributes
 * Proper containers
 * Proper RAII

 In no way this is the usual trollpost (I am a participant of 
 SAoC). What bugs me is the shortening distance regarding what D 
 has to offer with respect to C++. While D for sure has a way 
 better syntax (thinking of template declarations, `immutable`, 
 UDAs) and a GC, what are the advantages of using D vs C++ if my 
 goal is to build a complex system / product?

 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
Even if they're present in C++20 or later, they might not work the same or be as good. C++17 has `if constexpr`, but it's not as good as `static if`.

Having used most of the modern C++ features and writing D daily, here are my personal opinions on why D is still / will still be better:

* The GC. It's just easier to write safe code by default.
* DIP1000
* auto is easier to use in D (no need for `const auto` or `static auto`, or...)
* Compile-time reflection. Whatever C++ gets is likely to be worse, because of:
* string mixins
* Lambdas are harder to use in C++ and have lifetime issues with captures by reference
* Headers / modules.
* UFCS and chains, even though the new C++ ranges will mitigate this
* Sane template metaprogramming, but void_t is nice
* Syntax for slices

Is writing C++17 much better than C++03? Indubitably. Is it as nice as writing D? In my opinion, no - I'm 2x to 3x more productive to begin with in D and end up writing fewer bugs.

D is still rough around the edges when it comes to implementation quality. I've never had C++ projects break due to a compiler update, nor have I ever filed a clang or gcc bug. CMake is horrible but I can get the job done, whereas I have to work around dub all the time.

Right now my main source of cortisol is the ridiculously long edit-test cycle, but I'm working on something to make that part of my work life a lot happier.
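The "UFCS and chains" bullet above can be sketched in a few lines: `filter`, `map`, and `array` are ordinary Phobos functions, called with dot syntax and composed into a lazy pipeline.

```d
import std.algorithm : filter, map;
import std.array : array;

void main()
{
    // Each step is a lazy range; .array materializes the result.
    auto result = [1, 2, 3, 4, 5]
        .filter!(x => x % 2 == 0) // keep the even numbers
        .map!(x => x * 10)        // scale them
        .array;
    assert(result == [20, 40]);
}
```

This is the style that C++20 ranges approximate with `|` pipes, which is why the post says they "mitigate" rather than match it.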
Nov 16 2018
next sibling parent reply bauss <jj_1337 live.dk> writes:
On Friday, 16 November 2018 at 13:10:19 UTC, Atila Neves wrote:
 I've never had C++ projects break due to a compiler update, nor 
 have I ever filed a clang or gcc bug.
But that's because C++ is way older than D and has had a lot more work done to it. C++ hasn't always been elegant and I'll bet you Walter can come up with a lot of issues with C++ from back when he worked on C++ compilers.

Clang and GCC do have a lot of bugs too though, but they're usually much more "precise" and not so much "general" as most bugs in D, but eventually D will reach the same level.

I have never had any breakage because of a D compiler update in my own projects for the past 2+ years, but I have with 3rd party libraries though, so I'm not entirely free of that either.

As long as D is as open as it is, then I think it'll continue to have breakage with each bigger release. The good thing though is that it's usually not necessary to update your compiler version and you can generally wait a few versions until you update and that way you can skip most breakage.
Nov 16 2018
parent Atila Neves <atila.neves gmail.com> writes:
On Friday, 16 November 2018 at 13:38:17 UTC, bauss wrote:
 On Friday, 16 November 2018 at 13:10:19 UTC, Atila Neves wrote:
 I've never had C++ projects break due to a compiler update, 
 nor have I ever filed a clang or gcc bug.
But that's because C++ is way older than D and has had a lot more work done to it.
Yes. That doesn't change the current state of affairs.
 C++ hasn't always been elegant
I don't think C++ has ever been elegant, except for the introduction of the STL.
 and I'll bet you Walter can come up with a lot of issues with 
 C++ from back when he worked on C++ compilers.
I don't need to, I remember them. There's a reason Qt looks the way it does: at the time, compilers couldn't be relied on to compile templates properly. I learned the STL by not learning the actual *standard* template library. There were a lot of similar incompatible implementations.
 Clang and GCC do have a lot of bugs too though, but they're 
 usually much more "precise" and not so much "general" as most 
 bugs in D, but eventually D will reach the same level.
I know. I didn't suggest they were bug free, just that I've never personally filed a bug for either of those compilers. I have 19 to my name on dmd.
 I have never had any breakage because of a D compiler update in 
 my own projects for the past 2+ years, but I have with 3rd 
 party libraries though, so I'm not entirely free of that either.
I've literally lost count.
 As long as D is as open as it is, then I think it'll continue 
 to have breakage with each bigger release.
I hope not!
 The good thing though is that it's usually not necessary to 
 update your compiler version and you can generally wait a few 
 versions until you update and that way you can skip most 
 breakage.
Again, something I've never had to do with clang or gcc.
Nov 17 2018
prev sibling parent Adam D. Ruppe <destructionator gmail.com> writes:
On Friday, 16 November 2018 at 13:10:19 UTC, Atila Neves wrote:
 I've never had C++ projects break due to a compiler update, nor 
 have I ever filed a clang or gcc bug.
I used to regularly hit problems around 2005-2007ish in C++, including updates breaking stuff, libraries using new stuff that would not work in my install, and yes, plain old bugs like compiler crashes (lots of them when trying to use Boost back then!). Of course, "plain" code would work well, but I can say the same about D now too.

I'd say feature-wise D is about 4 years ahead of C++, but implementation-wise - meaning the compiler and associated ecosystem - it's probably about 8 years behind C++...
Nov 16 2018
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Nov 15, 2018 at 07:49:32PM -0700, Jonathan M Davis via Digitalmars-d
wrote:
[...]
 Honestly, over time, I've become increasingly convinced that the more
 radical ideas would be incredibly undesirable (e.g. making const or
 immutable the default).
Actually, immutable by default would encourage better coding style, as well as potentially provide a small performance benefit by allowing the optimizer to take advantage of more things that don't need to be mutable (can elide certain loads even in the face of aliasing, can infer more loop invariants, etc.).
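A rough sketch of that optimizer argument (the function `sumTwice` is a hypothetical example, not from the post): since immutable data can never change through any alias, the compiler is free to assume repeated reads see the same values, which is exactly the load-elision and loop-invariant reasoning mentioned above.

```d
int sumTwice(immutable(int)[] data)
{
    // `data` cannot be mutated through any alias, so the optimizer may
    // assume both passes observe identical values (no reload needed for
    // aliasing reasons, and .length is trivially a loop invariant).
    int total = 0;
    foreach (x; data) total += x;
    foreach (x; data) total += x;
    return total;
}

void main()
{
    // An array literal is a unique expression, so it implicitly
    // converts to immutable(int)[].
    assert(sumTwice([1, 2, 3]) == 12);
}
```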
 shared has some rough edges that need to be sorted out, but I don't
 think that it's fundamentally broken as things stand. I think that the
 issue is more that it's misunderstood, and its proper use has not
 really been messaged well - with the related problem being that the
 core synchronization components in druntime have not been entirely
 properly updated to take shared into account like they should have
 been, mostly because no one wanted to mess with them, because they had
 the idea that shared was largely unfinished and might change
 drastically later. So, while shared's implementation needs some tweaks
 to be sure, I'm not the least bit convinced that it needs a serious
 overhaul on the language front so much as some work on the library
 front and an overhaul on the PR front.
Whether or not shared is fundamentally broken or (mostly) working as designed, I can't say, but the documentation problem is very real, because what little of it I have read has scared me off from touching shared with a 10-foot pole thus far. The fact that you have to cast may have been by design, but it's a scary thing to the under-informed.
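For what it's worth, the everyday intended use of `shared` is less scary than the documentation makes it sound - a minimal sketch using `core.atomic`, with the casts only needed for the fancier lock-based patterns:

```d
import core.atomic : atomicOp, atomicLoad;

void main()
{
    shared int counter = 0;
    atomicOp!"+="(counter, 1); // race-free read-modify-write on shared data
    assert(atomicLoad(counter) == 1);
    // The scary part is everything beyond atomics: non-atomic access
    // requires taking a lock and then casting away shared to operate
    // on the data as thread-local, which is what this thread is about.
}
```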
 In any case, I think that _most_ of the things that should go in
 something like D3 can be done in D2 so long as Walter and Andrei can
 be convinced. For instance, we totally _could_ fix the nonsense about
 treating bool as an integer type in D2. There's nothing about that
 that requires D3.
Yes, but cutting the legacy tie with C integer promotion rules would require D3.
 Unfortunately, of course, Walter and Andrei weren't convinced by the
 DIP that would effectively have fixed that by removing the implicit
 conversions from integer literals to bool, so that's not happening in
 D2 unless something drastic changes, and as such, I see no reason to
 expect that it would happen in D3.
I was very disappointed at the rejection, in fact. I suppose you have a point that if we were to start over from a blank slate like D3, it will probably still stay that way. But if I had any say in how D3 would be done, I would definitely say treat bool as a non-numerical type, and require a cast if for whatever reason you decide to treat it as such. It makes intent so much clearer, and IMO leads to better code (just like when pointers stopped implicitly converting to bool -- I was initially annoyed, but in retrospect appreciated the much improved readability of:

	if ((p = func()) !is null) ...

over the terse but cryptic:

	if ((p = func())) ...

[...]
 In a large program, it can very much be worth going to the extra
 effort of making your program work with a lot of extra attributes, but
 they often just get in the way, and forcing them on all programs by
 default would easily risk making D miserable to work in by default.
Attribute inference is the way to go. *Nobody* wants to waste time manually annotating everything. Boilerplate is evil. In D2 Walter couldn't pull off across-the-board attribute inference, mainly because of backward compatibility, among other issues. In D3 we could potentially build this in by default from the beginning, and it would save so many headaches.
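D2 already does this for templates, which is a decent preview of what inference-by-default could feel like (a minimal sketch):

```d
// Templated functions already get their attributes inferred in D2;
// a hypothetical D3 would extend this to ordinary functions.
auto twice(T)(T x) { return x + x; }

// The compiler infers @safe pure nothrow @nogc for twice!int, so a
// fully annotated caller can use it with zero manual boilerplate:
int caller(int x) @safe pure nothrow @nogc
{
    return twice(x);
}
```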
 LOL. Walter's comment at dconf this year that he wished D had const as
 the default definitely makes me that much more leery of D3 ever
 arriving, since I increasingly avoid const in D.
So you're essentially going back to D1? ;-)
 Honestly, the only thing I can think of where I'd love the opportunity
 to be able to sit down and start from scratch would be ranges - and
 not just for auto-decoding. I'd want to rework them so that save
 wasn't a thing, and I'd want to figure out how to rework them so that
 the reference and value semantics were cleaner.
Yeah, I think Andrei himself also expressed the wish that ranges could be defined such that input ranges by definition have by-reference semantics, whereas forward ranges have by-value semantics, and would save their current position just by being copied into another variable. IIRC, the reason .save was introduced in the first place was because at the time, the language was still lacking some features that made it easy to determine whether something had by-value or by-reference semantics. [...]
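For context, the wart being discussed looks like this (a minimal forward range of my own invention; under the redesign described above, the copy itself would be the checkpoint and `save` would disappear):

```d
struct Iota
{
    int front;
    int limit;

    @property bool empty() const { return front >= limit; }
    void popFront() { ++front; }

    // The wart: forward ranges must expose an explicit checkpoint.
    // Forgetting to call .save and iterating the original instead is
    // a classic silent bug with reference-style ranges.
    @property Iota save() { return this; }
}
```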
 In general, the kind of changes that I can think of that I'd like to
 see are things that can definitely be done in D2 - assuming that
 Walter and Andrei can be convinced, which is rarely easy, and the
 question of D2 vs D3 probably wouldn't change that much (some, since
 backwards compatibility would be less of an issue, but that doesn't
 mean that they'd then be easy to convince of major changes in
 general).
[...] Seeing as the default answer to large-scale language changes these days seems to be "no, because it would break too much code", I think D3 would allow an opportunity to make large-scale changes that we otherwise wouldn't dare to make in D2.

One of the warts I'd like to see go the way of the dodo is the whole fiasco with Object.toString, Object.opEquals, etc., specifically how they interact with attributes in class methods. And also the implicit Monitor object (when it's not needed). Andrei had proposed ProtoObject as a D2 solution to this impasse, but I've yet to see any further progress since the initial discussion.

T

-- 
Knowledge is that area of ignorance that we arrange and classify. -- Ambrose Bierce
Nov 16 2018
prev sibling next sibling parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Friday, November 16, 2018 3:41:10 PM MST H. S. Teoh via Digitalmars-d 
wrote:
 On Thu, Nov 15, 2018 at 07:49:32PM -0700, Jonathan M Davis via
 Digitalmars-d wrote: [...]

 Honestly, over time, I've become increasingly convinced that the more
 radical ideas would be incredibly undesirable (e.g. making const or
 immutable the default).
Actually, immutable by default would encourage better coding style, as well as potentially provide a small performance benefit by allowing the optimizer to take advantage of more things that don't need to be mutable (can elide certain loads even in the face of aliasing, can infer more loop invariants, etc.).
immutable by default would encourage a functional programming style everywhere. I dispute that that's necessarily better. There are certainly times that that's better, but it's often hell. What we have now allows us to program functionally when we want to without forcing it, whereas having immutable by default would lean heavily towards forcing it. It also would not play well with D being a systems language or with D interacting with C or C++. Honestly, having D3 be const or immutable by default is the sort of choice that would make me seriously consider quitting D.
 In any case, I think that _most_ of the things that should go in
 something like D3 can be done in D2 so long as Walter and Andrei can
 be convinced. For instance, we totally _could_ fix the nonsense about
 treating bool as an integer type in D2. There's nothing about that
 that requires D3.
Yes, but cutting the legacy tie with C integer promotion rules would require D3.
It would, but given the insistence on C code being valid D code with the same semantics or not being valid D code, I don't see that ever changing, and that basic rule is critical for being able to easily port C code to D without subtle bugs. So, while we can make some tweaks to the promotion rules, I very much doubt that any version of D will ever be fully divorced from C in that regard.
 Unfortunately, of course, Walter and Andrei weren't convinced by the
 DIP that would effectively have fixed that by removing the implicit
 conversions from integer literals to bool, so that's not happening in
 D2 unless something drastic changes, and as such, I see no reason to
 expect that it would happen in D3.
I was very disappointed at the rejection, in fact. I suppose you have a point that if we were to start over from a blank slate like D3, it would probably still stay that way. But if I had any say in how D3 were done, I would definitely say treat bool as a non-numerical type, and require a cast if for whatever reason you decide to treat it as such. It makes intent so much clearer, and IMO leads to better code (just like when pointers stopped implicitly converting to bool -- I was initially annoyed, but in retrospect appreciated the much improved readability of:

    if ((p = func()) !is null) ...

over the terse but cryptic:

    if ((p = func())) ...

[...]
Technically, implicit conversions have nothing to do with if statements anyway. Weird as that may be, that's an implicit, explicit conversion in that the compiler implicitly inserts a cast. So, even types which don't have an implicit conversion still work with if statements as long as they have an explicit conversion to bool. In general though, I do agree that being more explicit with conditionals is better.

In any case, the point I was trying to make was that if Walter and Andrei can't be convinced of something like removing the implicit conversion of integer literals to bool for D2, I find it highly unlikely that they'll be convinced of it for D3. It might be that they'd be more open to a change for D3 if the reason that they don't want to make the change in D2 is a concern about code breakage, but aside from that, convincing them to change something for D2 and D3 is probably going to be pretty much the same. In plenty of cases, concerns about code breakage have nothing to do with it. In general, it's just plain hard to convince Walter and Andrei that a particular feature or language change is a good idea. Whether you're talking about D2 or D3 doesn't really change that. D3 just removes many of the code breakage concerns from the equation.

Either way, any DIP has an uphill battle, and I think that most DIPs that provide real value could make it into D2 and don't require D3. The trick - as always - is convincing Walter and Andrei. So, I don't see how talk of D3 fixes anything really. The kind of stuff that you're unhappy about seems to mostly be stuff that Walter doesn't agree with anyway, so it's not stuff that he'd fix in D3, and a lot of it he could fix in D2. Ultimately, it comes down to convincing Walter and Andrei.
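Concretely, that "implicit, explicit conversion" looks like this (a sketch with a made-up type):

```d
struct Handle
{
    void* ptr;

    // An explicit conversion to bool only; no implicit one exists.
    bool opCast(T : bool)() const { return ptr !is null; }
}

void demo()
{
    Handle h;
    if (h) { }     // OK: the compiler implicitly inserts cast(bool) h
    // bool b = h; // Error: no implicit conversion to bool
}
```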
It may eventually come to creating a D3 and fixing some issues that we couldn't fix in D2, but without first convincing Walter and Andrei of various issues, such a version bump would be pointless, and if we convince them first, most of the issues can be resolved even without such a version bump.
 In a large program, it can very much be worth going to the extra
 effort of making your program work with a lot of extra attributes, but
 they often just get in the way, and forcing them on all programs by
 default would easily risk making D miserable to work in by default.
Attribute inference is the way to go. *Nobody* wants to waste time manually annotating everything. Boilerplate is evil. In D2 Walter couldn't pull off across-the-board attribute inference, mainly because of backward compatibility, among other issues. In D3 we could potentially build this in by default from the beginning, and it would save so many headaches.
 LOL. Walter's comment at dconf this year that he wished D had const as
 the default definitely makes me that much more leery of D3 ever
 arriving, since I increasingly avoid const in D.
So you're essentially going back to D1? ;-)
In terms of const? You read my article on it, so you should know where I stand on it. I don't think that either C++ or D got it right, but I also don't know what const would look like if it were done right. Either way, I sure don't want it to be the default. It's way too restrictive. If it were the default, I'd be forced to mark code with mutable (or whatever the keyword would be) everywhere. The same if immutable were the default. It would not be fun.
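A concrete instance of "way too restrictive" is cached computation: D's const is transitive and has no escape hatch like C++'s `mutable`, so the usual lazy-cache idiom simply doesn't compile (sketch with made-up names):

```d
struct Path
{
    double[] segments;
    double cachedLength = -1;

    double length() const
    {
        // The obvious memoization is rejected under transitive const:
        // if (cachedLength < 0) cachedLength = compute(); // Error
        return compute();
    }

    private double compute() const
    {
        double sum = 0;
        foreach (s; segments)
            sum += s;
        return sum;
    }
}
```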
 Seeing as the default answer to large-scale language changes these days
 seem to be "no, because it would break too much code", I think D3 would
 allow an opportunity to make large-scale changes that we otherwise
 wouldn't dare to in D2.
It would, but at the same time, we also have to have a language that we can use in production and continue to use in production. Having it change slowly over time is one thing. Having it change drastically like that is something else entirely.

At some point, it may make sense to create a D3, but I think that we should have a much better idea of what we want to change that we can't change now when we get to that point. We really don't have that right now. At most, it seems like D3 is the sort of thing that gets brought up when someone has an individual change that they want that gets shot down. As far as I can tell, there isn't even vaguely a cohesive group of changes that the main group of contributors agree on that they'd like to see happen that can't happen at this point, let alone the community at large. AFAIK, Walter and Andrei don't have a laundry list of things that they'd like to do but can't, because it would break backwards compatibility.

If Walter and Andrei were to say tomorrow that they wanted to create D3, they'd get bombarded with all kinds of stray ideas about how the language should be changed. There wouldn't be anything approaching any kind of agreement on what it should look like. Honestly, we have enough trouble agreeing on what changes should be made to the language when we're restricting ourselves to stuff that can be done transitionally within D2. And honestly, when you consider that we can't even sort out stuff like treating bool as an integer type, it just seems crazy to talk about jumping off the deep end into D3, much as it would be nice to be able to make some larger tweaks in some cases.

I think that we first need to figure out how to better fix D2 before we consider anything with D3 - and that's without even considering any of the potential issues of forking the language and community that come with creating a new language version.
 One of the warts I'd like to see go the way of the dodo is the whole
 fiasco with Object.toString, Object.opEquals, etc., specifically how
 they interact with attributes in class methods.  And also the implicit
 Monitor object (when it's not needed).  Andrei had proposed ProtoObject
 as a D2 solution to this impasse, but I've yet to see any further
 progress since the initial discussion.
Yeah, but all of that stuff moves slowly, and technically, it does have to go through the DIP process. It's a DIP that has yet to be approved, even if it's probably a shoo-in. Also, it's at least sort of related to the copy constructor DIP, and at least that is moving along, albeit slowly. If I understand correctly, it's about to move into the next stage.

I think that the biggest question is really who's going to implement the ProtoObject stuff, since Andrei isn't a compiler dev. That's the weakest link in any case like this. Whenever someone comes up with an idea, and they're not set up to be the one to implement it, there's always a risk of it sitting in limbo for a while. But my guess is that at some point, Andrei will stick one of his students on it.

The bigger concern I see is what's been the hold-up for years with removing all of that muck from Object, and that's the AA implementation. I know that Martin did a bunch of work on it, but I don't know where any of that stands. However, without it being replaced, there will likely be serious issues when ProtoObject comes into the picture.

- Jonathan M Davis
Nov 16 2018
next sibling parent Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Saturday, 17 November 2018 at 01:00:39 UTC, Jonathan M Davis 
wrote:
 LOL. Walter's comment at dconf this year that he wished D 
 had const as the default definitely makes me that much more 
 leery of D3 ever arriving, since I increasingly avoid const 
 in D.
So you're essentially going back to D1? ;-)
In terms of const? You read my article on it, so you should know where I stand on it. I don't think that either C++ or D got it right, but I also don't know what const would look like if it were done right.
From the viewpoint of an ignorant outsider (i.e. me), Scala's `var` (mutable)/`val` (immutable) strikes me as the closest to it.
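For the equally ignorant, the Scala version of that choice is one keyword per binding (a minimal sketch):

```scala
object MutabilityDemo {
  def main(args: Array[String]): Unit = {
    val fixed = 1      // val: the binding cannot be reassigned
    var counter = 1    // var: opting in to mutation is explicit
    counter += 1
    // fixed += 1      // compile error: reassignment to val
    assert(counter == 2 && fixed == 1)
  }
}
```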
Nov 17 2018
prev sibling next sibling parent Atila Neves <atila.neves gmail.com> writes:
On Saturday, 17 November 2018 at 01:00:39 UTC, Jonathan M Davis 
wrote:
 On Friday, November 16, 2018 3:41:10 PM MST H. S. Teoh via 
 Digitalmars-d wrote:
 On Thu, Nov 15, 2018 at 07:49:32PM -0700, Jonathan M Davis via 
 Digitalmars-d wrote: [...]

 [...]
Actually, immutable by default would encourage better coding style, as well as potentially provide a small performance benefit by allowing the optimizer to take advantage of more things that don't need to be mutable (can elide certain loads even in the face of aliasing, can infer more loop invariants, etc.).
immutable by default would encourage a functional programming style everywhere. I dispute that that's necessarily better. There are certainly times that that's better, but it's often hell. What we have now allows us to program functionally when we want to without forcing it, whereas having immutable by default would lean heavily towards forcing it.
Not forcing but encouraging.
 It also would not play well with D being a systems language
I can't see how - in Rust everything is immutable by default and it's fine.
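Rust's version, for reference (a minimal sketch): immutability is the per-binding default and opting out is a single keyword, which is exactly why it coexists fine with systems-level code:

```rust
fn main() {
    let x = 5;       // immutable by default
    // x += 1;       // rejected: cannot assign twice to immutable variable

    let mut y = 5;   // opting in to mutability is one keyword away
    y += 1;
    assert_eq!(y, 6);

    println!("x = {}, y = {}", x, y);
}
```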
 or with D interacting with C or C++.
I also can't see how that'd be the case.
 Honestly, having D3 be const or immutable by default is the 
 sort of choice that would make me seriously consider quitting D.
For me, the defaults should be @safe pure for functions and const for variables.
Nov 17 2018
prev sibling parent 0xEAB <desisma heidel.beer> writes:
On Saturday, 17 November 2018 at 01:00:39 UTC, Jonathan M Davis 
wrote:
 If Walter and Andrei were to say tomorrow that they wanted to 
 create D3, they'd get bombarded with all kinds of stray ideas 
 about how the language should be changed.
Well, I'd kindly suggest to *not* create yet another programming language then.
Nov 18 2018
prev sibling next sibling parent 0xEAB <desisma heidel.beer> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
You don't have to deal with header file bloat. <- still one of the biggest advantages imho
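In D the module is the interface, so there is nothing to keep in sync (a sketch with made-up file and function names):

```d
// geometry.d -- the module serves as both declaration and definition;
// no header, no include guards, no copies to keep in sync.
module geometry;

struct Point { double x, y; }

double dot(Point a, Point b) { return a.x * b.x + a.y * b.y; }
```

A consumer just writes `import geometry;` -- one line, parsed once, no preprocessor involved.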
Nov 18 2018
prev sibling next sibling parent reply welkam <wwwelkam gmail.com> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 TL;DR: what will D offer with respect to C++ when almost all 
 key features of D are present in C++20(+)?
If you list C++, D and Rust language features, you would find that they are almost the same. That said, it's immediately clear that a list of features is not enough to describe a language. Your experience with these languages will be different because they are different languages.

C++'s biggest problem at this time is long compile times. It could add all of D's compile-time features and still people would not use those features to the same extent as in D, just because those features would increase compile times to unacceptable levels. At this time, big projects like compilers and game engines take half an hour to compile on modern machines. Now imagine what would happen to their compile times if those projects used metaprogramming to the same extent as WekaIO. And no, modules won't change this significantly.

But as others have said already, C++ hasn't -stolen- implemented those features correctly, and by the time you could use them, D will have improved even more.
Nov 18 2018
parent Tony <tonytdominguez aol.com> writes:
I think the crown jewels of D are interfaces and garbage 
collection. I was surprised to see that many people here don't do 
OOP and want to avoid garbage collection.
Nov 18 2018
prev sibling next sibling parent Trailzz <email example.com> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 https://www.reddit.com/r/cpp/comments/9vwvbz/2018_san_diego_iso_c_committee_trip_report_ranges/
D has:

1. A package manager (dub)
2. (Optional) garbage collection
3. A standard library that is actually useful
4. Far nicer syntax
Nov 18 2018
prev sibling parent reply Emil <emilper gmail.com> writes:
On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 By 2020 C++ is planning to introduce:
what are they planning to remove from C++ ?
Nov 20 2018
next sibling parent Stefan Koch <uplink.coder googlemail.com> writes:
On Tuesday, 20 November 2018 at 14:02:21 UTC, Emil wrote:
 On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 By 2020 C++ is planning to introduce:
what are they planning to remove from C++ ?
I don't think there is anything. But it is nice to be able to compile 20-year-old code without encountering strange new errors.
Nov 20 2018
prev sibling parent Laeeth Isharc <laeeth kaleidic.io> writes:
On Tuesday, 20 November 2018 at 14:02:21 UTC, Emil wrote:
 On Wednesday, 14 November 2018 at 15:07:46 UTC, lagfra wrote:
 By 2020 C++ is planning to introduce:
what are they planning to remove from C++ ?
This was entertaining! :) https://www.bfilipek.com/2018/04/deprecating-pointers.html?m=1
Nov 20 2018