
digitalmars.D - Remember the Vasa! by Bjarne Stroustrup

reply Walter Bright <newshound2 digitalmars.com> writes:
A cautionary tale we should all keep in mind.

http://open-std.org/JTC1/SC22/WG21/docs/papers/2018/p0977r0.pdf

https://www.reddit.com/r/programming/comments/8mq10v/bjarne_stroustroup_remeber_the_vasa_critique_of/

https://news.ycombinator.com/item?id=17172057
May 28 2018
next sibling parent reply 12345swordy <alexanderheistermann gmail.com> writes:
On Tuesday, 29 May 2018 at 01:46:47 UTC, Walter Bright wrote:
 A cautionary tale we should all keep in mind.

 http://open-std.org/JTC1/SC22/WG21/docs/papers/2018/p0977r0.pdf

 https://www.reddit.com/r/programming/comments/8mq10v/bjarne_stroustroup_remeber_the_vasa_critique_of/

 https://news.ycombinator.com/item?id=17172057
No doubt that all this complexity is partially due to having a religious-like zeal when it comes to preserving backwards compatibility. I mean, they could create a new official file extension so that much-needed breaking changes could be made under it. For all the faults that D has, it is not afraid to deprecate language features if they turn out to be a really bad idea.
May 28 2018
parent Walter Bright <newshound2 digitalmars.com> writes:
On 5/28/2018 6:54 PM, 12345swordy wrote:
 No doubt that all this complexity is partially due to having a religious-like 
 zeal when it comes to preserving backwards compatibility.
 
 I mean, they could create a new official file extension so that much-needed 
 breaking changes could be made under it.
 
 For all the faults that D has, it is not afraid to deprecate language features 
 if they turn out to be a really bad idea.
If I were going to make a proposal to the C++ Standards Committee, it would be to deprecate the preprocessor. D has by now shown that all of it that matters can be done with language features in a straightforward, hygienic manner. Here's the D subthread: https://www.reddit.com/r/programming/comments/8mq10v/bjarne_stroustroup_remeber_the_vasa_critique_of/dzpwz76/
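To make "all of it that matters" concrete, here is a rough sketch (illustrative code only, not from the post or the paper; the names are made up) of the usual D replacements for common preprocessor idioms: manifest constants instead of #define, version blocks instead of #ifdef, templates instead of function-like macros, and string mixins for the rare cases that genuinely need code generation.

    // Illustrative sketch only; names are invented for the example.
    enum bufferSize = 4096;            // instead of: #define BUFFER_SIZE 4096

    version (Windows)                  // instead of: #ifdef _WIN32 ... #endif
        enum pathSep = '\\';
    else
        enum pathSep = '/';

    // Instead of #define MAX(a, b) ((a) > (b) ? (a) : (b)):
    // a template function is type-checked, hygienic, and evaluates arguments once.
    T max2(T)(T a, T b) { return a > b ? a : b; }

    // Instead of token pasting, a string mixin generates the code explicitly.
    mixin("int generatedValue = 42;");

    static assert(max2(3, 5) == 5);    // verified at compile time via CTFE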
May 28 2018
prev sibling next sibling parent reply TheMightWarship <thewarship warship.com> writes:
On Tuesday, 29 May 2018 at 01:46:47 UTC, Walter Bright wrote:
 A cautionary tale we should all keep in mind.

 http://open-std.org/JTC1/SC22/WG21/docs/papers/2018/p0977r0.pdf

 https://www.reddit.com/r/programming/comments/8mq10v/bjarne_stroustroup_remeber_the_vasa_critique_of/

 https://news.ycombinator.com/item?id=17172057
Bjarne opens with: "Many/most people in WG21 are working independently towards non-shared goals." I presume this is essentially a criticism? If so, I reject it as criticism. There has to be room for allowing people to pursue their individual goals too, and a programming language should allow that as well.

The idea that less is more? Well, take a look at golang. Less means less, not more.

When I go out to do a job, I take only those tools I'll need for the job. I'm not expected to take every tool ever created, and know how to use each of those tools. I take, and use, the tools I need for that particular job. A programming language should not restrict the availability of tools, unless it wants to be the tool for a particular job. New features that people want are essentially 'tools' that they want (and typically, tools they already have in one or more other languages).

Let's not contain complexity by preventing it from ever occurring - otherwise, nothing would exist.
May 28 2018
parent Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Tuesday, May 29, 2018 03:43:00 TheMightWarship via Digitalmars-d wrote:
 On Tuesday, 29 May 2018 at 01:46:47 UTC, Walter Bright wrote:
 A cautionary tale we should all keep in mind.

 http://open-std.org/JTC1/SC22/WG21/docs/papers/2018/p0977r0.pdf

 https://www.reddit.com/r/programming/comments/8mq10v/bjarne_stroustroup_remeber_the_vasa_critique_of/

 https://news.ycombinator.com/item?id=17172057
Bjarne opens with: "Many/most people in WG21 are working independently towards non-shared goals." I presume this is essentially a criticism? If so, I reject it as criticism. There has to be room for allowing people to pursue their individual goals too, and a programming language should allow that as well.
I don't think that it's really a criticism of folks having individual goals. The overall criticism seems to be that while those involved may have varying goals, the resulting language needs to be reasonably coherent and usable by the lay programmer. So, ultimately, when new language features are introduced, you need to examine how they fit in with everything else (both what's already in the language and what's being proposed) and potentially adjust what's being proposed to make it all fit together better.

Right now, they're getting a bunch of independent proposals that don't take each other into account at all, and it sounds like a lot of them aren't even talking about how this will help or hurt the average C++ programmer. It's more like they're just trying to get their pet feature into the language. So, Stroustrup thinks that they should be trying to make everything fit together better and aim it at how it affects the average C++ programmer who's just trying to get their job done rather than trying to get every stray thing into the language that seems like it would be valuable.

He even talks about how the Vasa could have succeeded if a bit more work had been put into making sure that all of the adjustments to the vessel worked together. It didn't need to give up on everything that was done to improve it. Rather, it needed to be more coherent in its parts. C++ is suffering from a major case of being designed by committee.

- Jonathan M Davis
May 28 2018
prev sibling next sibling parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On Tuesday, 29 May 2018 at 01:46:47 UTC, Walter Bright wrote:
 A cautionary tale we should all keep in mind.

 http://open-std.org/JTC1/SC22/WG21/docs/papers/2018/p0977r0.pdf

 https://www.reddit.com/r/programming/comments/8mq10v/bjarne_stroustroup_remeber_the_vasa_critique_of/

 https://news.ycombinator.com/item?id=17172057
It seems C++ is following the road of PL/I, which is growing the language way beyond the point where anyone can understand or implement all of it.
May 28 2018
next sibling parent Buddha <bud bud.com> writes:
On Tuesday, 29 May 2018 at 03:56:05 UTC, Dmitry Olshansky wrote:
 On Tuesday, 29 May 2018 at 01:46:47 UTC, Walter Bright wrote:
 A cautionary tale we should all keep in mind.

 http://open-std.org/JTC1/SC22/WG21/docs/papers/2018/p0977r0.pdf

 https://www.reddit.com/r/programming/comments/8mq10v/bjarne_stroustroup_remeber_the_vasa_critique_of/

 https://news.ycombinator.com/item?id=17172057
It seems C++ is following the road of PL/I, which is growing the language way beyond the point where anyone can understand or implement all of it.
This is ultimately a matter of 'architecture', rather than being a problem of a 'growing language'. A good architecture could allow growth/complexity to arise in a manageable way. When you have languages that are so low level, you simply cannot create good architecture beyond a certain point (either from the user's point of view, or the implementer's). What we need is better architecture in language design. This, ultimately, means we need to move away from the von Neumann machine, because that is really what's holding us back from developing good architecture (for managing the inevitable complexity that arises from change). Nature shows us the way - we just don't bother to look.
May 28 2018
prev sibling next sibling parent reply Komplex <Komplex Komplex.com> writes:
On Tuesday, 29 May 2018 at 03:56:05 UTC, Dmitry Olshansky wrote:
 It seems C++ is following the road of PL/I, which is growing the 
 language way beyond the point where anyone can understand or 
 implement all of it.
but that happened to the linux kernel too, long ago? and yet...it's everywhere..and increasingly so... we need to move away from this concept that we need to understand it all, or otherwise.. it must be bad.
May 28 2018
parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On Tuesday, 29 May 2018 at 04:41:33 UTC, Komplex wrote:
 On Tuesday, 29 May 2018 at 03:56:05 UTC, Dmitry Olshansky wrote:
 It seems C++ is following the road of PL/I, which is growing the 
 language way beyond the point where anyone can understand or 
 implement all of it.
but that happened to the linux kernel too, long ago?
Not really. First - Linux (for the scale) is architected quite well. Second - languages are way more composable and complex beasts than systems. I bet I can understand most of the Linux kernel in a couple of years (w/o drivers and arch specifics beyond x86). Abstraction and components/interfaces is a time-proven technique that actually works for the most part. In contrast, I will likely never understand or have a good picture of C++20 as a (semi-)coherent whole, not that I really wanted to. D is probably at the edge of what I can tolerate complexity-wise. And we'll get to simplify a few things soon I believe.
 and yet...it's everywhere..and increasingly so...
It has evolved a lot. Yet I believe we can get better things done w/o years upon years of churn. But that’s just a point of view.
 we need to move away from this concept that we need to 
 understand it all, or otherwise.. it must be bad.
May 28 2018
next sibling parent reply Let-It-Go <letitgo letitgo.com> writes:
On Tuesday, 29 May 2018 at 05:11:27 UTC, Dmitry Olshansky wrote:
 D is probably at the edge of what I can tolerate 
 complexity-wise. And we’ll get to simplify a few things soon I 
 believe.
There is the core of the problem. Because you want to understand it all, therefore it must be simplified. This is not something that nature imposes on itself. It's entirely a product of the human mind. Why constrain ourselves in this way? Let it go, and let it grow ;-)
May 28 2018
parent reply Stefan Koch <uplink.coder googlemail.com> writes:
On Tuesday, 29 May 2018 at 05:47:32 UTC, Let-It-Go wrote:
 On Tuesday, 29 May 2018 at 05:11:27 UTC, Dmitry Olshansky wrote:
 D is probably at the edge of what I can tolerate 
 complexity-wise. And we’ll get to simplify a few things soon I 
 believe.
There is the core of the problem. Because you want to understand it all, therefore it must be simplified. This is not something that nature imposes on itself. It's entirely a product of the human mind. Why constrain ourselves in this way? Let it go, and let it grow ;-)
As a compiler developer, I can guarantee that at some point you _need_ to understand all of the language. If you don't, you will mis-compile code. Also, the more complex the language gets, the more special-case handling needs to be added to the compiler, making it slower and more brittle. Unconstrained complexity growth is a pretty scary thing.
May 29 2018
parent Wu Wei <tao tao.com> writes:
On Tuesday, 29 May 2018 at 07:25:39 UTC, Stefan Koch wrote:
 As a compiler developer, I can guarantee that at some point 
 you _need_ to understand all of the language.
 If you don't, you will mis-compile code.
 
 Also, the more complex the language gets, the more special-case 
 handling needs to be added to the compiler, making it slower and 
 more brittle.

 Unconstrained complexity growth is a pretty scary thing.
Could this be more a problem of compiler 'architecture'? Or perhaps hardware architecture? Can we design better architecture (at all levels) to better accommodate inevitable change? Could it be a problem of not having enough compiler writers - where each knows some part(s), but together they know all the parts? Collaboration is a good way to manage complexity. A compiler writer insisting they must know it all (while understandable) is an unnatural constraint. You'll end up like Scott Meyers - decades of effort learning, but never able to understand it, because change is a moving target.
May 29 2018
prev sibling parent reply Guillaume Piolat <notthat email.com> writes:
On Tuesday, 29 May 2018 at 05:11:27 UTC, Dmitry Olshansky wrote:
 D is probably at the edge of what I can tolerate 
 complexity-wise. And we’ll get to simplify a few things soon I 
 believe.
Within D, there is a bit smaller and cleaner language struggling to get out!
May 29 2018
next sibling parent bpr <brogoff gmail.com> writes:
On Tuesday, 29 May 2018 at 11:31:53 UTC, Guillaume Piolat wrote:
 On Tuesday, 29 May 2018 at 05:11:27 UTC, Dmitry Olshansky wrote:
 D is probably at the edge of what I can tolerate 
 complexity-wise. And we’ll get to simplify a few things soon I 
 believe.
What are the things that you think will be simplified? I thought that D had some of the same issues about breaking backward compatibility that C++ had.
 Within D, there is a bit smaller and cleaner language 
 struggling to get out!
Ha, one of my favorite Stroustrup quotes about C++! One of the reasons I like the betterC switch is that it does simplify the language, perhaps too much, but preserves some of the best parts of D, like metaprogramming and modules.
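As a small illustration (a sketch of my own, not from the post above; file and symbol names are invented): a -betterC program keeps templates and CTFE but drops the GC, the D runtime and most of Phobos, so it builds and links much like C.

    // Build with: dmd -betterC example.d   (or: ldc2 -betterC example.d)
    import core.stdc.stdio : printf;

    // Templates and CTFE still work without the D runtime.
    T square(T)(T x) { return x * x; }
    enum nine = square(3);             // computed at compile time

    extern (C) int main()
    {
        printf("square(7) = %d, nine = %d\n", square(7), nine);
        return 0;
    }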
May 29 2018
prev sibling parent reply Tony <tonytdominguez aol.com> writes:
On Tuesday, 29 May 2018 at 11:31:53 UTC, Guillaume Piolat wrote:
 On Tuesday, 29 May 2018 at 05:11:27 UTC, Dmitry Olshansky wrote:
 D is probably at the edge of what I can tolerate 
 complexity-wise. And we’ll get to simplify a few things soon I 
 believe.
Within D, there is a bit smaller and cleaner language struggling to get out!
Seems like it could be broken into two languages, one a garbage collected object-oriented language. The other, C with metaprogramming and other "betterC" type stuff.
May 29 2018
parent reply bachmeier <no spam.net> writes:
On Tuesday, 29 May 2018 at 17:40:39 UTC, Tony wrote:
 On Tuesday, 29 May 2018 at 11:31:53 UTC, Guillaume Piolat wrote:
 On Tuesday, 29 May 2018 at 05:11:27 UTC, Dmitry Olshansky 
 wrote:
 D is probably at the edge of what I can tolerate 
 complexity-wise. And we’ll get to simplify a few things soon 
 I believe.
Within D, there is a bit smaller and cleaner language struggling to get out!
Seems like it could be broken into two languages, one a garbage collected object-oriented language. The other, C with metaprogramming and other "betterC" type stuff.
I don't think it's difficult to do that yourself. There's no need to have a formal split. One example is that it's really nice to have the GC available for part of the program and avoid it for another part. @nogc gives you a guarantee. Different variants of the language are a special case of this that is equivalent to annotating the entire program to restrict behavior. That's rarely desirable.
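As a minimal sketch of that guarantee (function names are made up for illustration): the compiler rejects GC allocation inside a @nogc function, while the rest of the program keeps using the GC freely.

    // The @nogc guarantee is enforced at compile time.
    @nogc nothrow void mixAudioBlock(float[] buffer)
    {
        foreach (ref sample; buffer)
            sample *= 0.5f;            // fine: no allocation
        // buffer ~= 0.0f;             // would be an error: ~= may allocate from the GC
    }

    void loadConfig()                  // ordinary code elsewhere can still use the GC
    {
        auto lines = new string[](64); // GC allocation is fine here
        // ...
    }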
May 29 2018
parent reply Tony <tonytdominguez aol.com> writes:
On Tuesday, 29 May 2018 at 20:19:09 UTC, bachmeier wrote:

 I don't think it's difficult to do that yourself. There's no 
 need to have a formal split. One example is that it's really 
 nice to have the GC available for part of the program and avoid 
 it for another part. @nogc gives you a guarantee. Different 
 variants of the language are a special case of this that is 
 equivalent to annotating the entire program to restrict 
 behavior. That's rarely desirable.
What would be an example of a type of application (or maybe that should be "which type of domain" or "which type of developer") where you would want part of it to do garbage collection and the rest of it do not do garbage collection?
May 29 2018
next sibling parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 30/05/2018 8:37 AM, Tony wrote:
 On Tuesday, 29 May 2018 at 20:19:09 UTC, bachmeier wrote:
 
 I don't think it's difficult to do that yourself. There's no need to 
 have a formal split. One example is that it's really nice to have the 
 GC available for part of the program and avoid it for another part. 
 @nogc gives you a guarantee. Different variants of the language are a 
 special case of this that is equivalent to annotating the entire 
 program to restrict behavior. That's rarely desirable.
What would be an example of a type of application (or maybe that should be "which type of domain" or "which type of developer") where you would want part of it to do garbage collection and the rest of it do not do garbage collection?
GUIs, audio systems, language tooling, games; I'm sure somebody can come up with a much longer list.
May 29 2018
parent reply Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Wednesday, May 30, 2018 08:43:33 rikki cattermole via Digitalmars-d 
wrote:
 On 30/05/2018 8:37 AM, Tony wrote:
 On Tuesday, 29 May 2018 at 20:19:09 UTC, bachmeier wrote:
 I don't think it's difficult to do that yourself. There's no need to
 have a formal split. One example is that it's really nice to have the
 GC available for part of the program and avoid it for another part.
 @nogc gives you a guarantee. Different variants of the language are a
 special case of this that is equivalent to annotating the entire
 program to restrict behavior. That's rarely desirable.
What would be an example of a type of application (or maybe that should be "which type of domain" or "which type of developer") where you would want part of it to do garbage collection and the rest of it do not do garbage collection?
GUIs, audio systems, language tooling, games; I'm sure somebody can come up with a much longer list.
Basically, stuff that can't afford to have the GC pause the program for more than a millisecond or two has to be careful with the GC, but your average program is going to be perfectly fine with it, and in many cases, it's just part of the program that can't afford the pause - e.g. a thread for an audio or video pipeline. The rest of the program can likely afford it just fine, but that thread or group of threads has to be at least close to realtime, so it can't use the GC.

It's kind of like @safe vs @system. It's not uncommon for most of your program to be able to be @safe just fine, but certain stuff just can't be. However, that's not necessarily a good reason to make it so that the entire program is @system. So, you make that piece @system and use @trusted appropriately.

With the GC, you typically use it and then turn it off or avoid it in select pieces of code that can't afford it. In most cases, it should not be necessary to avoid it entirely even if you can't afford it in part of your program (though as with pretty much everything, there are bound to be exceptions).

- Jonathan M Davis
May 29 2018
parent Dave Jones <dave jones.com> writes:
On Tuesday, 29 May 2018 at 21:06:52 UTC, Jonathan M Davis wrote:
 On Wednesday, May 30, 2018 08:43:33 rikki cattermole via 
 Digitalmars-d wrote:
 On 30/05/2018 8:37 AM, Tony wrote:
 On Tuesday, 29 May 2018 at 20:19:09 UTC, bachmeier wrote:
 I don't think it's difficult to do that yourself. There's 
 no need to have a formal split. One example is that it's 
 really nice to have the GC available for part of the 
 program and avoid it for another part. @nogc gives you a 
 guarantee. Different variants of the language are a special 
 case of this that is equivalent to annotating the entire 
 program to restrict behavior. That's rarely desirable.
What would be an example of a type of application (or maybe that should be "which type of domain" or "which type of developer") where you would want part of it to do garbage collection and the rest of it do not do garbage collection?
GUIs, audio systems, language tooling, games; I'm sure somebody can come up with a much longer list.
Basically, stuff that can't afford to have the GC pause the program for more than a millisecond or two has to be careful with the GC, but your average program is going to be perfectly fine with it, and in many cases, it's just part of the program that can't afford the pause - e.g. a thread for an audio or video pipeline. The rest of the program can likely afford it just fine, but that thread or group of threads has to be at least close to realtime, so it can't use the GC.
You can't call any code that might take a lock if you're doing real-time audio, so that means no malloc/free either. That's standard practice. You either allocate everything up front, or you do something like I do, which is lock-free queues ferrying things to and from the audio thread as needed. I mean, the point is that needing different memory management for different parts of the program is already a thing with real-time audio; the GC doesn't really change that.
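A rough sketch of that pattern (a toy single-producer/single-consumer ring buffer; the names are made up, and real code would add cache-line padding, a power-of-two capacity and proper failure handling): the audio thread only pops - no locks, no allocation.

    import core.atomic;

    struct SpscQueue(T, size_t capacity)
    {
        private T[capacity] buf;
        private shared size_t head;    // next slot to read (audio thread)
        private shared size_t tail;    // next slot to write (other threads)

        // Producer side: called from the non-realtime thread.
        bool push(T item)
        {
            immutable t = atomicLoad!(MemoryOrder.raw)(tail);
            immutable next = (t + 1) % capacity;
            if (next == atomicLoad!(MemoryOrder.acq)(head))
                return false;          // full; caller retries later, never blocks audio
            buf[t] = item;
            atomicStore!(MemoryOrder.rel)(tail, next);
            return true;
        }

        // Consumer side: called from the audio callback - no locks, no GC.
        bool pop(ref T item)
        {
            immutable h = atomicLoad!(MemoryOrder.raw)(head);
            if (h == atomicLoad!(MemoryOrder.acq)(tail))
                return false;          // empty
            item = buf[h];
            atomicStore!(MemoryOrder.rel)(head, (h + 1) % capacity);
            return true;
        }
    }

Messages (parameter changes, buffers to recycle, and so on) would be preallocated value types pushed from the UI thread and popped inside the audio callback.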
May 29 2018
prev sibling parent Joakim <dlang joakim.fea.st> writes:
On Tuesday, 29 May 2018 at 20:37:38 UTC, Tony wrote:
 On Tuesday, 29 May 2018 at 20:19:09 UTC, bachmeier wrote:

 I don't think it's difficult to do that yourself. There's no 
 need to have a formal split. One example is that it's really 
 nice to have the GC available for part of the program and 
 avoid it for another part. @nogc gives you a guarantee. 
 Different variants of the language are a special case of this 
 that is equivalent to annotating the entire program to 
 restrict behavior. That's rarely desirable.
What would be an example of a type of application (or maybe that should be "which type of domain" or "which type of developer") where you would want part of it to do garbage collection and the rest of it do not do garbage collection?
I believe that's how the Weka guys say they use D for their distributed, parallel filesystem, so you can add it to the list of applications others gave you: https://www.youtube.com/watch?v=RVpaNM-f69s
May 29 2018
prev sibling parent reply Ali <fakeemail example.com> writes:
On Tuesday, 29 May 2018 at 03:56:05 UTC, Dmitry Olshansky wrote:
 It seems C++ is following the road of PL/I, which is growing the 
 language way beyond the point where anyone can understand or 
 implement all of it.
A key line from this paper
  We now have about 150 cooks; that’s not a good way to get a 
 tasty and balanced meal.
I don't think Bjarne is against adding features to C++, or even against constantly adding features; he even admits to supporting some of the features he mentions in his list. I think he is worried about 1. the huge number of features being targeted at once, and 2. the features coming from different independent teams, making them less likely to be coherent. This is very different from "let's not grow C++ ever". D needs to add features constantly, but coherently. D is very far from having 150 cooks; I wish D had that many cooks :)
May 28 2018
parent reply Dave Jones <dave jones.com> writes:
On Tuesday, 29 May 2018 at 05:29:00 UTC, Ali wrote:
 On Tuesday, 29 May 2018 at 03:56:05 UTC, Dmitry Olshansky wrote:
 It seems C++ is following the road of PL/I, which is growing the 
 language way beyond the point where anyone can understand or 
 implement all of it.
A key line from this paper
  We now have about 150 cooks; that’s not a good way to get a 
 tasty and balanced meal.
I don't think Bjarne is against adding features to C++, or even against constantly adding features; he even admits to supporting some of the features he mentions in his list. I think he is worried about 1. the huge number of features being targeted at once, and 2. the features coming from different independent teams, making them less likely to be coherent
Which is ironic considering...

Ken Thompson: "Stroustrup campaigned for years and years and years, way beyond any sort of technical contributions he made to the language, to get it adopted and used. And he sort of ran all the standards committees with a whip and a chair. And he said “no” to no one. He put every feature in that language that ever existed. It wasn't cleanly designed - it was just the union of everything that came along. And I think it suffered drastically from that."

Donald Knuth: "Whenever the C++ language designers had two competing ideas as to how they should solve some problem, they said "OK, we'll do them both". So the language is too baroque for my taste."
May 29 2018
next sibling parent Chameleon <Chameleon Chameleon.com> writes:
On Tuesday, 29 May 2018 at 23:55:07 UTC, Dave Jones wrote:
 Which is ironic considering...

 Ken Thompson: "Stroustrup campaigned for years and years and 
 years, way beyond any sort of technical contributions he made 
 to the language, to get it adopted and used. And he sort of ran 
 all the standards committees with a whip and a chair. And he 
 said “no” to no one. He put every feature in that language that 
 ever existed. It wasn’t cleanly designed—it was just the union 
 of everything that came along. And I think it suffered 
 drastically from that."

 Donald Knuth : "Whenever the C++ language designers had two 
 competing ideas as to how they should solve some problem, they 
 said "OK, we'll do them both". So the language is too baroque 
 for my taste."
Good old Ken and Don are from a generation where you could (typically) understand the whole language. Those times have passed. No really... they have... I'm not kidding... It is now just complete nonsense to expect that one person should be able to understand a modern programming language. At best, they will understand some of it.

These days, it must be about collaboration - which is something D suffers from not having, due to people believing that they should be able to understand it all, and that therefore progress should stop when this is no longer possible. That is essentially a human-ego-driven perspective that holds back progress.

Progress in modern times requires collaboration. People who know and understand parts, connecting and collaborating with people who know and understand other parts. That is the way C++'s design by committee works. It might not be perfect, but it's much better than having a King that you cannot say 'no' to (i.e. the Vasa), or a King that always says 'no' to the people. D needs more collaborators, and fewer kings.
May 29 2018
prev sibling parent Laeeth Isharc <Laeeth laeeth.com> writes:
On Tuesday, 29 May 2018 at 23:55:07 UTC, Dave Jones wrote:
 On Tuesday, 29 May 2018 at 05:29:00 UTC, Ali wrote:
 On Tuesday, 29 May 2018 at 03:56:05 UTC, Dmitry Olshansky 
 wrote:
 It seems C++ is following the road of PL/I, which is growing the 
 language way beyond the point where anyone can understand or 
 implement all of it.
A key line from this paper
  We now have about 150 cooks; that’s not a good way to get a 
 tasty and balanced meal.
I don't think Bjarne is against adding features to C++, or even against constantly adding features; he even admits to supporting some of the features he mentions in his list. I think he is worried about 1. the huge number of features being targeted at once, and 2. the features coming from different independent teams, making them less likely to be coherent
Which is ironic considering... Ken Thompson: "Stroustrup campaigned for years and years and years, way beyond any sort of technical contributions he made to the language, to get it adopted and used. And he sort of ran all the standards committees with a whip and a chair. And he said “no” to no one. He put every feature in that language that ever existed. It wasn't cleanly designed - it was just the union of everything that came along. And I think it suffered drastically from that." Donald Knuth: "Whenever the C++ language designers had two competing ideas as to how they should solve some problem, they said "OK, we'll do them both". So the language is too baroque for my taste."
A dysregulation of caution is more the rule than the exception in modern times. People say yes when they should have said no, and then after the mess becomes evident in reaction to it they say no when they should be saying yes (in response to efforts to clear things up). Viz the response before and after the credit crisis.
Jun 01 2018
prev sibling next sibling parent Arjan <arjan ask.me.to> writes:
On Tuesday, 29 May 2018 at 01:46:47 UTC, Walter Bright wrote:
 A cautionary tale we should all keep in mind.

 http://open-std.org/JTC1/SC22/WG21/docs/papers/2018/p0977r0.pdf

 https://www.reddit.com/r/programming/comments/8mq10v/bjarne_stroustroup_remeber_the_vasa_critique_of/

 https://news.ycombinator.com/item?id=17172057
Hmm reminds me of this Scott Meyers talk: https://www.youtube.com/watch?v=ltCgzYcpFUI&feature=youtu.be
May 29 2018
prev sibling parent reply aberba <karabutaworld gmail.com> writes:
On Tuesday, 29 May 2018 at 01:46:47 UTC, Walter Bright wrote:
 A cautionary tale we should all keep in mind.

 http://open-std.org/JTC1/SC22/WG21/docs/papers/2018/p0977r0.pdf

 https://www.reddit.com/r/programming/comments/8mq10v/bjarne_stroustroup_remeber_the_vasa_critique_of/

 https://news.ycombinator.com/item?id=17172057
I don't know if you guys have realized it, but D is heading in a similar direction too. A lot of the language's features are half-baked, plus parts of the language are only known well by a few people - including why certain things work the way they do. An example would be when to use immutable or const, or in versus const (const scope?)... I'm afraid certain things are being introduced without careful consideration of how they affect or relate to previous features. Too many ways of doing the same thing, with sometimes slight differences, doesn't help the language's future. Too many unfinished masterpieces.
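To make the confusion concrete, a small sketch (my own illustration, nothing authoritative; the function names are invented): const parameters accept both mutable and immutable arguments, immutable is a promise that nobody anywhere mutates the data, and in is essentially const for parameters - its exact meaning (const vs. const scope) has shifted over time, which is part of the problem being described.

    import std.stdio : writeln;

    void printLength(const(char)[] s)   // const: binds to mutable or immutable data
    {
        // s[0] = 'x';                  // error: cannot modify const data
        writeln(s.length);
    }

    void keep(immutable(char)[] s) {}   // immutable: only immutable data allowed

    void greet(in string name)          // `in`: essentially const for parameters
    {
        writeln("hello ", name);
    }

    void main()
    {
        char[] buffer = "abc".dup;      // mutable
        string fixed = "def";           // immutable(char)[]
        printLength(buffer);            // ok: const accepts mutable
        printLength(fixed);             // ok: const accepts immutable
        keep(fixed);                    // ok
        // keep(buffer);                // error: mutable data is not immutable
        greet(fixed);
    }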
Jun 01 2018
parent reply Tony <tonytdominguez aol.com> writes:
With regard to having, say, a GUI written with garbage 
collection, and then needing to have non-garbage collected code 
to process audio, could that not be done with GC D calling C? 
And, if there were a garbage-collected D (D for Applications) and 
a non-GC D (D for Systems Programming), couldn't one be linked 
with the other? And before you say "but it should all be together 
coming out of one compiler" - take a moment to Remember the Vasa!


I don't seriously expect two D-ish compilers, but it does seem to 
make more sense with regard to adding automatic reference 
counting to a language that already has garbage collection, as 
well as working to remove garbage collection from the standard 
library. Presumably at the beginning and for much of D's history, 
garbage collection was a premier selling point, along with OOP.

But with regard to various compile-time stuff and function 
annotations and other things that didn't exist years ago, has 
that resulted in noticeably faster programming and/or noticeably 
higher code quality by those utilizing it?
Jun 01 2018
next sibling parent reply Laeeth Isharc <Laeeth laeeth.com> writes:
On Friday, 1 June 2018 at 18:18:17 UTC, Tony wrote:

 But with regard to various compile-time stuff and function 
 annotations and other things that didn't exist years ago, has 
 that resulted in noticeably faster programming and/or 
 noticeably higher code quality by those utilizing it?
Yes, though you also can't compare a typical programmer from the D world with a typical guy from an enterprisey language world. The D guy I am begging please to consider copying memory just this once, and the guy from a certain different community I am trying to encourage to consider using a profiler.

Anyway, in one little comparison for a market data service, some D code I pulled together was quite a bit faster than the code written in a certain other enterprise language. D was just storing binary blobs and the other one was talking to MongoDB, so it's a totally unfair comparison. But what I wrote quickly in a few weeks, not caring about performance at all and feeling guilty about it, was 2,000x faster than the solution written in a more traditional language that took months to write. And there was almost no code, so with the exception of three hairy lines it was much more readable and clear.

Of course it's unfair to compare two different back ends, but that's also the point - there's a difference in values between different communities. Somebody told me that XYZ company that developed his language are the experts and what do I know, so I will not even think about how it works beneath. The D programmer has a virtue of a different kind (and one must never forget that any virtue, taken to the extreme, and out of balance with other virtues, becomes a vice) - she knows that it's just code, and whilst uncomfortable in the beginning, with persistence one can figure it out. Who the hell do you think you are to write a C compiler? That's still echoing today from the founding moment.

Values are perhaps much more important than features in determining whether one should use a modern, basically sound language. Is it a problem that if you install the latest DMD on Windows or Ubuntu it might not work the first time, and it definitely won't if you are trying to show someone? That very much depends. Are you someone, and do you work with people, who are put off by the first five minutes and indeed repeated encounters with the rough-around-the-edges aspects of D? I was a bit daunted by the bleeding-edge reputation of Arch Linux so I tried everything else, but when I found it I knew I had come home. For my own workstation, not something to use on critical infrastructure of course - though on the whole I would ideally be able to adapt to failure rather than try to make sure the impossible-to-prevent never happens. Nassim Taleb says that something is resilient if it survives stress and antifragile if it benefits from stress and disorder. Well, sometimes it has been a pain in the neck, but I would by far rather deal with regular small infelicities than less frequent big ones.

We are in an age of specialisation and resulting fragmentation - not just in programming but in many other fields too. The edge in life is always shifting. In 1998 I was nervous about moving to DE Shaw as a trader and an older chap with white hair (we were all younger then, so Angus was an anomaly) said don't worry - you have some technical skills that most people won't have for a decade, so if it doesn't work out you will be fine. And today it's the other way around - because I followed my interests I ended up knowing enough about quite a few different things where the package is not that common. And yet there's a value in knowing the whole picture in one mind that can't be substituted for by a committee. And what's true there is true with other skills too.

So the infelicities of D actually serve as a moat to filter for the worthy and a training regime to keep you fit. Taleb talks about checking into an expensive hotel where there is a guy in a suit paying the bellboy to carry his bags upstairs. Then an hour later he sees the same guy in the gym lifting weights using a machine. And he has a point that, to a certain extent, it's possible to use found challenges to stay fit more than people are in the habit of doing today. There's also a breadth found in the community that used to be the norm when I started programming in 1983 but disappeared over the years. In which other community will I have dinner with a naval architect and come away with good ideas that I can steal and put to good use for what I am doing?

Anyway, beyond performance and the specific virtues of programmers from the D community (which may well be vices in an environment that ought to be based on different values), yes, I do think CTFE, introspection, sane templates and code generation make a great deal of difference to code readability. There's much less of it for a start, and it's easy to understand what it's doing, versus a code interface. Microsoft advise using HTML templates for code generation - I thought at first surely an April Fool. And annotations are great - take a look at how Atila Neves uses them, for example. People weren't thrilled by them in the beginning, but they have proven very useful in practice. The Remedy Games talk by Ethan describing their use was very cool.

I think sometimes stress and unhappiness is caused by wanting something or someone to be something one is familiar with rather than what it intrinsically is. If you have problems using D at work, try and figure out a way to solve them. Or work with others to improve things. But it could well be that it's not the right place for it. Maybe it's not technically a fit, but it could well be a question of values - the existing values of the group or the values appropriate to the domain the company is working within. Possibly it could be just that we are all very impatient these days, whereas processes of social change take the time they take.

The economist Brynjolfsson has written about this from the perspective of organisational architecture. The PC was quite available in 1982, and yet in 1987 Solow, a renowned expert on growth, said that "computers are everywhere but in the productivity statistics". A decade later people were talking about the productivity miracle, and that was because of computers. But why did it take so long? Well, it took computers time to mature, and what Andy gaveth, Bill kept on taking away. However, more than anything it was because organisational architecture needed to change to truly benefit from new technologies. And we know how much most people like change - it takes a while as a result.

D is a very ambitious language. Therefore it's not surprising it develops more slowly, but that does not say much about its eventual destiny. Lots of things are insignificant nothings in the beginning, but some of them become very important indeed. I was writing to someone earlier about the revival in US manufacturing that was obvious enough in 2011 based on outlook - perception about compound growth is very strange. Hence the phenomenon of the overnight success that took decades. It wasn't an overnight success - just that people weren't paying attention and only woke up to it when it passed a threshold of perception. So things are as they are, and wishing or pretending otherwise won't make it different.

But in the meantime, the fact that many places have values that get in the way of trying something a committee hasn't approved is a tragic waste of potential for the individual and company involved. But it's also an opportunity, because we are hiring in London and Hong Kong and I've never seen anything like it. I do my whole spiel and then it turns out I needn't have bothered. Excellent programmer. "So I can work with intelligent and virtuous people and write D at work?" Okay. There's an aspect of hyperbole in my telling of it, but not that much.

The main market test of D should be not popularity but whether people are using it to get real work done and whether they find commercial benefits from their choices. Given a little patience, the rest will follow. You only need one leader in each sector to talk about it, and over time a few more will try. There's a process of contagion that's slow in the beginning to human perception and then not - the overnight success actually decades in the making. One can control what one does, but one can't control what others think of it. It's by far better, then, to focus on making D the best language and ecosystem it can be (an intrinsic quality) rather than fretting about popularity.

The broader world is just slow to catch on, but they do catch on eventually, particularly when conditions shift. Growth in data set sizes, meeting CPU manufacturers having some different constraints and things to worry about, plus stagnation in relative memory performance and storage moving to the motherboard. I think recognition of conditions shifting is just a matter of time. I don't think it's a coincidence that Weka, the disruptive storage startup, used D successfully or that performance-wise they utterly dominated their competition. William Gibson said the future is here already, just unevenly distributed. So exotic situations today become quotidian ones tomorrow. Maybe performance mattering just a bit more than it used to is part of that. I don't yet work with big data, but even on middling data a 2,000x performance improvement attained with zero effort or concern about performance - that's okay and does make a real difference.
Jun 01 2018
parent Dave Jones <dave jones.com> writes:
On Friday, 1 June 2018 at 23:10:30 UTC, Laeeth Isharc wrote:
 On Friday, 1 June 2018 at 18:18:17 UTC, Tony wrote:

 Yes, though you also can't compare a typical programmer from 
 the D world with a typical guy from an enterprisey language 
 world.
That was an excellent post.
Jun 02 2018
prev sibling parent reply Bastiaan Veelo <Bastiaan Veelo.net> writes:
On Friday, 1 June 2018 at 18:18:17 UTC, Tony wrote:
 But with regard to various compile-time stuff and function 
 annotations and other things that didn't exist years ago, has 
 that resulted in noticeably faster programming and/or 
 noticeably higher code quality by those utilizing it?
These are exactly the things that enable us to bring a very large code base to D. Not just faster or better, it makes the difference between impossible and possible. And we are engineers needing to solve real-world problems, not CS nerds that find these features merely interesting from a theoretical perspective. Stay tuned for an announcement...
Jun 01 2018
next sibling parent reply drug <drug2004 bk.ru> writes:
On 02.06.2018 03:49, Bastiaan Veelo wrote:
 interesting from a theoretical perspective. Stay tuned for an 
 announcement...
I've been staying for long enough, so let me ask - when will the announcement happen, approximately? ))
Jun 02 2018
parent reply Bastiaan Veelo <Bastiaan Veelo.net> writes:
On Saturday, 2 June 2018 at 09:07:29 UTC, drug wrote:
 On 02.06.2018 03:49, Bastiaan Veelo wrote:
 interesting from a theoretical perspective. Stay tuned for an 
 announcement...
I've been staying for long enough, so let me ask - when will the announcement happen, approximately? ))
Approximately in the coming week :-)
Jun 02 2018
parent reply drug <drug2004 bk.ru> writes:
On 02.06.2018 14:37, Bastiaan Veelo wrote:
 On Saturday, 2 June 2018 at 09:07:29 UTC, drug wrote:
 On 02.06.2018 03:49, Bastiaan Veelo wrote:
 interesting from a theoretical perspective. Stay tuned for an 
 announcement...
I've been staying for long enough, so let me ask - when will the announcement happen, approximately? ))
Approximately in the coming week :-)
That's really great! Thank you. :-)
Jun 02 2018
parent Bastiaan Veelo <Bastiaan Veelo.net> writes:
On Saturday, 2 June 2018 at 12:08:27 UTC, drug wrote:
 On 02.06.2018 14:37, Bastiaan Veelo wrote:
 On Saturday, 2 June 2018 at 09:07:29 UTC, drug wrote:
 On 02.06.2018 03:49, Bastiaan Veelo wrote:
 interesting from a theoretical perspective. Stay tuned for 
 an announcement...
I've been staying for long enough, so let me ask - when will the announcement happen, approximately? ))
Approximately in the coming week :-)
That's really great! Thank you. :-)
At last: https://forum.dlang.org/post/spidbximoadsmdojgonu@forum.dlang.org
Jun 20 2018
prev sibling next sibling parent KingJoffrey <KingJoffrey KingJoffrey.com> writes:
On Saturday, 2 June 2018 at 00:49:04 UTC, Bastiaan Veelo wrote:
 These are exactly the things that enable us to bring a very 
 large code base to D. Not just faster or better, it makes the 
 difference between impossible and possible. And we are 
 engineers needing to solve real-world problems, not CS nerds 
 that find these features merely interesting from a theoretical 
 perspective. Stay tuned for an announcement...
Well, as a real-world engineer needing to solve real-world problems, and 'interested in' bringing large code bases to D, can you tell me why I cannot have an encapsulated class in D, but instead am forced to outsource that encapsulation to the module? When will the impossible become possible?
Jun 02 2018
prev sibling parent reply Basile B. <b2.temp gmx.com> writes:
On Saturday, 2 June 2018 at 00:49:04 UTC, Bastiaan Veelo wrote:
 On Friday, 1 June 2018 at 18:18:17 UTC, Tony wrote:
 But with regard to various compile-time stuff and function 
 annotations and other things that didn't exist years ago, has 
 that resulted in noticeably faster programming and/or 
 noticeably higher code quality by those utilizing it?
These are exactly the things that enable us to bring a very large code base to D. Not just faster or better, it makes the difference between impossible and possible. And we are engineers needing to solve real-world problems, not CS nerds that find these features merely interesting from a theoretical perspective. Stay tuned for an announcement...
Yeah, I'm curious to know which features/aspects could have led you, finally, to choose between languages x, y, or z, for example.
Jun 02 2018
parent I love Ice Cream <IloveIcecream. icecreamsandwhich.com> writes:
When Bjarne and the D community are criticizing your complexity, that's saying something...
Jun 02 2018