
digitalmars.D - OT: C# now has ref and const ref return

reply XavierAP <n3minis-git yahoo.es> writes:
https://docs.microsoft.com/en-us/dotnet/csharp/language-reference/keywords/ref#reference-return-values


only on the high-productivity segment (which Dlang currently neglects relatively), but expanding into higher performance. The

code that does not allocate in the GCed heap. I think D should pay attention.

See also:
https://docs.microsoft.com/en-us/dotnet/csharp/language-reference/keywords/ref#ref-struct-types

Syntactical design-wise they have opted for total explicitness. Everything has to be decorated ref: function declaration, return statement, and local variable declaration at caller scope. (This makes it look at first like it's allowing ref local variable declarations like C++, but I understand -- haven't played with it yet -- that in reality it's only allowed at the left side of calls to ref return methods.) I guess this explicit approach may

design doesn't have templates/heterogeneous generics, but polymorphic/homogeneous.



Aug 06 2019
next sibling parent Simen Kjærås <simen.kjaras gmail.com> writes:
On Tuesday, 6 August 2019 at 09:54:49 UTC, XavierAP wrote:
 (This makes it look at first like it's allowing ref local 
 variable declarations like C++, but I understand -- haven't 
 played with it yet -- that in reality it's only allowed at the 
 left side of calls to ref return methods.)
C# does allow ref local variables (tested, and it works on my machine):

    using System.Diagnostics;

    class C
    {
        static void Main()
        {
            var i = 3;
            ref int ir = ref i;
            ir = 17;
            Debug.Assert(i == 17);
        }
    }

Can't have const references to literal values (e.g. const int& i = 13;) like in C++, though.

--
Simen
Aug 06 2019
prev sibling next sibling parent reply Bert <Bert gmail.com> writes:
On Tuesday, 6 August 2019 at 09:54:49 UTC, XavierAP wrote:
 https://docs.microsoft.com/en-us/dotnet/csharp/language-reference/keywords/ref#reference-return-values


 only on the high-productivity segment (which Dlang currently 
 neglects relatively), but expanding into higher performance. 

 code that does not allocate in the GCed heap. I think D should 
 pay attention

 See also:
 https://docs.microsoft.com/en-us/dotnet/csharp/language-reference/keywords/ref#ref-struct-types

 Syntactical design-wise they have opted for total explicitness. 
 Everything has to be decorated ref: function declaration, 
 return statement, and local variable declaration at caller 
 scope. (This makes it look at first like it's allowing ref 
 local variable declarations like C++, but I understand -- 
 haven't played with it yet -- that in reality it's only allowed 
 at the left side of calls to ref return methods.) I guess this 

 design doesn't have templates/heterogeneous generics, but 
 polymorphic/homogeneous.



Eventually D will not be competitive at all. Simple as that. These other languages are always moving forward in all directions; D inches along in a few. It's just a matter of time. D is about 20 years too late, and the "leadership" is more concerned with conferences and "swag" than getting D competitive. Computation is expanding in all domains. Programming is not just about writing device drivers or controlling relays any more...

Without proper user support tooling D will fail. It's just a matter of time. Just like a biological population, there is a requirement of a certain minimum number of users to sustain D. The D downloads are touted as "Look, the number of downloads is growing, it means D is growing!! D is on the right track!"... but yet 1. the number of humans getting into computing is growing, and 2. the actual users of D are not growing in step. People download D, try it, and realize it sucks and it's not for them. Sure, a few love D; they see the beauty of it and ignore or neglect the other issues.

I'm about 10x as productive in other languages than D. It's not the language itself as much as it is everything around it. It's great for minor things like utilities or simple one-offs, but few in their right mind will use D for anything major (as a serious massive app targeted at the real public). I imagine D will simply become a support language that can be used to write certain things well, and then eventually some other better language with better support will come along and replace D.

That is the direction D is headed, and the leadership refuses to accept it. They keep looking at the pros of D and completely ignore the cons... eventually the cons will catch up and overtake the pros, and then it's just a matter of time. The leadership is myopic, and that's what is killing D... and it's probably too late to save it. They are stuck in the past, and entropy will do what it always does; at some point it will be more trouble to keep D alive than it will be to just move on.

Sure, there will always be a few hangers-on, but that will be the future of D, it seems.
Aug 06 2019
next sibling parent Ernie <Ernie gmail.com> writes:
On Tuesday, 6 August 2019 at 11:59:04 UTC, Bert wrote:
 I'm about 10x as productive in other languages than D.
Then go use those languages. Don't come here and brood like a mofo.
Aug 06 2019
prev sibling next sibling parent reply Suliman <evermind live.ru> writes:
Personally I moved from D to Dart because it looks more like D3. A lot of modern and useful features.
Aug 06 2019
next sibling parent reply GreatSam4sure <greatsam4sure gmail.com> writes:
On Tuesday, 6 August 2019 at 14:25:05 UTC, Suliman wrote:
 Personally I moved from D to Dart because it looks more like D3. A lot of modern and useful features.
You have moved away to Dart; what is still keeping you around? Will you be a gain to the Dart community? Will the D community miss you?
Aug 06 2019
parent Ali <fakeemail example.com> writes:
On Tuesday, 6 August 2019 at 16:50:10 UTC, GreatSam4sure wrote:
 On Tuesday, 6 August 2019 at 14:25:05 UTC, Suliman wrote:
 Personally I moved from D to Dart because it looks more like D3. A lot of modern and useful features.
 You have moved away to Dart; what is still keeping you around? Will you be a gain to the Dart community? Will the D community miss you?
The D language and community, in my humble opinion, seem to attract a lot of beginners looking for the silver bullet. And from afar, D does seem to fit that claim: it has a reputation for strong metaprogramming (programs that write programs, yay, super advanced stuff), and it has Walter Bright and Andrei Alexandrescu, two seasoned uber programmers, working on it. For many it is also a language that is trying to be as powerful as C++, yet simpler.

D seems to attract a lot of beginner or average programmers looking for a secret weapon, a shortcut, a leverage point... that will put them magically in the ranks of the uber programmers. And when they realize it does not, they vent their frustration in the forum and contribute very little to the community... which is very negative for D.
Aug 07 2019
prev sibling parent reply JN <666total wp.pl> writes:
On Tuesday, 6 August 2019 at 14:25:05 UTC, Suliman wrote:
 Personally I moved from D to Dart because it looks more like D3. A lot of modern and useful features.
I love Dart. I think it's one of the cleanest languages around. I love the language and I love the tools surrounding it (IDE support is quite nice). Unfortunately, it lacks value types and it's not really usable outside of web/server context.
Aug 06 2019
parent Petar Kirov [ZombineDev] <petar.p.kirov gmail.com> writes:
On Tuesday, 6 August 2019 at 21:28:26 UTC, JN wrote:
 On Tuesday, 6 August 2019 at 14:25:05 UTC, Suliman wrote:
 Personally I moved from D to Dart because it looks more like D3. A lot of modern and useful features.
I love Dart. I think it's one of the cleanest languages around. I love the language and I love the tools surrounding it (IDE support is quite nice). Unfortunately, it lacks value types and it's not really usable outside of web/server context.
I use Dart daily (at work we're writing a Flutter app) and I hate every moment, as every line reminds me that I could do much better with D. I would gladly give up all of its tooling if Flutter were written in D. Dart's tooling beats D's 10x, but D is at least a 100x better language, IMO. Of course the language is not all, which is why, sadly, I need to stick to Dart for the time being.

On the plus side, Dart's language team seems to have more experience with formal specifications, and I have confidence that NNBD will be a strong language addition.

On the minus side, Dart is very uninspiring and I have to constantly write tons of boilerplate. And use code generation as a separate build step. In isolation, its type system is well-designed, but compared to other languages like TypeScript it is seriously lacking in expressive power. And of course, the runtime performance is shit (but that's not surprising).

In both D and Dart I miss TypeScript's more advanced features:
https://www.typescriptlang.org/docs/handbook/advanced-types.html
Aug 07 2019
prev sibling next sibling parent reply Manu <turkeyman gmail.com> writes:
On Tue, Aug 6, 2019 at 5:01 AM Bert via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Tuesday, 6 August 2019 at 09:54:49 UTC, XavierAP wrote:
 [...]
 Eventually D will not be competitive at all. Simple as that. [...] Sure there will always be a few hangers-on but that will be the future of D it seems.
I feel your sentiment, but I think you're wrong about some outcomes.

D is limited by developer manpower, and it's not clear how to improve that. I've been banging on and on about developer experience for a very long time, and it is actually moving, but it's under-staffed. VisualD is actually a very good tool, but with some gaps. Compared to any other language integration into Visual Studio, I don't know of many experiences better than VisualD. VisualD is one man, and if there were two, I think it would make a world of difference... so progress is happening, just not at an enjoyable rate.

Your end-game, though, I think is a bit unrealistic. I expect that the moment a substantial number of D users feel like things are going in a completely wrong direction, or their progress is being inhibited beyond a threshold that's unacceptable, D will fork. D exists because it's not as bad as you say, and the day it becomes as bad as you say, it will fork. That fork, under new leadership, may see some rapid progress initially, but it's understood that the integrity of the community is a balancing factor against that progress. It's hard to say exactly where that balance is, but I'm fairly confident it exists, and as soon as the threshold is crossed, it will fork. We haven't forked because D is not currently as bad as you say.
Aug 06 2019
parent Ethan <gooberman gmail.com> writes:
On Tuesday, 6 August 2019 at 21:31:21 UTC, Manu wrote:
 It's hard to say exactly where that balance is, but I'm fairly
 confident it exists, and as soon as the threshold is crossed, 
 it will
 fork. We haven't forked because D is not currently as bad as 
 you say.
There are still people who come into IRC asking about Tango. That in itself is a problem.

I'm generally against forking, but I do think making D3 would be a rational option, for two reasons:

1) const functions/methods by default
2) pure by default

Way too much code would need retrofitting in the 2.x specification to just enable those things overnight. And these two things are *VERY* important for the future.
Aug 07 2019
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
The imminent demise of D was confidently predicted 20 years ago, and every year since, yet it keeps growing and getting better. The imminent failure of Zortech C++ was also constantly predicted by pretty much everyone; meanwhile Zortech was pretty successful.
Aug 06 2019
next sibling parent reply Exil <Exil gmall.com> writes:
On Tuesday, 6 August 2019 at 23:55:11 UTC, Walter Bright wrote:
 The imminent demise of D was confidently predicted 20 years 
 ago, and every year since, yet it keeps growing and getting 
 better.
It won't be imminent, but it hasn't reached the point of self-sufficiency. I don't see it reaching the same level of success as other, younger languages, as it stands.
 The imminent failure of Zortech C++ was also constantly 
 predicted by pretty much everyone, meanwhile Zortech was pretty 
 successful.
They weren't wrong; Zortech C++ doesn't exist anymore, while C++ is still going strong.
Aug 06 2019
next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Wednesday, 7 August 2019 at 00:36:19 UTC, Exil wrote:
 On Tuesday, 6 August 2019 at 23:55:11 UTC, Walter Bright wrote:

 The imminent failure of Zortech C++ was also constantly 
 predicted by pretty much everyone, meanwhile Zortech was 
 pretty successful.
They weren't wrong, Zortech C++ doesn't exist anymore. While C++ is still going strong.
Zortech was bought by Symantec and they rebranded the compiler to Symantec C++. It's now Digital Mars C++, still very much alive, and the backend was reused in DMD.
Aug 06 2019
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/6/2019 7:23 PM, Mike Parker wrote:
 On Wednesday, 7 August 2019 at 00:36:19 UTC, Exil wrote:
 On Tuesday, 6 August 2019 at 23:55:11 UTC, Walter Bright wrote:

 The imminent failure of Zortech C++ was also constantly predicted by pretty 
 much everyone, meanwhile Zortech was pretty successful.
They weren't wrong, Zortech C++ doesn't exist anymore. While C++ is still going strong.
Zortech was bought by Symantec and they rebranded the compiler to Symantec C++. It's now Digital Mars C++, still very much alive, and the backend was reused in DMD.
Not only that, Zortech C++ is the reason C++ itself reached the tipping point and became mainstream.
Aug 06 2019
next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 7 August 2019 at 06:26:34 UTC, Walter Bright wrote:
 On 8/6/2019 7:23 PM, Mike Parker wrote:
 On Wednesday, 7 August 2019 at 00:36:19 UTC, Exil wrote:
 On Tuesday, 6 August 2019 at 23:55:11 UTC, Walter Bright 
 wrote:

 The imminent failure of Zortech C++ was also constantly 
 predicted by pretty much everyone, meanwhile Zortech was 
 pretty successful.
They weren't wrong, Zortech C++ doesn't exist anymore. While C++ is still going strong.
Zortech was bought by Symantec and they rebranded the compiler to Symantec C++. It's now Digital Mars C++, still very much alive, and the backend was reused in DMD.
Not only that, Zortech C++ is the reason C++ itself reached the tipping point and became mainstream.
Sorry to say it like this, but Zortech wasn't relevant in Europe, if at all. I got introduced to C++ via Turbo C++ 1.0, and it was all Borland, Microsoft and Metrowerks from there on.

For us early C++ adopters in the Iberian Peninsula, Zortech was only seen in magazine ads.
Aug 07 2019
next sibling parent Ali Çehreli <acehreli yahoo.com> writes:
On 08/07/2019 04:33 AM, Paulo Pinto wrote:

 Sorry to say it like this, but Zortech wasn't relevant in Europe, if at
 all.
One stats point: the only Zortech copy that I've ever seen was on my friend's desk in Turkey. Must have been 1987 or 1988.

Ali
Aug 07 2019
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/7/2019 4:33 AM, Paulo Pinto wrote:
 Not only that, Zortech C++ is the reason C++ itself reached the tipping point 
 and became mainstream.
Sorry to say it like this, but Zortech wasn't relevant in Europe, if at all. I got introduced to C++ via Turbo C++ 1.0 and it was all Borland, Microsoft and Metrowerks from there on.
That came years afterwards. The success of Zortech C++ motivated Borland to drop their project of adding OOP extensions to C and go with C++. (I know the people involved.) The success of Borland C++ motivated Microsoft similarly (I heard at the time that MS was also developing their own OOP C language, but was never able to get confirmation).

But there is little doubt that the sharp upward tilt of C++ happened with the introduction of Zortech C++. ZTC++ was on DOS, where 90% of the programming at the time happened.

Before ZTC++, C++ was a curiosity on unix platforms that was battling neck-and-neck with Objective C. (The comp.lang.c++ and comp.lang.objectivec newsgroups had about the same traffic volume.) After ZTC++ appeared, C++ boomed and ObjC tanked (and was rescued from oblivion by Apple).
Aug 07 2019
next sibling parent reply Maximilian <max_mili outlook.com> writes:
On Thursday, 8 August 2019 at 00:16:22 UTC, Walter Bright wrote:
 Before ZTC++, C++ was a curiosity on unix platforms that was 
 battling neck-and-neck with Objective C. (The comp.lang.c++ and 
 comp.lang.objectivec newsgroups had about the same traffic 
 volume.) After ZTC++ appeared, C++ boomed and ObjC tanked (and 
 was rescued from oblivion by Apple).
Don't take this as disrespectful; I'm just curious about what happened then. I mean, after your compiler was gathering traction as you said, what happened, and why did other compilers succeed? Better UI? Implementation?

Max.
Aug 07 2019
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/7/2019 5:23 PM, Maximilian wrote:
 Don't get this as being disrespectful, I'm just curious in what happened then? 
 I'm mean after your compiler was gather traction as you said, what happened
and 
 why others compilers succeeded? Better UI? Implementation?
Borland / Microsoft poured massive resources into theirs. Borland was able to leverage their Turbo Pascal userbase, and Microsoft was very dominant in developer tools. In spite of that, Zortech still did better and better, eventually being bought out by Symantec.
Aug 07 2019
prev sibling next sibling parent reply Russel Winder <russel winder.org.uk> writes:
On Wed, 2019-08-07 at 17:16 -0700, Walter Bright via Digitalmars-d wrote:
[…]

 Before ZTC++, C++ was a curiosity on unix platforms that was battling
 neck-and-neck with Objective C. (The comp.lang.c++ and comp.lang.objectivec
 newsgroups had about the same traffic volume.) After ZTC++ appeared, C++
 boomed and ObjC tanked (and was rescued from oblivion by Apple).

Being in the UNIX (well, Solaris) world at the time at UCL, C++ was seen as interesting but in need of templates, which it then got (1990) and so became viable, and Objective C was seen (possibly wrongly, but still) as ramming Smalltalk and C together badly, and so not viable. Someone got a few NeXT machines but they got put in a cupboard to gather dust: nice UI paradigm, useless programming platform. Macintosh had already taken hold in the HCI and media communities.

I was teaching C++ to first year, second term (first term they did Scheme (1987 and 1988) or Miranda (1989 onwards)) undergraduates 1987 to 1992 – all dates ±1. We used the Glockenspiel compiler because CFront was an affront, and GCC was not up to it. Students liked it and did very well. After 1990 anyway, when we didn't have to do #define/void* hacks to create data structures but could use templates.

Of course, templates with untyped parameters were an error, still not fixed in C++20 – apparently Concepts got pulled out again.

--
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk
Aug 08 2019
next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Thursday, 8 August 2019 at 10:29:36 UTC, Russel Winder wrote:
 On Wed, 2019-08-07 at 17:16 -0700, Walter Bright via 
 Digitalmars-d wrote: […]
 
 Before ZTC++, C++ was a curiosity on unix platforms that was 
 battling
 neck-and-neck with Objective C. (The comp.lang.c++ and 
 comp.lang.objectivec
 newsgroups had about the same traffic volume.) After ZTC++ 
 appeared, C++
 boomed
 and ObjC tanked (and was rescued from oblivion by Apple).
 Being in the UNIX (well, Solaris) world at the time at UCL, C++ was seen as interesting but in need of templates, which it then got (1990) and so became viable, and Objective C was seen (possibly wrongly, but still) as ramming Smalltalk and C together badly, and so not viable. Someone got a few NeXT machines but they got put in a cupboard to gather dust: nice UI paradigm, useless programming platform. Macintosh had already taken hold in the HCI and media communities.

 I was teaching C++ to first year, second term (first term they did Scheme (1987 and 1988) or Miranda (1989 onwards)) undergraduates 1987 to 1992 – all dates ±1. We used the Glockenspiel compiler because CFront was an affront, and GCC was not up to it. Students liked it and did very well. After 1990 anyway, when we didn't have to do #define/void* hacks to create data structures but could use templates.

 Of course, templates with untyped parameters were an error, still not fixed in C++20 – apparently Concepts got pulled out again.
Concepts are part of ISO C++20. It was contracts that got pulled out during the Cologne meeting due to some last minute discoveries of a few semantic issues.
Aug 08 2019
parent Russel Winder <russel winder.org.uk> writes:
On Thu, 2019-08-08 at 11:04 +0000, Paulo Pinto via Digitalmars-d wrote:
[…]

 Concepts are part of ISO C++20.

Many thanks for correcting my claim. I got my internal wires crossed and have confused myself and possibly others. Good to get it right!

 It was contracts that got pulled out during the Cologne meeting
 due to some last minute discoveries of a few semantic issues.

--
Russel.
===========================================================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk
Aug 08 2019
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/8/2019 3:29 AM, Russel Winder wrote:
 On Wed, 2019-08-07 at 17:16 -0700, Walter Bright via Digitalmars-d wrote:
 […]
 Before ZTC++, C++ was a curiosity on unix platforms that was battling
 neck-and-neck with Objective C. (The comp.lang.c++ and comp.lang.objectivec
 newsgroups had about the same traffic volume.) After ZTC++ appeared, C++
 boomed
 and ObjC tanked (and was rescued from oblivion by Apple).
 Being in the UNIX (well, Solaris) world at the time at UCL, C++ was seen as interesting but in need of templates, which it then got (1990) and so became viable, and Objective C was seen (possibly wrongly, but still) as ramming Smalltalk and C together badly, and so not viable. Someone got a few NeXT machines but they got put in a cupboard to gather dust: nice UI paradigm, useless programming platform. Macintosh had already taken hold in the HCI and media communities.

 I was teaching C++ to first year, second term (first term they did Scheme (1987 and 1988) or Miranda (1989 onwards)) undergraduates 1987 to 1992 – all dates ±1. We used the Glockenspiel compiler because CFront was an affront, and GCC was not up to it. Students liked it and did very well. After 1990 anyway, when we didn't have to do #define/void* hacks to create data structures but could use templates.

 Of course, templates with untyped parameters were an error, still not fixed in C++20 – apparently Concepts got pulled out again.
Templates were a very big deal for C++, but being in the trenches it was clear that C++ was a winner even without them. The difference Zortech C++ made was:

1. it was cheap
2. it was fast
3. it was on DOS, where 90% of the programming action was
4. it was adapted to the DOS near/far memory models

cfront simply fell flat on all 4 points, and only saw use on DOS in Microsoft-only shops. Even Microsoft used ZTC++ internally (COM is Zortech's memory model <g>).
Aug 08 2019
prev sibling next sibling parent reply Exil <Exil gmall.com> writes:
On Thursday, 8 August 2019 at 00:16:22 UTC, Walter Bright wrote:
 On 8/7/2019 4:33 AM, Paulo Pinto wrote:
 Not only that, Zortech C++ is the reason C++ itself reached 
 the tipping point and became mainstream.
Sorry to say it like this, but Zortech wasn't relevant in Europe, if at all. I got introduced to C++ via Turbo C++ 1.0 and it was all Borland, Microsoft and Metrowerks from there on.
That came years afterwards. The success of Zortech C++ motivated Borland to drop their project of adding OOP extensions to C and go with C++. (I know the people involved.) The success of Borland C++ motivated Microsoft similarly (I heard at the time that MS was also developing their own OOP C language, but was never able to get confirmation). But there is little doubt the sharp upward tilt of C++ happened with the introduction of Zortech C++. ZTC++ was on DOS, where 90% of the programming at the time happened. Before ZTC++, C++ was a curiosity on unix platforms that was battling neck-and-neck with Objective C. (The comp.lang.c++ and comp.lang.objectivec newsgroups had about the same traffic volume.) After ZTC++ appeared, C++ boomed and ObjC tanked (and was rescued from oblivion by Apple).
See, "data" means showing the actual statistics. When you say "then ZTC++ appeared and then C++ boomed", it means nothing without the data to back it up. It means even less when coming from someone with a clear bias.
Aug 08 2019
parent reply Ethan <gooberman gmail.com> writes:
On Thursday, 8 August 2019 at 13:14:45 UTC, Exil wrote:
 See "data" means showing the actual statistics when you say 
 "then ZTC++ appeared then C++ boomed", this means nothing 
 without the data to back it up. It means even less when coming 
 from someone with a clear bias.
Surely you know how to use Google. Walter's given you every bit of information you need, and places to look, to verify his claims for yourself.

You know what's cheaper than research? Abject cynicism. Stop it.
Aug 08 2019
parent reply Exil <Exil gmall.com> writes:
On Thursday, 8 August 2019 at 15:17:07 UTC, Ethan wrote:
 On Thursday, 8 August 2019 at 13:14:45 UTC, Exil wrote:
 See "data" means showing the actual statistics when you say 
 "then ZTC++ appeared then C++ boomed", this means nothing 
 without the data to back it up. It means even less when coming 
 from someone with a clear bias.
Surely you know how to use Google. Walter's given every bit of information you need and place to look at to verify his claims for yourself. You know what's cheaper than research? Abject cynicism. Stop it.
I'm not the one making claims here. He made claims; I'm merely asking him to back them up, or they're just meaningless lies.

Anyway, not sure why you think I didn't search. What I found is that ZTC++ is barely ever mentioned. It is old, though, and things do get lost with time.
Aug 09 2019
next sibling parent bachmeier <no spam.net> writes:
On Friday, 9 August 2019 at 18:50:19 UTC, Exil wrote:
 On Thursday, 8 August 2019 at 15:17:07 UTC, Ethan wrote:
 On Thursday, 8 August 2019 at 13:14:45 UTC, Exil wrote:
 See "data" means showing the actual statistics when you say 
 "then ZTC++ appeared then C++ boomed", this means nothing 
 without the data to back it up. It means even less when 
 coming from someone with a clear bias.
Surely you know how to use Google. Walter's given every bit of information you need and place to look at to verify his claims for yourself. You know what's cheaper than research? Abject cynicism. Stop it.
I'm not the one making claims here. He made claims, I'm merely asking him to back them up, or they're just meaningless lies.
Please reserve this kind of stuff for Slashdot. He told the story the way he recalls it, and he owes you nothing. If you're going to call him a liar, you need to provide evidence to back up your claim.

Here's Bjarne Stroustrup's version: "Until June 1988 all C++ compilers on PCs were Cfront ports. Then Zortech started shipping their compiler. The appearance of Walter Bright's compiler made C++ 'real' for many PC-oriented people for the first time. More conservative people reserved their judgement until the Borland C++ compiler in May 1990 or even Microsoft's C++ compiler in March 1992."
Aug 09 2019
prev sibling parent reply Ethan <gooberman gmail.com> writes:
On Friday, 9 August 2019 at 18:50:19 UTC, Exil wrote:
 I'm not the one making claims here. He made claims, I'm merely 
 asking him to back them up, or they're just meaningless lies.
Are you five years old? Or is there some other rational reason why you have zero idea how the world works? Walter has history and the computing industry of the 20th century on his side. The onus is not on him to prove history is a lie. The onus is on you to prove you're not a mouthy child.
 Anyways not sure why you think I didn't search? What I found is 
 that ZTC++ is barely ever mentioned. It is old though, and 
 things do get lost with time.
Now who's making claims without data. All of Usenet is still online and searchable. The people and companies involved are still around or have left interviews and other articles, either searchable online or at your state library archives. I suppose "anthropology" is too big a word to understand. Look it up and come back here in a few years time.
Aug 09 2019
parent reply Laurent =?UTF-8?B?VHLDqWd1aWVy?= <laurent.treguier.sink gmail.com> writes:
On Friday, 9 August 2019 at 20:06:36 UTC, Ethan wrote:
 On Friday, 9 August 2019 at 18:50:19 UTC, Exil wrote:
 I'm not the one making claims here. He made claims, I'm merely 
 asking him to back them up, or they're just meaningless lies.
Are you five years old? Or is there some other rational reason why you have zero idea how the world works?
If you call people "five years old" just because they doubt claims without linked backing data, you don't know how the world of arguing actually works. If you make claims, then you have the burden of proof, not the one who doubts it.
Aug 09 2019
parent reply Ethan <gooberman gmail.com> writes:
On Friday, 9 August 2019 at 20:30:04 UTC, Laurent Tréguier wrote:
 On Friday, 9 August 2019 at 20:06:36 UTC, Ethan wrote:
 Are you five years old?

 Or is there some other rational reason why you have zero idea 
 how the world works?
If you call people "five years old" just because they doubt claims without linked backing data, you don't know how the world of arguing actually works. If you make claims, then you have the burden of proof, not the one who doubts it.
Well I see "irony" escapes you. Notice how in a single post I suggested he is immature - a potentially baseless claim - and demanded he prove otherwise. If you do not see the wry humour there and how that relates to his behavior towards Walter, well, there's no other point to make here so either you get it or you don't.
Aug 09 2019
parent reply Laurent =?UTF-8?B?VHLDqWd1aWVy?= <laurent.treguier.sink gmail.com> writes:
On Friday, 9 August 2019 at 20:34:58 UTC, Ethan wrote:
 Well I see "irony" escapes you.

 Notice how in a single post I suggested he is immature - a 
 potentially baseless claim - and demanded he prove otherwise.

 If you do not see the wry humour there and how that relates to 
 his behavior towards Walter, well, there's no other point to 
 make here so either you get it or you don't.
I'm sorry, but I still fail to see how any of what you say here can be qualified as "humor".
Aug 09 2019
parent Ethan <gooberman gmail.com> writes:
On Friday, 9 August 2019 at 20:40:14 UTC, Laurent Tréguier wrote:
 I'm sorry, but I still fail to see how any of what you say here 
 can be qualified as "humor".
Cool?
Aug 09 2019
prev sibling parent reply Murilo <murilomiranda92 hotmail.com> writes:
On Thursday, 8 August 2019 at 00:16:22 UTC, Walter Bright wrote:
 On 8/7/2019 4:33 AM, Paulo Pinto wrote:
 Not only that, Zortech C++ is the reason C++ itself reached
Hi Mr. Bright, I would like to talk to you about D, do you have an e-mail account?
Aug 08 2019
parent matheus <matheus gmail.com> writes:
On Friday, 9 August 2019 at 02:41:21 UTC, Murilo wrote:
 On Thursday, 8 August 2019 at 00:16:22 UTC, Walter Bright wrote:
 On 8/7/2019 4:33 AM, Paulo Pinto wrote:
 Not only that, Zortech C++ is the reason C++ itself reached
Hi Mr. Bright, I would like to talk to you about D, do you have an e-mail account?
You get his e-mail here: https://www.walterbright.com/ at the top there is a button: "Send email to Walter Bright". Matheus.
Aug 08 2019
prev sibling parent reply Exil <Exil gmall.com> writes:
On Wednesday, 7 August 2019 at 06:26:34 UTC, Walter Bright wrote:
 On 8/6/2019 7:23 PM, Mike Parker wrote:
 On Wednesday, 7 August 2019 at 00:36:19 UTC, Exil wrote:
 On Tuesday, 6 August 2019 at 23:55:11 UTC, Walter Bright 
 wrote:

 The imminent failure of Zortech C++ was also constantly 
 predicted by pretty much everyone, meanwhile Zortech was 
 pretty successful.
They weren't wrong, Zortech C++ doesn't exist anymore. While C++ is still going strong.
Zortech was bought by Symantec and they rebranded the compiler to Symantec C++. It's now Digital Mars C++, still very much alive, and the backend was reused in DMD.
Not only that, Zortech C++ is the reason C++ itself reached the tipping point and became mainstream.
I mean anyone can make claims without data proving it. Even without Zortech C++, C++ would have gotten to the point it is at now.
Aug 07 2019
parent Walter Bright <newshound2 digitalmars.com> writes:
On 8/7/2019 9:12 AM, Exil wrote:
 I mean anyone can make claims without data proving it. Even 
 without Zortech C++, C++ would have gotten to the point it is 
 at now.
See my other followup post.
Aug 07 2019
prev sibling parent reply Exil <Exil gmall.com> writes:
On Wednesday, 7 August 2019 at 02:23:17 UTC, Mike Parker wrote:
 On Wednesday, 7 August 2019 at 00:36:19 UTC, Exil wrote:
 On Tuesday, 6 August 2019 at 23:55:11 UTC, Walter Bright wrote:

 The imminent failure of Zortech C++ was also constantly 
 predicted by pretty much everyone, meanwhile Zortech was 
 pretty successful.
They weren't wrong, Zortech C++ doesn't exist anymore. While C++ is still going strong.
Zortech was bought by Symantec and they rebranded the compiler to Symantec C++. It's now Digital Mars C++, still very much alive, and the backend was reused in DMD.
Right, DMC is all but dead; it's only used in D for Windows releases, which causes it to have a plethora of bugs because it's dead and no one is actively developing it. That backend is the reason people look to LDC.
Aug 07 2019
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/7/2019 9:17 AM, Exil wrote:
 Right DMC is all but dead,
Mainly because I stopped working on it to do D full time. Although I remain the only person to have implemented a soup-to-nuts C++ compiler, simultaneously doing two languages proved beyond me :-) Just to be clear, without the DMC++ back end, D would never have happened. For example, Windows 64 support became critical a few years ago. GDC and LDC were quite inadequate on it (they've since improved greatly) and we would have been severely negatively impacted if I hadn't upgraded the backend for Win64. It's a great strength of D that we have 3 well-supported compilers, DMD, LDC and GDC.
Aug 07 2019
parent reply Exil <Exil gmall.com> writes:
On Thursday, 8 August 2019 at 00:27:03 UTC, Walter Bright wrote:
 On 8/7/2019 9:17 AM, Exil wrote:
 Right DMC is all but dead,
Mainly because I stopped working on it to do D full time. Although I remain the only person to have implemented a soup-to-nuts C++ compiler, simultaneously doing two languages proved beyond me :-) Just to be clear, without the DMC++ back end, D would never have happened. For example, Windows 64 support became critical a few years ago. GDC and LDC were quite inadequate on it (they've since improved greatly) and we would have been severely negatively impacted if I hadn't upgraded the backend for Win64. It's a great strength of D that we have 3 well-supported compilers, DMD, LDC and GDC.
That's simply because of the backend you chose, and ultimately it is the limiting factor now. LDC originally attempted to just implement their own frontend. Now it is basically what DMD should have been. I don't ever expect DMD to get ARM support, or cross compiling capability. The amount of work needed just isn't worth it, especially when there's a project that takes care of that for you. It seems like the decision is based on some kind of ego thing (as seems to keep being demonstrated) rather than a rational process. So why continue to use an old dead project in your current active project?
Aug 08 2019
parent reply Atila Neves <atila.neves gmail.com> writes:
On Thursday, 8 August 2019 at 13:03:17 UTC, Exil wrote:
 On Thursday, 8 August 2019 at 00:27:03 UTC, Walter Bright wrote:
 [...]
That's simply because of the backend you chose, and ultimately it is the limiting factor now. LDC originally attempted to just implement their own frontend. Now it is basically what DMD should have been. I don't ever expect DMD to get ARM support, or cross compiling capability. The amount of work needed just isn't worth it, especially when there's a project that takes care of that for you. It seems like the decision is based on some kind of ego thing (as seems to keep being demonstrated) rather than a rational process. So why continue to use an old dead project in your current active project?
I like the DMC++ backend, because it runs faster than the alternatives. If I actually need code that runs as fast as possible I'll use ldc, but that hardly ever happens. Compilation takes too long as it is, I don't want to wait for LLVM as well.
Aug 08 2019
next sibling parent reply a11e99z <black80 bk.ru> writes:
On Thursday, 8 August 2019 at 18:22:56 UTC, Atila Neves wrote:
 On Thursday, 8 August 2019 at 13:03:17 UTC, Exil wrote:
 On Thursday, 8 August 2019 at 00:27:03 UTC, Walter Bright 
 wrote:
 [...]
That's simply because of the backend you chose, and ultimately it is the limiting factor now. LDC originally attempted to just implement their own frontend. Now it is basically what DMD should have been. I don't ever expect DMD to get ARM support, or cross compiling capability. The amount of work needed just isn't worth it, especially when there's a project that takes care of that for you. It seems like the decision is based on some kind of ego thing (as seems to keep being demonstrated) rather than a rational process. So why continue to use an old dead project in your current active project?
I like the DMC++ backend, because it runs faster than the alternatives. If I actually need code that runs as fast as possible I'll use ldc, but that hardly ever happens. Compilation takes too long as it is, I don't want to wait for LLVM as well.
idk how LDC is working right now but u have many options to compile D modules with LLVM:

- u can generate IR for CTFE for each func in module and run it concurrently by JIT while generating IR for next funcs.
- u don't need optimize CTFE code with 60+ opt passes, maybe 0-10, cuz this code only for generating final source/IR for module. BUT u still need do best optimization for compiling final/Release App/user code (non CTFE). and no need optimize Debug final code. need to say that Rust-ers complains on LLVM long compilation too.
- u can interoperate with any generated LLVM-IR modules generated by other langs (Swift/ObjC/C++/Rust/..) at IR level not at arch ABI.
- C++ 2x will add dynamic compilation that already supported by LLVM.
- LLVM supports exceptions, coroutines, GC, intrinsics, int128, modules, metadata, debuggers at IR level too.
- LLVM supports many archs: bare-metal that u wanted with betterC, WASM that can be useful to D as native and compatible (WASM wants add exceptions, GC).

other users of LLVM add different Passes/Filters to LLVM for parsing/optimizations and D can use it for free. someday LLVM users will speed up generating/compilation time.

probably D is better compatible with LLVM than with home DMC++. probably DMD is the only user of DMC++.

good projects for all of us such as LLVM should be supported by using/fixing them, not by reinventing own bicycle. imo if D switches to LLVM, then we will all win.

PS I want to remind and am waiting on https://forum.dlang.org/post/lcrsrevmvtnbqsqktsbe forum.dlang.org
Aug 08 2019
next sibling parent a11e99z <black80 bk.ru> writes:
On Thursday, 8 August 2019 at 19:39:14 UTC, a11e99z wrote:
 On Thursday, 8 August 2019 at 18:22:56 UTC, Atila Neves wrote:
 On Thursday, 8 August 2019 at 13:03:17 UTC, Exil wrote:
 On Thursday, 8 August 2019 at 00:27:03 UTC, Walter Bright 
 wrote:
about DPP:
 dpp is a compiler wrapper that will parse a D source file with 
 the .dpp extension and expand in place any #include directives 
 it encounters, translating all of the C or C++ symbols to D, 
 and then pass the result to a D compiler (DMD by default)
with transition D to LLVM probably u can interoperate with C++ sources through Clang at AST level (also D structs should be equal C++ structs/classes with full inheritance as D:Cpp or Cpp:D. classes are still garbage collected). imo easy(0-cost) interop D and C++ is most useful feature for D future than betterC. the last one will deprecate after such interop.
Aug 08 2019
prev sibling parent drug <drug2004 bk.ru> writes:
8/8/19 10:39 PM, a11e99z wrote:
 idk how to LDC working right now but u have many options to compile D 
 modules with LLVM:
No offence, but did you read Atila's post thoroughly? He said that the dmc backend has a great advantage over gdc and ldc - it is faster. For desktop application development this is an important feature.
Aug 09 2019
prev sibling next sibling parent drug <drug2004 bk.ru> writes:
https://forum.dlang.org/post/lcrsrevmvtnbqsqktsbe forum.dlang.org

This is a really interesting question whose answer I'd like to 
hear. Hope it will be something other than 42.
Aug 09 2019
prev sibling parent Exil <Exil gmall.com> writes:
On Thursday, 8 August 2019 at 18:22:56 UTC, Atila Neves wrote:
 On Thursday, 8 August 2019 at 13:03:17 UTC, Exil wrote:
 On Thursday, 8 August 2019 at 00:27:03 UTC, Walter Bright 
 wrote:
 [...]
That's simply because of the backend you chose, and ultimately it is the limiting factor now. LDC originally attempted to just implement their own frontend. Now it is basically what DMD should have been. I don't ever expect DMD to get ARM support, or cross compiling capability. The amount of work needed just isn't worth it, especially when there's a project that takes care of that for you. It seems like the decision is based on some kind of ego thing (as seems to keep being demonstrated) rather than a rational process. So why continue to use an old dead project in your current active project?
I like the DMC++ backend, because it runs faster than the alternatives. If I actually need code that runs as fast as possible I'll use ldc, but that hardly ever happens. Compilation takes too long as it is, I don't want to wait for LLVM as well.
There's nothing stopping you from writing a faster, less-optimizing backend for LLVM. You easily get access to everything else that LLVM offers.
Aug 09 2019
prev sibling parent reply Ethan <gooberman gmail.com> writes:
On Wednesday, 7 August 2019 at 00:36:19 UTC, Exil wrote:
 It won't be imminent but it hasn't reached the point of self 
 sufficiency. I don't see it reaching the same level of success 
 as other younger languages, as it stands.
So I guess this point needs stating again. Why did Swift get picked up? Because Apple. Why did Dart and Go get picked up? Because Google. Why did Rust get picked up? Because Mozilla. Why are people still making posts like this about D? Because languages thrive when there are large corporations in play, and no corporation has committed to D like those behemoths mentioned above. Many of the problems I've highlighted in D/DMD lately will get solved naturally when human- and financial- resources are addressed. But you've got a chicken and egg situation there.
Aug 07 2019
parent XavierAP <n3minis-git yahoo.es> writes:
On Wednesday, 7 August 2019 at 10:25:47 UTC, Ethan wrote:


 Why did Swift get picked up? Because Apple.

 Why did Dart and Go get picked up? Because Google.

 Why did Rust get picked up? Because Mozilla.
Python? <-- CWI.nl Python? <-- CNRI.reston.va.us Indeed we need a sugar daddy :_)
Aug 07 2019
prev sibling parent bachmeier <no spam.net> writes:
On Tuesday, 6 August 2019 at 23:55:11 UTC, Walter Bright wrote:
 The imminent demise of D was confidently predicted 20 years 
 ago, and every year since, yet it keeps growing and getting 
 better. The imminent failure of Zortech C++ was also constantly 
 predicted by pretty much everyone, meanwhile Zortech was pretty 
 successful.
One of the popular hobbies of programmers is to predict the death of languages because there's another language they like better. C++ is dead. Java is dead. Clojure is dead. Scala is dead. C is dead. Perl is dead. Ruby is dead. R is dead. Python is dead. Or so I've been told. As long as the D compiler continues to work, I'll be fine.
Aug 06 2019
prev sibling parent reply Ethan <gooberman gmail.com> writes:
On Tuesday, 6 August 2019 at 11:59:04 UTC, Bert wrote:
 I'm about 10x as productive in other languages than D. It's not 
 the language itself as it is everything around it. It's great 
 for minor things like utilities or simple one offs but few in 
 the right mind will use D for anything major(as a serious 
 massive app targeted as the real public).
So this is a good point and a bad point rolled in to one. Manu has mentioned VisualD, and his points are spot on. It's good, but I keep hitting rough edges. Associative arrays inside templated types don't display in the debugger for me, for example. But being productive in other languages. Well. This serious massive app targeted at the real public that I'm working on has the native component written in D. Server and client backends; and a runtime component that will be linked in to your program. The only native UI framework that gives me the performance and quality I need is WPF. Anyone who thinks they're more productive in C# than in D only needs to take a look at my codebase to see just how much boilerplate my D code needs to generate for my C# layer. Working in C# quickly reminds me that generics were an afterthought and I often have to stop using generics and resort to writing D code that generates it instead. C# is too slow in a game for runtime operations. return ref is good and all that, but there's still boxing and unboxing of even basic types that happens. Not to mention that to talk to any game codebase (outside of Unity games) requires marshalling and unmarshalling. I tried using libpng in D last year; the two (TWO) separate packages on dub didn't compile under the then-current DMD version. One of the packages was updated earlier this year, but it's too late, I'm using Adam Ruppe's PNG handler. That D has no quality control on what is a *CRITICAL* library these days is kinda insane. So. My point - writing code in D is easy, and my project would not be where it is right now were it not for the deep introspection and metaprogramming techniques I'm using. Integrating D code in to the wider ecosystem, however, is where things get tricky. Atila's point is spot on - it needs to be as simple as #include.
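The marshalling cost mentioned above can be made concrete with a small sketch. This is an illustrative aside, not code from the project: Python's `struct` module stands in for the managed side, and the `{ int32_t id; float x, y; }` layout is a made-up example of a native game-side type.

```python
import struct

# Hypothetical native type: struct { int32_t id; float x, y; },
# little-endian, no padding -> format "<iff", 12 bytes total.
layout = struct.Struct("<iff")
assert layout.size == 12

# Marshalling: re-encode managed values into the native memory layout.
blob = layout.pack(7, 1.5, -2.0)

# Unmarshalling: decode the native bytes back into managed values.
ident, x, y = layout.unpack(blob)
print(ident, x, y)  # 7 1.5 -2.0
```

Every value crossing the boundary pays this re-encoding in both directions, which is exactly why chatty managed/native interfaces hurt inside a game loop.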
Aug 07 2019
parent reply Bert <Bert gmail.com> writes:
On Wednesday, 7 August 2019 at 10:45:44 UTC, Ethan wrote:
 On Tuesday, 6 August 2019 at 11:59:04 UTC, Bert wrote:
 I'm about 10x as productive in other languages than D. It's 
 not the language itself as it is everything around it. It's 
 great for minor things like utilities or simple one offs but 
 few in the right mind will use D for anything major(as a 
 serious massive app targeted as the real public).
So this is a good point and a bad point rolled in to one. Manu has mentioned VisualD, and his points are spot on. It's good, but I keep hitting rough edges. Associative arrays inside templated types don't display in the debugger for me, for example.
I run in to all kinds of issues with Visual D constantly, it is not nearly as bad as it was but it has issues. It's one guy that has a life trying just to maintain it, at some point it too goes by the wayside.
 But being productive in other languages. Well.
Depends on what one defines as productive.
 This serious massive app targeted at the real public that I'm 
 working on has the native component written in D. Server and 
 client backends; and a runtime component that will be linked in 
 to your program.
But you are one person who has decided to use D in that regard, and funny how you didn't strictly do it in D alone! That actually says a lot about D.
 The only native UI framework that gives me the performance and 
 quality I need is WPF.
Oh, but wait, we have dlang-ui, nuclear-d, well, here: https://wiki.dlang.org/GUI_Libraries I just wonder what the glass ceiling is?

 Anyone who thinks they're more productive in C# than in D only 
 needs to take a look at my codebase to see just how much 
 boilerplate my D code needs to generate for my C# layer. 
 Working in C# quickly reminds me that generics were an 
 afterthought and I often have to stop using generics and resort 
 to writing D code that generates it instead.
I moved from C# to D precisely because of some of these limitations. It was the generics that were limiting (things have changed somewhat by now though)... I thought D's meta programming might handle it, and it does. C# is not as powerful a language as D, and the meta programming is superior, but EVERYTHING (virtually) else is a fail! I think the only other thing was generics, plus the performance for high widget counts in wpf and the pain it was giving me (I can't remember the details but it killed my app even using their recommendations).

 game for runtime operations. return ref is good and all that, 
 but there's still boxing and unboxing of even basic types that 
 happens. Not to mention that to talk to any game codebase 
 (outside of Unity games) requires marshalling and unmarshalling.
Yes, this is an issue with the managed code, but for many people it's an acceptable trade-off; C# is amazingly performant given what it does and some of the features it provides.

 tried using libpng in D last year, the two (TWO) separate 
 packages on dub didn't compile under the then-current DMD 
 version. One of the packages was updated earlier this year, but 
 it's too late, I'm using Adam Ruppe's PNG handler. That D has 
 no quality control on what is a *CRITICAL* library these days 
 is kinda insane.
Yes, but this sort of thing in D is all over the place. Most libraries are defunct. Adam's library is good because it works, but I find its structure quite annoying. It's a hodge podge of code and he tends to write in large files. I had to modify some of his simpledisplay to get some functionality I wanted for windows, but maintaining it with any updates he pushes is a mess. I did it as a simple project and it worked well enough, but to go back and find all my changes to push to him or fork would be a pain and not worth it. What it means is that I have something that works for my purposes, but because I don't want to put in the time to make it better I'm stuck. C#'s libraries are written very cleanly, organized, great docs, etc... It's a pipe dream, but imagine if C# were the language and it was native! That would be heaven to me. It would be a programming environment that I enjoy, that when I go to program code I feel good about it because I know I'm very productive and I'm not constantly running in to pot holes that distract me from my main goal. These things wear on a person after a while. Going down a long dirt road to a very nice mansion is great the first few times but after your suspension goes, and your car is full of dirt, it starts to be a problem.
 So. My point - writing code in D is easy, and my project would 
 not be where it is right now were it not for the deep 
 introspection and metaprogramming techniques I'm using. 
 Integrating D code in to the wider ecosystem, however, is where 
 things get tricky.
Writing code in D, for the most part, is easy and great. Getting that code to work is not as easy, but integrating it into the ecosystem is the problem... and that is why you are mixing it with wpf... and that should be a serious flag of "WHY!!!!!!!!!!!!" and the reason is simple: Walter doesn't give a shit about having proper gui, audio, video, or anything else. For him these things are not relevant so he will not push for them. He says "Someone wrote a library for that"... yeah, but that library doesn't work or is not maintained... or has deprecated features. .NET is "complete": basically anything you want to do, you can do it; it has a few things that are not, but it's almost entirely there. I think wpf is quite bloated, but even so it still functions quite well after one learns the ins and outs. I don't like xaml programming, but at the end of the day it does exactly what it says and there are no major problems, and I can always go at it programmatically, so I get to choose how much of each I want to go with. I like choices! C# does certain things so much better that D could learn from it, but my feeling is that the leadership is not interested in making D compete with these languages. They are happy where it's at. That is a problem for me because I'm not happy where it's at, and it means another 5 years of the same thing, and then 10, and then 20. I want my programming life to get easier, not stay the same. I want to be able to enjoy programming while being productive. It's like having a Ferrari that is always breaking down... at some point it becomes an issue, the coolness factor fades, and at some point you want to give it away to the next sucker. D can be competitive with these other platforms, but I see no real interest in the community to make it so. It's more like it wants to carve out its own niche.
I will continue to use D for odd ball stuff such as simple utilities that I write to increase my own productivity, and maybe a few simulations but I will not use it to write any more apps, it's just too much trouble.
Aug 07 2019
next sibling parent Adam D. Ruppe <destructionator gmail.com> writes:
On Wednesday, 7 August 2019 at 12:36:53 UTC, Bert wrote:
 Adams library is good because it works but I find it's 
 structure quite annoying. It's a hodge podge of code and he 
 tends to write in large files. I had to modify some of his 
 simpledisplay to get some functionality I wanted for windows 
 but maintaining it with any updates he pushes is a mess. I did 
 it as a simple project but it worked well enough, but to go 
 back and find all my changes to push to him or fork would be a 
 pain and not worth it.
You don't really have to find the files yourself, if you send me a copy there's a good chance I can extract the diff for you. Though it'll be even more of a hodge podge of code lol
Aug 07 2019
prev sibling parent reply Ethan <gooberman gmail.com> writes:
On Wednesday, 7 August 2019 at 12:36:53 UTC, Bert wrote:
 But you are one person who has decided to use D in that regard, 
 and funny how you didn't strictly do it in D alone! That 
 actually says a lot about D.
Actually, it says a lot more about modern software development. I don't care what language a library was written in. I 100% don't. It's irrelevant. All I care about is if it has a C/C++ ABI that I can hook in to. .NET assemblies can be understood in any .NET language. The approach I've taken with Binderoo is to make Binderoo libraries generate an interface for any supported language. Similar approach to .NET there, self describing libraries that don't rely on the language it was originally written in. Name mangling is a well understood and defined thing on any modern platform. It's feasible that one could write a .h/.di/.cs/etc definition generator given any arbitrary compiled .dll/.so/.dylib. There's no reason a WPF-quality framework couldn't be written in D. But why bother if I can just bind to one? Take my point one step further: Why has no one written a WPF-quality visual framework in C or C++? Or, if they have, why is it not more well known and in wider use? (When I say WPF quality. Understand that its standard widgets can be customised in a HTML-style manner. All I need is an easily customisable layout engine that's native. HTML5 is not performant enough for my needs yet.)
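The "all I care about is a C ABI I can hook in to" workflow can be sketched in a few lines. This is a hedged illustration, not Binderoo itself: libc's `labs` and `strlen` stand in for any compiled library, and the signature table plays the role of the generated .h/.di/.cs definition.

```python
import ctypes
import ctypes.util

# Load the C runtime; find_library is the usual POSIX lookup, with
# CDLL(None) (the already-linked process image) as a fallback.
libc = ctypes.CDLL(ctypes.util.find_library("c") or None)

# A minimal "generated definition": unmangled symbol name -> signature.
# C symbols need only their name; a C++ symbol would need its exact
# mangled form instead, which is why a generator must know the mangling.
DEFS = {
    "labs":   ([ctypes.c_long], ctypes.c_long),
    "strlen": ([ctypes.c_char_p], ctypes.c_size_t),
}

def bind(lib, defs):
    """Attach argument/return types and hand back the bound functions."""
    bound = {}
    for name, (argtypes, restype) in defs.items():
        fn = getattr(lib, name)
        fn.argtypes, fn.restype = argtypes, restype
        bound[name] = fn
    return bound

api = bind(libc, DEFS)
print(api["labs"](-7))        # 7
print(api["strlen"](b"abi"))  # 3
```

The same table-driven shape works from any language with an FFI, which is the point: once the ABI and symbol names are known, the implementation language of the library stops mattering.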
 .NET is "complete": basically anything you want to do, you can 
 do it; it has a few things that are not, but it's almost 
 entirely there.
I've often said around these parts that the .NET runtime is the real star, not C#; C# is just a language that expects your compiler to target .NET. We have LDC. Someone sitting down and starting up D.NET and integrating it in to the .NET Core ecosystem would be fantastic. Then you could write D code in the .NET environment and get access to the .NET runtime.
Aug 07 2019
next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 7 August 2019 at 13:54:46 UTC, Ethan wrote:
 On Wednesday, 7 August 2019 at 12:36:53 UTC, Bert wrote:
 ....

 Take my point one step further: Why has no one written a 
 WPF-quality visual framework in C or C++? Or, if they have, why 
 is it not more well known and in wider use?

 ...
They have. Long before Qt was a thing, C++ Builder with VCL. It never picked up much steam, because Borland went astray and the company that owns it nowadays is happier to sell to corporations with deep pockets instead of indie devs. Then naturally Qt; QML is in fact kind of inspired by XAML, just with a JavaScript-based language instead of XAML. Then there is UWP, first with C++/CX, now being rebranded as WinUI and being implemented in C++17 with the C++/WinRT framework. Qt is not in wider use due to religious C++ devs being against moc, and many want free beer for their tools; since Qt wants to get paid, those devs don't like its dual licensing scheme. As for UWP, it is Windows 10 only, and many devs don't want to compromise on that.
Aug 07 2019
prev sibling next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 7 August 2019 at 13:54:46 UTC, Ethan wrote:
 [snip]

 We have LDC. Someone sitting down and starting up D.NET and 
 integrating it in to the .NET Core ecosystem would be 
 fantastic. Then you could write D code in the .NET environment 
 and get access to the .NET runtime.
I think Laeeth had done some very preliminary work calling .net code [1]. I don't know anything about the status though. [1] https://github.com/symmetryinvestments/dotnetcore-d
Aug 07 2019
parent reply Ethan <gooberman gmail.com> writes:
On Wednesday, 7 August 2019 at 15:13:50 UTC, jmh530 wrote:
 I think Laeeth had done some very preliminary work calling .net 
 code [1]. I don't know anything about the status though.

 [1] https://github.com/symmetryinvestments/dotnetcore-d
Yes, I already do this in Binderoo. It's a critical piece of infrastructure for the program I talk of. But simply calling .NET code is not what I mean. I mean making D a first-class language in the .NET ecosystem by providing a front-end to the .NET compiler.
Aug 07 2019
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Wednesday, 7 August 2019 at 15:18:00 UTC, Ethan wrote:
 [snip]

 Yes, I already do this in Binderoo. It's a critical piece of 
 infrastructure for the program I talk of.

 But simply calling .NET code is not what I mean. I mean making 
 D a first-class language in the .NET ecosystem by providing a 
 front-end to the .NET compiler.
Ah, ok. I watched the Dconf video on Binderoo a while back, but I don't think I really groked it at the time. The documentation doesn't look like it has been updated since then either. I had no idea it worked with .net.
Aug 07 2019
parent Ethan Watson <gooberman gmail.com> writes:
On Wednesday, 7 August 2019 at 17:35:05 UTC, jmh530 wrote:
 I watched the Dconf video on Binderoo a while back, but I don't 
 think I really groked it at the time. The documentation doesn't 
 look like it has been updated since then either. I had no idea 
 it worked with .net.
I plan on relaunching it at some point in the reasonably near future. Needless to say, I've mainly focused on the .NET side of things since I left Remedy.
Aug 07 2019
prev sibling parent JN <666total wp.pl> writes:
On Wednesday, 7 August 2019 at 15:18:00 UTC, Ethan wrote:
 But simply calling .NET code is not what I mean. I mean making 
 D a first-class language in the .NET ecosystem by providing a 
 front-end to the .NET compiler.
There was actually a project for that long, long time ago: https://github.com/tim-m89/dnet https://archive.codeplex.com/?p=dnet
Aug 07 2019
prev sibling parent reply a11e99z <black80 bk.ru> writes:
On Wednesday, 7 August 2019 at 13:54:46 UTC, Ethan wrote:
 On Wednesday, 7 August 2019 at 12:36:53 UTC, Bert wrote:
 We have LDC. Someone sitting down and starting up D.NET and 
 integrating it in to the .NET Core ecosystem would be 
 fantastic. Then you could write D code in the .NET environment 
 and get access to the .NET runtime.
see https://en.wikipedia.org/wiki/Nemerle .NET with metaprogramming.
see .NET Native as CoreRT.
see compile scripts and expressions as lambdas - runtime metaprogramming with native speed.
but D.NET is not needed. imo needed D++ (we have LDC)

// next is simple D module interop.d
//===================================================
codeC {
    #define SOME_MACRO /* */
    #include <stdio.h>
    struct A { .. }
    printf( "%d", a.fld );
}

// D code
// D structs support C/C++ inheritance with or without vtbl
// I mean diamond inheritance as C++ does
// D struct can be created at stack or C++ heap only (or member of D class)
struct D : A {
    void meth() {
        // can easy call any(?) C/C++ code without codeC(pp) blocks
        // or see as Terra interoperate with Lua
    }
}

codeCpp {
    #ifdef SOME_MACRO
    // classes in C++ is same C/C++ struct but only default private inheritance and members
    // allocated in stack or C++ heap cuz GC has undefined finalization order
    // for D code is same as struct { private: }
    class C : virtual public D { }
}

// D classes as now - allocated by GC or Scoped, cannot inherits structs
class ClassD {
    // usual D class
}

// just my comment in code:
// so f*ck alias this and add full C/C++ struct supports
// full C++ interop allow use D as system language as betterCpp
//===================================================

Clang has full C++ compiler that generates LLVM-IR. LDC generates LLVM-IR too. they can live together. Rust generates LLVM-IR too.
need some universal TypeInfo that supports any kind of vtbl/types/descrs, something like Type Scheme for better world - u can program some module in lang that u think is more suitable.
C++ 2z will add dynamic compilation, LLVM supports it already.
D++ will have full C++ standard for ever. No need any wrappers to C/C++ for ever. U can use any of billion libs as is.

a lil problems:
- C++ uses std::string in most times. should D++ use it too? in any case D should use some nonGC-string (for betterC too)
  OT: imo any OSes and langs should use totally interoperable String class (something like improved BSTR) - many interop troubles will disappear.
- using C++ templates in D style.
- hmm.. probably no more problems.
Aug 07 2019
parent reply xenon325 <anm programmer.net> writes:
On Wednesday, 7 August 2019 at 16:58:46 UTC, a11e99z wrote:
 On Wednesday, 7 August 2019 at 13:54:46 UTC, Ethan wrote:
 On Wednesday, 7 August 2019 at 12:36:53 UTC, Bert wrote:
 We have LDC. Someone sitting down and starting up D.NET and 
 integrating it in to the .NET Core ecosystem would be 
 fantastic. Then you could write D code in the .NET environment 
 and get access to the .NET runtime.
 [...] imo needed D++ (we have LDC)

 // next is simple D module interop.d
 //===================================================
 codeC {
     #define SOME_MACRO /* */
     #include <stdio.h>
     struct A { .. }
     printf( "%d", a.fld );
 }
 [...]
JFYI, there is https://wiki.dlang.org/Calypso, which basically 
does what you've described.

Have no idea what it's capable of; the latest commit is from 26 
Mar 2018.

--
Alexander
Aug 09 2019
parent Thomas Brix Larsen <brix brix-verden.dk> writes:
On Friday, 9 August 2019 at 14:54:55 UTC, xenon325 wrote:
 JFYI, there is https://wiki.dlang.org/Calypso, which basically 
 does what you've described. Have no idea what it's capable of, 
 latest commit is on 26 Mar 2018.

 --
 Alexander
Actually, the latest commit was on 9 Aug 2019, on the 
death-to-ident-lookups-2019 branch.
Aug 09 2019
prev sibling parent SashaGreat <s g.com> writes:
One thing that bothers me is the fear of breaking code and 
perpetuating bad design.

C++'s motto is exactly this: they don't break code and they keep 
backwards compatibility.

Some will say this motto was the reason for the success of C++. 
I don't know; I think the real reason was compatibility with C, 
plus OO back in the day, since the next contender was Java (I'm 
talking about the 90's).

But look at how horrible C++ is; a lot of developers say this too.

Imagine, there is a topic on Bjarne's website explaining: "Should 
I put "const" before or after the type?"

Some of Bjarne's answers:

"I put it before, but that's a matter of taste. "const T" and "T 
const" were - and are - (both) allowed and equivalent. For 
example:
	const int a = 1;	// ok
	int const b = 2;	// also ok"

"Why? When I invented "const" (initially named "readonly" and had 
a corresponding "writeonly"), I allowed it to go before or after 
the type because I could do so without ambiguity. Pre-standard C 
and C++ imposed few (if any) ordering rules on specifiers.

I don't remember any deep thoughts or involved discussions about 
the order at the time."


Unfortunately that persists to this day.

Rust is trying another approach: every 2~3 years they will revise 
their design and, if needed, they will indeed break things.

We will see how this will end.

I think D, with a much smaller user base than C/C++/Java, should 
be guided by better design even if it breaks code. Yes, it's 
hard, but I'd prefer a nice and clean language over another C++.

Maybe D3?

Sasha.
Aug 07 2019