
digitalmars.D - Better branding of -betterC

reply Dibyendu Majumdar <d.majumdar gmail.com> writes:
Hi

It seems to me that a number of other new programming languages 
are basically an attempt to mimic D without the GC. I am thinking 
of 'Jai' by Jonathan Blow and 'Zig' by Andrew Kelley.

I was wondering if it is worthwhile branding -betterC differently - 
e.g. use a brand such as 'micro-D' or some nicer name. That is, 
give it a new identity that highlights that it is not just a better C, 
but a D version without the GC.

Regards
Dibyendu
Oct 29 2020
next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Thursday, 29 October 2020 at 11:50:12 UTC, Dibyendu Majumdar 
wrote:
 Hi

 It seems to me that a number of other new programming languages 
 are basically an attempt to mimic D without the GC. I am 
 thinking of 'Jai' by Jonathan Blow and 'Zig' by Andrew Kelley.

 I was wondering if it is worthwhile branding -betterC differently 
 - e.g. use a brand such as 'micro-D' or some nicer name. That 
 is, give it a new identity that highlights that it is not just a 
 better C, but a D version without the GC.

 Regards
 Dibyendu
D-lite
Oct 29 2020
prev sibling next sibling parent reply IGotD- <nise nise.com> writes:
On Thursday, 29 October 2020 at 11:50:12 UTC, Dibyendu Majumdar 
wrote:
 Hi

 It seems to me that a number of other new programming languages 
 are basically an attempt to mimic D without the GC. I am 
 thinking of 'Jai' by Jonathan Blow and 'Zig' by Andrew Kelley.

 I was wondering if it is worthwhile branding -betterC differently 
 - e.g. use a brand such as 'micro-D' or some nicer name. That 
 is, give it a new identity that highlights that it is not just a 
 better C, but a D version without the GC.

 Regards
 Dibyendu
I would rather rebrand D itself, because it is difficult to search for D-related material in search engines. D is de facto "dlang", because you cannot usefully search for "D". I was wondering whether D should rename itself to "Duck", since duck typing is common in D programming. However, searching for ducks on the internet will mostly get you results about ducks, the bird, so you would need to search for "ducklang" just as you now need to search for "dlang".
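The duck-typing point can be made concrete: D templates accept any type that structurally provides the expected members, with no interface declaration needed. A minimal sketch (the `Counter` type and `sum` helper are invented for illustration):

```d
import std.range : isInputRange;

// Counter never declares an interface; it just has empty/front/popFront,
// so it "quacks like" an input range.
struct Counter {
    int n;
    bool empty() const { return n >= 3; }
    int front() const { return n; }
    void popFront() { ++n; }
}

static assert(isInputRange!Counter);

// Accepts anything that quacks like an input range of ints.
int sum(R)(R r) if (isInputRange!R) {
    int total;
    foreach (x; r) total += x;
    return total;
}

void main() {
    assert(sum(Counter(0)) == 3); // 0 + 1 + 2
}
```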
Oct 29 2020
next sibling parent Anonymouse <zorael gmail.com> writes:
On Thursday, 29 October 2020 at 11:58:41 UTC, IGotD- wrote:
 I rather rebrand D because it is difficult to search for D 
 related stuff in search engines. D is de facto "dlang" because 
 you cannot search for "D".
This used to be more true than it is today. Certainly searching only for "D" will give you wrong results because of the overwhelming ambiguity, but so will "C" (tested on Google). "d standard library", "d tutorials", "d syntax", "d std algorithm", "d hello world" etc. all give correct/relevant results. Naturally you can find examples where they don't, like "d string", but my point stands. Branding is fine; adoption is hard.
Oct 29 2020
prev sibling parent reply matheus <matheus gmail.com> writes:
On Thursday, 29 October 2020 at 11:58:41 UTC, IGotD- wrote:
 I rather rebrand D because it is difficult to search for D 
 related stuff in search engines. D is de facto "dlang" because 
 you cannot search for "D".
But then you type "dlang" and the search engine suggests "golang". On Thursday, 29 October 2020 at 11:52:03 UTC, jmh530 wrote:
 ...
 D-lite
This is indeed nice, and it sounds like "delight". But I would go without the "-", just "Dlite". Now, a name that I think is a bit weird when I hear it is "DasBetterC". To me this sounds more like a German company than a feature of a language. Finally, a name that I really don't like is the library name "Phobos". I still prefer the old "Tango" a thousand times over. Matheus.
Oct 29 2020
parent reply IGotD- <nise nise.com> writes:
On Thursday, 29 October 2020 at 13:49:01 UTC, matheus wrote:
 Now a name that I think it's a bit weird when I heard is: 
 "DasBetterC". For me this sounds more like a Germany company 
 than a feature of a language.
DasBetterC sounds horrible, but I understand it is something of a running joke in the D community. Not sure how it all started.
Oct 29 2020
parent Mike Parker <aldacron gmail.com> writes:
On Thursday, 29 October 2020 at 14:15:46 UTC, IGotD- wrote:

 DasBetterC sounds horrible but I understand it is like a 
 running joke in the D community. Not sure how it all started.
DConf 2018 in Germany. Walter's keynote was titled, "D as Better C". The DasBetterC thing hit him as he was looking at the title slide.
Oct 29 2020
prev sibling next sibling parent reply rikki cattermole <rikki cattermole.co.nz> writes:
There are two names for what you are suggesting.

Runtime-less D, and pay-as-you-go.

These are being worked on over time. They are basically the same thing, 
use as much of druntime as you want (i.e. some, or none).

-betterC fills its role, and doesn't need rebranding.
Oct 29 2020
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 10/29/2020 5:00 AM, rikki cattermole wrote:
 Runtime less D and pay-as-you-go.
 
 These are being worked on over time. They are basically the same thing, use as 
 much of druntime as you want (i.e. some, or none).
 
 -betterC fills its role, and doesn't need rebranding.
Future efforts with the library will be for so-called "header only" libraries, where one just imports the library module, and there's no need to link the library too.
Oct 30 2020
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 10/30/20 9:55 PM, Walter Bright wrote:
 On 10/29/2020 5:00 AM, rikki cattermole wrote:
 Runtime less D and pay-as-you-go.

 These are being worked on over time. They are basically the same 
 thing, use as much of druntime as you want (i.e. some, or none).

 -betterC fills its role, and doesn't need rebranding.
Future efforts with the library will be for so-called "header only" libraries, where one just imports the library module, and there's no need to link the library too.
"Source only", as "header only" is a C++ term (no headers in D).
Oct 31 2020
prev sibling next sibling parent reply Abdulhaq <alynch4047 gmail.com> writes:
On Thursday, 29 October 2020 at 11:50:12 UTC, Dibyendu Majumdar 
wrote:
 Hi

 It seems to me that a number of other new programming languages 
 are basically an attempt to mimic D without the GC. I am 
 thinking of 'Jai' by Jonathan Blow, 'Zig' by Andrew Kelly.

 I was wondering if it worthwhile branding -betterC differently 
 - e.g. use a brand such as 'micro-D' or some nicer name. That 
 is, give it a new identity that highlights that it not just 
 better C  - but a D version without GC.

 Regards
 Dibyendu
I'm pretty sure that Jai is not mimicking D, also I doubt that Zig is either.
Oct 29 2020
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 10/29/2020 5:48 AM, Abdulhaq wrote:
 I'm pretty sure that Jai is not mimicking D, also I doubt that Zig is either.
D popularized CTFE, and other languages followed suit, including Jai.
Oct 30 2020
next sibling parent Abdulhaq <alynch4047 gmail.com> writes:
On Saturday, 31 October 2020 at 01:57:19 UTC, Walter Bright wrote:
 On 10/29/2020 5:48 AM, Abdulhaq wrote:
 I'm pretty sure that Jai is not mimicking D, also I doubt that 
 Zig is either.
D popularized CTFE, and other languages followed suit, including Jai.
Yes I agree, I was being picky about the wording. Jai et al. have taken features from D and adopted them, but Jai is definitely not _mimicking_ D, in the sense of trying to be like D in general.
Oct 31 2020
prev sibling next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 31 October 2020 at 01:57:19 UTC, Walter Bright wrote:
 On 10/29/2020 5:48 AM, Abdulhaq wrote:
 I'm pretty sure that Jai is not mimicking D, also I doubt that 
 Zig is either.
D popularized CTFE, and other languages followed suit, including Jai.
Sorry, but that flag belongs to Lisp and Dylan macros, Java compiler plugins, Java/.NET manipulation of attributes/annotations, and C++ template meta-programming. D's CTFE definitely has very important value, but not everything that other languages adopt was created by D.
Oct 31 2020
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Saturday, 31 October 2020 at 16:22:32 UTC, Paulo Pinto wrote:
 On Saturday, 31 October 2020 at 01:57:19 UTC, Walter Bright 
 wrote:
 On 10/29/2020 5:48 AM, Abdulhaq wrote:
 I'm pretty sure that Jai is not mimicking D, also I doubt 
 that Zig is either.
D popularized CTFE, and other languages followed suit, including Jai.
Sorry but that flag belongs to Lisp and Dylan macros,
If CTFE means that you use the regular language, then I guess dynamic languages would be the first, but then you may not have a clear separation between compilation and execution either. Of course, there are many languages that do some of it, e.g. flexible type systems with generic unification/logic programming.

Why most language designers don't go all the way with the full language probably has a lot to do with:

1. debugging is difficult
2. compilation may not terminate
3. compilation will be slower
4. it is seen as an implementation optimization and not a language feature

But then again, computers are faster, have more memory, and can spend more resources on static analysis, so CTFE is perhaps primarily a consequence of technology, i.e. time.
Oct 31 2020
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 10/31/2020 9:22 AM, Paulo Pinto wrote:
 On Saturday, 31 October 2020 at 01:57:19 UTC, Walter Bright wrote:
 On 10/29/2020 5:48 AM, Abdulhaq wrote:
 I'm pretty sure that Jai is not mimicking D, also I doubt that Zig is either.
D popularized CTFE, and other languages followed suit, including Jai.
Sorry but that flag belongs to Lisp and Dylan macros, Java compiler plugins, Java/.NET manipulation of attributes/annotations
Those are not natively compiled languages, and the compiler is part of the runtime.
 and C++ template meta-programing.
Having implemented a full C++ compiler, I don't agree:

1. it was discovered as a side effect, not designed
2. it does not do iteration
3. it only does integers - not floating point, not strings, not pointers
4. it cannot allocate memory
5. it is incredibly limited
6. it cannot call or execute a single C++ function
7. C++ template metaprograms are limited to trivial ones, due to fundamental problems with it

and, most tellingly,

8. C++ has gone on to copy D's CTFE
 D CTFE has definitely a very important value, but not everything that other 
 languages adopt was created by D.
I did say popularized, not created. To round this out a bit, the C preprocessor can do compile-time computation, too, but comparing it to D's CTFE is like comparing the pre-existing electric arc lamp to Edison's incandescent bulb and saying the bulb wasn't revolutionary.
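To make the contrast concrete: in D, any ordinary function becomes compile-time computation simply by being used in a static context - no special sublanguage. A minimal sketch (the `sumSquares` function is invented for illustration):

```d
// An ordinary function: loops, locals, nothing special about it.
int sumSquares(int n) {
    int total;
    foreach (i; 1 .. n + 1)
        total += i * i;
    return total;
}

// Used in a static context, it is evaluated by the compiler (CTFE)...
enum atCompileTime = sumSquares(10);
static assert(atCompileTime == 385);

// ...and the very same function still works at run time.
void main() {
    assert(sumSquares(10) == atCompileTime);
}
```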
Oct 31 2020
next sibling parent reply Max Samukha <maxsamukha gmail.com> writes:
On Sunday, 1 November 2020 at 01:26:28 UTC, Walter Bright wrote:
 On 10/31/2020 9:22 AM, Paulo Pinto wrote:
 On Saturday, 31 October 2020 at 01:57:19 UTC, Walter Bright 
 wrote:
 Those are not natively compiled languages, and the compiler is 
 part of the runtime.
How is that relevant? There are natively compiled Lisps. Also, to compile D at runtime, you need a D compiler as part of the runtime environment. If you insist on singling out one language that popularized CTFE, Lisp would be a much fairer choice.
Nov 01 2020
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/1/2020 12:46 AM, Max Samukha wrote:
 How is that relevant? There are natively compiled Lisps. Also, to compile D at 
 runtime, you need a D compiler as part of the runtime environment. If you
insist 
 on singling out one language that popularized CTFE, Lisp would be a much
fairer 
 choice.
It's been said that eventually all languages tend to adopt all the features of Lisp. But that doesn't mean that Lisp popularized those features. After all, Lisp is one of the original programming languages, and if it popularized CTFE then C, C++, Pascal, etc., would have had it long ago.
Nov 02 2020
next sibling parent user1234 <user1234 12.de> writes:
On Tuesday, 3 November 2020 at 05:26:51 UTC, Walter Bright wrote:
 On 11/1/2020 12:46 AM, Max Samukha wrote:
 How is that relevant? There are natively compiled Lisps. Also, 
 to compile D at runtime, you need a D compiler as part of the 
 runtime environment. If you insist on singling out one 
 language that popularized CTFE, Lisp would be a much fairer 
 choice.
It's been said that eventually all languages tend to adopt all the features of Lisp. But that doesn't mean that Lisp popularized those features. After all, Lisp is one of the original programming languages, and if it popularized CTFE then C, C++, Pascal, etc., would have had it long ago.
Well, CTFE in my opinion is mostly useful as part of metaprogramming and code generation, which explains the case of Pascal (and its modern derivatives): it does not need CTFE because it cannot do compile-time introspection, which I'd say is one of the pillars of metaprogramming (the other pillars being templates and CTFE itself). Without CT reflection, CTFE would be just a bit more useful than constant folding.
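The interplay described here - introspection feeding CTFE feeding code generation - looks like this in practice. A minimal sketch (the `Point` struct and `makeGetters` helper are invented for illustration):

```d
import std.traits : FieldNameTuple;

struct Point { int x; int y; }

// CTFE builds a code string from compile-time reflection...
string makeGetters(T)() {
    string code;
    foreach (name; FieldNameTuple!T)
        code ~= "auto get_" ~ name ~ "(" ~ T.stringof ~ " t) { return t." ~ name ~ "; }\n";
    return code;
}

// ...and mixin compiles it into ordinary functions.
mixin(makeGetters!Point());

void main() {
    auto p = Point(3, 4);
    assert(get_x(p) == 3);
    assert(get_y(p) == 4);
}
```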
Nov 02 2020
prev sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Tuesday, 3 November 2020 at 05:26:51 UTC, Walter Bright wrote:

 It's been said that eventually all languages tend to adopt all 
 the features of Lisp. But that doesn't mean that Lisp 
 popularized those features. After all, Lisp is one of the 
 original programming languages, and if it popularized CTFE then 
 C, C++, Pascal, etc., would have had it long ago.
Ok, you have a point.
Nov 04 2020
prev sibling next sibling parent Abdulhaq <alynch4047 gmail.com> writes:
On Sunday, 1 November 2020 at 01:26:28 UTC, Walter Bright wrote:
 On 10/31/2020 9:22 AM, Paulo Pinto wrote:
 On Saturday, 31 October 2020 at 01:57:19 UTC, Walter Bright 
 wrote:
 On 10/29/2020 5:48 AM, Abdulhaq wrote:
 I'm pretty sure that Jai is not mimicking D, also I doubt 
 that Zig is either.
D popularized CTFE, and other languages followed suit, including Jai.
Sorry but that flag belongs to Lisp and Dylan macros, Java compiler plugins, Java/.NET manipulation of attributes/annotations
Those are not natively compiled languages, and the compiler is part of the runtime.
 and C++ template meta-programing.
Having implemented a full C++ compiler, I don't agree:

1. it was discovered as a side effect, not designed
2. it does not do iteration
3. it only does integers - not floating point, not strings, not pointers
4. it cannot allocate memory
5. it is incredibly limited
6. it cannot call or execute a single C++ function
7. C++ template metaprograms are limited to trivial ones, due to fundamental problems with it

and, most tellingly,

8. C++ has gone on to copy D's CTFE
 D CTFE has definitely a very important value, but not 
 everything that other languages adopt was created by D.
I did say popularized, not created. To round this out a bit, the C preprocessor can do compile-time computation, too, but comparing it to D's CTFE is like comparing the pre-existing electric arc lamp to Edison's incandescent bulb and saying the bulb wasn't revolutionary.
I respect both Paulo's and Walter's opinions, but speaking as a single data point I think Walter is right here: D has been very influential regarding CTFE in the C universe. I say that as someone for whom Lisp was one of the first languages I learnt, sitting as a teenager in the library, together with PL/1! I later managed to get my hands on an actual physical computer and progressed (ha!) to the dizzy heights of BASIC and Z80 assembly.

Seeing as I'm on the topic: 68000 was a blessing, 1980s C++ was a curse, Delphi was a blessing, MFC was a curse, dBase was, well...; now we have Python, C++11, D, modern Java, all good; DHTML, node.js, CSS, argh, please no. What next, we ask ourselves...
Nov 01 2020
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Sunday, 1 November 2020 at 01:26:28 UTC, Walter Bright wrote:
 On 10/31/2020 9:22 AM, Paulo Pinto wrote:
 On Saturday, 31 October 2020 at 01:57:19 UTC, Walter Bright 
 wrote:
 On 10/29/2020 5:48 AM, Abdulhaq wrote:
 I'm pretty sure that Jai is not mimicking D, also I doubt 
 that Zig is either.
D popularized CTFE, and other languages followed suit, including Jai.
Sorry but that flag belongs to Lisp and Dylan macros, Java compiler plugins, Java/.NET manipulation of attributes/annotations
Those are not natively compiled languages, and the compiler is part of the runtime.
Sure they are; one just needs to make use of the existing AOT compilers, including for Java and .NET. Maybe it is time to actually learn about those platforms? The Java world is not the same as during the Visual Cafe days. Secondly, those features don't require shipping the compiler in the runtime.
 and C++ template meta-programing.
Having implemented a full C++ compiler, I don't agree: 1. it was discovered as a side effect, not designed 2. it does not do iteration 3. it only does integers - not floating point, not strings, not pointers 4. it cannot allocate memory 5. it is incredibly limited 6. it cannot call or execute a single C++ function 7. C++ template metaprograms are limited to trivial ones, due to fundamental problems with it and, most tellingly, 8. C++ has gone on to copy D's CTFE
 D CTFE has definitely a very important value, but not 
 everything that other languages adopt was created by D.
I did say popularized, not created. To round this out a bit, the C preprocessor can do compile time computation, too, but to compare it to D's CTFE is like comparing the pre-existing electric arc lamp to to Edison's incandescent bulb, and saying the bulb wasn't revolutionary.
You implemented a C++ compiler back in the old ISO C++98 days, and many of your posts regarding C++ seem to be out of date with what ISO C++20 looks like. I usually see these recurring statements that D popularized something, which anyone with an IEEE or ACM/SIGPLAN account can easily find to be otherwise. This only does a disservice to D's credibility among those with CS knowledge of programming language design, one of my majors, hence why I tend to get sidetracked into who did what. -- Paulo
Nov 01 2020
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 1 November 2020 at 09:58:39 UTC, Paulo Pinto wrote:
 I usually see some of this recurring statements that D 
 popularized something, that anyone with IEEE or ACM/SIGPLAN 
 account can easily find otherwise.
Well, yes, D and Rust are not popular, so they cannot popularize anything. But it is true that some of the designers of system-oriented languages that came after D have looked at D's feature set for inspiration. I believe that applies to Go, Nim, Zig, Jai… What they took away from it, who knows? And some C++ proposals have been informed by D, e.g. the one on ranges. And some of D's features were first proposed for C++, even though D implemented them first. A very tangled web.
Nov 01 2020
parent Jacob Carlborg <doob me.com> writes:
On 2020-11-01 11:25, Ola Fosheim Grøstad wrote:

 But it is true that some of the designers of system-oriented languages 
 that came after D has looked at D's feature set for inspiration. I 
 believe that applies to Go, Nim, Zig, Jai… What they took away from it, 
 who knows?
As far as I know, there's only one language that has taken a feature from D and gave it credit, that's Swift. The feature is how it handles special symbols like __FILE__ and __LINE__, when they're used as default arguments [1]. [1] https://developer.apple.com/swift/blog/?id=15 -- /Jacob Carlborg
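The Swift-credited D idiom is that `__FILE__`/`__LINE__` used as default arguments expand at the call site rather than at the definition site. A minimal sketch (the `CallSite`/`here` names are invented for illustration):

```d
struct CallSite { string file; size_t line; }

// Defaulted __FILE__/__LINE__ are filled in from the caller's location,
// not from where this function is declared.
CallSite here(string file = __FILE__, size_t line = __LINE__) {
    return CallSite(file, line);
}

void main() {
    auto a = here();   // reports this line...
    auto b = here();   // ...and this one, one line below
    assert(a.file == b.file);
    assert(b.line == a.line + 1);
}
```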
Nov 01 2020
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
I've written a Java compiler, too.

If you're saying C++ did CTFE before D, bring it on. Show us!
Nov 02 2020
prev sibling parent reply J. V. <jv jv.com> writes:
On Saturday, 31 October 2020 at 16:22:32 UTC, Paulo Pinto wrote:
 On Saturday, 31 October 2020 at 01:57:19 UTC, Walter Bright 
 wrote:
 On 10/29/2020 5:48 AM, Abdulhaq wrote:
 I'm pretty sure that Jai is not mimicking D, also I doubt 
 that Zig is either.
D popularized CTFE, and other languages followed suit, including Jai.
Sorry but that flag belongs to Lisp and Dylan macros, Java compiler plugins, Java/.NET manipulation of attributes/annotations and C++ template meta-programing. D CTFE has definitely a very important value, but not everything that other languages adopt was created by D.
Those are not even close to having the same capabilities as CTFE, nor do they work in the same way. Reading your posts/responses, it looks like you are one of those guys that tries so hard to look smart and ends up writing a lot of wrong information and throwing out random references.
Nov 04 2020
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 4 November 2020 at 13:06:58 UTC, J. V. wrote:
 On Saturday, 31 October 2020 at 16:22:32 UTC, Paulo Pinto wrote:
 On Saturday, 31 October 2020 at 01:57:19 UTC, Walter Bright 
 wrote:
 On 10/29/2020 5:48 AM, Abdulhaq wrote:
 I'm pretty sure that Jai is not mimicking D, also I doubt 
 that Zig is either.
D popularized CTFE, and other languages followed suit, including Jai.
Sorry but that flag belongs to Lisp and Dylan macros, Java compiler plugins, Java/.NET manipulation of attributes/annotations and C++ template meta-programing. D CTFE has definitely a very important value, but not everything that other languages adopt was created by D.
not even close to having the same capabilities as CTFE, nor do they work in the same way. Reading your posts/responses, it looks like you are one of those guys that tries so hard to look smart and ends up writing a lot of wrong information and throwing out random references.
I am one of those guys that knows those features do the job for the large majority of cases, even though they aren't even close to having the same capabilities as CTFE. I am also one of those guys that still bothers to advocate D when the opportunity comes, but take it as you will.
Nov 04 2020
parent reply ryuukk_ <ryuukk_ gmail.com> writes:
BetterC has many flaws:

- you have to be explicit in your code with -betterC, @nogc, etc. 
etc. etc.; it's ugly and annoying

- the GC is still predominant everywhere (AAs, arrays, strings, 
libraries)

I have been testing Zig since last month, and let me tell you, it 
is what I wanted D to be.

No GC at all! Custom-allocator friendly.

I don't mind having the GC in D when I just want to quickly 
prototype something, or write a CLI tool.

But for anything else, I rely on betterC, and the experience is 
very, very poor!


Before thinking about better branding, what is needed is: FIX IT! 
And make it cleaner.


And most importantly, it shouldn't be called betterC; it should 
be called D, plain and simple, just D!

Move the GC into a separate library, and make it opt-in rather than 
opt-out.

The language should be simple, so you can do everything you want as a 
library (GC, RC).

I still have hopes for D as my main language, but the no-GC story 
needs to be sorted ASAP.


But to be very, very honest, I doubt that'll ever happen; even Nim 
is handling that story better..
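For readers who haven't tried it, this is roughly what the -betterC subset under discussion looks like: no druntime, an `extern(C)` entry point, and only non-GC constructs. A minimal sketch (hypothetical file, compiled with `dmd -betterC app.d`):

```d
// No druntime is linked: main is extern(C); GC, AAs, and
// GC-backed dynamic arrays are unavailable.
import core.stdc.stdio : printf;

extern (C) int main() {
    int[4] buf = [1, 2, 3, 4];   // stack allocation still works
    int total = 0;
    foreach (x; buf[])           // slicing a static array works too
        total += x;
    printf("sum = %d\n", total);
    return total == 10 ? 0 : 1;
}
```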
Nov 11 2020
next sibling parent reply rikki cattermole <rikki cattermole.co.nz> writes:
On 12/11/2020 11:15 AM, ryuukk_ wrote:
 BetterC has many flaws
 
 - you have to be explicit in your code with -betterC, @nogc etc. 
 etc. etc.; it's ugly and annoying
No you don't. There is no specific behavior there for -betterC.
Nov 11 2020
parent reply Dibyendu Majumdar <mobile majumdar.org.uk> writes:
On Thursday, 12 November 2020 at 00:21:16 UTC, rikki cattermole 
wrote:
 On 12/11/2020 11:15 AM, ryuukk_ wrote:
 BetterC has many flaws
I think that the biggest flaw might be the lack of enumerated types in D - aka safer unions. Without such a feature, programming in BetterC mode might not be that much better than C.
Nov 11 2020
next sibling parent reply Paul Backus <snarwin gmail.com> writes:
On Thursday, 12 November 2020 at 00:27:44 UTC, Dibyendu Majumdar 
wrote:
 On Thursday, 12 November 2020 at 00:21:16 UTC, rikki cattermole 
 wrote:
 On 12/11/2020 11:15 AM, ryuukk_ wrote:
 BetterC has many flaws
I think that the biggest flaw might be lack of numerated types in D - aka - safer unions. Without such a feature programming in Better C mode might not be that much better than C.
They're available as a library: https://code.dlang.org/packages/sumtype
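Usage of the linked package is brief. A minimal sketch based on the package's documented `match!` API (the `Value` alias and `describe` helper are invented for illustration):

```d
import sumtype; // the dub package linked above

alias Value = SumType!(int, string);

// match! requires a handler for every member type - the compiler
// rejects non-exhaustive matches, unlike a raw tagged union.
string describe(Value v) {
    return v.match!(
        (int i) => "an int",
        (string s) => "a string"
    );
}

void main() {
    assert(describe(Value(42)) == "an int");
    assert(describe(Value("hi")) == "a string");
}
```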
Nov 11 2020
parent reply Dibyendu Majumdar <d.majumdar gmail.com> writes:
On Thursday, 12 November 2020 at 00:48:24 UTC, Paul Backus wrote:
 On Thursday, 12 November 2020 at 00:27:44 UTC, Dibyendu 
 Majumdar wrote:
 On Thursday, 12 November 2020 at 00:21:16 UTC, rikki 
 cattermole wrote:
 On 12/11/2020 11:15 AM, ryuukk_ wrote:
 BetterC has many flaws
I think that the biggest flaw might be lack of numerated types in D - aka - safer unions. Without such a feature programming in Better C mode might not be that much better than C.
They're available as a library: https://code.dlang.org/packages/sumtype
Thanks. Although without language support it probably doesn't get you safety?
Nov 12 2020
next sibling parent IGotD- <nise nise.com> writes:
On Thursday, 12 November 2020 at 14:50:33 UTC, Dibyendu Majumdar 
wrote:
 Thanks. Although without language support it probably doesn't 
 get you safety?
What do you mean? How isn't the sumtype library safe? Why would a built-in language sum type be safer?
Nov 12 2020
prev sibling parent reply Paul Backus <snarwin gmail.com> writes:
On Thursday, 12 November 2020 at 14:50:33 UTC, Dibyendu Majumdar 
wrote:
 On Thursday, 12 November 2020 at 00:48:24 UTC, Paul Backus 
 wrote:
 They're available as a library:

 https://code.dlang.org/packages/sumtype
Thanks. Although without language support it probably doesn't get you safety?
SumType is both type-safe and memory-safe, if that's what you're asking.
Nov 12 2020
parent reply Dibyendu Majumdar <d.majumdar gmail.com> writes:
On Thursday, 12 November 2020 at 15:39:53 UTC, Paul Backus wrote:
 On Thursday, 12 November 2020 at 14:50:33 UTC, Dibyendu 
 Majumdar wrote:
 On Thursday, 12 November 2020 at 00:48:24 UTC, Paul Backus 
 wrote:
 They're available as a library:

 https://code.dlang.org/packages/sumtype
Thanks. Although without language support it probably doesn't get you safety?
SumType is both type-safe and memory-safe, if that's what you're asking.
Hi, no, my point was that it not being a language feature means that people have to consciously use it. And that means it is probably not going to be used. Also, the compiler cannot assist you in making this choice.
Nov 12 2020
next sibling parent reply Paul Backus <snarwin gmail.com> writes:
On Thursday, 12 November 2020 at 18:07:08 UTC, Dibyendu Majumdar 
wrote:
 On Thursday, 12 November 2020 at 15:39:53 UTC, Paul Backus 
 wrote:
 SumType is both type-safe and memory-safe, if that's what 
 you're asking.
Hi no, my point was that it not being a language feature means that people have to consciously use it. And that means it is probably not going to be.
I don't think it's quite so black-and-white. Sure, a language feature is more convenient to use than a dub package, but not by a huge amount - and in both cases, you have to make a conscious choice to use them. I don't think sumtype is any harder to use in your code than something like `@safe`, for example.
 Also the compiler cannot assist you in making this choice.
I'm not aware of any compiler, for any language, that can provide this kind of assistance. Until we invent AIs that can write code for us, we programmers are going to be stuck making our own choices. :)
Nov 12 2020
parent reply Dibyendu Majumdar <d.majumdar gmail.com> writes:
On Thursday, 12 November 2020 at 18:33:39 UTC, Paul Backus wrote:
 On Thursday, 12 November 2020 at 18:07:08 UTC, Dibyendu
 Also the compiler cannot assist you in making this choice.
I'm not aware of any compiler, for any language, than can provide this kind of assistance. Until we invent AIs that can write code for us, we programmers are going to be stuck making our own choices. :)
Sorry, I didn't make myself clear. In Rust, for example, unions are unsafe whereas enumerated types aren't (caveat: this is based on my imperfect knowledge). So having it in the language allows the compiler to better infer the safety of the code, and thus help the programmer make better choices.
Nov 12 2020
next sibling parent Paul Backus <snarwin gmail.com> writes:
On Thursday, 12 November 2020 at 21:22:37 UTC, Dibyendu Majumdar 
wrote:
 Sorry I didn't make myself clear.
 In Rust for example unions are unsafe whereas enumerated types 
 aren't (caveat: this is based on my imperfect knowledge). So 
 having it in the language allows the compiler to better infer 
 safety of the code, and thus help the programmer make better 
 choices.
In D, unions are `@system` and SumTypes are `@safe`, which as far as I can tell amounts to basically the same thing. Is there something I'm missing here? Obviously D's safety analysis is not quite as sophisticated as Rust's, but that would still be true even if D had built-in discriminated unions.
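The distinction can be shown directly: the compiler rejects a union's unchecked reinterpretation in `@safe` code, while SumType's tagged access compiles. A minimal sketch (the `Raw` union and `use` helper are invented for illustration):

```d
import sumtype; // the dub package discussed above

// A raw union carries no tag; reading a pointer member that overlaps
// another member is disallowed in @safe code.
union Raw { int i; int* p; }

@safe int use(SumType!(int, int*) s) {
    // Raw r; r.i = 42; auto p = r.p;  // error: not allowed in @safe code
    // SumType tracks which member is live, and match! must handle every case:
    return s.match!(
        (int i) => i,
        (int* p) => p is null ? -1 : *p
    );
}

void main() @safe {
    assert(use(SumType!(int, int*)(42)) == 42);
}
```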
Nov 12 2020
prev sibling parent Siemargl <inqnone gmail.com> writes:
On Thursday, 12 November 2020 at 21:22:37 UTC, Dibyendu Majumdar 
wrote:
 On Thursday, 12 November 2020 at 18:33:39 UTC, Paul Backus 
 wrote:
 On Thursday, 12 November 2020 at 18:07:08 UTC, Dibyendu
 Also the compiler cannot assist you in making this choice.
I'm not aware of any compiler, for any language, than can provide this kind of assistance. Until we invent AIs that can write code for us, we programmers are going to be stuck making our own choices. :)
Sorry I didn't make myself clear. In Rust for example unions are unsafe whereas enumerated types aren't (caveat: this is based on my imperfect knowledge). So having it in the language allows the compiler to better infer safety of the code, and thus help the programmer make better choices.
I think you are mistaken. Rust only helps to make the only choice possible.
Nov 12 2020
prev sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 11/12/20 1:07 PM, Dibyendu Majumdar wrote:
 On Thursday, 12 November 2020 at 15:39:53 UTC, Paul Backus wrote:
 On Thursday, 12 November 2020 at 14:50:33 UTC, Dibyendu Majumdar wrote:
 On Thursday, 12 November 2020 at 00:48:24 UTC, Paul Backus wrote:
 They're available as a library:

 https://code.dlang.org/packages/sumtype
Thanks. Although without language support it probably doesn't get you safety?
SumType is both type-safe and memory-safe, if that's what you're asking.
Hi no, my point was that it not being a language feature means that people have to consciously use it. And that means it is probably not going to be. Also the compiler cannot assist you in making this choice.
FWIW SumType is solid enough to be a good candidate for the standard library.
Nov 13 2020
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 12 November 2020 at 00:27:44 UTC, Dibyendu Majumdar 
wrote:
 I think that the biggest flaw might be lack of numerated types 
 in D - aka - safer unions. Without such a feature programming 
 in Better C mode might not be that much better than C.
I think this is more of a problem for full D as it prevents the GC from doing precise collection.
Nov 12 2020
prev sibling parent reply rinfz <cherez mailbox.org> writes:
On Wednesday, 11 November 2020 at 22:15:18 UTC, ryuukk_ wrote:
 NO GC at all! custom allocator friendly
Out of interest, what are your problems with GC?
Nov 13 2020
parent reply ryuukk_ <ryuukk_ gmail.com> writes:
On Friday, 13 November 2020 at 22:56:03 UTC, rinfz wrote:
 On Wednesday, 11 November 2020 at 22:15:18 UTC, ryuukk_ wrote:
 NO GC at all! custom allocator friendly
Out of interest, what are your problems with GC?
For game engines, the GC is not a good thing. Unity, for example, has a lot of issues with it, and only recently started to introduce an incremental GC to make the situation a little better. D has an interesting GC situation where you can pause it, but it is still stop-the-world, and the bigger your heap is (which happens with games), the longer the GC pauses will be. The GC impact of D's std lib is minimal, BUT when a GC pause happens, you miss a lot of frames, which results in visible stutter in-game.
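The standard mitigation in D today is to keep collections out of the frame loop and trigger them at chosen moments, using `core.memory.GC`. A minimal sketch (the `frame` function is a stand-in for per-frame work):

```d
import core.memory : GC;

void frame() {
    // per-frame work; ideally @nogc so nothing here allocates
}

void main() {
    GC.disable();                // no collections while frames are running
    scope (exit) GC.enable();

    foreach (i; 0 .. 3)
        frame();

    GC.collect();                // pay the collection cost at a chosen moment
}
```

Note that `GC.disable` only suppresses automatic collections; allocations made while it is in effect still accumulate, so the eventual `GC.collect` pause grows with the heap, which is exactly the complaint above.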
Nov 14 2020
next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Sunday, 15 November 2020 at 07:11:01 UTC, ryuukk_ wrote:
 On Friday, 13 November 2020 at 22:56:03 UTC, rinfz wrote:
 On Wednesday, 11 November 2020 at 22:15:18 UTC, ryuukk_ wrote:
 NO GC at all! custom allocator friendly
Out of interest, what are your problems with GC?
For game engines, GC is not a good thing. Unity for example has a lot of issues with it, and just recently started to introduce an incremental GC to help make the situation a little better. D has an interesting GC situation where you can pause it, but it is still stopping the world, and the bigger your heap is (which happens with games), the longer the GC pauses will be. GC impact in D is minimal in the std lib, BUT when a GC pause happens, you miss a lot of frames, which results in visible stutter in-game
The problem is not having a GC, but rather which algorithms get used and when. The US Navy has no issues using real-time Java GCs for weapon systems control, where a dropped frame costs real human lives.
 The Aegis Weapons System software was recently rewritten into 
 real-time Java as part of the Aegis Modernization activity. 
 This project involved replacement of about 200,000 lines of 
 CMS-2 and Ada code with roughly 500,000 lines of Java. The 
 effort began in 2003 and the new Java implementation of Aegis 
 Weapons System is now being deployed on warships.
https://dl.acm.org/doi/abs/10.1145/2402676.2402699 However, these are not the kind of implementations that get offered as free beer by devs working on side projects.
Nov 14 2020
prev sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Sunday, 15 November 2020 at 07:11:01 UTC, ryuukk_ wrote:
 On Friday, 13 November 2020 at 22:56:03 UTC, rinfz wrote:
 On Wednesday, 11 November 2020 at 22:15:18 UTC, ryuukk_ wrote:
 NO GC at all! custom allocator friendly
Out of interest, what are your problems with GC?
for game engines, GC is not a good thing
As a blanket statement, that's just not true. There's a whole universe of games and game engines you can make where the GC isn't even going to be a blip on the radar.
Nov 14 2020
parent reply ryuukk_ <ryuukk_ gmail.com> writes:
On Sunday, 15 November 2020 at 07:55:22 UTC, Mike Parker wrote:
 On Sunday, 15 November 2020 at 07:11:01 UTC, ryuukk_ wrote:
 On Friday, 13 November 2020 at 22:56:03 UTC, rinfz wrote:
 On Wednesday, 11 November 2020 at 22:15:18 UTC, ryuukk_ wrote:
 NO GC at all! custom allocator friendly
Out of interest, what are your problems with GC?
for game engines, GC is not a good thing
As a blanket statement, that's just not true. There's a whole universe of games and game engines you can make where the GC isn't even going to be a blip on the radar.
Those who don't write games or game engines are always the ones telling you that you shouldn't care. Then you listen to them, and you wonder why your game stutters with 50 ms pauses (4 frames at 60 fps) every few seconds.
Nov 15 2020
next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Sunday, 15 November 2020 at 15:24:39 UTC, ryuukk_ wrote:
 On Sunday, 15 November 2020 at 07:55:22 UTC, Mike Parker wrote:
 On Sunday, 15 November 2020 at 07:11:01 UTC, ryuukk_ wrote:
 On Friday, 13 November 2020 at 22:56:03 UTC, rinfz wrote:
 On Wednesday, 11 November 2020 at 22:15:18 UTC, ryuukk_ 
 wrote:
 NO GC at all! custom allocator friendly
Out of interest, what are your problems with GC?
for game engines, GC is not a good thing
As a blanket statement, that's just not true. There's a whole universe of games and game engines you can make where the GC isn't even going to be a blip on the radar.
those who don't write games or game engines are always the ones telling you that you shouldn't care then you listen to them and you wonder why your game stutters with 50ms pauses (4frames at 60fps) every few seconds
Then there are those that keep forgetting that there is more to the game developer world than trying to crank out yet another Fortnite or Crysis. Thankfully Markus Persson ignored opinions like yours.
Nov 15 2020
next sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Sunday, 15 November 2020 at 17:14:57 UTC, Paulo Pinto wrote:
 Thankfully Markus Persson ignored opinions like yours.
Did he use a stop-the-world collector?
Nov 15 2020
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Sunday, 15 November 2020 at 21:17:09 UTC, Ola Fosheim Grøstad 
wrote:
 On Sunday, 15 November 2020 at 17:14:57 UTC, Paulo Pinto wrote:
 Thankfully Markus Persson ignored opinions like yours.
Did he use a stop-the-world collector?
That is a possible option among the plethora of Java implementations, maybe yes, maybe no.
Nov 15 2020
parent reply ryuukk_ <ryuukk_ gmail.com> writes:
On Sunday, 15 November 2020 at 17:14:57 UTC, Paulo Pinto wrote:
 On Sunday, 15 November 2020 at 15:24:39 UTC, ryuukk_ wrote:
 On Sunday, 15 November 2020 at 07:55:22 UTC, Mike Parker wrote:
 On Sunday, 15 November 2020 at 07:11:01 UTC, ryuukk_ wrote:
 On Friday, 13 November 2020 at 22:56:03 UTC, rinfz wrote:
 On Wednesday, 11 November 2020 at 22:15:18 UTC, ryuukk_ 
 wrote:
 NO GC at all! custom allocator friendly
Out of interest, what are your problems with GC?
for game engines, GC is not a good thing
As a blanket statement, that's just not true. There's a whole universe of games and game engines you can make where the GC isn't even going to be a blip on the radar.
those who don't write games or game engines are always the ones telling you that you shouldn't care then you listen to them and you wonder why your game stutters with 50ms pauses (4frames at 60fps) every few seconds
Then there are those that keep forgeting that there is more to the game developers world than trying to crack out yet another Fortnite or Crysis. Thankfully Markus Persson ignored opinions like yours.
Thankfully we have people who think like you, so nobody uses D for gamedev, despite that category being the most popular in every store: https://fortunly.com/blog/video-game-industry-revenue/ And it will keep growing year after year, with phones being the most popular platform: constrained devices where every bit of wasted memory is prohibited, same for CPU cycles. But yes, we got Minecraft in Java, which got rewritten in C++ because it sucked and couldn't get a proper console release, not even a decent mobile release. But yes, we should trust the people who don't write games. Who better than them to dictate how games should be created :D :D The way to go! Use GC, they say. Use 3x memory, they say. But phones are constrained devices - tell them to download more RAM!!!!!!!!!
Nov 16 2020
next sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 16 November 2020 at 16:56:14 UTC, ryuukk_ wrote:
 But yes, we got minecraft in Java, that got rewritten in C++ 
 because it sucked and couldn't get proper console, not even 
 devent mobile release
Yes, but for indies on a low budget, tooling that allows you to cut costs would be an asset. So he probably chose the right tool for where he was at the time. But clearly, a stop-the-world collector is not what you want; you would actually be better off writing your game client in javascript... Still, having options that play well with each other would be a good thing. For instance, having ARC as an option in C++ would be a good thing. Having a single-threaded compacting GC and ARC as options would be a good thing for D. Not sure where D is heading... But yeah, too many people that wanted to use D for highly interactive apps have moved on. Trying to convince those that remain that the current GC is good for this application space will obviously not encourage them to stay. So not sure what the purpose of such rhetorical games is... Cheers!
Nov 16 2020
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 16 November 2020 at 17:15:03 UTC, Ola Fosheim Grøstad 
wrote:
 ...

 Not sure where D is heading... But yeah, too many people that 
 wanted to use D for highly interactive apps have moved on. 
 Trying to convince those that remain that the current GC is 
 good for this application space will obviously not encourage 
 them to stay. So not sure what the purpose of such rhetorical 
 games are...

 Cheers!
They moved on to Unity. Despite some "I know best" comments, go to IGDA forums, GDC talks, the UK magazine MCV/Develop, or IGF, and notice that Unity is always among the first-party support from Microsoft/Apple/Google/Sony/Nintendo for game engines on their platforms and AR/VR. So while some people here spread GC hate around D, saying how unsuitable it is for doing games, others are building their games; game studios are polyglot and don't see a nail in every problem.
Nov 16 2020
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 16 November 2020 at 19:04:48 UTC, Paulo Pinto wrote:

 studios are polyglot and don't see a nail in every problem.
You could write a good game with javascript + C++, or Dart + C++, if you keep the game world on the GC side. Not too bad for an indie game with two developers, one doing the graphics and the other one the game mechanics. So I don't think that is unreasonable, but the current GC that D has to support to carry existing code bases is not in the same ballpark. If you want to create a dynamic online "streaming" game world then you deal with a lot of pointers and lots of caching... It is not that this cannot change, but as of today that is farther away than 1-2 years... IMHO.
Nov 16 2020
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 16 November 2020 at 19:12:55 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 16 November 2020 at 19:04:48 UTC, Paulo Pinto wrote:

 studios are polyglot and don't see a nail in every problem.
You could write a good game with javascript + C++, or Dart + C++, if you keep the game world on the GC side. Not too bad for an indie game with two developers, one doing the graphics and the other one the game mechanics. So I don't think that is unreasonable, but the current GC that D has to to support to carry existing code bases is not in the same ballpark. If you want to create a dynamic online "streaming" game world then you deal with a lot of pointers and lots of caching... It is not that this cannot change, but as of today that is farther away than 1-2 years... IMHO.
But that is the whole point, what needs improving is the current implementation of D's GC, not throwing away everything.
Nov 16 2020
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 16 November 2020 at 19:34:00 UTC, Paulo Pinto wrote:
 But that is the whole point, what needs improving is the 
 current implementation of D's GC, not throwing away everything.
But D first needs people to agree on the semantic constraints that can enable this... And that will break some code bases. So there is, in essence, a cultural resistance to improving the GC.
Nov 16 2020
parent rinfz <cherez mailbox.org> writes:
On Monday, 16 November 2020 at 19:51:28 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 16 November 2020 at 19:34:00 UTC, Paulo Pinto wrote:
 But that is the whole point, what needs improving is the 
 current implementation of D's GC, not throwing away everything.
But D first need people to agree on the semantic constraints that can enable this... And that will break some code bases. So there is in essence a cultural resistance to improving the GC.
This is why D (and many other languages) would benefit from the equivalent of Rust's editions (although I'm not sure the D dev team can carry the burden of that with the few people that they have).
Nov 16 2020
prev sibling next sibling parent reply Dibyendu Majumdar <d.majumdar gmail.com> writes:
On Monday, 16 November 2020 at 16:56:14 UTC, ryuukk_ wrote:
 Thankfully we have people who think like you, so nobody uses D 
 for gamedev, despite that category being the most popular in 
 every stores
I think that is a very good observation. It is a huge industry and currently I believe C++ & C dominate when it comes to core engine stuff. I am not a games developer but I know that guys like Jonathan Blow would not use a language that has GC in it. My motivation for starting the Laser-D project is to make D attractive to those who do not want the GC even as an option. It is ironic that Walter was inspired to start his career in compiler tools because he wanted to write a game - I hope I am not remembering incorrectly! Right now is the time to try to get into this market because there are multiple parallel attempts to do so (Jai, Zig etc).
Nov 16 2020
parent reply Max Haughton <maxhaton gmail.com> writes:
On Monday, 16 November 2020 at 17:46:07 UTC, Dibyendu Majumdar 
wrote:
 On Monday, 16 November 2020 at 16:56:14 UTC, ryuukk_ wrote:
 Thankfully we have people who think like you, so nobody uses D 
 for gamedev, despite that category being the most popular in 
 every stores
I think that is a very good observation. It is a huge industry and currently I believe C++ & C dominate when it comes to core engine stuff. I am not a games developer but I know that guys like Jonathan Blow would not use a language that has GC in it. My motivation for starting the Laser-D project is to make D attractive to those who do not want the GC even as an option. It is ironic that Walter was inspired to start his career in compiler tools because he wanted to write a game - I hope I am not remembering incorrectly! Right now is the time to try to get into this market because there are multiple parallel attempts to do so (Jai, Zig etc).
Is this "Laser-D" supposed to be a separate language or just a project to document betterC? If it's the former you may want to tread lightly, because if you are successful you could end up seriously hurting D's prospects in the future - probably second only to complaints about the GC are complaints about D having a fragmented ecosystem (I have recently seen people mentioning Tango and D1 despite them being dead for a decade now). If you really want to make -betterC better, there are still noticeable flaws with it - for example you still can't use std.format at compile time in -betterC mode (https://run.dlang.io/is/TIcgW2).
Nov 16 2020
parent Dibyendu Majumdar <d.majumdar gmail.com> writes:
On Monday, 16 November 2020 at 18:19:56 UTC, Max Haughton wrote:

 Is this "Laser-D" supposed to be a separate language on just a 
 project to document betterC?
My plan is to document a subset of D that is maybe even more restricted than Better C - because I think not every feature of D is needed. I would like to turn off features in the compiler to reflect the language definition, but this may be too much effort. But one thing I want to guarantee is that all Laser-D programs will also be D programs.
 If you really want to make -betterC better, there are still 
 noticable flaws with it - for example you still can't use 
 std.format at compile time in -betterC mode 
 (https://run.dlang.io/is/TIcgW2).
If I am not mistaken there are only a few tests that cover the Better C option - so there are probably many bugs lurking there... I hope not, but I already found that, despite what the doc says, creating C++ classes in Better C does not appear to work (the doc isn't clear, though, about whether you can only interface to C++ classes or create them too).
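For what it's worth, the interfacing direction can be sketched as follows, following the pattern in D's C++ interface documentation; whether construction on the D side works under -betterC is exactly the unclear part. All names here (`Renderer`, `makeRenderer`) are hypothetical:

```d
// Sketch of *interfacing* to a C++ class from -betterC D: the C++ side
// owns construction, D only declares the layout/vtable and calls methods.
extern (C++) class Renderer
{
    void draw(int frame);    // matches a virtual method on the C++ side
}

extern (C++) Renderer makeRenderer(); // factory implemented in C++

extern (C) int main()
{
    auto r = makeRenderer(); // constructed by C++, no D GC involved
    r.draw(0);
    return 0;
}
```

The bodiless method declarations link against the C++ implementation; creating the class with `new` on the D side is the part that would pull in druntime, which -betterC forbids.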
Nov 16 2020
prev sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 16 November 2020 at 16:56:14 UTC, ryuukk_ wrote:
 On Sunday, 15 November 2020 at 17:14:57 UTC, Paulo Pinto wrote:
 On Sunday, 15 November 2020 at 15:24:39 UTC, ryuukk_ wrote:
 On Sunday, 15 November 2020 at 07:55:22 UTC, Mike Parker 
 wrote:
 On Sunday, 15 November 2020 at 07:11:01 UTC, ryuukk_ wrote:
 On Friday, 13 November 2020 at 22:56:03 UTC, rinfz wrote:
 On Wednesday, 11 November 2020 at 22:15:18 UTC, ryuukk_ 
 wrote:
 NO GC at all! custom allocator friendly
Out of interest, what are your problems with GC?
for game engines, GC is not a good thing
As a blanket statement, that's just not true. There's a whole universe of games and game engines you can make where the GC isn't even going to be a blip on the radar.
those who don't write games or game engines are always the ones telling you that you shouldn't care then you listen to them and you wonder why your game stutters with 50ms pauses (4frames at 60fps) every few seconds
Then there are those that keep forgeting that there is more to the game developers world than trying to crack out yet another Fortnite or Crysis. Thankfully Markus Persson ignored opinions like yours.
Thankfully we have people who think like you, so nobody uses D for gamedev, despite that category being the most popular in every stores
They use Unity, Unreal, Xenko, Monogame, CryEngine, Godot, HoloLens UWP instead. High-level languages go for game scripting, and Unreal even uses its own C++ GC implementation. Ah, and then two of the major 3D APIs, namely DirectX and Metal, make use of reference counting, which, despite all the cargo cult around GC vs RC, is a GC algorithm as per the CS definition: chapter 5 of the renowned Garbage Collection Handbook. Have fun with your GC hate. Do you know what? People like you used to argue against using C and Pascal for doing games. Then the same people, who grudgingly adopted them, started to use the same arguments against C++. Some devs can only move forward when platform owners drag them out of their comfort zone.
Nov 16 2020
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 15.11.20 16:24, ryuukk_ wrote:
 On Sunday, 15 November 2020 at 07:55:22 UTC, Mike Parker wrote:
 On Sunday, 15 November 2020 at 07:11:01 UTC, ryuukk_ wrote:
 On Friday, 13 November 2020 at 22:56:03 UTC, rinfz wrote:
 On Wednesday, 11 November 2020 at 22:15:18 UTC, ryuukk_ wrote:
 NO GC at all! custom allocator friendly
Out of interest, what are your problems with GC?
for game engines, GC is not a good thing
As a blanket statement, that's just not true. There's a whole universe of games and game engines you can make where the GC isn't even going to be a blip on the radar.
those who don't write games or game engines are always the ones telling you that you shouldn't care then you listen to them and you wonder why your game stutters with 50ms pauses (4frames at 60fps) every few seconds
In D the only reason this would happen is because you (continuously) allocate memory during the main game loop. Don't do that and you'll be fine. I can understand wanting to avoid binary bloat, but the GC does not cause stuttering on its own.
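D can even turn that advice into a compile-time guarantee via `@nogc`; a minimal sketch (function names are illustrative):

```d
// Minimal sketch: `@nogc` makes "don't allocate in the main loop" a
// compile-time guarantee rather than a discipline.
@nogc void update(float dt, int[] entities)
{
    foreach (ref e; entities)
        e += 1;              // fine: no allocation
    // entities ~= 42;       // compile error: array append may allocate
}

void main()
{
    auto entities = new int[](64); // allocate up front, outside the hot path
    foreach (frame; 0 .. 600)
        update(1 / 60.0f, entities);
}
```

Anything reachable from the `@nogc` function is checked transitively, so a stray allocation deep in a helper is caught too.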
Nov 15 2020
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Sunday, 15 November 2020 at 17:21:23 UTC, Timon Gehr wrote:
 In D the only reason this would happen is because you 
 (continuously) allocate memory during the main game loop. Don't 
 do that and you'll be fine. I can understand wanting to avoid 
 binary bloat, but the GC does not cause stuttering on its own.
No, all threads that have owning GC pointers have to halt, so that is kinda killing an MMO client. Besides, a pinning (non-moving) GC will roughly double your memory consumption and lead to more memory fragmentation. A single-threaded GC with compaction, on the other hand, could actually be beneficial.
Nov 15 2020
prev sibling parent reply Dibyendu Majumdar <mobile majumdar.org.uk> writes:
On Saturday, 31 October 2020 at 01:57:19 UTC, Walter Bright wrote:
 On 10/29/2020 5:48 AM, Abdulhaq wrote:
 I'm pretty sure that Jai is not mimicking D, also I doubt that 
 Zig is either.
D popularized CTFE, and other languages followed suit, including Jai.
Indeed - Jai can be thought of as a cut-down version of D without GC, but with CTFE. I don't have access to Jai but from what I observe, D has everything Jai has. Zig seems like a copy of Jai. Neither of these languages offers anything new (from what I can see) - and that was my argument about better branding for 'leaner D' - which is not the same as 'better C'. Regards
Oct 31 2020
next sibling parent reply Abdulhaq <alynch4047 gmail.com> writes:
On Saturday, 31 October 2020 at 17:18:30 UTC, Dibyendu Majumdar 
wrote:
 On Saturday, 31 October 2020 at 01:57:19 UTC, Walter Bright 
 wrote:
 On 10/29/2020 5:48 AM, Abdulhaq wrote:
 I'm pretty sure that Jai is not mimicking D, also I doubt 
 that Zig is either.
D popularized CTFE, and other languages followed suit, including Jai.
Indeed - Jai can be thought of as a cut-down version of D without GC, but with CTFE. I don't have access to Jai but from what I observe, D has everything Jai has. Zig seems like a copy of Jai. Neither of these languages offer anything new (from what I can see) - and that was my argument about better branding for 'leaner D' - which is not the same as 'better C'. Regards
D has everything that Algol has - that doesn't mean that Algol copied D.
Oct 31 2020
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Saturday, 31 October 2020 at 17:35:34 UTC, Abdulhaq wrote:
 D has everything that Algol has - that doesn't mean that Algol 
 copied D.
Hah, Simula added OO to the Algol family :-). Difficult to say what influenced what, but I can imagine that D influenced some C++ people to push harder for some features in C++17/C++20.
Oct 31 2020
parent reply Abdulhaq <alynch4047 gmail.com> writes:
On Saturday, 31 October 2020 at 17:41:08 UTC, Ola Fosheim Grøstad 
wrote:
 On Saturday, 31 October 2020 at 17:35:34 UTC, Abdulhaq wrote:
 D has everything that Algol has - that doesn't mean that Algol 
 copied D.
Hah, Simula added OO to the Algol family :-). Difficult to say what influenced what, but I can imagine that D influenced some C++ people to push harder for some features in C++17/C++20.
Yeah we're talking "on the shoulders of giants" here, and it is possible that D could end up in that hall of fame at some time. However, speaking to the original point, Jonathan Blow has a very different philosophy to Walter and Andrei: Jonathan is very games oriented, and Jai targets games development. D's CTFE probably had some influence on Jai (as did Lisp etc., as Paulo pointed out), but in no way did Jonathan think of, or approach, Jai as a cut-down D.
Oct 31 2020
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Saturday, 31 October 2020 at 17:53:14 UTC, Abdulhaq wrote:
 However, speaking to the original point, Jonathan Blow has a 
 very different philosophy to Walter and Andrei, Jonathan is 
 very games oriented, and Jai targets games development. D's 
 CTFE probably had some influence on Jai (as did Lisp etc as 
 Paulo pointed) out, but in no way did Jonathan think of, or 
 approach, Jai as a cut down D.
Yes, most certainly. I did watch his videos a long time ago. He seemed to be very concerned with specific patterns that he feels are common in games programming, so Jai might end up having some special casing that other languages don't bother with, e.g. structs of arrays vs arrays of structs. But most Algolish, C++-influenced languages are rather similar. We are basically talking flavours and runtime differences.
Oct 31 2020
prev sibling parent reply bachmeier <no spam.net> writes:
On Saturday, 31 October 2020 at 17:18:30 UTC, Dibyendu Majumdar 
wrote:

 Zig seems like a copy of Jai.
How do you copy vaporware? Or has there been a release of a single line of compilable Jai that I missed?
Oct 31 2020
next sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Saturday, 31 October 2020 at 19:31:56 UTC, bachmeier wrote:
 On Saturday, 31 October 2020 at 17:18:30 UTC, Dibyendu Majumdar 
 wrote:

 Zig seems like a copy of Jai.
How do you copy vaporware? Or has there been a release of a single line of compileable Jai that I missed?
It has been explained in videos and people have written documentation of the features based on that.
Oct 31 2020
next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Saturday, 31 October 2020 at 19:55:18 UTC, Ola Fosheim Grøstad 
wrote:
 On Saturday, 31 October 2020 at 19:31:56 UTC, bachmeier wrote:
 On Saturday, 31 October 2020 at 17:18:30 UTC, Dibyendu 
 Majumdar wrote:

 Zig seems like a copy of Jai.
How do you copy vaporware? Or has there been a release of a single line of compileable Jai that I missed?
It has been explained in videos and people have written documentation of the features based on that.
Jonathan Blow puts out like 3-hour-long videos explaining stuff... most people don't have the patience to sit through that... Zig has some interesting features though
Oct 31 2020
prev sibling parent reply bachmeier <no spam.net> writes:
On Saturday, 31 October 2020 at 19:55:18 UTC, Ola Fosheim Grøstad 
wrote:
 On Saturday, 31 October 2020 at 19:31:56 UTC, bachmeier wrote:
 On Saturday, 31 October 2020 at 17:18:30 UTC, Dibyendu 
 Majumdar wrote:

 Zig seems like a copy of Jai.
How do you copy vaporware? Or has there been a release of a single line of compileable Jai that I missed?
It has been explained in videos and people have written documentation of the features based on that.
That would make Zig an implementation, not a copy.
Oct 31 2020
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Sunday, 1 November 2020 at 01:46:33 UTC, bachmeier wrote:
 That wouldn't make Zig an implementation, not a copy.
I can see why someone would think Zig is inspired by Jai, though. Zig appears to focus on not having any "magic", so that the source is explicit. Not a reasonable design rationale IMO; properties and exceptions are useful abstractions. What all these languages have in common is that they are opinionated, but don't bring anything significantly new to the table vs C++, except Rust. So what we end up with is many front ends to LLVM, sketchy interop and starving ecosystems...
Oct 31 2020
next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Sunday, 1 November 2020 at 02:14:00 UTC, Ola Fosheim Grøstad 
wrote:
 On Sunday, 1 November 2020 at 01:46:33 UTC, bachmeier wrote:
 That wouldn't make Zig an implementation, not a copy.
I can see why someone would think Zig is inspired by Jai, though. Zig appears to focus on not having any "magic", so that the source is explicit. Not a reasonable design rationale IMO. Properties and exceptions are useful abstractions. What all these languages have in common is that they are opinionated, but don't bring anything significantly new to table vs C++, except Rust. So what we end up with is many front ends to LLVM, sketchy interop and starving eco systems...
And the Rust buzz has made everyone start looking at ways to integrate affine types alongside automatic memory management, leaving the former for hot paths. So in the end, we who love systems languages with some form of GC will keep our ecosystems, and have a couple of extra language features for those 1% of use cases where they really matter, instead of writing Assembly or what have you; only the religiously anti-GC crowd might actually keep caring about languages like Rust and C++. Even C has issues with adoption among some embedded domains where the community is focused on C89 with OEM extensions and doesn't care about anything else. What D really misses are systems like Oberon, Singularity, Midori, but sadly that is a corporation-level effort. Just something like this would already shut many mouths. http://www.progtools.org/article.php?name=oberon&section=compilers&type=tutorial -- Paulo
Nov 01 2020
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Sunday, 1 November 2020 at 07:27:09 UTC, Paulo Pinto wrote:
 On Sunday, 1 November 2020 at 02:14:00 UTC, Ola Fosheim Grøstad 
 wrote:
 of writing Assembly or what have you, and only the religiously 
 anti-GC crowd might actually keep caring about languages like 
 Rust and C++.
C++ is slowly taking over embedded. Most languages suitable for interactive applications use frameworks built with C++. So, better interop with C++ is important for tiny languages. For most servers C++ is becoming less important thanks to cloud computing.
Nov 01 2020
prev sibling parent reply norm <norm.rowtree gmail.com> writes:
On Sunday, 1 November 2020 at 07:27:09 UTC, Paulo Pinto wrote:

 Even C has issues with adoption among some embedded domains 
 where the community is focused on C89 with OEM extensions and 
 doesn't care about anything else.
Just going to pick on this para, sorry :) I hear this a lot and it is simply not true. Working in embedded for the last ~18 yrs (automotive and medical), every project since 2010 I know of is C++. At conferences and the like over the last 2-3 yrs there is a lot of talk about Rust, and people are starting to mention MicroPython for prototyping. But as a whole, everyone I talk to is looking forward to upgrading their C++ compilers on the next project to get C++20 features. Every C++ embedded project I have ever seen has exceptions turned off. Cheers, Norm
Nov 02 2020
next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 3 November 2020 at 06:04:26 UTC, norm wrote:
 On Sunday, 1 November 2020 at 07:27:09 UTC, Paulo Pinto wrote:

 Even C has issues with adoption among some embedded domains 
 where the community is focused on C89 with OEM extensions and 
 doesn't care about anything else.
Just going to pick on this para. sorry :) I hear this a lot and it is simply not true. Working in embedded for the last ~18yrs (automotive and medical) every project since 2010 I know of is C++. At conferences and the like over the last 2-3yrs there is a lot of talk about Rust and people are starting to mention MicroPython for prototyping. But as a whole everyone I talk to is looking forward to upgrading their C++ compilers on the next project to get C++20 features. Every C++ embedded project I have ever seen has exceptions turned off. Cheers, Norm
Then you live in a happy bubble. CppCon 2016: Dan Saks, “extern c: Talking to C Programmers about C++” https://www.youtube.com/watch?v=D7Sd8A6_fYU https://embeddedgurus.com/barr-code/2018/02/c-the-immortal-programming-language/ Since you work in the industry, maybe you want to share some reports that prove otherwise?
Nov 02 2020
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 11/2/2020 10:04 PM, norm wrote:
 Every C++ embedded project I have ever seen has exceptions turned off.
I actually looked at implementing EH for DOS C++ programs. It's technically possible, but quite useless, as the necessary EH code would consume the entire 640Kb of memory :-/
Nov 02 2020
parent norm <norm.rowtree gmail.com> writes:
On Tuesday, 3 November 2020 at 07:29:52 UTC, Walter Bright wrote:
 On 11/2/2020 10:04 PM, norm wrote:
 Every C++ embedded project I have ever seen has exceptions 
 turned off.
I actually looked at implementing EH for DOS C++ programs. It's technically possible, but quite useless, as the necessary EH code would consume the entire 640Kb of memory :-/
Yep, binary bloat is the biggest factor, but also the unpredictability of the unwind. If resources run out during the unwind, the system can and does do unexpected things. If you're lucky you get a bus error.

With exceptions turned off, the most common approach is a brutal but effective one: an ENFORCE function, usually called ASSERT, that takes a predicate. If the predicate fails, it writes the failure reason to an area in flash that persists across boot, then sits in a while loop waiting for the HW watchdog to kick in. Any pending failure reason is reported on the next boot.

Cheers,
Norm
Nov 03 2020
prev sibling parent reply IGotD- <nise nise.com> writes:
On Tuesday, 3 November 2020 at 06:04:26 UTC, norm wrote:
 Just going to pick on this para. sorry :)

 I hear this a lot and it is simply not true. Working in 
 embedded for the last ~18yrs (automotive and medical) every 
 project since 2010 I know of is C++. At conferences and the 
 like over the last 2-3yrs there is a lot of talk about Rust and 
 people are starting to mention MicroPython for prototyping. But 
 as a whole everyone I talk to is looking forward to upgrading 
 their C++ compilers on the next project to get C++20 features.

 Every C++ embedded project I have ever seen has exceptions 
 turned off.
It's funny that they talk about C++20 in the embedded environment, but many features cannot be used because of the STL. The STL is just like Phobos and will become just as big if you statically link with it. Turning exceptions off also means that many parts of the STL will be dysfunctional. Usually you need specialized libraries.

Modern C++ is even more of a mess because they have moved much of the functionality into the STL. std::move, for example, is a library template, which can of course be duplicated outside the STL, but you would need to do that for a lot of functions. Even statically compiled primitives are library templates today, with some incomprehensible implementation underneath. The STL is this big blob that can pull in all sorts of things that you don't expect, even for simple primitives.

Apart from the ugliness of modern C++, many of the modern features are implemented in such a way that it is difficult to make them freestanding from the STL. You do not want the STL in embedded systems.

Is modern C++ good for embedded systems? No, you need to slice off half of the new features because of the reasons I mentioned above.
Nov 03 2020
next sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 3 November 2020 at 10:25:15 UTC, IGotD- wrote:
 On Tuesday, 3 November 2020 at 06:04:26 UTC, norm wrote:
 Just going to pick on this para. sorry :)

 I hear this a lot and it is simply not true. Working in 
 embedded for the last ~18yrs (automotive and medical) every 
 project since 2010 I know of is C++. At conferences and the 
 like over the last 2-3yrs there is a lot of talk about Rust 
 and people are starting to mention MicroPython for 
 prototyping. But as a whole everyone I talk to is looking 
 forward to upgrading their C++ compilers on the next project 
 to get C++20 features.

 Every C++ embedded project I have ever seen has exceptions 
 turned off.
It's funny that they talk about C++20 in the embedded environment, but many features cannot be used because of the STL. The STL is just like Phobos and will become just as big if you statically link with it. Turning exceptions off also means that many parts of the STL will be dysfunctional. Usually you need specialized libraries. Modern C++ is even more of a mess because they have moved much of the functionality into the STL. std::move, for example, is a library template, which can of course be duplicated outside the STL, but you would need to do that for a lot of functions. Even statically compiled primitives are library templates today, with some incomprehensible implementation underneath. The STL is this big blob that can pull in all sorts of things that you don't expect, even for simple primitives. Apart from the ugliness of modern C++, many of the modern features are implemented in such a way that it is difficult to make them freestanding from the STL. You do not want the STL in embedded systems. Is modern C++ good for embedded systems? No, you need to slice off half of the new features because of the reasons I mentioned above.
To be fair, the same applies to C, hence C89 + OEM extensions. In some domains even C11 is a challenge regarding adoption, as per the reports I usually read.
Nov 03 2020
prev sibling next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 3 November 2020 at 10:25:15 UTC, IGotD- wrote:
 Apart of a the ugliness of modern C++, many of the modern 
 features are implemented in such a way it is difficult to make 
 it freestanding from STL. You do not want STL in embedded 
 systems.
Which C++ features depend on STL? I can't think of any... STL is ok for prototyping.

They are working on providing exceptions as return values for embedded. D needs that as well.
 Is modern C++ good for embedded systems?
Define modern C++?
Nov 03 2020
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 11/3/2020 2:25 AM, IGotD- wrote:
 Is modern C++ good for embedded systems? No, you need to slice of half of the 
 new features because of the reasons I mentioned above.
I can't say that D does this better. Phobos is overly complex and layered. DasBetterC at least relies only on the existence of the C standard library, and really not even that.
Nov 03 2020
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2020-11-01 03:14, Ola Fosheim Grøstad wrote:

 I can see why someone would think Zig is inspired by Jai, though. Zig 
 appears to focus on not having any "magic", so that the source is 
 explicit. Not a reasonable design rationale IMO. Properties and 
 exceptions are useful abstractions.
I don't think exceptions need to go against being explicit in the source code. It just happens that's how most languages implement them. For example, requiring functions that can throw an exception to be annotated (with the exceptions they can throw), and requiring all those functions to be called with a specific syntax, solves that.

If you look at how Zig (same with Swift) implements error handling, you see that it's very similar to how exceptions work, at least from a source code point of view. Then the implementation is completely different under the hood.

-- 
/Jacob Carlborg
Nov 01 2020
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Sunday, 1 November 2020 at 13:58:57 UTC, Jacob Carlborg wrote:
 If you look at how Zig (same with Swift) implements error 
 handling, you see that it's very similar to how exceptions 
 work. At least from a source code point of view. Then the 
 implementation is completely different under the hood.
I haven't used Zig, but as far as I can tell you have to test right after the function call? Swift is somewhat more tedious than regular exceptions... But not too bad.
Nov 01 2020
parent Jacob Carlborg <doob me.com> writes:
On Sunday, 1 November 2020 at 16:34:49 UTC, Ola Fosheim Grøstad 
wrote:

 I haven't used Zig, but as far as I can tell you have to test 
 right after the function call?
Actually, I thought that as well. But I just tried it and you don't have to call a function which can return an error with `try`. If you don't, then you just get back an error union (a type which consist of the return type and the error type). Then you can treat it like any other value. But it's not exactly like with exceptions. -- /Jacob Carlborg
Nov 02 2020
prev sibling next sibling parent reply Stefan Koch <uplink.coder googlemail.com> writes:
On Saturday, 31 October 2020 at 19:31:56 UTC, bachmeier wrote:
 On Saturday, 31 October 2020 at 17:18:30 UTC, Dibyendu Majumdar 
 wrote:

 Zig seems like a copy of Jai.
How do you copy vaporware? Or has there been a release of a single line of compileable Jai that I missed?
It is not vaporware. It is in closed beta right now.
Oct 31 2020
next sibling parent reply Max Haughton <maxhaton gmail.com> writes:
On Saturday, 31 October 2020 at 20:17:02 UTC, Stefan Koch wrote:
 On Saturday, 31 October 2020 at 19:31:56 UTC, bachmeier wrote:
 On Saturday, 31 October 2020 at 17:18:30 UTC, Dibyendu 
 Majumdar wrote:

 Zig seems like a copy of Jai.
How do you copy vaporware? Or has there been a release of a single line of compileable Jai that I missed?
It is not vaporware. It is in closed beta right now.
I agree it's not vaporware but it does seem a strange way to write a compiler (even slowly).
Oct 31 2020
parent Stefan Koch <uplink.coder googlemail.com> writes:
On Saturday, 31 October 2020 at 20:38:25 UTC, Max Haughton wrote:
 On Saturday, 31 October 2020 at 20:17:02 UTC, Stefan Koch wrote:
 On Saturday, 31 October 2020 at 19:31:56 UTC, bachmeier wrote:
 On Saturday, 31 October 2020 at 17:18:30 UTC, Dibyendu 
 Majumdar wrote:

 Zig seems like a copy of Jai.
How do you copy vaporware? Or has there been a release of a single line of compileable Jai that I missed?
It is not vaporware. It is in closed beta right now.
I agree it's not vaporware but it does seem a strange way to write a compiler (even slowly).
Maybe it's strange. But I really like it. It's the responsible thing to do! Having a smaller number of people test-drive before releasing something! And therefore being able to fix it before impacting a greater number of people.
Oct 31 2020
prev sibling parent bachmeier <no spam.net> writes:
On Saturday, 31 October 2020 at 20:17:02 UTC, Stefan Koch wrote:
 On Saturday, 31 October 2020 at 19:31:56 UTC, bachmeier wrote:
 On Saturday, 31 October 2020 at 17:18:30 UTC, Dibyendu 
 Majumdar wrote:

 Zig seems like a copy of Jai.
How do you copy vaporware? Or has there been a release of a single line of compileable Jai that I missed?
It is not vaporware. It is in closed beta right now.
While there is no official body that determines if something is vaporware, my usage is very much consistent with Perl 6, which was the most famous application of the term to a programming language: heavy hype, excitement over new features, claims that progress is being made, and a belief that there will eventually be a release. It has been more than six years (assuming Wikipedia is accurate) and we only have continued promises that Jai will eventually be released.

Here is an article from early 2007 talking about Perl 6 as vaporware, and the label was surely used long before that:
https://www.techrepublic.com/blog/geekend/transition-from-perl-5x-to-perl-6/

The most important lesson to be learned from the vaporware version of Perl 6 is not that there won't be a release (there eventually was one) but that the release won't be the same as the promise. Until Jai is released, we can only talk about Jonathan Blow's ideas. An idea without an implementation is as useful as a car without gas.
Oct 31 2020
prev sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Saturday, 31 October 2020 at 19:31:56 UTC, bachmeier wrote:
 On Saturday, 31 October 2020 at 17:18:30 UTC, Dibyendu Majumdar 
 wrote:

 Zig seems like a copy of Jai.
How do you copy vaporware? Or has there been a release of a single line of compileable Jai that I missed?
Through the betas available to those who applied when it was announced, and the upcoming game that will be released.
Oct 31 2020
prev sibling next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On Thursday, 29 October 2020 at 11:50:12 UTC, Dibyendu Majumdar 
wrote:

 I was wondering if it worthwhile branding -betterC differently 
 - e.g. use a brand such as 'micro-D' or some nicer name. That 
 is, give it a new identity that highlights that it not just 
 better C  - but a D version without GC.
I think it's a bad, bad, bad idea to put any emphasis on BetterC other than as a tool to help in porting C or C++ code to D, or to integrate D into existing C and C++ projects. I see too many people reaching for it first thing, probably out of a misguided GC phobia. D is the language we need to be promoting. BetterC was intended for a specific purpose. Beyond that, it's a crippled D. If some people prefer to use it that way, fine, but we shouldn't encourage it.
Oct 29 2020
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Oct 29, 2020 at 02:28:58PM +0000, Mike Parker via Digitalmars-d wrote:
 On Thursday, 29 October 2020 at 11:50:12 UTC, Dibyendu Majumdar wrote:
[...]
 I was wondering if it worthwhile branding -betterC differently -
 e.g.  use a brand such as 'micro-D' or some nicer name. That is,
 give it a new identity that highlights that it not just better C  -
 but a D version without GC.
I think it's a bad, bad, bad idea to put any emphasis on BetterC other than as a tool to help in porting C or C++ code to D, or to integrate D into existing C and C++ projects. I see too many people reaching for it first thing, probably out of a misguided GC phobia. D is the language we need to be promoting. BetterC was intended for a specific purpose. Beyond that, it's a crippled D. If some people prefer to use it that way, fine, but we shouldn't encourage it.
+1.

Most people with GC phobia don't actually need to avoid the GC. You really only need to avoid the GC if you're working in very specific niches, like hard real-time requirements (game engines, rocket booster controllers, etc.). Your general software project does not need to avoid the GC; you just need to know how to use it effectively (and/or apply @nogc where it matters).

T

-- 
Freedom of speech: the whole world has no right *not* to hear my spouting off!
Oct 29 2020
next sibling parent reply Dibyendu Majumdar <d.majumdar gmail.com> writes:
On Thursday, 29 October 2020 at 14:43:50 UTC, H. S. Teoh wrote:
 On Thu, Oct 29, 2020 at 02:28:58PM +0000, Mike Parker via 
 Digitalmars-d wrote:
 
 I think it's a bad, bad, bad idea to put any emphasis on 
 BetterC other than as a tool to help in porting C or C++ code 
 to D, or to integrate D into existing C and C++ projects. I 
 see too many people reaching for it first thing, probably out 
 of a misguided GC phobia. D is the language we need to be 
 promoting. BetterC was intended for a specific purpose. Beyond 
 that, it's a crippled D. If some people prefer to use it that 
 way, fine, but we shouldn't encourage it.
+1. Most people with GC phobia don't actually need to avoid the GC. You really only need to avoid the GC if you're working in very specific niches, like hard real-time requirements (game engines, rocket booster controllers, etc.). Your general software project does not need to avoid the GC; you just need to know how to use it effectively (and/or apply nogc where it matters).
Maybe my point is being missed. There is a market for languages that do what D does but where GC is not an option. D's better C version is essentially what they need, so if you are happy to cede that market to Jai/Zig and similar then fine.
Oct 29 2020
next sibling parent reply bachmeier <no spam.net> writes:
On Thursday, 29 October 2020 at 16:15:28 UTC, Dibyendu Majumdar 
wrote:
 On Thursday, 29 October 2020 at 14:43:50 UTC, H. S. Teoh wrote:
 On Thu, Oct 29, 2020 at 02:28:58PM +0000, Mike Parker via 
 Digitalmars-d wrote:
 Maybe my point is being missed. There is a market for languages 
 that do what D does but where GC is not an option. D's better C 
 version is essentially what they need, so if you are happy to 
 cede that market to Jai/Zig and similar then fine.
I can't think of a better name than betterC if your goal is to do the same things as with C, only with some added features. The important thing to me is to emphasize that it's a subset of C but has many limitations. I'm in strong agreement with Mike on this one. More generally, if you want to promote it effectively, write a big project using betterC and promote your project instead.
Oct 29 2020
parent bachmeier <no spam.net> writes:
On Thursday, 29 October 2020 at 18:20:56 UTC, bachmeier wrote:
 On Thursday, 29 October 2020 at 16:15:28 UTC, Dibyendu Majumdar 
 wrote:
 On Thursday, 29 October 2020 at 14:43:50 UTC, H. S. Teoh wrote:
 On Thu, Oct 29, 2020 at 02:28:58PM +0000, Mike Parker via 
 Digitalmars-d wrote:
 Maybe my point is being missed. There is a market for 
 languages that do what D does but where GC is not an option. 
 D's better C version is essentially what they need, so if you 
 are happy to cede that market to Jai/Zig and similar then fine.
I can't think of a better name than betterC if your goal is to do the same things as with C, only with some added features. The important thing to me is to emphasize that it's a subset of C but has many limitations. I'm in strong agreement with Mike on this one. More generally, if you want to promote it effectively, write a big project using betterC and promote your project instead.
subset of C -> superset of C
Oct 29 2020
prev sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Thursday, 29 October 2020 at 16:15:28 UTC, Dibyendu Majumdar 
wrote:
 On Thursday, 29 October 2020 at 14:43:50 UTC, H. S. Teoh wrote:
 On Thu, Oct 29, 2020 at 02:28:58PM +0000, Mike Parker via 
 Digitalmars-d wrote:
 
 I think it's a bad, bad, bad idea to put any emphasis on 
 BetterC other than as a tool to help in porting C or C++ code 
 to D, or to integrate D into existing C and C++ projects. I 
 see too many people reaching for it first thing, probably out 
 of a misguided GC phobia. D is the language we need to be 
 promoting. BetterC was intended for a specific purpose. 
 Beyond that, it's a crippled D. If some people prefer to use 
 it that way, fine, but we shouldn't encourage it.
+1. Most people with GC phobia don't actually need to avoid the GC. You really only need to avoid the GC if you're working in very specific niches, like hard real-time requirements (game engines, rocket booster controllers, etc.). Your general software project does not need to avoid the GC; you just need to know how to use it effectively (and/or apply nogc where it matters).
Maybe my point is being missed. There is a market for languages that do what D does but where GC is not an option. D's better C version is essentially what they need, so if you are happy to cede that market to Jai/Zig and similar then fine.
That market exists because of GC-phobia.

https://www.ptc.com/en/products/developer-tools/perc
https://astrobe.com/Oberon.htm, https://astrobe.com/boards.htm
https://www.wildernesslabs.co/developers
https://www.f-secure.com/en/consulting/foundry/usb-armory
https://dotnet.microsoft.com/apps/games

All examples of products whose authors just didn't care for the anti-GC crowd and pushed forward with their ideas.
Oct 29 2020
prev sibling parent Max Haughton <maxhaton gmail.com> writes:
On Thursday, 29 October 2020 at 14:43:50 UTC, H. S. Teoh wrote:
 On Thu, Oct 29, 2020 at 02:28:58PM +0000, Mike Parker via 
 Digitalmars-d wrote:
 On Thursday, 29 October 2020 at 11:50:12 UTC, Dibyendu 
 Majumdar wrote:
[...]
 [...]
I think it's a bad, bad, bad idea to put any emphasis on BetterC other than as a tool to help in porting C or C++ code to D, or to integrate D into existing C and C++ projects. I see too many people reaching for it first thing, probably out of a misguided GC phobia. D is the language we need to be promoting. BetterC was intended for a specific purpose. Beyond that, it's a crippled D. If some people prefer to use it that way, fine, but we shouldn't encourage it.
+1. Most people with GC phobia don't actually need to avoid the GC. You really only need to avoid the GC if you're working in very specific niches, like hard real-time requirements (game engines, rocket booster controllers, etc.). Your general software project does not need to avoid the GC; you just need to know how to use it effectively (and/or apply nogc where it matters). T
We could really use some more @nogc containers that use the allocators in the standard library. Even if they don't work that well, it's good marketing for the language.

Unfortunately I think D's approximation of move semantics might be a problem?
Oct 29 2020
prev sibling next sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Thursday, 29 October 2020 at 14:28:58 UTC, Mike Parker wrote:
 On Thursday, 29 October 2020 at 11:50:12 UTC, Dibyendu Majumdar 
 wrote:

 I was wondering if it worthwhile branding -betterC differently 
 - e.g. use a brand such as 'micro-D' or some nicer name. That 
 is, give it a new identity that highlights that it not just 
 better C  - but a D version without GC.
I think it's a bad, bad, bad idea to put any emphasis on BetterC other than as a tool to help in porting C or C++ code to D, or to integrate D into existing C and C++ projects. I see too many people reaching for it first thing, probably out of a misguided GC phobia. D is the language we need to be promoting. BetterC was intended for a specific purpose. Beyond that, it's a crippled D. If some people prefer to use it that way, fine, but we shouldn't encourage it.
Indeed, what I appreciate in the Java Panama/Valhala efforts, C# 7+ memory handling, Swift and Go, is the attitude to improve their low level programming capabilities without making concessions to the anti-GC folks.

D has more to gain by fixing the corner cases in the language, compiler/library bugs, and having a good library ecosystem than trying to please everyone.
Oct 29 2020
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 29 October 2020 at 21:13:20 UTC, Paulo Pinto wrote:
Indeed, what I appreciate in the Java Panama/Valhala efforts, C# 7+ memory handling, Swift and Go, is the attitude to improve their low level programming capabilities without making concessions to the anti-GC folks.
But Swift uses refcounting...
Oct 29 2020
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Thursday, 29 October 2020 at 21:34:06 UTC, Ola Fosheim Grøstad 
wrote:
 On Thursday, 29 October 2020 at 21:13:20 UTC, Paulo Pinto wrote:
Indeed, what I appreciate in the Java Panama/Valhala efforts, C# 7+ memory handling, Swift and Go, is the attitude to improve their low level programming capabilities without making concessions to the anti-GC folks.
But Swift uses refcounting...
Which anyone with proper compiler training knows is a GC algorithm, unless their CS professor really screwed up their CS books.
Oct 30 2020
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 30 October 2020 at 07:19:13 UTC, Paulo Pinto wrote:
 On Thursday, 29 October 2020 at 21:34:06 UTC, Ola Fosheim 
 Grøstad wrote:
 On Thursday, 29 October 2020 at 21:13:20 UTC, Paulo Pinto 
 wrote:
Indeed, what I appreciate in the Java Panama/Valhala efforts, C# 7+ memory handling, Swift and Go, is the attitude to improve their low level programming capabilities without making concessions to the anti-GC folks.
But Swift uses refcounting...
Which anyone with proper compiler training knows is a GC algorithm, unless their CS professor really screwed up their CS books.
In the forums GC usually means a tracing GC. C++ has GC too, then.
Oct 30 2020
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 30 October 2020 at 07:46:58 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 30 October 2020 at 07:19:13 UTC, Paulo Pinto wrote:
 On Thursday, 29 October 2020 at 21:34:06 UTC, Ola Fosheim 
 Grøstad wrote:
 On Thursday, 29 October 2020 at 21:13:20 UTC, Paulo Pinto 
 wrote:
Indeed, what I appreciate in the Java Panama/Valhala efforts, C# 7+ memory handling, Swift and Go, is the attitude to improve their low level programming capabilities without making concessions to the anti-GC folks.
But Swift uses refcounting...
Which anyone with proper compiler teachings knows is a GC algorithm, unless their CS professor really screw up their CS books.
In the forums GC usually means a tracing GC. C++ has GC too, then.
I like to fix urban myths, and RC not being a GC algorithm is such a myth. The other one is about its actual performance against modern tracing GC implementations.

Regarding C++, it does have one as part of the C++11 GC API, C++/CLI, C++/CX and Unreal C++.
Oct 30 2020
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 30 October 2020 at 08:39:17 UTC, Paulo Pinto wrote:
 Regarding C++, it does have one as part of C++11 GC API, 
 C++/CLI, C++/CX and Unreal C++.
There are some definitions in the standard library for a tracing GC in C++. What I meant was that atomic_shared_ptr might qualify as automatic reference counting since you don't have to do explicit acquire/release. Even though most people don't think that C++ has automatic reference counting. Swift has some optimization of it though.
Oct 30 2020
parent reply Steven Schveighoffer <schveiguy gmail.com> writes:
On 10/30/20 5:32 AM, Ola Fosheim Grøstad wrote:
 On Friday, 30 October 2020 at 08:39:17 UTC, Paulo Pinto wrote:
 Regarding C++, it does have one as part of C++11 GC API, C++/CLI, 
 C++/CX and Unreal C++.
There are some definitions in the standard library for a tracing GC in C++. What I meant was that atomic_shared_ptr might qualify as automatic reference counting since you don't have to do explicit acquire/release. Even though most people don't think that C++ has automatic reference counting. Swift has some optimization of it though.
Yes, Swift builds on top of Objective-C ARC. Pairs of RC increments and decrements are elided because the compiler knows what it means. And Swift no longer has to deal with autorelease pools, as the compiler is smarter about return values. I think D was looking at one point to be able to do this, but via a specialized library hook.

The difference between RC and a tracing GC is that cycles can prevent garbage from being detected in RC, whereas tracing can find those.

Even in Swift, you have a notion of weak references that get auto-nullified when the target gets destroyed, and not properly setting up an ownership relationship will result in items not being collected. But certainly it's much easier than the Objective-C requirements.

-Steve
Oct 30 2020
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 30 October 2020 at 14:24:43 UTC, Steven Schveighoffer 
wrote:
 I think D was looking at one point to be able to do this, but 
 via a specialized library hook.

 The difference between RC and a tracing GC is that cycles can 
 prevent garbage from being detected in RC, whereas tracing can 
 find those.

 Even in swift, you have a notion of weak references that get 
 auto-nullified when the target gets destroyed, and not properly 
 setting up an ownership relationship will result in items not 
 being collected. But certainly it's much easier than the 
 Objective-C requirements.
*nods* Yes, you need weak references. I did like your idea of having reference counting and using the current GC-collection infrastructure for detecting cycles and dangling pointers. I think dynamic sanitizers can be very powerful. Class objects are already quite heavy as each interface adds another pointer to each object, so I don't think adding two counters would make a catastrophic difference. I once made a prototype refcounting scheme in D with borrowing that tracked the number of borrows at runtime. You could even add tracking of __FILE__ and __LINE__ and get good debug tracking during testing. So there are interesting possibilities without "throwing out everything". E.g. retaining key properties of the tracing GC for development and drop down to ARC at release.
Oct 30 2020
prev sibling next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 29 October 2020 at 14:28:58 UTC, Mike Parker wrote:
 I think it's a bad, bad, bad idea to put any emphasis on 
 BetterC other than as a tool to help in porting C or C++ code 
 to D, or to integrate D into existing C and C++ projects.
Only because it is a poster child. If exceptions were value types and classes were structs with some extra fields then betterC would be more attractive.
Oct 29 2020
parent reply Paul Backus <snarwin gmail.com> writes:
On Thursday, 29 October 2020 at 21:45:47 UTC, Ola Fosheim Grøstad 
wrote:
 On Thursday, 29 October 2020 at 14:28:58 UTC, Mike Parker wrote:
 I think it's a bad, bad, bad idea to put any emphasis on 
 BetterC other than as a tool to help in porting C or C++ code 
 to D, or to integrate D into existing C and C++ projects.
Only because it is a poster child. If exceptions were value types and classes were structs with some extra fields then betterC would be more attractive.
I have a DIP I think you'll like: https://github.com/dlang/DIPs/blob/02594a09d9daf1d393ebce2cfc2f0c4f90d4bcf8/DIPs/1NNN-PJB.md
Oct 29 2020
next sibling parent reply Daniel Kozak <kozzi11 gmail.com> writes:
On Thu, Oct 29, 2020 at 11:00 PM Paul Backus via Digitalmars-d <
digitalmars-d puremagic.com> wrote:

 I have a DIP I think you'll like:


 https://github.com/dlang/DIPs/blob/02594a09d9daf1d393ebce2cfc2f0c4f90d4bcf8/DIPs/1NNN-PJB.md
Nice DIP, but why not add it to https://dlang.org/phobos/core_attribute.html? This would mean no breaking change.
Oct 30 2020
next sibling parent Paul Backus <snarwin gmail.com> writes:
On Friday, 30 October 2020 at 09:15:32 UTC, Daniel Kozak wrote:
 On Thu, Oct 29, 2020 at 11:00 PM Paul Backus via Digitalmars-d 
 < digitalmars-d puremagic.com> wrote:

 I have a DIP I think you'll like:


 https://github.com/dlang/DIPs/blob/02594a09d9daf1d393ebce2cfc2f0c4f90d4bcf8/DIPs/1NNN-PJB.md
Nice DIP, but why not added as a https://dlang.org/phobos/core_attribute.html this would mean no breaking change
No reason at all--I just didn't think of it at the time. Thanks for the tip!
Oct 30 2020
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2020-10-30 10:15, Daniel Kozak wrote:

 Nice DIP, but why not added as a 
 https://dlang.org/phobos/core_attribute.html this would mean no breaking 
 change
Technically any new public symbol can cause a breaking change. But a compiler-recognized UDA is less likely to cause a breaking change than a built-in attribute. And if it does break some code, that can be resolved by using the fully qualified name (or similar techniques). There's no need to rename the UDA.

-- 
/Jacob Carlborg
Oct 30 2020
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2020-10-29 22:57, Paul Backus wrote:

 I have a DIP I think you'll like:
 
 https://github.com/dlang/DIPs/blob/02594a09d9daf1d393ebce2cfc2f0c4f90d4b
f8/DIPs/1NNN-PJB.md 
Your main rationale seems to be error handling without exceptions. In my opinion there are better ways to deal with error handling than returning errors. For example, there's nothing wrong with exceptions per se (well, a few details), it's just that the implementation is a bit unfortunate. In my opinion there's nothing that stops exceptions from being implemented in some other way that does not require the runtime, like a Result struct or as tagged unions with some help from the compiler.

The following code:

enum Error { fileNotFound }

string readFile(string path) throw(Error)
{
    throw Error.fileNotFound;
}

void foo() throw(auto)
{
    string content = try readFile("foo.txt");
}

void bar()
{
    try foo();
    catch (Error e)
        writeln("An error occurred: ", e);
}

Can be lowered to something like:

struct Result(Value, Error)
{
    Value value;
    Error error;
    bool isSuccessful;

    this(Value value)
    {
        this.value = value;
        isSuccessful = true;
    }

    this(Error error)
    {
        this.error = error;
        isSuccessful = false;
    }
}

Result!(string, Error) readFile(string path) nothrow
{
    return Result!(string, Error)(Error.fileNotFound);
}

Result!(void, Error) foo() nothrow
{
    string content;
    auto __result = readFile("foo.txt");

    if (__result.isSuccessful)
        content = __result.value;
    else
        return Result!(void, Error)(__result.error);

    return Result!(void, Error)();
}

void bar() nothrow
{
    auto __result = foo();

    if (!__result.isSuccessful)
        writeln("An error occurred: ", __result.error);
}

Have a look at this C++ proposal [1] by Herb Sutter and the error handling in Zig [2].

[1] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p0709r0.pdf
[2] https://ziglang.org/documentation/master/#Errors

-- 
/Jacob Carlborg
Oct 31 2020
prev sibling parent Dylan Graham <dylan.graham2000 gmail.com> writes:
On Thursday, 29 October 2020 at 14:28:58 UTC, Mike Parker wrote:
 On Thursday, 29 October 2020 at 11:50:12 UTC, Dibyendu Majumdar 
 wrote:

 I was wondering if it worthwhile branding -betterC differently 
 - e.g. use a brand such as 'micro-D' or some nicer name. That 
 is, give it a new identity that highlights that it not just 
 better C  - but a D version without GC.
I think it's a bad, bad, bad idea to put any emphasis on BetterC other than as a tool to help in porting C or C++ code to D, or to integrate D into existing C and C++ projects. I see too many people reaching for it first thing, probably out of a misguided GC phobia. D is the language we need to be promoting. BetterC was intended for a specific purpose. Beyond that, it's a crippled D. If some people prefer to use it that way, fine, but we shouldn't encourage it.
I use BetterC in my products (car engine/gearbox parts) targeting ARM Cortex MCUs, but once I get a lightweight D runtime up I intend to switch to that. I think such a thing could reduce reliance on BetterC and allow for quicker easing into full D.

Given that it's a real-time environment, I'm thinking that Phobos allocations will be tracked in a stack-like structure per thread and can then be freed at a later, convenient opportunity, or immediately after the Phobos call ends. I could see about making some sort of small-scale GC, but that's currently out of my depth.
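The per-thread stack-like tracking described above can be sketched as a plain bump allocator with mark/release semantics. This is only a hedged illustration, not Dylan's actual implementation; the ThreadArena name and the fixed 4096-byte buffer are made up, and a real MCU build would size and place the storage deliberately:

// A bump allocator: allocations made during a call are carved from a
// thread-local buffer and can all be released at once at a convenient point.
struct ThreadArena
{
    ubyte[4096] buffer;   // static storage, no GC involved
    size_t top;           // current high-water mark

    void[] allocate(size_t size)
    {
        if (top + size > buffer.length)
            return null;  // out of arena space
        auto mem = buffer[top .. top + size];
        top += size;
        return mem;
    }

    // Save the current position, then roll back to it later,
    // freeing everything allocated in between in one step.
    size_t mark() { return top; }
    void release(size_t savedMark) { top = savedMark; }
}

ThreadArena arena;  // module-level variables are thread-local by default in D

unittest
{
    auto m = arena.mark();
    auto a = arena.allocate(128);
    assert(a !is null && a.length == 128);
    arena.release(m);  // everything since the mark is freed at once
    assert(arena.mark() == m);
}

The mark/release pair is what makes this "stack-like": nested Phobos calls each take a mark on entry and release on exit, so cleanup cost is constant regardless of how much was allocated.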
Nov 01 2020
prev sibling parent reply Dibyendu Majumdar <d.majumdar gmail.com> writes:
On Thursday, 29 October 2020 at 11:50:12 UTC, Dibyendu Majumdar 
wrote:

 I was wondering if it worthwhile branding -betterC differently 
 - e.g. use a brand such as 'micro-D' or some nicer name. That 
 is, give it a new identity that highlights that it not just 
 better C  - but a D version without GC.
Early next year I am hoping to start a fork of D. The idea is to simply create a new version that has the betterC option baked in, i.e. full D will be optional. Looking for a new name for this subset of D:

Dig
Deal
Delite

Any suggestions for a name?

My plan is to keep the effort low by:

a) just creating a new name
b) changing the default to the betterC option
c) creating a cut-down distro that works with betterC
Nov 04 2020
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 4 November 2020 at 11:23:36 UTC, Dibyendu Majumdar 
wrote:
 On Thursday, 29 October 2020 at 11:50:12 UTC, Dibyendu Majumdar 
 wrote:
 Any suggestions for a name?

 My plan is to keep the effort low by:
 a) just creating new name
 b) change the default for option for betterC
 c) create a cut-down distro that works with betterC
Feel free to share your ideas, maybe as issues, on this experimental repo: https://github.com/OlaFosheimGrostad/dex

Nothing is set in stone and I won't have time to play with it until late December anyway. I call it «Dex» for «D-experimental» or «D-extended».

I am thinking about limiting it to Better-D with classes and return-exceptions, so that it will compile most existing D code, but with a small runtime that is embedded friendly. I'd like to add more sanitizers, maybe borrowing code from clang.
Nov 04 2020