
digitalmars.D - Nim programming language finally hit 1.0

reply Rel <relmail rambler.ru> writes:
https://nim-lang.org/blog/2019/09/23/version-100-released.html

Well, the Nim team finally released 1.0. Now future releases 
shouldn't break people's code, and that should increase 
language adoption. Still, a few things seem to be unfinished 
(like their NewRuntime), but I'd like to congratulate the 
Nim team on this big release. What do you think about it?
Sep 25 2019
next sibling parent reply JN <666total wp.pl> writes:
On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 https://nim-lang.org/blog/2019/09/23/version-100-released.html

 Well, the Nim team finally released 1.0. Now future releases 
 shouldn't break people's code, and that should increase 
 language adoption. Still, a few things seem to be unfinished 
 (like their NewRuntime), but I'd like to congratulate the 
 Nim team on this big release. What do you think about it?
I've been looking at Nim every now and then, but I don't really like it. It's trying to cover too many bases. Don't like camelCase? Don't worry, snake_case and PascalCase work too. Don't like GC? Don't worry, you can use refcounting or manual memory management too. Considering how APIs and libraries have to adapt to all these cases, I imagine the standard library is quite a mess.
Sep 25 2019
next sibling parent bauss <jj_1337 live.dk> writes:
On Wednesday, 25 September 2019 at 10:37:07 UTC, JN wrote:
 I've been looking at Nim every now and then, but I don't really 
 like it. It's trying to cover too many bases. Don't like 
 camelCase? Don't worry, snake_case and PascalCase work too.
My biggest issue with Nim too. It's going to be a mess, just like PHP was, with function names in all kinds of cases so you never know which one it is. Is it to_lower(str), toLower(str), ToLower(str), or tolower(str)?
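For reference, Nim's identifier comparison is style-insensitive rather than multiplying names: per its manual, the first character is case-sensitive and the rest is compared case-insensitively with underscores ignored. A minimal Python sketch of that rule (`nim_identifier_key` is a hypothetical helper name, not Nim API):

```python
def nim_identifier_key(ident: str) -> str:
    """Normalize an identifier the way Nim compares them:
    first character case-sensitive, the remainder compared
    case-insensitively with underscores ignored."""
    head, tail = ident[0], ident[1:]
    return head + tail.replace("_", "").lower()

# Under this rule, three of the four spellings name the same symbol:
assert nim_identifier_key("toLower") == nim_identifier_key("to_lower")
assert nim_identifier_key("toLower") == nim_identifier_key("tolower")
# ...but ToLower stays distinct, because the first letter counts.
assert nim_identifier_key("ToLower") != nim_identifier_key("toLower")
```

So a call written as to_lower(str) resolves to the same routine as toLower(str), which is arguably the opposite of PHP's situation, though opinions differ on whether that helps readability.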
Sep 29 2019
prev sibling parent zoujiaqing <zoujiaqing gmail.com> writes:
On Wednesday, 25 September 2019 at 10:37:07 UTC, JN wrote:
 Don't like GC? Don't worry, you can use refcounting and manual
But I like ARC. I'd use ARC instead of GC.
Oct 04 2019
prev sibling next sibling parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 https://nim-lang.org/blog/2019/09/23/version-100-released.html

 Well, the Nim team finally released 1.0. Now future releases 
 shouldn't break people's code, and that should increase 
 language adoption. Still, a few things seem to be unfinished 
 (like their NewRuntime), but I'd like to congratulate the 
 Nim team on this big release. What do you think about it?
The link above reads almost like a summary of "Don't do what D did!". Congratulations!

I've been looking at Nim on and off (like most people, I suppose). One thing that really turns me off is that indentation is an integral part of the syntax [1]. Nim designers seem to forget that Python introduced forced indentation because Python was meant to be used by non-programmers (chemists, biologists etc.), and thus this feature was there to "help" non-programmers to keep their code clean and tidy (cf. Perl!). However, Nim is targeting (experienced) programmers who really don't need a patronizing feature like that. This is a real bummer, in my opinion, like selling a bottle of whiskey with a child safety lock to a publican.

[1] https://nim-lang.org/docs/manual.html#lexical-analysis-indentation
Sep 25 2019
next sibling parent reply Russel Winder <russel winder.org.uk> writes:
On Wed, 2019-09-25 at 10:59 +0000, Chris via Digitalmars-d wrote:
[…]
 clean and tidy (cf. Perl!). However, Nim is targeting 
 (experienced) programmers who really don't need a patronizing 
 feature like that. This is a real bummer, in my opinion, like
[…] Using whitespace instead of curly braces is not patronising, it is a language choice.
--
Russel.
===========================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk
Sep 25 2019
parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 25 September 2019 at 12:50:42 UTC, Russel Winder 
wrote:
 On Wed, 2019-09-25 at 10:59 +0000, Chris via Digitalmars-d 
 wrote: […]
 clean and tidy (cf. Perl!). However, Nim is targeting 
 (experienced) programmers who really don't need a patronizing 
 feature like that. This is a real bummer, in my opinion, like
[…] Using whitespace instead of curly braces is not patronising, it is a language choice.
I will not engage in a discussion about Python-style whitespace tyranny again. In fact, it can render code actually _less_ readable and trigger compiler errors for no reason other than "missing indentation on line 149", copying and pasting can be a real nightmare. Fixing indentation can be time consuming - and what for? Forcing users to write "clean code". It might be a language choice - but then it's a bad one, and the fact remains that Python introduced it to nudge non-programmers into structuring their code. In other words, it was meant to be patronizing from the start. Indentation style should be decided on by the teams using the language and not be part of the language. Why is this so hard to understand?
Sep 25 2019
next sibling parent Russel Winder <russel winder.org.uk> writes:
On Wed, 2019-09-25 at 13:04 +0000, Chris via Digitalmars-d wrote:
[…]
 I will not engage in a discussion about Python-style whitespace 
 tyranny again. In fact, it can render code actually _less_ 
 readable and trigger compiler errors for no reason other than 
 "missing indentation on line 149", copying and pasting can be a 
 real nightmare. Fixing indentation can be time consuming - and 
 what for? Forcing users to write "clean code". It might be a 
 language choice - but then it's a bad one, and the fact remains 
 that Python introduced it to nudge non-programmers into 
 structuring their code. In other words, it was meant to be 
 patronizing from the start. Indentation style should be decided 
 on by the teams using the language and not be part of the 
 language. Why is this so hard to understand?
Because it is your unsubstantiated opinion cast as some form of fact that is not a truth. You have a right to your opinion, and clearly you hate the Python way of indentation (which must therefore include any programming language with an "offside rule" such as Haskell). You do not have a right to try and state that your opinion is fact that others must also believe.
--
Russel.
Sep 25 2019
prev sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 25 September 2019 at 13:04:09 UTC, Chris wrote:
 I will not engage in a discussion about Python-style whitespace 
 tyranny again. In fact, it can render code actually _less_ 
 readable and trigger compiler errors for no reason other than
It is annoying if using a plain text editor, but I find it to be enjoyable when using a dedicated Python editor. I feel this is an area where current languages will start to look dated as new languages appear that are designed alongside the tooling for the language. Plain text is a rather crude representation.
Sep 27 2019
parent reply Russel Winder <russel winder.org.uk> writes:
On Fri, 2019-09-27 at 14:24 +0000, Ola Fosheim Grøstad via Digitalmars-d wrote:
[…]
 I feel this is an area where current languages will start to look 
 dated as new languages appear that are designed alongside the 
 tooling for the language.

 Plain text is a rather crude representation.
Around 1986 people were shifting from using text editors to structure oriented editors of one form or another, where the stored representation of the code was some form of tree. Sadly the take-up of these editors was drowned out by the clamour from programmers: "you cannot take our text file representation of our code from us". So now we have IDEs that spend vast amounts of CPU power dynamically recreating tree representations of the text files so as to provide all the programming language specific support. I think I am going to chalk this one up to karma.
--
Russel.
Sep 27 2019
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 27 September 2019 at 14:48:39 UTC, Russel Winder wrote:
 I think I am going to chalk this one up to karma.
Well, I don't know. It is easier to evolve UIs today, so I believe this will change, but it will take several iterations to get there. I guess a new paradigm for evolving software is needed, an approach that is more iterative and REPL-oriented perhaps. Nobody would want to design electronics using only a text editor... they have capable visual tooling. But electronics started out with a visual representation on paper, so the tooling was a translation from paper; that probably made adoption easier.
Sep 27 2019
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 25 September 2019 at 10:59:55 UTC, Chris wrote:
 On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 https://nim-lang.org/blog/2019/09/23/version-100-released.html

 Well, the Nim team finally released 1.0. Now future releases 
 shouldn't break people's code, and that should increase 
 language adoption. Still, a few things seem to be unfinished 
 (like their NewRuntime), but I'd like to congratulate the 
 Nim team on this big release. What do you think about it?
The link above reads almost like a summary of "Don't do what D did!". Congratulations! I've been looking at Nim on and off (like most people, I suppose). One thing that really turns me off is that indentation is an integral part of the syntax [1]. Nim designers seem to forget that Python introduced forced indentation, because Python was meant to be used by non-programmers (chemists, biologists etc.), and thus this feature was there to "help" non-programmers to keep their code clean and tidy (cf. Perl!). However, Nim is targeting (experienced) programmers who really don't need a patronizing feature like that. This is a real bummer, in my opinion, like selling a bottle of whiskey with a child safety lock to a publican. [1] https://nim-lang.org/docs/manual.html#lexical-analysis-indentation
Forced indentation goes back to 70's languages. Python did nothing new there.
Sep 25 2019
parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 25 September 2019 at 14:24:39 UTC, Paulo Pinto 
wrote:
 On Wednesday, 25 September 2019 at 10:59:55 UTC, Chris wrote:
 On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 https://nim-lang.org/blog/2019/09/23/version-100-released.html

 Well, the Nim team finally released 1.0. Now future releases 
 shouldn't break people's code, and that should increase 
 language adoption. Still, a few things seem to be unfinished 
 (like their NewRuntime), but I'd like to congratulate the 
 Nim team on this big release. What do you think about it?
The link above reads almost like a summary of "Don't do what D did!". Congratulations! I've been looking at Nim on and off (like most people, I suppose). One thing that really turns me off is that indentation is an integral part of the syntax [1]. Nim designers seem to forget that Python introduced forced indentation, because Python was meant to be used by non-programmers (chemists, biologists etc.), and thus this feature was there to "help" non-programmers to keep their code clean and tidy (cf. Perl!). However, Nim is targeting (experienced) programmers who really don't need a patronizing feature like that. This is a real bummer, in my opinion, like selling a bottle of whiskey with a child safety lock to a publican. [1] https://nim-lang.org/docs/manual.html#lexical-analysis-indentation
Forced indentation goes back to 70's languages. Python did nothing new there.
Let me guess: the reasoning behind it was the same in the '70s. Force "clean code"? I.e. patronizing. Please enlighten me. Anyway, Python made forced indentation popular. Python became a mainstream language "by accident", and because of its popularity people thought that forced indentation was great, although it has its merits only in contexts where most users are not programmers (e.g. the natural sciences at universities).
Sep 25 2019
parent reply bpr <brogoff gmail.com> writes:
On Wednesday, 25 September 2019 at 14:36:44 UTC, Chris wrote:
 Let me guess, the reasoning behind it was the same in the 
 70ies. Force "clean code"? I.e. patronizing. Please enlighten 
 me.
I believe it was introduced in the seminal paper "The next 700 programming languages" by Peter J. Landin, in that paper's ISWIM abstract language, and later picked up by implemented languages. ABC was the one that most influenced Python. My guess is that it had nothing to do with being patronizing and that it was just a more readable notation, but that's just, like, my opinion, man.
Sep 25 2019
parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 25 September 2019 at 15:05:59 UTC, bpr wrote:
 On Wednesday, 25 September 2019 at 14:36:44 UTC, Chris wrote:
 Let me guess, the reasoning behind it was the same in the 
 70ies. Force "clean code"? I.e. patronizing. Please enlighten 
 me.
I believe it was introduced in the seminal paper "The next 700 programming languages" by Peter J. Landin, in the ISWIM abstract language of that paper, and later picked up by implemented languages later. ABC was the one that most influenced Python. My guess is that it had nothing to do with being patronizing, and that it was just a more readable notation, but that's just like, my opinion man.
Thanks. Yeah, I guessed its goal was a more readable notation, but you can get that with any language that doesn't force you to indent. It's common sense that you structure your code visually, but I call it patronizing, or nanny syntax, if you are forced to use whitespace. There are many ways to visually structure your code, but this should be up to the programmer / team / company.
Sep 25 2019
parent reply JN <666total wp.pl> writes:
On Wednesday, 25 September 2019 at 15:11:57 UTC, Chris wrote:
 Thanks. Yeah, I guessed its goal was to get a more readable 
 notation, but you can get that with any language that doesn't 
 force you to indent. It's common sense that you structure your 
 code visually, but I call it patronizing or nanny syntax, if 
 you are forced to use whitespace, there are many ways to 
 visually structure your code but this should be up to the 
 programmer / team / company.
I guess it's just a matter of preference. Personally, "forced indentation" is one of my favourite features of Python. Curly braces are mostly unnecessary noise to me. Also, I think you underestimate how hard it is to find a missing brace in code sometimes. For example, if you forget a closing brace on a method, the compiler thinks the next methods are inner methods, and it breaks on a "class Foo" declaration 200 lines of code later.
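The flip side of that trade-off is that an inconsistently indented block is reported at the line where it occurs. A small sketch (a hypothetical snippet compiled from a string so the error can be inspected):

```python
# A deliberately mis-indented snippet: line 3 dedents to a level
# that matches no open block.
bad_source = (
    "def outer():\n"
    "    x = 1\n"
    "  y = 2\n"
)

try:
    compile(bad_source, "<example>", "exec")
except IndentationError as err:
    # The error points at the offending line itself, not at a
    # declaration hundreds of lines later.
    print(err.lineno)  # 3
```

Contrast this with a brace language, where the parser happily consumes the mis-nested code until something structurally impossible appears much further down.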
Sep 25 2019
parent Chris <wendlec tcd.ie> writes:
On Wednesday, 25 September 2019 at 20:56:49 UTC, JN wrote:
 I guess it's just a matter of preference. Personally, "forced 
 indentation" is one of my favourite features of Python. Curly 
 braces are mostly unnecessary noise to me. Also, I think you 
 underestimate how hard it is to find a missing brace in code 
 sometimes. For example if you forget a closing brace on a 
 method, and then the compiler thinks next methods are inner 
 methods and it breaks on a "class Foo" declaration 200 lines of 
 code later.
I know, I've had my fair share of both (missing braces and missing indentation), but I find missing braces easier to fix, and they don't happen nearly as often as missing / wrong indentation. If you have to move lines and blocks around for testing / debugging / refactoring purposes, forced indentation can be a real PIA, and you have to spend time fixing things that shouldn't be broken in the first place. Worst of all, you don't just have to look for a missing brace, you have to check the optical alignment on the screen - and what will you do if the statement opening the block is (50-100 lines) out of sight? A waste of time, or "diminishing returns" [1]. [1] https://en.wikipedia.org/wiki/Diminishing_returns
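For what it's worth, the re-indentation chore when moving a block between nesting levels is at least mechanical; Python's own standard library exposes both halves of it (a sketch with a made-up two-line block):

```python
import textwrap

# A block copied out of one nesting level (indented 4 spaces)...
pasted = "    if ready:\n        launch()\n"

# ...is re-homed by stripping the old common indentation and
# applying the new level in one pass.
moved = textwrap.indent(textwrap.dedent(pasted), " " * 8)
print(moved)
# →         if ready:
#             launch()
```

Editors automate the same two steps, though of course the tooling cannot guess the *intended* level when a paste lands ambiguously, which is the heart of the complaint above.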
Sep 26 2019
prev sibling next sibling parent reply IGotD- <nise nise.com> writes:
On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 https://nim-lang.org/blog/2019/09/23/version-100-released.html

 Well, the Nim team finally released 1.0. Now future releases 
 shouldn't break people's code, and that should increase 
 language adoption. Still, a few things seem to be unfinished 
 (like their NewRuntime), but I'd like to congratulate the 
 Nim team on this big release. What do you think about it?
I'm happy that they have come to the 1.0 release and I wish them the best. I have personally tinkered with Nim to see if it fits my projects. The language is OK, but I find the standard libraries a bit inconsistent. Syntax-wise it's quite nice, and you can achieve a lot with fewer lines. The reason I found D better for my purposes was that it was easier to port C-like algorithms to D, and D also has better libraries. In my opinion Nim is one of the closest competitors of D, because they are both runner-up languages and they both try to focus on productivity. Right now Rust seems to be influencing D a lot, but I think if you want to look at other languages, you should definitely also look at Nim.
Sep 25 2019
next sibling parent reply a11e99z <black80 bk.ru> writes:
On Wednesday, 25 September 2019 at 11:02:57 UTC, IGotD- wrote:
 On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 https://nim-lang.org/blog/2019/09/23/version-100-released.html
The reason I found D better for my purposes was that it was easier to port C like algorithms to D and also D has better libraries. In my opinion Nim is one of the closest competitors of D because they are both runner up languages and they try to focus on productivity. Right now Rust seems to be influencing D a lot but I think if you want to look other languages, you should definitely also look at Nim.
and at Zig https://ziglang.org/documentation/master/
Sep 25 2019
next sibling parent reply Rel <relmail rambler.ru> writes:
On Wednesday, 25 September 2019 at 12:43:01 UTC, a11e99z wrote:
 On Wednesday, 25 September 2019 at 11:02:57 UTC, IGotD- wrote:
 On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 https://nim-lang.org/blog/2019/09/23/version-100-released.html
The reason I found D better for my purposes was that it was easier to port C like algorithms to D and also D has better libraries. In my opinion Nim is one of the closest competitors of D because they are both runner up languages and they try to focus on productivity. Right now Rust seems to be influencing D a lot but I think if you want to look other languages, you should definitely also look at Nim.
and at Zig https://ziglang.org/documentation/master/
The funny thing about Zig is that right now it is a better C compiler than a Zig compiler (in my personal opinion). Also, I don't really like the idea of passing allocators around.
Sep 25 2019
parent a11e99z <black80 bk.ru> writes:
On Wednesday, 25 September 2019 at 13:04:25 UTC, Rel wrote:
 On Wednesday, 25 September 2019 at 12:43:01 UTC, a11e99z wrote:

 and at Zig https://ziglang.org/documentation/master/
The funny thing about Zig is that right now it is a better C compiler, than a Zig compiler (in my personal opinion). Also I don't really like the idea of passing allocators around.
When there are no coroutines that can be executed by any thread, you can put the default allocator in TLS; otherwise put it in the coroutine context (and fix the API accordingly).
Sep 25 2019
prev sibling parent reply SrMordred <patric.dexheimer gmail.com> writes:
On Wednesday, 25 September 2019 at 12:43:01 UTC, a11e99z wrote:
 On Wednesday, 25 September 2019 at 11:02:57 UTC, IGotD- wrote:
 On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 https://nim-lang.org/blog/2019/09/23/version-100-released.html
The reason I found D better for my purposes was that it was easier to port C like algorithms to D and also D has better libraries. In my opinion Nim is one of the closest competitors of D because they are both runner up languages and they try to focus on productivity. Right now Rust seems to be influencing D a lot but I think if you want to look other languages, you should definitely also look at Nim.
and at Zig https://ziglang.org/documentation/master/
I fail to see an appealing reason to use Zig. Can you point me to something? (And I'm loving playing with Nim and macros :) )
Sep 25 2019
parent Rel <relmail rambler.ru> writes:
On Wednesday, 25 September 2019 at 13:51:52 UTC, SrMordred wrote:
 On Wednesday, 25 September 2019 at 12:43:01 UTC, a11e99z wrote:
 On Wednesday, 25 September 2019 at 11:02:57 UTC, IGotD- wrote:
 On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 https://nim-lang.org/blog/2019/09/23/version-100-released.html
The reason I found D better for my purposes was that it was easier to port C like algorithms to D and also D has better libraries. In my opinion Nim is one of the closest competitors of D because they are both runner up languages and they try to focus on productivity. Right now Rust seems to be influencing D a lot but I think if you want to look other languages, you should definitely also look at Nim.
and at Zig https://ziglang.org/documentation/master/
I fail to see an appealing reason to use Zig. Can u point me something? (and i´m loving playing with Nim and macros :) )
There are a few things I can think of, like special error types that enforce error checking. Automatic C bindings are awesome. Having cross-compilation as a first-class citizen is nice too. Coroutines are WIP right now; I don't know how well they will be implemented, we'll see.
Sep 25 2019
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 25 September 2019 at 11:02:57 UTC, IGotD- wrote:
 In my opinion Nim is one of the closest competitors of D 
 because they are both runner up languages and they try to focus 
 on productivity. Right now Rust seems to be influencing D a lot 
 but I think if you want to look other languages, you should 
 definitely also look at Nim.
The problem with D has always been that it constantly follows the fashion of the day. First it was trying to "beat" C++, then it was competing with Go, now it's Rust. What's next? I find it refreshing to see languages like Nim and Zig that try to base their decisions on what works and what doesn't, what makes sense and what doesn't, not on Reddit threads about the latest CS fashion. Given that D has introduced a nice set of promising features of its own, I don't understand why D is constantly trying to emulate other languages before it's even clear whether feature X will work or not. I think languages like Nim and Zig do have a chance if they avoid D's mistakes and focus on their strengths. Usefulness and consistency are of utmost importance.
Sep 25 2019
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 25 September 2019 at 14:29:19 UTC, Chris wrote:
 On Wednesday, 25 September 2019 at 11:02:57 UTC, IGotD- wrote:
 In my opinion Nim is one of the closest competitors of D 
 because they are both runner up languages and they try to 
 focus on productivity. Right now Rust seems to be influencing 
 D a lot but I think if you want to look other languages, you 
 should definitely also look at Nim.
The problem of D has always been that it constantly follows the fashion of the day. First it was trying to "beat" C++, then it was competing with Go, now it's Rust. What's next? I find it refreshing to see languages like Nim and Zig that try to base their decisions on what works and what doesn't, what makes sense and what doesn't, not on Reddit threads about the latest CS fashion. Given that D has introduced a nice set of promising features of its own, I don't understand why D is constantly trying to emulate other languages, before it's even clear whether feature X will work or not. I think languages like Nim and Zig do have a chance, if they avoid D's mistakes and focus on their strengths. Usefulness and consistency are of utmost importance.
Definitely. In my case, what D offered me that wasn't available in either Java or .NET has in the meantime been sorted out. C++/CLI works as an additional companion, and while annotations aren't D's metaprogramming, they get the job done. Likewise, Java is getting value types and a JNI replacement; GraalVM is a wonderful piece of technology, battle-tested by Twitter, with NVidia now adding CUDA support; and annotation processors and reflection also offer good enough metaprogramming, even if, as in .NET's case, they aren't as developer-friendly as D's.

C++20, regardless of the backwards-compatibility baggage it brings along, also offers many of the features that made D noteworthy, and it keeps being the tool to reach for when Java or .NET need extra help. Then on Apple's platforms I am a happy Swift user.

While I still advocate D for safe systems programming, its story needs to be more coherent. And regarding the latest DIP discussions: if D veterans have issues trying to make sense of how they are supposed to work, what hope have new D adopters?

The broken features should be sorted out, and D should be proud of being a GC systems language. That is what I appreciate in Go and Nim: they have their story, with GC, and stand by it, even if that alienates some crowds. So even though Go is not supposed to be a systems language, there are Google projects using it like that (gVisor, the Android GPGPU debugger, Fuchsia's TCP/IP stack).

The language should be stabilized, the GC improved (I know it has gotten much better), the deprecated Phobos packages (xml, JSON, stream) replaced, and vi/Emacs/VSCode support given focus - and maybe people will come and stay around. Trying to cater to the crowds that flock to the systems language of the month is damaging for D's future; in the end the only thing left is patting each other on the back while asserting "language X might have won, but D had it first".
Sep 25 2019
next sibling parent Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Thursday, 26 September 2019 at 06:14:24 UTC, Paulo Pinto wrote:

 And regarding the latest DIP discussions, if D veterans have 
 issues trying to make sense how they are supposed to work, what 
 hopes have newly D adopters?
See also the latest threads on DIP 1000 semantics. +1000!
 The broken features should be sorted out, and be proud of being 
 a GC systems language.

 That is what I appreciate in Go and Nim, they have their story, 
 with GC, and stand by it, even if that alienates some crowds.

 So even if Go is not supposed to be a systems language, there 
 are Google projects using it like that (gVisor, Android GPGPU 
 debugger, Fuchsia TCP/IP stack).

 The language should be stabilized, GC improved (I know it has 
 gotten much better), replace deprecated Phobos packages (xml, 
 JSON, stream), focus on vi/Emacs/VSCode and maybe people will 
 come and stay around.

 Trying to cater to the crowds that flock to the systems 
 language of the month is damaging for D's future, at the end 
 the only thing left is patting each other in the back while 
 asserting "language X might have won, but D had it first".
What more is there to say? Thanks Paulo! /P
Sep 26 2019
prev sibling next sibling parent Chris <wendlec tcd.ie> writes:
On Thursday, 26 September 2019 at 06:14:24 UTC, Paulo Pinto wrote:

 Definitely, on my case what D offered me, not available in 
 either in Java or .NET, has in the mean time been sorted out.
[...]
 Trying to cater to the crowds that flock to the systems 
 language of the month is damaging for D's future, at the end 
 the only thing left is patting each other in the back while 
 asserting "language X might have won, but D had it first".
Thanks for this brief summary. My experience is similar (more along the lines of Java/Kotlin, and now with the addition of GraalVM). Your last paragraph sums it all up: D doesn't know which path to take, yet randomness and whimsicality are exactly what you don't want when you develop software.
Sep 26 2019
prev sibling parent reply bachmeier <no spam.net> writes:
On Thursday, 26 September 2019 at 06:14:24 UTC, Paulo Pinto wrote:

 The language should be stabilized, GC improved (I know it has 
 gotten much better), replace deprecated Phobos packages (xml, 
 JSON, stream), focus on vi/Emacs/VSCode and maybe people will 
 come and stay around.
There's a chicken-and-egg problem, though. D doesn't have the type of community that languages like Rust, Go, or Python have. I remember in the 90s, when Python was insignificant to the programming world, they had their community of true believers promoting the language and making it more usable. In 2005, when I made my move to R for my computing, I looked at Python and it had little to offer. They had a community that made a lot of stuff happen in a hurry, and they promoted the language well. It's not even a particularly good choice for scientific computing, yet it is now a giant in that area.

D has its mailing list and a few people doing their own projects (I'm in that group), but not a lot of blog posts or podcasts or tutorials, and can't even put together documentation for Dub. You read the mailing list and about two out of every three posts are negative. Until some people step up to be leaders of the community, nothing else will matter as far as D adoption goes. For all the years I've been involved, it's always been people looking to Walter and Andrei to make things happen, and I haven't seen anything like that with any other language.
Sep 26 2019
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 26 September 2019 at 13:26:13 UTC, bachmeier wrote:
 
 It's [Python] not even a particularly good choice for 
 scientific computing, yet it is now a giant in that area.
You're right. Sigh.
 D has its mailing list and a few people doing their own 
 projects (I'm in that group) but not a lot of blog posts or 
 podcasts or tutorials, and can't even put together 
 documentation for Dub. You read the mailing list and about two 
 out of every three posts is negative. Until some people step up 
 to be leaders of the community, nothing else will matter as far 
 as D adoption goes. For all the years I've been involved, it's 
 always been people looking to Walter and Andrei to make things 
 happen, and I haven't seen anything like that with any other 
 language.
There's a huge difference between Python and D, though. Python (and its proponents) were targeting amateurs. It will be hard to find a (natural science / tech) student nowadays who hasn't dabbled in Python. So the audience is huge. But D is targeting hard core programmers, and they are naturally much harder to convince, because they know exactly what they want, and what they want it for. If you don't have a convincing "story" (as Paulo said), you're out. (Python on the other hand doesn't need a convincing story, it's used for throw-away scripts and it gives millions of people the feeling that they are "programmers".) What D needs is focus and consistency. Simple as.
Sep 26 2019
next sibling parent Max Samukha <maxsamukha gmail.com> writes:
On Thursday, 26 September 2019 at 13:42:47 UTC, Chris wrote:

 There's a huge difference between Python and D, though. Python 
 (and its proponents) were targeting amateurs. It will be hard 
 to find a (natural science / tech) student nowadays who hasn't 
 dabbled in Python. So the audience is huge. But D is targeting 
 hard core programmers, and they are naturally much harder to 
 convince, because they know exactly what they want, and what 
 they want it for. If you don't have a convincing "story" (as 
 Paulo said), you're out. (Python on the other hand doesn't need 
 a convincing story, it's used for throw-away scripts and it 
 gives millions of people the feeling that they are 
 "programmers".) What D needs is focus and consistency. Simple 
 as.
Programmers are obsolete. AI is coming for them.
Sep 26 2019
prev sibling parent reply bachmeier <no spam.net> writes:
On Thursday, 26 September 2019 at 13:42:47 UTC, Chris wrote:


 (Python on the other hand doesn't need a convincing story, it's 
 used for throw-away scripts and it gives millions of people the 
 feeling that they are "programmers".) What D needs is focus and 
 consistency. Simple as.
Python is also used for a lot of "real" programming tasks these days. D is perfectly suited for scripts of 200 lines. Moderate-sized programs typically start as small scripts and grow over time. You don't have to convince anyone to switch languages if they start out with D. Neither Walter nor Andrei has a scripting background; rather, they are from the large-scale enterprise software world, and that led to the current emphasis on that segment. There's no reason the community can't pick up the ball and carry it into other arenas.
Sep 26 2019
next sibling parent Meta <jared771 gmail.com> writes:
On Thursday, 26 September 2019 at 16:04:18 UTC, bachmeier wrote:
 On Thursday, 26 September 2019 at 13:42:47 UTC, Chris wrote:


 (Python on the other hand doesn't need a convincing story, 
 it's used for throw-away scripts and it gives millions of 
 people the feeling that they are "programmers".) What D needs 
 is focus and consistency. Simple as.
Python is also used for a lot of "real" programming tasks these days. D is perfectly suited for scripts of 200 lines. Moderate-sized programs typically start as small scripts and grow over time. You don't have to convince anyone to switch languages if they start out with D. Neither Walter or Andrei has a scripting background, rather they are from the large-scale enterprise software world, and that led to the current emphasis on that segment. There's no reason the community can't pick up the ball and carry it into other arenas.
Just to add to this, IMO D is far better than Python for scripting type tasks. At work, I write all my scripts in D unless it's prohibitively difficult to do so.
Sep 26 2019
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 26 September 2019 at 16:04:18 UTC, bachmeier wrote:

 Python is also used for a lot of "real" programming tasks these 
 days.
I know, but that doesn't mean it's a good choice. I suppose people (say scientists) just started with little scripts and then kept on using the language they already knew when things got bigger. A lot of that "real" stuff is not used in contexts where performance is crucial (e.g. real time) and believe me, it's really annoying if you get a great piece of software / research in Python that is completely useless in terms of developing a product where performance is crucial. And you cannot simply rewrite it, because it depends on loads of other Python modules developed by other scientists etc. And, of course, they used Python, because of all the useful modules...it's a vicious circle.
 D is perfectly suited for scripts of 200 lines. Moderate-sized 
 programs typically start as small scripts and grow over time. 
 You don't have to convince anyone to switch languages if they 
 start out with D.
It is, but who will adopt an exotic language like D for scripting? D is "too much" for that, and it lacks the libraries Python has. Plus, everybody else uses Python etc. Do you really want to market D as a scripting language? It clearly isn't one. Who's gonna learn D for scripting tasks? The learning curve is too steep. D is not a trivial language at all. Python is better suited for that.
 Neither Walter or Andrei has a scripting background, rather 
 they are from the large-scale enterprise software world, and 
 that led to the current emphasis on that segment. There's no 
 reason the community can't pick up the ball and carry it into 
 other arenas.
I don't think that's gonna work (see previous paragraph). The D community shouldn't be delusional about software and why / how it's adopted. There are both technical and socio-dynamic aspects to it. I think Nim and maybe Zig are much more realistic about things. (Btw, Nim's forced indentation might be a good marketing strategy to get Python devs on board. It might work.) D has become too intellectual, i.e. playing with ideas and abstract concepts has become more important than the real world. And the problem is that this playing around with ideas has negatively affected the use of D in the real world. Once the D community gets over that, D might have a convincing "story" (it did when I first started to use it, in ~2010 I think).
Sep 27 2019
next sibling parent reply jmh530 <john.michael.hall gmail.com> writes:
On Friday, 27 September 2019 at 10:45:31 UTC, Chris wrote:
 [snip]

 D has become too intellectual, i.e. playing with ideas and 
 abstract concepts have become more important than the real 
 world. And the problem is that this playing around with ideas 
 has negatively affected the use of D in the real world. Once 
 the D community gets over that, D might have a convincing 
 "story" (it did when I first started to use it, in ~2010 I 
 think).
I came to D from scripting languages like Matlab, Python, and R. It got to the point where my code was taking 24+ hours to run and I tried to find alternatives that were faster. I tried C++, but didn't like it very much. I prefer D to C++, but I'm still far more productive in Matlab/Python/R because I can depend on other people's libraries to a greater extent. However, D has made some good progress on the library front in the past two years. Mir has really come into its own with its own little universe around it (lubeck and numir are fun) and the GSOC project Magpie for Data Frames already looks promising. I would contrast this progress on the library front with the intellectual issues you are describing on the language itself. For me, the biggest downside with D was a lack of libraries. As you mentioned above, modules depending on modules. The network effect is very important for these types of use cases. So I do see progress on this front, even if there are a lot of unhappy feelings about some language issues.
Sep 27 2019
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Friday, 27 September 2019 at 12:47:44 UTC, jmh530 wrote:

 I came to D from scripting languages like Matlab, Python, and 
 R. It got to the point where my code was taking 24 hours + to 
 run and I tried to find alternatives that were faster.
You're very wise. I wish more Matlab / Python users switched to highly performant languages.
 I tried C++, but didn't like it very much. I prefer D to C++, 
 but I'm still far more productive in Matlab/Python/R because I 
 can depend on other people's libraries to a greater extent.
 However, D has made some good progress on the library front in 
 the past two years. Mir has really come into its own with its 
 own little universe around it (lubeck and numir are fun) and 
 the GSOC project Magpie for Data Frames already looks 
 promising. I would contrast this progress on the library front 
 with the intellectual issues you are describing on the language 
 itself. For me, the biggest downside with D was a lack of 
 libraries. As you mentioned above, modules depending on 
 modules. The network effect is very important for these types 
 of use cases. So I do see progress on this front, even if there 
 is a lot of unhappy feelings about some language issues.
That's good to hear. Would you care to tell us what exactly you're using D for? I'd imagine it's for some sort of stats / machine learning / AI stuff. Your experience might be valuable for others who are trying to get away from Matlab / Python, and, as bachmeier suggested, D might attract more "scripters". Let them know what to expect and what you can do with D + Mir etc.
Sep 27 2019
parent jmh530 <john.michael.hall gmail.com> writes:
On Friday, 27 September 2019 at 12:59:27 UTC, Chris wrote:
 [snip]

 That's good to hear. Would you care to tell us what exactly 
 you're using D for? I'd imagine it's for some sort of stats / 
 machine learning / AI stuff. Your experience might be valuable 
 for others who are trying to get away from Matlab / Python, 
 and, as bachmeier suggested, D might attract more "scrpiters". 
 Let them know what to expect and what you can do with D + Mir 
 etc.
TBH, my use is probably more at the hobby level rather than at work (where Matlab, Python, R, and some others I know are all supported), and operationally it would cause a lot of difficulties if I were the only person writing in a language (the whole "what happens if you get hit by a bus" problem). I've had a nice goal in the back of my mind to be able to do statistical analysis and portfolio optimization in D. I can do the optimization with DlangScience/nlopt, but this overlaps with my day job, so even if I do some work on this, I would need to jump through work hoops before putting it online. So I feel a little more comfortable working on D projects that relate to statistical analysis. Some simple projects I've thought about working on include adding an OLS function to lubeck that can print some pretty results and seeing if dpp can use GSL out of the box. It's just a matter of finding the time. Projects like mir and magpie are good infrastructure, but there are a lot of statistical libraries that aren't supported in D. One library that I have been making a lot of use of in R is rstan for Bayesian inference. I haven't gotten around to trying to use stan in D (I know it's possible). It would be possible for me to write something like pystan or matlabstan, but rstan is implemented a little differently and faster. So I have kept up my R usage to use that.
Sep 27 2019
prev sibling parent reply Laeeth Isharc <laeeth laeeth.com> writes:
On Friday, 27 September 2019 at 12:47:44 UTC, jmh530 wrote:
 On Friday, 27 September 2019 at 10:45:31 UTC, Chris wrote:
 [snip]

 D has become too intellectual, i.e. playing with ideas and 
 abstract concepts have become more important than the real 
 world. And the problem is that this playing around with ideas 
 has negatively affected the use of D in the real world. Once 
 the D community gets over that, D might have a convincing 
 "story" (it did when I first started to use it, in ~2010 I 
 think).
I came to D from scripting languages like Matlab, Python, and R. It got to the point where my code was taking 24 hours + to run and I tried to find alternatives that were faster. I tried C++, but didn't like it very much. I prefer D to C++, but I'm still far more productive in Matlab/Python/R because I can depend on other people's libraries to a greater extent. However, D has made some good progress on the library front in the past two years. Mir has really come into its own with its own little universe around it (lubeck and numir are fun) and the GSOC project Magpie for Data Frames already looks promising. I would contrast this progress on the library front with the intellectual issues you are describing on the language itself. For me, the biggest downside with D was a lack of libraries. As you mentioned above, modules depending on modules. The network effect is very important for these types of use cases. So I do see progress on this front, even if there is a lot of unhappy feelings about some language issues.
I haven't done much with it yet but I hooked up your embedr to our scripting language written in D and it works for examples with very little effort. I was planning to do the same for Julia so we can steal their libraries too. So currently you can write inline D and R in SIL. Python isn't far off working, and C++ and .NET may follow (via cling and mono). I think using cling one could maybe expose quite a lot of C++ libraries to D if you don't insist on zero overhead. What were the practical limitations you encountered with using R libraries from D via embedr?
Sep 28 2019
next sibling parent jmh530 <john.michael.hall gmail.com> writes:
On Sunday, 29 September 2019 at 03:53:57 UTC, Laeeth Isharc wrote:
 [snip]
 What were the practical limitations you encountered with using 
 R libraries from D via embedr?
Honestly, I haven't put it through its paces enough to say. I have used it for small examples, but nothing that significant.
Sep 29 2019
prev sibling parent bachmeier <no spam.net> writes:
On Sunday, 29 September 2019 at 03:53:57 UTC, Laeeth Isharc wrote:

 I haven't done much with it yet but I hooked up your embedr to 
 our scripting language written in D and it works for examples 
 with very little effort.  I was planning to do same for Julia 
 so we can steal their libraries too.  So currently you can 
 write inline D and R in SIL.  Python isn't far off working, and 
 C++ and .NET may follow (via cling and mono). I think using 
 cling one could maybe expose 
 quite a lot of C++ libraries to D if you don't insist on zero 
 overhead.

 What were the practical limitations you encountered with using 
 R libraries from D via embedr?
The practical limitation is that it only does what I need. It's built around R's matrix/vector types, which are the ones I use. The released version doesn't provide support for multidimensional arrays, for instance, and the wrapper for working with R lists can still lead to segfaults under some use cases. I have personal code (not in embedr) to read both arrays and lists, and to create lists to return to R. That was easy to do, but it is limited to those uses.

In a nutshell: If you want to pass data between the two languages as a vector or matrix, access that data in either language, or make changes to the data, it's all there and it should work. If you want to return a list from D to R (needed in order to return multiple items from a function), that will work. If you want to do fancy things, like passing a list from R to D and then making arbitrary changes on the D side without unpacking the data, there might be problems. There are so many issues with automating the protection/unprotection from R's garbage collector that the cost-benefit analysis has never been in favor of getting it to work.

These days, when I do have time to work on this, it's mostly going into compilation of a subset of R to D. The idea is that while it's useless to try to compile arbitrary R code, there's a subset that can be made fast, and it's that subset responsible for most bottlenecks. Add type information to your R code, compile, and you've got something that runs as fast as D. Ultimately, I want it so that a coauthor can write their code in R, I can write my code in D, and either of us can run it with good performance. If I win the lottery I will take time off of work and complete it. :)
Sep 29 2019
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 27 September 2019 at 10:45:31 UTC, Chris wrote:
 On Thursday, 26 September 2019 at 16:04:18 UTC, bachmeier wrote:

 [...]
I know, but that doesn't mean it's a good choice. I suppose people (say scientists) just started with little scripts and then kept on using the language they already knew when things got bigger. A lot of that "real" stuff is not used in contexts where performance is crucial (e.g. real time) and believe me, it's really annoying if you get a great piece of software / research in Python that is completely useless in terms of developing a product where performance is crucial. And you cannot simply rewrite it, because it depends on loads of other Python modules developed by other scientists etc. And, of course, they used Python, because of all the useful modules...it's a vicious circle. [...]
If any language is going to overtake Python or R among scientific users, I am betting Julia will be that language.
Sep 27 2019
next sibling parent Chris <wendlec tcd.ie> writes:
On Friday, 27 September 2019 at 13:04:37 UTC, Paulo Pinto wrote:

 If any language is going to overtake Python or R among 
 scientific users, I am betting Julia will be that language.
Yep. I agree. It's geared towards scientific programming and the devs are well aware of the drawbacks of Python and Matlab. "Julia features optional typing, multiple dispatch, and good performance, achieved using type inference and just-in-time (JIT) compilation, implemented using LLVM. It is multi-paradigm, combining features of imperative, functional, and object-oriented programming. Julia provides ease and expressiveness for high-level numerical computing, in the same way as languages such as R, MATLAB, and Python, but also supports general programming. To achieve this, Julia builds upon the lineage of mathematical programming languages, but also borrows much from popular dynamic languages, including Lisp, Perl, Python, Lua, and Ruby." [1] [1] https://docs.julialang.org/en/v1/
Sep 27 2019
prev sibling parent reply bachmeier <no spam.net> writes:
On Friday, 27 September 2019 at 13:04:37 UTC, Paulo Pinto wrote:

 If any language is going to overtake Python or R among 
 scientific users, I am betting Julia will be that language.
*If* that is going to happen, I think you're right. Many economists have been moving from Matlab to Julia. For instance, the New York Fed has ported their forecasting model (a fairly large codebase) from Matlab to Julia. Matlab is not the best language, but a bigger factor is that it's really expensive - wealthy institutions complain about Matlab licensing costs. I don't see it making much progress in replacing Python or R yet. Maybe that's just my small view of the world. A different use case, for which Julia isn't as well suited, is writing extensions. In R, the most popular dependency for packages by far is C++ in the form of Rcpp. D is a perfect replacement, and that's primarily how I've been using it (giving up on C++ for that was why I looked at D in the first place). There's a lot of room for D in that area. Too bad I don't have time to work on marketing it.
Sep 27 2019
next sibling parent Chris <wendlec tcd.ie> writes:
On Friday, 27 September 2019 at 14:01:34 UTC, bachmeier wrote:
 *If* that is going to happen, I think you're right. Many 
 economists have been moving from Matlab to Julia. For instance, 
 the New York Fed has ported their forecasting model (a fairly 
 large codebase) from Matlab to Julia. Matlab is not the best 
 language, but a bigger factor is that it's really expensive - 
 wealthy institutions complain about Matlab licensing costs. I 
 don't see it making much progress in replacing Python or R yet. 
 Maybe that's just my small view of the world.

 A different use case, for which Julia isn't as well suited, is 
 writing extensions. In R, the most popular dependency for 
 packages by far is C++ in the form of Rcpp. D is a perfect 
 replacements, and that's primarily how I've been using it 
 (giving up on C++ for that was why I looked at D in the first 
 place). There's a lot of room for D in that area. Too bad I 
 don't have time to work on marketing it.
Now, that sounds very interesting and promising. A few bullet points and maybe it could be turned into a blog post. There are two obstacles though:

1. D is huge, so you'd need a dedicated section "D for Data Scientists", so they know exactly which subset / features to use to get going.

2. Lack of consistency due to a constant change of focus by leadership / community.

Imo, 2. is the biggest obstacle.
Sep 27 2019
prev sibling next sibling parent Chris <wendlec tcd.ie> writes:
On Friday, 27 September 2019 at 14:01:34 UTC, bachmeier wrote:
 A different use case, for which Julia isn't as well suited, is 
 writing extensions. In R, the most popular dependency for 
 packages by far is C++ in the form of Rcpp. D is a perfect 
 replacements, and that's primarily how I've been using it
Speaking of which, I've thought about D as an extension / glue language too. I think it'd be well suited for that.
Sep 27 2019
prev sibling next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 27 September 2019 at 14:01:34 UTC, bachmeier wrote:
 On Friday, 27 September 2019 at 13:04:37 UTC, Paulo Pinto wrote:

 If any language is going to overtake Python or R among 
 scientific users, I am betting Julia will be that language.
*If* that is going to happen, I think you're right. Many economists have been moving from Matlab to Julia. For instance, the New York Fed has ported their forecasting model (a fairly large codebase) from Matlab to Julia. Matlab is not the best language, but a bigger factor is that it's really expensive - wealthy institutions complain about Matlab licensing costs. I don't see it making much progress in replacing Python or R yet. Maybe that's just my small view of the world.
Octave is a free implementation of Matlab that is quite capable, but it does not provide all the specialized Matlab functions needed to achieve 100% compatibility. So if that is an obstacle for Octave, then it probably is an obstacle for Julia as well. Anyway, as long as Python is taking over in higher education, Python will remain the language of choice... Since Julia is not suitable for CS students, Python most likely will keep that position. Btw, I think most people need a very compelling reason to switch to a new language.
Sep 27 2019
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Friday, 27 September 2019 at 14:31:14 UTC, Ola Fosheim Grøstad 
wrote:
 Octave is a  free implementation of Matlab that is quite 
 capable, but it does not provide all the specialized matlab 
 functions needed to achieve 100% compatibility.  So if that is 
 an obstacle for Octave, then it probably is an obstacle for 
 Julia as well.
Not the real deal. People always return to Matlab.
 Anyway, as long as Python is taking over in higher education 
 then Python will remain the language of choice...  Since Julia 
 is not suitable for CS students then Python most likely will 
 keep that position.

 Btw, I think most people need a very compelling reason to 
 switch to a new language.
Three compelling reasons: performance, performance and performance. :) Seriously, there's a lot of great stuff out there, but it's written in Python and Matlab and thus often quite useless except as "proof of concept". This has been bugging me for years.
Sep 27 2019
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 27 September 2019 at 14:43:32 UTC, Chris wrote:
 Not the real deal. People always return to Matlab.
Probably, the problem is when you try to run models written by others and they use rarely used Matlab functions.
 Btw, I think most people need a very compelling reason to 
 switch to a new language.
Three compelling reasons: performance, performance and performance. :)
Right, but it is likely to come in the form of kernels that compile to OpenCL for numerics. So, new libraries that can run on the GPU, but are controlled from Python.
Sep 27 2019
prev sibling parent reply Russel Winder <russel winder.org.uk> writes:
On Fri, 2019-09-27 at 14:31 +0000, Ola Fosheim Grøstad via Digitalmars-d
wrote:
[…]
 Anyway, as long as Python is taking over in higher education then 
 Python will remain the language of choice...  Since Julia is not 
 suitable for CS students then Python most likely will keep that 
 position.
[…]

At least in the UK, Python is the language of teaching programming to 10 to 16 year olds. Universities now have to cope with new undergraduates already knowing Python, and probably some Java.

UK universities are having to change their whole approach.

-- 
Russel.
===========================================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk
Sep 27 2019
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 27 September 2019 at 14:51:31 UTC, Russel Winder wrote:
 At least in the UK, Python is the language of teaching 
 programming to 10 to 16 year olds. Universities now have to 
 cope with new undergraduates already knowing Python, and 
 probably some Java.

 UK universities are having to change their whole approach.
Right. I believe Python will be introduced in public schools in Norway within the next decade or so. A few years back University of Oslo chose Python as their introduction language for all the science departments. Then the CS department introduces Java after Python. In more advanced physics courses students may choose between Python and C++. Anyway, several books, say introduction course books in physics, come in one edition for Matlab and another for Python. Adding more editions for other languages across the board seems unlikely in the near future. You probably will find some for Java and some for Julia, but not enough to cover all the courses for a bachelor. As more and more courses use programming as "advanced calculators" and for simple didactic simulations we end up with a situation where they want a single language for all science topics. Preferably one that can be used interactively. So Python seems to dominate as the tool of choice for "didactic programming".
Sep 27 2019
parent reply Russel Winder <russel winder.org.uk> writes:
On Fri, 2019-09-27 at 15:16 +0000, Ola Fosheim Grøstad via Digitalmars-d
wrote:

[…]
 Right. I believe Python will be introduced in public schools in 
 Norway within the next decade or so.  A few years back University 
 of Oslo chose Python as their introduction language for all the 
 science departments. Then the CS department introduces Java after 
 Python.  In more advanced physics courses students may choose 
 between Python and C++.

Subjects such as biology and physics and all the various variations are for the moment settling on C++ for computational stuff and Python for coordination, visualisation, and data rendering. As an example, astropy for most of astronomy. It isn't that Python/C++ is the best choice per se, it is that libraries got developed using the Python/C++ combination, and that people working in the fields are introduced to these libraries as the de facto standard for their subject. I cannot see Nim, D, Chapel, etc. even denting this traction of Python and C++, though at least Nim looks derivative of Python.

Something similar is happening in data science, machine learning, and artificial intelligence: Python has become the de facto standard because someone chose to write all the libraries people needed in Python, NumPy, or C++.

 Anyway, several books, say introduction course books in physics, 
 come in one edition for Matlab and another for Python.  Adding 
 more editions for other languages across the board seems unlikely 
 in the near future.

The use of Matlab was set because universities were given free licences, I suspect as a loss leader to ensure people knew Matlab and expected to use it when they moved to the world of work. However, over the 2007 to 2015 period it was clear Matlab were charging too much and that Python was the obvious place to move to. I ran many workshops over 2010 to 2015 retraining Matlab programmers as Python programmers. Actually they were electrical engineers not programmers who just wanted to solve problems. Matlab had the tools, but was too expensive, they discovered Python also had the tools, and so they switched.

I suspect there is beginning to be back pressure from the world of work to universities against Matlab and for Python, but this is speculation on my part.

 You probably will find some for Java and some for Julia, but not 
 enough to cover all the courses for a bachelor.

 As more and more courses use programming as "advanced 
 calculators" and for simple didactic simulations we end up with a 
 situation where they want a single language for all science 
 topics. Preferably one that can be used interactively.

Having been an integral part of the late 1990s push for Java everywhere in university to make teaching easier, it was a very big mistake. The late 1980s, early 1990s situation of using two or three different programming languages for the core university curriculum made for much better programmers. So Python for everything in universities is a very bad idea as well, at least for CS students – electrical engineering, biology, and physics students are in a different boat.

 So Python seems to dominate as the tool of choice for "didactic 
 programming".
Sep 27 2019
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 27 September 2019 at 17:37:14 UTC, Russel Winder wrote:
 It isn't that Python/C++ is the best choice per se, it is that 
 libraries got developed using the Python/C++ combination, and 
 that people working in the fields are introduced to these 
 libraries as the de facto standard for their subject. I cannot 
 see  Nim, D, Chapel, etc. even denting this traction of Python 
 and C++, though at least Nim looks derivative of Python.
Right, it is very difficult to make this move as more and more simulation models or libraries are implemented in a specific language. Although it does happen, Fortran is not as important as it used to be, at least that is my impression. But it clearly takes decades, not years, to make this move.
 Something similar is happening in data science, machine 
 learning, and artificial intelligence: Python has become the de 
 facto standard because someone chose to write all the libraries 
 people needed in Python, NumPy, or C++.
True.
 The use of Matlab was set because universities were given free 
 licences, I suspect as a loss leader to ensure people knew 
 Matlab and expected to use it when they moved to the world of 
 work. However, over the 2007 to 2015 period it was clear Matlab 
 were charging too much and that Python was the obvious place to 
 move to. I ran many workshops over 2010 to 2015 retraining 
 Matlab programmers as Python programmers. Actually they were 
 electrical engineers not programmers who just wanted to solve 
 problems. Matlab had the tools, but was too expensive, they 
 discovered Python also had the tools, and so they switched.
Interesting that electrical engineers found that switch worthwhile, but I still think that if you primarily do experiments with linear algebra or signal processing then Matlab is more convenient to use interactively than Python. Yet, if you already know Python then maybe that isn't enough to also learn Matlab. So I think that over time, as Python becomes the primary language in education, you also get that effect. Not because Python is better than the alternative, but because learning the alternative is less attractive than solving the problem right away (so they prefer writing more code in Python to learning Matlab and writing terser code).
 I suspect there is beginning to be back pressure from the world 
 of work to universities against Matlab and for Python, but this 
 is speculation on my part.
I don't know, but I suspect as people that grew up with Python enter leadership position then the question becomes "Is there a good reason for not using Python for this?".
 university curriculum made for much better programmers. So 
 Python for everything in universities is a very bad idea as 
 well, at least for CS students – electrical engineering,
Yeah, but if you always start out with Python then students can switch fields after the first two years with less resistance. Then you probably learn Arduino or C in a low-level programming course, Scheme in a functional programming course, Prolog in a logic course, and Java. That's not too bad, but the real problem is that some students only do the mandatory programming exercises and do not program on their own, and that institutions do not want too many students to fail. If you then add into the mix that the CS department wants many students (as the money follows the student), they end up accepting low-quality submissions, and thus the students don't get the training through effort that they need. The desire to write your own programs and try out other languages is probably necessary to become a decent programmer. Not sure what can be done about that. In the 80s low-level programming was a hobby; maybe 1 out of 10 males experimented with this on their 8-bit computers... We don't have these numbers today, I think?
Sep 27 2019
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 27 September 2019 at 18:16:27 UTC, Ola Fosheim Grøstad 
wrote:
 The desire to write your own programs and try out other 
 languages is probably necessary to become a decent programmer. 
 Not sure what can be done with that. In the 80s low level
Another factor is that in the 80s and 90s the available languages were quite primitive and not too big, and the hardware was insufficient for many tasks. So there was a real motivation to look for better tools. As languages and ecosystems have become quite large and hardware is sufficient for most tasks, we should also expect less motivation to look for better tools. It seems like the adoption of new languages today is primarily driven by a desire by big corporations to avoid compatibility and isolate their own ecosystems. Apple/Swift. Google/Go/Kotlin. Oracle/Java. I am not sure if this trend is a good thing. Actually, it probably is bad. It seems like there is a desire for big business to segregate developers into Apple programmers, Microsoft programmers, Google programmers, business/database programmers… Clearly not beneficial to programmers. Python bypasses all that, though. Which in my view is a good thing.
Sep 27 2019
next sibling parent reply JN <666total wp.pl> writes:
On Friday, 27 September 2019 at 18:41:46 UTC, Ola Fosheim Grøstad 
wrote:
 Seems like the adoption of new languages today is primarily 
 driven by a desire by big corporations to avoid compatibility 
 and isolate their own eco systems.

 Apple/Swift.

 Google/Go/Kotlin.
 Oracle/Java.
I don't think it's strictly that. It's more like they are looking for a way to improve the developer experience on their respective platforms. They don't necessarily have to look for isolation; they just don't want to make any more effort to support other platforms than necessary. Swift is obviously a replacement for Objective-C. Objective-C is already incompatible and isolated, because it's THE language for iOS/macOS and basically doesn't exist elsewhere. But it is quite outdated, and developers writing apps for Apple platforms would like to use a more modern language. Kotlin is obviously a replacement for Java. For various reasons Java has been quite conservative feature-wise, while Kotlin is still able to interop with and make use of the massive JVM ecosystem. Go... I think Go started as a side project by some Googlers? I think the goal was to offer an alternative to C/C++. Nowadays we have D, Rust, and Crystal, but back then there weren't really alternatives if you were looking for something that is native but isn't C/C++.
Sep 27 2019
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 27 September 2019 at 18:58:31 UTC, JN wrote:
 I don't think it's strictly that. It's more like they are 
 looking for a way to improve developer experience for their 
 respective platform. They don't have to necessarily look for 
 isolation, they just don't want to make any effort to support 
 other platforms more than necessary.
It is pretty obvious that Apple doesn't want to make it easy to port iOS apps to Android. Same with Microsoft and Windows applications. They have both made too many developer-hostile moves over the years for this not to be true. Microsoft even did it with their browser for over a decade. Apple did it with iOS Safari as well. Google is more in a grey area. They have fewer reasons to lock in and lock out.
 Kotlin is obviously a replacement for Java. Due to various 
 reasons, Java's been quite conservative feature-wise and it's 


 still being able to interop and make use of the massive JVM 
 ecosystem.
 Go... I think Go started as a side project by some googlers?
AFAIK it was an attempt to build a system programming language for internal use at Google that would allow them to hire cheaper labour than C++ programmers. But Go had problems gaining traction within Google.
Sep 27 2019
prev sibling parent reply Ali <fakeemail example.com> writes:
On Friday, 27 September 2019 at 18:41:46 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 27 September 2019 at 18:16:27 UTC, Ola Fosheim 
 Grøstad wrote:

 Seems like the adoption of new languages today is primarily 
 driven by a desire by big corporations to avoid compatibility 
 and isolate their own eco systems.

 Apple/Swift.

 Google/Go/Kotlin.
 Oracle/Java.
Something I was always curious to know: Andrei Alexandrescu worked at Facebook for a few years. And from the little available online, it seems that Facebook for a while used D for some internal tools. Again as an outsider, and with very little available on this, it seems Facebook moved to OCaml and then Reason. Did Andrei ever write a blog post on why D at Facebook didn't happen? It seems like it could have been a big moment for D. (OCaml is a GC language, so it can't be the GC.)
Oct 01 2019
parent Chris <wendlec tcd.ie> writes:
On Tuesday, 1 October 2019 at 15:55:50 UTC, Ali wrote:
 Something I was always curious to know: Andrei Alexandrescu 
 worked at Facebook for a few years

 And from the little available online, it seems that facebook 
 for a while used D for some internal tools

 Again as an outsider, and with very little available on this, 
 it seems Facebook moved to OCaml and then Reason

 Did Andrei ever write a blog post on why D at Facebook didn't 
 happen? It seems like it could have been a big moment for D

 (OCaml is a GC language, so it can't be the GC)
I remember Andrei mentioned in an article that D was faster than C++ for a script-like CLI tool, because it never freed memory in the first place (before it finished). Below you will find some material about it [1]. I don't know what exactly D was used for at Facebook (he's probably not allowed to reveal it), but it might have been data analysis, the same stuff Sociomantics uses it for. Anyway, it would be interesting to see why Facebook abandoned it (or at least didn't go on with it) after Andrei had left. But the D community tends not to ask such questions ;) [1] 2013 The announcement: https://forum.dlang.org/post/l37h5s$2gd8$1 digitalmars.com "In a post on the Dlang.org website, Alexandrescu elaborates a little more: The code project he’s working on is in “heavy daily use at Facebook” and originally it was written in C++. Now that it’s in D the team has “measured massive wins in all of source code size, build speed, and running speed.” His post has spurred a lot of positive support on the forum, and if what he says is true then it would make sense for Facebook to take a critical tool and make it more efficient–because there will be an upshot on overall efficiency of operations at the site." https://www.fastcompany.com/3019887/facebook-adds-5000-lines-of-d-language-code-whats-that-mean 2014 "Today, Alexandrescu is a research scientist at Facebook, where he and a team of coders are using D to refashion small parts of the company's massive operation. Bright, too, has collaborated with Facebook on this experimental software, as an outsider contractor. The tech giant isn't an official sponsor of the language—something Alexandrescu is quick to tell you—but Facebook believes in D enough to keep him working on it full-time, and the company is at least considering the possibility of using D in lieu of C++, the venerable language that drives the systems at the heart of so many leading web services." https://www.wired.com/2014/07/d-programming-language/
Oct 01 2019
prev sibling parent Laeeth Isharc <laeeth laeeth.com> writes:
On Friday, 27 September 2019 at 14:01:34 UTC, bachmeier wrote:
 On Friday, 27 September 2019 at 13:04:37 UTC, Paulo Pinto wrote:

 If any language is going to overtake Python or R among 
 scientific users, I am betting Julia will be that language.
*If* that is going to happen, I think you're right. Many economists have been moving from Matlab to Julia. For instance, the New York Fed has ported their forecasting model (a fairly large codebase) from Matlab to Julia. Matlab is not the best language, but a bigger factor is that it's really expensive - wealthy institutions complain about Matlab licensing costs. I don't see it making much progress in replacing Python or R yet. Maybe that's just my small view of the world. A different use case, for which Julia isn't as well suited, is writing extensions. In R, the most popular dependency for packages by far is C++ in the form of Rcpp. D is a perfect replacement, and that's primarily how I've been using it (giving up on C++ for that was why I looked at D in the first place). There's a lot of room for D in that area. Too bad I don't have time to work on marketing it.
Small beginning. https://github.com/symmetryinvestments/juliad
Sep 28 2019
prev sibling parent reply GreatSam4sure <greatsam4sure gmail.com> writes:
On Thursday, 26 September 2019 at 13:26:13 UTC, bachmeier wrote:
 On Thursday, 26 September 2019 at 06:14:24 UTC, Paulo Pinto 
 wrote:

 The language should be stabilized, GC improved (I know it has 
 gotten much better), replace deprecated Phobos packages (XML, 
 JSON, stream), focus on vi/Emacs/VSCode and maybe people will 
 come and stay around.
There's a chicken-and-egg problem, though. D doesn't have the type of community that languages like Rust, Go, or Python has. I remember in the 90s when Python was insignificant to the programming world, they had their community of true believers promoting the language and making it more usable. In 2005, when I made my move to R for my computing, I looked at Python and it had little to offer. They had a community that made a lot of stuff happen in a hurry, and they promoted the language well. It's not even a particularly good choice for scientific computing, yet it is now a giant in that area. D has its mailing list and a few people doing their own projects (I'm in that group) but not a lot of blog posts or podcasts or tutorials, and can't even put together documentation for Dub. You read the mailing list and about two out of every three posts is negative. Until some people step up to be leaders of the community, nothing else will matter as far as D adoption goes. For all the years I've been involved, it's always been people looking to Walter and Andrei to make things happen, and I haven't seen anything like that with any other language.
This is a newbie point of view. The biggest problem for D adoption is this community, IMHO. Almost everything here is negative. The community is 95% negative about the language. There are people here who never see anything good about D, yet they are here. The community is damaging to D's image. It is a community that is out to destroy the language. All my hope of using D has been almost destroyed. The leadership and most people in the community seem to be at opposite poles, yet we want D to have great adoption. Just open Google and search for D and you will read so many negative things about the language. Come to the community and the story is the same. This is the most negative community toward its own language I have ever known of. The story has to change. This community greatly discourages me. It is my biggest problem with using D. I have personally come to believe that many people in this community are sponsored to work against the language. Please, if you know you are no longer interested in D, it would be better not to discourage others. Also, the leadership should handle their differences with any community member privately, to prevent them from pouring their frustration out here.
Sep 27 2019
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Sep 27, 2019 at 03:11:26PM +0000, GreatSam4sure via Digitalmars-d wrote:
[...]
 This is a newbie point of view.
 
 The biggest problem to D adoption is this community IMHO. Almost
 everything here is negative. The community is 95% negative to the
 language. There are people here that never see anything good about D
 yet they are here. The community is damaging to D image.
[...] I don't know what has happened here, and why the naysayers seem to be out in force here, but I want to say that I am very positive about D, and I'm pretty sure I'm not the only one. I just haven't been posting as much because I'm occupied with other things. But seriously, there is no language out there right now that I like better than D. It's true that D has its flaws and dark corners -- no language of this complexity can possibly be perfect -- but the good stuff is REALLY good, and none of that is going away anytime soon. OTOH, one of the reasons we are sometimes very critical about something in the language is because it's *so* good, and we have come to expect the best, so when we see something that still needs improvement, we tend to speak out. We criticize the not-so-good parts because we care. Or at least I do -- can't speak for anyone else. And one thing I've learned over the years: don't waste time on the naysayers. Know what you want, and go for it. Let others waste their breath naysaying if that's what they want -- let it be their problem, not yours. T -- Любишь кататься - люби и саночки возить.
Sep 27 2019
prev sibling parent reply Benjiro <benjiro benjiro.com> writes:
On Friday, 27 September 2019 at 15:11:26 UTC, GreatSam4sure wrote:
 The biggest problem to D adoption is this community IMHO. 
 Almost everything here is negative. The community is 95% 
 negative to the language. There are people here that never see 
 anything good about D yet they are here. The community is 
 damaging to D image.

 It is a community that is out to destroy the language. All the 
 hope of me using D has been almost destroyed.
As one of these negative people that from time to time reads up on multiple languages' forums: D was one of the first languages, after moving away from web dev, that I liked from a structural point of view. I even got the book from Ali. But ... the more I worked with it, the more frustration upon frustration kept creeping in, with constant issues.

* Not user-friendly tooling (somewhat improved, it only took a few years)
* Compiler bugs. Oh, those bugs ...
* Upgrading the compiler resulting in packages breaking (like vibe.d)
* Lacking IDE support (somewhat improved with Visual Studio Code)
* Lacking packages to easily get going.
* Constant and frustrating changes of direction. BetterC? First get BetterD done, and when you're a big boy, then do this.
* The feeling of being looked down on when suggesting that D is not user friendly. Especially in the past you constantly got this as an answer: "you have an issue, fix it yourself". Yes, this was the constant answer from everybody.
* The snarky and frankly poisonous comments from regular members. Some of those now seem to have left, but D really has some members that got under anybody's skin for daring to mention an issue.
* Focus on the C++ crowd, when very few C++ developers have any interest in D. Hell, if a C++ developer wants to switch, they have better alternatives with bigger or faster-growing communities.
* A leadership that seems to be stuck in a 1990s mentality. The world has moved on; people expect more these days, and they have choices most old-timers did not have. So they are more spoiled by those choices and are not going to put in the time. But when you ignore those "spoiled" people, you never build up a loyal base and you will never motivate them to help out with code or money.

You cannot run anything worthwhile with D unless you plan on spending a lot of time writing supporting code yourself. Some companies have no issue with this, but little old me cannot be bothered with it when there are plenty of good alternatives that get the job done just as well.

* I know one of the posters on this forum a bit more personally. We have had some discussions about the different compilers. Just as he used Go for his personal project, I used Crystal for mine. We both found D too much trouble for the advantages it gave. It's a vicious circle and I know it: lack of people / contributors => no packaging / bugs / issues => new people scared away => lack of people / contributors ...
* D just is not sexy. I have known D since 10/2016 (when I got Ali's book, Programming in D). Loved the book and the language syntax (especially if you come from PHP), but it was all the rest that ruined it for me. And over the years D seems to have gone in a direction that I hated: new features that had no meaning for me, code-breaking issues, and resources being put into features like BetterC, when all I wanted was a god darn simple HTTP server that worked and kept working across release updates. And more active packages.
* On the subject of Programming in D, I noticed that a lot of the books are old. BetterC, the feature a lot of resources have gone into, seems to be ignored in every book on the market.

------------------------------

To give you a nice example of how much of an irritation D can be even today: just now, I tried to install D on Ubuntu because I want to time the current D compile speed vs a few other languages. And I am following the official order ( https://dlang.org/download.html ):

1. sudo wget http://master.dl.sourceforge.net/project/d-apt/files/d-apt.list -O /etc/apt/sources.list.d/d-apt.list
2. sudo apt-get update && sudo apt-get -y --allow-unauthenticated install --reinstall d-apt-keyring
3. sudo apt-get update && sudo apt-get install dmd-compiler dub

Guess what happens on Step 2.
 Err:7 https://netcologne.dl.sourceforge.net/project/d-apt d-apt 
 InRelease
The following signatures couldn't be verified because the public key is not available: NO_PUBKEY EBCF975E5BA24D5E

And this issue was posted a LONG time ago.

Yep ... Now try Go, Crystal, Nim, Rust, ... and a lot of other languages. I have installed all those dozens of times, and issues like this simply do not show up, or are fixed so fast that they never hit "me". But whenever it involves D, it's always one thing or another.

------------------------------

Hopefully this explains things from my point of view, where D simply fails me personally. It's late so I am off to bed.
Sep 28 2019
next sibling parent reply drug <drug2004 bk.ru> writes:
29.09.2019 4:29, Benjiro пишет:
 
 To give you a nice example how much of a irritation D can be even today. 
 Just right now, i tried to install D on Ubuntu because i want to time 
 the current D compile speed, vs a few other languages. And i am 
 following the official order ( https://dlang.org/download.html ):
 
 1. sudo wget 
 http://master.dl.sourceforge.net/project/d-apt/files/d-apt.list -O 
 /etc/apt/sources.list.d/d-apt.list
d-apt isn't the official repository
 2. sudo apt-get update && sudo apt-get -y --allow-unauthenticated 
 install --reinstall d-apt-keyring
 3. sudo apt-get update && sudo apt-get install dmd-compiler dub
 
 Guess what happens on Step 2.
 
 Err:7 https://netcologne.dl.sourceforge.net/project/d-apt d-apt InRelease
  The following signatures couldn't be verified because the public key is not available:  NO_PUBKEY EBCF975E5BA24D5E
d-apt provides instructions to install this pubkey https://d-apt.sourceforge.io/ " To enable it, add the repository sources: $ sudo wget https://netcologne.dl.sourceforge.net/project/d-apt/files/d-apt.list -O /etc/apt/sources.list.d/d-apt.list then update local info and install "d-apt" public key (fingerprint 0xEBCF975E5BA24D5E): $ sudo apt-get update --allow-insecure-repositories && sudo apt-get -y --allow-unauthenticated install --reinstall d-apt-keyring && sudo apt-get update "
 
 And this issue was posted a LONG time ago.
So the issue you are talking about does not seem to exist. Probably it's gone
 
 Yep ... Now try Go, Crystal, Nim, Rust, ... and a lot of other 
 languages. I have installed all those dozens of times and a issue like 
 simply do not show up or are fixed so fast, that it never hits "me". But 
 whenever it involves D, its always one thing or another.
 
 ------------------------------
 
 Hopefully this explains things from my point of view, where D simply 
 fails me personally. Its late so i am off to bed.
Did you ask people here to help you? You are welcome!
Sep 29 2019
parent reply Benjiro <benjiro benjiro.com> writes:
On Sunday, 29 September 2019 at 07:43:12 UTC, drug wrote:
 Did you ask people here to help you? You are welcome!
Thanks for your help* (see PS below), but fixing the download was not that hard. For a new user who tries this, though, it might just as well have been Chinese. It's small issues like having a non-working download present on the actual official website. It does not matter whether it's third party or not when it's posted on the official website. This is why I mentioned before how there are so many frustrations. It's not just the big issues but the constant small issues that are present everywhere.
 You are welcome!
*PS: Next time you write with an exclamation mark, know that this gives your sentence a totally different meaning and is considered passive-aggressive in my region. It's little things like this that seem to pop up everywhere in this community on a regular basis. The fact that there are no rules of conduct or any kind of oversight is probably the main reason why people see D as a negative community. Even places like Reddit that are toxic have some kind of moderation going on, because people will be people. D, you want my suggestion? Invest in a proper forum, set up a RoC, and actually moderate topics. And yes, I understand the irony, as I am posting this off-topic in the Nim language thread. :) This is the first step...
Sep 29 2019
parent reply drug <drug2004 bk.ru> writes:
29.09.2019 13:35, Benjiro пишет:
 On Sunday, 29 September 2019 at 07:43:12 UTC, drug wrote:
 Did you ask people here to help you? You are welcome!
Thanks for your help* ( See PS below ) but fixing the download was not that hard. But for a new user that tries this, it can just as well been Chinese. Its small issues like having a not working download present on the actual official website. It does not matter if its 3th party or not when its posted on the official website.
But I always download dmd from the official site and have no trouble. It works out of the box. Your example above is a wrong mix of several ways to install dmd.
 
 This is why i mentioned before how there are so many frustrations. Its 
 not just the big issues but the constant small issues that are present 
 everywhere.
 
 You are welcome!
*PS: Next time you write with a exclamation mark, know that this makes your sentence have a totally different meaning and is considered passive aggressive in my region.
Thank you. Good to know, because I'm not a native English speaker.
 
 Its little things like this that seems to pop up everywhere in this 
 community on a regular base. The fact that there are no rules of conduct 
 or any kind of oversight is probably the main reason why people see D as 
 negative community. Even places like Reddit that are toxic, have some 
 kind of moderation going on because people will be people.
 
 D, you want my suggestion? Invest into a proper forum, set up a RoC and 
 actually moderate topics. And yes, i understand the irony as i am 
 posting this off-topic in the Nim Language thread. :)
 
 This is the first step...
Yeah, before, the D community consisted of highly professional members who could develop the language. Now that less professional people are starting to appear here, they don't want to implement the features they want by themselves, and when they hear 'do it yourself' they start complaining that the community is toxic. Not at all. I do not blame these people, of course; they are just users of the language, not its developers. But it would be nice if they understood the situation.
Sep 29 2019
parent Johannes Pfau <nospam example.com> writes:
Am Sun, 29 Sep 2019 16:01:49 +0300 schrieb drug:

 29.09.2019 13:35, Benjiro пишет:
 On Sunday, 29 September 2019 at 07:43:12 UTC, drug wrote:
 Did you ask people here to help you? You are welcome!
Thanks for your help* ( See PS below ) but fixing the download was not that hard. But for a new user that tries this, it can just as well been Chinese. Its small issues like having a not working download present on the actual official website. It does not matter if its 3th party or not when its posted on the official website.
But I always download dmd from the official site and have no troubles. It works out of box. Your example above is a wrong mix of several ways to install dmd.
 This is why i mentioned before how there are so many frustrations. Its
 not just the big issues but the constant small issues that are present
 everywhere.
 
 You are welcome!
*PS: Next time you write with a exclamation mark, know that this makes your sentence have a totally different meaning and is considered passive aggressive in my region.
Thank you. Good to know because I'm not native English speaker.
 Its little things like this that seems to pop up everywhere in this
 community on a regular base. The fact that there are no rules of
 conduct or any kind of oversight is probably the main reason why people
 see D as negative community. Even places like Reddit that are toxic,
 have some kind of moderation going on because people will be people.
 
 D, you want my suggestion? Invest into a proper forum, set up a RoC and
 actually moderate topics. And yes, i understand the irony as i am
 posting this off-topic in the Nim Language thread. :)
 
 This is the first step...
Yeah, before the D comminity consisted of high professional members who can develop the language. Now when less professional people start appearing here they don't want to implement the features they want by themselves and when they hear 'do it yourself' they start complaining the community is toxic. Not at all. I do not blame these people of course, they are just users of the language not its developer. But it would be nice if they understand the situation.
I think a large part of this shift in the forum discussions is that most constructive, language-related discussions have moved to Slack. This only leaves the bike-shedding, criticism, and trolling here, which seems to have changed the atmosphere a lot. A few years ago the D community was considered very friendly and open, and I think this is still the case when you look at the D.learn forum. So it's not really people who ask questions getting harsh responses; even if a question has been asked a hundred times, people still get good and friendly answers. It's more people demanding things and criticizing disproportionately. This certainly does not justify rude responses, but this seems to be the pattern recently. -- Johannes
Sep 29 2019
prev sibling next sibling parent Adam D. Ruppe <destructionator gmail.com> writes:
On Sunday, 29 September 2019 at 01:29:59 UTC, Benjiro wrote:
 all i wanted was a god darn simple HTTP server that
 worked and kept working on release updates
lol i wrote one in 2008, and it still works to this day. It very rarely breaks; I have broad forward and backward compatibility with compiler versions. I sometimes wish I had actually put it in Phobos like Andrei wanted me to back in the day. I just couldn't be bothered; I actually see users as a liability (you cost me time and offer me nothing), so, like, meh. But still, I hear this complaint over and over again, and it really just seems to be a problem with vibe.d rather than the language per se.
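For reference, a server along those lines can be sketched with nothing but Phobos (std.socket). This is only an illustration under my own assumptions, not the 2008 code or any existing library; the names `buildResponse` and `serve` are invented for this example.

```d
// Minimal blocking HTTP server sketch using only Phobos (std.socket).
// Hypothetical example code, not any real library's API.
import std.socket;
import std.conv : to;

string buildResponse(string body_)
{
    // A bare-bones HTTP/1.1 response with the mandatory Content-Length header.
    return "HTTP/1.1 200 OK\r\nContent-Length: " ~ body_.length.to!string
        ~ "\r\nConnection: close\r\n\r\n" ~ body_;
}

void serve(ushort port)
{
    auto listener = new TcpSocket();
    listener.setOption(SocketOptionLevel.SOCKET, SocketOption.REUSEADDR, true);
    listener.bind(new InternetAddress(port));
    listener.listen(10);
    while (true)
    {
        auto client = listener.accept();
        scope (exit) client.close();
        ubyte[4096] buf;
        client.receive(buf[]);           // read (and ignore) the request
        client.send(buildResponse("ok")); // answer every request the same way
    }
}

void main(string[] args)
{
    // Only start the blocking accept loop when a port is given on the
    // command line, so compiling/importing this file has no side effects.
    if (args.length > 1)
        serve(args[1].to!ushort);
}
```

Because everything here is in the standard library, a sketch like this keeps compiling across compiler releases, which is exactly the compatibility point above.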
Sep 29 2019
prev sibling next sibling parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Sunday, 29 September 2019 at 01:29:59 UTC, Benjiro wrote:
 On Friday, 27 September 2019 at 15:11:26 UTC, GreatSam4sure 
 wrote:
 The biggest problem to D adoption is this community IMHO. 
 Almost everything here is negative. The community is 95% 
 negative to the language. There are people here that never see 
 anything good about D yet they are here. The community is 
 damaging to D image.

 It is a community that is out to destroy the language. All the 
 hope of me using D has been almost destroyed.
As one of these negative people that from time to time reads up on multiple languages forums. D was one of the first languages after moving away from Web dev, that i liked from structure point. Even got the book from Ali. But ... the more i worked with it, the more frustration, upon frustration, upon ... kept creeping in with constant issues. * Not user friendly tooling ( somewhat improved, took only a few years ) * Compiler bugs. O those bugs ... * Upgrading the compiler resulting in packages ( like vibe.d going broke ) * Lacking IDE support ( somewhat improved on Visual Studio Code ) * Lacking in packages to easily get going. * Constant and frustrating changing direction. BetterC? First get BetterD doing and when your a big boy, then do this. * The feeling off being looked down when suggestion that D is not user friendly. Especially in the past you constantly got this as a answer: "you have a issue, fix it yourself". Yes, this was the constant answer from everybody. * The snarky and frankly poison comments from regular members. Some of those now seem to have left but D really has some members that got under anybody their skin for daring to mention a issue. * Focus upon C++ crowd, when very few C++ developers have any interest in D. Hell, if a C++ developer wants to switch, they have better alternatives with bigger or better growing communities. * A leadership that seems to be stuck in the 1990's mentality. The world has moved on, people expect more these days, they have choices most old timers did not have. So they are more spoiled with those choices and are not going to put in the time. But when you ignore those "spoiled" people, you never build up a loyal base and you will never motivate them to help out with code or money. You can not run anything worth while with D unless you plan on spending a lot of time writing supporting code yourself. 
Some companies have no issue with this but little old me can not be bothered with it, when there are plenty of good alternatives that get the job done as good. * I know one of the posters on this forum a bit more personal. We have had some discussions about the different compilers. Just as he used Go for his personal project, i used Crystal for mine. We both found D too much trouble for the advantages it gave. Its a vicious circle and i know it. Lack in people / contributors => No Packaging / Bugs / Issues => New people scared away => Lack in people / contributors .... * D just is not sexy. I know D now from 10/2016 ( when i got Ali's book. Programming in D ). Loved the book and language syntax ( especially if you come from PHP ) but it was all the rest that ruined it for me. And over the years D seems to have been going down a direction that i hated. New features that had no meaning for me, code breaking issues and resources being put in features like BetterC, when all i wanted was a god darn simple HTTP server that worked and kept working on release updates. And more active packages. * On the subject of Programming in D, i noticed a lot of books are old. BetterC, the feature where a lot of resources have gone into, seems to be ignored in every book on the market. ------------------------------ To give you a nice example how much of a irritation D can be even today. Just right now, i tried to install D on Ubuntu because i want to time the current D compile speed, vs a few other languages. And i am following the official order ( https://dlang.org/download.html ): 1. sudo wget http://master.dl.sourceforge.net/project/d-apt/files/d-apt.list -O /etc/apt/sources.list.d/d-apt.list 2. sudo apt-get update && sudo apt-get -y --allow-unauthenticated install --reinstall d-apt-keyring 3. sudo apt-get update && sudo apt-get install dmd-compiler dub Guess what happens on Step 2.
 Err:7 https://netcologne.dl.sourceforge.net/project/d-apt d-apt InRelease
 The following signatures couldn't be verified because the public key is not available: NO_PUBKEY EBCF975E5BA24D5E

And this issue was posted a LONG time ago. Yep ... Now try Go, Crystal, Nim, Rust, ... and a lot of other languages. I have installed all those dozens of times, and issues like this simply do not show up, or are fixed so fast that it never hits "me". But whenever it involves D, it's always one thing or another.

------------------------------

Hopefully this explains things from my point of view, where D simply fails me personally. It's late, so I am off to bed.
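(For reference, the usual workaround for this class of NO_PUBKEY apt error is to import the missing signing key manually. A minimal sketch, using the key ID from the error above; the choice of keyserver is an assumption, not something the d-apt instructions specify:)

```shell
# Fetch the d-apt signing key by the ID reported in the NO_PUBKEY error,
# then refresh the package lists so the repository signature verifies.
sudo apt-key adv --keyserver keyserver.ubuntu.com \
    --recv-keys EBCF975E5BA24D5E
sudo apt-get update
```

That said, a first-time user should not have to know this, which is rather the point of the post above.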
+1 Same for me for everything you said. I personally use D only for file processing scripts, first because D is very good at that, but also because this generally (but not always) prevents me from suffering from many of those problems you mentioned.
 * Focus upon C++ crowd, when very few C++ developers have any 
 interest in D. Hell, if a C++ developer wants to switch, they 
 have better alternatives with bigger or better growing 
 communities.
 Its a vicious circle and i know it.

 Lack in people / contributors => No Packaging / Bugs / Issues 
 => New people scared away => Lack in people / contributors ....
 * D just is not sexy. I know D now from 10/2016 ( when i got 
 Ali's book. Programming in D ). Loved the book and language 
 syntax ( especially if you come from PHP ) but it was all the 
 rest that ruined it for me. And over the years D seems to have 
 been going down a direction that i hated. New features that had 
 no meaning for me, code breaking issues and resources being put 
 in features like BetterC, when all i wanted was a god darn 
 simple HTTP server that worked and kept working on release 
 updates. And more active packages.
 Yep ... Now try Go, Crystal, Nim, Rust, ... and a lot of other 
 languages. I have installed all those dozens of times and a 
 issue like simply do not show up or are fixed so fast, that it 
 never hits "me". But whenever it involves D, its always one 
 thing or another.
Very well summarized...

I'm a C++ developer myself (even if these days I program mostly Python/Ruby, not C++). And as a "true" C++ alternative, D is far from being the best choice. I'd rather choose Rust, Zig and even Nim for my typical C/C++ use case, despite the fact that it's obviously *possible* to use D for that, precisely because I know D's major pain points.

D being garbage collected, I once hoped it would join the Go/Crystal wagon of similar languages which have put their focus on web application development, so that their base library provides all the required building blocks (coroutines, http, etc) in such a way that we can very easily build our own web frameworks with just a few lines of code. But I now understand that the focus is on "BetterC", not on "BetterGo"...
Sep 29 2019
next sibling parent reply Joakim =?UTF-8?B?QnLDpG5uc3Ryw7Zt?= <notfornow dev.null.com> writes:
On Sunday, 29 September 2019 at 20:39:19 UTC, Ecstatic Coder 
wrote:
 On Sunday, 29 September 2019 at 01:29:59 UTC, Benjiro wrote:
 [...]
+1 Same for me for everything you said. I personally use D only for file processing scripts, first because D is very good at that, but also because this generally (but not always) prevents me from suffering from many of those problems you mentioned.
 [...]
 [...]
 [...]
 [...]
Very well summarized... I'm a C++ developer myself (even if these days I program mostly Python/Ruby, not C++). And as a "true" C++ alternative, D is far from being the best choice. I'd rather choose Rust, Zig and even Nim for my typical C/C++ use case, despite the fact that it's obviously *possible* to use D for that, precisely because I know D's major pain points. D being garbage collected, I once hoped it would join the Go/Crystal wagon of similar languages which have put their focus on web application development, so that their base library provides all the required building blocks (coroutines, http, etc) in such a way that we can very easily build our own web frameworks with just a few lines of code. But I now understand that the focus is on "BetterC", not on "BetterGo"...
I'll chime in with some positives.

I don't find D lacking in any significant way that hinders my work. I use it both privately and at work with great success. It has all the tools I need in the language, and I like the improvements I'm seeing and the direction the language is taking.

The development of D is mostly done by volunteers, which means that I am grateful for all their time and work. I do not expect much in the way of support, or that *they* listen to me. In the same way, I do not expect it from any other language I use. The few times I have asked for help, there has always been someone who could provide it. Last time it was kinke fixing an ldc bug (thank you).

// Joakim
Sep 29 2019
next sibling parent Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Sunday, 29 September 2019 at 22:27:40 UTC, Joakim Brännström 
wrote:
 On Sunday, 29 September 2019 at 20:39:19 UTC, Ecstatic Coder 
 wrote:
 On Sunday, 29 September 2019 at 01:29:59 UTC, Benjiro wrote:
 [...]
+1 Same for me for everything you said. I personally use D only for file processing scripts, first because D is very good at that, but also because this generally (but not always) prevents me from suffering from many of those problems you mentioned.
 [...]
 [...]
 [...]
 [...]
Very well summarized... I'm a C++ developer myself (even if these days I program mostly Python/Ruby, not C++). And as a "true" C++ alternative, D is far from being the best choice. I'd rather choose Rust, Zig and even Nim for my typical C/C++ use case, despite the fact that it's obviously *possible* to use D for that, precisely because I know D's major pain points. D being garbage collected, I once hoped it would join the Go/Crystal wagon of similar languages which have put their focus on web application development, so that their base library provides all the required building blocks (coroutines, http, etc) in such a way that we can very easily build our own web frameworks with just a few lines of code. But I now understand that the focus is on "BetterC", not on "BetterGo"...
I'll chime in with some positives. I don't find D lacking in any significant way that hinders my work. I use it both privately and at work with great success. It has all the tools I need in the language, and I like the improvements I'm seeing and the direction the language is taking. The development of D is mostly done by volunteers, which means that I am grateful for all their time and work. I do not expect much in the way of support, or that *they* listen to me. In the same way, I do not expect it from any other language I use. The few times I have asked for help, there has always been someone who could provide it. Last time it was kinke fixing an ldc bug (thank you). // Joakim
Same for me. I'm very grateful to Walter and to the community for having provided me with such a fantastic language. My favorite one, actually. Just take a look at my Github account, and this will be obvious to you: https://github.com/senselogic?tab=repositories

But as I said, in my opinion, Go's and Crystal's focus on web application development has its advantages. For instance, I was interested in implementing Cyclone, my CQL/SQL script runner, in D. But I quickly switched to Genesis, my own variant of Go. The code is simple, concise and performant, and uses only the default libraries and the official Cassandra and MySQL database drivers: https://github.com/senselogic/CYCLONE/blob/master/cyclone.gs

Try porting this to D in the same way, and then tell me if I was wrong to finally choose Go for that specific tool...
Sep 29 2019
prev sibling parent reply JN <666total wp.pl> writes:
On Sunday, 29 September 2019 at 22:27:40 UTC, Joakim Brännström 
wrote:
 The development of D is mostly done by voluteres which mean 
 that I am grateful for all their time and work.
 I do not expect much in the way of support or that *they* 
 listen to me.
 In the same way I do not expect it from any other language I 
 use.
 The few times I have asked for help there has always been 
 someone that can provide it.
 Last time it was kinke in fixing an ldc bug (thank you).

 // Joakim
This is true. However, given that it's a volunteer project, there is room for improvement. If the workforce is limited, it's spread over the language/library surface. If we were to minimize that surface, we could have more work done.

I think D shouldn't be afraid to cut some of the language features which aren't used much, or which have a viable workaround. This way the language would be smaller and easier to maintain and bugfix. Also, a smaller language means a lesser chance of bugs occurring in the first place. Right now I feel like D is in a weird spot: it's too afraid to cut or rework existing features for the sake of language stability, but it's too brave introducing new concepts which weren't battle-tested before.

The main issues with D aren't with specific features, which often came up as a good idea and solve some problem; the problem is when these features interact with other features. The result is a lot of unexpected interactions (look at any DIP discussion thread; there's always a case of "yeah, but this breaks if I have an immutable shared union with alias this"), added cognitive load for the programmer, and generic code that is harder to write.
Sep 30 2019
next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 30/09/2019 8:07 PM, JN wrote:
 I think D shouldn't be afraid to cut some of the language features which 
 aren't used much, or have a viable workaround. This way the language 
 would be smaller and easier to maintain/bugfix. Also, smaller language 
 means lesser chance of bugs occuring in the first place. Right now I 
 feel like D is in that weird spot. It's too afraid to cut or rework 
 existing features for language stability, but then is too brave 
 introducing new concepts which weren't battle-tested before.
E.g. hex strings.
Sep 30 2019
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Monday, 30 September 2019 at 07:07:34 UTC, JN wrote:

 This is true. However, given that it's a volunteer project, 
 there is room for improvement. If the workforce is limited, 
 it's spread over the language/libraries surface. If we were to 
 minimize that surface, we can have more work done.

 I think D shouldn't be afraid to cut some of the language 
 features which aren't used much, or have a viable workaround. 
 This way the language would be smaller and easier to 
 maintain/bugfix. Also, smaller language means lesser chance of 
 bugs occuring in the first place. Right now I feel like D is in 
 that weird spot. It's too afraid to cut or rework existing 
 features for language stability, but then is too brave 
 introducing new concepts which weren't battle-tested before.

 Main issues with D aren't with the specific features, which 
 often came up as a good idea and solve some problem, but the 
 problem is when these features interact with other features. 
 The result is a lot of unexpected interactions (look at any DIP 
 discussion thread, there's always a case of "yeah but this 
 breaks if I have an immutable shared union with alias this"), 
 added cognitive load for the programmer and harder to write 
 generic code.
I agree. New features are introduced way too fast, often based on the latest CS fashion of the day ("Look, here's a paper on...") or "Rust has it, so we need it too!", regardless of whether or not this will break code. Maybe the reason why (ex-)users (including myself) are sometimes so grumpy and negative is that this implicit disregard of users' code makes users feel as if they and their work didn't count at all; yet D complains that there aren't enough users, yeah right.

I don't know about others, but when I was still using D actively I would sometimes see discussions on the forum and say "Holy sh*t! Does this mean I'll have to refactor my code once this new feature becomes part of D?" It's like programming on an ejector seat, and after a while I got just sick and tired of it. Nobody wants to live like that forever, not to mention that it is really annoying to have to go back to old code to "fix" things that shouldn't have been broken in the first place.

Ah, c'mon. D is 20 years old and still the same criticism pops up again and again and again, and the same answers are given: volunteer effort, do it yourself, or PFO. Is it the users' or the language leadership's fault, I wonder, after 20 years. Hm.

D is theoretically in a good position to do a spring cleaning. It has loads of features. Take what really works (battle-tested features), drop all the half-baked features that only a minority really uses. Improve the stability of the language and set up a proper ecosystem (e.g. out-of-the-box compilation for various platforms / architectures).

Atm, I'm mainly using Kotlin, and I have to say that a small set of clever and well-thought-out features can get you a long way. Do I miss some of D's features? Not really, because Kotlin provides enough useful and battle-tested features that you need 90% of the time [1]. Once I missed `static if`, but I could live without it. Mind you, Kotlin has some restrictions due to the fact that it has to be 100% compatible with Java/JVM. But even when you use Kotlin/Native (without the Java universe, i.e. modules and libraries) you can get quite far.

I think D should aim at that:

1. Take D's great features that are battle-tested.
2. See what is not strictly necessary and drop it (dead weight), i.e. figure out what programmers need 90-95% of the time, and don't pollute the language with features you only need in the remaining 5-10% of cases.
3. Set up a sound and stable ecosystem.

But then again, I fear this will never happen.
Sep 30 2019
next sibling parent reply nkm1 <t4nk074 openmailbox.org> writes:
On Monday, 30 September 2019 at 09:21:30 UTC, Chris wrote:
 D is theoretically in a good position to do a spring cleaning. 
 But then again, I fear this will never happen.
But the big one is @nogc and all its associated machinery. This is the killer, the biggest source of all the mess. And getting rid of that is not going to happen; instead you'll see more and more stuff designed to support @nogc. Complaining about it is useless, so I personally don't complain, even though I disapprove of this course and consider it the only real strategic mistake in D.

This is, of course, assuming that maximising the number of users is the strategic goal. OTOH, all these "f... off, don't whine, do it yourself" replies on the forum suggest that it's not actually such an important goal. As Adam Ruppe mentioned, more users, more problems. Which is fair enough, so indeed why whine? It just annoys people and provokes trolling.
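(For readers outside the D community: the attribute being argued about statically forbids garbage-collector allocations inside a function, which is why so much library machinery has to be reworked to support it. A minimal illustrative sketch, my own example rather than anything from the thread:)

```d
// A @nogc function may not perform any GC allocation; the compiler
// rejects operations like array concatenation or closure capture.
@nogc nothrow int sum(scope const int[] values)
{
    int total = 0;
    foreach (v; values)
        total += v;
    return total;
}

void main() @nogc nothrow
{
    static immutable int[3] data = [1, 2, 3];
    auto s = sum(data[]);          // fine: no allocation anywhere
    // int[] bad = data[] ~ [4];   // would not compile: '~' allocates with the GC
}
```

The pain point the post describes is transitive: everything a `@nogc` function calls must itself be `@nogc`, so the attribute propagates through whole libraries.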
Sep 30 2019
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 30 September 2019 at 09:57:59 UTC, nkm1 wrote:
 This is, of course, assuming that maximising the number of 
 users is the strategic goal.
Maximizing the number of users should not be the strategic goal. The strategic goal should be to provide fertile ground for building an eco-system which becomes a strong contender in at least one niche. I can't think of any language that survives without satisfying that requirement.

If you try to maximize the number of users, you will end up not being particularly good for any particular purpose. Which means that the language is easy to replace with any upcoming generic language. To be a player in that area the ecosystem is even more important; new languages very deliberately try to piggyback on existing eco-systems for that reason.

This is not to say that languages disappear; programming languages tend to have a long, long dwindling trail (Perl, Fortran, Cobol, etc).
Sep 30 2019
next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 30 September 2019 at 10:21:42 UTC, Ola Fosheim Grøstad 
wrote:
 This is not to say that languages disappear, programming 
 languages tend to have a long long dwindling trail (Perl, 
 Fortran, Cobol, etc).
tail, not trail...
Sep 30 2019
prev sibling parent reply nkm1 <t4nk074 openmailbox.org> writes:
On Monday, 30 September 2019 at 10:21:42 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 30 September 2019 at 09:57:59 UTC, nkm1 wrote:
 This is, of course, assuming that maximising the number of 
 users is the strategic goal.
Maximizing the number of users should not be the strategic goal. The strategic goal should be to provide a fertile ground for building an eco-system which becomes a strong contender in at least one niche.
That seems more like a tactical or operational goal to me... What do you think of Haskell's "avoid success at all costs" or Lua's constant breakage?
Sep 30 2019
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 30 September 2019 at 10:25:56 UTC, nkm1 wrote:
 That seems more like tactical or operational goal to me... What 
 do you think of Haskell's "avoid success at all costs" or Lua's 
 constant breakage?
Haskell is very successful. It completely dominates the niche it was aiming for: a shared language for functional-programming researchers. Sure, it lacks some namespace capabilities needed for writing large programs, but that is not needed for research...

I am not particularly fond of Lua, but I am not sure that breakage is a problem for an embedded scripting language, as you can just stay with a particular version and even fork the language at low cost. You don't need to focus on interop outside your own application. Lua and Python are the two dominating contenders in that niche (embedded scripting engines).
Sep 30 2019
parent reply Eugene Wissner <belka caraus.de> writes:
On Monday, 30 September 2019 at 10:41:21 UTC, Ola Fosheim Grøstad 
wrote:
 Haskell is very successful. It completely dominates the niche 
 it was aiming for, a shared language for functional programming 
 researchers. Sure, it lacks some namespace capabilities needed 
 for writing large programs, but that is not needed for 
 research...
Last time Haskell was a research language was 10 or 15 years ago. There are now better research functional programming languages than Haskell, like Idris or Agda. Haskell is used by industry much more than D, by big corporations as well as start-ups.

The thing with Haskell is that it is a dead simple language. There are algebraic data types, functions, typeclasses, some syntactic sugar and a few compiler extensions. No "alias this", no structs and classes, no delegates and functions, no loops, no attributes, no function overloading, no casts. Well, some of them may be useful for systems programming, but surely not everything. Instead it provides abstractions that can be used in different situations. D (and most imperative languages) introduce language features for each use case, and the languages become bloated (particularly if it is a mainstream language). This joke, "Avoid success at all cost", probably means: don't just follow trends; think about the features you're going to introduce. As a result Haskell has a great GC and a compiler which is good at optimizing; it doesn't have the greatest infrastructure, but at least a set of libraries you can build almost everything on.

The only language I have used that is as buggy as D is Facebook's Hacklang. But well, Facebook is known for low-quality software.

When I first came to D, I thought: wow, what a nice, simple language. But after some years it isn't simple; it gets terribly complicated. Therefore I like Zig's slogan: "Focus on debugging your application rather than debugging your programming language knowledge."

Even the last DIP says: it is the first step, more is coming later. And I'm 90% sure it will never be finished. Not because it is D, but because this "I'll finish it later" just never works in engineering. And this kind of thing just doesn't happen anymore to Haskell (or at least not to the same extent).
Sep 30 2019
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 30 September 2019 at 16:39:22 UTC, Eugene Wissner 
wrote:
 Last time Haskell was a research language was 10 or 15 years 
 ago. There are now better research functional programming 
 languages than Haskell, like Idris or Agda. Haskell is used by 
 industry much more than D, by big corporations as well as 
 start-ups.
Alright then, in my mind languages like Haskell and ML will always primarily be academic PL references, although they also have commercial applications. Anyway, both Haskell and ML are successful considering the use context they evolved in.
 As result Haskell has great GC and a compiler which is good at 
 optimizing, it doesn‘t have the greatest infrastructure, but at 
 least a set of libraries, you can build almost everything on.
Right, I am not a Haskell programmer, but I think the naming of some functions is odd, although I understand that there are historic FP traditions that have led to it. It does bring with it FP cultural baggage... It could do better from a usability point of view.
 Even the last DIP says: It is the first step, more is coming 
 later. And I‘m 90% sure, it will be never finished. Not because 
 it is D, but because this „I finish it later“ just never works 
 in engeneering. And this kind of things just doesn‘t happen 
 anymore to Haskell (or at least not to the same extent).
It is not a good idea in software in general, considering that common wisdom says that initial development is only 10% of the overall lifetime costs. If you keep pushing forward without completing existing features, I think the ratio will be a lot worse...
Sep 30 2019
prev sibling next sibling parent Chris <wendlec tcd.ie> writes:
On Monday, 30 September 2019 at 09:57:59 UTC, nkm1 wrote:
 On Monday, 30 September 2019 at 09:21:30 UTC, Chris wrote:
 D is theoretically in a good position to do a spring cleaning. 
 But then again, I fear this will never happen.
But the big one is nogc and all its associated machinery. This is the killer, the biggest source of all the mess. And getting rid of that is not going to happen. Instead you'll see more and more stuff designed to support nogc. Complaining about it is useless. So I personally don't complain, even though I disapprove of this course and consider it the only real strategic mistake in D. This is, of course, assuming that maximising the number of users is the strategic goal. OTOH, all these "f... off, don't whine, do it yourself" on the forum suggest that it's not actually such an important goal. As Adam Ruppe mentioned, more users, more problems. Which is fair enough, so indeed why whine? It just annoys people and provokes trolling.
Whine: "3. To complain or protest with a whine or as if with a whine." [1]. "Whining" is not constructive and does not propose any solution. However, _a lot of people_ criticizing D over the years have done so suggesting concrete solutions, and yet they have consistently been labelled as "whiners" or "trolls". If it's true that the D leadership and enthusiasts are not interested in users (and everything hints at that), then why do they complain about "negativity that hinders adoption of D"? Wanna have your cake and eat it? That's a rather schizophrenic attitude: "Don't put off users, but to hell with them, we don't give a damn about users anyway!" Ironically, the D leadership are the real whiners, they've been complaining about low adoption rates for years, and yet treat their target group like unworthy worms. Not a good way to sell a product. Yeah, no, I do wonder why D is not more popular. You're basically saying that there's no cure for D and that it's is beyond hope. I wish the D leadership finally published a statement to this effect on dlang.org. Something like "D is a volunteer effort and serves no purpose other than playing around with features and ideas. It ships with no guarantees of stability or backward compatibility as we pretty much do what we please. Use at your own peril. Complaints will be ignored." [1] https://en.wiktionary.org/wiki/whine
Sep 30 2019
prev sibling parent reply IGotD- <nise nise.com> writes:
On Monday, 30 September 2019 at 09:57:59 UTC, nkm1 wrote:
 On Monday, 30 September 2019 at 09:21:30 UTC, Chris wrote:
 D is theoretically in a good position to do a spring cleaning. 
 But then again, I fear this will never happen.
But the big one is nogc and all its associated machinery. This is the killer, the biggest source of all the mess. And getting rid of that is not going to happen. Instead you'll see more and more stuff designed to support nogc. Complaining about it is useless. So I personally don't complain, even though I disapprove of this course and consider it the only real strategic mistake in D. This is, of course, assuming that maximising the number of users is the strategic goal. OTOH, all these "f... off, don't whine, do it yourself" on the forum suggest that it's not actually such an important goal. As Adam Ruppe mentioned, more users, more problems. Which is fair enough, so indeed why whine? It just annoys people and provokes trolling.
I thought this thread was about Nim. Nim hit 1.0, which I think is great because we need alternative systems languages. Instead this thread has become some kind of panic for the D community, because Nim advances and many think D doesn't.

The whining is good because it shows that people use the language. What I think is obvious is that many posts seem to complain about something undefined, or that D hasn't become more popular in general. It would be easier if you were more specific about the deficiencies of D. A lot of negativism while C++ is committing harakiri.

One obvious future for D is services inside Linux embedded systems. Recently I was working on a project with web client SW inside a Linux system. The project used C++17, and the code was horrible (also from an aesthetic point of view, which cannot be solved), unmaintainable, with concurrency bugs waiting to happen. Also, C++17 doesn't come with any obvious libraries for this, so you have to hunt these down too, with licenses that fit, and these libraries are often horrible. I did a small test and wrote a portion of the service in D, and the difference was astonishing. First, D offers more and better libraries out of the box, and in addition to that D has Vibe. The code became much more readable and consistent. Memory leaks are gone because of the GC (don't hate the GC, it is great for most use cases).

Some language is going to take that place from C++; it could be several, and D is placed in a pretty good spot here. In terms of libraries Nim is also a C++ replacement for these kinds of use cases; however, what I've seen is that its libraries/interfaces are more immature. Rust is the most competitive contender, but as complicated as Rust can be, D can really be an option here. Most companies want a solution quickly and with good quality, and D could provide this fast path. For my next project, I will suggest using D for those kinds of services.

I don't mind the GC that much, but removing GC use from Phobos isn't bad either, so the priority is not that high for me. Could someone please explain why @nogc is a priority? What I think is one high priority is to get a reference-counted GC in D, because that opens up D for the performance crowd who do not want a stop-the-world GC. Look at the possibilities instead.
Sep 30 2019
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 30 September 2019 at 11:05:58 UTC, IGotD- wrote:
 A lot of negativism while C++ is committing harakiri. One 
 obvious future of D is services inside Linux embedded systems.
That is certainly an option, you might be able build a community around the Linux embedded niche.
 Vibe. The code became much more readable and consistent. Memory 
 leakages are gone because of GC (don't hate GC, it is great for 
 most use cases).
But you cannot rely on a GC on small embedded systems. Even allocators that don't combat fragmentation are a relevant issue.

So if embedded is a target niche, then such runtime features must be addressed.

If not, why would a project manager pick D over Go (beyond syntax preferences)?
 libraries/interfaces are more immature. Rust is the most 
 competitive contender but as complicated Rust can be D can 
 really be an option here.
But D does not provide the same memory management capabilities as Rust. Rust would be the stronger contender if a GC isn't an option, and in that case D does not have a comparable memory management solution.
 What I think is one high priority is to get reference counted 
 GC in D because that opens up D for the performance crowd who 
 do not want stop the world GC.
Well, ref counting is too slow for the performance crowd, but it is better than a GC for predictable execution and limited memory usage, which in some embedded contexts might be sufficient.

If embedded is the main target niche, then you would want clean and flexible features related to stack allocation and automatically proving upper bounds on stack size. Such features could make any reasonable language a strong contender in this niche. Being generic isn't sufficient to be a strong contender in specialized niches.
Sep 30 2019
parent IGotD- <nise nise.com> writes:
On Monday, 30 September 2019 at 11:40:01 UTC, Ola Fosheim Grøstad 
wrote:
 But you can not rely on GC on small embedded systems. Even 
 allocators that doesn't combat fragmentation is a relevant 
 issue.

 So if embedded is a target niche then such runtime features 
 must be addressed.

 If not, why would a project manager pick D over Go (beyond 
 syntax preferences)?
On bare metal, or on small embedded systems with a limited OS, you often cannot use the standard library anyway. That's often the case with C as well, unless you port the C standard library calls, which is unusual. Often you have to work with proprietary APIs anyway. For OS-agnostic D libraries it makes sense to remove any GC support, but for OS-related libraries it doesn't matter much, as you almost always have some custom API.

Go is probably one of the worst choices for bare metal, as it relies on a runtime. D has betterC, which makes it more suitable for bare metal. Right now I think betterC is a bit limited, and I would actually like to see this part expanded, with full class support for example.
Sep 30 2019
prev sibling parent reply JN <666total wp.pl> writes:
On Monday, 30 September 2019 at 11:05:58 UTC, IGotD- wrote:
 I though this thread was about Nim. Nim hit 1.0 which I think 
 is great because we need alternative systems languages. Instead 
 this thread becomes some kind of panic for the D community 
 because Nim advances and many think D doesn't.
While I don't fully subscribe to the language-war theory, I don't see Nim as any competition. I just don't see anything going on for Nim right now. I expect it will linger at the same level of popularity as, say, Pascal: some dedicated community, support in most online compiler tools, some IDE plugins, but nothing else. Even compared to D... I just don't see Nim ever going big.
 libraries/interfaces are more immature. Rust is the most 
 competitive contender but as complicated Rust can be D can 
 really be an option here. Most companies want a solution 
 quickly and with good quality and D could provide this fast 
 path. For my next project, I will suggest using D instead for 
 those kinds of services.
Many people mistakenly assume Rust's popularity comes down to the borrow checker & memory safety features. But those are just a bonus. Most of Rust's popularity comes from an active community and good language focus. The lack of a language runtime is a plus too, because it makes it a very good language for targeting embedded/WebAssembly.
 I don't mind GC that much but removing GC from Phobos isn't bad 
 either so the priority is not that high for me. Could someone 
 please explain why @nogc is a priority. What I think is one 
 high priority is to get reference counted GC in D because that 
 opens up D for the performance crowd who do not want stop the 
 world GC.
I think no matter what you do, C++ folks will complain. They're just triggered by the word GC. That's why every single discussion thread about D outside of these forums starts with the GC. Then someone will mention @nogc or refcounting. Then someone will chime in about how you lose most of the packages and the standard library because it assumes the GC is present. And then people will just go "oh man, that is so complicated".
Sep 30 2019
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 30 September 2019 at 16:47:51 UTC, JN wrote:
 Many people mistakenly assume Rust's popularity comes down to 
 borrow checker & memory safety features. But it is just a 
 bonus. Most of the Rust's popularity comes from active 
 community and good language focus. Lack of language runtime is 
 a plus too, because it makes it a very good language for 
 targeting embedded/WebAssembly.
Probably true, but I think Rust is the hipster language of imperative languages. The fact that they had a fairly active community before it was usable says something, I think. That hipster factor came from referencing advanced type theory (the theoretical foundation for the borrow checker) and also providing an ML-ish, academic syntax. C++/D is bread-and-butter in comparison.
 And then people will just go "oh man, that is so complicated".
Memory management ought to be integrated into the language semantics and not delegated to libraries, though. Otherwise it quickly becomes complicated for the programmer... So that is a weakness that C++ and D share.
Sep 30 2019
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Sep 30, 2019 at 04:47:51PM +0000, JN via Digitalmars-d wrote:
[...]
 I think no matter what you do, C++ folks will complain. They're just
 triggered by the word GC. That's why every single discussion thread
 about D outside of these forums starts with the GC. Then someone will
 mention @nogc or refcounting. Then someone will chime in about how you
 lose most of the packages and the standard library because it assumes
 the GC is present. And then people will just go "oh man, that is so
 complicated".
+1 LOL, that's totally how such discussions tend to go. It's just like Walter himself has said on several occasions. Why bend over backwards to please the non-adopting crowds who will just move on to the next excuse not to use D once you've addressed their current complaint? Rather, we should be concerned about making *existing* users happier -- and there's no shortage of action items there. T -- Unix is my IDE. -- Justin Whear
Sep 30 2019
next sibling parent Chris <wendlec tcd.ie> writes:
On Monday, 30 September 2019 at 18:41:07 UTC, H. S. Teoh wrote:
 It's just like Walter himself has said on several occasions. 
 Why bend over backwards to please the non-adopting crowds who 
 will just move on to the next excuse not to use D once you've 
 addressed their current complaint? Rather, we should be 
 concerned about making *existing* users happier -- and there's 
 no shortage of action items there.


 T
How do you reconcile that with people leaving D again? Anyway, there are two things in the statement above: 1. Walter admits that D only caters for a few users with very specific use cases (niches). There's nothing wrong with that, but please state it _publicly_ on dlang.org. Of course people will be p****d off when they are sold one thing (general purpose language), only to find out later that they don't count at all, that D is only for the "chosen few". Also, the "chosen few" shouldn't go on about how great things are with D, of course it's great for _them_, because they get their pet features implemented. 2. Given 1., it makes even less sense that D is such a mess. If you have a small target audience with a more or less small set of requirements, then D could focus on that and have a sound and compact language for special purposes. In other words, D fails in both ways. It disregards the ordinary programmer, but it doesn't provide its pet users (the "chosen few") with a sound and stable language either. So what is D all about then?
Oct 01 2019
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 1 October 2019 at 08:47:19 UTC, Chris wrote:
 1. Walter admits that D only caters for a few users with very 
 specific use cases (niches).
Which niches are these? My impression is that D primarily caters for users that want the feature-set of C++, but find C++ to be too inconvenient or complicated.
Oct 01 2019
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Tuesday, 1 October 2019 at 10:05:31 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 1 October 2019 at 08:47:19 UTC, Chris wrote:
 1. Walter admits that D only caters for a few users with very 
 specific use cases (niches).
Which niches are these? My impression is that D primarily caters for users that want the feature-set of C++, but find C++ to be too inconvenient or complicated.
Look at the companies listed here: https://dlang.org/orgs-using-d.html and the posters who say "everything's great" often work for one of those companies. Weka.io and Symmetry come to mind. It's mostly real-time (e.g. Funkwerk, Sociomantic) and high-throughput stuff (Weka.io etc.). It's mostly cloud / server based (that's why mobile is regarded as unimportant). This is perfectly fine, if that's what D wants, but please be clear about it. I think what puts people off is that D is advertised as a general-purpose "best language in the world", but is in fact a special-interest niche language. Imo, it's this dishonesty that is at the heart of a lot of the discontent, rants and misunderstandings on this forum. Of course, an engineer who works on cloud-based real-time systems will not (care to) understand the frustrations of a team that wants to make their D program available on Android and iOS. If you look at the list of organizations above, it makes perfect sense why certain aspects - that any modern language needs nowadays - are neglected: mobile (cf. mostly cloud-based users), proper tooling (cf. carefully crafted internal tooling in each organization). It caught my attention, though, that at Mercedes-Benz R&D "D is used for software development tools." Apparently, they're not yet convinced, and if you look at the other organizations above, they can afford to use a highly specialized, exotic language like D because they're not too big and use it for very specific tasks. But once things get bigger, e.g. Mercedes-Benz, the company starts to tread more carefully.
Oct 01 2019
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Tuesday, 1 October 2019 at 10:43:24 UTC, Chris wrote:
 On Tuesday, 1 October 2019 at 10:05:31 UTC, Ola Fosheim Grøstad 
 wrote:
 Which niches are these?
[...]
 Look at the companies listed here:

 https://dlang.org/orgs-using-d.html

 and the posters who say "everything's great" often work for one 
 of those companies. Weka.io and Symmetry come to mind. It's 
 mostly real time (e.g. Funkwerk, sociomantic) and high 
 throughput stuff (Weka.io etc.). It's mostly cloud / server 
 based (that's why mobile is regarded unimportant). This is 
 perfectly fine, if that's what D wants, but please be clear 
 about it.
That list is a bit difficult to assess as some of the companies have not moved forward with the language, but don't some of those that have gone with D wholesale use their own runtimes? But yes, real time seems to be a niche with some adoption. Batch data processing seems to be a niche too (I assume this accounts for situations where developers have failed to bring Python or another high-level language up to the performance they wanted). But there aren't really many features in the language that make it particularly well suited for real time or data processing. It is mostly on the library/runtime level. So my current impression, from that list, is that there is no particular niche where it is the best option. There is not enough adoption in any particular niche, IMO. (Compared to other "new" languages that have significant growth.)
 It caught my attention, though, that at Mercedes-Benz R&D "D is 
 used for software development tools." Apparently, they're not 
 yet convinced, and if you look at the other organizations 
 above, they can afford to use a highly specialized, exotic 
 language like D, because they're not too big and use it for 
 very specific tasks. But once things get bigger, e.g. 
 Mercedes-Benz, the company starts to tread more carefully.
Right, what it suggests is that Mercedes-Benz R&D don't see any obvious solution for what they want to do, so they look at smaller languages. Then they will try it for some task, but the acid test is whether they expand as they go forward. Although, I do suspect that many people/managers are inclined to try out languages that look (syntax- and feature-wise) like something they already know. So if they look for something C-like, it doesn't really tell us much about whether a more esoteric language like Rust would be "better" or "worse" for the task they are trying to solve. The fact that Rust is expanding in some niches despite having a syntax that is alien to many makes me more confident that they are carving out a niche of their own. It is more difficult to make this argument for languages like Nim and D.
Oct 02 2019
next sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
Or to phrase it differently:

In which niche can Nim or D hope to obtain a 1% market share 
within 10 years?

Which niches are out of reach?

Clearly, web, iOS, Android are all out of reach. But maybe there 
are some other niches where a focused effort could lead to 
market-capturing progress.

What would have to change (in terms of technology) to increase 
the probability of capturing one such market?
Oct 02 2019
parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 2 October 2019 at 10:16:45 UTC, Ola Fosheim Grøstad 
wrote:
 Or to phrase it differently:

 In  which niche can Nim or D hope to obtain 1% market share of 
 within 10 years?
Phew! That's a tough one.
 Which niches are out of reach?

 Clearly, web, iOS, Android are all out of reach. But maybe 
 there are some other niches where a focused effort could lead 
 to market-capturing progress.
I wouldn't write off Nim here. Nim could be an option for iOS and Android development if you can compile it to C libs or some sort of OS specific format.
 What would have to change (in terms of technology) to increase 
 the probability of capturing one such market?
Something like GraalVM? Write in any language you like and you can interact with any language you like, but I don't really see it for D. Nim, perhaps. I don't know.
Oct 02 2019
next sibling parent reply Rel <relmail rambler.ru> writes:
On Wednesday, 2 October 2019 at 11:12:29 UTC, Chris wrote:
 On Wednesday, 2 October 2019 at 10:16:45 UTC, Ola Fosheim 
 Grøstad wrote:
 Or to phrase it differently:

 In  which niche can Nim or D hope to obtain 1% market share of 
 within 10 years?
Pew! That's a tough one.
 Which niches are out of reach?

 Clearly, web, iOS, Android are all out of reach. But maybe 
 there are some other niches where a focused effort could lead 
 to market-capturing progress.
I wouldn't write off Nim here. Nim could be an option for iOS and Android development if you can compile it to C libs or some sort of OS specific format.
 What would have to change (in terms of technology) to increase 
 the probability of capturing one such market?
Something like GraalVM? Write in any language you like and you can interact with any language you like, but I don't really see it for D. Nim, perhaps. I don't know.
Actually, Nim may be pretty fine for the web as it compiles to JS as well. I believe they already have a full-stack framework, where you do both backend and frontend in Nim.
Oct 02 2019
parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 2 October 2019 at 12:43:30 UTC, Rel wrote:
 I wouldn't write off Nim here. Nim could be an option for iOS 
 and Android development if you can compile it to C libs or 
 some sort of OS specific format.

 What would have to change (in terms of technology) to 
 increase the probability of capturing one such market?
Something like GraalVM? Write in any language you like and you can interact with any language you like, but I don't really see it for D. Nim, perhaps. I don't know.
Actually Nim may be pretty fine for web as it compiles to JS as well. I believe that they have a full stack framework already, where you do both backend and frontend in Nim.
I wouldn't write off Nim here either. Nim has the benefit of being "young", so they've been able to see and adapt to the latest developments (e.g. web client + server, JS etc.).
Oct 02 2019
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 2 October 2019 at 12:53:49 UTC, Chris wrote:
 On Wednesday, 2 October 2019 at 12:43:30 UTC, Rel wrote:
 I wouldn't write off Nim here. Nim could be an option for iOS 
 and Android development if you can compile it to C libs or 
 some sort of OS specific format.

 What would have to change (in terms of technology) to 
 increase the probability of capturing one such market?
Something like GraalVM? Write in any language you like and you can interact with any language you like, but I don't really see it for D. Nim, perhaps. I don't know.
Actually Nim may be pretty fine for web as it compiles to JS as well. I believe that they have a full stack framework already, where you do both backend and frontend in Nim.
I wouldn't write off Nim here either. Nim has the benefit of being "young", so they've been able to see and adapt to the latest developments (e.g. web client + server, JS etc.).
Nim seems to also be plagued by Rust envy, so I'm not sure about it. Thanks to http://pling.jondgoodwin.com/post/cyclone/ I arrived at https://github.com/nim-lang/RFCs/issues/144 The comments regarding dropping the GC and mixed-mode libraries will feel remarkably familiar.
Oct 02 2019
parent Chris <wendlec tcd.ie> writes:
On Wednesday, 2 October 2019 at 13:45:46 UTC, Paulo Pinto wrote:
 Nim seems to also be plagued by Rust envy, so not sure about it.

 Thanks to http://pling.jondgoodwin.com/post/cyclone/ I arrived 
 to https://github.com/nim-lang/RFCs/issues/144

 The comments regarding dropping GC and mixed mode libraries 
 will feel remarkably familiar.
Hope Nim devs have learned their lesson from D.
Oct 02 2019
prev sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 2 October 2019 at 11:12:29 UTC, Chris wrote:
 I wouldn't write off Nim here. Nim could be an option for iOS 
 and Android development if you can compile it to C libs or some 
 sort of OS specific format.
That is a very good point. If you can compile to C or C++ then you can become part of most existing build chains (as a specialized tool) with limited effort.
Oct 02 2019
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 2 October 2019 at 10:10:56 UTC, Ola Fosheim Grøstad 
wrote:

 That list is a bit difficult to assess as some of the companies 
 have not moved forward with the language, but doesn't some of 
 the those that have gone with D wholesale use their own 
 runtimes?
I don't know, but I'm sure they have carefully crafted special purpose tooling around their D code (else you cannot work with D anyway).
 But yes, real time seems to be a niche with some adoption. 
 Batch-dataprocessing seems to be a niche (I assume this 
 accounts for situations where developers have failed to bring 
 Python or other high level language up to the performance they 
 wanted).
 But there isn't really many features in the language that makes 
 it particularly well suited for real time or dataprocessing. It 
 is mostly on the library/runtime level.
I suppose it's because it's easier to write in D than C/C++ (productivity) and the native performance gave them an edge over companies that use(d) Python for data analysis. D, like Python, is good for fast prototyping and, unlike Python, it is quite fast. But does it really scale? [...]
 Right, what it suggests is that  Mercedes-Benz R&D  don't see 
 any obvious solution for what they want to do, so they look at 
 smaller languages. Then they will try it for some task, but the 
 acid test is if they expand as they go forward.
Apparently, Facebook has dropped active D development. I'm always skeptical when I hear "X is using D now." People often say that D needs a big player behind it, but the big players actually have to be very careful with exotic languages. If it doesn't scale, they cannot use it. It's not that they're all knobs adopting the latest hipster fashion or sticking to old technologies. They simply cannot risk being stuck with an exotic language. Smaller organizations that operate within very special scenarios can afford to use D, and it might give them an edge over their competitors. But that doesn't mean D is a good choice per se, and keep in mind that for standard technologies like XML parsing and the like they might still rely on battle-tested libraries written in C/C++.
Oct 02 2019
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 2 October 2019 at 11:00:41 UTC, Chris wrote:
 I don't know, but I'm sure they have carefully crafted special 
 purpose tooling around their D code (else you cannot work with 
 D anyway).
That is the impression I am getting.
 Apparently, Facebook has dropped active D development. I'm 
 always skeptical when I hear "X is using D now." People often 
 say that D needs a big player behind it, but the big players 
 actually have to be very careful with exotic languages. If it 
 doesn't scale, they cannot use it. It's not that they're all 
 knobs adopting the latest hipster fashion or sticking to old 
 technologies. They simply cannot risk to be stuck with an 
 exotic language.
Right, most larger companies have used multiple languages, but there is a big difference between trying out a new tool on some smaller projects and going for it for larger critical applications.
 Smaller organizations that operate within very special 
 scenarios can afford to use D and it might give them an edge 
 over their competitors.
Startups are not the best canary, as startups have less risk aversion and technology choices are more influenced by the preferences of the initial staff. They usually don't have enough experience with the task at hand when they start out to properly evaluate the tradeoffs, although since they are often cash-restricted they might go with what they think is the cheaper alternative (or "productivity", as you mentioned). How that works out is difficult to assess. No (sane) company will speak in negative terms about their tech choices publicly, of course, as it would undermine them in terms of PR. Thus it is also very difficult to assess what they say (they tend to speak positively about the tech they chose), and one has to look at how they expand into the tech platform as time goes on.
Oct 02 2019
next sibling parent reply Rel <relmail rambler.ru> writes:
On Wednesday, 2 October 2019 at 21:31:17 UTC, Ola Fosheim Grøstad 
wrote:
 On Wednesday, 2 October 2019 at 11:00:41 UTC, Chris wrote:
 I don't know, but I'm sure they have carefully crafted special 
 purpose tooling around their D code (else you cannot work with 
 D anyway).
That is the impression I am getting.
 Apparently, Facebook has dropped active D development. I'm 
 always skeptical when I hear "X is using D now." People often 
 say that D needs a big player behind it, but the big players 
 actually have to be very careful with exotic languages. If it 
 doesn't scale, they cannot use it. It's not that they're all 
 knobs adopting the latest hipster fashion or sticking to old 
 technologies. They simply cannot risk to be stuck with an 
 exotic language.
Right, most larger companies have used multiple languages, but there is a big difference between trying out a new tool on some smaller projects and going for it for larger critical applications.
 Smaller organizations that operate within very special 
 scenarios can afford to use D and it might give them an edge 
 over their competitors.
Startups is not the best canary as startups have less risk-aversion and technology choices are more influenced by the preferences of the initial staff. They usually don't have enough experience with the task at hand when they start out to properly evaluate the tradeoffs, although since they often are cash-restricted they might go with what they think is the cheaper alternative (or "productivity" as you mentioned). How that works out is difficult to assess. No (sane) company will speak in negative terms about their tech-choices publicly of course, as it would undermine themselves in terms of PR. Thus it is also very difficult to assess what they say (they tend to speak positively about the tech they choose) and one has to assess how they expand into the tech platform as time goes on.
As for startups, I really like NoRedInk's story; they started with Ruby and React and eventually switched to Haskell and Elm and never looked back. There are a few talks on YouTube by Richard Feldman about this story, which are kinda interesting.
Oct 03 2019
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 3 October 2019 at 07:19:52 UTC, Rel wrote:
 As for startups I really like NoRedInk's story, who started 
 with Ruby and React and eventually gradually switched to 
 Haskell and Elm and never looked back. There are few talks on
That sounds interesting, transitioning to a functional programming setup from Ruby and React. I assume that they use a database backend that works well with FP then, too.
Oct 03 2019
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 2 October 2019 at 21:31:17 UTC, Ola Fosheim Grøstad 
wrote:
 Startups is not the best canary as startups have less 
 risk-aversion and technology choices are more influenced by the 
 preferences of the initial staff. They usually don't have 
 enough experience with the task at hand when they start out to 
 properly evaluate the tradeoffs, although since they often are 
 cash-restricted they might go with what they think is the 
 cheaper alternative (or "productivity" as you mentioned). How 
 that works out is difficult to assess. No (sane) company will 
 speak in negative terms about their tech-choices publicly of 
 course, as it would undermine themselves in terms of PR. Thus 
 it is also very difficult to assess what they say (they tend to 
 speak positively about the tech they choose) and one has to 
 assess how they expand into the tech platform as time goes on.
True, true. We never hear about "X dropped D", it's more like "X has a command-line tool in D now!" Wow! ;) But now that you say it, that might also be the reason for the "Everything's grand, would you PFO!" posts on this forum by people who built their organization / product around D. What if potential investors or clients hear that the technology they're using doesn't scale very well - "scale" as in expansion? The initial "advantage" might come back to bite them once they want to scale up. As for the big players, I can imagine that they like to play around with D to develop prototypes fast, and then see what other languages they can use to implement the real-world application: a language with a proper and healthy ecosystem. Python is often used for prototyping too, and then the real app is written in C++. This begs the question, is D becoming a native Python?
Oct 03 2019
next sibling parent reply Russel Winder <russel winder.org.uk> writes:
On Thu, 2019-10-03 at 08:20 +0000, Chris via Digitalmars-d wrote:
 […]
 language with a proper and healthy ecosystem. Python is often 
 used for prototyping too, and then the real app is written in 
 C++. This begs the question, is D becoming a native Python?
Wasn't one of the starting points of Nim to be a native Python? (In the same way Crystal is a native Ruby.) -- Russel. =========================================== Dr Russel Winder t: +44 20 7585 2200 41 Buckmaster Road m: +44 7770 465 077 London SW11 1EN, UK w: www.russel.org.uk
Oct 03 2019
parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 09:03:10 UTC, Russel Winder wrote:

 Wasn't one of the starting points of Nim to be a native Python? 
 (In the same way Crystal is a native Ruby.)
With the difference that Nim has the potential to become a native Python with all that's involved, i.e. a rich ecosystem à la Python. But I think it can become much more than that. If you can compile it to C libs, then it can become part of many things (mobile, embedded, gaming etc.). Personally, I'd love to see scientists and the like use Nim instead of Python, so you can use their code in production (more or less) immediately.
Oct 03 2019
parent reply LocoDelPueblo <fdp-dyna-hum nowhere.mx> writes:
On Thursday, 3 October 2019 at 09:13:37 UTC, Chris wrote:
 On Thursday, 3 October 2019 at 09:03:10 UTC, Russel Winder 
 wrote:

 Wasn't one of the starting points of Nim to be a native 
 Python? (In the same way Crystal is a native Ruby.)
With the difference that Nim has the potential to become a native Python with all that's involved, i.e. rich ecosystem a la Python. But I think it can become much more than that. If you can compile it to C libs then it can become part of many things (mobile, embedded, gaming etc.) Personally, I'd love to see scientists and the like to use Nim instead of Python so you can use their code for production (more or less) immediately.
The problem is also that people using Python don't want to be programmers... they just do stuff in Python, and sometimes that stuff becomes mainstream. So if the idea is to say "use Nim instead of Python", you basically kill the "why" that got the Python stuff widely used and developed in the first place. You fall back into the "no-way" again. I observed this again two days ago, where the UI for a command-line app required python-qt. The application could easily have been made in C++, but you know, the authors just wanted to make their stuff... and now this UI is available on most Linux distribs, no matter the way it's made.
Oct 03 2019
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 09:36:17 UTC, LocoDelPueblo wrote:
 Problem is also that people using python don't want to be 
 programmers... they just do stuff in python, and sometimes 
 these stuff become mainstream. So if the idea is to say "use 
 Nim instead of python" you basically kill the "why" the python 
 stuff started being widely used and developed. You fallback in 
 the "no-way" again.

 I've observed this again two days ago where the UI of for a 
 command line app required python-qt. The application could have 
 been made in cpp, easily, but you know the authors just wanted 
 to make their stuff...but now this ui is available on most 
 linux distribs, no matter the way it's made.
You have a point there. But the good thing about Nim is that you can generate C, C++ and Objective-C: https://nim-lang.org/docs/backends.html This can be done by the developer who works on the actual product. The scientist (or whoever develops the prototype) can do so in Python-like Nim, and then the dev ports it or incorporates it with a compiler switch. Say you have great code in Nim and you want to make an Android or iOS app. Just convert it to C or C++ and compile it for the mobile platform. I've never tried this in practice, but it sounds promising.
Oct 03 2019
parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 09:49:57 UTC, Chris wrote:
 You have a point there. But the good thing about Nim is that 
 you can generate C, C++ and Objective-C:

 https://nim-lang.org/docs/backends.html

 This can be done by the developer who works on the actual 
 product. The scientist (or whoever develops the prototype) can 
 do so in Python-like Nim, and then the dev ports or 
 incorporates it with a compiler switch. Say you have great code 
 in Nim and you wanna make an Android or iOS app. Just convert 
 it to C or C++ and compile it for the mobile platform. I've 
 never tried this in practice, but it sounds promising.
Not bad at all: https://nim-lang.org/docs/nimc.html#cross-compilation-for-android https://nim-lang.org/docs/nimc.html#cross-compilation-for-ios
Oct 03 2019
parent reply LocoDelPueblo <fdp-dyna-hum nowhere.mx> writes:
On Thursday, 3 October 2019 at 10:15:49 UTC, Chris wrote:
 On Thursday, 3 October 2019 at 09:49:57 UTC, Chris wrote:
 You have a point there. But the good thing about Nim is that 
 you can generate C, C++ and Objective-C:

 https://nim-lang.org/docs/backends.html

 This can be done by the developer who works on the actual 
 product. The scientist (or whoever develops the prototype) can 
 do so in Python-like Nim, and then the dev ports or 
 incorporates it with a compiler switch. Say you have great 
 code in Nim and you wanna make an Android or iOS app. Just 
 convert it to C or C++ and compile it for the mobile platform. 
 I've never tried this in practice, but it sounds promising.
Not bad at all: https://nim-lang.org/docs/nimc.html#cross-compilation-for-android https://nim-lang.org/docs/nimc.html#cross-compilation-for-ios
My comment earlier was more on the fact that some said Nim can be seen as a better Python, just like D was sold as a better C or C++. It will work to some extent. But for what works, will it work because of the strategy? I think a good example for D is the TSV tools. They work and are recognized because it's a serious project: tested, benchmarked, etc. I'm not even sure that the company the author works at plays **that** important a role. The tiling terminal made in D, also very popular but made by a non-corporate developer, tends to confirm this. The strategy of "Y is a better X" is to make people move from X to Y. So far D's successes are not movers; they're new projects, corporate or not.
Oct 03 2019
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 10:49:47 UTC, LocoDelPueblo wrote:
 My comment earlier was more on the fact that some said that Nim 
 can be seen as a better Python, just like D was sold as a 
 better C or C++. It will work to some extent. But what will 
 work, will it work because of that strategy?

 I think that a good example for D is the TSV tools. They work 
 and are recognized because they are a serious project: tested, 
 benchmarked, etc. I'm not even sure that the company the author 
 works at plays **that** important a role. The tiling terminal 
 made in D, also very popular but made by a non-corporate 
 developer, tends to confirm this.

 The strategy of "Y is a better X" is to make people move from X 
 to Y. So far, D's successes are not movers; they are new 
 projects, corporate or not.
A big project, say a huge data mining / processing framework, 
would definitely help Nim, and Python programmers are already 
familiar with the syntax. But there are other reasons why 
languages become popular. One reason (among many) why even great 
projects in D are ignored is D's atrocious ecosystem. In Python 
and Java, for example, I can download the stuff I need and 
install the dependencies (pip and gradle/maven), and that's 
usually it.
Oct 03 2019
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 3 October 2019 at 11:14:41 UTC, Chris wrote:
 In Python and Java for example I can download stuff I need, 
 install dependencies (pip and gradle/maven) and that's usually 
 it.
Yeah, I think that is the primary driving force behind Java and 
Python, that and platform support. Once you've learned the ins 
and outs of Python you cannot put it aside, because of the 
swiss-army-knife aspect of it. Basically all platforms support 
Python in one way or another, even embedded; it has taken the 
generic high-level position that C has on the low level.
Oct 03 2019
prev sibling parent reply mipri <mipri minimaltype.com> writes:
On Thursday, 3 October 2019 at 11:14:41 UTC, Chris wrote:
 One reason (among many) why even
 great projects in D are ignored is D's atrocious ecosystem.
The rhetorical value of hyperbolic language like this is that 
it's so rare in these forums. Everyone's just so peppy and 
rah-rah about D all the time that it's very startling to see any 
dissent.
 In
 Python and Java for example I can download stuff I need,
 install dependencies (pip and gradle/maven) and that's usually
 it.
Hi. I don't know what you're talking about, because I've done 
exactly this with dub, but I have some guesses.

Maybe you mean that with Python and Java, anything you happen 
to need will probably be downloadable and immediately usable. 
"People don't use D because people don't use D." If you use D 
anyway, you'll contribute to people using D, and by an increment 
solve the problem of people not using D.

Maybe you expect the code you use to be written in only Python 
and Java, and to only depend on other pure-Python and pure-Java 
libraries, so that when you want something you can just download 
a bunch of Python or Java code and it works. With D you'll more 
often hit a requirement to install some system libraries, and 
then you have to look that up, which is a pain.

Dub could be integrated with OS package systems so that it can 
A) suggest OS dependencies, and B) install OS dependencies. "I 
noticed that the binary I just built is linking to these 
libraries, because I ran ldd on it. Then I did a yum 
whatprovides /path/to/libraries and I found some RPMs. Do you 
want to add those as CentOS 7 OS dependencies for this project? 
This won't prevent anyone from installing your library, but it 
will make installation much easier on CentOS 7."

Maybe you have actual complaints about dub that you won't 
describe because you've mentioned them before and nothing 
happened and now everyone agrees that D has an atrocious 
ecosystem, right? In the third case, sorry for intruding on your 
established consensus, but could you at least mention an actual 
problem with dub when it comes up? Any one of the actual 
problems, as opposed to the fake problem of "you can't just 
download D stuff and use it the way you can with Python."
Oct 03 2019
next sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 3 October 2019 at 11:59:08 UTC, mipri wrote:
 comes up? Any one of the actual problems, as opposed to the fake
 problem of "you can't just download D stuff and use it the way 
 you can with Python."
IMO a package manager is only needed for scripty throw-away programming. I sincerely doubt it has a big effect on serious application development. If you build something with a long horizon you don't need a package manager. The time it takes to download and build libraries from github is negligible compared to the overall development time. Besides, if you build something big you also ought to understand the code you are using and the build process, even if it is a library. You have to be able to maintain the library in case it is abandoned...
Oct 03 2019
next sibling parent drug <drug2004 bk.ru> writes:
On 10/3/19 3:13 PM, Ola Fosheim Grøstad wrote:
 On Thursday, 3 October 2019 at 11:59:08 UTC, mipri wrote:
 comes up? Any one of the actual problems, as opposed to the fake
 problem of "you can't just download D stuff and use it the way you can 
 with Python."
IMO a package manager is only needed for scripty throw-away programming. I sincerely doubt it has a big effect on serious application development. If you build something with a long horizon you don't need a package manager. The time it takes to download and build libraries from github is negligible compared to the overall development time. Besides, if you build something big you also ought to understand the code you are using and the build process, even if it is a library. You have to be able to maintain the library in case it is abandoned...
Totally disagree with you. A package manager is the thing, 
especially if you develop big projects. It allows you to easily 
maintain the infrastructure without manually scripting all the 
dependencies to build your product, test it and deploy it. It's 
really hard to overestimate the value of a good package manager.
Oct 03 2019
prev sibling parent reply Russel Winder <russel winder.org.uk> writes:
On Thu, 2019-10-03 at 12:13 +0000, Ola Fosheim Grøstad via Digitalmars-d
wrote:
[…]

 IMO a package manager is only needed for scripty throw-away 
 programming. I sincerely doubt it has a big effect on serious 
 application development.
It seems that Python, Ruby, Rust, Go, and D disagree with this position, all
being based on source code packages. Indeed C++ is catching up and doing
something very package-like.
 If you build something with a long horizon you don't need a 
 package manager. The time it takes to download and build 
 libraries from github is negligible compared to the overall 
 development time. Besides, if you build something big you also 
 ought to understand the code you are using and the build 
 process, even if it is a library. You have to be able to 
 maintain the library in case it is abandoned...
I think Cargo, Go, and Dub prove this to be wrong.

If you end up using a library that was third-party maintained but gets
abandoned, do not take the time to take over maintenance; change to another
library. There is generally a very good reason the library got abandoned, and
there are invariably very good newer replacements.

-- 
Russel.
===========================================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk
Oct 03 2019
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 3 October 2019 at 19:54:32 UTC, Russel Winder wrote:
 It seems that Python, Ruby, Rust, Go, and D disagree with this 
 position, all being based on source code packages.
Python has enough libraries that have been maintained long 
enough to be reliable; also, the language is stable, does not 
change much, and the environment is essentially a VM. So even if 
a library is not maintained, you can basically keep using it.

Sure, Go has many packages, but very few I would dare to rely 
on. The language is still changing, and I also don't want to use 
libraries on a server from a source I cannot trust. There are so 
many unmaintained libraries for these new languages that it 
doesn't look good for people who want to write software that has 
to be maintained for 15+ years.
 Indeed C++  is catching up and doing something very package 
 like.
Maybe, but if the libraries you use have strong ties to the 
OS/CPU architecture, then they have to be maintained at a higher 
frequency (which is more typical for low-level programming than 
high-level). What I believe is typical for C++ is to either 
include C libraries that have a long track record of 
reliability, or depend on a small number of solid (commercial 
grade) C++ frameworks, or use very small header libraries 
(preferably libraries that can be imported as independent 
single-file headers). You really need to be careful with what 
you pull in when programming in C++, both in terms of being able 
to debug and build times.
 If you end up using a library that was third-party maintained 
 but gets abandoned, do not take the time to take over 
 maintenance, change to another library. There is generally a 
 very good reason the library got abandoned, and there are 
 invariably very good newer replacements.
That's not really possible in a larger project that has been 
deployed; that would incur large costs. Besides, there might not 
be another PDF-generating library for you to use, or a sound 
synthesis package that is anywhere near compatible. Even in 
Python, if I could not use the PDF library I am using now, I 
would most likely set up an additional Java service rather than 
trying to pull in another PDF library written in Python.

Besides, in reality you often end up maintaining your own fork 
anyway, since you usually have to make modifications to the 
library over time if it is doing something significant and 
special that you cannot roll on your own.

Another reason not to use packages in larger projects is that it 
makes no sense to import an entire package when you only want 
one function: increased build times, a harder-to-read codebase, 
and so on. What I find repulsive about the node.js world, which 
uses packages at crazy levels of pervasiveness, is the security 
aspect of it.
Oct 03 2019
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Oct 03, 2019 at 08:50:53PM +0000, Ola Fosheim Grøstad via Digitalmars-d
wrote:
 On Thursday, 3 October 2019 at 19:54:32 UTC, Russel Winder wrote:
[...]
 If you end up using a library that was third-party maintained but
 gets abandoned, do not take the time to take over maintenance,
 change to another library. There is generally a very good reason the
 library got abandoned, and there are invariably very good newer
 replacements.
That sounds good in theory, but in practice it will usually involve changing
to a different API, and translating the old code to the new API (or writing a
proxy wrapper) may be non-trivial, if even possible -- some replacements
involve changing the entire operational paradigm, which makes a rewrite
infeasible. If that library is mission-critical, I would not want to leave it
up to the whim of whoever is hosting the source code online to remain online
when I need it to be.

Just imagine the hypothetical case that you have a massive vibe.d project, and
suddenly one day, for whatever reason, vibe.d becomes unmaintained / the repo
goes offline or whatever. Well, no problem, there's Adam Ruppe's arsd library
with web handling, right? Just change a few lines in dub.json and off you go?
The difference in API alone will probably entail rewriting the entire darned
codebase before you could even get the thing to compile, let alone fixing all
the bugs that were introduced in the process. Drop-in replacement is a pipe
dream.
 That's not really possible in a larger project that has been deployed.
 That would incur large costs. Besides there might not be another
 PDF-generating library for you to use, or sound synthesis package that
 is anywhere near compatible. Even in Python, if I cannot use the PDF
 library I am using now, I would most likely set up an additional Java
 service rather than trying to pull in another PDF library written in
 Python.
[...]

Exactly. Most non-trivial libraries of equivalent functionality tend to have
drastically different APIs. It is not as easy as it sounds; you can't just
drop in a different library in place of the one that got abandoned. In a
large, non-trivial project, switching components like that is a big no-no
unless there's a very compelling reason to, *and* you have the time and
resources to pull it off (and clean up the resulting mess afterwards).

T

-- 
WINDOWS = Will Install Needless Data On Whole System -- CompuMan
Oct 03 2019
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 20:50:53 UTC, Ola Fosheim Grøstad 
wrote:

 What I find repulsive about the node.js world that use packages 
 at crazy levels of pervasiveness is the security aspect of it.
The node.js world is crazy, in my opinion. Your whole project 
depends on some "obscure" packages. It's a big gamble and 
maintenance hell. In my experience, such projects are about 
"instant gratification": they make the client happy ("Wow! Looks 
good!"), but they quickly turn into a nightmare for those who 
have to maintain them.

I prefer to invest time in a sound foundation and build on that. 
Some fancy things are easy to achieve with bog-standard JS and 
CSS (transitions, transformations etc.), which has the advantage 
that you're in control and you don't depend on some obscure 
packages. A lot of info is here [1].

[1] https://www.w3schools.com/howto/howto_website.asp
Oct 04 2019
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 4 October 2019 at 09:34:16 UTC, Chris wrote:
 The node.js world is crazy, in my opinion. Your whole project 
 depends on some "obscure" packages. It's a big gamble and a 
 maintenance hell. In my experience, such projects are for
Yes, although I do use Angular, which also pulls in a lot of dependencies, but then I expect Google to do the maintenance. Anyway, there is a big difference between having many dependencies on something that runs on the server (bad idea) and using a framework that runs client-side in the browser "sandbox".
 foundation and build on that. Some fancy things are easy to 
 achieve with bog standard JS and CSS (transitions, 
 transformations etc.), this has the advantage that you're in 
 control and you don't depend on some obscure packages. A lot of
That's true. I believe the "culture" of using libraries for 
such things comes from around ten years ago, when browsers had 
very different feature sets. The exception is when you implement 
a UI with a standard look-and-feel, like Google Material, which 
has an animated counterpart in Angular.
Oct 04 2019
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 4 October 2019 at 11:54:33 UTC, Ola Fosheim Grøstad 
wrote:
 That's true. I believe the "culture" of using libraries for 
 such things comes from around ten years ago, when browsers had 
 very different feature sets.
That's definitely one of the reasons. I remember those days. But 
even MS finally gave in (IE just faded out, devs and users 
didn't care anymore); nowadays you can use JS and CSS and be 
confident that it will work (with a few minor exceptions, like 
getting the selected <option> of a <select> element).

Another reason is that young people are used to frameworks and 
packages and often wonder why I don't just download some fancy 
packages. But why would I do that if I can do it with keyframes, 
JS and 10 lines of code? Sure, web companies like Google and 
Facebook offer out-of-the-box solutions, because they want devs 
to create apps and websites as fast as possible, and companies / 
startups are under pressure to impress clients and investors. 
Often it's disposable stuff like an app for the Soccer World Cup 
and the like. So there is demand and I'm not against it, but one 
has to decide whether or not it's good for your own project(s). 
Just because it's available and used all over the world doesn't 
mean it's the right thing for your project(s). It's like fast 
food: sometimes it's a good solution, and it's good that we have 
it, but should you depend on it?
Oct 04 2019
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 4 October 2019 at 12:33:25 UTC, Chris wrote:
 On Friday, 4 October 2019 at 11:54:33 UTC, Ola Fosheim Grøstad 
 wrote:
 That's true. I believe the "culture" of using libraries for 
 such things comes from around ten years ago, when browsers had 
 very different feature sets.
That's definitely one of the reasons. I remember those days. But even MS finally gave in (IE just faded out, devs and users didn't care anymore), nowadays you can use JS and CSS and be confident that it will work (with a few minor exceptions like
Right, and the JS APIs are much better too. I sometimes wonder 
why jQuery is still around? Established habits don't change 
easily...
 like. So there is demand and I'm not against it, but one has to 
 decide whether or not it's good for your own project(s).
Yeah, those frameworks are generally not good for mobile 
devices, but can be very good for admin-desktop user interfaces 
and the like. Although this might change; some countries have 
cheap and fast mobile data providers. For instance, I believe 
there is a Scandinavian provider that allows mobile users to 
download 1000GB per month for around 50USD. And then you have 
5G... so the limiting factors do change.
 Sometimes it's a good solution, and it's good that we have it, 
 but should you depend on it?
Right. You usually get much more fluid performance by rolling 
your own. Then again, how fluid do you need "preference 
settings" to be? Different tools for different purposes.
Oct 04 2019
parent reply Russel Winder <russel winder.org.uk> writes:
On Fri, 2019-10-04 at 13:05 +0000, Ola Fosheim Grøstad via Digitalmars-d
wrote:
[…]

 Right, and the JS APIs are much better too. I sometimes wonder 
 why jQuery is still around? Established habits don't change 
 easily...
[…]

Isn't jQuery still being used because everyone with jQuery based systems
refuses to update them because they work?

-- 
Russel.
Oct 07 2019
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 7 October 2019 at 16:58:17 UTC, Russel Winder wrote:
 On Fri, 2019-10-04 at 13:05 +0000, Ola Fosheim Grøstad via 
 Digitalmars-d
 wrote:
 […]
 
 Right, and the JS APIs are much better too.  I sometimes 
 wonder why jquery is still around? Established habits don't 
 change easily...
[…] Isn't jQuery still being used because everyone with jQuery based systems refuses to update them because they work?
I don't know... https://meta.stackoverflow.com/questions/383170/how-do-i-ask-a-question-and-exclude-jquery-answers-to-strict-javascript-question *sigh*
Oct 07 2019
prev sibling next sibling parent rikki cattermole <rikki cattermole.co.nz> writes:
On 04/10/2019 12:59 AM, mipri wrote:
 
 In the third case, sorry for intruding on your established consensus,
 but could you at least mention an actual problem with dub when it
 comes up? Any one of the actual problems, as opposed to the fake
 problem of "you can't just download D stuff and use it the way you
 can with Python."
From my experience, there is a tendency among some people, 
especially those more experienced in D, to fight how dub wants 
to do things.

Do it dub's way, and it should "just work" for pure-D code bases.

That isn't to say it doesn't or shouldn't be improved. It does 
need to be improved, to cover more advanced use cases.
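For what it's worth, "dub's way" for a pure-D dependency is only a few lines of configuration. The package name and version below are purely illustrative, not taken from this thread:

```json
{
    "name": "myapp",
    "description": "Hypothetical app with a single dub dependency.",
    "dependencies": {
        "vibe-d": "~>0.8.6"
    }
}
```

With a dub.json like this in place, `dub build` fetches the dependency from the registry and builds everything in one step.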
Oct 03 2019
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 11:59:08 UTC, mipri wrote:
 The rhetorical value of hyperbolic language like this is that 
 it's so
 rare in these forums. Everyone's just so peppy and ra-ra about 
 D all
 the time that it's very startling to see any dissent.
I know that D is "the best language in the world" and anyone who points out serious flaws is a heretic. But thanks for pointing it out.
 Hi. I don't know what you're talking about, because I've done
 exactly this with dub, but I have some guesses.


 Maybe you mean that with Python and Java, anything you happen 
 to need will probably be downloadable and immediately usable. 
 "People don't use D because people don't use D." If you use D 
 anyway, you'll contribute to people using D, and by an 
 increment solve the problem of people not using D.
I used D for years. Got cured though. I think you get it only once. After that, you are immune.

 Maybe you expect the code you use to be written in only Python 
 and Java, and to only depend on other pure-Python and pure-Java 
 libraries, so that when you want something you can just 
 download a bunch of Python or Java code and it works. With D 
 you'll more often hit a requirement to install some system 
 libraries, and then you have to look that up, which is a pain.
In other words, a mess of dependencies.

 Dub could be integrated with OS package systems so that it can 
 A) suggest OS dependencies, and B) install OS dependencies. "I 
 noticed that the binary I just built is linking to these 
 libraries, because I ran ldd on it. Then I did a yum 
 whatprovides /path/to/libraries and I found some RPMs. Do you 
 want to add those as CentOS 7 OS dependencies for this project? 
 This won't prevent anyone from installing your library, but it 
 will make installation much easier on CentOS 7."

 Maybe you have actual complaints about dub that you won't 
 describe because you've mentioned them before and nothing 
 happened and now everyone agrees that D has an atrocious 
 ecosystem, right?
There are loads of D users who dislike dub (I always used it for convenience though). I don't know if it's dub itself or just the fact that those who dislike it won't accept anything but make files. I never cared, tbh. The docs used to be hard to read, though, I don't know if that has improved.
 In the third case, sorry for intruding on your established 
 consensus,
 but could you at least mention an actual problem with dub when 
 it
 comes up? Any one of the actual problems, as opposed to the fake
 problem of "you can't just download D stuff and use it the way 
 you
 can with Python."
To be clear, dub is NOT "D's ecosystem"; it's a tiny part of it, 
and most issues are not related to dub but to broken packages 
due to compiler upgrades and other toolchain issues. For 
example, is there a tool to streamline D integration into 
Android and iOS (cf. Nim)? Where are the plugins for the major 
IDEs? D doesn't care, but it sure won't scale this way.
Oct 03 2019
next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Thursday, 3 October 2019 at 12:53:58 UTC, Chris wrote:
 For example, is there a tool to streamline D integration into 
 Android and iOS (cf. Nim)?
yes, it already just works on android command line and gui coming very soon
Oct 03 2019
parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 13:02:41 UTC, Adam D. Ruppe wrote:
 On Thursday, 3 October 2019 at 12:53:58 UTC, Chris wrote:
 For example, is there a tool to streamline D integration into 
 Android and iOS (cf. Nim)?
yes, it already just works on android command line and gui coming very soon
Care to post the link?
Oct 03 2019
parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Thursday, 3 October 2019 at 13:28:15 UTC, Chris wrote:
 Care to post the link?
https://github.com/ldc-developers/ldc/releases/tag/v1.18.0-beta2 me and another person are working on the ios and android gui stuff in the background.
Oct 03 2019
parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 13:39:51 UTC, Adam D. Ruppe wrote:
 On Thursday, 3 October 2019 at 13:28:15 UTC, Chris wrote:
 Care to post the link?
https://github.com/ldc-developers/ldc/releases/tag/v1.18.0-beta2 me and another person are working on the ios and android gui stuff in the background.
Thanks. Good on you. How will it work once it's finished? How hard would it be to integrate it into an Android Studio / Xcode project / toolchain, e.g. could I add it as a dependency or would I have to compile it and package it as an additional lib? Are there any plans for auto generated bridges, e.g. JNI calling rt_init() or something?
Oct 03 2019
next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Thursday, 3 October 2019 at 18:15:10 UTC, Chris wrote:
 How will it work once it's finished?
That's part of what we still need to figure out. What I'm aiming for in the first version is you compile the D code with ldc then work the generated shared object back into your workflow, which won't be modified; you just make the blob then drop it in as if it was a closed-source library. Second pass is possibly a plugin with android studio, or at least the build system. I'm actually totally new to mobile dev (I personally don't even use smartphone apps!) so lots of learning as I go. I'm pretty thankful for the work the ldc people have already done on this, as well as the prior art Joakim did before he left. The iOS thing is being led by someone else (idk if he wants to publicly talk about it yet), but he's very much a competent Mac person so I have no doubt he'll make it work well.
 Are there any plans for auto generated bridges, e.g. JNI 
 calling rt_init() or something?
I do want to put the bindings to the NDK into druntime as well in core.sys.android. The D foundation is officially backing this work which makes that more realistic.
Oct 03 2019
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Oct 03, 2019 at 06:27:46PM +0000, Adam D. Ruppe via Digitalmars-d wrote:
[...]
 That's part of what we still need to figure out. What I'm aiming for
 in the first version is you compile the D code with ldc then work the
 generated shared object back into your workflow, which won't be
 modified; you just make the blob then drop it in as if it was a
 closed-source library.
BTW, I do have an android project that involves a combination of D code and
Java code. It's cross-compiled from PC to the Android platform. Although it's
not your typical Android project (I eschewed Android Studio in favor of my own
SCons-based build script, mainly because I need to compile helper D programs
that in turn generate D code for certain repetitive modules, but also because
I can't be bothered with a resource-intensive IDE when a text editor works
just as well), I'll be happy to provide details on the steps that worked for
me.

Basically, an APK can include Java bytecode, resources (XML, media, etc.),
and, most importantly, a lib/ subdirectory for precompiled .so files. I took
the route of using Java to interface with the Android OS: there's a C API
available that I *could* use if I wanted to, but I didn't want to have to
reinvent the Java GUI bindings already available to Java code, so I just have
the Java code declare native methods that are implemented by the .so
(containing D code) via JNI.

Compilation essentially amounts to running ldc on the D code to produce a .so
that the build system inserts into the correct location in the APK, and the
usual Java compilation and assembly into classes.dex, as required by Android,
along with packing up the standard APK resources. Then run the tools in the
Android SDK/NDK to process the APK, sign it, and what-not, and it's ready to
be uploaded to your device for installation.
 Second pass is possibly a plugin with android studio, or at least the
 build system. I'm actually totally new to mobile dev (I personally
 don't even use smartphone apps!) so lots of learning as I go. I'm
 pretty thankful for the work the ldc people have already done on this,
 as well as the prior art Joakim did before he left.
Yes, my project got off the ground only because of the work Joakim did to make transcompilation to Android possible with LDC. There are some pretty specific details that you have to get right, otherwise your APK won't work. [...]
 Are there any plans for auto generated bridges, e.g. JNI calling
 rt_init() or something?
I do want to put the bindings to the NDK into druntime as well in core.sys.android. The D foundation is officially backing this work which makes that more realistic.
That would be VERY nice. Currently, in my code, I have to take extra care to call rt_init() and rt_cleanup() at the right junctures -- and this stuff is pretty delicate; get it wrong and it will crash at runtime, usually with no useful indication of what went wrong (you just get the "sorry, application crashed" message). I also have the beginnings of a JNI boilerplate auto-generator in my project, basically a template function that lets D code invoke a named Java method on a JNI-wrapped Java object. If you like, I can send you a copy for reference. Though keep in mind it's *very* crude, and I wrote just enough for my own needs, so it will need more work to be suitable for general consumption. But at the very least, it lets me call Java methods like this: // Equivalent to Java: // int ret = classObj.myMethod(123, "abc", 1.0); int ret = callJavaMethod!(int, "myMethod")(jniHandle, classObj, 123, "abc", 1.0); which, if you were to write it manually in terms of JNI calls, would be like a 25-50 line function in itself. (JNI is *very* boilerplate-y.) In theory, this could be expanded upon by wrapping Java class objects received over JNI in a JavaObject wrapper that contains an implicit JNI handle, so that you could actually write the above as: int ret = classObjWrapper.myMethod(123, "abc", 1.0); This can probably be done with good ole opDispatch. But so far, I haven't found the need for it yet, since my focus is mainly on the D code, and calling Java methods from D isn't really a priority except for some unavoidable calls to interact with the Android GUI API. One gotcha to watch out for, if you ever decide to go that route, is that the JNI handle handed to the D code by JNI is NOT guaranteed to be valid across different JNI calls; so if you do wrap your class objects like above, you have to be careful that you don't reuse stale JNI handles you obtained from previous JNI calls. Otherwise you may get random unexplained crashes at runtime. 
This is one of the reasons I haven't bothered to write such a wrapper just yet -- it's too much trouble to write code to handle the JNI handle correctly at the moment. Another challenge is that you can't deduce the Java return type of a method at D compile-time -- this is why callJavaMethod above requires a compile-time argument to specify the return type -- so you may have to resort to some heavy hackery if you want to be able to just assign the return value to a target variable without needing the user to explicitly specify the type. One approach that comes to mind is a tool that parses Java source code and emits equivalent D declarations that are then introspected by the Java class wrapper to obtain the correct method signatures. But again, a lot of work for something that I don't presently feel an urgent need for right now. (And, to dream on, in the future we could potentially have built-in Java support in D, in a similar style that we interface with C/C++ today: declare Java classes/methods using D equivalents, and have the compiler or a library auto-generate the JNI code for it. Or potentially just ditch JNI and use JNA instead, which is far less boilerplate-y. But as far as Android is concerned, JNA may not be available yet, so for the near future JNI seems to the way to go.) T -- WINDOWS = Will Install Needless Data On Whole System -- CompuMan
Oct 03 2019
next sibling parent Adam D. Ruppe <destructionator gmail.com> writes:
On Thursday, 3 October 2019 at 19:33:37 UTC, H. S. Teoh wrote:
 BTW, I do have an android project that involves a combination 
 of D code and Java code.
ooo i'll take any help i can get. email me later.
Oct 03 2019
prev sibling parent Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 19:33:37 UTC, H. S. Teoh wrote:

[snip]

That sounds exactly like the nightmare I imagined it to be. I 
looked at Joakim's stuff a few times but realized the solutions 
for D on Android were still way too tricky and delicate. What he 
did is a great piece of engineering, but nothing that 
particularly inspires you to say "Yes, I'll go with D!" I'm glad 
that there is a bit of movement now.

 (And, to dream on, in the future we could potentially have 
 built-in Java support in D, in a similar style that we 
 interface with C/C++ today: declare Java classes/methods using 
 D equivalents, and have the compiler or a library auto-generate 
 the JNI code for it.  Or potentially just ditch JNI and use JNA 
 instead, which is far less boilerplate-y. But as far as Android 
 is concerned, JNA may not be available yet, so for the near 
 future JNI seems to the way to go.)


 T
That'd actually be great. Atm, interfacing D and Java is awkward. I've done it a few times (once I compiled my D code to a lib and used JavaFX for the UI), but it's certainly nothing you look forward to doing ;)
Oct 04 2019
prev sibling next sibling parent Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 18:27:46 UTC, Adam D. Ruppe wrote:
 On Thursday, 3 October 2019 at 18:15:10 UTC, Chris wrote:
 How will it work once it's finished?
That's part of what we still need to figure out. What I'm aiming for in the first version is you compile the D code with ldc then work the generated shared object back into your workflow, which won't be modified; you just make the blob then drop it in as if it was a closed-source library. Second pass is possibly a plugin with android studio, or at least the build system.

I'm actually totally new to mobile dev (I personally don't even use smartphone apps!) so lots of learning as I go. I'm pretty thankful for the work the ldc people have already done on this, as well as the prior art Joakim did before he left.

The iOS thing is being led by someone else (idk if he wants to publicly talk about it yet), but he's very much a competent Mac person so I have no doubt he'll make it work well.
 Are there any plans for auto generated bridges, e.g. JNI 
 calling rt_init() or something?
I do want to put the bindings to the NDK into druntime as well in core.sys.android. The D foundation is officially backing this work which makes that more realistic.
That all sounds very good. As you probably know, Android Studio uses CMake for the C/C++ libs [1]; it's not hard to set up and pretty handy: all my C code is compiled for whatever architecture and version of Android automagically. I realize that it would not be quite as "simple" for D, but something like that, an automatic D build script, would be desirable.

Also, you might want to have a look at how Kotlin handles multiplatform stuff, especially Android and iOS [2][3].

[1] https://developer.android.com/studio/projects/configure-cmake
[2] https://kotlinlang.org/docs/reference/native-overview.html
[3] https://play.kotlinlang.org/hands-on/Targeting%20iOS%20and%20Android%20with%20Kotlin%20Multiplatform/01_Introduction
Oct 04 2019
prev sibling parent GreatSam4sure <greatsam4sure gmail.com> writes:
On Thursday, 3 October 2019 at 18:27:46 UTC, Adam D. Ruppe wrote:
 On Thursday, 3 October 2019 at 18:15:10 UTC, Chris wrote:
 How will it work once it's finished?
That's part of what we still need to figure out. What I'm aiming for in the first version is you compile the D code with ldc then work the generated shared object back into your workflow, which won't be modified; you just make the blob then drop it in as if it was a closed-source library. Second pass is possibly a plugin with android studio, or at least the build system.

I'm actually totally new to mobile dev (I personally don't even use smartphone apps!) so lots of learning as I go. I'm pretty thankful for the work the ldc people have already done on this, as well as the prior art Joakim did before he left.

The iOS thing is being led by someone else (idk if he wants to publicly talk about it yet), but he's very much a competent Mac person so I have no doubt he'll make it work well.
 Are there any plans for auto generated bridges, e.g. JNI 
 calling rt_init() or something?
I do want to put the bindings to the NDK into druntime as well in core.sys.android. The D foundation is officially backing this work which makes that more realistic.
Thanks for the effort. But it must work out of the box. A plugin for Android Studio would be best; all the heavy lifting should be done by the plugin. I would like to just install the plugin and have everything work. If I can get this, I might consider donating for it.
Oct 04 2019
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On Thursday, 3 October 2019 at 18:15:10 UTC, Chris wrote:

 Thanks. Good on you. How will it work once it's finished? How 
 hard would it be to integrate it into an Android Studio / Xcode 
 project / toolchain, e.g. could I add it as a dependency or 
 would I have to compile it and package it as an additional lib? 
 Are there any plans for auto generated bridges, e.g. JNI 
 calling rt_init() or something?
I can give some answers related to iOS, although I'm not that guy working on the iOS project.

The simplest would probably be to add an external build step in Xcode to compile the D code, which should produce a static library. Then just link with that static library. It's a bit simpler on iOS since everything is native code and there is nothing like JNI that needs to be handled.

For interfacing between the Swift/Objective-C code and D, either `extern(C)` or `extern(Objective-C)` can be used. DStep [1] can generate bindings to Objective-C code.

I think it should be possible to implement an application completely in D as well.

[1] http://github.com/jacob-carlborg/dstep
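[Editor's note: as a rough illustration of the `extern (Objective-C)` route mentioned above, a hand-written binding might look like the sketch below. This is untested and hedged: it assumes an Apple platform, linking against the Foundation framework, and a compiler with Objective-C interop enabled; the binding shown is the kind of declaration DStep can generate from a header.]

```d
import core.attribute : selector;

// Minimal hand-written binding to NSString (sketch; real bindings
// would cover more of the class and be generated by DStep).
extern (Objective-C)
extern class NSString
{
    static NSString alloc() @selector("alloc");
    NSString initWithUTF8String(const(char)* str)
        @selector("initWithUTF8String:");
    const(char)* UTF8String() @selector("UTF8String");
}

void roundTrip()
{
    // Push a string through Objective-C and back.
    auto s = NSString.alloc().initWithUTF8String("hello from D");
    auto p = s.UTF8String();
}
```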
Oct 04 2019
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Oct 04, 2019 at 02:14:33PM +0000, Jacob Carlborg via Digitalmars-d
wrote:
[...]
 The simplest would probably be to add an external build step in Xcode
 to compile the D code, which should produce a static library. Then
 just link with that static library. It's a bit simpler on iOS since
 everything is native code and there are nothin like JNI that needs to
 be handled.
[...]

FYI, on Android there *is* a C API in the NDK that you can use to interface with the OS. IIRC Java/JNI is not strictly necessary, though it's certainly more convenient since a large number of built-in resources are only accessible to Java (esp. the various GUI components). I chose the mixed Java/D route mainly so that I can just leverage off the Java stuff that's already built-in, instead of homebrewing everything from scratch in D.


T

-- 
If you think you are too small to make a difference, try sleeping in a closed room with a mosquito. -- Jan van Steenbergen
Oct 04 2019
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 4 October 2019 at 15:02:36 UTC, H. S. Teoh wrote:
 FYI, on Android there *is* a C API in the NDK that you can use 
 to interface with the OS. IIRC Java/JNI is not strictly 
 necessary, though it's certainly more convenient since a large 
 number of built-in resources are only accessible to Java (esp. 
 the various GUI components).
I wonder how hard it would be to port Flutter to D and bypass Java/Swift entirely. I don't think "modern" Dart is all that different. https://flutter.dev/docs/resources/technical-overview
Oct 04 2019
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 4 October 2019 at 15:18:03 UTC, Ola Fosheim Grøstad 
wrote:
 I wonder how hard it would be to port Flutter to D and bypass 
 Java/Swift entirely. I don't think "modern" Dart is all that 
 different.

 https://flutter.dev/docs/resources/technical-overview
The guy is obviously biased, but he has a point. Kotlin devs will not want to move to Dart, and who knows how long Flutter will last, till the next cool framework comes along? Personally, it's too React-ish and Angular-ish for my liking.

https://blog.kotlin-academy.com/flutter-and-kotlin-multiplatform-relationship-890616005f57
Oct 04 2019
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 4 October 2019 at 17:41:15 UTC, Chris wrote:
 The guy is obviously biased, but he has a point. Kotlin devs 
 will not want to move to Dart, and who knows how long Flutter 
 will last, till the next cool framework comes along? 
 Personally, it's too React-ish and Angular-ish for my liking.
I don't know, but Google is using Dart internally and also controls Android, so it might currently be the best option for cross-platform. Anyway, if it is all open source, then the code will be there even if they drop it. I have no experience with Flutter, but it seems to be more mature than it was a few years ago.

It doesn't really matter all that much whether people want to transition or not. If Google doesn't drop Flutter, then the business cost savings will be more important than what programmers want or don't want...

Anyway, since the layer under Dart is C/C++, it should be possible to port Flutter to other languages.
Oct 04 2019
prev sibling parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Friday, 4 October 2019 at 15:02:36 UTC, H. S. Teoh wrote:
 On Fri, Oct 04, 2019 at 02:14:33PM +0000, Jacob Carlborg via 
 Digitalmars-d wrote: [...]
 The simplest would probably be to add an external build step 
 in Xcode to compile the D code, which should produce a static 
 library. Then just link with that static library. It's a bit 
 simpler on iOS since everything is native code and there are 
 nothin like JNI that needs to be handled.
 [...] FYI, on Android there *is* a C API in the NDK that you can 
 use to interface with the OS. IIRC Java/JNI is not strictly 
 necessary, though it's certainly more convenient since a large 
 number of built-in resources are only accessible to Java (esp. 
 the various GUI components). I chose the mixed Java/D route 
 mainly so that I can just leverage off the Java stuff that's 
 already built-in, instead of homebrewing everything from scratch 
 in D.

 T
JNI is pretty much a requirement for anything that isn't OpenGL, Vulkan, audio, or the ISO C / ISO C++ standard libraries; even accessing files outside the APK install directory requires Java due to SAF.

Additionally, since Android 7 Google has been clamping down on the NDK APIs to minimize exploits via native code.
Oct 04 2019
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 4 October 2019 at 15:34:32 UTC, Paulo Pinto wrote:
 JNI is pretty much a requirement for anything that isn't 
 OpenGL, Vulkan, Audio, ISO C or ISO C++ standard libraries, 
 even accessing files outside the APK install directory requires 
 Java due to SAF.

 Additionally since Android 7 Google has been clamping down the 
 NDK APIs to minimize exploits via native code.
But Flutter is made by Google, so they clearly have a path to a Java-free environment? Or does Flutter depend on Java?
Oct 04 2019
parent reply Petar Kirov [ZombineDev] <petar.p.kirov gmail.com> writes:
On Friday, 4 October 2019 at 15:56:28 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 4 October 2019 at 15:34:32 UTC, Paulo Pinto wrote:
 JNI is pretty much a requirement for anything that isn't 
 OpenGL, Vulkan, Audio, ISO C or ISO C++ standard libraries, 
 even accessing files outside the APK install directory 
 requires Java due to SAF.

 Additionally since Android 7 Google has been clamping down the 
 NDK APIs to minimize exploits via native code.
But Flutter is made by Google, so they clearly have a path to a Java-free environment? Or does Flutter depend on Java?
We're using Flutter at work for one of our applications, so I can answer that.

The Flutter project is divided in two parts: engine and framework. The engine is written in a mix of C++, Java and Objective-C, where Java and Objective-C are used to access the respective platform APIs. The framework (what Flutter apps are directly using) is written in pure Dart.

Flutter is using a technique called system channels (the docs call them "platform channels"), which is basically an RPC-like mechanism that marshals method calls between Dart and the platform-side C++/Java/Objective-C. It's slower than calling the method from the same language.

I believe Flutter will be the framework of choice for their upcoming Fuchsia OS, but currently Dart is very much a second-class citizen on Android (unlike Kotlin), just like it is on iOS. Fortunately, Google has done a stellar job of creating a good dev experience by providing a solid CLI and editor plugins that abstract the platform differences.
Oct 04 2019
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Friday, 4 October 2019 at 16:28:31 UTC, Petar Kirov 
[ZombineDev] wrote:
 The Flutter project is divided in two parts: engine and 
 framework. The engine is written in a mix of C++, Java and 
 Objective-C where Java and Objective-C are respectively used to 
 access the platform APIs. The framework (what Flutter apps are 
 directly using) is written in pure Dart.
 Flutter is using a technique called system channels which is 
 basically an RPC-like thing that marshals method calls to/from 
 C/C++/Java/Objective-C from/to C++ and Dart. It's slower than 
 calling the method from the same language.
Ok, it was not apparent from their system overview chart that they go through Java; it basically says that the engine is C/C++ and sits on top of a system-specific "embedder" layer.

So I guess Java is called from the "embedder" layer where the API isn't available otherwise? But rendering is done straight from C/C++? It does use its own renderer, right?
 I believe Flutter will be framework of choice for their 
 upcoming Fuchsia OS, but currently Dart is very much a second 
 class citizen on Android (unlike Kotlin), just like it is on 
 iOS. Fortunately, Google have done a stellar job of creating 
 good dev experience by providing a solid CLI and editor plugins 
 that abstract the platform differences.
Thanks for sharing your experience with Flutter! Being able to do cross-platform apps for iOS, Android and the web did pique my curiosity, so I'm downloading it now to see how it all works.
Oct 04 2019
prev sibling next sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Friday, 4 October 2019 at 14:14:33 UTC, Jacob Carlborg wrote:
 I can give some answers related to iOS, although I'm not that 
 guy working on the iOS project.
oh i thought it was you :S
 For interfacing between the Swift/Objective-C code either the 
 `extern(C)` or `extern(Objective-C)` can be used. DStep [1] can 
 generate bindings to Objective-C code.
So it isn't that different from doing a Mac cocoa program?

I don't have a mac nor an apple developer license though, just a vm. So I probably couldn't do it even if I tried. (the android thing is painful enough, I don't even know how to do it and most of the work is already done for me by others!)
Oct 04 2019
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Fri, Oct 04, 2019 at 03:19:48PM +0000, Adam D. Ruppe via Digitalmars-d wrote:
[...]
 (the android thing is painful enough, I don't even know how to do it
 and most the work is already done for me by others!)
Just in case this is helpful, here's an overview of how I get from sources to .apk in my mixed Java/D project:

1) Generate R.java from Java sources by running `aapt` (from the SDK/NDK; some Linux distros ship it separately as well -- be aware that the version must match the rest of your environment or you might get strange errors / compatibility issues).

2) Build Java sources by invoking `javac`.

3) Run the `dx` tool to transform the resulting .class files to the classes.dex format required by Android.

4) Cross-compile D sources to the target architecture using LDC (in my case, I target cortex-a8 / armv7 androideabi). In my case, I found that I need to compile to object files first, and link separately (see following step).

5) Link the .o files to a .so file using the clang compiler from the Android NDK. This is to ensure that you get the right native libs linked to your D code, otherwise you'll end up with link errors and/or runtime problems. Be sure to specify -shared and the right -soname.

   EXTRA NOTE: In order to optimize the resulting binary size, I also specify -Wl,--gc-sections and -Wl,--version-script=libmyapp.version, where libmyapp.version contains the following:

       LIBGAME_1.0 {
           global: Java_*;
           local: *;
       };

   This is to ensure that --gc-sections deletes all sections not reachable from a symbol that matches the name Java_* (i.e., a JNI entry point). Without this, tons of unused symbols from druntime and Phobos get included in the .so, significantly bloating the APK size. If you're generating a Java-free native app, you'll have to customize the global: line to point to your app's entry point so that it doesn't get GC'd. :-D

6) Create the initial (unaligned) APK file using the following commands:

       aapt package -f -m -M$manifest_file -S$resource_dir -I$sdkjar \
           -F $target-unaligned.apk
       aapt add $target-unaligned.apk classes.dex
       aapt add $target-unaligned.apk path/to/libmyapp.so

   where:

       $manifest_file  is the Android manifest file for your project;
       $sdkjar         points to platforms/android-$version/android.jar,
                       where $version is the Android API version number
                       you're targeting;
       $resource_dir   points to the subdir (generally called `res`)
                       containing various XML resources and other media.

7) Sign the APK:

       apksigner sign --ks-pass file:path/to/passwordfile \
           --ks path/to/keystore $target-unaligned.apk

   (Note: apksigner expects a keystore in the proprietary Java format; if you want to use a different keystore format you'll have to do the signing manually -- google for instructions on how to do this. Be warned that it will not be pretty.)

8) Align the APK:

       zipalign -f 4 $target-unaligned.apk $target.apk

   Now $target.apk should be ready for installation.

Most of the above steps are customarily hidden from view when you use an IDE like Android Studio, and it's mostly automated by Gradle. Ideally, to make the above more accessible to D users, you'd probably want to create a Gradle plugin that integrates the LDC cross-compilation and linking step into Gradle, or whatever IDE / build system you want to target, so that users can just add the plugin to their build config and build away.

I did all of the above manually primarily because I have auxiliary code generators in my project (small D programs running on the host PC that read in a bunch of data files and process them into a form suitable for inclusion into the APK, along the way also generating D wrapper code for working with said data).

Doing it manually is also more direct, and has fewer external dependencies, which makes it faster (using SCons, I can already finish building the darned thing from start to finish in the time it takes just for Gradle to *start up* -- and this, in a fraction of the RAM required to even run Gradle in the first place).

Generally, I don't recommend doing all of this manually. :-D So for your general D user who's uncomfortable with writing build scripts manually, you probably want to look into Gradle integration of some sort, or whatever IDE or build system you want to target.

Note that the current version of dub is unable to handle some of the above steps without heavy hackery / hard-coding of stuff / external scripting. (The cross-compilation step and the need to compile Java are big show-stoppers when it comes to dub.) It would be *very* nice if you could somehow coax dub to do the Right Thing(tm), but I honestly have my doubts about it. You *should* be able to do the cross-compilation and linking step as a Gradle plugin, though, and Gradle will take care of the rest of the APK heavy-lifting for you, so perhaps this could be a first target: get D cross-compilation working in Android Studio. (Unfortunately, I have not bothered to use Gradle to any meaningful degree, so I'm unable to help you on that front.)


T

-- 
The best way to destroy a cause is to defend it poorly.
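[Editor's note: the eight steps above could be collected into a build-script sketch like the one below. Everything here is a placeholder illustration, not the author's actual script: the paths, variable names (SDKJAR, NDK_CLANG, TARGET), API level, and target triple all have to be adapted to your own SDK/NDK layout, and it collapses steps 1 and 6 into a single aapt invocation.]

```shell
#!/bin/sh
# Sketch of the aapt/javac/dx/ldc/apksigner/zipalign sequence above.
# All paths and tool names are assumptions -- adapt before use.
set -e

SDKJAR="$ANDROID_SDK/platforms/android-29/android.jar"
NDK_CLANG="$ANDROID_NDK/toolchains/llvm/prebuilt/linux-x86_64/bin/armv7a-linux-androideabi29-clang"
TARGET=myapp

# Steps 1 + 6a: generate R.java (-J) and start the unaligned APK (-F)
aapt package -f -m -M AndroidManifest.xml -S res -I "$SDKJAR" \
    -J src -F "$TARGET-unaligned.apk"

# Step 2: compile the Java sources
javac -classpath "$SDKJAR" -d obj $(find src -name '*.java')

# Step 3: convert .class files to classes.dex
dx --dex --output=classes.dex obj

# Step 4: cross-compile the D sources with LDC (object files only)
ldc2 -mtriple=armv7-none-linux-androideabi -c src/*.d -od=obj_d

# Step 5: link with the NDK clang; note -shared, -soname, and the
# version script that keeps --gc-sections from stripping JNI entry points
"$NDK_CLANG" -shared -Wl,-soname,libmyapp.so -Wl,--gc-sections \
    -Wl,--version-script=libmyapp.version obj_d/*.o -o libmyapp.so

# Steps 6b-8: add dex + native code, sign, align
aapt add "$TARGET-unaligned.apk" classes.dex
aapt add "$TARGET-unaligned.apk" libmyapp.so
apksigner sign --ks-pass file:ks.pass --ks release.keystore \
    "$TARGET-unaligned.apk"
zipalign -f 4 "$TARGET-unaligned.apk" "$TARGET.apk"
```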
Oct 04 2019
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2019-10-04 17:19, Adam D. Ruppe wrote:

 oh i thought it was you :S
Unfortunately no. It doesn't look like anyone is working on it [1].
 So it isn't that different than doing a Mac cocoa program?
No, it's the same idea. The underlying operating systems are the same (or very similar). They share many of the non-UI frameworks and other libraries.

[1] https://dlang.org/blog/2019/10/04/d-language-foundation-funding-new-platforms-new-bounties/

-- 
/Jacob Carlborg
Oct 04 2019
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2019-10-04 16:14, Jacob Carlborg wrote:

 I can give some answers related to iOS, although I'm not that guy 
 working on the iOS project.
 
 The simplest would probably be to add an external build step in Xcode to 
 compile the D code, which should produce a static library. Then just 
 link with that static library. It's a bit simpler on iOS since 
 everything is native code and there are nothin like JNI that needs to be 
 handled.
 
 For interfacing between the Swift/Objective-C code either the 
 `extern(C)` or `extern(Objective-C)` can be used. DStep [1] can generate 
 bindings to Objective-C code.
 
 I think it should be possible to implement an application completely in 
 D as well.
 
 [1] http://github.com/jacob-carlborg/dstep
I can also add that it's not possible to create plugins for Xcode. It does support extensions, but so far it's only possible to access and manipulate the source code; you can't add support for new languages, new build types or similar.

It's probably possible to build an iOS project without the help of Xcode: just look at the commands Xcode is invoking and replicate those. Dub can probably be used as well, with the help of pre and post actions.

-- 
/Jacob Carlborg
Oct 04 2019
prev sibling parent reply mipri <mipri minimaltype.com> writes:
On Thursday, 3 October 2019 at 12:53:58 UTC, Chris wrote:
 On Thursday, 3 October 2019 at 11:59:08 UTC, mipri wrote:
 The rhetorical value of hyperbolic language like this is that 
 it's so
 rare in these forums. Everyone's just so peppy and ra-ra about 
 D all
 the time that it's very startling to see any dissent.
Let's go through this piece by piece:

    The rhetorical value of hyperbolic language like this

I'm guessing this is what you replied to.

    is that it's so rare in these forums.

Do you agree with that?

    Everyone's just so peppy and ra-ra about D all the time

Do you think anyone would perceive the forums this way?

    it's very startling to see any dissent.

Did you think anyone would be startled by your remarks?

To put it plainly, the forums are characterized by a constant drone of hyperbolic complaints about D, and even a week of exposure is enough time to start tuning it out automatically. I don't read "D has an atrocious ecosystem" and think, oh no, someone had better do something about that! Nor do I feel that I should get in front of this unjust attack on D's honor. What I think is, "if I ask this person what he even means by that, is he even going to tell me?"
 To be clear, dub is NOT "D's ecosystem", it's a tiny part of 
 it, and most issues are not related to dub but to broken 
 packages due to compiler upgrades and other toolchains. For 
 example, is there a tool to streamline D integration into 
 Android and iOS (cf. Nim)? Where are the plugins for the major 
 IDEs? D doesn't care, but it sure won't scale this way.
Oct 03 2019
parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 13:35:09 UTC, mipri wrote:
 Let's go through this piece by piece:

   The rhetorical value of hyperbolic language like this

 I'm guessing this what you replied to.

   is that it's so rare in these forums.

 Do you agree with that?

   Everyone's just so peppy and ra-ra about D all the time

 Do you think anyone would perceive the forums this way?

   it's very startling to see any dissent.

 Did you think anyone would be startled by your remarks?

 To put it plainly, the forums are characterized by a constant 
 drone of
 hyperbolic complaints about D and even a week of exposure is 
 enough
 time to start tuning it out automatically. I don't read "D has 
 an
 atrocious ecosystem" and think, oh no, someone had better do 
 something
 about that! Nor do I feel that I should get in front of this 
 unjust
 attack on D's honor. What I think is, "if I ask this person 
 what he
 even means by that, is he even going to tell me?"


Er, I've raised some issues, as have others over the years. I don't know what your problem is. I'm not trying to shock or troll, but I think it's ok to talk about D on forum.dlang.org? Ain't it? And I'm learning interesting things, and it's interesting to hear what other people (not the fanboys and those with a vested interest) say.

True, I'm not a fan of how things have developed, but who cares? If you wanna be sarcastic, try to be less obscure. That might help. Good luck to you!
Oct 03 2019
parent reply mipri <mipri minimaltype.com> writes:
On Thursday, 3 October 2019 at 13:44:09 UTC, Chris wrote:
 If you wanna be sarcastic, try to be less obscure.
What you did was rhetorical pattern-matching:

    if Chris says <bad things about D>
    and someone else <replies disapprovingly>
    then deploy prepared comeback:
        "Aha! You D fanboys are at it again, attacking the messenger!"

Read what I say instead of lazily continuing your unproductive bitch-fest into pseudo-conversation with me, and you'll catch obvious sarcasm just fine.
 That might help. Good look to you!
Oct 03 2019
next sibling parent Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 13:47:57 UTC, mipri wrote:
 On Thursday, 3 October 2019 at 13:44:09 UTC, Chris wrote:
 If you wanna be sarcastic, try to be less obscure.
 What you did was rhetorical pattern-matching:

     if Chris says <bad things about D>
     and someone else <replies disapprovingly>
     then deploy prepared comeback:
         "Aha! You D fanboys are at it again, attacking the messenger!"

 Read what I say instead of lazily continuing your unproductive 
 bitch-fest into pseudo-conversation with me, and you'll catch 
 obvious sarcasm just fine.
 That might help. Good look to you!
You're a charming chap indeed! Your social skills are amazing. If you don't like the conversation, just ignore it, that's what most people would do. I've quite enjoyed the chat with some of the posters here. And I've learned a lot too. Actually, Nim looks promising.
Oct 03 2019
prev sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 3 October 2019 at 13:47:57 UTC, mipri wrote:
 Read what I say instead of lazily continuing your unproductive
 bitch-fest into pseudo-conversation with me, and you'll catch
 obvious sarcasm just fine.
You went ad hominem. That has been a plague of these forums for years. Unfortunately there is no real forum moderation policy in place, and frankly key developers have had a tendency to go down this road quite frequently. But moving from something technical to something personal is not in the spirit of engineering, and should be avoided, no matter how much you dislike the presented positions.

Most people in this thread have presented reasonable positions. But D is moving slowly in the critical area of memory management, which really is a prerequisite for being a real alternative for many developers. On the other hand, D is moving, so that is a good thing, but I really hope that the D community doesn't scare off people with the theoretical knowledge to make it work. Without those people there is a very slim chance of getting it right.

Other languages are moving too; Nim has been highlighted in this thread, so comparing Nim's situation to D's ought to be on-topic here. I don't really understand why you want Chris to sugar-coat his views.

Again, I really hope that the D community doesn't scare off people with the theoretical knowledge to design memory management solutions that work well.
Oct 03 2019
parent reply mipri <mipri minimaltype.com> writes:
On Thursday, 3 October 2019 at 16:57:13 UTC, Ola Fosheim Grøstad 
wrote:
 On Thursday, 3 October 2019 at 13:47:57 UTC, mipri wrote:
 Read what I say instead of lazily continuing your unproductive
 bitch-fest into pseudo-conversation with me, and you'll catch
 obvious sarcasm just fine.
You went ad-hominem.
Suppose a neighbor deliberately poisons your dog. How should you respond? By poisoning the neighbor's dog? What if you're morally opposed to poisoning animals in general? What if the neighbor doesn't have a dog? Offenses can't always be replied to in kind.

You shouldn't be so alert for specific forms of offense that seeing them blinds you to earlier interactions. For example, even if I wanted to hold a grudge over this rude post of yours, such that I'd wait for an opportunity to jump in *after* you've stopped arguing with someone to say "woah! Let's all settle down!", followed by mischaracterizations of your behavior in the argument, I just wouldn't remember to do it. And I'm morally opposed to posts such as yours, or to holding petty grudges.
 I don't really understand
 why you want Chris to sugar-coat his views.
This is the timeline:

Chris expresses a view that D has an atrocious ecosystem, that you can't just download and use D code like you can in Python and Java, etc.

I then complain mildly and indirectly about the hyperbolic language, but mainly I ask for details on what's so bad about the ecosystem, and write a lot about potential badnesses that Chris could've meant. You probably don't think this is "asking for sugarcoating" yet, right?

Chris then sneers that I am saying that D is "the best language in the world" and that I am attacking him as a heretic, which is extra special annoying as this is blatantly a prepared reply to "someone complaining about his D bashing" rather than a reply to what I actually complained about. Interpersonally, this is interrupting someone and smugly rebutting what you think they're trying to say -- while being wrong about what they mean to say, while characterizing them as a stupid fanboy. Nothing ad hominem here, I guess?

There's then some back and forth about my sarcasm, because Chris offered nothing else to talk about, and I took the bait as I am doing now.
 Again, I really
 hope that the D community don't scare off people with the
 theoretical knowledge to design memory management solutions
 that works well.
Just post about other stuff. Bump other threads. Talk about Nim in the Nim thread instead of about arguments that you didn't pay a lot of attention to. Or if you want to offer some general appeals like your hopes for the D community, do that, and just that.

You don't really care about the timeline above, right? So why make some non-ad-hominem remarks that read a bit like "we really need a moderator in here to handle hotheads that insult people so obviously that I notice it after they fail to get someone else to sugarcoat their views"? What part of that did you think wouldn't lead to more posts about the thing you want to go away?

I've been repeatedly insulted for asking for technical details behind a vacuous and meaningless complaint about D. Because I was interested in them. Because I'm interested in improvements to the thing I saw complained about. I don't feel bad about some mild words like "lazy", "unproductive", or "bitch-fest".

What I've learned is not to bother asking for details about vacuous complaints in here. People can show they care about their own stated positions by being at least a little technical or specific at the outset.
Oct 03 2019
next sibling parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 17:50:09 UTC, mipri wrote:
 On Thursday, 3 October 2019 at 16:57:13 UTC, Ola Fosheim 
 Grøstad wrote:
 [...]
 Suppose a neighbor deliberately poisons your dog. How should you respond? By poisoning the neighbor's dog? What if you're morally opposed to poisoning animals in general? What if the neighbor doesn't have a dog? Offenses can't always be replied to in kind. You shouldn't be so alert for specific forms of offense that seeing them blinds you to earlier interactions. For example, even if I wanted to hold a grudge over this rude post of yours, such that I'd wait for an opportunity to jump in *after* you've stopped arguing with someone to say "woah! Let's all settle down!", followed by mischaracterizations of your behavior in the argument, I just wouldn't remember to do it. And I'm morally opposed to posts such as yours, or to holding petty grudges.
 [...]
 This is the timeline: Chris expresses a view that D has an atrocious ecosystem, that you can't just download and use D code like you can in Python and Java, etc. I then complain mildly and indirectly about the hyperbolic language, but mainly I ask for details on what's so bad about the ecosystem, and write a lot about potential badnesses that Chris could've meant. You probably don't think this is "asking for sugarcoating" yet, right? Chris then sneers that I am saying that D is "the best language in the world" and that I am attacking him as a heretic, which is extra special annoying, as this is blatantly a prepared reply to "someone complaining about his D bashing" rather than a reply to what I actually complained about. Interpersonally, this is interrupting someone and smugly rebutting what you think they're trying to say, while being wrong about what they mean to say and while characterizing them as a stupid fanboy. Nothing ad hominem here, I guess? There's then some back and forth about my sarcasm, because Chris offered nothing else to talk about, and I took the bait, as I am doing now.
 [...]
 Just post about other stuff. Bump other threads. Talk about Nim in the Nim thread instead of about arguments that you didn't pay a lot of attention to. Or if you want to offer some general appeals, like your hopes for the D community, do that, and just that. You don't really care about the timeline above, right? So why make some non-ad-hominem suggestions that read a bit like "we really need a moderator in here to handle hotheads that insult people so obviously that I notice it after they fail to get someone else to sugarcoat their views"? What part of that did you think wouldn't lead to more posts about the thing you want to go away? I've been repeatedly insulted for asking for the technical details behind a vacuous and meaningless complaint about D. Because I was interested in them. Because I'm interested in improvements to the thing I saw complained about. I don't feel bad about some mild words like "lazy", "unproductive", or "bitch-fest". What I've learned is not to bother asking for details about vacuous complaints in here. People can show they care about their own stated positions by being at least a little technical or specific at the outset.
You do realize that it was you who hijacked this thread about Nim and tried to turn it into a flamewar with ad hominems galore.

As for the details, well, anyone who has been on this forum for a while (as most posters here have) knows the details, and honestly, I won't repeat them every time a random poster says "If you don't provide details, I won't take you seriously." Everybody remembers the discussions and issues regarding: ecosystem / tooling, autodecode, GC, ARM support and loads of other stuff that pops up regularly. If you want the details, put the aforementioned terms into the forum search engine.

Also, if you find Ola's reply "offensive", seriously, I don't know what to say. Again, I wish you good luck.
Oct 03 2019
parent reply mipri <mipri minimaltype.com> writes:
On Thursday, 3 October 2019 at 18:09:06 UTC, Chris wrote:
 You do realize that it was you who hijacked this thread about
 Nim and tried to turn it into a flamewar with ad hominems
 galore.
No, I don't realize that. I very kindly wrote my exact interpretation of what happened here in the timeline that you quoted, in the reply that you're replying to. You offered some remarks about D in this thread about Nim and I asked about them. Suppose you'd just answered the question? Suppose you'd even just said "eh I can't be arsed, just search the forum." All of my replies, past the first one, have been to your insults. Do you not wonder if you might be hijacking the thread with your own insults? Please, wonder a little bit about it.
 Also, if you find Ola's reply "offensive", seriously,
What do you mean 'if'? I wasn't subtle about why I found it offensive. I emphasized that, though, as it seems not even to have occurred to him that I might've gotten annoyed with you for any reason other than "you are still not sugarcoating your views as I'd wanted". That's all.
 Again, I wish you good luck.
Twice while preparing this reply it was clear to me that you'd just not bothered to read what I'd said earlier, or to take it seriously. Over the course of a very short conversation, that's three strikes. I probably won't remember not to waste further words on you, so if you mean any kindness with these wishes, please keep the luck and just read my actual words next time.
Oct 03 2019
parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 3 October 2019 at 18:52:44 UTC, mipri wrote:
 I emphasized that though as it seems to have not even
 occurred to him that I might've gotten annoyed with you for any
 reason other than "you are still not sugarcoating your views as
It doesn't matter why you get annoyed. If technology arguments quickly move to a personal/emotional level, then that turns off people with engineering/academic/comp-sci backgrounds who expect technical discussions to stay technical (often with an agreement to disagree). Basically, if things frequently get emotional, such people will most likely not stay around and will move elsewhere, like to Rust.
Oct 04 2019
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 3 October 2019 at 17:50:09 UTC, mipri wrote:
 views"? What part of that didn't you think wouldn't lead to more
 posts about the thing you're wanting to go away?
I really don't care, and it isn't about you; I was pointing out a community symptom. The reality is that there have been enough people in the D community in the past 10 years who had the skill set and the know-how to move D forward in the most difficult theoretical areas. But they don't stick around. And getting those things right does take hard work.
Oct 03 2019
prev sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 3 October 2019 at 10:49:47 UTC, LocoDelPueblo wrote:
 The strategy of "Y is a better X" is to make people move from 
 X to Y. So far, D's successes are not movers; they're new 
 projects, corporate or not.
That is an interesting metric. There is at least one larger project moving from Pascal to D, based on what has been stated in public. Although, I guess, most native languages could have been used as the target if you transition from Pascal.
Oct 03 2019
parent reply LocoDelPueblo <fdp-dyna-hum nowhere.mx> writes:
On Thursday, 3 October 2019 at 11:29:35 UTC, Ola Fosheim Grøstad 
wrote:
 On Thursday, 3 October 2019 at 10:49:47 UTC, LocoDelPueblo 
 wrote:
 The strategy of "Y is a better X" is to make people move 
 from X to Y. So far, D's successes are not movers; they're new 
 projects, corporate or not.
 That is an interesting metric. There is at least one larger project moving from Pascal to D, based on what has been stated in public. Although, I guess, most native languages could have been used as the target if you transition from Pascal.
Yes indeed, I think that this comment is related to Bastian V. Still not an accomplished move, I think. Also, I didn't deny that it can work "to some extent".
Oct 03 2019
parent LocoDelPueblo <fdp-dyna-hum nowhere.mx> writes:
On Thursday, 3 October 2019 at 11:32:13 UTC, LocoDelPueblo wrote:
 On Thursday, 3 October 2019 at 11:29:35 UTC, Ola Fosheim 
 Grøstad wrote:
 On Thursday, 3 October 2019 at 10:49:47 UTC, LocoDelPueblo 
 wrote:
 The strategy of "Y is a better X" is to make people move 
 from X to Y. So far, D's successes are not movers; they're new 
 projects, corporate or not.
 That is an interesting metric. There is at least one larger project moving from Pascal to D, based on what has been stated in public. Although, I guess, most native languages could have been used as the target if you transition from Pascal.
 Yes indeed, I think that this comment is related to Bastian V. Still not an accomplished move, I think. Also, I didn't deny that it can work "to some extent".
and the move is not from cpp to D BTW ;)
Oct 03 2019
prev sibling parent Chris <wendlec tcd.ie> writes:
On Thursday, 3 October 2019 at 09:36:17 UTC, LocoDelPueblo wrote:
 Problem is also that people using python don't want to be 
 programmers... they just do stuff in python, and sometimes 
 these stuff become mainstream. So if the idea is to say "use 
 Nim instead of python" you basically kill the "why" the python 
 stuff started being widely used and developed. You fallback in 
 the "no-way" again.

 I've observed this again two days ago where the UI of for a 
 command line app required python-qt. The application could have 
 been made in cpp, easily, but you know the authors just wanted 
 to make their stuff...but now this ui is available on most 
 linux distribs, no matter the way it's made.
Actually, I had some time to play around with Nim, and it really "feels" like Python. I was thinking that if they had an automatic Python-to-Nim converter (like Kotlin has a Java-to-Kotlin converter), it'd be much easier to make the transition. It'd be great to be able to do something like this: `nim pyToNim file.py file.nim`.
Oct 03 2019
prev sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Thursday, 3 October 2019 at 08:20:55 UTC, Chris wrote:
 Python is often used for prototyping too, and then the real app 
 is written in C++. This begs the question, is D becoming a 
 native Python?
I don't know about that, but I think Python works both ways. Some use Python for application scripting or application plugins; others use Python for the main program and use C/C++ for "plugins". So there is a lot of flexibility in using Python for prototyping that few other platforms provide. For instance, if you design a vector-drawing app, you might start by looking for Python libs, then set it up with a simple interface, then replace each component. Few other languages let you start from that level of "patching things together" (except outdated languages like Perl, Common Lisp or *choke* Visual Basic).

Prototyping in Nim or D doesn't sound very attractive to me. They should probably aim for production. At this point TypeScript/node.js/electron/angular would probably be a more likely candidate for prototyping.
Oct 03 2019
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Tuesday, 1 October 2019 at 10:05:31 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 1 October 2019 at 08:47:19 UTC, Chris wrote:
 1. Walter admits that D only caters for a few users with very 
 specific use cases (niches).
 Which niches are these? My impression is that D primarily caters for users that want the feature-set of C++, but find C++ to be too inconvenient or complicated.
See what I've found: a very wise man.

"[managed languages] will have their share, but native languages will also get a part of the cake. Here most developers seem to be satisfied with the available choices: Go, Rust, C++, C, X (place your favorite one in here). So why should D now (suddenly) get the attention (which it would certainly deserve)? My guess is that it won't be about making D more attractive to new developers. I feel that the mission will be to make D more powerful for a variety of very special scenarios, which will give the language a piece of the cake that [is] not in the mainstream area." (8/29/2015)

https://florian-rappl.de/News/Page/310/is-it-d-comeback

Funnily enough, it wasn't too long after that, in 2017, that I was getting increasingly frustrated with D and started to look for alternatives, as I was beginning to realize that D had become a language for "special scenarios" and that other use cases (e.g. mobile) would never happen.
Oct 02 2019
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Wednesday, 2 October 2019 at 09:57:50 UTC, Chris wrote:
 On Tuesday, 1 October 2019 at 10:05:31 UTC, Ola Fosheim Grøstad 
 wrote:
 On Tuesday, 1 October 2019 at 08:47:19 UTC, Chris wrote:
 1. Walter admits that D only caters for a few users with very 
 specific use cases (niches).
 Which niches are these? My impression is that D primarily caters for users that want the feature-set of C++, but find C++ to be too inconvenient or complicated.
 See what I've found: a very wise man. "[managed languages] will have their share, but native languages will also get a part of the cake. Here most developers seem to be satisfied with the available choices: Go, Rust, C++, C, X (place your favorite one in here). So why should D now (suddenly) get the attention (which it would certainly deserve)? My guess is that it won't be about making D more attractive to new developers. I feel that the mission will be to make D more powerful for a variety of very special scenarios, which will give the language a piece of the cake that [is] not in the mainstream area." (8/29/2015) https://florian-rappl.de/News/Page/310/is-it-d-comeback Funnily enough, it wasn't too long after that, in 2017, that I was getting increasingly frustrated with D and started to look for alternatives, as I was beginning to realize that D had become a language for "special scenarios" and that other use cases (e.g. mobile) would never happen.
That same wise man has this interesting post. https://florian-rappl.de/News/Page/385/designing-programming-language-for-2019
Oct 02 2019
parent Chris <wendlec tcd.ie> writes:
On Wednesday, 2 October 2019 at 11:00:55 UTC, Paulo Pinto wrote:
 That same wise man has this interesting post.

 https://florian-rappl.de/News/Page/385/designing-programming-language-for-2019
Sounds good, in theory, but even the author can foresee problems: "Great thing about this design is that any version of the language works with any version of the compiler (well, at least if the base part has been done correctly - in practice we will certainly see some dependency here)." But basically I think it's a sound approach.
Oct 02 2019
prev sibling parent reply Ecstatic Coder <ecstatic.coder gmail.com> writes:
 I think no matter what you do, C++ folks will complain. They're 
 just triggered by the word GC. That's why every single 
 discussion thread about D outside of this forums starts at the 
 GC. Then someone will mention  nogc or refcounting. Then 
 someone will chime in about how you lose most of the packages 
 and standard library because it assumes GC is present. And then 
 people will just go "oh man, that is so complicated".
Exactly. IMHO the "BetterC" approach won't help D get new adopters. On the contrary, all the effort spent on removing the GC is useless, as the GC is precisely what makes D a better language than C++.

I'm convinced this huge amount of effort should have been spent on making D a better Go/Crystal/etc.

As I said earlier, I *REALLY* wanted to implement Cyclone, my CQL/SQL script runner, in D. The problem is it was SO MUCH easier to do it in Go, despite Go being EXTREMELY limited compared to an object-oriented language like D: no genericity, no parametric polymorphism, no virtual inheritance, etc. But the language and its standard library provide ALL the building blocks you need to implement web servers (coroutines, HTTP, etc.), and the official CQL/SQL database drivers are complete, optimized, well maintained and used extensively.

As I already said, it takes you just a few very simple lines of code to implement this script runner in Go, because you can clearly see that the language itself was designed especially for that: managing HTTP and database connections efficiently. Which is nice, because that's the world most developers live in now.

Putting so much effort into trying to convince the few developers writing C++ embedded applications to switch to D, while D is a very nice garbage-collected language and so many of us write connected applications, is a completely ineffective strategy if D wants more contributors. Because, as one has said above, with so few contributors working in their spare time or funded by the D sponsors, I think the language should strive to remain as SIMPLE as possible, and first be enhanced to provide what most STANDARD developers need, not to add complicated micro-features which are useless to most of us.

And unfortunately, despite their imperfections, it seems that several new languages like Go and Crystal have at least perfectly understood those principles.
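To make the "few very simple lines" claim concrete, here is a minimal sketch of such an HTTP endpoint in Go. The `runScript` handler and the `name` parameter are made up for illustration (a real Cyclone-style runner would execute the named CQL/SQL script through a database driver instead of echoing); only the standard library's `net/http` and `net/http/httptest` are real, and `httptest` is used just so the sketch is self-contained rather than blocking on a listening server:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
)

// runScript is a stand-in for the real handler: a Cyclone-style
// runner would read the script name from the request and execute
// it against the database; here it just echoes the name back.
func runScript(w http.ResponseWriter, r *http.Request) {
	fmt.Fprintf(w, "would run script: %s", r.URL.Query().Get("name"))
}

func main() {
	// In a real deployment this would be:
	//   http.HandleFunc("/run", runScript)
	//   http.ListenAndServe(":8080", nil)
	// For a self-contained demo, exercise the handler in-process:
	srv := httptest.NewServer(http.HandlerFunc(runScript))
	defer srv.Close()

	resp, err := http.Get(srv.URL + "/run?name=init.cql")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(string(body)) // prints "would run script: init.cql"
}
```

In production you'd swap the `httptest` scaffolding for `http.ListenAndServe` and wire the handler to `database/sql` or a Cassandra client; the point is that everything above ships with the Go toolchain.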
Sep 30 2019
next sibling parent reply mipri <mipri minimaltype.com> writes:
On Tuesday, 1 October 2019 at 05:35:22 UTC, Ecstatic Coder wrote:
 As I already said, it takes you just a few very simple lines of 
 code to implement this script runner in Go, because you can 
 clearly see that the language itself was designed especially 
 for that : manage efficiently HTTP and database connections.
I don't disbelieve you, but I also really don't want to learn Go just so that I can understand what D is missing.

Do any of these threads ever result in arewexyet-type webpages, or even wiki lists of features that D would need to be as good at some role as some other technology? Say, a category of "D Gaps", and pages of the pattern, "I looked at using D for <purpose>, and I missed these conveniences: [list]". Some purposes:

1. data science
2. bioinformatics
3. machine learning
4. scripting Microsoft applications through COM
5. writing CGI scripts
6. writing an Apache module
7. rewriting Python and Perl sysadmin scripts
8. mobile app development
9. desktop app development
10. quick CLI tools that fetch some webpages

I've done a lot of these, but for bioinformatics f.e. my list of features ends at:

* they use 'FASTA files'
* they want fast regular expressions
* some preferences stemming from them not being 'real programmers' (not pejoratively, but see the occasional remarks about scientists using Python in https://www.youtube.com/watch?v=9ZxtaccqyWA )

These all seem like valid things to want to do in the current millennium.
 I think the language should strive to remain as SIMPLE as 
 possible, and first be enhanced to provide what most STANDARD 
 developers need
Are you sure that 'STANDARD developers' even exist? This might be unfair but I see a lot of solipsism in complaints like this. To put it a kinder way, I think you are underestimating how valuable your own experience is in your own industry.
Sep 30 2019
parent reply Paulo Pinto <pjmlp progtools.org> writes:
On Tuesday, 1 October 2019 at 06:29:30 UTC, mipri wrote:
 On Tuesday, 1 October 2019 at 05:35:22 UTC, Ecstatic Coder 
 wrote:
 As I already said, it takes you just a few very simple lines 
 of code to implement this script runner in Go, because you can 
 clearly see that the language itself was designed especially 
 for that : manage efficiently HTTP and database connections.
 I don't disbelieve you, but I also really don't want to learn Go just so that I can understand what D is missing. Do any of these threads ever result in arewexyet-type webpages, or even wiki lists of features that D would need to be as good at some role as some other technology? Say, a category of "D Gaps", and pages of the pattern, "I looked at using D for <purpose>, and I missed these conveniences: [list]". Some purposes: 1. data science 2. bioinformatics 3. machine learning 4. scripting Microsoft applications through COM 5. writing CGI scripts 6. writing an Apache module 7. rewriting Python and Perl sysadmin scripts 8. mobile app development 9. desktop app development 10. quick CLI tools that fetch some webpages. I've done a lot of these, but for bioinformatics f.e. my list of features ends at: * they use 'FASTA files' * they want fast regular expressions * some preferences stemming from them not being 'real programmers' (not pejoratively, but see the occasional remarks about scientists using Python in https://www.youtube.com/watch?v=9ZxtaccqyWA ). These all seem like valid things to want to do in the current millennium.
 I think the language should strive to remain as SIMPLE as 
 possible, and first be enhanced to provide what most STANDARD 
 developers need
 Are you sure that 'STANDARD developers' even exist? This might be unfair but I see a lot of solipsism in complaints like this. To put it a kinder way, I think you are underestimating how valuable your own experience is in your own industry.
Scripting Microsoft applications through COM is a valid thing to do in the current millennium, given that since Windows Vista COM has been the bread and butter of Windows APIs (nowadays known as UWP), and recently of device drivers as well (universal drivers). As for "cloud computing", again, it depends on how long this fad might keep going with the cloud computing generation.
Oct 01 2019
parent rikki cattermole <rikki cattermole.co.nz> writes:
On 01/10/2019 8:45 PM, Paulo Pinto wrote:

 Scripting Microsoft applications through COM is a valid thing to 
 do in the current millennium, given that since Windows Vista COM 
 has been the bread and butter of Windows APIs (nowadays known as 
 UWP), and recently of device drivers as well (universal drivers).
The direct equivalent of COM on *nix is D-Bus. And desktop environments like KDE are pushing a lot of behavior through it. So the item is written very specifically, but in a more generic form it is entirely valid.
Oct 01 2019
prev sibling next sibling parent reply bachmeier <no spam.net> writes:
On Tuesday, 1 October 2019 at 05:35:22 UTC, Ecstatic Coder wrote:
 I'm convinced his huge amount of effort should have been spent 
 on making D a better Go/Crystal/etc.

 As I said earlier, I *REALLY* wanted to implement Cyclone, my 
 CQL/SQL script runner in D.

 The problem is it was SO MUCH easier to do it in Go, despite Go 
 being EXTREMELY limited compared to an object-oriented language 
 like D: no genericity, no parametric polymorphism, no virtual 
 inheritance, etc.

 But the language and its standard library provides ALL the 
 building blocks you need to implement web servers (coroutines, 
 HTTP, etc), and the official CQL/SQL database drivers are 
 complete, optimized, well maintained and used extensively.

 As I already said, it takes you just a few very simple lines of 
 code to implement this script runner in Go, because you can 
 clearly see that the language itself was designed especially 
 for that : manage efficiently HTTP and database connections.
But none of this is in any way related to Walter. This is not Walter's area. No matter how many times anyone writes it on the mailing list, Walter will not be writing web servers or database drivers. Your complaint, as written, is a complaint to nobody.

This is not a corporation making decisions on where to spend developer time. It is a volunteer project. Sure, you can vent on the mailing list, but you don't have anyone to be the target of your anger. It's ultimately 100% wasted electrons unless you have a proposal to go along with it that will result in the work getting done. "D should do X" posts do not have any value unless they go a step further to suggest a solution beyond "someone else should do all the work". That is simply reality.

Did you even put together a proposal for GSoC or SAoC? It's easy to generate ideas of what some mythical developer should do. It's much harder to actually make it happen.
Oct 01 2019
parent Ecstatic Coder <ecstatic.coder gmail.com> writes:
 It's easy to generate ideas of what some mythical developer 
 should do. It's much harder to actually make it happen.
I was just saying that the way the D sponsors' funds are used towards C-like application programming doesn't seem optimal to me, as I personally think D is a nice garbage-collected language for general application programming. But hey, that's just my two cents...
Oct 01 2019
prev sibling parent reply Adam D. Ruppe <destructionator gmail.com> writes:
On Tuesday, 1 October 2019 at 05:35:22 UTC, Ecstatic Coder wrote:
 As I said earlier, I *REALLY* wanted to implement Cyclone, my 
 CQL/SQL script runner in D.
are you familiar with my libraries? your code there looks trivial to do in D, the only piece I don't have sitting on my shelf is the cassandra one, and since there's a C library for that, that'd be easily accessible with my framework.

I think it is folly to insist on rewriting everything in your own language/library of choice. I'll agree there - the big guys might be able to do that (or be popular enough to get the other companies to officially support you) but D isn't really in that position. But we do have excellent interop with the lingua franca of apis - C. I have had reliable database code for ages because I just use an interface over the official C client libraries.

HTTP I wrote from scratch, but even that got bootstrapped by reusing the existing stuff - the original cgi interface (which btw is now a misnomer, since it does let you do an embedded server) did the bare minimum to talk to apache and iis servers! And the original http client I did called curl. I have since replaced that with scratch too, but I still think curl is a decent way to do it.

anyway though i just wonder if you tried it and found it sucked (or didn't try it cuz you saw it and was like "this sux" - to this day my docs are super light) or just didn't see it.
Oct 01 2019
next sibling parent Ecstatic Coder <ecstatic.coder gmail.com> writes:
On Tuesday, 1 October 2019 at 12:29:59 UTC, Adam D. Ruppe wrote:
 On Tuesday, 1 October 2019 at 05:35:22 UTC, Ecstatic Coder 
 wrote:
 As I said earlier, I *REALLY* wanted to implement Cyclone, my 
 CQL/SQL script runner in D.
 are you familiar with my libraries? your code there looks trivial to do in D, the only piece I don't have sitting on my shelf is the cassandra one, and since there's a C library for that, that'd be easily accessible with my framework. I think it is folly to insist on rewriting everything in your own language/library of choice. I'll agree there - the big guys might be able to do that (or be popular enough to get the other companies to officially support you) but D isn't really in that position. But we do have excellent interop with the lingua franca of apis - C. I have had reliable database code for ages because I just use an interface over the official C client libraries. HTTP I wrote from scratch, but even that got bootstrapped by reusing the existing stuff - the original cgi interface (which btw is now a misnomer, since it does let you do an embedded server) did the bare minimum to talk to apache and iis servers! And the original http client I did called curl. I have since replaced that with scratch too, but I still think curl is a decent way to do it. anyway though i just wonder if you tried it and found it sucked (or didn't try it cuz you saw it and was like "this sux" - to this day my docs are super light) or just didn't see it.
Indeed, I didn't want to spend my weekend trying to use low-level C libraries or fixing the Cassandra package, because it has been incomplete and abandoned for several years.

As I said, the easy path to success was to use Go, for all the reasons I've explained. Call this laziness if you want, but my spare time was limited to a few hours, and IMO D didn't allow me to easily implement this very simple tool with just a few lines of high-level code like I did in Go.

I understand that D's philosophy is that you must put in effort to do this kind of development with it. My philosophy is to take the tool which allows me to program what I need quickly and efficiently. And for anything related to web and database technologies, D is definitely not the path of least friction.
Oct 01 2019
prev sibling parent JN <666total wp.pl> writes:
On Tuesday, 1 October 2019 at 12:29:59 UTC, Adam D. Ruppe wrote:
 are you familiar with my libraries? your code there looks 
 trivial to do in D, the only piece I don't have sitting on my 
 shelf is the cassandra one, and since there's a C library for 
 that, that'd be easily accessible with my framework.
I think the difference is that, since Go has libraries like http in the standard library (or the de facto standard library, even if it's an external package), they have been included in thousands of projects and gone through a lot of testing. Even though I like your libraries and have used them a few times, they haven't been as battle-tested. There might be vital functionality missing (which you just haven't needed yet), or there might be some nasty bugs lurking which just weren't triggered by the usual use cases.

And if I were making a real web application, not a hobby project, I'd want to fully trust the libraries. I want to include the library, make use of it, and move on to more fun stuff. I don't want to debug the library whenever something breaks.
Oct 01 2019
prev sibling next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Monday, 30 September 2019 at 09:21:30 UTC, Chris wrote:
 D is theoretically in a good position to do a spring cleaning. 
 It has loads of features. Take what really works (battle-tested 
 features), drop all the half-baked features that only a 
 minority really uses. Improve the stability of the language and 
 set up a proper ecosystem (e.g. out of the box compilation for 
 various platforms / architectures).
In theory, yes, but in order to make it easy for open-source contributions to be made, the compiler would have to be made modular and basically rewritten from scratch. A spring cleaning should not only focus on the language itself, but also on attracting new contributors. Quite frankly, a higher regard for basic computer science would go a long way... I think the Rust community has benefited greatly from taking the field of computer science very seriously (perhaps even a bit too far in that direction, but it builds a very solid group identity).
 1. Take D's great features that are battle-tested
 2. See what is not strictly necessary and drop it (dead 
 weight), i.e. figure out what programmers need 90-95% of the 
 time and don't pollute the language with features you only need 
 in the remaining 5-10% of the cases.
 3. Set up a sound and stable ecosystem

 But then again, I fear this will never happen.
It won't happen, and it would only succeed if you focused on a carefully selected set of use cases. Which basically has been D's main problem since D2. D1 had the advantage of being a simple upcoming alternative to C++, which at that point was a niche with no contenders. Today, you have to provide some use cases where the ecosystem has next to no competition and where it excels. There are contenders in most niches... at that point the ecosystem is no longer optional...

It could be Linux. It could be embedded. It could be games development. It could be scientific computing. It could be web development (although in that case the language has to be made more forgiving). It cannot be iPhone, Android, database integration or any other area where the ecosystem is incredibly expensive to both build and maintain. (That includes Python's scripting niche, despite D developers saying that they prefer D over Python for scripting tasks.) You really need a niche to grow from to get that ecosystem going.

Go is a very good example of this. They kept the language small. And the "owner" Google decided that it was primarily good for writing web services and put a lot of resources into the runtime, not the language. The Go language is not a lot better for writing web services than other languages. The runtime and ecosystem in combination with the basic language is what makes it a strong contender in that space. TypeScript, same story.

It is very difficult to make a sales pitch for a language if it does not have an ecosystem that makes it a strong contender in at least one niche. After 20 years... that becomes a problem. Even if the language had been perfect, the lack of a niche where you are a strong contender is problematic. You see this with Kotlin too: it is now a strong contender in the Android niche. That clearly makes a big difference in how the language ecosystem evolves...
Sep 30 2019
parent reply JN <666total wp.pl> writes:
On Monday, 30 September 2019 at 10:13:13 UTC, Ola Fosheim Grøstad 
wrote:
 It won't happen, and it would only succeed if you focus on a 
 carefully selected set of use cases. Which basically has been 
 D's main problem since D2. D1 had the advantage of being a 
 simple upcoming alternative to C++, which at that point was a 
 niche with no contenders.
D1 wasn't simple, but it had a good set of features. It was the best of both worlds, unmanaged and managed. And compared to C++ it looked amazing.
 Go is a very good example of this. They kept the language 
 small. And the "owner" Google decided that it was primarily 
 good for writing web services and put a lot of resources into 
 the runtime, not the language.  The Go language is not a lot 
 better for writing web services than other languages. The 
 runtime and ecosystem in combination with the basic language is 
 what makes it a strong contender in that space.
And yet people now use it for other things they'd use C/C++ for, because the language and ecosystem is good enough.
 TypeScript, same story.
Good timing and an easy transition story. Most people were frustrated with JavaScript for a long time, and the ECMAScript committee took ages to add new features such as classes, which people had been asking for.
Sep 30 2019
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 30 September 2019 at 10:47:14 UTC, JN wrote:
 D1 wasn't simple, but it had a good set of features. It was a 

 Like the best of two worlds of unmanaged and managed. And 
 compared to C++ it looked amazing.
I meant that it was simple compared to C++ and D2. D1 was clearly more complex than Pascal! More importantly, one could prototype things faster in D1 than in C++98, which made it a promising newcomer.

Although, I gave up on the first releases of D1 for pragmatic reasons: compiler quality, error messages, code gen, an underperforming GC, and no real alternative to the GC that improved on C memory management. Yet, at that point those could be seen as baby-illnesses, and developers would come back every once in a while to see if D1 was maturing. Unfortunately, the key issues with D1 weren't fixed, and new features were added instead of making the foundation really solid before adding more. As a result, D2 became more like C++, without being compatible and while still having the baby-illnesses.

I personally think that the D1 feature set was good enough for embedded and simple audio/graphics programming if the rest had been polished thoroughly. There was a lot of enthusiasm for D1 that you don't see for D2, IMHO.
 And yet people now use it for other things they'd use C/C++ 
 for, because the language and ecosystem is good enough.
Yes, but only because hardware has more speed/memory today than when those programs were written in C/C++. This is what has been going on for the past 50 years: code that was written in machine language could be reimplemented in C, then in Java, then in Python or TypeScript. Some software goes from C to Go as well, true, but mostly because that software didn't really require the C feature set.
 TypeScript, same story.
 Good timing and an easy transition story. Most people were frustrated with JavaScript for a long time, and the ECMAScript committee took ages to add new features such as classes, which people had been asking for.
Right, so I guess we have these "parasitic" transitions for C++, Kotlin and TypeScript alike.
Sep 30 2019
prev sibling parent reply JN <666total wp.pl> writes:
On Monday, 30 September 2019 at 09:21:30 UTC, Chris wrote:
 Atm, I'm mainly using Kotlin and I have to say that a small set 
 of clever and well thought-out features can get you a long way. 
 Do I miss some of D's features? Not really, because Kotlin 
 provides enough useful and battle-tested features that you need 
 90% of the time [1]. Once I missed `static if`, but I could 
 live without it.

 Mind you, Kotlin has some restrictions due to the fact that it 
 has to be a 100% compatible with Java/JVM. But even when you 
 use Kotlin/Native (without the Java universe, i.e. modules and 
 libraries) you can get quite far. I think D should aim at that.
I think Kotlin's dependency on Java/JVM is actually a benefit, because they know what worked and didn't work in Java and could work on improving that. Also, they are 100% behind the OOP paradigm, so they can introduce most features that benefit it. For example, features like data classes or singleton objects are very useful, even if they are mostly syntax sugar for certain design patterns.
Sep 30 2019
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 30 September 2019 at 10:14:52 UTC, JN wrote:
 I think Kotlin's dependency on Java/JVM is actually a benefit, 
 because they know what worked and didn't work in Java and could 
 work on improving that.
Actually, it is a big disadvantage, except for one thing: they can piggyback on the ecosystem and cut down the costs of developing a solid runtime system. In that sense Kotlin has the same advantage of being able to gradually take over from Java, which is comparable to how C++ could gradually take over from C.
Sep 30 2019
prev sibling next sibling parent Chris <wendlec tcd.ie> writes:
On Monday, 30 September 2019 at 10:14:52 UTC, JN wrote:

 I think Kotlin's dependency on Java/JVM is actually a benefit, 
 because they know what worked and didn't work in Java and could 
 work on improving that. Also, they are 100% behind the OOP 
 paradigm, so they can introduce most features that benefit 
 that. For example features like data classes or singleton 
 objects are very useful, even if they are mostly a language 
 syntax sugar for certain design patterns.
Yeah, I agree, Kotlin has done a great job on that. Kotlin has added a lot of nice features to the old Java/JVM world. Data classes are very useful, and it's really easy to use singletons. My point was that the authors of Kotlin cannot do whatever they want due to Java/JVM, and still they managed to come up with a sound language.
Sep 30 2019
prev sibling parent Paulo Pinto <pjmlp progtools.org> writes:
On Monday, 30 September 2019 at 10:14:52 UTC, JN wrote:
 On Monday, 30 September 2019 at 09:21:30 UTC, Chris wrote:
 Atm, I'm mainly using Kotlin and I have to say that a small 
 set of clever and well thought-out features can get you a long 
 way. Do I miss some of D's features? Not really, because 
 Kotlin provides enough useful and battle-tested features that 
 you need 90% of the time [1]. Once I missed `static if`, but I 
 could live without it.

 Mind you, Kotlin has some restrictions due to the fact that it 
 has to be a 100% compatible with Java/JVM. But even when you 
 use Kotlin/Native (without the Java universe, i.e. modules and 
 libraries) you can get quite far. I think D should aim at that.
I think Kotlin's dependency on Java/JVM is actually a benefit, because they know what worked and didn't work in Java and could work on improving that. Also, they are 100% behind the OOP paradigm, so they can introduce most features that benefit that. For example features like data classes or singleton objects are very useful, even if they are mostly a language syntax sugar for certain design patterns.
If Kotlin wants to stay relevant on the Java/JVM platform (Android is another matter), they cannot come up with incompatible designs that force FFI jumps between platform semantics and whatever they come up with in Kotlin land. Some examples of this impedance mismatch are already visible, namely memory semantics between Kotlin/JVM and Kotlin/Native, sequences vs JVM streams, and Kotlin lambdas vs JVM SAM types. The same applies to whatever reboot might happen with D.
Sep 30 2019
prev sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Sunday, 29 September 2019 at 20:39:19 UTC, Ecstatic Coder 
wrote:
 And as a "true" C++ alternative, D is far from being the best 
 choice. I'd rather choose Rust, Zig and even Nim for my typical 
 C/C++ use case, despite indeed it's obviously *possible* to use 
 D for that, precisely because I know D's major pain points.
In all fairness, D started out as an alternative to C++, but didn't align its semantics when C++11 came. That would have been necessary to get good integration. At this point it is too late, as Rust and other alternatives have reached critical mass. So, I don't think one can argue this only in terms of "what is" and "what isn't"; timing is a big factor.
Sep 29 2019
prev sibling parent reply GreatSam4sure <greatsam4sure gmail.com> writes:
On Sunday, 29 September 2019 at 01:29:59 UTC, Benjiro wrote:
 On Friday, 27 September 2019 at 15:11:26 UTC, GreatSam4sure 
 wrote:
 The biggest problem to D adoption is this community IMHO. 
 Almost everything here is negative. The community is 95% 
 negative to the language. There are people here that never see 
 anything good about D yet they are here. The community is 
 damaging to D image.

 It is a community that is out to destroy the language. All the 
 hope of me using D has been almost destroyed.
As one of these negative people who from time to time reads up on multiple language forums: D was one of the first languages, after moving away from web dev, that I liked from a structural point of view. I even got the book from Ali. But... the more I worked with it, the more frustration upon frustration kept creeping in, with constant issues:

* Not user friendly tooling (somewhat improved, took only a few years).
* Compiler bugs. Oh, those bugs...
* Upgrading the compiler resulting in packages (like vibe.d) going broke.
* Lacking IDE support (somewhat improved on Visual Studio Code).
* Lacking in packages to easily get going.
* Constant and frustrating changes of direction. BetterC? First get BetterD done, and when you're a big boy, then do this.
* The feeling of being looked down on when suggesting that D is not user friendly. Especially in the past you constantly got this as an answer: "you have an issue, fix it yourself". Yes, this was the constant answer from everybody.
* The snarky and frankly poisonous comments from regular members. Some of those now seem to have left, but D really has some members that got under anybody's skin for daring to mention an issue.
* Focus upon the C++ crowd, when very few C++ developers have any interest in D. Hell, if a C++ developer wants to switch, they have better alternatives with bigger or faster-growing communities.
* A leadership that seems to be stuck in a 1990's mentality. The world has moved on; people expect more these days, and they have choices most old timers did not have. So they are more spoiled with those choices and are not going to put in the time. But when you ignore those "spoiled" people, you never build up a loyal base and you will never motivate them to help out with code or money. You cannot run anything worthwhile with D unless you plan on spending a lot of time writing supporting code yourself. Some companies have no issue with this, but little old me cannot be bothered with it when there are plenty of good alternatives that get the job done just as well.
* I know one of the posters on this forum a bit more personally. We have had some discussions about the different compilers. Just as he used Go for his personal project, I used Crystal for mine. We both found D too much trouble for the advantages it gave. It's a vicious circle and I know it: lack of people/contributors => no packaging / bugs / issues => new people scared away => lack of people/contributors...
* D just is not sexy. I have known D since 10/2016 (when I got Ali's book, Programming in D). Loved the book and the language syntax (especially if you come from PHP), but it was all the rest that ruined it for me. And over the years D seems to have been going in a direction that I hated: new features that had no meaning for me, code-breaking issues, and resources being put into features like BetterC, when all I wanted was a simple HTTP server that worked and kept working across release updates. And more active packages.
* On the subject of Programming in D, I noticed a lot of the books are old. BetterC, the feature where a lot of resources have gone, seems to be ignored in every book on the market.

------------------------------

To give you a nice example of how much of an irritation D can be even today: just now, I tried to install D on Ubuntu because I want to time the current D compile speed vs a few other languages. And I am following the official instructions (https://dlang.org/download.html):

1. sudo wget http://master.dl.sourceforge.net/project/d-apt/files/d-apt.list -O /etc/apt/sources.list.d/d-apt.list
2. sudo apt-get update && sudo apt-get -y --allow-unauthenticated install --reinstall d-apt-keyring
3. sudo apt-get update && sudo apt-get install dmd-compiler dub

Guess what happens on Step 2.
 Err:7 https://netcologne.dl.sourceforge.net/project/d-apt 
 d-apt InRelease
The following signatures couldn't be verified because the public key is not available: NO_PUBKEY EBCF975E5BA24D5E

And this issue was reported a LONG time ago. Yep... Now try Go, Crystal, Nim, Rust, and a lot of other languages. I have installed all of those dozens of times, and an issue like this simply does not show up, or it gets fixed so fast that it never hits me. But whenever it involves D, it's always one thing or another.

------------------------------

Hopefully this explains things from my point of view, where D simply fails me personally. It's late, so I am off to bed.
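[For reference: the NO_PUBKEY error above can usually be worked around by importing the missing repository signing key manually. This is a hedged sketch, not the official d-apt procedure; the key ID comes from the error message, and keyserver availability varies.]

```shell
# Import the d-apt signing key reported missing in the NO_PUBKEY error,
# then retry the update. Assumes keyserver.ubuntu.com is reachable.
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys EBCF975E5BA24D5E
sudo apt-get update
```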
I can sympathize with you in your pain and frustration. It is real. My personal frustration with D is that the language is too good to be where it is today. To me, D is the best language I have ever known, and I am not in any way ignorant of the languages out there.

D packages don't work out of the box. They throw errors here and there. The tooling is also weak and there are few tutorials. Many aspects of D are known to only a few. But the language is truly elegant.

I also see that the leaders have not been able to work with the community coherently. There is too much dichotomy in the forum. Some people have even decided to keep quiet and just observe the trend of things. I really do not blame them. Some have left, like the Android guy.

It is time to change our attitudes and begin to look for the way forward. Negativity is obviously not the solution. Insults and yelling are beside the point. The way forward is a change of philosophy and mindset: a critical examination of why we are the way we are, like a survey that is then backed by implementation, and working coherently together as against working individually.

I think we need humility of heart and humbleness of mind to accept that where we are today is far behind. We are behind not so much for lack of a corporate sponsor but by our own making. Harmony is the first step to growth in a community. Division is one major cause of our backwardness and lack of fast progress. We must realize what is against us and correct it to move to our desired level.

I am willing to help in any capacity I can, but I don't know how, since my programming skill is still low. I need to grow more. If all D packages can work out of the box, the compiler can minimize breakage, and the community can unite, D will become the best language in the world.

Looking forward to a better D.
Sep 30 2019
parent Meta <jared771 gmail.com> writes:
On Monday, 30 September 2019 at 12:25:46 UTC, GreatSam4sure wrote:
 There is too much dichotomy in the forum. Some people have even 
 decided to keep quiet and just observed the trends of things. I 
 really
 did not blame them. Some have left, like the android guy.
To be clear, the "Android Guy", AKA Joakim, left due to having (one or more, can't remember) over-the-line posts deleted. It was a personal problem on his part, not any kind of problem with the D community (which is by no means perfect, obviously). A better example is Kenji Hara.
 It is time to change our attitudes and begin to look for the 
 way forward. Negativity is obviously not the solution. Insult 
 and yelling are out of point. Ways is the way forward a change 
 of philosophy and mindset. A critical examination of why we are 
 the way we are, like a survey that will be back by 
 implementations. Working coherently together as against working 
 individually.

 I think we need the humility of heart and humbleness of mind to 
 accept that where we are today is far behind. We are behind not 
 much of a cooperate sponsor but of our own making. Harmony is 
 the first step to grow in a community. The division is one 
 major cause of our backwardness and lack of fast progress. We 
 much realize what is against us and correct to move to our 
 desired level.
I don't agree that D is "way behind", but its velocity is certainly slower than Go/Rust/C++. I think this comes down to money = acceleration. However, if you look at what's happened even in just the past 7 years, D has "accelerated" dramatically: a yearly conference, real companies successfully using D and even contributing back to D's development, a far more stable compiler and standard library, a thriving package ecosystem... to quote Andrei, it's an "embarrassment of riches".
Sep 30 2019
prev sibling next sibling parent reply Dennis <dkorpel gmail.com> writes:
On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 What do you think about it?
I like how they are going to maintain a stable 1.0 branch and continue developing on a 1.1 branch. I wonder if something like that could work for D.
Sep 25 2019
next sibling parent Chris <wendlec tcd.ie> writes:
On Wednesday, 25 September 2019 at 14:58:43 UTC, Dennis wrote:
 On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 What do you think about it?
I like how they are going to maintain a stable 1.0 branch and continue developing on a 1.1 branch. I wonder if something like that could work for D.
I'd say this should be the standard procedure.
Sep 25 2019
prev sibling parent Paul Backus <snarwin gmail.com> writes:
On Wednesday, 25 September 2019 at 14:58:43 UTC, Dennis wrote:
 On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 What do you think about it?
I like how they are going to maintain a stable 1.0 branch and continue developing on a 1.1 branch. I wonder if something like that could work for D.
I believe this is the intent behind the `-preview` and `-revert` compiler options, though they work on a feature-by-feature basis rather than affecting the entire language at once.
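[For illustration, a rough sketch of how those switches are used on the command line. The exact set of available identifiers depends on the compiler version, so `-preview=?` / `-revert=?` are the authoritative source.]

```shell
# List the identifiers each switch accepts for this compiler version.
dmd -preview=?
dmd -revert=?

# Opt in to a single upcoming change, e.g. DIP 1000 scoped-pointer checks.
dmd -preview=dip1000 app.d
```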
Sep 25 2019
prev sibling next sibling parent reply Rel <relmail rambler.ru> writes:
It is kinda strange to see how the discussion went from "Nim hitting 1.0" to "what I dislike about D" in a matter of a few posts. Obviously programming languages are tools, and there is no ideal programming language. D is opinionated like many other languages; the design and most of the compiler work is mainly done by a few core members of the team. While I think it would be nice to have Rust's automatic memory management at compile time, or Go's awesome standard library, or Zig's error handling enforcement in D, in reality the core members won't be able to keep up with all the new trends in programming languages. You don't have to use D for everything; use D for the stuff that D fits nicely. Or go make D3 with all the features you like (you can use D2 to bootstrap the compiler, use LLVM as a backend, etc.).
Oct 01 2019
next sibling parent Samir <samir aol.com> writes:
On Tuesday, 1 October 2019 at 22:18:05 UTC, Rel wrote:
 It is kinda strange to see, how discussion went from "Nim 
 hitting 1.0" to "what I dislike about D" in a matter of few 
 posts. Obviously programming languages are tools and there is 
 no ideal programming language. D is opinionated like many other 
 languages, the design and most of the compiler work is mainly 
 done by few core members of the team. While I think it would be 
 nice to have Rust's automatic memory management at compile time 
 or Go's awesome standard library or Zig's error handling 
 enforcement in D, in reality core members won't be available to 
 keep up with all new trends in programming languages. You don't 
 have to use D for everything, use D for stuff that D fits 
 nicely. Or go make D3 with all the features you like (you can 
 use D2 to bootstrap the compiler, use LLVM as a backend and 
 etc).
This thread in general, and some earlier posts in particular, inspired me to document why I have been using D lately. I wrote it up here: http://samirparikh.com/blog/adventures-in-dlang.html From a beginner's perspective, it's the active community on these lists that has helped me out the most.
Oct 01 2019
prev sibling parent reply JN <666total wp.pl> writes:
On Tuesday, 1 October 2019 at 22:18:05 UTC, Rel wrote:
 It is kinda strange to see, how discussion went from "Nim 
 hitting 1.0" to "what I dislike about D" in a matter of few 
 posts. Obviously programming languages are tools and there is 
 no ideal programming language. D is opinionated like many other 
 languages, the design and most of the compiler work is mainly 
 done by few core members of the team. While I think it would be 
 nice to have Rust's automatic memory management at compile time 
 or Go's awesome standard library or Zig's error handling 
 enforcement in D, in reality core members won't be available to 
 keep up with all new trends in programming languages. You don't 
 have to use D for everything, use D for stuff that D fits 
 nicely. Or go make D3 with all the features you like (you can 
 use D2 to bootstrap the compiler, use LLVM as a backend and 
 etc).
No one claims that. The general idea expressed in this thread is that D isn't opinionated enough. Would it be nice to have those features? It depends on the person. Personally, I don't care much about Rust's memory management or Zig's error handling; I don't mind GC. But there are people here who do, and for them it could be appealing. As for D3 with all the features you like, I'd rather see D3 with half the features :)
Oct 02 2019
parent reply tester <tester test.aol> writes:
On Wednesday, 2 October 2019 at 07:20:00 UTC, JN wrote:
 On Tuesday, 1 October 2019 at 22:18:05 UTC, Rel wrote:
 As for D3 with all the features you like, I'd rather see D3 
 with half the features :)
Yup - let's go back to something like D1, and I am serious about that.
Oct 02 2019
parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 2 October 2019 at 09:57:40 UTC, tester wrote:
 On Wednesday, 2 October 2019 at 07:20:00 UTC, JN wrote:
 On Tuesday, 1 October 2019 at 22:18:05 UTC, Rel wrote:
 As for D3 with all the features you like, I'd rather see D3 
 with half the features :)
Yup - lets go back to D1 sort of and I am serious aout that.
Yes, something like that, but D2 now has too many features that are being used by the organizations that use D. I think some of the (half-baked) features were actually introduced to cater for their needs. I suppose the organizations that are using D2 now will not accept a trimmed-down version of D. I'd say it's too late.
Oct 02 2019
parent tester <tester test.aol> writes:
On Wednesday, 2 October 2019 at 10:04:44 UTC, Chris wrote:
 Yes, something like that, but now D2 has too many features that 
 are being used by the organizations that use D. I think some of 
 the (half-baked) features were actually introduced to cater for 
 their needs. I suppose that the organizations that are using D2 
 now will not accept a trimmed down version of D by now. I'd say 
 it's too late.
Well, I guess then it is too late for me to start using D. Best of luck for all of you.
Oct 02 2019
prev sibling next sibling parent reply Jack Applegame <japplegame gmail.com> writes:
On Wednesday, 25 September 2019 at 10:01:40 UTC, Rel wrote:
 https://nim-lang.org/blog/2019/09/23/version-100-released.html

 Well, finally Nim team released 1.0. Now future releases 
 shouldn't break people's code and this fact should increase 
 language adoption. Still few things seems to be unfinished 
 (like their NewRuntime thing), but I'd like to congratulate 
 Nim's team with this big release. What do you think about it?
I recently tried Nim. The first good impression was quickly replaced by disappointment. No nested types, constants or functions. Poor generics and templates (but the macros are great). A primitive module system. Unclear error messages. But: Nim's GC (including Boehm) is much faster than D's, and the Nim compiler creates very compact binaries.
Oct 05 2019
parent reply Chris <wendlec tcd.ie> writes:
On Saturday, 5 October 2019 at 08:01:15 UTC, Jack Applegame wrote:

 I recently tried Nim. The first good impression was quickly 
 replaced by disappointment.

 No nested types, constants and functions. Poor generics and 
 templates (but macros are great). Primitive modules system. 
 Unclear error messages.

 But. Nim's GC (including boehm) is much faster then D's one. 
 And Nim compiler creates very compact binaries.
I suppose Nim is not trying to have all the features one could possibly look for in a programming language; rather, it's designed as a "native Python", and that'd be great from my point of view. If researchers used Nim to write the code that they usually write in Python, it'd speed up the development of real-world programs (especially programs where performance is of the essence). There is a trend (started by Google?) to keep PLs simple so that a) non-programmers can write code too and b) the code is more robust. Go is one example and now definitely Dart, i.e. there's only so much you can do / choose from. C++ is out of the question, and D is somewhere between C++ and Python: fast modelling, but way too many features for people who are not really into programming.
Oct 07 2019
next sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 7 October 2019 at 10:38:49 UTC, Chris wrote:
 there's only so much you can do / choose from. C++ is out of 
 the question and D is somewhere between C++ and Python, fast 
 modelling but way too many features for people who are not 
 really into programing.
I don't think features are necessarily a problem. Python has an incredible number of features and is rather complex if you want to master it all, but you usually don't have to deal with that if you just want to solve a problem. So what is important is to have a smooth curve of incremental learning and to keep the features out of the application-level programming interface.

C++ is at the other end of the spectrum, not because of features, but because you cannot make good use of it (outside the C subset) without being proficient.

So when newbies complain about not understanding the documentation for the standard library of a language, that could be a sign of not supporting the incremental learning process as well as desired.

Sort of.
Oct 07 2019
parent reply Chris <wendlec tcd.ie> writes:
On Monday, 7 October 2019 at 14:20:10 UTC, Ola Fosheim Grøstad 
wrote:
 I don't think features necessarily is a problem. Python has an 
 incredible amount of features and is rather complex if you want 
 to master it all, but you usually don't have to deal with it if 
 you just want to solve a problem. So what is important is to 
 have a smooth curve of incremental learning and keep the 
 features out of the application level programming interface.

 C++ is at the other end of the spectrum, not because of 
 features, but because you cannot make good use of it (outside 
 the C-subset) without being proficient.

 So when newbies complain about not understanding the 
 documentation for the standard library of a language, then that 
 could be a sign of not supporting the incremental learning 
 process as well as desired.

 Sort of.
Yes. However, D is a bit like C++ in the sense that you have to be proficient or else it's very hard to make sense of it. Templates, ranges and lambdas are everywhere.
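[For readers unfamiliar with that style, a small illustrative sketch of the kind of range/lambda code that is ubiquitous in Phobos-based D; `iota`, `filter` and `map` are standard library names.]

```d
// Squares of the even numbers below 10, computed lazily
// with ranges and lambda template arguments.
import std.algorithm : filter, map;
import std.range : iota;
import std.stdio : writeln;

void main()
{
    iota(10)
        .filter!(a => a % 2 == 0)
        .map!(a => a * a)
        .writeln; // prints [0, 4, 16, 36, 64]
}
```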
Oct 07 2019
parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Monday, 7 October 2019 at 14:38:01 UTC, Chris wrote:
 Yes. However, D is a bit like C++ in the sense that you have to 
 be proficient or else it's very hard to make sense of it. 
 Templates, ranges and lambdas are everywhere.
Yes, it becomes problematic when you need that kind of knowledge to use the std lib API. The questions newbies ask also suggest that having two different types of pointers increases the learning curve. How arrays reallocate when expanded is another surprise for newbies... and so on. By keeping track of what newbies ask about, one could probably learn a lot, but it has always been the case that seasoned D programmers tend to have more influence over the direction than newbies. Which is probably why the language keeps heading in that direction...
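[As an illustration of that reallocation surprise, a sketch using only D's built-in slice semantics:]

```d
// Slices share memory until an append forces a reallocation.
import std.stdio : writeln;

void main()
{
    int[] a = [1, 2, 3];   // GC-allocated dynamic array
    int[] b = a[0 .. 2];   // b is a slice into a's memory

    b[0] = 99;             // writes through the shared memory
    writeln(a);            // [99, 2, 3]

    b ~= 42;               // b does not end where a's data ends, so
                           // the append reallocates b elsewhere
    b[1] = -1;             // now only changes b's private copy
    writeln(a);            // still [99, 2, 3]
    writeln(b);            // [99, -1, 42]
}
```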
Oct 07 2019
prev sibling parent Russel Winder <russel winder.org.uk> writes:
On Mon, 2019-10-07 at 10:38 +0000, Chris via Digitalmars-d wrote:
[…]
 There is a trend (started by Google?) to keep PLs simple so that
 a) non-programmers can write code too and b) the code is more
 robust. Go is one example and now definitely Dart, i.e. there's
 only so much you can do / choose from. C++ is out of the question
 and D is somewhere between C++ and Python, fast modelling but way
 too many features for people who are not really into programing.
I believe the publicly released rationale for Go:

– A modern update on C: remove the obvious cruft of C, add object-based and GC.
– A language very fast to parse and code generate.
– Include modern approach to parallelism and concurrency in the language: processes and channels on a thread pool.
– Include a modern approach to code reuse – it is not totally clear Go has yet achieved this.
– A language in which Google interns cannot make big errors.

-- 
Russel.
===========================================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk
Oct 07 2019
prev sibling parent reply IGotD- <nise nise.com> writes:
One Achilles heel of Nim is that it compiles via C or C++. This 
means that in a debugging environment you will be looking at the 
environment of the intermediate language. There are workarounds 
for this described here.

https://nim-lang.org/blog/2017/10/02/documenting-profiling-and-debugging-nim-code.html#using-gdb-lldb

However, this doesn't solve everything. Also, all the variables 
are intermixed with generated symbols, probably with some ugly 
hash added to them. Languages like D that compile directly to 
the target will give a better debugging environment.

There is an attempt to make Nim compile via LLVM instead.

https://github.com/arnetheduck/nlvm

I'm not sure if it is in any productive state yet. Also, Nim 
seems to allow intermixing C code and all sorts of built-in C 
features that will not work when going via LLVM. Trying to merge 
Nim with C features is a bad decision in my opinion.
Oct 09 2019
parent reply Chris <wendlec tcd.ie> writes:
On Wednesday, 9 October 2019 at 13:52:18 UTC, IGotD- wrote:
 One Achilles heel of Nim is that it compiles via C or C++. This 
 means in a debugging environment you will be watching the 
 environment of the intermediate language. There workarounds for 
 this described here.

 https://nim-lang.org/blog/2017/10/02/documenting-profiling-and-debugging-nim-code.html#using-gdb-lldb

 However, this doesn't solve everything. Also all the variable 
 are intermixed with generated symbols, probably with some ugly 
 hash added to it. Languages that directly compiles to the 
 target like D will give a better debugging environment.

 There is a attempt make Nim compile via LLVM instead.

 https://github.com/arnetheduck/nlvm

 I'm not sure if it is in any production-ready state yet. Also, 
 Nim seems to allow intermixing C code and all sorts of built-in 
 C features that will not work when going via LLVM. Trying to 
 merge Nim with C features is a bad decision in my opinion.
There are some things I'm not happy with either. But I think, as has been mentioned before, the purpose of Nim is to be a "native Python", i.e. it produces fast, native, standalone binaries and you don't need Cython or an interpreter (Nim does that in one go). So the route via C/C++ seems acceptable from Nim's point of view. As I have no real-world experience with Nim, I don't know how often you'd actually need to go low-level when debugging.
Oct 09 2019
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= <ola.fosheim.grostad gmail.com> writes:
On Wednesday, 9 October 2019 at 14:00:46 UTC, Chris wrote:
 There are some things I'm not happy with either. But I think, 
 as has been mentioned before, the purpose of Nim is to be a 
 "native Python", i.e. it produces fast native stand alone 
 binaries and you don't need Cython or an interpreter (Nim does 
 that in one go).
Interesting that you take that viewpoint; that also seems to be the angle some in the press are going with:

https://www.zdnet.com/article/python-inspired-nim-version-1-0-of-the-programming-language-launches/

I personally think that probably won't work out for Python programmers, but it is at least a good marketing strategy. Although if compiled Python is the goal, I'd think that Go would be the best "VM" today.
 So the route via C/C++ seems acceptable from Nim's point of 
 view. As I have no real world experience with Nim, I don't know 
 how often you'll actually need to go low-level when debugging.
What works reasonably well in the JavaScript world for TypeScript is source maps: you step through the TypeScript code, but the debugger executes the generated JavaScript. Something similar ought to be possible for C, but it would probably take a commercial-level effort to be usable.
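For comparison, enabling source maps in the TypeScript world is a single compiler switch; a minimal tsconfig.json sketch (the outDir value is illustrative):

```json
{
  "compilerOptions": {
    "sourceMap": true,
    "outDir": "dist"
  }
}
```

With "sourceMap": true the compiler emits a .js.map file next to each output file, and any debugger that understands source maps (browser devtools, VS Code) steps through the original .ts sources while actually executing the generated JavaScript.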
Oct 10 2019
parent reply Chris <wendlec tcd.ie> writes:
On Thursday, 10 October 2019 at 08:16:40 UTC, Ola Fosheim Grøstad 
wrote:

 Interesting that you take that viewpoint, that seems also to be 
 the angle some in the press are going with:

 https://www.zdnet.com/article/python-inspired-nim-version-1-0-of-the-programming-language-launches/

 I personally think that probably won't work out for Python 
 programmers, but it is at least a good marketing strategy. 
 Although if compiled Python is the goal the goal, I'd think 
 that Go would be the best "VM" today.
Well, with the advent of GraalVM, which already supports Python [1] and R [2], it will be much harder for Nim to convince Python users and data miners who use R to make the switch. If you have one polyglot VM, you can combine all the widely used tools, e.g. Java/Kotlin for server/backend, desktop and (maybe) apps, and Python and R for stats.

[1] https://www.graalvm.org/docs/reference-manual/languages/python/
[2] https://www.graalvm.org/docs/reference-manual/languages/r/
Oct 10 2019
parent reply Marek Marczak <siloamx gmail.com> writes:
On Thursday, 10 October 2019 at 10:40:00 UTC, Chris wrote:
 Well, with the advent of GraalVM that already supports Python 
 [1] and R [2], it will be much harder for Nim to convince 
 Python users and data miners who use R to make the switch. If 
 you have one VM that is polyglot you can combine all the widely 
 used tools, e.g. Java/Kotlin for server/backend, desktop and 
 (maybe) apps, Python and R for stats.
Ekm... GraalVM has poor Python support. "Warning: The support for Python is experimental. Experimental features might never be included in a production version, or might change significantly before being considered production-ready."
Oct 18 2019
parent Chris <wendlec tcd.ie> writes:
On Friday, 18 October 2019 at 16:28:47 UTC, Marek Marczak wrote:
 On Thursday, 10 October 2019 at 10:40:00 UTC, Chris wrote:
 Well, with the advent of GraalVM that already supports Python 
 [1] and R [2], it will be much harder for Nim to convince 
 Python users and data miners who use R to make the switch. If 
 you have one VM that is polyglot you can combine all the 
 widely used tools, e.g. Java/Kotlin for server/backend, 
 desktop and (maybe) apps, Python and R for stats.
Ekm... GraalVM has poor Python support. "Warning: The support for Python is experimental. Experimental features might never be included in a production version, or might change significantly before being considered production-ready."
Oh, I know that, and they state it prominently on the website. But who knows, in two or three years GraalVM may be truly polyglot and take languages like Python to new places. But you never know. Oracle might just kill the whole thing again.
Oct 20 2019