digitalmars.D - Uphill

reply "Chris" <wendlec tcd.ie> writes:
I was recently thinking that D is a bit like climbing up a hill 
or a mountain. For the most part you are focused on reaching the 
top, yet every once in a while it's good to stop and turn around 
to enjoy the scenery and see how far you've come. So here is what 
I see:

- LDC/GDC: easy to download and use. Nicely packaged.
- DUB: great tool for project management
- DVM: great tool for upgrading from D to D.
- Phobos: has become quite a useful library. Ranges are an 
important part of data processing, I don't wanna miss 'em anymore
- vibe.d: a web server in D.
- Projects in D: LuaD, PyD etc etc.
- the expertise that's involved
[add anything you like]

Mind you, this has been achieved without millions of dollars and 
corporate backing and yet D is a real language with real 
applications (only nobody talks about it). I know, there is still 
a steep climb ahead of D, but let's enjoy the view for a while. 
What has been achieved is by no means trivial.
May 22 2015
next sibling parent Rikki Cattermole <alphaglosined gmail.com> writes:
On 22/05/2015 10:21 p.m., Chris wrote:
 I was recently thinking that D is a bit like climbing up a hill or a
 mountain. For the most part you are focused on reaching the top, yet
 every once in a while it's good to stop and turn around to enjoy the
 scenery and see how far you've come. So here is what I see:

 - LDC/GDC: easy to download and use. Nicely packaged.
 - DUB: great tool for project management
 - DVM: great tool for upgrading from D to D.
 - Phobos: has become quite a useful library. Ranges are an important
 part of data processing, I don't wanna miss 'em anymore
 - vibe.d: a web server in D.
s/web server/web server framework/ My only gripe!
 - Projects in D: LuaD, PyD etc etc.
 - the expertise that's involved
 [add anything you like]

 Mind you, this has been achieved without millions of dollars and
 corporate backing and yet D is a real language with real applications
 (only nobody talks about it). I know, there is still a steep climb ahead
 of D, but let's enjoy the view for a while. What has been achieved is by
 no means trivial.
May 22 2015
prev sibling next sibling parent "Paolo Invernizzi" <paolo.invernizzi no.address> writes:
On Friday, 22 May 2015 at 10:21:18 UTC, Chris wrote:
 I was recently thinking that D is a bit like climbing up a hill 
 or a mountain. For the most part you are focused on reaching 
 the top, yet every once in a while it's good to stop and turn 
 around to enjoy the scenery and see how far you've come. So 
 here is what I see:

 - LDC/GDC: easy to download and use. Nicely packaged.
 - DUB: great tool for project management
 - DVM: great tool for upgrading from D to D.
 - Phobos: has become quite a useful library. Ranges are an 
 important part of data processing, I don't wanna miss 'em 
 anymore
 - vibe.d: a web server in D.
 - Projects in D: LuaD, PyD etc etc.
 - the expertise that's involved
 [add anything you like]

 Mind you, this has been achieved without millions of dollars 
 and corporate backing and yet D is a real language with real 
 applications (only nobody talks about it). I know, there is 
 still a steep climb ahead of D, but let's enjoy the view for a 
 while. What has been achieved is by no means trivial.
Well said! -- Paolo
May 22 2015
prev sibling next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Fri, May 22, 2015 at 10:21:17AM +0000, Chris via Digitalmars-d wrote:
 I was recently thinking that D is a bit like climbing up a hill or a
 mountain. For the most part you are focused on reaching the top, yet
 every once in a while it's good to stop and turn around to enjoy the
 scenery and see how far you've come. So here is what I see:
 
 - LDC/GDC: easy to download and use. Nicely packaged.
 - DUB: great tool for project management
 - DVM: great tool for upgrading from D to D.
 - Phobos: has become quite a useful library. Ranges are an important
 part of data processing, I don't wanna miss 'em anymore
 - vibe.d: a web server in D.
 - Projects in D: LuaD, PyD etc etc.
 - the expertise that's involved
 [add anything you like]
 
 Mind you, this has been achieved without millions of dollars and
 corporate backing and yet D is a real language with real applications
 (only nobody talks about it). I know, there is still a steep climb
 ahead of D, but let's enjoy the view for a while. What has been
 achieved is by no means trivial.
+1, finally, something other than the usual bickering on the forum. ;-)

T

--
Nobody is perfect. I am Nobody. -- pepoluan, GKC forum
May 22 2015
parent reply "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Friday, 22 May 2015 at 14:11:49 UTC, H. S. Teoh wrote:
 +1, finally, something other than the usual bickering on the 
 forum. ;-)
LOL. Don't worry. I'm sure that someone will come along and start griping about something soon. :(

Joking aside, we do seem to frequently have the problem that what we have is good enough that folks expect it to be perfect and thus start complaining about how we don't do something well enough when we actually do it better than most anyone else. D certainly isn't perfect - and we _do_ have areas to improve upon - but what we do have is pretty darn awesome.

- Jonathan M Davis
May 22 2015
next sibling parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Fri, May 22, 2015 at 04:00:28PM +0000, Jonathan M Davis via Digitalmars-d
wrote:
 On Friday, 22 May 2015 at 14:11:49 UTC, H. S. Teoh wrote:
+1, finally, something other than the usual bickering on the forum.
;-)
LOL. Don't worry. I'm sure that someone will come along and start griping about something soon. :( Joking aside, we do seem to frequently have the problem that what we have is good enough that folks expect it to be perfect and thus start complaining about how we don't do something well enough when we actually do it better than most anyone else. D certainly isn't perfect - and we _do_ have areas to improve upon - but we what we do have is pretty darn awesome.
[...]

Agreed, D does have its warts and dark corners, but overall it's extremely awesome. I just can't bring myself to start new projects in any other language these days (unless I get paid to do it, of course). D has totally wrecked my life, and it's all yall's fault!! j/k

T

--
Why have vacation when you can work?? -- EC
May 22 2015
next sibling parent reply "Chris" <wendlec tcd.ie> writes:
On Friday, 22 May 2015 at 17:05:10 UTC, H. S. Teoh wrote:
 On Fri, May 22, 2015 at 04:00:28PM +0000, Jonathan M Davis via 
 Digitalmars-d wrote:
 On Friday, 22 May 2015 at 14:11:49 UTC, H. S. Teoh wrote:
+1, finally, something other than the usual bickering on the 
forum.
;-)
LOL. Don't worry. I'm sure that someone will come along and start griping about something soon. :( Joking aside, we do seem to frequently have the problem that what we have is good enough that folks expect it to be perfect and thus start complaining about how we don't do something well enough when we actually do it better than most anyone else. D certainly isn't perfect - and we _do_ have areas to improve upon - but we what we do have is pretty darn awesome.
[...] Agreed, D does have its warts and dark corners, but overall it's extremely awesome. I just can't bring myself to starting new projects in any other language these days
True, true. If I have the choice, it's D. If it's another language, I very soon start to miss D's features.
 (unless I get paid to do it, of course).
 D has totally wrecked my life, and it's all yall's fault!! j/k


 T
May 22 2015
parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 05/22/2015 01:16 PM, Chris wrote:
 On Friday, 22 May 2015 at 17:05:10 UTC, H. S. Teoh wrote:
 Agreed, D does have its warts and dark corners, but overall it's
 extremely awesome. I just can't bring myself to starting new projects in
 any other language these days
True, true. If I have the choice, it's D. If it's another language, I very soon start to miss D's features.
I've pretty much sworn off all other languages[1]. Life's too damn short to even touch them. I'd sooner switch careers than waste any more of my life on those other langs.

[1] (Aside from occasional nemerle if I need CLR.)
May 30 2015
prev sibling parent "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Friday, 22 May 2015 at 17:05:10 UTC, H. S. Teoh wrote:
 Agreed, D does have its warts and dark corners, but overall it's
 extremely awesome. I just can't bring myself to starting new 
 projects in
 any other language these days (unless I get paid to do it, of 
 course).
 D has totally wrecked my life, and it's all yall's fault!! j/k
LOL. The problem I have is that I'm not learning any new languages anymore. Not only do I never have enough time to do all that I want to get done with D itself, but if I'm working on a project that I care about at all, I want to do it in D.

But in my experience, to really learn a new language, you need to either be using it all the time at work, or you need to be using it as your go-to language for all of your side projects. If you don't, you don't really end up using it enough to really learn it. And since I really want to be doing my side projects in D (and I work in C++), I'm really not experimenting with other languages much, and I'm definitely not learning them well. I haven't even learned C++11/14 properly, because I can't use it at work yet, and almost all the programming I do in my free time is in D.

But everything other than D is just so frustrating anyway. ;)

- Jonathan M Davis
May 22 2015
prev sibling next sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Fri, 22 May 2015 16:00:28 +0000, Jonathan M Davis wrote:

 Joking aside, we do seem to frequently have the problem that what we
 have is good enough that folks expect it to be perfect and thus start
 complaining about how we don't do something well enough when we actually
 do it better than most anyone else. D certainly isn't perfect - and we
 _do_ have areas to improve upon - but we what we do have is pretty darn
 awesome.
yes, this is the case. for D, being good is taken for granted, and being imperfect is seen as devs' laziness. ;-)
May 23 2015
prev sibling parent reply Russel Winder via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Fri, 2015-05-22 at 16:00 +0000, Jonathan M Davis via Digitalmars-d
wrote:
 On Friday, 22 May 2015 at 14:11:49 UTC, H. S. Teoh wrote:
 +1, finally, something other than the usual bickering on the
 forum. ;-)
 LOL. Don't worry. I'm sure that someone will come along and start griping about something soon.
I think much of the problem is pure angst. And bad email list etiquette. On other language mailing lists I am on, a gripe thread lasts about five to 10 emails and then fades away. With this list it generally gets to about 300 or more, usually covering, over time, seven or 20 different topics completely unrelated to the original posting and retained subject line.

Anyway, now D has Rust to contend with, as well as Go and C++.

--
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder
May 23 2015
parent reply "Joakim" <dlang joakim.fea.st> writes:
On Saturday, 23 May 2015 at 14:20:40 UTC, Russel Winder wrote:
 On Fri, 2015-05-22 at 16:00 +0000, Jonathan M Davis via 
 Digitalmars-d
 wrote:
 On Friday, 22 May 2015 at 14:11:49 UTC, H. S. Teoh wrote:
 +1, finally, something other than the usual bickering on the 
 forum. ;-)
LOL. Don't worry. I'm sure that someone will come along and start griping about something soon.
I think much of the problem is pure angst. And bad email list etiquette. In other language mailing lists I am, on a gripe thread lasts about five to 10 emails and then fades away. With this list it generally gets to about 300 or more. Usually covering, over time, seven or 20 different topics completely unrelated to the original posting and retained subject line.
Self-criticism is necessary for improvement.
 Anyway now D has Rust to content with, as well as Go and C++.
Rust's syntax dooms it to the same niche as Haskell. Go isn't full-featured enough to really go after the king, C++. There is only one challenger to C++, and it's D. It's a good sign that C++ has been copying D features recently; it means they're feeling the heat.
May 24 2015
next sibling parent reply "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Sunday, 24 May 2015 at 07:21:19 UTC, Joakim wrote:
 It's a good sign that C++ has been copying D features recently, 
 it means they're feeling the heat.
I suspect that it's not so much that they're really feeling any pressure from D so much as that when they see a cool feature that they think could be made to reasonably fit into C++ and improve it, they do it (or at least try to - obviously, not everything actually makes it in). And some of the improvements that D made are natural enough that the C++ guys could have easily come up with them on their own (whereas others almost had to have come from someone who had seen them in D).

The fact that C++ has taken some of D's features is definitely a good sign for D in that it shows their value, and it means that we're getting some good cross-pollination going on between languages, but I very much doubt that all that many serious C++ folks feel that D is much of a threat to them - not at this point anyway. D is a magnet for folks wanting something better than C++, but we haven't grown enough yet to challenge C++ in any serious way as far as market share goes.

- Jonathan M Davis
May 24 2015
parent reply "weaselcat" <weaselcat gmail.com> writes:
On Sunday, 24 May 2015 at 08:05:37 UTC, Jonathan M Davis wrote:
 On Sunday, 24 May 2015 at 07:21:19 UTC, Joakim wrote:
 It's a good sign that C++ has been copying D features 
 recently, it means they're feeling the heat.
I suspect that it's not so much that they're really feeling any pressure from D so much as that when they see a cool feature that they think could be made to reasonably fit into C++ and improve it, they do it (or at least try to - obviously, not everything actually makes it in). And some of the improvements that D made are natural enough that the C++ guys could have easily come up with them on their own (whereas others almost had to have come from someone who had seen them in D). The fact that C++ has taken some of D's features is definitely a good sign for D in that it shows their value, and it means that we're getting some good cross-pollination going on between languages, but I very much doubt that all that many serious C++ folks feel that D is much of a threat to them - not at this point anyway. D is a magnet for folks wanting something better than C++, but we haven't grown enough yet to challenge C++ in any serious way as far as market share goes. - Jonathan M Davis
IMO I think the worst thing C++ has done is blatantly ignore features that have been 'killer' in D (see: the reaction to the static_if proposal)
May 24 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 5/24/15 1:20 AM, weaselcat wrote:
 IMO I think the worst thing C++ has done is blatantly ignore features
 that have been 'killer' in D(see: the reaction to the static_if proposal)
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/n4461.html -- Andrei
May 24 2015
next sibling parent reply "Marc =?UTF-8?B?U2Now7x0eiI=?= <schuetzm gmx.net> writes:
On Sunday, 24 May 2015 at 16:03:30 UTC, Andrei Alexandrescu wrote:
 On 5/24/15 1:20 AM, weaselcat wrote:
 IMO I think the worst thing C++ has done is blatantly ignore 
 features
 that have been 'killer' in D(see: the reaction to the 
 static_if proposal)
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/n4461.html -- Andrei
Quote: "always going to establish a new scope." This alone will make it much less powerful that D's static if.
May 25 2015
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 5/25/15 3:51 AM, "Marc Schütz" <schuetzm gmx.net>
wrote:
 On Sunday, 24 May 2015 at 16:03:30 UTC, Andrei Alexandrescu wrote:
 On 5/24/15 1:20 AM, weaselcat wrote:
 IMO I think the worst thing C++ has done is blatantly ignore features
 that have been 'killer' in D(see: the reaction to the static_if
 proposal)
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/n4461.html -- Andrei
Quote: "always going to establish a new scope." This alone will make it much less powerful that D's static if.
I know. Politics does have a cost. -- Andrei
May 25 2015
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 5/24/2015 9:03 AM, Andrei Alexandrescu wrote:
 On 5/24/15 1:20 AM, weaselcat wrote:
 IMO I think the worst thing C++ has done is blatantly ignore features
 that have been 'killer' in D(see: the reaction to the static_if proposal)
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/n4461.html -- Andrei
The proposal: Proposal: static if declaration http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2012/n3329.pdf The rebuttal: "Static If" Considered http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3613.pdf Quoted from the rebuttal:
 The static if feature recently proposed for C++ [1, 2] is fundamentally flawed,
 and its adoption would be a disaster for the language. The feature provides a
 single syntax with three distinct semantics, depending on the context of use.
 The primary mechanism of these semantics is to avoid parsing in branches not
 taken. This will make programs harder to read, understand, maintain, and
 debug. It would also impede and possibly prevent the future development of
 other language features, such as concepts. Furthermore, the adoption of this
 feature would seriously compromise our ability to produce AST- based tools
 for C++, and therefore put C++ at a further disadvantage compared to other
 modern languages vis a vis tool support. It would make C++ a lower-level
 language.
May 30 2015
parent reply "Brian Schott" <briancschott gmail.com> writes:
On Sunday, 31 May 2015 at 04:18:32 UTC, Walter Bright wrote:
 Furthermore, the adoption of this
 feature would seriously compromise our ability to produce AST- 
 based tools
 for C++, and therefore put C++ at a further disadvantage 
 compared to other
 modern languages vis a vis tool support.
I find it hilarious that they can say that in a language that needs a preprocessor. Macros (and mixins) destroy AST-based tools, not things like "static if" that are right there in the AST.
May 30 2015
next sibling parent "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Sunday, 31 May 2015 at 06:03:36 UTC, Brian Schott wrote:
 I find it hilarious that they can say that in a language that 
 needs a preprocessor. Macros (and mixins) destroy AST-based 
 tools, not things like "static if" that are right there in the 
 AST.
That's not right. "static if" is just as bad as macros and affects partial evaluation. Besides I think C++ is moving away from macros, and I believe it is a stated goal for BS to do so. I don't use macros in C++ anymore.
May 30 2015
prev sibling parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Sunday, 31 May 2015 at 06:03:36 UTC, Brian Schott wrote:
 On Sunday, 31 May 2015 at 04:18:32 UTC, Walter Bright wrote:
 Furthermore, the adoption of this
 feature would seriously compromise our ability to produce 
 AST- based tools
 for C++, and therefore put C++ at a further disadvantage 
 compared to other
 modern languages vis a vis tool support.
I find it hilarious that they can say that in a language that needs a preprocessor. Macros (and mixins) destroy AST-based tools, not things like "static if" that are right there in the AST.
Using macros in C++ is considered bad style and a sign of someone sticking to Cisms. With meta-programming, templates, strong enums, const, constexpr and inline there are very few valid reasons to use macros other than C copy-paste compatibility.

However, C++ seems to be heading down the route of a library-only language if we look at how it is available on mobile OSes: only as a complement to the main languages, not as the language under the spotlight.

Even on WinRT, C++/CX doesn't seem to get many followers outside the game developer world. To the point that Windows 10 will also expose DirectX as WinRT components (on 8.x it is only directly available to C++).

--
Paulo
May 31 2015
next sibling parent reply "Atila Neves" <atila.neves gmail.com> writes:
While C++ programmers should try and avoid the preprocessor as 
much as possible, sometimes it just isn't possible to do so. 
There's just no other way to generate code sometimes. I know, 
I've tried.

Atila

On Sunday, 31 May 2015 at 07:54:29 UTC, Paulo Pinto wrote:
 On Sunday, 31 May 2015 at 06:03:36 UTC, Brian Schott wrote:
 On Sunday, 31 May 2015 at 04:18:32 UTC, Walter Bright wrote:
 Furthermore, the adoption of this
 feature would seriously compromise our ability to produce 
 AST- based tools
 for C++, and therefore put C++ at a further disadvantage 
 compared to other
 modern languages vis a vis tool support.
I find it hilarious that they can say that in a language that needs a preprocessor. Macros (and mixins) destroy AST-based tools, not things like "static if" that are right there in the AST.
Using macros in C++ is considered bad style and a sign of someone sticking to Cisms. With meta-programming, templates, strong enums, const, constexpr and inline there are very few valid reasons to use macros other than C copy-paste compatibility. However, C++ seems to be really into the route of library only language if we look at how it is available on mobile OS, only as complement to the main languages, not as the language under the spotlight. Even on WinRT, C++/CX doesn't seem to get many followers outside the game developers world. To the point that Windows 10 will also expose DirectX as WinRT components (on 8.x it is only directly available to C++). -- Paulo
May 31 2015
next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Sunday, 31 May 2015 at 08:51:00 UTC, Atila Neves wrote:
 While C++ programmers should try and avoid the preprocessor as 
 much as possible, sometimes it just isn't possible to do so. 
 There's just no other way to generate code sometimes. I know, 
 I've tried.

 Atila
Yes, there is. By using an external tool like in other languages. :) However I do agree that for small things, it doesn't make sense to add an external dependency to the build. -- Paulo
May 31 2015
prev sibling parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Sunday, 31 May 2015 at 08:51:00 UTC, Atila Neves wrote:
 While C++ programmers should try and avoid the preprocessor as 
 much as possible, sometimes it just isn't possible to do so. 
 There's just no other way to generate code sometimes. I know, 
 I've tried.
In what case is this true? If it is only to avoid some boilerplate then it is not a good excuse, IMO. You can usually avoid macros by restructuring composition (using multiple layers of templates).

Anyway, cpp is a separate language from c++, so it only affects AST-related tooling that modifies source files where macros are present. It is overall more separate than built-in textual substitution (which is generally a bad idea) since it is a discrete independent step that a tool easily can apply before analysis.
May 31 2015
parent reply "Atila Neves" <atila.neves gmail.com> writes:
On Sunday, 31 May 2015 at 09:13:33 UTC, Ola Fosheim Grøstad wrote:
 On Sunday, 31 May 2015 at 08:51:00 UTC, Atila Neves wrote:
 While C++ programmers should try and avoid the preprocessor as 
 much as possible, sometimes it just isn't possible to do so. 
 There's just no other way to generate code sometimes. I know, 
 I've tried.
In what case is this true? If it is only to avoid some boiler plate then it is not a good excuse, IMO. You can usually avoid macros by restructuring composition (using multiple layers of templates).
I'll take a macro over boilerplate any day of the week and twice on Sundays. Atila
Jun 01 2015
parent reply Dan Olson <gorox comcast.net> writes:
"Atila Neves" <atila.neves gmail.com> writes:

 On Sunday, 31 May 2015 at 09:13:33 UTC, Ola Fosheim Grøstad wrote:
 On Sunday, 31 May 2015 at 08:51:00 UTC, Atila Neves wrote:
 While C++ programmers should try and avoid the preprocessor as much
 as possible, sometimes it just isn't possible to do so. There's
 just no other way to generate code sometimes. I know, I've tried.
In what case is this true? If it is only to avoid some boiler plate then it is not a good excuse, IMO. You can usually avoid macros by restructuring composition (using multiple layers of templates).
I'll take a macro over boilerplate any day of the week and twice on Sundays.
Timely! I and stack overflow struggled for a couple hours to find an equivalent C++ template for something that was straightforward with a couple macros.
Jun 01 2015
next sibling parent reply "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Monday, 1 June 2015 at 16:09:34 UTC, Dan Olson wrote:
 "Atila Neves" <atila.neves gmail.com> writes:

 On Sunday, 31 May 2015 at 09:13:33 UTC, Ola Fosheim Grøstad 
 wrote:
 On Sunday, 31 May 2015 at 08:51:00 UTC, Atila Neves wrote:
 While C++ programmers should try and avoid the preprocessor 
 as much
 as possible, sometimes it just isn't possible to do so. 
 There's
 just no other way to generate code sometimes. I know, I've 
 tried.
In what case is this true? If it is only to avoid some boiler plate then it is not a good excuse, IMO. You can usually avoid macros by restructuring composition (using multiple layers of templates).
I'll take a macro over boilerplate any day of the week and twice on Sundays.
Timely! I and stack overflow struggled for a couple hours to find an equivalent C++ template for something that was straightforward with a couple macros.
I use macros for stuff like exceptions all the time - e.g.

THROW(MyException, ("This value is wrong: %d", foo));

The macro handles logging the exception, getting and setting the stacktrace on the exception, setting the file and line number of the exception, as well as constructing the string for the exception's message given the arguments - and of course finally throwing it. You _are_ still forced to call format in D (whereas that macro does it for you), but aside from that, the built-in exception stuff does all of that for you by simply throwing a new exception with a message, whereas C++ doesn't even come close. Without a macro, getting all of the information in C++ - and doing it consistently and correctly - would be a big problem.

Yes, macros should be avoided in general, but there are areas where you really don't have much choice, and saying that all macros are unequivocally bad is quite short-sighted. It's when macros are used when they aren't needed that it's a problem. Coming out of college, I actually believed the dogma that all macros are bad, but experience has shown me that that's just not true. Sure, it would be nice if we had a better solution in C++, but sometimes we just don't.

- Jonathan M Davis
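For contrast, a minimal sketch of the D behaviour Jonathan alludes to, where the throw site's file and line ride along as default constructor arguments (MyException and the surrounding code are illustrative, not his actual code):

import std.format : format;

class MyException : Exception
{
    // file/line default to the location of the `new MyException(...)` expression,
    // so every throw site is recorded without any macro
    this(string msg, string file = __FILE__, size_t line = __LINE__)
    {
        super(msg, file, line);
    }
}

void main()
{
    int foo = 7;
    if (foo != 42)
        throw new MyException(format("This value is wrong: %d", foo));
}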
Jun 01 2015
parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Monday, 1 June 2015 at 16:57:00 UTC, Jonathan M Davis wrote:
 I use macros for stuff like exceptions all the time - e.g.

 THROW(MyException, ("This value is wrong: %d", foo));
ick! Yes, if you want stack-trace like information in release-builds you need to use the macro system, but that's because __FILE__ and __LINE__ are macros! That's a C deficiency. Usually your debugger gets you what you are looking for without this in debug builds, right? And your THROW macro does not help when you receive exceptions from libraries. (I don't use exceptions in C++)
 even come close. Without a macro, getting all of the 
 information in C++ - and doing it consistently and correctly - 
 would be a big problem.
You would need a core dump ;^), but consistently… well, if you don't use libraries.
 Yes, macros should be avoided in general, but there are areas 
 where you really don't have much choice, and saying that all 
 macros are unequivocably bad is quite short-sighted. It's when 
 macros are used when they aren't needed that it's a problem.
It's a problem if you need it. It is almost always used to address language design flaws or other structural flaws.
 Coming out of college, I actually believed the dogma that all 
 macros are bad, but experience has shown me that that's just 
 not true. Sure, it would be nice if we had a better solution in 
 C++, but sometimes we just don't.
Yet some programmers don't actually use it anymore. In C you need it; it is an integral part of the language design. In C++ you can avoid it for the most part or replace it with external code-gen (+ makefile).

(I sometimes use macros for constants like π out of habit, but I consider it a bad habit :-P.)
Jun 01 2015
next sibling parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Monday, 1 June 2015 at 17:17:18 UTC, Ola Fosheim Grøstad wrote:
 On Monday, 1 June 2015 at 16:57:00 UTC, Jonathan M Davis wrote:
 I use macros for stuff like exceptions all the time - e.g.

 THROW(MyException, ("This value is wrong: %d", foo));
ick! Yes, if you want stack-trace like information in release-builds you need to use the macro system, but that's because __FILE__ and __LINE__ are macros! That's a C deficiency.
The SAD thing here is that C++ actually does expensive stack introspection to unwind the stack based on return address, so the C++ runtime _could_ have been designed to look up call site information down the stack at runtime for "low additional cost" even with separate compilation, just like a high level language with stack introspection (in Python: "inspect.stack()[1][0].f_lineno"). Not saying it is a good idea for a system level language, but there is no technical reason to have macro warts like __LINE__ since the full C++ runtime already is bloated. And they could have gotten around it by adding compile time introspection to the language too.

I'd say C has embraced macros for good reasons, as a minimalistic language design strategy (the newest C version uses it for generics), but C++ no longer has an excuse for providing it.
Jun 01 2015
parent reply "Atila Neves" <atila.neves gmail.com> writes:
 I'd say C has embraced macros for good reasons, as a 
 minimalistic language design strategy  (newest C version using 
 it for generics), but C++ has no longer an excuse for providing 
 it.
Excuses for C++:

1. Backwards compatibility with existing C++ code
2. Being able to call C code that depends on macro definitions to actually work
3. The aforementioned cases in which templates can't do the job

need a macro to achieve the equivalent task in C++. It's either that or boilerplate. Yay when I get to write D, boo when I just have to use C++. I feel dirty every time I type `#define`, but I'd feel dirtier if I repeated code all over the place. As always, it's a trade-off.

Atila
Jun 01 2015
parent "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Monday, 1 June 2015 at 18:44:02 UTC, Atila Neves wrote:
 1. Backwards compatibility with existing C++ code
 2. Being able to call C code that depends on macro definitons 
 to actually work
 3. The aforementioned cases in which templates can't do the job


 need a macro to achieve the equivalent task in C++. It's either 
 that or boilerplate. Yay when I get to write D, boo when I just 
 have to use C++. I feel dirty every time I type `#define`, but 
 I'd feel dirtier if I repeated code all over the place. As 
 always, it's a trade-off.
Yes, it is a trade-off. 20 years ago I tried to write terser code and do code gen with macros if possible; now I don't mind some repeated code, as it is often easier to understand later on than indirections, and I often find that building tables etc. with an external tool is cleaner than macros anyway.

Some boilerplate, like SFINAE templates for testing type properties ("does the class have a .length() function"), looks really ugly in C++ compared to newer languages, but I keep those tucked away in a header file, so it is not so bad. The good thing about this is that I don't get tempted to add SFINAE constructs I don't understand, which is an advantage when debugging...

I agree macros are needed for reasonable C interop, but _unfortunately_ the presence of macros is like a sleeping pillow for the C++ language designers that allows them to add new pieces without getting down to a clean core language.

As for mixins, I don't like those either. You can often get around that with analysis that leads to a well thought out design, but I agree that sometimes it is too late for that and boilerplate or macros ensue as a result of evolution… The question then is, would a lack of textual-substitution first aid have made the architects/programmers spend more time on design before coding?

So yes, providing and using macros is a trade-off, but usually of the wrong kind. (E.g. a convenient excuse for not cleaning up the language or the application design.)
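For comparison, the D-side version of the ".length" check mentioned above is close to a one-liner; a sketch (hasLength is an illustrative name):

// "does the type have a .length?" as an eponymous template
enum hasLength(T) = is(typeof(T.init.length));

static assert( hasLength!(int[]));
static assert(!hasLength!int);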
Jun 01 2015
prev sibling parent reply "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Monday, 1 June 2015 at 17:17:18 UTC, Ola Fosheim Grøstad wrote:
 On Monday, 1 June 2015 at 16:57:00 UTC, Jonathan M Davis wrote:
 I use macros for stuff like exceptions all the time - e.g.

 THROW(MyException, ("This value is wrong: %d", foo));
ick! Yes, if you want stack-trace like information in release-builds you need to use the macro system, but that's because __FILE__ and __LINE__ are macros! That's a C deficiency. Usually your debugger gets you what you are looking for without this in debug builds, right?
Goodness no. The exception needs to have that information in it, and I want exceptions logged so that I can track what happened without running in a debugger - and in release mode as well as debug mode. And it's not like I can necessarily reproduce the problem by rerunning the program anyway, so relying on the debugger for this sort of thing just doesn't fly in general, much as it might in simple cases.
 (I don't use exceptions in C++)
My condolences.
 It's a problem if you need it. It is almost always used to 
 address language design flaws or other structural flaws.
Even if that's true, you're still stuck unless they fix the language. If/until C++ provides alternate solutions to the kinds of things that require macros right now, you need macros. I can always wish that C++ were better, but if I'm programming in C++, I have to deal with what it provides. The only way around that is to use another language (like D), and if that were an option, I probably wouldn't be using C++ in the first place.

It sounds to me like you're too against macros for your own good. Sure, they suck, but that's just life with C++ unless you want to make life harder for yourself. Simply wishing that the situation were better doesn't make it so.

- Jonathan M Davis
Jun 01 2015
parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Monday, 1 June 2015 at 18:54:18 UTC, Jonathan M Davis wrote:
 It sounds to me like you're too against macros for your own 
 good. Sure, they suck, but that's just life with C++ unless you 
 want to make life harder for yourself. Simply wishing that the
But it doesn't make my C++ life harder, it makes my C++ life better. I don't need them: where I would use macros in C, I can usually make do with a constexpr or a template and get type safety and more readable code.

But I hope you see my point that __FILE__ and __LINE__ would be better done with introspection, also in D? Such "magic cookies" are signs of a lack of orthogonality. (And it can be done as hidden registers/parameters in the language implementation so that you can have separate compilation and compile time evaluation of introspection where possible.)
Jun 01 2015
parent reply "Jonathan M Davis" <jmdavisProg gmx.com> writes:
On Tuesday, 2 June 2015 at 05:58:43 UTC, Ola Fosheim Grøstad 
wrote:
 On Monday, 1 June 2015 at 18:54:18 UTC, Jonathan M Davis wrote:
 It sounds to me like you're too against macros for your own 
 good. Sure, they suck, but that's just life with C++ unless 
 you want to make life harder for yourself. Simply wishing that 
 the
But it doesn't make my C++ life harder, it makes my C++ life better. I don't need them, where I would use macros in C, I can usually make do with a constexpr or a template and get type safety and more readable code. But I hope you see my point that __FILE__ and __LINE__ would be better done with introspection, also in D? Such "magic cookies" are signs of a lack of orthogonality. (And it can be done as hidden registers/parameters in the language implementation so that you can have separate compilation and compile time evaluation of introspection where possible.)
D's semantics for __FILE__ and __LINE__ are so much better than C++'s. I sorely miss D's semantics for them whenever I'm in C++. Having them get the values from the call site rather than the declaration site is so much more useful that it's not even funny. - Jonathan M Davis
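A minimal sketch of the D semantics being praised here (log is an illustrative name): used as default arguments, __FILE__ and __LINE__ are evaluated at each call site rather than where the function is declared:

import std.stdio : writeln;

void log(string msg, string file = __FILE__, size_t line = __LINE__)
{
    writeln(file, ":", line, ": ", msg);
}

void main()
{
    log("hello"); // reports this file and this call's line, not log's declaration
}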
Jun 01 2015
next sibling parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Tuesday, 2 June 2015 at 06:32:43 UTC, Jonathan M Davis wrote:
 D's semantics for __FILE__ and __LINE__ are so much better than 
 C++'s. I sorely miss D's semantics for them whenever I'm in 
 C++. Having them get the values from the call site rather than 
 the declaration site is so much more useful that it's not even 
 funny.
But you are referring to evaluation of default parameters. Evaluating default parameters at the call site can be problematic, so I don't think this is obvious at all.

All functions have a conceptual object which is the "activation record", often represented by the stack frame (but in some languages a heap object). It would be reasonable to be able to query this "activation record" object: just like you have "this" that refers to the object of a method, you could have a "thisfunction.caller.lineno" etc.
Jun 01 2015
parent reply "deadalnix" <deadalnix gmail.com> writes:
On Tuesday, 2 June 2015 at 06:42:13 UTC, Ola Fosheim Grøstad 
wrote:
 On Tuesday, 2 June 2015 at 06:32:43 UTC, Jonathan M Davis wrote:
 D's semantics for __FILE__ and __LINE__ are so much better 
 than C++'s. I sorely miss D's semantics for them whenever I'm 
 in C++. Having them get the values from the call site rather 
 than the declaration site is so much more useful that it's not 
 even funny.
But you are referring to evaluation of default parameters. Evaluating default parameters at the call site can be problematic, so I don't think this is obvious at all. All functions have a conceptual object which is the "activation record", often represented by the stack frame (but in some languages a heap object). It would be reasonable to be able to query this "activation record" object, just like you have "this" that refers to the object of a method you could have a "thisfunction.caller.lineno" etc.
That would be even greater if they would be chained ! What a debug tool we could get out of this ! I know ! We should call them "stack traces" ! Catchy !
Jun 02 2015
parent "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Tuesday, 2 June 2015 at 08:06:16 UTC, deadalnix wrote:
 That would be even greater if they would be chained ! What a 
 debug tool we could get out of this ! I know ! We should call 
 them "stack traces" ! Catchy !
A stack trace is a dump of activation records, yes. But compile time introspection isn't actually a stack trace.
Jun 02 2015
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2015-06-02 08:32, Jonathan M Davis wrote:

 D's semantics for __FILE__ and __LINE__ are so much better than C++'s. I
 sorely miss D's semantics for them whenever I'm in C++. Having them get
 the values from the call site rather than the declaration site is so
 much more useful that it's not even funny.
Yeah, it's pretty cool. Swift also evaluates them in the same way as D; they even credited D in their blog post :)

--
/Jacob Carlborg
Jun 02 2015
prev sibling parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Monday, 1 June 2015 at 16:09:34 UTC, Dan Olson wrote:
 Timely!  I and stack overflow struggled for a couple hours to 
 find an
 equivalent C++ template for something that was straightforward 
 with a
 couple macros.
…but without an example it is hard to figure out what macros are needed for.
Jun 01 2015
parent reply Dan Olson <gorox comcast.net> writes:
"Ola Fosheim "Grøstad\"" <ola.fosheim.grostad+dlang gmail.com> writes:

 On Monday, 1 June 2015 at 16:09:34 UTC, Dan Olson wrote:
 Timely!  I and stack overflow struggled for a couple hours to find
 an
 equivalent C++ template for something that was straightforward with
 a
 couple macros.
…but without an example it is hard to figure out what macros are needed for.
Stringify - here I want to rapidly prototype code with syscalls that need return values checked, and get nice output when they fail. My C++ template skills are weak and I was unable to come up with an equivalent replacement. Is this a way?

#define STRINGIFY(x) #x
#define STR(x) STRINGIFY(x)
#define SYSCHK(syscall) \
    ({__typeof__(syscall) r = (syscall); \
      if (r == -1) die(__FILE__ ":" STR(__LINE__) ":" #syscall " failed"); \
      r;})

int fd = SYSCHK(open(fname, O_EVTONLY));
int kfd = SYSCHK(kqueue());
struct kevent changes;
...
struct kevent kevs[5];
int n = SYSCHK(kevent(kfd, &changes, 1, kevs, 5, 0));
Jun 03 2015
parent reply Dan Olson <gorox comcast.net> writes:
Dan Olson <gorox comcast.net> writes:
 Stringify - here I want to rapidly prototype code with syscalls that
 need return values checked, and get nice output when they fails. My C++
 template skills are weak and was unable to come up with an equivalent
 replacement.  Is this a way?
Meant "Is there a way?" to do such a thing with templates.
Jun 03 2015
parent "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Wednesday, 3 June 2015 at 07:47:04 UTC, Dan Olson wrote:
 Dan Olson <gorox comcast.net> writes:
 Meant "Is there a way?" to do such a thing with templates.
I don't think there is a way to turn symbols into strings without a table or #preprocessing. In C++ I would personally have used a table instead, but as a quick hack for testing, a macro would do. I am not sure if this should be possible with templates as the actual name of the symbol should be transparent to the type-system. The type system should only care about identity/uniqueness of symbols, not spelling, sorting etc. So if C++ made it available it should not be through the type system IMO.
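For what it's worth, on the D side the "turn a symbol into a string" part needs no preprocessor; a sketch using an alias parameter (describe and retryCount are illustrative names):

import std.stdio : writeln;

int retryCount = 3;

void describe(alias sym)()
{
    // the identifier's spelling is available at compile time via __traits
    writeln(__traits(identifier, sym), " = ", sym);
}

void main()
{
    describe!retryCount(); // prints: retryCount = 3
}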
Jun 03 2015
prev sibling next sibling parent reply "Joakim" <dlang joakim.fea.st> writes:
On Sunday, 31 May 2015 at 07:54:29 UTC, Paulo Pinto wrote:
 However, C++ seems to be really into the route of library only 
 language if we look at how it is available on mobile OS, only 
 as complement to the main languages, not as the language under 
 the spotlight.

 Even on WinRT, C++/CX doesn't seem to get many followers 
 outside the game developers world. To the point that Windows 10 
 will also expose DirectX as WinRT components (on 8.x it is only 
 directly available to C++).
You seem to dismiss game development as some niche, when it is one of the main killer apps driving the mobile boom. Around 80% of the top paid apps on iOS and Android are games:

https://www.appannie.com/apps/google-play/top/united-states/

Most mobile games are written in C/C++/OpenGL, to the point where google even makes their Play Games Services APIs available as C++ headers, which they don't do for most of the rest of their Java-only APIs:

https://developers.google.com/games/services/cpp/gettingStartedAndroid

As for WinRT, almost nobody uses Windows Phone and Windows 8 was a huge bust, especially Modern apps, so that's neither here nor there. You're right that google keeps pushing Java into the spotlight, but it is native development that is actually doing well, pushing Java back into library mode for google's Java-only APIs. Perhaps that's why they compile Java Ahead-Of-Time since the recent Lollipop release. :)
May 31 2015
next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Sunday, 31 May 2015 at 09:08:28 UTC, Joakim wrote:
 On Sunday, 31 May 2015 at 07:54:29 UTC, Paulo Pinto wrote:
 However, C++ seems to be really into the route of library only 
 language if we look at how it is available on mobile OS, only 
 as complement to the main languages, not as the language under 
 the spotlight.

 Even on WinRT, C++/CX doesn't seem to get many followers 
 outside the game developers world. To the point that Windows 
 10 will also expose DirectX as WinRT components (on 8.x it is 
 only directly available to C++).
You seem to dismiss game development as some niche, when it is one of the main killer apps driving the mobile boom. Around 80% of the top paid apps on iOS and Android are games: https://www.appannie.com/apps/google-play/top/united-states/
I don't know how you got that impression. My point was that C++/CX was pushed especially for the MFC users. However, apparently only game developers cared to pick it up and it has been ignored by traditional business developers.

And a big complaint was having to manually make use of COM interop to access DirectX from .NET languages, or being forced to write WinRT wrappers in C++/CX. So as of Windows 10, DirectX is also exposed as WinRT components.
 Most mobile games are written in C/C++/OpenGL, to the point 
 where google even makes their Play Games Services APIs 
 available as C++ headers, which they don't do for most of the 
 rest of their Java-only APIs:

 https://developers.google.com/games/services/cpp/gettingStartedAndroid
If you knew Android development, you would know that the Google Games API was Java only. Here are the videos from Google explaining to NDK users how to write the respective JNI wrappers to the API themselves:

https://www.youtube.com/watch?v=zst3R1OP6Y0

Only after a huge backlash from game developers did they port the API to C++. In many cases those games are being written in Java (LibGDX), Unity, Cocos-2D, with C++ for the graphics and sound layer only.
 As for WinRT, almost nobody uses Windows Phone and Windows 8 
 was a huge bust, especially Modern apps, so that's neither here 
 nor there.
Nobody in the US, yes. In South America, Eastern Europe and some southern European countries it is a bit different. In Portugal, for example, you will see more people with Android and Windows Phone than iOS.
 You're right that google keeps pushing Java into the spotlight, 
 but it is native development that is actually doing well, 
 pushing Java back into library mode for google's Java-only 
 APIs.  Perhaps that's why they compile Java Ahead-Of-Time since 
 the recent Lollipop release. :)
They are late to the game there, as Android was the only platform still using a JIT. iOS never used one, and Windows Phone 8 always had .NET compiled to native code. Dalvik was just a lame "good enough" VM implementation, so I guess they really needed to do something there.

--
Paulo
May 31 2015
prev sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Sun, 31 May 2015 09:08:27 +0000, Joakim wrote:

 Most mobile games are written in C/C++/OpenGL
that will fade away soon. it's safe to ignore that in long-time plan.
May 31 2015
parent reply Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 1 June 2015 at 10:56, ketmar via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Sun, 31 May 2015 09:08:27 +0000, Joakim wrote:

 Most mobile games are written in C/C++/OpenGL
that will fade away soon. it's safe to ignore that in long-time plan.
How so? Game dev's aren't moving away from native code any time soon...
May 31 2015
next sibling parent reply "weaselcat" <weaselcat gmail.com> writes:
On Monday, 1 June 2015 at 03:38:44 UTC, Manu wrote:
 On 1 June 2015 at 10:56, ketmar via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Sun, 31 May 2015 09:08:27 +0000, Joakim wrote:

 Most mobile games are written in C/C++/OpenGL
that will fade away soon. it's safe to ignore that in long-time plan.
How so? Game dev's aren't moving away from native code any time soon...
a large portion of ios, android, and steam games use unity, which outside of the core engine uses mono for programming.
May 31 2015
parent reply Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 1 June 2015 at 14:05, weaselcat via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On Monday, 1 June 2015 at 03:38:44 UTC, Manu wrote:
 On 1 June 2015 at 10:56, ketmar via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Sun, 31 May 2015 09:08:27 +0000, Joakim wrote:

 Most mobile games are written in C/C++/OpenGL
that will fade away soon. it's safe to ignore that in long-time plan.
How so? Game dev's aren't moving away from native code any time soon...
a large portion of ios, android, and steam games use unity, which outside of the core engine uses mono for programming.
Ah, yeah, but Unity itself is all C code. Every modern game has a scripting solution, just that Unity has made that interface front-and-center. Lots of meaty Unity plugins are native too.
May 31 2015
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Monday, 1 June 2015 at 05:14:59 UTC, Manu wrote:
 On 1 June 2015 at 14:05, weaselcat via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Monday, 1 June 2015 at 03:38:44 UTC, Manu wrote:
 On 1 June 2015 at 10:56, ketmar via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Sun, 31 May 2015 09:08:27 +0000, Joakim wrote:

 Most mobile games are written in C/C++/OpenGL
that will fade away soon. it's safe to ignore that in long-time plan.
How so? Game dev's aren't moving away from native code any time soon...
a large portion of ios, android, and steam games use unity, which outside of the core engine uses mono for programming.
Ah, yeah, but Unity itself is all C code. Every modern game has a scripting solution, just that Unity has made that interface front-and-center. Lots of meaty Unity plugins are native too.
IL2CPP.
May 31 2015
parent reply "weaselcat" <weaselcat gmail.com> writes:
On Monday, 1 June 2015 at 05:20:27 UTC, Paulo Pinto wrote:
 On Monday, 1 June 2015 at 05:14:59 UTC, Manu wrote:
 On 1 June 2015 at 14:05, weaselcat via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Monday, 1 June 2015 at 03:38:44 UTC, Manu wrote:
 On 1 June 2015 at 10:56, ketmar via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Sun, 31 May 2015 09:08:27 +0000, Joakim wrote:

 Most mobile games are written in C/C++/OpenGL
that will fade away soon. it's safe to ignore that in long-time plan.
How so? Game dev's aren't moving away from native code any time soon...
a large portion of ios, android, and steam games use unity, which outside of the core engine uses mono for programming.
Ah, yeah, but Unity itself is all C code. Every modern game has a scripting solution, just that Unity has made that interface front-and-center. Lots of meaty Unity plugins are native too.
via IL2CPP.
Only because of mono's license update; it's why they've been using a nearly decade-old mono for so long.
May 31 2015
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Monday, 1 June 2015 at 05:37:39 UTC, weaselcat wrote:
 On Monday, 1 June 2015 at 05:20:27 UTC, Paulo Pinto wrote:
 On Monday, 1 June 2015 at 05:14:59 UTC, Manu wrote:
 On 1 June 2015 at 14:05, weaselcat via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Monday, 1 June 2015 at 03:38:44 UTC, Manu wrote:
 On 1 June 2015 at 10:56, ketmar via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Sun, 31 May 2015 09:08:27 +0000, Joakim wrote:

 Most mobile games are written in C/C++/OpenGL
that will fade away soon. it's safe to ignore that in long-time plan.
How so? Game dev's aren't moving away from native code any time soon...
a large portion of ios, android, and steam games use unity, which outside of the core engine uses mono for programming.
Ah, yeah, but Unity itself is all C code. Every modern game has a scripting solution, just that Unity has made that interface front-and-center. Lots of meaty Unity plugins are native too.
via IL2CPP.
Only because of mono's license update, it's why they've been using a nearly decade old mono for so long.
No, only because they are too cheap to pay for the work of Xamarin. I doubt that the amount of money wasted in Danish salaries for writing a .NET native compiler is cheaper than paying for the licenses.

However, riding the fame wave, it is easy to forget how unknown they were before they first added Mono to their JavaScript and Boo offerings, followed by porting the engine to Windows.

So I don't really get why Xamarin gets the blame and Unity is portrayed as the good guys. For me they are just a company that got lucky using open source and now doesn't want to pay back.

Xamarin is doing great without their money.
May 31 2015
parent "weaselcat" <weaselcat gmail.com> writes:
On Monday, 1 June 2015 at 05:45:56 UTC, Paulo Pinto wrote:
 On Monday, 1 June 2015 at 05:37:39 UTC, weaselcat wrote:
 On Monday, 1 June 2015 at 05:20:27 UTC, Paulo Pinto wrote:
 On Monday, 1 June 2015 at 05:14:59 UTC, Manu wrote:
 On 1 June 2015 at 14:05, weaselcat via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Monday, 1 June 2015 at 03:38:44 UTC, Manu wrote:
 On 1 June 2015 at 10:56, ketmar via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Sun, 31 May 2015 09:08:27 +0000, Joakim wrote:

 Most mobile games are written in C/C++/OpenGL
that will fade away soon. it's safe to ignore that in long-time plan.
How so? Game dev's aren't moving away from native code any time soon...
a large portion of ios, android, and steam games use unity, which outside of the core engine uses mono for programming.
Ah, yeah, but Unity itself is all C code. Every modern game has a scripting solution, just that Unity has made that interface front-and-center. Lots of meaty Unity plugins are native too.
via IL2CPP.
Only because of mono's license update, it's why they've been using a nearly decade old mono for so long.
No, only because they are too cheap to pay for the work of Xamarin. I doubt that the amount of money wasted in Danish salaries for writing a .NET native compiler is cheaper than paying for the licenses.
AFAIK it's heavily based off of mono 2.0 code; it actually directly links against a lot of mono libraries to supply the CLR (or did, anyways).
 However riding the fame wave is easy to forget how unknown they 
 were before they firstly added Mono to their JavaScript and Boo 
 offerings, followed by porting the engine to Windows.
Boo requires mono ;)
 So I don't really get why Xamarin gets the blame and Unity is 
 portraid as the good guys.

 For me they are just a company that got lucky using open source 
 and now doesn't want to pay back.

 Xamarin is doing great without their money.
I wasn't blaming Xamarin; unity owes a lot of their success to them - mono helped them greatly reduce the barriers of indie gamedev.
May 31 2015
prev sibling next sibling parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Monday, 1 June 2015 at 03:38:44 UTC, Manu wrote:
 On 1 June 2015 at 10:56, ketmar via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Sun, 31 May 2015 09:08:27 +0000, Joakim wrote:

 Most mobile games are written in C/C++/OpenGL
that will fade away soon. it's safe to ignore that in long-time plan.
How so? Game dev's aren't moving away from native code any time soon...
Objective-C, Swift, Java (as of Android 5) and .NET on mobile OSes are no less native than D. All of them compile to pure native code. There isn't any VM running on the device.

--
Paulo
May 31 2015
parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 06/01/2015 01:08 AM, Paulo Pinto wrote:
 On Monday, 1 June 2015 at 03:38:44 UTC, Manu wrote:
 On 1 June 2015 at 10:56, ketmar via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Sun, 31 May 2015 09:08:27 +0000, Joakim wrote:

 Most mobile games are written in C/C++/OpenGL
that will fade away soon. it's safe to ignore that in long-time plan.
How so? Game dev's aren't moving away from native code any time soon...
Objective-C, Swift, Java (as of Android 5) and .NET on mobile OSes aren't no less native than D. All of them compile to pure native code. There isn't any VM running on the device.
That may be technically true, but the elimination of the VM still doesn't eliminate all the VM baggage. The need for the languages to support their old VMs does still impose restrictions on the language designers, and thus the language itself, that wouldn't need to exist had there been no VM at all.

Ex: No matter how aggressively AOT is used, manual memory management and controlling memory layout are still gonna be a royal pain in Java/Android5 and .NET. That's unlikely to go away without deprecating support for the VMs. But it IS likely to become more and more of a problem for those languages with performance being increasingly sensitive to data layout. Good news for D though ;)
Jun 02 2015
prev sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Mon, 01 Jun 2015 13:38:28 +1000, Manu via Digitalmars-d wrote:

 On 1 June 2015 at 10:56, ketmar via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On Sun, 31 May 2015 09:08:27 +0000, Joakim wrote:

 Most mobile games are written in C/C++/OpenGL
that will fade away soon. it's safe to ignore that in the long-term plan.
How so? Game devs aren't moving away from native code any time soon...
even PC games tend to migrate to Unity. mobile games will stop using home-made engines very soon, as porting them to each platform is simply a waste of time. so there will be old codebases which nobody will convert anyway, and new codebases that use Unity, Cocos or something like it.

so the "generating native code for mobile platforms" target can be ignored; it's an investment that will take a lot of time and effort with very little benefit.
Jun 01 2015
parent reply "Ola Fosheim =?UTF-8?B?R3LDuHN0YWQi?= writes:
On Monday, 1 June 2015 at 10:37:31 UTC, ketmar wrote:
 even PC games tend to migrate to Unity. mobile games will stop using 
 home-made engines very soon, as porting them to each platform is simply 
 a waste of time. so there will be old codebases which nobody will 
 convert anyway, and new codebases that use Unity, Cocos or something 
 like it.
I don't know, it is hard to stand out if you build on Unity or Cocos unmodified. But I think web browsers are slowly moving towards a situation where you can soon make sensible games in webgl + asm.js using home-made engines using "native javascript" (basically javascript targeting LLVM IR) or pnacl (LLVM IR).  So the distinction between native and non-native is getting blurred. Just take a look at shadertoy and see what people can do in GL shaders that work on the web. And shaders are written to the current hardware if they are to perform well (so "native"). Realtime javascript programming + WebGL quality is going to become more important than native binaries for games if the payment model issues find a solution. I'd say the payment model that the app-stores provide is more important than distributing native code. A web-based game has a distinct, significant marketing advantage. Click on a web ad and instantly find yourself in a game world (free trial). Anything that takes installation is at a disadvantage. Unfortunately, anything that requires entering credit card info is at a disadvantage too...
 so "generating native code for mobile platforms" target can be 
 ignored,
 it's investement that will take alot of time and efforts with 
 very little
 benefit.
I think the focus will shift from "generating native code for specific hardware" to "generating code that effectively translates to native code for specific hardware". Which roughly is the same deal. (Most games are scripty, yes, "paper doll", "cartoony 2D" etc…)
Jun 01 2015
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Mon, 01 Jun 2015 11:10:09 +0000, Ola Fosheim Grøstad wrote:

 But I think web browsers are slowly moving towards a situation where you
 soon can make sensible games in webgl + asm.js using home-made engines
 using "native javascript" (basically javascript targetting LLVM IR) or
 pnacl (LLVM IR).  So the distinction between native and non-native is
 getting blurred.
this whole thing is a complete disaster. instead of designing a simple virtual machine with well-defined commands, they keep uglifying already ugly js. this is so bad that i believe we will live with that for many years (really, i see that the worst technology with as much ugliness as one can stuff into it usually wins; that "web shit" is ugly enough).
Jun 02 2015
next sibling parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Tuesday, 2 June 2015 at 09:18:27 UTC, ketmar wrote:
 On Mon, 01 Jun 2015 11:10:09 +0000, Ola Fosheim Grøstad wrote:

 But I think web browsers are slowly moving towards a situation 
 where you
 soon can make sensible games in webgl + asm.js using home-made 
 engines
 using "native javascript" (basically javascript targetting 
 LLVM IR) or
 pnacl (LLVM IR).  So the distinction between native and 
 non-native is
 getting blurred.
this whole thing is a complete disaster. instead of designing a simple virtual machine with well-defined commands, they keep uglifying already ugly js. this is so bad that i believe we will live with that for many years (really, i see that the worst technology with as much ugliness as one can stuff into it usually wins; that "web shit" is ugly enough).
Thankfully mobile OSes and desktop app stores seem to be on the right track to kill this. http://www.quirksmode.org/blog/archives/2015/05/web_vs_native_l.html
Jun 02 2015
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tue, 02 Jun 2015 09:44:24 +0000, Paulo  Pinto wrote:

 Thankfully mobile OSes and desktop app stores seem to be on the right
 track to kill this.
yet they're pushing chromeos and firefoxos...
Jun 02 2015
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Tuesday, 2 June 2015 at 10:18:59 UTC, ketmar wrote:
 On Tue, 02 Jun 2015 09:44:24 +0000, Paulo  Pinto wrote:

 Thankfully mobile OSes and desktop app stores seem to be on 
 the right
 track to kill this.
yet they're pushing chromeos and firefoxos...
FirefoxOS is going nowhere. http://www.cnet.com/uk/news/mozilla-overhauls-firefox-smartphone-plan-to-focus-on-quality-not-cost/ As for Chromebooks, at least in Germany they are gathering dust in the few stores that bother to try to sell them. They might be on the Amazon US top charts, but I bet most of their users are GNU/Linux users, wiping ChromeOS and using GNU/Linux instead.
Jun 02 2015
next sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 06/02/2015 07:26 AM, Paulo Pinto wrote:
 On Tuesday, 2 June 2015 at 10:18:59 UTC, ketmar wrote:
 On Tue, 02 Jun 2015 09:44:24 +0000, Paulo  Pinto wrote:

 Thankfully mobile OSes and desktop app stores seem to be on the right
 track to kill this.
yet they pushing cromeos and firefoxos...
FirefoxOS is going nowhere. http://www.cnet.com/uk/news/mozilla-overhauls-firefox-smartphone-plan-to-focus-on-quality-not-cost/ As for Chromebooks, at least in Germany they are gathering dust on the few stores that bother to try to sell them. They might be on the Amazon US top charts, but I bet most of its users are GNU/Linux users, wiping ChromeOS and using GNU/Linux instead.
Yes. That's also part of why I don't have much expectation for Tizen, either. It would already have enough of an uphill battle even if the technical aspects were 100% solid. But by promoting HTML5-for-apps, they might very well be riding the wrong end of a last-decade fad.
Jun 02 2015
prev sibling parent reply "weaselcat" <weaselcat gmail.com> writes:
On Tuesday, 2 June 2015 at 11:26:06 UTC, Paulo  Pinto wrote:
 On Tuesday, 2 June 2015 at 10:18:59 UTC, ketmar wrote:
 On Tue, 02 Jun 2015 09:44:24 +0000, Paulo  Pinto wrote:

 Thankfully mobile OSes and desktop app stores seem to be on 
 the right
 track to kill this.
yet they pushing cromeos and firefoxos...
FirefoxOS is going nowhere. http://www.cnet.com/uk/news/mozilla-overhauls-firefox-smartphone-plan-to-focus-on-quality-not-cost/ As for Chromebooks, at least in Germany they are gathering dust on the few stores that bother to try to sell them. They might be on the Amazon US top charts, but I bet most of its users are GNU/Linux users, wiping ChromeOS and using GNU/Linux instead.
They're insanely popular, especially in educational environments. They do everything 98% of modern computer users do, which is generally checking email, browsing Facebook, and using Twitter. My local public high school ordered 800-some of them instead of upgrading their iPads (???), which would have cost far, far more. AFAIK they got the cool ThinkPad versions.
Jun 02 2015
parent reply "Joakim" <dlang joakim.fea.st> writes:
On Tuesday, 2 June 2015 at 22:38:47 UTC, weaselcat wrote:
 They're insanely popular, especially in educational 
 environments. They do everything 98% of modern computer users 
 do, which is generally check email, browse facebook, and use 
 twitter.
Not really. While they do sell some in education, they were 1.8% of the PC market last year, much less than even Macs despite being much cheaper: https://www.petri.com/chromebook-continues-to-be-a-tiny-slice-of-the-pc-market Compare that 5.7 million in sales to a billion Android devices sold last year; native is definitely winning.
Jun 02 2015
parent reply "weaselcat" <weaselcat gmail.com> writes:
On Wednesday, 3 June 2015 at 03:41:39 UTC, Joakim wrote:
 On Tuesday, 2 June 2015 at 22:38:47 UTC, weaselcat wrote:
 They're insanely popular, especially in educational 
 environments. They do everything 98% of modern computer users 
 do, which is generally check email, browse facebook, and use 
 twitter.
Not really. While they do sell some in education, they were 1.8% of the PC market last year, much less than even Macs despite being much cheaper: https://www.petri.com/chromebook-continues-to-be-a-tiny-slice-of-the-pc-market Compare that 5.7 million in sales to a billion Android devices sold last year, native is definitely winning.
Chromebooks weren't even really usable until the latter half of 2013/start of 2014, when Acer/HP/Dell/Toshiba/etc. all got on board and it stopped being just Samsung making them. 2% is huge for less than 2 years. That was the Chromebook revision that featured the ultra-low-power Haswell CPUs (2955U); before that they were incredibly slow and suffered from general netbook issues. And they're not even comparable to an android /phone/. Compare them to tablet sales.
Jun 02 2015
next sibling parent reply "weaselcat" <weaselcat gmail.com> writes:
On Wednesday, 3 June 2015 at 04:36:31 UTC, weaselcat wrote:
 On Wednesday, 3 June 2015 at 03:41:39 UTC, Joakim wrote:
 On Tuesday, 2 June 2015 at 22:38:47 UTC, weaselcat wrote:
 They're insanely popular, especially in educational 
 environments. They do everything 98% of modern computer users 
 do, which is generally check email, browse facebook, and use 
 twitter.
Not really. While they do sell some in education, they were 1.8% of the PC market last year, much less than even Macs despite being much cheaper: https://www.petri.com/chromebook-continues-to-be-a-tiny-slice-of-the-pc-market Compare that 5.7 million in sales to a billion Android devices sold last year, native is definitely winning.
chromebooks weren't even really usable until the latter half of 2013/start of 2014 when Acer/HP/Dell/Toshiba/etc all got on board and it stopped being just Samsung making them. 2% is huge for less than 2 years. That was the chromebook revision that featured the ultra low power Haswell CPUs(2955U,) before that they were incredibly slow and suffered from general netbook issues. And they're not even comparable to an android /phone/. Compare them to tablet sales.
Oh, I forgot the most important part. The Acer C720 was $200 on release; it was the cheapest Chromebook to date. The C700 launched at $349, and the Samsung Series 5 launched at $399, for reference. Before the Haswell iteration they just weren't ready to be a thing.
Jun 02 2015
parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 3 June 2015 at 04:40:14 UTC, weaselcat wrote:
 On Wednesday, 3 June 2015 at 04:36:31 UTC, weaselcat wrote:
 On Wednesday, 3 June 2015 at 03:41:39 UTC, Joakim wrote:
 On Tuesday, 2 June 2015 at 22:38:47 UTC, weaselcat wrote:
 They're insanely popular, especially in educational 
 environments. They do everything 98% of modern computer 
 users do, which is generally check email, browse facebook, 
 and use twitter.
Not really. While they do sell some in education, they were 1.8% of the PC market last year, much less than even Macs despite being much cheaper: https://www.petri.com/chromebook-continues-to-be-a-tiny-slice-of-the-pc-market Compare that 5.7 million in sales to a billion Android devices sold last year, native is definitely winning.
chromebooks weren't even really usable until the latter half of 2013/start of 2014 when Acer/HP/Dell/Toshiba/etc all got on board and it stopped being just Samsung making them. 2% is huge for less than 2 years. That was the chromebook revision that featured the ultra low power Haswell CPUs(2955U,) before that they were incredibly slow and suffered from general netbook issues. And they're not even comparable to an android /phone/. Compare them to tablet sales.
Oh, I forgot the most important part. The acer c720 was $200 on release, it was the cheapest chromebook to date. C700 launched at $349, and the samsung series 5 launched at $399 for reference. Before the haswell iteration they just weren't ready to be a thing.
For that price I can easily get a tablet with a keyboard, with the advantage of real native applications + a web browser. For example, the Lenovo A10-70, just one randomly picked out on German Amazon. Eventually Google will realize they are as useful as WebOS and will merge them with Android. -- Paulo
Jun 03 2015
prev sibling parent reply "Joakim" <dlang joakim.fea.st> writes:
On Wednesday, 3 June 2015 at 04:36:31 UTC, weaselcat wrote:
 chromebooks weren't even really usable until the latter half of 
 2013/start of 2014 when Acer/HP/Dell/Toshiba/etc all got on 
 board and it stopped being just Samsung making them. 2% is huge 
 for less than 2 years. That was the chromebook revision that 
 featured the ultra low power Haswell CPUs(2955U,) before that 
 they were incredibly slow and suffered from general netbook 
 issues.
So you think they're about to break out? I don't see it.
 And they're not even comparable to an android /phone/. Compare 
 them to tablet sales.
Why? Do phones not "do everything 98% of modern computer users do... check email, browse facebook, and use twitter?" Seems like phones have taken over those use cases these days. :)

There is a giant market for devices that don't catch viruses and don't have all kinds of registry settings, but Android and iOS have taken 99+% of that market. I was going to make the same point Paulo just made: just get an Android device and you can put a Chrome browser on there too. I don't see the point of limiting yourself to just the browser, even though that is what a significant fraction of people probably use most of the time.

ChromeOS strikes me as Google trying to use their one hammer everywhere, even when there are no nails, i.e. they're built around the web so they made an OS out of it. But it's frankly kind of a dumb idea, and I don't see it lasting.

They're working on a multi-window mode for Android, early versions of which have been found by those spelunking through the recent Android M preview. Once that's done, I suspect they'll start putting Android on laptops too and kill off Chrome OS.
Jun 03 2015
parent reply "weaselcat" <weaselcat gmail.com> writes:
On Wednesday, 3 June 2015 at 08:34:22 UTC, Joakim wrote:
 On Wednesday, 3 June 2015 at 04:36:31 UTC, weaselcat wrote:
 chromebooks weren't even really usable until the latter half 
 of 2013/start of 2014 when Acer/HP/Dell/Toshiba/etc all got on 
 board and it stopped being just Samsung making them. 2% is 
 huge for less than 2 years. That was the chromebook revision 
 that featured the ultra low power Haswell CPUs(2955U,) before 
 that they were incredibly slow and suffered from general 
 netbook issues.
So you think they're about to break out? I don't see it.
I think cornering 2% of the PC market in 2 years is a pretty big deal.
 And they're not even comparable to an android /phone/. Compare 
 them to tablet sales.
Why? Do phones not "do everything 98% of modern computer users do... check email, browse facebook, and use twitter?" Seems like phones have taken over those use cases these days. :)
because phones are used for communication; my mother has a smartphone, and to her it's a confusing landline phone.
 There is a giant market for devices that don't catch viruses 
 and have all kinds of registry settings, but Android and iOS 
 have taken 99+% of that market.  I was going to make the same 
 point Paulo just made: just get an Android device and you can 
 put a Chrome browser on there too.  I don't see the point of 
 limiting yourself to just the browser, even though that is what 
 a significant fraction of people probably use most of the time.

 ChromeOS strikes me as google trying to use their one hammer 
 everywhere, even when there are no nails, ie they're built 
 around the web so they made an OS out of it.  But it's frankly 
 kind of a dumb idea, I don't see it lasting.

 They're working on a multi-window mode for Android, early 
 versions of which have been found by those spelunking through 
 the recent Android M preview.  Once that's done, I suspect 
 they'll start putting Android on laptops too and kill off 
 Chrome OS.
Chromebooks sell because touchscreens are a gimmick and Android is terrible with a keyboard. But hey, if it didn't work so well, why is Microsoft trying so hard to copy them, going as far as making commercials about how "awful" Chromebooks are, then releasing their own Chromebook - I mean, Stream.
Jun 03 2015
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 3 June 2015 at 08:38:09 UTC, weaselcat wrote:
 On Wednesday, 3 June 2015 at 08:34:22 UTC, Joakim wrote:
 On Wednesday, 3 June 2015 at 04:36:31 UTC, weaselcat wrote:
 chromebooks weren't even really usable until the latter half 
 of 2013/start of 2014 when Acer/HP/Dell/Toshiba/etc all got 
 on board and it stopped being just Samsung making them. 2% is 
 huge for less than 2 years. That was the chromebook revision 
 that featured the ultra low power Haswell CPUs(2955U,) before 
 that they were incredibly slow and suffered from general 
 netbook issues.
So you think they're about to break out? I don't see it.
I think cornering 2% of the PC market in 2 years is a pretty big deal.
 And they're not even comparable to an android /phone/. 
 Compare them to tablet sales.
Why? Do phones not "do everything 98% of modern computer users do... check email, browse facebook, and use twitter?" Seems like phones have taken over those use cases these days. :)
because phones are used for communication, my mother has a smart phone and to her it's a confusing landline phone.
 There is a giant market for devices that don't catch viruses 
 and have all kinds of registry settings, but Android and iOS 
 have taken 99+% of that market.  I was going to make the same 
 point Paulo just made: just get an Android device and you can 
 put a Chrome browser on there too.  I don't see the point of 
 limiting yourself to just the browser, even though that is 
 what a significant fraction of people probably use most of the 
 time.

 ChromeOS strikes me as google trying to use their one hammer 
 everywhere, even when there are no nails, ie they're built 
 around the web so they made an OS out of it.  But it's frankly 
 kind of a dumb idea, I don't see it lasting.

 They're working on a multi-window mode for Android, early 
 versions of which have been found by those spelunking through 
 the recent Android M preview.  Once that's done, I suspect 
 they'll start putting Android on laptops too and kill off 
 Chrome OS.
chromebooks sell because touchscreens are a gimmick and android is terrible with a keyboard. but hey, if it didn't work so well why is Microsoft trying so hard to copy them, going as far as making commercials about how "awful" chromebooks are, then releasing their own chromebook - I mean, stream.
Surfaces have always been full-blown laptops with detachable keyboards. Stream?! I had to search for it and only found the HP Stream model, running a full Windows 8.1 OS, not a browser pretending to be an OS. -- Paulo
Jun 03 2015
next sibling parent reply "weaselcat" <weaselcat gmail.com> writes:
On Wednesday, 3 June 2015 at 08:44:28 UTC, Paulo  Pinto wrote:

 Stream?! I had to search for it, only found the HP Stream 
 model, running a full Windows 8.1 OS, not a browser pretending 
 to be an OS.

 --
 Paulo
Yes, and that "full Windows 8.1 OS" makes it run 2-3x slower than Chromebooks with equivalent hardware. http://www.engadget.com/2014/11/28/hp-stream-11-review/
Jun 03 2015
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 3 June 2015 at 08:48:03 UTC, weaselcat wrote:
 On Wednesday, 3 June 2015 at 08:44:28 UTC, Paulo  Pinto wrote:

 Stream?! I had to search for it, only found the HP Stream 
 model, running a full Windows 8.1 OS, not a browser pretending 
 to be an OS.

 --
 Paulo
Yes, and that "full Windows 8.1 OS" makes it run 2-3x slower than equivalent hardware Chromebooks. http://www.engadget.com/2014/11/28/hp-stream-11-review/
So what, a lousy laptop model from HP vs a lousy OS experience from Google? You will never convince me there is any good stuff in Chromebooks, just as I will never convince you they belong alongside WebOS and Symbian Web Widgets in OS Heaven.
Jun 03 2015
parent "weaselcat" <weaselcat gmail.com> writes:
On Wednesday, 3 June 2015 at 09:27:49 UTC, Paulo  Pinto wrote:
 On Wednesday, 3 June 2015 at 08:48:03 UTC, weaselcat wrote:
 On Wednesday, 3 June 2015 at 08:44:28 UTC, Paulo  Pinto wrote:

 Stream?! I had to search for it, only found the HP Stream 
 model, running a full Windows 8.1 OS, not a browser 
 pretending to be an OS.

 --
 Paulo
Yes, and that "full Windows 8.1 OS" makes it run 2-3x slower than equivalent hardware Chromebooks. http://www.engadget.com/2014/11/28/hp-stream-11-review/
So what, a lousy laptop model from HP vs a lousy OS experience from Google? You will never convince me there is any good stuff in Chromebooks, as well as, I will never convince you they belong alongside WebOS and Symbian Web Widgets in OS Heaven.
the only Chromebooks made by Google are extremely high-end ones...
Jun 03 2015
prev sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 06/03/2015 04:44 AM, Paulo Pinto wrote:
 Surfaces have always been full-blown laptops with detachable keyboards.
Very, very, very low-end laptops (with a high-end price tag) considering their notable lack of I/O and storage, and their sub-par keyboards (by laptop standards anyway. Keyboards: desktop > laptop > surface > tablet > phone). Only the CPU/RAM really matched real laptops, none of the other specs. And that's why all the attention they got didn't really translate into sales. Though surface certainly *has* been improving on all those fronts. They're certainly poised to *become* full-blown laptops with detachable keyboards, but they haven't always been.
Jun 03 2015
prev sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 06/02/2015 05:44 AM, Paulo Pinto wrote:
 Thankfully mobile OSes and desktop app stores seem to be on the right
 track to kill this.

 http://www.quirksmode.org/blog/archives/2015/05/web_vs_native_l.html
+1 billion. My god it's nice to see that finally acknowledged.

"Still, we web developers have spent the last six years in denial. Our working assumption has been that all web sites should be app-like, and therefore tooled up to the hilt."

Love it.

Thing is, one of the biggest reasons (likely even THE biggest reason) for the giant "web for apps" push was the whole no-install aspect. But then instead of actually, y'know, creating a no-install mechanism for applications, which could have been made and well-entrenched by now, the whole tech sector went and blew (probably) billions in $ and time retrofitting "rich, dynamic experience" into a document platform. And for all those blown resources, it's STILL at least as much of a broken mess as it was back when we still thought the IE/Netscape divergence was the greatest damage we could ever inflict on our unfortunate web-based users.

And then on top of all that, Java/Flash/JS/basic-freaking-displaying-a-stupid-little-image demonstrated that even the whole purported "sandboxing" benefit of web apps (and VMs for that matter) was a near total bust.
Jun 02 2015
parent reply "weaselcat" <weaselcat gmail.com> writes:
On Tuesday, 2 June 2015 at 23:04:30 UTC, Nick Sabalausky wrote:
 On 06/02/2015 05:44 AM, Paulo Pinto wrote:
 Thankfully mobile OSes and desktop app stores seem to be on 
 the right
 track to kill this.

 http://www.quirksmode.org/blog/archives/2015/05/web_vs_native_l.html
+1billion My god it's nice to see that finally acknowledged. "Still, we web developers have spent the last six years in denial. Our working assumption has been that all web sites should be app-like, and therefore tooled up to the hilt." Love it. Thing is, one of the biggest reasons (likely even THE biggest reason) for the giant "web for apps" push was the whole no-install aspect. But then instead of actually, y'know, creating a no-install for applications, which could have been made and well-entrenched by now, the whole tech sector went and blew (probably) billions in $ and time retrofitting "rich, dynamic experience" into a document platform. And for all those blown resources, it's STILL at least as much of a broken mess as it was back when we still thought the IE/Netscape divergence was the greatest damage we could ever inflict our unfortunate web-based users. And then on top of all that, Java/Flash/JS/basic-freaking-displaying-a-stupid-little-image demonstrated that even the whole purported "sandboxing" benefit of web apps (and VMs for that matter) was a near total bust.
I have to disable JavaScript on amazon.com to be able to use the site, or else it brings my browser to a crawl. I, for one, am in favor of scrapping JavaScript and replacing it with Lua. At least it has a decent JIT implementation. Fun tidbit: back when the Benchmarks Game had LuaJIT on it, LuaJIT's _interpreter_ would beat V8 in every single benchmark.
Jun 02 2015
next sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Tue, 02 Jun 2015 23:51:13 +0000, weaselcat wrote:

 I, for one, am in favor of scrapping javascript and replacing it with
 lua. At least it has a decent JIT implementation.
it doesn't really matter. be it js, Lua, basic -- it's all equally bad. websites are not applications, and they don't need any scripting at all.
Jun 02 2015
prev sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 06/02/2015 07:51 PM, weaselcat wrote:
 I have to disable javascript on amazon.com to be able to use the site or
 else it brings my browser to a crawl.
When I finally figured out how to hack (by that I mean "load it down with an endless list of add-ons" and THEN waste an evening configuring) a modern Firefox to be tolerable, I thought I was finally in pretty decent shape. (Well, at least until I was subjected to Mozilla's next couple rounds of UI blunders anyway...)

But now that I've finally accepted defeat on "screw cellphones" and been in the Android habit for over a year, I've found myself in a new web hell: Now, no matter what mobile browser I use (Chrome, Firefox, Dolphin, or the amazingly-incorrectly named "Internet"), I have a choice:

A. Use the goofy, straightjacketed, sluggish "mobile" versions of websites (Wikipedia's is particularly bad, what with the auto-collapsing of every section on the entire page *while* you're scrolling and reading, and then again every time you navigate "back" to a section you'd already re-expanded manually. Gee, thanks for closing my book, Wikipedia, never mind that I was reading it.)

Or B. Switch on "request desktop site" mode and get *improved* mobile UX, but pages can easily take 30 seconds to a full minute (each) before becoming responsive enough to accept clicking on a link. And that's for pages that aren't even dynamic beyond the initial flurry of JS onLoad() nonsense.

(Seriously, nearly anything that can be run during onLoad() BELONGS on the server. Why would ANYONE in their right mind EVER make every single client do several Ajax/REST/whatever requests, THEN render the EXACT SAME page every time, for every single incoming request? Instead of, oh, I dunno, rendering the EXACT SAME page ONCE when content ACTUALLY changes and having the server spit THAT out to every request? Not enterprisey enough, I guess. Really, how often does a blog or news site actually post or modify an article? Those exact same pages REALLY need to get completely regenerated on every hit even though NOTHING has changed since the last 500 hits?)

What I find very interesting is that it's consistently big businesses that have the most impossible-to-use sites. Ex: Just look at any site by SCE. You'd almost think they *don't* want any viewers and customers.
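For what it's worth, the render-once idea from the parenthetical above is trivial to sketch in D (a hypothetical toy of mine; renderArticle and the file-based storage are made up): re-render only when the content's timestamp changes and hand the same cached bytes to every other request.

import std.datetime : SysTime;
import std.file : readText, timeLastModified;

string cachedHtml;
SysTime cachedStamp;

// Hypothetical renderer standing in for whatever templating is in use.
string renderArticle(string source)
{
    return "<html><body><pre>" ~ source ~ "</pre></body></html>";
}

// Re-render only when the source file actually changed; otherwise serve
// the bytes rendered last time.
string servePage(string path)
{
    auto stamp = timeLastModified(path);
    if (cachedHtml is null || stamp != cachedStamp)
    {
        cachedHtml = renderArticle(readText(path));  // once per content change
        cachedStamp = stamp;
    }
    return cachedHtml;
}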
Jun 02 2015
prev sibling parent reply "Ola Fosheim Grøstad" writes:
On Tuesday, 2 June 2015 at 09:18:27 UTC, ketmar wrote:
 ugly js. this is so bad that i believe that we will live with 
 that for many years
Possibly, but whatever is available on dominant browsers will be used. In many situations avoiding user-initiated installs is desirable. In most, actually...
Jun 02 2015
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tue, 02 Jun 2015 10:37:40 +0000, Ola Fosheim Grøstad wrote:

 On Tuesday, 2 June 2015 at 09:18:27 UTC, ketmar wrote:
 ugly js. this is so bad that i believe that we will live with that for
 many years
 Possibly, but whatever is available on dominant browsers will be used. In many situations avoiding user-initiated installs is desirable. In most, actually...
and there was a thing that allowed you to use real applications without installing them... java web start! way too ahead of its time, though...
Jun 02 2015
parent reply "Ola Fosheim Grøstad" writes:
On Tuesday, 2 June 2015 at 13:37:47 UTC, ketmar wrote:
 and there was a thing that allowed you to use real applications 
 without installing them... java web start! way too ahead of its 
 time, though...
Java got itself a bad reputation in the 90s by providing a really poor implementation. The situation that is emerging now is that you can integrate different browser technologies, and that browser vendors cooperate, and that makes all the difference.
Jun 02 2015
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tue, 02 Jun 2015 13:43:29 +0000, Ola Fosheim Grøstad wrote:

 The situation that is emerging now is that you can integrate different
 browser technologies, and that browser vendors cooperate, and that makes
 all the difference.
except you can't really integrate browser techs into your program, but you are required to integrate your program into the browser. ;-)
Jun 02 2015
parent reply "Ola Fosheim Grøstad" writes:
On Tuesday, 2 June 2015 at 14:01:48 UTC, ketmar wrote:
 On Tue, 02 Jun 2015 13:43:29 +0000, Ola Fosheim Grøstad wrote:

 The situation that is emerging now is that you can integrate 
 different
 browser technologies, and that browser vendors cooperate, and 
 that makes
 all the difference.
except you can't really integrate browser techs into your program, but you are required to integrate your program into the browser. ;-)
https://github.com/domokit/mojo
Jun 02 2015
parent ketmar <ketmar ketmar.no-ip.org> writes:
On Tue, 02 Jun 2015 17:04:58 +0000, Ola Fosheim Grøstad wrote:

 On Tuesday, 2 June 2015 at 14:01:48 UTC, ketmar wrote:
 On Tue, 02 Jun 2015 13:43:29 +0000, Ola Fosheim Grøstad wrote:
 The situation that is emerging now is that you can integrate different
 browser technologies, and that browser vendors cooperate, and that
 makes all the difference.
except you can't really integrate browser techs in your program, but you are required to integrate your program into browser. ;-)
 https://github.com/domokit/mojo
exactly! "integrate your app into our stinkin' browser!"
Jun 02 2015
prev sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 06/02/2015 09:43 AM, "Ola Fosheim Grøstad" 
<ola.fosheim.grostad+dlang gmail.com> wrote:
 On Tuesday, 2 June 2015 at 13:37:47 UTC, ketmar wrote:
 and there was a thing that allows to use real applications without
 installing them... java web start! way too ahead of it's time, though...
And 0install. (Though I admit I haven't checked in on that in a few years. But I always wanted to see that, or something like it, succeed.)
 Java got itself a bad reputation in the 90s by providing a really poor
 implementation.
Over the years, I've noticed that one of the best ways to kill an idea is to get everyone on board with a really, really bad implementation of it. (Ex, see: "email" vs "exchange server and webmail clients", or "static type system" vs "C++ and Java". Or IMHO: "desktop/laptop computers" vs "every major desktop OS in existence" ;))
Jun 02 2015
parent ketmar <ketmar ketmar.no-ip.org> writes:
On Tue, 02 Jun 2015 18:41:20 -0400, Nick Sabalausky wrote:

 On 06/02/2015 09:43 AM, "Ola Fosheim Grøstad"
 <ola.fosheim.grostad+dlang gmail.com> wrote:
 On Tuesday, 2 June 2015 at 13:37:47 UTC, ketmar wrote:
 and there was a thing that allows to use real applications without
 installing them... java web start! way too ahead of it's time,
 though...
 And 0install. (Though I admit I haven't checked in on that in a few years. But I always wanted to see that, or something like it, succeed.)
that's a different story. 0install is a self-contained bundle with all libs and so on. you get it, you run it, you use it.

and javaws is something completely different. it doesn't even download the whole app at the start -- it can download only the classes it requires to run right now, downloading the rest in the background or when the user activates a feature for the first time.

besides, the application is not a bundle, it's more like a browser cache. javaws can check for updates transparently and update only parts of an application (think about fixing some bug in some classes -- whoa, only those classes need to be downloaded). and if the user hasn't used a feature that requires the updated class yet, he doesn't even need to restart the app.

that said, javaws was a great idea -- killed by java and by being too advanced for its time.
Jun 02 2015
prev sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Sun, 31 May 2015 07:54:26 +0000, Paulo Pinto wrote:

 Using macros in C++ is considered bad style and a sign of someone
 sticking to Cisms.
 With meta-programming, templates, strong enums, const, constexpr and
 inline there are very few valid reasons to use macros other than C
 copy-paste compatibility.
and the fact that it's way easier to write a C-like macro than a C++ template. ;-)
May 31 2015
prev sibling next sibling parent reply "bachmeier" <no spam.net> writes:
On Sunday, 24 May 2015 at 07:21:19 UTC, Joakim wrote:
 Rust's syntax dooms it to the same niche as Haskell.
They'd have been better off to go with XML. I think the developers got comfortable with the syntax as they went along, and they have no idea just how ugly it is.
May 24 2015
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Sunday, 24 May 2015 at 09:43:38 UTC, bachmeier wrote:
 On Sunday, 24 May 2015 at 07:21:19 UTC, Joakim wrote:
 Rust's syntax dooms it to the same niche as Haskell.
They'd have been better off to go with XML. I think the developers got comfortable with the syntax as they went along, and they have no idea just how ugly it is.
know about it.
May 24 2015
parent reply "bachmeier" <no spam.net> writes:
On Sunday, 24 May 2015 at 11:59:00 UTC, Paulo Pinto wrote:
 On Sunday, 24 May 2015 at 09:43:38 UTC, bachmeier wrote:
 On Sunday, 24 May 2015 at 07:21:19 UTC, Joakim wrote:
 Rust's syntax dooms it to the same niche as Haskell.
They'd have been better off to go with XML. I think the developers got comfortable with the syntax as they went along, and they have no idea just how ugly it is.
know about it.
I'm not sure what you're saying. Apple and Microsoft are responsible for Rust's syntax?
May 24 2015
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Sunday, 24 May 2015 at 17:22:26 UTC, bachmeier wrote:
 On Sunday, 24 May 2015 at 11:59:00 UTC, Paulo Pinto wrote:
 On Sunday, 24 May 2015 at 09:43:38 UTC, bachmeier wrote:
 On Sunday, 24 May 2015 at 07:21:19 UTC, Joakim wrote:
 Rust's syntax dooms it to the same niche as Haskell.
They'd have been better off to go with XML. I think the developers got comfortable with the syntax as they went along, and they have no idea just how ugly it is.
know about it.
I'm not sure what you're saying. Apple and Microsoft are responsible for Rust's syntax?
All those languages are based on ML syntax. Which means many do find such syntax pleasant, and it is being adopted by companies with major impact on the industry. -- Paulo
May 24 2015
parent reply "weaselcat" <weaselcat gmail.com> writes:
On Sunday, 24 May 2015 at 18:40:49 UTC, Paulo Pinto wrote:
 On Sunday, 24 May 2015 at 17:22:26 UTC, bachmeier wrote:
 On Sunday, 24 May 2015 at 11:59:00 UTC, Paulo Pinto wrote:
 On Sunday, 24 May 2015 at 09:43:38 UTC, bachmeier wrote:
 On Sunday, 24 May 2015 at 07:21:19 UTC, Joakim wrote:
 Rust's syntax dooms it to the same niche as Haskell.
They'd have been better off to go with XML. I think the developers got comfortable with the syntax as they went along, and they have no idea just how ugly it is.
OCaml) know about it.
I'm not sure what you're saying. Apple and Microsoft are responsible for Rust's syntax?
All those languages are based in the ML syntax. Which means many do find such syntax pleasant and it is being adopted by companies with major impact in the industry. -- Paulo
Rust's issue isn't the ML syntax, it's the explicit lifetime management and extremely verbose error system. A typical Rust block is nested 10 levels deep in matches and full of random 'a 'b 'c annotations everywhere. I think ML-based syntax has a very clean feeling about it, and IMO Rust has definitely not inherited that[1].

Furthermore, I strongly dislike that Rust has made it completely impossible to opt out of bounds checking without annotating your code with unsafe. Bounds checking can absolutely destroy a tight loop's performance (as has already been seen quite a few times in scientific/mathematical Rust benchmarks against other native languages).

FWIW I'm not picking on Rust, I used it for a rather long time (while in beta, obviously) before I switched to D full time for my academic work, and I don't regret my decision. I thought Rust would get more improvements than it did. I feel like they made so many poor decisions as the development went on, cut so many good features etc. just to cater to the non-ML crowd, that the language ended up being a Frankenstein mess.

[1] - https://github.com/andreaferretti/kmeans/blob/935b8966d4fe0d4854d3d69ec0fbfb4dd69a3fd1/rust/ rc/point/mod.rs#L54 This is fairly typical Rust code; I found it by a random Google search.
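For contrast, a minimal D sketch (my own toy example, not anything from the linked repo): the checks can be turned off for a whole build with -boundscheck=off (or skipped in non-@safe code with -release), or avoided locally in a hot loop by going through the slice's raw pointer.

// Sum a slice without per-element bounds checks in the inner loop.
double sum(const(double)[] a)
{
    double s = 0;
    const p = a.ptr;                 // raw pointer view of the slice
    foreach (i; 0 .. a.length)
        s += p[i];                   // unchecked access (not @safe, by design)
    return s;
}

void main()
{
    import std.stdio : writeln;
    writeln(sum([1.0, 2.0, 3.0]));   // prints 6
}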
Programs must be written for people to read, and only 
incidentally for machines to execute.
May 24 2015
next sibling parent reply "Laeeth Isharc" <laeeth nospamlaeeth.com> writes:
Weaselcat:

 FWIW I'm not picking on Rust, I used it for a rather long 
 time(while in beta, obviously) before I switched to D full time 
 for my academic work and I don't regret my decision. I thought 
 Rust would get more improvements than it did. I feel like they 
 made so many poor decisions as the development went on, cut so 
 many good features etc just to cater to the non-ML crowd that 
 the language ended up being a frankenstein mess.
Hi. Without wishing to dwell on the negatives of alternatives, might I ask what made you decide to settle on D? Do you have collaborators who write code and, if so, how did the discussions with them go about this? For your use case, what have been the main gains you have seen and how long did it take before these started to come through? I am interested because I may have the chance to persuade some people to explore using D, and I would like to know honestly what some experiences are of people who have made that change. (My personal experience may not be so easy to generalize from). Thanks. Laeeth.
May 24 2015
next sibling parent reply "bachmeier" <no spam.net> writes:
On Sunday, 24 May 2015 at 20:36:47 UTC, Laeeth Isharc wrote:
 Without wishing to dwell on the negatives of alternatives, 
 might I ask what made you decide to settle on D?  Do you have 
 collaborators who write code and, if so, how did the 
 discussions with them go about this?  For your use case, what 
 have been the main gains you have seen and how long did it take 
 before these started to come through?
I'm not weaselcat, but I'm an academic and I also tried out Rust before using D. I came to the conclusion that there was no way I could ever expect any collaborator to use Rust. The syntax was crazy. The requirement to study memory management issues (something completely irrelevant) before even reading the code was a non-starter. It's just a complicated language that is not suited for the average programmer. D is different. As long as I avoid templates, it's easy to read the code I've written, without any experience with the language. I tried C++ (Dirk Eddelbuettel devoted a section of his Rcpp book to an example I contributed), Rust, and Go. The other realistic alternative was Go, but I chose D as a matter of personal preference.
May 24 2015
next sibling parent "Laeeth Isharc" <nospamlaeeth nospam.laeeth.com> writes:
weaselcat:
"I feel like I could write a book on why I use D, so I'm going to
stop now : )"

Actually, kidding aside, I do believe that it would make sense to 
collect some personal warts-and-all accounts of the experience of 
individuals working in academe, the corporate sector, and 
elsewhere in switching to D.  One can think of it as a Studs 
Terkel type exercise, or something more like a Stanford case 
study.  But either way, a narrative is very powerful in making 
the prospect of switching vivid, because in those little details, 
and with the benefit of the natural coherence in which humans are 
used to thinking, there is power that may supplement a drier, more 
factual presentation of the benefits of D.  I personally found 
Don's account at a dconf a year or two back rather powerful.  
(Who he was representing helped, but less than one might think).  
I would also mention the very good talk by the German games 
developer whose name I have unfortunately forgotten this second - 
and in a rush.

There is an empty page here if anyone cares to get the ball 
rolling.  I'll add something myself when I have time in a few 
days, but if anyone cares to add their own experience, perhaps 
that might be of considerable benefit over time:

http://wiki.dlang.org/?title=User_narratives_on_switching_to_D&action=edit&redlink=1


bachmeier:
 I'm not weaselcat, but I'm an academic and I also tried out 
 Rust before using D. I came to the conclusion that there was no 
 way I could ever expect any collaborator to use Rust. The 
 syntax was crazy. The requirement to study memory management 
 issues (something completely irrelevant) before even reading 
 the code was a non-starter. It's just a complicated language 
 that is not suited for the average programmer.
I very much appreciate your taking the time to share your perspective (and I always enjoy reading your posts). I looked at Rust, but it doesn't address the problems I have, and I find the complexity off-putting.
 D is different. As long as I avoid templates, it's easy to read 
 the code I've written, without any experience with the language.
My own curve has been flattish, up until the point I got to templates, which are a bit more of a challenge. Until recently the most advanced part of language design I was familiar with was ANSI C prototypes, so it's worse for me than for most others, I suppose! weaselcat again: "I truly believe that D is easier to port C code to than C++ because you can write D in a "cleaned up" C for the most part, and slowly turn it into D whereas C++ is essentially a completely different style". Yes - exactly what I have found (I don't know C++, although as I learn D it becomes easier to read C++ code). Thanks for sharing the thoughts. In a hurry, but I wanted to say something quickly now.
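To illustrate that porting path with a toy of my own (made-up functions, nothing from weaselcat's code): the first version is essentially the C routine compiled as D, and the second is what it tends to become once slices and ranges take over.

import std.algorithm : sum;

// Step 1: the C original, pasted in almost verbatim and compiling as D.
double averageC(const(double)* values, size_t n)
{
    double total = 0;
    for (size_t i = 0; i < n; i++)
        total += values[i];
    return n ? total / n : 0;
}

// Step 2: the same routine after "slowly turning it into D" -- a slice
// instead of a pointer/length pair, and a range algorithm for the loop.
double averageD(const(double)[] values)
{
    return values.length ? values.sum / values.length : 0;
}

void main()
{
    double[] data = [1.0, 2.0, 3.0, 4.0];
    assert(averageC(data.ptr, data.length) == averageD(data));  // both 2.5
}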
May 26 2015
prev sibling parent reply "Chris" <wendlec tcd.ie> writes:
On Sunday, 24 May 2015 at 21:35:13 UTC, bachmeier wrote:
 On Sunday, 24 May 2015 at 20:36:47 UTC, Laeeth Isharc wrote:
 Without wishing to dwell on the negatives of alternatives, 
 might I ask what made you decide to settle on D?  Do you have 
 collaborators who write code and, if so, how did the 
 discussions with them go about this?  For your use case, what 
 have been the main gains you have seen and how long did it 
 take before these started to come through?
I'm not weaselcat, but I'm an academic and I also tried out Rust before using D. I came to the conclusion that there was no way I could ever expect any collaborator to use Rust. The syntax was crazy. The requirement to study memory management issues (something completely irrelevant) before even reading the code was a non-starter. It's just a complicated language that is not suited for the average programmer. D is different. As long as I avoid templates, it's easy to read the code I've written, without any experience with the language. I tried C++ (Dirk Eddelbuettel devoted a section of his Rcpp book to an example I contributed), Rust, and Go. The other realistic alternative was Go, but I chose D as a matter of personal preference.
This is an often underestimated aspect. Code that looks clean and pretty makes programming much more enjoyable and thus boosts productivity. If a language is not nice to look at, it puts people off. Not only because it makes code less readable; it is also aesthetically off-putting, which is purely psychological, but real nonetheless. D is quite clean for the most part and you can easily discern the "shape" of a program (even people who've never used D can make sense of it by looking at it). If a language like Rust introduces convoluted syntax for even the simplest tasks, then it will put people off, both in an aesthetic sense and as far as understanding the code is concerned (not to mention all the typing!). I think this is one of the reasons Python took off. It's nice to look at.

With Go I have the sinking feeling that it won't be able to contend with C++ - or D for that matter. It took off due to Google and a fool-proof, easy-to-use infrastructure. But it is way too limited and limiting to be useful for more sophisticated tasks. Go's core devs even say that they wanted it to be an easy-to-use, middle-of-the-road language for those who work in their code mines, focusing on high output, and it doesn't matter if you have to write the same function or for-loop with slight modifications over and over and over again.

Nim looks interesting, though. It combines nice features with Python's cleanliness. It is just the type of language people at universities and "coders-by-accident" love. Ha ha ha.
May 26 2015
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tue, 26 May 2015 10:07:08 +0000, Chris wrote:

 With Go I have the sinking feeling that it won't be able to contend with
 C++ - or D for that matter. It took off due to Google and a fool-proof,
 easy-to-use infrastructure. But it is way too limited and limiting to be
 useful for more sophisticated tasks. Go's core devs even say that they
 wanted it to be an easy-to-use, middle-of-the-road language for those
 who work in their code mines, focusing on a high output, and it doesn't
 matter, if you have to write the same function or for-loop with slight
 modifications over and over and over again.
and it really doesn't matter... for Rob Pike. he also doesn't like shared libraries and other bells and whistles. sometimes he is right, but sometimes he is too radical.

Go is a "java from google", aimed to raise a bunch of easily replaceable programmers. so, like java, Go can't be complicated. both Gosling and Pike are highly talented people, and that talent helps them to design dumb languages (which is not as easy as it seems ;-).
May 26 2015
parent reply "Chris" <wendlec tcd.ie> writes:
On Tuesday, 26 May 2015 at 17:13:18 UTC, ketmar wrote:
 On Tue, 26 May 2015 10:07:08 +0000, Chris wrote:

 With Go I have the sinking feeling that it won't be able to 
 contend with
 C++ - or D for that matter. It took off due to Google and a 
 fool-proof,
 easy-to-use infrastructure. But it is way too limited and 
 limiting to be
 useful for more sophisticated tasks. Go's core devs even say 
 that they
 wanted it to be an easy-to-use, middle-of-the-road language 
 for those
 who work in their code mines, focusing on a high output, and 
 it doesn't
 matter, if you have to write the same function or for-loop 
 with slight
 modifications over and over and over again.
and it really doesn't matter... for Rob Pike. he also doesn't like shared libraries and other bells and whistles. sometimes he is right, but sometimes he is too radical. Go is a "java from google", aimed to raise a bunch of easily replaceable programmers.
Exactly. As such it cannot be a serious contender as regards quality and versatility. There will be loads of Go code around, millions of for-loops on hundreds of thousands of servers, but I don't think it will go any further. Languages like D that are flexible and take useful concepts on board are much better suited for the programming challenges of the future (e.g. sophisticated high-speed data processing algorithms).

The thing is that Java and Python (and soon Go?) hit a brick wall sooner or later. Huge efforts are made to improve speed, flexibility and whatnot (JIT, Cython etc). But the real problem lies in rigid and narrow-minded design decisions taken more than a decade ago. This is why it's still back to C and C++ for serious stuff.[1]

[1] For more than a decade I've been hearing that with Java 8.x/9.x/10.x this or that issue will be fixed, or that Python will soon have native performance. It never happens and it never will. It's time to move on. Take the D train. :-)
 so, like java, Go can't be complicated. both Gosling and
 Pike are highly talented people, and that talent helps them to 
 design
 dumb languages (which is not as easy as it seems ;-).
May 27 2015
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 27 May 2015 at 10:01:35 UTC, Chris wrote:
 On Tuesday, 26 May 2015 at 17:13:18 UTC, ketmar wrote:
 On Tue, 26 May 2015 10:07:08 +0000, Chris wrote:

 With Go I have the sinking feeling that it won't be able to 
 contend with
 C++ - or D for that matter. It took off due to Google and a 
 fool-proof,
 easy-to-use infrastructure. But it is way too limited and 
 limiting to be
 useful for more sophisticated tasks. Go's core devs even say 
 that they
 wanted it to be an easy-to-use, middle-of-the-road language 
 for those
 who work in their code mines, focusing on a high output, and 
 it doesn't
 matter, if you have to write the same function or for-loop 
 with slight
 modifications over and over and over again.
and it really doesn't matter... for Rob Pike. he also don't like shared libraries and other bells and whistles. sometimes he is right, but sometimes he is too radical. Go is a "java from google", aimed to raise a bunch of easily replaceable programmers.
Exactly. As such it cannot be a serious contender as regards quality and versatility. There will be loads of Go code around, millions of for-loops on hundreds of thousands of servers, but I don't think it will go any further. Languages like D that are flexible and take useful concepts on board are much better suited for the programming challenges of the future (e.g. sophisticated high speed data processing algorithms). The thing is that Java and Python (and soon Go?) hit a brick wall sooner or later. Huge efforts are made to improve speed, flexibility and whatnot (JIT, Cython etc). But the real problem lies in rigid and narrow minded design decisions taken more than a decade ago. This is why it's still back to C and C++ for serious stuff.[1] [1] For more than a decade I've been hearing that with Java 8.x/9.x/10.x this or that issue will be fixed, or that Python will soon have native performance. It never happens and it never will. It's time to move on. Take the D train. :-)
Only when I can sell D to customers that put money into this kind of stuff http://www.azulsystems.com/press-2014/azul-systems-and-orc-partner-to-enable-smarter-high-performance-trading http://chronicle.software/products/koloboke-collections/ http://devblogs.nvidia.com/parallelforall/next-wave-enterprise-performance-java-power-systems-nvidia-gpus/ Ecosystems count more than language features. -- Paulo
May 27 2015
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Wed, 27 May 2015 13:23:16 +0000, Paulo  Pinto wrote:

 Only when I can sell D to customers that put money into this kind of
 stuff
if customers are deciding which technologies to use... ok then, they can=20 do their work without my help, 'cause they are so knowledgeable.=
May 27 2015
parent reply "Chris" <wendlec tcd.ie> writes:
On Wednesday, 27 May 2015 at 19:35:35 UTC, ketmar wrote:
 On Wed, 27 May 2015 13:23:16 +0000, Paulo  Pinto wrote:

 Only when I can sell D to customers that put money into this 
 kind of
 stuff
if customers are deciding which technologies to use... ok then, they can do their work without my help, 'cause they are so knowledgeable.
It's funny that people spend millions on technologies that make mediocre or crap languages better, but they would never invest in something like D, because they dread the investment. I think it's because D doesn't have a price tag attached to it. "If it's for free, it must be sh*t", I often hear people say. Maybe we should have a D Enterprise Edition (DEE) and sell it for $1,000. Believe me, people would take to it like ducks take to water.
May 28 2015
next sibling parent reply Rikki Cattermole <alphaglosined gmail.com> writes:
On 28/05/2015 8:55 p.m., Chris wrote:
 On Wednesday, 27 May 2015 at 19:35:35 UTC, ketmar wrote:
 On Wed, 27 May 2015 13:23:16 +0000, Paulo  Pinto wrote:

 Only when I can sell D to customers that put money into this kind of
 stuff
if customers are deciding which technologies to use... ok then, they can do their work without my help, 'cause they are so knowledgeable.
It's funny that people spend millions on technologies that makes mediocre or crap languages better, but they would never invest in something like D, because they dread the investment. I think it's because D doesn't have a price tag attached to it. "If it's for free, it must be sh*t", I often hear people say. Maybe we should have a D Enterprise Edition (DEE) and sell it for $1,000. Believe me, people would take to it like ducks take to water.
Or we put together a D consultancy firm, perhaps as part of D's future foundation? Starting at e.g. bug fixes etc., with price tag ranges on them. Perhaps even a price tag on working on DIPs.
May 28 2015
next sibling parent "Chris" <wendlec tcd.ie> writes:
On Thursday, 28 May 2015 at 09:23:07 UTC, Rikki Cattermole wrote:
 On 28/05/2015 8:55 p.m., Chris wrote:
 On Wednesday, 27 May 2015 at 19:35:35 UTC, ketmar wrote:
 On Wed, 27 May 2015 13:23:16 +0000, Paulo  Pinto wrote:

 Only when I can sell D to customers that put money into this 
 kind of
 stuff
if customers are deciding which technologies to use... ok then, they can do their work without my help, 'cause they are so knowledgeable.
It's funny that people spend millions on technologies that makes mediocre or crap languages better, but they would never invest in something like D, because they dread the investment. I think it's because D doesn't have a price tag attached to it. "If it's for free, it must be sh*t", I often hear people say. Maybe we should have a D Enterprise Edition (DEE) and sell it for $1,000. Believe me, people would take to it like ducks take to water.
Or we put together a D consultancy firm, perhaps as part of D's future foundation? Starting at e.g. bug fixes ext. with price tag ranges on them. Perhaps even a price tag on working on DIP's.
Charge them and they will come! :-)
May 28 2015
prev sibling parent reply Manu via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 28 May 2015 at 19:23, Rikki Cattermole via Digitalmars-d
<digitalmars-d puremagic.com> wrote:
 On 28/05/2015 8:55 p.m., Chris wrote:
 On Wednesday, 27 May 2015 at 19:35:35 UTC, ketmar wrote:
 On Wed, 27 May 2015 13:23:16 +0000, Paulo  Pinto wrote:

 Only when I can sell D to customers that put money into this kind of
 stuff
if customers are deciding which technologies to use... ok then, they can do their work without my help, 'cause they are so knowledgeable.
It's funny that people spend millions on technologies that makes mediocre or crap languages better, but they would never invest in something like D, because they dread the investment. I think it's because D doesn't have a price tag attached to it. "If it's for free, it must be sh*t", I often hear people say. Maybe we should have a D Enterprise Edition (DEE) and sell it for $1,000. Believe me, people would take to it like ducks take to water.
Or we put together a D consultancy firm, perhaps as part of D's future foundation? Starting at e.g. bug fixes ext. with price tag ranges on them. Perhaps even a price tag on working on DIP's.
I would put my money on the table.
May 30 2015
parent reply Rikki Cattermole <alphaglosined gmail.com> writes:
On 31/05/2015 3:52 p.m., Manu via Digitalmars-d wrote:
 On 28 May 2015 at 19:23, Rikki Cattermole via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On 28/05/2015 8:55 p.m., Chris wrote:
 On Wednesday, 27 May 2015 at 19:35:35 UTC, ketmar wrote:
 On Wed, 27 May 2015 13:23:16 +0000, Paulo  Pinto wrote:

 Only when I can sell D to customers that put money into this kind of
 stuff
if customers are deciding which technologies to use... ok then, they can do their work without my help, 'cause they are so knowledgeable.
It's funny that people spend millions on technologies that makes mediocre or crap languages better, but they would never invest in something like D, because they dread the investment. I think it's because D doesn't have a price tag attached to it. "If it's for free, it must be sh*t", I often hear people say. Maybe we should have a D Enterprise Edition (DEE) and sell it for $1,000. Believe me, people would take to it like ducks take to water.
Or we put together a D consultancy firm, perhaps as part of D's future foundation? Starting at e.g. bug fixes ext. with price tag ranges on them. Perhaps even a price tag on working on DIP's.
I would put my money on the table.
Now we just need either Walter or Andrei to weigh in on this idea. And perhaps somebody willing to step up and lead this. Laeeth Isharc, perhaps? I know he is interested in this area already and has the skills to back it up.
May 30 2015
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 5/30/15 8:55 PM, Rikki Cattermole wrote:
 On 31/05/2015 3:52 p.m., Manu via Digitalmars-d wrote:
 On 28 May 2015 at 19:23, Rikki Cattermole via Digitalmars-d
 <digitalmars-d puremagic.com> wrote:
 On 28/05/2015 8:55 p.m., Chris wrote:
 On Wednesday, 27 May 2015 at 19:35:35 UTC, ketmar wrote:
 On Wed, 27 May 2015 13:23:16 +0000, Paulo  Pinto wrote:

 Only when I can sell D to customers that put money into this kind of
 stuff
if customers are deciding which technologies to use... ok then, they can do their work without my help, 'cause they are so knowledgeable.
It's funny that people spend millions on technologies that makes mediocre or crap languages better, but they would never invest in something like D, because they dread the investment. I think it's because D doesn't have a price tag attached to it. "If it's for free, it must be sh*t", I often hear people say. Maybe we should have a D Enterprise Edition (DEE) and sell it for $1,000. Believe me, people would take to it like ducks take to water.
Or we put together a D consultancy firm, perhaps as part of D's future foundation? Starting at e.g. bug fixes ext. with price tag ranges on them. Perhaps even a price tag on working on DIP's.
I would put my money on the table.
Now we just need either Walter or Andrei to weigh in on this idea. And perhaps somebody willing to step up and lead this.
Speaking only for myself, I'll look into how the foundation can facilitate such paid consultancy. (Probably poorly if it's non-profit.) The practical issue I see is with the geographical distribution of participants. Probably a strong entrepreneur could overcome these issues. -- Andrei
May 30 2015
prev sibling next sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Thu, 28 May 2015 08:55:57 +0000, Chris wrote:

 On Wednesday, 27 May 2015 at 19:35:35 UTC, ketmar wrote:
 On Wed, 27 May 2015 13:23:16 +0000, Paulo  Pinto wrote:

 Only when I can sell D to customers that put money into this kind of
 stuff
if customers are deciding which technologies to use... ok then, they can do their work without my help, 'cause they are so knowledgeable.
It's funny that people spend millions on technologies that makes mediocre or crap languages better, but they would never invest in something like D, because they dread the investment. I think it's because D doesn't have a price tag attached to it. "If it's for free, it must be sh*t", I often hear people say. Maybe we should have a D Enterprise Edition (DEE) and sell it for $1,000. Believe me, people would take to it like ducks take to water.
a nice idea. "you don't want it for free? ok, we'll take your money if you want that." ;-)
May 28 2015
prev sibling parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On 05/28/2015 04:55 AM, Chris wrote:
 "If it's for free, it
 must be sh*t", I often hear people say. Maybe we should have a D
 Enterprise Edition (DEE) and sell it for $1,000. Believe me, people
 would take to it like ducks take to water.
Indeed. There's an MBA born every minute ;)
May 30 2015
prev sibling parent reply "weaselcat" <weaselcat gmail.com> writes:
On Sunday, 24 May 2015 at 20:36:47 UTC, Laeeth Isharc wrote:
 Weaselcat:

 FWIW I'm not picking on Rust, I used it for a rather long 
 time(while in beta, obviously) before I switched to D full 
 time for my academic work and I don't regret my decision. I 
 thought Rust would get more improvements than it did. I feel 
 like they made so many poor decisions as the development went 
 on, cut so many good features etc just to cater to the non-ML 
 crowd that the language ended up being a frankenstein mess.
Hi. Without wishing to dwell on the negatives of alternatives, might I ask what made you decide to settle on D?
Before I was mainly using C++ for my work (and dabbled in Rust, but never ended up switching to it).

D's C-like syntax inspires familiarity, to the point where I truly believe that D is easier to port C code to than C++: you can write D as a "cleaned up" C for the most part and slowly turn it into D, whereas C++ is essentially a completely different style. Not too many languages can really claim this, and IMO it's a huge boon.

D provides native performance directly comparable with C/C++ while being much easier to jump into and bash out a few quick ideas (Phobos helps a lot with this).

Phobos offers far more than the C++ standard library as well. It's nice not to have to go hunting for libraries. C++11 helped with this a little, I guess, but in D it just feels like "it's there," and Andrei/Walter seem to be actively working towards "you should always be using Phobos, or it should be fixed."

Also, D's metaprogramming system is actually usable. I use CTFE to do a ton of precomputation at compile time. I even recently learned that the compiler can unroll foreach loops in switches (that's cool! D has a lot of neat tricks that nobody really seems to discuss). I feel like I'm always learning new tricks in D, and not in the C++ way.

The ability to use C libraries with barely any fuss(!): I ported a C library's headers to D in about 10 minutes using htod and a bit of manual touch-up. This is a `killer feature` in my opinion.

Ranges/functional programming in general: C++ really has nothing on this. I think C++17 might be getting ranges, but when I reviewed the paper they seemed far uglier and more of an afterthought (like most of C++). I've written large parts of my programs in a purely functional style, and LDC optimizes it to the equivalent imperative code.

D's operator overloading is extremely well designed IMO (it just goes with the 'make D work the way you want it to' feeling of the rest of the language).

I feel like I could write a book on why I use D, so I'm going to stop now : )
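A minimal sketch of two of the tricks mentioned above, CTFE precomputation and a compile-time foreach generating switch cases (illustrative code, not from the thread; the helper names are made up):

import std.stdio;
import std.meta : AliasSeq;

// An ordinary function; assigning its result to an enum forces CTFE,
// so the table is computed at compile time and baked into the binary.
uint[8] buildSquares()
{
    uint[8] t;
    foreach (i, ref v; t)
        v = cast(uint)(i * i);
    return t;
}

enum squares = buildSquares(); // evaluated at compile time

// foreach over a compile-time sequence is unrolled, so it can emit case labels.
string describe(int x)
{
    switch (x)
    {
        foreach (v; AliasSeq!(1, 2, 3))
        {
            case v:
                return "small";
        }
        default:
            return "other";
    }
}

void main()
{
    writeln(squares[5]);  // 25, read from the precomputed table
    writeln(describe(2)); // "small"
    writeln(describe(9)); // "other"

    // and a tiny range pipeline in the functional style mentioned above
    import std.algorithm : filter, map;
    import std.range : iota;
    writeln(iota(10).filter!(a => a % 2 == 0).map!(a => a * a)); // [0, 4, 16, 36, 64]
}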
 Do you have collaborators who write code and, if so, how did 
 the discussions with them go about this?  For your use case, 
 what have been the main gains you have seen and how long did it 
 take before these started to come through?
No, but my advisor was very open to the idea of me using D. He was familiar with it, but never used it himself (this seems to be fairly common).

I went from prototyping in Python and porting into C++ as necessary to just doing everything in D, and I think this has saved me a lot of time.

D(md) compiles pretty fast; I think it used to have a bigger advantage here over C++ before Clang became popular, but it's still pretty darn fast and makes for a Python-esque edit-run-debug style. I went from compiling my projects on the university servers to my home desktop (and it compiles faster).
 I am interested because I may have the chance to persuade some 
 people to explore using D, and I would like to know honestly 
 what some experiences are of people who have made that change.  
 (My personal experience may not be so easy to generalize from).


 Thanks.


 Laeeth.
I think D's best quality is probably how approachable it is for C/C++ programmers compared to, e.g., Rust, which has a (weird) ML influence. I wouldn't be surprised if you could get a C or C++ programmer up to speed in D in an afternoon, at least enough to be efficient in it.

D is of course not all roses: shared still feels half implemented and left to rot (D's memory model in general), TDPL is aging, and there's not too much other literature on D - but Ali's book is very good. The compiler situation feels odd: LDC and GDC have very few contributors despite being better than dmd at optimizing and providing a lot of extra perks - e.g., LDC lets you use all of the LLVM sanitizers (thread, memory, etc.) and provides in-depth optimization analysis thanks to LLVM. But comparatively, this list is not so bad.

I'm not sure if my specific experience was what you're looking for, but maybe this helped.
May 24 2015
parent reply Iain Buclaw via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 24 May 2015 23:45, "weaselcat via Digitalmars-d" <
digitalmars-d puremagic.com> wrote:
 On Sunday, 24 May 2015 at 20:36:47 UTC, Laeeth Isharc wrote:
 Weaselcat:

 FWIW I'm not picking on Rust, I used it for a rather long time(while in
beta, obviously) before I switched to D full time for my academic work and I don't regret my decision. I thought Rust would get more improvements than it did. I feel like they made so many poor decisions as the development went on, cut so many good features etc just to cater to the non-ML crowd that the language ended up being a frankenstein mess.

 Hi.

 Without wishing to dwell on the negatives of alternatives, might I ask
what made you decide to settle on D?
 Before I was mainly using C++ for my work(and dabbled in Rust, but never
ended up switching to it)
 D's C-like syntax inspires familiarity to the point where I truly believe
that D is easier to port C code to than C++ because you can write D in a "cleaned up" C for the most part, and slowly turn it into D whereas C++ is essentially a completely different style. Not too many languages can really claim this, and IMO it's a huge boon.
 D provides native performance directly on comparison with C/C++ while
being much easier to jump in and bash out a few quick ideas(Phobos helps a lot with this.)
 Phobos offers far more than the C++ standard library as well. It's nice
not to have to go hunting for libraries, C++11 helped with this a little I guess but it just feels like "it's there," whereas Andrei/Walter seem to actively be working towards "you should always be using Phobos or it should be fixed."
 Also, D's metaprogramming system is actually usable. I use CTFE to do a
ton of precomputations at compiletime, I even recently learned that the compiler can unroll foreach loops in switches(! that's cool - D has a lot of neat tricks that nobody really seems to discuss,) I feel like I'm always learning new tricks in D, and not in the C++ way.
 The ability to use C libraries with barely any fuss(!), I ported a C's
library's headers to D in about 10 minutes using htod and a bit of manual touchup. This is a `killer feature` in my opinion.
 Ranges/functional programming in general, C++ really has nothing on this.
I think C++17 might be getting ranges but when I reviewed the paper they seemed far uglier and an afterthought(like most of C++), I've written large parts of my programs in purely functional style and LDC optimizes it to the equivalent imperative code.
 D's operator overloading is extremely well designed IMO,(it just goes
with the rest of the 'make D work the way you want it to' feeling of the rest of the language)
 I feel like I could write a book on why I use D, so I'm going to stop now
: )
 Do you have collaborators who write code and, if so, how did the
discussions with them go about this? For your use case, what have been the main gains you have seen and how long did it take before these started to come through?
 No, but my advisor was very open to the idea of me using D. He was
familiar with it, but never used it himself(this seems to be fairly common.)
 I went from prototyping in python and porting into C++ as necessary to
just doing everything in D, and I think this has saved me a lot of time.
 D(md) compiles pretty fast, I think it used to have a bigger advantage
here over C++ before Clang became popular but it's still pretty darn fast and makes for python-esque edit-run-debug editing style. I went from compiling my projects on the university servers to my home desktop(and it compiles faster.)
 I am interested because I may have the chance to persuade some people to
explore using D, and I would like to know honestly what some experiences are of people who have made that change. (My personal experience may not be so easy to generalize from).
 Thanks.


 Laeeth.
I think D's best quality is probably how approachable it is for C/C++
programmers compared to i.e, Rust which has a (weird) ML influence. I wouldn't be surprised if you could get a C or C++ programmer up to speed in D in an afternoon, at least enough to be efficient in it.
 D is of course not all roses, shared still feels half implemented and
left to rot(D's memory model in general.) TDPL is aging, and there's not too much other literature on D - but Ali's book is very good. The compiler situation feels odd, LDC and GDC have very few contributors despite being better than dmd at optimizing and providing a lot of extra perks I find the situation being like at university looking for grants or funding, and constantly being told. 'Oh yes, it is important what you are doing, and you must keep doing it as it is pivotal for future success. But no, we won't help you.'
May 24 2015
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Mon, 25 May 2015 00:24:26 +0200, Iain Buclaw via Digitalmars-d wrote:

 I find the situation is a bit like being at university looking for grants
 or funding, and constantly being told: 'Oh yes, it is important what you
 are doing, and you must keep doing it as it is pivotal for future success.
 But no, we won't help you.'

that's 'cause GCC is an untameable beast for an average Joe like me, for example. ;-)
May 25 2015
parent reply Iain Buclaw via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 25 May 2015 09:45, "ketmar via Digitalmars-d" <
digitalmars-d puremagic.com> wrote:
 On Mon, 25 May 2015 00:24:26 +0200, Iain Buclaw via Digitalmars-d wrote:

 I find the situation is a bit like being at university looking for grants
 or funding, and constantly being told: 'Oh yes, it is important what you
 are doing, and you must keep doing it as it is pivotal for future success.
 But no, we won't help you.'

that's 'cause GCC is untameable beast for average Joe like me, for example. ;-)
Both have equal complexity, so that is no excuse. DMD just operates at a lower level, on a smaller scale, and forces you to think about the effect on generated object code.
May 25 2015
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Mon, 25 May 2015 16:59:48 +0200, Iain Buclaw via Digitalmars-d wrote:

 On 25 May 2015 09:45, "ketmar via Digitalmars-d" <
 digitalmars-d puremagic.com> wrote:
 On Mon, 25 May 2015 00:24:26 +0200, Iain Buclaw via Digitalmars-d
 wrote:

 I find the situation is a bit like being at university looking for grants
 or funding, and constantly being told: 'Oh yes, it is important what you
 are doing, and you must keep doing it as it is pivotal for future success.
 But no, we won't help you.'

that's 'cause GCC is untameable beast for average Joe like me, for example. ;-)
Both have equal complexity, so that is no excuse. DMD just operates at a lower level, on a smaller scale, and forces you to think about the effect on generated object code.
i'm afraid that they don't have equal complexity. i can read DMD code (ok, even the backend, it's hard, but doable), but i cannot read GCC backend code in the same amount of time. and there are a lot more things i have to know to understand GDC. i made some trivial fixes in the DMD backend, yet i don't even know where to start to understand at least *something* in GCC.
May 25 2015
parent reply Iain Buclaw via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 25 May 2015 at 18:14, ketmar via Digitalmars-d <
digitalmars-d puremagic.com> wrote:

 On Mon, 25 May 2015 16:59:48 +0200, Iain Buclaw via Digitalmars-d wrote:

 On 25 May 2015 09:45, "ketmar via Digitalmars-d" <
 digitalmars-d puremagic.com> wrote:
 On Mon, 25 May 2015 00:24:26 +0200, Iain Buclaw via Digitalmars-d
 wrote:

 I find the situation is a bit like being at university looking for grants
 or funding, and constantly being told: 'Oh yes, it is important what you
 are doing, and you must keep doing it as it is pivotal for future success.
 But no, we won't help you.'

that's 'cause GCC is untameable beast for average Joe like me, for example. ;-)
Both have equal complexity, so that is no excuse. DMD just operates at a lower level, on a smaller scale, and forces you to think about the effect on generated object code.
i'm afraid that they doesn't have equal complexity. i can read DMD code (ok, even backend, it's hard, but doable), but i cannot read GCC backend code in the same amount of time. and there are alot more things i have to know to understand GDC. i made some trivial fixes in DMD backend, yet i don't even know where to start to understand at least *something* in GCC.
Yes, they do. The key difference is that GCC doesn't require you to delve into its backend; as a language implementer, you only need to think of how the code should be represented in its tree language (i.e. http://icps.u-strasbg.fr/~pop/gcc-ast.html). Because of this, I never need to look at assembly dumps to understand what is going on, only tree dumps, which are handily output in a C-style format with -fdump-tree-original=stdout.
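As a concrete illustration of that workflow (assuming a working GDC install; the flag is the one named above), you can dump the tree for a trivial function instead of reading assembly:

// sample.d -- build with:  gdc -c -fdump-tree-original=stdout sample.d
// The dump prints this function in GCC's C-style tree (GENERIC) form,
// which is what a frontend author inspects to verify the trees being built.
int twice(int x)
{
    return x * 2;
}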
May 25 2015
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Mon, 25 May 2015 18:34:24 +0200, Iain Buclaw via Digitalmars-d wrote:

 Yes, they do.  The key difference is that GCC doesn't require you to
 delve into it's backend, as a language implementer, you only need to
 think of how the code should be represented in it's tree language (ie:
 http://icps.u-strasbg.fr/~pop/gcc-ast.html) - Because of this, I never
 need to look at assembly dumps to understand what is going on, only tree
 dumps,
 which are handily outputted in a C-style format with
 -fdump-tree-original=stdout.
yet there are no well-documented samples for GCC, like "let's create a frontend for a simple C-like language, step by step" (at least not in the distribution). there are none for DMD either, but DMD code can be read and understood enough to work with it. and reading GCC code is out of the question, it's way too huge.

i once thought about using GCC as the backend for my experimental language, and ended up writing my own codegen. it does an awful job, spitting out almost non-optimised code, but it was at least maintainable. with GCC i never got far enough, it's too complex and poorly documented.

sure, that is not your fault, i'm simply trying to explain why there are almost no people working on GDC. it's just too hard. or it seems to be hard, but without good GCC documentation it's almost the same.
May 25 2015
next sibling parent reply "weaselcat" <weaselcat gmail.com> writes:
On Monday, 25 May 2015 at 22:27:10 UTC, ketmar wrote:
 On Mon, 25 May 2015 18:34:24 +0200, Iain Buclaw via 
 Digitalmars-d wrote:

 Yes, they do.  The key difference is that GCC doesn't require 
 you to
 delve into it's backend, as a language implementer, you only 
 need to
 think of how the code should be represented in it's tree 
 language (ie:
 http://icps.u-strasbg.fr/~pop/gcc-ast.html) - Because of this, 
 I never
 need to look at assembly dumps to understand what is going on, 
 only tree
 dumps,
 which are handily outputted in a C-style format with
 -fdump-tree-original=stdout.
yet there are no well-documented samples for GCC, like "let's create a frontend for simple C-like language, step by step" (at least not in the distribution). there are none for DMD too, but DMD code can be read and understood enough to work with it. and reading GCC code is out of question, it's way too huge. i once thinking about using GCC as backend for my experimental language, and ended writing my own codegen. it does awful job, spitting almost non- optimised code, but it was at least maintainable. with GCC i never got far enough, it's too complex and poorly documented. sure, that is not your fault, i'm simply trying to explain why there are almost no people working on GDC. it's just too hard. or it seems to be hard, but without good GCC documentation it's almost the same.
then contribute to LDC?
May 25 2015
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Mon, 25 May 2015 22:46:51 +0000, weaselcat wrote:

 then contribute to LDC?
sorry, not interested in LLVM in any way. not that "it should die", but "i don't care if it exists or not".
May 25 2015
parent reply "weaselcat" <weaselcat gmail.com> writes:
On Monday, 25 May 2015 at 22:48:59 UTC, ketmar wrote:
 On Mon, 25 May 2015 22:46:51 +0000, weaselcat wrote:

 then contribute to LDC?
sorry, not interested in LLVM in any way. not that "it should die", but "i don't care if it exists or not".
LLVM addresses every issue you complained about in GCC.
May 25 2015
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Mon, 25 May 2015 23:19:26 +0000, weaselcat wrote:

 On Monday, 25 May 2015 at 22:48:59 UTC, ketmar wrote:
 On Mon, 25 May 2015 22:46:51 +0000, weaselcat wrote:

 then contribute to LDC?
sorry, not interested in LLVM in any way. not that "it should die", but "i don't care if it exists or not".
LLVM addresses every issue you complained about in GCC.
non-GPL. that means "non-existent" for me.
May 25 2015
parent reply "weaselcat" <weaselcat gmail.com> writes:
On Tuesday, 26 May 2015 at 00:24:20 UTC, ketmar wrote:
 On Mon, 25 May 2015 23:19:26 +0000, weaselcat wrote:

 On Monday, 25 May 2015 at 22:48:59 UTC, ketmar wrote:
 On Mon, 25 May 2015 22:46:51 +0000, weaselcat wrote:

 then contribute to LDC?
sorry, not interested in LLVM in any way. not that "it should die", but "i don't care if it exists or not".
LLVM addresses every issue you complained about in GCC.
non-GPL. that means "non-existent" for me.
you're aware that the dmd backend is proprietary, right?
May 25 2015
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tue, 26 May 2015 00:53:35 +0000, weaselcat wrote:

 you're aware that the dmd backend is proprietary, right?
of course. that's why i never submitted any fixes to DMD backend, although i have some.
May 25 2015
parent reply "weaselcat" <weaselcat gmail.com> writes:
On Tuesday, 26 May 2015 at 02:14:46 UTC, ketmar wrote:
 On Tue, 26 May 2015 00:53:35 +0000, weaselcat wrote:

 you're aware that the dmd backend is proprietary, right?
of course. that's why i never submitted any fixes to DMD backend, although i have some.
was just making sure

while I agree with your ideological standpoint on GPL, gcc is virtually inaccessible to anyone who isn't already familiar with the codebase, in comparison to llvm. At least it seems gcc is interested in changing this.
May 25 2015
parent ketmar <ketmar ketmar.no-ip.org> writes:
On Tue, 26 May 2015 04:07:40 +0000, weaselcat wrote:

 while I agree with your ideological standpoint on GPL, gcc is virtually
 inaccessible to anyone who isn't already familiar with the codebase in
 comparison to llvm. At least it seems gcc is interested in changing
 this.
funny, that movement in GCC is in major part stimulated by LLVM. and that's why i'm not saying "LLVM must die!" ;-)
May 26 2015
prev sibling parent reply Iain Buclaw via Digitalmars-d <digitalmars-d puremagic.com> writes:
On 26 May 2015 00:30, "ketmar via Digitalmars-d" <
digitalmars-d puremagic.com> wrote:
 On Mon, 25 May 2015 18:34:24 +0200, Iain Buclaw via Digitalmars-d wrote:

 Yes, they do.  The key difference is that GCC doesn't require you to
 delve into it's backend, as a language implementer, you only need to
 think of how the code should be represented in it's tree language (ie:
 http://icps.u-strasbg.fr/~pop/gcc-ast.html) - Because of this, I never
 need to look at assembly dumps to understand what is going on, only tree
 dumps,
 which are handily outputted in a C-style format with
 -fdump-tree-original=stdout.
yet there are no well-documented samples for GCC, like "let's create a frontend for simple C-like language, step by step" (at least not in the distribution).
There used to be treelang, but nobody maintained it, and it quickly became obsolete. This is the best small frontend I've come across: https://github.com/giuseppe/gccbrainfuck

Depending on when you tried your attempt, the response is always going to be: the entry barrier is getting lower every year. LLVM and GCC are very similar, but where LLVM built its AST on a class hierarchy from the start, GCC, due to legacy, achieves the same OO style using unions and accessor macros. The best documentation for these can be found in the source code (see tree.def and tree.h for all definitions and accessors).

I suspect the most common beginner bug occurs when using the right accessor on the wrong tree, which causes an ICE at runtime in 'some obscure location'.

In case you intend to dabble again in the future: as more parts move over to C++, so far RTL, GIMPLE, callgraphs, and almost all synthetic value types have been switched over. When trees finally get shunted across, you'll finally benefit from a compile-time-verified AST.
 there are none for DMD too, but DMD code can be read and
 understood enough to work with it. and reading GCC code is out of
 question, it's way too huge.
No one (sane) starts with studying the whole of gcc. There are only three or four key files that you need to reference to get off the ground.
 sure, that is not your fault, i'm simply trying to explain why there are
 almost no people working on GDC. it's just too hard. or it seems to be
 hard, but without good GCC documentation it's almost the same.
It normally starts with a clean frontend. Currently there are bits and pieces split all over the place for GDC. But at least for the most part, all codegen is done via handy routines that call the correct backend builder for you. I'd like to go one better and have these be part of the new visitors we are switching to. Once all code generation for, e.g., expressions is encapsulated into a single file, then it's just a case of documentation.

Iain
May 25 2015
next sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
On Tue, 26 May 2015 07:43:59 +0200, Iain Buclaw via Digitalmars-d wrote:

 On 26 May 2015 00:30, "ketmar via Digitalmars-d" <
 digitalmars-d puremagic.com> wrote:
thank you for the info. ah, if only i had a "time freezing device" to make my days longer... ;-)
May 26 2015
prev sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
On Tue, 26 May 2015 07:43:59 +0200, Iain Buclaw via Digitalmars-d wrote:

p.s.

 I suspect the most common beginner bugs occurs when using right accessor
 on the wrong tree causes an ICE at runtime in 'some obscure location'.
 Incase you intend to dabble again in the future as more parts move over
 to C++, so far, RTL, GIMPLE, callgraphs, and almost all synthetic value
 types have been switched over. When trees finally get shunted across,
 you'll finally benefit from compile-time verified AST.
that's why i have mixed feelings about C++ code in GCC. although i really like it being "C only", compile-time checking is very valuable.
May 26 2015
parent reply "weaselcat" <weaselcat gmail.com> writes:
On Tuesday, 26 May 2015 at 08:35:46 UTC, ketmar wrote:
 On Tue, 26 May 2015 07:43:59 +0200, Iain Buclaw via 
 Digitalmars-d wrote:

 p.s.

 I suspect the most common beginner bugs occurs when using 
 right accessor
 on the wrong tree causes an ICE at runtime in 'some obscure 
 location'.
 
 Incase you intend to dabble again in the future as more parts 
 move over
 to C++, so far, RTL, GIMPLE, callgraphs, and almost all 
 synthetic value
 types have been switched over. When trees finally get shunted 
 across,
 you'll finally benefit from compile-time verified AST.
that's why i have mixed feelings about C++ code in GCC. although i really like it being "C only", compile-time checking is very valuable.
just convince the gcc dev team to port it to D ;)
May 26 2015
parent ketmar <ketmar ketmar.no-ip.org> writes:
On Tue, 26 May 2015 08:45:01 +0000, weaselcat wrote:

 that's why i have mixed feelings about C++ code in GCC. although i
 really like it being "C only", compile-time checking is very valuable.
 just convince the gcc dev team to port it to D ;)
ah, that would be *ideal*... but ok, let my Secret Plan work: first they will port GCC to simplified C++, and then... tadam! MagicPort Adjusted, D everywhere, C++-lovers are crying!
May 26 2015
prev sibling parent "Laeeth Isharc" <nospamlaeeth nospam.laeeth.com> writes:
On Sunday, 24 May 2015 at 22:24:40 UTC, Iain Buclaw wrote:
 The compiler
 situation feels odd, LDC and GDC have very few contributors 
 despite being
 better than dmd at optimizing and providing a lot of extra 
 perks
 I find the situation being like at university looking for 
 grants or funding, and constantly being told. 'Oh yes, it is 
 important what you are doing, and you must keep doing it as it 
 is pivotal for future success.  But no, we won't help you.'
I wish I could do something to help, but unfortunately I can't think of much for now (C++ coding and learning compiler internals exceed what's possible as things stand). If things should take off on the business front, I'll try to help where I can, but that's a matter of a few years in the best case. Anyway, I very much appreciate the work that you and your colleagues do on GDC.
May 26 2015
prev sibling next sibling parent "Marc Schütz" <schuetzm gmx.net> writes:
On Sunday, 24 May 2015 at 19:06:28 UTC, weaselcat wrote:
 Furthermore, I strongly dislike that Rust has made it 
 completely impossible to opt out of bounds checking without 
 annotating your code with unsafe.
But this is exactly the situation in D, isn't it? As soon as you use `arr.ptr[i]`, you're no longer safe. Or do you mean that there's no command-line switch equivalent to -noboundscheck?
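A minimal sketch of the distinction (my own example; -noboundscheck is the DMD switch referred to above):

void main()
{
    auto arr = [1, 2, 3];

    auto a = arr[1];     // slice indexing: bounds-checked unless the build
                         // disables it (e.g. dmd -noboundscheck)
    auto b = arr.ptr[1]; // raw pointer indexing: never bounds-checked, and
                         // exactly the kind of access @safe code disallows
    assert(a == b);
}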
May 25 2015
prev sibling parent Ziad Hatahet via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Sun, May 24, 2015 at 12:06 PM, weaselcat via Digitalmars-d <
digitalmars-d puremagic.com> wrote:

 Furthermore, I strongly dislike that Rust has made it completely
 impossible to opt out of bounds checking without annotating your code with
 unsafe.
Using iterators should cause bounds checking to be eliminated; otherwise, it should be a bug. -- Ziad
May 26 2015
prev sibling parent "Laeeth Isharc" <laeeth nospamlaeeth.com> writes:
 Self-criticism is necessary for improvement.
Yes, and what matters, after the storm has passed, is what you have done with that energy. People with high standards and no immediate ability to change things often complain a lot ;) To this newcomer, at least, the progress is impressive.
May 24 2015
prev sibling parent "ponce" <contact gam3sfrommars.fr> writes:
On Friday, 22 May 2015 at 10:21:18 UTC, Chris wrote:
 I was recently thinking that D is a bit like climbing up a hill 
 or a mountain. For the most part you are focused on reaching 
 the top, yet every once in a while it's good to stop and turn 
 around to enjoy the scenery and see how far you've come. So 
 here is what I see:

 - LDC/GDC: easy to download and use. Nicely packaged.
 - DUB: great tool for project management
 - DVM: great tool for upgrading from D to D.
 - Phobos: has become quite a useful library. Ranges are an 
 important part of data processing, I don't wanna miss 'em 
 anymore
 - vibe.d: a web server in D.
 - Projects in D: LuaD, PyD etc etc.
 - the expertise that's involved
 [add anything you like]

 Mind you, this has been achieved without millions of dollars 
 and corporate backing and yet D is a real language with real 
 applications (only nobody talks about it). I know, there is 
 still a steep climb ahead of D, but let's enjoy the view for a 
 while. What has been achieved is by no means trivial.
Fully agreed, D is an interesting phenomenon that makes every one of my days happier :) Now, can we have OSX shared libraries please? :O
May 22 2015