
digitalmars.D - Had another 48hr game jam this weekend...

reply Manu <turkeyman gmail.com> writes:
We have to get the user experience and first impressions under control...

I'd really love to see a general roadmap and list of priorities. Even if
goals are high-level, they might help direct focus?

So I had another game-jam this weekend with a bunch of friends who are all
industry professionals.
The point of a 48 hour game jam is to prioritise productivity and
creativity.
Choosing a language like D here makes sense from a productivity point of
view... that is, if it 'JUST WORKS'™.

There were 7 programmers, they were all using D for the first time (except
me).

Most running windows, one on OSX, one on Linux.
We ran into the same old problems that have been giving me the shits for as
long as I've been using D.

Configuring compilers:

Naturally, this is primarily a problem with the windows experience, but
it's so frustrating that it is STILL a problem... how many years later?
People don't want to 'do work' to install a piece of software. Rather, they
expect it to 'just work'. We lost about 6 hours trying to get everyone's
machines working properly.
In the context of a 48 hour game jam, that's a terrible sign! I just kept
promising people that it would save time overall... which I wish were true.

The only compiler you can realistically use productively in windows is
DMD-Win64, and that doesn't work out of the box.
We needed to mess with sc.ini for quite some time to get the stars aligned
such that it would actually compile and find the linker+libs.

Walter: DMD needs to internally detect installations of various versions of
VisualStudio, and either 'just work', or amend sc.ini on its own. Or the
installer needs to amend sc.ini. Either way, leaving it to a user to fiddle
with an ini file just isn't acceptable. We had to google solutions to this
problem, and even then, we had trouble with the paths we added to sc.ini;
are spaces acceptable? Do they have quotes around them?...
I might also suggest that Microsoft supplied (ie, 'standard'), libraries
should be automatically detected and path entries added in there too:
  C:\Program Files (x86)\Microsoft SDKs\...
  C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\...
These are on basically every windows developer's machine, and each of us had
to configure them ourselves.


Getting a workable environment:

Unsurprisingly, the Linux user was the only person happy to work with a
makefile. Everybody else wanted a comfortable IDE solution (and the linux
user would prefer it too).

!!!!!!!!!
This has to be given first-class attention!
I am completely and utterly sick of this problem. Don made a massive point
of it in his DConf talk, and I want to re-re-re-re-re-re-re-stress how
absolutely important this is.
!!!!!!!!!

I have come to the conclusion that treating IDE integration as ancillary
projects, each maintained by usually just a single member of the community,
has absolutely failed.
I suggest:
 * These should be made central D community projects.
 * I think they should be hosted in the same github organisation as DMD.
 *** As many contributors as possible should be encouraged to work with
them every day.
   - Deprecate DMD makefiles. Seriously! Insist that contributors use the
IDE bindings to work on DMD.
   - The fact that you all laughed at the prior point suggests clearly why
it needs to be done. It will cease to be a problem when all the
druntime/phobos contributors are suffering the end-user experience.
 * They should receive bugs in the main github bug-tracker, so EVERY D
contributor can see them, and how many there are.

IDE integration absolutely needs to be considered a first class feature of
D.
I also suggest that the IDE integration downloads should be hosted on the
dlang download page so they are obvious and available to everyone without
having to go looking, and also as a statement that they are actually
endorsed by the D language authorities. As an end-user, you're not left
guessing which ones are good/bad/out of date/actually work/etc.

Obviously, we settled on Visual-D (Windows) and Mono-D (OSX/Linux); the
only realistic choices available. The OSX user would have preferred an
XCode integration.

Overwhelmingly, the biggest complaint was a lack of symbolic information to
assist with auto-completion. Visual-D tries valiantly, but it falls quite
short of the mark.
This goes back to the threads where the IDE guys are writing their own
parsers, when really, DMD should be able to be built as a lib, with an API
designed for using DMD as a lib/plugin.
I think continuous code compilation for auto-completion and syntax
highlighting purposes should be a service offered and maintained by DMD.
That way it will evolve with the language, and anyone can use it without
reinventing the wheel.


Debugging:

Poor debugging experience wastes your time every 5 minutes.
I can only speak for the Windows experience (since we failed to get OSX
working); there are lots of problems with the debugging experience under
visual studio...
I haven't logged bugs yet, but I intend to.
There were many instances of people wasting their time chasing bugs in
random places when it was simply a case of the debugger lying about the
value of variables to them, and many more cases where the debugger simply
refused to produce values for some variables at all.
This is an unacceptable waste of programmers' time, and again, it really burned
us in a 48-hour context.


Documentation:

Okay for the most part, but some windows devs want a CHM that looks like
the typical Microsoft docs people are used to. For those who aren't familiar
with the CHM viewer: it's just HTML, but with a nice index + layout tree.


Containers:

The question came up multiple times; "I don't think this should be an
array... what containers can I use, and where are they?"...
Also, nobody could work out how to remove an arbitrary item from an array,
or an item from an AA by reference/value (only by key).

This code:
  foreach(i, item; array)
    if(item == itemToRemove)
      array = array[0..i] ~ array[i+1..$];
Got a rather 'negative' reaction from the audience to put it lightly...


Bugs:
Yes, we hit DMD bugs, like the one with opaque structs which required
extensive work-arounds.
  struct MyStruct;
  MyStruct*[] arr = new MyStruct*[n];
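
One possible work-around (just a sketch, not necessarily what we settled on
during the jam; it relies only on the fact that all pointers have the same
size): allocate the slice as untyped pointers and cast it.

  struct MyStruct;   // opaque; the real definition lives elsewhere

  MyStruct*[] makeArray(size_t n)
  {
    // 'new MyStruct*[n]' trips the bug, so allocate void*'s (same size as
    // any pointer) and reinterpret the slice; every element starts out null.
    auto raw = new void*[n];
    return cast(MyStruct*[]) raw;
  }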

We also ran into some completely nonsense error messages, but I forgot to
log them, since we were working against the clock.


One more thing:
I'll just pick one language complaint from the weekend.
It is how quickly classes became disorganised and difficult to navigate.

We all wanted the ability to define class member functions outside the class
definition:
  class MyClass
  {
    void method();
  }

  void MyClass.method()
  {
    //...
  }

It definitely cost us time simply trying to understand the class layout
visually (ie, when IDE support is barely available).
You don't need to see the function bodies in the class definition, you want
to quickly see what a class has and does.


Conclusion:
I think this 48 hour jam approach is a good test for the language and its
infrastructure. I encourage everybody to try it (ideally with a clean slate
computer).
The lesson is that we need to make this process smooth, since it mirrors
the first-experience of everybody new to D.
It also highlights and magnifies time-wasters that are equally unacceptable
in a commercial environment.

I don't think I made any converts this weekend wrt the above issues
encountered. I might have even just proved to them that they should indeed...


Please, we need a road-map, we need to prioritise these most basic aspects
of the experience, and we need to do it soon.
I might reiterate my feeling that external IDE integration projects should
be claimed by the community officially, and user experience + debugging
issues should be first-class issues in the main D language bug-tracker so
everybody can see them, and everybody is aware of the stats.
Also, the DMD front-end should be a lib offering auto-completion and syntax
highlighting data to clients.

I'm doing some more work on premake (a nice light buildsystem that
generates makefiles and project files for popular IDE's) to tightly
incorporate D into the various IDE's it supports.

</endrant>
Aug 31 2013
next sibling parent reply "Trent" <anon nope.avi> writes:
On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 [...]
I've hinted a few times here and on Reddit that I've been working on
revamping/redoing CMake support for D. This project would have gone public
almost a month ago if it weren't for the situation on Windows, but as it
stands, I would only make it easier for new users to hit a brick wall when
they try to do D development on Windows. I don't think that's very beneficial.

To add to Manu's gripes: OMF vs COFF / optlink is ridiculous. One of CMake's
appeals is that it can check the system for needed libraries, and aid in
linking to them. DMD32's need for OMF libraries throws an uncomfortable
wrench into the situation, which I have yet to resolve to my satisfaction.

VisualD generation was fairly straightforward. VS generation for mixed C/D
projects is not something I'm even sure I can do (on Win32/DMD, that is),
since to get VS to use a different C compiler (dmc), I need to make a VS
config that defers to a makefile config. I'm not sure that CMake is set up
to support this.

I'm still working on it during my (ever-shrinking) free time, but it's
pretty slow going.
Aug 31 2013
parent "Rikki Cattermole" <alphaglosined gmail.com> writes:
On Sunday, 1 September 2013 at 03:17:29 UTC, Trent wrote:
 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 [...]
I would also love to see support for Microsoft's linker added for Win32, not
as a replacement for Optlink but as an option that can be enabled. Maybe a
rethink of PE-COFF support is in order at the same time? We could also fix
the paths issue while we're at it.

About the whole IDE thing: my target for DOOGLE [1] is to eventually be able
to build an IDE out of it, possibly even a full windowing environment.
Because honestly, we need an IDE fully written in D.

[1] https://github.com/rikkimax/DOOGLE
Aug 31 2013
prev sibling next sibling parent "Jakob Ovrum" <jakobovrum gmail.com> writes:
On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 [...]
Aug 31 2013
prev sibling next sibling parent reply Michel Fortin <michel.fortin michelf.ca> writes:
On 2013-09-01 02:05:39 +0000, Manu <turkeyman gmail.com> said:

 I might re-iterate my feeling that external IDE integration projects should
 be claimed by the community officially
In my opinion the community is just too short on man-hours.

I did integrate D with Xcode at one point (no autocompletion though) and
created an all-in-one installer for the compiler and the Xcode plugin. It
just worked. I don't maintain it anymore and Xcode 4 broke it. It's open
source so anyone in the community could have taken it further, but so far
this hasn't happened.

I'm not using D anymore. I realized that with the time required to maintain
the toolset (including installer and Xcode plugin), plus the time it'd take
to make the language suitable to my needs (Objective-C integration, perhaps
ARM support for iOS), all that by itself would probably be more than one
full-time job. As all this meta-work would seriously get in the way of my
actual work, I let it go. I'm not regretting that move.

So I'm no longer using D, but I'm still hanging around here from time to
time because there's always something interesting to read.

-- 
Michel Fortin
michel.fortin michelf.ca
http://michelf.ca
Aug 31 2013
next sibling parent "Froglegs" <lugtug yahoo.com> writes:
  Yes, an IDE should be built with the language :)

D has bad IDE support; that is why I read about D but don't
actually use it...
Aug 31 2013
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 9/1/13, Michel Fortin <michel.fortin michelf.ca> wrote:
 So I'm no longer using D, but I'm still hanging around here from time
 to time because there's always something interesting to read.
That's a shame. But yeah, people should use what makes them productive and what puts food on the table; I can't judge you for that decision. The newsgroup is pretty entertaining to stick around for, though. :p
Sep 01 2013
prev sibling parent "Paolo Invernizzi" <paolo.invernizzi gmail.com> writes:
On Sunday, 1 September 2013 at 03:40:44 UTC, Michel Fortin wrote:
 On 2013-09-01 02:05:39 +0000, Manu <turkeyman gmail.com> said:

 I'm not using D anymore. I realized that with the time required 
 to maintain the toolset (including installer and Xcode plugin) 
 plus the time it'd take to make the language suitable to my 
 needs (Objective-C integration, perhaps ARM support for iOS), 
 all that by itself would probably be more than one full-time 
 job. As all this meta-work would seriously get in the way of my 
 actual work, I let it go. I'm not regretting that move.

 So I'm no longer using D, but I'm still hanging around here 
 from time to time because there's always something interesting 
 to read.
That's a pity, but I can understand: actually I'm relying on calling
Objective-C runtime functionality directly, having wrapped the *very
minimum* Cocoa things that I need for our projects... But the reality is
that it's simply not feasible to use D for OSX applications apart from
sticking with the posix/bsd face of the system, so one of the big three OSes
is out.

I would also add that I'm currently not able to debug on OSX, and that's
simply a show stopper for my colleague: the best results come from lldb,
with decent stack traces, and I can also set breakpoints, but I'm not able
to print any locals at all.

On all three platforms, the debugger at least (as a set of patches, as
documentation on the main site (not in the wiki!), as support in the
backend) should be priority number one, in my opinion.

- Paolo Invernizzi
Sep 01 2013
prev sibling next sibling parent reply "Jakob Ovrum" <jakobovrum gmail.com> writes:
Sorry about the nonsensical reply, the web interface was acting 
up... this is the intended reply.

On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 The only compiler you can realistically use productively in 
 windows is
 DMD-Win64, and that doesn't work out of the box.
Why didn't you go with DMD-Win32? Because of OMF? implib and/or objconv is a hassle but probably less of a hassle than using the nascent DMD-Win64.
 Overwhelmingly, the biggest complaint was a lack of symbolic 
 information to
 assist with auto-completion. Visual-D tries valiantly, but it 
 falls quite
 short of the mark.
 This goes back to the threads where the IDE guys are writing 
 their own
 parsers, when really, DMD should be able to be built as a lib, 
 with an API
 designed for using DMD as a lib/plugin.
Although I'm not convinced auto-completion is a vital feature (Microsoft's
C++ IntelliSense is shit too), I agree that any time spent on custom parsers
and best-effort semantic analysis is a complete waste of time. The only
semantic analysis engine that is going to be sufficiently good for D is one
from a compiler front-end.

Apart from DMD, it's worth taking a look at SDC for this.
 some windows dev's want a CHM that looks like
 the typical Microsoft doc's people are used to. Those that 
 aren't familiar
 with the CHM viewer; it's just HTML but with a nice index + 
 layout tree.
dmd2\windows\bin\d.chm
 The question came up multiple times; "I don't think this should 
 be an
 array... what containers can I use, and where are they?"...
 Also, nobody could work out how to remove an arbitrary item 
 from an array,
 or an item from an AA by reference/value (only by key).

 This code:
   foreach(i, item; array)
     if(item == itemToRemove)
       array = array[0..i] ~ array[i+1..$];
 Got a rather 'negative' reaction from the audience to put it 
 lightly...
`std.algorithm.remove` provides both stable (preserves order, shuffles the
entire array down) and unstable (swaps with the last element and shrinks by
one) removal. However, Phobos does not make a habit of providing helpers for
strictly O(n) algorithms, so the O(n) nature has to be made explicit by
first getting the index with `std.algorithm.countUntil`.

Removing a pair from an AA by value is also an exercise in linear search,
and as such will not get a deceptive helper function. However, once range
interfaces for AAs mature, such an algorithm can be composed trivially.
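
For reference, a minimal sketch of what that looks like with Phobos
(assuming a plain linear scan is acceptable; the AA loop at the end is just
that, nothing cleverer):

  import std.algorithm;

  void main()
  {
    // Remove the first occurrence of a value from a dynamic array.
    int[] array = [1, 2, 3, 4, 5];
    int itemToRemove = 3;

    auto i = array.countUntil(itemToRemove);          // explicit O(n) search
    if (i >= 0)
      array = array.remove(i);                        // stable: keeps order
    // array = array.remove!(SwapStrategy.unstable)(i); // cheaper, reorders

    assert(array == [1, 2, 4, 5]);

    // Removing AA entries by value is the same kind of linear scan.
    string[int] names = [1: "a", 2: "b", 3: "a"];
    foreach (key; names.keys)      // .keys is a copy, so removal is safe here
      if (names[key] == "a")
        names.remove(key);
    assert(names.length == 1);
  }
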
 Yes, we hit DMD bugs, like the one with opaque structs which 
 required
 extensive work-arounds.
   struct MyStruct;
   MyStruct*[] = new MyStruct*[n];
I'm not sure this is a bug. How do you default initialize an array of structs you don't know the .init values of?
Aug 31 2013
next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2013-09-01 05:53, Jakob Ovrum wrote:

 Although I'm not convinced auto-completion is a vital feature
 (Microsoft's C++ IntelliSense is shit too)
That doesn't mean there aren't any IDE's out there with good support for
autocompletion. The one in Eclipse for Java is fantastic. The one in Xcode
4+ for C/C++ and Objective-C/C++ is really good.

I'm also quite amazed by JetBrains; they're spitting out IDE's like mad,
with full language support for many types of languages, even those that are
usually very difficult, like dynamically typed languages. They fully support
Ruby on Rails.
 I'm not sure this is a bug. How do you default initialize an array of
 structs you don't know the .init values of?
You're forced to explicitly initialize it.

-- 
/Jacob Carlborg
Sep 01 2013
parent reply "Jakob Ovrum" <jakobovrum gmail.com> writes:
On Sunday, 1 September 2013 at 09:46:08 UTC, Jacob Carlborg wrote:
 That doesn't mean there aren't any IDE's out there with good 
 support for autocompletion. The one in Eclipse for Java is 
 fantastic. The one in Xcode 4+ for C/C++ and Objective-C/C++ is 
 really good.
Java is not a good comparison because it does not have any compile-time metaprogramming features to speak of. I've never used Xcode but how does it fare when faced with C++ template metaprogramming?
 You're forced to explicitly initialize it.
What do you mean? Please show some code how a declared but not defined struct can be initialized in any way, shape or form.
Sep 01 2013
next sibling parent reply "deadalnix" <deadalnix gmail.com> writes:
On Sunday, 1 September 2013 at 10:21:18 UTC, Jakob Ovrum wrote:
 On Sunday, 1 September 2013 at 09:46:08 UTC, Jacob Carlborg 
 wrote:
 That doesn't mean there aren't any IDE's out there with good 
 support for autocompletion. The one in Eclipse for Java is 
 fantastic. The one in Xcode 4+ for C/C++ and Objective-C/C++ 
 is really good.
Java is not a good comparison because it does not have any compile-time metaprogramming features to speak of. I've never used Xcode but how does it fare when faced with C++ template metaprogramming?
My understanding is that it uses libclang.
Sep 01 2013
parent Jacob Carlborg <doob me.com> writes:
On 2013-09-01 12:43, deadalnix wrote:

 My understanding is that it uses libclang.
Yes, and it can see through macros, templates and everything.

-- 
/Jacob Carlborg
Sep 01 2013
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2013-09-01 12:21, Jakob Ovrum wrote:

 What do you mean? Please show some code how a declared but not defined
 struct can be initialized in any way, shape or form.
int[] a;             // default initialized
int[] b = [3, 4, 5]; // explicitly initialized

-- 
/Jacob Carlborg
Sep 01 2013
parent "Jakob Ovrum" <jakobovrum gmail.com> writes:
On Sunday, 1 September 2013 at 13:21:20 UTC, Jacob Carlborg wrote:
 int[] a; // default initialized
 int[] b = [3, 4, 5]; // explicitly initialized
This has nothing to do with the problem.
Sep 01 2013
prev sibling next sibling parent reply "Simen Kjaeraas" <simen.kjaras gmail.com> writes:
On Sun, 01 Sep 2013 05:53:29 +0200, Jakob Ovrum <jakobovrum gmail.com>  
wrote:

 Yes, we hit DMD bugs, like the one with opaque structs which required
 extensive work-arounds.
   struct MyStruct;
   MyStruct*[] = new MyStruct*[n];
I'm not sure this is a bug. How do you default initialize an array of structs you don't know the .init values of?
An array of struct *pointers*. Fill it with nulls, I'd say.

-- 
Simen
Sep 01 2013
parent "Jakob Ovrum" <jakobovrum gmail.com> writes:
On Sunday, 1 September 2013 at 10:39:15 UTC, Simen Kjaeraas wrote:
 On Sun, 01 Sep 2013 05:53:29 +0200, Jakob Ovrum 
 <jakobovrum gmail.com> wrote:

 Yes, we hit DMD bugs, like the one with opaque structs which 
 required
 extensive work-arounds.
  struct MyStruct;
  MyStruct*[] = new MyStruct*[n];
I'm not sure this is a bug. How do you default initialize an array of structs you don't know the .init values of?
An array of struct *pointers*. Fill it with nulls, I'd say.
I assumed that's the "work-around" he referred to. After testing I realize not even the pointers work. Yeah, that's clearly a bug.
Sep 01 2013
prev sibling next sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 9/1/13, Jakob Ovrum <jakobovrum gmail.com> wrote:
 I'm not sure this is a bug. How do you default initialize an
 array of structs you don't know the .init values of?
Note that this is an array of /pointers/ to opaque structs, so it's valid code.
Sep 01 2013
prev sibling parent reply Manu <turkeyman gmail.com> writes:
On 1 September 2013 13:53, Jakob Ovrum <jakobovrum gmail.com> wrote:

 Sorry about the nonsensical reply, the web interface was acting up... this
 is the intended reply.


 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:

 The only compiler you can realistically use productively in windows is
 DMD-Win64, and that doesn't work out of the box.
Why didn't you go with DMD-Win32? Because of OMF? implib and/or objconv is a hassle but probably less of a hassle than using the nascent DMD-Win64.
Ummm... just no. I'm just not even gonna touch that. It just feels 'fake',
and I have no confidence in it at all. Maybe I'm being difficult, but I just
want to work in the same world as all the other existing code in my
ecosystem. I'm not making major concessions like that for a niche language
when everything else is already working well.
If that suggestion works flawlessly, then DMD-Win32 should be enhanced to
embed those tools and convert COFF libs to OMF automatically at link time.

 Overwhelmingly, the biggest complaint was a lack of symbolic information to
 assist with auto-completion. Visual-D tries valiantly, but it falls quite
 short of the mark.
 This goes back to the threads where the IDE guys are writing their own
 parsers, when really, DMD should be able to be built as a lib, with an API
 designed for using DMD as a lib/plugin.
Although I'm not convinced auto-completion is a vital feature (Microsoft's C++ IntelliSense is shit too), I agree that any time spent on custom parsers and best-effort semantic analysis is a complete waste of time. The only semantic analysis engine that is going to be sufficiently good for D is one from a compiler front-end. Apart from DMD, it's worth taking a look at SDC for this.
There's no reason D couldn't be just as good. I think the deficiencies in
the C++ experience come from the language itself; C++ offers the opportunity
for too many ambiguities, and the preprocessor has gotta hinder the
intellisense engine, for one.

 some windows dev's want a CHM that looks like
 the typical Microsoft doc's people are used to. Those that aren't familiar
 with the CHM viewer; it's just HTML but with a nice index + layout tree.
dmd2\windows\bin\d.chm
Noted. I suggest the installer put a link in the start menu :)

 Yes, we hit DMD bugs, like the one with opaque structs which required
 extensive work-arounds.
   struct MyStruct;
   MyStruct*[] arr = new MyStruct*[n];
I'm not sure this is a bug. How do you default initialize an array of structs you don't know the .init values of?
It's an array of pointers. Pointers always .init to null...?
Sep 01 2013
next sibling parent "Jakob Ovrum" <jakobovrum gmail.com> writes:
On Sunday, 1 September 2013 at 14:14:05 UTC, Manu wrote:

 there's no
 reason D couldn't be just as good.
There's one very good reason: templates. Microsoft's tooling was also designed with Visual Studio integration in mind from the beginning, while DMD is an evolution of DMC++.
Sep 01 2013
prev sibling parent "Brad Anderson" <eco gnuk.net> writes:
On Sunday, 1 September 2013 at 14:14:05 UTC, Manu wrote:
 On 1 September 2013 13:53, Jakob Ovrum <jakobovrum gmail.com> 
 wrote:
 dmd2\windows\bin\d.chm
Noted. I suggest the installer put a link in the start menu :)
https://github.com/D-Programming-Language/installer/pull/21
Sep 01 2013
prev sibling next sibling parent reply "Kapps" <opantm2+spam gmail.com> writes:
On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 The only compiler you can realistically use productively in 
 windows is
 DMD-Win64, and that doesn't work out of the box.
 We needed to mess with sc.ini for quite some time to get the 
 stars aligned
 such that it would actually compile and find the linker+libs.
I also spent a decent bit of effort getting Win64 to work, and I agree that
this is something that DMD should attempt to bundle. It may not need to go
as far as downloading VC Express for you, but it should allow integration of
an existing install during installation (and ideally post-installation as
well). This is a pretty big deal IMO.

When I was a newbie, issues with COFF vs OMF, coffimplib confusion, etc.,
were almost deal-breakers for me. I just couldn't get things easily working,
and I'm sure many others see it the same way. Having Win64 gives us a chance
to fix that, but it *has* to be integrated into the installer. The compiler
should ideally detect that the VS linker / libraries are missing when you
use -m64, and tell you how to download VS Express as well as directing you
to a batch file bundled with DMD to update sc.ini.

Going a step further, it'd be even nicer if -m64 was the default, but that's
not feasible with external dependencies. However, when it detects a library
in an invalid format, it should inform you about the existence of coffimplib
and a download link, as well as how this is necessary only when compiling
32-bit executables.

I don't think that the core contributors realize how incredibly frustrating
these issues are to beginners, and thus feel as if it's not something that
needs to be solved.
 Getting a workable environment:

 Unsurprisingly, the Linux user was the only person happy work 
 with a
 makefile. Everybody else wanted a comfortable IDE solution (and 
 the linux
 user would prefer it too).

 !!!!!!!!!
 This has to be given first-class attention!
 I am completely and utterly sick of this problem. Don made a 
 massive point
 of it in his DConf talk, and I want to 
 re-re-re-re-re-re-re-stress how
 absolutely important this is.
 !!!!!!!!!
I have to say, I don't see why you find this to be such a large issue. DMD is unique in the sense that it's the only thing I've ever been able to compile on Windows without running into many issues. So long as you have DMC installed, it just works. I've never had any makefile related issues on any platform. This is a big deal, as DMD evolves so rapidly that users should be able to get the git version with minimal effort, without having to download an IDE for it.
 Overwhelmingly, the biggest complaint was a lack of symbolic 
 information to
 assist with auto-completion. Visual-D tries valiantly, but it 
 falls quite
 short of the mark.
 This goes back to the threads where the IDE guys are writing 
 their own
 parsers, when really, DMD should be able to be built as a lib, 
 with an API
 designed for using DMD as a lib/plugin.
 I think continuous code compilation for auto-completion and 
 syntax
 highlighting purposes should be a service offered and 
 maintained by DMD.
 That way it will evolve with the language, and anyone can use 
 it without
 reinventing the wheel.
While yes, it would be wonderful if we could get DMD to do this (again, I
don't think a lot of the core contributors realize just how incredibly
important IDEs and perfect auto-completion are), it doesn't seem to be
something likely in the short-term.

That being said, I've actually found Mono-D to be excellent recently. It
doesn't handle things like CTFE properly and has other similar issues, but
for my purposes it's worked rather well (that being said, I avoid mixins for
the most part, largely due to this). Despite all this, I'm actually quite
happy with Mono-D lately.

One thing I've toyed with is the idea of using reflection for getting
auto-complete information. I made a runtime reflection module (it still has
a fair few issues), and I wonder if it would be possible to use something
similar for this purpose. Most modules could be cached, allowing us to build
only the module you're altering. On top of that, some real-time parsing
could be done for code modified since the last recompile (really it would
need to primarily handle scanning for methods and variables). History
completion from the current file, such as what editors like Sublime Text do,
could perhaps be integrated to completely eliminate the issue of not being
able to find a symbol in your auto-complete list. That would likely only
kick in when it finds no results for anything else. Plus, since you're
recompiling frequently in the background, you would get early notifications.
I'm not sure if this would be updated fast enough (depends how long compiles
take) to be feasible.
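
As a toy illustration of the idea (not my reflection module, just the
built-in compile-time reflection such a tool could build on):

  module membersdemo;

  import std.stdio;

  class MyClass
  {
    int x;
    void method() {}
  }

  void main()
  {
    // Prints x, method, and the members inherited from Object: roughly the
    // raw symbol list an auto-complete popup would start from.
    foreach (name; __traits(allMembers, MyClass))
      writeln(name);
  }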
 Debugging:

 Poor debugging experience wastes your time every 5 minutes.
 I can only speak for the Windows experience (since we failed to 
 get OSX
 working); there are lots of problems with the debugging 
 experience under
 visual studio...
 I haven't logged bugs yet, but I intend to.
 There were many instances of people wasting their time chasing 
 bugs in
 random places when it was simply a case of the debugger lying 
 about the
 value of variables to them, and many more cases where the 
 debugger simply
 refused to produce values for some variables at all.
 This is an unacceptable waste of programmers time, and again, 
 really burned
 us in a 48hour context.
Agreed, debugging is currently thoroughly lacking. On Linux, Mono-D's debugger works okay, but it still jumps around a lot (even in debug mode) and hovering over variables isn't great yet. It'd really be nice to have that. This is something that runtime reflection could handle.
 Containers:

 The question came up multiple times; "I don't think this should be an
 array... what containers can I use, and where are they?"...
 Also, nobody could work out how to remove an arbitrary item from an
 array, or an item from an AA by reference/value (only by key).

 This code:
   foreach(i, item; array)
     if(item == itemToRemove)
       array = array[0..i] ~ array[i+1..$];
 Got a rather 'negative' reaction from the audience to put it lightly...
Completely agreed. Containers are another huge problem which needs to be solved as soon as possible, yet it keeps getting pushed back. The idea that a language like D has nothing but the most basic containers in its standard library is laughable. I shudder to think how many different implementations of common containers exist in user code right now.
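For reference, the removals the team was hunting for can be written with what Phobos already ships: std.algorithm's countUntil and remove for arrays, and collecting keys for the AA case. A minimal sketch (variable names are mine, purely illustrative):

  import std.algorithm : countUntil, remove;

  void main()
  {
      // Remove the first occurrence of a value from a dynamic array.
      int[] items = [3, 1, 4, 1, 5];
      auto idx = items.countUntil(4);
      if (idx >= 0)
          items = items.remove(idx);
      assert(items == [3, 1, 1, 5]);

      // An AA can only be removed from by key...
      string[int] names = [1: "one", 2: "two", 3: "two"];
      names.remove(2);

      // ...so removing "by value" means collecting the matching keys first.
      int[] doomed;
      foreach (k, v; names)
          if (v == "two")
              doomed ~= k;
      foreach (k; doomed)
          names.remove(k);
      assert(names == [1: "one"]);
  }

None of which is particularly discoverable under time pressure, and it still isn't a container library.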
 Bugs:
 Yes, we hit DMD bugs, like the one with opaque structs which required
 extensive work-arounds.
   struct MyStruct;
   MyStruct*[] arr = new MyStruct*[n];

 We also ran into some completely nonsensical error messages, but I
 forgot to log them, since we were working against the clock.
I used to hit compiler bugs very frequently, yet this is something D has improved on incredibly. It's wonderful not to have to worry with every compile whether you'll hit a bug, and wonderful not to have to change your code with every single compiler release. D has made *huge* headway on these two issues. Sure, there are still many bugs, but they're becoming less frequent for me, and I find the workarounds easier than before.
 One more thing:
 I'll just pick one language complaint from the weekend.
 It is how quickly classes became disorganised and difficult to navigate.

 We all wanted the ability to define class member functions outside the
 class definition:
   class MyClass
   {
     void method();
   }

   void MyClass.method()
   {
     //...
   }

 It definitely cost us time simply trying to understand the class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class definition, you
 want to quickly see what a class has and does.
This isn't something I've found to be an issue personally, but I suppose it's a matter of what you're used to; I'm used to that being the IDE's job. That being said, perhaps .di files could help with this?
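As a concrete illustration of the .di idea (the module and names below are invented, purely a sketch), a hand-maintained .di gives exactly that declaration-only view:

  // player.d: the implementation module.
  module player;

  class Player
  {
      float health = 100;

      void takeDamage(float amount)
      {
          health -= amount;
          if (health < 0)
              health = 0;
      }
  }

  // player.di: a hand-written declaration-only view of the same module,
  // showing what the class has and does, without the bodies.
  module player;

  class Player
  {
      float health = 100;

      void takeDamage(float amount);
  }

dmd can also generate a starting point for such a file with -H, though a hand-maintained version is what gives the clean overview.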
Aug 31 2013
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Sun, 01 Sep 2013 06:45:48 +0200
"Kapps" <opantm2+spam gmail.com> wrote:

 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 One more thing:
 I'll just pick one language complaint from the weekend.
 It is how quickly classes became disorganised and difficult to 
 navigate

 We all wanted to ability to define class member functions 
 outside the class
 definition:
   class MyClass
   {
     void method();
   }

   void MyClass.method()
   {
     //...
   }

 It definitely cost us time simply trying to understand the 
 class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class 
 definition, you want
 to quickly see what a class has and does.
This isn't something I've found to be an issue personally, but I suppose it's a matter of what you're used to; I'm used to that being the IDE's job. That being said, perhaps .di files could help with this?
I see it as the job of doc generators.
Sep 01 2013
parent reply Manu <turkeyman gmail.com> writes:
On 1 September 2013 17:46, Nick Sabalausky <
SeeWebsiteToContactMe semitwist.com> wrote:

 On Sun, 01 Sep 2013 06:45:48 +0200
 "Kapps" <opantm2+spam gmail.com> wrote:

 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 One more thing:
 I'll just pick one language complaint from the weekend.
 It is how quickly classes became disorganised and difficult to
 navigate

 We all wanted to ability to define class member functions
 outside the class
 definition:
   class MyClass
   {
     void method();
   }

   void MyClass.method()
   {
     //...
   }

 It definitely cost us time simply trying to understand the
 class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class
 definition, you want
 to quickly see what a class has and does.
This isn't something I've found to be an issue personally, but I suppose it's a matter of what you're used to; I'm used to that being the IDE's job. That being said, perhaps .di files could help with this?
I see it as the job of doc generators.
Why complicate the issue? What's wrong with readable code?
Sep 01 2013
next sibling parent reply "SomeDude" <lovelydear mailmetrash.com> writes:
On Sunday, 1 September 2013 at 13:20:50 UTC, Manu wrote:
 Why complicate the issue? What's wrong with readable code?
Java programmers. In fact, it's the first time I hear people complaining about this one; maybe it's because of the generalized usage of IDEs, I guess.

I do agree that IDEs DO matter: without one, development is a monumental chore, but with one you are often more productive than in Python (an opinion based on my own professional experience). The problem is that maintaining one or several IDE plugins is going to be nearly a full-time job, even if the compiler is a library. Or you need a very motivated person like Iain Buclaw to keep updating the tools.

I think at this point, what D needs is a bit of commercial support from a company like JetBrains or some equivalent. Maybe there is now an opportunity for founding such a company, one that would specialize in building professional tools around the D language. I believe the language and the compilers are stable enough to grow a serious business around them. If we compare with the state of C++ compilers before 2000, we are much better off, and that was just over a decade ago. Who knows what the state of D will be in 5 years? So yes, there is a case to be made for growing a company around pro D tools, and the first company that does it will grab the whole market.
Sep 01 2013
next sibling parent reply "Brian Schott" <briancschott gmail.com> writes:
On Sunday, 1 September 2013 at 18:36:39 UTC, SomeDude wrote:
 I think at this point, what D needs is a bit of commercial 
 support from a company like JetBrains or some equivalent. Maybe 
 there is now an opportunity for founding such a company, one 
 that would specialize in building professional tools around the 
 D language. I believe the language and the compilers are stable 
 enough to grow a serious business around them. If we compare to 
 what the state of C++ compilers was before 2000, I believe we 
 are much better off. And that was just over a decade ago. Who 
 knows what the state of D will be in 5 years ? So yes, there is 
 a case to be made for growing a company around pro D tools, and 
 the first company that does it will grab the whole market.
It's a bit of a chicken-and-egg problem. I'd like to do this, but there would have to be several companies already using D professionally for it to be a viable business model. And for a company to invest in D, they'd probably want the tooling to already exist.
Sep 01 2013
next sibling parent reply Manu <turkeyman gmail.com> writes:
On 2 September 2013 05:44, Brian Schott <briancschott gmail.com> wrote:

 On Sunday, 1 September 2013 at 18:36:39 UTC, SomeDude wrote:

 I think at this point, what D needs is a bit of commercial support from a
 company like JetBrains or some equivalent. Maybe there is now an
 opportunity for founding such a company, one that would specialize in
 building professional tools around the D language. I believe the language
 and the compilers are stable enough to grow a serious business around them.
 If we compare to what the state of C++ compilers was before 2000, I believe
 we are much better off. And that was just over a decade ago. Who knows what
 the state of D will be in 5 years ? So yes, there is a case to be made for
 growing a company around pro D tools, and the first company that does it
 will grab the whole market.
It's a bit of a chicken-and-egg problem. I'd like to do this, but there would have to be several companies already using D professionally for it to be a viable business model. And for a company to invest in D, they'd probably want the tooling to already exist.
Well, you can look at Remedy's experience. I've also sparked interest in staff from a few other companies when talking to developers. It'll never fly, though, if when they go to try it out they're met with the kind of experience I had on the weekend. Don't underestimate how much people hate C++ these days.
Sep 01 2013
parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 9:15 PM, Manu wrote:
 Well you can look at Remedy's experience. I've also sparked interest in staff
 from a few other companies when talking to developers.
 It'll never fly though if when they go to try it out, they're met with the kind
 of experience I had on the weekend.
 Don't underestimate how much people hate C++ these days.
Your work is really valuable to D. Please make sure all the problems you encountered wind up as bugzilla entries!
Sep 01 2013
prev sibling next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2013-09-01 21:44, Brian Schott wrote:

 It's a bit of a chicken-and-egg problem. I'd like to do this, but there
 would have to be several companies already using D professionally for it
 to be a viable business model. And for a company to invest in D, they'd
 probably want the tooling to already exist.
I would like that as well, but I have the same concerns. Are there any customers?

--
/Jacob Carlborg
Sep 02 2013
parent reply "Joakim" <joakim airpost.net> writes:
While there has been a lot of back and forth with Manu in this 
thread, I think it's best to look at his game jam experience as a 
kind of stress test for D installation and usage, ie informative 
but extreme.  Game development, especially in a rush and with 
everything being installed automatically for the user, is not a 
use case that is well-plumbed in D right now.  It would be nice 
if debugging properly worked, but given the complexity involved 
that Walter mentioned, it's understandable that it doesn't.

Speaking of debuggers, there was a BSD-licensed debugger written 
in D around 2009, likely D1.  It was abandoned by the author, who 
told me he got busy with work, but Martin Nowak resurrected it 
for some time a couple years back:

https://github.com/dawgfoto/ngdb

Perhaps it can be resurrected as a debugger for D, since it is 
written in D.

On Monday, 2 September 2013 at 07:42:52 UTC, Jacob Carlborg wrote:
 On 2013-09-01 21:44, Brian Schott wrote:

 It's a bit of a chicken-and-egg problem. I'd like to do this, 
 but there
 would have to be several companies already using D 
 professionally for it
 to be a viable business model. And for a company to invest in 
 D, they'd
 probably want the tooling to already exist.
I would like that as well. But I feel the same things. Are there any customers?
It strikes me that commercial support is the correct way to fix a lot of these problems, which are about fit and finish, something many others have noted open source projects are notoriously bad at. I suggested a licensing model for how D could provide commercial support in a different thread:

http://forum.dlang.org/thread/kq9r7t$nu$1 digitalmars.com#post-sglcqsiphpntcdlhvwvn:40forum.dlang.org

The usual open source zealots argued with me, suggesting that any closed source reference implementation would be unwelcome, even if always accompanied by an open source implementation that's available for free. I won't get into the even sillier arguments used ;) - perhaps such people wouldn't mind as long as it wasn't Walter and dmd, ie the reference implementation, that went closed/paid, ie if Brian or someone else did it.

As for needing "several companies already using D professionally for it to be a viable business model," most companies don't start that way. You find a few users willing to pay for fit and finish or optimizations, and you grow it from part-time work to as far as the business will go. I certainly hope someone is willing to do that.
Sep 02 2013
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Monday, 2 September 2013 at 17:39:45 UTC, Joakim wrote:
 ...
There is a crucial difference between having a company providing commercial services for D users (good) and having anything closed/commercial in the reference implementation (bad). The former simply matches the demand from a certain market segment. The latter makes life worse for everyone else. The two have hardly anything in common.
Sep 02 2013
next sibling parent Jacob Carlborg <doob me.com> writes:
On 2013-09-02 19:54, Dicebot wrote:

 There is crucial difference between having a company providing
 commercial services for D users (good) and having anything
 closed/commercial in reference implementation (bad). Former is simply
 matching the demand from certain market segment. Latter is screwing the
 life for everyone else. There is hardly anything common here.
Yes, I agree. I think the tools, or most of the tools, need to be free. Then they need to make money on other things, such as support, education or similar.

--
/Jacob Carlborg
Sep 02 2013
prev sibling parent reply "SomeDude" <lovelydear mailmetrash.com> writes:
On Monday, 2 September 2013 at 17:54:01 UTC, Dicebot wrote:
 On Monday, 2 September 2013 at 17:39:45 UTC, Joakim wrote:
 ...
There is crucial difference between having a company providing commercial services for D users (good) and having anything closed/commercial in reference implementation (bad). Former is simply matching the demand from certain market segment. Latter is screwing the life for everyone else. There is hardly anything common here.
Wait, I do not advocate building a closed source or non-free reference implementation of the compiler or of the standard library. Those need to stay open source, of course. But there are plenty of pro-quality tools that are sorely missing right now:

- an IDE that works with any of the 3 existing compilers
- an integrated debugger
- a graphical memory usage viewer/analyzer
- a visual profiler
- an integrated package manager
- maybe a GUI library

Etc. The existence of such tools would be a very big incentive for companies to try D seriously.
Sep 02 2013
parent "Ramon" <spam thanks.no> writes:
I think Dicebot has hit on a point there. It's true that much, if not most, open-source software is a (hopefully) somewhat polished-up version of something written to scratch a personal itch.

But I'm quite confident that this is not true for a project like D. I mean, just coming up with a solid and well-thought-out description of the itch (like "C++ is a mess and a PITA") that can serve as a guiding line when conceiving a language is a major undertaking. One might come up with some script thingy just to scratch an itch; to conceive, design and implement something like D, however, was and is a very major undertaking, thousands and thousands of hours, and so on.

Wouldn't one then like others to see, too, what one understood 10 years ago and tried to make better? Wouldn't one then want to make it really easy to test drive the language and see its power (on CPUs rather than web sites)?

My driver for complaining about D is *that it's so great* - but quite low on the usability side. If D weren't that great I'd simply have turned away.

I get Walter Bright's point that he can't (and doesn't want to) be in charge of everything and the kitchen sink. The debugger issue, though, *does* concern dmd itself, and not having a reliably working debugger with full (usual) functionality *is* a major show-stopper. I mean, come on, how reasonable and consistent is it to leave the C/C++ mess and then spread debug writelns all over the place?!

For the rest, I agree, it might be hard to see for the emacs/vim crowd and the like. Yes, they are right, there is life without nice colors and mousing-around development.
Let us not forget: being somehow usable for insiders is only a first step. To really gain traction a second step must be taken: being reasonably usable by today's common standards. Which, I think, translates at least to: vim or emacs modes, plus some decent cross-platform editor like SciTE, and a "thin IDE" like Geany. From there on it's a matter of taste and religion; and - that's an important point! - with comfortable base usability through the whole chain we can afford to say "You want X? Go and code it. The tools needed are there.".

A+ -R
Sep 02 2013
prev sibling parent reply "deadalnix" <deadalnix gmail.com> writes:
On Monday, 2 September 2013 at 17:39:45 UTC, Joakim wrote:
 The usual open source zealots argued with me, suggesting that 
 any closed source reference implementation would be unwelcome, 
 even if always accompanied by an open source implementation 
 that's available for free.
Asserting people are zealots without actually providing any reason why makes you sound like a zealot.
Sep 03 2013
parent reply "Joakim" <joakim airpost.net> writes:
On Tuesday, 3 September 2013 at 13:25:44 UTC, deadalnix wrote:
 On Monday, 2 September 2013 at 17:39:45 UTC, Joakim wrote:
 The usual open source zealots argued with me, suggesting that 
 any closed source reference implementation would be unwelcome, 
 even if always accompanied by an open source implementation 
 that's available for free.
Asserting people are zealot without actually providing any reason why make you sound like a zealot.
I suggest you read the thread if you need a reason, it's pretty obvious.
Sep 03 2013
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Tuesday, 3 September 2013 at 15:34:31 UTC, Joakim wrote:
 On Tuesday, 3 September 2013 at 13:25:44 UTC, deadalnix wrote:
 On Monday, 2 September 2013 at 17:39:45 UTC, Joakim wrote:
 The usual open source zealots argued with me, suggesting that 
 any closed source reference implementation would be 
 unwelcome, even if always accompanied by an open source 
 implementation that's available for free.
Asserting people are zealot without actually providing any reason why make you sound like a zealot.
I suggest you read the thread if you need a reason, it's pretty obvious.
Won't help, he is a zealot too.
Sep 03 2013
parent reply "deadalnix" <deadalnix gmail.com> writes:
On Tuesday, 3 September 2013 at 15:36:52 UTC, Dicebot wrote:
 On Tuesday, 3 September 2013 at 15:34:31 UTC, Joakim wrote:
 On Tuesday, 3 September 2013 at 13:25:44 UTC, deadalnix wrote:
 On Monday, 2 September 2013 at 17:39:45 UTC, Joakim wrote:
 The usual open source zealots argued with me, suggesting 
 that any closed source reference implementation would be 
 unwelcome, even if always accompanied by an open source 
 implementation that's available for free.
Asserting people are zealot without actually providing any reason why make you sound like a zealot.
I suggest you read the thread if you need a reason, it's pretty obvious.
Won't help, he is a zealot too.
My life for Aïur !
Sep 03 2013
parent "Dicebot" <public dicebot.lv> writes:
On Tuesday, 3 September 2013 at 15:46:16 UTC, deadalnix wrote:
 My life for Aïur !
En Taro Tassadar
Sep 03 2013
prev sibling parent reply "deadalnix" <deadalnix gmail.com> writes:
On Tuesday, 3 September 2013 at 15:34:31 UTC, Joakim wrote:
 On Tuesday, 3 September 2013 at 13:25:44 UTC, deadalnix wrote:
 On Monday, 2 September 2013 at 17:39:45 UTC, Joakim wrote:
 The usual open source zealots argued with me, suggesting that 
 any closed source reference implementation would be 
 unwelcome, even if always accompanied by an open source 
 implementation that's available for free.
Asserting people are zealot without actually providing any reason why make you sound like a zealot.
I suggest you read the thread if you need a reason, it's pretty obvious.
Thinking something is obvious and so doesn't need to be demonstrated is another common trait amongst zealots.
Sep 03 2013
parent reply "Joakim" <joakim airpost.net> writes:
On Tuesday, 3 September 2013 at 15:45:48 UTC, deadalnix wrote:
 On Tuesday, 3 September 2013 at 15:34:31 UTC, Joakim wrote:
 On Tuesday, 3 September 2013 at 13:25:44 UTC, deadalnix wrote:
 On Monday, 2 September 2013 at 17:39:45 UTC, Joakim wrote:
 The usual open source zealots argued with me, suggesting 
 that any closed source reference implementation would be 
 unwelcome, even if always accompanied by an open source 
 implementation that's available for free.
Asserting people are zealot without actually providing any reason why make you sound like a zealot.
I suggest you read the thread if you need a reason, it's pretty obvious.
Thinking something is obvious so doesn't need to be demonstrated is another common traits amongst zealots.
Sure, but I did provide a demonstration: that thread. The OSS zealots repeatedly make arguments that are wrong, irrelevant, and, worst of all, completely out of left field. This is a common pathology when you have decided on your conclusion and are arguing backwards from it: your arguments don't make any sense and come out of left field.

They have decided that open source is good and closed source is bad, just like the global warming zealots, and will make silly arguments to try and justify that, even to someone like me who is trying to carve out a place for open source. You may agree with their conclusion and therefore defend their arguments, but any impartial observer wouldn't.
Sep 03 2013
next sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 09/03/2013 06:33 PM, Joakim wrote:
 Thinking something is obvious so doesn't need to be demonstrated is
 another common traits amongst zealots.
Sure, but I did provide demonstration, that thread.
That thread seems to demonstrate a failure of communication.
 The OSS zealots
 repeatedly make arguments that are wrong, irrelevant, and worst, just
 completely out of left field.  This is a common pathology when you have
 decided on your conclusion and are arguing backwards from it: your
 arguments don't make any sense and come out of left field.

 They have decided that open source is good and closed source is bad,
 just like the global warming zealots, and will make silly arguments to
 try and justify that, even to someone like me who is trying to carve out
 a place for open source.  You may agree with their conclusion and
 therefore defend their arguments, but any impartial observer wouldn't.
"Any" impartial observer would notice the personal attacks, even if that observer was completely ignorant of the discussion topic. "Any" impartial observer would interpret those as lack of a well-reasoned argument and decide to spend his time impartially observing something more interesting.
Sep 03 2013
parent reply "Joakim" <joakim airpost.net> writes:
On Tuesday, 3 September 2013 at 21:34:42 UTC, Timon Gehr wrote:
 On 09/03/2013 06:33 PM, Joakim wrote:
 Sure, but I did provide demonstration, that thread.
That thread seems to demonstrate a failure of communication.
By whom? It's pretty obviously the zealots.
 They have decided that open source is good and closed source 
 is bad,
 just like the global warming zealots, and will make silly 
 arguments to
 try and justify that, even to someone like me who is trying to 
 carve out
 a place for open source.  You may agree with their conclusion 
 and
 therefore defend their arguments, but any impartial observer 
 wouldn't.
"Any" impartial observer would notice the personal attacks, even if that observer was completely ignorant of the discussion topic. "Any" impartial observer would interpret those as lack of a well-reasoned argument and decide to spend his time impartially observing something more interesting.
I call it like I see it. An impartial observer can determine whether what you call "personal attacks" - really just labels for the usually silly or wrong tenor of their arguments, and for the kind of person who generally makes such dumb arguments - are accurate. If you want to take a long thread full of arguments about the topic, pick out a little name-calling, and then run away, clearly the argument is lost on you.

On Wednesday, 4 September 2013 at 00:25:30 UTC, deadalnix wrote:
 On Tuesday, 3 September 2013 at 16:33:55 UTC, Joakim wrote:
 Sure, but I did provide demonstration, that thread.  The OSS 
 zealots repeatedly make arguments that are wrong, irrelevant, 
 and worst, just completely out of left field.  This is a 
 common pathology when you have decided on your conclusion and 
 are arguing backwards from it: your arguments don't make any 
 sense and come out of left field.

 They have decided that open source is good and closed source 
 is bad, just like the global warming zealots, and will make 
 silly arguments to try and justify that, even to someone like 
 me who is trying to carve out a place for open source.  You 
 may agree with their conclusion and therefore defend their 
 arguments, but any impartial observer wouldn't.
You seem confused by the difference between saying something and providing conclusive evidence.
That thread _is_ conclusive evidence. If you think otherwise, you are deeply confused.
Sep 04 2013
parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 09/04/2013 11:26 AM, Joakim wrote:
 On Tuesday, 3 September 2013 at 21:34:42 UTC, Timon Gehr wrote:
 On 09/03/2013 06:33 PM, Joakim wrote:
 Sure, but I did provide demonstration, that thread.
That thread seems to demonstrate a failure of communication.
By whom? [...]
When communication fails, there is usually not a single side responsible for it. (Unless one side is trolling. Trolls are typically anonymous.)
 They have decided that open source is good and closed source is bad,
 just like the global warming zealots, and will make silly arguments to
 try and justify that, even to someone like me who is trying to carve out
 a place for open source.  You may agree with their conclusion and
 therefore defend their arguments, but any impartial observer wouldn't.
"Any" impartial observer would notice the personal attacks, even if that observer was completely ignorant of the discussion topic. "Any" impartial observer would interpret those as lack of a well-reasoned argument and decide to spend his time impartially observing something more interesting.
I call it like I see it.
Great.
 An impartial observer can determine if what
 you call "personal attacks," more like labeling of the usually silly or
 wrong tenor of their arguments
 and what kind of person generally makes such dumb arguments, are accurate.
How? Accuracy of conclusions of fallacious reasoning is mostly incidental. Consider googling "ad hominem", "association fallacy" and "fallacy of irrelevance".
 If you want to take a long thread full of arguments about the topic
 and pick out a little name-calling
 and then run away, clearly the argument is lost on you.
Frankly, I'm unimpressed. It's you who picked out the name-calling instead of the arguments when summarizing the past discussion. If any valuable arguments were part of that discussion, I'd advise picking those out instead and putting them in a coherent form.
 On Wednesday, 4 September 2013 at 00:25:30 UTC, deadalnix wrote:
 On Tuesday, 3 September 2013 at 16:33:55 UTC, Joakim wrote:
 Sure, but I did provide demonstration, that thread.  The OSS zealots
 repeatedly make arguments that are wrong, irrelevant, and worst, just
 completely out of left field.  This is a common pathology when you
 have decided on your conclusion and are arguing backwards from it:
 your arguments don't make any sense and come out of left field.

 They have decided that open source is good and closed source is bad,
 just like the global warming zealots, and will make silly arguments
 to try and justify that, even to someone like me who is trying to
 carve out a place for open source.  You may agree with their
 conclusion and therefore defend their arguments, but any impartial
 observer wouldn't.
You seem confused by the difference between saying something and providing conclusive evidence.
That thread _is_ conclusive evidence. If you think otherwise, you are deeply confused.
(Please do not mess up the threading.)

Well, if this kind of simple-minded pseudo-reasoning is to find resonance, it has to be targeted at a less critical audience.
Sep 04 2013
parent reply "Joakim" <joakim airpost.net> writes:
On Wednesday, 4 September 2013 at 13:23:19 UTC, Timon Gehr wrote:
 On 09/04/2013 11:26 AM, Joakim wrote:
 On Tuesday, 3 September 2013 at 21:34:42 UTC, Timon Gehr wrote:
 On 09/03/2013 06:33 PM, Joakim wrote:
 Sure, but I did provide demonstration, that thread.
That thread seems to demonstrate a failure of communication.
By whom? [...]
When communication fails, there is usually not a single side responsible for it. (Unless one side is trolling. Trolls are typically anonymous.)
Except that trolling has nothing to do with communication failure and one would think those zealots are the ones trolling, despite using what are presumably their real names, because of how dumb their arguments are.
 "Any" impartial observer would notice the personal attacks, 
 even if
 that observer was completely ignorant of the discussion 
 topic. "Any"
 impartial observer would interpret those as lack of a 
 well-reasoned
 argument and decide to spend his time impartially observing 
 something
 more interesting.
I call it like I see it.
Great.
Except that you then criticize me for "personal attacks" and name-calling; make up your mind.
 An impartial observer can determine if what
 you call "personal attacks," more like labeling of the usually 
 silly or
 wrong tenor of their arguments
 and what kind of person generally makes such dumb arguments, 
 are accurate.
How? Accuracy of conclusions of fallacious reasoning is mostly incidental. Consider googling "ad hominem", "association fallacy" and "fallacy of irrelevance".
I don't think you know what "incidental" means. :) In any case, if you can't see that they make several statements that are just factually wrong, I don't know what to tell you. If you are so ignorant that you don't even know the facts, there can be no discussion, which is why I bailed on that thread.
 If you want to take a long thread full of arguments about the 
 topic
 and pick out a little name-calling
 and then run away, clearly the argument is lost on you.
Frankly, I'm unimpressed. It's you who picked out the name-calling instead of arguments when summarizing the past discussion. In case any valuable arguments were part of that discussion then I'd advise to pick out those instead and put them in a coherent form.
I called them what they are, zealots, which isn't really name-calling but an accurate description, and noted one of their main dumb arguments, so I did both. I'm not going to summarize that thread for you: either read it or don't. I couldn't care less either way, because you seem to make almost as many mistakes as they do.
 On Wednesday, 4 September 2013 at 00:25:30 UTC, deadalnix 
 wrote:
 You seem confused by the difference between saying something 
 and
 providing conclusive evidence.
That thread _is_ conclusive evidence. If you think otherwise, you are deeply confused.
(Please do not mess up the threading.)
Responses to the two of you are best lumped together. I don't like it when people like you spam threads with multiple separate short responses to every other response in the thread. This is better.
 Well, if this kind of simply-minded pseudo-reasoning is to find 
 resonance, it has to be targeted at a less critical audience.
Except there was little reasoning in my above two sentences, only two statements about the other thread. The "critical audience" is not the problem, as you haven't been able to muster a "critical" response to any actual arguments in that thread. All you two do is make a bunch of dumb quips about the tone or character of the other thread, so I'll leave this "meta-discussion" here, as you two are clearly incapable of dealing with my actual arguments.
Sep 04 2013
next sibling parent Timon Gehr <timon.gehr gmx.ch> writes:
On 09/04/2013 08:00 PM, Joakim wrote:
 On Wednesday, 4 September 2013 at 13:23:19 UTC, Timon Gehr wrote:
 On 09/04/2013 11:26 AM, Joakim wrote:
 On Tuesday, 3 September 2013 at 21:34:42 UTC, Timon Gehr wrote:
 On 09/03/2013 06:33 PM, Joakim wrote:
 Sure, but I did provide demonstration, that thread.
That thread seems to demonstrate a failure of communication.
By whom? [...]
When communication fails, there is usually not a single side responsible for it. (Unless one side is trolling. Trolls are typically anonymous.)
Except that trolling has nothing to do with communication failure
Good trolling is often _indistinguishable_ from communication failure.
 ...

 "Any" impartial observer would notice the personal attacks, even if
 that observer was completely ignorant of the discussion topic. "Any"
 impartial observer would interpret those as lack of a well-reasoned
 argument and decide to spend his time impartially observing something
 more interesting.
I call it like I see it.
Great.
Except that you then criticize me
I don't criticize people, I question arguments. If you think these two things should be conflated, I beg you to reconsider.
 for "personal attacks" and name-calling, [...]
 ...
There are multiple possibilities to replace the above statement in a way I would disapprove of, eg: - "I call it like I don't see it." - "I state inevitable fact."
 An impartial observer can determine if what
 you call "personal attacks," more like labeling of the usually silly or
 wrong tenor of their arguments
 and what kind of person generally makes such dumb arguments, are
 accurate.
How? Accuracy of conclusions of fallacious reasoning is mostly incidental. Consider googling "ad hominem", "association fallacy" and "fallacy of irrelevance".
[...] what "incidental" means. :)
It means: "Occurring by chance in connection with something else." A possible reason informal reasoning makes use of heuristics is that they often work, by chance, in some evolutionarily relevant contexts.
 [...]they make several statements that are just factually
 wrong, [...]
IIRC you more or less successfully debunk some factually wrong statements. Not all of them were actually made, though.
 If you [...] don't [...] know the facts, there can be no discussion,
One of the points of a discussion is to exchange facts and to widen one's understanding of different viewpoints.
 which is why I bailed on that thread.
 ...
There are less intrusive ways of doing that.
 If you want to take a long thread full of arguments about the topic
 and pick out a little name-calling
 and then run away, clearly the argument is lost on you.
Frankly, I'm unimpressed. It's you who picked out the name-calling instead of arguments when summarizing the past discussion. In case any valuable arguments were part of that discussion then I'd advise to pick out those instead and put them in a coherent form.
I called them what they are,
As I see it it is irrelevant in a discussion how anyone may classify anyone else taking part in that discussion. It is often even irrelevant who those people are. I'm just saying that if the goal is to make one's reasoning and opinions available to a potential reader, making it inconvenient to read and seemingly irrelevant is not the way to go.
 [...] which isn't really name-calling
 but an accurate description,
:o)
 and noted one of their main [...] arguments,
  so I did both.
No point can be made by noting that one hasn't made a specific fallacious argument or by noting that somebody has defended another point poorly.
 [...]
 On Wednesday, 4 September 2013 at 00:25:30 UTC, deadalnix wrote:
 You seem confused by the difference between saying something and
 providing conclusive evidence.
That thread _is_ conclusive evidence. [...]
(Please do not mess up the threading.)
[...]
 Well, if this kind of simply-minded pseudo-reasoning is to find
 resonance, it has to be targeted at a less critical audience.
Except there was little reasoning in my above two sentences, only two statements about the other thread.
Exactly. (Or rather, one statement about the other thread and one irrelevant statement about a community member.) So a point of contention appears to be that some assume that evidence should be given in the form of reasoning or at least be accompanied by reasoning, whereas others don't?
 [...] I'll leave this "meta-discussion" here, as you two are clearly
 incapable of dealing with
Typically the ones incapable of dealing with something leave.
 my actual arguments.
What actual arguments are there? ("Go look for them yourself." is not a valid answer.)
Sep 04 2013
prev sibling parent "deadalnix" <deadalnix gmail.com> writes:
On Wednesday, 4 September 2013 at 18:00:21 UTC, Joakim wrote:
 Well, if this kind of simply-minded pseudo-reasoning is to 
 find resonance, it has to be targeted at a less critical 
 audience.
Except there was little reasoning in my above two sentences, only two statements about the other thread.
That, my friend, is called self-destruction. Now I'll have to invoke Poe's law and get out of this thread.
Sep 04 2013
prev sibling parent "deadalnix" <deadalnix gmail.com> writes:
On Tuesday, 3 September 2013 at 16:33:55 UTC, Joakim wrote:
 Sure, but I did provide demonstration, that thread.  The OSS 
 zealots repeatedly make arguments that are wrong, irrelevant, 
 and worst, just completely out of left field.  This is a common 
 pathology when you have decided on your conclusion and are 
 arguing backwards from it: your arguments don't make any sense 
 and come out of left field.

 They have decided that open source is good and closed source is 
 bad, just like the global warming zealots, and will make silly 
 arguments to try and justify that, even to someone like me who 
 is trying to carve out a place for open source.  You may agree 
 with their conclusion and therefore defend their arguments, but 
 any impartial observer wouldn't.
You seem confused by the difference between saying something and providing conclusive evidence.
Sep 03 2013
prev sibling parent "SomeDude" <lovelydear mailmetrash.com> writes:
On Sunday, 1 September 2013 at 19:44:11 UTC, Brian Schott wrote:
 On Sunday, 1 September 2013 at 18:36:39 UTC, SomeDude wrote:
 I think at this point, what D needs is a bit of commercial 
 support from a company like JetBrains or some equivalent. 
 Maybe there is now an opportunity for founding such a company, 
 one that would specialize in building professional tools 
 around the D language. I believe the language and the 
 compilers are stable enough to grow a serious business around 
 them. If we compare to what the state of C++ compilers was 
 before 2000, I believe we are much better off. And that was 
 just over a decade ago. Who knows what the state of D will be 
 in 5 years ? So yes, there is a case to be made for growing a 
 company around pro D tools, and the first company that does it 
 will grab the whole market.
It's a bit of a chicken-and-egg problem. I'd like to do this, but there would have to be several companies already using D professionally for it to be a viable business model. And for a company to invest in D, they'd probably want the tooling to already exist.
Your concern is legitimate, of course, but there have historically been many cases where small companies created a market in IT. There is clearly a demand for D tooling even if it's not huge right now, so I believe the risk is not as great as you think. Of course, one needs some good skills to be able to come up with usable tools. But when nearly every D newcomer's first question is "where are the tools?", there is a market for that.
Sep 02 2013
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2013-09-01 20:36, SomeDude wrote:


 programmers. In fact, it' the first time I hear about people complaining
 on this one. Maybe because of the generalized usage of IDEs, I guess.
As far as I know C/C++ and Objective-C/C++ are the only languages that can do this.
 I think at this point, what D needs is a bit of commercial support from
 a company like JetBrains or some equivalent.
I'm wondering what tools or tricks they have up their sleeves to be able to produce so many IDEs for so many different languages. Hey, in RubyMine they basically add a new language with every new version of the IDE:

* Ruby on Rails
* Haml
* HTML
* CSS
* Sass
* CoffeeScript
* JavaScript

And so on.

--
/Jacob Carlborg
Sep 02 2013
prev sibling parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Sun, 1 Sep 2013 23:20:37 +1000
Manu <turkeyman gmail.com> wrote:

 On 1 September 2013 17:46, Nick Sabalausky <
 SeeWebsiteToContactMe semitwist.com> wrote:
 
 On Sun, 01 Sep 2013 06:45:48 +0200
 "Kapps" <opantm2+spam gmail.com> wrote:

 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 One more thing:
 I'll just pick one language complaint from the weekend.
 It is how quickly classes became disorganised and difficult to
 navigate

 We all wanted to ability to define class member functions
 outside the class
 definition:
   class MyClass
   {
     void method();
   }

   void MyClass.method()
   {
     //...
   }

 It definitely cost us time simply trying to understand the
 class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class
 definition, you want
 to quickly see what a class has and does.
This isn't something I've found to be an issue personally, but I suppose it's a matter of what you're used to; I'm used to that being the IDE's job. That being said, perhaps .di files could help with this?
I see it as the job of doc generators.
Why complicate the issue? What's wrong with readable code?
I spent several years using C/C++ exclusively (and was happy with it at the time) and I still don't understand what's "readable" about having a class's members separate from the class itself. It's also a non-DRY maintenance PITA and one of the biggest reasons I left C/C++. I don't like complicating things, and I like readable code. That's why I find C++-style class definitions intolerable. And what's so complicated about tossing in those two little characters: -D
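To spell that out for anyone who hasn't used it: -D is dmd's built-in documentation generator, which turns /// comments into HTML with the bodies stripped out. A tiny sketch (the file and names are mine, purely illustrative):

  // point.d: compiling with `dmd -D point.d` writes point.html
  // (use -Dd to pick the output directory), documenting the API
  // without the function bodies.
  module point;

  import std.math : sqrt;

  /// A 2-D point.
  struct Point
  {
      double x, y;

      /// Euclidean distance to another point.
      double distanceTo(Point other) const
      {
          immutable dx = x - other.x, dy = y - other.y;
          return sqrt(dx * dx + dy * dy);
      }
  }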
Sep 01 2013
next sibling parent Jacob Carlborg <doob me.com> writes:
On 2013-09-01 22:27, Nick Sabalausky wrote:

 I spent several years using C/C++ exclusively (and was happy with it
 at the time) and I still don't understand what's "readable" about having
 a class's members separate from the class itself. It's also a non-DRY
 maintenance PITA and one of the biggest reasons I left C/C++.
You cannot even copy-paste the signature. In C++, default values for parameters can only be given in one place.
Sep 02 2013
prev sibling parent "PauloPinto" <pjmlp progtools.org> writes:
On Sunday, 1 September 2013 at 20:27:22 UTC, Nick Sabalausky 
wrote:
 On Sun, 1 Sep 2013 23:20:37 +1000
 Manu <turkeyman gmail.com> wrote:

 On 1 September 2013 17:46, Nick Sabalausky <
 SeeWebsiteToContactMe semitwist.com> wrote:
 
 On Sun, 01 Sep 2013 06:45:48 +0200
 "Kapps" <opantm2+spam gmail.com> wrote:

 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 One more thing:
 I'll just pick one language complaint from the weekend.
 It is how quickly classes became disorganised and 
 difficult to
 navigate

 We all wanted to ability to define class member functions
 outside the class
 definition:
   class MyClass
   {
     void method();
   }

   void MyClass.method()
   {
     //...
   }

 It definitely cost us time simply trying to understand 
 the
 class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class
 definition, you want
 to quickly see what a class has and does.
This isn't something I've found to be an issue personally, but I suppose it's a matter of what you're used to; I'm used to that being the IDE's job. That being said, perhaps .di files could help with this?
I see it as the job of doc generators.
Why complicate the issue? What's wrong with readable code?
I spent several years using C/C++ exclusively (and was happy with it at the time) and I still don't understand what's "readable" about having a class's members separate from the class itself. It's also a non-DRY maintenance PITA and one of the biggest reasons I left C/C++. I don't like complicating things, and I like readable code. That's why I find C++-style class definitions intolerable.
I also hate them. It is always a pain to get back to C and C++ land, with their separate header and implementation files, especially after being spoiled by languages that have proper module support.
Sep 06 2013
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 8/31/2013 7:05 PM, Manu wrote:
 The only compiler you can realistically use productively in windows is
 DMD-Win64, and that doesn't work out of the box.
 We needed to mess with sc.ini for quite some time to get the stars aligned such
 that it would actually compile and find the linker+libs.

 Walter: DMD needs to internally detect installations of various versions of
 VisualStudio, and either 'just work', or amend sc.ini on its own. Or the
 installer needs to amend sc.ini. Either way, leaving it to a user to fiddle
with
 an ini file just isn't acceptable. We had to google solutions to this problem,
 and even then, we had trouble with the paths we added to sc.ini; are spaces
 acceptable? Do they have quites around them?...
 I might also suggest that Microsoft supplied (ie, 'standard'), libraries should
 be automatically detected and path entries added in there too:
    C:\Program Files (x86)\Microsoft SDKs\...
    C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\...
 These are on basically every windows developers machine, and each of us had to
 configure them ourselves.
The default sc.ini contains:
-----------------------------
[Version]
version=7.51 Build 020

[Environment]
LIB="%@P%\..\lib";\dm\lib
DFLAGS="-I%@P%\..\..\src\phobos" "-I%@P%\..\..\src\druntime\import"
LINKCMD=%@P%\link.exe
LINKCMD64=%VCINSTALLDIR%bin\amd64\link.exe
VCINSTALLDIR=%VCINSTALLDIR%
WindowsSdkDir=%WindowsSdkDir%
----------------------------------

When I installed VC 2010, it set the environment variables VCINSTALLDIR and WindowsSdkDir. Then, the default sc.ini should "just work".

What went wrong, specifically?
Sep 01 2013
next sibling parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
01-Sep-2013 11:42, Walter Bright wrote:
 On 8/31/2013 7:05 PM, Manu wrote:
 The default sc.ini contains:
 -----------------------------
 [Version]
 version=7.51 Build 020

 [Environment]
 LIB="% P%\..\lib";\dm\lib
 DFLAGS="-I% P%\..\..\src\phobos" "-I% P%\..\..\src\druntime\import"
 LINKCMD=% P%\link.exe
 LINKCMD64=%VCINSTALLDIR%bin\amd64\link.exe
 VCINSTALLDIR=%VCINSTALLDIR%
 WindowsSdkDir=%WindowsSdkDir%
 ----------------------------------

 When I installed VC 2010, it set the environment variables VCINSTALLDIR
 and WindowsSdkDir. Then, the default sc.ini should "just work".

 What went wrong, specifically?
This is hopeless. It should try to detect multiple versions, as different versions have different folder structures. To get it to work with, say, VS11 I had to kill VCINSTALLDIR & WindowsSdkDir and set the LIB/LINKCMD paths by hand in [Environment64]. Whatever magic setup code there is for VS10/WinSDK7, it doesn't work with VC11/WinSDK8.

--
Dmitry Olshansky
Sep 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 2:25 AM, Dmitry Olshansky wrote:
 To get it to work with say VS11 I had to kill VCINSTALLDIR & WindowsSDKDir and
 set LIB/LINKCMD paths by hand in Environement64.
 Whatever magic setup code there is for VS10/WinSDK7 it doesn't work with
 VC11/WinSDK8.
What environment variables did VS11 set?
Sep 01 2013
next sibling parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
01-Sep-2013 21:24, Walter Bright wrote:
 On 9/1/2013 2:25 AM, Dmitry Olshansky wrote:
 To get it to work with say VS11 I had to kill VCINSTALLDIR &
 WindowsSDKDir and
 set LIB/LINKCMD paths by hand in Environement64.
 Whatever magic setup code there is for VS10/WinSDK7 it doesn't work with
 VC11/WinSDK8.
What environment variables did VS11 set?
(You can try downloading the Express editions, as they are free.)

The only one I see is VS110COMNTOOLS, which was defined to
C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\Tools\

What currently works for me is:

[Environment64]
PATH=C:\"Program Files (x86)"\"Microsoft Visual Studio 11.0"\Common7\IDE;%PATH%
LIB=C:\"Program Files (x86)"\"Windows Kits"\8.0\Lib\win8\um\x64;C:\"Program Files (x86)"\"Microsoft Visual Studio 11.0"\VC\lib\amd64;"%@P%\..\lib"
WindowsSdkDir=
LINKCMD64=C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\bin\amd64\link.exe

There are 2 distinct things - libraries are found in the Platform SDK (or rather the Windows SDK these days), and version 8 differs from 7. You can download and install it (any version) - it's a free download. Compilers/linkers are installed with Visual Studio and are dealt with separately.

I still have this error when trying to compile with debug info:

LINK : fatal error LNK1101: incorrect MSPDB110.DLL version; recheck installation of this product

No idea whose fault it is (DMD vs the MS linker vs something in my paths/env).

+ I have to redefine everything in order to compile phobos/druntime, as these are hardcoded to VS10 (again!), thus I don't usually test/build x64 Phobos. It's far less hassle for me to keep around an x64 Linux virtual box for testing.

--
Dmitry Olshansky
Sep 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 11:01 AM, Dmitry Olshansky wrote:
 The only one I see is
 VS110COMNTOOLS
Very strange. When I click on the shortcut "Visual Studio x64 Win64..." to open a command prompt, it sets a veritable blizzard of environment variables.
 that was defined to
 C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\Tools\

 What currently works for me is:
 [Environment64]
 PATH=C:\"Program Files (x86)"\"Microsoft Visual Studio 11.0"\Common7\IDE;%PATH%
 LIB=C:\"Program Files (x86)"\"Windows Kits"\8.0\Lib\win8\um\x64;C:\"Program
 Files (x86)"\"Microsoft Visual Studio 11.0"\VC\lib\amd64;"% P%\..\lib"
 WindowsSdkDir=
 LINKCMD64=C:\Program Files (x86)\Microsoft Visual Studio
11.0\VC\bin\amd64\link.exe
I'm quite sure that VS20?? sets an environment variable for "Windows Kits" somewhere.
 There are 2 distinct things - libraries are ound in Platform SDK (or rather
 WindowsSDK these days) and version 8 differs from 7. You can download and
 install it (any version) - it's a free download.
 Compiler/linkers are installed to VisualStudio and are dealt with separately.

 I still have this error when trying to comompile with debug info:

 LINK : fatal error LNK1101: incorrect MSPDB110.DLL version; recheck
installation
 of this product

 No idea whose fault is it (DMD vs MS linker vs something in my paths/env).
DMD does not load any DLLs.
 + I have to redefine everything in order to compile phobos/druntime as these
are
 hardcoded to VS10 (again !) thus I don't usually test/build x64 Phobos. It's
far
 less hassle for me to keep around x64 Linux virtual box for testing.
When I compile phobos, for example, I use the following makefile (named "makefile"). It is very handy for resetting the environment variables used by win32.mak.
--------------
DIR=\dmd2

#DMD=$(DIR)\windows\bin\dmd
DMD=..\dmd

#DOC=..\..\html\d\phobos
DOC=..\doc\phobos

STDDOC=std.ddoc
DRUNTIME=..\druntime
MODEL=32

MACS=DMD=$(DMD) DOC=$(DOC) STDDOC=$(STDDOC) DIR=$(DIR) DRUNTIME=$(DRUNTIME)

targets :
	make -fwin$(MODEL).mak $(MACS)

test :
	make -fwin$(MODEL).mak $(MACS) test

unittest :
	make -fwin$(MODEL).mak $(MACS) unittest

cov :
	make -fwin$(MODEL).mak $(MACS) cov

clean :
	make -fwin$(MODEL).mak $(MACS) clean

html :
	make -fwin$(MODEL).mak $(MACS) html

cleanhtml:
	make -fwin$(MODEL).mak $(MACS) cleanhtml

zip :
	make -fwin$(MODEL).mak $(MACS) zip

install :
	make -fwin$(MODEL).mak $(MACS) install

cov :
	make -fwin$(MODEL).mak $(MACS) cov

time :
	make -fwin$(MODEL).mak $(MACS) clean timer
	make -fwin$(MODEL).mak $(MACS)
Sep 01 2013
parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
01-Sep-2013 22:44, Walter Bright wrote:
 On 9/1/2013 11:01 AM, Dmitry Olshansky wrote:
 The only one I see is
 VS110COMNTOOLS
Very strange. When I click on the shortcut "Visual Studio x64 Win64..." to open a command prompt, it sets a veritable blizzard of environment variables.
Ah, that command prompt...

VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\
WindowsSdkDir=C:\Program Files (x86)\Windows Kits\8.0\

However, the said Windows Kits folder has a tree like this:
C:\Program Files (x86)\Windows Kits\8.0\Lib\win8\um\{arm,x86,amd64}

That 'win8\um' sub-folder is what must be throwing it off. There's nothing like that in the Win7 SDK.
 There are 2 distinct things - libraries are ound in Platform SDK (or
 rather
 WindowsSDK these days) and version 8 differs from 7. You can download and
 install it (any version) - it's a free download.
 Compiler/linkers are installed to VisualStudio and are dealt with
 separately.

 I still have this error when trying to comompile with debug info:

 LINK : fatal error LNK1101: incorrect MSPDB110.DLL version; recheck
 installation
 of this product

 No idea whose fault is it (DMD vs MS linker vs something in my
 paths/env).
DMD does not load any DLLs.
Okay, then it could very well just be me messing with paths/env. OT: the MS x64 compiler can also be had for free (it comes with the SDK I installed, not VS Express).
 + I have to redefine everything in order to compile phobos/druntime as
 these are
 hardcoded to VS10 (again !) thus I don't usually test/build x64
 Phobos. It's far
 less hassle for me to keep around x64 Linux virtual box for testing.
When I compile phobos, for example, I use the following makefile (named "makefile"). It is very handy for resetting the environment variables used by win32.mak. --------------
Thanks! I could borrow that. That said, I don't much like makefiles at all.

--
Dmitry Olshansky
Sep 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 11:58 AM, Dmitry Olshansky wrote:
 01-Sep-2013 22:44, Walter Bright wrote:
 On 9/1/2013 11:01 AM, Dmitry Olshansky wrote:
 The only one I see is
 VS110COMNTOOLS
Very strange. When I click on the shortcut "Visual Studio x64 Win64..." to open a command prompt, it sets a veritable blizzard of environment variables.
Ah, that command prompt... VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\ WindowsSdkDir=C:\Program Files (x86)\Windows Kits\8.0\ However the said windows kit folder has tree like this: C:\Program Files (x86)\Windows Kits\8.0\Lib\win8\um\{arm,x86,amd64} That 'win8\um' sub-folder is what must be throwing it off. It's nothing like that in Win7 SDK.
 There are 2 distinct things - libraries are ound in Platform SDK (or
 rather
 WindowsSDK these days) and version 8 differs from 7. You can download and
 install it (any version) - it's a free download.
 Compiler/linkers are installed to VisualStudio and are dealt with
 separately.

 I still have this error when trying to comompile with debug info:

 LINK : fatal error LNK1101: incorrect MSPDB110.DLL version; recheck
 installation
 of this product

 No idea whose fault is it (DMD vs MS linker vs something in my
 paths/env).
DMD does not load any DLLs.
Okay, then it could very well be just me messing with paths/env. OT: MS x64 compiler too could be had for free (it come with SDK I installed not VS express)
 + I have to redefine everything in order to compile phobos/druntime as
 these are
 hardcoded to VS10 (again !) thus I don't usually test/build x64
 Phobos. It's far
 less hassle for me to keep around x64 Linux virtual box for testing.
When I compile phobos, for example, I use the following makefile (named "makefile"). It is very handy for resetting the environment variables used by win32.mak. --------------
Thanks! I could borrow that. With that said I don't quite like Makefiles at all.
I think the most practical thing at the moment is to:

1. put comments in sc.ini explaining it better

2. replace the hardcoded tails that link.c appends to the sc.ini values with new settings in sc.ini

3. provide commented-out example settings for each variant of VS as we discover what they should be
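For point 3, and going by the [Environment64] values Dmitry posted above for VS 2012 with the Windows 8 SDK, such a commented-out block might look roughly like this (the paths are from his machine and purely an example; adjust them to the actual install, and the ';' comment style is assumed to match the rest of sc.ini):

  [Environment64]
  ; Example settings for Visual Studio 2012 / Windows SDK 8.0; uncomment
  ; and adjust the paths to match the actual installation:
  ;VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\
  ;WindowsSdkDir=C:\Program Files (x86)\Windows Kits\8.0\
  ;LINKCMD64=%VCINSTALLDIR%bin\amd64\link.exe
  ;LIB=%WindowsSdkDir%Lib\win8\um\x64;%VCINSTALLDIR%lib\amd64;"%@P%\..\lib"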
Sep 01 2013
next sibling parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 9/1/13, Walter Bright <newshound2 digitalmars.com> wrote:
 3. provide commented-out example settings for each variant of VS as we
 discover
 what they should be
Imagine if, after installing Windows 7, you were provided with a boot.ini file which didn't work, but hey, it had commented-out examples on how to make the system properly boot. You weren't even given info on which partition the system was installed to; the manual says "Check your hard drive vendor's instructions". This isn't a 1-click install, and it's still a usability issue.

I know you seem to hate automation (because it sometimes fails), but having each individual person waste hours on initial setup is much worse than having a script which can potentially fail. At least the script can be fixed once the bug is reported, and it can be tested.
Sep 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 12:38 PM, Andrej Mitrovic wrote:
 I know you seem to hate automation (because it sometimes fails), but
 having each individual person waste hours on initial setup is much
 worse than having a script which can potentially fail. At least the
 script can be fixed once the bug is reported, and it can be tested.
Back in the bad old Datalight C days, I relied on the Microsoft linker which came on the DOS system disks. Unfortunately, Microsoft constantly changed it, even with the same version of DOS. Worse, numerous other Microsoft products came with yet other versions of LINK.EXE. All those linkers behaved differently. It was a frackin' nightmare to support customers with them. I used to have floppy disks packed full of just different versions of LINK.EXE.

This drove me to get our own linker (BLINK.EXE). While it wasn't perfect, either, at least I could actually fix problems with it rather than throwing up my hands in rage being unable to control the situation.

There's no way we can automate VC 2014 so its unpredictable configuration will work. All we can do is react after the fact.

I don't hate automation. sc.ini works out of the box with the default install of VS 2010.

What I need from you guys and your different VS installs is, for each one, a bug report with what is necessary to get it installed. Then we can add it to the modern version of my floppy disk "linker collection".
Sep 01 2013
next sibling parent reply "Brad Anderson" <eco gnuk.net> writes:
On Sunday, 1 September 2013 at 20:00:19 UTC, Walter Bright wrote:
 On 9/1/2013 12:38 PM, Andrej Mitrovic wrote:
 I know you seem to hate automation (because it sometimes 
 fails), but
 having each individual person waste hours on initial setup is 
 much
 worse than having a script which can potentially fail. At 
 least the
 script can be fixed once the bug is reported, and it can be 
 tested.
Back in the bad old Datalight C days, I relied on the Microsoft linker which came on the DOS system disks. Unfortunately, Microsoft constantly changed it, even with the same version of DOS. Worse, numerous other Microsoft products came with yet other versions of LINK.EXE. All those linkers behaved differently. It was a frackin' nightmare to support customers with them. I used to have floppy disks packed full of just different versions of LINK.EXE.

This drove me to get our own linker (BLINK.EXE). While it wasn't perfect, either, at least I could actually fix problems with it rather than throwing up my hands in rage being unable to control the situation.

There's no way we can automate VC 2014 so its unpredictable configuration will work. All we can do is react after the fact.

I don't hate automation. sc.ini works out of the box with the default install of VS 2010.

What I need from you guys and your different VS installs is, for each one, a bug report with what is necessary to get it installed. Then we can add it to the modern version of my floppy disk "linker collection".
This can be automated easily enough. The installer can detect what versions of VS are installed and either set an environment variable or modify sc.ini (your choice). It could probably be made forward compatible, since Microsoft has been using basically the same paths and registry keys for every new release since at least VS 2005.

Even if they make a release that the D installer doesn't properly detect, fixing that is a 5-minute job by one person (probably me), versus requiring every user to spend a half hour or more (6 hours in Manu's experience) trying to figure out how to get 64-bit working.
Sep 01 2013
parent reply Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 9/1/13, Brad Anderson <eco gnuk.net> wrote:
 What I need from you guys and your different VS installs is,
 for each one, a bug report with what is necessary to get it
 installed. Then we can add it to the modern version of my
 floppy disk "linker collection".
This can be automated easily enough. The installer can detect what versions of VS are installed and either set an environment variable or modify sc.ini (your choice). It could probably be made forward compatible since Microsoft has been using basically the same paths and registry keys for every new release since at least VS 2005.
Yes, and VS comes out what, maybe once a year? This is possible to implement and maintain. If it weren't, then installing VS plugins would be impossible, but as far as I know it mostly works out of the box (hell, VisualD does it, so why can't we do something as simple as detect VS paths?)
Sep 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 1:56 PM, Andrej Mitrovic wrote:
 On 9/1/13, Brad Anderson <eco gnuk.net> wrote:
 What I need from you guys and your different VS installs is,
 for each one, a bug report with what is necessary to get it
 installed. Then we can add it to the modern version of my
 floppy disk "linker collection".
This can be automated easily enough. The installer can detect what versions of VS are installed and either set an environment variable or modify sc.ini (your choice). It could probably be made forward compatible since Microsoft has been using basically the same paths and registry keys for every new release since at least VS 2005.
Yes, and VS comes out what, maybe once a year? This is possible to implement and maintain. If it weren't, then installing VS plugins would be impossible, but as far as I know it mostly works out of the box (hell, VisualD does it, so why can't we do something as simple as detect VS paths?)
Pull requests are, of course, welcome.
Sep 01 2013
parent reply "Brad Anderson" <eco gnuk.net> writes:
On Sunday, 1 September 2013 at 21:08:24 UTC, Walter Bright wrote:
 On 9/1/2013 1:56 PM, Andrej Mitrovic wrote:
 On 9/1/13, Brad Anderson <eco gnuk.net> wrote:
 What I need from you guys and your different VS installs is,
 for each one, a bug report with what is necessary to get it
 installed. Then we can add it to the modern version of my
 floppy disk "linker collection".
This can be automated easily enough. The installer can detect what versions of VS are installed and either set an environment variable or modify sc.ini (your choice). It could probably be made forward compatible since Microsoft has been using basically the same paths and registry keys for every new release since at least VS 2005.
Yes, and VS comes out what, maybe once a year? This is possible to implement and maintain. If it weren't, then installing VS plugins would be impossible, but as far as I know it mostly works out of the box (hell, VisualD does it, so why can't we do something as simple as detect VS paths?)
Pull requests are, of course, welcome.
https://github.com/D-Programming-Language/installer/pull/22

It can detect (through registry keys) the paths of Visual C++ 10, 11, and 12 (2010, 2012, and 2013) and Windows SDK 7.0A, 7.1A, 8.0, and 8.1. It modifies the sc.ini installed from the zip file by substituting the defaults with the detected paths (which makes it important that I have accurately reflected what the sc.ini defaults will be).

I only have VC 10 to test with myself (and lack the disk space to have concurrent installations of all 3). I installed the Windows SDK 7.0A (comes with VC 10), 8.0, and 8.1, though I couldn't actually use 8.0 and 8.1 successfully because of the path tail issue Dmitry pointed out (and you opened a pull request to fix). The combination of VC 10 and SDK 8.1 did not work (link errors) but VC 10 with 7.0A worked perfectly. I imagine you need to pair the SDK with the version of VC that was released around the same time.
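For anyone curious what the detection boils down to, here is a rough sketch in D of the kind of registry lookup involved. This is not the installer's actual code (that lives in the installer script in the pull request above); the SxS\VC7 key is an assumption about where VC publishes its install directory, and std.windows.registry is used here only for brevity, so treat it as illustrative:

// Windows-only sketch (uses Phobos's std.windows.registry).
import std.stdio;
import std.windows.registry;

void main()
{
    try
    {
        // Assumed layout: each installed VC version publishes its directory as a
        // value under this key, e.g. "11.0" -> C:\...\Microsoft Visual Studio 11.0\VC\
        // (a 32-bit process on 64-bit Windows may be redirected to the Wow6432Node view).
        Key vc7 = Registry.localMachine.getKey(
            `SOFTWARE\Microsoft\VisualStudio\SxS\VC7`);

        foreach (ver; ["10.0", "11.0", "12.0"])
        {
            try
            {
                writefln("VC %s found at: %s", ver, vc7.getValue(ver).value_SZ);
            }
            catch (RegistryException e)
            {
                writefln("VC %s not found", ver);
            }
        }
    }
    catch (RegistryException e)
    {
        writeln("No Visual C++ installation detected in the registry.");
    }
}

The installer then substitutes whichever paths it finds into the sc.ini defaults, as described above, rather than printing them.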
Sep 01 2013
next sibling parent "Brad Anderson" <eco gnuk.net> writes:
On Monday, 2 September 2013 at 01:41:51 UTC, Brad Anderson wrote:
 On Sunday, 1 September 2013 at 21:08:24 UTC, Walter Bright 
 wrote:
 On 9/1/2013 1:56 PM, Andrej Mitrovic wrote:
 On 9/1/13, Brad Anderson <eco gnuk.net> wrote:
 What I need from you guys and your different VS installs is,
 for each one, a bug report with what is necessary to get it
 installed. Then we can add it to the modern version of my
 floppy disk "linker collection".
This can be automated easily enough. The installer can detect what versions of VS are installed and either set an environment variable or modify sc.ini (your choice). It could probably be made forward compatible since Microsoft has been using basically the same paths and registry keys for every new release since at least VS 2005.
Yes, and VS comes out what, maybe once a year? This is possible to implement and maintain. If it weren't, then installing VS plugins would be impossible, but as far as I know it mostly works out of the box (hell, VisualD does it, so why can't we do something as simple as detect VS paths?)
Pull requests are, of course, welcome.
https://github.com/D-Programming-Language/installer/pull/22

It can detect (through registry keys) the paths of Visual C++ 10, 11, and 12 (2010, 2012, and 2013) and Windows SDK 7.0A, 7.1A, 8.0, and 8.1. It modifies the sc.ini installed from the zip file by substituting the defaults with the detected paths (which makes it important that I have accurately reflected what the sc.ini defaults will be).

I only have VC 10 to test with myself (and lack the disk space to have concurrent installations of all 3). I installed the Windows SDK 7.0A (comes with VC 10), 8.0, and 8.1, though I couldn't actually use 8.0 and 8.1 successfully because of the path tail issue Dmitry pointed out (and you opened a pull request to fix). The combination of VC 10 and SDK 8.1 did not work (link errors) but VC 10 with 7.0A worked perfectly. I imagine you need to pair the SDK with the version of VC that was released around the same time.
Err, ignore the last sentence. I forgot that it was probably the path tail issue I was hitting.
Sep 01 2013
prev sibling next sibling parent reply Manu <turkeyman gmail.com> writes:
On 2 September 2013 11:41, Brad Anderson <eco gnuk.net> wrote:

 On Sunday, 1 September 2013 at 21:08:24 UTC, Walter Bright wrote:

 On 9/1/2013 1:56 PM, Andrej Mitrovic wrote:

 On 9/1/13, Brad Anderson <eco gnuk.net> wrote:

 What I need from you guys and your different VS installs is,
 for each one, a bug report with what is necessary to get it
 installed. Then we can add it to the modern version of my
 floppy disk "linker collection".
This can be automated easily enough. The installer can detect what versions of VS are installed and either set an environment variable or modify sc.ini (your choice). It could probably be made forward compatible since Microsoft has been using basically the same paths and registry keys for every new release since at least VS 2005.
Yes, and VS comes out what, maybe once a year? This is possible to implement and maintain. If it weren't, then installing VS plugins would be impossible, but as far as I know it mostly works out of the box (hell, VisualD does it, so why can't we do something as simple as detect VS paths?)
Pull requests are, of course, welcome.
https://github.com/D-Programming-Language/installer/pull/22

It can detect (through registry keys) the paths of Visual C++ 10, 11, and 12 (2010, 2012, and 2013) and Windows SDK 7.0A, 7.1A, 8.0, and 8.1. It modifies the sc.ini installed from the zip file by substituting the defaults with the detected paths (which makes it important that I have accurately reflected what the sc.ini defaults will be).

I only have VC 10 to test with myself (and lack the disk space to have concurrent installations of all 3). I installed the Windows SDK 7.0A (comes with VC 10), 8.0, and 8.1, though I couldn't actually use 8.0 and 8.1 successfully because of the path tail issue Dmitry pointed out (and you opened a pull request to fix). The combination of VC 10 and SDK 8.1 did not work (link errors) but VC 10 with 7.0A worked perfectly. I imagine you need to pair the SDK with the version of VC that was released around the same time.
Huzzah! Give the man a beer! :)

How about the DirectX SDK?
http://www.microsoft.com/en-us/download/details.aspx?id=6812
It's super standard as well for anyone working on multimedia software.

It has an environment variable on my machine:
  DXSDK_DIR = C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\
Sep 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 10:01 PM, Manu wrote:
 How about the DirectX SDK?
 http://www.microsoft.com/en-us/download/details.aspx?id=6812
 It's super standard as well for anyone working on multimedia software.
 It has an environment variable on my machine: DXSDK_DIR = C:\Program Files
 (x86)\Microsoft DirectX SDK (June 2010)\
How about:

LIBPATH64=%VCINSTALLDIR%lib\amd64;%WindowsSdkDir%lib\x64;%DXSDK_DIR%

in your sc.ini?
Sep 01 2013
parent Manu <turkeyman gmail.com> writes:
On 2 September 2013 16:05, Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/1/2013 10:01 PM, Manu wrote:

 How about the DirectX SDK?
 http://www.microsoft.com/en-us/download/details.aspx?id=6812
 It's super standard as well for anyone working on multimedia software.
 It has an environment variable on my machine: DXSDK_DIR = C:\Program Files
 (x86)\Microsoft DirectX SDK (June 2010)\
How about:

LIBPATH64=%VCINSTALLDIR%lib\amd64;%WindowsSdkDir%lib\x64;%DXSDK_DIR%

in your sc.ini?
Yeah, that's what I did. I'm just suggesting it should be detected and added by the installer.
Sep 02 2013
prev sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 6:41 PM, Brad Anderson wrote:
 On Sunday, 1 September 2013 at 21:08:24 UTC, Walter Bright wrote:
 Pull requests are, of course, welcome.
https://github.com/D-Programming-Language/installer/pull/22
It's just like ordering pizza!!! Brad, you Da Man!
Sep 01 2013
prev sibling parent Manu <turkeyman gmail.com> writes:
On 2 September 2013 06:00, Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/1/2013 12:38 PM, Andrej Mitrovic wrote:

 I know you seem to hate automation (because it sometimes fails), but
 having each individual person waste hours on initial setup is much
 worse than having a script which can potentially fail. At least the
 script can be fixed once the bug is reported, and it can be tested.
Back in the bad old Datalight C days, I relied on the Microsoft linker which came on the DOS system disks. Unfortunately, Microsoft constantly changed it, even with the same version of DOS. Worse, numerous other Microsoft products came with yet other versions of LINK.EXE. All those linkers behaved differently. It was a frackin' nightmare to support customers with them. I used to have floppy disks packed full of just different versions of LINK.EXE.

This drove me to get our own linker (BLINK.EXE). While it wasn't perfect, either, at least I could actually fix problems with it rather than throwing up my hands in rage being unable to control the situation.

There's no way we can automate VC 2014 so its unpredictable configuration will work. All we can do is react after the fact.
MS always release preview versions to toolchain/workflow vendors though. You just need to register interest to get access afaik.

 I don't hate automation. sc.ini works out of the box with the default
 install of VS 2010.

I don't think my 2010 install is non-standard in any way...

 What I need from you guys and your different VS installs is, for each one,
 a bug report with what is necessary to get it installed. Then we can add it
 to the modern version of my floppy disk "linker collection".

I posted my sc.ini in a prior post. I'll try and get Simon's from him, since he was on 2012.
Sep 01 2013
prev sibling next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 12:21 PM, Walter Bright wrote:
 2. replace the hardcoded tails that link.c appends to the sc.ini values with
new
 settings in sc.ini
https://github.com/D-Programming-Language/dmd/pull/2509
Sep 01 2013
prev sibling parent Manu <turkeyman gmail.com> writes:
On 2 September 2013 05:21, Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/1/2013 11:58 AM, Dmitry Olshansky wrote:

 01-Sep-2013 22:44, Walter Bright writes:

 On 9/1/2013 11:01 AM, Dmitry Olshansky wrote:

 The only one I see is
 VS110COMNTOOLS
Very strange. When I click on the shortcut "Visual Studio x64 Win64..." to open a command prompt, it sets a veritable blizzard of environment variables.
Ah, that command prompt...

VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\
WindowsSdkDir=C:\Program Files (x86)\Windows Kits\8.0\

However, the said Windows Kits folder has a tree like this:
C:\Program Files (x86)\Windows Kits\8.0\Lib\win8\um\{arm,x86,amd64}

That 'win8\um' sub-folder is what must be throwing it off. There is nothing like that in the Win7 SDK.

 There are 2 distinct things - libraries are found in the Platform SDK (or
 rather
 WindowsSDK these days) and version 8 differs from 7. You can download
 and
 install it (any version) - it's a free download.
 Compiler/linkers are installed to VisualStudio and are dealt with
 separately.

 I still have this error when trying to compile with debug info:

 LINK : fatal error LNK1101: incorrect MSPDB110.DLL version; recheck
 installation
 of this product

 No idea whose fault it is (DMD vs MS linker vs something in my
 paths/env).
DMD does not load any DLLs.
Okay, then it could very well be just me messing with paths/env.

OT: The MS x64 compiler can also be had for free (it comes with the SDK I installed, not VS Express).

 + I have to redefine everything in order to compile phobos/druntime as
 these are
 hardcoded to VS10 (again !) thus I don't usually test/build x64
 Phobos. It's far
 less hassle for me to keep around x64 Linux virtual box for testing.
When I compile phobos, for example, I use the following makefile (named "makefile"). It is very handy for resetting the environment variables used by win32.mak.
--------------
Thanks! I could borrow that. With that said I don't quite like Makefiles at all.
I think the most practical thing at the moment is to:

1. put comments in sc.ini explaining it better
2. replace the hardcoded tails that link.c appends to the sc.ini values with new settings in sc.ini
3. provide commented-out example settings for each variant of VS as we discover what they should be
And if we still rely on sc.ini to get it right:

4. As the final step of the DMD installer, open sc.ini in notepad for the user's approval.
Sep 01 2013
prev sibling parent Manu <turkeyman gmail.com> writes:
There are only 2 visual studio related environment variables on my machine:

VS100COMNTOOLS = C:\Program Files (x86)\Microsoft Visual Studio
10.0\Common7\Tools\
VS110COMNTOOLS = C:\Program Files (x86)\Microsoft Visual Studio
11.0\Common7\Tools\

But in general, I reckon it's probably bad practice to rely on environment
variables in Windows. They're usually a complete mess, and hard to manage.


On 2 September 2013 03:24, Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/1/2013 2:25 AM, Dmitry Olshansky wrote:

 To get it to work with say VS11 I had to kill VCINSTALLDIR &
 WindowsSDKDir and
 set LIB/LINKCMD paths by hand in Environment64.
 Whatever magic setup code there is for VS10/WinSDK7 it doesn't work with
 VC11/WinSDK8.
What environment variables did VS11 set?
Sep 01 2013
prev sibling next sibling parent reply "Simen Kjaeraas" <simen.kjaras gmail.com> writes:
On Sun, 01 Sep 2013 09:42:37 +0200, Walter Bright  
<newshound2 digitalmars.com> wrote:

 On 8/31/2013 7:05 PM, Manu wrote:
 The only compiler you can realistically use productively in windows is
 DMD-Win64, and that doesn't work out of the box.
 We needed to mess with sc.ini for quite some time to get the stars  
 aligned such
 that it would actually compile and find the linker+libs.

 Walter: DMD needs to internally detect installations of various  
 versions of
 VisualStudio, and either 'just work', or amend sc.ini on its own. Or the
 installer needs to amend sc.ini. Either way, leaving it to a user to  
 fiddle with
 an ini file just isn't acceptable. We had to google solutions to this  
 problem,
 and even then, we had trouble with the paths we added to sc.ini; are  
 spaces
 acceptable? Do they have quotes around them?...
 I might also suggest that Microsoft supplied (ie, 'standard'),  
 libraries should
 be automatically detected and path entries added in there too:
    C:\Program Files (x86)\Microsoft SDKs\...
    C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\...
 These are on basically every windows developers machine, and each of us  
 had to
 configure them ourselves.
The default sc.ini contains:
-----------------------------
[Version]
version=7.51 Build 020

[Environment]
LIB="% P%\..\lib";\dm\lib
DFLAGS="-I% P%\..\..\src\phobos" "-I% P%\..\..\src\druntime\import"
LINKCMD=% P%\link.exe
LINKCMD64=%VCINSTALLDIR%bin\amd64\link.exe
VCINSTALLDIR=%VCINSTALLDIR%
WindowsSdkDir=%WindowsSdkDir%
----------------------------------

When I installed VC 2010, it set the environment variables VCINSTALLDIR and WindowsSdkDir. Then, the default sc.ini should "just work".

What went wrong, specifically?
Like Dmitry said, it might work for 2010, but it certainly does not 'just work' for 2012 and 2013. I've probably spent twelve hours setting up DMD Win64 on my two confusers. -- Simen
Sep 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 2:46 AM, Simen Kjaeraas wrote:
 Like Dmitry said, it might work for 2010, but it certainly does not 'just
 work' for 2012 and 2013. I've probably spent twelve hours setting up DMD
 Win64 on my two confusers.
I need specifics. Again, what environment variables are setup by these other VS's? What did you put in your sc.ini? I don't have multiple VS installs.
Sep 01 2013
parent Manu <turkeyman gmail.com> writes:
On 2 September 2013 03:30, Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/1/2013 2:46 AM, Simen Kjaeraas wrote:

 Like Dmitry said, it might work for 2010, but it certainly does not 'just
 work' for 2012 and 2013. I've probably spent twelve hours setting up DMD
 Win64 on my two confusers.
I need specifics. Again, what environment variables are setup by these other VS's? What did you put in your sc.ini? I don't have multiple VS installs.
My sc.ini looks like this, I also broke dmd2/windows/lib into lib and lib64:

[Version]
version=7.51 Build 020

[Environment64]
LIB="c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\lib\amd64";"c:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Lib\x64";"C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\Lib\x64";"% P%\..\lib64"
DFLAGS=-m64 "-I% P%\..\..\src\phobos" "-I% P%\..\..\src\druntime\import"
LINKCMD64=%VCINSTALLDIR%bin\amd64\link.exe

[Environment]
LIB="% P%\..\lib";\dm\lib
VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\
DFLAGS=-m64 "-I% P%\..\..\src\phobos" "-I% P%\..\..\src\druntime\import"
LINKCMD=% P%\link.exe
LINKCMD64=%VCINSTALLDIR%bin\amd64\link.exe
WindowsSdkDir=C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A
Sep 01 2013
prev sibling next sibling parent reply Manu <turkeyman gmail.com> writes:
Hmmm, I found details on the net that recommended adding an [Environment64]
section, which we did.

I don't seem to have VCINSTALLDIR or WindowsSdkDir variables on my system
:/ .. that said, VC obviously works on my machine.
It also seems potentially problematic that a variable would define a single
install directory, since it's pretty common that programmers have multiple
versions of VS on their machines.
I have VS2010 and VS2012 on my machine. Simon had VS2012 and VS2013.

I was also thinking it might be a mistake to keep phobos64.lib in the same
folder as the 32bit omf libs. If paths are wrong, the link errors will only
be understood by a programmer who understands compilers and lib/object
formats in depth.
Perhaps you should put them in a parallel lib64 directory, and hook that up
in [Environment64] (I did that on my machine while trying to isolate
problems and work out where paths were coming from)?

I reckon you should look into hooking up DirectX SDK pathing by default
too, since it's so common for basically any multimedia app.


On 1 September 2013 17:42, Walter Bright <newshound2 digitalmars.com> wrote:

 On 8/31/2013 7:05 PM, Manu wrote:

 The only compiler you can realistically use productively in windows is
 DMD-Win64, and that doesn't work out of the box.
 We needed to mess with sc.ini for quite some time to get the stars
 aligned such
 that it would actually compile and find the linker+libs.

 Walter: DMD needs to internally detect installations of various versions
 of
 VisualStudio, and either 'just work', or amend sc.ini on its own. Or the
 installer needs to amend sc.ini. Either way, leaving it to a user to
 fiddle with
 an ini file just isn't acceptable. We had to google solutions to this
 problem,
 and even then, we had trouble with the paths we added to sc.ini; are
 spaces
 acceptable? Do they have quotes around them?...
 I might also suggest that Microsoft supplied (ie, 'standard'), libraries
 should
 be automatically detected and path entries added in there too:
    C:\Program Files (x86)\Microsoft SDKs\...
    C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\...
 These are on basically every windows developers machine, and each of us
 had to
 configure them ourselves.
The default sc.ini contains:
-----------------------------
[Version]
version=7.51 Build 020

[Environment]
LIB="% P%\..\lib";\dm\lib
DFLAGS="-I% P%\..\..\src\phobos" "-I% P%\..\..\src\druntime\import"
LINKCMD=% P%\link.exe
LINKCMD64=%VCINSTALLDIR%bin\amd64\link.exe
VCINSTALLDIR=%VCINSTALLDIR%
WindowsSdkDir=%WindowsSdkDir%
----------------------------------

When I installed VC 2010, it set the environment variables VCINSTALLDIR and WindowsSdkDir. Then, the default sc.ini should "just work".

What went wrong, specifically?
Sep 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 6:19 AM, Manu wrote:
 Hmmm, I found details on the net that recommended adding an [Environment64]
 section, which we did.

 I don't seem to have VCINSTALLDIR or WindowsSdkDir variables on my system :/ ..
 that said, VC obviously works on my machine.
Sorry if I am being repetitive here, but what environment variables is VS setting on your machine?
 It also seems potentially problematic that a variable would define a single
 install directory, since it's pretty common that programmers have multiple
 versions of VS on their machines.
 I have VS2010 and VS2012 on my machine. Simon had VS2012 and VS2013.
Yes, so what environment variables should sc.ini use to determine which one the user wants to use for dmd?
 I was also thinking it might be a mistake to keep phobos64.lib in the same
 folder as the 32bit omf libs. If paths are wrong, the link errors will only be
 understood by a programmer who understands compilers and lib/object formats in
 depth.
 Perhaps you should put them in a parallel lib64 directory, and hook that up in
 [Environment64] (I did that on my machine while trying to isolate problems and
 wort out where paths were coming from)?
Please, I need specifics.
 I reckon you should look into hooking up DirectX SDK pathing by default too
 since it's so common for basically any multimedia app.
I have no idea what you mean by this?
Sep 01 2013
parent reply Manu <turkeyman gmail.com> writes:
On 2 September 2013 03:38, Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/1/2013 6:19 AM, Manu wrote:

 Hmmm, I found details on the net that recommended adding an
 [Environment64]
 section, which we did.

 I don't seem to have VCINSTALLDIR or WindowsSdkDir variables on my system
 :/ ..
 that said, VC obviously works on my machine.
Sorry if I am being repetitive here, but what environment variables is VS setting on your machine?
I don't have any that look meaningful, other than the TOOLS ones in my prior reply.

 It also seems potentially problematic that a variable would define a single
 install directory, since it's pretty common that programmers have multiple
 versions of VS on their machines.
 I have VS2010 and VS2012 on my machine. Simon had VS2012 and VS2013.
Yes, so what environment variables should sc.ini use to determine which one the user wants to use for dmd?
I think it should search through Program Files on installation; the user might have a non-standard installation path.

But I also think the paths should be supplied by Visual-D when running inside VisualStudio, since it's running within the version of VS that you actually want to use, so it can set the appropriate paths for that version (I've raised this with Rainer).

 I was also thinking it might be a mistake to keep phobos64.lib in the same
 folder as the 32bit omf libs. If paths are wrong, the link errors will
 only be
 understood by a programmer who understands compilers and lib/object
 formats in
 depth.
 Perhaps you should put them in a parallel lib64 directory, and hook that
 up in
 [Environment64] (I did that on my machine while trying to isolate
 problems and
 work out where paths were coming from)?
Please, I need specifics.
Put phobos64.lib and friends in lib64/, beside lib/.

 I reckon you should look into hooking up DirectX SDK pathing by default
 too
 since it's so common for basically any multimedia app.
I have no idea what you mean by this?
DirectX, Microsoft's multimedia API?
http://www.microsoft.com/en-us/download/details.aspx?id=6812
It installs bunches of libs in their own directory; I think you should include them in sc.ini by default.

DirectX does appear to have an environment variable on my machine:
  DXSDK_DIR = C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\

So if you map %DXSDK_DIR%Lib\x64, that will make the libs available.
Sep 01 2013
parent reply Mike Parker <aldacron gmail.com> writes:
On 9/2/2013 11:54 AM, Manu wrote:
 DirectX, Microsoft's multimedia API?
 http://www.microsoft.com/en-us/download/details.aspx?id=6812
 It installs bunches of libs in their own directory, I think you should
 include them in sc.ini by default.

 DirectX does appear to have an environment variable on my machine:
    DXSDK_DIR = C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\

 So if you map: %DXSDK_DIR%Lib\x64 that will make the libs available.
MS has stopped making new versions of the DirectX SDK. Everything now ships as part of the Windows SDK. See http://msdn.microsoft.com/en-us/library/windows/desktop/ee663275(v=vs.85).aspx
Sep 02 2013
parent Manu <turkeyman gmail.com> writes:
On 2 September 2013 18:14, Mike Parker <aldacron gmail.com> wrote:

 On 9/2/2013 11:54 AM, Manu wrote:

 DirectX, Microsoft's multimedia API?
 http://www.microsoft.com/en-us/download/details.aspx?id=6812
 It installs bunches of libs in their own directory, I think you should
 include them in sc.ini by default.

 DirectX does appear to have an environment variable on my machine:

    DXSDK_DIR = C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\

 So if you map: %DXSDK_DIR%Lib\x64 that will make the libs available.
MS has stopped making new versions of the DirectX SDK. Everything now ships as part of the Windows SDK. See
http://msdn.microsoft.com/en-us/library/windows/desktop/ee663275(v=vs.85).aspx
Yeah, but DMD is tested against VS2010, and many users of VS2010 still exist (like me, and all but one of my friends on the weekend). Which means the DXSDK package is still totally relevant.
Sep 02 2013
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
As a followup,

When I installed VS2010, it created a shortcut on my desktop labelled "Visual 
Studio x64 Win64 ...". Clicking on that opened a command line window.

In that window, I could compile and run VC programs from the command prompt 
using CL, LINK, etc.

In that window, if I typed:

SET

at the prompt, it showed the environment variables set, which included:

VCINSTALLDIR - where VC was installed
WindowsSdkDir - where the SDK was installed

This is all dmd needs in order to use VS2010.

However, I didn't want to have to click on that shortcut just to use -m64, so I 
merely copied the values VCINSTALLDIR and WindowsSdkDir into my own personal 
sc.ini, and then dmd -m64 worked fine.
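Concretely, that amounts to two lines like these in sc.ini (a sketch only; the VS 2010 paths shown are the defaults reported elsewhere in this thread and will differ on other machines or VS versions):

VCINSTALLDIR=C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\
WindowsSdkDir=C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A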
Sep 01 2013
parent Manu <turkeyman gmail.com> writes:
On 2 September 2013 03:43, Walter Bright <newshound2 digitalmars.com> wrote:

 As a followup,

 When I installed VS2010, it created a shortcut on my desktop labelled
 "Visual Studio x64 Win64 ...". Clicking on that opened a command line
 window.

 In that window, I could compile and run VC programs from the command
 prompt using CL, LINK, etc.

 In that window, if I typed:

 SET

 at the prompt, it showed the environment variables set, which included:

 VCINSTALLDIR - where VC was installed
 WindowsSdkDir - where the SDK was installed

 This is all dmd needs in order to use VS2010.

 However, I didn't want to have to click on that shortcut just to use -m64,
 so I merely copied the values VCINSTALLDIR and WindowsSdkDir into my own
 personal sc.ini, and then dmd -m64 worked fine.
Ah yes, that batch file... that explains it.

Maybe what you can do is scan the user's Program Files for the batch file, and then pull those lines from it? (A bit hacky...)
Sep 01 2013
prev sibling parent reply "Brad Anderson" <eco gnuk.net> writes:
On Sunday, 1 September 2013 at 07:42:47 UTC, Walter Bright wrote:
 On 8/31/2013 7:05 PM, Manu wrote:
 The only compiler you can realistically use productively in 
 windows is
 DMD-Win64, and that doesn't work out of the box.
 We needed to mess with sc.ini for quite some time to get the 
 stars aligned such
 that it would actually compile and find the linker+libs.

 Walter: DMD needs to internally detect installations of 
 various versions of
 VisualStudio, and either 'just work', or amend sc.ini on its 
 own. Or the
 installer needs to amend sc.ini. Either way, leaving it to a 
 user to fiddle with
 an ini file just isn't acceptable. We had to google solutions 
 to this problem,
 and even then, we had trouble with the paths we added to 
 sc.ini; are spaces
 acceptable? Do they have quotes around them?...
 I might also suggest that Microsoft supplied (ie, 'standard'), 
 libraries should
 be automatically detected and path entries added in there too:
   C:\Program Files (x86)\Microsoft SDKs\...
   C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\...
 These are on basically every windows developers machine, and 
 each of us had to
 configure them ourselves.
The default sc.ini contains:
-----------------------------
[Version]
version=7.51 Build 020

[Environment]
LIB="% P%\..\lib";\dm\lib
DFLAGS="-I% P%\..\..\src\phobos" "-I% P%\..\..\src\druntime\import"
LINKCMD=% P%\link.exe
LINKCMD64=%VCINSTALLDIR%bin\amd64\link.exe
VCINSTALLDIR=%VCINSTALLDIR%
WindowsSdkDir=%WindowsSdkDir%
----------------------------------

When I installed VC 2010, it set the environment variables VCINSTALLDIR and WindowsSdkDir. Then, the default sc.ini should "just work".

What went wrong, specifically?
I can make the installer detect which versions of Visual Studio are installed and the path they are installed. Would I rather I have the installer modify the installed sc.ini or set an environment variable?
Sep 01 2013
next sibling parent "Brad Anderson" <eco gnuk.net> writes:
On Sunday, 1 September 2013 at 19:29:00 UTC, Brad Anderson wrote:
 I can make the installer detect which versions of Visual Studio 
 are installed and the path they are installed.  Would I rather 
 I have the installer modify the installed sc.ini or set an 
 environment variable?
I meant "Would you rather I..."
Sep 01 2013
prev sibling next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 12:28 PM, Brad Anderson wrote:
 I can make the installer detect which versions of Visual Studio are installed
 and the path they are installed.  Would I rather I have the installer modify
the
 installed sc.ini or set an environment variable?
dmd is specifically made to not rely on us setting any environment or system registry variables. So modifying sc.ini is the only solution.
Sep 01 2013
parent "Brad Anderson" <eco gnuk.net> writes:
On Sunday, 1 September 2013 at 21:10:23 UTC, Walter Bright wrote:
 On 9/1/2013 12:28 PM, Brad Anderson wrote:
 I can make the installer detect which versions of Visual 
 Studio are installed
 and the path they are installed.  Would I rather I have the 
 installer modify the
 installed sc.ini or set an environment variable?
dmd is specifically made to not rely on us setting any environment or system registry variables. So modifying sc.ini is the only solution.
Sounds good. I'll get started.
Sep 01 2013
prev sibling parent Manu <turkeyman gmail.com> writes:
On 2 September 2013 05:28, Brad Anderson <eco gnuk.net> wrote:

 On Sunday, 1 September 2013 at 07:42:47 UTC, Walter Bright wrote:

 On 8/31/2013 7:05 PM, Manu wrote:

 The only compiler you can realistically use productively in windows is
 DMD-Win64, and that doesn't work out of the box.
 We needed to mess with sc.ini for quite some time to get the stars
 aligned such
 that it would actually compile and find the linker+libs.

 Walter: DMD needs to internally detect installations of various versions
 of
 VisualStudio, and either 'just work', or amend sc.ini on its own. Or the
 installer needs to amend sc.ini. Either way, leaving it to a user to
 fiddle with
 an ini file just isn't acceptable. We had to google solutions to this
 problem,
 and even then, we had trouble with the paths we added to sc.ini; are
 spaces
 acceptable? Do they have quotes around them?...
 I might also suggest that Microsoft supplied (ie, 'standard'), libraries
 should
 be automatically detected and path entries added in there too:
   C:\Program Files (x86)\Microsoft SDKs\...
   C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\...
 These are on basically every windows developers machine, and each of us
 had to
 configure them ourselves.
The default sc.ini contains:
-----------------------------
[Version]
version=7.51 Build 020

[Environment]
LIB="% P%\..\lib";\dm\lib
DFLAGS="-I% P%\..\..\src\phobos" "-I% P%\..\..\src\druntime\import"
LINKCMD=% P%\link.exe
LINKCMD64=%VCINSTALLDIR%bin\amd64\link.exe
VCINSTALLDIR=%VCINSTALLDIR%
WindowsSdkDir=%WindowsSdkDir%
----------------------------------

When I installed VC 2010, it set the environment variables VCINSTALLDIR and WindowsSdkDir. Then, the default sc.ini should "just work".

What went wrong, specifically?
I can make the installer detect which versions of Visual Studio are installed and the path they are installed. Would I rather I have the installer modify the installed sc.ini or set an environment variable?
Avoid environment variables in Windows, if you ask me. They are a mess, and they are hard to manage through the stupid UI hidden in system settings->advanced->advanced->environment variables.

I think most Windows users would consider sc.ini much simpler, but the most important thing is that they need to know it's there, and that it's critical that it's correct.

Another idea is to enhance DMD's error messages to warn about the correct paths in sc.ini when it either fails to find link.exe, or encounters an OMF library (the two tell-tale errors that sc.ini is wrong).
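To sketch what I mean (in D for brevity; dmd's actual linker driver is link.c, written in C++, and the function name and messages below are purely hypothetical):

import std.file : exists;
import std.stdio : stderr;

// Hypothetical check: if the configured 64-bit linker isn't where sc.ini says it
// is, point the user at sc.ini instead of failing with an opaque linker error.
void checkWin64LinkSetup(string linkcmd64)
{
    if (!exists(linkcmd64))
    {
        stderr.writefln("Error: MS linker not found at '%s'", linkcmd64);
        stderr.writeln("Check the VCINSTALLDIR and LINKCMD64 settings in the sc.ini next to dmd.exe.");
    }
}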
Sep 01 2013
prev sibling next sibling parent reply Benjamin Thaut <code benjamin-thaut.de> writes:
Am 01.09.2013 04:05, schrieb Manu:
 !!!!!!!!!
 This has to be given first-class attention!
 I am completely and utterly sick of this problem. Don made a massive
 point of it in his DConf talk, and I want to re-re-re-re-re-re-re-stress
 how absolutely important this is.
 !!!!!!!!!
I have to fully agree here.
 Debugging:

 Poor debugging experience wastes your time every 5 minutes.
 I can only speak for the Windows experience (since we failed to get OSX
 working); there are lots of problems with the debugging experience under
 visual studio...
 I haven't logged bugs yet, but I intend to.
 There were many instances of people wasting their time chasing bugs in
 random places when it was simply a case of the debugger lying about the
 value of variables to them, and many more cases where the debugger
 simply refused to produce values for some variables at all.
 This is an unacceptable waste of programmers time, and again, really
 burned us in a 48hour context.
Actually, there is only one little patch needed to make debugging work nicely with the Visual Studio C++ debugger. It replaces all '.' chars with ' ' in type names inside the debug information. There was a pull request from Rainer Schuetze once, but it got rejected by Walter.
https://github.com/Ingrater/dmd/commit/522f6dbeb93944ebfebde6f938a2ee3a2d6be124

Also, if you are using Visual Studio 2012, you need to change the debugging engine because the new one can't deal with D code that has C++ debugging info. For that, go to Options -> Debugging -> Edit and Continue -> Enable Edit and Continue (this will make Visual Studio 2012 use the old debugging engine).

Then everything will work nicely. The only real issue is that the debugger can't identify which class is behind a given interface.

Also, you should try using Mago; it can be enabled inside the VisualD debugging options of your project. It even features D expression evaluation, but it only works for 32-bit (but requires no cv2pdb conversion).
 Containers:

 The question came up multiple times; "I don't think this should be an
 array... what containers can I use, and where are they?"...
 Also, nobody could work out how to remove an arbitrary item from an
 array, or an item from an AA by reference/value (only by key).

 This code:
    foreach(i, item; array)
      if(item == itemToRemove)
        array = array[0..i] ~ array[i+1..$];
 Got a rather 'negative' reaction from the audience to put it lightly...
Oh, containers, I would love to have some. Like everyone else using D, I've written my own by now.
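For what it's worth, the particular array removal that got the negative reaction can already be written more directly with what's in Phobos (a sketch using std.algorithm's countUntil and remove; it removes only the first match and doesn't answer the broader container question):

import std.algorithm : countUntil, remove;

// Remove the first element equal to itemToRemove, if present.
void removeFirst(T)(ref T[] arr, T itemToRemove)
{
    auto i = arr.countUntil(itemToRemove); // index of the first match, -1 if none
    if (i >= 0)
        arr = arr.remove(i);               // shifts the tail down and shortens the slice
}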
 One more thing:
 I'll just pick one language complaint from the weekend.
 It is how quickly classes became disorganised and difficult to navigate

 We all wanted to ability to define class member functions outside the
 class definition:
    class MyClass
    {
      void method();
    }

    void MyClass.method()
    {
      //...
    }

 It definitely cost us time simply trying to understand the class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class definition, you
 want to quickly see what a class has and does.
You should really use VisualD's ability to parse the AST .json files dmd generates while compiling. Using this feature you get a really awesome class view, and even "Go to definition" works.
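For anyone generating these outside VisualD: the .json files come from dmd's JSON output switches, e.g. something like

  dmd -X -Xfmyproject.json myfile.d

(the file names here are placeholders, and the exact flag spelling may vary between dmd versions).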
 Conclusion:
 I think this 48 hour jam approach is a good test for the language and
 it's infrastructure. I encourage everybody to try it (ideally with a
 clean slate computer).
 The lesson is that we need to make this process smooth, since it mirrors
 the first-experience of everybody new to D.
 It also highlights and magnifies time-wasters that are equally
 unacceptable in a commercial environment.

 I don't think I made any converts this weekend wrt the above issues
 encountered. I might have even just proved to them that they should


 Please, we need a road-map, we need to prioritise these most basic
 aspects of the experience, and we need to do it soon.
 I might re-iterate my feeling that external IDE integration projects
 should be claimed by the community officially, and user experience +
 debugging issues should be first-class issues in the main d language
 bug-tracker so everybody can see them, and everybody is aware of the stats.
 Also, the DMD front-end should be a lib offering auto-completion and
 syntax highlighting data to clients.

 I'm doing some more work on premake (a nice light buildsystem that
 generated makefiles and project files for popular IDE's) to tightly
 incorporate D into the various IDE's it supports.

 </endrant>
Other than that, I have to fully agree with all the points you listed.

Kind Regards
Benjamin Thaut
Sep 01 2013
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 1:06 AM, Benjamin Thaut wrote:
 Also if you are using Visual Studio 2012, you need to change the debugging
 engine because the new one can't deal with D code that has C++ debugging info.
This endless merry-go-round of MSC breaking the tools constantly is why I went with my own toolset years ago.
Sep 01 2013
next sibling parent reply Benjamin Thaut <code benjamin-thaut.de> writes:
Am 01.09.2013 10:49, schrieb Walter Bright:
 On 9/1/2013 1:06 AM, Benjamin Thaut wrote:
 Also if you are using Visual Studio 2012, you need to change the
 debugging
 engine because the new one can't deal with D code that has C++
 debugging info.
This endless merry-go-round of MSC breaking the tools constantly is why I went with my own toolset years ago.
Still, the Visual Studio debugger has the best debugging experience you can currently get for D.
Sep 01 2013
parent reply Manu <turkeyman gmail.com> writes:
On 1 September 2013 18:51, Benjamin Thaut <code benjamin-thaut.de> wrote:

 Am 01.09.2013 10:49, schrieb Walter Bright:

  On 9/1/2013 1:06 AM, Benjamin Thaut wrote:
 Also if you are using Visual Studio 2012, you need to change the
 debugging
 engine because the new one can't deal with D code that has C++
 debugging info.
This endless merry-go-round of MSC breaking the tools constantly is why I went with my own toolset years ago.
Still the visual studio debugger has the best debugging experience you can currently get for D.
True, which is perhaps a little bit tragic. It's a good start, but lots of things just don't work right yet.

May I re-iterate my point about the community claiming and supporting the IDE and debugger integration as a first-class language feature, with bugs in the main tracker...

I know Walter does care about the DMD-Win64 debuginfo quality, but I don't think the debuginfo output is capable of addressing all the problems alone.
Sep 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 7:02 AM, Manu wrote:
 I know Walter does care about the DMD-Win64 debuginfo quality, but I don't
think
 the debuginfo output is capable of addressing all the problems alone.
I (and others like Rainer who help a lot with Win64 support) also cannot do a thing without targeted bugzilla reports.
Sep 01 2013
parent reply Manu <turkeyman gmail.com> writes:
On 2 September 2013 03:46, Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/1/2013 7:02 AM, Manu wrote:

 I know Walter does care about the DMD-Win64 debuginfo quality, but I
 don't think
 the debuginfo output is capable of addressing all the problems alone.
I (and others like Rainer who help a lot with Win64 support) also cannot do a thing without targeted bugzilla reports.
How do you feel about using the DMD-Win64 + Visual-D suite as a dev environment when working on druntime+phobos for a while? Try debugging some issues using the VS debugger... I've suggested that devs need to understand the end-user experience. I think you'd reveal a lot of the really trivial issues (that are non-trivial in aggregate) very quickly.

I'll continue to log bugs as I find them. I didn't this weekend because, when finding a bug, I then need to spend the time to boil down a minimal repro case, and we were working against the clock >_<
Sep 01 2013
parent Jacob Carlborg <doob me.com> writes:
On 2013-09-02 05:02, Manu wrote:

 I'll continue to log bugs as I find them, I didn't this weekend because
 when finding a bug, I then need to spend the time to boil down a minimal
 repro case, and we were working against the clock >_<
A minimal test case can be created later. -- /Jacob Carlborg
Sep 02 2013
prev sibling parent Manu <turkeyman gmail.com> writes:
On 1 September 2013 18:49, Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/1/2013 1:06 AM, Benjamin Thaut wrote:

 Also if you are using Visual Studio 2012, you need to change the debugging
 engine because the new one can't deal with D code that has C++ debugging
 info.
This endless merry-go-round of MSC breaking the tools constantly is why I went with my own toolset years ago.
And I don't blame you, but sadly, to achieve success in the Windows environment, I'm absolutely convinced it must work seamlessly with the de facto standard MS infrastructure.
Sep 01 2013
prev sibling parent reply Manu <turkeyman gmail.com> writes:
On 1 September 2013 18:06, Benjamin Thaut <code benjamin-thaut.de> wrote:

 Am 01.09.2013 04:05, schrieb Manu:


 !!!!!!!!!
 This has to be given first-class attention!
 I am completely and utterly sick of this problem. Don made a massive
 point of it in his DConf talk, and I want to re-re-re-re-re-re-re-stress
 how absolutely important this is.
 !!!!!!!!!
I have to fully agree here.
 Debugging:

 Poor debugging experience wastes your time every 5 minutes.
 I can only speak for the Windows experience (since we failed to get OSX
 working); there are lots of problems with the debugging experience under
 visual studio...
 I haven't logged bugs yet, but I intend to.
 There were many instances of people wasting their time chasing bugs in
 random places when it was simply a case of the debugger lying about the
 value of variables to them, and many more cases where the debugger
 simply refused to produce values for some variables at all.
 This is an unacceptable waste of programmers time, and again, really
 burned us in a 48hour context.
Actually, there is only one little patch needed to make debugging work nicely with the Visual Studio C++ debugger. It replaces all '.' chars with ' ' in type names inside the debug information. There was a pull request from Rainer Schuetze once, but it got rejected by Walter.
https://github.com/Ingrater/dmd/commit/522f6dbeb93944ebfebde6f938a2ee3a2d6be124

Also, if you are using Visual Studio 2012, you need to change the debugging engine because the new one can't deal with D code that has C++ debugging info. For that, go to Options -> Debugging -> Edit and Continue -> Enable Edit and Continue (this will make Visual Studio 2012 use the old debugging engine).
I use 2010, but Simon was using 2012; maybe that was the source of his problems. He certainly seemed to have a lot more problems than me.

 Then everything will work nicely. The only real issue is that the debugger
 can't identify which class is behind a given interface.

 Also you should try using mago, it can be enabled inside the visualD
 debugging options of your project. It even features D expression
 evaluation, but it only works for 32-bit (but requires no cv2pdb
 conversion).
Mago is only Win32, and DMD is only Win64... I've tried encouraging the Mago guy to support Win64, but it doesn't seem to be a highly active project recently. I think this is another case of a 1-man project that represents a fairly critical part of the ecosystem.

 Containers:
 The question came up multiple times; "I don't think this should be an
 array... what containers can I use, and where are they?"...
 Also, nobody could work out how to remove an arbitrary item from an
 array, or an item from an AA by reference/value (only by key).

 This code:
    foreach(i, item; array)
      if(item == itemToRemove)
        array = array[0..i] ~ array[i+1..$];
 Got a rather 'negative' reaction from the audience to put it lightly...
Oh, containers, I would love to have some. Like everyone else using D, I've written my own by now.
It turns out that it's not actually so easy to just 'knock up' a set of robust containers that work well across all of D's features. Templates and various typing complexities always seem to complicate the issue.

In my experience, I knock something together and it appears to work in the simple case... but then down the line, something a little more niche hits it, and I realise it's not a trivial tweak to fix it. Often a more substantial rethink may be required.

 One more thing:
 I'll just pick one language complaint from the weekend.
 It is how quickly classes became disorganised and difficult to navigate

 We all wanted to ability to define class member functions outside the
 class definition:
    class MyClass
    {
      void method();
    }

    void MyClass.method()
    {
      //...
    }

 It definitely cost us time simply trying to understand the class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class definition, you
 want to quickly see what a class has and does.
You should really use VisualD's ability to parse the AST .json files dmd generates while compiling. Using this feature you get a really awesome class view, and even "Go to definition" works.
Sometimes...
Have you worked with it on a daily basis? It just doesn't work all the time. Also, in my experience the auto-complete suggestions are often either incomplete, or list heaps of completely unrelated stuff.

I'm not saying it's bad; it's better than nothing and I really appreciate the project (I've said before, if it weren't for Rainer's effort, I simply wouldn't use D), but it certainly needs more work, especially if I'm going to have any traction selling it to others with no vested interest in the language.

The situation at Remedy was different; it was an easier sell. Since we implemented D as an extension language, the productivity comparison was to scripting languages like lua, which also have no tight integration or debugging support in Visual Studio. Visual-D was certainly an improvement over other scripting solutions, and I think I made that case at DConf.

But this project approached it from a different angle, ie, a total replacement for C++, which does have a rich toolset available. I want to use D in this context in the future, and if that were the commitment we were considering at Remedy, there's no way we would have gone for it.

My hope from the Remedy experience was to prove that it works in a commercial environment. I went out on a REALLY massive limb there, and as far as I can tell, it's well received (although it took the better part of a year longer to get there than I promised). I'd love to see it take the next step one day, ie, into the main code-base... but the tools need to be rock solid before that can ever happen.

 Conclusion:
 I think this 48 hour jam approach is a good test for the language and
 it's infrastructure. I encourage everybody to try it (ideally with a
 clean slate computer).
 The lesson is that we need to make this process smooth, since it mirrors
 the first-experience of everybody new to D.
 It also highlights and magnifies time-wasters that are equally
 unacceptable in a commercial environment.

 I don't think I made any converts this weekend wrt the above issues
 encountered. I might have even just proved to them that they should


 Please, we need a road-map, we need to prioritise these most basic
 aspects of the experience, and we need to do it soon.
 I might re-iterate my feeling that external IDE integration projects
 should be claimed by the community officially, and user experience +
 debugging issues should be first-class issues in the main d language
 bug-tracker so everybody can see them, and everybody is aware of the
 stats.
 Also, the DMD front-end should be a lib offering auto-completion and
 syntax highlighting data to clients.

 I'm doing some more work on premake (a nice light buildsystem that
 generated makefiles and project files for popular IDE's) to tightly
 incorporate D into the various IDE's it supports.

 </endrant>
Other than that, I have to fully agree with all the points you listed.

Kind Regards
Benjamin Thaut
Sep 01 2013
next sibling parent Benjamin Thaut <code benjamin-thaut.de> writes:
Am 01.09.2013 15:55, schrieb Manu:
 Mago is only Win32, and DMD is only Win64... I've tried encouraging the
 Mago guy to support Win64, but it doesn't seem to be a highly active
 project recently. I think this is another case of a 1-man project that
 represents a fairly critical part of the ecosystem.
You may be pleased to hear that Mago is currently getting 64-bit and PDB support. It's actively being worked on; check the SVN repository on dsource.org.
 Sometimes...
 Have you worked with it on a daily basis? It just doesn't work all the
 time. Also, in my experience the auto-complete suggestions are often
 either incomplete, or list heaps of completely unrelated stuff.
 I'm not saying it's bad, it's better than nothing and I really
 appreciate the project (I've said before, if it weren't for Rainer's
 effort, I simply wouldn't use D), but it certainly needs more work,
 especially if I'm going to have any traction selling it to others with
 no vested interest in the language.
I'm not talking about the auto-complete, I'm talking about the class view. It works purely off the .json output of the compiler. If you regularly recompile (which is not an issue with D because it's basically free) you have a nice and clean overview of all your modules, classes and structs. And you can even search that class view to find stuff. It also helps somewhat with auto-completion and "goto definition". Yes, I use it on a daily basis, and yes, I know that it's not perfect. You should also go into Options -> Text Editors -> D -> Intelli Sense and play around with the options there to see what works best for you.
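For reference, the .json output referred to here comes from the compiler's -X / -Xf switches; a minimal sketch of pulling a symbol overview out of it with std.json (the file name is a placeholder, and the field names are those dmd currently emits, so adjust if your version differs):

    // Generate the description first, e.g.:
    //   dmd -c -o- -X -Xfproject.json app.d
    import std.file : readText;
    import std.json : parseJSON;
    import std.stdio : writefln;

    void main()
    {
        auto doc = parseJSON(readText("project.json"));
        foreach (mod; doc.array)                 // one entry per module
            foreach (sym; mod["members"].array)  // classes, structs, functions, ...
                writefln("%s\t%s", sym["kind"].str, sym["name"].str);
    }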
 But this project approached it from a different angle, ie, a total
 replacement for C++ which does have a rich toolset available. I want to
 use D in this context in the future, and if that were the commitment we
 were considering at Remedy, there's no way we would have gone for it.
Yeah, I gave a talk about D at Havok recently too and basically got the same response: D simply doesn't have enough tools / momentum to be an option. I use D as a C++ replacement though, and going back to C++ annoys me more and more every day. Better tools are nice, but I'm really missing the features D has when coding C++.
Sep 01 2013
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 6:55 AM, Manu wrote:
 Sometimes...
 Have you worked with it on a daily basis? It just doesn't work all the time.
 Also, in my experience the auto-complete suggestions are often either
 incomplete, or list heaps of completely unrelated stuff.
 I'm not saying it's bad, it's better than nothing and I really appreciate the
 project (I've said before, if it weren't for Rainer's effort, I simply wouldn't
 use D), but it certainly needs more work, especially if I'm going to have any
 traction selling it to others with no vested interest in the language.
I really appreciate your efforts here. What Rainer needs is, I assume, the same things I need. Very specific lists of what went wrong.
Sep 01 2013
parent Manu <turkeyman gmail.com> writes:
On 2 September 2013 03:48, Walter Bright <newshound2 digitalmars.com> wrote:

 On 9/1/2013 6:55 AM, Manu wrote:

 Sometimes...
 Have you worked with it on a daily basis? It just doesn't work all the
 time.
 Also, in my experience the auto-complete suggestions are often either
 incomplete, or list heaps of completely unrelated stuff.
 I'm not saying it's bad, it's better than nothing and I really appreciate
 the
 project (I've said before, if it weren't for Rainer's effort, I simply
 wouldn't
 use D), but it certainly needs more work, especially if I'm going to have
 any
 traction selling it to others with no vested interest in the language.
I really appreciate your efforts here. What Rainer needs is, I assume, the same things I need. Very specific lists of what went wrong.
Yup, but there are so many bugs in his bug tracker. It seems clear to me he needs help. I talked with him a lot at DConf.
As I understand it, most of the problems are because both his and Alexander Bothe's semantic analysers just aren't as good as the D front end. This comes back to my point about it being a separate satellite project.
I wonder if it might help with the visibility of these problems if it were included in the central github organisation, with the bugs present in the main bug tracker that everyone can see and looks at regularly.
Sep 01 2013
prev sibling next sibling parent reply "deadalnix" <deadalnix gmail.com> writes:
On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 Configuring compilers:

 Naturally, this is primarily a problem with the windows 
 experience, but
 it's so frustrating that it is STILL a problem... how many 
 years later?
My first setup took me days :D Somehow it improved. It still sucks.
 Getting a workable environment:

 Unsurprisingly, the Linux user was the only person happy work 
 with a
 makefile. Everybody else wanted a comfortable IDE solution (and 
 the linux
 user would prefer it too).
I came to the same conclusion, and that is why SDC is relying on libd. Our tooling sucks hard.
 Obviously, we settled on Visual-D (Windows) and Mono-D 
 (OSX/Linux); the
 only realistic choices available. The OSX user would have 
 preferred an
 XCode integration.
I don't know any dev that is happy with XCode, even OSX lovers. I mean, how good is an editor that offers to export functions to iTunes (I should try it one day, I really wonder what it does), but can't find a declaration from its contextual menu?
 This goes back to the threads where the IDE guys are writing 
 their own
 parsers, when really, DMD should be able to be built as a lib, 
 with an API
 designed for using DMD as a lib/plugin.
Considering DMD never deallocates anything, expect to restart your IDE every hour or so.
 Debugging:
Oh god! The demangler does not demangle most things. Hello, stack traces of incomprehensible symbols. While gdb is able to give me file/line, druntime is completely unable to do so, so you have to use external tooling to get that info back. The best of the best is over-long manglings being truncated in the output. A backtrace is the bare minimum for debugging and it isn't working.
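For what it's worth, druntime does expose its demangler as core.demangle; a minimal sketch of calling it directly (the mangled name here is just an illustrative example):

    import core.demangle : demangle;
    import std.stdio : writeln;

    void main()
    {
        // demangle returns its input unchanged when it cannot decode the
        // name, which is exactly the failure mode complained about above.
        auto sym = "_D3foo3barFiZi";
        writeln(demangle(sym)); // "int foo.bar(int)" if it decodes
    }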
 Documentation:

 Okay for the most part, but some windows dev's want a CHM that 
 looks like
 the typical Microsoft doc's people are used to. Those that 
 aren't familiar
 with the CHM viewer; it's just HTML but with a nice index + 
 layout tree.
I don't understand that. I never understand anything in MS documentation because of its look and feel (the content is usually pretty good). I guess it is a matter of taste/habit.
 Containers:

 The question came up multiple times; "I don't think this should 
 be an
 array... what containers can I use, and where are they?"...
 Also, nobody could work out how to remove an arbitrary item 
 from an array,
 or an item from an AA by reference/value (only by key).

 This code:
   foreach(i, item; array)
     if(item == itemToRemove)
       array = array[0..i] ~ array[i+1..$];
 Got a rather 'negative' reaction from the audience to put it 
 lightly...
We're missing one piece in the language to make that work properly (and another one to make it more efficient), namely covariance on template instances, and allocators. I've been raising the first one as it is also a problem for SDC. People have raised it for implementing AAs as a library, and it is a recurring discussion around dcollections.
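For reference, the removal itself can already be written a little more directly with Phobos; a small sketch, assuming only the first match should go and order must be preserved:

    import std.algorithm : countUntil, remove;

    // Removes the first element equal to item, keeping the rest in order.
    T[] removeFirst(T)(T[] arr, T item)
    {
        auto i = arr.countUntil(item);
        return i < 0 ? arr : arr.remove(i);
    }

That doesn't address the covariance or allocator points above; it's just a shorter spelling of the quoted snippet.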
 Bugs:
 Yes, we hit DMD bugs, like the one with opaque structs which 
 required
 extensive work-arounds.
   struct MyStruct;
   MyStruct*[] arr = new MyStruct*[n];

 We also ran into some completely nonsense error messages, but I 
 forgot to
 log them, since we were working against the clock.


 One more thing:
 I'll just pick one language complaint from the weekend.
 It is how quickly classes became disorganised and difficult to 
 navigate

 We all wanted to ability to define class member functions 
 outside the class
 definition:
   class MyClass
   {
     void method();
   }

   void MyClass.method()
   {
     //...
   }

 It definitely cost us time simply trying to understand the 
 class layout
 visually (ie, when IDE support is barely available).
Arguably, this is because of the lack of IDE support and because C++ screwed you up :D I recognize the feature as good to have, but definitely not as big an issue as the ones mentioned above. If the problems mentioned before are solved, then this one won't be a big issue.
 I'm doing some more work on premake (a nice light buildsystem 
 that
 generated makefiles and project files for popular IDE's) to 
 tightly
 incorporate D into the various IDE's it supports.
That looks like a good project!
Sep 01 2013
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 2:50 AM, deadalnix wrote:
 Considering DMD never deallocate anything, expect to restart your IDE every
hour
 or so.
DMD deallocates everything when its process ends. This should have zero effect on the IDE. If you need to restart the IDE every hour, it is not because of DMD.
 Debugging:
Ho god ! The demangler do not demangle most things.
Bugzilla entry, please. There are many members in the D community who poke around looking for low hanging fruit in bugzilla that they can fix. They'll never know about the issues you're having that they can fix if they aren't in bugzilla.
Sep 01 2013
parent reply Jacob Carlborg <doob me.com> writes:
On 2013-09-01 19:54, Walter Bright wrote:
 On 9/1/2013 2:50 AM, deadalnix wrote:
 Considering DMD never deallocate anything, expect to restart your IDE
 every hour
 or so.
DMD deallocates everything when its process ends. This should have zero effect on the IDE. If you need to restart the IDE every hour, it is not because of DMD.
That was in reply to if DMD was built as a library and included in the IDE. Then there wouldn't be a process to end. -- /Jacob Carlborg
Sep 02 2013
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Sep 02, 2013 at 10:11:28AM +0200, Jacob Carlborg wrote:
 On 2013-09-01 19:54, Walter Bright wrote:
On 9/1/2013 2:50 AM, deadalnix wrote:
Considering DMD never deallocate anything, expect to restart your
IDE every hour or so.
DMD deallocates everything when its process ends. This should have zero effect on the IDE. If you need to restart the IDE every hour, it is not because of DMD.
That was in reply to if DMD was built as a library and included in the IDE. Then there wouldn't be a process to end.
[...]

    module libDmd;

    auto parseDCode(string code)
    {
        pid_t pid;
        if ((pid = fork()) == 0)
        {
            dmdMain(...);
            exit(0);
        }
        else
        {
            auto status = waitpid(pid);
            return getReturnValue(...);
        }
        throw new Exception("fork error");
    }

Though, granted, this may cause unacceptable performance hits. :-P

T

-- The best way to destroy a cause is to defend it poorly.
Sep 02 2013
parent reply "deadalnix" <deadalnix gmail.com> writes:
On Monday, 2 September 2013 at 15:06:40 UTC, H. S. Teoh wrote:
 On Mon, Sep 02, 2013 at 10:11:28AM +0200, Jacob Carlborg wrote:
 On 2013-09-01 19:54, Walter Bright wrote:
On 9/1/2013 2:50 AM, deadalnix wrote:
Considering DMD never deallocate anything, expect to restart 
your
IDE every hour or so.
DMD deallocates everything when its process ends. This should have zero effect on the IDE. If you need to restart the IDE every hour, it is not because of DMD.
That was in reply to if DMD was built as a library and included in the IDE. Then there wouldn't be a process to end.
[...] module libDmd; auto parseDCode(string code) { pid_t pid; if ((pid=fork())==0) { dmdMain(...); exit(0); } else { auto status = waitpid(pid); return getReturnValue(...); } throw Exception("fork error"); } Though, granted, this may cause unacceptable performance hits. :-P T
Dude, libd is conceived with that in mind, and I'm craving help. I'm putting quite a lot of effort into it, but D is big, and any help would be welcome.
Sep 02 2013
parent reply "Brad Anderson" <eco gnuk.net> writes:
On Monday, 2 September 2013 at 16:16:48 UTC, deadalnix wrote:
 On Monday, 2 September 2013 at 15:06:40 UTC, H. S. Teoh wrote:
 On Mon, Sep 02, 2013 at 10:11:28AM +0200, Jacob Carlborg wrote:
 On 2013-09-01 19:54, Walter Bright wrote:
On 9/1/2013 2:50 AM, deadalnix wrote:
Considering DMD never deallocate anything, expect to 
restart your
IDE every hour or so.
DMD deallocates everything when its process ends. This should have zero effect on the IDE. If you need to restart the IDE every hour, it is not because of DMD.
That was in reply to if DMD was built as a library and included in the IDE. Then there wouldn't be a process to end.
[...] module libDmd; auto parseDCode(string code) { pid_t pid; if ((pid=fork())==0) { dmdMain(...); exit(0); } else { auto status = waitpid(pid); return getReturnValue(...); } throw Exception("fork error"); } Though, granted, this may cause unacceptable performance hits. :-P T
dude, libd is conceived in with that in mind, and I'm craving for help. I'm putting quite a lot of effort into it, but D is big, and any help would be welcome.
I've never heard of libd? Is it the library-fication of SDC? I assume this is it? https://github.com/deadalnix/libd
Sep 02 2013
parent "deadalnix" <deadalnix gmail.com> writes:
On Monday, 2 September 2013 at 18:12:54 UTC, Brad Anderson wrote:
 I've never heard of libd?  Is it the library-fication of SDC?  
 I assume this is it? https://github.com/deadalnix/libd
Yep, SDC has been split into three: SDC itself, libd (parsing/semantic analysis) and libd-llvm (codegen, JIT).
Sep 02 2013
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 9/2/2013 1:11 AM, Jacob Carlborg wrote:
 That was in reply to if DMD was built as a library and included in the IDE.
Then
 there wouldn't be a process to end.
Ah, I see. But that does bring up the possibility of running dmd front end as a separate process, and then using interprocess communication with it? Isn't Google's Chrome browser built that way?
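The plumbing for that is straightforward with the current std.process API; a minimal one-shot sketch (the file name is a placeholder, and a resident front-end "server" would need a request/response protocol layered on top of the same pipes):

    import std.process : pipeProcess, wait, Redirect;
    import std.stdio : writeln;

    void main()
    {
        // Run the front end in its own process and capture its output; the
        // IDE stays alive regardless of what the child does with its memory.
        auto pipes = pipeProcess(["dmd", "-c", "-o-", "app.d"],
                                 Redirect.stdout | Redirect.stderrToStdout);
        scope(exit) wait(pipes.pid);

        foreach (line; pipes.stdout.byLine)
            writeln(line);
    }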
Sep 02 2013
next sibling parent Jacob Carlborg <doob me.com> writes:
On 2013-09-03 01:28, Walter Bright wrote:

 Ah, I see.

 But that does bring up the possibility of running dmd front end as a
 separate process, and then using interprocess communication with it?

 Isn't Google's Chrome browser built that way?
It will still eat a lot of memory, the way DMD currently handles memory. The process would be constantly running next to the IDE. -- /Jacob Carlborg
Sep 03 2013
prev sibling parent "Joakim" <joakim airpost.net> writes:
On Monday, 2 September 2013 at 23:28:58 UTC, Walter Bright wrote:
 On 9/2/2013 1:11 AM, Jacob Carlborg wrote:
 That was in reply to if DMD was built as a library and 
 included in the IDE. Then
 there wouldn't be a process to end.
Ah, I see. But that does bring up the possibility of running dmd front end as a separate process, and then using interprocess communication with it? Isn't Google's Chrome browser built that way?
Yes, they sandbox a WebKit renderer and v8 javascript compiler for each browser tab in a different process, for both security and stability reasons, ie a crashed tab doesn't bring down the whole browser. The main browser process handles all networking and feeds the downloaded HTML/CSS/javascript to each of the renderer processes, which return a bitmap for the tab. It's not a strict rule, because if you have several tabs loaded with the same domain, they will sometimes share a renderer process. You can read more about it here: http://www.chromium.org/developers/design-documents/multi-process-architecture
Sep 03 2013
prev sibling next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2013-09-01 04:05, Manu wrote:

 Naturally, this is primarily a problem with the windows experience, but
 it's so frustrating that it is STILL a problem... how many years later?
 People don't want to 'do work' to install a piece of software. Rather,
 they expect it to 'just work'. We lost about 6 hours trying to get
 everyone's machines working properly.
 In the context of a 48 hour game jam, that's a terrible sign! I just
 kept promising people that it would save time overall... which I wish
 were true.
Was this only on Windows or were there problems on Linux/Mac OS X as well?
 Getting a workable environment:

 Unsurprisingly, the Linux user was the only person happy work with a
 makefile. Everybody else wanted a comfortable IDE solution (and the
 linux user would prefer it too).
I can understand that.
 IDE integration absolutely needs to be considered a first class feature
 of D.
 I also suggest that the IDE integration downloads should be hosted on
 the dlang download page so they are obvious and available to everyone
 without having to go looking, and also as a statement that they are
 actually endorsed by the dlanguage authorities. As an end-user, you're
 not left guessing which ones are good/bad/out of date/actually work/etc.
I completely agree.
 Obviously, we settled on Visual-D (Windows) and Mono-D (OSX/Linux); the
 only realistic choices available.
There's also DDT with Eclipse. It supports auto completion, go to definition, has an outline view and so on.
 The OSX user would have preferred an  XCode integration.
This one is a bit problematic since Xcode doesn't officially support plugins. But it's still possible, as has been shown by Michel Fortin with his D for Xcode plugin.
 One more thing:
 I'll just pick one language complaint from the weekend.
 It is how quickly classes became disorganised and difficult to navigate

 We all wanted to ability to define class member functions outside the
 class definition:
    class MyClass
    {
      void method();
    }

    void MyClass.method()
    {
      //...
    }

 It definitely cost us time simply trying to understand the class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class definition, you
 want to quickly see what a class has and does.
Sounds like you want an outline view in the IDE. This is supported by DDT in Eclipse. Even TextMate on Mac OS X has a form of outline view. -- /Jacob Carlborg
Sep 01 2013
next sibling parent reply Manu <turkeyman gmail.com> writes:
On 1 September 2013 19:57, Jacob Carlborg <doob me.com> wrote:

 On 2013-09-01 04:05, Manu wrote:

  Naturally, this is primarily a problem with the windows experience, but
 it's so frustrating that it is STILL a problem... how many years later?
 People don't want to 'do work' to install a piece of software. Rather,
 they expect it to 'just work'. We lost about 6 hours trying to get
 everyone's machines working properly.
 In the context of a 48 hour game jam, that's a terrible sign! I just
 kept promising people that it would save time overall... which I wish
 were true.
Was this only on Windows or were there problems on Linux/Mac OS X as well?
Well we never got OSX working (under mono-d), although this was mainly due
to supporting apple infrastructure in the end. I think we wrangled the
toolchain in the end, but never got everything linking; C++ dependencies
got complicated.
We eventually gave up, just wasting too much time, and he went off and did
the music/sounds for the game...
If you'd like to help me finish that OSX work we started together last
year, that'd be really great for next time! :)

 Getting a workable environment:

 Unsurprisingly, the Linux user was the only person happy work with a
 makefile. Everybody else wanted a comfortable IDE solution (and the
 linux user would prefer it too).
I can understand that. IDE integration absolutely needs to be considered a first class feature
 of D.
 I also suggest that the IDE integration downloads should be hosted on
 the dlang download page so they are obvious and available to everyone
 without having to go looking, and also as a statement that they are
 actually endorsed by the dlanguage authorities. As an end-user, you're
 not left guessing which ones are good/bad/out of date/actually work/etc.
I completely agree. Obviously, we settled on Visual-D (Windows) and Mono-D (OSX/Linux); the
 only realistic choices available.
There's also DDT with Eclipse. It supports auto completion, go to definition, has an outline view and so on.
I've never met a C++ developer that likes Eclipse ;)
But I should probably check it out.

 The OSX user would have preferred an XCode integration.

 This one is a bit problematic since Xcode doesn't officially supports
 plugins. But it's still possible, as been shown by Michel Fortin with his D
 for Xcode plugin.

  One more thing:
 I'll just pick one language complaint from the weekend.
 It is how quickly classes became disorganised and difficult to navigate

 We all wanted to ability to define class member functions outside the
 class definition:
    class MyClass
    {
      void method();
    }

    void MyClass.method()
    {
      //...
    }

 It definitely cost us time simply trying to understand the class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class definition, you
 want to quickly see what a class has and does.
Sounds like you want an outline view in the IDE. This is supported by DDT in Eclipse. Even TextMate on Mac OS X has a form of outline view.
No, actually, as much as I keep banging on the IDE thing, in this case I
absolutely don't want help from the IDE, I just want to look at my page of
text, and be able to read a useful summary.
Can you give me any good reasons why fully defined functions polluting the
readability of a class definition could possibly be a good thing?
I just don't get it... why would you ever want to break up the nice summary
of what a class has&does, and why would you want to indent all of your
functions an extra few tab levels by default?

As a programmer, I spend a lot more time reading code than documentation,
and much of that time is spent reading it in foreign places like github
commit logs (limited horizontal space), diff/merge windows (hard to
distinguish class API changes vs function body changes at a glance, since
they're interleaved), even chat clients and communication tools. The IDE
can't assist in any of these contexts. If you have to have an IDE to read
your code, then something is really wrong.

...also, that implies you have good IDE integration, which is a central
part of my entire rant! ;)
This argument is invalid until we have that, and at this point, it seems
much more likely we may be able to define methods outside the class scope
than have awesome IDE's.
Sep 01 2013
next sibling parent reply "eles" <eles eles.com> writes:
On Sunday, 1 September 2013 at 14:32:18 UTC, Manu wrote:
 On 1 September 2013 19:57, Jacob Carlborg <doob me.com> wrote:

 On 2013-09-01 04:05, Manu wrote:
I've never met a C++ developer that likes Eclipse ;) But I should probably check it out.
Well, I'm mostly a C guy, but I prefer working in Eclipse over the others (C::B, Anjuta etc.). For this reason, I look forward to DDT. However, for the time being, what keeps me away from DDT is the very reason I embraced Eclipse/CDT in the first place: the debugger. DDT lacks it.
Sep 01 2013
parent Arjan <arjan ask.me> writes:
On Sun, 01 Sep 2013 19:14:11 +0200, eles <eles eles.com> wrote:

 On Sunday, 1 September 2013 at 14:32:18 UTC, Manu wrote:
 On 1 September 2013 19:57, Jacob Carlborg <doob me.com> wrote:

 On 2013-09-01 04:05, Manu wrote:
I've never met a C++ developer that likes Eclipse ;) But I should probably check it out.
Well, I'm mostly a C guy, but I prefer working in Eclipse over others (C::B, Anjuta etc.) For this reason, I look forward to DDT. However, for the time being, what keeps me far from DDT is the very reason that I embraced Eclipse/CDT for: the debugger. DDT lacks it.
I too use Eclipse+CDT (and other plugins like linuxtools, pyDEV) for C/C++
on Linux/BSD. Eclipse is also the "de facto" standard for C/C++ in the
embedded world; see for example www.yoctoproject.org. Also the Android
development toolkit (ADT) is Eclipse based, see
http://developer.android.com/sdk/index.html.

Eclipse+CDT has improved a _lot_ over the last 5/6 years. It has become my
'IDE' of choice for C/C++ development over C::B, codelite, anjuta,
KDevelop, VIM etc on Linux/BSD. On Windows I still prefer MSVS for C++,
which is still way ahead of Eclipse+CDT, but the gap is closing...

In my experience good tools (especially the debugger!) and integration
with MSVS/Eclipse/Xcode are mandatory to gain broader adoption. Most of my
co-workers can't even get anything done without an IDE!
Sep 01 2013
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2013-09-01 16:32, Manu wrote:

 Well we never got OSX working (under mono-d), although this was mainly
 due to supporting apple infrastructure in the end. I think we wrangled
 the toolchain in the end, but never got everything linking; C++
 dependencies got complicated.
 We eventually gave up, just wasting too much time, and he went off and
 did the music/sounds for the game...
Gave up? Why not just use DMD directly from the zip on the command line and use TextMate or Sublime. TextMate 2 supports in app download of new languages and Sublime comes with support for D out of the box. Even though it's not perfect it has to be better than giving up.
 If you'd like to help me finish that OSX work we started together last
 year, that'd be really great for next time! :)
Well, I'm quite busy with my own projects. But I could perhaps give you a hand if you need. Although I don't want to do all the work as last time.
 No, actually, as much as I keep banging on the IDE thing, in this case I
 absolutely don't want help from the IDE, I just want to look at my page
 of text, and be able to read a useful summary.
 Can you give me any good reasons why fully defined functions polluting
 the readability of a class definition could possibly be a good thing?
 I just don't get it... why would you ever want to break up the nice
 summary of what a class has&does, and why would you want to indent all
 of your functions an extra few tab levels by default?
To keep everything in one place. Why would you want to duplicate the method signatures? I hate the header/source synchronization in the C family of languages. Especially in C++ where the signatures cannot even be exactly the same between the header and source file. I'm thinking of default values, for example.
 As a programmer, I spend a lot more time reading code than
 documentation, and much of that time is spent reading it in foreign
 places like github commit logs (limited horizontal space), diff/merge
 windows (hard to distinguish class API changes vs function body changes
 at a glance, since they're interleaved), even chat clients and
 communication tools. The IDE can't assist in any of these contexts. If
 you have to have an IDE to read your code, then something is really wrong.
That I agree with.
 ...also, that implies you have good IDE integration, which is the a
 central part of my entire rant! ;)
 This argument is invalid until we have that, and at this point, it seems
 much more likely we may be able to define methods outside the class
 scope than have awesome IDE's.
-- /Jacob Carlborg
Sep 01 2013
parent reply Manu <turkeyman gmail.com> writes:
On 2 September 2013 05:20, Jacob Carlborg <doob me.com> wrote:

 On 2013-09-01 16:32, Manu wrote:

  Well we never got OSX working (under mono-d), although this was mainly
 due to supporting apple infrastructure in the end. I think we wrangled
 the toolchain in the end, but never got everything linking; C++
 dependencies got complicated.
 We eventually gave up, just wasting too much time, and he went off and
 did the music/sounds for the game...
Gave up? Why not just use DMD directly from the zip on the command line and use TextMate or Sublime. TextMate 2 supports in app download of new languages and Sublime comes with support for D out of the box. Even though it's not perfect it has to be better than giving up.
I dunno. People just don't do that.
It's perceived that typing commands in the command line is a completely
unrealistic workflow for most people who don't love Linux.
He would have also had to have written himself a makefile, and none of us
know how to write a makefile. I generate makefiles with other tools, but
there are no good makegen tools that support D and C projects together,
and even if there were, you'd just be writing a makegen script instead,
which we still didn't know how to write...
We also really didn't have time to stuff around with it. He just went and
recorded audio instead.

Side note: I'm working on support in premake for parallel C and D project
generation for popular IDE's and makefiles. But it's not there yet.

 If you'd like to help me finish that OSX work we started together last
 year, that'd be really great for next time! :)
Well, I'm quite busy with my own projects. But I could perhaps give you a hand if you need. Although I don't want to do all the work as last time.
Fair enough. Well I don't have a Mac, and I don't know Cocoa, or ObjC... :/

 No, actually, as much as I keep banging on the IDE thing, in this case I
 absolutely don't want help from the IDE, I just want to look at my page
 of text, and be able to read a useful summary.
 Can you give me any good reasons why fully defined functions polluting
 the readability of a class definition could possibly be a good thing?
 I just don't get it... why would you ever want to break up the nice
 summary of what a class has&does, and why would you want to indent all
 of your functions an extra few tab levels by default?
To keep everything in one place. Why would you want to duplicate the method signatures? I hate the header/source synchronization in the C family of languages. Especially in C++ where the signatures cannot even be exactly the same between the header and source file. I'm thinking of default values, for example.
I'm not suggesting it be in a separate file, just below the class
definition somewhere.
The IDE actually can help here. If it detects you've modified a function
signature, and it's defined lower in the file, it can fairly easily change
it for you there too.

 As a programmer, I spend a lot more time reading code than
 documentation, and much of that time is spent reading it in foreign
 places like github commit logs (limited horizontal space), diff/merge
 windows (hard to distinguish class API changes vs function body changes
 at a glance, since they're interleaved), even chat clients and
 communication tools. The IDE can't assist in any of these contexts. If
 you have to have an IDE to read your code, then something is really wrong.
That I agree with.
So the conclusion is to put the IDE's assistance burden on the authoring side, not the reading side. I.e., when you change a function signature, the IDE can update the definition's signature too. Good IDEs have awesome refactor tools, where you change a signature, and it will change it at all places that it is referenced.
Sep 01 2013
next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2013-09-02 05:51, Manu wrote:

 I dunno. People just don't do that.
 It's perceived that typing commands in the command line is a completely
 unrealistic workflow for most people that doesn't love linux.
 He would have also had to have written himself a makefile, and none of
 us know how to write a makefile. I generate makefiles with other tools,
 but there are no good makegen tools that support D and C projects
 together, and even if there were, you'd just be writing a makegen script
 instead, which we still didn't know how to write...
 We also really didn't have time to stuff around with it. He just went
 and recorded audio instead.
I would have used a shell script but I get your point.
 Fair enough. Well I don't have a Mac, and I don't know Cocoa, or ObjC... :/
Hehe. You do already support iOS, how was that added?
 Good IDE's have awesome refactor tools, where you change a signature,
 and it will change it at all places that it is referenced.
Then you're back to needing an IDE to use the language. -- /Jacob Carlborg
Sep 02 2013
parent Manu <turkeyman gmail.com> writes:
On 2 September 2013 18:25, Jacob Carlborg <doob me.com> wrote:

 On 2013-09-02 05:51, Manu wrote:

  I dunno. People just don't do that.
 It's perceived that typing commands in the command line is a completely
 unrealistic workflow for most people that doesn't love linux.
 He would have also had to have written himself a makefile, and none of
 us know how to write a makefile. I generate makefiles with other tools,
 but there are no good makegen tools that support D and C projects
 together, and even if there were, you'd just be writing a makegen script
 instead, which we still didn't know how to write...
 We also really didn't have time to stuff around with it. He just went
 and recorded audio instead.
I would have used a shell script but I get your point. Fair enough. Well I don't have a Mac, and I don't know Cocoa, or ObjC...
 :/
Hehe. You do already support iOS, how was that added?
Some time back when I did have a Mac; I lifted the boot code from a sample
app, and then the rest of the code just fell into place because it's super
system-agnostic.

 Good IDE's have awesome refactor tools, where you change a signature,
 and it will change it at all places that it is referenced.
Then your back to need of an IDE to use the language.
Yeah, except this seems like a more sensible application of an IDE helper to me, and it's not required by any means for the language to be useful. Since the code can be read in any number of locations, and expecting the IDE to assist you with making the code readable makes no sense. Rather, move the 'burden' to the authoring stage, and the IDE can equally help there, but the advantage is readability anywhere without IDE support.
Sep 02 2013
prev sibling next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On 9/2/2013 12:51 PM, Manu wrote:

 I dunno. People just don't do that.
 It's perceived that typing commands in the command line is a completely
 unrealistic workflow for most people that doesn't love linux.
 He would have also had to have written himself a makefile, and none of
 us know how to write a makefile. I generate makefiles with other tools,
 but there are no good makegen tools that support D and C projects
 together, and even if there were, you'd just be writing a makegen script
 instead, which we still didn't know how to write...
 We also really didn't have time to stuff around with it. He just went
 and recorded audio instead.
Screw makefiles. dub[1] is the way to go. Dead easy to configure [2] and dead easy to use. A default debug build on the command line is "dub build", or even just "dub". [1] http://code.dlang.org/packages/dub [2] http://code.dlang.org/package-format
Sep 02 2013
parent "Sumit Raja" <sumitraja gmail.com> writes:
 Screw makefiles. dub[1] is the way to go. Dead easy to 
 configure [2] and dead easy to use. A default debug build on 
 the command line is "dub build", or even just "dub".

 [1] http://code.dlang.org/packages/dub
 [2] http://code.dlang.org/package-format
dub + Geany is my combination of choice. Great for cross platform - I'm using the same source tree to build across Windows, Linux and FreeBSD.
Sep 05 2013
prev sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Monday, 2 September 2013 at 03:51:54 UTC, Manu wrote:
 On 2 September 2013 05:20, Jacob Carlborg <doob me.com> wrote:
 Gave up? Why not just use DMD directly from the zip on the 
 command line
 and use TextMate or Sublime. TextMate 2 supports in app 
 download of new
 languages and Sublime comes with support for D out of the box. 
 Even though
 it's not perfect it has to be better than giving up.
I dunno. People just don't do that. It's perceived that typing commands in the command line is a completely unrealistic workflow for most people that doesn't love linux.
It is more of a cultural issue than a real tool stack issue. Yes, I am
perfectly aware that Microsoft has succeeded in creating an incredibly
closed and tool-oriented programming environment, and has also succeeded
in creating a lot of programmers who accept it as the only possible way to
do things. I am perfectly aware that the game dev industry is completely
Microsoft-centric and is forced to accept such rules of the game.

But do you seriously expect anyone with no personal business interest to
work on bringing more of such crap into something that is not broken? You
would have had my sympathy, but the demand "Let's force everyone to use an
IDE" is just insane. All this thread would have made some sense if some
enterprise D entity existed, but it simply does not work that way right
now. And, to be honest, I am glad about it.
Sep 02 2013
next sibling parent reply Joseph Rushton Wakeling <joseph.wakeling webdrake.net> writes:
On 02/09/13 14:51, Dicebot wrote:
 But do you seriously expect anyone with no personal business interest to work
on
 brining more of such crap into something that is not broken? You would have had
 my sympathy but demand "Let's force everyone to use IDE" is just insane. All
 this thread would have made some sense if some enterprise D entity has existed
 but it simply does not work that way right now. And, to be honest, I am glad
 about it.
Personally I find, observing a number of different open source projects,
that a very typical problem is a kind of "selection bias" among
contributors that leads them to significantly under-appreciate the
usability problems of their software.

It goes something like this: anyone who has spent any length of time using
that software (which of course includes most contributors) either had a
workflow and toolchain that the software matched with, or they have been
able to adapt their workflow and toolchain to enable them to use the
software. Usually they have managed to find ways of coping and working
around any other usability issues that arise. And that situation then
compounds itself over time because new users come and either adapt in the
same way that existing contributors have, or they leave.

So, you wind up with a body of contributors who often have much in common
in terms of their setup, their perception of the priorities, and in their
ability to handle the software. And that in turn can be very dangerous,
because you get people who simply don't understand (or have any way to
experience) problems that are brought to them by new users or by others.
And of course there are always greater problems than usability, so those
problems are the ones that get focused on, with the developers all the
while bemoaning the lack of manpower and wondering why it is so difficult
to attract and hold on to contributors.

The only way that I can see to avoid that trap is to have a strong focus
on usability as part of your development process, to make sure that
developers have good connections with a diverse range of potential users
and their experiences, and (where possible) for developers to dedicate
part of their time to actually trying to undergo that experience
themselves.

The TL;DR of what I'm saying here is: while it's certainly crazy to force
D contributors to use IDEs, there's a great deal of value in making sure
that a good number of contributors regularly get IDE experience, and
regularly try out "fresh start" installs of D in IDE and non-IDE
environments, because that way you have a sense of how easy or painful it
is for new users to get things installed and just get hacking.
Sep 02 2013
parent "Dicebot" <public dicebot.lv> writes:
On Monday, 2 September 2013 at 13:36:12 UTC, Joseph Rushton 
Wakeling wrote:
 Personally I find, observing a number of different open source 
 projects, that a very typical problem is a kind of "selection 
 bias" among contributors that leads them to significantly 
 under-appreciate the usability problems of their software.

 <snip>
There is a notable difference between making contribution easier and
improving the user experience. One thing that is often misunderstood about
the driving power of open source is that it is in fact very egoistic. You
do stuff that is useful to you and share it with others because it costs
you nothing and benefits you in the long run. But the key target user
tends to be the developer themselves, not some kind of "end" user. Of
course there is some place for idealistic motivation, but in my opinion
this is the most important part of open-source success.
Sep 02 2013
prev sibling next sibling parent Manu <turkeyman gmail.com> writes:
On 2 September 2013 22:51, Dicebot <public dicebot.lv> wrote:

 On Monday, 2 September 2013 at 03:51:54 UTC, Manu wrote:

 On 2 September 2013 05:20, Jacob Carlborg <doob me.com> wrote:

 Gave up? Why not just use DMD directly from the zip on the command line
 and use TextMate or Sublime. TextMate 2 supports in app download of new
 languages and Sublime comes with support for D out of the box. Even
 though
 it's not perfect it has to be better than giving up.
I dunno. People just don't do that. It's perceived that typing commands in the command line is a completely unrealistic workflow for most people that doesn't love linux.
It is more of a cultural issue than real tool stack issue. Yes, I am perfectly aware that Microsoft has succeeded in creating incredibly closed and tool-oriented programming environment and also succeeded to create lot of programmers that accept it as the only possible way to do things. I am perfectly aware that game dev industry is completely Microsoft-centric and is forced to accept such rules of the game. But do you seriously expect anyone with no personal business interest to work on brining more of such crap into something that is not broken? You would have had my sympathy but demand "Let's force everyone to use IDE" is just insane. All this thread would have made some sense if some enterprise D entity has existed but it simply does not work that way right now. And, to be honest, I am glad about it.
Okay, I clearly made my claim too strongly, but I still think it would be valuable for basically everyone to try it out every now and then, and understand the experience on offer. Sure, each to their own thing... I just wanted to stress that a higher consideration to the end-user experience wouldn't go astray.
Sep 02 2013
prev sibling parent Manu <turkeyman gmail.com> writes:
On 2 September 2013 23:36, Joseph Rushton Wakeling <
joseph.wakeling webdrake.net> wrote:

 On 02/09/13 14:51, Dicebot wrote:

 But do you seriously expect anyone with no personal business interest to
 work on
 brining more of such crap into something that is not broken? You would
 have had
 my sympathy but demand "Let's force everyone to use IDE" is just insane.
 All
 this thread would have made some sense if some enterprise D entity has
 existed
 but it simply does not work that way right now. And, to be honest, I am
 glad
 about it.
Personally I find, observing a number of different open source projects, that a very typical problem is a kind of "selection bias" among contributors that leads them to significantly under-appreciate the usability problems of their software. It goes something like this: anyone who has spent any length of time using that software (which of course includes most contributors) either had a workflow and toolchain that the software matched with, or they have been able to adapt their workflow and toolchain to enable them to use the software. Usually they have managed to find ways of coping and working around any other usability issues that arise. And that situation then compounds itself over time because new users come and either adapt in the same way that existing contributors have, or they leave. So, you wind up with a body of contributors who often have much in common in terms of their setup, their perception of the priorities, and in their ability to handle the software. And that in turn can be very dangerous, because you get people who simply don't understand (or have any way to experience) problems that are brought to them by new users or by others. And of course there are always greater problems than usability, so those problems are the ones that get focused on, with the developers all the while bemoaning the lack of manpower and wondering why it is so difficult to attract and hold on to contributors.
I think this is a very interesting point.

 The only way that I can see to avoid that trap is to have a strong focus on
 usability as part of your development process, to make sure that developers
 have good connections with a diverse range of potential users and their
 experiences, and (where possible) for developers to dedicate part of their
 time to actually trying to undergo that experience themselves.

 The TL;DR of what I'm saying here is: while it's certainly crazy to force
 D contributors to use IDEs, there's a great deal of value in making sure
 that a good number of contributors regularly get IDE experience, and
 regularly try out "fresh start" installs of D in IDE and non-IDE
 environments, because that way you have a sense of how easy or painful it
 is for new users to get things installed and just get hacking.
Thank you.
Sep 02 2013
prev sibling next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Sep 02, 2013 at 12:32:06AM +1000, Manu wrote:
 On 1 September 2013 19:57, Jacob Carlborg <doob me.com> wrote:
[...]
 Sounds like you want an outline view in the IDE. This is supported
 by DDT in Eclipse. Even TextMate on Mac OS X has a form of outline
 view.
No, actually, as much as I keep banging on the IDE thing, in this case I absolutely don't want help from the IDE, I just want to look at my page of text, and be able to read a useful summary. Can you give me any good reasons why fully defined functions polluting the readability of a class definition could possibly be a good thing? I just don't get it... why would you ever want to break up the nice summary of what a class has&does, and why would you want to indent all of your functions an extra few tab levels by default?
If I wanted to do that, I'd set up folding in vim to fold function bodies. There's nothing inherently wrong with fully-defined functions inside their class -- Java does it, and I don't hear Java programmers complain about that.
 As a programmer, I spend a lot more time reading code than
 documentation, and much of that time is spent reading it in foreign
 places like github commit logs (limited horizontal space), diff/merge
 windows (hard to distinguish class API changes vs function body
 changes at a glance, since they're interleaved), even chat clients and
 communication tools. The IDE can't assist in any of these contexts. If
 you have to have an IDE to read your code, then something is really
 wrong.

 ...also, that implies you have good IDE integration, which is the a
 central part of my entire rant! ;)
 This argument is invalid until we have that, and at this point, it
 seems much more likely we may be able to define methods outside the
 class scope than have awesome IDE's.
I dunno, this sounds to me like maybe your class design needs to get looked at. :) I usually try to structure my code such that class methods are relatively short and self-contained, and I don't end up with classes with 50 methods each 10 pages long. T -- Customer support: the art of getting your clients to pay for your own incompetence.
Sep 01 2013
prev sibling next sibling parent reply "Simen Kjaeraas" <simen.kjaras gmail.com> writes:
On Sun, 01 Sep 2013 16:32:06 +0200, Manu <turkeyman gmail.com> wrote:

 No, actually, as much as I keep banging on the IDE thing, in this case I
 absolutely don't want help from the IDE, I just want to look at my page  
 of
 text, and be able to read a useful summary.
 Can you give me any good reasons why fully defined functions polluting  
 the
 readability of a class definition could possibly be a good thing?
 I just don't get it... why would you ever want to break up the nice  
 summary
 of what a class has&does, and why would you want to indent all of your
 functions an extra few tab levels by default?
Here's something D lets you do today:

    class Foo {
        // Definition:

        // Forbles the grabblies.
        void bar();

        // Implementation:
        void bar() {
            myGrabblies.forble();
        }
    }

It does not get rid of the tabs, and has no checking that all functions
are present in the definition (the compiler gets confused if only the
definition is there), but it does give a nice list at the top.

If you want the definition in a different file, and no class Foo { in the
implementation file, you can do this:

    // foo.d
    class Foo {
        // Forbles the grabblies.
        void bar();

        import("foo_imp.d");
    }
    ----------------------------
    //foo_imp.d:

    bar() {
        myGrabblies.forble();
    }

That gives no inline indication of which class the functions belong to,
though. Also, no global functions in foo_imp.d.

Now, neither of these solutions are perfect, but they might be good
enough. I'd also like to see implementation separate from definition, but
I feel dmd -H could do that job nicely by including comments, or possibly
by dmd -D with the correct ddoc template.

-- Simen
Sep 02 2013
parent "Dicebot" <public dicebot.lv> writes:
On Monday, 2 September 2013 at 12:54:41 UTC, Simen Kjaeraas wrote:
 If you want the definition in a different file, and no class 
 Foo { in the
 implementation file, you can do this:

 // foo.d
 class Foo {
     // Forbles the grabblies.
     void bar();

     import("foo_imp.d");
 }
 ----------------------------
 //foo_imp.d:

 bar() {
     myGrabblies.forble();
 }

 That gives no inline indication of which class the functions 
 belong to,
 though. Also, no global functions in foo_imp.d.
That is a pretty fun trick, but it is so preprocessor-flavored! :( Also, it is likely to confuse a lot of semantic analysis tools. Still, it may be a viable hack.
Sep 02 2013
prev sibling parent Danni Coy <danni.coy gmail.com> writes:
The linux user ended up heading the art team so we didn't test on that
environment.
Ideally the Linux user would like D support in KDevelop. Monodevelop is
acceptable but a bit clunky.



On Sun, Sep 1, 2013 at 7:57 PM, Jacob Carlborg <doob me.com> wrote:

 On 2013-09-01 04:05, Manu wrote:

  Naturally, this is primarily a problem with the windows experience, but
 it's so frustrating that it is STILL a problem... how many years later?
 People don't want to 'do work' to install a piece of software. Rather,
 they expect it to 'just work'. We lost about 6 hours trying to get
 everyone's machines working properly.
 In the context of a 48 hour game jam, that's a terrible sign! I just
 kept promising people that it would save time overall... which I wish
 were true.
Was this only on Windows or were there problems on Linux/Mac OS X as well? Getting a workable environment:
 Unsurprisingly, the Linux user was the only person happy work with a
 makefile. Everybody else wanted a comfortable IDE solution (and the
 linux user would prefer it too).
I can understand that. IDE integration absolutely needs to be considered a first class feature
 of D.
 I also suggest that the IDE integration downloads should be hosted on
 the dlang download page so they are obvious and available to everyone
 without having to go looking, and also as a statement that they are
 actually endorsed by the dlanguage authorities. As an end-user, you're
 not left guessing which ones are good/bad/out of date/actually work/etc.
I completely agree. Obviously, we settled on Visual-D (Windows) and Mono-D (OSX/Linux); the
 only realistic choices available.
There's also DDT with Eclipse. It supports auto completion, go to definition, has an outline view and so on. The OSX user would have preferred an XCode integration.

 This one is a bit problematic since Xcode doesn't officially supports
 plugins. But it's still possible, as been shown by Michel Fortin with his D
 for Xcode plugin.

  One more thing:
 I'll just pick one language complaint from the weekend.
 It is how quickly classes became disorganised and difficult to navigate

 We all wanted to ability to define class member functions outside the
 class definition:
    class MyClass
    {
      void method();
    }

    void MyClass.method()
    {
      //...
    }

 It definitely cost us time simply trying to understand the class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class definition, you
 want to quickly see what a class has and does.
Sounds like you want an outline view in the IDE. This is supported by DDT in Eclipse. Even TextMate on Mac OS X has a form of outline view. -- /Jacob Carlborg
Sep 05 2013
prev sibling next sibling parent "ponce" <contact spam.org> writes:
While I consider going to a contest with tools you don't know yet 
a very risky move, I can only empathize with some of the points 
expressed by Manu.

 We needed to mess with sc.ini for quite some time to get the 
 stars aligned
 such that it would actually compile and find the linker+libs.
Same problem here, I used DMD-Win64 only once since I don't remember the steps to get it to work (I think it involved changing some .lib and path in sc.ini).
 Debugging:

 Poor debugging experience wastes your time every 5 minutes.
 I can only speak for the Windows experience (since we failed to 
 get OSX
 working); there are lots of problems with the debugging 
 experience under
 visual studio...
 I haven't logged bugs yet, but I intend to.
 There were many instances of people wasting their time chasing 
 bugs in
 random places when it was simply a case of the debugger lying 
 about the
 value of variables to them, and many more cases where the 
 debugger simply
 refused to produce values for some variables at all.
This happened to me several times, so I ended up relying on writefln debugging again. The fact that an IDE plugin exists at all is a huge acceptance factor in the workplace.
Sep 01 2013
prev sibling next sibling parent reply "Gary Willoughby" <dev nomad.so> writes:
On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 We all wanted to ability to define class member functions 
 outside the class
 definition:
   class MyClass
   {
     void method();
   }

   void MyClass.method()
   {
     //...
   }

 It definitely cost us time simply trying to understand the 
 class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class 
 definition, you want
 to quickly see what a class has and does.
Uggh! I absolutely do not agree with this. You should rely on documentation or an IDE class overview for these things, *not* alter the language. In lieu of IDE support, just use ddoc comments for methods and properties and compile the documentation for each build.
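For the sake of illustration, that workflow looks something like this (the type and method are made up; -D and -Dd are the standard ddoc switches):

    /// A tiny example type, only to show the ddoc flow.
    struct Vector3 { float x, y, z; }

    class Player
    {
        /// Moves the player to the given position.
        /// Params:
        ///     pos = target position in world space
        void moveTo(Vector3 pos) { /* ... */ }
    }

    // Building the overview is then just:
    //   dmd -c -o- -D -Dddocs player.d
    // which writes an HTML summary of the module into ./docs.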
Sep 01 2013
parent reply Manu <turkeyman gmail.com> writes:
On 1 September 2013 20:22, Gary Willoughby <dev nomad.so> wrote:

 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:

 We all wanted to ability to define class member functions outside the
 class
 definition:
   class MyClass
   {
     void method();
   }

   void MyClass.method()
   {
     //...
   }

 It definitely cost us time simply trying to understand the class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class definition, you
 want
 to quickly see what a class has and does.
Uggh! I absolutely do not agree with this. You should rely on documentation or an IDE class overview for these things *not* alter the language. In lieu of IDE support just use ddoc comments for methods and properties and compile the documentation for each build.
I think that's unrealistic. People need to read the code in a variety of places. Github commit logs (limited horizontal space), diff/merge clients, office communication/chat tools. If the code depends on an IDE to be readable, then that's gotta be considered an epic fail! Give me one advantage to defining methods inline? I only see disadvantages. Lots of them.
Sep 01 2013
next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2013-09-01 16:36, Manu wrote:

 Give me one advantage to defining methods inline? I only see
 disadvantages. Lots of them.
Give me one advantage to repeat the method signature. -- /Jacob Carlborg
Sep 01 2013
parent Manu <turkeyman gmail.com> writes:
On 2 September 2013 05:21, Jacob Carlborg <doob me.com> wrote:

 On 2013-09-01 16:36, Manu wrote:

  Give me one advantage to defining methods inline? I only see
 disadvantages. Lots of them.
Give me one advantage to repeat the method signature.
I've listed them, but again:
 * You can read the class declaration; what it has, and does, at a glance.
 * Functions don't have a few extra tabs of white space by default. This
   means you can more easily understand the flow of code within your class.
 * Your code uses less horizontal space.
 * I find it easier to review commits in diffs, since I can clearly see
   the separation between API changes and function implementation changes.

I find it much easier to detect the difference between 0, 1 and 2 tabs
deep, than 2, 3 and 4 tabs deep. I believe there's an exponential falloff
to your precision wrt estimating tab depth the deeper it gets. This makes
the code less easy to follow; "Am I inside a loop, an if block, or a local
function? What depth is the outer scope of the function anyway?" while
skimming through code.
Sep 01 2013
prev sibling parent "Gary Willoughby" <dev nomad.so> writes:
On Sunday, 1 September 2013 at 14:37:13 UTC, Manu wrote:
 On 1 September 2013 20:22, Gary Willoughby <dev nomad.so> wrote:

 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:

 We all wanted to ability to define class member functions 
 outside the
 class
 definition:
   class MyClass
   {
     void method();
   }

   void MyClass.method()
   {
     //...
   }

 It definitely cost us time simply trying to understand the 
 class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class 
 definition, you
 want
 to quickly see what a class has and does.
Uggh! I absolutely do not agree with this. You should rely on documentation or an IDE class overview for these things *not* alter the language. In lieu of IDE support just use ddoc comments for methods and properties and compile the documentation for each build.
I think that's unrealistic. People need to read the code in a variety of places. Github commit logs (limited horizontal space), diff/merge clients, office communication/chat tools. If the code depends on an IDE to be readable, then that's gotta be considered an epic fail! Give me one advantage to defining methods inline? I only see disadvantages. Lots of them.
That's because you're suggesting that if we can't get an immediate overview of class code then it's unreadable, which is complete nonsense. Classes do sometimes grow and become large, and getting an overview is hard, but that doesn't mean the code is unreadable or of low quality. In fact, if the developers have bothered to write the associated ddoc comments while developing the code, creating overviews is as trivial as adding -D to the compile command. An overview of the class's interface should be the job of the documentation or a nice tool such as DDoc, an IDE class overview (there are many), class diagrams, ctags, etc... IMHO it's total folly to further complicate the language to provide you with something that already exists, and I think this particular point comes down to your personal preference for how code should be structured, which is obviously drawn from C++.
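For what it's worth, here's the kind of thing I mean; a minimal sketch with made-up names, where running "dmd -D -o- player.d" has DDoc write out an HTML page listing exactly this interface (the -o- just suppresses the object file):

  /// Represents a player in the game world.
  class Player
  {
    /// Current hit points; never negative.
    int health = 100;

    /// Moves the player by the given offset.
    void move(float dx, float dy)
    {
      // the body never shows up in the generated overview
    }

    /// Returns: true if the player can still act.
    bool isAlive() { return health > 0; }
  }

That generated page is effectively the class overview being asked for, and it costs nothing beyond the comments you should be writing anyway.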
Sep 01 2013
prev sibling next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 ...
 </endrant>
Thought number one after reading: "This is why I absolutely hate programming for Windows!" :) I was pretty happy with vim, grep, gdb and makefiles on Linux. Anyway, the key problem (as far as I can see) here is that few D developers have both experience and personal interest in an IDE/Windows focus and the related tool stack. D has some nice flavor of anarchy - I both like it and consider it a problem.
Sep 01 2013
next sibling parent reply Mike Parker <aldacron gmail.com> writes:
On 9/1/2013 7:40 PM, Dicebot wrote:
 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 ...
 </endrant>
Thought number one after reading "This is why I absolutely hate programming for Windows!" :) Was pretty happy with vim, grep, gdb and makefiles on Linux. Anyway, key problem (as far as I can see) here is that few of D developers have both experience and personal interest in any IDE/Windows focus as well as related tool stack. D has some nice flavor of anarchy - both like it and consider it a problem.
I have a batch file tied to a command prompt shortcut that always sets the D environment when I launch it. Updating DMD is a matter of deleting the old directory and unzipping the zip file. Editing is a matter of launching Sublime Text 2. Compiling is alt-tabbing to the command prompt and typing "dub build". It's a painless process and I never have any trouble. Of course, I don't try to use the MS tools, or build DMD myself. That way lies pain.
Sep 01 2013
next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2013-09-01 13:44, Mike Parker wrote:

 I have a batch file tied to a command prompt shortcut that always sets
 the D environment when I launch it. Updating DMD is a matter of deleting
 the old directory and unzipping the zip file.
Or just use DVM: https://github.com/jacob-carlborg/dvm -- /Jacob Carlborg
Sep 01 2013
parent reply Manu <turkeyman gmail.com> writes:
On 1 September 2013 23:24, Jacob Carlborg <doob me.com> wrote:

 On 2013-09-01 13:44, Mike Parker wrote:

  I have a batch file tied to a command prompt shortcut that always sets
 the D environment when I launch it. Updating DMD is a matter of deleting
 the old directory and unzipping the zip file.
Or just use DVM: https://github.com/jacob-carlborg/dvm
I think infrastructure solutions like this are probably fine, but they're worthless unless they're installed by default as part of the standard DMD installer. If it's robust, convince Walter to include it in the standard installer package?
Sep 01 2013
parent Jacob Carlborg <doob me.com> writes:
On 2013-09-01 17:17, Manu wrote:

 I think infrastructure solutions like this are probably fine, but
 they're worthless unless they're installed by default as part of the
 standard DMD installer.
 If it's robust, convince Walter to include it in the standard installer
 package?
I might look into that when it's time to update to D2. But the point of DVM is to use it instead of an installer. -- /Jacob Carlborg
Sep 01 2013
prev sibling parent reply "Brad Anderson" <eco gnuk.net> writes:
On Sunday, 1 September 2013 at 11:43:53 UTC, Mike Parker wrote:
 I have a batch file tied to a command prompt shortcut that 
 always sets the D environment when I launch it. Updating DMD is 
 a matter of deleting the old directory and unzipping the zip 
 file. Editing is a matter of launching Sublime Text 2. 
 Compiling is alt-tabbing to the command prompt and typing "dub 
 build". It's a painless process and I never have any trouble. 
 Of course, I don't try to use the MS tools, or build DMD 
 myself. That way lies pain.
For a few releases now the D installer for Windows has created a start menu shortcut that launches cmd.exe with dmd et al. added to the PATH. It also adds to the PATH during installation if you let it.
Sep 01 2013
parent reply Mike Parker <aldacron gmail.com> writes:
On 9/2/2013 8:59 AM, Brad Anderson wrote:
 For a few releases now the D installer for Windows has created a start
 menu shortcut that launches cmd.exe with dmd et al. added to the PATH.
 It also adds to the PATH during installation if you let it.
Good news. Does it allow multiple versions of DMD to be installed simultaneously?
Sep 02 2013
next sibling parent "Brad Anderson" <eco gnuk.net> writes:
On Monday, 2 September 2013 at 08:30:26 UTC, Mike Parker wrote:
 On 9/2/2013 8:59 AM, Brad Anderson wrote:
 For a few releases now the D installer for Windows has created 
 a start
 menu shortcut that launches cmd.exe with dmd et al. added to 
 the PATH.
 It also adds to the PATH during installation if you let it.
Good news. Does it allow multiple versions of DMD to be installed simultaneously?
The system PATH modification would conflict of course but you could use the start menu shortcuts that set the PATH just fine.
Sep 02 2013
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2013-09-02 10:30, Mike Parker wrote:

 Good news. Does it allow multiple versions of DMD to be installed
 simultaneously?
DVM already handles this, although it doesn't handle 64-bit Windows. -- /Jacob Carlborg
Sep 02 2013
prev sibling next sibling parent reply Manu <turkeyman gmail.com> writes:
On 1 September 2013 20:40, Dicebot <public dicebot.lv> wrote:

 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:

 ...

 </endrant>
Thought number one after reading "This is why I absolutely hate programming for Windows!" :) Was pretty happy with vim, grep, gdb and makefiles on Linux. Anyway, key problem (as far as I can see) here is that few of D developers have both experience and personal interest in any IDE/Windows focus as well as related tool stack. D has some nice flavor of anarchy - both like it and consider it a problem.
Plenty of the key contributors are Windows users, including Walter, I believe. My suggestion is to encourage devs to use the same tools the end users are using, and declare them first-class language features. I reckon it'll improve quick-smart under that environment.
Sep 01 2013
parent "Dicebot" <public dicebot.lv> writes:
On Sunday, 1 September 2013 at 14:39:37 UTC, Manu wrote:
 Plenty of the key contributors are Windows users, including 
 Walter I
 believe.
Yes but I'd be surprised to learn that their daily D application domain is as IDE-demanding as your game dev experience.
 My suggestion is to encourage dev's to use the same tools the 
 end users are
 using, and declare them first-class language features. I reckon 
 it'll
 improve quick-smart under that environment.
I simply don't see this working with the current D development model. After all, open source is not about simply working for free - it is about working for yourself and then sharing it. There needs to be some _personal_ interest in implementing something; end users are just a side effect. Well, of course I can't say anything about the D devs' motivation, but I simply can't imagine people spending that much time on something they don't need, are not interested in and gain nothing from. If anything, that indicates that there need to be more game dev guys among the core devs.
Sep 01 2013
prev sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Sun, Sep 01, 2013 at 12:40:00PM +0200, Dicebot wrote:
 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
...
</endrant>
Thought number one after reading "This is why I absolutely hate programming for Windows!" :)
+1. :)
 Was pretty happy with vim, grep, gdb and makefiles on Linux. Anyway,
 key problem (as far as I can see) here is that few of D developers
 have both experience and personal interest in any IDE/Windows focus as
 well as related tool stack. D has some nice flavor of anarchy - both
 like it and consider it a problem.
The thing is, we keep hearing complaints about how D IDE integration is bad, etc., but it seems like not many people are willing to do something about it. What we need is somebody who is (1) dedicated to D, (2) dedicated to making IDE integration for D work nicely, and (3) willing to produce lots of code to make it work.

Forcing people to change the way they work on D just so you can have IDE integration probably won't have much of an effect. For instance, I wouldn't touch an IDE with a 10-foot pole. Will I still contribute to D? Sure. Will I do something about IDE integration because everyone complains about it? Unlikely. Will I be glad if somebody steps up and says, here's what I've been doing to make D IDE integration better? I'd fully support it. The question is whether there is such a somebody. :) T -- First Rule of History: History doesn't repeat itself -- historians merely repeat each other.
Sep 01 2013
parent Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Sun, 1 Sep 2013 17:46:15 -0700
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote:

 On Sun, Sep 01, 2013 at 12:40:00PM +0200, Dicebot wrote:
 Was pretty happy with vim, grep, gdb and makefiles on Linux. Anyway,
 key problem (as far as I can see) here is that few of D developers
 have both experience and personal interest in any IDE/Windows focus
 as well as related tool stack. D has some nice flavor of anarchy -
 both like it and consider it a problem.
The thing is, we keep hearing complaints about how D IDE integration is bad, etc., but it seems like not many people are willing to do something about it. What we need is somebody who is (1) dedicated to D, (2) dedicated to making IDE integration for D work nicely, (3) produce lots of code to make it work. Forcing people to change the way to work on D just so you can have IDE integration probably won't have much of an effect. For instance, I wouldn't touch an IDE with a 10-foot pole. Will I still contribute to D? Sure. Will I do something about IDE integration because everyone complains about it? Unlikely. Will I be glad if somebody steps up and say, here's what I've been doing to make D IDE integration better? I'd fully support it. The question is whether there is such a somebody. :)
Yea, that's what I've noticed too, and I've been kinda biting my tongue on it since this thread started. For all the people who complain about their unhappiness with D's IDEs, it's telling how few of them find it important enough to actually *work* on instead of merely complaining.

I found some RDMD issues that were blocking me, so I fixed them. I wanted Windows support for DVM, so I added it. I found the zip-creating process to be problematic, so I'm doing something about it (even though it turned out to be bigger than I'd anticipated). Not to say that IDE stuff isn't a much bigger job than those, but come on, we've got what, one person on Visual-D, one on Mono-D, and one who *used* to do XCode-D but gave up because (and I sympathize) he seemed to be the only one who cared enough to tackle it? Surely *something* could be contributed if it really is as big of a problem as people say (and I'm not doubting that it is).

Sure, there's the matter of "I just don't have time", but frankly *none* of us do. I know I sure as hell don't, and yet I *make* the time anyway because it's an important and worthwhile investment for me. And look at Andrei - he's been one of the top contributors and leaders, and he did so even while he was working on a PhD *in addition* to a full time job and a new family. "Don't have time" doesn't count because it's true for all of us. It just makes IDE users sound like spoiled "gimme gimme gimme", and I'm certainly not going to claim they are, but I just want to point out that's the impression that tends to come across.

There's plenty of other areas where D's needed improvement, and the people who actually cared have stepped up to the plate - so why aren't (for the most part) those who want IDE improvements? (Naturally I greatly applaud the efforts of the few IDE leaders we do have and have had in the past.)
Sep 02 2013
prev sibling next sibling parent "bearophile" <bearophileHUGS lycos.com> writes:
Manu:

 We all wanted to ability to define class member functions 
 outside the class
 definition:
   class MyClass
   {
     void method();
   }

   void MyClass.method()
   {
     //...
   }

 It definitely cost us time simply trying to understand the 
 class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class 
 definition, you want
 to quickly see what a class has and does.
Having the pieces of a class spread out is not something I'd like. So before changing this, a discussion is needed. Bye, bearophile
Sep 01 2013
prev sibling next sibling parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
01-Sep-2013 06:05, Manu wrote:
 The only compiler you can realistically use productively in windows is
 DMD-Win64, and that doesn't work out of the box.
 We needed to mess with sc.ini for quite some time to get the stars
 aligned such that it would actually compile and find the linker+libs.

 Walter: DMD needs to internally detect installations of various versions
 of VisualStudio, and either 'just work', or amend sc.ini on its own. Or
 the installer needs to amend sc.ini. Either way, leaving it to a user to
 fiddle with an ini file just isn't acceptable. We had to google
 solutions to this problem, and even then, we had trouble with the paths
 we added to sc.ini; are spaces acceptable? Do they have quites around
 them?...
 I might also suggest that Microsoft supplied (ie, 'standard'), libraries
 should be automatically detected and path entries added in there too:
    C:\Program Files (x86)\Microsoft SDKs\...
    C:\Program Files (x86)\Microsoft DirectX SDK (June 2010)\...
 These are on basically every windows developers machine, and each of us
 had to configure them ourselves.
Just find a way to do it that isn't hardcoded (which is what Walter inevitably seems to end up with).
 Getting a workable environment:

 Unsurprisingly, the Linux user was the only person happy work with a
 makefile. Everybody else wanted a comfortable IDE solution (and the
 linux user would prefer it too).

 !!!!!!!!!
 This has to be given first-class attention!
 I am completely and utterly sick of this problem. Don made a massive
 point of it in his DConf talk, and I want to re-re-re-re-re-re-re-stress
 how absolutely important this is.
 !!!!!!!!!
+1. Strangely, I went from VisualD to Sublime Text and never looked back - it's far simpler (no overhead of having a project/configuration etc. per short snippet of code I need to try), and the STABILITY issues, well, need to be addressed.
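For those short snippets, a single throwaway file run through rdmd is all the "project" I ever need; a trivial, made-up example:

  // scratch.d -- run with: rdmd scratch.d
  import std.algorithm, std.stdio;

  void main()
  {
    // quick check of whatever I'm curious about at the moment
    [3, 1, 2].sort().writeln();
  }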
 I have come to the conclusion that treating IDE integration as ancillary
 projects maintained by usually just one single member of the community
 has absolutely failed.
 I suggest:
   * These should be made central D community projects.
   * I think they should be hosted in the same github organisation as DMD.
Maybe. Though this move alone hardly buys anything, unlike e.g. being prominently featured on the dlang.org download page, which would raise the number of users and bugs reported (something you do propose, though).
   *** As many contributors as possible should be encouraged to work with
 them every day.
So true. But in the case of, say, VisualD, there simply aren't enough folks with the right kind of expertise - people who are familiar with the VisualStudio SDK and the number of arcane steps to build/integrate it, AND have the time/inclination to work on it. Ditto with other IDEs; it's a lot of upfront work to learn the infrastructure, and, as with anything interactive, there is no easy test suite to check your tweaks/hacks.
     - Deprecate DMD makefiles. Seriously! Insist that contributors use
 the IDE bindings to work on DMD.
While I understand the sentiment, it is not a good idea. Makefiles are crap, but some modern build tool would do just fine (certainly not a per-IDE project file). _Having_ to use some IDE just hurts automation, adds a dependency AND raises the barrier of entry (we'd lose all the editor + cmd line guys, and hell, they are too amazing to lose). Not only that, but toolchains and core libraries are all about batch processing and easily reproducible test runs - nothing like an interactive app. Hence the large difference in mindset w.r.t., say, debugging, as batch tools are far more amenable to printf/assert-style debugging (more specifically the postmortem style, including analyzing core dumps/stack traces). Unless there is some organized effort behind "interactive D", things will keep moving slowly. -- Dmitry Olshansky
Sep 01 2013
parent reply Jacob Carlborg <doob me.com> writes:
On 2013-09-01 18:24, Dmitry Olshansky wrote:

 So true. But in case of say VisualD there simply not enough of folks
 with right kind of expertise - that are familiar with VisualStudio SDK +
 the number of arcane steps to build/integrate it AND have
 time/inclination to work on it.
 Ditto with other IDEs, it's a lot of upfront work to learn the
 infrastructure + as anything interactive there is no easy test suite to
 check your tweaks/hacks.
And Xcode, which doesn't even officially support plugins. -- /Jacob Carlborg
Sep 01 2013
parent reply Michel Fortin <michel.fortin michelf.ca> writes:
On 2013-09-01 19:25:10 +0000, Jacob Carlborg <doob me.com> said:

 On 2013-09-01 18:24, Dmitry Olshansky wrote:
 
 So true. But in case of say VisualD there simply not enough of folks
 with right kind of expertise - that are familiar with VisualStudio SDK +
 the number of arcane steps to build/integrate it AND have
 time/inclination to work on it.
 Ditto with other IDEs, it's a lot of upfront work to learn the
 infrastructure + as anything interactive there is no easy test suite to
 check your tweaks/hacks.
And Xcode which doesn't even officially supports plugins.
But reverse-engineering Objective-C classes knowing only the name of classes and their methods is so much fun! How can people not be interested in running Xcode inside the debugger to follow what calls what? Then all you have to do is inject some code at the right place to subvert the IDE into compiling your D code, tracking D module dependencies, and suggesting completions as you type. Piece of cake! -- Michel Fortin michel.fortin michelf.ca http://michelf.ca
Sep 01 2013
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 2:04 PM, Michel Fortin wrote:
 Piece of cake!
I agree!
Sep 01 2013
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2013-09-01 23:04, Michel Fortin wrote:

 But reverse-engineering Objective-C classes knowing only the name of
 classes and their methods is so much fun! How can people not be
 interested in running Xcode inside the debugger to follow what calls
 what? Then all you have to do is inject some code at the right place to
 subvert the IDE into compiling your D code, tracking D module
 dependencies, and suggesting completions as you type. Piece of cake!
Hehe :) -- /Jacob Carlborg
Sep 02 2013
prev sibling next sibling parent =?UTF-8?B?QWxpIMOHZWhyZWxp?= <acehreli yahoo.com> writes:
On 08/31/2013 07:05 PM, Manu wrote:

 IDE integration absolutely needs to be considered a first class 
feature of
 D.
This is probably a repeat but Brian Schott has just announced DCD: http://forum.dlang.org/post/hrbzrholeoyyriumddjd forum.dlang.org Ali
Sep 01 2013
prev sibling next sibling parent reply "Volcz" <volcz kth.se> writes:
On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 </endrant>
Completely agree with Manu! I work in the Java + telecom world of programming and recently graduated, so that's the majority of my experience. From what I've seen and experienced, most programmers today can't live without an IDE and a rock-solid tool chain. I've seen many solutions posted in this thread to parts of the problem, e.g. auto-complete, debugging, etc. The thing with an IDE is that it has to "JUST WORK"; it should work with all the different components seamlessly. The question I would like to ask is: how can WE as a COMMUNITY improve this situation? Official IDE support? Official "IDE components"? A roadmap? Other languages like Java have corporate backing. What difference does that make? Why don't we have a roadmap? How do other open source communities work, and what can we learn from them? Sorry for the lack of structure in my post.
Sep 01 2013
parent reply Nick Sabalausky <SeeWebsiteToContactMe semitwist.com> writes:
On Sun, 01 Sep 2013 19:01:20 +0200
"Volcz" <volcz kth.se> wrote:

 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 </endrant>
Completely agree with Manu! I work in the Java + telecom world of programming and recently graduated, so that's the majority of my experience. From what I've seen and experienced is that most today programmers can't live without an IDE and a rock solid tool chain. I've seen many solutions posted in this thread to parts of the problems, eg auto complete, debugging etc. The thing with an IDE is that it has "JUST WORK" it should work with all the different components seamlessly. The question I would like to ask is how can WE as a COMMUNITY improve this situation? Official IDE support? Official "IDE components"? Roadmap? Other languages like Java have corporate backing. What difference makes this? Why doesn't we have a roadmap? How does other open source communities work and what can we learn from them? Sorry for the un-structure of my post.
To be honest, I never found Eclipse to "just work", especially with anything other than Java. Actually, for anything but Java I always found it to be an unusable nightmare (But good for Java even if less than perfect). Just FWIW.
Sep 02 2013
next sibling parent "growler" <growlercab gmail.com> writes:
On Tuesday, 3 September 2013 at 02:06:55 UTC, Nick Sabalausky 
wrote:
 On Sun, 01 Sep 2013 19:01:20 +0200
 "Volcz" <volcz kth.se> wrote:

 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 </endrant>
Completely agree with Manu! I work in the Java + telecom world of programming and recently graduated, so that's the majority of my experience. From what I've seen and experienced is that most today programmers can't live without an IDE and a rock solid tool chain. I've seen many solutions posted in this thread to parts of the problems, eg auto complete, debugging etc. The thing with an IDE is that it has "JUST WORK" it should work with all the different components seamlessly. The question I would like to ask is how can WE as a COMMUNITY improve this situation? Official IDE support? Official "IDE components"? Roadmap? Other languages like Java have corporate backing. What difference makes this? Why doesn't we have a roadmap? How does other open source communities work and what can we learn from them? Sorry for the un-structure of my post.
To be honest, I never found Eclipse to "just work", especially with anything other than Java. Actually, for anything but Java I always found it to be an unusable nightmare (But good for Java even if less than perfect). Just FWIW.
+1 The Eclipse!1Gigabyte.editor is pure bloatware. I don't use GUIs, except for debugging, but if D does get an official GUI please don't let it be Eclipse. http://www.ihateeclipse.com/
Sep 02 2013
prev sibling parent Arjan <arjan ask.me> writes:
On Tue, 03 Sep 2013 04:06:53 +0200, Nick Sabalausky  
<SeeWebsiteToContactMe semitwist.com> wrote:

 On Sun, 01 Sep 2013 19:01:20 +0200
 "Volcz" <volcz kth.se> wrote:

 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 </endrant>
Completely agree with Manu! I work in the Java + telecom world of programming and recently graduated, so that's the majority of my experience. From what I've seen and experienced is that most today programmers can't live without an IDE and a rock solid tool chain.
Unfortunately that is my experience as well, even for senior developers. The ones that are able to get things done without an IDE are most often the 'geeks', 'aces' and 'pros' - people with a passion for the language/platform. If D wants to gain broader adoption (beyond the geek/ace/pro), an IDE-like experience is indispensable and good debugging facilities are absolutely mandatory!
 To be honest, I never found Eclipse to "just work"
Well, w.r.t. the CDT / linuxtools / PyDev I can only say it has improved a _lot_ over the last few years. In fact it has become my IDE of choice for C/C++/Python development on Linux/BSD.
Sep 03 2013
prev sibling next sibling parent reply "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 Documentation:

 Okay for the most part, but some windows dev's want a CHM that 
 looks like
 the typical Microsoft doc's people are used to. Those that 
 aren't familiar
 with the CHM viewer; it's just HTML but with a nice index + 
 layout tree.
What can be improved in the CHM that's currently bundled with DMD?
Sep 01 2013
parent reply Manu <turkeyman gmail.com> writes:
On 2 September 2013 05:35, Vladimir Panteleev
<vladimir thecybershadow.net>wrote:

 On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:

 Documentation:

 Okay for the most part, but some windows dev's want a CHM that looks like
 the typical Microsoft doc's people are used to. Those that aren't familiar
 with the CHM viewer; it's just HTML but with a nice index + layout tree.
What can be improved in the CHM that's currently bundled with DMD?
It can have a link inserted in the start menu. I don't usually go looking for docs in bin/... If I were gonna be picky, the index seems incomplete. Open it up, click the 'index' tab, and type tostringz... it's not there. I think it's also very unnatural for a chm to have a module's functions all on the one page. Can they be broken up into separate articles? But it's fine. The main problem I had with it is that I had no idea it existed. I didn't look in bin/.
Sep 01 2013
parent Walter Bright <newshound2 digitalmars.com> writes:
On 9/1/2013 9:11 PM, Manu wrote:
 If I were gonna be picky, the index seems incomplete. Open it up, click the
 'index' tab, and type tostringz... it's not there.
 I think it's also very unnatural for a chm to have the module's functions all
on
 the one page. Can they be broken up into separate articles?
Of course. The generation of chm is all in the git repository - anyone can fix it if they've a mind to!
Sep 02 2013
prev sibling next sibling parent reply "Ramon" <spam thanks.no> writes:
Manu

(answering the OP)

Stupid me, I just addressed similar issues in 
http://forum.dlang.org/thread/jlssvljpdgxlubdajwyy forum.dlang.org 
("End user experience with D") because at first I thought this thread 
was about game development (which I don't care about).

Short version: I agree with Manu. Completely. Strongly.

One warning though: We should differentiate between "essential" 
and "nice but not urgent", for a simple reason: feasibility.

Sure, .chm help would be nice for Windows users, but frankly, good 
docs at all are more important; if they're, say, HTML, that's good 
enough for a start.

As for an IDE, we should definitely favour something cross 
platform (and non-Java!!!) rather than caring about peculiarities 
of Windows, OSX, lx/kde, etc. Those can be done later. What's really 
important is to have a base ASAP.

The killer: Debugging.

GDB is the bare minimum, and seeing addresses of structures rather than 
the members and their values is basically worthless.

When trying a new language I usually do a small project: 
"osnsort", i.e. a utility that takes (usually via a pipe from 'du 
-h') lines with OS-style numbers (like "4,3M" for a 4,3 MB file size) 
and then sorts those lines properly (unlike sort, which lousily 
fails).
While D nicely showed some of its strengths (the coding was 
really enjoyable and the docs good enough for someone with a C 
background), I soon found myself inserting lots of "debug" 
statements as a "more elegant" version of the old "#ifdef DEBUG 
printf(...) #endif". *** YUCK!!! ***

Putting it bluntly: D is such a great, powerful, nice language 
that it's worth a lot of effort. With any other language I would 
have turned away and taken it to be a hobby thingy.

So, we don't need to please everyone and take care of their OS issues 
right now, but we definitely need some kind of working base, even a 
somewhat crude one, in terms of debugger support, IDE and docs.

A+ -R
Sep 01 2013
parent "Ramon" <spam thanks.no> writes:
=============== INTERRUPT ================

Can we, please, just for a moment leave details aside?



Dear Walter Bright

There are currently 3 threads addressing more or less the same 
issue: user experience.

You have, no doubt, created a wonderful, great and promising 
language. That's why we're here.

Yet we have heard from users who left. Not because D isn't good 
enough - actually it's so good that some at least stay loosely 
connected and read and write here - but because in order to be 
useful and to grow and spread all over the universe, the power, 
the beauty and the elegance of a language must be found in 
practical interaction, too. In other words: user experience.

It seems in that context there are mainly 3 points coming up:

- IDE
- Docs
- Debugging

And I'd like to add a 4th: C library bindings.

Obviously neither you nor any 3 or 5 of us can possibly make 
*every*one happy. Equally obviously, making too many people unhappy 
leads to potential D users leaving the language.

Shouldn't we find some reasonable common ground? Shouldn't we at 
least discuss and create a rough to-do roadmap with some priorities?

There are, for instance, tools out there that allow creating 
various common documentation formats, incl. HTML and .chm.
Shouldn't we make an effort to identify and agree on (or, if 
necessary, have a dictum from you and/or Andrei about) some 
cross-platform tool for documentation, so as to be able to create 
all needed formats from one set of docs?

Thanks for D and - please - let us take the next step now. Let us 
not debate over Eclipse vs. Vim but agree that at least 1 
cross-platform IDE must be supported along with at least 1 
cross-platform editor. Let us not only address language issues and 
problems with a narrow view but at the same time keep our 
minds open to the larger picture.

And let us agree that a not-yet-perfect language along with well 
usable and reliably working tools is worth at least as much as 
an ever more refined language with lousy, alpha, broken, old, or 
simply not reasonably usable tools.
The same goes for the tools. Let us have 1 set of cross-platform 
tools, even primitive ones, but reliable and reliably working 
ones, rather than striving to have a variety of tools for each and 
every OS and whim, addressing every minute detail coming up.

Let's build a base - and then conquer the world.
Sep 01 2013
prev sibling next sibling parent "Kagamin" <spam here.lot> writes:
On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 Debugging:
Well, yes, for premium debugging support, you should probably use
 One more thing:
 I'll just pick one language complaint from the weekend.
 It is how quickly classes became disorganised and difficult to 
 navigate

 We all wanted to ability to define class member functions 
 outside the class
 definition:
   class MyClass
   {
     void method();
   }

   void MyClass.method()
   {
     //...
   }

 It definitely cost us time simply trying to understand the 
 class layout
 visually (ie, when IDE support is barely available).
 You don't need to see the function bodies in the class 
 definition, you want
 to quickly see what a class has and does.
combobox is for, I guess: a clear list of members.
 I might have even just proved to them that they should indeed

On Sunday, 1 September 2013 at 13:19:54 UTC, Manu wrote:
 Hmmm, I found details on the net that recommended adding an 
 [Environment64]
 section, which we did.

 I don't seem to have VCINSTALLDIR or WindowsSdkDir variables on 
 my system
 :/ .. that said, VC obviously works on my machine.
 It also seems potentially problematic that a variable would 
 define a single
 install directory, since it's pretty common that programmers 
 have multiple
 versions of VS on their machines.
VS provides shortcuts to the environment setup scripts in the start menu, which set up the environment variables. That's how it works for C, and it works the same for everything that uses C.
Sep 03 2013
prev sibling parent "Don" <x nospam.com> writes:
On Sunday, 1 September 2013 at 02:05:51 UTC, Manu wrote:
 We have to get the user experience and first impressions under 
 control...
I've created a bug report for the easiest of your requests: http://d.puremagic.com/issues/show_bug.cgi?id=10954 We need to do everything we can to make it more attractive to work on the IDE projects. This seems like an easy first step to start raising the profile.
Sep 03 2013