digitalmars.D.learn - Setting up dmd properly
- Jason Jeffory (26/26) Jan 10 2016 Dmd's setup construction is a bit weird and has some difficult
- Jason Jeffory (4/30) Jan 10 2016 Also, any linked in libraries could report their format and such.
- Jason Jeffory (8/47) Jan 10 2016 and how does one link in compiled static libraries into a dub
- Robert M. Münch (14/24) Jan 10 2016 I agree with all your other points. Telling explicit what's going on
- Jason Jeffory (16/39) Jan 11 2016 Thanks, that works but
- Mike Parker (28/49) Jan 11 2016 lflags is probably not the best way to do it. The "libs" field is
- Robert M. Münch (13/17) Jan 12 2016 I have seen countless problems because apps are using dynamic linking
- Mike Parker (5/8) Jan 12 2016 I'm not talking about dynamic linking, but dynamic loading. This
- Mike Parker (5/13) Jan 12 2016 To clarify, static bindings can be used when linking both
- Jason Jeffory (54/69) Jan 12 2016 It seems the whole state of affairs in programming is "Lets do
- Laeeth Isharc (62/94) Jan 12 2016 Since people aren't being paid to do this, and it's not enjoyable
- Jason Jeffory (63/163) Jan 12 2016 No, but what's the point if there is not a proper supported tool
- Jason Jeffory (10/10) Jan 12 2016 (I should mention that I am exaggerating a bit, and some of the
- Mike Parker (41/52) Jan 12 2016 I think there's another side of this in that what is an issue for
- Jason Jeffory (91/144) Jan 12 2016 Yes, but the world consists of many different programmers, and
- Jacob Carlborg (6/18) Jan 12 2016 You can get some more information by compiling with the "-v" flag. In
Dmd's setup construction is a bit weird and has some difficult issue tracking. How about dmd supporting, if it doesn't already, some way to help the user check the configuration of dmd? It would be quick and easy to implement. e.g., dmd -showinfo

Target Arch: x86
Libraries: C:\Mylib;C:\Another\Lib\Somewhere
Modules: C:\MyModules;
Version: 2.062
etc...

This way, issues between 64- and 32-bit paths can easily be seen, figuring out exactly what sc.ini is doing is easier, etc. We know it is accurate because it would come from the compiler itself. No guessing. Probably a dmd app could be created that does this instead?

Basically, I've run into issues before setting up D because of path issues (again, the sc.ini file is pretty crappy... littered with duplicate symbols and different paths with "hacks" for different compilers and all that (it's just not sane)). It tends to make me hesitant dealing with dmd in some cases. Something that should take a minute can take hours trying to track down some weird issue simply because of a typo... and there's no way to know exactly what the compiler is "seeing"... I really hope something like this already exists.
Jan 10 2016
On Monday, 11 January 2016 at 01:22:28 UTC, Jason Jeffory wrote:

> Dmd's setup construction is a bit weird and has some difficult issue tracking. How about dmd supporting, if it doesn't already, some way to help the user check the configuration of dmd? [...]

Also, any linked-in libraries could report their format and such. I've had problems figuring out certain COFF lib issues and all that. Knowing exactly what's going on is a good thing, right?!?!?
Jan 10 2016
On Monday, 11 January 2016 at 01:24:44 UTC, Jason Jeffory wrote:

> Also, any linked-in libraries could report their format and such. I've had problems figuring out certain COFF lib issues and all that. Knowing exactly what's going on is a good thing, right?!?!?

And how does one link compiled static libraries into a dub project? I tried adding stuff like "lflags" : ["+C:\\MyLibs\\"], with the .lib file in it, but that doesn't work. (I'd expect to have to supply the file name somewhere, at least.) Thanks.
Jan 10 2016
On 2016-01-11 01:47:54 +0000, Jason Jeffory said:

> And how does one link compiled static libraries into a dub project? I tried adding stuff like "lflags" : ["+C:\\MyLibs\\"], with the .lib file in it, but that doesn't work. [...]

I agree with all your other points. Being explicit about what's going on would help a lot in daily business, not only for D but for all compiler stuff. But it seems to be tradition not to do this.

Anyway, regarding the static libs. I used this on a Win64 project and it works:

"lflags" : [
    "D:\\develop\\cairo\\cairo\\src\\release\\cairo-static.lib",
    "D:\\develop\\cairo\\libpng\\libpng.lib",
    "gdi32.lib"
],

-- Robert M. Münch http://www.saphirion.com smarter | better | faster
Jan 10 2016
On Monday, 11 January 2016 at 05:46:11 UTC, Robert M. Münch wrote:

> Anyway, regarding the static libs. I used this on a Win64 project and it works: "lflags" : [ "D:\\develop\\cairo\\cairo\\src\\release\\cairo-static.lib", ... ],

Thanks, that works, but:

1. *not a valid lib file* (glfw3.lib) ;/ Ok,

2. What about 64-bit? Does one have to maintain two branches for that?

I don't understand why the trend is not to be verbose but to hide details ;/ It's simply the wrong way.

1. Tried Windows link instead; I remember having problems like this in the past with optlink: "LINK : fatal error LNK1104: cannot open file '_CMDLINE'" ;/ Tried converting with coffimplib: not an import library. Another rabbit hole to go down ;/ (Why do programmers make programmers' lives hell?)

After trying various other things that didn't work, I'm done for today... too frustrating. Hopefully I'll come back tomorrow to a nice surprise.
Jan 11 2016
On Monday, 11 January 2016 at 16:27:54 UTC, Jason Jeffory wrote:

> Thanks, that works, but:

lflags is probably not the best way to do it. The "libs" field is better. This will guarantee that the library is passed in a compiler-appropriate manner across platforms. lflags is compiler-specific.

> 1. *not a valid lib file* (glfw3.lib) ;/ Ok,

It's likely a COFF vs OMF issue.

> 2. What about 64-bit? Does one have to maintain two branches for that?

No. You might keep the libraries in separate directories or use a naming convention (like appending -32 or -64 to the library names) to distinguish them. Using DUB, you could then add something like the following:

"libs-windows-dmd-x86": ["myWinLib-32"],
"libs-windows-dmd-x86_64": ["myWinLib-64"]

Drop the "windows" bit for cross-platform stuff. Of course, this is dependent upon you passing -ax86_64 to DUB when you want to compile for 64-bit.

> 1. Tried Windows link instead; I remember having problems like this in the past with optlink: "LINK : fatal error LNK1104: cannot open file '_CMDLINE'" ;/ Tried converting with coffimplib: not an import library.

coffimplib [1] is for converting import libraries, not static libraries. You can also use implib (part of the basic utilities package [2]) to generate an import library if you have a DLL. You should use coff2omf [3] to convert static libraries and object files.

You can avoid all of these headaches by using dynamic bindings like those at DerelictOrg [4] if they are available for the libraries you use. Then the compile-time dependency on the C library goes away and all you need is the DLL at runtime.
[1] http://www.digitalmars.com/ctg/coffimplib.html [2] http://www.digitalmars.com/download/freecompiler.html [3] http://www.digitalmars.com/ctg/coff2omf.html [4] https://github.com/DerelictOrg
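Putting the "libs" advice together, a minimal dub.json for the 32/64-bit split might look like the following sketch. The package name and library names here are hypothetical, and it assumes libraries renamed with the -32/-64 convention described above:

```json
{
    "name": "myapp",
    "libs": ["gdi32"],
    "libs-windows-dmd-x86": ["glfw3-32"],
    "libs-windows-dmd-x86_64": ["glfw3-64"]
}
```

With this in place, `dub build` picks the matching field for the current platform/compiler/architecture, so no separate project branches are needed.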
Jan 11 2016
On 2016-01-12 04:15:36 +0000, Mike Parker said:

> You can avoid all of these headaches by using dynamic bindings like those at DerelictOrg [4] if they are available for the libraries you use. Then the compile-time dependency on the C library goes away and all you need is the DLL at runtime.

I have seen countless problems because apps are using dynamic linking, with whole IT environments getting into DLL hell. IMO one of the worst ideas these days. How simple would it be to just have one self-contained executable? And all the Docker hype is doing / simulating this with a sledgehammer.

I prefer to link everything statically, and it has saved us and our clients hours of headache. Drive space is no longer a limiting factor, but time and customer satisfaction always are.

-- Robert M. Münch http://www.saphirion.com smarter | better | faster
Jan 12 2016
On Tuesday, 12 January 2016 at 08:42:19 UTC, Robert M. Münch wrote:

> I have seen countless problems because apps are using dynamic linking, with whole IT environments getting into DLL hell. IMO one of the worst ideas these days.

I'm not talking about dynamic linking, but dynamic loading. This allows more control over which versions of a dynamic library are supported and helps to avoid DLL hell.
Jan 12 2016
On Tuesday, 12 January 2016 at 12:32:11 UTC, Mike Parker wrote:

> I'm not talking about dynamic linking, but dynamic loading. This allows more control over which versions of a dynamic library are supported and helps to avoid DLL hell.

To clarify, static bindings can be used when linking both statically and dynamically. Dynamic bindings have no link-time dependency at all and are used to load a dynamic library manually at runtime.
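The distinction is easiest to see outside of D: with dynamic loading, the library and its symbols are located at runtime, so the linker never hears about them. A sketch of the mechanism in Python via ctypes, using the C math library as a stand-in for something like glfw3.dll (a Derelict-style binding does the moral equivalent with LoadLibrary/dlopen for every wrapped function):

```python
import ctypes
import ctypes.util

# Locate and load the C math library at runtime -- there is no
# link-time dependency on it at all, which is the point of a
# dynamic binding.
path = ctypes.util.find_library("m")
libm = ctypes.CDLL(path if path else "libm.so.6")

# Declare the signature, then look the symbol up by name --
# this is what a dynamic binding does for each function it wraps.
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

print(libm.cos(0.0))  # 1.0
```

If the library is missing, loading fails at this point at runtime rather than at link time, which is exactly the trade-off being discussed: the binding can check versions and fail gracefully instead of the linker refusing up front.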
Jan 12 2016
On Tuesday, 12 January 2016 at 08:42:19 UTC, Robert M. Münch wrote:

> I prefer to link everything static, and it saved us and our clients hours of headache. [...]

It seems the whole state of affairs in programming is "Let's do the most minimal work to get X to work in environment Y. To hell with everything else!" Programmers tend to do the most minimal work they can get away with. This isn't 1984, but coding quality has not increased much since then. No programmer, in this day and age, should have to spend more than a few minutes getting to the point of actual programming.

Programmers can code smarter, faster, and better, yet when it comes to the tooling, they tend to suck balls. Visual Studio style is about the minimum one should expect. I've had virtually no problems with it. MS did a good job of modernizing the toolchain...

Most people that code on Linux think that it should be "hard" and GUIs suck, that programming is supposed to be a hazing ritual. They set up their system to work for them, and it works... anyone with problems must be ignorant and not a "pro programmer". It's kind of an elitist attitude. They spend more time solving 1%'er problems than creating tools that *just* work for 99% of the people. When problems occur it is never their fault but the fault of the ignorant caveman trying to become a godly programmer.

Just search "openGL dmd" (28k results) and about 80% of the results are people having problems getting openGL working with D. "openGL dmd error" has 1M results; that's nearly 30 times as many. Of course, these don't mean much, but they do give the trend. That's just for openGL.

D has a long way to go to be competitive... as long as the tooling sucks and there are problems with stupid issues such as COFF vs OMF, installation issues, IDE issues, etc., it won't get off the ground. The D "core" seems to be mainly interested in fixing and enhancing very niche issues in D instead of working on making it a viable and usable candidate for the masses.
They think that taking a Ferrari and doing all the pin stripes and detail work and adding a specialized turbocharger is going to make it more appealing... yet they never put gas in it so that the customer can actually test drive it. There is a benefit to having D work well... the benefit is that a larger user base = more man-hours to help D evolve.

The reason why MS and VS are better is that a noob like myself can install it and hit the gas pedal and go. It looks good, it's fast, it's not the Ferrari... it's like a Mazda. But it runs! No frustration figuring out why the damn thing won't start. I want to drive! Not fucking around for days trying to figure out why the thing won't start. It's not my job to fill it up with gas; that's the dealer's responsibility.

Anyways, sorry for the rant... not like things will change. D does fill a niche, and it shows ;/ Just wish I could drive the Ferrari! I know it's awesome! But the Mazda is more affordable (man-hours wise) and gets me where I want to go without hassle. (I should have said dragster instead of Ferrari... something that is super fast but may blow up and kill you... anyways, you get the point!)
Jan 12 2016
On Tuesday, 12 January 2016 at 19:38:32 UTC, Jason Jeffory wrote:

> It seems the whole state of affairs in programming is "Let's do the most minimal work to get X to work in environment Y. To hell with everything else!" Programmers tend to do the most minimal work they can get away with.

Since people aren't being paid to do this, and it's not enjoyable for many to make things universally easy across different environments once someone has solved their own problem, you can hardly object to the behaviour - particularly because different people are good at different things, and the guy who creates a project may not be the same guy needed to make it easy to use. Then it's more a question of treating it as a challenge to be solved.

It's quite amazing how much a relatively small number of people have accomplished, and it's something of a hazard of open source that instead of gratitude people receive far more criticism and complaint. (They say a 2:1 balance of positive:negative comments is needed for a healthy relationship.) So it's an engineering or institutional challenge - how does one achieve this as a community?

> This isn't 1984, but coding quality has not increased much since then.

A little hyperbolic? ;) We do seem to have higher quality problems today, but do you recall what code from the 80s was like?

> Most people that code on Linux think that it should be "hard" and GUIs suck [...] When problems occur it is never their fault but the fault of the ignorant caveman trying to become a godly programmer.

Do you think that's actually representative of the attitudes of people here? I haven't seen that. But work won't get done without a plan and without someone to actually do it, and one can't force people to do things they don't want to do. A big problem is that people don't know what to work on, and maybe some kind of systematic approach to identifying problem needs would help.

> Just search "openGL dmd" (28k results) and about 80% of the results are people having problems getting openGL working with D. "openGL dmd error" has 1M results; that's nearly 30 times as many.

It would be a good idea to systematize this and produce a web report so one can see in a more structured way where the biggest difficulties are. I have been talking to Adam a bit about ways we could do this using forum history.

I agree with your observation that there is much friction in the way of a new user learning D and that many simply won't persevere long enough. That's nonetheless a better problem to have than having an intrinsically inferior product - one just needs to establish a process, and to have some way of organizing efforts to address these difficulties (which may include funding to a certain extent). I think it's a necessary part of the way D has developed that people have focused on the language and core library first - it's not so long that it has been stable and ready to use, and over time better tooling will unfold. (Constructive criticism and ideas may help this process.)

> D has a long way to go to be competitive... as long as the tooling sucks and there are problems with stupid issues such as COFF vs OMF, installation issues, IDE issues, etc., it won't get off the ground.

Depends what the competition is ;) COFF vs OMF will disappear in time as people move to 64-bit. Installation questions seem to be improving. IDE support keeps getting better. For many uses, these things are a one-off price for adopting D. Whether it's feasible to pay that depends on what you are doing and the people you are working with.

> The D "core" seems to be mainly interested in fixing and enhancing very niche issues in D instead of working on making it a viable and usable candidate for the masses.

But it is in the nature of things that disruptive technologies start off as inferior in certain respects, and it's only with time that they can become a superior competitor across the board to the dominant technologies. See Clayton Christensen's work "The Innovator's Dilemma". It is what it is, and one can't wave a magic wand to force people to work for free on shiny tools to make it easy. If that's what one wants, then one can do one's small part to encourage this to happen - work on that oneself and contribute it, file bugzilla issues, fund the D foundation (once it is ready). But simply complaining won't change anything. BTW, I hardly think that memory allocation, containers, documentation, and the web site (recent areas of focus by leadership) are niche issues.

> Anyways, sorry for the rant... not like things will change. D does fill a niche, and it shows ;/ Just wish I could drive the Ferrari! I know it's awesome! But the Mazda is more affordable (man-hours wise) and gets me where I want to go without hassle.

Maybe you're right and it's not ready for you for now (although this might change). It's easy to overestimate what's possible in a short period of time and underestimate it over a longer period. The web site and documentation are very much better today than when I first looked at D maybe 1.5-2 years back.
Jan 12 2016
On Tuesday, 12 January 2016 at 20:38:50 UTC, Laeeth Isharc wrote:

> Since people aren't being paid to do this, and it's not enjoyable for many to make things universally easy across different environments once someone has solved their own problem, you can hardly object to the behaviour [...] Then it's more a question of treating it as a challenge to be solved.

No, but what's the point if there is not a properly supported toolchain to take the most advantage of this stuff? It's like putting the cart before the horse (or, more sensically, having the cart pull the horse). The "leadership" focuses on what they believe is the best stuff, but they are not in a position to know what the best stuff is, precisely because they are the leaders. They must ask the people they are doing this all for. If Walter, for example, wants his baby to grow up to fill an important part of human history (if not, then why all the work?), it seems wise that he make it the easiest to use. Easier = more people use it = more useful to people = bigger = longer = better.

By niche, I mean simply compared to the overall D developmental issues. Web site design may not be totally niche, but solving some rare dmd internal compiler problems is. Also, D can't compete with the web server community, so it is also niche in that regard. (Until you make D easy to use, what's the point of creating cool stuff for it... no one will use it if they can't get there easily.)

> But simply complaining won't change anything. BTW, I hardly think that memory allocation, containers, documentation, and the web site (recent areas of focus by leadership) are niche issues.

I don't disagree with anything you have said. The problem is mainly an "attitude". It's the same attitude people have about life: "I'll die and go to heaven, so what's the point of 'fixing' stuff here, now?" But that attitude is what makes it bad in the first place. Letting things "ride" because it's the easier thing to do just perpetuates the problem, which grows.

D, obviously, has done some great things. But there seems to be a huge lack of focus on the important things that will make it better. It's quite simple: the 100 or so people working actively on D's growth can't do much of anything compared to 1M people working on it. Hence, it would seem best to figure out how to get the 1M people working on it as quickly as possible. As the 100 people toil away at solving these "niche" problems, they are not working on making D "affordable" to everyone. They are not multiplying their effort. They have a 1:1 leverage ratio instead of 100:1.

D should be much larger than it is. How long has it been out? There is a dysfunctional issue with D. Look how fast Go and Rust took off. There are reasons for that, mainly that they created an easier-to-use and more functional toolset. Kids getting into programming that have issues with X will drop X in the blink of an eye and move on to Y. If Y is easy and works and does what they want, they will stick with Y. At some point they've invested in Y and continue to use it. D has the issue that it is not easy to use beyond basic, trivial, non-dependent coding. Hence, when these kids try to do real-world stuff (openGL, audio, GUIs) and run into 1984 problems, they move on to something easier. D has so much resistance that it has to overcome this if it cares to be a dominant player.

Focusing on the wrong stuff is wrong. D does that a lot IMO... or at least neglects the right stuff (ease of use is a huge factor, and partly D has this in a fundamental way, but not in an "everyday real world use" way).

> Maybe you're right and it's not ready for you for now (although this might change). [...]

My main issue with D is not so much what it does, but that I feel it is not covering enough ground in certain areas, so that my investment in it will not return anything. I could spend 10 years becoming a D pro, and if D fades away, then what would it all have been for? Each person that makes such a choice of using D will have to ask themselves that question. Some obviously are OK with investing in it; *many* are not (the masses). That concerns me, because the D "leadership" doesn't seem to think it matters. What D needs is a marketing genius that can make D into something viable.
Jan 12 2016
(I should mention that I am exaggerating a bit, and some of the complaints about D are actually directed more at the programming community in general. D has the same fundamental issues, though; it is just a matter of scale. Programming is way more fun when you are actually programming and getting the job done rather than fighting things that should work but don't. As programmers, we get used to that crap... especially those who programmed in the 70's and 80's... but that doesn't mean it's OK, especially when we know how to fix it. I really like the D language, but the D toolchain should be more user friendly to work with on every level.)
Jan 12 2016
On Tuesday, 12 January 2016 at 21:08:30 UTC, Jason Jeffory wrote:

> [...] I really like the D language, but the D toolchain should be more user friendly to work with on every level.

I think there's another side of this in that what is an issue for one person isn't necessarily an issue for another. For example, your difficulties with static linking... you aren't the first person to have this problem, but I personally never have. It all seems quite straightforward to me. I was annoyed by the COFF/OMF issue when I first started using D, sure (and that's what prompted me to start Derelict), but it's never been a problem that prevented me from building my D projects.

The way to solve this particular sort of problem is to have more documentation, or tutorials, but that requires someone with the time and motivation to write it. People who aren't having that problem are going to be motivated to direct their time and energy elsewhere. So I don't see this as an issue of "getting used to" a bad situation, just one of varying opinions about what part of the situation is bad.

I'm going to make an effort toward improving the situation with my learningd.org site. Linking with C libraries would be a good topic for a tutorial to go there. There is also some movement behind the scenes right now to create a comprehensive web site for teaching all things D, but it's going to be after the next DConf before any concrete effort is made in that direction.
As for the toolchain... Until there is a dedicated team with a clear goal-oriented task list and the management necessary to keep development focused on those tasks, then development will go in the direction that the core contributors feel they need to direct their efforts. Anyone can champion a new cause in D's development, and several have over the years. That's what has driven the project forward. There have been many, many, many discussions over the years about how unfriendly D is to new users, or how difficult it is to get up and running with this or that aspect of the toolchain. Most of them have resulted in improvements. As a long time D user, I can tell you that you kids have it much better than we did back in the day. So things will get easier with time. Pre-D experience is what determines the degree of difficulty in getting started with D right now. For example, programmers who are comfortable with the command line, particularly when using C or C++, tend to have few difficulties with the toolchain. I'm hoping that the recent creation of the D Foundation will create opportunities to make it easier for /anyone/ to hit the ground running with D.
Jan 12 2016
On Wednesday, 13 January 2016 at 01:40:59 UTC, Mike Parker wrote:
> On Tuesday, 12 January 2016 at 21:08:30 UTC, Jason Jeffory wrote:
> [...]

Yes, but the world consists of many different programmers, and because the interactions between different components are exponential, it is too complex to solve each person's problems on a one-by-one basis. What works for you works for you because you are you, not because you are me (not trying to make a Beatles song here). E.g., if you write out your total system configuration as a list, and I write out mine, then just about every element in each list interacts with every other element and creates new dependency problems. The number of potential issues is the power set of all the items. E.g., ubuntu 14.3, dmd 2.062, glad opengl, etc... versus windows, dmd 2.061, derelict opengl, etc... Each item in your list has to interact with every other, and in each case it may or may not work well for you. But my list is totally different, so "it works for me" doesn't help me in any way. dmd 2.061 may have a bug in the Windows version and the Derelict version that is not exhibited in the Linux case. Because the problem is so complex, the only way to make it work is for the elements themselves to work well individually (basically because we can't code for every combination). This is why we have to break complex systems down into simple, well understood parts. But information is the key here. One has to have some way to "measure" progress (ask Andrei, he said that in his performance lecture). If a tool doesn't give results that can be compared, we can't see how we are solving the problem, and it usually requires guessing, looking up similar problems and solutions online, and usually a lot of wasted time. So, the solution to all this is not solving the problems after they exist but trying to prevent them in the first place... 
and there is one and only one way to do this, and that's to write better tools (because the tools are where the problems come from). E.g., imagine compilers were perfect, with no bugs... then we wouldn't have problems with them, no patches would be required (so, essentially, no github), etc. But we don't work towards writing perfect tools; we work towards writing perfect fixes for the tools. Again, the cart before the horse. I do realize it is necessary, and some people are working in absolutely the right direction. For most, though, it's just the vague general direction, with a lot of circling around, getting lost, etc. My main "gripe", if you will, is simply that the tools are really, really important, and programmers in general tend to forget just how important. Mainly because they deal with them on a daily basis, and many have forgotten how much of their own lives has been wasted on needlessly solving problems that shouldn't have existed in the first place. Imagine if you were given back all the time you spent chasing rabbits that didn't exist (someone wrote code that didn't do what it was supposed to, etc.). It would probably add up to more than a year (if you've been programming a long time). That is significant! This is my impression of the D "leadership": let's fix bugs and squabble over who's right about the semantic interpretation of certain language constructs (e.g., extern(c++, ns)). Let's find out who's right and who's wrong, and let's make sure no one admits they are wrong or accepts a compromise! This creates "unhappiness" on both sides. 1. If Manu had his wish, he could test out the feature and determine if it worked for him. 2. If Walter had his, he could feel good about not introducing something that caused a lot of headaches. This is not the solution to things. This creates division and separation from actually solving the real problem. The solution is "let's figure out how to do both". 
That way, if Walter is wrong, he can be proven wrong, and if Manu is wrong, he will be proven wrong (through real-world practical problems). How much time is wasted on such arguments instead of just figuring out how to satisfy all parties? I personally believe it can be done if everyone works towards that goal. (Maybe there will have to be some compromise on technical issues, but not on the goal.) Again, I'm inflating a lot of stuff, and I think, for the most part, DMD does a good job. It works well across the board in general. Specific stuff, though, it seems to fail at in more cases than it should, for some reason. It's more of a mentality that is widespread across people and programmers in general. It usually leads to an unfocused and unproductive mess in the long term. I'd hate to see that happen to D because I feel it is, perhaps, the best language out there. It's elegant to some degree, it's powerful, it has metaprogramming constructs that just make sense, and the people behind it seem intelligent and have a strong desire to see it succeed... But it has a long way to go... and that can't happen without more support. GUIs, graphics, sound, databases, etc. must be well done and supported innately by D if it wants to be top dog. Water the flower well and it grows into a beautiful thing. Neglect and ignore it and it will wilt and die. Same for children, programming languages, families, species, etc. D's too good to let it fade off into the bowels of mankind.
Jan 12 2016
On 2016-01-11 02:22, Jason Jeffory wrote:
> Dmd's setup construction is a bit weird and has some difficult issue tracking. How about if dmd supported, if it already doesn't, some ways to help the user check the configuration of dmd. It would be quick and easy to implement. e.g.,
>
> dmd -showinfo
>
> Target Arch: x86
> Libraries: C:\Mylib;C:\Another\Lib\Somewhere
> Modules: C:\MyModules;
> Version: 2.062
> etc...

You can get some more information by compiling with the "-v" flag. At the top it will print the config used, i.e. the path to the sc.ini the compiler found. At the bottom it will print the command it uses for linking.

-- 
/Jacob Carlborg
Jan 12 2016