
digitalmars.D - #pragma comment (lib, ...)

reply Manu <turkeyman gmail.com> writes:
Does D support some sort of #pragma lib?
I use this in C all the time, and I really like code that auto-links its
dependencies. Maintaining a massive list of arbitrary libs in my build
scripts is a pain, and even more so when the code that depends on it may be
version-ed out on particular configurations. Syncing the build scripts
against the state of the code is tedious.
Oct 10 2012
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 10/10/2012 1:22 AM, Manu wrote:
 Does D support some sort of #pragma lib?
Yes: pragma(lib, "mylib.lib");
Oct 10 2012
next sibling parent reply Manu <turkeyman gmail.com> writes:
Perfect, thanks!

On 10 October 2012 11:27, Walter Bright <newshound2 digitalmars.com> wrote:

 On 10/10/2012 1:22 AM, Manu wrote:

 Does D support some sort of #pragma lib?
Yes: pragma(lib, "mylib.lib");
Oct 10 2012
parent Jacob Carlborg <doob me.com> writes:
On 2012-10-10 10:31, Manu wrote:
 Perfect, thanks!
As far as I recall, that doesn't work with import files.

--
/Jacob Carlborg
Oct 11 2012
prev sibling next sibling parent reply Iain Buclaw <ibuclaw ubuntu.com> writes:
On 10 October 2012 09:31, Manu <turkeyman gmail.com> wrote:
 Perfect, thanks!


 On 10 October 2012 11:27, Walter Bright <newshound2 digitalmars.com> wrote:
 On 10/10/2012 1:22 AM, Manu wrote:
 Does D support some sort of #pragma lib?
Yes: pragma(lib, "mylib.lib");
NB: GCC has no such equivalent, and IMO libraries should be specified during the linking step. Such information simply doesn't belong inside a source file, as a source file can be compiled or assembled even without a linking stage.

Regards,
--
Iain Buclaw
*(p < e ? p++ : p) = (c & 0x0f) + '0';
Oct 10 2012
next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2012-10-10 13:15, Iain Buclaw wrote:

 NB: GCC has no such equivalent, and IMO libraries should be specified
 during the linking step. Such information simply doesn't belong inside
 a source file as a source file can be compiled or assembled even
 without a linking stage.
I agree, I think a package manager together with a build tool should be used instead.

--
/Jacob Carlborg
Oct 10 2012
next sibling parent reply Manu <turkeyman gmail.com> writes:
On 10 October 2012 15:42, Jacob Carlborg <doob me.com> wrote:

 On 2012-10-10 13:15, Iain Buclaw wrote:

  NB: GCC has no such equivalent, and IMO libraries should be specified
 during the linking step. Such information simply doesn't belong inside
 a source file as a source file can be compiled or assembled even
 without a linking stage.
I agree, I think a package manager together with a build tool should be used instead.
None of those things actually embody the information about the relationship, nor can they. The source code does, and nothing else. Features that imply the dependency may be (and often are) disabled at compile time.

I rather like that the compiler is able to put a note in the object file that it depends on a particular lib, because it does.

I'm not sure how a package manager helps... What is a package manager? ;)

I'd like to hear some reasons why that is a bad or undesirable thing, or is this just an opinion?
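As a minimal sketch of this point (the version identifier and library name are made up for illustration), the link dependency can live next to the feature that creates it, so versioning the feature out also drops the dependency:

// The dependency is recorded only when the feature is compiled in.
// Build with -version=UseAudio to pull in the (hypothetical) audio lib;
// without it, neither the code nor the link requirement exists.
version (UseAudio)
{
    pragma(lib, "fmod");   // note in the object file: link against fmod

    void playSound(string file)
    {
        // ... call into the audio library ...
    }
}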
Oct 10 2012
next sibling parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 10 October 2012 at 13:23:57 UTC, Manu wrote:
 On 10 October 2012 15:42, Jacob Carlborg <doob me.com> wrote:

 On 2012-10-10 13:15, Iain Buclaw wrote:

  NB: GCC has no such equivalent, and IMO libraries should be 
 specified
 during the linking step. Such information simply doesn't 
 belong inside
 a source file as a source file can be compiled or assembled 
 even
 without a linking stage.
I agree, I think a package manager together with a build tool should be used instead.
None of those things actually embody the information about the relationship, nor can they. The source code does, and nothing else. Features that imply the dependency may be (and often are) disabled at compile time. I rather like that the compiler is able to put a note in the object file that it depends on a particular lib, because it does. I'm not sure how a package manager helps... What is a package manager? ;) I'd like to hear some reasons why that is a bad or undesirable thing, or is this just an opinion?
This only works if it is part of the language definition.

In the C and C++ case I am usually against it, because I favour portability over dependencies on a specific compiler vendor. Many years of writing multi-platform code do leave some scars.

As for D, if this can be made part of the language then I see no big reason against it.

--
Paulo
Oct 10 2012
next sibling parent Walter Bright <newshound2 digitalmars.com> writes:
On 10/10/2012 6:45 AM, Paulo Pinto wrote:
 In the C and C++ case I am usually against it, because I favour portability
 over dependencies on a specific compiler vendor. Many years of writing
 multi-platform code do leave some scars.
Certainly, any particular use of this feature will be platform specific. But you can use the version statement to adjust it for various platforms as required.
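For example, a minimal sketch (the library names are illustrative only, and whether the extensionless form resolves as shown depends on the toolchain):

version (Windows)
    pragma(lib, "ws2_32");   // Winsock import library on Windows
else version (linux)
    pragma(lib, "curl");     // resolves against libcurl on Linux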
Oct 10 2012
prev sibling parent reply Manu <turkeyman gmail.com> writes:
On 10 October 2012 16:45, Paulo Pinto <pjmlp progtools.org> wrote:

 This only works if it is part of the language definition.

 In C and C++ case I am usually against it, because I favour portability
 over dependencies to a specific compiler vendor. Many years of writing
 multi-platform code do leave some scars.
Errr, what? This enhances portability; that's the point.

I've spent my career writing more cross-platform code than most coders would touch in their lives, and I give thanks for the platforms where it is available. It always leads to a vastly simplified path in the build scripts for those platforms that support it, and typically produces more reliable and less fickle results; i.e., I never experience link problems on those platforms.

Multi-platform code always has #ifdef guards around #pragma comment(lib, ...)-ing the appropriate libs for the platform the code is being built for, and that is the whole point. The code itself selects the libs it depends on simply by being compiled.

 As for D, if this can be made part of the language then I see no big reason
 against it.
Well, DMD says it is ;) .. Question is, is it technically possible for other compilers to support it?
Oct 10 2012
parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 10 October 2012 at 16:18:10 UTC, Manu wrote:
 On 10 October 2012 16:45, Paulo Pinto <pjmlp progtools.org> 
 wrote:

 This only works if it is part of the language definition.

 In C and C++ case I am usually against it, because I favour 
 portability
 over dependencies to a specific compiler vendor. Many years of 
 writing
 multi-platform code do leave some scars.
Errr, what? This enhances portability; that's the point. I've spent my career writing more cross-platform code than most coders would touch in their lives, and I give thanks for the platforms where it is available. It always leads to a vastly simplified path in the build scripts for those platforms that support it, and typically produces more reliable and less fickle results; i.e., I never experience link problems on those platforms.
I never stated you did not. Actually, since I know you do games development, I suspected exactly that - that you are also aware of such issues.
 Multi-platform code always has #ifdef guards around #pragma
 comment(lib,)-ing the appropriate libs for the platform which 
 the code is
 being built for, and that is the whole point. The code its self 
 selects the
 libs it depends on by simply being compiled.
I tend to push for platform-specific code to have its own set of files, thus minimizing preprocessor usage. I'd rather have an interface.h file with corresponding interface_osname.cpp files.

As I mentioned in other threads, when you work with cheap developers the code tends to be scary, so minimizing preprocessor usage is a GOOD thing.
 As for D, if this can be made part of the language then I see 
 no big reason
 against it.
Well, DMD says it is ;) .. Question is, is it technically possible for other compilers to support it?
You're right here; I failed to look up the language definition.

--
Paulo
Oct 10 2012
prev sibling next sibling parent reply Marco Leise <Marco.Leise gmx.de> writes:
On Wed, 10 Oct 2012 15:59:42 +0300, Manu <turkeyman gmail.com> wrote:

 None of those things actually embody the information about the
 relationship, nor can they. The source code does, and nothing else.
 Features that imply the dependency may (and often are) be disabled at
 compile time.
 I rather like that the compiler is able to put a note in the object file
 that it depends on a particular lib, because it does.
 […]
I share your opinion whole-heartedly!

I've been on #rust yesterday, and someone had problems with the bindings generator not writing out the library name: As you can see, you see nothing. But the question here really is: "Don't you force the language into using 'ld' as a linker if you use its command line?"

Linking is just time you wait until you can run your program. In a modern language I'd like to understand compilation and linking as one process. LTO and pragma(lib, ...) are steps in that direction. (And single-file compilation should work without seams compared to one-step compilation.) Part or all of the linker should be in the compiler, to allow external(...) declarations to work without additional platform- and compiler-dependent command-line additions. Problematic cases are, for example, the 'escape' switches used by different compilers to pass arguments on to the linker in one-step mode, or 'missing' .so names on some distributions that make it difficult to name a specific version, e.g. lua-4.2.

We should learn from package managers and improve on pragma(lib, ...) to support dependencies like this:

// lua.dll or liblua.so.4 (or highest liblua.so.4.*)
pragma(lib, "lua-4.*");

// can match mysqlclient.dll or libmysqlclient.so.16
pragma(lib, ">=mysqlclient-12");

This would go through the system-specific library paths and look up version information from .dlls or so-names. In my opinion, just linking to a library without version constraints is not enough.

--
Marco
Oct 10 2012
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 10/10/2012 6:48 AM, Marco Leise wrote:
 Linking is just time you wait until you can run your program.
 In a modern language I'd like to understand compilation and
 linking as one process.
Actually, I think you're right. There's no technical reason why the compiler can't go directly to an executable in one step. I've often thought of doing this.
Oct 10 2012
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 10 October 2012 at 14:41:56 UTC, Walter Bright 
wrote:
 On 10/10/2012 6:48 AM, Marco Leise wrote:
 Linking is just time you wait until you can run your program.
 In a modern language I'd like to understand compilation and
 linking as one process.
Actually, I think you're right. There's no technical reason why the compiler can't go directly to an executable in one step. I've often thought of doing this.
Turbo Pascal was already doing this back in 1987.

The Pascal family of languages always made me look down on C and C++ toolchains as stone-age technology that I have to endure when using those languages.

This is actually one feature that I really like in the JVM/.NET worlds, even in the native compiler versions that are available for them.

--
Paulo
Oct 10 2012
parent Jacob Carlborg <doob me.com> writes:
On 2012-10-10 16:26, Paulo Pinto wrote:

 Turbo Pascal was already doing this back in 1987.

 The Pascal family of languages always made me look down to C and C++
 toolchains as stone age technology that I have to endure when using
 those languages.

 This is actually one feature that I really like in JVM/.NET worlds, even
 in the native compiler versions that are available for them.
I think the Clang/LLVM toolchain is planning to do this, or already has.

--
/Jacob Carlborg
Oct 10 2012
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2012-10-10 14:59, Manu wrote:

 None of those things actually embody the information about the
 relationship, nor can they. The source code does, and nothing else.
 Features that imply the dependency may (and often are) be disabled at
 compile time.
 I rather like that the compiler is able to put a note in the object file
 that it depends on a particular lib, because it does.
 I'm not sure how a package manager helps... What is a package manager? ;)
 I'd like to hear some reasons why that is a bad or undesirable thing, or
 is this just an opinion?
A package manager is a tool that downloads and installs libraries, applications and tools, also known as packages. It also tracks and installs all the dependencies of a package automatically. RubyGems, CPAN, PEAR and npm are a few examples of package managers made specifically for a programming language.

This is my vision (simplified) of how the build tool, package manager and compiler work together. A build script might look like:

package foo
package bar

files myapp/main.d

$ build myapp

The build tool "build" will read the build script and see that it has two dependencies: "foo" and "bar". The build tool will get information from the package manager about these packages - information like the path to the import files/source/headers and the path to the library to link with. The build tool will then simply invoke the compiler with the correct flags to be able to build and link with these libraries. The build tool will, of course, know the platform it runs on, so it can call the compiler with different flags depending on the platform. On Posix it would probably link to "libfoo.a" where on Windows it would link to "foo.lib".

If I recall correctly, using pragma(lib) with dmd you need to specify the extension for the library, ending up with code like this:

version (Posix)
    pragma(lib, "libfoo.a");
else version (Windows)
    pragma(lib, "foo.lib");

Which is just ridiculous. Sure, you could change DMD to support this. But what happens with dynamic libraries? .dll on Windows, .so on Linux/BSD and .dylib on Mac OS X. Then on Mac OS X there are various other types of dynamic libraries, like frameworks and bundles, with their own extensions.

Another point is that I don't want to separate the build script. Compile and link flags in one file and then the libraries to link with in a different file? That's just stupid.

With the approach of the build tool and the package manager working together, the build tool can also automatically specify the path to the import files. How would you do that in a source file? Either the compiler would always need to read a special file first, or it would need to scan all files for a particular pragma to get the import paths, then rescan the files again during the actual compilation.

This is a specification/description of a package manager I'm working on:

https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D

--
/Jacob Carlborg
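A rough sketch of the division of labor described above, with a hypothetical package-manager query and made-up install paths (this is not Orbit's actual API):

// build.d - toy build tool: ask the package manager for each dependency's
// paths, then assemble and run a dmd command line.
import std.format : format;
import std.process : executeShell;
import std.stdio : writeln;

struct Package { string importPath; string libPath; string libName; }

// Placeholder: a real tool would query the package manager here.
Package query(string name)
{
    return Package("/usr/local/orbit/" ~ name ~ "/import",
                   "/usr/local/orbit/" ~ name ~ "/lib",
                   name);
}

void main()
{
    string cmd = "dmd myapp/main.d";
    foreach (dep; ["foo", "bar"])
    {
        auto p = query(dep);
        // -I adds an import path; -L passes the rest through to the linker.
        cmd ~= format(" -I%s -L-L%s -L-l%s", p.importPath, p.libPath, p.libName);
    }
    writeln(cmd);                // show the assembled command line
    auto r = executeShell(cmd);  // invoke the compiler and linker
    writeln(r.output);
}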
Oct 10 2012
next sibling parent "Adam D. Ruppe" <destructionator gmail.com> writes:
On Wednesday, 10 October 2012 at 19:19:12 UTC, Jacob Carlborg 
wrote:
 If I recall correctly, using pragma(lib) with dmd, you need to 
 specify the extension for the library, ending up with code like 
 this:
Nope, the extension is optional. I never use it.

pragma(lib, "pq");
Oct 10 2012
prev sibling parent reply Manu <turkeyman gmail.com> writes:
On 10 October 2012 21:55, Jacob Carlborg <doob me.com> wrote:

 On 2012-10-10 14:59, Manu wrote:

  None of those things actually embody the information about the
 relationship, nor can they. The source code does, and nothing else.
 Features that imply the dependency may (and often are) be disabled at
 compile time.
 I rather like that the compiler is able to put a note in the object file
 that it depends on a particular lib, because it does.
 I'm not sure how a package manager helps... What is a package manager? ;)
 I'd like to hear some reasons why that is a bad or undesirable thing, or
 is this just an opinion?
A package manager is a tool that downloads and installs libraries, application and tools, also known as packages. It also tracks and installs all the dependencies of a package automatically. RubyGems, CPAN, PEAR, npm are a couple of examples of package managers specifically made for a programming language.
Sorry, I do know what a package manager is. I was being facetious, in that Windows has no such concept, and most people use Windows. It's not practical to depend on anything like that.

 This is my vision (simplified) of how build tool, package manager and the
 compiler work together.


 package foo
 package bar

 files myapp/main.d

 $ build myapp

 The build tool "build" will read the build script and see that it has two
 dependencies: "foo" and "bar". The build tool will get information from the
 package manager about these packages. Information like the path to the
 import files/source/headers and the path to the library to link with.
I've never seen the 'custom build tool' option in Visual Studio (and if there were one, certainly people don't use it). And no associated package manager that automatically fetches dependencies...

Additionally, you're insisting build tool 'X' upon people, and I presume it's not an established 'standard' build tool that's accepted/agreed on all platforms. You'll never convince people to use it, and certainly not in proprietary situations where your tools are dictated.

 The build tool will then simply invoke the compiler with the correct flags
 to be able to build and link with these libraries. The build tool will, of
 course, know the platform it runs on so it can call the compiler with
 different flags depending on the platform. On Posix it would probably link
 to "libfoo.a" where on Windows it would link to "foo.lib".

 If I recall correctly, using pragma(lib) with dmd, you need to specify the
 extension for the library, ending up with code like this:

 version (Posix)
     pragma(lib, "libfoo.a");

 else version (Windows)
     pragma(lib, "foo.lib");

 Which is just ridiculous. Sure, you could change DMD to support this.
 But what happens with dynamic libraries? .dll on Windows, .so on Linux/BSD
 and .dylib on Mac OS X. Then on Mac OS X there are various other types of
 dynamic libraries, like frameworks and bundles, with their own extension.
This is indeed ridiculous, but trivial. I'm sure it can be fixed in minutes. I don't think the solution is complicated: given a lib name, on Linux it first looks for a lib*.so, then for a .a. Windows uses import libs, which will be resolved exactly the same as a static lib, and I don't know anything about OSX; is it not similar/same as Linux?

 Another point is that I don't want to separate the build script. Compile
 and link flags in one file and then the libraries to link with in a
 different file? That's just stupid.
What's a flag? You mean those little options in the property grid when you Right Click->Properties on the project? ;)

People don't like doing that, and they really hate micro-managing a massive list of libraries in there. Also, those flags aren't configurable; you can't just add one for your new feature of interest.

I also maintain that it's not stupid. The build script doesn't know what libs the code will link to. I guess you're arguing that your build script should exclusively define the features/libs, not the other way around? This is annoying in a cross-platform environment, because all platforms tend to link a totally different set of libraries, and that means your build script is cluttered with annoying manually maintained lists for each platform, and the less commonly used/built platform ALWAYS seems to fall out of date.

In D's context, D has very powerful (and more manageable) static logic; I can see it being even more useful to have code-driven selection of dependencies in D than it already is in C.

 With the approach with the build tool and the package manager working
 together the build tool can also automatically specify the path to the
 import files. How would you do that in a source file? Either the compiler
 would always need to read a special file first. Or it would need to scan
 all files for a particular pragma to get the import paths. Then it would
 rescan the files again during the actual compilation.
The linker would just need to be smart enough to gather the deps from the objects while linking.

I've said before, my question is: is it _possible_ to do it in a cross-platform way currently? I see some other comments with ideas; maybe it is?

 This is a specification/description of a package manager I'm working on:
 https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D
Does it work on Windows and integrate with Visual Studio? If not, sadly, it's irrelevant.

But aside from that, I am curious: what makes it better/more useful than apt?
Oct 11 2012
parent reply Jacob Carlborg <doob me.com> writes:
On 2012-10-11 10:12, Manu wrote:

 Sorry, I do know what a package manager is. I was being facetious, in
 that windows has no such concept, and most people use windows.
 It's not practical to depend on anything like that.
RubyGems is working perfectly fine on Windows. Just because Windows does not have a built-in package manager, and most of these open source languages seem to lean more towards Posix than Windows, doesn't mean that you can't have a working package manager on Windows.
 I've never seen the 'custom build tool' option in Visual Studio (and if
 there were one, certainly people don't use it). And no associated
 package manager that automatically fetches dependencies...
What's stopping anyone from adding support for this in VisualD?
 Additionally, you're insisting build tool 'X' upon people, and I presume
 it's not an established 'standard' build tool that's accepted/agreed on
 all platforms. You'll never convince people to use it, and certainly not
 in in proprietary situations where your tools are dictated.
Then tell me which established standard build tool works cross-platform. They're all a pain in the ass to use, or have too many dependencies/are too complicated to install.

I'm not forcing my tools upon anyone. But you're trying to force features into D that are not particularly useful.
 This is indeed ridiculous, but trivial. I'm sure it can be fixed in minutes.
 I don't think the solution is complicated, given a lib name, on linux it
 first looks for a lib*.so, then for a .a. Windows uses import libs,
 which will be resolved exactly the same as a static lib, and I don't
 know anything about OSX, is it not similar/same as linux?
No, Mac OS X uses .dylib files. In addition to that it uses frameworks and other kind of bundles. A framework is a folder that the linker and Finder (file browser) treats differently. It contains a dynamic library, headers and other resources like images. It has the .framework extension.
     Another point is that I don't want to separate the build script.
     Compile and link flags in one file and then the libraries to link
     with in a different file? That's just stupid.


 What's a flag? you mean those little options in the property grid when
 you Right Click->Properties on the project? ;)
No, I mean when you left click. How the hell would I know what tools you're using?
 People don't like doing that, and they really hate micro-managing a
 massive list of libraries in there. Also those flags aren't
 configurable, you can't just add one for your new feature of interest.
So how do you choose what libraries to link with, or whether to build a release or a debug build? Regardless of what tools you have, in the end something will add compile and link flags somewhere.
 I also maintain that it's not stupid. The build script doesn't know what
 libs the code will link to. I guess you're arguing that your
 build-script should exclusively define the features/libs, not the other
 way around?
So what does know which libraries to link with, if not the build script? Something needs to know that.
 This is annoying in a cross-platform environment, because all platforms
 tend to link a totally different set of libraries, and that means your
 build script is cluttered with annoying manually maintained lists for
 each platform, and the less commonly used/built platform ALWAYS seems to
 fall out of date.
The information needs to be somewhere.
 In D's context, D has very powerful (and more manageable) static logic,
 I can see it being even more useful to have code-driven selection of
 dependencies in D than it already is in C.


     With the approach with the build tool and the package manager
     working together the build tool can also automatically specify the
     path to the import files. How would you do that in a source file?
     Either the compiler would always need to read a special file first.
     Or it would need to scan all files for a particular pragma to get
     the import paths. Then it would rescan the files again during the
     actual compilation.


 The linker would just need to be smart enough to gather the deps from
 the objects while linking.
The linker doesn't know anything about the import paths.
 I've said before, my question is: is it _possible_ to do it in a cross
 platform way currently? I see some other comments with ideas, maybe it is?


     This is a specification/description of a package manager I'm working on:

     https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D


 Does it work in windows and integrate with Visual Studio?
 If not, sadly, it's irrelevant.
Of course, I build all my software to work cross-platform; what suggests anything else? You're talking about "working on all platforms" but then you end with this comment, making the rest of your comments not very believable. I can end with the same comment: does Visual Studio work on other platforms than Windows? No? Then it's irrelevant.

I care about cross-platform, not just about Windows and/or Visual Studio. I build my tools as libraries, making it easy to create both command line tools and plugins to integrate in whatever IDE/application you want to use, not just a plugin specifically for Visual Studio.
 But aside from that, I am curious, what makes it better/more useful than
 apt?
It's specific to D and can be made a lot easier to work with. apt doesn't know anything about D, DMD or how to build a D application. apt also only works on Debian, Ubuntu and a few other Linux distributions.

--
/Jacob Carlborg
Oct 11 2012
parent reply Manu <turkeyman gmail.com> writes:
On 11 October 2012 12:04, Jacob Carlborg <doob me.com> wrote:

 On 2012-10-11 10:12, Manu wrote:

  Sorry, I do know what a package manager is. I was being facetious, in
 that windows has no such concept, and most people use windows.
 It's not practical to depend on anything like that.
RubyGems is working perfectly fine on Windows. Just because Windows does not have a built-in package manager, and most of these open source languages seem to lean more towards Posix than Windows, doesn't mean that you can't have a working package manager on Windows.
Okay, so I'll stop being a dick for a minute; I'm actually curious to know how you imagine it working with a tool like Visual Studio. It sounds like you're not just talking about a tool to fetch libs and dependencies, but also to perform the build? And the dependencies are detailed in the build script? An inflexible build system like Visual Studio doesn't really handle that very well.

A package manager which collects libs and their dependencies into common include/lib paths sounds extremely useful. But to me it sounds even more useful when combined with #pragma lib! That conveniently eliminates the lib path issue. I can imagine a situation where libraries would imply their associated lib just by being imported into your module. With a package manager as you describe, and source-embedded lib linkage statements, your average user would be in a position to never concern themselves with libs at all. Sounds

 I also maintain that it's not stupid. The build script doesn't know what
 libs the code will link to. I guess you're arguing that your
 build-script should exclusively define the features/libs, not the other
 way around?
So what does know which libraries to link with, if not the build script? Something needs to know that.
The source -> object files would ideally know, and the linker would extract that information itself. I think that's the topic of the conversation :)

 The linker would just need to be smart enough to gather the deps from
 the objects while linking.
The linker doesn't know anything about the import paths.
The linker knows standard places to search, and non-standard places would still need to be configured... but this marries very nicely with a package manager as you describe, since there'll never be confusion about where to look under that system.

 Does it work on Windows and integrate with Visual Studio?
 If not, sadly, it's irrelevant.
Of course, I build all my software to work cross-platform; what suggests anything else? You're talking about "working on all platforms" but then you end with this comment, making the rest of your comments not very believable. I can end with the same comment: does Visual Studio work on other platforms than Windows? No? Then it's irrelevant.
The point I'm trying to make is that a solution which is only convenient when working with a particular configurable/flexible build script isn't a sufficient solution. #pragma lib is very convenient, especially in the absence of a comprehensive build environment (such as Visual Studio). I maintain that there is absolutely nowhere that understands library dependencies better than the code that produces that dependency. I'd love to be able to describe all linker requirements in that way if I could, regardless of the toolchain/operating system.

So I am actually all for a package manager for libs, particularly in conjunction with #pragma lib. That beautiful union eliminates the linker from the things that a programmer cares about almost entirely.

I have my suspicions, though, that unless it gains universal acceptance by the D community, the libs either won't be up to date, or just won't be available - which may be a worse case, in that the management of libs would be fractured across multiple management mechanisms.
Oct 11 2012
next sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Thursday, 11 October 2012 at 11:46:39 UTC, Manu wrote:
 On 11 October 2012 12:04, Jacob Carlborg <doob me.com> wrote:

 On 2012-10-11 10:12, Manu wrote:

  Sorry, I do know what a package manager is. I was being 
 facetious, in
 that windows has no such concept, and most people use windows.
 It's not practical to depend on anything like that.
RubyGems is working perfectly fine on Windows. Just because Windows does not have a built-in package manager, and most of these open source languages seem to lean more towards Posix than Windows, doesn't mean that you can't have a working package manager on Windows.
Okay, so I'll stop being a dick for a minute, I'm actually curious to know how you imagine it working with a tool like VisualStudio. It sounds like you're not just talking about a tool to fetch libs and dependencies, but also perform the build? And the dependencies are detailed in the build script? An inflexible build system like Visual Studio doesn't really handle that very well. A package manager which collects libs and their dependencies into common include/lib paths sounds extremely useful. But to me it sounds even more useful when combined with #pragma lib! That conveniently eliminates the lib path issue. I can imagine a situation where libraries would imply their associated lib just by being imported into your module. With a package manager as you describe, and source-embedded lib linkage statements, your average user would be in a position to never concern themselves with libs at all. Sounds I also maintain that it's not stupid. The build script doesn't know what
 libs the code will link to. I guess you're arguing that your
 build-script should exclusively define the features/libs, not 
 the other
 way around?
So what does know which libraries to link with, if not the build script? Something needs to know that.
The source -> object files would ideally know, and the linker would extract that information itself. I think that's the topic of the conversation :) The linker would just need to be smart enough to gather the deps from
 the objects while linking.
The linker doesn't know anything about the import paths.
The linker knows standard places to search, and non-standard places would still need to be configured... but this marries very nicely with a package manager as you describe, since there'll never be confusion about where to look under that system. Does it work on Windows and integrate with Visual Studio?
 If not, sadly, it's irrelevant.
Of course, I build all my software to work cross-platform; what suggests anything else? You're talking about "working on all platforms" but then you end with this comment, making the rest of your comments not very believable. I can end with the same comment: does Visual Studio work on other platforms than Windows? No? Then it's irrelevant.
The point I'm trying to make is that a solution which is only convenient when working with a particular configurable/flexible build script isn't a sufficient solution. #pragma lib is very convenient, especially in the absence of a comprehensive build environment (such as Visual Studio). I maintain that there is absolutely nowhere that understands library dependencies better than the code that produces that dependency. I'd love to be able to describe all linker requirements in that way if I could, regardless of the toolchain/operating system. So I am actually all for a package manager for libs, particularly in conjunction with #pragma lib. That beautiful union eliminates the linker from the things that a programmer cares about almost entirely. I have my suspicions, though, that unless it gains universal acceptance by the D community, the libs either won't be up to date, or just won't be available - which may be a worse case, in that the management of libs would be fractured across multiple management mechanisms.
For those of us doing .NET development, the answer is NuGet.

http://visualstudiogallery.msdn.microsoft.com/27077b70-9dad-4c64-adcf-c7cf6bc9970c

And there is already a package manager for Windows that uses NuGet as infrastructure.

http://chocolatey.org/

--
Paulo
Oct 11 2012
prev sibling next sibling parent Jacob Carlborg <doob me.com> writes:
On 2012-10-11 13:22, Manu wrote:

 Okay, so I'll stop being a dick for a minute, I'm actually curious to
 know how you imagine it working with a tool like VisualStudio.
 It sounds like you're not just talking about a tool to fetch libs and
 dependencies, but also perform the build? And the dependencies are
 detailed in the build script?
 An inflexible build system like Visual Studio doesn't really handle that
 very well.
I'm talking about three separate tools that work closely together: the package manager, the build tool and the compiler. The package manager handles packages, i.e. collections of files. The build tool tells the compiler what files to build and what compile and link flags to use. The build tool asks the package manager for any external dependencies.

If it were possible to interact with the package manager directly from the compiler or source code, i.e. pragma(package), that would be fine. But it's always a bigger step to modify the compiler than to create a separate tool. I also don't really think that the compiler should handle this. The compiler works with single files; the package manager works with collections of files, that is, packages.
 A package manager which collects libs and their dependencies into common
 include/lib paths sounds extremely useful. But to me it sounds even more
 useful when combined with #pragma lib! That conveniently eliminates the
 lib path issue.
I don't think that pragma(lib) works that way. It doesn't work for headers/import files. Say you create a library "foo" and an import file, foo.di. Compiling, say, main.d together with foo.di will not link with foo.lib.
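To make the scenario concrete, a hypothetical pair of files (names are made up; the behavior noted is the recollection above, not verified here):

// foo.di - hand-written import file shipped with a prebuilt "foo" library
module foo;
pragma(lib, "foo");   // per the recollection above, ignored when foo.di is merely imported
int answer();

// main.d
import foo;
void main() { auto x = answer(); }

// dmd main.d           - may fail to link: foo.lib is not pulled in automatically
// dmd main.d foo.lib   - links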
 I can imagine a situation where libraries would imply their associated
 lib just by being imported into your module. With a package manager as
 you describe, and source-embedded lib linkage statements, your average
 user would be in a position to never concern themselves with libs at

Exactly.
 The source -> object files would ideally know, and the linker would
 extract that information its self. I think that's the topic of the
 conversation :)
Yeah, but I don't think that's where it belongs. I would also say that a package manager can be more flexible than the current state of pragma(lib), once we start to mix in versions and other types of features.
 The linker knows standard places to search, and non-standard places
 would still need to be configured... but this marry's very nicely with a
 package manager as you describe, since there'll never be confusion about
 where to look under that system.
Note that when I say "import path" I'm referring to the path in which to search for header files (.h)/import files (.di). The linker has nothing to do with this.
 The point I'm trying to make is that a solution which is only convenient
 when working with a particular configurable/flexible build script isn't
 a sufficient solution.
 #pragma lib is very convenient, especially in the absence of a
 comprehensive build environment (such as Visual Studio). I maintain that
 there is absolutely nowhere that understands library dependencies better
 than the code that produces that dependency. I'd love to be able to
 describe all linker requirements in that way if I could, regardless of
 the toolchain/operating system.
Well, if it's possible to add flags for the compiler and linker, it should be possible. If it's possible to create a plugin for Visual Studio that can compile D code, then I'm sure that plugin needs to be able to add these kinds of flags.

Instead of specifying libraries and link flags, I want to specify packages. These packages know which libraries, link flags and other dependencies are needed to build with them.
 So I am actually all for a package manager for libs, particularly in
 conjunction with #pragma lib. That beautiful union eliminates the linker
 from the things that a programmer cares about almost entirely.
Actually, this works quite nicely in Ruby. In Ruby the package manager overrides the standard "require" function; that would be the same as overriding "import" in D or "#include" in C/C++. But that's not possible for a library to do, so I'm trying my best here. In Ruby you also don't need any link flags, or other compile flags for that matter.
 I have my suspicions though that unless it gains universal acceptable by
 the D community, the libs either won't be up to date, or just not
 available - which may be a worse case, in that the management of libs
 would be fractured across multiple management mechanisms.
Absolutely, that is a real problem. But we have to start somewhere. I'm trying to do what I think is best and then collect feedback from the newsgroups.

--
/Jacob Carlborg
Oct 11 2012
prev sibling parent reply "Kapps" <opantm2+spam gmail.com> writes:
On Thursday, 11 October 2012 at 11:46:39 UTC, Manu wrote:
 Okay, so I'll stop being a dick for a minute, I'm actually 
 curious to know
 how you imagine it working with a tool like VisualStudio.
 It sounds like you're not just talking about a tool to fetch 
 libs and
 dependencies, but also perform the build? And the dependencies 
 are detailed
 in the build script?
 An inflexible build system like Visual Studio doesn't really 
 handle that
 very well.
In an ideal world, your VS plugin would support packages in the Add References dialog. Just like you could, in theory, add a project as a reference, you would add a package. Then, upon build, the plugin would call the package manager and get the libraries / imports required for it, appending the options to the compiler for your own build. This is something that wouldn't be particularly difficult to make a plugin for, given an existing package manager. MonoDevelop would behave the same way, I'd imagine.
Oct 11 2012
next sibling parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Thursday, 11 October 2012 at 23:03:45 UTC, Kapps wrote:
 On Thursday, 11 October 2012 at 11:46:39 UTC, Manu wrote:
 Okay, so I'll stop being a dick for a minute, I'm actually 
 curious to know
 how you imagine it working with a tool like VisualStudio.
 It sounds like you're not just talking about a tool to fetch 
 libs and
 dependencies, but also perform the build? And the dependencies 
 are detailed
 in the build script?
 An inflexible build system like Visual Studio doesn't really 
 handle that
 very well.
In an ideal world, your VS plugin would support packages in the Add References dialog. Just like you could, in theory, add a project as a reference, you would add a package. Then, upon build, the plugin would call the package manager and get the libraries / imports required for it, appending the options to the compiler for your own build. This is something that wouldn't be particularly difficult to make a plugin for, given an existing package manager. MonoDevelop would behave the same way, I'd imagine.
This already exists, it is called NuGet.

Besides, Visual Studio projects are actually MSBuild scripts; for me that is quite flexible.

--
Paulo
Oct 11 2012
parent reply Jacob Carlborg <doob me.com> writes:
On 2012-10-12 07:46, Paulo Pinto wrote:

 This already exists, it is called NuGet.
Again, it's not cross-platform. How well does it work with D, at all? -- /Jacob Carlborg
Oct 11 2012
next sibling parent reply Manu <turkeyman gmail.com> writes:
On 12 October 2012 09:51, Jacob Carlborg <doob me.com> wrote:

 On 2012-10-12 07:46, Paulo Pinto wrote:

  This already exists, it is called NuGet.

 Again, it's not cross-platform. How well does it work with D, at all?
That's strictly for Microsoft .NET packages isn't it?
Oct 12 2012
parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Friday, 12 October 2012 at 07:39:15 UTC, Manu wrote:
 On 12 October 2012 09:51, Jacob Carlborg <doob me.com> wrote:

 On 2012-10-12 07:46, Paulo Pinto wrote:

  This already exists, it is called NuGet.

 Again, it's not cross-platform. How well does it work with D, 
 at all?
That's strictly for Microsoft .NET packages isn't it?
At the moment yes, but there are plans to expose C++ libraries as well. But you can also make use of chocolatey already, to make other types of libraries available. -- Paulo
Oct 12 2012
prev sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Friday, 12 October 2012 at 06:51:19 UTC, Jacob Carlborg wrote:
 On 2012-10-12 07:46, Paulo Pinto wrote:

 This already exists, it is called NuGet.
Again, it's not cross-platform. How well does it work with D, at all?
Fair enough, I got the feeling that the conversation had turned into Windows-specific stuff.

--
Paulo
Oct 12 2012
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2012-10-12 01:03, Kapps wrote:

 In an ideal world, your VS plugin would support Packages in the Add
 References dialog. Just like you could, in theory, add a project as a
 reference, you would add a package. Then, upon build, the plugin would
 call the package manager and get the libraries / imports required for
 it, appending the options to the compiler for your own build. This is
 something that wouldn't be particularly difficult to make a plugin for
 given an existing package manager. Mono-Develop would perform in the
 same way I'd imagine.
Exactly, that's how I'm thinking. -- /Jacob Carlborg
Oct 11 2012
prev sibling parent reply Iain Buclaw <ibuclaw ubuntu.com> writes:
On 10 October 2012 13:59, Manu <turkeyman gmail.com> wrote:
 On 10 October 2012 15:42, Jacob Carlborg <doob me.com> wrote:
 On 2012-10-10 13:15, Iain Buclaw wrote:

 NB: GCC has no such equivalent, and IMO libraries should be specified
 during the linking step. Such information simply doesn't belong inside
 a source file as a source file can be compiled or assembled even
 without a linking stage.
I agree, I think a package manager together with a build tool should be used instead.
None of those things actually embody the information about the relationship, nor can they. The source code does, and nothing else. Features that imply the dependency may (and often are) be disabled at compile time. I rather like that the compiler is able to put a note in the object file that it depends on a particular lib, because it does. I'm not sure how a package manager helps... What is a package manager? ;) I'd like to hear some reasons why that is a bad or undesirable thing, or is this just an opinion?
IIRC the toolchain used by Visual Studio *always* performs linking, so that is why this is not a problem for MSVC.

To embody the information about the relationship in the object file, one must be able to embody the information about the relationship in the assembler file. And last time I checked there is no assembly syntax for '#pragma lib'.

Regards
--
Iain Buclaw
*(p < e ? p++ : p) = (c & 0x0f) + '0';
Oct 10 2012
parent reply Marco Leise <Marco.Leise gmx.de> writes:
On Wed, 10 Oct 2012 14:22:38 +0100, Iain Buclaw <ibuclaw ubuntu.com> wrote:

 To embody the information about the relationship in the object file,
 one must be able to embody the information about the relationship in
 the assembler file.  And last time I checked there is no assembly
 syntax for '#pragma lib'.
 
 
 Regards
Waaaait, both ELF .o and COFF .obj files can contain additional free-form sections and comments, right?

--
Marco
Oct 10 2012
parent Marco Leise <Marco.Leise gmx.de> writes:
On Wed, 10 Oct 2012 17:03:06 +0200, Marco Leise <Marco.Leise gmx.de> wrote:

 On Wed, 10 Oct 2012 14:22:38 +0100, Iain Buclaw <ibuclaw ubuntu.com> wrote:
 
 To embody the information about the relationship in the object file,
 one must be able to embody the information about the relationship in
 the assembler file.  And last time I checked there is no assembly
 syntax for '#pragma lib'.
 
 
 Regards
Waaaait, both ELF .o and COFF .obj files can contain additional free form sections and comments, right?
Specifically, the GNU toolchain adds ".note.GNU-stack" to ELF object files on Linux. What prevents us from doing the same with ".note.D-dynlink"?

--
Marco
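As a rough sketch of the idea (not how any D compiler actually does it): GDC exposes GCC attributes, so a dependency name could in principle be planted in a custom section from D code. The section name, the "used" attribute and the omission of a proper ELF note header (namesz/descsz/type) are all simplifying assumptions here:

import gcc.attributes : attribute;

// Hypothetical: stash a dependency name in a custom ELF section so a
// linker or wrapper tool could collect it later. A real .note section
// would also need the Elf_Nhdr layout in front of the payload.
@attribute("section", ".note.D-dynlink")
@attribute("used")              // keep the symbol even if unreferenced
immutable char[10] dDynlinkNote = "liblua.so\0";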
Oct 10 2012
prev sibling next sibling parent reply "Jesse Phillips" <jessekphillips+D gmail.com> writes:
On Wednesday, 10 October 2012 at 11:39:29 UTC, Iain Buclaw wrote:

 NB: GCC has no such equivalent, and IMO libraries should be 
 specified
 during the linking step. Such information simply doesn't belong 
 inside
 a source file as a source file can be compiled or assembled even
 without a linking stage.

 Regards,
Well, to comply with the standard it must at least ignore it.

http://dlang.org/pragma.html

I don't see this needing to be outside the source files. Actually, I'd be happy if all that was supported was exporting that information, probably to the JSON output. What bothers me about not having this information in the library is that each project I create will now need to copy and paste my lib dependencies, or I'll generally need to remember which library is used by a given library.

Marco,

As for versions: I don't know what version of ole32.dll I'm using or how to find out, and I don't really care either. But I guess it would be good to have the option though.
Oct 10 2012
parent reply Jacob Carlborg <doob me.com> writes:
On 2012-10-10 17:00, Jesse Phillips wrote:

 Well, to comply with the standard it must at least ignore it.

 http://dlang.org/pragma.html

 I don't see this needing to be outside the source files. Actually I'd be
 happy if all that was supported was exporting that information, probably
 to the json. What bothers me with not having this information in the
 library is that each project I create will now need to copy and paste my
 lib dependencies, or the general need to remember what library is used
 by a given library.

 Marco,

 As for versions. I don't know what version of ole32.dll I'm using or how
 to find out, I don't really care either. But I guess it would be good to
 have the option though.
A package manager would be the solution for that. -- /Jacob Carlborg
Oct 10 2012
parent reply "Jesse Phillips" <jessekphillips+D gmail.com> writes:
On Wednesday, 10 October 2012 at 19:21:58 UTC, Jacob Carlborg 
wrote:
 On 2012-10-10 17:00, Jesse Phillips wrote:
 As for versions. I don't know what version of ole32.dll I'm 
 using or how
 to find out, I don't really care either. But I guess it would 
 be good to
 have the option though.
A package manager would be the solution for that.
I don't see why; I'm the programmer, I'd have to tell the package manager what version I'm using... and again, I don't know what it is. I'd think pragma(lib) + package manager would get along nicely.

"Hi there, I need this library." says the source file.

"Hmm, ok, let me get this version as I've been instructed to do so." replies the PM.

"Hey, what about me... lib..." says a close friend.

"Yeah, yeah, let me see... how does the latest version sound/what is installed? I don't know what you need but we can give this a try." sadly replies the PM.

"Mister programmer man, you have unspecified library versions for... What would you like to do with them? Did the currently selected version work out for you?" the PM tells the programmer.
Oct 10 2012
next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2012-10-11 04:54, Jesse Phillips wrote:

 I don't see why, I'm the programmer I'd have to tell the package manager
 what version I'm using... and again I don't know what it is. I'd think
 pragma(lib) + package manager would get along nicely.
You don't know what version of a library you're using? That sounds pretty bad. Sure, you might not need to know which version of a system library you're using, but you should really know the version of all third party libraries you're using. Otherwise there's a huge risk of things starting to break.

What about any other compile or link flags, do you keep them in a separate file? Why would you keep half of the flags in one file and the other half in another file?
 "Hi there, I need this library." says the source file.

 "Hmm, ok, let me get this version as I've been instructed to do so."
 replies the PM.

 "Hey, what about me... lib..." says a close friend.

 "Yeah, yeah, let me see... how does the latest version sound/what is
 installed? I don't know what you need but we can give this a try." sadly
 replies the PM.
So it just picks a random version? That doesn't sound like very good behavior.
 "Mister programmer man, you have unspecified library versions for...
 What would you like to do with them? Did the currently selected
 version work out for you?" The PM tells the programmer.
So this is just a guessing game, trial and error?

I think it would be much better to work with packages and not individual libraries. You would just tell the build tool, compiler or whatever to use package "foo". Then the package manager figures out what libraries and dependencies it needs to link to, and also the path to the import files.

--
/Jacob Carlborg
Oct 10 2012
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Oct 11, 2012 at 08:46:49AM +0200, Jacob Carlborg wrote:
 On 2012-10-11 04:54, Jesse Phillips wrote:
[...]
"Mister programmer man, you have unspecified library versions for...
What would you like to do with them? Did the currently selected
version work out for you?" The PM tells the programmer.
So this is just a guessing game, trial and error? I think it would be much better to work with packages and not individual libraries. You would just tell the build tool, compiler or whatever to use package "foo". Then the package manager figures out what libraries and dependencies it needs to link to, and also the path to the import files.
[...]

Yeah, one of the poor design decisions of the early Redhat packaging system was to allow packages to depend on individual files, rather than packages. The result was a disastrous mess: some packages exported different versions of the same file, and only a subset of them would work, leading to hair-tearing dependencies. Packages would step over each other's files, causing problems with each other and the rest of the system that depended on the same files, ad nauseam.

More modern packaging systems deal with packages as units, and do not allow packages to export the same files. Plus, packages can come with metadata that specify the exact version, build, etc., of a library, making it possible for multiple versions of the same library to exist on a system, and for programs to pick up the exact version they were compiled with. A huge improvement indeed. You'll be surprised at how many applications will fail horribly because a careless library author changed the ABI (often without changing the API) without bumping the library version, causing calls to the old library functions to fail inexplicably, or worse, produce subtly wrong results.

T

--
There are 10 kinds of people in the world: those who can count in binary, and those who can't.
Oct 11 2012
parent reply Jacob Carlborg <doob me.com> writes:
On 2012-10-11 16:23, H. S. Teoh wrote:

 Yeah, one of the poor design decisions of the early Redhat packaging
 system was to allow packages to depend on individual files, rather than
 packages. The result was a disastrous mess: some packages export
 different versions of the same file, and only a subset of them will
 work, leading to hair-tearing dependencies. Packages would step over
 each other's files, causing problems with each other and the rest of the
 system that depended on the same files, ad nauseum.
That sounds even worse, a terrible, terrible idea. -- /Jacob Carlborg
Oct 11 2012
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Thu, Oct 11, 2012 at 08:54:19PM +0200, Jacob Carlborg wrote:
 On 2012-10-11 16:23, H. S. Teoh wrote:
 
Yeah, one of the poor design decisions of the early Redhat packaging
system was to allow packages to depend on individual files, rather than
packages. The result was a disastrous mess: some packages export
different versions of the same file, and only a subset of them will
work, leading to hair-tearing dependencies. Packages would step over
each other's files, causing problems with each other and the rest of the
system that depended on the same files, ad nauseum.
That sounds even worse, a terrible, terrible idea.
[...]

Yeah, it was one of the things that convinced me to *not* use Redhat. I saw a similar thing back in the Bad Old Days of Win98, Win2k, and their ilk, where installing a driver would sometimes prompt you something to the effect of "this driver needs to install a file that already exists; overwrite the file, delete it, or skip it?" None of those options should be anything the *user* has to decide, IMO. It essentially amounted to "flip a coin and pray the OS won't crash, and if you're *really* lucky the driver might actually work". Things like that convinced me *not* to use Windows. (I don't know if Windows still does that, as I don't use it anymore; but for everyone else's sake I would certainly hope it doesn't!)

Of course, IIRC Redhat has since fixed this broken design, but the horrible memory of it stuck. Debian, OTOH, has a depends-on-package policy, which results in a much saner system where a package can specify a dependency on other packages (with an entire package as a unit), optionally with a version constraint, and thus be ensured that it will get the correct versions of all related files. That was one of the things that convinced me to use Debian. :)

T

--
This is not a sentence.
Oct 11 2012
parent Jacob Carlborg <doob me.com> writes:
On 2012-10-11 21:45, H. S. Teoh wrote:

 Yeah, it was one of the things that convinced me to *not* use Redhat. I
 saw a similar thing back in the Bad Old Days of Win98, Win2k, and their
 ilk, where installing a driver would sometimes prompt you something to
 the effect of "this driver needs to install a file that already exists;
 overwrite the file, delete it, or skip it?" None of those options should
 be anything the *user* has to decide, IMO. It essentially amounted to
 "flip a coin and pray the OS won't crash, and if you're *really* lucky
 the driver might actually work". Things like that convinced me *not* to
 use Windows. (I don't know if Windows still does that, as I don't use it
 anymore; but for everyone else's sake I would certainly hope it doesn't!)
Haha.
 Of course, IIRC Redhat has since fixed this broken design, but the
 horrible memory of it stuck. Debian, OTOH, has a depends-on-package
 policy, which results in a much saner system where a package can specify
 a dependency on other packages (with an entire package as a unit),
 optionally with a version constraint, and thus be ensured that it will
 get the correct versions of all related files. That was one of the
 things that convinced me to use Debian. :)
That's how it should work. The smallest unit should be a package. -- /Jacob Carlborg
Oct 11 2012
prev sibling parent reply "Jesse Phillips" <Jessekphillips+D gmail.com> writes:
On Thursday, 11 October 2012 at 07:10:57 UTC, Jacob Carlborg 
wrote:
 On 2012-10-11 04:54, Jesse Phillips wrote:
 I think it would be much better to work with packages and not 
 individual libraries. You would just tell the build tool, 
 compiler or whatever to use package "foo". Then the package 
 manager figures out what libraries and dependencies it needs to 
 link to and also the path to the import files.
Why can't I just tell the compiler that I need library "foo", and the package manager can handle finding the package which provides that library?

Packages depend on other packages, code depends on other code. I don't think it makes sense for code to depend on a package. Granted, packages could provide code that is depended on...

On the note about flags... Most of them are external to the code; you could say they are the options of the package. However, for libraries, we have a separate program which can operate without the source code; that is the historical reason code dependencies are not stated in the code. Why should we have 'import'? Can't we just tell the compiler the files it will need for its symbols separately? I don't know... maybe we could add a flag to specify them, so it would feel strange to have them in the source code too. (Sorry, couldn't pass it up.)

As for the version number, you are right, I should know. But much of the code I write/use doesn't make it to a release. This is where a handy and easy to use package manager comes in. Releases become simpler, and there can be verified and tested versions of the depended-upon libraries. But even with a tested version, that doesn't mean it will fail with older or newer versions of the package.

Linux creates symbolic links to its shared libraries to provide an unversioned link. And while all the packages know which version that is, third-party applications do not. Yes, this arbitrary guessing game has its problems, but so do strict version requirements. I've had to create symbolic links to a newer library to pretend I had an older version (as digging up such a version becomes hard when the package manager becomes worthless).

Anyway, I look forward to a good packaging system for D. But stating the dependency in the thing that does the depending makes sense. However, I've realized that the dependency on a library only makes sense for header files, and header files only make sense when interfacing with C. Dependencies on D code and libraries are all stated via the 'import' statement.
Oct 12 2012
parent reply Jacob Carlborg <doob me.com> writes:
On 2012-10-12 23:43, Jesse Phillips wrote:

 Why can't I just tell the compiler that I need library "foo", and the
 package manager can handle finding the package which provides that library?
Because package "foo" can contain several libraries. You also don't want to bother with the libraries "foo" depends on. Say you have a project where you want to use Derelict. I'd much rather just say:

package derelict

Than something like this:

library libDerelictGL.a
library libDerelictUtil.a
 Packages depend on other packages, code depends on other code. I don't
 think it makes sense for code to depend on a package. Granted, packages
 could provide code that is depended on...
I don't understand why.
 On the note about flags... Most of them are external to the code; you
 could say they are the options of the package. However, for libraries,
 we have a separate program which can operate without the source code;
 that is the historical reason code dependencies are not stated in the
 code. Why should we have 'import'? Can't we just tell the compiler the
 files it will need for its symbols separately? I don't know... maybe we
 could add a flag to specify them, so it would feel strange to have them
 in the source code too. (Sorry, couldn't pass it up.)
If you specify the path where the compiler should find import files in a source file, how should the compiler handle that?

* Have a special file that the compiler always reads first? How is that any better than a build script?

* Have the compiler scan all files for these commands, add the import paths and then rescan the files? This sounds very inefficient.
 As for the version number, you are right, I should know. But much of the
 code I write/use doesn't make it to a release. This is where a handy and
 easy to use package manager comes in. Releases become simpler, and there
 can be verified and tested versions of the depended-upon libraries. But
 even with a tested version, that doesn't mean it will fail with older or
 newer versions of the package.
No, it does not.
 Linux creates symbolic links to its shared libraries to provide an
 unversioned link. And while all the packages know which version that is,
 third-party applications do not. Yes, this arbitrary guessing game has
 its problems, but so do strict version requirements. I've had to
 create symbolic links to a newer library to pretend I had an older
 version (as digging up such a version becomes hard when the package
 manager becomes worthless).
I think it's bad practice to depend on a library for which you don't know the exact version. But if you want, you don't need strict version requirements. RubyGems has this nice little feature for specifying versions:

gem "foo", "3.2.8"

In these cases the digits in the version are interpreted as follows:

3 - Major
2 - Minor
8 - Build

Now, here's the interesting part: the pessimistic version operator, "~>". You can say:

gem "foo", "~> 3.2.8"

That is, "3.2.8" is ok, "3.2.15" is ok but "3.3.0" is not ok. Or:

gem "foo", "~> 3.2"

That is, "3.2.0" is ok, "3.2.8" is ok, "3.3.0" is ok but "4.0.0" is not ok.
 Anyway, I look forward to a good packaging system for D. But stating the
 dependency in the thing that does the depending makes sense. However, I've
 realized that the dependency on a library only makes sense for header
 files, and header files only make sense when interfacing with C.
 Dependencies on D code and libraries are all stated via the 'import'
 statement.
Ok, I really don't understand this last section. -- /Jacob Carlborg
Oct 13 2012
parent reply "Jesse Phillips" <jessekphillips+D gmail.com> writes:
On Saturday, 13 October 2012 at 11:07:36 UTC, Jacob Carlborg 
wrote:
 If you specify the path where the compiler should find import 
 files in a source file, how should the compiler handle that?
Source code does not depend on an import path; that is an environment issue. Thus you would not specify import paths in source files.
 I've
 realized that the dependency on a library only makes sense for 
 header
 files, and header files only make sense when interfacing with 
 C.
 Dependencies on D code and libraries are all stated via the 'import'
 statement.
Ok, I really don't understand this last section.
      module bob;
      extern(C) void fishingBob();

This function has no definition; code importing bob is not getting fishingBob.

      module joe;
      void runningJoe() { ... }

The code importing joe does get runningJoe. So while I think this change makes sense:

      module bob;
      pragma(lib, "bob.lib");
      extern(C) void fishingBob();

This one does not:

      module joe;
      pragma(lib, "joe.lib");
      void runningJoe() { ... }

Or

      import tango.Text;
      pragma(lib, "tango.lib"); ...
Oct 13 2012
parent Jacob Carlborg <doob me.com> writes:
On 2012-10-13 23:11, Jesse Phillips wrote:

 Source code does not depend on an import path; that is an environment
 issue. Thus you would not specify import paths in source files.
You need to specify it somewhere. Why would I want half of the compiler flags in one place and the other half in another place? And why should I need to specify it in the first place, when the build tool can do that with the help of the package manager.
      module bob;
      extern(C) void fishingBob();

 This function has no definition; code importing bob is not getting
 fishingBob.

      module joe;
      void runningJoe() { ... }

 The code importing joe does get runningJoe. So while I think this change
 makes sense:

      module bob;
      pragma(lib, "bob.lib");
      extern(C) void fishingBob();

 This one does not:

      module joe;
      pragma(lib, "joe.lib");
      void runningJoe() { ... }

 Or

      import tango.Text;
      pragma(lib, "tango.lib"); ...
Sure, there's no point in that if you're building from source. But if you pre-compile a library and have import files (.di), then pragma(lib) would make sense, though it doesn't currently work with import files anyway.
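For what it's worth, a rough sketch of what that could look like (the file and symbol names are made up, and as said, the pragma is currently ignored when it sits in a .di file):

      // joe.di -- a hand-written import file shipped with a precompiled joe.lib
      module joe;
      pragma(lib, "joe.lib");  // would let users of the .di link automatically
      void runningJoe();       // declaration only; the body lives in joe.lib

-- /Jacob Carlborg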
Oct 14 2012
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2012-10-11 04:54, Jesse Phillips wrote:
 On Wednesday, 10 October 2012 at 19:21:58 UTC, Jacob Carlborg wrote:
 On 2012-10-10 17:00, Jesse Phillips wrote:
 As for versions. I don't know what version of ole32.dll I'm using or how
 to find out, I don't really care either. But I guess it would be good to
 have the option though.
A package manager would be the solution for that.
I don't see why; I'm the programmer, and I'd have to tell the package manager what version I'm using... and again, I don't know what it is. I'd think pragma(lib) + package manager would get along nicely.

"Hi there, I need this library." says the source file.

"Hmm, ok, let me get this version as I've been instructed to do so." replies the PM.

"Hey, what about me... lib..." says a close friend.

"Yeah, yeah, let me see... how does the latest version sound/what is installed? I don't know what you need but we can give this a try." sadly replies the PM.

"Mister programmer man, you have unspecified library versions for... What would you like to do with them? Did the currently selected version work out for you?" the PM tells the programmer.
Also see my other post: http://forum.dlang.org/thread/mailman.695.1349857389.5162.digitalmars-d puremagic.com?page=3#post-k54hng:241rra:241:40digitalmars.com -- /Jacob Carlborg
Oct 10 2012
prev sibling parent reply "Kagamin" <spam here.lot> writes:
On Wednesday, 10 October 2012 at 11:39:29 UTC, Iain Buclaw wrote:
 NB: GCC has no such equivalent
hmm... really? Though it seems like ld parses the .drectve section: http://pastebin.com/mc63b1b1 http://www.ibiblio.org/gferg/ldp/man/man1/dlltool.1.html And the syntax used sort of resembles that of the GNU tools.
Oct 12 2012
parent reply Iain Buclaw <ibuclaw ubuntu.com> writes:
.drectve looks to be for functions, not libraries.

----
Iain Buclaw

*(p < e ? p++ : p) = (c & 0x0f) + '0';

On 12 Oct 2012 17:31, "Kagamin" <spam here.lot> wrote:
 On Wednesday, 10 October 2012 at 11:39:29 UTC, Iain Buclaw wrote:
 NB: GCC has no such equivalent
hmm... really? Though it seems like ld parses the .drectve section: http://pastebin.com/mc63b1b1 http://www.ibiblio.org/gferg/ldp/man/man1/dlltool.1.html And the syntax used sort of resembles that of the GNU tools.
Oct 12 2012
next sibling parent "Kagamin" <spam here.lot> writes:
It looks to be for arbitrary linker commands.

ps I agree that building should be handled by a build system 
and/or a build script.
Oct 12 2012
prev sibling parent "Kagamin" <spam here.lot> writes:
From the COFF spec:
------
The .drectve Section (Object Only)
A section is a “directive” section if it has the 
IMAGE_SCN_LNK_INFO flag set in the section header. By convention, 
such a section also has the name .drectve. The linker removes a 
.drectve section after processing the information, so the section 
does not appear in the image file being linked. Note that a 
section marked with IMAGE_SCN_LNK_INFO that is not named .drectve 
is ignored and discarded by the linker.

A .drectve section consists of a string of ASCII text. This 
string is a series of linker options (each option containing 
hyphen, option name, and any appropriate attribute) separated by 
spaces. The .drectve section must not have relocations or line 
numbers.

In a .drectve section, if the hyphen preceding an option is 
followed by a question mark (for example, “-?export”), and 
the option is not recognized as a valid directive, the linker 
must ignore it. This allows compilers and linkers to add new 
directives while maintaining compatibility with existing linkers, 
as long as the new directives are not required for the correct 
linking of the application. For example, if the directive enables 
a link-time optimization, it is acceptable if some linkers cannot 
recognize it.
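In practice that seems to be exactly what MSVC's #pragma comment(lib, ...) boils down to: as far as I can tell, the compiler just drops a default-library directive into .drectve, something like

     /DEFAULTLIB:"foo.lib"

and dumpbin /DIRECTIVES will show it, which would explain how Microsoft's linker is able to find lib details in the object files without any help from the build script.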
Oct 12 2012
prev sibling parent reply Manu <turkeyman gmail.com> writes:
On 10 October 2012 14:15, Iain Buclaw <ibuclaw ubuntu.com> wrote:

 On 10 October 2012 09:31, Manu <turkeyman gmail.com> wrote:
 Percect, thanks!


 On 10 October 2012 11:27, Walter Bright <newshound2 digitalmars.com>
wrote:
 On 10/10/2012 1:22 AM, Manu wrote:
 Does D support some sort of #pragma lib?
Yes: pragma(lib, "mylib.lib");
NB: GCC has no such equivalent, and IMO libraries should be specified during the linking step. Such information simply doesn't belong inside a source file as a source file can be compiled or assembled even without a linking stage.
Really? Is it an MS thing? I'm amazed the other compilers haven't adopted that in the last 10 years or whatever. It just leaves a note in the object file that the linker happens to find and apply later. I don't see any problem with it. It's the source file that has the dependency on the lib; it's annoying to manually manage that externally when the dependency is already explicit in the code and can easily be recorded in the object file. I think D's modules make this relationship even stronger, and it's a shame it's not a standard part of D.
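To make the versioned-out case concrete, a small sketch (the library and symbols are just for illustration):

      version (UseOpenGL)
      {
          // The lib reference lives and dies with the code that needs it:
          // build without -version=UseOpenGL and the dependency vanishes
          // from the object file along with the code.
          pragma(lib, "opengl32.lib");
          extern (C) void glFlush();
      }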
Oct 10 2012
next sibling parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 10 October 2012 at 11:50:29 UTC, Manu wrote:
 On 10 October 2012 14:15, Iain Buclaw <ibuclaw ubuntu.com> 
 wrote:

 On 10 October 2012 09:31, Manu <turkeyman gmail.com> wrote:
 Percect, thanks!


 On 10 October 2012 11:27, Walter Bright 
 <newshound2 digitalmars.com>
wrote:
 On 10/10/2012 1:22 AM, Manu wrote:
 Does D support some sort of #pragma lib?
Yes: pragma(lib, "mylib.lib");
NB: GCC has no such equivalent, and IMO libraries should be specified during the linking step. Such information simply doesn't belong inside a source file as a source file can be compiled or assembled even without a linking stage.
Really? Is it an MS thing? I'm amazed the other compilers haven't adopted that in the last 10 years or whatever.
Yes, it is a Microsoft extension. I never saw it in any other C or C++ compiler. Maybe Intel and CodeGear compilers have it, since they value MSVC compatibility. -- Paulo
Oct 10 2012
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 10/10/2012 4:49 AM, Paulo Pinto wrote:
 On Wednesday, 10 October 2012 at 11:50:29 UTC, Manu wrote:
 Really? Is it an MS thing? I'm amazed the other compilers haven't adopted
 that in the last 10 years or whatever.
Yes, it is a Microsoft extension. I never saw it in any other C or C++ compiler.
Digital Mars C and C++ !!
Oct 10 2012
parent reply "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 10 October 2012 at 14:44:20 UTC, Walter Bright 
wrote:
 On 10/10/2012 4:49 AM, Paulo Pinto wrote:
 On Wednesday, 10 October 2012 at 11:50:29 UTC, Manu wrote:
 Really? Is it an MS thing? I'm amazed the other compilers 
 haven't adopted
 that in the last 10 years or whatever.
Yes, it is a Microsoft extension. I never saw it in any other C or C++ compiler.
Digital Mars C and C++ !!
I only became aware of Digital Mars thanks to D, I must confess. When I moved away from Turbo Pascal, I started using Turbo C and Turbo C++, followed by Borland C++ and eventually Visual C++. Then at the university I started to use vendor's C and C++ compilers of the multiple UNIX systems we had access to. I used to see adverts for Watcom C/C++, High C/C++ and Zortech C/C++ in computer magazines, but never knew anyone that had access to them. -- Paulo
Oct 10 2012
parent reply Arjan <arjan jak.nl> writes:
On Wed, 10 Oct 2012 16:32:05 +0200, Paulo Pinto <pjmlp progtools.org>  
wrote:

 On Wednesday, 10 October 2012 at 14:44:20 UTC, Walter Bright wrote:
 On 10/10/2012 4:49 AM, Paulo Pinto wrote:
 On Wednesday, 10 October 2012 at 11:50:29 UTC, Manu wrote:
 Really? Is it an MS thing? I'm amazed the other compilers haven't  
 adopted
 that in the last 10 years or whatever.
Yes, it is a Microsoft extension. I never saw it in any other C or C++ compiler.
Digital Mars C and C++ !!
I only became aware of Digital Mars thanks to D, I must confess. When I moved away from Turbo Pascal, I started using Turbo C and Turbo C++, followed by Borland C++ and eventually Visual C++. Then at the university I started to use vendor's C and C++ compilers of the multiple UNIX systems we had access to. I used to see adverts for Watcom C/C++, High C/C++ and Zortech C/C++ in computer magazines, but never knew anyone that had access to them.
You really missed something. For years Zortech/Symantec/Digital Mars C++ was my preferred compiler. Generated fast code very fast! Being STLport'ed, it also had decent support for the STL. It oftentimes barked about type-checking issues other compilers did not even mention, which saved me from quite a few bugs. When porting big C++ libraries like wxWidgets (then wxWindows) it became apparent to me that DMC++ needed the least special treatment to make it compile the code. Also ported various Boost libs to it. It wasn't until VS2005 came along that I started shifting away... About Watcom? Well, a complete opposite experience... Arjan
Oct 10 2012
parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Thursday, 11 October 2012 at 07:18:45 UTC, Arjan wrote:
 On Wed, 10 Oct 2012 16:32:05 +0200, Paulo Pinto 
 <pjmlp progtools.org> wrote:

 On Wednesday, 10 October 2012 at 14:44:20 UTC, Walter Bright 
 wrote:
 On 10/10/2012 4:49 AM, Paulo Pinto wrote:
 On Wednesday, 10 October 2012 at 11:50:29 UTC, Manu wrote:
 Really? Is it an MS thing? I'm amazed the other compilers 
 haven't adopted
 that in the last 10 years or whatever.
Yes, it is a Microsoft extension. I never saw it in any other C or C++ compiler.
Digital Mars C and C++ !!
I only became aware of Digital Mars thanks to D, I must confess. When I moved away from Turbo Pascal, I started using Turbo C and Turbo C++, followed by Borland C++ and eventually Visual C++. Then at the university I started to use vendor's C and C++ compilers of the multiple UNIX systems we had access to. I used to see adverts for Watcom C/C++, High C/C++ and Zortech C/C++ in computer magazines, but never knew anyone that had access to them.
You really missed something. For years Zortech/Symantec/Digital Mars C++ was my preferred compiler. Generated fast code very fast! Being STLport'ed, it also had decent support for the STL. It oftentimes barked about type-checking issues other compilers did not even mention, which saved me from quite a few bugs. When porting big C++ libraries like wxWidgets (then wxWindows) it became apparent to me that DMC++ needed the least special treatment to make it compile the code. Also ported various Boost libs to it. It wasn't until VS2005 came along that I started shifting away... About Watcom? Well, a complete opposite experience... Arjan
I guess the Portuguese market was too small for those compilers to be available there.
Oct 11 2012
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 10/10/2012 4:26 AM, Manu wrote:
 I think D's modules make this relationship even stronger, and it's a shame it's
 not a standard part of D.
Some object module formats do not have the ability to embed a library reference in them. Elf (cough cough), Mach-O (wheeze)
Oct 10 2012
parent reply "Simen Kjaeraas" <simen.kjaras gmail.com> writes:
On 2012-10-10, 16:19, Walter Bright wrote:

 On 10/10/2012 4:26 AM, Manu wrote:
 I think D's modules make this relationship even stronger, and it's a  
 shame it's
 not a standard part of D.
Some object module formats do not have the ability to embed a library reference in them. Elf (cough cough), Mach-O (wheeze)
ELF has .comment and .note sections. While they're not exactly intended for the purpose, they seem to fit the bill? It'd of course require a linker to know what to do with the data, but it seems to me this would not interfere with other uses. Note: The above comes from a 2-minute cursory read of the ELF standard, and may thus contain errors. :p
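For what it's worth, something like

     readelf -p .comment foo.o

dumps whatever strings end up in such a section (foo.o being any object file), so the data would at least be easy for a linker driver to get at.

-- Simen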
Oct 10 2012
next sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 10/10/2012 8:19 AM, Simen Kjaeraas wrote:
 On 2012-10-10, 16:19, Walter Bright wrote:
 Some object module formats do not have the ability to embed a library
 reference in them. Elf (cough cough), Mach-O (wheeze)
ELF has .comment and .note sections. While they're not exactly intended for the purpose, they seem to fit the bill? It'd of course require a linker to know what to do with the data, but it seems to me this would not interfere with other uses.
Exactly. Embedding it in a comment section accomplishes exactly nothing if the linker ignores it.
Oct 10 2012
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Oct 10, 2012 at 08:56:49AM -0700, Walter Bright wrote:
 On 10/10/2012 8:19 AM, Simen Kjaeraas wrote:
On 2012-10-10, 16:19, Walter Bright wrote:
Some object module formats do not have the ability to embed a library
reference in them. Elf (cough cough), Mach-O (wheeze)
ELF has .comment and .note sections. While they're not exactly intended for the purpose, they seem to fit the bill? It'd of course require a linker to know what to do with the data, but it seems to me this would not interfere with other uses.
Exactly. Embedding it in a comment section accomplishes exactly nothing if the linker ignores it.
[...] I thought the point was that linkers that *did* understand it will do the right thing, and linkers that don't will fall back to the traditional way (require specifying libraries on the command-line). T -- The most powerful one-line C program: #include "/dev/tty" -- IOCCC
Oct 10 2012
parent reply Dmitry Olshansky <dmitry.olsh gmail.com> writes:
On 10-Oct-12 22:54, H. S. Teoh wrote:
 On Wed, Oct 10, 2012 at 08:56:49AM -0700, Walter Bright wrote:
 On 10/10/2012 8:19 AM, Simen Kjaeraas wrote:
 On 2012-10-10, 16:19, Walter Bright wrote:
 Some object module formats do not have the ability to embed a library
 reference in them. Elf (cough cough), Mach-O (wheeze)
ELF has .comment and .note sections. While they're not exactly intended for the purpose, they seem to fit the bill? It'd of course require a linker to know what to do with the data, but it seems to me this would not interfere with other uses.
Exactly. Embedding it in a comment section accomplishes exactly nothing if the linker ignores it.
[...] I thought the point was that linkers that *did* understand it will do the right thing, and linkers that don't will fall back to the traditional way (require specifying libraries on the command-line).
BTW, making the smart linker driver should be relatively easy: just get the special section contents out of the passed object files and turn that into a full-blown set of flags for a dumb linker.
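Something along these lines (a sketch only, assuming the libraries were recorded NUL-separated in a made-up .note.liblist section, and that binutils' objcopy is on the PATH):

     import std.algorithm, std.array, std.file, std.process;

     void main(string[] args)
     {
         auto objs = args[1 .. $];
         string[] libs;
         foreach (obj; objs)
         {
             // Dump the hypothetical section to a scratch file; an empty
             // or missing dump just means the object recorded no libs.
             auto tmp = obj ~ ".liblist";
             auto r = execute(["objcopy", "-O", "binary",
                               "--only-section=.note.liblist", obj, tmp]);
             if (r.status == 0 && exists(tmp) && getSize(tmp) > 0)
                 libs ~= (cast(string) read(tmp))
                         .split('\0').filter!(l => l.length != 0).array;
         }
         // Hand everything to the dumb linker with -l flags appended.
         auto cmd = ["cc"] ~ objs ~ libs.sort().uniq.map!(l => "-l" ~ l).array;
         spawnProcess(cmd).wait;
     }

-- Dmitry Olshansky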
Oct 10 2012
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Oct 10, 2012 at 10:56:30PM +0400, Dmitry Olshansky wrote:
 On 10-Oct-12 22:54, H. S. Teoh wrote:
On Wed, Oct 10, 2012 at 08:56:49AM -0700, Walter Bright wrote:
On 10/10/2012 8:19 AM, Simen Kjaeraas wrote:
On 2012-10-10, 16:19, Walter Bright wrote:
Some object module formats do not have the ability to embed a
library reference in them. Elf (cough cough), Mach-O (wheeze)
ELF has .comment and .note sections. While they're not exactly intended for the purpose, they seem to fit the bill? It'd of course require a linker to know what to do with the data, but it seems to me this would not interfere with other uses.
Exactly. Embedding it in a comment section accomplishes exactly nothing if the linker ignores it.
[...] I thought the point was that linkers that *did* understand it will do the right thing, and linkers that don't will fall back to the traditional way (require specifying libraries on the command-line).
BTW, making the smart linker driver should be relatively easy: just get the special section contents out of the passed object files and turn that into a full-blown set of flags for a dumb linker.
[...] Excellent idea! T -- Guns don't kill people. Bullets do.
Oct 10 2012
prev sibling parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 10/10/2012 11:56 AM, Dmitry Olshansky wrote:
 BTW, making the smart linker driver should be relatively easy: just get the
 special section contents out of the passed object files and turn that into a
 full-blown set of flags for a dumb linker.
Not that easy. You'd have to do it also with the objects pulled in from library files. Essentially, you have to implement most of the linker to do that. Not worth it.
Oct 10 2012
parent reply Jacob Carlborg <doob me.com> writes:
On 2012-10-11 01:18, Walter Bright wrote:

 Not that easy. You'd have to do it also with the objects pulled in from
 library files. Essentially, you have to implement most of the linker to
 do that.

 Not worth it.
Maybe it's time to update the old tool chain that we inherited from the 70s, but this is a totally different discussion. -- /Jacob Carlborg
Oct 10 2012
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 10/10/2012 11:49 PM, Jacob Carlborg wrote:
 Maybe it's time to update the old tool chain that we inherited from the 70s,
but
 this is a totally different discussion.
Writing a new linker for each platform is a major undertaking.
Oct 11 2012
next sibling parent reply Manu <turkeyman gmail.com> writes:
On 11 October 2012 14:35, Walter Bright <newshound2 digitalmars.com> wrote:

 On 10/10/2012 11:49 PM, Jacob Carlborg wrote:

 Maybe it's time to update the old tool chain that we inherited form the
 70s, but
 this is a totally different discussion.
Writing a new linker for each platform is a major undertaking.
Well Microsoft's linker is obviously capable of finding lib details in the object files. Iain almost sounded like he had a working theory for ld. I know nothing of OSX. LLVM?
Oct 11 2012
next sibling parent Jacob Carlborg <doob me.com> writes:
On 2012-10-11 13:47, Manu wrote:

 Well Microsoft's linker is obviously capable of finding lib details in
 the object files.
 Iain almost sounded like he had a working theory for ld.
 I know nothing of OSX. LLVM?
I think that Mac OS X is still using ld, but I'm not sure. LLVM does have a linker, but I don't remember if it's the standard system linker. The LLVM linker most likely has a better chance of handling these kinds of things if ld doesn't already do that. -- /Jacob Carlborg
Oct 11 2012
prev sibling parent reply "David Nadlinger" <see klickverbot.at> writes:
On Thursday, 11 October 2012 at 12:11:47 UTC, Manu wrote:
 LLVM?
LLVM is capable of emitting directly to object files, but linking is not part of its (core) agenda. In LDC, we currently depend on "the system linker", i.e. GCC on Unixen and link.exe on MSVC/Windows. This might change if/when LLD (http://lld.llvm.org/) becomes stable, though. David
Oct 11 2012
parent reply Iain Buclaw <ibuclaw ubuntu.com> writes:
On 11 October 2012 13:18, David Nadlinger <see klickverbot.at> wrote:
 On Thursday, 11 October 2012 at 12:11:47 UTC, Manu wrote:
 LLVM?
LLVM is capable of emitting directly to object files, but linking is not part of its (core) agenda. In LDC, we currently depend on "the system linker", i.e. GCC on Unixen and link.exe on MSVC/Windows. This might change if/when LLD (http://lld.llvm.org/) becomes stable, though. David
Does LDC only build one executable? -- Iain Buclaw *(p < e ? p++ : p) = (c & 0x0f) + '0';
Oct 11 2012
parent reply "David Nadlinger" <see klickverbot.at> writes:
On Thursday, 11 October 2012 at 15:18:07 UTC, Iain Buclaw wrote:
 Does LDC only build one executable?
What do you mean by "only one executable" – only one object file? If the latter, then the answer depends on whether you pass the -singleobj switch at compile time. If it is specified, all the LLVM modules which would be codegen'd as separate object files are linked together internally before being emitted; if not, individual object files are generated.
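For example (module names invented), something like

     ldc -c -singleobj foo.d bar.d

should leave you with a single object file instead of a separate foo.o and bar.o.

David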
Oct 11 2012
next sibling parent reply Iain Buclaw <ibuclaw ubuntu.com> writes:
On 11 October 2012 16:29, David Nadlinger <see klickverbot.at> wrote:
 On Thursday, 11 October 2012 at 15:18:07 UTC, Iain Buclaw wrote:
 Does LDC only build one executable?
 What do you mean by "only one executable" – only one object file? If the
 latter, then the answer depends on whether you pass the -singleobj switch at
 compile time. If it is specified, all the LLVM modules which would be
 codegen'd as separate object files are linked together internally before
 being emitted; if not, individual object files are generated.

 David
Not what the LDC compiler does, the actual /usr/bin/ldc executable itself. Is it in one piece, or in pieces? E.g., GDC is in two pieces, gdc and cc1d. -- Iain Buclaw *(p < e ? p++ : p) = (c & 0x0f) + '0';
Oct 11 2012
next sibling parent reply "David Nadlinger" <see klickverbot.at> writes:
On Thursday, 11 October 2012 at 16:01:09 UTC, Iain Buclaw wrote:
 On 11 October 2012 16:29, David Nadlinger <see klickverbot.at> 
 wrote:
 On Thursday, 11 October 2012 at 15:18:07 UTC, Iain Buclaw 
 wrote:
 Does LDC only build one executable?
[…]
Not what the LDC compiler does, the actual /usr/bin/ldc executable itself. Is it in one pieces, or in pieces? eg: GDC is in two pieces, gdc and cc1d.
Ah, now I see what you mean. Yes, LDC consists of only one executable. David
Oct 11 2012
parent reply Iain Buclaw <ibuclaw ubuntu.com> writes:
On 11 October 2012 18:49, David Nadlinger <see klickverbot.at> wrote:
 On Thursday, 11 October 2012 at 16:01:09 UTC, Iain Buclaw wrote:
 On 11 October 2012 16:29, David Nadlinger <see klickverbot.at> wrote:
 On Thursday, 11 October 2012 at 15:18:07 UTC, Iain Buclaw wrote:
 Does LDC only build one executable?
[…]
Not what the LDC compiler does, the actual /usr/bin/ldc executable itself. Is it in one piece, or in pieces? E.g., GDC is in two pieces, gdc and cc1d.
Ah, now I see what you mean. Yes, LDC consists of only one executable. David
So the object that interprets pragma(lib) is able to communicate with the object that handles what flags to pass to the linker. :-) This is not possible with gdc, as the driver that handles the calling of ld (gdc) does not / cannot communicate with the compiler component (cc1d). Regards, -- Iain Buclaw *(p < e ? p++ : p) = (c & 0x0f) + '0';
Oct 11 2012
parent reply "David Nadlinger" <see klickverbot.at> writes:
On Thursday, 11 October 2012 at 22:50:38 UTC, Iain Buclaw wrote:
 So the object that interprets pragma(lib) is able to 
 communicate with
 the object that handles what flags to pass to the linker. :-)
Yes, and pragma(lib) is already supported in LDC.
 This is not possible with gdc, as the driver that handles the 
 calling
 of ld (gdc) does not / cannot communicate with the compiler 
 component
 (cc1d).
Can't you just do something along the lines of having "gdc test.d" internally calling "cc1d -o … --write-additional-libraries-to-this-file=/tmp/unique.library.deps.file.or.pipe" and reading from that file/pipe later when linking? Or is this forbidden by some GCC rules? David
Oct 12 2012
parent Iain Buclaw <ibuclaw ubuntu.com> writes:
On 12 October 2012 10:47, David Nadlinger <see klickverbot.at> wrote:
 On Thursday, 11 October 2012 at 22:50:38 UTC, Iain Buclaw wrote:
 So the object that interprets pragma(lib) is able to communicate with
 the object that handles what flags to pass to the linker. :-)
Yes, and pragma(lib) is already supported in LDC.
 This is not possible with gdc, as the driver that handles the calling
 of ld (gdc) does not / cannot communicate with the compiler component
 (cc1d).
Can't you just do something along the lines of having "gdc test.d" internally calling "cc1d -o =85 --write-additional-libraries-to-this-file=3D/tmp/unique.library.deps.file=
.or.pipe"
 and read from that file/pipe later when linking? Or is this forbidden by
 some GCC rules?

 David
Don't think that would be a simple thing to do in the current infrastructure. -- Iain Buclaw *(p < e ? p++ : p) = (c & 0x0f) + '0';
Oct 12 2012
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2012-10-11 18:00, Iain Buclaw wrote:

 Not what the LDC compiler does, the actual /usr/bin/ldc executable
 itself.  Is it in one piece, or in pieces?  E.g.:  GDC is in two
 pieces, gdc and cc1d.
Why is that? I know that Clang does the same; I've always wondered why. -- /Jacob Carlborg
Oct 11 2012
prev sibling parent Iain Buclaw <ibuclaw ubuntu.com> writes:
On 11 October 2012 17:00, Iain Buclaw <ibuclaw ubuntu.com> wrote:
 On 11 October 2012 16:29, David Nadlinger <see klickverbot.at> wrote:
 On Thursday, 11 October 2012 at 15:18:07 UTC, Iain Buclaw wrote:
 Does LDC only build one executable?
What do you mean by "only one executable" – only one object file? If the
 latter, then the answer depends on whether you pass the -singleobj switch at
 compile time. If it is specified, all the LLVM modules which would be
 codegen'd as separate object files are linked together internally before
 being emitted; if not, individual object files are generated.

 David
Not what the LDC compiler does, the actual /usr/bin/ldc executable itself. Is it in one piece, or in pieces? E.g., GDC is in two pieces, gdc and cc1d.
Effectively, is it one executable that handles the entire compilation process, from parsing options to compiling and linking? -- Iain Buclaw *(p < e ? p++ : p) = (c & 0x0f) + '0';
Oct 11 2012
prev sibling next sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2012-10-11 13:35, Walter Bright wrote:

 Writing a new linker for each platform is a major undertaking.
Of course, but isn't writing a new compiler for each platform a major undertaking as well? -- /Jacob Carlborg
Oct 11 2012
parent reply Walter Bright <newshound2 digitalmars.com> writes:
On 10/11/2012 4:59 AM, Jacob Carlborg wrote:
 On 2012-10-11 13:35, Walter Bright wrote:

 Writing a new linker for each platform is a major undertaking.
Of course, but isn't writing a new compiler for each platform a major undertaking as well?
Are you saying that just to support pragma(lib), we should write a new linker?
Oct 11 2012
parent Jacob Carlborg <doob me.com> writes:
On 2012-10-11 14:36, Walter Bright wrote:

 Are you saying that just to support pragma(lib), we should write a new
 linker?
No, not because of this feature alone. There are many other features that could be implemented with a smarter linker. There has been talk about this on the newsgroups several times. Better handling of templates and overloaded functions, for example. -- /Jacob Carlborg
Oct 11 2012
prev sibling parent Iain Buclaw <ibuclaw ubuntu.com> writes:
On 11 October 2012 12:47, Manu <turkeyman gmail.com> wrote:
 On 11 October 2012 14:35, Walter Bright <newshound2 digitalmars.com> wrote:
 On 10/10/2012 11:49 PM, Jacob Carlborg wrote:
 Maybe it's time to update the old tool chain that we inherited form the
 70s, but
 this is a totally different discussion.
Writing a new linker for each platform is a major undertaking.
Well Microsoft's linker is obviously capable of finding lib details in the object files. Iain almost sounded like he had a working theory for ld. I know nothing of OSX. LLVM?
I also said talk is cheap, and there are more great ideas than will ever be implemented. In open source development, if you want to see a feature implemented, you are usually left to do it yourself and send patches to the right places. Core devs love patches; code is expensive. :~) Regards -- Iain Buclaw *(p < e ? p++ : p) = (c & 0x0f) + '0';
Oct 11 2012
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2012-10-10 17:56, Walter Bright wrote:

 Exactly. Embedding it in a comment section accomplishes exactly nothing
 if the linker ignores it.
DMD calls the linker (or rather GCC). Just read the section and append a link flag. But that might turn DMD into a linker. Again, I think all this should be handled by a build tool and a package manager. -- /Jacob Carlborg
Oct 10 2012
prev sibling parent "Paulo Pinto" <pjmlp progtools.org> writes:
On Wednesday, 10 October 2012 at 15:43:36 UTC, Simen Kjaeraas 
wrote:
 On 2012-10-10, 16:19, Walter Bright wrote:

 On 10/10/2012 4:26 AM, Manu wrote:
 I think D's modules make this relationship even stronger, and 
 it's a shame it's
 not a standard part of D.
Some object module formats do not have the ability to embed a library reference in them. Elf (cough cough), Mach-O (wheeze)
ELF has .comment and .note sections. While they're not exactly intended for the purpose, they seem to fit the bill? It'd of course require a linker to know what to do with the data, but it seems to me this would not interfere with other uses. Note: The above comes from a 2-minute cursory read of the ELF standard, and may thus contain errors. :p
This is how Go packages are compiled. The compiler reads the libraries (.a, .o) for the symbol information of what a package exports. -- Paulo
Oct 10 2012