digitalmars.D - DIP11
- jdrewsen (4/4) Aug 10 2011 What is the status of DIP11
- Jacob Carlborg (6/10) Aug 11 2011 Not sure, personally I don't like it. Instead I'm working on a more
- Jonas Drewsen (5/16) Aug 11 2011 Yes I've noticed that. Seems very promising.
- Jacob Carlborg (9/26) Aug 11 2011 I think that DIP11 is too limited, for example, it doesn't deal with
- Jonas Drewsen (4/31) Aug 11 2011 Some refinements needs to be done to the DIP yes. One of them is version...
- Steven Schveighoffer (10/37) Aug 11 2011 Given that the implementation would be a compiler-used tool, and the too...
- Jacob Carlborg (20/29) Aug 11 2011 That might be the case. Since it's arbitrary URLs that represents D
- Andrej Mitrovic (7/7) Aug 11 2011 I don't trust DIP11 to produce anything stable. We already have a
- Steven Schveighoffer (23/50) Aug 11 2011 These are implementation details. The compiler just knows "hm.. there's...
- Jacob Carlborg (13/67) Aug 11 2011 Yes, exactly.
- Nick Sabalausky (6/11) Aug 11 2011 This works cross-platform:
- Jacob Carlborg (9/22) Aug 12 2011 Will that work with all available library types (static, dynamic) on all...
- Andrew Wiley (5/49) Aug 11 2011 This last bit doesn't really come into play here because you can already...
- Steven Schveighoffer (16/28) Aug 11 2011 Yes, but then you have to restart the compiler to figure out what's next...
- Jacob Carlborg (28/58) Aug 11 2011 So how would that be different if the compiler drives everything? Say
- Steven Schveighoffer (16/55) Aug 11 2011 Forgive my compiler ignorance (not a compiler writer), but why does the ...
- Jacob Carlborg (37/70) Aug 12 2011 Probably it's no different. Well what's different is it will first parse...
- Steven Schveighoffer (41/115) Aug 12 2011 The extendability is in the url. For example, yes, http://server/file.d...
- Adam D. Ruppe (5/5) Aug 12 2011 I did write build2.d, which tries to simulate dip11 outside
- Jacob Carlborg (16/68) Aug 12 2011 Now I'm not quite sure I understand. Are you saying that every file
- Steven Schveighoffer (13/58) Aug 12 2011 Let's say file a.d pragmas that module foo means http://foo.com/projectx...
- Jacob Carlborg (6/20) Aug 12 2011 Again, will that mean you have to specify a pragma for each file?
- Steven Schveighoffer (15/39) Aug 12 2011 Yes, or specify it on the command line/config file. I think the risk of...
- Jacob Carlborg (7/23) Aug 13 2011 If x and y depends on two different versions of z, how would that be
- Steven Schveighoffer (17/43) Aug 15 2011 It wouldn't be an actual conflict, it would be a naming issue. In other...
- Jacob Carlborg (9/22) Aug 15 2011 I guess so.
- Martin Nowak (17/145) Aug 12 2011 Well, I would give it a try to implement a prototype.
- Nick Sabalausky (21/41) Aug 11 2011 That's *only* true if you go along with DIP11's misguided file-oriented
- Nick Sabalausky (3/48) Aug 11 2011 In other words, DIP11 just reinvents the wheel, poorly.
- Steven Schveighoffer (23/70) Aug 11 2011 It already does this:
- Jacob Carlborg (5/8) Aug 12 2011 The DIP needs to explain this, is that the whole point?
- Steven Schveighoffer (11/18) Aug 12 2011 The DIP focuses on the compiler changes needed + gives rudimentary
- Jacob Carlborg (4/25) Aug 12 2011 Ok, I see.
- Andrei Alexandrescu (4/41) Aug 11 2011 It's difficult to get all dependencies when not all sources have been
- Nick Sabalausky (3/5) Aug 11 2011 With DIP11, yes. With a traditional-style package manager, no.
- Steven Schveighoffer (7/15) Aug 11 2011 With either style, you need to download a package in order to determine ...
- kennytm (8/29) Aug 11 2011 In Debian's apt, there will be a central index that records all
- Jacob Carlborg (6/35) Aug 12 2011 No, but I'm guessing it's more efficient to download zip files instead
- Steven Schveighoffer (11/40) Aug 12 2011 This could be done (in fact, the tool could have a url system that uses ...
- Jacob Carlborg (4/20) Aug 12 2011 No, not if you have a single meta file with all the dependencies.
- Jacob Carlborg (5/8) Aug 12 2011 No, not when you have a single meta file containing all the dependencies...
- David Nadlinger (4/11) Aug 12 2011 Or a »spec« file along with each package containing the metadata, e.g....
- Andrei Alexandrescu (3/10) Aug 12 2011 I understand. I believe there is value in avoiding meta files.
- Jacob Carlborg (6/17) Aug 12 2011 This meta file lives on in the repository and is an implementation
- Martin Nowak (35/50) Aug 11 2011 It really looks sound to me.
- Jacob Carlborg (6/30) Aug 12 2011 We will be very dependent on Walter or anyone else that knows the
- Steven Schveighoffer (7/42) Aug 12 2011 This is not true. The compiler implements *hooks* for a download tool. ...
- Jacob Carlborg (6/12) Aug 12 2011 I've read the whole DIP and I know there is an external tool that
- Steven Schveighoffer (8/20) Aug 12 2011 I thought you meant we would be dependent on Walter to write the *downlo...
- Jacob Carlborg (7/28) Aug 12 2011 Yeah, that's true and it's a good thing. I can tell you this, I've
- Steven Schveighoffer (3/37) Aug 12 2011 /me in same boat as you
- Nick Sabalausky (30/46) Aug 11 2011 I really see it as solving the wrong problem the wrong way.
- Jacob Carlborg (4/53) Aug 11 2011 I completely agree with this.
What is the status of DIP11 http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11 Has anyone started implementing it? Has it been rejected? /Jonas
Aug 10 2011
On 2011-08-10 21:55, jdrewsen wrote:
What is the status of DIP11 http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11 Has anyone started implementing it? Has it been rejected? /Jonas

Not sure, personally I don't like it. Instead I'm working on a more traditional package manager called Orbit: https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D

-- /Jacob Carlborg
Aug 11 2011
On 11/08/11 09.07, Jacob Carlborg wrote:
On 2011-08-10 21:55, jdrewsen wrote:
What is the status of DIP11 http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11 Has anyone started implementing it? Has it been rejected? /Jonas

Not sure, personally I don't like it. Instead I'm working on a more traditional package manager called Orbit: https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D

Yes I've noticed that. Seems very promising. What I do like about DIP11 is how seamless it would work. You just have to compile and stuff works.

/Jonas
Aug 11 2011
On 2011-08-11 09:41, Jonas Drewsen wrote:
On 11/08/11 09.07, Jacob Carlborg wrote:
On 2011-08-10 21:55, jdrewsen wrote:
What is the status of DIP11 http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11 Has anyone started implementing it? Has it been rejected? /Jonas

Not sure, personally I don't like it. Instead I'm working on a more traditional package manager called Orbit: https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D

Yes I've noticed that. Seems very promising. What I do like about DIP11 is how seamless it would work. You just have to compile and stuff works. /Jonas

I think that DIP11 is too limited, for example, it doesn't deal with versions. Orbit combined with a build tool will be seamless as well. RDMD is a great tool, but as soon as you need to add compiler flags or compile a library you need either some kind of script or a build tool. And in that case you can just go with the build tool and have it work on all platforms.

-- /Jacob Carlborg
Aug 11 2011
On 11/08/11 09.49, Jacob Carlborg wrote:
On 2011-08-11 09:41, Jonas Drewsen wrote:
On 11/08/11 09.07, Jacob Carlborg wrote:
On 2011-08-10 21:55, jdrewsen wrote:
What is the status of DIP11 http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11 Has anyone started implementing it? Has it been rejected? /Jonas

Not sure, personally I don't like it. Instead I'm working on a more traditional package manager called Orbit: https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D

Yes I've noticed that. Seems very promising. What I do like about DIP11 is how seamless it would work. You just have to compile and stuff works. /Jonas

I think that DIP11 is too limited, for example, it doesn't deal with versions. Orbit combined with a build tool will be seamless as well. RDMD is a great tool, but as soon as you need to add compiler flags or compile a library you need either some kind of script or a build tool. And in that case you can just go with the build tool and have it work on all platforms.

Some refinements need to be done to the DIP, yes. One of them is version handling IMHO.

/Jonas
Aug 11 2011
On Thu, 11 Aug 2011 03:49:56 -0400, Jacob Carlborg <doob me.com> wrote:On 2011-08-11 09:41, Jonas Drewsen wrote:Given that the implementation would be a compiler-used tool, and the tool can implement any protocol it wants, I think it has very few limitations. I envision the tool being able to handle any network protocol or packaging system we want it to. I think the benefit of this approach over a build tool which wraps the compiler is, the compiler already has the information needed for dependencies, etc. To a certain extent, the wrapping build tool has to re-implement some of the compiler pieces. -SteveOn 11/08/11 09.07, Jacob Carlborg wrote:I think that DIP11 is too limited, for example, it doesn't deal with versions. Orbit combined with a build tool will be seamless as well. RDMD is a great tool but as soon as you need to add compiler flags or compile a library you need either some kind of script or a build tool. And in that case you can just go with the built tool and have it work on all platforms.On 2011-08-10 21:55, jdrewsen wrote:Yes I've noticed that. Seems very promising. What I do like about DIP11 is how seamless it would work. You just have to compile and stuff works. /JonasWhat is the status of DIP11 http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11 Has anyone started implementing it? Has it been rejected? /JonasNot sure, personally I don't like it. Instead I'm working on a more traditional package manager called Orbit: https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D
Aug 11 2011
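A minimal sketch of what such a compiler-invoked download hook could look like, written in D. The command-line contract assumed here -- URL in, cached local path printed on stdout -- is purely illustrative; DIP11 does not pin down the tool's interface, and the tool and cache names are made up:

    // dget.d -- hypothetical fetch tool the compiler could shell out to.
    // Assumed usage: dget <url>   -> prints the local path of the cached copy.
    import std.stdio, std.file, std.path, std.net.curl;

    void main(string[] args)
    {
        string url   = args[1];                           // URL the compiler could not resolve locally
        string cache = expandTilde("~/.d-import-cache");  // hypothetical cache directory
        mkdirRecurse(cache);
        string local = buildPath(cache, baseName(url));
        if (!exists(local))
            download(url, local);                         // std.net.curl.download
        writeln(local);                                   // the compiler would read this path back
    }

The point of the split is that the tool, not the compiler, owns the protocol: swapping plain HTTP for an archive download or a repository-specific scheme would never touch the compiler itself.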
On 2011-08-11 14:52, Steven Schveighoffer wrote:Given that the implementation would be a compiler-used tool, and the tool can implement any protocol it wants, I think it has very few limitations. I envision the tool being able to handle any network protocol or packaging system we want it to.That might be the case. Since it's arbitrary URLs that represents D modules and packages, to me it seems that there needs to be a lot of conventions: * Where to put the packages * How to name them * How to indicate a specific version and so on.I think the benefit of this approach over a build tool which wraps the compiler is, the compiler already has the information needed for dependencies, etc. To a certain extent, the wrapping build tool has to re-implement some of the compiler pieces. -SteveUntil the compiler can automatically compile dependencies we need build tools. What about linking with pre-compiled libraries, how would that work? Currently the linking paths needs to be known before the compiler is invoked. You would first need to compile without linking and then link, or something like that. Assuming the compiler isn't changed to be able to receive linker options form the external tool. Note that I'm basing all this on what's written in the DIP (and what you've said), as far as I know that's the current suggestion. But the DIP can of course be enhanced and updated. -- /Jacob Carlborg
Aug 11 2011
I don't trust DIP11 to produce anything stable. We already have a half-broken compiler, and now we're going to have a half-broken build system to go along with it? No way. I'd rather it be an external tool (built with D so we can actually hack on it), that can be *downloaded from anywhere*, without having to rely on dmars.com (which is painfully slow), and being encumbered by licensing issues (you can't distribute DMD).
Aug 11 2011
On Thu, 11 Aug 2011 10:47:36 -0400, Jacob Carlborg <doob me.com> wrote:On 2011-08-11 14:52, Steven Schveighoffer wrote:These are implementation details. The compiler just knows "hm.. there's some external file, I don't know how to get it, tool, can you help me out?" The tool could potentially download the files and build them, or download a pre-compiled package. Or it could add files to the compiler's todo list somehow (having recursive compiler invocations might be bad...)Given that the implementation would be a compiler-used tool, and the tool can implement any protocol it wants, I think it has very few limitations. I envision the tool being able to handle any network protocol or packaging system we want it to.That might be the case. Since it's arbitrary URLs that represents D modules and packages, to me it seems that there needs to be a lot of conventions: * Where to put the packages * How to name them * How to indicate a specific versionI agree it would be advantageous to have the build tool add files to the compiler's todo list. As of now, just downloading source for import does not help because you still need to compile the file, not just import it. But you have to admit, just having the source file or include path include parts of the internet would be pretty cool. I really like that approach. Its sort of like how dsss used to work, except dsss contained a huge portion of the compiler in it. At least with DIP11, the compiler is the driver, no need to maintain a separate "compiler". It could be that DIP11 needs more work to get a technically useful and implementable feature.I think the benefit of this approach over a build tool which wraps the compiler is, the compiler already has the information needed for dependencies, etc. To a certain extent, the wrapping build tool has to re-implement some of the compiler pieces. -SteveUntil the compiler can automatically compile dependencies we need build tools.What about linking with pre-compiled libraries, how would that work? Currently the linking paths needs to be known before the compiler is invoked. You would first need to compile without linking and then link, or something like that. Assuming the compiler isn't changed to be able to receive linker options form the external tool.I'm assuming the linker path in dmd.conf would include some global or user-specific cache directory where pre-compiled libraries are downloaded. Then the download tool puts the libraries in the right spot, so the path does not need adjusting.Note that I'm basing all this on what's written in the DIP (and what you've said), as far as I know that's the current suggestion. But the DIP can of course be enhanced and updated.Yes, and much better than a DIP is a working solution, so it might very well be that Orbit wins :) I certainly don't have the time or expertise to implement it... -Steve
Aug 11 2011
On 2011-08-11 17:45, Steven Schveighoffer wrote:
On Thu, 11 Aug 2011 10:47:36 -0400, Jacob Carlborg <doob me.com> wrote:
On 2011-08-11 14:52, Steven Schveighoffer wrote:
Given that the implementation would be a compiler-used tool, and the tool can implement any protocol it wants, I think it has very few limitations. I envision the tool being able to handle any network protocol or packaging system we want it to.

That might be the case. Since it's arbitrary URLs that represents D modules and packages, to me it seems that there needs to be a lot of conventions:
* Where to put the packages
* How to name them
* How to indicate a specific version

These are implementation details. The compiler just knows "hm.. there's some external file, I don't know how to get it, tool, can you help me out?" The tool could potentially download the files and build them, or download a pre-compiled package. Or it could add files to the compiler's todo list somehow (having recursive compiler invocations might be bad...)

In that case that needs to be said in the DIP.

I think the benefit of this approach over a build tool which wraps the compiler is, the compiler already has the information needed for dependencies, etc. To a certain extent, the wrapping build tool has to re-implement some of the compiler pieces. -Steve

Until the compiler can automatically compile dependencies we need build tools.

I agree it would be advantageous to have the build tool add files to the compiler's todo list. As of now, just downloading source for import does not help because you still need to compile the file, not just import it.

Yes, exactly.

But you have to admit, just having the source file or include path include parts of the internet would be pretty cool. I really like that approach. Its sort of like how dsss used to work, except dsss contained a huge portion of the compiler in it. At least with DIP11, the compiler is the driver, no need to maintain a separate "compiler". It could be that DIP11 needs more work to get a technically useful and implementable feature.

It sounds pretty cool, but the question is how well it would work and if it's the right thing to do.

What about linking with pre-compiled libraries, how would that work? Currently the linking paths needs to be known before the compiler is invoked. You would first need to compile without linking and then link, or something like that. Assuming the compiler isn't changed to be able to receive linker options from the external tool.

I'm assuming the linker path in dmd.conf would include some global or user-specific cache directory where pre-compiled libraries are downloaded. Then the download tool puts the libraries in the right spot, so the path does not need adjusting.

Assuming that, you would still need to link with the libraries. I don't know if pragma(lib, ""); could work but I don't like that pragma in general. It's platform dependent and I'm not sure if it works for dynamic libraries. I don't think GDC implements it.

Note that I'm basing all this on what's written in the DIP (and what you've said), as far as I know that's the current suggestion. But the DIP can of course be enhanced and updated.

Yes, and much better than a DIP is a working solution, so it might very well be that Orbit wins :) I certainly don't have the time or expertise to implement it... -Steve

I guess we just have to see what happens. I will at least continue working on Orbit.

-- /Jacob Carlborg
Aug 11 2011
"Jacob Carlborg" <doob me.com> wrote in message news:j21190$ls8$1 digitalmars.com...Assuming that, you would still need to link with the libraries. I don't know if pragma(lib, ""); could work but I don't like that pragma in general. It's platform dependentThis works cross-platform: pragma(lib, "LibNameWithoutExt"); And if you do need platform-specific, there's always version().and I'm not sure if it works for dynamic libraries. I don't think GDC implements it.A major downside of GDC, IMO.
Aug 11 2011
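For example, paired with version blocks when the library really does have different names per platform (zlib used here only as an illustration):

    // Same source everywhere; only the library name differs.
    version (Windows)
        pragma(lib, "zlib");   // the linker is handed zlib.lib
    else
        pragma(lib, "z");      // libz on Posix, i.e. -lz on the link line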
On 2011-08-11 23:30, Nick Sabalausky wrote:"Jacob Carlborg"<doob me.com> wrote in message news:j21190$ls8$1 digitalmars.com...Will that work with all available library types (static, dynamic) on all platforms?Assuming that, you would still need to link with the libraries. I don't know if pragma(lib, ""); could work but I don't like that pragma in general. It's platform dependentThis works cross-platform: pragma(lib, "LibNameWithoutExt");And if you do need platform-specific, there's always version().You do, since on Posix most library names are prefixed with "lib". DSSS/Rebuild did this very well with the pragma "link". It prefixed the library names depending on the platform.Yes. -- /Jacob Carlborgand I'm not sure if it works for dynamic libraries. I don't think GDC implements it.A major downside of GDC, IMO.
Aug 12 2011
On Thu, Aug 11, 2011 at 5:52 AM, Steven Schveighoffer <schveiguy yahoo.com>wrote:On Thu, 11 Aug 2011 03:49:56 -0400, Jacob Carlborg <doob me.com> wrote: On 2011-08-11 09:41, Jonas Drewsen wrote:This last bit doesn't really come into play here because you can already ask the compiler to output all that information. and easily use it in a separate program. That much is already done.Given that the implementation would be a compiler-used tool, and the tool can implement any protocol it wants, I think it has very few limitations. I envision the tool being able to handle any network protocol or packaging system we want it to. I think the benefit of this approach over a build tool which wraps the compiler is, the compiler already has the information needed for dependencies, etc. To a certain extent, the wrapping build tool has to re-implement some of the compiler pieces.On 11/08/11 09.07, Jacob Carlborg wrote:I think that DIP11 is too limited, for example, it doesn't deal with versions. Orbit combined with a build tool will be seamless as well. RDMD is a great tool but as soon as you need to add compiler flags or compile a library you need either some kind of script or a build tool. And in that case you can just go with the built tool and have it work on all platforms.On 2011-08-10 21:55, jdrewsen wrote:Yes I've noticed that. Seems very promising. What I do like about DIP11 is how seamless it would work. You just have to compile and stuff works. /JonasWhat is the status of DIP11 http://www.wikiservice.at/d/**wiki.cgi?LanguageDevel/DIPs/**DIP11<http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11> Has anyone started implementing it? Has it been rejected? /JonasNot sure, personally I don't like it. Instead I'm working on a more traditional package manager called Orbit: https://github.com/jacob-**carlborg/orbit/wiki/Orbit-** Package-Manager-for-D<https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D>
Aug 11 2011
On Thu, 11 Aug 2011 12:24:48 -0400, Andrew Wiley <wiley.andrew.j gmail.com> wrote:On Thu, Aug 11, 2011 at 5:52 AM, Steven Schveighoffer <schveiguy yahoo.com>wrote:Yes, but then you have to restart the compiler to figure out what's next. Let's say a source file needs a.d, and a.d needs b.d, and both a.d and b.d are on the network. You potentially need to run the compiler 3 times just to make sure you have all the files, then run it a fourth time to compile. And there is no parsing of the output data, the problem boils down to a simple get tool. Running a simple get tool over and over doesn't consume as much time/resources as running the compiler over and over. There are still problems with the DIP -- there is no way yet to say "oh yeah, compiler, you have to build this file that I downloaded too". But if nothing else, I like the approach of having the compiler drive everything. It reduces the problem space to a smaller more focused task -- get a file based on a url. We also already have many tools in existence that can parse a url and download a file/package. -SteveI think the benefit of this approach over a build tool which wraps the compiler is, the compiler already has the information needed for dependencies, etc. To a certain extent, the wrapping build tool has to re-implement some of the compiler pieces.This last bit doesn't really come into play here because you can already ask the compiler to output all that information. and easily use it in a separate program. That much is already done.
Aug 11 2011
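Roughly, the repeated-invocation pattern Steven describes looks like this on the command line; -deps= and -o- are existing dmd switches, while the "dget" fetch step is a hypothetical stand-in for whatever download tool is used:

    dmd -o- -deps=deps.txt main.d        # pass 1: stops because module a cannot be read
    dget http://server/a.d               # fetch a.d (hypothetical tool)
    dmd -o- -deps=deps.txt main.d a.d    # pass 2: now b.d turns out to be missing
    dget http://server/b.d
    dmd main.d a.d b.d                   # everything local at last, build for real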
On 2011-08-11 19:07, Steven Schveighoffer wrote:On Thu, 11 Aug 2011 12:24:48 -0400, Andrew Wiley <wiley.andrew.j gmail.com> wrote:So how would that be different if the compiler drives everything? Say you begin with a few local files. The compiler then scans through them looking for URL imports. Then asks a tool to download the dependencies it found and starts all over again. This is how my package manager will work. You have a local file containing all the direct dependencies needed to build your project. When invoked, the package manager tool fetches a file containing all packages and all their dependencies, from the repository. It then figures out all dependencies, both direct and indirect. Then it downloads all dependencies. It does all this before the compiler is even invoked once. Then, preferably, but optionally, it hands over to a build tool that builds everything. The build tool would need to invoke the compiler twice, first to get all the dependencies of all the local files in the project that is being built. Then it finally runs the compiler to build everything. Well, actually if you're using a build tool it would drive everything. You have the package dependencies in the build script file. The build tool starts by invoking the package manager (see above), then it can query the package manager for include and library paths and libraries to link with. As the final step it invokes the compiler to build everything (see above).On Thu, Aug 11, 2011 at 5:52 AM, Steven Schveighoffer <schveiguy yahoo.com>wrote:Yes, but then you have to restart the compiler to figure out what's next. Let's say a source file needs a.d, and a.d needs b.d, and both a.d and b.d are on the network. You potentially need to run the compiler 3 times just to make sure you have all the files, then run it a fourth time to compile.I think the benefit of this approach over a build tool which wraps the compiler is, the compiler already has the information needed for dependencies, etc. To a certain extent, the wrapping build tool has to re-implement some of the compiler pieces.This last bit doesn't really come into play here because you can already ask the compiler to output all that information. and easily use it in a separate program. That much is already done.And there is no parsing of the output data, the problem boils down to a simple get tool. Running a simple get tool over and over doesn't consume as much time/resources as running the compiler over and over. There are still problems with the DIP -- there is no way yet to say "oh yeah, compiler, you have to build this file that I downloaded too". But if nothing else, I like the approach of having the compiler drive everything. It reduces the problem space to a smaller more focused task -- get a file based on a url. We also already have many tools in existence that can parse a url and download a file/package. -SteveThe best would be if the compiler could be a library. Then the build tool could drive everything and ask other tools, like the a package manager and compiler about information it needs to build everything. -- /Jacob Carlborg
Aug 11 2011
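The "figures out all dependencies, both direct and indirect" step described above boils down to a transitive closure over the repository's metadata. A small D sketch of that step -- the index shape and names here are invented for illustration, not Orbit's actual format:

    // Hypothetical: `index` maps a package name to the packages it depends on,
    // as read from the metadata file fetched from the repository.
    string[] resolve(string[] direct, string[][string] index)
    {
        bool[string] seen;
        auto queue = direct.dup;
        while (queue.length)
        {
            auto pkg = queue[0];
            queue = queue[1 .. $];
            if (pkg in seen)
                continue;
            seen[pkg] = true;
            queue ~= index.get(pkg, null);   // indirect dependencies, if any are listed
        }
        return seen.keys;                    // everything that has to be downloaded
    }

Everything returned by resolve() can then be downloaded before the compiler is started, which is the whole point of doing the resolution up front.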
On Thu, 11 Aug 2011 14:19:35 -0400, Jacob Carlborg <doob me.com> wrote:On 2011-08-11 19:07, Steven Schveighoffer wrote:Forgive my compiler ignorance (not a compiler writer), but why does the compiler have to start over? It's no different than importing a file, is it?On Thu, 11 Aug 2011 12:24:48 -0400, Andrew Wiley <wiley.andrew.j gmail.com> wrote:So how would that be different if the compiler drives everything? Say you begin with a few local files. The compiler then scans through them looking for URL imports. Then asks a tool to download the dependencies it found and starts all over again.On Thu, Aug 11, 2011 at 5:52 AM, Steven Schveighoffer <schveiguy yahoo.com>wrote:Yes, but then you have to restart the compiler to figure out what's next. Let's say a source file needs a.d, and a.d needs b.d, and both a.d and b.d are on the network. You potentially need to run the compiler 3 times just to make sure you have all the files, then run it a fourth time to compile.I think the benefit of this approach over a build tool which wraps the compiler is, the compiler already has the information needed for dependencies, etc. To a certain extent, the wrapping build tool has to re-implement some of the compiler pieces.This last bit doesn't really come into play here because you can already ask the compiler to output all that information. and easily use it in a separate program. That much is already done.This is how my package manager will work. You have a local file containing all the direct dependencies needed to build your project. When invoked, the package manager tool fetches a file containing all packages and all their dependencies, from the repository. It then figures out all dependencies, both direct and indirect. Then it downloads all dependencies. It does all this before the compiler is even invoked once. Then, preferably, but optionally, it hands over to a build tool that builds everything. The build tool would need to invoke the compiler twice, first to get all the dependencies of all the local files in the project that is being built. Then it finally runs the compiler to build everything.The benefit of using source is the source code is already written with an import statement, there is no need to write an external build file (all you need is command line that configures the compiler). Essentially, the import statements become your "build file". I think dsss worked like this, but I don't remember completely. My ideal solution, no matter how it's implemented is, I get a file blah.d, and I do: xyz blah.d and xyz handles all the dirty work of figuring out what to build along with blah.d as well as where to get those resources. Whether xyz == dmd, I don't know. It sure sounds like it could be... -Steve
Aug 11 2011
On 2011-08-11 20:31, Steven Schveighoffer wrote:
On Thu, 11 Aug 2011 14:19:35 -0400, Jacob Carlborg <doob me.com> wrote:
So how would that be different if the compiler drives everything? Say you begin with a few local files. The compiler then scans through them looking for URL imports. Then asks a tool to download the dependencies it found and starts all over again.

Forgive my compiler ignorance (not a compiler writer), but why does the compiler have to start over? It's no different than importing a file, is it?

Probably it's no different. Well, what's different is it will first parse a couple of files, then download a few files, then parse the downloaded files and then fetch some more files and so on. To me this seems inefficient, but since it's not implemented I don't know. It feels more efficient if it could download all needed files in one step, and then compile all files in the next step. I don't know what's possible with this DIP, but to me it seems that the current suggestion will download individual files. This also seems inefficient; my solution deals with packages, i.e. zip files.

This is how my package manager will work. You have a local file containing all the direct dependencies needed to build your project. When invoked, the package manager tool fetches a file containing all packages and all their dependencies, from the repository. It then figures out all dependencies, both direct and indirect. Then it downloads all dependencies. It does all this before the compiler is even invoked once. Then, preferably, but optionally, it hands over to a build tool that builds everything. The build tool would need to invoke the compiler twice, first to get all the dependencies of all the local files in the project that is being built. Then it finally runs the compiler to build everything.

The benefit of using source is the source code is already written with an import statement, there is no need to write an external build file (all you need is command line that configures the compiler).

I don't see the big difference. I think most projects (not the smallest ones) will end up with a special file anyway, containing the pragmas declaring these imports. In addition to that, as soon as you need to pass flags to the compiler you will most likely put that in a file of some kind. In that case you can just as easily put them in a build script and use a build tool.

Essentially, the import statements become your "build file". I think dsss worked like this, but I don't remember completely.

Yes, this is similar to how DSSS worked. The difference is that you didn't need a pragma to link a package to an URL, you just wrote the import declarations as you do now. One problem I think DSSS has, is, as far as I know, it can't handle top level packages with the same name. Or at least not in any good way. If you go with the Java package naming scheme and name your top level package after your domain, example:

module com.foo.bar;
module com.foo.foobar;

And another project does the same:

module com.abc.defg;

Then these two projects will both end up in the "com" folder. Not very good in my opinion. In my solution every package has a name independent of the packages it contains, and all packages are placed in a folder named after the package, including the version number.

My ideal solution, no matter how it's implemented is, I get a file blah.d, and I do: xyz blah.d and xyz handles all the dirty work of figuring out what to build along with blah.d as well as where to get those resources. Whether xyz == dmd, I don't know. It sure sounds like it could be... -Steve

Yeah, I would like that too. But as I said above, as soon as you need compiler flags you need an additional file. With a build tool it can then be just: "$ build"

-- /Jacob Carlborg
Aug 12 2011
On Fri, 12 Aug 2011 03:23:30 -0400, Jacob Carlborg <doob me.com> wrote:On 2011-08-11 20:31, Steven Schveighoffer wrote:The extendability is in the url. For example, yes, http://server/file.d downloads a single file, and would be slow if every import needed to download an individual file. But some other protocol, or some other cue to the download tool (like if the download has a path component that ends in .tgz or something), then you download all the files at once, and on subsequent imports, the cached file is used (no download necessary). I think there's even a suggestion in there for passing directives back to the compiler like "file already downloaded, open it here..." I think unless a single file *is* the package, it's going to be foolish to download individual files. I also think a protocol which defines a central repository would be beneficial. So you only need one -I parameter to include a whole community of D code (like dsource).On Thu, 11 Aug 2011 14:19:35 -0400, Jacob Carlborg <doob me.com> wrote:Probably it's no different. Well what's different is it will first parse a couple of files, then downloads a few files, then parse the downloaded files and the fetch some more files and so on. To me this seems inefficient but since it's not implemented I don't know. It feels more efficient if it could download all needed files in one step. And then compile all files in the next step. I don't know what's possible with this DIP but to me it seems that the current suggestion will download individual files. This also seems inefficient, my solution deals with packages, i.e. zip files.So how would that be different if the compiler drives everything? Say you begin with a few local files. The compiler then scans through them looking for URL imports. Then asks a tool to download the dependencies it found and starts all over again.Forgive my compiler ignorance (not a compiler writer), but why does the compiler have to start over? It's no different than importing a file, is it?Note that the pragmas are specific to that file only. So you don't have an import file which defines pragmas. This is to prevent conflicts between two files that declare the same package override.I don't see the big difference. I think the most project (not the smallest ones) will end up with a special file anyway, containing the pragmas declaring these imports.This is how my package manager will work. You have a local file containing all the direct dependencies needed to build your project. When invoked, the package manager tool fetches a file containing all packages and all their dependencies, from the repository. It then figures out all dependencies, both direct and indirect. Then it downloads all dependencies. It does all this before the compiler is even invoked once. Then, preferably, but optionally, it hands over to a build tool that builds everything. The build tool would need to invoke the compiler twice, first to get all the dependencies of all the local files in the project that is being built. Then it finally runs the compiler to build everything.The benefit of using source is the source code is already written with an import statement, there is no need to write an external build file (all you need is command line that configures the compiler).In addition to that as soon as you need to pass flags to the compiler you will most likely to put that in a file of some kind. 
In that case you can just as easily put them in a build script and use a build tool.a batch files/shell script should suffice, no need for a "special" tool.IIRC, dsss still had a global config file that defined where to import things from. The DIP defines that -I switches can also define internet resources along with pragmas, so sticking those in the dmd.conf would probably be the equivalent.Essentially, the import statements become your "build file". I think dsss worked like this, but I don't remember completely.Yes, this is similar how DSSS worked. The difference is that you didn't need a pragma to link a package to an URL you just wrote the import declarations as you do now.One problem I think DSSS has, is, as far as I know, it can't handle top level packages with same name. Or at least not in any good way. If you go with the Java package naming scheme and name your top level package after your domain, example: module com.foo.bar; module com.foo.foobar; And another project does the same: module com.abc.defg; Then these two projects will both end up in the "com" folder. Not very good in my opinion. In my solution every package has a name independent of the packages it contains an all packages are placed in a folder named after the package, including the version number.These all seem like implementation details. I don't care how the tool caches the files.Or instructions on the web site "use 'dmd -O -inline -release -version=SpecificVersion project.d' to compile" Or build.sh (build.bat) Note that dcollections has no makefile, everything is built from shell scripts. I almost never have to edit the build file, because the line's like: dmd -lib -O -release -inline dcollections/*.d dcollections/model/*.d Any new files get included automatically. And it takes a second to build, so who cares if you rebuild every file every time? Interestingly, libraries would still need to specify all the files since they may not import eachother :) I don't know if there's a "good" solution that isn't too coarse for that. All of this discussion is good to determine the viability, and clarify some misinterpretations of DIP11, but I think unless someone steps up and tries to implement it, it's a moot conversation. I certainly don't have the time or knowledge to implement it. So is there anyone who is interested, or has tried (to re-ask the original question)? -SteveMy ideal solution, no matter how it's implemented is, I get a file blah.d, and I do: xyz blah.d and xyz handles all the dirty work of figuring out what to build along with blah.d as well as where to get those resources. Whether xyz == dmd, I don't know. It sure sounds like it could be... -SteveYeah, I would like that too. But as I said above, as soon as you need compiler flags you need an additional file. With a built tool it can then be just: "$ build"
Aug 12 2011
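As an illustration of the repository-wide -I idea mentioned above -- the exact URL syntax is not settled in the DIP, so treat this purely as a sketch:

    dmd -Ihttp://svn.dsource.org/projects app.d

and the same switch could sit in dmd.conf's DFLAGS line, so that every compile picks up the repository mapping without it being passed explicitly.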
I did write build2.d, which tries to simulate dip11 outside the compiler. http://arsdnet.net/dcode/build2.d Since it's not actually in the compiler, it can't be perfect but it sorta tries.
Aug 12 2011
On 2011-08-12 15:49, Steven Schveighoffer wrote:
Note that the pragmas are specific to that file only. So you don't have an import file which defines pragmas. This is to prevent conflicts between two files that declare the same package override.

Now I'm not quite sure I understand. Are you saying that every file needs to have these pragma imports?

In addition to that, as soon as you need to pass flags to the compiler you will most likely put that in a file of some kind. In that case you can just as easily put them in a build script and use a build tool.

a batch files/shell script should suffice, no need for a "special" tool.

Yeah, but you would need to duplicate the shell script, one for Posix and one for Windows. As far as I know there is no scripting-like file that both Windows and Posix can read out of the box.

Essentially, the import statements become your "build file". I think dsss worked like this, but I don't remember completely.

Yes, this is similar to how DSSS worked. The difference is that you didn't need a pragma to link a package to an URL, you just wrote the import declarations as you do now.

IIRC, dsss still had a global config file that defined where to import things from. The DIP defines that -I switches can also define internet resources along with pragmas, so sticking those in the dmd.conf would probably be the equivalent.

Ok, I see.

One problem I think DSSS has, is, as far as I know, it can't handle top level packages with the same name. Or at least not in any good way. If you go with the Java package naming scheme and name your top level package after your domain, example: module com.foo.bar; module com.foo.foobar; And another project does the same: module com.abc.defg; Then these two projects will both end up in the "com" folder. Not very good in my opinion. In my solution every package has a name independent of the packages it contains, and all packages are placed in a folder named after the package, including the version number.

These all seem like implementation details. I don't care how the tool caches the files.

Well yes, but that is how DSSS works and that's what I'm explaining.

Or instructions on the web site "use 'dmd -O -inline -release -version=SpecificVersion project.d' to compile" Or build.sh (build.bat)

You would need two shell script files, one for Windows and one for Posix, see above.

Note that dcollections has no makefile, everything is built from shell scripts. I almost never have to edit the build file, because the line's like: dmd -lib -O -release -inline dcollections/*.d dcollections/model/*.d Any new files get included automatically. And it takes a second to build, so who cares if you rebuild every file every time? Interestingly, libraries would still need to specify all the files since they may not import each other :) I don't know if there's a "good" solution that isn't too coarse for that.

That's what a build tool can handle. I think you should read this: http://dsource.org/projects/dsss/wiki/DSSSByExample and hopefully you'll understand why a build tool is a good thing.

All of this discussion is good to determine the viability, and clarify some misinterpretations of DIP11, but I think unless someone steps up and tries to implement it, it's a moot conversation. I certainly don't have the time or knowledge to implement it. So is there anyone who is interested, or has tried (to re-ask the original question)? -Steve

I guess that's right.

-- /Jacob Carlborg
Aug 12 2011
On Fri, 12 Aug 2011 14:24:46 -0400, Jacob Carlborg <doob me.com> wrote:On 2011-08-12 15:49, Steven Schveighoffer wrote:Let's say file a.d pragmas that module foo means http://foo.com/projectx, and module b.d from another project pragmas that module foo means http://bar.com/projecty. If I import both a and b, what happens? It only makes sense for a pragma to affect the current file. This is similar to how version=x statements only affect the current file.Note that the pragmas are specific to that file only. So you don't have an import file which defines pragmas. This is to prevent conflicts between two files that declare the same package override.Now I'm not quite sure I understand. Are you saying that every file needs to have these pragma imports ?But I don't have to have the user install yet other build tool. There already are script interpreters on both windows and posix systems. Especially for one-liners.Yeah, but you would need to duplicate the shell script, one for Posix and one for Windows. As far as I know there is no scripting like file that both Windows and Posix can read out of the box.In addition to that as soon as you need to pass flags to the compiler you will most likely to put that in a file of some kind. In that case you can just as easily put them in a build script and use a build tool.a batch files/shell script should suffice, no need for a "special" tool.OK.Well yes, but that is how DSSS works and that's what I'm explaining.One problem I think DSSS has, is, as far as I know, it can't handle top level packages with same name. Or at least not in any good way. If you go with the Java package naming scheme and name your top level package after your domain, example: module com.foo.bar; module com.foo.foobar; And another project does the same: module com.abc.defg; Then these two projects will both end up in the "com" folder. Not very good in my opinion. In my solution every package has a name independent of the packages it contains an all packages are placed in a folder named after the package, including the version number.These all seem like implementation details. I don't care how the tool caches the files.Yes.Or instructions on the web site "use 'dmd -O -inline -release -version=SpecificVersion project.d' to compile" Or build.sh (build.bat) Note that dcollections has no makefile, everything is built from shell scripts. I almost never have to edit the build file, because the line's like:You would need to shell script files, one for Windows and one for Posix, see above.That's what a build tool can handle. I think you should read this: http://dsource.org/projects/dsss/wiki/DSSSByExample and hopefully you'll understand why a built tool is a good thing.I'll take a look when I get a moment. -Steve
Aug 12 2011
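To make the file-local scoping concrete -- the pragma name and argument form below are invented purely for illustration, the DIP does not fix a final syntax:

    // a.d -- maps the package foo to one project's location
    pragma(importpath, "foo=http://foo.com/projectx");
    import foo.widget;

    // b.d -- another project maps the same package name somewhere else
    pragma(importpath, "foo=http://bar.com/projecty");
    import foo.gadget;

If both pragmas escaped their own files, a program importing a and b together would have two contradictory meanings for foo; keeping each pragma local to its file, as Steven suggests, avoids having to pick a winner.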
On 2011-08-12 20:36, Steven Schveighoffer wrote:On Fri, 12 Aug 2011 14:24:46 -0400, Jacob Carlborg <doob me.com> wrote:Again, will that mean you have to specify a pragma for each file? Just for the record, you cannot always solve all dependencies, it can happen that two packages conflict with each other. -- /Jacob CarlborgOn 2011-08-12 15:49, Steven Schveighoffer wrote:Let's say file a.d pragmas that module foo means http://foo.com/projectx, and module b.d from another project pragmas that module foo means http://bar.com/projecty. If I import both a and b, what happens? It only makes sense for a pragma to affect the current file. This is similar to how version=x statements only affect the current file.Note that the pragmas are specific to that file only. So you don't have an import file which defines pragmas. This is to prevent conflicts between two files that declare the same package override.Now I'm not quite sure I understand. Are you saying that every file needs to have these pragma imports ?
Aug 12 2011
On Fri, 12 Aug 2011 15:12:07 -0400, Jacob Carlborg <doob me.com> wrote:On 2011-08-12 20:36, Steven Schveighoffer wrote:Yes, or specify it on the command line/config file. I think the risk of inadvertently importing incorrect files is too great. Looking back at it, however, we probably would need some mechanism for files in the same package to inherit the source location. For example, if you pragma that a module foo = http://foo.com/projectx, and import foo.xyz, and foo.xyz imports foo.abc, we don't want foo.xyz to have to pragma the same url just to include another file in its own package. Clearly we need some more thought around this.On Fri, 12 Aug 2011 14:24:46 -0400, Jacob Carlborg <doob me.com> wrote:Again, will that mean you have to specify a pragma for each file?On 2011-08-12 15:49, Steven Schveighoffer wrote:Let's say file a.d pragmas that module foo means http://foo.com/projectx, and module b.d from another project pragmas that module foo means http://bar.com/projecty. If I import both a and b, what happens? It only makes sense for a pragma to affect the current file. This is similar to how version=x statements only affect the current file.Note that the pragmas are specific to that file only. So you don't have an import file which defines pragmas. This is to prevent conflicts between two files that declare the same package override.Now I'm not quite sure I understand. Are you saying that every file needs to have these pragma imports ?Just for the record, you cannot always solve all dependencies, it can happen that two packages conflict with each other.As long as the dependencies are contained, there should be no conflict. If I can compile project x and y separately, and both have a conflicting dependency, then I should still be able to compile a project that depends on both x and y, as long as they don't import eachother. -Steve
Aug 12 2011
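A short sketch of the inheritance rule being floated here, with the same caveat that the pragma spelling is illustrative only:

    // app.d
    pragma(importpath, "foo=http://foo.com/projectx");
    import foo.xyz;      // fetched from http://foo.com/projectx/foo/xyz.d

    // foo/xyz.d, once downloaded -- under the proposed rule it needs no pragma of its own:
    module foo.xyz;
    import foo.abc;      // resolved against the base URL it was itself fetched from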
On 2011-08-12 21:22, Steven Schveighoffer wrote:Yes, or specify it on the command line/config file. I think the risk of inadvertently importing incorrect files is too great. Looking back at it, however, we probably would need some mechanism for files in the same package to inherit the source location. For example, if you pragma that a module foo = http://foo.com/projectx, and import foo.xyz, and foo.xyz imports foo.abc, we don't want foo.xyz to have to pragma the same url just to include another file in its own package. Clearly we need some more thought around this.Yeah, that's what I was afraid of.If x and y depends on two different versions of z, how would that be solved. As far as I know you cannot link the same library of two different version twice, you will get conflicting symbols. -- /Jacob CarlborgJust for the record, you cannot always solve all dependencies, it can happen that two packages conflict with each other.As long as the dependencies are contained, there should be no conflict. If I can compile project x and y separately, and both have a conflicting dependency, then I should still be able to compile a project that depends on both x and y, as long as they don't import eachother. -Steve
Aug 13 2011
On Sat, 13 Aug 2011 08:24:53 -0400, Jacob Carlborg <doob me.com> wrote:
On 2011-08-12 21:22, Steven Schveighoffer wrote:

Yes, or specify it on the command line/config file. I think the risk of inadvertently importing incorrect files is too great. Looking back at it, however, we probably would need some mechanism for files in the same package to inherit the source location. For example, if you pragma that a module foo = http://foo.com/projectx, and import foo.xyz, and foo.xyz imports foo.abc, we don't want foo.xyz to have to pragma the same url just to include another file in its own package. Clearly we need some more thought around this.

Yeah, that's what I was afraid of.

Just for the record, you cannot always solve all dependencies, it can happen that two packages conflict with each other.

As long as the dependencies are contained, there should be no conflict. If I can compile project x and y separately, and both have a conflicting dependency, then I should still be able to compile a project that depends on both x and y, as long as they don't import each other. -Steve

If x and y depends on two different versions of z, how would that be solved. As far as I know you cannot link the same library of two different version twice, you will get conflicting symbols.

It wouldn't be an actual conflict, it would be a naming issue. In other words, I'm not talking about two different versions of the same library. That has to be a compiler error. What could happen though, is this situation:

a.d pragma's foo.bar as being http://somedrepository.com/foo/barv1.0

b.d includes the repository path http://somedrepository.com, whose default foo.bar goes to foo/barv2.0, but b does not depend on foo.bar

I don't think a.d's pragma should include b's pragma, or vice versa. Especially if it is some sort of weird import order precedent. It's just much simpler to say "b's pragma only affects b, and a's pragma only affects a." However, I think we need to add that any files imported via a pragma implicitly include that path. I should probably add that to the DIP. Obviously if a.d depends on foo.bar, and b.d depends on foo.bar, but the locations are different, it should be a compiler error. -Steve
Aug 15 2011
On 2011-08-15 15:11, Steven Schveighoffer wrote:What could happen though, is this situation: a.d pragma's foo.bar as being http://somedreposiotry.com/foo/barv1.0 b.d includes the repository path http://somedrepository.com, whose default foo.bar goes to foo/barv2.0, but b does not depend on foo.barThat would be a problem.I don't think a.d's pragma should include b's pragma, or vice versa. Especially if it is some sort of weird import order precedent. It's just much simpler to say "b's pragma only affects b, and a's pragma only affects a." However, I think we need to add that any files imported via a pragma implicitly include that path. I should probably add that to the DIP.I guess so.Obviously if a.d depends on foo.bar, and b.d depends on foo.bar, but the locations are different, it should be a compiler error. -SteveOk. My original concern was to have to add pragmas to every D file. That is not a solution that will last long. But if you can add a compiler flag instead I guess that's ok. -- /Jacob Carlborg
Aug 15 2011
On Fri, 12 Aug 2011 15:49:30 +0200, Steven Schveighoffer <schveiguy yahoo.com> wrote:On Fri, 12 Aug 2011 03:23:30 -0400, Jacob Carlborg <doob me.com> wrote:Well, I would give it a try to implement a prototype. Given that we can sort out how to handle the actual building of remote sources. As long we as are only talking about imports this remains not so useful. Some approaches I could think of: I. make 'dmd mysource.d acme.a=http://acme.org/a -Iacme.b=http://acme.org/b' mean build every source that you download from acme.a II. for every import that is actually a source file (*.d) let the compiler decide if linking will be needed, if so build an object for that module III. specify a good naming scheme for distributing binary libraries Actually I only like the second solution. So instead of 'dmd a.d b.d' one could would write 'dmd -build-deps a.d' where a importing b. martinOn 2011-08-11 20:31, Steven Schveighoffer wrote:The extendability is in the url. For example, yes, http://server/file.d downloads a single file, and would be slow if every import needed to download an individual file. But some other protocol, or some other cue to the download tool (like if the download has a path component that ends in .tgz or something), then you download all the files at once, and on subsequent imports, the cached file is used (no download necessary). I think there's even a suggestion in there for passing directives back to the compiler like "file already downloaded, open it here..." I think unless a single file *is* the package, it's going to be foolish to download individual files. I also think a protocol which defines a central repository would be beneficial. So you only need one -I parameter to include a whole community of D code (like dsource).On Thu, 11 Aug 2011 14:19:35 -0400, Jacob Carlborg <doob me.com> wrote:Probably it's no different. Well what's different is it will first parse a couple of files, then downloads a few files, then parse the downloaded files and the fetch some more files and so on. To me this seems inefficient but since it's not implemented I don't know. It feels more efficient if it could download all needed files in one step. And then compile all files in the next step. I don't know what's possible with this DIP but to me it seems that the current suggestion will download individual files. This also seems inefficient, my solution deals with packages, i.e. zip files.So how would that be different if the compiler drives everything? Say you begin with a few local files. The compiler then scans through them looking for URL imports. Then asks a tool to download the dependencies it found and starts all over again.Forgive my compiler ignorance (not a compiler writer), but why does the compiler have to start over? It's no different than importing a file, is it?Note that the pragmas are specific to that file only. So you don't have an import file which defines pragmas. This is to prevent conflicts between two files that declare the same package override.I don't see the big difference. I think the most project (not the smallest ones) will end up with a special file anyway, containing the pragmas declaring these imports.This is how my package manager will work. You have a local file containing all the direct dependencies needed to build your project. When invoked, the package manager tool fetches a file containing all packages and all their dependencies, from the repository. It then figures out all dependencies, both direct and indirect. Then it downloads all dependencies. 
It does all this before the compiler is even invoked once. Then, preferably, but optionally, it hands over to a build tool that builds everything. The build tool would need to invoke the compiler twice, first to get all the dependencies of all the local files in the project that is being built. Then it finally runs the compiler to build everything.The benefit of using source is the source code is already written with an import statement, there is no need to write an external build file (all you need is command line that configures the compiler).In addition to that as soon as you need to pass flags to the compiler you will most likely to put that in a file of some kind. In that case you can just as easily put them in a build script and use a build tool.a batch files/shell script should suffice, no need for a "special" tool.IIRC, dsss still had a global config file that defined where to import things from. The DIP defines that -I switches can also define internet resources along with pragmas, so sticking those in the dmd.conf would probably be the equivalent.Essentially, the import statements become your "build file". I think dsss worked like this, but I don't remember completely.Yes, this is similar how DSSS worked. The difference is that you didn't need a pragma to link a package to an URL you just wrote the import declarations as you do now.One problem I think DSSS has, is, as far as I know, it can't handle top level packages with same name. Or at least not in any good way. If you go with the Java package naming scheme and name your top level package after your domain, example: module com.foo.bar; module com.foo.foobar; And another project does the same: module com.abc.defg; Then these two projects will both end up in the "com" folder. Not very good in my opinion. In my solution every package has a name independent of the packages it contains an all packages are placed in a folder named after the package, including the version number.These all seem like implementation details. I don't care how the tool caches the files.Or instructions on the web site "use 'dmd -O -inline -release -version=SpecificVersion project.d' to compile" Or build.sh (build.bat) Note that dcollections has no makefile, everything is built from shell scripts. I almost never have to edit the build file, because the line's like: dmd -lib -O -release -inline dcollections/*.d dcollections/model/*.d Any new files get included automatically. And it takes a second to build, so who cares if you rebuild every file every time? Interestingly, libraries would still need to specify all the files since they may not import eachother :) I don't know if there's a "good" solution that isn't too coarse for that. All of this discussion is good to determine the viability, and clarify some misinterpretations of DIP11, but I think unless someone steps up and tries to implement it, it's a moot conversation. I certainly don't have the time or knowledge to implement it. So is there anyone who is interested, or has tried (to re-ask the original question)? -SteveMy ideal solution, no matter how it's implemented is, I get a file blah.d, and I do: xyz blah.d and xyz handles all the dirty work of figuring out what to build along with blah.d as well as where to get those resources. Whether xyz == dmd, I don't know. It sure sounds like it could be... -SteveYeah, I would like that too. But as I said above, as soon as you need compiler flags you need an additional file. With a built tool it can then be just: "$ build"
Aug 12 2011
"Steven Schveighoffer" <schveiguy yahoo.com> wrote in message news:op.vz166yaieav7ka localhost.localdomain...On Thu, 11 Aug 2011 12:24:48 -0400, Andrew Wiley <wiley.andrew.j gmail.com> wrote:That's *only* true if you go along with DIP11's misguided file-oriented approach. With a real package manager, none of that is needed. Your app just says "I need packages X, Y and Z." And X, Y and Z do the same for their requirements. This is all trivial metadata. Emphasis on *trivial*. So, before DMD is ever invoked at all, before one line of the source is ever even read, the package manager has make sure that *everything* is *already* right there. No need to go off on some goofy half-cocked "compile/download/complile/download" dance. So DMD *never* needs to be invoked more than twice. Once to get the deps, once to compile. Better yet, if DMD gets the switch --compile-everything-dammit-not-just-the-explicit-files-fr m-the-command-line, then DMD never needs to be invoked more than *once*: Once to figure out the deps *while* being intelligent enough to actually compile all of them.On Thu, Aug 11, 2011 at 5:52 AM, Steven Schveighoffer <schveiguy yahoo.com>wrote:Yes, but then you have to restart the compiler to figure out what's next. Let's say a source file needs a.d, and a.d needs b.d, and both a.d and b.d are on the network. You potentially need to run the compiler 3 times just to make sure you have all the files, then run it a fourth time to compile.I think the benefit of this approach over a build tool which wraps the compiler is, the compiler already has the information needed for dependencies, etc. To a certain extent, the wrapping build tool has to re-implement some of the compiler pieces.This last bit doesn't really come into play here because you can already ask the compiler to output all that information. and easily use it in a separate program. That much is already done.And there is no parsing of the output data,Parsing the .deps file is extremely simple. RDMD does it with one regex. Personally, I think even that is overkill. Better yet, with a switch to make DMD incorporate RDMD's --build-only functionality, there is *still* no parsing of output data. So all in all, there is *nothing* that DIP11 does that can't be done *better* by other (more typical) means.
Aug 11 2011
"Nick Sabalausky" <a a.a> wrote in message news:j219ck$3i0$1 digitalmars.com..."Steven Schveighoffer" <schveiguy yahoo.com> wrote in message news:op.vz166yaieav7ka localhost.localdomain...In other words, DIP11 just reinvents the wheel, poorly.On Thu, 11 Aug 2011 12:24:48 -0400, Andrew Wiley <wiley.andrew.j gmail.com> wrote:That's *only* true if you go along with DIP11's misguided file-oriented approach. With a real package manager, none of that is needed. Your app just says "I need packages X, Y and Z." And X, Y and Z do the same for their requirements. This is all trivial metadata. Emphasis on *trivial*. So, before DMD is ever invoked at all, before one line of the source is ever even read, the package manager has make sure that *everything* is *already* right there. No need to go off on some goofy half-cocked "compile/download/complile/download" dance. So DMD *never* needs to be invoked more than twice. Once to get the deps, once to compile. Better yet, if DMD gets the switch --compile-everything-dammit-not-just-the-explicit-files-fr m-the-command-line, then DMD never needs to be invoked more than *once*: Once to figure out the deps *while* being intelligent enough to actually compile all of them.On Thu, Aug 11, 2011 at 5:52 AM, Steven Schveighoffer <schveiguy yahoo.com>wrote:Yes, but then you have to restart the compiler to figure out what's next. Let's say a source file needs a.d, and a.d needs b.d, and both a.d and b.d are on the network. You potentially need to run the compiler 3 times just to make sure you have all the files, then run it a fourth time to compile.I think the benefit of this approach over a build tool which wraps the compiler is, the compiler already has the information needed for dependencies, etc. To a certain extent, the wrapping build tool has to re-implement some of the compiler pieces.This last bit doesn't really come into play here because you can already ask the compiler to output all that information. and easily use it in a separate program. That much is already done.And there is no parsing of the output data,Parsing the .deps file is extremely simple. RDMD does it with one regex. Personally, I think even that is overkill. Better yet, with a switch to make DMD incorporate RDMD's --build-only functionality, there is *still* no parsing of output data. So all in all, there is *nothing* that DIP11 does that can't be done *better* by other (more typical) means.
Aug 11 2011
On Thu, 11 Aug 2011 15:09:15 -0400, Nick Sabalausky <a a.a> wrote:"Steven Schveighoffer" <schveiguy yahoo.com> wrote in message news:op.vz166yaieav7ka localhost.localdomain...It already does this: import std.stdio; says "I need the package std" The purpose of the DIP is to try and reuse this information without having to have an extraneous file that says "I depend on package std."On Thu, 11 Aug 2011 12:24:48 -0400, Andrew Wiley <wiley.andrew.j gmail.com> wrote:That's *only* true if you go along with DIP11's misguided file-oriented approach. With a real package manager, none of that is needed. Your app just says "I need packages X, Y and Z."On Thu, Aug 11, 2011 at 5:52 AM, Steven Schveighoffer <schveiguy yahoo.com>wrote:Yes, but then you have to restart the compiler to figure out what's next. Let's say a source file needs a.d, and a.d needs b.d, and both a.d and b.d are on the network. You potentially need to run the compiler 3 times just to make sure you have all the files, then run it a fourth time to compile.I think the benefit of this approach over a build tool which wraps the compiler is, the compiler already has the information needed for dependencies, etc. To a certain extent, the wrapping build tool has to re-implement some of the compiler pieces.This last bit doesn't really come into play here because you can already ask the compiler to output all that information. and easily use it in a separate program. That much is already done.And X, Y and Z do the same for their requirements. This is all trivial metadata. Emphasis on *trivial*. So, before DMD is ever invoked at all, before one line of the source is ever even read, the package manager has make sure that *everything* is *already* right there. No need to go off on some goofy half-cocked "compile/download/complile/download" dance.With the DIP, I envision encoding the package in the URL. For example, you do: -Ispiffy=dpm://spiffy.com/latest.dpm And then blah.d imports spiffy.neat. The download tool is given the url dpm://spiffy.com/latest.dpm/neat.d and understanding the d package module (dpm) protocol, downloads the package that contains neat.d, (along with compiled lib) and then pipes the contents of neat.d to the compiler, which treats it just like a file it just read from the filesystem. Then any other file needed from that package is simply piped from the already-downloaded package. It doesn't have to be one-file based, you just need to have a way to map physical packages to module packages. The DIP doesn't explain all this except for the sections titled "Packaging" and "protocols"So DMD *never* needs to be invoked more than twice. Once to get the deps, once to compile. Better yet, if DMD gets the switch --compile-everything-dammit-not-just-the-explicit-files-from-the-command-line, then DMD never needs to be invoked more than *once*: Once to figure out the deps *while* being intelligent enough to actually compile all of them.This would be beneficial to DIP11 as well, since downloading and importing the file is only half the battle. -Steve
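To make the envisioned usage above concrete, a consumer file might look like this sketch; spiffy.com, the dpm protocol and doSomethingNeat are all hypothetical names, and the resolution behaviour in the comments is as described in the post, not an implemented feature:
---
// blah.d -- illustration only. Compiled with something along the lines of:
//   dmd blah.d -Ispiffy=dpm://spiffy.com/latest.dpm
// the import below would make the compiler ask the download tool for
// dpm://spiffy.com/latest.dpm/neat.d; the tool fetches the package once,
// pipes neat.d back, and serves later files from the cached package.
module blah;

import spiffy.neat;

void main()
{
    doSomethingNeat();   // hypothetical function from spiffy.neat
}
---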
Aug 11 2011
On 2011-08-11 21:36, Steven Schveighoffer wrote: be one-file based, you just need to have a way to map physical packages to module packages. The DIP doesn't explain all this except for the sections titled "Packaging" and "protocols". The DIP needs to explain this, is that the whole point? -- /Jacob Carlborg
Aug 12 2011
On Fri, 12 Aug 2011 05:04:13 -0400, Jacob Carlborg <doob me.com> wrote:On 2011-08-11 21:36, Steven Schveighoffer wrote: be one-file based, you just need to have a way to map The DIP focuses on the compiler changes needed + gives rudimentary functionality based on a simple protocol (http). I think the scope of work for creating the tool which does full-fledged packaging is way too much for the DIP, and probably would be counterproductive to create a spec for it right now. The point of those two sections is to remind you not to think in terms of "this only downloads individual files via http", but to think about the future functionality that such a tool could provide once the compiler has been hooked. The details of the tool are purposely left open for design. -Steve physical packages to module packages. The DIP doesn't explain all this except for the sections titled "Packaging" and "protocols". The DIP needs to explain this, is that the whole point?
Aug 12 2011
On 2011-08-12 15:56, Steven Schveighoffer wrote:On Fri, 12 Aug 2011 05:04:13 -0400, Jacob Carlborg <doob me.com> wrote:Ok, I see. -- /Jacob Carlborg On 2011-08-11 21:36, Steven Schveighoffer wrote: be one-file based, you just need to have a way to map The DIP focuses on the compiler changes needed + gives rudimentary functionality based on a simple protocol (http). I think the scope of work for creating the tool which does full-fledged packaging is way too much for the DIP, and probably would be counterproductive to create a spec for it right now. The point of those two sections is to remind you not to think in terms of "this only downloads individual files via http", but to think about the future functionality that such a tool could provide once the compiler has been hooked. The details of the tool are purposely left open for design. -Steve physical packages to module packages. The DIP doesn't explain all this except for the sections titled "Packaging" and "protocols". The DIP needs to explain this, is that the whole point?
Aug 12 2011
On 8/11/11 1:09 PM, Nick Sabalausky wrote:"Steven Schveighoffer"<schveiguy yahoo.com> wrote in message news:op.vz166yaieav7ka localhost.localdomain...It's difficult to get all dependencies when not all sources have been yet downloaded. AndreiOn Thu, 11 Aug 2011 12:24:48 -0400, Andrew Wiley <wiley.andrew.j gmail.com> wrote:That's *only* true if you go along with DIP11's misguided file-oriented approach. With a real package manager, none of that is needed. Your app just says "I need packages X, Y and Z." And X, Y and Z do the same for their requirements. This is all trivial metadata. Emphasis on *trivial*. So, before DMD is ever invoked at all, before one line of the source is ever even read, the package manager has make sure that *everything* is *already* right there. No need to go off on some goofy half-cocked "compile/download/complile/download" dance. So DMD *never* needs to be invoked more than twice. Once to get the deps, once to compile. Better yet, if DMD gets the switch --compile-everything-dammit-not-just-the-explicit-files-from-the-command-line, then DMD never needs to be invoked more than *once*: Once to figure out the deps *while* being intelligent enough to actually compile all of them.On Thu, Aug 11, 2011 at 5:52 AM, Steven Schveighoffer <schveiguy yahoo.com>wrote:Yes, but then you have to restart the compiler to figure out what's next. Let's say a source file needs a.d, and a.d needs b.d, and both a.d and b.d are on the network. You potentially need to run the compiler 3 times just to make sure you have all the files, then run it a fourth time to compile.I think the benefit of this approach over a build tool which wraps the compiler is, the compiler already has the information needed for dependencies, etc. To a certain extent, the wrapping build tool has to re-implement some of the compiler pieces.This last bit doesn't really come into play here because you can already ask the compiler to output all that information. and easily use it in a separate program. That much is already done.
Aug 11 2011
"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message news:j21g1a$ea4$1 digitalmars.com...It's difficult to get all dependencies when not all sources have been yet downloaded.With DIP11, yes. With a traditional-style package manager, no.
Aug 11 2011
On Thu, 11 Aug 2011 17:20:04 -0400, Nick Sabalausky <a a.a> wrote:"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message news:j21g1a$ea4$1 digitalmars.com...With either style, you need to download a package in order to determine if you need to download other packages (package a may depend on package b even though your project does not depend on package b). The DIP11 version does this JIT, whereas your version does it before compilation. It's not really any different. -SteveIt's difficult to get all dependencies when not all sources have been yet downloaded.With DIP11, yes. With a traditional-style package manager, no.
Aug 11 2011
"Steven Schveighoffer" <schveiguy yahoo.com> wrote:On Thu, 11 Aug 2011 17:20:04 -0400, Nick Sabalausky <a a.a> wrote:In Debian's apt, there will be a central index that records all dependencies for packages in the repository. So the client only needs to synchronize that index file regularly. The system will know package A depends on B which depends on C and D and download all 4 packages. That said, since you need to download the pakcgaes anyway, having a central index doesn't reduce the bytes you need to transfer and parse if DIP11 doesn't support updating."Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message news:j21g1a$ea4$1 digitalmars.com...With either style, you need to download a package in order to determine if you need to download other packages (package a may depend on package b even though your project does not depend on package b). The DIP11 version does this JIT, whereas your version does it before compilation. It's not really any different. -SteveIt's difficult to get all dependencies when not all sources have been >> yet downloaded.With DIP11, yes. With a traditional-style package manager, no.
Aug 11 2011
On 2011-08-12 08:17, kennytm wrote:"Steven Schveighoffer"<schveiguy yahoo.com> wrote:ExactlyOn Thu, 11 Aug 2011 17:20:04 -0400, Nick Sabalausky<a a.a> wrote:In Debian's apt, there will be a central index that records all dependencies for packages in the repository. So the client only needs to synchronize that index file regularly. The system will know package A depends on B which depends on C and D and download all 4 packages."Andrei Alexandrescu"<SeeWebsiteForEmail erdani.org> wrote in message news:j21g1a$ea4$1 digitalmars.com...With either style, you need to download a package in order to determine if you need to download other packages (package a may depend on package b even though your project does not depend on package b). The DIP11 version does this JIT, whereas your version does it before compilation. It's not really any different. -SteveIt's difficult to get all dependencies when not all sources have been>> yet downloaded.With DIP11, yes. With a traditional-style package manager, no.That said, since you need to download the pakcgaes anyway, having a central index doesn't reduce the bytes you need to transfer and parse if DIP11 doesn't support updating.No, but I'm guessing it's more efficient to download zip files instead of individual D files. -- /Jacob Carlborg
Aug 12 2011
On Fri, 12 Aug 2011 02:17:36 -0400, kennytm <kennytm gmail.com> wrote:"Steven Schveighoffer" <schveiguy yahoo.com> wrote:This could be done (in fact, the tool could have a url system that uses apt). But it's beyond the scope of the DIP, which is to define a way to hook the compiler for a downloading tool when a file is on the Internet.On Thu, 11 Aug 2011 17:20:04 -0400, Nick Sabalausky <a a.a> wrote:In Debian's apt, there will be a central index that records all dependencies for packages in the repository. So the client only needs to synchronize that index file regularly. The system will know package A depends on B which depends on C and D and download all 4 packages."Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message news:j21g1a$ea4$1 digitalmars.com...With either style, you need to download a package in order to determine if you need to download other packages (package a may depend on package b even though your project does not depend on package b). The DIP11 version does this JIT, whereas your version does it before compilation. It's not really any different. -SteveIt's difficult to get all dependencies when not all sources have been >> yet downloaded.With DIP11, yes. With a traditional-style package manager, no.That said, since you need to download the pakcgaes anyway, having a central index doesn't reduce the bytes you need to transfer and parse if DIP11 doesn't support updating.All the DIP does is provide a hook so the compiler can ask an external tool to download files/packages/whatever. It leaves open the protocol used by the tool, except that it should be in url form. The tool can implement the download any way it wants (individual files, packages, full-blown metadata-based centralized-indexed kitchen-sinked github or apt or whatever). -Steve
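A minimal sketch of what such an external download tool could look like, under the assumption that the compiler simply hands it a URL and reads a local path back; the DIP leaves the exact hand-off open, so the interface below (URL in argv, path on stdout) and the tool's name are invented:
---
// dget.d -- hypothetical download helper; the compiler-to-tool interface
// shown here is an assumption, not something the DIP specifies.
import std.file, std.net.curl, std.path, std.stdio;

void main(string[] args)
{
    auto url = args[1];
    auto cacheDir = buildPath(tempDir(), "dget-cache");
    mkdirRecurse(cacheDir);

    auto local = buildPath(cacheDir, url.baseName);
    if (!exists(local))          // reuse anything already downloaded
        download(url, local);

    writeln(local);              // tell the caller where the file ended up
}
---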
Aug 12 2011
On 2011-08-12 00:02, Steven Schveighoffer wrote:On Thu, 11 Aug 2011 17:20:04 -0400, Nick Sabalausky <a a.a> wrote:No, not if you have a single meta file with all the dependencies. -- /Jacob Carlborg"Andrei Alexandrescu" <SeeWebsiteForEmail erdani.org> wrote in message news:j21g1a$ea4$1 digitalmars.com...With either style, you need to download a package in order to determine if you need to download other packages (package a may depend on package b even though your project does not depend on package b). The DIP11 version does this JIT, whereas your version does it before compilation. It's not really any different. -SteveIt's difficult to get all dependencies when not all sources have been yet downloaded.With DIP11, yes. With a traditional-style package manager, no.
Aug 12 2011
On 2011-08-11 23:02, Andrei Alexandrescu wrote:It's difficult to get all dependencies when not all sources have been yet downloaded. AndreiNo, not when you have a single meta file containing all the dependencies of all packages. -- /Jacob Carlborg
Aug 12 2011
On 8/12/11 11:05 AM, Jacob Carlborg wrote:On 2011-08-11 23:02, Andrei Alexandrescu wrote:Or a »spec« file along with each package containing the metadata, e.g. name, description, requirements, conflicts, … DavidIt's difficult to get all dependencies when not all sources have been yet downloaded. AndreiNo, not when you have a single meta file containing all the dependencies of all packages.
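One possible shape for such a per-package spec, expressed here as a D struct purely for illustration; the field names are assumptions, not a defined format:
---
// Illustration only: per-package metadata of the kind mentioned above.
struct PackageSpec
{
    string name;            // e.g. "orange"
    string description;     // one-line summary
    string version_;        // e.g. "1.0.0"
    string[] requirements;  // packages this one depends on
    string[] conflicts;     // packages it cannot be installed alongside
}
---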
Aug 12 2011
On 8/12/11 3:05 AM, Jacob Carlborg wrote:On 2011-08-11 23:02, Andrei Alexandrescu wrote:I understand. I believe there is value in avoiding meta files. AndreiIt's difficult to get all dependencies when not all sources have been yet downloaded. AndreiNo, not when you have a single meta file containing all the dependencies of all packages.
Aug 12 2011
On 2011-08-12 16:11, Andrei Alexandrescu wrote:On 8/12/11 3:05 AM, Jacob Carlborg wrote:This meta file lives on in the repository and is an implementation detail that the user never needs to know about. Whether this meta file can be created without meta files for individual packages, I don't know. -- /Jacob Carlborg On 2011-08-11 23:02, Andrei Alexandrescu wrote:I understand. I believe there is value in avoiding meta files. Andrei It's difficult to get all dependencies when not all sources have been yet downloaded. Andrei No, not when you have a single meta file containing all the dependencies of all packages.
Aug 12 2011
It really looks sound to me.
---
module myfile;
pragma(imppath, "dep=www.dpan.org/dep");
import dep.a;
---
remote file
---
module dep.a;
// link directives
pragma(libpath, "dep=www.dpan.org/dep");
pragma(lib, "dep");
// or alternatively some new pragma like this to cause linking the imported package
pragma(build, "dep");
// additional dependency
pragma(imppath, "dep2=www.dpan.org/dep2");
---
Versioning can be easily resolved by using urls like www.dpan.org/dep?version=0.45 or www.dpan.org/dep-greaterthan-0.45. Pro: - scales from local dependencies to VPN sharing to package websites to even dynamically generated source code - simple implementation compilerwise Con: - doesn't help with mixed source builds except for prebuilt libraries Some thoughts: - allowing packages/libs to be packed in zip files would be nice as it's getting closer to single file packages - remote locations would need an accompanying hash file per source or a checksum index to allow local caching - sorting out some security issues would be nice but I've never seen anything in other package managers On Thu, 11 Aug 2011 09:49:56 +0200, Jacob Carlborg <doob me.com> wrote:On 2011-08-11 09:41, Jonas Drewsen wrote:<snip>On 11/08/11 09.07, Jacob Carlborg wrote:Yes I've noticed that. Seems very promising. What I do like about DIP11 is how seamless it would work. You just have to compile and stuff works. /JonasI think that DIP11 is too limited, for example, it doesn't deal with versions. Orbit combined with a build tool will be seamless as well. RDMD is a great tool but as soon as you need to add compiler flags or compile a library you need either some kind of script or a build tool. And in that case you can just go with the build tool and have it work on all platforms.
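Going back to the pragmas sketched at the top of this post, a consumer might look like the following; pragma(imppath) is the pragma proposed in this thread, not an existing dmd feature, and frobnicate() is a made-up symbol assumed to live in dep.a:
---
// myapp.d -- hypothetical use of the versioned-URL idea above.
module myapp;

// Pin the "dep" package to a specific release via the URL, as suggested.
pragma(imppath, "dep=www.dpan.org/dep?version=0.45");

import dep.a;   // resolved through the mapping above and fetched on demand

void main()
{
    frobnicate();
}
---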
Aug 11 2011
On 2011-08-12 00:42, Martin Nowak wrote:It really looks sound to me. --- module myfile; pragma(imppath, "dep=www.dpan.org/dep"); import dep.a; --- remote file --- module dep.a; // link directives pragma(libpath, "dep=www.dpan.org/dep"); pragma(lib, "dep"); // or alternatively some new pragma like this to cause linking the imported package pragma(build, "dep") // additional dependency pragma(imppatch, "dep2=www.dpan.org/dep2"); --- Versioning can be easily resolved by using urls like www.dpan.org/dep?version=0.45 or www.dpan.org/dep-greaterthan-0.45. Pro: - scales from local dependencies to VPN sharing to package websites to even dynamically generated source code - simple implementation compilerwiseWe will be very dependent on Walter or anyone else that knows the compiler, which, as far as I know, are quite few. I'm not sure if anything _is_ simple in the compiler. -- /Jacob Carlborg
Aug 12 2011
On Fri, 12 Aug 2011 05:12:47 -0400, Jacob Carlborg <doob me.com> wrote:On 2011-08-12 00:42, Martin Nowak wrote:This is not true. The compiler implements *hooks* for a download tool. The download tool will be a separate process which turns urls (generated by the compiler) into source files. Once the hooks are implemented, the tool is independent, and we would be idiotic not to implement it in D. I think you may not have read the DIP fully, or it is not clear enough. -SteveIt really looks sound to me. --- module myfile; pragma(imppath, "dep=www.dpan.org/dep"); import dep.a; --- remote file --- module dep.a; // link directives pragma(libpath, "dep=www.dpan.org/dep"); pragma(lib, "dep"); // or alternatively some new pragma like this to cause linking the imported package pragma(build, "dep") // additional dependency pragma(imppatch, "dep2=www.dpan.org/dep2"); --- Versioning can be easily resolved by using urls like www.dpan.org/dep?version=0.45 or www.dpan.org/dep-greaterthan-0.45. Pro: - scales from local dependencies to VPN sharing to package websites to even dynamically generated source code - simple implementation compilerwiseWe will be very dependent on Walter or anyone else that knows the compiler, which, as far as I know, are quite few. I'm not sure if anything _is_ simple in the compiler.
Aug 12 2011
On 2011-08-12 15:53, Steven Schveighoffer wrote:This is not true. The compiler implements *hooks* for a download tool. The download tool will be a separate process which turns urls (generated by the compiler) into source files. Once the hooks are implemented, the tool is independent, and we would be idiotic not to implement it in D. I think you may not have read the DIP fully, or it is not clear enough. -SteveI've read the whole DIP and I know there is an external tool that downloads the files. I also know that DMD doesn't have these hooks, I rest my case. -- /Jacob Carlborg
Aug 12 2011
On Fri, 12 Aug 2011 12:30:42 -0400, Jacob Carlborg <doob me.com> wrote:On 2011-08-12 15:53, Steven Schveighoffer wrote:I thought you meant we would be dependent on Walter to write the *download part* of the DIP, not the compiler hooks. The hooks should be pretty simple I would think. But in any case, you should have a look at github and all the new people who are working on pull requests for the compiler. The group of dmd code contributors has significantly grown. -SteveThis is not true. The compiler implements *hooks* for a download tool. The download tool will be a separate process which turns urls (generated by the compiler) into source files. Once the hooks are implemented, the tool is independent, and we would be idiotic not to implement it in D. I think you may not have read the DIP fully, or it is not clear enough. -SteveI've read the whole DIP and I know there is an external tool that downloads the files. I also know that DMD doesn't have these hooks, I rest my case.
Aug 12 2011
On 2011-08-12 19:08, Steven Schveighoffer wrote:On Fri, 12 Aug 2011 12:30:42 -0400, Jacob Carlborg <doob me.com> wrote:Yeah, that's true and it's a good thing. I can tell you this, I've looked at the DMD source code and tried to do modifications but I couldn't understand much of the code at all and failed with the most simple things. -- /Jacob CarlborgOn 2011-08-12 15:53, Steven Schveighoffer wrote:I thought you meant we would be dependent on Walter to write the *download part* of the DIP, not the compiler hooks. The hooks should be pretty simple I would think. But in any case, you should have a look at github and all the new people who are working on pull requests for the compiler. The group of dmd code contributors has significantly grown. -SteveThis is not true. The compiler implements *hooks* for a download tool. The download tool will be a separate process which turns urls (generated by the compiler) into source files. Once the hooks are implemented, the tool is independent, and we would be idiotic not to implement it in D. I think you may not have read the DIP fully, or it is not clear enough. -SteveI've read the whole DIP and I know there is an external tool that downloads the files. I also know that DMD doesn't have these hooks, I rest my case.
Aug 12 2011
On Fri, 12 Aug 2011 14:31:28 -0400, Jacob Carlborg <doob me.com> wrote:On 2011-08-12 19:08, Steven Schveighoffer wrote:/me in same boat as you -SteveOn Fri, 12 Aug 2011 12:30:42 -0400, Jacob Carlborg <doob me.com> wrote:Yeah, that's true and it's a good thing. I can tell you this, I've looked at the DMD source code and tried to do modifications but I couldn't understand much of the code at all and failed with the most simple things.On 2011-08-12 15:53, Steven Schveighoffer wrote:I thought you meant we would be dependent on Walter to write the *download part* of the DIP, not the compiler hooks. The hooks should be pretty simple I would think. But in any case, you should have a look at github and all the new people who are working on pull requests for the compiler. The group of dmd code contributors has significantly grown. -SteveThis is not true. The compiler implements *hooks* for a download tool. The download tool will be a separate process which turns urls (generated by the compiler) into source files. Once the hooks are implemented, the tool is independent, and we would be idiotic not to implement it in D. I think you may not have read the DIP fully, or it is not clear enough. -SteveI've read the whole DIP and I know there is an external tool that downloads the files. I also know that DMD doesn't have these hooks, I rest my case.
Aug 12 2011
"Jonas Drewsen" <jdrewsen nospam.com> wrote in message news:j20139$2sev$1 digitalmars.com...On 11/08/11 09.07, Jacob Carlborg wrote:I really see it as solving the wrong problem the wrong way. DIP11 tries to solve two main things: 1. Automatic downloading of external dependencies. 2. Eliminate the need for a separate invokation of DMD to find all needed D files (ie, efficiency). (and by that I'm referring to the "package manager" concept of "packages", not D's module system). DIP11 handles this at the individual source-file level, which I believe is wrong and causes plenty of problems. Plus, DIP11 is very, very limited compared to a tradtional-style package manager (and pretty much has to be since it works at the individual source-file level) and thus encourages very bad things like hardcoding exactly one possible location from which to retrieve a given file. incorporating RDMD's "--build-only" functionality into the compiler (as an optional flag). This has two benefits over DIP11: A, It just works, nobody ever has to deal with DIP11's weird "DMD wants X to be retrieved", umm, "callback" mechanism. B, It's not tied to DIP11's broken package management design. all: It may download the needed D files, but it won't compile them. A separate tool *still* has to pass all the D files to DMD. And to do that it needs to know what D files are needed. And to do that, it needs to either ask DMD or be told by DMD, and *then* send all the D files to DMD as a separate invokation. The motivation for DIP11 was clearly to get a D package-retreiving system up and running with minimal effort. And that would be great if it were a good proposal. But DIP11 just smacks of corner-cutting.On 2011-08-10 21:55, jdrewsen wrote:Yes I've noticed that. Seems very promising. What I do like about DIP11 is how seamless it would work. You just have to compile and stuff works.What is the status of DIP11 http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11 Has anyone started implementing it? Has it been rejected? /JonasNot sure, personally I don't like it. Instead I'm working on a more traditional package manager called Orbit: https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D
Aug 11 2011
On 2011-08-11 17:41, Nick Sabalausky wrote:"Jonas Drewsen"<jdrewsen nospam.com> wrote in message news:j20139$2sev$1 digitalmars.com...I completely agree with this. -- /Jacob CarlborgOn 11/08/11 09.07, Jacob Carlborg wrote:I really see it as solving the wrong problem the wrong way. DIP11 tries to solve two main things: 1. Automatic downloading of external dependencies. 2. Eliminate the need for a separate invokation of DMD to find all needed D files (ie, efficiency). (and by that I'm referring to the "package manager" concept of "packages", not D's module system). DIP11 handles this at the individual source-file level, which I believe is wrong and causes plenty of problems. Plus, DIP11 is very, very limited compared to a tradtional-style package manager (and pretty much has to be since it works at the individual source-file level) and thus encourages very bad things like hardcoding exactly one possible location from which to retrieve a given file. incorporating RDMD's "--build-only" functionality into the compiler (as an optional flag). This has two benefits over DIP11: A, It just works, nobody ever has to deal with DIP11's weird "DMD wants X to be retrieved", umm, "callback" mechanism. B, It's not tied to DIP11's broken package management design. all: It may download the needed D files, but it won't compile them. A separate tool *still* has to pass all the D files to DMD. And to do that it needs to know what D files are needed. And to do that, it needs to either ask DMD or be told by DMD, and *then* send all the D files to DMD as a separate invokation. The motivation for DIP11 was clearly to get a D package-retreiving system up and running with minimal effort. And that would be great if it were a good proposal. But DIP11 just smacks of corner-cutting.On 2011-08-10 21:55, jdrewsen wrote:Yes I've noticed that. Seems very promising. What I do like about DIP11 is how seamless it would work. You just have to compile and stuff works.What is the status of DIP11 http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11 Has anyone started implementing it? Has it been rejected? /JonasNot sure, personally I don't like it. Instead I'm working on a more traditional package manager called Orbit: https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D
Aug 11 2011