digitalmars.D - Package manager - interacting with the compiler
- Jacob Carlborg (37/37) Dec 10 2011 I think I've come so far in my development of a package manager that
- J Arrizza (24/31) Dec 10 2011 This will have to handle cross-compilations and multiple build variants ...
- Jacob Carlborg (11/43) Dec 11 2011 The package manager just invokes a build tool, like make, rdmd, dsss,
- Martin Nowak (9/64) Dec 11 2011 I think a useful approach is to implement
- Jacob Carlborg (10/77) Dec 12 2011 I don't like the ideas in that DIP. I don't think the packages should
- jdrewsen (12/50) Dec 10 2011 For use case 1 the package manager could just as well call dmd
- Jonathan M Davis (24/85) Dec 10 2011 This brings up an interesting situation. In general, I don't think that ...
- Jacob Carlborg (11/40) Dec 11 2011 Currently you specify the build tool in the specification file, which
- J Arrizza (16/16) Dec 10 2011 A few other potential twists.
- Jacob Carlborg (7/23) Dec 11 2011 Currently by default the package manager installs everything in (on
- Marco Leise (3/29) Dec 18 2011 You have to have super user rights for the default. Maven installs
- Jacob Carlborg (5/36) Dec 19 2011 I have thought of that as well, I've just picked a folder for now. It
- Jacob Carlborg (10/70) Dec 11 2011 I was thinking that the package manager just invokes a build tool like
- jdrewsen (9/105) Dec 12 2011 And for that I think the pkg-config method is the way to go.
- Jacob Carlborg (6/19) Dec 12 2011 No, I basically just picked a random name. I tried of trying to come up
- Chad J (6/9) Dec 11 2011 o.O
- Jacob Carlborg (4/13) Dec 12 2011 It's basically RubyGems but for D. It's great to hear that someone likes...
- Chad J (32/49) Dec 13 2011 OK, cool. I should probably mention some of the things I like about
- Jacob Carlborg (11/26) Dec 13 2011 That might be a good idea. I had only planned to list all installed
- Chad J (15/51) Dec 13 2011 Would you allow others to implement this, or somehow be open to it in
- Jacob Carlborg (6/28) Dec 13 2011 It might happen in the future. But currently I think it's unnecessary
I think I've come so far in my development of a package manager that it's time to think how it should interact with the compiler. Currently I see two use cases: 1. When the package manager installs (and builds) a package 2. When a user (developer) builds a project and want's to use installed packages In the best of worlds the user wouldn't have to do anything and it just works. The package manger needs to somehow pass import paths to the compiler and libraries to link with. I'm not entirely sure what the best method to do this would be. But I'm thinking that if the compiler could accept compiler flags passed via environment variables use case 1 would be easy to implement. For use case 2 it would be a bit more problematic. In this use case the user would need to somehow tell the package manager that I want to use these packages, something like: // project.obspec orb "foo" orb "bar" $ orb use project.obspec or for single packages $ orb use foobar $ dmd project.d If environment variables are used in this case, then the package manager would need a shell script wrapper, the same way as DVM does it, to be able to set environment variables for the parent (the shell). The reason for this is that a child process (the package manager) can't set environment variables for the parent process (the shell). This complicates the implementation and installation of the package manager and requires different implementations for Posix and Windows. Another idea would be to manipulate the dmd.conf/sc.ini file but that seems to be quite complicated and messy. On the other hand, this wouldn't require any changes to the compiler. Any other ideas? https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D https://github.com/jacob-carlborg/orbit -- /Jacob Carlborg
Dec 10 2011
Jacob, On Sat, Dec 10, 2011 at 12:55 AM, Jacob Carlborg <doob me.com> wrote:Currently I see two use cases: 1. When the package manager installs (and builds) a packageThis will have to handle cross-compilations and multiple build variants per platform. Multiple platforms are needed especially for embedded work (simulation vs real binaries) and multiple build variants are needed, at least Debug vs Release variants. Also multiple projects require a set of config files per project. There would be a lot of commonality between project config files but that's ok. The idea of "inheriting" from a common config file can cause a lot of problems. In all cases, the config file(s) need to be version controlled per project since they are unique to the project generating the build.2. When a user (developer) builds a project and want's to use installed packagesIf environment variables are used in this case, then the package manager would need a shell script wrapper, the same way as DVM does it, to be able to set environment variables for the parent (the shell).Please no environment variables for anything. If all config info is in a file and that file is version controlled then keeping tight control over build configurations is much easier. For the same reasons, it would be ideal to have a method that dumps all versions of all packages used by a particular build variant. This could be generated and saved in version control for audit reasons, or, to go the extra step, it could be compared against during every build. This ensures that all components have not been inadvertently changed, i.e. all config changes are done in a controlled way. I'm not sure if any of that points the way to implement the builds any clearer... John
Dec 10 2011
On 2011-12-10 22:13, J Arrizza wrote:Jacob, On Sat, Dec 10, 2011 at 12:55 AM, Jacob Carlborg <doob me.com <mailto:doob me.com>> wrote: Currently I see two use cases: 1. When the package manager installs (and builds) a package This will have to handle cross-compilations and multiple build variants per platform. Multiple platforms are needed especially for embedded work (simulation vs real binaries) and multiple build variants are needed, at least Debug vs Release variants. Also multiple projects require a set of config files per project. There would be a lot of commonality between project config files but that's ok. The idea of "inheriting" from a common config file can cause a lot of problems. In all cases, the config file(s) need to be version controlled per project since they are unique to the project generating the build.The package manager just invokes a build tool, like make, rdmd, dsss, shell script and so on.2. When a user (developer) builds a project and want's to use installed packages If environment variables are used in this case, then the package manager would need a shell script wrapper, the same way as DVM does it, to be able to set environment variables for the parent (the shell). Please no environment variables for anything. If all config info is in a file and that file is version controlled then keeping tight control over build configurations is much easier. For the same reasons, it would be ideal to have a method that dumps all versions of all packages used by a particular build variant. This could be generated and saved in version control for audit reasons, or, to go the extra step, it could be compared against during every build. This ensures that all components have not been inadvertently changed, i.e. all config changes are done in a controlled way. I'm not sure if any of that points the way to implement the builds any clearer... JohnI'm not sure but I think you missed the point (or I missed your point). The projects will have a file indicating which other project they depends on. The when the package manager installs a project it will compile the project. When it's compiled the package manager needs to somehow tell the compiler what imports path to use and libraries to link with. -- /Jacob Carlborg
Dec 11 2011
On Sun, 11 Dec 2011 22:15:26 +0100, Jacob Carlborg <doob me.com> wrote:On 2011-12-10 22:13, J Arrizza wrote:I think a useful approach is to implement http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP13 and map import paths to packages. It allows to handle different versions and hide undeclared dependencies, i.e. no accidental imports. As this only works when building packages every installed package should have a symlink for their most recent version in a common import directory so that plain dmd builds can use the packages. martinJacob, On Sat, Dec 10, 2011 at 12:55 AM, Jacob Carlborg <doob me.com <mailto:doob me.com>> wrote: Currently I see two use cases: 1. When the package manager installs (and builds) a package This will have to handle cross-compilations and multiple build variants per platform. Multiple platforms are needed especially for embedded work (simulation vs real binaries) and multiple build variants are needed, at least Debug vs Release variants. Also multiple projects require a set of config files per project. There would be a lot of commonality between project config files but that's ok. The idea of "inheriting" from a common config file can cause a lot of problems. In all cases, the config file(s) need to be version controlled per project since they are unique to the project generating the build.The package manager just invokes a build tool, like make, rdmd, dsss, shell script and so on.2. When a user (developer) builds a project and want's to use installed packages If environment variables are used in this case, then the package manager would need a shell script wrapper, the same way as DVM does it, to be able to set environment variables for the parent (the shell). Please no environment variables for anything. If all config info is in a file and that file is version controlled then keeping tight control over build configurations is much easier. For the same reasons, it would be ideal to have a method that dumps all versions of all packages used by a particular build variant. This could be generated and saved in version control for audit reasons, or, to go the extra step, it could be compared against during every build. This ensures that all components have not been inadvertently changed, i.e. all config changes are done in a controlled way. I'm not sure if any of that points the way to implement the builds any clearer... JohnI'm not sure but I think you missed the point (or I missed your point). The projects will have a file indicating which other project they depends on. The when the package manager installs a project it will compile the project. When it's compiled the package manager needs to somehow tell the compiler what imports path to use and libraries to link with.
Dec 11 2011
On 2011-12-12 01:36, Martin Nowak wrote:On Sun, 11 Dec 2011 22:15:26 +0100, Jacob Carlborg <doob me.com> wrote:I don't like the ideas in that DIP. I don't think the packages should have symlinks to a common import directory. It will cause problems if the top level package of a library has the same name as some other library, like it is with DWT: org.eclipse.swt org.eclipse.jface And so on. -- /Jacob CarlborgOn 2011-12-10 22:13, J Arrizza wrote:I think a useful approach is to implement http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP13 and map import paths to packages. It allows to handle different versions and hide undeclared dependencies, i.e. no accidental imports. As this only works when building packages every installed package should have a symlink for their most recent version in a common import directory so that plain dmd builds can use the packages. martinJacob, On Sat, Dec 10, 2011 at 12:55 AM, Jacob Carlborg <doob me.com <mailto:doob me.com>> wrote: Currently I see two use cases: 1. When the package manager installs (and builds) a package This will have to handle cross-compilations and multiple build variants per platform. Multiple platforms are needed especially for embedded work (simulation vs real binaries) and multiple build variants are needed, at least Debug vs Release variants. Also multiple projects require a set of config files per project. There would be a lot of commonality between project config files but that's ok. The idea of "inheriting" from a common config file can cause a lot of problems. In all cases, the config file(s) need to be version controlled per project since they are unique to the project generating the build.The package manager just invokes a build tool, like make, rdmd, dsss, shell script and so on.2. When a user (developer) builds a project and want's to use installed packages If environment variables are used in this case, then the package manager would need a shell script wrapper, the same way as DVM does it, to be able to set environment variables for the parent (the shell). Please no environment variables for anything. If all config info is in a file and that file is version controlled then keeping tight control over build configurations is much easier. For the same reasons, it would be ideal to have a method that dumps all versions of all packages used by a particular build variant. This could be generated and saved in version control for audit reasons, or, to go the extra step, it could be compared against during every build. This ensures that all components have not been inadvertently changed, i.e. all config changes are done in a controlled way. I'm not sure if any of that points the way to implement the builds any clearer... JohnI'm not sure but I think you missed the point (or I missed your point). The projects will have a file indicating which other project they depends on. The when the package manager installs a project it will compile the project. When it's compiled the package manager needs to somehow tell the compiler what imports path to use and libraries to link with.
Dec 12 2011
On Saturday, 10 December 2011 at 08:55:57 UTC, Jacob Carlborg wrote:I think I've come so far in my development of a package manager that it's time to think how it should interact with the compiler. Currently I see two use cases: 1. When the package manager installs (and builds) a package 2. When a user (developer) builds a project and want's to use installed packages In the best of worlds the user wouldn't have to do anything and it just works. The package manger needs to somehow pass import paths to the compiler and libraries to link with. I'm not entirely sure what the best method to do this would be. But I'm thinking that if the compiler could accept compiler flags passed via environment variables use case 1 would be easy to implement. For use case 2 it would be a bit more problematic. In this use case the user would need to somehow tell the package manager that I want to use these packages, something like: // project.obspec orb "foo" orb "bar" $ orb use project.obspec or for single packages $ orb use foobar $ dmd project.d If environment variables are used in this case, then the package manager would need a shell script wrapper, the same way as DVM does it, to be able to set environment variables for the parent (the shell). The reason for this is that a child process (the package manager) can't set environment variables for the parent process (the shell). This complicates the implementation and installation of the package manager and requires different implementations for Posix and Windows. Another idea would be to manipulate the dmd.conf/sc.ini file but that seems to be quite complicated and messy. On the other hand, this wouldn't require any changes to the compiler. Any other ideas? https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D https://github.com/jacob-carlborg/orbitFor use case 1 the package manager could just as well call dmd directly with the correct flags ie. no need for using environment variables. Use case 2 does not belong to a package manager in my opinion. It is the job of a build tool to configure packages for a project. What would be nice to have support for using packages without a build tool. Maybe something like what pkg-config provides: dmd -ofhello `orb -lib foo` hello.d where "org -lib foo" returns the flags to use the foo package. /Jonas
Dec 10 2011
On Saturday, December 10, 2011 22:17:44 jdrewsen wrote:On Saturday, 10 December 2011 at 08:55:57 UTC, Jacob Carlborg wrote:This brings up an interesting situation. In general, I don't think that a package manager has any business building the project which is pulling in dependencies. However, it _does_ make some sense to build the dependencies on the box that these are being pulled in on, since they're going to have to be built for that box natively. And each of those projects could be using different build tools. One could be using make. Another could be using cmake. Another could be using scons. Etc. So, how is that dealt with? Does each package list its choose build tool as a dependency and the programmer must then make sure that that build tool has been installed on their system by whatever means non-D packages/programs are installed? Or does that mean that packages using the package manager all need to use a specific build tool? And if they do, should the package manager then be that build tool? Or do we make it so that the package manager doesn't actually build _anything_? Rather it pulls in the source along with pre-built binaries for your architecture, and if you want to build it for your manchine specifically, you have to go and built it yourself after it gets pulled down? This is all looking very messing to me. I have no idea how orbit deals with any of this, since I've never really looked at orbit. But it makes for an ugly problem. give a better opinion on it. - Jonathan M DavisI think I've come so far in my development of a package manager that it's time to think how it should interact with the compiler. Currently I see two use cases: 1. When the package manager installs (and builds) a package 2. When a user (developer) builds a project and want's to use installed packages In the best of worlds the user wouldn't have to do anything and it just works. The package manger needs to somehow pass import paths to the compiler and libraries to link with. I'm not entirely sure what the best method to do this would be. But I'm thinking that if the compiler could accept compiler flags passed via environment variables use case 1 would be easy to implement. For use case 2 it would be a bit more problematic. In this use case the user would need to somehow tell the package manager that I want to use these packages, something like: // project.obspec orb "foo" orb "bar" $ orb use project.obspec or for single packages $ orb use foobar $ dmd project.d If environment variables are used in this case, then the package manager would need a shell script wrapper, the same way as DVM does it, to be able to set environment variables for the parent (the shell). The reason for this is that a child process (the package manager) can't set environment variables for the parent process (the shell). This complicates the implementation and installation of the package manager and requires different implementations for Posix and Windows. Another idea would be to manipulate the dmd.conf/sc.ini file but that seems to be quite complicated and messy. On the other hand, this wouldn't require any changes to the compiler. Any other ideas? https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D https://github.com/jacob-carlborg/orbitFor use case 1 the package manager could just as well call dmd directly with the correct flags ie. no need for using environment variables. Use case 2 does not belong to a package manager in my opinion. It is the job of a build tool to configure packages for a project.
Dec 10 2011
On 2011-12-10 23:05, Jonathan M Davis wrote:On Saturday, December 10, 2011 22:17:44 jdrewsen wrote:Currently you specify the build tool in the specification file, which also contains dependencies, which files to include in the package and so on. The package manager then just invokes the build tool. Currently the build tool needs to be supported by the package manager, it needs to know how to invoke the build tool. Currently there is no verification that the build tool exists. The intention was not to choose among these use case, both of them happen and need to be handled with. -- /Jacob CarlborgFor use case 1 the package manager could just as well call dmd directly with the correct flags ie. no need for using environment variables. Use case 2 does not belong to a package manager in my opinion. It is the job of a build tool to configure packages for a project.This brings up an interesting situation. In general, I don't think that a package manager has any business building the project which is pulling in dependencies. However, it _does_ make some sense to build the dependencies on the box that these are being pulled in on, since they're going to have to be built for that box natively. And each of those projects could be using different build tools. One could be using make. Another could be using cmake. Another could be using scons. Etc. So, how is that dealt with? Does each package list its choose build tool as a dependency and the programmer must then make sure that that build tool has been installed on their system by whatever means non-D packages/programs are installed? Or does that mean that packages using the package manager all need to use a specific build tool? And if they do, should the package manager then be that build tool? Or do we make it so that the package manager doesn't actually build _anything_? Rather it pulls in the source along with pre-built binaries for your architecture, and if you want to build it for your manchine specifically, you have to go and built it yourself after it gets pulled down? This is all looking very messing to me. I have no idea how orbit deals with any of this, since I've never really looked at orbit. But it makes for an ugly problem. give a better opinion on it. - Jonathan M Davis
Dec 11 2011
A few other potential twists. - the installation step needs to be portable int that can install the variant build artifacts into non-standard file system locations. For example, the build artifacts for the windows build and the build artifacts for the linux build need to end up in separate directories and the Debug and Release builds need to end up in separate directories. Another example is a Build server building multiple projects. - the package system itself needs to be portable in that it can be installed in any directory. For example, if I want to source control the entire package system then it would not be in a standard file-system location. Also it implies there may be multiple installations of the package system since I can have multiple branches. For regulated industries that require the ability to recreate the software environment for any released binary image, both of these would be a terrific help for doing that. John
Dec 10 2011
On 2011-12-10 23:32, J Arrizza wrote:A few other potential twists. - the installation step needs to be portable int that can install the variant build artifacts into non-standard file system locations. For example, the build artifacts for the windows build and the build artifacts for the linux build need to end up in separate directories and the Debug and Release builds need to end up in separate directories. Another example is a Build server building multiple projects. - the package system itself needs to be portable in that it can be installed in any directory. For example, if I want to source control the entire package system then it would not be in a standard file-system location. Also it implies there may be multiple installations of the package system since I can have multiple branches. For regulated industries that require the ability to recreate the software environment for any released binary image, both of these would be a terrific help for doing that. JohnCurrently by default the package manager installs everything in (on Posix) /usr/local/orbit/orbs. It's possible to override this using the environment variable "ORB_HOME", if this variables is used packages will be install into $ORB_HOME/orbs. -- /Jacob Carlborg
Dec 11 2011
Am 11.12.2011, 23:12 Uhr, schrieb Jacob Carlborg <doob me.com>:On 2011-12-10 23:32, J Arrizza wrote:You have to have super user rights for the default. Maven installs everything to ~/.maven by default, which will work out of the box.A few other potential twists. - the installation step needs to be portable int that can install the variant build artifacts into non-standard file system locations. For example, the build artifacts for the windows build and the build artifacts for the linux build need to end up in separate directories and the Debug and Release builds need to end up in separate directories. Another example is a Build server building multiple projects. - the package system itself needs to be portable in that it can be installed in any directory. For example, if I want to source control the entire package system then it would not be in a standard file-system location. Also it implies there may be multiple installations of the package system since I can have multiple branches. For regulated industries that require the ability to recreate the software environment for any released binary image, both of these would be a terrific help for doing that. JohnCurrently by default the package manager installs everything in (on Posix) /usr/local/orbit/orbs. It's possible to override this using the environment variable "ORB_HOME", if this variables is used packages will be install into $ORB_HOME/orbs.
Dec 18 2011
On 2011-12-19 08:06, Marco Leise wrote:Am 11.12.2011, 23:12 Uhr, schrieb Jacob Carlborg <doob me.com>:I have thought of that as well, I've just picked a folder for now. It can easily be changed in the code. -- /Jacob CarlborgOn 2011-12-10 23:32, J Arrizza wrote:You have to have super user rights for the default. Maven installs everything to ~/.maven by default, which will work out of the box.A few other potential twists. - the installation step needs to be portable int that can install the variant build artifacts into non-standard file system locations. For example, the build artifacts for the windows build and the build artifacts for the linux build need to end up in separate directories and the Debug and Release builds need to end up in separate directories. Another example is a Build server building multiple projects. - the package system itself needs to be portable in that it can be installed in any directory. For example, if I want to source control the entire package system then it would not be in a standard file-system location. Also it implies there may be multiple installations of the package system since I can have multiple branches. For regulated industries that require the ability to recreate the software environment for any released binary image, both of these would be a terrific help for doing that. JohnCurrently by default the package manager installs everything in (on Posix) /usr/local/orbit/orbs. It's possible to override this using the environment variable "ORB_HOME", if this variables is used packages will be install into $ORB_HOME/orbs.
Dec 19 2011
On 2011-12-10 22:17, jdrewsen wrote:On Saturday, 10 December 2011 at 08:55:57 UTC, Jacob Carlborg wrote:I was thinking that the package manager just invokes a build tool like make, rdmd, dsss, shell script and so on.I think I've come so far in my development of a package manager that it's time to think how it should interact with the compiler. Currently I see two use cases: 1. When the package manager installs (and builds) a package 2. When a user (developer) builds a project and want's to use installed packages In the best of worlds the user wouldn't have to do anything and it just works. The package manger needs to somehow pass import paths to the compiler and libraries to link with. I'm not entirely sure what the best method to do this would be. But I'm thinking that if the compiler could accept compiler flags passed via environment variables use case 1 would be easy to implement. For use case 2 it would be a bit more problematic. In this use case the user would need to somehow tell the package manager that I want to use these packages, something like: // project.obspec orb "foo" orb "bar" $ orb use project.obspec or for single packages $ orb use foobar $ dmd project.d If environment variables are used in this case, then the package manager would need a shell script wrapper, the same way as DVM does it, to be able to set environment variables for the parent (the shell). The reason for this is that a child process (the package manager) can't set environment variables for the parent process (the shell). This complicates the implementation and installation of the package manager and requires different implementations for Posix and Windows. Another idea would be to manipulate the dmd.conf/sc.ini file but that seems to be quite complicated and messy. On the other hand, this wouldn't require any changes to the compiler. Any other ideas? https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D https://github.com/jacob-carlborg/orbitFor use case 1 the package manager could just as well call dmd directly with the correct flags ie. no need for using environment variables.Use case 2 does not belong to a package manager in my opinion. It is the job of a build tool to configure packages for a project. What would be nice to have support for using packages without a build tool. Maybe something like what pkg-config provides: dmd -ofhello `orb -lib foo` hello.d where "org -lib foo" returns the flags to use the foo package. /JonasI would say that the preferred way is to use a build tool then there is no problem. The build tool just asks the package manager which import paths to use for the given packages and pass the information to the compiler. But I don't want my package manager to depend on a built tool, I want it to be usable on its own. -- /Jacob Carlborg
Dec 11 2011
On Sunday, 11 December 2011 at 21:22:37 UTC, Jacob Carlborg wrote:On 2011-12-10 22:17, jdrewsen wrote:And for that I think the pkg-config method is the way to go. Setting environment vars brings unneeded state into you development session. Another option would be to just wrap dmd in a e.g. orbdmd command and handle it there. Btw: have you considered renaming from orb to something that makes sense to newbies e.g. dpack? -JonasOn Saturday, 10 December 2011 at 08:55:57 UTC, Jacob Carlborg wrote:I was thinking that the package manager just invokes a build tool like make, rdmd, dsss, shell script and so on.I think I've come so far in my development of a package manager that it's time to think how it should interact with the compiler. Currently I see two use cases: 1. When the package manager installs (and builds) a package 2. When a user (developer) builds a project and want's to use installed packages In the best of worlds the user wouldn't have to do anything and it just works. The package manger needs to somehow pass import paths to the compiler and libraries to link with. I'm not entirely sure what the best method to do this would be. But I'm thinking that if the compiler could accept compiler flags passed via environment variables use case 1 would be easy to implement. For use case 2 it would be a bit more problematic. In this use case the user would need to somehow tell the package manager that I want to use these packages, something like: // project.obspec orb "foo" orb "bar" $ orb use project.obspec or for single packages $ orb use foobar $ dmd project.d If environment variables are used in this case, then the package manager would need a shell script wrapper, the same way as DVM does it, to be able to set environment variables for the parent (the shell). The reason for this is that a child process (the package manager) can't set environment variables for the parent process (the shell). This complicates the implementation and installation of the package manager and requires different implementations for Posix and Windows. Another idea would be to manipulate the dmd.conf/sc.ini file but that seems to be quite complicated and messy. On the other hand, this wouldn't require any changes to the compiler. Any other ideas? https://github.com/jacob-carlborg/orbit/wiki/Orbit-Package-Manager-for-D https://github.com/jacob-carlborg/orbitFor use case 1 the package manager could just as well call dmd directly with the correct flags ie. no need for using environment variables.Use case 2 does not belong to a package manager in my opinion. It is the job of a build tool to configure packages for a project. What would be nice to have support for using packages without a build tool. Maybe something like what pkg-config provides: dmd -ofhello `orb -lib foo` hello.d where "org -lib foo" returns the flags to use the foo package. /JonasI would say that the preferred way is to use a build tool then there is no problem. The build tool just asks the package manager which import paths to use for the given packages and pass the information to the compiler. But I don't want my package manager to depend on a built tool, I want it to be usable on its own.
Dec 12 2011
On 2011-12-12 12:45, jdrewsen wrote:On Sunday, 11 December 2011 at 21:22:37 UTC, Jacob Carlborg wrote:Ok.I would say that the preferred way is to use a build tool then there is no problem. The build tool just asks the package manager which import paths to use for the given packages and pass the information to the compiler. But I don't want my package manager to depend on a built tool, I want it to be usable on its own.And for that I think the pkg-config method is the way to go. Setting environment vars brings unneeded state into you development session. Another option would be to just wrap dmd in a e.g. orbdmd command and handle it there.Btw: have you considered renaming from orb to something that makes sense to newbies e.g. dpack? -JonasNo, I basically just picked a random name. I tried of trying to come up with good names for tools and libraries. -- /Jacob Carlborg
Dec 12 2011
On 12/10/2011 03:55 AM, Jacob Carlborg wrote:I think I've come so far in my development of a package manager that it's time to think how it should interact with the compiler. ...o.O I've read what I could find and I think I like where this is going. I'm not sure where you're drawing your inspiration from, but if this is going to support features similar to Portage then I am willing to give money to help make sure it happens.
Dec 11 2011
On 2011-12-12 04:08, Chad J wrote:On 12/10/2011 03:55 AM, Jacob Carlborg wrote:It's basically RubyGems but for D. It's great to hear that someone likes it. -- /Jacob CarlborgI think I've come so far in my development of a package manager that it's time to think how it should interact with the compiler. ...o.O I've read what I could find and I think I like where this is going. I'm not sure where you're drawing your inspiration from, but if this is going to support features similar to Portage then I am willing to give money to help make sure it happens.
Dec 12 2011
On 12/12/2011 08:58 AM, Jacob Carlborg wrote:On 2011-12-12 04:08, Chad J wrote:OK, cool. I should probably mention some of the things I like about portage (off the top of my head), incase it helps: - The world file: A list of all packages that the /user/ elected to install. It does not contain dependencies. It is the top level. - use-flags: Flags/keywords associated with packages that allow you to turn specific features within packages on and off. - Stability levels: Portage has a notion of unstable/untested or "hardmasked" packages at one level, slightly unstable or architecture-specific glitchiness at another level ("keyworded"), and completely stable at another. --------------------------------- As for why I like these things: - The world file: This makes it really easy to replicate installations on other machines. It also allows me to cull my tree by removing something from the world file and then telling it to remove all the orphaned packages. - use-flags: These are super useful when a package has a dependency that just will not compile on my system. In some cases I can disable the feature that causes that dependency, and then still be able to install the package. - Stability levels: These can be controlled at different granularities, examples: system has only stable packages, or all unstable, or stable except for packages in the "keywords" file, and maybe one package in the keywords file has all versions allowed, or just versions 1.3.44 and 1.5.21. This is yet more control over giving troubling packages the boot. --------------------------------- Things I don't like about portage: - The portage tree doesn't keep enough old versions around sometimes. - People who write crappy ebuilds or mark things stable when they mess up my system. The quality control used to be better. (It's still my favorite package manager by a wide margin.)On 12/10/2011 03:55 AM, Jacob Carlborg wrote:It's basically RubyGems but for D. It's great to hear that someone likes it.I think I've come so far in my development of a package manager that it's time to think how it should interact with the compiler. ...o.O I've read what I could find and I think I like where this is going. I'm not sure where you're drawing your inspiration from, but if this is going to support features similar to Portage then I am willing to give money to help make sure it happens.
Dec 13 2011
On 2011-12-13 14:04, Chad J wrote:OK, cool. I should probably mention some of the things I like about portage (off the top of my head), incase it helps: - The world file: A list of all packages that the /user/ elected to install. It does not contain dependencies. It is the top level.That might be a good idea. I had only planned to list all installed packages.- use-flags: Flags/keywords associated with packages that allow you to turn specific features within packages on and off.I currently have no plans of configurable packages. Either the complete package is installed or nothing is installed.- Stability levels: Portage has a notion of unstable/untested or "hardmasked" packages at one level, slightly unstable or architecture-specific glitchiness at another level ("keyworded"), and completely stable at another.Orbit uses Semantic Versioning: http://semver.org/Things I don't like about portage: - The portage tree doesn't keep enough old versions around sometimes.I have no plans of removing old packages as long as it doesn't cause any problems.- People who write crappy ebuilds or mark things stable when they mess up my system. The quality control used to be better. (It's still my favorite package manager by a wide margin.)This seems hard to avoid and I don't know what can be done about it. -- /Jacob Carlborg
Dec 13 2011
On 12/13/2011 08:45 AM, Jacob Carlborg wrote:On 2011-12-13 14:04, Chad J wrote:Would you allow others to implement this, or somehow be open to it in the future? Of course, I can definitely understand not wanting to handle this right now, due to scope creep.OK, cool. I should probably mention some of the things I like about portage (off the top of my head), incase it helps: - The world file: A list of all packages that the /user/ elected to install. It does not contain dependencies. It is the top level.That might be a good idea. I had only planned to list all installed packages.- use-flags: Flags/keywords associated with packages that allow you to turn specific features within packages on and off.I currently have no plans of configurable packages. Either the complete package is installed or nothing is installed.I'll read that when I get a bit of time.- Stability levels: Portage has a notion of unstable/untested or "hardmasked" packages at one level, slightly unstable or architecture-specific glitchiness at another level ("keyworded"), and completely stable at another.Orbit uses Semantic Versioning: http://semver.org/Nice. Thanks.Things I don't like about portage: - The portage tree doesn't keep enough old versions around sometimes.I have no plans of removing old packages as long as it doesn't cause any problems.Maintainers being more conservative, I suspect. It's not too bad in Portage, and mostly happens on super large projects with many packages, like KDE. The bread-and-butter linux stuff (kernel, compilers, small apps, drivers, etc) all tends to work out fine. It can also be mitigated a lot by having older versions around. I can easily avoid this by reverting to an earlier version of my system... except I can't sometimes. In a production environment I would probably keep all versions of my stuff packaged locally.- People who write crappy ebuilds or mark things stable when they mess up my system. The quality control used to be better. (It's still my favorite package manager by a wide margin.)This seems hard to avoid and I don't know what can be done about it.
Dec 13 2011
On 2011-12-13 15:05, Chad J wrote:On 12/13/2011 08:45 AM, Jacob Carlborg wrote:It might happen in the future. But currently I think it's unnecessary and too complicated.On 2011-12-13 14:04, Chad J wrote:Would you allow others to implement this, or somehow be open to it in the future?OK, cool. I should probably mention some of the things I like about portage (off the top of my head), incase it helps: - The world file: A list of all packages that the /user/ elected to install. It does not contain dependencies. It is the top level.That might be a good idea. I had only planned to list all installed packages.- use-flags: Flags/keywords associated with packages that allow you to turn specific features within packages on and off.I currently have no plans of configurable packages. Either the complete package is installed or nothing is installed.Of course, I can definitely understand not wanting to handle this right now, due to scope creep.Yeah, it won't happen in the first release. -- /Jacob Carlborg
Dec 13 2011