
digitalmars.D - envy for "Writing Go Packages"

reply Graham Fawcett <fawcett uwindsor.ca> writes:
A screencast overview of the Go package model (writing, publishing, 
downloading, and installing third-party code):

http://www.youtube.com/watch?v=jDWBJOXs_iI

Simple and well conceived! I would like this for D please (and am willing 
to help :)). 

DSSS covers similar ground, but appears to be moribund (and suffers from 
a single point of failure problem). Is there interest in developing some 
simple but effective package-management tools like Go's?

Best,
Graham
May 06 2010
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Graham Fawcett wrote:
 Simple and well conceived! I would like this for D please (and am willing 
 to help :)). 
Any help in this direction will be most appreciated.
May 06 2010
parent reply Graham Fawcett <fawcett uwindsor.ca> writes:
On Thu, 06 May 2010 16:01:49 -0700, Walter Bright wrote:

 Graham Fawcett wrote:
 Simple and well conceived! I would like this for D please (and am
 willing to help :)).
Any help in this direction will be most appreciated.
I'm slowly working on a proposal, trying to incorporate the feedback we've received on this thread. I've got a few specific questions for the group.

(For clarity's sake, I'll refer to downloadable, third-party packages as "subprojects" here, so as not to overload terms like library, module and package. It's not a great name, but it's unambiguous.)

Let's say I want to use a subproject in my application: let's take sybrandy's new logging module as an example. Assume I have tool support for declaring my app's dependency on his subproject, and for downloading and storing his source files.

It seems to me that a decent tool should not only download the subproject's source, but also compile it into a static library. It should also inform my app's build process, stating what compiler and linker flags I need in order to build and link my app against the subproject.

Do you agree that precompiling a subproject is a desirable feature? If so, what's the minimum harnessing we can impose upon a subproject writer, to make their library compilable by an automated tool?

We could require, e.g., that the subproject author maintain a Makefile with a 'd-preinstall' target, which compiles the package into an installable form (perhaps copying the libfile, along with a link to the root of the sources, into an $INSTALL_TARGET directory). The installer would then finish the installation process.

An alternative to 'make d-preinstall' would be a more D-specific package descriptor, like Haskell's ".cabal" descriptors, which tell the installer what the package contains, and how to build and install it. The 'descriptor' approach puts a bigger burden on the design of the install-tool, and loses some flexibility compared with the 'make' approach; but lost flexibility isn't necessarily bad.

Given the host of ways that people organize their own code, I don't think we can expect an automated tool to guess the layout and compilation instructions for any arbitrary subproject.
I think we either need source layout conventions, or we need a demarcation point like 'make d-preinstall'. Or, what else? Any thoughts?

Thanks,
Graham
May 12 2010
parent reply BCS <none anon.com> writes:
Hello Graham,

 If so, what's the minimum harnessing we can impose upon a subproject
 writer, to make their library compilable by an automated tool?
For many cases, there is a zero-config solution that will work much of the time: compile all files and pack them into a .lib/.a. Until that doesn't work, the writer shouldn't have to do anything but supply a name for the lib and a list of files.

-- ... <IXOYE><
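The zero-config idea above amounts to "walk the tree, collect the sources, hand them all to dmd -lib". A rough Python sketch (hypothetical: `zero_config_build` is an illustrative helper, not part of any existing tool):

```python
import os

def zero_config_build(src_root, lib_name):
    """Collect every .d file under src_root and return the dmd command
    that would compile them all into one static library."""
    sources = []
    for dirpath, _dirnames, filenames in os.walk(src_root):
        for name in sorted(filenames):
            if name.endswith(".d"):
                sources.append(os.path.join(dirpath, name))
    # dmd's -lib switch builds a static library instead of an executable;
    # -of<name> names the output file.
    return ["dmd", "-lib", "-of" + lib_name] + sources
```

The only inputs the subproject writer supplies are the library name and, implicitly, the file list -- matching the "zero config until that doesn't work" approach.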
May 12 2010
parent reply Graham Fawcett <fawcett uwindsor.ca> writes:
Hi BCS,

On Thu, 13 May 2010 01:53:34 +0000, BCS wrote:
 Hello Graham,
 
 If so, what's the minimum harnessing we can impose upon a subproject
 writer, to make their library compilable by an automated tool?
For many cases, there is a zero config solution that will work much of the time: Compile all files and pack them into a .lib/.a
Right. Last night I reviewed a set of dsource.org projects, and I see how a zero-config solution would work for many of them.
 Until that doesn't work, the writer shouldn't have to do anything
 but supply a name for the lib and a list of files.
It won't take long before more options are needed (I would still vote to have recommended linker flags be declarable, to support wrappers to C libraries), but this is a good start.

Any preferences re: how to declare the list of files? (Would you want glob patterns, exclusions, etc.?)

I'm tempted to support a truly zero-config version that (a) derives the library name from the parent directory name of the source (skipping intermediary directories like 'branches/foo/', 'trunk', etc.); (b) derives the list of files by including all *.d files, except those in directories named 'test' and except those which contain a main() function.

Graham
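Rules (a) and (b) could look roughly like this in Python. Everything here is an illustrative assumption: the helper names, the exact SKIP_DIRS set, and the crude main() regex (which would also match a main() mentioned in a comment; a real tool would need to be smarter, e.g. about branch names under 'branches/'):

```python
import os
import re

# Directory names treated as repository plumbing rather than package names
# (assumed set, drawn from the post's 'branches/foo/', 'trunk' examples).
SKIP_DIRS = {"trunk", "branches", "tags", "src"}

def derive_lib_name(source_dir):
    """Walk upward from source_dir; the first path component that is not
    repository plumbing becomes the library name."""
    path = os.path.abspath(source_dir)
    while True:
        path, leaf = os.path.split(path)
        if not leaf:
            return None
        if leaf.lower() not in SKIP_DIRS:
            return leaf

def collect_sources(src_root):
    """All *.d files, skipping 'test' directories and modules defining main()."""
    has_main = re.compile(r"\b(?:void|int)\s+main\s*\(")
    picked = []
    for dirpath, dirnames, filenames in os.walk(src_root):
        dirnames[:] = [d for d in dirnames if d != "test"]  # prune test dirs
        for name in sorted(filenames):
            if not name.endswith(".d"):
                continue
            full = os.path.join(dirpath, name)
            with open(full) as fh:
                if not has_main.search(fh.read()):
                    picked.append(full)
    return picked
```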
May 13 2010
parent reply BCS <none anon.com> writes:
Hello Graham,

 (b) derives the list of files by including all *.d files, except those
 in directories named 'test' and except those which contain a
 main() function.
I'm thinking of how to make this work for the simple publication model of "Put it all out via HTTP/SVN/CVS/GIT/etc.". Anything that works for that should generalize well.

How about: download files as needed to do imports from any local code, but don't build them. Once you have copies of everything you need, build one lib per mapping rule that includes everything retrieved via that rule. At that point the name hardly matters at all. You can make it the FQN of the mapped package if you want it easy to interpret.

As for linker flags... pragmas or header comments? Or how about a default flags file at the root of the mapped source?

-- ... <IXOYE><
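The "download files as needed to do imports" step presupposes scanning local code for the modules it imports, so the tool knows what to fetch. A deliberately naive sketch of that scan (hypothetical helper; it ignores comments, string literals, and selective imports like `import std.stdio : writeln;`, all of which a real tool would have to handle):

```python
import re

# Matches 'import foo.bar;' and 'import foo.bar, foo.baz;' forms only.
IMPORT_RE = re.compile(r"\bimport\s+([\w.]+(?:\s*,\s*[\w.]+)*)\s*;")

def imported_modules(d_source):
    """Return the fully qualified module names imported by a D source string."""
    modules = []
    for match in IMPORT_RE.finditer(d_source):
        for name in match.group(1).split(","):
            modules.append(name.strip())
    return modules
```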
May 13 2010
parent reply Graham Fawcett <fawcett uwindsor.ca> writes:
Hi BCS,

On Thu, 13 May 2010 17:31:34 +0000, BCS wrote:

 (b) derives the list of files by including all *.d files, except those
 in directories named 'test' and except those which contain a main()
 function.
I'm thinking of how to make this work for the simple publication model of "Put it all out via HTTP/SVN/CVS/GIT/etc.". Anything that works for that should generalize well.
Yes, agreed, 'simple publication' is what I hope to target. I'm confident that issues like versioning, dependencies, and cataloguing can be addressed once 'simple publication' is stable.
 How about: Download files as needed to do imports from any local code
 but don't build them. Once you have copies of everything you need, build
 one lib per mapping rule that includes everything retrieved via that
 rule. At that point the name doesn't hardly matter at all. You can make
 it the FQN of the mapped package if you want it easy to interpret.
If I follow you, then the naming of libraries (or pre-compilation of libraries at all) is a non-issue: we download the necessary source files, cherry-pick the ones needed for our project, and compile them together. Or let the compiler do the cherry-picking for us: just make sure everything is "included" properly when we invoke the compiler, and it can discover all the necessary modules.

I have a prototype that does roughly that already, but I was concerned others might think the lack of a pre-compiled library was too hackish (though I quite like it!). I think I'll proceed, and come back with some sample code.
 As for linker flags... pragams or header comments? or how about a
 default flags file at the root of the mapped source.
Thanks to your suggestion, I just discovered 'pragma(lib, ...)', which does exactly what I wanted, and in a canonical way. (The only linker flag that's especially important IMHO is '-l', and this pragma addresses that.)

So, we ask C-wrapper authors to toss in a quick pragma, and we're off to the races.

Best,
Graham
May 13 2010
next sibling parent BCS <none anon.com> writes:
Hello Graham,

 I was concerned
 others might think the lack of a pre-compiled library was too hackish
 (though I quite like it!). I think I'll proceed, and come back with
 some sample code.
 
[...]
 Thanks to your suggestion, I just discovered 'pragma("lib", ...)'
 which does exactly what I wanted, and in a canoncial way. 
Put those together and an idea crops up: allow more than just local lib files in that pragma. URLs come to mind, but I'd think that some way to say a path is relative to the root of the current package (local, mapped or whatnot) might be very useful.

pragma(lib, "foo");   // as today
pragma(lib, " foo");  // derived from package root, downloaded if needed; might need to ask the user "y/n" for that.

(BTW: hard coding the extension looks like a bad idea to me.)
 Best,
 Graham
-- ... <IXOYE><
May 13 2010
prev sibling parent reply Graham Fawcett <fawcett uwindsor.ca> writes:
Hey again,

On Thu, 13 May 2010 18:26:07 +0000, Graham Fawcett wrote:

 Hi BCS,
 
 On Thu, 13 May 2010 17:31:34 +0000, BCS wrote:
 
 (b) derives the list of files by including all *.d files, except those
 in directories named 'test' and except those which contain a main()
 function.
I'm thinking of how to make this work for the simple publication model of "Put it all out via HTTP/SVN/CVS/GIT/etc.". Anything that works for that should generalize well.
Yes, agreed, 'simple publication' is what I hope to target. I'm confident that issues like versioning, dependencies, and cataloguing can be addressed once 'simple publication' is stable.
 How about: Download files as needed to do imports from any local code
 but don't build them. Once you have copies of everything you need,
 build one lib per mapping rule that includes everything retrieved via
 that rule. At that point the name doesn't hardly matter at all. You can
 make it the FQN of the mapped package if you want it easy to interpret.
If I follow you, then the naming of libraries (or pre-compilation of libraries at all) is a non-issue: we download the necessary source files, cherry-pick the ones needed for our project, and compile them together. Or let the compiler do the cherry-picking for us: just make sure everything is "included" properly when we invoke the compiler, and it can discover all the necessary modules. I have a prototype that does roughly that already, but I was concerned others might think the lack of a pre-compiled library was too hackish (though I quite like it!). I think I'll proceed, and come back with some sample code.
OK, I'm back. :) I've put a copy of my prototype here:

http://github.com/gmfawcett/d-build

The name 'd-build' is a bit misleading, but I have nothing better to call it. Currently it's a Python script -- sorry, I can still prototype a lot faster in Python -- but a D version will follow if there is interest.

If you're interested, please visit the github link and read the README to get an idea of what it does (and doesn't do). If you look at the DEPS file, you'll see the amazing way that I specify a dependency on a specific version of an external project. At this point, it's so simple that it's almost a "so what" -- but I think it is a promising start. Once the mechanics are down pat, the fun begins.

Best,
Graham
May 14 2010
parent reply Jacob Carlborg <doob me.com> writes:
On 5/14/10 22:27, Graham Fawcett wrote:
 Hey again,

 On Thu, 13 May 2010 18:26:07 +0000, Graham Fawcett wrote:

 Hi BCS,

 On Thu, 13 May 2010 17:31:34 +0000, BCS wrote:

 (b) derives the list of files by including all *.d files, except those
 in directories named 'test' and except those which contain a main()
 function.
I'm thinking of how to make this work for the simple publication model of "Put it all out via HTTP/SVN/CVS/GIT/etc.". Anything that works for that should generalize well.
Yes, agreed, 'simple publication' is what I hope to target. I'm confident that issues like versioning, dependencies, and cataloguing can be addressed once 'simple publication' is stable.
 How about: Download files as needed to do imports from any local code
 but don't build them. Once you have copies of everything you need,
 build one lib per mapping rule that includes everything retrieved via
 that rule. At that point the name doesn't hardly matter at all. You can
 make it the FQN of the mapped package if you want it easy to interpret.
If I follow you, then the naming of libraries (or pre-compilation of libraries at all) is a non-issue: we download the necessary source files, cherry-pick the ones needed for our project, and compile them together. Or let the compiler do the cherry-picking for us: just make sure everything is "included" properly when we invoke the compiler, and it can discover all the necessary modules. I have a prototype that does roughly that already, but I was concerned others might think the lack of a pre-compiled library was too hackish (though I quite like it!). I think I'll proceed, and come back with some sample code.
OK, I'm back. :) I've put a copy of my prototype, here: http://github.com/gmfawcett/d-build The name 'd-build' is a bit misleading, but I have nothing better to call it. Currently it's a Python script -- sorry, I can still prototype a lot faster in Python -- but a D version will follow if there is interest. If you're interested, please visit the github link and read the README to get an idea of what it does (and doesn't do). If you look at the DEPS file, you'll see the amazing way that I specify a dependency on a specific verision of an external project. At this point, it's so simple that it's almost a "so what" -- but I think it is a promising start. Once the mechanics are down pat, the fun begins. Best, Graham
If you have already written the script in Python, why do you use a shell script as the dependency file? You could use Python for that as well, and you wouldn't have problems on Windows.
May 15 2010
parent reply Graham Fawcett <fawcett uwindsor.ca> writes:
Hi Jacob,

On Sat, 15 May 2010 15:28:23 +0200, Jacob Carlborg wrote:
 
 If you already have it written the script in python why do you use a
 shell script as the dependency file? 
The dependency file isn't strictly a shell script, though the syntax looks the same. As you see, each line represents an external project; but the program reads each line and decides whether or not to execute it (via a system() call). It suits the needs of the moment, which are minimal. I'm not suggesting it's a good long-term format.
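In other words, a reader for such a DEPS file needs only a few lines. This hypothetical sketch parses the lines without executing them; the real prototype's format and behaviour may differ in detail:

```python
import shlex

def read_deps(deps_text):
    """Parse a DEPS-style file: one shell-like command per line, with blank
    lines and '#' comments ignored. Returns argv lists; the caller decides
    whether to actually run each one (e.g. via subprocess)."""
    commands = []
    for line in deps_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        commands.append(shlex.split(line))
    return commands
```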
 You could use python for that as well and you wouldn't have problems
 on windows.
Sure. At this early stage, I don't want to make too many decisions about what technologies to use, what configuration formats, etc. The big questions of the day are about function and scope: are we headed in the right direction, where do we stop, and how do we separate our concerns?

On a side note, I've found that dranges, dstats and dcrypt (all from dsource.org) all work with a little bit of shoehorning (though I had to patch dcrypt to get it to compile under the latest D2: see http://github.com/gmfawcett/d-build/blob/dcrypt-play/DEPS). I would really like to hear of any other online D2 libraries that appear to work (and, as importantly, ones that you'd like to work but don't).

Thanks for your input!
Graham
May 15 2010
parent reply Philippe Sigaud <philippe.sigaud gmail.com> writes:
On Sat, May 15, 2010 at 18:27, Graham Fawcett <fawcett uwindsor.ca> wrote:

 On a side note, I've found that dranges, dstats and dcrypt (all from
 dsource.org) all work with a little bit of shoehorning (though I had
 to patch dcrypt to get it to compile under the latest D2: see
 http://github.com/gmfawcett/d-build/blob/dcrypt-play/DEPS). I would
 really like to hear of any other online D2 libraries that appear to
 work (and as importantly, ones that you'd like to work but don't).
Out of curiosity, what kind of shoehorning for dranges? I see that you move the trunk inside a dranges directory. I admit putting the modules in trunk, as that's what many other projects do. Would you prefer the files to be in ./trunk/dranges/ or even directly in ./dranges/?

Philippe
May 15 2010
parent reply Graham Fawcett <fawcett uwindsor.ca> writes:
Hi Philippe,

On Sat, 15 May 2010 23:03:54 +0200, Philippe Sigaud wrote:

 On Sat, May 15, 2010 at 18:27, Graham Fawcett <fawcett uwindsor.ca>
 wrote:
 
 On a side note, I've found that dranges, dstats and dcrypt (all from
 dsource.org) all work with a little bit of shoehorning (though I had to
 patch dcrypt to get it to compile under the latest D2: see
 http://github.com/gmfawcett/d-build/blob/dcrypt-play/DEPS). I would
 really like to hear of any other online D2 libraries that appear to
 work (and as importantly, ones that you'd like to work but don't).
Out of cursiosity, what kind of shoehorning for dranges? I see that you move the trunk inside a dranges directory.
Yes, that was all I had to do for dranges.
 I admit putting the modules in trunk as that's what many other
 projects do. Would you prefer the files to be in .trunk/dranges/ or
 even directly in ./dranges/ ?
'trunk/dranges' would be fine -- keeping 'trunk' makes good sense for a Subversion project. For my purposes, any depth is okay, e.g. 'trunk/src/lang/d2/dranges' would also work. The key is that modules which belong to the 'dranges' package should exist in a 'dranges' directory.

What I'm doing is adding the parent of the 'dranges' directory to dmd's include-path. Then I just let DMD look up the module files using its own conventions (e.g. DMD expects that a module named foo.bar.baz will be defined in a file named foo/bar/baz.d). That's why I renamed trunk to dranges: it was the easiest way to honour the DMD naming convention.

Best,
Graham
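The convention described here is a purely mechanical mapping, which a tool could implement in one line (illustrative sketch; `module_to_path` is a hypothetical helper name):

```python
import os

def module_to_path(module_name):
    """Map a fully qualified D module name to the relative file path dmd
    expects below an include directory: foo.bar.baz -> foo/bar/baz.d."""
    return os.path.join(*module_name.split(".")) + ".d"
```

So with the parent of 'dranges' on the include path, `import dranges.algorithm;` resolves to `dranges/algorithm.d` under that directory.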
May 15 2010
parent Philippe Sigaud <philippe.sigaud gmail.com> writes:
On Sun, May 16, 2010 at 06:13, Graham Fawcett <fawcett uwindsor.ca> wrote:

 'trunk/dranges' would be fine -- keeping 'trunk' makes good sense for
 a Subversion project. For my purposes, any depth is okay, e.g.
 'trunk/src/lang/d2/dranges' would also work. The key is that modules
 which belong to the 'dranges' package should exist in a 'dranges'
 directory.
OK. I admit I switched to a package.module naming convention only quite recently (a month or so ago), so it's still a bit incomplete. I should have done that.
 What I'm doing is adding the parent of the 'dranges' directory to
 dmd's include-path. Then I just let DMD look up the module files using
 its own conventions (e.g. DMD expects that a module named foo.bar.baz
 will be defined in a file named foo/bar/baz.d). That's why I renamed
 trunk to dranges: it was the easiest way to honour the DMD naming
 convention.
OK, got it. I'll add a dranges directory inside trunk today. And add a dranges dir in the downloadable .zip. That's another alternative you have: look inside /download if there is something to get.

I'm following your project with interest, as it'd be cool to have an easy-to-use tool to grab projects. I use dstats from time to time and just dumped the files into my own project file, but a command-line tool is better.

Philippe
May 16 2010
prev sibling next sibling parent reply BCS <none anon.com> writes:
Hello Graham,


 Simple and well conceived! I would like this for D please (and am
 willing to help :)).
I haven't watched the video (not enough bandwidth), but I'd like a well-done package tool as well. Thinking in the direction of a spec, what do you like about the Go system?

Some things I'd like:

- It must work offline (via sneaker net) and from trivial local static mirrors/archives.
- It should support executing bash commands, D executables, or something that does most of what people use that kind of thing for. (Some kind of sandbox/security feature might be needed.)

-- ... <IXOYE><
May 06 2010
parent reply Graham Fawcett <fawcett uwindsor.ca> writes:
Hi BCS,

On Fri, 07 May 2010 01:54:17 +0000, BCS wrote:
 I haven't watched the video (not enough bandwidth) but I'd like a well
 done package tool as well. Thinking in the direction of a spec, what do
 you like about the Go system?
I'll summarize the video (which, by the way, contains everything I know about the Go package system!).

The screencast shows a developer writing a short application with a reusable function in it. He factors the function out into a separate module in a subdirectory, then (with the aid of a copied Makefile) installs the new module into a shared location on the host, so it can be reused by other apps.

Next, he pushes the new module into a github repository, and reinstalls the module from github like so: the source is checked out of github, compiled and installed in the central Go 'pkg' directory (both sources and objects are stored there). He commits some new changes to github, and reinstalls the updated module using the same goinstall command.

Finally, he returns to his original application, and replaces the Go equivalent of "import mytools;" with "import github.com/nf/mytools;" and the application now refers to the version of mytools that was installed from Github.
 Some things I'd like:
 
 -It must work offline (via sneeker net) and from trivial local static
 mirrrors/archives -It should support executing bash commands, D
 executables or something that does most of what people use that kind of
 things for. (Some kind of sandbox/security feature might be needed)
Stepping away from Go specifics, I like the idea of a system that does some of the following things (just a brain-dump, no particular order). (By the way, I'm overloading the term "package" here, at times to mean a package in the D sense, but usually to mean a downloadable and installable unit.)

* It knows how to build and install packages from an arbitrary source, specified by a URL: a git/hg/svn repository, a tarball on the filesystem, a zipfile on a Web server. Exotic sources/formats could possibly be added with plugins.

* It keeps the ceremony of sharing code to a minimum. It should be easy to publish a simple library package, without reading an excess of documentation or writing an excess of package metadata.

* It separates the concerns of installing packages and managing a central repository of packages. As an analogy, consider 'dpkg' vs. 'apt-get' on Debian systems: one knows about the mechanics of package installation and removal, the other can search a central catalogue, automatically install dependencies, etc. While 'd-get' would be very nice, 'd-dpkg' is the essential building block. This is important: the core package tool should not try to do more than absolutely necessary to simply fetch and install code.

* It enables (but does not provide) apt-get-like use by providing ways to (optionally) declare package dependencies (platform, D-version, other needed packages), and include metadata about the package (author, website, etc.), but does not require a complex package description for simple packages. (An important usability question is: what is the minimum package description that should be required of a package's author? Can we work with no metadata at all?)

* It should be easy to have both user and global package directories. It shouldn't be necessary to 'have root' to install a package.

* It doesn't depend on Make or another build tool, at least not for basic library packages. (From the DSSS docs, it seems that DSSS was able to manage dependency graphs, and do the right thing with a simple 'dsss compile and install' command.) Having said that, a package with complex build requirements ought to be buildable with a Makefile, perhaps requiring a 'd-build' target in the Makefile that builds the package into a state that meets our package system's installation requirements.

* It makes it easy for library consumers to integrate the code. After I've installed some third-party libraries, I'd like a tool that can manage (or support) the build of an application that uses those libraries -- for example, by figuring out which object files need to be linked in, based on the modules I've imported. This could be a tool that just calculates the necessary compiler and linker flags, and/or a tool that runs the compiler/linker for me with those flags in place, and/or a tool that builds a suitable Makefile for me, based on my project's dependencies. (Arguably, this feature is outside the scope of a simple fetch-and-install system; but if it's not easy to use those fetched packages, then the overall value of the system is weakened.)

* Nice to have: a way to allow multiple versions of the same package to be installed locally, and a corresponding way to specify which version of the package my application is using.

* Nice to have: a way to specify dependencies between third-party packages, and dependencies on external libraries (not necessarily a way to install those external dependencies, just a declaration).

* Nice to have: some support for running unit tests, especially a basic 'pre-install' unit test to confirm that all dependencies are in place (if possible).

* Nice to have: a way to publish to a central feed of "known D packages". Less formal than dsource.org: not a user site, but a machine-readable one, and possibly distributed/mirrorable. As an example, the Clojure community has a "clojars.org" site which lets anyone publish Clojure packages there. Clojars has very limited support for searching/browsing; it's really more of a key/value store: it is intended for use by automated tools (it's a Maven repository, as I understand it). Clojars actually maintains an archive of those packages (as jar files); this might or might not be a good idea. I think the "feed" is the key thing.

* Centralizing the feed-of-packages would also enable another useful feature: a way of mapping a canonical package name (and optionally, version) to the URL of the source for that package. (Possibly multiple URLs for the same source.) This is headed into 'apt-get' territory, but minimizing the responsibility of the central repository to "here's where to find package x.y.z" without any other assurances (e.g., that x.y.z will build properly on your system). Conventions about package naming (should my DSP package be called "dsp", or "fawcett.dsp", "algorithms.signals.dsp", etc.) and package-name dispute resolution are orthogonal and should be left out of the scope of the system.

These next thoughts are outside the scope; but a good design of the core tool won't get in the way of future progress in these areas:

* Thinking ahead: ensure the system can be used by continuous integration systems, so that e.g. a buildbot could read the feed for new "known D packages", and attempt to install them on various D-version and OS permutations, reporting the results. (I'm not suggesting that the buildbot is part of the package management system, just that it should be easy for buildbot maintainers to be aware of new package versions so they can be tested.) The same facility could be used by developers who want to keep a "mirror" of some or all known D packages (even though those packages might reside upstream on several different hosts).

* Thinking ahead: eventually we hope for a nice site where people can browse, review, grade, and comment upon packages. An opt-in system where consumers agree to publish their installation activity (like Debian's 'popularity contest').

* Thinking ahead: easy message-digests and digital signatures for packages, enabling a truly distributed package-publishing system. Anyone can host a d-libraries mirror, and a signed "central feed" document can publish the known hashes for those packages to avoid man-in-the-middle attacks.

* Thinking ahead: I really like the way github.com shows a 52-week activity graph on projects that are hosted there. I don't think we can do exactly that, but it would be nice to be able to generate a report for all packages, showing how frequently they are updated and maintained.

* I personally favour a system that works with D2 only, but I am new here. :) Otherwise, the implications of D1/D2 and Phobos/Tango need to be considered by more seasoned heads than mine.

And for now, I'm all thought out. :) Your thoughts?

Best,
Graham
May 07 2010
next sibling parent reply bearophile <bearophileHUGS lycos.com> writes:
Graham Fawcett:

 Finally, he returns to his original application, and replaces the Go
 equivalent of "import mytools;" with "import github.com/nf/mytools;"
 and the application now refers to the version of mytools that was
 installed from Github.
Is this any safe to do?
 (by the way, I'm overloading the term "package" here, at times to mean
 a package in the D sense, but usually to mean a downloadable and
 installable unit.)
Python for this has Eggs, Ruby has Gems, D needs Satellites ;-) Bye, bearophile
May 07 2010
parent reply Graham Fawcett <fawcett uwindsor.ca> writes:
On Fri, 07 May 2010 08:26:02 -0400, bearophile wrote:

 Graham Fawcett:
 
 Finally, he returns to his original application, and replaces the Go
 equivalent of "import mytools;" with "import github.com/nf/mytools;"
 and the application now refers to the version of mytools that was
 installed from Github.
Is this any safe to do?
For certain definitions of safety: yes, or no. :) What did you have in mind?
 (by the way, I'm overloading the term "package" here, at times to mean
 a package in the D sense, but usually to mean a downloadable and
 installable unit.)
Python for this has Eggs, Ruby has Gems, D needs Satellites ;-)
Satellites works for me. While they download, we could call them meteorites. :) Graham
May 07 2010
parent reply Leandro Lucarella <llucax gmail.com> writes:
Graham Fawcett, el  7 de mayo a las 12:30 me escribiste:
 On Fri, 07 May 2010 08:26:02 -0400, bearophile wrote:
 
 Graham Fawcett:
 
 Finally, he returns to his original application, and replaces the Go
 equivalent of "import mytools;" with "import github.com/nf/mytools;"
 and the application now refers to the version of mytools that was
 installed from Github.
Is this any safe to do?
For certain definitions of safety: yes, or no. :) What did you have in mind?
 (by the way, I'm overloading the term "package" here, at times to mean
 a package in the D sense, but usually to mean a downloadable and
 installable unit.)
Python for this has Eggs, Ruby has Gems, D needs Satellites ;-)
Satellites works for me. While they download, we could call them meteorites. :)
I think it's better to have a short name. What about rocks? ;) -- Leandro Lucarella (AKA luca) http://llucax.com.ar/ ---------------------------------------------------------------------- GPG Key: 5F5A8D05 (F8CD F9A7 BF00 5431 4145 104C 949E BFB6 5F5A 8D05) ---------------------------------------------------------------------- Salvajes, de traje, me quieren enseñar Salvajes, de traje, me quieren educar
May 07 2010
parent reply Pelle <pelle.mansson gmail.com> writes:
On 05/07/2010 04:50 PM, Leandro Lucarella wrote:
 Graham Fawcett, el  7 de mayo a las 12:30 me escribiste:
 On Fri, 07 May 2010 08:26:02 -0400, bearophile wrote:

 Graham Fawcett:

 Finally, he returns to his original application, and replaces the Go
 equivalent of "import mytools;" with "import github.com/nf/mytools;"
 and the application now refers to the version of mytools that was
 installed from Github.
Is this any safe to do?
For certain definitions of safety: yes, or no. :) What did you have in mind?
 (by the way, I'm overloading the term "package" here, at times to mean
 a package in the D sense, but usually to mean a downloadable and
 installable unit.)
Python for this has Eggs, Ruby has Gems, D needs Satellites ;-)
Satellites works for me. While they download, we could call them meteorites. :)
I think it's better to have a short name. What about rocks? ;)
Lua has rocks.
May 09 2010
parent Leandro Lucarella <llucax gmail.com> writes:
Pelle, el  9 de mayo a las 15:39 me escribiste:
Satellites works for me. While they download, we could call them
meteorites. :)
I think it's better to have a short name. What about rocks? ;)
Lua has rocks.
D'oh! I've used lua a couple of times but never heard about rocks... -- Leandro Lucarella (AKA luca) http://llucax.com.ar/ ---------------------------------------------------------------------- GPG Key: 5F5A8D05 (F8CD F9A7 BF00 5431 4145 104C 949E BFB6 5F5A 8D05) ---------------------------------------------------------------------- Y ya no tengo colores, sólo gris y negro Aquí donde el amor es de hierro Los días pasan y moriremos contando el tiempo
May 09 2010
prev sibling next sibling parent reply Leandro Lucarella <llucax gmail.com> writes:
Graham Fawcett, el  7 de mayo a las 11:55 me escribiste:
 Hi BCS,
Let me add:

* It integrates well with existing packaging systems (dpkg, rpm, etc.). I hate installing stuff outside my distro's packaging system. A simple:

  d-pkg builddeb

should leave you a nice Debian package to install using dpkg, for example. And you might think that I'm just being lazy, but no, this is essential to gain good acceptance among Linux distributions, which means wider distribution of D software. Having this makes it trivial for packagers to pack D software, which means you can do:

  apt-get install d-software

easily, which means reaching more people who don't have a D compiler but want to enjoy D software.

-- Leandro Lucarella (AKA luca) http://llucax.com.ar/
May 07 2010
parent Johan Granberg <lijat.meREM OVEgmail.com> writes:
Leandro Lucarella wrote:

 Graham Fawcett, el  7 de mayo a las 11:55 me escribiste:
 Hi BCS,
Let me add:

* It integrates well with existing packaging systems (dpkg, rpm, etc.). I hate installing stuff outside my distro's packaging system. A simple:

  d-pkg builddeb

should leave you a nice Debian package to install using dpkg, for example. And you might think that I'm just being lazy, but no, this is essential to gain good acceptance among Linux distributions, which means wider distribution of D software. Having this makes it trivial for packagers to pack D software, which means you can do:

  apt-get install d-software

easily, which means reaching more people who don't have a D compiler but want to enjoy D software.
I would love to see this.
May 07 2010
prev sibling parent Walter Bright <newshound1 digitalmars.com> writes:
Graham Fawcett wrote:
 I'll summarize the video
Thank you for doing this. It's a real time saver for us!
May 07 2010
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 5/7/10 00:59, Graham Fawcett wrote:
 A screencast overview of the Go package model (writing, publishing,
 downloading, and installing third-party code):

 http://www.youtube.com/watch?v=jDWBJOXs_iI

 Simple and well conceived! I would like this for D please (and am willing
 to help :)).

 DSSS covers similar ground, but appears to be moribund (and suffers from
 a single point of failure problem). Is there interest in developing some
 simple but effective package-management tools like Go's?

 Best,
 Graham
I would like this as well. I'm still using dsss, it's too bad it's abandoned (it seems like it is anyway). But makefiles, I don't know if there's anything I hate more. The question is: what exactly should the tool do? Should it just download and install third party libraries? Should it also contain something similar to rdmd?
May 07 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Jacob Carlborg wrote:
 Should it also contain something similar to rdmd?
I kind of like the idea that it shouldn't install D packages, but rather cache them from the web repository. It would be convenient because:

1. who actually cares about installing the packages
2. backups are automatic
3. your actual project is small and easily moved to another machine
4. it becomes trivial to use

Source code could look something like:

    import http.d_repository.foo.version1_23;

and the compiler could interpret "http" as meaning the rest is an internet url, foo is the package name, and version1_23 is the particular version of it.
May 07 2010
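Walter's proposal above is only a sketch, so as an illustration here is one hypothetical way a build tool might translate such a dotted module name into a download URL. The conventions used (treating a trailing "versionX_Y" component as a version tag, defaulting to "current", and using the second component as a host stand-in) are assumptions for the example, not part of any real compiler:

```d
// Hypothetical sketch only: map "http.<host>.<package>.<version>" to a URL.
// Dots can't appear in a D identifier, so the host part ("d_repository")
// stands in for a real domain; a real tool would need an escaping rule.
import std.algorithm.searching : startsWith;
import std.array : join, split;

string importToUrl(string moduleName)
{
    auto parts = moduleName.split(".");
    assert(parts.length >= 3 && parts[0] == "http", "not a remote import");

    immutable host   = parts[1];
    immutable last   = parts[$ - 1];
    immutable tagged = last.startsWith("version"); // explicit version given?
    immutable ver    = tagged ? last : "current";  // otherwise take the latest
    immutable pkg    = parts[2 .. $ - (tagged ? 1 : 0)].join("/");
    return "http://" ~ host ~ "/" ~ pkg ~ "/" ~ ver ~ ".d";
}
```

With this rule, `import http.d_repository.foo.version1_23;` would resolve to a fixed versioned file, while leaving the version off would resolve to a "current" alias, matching the "latest version" behaviour described later in the thread.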
next sibling parent reply Johan Granberg <lijat.meREM OVEgmail.com> writes:
Walter Bright wrote:

 Jacob Carlborg wrote:
 Should it also contain something similar to rdmd?
I kind of like the idea that it shouldn't install D packages, but rather cache them from the web repository. It would be convenient because: 1. who actually cares about installing the packages
If I was administering a server, a multiuser system, or a Linux distribution I would care; being able to install packages and libraries globally for all users easily can be important.
 2. backups are automatic
 3. your actual project is small and easily moved to another machine
 4. it becomes trivial to use
 
 Source code could look something like:
 
      import http.d_repository.foo.version1_23;
 
 and the compiler could interpret "http" as meaning the rest is an internet
 url, foo is the package name, and version1_23 is the particular version of
 it.
May 07 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Johan Granberg wrote:
 Walter Bright wrote:
 
 Jacob Carlborg wrote:
 Should it also contain something similar to rdmd?
I kind of like the idea that it shouldn't install D packages, but rather cache them from the web repository. It would be convenient because: 1. who actually cares about installing the packages
If I was administering a server, a multiuser system, or a Linux distribution I would care; being able to install packages and libraries globally for all users easily can be important.
The caching should handle that transparently.
May 07 2010
next sibling parent reply Johan Granberg <lijat.meREM OVEgmail.com> writes:
Walter Bright wrote:

 Johan Granberg wrote:
 Walter Bright wrote:
 
 Jacob Carlborg wrote:
 Should it also contain something similar to rdmd?
I kind of like the idea that it shouldn't install D packages, but rather cache them from the web repository. It would be convenient because: 1. who actually cares about installing the packages
If I was administering a server, a multiuser system, or a Linux distribution I would care; being able to install packages and libraries globally for all users easily can be important.
The caching should handle that transparently.
I have yet to see a system where that works. How can an administrator choose to install a set of libraries in such a setup? Will it still work if network connectivity is not always present (laptop)? Can a Linux distribution use that to ensure that some libraries are always present? Consider that users might not have the disk quota to always have their own copies of every library imaginable installed. Experience from working with Debian systems is that the languages that try to make their own solutions for automatic handling, caching and so on fail badly, while those integrating/packaging as deb packages work fine. Problems seem to appear either when trying to install for all users or when installing in a user's home directory.
May 07 2010
parent Walter Bright <newshound1 digitalmars.com> writes:
Johan Granberg wrote:
 Walter Bright wrote:
 
 Johan Granberg wrote:
 Walter Bright wrote:

 Jacob Carlborg wrote:
 Should it also contain something similar to rdmd?
I kind of like the idea that it shouldn't install D packages, but rather cache them from the web repository. It would be convenient because: 1. who actually cares about installing the packages
If I was administering a server, a multiuser system, or a Linux distribution I would care; being able to install packages and libraries globally for all users easily can be important.
The caching should handle that transparently.
I have yet to see a system where that works.
I think the internet browser is an example.

 How can an administrator choose
 to install a set of libraries in such a setup?
Essentially, he doesn't, because the libraries are not installed. The build process looks in the temporary cache for the files, and uses them if they're there, otherwise it downloads them off the network and puts them in the temporary cache. They are never actually installed.
 Will it still work if network connectivity is not always present (laptop)?
No, it will not, unless the files happen to already be in the cache. The builder will look in the cache first for the files.
 Can a Linux distribution
 use that to ensure that some libraries are always present?
If that is desired, then they can be actually installed by the administrator.
 Consider that users might not have the disk quota to always have their own
 copies of every library imaginable installed.
Being a cache, the least recently used library can always be kicked out, and then reloaded as necessary. Think of it working like the temporary cache for your browser.
 Experience from working with Debian systems is that the languages that try
 to make their own solutions for automatic handling and caching and so on
 fail badly, while those integrating/packaging as deb packages work fine.
 Problems seem to appear either when trying to install for all users or when
 installing in a user's home directory.
These problems are all sidestepped because the libraries won't be installed. It's a wholly different way of doing things.
May 07 2010
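The lookup-then-download behaviour Walter describes, together with the least-recently-used eviction he compares to a browser cache, could be sketched roughly as follows. The cache policy, the tiny capacity, and the pluggable fetcher are illustrative assumptions; nothing like this exists in dmd:

```d
// Sketch of a "never installed, only cached" module store: look in the
// cache first, fetch over the network on a miss, and evict the least
// recently used entry when the cache is full.
alias Fetcher = string delegate(string moduleName);

struct ModuleCache
{
    size_t capacity = 2;            // tiny, so eviction is visible here

    private string[string] source;  // module name -> cached source text
    private ulong[string] lastUsed; // module name -> logical timestamp
    private ulong clock;

    string get(string name, Fetcher fetch)
    {
        if (name !in source)
        {
            if (source.length >= capacity)
                evictOldest();
            source[name] = fetch(name); // cache miss: hit the network
        }
        lastUsed[name] = ++clock;       // record the use for LRU
        return source[name];
    }

    private void evictOldest()
    {
        string victim;
        ulong oldest = ulong.max;
        foreach (name, t; lastUsed)
            if (t < oldest) { oldest = t; victim = name; }
        source.remove(victim);
        lastUsed.remove(victim);
    }
}
```

A second lookup for the same module never touches the network, and a full cache silently drops the entry that has gone unused the longest, which is what makes the "who actually cares about installing" argument workable.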
prev sibling parent reply Leandro Lucarella <llucax gmail.com> writes:
Walter Bright, el  7 de mayo a las 13:18 me escribiste:
 Johan Granberg wrote:
Walter Bright wrote:

Jacob Carlborg wrote:
Should it also contain something similar to rdmd?
I kind of like the idea that it shouldn't install D packages, but rather cache them from the web repository. It would be convenient because: 1. who actually cares about installing the packages
If I was administering a server, a multiuser system, or a Linux distribution I would care; being able to install packages and libraries globally for all users easily can be important.
The caching should handle that transparently.
Already done: http://0install.net/ Too bad (or good?) nobody uses or knows it. -- Leandro Lucarella (AKA luca) http://llucax.com.ar/
May 07 2010
parent reply "Nick Sabalausky" <a a.a> writes:
"Leandro Lucarella" <llucax gmail.com> wrote in message 
news:20100508011012.GA32072 llucax.com.ar...
 Walter Bright, el  7 de mayo a las 13:18 me escribiste:
 Johan Granberg wrote:
Walter Bright wrote:

Jacob Carlborg wrote:
Should it also contain something similar to rdmd?
I kind of like the idea that it shouldn't install D packages, but rather cache them from the web repository. It would be convenient because: 1. who actually cares about installing the packages
If I was administering a server, a multiuser system, or a Linux distribution I would care; being able to install packages and libraries globally for all users easily can be important.
The caching should handle that transparently.
Already done: http://0install.net/ Too bad (or good?) nobody uses or knows it.
That looks absolutely awesome! My only little concern is that it's written in Python; I had to use Bazaar to do something one time and it was insanely slow.
May 07 2010
next sibling parent reply Leandro Lucarella <llucax gmail.com> writes:
Nick Sabalausky, el  7 de mayo a las 22:36 me escribiste:
 The caching should handle that transparently.
Already done: http://0install.net/ Too bad (or good?) nobody uses or knows it.
That looks absolutely awesome! My only little concern is that it's written in Python; I had to use Bazaar to do something one time and it was insanely slow.
I think your bottleneck will be your internet connection ;) -- Leandro Lucarella (AKA luca) http://llucax.com.ar/
May 08 2010
parent "Nick Sabalausky" <a a.a> writes:
"Leandro Lucarella" <llucax gmail.com> wrote in message 
news:20100508173421.GA8076 llucax.com.ar...
 Nick Sabalausky, el  7 de mayo a las 22:36 me escribiste:
 The caching should handle that transparently.
Already done: http://0install.net/ Too bad (or good?) nobody uses or knows it.
That looks absolutely awesome! My only little concern is that it's written in python, I had to use bazaar to do something one time and it was insanely slow.
I think your bottleneck will be your internet connection ;)
Good point. It may very well have been just a slow connection to whatever particular server it was. Heh :)
May 08 2010
prev sibling parent Thomas Leonard <talex5+d gmail.com> writes:
On Fri, 07 May 2010 22:36:22 -0400, Nick Sabalausky wrote:

 "Leandro Lucarella" <llucax gmail.com> wrote in message
 news:20100508011012.GA32072 llucax.com.ar...
 Walter Bright, el  7 de mayo a las 13:18 me escribiste:
 Johan Granberg wrote:
Walter Bright wrote:

Jacob Carlborg wrote:
Should it also contain something similar to rdmd?
I kind of like the idea that it shouldn't install D packages, but rather cache them from the web repository. It would be convenient because: 1. who actually cares about installing the packages
If I was administering a server, a multiuser system, or a Linux distribution I would care; being able to install packages and libraries globally for all users easily can be important.
The caching should handle that transparently.
Already done: http://0install.net/ Too bad (or good?) nobody uses or knows it.
That looks absolutely awesome! My only little concern is that it's written in Python; I had to use Bazaar to do something one time and it was insanely slow.
The speed will depend on your computer, of course, but the overhead on my laptop of using it is around 0.07s (70 ms) to select the versions to use and check they're cached. If they're not cached, the network is the limiting factor, naturally. Using 0compile (which also sets up build directories and generates XML metadata about the build), it seems to add about 0.5s per build: http://0install.net/0compile.html A more likely problem is portability, since Windows users might not want to have to install Python (although someone is writing a .NET version). It should work well for D, as the Delight fork uses it (e.g. see the installation instructions at http://delight.sourceforge.net/install.html). If anyone wants to discuss it, you're welcome on the mailing list: http://0install.net/support.html Cheers, -- Dr Thomas Leonard (author of Zero Install and the Delight experimental fork of D)
May 09 2010
prev sibling next sibling parent reply Rainer Deyke <rainerd eldwood.com> writes:
On 5/7/2010 11:55, Walter Bright wrote:
 Source code could look something like:
 
     import http.d_repository.foo.version1_23;
 
 and the compiler could interpret "http" as meaning the rest is an
 internet url, foo is the package name, and version1_23 is the particular
 version of it.
I like this. The only question is, how do you handle computers without an internet connection? -- Rainer Deyke - rainerd eldwood.com
May 07 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Rainer Deyke wrote:
 I like this.  The only question is, how do you handle computers without
 an internet connection?
It can also be a LAN connection if the sys admin sets up a local repository. But essentially, what we are talking about is distribution over the internet and a central internet repository, so it won't work if you're not connected. It's basically cloud compiling. If you want to work off line, then you can do the extra steps necessary to install it locally.
May 07 2010
next sibling parent reply Bernard Helyer <b.helyer gmail.com> writes:
On 08/05/10 11:25, Walter Bright wrote:
 Rainer Deyke wrote:
 I like this. The only question is, how do you handle computers without
 an internet connection?
It can also be a LAN connection if the sys admin sets up a local repository. But essentially, what we are talking about is distribution over the internet and a central internet repository, so it won't work if you're not connected. It's basically cloud compiling. If you want to work off line, then you can do the extra steps necessary to install it locally.
I guess the trouble is if you use a special syntax like `import http.`, then one has to modify the source of the application to install it locally, no?
May 07 2010
next sibling parent Walter Bright <newshound1 digitalmars.com> writes:
Bernard Helyer wrote:
 On 08/05/10 11:25, Walter Bright wrote:
 If you want to work off line, then you can do the extra steps necessary
 to install it locally.
I guess the trouble is if you use a special syntax like `import http.`, then one has to modify the source of the application to install it locally, no?
Yes. Or perhaps a compiler switch.
May 07 2010
prev sibling next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Bernard Helyer wrote:
 I guess the trouble is if you use a special syntax like `import http.`, 
 then one has to modify the source of the application to install it 
 locally, no?
I'd like to emphasize that this idea is specifically about not having to install anything. You just type the correct package name in the import declaration, and the compiler system takes care of the rest.
May 07 2010
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Walter Bright wrote:
 Bernard Helyer wrote:
 I guess the trouble is if you use a special syntax like `import 
 http.`, then one has to modify the source of the application to 
 install it locally, no?
I'd like to emphasize that this idea is specifically about not having to install anything. You just type the correct package name in the import declaration, and the compiler system takes care of the rest.
He meant something else - the fact that changing one's mind about a package's location (remote vs. local) entails doing surgery on the source code. Andrei
May 07 2010
prev sibling parent reply BCS <none anon.com> writes:
Hello Walter,

 You just type the correct package name in the
 import declaration, and the compiler system takes care of the rest.
How about a package->source mapping?

  dmd -I my.corp=http://mycorp.com/sourcerepo

  import my.corp.foo; // looked for at http://mycorp.com/sourcerepo.foo.d

That way when things move, or corps merge, you only need to update the mappings. -- ... <IXOYE><
May 07 2010
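BCS's mapping idea could be resolved with something like the sketch below, where the longest matching package prefix wins. Both the longest-prefix rule and the slash-separated URL layout are assumptions about how such a flag might behave, not a description of any existing tool:

```d
// Hypothetical resolution of "-I my.corp=http://mycorp.com/sourcerepo"
// style mappings: find the longest package prefix that matches the
// import, then translate the remaining components into a path.
import std.algorithm.searching : startsWith;
import std.array : replace;

string resolve(string moduleName, string[string] mappings)
{
    string bestPrefix;
    foreach (prefix, root; mappings)
        if (moduleName.startsWith(prefix ~ ".") && prefix.length > bestPrefix.length)
            bestPrefix = prefix;

    if (bestPrefix.length == 0)
        return null; // no mapping: fall back to the normal import path

    auto rest = moduleName[bestPrefix.length + 1 .. $]; // skip the dot
    return mappings[bestPrefix] ~ "/" ~ rest.replace(".", "/") ~ ".d";
}
```

Keeping the remote location out of the import declaration is exactly what makes "when things move, you only update the mappings" possible.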
next sibling parent Lutger <lutger.blijdestijn gmail.com> writes:
BCS wrote:

 Hello Walter,
 
 You just type the correct package name in the
 import declaration, and the compiler system takes care of the rest.
How about a package->source mapping?

  dmd -I my.corp=http://mycorp.com/sourcerepo

  import my.corp.foo; // looked for at http://mycorp.com/sourcerepo.foo.d

That way when things move, or corps merge, you only need to update the mappings.
I like that idea; naming specific remote locations of a package all over the place in the source code seems a bit flaky. This way you can also more easily specify whether you want to use a local install only, download as needed, or a 'lazy-install' which installs as needed, checking for updates, etc. It starts to look a bit like dsss :)

The specific package -> source mappings could be located in any one of:
- a pragma
- the command line
- a repository listing, either remotely or locally specified
May 08 2010
prev sibling parent "Lars T. Kyllingstad" <public kyllingen.NOSPAMnet> writes:
On Sat, 08 May 2010 02:07:29 +0000, BCS wrote:

 Hello Walter,
 
 You just type the correct package name in the import declaration, and
 the compiler system takes care of the rest.
How about a package->source mapping?

  dmd -I my.corp=http://mycorp.com/sourcerepo

  import my.corp.foo; // looked for at http://mycorp.com/sourcerepo.foo.d

That way when things move, or corps merge, you only need to update the mappings.
I think that's a really good idea. -Lars
May 08 2010
prev sibling parent "Simen kjaeraas" <simen.kjaras gmail.com> writes:
Bernard Helyer <b.helyer gmail.com> wrote:

 On 08/05/10 11:25, Walter Bright wrote:
 Rainer Deyke wrote:
 I like this. The only question is, how do you handle computers without
 an internet connection?
It can also be a LAN connection if the sys admin sets up a local repository. But essentially, what we are talking about is distribution over the internet and a central internet repository, so it won't work if you're not connected. It's basically cloud compiling. If you want to work off line, then you can do the extra steps necessary to install it locally.
I guess the trouble is if you use a special syntax like `import http.`, then one has to modify the source of the application to install it locally, no?
Depends on the system, of course. The local install could still keep the URI information, e.g. by means of directory structure or a library configuration file. -- Simen
May 07 2010
prev sibling parent BCS <none anon.com> writes:
Hello Walter,

 Rainer Deyke wrote:
 
 I like this.  The only question is, how do you handle computers
 without an internet connection?
 
It can also be a LAN connection if the sys admin sets up a local repository. But essentially, what we are talking about is distribution over the internet and a central internet repository, so it won't work if you're not connected. It's basically cloud compiling. If you want to work off line, then you can do the extra steps necessary to install it locally.
A "the internet is here: <path>" flag would do the trick. -- ... <IXOYE><
May 07 2010
prev sibling next sibling parent reply Michel Fortin <michel.fortin michelf.com> writes:
On 2010-05-07 13:55:34 -0400, Walter Bright <newshound1 digitalmars.com> said:

 Source code could look something like:
 
      import http.d_repository.foo.version1_23;
 
 and the compiler could interpret "http" as meaning the rest is an 
 internet url, foo is the package name, and version1_23 is the 
 particular version of it.
So now, each time a new version of a library pops up you need to search-replace the version number for all your source code, and the source code of every other library you depend on? This is insane. The version number shouldn't be there, except perhaps if it's a 'major' version number full of breaking changes. Also, putting in the source code the location or protocol to fetch the repository isn't much better. There's a reason we have a module import path: so that finding external code depends on compile-time configuration, not on the actual code you build. Allowing URLs in the import path might be an interesting idea though. -- Michel Fortin michel.fortin michelf.com http://michelf.com/
May 07 2010
next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Michel Fortin wrote:
 On 2010-05-07 13:55:34 -0400, Walter Bright <newshound1 digitalmars.com> 
 said:
 
 Source code could look something like:

      import http.d_repository.foo.version1_23;

 and the compiler could interpret "http" as meaning the rest is an 
 internet url, foo is the package name, and version1_23 is the 
 particular version of it.
So now, each time a new version of a library pops up you need to search-replace the version number for all your source code, and the source code of every other library you depend on? This is insane. The version number shouldn't be there, except perhaps if it's a 'major' version number full of breaking changes.
If you leave the version number off, it gets the latest version. If you put it on, it reliably stays the same. I don't see an issue.
 Also, putting in the source code the location or protocol to fetch the 
 repository isn't much better. There's a reason we have a module import 
 path: so that finding external code depends on compile-time 
 configuration, not on the actual code you build.
It's a good point, but I think it's a detail.
 Allowing URLs in the import path might be an interesting idea though.
May 07 2010
parent Michel Fortin <michel.fortin michelf.com> writes:
On 2010-05-07 19:27:52 -0400, Walter Bright <newshound1 digitalmars.com> said:

 Michel Fortin wrote:
 On 2010-05-07 13:55:34 -0400, Walter Bright <newshound1 digitalmars.com> said:
 
 Source code could look something like:
 
      import http.d_repository.foo.version1_23;
 
 and the compiler could interpret "http" as meaning the rest is an 
 internet url, foo is the package name, and version1_23 is the 
 particular version of it.
So now, each time a new version of a library pops up you need to search-replace the version number for all your source code, and source code of other library you depend on? This is insane. The version number shouldn't be there, except perhaps if it's a 'major' version number full of breaking changes.
If you leave the version number off, it gets the latest version. If you put it on, it reliably stays the same. I don't see an issue.
Well, as soon as you have two modules trying to import a different version of the same module you'll have trouble. What if module main imports foo version 1, foo imports bar, and bar imports foo latest version? It's no issue as long as you don't use the feature, but I can't figure out how you can use it reliably. Versioning generally needs to happen at a bigger granularity than modules, because most of the time modules in the same project depend on each other. (Read below for another approach at versioning.)
 Also, putting in the source code the location or protocol to fetch the 
 repository isn't much better. There's a reason we have a module import 
 path: so that finding external code depends on compile-time 
 configuration, not on the actual code you build.
It's a good point, but I think it's a detail.
It's important, it's a security issue. You need to be in control of the code you use. This code will run on your computer, and you might distribute it to others. If I compile a library and its source code triggers the compiler into automatically downloading the newest version of a given module from somewhere on the internet before including it in my program, that is a potential threat if "somewhere on the internet" isn't trusted by those who develop the software.

I like my proposal of extending the import path. We could for instance associate module package names to paths or remote URLs like this:

  michelf.*  http://d.michelf.com/repo/
  *          /usr/local/d/
  *          http://dsource.org/repo/

Ok, that's more in the form of a configuration file, but you get the point: you can tell that files from the package michelf should be searched for at the given URL, while every other file must be searched for in /usr/local/d/, and if that fails, on dsource.org.

If someone wants to use a local copy, he replaces URLs with local paths as we currently do, or if he wants to use his own proxy server (where libraries presumably get audited and approved) he uses that URL. If someone wants a different version for a given package, he specifies a different URL or path for that version. The setting will apply to a whole package (and its subpackages), and every import from every module will use the version you asked for, not just the module that wanted a specific version.

Example configuration:

  michelf.reflex.*  http://d.michelf.com/repo/1.1/
  michelf.*         http://d.michelf.com/repo/trunk/
  *                 /usr/local/d/
  *                 http://dsource.org/repo/

-- Michel Fortin michel.fortin michelf.com http://michelf.com/
May 07 2010
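Michel's ordered configuration amounts to first-match pattern lookup. A minimal sketch, handling only trailing ".*" wildcards and ignoring the fall-through-on-miss behaviour the duplicate "*" entries would need, might look like this (all of it illustrative, since no such tool exists):

```d
// First matching pattern wins, so specific packages can be pinned to one
// repository while "*" supplies the fallback. A real tool would try the
// next matching entry when a fetch fails; this sketch stops at the first.
import std.algorithm.searching : endsWith, startsWith;
import std.typecons : Tuple;

alias Entry = Tuple!(string, "pattern", string, "location");

string lookup(string moduleName, Entry[] config)
{
    foreach (entry; config)
    {
        if (entry.pattern == "*")
            return entry.location;
        if (entry.pattern.endsWith(".*")
            && moduleName.startsWith(entry.pattern[0 .. $ - 1])) // keep the dot
            return entry.location;
    }
    return null;
}
```

Because entries are checked in order, listing "michelf.reflex.*" before "michelf.*" is what pins one subpackage to a different version while the rest of the package tracks trunk.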
prev sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Michel Fortin wrote:
 On 2010-05-07 13:55:34 -0400, Walter Bright <newshound1 digitalmars.com> 
 said:
 
 Source code could look something like:

      import http.d_repository.foo.version1_23;

 and the compiler could interpret "http" as meaning the rest is an 
 internet url, foo is the package name, and version1_23 is the 
 particular version of it.
So now, each time a new version of a library pops up you need to search-replace the version number for all your source code, and the source code of every other library you depend on? This is insane.
Not at all. Some builds want to peg the code against a specific version, so they encode that in the path. Some others may just want to use the latest, so they'd import http.d_repository.foo.current. On the server, current is always aliased to the latest. This scheme is used everywhere on Unix.
 The version number shouldn't be there, except perhaps if it's a 'major' 
 version number full of breaking changes.
When I delivered my book, my publisher asked me to send them all LaTeX packages I used. I replied, "they're the system-provided packages coming with LiveTeX x.y.z". They insisted they need the exact files so they have them. It's not at all uncommon (albeit sad) that software is built for a very specific version of some software. My employer has huge issues m
 Also, putting in the source code the location or protocol to fetch the 
 repository isn't much better. There's a reason we have a module import 
 path: so that finding external code depends on compile-time 
 configuration, not on the actual code you build.
That's a good point. At a minimum, there should be the possibility to define aliases like this:

  alias http.erdani.com.tdpl.code tdpl;
  ...
  import tdpl.stuff;

Then the alias definition becomes a unique point of maintenance. Unfortunately something like that doesn't currently work, so if the online modules feature is introduced, we need to introduce such aliases as well.
 Allowing URLs in the import path might be an interesting idea though.
Yah. Unfortunately other languages don't have it so it's difficult to learn from others' experience. Andrei
May 07 2010
prev sibling next sibling parent BCS <none anon.com> writes:
Hello Walter,

 I kind of like the idea that it shouldn't install D packages, but
 rather cache them from the web repository.
 
One absolute must in my book is that it be trivial to get a build working without any external dependencies. Tweaking lots of import paths is not trivial. Some users won't use a system that isn't set up for that. Two things that come to mind are 1) a mandate that "everything needed to build must be checked into source control" and 2) a project where the owner, for security reasons, wanted to be sure that all external dependencies (compiler, libs, etc.) predated the code that depended on them. -- ... <IXOYE><
May 07 2010
prev sibling parent reply Robert Clipsham <robert octarineparrot.com> writes:
On 07/05/10 18:55, Walter Bright wrote:
 Jacob Carlborg wrote:
 Should it also contain something similar to rdmd?
I kind of like the idea that it shouldn't install D packages, but rather cache them from the web repository. It would be convenient because:

1. who actually cares about installing the packages
2. backups are automatic
3. your actual project is small and easily moved to another machine
4. it becomes trivial to use

Source code could look something like:

    import http.d_repository.foo.version1_23;

and the compiler could interpret "http" as meaning the rest is an internet url, foo is the package name, and version1_23 is the particular version of it.
How about:

  remote import foo.bar.baz.ver1_23;

Then passing -R "http://repo.example.com/tag/%VERSION%/%PACKAGE%/" as a compiler/build tool switch (see below)? This way you don't have to mangle URLs to be valid D identifiers, the http.* namespace doesn't get eaten, and the user can specify a custom layout to get the given package/version.

Doing it this way makes it completely possible for this kind of tool not to be built into the compiler: "remote" can be eaten and ignored by the compiler, but added to the JSON output. Then something such as xfbuild can check for it and get the package as an archive, check it out from an svn/hg/git/etc repository, and so on, removing the need for a load of funky code to handle it in the compiler.
May 08 2010
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Robert Clipsham" <robert octarineparrot.com> wrote in message 
news:hs4f66$1a1i$1 digitalmars.com...
 On 07/05/10 18:55, Walter Bright wrote:
 Jacob Carlborg wrote:
 Should it also contain something similar to rdmd?
I kind of like the idea that it shouldn't install D packages, but rather cache them from the web repository. It would be convenient because:

1. who actually cares about installing the packages
2. backups are automatic
3. your actual project is small and easily moved to another machine
4. it becomes trivial to use

Source code could look something like:

import http.d_repository.foo.version1_23;

and the compiler could interpret "http" as meaning the rest is an internet URL, foo is the package name, and version1_23 is the particular version of it.
How about:

@remote import foo.bar.baz.ver1_23;

Then passing -R "http://repo.example.com/tag/%VERSION%/%PACKAGE%/" as a compiler/build tool (see below) switch? This way you don't have to mangle URLs to be valid D identifiers, the http.* namespace doesn't get eaten, and the user can specify a custom layout to get the given package/version.

Doing it this way makes it completely possible for this kind of tool not to be built into the compiler: @remote can be eaten and ignored by the compiler, but added to the JSON output. Then something such as xfbuild can check for it and fetch the package as an archive, check it out from an svn/hg/git/etc. repository, and so on, removing the need for a load of funky code to handle it in the compiler.
I don't see what benefit having the "@remote" there provides. Why not just:

import foo.bar.baz.ver1_23;

dmd -R:foo.bar.baz=http://repo.example.com/tag/%VERSION%/%PACKAGE%/
May 08 2010
parent reply Robert Clipsham <robert octarineparrot.com> writes:
On 08/05/10 21:26, Nick Sabalausky wrote:
 I don't see what benefit having the "@remote" there provides. Why not just:

 import foo.bar.baz.ver1_23;

 dmd -R:foo.bar.baz=http://repo.example.com/tag/%VERSION%/%PACKAGE%/
Good point! Doing this means it can be done now too, no language changes needed (except maybe to remove the .ver1_23... how would the compiler know when to do this though?). Other than that, I like it!
May 08 2010
parent reply BCS <none anon.com> writes:
Hello Robert,

 no language
 changes needed (except maybe to remove the .ver1_23... how would the
 compiler know when to do this though?).
How about a pragma:

pragma(ver, "1.23", "$") import foo.bar.baz; // imports foo.bar.baz, requires a version at or later than 1.23

-- ... <IXOYE><
May 09 2010
parent reply "Nick Sabalausky" <a a.a> writes:
"BCS" <none anon.com> wrote in message 
news:a6268ff137a98ccbd7270f72004 news.digitalmars.com...
 Hello Robert,

 no language
 changes needed (except maybe to remove the .ver1_23... how would the
 compiler know when to do this though?).
How about a pragma:

pragma(ver, "1.23", "$") import foo.bar.baz; // imports foo.bar.baz, requires a version at or later than 1.23
I have a mixed opinion on that. On one hand, being able to specify either a specific version or an arbitrary range matches real-world cases much better than "any version" vs "this exact version". However, different programs and libs use different versioning conventions. For example: are "v1.1" and "v1.100" the same, or is the latter much newer? Which comes first, "v1.100" or "v1.99"?

Also, I can imagine certain programs may be able to work with more complex ranges. For instance, maybe FooApp can use BarLib's "v1.x" branch as long as it's at least "v1.7", and it can also use any version of the "v2.x" branch from "v2.1" through "v2.5", but "v2.4.3" through "v2.4.8" are known to have problems.

So maybe trying to allow ranges (even one as simple as "at least vX.X") is just too complicated and should be left to static if(), and the "helping the compiler automatically download/install a dependency" feature should be limited to a choice between one specific known-working version or the latest version. So, something roughly like:

enum suggestedBarLib = "http://www.barlib.com/packages/barlib-v2.6.dstone";
pragma(ifMissingGetFrom, suggestedBarLib)
import barlib;

static if(!IsAcceptableBarLibVersion!(BarLibVersion))
{
    pragma(importFrom, suggestedBarLib);
}

And then maybe some sort of sugar could be added later (maybe it could all be wrapped up in a mixin).
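The ordering ambiguity raised above ("v1.99" vs "v1.100") is easy to demonstrate (an illustrative sketch, not part of any proposed tool; the `numeric_parts` helper is hypothetical):

```python
# Lexicographic and numeric comparison disagree about version order,
# which is exactly the "v1.99" vs "v1.100" ambiguity: as strings,
# "1.99" sorts AFTER "1.100", but numerically 99 < 100.
def numeric_parts(v):
    """Split '1.100' into (1, 100) for component-wise comparison."""
    return tuple(int(p) for p in v.split("."))

a, b = "1.99", "1.100"
print(a < b)                                # False: as strings, "1.99" sorts later
print(numeric_parts(a) < numeric_parts(b))  # True: numerically, 1.99 precedes 1.100
```

Any tool that compares versions has to pick one of these conventions (or let the repository decide), which is why deferring the policy server-side, as suggested later in the thread, is attractive.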
May 09 2010
parent reply BCS <none anon.com> writes:
Hello Nick,

 "BCS" <none anon.com> wrote in message
 news:a6268ff137a98ccbd7270f72004 news.digitalmars.com...
 
 pragma(ver, "1.23", "$") import foo.bar.baz; // imports foo.bar.baz,
 requires a version at or later than 1.23
 
I have a mixed opinion on that. On one hand, being able to specify either a specific version or an arbitrary range matches real-world cases much better than "any version" vs "this exact version". However, different programs and libs use different versioning conventions. For example: are "v1.1" and "v1.100" the same, or is the latter much newer? Which comes first, "v1.100" or "v1.99"? Also, I can imagine certain programs may be able to work with more complex ranges. For instance, maybe FooApp can use BarLib's "v1.x" branch as long as it's at least "v1.7", and it can also use any version of the "v2.x" branch from "v2.1" through "v2.5", but "v2.4.3" through "v2.4.8" are known to have problems.
Good point. Dealing with that client side might be an issue, so how about doing it server side? Define a way to map the given string (or strings) to something that can be handed off to the server, and let the server do whatever it wants with it. The simplest one would be Nick's suggestion of allowing %VERSION% or the like in the mapping specifications.

OTOH, how do you deal with several libraries asking for different versions? The simplest option would be to allow the pragma to take an ordered list of preferences and have DMD try to find a common version.

-- ... <IXOYE><
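The "ordered list of preferences" idea could be sketched like so (a hypothetical helper, not anything DMD actually does; it also resolves the X-or-Y vs. Y-or-Z case discussed below in the thread):

```python
# Hypothetical sketch: pick a common version when several libraries each
# supply an ordered preference list. Returns the first version in the
# first list that every other list also accepts, or None (a compile error).
def common_version(preference_lists):
    """Return the first mutually acceptable version, or None."""
    first, *rest = preference_lists
    for v in first:
        if all(v in prefs for prefs in rest):
            return v
    return None  # no common version: report a compilation error

lib_a = ["2.1", "1.7"]   # A prefers 2.1, can fall back to 1.7
lib_b = ["1.7", "1.6"]   # B was never tested against the 2.x branch
print(common_version([lib_a, lib_b]))  # → 1.7
```

This is deliberately naive (it treats versions as opaque strings), but it shows why allowing each library only a single acceptable version makes conflicts far more likely than allowing a list.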
May 09 2010
parent reply "Nick Sabalausky" <a a.a> writes:
"BCS" <none anon.com> wrote in message 
news:a6268ff137cd8ccbd85902a1798 news.digitalmars.com...
 Good point. Dealing with that client side might be an issue. So how about 
 do it server side? Define a way to map the given string (or strings) to 
 something that can be handed off to the server and let the server do 
 whatever it wants with it. The simplest one would be Nick's suggestion of 
 allowing %VERSION% or the like in the mappings specifications.
Actually, that was Robert's (very good) idea.
 OTOH how do you deal with several libraries asking for different versions?
Can you clarify? I'm not sure what you mean by that.
 The simplest option would be to allow the pragma to take an ordered list of 
 preferences and have DMD try to find a common version.
May 09 2010
parent reply BCS <none anon.com> writes:
Hello Nick,

 "BCS" <none anon.com> wrote in message
 news:a6268ff137cd8ccbd85902a1798 news.digitalmars.com...
 
 OTOH how do you deal with several libraries asking for different
 versions?
 
Can you clarify? I'm not sure what you mean by that.
My program imports lib A and B. Lib A imports lib C and asks for version "X". Lib B imports lib C and asks for version "!X". Who wins?
 The simplest option would be to allow the pragma to take an ordered
 list of preferences and have DMD try to find a common version.
 
-- ... <IXOYE><
May 09 2010
parent reply Walter Bright <newshound1 digitalmars.com> writes:
BCS wrote:
 My program imports lib A and B. Lib A imports lib C and asks for version 
 "X". Lib B imports lib C and asks for version "!X". Who wins?
Compilation error.
May 10 2010
parent reply BCS <none anon.com> writes:
Hello Walter,

 BCS wrote:
 
 My program imports lib A and B. Lib A imports lib C and asks for
 version "X". Lib B imports lib C and asks for version "!X". Who wins?
 
Compilation error.
Exactly. If there were a way for A to ask for X or Y and B to ask for Y or Z, then the solution is easy: Y. -- ... <IXOYE><
May 11 2010
parent reply BCS <none anon.com> writes:
Hello BCS,

 Hello Walter,
 
 BCS wrote:
 
 My program imports lib A and B. Lib A imports lib C and asks for
 version "X". Lib B imports lib C and asks for version "!X". Who
 wins?
 
Compilation error.
Exactly. If there were a way for A to ask for X or Y and B to ask for Y or Z, then the solution is easy: Y.
And to finish the thought: a system that only allows a program to ask for a single version is worse than one that doesn't allow any or allows many. -- ... <IXOYE><
May 11 2010
next sibling parent Walter Bright <newshound1 digitalmars.com> writes:
BCS wrote:
 And to finish the thought: a system that only allows a program to ask 
 for a single version is worse than one that doesn't allow any or allows 
 many.
We've been around this block already several times :-(
May 15 2010
prev sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
BCS <none anon.com> wrote:
 Hello BCS,
 
 Hello Walter,
 BCS wrote:
 My program imports lib A and B. Lib A imports lib C and asks
 for
version "X". Lib B imports lib C and asks for version "!X". Who wins?
 Compilation error.
Exactly. If there were a way for A to ask for X or Y and B to ask for
Y or Z, then the solution is easy: Y.
And to finish the thought: a system that only allows a program to ask for a single version is worse than one that doesn't allow any or allows many.
How does .NET work? I recall versioning being an integral part of the design.
May 16 2010
parent reply "Nick Sabalausky" <a a.a> writes:
"Sean Kelly" <sean invisibleduck.org> wrote in message 
news:1012172047295742031.093272sean-invisibleduck.org news.digitalmars.com...
 BCS <none anon.com> wrote:
 Hello BCS,

 Hello Walter,
 BCS wrote:
 My program imports lib A and B. Lib A imports lib C and asks
 for
version "X". Lib B imports lib C and asks for version "!X". Who wins?
 Compilation error.
Exactly. If there were a way for A to ask for X or Y and B to ask for
Y or Z, then the solution is easy: Y.
And to finish the thought: a system that only allows a program to ask for a single version is worse than one that doesn't allow any or allows many.
How does .NET work? I recall versioning being an integral part of the design.
IIRC, when MS said that, it turned out they were really just talking about things like the "deprecated" keyword and designing the OO polymorphism so that changing one thing won't accidentally break something else (ex: a mandatory "override" keyword when overriding a base class function). I found it confusing too that they referred to that as "versioning".
May 17 2010
parent Jacob Carlborg <doob me.com> writes:
On 5/17/10 20:35, Nick Sabalausky wrote:
 "Sean Kelly"<sean invisibleduck.org>  wrote in message
 news:1012172047295742031.093272sean-invisibleduck.org news.digitalmars.com...
 BCS<none anon.com>  wrote:
 Hello BCS,

 Hello Walter,
 BCS wrote:
 My program imports lib A and B. Lib A imports lib C and asks
 for
version "X". Lib B imports lib C and asks for version "!X". Who wins?
 Compilation error.
Exactly. If there were a way for A to ask for X or Y and B to ask for
Y or Z, then the solution is easy: Y.
And to finish the thought: a system that only allows a program to ask for a single version is worse than one that doesn't allow any or allows many.
How does .NET work? I recall versioning being an integral part of the design.
IIRC, when MS said that, it turned out they were really just talking about things like the "deprecated" keyword and designing the OO polymorphism so that changing one thing won't accidentally break something else (ex: a mandatory "override" keyword when overriding a base class function). I found it confusing too that they referred to that as "versioning".
Isn't it possible to put some version information in the binary? Found this: http://msdn.microsoft.com/en-us/library/4w8c1y2s.aspx
May 18 2010
prev sibling parent reply "Simen kjaeraas" <simen.kjaras gmail.com> writes:
Robert Clipsham <robert octarineparrot.com> wrote:

 How about:

  @remote import foo.bar.baz.ver1_23;
Why invent a new keyword? Surely this is a match for extern:

extern import foo.bar.baz.ver1_23;

-- Simen
May 08 2010
next sibling parent Robert Clipsham <robert octarineparrot.com> writes:
On 08/05/10 21:35, Simen kjaeraas wrote:
 Why invent a new keyword? Surely this is a match for extern:

 extern import foo.bar.baz.ver1_23;
It's not a keyword, hence the @ :) That said, extern could be a match, if a keyword is needed (see Nick's response).
May 08 2010
prev sibling parent BCS <none anon.com> writes:
Hello Simen,

 Robert Clipsham <robert octarineparrot.com> wrote:
 
 How about:
 
  @remote import foo.bar.baz.ver1_23;
 
Why invent a new keyword? Surely this is a match for extern: extern import foo.bar.baz.ver1_23;
For that matter, why even hard-code that info into the code at all? -- ... <IXOYE><
May 09 2010