digitalmars.D - Automatic library download plan
- Adam Ruppe (44/44) Jun 25 2010 I was thinking this morning about a way to have a simple way for D
- Nick Sabalausky (9/14) Jun 25 2010 I haven't given much thought to the details of your proposal, but a coup...
- Adam Ruppe (7/12) Jun 25 2010 I remember reading about it in the newsgroup, but is there a link to a
- Ellery Newcomer (9/13) Jun 25 2010 No website, no details. Just a bare bones extract n compile. And a
- Jesse Phillips (3/8) Jun 25 2010 DSSS is close to being able to do what you want. It is able to download ...
- BLS (10/11) Jun 25 2010 IMHO DSSS is very bad designed. I really don't like to install a zillion...
- BCS (23/40) Jun 26 2010 very clean, very simple, but a few thoughts:
I was thinking this morning about a simple way to do D library distribution. Let me run it by you guys, and we'll discuss if it is a good plan. The short version is it takes a package -> URL mapping file and downloads over http, based on the .d file being in an expected folder.

The plan is simple. The program will take a file and parse out its dependencies, using dmd -v, like rdmd does.

Now, it opens a package mapping file. If this file doesn't exist locally, it can download it from a central server. The file looks like this:

arsd http://arsdnet.net/dcode
mylib http://dsource.org/libs
mylib.container http://domain.com
gui http://dsource.org/libs

It can be customized, or you could submit your library for the central list.

Now, say your program has:

import arsd.dom;
import mylib.container;
import mylib.whatever.mod;
import gui.window;

This would look up the most specific entry in the list and try to download it. So arsd.dom matches arsd, so it tries to download http://arsdnet.net/dcode/arsd/dom.d. mylib.container matches mylib.container, so it gets http://domain.com/mylib/container.d. mylib.whatever.mod's best match is the package mylib, so: http://dsource.org/libs/mylib/whatever/mod.d. And gui.window gets http://dsource.org/libs/gui/window.d.

It downloads them all to a folder, like package/ in the current directory, or somewhere else if you specify it. If a file is already in the local folder, it can skip redownloading it.

If a module requires additional command line options, like for external libraries, pragma(lib) or maybe some magic comments in the file can list them off. In my old makefile generator, I used the module line:

module whatever; // libs(sdl, sdl_audio)

And this would just be passed off to dmd on the command line. It wouldn't try to download those libraries or anything.

Would this work? I'm thinking it would do the job of the CPAN-style tool that people ask for every few months, and it is incredibly simple to implement.
I could take the submitted packages to the central listing and fetch ddoc off the url too, assuming the same naming convention, and add it to my simple search engine at dpldocs.info.
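The most-specific-match lookup the post describes can be sketched like this (Python used here as pseudocode for what would presumably be a D tool; the function names and the `#`-comment convention for the mapping file are my own illustration, not part of the proposal):

```python
def parse_mapping(text):
    """Parse a mapping file of 'package url' lines into a dict."""
    mapping = {}
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        package, url = line.split()
        mapping[package] = url
    return mapping

def resolve(module, mapping):
    """Find the most specific (longest) package prefix registered for a
    module, and build the download URL from the full module path."""
    parts = module.split(".")
    # For arsd.dom, try 'arsd.dom' first, then 'arsd'.
    for i in range(len(parts), 0, -1):
        prefix = ".".join(parts[:i])
        if prefix in mapping:
            return mapping[prefix] + "/" + "/".join(parts) + ".d"
    return None  # module is not covered by any mapping entry

MAP = parse_mapping("""
arsd http://arsdnet.net/dcode
mylib http://dsource.org/libs
mylib.container http://domain.com
gui http://dsource.org/libs
""")
```

With the mapping file from the post, `resolve("mylib.container", MAP)` picks the more specific `mylib.container` entry and yields `http://domain.com/mylib/container.d`, while `resolve("mylib.whatever.mod", MAP)` falls back to the `mylib` entry and yields `http://dsource.org/libs/mylib/whatever/mod.d`, matching the examples above.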
Jun 25 2010
"Adam Ruppe" <destructionator gmail.com> wrote in message news:mailman.216.1277497077.24349.digitalmars-d puremagic.com...

> I was thinking this morning about a way to have a simple way for D
> library distribution. Let me run it by you guys, and we'll discuss if
> it is a good plan. The short version is it takes a package -> URL
> mapping file and downloads on http, based on the .d file being in an
> expected folder.

I haven't given much thought to the details of your proposal, but a couple of suggestions:

- This should probably use DMDZ (or is it ZDMD?), or be built on top of it.

- It should probably be possible for the mapping file to be specified as a URL, and there should be a default official URL (at dsource.org or somewhere like that), plus a set of official backup mirror URLs (which could be updated whenever the system connects to the official URL).
Jun 25 2010
On 6/25/10, Nick Sabalausky <a a.a> wrote:

> - This should probably use DMDZ (or is it ZDMD?), or be built on top
> of it.

I remember reading about it in the newsgroup, but is there a link to a website or something? I don't remember any details.

> - It should probably be possible for the mapping file to be specified
> as a URL, and there should be a default official URL (at dsource.org
> or somewhere like that), and a set of official backup mirror URLs
> (which could be updated whenever the system connects to the official
> URL).

Yes, if there isn't a local map file, it would be fetched off the default url or whatever is specified in the config file. Some kind of --force-refresh option could bypass the local cache and always get a new mapping file and modules off the internet.
Jun 25 2010
On 06/25/2010 03:51 PM, Adam Ruppe wrote:

> On 6/25/10, Nick Sabalausky <a a.a> wrote:
>> - This should probably use DMDZ (or is it ZDMD?), or be built on top
>> of it.
>
> I remember reading about it in the newsgroup, but is there a link to a
> website or something? I don't remember any details.

No website, no details. Just a bare bones extract 'n' compile, and a readme which is going to win me a Nobel Prize in literature. Here: personal.utulsa.edu/~ellery-newcomer/dmdz/dmdz.zip

I don't know that it can do anything which a zip command can't, and it doesn't handle multiple zip files at the moment. And I was kind of hoping for admd or rdmd for archive dmd, but I haven't implemented anything for it to merit that name. I won't either, as long as we don't know what streams will look like in Phobos.
Jun 25 2010
Adam Ruppe Wrote:

> I was thinking this morning about a way to have a simple way for D
> library distribution. Let me run it by you guys, and we'll discuss if
> it is a good plan. The short version is it takes a package -> URL
> mapping file and downloads on http, based on the .d file being in an
> expected folder.

DSSS is close to being able to do what you want. It is able to download and install libraries from the web. It won't do it automatically, though. I think it would be good to have DSSS revived, or something similar to take over, maybe using xfbuild instead of rebuild.
Jun 25 2010
On 25/06/2010 23:17, Jesse Phillips wrote:

> DSSS is close to being able to do what you want. It is able to
> download and install libraries from the web. It won't do it
> automatically.

IMHO DSSS is very badly designed. I really don't like installing a zillion files as part of a too-complicated piece of software just to do very simple things. I think we could and should enhance rdmd for that task.

Ellery: not just streams.. std.sockets also requires a complete rewrite to enhance rdmd for automatic lib distribution and consumption.

Adam: Like your plan, ++vote

Bjoern

Besides, dsource hosting is questionable; I think we need (at least) a mirror.
Jun 25 2010
Hello Adam,

> I was thinking this morning about a way to have a simple way for D
> library distribution. Let me run it by you guys, and we'll discuss if
> it is a good plan. The short version is it takes a package -> URL
> mapping file and downloads on http, based on the .d file being in an
> expected folder.
>
> The plan is simple. The program will take a file and parse out its
> dependencies, using dmd -v, like rdmd does.
>
> Now, it opens a package mapping file. If this file doesn't exist
> locally, it can download it from a central server. The file looks
> like this:
>
> arsd http://arsdnet.net/dcode
> mylib http://dsource.org/libs
> mylib.container http://domain.com
> gui http://dsource.org/libs

very clean, very simple, but a few thoughts:

1) Allow more than http

I'd also allow things like ftp, svn (revision locked and head), git, hg, etc. Ideally this would be done via a simple "plugin" system: if the xxx:// is unknown to the base program, look for an xxx executable alongside the base program and pass off the url and the destination dir.

2) Allow local references

Don't make local sources a second class citizen. Think of people who are working on a patched lib version.

3) Use a staging approach

You weren't clear on whether everything would be globbed into one dir and passed as an import, or if each source gets its own dir and gets passed individually. I'd advocate the second, as I suspect it will make resolving conflicts easier (for instance, if the map file changes between downloads).

4) Have an "include" system

Allow one file to refer to another via any url that is valid for referring to a code source. For security reasons, there should be a way to white/black list sources and also to force the use of only cached copies.

5) Have an "OK each download" mode and make it the default

Most builds will not download new code, and those that do deserve extra scrutiny.

-- 
... <IXOYE><
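BCS's point (1), dispatching unknown URL schemes to an external helper executable named after the scheme, could look roughly like this (a Python sketch of the dispatch logic only; the function name, the set of built-in schemes, and the helper-lookup-on-PATH detail are my assumptions, not part of the thread's design):

```python
import shutil
from urllib.parse import urlparse

BUILTIN_SCHEMES = {"http", "ftp"}  # schemes the base program handles itself

def plan_fetch(url, dest_dir):
    """Decide how a url would be fetched: by the base program, or by a
    plugin executable named after the scheme (e.g. 'svn' for svn://)."""
    scheme = urlparse(url).scheme
    if scheme in BUILTIN_SCHEMES:
        return ("builtin", url, dest_dir)
    helper = shutil.which(scheme)  # look for a matching executable
    if helper:
        # Hand the url and destination dir off to the plugin, e.g.:
        # subprocess.run([helper, url, dest_dir], check=True)
        return ("plugin", helper, url, dest_dir)
    raise ValueError("no handler for scheme: " + scheme)
```

Looking the helper up on PATH (rather than strictly "alongside the base program", as BCS suggests) is a simplification for the sketch; either way, the base program never needs to know every protocol in advance.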
Jun 26 2010