
digitalmars.D - Deterministic Memory Management With Standard Library Progress

reply Anthony <AnthonyMonterrosa zoho.com> writes:
I've been having difficulty finding an up-to-date answer to this 
question, so I figured I'd ask the forum: can deterministic 
memory management be done in D, without losing any main features? 
I ask this because I know it's technically possible, but the 
methods that are suggested in the answers I've read always 
suggest the avoidance of most of the standard library. I know 
that there is an effort to reverse the reliance on the GC, but I 
don't know how far that's gotten in the last six months. Would 
anyone be able to give me more information?

To give context to my question, I don't have a problem with GCs, 
and this question isn't stemming from a C++ background. I've been 
told to learn C++ though, due to its efficiency and power.

It feels like D is a better choice for me to learn than C++; it's 
ostensibly C++ but with various amounts of baggage or unfavorable 
quirks removed. But I'm wary to learn if one of the major 
proponents for C++ usage, deterministic memory management, isn't 
directly supported. I check back here every few months or so, but 
now I just can't find anything new.
Mar 04 2017
next sibling parent Dukc <ajieskola gmail.com> writes:
On Saturday, 4 March 2017 at 18:09:10 UTC, Anthony wrote:
 [snip]
It can be done. The C standard library, and thus malloc(), calloc() and free(), comes with D's standard library. There are also more high-level ways to do it: std.typecons.scoped, std.experimental.allocator and Dlib (a dub package), to name a few. You do not have to give up any part of D's standard library to do that.

That being said, manual memory management is recommended only if you have a specific reason to use it, because the D compiler cannot verify the memory safety of code doing such things. But remember that neither can C++. In D you can still at least have the compiler verify those parts of the code where you don't manage memory manually, so you're definitely better off than in C++ in this regard.

Value types initialized directly on the stack are deterministically destroyed without having to compromise @safe. (@safe means code which the compiler verifies for memory safety. It isn't perfect, but close enough to catch almost all errors.) Of course, not everything can be a value type, because some data needs to have a variable size.

Scoped pointers are an upcoming feature which should, as I understand it, allow deterministic memory management with reference types too, in @safe code. It is already implemented, but it still seems to me to have too many unfinished corner cases to do its job yet, and the standard library is not yet compliant with that feature. I believe, though, that it will become much better.

That is all assuming that "deterministic" means deterministic freeing of the memory and calling of destructors.
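To make that concrete, here is a minimal sketch (the Widget class is purely illustrative) showing the two simplest options mentioned above: raw malloc/free from the C standard library, and std.typecons.scoped for a stack-placed class instance whose destructor runs deterministically:

    import core.stdc.stdlib : malloc, free;
    import std.typecons : scoped;

    class Widget
    {
        ~this() { /* released deterministically at end of scope */ }
    }

    void main()
    {
        // C-style manual allocation: freed exactly where we say so
        int* buf = cast(int*) malloc(10 * int.sizeof);
        scope(exit) free(buf);

        // scoped!T places the class instance on the stack; its destructor
        // runs when 'w' goes out of scope, not when the GC eventually collects
        auto w = scoped!Widget();
    }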
Mar 04 2017
prev sibling next sibling parent reply cym13 <cpicard openmailbox.org> writes:
On Saturday, 4 March 2017 at 18:09:10 UTC, Anthony wrote:
 I've been having difficulty finding an up-to-date answer to 
 this question, so I figured I'd ask the forum: can 
 deterministic memory management be done in D, without losing 
 any main features? I ask this because I know it's technically 
 possible, but the methods that are suggested in the answers 
 I've read always suggest the avoidance of most of the standard 
 library. I know that there is an effort to reverse the reliance 
 on the GC, but I don't know how far that's gotten in the last 
 six months. Would anyone be able to give me more information?

 To give context to my question, I don't have a problem with 
 GCs, and this question isn't stemming from a C++ background. 
 I've been told to learn C++ though, due to its efficiency and 
 power.

 It feels like D is a better choice for me to learn than C++; 
 it's ostensibly C++ but with various amounts of baggage or 
 unfavorable quirks removed. But I'm wary to learn if one of the 
 major proponents for C++ usage, deterministic memory 
 management, isn't directly supported. I check back here every 
 few months or so, but now I just can't find anything new.
Well, you said it, the key point is "without losing main features". There's quite a chunk of the standard library that is @nogc, almost everything in std.algorithm for example (with the exception of one function IIRC, minor enough that I can't remember which). Aside from that, most modules providing some form of reading or encoding also provide two sets of methods: one that takes an external buffer which you manage yourself, the other that automatically allocates a (GC-ed) buffer to alleviate the need to manage it manually.

The other problem with the GC in the standard library is exceptions: as we have GC-ed exceptions by default (to avoid having to free them manually), no function using exceptions can be @nogc. However, one should realistically note that allocation of exceptions is and should stay exceptional, so there are cases where you could decide to use a cast to force an exception-using function into a @nogc scope.

Finally, GC problems are completely exaggerated. The GC only runs when it is used, so having it manage exceptions only, for example, is completely viable, and it is possible to detach threads if you don't want them to be seen by the GC. Using the GC globally and avoiding it locally in hot spots has proved to be a successful strategy within the community. So I wouldn't worry too much; the important thing is to give it a try.
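To illustrate the @nogc point, here is a minimal sketch (the function is just an example, not something from Phobos): the lazy range algorithms in std.algorithm allocate nothing, so the whole thing compiles under @nogc:

    import std.algorithm : filter, sum;

    @nogc nothrow
    int sumOfEvens(const(int)[] data)
    {
        // filter is lazy and allocates nothing, so this stays @nogc
        return data.filter!(x => x % 2 == 0).sum;
    }

    void main() @nogc nothrow
    {
        int[4] values = [1, 2, 3, 4];
        assert(sumOfEvens(values[]) == 6);
    }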
Mar 04 2017
parent reply Inquie <Inquie data1.com> writes:
 Finally GC problems are completely exaggerated. It only runs 
 when used so having it to manage exceptions only for example is 
 completely viable, and it is possible to detach threads if 
 you don't want them to be seen by the GC. Using GC globally and 
 avoiding it locally in hot spots has proved to be a 
 successful strategy within the community. So I wouldn't worry 
 too much, giving it a try is paramount.
Please stop making this argument. Just because one can amortize out the effect of the GC does not in any way change the fact that it is a stop-the-world GC, and for certain applications this is provably a show stopper. One can come up with any number of commercial apps that fail using D's GC.

Some applications require near real-time behavior, and D's GC does not provide any bounds on how long it runs... regardless of any average behavior (which is meaningless when it only takes once to kill a person, create an audio glitch, etc.). All these apps work fine with deterministic memory management or possibly other methods, but not D's GC. It is not a viable solution just because it is a solution for you. When you start writing apps that simply do not function to the standards set by the customer, and this is all due to D's GC, then you will experience why it is bad. Until then, you won't have a clue. It doesn't matter if it works 99.99% of the time; with applications that may run for months at a time and have critical behavior constraints, 99.99% doesn't look that great.

So, please don't push your ideals, motivations, and experiences on others, and just be honest with them: D's GC sucks for near real-time applications, and it has problems. If one avoids the GC as best one can using traditional techniques, it minimizes the effect of the GC and makes the app closer to real time.
Mar 04 2017
next sibling parent reply cym13 <cpicard openmailbox.org> writes:
On Sunday, 5 March 2017 at 00:06:04 UTC, Inquie wrote:
 Finally GC problems are completely exaggerated. It only runs 
 when used so having it to manage exceptions only for example 
 is completely viable, and it is possible to detach 
 threads if you don't want them to be seen by the GC. Using GC 
 globally and avoiding it locally in hot spots has proved to 
 be a successful strategy within the community. So I wouldn't 
 worry too much, giving it a try is paramount.
Please stop making this argument. Just because one can amortize out the effect of the GC does not in any way change the fact that it is a stop-the-world GC, and for certain applications this is provably a show stopper. One can come up with any number of commercial apps that fail using D's GC. Some applications require near real-time behavior, and D's GC does not provide any bounds on how long it runs... regardless of any average behavior (which is meaningless when it only takes once to kill a person, create an audio glitch, etc.). All these apps work fine with deterministic memory management or possibly other methods, but not D's GC. It is not a viable solution just because it is a solution for you. When you start writing apps that simply do not function to the standards set by the customer, and this is all due to D's GC, then you will experience why it is bad. Until then, you won't have a clue. It doesn't matter if it works 99.99% of the time; with applications that may run for months at a time and have critical behavior constraints, 99.99% doesn't look that great. So, please don't push your ideals, motivations, and experiences on others, and just be honest with them: D's GC sucks for near real-time applications, and it has problems. If one avoids the GC as best one can using traditional techniques, it minimizes the effect of the GC and makes the app closer to real time.
I completely agree that there are cases where having the GC is a no-go, and I have even seen a number of projects doing fine without it. But exactly as you say, just because it is a problem for you doesn't mean it's a problem for everyone. Clearly the OP hasn't tried D yet, so as far as I'm concerned it looks like his concern about the GC comes from criticism he may have read elsewhere, not from an actual use case that makes the GC impossible to use. Given that, I feel like reinforcing the point that, indeed, it works great for most applications.
Mar 04 2017
parent reply Anthony <AnthonyMonterrosa zoho.com> writes:
On Sunday, 5 March 2017 at 00:44:26 UTC, cym13 wrote:
 On Sunday, 5 March 2017 at 00:06:04 UTC, Inquie wrote:
 Finally GC problems are completely exaggerated. It only runs 
 when used so having it to manage exceptions only for example 
 is completely viable, and it is possible to detach 
 threads if you don't want them to be seen by the GC. Using GC 
 globally and avoiding it locally in hot spots has proved to 
 be a successful strategy within the community. So I wouldn't 
 worry too much, giving it a try is paramount.
Please stop making this argument. Just because one can amortize out the effect of the GC does not in any way change the fact that it is a stop-the-world GC, and for certain applications this is provably a show stopper. One can come up with any number of commercial apps that fail using D's GC. Some applications require near real-time behavior, and D's GC does not provide any bounds on how long it runs... regardless of any average behavior (which is meaningless when it only takes once to kill a person, create an audio glitch, etc.). All these apps work fine with deterministic memory management or possibly other methods, but not D's GC. It is not a viable solution just because it is a solution for you. When you start writing apps that simply do not function to the standards set by the customer, and this is all due to D's GC, then you will experience why it is bad. Until then, you won't have a clue. It doesn't matter if it works 99.99% of the time; with applications that may run for months at a time and have critical behavior constraints, 99.99% doesn't look that great. So, please don't push your ideals, motivations, and experiences on others, and just be honest with them: D's GC sucks for near real-time applications, and it has problems. If one avoids the GC as best one can using traditional techniques, it minimizes the effect of the GC and makes the app closer to real time.
I completely agree that there are cases where having the GC is a no-go, and I have even seen a number of projects doing fine without it. But exactly as you say, just because it is a problem for you doesn't mean it's a problem for everyone. Clearly the OP hasn't tried D yet, so as far as I'm concerned it looks like his concern about the GC comes from criticism he may have read elsewhere, not from an actual use case that makes the GC impossible to use. Given that, I feel like reinforcing the point that, indeed, it works great for most applications.
I've learned the basics of D. I read the tutorial book, as I would call it, and some further tutorials on templates and other cool things. I just don't feel comfortable investing significant effort acquainting myself further with the language without some guarantee that the feature will eventually be completely supported.

In a way, I'm picking a tool for my toolbelt, and C++ and D are competing tools. D looks like C++ 2.0, but it's also missing a critical function of C++. So, I'm conflicted.

I plan on replying to the other answers, by the way; I just figured I'd wait a day or so to digest all the feedback together. But I do appreciate the effort from everyone so far, regardless of disagreements.
Mar 04 2017
parent reply Moritz Maxeiner <moritz ucworks.org> writes:
On Sunday, 5 March 2017 at 00:58:44 UTC, Anthony wrote:
 [...]

 I've learned the basics of D. I read the tutorial book, as I 
 would call it, and some further tutorials on templates and 
 other cool things. I just don't feel comfortable investing a 
 significant effort acquainting myself further with the language 
 without some guarantee that the feature will be completely 
 supported eventually.
What do you consider complete support in this context? druntime, phobos, both?

You can definitely write an application where all heap memory (after druntime initialization) is allocated (and deallocated) deterministically, provided you don't use language builtins that require GC allocations (druntime) and stay away from other people's code that allocates using the GC (this includes those parts of phobos). std.experimental.allocator even provides a nice, generic interface for this (you'll want to use one of the allocators other than GCAllocator, though).

Considering D development is - AFAIK - not primarily driven by people paid for their work, I doubt you'll get a guarantee on any future development, though.
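As a rough sketch of that interface (the Point struct is just an example), allocating through Mallocator never touches the GC heap and is released exactly where you decide:

    import std.experimental.allocator : make, dispose;
    import std.experimental.allocator.mallocator : Mallocator;

    struct Point { double x, y; }

    void main()
    {
        // allocation goes through malloc, not the GC heap
        Point* p = Mallocator.instance.make!Point(1.0, 2.0);
        // deterministic: destroyed and freed right here, at scope exit
        scope(exit) Mallocator.instance.dispose(p);

        p.x = 3.0;
    }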
 In a way, I'm picking a tool for my toolbelt, and C++ and D are 
 competing tools.
If possible, don't pick one, pick both (and to be even more annoying: also pick some Lisp, Erlang, Haskell, and Rust to get exposed to many different types of abstraction).
 D looks like C++ 2.0, but it's missing a critical function of 
 it as well. So, I'm conflicted.
If you're referring to deterministic memory management, it's not; the function is there, it's just up to you to actually use it and not invoke the GC. If you're referring to not all of phobos' functions being compatible with deterministic memory management (as opposed to stdc++), then yes, that's an ongoing effort.
Mar 04 2017
parent reply Anthony <AnthonyMonterrosa zoho.com> writes:
On Sunday, 5 March 2017 at 01:41:47 UTC, Moritz Maxeiner wrote:
 On Sunday, 5 March 2017 at 00:58:44 UTC, Anthony wrote:
 [...]

 I've learned the basics of D. I read the tutorial book, as I 
 would call it, and some further tutorials on templates and 
 other cool things. I just don't feel comfortable investing a 
 significant effort acquainting myself further with the 
 language without some guarantee that the feature will be 
 completely supported eventually.
What do you consider complete support in this context? druntime, phobos, both? You can definitely write an application where all heap memory (after the druntime initialization) is allocated (and deallocated) deterministically, provided you don't use language builtins that require GC allocations (druntime) or stay away from other people's code that allocates using the GC (this includes those parts of phobos). std.experimental.allocator even provides a nice, generic interface for this (you'll want to use one of the allocators that aren't GCAllocator, though). Considering D development is - AFAIK - not primarily driven by people paid for their work I doubt you'll get a guarantee on any future development, though.
 In a way, I'm picking a tool for my toolbelt, and C++ and D 
 are competing tools.
If possible, don't pick one, pick both (and to be even more annoying: also pick some Lisp, Erlang, Haskell, and Rust to get exposed to many different types of abstraction).
 D looks like C++ 2.0, but it's missing a critical function of 
 it as well. So, I'm conflicted.
If you're referring to deterministic memory management, it's not; the function is there, it's just up to you to actually use it and not invoke the GC. If you're referring to not all of phobos' functions being compatible with deterministic memory management (as opposed to stdc++), then yes, that's an ongoing effort.
Not having it guaranteed is understandable, albeit slightly disappointing.

I would pick both if I had the time to do so. I'm a college student; with that in mind, I can only really learn one right now without giving up most of my free time. I think it'd be stressful if I tried.

I was referring to Phobos. I feel intimidated by the idea of trying to code some of the functions of Phobos myself in a no-GC manner. I'm sure I'd run into things way out of my knowledge domain.
Mar 04 2017
next sibling parent reply Moritz Maxeiner <moritz ucworks.org> writes:
On Sunday, 5 March 2017 at 04:36:27 UTC, Anthony wrote:
 [...]

 Not having it guaranteed is understandable, albeit slightly 
 disappointing.
That is the downside of community-driven development, I'm afraid.
 I would pick both, if I had the time to do so. I'm a college 
 student; with that in mind, I can only really learn one right 
 now without giving up most of my free time. I think it'd be 
 stressful if I tried.
As a college student nearing the end of his studies I can totally understand that.
 I was referring to phobos. I feel intimidated by the idea of 
 trying to code some of the functions of phobos myself in a 
 no-gc manner. I'm sure I'd run into things way out of my 
 knowledge domain.
I get that, though usually Phobos code is fairly readable if you've got a firm grasp of D templates.
Mar 05 2017
parent reply Anthony <AnthonyMonterrosa zoho.com> writes:
On Sunday, 5 March 2017 at 13:32:04 UTC, Moritz Maxeiner wrote:
 I was referring to phobos. I feel intimidated by the idea of 
 trying to code some of the functions of phobos myself in a 
 no-gc manner. I'm sure I'd run into things way out of my 
 knowledge domain.
I get that, though usually Phobos code is fairly readable if you've got a firm grasp of D templates.
Hmm. This is encouraging news to hear, given that templates were the most recent topic I learned in D. Maybe I'll just have to give the GC-reliant areas a read, although I don't know what those areas would be; AFAIK I'd have to look for @nogc tags, which sounds tedious.

As for Guillaume Piolat's advice:
 - Use struct RAII wrappers and destructors to release 
 resources. Typically structs with disabled postblit. However 
 very often resources are classes objects because they feature 
 virtual dispatch.
- In destructors, call .destroy on members that are class 
objects (reverse order of their creation if possible). Also on 
heap allocated struct but those are rare. The idea is that when 
the GC kicks in, unreachable objects on the heap should already 
have been destroyed.
- Don't let the GC release resources, with the "GC-proof 
resource class idiom". I remember Andrei pushing for the GC not 
to call destructors, to no avail. So the GC is calling 
destructors and you should use it as a _warning that something 
is wrong_ and determinism wasn't respected, not as a safety 
measure and a way to actually release resources (because it's 
wrong thread, wrong timing, unreliable).
- Like in C++, if your ownership is in the case where it's 
shared, you can introduce artificial owners (eg: arena pool owns 
arena pool items).
- The resources that are only memory need no owner and can be 
managed by the GC.         (eg: strings, a ubyte[] array...)
When you have deterministic destruction, it's simple to offload 
work from the GC to manual memory management. You don't lose ANY 
main feature.
The GC is an efficient way to call free(), and a global owner 
not a way to release resources.
This is also encouraging! Thanks for the concrete advice. But, I think due to my inexperience, I don't really understand some of it. Specifically, the part about only-memory resources like strings and arrays being manageable by the GC, and arena pools. I can just look these things up, but I figured it'd be easier to ask first.
Mar 06 2017
parent Guillaume Piolat <first.last gmail.com> writes:
On Monday, 6 March 2017 at 16:10:51 UTC, Anthony wrote:
 This is also encouraging! Thanks for the concrete advice. But, 
 I think due to my inexperience, to don't really understand some 
 of your advice. Specifically, the only-memory recourses like 
 strings and arrays can be managed by the GC, and arena pools. I 
 can just look these things up, but I figured it'd be easy to 
 ask first.
(Arena pools are an allocator type, unrelated to your question, though they are used to lessen GC usage.)

When I speak about "only-memory" resources, I mean resources that only hold memory transitively:

    class MyClass
    {
        string a;
        ubyte[] arr;
        float p;
    }

There is not much reason to destroy - or release the memory of - a MyClass deterministically, since it only references memory. A global owner like the GC will do. In C++ you would find an owner for each std::string, but in D you can rely on the GC for releasing them.

But you shalt not rely on the GC for closing files, releasing mutexes, closing sockets, etc. Any object that holds such a resource (or holds an object that holds such a resource) is better handled in a manual/RAII way instead. Memory allocated with malloc is such a resource too.

In my mind, D programs use a hybrid style of ownership: GC + manual/RAII (aka "scoped ownership"). So it's more complex than just "scoped ownership" like in C++.
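A minimal sketch of that hybrid style (the class names and the log file are purely illustrative): memory-only objects are left to the GC, while an object holding a non-memory resource is destroyed deterministically with destroy:

    import core.stdc.stdio : FILE, fopen, fclose;

    class Config
    {
        string name;     // memory only: fine to leave to the GC
        ubyte[] payload; // memory only as well
    }

    class Connection
    {
        FILE* log; // a non-memory resource: needs deterministic release
        this() { log = fopen("conn.log", "a"); }
        ~this() { if (log) { fclose(log); log = null; } }
    }

    void main()
    {
        auto cfg = new Config;       // GC-owned, no explicit cleanup needed
        auto conn = new Connection;
        scope(exit) conn.destroy();  // destructor runs here, not "whenever"
    }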
Mar 07 2017
prev sibling parent reply Wyatt <wyatt.epp gmail.com> writes:
On Sunday, 5 March 2017 at 04:36:27 UTC, Anthony wrote:
 I would pick both, if I had the time to do so. I'm a college 
 student; with that in mind, I can only really learn one right 
 now without giving up most of my free time. I think it'd be 
 stressful if I tried.
This is fair, but, speaking from the field: learning how to JIT-learn and pick up languages quick is a valuable skill that you will never stop using. It's always worth reminding yourself that languages are cheap; the conceptual underpinnings are what's important. -Wyatt
Mar 06 2017
parent reply Anthony <AnthonyMonterrosa zoho.com> writes:
On Monday, 6 March 2017 at 17:21:15 UTC, Wyatt wrote:
 On Sunday, 5 March 2017 at 04:36:27 UTC, Anthony wrote:
 I would pick both, if I had the time to do so. I'm a college 
 student; with that in mind, I can only really learn one right 
 now without giving up most of my free time. I think it'd be 
 stressful if I tried.
This is fair, but, speaking from the field: learning how to JIT-learn and pick up languages quick is a valuable skill that you will never stop using. It's always worth reminding yourself that languages are cheap; the conceptual underpinnings are what's important. -Wyatt
I guess I should have said this earlier, but I will not be working in the coding industry. Well, at least that's not the plan. I'm a math and physics major as well, and hope that something works out in those fields. Personally, I enjoy coding but not maintaining others' code. Consequently, I think most coding jobs would leave me unhappy, so I plan on using coding as a personal hobby/skill/tool. With that in mind, C++ knowledge doesn't have any inherent value for me; I just want a tool that can do what C++ can do. Hence, D (almost).

-------------------------

On the main topic, I was peeking around D's blog and found an interview of Walter Bright by Joakim, an interviewer for the D blog. One of the questions asked Walter for his response to the "@nogc" crowd, and he says:
It became clear that the garbage collector wasn't needed to be 
embedded in most things, that memory allocation could be decided 
separately from the algorithm. Letting the user decide seemed 
like a great way forward.
He then shares some information about work on extern(C++) exception support, and says that @nogc support remains a high priority. So, it sounds like this is a concrete, expectable goal. And, if that really is true, I'm okay with that. I don't mind the idea of waiting; I just don't want to invest time in a tool only to realize it's not the tool I was looking for.

Until this is implemented, I have a few questions that might give me some closure:

Would it be possible/practical to use another language's code within a D program, like Rust for example, when deterministic memory management is necessary? It feels like this would be easier than finagling with D to get a similar outcome.

Does the GC run regardless of whether it's used? Someone alluded to this earlier, but I just wanted clarity. If I write @nogc code, will the GC just stand by idly? Or is there an option to turn it off completely?

Future thanks for any help.
Mar 06 2017
parent bachmeier <no spam.net> writes:
On Monday, 6 March 2017 at 17:54:25 UTC, Anthony wrote:
 Would it be possible/practical to use another language's code 
 within a  D program, like Rust for example, when deterministic 
 memory management is necessary? It feels like this would be 
 easier than finagling with D to get a similar outcome.
Yes. You can call Rust code from C: https://doc.rust-lang.org/book/ffi.html#calling-rust-code-from-c You can do the same to call it from D. I have never done it, but I did call Rust from Ruby at one point.
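For a rough idea of what that looks like on the D side (the rust_add function and the Rust library are hypothetical, not taken from the linked page): declare the C-ABI symbol the Rust library exports and link against it:

    // Hypothetical Rust side, built as a staticlib/cdylib:
    //   #[no_mangle]
    //   pub extern "C" fn rust_add(a: i32, b: i32) -> i32 { a + b }

    // D side: declare the C-ABI symbol, then link against the Rust library.
    extern(C) int rust_add(int a, int b);

    void main()
    {
        import std.stdio : writeln;
        writeln(rust_add(2, 3)); // 5
    }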
 Does the GC run regardless of if it's used? Someone alluded to 
 this earlier, but I just wanted clarity. If I write @nogc code, 
 will the GC just stand by idly? Or, is there an option to 
 completely turn it off?
If you don't allocate with the GC, the GC doesn't run. Code marked @nogc is guaranteed not to cause the GC to run. You can turn the GC off using GC.disable, but there are extreme cases in which it can still run.
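A minimal sketch of that last point (the surrounding code is just an example): GC.disable suppresses collections for a critical section and scope(exit) re-enables them; allocations still succeed, they just won't trigger a pause in between.

    import core.memory : GC;

    void main()
    {
        GC.disable();            // no collection will be triggered...
        scope(exit) GC.enable(); // ...until we leave this scope

        // allocations still succeed; they simply won't start a collection
        auto buffer = new ubyte[](4096);
    }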
Mar 06 2017
prev sibling parent Guillaume Piolat <first.last gmail.com> writes:
On Sunday, 5 March 2017 at 00:06:04 UTC, Inquie wrote:
 Please stop making this argument. Just because one can amortize 
 out the effect of the GC does not in any way change the fact 
 that it is a stop the world GC and for certain applications 
 this is provably a show stopper. One can come up with any 
 number of commercial apps that fail using D's GC.
Which of these commercial apps fail because of the D GC?
Mar 05 2017
prev sibling next sibling parent bachmeier <no spam.net> writes:
On Saturday, 4 March 2017 at 18:09:10 UTC, Anthony wrote:
 To give context to my question, I don't have a problem with 
 GCs, and this question isn't stemming from a C++ background. 
 I've been told to learn C++ though, due to its efficiency and 
 power.

 It feels like D is a better choice for me to learn than C++; 
 it's ostensibly C++ but with various amounts of baggage or 
 unfavorable quirks removed. But I'm wary to learn if one of the 
 major proponents for C++ usage, deterministic memory 
 management, isn't directly supported. I check back here every 
 few months or so, but now I just can't find anything new.
Having learned C++ before D, I would argue (others will disagree) that even if your goal is to learn C++, you should start with D. You want to learn certain concepts and ways of thinking. If you start with C++, you have to spend your time learning the rough edges of C++ and what not to do, and that really does interfere with your learning. It's faster to learn D and then figure out how to do the same thing in C++ than to battle with the unpleasantness of C++ from the start. I regret all the hours I wasted on C++.

It's not really accurate to say someone should avoid D because of GC/memory management. You can call C++ from D, so while the cost-benefit analysis might favor C++ in some cases, there's no reason you can't write your program in D and then call into C++ when absolutely necessary. I do the same thing, except that I call into C more than C++. Just my 2 cents as someone who does numerical programming.

Ultimately, if you're looking at it from the perspective of the job market, it really doesn't make sense to spend time on D. If the goal is to learn, it doesn't make sense to spend time on C++.
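To illustrate the "call into C++ when necessary" point, a minimal sketch (the clamp_int function and its C++ source are hypothetical): D can link against C++ free functions directly via extern(C++):

    // Hypothetical C++ side (compiled separately with a C++ compiler):
    //   int clamp_int(int v, int lo, int hi)
    //   {
    //       return v < lo ? lo : (v > hi ? hi : v);
    //   }

    // D side: extern(C++) matches the C++ name mangling, so it links directly.
    extern(C++) int clamp_int(int v, int lo, int hi);

    void main()
    {
        import std.stdio : writeln;
        writeln(clamp_int(42, 0, 10)); // 10
    }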
Mar 04 2017
prev sibling next sibling parent Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Saturday, 4 March 2017 at 18:09:10 UTC, Anthony wrote:
 To give context to my question, I don't have a problem with 
 GCs, and this question isn't stemming from a C++ background. 
 I've been told to learn C++ though, due to its efficiency and 
 power.
I think you should start with "What kind of programs do I want to write?" rather than what language to choose, then pick the best language for that domain.

But if you want to learn C++, then starting with the basic C subset and adding features from C++ one by one is the best alternative. If learning C++ is your goal, you need to come to terms with well-thought-out memory management strategies. Of the non-C++ languages, Rust is possibly the one that could give you some structure and training here, as the Rust compiler enforces what you should try to achieve in C++ with unique_ptr (roughly the same memory model in principle).

Also, there are many variants of C++ (C++17/C++11, C++03, C++98...), which lead to very different programming idioms. It takes many years to become proficient in C++. I would estimate that it will take 1-2 years for someone already proficient in C++98 to become proficient in C++17.
Mar 05 2017
prev sibling parent Guillaume Piolat <first.last gmail.com> writes:
On Saturday, 4 March 2017 at 18:09:10 UTC, Anthony wrote:
 I've been having difficulty finding an up-to-date answer to 
 this question, so I figured I'd ask the forum: can 
 deterministic memory management be done in D, without losing 
 any main features?
First, let's admit that C++ determinism is a strong point and that D fares a bit worse on this.

I'll give you the simple rules for deterministic destruction now (which is required for C++-style deterministic memory management). If you follow these rules, your program's destruction will be deterministic and you might even push into being exception-safe if you are so inclined. :)

- Use struct RAII wrappers and destructors to release resources. Typically structs with a disabled postblit (a minimal sketch follows after this list). However, very often resources are class objects because they feature virtual dispatch.

- In destructors, call .destroy on members that are class objects (in reverse order of their creation if possible). Do the same for heap-allocated structs, but those are rare. The idea is that when the GC kicks in, unreachable objects on the heap should already have been destroyed.

- Don't let the GC release resources, per the "GC-proof resource class idiom". I remember Andrei pushing for the GC not to call destructors, to no avail. So the GC does call destructors, and you should use that as a _warning that something is wrong_ and determinism wasn't respected, not as a safety measure and a way to actually release resources (because it's the wrong thread, the wrong timing, and it's unreliable).

- Like in C++, where ownership is shared, you can introduce artificial owners (e.g. an arena pool owns the arena pool items).

- Resources that are only memory need no owner and can be managed by the GC (e.g. strings, a ubyte[] array...).

When you have deterministic destruction, it's simple to offload work from the GC to manual memory management. You don't lose ANY main feature. The GC is an efficient way to call free() and a global owner, not a way to release resources. The problem of GC efficiency itself is separate.
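Here is a minimal sketch of the first rule, assuming a C FILE* as the resource (the File struct's name and layout are just illustrative): a struct with a disabled postblit owns the handle and releases it in its destructor, deterministically at end of scope:

    import core.stdc.stdio : FILE, fopen, fclose;

    struct File
    {
        FILE* handle;

        @disable this(this); // no postblit: a single, unambiguous owner

        this(const(char)* path, const(char)* mode)
        {
            handle = fopen(path, mode);
        }

        ~this()
        {
            // runs exactly when the struct goes out of scope
            if (handle) { fclose(handle); handle = null; }
        }
    }

    void main()
    {
        auto f = File("data.txt", "r");
        // ... use f.handle ...
    } // f's destructor closes the file right here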
Mar 05 2017