digitalmars.D - You don't like GC? Do you?
Thread overview:
- aberba (Oct 11 2018)
- JN (Oct 11 2018)
- Adam Wilson (Oct 12 2018)
- Stanislav Blinov (Oct 12 2018)
- Neia Neutuladh (Oct 12 2018)
- Stanislav Blinov (Oct 12 2018)
- Neia Neutuladh (Oct 12 2018)
- Stanislav Blinov (Oct 12 2018)
- Nicholas Wilson (Oct 12 2018)
- Stanislav Blinov (Oct 12 2018)
- welkam (Oct 12 2018)
- Stanislav Blinov (Oct 12 2018)
- Atila Neves (Oct 12 2018)
- Stanislav Blinov (Oct 12 2018)
- rjframe (Oct 13 2018)
- Stanislav Blinov (Oct 13 2018)
- rjframe (Oct 15 2018)
- Stanislav Blinov (Oct 15 2018)
- Atila Neves (Oct 13 2018)
- Stanislav Blinov (Oct 13 2018)
- 12345swordy (Oct 13 2018)
- Stanislav Blinov (Oct 14 2018)
- 12345swordy (Oct 14 2018)
- Stanislav Blinov (Oct 14 2018)
- 12345swordy (Oct 15 2018)
- Stanislav Blinov (Oct 15 2018)
- 12345swordy (Oct 15 2018)
- Stanislav Blinov (Oct 15 2018)
- 12345swordy (Oct 15 2018)
- Stanislav Blinov (Oct 15 2018)
- 12345swordy (Oct 15 2018)
- Tony (Oct 14 2018)
- Eugene Wissner (Oct 15 2018)
- Tony (Oct 16 2018)
- Stanislav Blinov (Oct 16 2018)
- Nicholas Wilson (Oct 12 2018)
- Stanislav Blinov (Oct 12 2018)
- Dejan Lekic (Oct 12 2018)
- Stanislav Blinov (Oct 12 2018)
- bachmeier (Oct 12 2018)
- Atila Neves (Oct 12 2018)
- Stanislav Blinov (Oct 12 2018)
- Atila Neves (Oct 13 2018)
- rikki cattermole (Oct 13 2018)
- Stanislav Blinov (Oct 13 2018)
"It takes care of itself
-------------------------------
When writing a throwaway script that I will only use a handful of times, optimising that code isn't necessarily high on my priority list. The priority is to get it written, and get it running. That's where the V8 (C++) engine that NodeJS is compiled into throws you a bone. When you have no choice but to call arrays into memory and manipulate them, sometimes very very large arrays, you can begin to worry about the state of your machine and the amount of memory that is being used. Luckily, V8 handles automatic garbage collection. What this means is that whenever I have disregarded a block of information, say removed an index from an array, then that memory is automatically cleared and freed back up on the next sweep. While the process of collection and actually checking can be a bit intensive, it means when I am quickly iterating through code I don't need to pay a tremendous amount of attention to my memory management, and I can entrust V8 to handle all the little nuances."

Don't be a computer. Do more with GC.

https://medium.com/kieranmaher13/why-i-use-nodejs-for-basically-everything-i-do-e0a627787ecc
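For comparison, the pattern the quoted article describes looks like this in D, the language under discussion in this thread. A minimal sketch; `GC.collect` is called explicitly only to make the sweep visible, normally the runtime decides when to collect:

    import core.memory : GC;

    void main()
    {
        auto data = new int[1_000_000]; // GC-owned array
        data = null;                    // drop the only reference...
        GC.collect();                   // ...and it is reclaimed on a sweep
    }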
Oct 11 2018
On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:
> When writing a throwaway script that I will only use a handful of times, optimising that code isn't necessarily high on my priority list. The priority is to get it written, and get it running. That's where the V8 (C++) engine that NodeJS is compiled into throws you a bone.

That is fine, but if you want to position yourself as competition to languages like C, C++ and Rust, there are use cases where a GC might not be enough. Also, the quoted part mentions throwaway scripts, which D can be used for, but most people would use Python or Node.JS like in the article instead.
Oct 11 2018
On 10/11/18 11:20 PM, JN wrote:
> On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:
> [snip]
> That is fine, but if you want to position yourself as competition to languages like C, C++ and Rust, there are use cases where a GC might not be enough.

Does it though? The way I see it is that people who want to do what C/C++ does [...] better. For the C/C++ crowd, D's more involved semantics for non-GC code and its standard library (and library ecosystem) are ALWAYS going to be a turnoff. Where D shines is in its balance between the two extremes. If I want to [build from] the ground up, it will cost me about 10 years (see MSR's Singularity). IMHO D should focus on being the best possible D it can be. If we take care of D, the rest will attend to itself.

--
Adam Wilson
IRC: LightBender
import quiet.dlang.dev;
Oct 12 2018
On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:"It takes care of itself ------------------------------- When writing a throwaway script......there's absolutely no need for a GC. In fact, the GC runtime will only detract from performance.What this means is that whenever I have disregarded a block of information, say removed an index from an array, then that memory is automatically cleared and freed back up on the next sweep. While the process of collection and actually checkingWhich is just as easily achieved with just one additional line of code: free the memory.Don't be a computer. Do more with GC.Writing a throwaway script there's nothing stopping you from using mmap or VirtualAlloc. The "power" of GC is in the language support for non-trivial types, such as strings and associative arrays. Plain old arrays don't benefit from it in the slightest.
Oct 12 2018
On 10/12/2018 09:26 AM, Stanislav Blinov wrote:
> On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:
>> "It takes care of itself
>> -------------------------------
>> When writing a throwaway script...
> ...there's absolutely no need for a GC. In fact, the GC runtime will only detract from performance.

Throwaway scripts can allocate a lot of memory and have nontrivial running times. It's less common for scripts than for long-running processes, granted, but I've written scripts to go through gigabytes of data.

> Which is just as easily achieved with just one additional line of code: free the memory.

People demonstrably have trouble doing that. We can do it most of the time, but everyone occasionally forgets. Beyond that, the concept you're failing to mention here is ownership. You need to use your own mental effort to figure out what memory is owned by what part of the code. The GC lets you ignore that.

> Writing a throwaway script there's nothing stopping you from using mmap or VirtualAlloc. The "power" of GC is in the language support for non-trivial types, such as strings and associative arrays. Plain old arrays don't benefit from it in the slightest.

A string is a plain old array, and languages with manual memory management also support associative arrays.
Oct 12 2018
On Friday, 12 October 2018 at 17:31:30 UTC, Neia Neutuladh wrote:
> Throwaway scripts can allocate a lot of memory and have nontrivial running times. It's less common for scripts than for long-running processes, granted, but I've written scripts to go through gigabytes of data.

Your point being?.. It's not like you need a GC to allocate gigabytes of storage. With D it's super easy to just allocate a huge hunk and simply (literally) slice through it.

> People demonstrably have trouble doing that. We can do it most of the time, but everyone occasionally forgets.

The GC isn't a cure for forgetfulness. One can also forget to close a file or a socket, or I dunno, cancel a financial transaction. GC isn't magic. In fact, to use it correctly you need to pay *more* attention than when managing memory manually. Don't leave dangling pointers. Nurse uninitialized data. Massage it to not sweep in hot paths... People seem to forget that and advertise it as some sort of magic wand that does all you want without you having to think.

> Beyond that, the concept you're failing to mention here is ownership. You need to use your own mental effort to figure out what memory is owned by what part of the code. The GC lets you ignore that.

Nope, it doesn't. If you "forget" who owns the data, you may as well "forget" who writes it and when. Would GC help then as well? You need to expend pretty much the same effort to track that.

>> Writing a throwaway script there's nothing stopping you from using mmap or VirtualAlloc. The "power" of GC is in the language support for non-trivial types, such as strings and associative arrays. Plain old arrays don't benefit from it in the slightest.
> A string is a plain old array.

An ASCII string, perhaps. Not a Unicode one. Count statically-typed compiled languages with native strings, please.

> and languages with manual memory management also support associative arrays.

Of course they do. But again, are those built-in types?
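A sketch of the "allocate a huge hunk and slice through it" approach, written as a toy bump allocator; all names and sizes are illustrative, and alignment handling is omitted for brevity:

    import core.stdc.stdlib : malloc, free;

    // Carves typed slices out of one up-front allocation, no GC involved.
    struct Hunk
    {
        ubyte[] buf;  // the big allocation
        size_t used;  // bump offset

        T[] take(T)(size_t n)
        {
            auto bytes = n * T.sizeof;
            assert(used + bytes <= buf.length, "hunk exhausted");
            auto s = cast(T[]) buf[used .. used + bytes];
            used += bytes;
            return s;
        }
    }

    void main()
    {
        enum size = 1 << 20; // illustrative: 1 MiB
        auto p = cast(ubyte*) malloc(size);
        scope(exit) free(p);
        auto hunk = Hunk(p[0 .. size]);

        auto ints = hunk.take!int(100);   // slice of 100 ints
        auto dbls = hunk.take!double(50); // slice of 50 doubles
        ints[0] = 42;
        dbls[0] = 0.5;
    }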
Oct 12 2018
On 10/12/2018 11:14 AM, Stanislav Blinov wrote:
> Your point being?.. It's not like you need a GC to allocate gigabytes of storage. With D it's super easy to just allocate a huge hunk and simply (literally) slice through it.

Over the lifetime of the script, it processed more memory than my computer had. That means I needed a memory management strategy other than "allocate everything". The GC made that quite easy.

> The GC isn't a cure for forgetfulness. One can also forget to close a file or a socket, or I dunno, cancel a financial transaction.

By lines of code, programs allocate memory much more often than they deal with files or sockets or financial transactions. So anything that requires less discipline when dealing with memory will reduce bugs a lot, compared with a similar system dealing with sockets or files.

> GC isn't magic. In fact, to use it correctly you need to pay *more* attention than when managing memory manually. Don't leave dangling pointers. Nurse uninitialized data. Massage it to not sweep in hot paths... People seem to forget that and advertise it as some sort of magic wand that does all you want without you having to think.

It's good enough for a lot of people most of the time without thinking about things much. It reduces the frequency of problems and it eliminates use-after-free and double-free, which are sources of data corruption, which is hard to track down. And in the context of a one-off script, I'm probably not going to worry about using the GC efficiently as long as I'm not running out of memory.

> Nope, it doesn't. If you "forget" who owns the data, you may as well "forget" who writes it and when. Would GC help then as well? You need to expend pretty much the same effort to track that.

That's why we have the const system.
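A sketch of the script pattern being described, with a made-up input file: each iteration allocates a fresh GC string, the reference is dropped when the iteration ends, and the GC reclaims it on a later sweep, so the total allocated over the run can exceed physical RAM:

    import std.stdio : File, writeln;

    void main()
    {
        size_t total;
        // Illustrative: "huge.log" stands in for an input far larger than RAM.
        foreach (line; File("huge.log").byLine)
        {
            // .idup allocates a fresh GC string per record; once this
            // iteration ends nothing references it any more, and the GC
            // is free to reclaim it when memory runs low.
            string record = line.idup;
            total += record.length;
        }
        writeln(total);
    }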
Oct 12 2018
On Friday, 12 October 2018 at 18:50:26 UTC, Neia Neutuladh wrote:
> Over the lifetime of the script, it processed more memory than my computer had. That means I needed a memory management strategy other than "allocate everything". The GC made that quite easy.

Now *that* is a good point. Then again, until you run out of address space you're still fine with just plain old allocate-and-forget. Not that it's a good thing for production code, but for one-off scripts? Sure.

> By lines of code, programs allocate memory much more often than they deal with files or sockets or financial transactions. So anything that requires less discipline when dealing with memory will reduce bugs a lot, compared with a similar system dealing with sockets or files.

My point is it's irrelevant whether it's memory allocation or something else. If you allow yourself to slack on important problems, that habit *will* bite you in the butt in the future. But the other end of the spectrum is also harmful. That's how we get those "good" APIs such as XCB that fragment the hell out of your heap, force libc on you and make you collect their garbage.

> It's good enough for a lot of people most of the time without thinking about things much.

Python and other bastard languages that didn't want to concern themselves with the hardware all that much. 30 years of "progress" down the drain.

> It reduces the frequency of problems and it eliminates use-after-free

Not in D it doesn't. Unless you only ever write @safe code, in which case you're not in the "without thinking about things much" camp.

> and double-free, which are sources of data corruption, which is hard to track down.

Agreed.

> And in the context of a one-off script, I'm probably not going to worry about using the GC efficiently as long as I'm not running out of memory.

Sure, *that's* the appropriate message. Not the "use the GC, it's not as bad as you think".

>> If you "forget" who owns the data, you may as well "forget" who writes it and when. Would GC help then as well? You need to expend pretty much the same effort to track that.
> That's why we have the const system.

Oh please, really? Const in D? And you're still talking about people that don't like to think about things much?
Oct 12 2018
On Friday, 12 October 2018 at 19:43:02 UTC, Stanislav Blinov wrote:
> On Friday, 12 October 2018 at 18:50:26 UTC, Neia Neutuladh wrote:
>> Over the lifetime of the script, it processed more memory than my computer had. That means I needed a memory management strategy other than "allocate everything". The GC made that quite easy.
> Now *that* is a good point. Then again, until you run out of address space you're still fine with just plain old allocate-and-forget. Not that it's a good thing for production code, but for one-off scripts? Sure.
>> By lines of code, programs allocate memory much more often than they deal with files or sockets or financial transactions. So anything that requires less discipline when dealing with memory will reduce bugs a lot, compared with a similar system dealing with sockets or files.
> My point is it's irrelevant whether it's memory allocation or something else. If you allow yourself to slack on important problems, that habit *will* bite you in the butt in the future.

Freeing your mind and the codebase of having to deal with memory leaves it in an easier place to deal with the less common higher impact leaks: file descriptors, sockets, database handles etc. (this is like chopping down the forest so you can see the trees you care about ;) ).
Oct 12 2018
On Friday, 12 October 2018 at 19:55:02 UTC, Nicholas Wilson wrote:
> Freeing your mind and the codebase of having to deal with memory leaves it in an easier place to deal with the less common higher impact leaks: file descriptors, sockets, database handles etc. (this is like chopping down the forest so you can see the trees you care about ;) ).

That's done first and foremost by stripping out unnecessary allocations, not by writing "new" every other line and closing your eyes. I mean come on, it's 2018. We're writing code for multi-core and multi-processor systems with complex memory interaction. Precisely where in memory your data is, how it got there and how it's laid out should be bread and butter of any D programmer. It's true that it isn't critical for one-off scripts, but so is deallocation. Saying stuff like "do more with GC" is just outright harmful. Kids are reading, for crying out loud.
Oct 12 2018
On Friday, 12 October 2018 at 20:12:26 UTC, Stanislav Blinov wrote:
> Saying stuff like "do more with GC" is just outright harmful. Kids are reading, for crying out loud.

People in this thread mostly said that for some things GC is just awesome. When you need to get shit done fast and dirty GC saves time and mental capacity. Not all code deals with sockets, DB, bank transactions, multithreading, etc.
Oct 12 2018
On Friday, 12 October 2018 at 21:15:04 UTC, welkam wrote:
> People in this thread mostly said that for some things GC is just awesome. When you need to get shit done fast and dirty GC saves time and mental capacity. Not all code deals with sockets, DB, bank transactions, multithreading, etc.

Read the OP again then. What message does it send? What broad conclusion does it draw from a niche use case?
Oct 12 2018
On Friday, 12 October 2018 at 20:12:26 UTC, Stanislav Blinov wrote:
> On Friday, 12 October 2018 at 19:55:02 UTC, Nicholas Wilson wrote:
>> Freeing your mind and the codebase of having to deal with memory leaves it in an easier place to deal with the less common higher impact leaks: file descriptors, sockets, database handles etc. (this is like chopping down the forest so you can see the trees you care about ;) ).
> That's done first and foremost by stripping out unnecessary allocations, not by writing "new" every other line and closing your eyes.

D isn't Java. If you can, put your data on the stack. If you can't, `new` away and don't think about it. The chances you'll have to optimise the code are not high. If you do, the chances that the GC allocations are the problem are also not high. If the profiler shows they are... then remove those allocations.

> I mean come on, it's 2018. We're writing code for multi-core and multi-processor systems with complex memory interaction.

Sometimes we are. Other times it's a 50 line script.

> Precisely where in memory your data is, how it got there and how it's laid out should be bread and butter of any D programmer.

Of any D programmer writing code that's performance sensitive.

> It's true that it isn't critical for one-off scripts, but so is deallocation.

We'll agree to disagree.

> Saying stuff like "do more with GC" is just outright harmful.

Disagreement yet again.
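A sketch of the stack-first default being suggested here (the buffer size is arbitrary): use a fixed stack buffer when the data fits, and fall back to `new` only when it doesn't:

    void process(const(int)[] input)
    {
        int[256] stackBuf = void;         // stack storage, no allocation
        int[] work = input.length <= stackBuf.length
            ? stackBuf[0 .. input.length] // fits: use the stack
            : new int[input.length];      // doesn't fit: `new` away
        work[] = input[];                 // copy, then transform in place
        // ... use `work` ...
    }

    void main()
    {
        int[3] a = [1, 2, 3];
        process(a[]);
    }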
Oct 12 2018
On Friday, 12 October 2018 at 21:39:13 UTC, Atila Neves wrote:
> D isn't Java. If you can, put your data on the stack. If you can't, `new` away and don't think about it.

Then five years later, try and hunt down that mysterious heap corruption. Caused by some destructor calling into buggy third-party code. Didn't want to think about that one either?

> The chances you'll have to optimise the code are not high. If you do, the chances that the GC allocations are the problem are also not high. If the profiler shows they are... then remove those allocations.

>> I mean come on, it's 2018. We're writing code for multi-core and multi-processor systems with complex memory interaction.
> Sometimes we are. Other times it's a 50 line script.

There is no "sometimes" here. You're writing programs for specific machines. All. The. Time.

>> Precisely where in memory your data is, how it got there and how it's laid out should be bread and butter of any D programmer.
> Of any D programmer writing code that's performance sensitive.

All code is performance sensitive. Whoever invented that distinction should be publicly humiliated. If it's not speed, it's power consumption. Or memory. Or I/O. "Not thinking" about any of that means you're treating your power champion horse as if it was a one-legged pony. Advocating the "not thinking" approach makes you an outright evil person.
Oct 12 2018
On Fri, 12 Oct 2018 23:35:19 +0000, Stanislav Blinov wrote:
>>> Precisely where in memory your data is, how it got there and how it's laid out should be bread and butter of any D programmer.
>> Of any D programmer writing code that's performance sensitive.
> All code is performance sensitive. Whoever invented that distinction should be publicly humiliated. If it's not speed, it's power consumption. Or memory. Or I/O. "Not thinking" about any of that means you're treating your power champion horse as if it was a one-legged pony.

And sometimes it's programmer performance. Last year I had a malformed CSV I needed to manipulate; Excel couldn't handle it, and I couldn't (or don't know how to) trust a Vim macro to do it, so I wrote a small script in D. My design wasn't even close to high-performance, but it was easy to test (which was probably my biggest requirement); I probably could have spent another 30 minutes writing something that would have run two minutes faster, but that would have been inefficient. I didn't even keep the script; I'll never need it again. There are times when the easy or simple solution really is the best one for the task at hand.
Oct 13 2018
On Saturday, 13 October 2018 at 12:15:07 UTC, rjframe wrote:
> ...I didn't even keep the script; I'll never need it again. There are times when the easy or simple solution really is the best one for the task at hand.

And?.. Would you now go around preaching how awesome the GC is and that everyone should use it?
Oct 13 2018
On Sat, 13 Oct 2018 12:22:29 +0000, Stanislav Blinov wrote:
> On Saturday, 13 October 2018 at 12:15:07 UTC, rjframe wrote:
>> ...I didn't even keep the script; I'll never need it again. There are times when the easy or simple solution really is the best one for the task at hand.
> And?.. Would you now go around preaching how awesome the GC is and that everyone should use it?

For something like I did, yes. The article the OP links to may want GC for everything; the excerpt the OP actually quoted is talking about applications where memory management isn't the most important thing. I completely agree with that excerpt.
Oct 15 2018
On Monday, 15 October 2018 at 10:11:15 UTC, rjframe wrote:
> On Sat, 13 Oct 2018 12:22:29 +0000, Stanislav Blinov wrote:
>> And?.. Would you now go around preaching how awesome the GC is and that everyone should use it?
> For something like I did, yes. The article the OP links to may want GC for everything; the excerpt the OP actually quoted is talking about applications where memory management isn't the most important thing. I completely agree with that excerpt.

Yeah, well, what's the title of this thread, and what's the conclusion of that post? Automation is what literally all of us do. But you should not automate something you don't understand.
Oct 15 2018
On Friday, 12 October 2018 at 23:35:19 UTC, Stanislav Blinov wrote:
> Then five years later, try and hunt down that mysterious heap corruption. Caused by some destructor calling into buggy third-party code. Didn't want to think about that one either?

That hasn't happened to me.

> There is no "sometimes" here. You're writing programs for specific machines. All. The. Time.

I am not. The last time I wrote code for a specific machine it was for my 386, probably around 1995.

> All code is performance sensitive.

If that were true, nobody would write code in Python. And yet...

> If it's not speed, it's power consumption. Or memory. Or I/O.

Not if it's good enough as it is. Which, in my experience, is frequently the case. YMMV.

> "Not thinking" about any of that means you're treating your power champion horse as if it was a one-legged pony.

Yes. I'd rather the computer spend its time than I mine. I value the latter far more than the former.

> Advocating the "not thinking" approach makes you an outright evil person.

Is there a meetup for evil people now that I qualify? :P

https://www.youtube.com/watch?v=FVAD3LQmxbw&t=42
Oct 13 2018
On Saturday, 13 October 2018 at 13:17:41 UTC, Atila Neves wrote:
>> Then five years later, try and hunt down that mysterious heap corruption. Caused by some destructor calling into buggy third-party code. Didn't want to think about that one either?
> That hasn't happened to me.

It rarely does indeed. Usually it's someone else that has to sift through your code and fix your bugs years later. Because by that time you're long gone on another job, happily writing more code without thinking about it.

>> There is no "sometimes" here. You're writing programs for specific machines. All. The. Time.
> I am not. The last time I wrote code for a specific machine it was for my 386, probably around 1995.

Yes you are. Or what, you're running your executables on a 1990 issue calculator? :P Somehow I doubt that.

>> All code is performance sensitive.
> If that were true, nobody would write code in Python. And yet...

Nobody would write code in Python if Python didn't exist. That it exists means there's a demand. Because there are an awful lot of folks who just "don't want to think about it". Remember the 2000s? Everybody and their momma was a developer. Web developer, Python, Java, take your pick. Not that they knew what they were doing, but it was a good time to peddle crap. Now, Python in and of itself is not a terrible language. But people write *system* tools and scripts with it. WTF? I mean, if you couldn't care less how the machine works, you have *no* business developing *anything* for an OS.

>> If it's not speed, it's power consumption. Or memory. Or I/O.
> Not if it's good enough as it is. Which, in my experience, is frequently the case. YMMV.

That is not a reason to intentionally write *less* efficient code.

>> "Not thinking" about any of that means you're treating your power champion horse as if it was a one-legged pony.
> Yes. I'd rather the computer spend its time than I mine. I value the latter far more than the former.

And what if your code wastes someone else's time at some later point? Hell with it, not your problem, right?

>> Advocating the "not thinking" approach makes you an outright evil person.
> Is there a meetup for evil people now that I qualify? :P

Any gathering of like-minded programmers will do.
Oct 13 2018
On Saturday, 13 October 2018 at 14:43:22 UTC, Stanislav Blinov wrote:
> On Saturday, 13 October 2018 at 13:17:41 UTC, Atila Neves wrote:
>> [...]
> It rarely does indeed. Usually it's someone else that has to sift through your code and fix your bugs years later. Because by that time you're long gone on another job, happily writing more code without thinking about it.
> [...]

Not everyone has the time nor the skills for doing manual memory management. Even more so when correctness is way more important than speed. Not everything needs to be fast.
Oct 13 2018
On Saturday, 13 October 2018 at 21:44:45 UTC, 12345swordy wrote:
> Not everyone has the time nor the skills for doing manual memory management. Even more so when correctness is way more important than speed. Not everything needs to be fast.

That's the lamest excuse if I ever saw one. If you can't be bothered to acquire one of the most relevant skills for writing code for modern systems, then:

a) Ideally, you shouldn't be writing code
b) At the very least, you're not qualified to give any advice pertaining to writing code

PS. "Correctness" also includes correct use of the machine and its resources.
Oct 14 2018
On Sunday, 14 October 2018 at 07:51:09 UTC, Stanislav Blinov wrote:
> On Saturday, 13 October 2018 at 21:44:45 UTC, 12345swordy wrote:
>> Not everyone has the time nor the skills for doing manual memory management. Even more so when correctness is way more important than speed. Not everything needs to be fast.
> That's the lamest excuse if I ever saw one.

It's not an excuse, it's reality. The D language has multiple issues; the idea of having built-in language support for a GC is NOT one of them. It is a time saver for us as we are developing web apps. I find your side remarks to be very arrogant and condescending.
Oct 14 2018
On Sunday, 14 October 2018 at 20:26:10 UTC, 12345swordy wrote:
> It's not an excuse, it's reality. The D language has multiple issues; the idea of having built-in language support for a GC is NOT one of them.

Read this thread again then, carefully. You *have to* understand D's GC in order to use it correctly, efficiently, and safely. And to do that, you *have to* understand your data and what you're doing with it. And to do that you *have to* work with the machine, not in spite of it. At which point you may well reconsider using the GC in the first place. Or you may not. But at least that will be an informed decision based on actual value, not this "save time" fallacy.

> It is a time saver for us as we are developing web apps.

So you're in this for a quick buck, and to hell with everything and screw the heap wholesale, right?.. Save time writing code, waste time processing data. Cool choice.

> I find your side remarks to be very arrogant and condescending.

I'm arrogant, huh? It's people like you who think that "the" way to program is to produce crappy code fast. It's so funny how all of you guys seem to think that I'm against the GC. I'm not. I'm against stupid "advice" like the one given in the OP. Almost all of you seem like you're in the same boat: you don't give a flying duck about your impact on the industry.
Oct 14 2018
On Monday, 15 October 2018 at 00:02:31 UTC, Stanislav Blinov wrote:
> I'm arrogant, huh?

When you say statements like this:

> you don't give a flying duck about your impact on the industry.

it comes across as condescending and arrogant.
Oct 15 2018
On Monday, 15 October 2018 at 16:46:45 UTC, 12345swordy wrote:
> On Monday, 15 October 2018 at 00:02:31 UTC, Stanislav Blinov wrote:
>> I'm arrogant, huh?
> When you say statements like this:
>> you don't give a flying duck about your impact on the industry.
> it comes across as condescending and arrogant.

Yep, and everything else that's inconvenient you'd just cut out. Did I hit a nerve?..
Oct 15 2018
On Monday, 15 October 2018 at 17:30:28 UTC, Stanislav Blinov wrote:
> Yep, and everything else that's inconvenient you'd just cut out.

You mean the part where you straw-man me and resort to personal attacks? No need for me to address it.

> Did I hit a nerve?..

Case in point.
Oct 15 2018
On Monday, 15 October 2018 at 18:00:24 UTC, 12345swordy wrote:
> You mean the part where you straw-man me and resort to personal attacks? No need for me to address it.

Pfff... *I* am "straw man"-ing you? That's just hilarious. "Not everything needs to be fast" - I never said everything needs to be fast. I'm saying everything *doesn't need to be slow* due to lazy people doing lazy things because they "don't want to think about it". So who's straw-manning who, exactly? Do you even understand the difference?

By saying that you're more interested in saving your development time as opposed to processing time, *for web apps, no less*, you're admitting that you don't care about the consequences of your actions. You finding it "personal" only supports that assessment, so be my guest, be offended. Or be smart, and stop and think about what you're doing.

>> Did I hit a nerve?..
> Case in point.

If you want to have an argument, I suggest you stop quote mining and start paying attention.
Oct 15 2018
On Monday, 15 October 2018 at 19:57:59 UTC, Stanislav Blinov wrote:
> If you want to have an argument, I suggest you stop quote mining and start paying attention.

If you want an argument from me, then you need to stop with the "LOL YOU MAD BRO" rhetoric.
Oct 15 2018
On Monday, 15 October 2018 at 20:12:47 UTC, 12345swordy wrote:
> On Monday, 15 October 2018 at 19:57:59 UTC, Stanislav Blinov wrote:
>> If you want to have an argument, I suggest you stop quote mining and start paying attention.
> If you want an argument from me, then you need to stop with the "LOL YOU MAD BRO" rhetoric.

...and again he grabs a single quote. Look, so far all you've contributed to this thread is one poor excuse and a ton of victim act. Neither is of any particular use.
Oct 15 2018
On Monday, 15 October 2018 at 20:22:54 UTC, Stanislav Blinov wrote:
> Neither is of any particular use.

Pot called, he wants to see Mr. Kettle.
Oct 15 2018
On Sunday, 14 October 2018 at 07:51:09 UTC, Stanislav Blinov wrote:
> That's the lamest excuse if I ever saw one. If you can't be bothered to acquire one of the most relevant skills for writing code for modern systems, then:
>
> a) Ideally, you shouldn't be writing code
> b) At the very least, you're not qualified to give any advice pertaining to writing code
>
> PS. "Correctness" also includes correct use of the machine and its resources.

Ideally you wouldn't have chosen to even try D. You (and others who spend so much time arguing against garbage collection on a forum for a language designed with garbage collection) would be better off using a non-garbage-collected language.
Oct 14 2018
On Monday, 15 October 2018 at 05:26:56 UTC, Tony wrote:
> Ideally you wouldn't have chosen to even try D. You (and others who spend so much time arguing against garbage collection on a forum for a language designed with garbage collection) would be better off using a non-garbage-collected language.

He doesn't argue against garbage collection. And D is one of the few languages that can be used without a garbage collector, so it can be a non-garbage-collected language and can be used as such.
Oct 15 2018
On Monday, 15 October 2018 at 08:21:11 UTC, Eugene Wissner wrote:
> He doesn't argue against garbage collection.

Well, can you state what he does argue against?

> And D is one of the few languages that can be used without a garbage collector, so it can be a non-garbage-collected language and can be used as such.

Wouldn't C++ or Rust, with their smart pointers, be a better choice for someone who wants to use a compiles-to-object-code language, but can't suffer any garbage collector delays?
Oct 16 2018
On Tuesday, 16 October 2018 at 11:42:55 UTC, Tony wrote:
> On Monday, 15 October 2018 at 08:21:11 UTC, Eugene Wissner wrote:
>> He doesn't argue against garbage collection.

Thanks, Eugene, I was starting to lose hope in humanity.

> Well, can you state what he does argue against?

I did state what I was arguing against; if you actually read the thread and not only pick select statements, I'm sure you'll find it.

> Wouldn't C++ or Rust, with their smart pointers, be a better choice for someone who wants to use a compiles-to-object-code language, but can't suffer any garbage collector delays?

What is up with people and this thread? Who is talking about garbage collector delays? If you do use the GC, they're a given, and you work with them. *Just like you should with everything else*. I'm talking about code that doesn't give a **** about utilizing machine resources correctly.

Crap out "new" everywhere, it lets you write code fast. Is it actually a good idea to collect here? Hell if you know, you don't care, carry on! Crap out classes everywhere, it lets you write code fast. Pull in a zillion external dependencies, 90% of which you have no idea what they're for, what they do and how much cruft they bring with them; they let you write code fast.

Oh look, you have no threads! Maybe you should write a, a... a task system! Yes, full of classes and, and futures and... stuff. But no, nononono, writing one takes too long, let's take a ready one. Implemented by another awesome programmer just like you! And then spawn... ooooh, a second thread! Yes! Two threads are better than one! What for? It doesn't matter what for, don't think about it. Better yet! Spawn four! Six! Twelve! And then serialize them all with one mutex, because to hell with learning that task system you downloaded, you have code to write. What did you say? Pointers? Nah, you have twelve threads and a mutex. Surely you need reference counted objects. Pointers are bad for you, they will have you think...

Then, after this jellyfish wobbly pile of crud is starting to swell and smell, then start "optimizing" it. Profile first though. Profile, measure! Only first write more cruft in order to measure what needs to be measured, otherwise you might accidentally measure all those libXXX you used and all those cache misses you wrote. And then fix it. By doing more of the above, as luck would have it.

"Don't be a computer..." What a joke.
Oct 16 2018
On Friday, 12 October 2018 at 20:12:26 UTC, Stanislav Blinov wrote:
> On Friday, 12 October 2018 at 19:55:02 UTC, Nicholas Wilson wrote:
>> Freeing your mind and the codebase of having to deal with memory leaves it in an easier place to deal with the less common higher impact leaks: file descriptors, sockets, database handles etc. (this is like chopping down the forest so you can see the trees you care about ;) ).
> That's done first and foremost by stripping out unnecessary allocations, not by writing "new" every other line and closing your eyes.

If you need perf in your _scripts_, a) use LDC and b) pass -O3, which among many other improvements over baseline will promote unnecessary garbage collection to the stack.

> I mean come on, it's 2018. We're writing code for multi-core and multi-processor systems with complex memory interaction.

We might be sometimes. I suspect a script is less likely to fall in that category.

> Precisely where in memory your data is, how it got there and how it's laid out should be bread and butter of any D programmer. It's true that it isn't critical for one-off scripts, but so is deallocation. Saying stuff like "do more with GC" is just outright harmful.

That is certainly not an unqualified truth. Yes, one shouldn't `new` stuff just for fun, but speed of the executable is often not what one is trying to optimise when writing code; e.g. when writing a script one is probably trying to minimise development/debugging time.

> Kids are reading, for crying out loud.

Oi, you think that's bad? Try reading what some of the other Aussies post, *cough* e.g. a frustrated Manu *cough*
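A sketch of the kind of allocation such an optimisation can target: the array never escapes the function, only a scalar does, which is what makes promotion to a stack allocation possible at all. Whether a given compiler version actually performs it is not guaranteed; this only shows the shape of qualifying code:

    // Compile with, e.g.: ldc2 -O3 app.d
    int sum()
    {
        auto tmp = new int[16];  // a GC allocation in the source...
        foreach (i, ref v; tmp)
            v = cast(int) i;
        int s = 0;
        foreach (v; tmp)
            s += v;
        return s;                // ...but only this scalar leaves the function
    }

    void main() {}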
Oct 12 2018
On Friday, 12 October 2018 at 23:32:34 UTC, Nicholas Wilson wrote:
>> That's done first and foremost by stripping out unnecessary allocations, not by writing "new" every other line and closing your eyes.
> If you need perf in your _scripts_, a) use LDC and b) pass -O3, which among many other improvements over baseline will promote unnecessary garbage collection to the stack.

If you *need* perf, you write performant code. If you don't need perf, the least you can do is *not write* lazy-ass pessimized crap.

>> I mean come on, it's 2018. We're writing code for multi-core and multi-processor systems with complex memory interaction.
> We might be sometimes. I suspect a script is less likely to fall in that category.

Jesus, guys. *All* code falls in that category. Because it is being executed by those machines. Yet we all oh so like to pretend that doesn't happen, for some bizarre reason.

>> Precisely where in memory your data is, how it got there and how it's laid out should be bread and butter of any D programmer. It's true that it isn't critical for one-off scripts, but so is deallocation. Saying stuff like "do more with GC" is just outright harmful.
> That is certainly not an unqualified truth. Yes, one shouldn't `new` stuff just for fun, but speed of the executable is often not what one is trying to optimise when writing code; e.g. when writing a script one is probably trying to minimise development/debugging time.

That's fine so long as it doesn't unnecessarily *pessimize* execution. Unfortunately, when you advertise GC for its awesomeness in your experience with "throwaway" scripts, you're sending a very, *very* wrong message.

> Oi, you think that's bad? Try reading what some of the other Aussies post, *cough* e.g. a frustrated Manu *cough*

:)
Oct 12 2018
On Friday, 12 October 2018 at 16:26:49 UTC, Stanislav Blinov wrote:
> ...there's absolutely no need for a GC. In fact, the GC runtime will only detract from performance.
> Which is just as easily achieved with just one additional line of code: free the memory.
> Writing a throwaway script there's nothing stopping you from using mmap or VirtualAlloc. The "power" of GC is in the language support for non-trivial types, such as strings and associative arrays. Plain old arrays don't benefit from it in the slightest.

What a bunch of nonsense! I used to talk like this some 20 years ago when all I saw in the computing world was C and C++... Sure, garbage collection is not for every project; it depends what industry you are in, I guess... In my case (business applications/services) I have never had the need to turn off garbage collection! However, someone in the gaming industry, embedded or realtime systems would indeed need to turn off the GC...
Oct 12 2018
On Friday, 12 October 2018 at 19:06:36 UTC, Dejan Lekic wrote:
> What a bunch of nonsense! I used to talk like this some 20 years ago when all I saw in the computing world was C and C++... Sure, garbage collection is not for every project; it depends what industry you are in, I guess... In my case (business applications/services) I have never had the need to turn off garbage collection! However, someone in the gaming industry, embedded or realtime systems would indeed need to turn off the GC...

Who said anything about turning it off? I'm pointing out that using the GC for the sake of simplicity is precisely the wrong reason to do so, that's it. Bunch of nonsense, right. Have fun writing sloppy code then.
Oct 12 2018
On Friday, 12 October 2018 at 16:26:49 UTC, Stanislav Blinov wrote:
> On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:
>> "It takes care of itself
>> -------------------------------
>> When writing a throwaway script...
> ...there's absolutely no need for a GC. In fact, the GC runtime will only detract from performance.

For me, at least, spending an extra two weeks optimizing a program to eliminate that last 0.1 seconds of running time is not a good decision.
Oct 12 2018
On Friday, 12 October 2018 at 16:26:49 UTC, Stanislav Blinov wrote:
> On Thursday, 11 October 2018 at 21:22:19 UTC, aberba wrote:
>> "It takes care of itself
>> -------------------------------
>> When writing a throwaway script...
> ...there's absolutely no need for a GC.

True. There's also absolutely no need for computer languages either; machine code is sufficient.

> In fact, the GC runtime will only detract from performance.

Demonstrably untrue. It puzzles me why this myth persists. There are trade-offs, and one should pick whatever is best for the situation at hand.

> Which is just as easily achieved with just one additional line of code: free the memory.

*Simply* achieved, not *easily*. Decades of bugs have shown emphatically that it's not easy.

> Writing a throwaway script there's nothing stopping you from using mmap or VirtualAlloc.

There is: writing less code to achieve the same result.

> The "power" of GC is in the language support for non-trivial types, such as strings and associative arrays. Plain old arrays don't benefit from it in the slightest.

For me, the power of tracing GC is that I don't need to think about ownership, lifetimes, or manual memory management. I also don't have to please the borrow checker gods. Yes, there are other resources to manage. RAII nearly always manages that; I don't need to think about that either.
Oct 12 2018
On Friday, 12 October 2018 at 21:34:35 UTC, Atila Neves wrote:
> True. There's also absolutely no need for computer languages either; machine code is sufficient.

Funny. Now for real, in a throwaway script, what is there to gain from a GC? Allocate away and forget about it.

>> In fact, the GC runtime will only detract from performance.
> Demonstrably untrue. It puzzles me why this myth persists.

Myth, is it now? Unless all you do is allocate memory, which isn't any kind of useful application, pretty much on each sweep run the GC's metadata is *cold*. What's worse, you don't control how much data there is and where it is. Need I say more? If you disagree, please do the demonstration then.

> There are trade-offs, and one should pick whatever is best for the situation at hand.

Exactly. Which is *not at all* what the OP is encouraging to do.

> *Simply* achieved, not *easily*. Decades of bugs have shown emphatically that it's not easy.

Alright, from one non-native English speaker to another, well done, I salute you. I also used the term "dangling pointer" previously, where I should've used "non-null". Strange you didn't catch that. To the point: *that* is a myth. The bugs you're referring to are not *solved* by the GC, they're swept under a rug. Because the bugs themselves are in the heads, stemming from that proverbial programmer laziness. It's like everyone is Scarlett O'Hara with a keyboard. For most applications, you *do* know how much memory you'll need, either exactly or as an estimation. Garbage collection is useful for cases when you don't, or can't estimate, and even then only in a limited subset of those cases.

> There is: writing less code to achieve the same result.

Well, I guess either of those do take more arguments than a "new", so yup, you do indeed write "less" code. Only that you have no clue how much more code is hiding behind that "new", how many indirections, DLL calls, syscalls with libc's wonderful poison that is errno... You don't want to think about that. Then two people start using your script. Then ten, a hundred, a thousand. Then it becomes a part of an OS distribution. And no one wants to "think about that".

> For me, the power of tracing GC is that I don't need to think about ownership, lifetimes, or manual memory management.

Yes you do, don't delude yourself. Pretty much the only way you don't is if you're writing purely functional code. But we're talking about D here. Reassigned a reference? You thought about that. If you didn't, you just wrote a nasty bug. How much more hypocrisy can we reach here?

"Fun" fact: it's not safe to "new" anything in D if your program uses any classes. Thing is, it does unconditionally thanks to DRuntime.

> I also don't have to please the borrow checker gods.

Yeah, that's another extremum. I guess "rustaceans" or whatever the hell they call themselves are pushing that one, don't they? "Let's not go for a GC, let's straight up cut out whole paradigms for safety's sake..."

> Yes, there are other resources to manage. RAII nearly always manages that; I don't need to think about that either.

Yes you do. You do need to write those destructors or scoped finalizers, don't you? Or so help me use a third-party library that implements those? There's fundamentally *no* difference from memory management here. None, zero, zip.

Sad thing is, you're not alone. Look at all the major OSs today. How long does it take to, I don't know, open a project in Visual Studio on Windows? Or do a search in a huge file opened in 'less' on Unix? On an octacore 4GHz machine with 32GB 3GHz memory? It should just instantly pop up on the screen, shouldn't it? Why doesn't it then? Because most programmers think the way you do: "oh it doesn't matter here, I don't need to think about that". And then they proceed to advocate those "awesome" laid-back solutions that oh so help them save so much time coding. Of course they do, at everyone else's expense. Decades later, we're now trying to solve problems that shouldn't have existed in the first place. You'd think that joke was just that, a joke...

But let's get back to D. Look at Phobos. Why does stdout.writefln need to allocate? How many times does it copy its arguments? Why can't it take non-copyable arguments? Does it change global state in your process' address space? Does it impose external dependencies? You don't want to think about that? The author likely didn't either. And yet everybody is encouraged to use that: it's out of the box, after all... Why is Socket a class, blown up from a puny 32-bit value to a bloated who-knows-how-many-bytes monstrosity? Will that socket close if you rely on the GC? Yes? No? Maybe? Why? Can I deploy the compiler on a remote machine with limited RAM and expect it to always successfully build my projects and not run out of memory?

I can go on and on, but I hope I finally made my point somewhat clearer. Just in case, a TLDR: *understand your machine and your tools and use them accordingly*. There are no silver bullets for anything, and that includes the GC. If you go on advocating it because it helped you write a 1kLOC one-time-use script, it's very likely I don't want to use anything you write.
Oct 12 2018
On Friday, 12 October 2018 at 23:24:56 UTC, Stanislav Blinov wrote:
> On Friday, 12 October 2018 at 21:34:35 UTC, Atila Neves wrote:
>>> When writing a throwaway script... ...there's absolutely no need for a GC.
>> True. There's also absolutely no need for computer languages either, machine code is sufficient.
> Funny. Now for real, in a throwaway script, what is there to gain from a GC? Allocate away and forget about it.

In case you run out of memory, the GC scans. That's the gain.

> ...there's absolutely no need for a GC.

Yes.

> In fact, the GC runtime will only detract from performance.

Demonstrably untrue. It puzzles me why this myth persists.

> Myth, is it now? Unless all you do is allocate memory, which isn't any kind of useful application, pretty much on each sweep run the GC's metadata is *cold*.

*If* the GC scans.

>> There are trade-offs, and one should pick whatever is best for the situation at hand.
> Exactly. Which is *not at all* what the OP is encouraging to do.

I disagree. What I got from the OP was that for most code, the GC helps. I agree with that sentiment.

> Alright, from one non-native English speaker to another, well done, I salute you.

The only way I'd qualify as a non-native English speaker would be to pedantically assert that I can't be due to not having learned it first. In any case, I'd never make fun of somebody's English if they're non-native, and that's most definitely not what I was trying to do here - I assume the words "simple" and "easy" exist in most languages. I was arguing about semantics.

> To the point: *that* is a myth. The bugs you're referring to are not *solved* by the GC, they're swept under a rug.

Not in my experience. They've literally disappeared from the code I write.

> Because the bugs themselves are in the heads, stemming from that proverbial programmer laziness. It's like everyone is Scarlett O'Hara with a keyboard.

IMHO, lazy programmers are good programmers.

> For most applications, you *do* know how much memory you'll need, either exactly or an estimation.

I don't, maybe you do. I don't even care unless I have to. See my comment above about being lazy.

> Well, I guess either of those do take more arguments than a "new", so yup, you do indeed write "less" code. Only that you have no clue how much more code is hiding behind that "new",

I have a clue. I could even look at the druntime code if I really cared. But I don't.

> how many indirections, DLL calls, syscalls with libc's wonderful poison that is errno... You don't want to think about that.

That's right, I don't.

> Then two people start using your script. Then ten, a hundred, a thousand. Then it becomes a part of an OS distribution. And no one wants to "think about that".

Meh. There are so many executables that are part of distributions that are written in Python, Ruby or JavaScript.

>> For me, the power of tracing GC is that I don't need to think about ownership, lifetimes, or manual memory management.
> Yes you do, don't delude yourself.

No, I don't. I used to in C++, and now I don't.

> Pretty much the only way you don't is if you're writing purely functional code.

I write pure functional code by default. I only use side-effects when I have to and I isolate the code that does.

> But we're talking about D here. Reassigned a reference? You thought about that. If you didn't, you just wrote a nasty bug. How much more hypocrisy can we reach here?

I probably didn't write a nasty bug if the pointer that was reassigned was to GC allocated memory. It lives as long as it has to, I don't think about it.

> "Fun" fact: it's not safe to "new" anything in D if your program uses any classes. Thing is, it does unconditionally thanks to DRuntime.

I hardly ever use classes in D, but I'd like to know more about why it's not safe.

>> Yes, there are other resources to manage. RAII nearly always manages that, I don't need to think about that either.
> Yes you do. You do need to write those destructors or scoped finalizers, don't you? Or so help me use a third-party library that implements those? There's fundamentally *no* difference from memory management here. None, zero, zip.

I write a destructor once, then I never think about it again. It's a lot different from worrying about closing resources all the time. I don't write `scope(exit)` unless it's only once as well, otherwise I wrap the code in an RAII struct (a sketch follows at the end of this post).

> Why is Socket a class, blown up from a puny 32-bit value to a bloated who-knows-how-many-bytes monstrosity? Will that socket close if you rely on the GC? Yes? No? Maybe? Why?

I don't know. I don't think Socket should even have been a class. I assume it was written in the D1 days.

> Can I deploy the compiler on a remote machine with limited RAM and expect it to always successfully build my projects and not run out of memory?

If the compiler had the GC turned on, yes. That's not a point about GC, it's a point about dmd.
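For the record, a minimal sketch of what I mean by wrapping cleanup in an RAII struct; the Handle/acquire/release names are made-up stand-ins for any real resource API:

    // Hypothetical example: the cleanup is written once, in the destructor,
    // instead of a scope(exit) at every call site.
    alias Handle = int;

    Handle acquire() { return 42; }      // pretend this opens a resource
    void release(Handle h) { /* ... and this closes it */ }

    struct Guard
    {
        Handle h;
        @disable this(this);             // one owner, one release
        ~this() { release(h); }          // runs deterministically at scope exit
    }

    void work()
    {
        auto g = Guard(acquire());
        // ... use g.h; release(g.h) happens automatically when work() returns
    }

    void main() { work(); }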
Oct 13 2018
On 14/10/2018 2:08 AM, Atila Neves wrote:
>> "Fun" fact: it's not safe to "new" anything in D if your program uses any classes. Thing is, it does unconditionally thanks to DRuntime.
> I hardly ever use classes in D, but I'd like to know more about why it's not safe.

    void main() @safe
    {
        Foo foo = new Foo(8);   // compiles: allocating a class is allowed in @safe code
        foo.print();
    }

    class Foo
    {
        int x;

        this(int x) @safe
        {
            this.x = x;
        }

        void print() @safe
        {
            import std.stdio;
            try
            {
                writeln(x);
            }
            catch (Exception)
            {
            }
        }
    }
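To spell out where the danger creeps in (my sketch, not part of the example above): the allocation itself is @safe, but a class may carry a @system destructor that the GC runs at some unspecified later point, outside of any @safe checking at the call site. Logger here is made up for illustration:

    class Logger
    {
        ~this() @system      // may do anything; runs during some future collection
        {
            // imagine unsafe cleanup here
        }
    }

    void makeOne() @safe
    {
        auto l = new Logger; // compiles: the allocation is @safe
    }                        // when (and whether) ~this runs is up to the GC

    void main() @safe { makeOne(); }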
Oct 13 2018
On Saturday, 13 October 2018 at 13:08:30 UTC, Atila Neves wrote:
> On Friday, 12 October 2018 at 23:24:56 UTC, Stanislav Blinov wrote:
>> Funny. Now for real, in a throwaway script, what is there to gain from a GC? Allocate away and forget about it.
> In case you run out of memory, the GC scans. That's the gain.

Correction: in case the GC runs out of memory, it scans.

> Demonstrably untrue. It puzzles me why this myth persists.

Please demonstrate.

>> ...there's absolutely no need for a GC.
> Yes.
>>> In fact, the GC runtime will only detract from performance.
>> Myth, is it now? Unless all you do is allocate memory, which isn't any kind of useful application, pretty much on each sweep run the GC's metadata is *cold*.
> *If* the GC scans.

"If"? So... ahem... what, exactly, is the point of a GC that doesn't scan? What are you even arguing here? That you can allocate and never free? You can do that without GC just as well.

>> There are trade-offs, and one should pick whatever is best for the situation at hand.
>> Exactly. Which is *not at all* what the OP is encouraging to do.
> I disagree. What I got from the OP was that for most code, the GC helps. I agree with that sentiment.

Helps write code faster? Yes, I'm sure it does. It also helps write slower unsafe code faster, unless you're paying attention, which, judging by your comments, you're not and aren't inclined to.

>> Alright, from one non-native English speaker to another, well done, I salute you.
> The only way I'd qualify as a non-native English speaker would be to pedantically assert that I can't be due to not having learned it first. In any case, I'd never make fun of somebody's English if they're non-native, and that's most definitely not what I was trying to do here - I assume the words "simple" and "easy" exist in most languages. I was arguing about semantics.

Just FYI, they're the same word in my native language :P

>> To the point: *that* is a myth. The bugs you're referring to are not *solved* by the GC, they're swept under a rug.
> Not in my experience. They've literally disappeared from the code I write.

Right. At the expense of introducing unpredictable behavior in your code. Unless you thought about that.

>> Because the bugs themselves are in the heads, stemming from that proverbial programmer laziness. It's like everyone is Scarlett O'Hara with a keyboard.
> IMHO, lazy programmers are good programmers.

Yes, but not at the expense of users and other programmers who'd use their code.

>> For most applications, you *do* know how much memory you'll need, either exactly or an estimation.
> I don't, maybe you do. I don't even care unless I have to. See my comment above about being lazy.

Too bad. You really, really should.

>> Well, I guess either of those do take more arguments than a "new", so yup, you do indeed write "less" code. Only that you have no clue how much more code is hiding behind that "new",
> I have a clue. I could even look at the druntime code if I really cared. But I don't.

You should.

>> how many indirections, DLL calls, syscalls with libc's wonderful poison that is errno... You don't want to think about that.
> That's right, I don't.

You should. Everybody should.

>> Then two people start using your script. Then ten, a hundred, a thousand. Then it becomes a part of an OS distribution. And no one wants to "think about that".
> Meh. There are so many executables that are part of distributions that are written in Python, Ruby or JavaScript.

Exactly my point. That's why we *must not* pile more crap on top of that. That's why we *must* think about the code we write. Just because your neighbour sh*ts in a public square, doesn't mean that you must do that too.

>>> For me, the power of tracing GC is that I don't need to think about ownership, lifetimes, or manual memory management.
>> Yes you do, don't delude yourself.
> No, I don't. I used to in C++, and now I don't.

Yes you do, you say as much below.

>> Pretty much the only way you don't is if you're writing purely functional code.
> I write pure functional code by default. I only use side-effects when I have to and I isolate the code that does.

In other words, you knew what you were doing, at which point I'd ask: what's the problem with freeing the no-longer-used memory there and then? There's nothing to "think" about. (A sketch of what I mean follows at the end of this post.)

>> But we're talking about D here. Reassigned a reference? You thought about that. If you didn't, you just wrote a nasty bug. How much more hypocrisy can we reach here?
> I probably didn't write a nasty bug if the pointer that was reassigned was to GC allocated memory. It lives as long as it has to, I don't think about it.

>> "Fun" fact: it's not safe to "new" anything in D if your program uses any classes. Thing is, it does unconditionally thanks to DRuntime.
> I hardly ever use classes in D, but I'd like to know more about why it's not safe.

rikki's example isn't exactly the one I was talking about, so here goes:

    // mycode.d
    module mycode;

    import std.stdio;
    import thirdparty;

    void mySuperSafeFunction() @safe
    {
        auto storage = new int[10^^6];
        // do awesome work with that storage...
    }

    void main()
    {
        thirdPartyWork();
        writeln("---");
        mySuperSafeFunction(); // this function can't corrupt memory, can it? It's @safe!
        writeln("---");
    }

    // thirdparty.d
    module thirdparty;

    @system: // just so we're clear

    void thirdPartyWork()
    {
        auto nasty = new Nasty;
    }

    private:

    void corruptMemory()
    {
        import std.stdio;
        writeln("I've just corrupted your heap, or maybe your stack, mwahahahah");
    }

    class Nasty
    {
        ~this()
        {
            corruptMemory();
        }
    }

Thus, even if you wrote an entirely @safe library, someone using it may encounter a nasty bug. Worse yet, they may not. Because whether or not the GC calls finalizers depends on the overall state of the program. And certainly, *your* code may cause third-party code to UB, and vice versa. Now, of course, things wouldn't be called "Nasty" and "corruptMemory". They'll have innocent names, and nothing conspicuous at a glance. Because that's just how bugs are. The point is, the GC can't deliver on the @safe promise, at least in the language as it is at the moment.

> I write a destructor once, then I never think about it again. It's a lot different from worrying about closing resources all the time. I don't write `scope(exit)` unless it's only once as well, otherwise I wrap the code in an RAII struct.

Ok, same goes for memory management. What's your point?

>> Why is Socket a class, blown up from a puny 32-bit value to a bloated who-knows-how-many-bytes monstrosity? Will that socket close if you rely on the GC? Yes? No? Maybe? Why?
> I don't know. I don't think Socket should even have been a class. I assume it was written in the D1 days.

Exactly. But it's in a "standard" library. So if I "don't want to think about it" I'll use that, right?

>> Can I deploy the compiler on a remote machine with limited RAM and expect it to always successfully build my projects and not run out of memory?
> If the compiler had the GC turned on, yes. That's not a point about GC, it's a point about dmd.

The answer is no. At least, dmd will happily run out of memory; that's just the way it is. It's a point about programmers not caring about what they're doing, whether it's GC or not is irrelevant. Only in this case, it's also about programmers advising others to do so.
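Since "freeing there and then" keeps coming up, a sketch of what that looks like even for GC-allocated memory. processChunk is a made-up name; core.memory.GC.free is the real druntime call:

    // Deterministically return a GC allocation instead of waiting for a scan.
    import core.memory : GC;

    void processChunk()
    {
        auto buf = new ubyte[](64 * 1024 * 1024); // big scratch buffer
        scope(exit) GC.free(buf.ptr);             // give it back immediately;
                                                  // no collection cycle needed
        // ... work with buf ...
    }

    void main()
    {
        foreach (_; 0 .. 100)
            processChunk(); // steady memory use instead of GC-triggered scans
    }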
Oct 13 2018