
digitalmars.D - Possible solution to template bloat problem?

reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
With D's honestly awesome metaprogramming features, templates are liable
to be (and in fact are) used a LOT. This leads to the unfortunate
situation of template bloat: every time you instantiate a template, it
adds yet another copy of the templated code into your object file. This
gets worse when you use templated structs/classes, each of which may
define some number of methods, and each instantiation adds yet another
copy of all those methods.

This is doubly bad if these templates are used only during compile-time,
and never referenced during runtime. That's a lot of useless baggage in
the final executable. Plus, it leads to issues like this one:

	http://d.puremagic.com/issues/show_bug.cgi?id=10833

While looking at this bug, I got an idea: what if, instead of emitting
template instantiations into the same object file as non-templated code,
the compiler were to emit each instantiation into a separate static
*library*? For instance, if you have code in program.d, then the
compiler would emit non-templated code like main() into program.o, but
all template instantiations get put in, say, libprogram.a. Then during
link time, the compiler runs `ld -oprogram program.o libprogram.a`, and
then the linker will pull in symbols from libprogram.a that are
referenced by program.o.

If we were to set things up so that libprogram.a contains a separate
unit for each instantiated template function, then the linker would
actually pull in only code that is actually referenced at runtime. For
example, say our code looks like this:

	struct S(T) {
		T x;
		T method1(T t) { ... }
		T method2(T t) { ... }
		T method3(T t) { ... }
	}
	void main() {
		auto sbyte  = S!byte();
		auto sint   = S!int();
		auto sfloat = S!float();

		sbyte.method1(1);
		sint.method2(2);
		sfloat.method3(3.0);
	}

Then the compiler would put main() in program.o, and *nothing else*. In
program.o, there would be undefined references to S!byte.method1,
S!int.method2, and S!float.method3, but not the actual code. Instead,
when the compiler sees S!byte, S!int, and S!float, it puts all of the
instantiated methods inside libprogram.a as separate units:

	libprogram.a:
		struct_S_byte_method1.o:
			S!byte.method1
		struct_S_byte_method2.o:
			S!byte.method2
		struct_S_byte_method3.o:
			S!byte.method3
		struct_S_int_method1.o:
			S!int.method1
		struct_S_int_method2.o:
			S!int.method2
		struct_S_int_method3.o:
			S!int.method3
		struct_S_float_method1.o:
			S!float.method1
		struct_S_float_method2.o:
			S!float.method2
		struct_S_float_method3.o:
			S!float.method3

Since the compiler doesn't know at instantiation time which of these
methods will actually be used, it simply emits all of them and puts them
into the static library.

Then at link-time, the compiler tells the linker to include libprogram.a
when linking program.o. So the linker goes through each undefined
reference, and resolves them by linking in the module in libprogram.a
that defines said reference. So it would link in the code for
S!byte.method1, S!int.method2, and S!float.method3. The other 6
instantiations are not linked into the final executable, because they
are never actually referenced by the runtime code.

So this way, we minimize template bloat to only the code that's actually
used at runtime. If a particular template function instantiation is only
used during CTFE, for example, it would be present in libprogram.a but
won't get linked, because none of the runtime code references it. This
would fix bug 10833.
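For illustration, here is roughly what that link step would look like with
standard Unix tools, assuming the compiler could emit the per-instantiation
objects (it cannot today; the object names below are the hypothetical ones
from the listing above):

	# compile runtime-only code; undefined refs to S!byte.method1 etc. remain
	dmd -c program.d -ofprogram.o

	# archive one object per template instantiation (emitted by the compiler)
	ar rcs libprogram.a struct_S_byte_method1.o struct_S_int_method2.o \
	       struct_S_float_method3.o   # ...plus the six unused ones

	# the linker pulls in only those archive members that resolve
	# undefined references in program.o
	cc -o program program.o libprogram.a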

Is this workable? Is it implementable in DMD?


T

-- 
Nearly all men can stand adversity, but if you want to test a man's character,
give him power. -- Abraham Lincoln
Aug 19 2013
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
Just two words: "separate compilation".

Any solution that is going to address template problem needs to 
improve current state with such compilation model, not make it 
even worse.

As an alternative, I have proposed one of two approaches (or 
both):

1) Stop dumping all symbols into root module supplied from the 
command line. Emit symbols to object files that match modules 
they were instantiated from. If symbol has no valid source point 
(== constraint or CTFE) then don't emit it at all.

2) Create object files in format that allows usage of `ld 
--gc-sections` (garbage collection of unused symbols upon 
linking). Don't know if similar thing exists for Windows.

Latter should be relatively easy to do but it is not 
cross-platform and it does not help build systems with tracking 
rebuild conditions.
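For reference, a minimal sketch of approach (2) with a GCC-based toolchain
(gdc shown here; the exact flags are an assumption and their spelling and
support vary by compiler and platform):

	# emit each function and data symbol into its own section
	gdc -ffunction-sections -fdata-sections -c app.d

	# discard sections that nothing references at link time
	gdc -Wl,--gc-sections -o app app.o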

Former feels like a proper approach and I have been working on it 
(also  eskimor) for some time. But it is rather hard as relevant 
code does not seem to track required information at all and 
probably no one but Walter knows all minor details about its 
design. To sum it up - I have not been able to produce a pull 
request that passes the test suite so far (though I have put it 
on hold for some time, going to return to this).
Aug 19 2013
next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Mon, Aug 19, 2013 at 10:37:35PM +0200, Dicebot wrote:
 Just two words: "separate compilation".
 
 Any solution that is going to address template problem needs to
 improve current state with such compilation model, not make it even
 worse.
I thought about that, actually. (I'm a fan of separate compilation,
though the current state of D makes it more advantageous to compile
everything in one go where possible.)

There should be *one* static library where template instantiations are
dumped. Let's call it templates.a for lack of a better name. Inside
this archive, template instantiations are organized according to the
module hierarchy -- i.e., if you import std.stdio and use writeln, the
instantiation of writeln will be put into std/stdio/writeln_*.o in
templates.a, NOT the root module.

Furthermore, if a particular instantiation is already present in
templates.a, then it is not duplicated, but simply replaced (I was
going to say ignored, but replace is better in case templates.a
contains leftovers from a previous compilation run). This eliminates
duplicate instantiations of the same templates in multiple modules.
 As an alternative, I have proposed one of two approaches (or both):
 
 1) Stop dumping all symbols into root module supplied from the
 command line. Emit symbols to object files that match modules they
 were instantiated from. If symbol has no valid source point (==
 constraint or CTFE) then don't emit it at all.
+1. Plus, DMD really needs to stop stripping paths from object files *by default* -- now that we have package.d, this will blow up in ugly ways once you have templates in package.d that get instantiated and there are multiple packages with package.d.
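To make that failure mode concrete, a hypothetical layout (the paths are
made up for illustration):

	dmd -c foo/package.d    # writes ./package.o -- the path is stripped
	dmd -c bar/package.d    # also writes ./package.o, clobbering the first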
 2) Create object files in format that allows usage of `ld
 --gc-sections` (garbage collection of unused symbols upon linking).
 Don't know if similar thing exists for Windows.
Well, the reason I went for bare linker essentials was precisely to avoid platform-specific issues like this. But OTOH, on platforms that support the equivalent of --gc-sections, we could just emit template instantiations into separate sections instead of separate modules in a static library. The static library approach can serve as a fallback on platforms that don't support this.
 Latter should be relatively easy to do but it is not cross-platform
 and it does not help build systems with tracking rebuild conditions.
 
 Former feels like a proper approach and I have been working on it
 (also  eskimor) for some time. But it is rather hard as relevant
 code does not seem to track required information at all and probably
 no one but Walter knows all minor details about its design. To sum
 it up - I have not been able to produce a pull request that passes
 the test suite so far (though I have put it on hold for some time,
 going to return to this).
Hmm. It'd be nice if this could be made to work. One thing I'm not very
happy with in D is the sheer size of the executables even for
relatively simple programs. If unused symbols could be stripped during
linking, this would help a lot.


T

-- 
Political correctness: socially-sanctioned hypocrisy.
Aug 19 2013
next sibling parent "Dicebot" <public dicebot.lv> writes:
On Monday, 19 August 2013 at 21:33:08 UTC, H. S. Teoh wrote:
 Inside this
 archive, template instantiations are organized according to the 
 module
 hierarchy -- i.e., if you import std.stdio and use writeln, the
 instantiation of writeln will be put into std/stdio/writeln_*.o 
 in
 templates.a, NOT the root module.
Ok. Now you change the signature of writeln. Incremental rebuild. Will you do reflection upon the object file in templates.a to get the list of required instances? Or rebuild all modules that import std.stdio even indirectly? Most likely latter, because it also could have been inlined in several places and those need to be rebuilt too. To track modules that actually use that template instance one
Aug 19 2013
prev sibling parent "Dicebot" <public dicebot.lv> writes:
On Monday, 19 August 2013 at 21:33:08 UTC, H. S. Teoh wrote:
 Hmm. It'd be nice if this could be made to work. One thing I'm 
 not very
 happy with in D is the sheer size of the executables even for
 relatively simple programs. If unused symbols could be stripped 
 during
 linking, this would help a lot.
TBH, I was waiting until the dust with DDMD settles down a bit, simply
reading sources in the meantime, but I'm going to get back to it pretty
soon. Help is welcome, I can provide plenty of information to start
with ;)
Aug 19 2013
prev sibling parent Andrej Mitrovic <andrej.mitrovich gmail.com> writes:
On 8/19/13, H. S. Teoh <hsteoh quickfur.ath.cx> wrote:
 Plus, DMD really needs to stop stripping paths from object files *by
 default*
Meanwhile give your vote to: https://github.com/D-Programming-Language/dmd/pull/1871
Aug 19 2013
prev sibling next sibling parent reply "John Colvin" <john.loughran.colvin gmail.com> writes:
On Monday, 19 August 2013 at 20:23:46 UTC, H. S. Teoh wrote:
 [...]
Without link-time optimisation, this prevents inlining doesn't it?
Aug 19 2013
parent reply "Dicebot" <public dicebot.lv> writes:
On Monday, 19 August 2013 at 22:11:39 UTC, John Colvin wrote:
 Without link-time optimisation, this prevents inlining doesn't 
 it?
It does not _prevent_ inlining, but it breaks incremental builds in case inlining has ever happened.
Aug 19 2013
parent reply "Ramon" <spam thanks.no> writes:
Well, I'm afraid that's what templates are. One (or the compiler) 
fills them in and that's it.

In other words: Templates are compile time while (real) generics 
are run time. This basically comes down to have some way of 
designating classes as, for instance, comparable and then either 
running along the object chain comparing all built in objects 
(with built in compare functionality) or having a compare 
implemented (Of course, there is also arithmetic functions, etc.).

While this sounds great it actually carries some code weight 
("bloat") with it, too because all that functionality must be 
somewhere. It gets (relatively) cheaper though when being heavily 
used.
Aug 19 2013
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Tuesday, 20 August 2013 at 00:34:38 UTC, Ramon wrote:
 Well, I'm afraid that's what templates are. One (or the 
 compiler) fills them in and that's it.

 In other words: Templates are compile time while (real) 
 generics are run time. This basically comes down to have some 
 way of designating classes as, for instance, comparable and 
 then either running along the object chain comparing all built 
 in objects (with built in compare functionality) or having a 
 compare implemented (Of course, there is also arithmetic 
 functions, etc.).

 While this sounds great it actually carries some code weight 
 ("bloat") with it, too because all that functionality must be 
 somewhere. It gets (relatively) cheaper though when being 
 heavily used.
What you describe is true for languages with "generics", where the
amount of generated code is pretty much equal to what would have been
written by hand. But in D templates do much more, and there is no
practical reason other than quality of implementation to keep it that
way.

For example, template constraints and stuff used during CTFE are
low-hanging fruit. Those don't need to be emitted in the resulting
executable at all, being only used at compile time.

A more theoretically complex problem is stuff like std.algorithm -
simply using something like map will result in several hundred (!)
trivial template instances, most of which will be inlined and never
actually used in the resulting binary. That is something that
link-stage garbage collection can take care of with totally awesome
results. I doubt it can be done by the compiler itself, but maybe
there are some options I have missed.

In a perfect world, using templates implies generic algorithms, not
generic code generation. It is the same thing as with manual assembly -
with modern optimizers and inlining capabilities, all this crap should
be boiled down to the same code as a carefully crafted manual version.
No reason not to do it.

Reminds me: how hard is writing our own linker again? :)
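As a small illustration of the std.algorithm point above (the exact set
and number of instantiations depends on the compiler and Phobos
version):

	import std.algorithm : map;
	import std.range : iota;

	void main() {
		// this single line instantiates map, its internal result range,
		// the lambda, iota, and the range primitives they use -- most of
		// which are trivial, get inlined, and are never called at runtime
		auto squares = iota(10).map!(x => x * x);
	}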
Aug 19 2013
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 20, 2013 at 02:48:27AM +0200, Dicebot wrote:
 On Tuesday, 20 August 2013 at 00:34:38 UTC, Ramon wrote:
Well, I'm afraid that's what templates are. One (or the compiler)
fills them in and that's it.

In other words: Templates are compile time while (real) generics
are run time. This basically comes down to have some way of
designating classes as, for instance, comparable and then either
running along the object chain comparing all built in objects
(with built in compare functionality) or having a compare
implemented (Of course, there is also arithmetic functions, etc.).
In that case, the "generics" you're talking about amounts basically to
OO polymorphism. You have the same machine code that can process
diverse types by using indirection to abstract away the implementation
details. This is no doubt useful, as OO itself proves, but it does come
with a cost: using indirection incurs a (small, but nevertheless
non-zero) performance hit. Inside inner loops, this can be a
performance killer.

At the machine code level, it's actually not possible to use the same
code for, e.g., comparing two ints vs. comparing two floats. You need
different machine instructions for them, so there is no single piece of
code that can be reused for both types. You have to somehow switch
between them at runtime depending on what types are being passed in.
(It gets even hairier if you're comparing, say, ints and floats, in
which case additional instructions must be used for promoting one type
to another so that they are comparable.)

So, say you call your function "less". In order to actually run, you
need one version for comparing ints, another for comparing floats,
etc.. This is what templates do. Alternatively, you use indirection:
int and float can be associated with some static data structure that
describes each type; say it has a function pointer that, for ints,
points to the function that compares ints, and for floats, points to
the function that compares floats. Then the caller doesn't actually
directly call the low-level int/float-specific functions, but always
looks up the function pointer. This is, in essence, how OO polymorphism
works: each class has a vtable with function pointers to the specific
implementation of each overloaded method. Then at runtime, you call
whatever function the vtable points to, thus achieving runtime
genericity.

The problem with indirection is that it's expensive: given two objects
to be compared, you need to dereference the pointer to those objects,
then dereference the pointer to their respective vtables, then
dereference the function pointer in the vtables in order to call the
function. Templates, OTOH, are compile-time bound: the compiler
ascertains at compile-time that your particular piece of code is
comparing two ints, so it bypasses all of the expensive runtime
dereferencing, and directly calls the function for comparing ints. The
resulting code is inflexible, in the sense that you can't change, at
runtime, the arguments to floats -- it would fail horribly -- but it
avoids 3 pointer dereferences. When inside an inner loop, this can mean
the difference between smooth animation and unusably jerky animation.

The cost, of course, is that if you need the same piece of code for
comparing both ints and floats, then the compiler has to generate two
copies of the code, one to handle ints, and the other to handle floats.
The saving grace, as Dicebot points out, is that if these copies of
code are small enough, they will be inlined, so you can even save on
the cost of a function call. This somewhat reduces the resulting code
size -- you save on function call instructions and stack push/pops,
etc., but you're still paying for the duplicated code.

This is the current state of the art.
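A minimal D sketch of the two strategies just described (the names
lessDyn/lessTmpl are made up for illustration):

	// runtime genericity: one copy of the code, an indirect call per compare
	interface Lessable {
		bool lessThan(Lessable rhs);
	}

	bool lessDyn(Lessable a, Lessable b) {
		return a.lessThan(b);          // virtual call through the vtable
	}

	// compile-time genericity: one copy of machine code per instantiated type
	bool lessTmpl(T)(T a, T b) {
		return a < b;                  // becomes the int/float/... comparison
	}

	void main() {
		assert(lessTmpl(1, 2));        // instantiates lessTmpl!int
		assert(lessTmpl(1.0f, 2.0f));  // instantiates lessTmpl!float
	}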
Now, my dream is that one day, perhaps compilers will get smart enough that you don't even need to worry about the distinction between templates and runtime polymorphism anymore -- you specify what you want to get done, and the compiler determines, based on characteristics of the target machine and how the program as a whole will use that particular piece of code, whether to use indirection or template expansion. Or maybe even a mix of both, depending on the situation. [...]
 Reminds me: how hard is writing own linker is again? :)
Honestly, I think it's about time linker technology is rethought and
developed further. Possible developments are automatic elision of
unused code sections (already implemented in some linkers), automatic
merging of identical sections (not sure if implemented yet -- may
require language support), link-time inlining, reordering of symbols to
increase code locality during execution (optimize for CPU caches),
etc.. Or more ambitiously, better integration with the compiler so that
the linker has access to compile-time structures to help it make
decisions about optimization. Present-day object file formats are too
far along the process to their executable form to permit many
optimizations that could in theory be performed by the linker,
requiring instead hacks like weak symbols, unused section GC, etc..

On the more mundane side, we need better algorithms for improving
linker performance. Current symbol resolution algorithms don't scale
very well when your object files are large or have large numbers of
symbols. Surely there are ways of improving the asymptotic complexity
of these things!


T

-- 
It is impossible to make anything foolproof because fools are so
ingenious. -- Sammy
Aug 19 2013
next sibling parent "Daniel Murphy" <yebblies nospamgmail.com> writes:
"H. S. Teoh" <hsteoh quickfur.ath.cx> wrote in message 
news:mailman.213.1376962388.1719.digitalmars-d puremagic.com...
 [...]
 Reminds me: how hard is writing own linker is again? :)
 Honestly, I think it's about time linker technology is rethought and
 developed further. [...]
Check out llvm's lld. Their choice of language sucks, but they do appear to be trying to rethink the whole mess.
Aug 20 2013
prev sibling parent "Dicebot" <public dicebot.lv> writes:
On Tuesday, 20 August 2013 at 01:33:08 UTC, H. S. Teoh wrote:
 [...]
 Reminds me: how hard is writing own linker is again? :)
Honestly, I think it's about time linker technology is rethought and developed further.
Actually, I was asking about this not because there are critical issues
with existing linker technology. While it definitely has a lot of space
for improvement, even good old `ld` has plenty of features that do
matter.

Collection of unused sections, which was already mentioned, can result
in huge binary size improvements. I'd love to enable this by default -
the problem is we can't rely on platform-specific tools to define such
an important feature tightly coupled with the language itself (think
about `export`).
Aug 20 2013
prev sibling parent reply Timon Gehr <timon.gehr gmx.ch> writes:
On 08/20/2013 02:34 AM, Ramon wrote:
 Well, I'm afraid that's what templates are. One (or the compiler) fills
 them in and that's it.

 In other words: Templates are compile time while (real) generics are run
 time.
If "real generics" means a polymorphic type system, then the difference
is that templates are not part of the type system.
Aug 20 2013
parent reply "Ramon" <spam thanks.no> writes:
I'm afraid the issue is bigger.

One major criterion, for instance, is the basic question of how we 
attribute weights to the involved entities.

C had a clear background. There was a new system (PDP11) and a 
need to use it. Memory was scarce and strongly limited, the 
typical use was quite modest (considerably less than what we 
have today on our mobile phones), processor power and capability 
were very low and very expensive, etc.

This, ladies and gentlemen, is quite simply not an adequate 
approach anymore.

Don't get me wrong, I'm not on the side of the other extreme 
("Who cares about processor time and memory"). But the world has 
very considerably changed and so has computing. Features that 
would have seemd miraculous in 1970 are low standard today and - 
very importantly - the whole world had very considerably gained 
in complexity.

If I need to program an MSP430 or even an STM32F4, I'll use C, 
period. There *is* a solution for jobs with very tight 
constraints; we just don't need a new language for that.

If, however, I have to design and build a solution that works on 
different OSes incl. mobile phones and spans over a large network 
then I won't use C.

Furthermore, we have seen again and again how unreliable humans 
are at certain jobs. Just think "virus", "buffer overflow" and a 
gazillion other problems stemming from two reasons, a) lack of 
professionalism and b) lack of perfection, where perfection is 
very much dependent on working tediously, diligently and 
stubbornly (in other words, something that computers are *way* 
better at than humans).

Templates just don't cut it, period. Templates are 
ultra-yesteryear and proven to be troublesome, no matter how 
smartly you implement them. It's just not acceptable that a 
language in 2013 (in that regard) doesn't offer dimensionally 
more and better than what I could do with Brief or the like in 
1985.

So: Are D templates ugly bloat? Yes, sure. Do I care? No, not at 
all. Why should I complain about D being unsatisfying as an 
editor?
Aug 20 2013
parent reply "Dicebot" <public dicebot.lv> writes:
On Tuesday, 20 August 2013 at 16:59:00 UTC, Ramon wrote:
 I'm afraid the issue is bigger.
Your insight about variety of modern programming language applications
is extremely limited. If you are willing to introduce random runtime
costs for nothing, there are lot of other awesome languages that can
satisfy your needs.

As someone who does not want to write embedded'ish code in C anymore
(and hopes to drag D there eventually) I am dangerously close to hating
you.
Aug 20 2013
parent reply "Ramon" <spam thanks.no> writes:
On Tuesday, 20 August 2013 at 17:05:21 UTC, Dicebot wrote:
 On Tuesday, 20 August 2013 at 16:59:00 UTC, Ramon wrote:
 I'm afraid the issue is bigger.
Your insight about variety of modern programming language applications is extremely limited. If you are willing to introduce random runtime costs for nothing, there are lot of other awesome languages that can satisfy your needs. As someone who does not want to write embedded'ish code in C anymore (and hopes to drag D there eventually) I am dangerously close to hating you.
And that shows. Well, if that fits your definition of "professional",
so be it.

To avoid misunderstandings: I still like D and think that it's a great
language/solution. I still see very attractive points. I still think
that even templates can be attractive and are way better done in D than
in C++.

I just don't agree that writing "generics" in the brochure and actually
delivering templates is a good thing. And I happen to think that real
generics are a very good thing.

Nevertheless, feel free to hate me and to get closer to ad hominems. I
even promise to *not* consider your attitude to in any way reflect D's
(or its creators') ;-)
Aug 20 2013
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 20, 2013 at 07:18:50PM +0200, Ramon wrote:
 On Tuesday, 20 August 2013 at 17:05:21 UTC, Dicebot wrote:
On Tuesday, 20 August 2013 at 16:59:00 UTC, Ramon wrote:
I'm afraid the issue is bigger.
Your insight about variety of modern programming language applications is extremely limited. If you are willing to introduce random runtime costs for nothing, there are lot of other awesome languages that can satisfy your needs. As someone who does not want to write embedded'ish code in C anymore (and hopes to drag D there eventually) I am dangerously close to hating you.
And that shows. Well, if that fits your definition of "professional", so be it. To avoid misunderstandings: I still like D and think that it's a great language/solution. I still see very attractive points. I still think that even templates can be attractive and are way better done in D than in C++. I just don't agree that writing "generics" in the brochure and actually delivering templates is a good thing. And I happen to think that real generics are a very good thing.
I think the problem is that your definition of "generic" is not the
same as ours. :) Templates actually include your definition of generics
if you use it correctly. Here's an example:

	interface LessThanComparable {
		bool opCmp(LessThanComparable b);
	}
	interface LessThanComparableRange {
		@property bool empty();
		@property LessThanComparable front();
		void popFront();
	}
	void sortRange(LessThanComparableRange range) {
		// Note: single template instantiation of sort here
		std.algorithm.sort(range);
	}
	class MyClass : LessThanComparable {
		override bool opCmp(LessThanComparable b) { ... }
		...
	}
	class MyClassRange : LessThanComparableRange {
		override @property bool empty() { ... }
		override @property LessThanComparable front() { ... }
		override void popFront() { ... }
	}
	class MyOtherClass : LessThanComparable {
		override bool opCmp(LessThanComparable b) { ... }
		...
	}
	class MyOtherClassRange : LessThanComparableRange {
		override @property bool empty() { ... }
		override @property LessThanComparable front() { ... }
		override void popFront() { ... }
	}
	void main() {
		MyClassRange firstRange = ...;
		MyOtherClassRange secondRange = ...;
		...
		sortRange(firstRange);
		sortRange(secondRange);
	}

A few notes:

- LessThanComparable lets you do polymorphism at runtime, so sortRange
  is a non-template "real generic" function that can sort any range
  involving LessThanComparable.

- LessThanComparableRange provides a single ABI for any range of
  LessThanComparable elements.

- sortRange instantiates the template function std.algorithm.sort
  exactly *once*, and it will work with any runtime type that
  implements LessThanComparableRange.

- The rest of the code shows how you can define a bunch of classes that
  implement these interfaces, and they can be used with the sortRange
  function with full runtime polymorphism.

You will note that there's some amount of boilerplate here -- because I
just typed this up off the top of my head for illustration purposes; in
real code you'd use mixins or other such stuff, perhaps *cough* use a
template for generating all the xxxRange classes *cough* automatically.

So this lets you have "real generics" which, as you can see, is really
just a subset of what is covered by the template std.algorithm.sort,
which can handle *any* concrete type, even those that don't implement
any runtime polymorphic interfaces.

Now let's talk about how templates and "real generics" can work
together.

If you think about the above code carefully, you will realize that, at
the machine level, you cannot avoid the overhead of dereferencing
pointers to the various interfaces and class vtables, because the CPU
can only deal with concrete types; it doesn't know how to work with
data that can be any type. So, at *some* level, whether visible at the
code level or not, everything must be translated down to type-specific
code that the CPU can actually run.

In "real generics", this is accomplished by having a single ABI (the
interfaces LessThanComparable and LessThanComparableRange) that all
concrete types must implement. Once implemented, the sortRange function
doesn't have to deal with concrete types anymore: it can express the
sorting algorithm directly in terms of the interfaces, and when a
concrete operation like '<' is desired, it invokes LessThanComparable's
opCmp method to accomplish it. (Which, in turn, essentially dereferences
a function pointer that points to the actual machine code that does the
comparison of the concrete type.)
This, of course, has a lot of runtime costs: you need space to store
the vtables, you need to initialize the pointers to these vtables every
time you create a new instance of a class, it requires runtime overhead
for calling a function via a function pointer instead of directly,
etc.. The advantage, though, is that it allows sortRange to be "real
generic", in the sense that you can load up a dynamic library at
runtime that returns a range of elements of unknown type, and sortRange
will be able to sort it just by using the LessThanComparable*
interfaces.

Now let's think about the template version of sortRange, which is just
std.algorithm.sort. Here, the binding to the concrete type happens at
compile-time; rather than produce a single piece of machine code for
sortRange that handles polymorphism via indirection (interfaces,
function pointers, etc.), the template produces code that sorts a range
of a *specific* type. Since we know exactly what concrete types we're
dealing with, we don't need any of the indirections of the "real
generic" approach; the compiler's optimizer can take advantage of
characteristics of the concrete type to produce the most optimized
machine code for sorting ranges of that kind. Each instantiation of
std.algorithm.sort produces optimal machine code for sorting that
particular range, because all the bindings to the concrete type are
done at compile-time rather than runtime. This gives you the top
performance.

Of course, in programming, nothing comes for free; so the price for
this top performance is that you need to produce many copies of the
sort function -- one for each type of range, each optimized for that
particular type of range. And on the surface, it would appear that it
would be unable to handle runtime polymorphism either, because the
template must be bound to the concrete types at compile-time. Right?

Actually, this is where the power of templates is shown: in the example
code I gave above, I deliberately implemented sortRange with
std.algorithm.sort. But actually, the way I wrote it is kind of
unnecessary, because std.algorithm.sort can be instantiated *directly*
with LessThanComparableRange! So actually, we don't even need sortRange
at all -- we can just call std.algorithm.sort directly on our "real
generic" containers, and the compiler will produce a "real generic"
version of sort that has all the runtime indirections that I described
above, for handling runtime polymorphic objects, dynamically-loaded
objects, etc..

Armed with this insight, we can now see that we can actually have the
best of both worlds: if we already know the concrete types at
compile-time, the template system optimizes the sort function at
compile-time to deal specifically with that concrete type -- you get
optimal performance. But if we don't know the concrete type at
compile-time, we create an interface to be implemented by future
concrete types (say by a 3rd party vendor), and the template system
produces a "real generic" version of sort that can handle runtime
polymorphism.

IOW, the templating system *includes* what you describe as "real
generics"; it is not inferior to it at all! In fact, the templating
system can do what "real generics" can't: produce optimal code for a
specific type without needing to copy-n-paste code (either manually or
with an IDE or whatever).
It's the *compiler* that instantiates the template with the concrete
type, so it has access to all the specific implementation details of
said concrete type which it can use to produce the best machine code
for it -- automatically.


T

-- 
Windows: the ultimate triumph of marketing over technology. -- Adrian
von Bidder
Aug 20 2013
next sibling parent "John Colvin" <john.loughran.colvin gmail.com> writes:
On Tuesday, 20 August 2013 at 17:59:35 UTC, H. S. Teoh wrote:
 [...]
Yet again, D proves to be a powerful enough language to not need extra language extensions to support a wide variety of paradigms. We should have some template mixins for this stuff in std.patterns or something.
Aug 20 2013
prev sibling parent reply "Ramon" <spam thanks.no> writes:
Thank you very much, H. S. Teoh

That was an excellent post, very helpful and constructive. Thank 
you!

I'm, btw. not opposed to templates and I see perfectly well that 
a compiler is in a better position to handle them than an editor.
I also agree that sometimes I will gladly use templates rather 
than real generics.

My main point wasn't anti-D (although I still don't like to 
advertise generics when actually delivering templates and 
mechanisms to implement generics oneself) but a completely 
different one:

Sometimes I just want (or feel needing) to go fully OO and, at 
the same time, have the compiler take care of the nitty gritties. 
This, in fact, is a rather humble position; I have learned in 
many years that, no matter how smart we are, we introduce bugs 
and problems into our code because we are humans. I've learned 
the hard way that there are things computers simply do better and 
more diligently than we humans.

And there is another point.

Nowadays, we happily use gigantesque GUI systems, we gladly put 3 
config parameters into an XML file and we happily consider 600k 
lines of source code adequate for, say, an audio player based on 
some insane gobject system ...

On the other hand we do have serious qualms to increase program 
size by, say, 5% in an application that typically runs on a 4 
core processor with 8GB RAM. This strikes me as ... uh ... 
strange.

And we *do get* something, possibly something vital, for those 5% 
code size increase, namely code that is easier to understand and 
to maintain, easier to extend and less buggy.

I'm not sure that Prof. Meyers' no-compromises attitude is always 
good. After all, life is a series of trade-offs and compromises. 
I am, however, pretty sure that template systems tend to be messy 
and complicated - and therefore bound to be error prone.

25 years ago I would have defended any weirdness; the harder and 
stranger the better. Meanwhile I have learned that readability 
often is considerably more important than ease of writing and 
that some compromises seem cheap but turn out to be very 
expensive later on.

Last but not least, there are hundreds (thousands?) of more or 
less compromise-type languages out there. That wasn't what I was 
looking for.

Thanks to your friendly, patient and constructive answer I'm 
feeling that while D isn't perfect it's quite close to it and, 
more importantly, it at least offers  guys like me what I 
consider important, if at a price. So what.

So, again: Thank you very much - Ramon
Aug 20 2013
parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Tue, Aug 20, 2013 at 08:25:59PM +0200, Ramon wrote:
[...]
 My main point wasn't anti-D (although I still don't like to
 advertise generics when actually delivering templates and mechanisms
 to implement generics oneself) but a completely different one:
 
 Sometimes I just want (or feel needing) to go fully OO and, at the
 same time, have the compiler take care of the nitty gritties. This,
 in fact, is a rather humble position; I have learned in many years
 that, no matter how smart we are, we introduce bugs and problems
 into our code because we are humans. I've learned the hard way that
 there are things computers simply do better and more diligently than
 we humans.
Actually, on this point, I think D can be improved. As John Colvin suggested, we should collect the most common and useful OO patterns that provide this kind of runtime polymorphic genericity and put them into the standard library (std.patterns was suggested), so that you don't even have to write this stuff yourself. Then maybe you will feel better about D being advertised as "generic". :)
 And there is another point.
 
 Nowadays, we happily use gigantesque GUI systems, we gladly put 3
 config parameters into an XML file and we happily consider 600k
 lines sorce code adequate for, say, an audio player based on some
 insane gobject system ...
 
 On the other hand we do have serious qualms to increase program size
 by, say, 5% in an application that typically runs on a 4 core
 processor with 8GB RAM. This strikes me as ... uh ... strange.
 
 And we *do get* something, possibly something vital, for those 5%
 code size increase, namely code that is easier to understand and to
 maintain, easier to extend and less buggy.
Actually, I think there's a lot to be said about reducing *source* code
size, in order to keep things maintainable. But more on this below.

OTOH, there are also advantages to keeping *machine* code size smaller
-- if your program (or the resident part of it) can fit completely
within the CPU cache, it will run *much* faster than if the CPU has to
keep swapping code pages in/out because your program is too big.

This issue is kinda hard to address at the high level, though, because
although a runtime polymorphic approach will produce smaller machine
code (no code duplication), it also has poorer locality, because of all
the layers of indirection (the CPU has to keep the vtables and methods
of diverse objects in cache). A template approach has the disadvantage
of larger code size (due to code duplication), but it has better
locality because of less indirection -- it's easier for the CPU to fit
the code/data for inner loops within the cache than if you need to have
a vtable here, a method there, etc..

How all of this actually balances out in practice isn't something easy
to reason about, and is probably highly sensitive to fine-tunings and
specific use-cases.
 I'm not sure that Prof. Meyers no compromises attitude is always
 good. After all, life is a series of trade offs and compromises. I
 am, however, pretty sure that template systems tend to be messy and
 complicated - and therefore bound to be error prone.
 
 25 years ago I would have defended any weirdness; the harder and
 stranger the better. Meanwhile I have learned that readability often
 is considerably more important than ease of writing and that some
 compromises seem cheap but turn out to be very expensive later on.
I'm just guessing here, but I suspect maybe your perceptions have been
colored by the way C++ templates turned out? I've a strong C/C++
background, and IME, C++ templates do in fact turn out really messy and
hard to maintain. However, in my understanding, the problem isn't
really with the fact of a template system itself, but rather the way
C++ implements it. To be fair, when C++ templates were first
introduced, they were only intended to do what generics do; the *other*
usages of C++ templates were emergent behaviour that was discovered
after the fact. As a result, C++'s implementation choices for templates
didn't have the advantage of 20/20 hindsight, and so it turned out to
be really messy to maintain.

D's template system, OTOH... I have to admit that it took me a while to
get used to it, but after having used it for a couple o' years now, I
have nothing but praise for it. First and foremost, the syntax is far
saner than in C++, and, thanks to 20/20 hindsight, it was designed to
support the kind of emergent behaviour of C++ templates that C++ was
never designed to support. This, coupled with other niceties of D
design, CTFE, static if, signature constraints, and a few simple
innovations (but with profound consequences) like eponymous templates,
makes D's template system truly something to be reckoned with. It's
extremely powerful, very versatile, and yet it is *readable*!!

In fact, it is so readable that I have no qualms recommending people to
read the Phobos standard library's source code. You will not find the
kind of opaque unreadably terse convoluted hacks that, say, C/C++ is
full of. Instead, you'll find code exactly of the same kind that you'd
write in your own programs -- nice, clean, readable. Try it sometime.
It's quite an enlightening experience. :)

So, to answer your statement about ease of writing vs. readability, I'd
say, what about *both*? While D is certainly not perfect in this
regard, I haven't seen anything comparable among the languages that I
know so far. D's template system makes code very easy to write, *and*
easy to read. I honestly have no more appetite for C/C++ after learning
D; there is just no comparison.

For instance, in C, the most straightforward way to write code is
actually the wrong way -- you end up with memory leaks, unchecked error
conditions, and all sorts of such issues. Optimized C code, as I'm sure
you know very well, is basically a terse blob of unreadable symbols,
where changing a single character could break the entire program. In
C++, the most straightforward way to write code is, fortunately,
correct for the most part -- but usually slow and unoptimized.
Optimized C++ code unfortunately looks not too much different from the
unreadable blob of optimized C code (and sometimes worse, if you have
templates in the mix).

In D, however, it's actually possible to write code that's very easy to
read, yet fully optimized. D is almost unique in allowing you to write
code that reads like a textbook example, yet is actually carefully
optimized for maximum performance. You can actually write readable code
that's used for production software! D's standard library, Phobos, is a
shining example of this. (On the contrary, all my experiences of
production C/C++ code involve nasty hacks, inscrutable optimizations,
and needlessly fragile poor designs, that make me wonder about career
changes.)

[...]
 Thanks to your friendly, patient and constructive answer I'm feeling
 that while D isn't perfect it's quite close to it and, more
 importantly, it at least offers  guys like me what I consider
 important, if at a price. So what.
[...]

Well, I think we can reduce this price. :) As I mentioned earlier, we
should collect the most common and useful OO patterns that allow you to
write runtime generic code easily, and put them into the standard
library so that people like you can have your cake and eat it too. I
think D is up to the task. :)


T

-- 
English has the lovely word "defenestrate", meaning "to execute by
throwing someone out a window", or more recently "to remove Windows
from a computer and replace it with something useful". :-) -- John
Cowan
Aug 20 2013
parent reply "Ramon" <spam thanks.no> writes:
Thanks again, H. S. Teoh,

for yet another informative and well informed post.

Allow me one remark though.

"Easy to read" can mean a lot of things and coming from soneone 
with a strong C/C++ background, it usually doesn't mean that 
much. No offense intended, I do very well remember my own 
attitude over the years.

Let me put it in the form of a confession. I confess that 25 
years ago I considered Pascal programmers to be lousy hobby boys, 
Ada programmers bureaucratic perverts and Eiffel guys simply 
insane perverts.

It probably doesn't shine a nice light on myself but I have to 
follow up with another and potentially more painful confession: 
It took me over a decade to even think about my position and 
C/C++ and another half decade to consider it possible that 
Pascal, Ada, and Eiffel guys actually might have gotten something 
right, if only minor details ...

Today, some hundred (or thousand?) hours of painful search for 
bugs or problems later due to things like '=' in an if clause, 
I'm ready to admit that any language using '=' for assignment 
rather than, for instance, ':=' is effectively creating a trap 
for the people using it.

Today I look at Ada and Eiffel with great respect.

I've seen hints from the authors of D themselves that '++' and 
'--' might not be the wisest way of action. So I stand here 
asking "Why the hell did they implement them?"
It would be very simple to implement that logic in an editor for 
those who feel that life without '++' is impossible to 
automagically expand "x++" to "X := x + 1". Having seen 
corporations in serious trouble because their system broke (or 
happily continued to run albeit producing erroneous data ...) for 
this "small detail" I have a hard time to defend '++'. ("Save 5 
sec typing/day and risk your company!").

Another issue (from an Ada background): Why "byte" ... (the 
complete series up to) ... "cent"? Bytes happen to be important 
for CPUs - not for the world out there. I wouldn't like to count 
the gazillion cases where code went belly up because something 
didn't fit in 16 bits. Why not the other way around, why not the 
whole she-bang, i.e., 4 (or) 8 bytes as default for a single 
fixed point type ("int") and a mechanism to specify what actually 
is needed?
So for days in a month we'd have "int'b5 dpm;" (2 pow x notation) 
or "int'32dpm;"? Even funnier, even D's authors seems to have had 
thoughts in that direction (but not following them) when 
designing the dyn array mechanism where a dyn array effectively 
has 2 pow x based de facto storage (size 6 (elements officially 
used) de facto is an 8 element array).

This happens to be a nice example for perspective. C's 
perspective (by necessity) was resource oriented along the line 
"offer an 8bit int so as to not waste 16bits were 8bits suffice".
Yet we still do that in the 21st century rather than acting more 
*human oriented* by putting the decision for the size to the 
human. Don't underestimate that! The mere action of reflecting 
how much storage is needed is valuable and helps to avoid errors.

D is, no doubts, an excellent and modern incarnation of C/C++. As 
far as I'm concerned D is *the* best C/C++ incarnation ever, 
hands down.

But is '=' really a holy issue? Would all D programmers have run 
away if D had ':=' as assignment op?

I wish, D had done all the miraculos things it did - and then on 
top, had allowed itself the luxury to be more human centric 
rather than sticking to a paradigm that was necessary 50 years 
ago (and even then not good but necessary)

BTW: I write this because D means a lot to me not to bash it. For 
Java, to name an ugly example, I never wasted a single line of 
criticism; it's just not worth it. So, please, read what I say as 
being written in a warm tone and not negatively minded.
Aug 20 2013
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 8/20/13 2:22 PM, Ramon wrote:
 Today, some hundred (or thousand?) hours of painful search for bugs or
 problems later due to things like '=' in an if clause, I'm ready to
 admit that any language using '=' for assignment rather than, for
 instance, ':=' is effectively creating a trap for the people using it.

 Today I look at Ada and Eiffel with great respect.
    void main() {
        int a, b;
        if (a = b) {}
    }

    ./test.d(4): Error: assignment cannot be used as a condition, perhaps == was meant?

This is a solved problem. (However see also http://d.puremagic.com/issues/show_bug.cgi?id=10862 which I just submitted). I haven't heard a peep about it in D and C++ circles (most C++ compilers define a similar warning, although they aren't required).
 I've seen hints from the authors of D themselves that '++' and '--'
 might not be the wisest way of action. So I stand here asking "Why the
 hell did they implement them?"
That would be news to me. I find "++" and "--" very useful for daily use. D also implements their overloading arguably in a more elegant way than C++ does.
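(As an illustration of the overloading point, a minimal hypothetical sketch, not code from the post: in D a single templated opUnary covers both ++ and --, and the compiler rewrites the postfix forms in terms of it, so no separate post-increment overload is needed as in C++.)

    struct Counter
    {
        int value;

        // One templated operator covers ++c and --c; the compiler
        // rewrites c++ and c-- in terms of it.
        ref Counter opUnary(string op)()
            if (op == "++" || op == "--")
        {
            static if (op == "++")
                ++value;
            else
                --value;
            return this;
        }
    }

    unittest
    {
        Counter c;
        ++c;        // prefix
        c++;        // postfix, rewritten by the compiler
        assert(c.value == 2);
    }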
 It would be very simple to implement that logic in an editor for those
 who feel that life without '++' is impossible to automagically expand
 "x++" to "X := x + 1".
This argument is unlikely to do very well with this crowd.
 Having seen corporations in serious trouble
 because their system broke (or happily continued to run albeit producing
 erroneous data ...) for this "small detail" I have a hard time to defend
 '++'. ("Save 5 sec typing/day and risk your company!").
Very interesting! What pernicious effects does "++" have?
 Another issue (from an Ada background): Why "byte" ... (the complete
 series up to) ... "cent"? Bytes happen to be important for CPUs - not
 for the world out there. I wouldn't like to count the gazillion cases
 where code went belly up because something didn't fit in 16 bits. Why
 not the other way around, why not the whole she-bang, i.e., 4 (or) 8
 bytes as default for a single fixed point type ("int") and a mechanism
 to specify what actually is needed?
 So for days in a month we'd have "int'b5 dpm;" (2 pow x notation) or
 "int'32dpm;"?
As a rule of thumb primitive types model primitive machine types. For everything else there are libraries. Or should be :o).
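(A rough sketch of what such a library type might look like; the name Bounded and the bounds below are invented for illustration, this is not an existing Phobos type.)

    // A library-side integer restricted to a range, instead of a new
    // built-in type.
    struct Bounded(int min, int max)
    {
        private int value = min;

        this(int v) { opAssign(v); }

        void opAssign(int v)
        {
            assert(v >= min && v <= max, "value out of range");
            value = v;
        }

        // lets a Bounded be read wherever an int is expected
        alias value this;
    }

    unittest
    {
        Bounded!(1, 31) dayOfMonth = 28;
        assert(dayOfMonth == 28);
        dayOfMonth = 31;            // fine
        // dayOfMonth = 42;         // would trip the range assert
    }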
 Even funnier, even D's authors seems to have had thoughts
 in that direction (but not following them) when designing the dyn array
 mechanism where a dyn array effectively has 2 pow x based de facto
 storage (size 6 (elements officially used) de facto is an 8 element array).
Exponential growth for O(1) amortized append has nothing to do with basic data sizes.
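(The growth policy can be observed directly; a small illustrative experiment follows. The exact capacities printed are implementation details and vary between compiler and runtime versions.)

    void main()
    {
        import std.stdio : writefln;

        int[] a;
        size_t lastCap = 0;
        foreach (i; 0 .. 100)
        {
            a ~= i;
            if (a.capacity != lastCap)
            {
                // capacity jumps in steps, so most appends allocate nothing
                writefln("length %s -> capacity %s", a.length, a.capacity);
                lastCap = a.capacity;
            }
        }
    }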
 This happens to be a nice example for perspective. C's perspective (by
 necessity) was resource oriented along the line "offer an 8bit int so as
 to not waste 16bits were 8bits suffice".
 Yet we still do that in the 21st century rather than acting more *human
 oriented* by putting the decision for the size to the human. Don't
 underestimate that! The mere action of reflecting how much storage is
 needed is valuable and helps to avoid errors.
There's very much wrong in this. Byte-level access is necessary in a systems language for a variety of reasons, of which storing individual integers is probably the least interesting.
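(One hypothetical example of byte-level access that has nothing to do with storing small integers: viewing a struct as its raw bytes, e.g. for serialization or for poking at a wire or file format. The struct and field names are made up.)

    void main()
    {
        import std.stdio : writefln;

        static struct Header { uint magic; ushort ver; ushort flags; }

        auto h = Header(0xDEADBEEF, 2, 0b1010);

        // view the struct as its raw bytes
        auto raw = (cast(ubyte*)&h)[0 .. Header.sizeof];
        writefln("%(%02x %)", raw);
    }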
 D is, no doubts, an excellent and modern incarnation of C/C++. As far as
 I'm concerned D is *the* best C/C++ incarnation ever, hands down.

 But is '=' really a holy issue? Would all D programmers have run away if
 D had ':=' as assignment op?
Deprecating "=" in favor of ":=" would solve a problem that doesn't exist, and would create a whole new one.
 I wish, D had done all the miraculos things it did - and then on top,
 had allowed itself the luxury to be more human centric rather than
 sticking to a paradigm that was necessary 50 years ago (and even then
 not good but necessary)
I understand we all have our preferences, but reifying them to absolutes is specious. If you said "Yo dudes, I don't dig '=' for assignments and '++' and '--' for {inc,dec}rement. Also I don't give a rat's tail on dem fixed small integers. Other that dat, y'all boys did a rad job. Peace." - well that would have been easier to empathize with.
 BTW: I write this because D means a lot to me not to bash it. For Java,
 to name an ugly example, I never wasted a single line of criticism;
 it's just not worth it. So, please, read what I say as being written in a
 warm tone and not negatively minded.
Awesome, thank you and keep destroying. Andrei
Aug 20 2013
parent reply "Ramon" <spam thanks.no> writes:
Happily I'm stupid and completely missed the condescending tone 
of an evident genius. Instead I'll just be grateful that it 
pleased one of the D masters to drop some statement down at me at 
all.

On Tuesday, 20 August 2013 at 21:52:29 UTC, Andrei Alexandrescu 
wrote:
 On 8/20/13 2:22 PM, Ramon wrote:
 = vs :=
    void main() {
        int a, b;
        if (a = b) {}
    }

    ./test.d(4): Error: assignment cannot be used as a condition, perhaps == was meant?

This is a solved problem. (However see also http://d.puremagic.com/issues/show_bug.cgi?id=10862 which I just submitted). I haven't heard a peep about it in D and C++ circles (most C++ compilers define a similar warning, although they aren't required).
I did as advised and found:
 Now consider:
 
 void main() {
     int a, b;
     if ((a = b) = 0) {}
 }
 
 This compiles diagnostic-free. The shape if (expr1 = expr2) 
 should be
 disallowed at a grammatical level, i.e. during parsing
Oops. So, after all, making it invitingly easy to mix up assignment and comparison can actually be troublesome? Wow.
 I've seen hints from the authors of D themselves that '++' and 
 '--'
 might not be the wisest way of action. So I stand here asking 
 "Why the
 hell did they implement them?"
That would be news to me. I find "++" and "--" very useful for daily use. D also implements their overloading arguably in a more elegant way than C++ does.
I don't remember the precise spot but somewhere (here on this site) someone from the D team says something to the effect of maybe pre *and* post inc/dec might be not so desirable. Whatever, it was *my* error to express myself not precisely. Yes, ++ and -- *are* useful. What I meant (and didn't make clear) was that there are 2 versions, post and pre. One of them is fine, two versions, pre and post, can create trouble.
 It would be very simple to implement that logic in an editor 
 for those
 who feel that life without '++' is impossible to automagically 
 expand
 "x++" to "X := x + 1".
This argument is unlikely to do very well with this crowd.
So do other arguments in a C++ crowd. Happily enough you did think what you thought, and that made you work on D anyway.
 Having seen corporations in serious trouble
 because their system broke (or happily continued to run albeit 
 producing
 erroneous data ...) for this "small detail" I have a hard time 
 to defend
 '++'. ("Save 5 sec typing/day and risk your company!").
Very interesting! What pernicious effects does "++" have?
Same thing as above. Pre-inc'ing e.g. a pointer that should be post-inc'ed can lead to pretty ugly situations.
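(A tiny hypothetical illustration of that kind of mix-up: *p++ reads the current element and then advances, while *++p advances first, so swapping them silently changes which value you get.)

    void main()
    {
        ubyte[4] data = [10, 20, 30, 40];
        auto p = data.ptr;

        auto a = *p++;   // reads data[0] (10), then advances p
        auto b = *++p;   // advances p first, then reads data[2] (30)

        assert(a == 10 && b == 30);
    }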
 Another issue (from an Ada background): Why "byte" ... (the 
 complete
 series up to) ... "cent"? Bytes happen to be important for 
 CPUs - not
 for the world out there. I wouldn't like to count the 
 gazillion cases
 where code went belly up because something didn't fit in 16 
 bits. Why
 not the other way around, why not the whole she-bang, i.e., 4 
 (or) 8
 bytes as default for a single fixed point type ("int") and a 
 mechanism
 to specify what actually is needed?
 So for days in a month we'd have "int'b5 dpm;" (2 pow x 
 notation) or
 "int'32dpm;"?
As a rule of thumb primitive types model primitive machine types. For everything else there are libraries. Or should be :o).
 This happens to be a nice example for perspective. C's 
 perspective (by
 necessity) was resource oriented along the line "offer an 8bit 
 int so as
 to not waste 16bits were 8bits suffice".
 Yet we still do that in the 21st century rather than acting 
 more *human
 oriented* by putting the decision for the size to the human. 
 Don't
 underestimate that! The mere action of reflecting how much 
 storage is
 needed is valuable and helps to avoid errors.
There's very much wrong in this. Byte-level access is necessary in a systems language for a variety of reasons, of which storing individual integers is probably the least interesting.
There is a reason or explanation for everything, no matter what. But my point was another one. It was about another perspective. C's and, so it seems, D's perspective is "What's natural with a CPU" - mine is "What's natural and useful for humans trying to solve problems". With all respect, Andrei, your argument doesn't mean too much to me anyway because if the job at hand is pure system low level programming, I'll do it in C anyway. Furthermore it is well understood nowadays that it might be smart to split even OS design into a HAL (C/Asm) and higher level stuff (higher level language). Does it hurt performance to do everything in 32 bits rather than in, say, 16 bits (on a 32 or 64 bit CPU)? Last time I looked at CPU specs, no. Finally, yes, some system level programming needs byte pointers. Well, how difficult or expensive could it be to implement a byte pointer? I think it's in the well feasible range. But again, my argument wasn't about system programming, which, of course, is the perfect argument for D. OTOH: Is D really and only meant for systems programming? Hardly.
 Deprecating "=" in favor of ":=" would solve a problem that 
 doesn't exist, and would create a whole new one.
It *does* exist. Unless, of course, you are content to consider and treat not just myself as a lowly moron but the creators of Pascal, Ada, Eiffel, and others as well (and btw yourself, too. Have a look at the link you provided above ...) And, just for completeness' sake: May I ask *what* new problems ':=' would create? Other than adding a single char in the parser, that is.
 I wish, D had done all the miraculos things it did - and then 
 on top,
 had allowed itself the luxury to be more human centric rather 
 than
 sticking to a paradigm that was necessary 50 years ago (and 
 even then
 not good but necessary)
I understand we all have our preferences, but reifying them to absolutes is specious. If you said "Yo dudes, I don't dig '=' for assignments and '++' and '--' for {inc,dec}rement. Also I don't give a rat's tail on dem fixed small integers. Other that dat, y'all boys did a rad job. Peace." - well that would have been easier to empathize with.
Oh master, now that the man, to whom I would have referred as "Huh? Andrei WHO??" 2 weeks ago, has told me, I understand that it can't be tolerated to have a personal style of thinking and putting things. Can I be forgiven, if I pray 10 Ave Maria and clean your shoes with my worthless lips? Just btw: I don't care rat sh*t whether you empathize with me or like me.
 BTW: I write this because D means a lot to me not to bash it. 
 For Java,
 to name an ugly example, I never wasted a single line of 
 criticism;
 it's just not worth it. So, please, read what I say as being 
 written in a
 warm tone and not negatively minded.
Awesome, thank you and keep destroying.
"destroying"??? Which part of "not to bash it" and of "D means a lot to me" and of "D is, no doubts, an excellent and modern incarnation of C/C++. As far as I'm concerned D is *the* best C/C++ incarnation ever, hands down." was too complicated to understand for your genius brain? Just in case you are able to be professional on a human level for a moment: I want to buy your book and did look for it. It seems though that it's not easily available in Germany (amazon.de). Would you happen to have a hint for me where I can get it over here? Thanks.
Aug 20 2013
parent reply "John Colvin" <john.loughran.colvin gmail.com> writes:
On Tuesday, 20 August 2013 at 22:49:40 UTC, Ramon wrote:
 Happily I'm stupid and completely missed the condescending tone 
 of an evident genius. Instead I'll just be grateful that it 
 pleased one of the D masters to drop some statement down at me 
 at all.
 Awesome, thank you and keep destroying.
"destroying"??? Which part of "not to bash it" and of "D means a lot to me" and of "D is, no doubts, an excellent and modern incarnation of C/C++. As far as I'm concerned D is *the* best C/C++ incarnation ever, hands down." was too complicated to understand for your genius brain?
I knew this would happen at some point: Andrei uses "destroy" as a positive term to denote a well-reasoned powerful argument/response. Chill :)
Aug 20 2013
next sibling parent "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Aug 21, 2013 at 12:58:16AM +0200, John Colvin wrote:
 On Tuesday, 20 August 2013 at 22:49:40 UTC, Ramon wrote:
Happily I'm stupid and completely missed the condescending tone of
an evident genius. Instead I'll just be grateful that it pleased
one of the D masters to drop some statement down at me at all.
Awesome, thank you and keep destroying.
"destroying"??? Which part of "not to bash it" and of "D means a lot to me" and of "D is, no doubts, an excellent and modern incarnation of C/C++. As far as I'm concerned D is *the* best C/C++ incarnation ever, hands down." was too complicated to understand for your genius brain?
I knew this would happen at some point: Andrei uses "destroy" as a positive term to denote a well-reasoned powerful argument/response. Chill :)
Ahhahaha, so Andrei dropped the "destroy" word without explaining how we use it around these parts. :) For the sake of Ramon and all others who are new around here: there is a tradition on this forum where after you present a (hopefully well-reasoned) argument or wrote some D code you want to put up for peer review (e.g. to include in Phobos), you'd ask others to "destroy" your argument/code as a friendly way of saying "please find all the flaws you can so I can fix them and make it even better". I'm not sure how this usage came about, as it predated my time, but my guess is that in the past, someone would post something they're totally convinced about, only to have somebody on the forum completely tear it apart and point out all the glaring flaws that it failed to address. After a while it became something one expects will happen every time, so in a friendly, faux-challenge sort of way people would invite others to "destroy" their arguments/code/etc., and this became a tradition. In any case, "destroy" as used in these parts is a positive word, not a hostile challenge. So relax. :) T -- Doubt is a self-fulfilling prophecy.
Aug 20 2013
prev sibling parent reply "Ramon" <spam thanks.no> writes:
On Tuesday, 20 August 2013 at 22:58:24 UTC, John Colvin wrote:
 On Tuesday, 20 August 2013 at 22:49:40 UTC, Ramon wrote:
 Happily I'm stupid and completely missed the condescending 
 tone of an evident genius. Instead I'll just be grateful that 
 it pleased one of the D masters to drop some statement down at 
 me at all.
 Awesome, thank you and keep destroying.
"destroying"??? Which part of "not to bash it" and of "D means a lot to me" and of "D is, no doubts, an excellent and modern incarnation of C/C++. As far as I'm concerned D is *the* best C/C++ incarnation ever, hands down." was too complicated to understand for your genius brain?
I knew this would happen at some point: Andrei uses "destroy" as a positive term to denote a well-reasoned powerful argument/response. Chill :)
Uhum. Well, where I live "to destroy" has a pretty clear and very negative meaning. I took that post (of Mr. Alexandrescu) as very rude and condescending and I do not intend to change my communication habits so as to understand "to destroy" as a positive statement or even a compliment. While I'm certainly not in a position to look down on J. Ichbiah, N. Wirth, and B. Meyer, I have certainly not spent the last 25+ years without understanding a thing or two about my profession, no matter what Mr. Alexandrescu seems to think. No matter what Mr. Alexandrescu thinks or likes/dislikes or how he behaves I recognize (and praise) D as a very major improvement on C/C++ and as a very attractive language (by no means only for system programming). Furthermore I recognize and respect Mr. Alexandrescu's profound knowledge of D and the (assumed and highly probable) value of his book and thank him for his work. Maybe I'm simply profitting from my being a lowly retarded creature who, as some kind of a generous compensation by nature, is capable to recognize the knowledge and work of others irrespective of their being friendly or rightout rude and condescending. As for Mr. Alexandrescu's book, I'm glad to report that I will no longer need to molest him with my lowly requests. I have found a way to buy an epub version (through InformIt/Pearson). "D The programming language" has been bought and already downloaded and I'm looking forward to learn quite a lot about D from it. Regards - R.
Aug 20 2013
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
This comment set put together with recent avalanche of JS posts 
makes a lovely collection of internet abuse techniques :)

I can't even rage. It is brilliant.
Aug 20 2013
parent reply "Ramon" <spam thanks.no> writes:
On Wednesday, 21 August 2013 at 02:01:48 UTC, Dicebot wrote:
 This comment set put together with recent avalanche of JS posts 
 makes a lovely collection of internet abuse techniques :)

 I can't even rage. It is brilliant.
Thanks so much for your repeated status updates for your emotions. This is immensely important and doubtlessly of high significance for D. Just out of curiosity: Do you also sometimes refer to D or D related matters here? (Don't call. We will contact you ...)

Tyler Jameson Little:

Thank you. As you are not the first one to excuse/explain Mr. Alexandrescu's, uhm, (sometimes?) strongly individual communication style, I take that to (positively) imply that, except for some quite unlucky situations, Mr. Alexandrescu is well liked by many around here. ACK. Concerning this as well as the "D crowd" and the culture here, kindly give me some time to find my way around here and in D, will you. If this community is just half as attractive as D, and there are indications it is, I'll soon be a happy member ;)
Aug 20 2013
next sibling parent reply "BS" <bumnutbarry gmail.com> writes:
An important aspect in developing anything is a forum where 
people can destroy ideas and opinions clearly and succinctly 
without fear of the destroyed taking it personally.

Technical arguments, and counter-arguments, are best when they 
stick to the point without fear of the other being overly 
precious.

Andrei was *not* rude, he simply showed cases where he disagreed 
with your post and gave reasonable arguments as to why. In no way 
did he make any personal remark about you or your intelligence. 
If you choose to take it personally because he disagreed with you 
then it is not his fault, nor his problem.

You either counter-destroy, with a more solid set of arguments 
and examples, or you accept their case and move onto the next 
difficult problem.

However, if the remarks were personal and of the order "You are 
clearly stupid mate. Everyone knows ABCD 1234!" then it would be 
rude and I could understand your discontent.

There is a clear difference.
Aug 20 2013
parent "BS" <bumnutbarry gmail.com> writes:
On Wednesday, 21 August 2013 at 04:12:33 UTC, BS wrote:
 An important aspect in developing anything is a forum where 
 people can destroy ideas and opinions clearly and succinctly 
 without fear of the destroyed taking it personally.

 Technical arguments, and counter-arguments, are best when they 
 stick to the point without fear of the other being overly 
 precious.

 Andrei was *not* rude, he simply showed cases where he 
 disagreed with your post and gave reasonable arguments as to 
 why. In no way did he make any personal remark about you or 
 your intelligence. If you choose to take it personally because 
 he disagreed with you then it is not his fault, nor his problem.

 You either counter-destroy, with a more solid set of arguments 
 and examples, or you accept their case and move onto the next 
 difficult problem.

 However, if the remarks were personal and of the order "You are 
 clearly stupid mate. Everyone knows ABCD 1234!" then it would 
 be rude and I could understand your discontent.

 There is a clear difference.
I should add D.learn is often less abrupt than this forum, especially if you're just finding your feet in D. Here it is often assumed that posters are OK if their arguments are shot down in flames. It is used as a technical discussion forum for new ideas driving the language forward. That sort of work has to be nutted out without tiptoeing around each other. BUT, I reiterate, it doesn't mean posters are attacking others personally. I have rarely seen that behaviour on the D forums.
Aug 20 2013
prev sibling parent "Dicebot" <public dicebot.lv> writes:
Politeness will kill western civilization one day :)
Aug 21 2013
prev sibling next sibling parent reply "H. S. Teoh" <hsteoh quickfur.ath.cx> writes:
On Wed, Aug 21, 2013 at 03:46:33AM +0200, Ramon wrote:
 On Tuesday, 20 August 2013 at 22:58:24 UTC, John Colvin wrote:
On Tuesday, 20 August 2013 at 22:49:40 UTC, Ramon wrote:
Happily I'm stupid and completely missed the condescending tone
of an evident genius. Instead I'll just be grateful that it
pleased one of the D masters to drop some statement down at me
at all.
Awesome, thank you and keep destroying.
"destroying"??? Which part of "not to bash it" and of "D means a lot to me" and of "D is, no doubts, an excellent and modern incarnation of C/C++. As far as I'm concerned D is *the* best C/C++ incarnation ever, hands down." was too complicated to understand for your genius brain?
I knew this would happen at some point: Andrei uses "destroy" as a positive term to denote a well-reasoned powerful argument/response. Chill :)
Uhum. Well, where I live "to destroy" has a pretty clear and very negative meaning. I took that post (of Mr. Alexandrescu) as very rude and condescending and I do not intend to change my communication habits so as to understand "to destroy" as a positive statement or even a compliment. While I'm certainly not in a position to look down on J. Ichbiah, N. Wirth, and B. Meyer, I have certainly not spent the last 25+ years without understanding a thing or two about my profession, no matter what Mr. Alexandrescu seems to think.
I didn't find Andrei's tone condescending at all. I think you're misunderstanding his intent. Disagreement with your ideas is one thing, but being rude is completely another thing. I would not stand for anyone, even if it's Andrei, being rude to a newcomer, but I don't perceive his response as being rude at all, even if he did disagree with you. It's an unfortunate consequence of online communication that tone of voice and other extra-linguistic elements are not adequately conveyed, as a result, things tend to be taken at face value, such as the use of the word "destroy". Those of us who know Andrei better understand that he's using that word in a joking / non-serious way, but given that you're a newcomer here and don't have the necessary context to interpret what he wrote, I can totally understand how you would interpret it in a negative way. The fact that he didn't bother explaining what he meant didn't help, since at face value, "destroy" does indeed have a very negative connotation. To prove what I said about his intent, one could read through the forum archives to see the use of the word "destroy" in a non-serious way, but since it's not really appropriate for me to put the burden on you to do so, I'll merely mention it. Whether you take my word for it or not, ultimately doesn't really matter that much, but I thought I should belabor this point for the sake of other newcomers here who may experience the same misunderstanding. T -- VI = Visual Irritation
Aug 20 2013
next sibling parent "Ramon" <spam thanks.no> writes:
If I'm not completely mistaken, Mr. Alexandrescu could notice the 
obvious fact that I'm new here.

Whatever, the interesting issue is not myself, or some, so it 
seems, hate and rage obsessed member whose screen name I don't 
remember, no matter how often he informs me about his negative 
feelings, or Mr. Alexandrescu (insofar as being a major figure in 
the "D crowd" and a co-developer of D automatically gives him a 
certain importance in matters of D), *the interesting issue is D*.

Thanks to some of your constructive and socially competent posts 
to me my worries are greatly reduced and if I'm somewhat unhappy 
the reason isn't D but my somewhat unrealistic wishes and my 
possibly premature excitation.

 From what I see so far, D wins hands down against other languages 
that I've looked into in a quest over quite some years now. It's 
not perfect for me but then nothing is perfect for everyone. What 
matters is a simple question in the end: Does the overall mix 
support me in my work and in my striving to efficiently produce 
code that meets my standards in terms of reusability, reliability 
and others?
While I still feel that Eiffel comes somewhat closer to my "dream 
language" I have to - and do - also see that in the end some 
pragmatic criteria make D the best choice for me.

It should also be seen that at *no point in time* did I say (or 
feel) that D is bad or mediocre. It was clear from the beginning 
that D is in the top 3 anyway. Looking at it like this, this 
whole thing is basically a luxury discussion. And again, I did 
question (and question critically) D *because* it's so great and 
deserves attention incl. critically questioning.

As for Mr. Alexandrescu, the main importance (IMO) is in him 
being an important D figure. Whether D should offer a friendly, 
forgiving and welcoming interface to newcomers or whether 
expecting newcomers are to know about Mr. Alexandrescu's personal 
habits and acting in a way that at least makes it easy to be 
misunderstood in an unpleasant and possibly appalling way is not 
for me to decide.

For me it's more important whether Mr. Alexandrescu is capable to 
write a good book about D that helps me during my first steps 
(and possibly even beyond that). From what I know so far, this 
seems highly probable and I'm looking forward to start reading it 
tomorrow and to learn.

Nevertheless, thanks for your constructive and positive way to 
approach things once more ;)

R.
Aug 20 2013
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 8/20/13 7:29 PM, H. S. Teoh wrote:
 On Wed, Aug 21, 2013 at 03:46:33AM +0200, Ramon wrote:
 On Tuesday, 20 August 2013 at 22:58:24 UTC, John Colvin wrote:
 On Tuesday, 20 August 2013 at 22:49:40 UTC, Ramon wrote:
 Happily I'm stupid and completely missed the condescending tone
 of an evident genius. Instead I'll just be grateful that it
 pleased one of the D masters to drop some statement down at me
 at all.
 Awesome, thank you and keep destroying.
"destroying"??? Which part of "not to bash it" and of "D means a lot to me" and of "D is, no doubts, an excellent and modern incarnation of C/C++. As far as I'm concerned D is *the* best C/C++ incarnation ever, hands down." was too complicated to understand for your genius brain?
I knew this would happen at some point: Andrei uses "destroy" as a positive term to denote a well-reasoned powerful argument/response. Chill :)
Uhum. Well, where I live "to destroy" has a pretty clear and very negative meaning. I took that post (of Mr. Alexandrescu) as very rude and condescending and I do not intend to change my communication habits so as to understand "to destroy" as a positive statement or even a compliment. While I'm certainly not in a position to look down on J. Ichbiah, N. Wirth, and B. Meyer, I have certainly not spent the last 25+ years without understanding a thing or two about my profession, no matter what Mr. Alexandrescu seems to think.
I didn't find Andrei's tone condescending at all. I think you're misunderstanding his intent. Disagreement with your ideas is one thing, but being rude is completely another thing. I would not stand for anyone, even if it's Andrei, being rude to a newcomer, but I don't perceive his response as being rude at all, even if he did disagree with you.
Probably not worth belaboring the point, but OP's reaction made me feel guilty enough to go back and re-read http://goo.gl/mJU68I. It's a bummer that Ramon took issue with it, but I'd be insincere to apologize for what I see as no wrongdoing. This may be a cultural difference seeing as OP's rhetoric and literary mannerisms are quite a bit different from the usual on this forum. Andrei
Aug 21 2013
parent "Ramon" <spam thanks.no> writes:
On Wednesday, 21 August 2013 at 16:50:17 UTC, Andrei Alexandrescu 
wrote:
 Probably not worth belaboring the point, but OP's reaction made 
 me feel guilty enough to go back and re-read 
 http://goo.gl/mJU68I. It's a bummer that Ramon took issue with 
 it, but I'd be insincere to apologize for what I see as no 
 wrongdoing. This may be a cultural difference seeing as OP's 
 rhetoric and literary mannerisms are quite a bit different from 
 the usual on this forum.


 Andrei
What a vulgarily cheap shot (on multiple levels). But then, the D book he wrote is quite good and some of his contributions to D are really brilliant. Do I care whether the bookkeeper has what it takes to be significant on a human, character or culture level? Nope. As long as he counts the beans without destroying too much I'm satisfied. So, thank you, Mr. Alexandrescu - End of communication -
Aug 21 2013
prev sibling next sibling parent "Tyler Jameson Little" <beatgammit gmail.com> writes:
On Wednesday, 21 August 2013 at 01:46:37 UTC, Ramon wrote:
 On Tuesday, 20 August 2013 at 22:58:24 UTC, John Colvin wrote:
 On Tuesday, 20 August 2013 at 22:49:40 UTC, Ramon wrote:
 Happily I'm stupid and completely missed the condescending 
 tone of an evident genius. Instead I'll just be grateful that 
 it pleased one of the D masters to drop some statement down 
 at me at all.
 Awesome, thank you and keep destroying.
"destroying"??? Which part of "not to bash it" and of "D means a lot to me" and of "D is, no doubts, an excellent and modern incarnation of C/C++. As far as I'm concerned D is *the* best C/C++ incarnation ever, hands down." was too complicated to understand for your genius brain?
I knew this would happen at some point: Andrei uses "destroy" as a positive term to denote a well-reasoned powerful argument/response. Chill :)
Uhum. Well, where I live "to destroy" has a pretty clear and very negative meaning. I took that post (of Mr. Alexandrescu) as very rude and condescending and I do not intend to change my communication habits so as to understand "to destroy" as a positive statement or even a compliment. While I'm certainly not in a position to look down on J. Ichbiah, N. Wirth, and B. Meyer, I have certainly not spent the last 25+ years without understanding a thing or two about my profession, no matter what Mr. Alexandrescu seems to think. No matter what Mr. Alexandrescu thinks or likes/dislikes or how he behaves I recognize (and praise) D as a very major improvement on C/C++ and as a very attractive language (by no means only for system programming). Furthermore I recognize and respect Mr. Alexandrescu's profound knowledge of D and the (assumed and highly probable) value of his book and thank him for his work. Maybe I'm simply profitting from my being a lowly retarded creature who, as some kind of a generous compensation by nature, is capable to recognize the knowledge and work of others irrespective of their being friendly or rightout rude and condescending. As for Mr. Alexandrescu's book, I'm glad to report that I will no longer need to molest him with my lowly requests. I have found a way to buy an epub version (through InformIt/Pearson). "D The programming language" has been bought and already downloaded and I'm looking forward to learn quite a lot about D from it. Regards - R.
I'm sorry you felt offended by that, but I can assure you, he didn't mean anything negative by it. I probably won't convince you, but here are a few other times the word "destroy" has been used in a similar manner (the first is by Andre): http://forum.dlang.org/thread/kooe7p$255m$1 digitalmars.com http://forum.dlang.org/thread/iauldfsuxzifzofzmaxq forum.dlang.org http://forum.dlang.org/thread/rhwopozmtodmazcyiazj forum.dlang.org http://forum.dlang.org/thread/jlbsreudrapysiaetyrp forum.dlang.org I agree though, it isn't the best term, especially for someone who isn't accustomed to this community, but it's part of the culture. Cheers!
Aug 20 2013
prev sibling parent reply "Regan Heath" <regan netmail.co.nz> writes:
On Wed, 21 Aug 2013 02:46:33 +0100, Ramon <spam thanks.no> wrote:

 On Tuesday, 20 August 2013 at 22:58:24 UTC, John Colvin wrote:
 On Tuesday, 20 August 2013 at 22:49:40 UTC, Ramon wrote:
 Happily I'm stupid and completely missed the condescending tone of an  
 evident genius. Instead I'll just be grateful that it pleased one of  
 the D masters to drop some statement down at me at all.
 Awesome, thank you and keep destroying.
"destroying"??? Which part of "not to bash it" and of "D means a lot to me" and of "D is, no doubts, an excellent and modern incarnation of C/C++. As far as I'm concerned D is *the* best C/C++ incarnation ever, hands down." was too complicated to understand for your genius brain?
I knew this would happen at some point: Andrei uses "destroy" as a positive term to denote a well-reasoned powerful argument/response. Chill :)
Uhum. Well, where I live "to destroy" has a pretty clear and very negative meaning. I took that post (of Mr. Alexandrescu) as very rude and condescending and I do not intend to change my communication habits so as to understand "to destroy" as a positive statement or even a compliment.
Have you heard the phrase "when in Rome..". Seriously, you would rather assume a negative meaning/intent even after someone has taken the time to explain the intent/usage of the word/phrase in this grand forum? I sense that you may be beyond reasonable advice at this point? But, if not.. Always start by assuming good intent, if you're right (and you will be 90% of the time) no problem. If you're wrong, well at least you've not gotten worked up about it (so they have failed in their goal) and chances are it will annoy the abuser even more that you haven't (so ultimately, you win). Communication in written form is fraught with pitfalls, and this thread demonstrates how comments can be taken in completely the wrong way. Dicebot's "I am dangerously close to hating you." was meant in a friendly way, /you/ decided not to read it that way. Likewise Andrei's style is abrupt but there are good reasons for this, none of which include the goal of offending but /you/ have chosen to read them that way. Sure, more effort could be taken to make it clearer with excess smileys etc. But, that stuff isn't necessary for communicating the content, and isn't necessary between established forum members, and isn't necessary if everyone just assumes good intent from the outset. All the best, Regan -- Using Opera's revolutionary email client: http://www.opera.com/mail/
Aug 22 2013
parent reply "Ramon" <spam thanks.no> writes:
On Thursday, 22 August 2013 at 09:10:33 UTC, Regan Heath wrote:
 On Wed, 21 Aug 2013 02:46:33 +0100, Ramon <spam thanks.no> 
 wrote:

 On Tuesday, 20 August 2013 at 22:58:24 UTC, John Colvin wrote:
 On Tuesday, 20 August 2013 at 22:49:40 UTC, Ramon wrote:
 Happily I'm stupid and completely missed the condescending 
 tone of an evident genius. Instead I'll just be grateful 
 that it pleased one of the D masters to drop some statement 
 down at me at all.
 Awesome, thank you and keep destroying.
"destroying"??? Which part of "not to bash it" and of "D means a lot to me" and of "D is, no doubts, an excellent and modern incarnation of C/C++. As far as I'm concerned D is *the* best C/C++ incarnation ever, hands down." was too complicated to understand for your genius brain?
I knew this would happen at some point: Andrei uses "destroy" as a positive term to denote a well-reasoned powerful argument/response. Chill :)
Uhum. Well, where I live "to destroy" has a pretty clear and very negative meaning. I took that post (of Mr. Alexandrescu) as very rude and condescending and I do not intend to change my communication habits so as to understand "to destroy" as a positive statement or even a compliment.
Have you heard the phrase "when in Rome..". Seriously, you would rather assume a negative meaning/intent even after someone has taken the time to explain the intent/usage of the word/phrase in this grand forum? I sense that you may be beyond reasonable advice at this point? But, if not.. Always start by assuming good intent, if you're right (and you will be 90% of the time) no problem. If you're wrong, well at least you've not gotten worked up about it (so they have failed in their goal) and chances are it will annoy the abuser even more that you haven't (so ultimately, you win). Communication in written form is fraught with pitfalls, and this thread demonstrates how comments can be taken in completely the wrong way. Dicebot's "I am dangerously close to hating you." was meant in a friendly way, /you/ decided not to read it that way. Likewise Andrei's style is abrupt but there are good reasons for this, none of which include the goal of offending but /you/ have chosen to read them that way. Sure, more effort could be taken to make it clearer with excess smileys etc. But, that stuff isn't necessary for communicating the content, and isn't necessary between established forum members, and isn't necessary if everyone just assumes good intent from the outset. All the best, Regan
Wow. Now I even get general advice for my life like "Always start by assuming good intent". How about some honesty? It happens to everybody of us. We hadn't any bad intentions but, alas, someone feels offended, improperly treated, etc. There is exactly 1 proper reaction for a responsible adult: To honestly look "Did I contribute to that?" and if so, to explain oneself. It would have cost pretty nothing to Mr. A. to simply say "OOps. Didn't mean any bad. When I say 'destroy' it's actually in between an invitation to continue hitting with constructive criticism and a compliment. Weird habit of mine". Not even a "sorry" would be needed. Well, he didn't. Instead he relied on his alpha-dog factor and the fact that there had already been some group members explaining and excusing him (and, in fact and very funnily, when he finally decided to comment he addressed not me but someone else). Meanwhile I'd be better placed to start trouble - if that ever were my intention. I've read a good part of Mr. A's book, watched quite some youtube, both with Mr. Bright and Mr. A. - and I have, to put it in prosecutor like wording, generously enough material in my hands (where Bright/AA basically say something I said too and got bad reactions. But then, it's not really new that it matters in social groups _who_ says sth.). One simple example: Is Mr. A perfectly well capable to talk/write within usual social limits? Yes, he is. His (btw. very well done, if somewhat jumpy) book proves it. He just happens to feel free to behave like an *** in this group, where he is an alpha and where "tough lingo" and weird personal rites are part of the "culture" - and glue - of this "D crowd". I don't feel hurt, I am not after Mr. A., I'm not looking for trouble and I'm not in fight mode or anti-D or anything like that. But would you (all) kindly refrain from playing your group games with me and telling me bullsh*t? I'm not interested. Mr. A. has written the book on D and he did that quite well. He has largely contributed to D and he did that well and some of his work is even brilliant (for "scope" alone I'd be willing to praise him gleefully). And he also happened to show himself capable of gross social and human incompetence - and I don't care; I'm interested in his work, not in his person. If at all, I'd point out the professional component, i.e. the question, if it is wise for a relatively new, unknown and little used language to drive newcomers off rather than to invite them, guide them and be patient with potentially not so smart questions. Like it or not, Mr. Bright and Mr. A.A. *are* your shopfront. You can count on me striving to be professional and constructive. Don't count on me becoming another adapting dog in the D-crowd, though. Your advice and opinions on programming and language related issues is honestly welcome and appreciated - incl. uncomfortable ones. Anything beyond that is almost certainly not what I would be taking from the "D crowd". Can we now finally continue our lifes? Sure enough someone has some detail issue to be found in the archives and to be enchantedly discussed. With a little luck, your alphas will chime in.
Aug 22 2013
next sibling parent "eles" <eles eles.com> writes:
On Thursday, 22 August 2013 at 12:11:29 UTC, Ramon wrote:
 On Thursday, 22 August 2013 at 09:10:33 UTC, Regan Heath wrote:
 On Wed, 21 Aug 2013 02:46:33 +0100, Ramon <spam thanks.no> 
 wrote:

 On Tuesday, 20 August 2013 at 22:58:24 UTC, John Colvin wrote:
 On Tuesday, 20 August 2013 at 22:49:40 UTC, Ramon wrote:
Look, it is not about alphas, crowds and so on. It is a simple misunderstanding that escalated. Let's end this trouble. There is a lot of work that awaits to be done.
Aug 22 2013
prev sibling next sibling parent reply "Regan Heath" <regan netmail.co.nz> writes:
On Thu, 22 Aug 2013 13:11:28 +0100, Ramon <spam thanks.no> wrote:
 On Thursday, 22 August 2013 at 09:10:33 UTC, Regan Heath wrote:
 On Wed, 21 Aug 2013 02:46:33 +0100, Ramon <spam thanks.no> wrote:

 Well, where I live "to destroy" has a pretty clear and very negative  
 meaning.
 I took that post (of Mr. Alexandrescu) as very rude and condescending  
 and I do not intend to change my communication habits so as to  
 understand "to destroy" as a positive statement or even a compliment.
Have you heard the phrase "when in Rome..". Seriously, you would rather assume a negative meaning/intent even after someone has taken the time to explain the intent/usage of the word/phrase in this grand forum? I sense that you may be beyond reasonable advice at this point? But, if not.. Always start by assuming good intent, if you're right (and you will be 90% of the time) no problem. If you're wrong, well at least you've not gotten worked up about it (so they have failed in their goal) and chances are it will annoy the abuser even more that you haven't (so ultimately, you win). Communication in written form is fraught with pitfalls, and this thread demonstrates how comments can be taken in completely the wrong way. Dicebot's "I am dangerously close to hating you." was meant in a friendly way, /you/ decided not to read it that way. Likewise Andrei's style is abrupt but there are good reasons for this, none of which include the goal of offending but /you/ have chosen to read them that way. Sure, more effort could be taken to make it clearer with excess smileys etc. But, that stuff isn't necessary for communicating the content, and isn't necessary between established forum members, and isn't necessary if everyone just assumes good intent from the outset. All the best, Regan
Wow. Now I even get general advice for my life like "Always start by assuming good intent".
.. and have you taken that advice as it was intended? With good intent? Or are you still assuming the worst in people?
 How about some honesty?
Who isn't being honest?
 It happens to everybody of us. We hadn't any bad intentions but,
 alas, someone feels offended, improperly treated, etc.
 There is exactly 1 proper reaction for a responsible adult: To
 honestly look "Did I contribute to that?" and if so, to explain
 oneself.

 It would have cost pretty nothing to Mr. A. to simply say "OOps.
 Didn't mean any bad. When I say 'destroy' it's actually in
 between an invitation to continue hitting with constructive
 criticism and a compliment. Weird habit of mine". Not even a
 "sorry" would be needed.
The issue is that you've got this totally backwards. In some countries people carry bodily fluids around in a small square of cloth in their pockets, in others they blow them straight onto the side walk. If one is the norm and you're offended by it does someone owe you an apology? Here, on this forum, "destroy" has a well known meaning which is the "norm". If someone uses it, and you are offended, do they owe you an apology? The answer in both case, IMO, is "no". <snip> I can't think of anything constructive to say in response to the rest of that, except that you seem to have a very different view of this community than I do... R -- Using Opera's revolutionary email client: http://www.opera.com/mail/
Aug 22 2013
parent reply "Ramon" <spam thanks.no> writes:
Agreeing with eles

On Thursday, 22 August 2013 at 14:08:47 UTC, eles wrote:

 Let's end this trouble. There is a lot of work that awaits to 
 be done.
I limit myself to

On Thursday, 22 August 2013 at 14:21:45 UTC, Regan Heath wrote:
 .. and have you taken that advice as it was intended?  With 
 good intent?  Or are you still assuming the worst in people?
Yes and no. No, I did not take the advice. Partly because it doesn't mean and/or concern me, partly because I tend to carefully select from whom I take advice. And Yes, I have understood you having good intentions. For the rest of your post: Yeah, right, *I* have got it wrong. Of course. You bunch of assholes. ("asshole", of course, meaning "esteemed colleagues" but I won't tell you that until later). On Thursday, 22 August 2013 at 14:22:44 UTC, Dicebot wrote:
 Human language rarely has any clear and well-defined meanings. 
 Without cultural context it is almost nothing. Actually, in 
 fact people almost never understand each other, they always 
 operate within some amount of false assumptions.
And, of course, the "D-crowd" is perfectly right assuming a newcomer to know their internal communication codes - while - the newcomer, of course, is plain wrong when assuming that words carry the meaning they carry for the rest of the world. Sure. Strikes me as brilliantly logical. I should bow before so much wisdom. How does one bow around here? By farting?
 In that regard, Andrei, who has been using well-established 
 communication protocol understood by most part of this 
 community was most honest and reasonable in expressing his 
 intentions. Failure to understand that is always your failure 
 as "proper" communication is always defined by community and 
 never by beliefs of some people.
Short version: Mr. A is alpha and we are "the D crowd" not only making whatever rules we please but we also expect anyone entering our virtual group to immediately know all our arbitrary wanton rules and kinks. Try that with your dog. Using that on intelligent life forms is bound to fail.
 Scorning from my side is not because of opinions you express or 
 technical goals you find important. It is because of sermon 
 flavor that overwhelms all your comments. No reasonable man can 
 think his beliefs and/or habits are any exceptional. Denying 
 this and refusing to properly study domain you oppose is quite 
 reliable indicator of ignorance or trolling. Probably both and 
 I shouldn't really care about the difference.

 I must admit I am quite fast to lose my temper and tend to 
 overreact sometimes. However, it makes me sad to see that D 
 community falls into completely opposite extreme - wasting time 
 and efforts in useless attempts to satisfy people who are 
 simply mocking it, whatever form it may take.
So I'm supposed to thank you for your lack of consistency and "wasting time"? You seem to know quite a lot about D. It might be wise to play that strength rather than doing what you did here (and which I mercifully will not take on further). Have a nice day and be sure that I'll gladly listen to you as soon as it's about D ;)
Aug 22 2013
next sibling parent reply "Dicebot" <public dicebot.lv> writes:
On Thursday, 22 August 2013 at 15:06:18 UTC, Ramon wrote:
 Let's end this trouble. There is a lot of work that awaits to 
 be done.
I limit myself to
That was cheap, you have been using better techniques before :)
 And, of course, the "D-crowd" is perfectly right assuming a 
 newcomer to know their internal communication codes - while - 
 the newcomer, of course, is plain wrong when assuming that 
 words carry the meaning they carry for the rest of the world. 
 Sure. Strikes me as brilliantly logical. I should bow before so 
 much wisdom. How does one bow around here? By farting?
I have been reading this newsgroup for ~ an year before writing first comment not related to asking questions about D spec. Probably about two years until I felt I can participate in discussions in a constructive way. Still keep failing sometimes. Don't expect that from you. Still answers the question. Actually, most self-regulating communities I have been part of worked that way. And no, right now I don't have any good intentions about you. Can't state it anymore clear. I am not an official D spokesperson and I can afford to reason about people only by their words and deeds, ignoring all PR crap. One day you may do something worthy and I will regret it. Until then you are a miserable troll with disgusting attitude.
Aug 22 2013
parent reply "Ramon" <spam thanks.no> writes:
Dicebot

"Have a nice day and be sure that I'll gladly listen to you as
soon as it's about D ;)"
Aug 22 2013
parent 1100110 <0b1100110 gmail.com> writes:
On 08/22/2013 10:49 AM, Ramon wrote:
 Dicebot

 "Have a nice day and be sure that I'll gladly listen to you as
 soon as it's about D ;)"
I can't decide if this is obvious trolling or a legitimate cultural clash... If it is trolling, It's the most successful that I've ever seen on this newsgroup. So.. Congrats. I guess?
Aug 22 2013
prev sibling parent reply "Regan Heath" <regan netmail.co.nz> writes:
On Thu, 22 Aug 2013 16:06:17 +0100, Ramon <spam thanks.no> wrote:
 On Thursday, 22 August 2013 at 14:08:47 UTC, eles wrote:
 On Thursday, 22 August 2013 at 14:21:45 UTC, Regan Heath wrote:

 .. and have you taken that advice as it was intended?  With good  
 intent?  Or are you still assuming the worst in people?
Yes and no. No, I did not take the advice. Partly because it doesn't mean and/or concern me, partly because I tend to carefully select from whom I take advice.
I believe it should concern you. I believe it contributed to what happened here. I believe you would have a much more pleasant time on internet forums (in general) if you took it on board (not that I have any evidence that you don't, however can you honestly say this is the first time this has happened to you?) Side-note: it should not matter to you "who" is giving the advice, you should judge the advice on its merits, or evaluate it yourself.
 For the rest of your post: Yeah, right, *I* have got it wrong. Of  
 course. You bunch of assholes.
 ("asshole", of course, meaning "esteemed colleagues" but I won't tell  
 you that until later).
I understand, you're blowing off steam. You're directing it in the wrong direction here however. To the point being made; 1 person cannot define the "norm", you cannot redefine "asshole" all by yourself in any meaningful way - that was the point I was making, and the distinction which is important here. R -- Using Opera's revolutionary email client: http://www.opera.com/mail/
Aug 22 2013
parent reply "Ramon" <spam thanks.no> writes:
On Thursday, 22 August 2013 at 16:28:58 UTC, Regan Heath wrote:
 No, I did not take the advice. Partly because it doesn't mean 
 and/or concern me, partly because I tend to carefully select 
 from whom I take advice.
I believe it should concern you. I believe it contributed to what happened here. I believe you would have a much more pleasant time on internet forums (in general) if you took it on board (not that I have any evidence that you don't, however can you honestly say this is the first time this has happened to you?)
It doesn't concern me because I do not entertain the assumption of an evil (to me) world, because I do *not* assume anyone here having bad intentions towards myself (from the beginning. That might be different now)
 For the rest of your post: Yeah, right, *I* have got it wrong. 
 Of course. You bunch of assholes.
 ("asshole", of course, meaning "esteemed colleagues" but I 
 won't tell you that until later).
I understand, you're blowing off steam. You're directing it in the wrong direction here however.
Thanks for walking into my trap (put there for innocent illustrative purposes only).
 ...  To the point being made; 1 person cannot define the 
 "norm", you cannot redefine "asshole" all by yourself in any 
 meaningful way - that was the point I was making, and the 
 distinction which is important here.
Uhum. Well, it was only 1 person abusing the word "destroy". So it *is* just 1 person that did take that liberty - and you evidently think that's OK. Furthermore: If you are right, how many persons are needed? I don't think it's a quantitative issue. What you really say is: You, the newcomer, entered our group and by doing that you have to submit to our rules up to the point of redefining the well established meaning of common words. EAT IT! This not only is untenable by being quite close to rude dictatorship but it's also nonsensical because a newcomer can naturally not know the local quirks and habits, no matter his good will to adapt. Turn and bend it as you please, all the funny groups tactics to make Mr. A's rude habit look nice (an myself guilty) fail. And, surprise, some of you actually *expected* it to become problematic and said so. I do not even have a major problem with it. After all, it's common and wide spread group dynamics that can be (and have been) experienced all over the world. You have your holy little "D crowd" living room and want to impose your own little rules and want to celebrate and show loyalty to the great masters of that "D crowd" living romm? Great, just go ahead, no problem with me. But then stand with it and don't try to paint it nicely with pseudo logical, pseudo psychological or pseudo social explanations. Funny, I have been lectured, fought and even attacked (e.g. as a troll and a miserable creature) and it seems it just didn't strike you that the solution might be as simple as Mr. A. saying (what others already indicated) "Sorry, Ramon, it wasn't meant negative, quite the contrary, but I understand that it could be taken as negative and injust due to the usual meaning of 'destroy'. Feel welcome here". And I'm still quite reasonable, polite, and friendly and I'm still fair and open and praising not only D but even "evil" Mr. A. and calling his work great and even brilliant. To avoid misunderstandings: This is my opinion concerning the matter at hand. I did and do not want to judge or attack you as a person. While I have a clear and by no means positive impression of what you said, I did and do not think your intentions were in any way evil or negative. Actually I assume that your intention was peaceful and constructive.
Aug 22 2013
parent reply "Regan Heath" <regan netmail.co.nz> writes:
On Thu, 22 Aug 2013 18:11:27 +0100, Ramon <spam thanks.no> wrote:
 On Thursday, 22 August 2013 at 16:28:58 UTC, Regan Heath wrote:
 No, I did not take the advice. Partly because it doesn't mean and/or  
 concern me, partly because I tend to carefully select from whom I take  
 advice.
I believe it should concern you. I believe it contributed to what happened here. I believe you would have a much more pleasant time on internet forums (in general) if you took it on board (not that I have any evidence that you don't; however, can you honestly say this is the first time this has happened to you?)
It doesn't concern me because I do not entertain the assumption of a world that is evil (to me), because I do *not* assume that anyone here has bad intentions towards me (from the beginning, at least. That might be different now).
Right. So, what you meant to say earlier was that you were already following the advice, excellent. So, why then assume Andrei was insulting you? The two don't add up.
 For the rest of your post: Yeah, right, *I* have got it wrong. Of  
 course. You bunch of assholes.
 ("asshole", of course, meaning "esteemed colleagues" but I won't tell  
 you that until later).
I understand, you're blowing off steam. You're directing it in the wrong direction here however.
Thanks for walking into my trap (put there for innocent illustrative purposes only).
What trap? I knew what you were doing :P I assumed you weren't attempting to insult me, was I wrong?
 ...  To the point being made; 1 person cannot define the "norm", you  
 cannot redefine "asshole" all by yourself in any meaningful way - that  
 was the point I was making, and the distinction which is important here.
Uhum. Well, it was only 1 person abusing the word "destroy". So it *is* just 1 person that did take that liberty - and you evidently think that's OK.
No, it's not just 1 person (ab)using the word "destroy". That word is part of the fabric of this forum, all the regular posters and long time lurkers know it, and some use it.
 Furthermore: If you are right, how many persons are needed? I don't  
 think it's a quantitative issue.
The regular posters of this forum make up a majority, and that majority uses that word in that way. Every group or society defines its own norms, generally based on a majority "vote". There are exceptions, and religion tends to warp things, but in this case it's fairly simple.
 What you really say is: You, the newcomer, entered our group and by  
 doing that you have to submit to our rules up to the point of redefining  
 the well established meaning of common words. EAT IT!
This is a newsgroup/forum and you're more or less free to do what you like - much like society. However, if your behaviour is anti-social (as defined by the norms of this society) then you will continue to cause friction. This is no different to any other social group or society.
 This not only is untenable by being quite close to rude dictatorship but  
 it's also nonsensical because a newcomer can naturally not know the  
 local quirks and habits, no matter his good will to adapt.
No one assumes a newcomer will know all the "rules", all that is hoped for is that they will do the decent thing and assume good intent and proceed accordingly. Many people lurk for a long while before posting, and learn the "rules" that way. Others learn as they go, without "being offended" in the process.
 Turn and bend it as you please, all the funny group tactics to 
 make Mr. A's rude habit look nice (and myself guilty) fail. And, 
 surprise, some of you actually *expected* it to become 
 problematic and said so.
There are no tactics being employed here. Yes, many realise "destroy" could be misunderstood, but your reaction is, IMO, blowing it out of all proportion.
 I do not even have a major problem with it. After all, it's 
 common and widespread group dynamics that can be (and have been) 
 experienced all over the world. You have your holy little "D 
 crowd" living room and want to impose your own little rules and 
 want to celebrate and show loyalty to the great masters of that 
 "D crowd" living room? Great, just go ahead, no problem with me.
I've been posting since 2004 (or something) and I've never seen this "holy little D crowd" you describe. Some posters' opinions do tend to hold more weight than others, but this is a natural consequence of them saying something worth listening to more often than not. Naturally, newcomers don't have this and it takes time to earn. If you think that's some sort of clique then fine, but there is very little "in-house" arse kissing around here; in fact quite the opposite, as we are invited to "destroy" each other's ideas on a regular basis. Walter and Andrei, the two largest contributors, frequently argue in public and in private. Nothing is sacred, except perhaps a well reasoned argument (devoid of fallacy and abuse).

<snip>.. I have nothing constructive to say to the remainder.

R

-- 
Using Opera's revolutionary email client: http://www.opera.com/mail/
Aug 22 2013
parent reply "Ramon" <spam thanks.no> writes:
Regan Heath

You try to wrap it nicely but in the end you just prove my 
hypothesis right. The newcomer not only has to know all the 
local habits and quirks of the group but he also has to know the 
history behind it. As a helpful hint you pick up Dicebot's 
suggestion that a newcomer should probably stay read-only for a 
while.

Great. And what exactly kept you from formalizing that and 
making it known to newcomers?

You try different funny tricks on me, for instance by mixing up 
responsibilities. If this group has rules - which it is 
perfectly entitled to have - then it's the group's 
responsibility to make those rules known in advance. It is *not* 
the newcomer's responsibility to somehow find out about them, 
possibly by getting accused of destruction.

Another little trick of yours is, to put it bluntly, to play the 
card "We are many, you are just 1; we have been here for years, 
you are new - so bend over and obey".

Frankly, the original matter doesn't even matter that much to me 
anymore. I put it aside quite a while ago as "he's a cheap 
asshole with micro-alpha syndrome, but he has done very useful 
and partly brilliant work. That's all I want from him. So 
what?".
What drives me now is the desperate, abstruse and stubborn group 
dynamics at play. And no, I'm not doing that just for the fun of 
it; it can actually be a useful service (and it does have a 
certain relation to the original problem).

In two words: Context counts. (Which, btw., is something you 
should like, since you try to play that card a lot.)
In this context group seniority might be a big thing. Or 
particular technical skills. As soon as we leave the area of 
code, however, the cards get shuffled again and who was big then 
might be surprisingly small. In this discussion here, for 
instance, the capability to analyze and recognize e.g. social 
and rhetorical mechanisms is way more important than D skills 
(no surprise - after all it *is* a group, social and human 
thing).

To put it bluntly: Chances are that I can take apart whatever 
smart tricks you come up with. But why, what for?
Why don't you yourself just stick to your own advice and assume 
- and correctly assume - that I have no bad intentions?
You even have proof! If I had bad intentions or were just out 
for a fight or revenge, I would certainly not have recognized 
A's work as brilliant and lauded his book. Nor would I quite 
politely and patiently discuss and respond to statements that I, 
no offense intended, perceive as, uh, less than intellectually 
exciting.

Take what I offer. Because it's good and because you will 
definitely not succeed in getting any femtogram more from me.

a) Mr. A. did act in an unfair and unjustified way, no matter 
how you try to bend it. Maybe what he did was well known and 
usual here. But not toward me.

b) It's long forgiven and I'm in a peaceful and constructive 
state of mind. But don't you dare try to convince me that Mr. A. 
was right and that I should bend over and adapt to absurd group 
rules that demand, inter alia, precognition and possibly 
telepathy.

Can we now finally return to discussing D, algorithms, code and 
the like, or do you insist on educating me and on continuing 
your route toward nada, nothing, zilch?

Just consider me a miserable creature and really ugly on top of 
it if that helps.
Aug 22 2013
next sibling parent reply "WiseWords" <bumnutbarry gmail.com> writes:
On Thursday, 22 August 2013 at 19:20:37 UTC, Ramon wrote:
 Regan Heath

 <snip>..
Nice tantrum :D Wise Words are spoken unto thee "Grow a pair and move on"
Aug 22 2013
parent reply "bsd" <slackovsky gmail.com> writes:
On Friday, 23 August 2013 at 01:14:19 UTC, WiseWords wrote:
 On Thursday, 22 August 2013 at 19:20:37 UTC, Ramon wrote:
 <snip>..
Nice tantrum :D Wise Words are spoken unto thee "Grow a pair and move on"
Well, that's a bit harsh. Can we close this thread?
Aug 22 2013
parent reply "Ramon" <spam thanks.no> writes:
On Friday, 23 August 2013 at 01:17:55 UTC, bsd wrote:
 Well, that's a bit harsh. Can we close this thread?
Aren't you a bit harsh here? After all, as a quick forum search suggests, "Wise Words", or more correctly the person who usually writes under another screen name here made the effort to bravely use another screen name in order to ... uhm ... troll. I guess "Wise Words" didn't mean me but rather told us about an experience where someone told him "Grow a pair and move on". Evidently he didn't follow at least the first part of the advice given to him. Amusing.
Aug 22 2013
parent reply "bsd" <slackovsky gmail.com> writes:
On Friday, 23 August 2013 at 02:41:35 UTC, Ramon wrote:
 On Friday, 23 August 2013 at 01:17:55 UTC, bsd wrote:
 Well, that's a bit harsh. Can we close this thread?
Aren't you a bit harsh here? After all, as a quick forum search suggests, "Wise Words", or more correctly the person who usually writes under another screen name here made the effort to bravely use another screen name in order to ... uhm ... troll. I guess "Wise Words" didn't mean me but rather told us about an experience where someone told him "Grow a pair and move on". Evidently he didn't follow at least the first part of the advice given to him. Amusing.
Indeed, BS you are a troll :D
Aug 22 2013
parent "BS" <barrybumnut gmail.com> writes:
On Friday, 23 August 2013 at 04:23:14 UTC, bsd wrote:
 On Friday, 23 August 2013 at 02:41:35 UTC, Ramon wrote:
 On Friday, 23 August 2013 at 01:17:55 UTC, bsd wrote:
 Well, that's a bit harsh. Can we close this thread?
 <snip>..
Indeed, BS you are a troll :D
OK, you got me, I'll stop :-P Damn, bitten by a gravatar!
Aug 22 2013
prev sibling parent "Regan Heath" <regan netmail.co.nz> writes:
On Thu, 22 Aug 2013 20:20:36 +0100, Ramon <spam thanks.no> wrote:
 Regan Heath
Ramon
 You try to wrap it nicely but in the end you just prove my hypothesis  
 right.
*sigh* If that's your takeaway, I doubt anything I can say will change your mind; it's not how I see it.

To conclude: there are no tactics, tricks or anything else other than bare honesty being employed here. You're seeing all the rest entirely off your own bat, including the "offense" given by Andrei.

Imagine how this looks to a regular poster of this group. Andrei posts something which is normal for him and normal for this group. You react in a way which is /not/ normal for this group. Which one of you is "misbehaving" (according to the group)?

I agree context counts. Andrei's words to you, in the context of this group, were "normal" and not at all offensive. If you were to take his words and replace those which have a meaning in this context with others you were familiar with, you would not have taken offense.

You cannot expect to post in a context unknown to you without some confusion. How you choose to deal with that confusion is the key, and as I said before and will say one last time: "assume good intent".

R

-- 
Using Opera's revolutionary email client: http://www.opera.com/mail/
Aug 23 2013
prev sibling parent "Dicebot" <public dicebot.lv> writes:
Human language rarely has clear and well-defined meanings. 
Without cultural context it is almost nothing. In fact, people 
almost never fully understand each other; they always operate 
under some amount of false assumptions.

In that regard Andrei, who was using a well-established 
communication protocol understood by most of this community, was 
most honest and reasonable in expressing his intentions. Failure 
to understand that is your failure, as "proper" communication is 
always defined by the community and never by the beliefs of a 
few individuals.

The scorn from my side is not because of the opinions you 
express or the technical goals you find important. It is because 
of the sermonizing flavor that overwhelms all your comments. No 
reasonable man can think his own beliefs and/or habits are in 
any way exceptional. Denying this, and refusing to properly 
study the domain you oppose, is a quite reliable indicator of 
ignorance or trolling. Probably both, and I shouldn't really 
care about the difference.

I must admit I am quite quick to lose my temper and tend to 
overreact sometimes. However, it makes me sad to see the D 
community fall into the completely opposite extreme - wasting 
time and effort in useless attempts to satisfy people who are 
simply mocking it, whatever form that mocking may take.
Aug 22 2013
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2013-08-19 22:22, H. S. Teoh wrote:

 If we were to set things up so that libprogram.a contains a separate
 unit for each instantiated template function, then the linker would
 actually pull in only code that is actually referenced at runtime. For
 example, say our code looks like this:
Doesn't the compiler already do something like that with the -lib flag?

-- 
/Jacob Carlborg
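For illustration, a rough sketch of how that -lib workflow might look. The file and module names here are hypothetical, it assumes dmd on a POSIX system, and it assumes (as has been reported) that -lib places symbols in separately extractable object modules so the linker can pull in only what is referenced:

	# Build the templated module as a static library. With -lib, dmd is
	# said to emit separately extractable object modules, so the linker
	# can later pick out only the instantiations that are referenced.
	dmd -lib -oflibs.a s.d

	# Compile the rest of the program to an ordinary object file.
	dmd -c program.d

	# Link. Only the instantiations actually referenced by program.o
	# should be pulled out of libs.a into the final executable.
	dmd program.o libs.a -ofprogram

Whether this actually leaves the unused instantiations out of the final executable in practice would still need to be verified against current dmd and linker behaviour.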
Aug 20 2013