
digitalmars.D - Should out/ref parameters require the caller to specify out/ref like

reply WebFreak001 <d.forum webfreak.org> writes:
Imagine you wrote a function

void foo(ref int a) {
   import std.random : uniform;
   if (uniform(0, 10) == 0)
     a = 0;
   // Actual code doing something
}

In your API you didn't document that this will change `a` (or we 
will assume the user simply didn't read the docs, because you 
would never do something like this).

The user now calls the code in his program. He probably doesn't 
know that foo takes `a` by ref, and because it's his first time 
using the function he doesn't expect it to either.

void main(string[] args) {
   import std.conv : to;
   import std.stdio : writeln;
   int input = args[1].to!int + 1;
   writeln("Processing for ", input);
   foo(input);
   writeln(100 / input); // idk, it will crash if input == 0 though
}

Now his code will occasionally crash but the user can't figure 
out why and can't always reproduce it. Imagine the code is 
somewhere deep inside event handlers from some GUI library or 
recursive calls too.

Should the language spec say that those functions must be called 
with `foo(ref input);`, so that surprises like this, where the 
user doesn't check the docs/implementation, can't happen (like in 
some other languages)?

I think the user should be required to use foo(ref input) instead 
of foo(input), as it greatly increases understanding of the code 
on the caller side. Another advantage is that programs analyzing 
the AST can better tell whether the argument is unused (DScanner 
could use this, for example).

On the other hand, a lot of code has already been written without 
this, and especially a lot of UFCS code would break, for example 
functions acting like member functions of their ref argument. A 
fix for this might be to simply imply the ref on the argument 
when you use UFCS, but I'm not sure if that is really a good idea 
(see the sketch below).
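To make the UFCS case concrete, here is a rough sketch (compiles 
with current D; the `// proposed:` line shows the hypothetical 
call-site syntax, which is not valid D today):

import std.stdio;

void increment(ref int a) { ++a; } // mutates its argument

void main() {
   int x = 1;

   increment(x);    // current D: nothing at the call site hints that x changes
   // proposed: increment(ref x);

   x.increment();   // UFCS call, reads like a member function of x,
                    // so the ref could stay implied here

   writeln(x);      // prints 3
}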

This post is just an idea, because I think the current behavior 
can result in really confused users, and the usage of out/ref in 
general, both by the library developer and the user, is kind of 
bad by design: the ref is just implied and it won't be visible in 
the code. Especially when an API changes, this will result in 
many runtime errors that need to be discovered if the user 
doesn't read the changelog.

Especially because this would break a lot of code, I don't really 
expect anything to happen, but maybe this could be taken into 
account when D3 is designed, or be added in some optional DIP 
that can be enabled with a compiler flag?
May 28 2017
next sibling parent Moritz Maxeiner <moritz ucworks.org> writes:
On Sunday, 28 May 2017 at 17:54:30 UTC, WebFreak001 wrote:
 Imagine you wrote a function

 void foo(ref int a) {
   if (std.random.uniform(0, 10) == 0)
     a = 0;
   // Actual code doing something
 }

 in your API you didn't document that this will change `a` (or 
 we will assume the user simply didn't read because you would 
 never do something like this).
Taking the parameter as `ref int` *is* documenting that you will mutate it. Otherwise, you would have taken it as `ref const int` instead. Additionally, someone who does not read documentation is likely to have his/her program so riddled with bugs that I don't think considering him/her is worth the effort.
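For instance, a minimal sketch of what I mean by the two signatures:

import std.stdio;

void mutate(ref int a) { a = 0; }              // "I may write to a"
void inspect(ref const int a) { writeln(a); }  // "I only read a";
                                               // `a = 0;` in here would not compile

void main() {
    int x = 42;
    inspect(x); // prints 42
    mutate(x);
    inspect(x); // prints 0
}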
 The user now calls the code in his program, probably doesn't 
 know that foo takes a as ref and because it's his first time 
 using the function he doesn't expect it to either.

 void main(string[] args) {
   int input = args[1].to!int + 1;
   writeln("Processing for ", input);
   foo(input);
   writeln(100 / input); // idk, it will crash if input == 0 
 though
 }

 Now his code will occasionally crash but the user can't figure 
 out why and can't always reproduce it. Imagine the code is 
 somewhere deep inside event handlers from some GUI library or 
 recursive calls too.
Then you'll have to debug, yes. That's an argument for always treating `ref T` as "I will mutate this" and `ref const T` as "I just want to read from this", and for (almost) never casting away const.
 Should the language spec say that those functions should get 
 called with `foo(ref input);` so that surprises like this where 
 the user doesn't check the docs/implementation can't happen 

While I personally might like this syntax, what about const / immutable: does this then require `foo(ref const input)` and `foo(ref immutable input)`? Or `foo(ref cast(const int) input)`?
 [...]
I can see both points of your argument, but here is one more argument against it: It introduces a language inconsistency by special casing references in function calls. We would have to allow `ref` everywhere we already allow `&` for pointers to get rid of the inconsistency (regardless of whether it makes sense to allow `ref` elsewhere).
 Especially because this would break a lot of code I don't 
 really expect anything to happen but maybe this could be taken 
 into account when D3 will be designed or be added in some 
 optional DIP that can be used using a compiler flag?
AFAIK D3 is hypothetical at this point.
May 28 2017
prev sibling next sibling parent reply Meta <jared771 gmail.com> writes:
On Sunday, 28 May 2017 at 17:54:30 UTC, WebFreak001 wrote:
 Imagine you wrote a function

 void foo(ref int a) {
   if (std.random.uniform(0, 10) == 0)
     a = 0;
   // Actual code doing something
 }

 in your API you didn't document that this will change `a` (or 
 we will assume the user simply didn't read because you would 
 never do something like this).

 The user now calls the code in his program, probably doesn't 
 know that foo takes a as ref and because it's his first time 
 using the function he doesn't expect it to either.

 void main(string[] args) {
   int input = args[1].to!int + 1;
   writeln("Processing for ", input);
   foo(input);
   writeln(100 / input); // idk, it will crash if input == 0 
 though
 }

 Now his code will occasionally crash but the user can't figure 
 out why and can't always reproduce it. Imagine the code is 
 somewhere deep inside event handlers from some GUI library or 
 recursive calls too.

 Should the language spec say that those functions should get 
 called with `foo(ref input);` so that surprises like this where 
 the user doesn't check the docs/implementation can't happen 


 I think the user should be enforced to use foo(ref input) 
 instead of foo(input) as it greatly increases understanding of 
 the code on the caller side and another advantage is that 
 programs analyzing the AST can better understand if the 
 argument is unused (DScanner could use this for example).

 On the other hand a lot of code has been written without this 
 already and especially a lot of UFCS code would break with this 
 like for example functions acting like member functions of the 
 ref argument. A fix for this might be just implying the ref you 
 would add on an argument if you use UFCS but I'm not sure if 
 that is really a good idea.

 This post is just an idea because I think it can result in 
 really confused users and that the usage both by library 
 developer and user of out/ref in general is kind of bad by 
 design because it will just imply it and it won't be visible in 
 the code. Especially when changing the API this will result in 
 many runtime errors that need to be discovered if the user 
 doesn't read the changelog.

 Especially because this would break a lot of code I don't 
 really expect anything to happen but maybe this could be taken 
 into account when D3 will be designed or be added in some 
 optional DIP that can be used using a compiler flag?
If a parameter is marked as ref then you have to assume it will be modified by the function (unless it's const/inout/immutable). If it's marked as out then you know it will be. If you didn't know that the function takes its parameters by ref or out... You should've RTFM'd.
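E.g. (a minimal sketch):

import std.stdio;

void maybeModify(ref int a) { if (a > 0) a *= 2; } // ref: may or may not write
void alwaysSet(out int a) { a = 7; }               // out: a is reset to int.init on
                                                   // entry and is meant to be written

void main() {
    int x = 3;
    maybeModify(x);
    writeln(x); // 6

    int y = 100;
    alwaysSet(y);
    writeln(y); // 7 -- the old value of y never mattered
}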
May 28 2017
next sibling parent reply ketmar <ketmar ketmar.no-ip.org> writes:
Meta wrote:

 If a parameter is marked as ref then you have to assume it will be 
 modified by the function (unless it's const/inout/immutable). If it's 
 marked as out then you know it will be. If you didn't know that the 
 function takes its parameters by ref or out... You should've RTFM'd.
now imagine that you're reading some code:

foo(a);

vs:

foo(ref a);

which code style is easier to read without constant jumping into documentation?
May 28 2017
parent reply Meta <jared771 gmail.com> writes:
On Sunday, 28 May 2017 at 19:10:49 UTC, ketmar wrote:
 Meta wrote:

 If a parameter is marked as ref then you have to assume it 
 will be modified by the function (unless it's 
 const/inout/immutable). If it's marked as out then you know it 
 will be. If you didn't know that the function takes its 
 parameters by ref or out... You should've RTFM'd.
now imagine that you're reading some code: foo(a); vs: foo(ref a); which code style is easier to read without constant jumping into documentation?
Is this ever actually a problem in practice? Anyway, having to add ref or out at the call site will greatly hamper metaprogramming.
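For example, a sketch of the kind of forwarding code that would get 
more complicated (`forward` and `target` are made-up names):

import std.stdio;

void target(ref int a, int b) { a += b; }

// Today a generic forwarder doesn't need to know which of f's parameters
// are ref -- the compiler matches them up.
void forward(alias f, Args...)(auto ref Args args) {
    f(args);
    // With a mandatory call-site ref, this one line would need
    // introspection to spell out ref/out per argument.
}

void main() {
    int x = 1;
    forward!target(x, 41);
    writeln(x); // 42
}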
May 28 2017
parent ketmar <ketmar ketmar.no-ip.org> writes:
Meta wrote:

 On Sunday, 28 May 2017 at 19:10:49 UTC, ketmar wrote:
 Meta wrote:

 If a parameter is marked as ref then you have to assume it will be 
 modified by the function (unless it's const/inout/immutable). If it's 
 marked as out then you know it will be. If you didn't know that the 
 function takes its parameters by ref or out... You should've RTFM'd.
now imagine that you're reading some code: foo(a); vs: foo(ref a); which code style is easier to read without constant jumping into documentation?
Is this ever actually a problem in practice?
yes. aliced has this syntax for a long time. for a reason. and i must say that i'm adding a lot of random features to aliced, but very few of 'em survive.
 Anyway, having to add ref or out at the call site will greatly hamper 
 metaprogramming.
not more than "&" does. except with `auto ref` -- which is a misdesigned feature anyway (forcing the programmer to do the work the compiler can do without any external help is not a good design; besides, the compiler can do that work *better*).
May 29 2017
prev sibling parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 05/28/2017 03:06 PM, Meta wrote:
 
 If you didn't know that the 
 function takes its parameters by ref or out... You should've RTFM'd.
That's the same reasoning that's been used to excuse just about every API blunder in C's infamously unsafe bug-riddled history.
May 28 2017
parent reply Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Monday, 29 May 2017 at 01:56:19 UTC, Nick Sabalausky 
(Abscissa) wrote:
 On 05/28/2017 03:06 PM, Meta wrote:
 
 If you didn't know that the function takes its parameters by 
 ref or out... You're should've RTFM.
That's the same reasoning that's been used to excuse just about every API blunder in C's infamously unsafe bug-riddled history.
This is information that a good IDE could be designed to provide. To require "ref" is rather pointless as it would make the feature redundant; just use a pointer and "&" instead and argue in favour of non-nullable static analysis...
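E.g. a minimal sketch of the pointer alternative:

import std.stdio;

void foo(int* a) {       // pointer instead of ref
    if (a !is null)
        *a = 0;
}

void main() {
    int input = 5;
    foo(&input);         // the & makes the possible mutation visible at the call site
    writeln(input);      // prints 0
}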
May 28 2017
next sibling parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:
On 05/29/2017 12:57 AM, Ola Fosheim Grostad wrote:
 On Monday, 29 May 2017 at 01:56:19 UTC, Nick Sabalausky (Abscissa) wrote:
 On 05/28/2017 03:06 PM, Meta wrote:
 If you didn't know that the function takes its parameters by ref or 
 out... You should've RTFM'd.
That's the same reasoning that's been used to excuse just about every API blunder in C's infamously unsafe bug-riddled history.
This is information that a good IDE could be designed to provide. To require "ref" is rather pointless as it would make the feature redundant, just use a pointer and "&" instead and argue in favour of nonnullable static analysis...
Did you intend that as a response to my post or to the OP? Sounds more like it was directed at the OP.

The web interface really should add an extra "Reply" button way at the bottom that creates a reply to the OP (or better yet, default to tree view). I've noticed that the stupid default of "remove all threading information" combined with individual message replies creates a lot of confusion in cases where people try to reply to the OP, but wind up *actually* replying to whatever random leaf just happened to be the latest post.

(I also suspect the web interface's "remove threading" default is probably also the ultimate source of most of the animosity towards OT. Pretending that there's no threading in an inherently threaded newsgroup just begs for these sorts of problems.)
May 28 2017
parent Ola Fosheim Grostad <ola.fosheim.grostad gmail.com> writes:
On Monday, 29 May 2017 at 05:39:41 UTC, Nick Sabalausky 
(Abscissa) wrote:
 Did you intend that as a response to my post or to the OP? 
 Sounds more like it was directed at the OP.
I tried to reply to:

 in favor of doing the same in D (ages ago). It got shot down way back 
 when, so unfortunately, I don't think any reversal is going to happen.

But failed... prolly because you had 2 msgs in a row... Sorry.
May 28 2017
prev sibling parent ketmar <ketmar ketmar.no-ip.org> writes:
Ola Fosheim Grostad wrote:

 This is information that a good IDE could be designed to provide.
when people start talking about how an IDE can help to solve a particular language problem, it is a clear sign of bad language design.
May 29 2017
prev sibling next sibling parent reply Stefan Koch <uplink.coder googlemail.com> writes:
On Sunday, 28 May 2017 at 17:54:30 UTC, WebFreak001 wrote:
 Imagine you wrote a function

 void foo(ref int a) {
   if (std.random.uniform(0, 10) == 0)
     a = 0;
   // Actual code doing something
 }

 [...]
Syntax-wise we could force you to say foo(&something), which fits perfectly into the existing pointer syntax.
May 28 2017
parent reply Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Sunday, 28 May 2017 at 22:03:48 UTC, Stefan Koch wrote:
 On Sunday, 28 May 2017 at 17:54:30 UTC, WebFreak001 wrote:
 Imagine you wrote a function

 void foo(ref int a) {
   if (std.random.uniform(0, 10) == 0)
     a = 0;
   // Actual code doing something
 }

 [...]
Syntax wise we could force you to say foo(&something). Which fits perfectly in the existing pointer syntax.
No it does not, because then this becomes ambiguous:

foo(ref int a);
foo(int* b);

...and furthermore, what would we do with this:

void foo1(T)(auto ref T a) { foo2(a); }
void foo2(T)(auto ref T a) { /*...*/ }

?
May 28 2017
parent reply Stefan Koch <uplink.coder googlemail.com> writes:
On Sunday, 28 May 2017 at 22:18:01 UTC, Stanislav Blinov wrote:
 On Sunday, 28 May 2017 at 22:03:48 UTC, Stefan Koch wrote:
 On Sunday, 28 May 2017 at 17:54:30 UTC, WebFreak001 wrote:
 Imagine you wrote a function

 void foo(ref int a) {
   if (std.random.uniform(0, 10) == 0)
     a = 0;
   // Actual code doing something
 }

 [...]
Syntax wise we could force you to say foo(&something). Which fits perfectly in the existing pointer syntax.
No it does not, because then this becomes ambiguous: foo(ref int a); foo(int* b); ...and furthermore, what would we do with this: void foo1(T)(auto ref T a) { foo2(a); } void foo2(T)(auto ref T a) { /*...*/ } ?
Personally I stay away from ref precisely because of its silent caller syntax.
May 28 2017
parent ketmar <ketmar ketmar.no-ip.org> writes:
Stefan Koch wrote:

 Personally I stay away from ref precisely because of its silent caller 
 syntax.
same here. i prefer to use pointers instead, 'cause they are visible at the calling site.
May 28 2017
prev sibling next sibling parent reply Nicholas Wilson <iamthewilsonator hotmail.com> writes:
On Sunday, 28 May 2017 at 17:54:30 UTC, WebFreak001 wrote:
 Imagine you wrote a function

 void foo(ref int a) {
   if (std.random.uniform(0, 10) == 0)
     a = 0;
   // Actual code doing something
 }

 [...]
It seems nice in theory, but how will it interact with generic code? Perhaps it should be optional and purely for documentation.
May 28 2017
parent ketmar <ketmar ketmar.no-ip.org> writes:
Nicholas Wilson wrote:

 On Sunday, 28 May 2017 at 17:54:30 UTC, WebFreak001 wrote:
 Imagine you wrote a function

 void foo(ref int a) {
   if (std.random.uniform(0, 10) == 0)
     a = 0;
   // Actual code doing something
 }

 [...]
It seems nice in theory but how will it interact with generic code? Perhaps it should be optional and purely documentative.
exactly like "&" does.
May 29 2017
prev sibling next sibling parent Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Sunday, May 28, 2017 17:54:30 WebFreak001 via Digitalmars-d wrote:
 I think the user should be enforced to use foo(ref input) instead
 of foo(input) as it greatly increases understanding of the code
 on the caller side and another advantage is that programs
 analyzing the AST can better understand if the argument is unused
 (DScanner could use this for example).
This has been discussed before. Among other things, it does not play nicely with UFCS or generic code (both of which are huge in D). If you really want to have it show at the call site that you're not simply passing by value, then use a pointer. - Jonathan M Davis
May 28 2017
prev sibling next sibling parent reply "Nick Sabalausky (Abscissa)" <SeeWebsiteToContactMe semitwist.com> writes:

of doing the same in D (ages ago). It got shot down way back when, so 
unfortunately, I don't think any reversal is going to happen.

Getting in the way of UFCS and generic code are fair points, although 
they don't strike me as insurmountable. I think that if D's ref/out 
params had worked that way from the beginning we would've already had 
solutions to those matters. And it does kinda bug me when generic code 
ends up being a reason to not have a nice beneficial feature for 
non-generic code, just fwiw :(

In response to any claim that this isn't a real problem in practice, I 
submit the possibility that, if it indeed isn't a real problem, maybe 
that's *because* of people (like Stefan and ketmar) simply avoiding the 
feature entirely so that it *doesn't* become a problem.
May 28 2017
parent reply ketmar <ketmar ketmar.no-ip.org> writes:
Nick Sabalausky (Abscissa) wrote:

 In response to any claim that this isn't a real problem in practice, I 
 submit the possibility that, if it indeed isn't a real problem, maybe 
 that's *because* of people (like Stefan and ketmar) simply avoiding the 
 feature entirely so that it *doesn't* become a problem.
exactly! you can include Adam here too, btw. we all prefer to use pointer args exactly to avoid *the* *problem*. so it is not a problem for us anymore... but we haven't solved it, we simply cheated.

basically, my only `ref` usage is `in auto ref` (which is not a "real" ref anyway, as the argument cannot be changed).

side note: also, with pointers i can do `void foo (int a, bool* res=null) {...}`. not a big deal: one can always create a set of overloads to get the same effect with ref, but -- less typing! ;-) (rough sketch below)

yet i must say that using pointers in code where they should be references makes me... nervous. it just doesn't feel right. but meh, i'll trade that (and safety, 'cause `&` is unsafe) for a "call site ref indicator". that's why i don't bother writing @safe code, btw: in one or another way i will be forced to hack around @safe restrictions, so there's no reason to start that dance at all. (or just mark the whole thing as @trusted and hope for the best)
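rough sketch of what i mean (made-up names, just to show the shape of it):

import std.stdio;

// pointer version: the "result" arg is optional, one function does it all
void fooPtr (int a, bool* res=null) {
  if (res !is null) *res = (a > 0);
}

// ref version: you need an overload pair to get the same effect
void fooRef (int a, ref bool res) { res = (a > 0); }
void fooRef (int a) { bool dummy; fooRef(a, dummy); }

void main () {
  fooPtr(42);        // don't care about the result
  bool ok;
  fooPtr(42, &ok);   // care: the &ok is visible at the call site
  fooRef(42, ok);    // ref version: nothing at the call site hints at mutation
  writeln(ok);       // true
}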
May 29 2017
parent Adam D. Ruppe <destructionator gmail.com> writes:
On Monday, 29 May 2017 at 08:40:27 UTC, ketmar wrote:
 yet i must say that using pointers in a code where they should 
 be references makes me... nervous. it just doesn't feel right. 
 but meh, i'll trade that (and safety, 'cause `&` is unsafe) for 
 "call site ref indicator".
So one win with dip1000 is it allows it:

---
@safe void foo(scope int* a) {}
@safe void bar() {
    int a;
    foo(&a);
}
---

$ dmd ppp -c
ppp.d(4): Error: cannot take address of local a in @safe function bar

But with dip 1000:

$ dmd ppp -dip1000 -c

Of course, dip1000 breaks the living crap out of just about everything else (though all the errors are in phobos....) but it is semi-sane in this case.
May 29 2017
prev sibling parent reply Dukc <ajieskola gmail.com> writes:
On Sunday, 28 May 2017 at 17:54:30 UTC, WebFreak001 wrote:
 Should the language spec say that those functions should get 
 called with `foo(ref input);` so that surprises like this where 
 the user doesn't check the docs/implementation can't happen 

I think it's mostly about good taste in what you define functions to take as ref input. I have a feeling the present way is not a big problem in practice because it is somehow intuitive. Besides, member functions mutate their class/struct anyway, and we don't want to lose our ability to call extension functions with the same syntax as member ones.

Same thing as with the struct/class separation: in principle it sounds pointless that whether something is used by value or by ref has to be defined with the type, but it somehow just works intuitively in practice. (My personal experience. Don't know if others feel the same way.)

But what would be worth a consideration is that perhaps one should be allowed to pass rvalues by reference with something like this? According to TDPL, ref arguments do not take rvalues to prevent bugs where you accidentally copy something before passing it, and that's a good rationale. But shouldn't it still be allowed explicitly?
May 29 2017
next sibling parent reply Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Monday, 29 May 2017 at 07:39:40 UTC, Dukc wrote:

 But what would be worth a consideration, is that perhaps one 
 should be allowed to pass rvalues as reference with something 
 like this? According to TDPL, ref arguments do not take rvalues 
 to prevent bugs where you accidently copy something before 
 passing it, and that's a good rationale. But shouldn't it still 
 be allowed explicitly?
Explicitly? It is:

import std.stdio;

struct S { int v; }

void foo(ref S s) { writeln("took S by ref: ", s.v); }

void foo(S s)
{
    writeln("took rvalue S...");
    foo(s); // calls ref overload
}

void main()
{
    S s;
    foo(s);      // calls ref overload
    foo(S.init); // calls rvalue overload
}

And for templates we have auto ref.
May 29 2017
parent Dukc <ajieskola gmail.com> writes:
On Monday, 29 May 2017 at 07:46:07 UTC, Stanislav Blinov wrote:
 On Monday, 29 May 2017 at 07:39:40 UTC, Dukc wrote:

 [snip]
Explicitly? It is: import std.stdio; struct S { int v; } void foo(ref S s) { writeln("took S by ref: ", s.v); } void foo(S s) { writeln("took rvalue S..."); foo(s); // calls ref overload } void main() { S s; foo(s); // calls ref overload foo(S.init); // calls rvalue overload } And for templates we have auto ref.
Of course, I should have noticed one can already do it that way. No need for change, then (imo).
May 29 2017
prev sibling next sibling parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Monday, May 29, 2017 07:39:40 Dukc via Digitalmars-d wrote:
 But what would be worth a consideration, is that perhaps one
 should be allowed to pass rvalues as reference with something
 like this? According to TDPL, ref arguments do not take rvalues
 to prevent bugs where you accidently copy something before
 passing it, and that's a good rationale. But shouldn't it still
 be allowed explicitly?
I expect that we're going to see a DIP related to rvalue references at some point here, because some of the folks (particularly the game folks) think that it's critical to be able to have a function that doesn't care whether it's taking a value by value or by ref - just that it takes it efficiently, and they don't want to be forced to use auto ref to do it (since that requires templates). - Jonathan M Davis
May 29 2017
parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Monday, 29 May 2017 at 07:51:13 UTC, Jonathan M Davis wrote:
 I expect that we're going to see a DIP related to rvalue 
 references at some point here, because some of the folks 
 (particularly the game folks) think that it's critical to be 
 able to have a function that doesn't care whether it's taking a 
 value by value or by ref - just that it takes it efficiently, 
 and they don't want to be forced to use auto ref to do it 
 (since that requires templates).
So it would use ref or value depending on the target platform? I guess that could make sense since some platforms allow vector registers as parameters, but it sounds more like an implementation detail? Or did you mean something else?
May 29 2017
parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Monday, May 29, 2017 08:22:23 Ola Fosheim Grøstad via Digitalmars-d 
wrote:
 On Monday, 29 May 2017 at 07:51:13 UTC, Jonathan M Davis wrote:
 I expect that we're going to see a DIP related to rvalue
 references at some point here, because some of the folks
 (particularly the game folks) think that it's critical to be
 able to have a function that doesn't care whether it's taking a
 value by value or by ref - just that it takes it efficiently,
 and they don't want to be forced to use auto ref to do it
 (since that requires templates).
So it would use ref or value depending on the target platform? I guess that could make sense since some platforms allow vector registers as parameters, but it sounds more like an implementation detail? Or did you mean something else?
I probably didn't say it very well.

With C++, if you have const T&, it will accept both lvalues and rvalues. A number of folks (particularly those writing games) want an equivalent to that in D where they can then pass both lvalues and rvalues without incurring a copy. Historically, Andrei has been against it because of issues with rvalue references, but based on some of the more recent discussions, it sounds like it _might_ be possible to come up with a solution that he'd be okay with (particularly with some of the improvements that come with DIPs 25 and 1000). Ethan Watson has expressed interest in writing a DIP on the matter, so I expect that we'll see one at some point here.

- Jonathan M Davis
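(For reference, the current auto ref workaround looks roughly like this; 
Big and consume are placeholder names:)

import std.stdio;

struct Big { int[64] data; }

// The workaround today: auto ref, which forces the function to be a
// template and instantiates a by-value and a by-ref variant.
void consume()(auto ref const Big b) {
    writeln(b.data[0]);
}

void main() {
    Big lvalue;
    consume(lvalue); // binds by ref, no copy
    consume(Big());  // rvalue also accepted
}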
May 29 2017
next sibling parent reply Ola Fosheim =?UTF-8?B?R3LDuHN0YWQ=?= writes:
On Monday, 29 May 2017 at 08:41:05 UTC, Jonathan M Davis wrote:
 With C++, if you have const T&, it will accept both lvalues and 
 rvalues. A number of folks (particularly those writing games) 
 want an equivalent to that in D where they can then pass both 
 lvalues and rvalues without incurring a copy.
Mmm. But it sounds more like a compiler hint on const value types could work? Except when the making of a copy creates side-effects... so you would have to guard against that. The whole thing with temporaries in C++ in part led to && move semantics. So this is a slippery slope, perhaps.
 solution that he'd be okay with (particularly with some of the 
 improvements that come with DIPs 25 and 1000). Ethan Watson has 
 expressed interest in writing a DIP on the matter, so I expect 
 that we'll see one at some point here.
I think D is going the wrong way with all this special-casing. Memory management is hard enough as it is. At some point Rust's type system will look easier... Much better if you can get away with using a compiler hint and give the compiler a flag that turns that hint into an absolute for the game programmers. It could also be an annotation on a type that isn't part of the type. (This type should always be passed as const-ref.)
May 29 2017
parent Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Monday, May 29, 2017 09:17:31 Ola Fosheim Grøstad via Digitalmars-d 
wrote:
 On Monday, 29 May 2017 at 08:41:05 UTC, Jonathan M Davis wrote:
 With C++, if you have const T&, it will accept both lvalues and
 rvalues. A number of folks (particularly those writing games)
 want an equivalent to that in D where they can then pass both
 lvalues and rvalues without incurring a copy.
Mmm. But it sounds more like a compiler hint on const value types could work? Except when the making of a copy creates side-effects... so you would have to guard against that. The whole thing with temporaries in C++ in part lead to && move semantics. So this is a slippery slope, perhaps.
I don't know what's going to be proposed, so I can't really say much about it. I just know what the problem is that they want to solve. However, I can say that whatever they do, having const as a requirement would not fly, because D's const is so restrictive.

I think that in a previous discussion, it was suggested that we do something like have an attribute like rvalue that would go on a ref parameter, which would then make it so that the ref parameter would accept rvalues. So, we may get something like that, or it could be that something completely different will be proposed.

My main concern is that we not simply make it so that ref accepts rvalues, because then it makes it harder to know when ref is used because a value is going to be returned via the ref parameter and when it's used simply to avoid copying. But I don't know what it will take to alleviate Andrei's concerns about rvalue references, and I expect that that's ultimately the hurdle that any DIP would have to get over. However, DIPs 25 and 1000 should solve the related safety concerns, which were always what Walter was worried about.

- Jonathan M Davis
May 29 2017
prev sibling parent reply Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Monday, 29 May 2017 at 08:41:05 UTC, Jonathan M Davis wrote:

 I probably didn't say it very well.

 With C++, if you have const T&, it will accept both lvalues and 
 rvalues. A number of folks (particularly those writing games) 
 want an equivalent to that in D where they can then pass both 
 lvalues and rvalues without incurring a copy. Historically, 
 Andrei has been against it because of issues with rvalue 
 references, but based on some of the more recent discussions, 
 it sounds like it _might_ be possible to come up with a 
 solution that he'd be okay with (particularly with some of the 
 improvements that come with DIPs 25 and 1000). Ethan Watson has 
 expressed interest in writing a DIP on the matter, so I expect 
 that we'll see one at some point here.

 - Jonathan M Davis
This indeed does sound like a good extension for DIP1000, i.e.:

struct Vector3 { float x, y, z; }

void foo(in ref Vector3 v) { /*...*/ }

foo(Vector3(1,2,3));

Invalid now, even with -dip1000, but, since `in` is `const scope`, it feels like it should be made allowed: `scope` guarantees the reference won't escape, the argument obviously will live through the call to foo(), so it could be made to "just work", even for mutables, i.e. `scope ref`. Of course, without `scope` it should remain an error.

In regards to move semantics and && also mentioned here, there is only one case where current D looks inferior to C++: D's "double move":

struct X
{
    Y y;
    this(Y y) { move(y, this.y); }
}

auto x = X(Y());
// ctor above benefits from the compiler, so here's
// a more explicit case:
Y y;
X x2 = move(y);

With rvalue references and explicit move semantics of C++, this could be made into one bitblast. With current D, it requires two. However, this does seem like a compiler optimization issue, and is mentioned by Andrei:

https://github.com/dlang/phobos/pull/4971#issuecomment-268038627

So all in all, it looks to be possible to achieve the desired results purely by enhancing the language, with no syntax changes and without adding any "rvalue reference" types.
May 29 2017
parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Monday, May 29, 2017 17:19:27 Stanislav Blinov via Digitalmars-d wrote:
 `in` is `const scope`
What Walter recently changed is that `in` is now just `const`, because scope was not properly implemented previously, and folks were using `in` all over the place, so the odds of code breaking when scope was properly implemented were high.

- Jonathan M Davis
May 29 2017
parent reply Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Monday, 29 May 2017 at 19:14:54 UTC, Jonathan M Davis wrote:
 On Monday, May 29, 2017 17:19:27 Stanislav Blinov via 
 Digitalmars-d wrote:
 `in` is `const scope`
Walter recently changed is that in is now just const, because scope was not properly implemented previously, and folks were using in all over the place, so the odds of code breaking when scope was properly implemented were high. - Jonathan M Davis
Huh, I missed that... peculiar change. :\ So now it would break for people who did use `in` properly? In any case, to be clear, I certainly meant `const scope`.
May 29 2017
parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Monday, May 29, 2017 20:00:12 Stanislav Blinov via Digitalmars-d wrote:
 On Monday, 29 May 2017 at 19:14:54 UTC, Jonathan M Davis wrote:
 On Monday, May 29, 2017 17:19:27 Stanislav Blinov via

 Digitalmars-d wrote:
 `in` is `const scope`
Walter recently changed is that in is now just const, because scope was not properly implemented previously, and folks were using in all over the place, so the odds of code breaking when scope was properly implemented were high. - Jonathan M Davis
Huh, I missed that... peculiar change. :\ So now it would break for people who did use `in` properly?
Since scope was never properly defined for anything other than delegates, it's questionable that _anyone_ who used it used it properly. But regardless, nothing will break because of scope-related stuff unless you use the -dip1000 switch, which would then require scope in a number of places that it wasn't required before (including on local variables in a number of cases IIRC), making it so that even if in had stayed const scope, pretty much regardless of how you used it, code would break (which is going to make life fun when we finally transition dip 1000 from being optional to being the normal behavior).

- Jonathan M Davis
May 29 2017
parent reply Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Monday, 29 May 2017 at 20:31:18 UTC, Jonathan M Davis wrote:
 On Monday, May 29, 2017 20:00:12 Stanislav Blinov via 
 Digitalmars-d wrote:
 On Monday, 29 May 2017 at 19:14:54 UTC, Jonathan M Davis wrote:
 On Monday, May 29, 2017 17:19:27 Stanislav Blinov via

 Digitalmars-d wrote:
 `in` is `const scope`
Walter recently changed is that in is now just const, because scope was not properly implemented previously, and folks were using in all over the place, so the odds of code breaking when scope was properly implemented were high. - Jonathan M Davis
Huh, I missed that... peculiar change. :\ So now it would break for people who did use `in` properly?
Since scope was never properly defined for anything other than delegates, it's questionable that _anyone_ who used it used it properly.
Errm... It was always *defined* (https://dlang.org/spec/function.html#parameters). The fact that it wasn't implemented for anything but delegates is another issue entirely. Changing the spec to please those who didn't care to follow it in the first place just seems weird to me.
 But regardless, nothing will break because of scope-related 
 stuff unless you use the -dip1000 switch, which would then 
 require scope in a number of places that it wasn't before 
 (including on local variables in a number of cases IIRC), 
 making it so that even if in had stayed const scope, pretty 
 much regardless of how you used it, code would break (which is 
 going to make life fun when when we finally transition from dip 
 1000 to being optional to being the normal behavior).
With that, I see no problem. Yes, enabling dip1000 behavior would cause some breakage; that's the point of having it as a switch in the first place.
May 29 2017
parent reply Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Monday, May 29, 2017 21:13:53 Stanislav Blinov via Digitalmars-d wrote:
 On Monday, 29 May 2017 at 20:31:18 UTC, Jonathan M Davis wrote:
 On Monday, May 29, 2017 20:00:12 Stanislav Blinov via

 Digitalmars-d wrote:
 On Monday, 29 May 2017 at 19:14:54 UTC, Jonathan M Davis wrote:
 On Monday, May 29, 2017 17:19:27 Stanislav Blinov via

 Digitalmars-d wrote:
 `in` is `const scope`
Walter recently changed is that in is now just const, because scope was not properly implemented previously, and folks were using in all over the place, so the odds of code breaking when scope was properly implemented were high. - Jonathan M Davis
Huh, I missed that... peculiar change. :\ So now it would break for people who did use `in` properly?
Since scope was never properly defined for anything other than delegates, it's questionable that _anyone_ who used it used it properly.
Errm... It was always *defined* (https://dlang.org/spec/function.html#parameters). The fact that it wasn't implemented for anything but delegates is another issue entirely. Changing spec to please those who didn't care to follow it in the first place just seems weird to me.
That definition currently there is more precise than the definition on that page has been historically, but even as it stands, it's not particularly precise. Assuming a particular interpretation when the spec is not precise, and the compiler does not implement anything of the sort, is just begging for trouble when the compiler actually does implement something - which is precisely why Walter decided that it was too dangerous to let in imply scope when he actually started implementing scope to do something.

It's quite possible that what some folks assumed scope would do won't run afoul of Walter's changes, but in many cases, it will, and since what he's doing with scope involves stuff like making it so that you have to mark local variables with scope (and not in the way that we've had scoped classes, which are supposed to have been deprecated), what he's doing definitely doesn't match what scope was originally, even if it follows the basic idea. But scope has never been well-specified.

- Jonathan M Davis
May 29 2017
parent reply Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Tuesday, 30 May 2017 at 02:12:56 UTC, Jonathan M Davis wrote:

 That definition currently there is more precise than the 
 definition on that page has been historically...
Apparently, it is not. Do you have a reference to Walter's change regarding `in` becoming just `const`? Because a change like that should get reflected in the spec, otherwise we might just continue to ignore said spec and expect our grievances to be "gracefully" resolved later. What I mean is I'd rather see/make the change reflected there...
May 29 2017
next sibling parent reply Petar Kirov [ZombineDev] <petar.p.kirov gmail.com> writes:
On Tuesday, 30 May 2017 at 06:13:39 UTC, Stanislav Blinov wrote:
 On Tuesday, 30 May 2017 at 02:12:56 UTC, Jonathan M Davis wrote:

 That definition currently there is more precise than the 
 definition on that page has been historically...
Apparently, it is not. Do you have a reference to Walter's change regarding `in` becoming just `const`? Because a change like that should get reflected in the spec, otherwise we might just continue to ignore said spec and expect our grievances to be "gracefully" resolved later. What I mean is I'd rather see/make the change reflected there...
Unfortunately, `in` was never implemented as `scope const`. I think it was only when Walter started working actively on scope that he found out that it's too late to change this - https://github.com/dlang/dmd/pull/5898. Here are some more references:

https://github.com/dlang/druntime/pull/1740
https://github.com/dlang/druntime/pull/1749

Going forward, I think it would be best for the language if `in` would work as Q. Schroll described here: http://forum.dlang.org/post/medovwjuykzpstnwbfyy forum.dlang.org. This can also nicely fix the problems with rvalues (with auto ref you may end up with up to 2^N template instantiations, where N is the number of parameters and 2 is because you get one by-value and one by-ref instance; it doesn't play nice with delegates etc.). See also https://github.com/dlang/dmd/pull/4717.
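(To make the 2^N point concrete, a small sketch:)

void f(T, U)(auto ref T a, auto ref U b) {
    // every lvalue/rvalue combination per parameter is a distinct instantiation
    pragma(msg, __traits(isRef, a), " ", __traits(isRef, b));
}

void main() {
    int x;
    f(x, x); // (ref, ref)
    f(x, 1); // (ref, value)
    f(1, x); // (value, ref)
    f(1, 1); // (value, value) -> 2^2 = 4 instantiations of f
}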
May 30 2017
next sibling parent reply Petar Kirov [ZombineDev] <petar.p.kirov gmail.com> writes:
On Tuesday, 30 May 2017 at 09:48:09 UTC, Petar Kirov [ZombineDev] 
wrote:
 On Tuesday, 30 May 2017 at 06:13:39 UTC, Stanislav Blinov wrote:
 On Tuesday, 30 May 2017 at 02:12:56 UTC, Jonathan M Davis 
 wrote:

 That definition currently there is more precise than the 
 definition on that page has been historically...
Apparently, it is not. Do you have a reference to Walter's change regarding `in` becoming just `const`? Because a change like that should get reflected in the spec, otherwise we might just continue to ignore said spec and expect our grievances to be "gracefully" resolved later. What I mean is I'd rather see/make the change reflected there...
Unfortunately, `in` was never implemented as `scope const`. I think it was only when Walter started working actively on scope that he found out that it's too late to change this - https://github.com/dlang/dmd/pull/5898. Here are some more references: https://github.com/dlang/druntime/pull/1740 https://github.com/dlang/druntime/pull/1749 Going forward, I think it would be best for the language if `in` would work as Q. Schroll described here: http://forum.dlang.org/post/medovwjuykzpstnwbfyy forum.dlang.org. This can also nicely fix the the problems with rvalues (with auto ref you may end with up to 2^N template instantiations where N is the number of parameters and 2 is because you get one by value and one by ref instance; doesn't play nice with delegates etc). See also https://github.com/dlang/dmd/pull/4717.
Another interesting link: http://dgame.github.io/dneeds/
May 30 2017
parent Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Tuesday, 30 May 2017 at 09:55:14 UTC, Petar Kirov [ZombineDev] 
wrote:

 Unfortunately, `in` was never implemented as `scope const`. I 
 think it was only when Walter started working actively on 
 scope that he found out that it's too late to change this -
 https://github.com/dlang/dmd/pull/5898. Here are some more 
 references:
 https://github.com/dlang/druntime/pull/1740
 https://github.com/dlang/druntime/pull/1749

 Going forward, I think it would be best for the language if 
 `in` would work as Q. Schroll described here: 
 http://forum.dlang.org/post/medovwjuykzpstnwbfyy forum.dlang.org. This can
also nicely fix the the problems with rvalues (with auto ref you may end with
up to 2^N template instantiations where N is the number of parameters and 2 is
because you get one by value and one by ref instance; doesn't play nice with
delegates etc). See also https://github.com/dlang/dmd/pull/4717.
Another interesting link: http://dgame.github.io/dneeds/
Thanks for the links. I have to wonder though, does this go further than dormant discussions? Because apparently, when things like that are actively put on the backburner, we have good chances of ending up with "too much code would break", and in the end, nothing would be changed, at least for the better. D seems to be gaining momentum, and having things half-working-not-necessarily-as-intended looks less and less like an acceptable "for the time being" state of affairs.
May 30 2017
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 05/30/2017 05:48 AM, Petar Kirov [ZombineDev] wrote:
 On Tuesday, 30 May 2017 at 06:13:39 UTC, Stanislav Blinov wrote:
 On Tuesday, 30 May 2017 at 02:12:56 UTC, Jonathan M Davis wrote:

 That definition currently there is more precise than the definition 
 on that page has been historically...
Apparently, it is not. Do you have a reference to Walter's change regarding `in` becoming just `const`? Because a change like that should get reflected in the spec, otherwise we might just continue to ignore said spec and expect our grievances to be "gracefully" resolved later. What I mean is I'd rather see/make the change reflected there...
Unfortunately, `in` was never implemented as `scope const`.
Why would it need to be `const`? Thanks! -- Andrei
May 30 2017
parent reply Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Tuesday, 30 May 2017 at 15:59:04 UTC, Andrei Alexandrescu 
wrote:
 On 05/30/2017 05:48 AM, Petar Kirov [ZombineDev] wrote:
 On Tuesday, 30 May 2017 at 06:13:39 UTC, Stanislav Blinov 
 wrote:
 On Tuesday, 30 May 2017 at 02:12:56 UTC, Jonathan M Davis 
 wrote:

 That definition currently there is more precise than the 
 definition on that page has been historically...
Apparently, it is not. Do you have a reference to Walter's change regarding `in` becoming just `const`? ...
Unfortunately, `in` was never implemented as `scope const`.
Why would it need to be `const`? Thanks! -- Andrei
What do you mean?
May 30 2017
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 05/30/2017 12:11 PM, Stanislav Blinov wrote:
 On Tuesday, 30 May 2017 at 15:59:04 UTC, Andrei Alexandrescu wrote:
 On 05/30/2017 05:48 AM, Petar Kirov [ZombineDev] wrote:
 On Tuesday, 30 May 2017 at 06:13:39 UTC, Stanislav Blinov wrote:
 On Tuesday, 30 May 2017 at 02:12:56 UTC, Jonathan M Davis wrote:

 That definition currently there is more precise than the definition 
 on that page has been historically...
Apparently, it is not. Do you have a reference to Walter's change regarding `in` becoming just `const`? ...
Unfortunately, `in` was never implemented as `scope const`.
Why would it need to be `const`? Thanks! -- Andrei
What do you mean?
I think `scope` would be enough. People should still be able to modify the value. -- Andrei
May 30 2017
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 05/30/2017 12:19 PM, Andrei Alexandrescu wrote:
 On 05/30/2017 12:11 PM, Stanislav Blinov wrote:
 On Tuesday, 30 May 2017 at 15:59:04 UTC, Andrei Alexandrescu wrote:
 On 05/30/2017 05:48 AM, Petar Kirov [ZombineDev] wrote:
 On Tuesday, 30 May 2017 at 06:13:39 UTC, Stanislav Blinov wrote:
 On Tuesday, 30 May 2017 at 02:12:56 UTC, Jonathan M Davis wrote:

 That definition currently there is more precise than the 
 definition on that page has been historically...
Apparently, it is not. Do you have a reference to Walter's change regarding `in` becoming just `const`? ...
Unfortunately, `in` was never implemented as `scope const`.
Why would it need to be `const`? Thanks! -- Andrei
What do you mean?
I think `scope` would be enough. People should still be able to modify the value. -- Andrei
Oh, I'm in a different movie - thought it was the associative array "in". Sorry! -- Andrei
May 30 2017
prev sibling parent Stanislav Blinov <stanislav.blinov gmail.com> writes:
On Tuesday, 30 May 2017 at 16:19:26 UTC, Andrei Alexandrescu 
wrote:

 Apparently, it is not. Do you have a reference to Walter's 
 change regarding `in` becoming just `const`? ...
Unfortunately, `in` was never implemented as `scope const`.
Why would it need to be `const`? Thanks! -- Andrei
What do you mean?
I think `scope` would be enough. People should still be able to modify the value. -- Andrei
I'm puzzled. I was talking about https://dlang.org/spec/function.html#parameters : `in` - `const scope`. Jonathan mentioned that Walter effectively reverted it to `const`. Petar provided links to that effect. Now you're saying it should be simply `scope`? %-O
May 30 2017
prev sibling parent Jonathan M Davis via Digitalmars-d <digitalmars-d puremagic.com> writes:
On Tuesday, May 30, 2017 06:13:39 Stanislav Blinov via Digitalmars-d wrote:
 On Tuesday, 30 May 2017 at 02:12:56 UTC, Jonathan M Davis wrote:
 That definition currently there is more precise than the
 definition on that page has been historically...
Apparently, it is not. Do you have a reference to Walter's change regarding `in` becoming just `const`? Because a change like that should get reflected in the spec, otherwise we might just continue to ignore said spec and expect our grievances to be "gracefully" resolved later. What I mean is I'd rather see/make the change reflected there...
There was a thread discussing it here: http://forum.dlang.org/post/zskxjpctdipbqalpwxbj forum.dlang.org with Walter's main response being here: http://forum.dlang.org/post/o6m17i$1jh7$1 digitalmars.com - Jonathan M Davis
May 30 2017
prev sibling parent reply WebFreak001 <d.forum webfreak.org> writes:
On Monday, 29 May 2017 at 07:39:40 UTC, Dukc wrote:
 I think it's mostly about good taste on what you define 
 functions to take as ref input. I have a feeling the present 
 way is not a big problem in practice because it is intuitive 
 somehow. Besides, member functions mutate their class/struct 
 anyway, and we don't want to lose our ability to call extension 
 funcions with same syntax as member ones.
well, for the extension functions I wrote that if the ref parameter is the first argument and it's called with UFCS syntax, it could probably implicitly add the ref. I don't think there are any big issues with that; it does look like a member function and the programmer could easily read it as "this modifies it".
May 29 2017
parent Dukc <ajieskola gmail.com> writes:
On Monday, 29 May 2017 at 15:31:26 UTC, WebFreak001 wrote:
 well for the extension functions I wrote that if the ref 
 parameter is the first argument and it's called with ufcs 
 syntax, it could implicitly add the ref probably. I don't think 
 there are any big issues with that, it does look like a member 
 function and the programmer could easily read it as "this 
 modifies it"
Surely better than just requiring ref everywhere. But still, a.swap(b) just feels intuitive. Hard to say what I like here, because the ref parameter is usually the first anyway. Besides swap() you rarely use more than one ref parameter.
May 29 2017