
digitalmars.D - unsafe

reply nick <nick.atamas gmail.com> writes:
Note: I did a search for this and didn't come up with any threads. If it
has been discussed before... my apologies.


Recently I introduced D to a friend of mine (a C.S. grad student at
Purdue). His reaction was the usual "wow, awesome". Then he became
concerned about pointer safety. D allows unrestricted access to pointers.

I will skip the "pointers are unsafe" rigmarole.


Feb 12 2006
next sibling parent reply S. Chancellor <dnewsgr mephit.kicks-ass.org> writes:
On 2006-02-12 14:11:05 -0800, nick <nick.atamas gmail.com> said:

 Note: I did a search for this and didn't come up with any threads. If it
 has been discussed before... my apologies.
 
 
 Recently I introduced D to a friend of mine (a C.S. grad student at
 Purdue). His reaction was the usual "wow, awesome". Then he became
 concerned about pointer safety. D allows unrestricted access to pointers.
 
 I will skip the "pointers are unsafe" rigmarole.
 

Don't play with the pointers if you want them to point to the right place. It's not like people are injecting malicious sourcecode into your projects. -S.
Feb 12 2006
parent reply nick <nick.atamas gmail.com> writes:
S. Chancellor wrote:
 On 2006-02-12 14:11:05 -0800, nick <nick.atamas gmail.com> said:
 
 Note: I did a search for this and didn't come up with any threads. If it
 has been discussed before... my apologies.


 Recently I introduced D to a friend of mine (a C.S. grad student at
 Purdue). His reaction was the usual "wow, awesome". Then he became
 concerned about pointer safety. D allows unrestricted access to pointers.

 I will skip the "pointers are unsafe" rigmarole.


Don't play with the pointers if you want them to point to the right place. It's not like people are injecting malicious sourcecode into your projects. -S.
I guess skipping the rigmarole was a bad idea.

I work with many EE people. EE people don't know how to write code. They are often very intelligent, but simply lack the experience and proper coding practices.

Bjarne said that most programmers are amateurs. There is no helping that. A good language will provide safeguards against errors for amateurs. This would be one of them.

When I get a piece of code from a colleague, I want to be sure that he didn't use pointers to mess something up. Having an unsafe keyword prevents his code from being compiled unless I provide the --allow-unsafe flag. That's a big assurance for me that his code doesn't mess up my heap.
Feb 12 2006
parent reply "Jarrett Billingsley" <kb3ctd2 yahoo.com> writes:
"nick" <nick.atamas gmail.com> wrote in message 
news:dsoifi$2von$1 digitaldaemon.com...
 I guess skipping the rigmarole was a bad idea.
 I work with many EE people. EE people don't know how to write code. They
 are often very intelligent, but simply lack the experience and proper
 coding practices.

 Bjarne said that most programmers are amateurs. There is no helping
 that. A good language will provide safeguards against errors for
 amateurs. This would be one of them.
The thing is, pointer use in D is pretty much restricted to interfacing with C libs. Using D normally, you should almost never have to touch pointers; the one real exception to this is the result of 'in' expressions.
 When I get a piece of code from a colleague, I want to be sure that he
 didn't use pointers to mess something up. Having an unsafe keyword
 prevents his code from being compiled unless I provide the
 --allow-unsafe flag. That's a big assurance for me that his code doesn't
 mess up my heap.
No code can mess up another process's heap, unless you're running with protected mode off. Which can't really happen with any OSes made in the last 15 years or so.
Feb 12 2006
next sibling parent reply nick <nick.atamas gmail.com> writes:
Jarrett Billingsley wrote:
 "nick" <nick.atamas gmail.com> wrote in message 
 news:dsoifi$2von$1 digitaldaemon.com...
 The thing is, pointer use in D is pretty much restricted to interfacing with 
 C libs.  Using D normally, you should almost never have to touch pointers; 
 the one real exception to this is the result of 'in' expressions.
 
 No code can mess up the heap, unless you're running with protected mode off. 
 Which can't really happen with any OSes made in the last 15 years or so. 
I realize that pointers aren't meant to be used for things other than interfacing with C. However, it seems to me that D currently doesn't enforce this. A C programmer might be tempted to muck about with pointers. For example:

-------------------------------------
CODE:

import std.stdio;

class A
{
    private int data[];

    public this()
    {
        data.length = 10;
    }

    public void printSelf()
    {
        writefln("Data: ", this.data);
    }
}

void surprise(inout A a)
{
    byte *ap = cast(byte *)(a);
    ap[9] = 5;
}

int main()
{
    A a = new A();
    a.printSelf();
    surprise(a);
    a.printSelf();
    return 0;
}

-------------------------------------
OUTPUT:

Data: [0,0,0,0,0,0,0,0,0,0]
Data: [0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,4287008,0,2004,2768,1245184,1,0,0,0
<..SNIP..> ,0,0,0,0,0,0,0,0,0,0,0]

When I compile someone else's code I want an absolute guarantee that they aren't doing anything unsafe with their pointers. If they are doing something unsafe, I'd like to know. Better yet, I'd like to know if someone's library is unsafe when I link against it.
Feb 12 2006
parent reply Sean Kelly <sean f4.ca> writes:
nick wrote:
 Jarrett Billingsley wrote:
 "nick" <nick.atamas gmail.com> wrote in message 
 news:dsoifi$2von$1 digitaldaemon.com...
 The thing is, pointer use in D is pretty much restricted to interfacing with 
 C libs.  Using D normally, you should almost never have to touch pointers; 
 the one real exception to this is the result of 'in' expressions.

 No code can mess up the heap, unless you're running with protected mode off. 
 Which can't really happen with any OSes made in the last 15 years or so. 
I realize that pointers aren't meant to be used for things other than interfacing with C.
What about someone who simply wants to maintain multiple references to a non-class type? C interfacing might be the most common use, but it certainly isn't the only use. Sean
Feb 12 2006
parent reply nick <nick.atamas gmail.com> writes:
 What about someone who simply wants to maintain multiple references to a
 non-class type?  C interfacing might be the most common use, but it
 certainly isn't the only use.
 
 
 Sean
References to non-class types are fine, but they should be maintained using typesafe pointers, which D provides.

"Pointers are provided for interfacing with C and for specialized systems work....Most conventional uses for pointers can be replaced with dynamic arrays, out and inout parameters, and reference types."
http://www.digitalmars.com/d/arrays.html

Your asking this is just further proof that there are people who will try to use C-style pointers anyway. I am in no way trying to attack you. I am just pointing out that C and C++ breeds bad programming practice, and we need protection from them.
Feb 12 2006
parent reply "Matthew" <nowhere noaddress.co.us> writes:
 I am in no way trying to attack you. I am just pointing out that C and
 C++ breeds bad programming practice, and we need protection from them.
Are you serious?

Similarly, I am not trying to attack you, but this is terribly jejune. C and C++ and pointers are no more or less hazardous than is a hammer or a syringe or a stick of dynamite. If you're bashing in a nail or injecting a patient or excavating a tunnel these are the best tools to have. They do not "breed" bad practice. That comes from process, experience and attitude.

I'll repeat something I observed here a long time ago: the very worst coding I've come across, by a country mile, was a group of Java "consultants" (from one of the Big 5, or is it now Big 4, accountancy firms). I've seen bad C, bad COM, bad .NET, and, of course, plenty of bad C++, but nothing compares to the masses of swill that these jokers spewed forth. And nary a pointer in sight. Myself and several other (non-Java) consultants were brought in to fix the mess, and it took many days and hundreds of applications of source cleaning tools (which, thankfully, were written in a fast language) to start to clear up the junk.

Further, I'd suggest that an understanding of what goes on beneath the covers is actually a hazardous skill to be lacking. (You might find http://joelonsoftware.com/articles/ThePerilsofJavaSchools.html illustrative of my point.)

Walter correctly sought to incorporate pointers in D as first class parts of the language. In the same way that I avoid IOStreams in C++, one tends to avoid pointers in D.

Bottom line: if you're a good engineer, you're a good engineer. If you're not, you're not. The language used won't affect this truth. And avoiding peeking inside abstractions won't help you become one.
Feb 12 2006
next sibling parent "Matthew" <nowhere noaddress.co.us> writes:
-- scratch this sentence. Unfinished though ... --

 In the same way that I avoid IOStreams in C++, one tends to avoid pointers 
 in D.
Feb 12 2006
prev sibling next sibling parent Sean Kelly <sean f4.ca> writes:
Matthew wrote:
 
 Further, I'd suggest that an understanding of what goes on beneath the 
 covers is actually a hazardous skill to be lacking.
I very much agree. In fact, this is my primary issue with the use of Java as a teaching language. It's nice when the teacher wants to gloss over details to focus on algorithm design or some such, but ignorance of the software-hardware interaction doesn't seem to make for good programmers.

Sean
Feb 12 2006
prev sibling next sibling parent reply nick <nick.atamas gmail.com> writes:
Matthew wrote:
 I am in no way trying to attack you. I am just pointing out that C and
 C++ breeds bad programming practice, and we need protection from them.
Are you serious?
No, I think I should rephrase myself. My new statement: people who come from a background that's primarily in C and C++ (especially if all they did was take one C or C++ course in college) are likely to use a C-style pointer instead of the appropriate D-style tools. We need a safeguard.
 Similarly, I am not trying to attack you, but this is terribly jejune. C and 
 C++ and pointers are no more or less hazardous than is a hammer or a syringe 
 or a stick of dynamite. If you're bashing in a nail or injecting a patient 
 or excavating a tunnel these are the best tools to have. They do not "breed" 
 bad practice. That comes from process, experience and attitude.
They cause a problem in D. Here is an example:

void surprise( in A a )
{
    byte *ap = cast(byte *)(a);
    ap[9] = 5;
}

If a function like that is part of a library, and I link against it, I may be in for quite a surprise. (There is code in the previous post that demonstrates the badness.)
 Walter correctly sought to incorporate pointers in D as first class parts of 
 the language. In the same way that I avoid IOStreams in C++, one tends to 
 avoid pointers in D.
 Bottom line: if you're a good engineer, you're a good engineer. If you're 
 not, you're not. The language used won't affect this truth. And avoiding 
 peeking inside abstractions won't help you become one.
But if you are a good engineer working with people who are good electrical engineers and bad programmers, you are in for a void surprise(). I think you are looking at this through the eyes of someone who is an expert programmer, which you are. The fact is many people who program are not experts and you can't change that. It would be nice to have a safeguard against mistakes that they are likely to make. If you can point out why there shouldn't be such a safeguard, please do; that is why I posted this thread.

THE OTHER POINT:
---------------
 Further, I'd suggest that an understanding of what goes on beneath the 
 covers is actually a hazardous skill to be lacking. (You might find 
 http://joelonsoftware.com/articles/ThePerilsofJavaSchools.html illustrative 
 of my point.)
I have no beef with that. I must say I am lucky to have made it into and out of my school before it became a Java prepschool. I took the weedout courses even though I had placement APs to get me out of some. You could say "I run gmake and gcc and I ain't never call malloc without calling free".
 C and 
 C++ and pointers are no more or less hazardous than is a hammer or a syringe 
 or a stick of dynamite.
From the Joel on software article: "Now, I freely admit that programming with pointers is not needed in 90% of the code written today, and in fact, it's downright dangerous in production code."
 I'll repeat something I observed here a long time ago: the very worst coding 
 I've come across, by a country mile, was a group of Java "consultants" (from 
<SNIP>
 cleaning tools (which, thankfully, were written in a fast language) to start 
 to clear up the junk.
We are mixing two arguments here. If you want to discuss the negative effects of Java prepschools, let's start a new thread for that. I'll join you in bashing them.
Feb 12 2006
next sibling parent reply "Matthew" <nowhere noaddress.co.us> writes:
"nick" <nick.atamas gmail.com> wrote in message 
news:dsp4lg$pul$1 digitaldaemon.com...
 Matthew wrote:
 I am in no way trying to attack you. I am just pointing out that C and
 C++ breeds bad programming practice, and we need protection from them.
Are you serious?
No, I think I should rephrase myself. My new statement: people who come from a background that's primarily in C and C++ (especially if all they did was take one C or C++ course in college) are likely to use a C-style pointer instead of the appropriate D-style tools. We need a safeguard.
That's fair. But I think they're likely to not want to waste their time here, and so will have an open mind to new idioms. That's likely to be of more use and less bother in the long run, I think.
 Similarly, I am not trying to attack you, but this is terribly jejune. C 
 and
 C++ and pointers are no more or less hazardous than is a hammer or a 
 syringe
 or a stick of dynamite. If you're bashing in a nail or injecting a 
 patient
 or excavating a tunnel these are the best tools to have. They do not 
 "breed"
 bad practice. That comes from process, experience and attitude.
They cause a problem in D. Here is an example:

void surprise( in A a )
{
    byte *ap = cast(byte *)(a);
    ap[9] = 5;
}

If a function like that is part of a library, and I link against it, I may be in for quite a surprise. (There is code in the previous post that demonstrates the badness.)
Sorry, I've fully read your post now. Always helps.

The answer's simple: const. One of the continuing crushing advantages that C++ has over D. But you'll never see it in D. Walter doesn't use it in C/C++, and therefore has only a compiler writer's perspective on it, which is that it is a time-consuming pest to implement correctly with little, if any, use whatsoever because it can be subverted.

Surprisingly, I'm from the other viewpoint. const is immensely expressive and powerful, and I wouldn't leave home without it. It is one of the ways I can tame the C++ compiler to do my will.
 Walter correctly sought to incorporate pointers in D as first class parts 
 of
 the language. In the same way that I avoid IOStreams in C++, one tends to
 avoid pointers in D.
 Bottom line: if you're a good engineer, you're a good engineer. If you're
 not, you're not. The language used won't affect this truth. And avoiding
 peeking inside abstractions won't help you become one.
But if you are a good engineer working with people who are good electrical engineers and bad programmers you are in for a void surprise(). I think you are looking at this through the eyes of someone who is an expert programmer, which you are.
More flattery. You've obviously cotton'd on to my crushing insecurity, and its attendant vanity. ;-) To be fair, I would not claim expert status in anything other than C++ in general, STL extension in particular, and writing very fast recursive directory search libraries. Ha ha ha ha . . . . . . . wibble.
 The fact is many people who program
 are not experts and you can't change that. It would be nice to have a
 safeguard against mistakes that they are likely to make. If you can
 point out why there shouldn't be such a safeguard, please do; that is
 why I posted this thread.
And now I fully follow your point, I agree that const would be perfect to your requirements.
 I'll repeat something I observed here a long time ago: the very worst 
 coding
 I've come across, by a country mile, was a group of Java "consultants" 
 (from
<SNIP>
 cleaning tools (which, thankfully, were written in a fast language) to 
 start
 to clear up the junk.
We are mixing two arguments here. If you want to discuss the negative effects of Java prepschools, let's start a new thread for that. I'll join you in bashing them.
Nah, it transpires that I only like talking with people with whom I violently disagree. ;-)
Feb 12 2006
parent reply nick <nick.atamas gmail.com> writes:
Matthew wrote:
 "nick" <nick.atamas gmail.com> wrote in message 
 That's fair. But I think they're likely to not want to waste their time 
 here, and so will have an open mind to new idioms. That's likely to be of 
 more use and less bother in the long run, I think.
You would hope that they would have an open mind and use the D features. But look at this quote:
 What about someone who simply wants to maintain multiple references to a
non-class type?  C interfacing might be the most common use, but it certainly
isn't the only use.
Maybe I'm misreading that, but isn't he basically saying "let's use C-style pointers"? That's coming from someone who has been on the D newsgroup for some time. It's a testament to how much people get used to doing things a certain way. That particular person comes from a strong C/C++ background, if I am not mistaken.

My experience outside the D newsgroup shows this mentality to be true in general; it is human nature, I guess.

Then there is the specific matter of the /in/ keyword being easily broken by a C-style pointer. There may be other high-level features that are compromised by low-level features. That has to be looked into.
 And now I fully follow your point, I agree that const would be perfect to 
 your requirements.
Agreed, const would fix this particular problem. Even better, the semantics of /in/ can be modified so that it works more like const. However, I don't know if actual const parameters in D are what we need, because in/out/inout are a more explicit way of accomplishing the same idea. I realize that it's just a trade-off between the flexibility of const and the explicit nature of in/out.

My ramblings about C++'s const follow:

I've got beef with C++'s implementation of const. Allow me to explain. In C++ const serves multiple purposes that are related but not quite the same. Specifically, it can serve to signify that:

1. a function parameter is const, e.g. 'void foo(const A &a)'
2. a method doesn't modify its object, e.g. 'void foo() const'
3. a variable will never change, e.g. 'const pi = 3.1415...;' or 'const A a;'

It seems that even in low-level code, in/out/inout can replace const in most cases. On the other hand, D has a simple syntax, so a refactoring tool may be easy to implement. But then, maybe everything should be const by default and we should explicitly make things mutable.
Feb 13 2006
parent reply "Matthew" <nowhere noaddress.co.us> writes:
"nick" <nick.atamas gmail.com> wrote in message 
news:dspgi4$1d03$1 digitaldaemon.com...
 Matthew wrote:
 "nick" <nick.atamas gmail.com> wrote in message
 That's fair. But I think they're likely to not want to waste their time
 here, and so will have an open mind to new idioms. That's likely to be of
 more use and less bother in the long run, I think.
You would hope that they would have an open mind and use the D features. But look at this quote:
 What about someone who simply wants to maintain multiple references to a 
 non-class type?  C interfacing might be the most common use, but it 
 certainly isn't the only use.
Maybe I'm misreading that, but isn't he basically saying "let's use c-style pointers". That's coming from someone who has been on the D newsgroup for some time. It's a testament to how much people get used to doing things a certain way. That particular person comes from a strong C/C++ background if I am not mistaken.
Well, Sean's very experienced in both C++ and D, and I tend to pay attention when he's opining. (Sean's going to be one of the reviewers on my new book. <g>) However, that doesn't proof him from the same frailties and preconceptions as the rest of us.
 My experience outside the D newsgroup shows this mentality to be true in
 general; it is human nature I guess.
If your point is that pointers should not be part of any of the mainline idioms of D, much less the language/std-libs, then I agree. I don't know if it's still the case, but the associative arrays used to use pointers as part of their semantics. Fingers crossed that's no longer the case.
 Then there is the specific matter of the /in/ keyword being easily
 broken by a c-style pointer. There may be other high-level features that
 are compromised by low-level features. That has to be looked into.
No argument with the call for taking another look.
 And now I fully follow your point, I agree that const would be perfect to
 your requirements.
Agreed, const would fix this particular problem. Even better, the semantics of /in/ can be modified so that it works more like const. However, I don't know if actual const parameters in D is what we need, because in/out/inout are a more explicit way of accomplishing the same idea. I realize that it's just a trade-off between the flexibility of const and the explicit nature of in/out
It wouldn't have to be exactly the same as C++ const, but it'd have to confer all the advantages.
 My rambling about C++'s const follow:
 I've got beef with C++'s implementation of const. Allow me to explain.
 In C++ const serves multiple purposes that are related but not quite the
 same. Specifically, it can serve to signify that:
 1. a function parameter is const e.g. 'void foo(const A &a)'
 2. a method doesn't modify its object e.g. 'void foo() const'
 3. a variable will never change e.g. const pi = 3.1415...; or const A a;


 for primitives?)


 more elegant. It seems that even in low-level code, in/out/inout can
 replace const in most cases.


 hand, D has a simple syntax, so a refactoring tool may be easy to
 implement. But then, maybe everything should be const by default and we
 should explicitly make things mutable.
I agree that mixing its name with the keyword for constants is a bad one - readonly would be much better - but C++ has a long and sorry history of reusing keywords inappropriately. What does typename mean? What does class mean? How are pre-/post- inc/decrement operators discriminated? How does one designate a pure virtual function? And so on. All hideous, to be sure.
Feb 13 2006
parent Sean Kelly <sean f4.ca> writes:
Matthew wrote:
 "nick" <nick.atamas gmail.com> wrote in message 
 news:dspgi4$1d03$1 digitaldaemon.com...
 Matthew wrote:
 "nick" <nick.atamas gmail.com> wrote in message
 That's fair. But I think they're likely to not want to waste their time
 here, and so will have an open mind to new idioms. That's likely to be of
 more use and less bother in the long run, I think.
You would hope that they would have an open mind and use the D features. But look at this quote:
 What about someone who simply wants to maintain multiple references to a 
 non-class type?  C interfacing might be the most common use, but it 
 certainly isn't the only use.
Maybe I'm misreading that, but isn't he basically saying "let's use c-style pointers". That's coming from someone who has been on the D newsgroup for some time. It's a testament to how much people get used to doing things a certain way. That particular person comes from a strong C/C++ background if I am not mistaken.
Well, Sean's very experienced in both C++ and D, and I tend to pay attention when he's opining. (Sean's going to be one of the reviewers on my new book. <g>) However, that doesn't proof him from the same frailties and preconceptions as the rest of us.
I was merely trying to point out that there are some situations where I might use a pointer in C/C++ that doesn't have a 'safe' D analog. However, it was somewhat of a weak position, as most of the examples could be refactored fairly easily to use classes to store the dynamically referenced data rather than using C-like pointers--the first example I thought of was the reference counter for shared pointers.

Sean
Feb 13 2006
prev sibling parent reply "Unknown W. Brackets" <unknown simplemachines.org> writes:
What's going to stop them from making other mistakes, unrelated to 
pointers?  For example, the following:

void surprise(in char[] array)
{
     ubyte[100] x = cast(ubyte[100]) array;
     array[99] = 1;
}

This will compile fine, and uses zero pointers.  It's exactly the same 
concept, too.  Here's another one:

int surprise(in int i)
{
     if (i == 0 || i > 30)
         return i;
     else
         return surprise(--i);
}

Oops, what happens if i is below 0?  Oh, wait, here's another common 
mistake I see:

for (int i = 0; i != len; i++)
{
     ...
}

What happens if len is negative?  I've seen this happen, a lot, in more 
than a few different people's code.  They weren't stupid, you're right, 
but it did happen.

So do we mark != in fors as "unsafe"?  Recursion too?  And forget 
casting, any casting is unsafe now as well?

Seems to me like you're going to end up spending longer dealing with 
their problems, if they think they can use pointers but really can't, 
than you would just reviewing their darn code.

Oh wait, it's only open source where you do that "code review" thing. 
Otherwise it's considered a waste of time, except in huge corporations. 
  Why bother when "unsafe" can just fix everything for you like magic?

Valentine's day is coming up... good thing there are flowers, they just 
fix everything too.  I can be a jerk and all I need are flowers, right? 
  Magic.

-[Unknown]


 Matthew wrote:
 I am in no way trying to attack you. I am just pointing out that C and
 C++ breeds bad programming practice, and we need protection from them.
Are you serious?
No, I think I should rephrase myself. My new statement: people who come from a background that's primarily in C and C++ (especially if all they did was take one C or C++ course in college) are likely to use a C-style pointer instead of the appropriate D-style tools. We need a safeguard.
 Similarly, I am not trying to attack you, but this is terribly jejune. C and 
 C++ and pointers are no more or less hazardous than is a hammer or a syringe 
 or a stick of dynamite. If you're bashing in a nail or injecting a patient 
 or excavating a tunnel these are the best tools to have. They do not "breed" 
 bad practice. That comes from process, experience and attitude.
They cause a problem in D. Here is an example:

void surprise( in A a )
{
    byte *ap = cast(byte *)(a);
    ap[9] = 5;
}

If a function like that is part of a library, and I link against it, I may be in for quite a surprise. (There is code in the previous post that demonstrates the badness.)
 Walter correctly sought to incorporate pointers in D as first class parts of 
 the language. In the same way that I avoid IOStreams in C++, one tends to 
 avoid pointers in D.
 Bottom line: if you're a good engineer, you're a good engineer. If you're 
 not, you're not. The language used won't affect this truth. And avoiding 
 peeking inside abstractions won't help you become one.
But if you are a good engineer working with people who are good electrical engineers and bad programmers, you are in for a void surprise(). I think you are looking at this through the eyes of someone who is an expert programmer, which you are. The fact is many people who program are not experts and you can't change that. It would be nice to have a safeguard against mistakes that they are likely to make. If you can point out why there shouldn't be such a safeguard, please do; that is why I posted this thread.

THE OTHER POINT:
---------------
 Further, I'd suggest that an understanding of what goes on beneath the 
 covers is actually a hazardous skill to be lacking. (You might find 
 http://joelonsoftware.com/articles/ThePerilsofJavaSchools.html illustrative 
 of my point.)
I have no beef with that. I must say I am lucky to have made it into and out of my school before it became a Java prepschool. I took the weedout courses even though I had placement APs to get me out of some. You could say "I run gmake and gcc and I ain't never call malloc without calling free".
 C and 
 C++ and pointers are no more or less hazardous than is a hammer or a syringe 
 or a stick of dynamite.
From the Joel on software article: "Now, I freely admit that programming with pointers is not needed in 90% of the code written today, and in fact, it's downright dangerous in production code."
 I'll repeat something I observed here a long time ago: the very worst coding 
 I've come across, by a country mile, was a group of Java "consultants" (from 
<SNIP>
 cleaning tools (which, thankfully, were written in a fast language) to start 
 to clear up the junk.
We are mixing two arguments here. If you want to discuss the negative effects of Java prepschools, let's start a new thread for that. I'll join you in bashing them.
Feb 12 2006
parent reply nick <nick.atamas gmail.com> writes:
Pointer problems are notoriously difficult to track. Pointers are a
feature that is not necessary in 90% of production code. Hey, Joel
called them DANGEROUS. (I'm going to use that one a lot now.)

My example demonstrates a potential error that, if occurs in a library
that you don't have source for, will cause you hours of grief. My
example was carefully constructed. In it an object was passed in using
the /in/ keyword. That should guarantee that my copy of the object
doesn't change. If you are saying it is OK for it to change, then you
are basically saying that the /in/ keyword is useless (well, not really
useless but almost). I don't think that's cool.

Unknown W. Brackets wrote:
 What's going to stop them from making other mistakes, unrelated to
 pointers?  For example, the following:
 
 void surprise(in char[] array)
 {
     ubyte[100] x = cast(ubyte[100]) array;
     array[99] = 1;
 }
 
 This will compile fine, and uses zero pointers.  It's exactly the same
 concept, too.
No, it won't compile. Maybe I have a different version of dmd, but I get this:

main.d(3): e2ir: cannot cast from char[] to ubyte[100]

Try it yourself.

The rest of these aren't really pointer bugs. So, if you want to try a slippery slope and argue that all of programming is unsafe, be my guest. It isn't particularly productive though. (Sorry, I am getting cranky; it's late.)

 Here's another one:
 
 int surprise(in int i)
 {
     if (i == 0 || i > 30)
         return i;
     else
         return surprise(--i);
 }
 
 Oops, what happens if i is below 0?  Oh, wait, here's another common
 mistake I see:
 
 for (int i = 0; i != len; i++)
 {
     ...
 }
 
 What happens if len is negative?  I've seen this happen, a lot, in more
 than a few different people's code.  They weren't stupid, you're right,
 but it did happen.
 
 So do we mark != in fors as "unsafe"?  Recursion too?  And forget
 casting, any casting is unsafe now as well?
 
 Seems to me like you're going to end up spending longer dealing with
 their problems, if they think they can use pointers but really can't,
 than you would just reviewing their darn code.
 
 Oh wait, it's only open source where you do that "code review" thing.
 Otherwise it's considered a waste of time, except in huge corporations.
  Why bother when "unsafe" can just fix everything for you like magic?
 
 Valentine's day is coming up... good thing there are flowers, they just
 fix everything too.  I can be a jerk and all I need are flowers, right?
  Magic.
 
 -[Unknown]
Feb 12 2006
next sibling parent "Unknown W. Brackets" <unknown simplemachines.org> writes:
I've read articles by Joel before, and while he's usually pretty good he 
can be off just as often.  It's not the king's word or anything.

Arrays can be passed using in and they can still change.  Only the array 
structure (length, pointer) cannot change.  There is no such guarantee 
for the data it points to (since both arrays point to the same place.)

I think the in keyword is very useful, if nothing else because of its 
semantic value.  But, even with it, something like this is still possible:

class A
{
     B b = null;
     int i = 1;
}

class B
{
     int j = 1;
}

int main()
{
     void test(in A a)
     {
         a.i = 6;
         a.b.j = 6;
     }

     A a = new A();
     a.b = new B();
     a.i = 5;
     a.b.j = 5;

     test(a);

     printf("i: %i, j: %i\n", a.i, a.b.j);

     return 0;
}

Run that and i and j will both be 6. The only thing that cannot happen is 
that 'a' itself is changed. It really is not const.

As for the example you quoted, I'm afraid I am indeed guilty of a typo. 
  Anyway, for your enjoyment, here's a more complete example:

void surprise(in char[4] array)
{
     ubyte[] x = cast(ubyte[100]) array;
     x[99] = 1;
}

int main()
{
     char[4] char_array;
     ubyte[96] ubyte_array;

     surprise(char_array);

     foreach (ubyte u; ubyte_array)
     {
         if (u != 0)
             printf("1\n");
     }

     return 0;
}

You'll note, again, please, that I didn't use a single pointer.  Also, 
array bounds checking is enabled.  I'm doing everything great, except a 
bit of casting, since I know what it's doing behind the scenes.

This will print 1, even though ubyte_array was automatically initialized 
to 0 (one of the only things D does automatically to prevent bugs.) 
This means that I overwrote some other "innocent bystander" variable. 
  Without even typing a single asterisk.

I didn't say they were "pointer bugs" or anything of the sort.  I said 
they were sloppy/stupid programming, and things that cannot be prevented 
by using "safe" code.

In other words: if someone doesn't know what they're doing, they 
shouldn't be doing it.  I don't see people arguing in court rooms that 
they're amateur drivers or anything inane like that, but yet it happens 
for programmers.  Driving is no simple thing, but people actually spend 
the time to master it.  It's not that different.

But if you can't drive - whether because you're blind, or simply haven't 
learned yet - you'd best get off the darn road.  Or at least be aware 
that you should avoid doing anything stupid.  If your amateur programmer 
friends can't figure this out they're not nearly as intelligent as 
you're making them out to be.

(Please note, of course, that everyone has to learn.  But that's what 
driver's ed is for, and what other things are for.  You just shouldn't 
be in production/on the road if you can't figure things out yet.)

-[Unknown]


 Pointer problems are notoriously difficult to track. Pointers are a
 feature that is not necessary in 90% of production code. Hey, Joel
 called them DANGEROUS. (I'm going to use that one a lot now.)
 
 My example demonstrates a potential error that, if occurs in a library
 that you don't have source for, will cause you hours of grief. My
 example was carefully constructed. In it an object was passed in using
 the /in/ keyword. That should guarantee that my copy of the object
 doesn't change. If you are saying it is OK for it to change, then you
 are basically saying that the /in/ keyword is useless (well, not really
 useless but almost). I don't think that's cool.
 
 Unknown W. Brackets wrote:
 What's going to stop them from making other mistakes, unrelated to
 pointers?  For example, the following:

 void surprise(in char[] array)
 {
     ubyte[100] x = cast(ubyte[100]) array;
     array[99] = 1;
 }

 This will compile fine, and uses zero pointers.  It's exactly the same
 concept, too.
No, it won't compile. Maybe I have a different version of dmd, but I get this:

main.d(3): e2ir: cannot cast from char[] to ubyte[100]

Try it yourself. The rest of these aren't really pointer bugs. So, if you want to try a slippery slope and argue that all of programming is unsafe, be my guest. It isn't particularly productive though. (Sorry, I am getting cranky; it's late.)

Here's another one:
 int surprise(in int i)
 {
     if (i == 0 || i > 30)
         return i;
     else
         return surprise(--i);
 }

 Oops, what happens if i is below 0?  Oh, wait, here's another common
 mistake I see:

 for (int i = 0; i != len; i++)
 {
     ...
 }

 What happens if len is negative?  I've seen this happen, a lot, in more
 than a few different people's code.  They weren't stupid, you're right,
 but it did happen.

 So do we mark != in fors as "unsafe"?  Recursion too?  And forget
 casting, any casting is unsafe now as well?

 Seems to me like you're going to end up spending longer dealing with
 their problems, if they think they can use pointers but really can't,
 than you would just reviewing their darn code.

 Oh wait, it's only open source where you do that "code review" thing.
 Otherwise it's considered a waste of time, except in huge corporations.
  Why bother when "unsafe" can just fix everything for you like magic?

 Valentine's day is coming up... good thing there are flowers, they just
 fix everything too.  I can be a jerk and all I need are flowers, right?
  Magic.

 -[Unknown]
Feb 13 2006
prev sibling parent reply Sean Kelly <sean f4.ca> writes:
nick wrote:
 Pointer problems are notoriously difficult to track. Pointers are a
 feature that is not necessary in 90% of production code. Hey, Joel
 called them DANGEROUS. (I'm going to use that one a lot now.)
 
 My example demonstrates a potential error that, if occurs in a library
 that you don't have source for, will cause you hours of grief. My
 example was carefully constructed. In it an object was passed in using
 the /in/ keyword. That should guarantee that my copy of the object
 doesn't change. If you are saying it is OK for it to change, then you
 are basically saying that the /in/ keyword is useless (well, not really
 useless but almost). I don't think that's cool.
'in' just indicates call by value, and in the case of objects, only the reference is passed by value. No guarantee of immutability is being provided.

Doesn't Ada use the same in/out/inout syntax?

Sean
Feb 13 2006
parent reply nick <nick.atamas gmail.com> writes:
Sean Kelly wrote:
 nick wrote:
 Pointer problems are notoriously difficult to track. Pointers are a
 feature that is not necessary in 90% of production code. Hey, Joel
 called them DANGEROUS. (I'm going to use that one a lot now.)

 My example demonstrates a potential error that, if occurs in a library
 that you don't have source for, will cause you hours of grief. My
 example was carefully constructed. In it an object was passed in using
 the /in/ keyword. That should guarantee that my copy of the object
 doesn't change. If you are saying it is OK for it to change, then you
 are basically saying that the /in/ keyword is useless (well, not really
 useless but almost). I don't think that's cool.
'in' just indicates call by value, and in the case of objects, only the reference is passed by value. No guarantee of immutability is being provided.

Doesn't Ada use the same in/out/inout syntax?

Sean
Yeah, I think we're on the same page as far as the facts. I am just not so happy about the lack of const. Although, as Walter pointed out, there is no good way that he found.
Feb 13 2006
parent reply "Matthew" <nowhere noaddress.co.us> writes:
"nick" <nick.atamas gmail.com> wrote in message 
news:dsrhuq$i9p$1 digitaldaemon.com...
 Sean Kelly wrote:
 nick wrote:
 Pointer problems are notoriously difficult to track. Pointers are a
 feature that is not necessary in 90% of production code. Hey, Joel
 called them DANGEROUS. (I'm going to use that one a lot now.)

 My example demonstrates a potential error that, if occurs in a library
 that you don't have source for, will cause you hours of grief. My
 example was carefully constructed. In it an object was passed in using
 the /in/ keyword. That should guarantee that my copy of the object
 doesn't change. If you are saying it is OK for it to change, then you
 are basically saying that the /in/ keyword is useless (well, not really
 useless but almost). I don't think that's cool.
'in' just indicates call by value, and in the case of objects, only the reference is passed by value. No guarantee of immutability is being provided.

Doesn't Ada use the same in/out/inout syntax?

Sean
Yeah, I think we're on the same page as far as the facts. I am just not so happy about the lack of const. Although, as Walter pointed out, there is no good way that he found.
Perhaps if people considered readonly, rather than const, it'd seem more workable.

readonly would mean only the code seeing that decorator is prevented by the compiler from altering it. In other words, this impacts only logical constness within the current context. readonly would have nothing whatsoever to do with physical constness.

IIRC most, though not all, of Walter's objections have pertained to the physical subversion of logical constness, and how that may impact code generation and other compilery things I don't understand.
Feb 13 2006
next sibling parent "Andrew Fedoniouk" <news terrainformatica.com> writes:
"Matthew" <nowhere noaddress.co.us> wrote in message 
news:dsri7o$j2i$1 digitaldaemon.com...
 "nick" <nick.atamas gmail.com> wrote in message 
 news:dsrhuq$i9p$1 digitaldaemon.com...
 Sean Kelly wrote:
 nick wrote:
 Pointer problems are notoriously difficult to track. Pointers are a
 feature that is not necessary in 90% of production code. Hey, Joel
 called them DANGEROUS. (I'm going to use that one a lot now.)

 My example demonstrates a potential error that, if occurs in a library
 that you don't have source for, will cause you hours of grief. My
 example was carefully constructed. In it an object was passed in using
 the /in/ keyword. That should guarantee that my copy of the object
 doesn't change. If you are saying it is OK for it to change, then you
 are basically saying that the /in/ keyword is useless (well, not really
 useless but almost). I don't think that's cool.
'in' just indicates call by value, and in the case of objects, only the reference is passed by value. No guarantee of immutability is being provided. Doesn't Ada use the same in/out/inout syntax? Sean
Yeah, I think we're on the same page as far as the facts. I am just not so happy about the lack of const. Although, as Walter pointed out, there is no good way that he found.
Perhaps if people considered readonly, rather than const, it'd seem more workable.

readonly would mean only the code seeing that decorator is prevented by the compiler from altering it. In other words, this impacts only logical constness within the current context. readonly would have nothing whatsoever to do with physical constness.

IIRC most, though not all, of Walter's objections have pertained to the physical subversion of logical constness, and how that may impact code generation and other compilery things I don't understand.
I think that Walter is looking for something like this:

http://pag.csail.mit.edu/~mernst/pubs/ref-immutability-oopsla2005.pdf

It is Javari - Java with readonly. I think that this paper by Matthew Tschantz and Michael Ernst is the most comprehensive study on the subject. This is how it should be done in an ideal world.

Andrew.
Feb 13 2006
prev sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"Matthew" <nowhere noaddress.co.us> wrote in message 
news:dsri7o$j2i$1 digitaldaemon.com...
 IIRC most, though not all, of Walter's objections have pertained to the 
 physical subversion of logical constness, and how that may impact code 
 generation and other compilery things I don't understand.
It's not just that. It's that there are many different definitions of const, each with its own advantages and problems. C++ uses two of them (confusingly conflated).

1) is the reference const?
2) is what the reference refers to const?
3) does (2) apply recursively?
4) is it only const by that reference, i.e. can other references to the same data concurrently modify it?
5) is it ROMable?
6) is it initialize-once?
7) is it an error to subvert const'ness, or is it legal? (Since D has pointers, one can always find a way to subvert it; the question is whether that is defined behavior.)
8) is there a place for 'mutable' members of const objects?
9) how does aliasing (multiple references to the same data) fit into all this?
10) does C++ have it backwards - const should be the default, and non-const should be explicit?
11) what happens when a const reference is supplied to a non-const reference?
12) is there a way to do const that can take advantage of hardware support for read-only?
13) how can const fit in with the notion of COW?
14) how does this fit in with the notion of an atomic function (a function with no side effects)?
15) how to do this in an aesthetically pleasing way that has a slap-ones-head-of-course-thats-the-way-to-do-it obviousness about it that everyone missed before? <g>

I've asked a friend who's a languages expert about this, who views programming languages through the lens of academic rigor, and he unhelpfully suggested D do all variations <sigh>.

My views on the failings of C++ const are pretty well documented in these n.g.'s, so I shan't repeat them. Suffice to say whatever D gets won't look or behave like that, and D is better off with no const at all than the C++ const. (I might add, however, that D does have C++'s readonly notion of const, and that works well. Just not the type modifier version.)
Feb 13 2006
next sibling parent reply S. Chancellor <dnewsgr mephit.kicks-ass.org> writes:
On 2006-02-13 21:40:30 -0800, "Walter Bright" <newshound digitalmars.com> said:

 
 "Matthew" <nowhere noaddress.co.us> wrote in message 
 news:dsri7o$j2i$1 digitaldaemon.com...
 IIRC most, though not all, of Walter's objections have pertained to the 
 physical subversion of logical constness, and how that may impact code 
 generation and other compilery things I don't understand.
It's not just that. It's that there are many different definitions of const, each with its own advantages and problems. C++ uses two of them (confusingly conflated).

1) is the reference const?
2) is what the reference refers to const?
3) does (2) apply recursively?
4) is it only const by that reference, i.e. can other references to the same data concurrently modify it?
5) is it ROMable?
6) is it initialize-once?
7) is it an error to subvert const'ness, or is it legal? (Since D has pointers, one can always find a way to subvert it; the question is whether that is defined behavior.)
8) is there a place for 'mutable' members of const objects?
9) how does aliasing (multiple references to the same data) fit into all this?
10) does C++ have it backwards - const should be the default, and non-const should be explicit?
11) what happens when a const reference is supplied to a non-const reference?
12) is there a way to do const that can take advantage of hardware support for read-only?
13) how can const fit in with the notion of COW?
14) how does this fit in with the notion of an atomic function (a function with no side effects)?
15) how to do this in an aesthetically pleasing way that has a slap-ones-head-of-course-thats-the-way-to-do-it obviousness about it that everyone missed before? <g>

I've asked a friend who's a languages expert about this, who views programming languages through the lens of academic rigor, and he unhelpfully suggested D do all variations <sigh>.

My views on the failings of C++ const are pretty well documented in these n.g.'s, so I shan't repeat them. Suffice to say whatever D gets won't look or behave like that, and D is better off with no const at all than the C++ const. (I might add, however, that D does have C++'s readonly notion of const, and that works well. Just not the type modifier version.)
What does const currently do? I'm not familiar with read only memory attributes, but on MacOS X I did this:

char[] foo = "Hello";

and then idiotically tried to modify foo, and the program segfaulted on me. The fix was to do "Hello".dup

Can you make this happen at will to a portion of memory? If so I propose this syntax:

const Type Name = <initializer>; // This would make the reference constant.

TypeDef Name = const new Type; // This would make the data itself constant (What's this good for? Can't imagine a class which would let you readonly it. The class would need to be written in such a way as to allow it to be a readonly class.)

const Type Name = const new Type; // Both the reference and data are readonlyized

This is the behavior I would expect out of const. Modifying a const variable would result in your program segfaulting due to a memory access error.

-S.

Let me fill out your questionnaire:
 1) is the reference const?
See above
 2) is what the reference refers to const?
See above
 3) does (2) apply recursively?
No; i.e. a char[] in a class that was defined as const would still be writable unless the class itself defined it as const.
 4) is it only const by that reference, i.e. can other references to the 
 same data concurrently modify it?
No.
 5) is it ROMable?
Yes
 6) is it initialize-once?
Yes
 7) is it an error to subvert const'ness, or is it legal? (Since D has 
 pointers, one can always find a way to subvert it, the question is is 
 that defined behavior.)
Yes, see above
 8) is there a place for 'mutable' members of const objects?
See above about making a readonly class. I suspect it would be wise to add some kind of keyword saying this class can be a readonly class. It might even make sense to only allow structures to be const'd (Aside from the references, I'm referring to the actual heap data.)
 9) how does aliasing (multiple references to the same data) fit into all this?
*Shrug* The memory is ROMized. *BOOM*
 10) does C++ have it backwards - const should be the default, and 
 non-const should be explicit?
Nope, I don't want to have to declare the majority of what I use as mutable.
 11) what happens when a const reference is supplied to a non-const reference?
That's not a problem, since the reference is what is constant. We care that the const reference always points to the correct object, not that nothing else does.
 12) is there a way to do const that can take advantage of hardware 
 support for read-only?
Sure, if it exists.
 13) how can const fit in with the notion of COW?
How does this relate?
 14) how does this fit in with the notion of an atomic function (a 
 function with no side effects)?
Huh?
 15) how to do this in an aesthetically pleasing way that has a 
 slap-ones-head-of-course-thats-the-way-to-do-it obviousness about it 
 that everyone missed before? <g>
I like the syntax I proposed.
Feb 14 2006
parent reply "Walter Bright" <newshound digitalmars.com> writes:
Yes, that's one possible design. I like the idea of being able to put 
*allocated* data into a readonly segment, but this may be difficult to 
achieve. 
Feb 14 2006
next sibling parent S. Chancellor <dnewsgr mephit.kicks-ass.org> writes:
On 2006-02-14 11:03:04 -0800, "Walter Bright" <newshound digitalmars.com> said:

 Yes, that's one possible design. I like the idea of being able to put 
 *allocated* data into a readonly segment, but this may be difficult to 
 achieve.
I'm not sure what you mean. Also, I forgot this part:

const StructType Foo; <-- Foo's data would obviously be readonly also (since structures don't have reference semantics by default), so if any references were created they couldn't modify its data either.

Where does this leave arrays that have pseudo reference semantics?

-S.
Feb 14 2006
prev sibling parent Sean Kelly <sean f4.ca> writes:
Walter Bright wrote:
 Yes, that's one possible design. I like the idea of being able to put 
 *allocated* data into a readonly segment, but this may be difficult to 
 achieve. 
This would be great if it's possible. I assume one method would be to allocate from a write-protected memory page for this purpose, but this sounds a bit slow and not terribly portable. Are there any other options?

Sean
Feb 14 2006
prev sibling parent Bruno Medeiros <daiphoenixNO SPAMlycos.com> writes:
Walter Bright wrote:
  (I might add, however, that D does have C++'s readonly notion of 
 const, and that works well. Just not the type modifier version.) 
 
Yes, except that there is no way to apply the readonly notion to the referenced data, only to the variable (self) value.

--
Bruno Medeiros - CS/E student
"Certain aspects of D are a pathway to many abilities some consider to be... unnatural."
Feb 16 2006
prev sibling parent Hasan Aljudy <hasan.aljudy gmail.com> writes:
Matthew wrote:
I am in no way trying to attack you. I am just pointing out that C and
C++ breed bad programming practices, and we need protection from them.
[snip]

Bottom line: if you're a good engineer, you're a good engineer. If you're not, you're not. The language used won't affect this truth. And avoiding peeking inside abstractions won't help you become one.
I think you didn't get his point: he's not worried that /he/ will misuse pointers, he's worried that _his colleagues_ will.
Feb 12 2006
prev sibling parent reply nick <nick.atamas gmail.com> writes:
What's worse, I can use the function prototype:

void surprise(inout A a);

and the results will be exactly the same. That just seriously breaks all
the high level language features that D puts in place. So I guess my
question is: "Shouldn't there be a mechanism to deal with this?"
Feb 12 2006
next sibling parent nick <nick.atamas gmail.com> writes:
nick wrote:
 What's worse, I can use the function prototype:
 
 void surprise(inout A a);
 
 and the results will be exactly the same. That just seriously breaks all
 the high level language features that D puts in place. So I guess my
 question is: "Shouldn't there be a mechanism to deal with this?"
Sorry, that's meant to be: void surprise(in A a). As in, 'a' should not be changed inside the function call. It's been a long day.
Feb 12 2006
prev sibling parent reply S. Chancellor <dnewsgr mephit.kicks-ass.org> writes:
On 2006-02-12 19:32:47 -0800, nick <nick.atamas gmail.com> said:

 What's worse, I can use the function prototype:
 
 void surprise(inout A a);
 
 and the results will be exactly the same. That just seriously breaks all
 the high level language features that D puts in place. So I guess my
 question is: "Shouldn't there be a mechanism to deal with this?"
If you're working with people who are idiots, they're going to be idiots regardless of a damn unsafe keyword, or a friggin' throws keyword, or whatever! Languages do not fix people.

Writing software is not subject to malicious attacks by hackers. People are not intentionally injecting bad code into your software! You do not need security features built into the language.

If it were up to me I'd abolish the private and protected keywords too! They're rather stupid; their intent is to "recommend" people not to use them. Quite often, especially in .NET, I want to use a protected method and end up having to subclass the F*CKING class to expose it to the rest of my program.

-S.
Feb 12 2006
parent reply nick <nick.atamas gmail.com> writes:
S. Chancellor wrote:
 On 2006-02-12 19:32:47 -0800, nick <nick.atamas gmail.com> said:
 
 What's worse, I can use the function prototype:

 void surprise(inout A a);

 and the results will be exactly the same. That just seriously breaks all
 the high level language features that D puts in place. So I guess my
 question is: "Shouldn't there be a mechanism to deal with this?"
Languages do not fix people. Writing software is not subject to malicious attacks by hackers. People are not intentionally injecting bad code into your software! You do not need security features built into the language.

If it were up to me I'd abolish the private and protected keywords too! They're rather stupid; their intent is to "recommend" people not to use them. Quite often, especially in .NET, I want to use a protected method and end up having to subclass the F*CKING class to expose it to the rest of my program.

-S.
Now you're talking crazy talk. Throws declarations may be a bad idea - I agreed after having read up on it. I have yet to hear a good reason why the unsafe keyword or some other safeguard against dangerous pointer code is a bad idea.

I'm just going to quote Joel here: "Now, I freely admit that programming with pointers is not needed in 90% of the code written today, and in fact, it's downright dangerous in production code."

http://joelonsoftware.com/articles/ThePerilsofJavaSchools.html
 If you're working with people who are idiots, they're going to be idiots
 regardless of a damn unsafe keyword, or a friggin' throws keyword, or
 whatever!
When will people finally realize that stupid and inexperienced aren't the same thing. Some individuals who write code are mathematicians, physicists, or electrical engineers by training. Programming languages aren't just for hardcore experts like Matthew or Walter. Most programmers are amateurs; you're not going to change that.

Furthermore, humans are prone to error. If you think that safeguards aren't for you, then maybe you should double-check the list of D features: in/out/inout, nested functions, typesafe variadic arguments, contract programming, guaranteed initialization, and others. Are you telling me that all those should be thrown out because they are just there to prevent mistakes?
Feb 12 2006
next sibling parent reply "Chris Miller" <chris dprogramming.com> writes:
On Mon, 13 Feb 2006 00:26:48 -0500, nick <nick.atamas gmail.com> wrote:

 Now you're talking crazy talk. Throws declarations may be a bad idea - I
 agreed after having read up on it. I have yet to hear a good reason why
 the unsafe keyword or some other safeguard against dangerous pointer
 code is a bad idea.
Then would 'delete' be 'unsafe'? Even though it nulls the reference, other places may still be referencing it, hence making it unsafe.
Feb 12 2006
parent reply nick <nick.atamas gmail.com> writes:
Chris Miller wrote:
 On Mon, 13 Feb 2006 00:26:48 -0500, nick <nick.atamas gmail.com> wrote:
 
 Now you're talking crazy talk. Throws declarations may be a bad idea - I
 agreed after having read up on it. I have yet to hear a good reason why
 the unsafe keyword or some other safeguard against dangerous pointer
 code is a bad idea.
Then would 'delete' be 'unsafe'? Even though it nulls the reference, other places may still be referencing it, hence making it unsafe.
That seems to be an implementation detail. However, my immediate reaction is that delete probably should be unsafe; however, I am not sure. It all depends on how much it is needed for mainstream software development and how much damage it tends to cause.

Of course, if you are talking about overriding new and then calling delete, that's a different story. By allocating memory manually you are preventing a good garbage collector from optimizing your heap, so you should be avoiding that in most cases.

The upshot of using "unsafe" is that all code that messes with the memory manually would get marked unsafe. So, someone working on OS features may end up having to put an "unsafe:" at the top of every file and compiling with the --unsafe flag (or something to that effect). It seems like a small price to pay for preventing amateurs from screwing up your code.

It seems to me that most people who write code don't need pointers. Both D and C++ are languages that provide high-level and low-level access. You are going to get both experts who need the pointers and amateurs who don't need them.

Both Bjarne and Matthew seem to think that people should just "learn to code well". Despite admitting that most coders are not experts, Bjarne says: "The most direct way of addressing the problems caused by lack of type safety is to provide a range-checked Standard Library based on statically typed containers and base the teaching of C++ on that". <http://public.research.att.com/~bs/rules.pdf>

I must disagree. There are too many people to teach. In some cases it is a lot easier to modify a language than to teach everyone not to use a feature. This may be one of those cases. I think experts tend to forget that a language is there to help programmers develop software and to reduce chances of human error.
Feb 12 2006
next sibling parent "Unknown W. Brackets" <unknown simplemachines.org> writes:
By implementation detail, are you speaking to it nulling the pointer?  I 
was pretty sure that was in the spec, and not in the implementation.

Delete is needed if you ever want to immediately call a destructor.  If 
used wisely, it can also decrease the memory usage of your software, and 
reduce garbage collection runs (if the GC won't run unless there's more 
than X to collect.)

Overriding new and delete would definitely fit into the same class as 
pointers, recursion, casting, != in fors, and delete.  They're all scary.

-[Unknown]


 Chris Miller wrote:
 On Mon, 13 Feb 2006 00:26:48 -0500, nick <nick.atamas gmail.com> wrote:

 Now you're talking crazy talk. Throws declarations may be a bad idea - I
 agreed after having read up on it. I have yet to hear a good reason why
 the unsafe keyword or some other safeguard against dangerous pointer
 code is a bad idea.
Then would 'delete' be 'unsafe'? Even though it nulls the reference, other places may still be referencing it, hence making it unsafe.
That seems to be an implementation detail. However, my immediate reaction is that delete probably should be unsafe; however, I am not sure. It all depends on how much it is needed for mainstream software development and how much damage it tends to cause.

Of course, if you are talking about overriding new and then calling delete, that's a different story. By allocating memory manually you are preventing a good garbage collector from optimizing your heap, so you should be avoiding that in most cases.

The upshot of using "unsafe" is that all code that messes with the memory manually would get marked unsafe. So, someone working on OS features may end up having to put an "unsafe:" at the top of every file and compiling with the --unsafe flag (or something to that effect). It seems like a small price to pay for preventing amateurs from screwing up your code.

It seems to me that most people who write code don't need pointers. Both D and C++ are languages that provide high-level and low-level access. You are going to get both experts who need the pointers and amateurs who don't need them.

Both Bjarne and Matthew seem to think that people should just "learn to code well". Despite admitting that most coders are not experts, Bjarne says: "The most direct way of addressing the problems caused by lack of type safety is to provide a range-checked Standard Library based on statically typed containers and base the teaching of C++ on that". <http://public.research.att.com/~bs/rules.pdf>

I must disagree. There are too many people to teach. In some cases it is a lot easier to modify a language than to teach everyone not to use a feature. This may be one of those cases. I think experts tend to forget that a language is there to help programmers develop software and to reduce chances of human error.
Feb 12 2006
prev sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"nick" <nick.atamas gmail.com> wrote in message 
news:dsp8bc$143b$1 digitaldaemon.com...
 I must disagree. There are too many people to teach. In some cases it is
 a lot easier to modify a language than to teach everyone not to use a
 feature. This may be one of those cases. I think experts tend to forget
 that a language is there to help programmers develop software and to
 reduce chances of human error.
On the one hand, I agree with you. Often, when I show that feature X of D will eliminate a certain common class of errors one gets with C++, the response I get back is that "yes, but one can follow this procedural rigor with C++ and not get that error." Well, sure, if everyone is a god-like programmer. But they aren't, and if you're an employer, you'll inevitably be hiring mere mortal programmers, and if you work on a team, they'll be mere mortals too, despite being a god oneself <g>.

On the other hand, having powerful (but dangerous) features is just too useful to ignore. How do we solve this? We can't. But we can try to mitigate it, and D does so by:

1) Trying to make the natural thing to do the right way to do it. D's arrays and reference objects are good examples of this.

2) Making practices that often lead to bugs be more visible in the code, so that code reviews can zero in on them, or even so that they're more greppable. Going from the C-style cast to requiring a 'cast' keyword is an example of this.

3) Finding attractive alternatives to common uses of unsafe practices - out and inout parameters are a good example here.

As for 'unsafe' itself: being keyword based, it can be flagged by the compiler if so desired, and it can be grepped for. It has one big disadvantage, though - it's just awful <g>. I find it grating to be forced to label code as "unsafe" when, as a programmer, I know it's perfectly safe. It's patronizing. It makes the language feel like it is not for professionals.

Although D eliminates maybe 90% of the need for explicit pointers, they're still needed here and there. I'd like to go that last 10%, but I don't think 'unsafe' is the right way to do it, even if its heart is in the right place.
Feb 13 2006
parent reply nick <nick_member pathlink.com> writes:
In article <dspn17$1lbq$1 digitaldaemon.com>, Walter Bright says...
3) Finding attractive alternatives to common uses of unsafe practices - out 
and inout parameters are a good example here.
I like the syntax of in, out and inout - it's very explicit. However, when it comes to objects, I am not so sure the semantics are the best at the moment. For example, given this code:

class A { public int q; }
void surprise(in A a) { a.q = 5; }

a call to surprise(a) will change the value of 'a'. Is there no way to guarantee that the value of 'a' doesn't change in a function?

being keyword based, it can be flagged by the compiler if so desired, and it 
can be grepped for. It has one big disadvantage, though - it's just awful 
<g>. I find it grating to be forced to label code as "unsafe" when, as a 
programmer, I know it's perfectly safe. It's patronizing. It makes the 
language feel like it is not for professionals.
Well, would it be OK to just change the keyword to 'raw', or maybe 'h4x0r_31337'? Then we can feel good about using it; I know I would. = )
Although D eliminates maybe 90% of the need for explicit pointers, they're 
still needed here and there. I'd like to go that last 10%, but I don't think 
'unsafe' is the right way to do it, even if its heart is in the right place. 
I think that if we changed the semantics of /in/ or provided some equivalent of a 'const type*', that would help.
Feb 13 2006
next sibling parent reply "Walter Bright" <newshound digitalmars.com> writes:
"nick" <nick_member pathlink.com> wrote in message 
news:dsqr3t$2seg$1 digitaldaemon.com...
 I think that if we changed the semantics of /in/ or provided some 
 equivalent of
 a 'const type*', that would help.
I agree, but in investigating this it turned out to be quite a quagmire, so I abandoned it for now.
Feb 13 2006
parent reply John Demme <me teqdruid.com> writes:
The ability to protect not a reference, but the memory a reference points to
is a feature that (at least a few) people have been asking for for some
time.  (Think: real constant arrays.)  Walter, this is something you've
said is a 2.0 feature, yes?  Nick, this would satisfy your issue with a
reference "in" parameter?

~John Demme

Walter Bright wrote:

 
 "nick" <nick_member pathlink.com> wrote in message
 news:dsqr3t$2seg$1 digitaldaemon.com...
 I think that if we changed the semantics of /in/ or provided some
 equivalent of
 a 'const type*', that would help.
I agree, but in investigating this it turned out to be quite a quagmire, so I abandoned it for now.
Feb 13 2006
next sibling parent "Walter Bright" <newshound digitalmars.com> writes:
"John Demme" <me teqdruid.com> wrote in message 
news:dsrn2u$m91$2 digitaldaemon.com...
 The ability to protect not a reference, but the memory a reference points 
 to
 is a feature that (at least a few) people have been asking for for some
 time.  (Think: real constant arrays.)  Walter, this is something you've
 said is a 2.0 feature, yes?
Yes.
Feb 13 2006
prev sibling parent nick <nick.atamas gmail.com> writes:
John Demme wrote:
 The ability to protect not a reference, but the memory a reference points to
 is a feature that (at least a few) people have been asking for for some
 time.  (Think: real constant arrays.)  Walter, this is something you've
 said is a 2.0 feature, yes?  Nick, this would satisfy your issue with a
 reference "in" parameter?
 
 ~John Demme
 
 Walter Bright wrote:
 
 "nick" <nick_member pathlink.com> wrote in message
 news:dsqr3t$2seg$1 digitaldaemon.com...
 I think that if we changed the semantics of /in/ or provided some
 equivalent of
 a 'const type*', that would help.
I agree, but in investigating this it turned out to be quite a quagmire, so I abandoned it for now.
Thanks, John. That pretty much solves my problems.
Feb 14 2006
prev sibling parent "Derek Parnell" <derek psych.ward> writes:
On Tue, 14 Feb 2006 07:44:45 +1100, nick <nick_member pathlink.com> wrote:

 In article <dspn17$1lbq$1 digitaldaemon.com>, Walter Bright says...
 3) Finding attractive alternatives to common uses of unsafe practices -  
 out
 and inout parameters are a good example here.
I like the syntax of in, out and inout - it's very explicit. However, when it comes to objects, I am not so sure the semantics are the best at the moment. For example, given this code: class A { public int q; } void surprise(in A a) { a.q = 5; } a call to surprise(a) will change the value of 'a'. Is there no way to guarantee that the value of 'a' doesn't change in a function?
Sorry to interrupt, but it's important to get some terms straightened out here. The 'in' qualifier does in fact protect the value of the parameter passed. In your example, it does protect the value of 'a'; that is, it does not permit any changes to the object reference to be returned to the caller. However, the 'in' qualifier does *not* protect the object to which 'a' refers. In short, it protects the reference but not the referred.

In your example, 'a' is the address of a class instance - it is not the instance itself, just its RAM address. Any changes you make to that address are not returned to the caller. The only way (currently) to protect the object itself is to take a copy of it and work with the copy. This applies to class instances and to arrays.
 I think that if we changed the semantics of /in/ or provided some  
 equivalent of a 'const type*', that would help.
I'd only go as far as saying that I'd like to see some form of syntax to alert the compiler and maintainer of the intention of the coder, and thus warn us of overt or explicit attempts to change data inside such a 'protected' object or array. This should still allow other methods to modify objects, but not make it easy to do. -- Derek Parnell Melbourne, Australia
Feb 13 2006
prev sibling next sibling parent reply "Matthew" <nowhere noaddress.co.us> writes:
"nick" <nick.atamas gmail.com> wrote in message 
news:dsp5ak$s35$1 digitaldaemon.com...
 S. Chancellor wrote:
 On 2006-02-12 19:32:47 -0800, nick <nick.atamas gmail.com> said:

 What's worse, I can use the function prototype:

 void surprise(inout A a);

 and the results will be exactly the same. That just seriously breaks all
 the high level language features that D puts in place. So I guess my
 question is: "Shouldn't there be a mechanism to deal with this?"
Languages do not fix people. Writing software is not subject to malicious attacks by hackers. People are not intentionally injecting bad code into your software! You do not need security features built into the language. If it were up to me I'd abolish the private and protected keywords too! They're rather stupid; their intent is to "recommend" people not to use them. Quite often, especially in .NET, I want to use a protected method and end up having to subclass the F*CKING class to expose it to the rest of my program. -S.
Now you're talking crazy talk. Throws declarations may be a bad idea - I agreed after having read up on it. I have yet to hear a good reason why the unsafe keyword or some other safeguard against dangerous pointer code is a bad idea. I'm just going to quote Joel here: "Now, I freely admit that programming with pointers is not needed in 90% of the code written today, and in fact, it's downright dangerous in production code." http://joelonsoftware.com/articles/ThePerilsofJavaSchools.html
 If you're working with people who are idiots, they're going to be idiots
 regardless of a damn unsafe keyword, or a friggin' throws keyword, or
 whatever!
When will people finally realize that stupid and inexperienced aren't the same thing? Some individuals who write code are mathematicians, physicists, or electrical engineers by training. Programming languages aren't just for hardcore experts like Matthew or Walter.
LOL! This appellation never fails to make my brain do a "what, you mean me!?!" double-take. Same feeling as getting caught nicking one of mum's cakes, or in-flagrante with one's girlfriend, perhaps. ;-)
 When will people finally realize that stupid and inexperienced aren't the
 same thing. . . . Most
 programmers are amateurs; you're not going to change that. Furthermore,
 humans are prone to error.
True. But you can't hobble "production" programmers to cater for hobbyists. That's just not a go-er. Isn't it better to do what D has done, and allow for the low down and dirty while fostering and promoting a generally more suitable higher-level of abstraction? Type-safety, const-correctness, and all such things are very good. Basically, whatever you can get the compiler to do for you is a good thing. Just as long as you can circumvent it when you need to. In both these areas, I think D has some way to go, but it's better than most.
Feb 12 2006
parent Sean Kelly <sean f4.ca> writes:
Matthew wrote:
 "nick" <nick.atamas gmail.com> wrote in message 
 news:dsp5ak$s35$1 digitaldaemon.com...
 
 When will people finally realize that stupid and inexperienced aren't the
 same thing. . . . Most
 programmers are amateurs; you're not going to change that. Furthermore,
 humans are prone to error.
True. But you can't hobble "production" programmers to cater for hobbyists. That's just not a go-er. Isn't it better to do what D has done, and allow for the low down and dirty while fostering and promoting a generally more suitable higher-level of abstraction? Type-safety, const-correctness, and all such things are very good. Basically, whatever you can get the compiler to do for you is a good thing. Just as long as you can circumvent it when you need to. In both these areas, I think D has some way to go, but it's better than most.
Agreed. And frankly, I would be fine with using an 'unsafe' attribute so long as doing so didn't prevent me from being as evil as the situation demanded. However, I reserve the right to complain if it's annoying to use ;-) Sean
Feb 13 2006
prev sibling next sibling parent reply "Unknown W. Brackets" <unknown simplemachines.org> writes:
Well, it's clearly not helping them, is it?  Most programmers may or may 
not know what in the world they're doing, but most of the programmers I 
want to hire or have work with me will.

All of the features you listed are there to help people who are sensible 
detect errors.  Lovely things they are, too.  But they don't just get 
used, you have to use them.  If you don't, you're no better off than if 
you didn't.

Mistakes happen, but gross mistakes shouldn't.  If they do, the person 
needs to go back and bake in training/school/less important projects for 
a while.  No language can change this, however many keywords or flags it 
adds.

-[Unknown]


 Most programmers are amateurs; you're not going to change that.
Feb 12 2006
next sibling parent reply Derek Parnell <derek psych.ward> writes:
On Sun, 12 Feb 2006 22:33:25 -0800, Unknown W. Brackets wrote:

 Most programmers are amateurs; you're not going to change that.
More indication that we could really do with a 'lint' program for D. It could warn about pointer usage too. -- Derek (skype: derek.j.parnell) Melbourne, Australia "Down with mediocracy!" 13/02/2006 5:44:24 PM
Feb 12 2006
parent nick <nick.atamas gmail.com> writes:
Derek Parnell wrote:
 On Sun, 12 Feb 2006 22:33:25 -0800, Unknown W. Brackets wrote:
 
 Most programmers are amateurs; you're not going to change that.
More indication that we could really do with a 'lint' program for D. It could warn about pointer usage too.
A lint-like tool may be the way to go. However, there definitely needs to be an in-language solution to the /in/ parameter problem; the current behavior seems unacceptable (see my previous posts for the details). There is a lint-like project for Java called FindBugs. Bill Pugh at UMCP is leading it. I happen to know Dr. Pugh; he taught one of my courses and sponsored my senior C.S. project. If someone decides to work on a lint-like tool, I will be happy to introduce them to Dr. Pugh.
Feb 13 2006
prev sibling parent reply nick <nick.atamas gmail.com> writes:
Unknown W. Brackets wrote:
 Well, it's clearly not helping them, is it?  Most programmers may or may
 not know what in the world they're doing, but most of the programmers I
 want to hire or have work with me will.
 
 All of the features you listed are there to help people who are sensible
 detect errors.  Lovely things they are, too.  But they don't just get
 used, you have to use them.  If you don't, you're no better off than if
 you didn't.
 
 Mistakes happen, but gross mistakes shouldn't.  If they do, the person
 needs to go back and bake in training/school/less important projects for
 a while.  No language can change this, however many keywords or flags it
 adds.
 
 -[Unknown]
 
 
 Most programmers are amateurs; you're not going to change that.
That's an easy one: you can't do unsafe things without wrapping your code in the unsafe keyword. That's fairly easy to add, if you ask me. However, when that amateur gets the compiler error, he/she will look it up. Once they do, there will be a big notice "DANGER, USE THIS INSTEAD". I work with a lot of EEs who only had one or two programming courses. They get a job mainly based on their hardware architecture knowledge. Now I have to work with them to write a hardware simulator. Oh, I don't know if you realize this, but I essentially removed /in/out/inout from the D spec with my example; please go read it. If you think that people are going to use the language the RIGHT way when there is such a tempting wrong way, I suggest you look at C++ and its operator overloading.
Feb 12 2006
parent reply "Unknown W. Brackets" <unknown simplemachines.org> writes:
When I went into Computer Science, practically the first thing the 
teacher showed us was "cout".  Luckily, I already knew programming well 
enough to know that was stupid, and ended up teaching a large portion of 
the students in the class basic programming concepts.

When the first thing you see is a bad example, it's really hard to fix 
things.  Agreed, D has its flukes and its documentation is not perfect, 
but I haven't seen any code encouraging the abuse of pointers, 
references, for loops, casts, or asserts for that matter.

Most amateur programmers are copy-and-paste programmers, or are being 
taught by someone else.  In either case, they usually don't really 100% 
understand what they are doing, and a "danger use this instead" notice 
won't help them.

Consider, for example, if for always suggested that you use foreach 
instead.  You are guaranteed in bounds and you can't make as many 
mistakes, after all... right?  You know what would happen, right? 
People would do this:

int for_loop[1000];
foreach (int i, int ignore; for_loop)
    writefln(i);

Isn't that the better way?  No mistakes there... just a lot more memory 
usage.  Who cares about that anyway, right?  Computers are loaded these 
days.

I don't see any example where you removed anything from any spec. 
Everything seems to work as I expected in all the examples you've given.

There's really just one way to fix mistakes.  And it's not bandaids or 
notices.  And half the point of Joel's article, which you so carefully 
chose to ignore, was that learning to understand pointers is crucial to 
general understanding and principles.

-[Unknown]


 Unknown W. Brackets wrote:
 Well, it's clearly not helping them, is it?  Most programmers may or may
 not know what in the world they're doing, but most of the programmers I
 want to hire or have work with me will.

 All of the features you listed are there to help people who are sensible
 detect errors.  Lovely things they are, too.  But they don't just get
 used, you have to use them.  If you don't, you're no better off than if
 you didn't.

 Mistakes happen, but gross mistakes shouldn't.  If they do, the person
 needs to go back and bake in training/school/less important projects for
 a while.  No language can change this, however many keywords or flags it
 adds.

 -[Unknown]


 Most programmers are amateurs; you're not going to change that.
That's an easy one: you can't do unsafe things without wrapping your code in the unsafe keyword. That's fairly easy to add, if you ask me. However, when that amateur gets the compiler error, he/she will look it up. Once they do, there will be a big notice "DANGER, USE THIS INSTEAD". I work with a lot of EEs who only had one or two programming courses. They get a job mainly based on their hardware architecture knowledge. Now I have to work with them to write a hardware simulator. Oh, I don't know if you realize this, but I essentially removed /in/out/inout from the D spec with my example; please go read it. If you think that people are going to use the language the RIGHT way when there is such a tempting wrong way, I suggest you look at C++ and its operator overloading.
Feb 13 2006
parent reply Ivan Senji <ivan.senji_REMOVE_ _THIS__gmail.com> writes:
Unknown W. Brackets wrote:
 Consider, for example, if for always suggested that you use foreach 
 instead.  You are guaranteed in bounds and you can't make as many 
 mistakes, after all... right?  You know what would happen, right? People 
 would do this:
 
 int for_loop[1000];
 foreach (int i, int ignore; for_loop)
    writefln(i);
 
 Isn't that the better way?  No mistakes there... just a lot more memory 
 usage.  Who cares about that anyway, right?  Computers are loaded these 
 days.
 
OK, I really don't see why foreach should/would use more memory. Or did I misunderstand something?
Feb 13 2006
parent "Unknown W. Brackets" <unknown simplemachines.org> writes:
Creating the array would use more memory.  Consider:

int for_loop[] = new int[some_undetermined_int];
...

That could use, right there, 5 megabytes.  It probably wouldn't, but it 
could.  Even so, what's the point of using the extra 4k from my initial 
example?

My meaning was that someone might create an array, and use it entirely 
and ONLY for the use of foreaching over it.  After that, the array, and 
its contents, would be ignored and not used.  Further, because delete is 
unsafe, it would not be deleted until the garbage collector picked it up.

Obviously this example is a bit out there, but I was trying to be 
illustrative.

-[Unknown]


 Unknown W. Brackets wrote:
 Consider, for example, if for always suggested that you use foreach 
 instead.  You are guaranteed in bounds and you can't make as many 
 mistakes, after all... right?  You know what would happen, right? 
 People would do this:

 int for_loop[1000];
 foreach (int i, int ignore; for_loop)
    writefln(i);

 Isn't that the better way?  No mistakes there... just a lot more 
 memory usage.  Who cares about that anyway, right?  Computers are 
 loaded these days.
OK, I really don't see why foreach should/would use more memory. Or did I misunderstand something?
Feb 13 2006
prev sibling parent S. Chancellor <dnewsgr mephit.kicks-ass.org> writes:
On 2006-02-12 21:26:48 -0800, nick <nick.atamas gmail.com> said:

 S. Chancellor wrote:
 On 2006-02-12 19:32:47 -0800, nick <nick.atamas gmail.com> said:
 
 What's worse, I can use the function prototype:
 
 void surprise(inout A a);
 
 and the results will be exactly the same. That just seriously breaks all
 the high level language features that D puts in place. So I guess my
 question is: "Shouldn't there be a mechanism to deal with this?"
Language do not fix people. Writing software is not subject to malicious attacks by hackers. People are not intentionally injecting bad code into your software! You do not need security features built into the language. If it were up to me I'd abolish the private and protected keywords too! They're rather stupid, their intent is to "recommend" people not to use them. Quite often, especially in .NET i want to use a protected method and end up having to subclass the F*CKING class to expose it to the rest of my program. -S.
Now you're talking crazy talk. Throws declarations may be a bad idea - I agreed after having read up on it. I have yet to hear a good reason why the unsafe keyword or some other safeguard against dangerous pointer code is a bad idea. I'm just going to quote Joel here: "Now, I freely admit that programming with pointers is not needed in 90% of the code written today, and in fact, it's downright dangerous in production code." http://joelonsoftware.com/articles/ThePerilsofJavaSchools.html
I frankly do not care about Joel or what his opinion is.
 
 If you're working with people who are idiots, they're going to be idiots
 regardless of a damn unsafe keyword, or a friggin' throws keyword, or
 whatever!
 When will people finally realize that stupid and inexperienced aren't the
 same thing. Some individuals who write code are mathematicians,
 physicists, or electrical engineers by training. Programming languages
 aren't just for hardcore experts like Matthew or Walter. Most
 programmers are amateurs; you're not going to change that. Furthermore,
 humans are prone to error.
For your information I'm a physicist, and I have absolutely no problem with pointers.
 If you think that safeguards aren't for you, then maybe you should
 double-check the list of D features: in/out/inout, nested functions,
 typesafe variadic arguments, contract programming, guaranteed
 initialization, and others. Are you telling me that all those should be
 thrown out because they are just there to prevent mistakes?
Sure! I end up getting annoyed when private interfaces have functions I need to use all the time. I end up having to modify the library myself, or subclass the damn thing if it's protected. It would be nice if I could just tell the compiler to shut the hell up and let me call the protected member instead. The point of those keywords is to help keep people from doing things that were not intended, not to prevent them from doing it when it is intended. And quite obviously people get too gung-ho about this, and I end up having to modify the interface to the library to use a damn private method.

Nonsense. Removing pointers from a systems programming language is just dumb. End - of - story. Pointers are not something you use accidentally. They are not like shoving the wrong type of variable into the wrong place in an argument list (which typedef protects from, with no implicit conversions), or calling an internal interface accidentally (which private/protected protect from), or calling an int with a value out of range when the interface designer wanted a discrete subset (what enums protect against). You already have to go out of your way to use pointers by using the little * symbol; that's your unsafe keyword for you right there. -S.
Feb 14 2006
prev sibling next sibling parent reply "Andrew Fedoniouk" <news terrainformatica.com> writes:
 I will skip the "pointers are unsafe" rigmarole.


If "safe" means "managed" and "unsafe" means "unmanaged", then the answer is no. By definition, D code is "unmanaged". To implement a really "safe" mode (whatever that means) you need at least a VM creating a safe sandbox for you. Andrew.
Feb 12 2006
parent reply nick <nick.atamas gmail.com> writes:
Andrew Fedoniouk wrote:
 I will skip the "pointers are unsafe" rigmarole.


If "safe" means "managed" and "unsafe" means "unmanaged", then the answer is no. By definition, D code is "unmanaged". To implement a really "safe" mode (whatever that means) you need at least a VM creating a safe sandbox for you. Andrew.
I didn't say I had a solution, I just said I have a problem. If C-style pointers are left the way they are now, you might as well not have in/out/inout parameters. To save you from reading the rest of the thread, here is an example: CODE: ----- import std.stdio; class A { private int data[]; public this() { data.length = 10; } public void printSelf() { writefln("Data: ", this.data); } } void surprise(in A a) { byte *ap = cast(byte *)(a); ap[9] = 5; } int main() { A a = new A(); a.printSelf(); surprise(a); a.printSelf(); return 0; } OUTPUT: ------- Data before surprise: [0,0,0,0,0,0,0,0,0,0] Data after surprise: [0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,4287008,0,2004,216,1245184,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,8855552,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0,0,0,0,8855680,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,8855808, <..SNIP..> 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]
Feb 12 2006
next sibling parent "Unknown W. Brackets" <unknown simplemachines.org> writes:
All in does is effect the parameter in question.  In this case, said 
parameter is a pointer (reference, whatever you want to call it) to an 
instance of A.  You can change said instance all you like; you just 
cannot change the pointer.

Please read the documentation more carefully.

-[Unknown]


 Andrew Fedoniouk wrote:
 I will skip the "pointers are unsafe" rigmarole.


If "safe" means "managed" and "unsafe" means "unmanaged" then answer is no. By definition as D code is "unmanaged". To implement realy "safe" mode (whatever it means) you need at least VM creating safe sandbox for you. Andrew.
I didn't say I had a solution, I just said I have a problem. The If c-style pointers are left the way they are now, you might as well not have in/out/inout parameters. To save you from reading the rest of the thread, here is an example: CODE: ----- import std.stdio; class A { private int data[]; public this() { data.length = 10; } public void printSelf() { writefln("Data: ", this.data); } } void surprise(in A a) { byte *ap = cast(byte *)(a); ap[9] = 5; } int main() { A a = new A(); a.printSelf(); surprise(a); a.printSelf(); return 0; } OUTPUT: ------- Data before surprise: [0,0,0,0,0,0,0,0,0,0] Data after surprise: [0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,4287008,0,2004,216,1245184,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,8855552,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0,0,0,0,8855680,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,8855808, <..SNIP..> 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]
Feb 13 2006
prev sibling parent reply S. Chancellor <dnewsgr mephit.kicks-ass.org> writes:
On 2006-02-12 23:12:26 -0800, nick <nick.atamas gmail.com> said:

 Andrew Fedoniouk wrote:
 I will skip the "pointers are unsafe" rigmarole.
 

If "safe" means "managed" and "unsafe" means "unmanaged" then answer is no. By definition as D code is "unmanaged". To implement realy "safe" mode (whatever it means) you need at least VM creating safe sandbox for you. Andrew.
I didn't say I had a solution, I just said I have a problem. The If c-style pointers are left the way they are now, you might as well not have in/out/inout parameters. To save you from reading the rest of the thread, here is an example: CODE: ----- import std.stdio; class A { private int data[]; public this() { data.length = 10; } public void printSelf() { writefln("Data: ", this.data); } } void surprise(in A a) { byte *ap = cast(byte *)(a); ap[9] = 5; } int main() { A a = new A(); a.printSelf(); surprise(a); a.printSelf(); return 0; } OUTPUT: ------- Data before surprise: [0,0,0,0,0,0,0,0,0,0] Data after surprise: [0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,4287008,0,2004,216,1245184,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,8855552,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0,0,0,0,8855680,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,8855808, <..SNIP..> 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]

Good job! You intentionally screwed up a reference, and look at the trouble you had to go through to do it! You had to first convert it to a pointer, and then set a byte of the object to something different intentionally! I don't see how you think this can be done accidentally. It's not like "Oops, shit! I just cast that object to a pointer and randomly set a byte in it to something else!" On the other hand, many of the other security features D includes help prevent "Oops" errors. Welcome to a systems programming language. If your co-workers are intentionally doing things like this, then I am very sorry for you. -S.
Feb 14 2006
parent reply Boogeyman <Boogeyman_member pathlink.com> writes:
In article <dssk1o$1gd0$2 digitaldaemon.com>, S. Chancellor says...
On 2006-02-12 23:12:26 -0800, nick <nick.atamas gmail.com> said:

 Andrew Fedoniouk wrote:
 I will skip the "pointers are unsafe" rigmarole.
 

If "safe" means "managed" and "unsafe" means "unmanaged" then answer is no. By definition as D code is "unmanaged". To implement realy "safe" mode (whatever it means) you need at least VM creating safe sandbox for you. Andrew.
I didn't say I had a solution, I just said I have a problem. The If c-style pointers are left the way they are now, you might as well not have in/out/inout parameters. To save you from reading the rest of the thread, here is an example: CODE: ----- import std.stdio; class A { private int data[]; public this() { data.length = 10; } public void printSelf() { writefln("Data: ", this.data); } } void surprise(in A a) { byte *ap = cast(byte *)(a); ap[9] = 5; } int main() { A a = new A(); a.printSelf(); surprise(a); a.printSelf(); return 0; } OUTPUT: ------- Data before surprise: [0,0,0,0,0,0,0,0,0,0] Data after surprise: [0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,4287008,0,2004,216,1245184,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,8855552,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0,0,0,0,8855680,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,8855808, <..SNIP..> 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0, 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]

Good job! You intentionally screwed up a reference, and look at the trouble you had to go through to do it! You had to first convert it to a pointer, and then set a byte of the object to something different intentionally! I don't see how you think this can be done accidentally. It's not like "Oops, shit! I just cast that object to a pointer and randomly set a byte in it to something else!" On the other hand, many of the other security features D includes help prevent "Oops" errors. Welcome to a systems programming language. If your co-workers are intentionally doing things like this, then I am very sorry for you. -S.
Pointers are scary!
Feb 14 2006
parent reply Don Clugston <dac nospam.com.au> writes:
 Pointers are scary! 
I've heard this for decades, and I still don't really understand it. I've never found pointers to be worse than uninitialised variables. (in fact, I think many problems attributed to pointers are actually caused by uninitialised variables). And the absolute worst is languages that don't require you to declare a variable before you use it. That's _really_ scary.
Feb 21 2006
next sibling parent Sean Kelly <sean f4.ca> writes:
Don Clugston wrote:
 
 Pointers are scary! 
I've heard this for decades, and I still don't really understand it. I've never found pointers to be worse than uninitialised variables. (in fact, I think many problems attributed to pointers are actually caused by uninitialised variables). And the absolute worst is languages that don't require you to declare a variable before you use it. That's _really_ scary.
I agree on both counts. And as for the latter... it's why I think languages like Lua are next to useless, as debugging them is an absolute nightmare. Sean
Feb 21 2006
prev sibling parent reply "Derek Parnell" <derek psych.ward> writes:
On Wed, 22 Feb 2006 03:04:19 +1100, Don Clugston <dac nospam.com.au> wrote:


 And the absolute worst is languages that don't require you to declare a  
 variable before you use it. That's _really_ scary.
When you say 'before', are you saying that forward references are not a good thing? Or do you just mean 'before' in a temporal sense. -- Derek Parnell Melbourne, Australia
Feb 21 2006
parent Don Clugston <dac nospam.com.au> writes:
Derek Parnell wrote:
 On Wed, 22 Feb 2006 03:04:19 +1100, Don Clugston <dac nospam.com.au> wrote:
 
 
 And the absolute worst is languages that don't require you to declare 
 a variable before you use it. That's _really_ scary.
When you say 'before', are you saying that forward references are not a good thing? Or do you just mean 'before' in a temporal sense.
In a temporal sense. You can strike out the words "before you use it".
 
 --Derek Parnell
 Melbourne, Australia
Feb 21 2006
prev sibling parent "Unknown W. Brackets" <unknown simplemachines.org> writes:
Why don't you give them access to a scripting language?  Perhaps 
something like Python/Ruby or even DMDScript?

If performance is an issue, just make sure the scripting language 
doesn't allow eval (which is so much more evil than pointers, by the 
way) and you should be able to convert easily.

-[Unknown]


 Note: I did a search for this and didn't come up with any threads. If it
 has been discussed before... my apologies.
 
 
 Recently I introduced D to a friend of mine (a C.S. grad student at
 Purdue). His reaction was the usual "wow, awesome". Then he became
 concerned about pointer safety. D allows unrestricted access to pointers.
 
 I will skip the "pointers are unsafe" rigmarole.
 

Feb 12 2006