
digitalmars.D - 64-bit

reply Just Visiting <nospam aol.com> writes:
Last things I remember:

- DmD is strictly 32-bit
- Someone ported a chronically outdated D-compiler variant to Linux x86_64

The ideas behind DmD looked promising to me. But most of my programs showed at
least a 2-fold performance increase once they were re-written for 64-bit.
Therefore 32-bit compilers are just wasting my time, and this is why I've lost
track of DmD.

After a certain waiting period I'd now like to know from the well-informed
amongst you whether there are any indications that DmD will eventually evolve to
support current CPU architectures.

Thanks.
Oct 17 2009
next sibling parent reply Lutger <lutger.blijdestijn gmail.com> writes:
Currently:

LDC is a mature compiler that does 64-bit Linux well, but it is not available 
for D2, the 'alpha' branch of the language, and it also doesn't work on Windows.

The outdated compiler you speak of is probably GDC; that project has 
recently been revived. There is no 64-bit dmd yet.

At the moment, I believe all efforts are going into finishing the D2 language and 
fixing current dmd bugs, as well as a book called 'The D Programming Language'. 
IIRC the deadline is somewhere around the end of this year.

What happens after that I don't know; I haven't seen any statement that work 
on 64-bit is planned. I would rather bet on GDC and/or LDC getting full 
64-bit support for D2 before dmd.
Oct 17 2009
next sibling parent reply Christopher Wright <dhasenan gmail.com> writes:
Lutger wrote:
 Currently:
 
 LDC is a mature compiler that does linux 64 bit well, but is not available 
 for D2, the 'alpha' branch of the language and also doesn't work on windows.
LDC works on Windows, except for exception handling, which is probably a deal breaker for most people.
Oct 17 2009
parent BCS <none anon.com> writes:
Hello Christopher,

 Lutger wrote:
 
 Currently:
 
 LDC is a mature compiler that does linux 64 bit well, but is not
 available for D2, the 'alpha' branch of the language and also doesn't
 work on windows.
 
LDC works on Windows, except for exception handling. Which is probably a deal breaker for most people.
IIRC that's an LLVM issue, not an LDC one. Is anyone pushing on the LLVM end to get it fixed?
Oct 17 2009
prev sibling parent reply dsimcha <dsimcha yahoo.com> writes:
== Quote from Lutger (lutger.blijdestijn gmail.com)'s article
 Currently:
 LDC is a mature compiler that does linux 64 bit well, but is not available
 for D2, the 'alpha' branch of the language and also doesn't work on windows.
 The outdated compiler you speak of would probably be GDC, this project has
 recently been revived. The is no 64-bit dmd yet.
 At the moment all efforts I believe are on finishing the D2 language and
 current dmd bugs, as well as a book called 'The D programming language'.
 IIRC the deadline is somewhere end of this year.
 What happens after that I don't know, I haven't seen any statement that work
 on 64-bit is planned. I would rather bet on GDC and / or LDC getting full
 64-but support for D2 before dmd.
Where has GDC been revived? It doesn't look like there have been any check-ins on the SourceForge site since February.
Oct 17 2009
next sibling parent Lutger <lutger.blijdestijn gmail.com> writes:
dsimcha wrote:

 == Quote from Lutger (lutger.blijdestijn gmail.com)'s article
 Currently:
 LDC is a mature compiler that does linux 64 bit well, but is not
 available for D2, the 'alpha' branch of the language and also doesn't
 work on windows. The outdated compiler you speak of would probably be
 GDC, this project has recently been revived. The is no 64-bit dmd yet.
 At the moment all efforts I believe are on finishing the D2 language and
 current dmd bugs, as well as a book called 'The D programming language'.
 IIRC the deadline is somewhere end of this year.
 What happens after that I don't know, I haven't seen any statement that
 work on 64-bit is planned. I would rather bet on GDC and / or LDC getting
 full 64-but support for D2 before dmd.
Where has GDC been revived? It doesn't look like there have been any checkins on the sourceforge site since Feb.
It has been forked to http://bitbucket.org/goshawk/gdc/wiki/Home

See also: http://www.digitalmars.com/webnews/newsgroups.php?art_group=D.gnu&article_id=3516
Oct 17 2009
prev sibling parent Leandro Lucarella <llucax gmail.com> writes:
dsimcha, on October 17 at 14:19 you wrote to me:
 == Quote from Lutger (lutger.blijdestijn gmail.com)'s article
 Currently:
 LDC is a mature compiler that does linux 64 bit well, but is not available
 for D2, the 'alpha' branch of the language and also doesn't work on windows.
 The outdated compiler you speak of would probably be GDC, this project has
 recently been revived. The is no 64-bit dmd yet.
 At the moment all efforts I believe are on finishing the D2 language and
 current dmd bugs, as well as a book called 'The D programming language'.
 IIRC the deadline is somewhere end of this year.
 What happens after that I don't know, I haven't seen any statement that work
 on 64-bit is planned. I would rather bet on GDC and / or LDC getting full
 64-but support for D2 before dmd.
Where has GDC been revived? It doesn't look like there have been any checkins on the sourceforge site since Feb.
http://bitbucket.org/goshawk/gdc/

-- 
Leandro Lucarella (AKA luca)                     http://llucax.com.ar/
Oct 17 2009
prev sibling next sibling parent reply Jeremie Pelletier <jeremiep gmail.com> writes:
Just Visiting wrote:
 Last things I remember:
 
 - DmD is strictly 32-bit
 - Someone ported a chronically outdated D-compiler variant to Linux x86_64
 
 The ideas behind DmD looked promising to me. But most of my programs showed at
least a 2-fold performance increase once they were re-written for 64-bit.
Therefore 32-bit compilers are just wasting my time, and this is why I've lost
track of DmD.
 
 After a certain waiting period I'd now like to know from the well informed
amongst you if there are any indications that DmD will eventually evolve to
support current CPU architectures?
 
 Thanks.
 
I believe 64-bit support is on Walter's todo list for after the spec for D2 is finished.
Oct 17 2009
parent reply Nick B <nickB gmail.com> writes:
Jeremie Pelletier wrote:
 Just Visiting wrote:
 Last things I remember:

 - DmD is strictly 32-bit
 - Someone ported a chronically outdated D-compiler variant to Linux 
 x86_64

 The ideas behind DmD looked promising to me. But most of my programs 
 showed at least a 2-fold performance increase once they were 
 re-written for 64-bit. Therefore 32-bit compilers are just wasting my 
 time, and this is why I've lost track of DmD.

 After a certain waiting period I'd now like to know from the well 
 informed amongst you if there are any indications that DmD will 
 eventually evolve to support current CPU architectures?

 Thanks.
I believe 64bit support is on Walter's todo list after the spec for D2 is finished.
I too remember this statement by Walter, but I can't find his post to prove it. So it (64-bit) can't be too far away. Nick B
Oct 17 2009
parent reply Nick B <nickB gmail.com> writes:
Nick B wrote:
 Jeremie Pelletier wrote:
 Just Visiting wrote:
 Last things I remember:

 - DmD is strictly 32-bit
 - Someone ported a chronically outdated D-compiler variant to Linux 
 x86_64

 The ideas behind DmD looked promising to me. But most of my programs 
 showed at least a 2-fold performance increase once they were 
 re-written for 64-bit. Therefore 32-bit compilers are just wasting my 
 time, and this is why I've lost track of DmD.

 After a certain waiting period I'd now like to know from the well 
 informed amongst you if there are any indications that DmD will 
 eventually evolve to support current CPU architectures?

 Thanks.
I believe 64bit support is on Walter's todo list after the spec for D2 is finished.
I too remember this statement by Walter, but I can't find his post to prove it. So it (64-bit) can't be too far away.
Correction: I have found the post. Date: 11/08/2008. Subject: Re: Some questions about dmd development. Question from Alexey:
 3. Are there any plans to support 64-bits system? And if it is, when?
Walter's reply: Yes, after D2 is done. Cheers, Nick B
Oct 17 2009
parent reply Just Visiting <nospam aol.com> writes:
Nick B Wrote:

 Nick B wrote:
 Jeremie Pelletier wrote:
 Just Visiting wrote:
 Last things I remember:

 - DmD is strictly 32-bit
 - Someone ported a chronically outdated D-compiler variant to Linux 
 x86_64

 The ideas behind DmD looked promising to me. But most of my programs 
 showed at least a 2-fold performance increase once they were 
 re-written for 64-bit. Therefore 32-bit compilers are just wasting my 
 time, and this is why I've lost track of DmD.

 After a certain waiting period I'd now like to know from the well 
 informed amongst you if there are any indications that DmD will 
 eventually evolve to support current CPU architectures?

 Thanks.
I believe 64bit support is on Walter's todo list after the spec for D2 is finished.
I too remember this statement by Walter, but I can't find his post to prove it. So it (64-bit) can't be too far away.
Correction. I have found the post. Date 11/08/2008 Subject : Re: Some questions about dmd development Question from Alexey: > 3. Are there any plans to support 64-bits system? And if it is, when? Walters reply: Yes, after D2 is done. cheers Nick B
That is all I needed to know. Thank you.
Oct 17 2009
parent reply Nick B <nickB gmail.com> writes:
Just Visiting wrote:
 Nick B Wrote:
 
 Nick B wrote:
 Jeremie Pelletier wrote:
 Just Visiting wrote:
 Last things I remember:

 - DmD is strictly 32-bit
 - Someone ported a chronically outdated D-compiler variant to Linux 
 x86_64

 The ideas behind DmD looked promising to me. But most of my programs 
 showed at least a 2-fold performance increase once they were 
 re-written for 64-bit. Therefore 32-bit compilers are just wasting my 
 time, and this is why I've lost track of DmD.

 After a certain waiting period I'd now like to know from the well 
 informed amongst you if there are any indications that DmD will 
 eventually evolve to support current CPU architectures?

 Thanks.
I believe 64bit support is on Walter's todo list after the spec for D2 is finished.
I too remember this statement by Walter, but I can't find his post to prove it. So it (64-bit) can't be too far away.
Correction. I have found the post. Date 11/08/2008 Subject : Re: Some questions about dmd development Question from Alexey: > 3. Are there any plans to support 64-bits system? And if it is, when? Walters reply: Yes, after D2 is done. cheers Nick B
That is all I needed to know. Thank you.
One more correction: the date I mentioned for Walter's post is incorrect. It is in fact 11/08/2009, 8:49 a.m. To "Just Visiting": no problem. Nick B
Oct 17 2009
parent BCS <none anon.com> writes:
Hello Nick,

 One more correction.  The date I mention for Walter post is incorrect.
 It is in fact 11/08/2009 8:49 a.m.
For us Yanks that would be 08/11/2009
 
 To "Just Visting", no problem.
 
 Nick B
 
Oct 18 2009
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Just Visiting" <nospam aol.com> wrote in message 
news:hbcbvs$1ees$1 digitalmars.com...
 Last things I remember:

 - DmD is strictly 32-bit
 - Someone ported a chronically outdated D-compiler variant to Linux x86_64

 The ideas behind DmD looked promising to me. But most of my programs 
 showed at least a 2-fold performance increase once they were re-written 
 for 64-bit.
Only on 64-bit systems. Which are already ridiculously fast anyway. So what if they get some more performance? They already have gobs of performance to spare. On a 32-bit system it changes the program's performance down to "It don't f** work at all", which is the mark of an incredibly arrogant developer who likes to shoot themselves in the foot by arbitrarily shrinking their own potential user base.
 Therefore 32-bit compilers are just wasting my time, and this is why I've 
 lost track of DmD.
They should make roads that are only usable by Italian sports cars, and take full advantage of their special characteristics. Any other roads are just wasting my time.
Oct 17 2009
next sibling parent BCS <none anon.com> writes:
Hello Nick,

 "Just Visiting" <nospam aol.com> wrote in message
 news:hbcbvs$1ees$1 digitalmars.com...
 
 Last things I remember:
 
 - DmD is strictly 32-bit
 - Someone ported a chronically outdated D-compiler variant to Linux
 x86_64
 The ideas behind DmD looked promising to me. But most of my programs
 showed at least a 2-fold performance increase once they were
 re-written for 64-bit.
 
Only on 64-bit systems. Which are already ridiculously fast anyway. So what if they get some more performance? They already have gobs of performance to spare. On a 32-bit system it changes the programs performance down to "It don't f** work at all", which is the mark of an incredibly arrogant developer who likes to shoot themself in the foot by arbitrarily shrinking their own potential user base.
The best option: build the language (and the apps) so that they compile on either and ship both.
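A minimal D sketch of what "compile on either" can look like in practice, assuming a reasonably current dmd and Phobos; D_LP64 is the standard predefined version identifier, and the printed values are simply whatever the chosen target gives:

import std.stdio;

void main()
{
    // size_t follows the pointer width of the target, so lengths and
    // indices declared with it compile unchanged on 32- and 64-bit.
    writefln("pointer size: %s bytes, size_t: %s bytes",
             (void*).sizeof, size_t.sizeof);

    // The predefined version identifier lets any width-specific code
    // live in a single source tree.
    version (D_LP64)
        writeln("built as a 64-bit binary");
    else
        writeln("built as a 32-bit binary");

    auto arr = new int[](1000);
    size_t len = arr.length;   // portable; `int len = arr.length;` would not
                               // compile on 64-bit, where length is 64 bits wide
    writeln("array length: ", len);
}

The same source then only needs two compiler invocations (one per target) to ship both binaries.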
Oct 17 2009
prev sibling next sibling parent reply Yigal Chripun <yigal100 gmail.com> writes:
On 17/10/2009 22:11, Nick Sabalausky wrote:
 "Just Visiting"<nospam aol.com>  wrote in message
 news:hbcbvs$1ees$1 digitalmars.com...
 Last things I remember:

 - DmD is strictly 32-bit
 - Someone ported a chronically outdated D-compiler variant to Linux x86_64

 The ideas behind DmD looked promising to me. But most of my programs
 showed at least a 2-fold performance increase once they were re-written
 for 64-bit.
Only on 64-bit systems. Which are already ridiculously fast anyway. So what if they get some more performance? They already have gobs of performance to spare. On a 32-bit system it changes the programs performance down to "It don't f** work at all", which is the mark of an incredibly arrogant developer who likes to shoot themself in the foot by arbitrarily shrinking their own potential user base.
 Therefore 32-bit compilers are just wasting my time, and this is why I've
 lost track of DmD.
They should make roads that are only usable by Italian sports cars, and take full advantage of their special characteristics. Any other roads are just wasting my time.
Those are *fine* cars, aren't they? But seriously, your argument is ridiculously wrong, since velocity depends on the observer. The OP said, and I quote: "32-bit compilers are just wasting *my* time" (emphasis added).
Oct 17 2009
parent "Nick Sabalausky" <a a.a> writes:
"Yigal Chripun" <yigal100 gmail.com> wrote in message 
news:hbd9d5$245o$1 digitalmars.com...
 On 17/10/2009 22:11, Nick Sabalausky wrote:
 They should make roads that are only usable by Italian sports cars, and 
 take
 full advantage of their special characteristics. Any other roads are just
 wasting my time.
those are *fine* cars, aren't they?
I only wish I could answer from first-hand experience ;) (I *want* Thomas Magnum's car...erm...or rather Robin Masters's car...not to mention "Robin's Nest"...)
 but seriously, your argument is ridiculously wrong since velocity is 
 dependent on the observer.
 the OP said and I quote: "32-bit compilers are just wasting *my* time" 
 (emphasis added).
Curses! Einstein strikes again!
Oct 17 2009
prev sibling next sibling parent reply Just Visiting <nospam aol.com> writes:
Before I answer the previous message, I'd like to thank
everyone for their feedback, which was pretty helpful.

Now to Nick:

I really do not intend to offend anybody, but you should
actually think before you question the necessity of
fast, responsive programs. People like you have used
similar arguments since the IT stone age. They usually
judge from the standpoint of their own momentary CPU
performance requirements.

If I used your comments during my next business meeting
we'd all have a good laugh. But I won't, because I'll give
you the chance to think this over:

My girlfriend drives a Porsche (no Italian sports car, sorry).
Who do you think had to pay for it? Yeah, you guessed right,
it was the guy who tweaks software so a bunch of
computers can survive their replacement by a year or two.
I just wonder what make of car she'd be driving if I told my clients
to be patient - instead of paying me to get their analysis
software to finish the job in a fraction of the time.

Cheers.


Nick Sabalausky Wrote:

 "Just Visiting" <nospam aol.com> wrote in message 
 news:hbcbvs$1ees$1 digitalmars.com...
 Last things I remember:

 - DmD is strictly 32-bit
 - Someone ported a chronically outdated D-compiler variant to Linux x86_64

 The ideas behind DmD looked promising to me. But most of my programs 
 showed at least a 2-fold performance increase once they were re-written 
 for 64-bit.
Only on 64-bit systems. Which are already ridiculously fast anyway. So what if they get some more performance? They already have gobs of performance to spare. On a 32-bit system it changes the programs performance down to "It don't f** work at all", which is the mark of an incredibly arrogant developer who likes to shoot themself in the foot by arbitrarily shrinking their own potential user base.
 Therefore 32-bit compilers are just wasting my time, and this is why I've 
 lost track of DmD.
They should make roads that are only usable by Italian sports cars, and take full advantage of their special characteristics. Any other roads are just wasting my time.
Oct 17 2009
next sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Just Visiting" <nospam aol.com> wrote in message 
news:hbdk23$2qoi$1 digitalmars.com...
 They usually
 judge from the standpoint of their own momentary CPU
 performance requirements.
I can say exactly the same about people who defend setting their minimum system specs higher than they need to be. As soon as most developers get their hands on a new piece of hardware, all of a sudden they think no one else should be using anything less, no matter how useful or widespread the lower-end stuff may still be. And that's been going on for ages as well.
 If I'd use your comments during my next business meeting
 we'd all have a good laugh. But I won't because I'll give
 you the chance to think this over:
I couldn't care less what a bunch of suits think about my comments. If they even exist... this sudden grab for professionalism seems quite contrived considering the arrogance of your original post: "Therefore 32-bit compilers are just wasting my time." Take a minute to think first the next time you want to jump in and tell a group of people that their compiler is wasting your time.
 it was the guy who is tweaking software, so a bunch of
 computers can survive their replacement by a year or two.
That's exactly my point. There are plenty of 32-bit systems out there that are perfectly useful, but then people like you go around waving a "32-bit is antique, support for it is useless" flag. And now you suddenly turn around and try to defend your disregard for an older piece of hardware...for the sake of hardware longevity? What?
Oct 17 2009
parent reply Just Visiting <nospam aol.com> writes:
Nick Sabalausky Wrote:

 "Just Visiting" <nospam aol.com> wrote in message 
 news:hbdk23$2qoi$1 digitalmars.com...
 They usually
 judge from the standpoint of their own momentary CPU
 performance requirements.
I can say exactly the same about people who defend setting their minimum system specs higher than they need to be. As soon as most developers get their hands on a new piece of hardware, all of a sudden they think no one else should be using anything less, no matter how useful or widespread the lower-end stuff may still be. And that's been going on for ages as well.
I'm not sure what you are talking about. I get paid for improving the responsiveness of programs - sometimes by using assembly language if deemed necessary. I do not tell anybody what hardware they should be using, as long as it remains compatible with my software.
 If I'd use your comments during my next business meeting
 we'd all have a good laugh. But I won't because I'll give
 you the chance to think this over:
I couldn't care less what a bunch of suits think about my comments. If they even exist...this sudden grab for professionalism seems quite contrived considering the arrogance of your original post: "Therefore 32-bit compilers are just wasting my time,"
Arrogance? Does it irk you that much if someone dumps a 32-bit compiler in order to enjoy an impressive speed increase without substantial changes to the software?
 Take a minute to think first the next time you want to jump in and tell a 
 group of people that their compiler is wasting your time.
Sounds almost like I have offended D's lead developer. Sorry 'bout that. But you are absolutely right: 32-bit compilers - not just DmD - are wasting my time. For approximately equal results I'd have to either add assembly language to time-critical sections of my 32-bit code, or just use a 64-bit compiler with moderate adaptations.
 it was the guy who is tweaking software, so a bunch of
 computers can survive their replacement by a year or two.
That's exactly my point. There are plenty of 32-bit systems out there that are perfectly useful, but then people like you go around waving a "32-bit is antique, support for it is useless" flag. And now you suddenly turn around and try to defend your disregard for an older piece of hardware...for the sake of hardware longevity? What?
I won't deny that for certain people 32-bit systems are still perfectly useful. It's just that my clients do not share this view, for a number of good reasons. Even their older systems tend to be 64-bit nowadays. Migration towards 64-bit OSes is under way, and there is still 32-bit compatibility if needed. At the same time, certain programs will perform drastically better when compiled for 64-bit. Replacement can thus be postponed, which is usually the best way to keep CFOs happy.
Oct 17 2009
next sibling parent reply language_fan <foo bar.com.invalid> writes:
Sat, 17 Oct 2009 22:56:44 -0400, Just Visiting thusly wrote:

 I won't deny that for certain people 32-bit systems are still perfectly
 useful. Just my clients do not share this view for a series of good
 reasons. Even their older systems tend to be 64-bit nowadays. Migration
 towards 64-bit OSes is under way. There is still 32-bit compatibility if
 needed. At the same time certain programs will perform drastically
 better when compiled to 64-bit. Replacement thus can be postponed which
 is usually the best way to keep CFOs happy.
64-bit programs often also require larger CPU caches to work efficiently, take more disk space (larger binaries), and consume more memory. 32-bit x86 + PAE still works until you have more than 64 GB of RAM or need processes larger than 2 or 3 GB. So, for desktop use, 32-bit feels like the best way to go unless 64-bit algorithms are provably more efficient for the chosen task.
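The memory-footprint part of that claim is easy to see from type sizes alone; here is a small, hedged D sketch (the Node struct is purely illustrative, not taken from any real program) showing how a pointer-heavy record grows between 32- and 64-bit targets:

import std.stdio;

// A made-up pointer-heavy node, the kind that dominates linked
// structures in many desktop workloads.
struct Node
{
    Node* next;
    Node* prev;
    void* payload;
    int   tag;
}

void main()
{
    // The three pointers take 12 bytes on a 32-bit target and 24 on a
    // 64-bit one, so with alignment padding Node goes from 16 to 32
    // bytes. That growth is what eats cache lines and RAM, independent
    // of any speedup from the extra x86-64 registers.
    writefln("pointer: %s bytes, Node: %s bytes",
             (void*).sizeof, Node.sizeof);
}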
Oct 18 2009
parent reply Fawzi Mohamed <fmohamed mac.com> writes:
On 2009-10-18 11:32:07 +0200, language_fan <foo bar.com.invalid> said:

 Sat, 17 Oct 2009 22:56:44 -0400, Just Visiting thusly wrote:
 
 I won't deny that for certain people 32-bit systems are still perfectly
 useful. Just my clients do not share this view for a series of good
 reasons. Even their older systems tend to be 64-bit nowadays. Migration
 towards 64-bit OSes is under way. There is still 32-bit compatibility if
 needed. At the same time certain programs will perform drastically
 better when compiled to 64-bit. Replacement thus can be postponed which
 is usually the best way to keep CFOs happy.
64-bit programs often also require larger CPU caches to work efficiently, more disk space (larger binaries), and finally larger memory consumption. 32-bit x86 + PAE still works until you have more than 64 GB of RAM or processes larger than 2 or 3 GB. So, in desktop use 32-bit feels like the best way to go unless 64-bit algorithms are provably more efficient in the chosen task.
On x86, the 64-bit extension added registers, and that makes it faster, even if, as you correctly point out, a priori just using 64-bit pointers is a drawback unless you have lots of memory. Anyway, I also need 64-bit (computational chemistry: speed and memory hungry), and to that the only thing I can say is that D1 works very well with 64-bit. Fawzi
Oct 18 2009
parent reply language_fan <foo bar.com.invalid> writes:
Sun, 18 Oct 2009 16:35:53 +0200, Fawzi Mohamed thusly wrote:

 On 2009-10-18 11:32:07 +0200, language_fan <foo bar.com.invalid> said:
 
 Sat, 17 Oct 2009 22:56:44 -0400, Just Visiting thusly wrote:
 
 I won't deny that for certain people 32-bit systems are still
 perfectly useful. Just my clients do not share this view for a series
 of good reasons. Even their older systems tend to be 64-bit nowadays.
 Migration towards 64-bit OSes is under way. There is still 32-bit
 compatibility if needed. At the same time certain programs will
 perform drastically better when compiled to 64-bit. Replacement thus
 can be postponed which is usually the best way to keep CFOs happy.
64-bit programs often also require larger CPU caches to work efficiently, more disk space (larger binaries), and finally larger memory consumption. 32-bit x86 + PAE still works until you have more than 64 GB of RAM or processes larger than 2 or 3 GB. So, in desktop use 32-bit feels like the best way to go unless 64-bit algorithms are provably more efficient in the chosen task.
on x86 the 64 bit extension added registers, that makes it faster, even if as you correctly point out a priori just using 64 bit pointers is just a drawback unless you have lot of memory.
That is a very silly claim. First, you need to have a use for all those extra registers to obtain any performance benefits. This is not always the case. Also note that cache size is heavily constrained and larger binaries will fill it with less code. This alone can make the code a lot slower on first-generation and budget 2..4-core x86 machines with smaller cache sizes. Main memory is expensive and you can rarely install more than 64 GB on PC-style hardware. Many times you can even split a task into separate 2..3 GB processes quite easily. So the immediate advantages of 64-bit code are not that clear when you only need it to grow the processes larger. On Linux, for instance, an ordinary 64-bit desktop requires a lot more memory than its 32-bit alternative. Why would you want to buy more hardware to fix something that can be fixed with existing software?
 Anyway I also need 64 bit (computational chemistry, speed and memory
 hungry), and to that the only thing that I can say is D1 works very well
 with 64 bit.
That's one domain where 64 bits may give you an advantage. In normal desktop applications there is often nothing in 64-bit code that can improve anything. I am talking about firefox / winamp / mediaplayer / photoshop / outlook / casual gaming use here. The reason I mentioned desktop applications is that the current trend is to replace old 32-bit Intel/AMD processors in desktop use, and most desktop apps are written in systems programming languages. You can hardly buy any non-64-bit-capable processor from any PC store these days. People are getting crap that they don't need.

The same thing happens with digital cameras. In cameras the pixel count of sensors keeps growing even though the image quality stays the same. A good-quality 4 MPix camera is much better than a cheap 15 MPix pocket camera. Even the aforementioned 4 MPix picture resized to 2 MPix and stored in JPEG format might look better than the original 15 MPix one in raw format. They just keep pushing the limits to sell larger and larger storage media.
Oct 18 2009
next sibling parent reply Stanley Steel <news-steel kryas.com> writes:
 Why I mentioned desktop applications is that currently the trend has been
 to replace old 32-bit intel/amd processors in desktop use. And most
 desktop apps are written in systems programming languages. You can hardly
 buy any non-64-bit capable processor from any pc store these days. The
 people are getting crap that they don't need.

 The same thing happens with digital cameras. In cameras the pixel count
 of sensors is growing even though the image quality stays the same. A
 good quality 4 MPix camera is much better than a cheap 15 MPix pocket
 camera. Even the forementioned 4 MPix pic resized to 2 MPix and stored in
 jpg format might look better than the original 15 MPix one in raw format.
 They just keep pushing the limits to sell larger and larger storage media.
You'd probably be pissed to hear about Microsoft creating a 128-bit OS and Intel designing 128-bit processors.
Oct 18 2009
parent language_fan <foo bar.com.invalid> writes:
Sun, 18 Oct 2009 14:45:39 -0600, Stanley Steel thusly wrote:

 Why I mentioned desktop applications is that currently the trend has
 been to replace old 32-bit intel/amd processors in desktop use. And
 most desktop apps are written in systems programming languages. You can
 hardly buy any non-64-bit capable processor from any pc store these
 days. The people are getting crap that they don't need.

 The same thing happens with digital cameras. In cameras the pixel count
 of sensors is growing even though the image quality stays the same. A
 good quality 4 MPix camera is much better than a cheap 15 MPix pocket
 camera. Even the forementioned 4 MPix pic resized to 2 MPix and stored
 in jpg format might look better than the original 15 MPix one in raw
 format. They just keep pushing the limits to sell larger and larger
 storage media.
You'd probably be pissed to hear about microsoft creating a 128-bit OS and intel designing 128-bit processors.
Nah, I don't really care. I have nothing against wider vector registers, but huge general purpose registers usually don't make sense.
Oct 18 2009
prev sibling parent reply Fawzi Mohamed <fmohamed mac.com> writes:
On 2009-10-18 20:01:26 +0200, language_fan <foo bar.com.invalid> said:

 Sun, 18 Oct 2009 16:35:53 +0200, Fawzi Mohamed thusly wrote:
 
 On 2009-10-18 11:32:07 +0200, language_fan <foo bar.com.invalid> said:
 
 Sat, 17 Oct 2009 22:56:44 -0400, Just Visiting thusly wrote:
 
 I won't deny that for certain people 32-bit systems are still
 perfectly useful. Just my clients do not share this view for a series
 of good reasons. Even their older systems tend to be 64-bit nowadays.
 Migration towards 64-bit OSes is under way. There is still 32-bit
 compatibility if needed. At the same time certain programs will
 perform drastically better when compiled to 64-bit. Replacement thus
 can be postponed which is usually the best way to keep CFOs happy.
64-bit programs often also require larger CPU caches to work efficiently, more disk space (larger binaries), and finally larger memory consumption. 32-bit x86 + PAE still works until you have more than 64 GB of RAM or processes larger than 2 or 3 GB. So, in desktop use 32-bit feels like the best way to go unless 64-bit algorithms are provably more efficient in the chosen task.
on x86 the 64 bit extension added registers, that makes it faster, even if as you correctly point out a priori just using 64 bit pointers is just a drawback unless you have lot of memory.
That is very silly claim. First, you need to have use for all those extra registers to obtain any performance benefits. This is nearly not always the case.
Probably you don't know the x86 architecture well: it is register-starved by modern standards. Also, with the 64-bit extension new instructions were added; on x86 the 64-bit change was not just "add 64-bit pointers", it was "let's try to fix some major shortcomings of x86". These enhancements are available only in 64-bit mode (to keep backward compatibility). I know for a fact that my code runs faster in 64-bit mode (or you could say my compiler optimizes it better), and I am not the only one: for sure Apple converted basically all of its applications to 64-bit in Snow Leopard (a release focused on speed) just so that they would be slower :P.
  Also note that cache size is heavily constrained and larger
 binaries will fill it with less code. This alone can make the code a lot
 slower on first generation and budget 2..4-core x86 machines with smaller
 cache sizes.
Cache usage is a real issue, but on the whole the code is faster in 64-bit mode, at least in my experience.
 Main memory is expensive and you rarely can install more than 64 GB on a
 PC style hardware. Many times you can even split a task into separate
 2..3 GB processes quite easily. So the immediate advantages of 64-bit
 code are not that clear when you only need it to grow the processes
 larger. On Linux, for instance, ordinary 64-bit desktop requires a lot
 more memory than its 32-bit alternative. Why would you want to buy more
 hardware to fix something that can be fixed with existing software?
 
 Anyway I also need 64 bit (computational chemistry, speed and memory
 hungry), and to that the only thing that I can say is D1 works very well
 with 64 bit.
That's one domain where 64 bits may give you an advantage. In normal desktop applications there is often nothing in 64-bit code that can improve anything. I am talking about firefox / winamp / mediaplayer / photoshop / outlook / casual gaming use here.
Not sure; see Snow Leopard...
 Why I mentioned desktop applications is that currently the trend has been
 to replace old 32-bit intel/amd processors in desktop use. And most
 desktop apps are written in systems programming languages. You can hardly
 buy any non-64-bit capable processor from any pc store these days. The
 people are getting crap that they don't need.
Intel tried hard to avoid offering 64-bit in consumer processors (as it wanted to push IA-64), but failed and had to give people what they wanted; in this case I think that was a good thing. Fawzi
Oct 19 2009
next sibling parent reply language_fan <foo bar.com.invalid> writes:
Mon, 19 Oct 2009 13:22:34 +0200, Fawzi Mohamed thusly wrote:
 On 2009-10-18 20:01:26 +0200, language_fan <foo bar.com.invalid> said:
 Sun, 18 Oct 2009 16:35:53 +0200, Fawzi Mohamed thusly wrote:
 
  Also note that cache size is heavily constrained and larger
 binaries will fill it with less code. This alone can make the code a
 lot slower on first generation and budget 2..4-core x86 machines with
 smaller cache sizes.
cache usage is a real issue, but on the whole code is faster in 64 bit mode, at least in my experience
I believe in real world benchmarks more than in hollow words.
 Main memory is expensive and you rarely can install more than 64 GB on
 a PC style hardware. Many times you can even split a task into separate
 2..3 GB processes quite easily. So the immediate advantages of 64-bit
 code are not that clear when you only need it to grow the processes
 larger. On Linux, for instance, ordinary 64-bit desktop requires a lot
 more memory than its 32-bit alternative. Why would you want to buy more
 hardware to fix something that can be fixed with existing software?
 
You did not comment on this... in desktop use CPU power rarely matters, but running out of memory is pretty common (think about laptops with a limited number of memory slots and expensive memory modules).
 Anyway I also need 64 bit (computational chemistry, speed and memory
 hungry), and to that the only thing that I can say is D1 works very
 well with 64 bit.
That's one domain where 64 bits may give you an advantage. In normal desktop applications there is often nothing in 64-bit code that can improve anything. I am talking about firefox / winamp / mediaplayer / photoshop / outlook / casual gaming use here.
not sure, see snow leopard...
That's a business decision, ffs, not necessarily a technical one. Maybe they wanted to future-proof their software by keeping the number of binary formats to a minimum.
Oct 19 2009
parent Fawzi Mohamed <fmohamed mac.com> writes:
On 2009-10-19 15:04:23 +0200, language_fan <foo bar.com.invalid> said:

 Mon, 19 Oct 2009 13:22:34 +0200, Fawzi Mohamed thusly wrote:
 On 2009-10-18 20:01:26 +0200, language_fan <foo bar.com.invalid> said:
 Sun, 18 Oct 2009 16:35:53 +0200, Fawzi Mohamed thusly wrote:
 
 Also note that cache size is heavily constrained and larger
 binaries will fill it with less code. This alone can make the code a
 lot slower on first generation and budget 2..4-core x86 machines with
 smaller cache sizes.
cache usage is a real issue, but on the whole code is faster in 64 bit mode, at least in my experience
I believe in real world benchmarks more than in hollow words.
 Main memory is expensive and you rarely can install more than 64 GB on
 a PC style hardware. Many times you can even split a task into separate
 2..3 GB processes quite easily. So the immediate advantages of 64-bit
 code are not that clear when you only need it to grow the processes
 larger. On Linux, for instance, ordinary 64-bit desktop requires a lot
 more memory than its 32-bit alternative. Why would you want to buy more
 hardware to fix something that can be fixed with existing software?
 
You did not comment on this.. in desktop use cpu power rarely matters. But running out of memory is pretty common (think about laptops with limited amount of memory slots and expensive memory units).
I fully agree that for the desktop 32-bit is enough, but selling it as better is just misinformation (at least for x86). Just look at the install sizes for Ubuntu 64-bit vs 32-bit and you will see that the difference is just a few percent. What uses a lot of space is installing things twice (32- and 64-bit versions of the libs, for example). If you have a pure 64-bit install (like I do) then you do lose the possibility of running 32-bit binaries or plugins that need 32-bit libs (like dmd :(), but the whole system is not much larger. In any case, what occupies the most space often isn't the compiled code but other resources. With this I will stop answering you, because I don't find it very productive. Fawzi
Oct 19 2009
prev sibling parent reply "Nick Sabalausky" <a a.a> writes:
"Fawzi Mohamed" <fmohamed mac.com> wrote in message 
news:hbhi5q$1gqm$1 digitalmars.com...
 On 2009-10-18 20:01:26 +0200, language_fan <foo bar.com.invalid> said:

 Sun, 18 Oct 2009 16:35:53 +0200, Fawzi Mohamed thusly wrote:
 on x86 the 64 bit extension added registers, that makes it faster, even
 if as you correctly point out a priori just using 64 bit pointers is
 just a drawback unless you have lot of memory.
That is very silly claim. First, you need to have use for all those extra registers to obtain any performance benefits. This is nearly not always the case.
Probably you don't know x86 architecture well, it is register starved for modern standards, also with the 64 bit new instruction were added, on x86 the 64 bit change was not "add 64-bit pointers" but it was let's try to fix some major shortcomings of x86. These enhancements are available only in 64 bit mode (to keep backward compatibility). I know for a fact that my code runs faster in 64 bit mode (or you can say my compiler optimizes it better), and I am not the only one: for sure apple converted basically all its applications to 64 bit on snow leopard (that is focusing on speed), so that they are slower :P.
I'll certainly agree with you on 64-bit x86 likely being faster than 32-bit, but Apple is a bad example. Apple, at its cor...erm..."heart", is a hardware company. That's where they make their money. If software runs efficiently, then their newer hardware becomes a tougher sell (and Jobs himself has never been anything more than a salesman, only with far more control over his company than salesmen usually have). It's not surprising that for years, every version of iTunes has kept growing noticeably more bloated than the last, despite adding very little.
Oct 19 2009
parent Tomas Lindquist Olsen <tomas.l.olsen gmail.com> writes:
On Mon, Oct 19, 2009 at 10:26 PM, Nick Sabalausky <a a.a> wrote:
 "Fawzi Mohamed" <fmohamed mac.com> wrote in message
 news:hbhi5q$1gqm$1 digitalmars.com...
 On 2009-10-18 20:01:26 +0200, language_fan <foo bar.com.invalid> said:

 Sun, 18 Oct 2009 16:35:53 +0200, Fawzi Mohamed thusly wrote:
 on x86 the 64 bit extension added registers, that makes it faster, even
 if as you correctly point out a priori just using 64 bit pointers is
 just a drawback unless you have lot of memory.
That is very silly claim. First, you need to have use for all those extra registers to obtain any performance benefits. This is nearly not always the case.
Probably you don't know x86 architecture well, it is register starved for modern standards, also with the 64 bit new instruction were added, on x86 the 64 bit change was not "add 64-bit pointers" but it was let's try to fix some major shortcomings of x86. These enhancements are available only in 64 bit mode (to keep backward compatibility). I know for a fact that my code runs faster in 64 bit mode (or you can say my compiler optimizes it better), and I am not the only one: for sure apple converted basically all its applications to 64 bit on snow leopard (that is focusing on speed), so that they are slower :P.
I'll certainly agree with you on 64-bit x86 likely being faster than 32-bit, but Apple is bad example. Apple, at it's cor...erm..."heart", is a hardware company. That's where they make their money. If software runs efficiently, then their newer hardware becomes a tougher sell (And Jobs himself has never been anything more than a salesman, only with far more control over his company than salesmen usually have). It's not surprising that for years, every version of iTunes has kept growing noticably more bloated than the last, despite having very little extra.
It's interesting, then, how much Apple is doing to improve performance, with things like OpenCL and LLVM.
Oct 20 2009
prev sibling parent BCS <none anon.com> writes:
Hello Just,

 I won't deny that for certain people 32-bit systems are still
 perfectly useful.
 Just my clients do not share this view for a series of good reasons.
 Even
 their older systems tend to be 64-bit nowadays. Migration towards
 64-bit
 OSes is under way. There is still 32-bit compatibility if needed. At
 the same
 time certain programs will perform drastically better when compiled to
 64-bit.
 Replacement thus can be postponed which is usually the best way to
 keep
 CFOs happy.
If you know in advance that ALL of your market is 64-bit today, or if you expect the client to buy whatever you tell them to (I understand that graphic designers buy whatever the Photoshop box says to buy), that's one thing. But if you are selling to people who will only upgrade if forced to, then shipping only a 64-bit build is just spending someone else's money (never a good idea). It sounds like you might be in the first case, so you might be fine, but enough users of D are in the second case that 32-bit is not a waste of time for a lot of people.
Oct 18 2009
prev sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Just Visiting wrote:
 My girl friend is driving a Porsche (no Italian sports car, sorry).
 Who do you think had to pay for it? Yeah, you guessed right,
 it was the guy who is tweaking software, so a bunch of
 computers can survive their replacement by a year or two.
 I just wonder what car make she'd be driving if I told my clients
 to use patience - instead of paying myself to get their analysis
 software to finish the job in a fraction of time.
And it's "Porsche-uh". Andrei
Oct 17 2009
prev sibling parent Daniel de Kok <me nowhere.nospam> writes:
On 2009-10-17 22:11:56 +0200, "Nick Sabalausky" <a a.a> said:
 Only on 64-bit systems. Which are already ridiculously fast anyway. So what
 if they get some more performance? They already have gobs of performance to
 spare. On a 32-bit system it changes the programs performance down to "It
 don't f** work at all", which is the mark of an incredibly arrogant
 developer who likes to shoot themself in the foot by arbitrarily shrinking
 their own potential user base.
Well, for some it is a necessity. In our field (NLP), a theoretical maximum of 4 GB of memory is just too little for anything but some scripting. Just to give some numbers: with the current size of corpora we need ~20 GB of memory for error mining in parsing results, and at least ~10 GB of memory for serious natural language generation. Until there is a good D2 compiler that can target x86_64 on Linux, I'll have to continue using C++. After writing some small programs, I have come to the conclusion that using D would be far more comfortable, although some of the C++0x extensions already implemented in g++ help.

Not that I am complaining; I understand that the first priority is getting the D2 specification finished. But I think it would be far more productive (and easier to support x86_64) if LLVM became the default backend. LLVM will be the future of non-managed compiled languages anyway...

Take care,
Daniel
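As a back-of-the-envelope check on that 4 GB ceiling, a tiny D sketch; the array name and element count below are hypothetical, only there to make the arithmetic concrete:

import std.stdio;

void main()
{
    // A 32-bit process has a 2^32-byte = 4 GiB virtual address space,
    // and in practice only 2-3 GiB of that is usable for the heap, so a
    // ~20 GB working set cannot fit no matter how much RAM PAE exposes
    // to the OS. A 64-bit build can simply allocate it, given physical
    // memory or swap to back it.
    enum gib = 1024UL ^^ 3;                   // bytes in one GiB
    ulong addressSpace32 = 4 * gib;           // 2^32 bytes
    ulong workingSet = 20_000_000_000UL;      // ~20 GB, the figure above
    writefln("32-bit address space: %s GiB, working set: ~%s GiB",
             addressSpace32 / gib, workingSet / gib);

    // Hypothetical allocation that only a 64-bit build can satisfy:
    // auto corpusCounts = new double[](2_500_000_000UL);  // 8 B each, ~20 GB
}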
Oct 19 2009