digitalmars.D - Linus' idea of "good taste" code
- Walter Bright (9/9) Oct 25 2016 It's a small bit, but the idea here is to eliminate if conditionals wher...
- bluecat (4/15) Oct 25 2016 Interesting, that's going in my tips.txt file. Quick question, if
- Walter Bright (24/27) Oct 26 2016 Ha, great question. Never thought about it before. Off the top of my hea...
- Chris (12/13) Oct 26 2016 [snip]
- Andrea Fontana (7/10) Oct 26 2016 It's a common way to solve problem. Just concentrate yourself on
- Adam D. Ruppe (6/10) Oct 26 2016 I find the most elegant bug fixes tend to be the ones with an
- Walter Bright (3/7) Oct 26 2016 Elegant fixes tend to mean a refactoring is included!
- H. S. Teoh via Digitalmars-d (22/35) Oct 25 2016 Not only easier to reason about, but probably performs better too.
- Dicebot (17/21) Oct 26 2016 I find it both funny and saddening how many reddit commentators
- Walter Bright (6/11) Oct 26 2016 Any coding principle rigidly applied leads to disaster.
- Nick Sabalausky (3/7) Oct 26 2016 I always liked that principle. Tao Te Ching, if I'm not mistaken (though...
- qznc (7/17) Oct 27 2016 I'm unsure about Linus' version. For this example, I agree that
- Walter Bright (3/4) Oct 26 2016 The Hacker News thread:
- Marco Leise (24/36) Oct 26 2016 On a more controversial note, I sometimes replace nested
- sarn (6/12) Oct 26 2016 Speaking of Linus, that's idiomatic in the Linux kernel for error
- Walter Bright (2/4) Oct 26 2016 I've also found that nested functions can nicely fix spaghetti code.
- Mark (11/26) Oct 26 2016 Personally, a large amount of control flow statements (and
- Mark (9/12) Oct 26 2016 What would you say is the best way to write his "Initialize grid
- Idan Arye (13/24) Oct 27 2016 I'd like to point to Joel Spolsky excellent article "Five Worlds"
- Chris (2/14) Oct 27 2016 Not easy to be smart with Javascript ;)
- Jonathan M Davis via Digitalmars-d (3/4) Oct 27 2016 Sure, it is. Avoid it. ;)
- Chris (3/8) Oct 27 2016 I wish I could! I wish we had DScript for browsers!
- Laeeth Isharc (4/14) Oct 27 2016 I am hoping that when asm.js is more mature, then we can use the
- Dicebot (16/30) Oct 27 2016 I consider a native compiler from D to WebAssembly to be one of
- Chris (2/21) Oct 27 2016 Can't wait for this to happen!
- Laeeth Isharc (12/31) Oct 29 2016 I asked Kai at dconf about what would be involved in porting to
- Dicebot (12/20) Oct 29 2016 I have actually meant something quite different - implementing
- Walter Bright (8/11) Oct 29 2016 I looked at it for 5 minutes :-) and it looks like typical intermediate ...
- Joakim (8/23) Oct 29 2016 It is not worth it, the web is dying. I was stunned to see this
- Patrick Schluter (3/10) Oct 30 2016 Yes, because outside of web on mobile nothing else exists...
- Joakim (33/53) Nov 02 2016 Pretty soon it won't:
- Patrick Schluter (13/68) Nov 02 2016 Even that chart shows a flattening to an asymptote not a linear
- Joakim (34/85) Nov 04 2016 Nothing is ever "completely replaced"- somebody somewhere is
- Nick Sabalausky (35/41) Nov 09 2016 I'd hardly call that a "full" multi-window mode.
- qznc (15/31) Nov 10 2016 I don't believe that.
- Nick Sabalausky (35/53) Nov 10 2016 I hope you're right, because I definitely need to use an "old-fashioned"...
- Chris (13/29) Nov 10 2016 I've adopted this philosophy:
- Joakim (101/240) Nov 11 2016 Why? Note that it has been fleshed out more since that March
- Kagamin (3/6) Nov 10 2016 They just spend increasingly more time in twitter when not at
- Dicebot (26/41) Oct 31 2016
- Laeeth Isharc (22/44) Oct 30 2016 Existing pipeline is strung together with gaffer tape and string,
- Dicebot (28/32) Oct 31 2016
- Laeeth Isharc (25/36) Oct 31 2016 I was disappointed that after early hype it all went quiet for
- Dicebot (22/41) Nov 01 2016 Actually, they have moved one with browser previews and dedicated
- Kagamin (3/6) Oct 27 2016 Dunno, I wouldn't expect an edge case to fall into the common
- Era Scarecrow (17/19) Oct 28 2016 A problem for myself and probably many programmers, is some of
It's a small bit, but the idea here is to eliminate if conditionals where possible: https://medium.com/ This is something we could all do better at. Making code a straight path makes it easier to reason about and test. Eliminating loops is something D adds, and goes even further to making code a straight line. One thing I've been trying to do lately when working with DMD is to separate code that gathers information from code that performs an action. (The former can then be made pure.) My code traditionally has it all interleaved together.
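For context, the article's running example is deleting an entry from a singly linked list: the "bad taste" version special-cases the head of the list with an if, while the "good taste" version walks a pointer-to-pointer so the head stops being special. A minimal sketch of both in D (the Node type and function names are only illustrative, not from the article or this thread):

    struct Node
    {
        int value;
        Node* next;
    }

    // "Bad taste": the head of the list is a special case.
    // (Like the article's example, this assumes target is actually in the list.)
    void removeWithIf(ref Node* head, Node* target)
    {
        Node* prev = null;
        Node* walk = head;
        while (walk != target)
        {
            prev = walk;
            walk = walk.next;
        }
        if (prev is null)
            head = target.next;      // removing the first node
        else
            prev.next = target.next;
    }

    // "Good taste": an indirect pointer makes the head just another link,
    // so the if disappears and the code is one straight path.
    void removeNoIf(ref Node* head, Node* target)
    {
        Node** indirect = &head;
        while (*indirect != target)
            indirect = &(*indirect).next;
        *indirect = target.next;
    }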
Oct 25 2016
On Tuesday, 25 October 2016 at 22:53:54 UTC, Walter Bright wrote:It's a small bit, but the idea here is to eliminate if conditionals where possible: https://medium.com/ This is something we could all do better at. Making code a straight path makes it easier to reason about and test. Eliminating loops is something D adds, and goes even further to making code a straight line. One thing I've been trying to do lately when working with DMD is to separate code that gathers information from code that performs an action. (The former can then be made pure.) My code traditionally has it all interleaved together.Interesting, that's going in my tips.txt file. Quick question, if you don't mind. What would be the top three things you've learned that significantly made you a better programmer?
Oct 25 2016
On 10/25/2016 5:19 PM, bluecat wrote:Interesting, that's going in my tips.txt file. Quick question, if you don't mind. What would be the top three things you've learned that significantly made you a better programmer?Ha, great question. Never thought about it before. Off the top of my head:

1. Never try to do two things at once, i.e. never mix:
   refactoring
   translation
   more than one bug fix at a time
   optimization
   feature addition
I sure endlessly beat people on github with clue-by-fours over this.
2. If the code looks ugly, it is guaranteed to not work properly. If you're stuck debugging someone else's code, look at the ugly bits first. (I've had several experienced developers tell me this, too.)
3. Anybody can write clever, complex code. It takes genius to write simple code.
4. Macros are like crack. The first hit feels great, but macros inevitably turn everything they touch into crap. It took me a loooong time to learn this. Some years ago I made a concerted effort to purge macros from the dmd front end, and have been purging them from the back end. I've been very pleased with the results.
5. Global variables are the Spawn of Satan. (They make code very hard to reason about.)
6. Nearly all bugs can be fixed with under 10 lines of code, and quite a few with 1 line. It's always a red flag for me when a fix PR has 200+ lines of code (test case lines of code don't count, neither do comments).
7. If you're stuck on a programming problem, go out for a jog. I often find the answer that way. What can I say, it works for me.
Oct 26 2016
On Wednesday, 26 October 2016 at 09:41:42 UTC, Walter Bright wrote:On 10/25/2016 5:19 PM, bluecat wrote:[snip] I largely agree with your points 1.-7. (especially 1. and 7.). Also that gathering the data and acting upon it should be separate. However, principles might be at loggerheads. E.g. generics vs. specialization. Performance vs. fragmentation. Cf. https://dpaste.dzfl.pl/dc8dc6e1eb53 vs. https://dpaste.dzfl.pl/392710b765a9 (Andrei's improved search algorithm that got a performance boost through manual inlining).
Oct 26 2016
On Wednesday, 26 October 2016 at 09:41:42 UTC, Walter Bright wrote:7. If you're stuck on a programming problem, go out for a jog. I often find the answer that way. What can I say, it works for me.It's a common way to solve problems. Just concentrate on something more practical than the problem itself. It really works fine for me. When we have a hard problem to solve, my boss always tells me to take a walk outside the office or simply to go home for lunch. Then I come back with the solution. It works.
Oct 26 2016
On Wednesday, 26 October 2016 at 09:41:42 UTC, Walter Bright wrote:6. Nearly all bugs can be fixed with under 10 lines of code, and quite a few with 1 line. It's always a red flag for me when a fix PR has 200+ lines of code (test case lines of code don't count, neither do comments).I find the most elegant bug fixes tend to be the ones with an overall reduction of code. Though, sometimes things are rotten to the core and that net change of -10 lines comes from a +330, -340 diff....
Oct 26 2016
On 10/26/2016 6:20 AM, Adam D. Ruppe wrote:I find the most elegant bug fixes tend to be the ones with an overall reduction of code. Though, sometimes things are rotten to the core and that net change of -10 lines comes from a +330, -340 diff....Elegant fixes tend to mean a refactoring is included! And it's not a hard and fast rule. It's just a red flag.
Oct 26 2016
On Tue, Oct 25, 2016 at 03:53:54PM -0700, Walter Bright via Digitalmars-d wrote:It's a small bit, but the idea here is to eliminate if conditionals where possible: https://medium.com/ This is something we could all do better at. Making code a straight path makes it easier to reason about and test.Not only easier to reason about, but probably performs better too. Conditionals introduce pipeline hazards. (Though, arguably, not very significant here because the loop would dominate the running time.)Eliminating loops is something D adds, and goes even further to making code a straight line. One thing I've been trying to do lately when working with DMD is to separate code that gathers information from code that performs an action. (The former can then be made pure.) My code traditionally has it all interleaved together.Separation of concerns is a typical "good taste" practice. It's why the range-based pipeline idiom is so effective. When you compound multiple concerns into a single piece of code, it inevitably gets tangled into a complex mess that has lots of room for bugs to hide in. It's that whole philosophy with Jackson Structured Programming again: make your code structure 1-to-1 with your data structure, and edge cases and ad hoc complexities vanish, along with their respective bug potentials. If the input structure doesn't match the code structure for whatever reason, say for algorithmic reasons, the best approach is to transform the data into a matching structure first, then operate on it. I.e., your "gather data, then operate on it" idea. Trying to do both at the same time typically leads to well-known warning signs: boolean flags, recycled variables with conflicting meanings, proliferating if-statements, complex looping conditions, etc., all the result of ad hoc attempts at resolving the structure conflict between data and code. T -- Be in denial for long enough, and one day you'll deny yourself of things you wish you hadn't.
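A minimal D sketch of that "gather the data, then act on it" split, using a range pipeline (the file name and the cutoff value are made-up assumptions, purely for illustration):

    import std.algorithm : filter, map, sum;
    import std.array : array;
    import std.conv : to;
    import std.stdio : File, writeln;

    void main()
    {
        // Gather: a pipeline that only produces data, no side effects.
        auto bigValues = File("numbers.txt")
            .byLineCopy
            .map!(to!int)
            .filter!(n => n > 100)
            .array;

        // Act: the side effects live in one obvious place at the end.
        writeln("count = ", bigValues.length, ", sum = ", bigValues.sum);
    }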
Oct 25 2016
On 10/26/2016 12:53 AM, Walter Bright wrote:It's a small bit, but the idea here is to eliminate if conditionals where possible: https://medium.com/ bartobri/applying-the-linus-tarvolds-good-taste-cod...I find it both funny and saddening how many reddit commentators complained that Linus' version of that code is over-complicated. "Prefer clear code over smart code" is a good principle in general, but sometimes it is over-applied to the point where incompetence gets glorified. And this sucks.
Oct 26 2016
On 10/26/2016 2:54 AM, Dicebot wrote:I find it both funny and saddening how many reddit commentators complained about Linus version of that code is over-complicated. "Prefer clear code over smart code" principle is good in general but sometimes it is over-applied to the point where incompetence gets glorified. And this sucks.Any coding principle rigidly applied leads to disaster. Reminds me of: 1. newbie - follows the rules because he's told to 2. master - follows the rules because he understands them 3. guru - breaks the rules because he understands that they don't apply
Oct 26 2016
On 10/26/2016 06:42 AM, Walter Bright wrote:Reminds me of: 1. newbie - follows the rules because he's told to 2. master - follows the rules because he understands them 3. guru - breaks the rules because he understands that they don't applyI always liked that principle. Tao Te Ching, if I'm not mistaken (though I might be).
Oct 26 2016
On Wednesday, 26 October 2016 at 09:54:31 UTC, Dicebot wrote:On 10/26/2016 12:53 AM, Walter Bright wrote:I'm unsure about Linus' version. For this example, I agree that it is elegant. It is fine in this specific case, because everything is local within a single function. In general, the trick of using a pointer to the element is probably not a good idea. The article/Linus does not explain the tradeoffs properly, which makes it dangerous advice.It's a small bit, but the idea here is to eliminate if conditionals where possible: https://medium.com/I find it both funny and saddening how many reddit commentators complained that Linus' version of that code is over-complicated. "Prefer clear code over smart code" is a good principle in general, but sometimes it is over-applied to the point where incompetence gets glorified. And this sucks.
Oct 27 2016
On Thursday, 27 October 2016 at 08:24:54 UTC, qznc wrote:On Wednesday, 26 October 2016 at 09:54:31 UTC, Dicebot wrote:It's like everything in life, like playing an instrument for example - or indeed life itself. Nothing, not even the best advice, can replace experience - which comes with age. It's only when you have a lot of experience that you understand the advice given to you by your elders. Then you pass on the same advice - but the inexperienced have to make the same old mistakes all over again until they understand. Some never do, though.On 10/26/2016 12:53 AM, Walter Bright wrote:I'm unsure about Linus' version. For this example, I agree that it is elegant. It is fine in this specific case, because everything is local within a single function. In general, the trick to use a pointer to the element probably not a good idea. The article/Linus does not explain the tradeoffs properly, which makes it dangerous advice.It's a small bit, but the idea here is to eliminate if conditionals where possible: https://medium.com/I find it both funny and saddening how many reddit commentators complained about Linus version of that code is over-complicated. "Prefer clear code over smart code" principle is good in general but sometimes it is over-applied to the point where incompetence gets glorified. And this sucks.
Oct 27 2016
On 10/27/2016 11:24 AM, qznc wrote:I'm unsure about Linus' version. For this example, I agree that it is elegant. It is fine in this specific case, because everything is local within a single function. In general, the trick of using a pointer to the element is probably not a good idea. The article/Linus does not explain the tradeoffs properly, which makes it dangerous advice.I'd agree for some general language, but considering it is Linus, C can reasonably be assumed. And in C land it doesn't even count as a "trick": heavy nested pointer usage is widespread and even somewhat idiomatic - being comfortable with such code can be expected from any non-newbie C programmer.
Oct 27 2016
On 10/25/2016 3:53 PM, Walter Bright wrote:https://medium.com/The Hacker News thread: https://news.ycombinator.com/item?id=12793624
Oct 26 2016
On Tue, 25 Oct 2016 15:53:54 -0700, Walter Bright <newshound2 digitalmars.com> wrote:It's a small bit, but the idea here is to eliminate if conditionals where possible: https://medium.com/ This is something we could all do better at. Making code a straight path makes it easier to reason about and test. Eliminating loops is something D adds, and goes even further to making code a straight line.On a more controversial note, I sometimes replace nested blocks of conditionals and loops with flat spaghetti code and goto with verbose labels. There are situations where you can explain straightforwardly what needs to be done first, second and last, but special cases and loops make it hard to tell from "normal" code.

    if (valueTooBig) goto BreakBigValueDown;

    ProcessValue:
        ...
        return;

    // ======================

    BreakBigValueDown:
        // makes big values manageable
        ...
        goto ProcessValue;

There is a concise piece of code that handles 90% of the cases and another block for anything that requires special handling. It can in theory also avoid costly memory loads for rarely used code.One thing I've been trying to do lately when working with DMD is to separate code that gathers information from code that performs an action. (The former can then be made pure.) My code traditionally has it all interleaved together.Interesting. -- Marco
Oct 26 2016
On Wednesday, 26 October 2016 at 10:48:34 UTC, Marco Leise wrote:On a more controversial note, I sometimes replace nested blocks of conditionals and loops with flat spaghetti code and goto with verbose labels. There are situations where you can explain straight forward what needs to be done first, second and last, but special cases and loops make it hard to tell from "normal" code.Speaking of Linus, that's idiomatic in the Linux kernel for error handling. I like using D's nested functions for simplifying code in the same kind of way. Sometimes a tiny helper function can make a big difference.
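A minimal D sketch of the kind of tiny nested helper meant here, in the spirit of the kernel's goto-based error handling (the pipeline steps and names are invented for illustration):

    import std.stdio : stderr, writeln;

    // Stand-ins so the sketch is self-contained.
    bool loadConfig()  { return true; }
    bool openOutput()  { return true; }
    bool writeReport() { return true; }

    bool runPipeline()
    {
        // The nested helper keeps logging/cleanup in one place,
        // so every step reads as a single short line.
        bool fail(string step)
        {
            stderr.writeln("pipeline failed at: ", step);
            return false;
        }

        if (!loadConfig())  return fail("loadConfig");
        if (!openOutput())  return fail("openOutput");
        if (!writeReport()) return fail("writeReport");
        return true;
    }

    void main() { writeln(runPipeline()); }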
Oct 26 2016
On 10/26/2016 4:21 PM, sarn wrote:I like using D's nested functions for simplifying code in the same kind of way. Sometimes a tiny helper function can make a big difference.I've also found that nested functions can nicely fix spaghetti code.
Oct 26 2016
On Tuesday, 25 October 2016 at 22:53:54 UTC, Walter Bright wrote:It's a small bit, but the idea here is to eliminate if conditionals where possible: https://medium.com/ This is something we could all do better at. Making code a straight path makes it easier to reason about and test. Eliminating loops is something D adds, and goes even further to making code a straight line. One thing I've been trying to do lately when working with DMD is to separate code that gathers information from code that performs an action. (The former can then be made pure.) My code traditionally has it all interleaved together.Personally, a large amount of control flow statements (and especially nested control flow) makes it difficult for me to understand what the code does (not to mention debugging it). Unfortunately, it can be difficult to eliminate (or hide) control flow without creating needless complexity in other parts of the program. For instance, Linus' "good taste" example uses pointers explicitly and I would argue that this too is "distasteful". On Tuesday, 25 October 2016 at 22:53:54 UTC, Walter Bright wrote:One thing I've been trying to do lately when working with DMD is to separate code that gathers information from code that performs an action. (The former can then be made pure.) My code traditionally has it all interleaved together.An obvious example of code that gathers information separated from code that performs an action is exceptions. :)
Oct 26 2016
On Tuesday, 25 October 2016 at 22:53:54 UTC, Walter Bright wrote:It's a small bit, but the idea here is to eliminate if conditionals where possible: https://medium.com/What would you say is the best way to write his "Initialize grid edges" code in D? Ideally, I would like something like this:

    void initEdges(int val, ref int[GRID_SIZE][GRID_SIZE] grid)
    {
        map!(a => val)(chain(grid.row(0), grid.row($-1), grid.column(0), grid.column($-1)));
    }

but multidimensional arrays don't have built-in primitives like that (as far as I know).
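For reference, a version that does work with today's static arrays uses slice assignment for the top and bottom rows and a loop for the side columns - a minimal sketch, with the GRID_SIZE value and the unittest being assumptions added for illustration:

    enum GRID_SIZE = 10;

    void initEdges(int val, ref int[GRID_SIZE][GRID_SIZE] grid)
    {
        grid[0][]     = val;   // top edge
        grid[$ - 1][] = val;   // bottom edge
        foreach (ref row; grid)
        {
            row[0]     = val;  // left edge
            row[$ - 1] = val;  // right edge
        }
    }

    unittest
    {
        int[GRID_SIZE][GRID_SIZE] g;
        initEdges(1, g);
        assert(g[0][3] == 1 && g[3][0] == 1 && g[5][5] == 0);
    }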
Oct 26 2016
On Tuesday, 25 October 2016 at 22:53:54 UTC, Walter Bright wrote:It's a small bit, but the idea here is to eliminate if conditionals where possible: https://medium.com/ This is something we could all do better at. Making code a straight path makes it easier to reason about and test. Eliminating loops is something D adds, and goes even further to making code a straight line. One thing I've been trying to do lately when working with DMD is to separate code that gathers information from code that performs an action. (The former can then be made pure.) My code traditionally has it all interleaved together.I'd like to point to Joel Spolsky's excellent article "Five Worlds" - http://www.joelonsoftware.com/articles/FiveWorlds.html TL;DR: Joel Spolsky argues that different types ("worlds") of development require different qualities and different priorities, both from the code and the process. Because of that, advice given by experts of one world does not necessarily apply to other worlds, even if the expert is really smart and experienced and even if the advice was learned with great pain. Linus Torvalds is undoubtedly smart and experienced, but he belongs to the world of low-level kernels and filesystems code. Just because such code would be considered "tasteless" there doesn't mean it's tasteless everywhere.
Oct 27 2016
On Thursday, 27 October 2016 at 14:54:59 UTC, Idan Arye wrote:I'd like to point to Joel Spolsky excellent article "Five Worlds" - http://www.joelonsoftware.com/articles/FiveWorlds.html TL;DR: Joel Spolsky argues that different types("worlds") of developments require different qualities and different priorities, both from the code and the process. Because of that, advice given by experts of one world does not necessary apply to other worlds, even if the expert is really smart and experienced and even if the advice was learned with great pain. Linus Torvald is undoubtedly smart and experienced, but he belongs to the world of low-level kernels and filesystems code. Just because such code would be considered "tasteless" there doesn't mean it's tasteless everywhere.Not easy to be smart with Javascript ;)
Oct 27 2016
On Thursday, October 27, 2016 15:42:53 Chris via Digitalmars-d wrote:Not easy to be smart with Javascript ;)Sure, it is. Avoid it. ;) - Jonathan M Davis
Oct 27 2016
On Thursday, 27 October 2016 at 15:54:59 UTC, Jonathan M Davis wrote:On Thursday, October 27, 2016 15:42:53 Chris via Digitalmars-d wrote:I wish I could! I wish we had DScript for browsers!Not easy to be smart with Javascript ;)Sure, it is. Avoid it. ;) - Jonathan M Davis
Oct 27 2016
On Thursday, 27 October 2016 at 16:01:26 UTC, Chris wrote:On Thursday, 27 October 2016 at 15:54:59 UTC, Jonathan M Davis wrote:I am hoping that when asm.js is more mature, then we can use the llvm back end to write at least part of front end in D. Would need runtime ported of course.On Thursday, October 27, 2016 15:42:53 Chris via Digitalmars-d wrote:I wish I could! I wish we had DScript for browsers!Not easy to be smart with Javascript ;)Sure, it is. Avoid it. ;) - Jonathan M Davis
Oct 27 2016
On 10/27/2016 07:12 PM, Laeeth Isharc wrote:On Thursday, 27 October 2016 at 16:01:26 UTC, Chris wrote:On Thursday, 27 October 2016 at 15:54:59 UTC, Jonathan M Davis wrote:On Thursday, October 27, 2016 15:42:53 Chris via Digitalmars-d wrote:Not easy to be smart with Javascript ;)Sure, it is. Avoid it. ;) - Jonathan M DavisI wish I could! I wish we had DScript for browsers!I am hoping that when asm.js is more mature, then we can use the llvm back end to write at least part of front end in D. Would need runtime ported of course.I consider a native compiler from D to WebAssembly to be one of the possible unique breakthrough points for D in the coming years.
Oct 27 2016
On Thursday, 27 October 2016 at 16:14:03 UTC, Dicebot wrote:On 10/27/2016 07:12 PM, Laeeth Isharc wrote:Can't wait for this to happen!On Thursday, 27 October 2016 at 16:01:26 UTC, Chris wrote:I consider native compiler from D to WebAssembly to be one of possible unique breakthrough points for D in coming years.On Thursday, 27 October 2016 at 15:54:59 UTC, Jonathan M Davis wrote:I am hoping that when asm.js is more mature, then we can use the llvm back end to write at least part of front end in D. Would need runtime ported of course.On Thursday, October 27, 2016 15:42:53 Chris via Digitalmars-d wrote:I wish I could! I wish we had DScript for browsers!Not easy to be smart with Javascript ;)Sure, it is. Avoid it. ;) - Jonathan M Davis
Oct 27 2016
On Thursday, 27 October 2016 at 16:14:03 UTC, Dicebot wrote:On 10/27/2016 07:12 PM, Laeeth Isharc wrote:I asked Kai at dconf about what would be involved in porting to wasm, but I think he misheard me, as his answer was just generically about what was involved in porting to a new platform. Any thoughts on how much work is involved to port the runtime? And what other changes might be involved? The chap that used the C backend for LLVM wrote a little mini runtime but I guess didn't have to worry about the version blocks in the compiler front end as much. (don't recall what architecture he pretended to be compiling to). Glibc has obviously already been ported to run in browser by emscripten.On Thursday, 27 October 2016 at 16:01:26 UTC, Chris wrote:I consider native compiler from D to WebAssembly to be one of possible unique breakthrough points for D in coming years.On Thursday, 27 October 2016 at 15:54:59 UTC, Jonathan M Davis wrote:I am hoping that when asm.js is more mature, then we can use the llvm back end to write at least part of front end in D. Would need runtime ported of course.On Thursday, October 27, 2016 15:42:53 Chris via Digitalmars-d wrote:I wish I could! I wish we had DScript for browsers!Not easy to be smart with Javascript ;)Sure, it is. Avoid it. ;) - Jonathan M Davis
Oct 29 2016
On Saturday, 29 October 2016 at 21:46:37 UTC, Laeeth Isharc wrote:Any thoughts on how much work is involved to port the runtime? And what other changes might be involved? The chap that used the C backend for LLVM wrote a little mini runtime but I guess didn't have to worry about the version blocks in the compiler front end as much. (don't recall what architecture he pretended to be compiling to). Glibc has obviously already been ported to run in browser by emscripten.I have actually meant something quite different - implementing a new backend for DMD which emits wasm directly (possibly by embedding binaryen). That is more work than simply using the LLVM stack as-is but would result in a unique marketing advantage - the existing pipeline of C -> Emscripten -> asm.js -> asm2wasm is rather annoying. And the native wasm backend for LLVM has been in development for quite a while now with no clear ETA. At the same time the intended wasm spec (https://github.com/WebAssembly/design) is much simpler than machine code for something like x86_64. If Walter gets interested, that may be a feasible path :)
Oct 29 2016
On 10/29/2016 10:30 PM, Dicebot wrote:At the same time the intended wasm spec (https://github.com/WebAssembly/design) is much simpler than machine code for something like x86_64. If Walter gets interested, that may be a feasible path :)I looked at it for 5 minutes :-) and it looks like typical intermediate code that a compiler might generate. It wouldn't be hard to translate the intermediate code generated for the dmd back end to wasm. What I didn't see was any mention of symbolic debug information, linking, or how to connect to system services like mutexes, I/O, etc. Another time risk is that wasm is an incomplete, moving target. Looks like a fun project, but I don't see how I could split off time to work on it.
Oct 29 2016
On Sunday, 30 October 2016 at 05:53:09 UTC, Walter Bright wrote:On 10/29/2016 10:30 PM, Dicebot wrote:It is not worth it, the web is dying. I was stunned to see this chart of mobile web usage in the US: https://mobile.twitter.com/asymco/status/777915894659964928 This isn't some third-world country with mostly 2G usage, the web numbers in those places are much worse. Combined with mobile passing even TV for time spent, there is no point in wasting time porting D to a dying platform.At the same time intended wasm spec (https://github.com/WebAssembly/design) is much more simple than machine code for something like x86_64. If Walter gets interested, that may be a feasible path :)I looked at it for 5 minutes :-) and it looks like typical intermediate code that a compiler might generate. It wouldn't be hard to translate the intermediate code generated for the dmd back end to wasm. What I didn't see was any mention symbolic debug information, linking, or how to connect to system services like mutexes, I/O, etc. Time risks would also be wasm is an incomplete, moving target. Looks like a fun project, but I don't see how I could split off time to work on it.
Oct 29 2016
On Sunday, 30 October 2016 at 06:39:42 UTC, Joakim wrote:It is not worth it, the web is dying. I was stunned to see this chart of mobile web usage in the US: https://mobile.twitter.com/asymco/status/777915894659964928 This isn't some third-world country with mostly 2G usage, the web numbers in those places are much worse. Combined with mobile passing even TV for time spent, there is no point in wasting time porting D to a dying platform.Yes, because outside of web on mobile nothing else exists... bwahahahah
Oct 30 2016
On Sunday, 30 October 2016 at 10:04:02 UTC, Patrick Schluter wrote:On Sunday, 30 October 2016 at 06:39:42 UTC, Joakim wrote:Pretty soon it won't: https://mobile.twitter.com/asymco/status/793401867053195264 On Sunday, 30 October 2016 at 16:35:54 UTC, Laeeth Isharc wrote:It is not worth it, the web is dying. I was stunned to see this chart of mobile web usage in the US: https://mobile.twitter.com/asymco/status/777915894659964928 This isn't some third-world country with mostly 2G usage, the web numbers in those places are much worse. Combined with mobile passing even TV for time spent, there is no point in wasting time porting D to a dying platform.Yes, because outside of web on mobile nothing else exists... bwahahahahAnd wouldn't the changes to runtime and phobos be quite similar for both dmd and ldc? I don't see how the work flow would be any different as a language user whether you used an LDC with wasm back end, or dmd with similar.The changes to druntime and phobos wouldn't depend on the compiler used, but it is difficult to test unless you have a compiler with working codegen, so that usually comes first. You can go ahead and make changes to druntime- not much has to be done for phobos, as the idea is to encapsulate platform-specific code in druntime, though a minority of phobos does call platform-specific APIs- based on the spec or available headers, but you won't know if it will work well till you can run it.Joakim - native on mobile is so much better (setting aside having to deal with Apple or Google) but I guess the browser isn't going away on the desktop for a while yet.I'm actually a heavy web user, have been for almost a quarter-century (though I don't use webapps, mostly reading), which is why that chart was so surprising to me. While native mobile apps are usually more responsive, they are not ideal for reading, as I'm not going to install and load up The Verge's app, or an app for every other news site, every time. The problem for the desktop browser is that the desktop is going away, as the linked tweet above shows. I went from using a FreeBSD desktop and a dumbphone five years ago to an Android smartphone and two Android tablets today, ie no desktop or laptop since my ultrabook died late last year. In my household, we went from using two smartphones, two PC laptops, and a Mac laptop four years ago to three smartphones, three Android tablets, and a Mac laptop today. This is a shift that is happening in most households, as a PC overserves most and a mobile device will do. Many D users are power users who cling to old tech like the desktop and the web, so they are missing this massive wave going on right now. I myself missed the death of the mobile web, as I'm such a heavy user.
Nov 02 2016
On Thursday, 3 November 2016 at 06:11:08 UTC, Joakim wrote:On Sunday, 30 October 2016 at 10:04:02 UTC, Patrick Schluter wrote:Even that chart shows a flattening to an asymptote not a linear trend. This means desktop will maube go a little bit down but it won't disappear. What people often forget is that professional office PC will never be completely replaced by mobile. What also happens in that branch (i.e. in office environment) is still a continuation to replace PC application by browser applications. This means that still focussing on good web solutions server or client side is a good investment in any case.On Sunday, 30 October 2016 at 06:39:42 UTC, Joakim wrote:Pretty soon it won't: https://mobile.twitter.com/asymco/status/793401867053195264It is not worth it, the web is dying. I was stunned to see this chart of mobile web usage in the US: https://mobile.twitter.com/asymco/status/777915894659964928 This isn't some third-world country with mostly 2G usage, the web numbers in those places are much worse. Combined with mobile passing even TV for time spent, there is no point in wasting time porting D to a dying platform.Yes, because outside of web on mobile nothing else exists... bwahahahahOn Sunday, 30 October 2016 at 16:35:54 UTC, Laeeth Isharc wrote:No it is not. Linear extrapolation of an incomplete chart is almost always erroneous.And wouldn't the changes to runtime and phobos be quite similar for both dmd and ldc? I don't see how the work flow would be any different as a language user whether you used an LDC with wasm back end, or dmd with similar.The changes to druntime and phobos wouldn't depend on the compiler used, but it is difficult to test unless you have a compiler with working codegen, so that usually comes first. You can go ahead and make changes to druntime- not much has to be done for phobos, as the idea is to encapsulate platform-specific code in druntime, though a minority of phobos does call platform-specific APIs- based on the spec or available headers, but you won't know if it will work well till you can run it.Joakim - native on mobile is so much better (setting aside having to deal with Apple or Google) but I guess the browser isn't going away on the desktop for a while yet.I'm actually a heavy web user, have been for almost a quarter-century (though I don't use webapps, mostly reading), which is why that chart was so surprising to me. While native mobile apps are usually more responsive, they are not ideal for reading, as I'm not going to install and load up The Verge's app, or an app for every other news site, every time. The problem for the desktop browser is that the desktop is going away, as the linked tweet above shows.I went from using a FreeBSD desktop and a dumbphone five years ago to an Android smartphone and two Android tablets today, ie no desktop or laptop since my ultrabook died late last year. In my household, we went from using two smartphones, two PC laptops, and a Mac laptop four years ago to three smartphones, three Android tablets, and a Mac laptop today. This is a shift that is happening in most households, as a PC overserves most and a mobile device will do. Many D users are power users who cling to old tech like the desktop and the web, so they are missing this massive wave going on right now. I myself missed the death of the mobile web, as I'm such a heavy user.still bwahahaha, web technology will stay a bit longer, panicking is a bit premature yet.
Nov 02 2016
On Thursday, 3 November 2016 at 06:32:07 UTC, Patrick Schluter wrote:On Thursday, 3 November 2016 at 06:11:08 UTC, Joakim wrote:Nothing is ever "completely replaced"- somebody somewhere is still using a mainframe or a UNIX workstation- but yes, PCs will basically disappear, just as you never see those old computers anymore. Android 7.0 has a full multi-window mode, just dock your smartphone with a monitor and keyboard/mouse and start working: http://arstechnica.com/gadgets/2016/03/this-is-android-ns-freeform-window-mode/ Not all devices will come with that enabled, but you can enable it yourself on any 7.0 device: http://www.androidpolice.com/2016/09/19/taskbar-updated-version-1-2-can-now-completely-replace-home-screen/ Now that mobile devices dominate the computing market, they are going after the legacy PC market too, by adding the remaining features needed. Smartphones will replace the office PC, just like they have already killed off standalone mp3 players, GPS devices, handheld consoles, dumbphones, and so on.On Sunday, 30 October 2016 at 10:04:02 UTC, Patrick Schluter wrote:Even that chart shows a flattening to an asymptote not a linear trend. This means desktop will maube go a little bit down but it won't disappear. What people often forget is that professional office PC will never be completely replaced by mobile.On Sunday, 30 October 2016 at 06:39:42 UTC, Joakim wrote:Pretty soon it won't: https://mobile.twitter.com/asymco/status/793401867053195264It is not worth it, the web is dying. I was stunned to see this chart of mobile web usage in the US: https://mobile.twitter.com/asymco/status/777915894659964928 This isn't some third-world country with mostly 2G usage, the web numbers in those places are much worse. Combined with mobile passing even TV for time spent, there is no point in wasting time porting D to a dying platform.Yes, because outside of web on mobile nothing else exists... bwahahahahWhat also happens in that branch (i.e. in office environment) is still a continuation to replace PC application by browser applications. This means that still focussing on good web solutions server or client side is a good investment in any case.More likely those PC apps will become mobile apps, like Office has done.I pointed out a trend and extrapolated it continuing, but I never suggested it would be "linear." I actually believe there will be a collapse in PC sales over the next decade, as the steady slide over the last five years will accelerate (see last graphic of 25% drop in Windows PC sales): http://www.asymco.com/2016/11/02/wherefore-art-thou-macintosh/ Smartphones and tablets have gone after the marginal PC uses so far, ie around the home where they were overkill anyway, now they go after the core uses.The problem for the desktop browser is that the desktop is going away, as the linked tweet above shows.No it is not. Linear extrapolation of an incomplete chart is almost always erroneous.It is already gone- look at the numbers- and I never said anything about "panicking," as I actually welcome the move. The web was always decent for publishing and reading, but I have long thought it was a bad idea for them to turn it into an app platform, as I've said in this forum many times. All tech lives and dies, the PC and the web are no different.I went from using a FreeBSD desktop and a dumbphone five years ago to an Android smartphone and two Android tablets today, ie no desktop or laptop since my ultrabook died late last year. 
In my household, we went from using two smartphones, two PC laptops, and a Mac laptop four years ago to three smartphones, three Android tablets, and a Mac laptop today. This is a shift that is happening in most households, as a PC overserves most and a mobile device will do. Many D users are power users who cling to old tech like the desktop and the web, so they are missing this massive wave going on right now. I myself missed the death of the mobile web, as I'm such a heavy user.still bwahahaha, web technology will stay a bit longer, panicking is a bit premature yet.
Nov 04 2016
On 11/05/2016 02:00 AM, Joakim wrote:Nothing is ever "completely replaced"- somebody somewhere is still using a mainframe or a UNIX workstation- but yes, PCs will basically disappear, just as you never see those old computers anymore. Android 7.0 has a full multi-window mode, just dock your smartphone with a monitor and keyboard/mouse and start working: http://arstechnica.com/gadgets/2016/03/this-is-android-ns-freeform-window-mode/I'd hardly call that a "full" multi-window mode. I've been (begrudgingly) using android on a daily basis for years now, and honestly, OS support for freeform windows is the least of what Android needs to be worthwhile as a PC replacement. Hell, even on the desktop, I usually have everything maximized anyway. Even with freeform windows, I'd sooner use *Win8*, of all things, on the desktop than fucking android. There are four main areas that are currently making Android and absolute shit as a desktop replacement: The OS itself, the look&feel style guidelines, the entire 3rd party ecosystem, and the form factor. Ie, basically everything. And the idea that plugging in a mouse, keyboard and monitor fixes the form factor issues is ludicrous, because seriously, compare that to them already just *being* there as with a laptop (yea, laptop keyboards and trackpads suck, but not remotely as badly as what's built-in on a smartphone/tablet). Or compared to, you know, not having to deal with a plugging in a hub, connecting all that shit, clumsily propping the stupid little thing up, OR bluetooth for that matter, because, let's be honest, fucking NOBODY other than us power users can figure out that obtuse "pairing" shit. But it probably will take over anyway, because, let's face it, when the fuck has being complete and utter fucking shit ever stopped a computing tech from becoming a runaway success?: Windows, C++, Java2, Web-as-a-platform, JS/Ajax/Toolkit-overload, walled-garden services, zero-privacy private surveillance, non-tactile "touch"-screens, intrusive forced-update systems, removing-features-as-a-feature, widescreens for general computing purposes, iOS/Android "phones", etc. People are goddamned morons, and you know what? These motherfuckers DESERVE to have their privacy stolen, waste all their time futzing around with fucking broken software & devices, lose their data, and die while text-driving (or while letting GPS/Google drive their car, which mark my words, will be the next thing). I'd hope the whole fucking world burns, but it looks like I don't *need* to hope for that, it's pretty much guaranteed at this point anyway.
Nov 09 2016
On Wednesday, 9 November 2016 at 16:00:45 UTC, Nick Sabalausky wrote:On 11/05/2016 02:00 AM, Joakim wrote:I don't believe that. Software developers need a big machine, because these days you have to run a bunch of VMs to get anything done. Unless we migrate to Cloud-IDEs, we will use PCs in the foreseeable future and I don't see Cloud-IDEs happening. Office Workers who are happy with MS Office alone could use Android. However, there is always this old internal app, which barely works on newer Windows versions. It will take a few decades until those are replaced. Executives could move to pure mobile and probably already did. Reading reports and writing emails works well already. I believe the PC is just as tenacious as the x86 architecture, which is still backwards compatible over the last three decades.Nothing is ever "completely replaced"- somebody somewhere is still using a mainframe or a UNIX workstation- but yes, PCs will basically disappear, just as you never see those old computers anymore. Android 7.0 has a full multi-window mode, just dock your smartphone with a monitor and keyboard/mouse and start working: http://arstechnica.com/gadgets/2016/03/this-is-android-ns-freeform-window-mode/But it probably will take over anyway, because, let's face it, when the fuck has being complete and utter fucking shit ever stopped a computing tech from becoming a runaway success?:
Nov 10 2016
On 11/10/2016 05:14 AM, qznc wrote:On Wednesday, 9 November 2016 at 16:00:45 UTC, Nick Sabalausky wrote:I hope you're right, because I definitely need to use an "old-fashioned" machine in order to get things done without wasting enormous time & effort. But I've experienced this pattern far too many times to be confident in that: - I use XYZ all the time, just like everyone else. It has a few things that could use improvement, and would be entirely feasible to fix, but for the most part works fine. - Instead of XYZ's existing, easily solvable, problems actually BEING solved by those in a position to do so, somebody (maybe even the same people) comes out with UVW, with tons of fanfare because of one or two little things it does better. But, for the MOST part, UVW is total shit and vastly inferior to XYZ. - UVW's hype begets more hype as people worldwide mistake hype (and "newness") for worthiness. - Most people delude themselves into pretending UVW's downsides (compared to XYZ) don't exist, because after all, it's newer and has hype so therefore it's unquestionably better. Or, they just simply tolerate UVW's downsides, because, again, it's the hot new shit, so it MUST be the right tool for the job, right? - Eventually, more and more people are forced to migrate from XYZ to UVW because of both market and industry pressures and because of XYZ becoming harder to obtain and getting less and less attention, shrinking ecosystem, etc. - UVW marginalizes XYZ. - A small faction of UVW users actually recognize UVW's downsides (or at least recognize there are "dinosaurs" who are holding out and must be converted for their own good). So the UVW folk make a half-assed amateur attempt to "fix" the downsides, only UVW still never actually reaches parity with XYZ. And why bother trying? XYZ's already been all but killed off. I've seen it over, and over, and over. Unless people finally wise up and quit mistaking hype for worthiness (highly unlikely), I fear the same is poised to happen to PC's. I'm already forced to rely on these god-awful "modern" smartphones far more than I'd like to.But it probably will take over anyway, because, let's face it, when the fuck has being complete and utter fucking shit ever stopped a computing tech from becoming a runaway success?:I don't believe that. Software developers need a big machine, because these days you have to run a bunch of VMs to get anything done. Unless we migrate to Cloud-IDEs, we will use PCs in the foreseeable future and I don't see Cloud-IDEs happening. Office Workers who are happy with MS Office alone could use Android. However, there is always this old internal app, which barely works on newer Windows versions. It will take a few decades until those are replaced. Executives could move to pure mobile and probably already did. Reading reports and writing emails works well already. I believe the PC is just as tenacious as the x86 architecture, which is still backwards compatible over the last three decades.
Nov 10 2016
On Thursday, 10 November 2016 at 16:48:01 UTC, Nick Sabalausky wrote:I hope you're right, because I definitely need to use an "old-fashioned" machine in order to get things done without wasting enormous time & effort. [...]Sit on the bank of a river and wait: Your enemy's corpse will soon float by.
Nov 11 2016
On Friday, 11 November 2016 at 10:56:31 UTC, Chris wrote:On Thursday, 10 November 2016 at 16:48:01 UTC, Nick Sabalausky wrote:I remember about 8, 9 years ago I warned that Apple was paying too much attention to the iPhone and that it was neglecting the Desktop/Laptop users. Who cares, right? I read about a year ago that Apple had a problem and sales were dropping, because they were neglecting their Desktop/Laptop users. Well, what can I say. I stopped using Apple years ago, because they became the same as (or worse than) MS. This whole App-store lock-in, the whole you-have-to-register-or-die-approach. F**k you! Enter Linux.I hope you're right, because I definitely need to use an "old-fashioned" machine in order to get things done without wasting enormous time & effort. [...]Sit on the bank of a river and wait: Your enemy's corpse will soon float by.
Nov 11 2016
On Wednesday, 9 November 2016 at 16:00:45 UTC, Nick Sabalausky wrote:But it probably will take over anyway, because, let's face it, when the fuck has being complete and utter fucking shit ever stopped a computing tech from becoming a runaway success?: Windows, C++, Java2, Web-as-a-platform, JS/Ajax/Toolkit-overload, walled-garden services, zero-privacy private surveillance, non-tactile "touch"-screens, intrusive forced-update systems, removing-features-as-a-feature, widescreens for general computing purposes, iOS/Android "phones", etc. People are goddamned morons, and you know what? These motherfuckers DESERVE to have their privacy stolen, waste all their time futzing around with fucking broken software & devices, lose their data, and die while text-driving (or while letting GPS/Google drive their car, which mark my words, will be the next thing). I'd hope the whole fucking world burns, but it looks like I don't *need* to hope for that, it's pretty much guaranteed at this point anyway.I've adopted this philosophy: "Sit on the bank of a river and wait: Your enemy's corpse will soon float by." I just sit and wait, and indeed Ajax and other technologies have been floating by ... Stick to what you know is good and don't jump on the latest bandwagon that happens to pass by. In terms of technology, you will save a lot of time, money and energy (no, I never "learned" Ajax:) #ontwitterallday #shitdidntseethattruck #donttextanddrive
Nov 10 2016
On Wednesday, 9 November 2016 at 16:00:45 UTC, Nick Sabalausky wrote:On 11/05/2016 02:00 AM, Joakim wrote:Why? Note that it has been fleshed out more since that March article, as that was a developer preview build of Android 7.0.Nothing is ever "completely replaced"- somebody somewhere is still using a mainframe or a UNIX workstation- but yes, PCs will basically disappear, just as you never see those old computers anymore. Android 7.0 has a full multi-window mode, just dock your smartphone with a monitor and keyboard/mouse and start working: http://arstechnica.com/gadgets/2016/03/this-is-android-ns-freeform-window-mode/I'd hardly call that a "full" multi-window mode.I've been (begrudgingly) using android on a daily basis for years now, and honestly, OS support for freeform windows is the least of what Android needs to be worthwhile as a PC replacement. Hell, even on the desktop, I usually have everything maximized anyway. Even with freeform windows, I'd sooner use *Win8*, of all things, on the desktop than fucking android. There are four main areas that are currently making Android and absolute shit as a desktop replacement: The OS itselflinux is too stable for you? ;)the look&feel style guidelinesEh, Material Design is fine. It will evolve as people start using Android to get work done on larger LCD monitors.the entire 3rd party ecosystemWhile there are a lot of Android apps, I agree that there are problems with the Play Store and its ecosystem, but nothing that can't be fixed on an open platform, that has the possibility to install competing app stores too.and the form factor. Ie, basically everything.That's ridiculous. There's no difference between docking your laptop at a KVM at your work desk or docking a smartphone instead. The software on the smartphone is currently slower and doesn't have as many pro apps, but many will choose to use the smartphone they already have, rather than pay more for a desktop or laptop PC they don't need.And the idea that plugging in a mouse, keyboard and monitor fixes the form factor issues is ludicrous, because seriously, compare that to them already just *being* there as with a laptop (yea, laptop keyboards and trackpads suck, but not remotely as badly as what's built-in on a smartphone/tablet).Obviously if you're docked and using a mouse, keyboard, and monitor, there is no difference of form factor. On the go, you will have options like this, for just $100 more: https://the-superbook-turn-your-smartphone-into-a-laptop-f.backerkit.com/hosted_preorders I already have a smartphone. If I want a laptop, I'll pay $100 and dock it. If I want to work at a desk with a KVM switch, I have to buy the keyboard, mouse, and monitor regardless. That is a cost and ubiquity advantage that the PC cannot match. Point-and-shoot cameras and standalone GPS devices are still better than using your smartphone, but nobody buys them anymore because your smartphone is good enough and always with you.Or compared to, you know, not having to deal with a plugging in a hub, connecting all that shit, clumsily propping the stupid little thing up, OR bluetooth for that matter, because, let's be honest, fucking NOBODY other than us power users can figure out that obtuse "pairing" shit.With USB-C, you will simply plug your phone into a dock connected to a KVM switch and go, you know, what you do with your laptop if you use it docked now. 
Wireless protocols like Bluetooth and Miracast will eventually kill off all the wires, and be even easier.But it probably will take over anyway, because, let's face it, when the fuck has being complete and utter fucking shit ever stopped a computing tech from becoming a runaway success?: Windows, C++, Java2, Web-as-a-platform, JS/Ajax/Toolkit-overload, walled-garden services, zero-privacy private surveillance, non-tactile "touch"-screens, intrusive forced-update systems, removing-features-as-a-feature, widescreens for general computing purposes, iOS/Android "phones", etc.That's a very mixed bag. Yes, some of that tech was mediocre or a step backwards, but most people who were around in the golden age you admire get a lot more done on computing devices today. Computers have become so powerful that they can make up for a lot of those dumb decisions. We do need to weed out a lot of those mistakes over time though.People are goddamned morons, and you know what? These motherfuckers DESERVE to have their privacy stolen, waste all their time futzing around with fucking broken software & devices, lose their data, and die while text-driving (or while letting GPS/Google drive their car, which mark my words, will be the next thing). I'd hope the whole fucking world burns, but it looks like I don't *need* to hope for that, it's pretty much guaranteed at this point anyway.I agree with most of those criticisms, but that's straying far afield from the smartphone killing off the PC and the web, which you have also lumped in with your list of bad tech. So you like at least some of the changes mobile is bringing, even if you don't necessarily like some of the decisions made. But mobile is dominant now and is the future for getting work done too; a lot of those problems will be fixed over time. On Thursday, 10 November 2016 at 10:14:34 UTC, qznc wrote:I don't believe that. Software developers need a big machine, because these days you have to run a bunch of VMs to get anything done. Unless we migrate to Cloud-IDEs, we will use PCs in the foreseeable future and I don't see Cloud-IDEs happening.They do? http://bergie.iki.fi/blog/working-on-android/ http://decoding.io/using-ipad-pro-as-a-web-developer/ The vast majority of software developers don't have a "big machine," most don't do anything with VMs. For those who have multi-minute or more builds, we will use the cloud a lot more than we do today, as that's one of the few places it makes sense.Office Workers who are happy with MS Office alone could use Android. However, there is always this old internal app, which barely works on newer Windows versions. It will take a few decades until those are replaced.There will be emulators for those legacy apps until they're replaced, just as Apple provided for OS 9 when they switched to OS X.Executives could move to pure mobile and probably already did. Reading reports and writing emails works well already. I believe the PC is just as tenacious as the x86 architecture, which is still backwards compatible over the last three decades.Both are declining, as the Asymco link I gave above showed, and will likely collapse into irrelevance soon. On Thursday, 10 November 2016 at 10:45:26 UTC, Chris wrote:I've adopted this philosophy: "Sit on the bank of a river and wait: Your enemy's corpse will soon float by." I just sit and wait, and indeed Ajax and other technologies have been floating by ... Stick to what you know is good and don't jump on the latest bandwagon that happens to pass by. 
In terms of technology, you will save a lot of time, money and energy (no, I never "learned" Ajax:)

AJAX seemed promising when the alternative was desktop apps, but they went way overboard with HTML5 and mobile is displacing it.

On Thursday, 10 November 2016 at 12:37:59 UTC, Kagamin wrote:

On Sunday, 30 October 2016 at 06:39:42 UTC, Joakim wrote:

No, twitter is a small chunk and dropping, not to mention being available on the web too: http://www.cnbc.com/2016/06/06/people-are-spending-much-less-time-on-social-media-apps-said-report.html

You're right that people waste a lot more time with social media on mobile, but the point is that they're increasingly not using PCs for that, and PC sales are dropping as a result.

On Thursday, 10 November 2016 at 16:48:01 UTC, Nick Sabalausky wrote:

It is not worth it, the web is dying. I was stunned to see this chart of mobile web usage in the US: https://mobile.twitter.com/asymco/status/777915894659964928

They just spend increasingly more time in twitter when not at home.

I hope you're right, because I definitely need to use an "old-fashioned" machine in order to get things done without wasting enormous time & effort.

So I take it you're still running a UNIX workstation? Or is it a Data General minicomputer? ;)

But I've experienced this pattern far too many times to be confident in that:

- I use XYZ all the time, just like everyone else. It has a few things that could use improvement, and would be entirely feasible to fix, but for the most part works fine.

- Instead of XYZ's existing, easily solvable, problems actually BEING solved by those in a position to do so, somebody (maybe even the same people) comes out with UVW, with tons of fanfare because of one or two little things it does better. But, for the MOST part, UVW is total shit and vastly inferior to XYZ.

- UVW's hype begets more hype as people worldwide mistake hype (and "newness") for worthiness.

- Most people delude themselves into pretending UVW's downsides (compared to XYZ) don't exist, because after all, it's newer and has hype so therefore it's unquestionably better. Or, they just simply tolerate UVW's downsides, because, again, it's the hot new shit, so it MUST be the right tool for the job, right?

- Eventually, more and more people are forced to migrate from XYZ to UVW because of both market and industry pressures and because of XYZ becoming harder to obtain and getting less and less attention, shrinking ecosystem, etc.

- UVW marginalizes XYZ.

- A small faction of UVW users actually recognize UVW's downsides (or at least recognize there are "dinosaurs" who are holding out and must be converted for their own good). So the UVW folk make a half-assed amateur attempt to "fix" the downsides, only UVW still never actually reaches parity with XYZ. And why bother trying? XYZ's already been all but killed off.

I've seen it over, and over, and over. Unless people finally wise up and quit mistaking hype for worthiness (highly unlikely), I fear the same is poised to happen to PC's. I'm already forced to rely on these god-awful "modern" smartphones far more than I'd like to.

You do realize that the PC was once the "vastly inferior" alternative to UNIX workstations? :) I don't disagree that there's a hype cycle and that the new entrant is initially worse. There's a name for that, it's called disruption: https://en.wikipedia.org/wiki/Disruptive_innovation

Other than that, what you're describing is that the mass market doesn't care for niche features that early adopters want.
If you're in that niche, great, somebody will make an expensive, superior option for you. You expressed interest in the Ubuntu phone before, there will always be a niche that serves your needs better.

On Friday, 11 November 2016 at 11:15:10 UTC, Chris wrote:

On Friday, 11 November 2016 at 10:56:31 UTC, Chris wrote:

Read the Asymco Mac link I gave above, Mac sales and revenue have been inching up for years. Apple has around 25-30% profit share in the desktop/laptop market, despite selling many less devices, just as they just took 100% of the profits in the smartphone sector last quarter, despite selling only 15% of them: http://fortune.com/2016/11/04/apple-smartphone-profits/

They are not anywhere close to dying: they are the largest, most successful company on the planet. I agree that that cannot last, partially because they are so closed as you say, but so far it is one of the reasons for their success. Reports are that the recent Pro laptops, that some pros are complaining about, are selling better than ever. Most consumers want a dumbed-down, locked-down computing device. Apple goes too far in that direction, and that will eventually hurt them, but the overall trend is towards giving the masses what they want.

On Thursday, 10 November 2016 at 16:48:01 UTC, Nick Sabalausky wrote:

I remember about 8, 9 years ago I warned that Apple was paying too much attention to the iPhone and that it was neglecting the Desktop/Laptop users. Who cares, right? I read about a year ago that Apple had a problem and sales were dropping, because they were neglecting their Desktop/Laptop users. Well, what can I say. I stopped using Apple years ago, because they became the same as (or worse than) MS. This whole App-store lock-in, the whole you-have-to-register-or-die-approach. F**k you! Enter Linux.

I hope you're right, because I definitely need to use an "old-fashioned" machine in order to get things done without wasting enormous time & effort. [...]

Sit on the bank of a river and wait: Your enemy's corpse will soon float by.
Nov 11 2016
On Sunday, 30 October 2016 at 06:39:42 UTC, Joakim wrote:

It is not worth it, the web is dying. I was stunned to see this chart of mobile web usage in the US: https://mobile.twitter.com/asymco/status/777915894659964928

They just spend increasingly more time in twitter when not at home.
Nov 10 2016
From: Dicebot <public dicebot.lv>

On 10/30/2016 07:53 AM, Walter Bright wrote:

On 10/29/2016 10:30 PM, Dicebot wrote:

At the same time the intended wasm spec (https://github.com/WebAssembly/design) is much more simple than machine code for something like x86_64. If Walter gets interested, that may be a feasible path :)

I looked at it for 5 minutes :-) and it looks like typical intermediate code that a compiler might generate. It wouldn't be hard to translate the intermediate code generated for the dmd back end to wasm. What I didn't see was any mention of symbolic debug information, linking, or how to connect to system services like mutexes, I/O, etc. A time risk would also be that wasm is an incomplete, moving target.

Well, "risk" and "opportunity" are pretty much synonymous in such a context :) Whoever comes first with an easy-to-use toolchain for a new platform gets all the hype - it pretty much boils down to deciding whether one believes the new platform is likely to succeed. For now I am just keeping track of relevant information to see where it all goes.

Looks like a fun project, but I don't see how I could split off time to work on it.

No argument here, it would be premature to pay much attention to it anyway. I will probably remind you of this topic some time next year with more information available so that a more weighted judgment can be made.
Oct 31 2016
On Sunday, 30 October 2016 at 05:30:04 UTC, Dicebot wrote:

On Saturday, 29 October 2016 at 21:46:37 UTC, Laeeth Isharc wrote:

The existing pipeline is strung together with gaffer tape and string, so it's hardly there yet - and add to that, last I looked, when you turned on O2 with emscripten it didn't always go well. But what I meant was LLVM will have a wasm backend. So on the basis of my limited understanding, it would be some work to make LDC produce wasm code, and then runtime and phobos would need work.

Adam Ruppe of course had something like this working with plain javascript and dmd about four years back, including basic D wrapping of the DOM etc. and extern(js). But the compiler has diverged a bit from that version used, and I guess at the time there wasn't the interest or manpower to turn that experiment/prototype into something one could depend on. But maybe someone would pick it up now that more people are starting to be involved, given that Walter has higher priority things to do.

And wouldn't the changes to runtime and phobos be quite similar for both dmd and ldc? I don't see how the work flow would be any different as a language user whether you used an LDC with a wasm back end, or dmd with a similar one.

Joakim - native on mobile is so much better (setting aside having to deal with Apple or Google) but I guess the browser isn't going away on the desktop for a while yet.

Any thoughts on how much work is involved to port the runtime? And what other changes might be involved? The chap that used the C backend for LLVM wrote a little mini runtime but I guess didn't have to worry about the version blocks in the compiler front end as much. (I don't recall what architecture he pretended to be compiling to.) Glibc has obviously already been ported to run in the browser by emscripten.

I have actually meant something quite different - implementing a new backend for DMD which emits wasm directly (possibly by embedding binaryen). That is more work than simply using the LLVM stack as-is but would result in a unique marketing advantage - the existing pipeline of C -> Emscripten -> asm.js -> asm2wasm is rather annoying. And the native wasm backend for LLVM has been in development for quite a while now with no clear ETA. At the same time the intended wasm spec (https://github.com/WebAssembly/design) is much more simple than machine code for something like x86_64. If Walter gets interested, that may be a feasible path :)
Oct 30 2016
From: Dicebot <public dicebot.lv>

On 10/30/2016 06:35 PM, Laeeth Isharc wrote:

But what I meant was LLVM will have a wasm backend.

Yes, but it is developed so slowly and conservatively that coming up with our own proof-of-concept backend may be a chance to win early interest. They may speed up greatly though when the WebAssembly design gets closer to the MVP stage, but I am checking that regularly.

So on the basis of my limited understanding, it would be some work to make LDC produce wasm code, and then runtime and phobos would need work.

Most likely you would need a quite different runtime, as most system/libc level stuff will not be available in the browser sandbox but the very same browser APIs will need to be exposed instead. Most likely whatever emscripten does for C should be fairly adaptable even outside of the LLVM stack.

But right now it is mostly irrelevant because runtime requirements have not been defined in WebAssembly at all, only low level byte code stuff. It is all in very early stages really.
Oct 31 2016
On Monday, 31 October 2016 at 09:52:55 UTC, Dicebot wrote:

On 10/30/2016 06:35 PM, Laeeth Isharc wrote:

I was disappointed that after early hype it all went quiet for now. I guess there is a window to grab attention, but a language like Nim already has a JS backend (so they say - I haven't used it), and I think D is in a different space from languages where how it plays on hacker news is most important. Because if you have a wasm backend today for dmd, I guess it will be some time before it's fast, stable, and has the basic stuff already in emscripten (though maybe with the Remedy work it gets easier to wrap that). So given limited manpower maybe the easier job (presumably most work needs to be done on phobos and the runtime?) makes sense because in this case it's also likely the long-term sensible way. It would be great if we had a dmd back end early of course, and I would certainly use it as soon as it was stable enough.

But what I meant was LLVM will have a wasm backend.

Yes, but it is developed so slowly and conservatively that coming up with our own proof-of-concept backend may be a chance to win early interest. They may speed up greatly though when the WebAssembly design gets closer to the MVP stage, but I am checking that regularly.

But right now it is mostly irrelevant because runtime requirements have not been defined in WebAssembly at all, only low level byte code stuff. It is all in very early stages really.

You have looked into it more than me at a low level, but how is it possible then to run an app today in a nightly browser in wasm? Is it like saying you don't need glibc, but it's probably a better idea to use one written for your platform rather than have some combination of assembly and a hacked up library designed for another architecture and platform (which is what I guess emscripten does)? Taking a step back, it's quite amusing how much ingenuity goes into having to avoid writing Javascript...
Oct 31 2016
On Tuesday, 1 November 2016 at 06:04:41 UTC, Laeeth Isharc wrote:

On Monday, 31 October 2016 at 09:52:55 UTC, Dicebot wrote:

Actually, they have moved on with browser previews and a dedicated web site just as we were talking: http://webassembly.org

I am curious how much attention it will gather on e.g. reddit

On 10/30/2016 06:35 PM, Laeeth Isharc wrote:

I was disappointed that after early hype it all went quiet for now.

But what I meant was LLVM will have a wasm backend.

Yes, but it is developed so slowly and conservatively that coming up with our own proof-of-concept backend may be a chance to win early interest. They may speed up greatly though when the WebAssembly design gets closer to the MVP stage, but I am checking that regularly.

How it works right now is that you can use WebAssembly byte-code to write a JavaScript-exposed function to be called from plain JavaScript. See http://webassembly.org/docs/js . Thus most of the runtime level stuff is actually done by JS. There are further plans to expose the browser GC inside WebAssembly, allow direct usage of browser APIs like the DOM, and generally make it possible to use it with no JavaScript at all - but those are all out of scope for the MVP milestone.

But right now it is mostly irrelevant because runtime requirements have not been defined in WebAssembly at all, only low level byte code stuff. It is all in very early stages really.

You have looked into it more than me at a low level, but how is it possible then to run an app today in a nightly browser in wasm?

Taking a step back, it's quite amusing how much ingenuity goes into having to avoid writing Javascript...

Judging by the focus points in public statements, one can reason that the actual main driving goal of the WebAssembly developers is not to throw away JavaScript but to make existing JavaScript/asm.js code more efficient (by pre-compiling it on the server and distributing it in binary form). At the same time such an initiative is of great value to a company like Google, which has lost the "better JS" competition (remember the Dart language?) to Microsoft's TypeScript. Having common ground in the form of byte code supported by all browsers will in some way reset the competition again.
Nov 01 2016
On Tuesday, 25 October 2016 at 22:53:54 UTC, Walter Bright wrote:

It's a small bit, but the idea here is to eliminate if conditionals where possible: https://medium.com/

Dunno, I wouldn't expect an edge case to fall into the common flow of the code.
Oct 27 2016
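For readers who have not seen the article, here is a minimal D sketch of the list-removal technique being discussed (the names Node, removeWithIf and removeNoIf are illustrative, not from any of the posts, and like Linus' original it assumes the target is actually in the list). The first version special-cases the head; the second walks a pointer to the link itself, so the edge case and the if both disappear.

struct Node
{
    int value;
    Node* next;
}

// Straightforward version: the head must be handled separately,
// so the edge case sits in the common flow of the code.
void removeWithIf(ref Node* head, Node* target)
{
    Node* prev = null;
    Node* walk = head;
    while (walk !is target)
    {
        prev = walk;
        walk = walk.next;
    }
    if (prev is null)
        head = target.next;      // removing the head
    else
        prev.next = target.next; // removing any other node
}

// "Good taste" version: walk a pointer to the link being examined,
// so the head pointer and every next pointer are updated through
// the same line and the if disappears.
void removeNoIf(ref Node* head, Node* target)
{
    Node** link = &head;
    while (*link !is target)
        link = &(*link).next;
    *link = target.next;
}

unittest
{
    auto c = Node(3, null);
    auto b = Node(2, &c);
    auto a = Node(1, &b);
    Node* head = &a;
    removeNoIf(head, &a); // removing the head needs no special case
    assert(head is &b);
}

Both do the same work; the second simply has one less branch to reason about, which is the whole point of the article.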
On Tuesday, 25 October 2016 at 22:53:54 UTC, Walter Bright wrote:

Eliminating loops is something D adds, and goes even further to making code a straight line.

A problem for myself and probably many programmers is that some of the tricks like what Linus did simply don't come to mind, because it's not something we've seen before so we can't model after it, and also that you sort of follow the same logic of how you'd resolve it because that's how you were taught to resolve it or it's how you know to resolve it. Often when I go for code I start with commenting the steps/breakdown, then I write each section; refactoring, cleaning up and shortening naturally come last, but once code is working it doesn't usually change too much even if there can be improvement.

Hmm, what we really need is something like a good 100 examples of good 'tasty' code, good code in a variety of contexts, something digestible and even explained down for those not following the full logic of how/why it works, perhaps even bad examples to work up from.
Oct 28 2016
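To make the "straight line" point above concrete, here is a small, made-up D example (my own illustration, not from the thread): the same computation written first as an explicit loop with a conditional inside it, then as a single range pipeline where the loop and the if are folded into filter and map.

import std.algorithm : filter, map, sum;
import std.range : iota;

// Explicit loop: a mutable accumulator plus a branch in the loop body.
int sumOfEvenSquaresLoop(int n)
{
    int total = 0;
    foreach (i; 0 .. n)
    {
        if (i % 2 == 0)
            total += i * i;
    }
    return total;
}

// Straight-line version: no explicit loop, no if, just a pipeline.
int sumOfEvenSquares(int n)
{
    return iota(n)
        .filter!(i => i % 2 == 0)
        .map!(i => i * i)
        .sum;
}

unittest
{
    assert(sumOfEvenSquaresLoop(10) == sumOfEvenSquares(10));
    assert(sumOfEvenSquares(10) == 120); // 0 + 4 + 16 + 36 + 64
}

A collection of small before/after pairs like these, with the reasoning spelled out, would be exactly the kind of "100 examples of tasty code" asked for above.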