
digitalmars.D - Scala future, Sing#

bearophile <bearophileHUGS lycos.com> writes:

Scala is one of the languages to be followed more, because they share some of
future purposes of D2/D3.

A small presentation about the close future of Scala (it's not a general
introduction to Scala):
"Scala -- The Next 5 Years" by Martin Odersky:
http://www.scala-lang.org/sites/default/files/odersky/scalaliftoff2009.pdf




Sing# (and the Spec# it is based on) seems worth following too: its compiler
contains a kind of inference engine that probably processes/uses contracts and
class invariants quite better than D2 (probably in a similar way to Eiffel):
http://channel9.msdn.com/wiki/specsharp/specsharpobjectprimer

Bye,
bearophile
Aug 22 2009
Jari-Matti Mäkelä <jmjmak utu.fi.invalid> writes:
bearophile wrote:


 one of the languages to be followed more, because they share some of
 future purposes of D2/D3.
 
 A small presentation about the close future of Scala (it's not a general
 introduction to Scala): "Scala -- The Next 5 Years" by Martin Odersky:
 http://www.scala-lang.org/sites/default/files/odersky/scalaliftoff2009.pdf
Scala is an impressive language and overall well designed. There are
certainly truckloads of features that could be taken from Scala to D. But
I'm afraid that the desire to have familiarity and compatibility with the
C/C++ family is more important in this community than cool new functional
features. Here's a quick comparison of some factors:

- community: From what I've gathered, the Scala community mostly consists
of much more experienced computer scientists and programmers (I don't mean
industrial boiler-plate experience but experience with different kinds of
languages, PL concepts and e.g. sound argumentation). These programmers
aren't afraid of radical new ideas if it helps every day coding. These
guys hate unorthogonality and love rigorous definition of semantics. They
also want to discuss language issues and unlike Walter, Odersky doesn't
lurk silently when important things are being discussed. This is a huge ++
to the PR. He also welcomes academics and doesn't ask them to go back to
their ivory tower like most in D's community do. I understand embracing
the industry, too, but it hasn't brought much money to D's development yet.

- bloat: Scala is more lightweight. I've heard Walter say that he doesn't
like e.g. library defined control structures - it's double-edged sword,
and D and Scala have taken different paths here (in D if something is
commonly used and it can be made built-in, it will be added to the
compiler, in Scala it's the opposite). Scala has a very lightweight
language core, and many additional features are defined in libraries.
Several optimizations that improve the performance of HOFs are already
known, but the compiler and virtual machine are not yet as good as they
can be. In theory a good supercompiler can make Scala as fast as D. I
personally find it funny that the meta-programming features in D are
perfect for shrinking the language core, but every year new features still
keep creeping in.

- dynamics: Scala is more dynamic (reflection, class loaders etc.), thanks
to the JVM.

- OOP: Scala supports dynamic OOP optimizations unlike D (unless a VM is
used), thanks to the JIT compiler. The new 2.8 supports new static
optimizations similar to what C++ & D have had.

- syntax: Scala has a consistent and orthogonal syntax. Syntactic sugar is
used sparingly and when it's being used, it shaves off boilerplate quite a
bit.
  * e.g. (_._2 * _._1) is something like (tuple a, tuple b) { return a(1)
* b(0); } in D.. I leave the definition of the tuple type as an exercise
to the reader.
  * (A => B) => (C => D) vs (B function(A)) function (D function(C))
  * case class foo(val a: Int, var b: String) is somewhere between 10-30
LOC in D
  * In D syntactic sugar often saves only a couple of characters (like the
new template T!X syntax)

- modularity & types: Scala supports modularity much better IMO (pure OOP,
self types etc.). The abstractions are well suited for most tasks. But
this is a bit hard to compare objectively.

- high level features: Scala unifies OOP and FP. It also has novel new OOP
concepts.

- low level features: D wins here, OTOH a low level language isn't good
for e.g. sandboxed environments.

- memory management: the JVM's GC is pretty mature, but of course manual
memory management isn't as easy as in D.

- compatibility: D wins (?) here if C/C++ compatibility is important, but
Scala is compatible with the large existing code base of Java, though.

- bugs: IMHO the unspecified parts of D and the huge amount of bugs made
it unusable for me. Luckily I found Scala and have been really happy with
it. I've only found maybe 1-2 bugs in it during the last 1.5 years. I
usually find about 5-10 bugs in DMD in 15 minutes after coming back to D.
And I'm also so happy to find that thanks to the authors' knowledge of
type systems (dependent types, HM, System F, etc.) Scala is a genius at
inferring types. D doesn't really have a clue. Especially the array
literal type inference is really naive.

- to summarize: I use Scala for high level tasks, and came back to D when
I need to see the actual machine code and optimize some tight inner loop.
D is sometimes more suitable for this than C/C++ since it has a bit saner
syntax and high level abstractions. But in general I nowadays write 90% of
my code in Scala. I'm much happier and more productive writing Scala. YMMV
Aug 24 2009
bearophile <bearophileHUGS lycos.com> writes:
Jari-Matti M.:

There are certainly truckloads of features that could be taken from Scala to D.<
But D2 is already quite complex, so it's better to add things carefully. For example pattern matching is useful, but it adds a lot of complexity too.
But I'm afraid that the desire to have familiarity and compatibility with the
C/C++ family is more important in this community than cool new functional
features.<
And this can be a good thing, because C and C++ are commonly used languages.
- community: From what I've gathered, the Scala community mostly consists of
much more experienced computer scientists and programmers<

This is an advantage for D: it can be used by more ignorant people too (or
people more ignorant of functional languages).
- bloat: Scala is more lightweight.<
This is a matter of balance, and there are no 'best' solutions. Moving things from the library to the language has some advantages.
Several optimizations that improve the performance of HOFs are already known,
but the compiler and virtual machine are not yet as good as they can be. In
theory a good supercompiler can make Scala as fast as D.<
The HotSpot Java GC is much more efficient than the current D GC, and
HotSpot is often able to inline virtual methods. D has the advantage of
having a simpler core. Creating a Scala compiler on LLVM may be hard;
running D1 on LLVM was easy enough. Simpler systems have some advantages.

Scala type inference is much more powerful but it's also harder to use (if
you want to do complex things), and requires a more complex compiler. In
practice supercompilers are very hard to create, while D1 code running on
LLVM is already very efficient. Simpler systems also have the advantage of
being more transparent: understanding why some D1 code is fast or slow is
probably simpler than doing the same thing with a piece of Scala code.
I personally find it funny that the meta-programming features in D are perfect
for shrinking the language core, but every year new features still keep
creeping in.<
They are not perfect, they have limits, and the results aren't always nice; see the struct bitfields.
- dynamics: Scala is more dynamic (reflection, class loaders etc.) Thanks to
JVM.<
Some of such things can be added/improved in D too.
- syntax: Scala has a consistent and orthogonal syntax.<
Too much orthogonality is bad, it produces the LEGO disease. A compromise is better.
Syntactic sugar is used sparingly and when it's being used, it shaves off
boilerplate quite a bit.   * e.g. (_._2 * _._1)<
Here there are risks too. I have seen a syntax for a fold (reduce) in Scala that's horribly unreadable. Python3 has even removed reduce() from the core language (moving it to the functools module) because folds aren't easy to understand, and I agree with their decision. Keeping the language easy is more important. Too much boilerplate is boring, but boilerplate is better than hard to understand code.
* In D syntactic sugar often saves only a couple of characters (like the new
template T!X syntax)<
I agree that was a silly idea, one that I may even like to remove from D2. I want eager/lazy sequence comprehensions :-)
- memory management: the JVM's GC is pretty mature, but of course manual memory
management isn't as easy as in D<
Some forms of memory usage that I use in D are impossible on the JavaVM.
- compatibility: D wins (?) here if C/C++ compatibility is important, but Scala
is compatible with the large existing code base of Java, though<
D refuses to be compatible with C++. There's just partial compatibility (and this is probably good).
the huge amount of bugs made it unusable for me.<
There are many bugs in D, but D is slowly opening up more and more toward the community, for example see the recent lot of bug fixes by Don. If more people work like Don, lots of bugs will be removed. Walter is slowly understanding what open source development means. So I still have hope.
And I'm also so happy to find that thanks to authors' knowledge of type systems
(dependent types, HM, System F, etc.) Scala is a genius at inferring types. D
doesn't really have a clue.<
Scala uses a totally different type system. I think it uses a Hindley-Milner type inference algorithm. Walter is probably not an expert on such things. A single person can't be an expert on everything. Today designing concurrency, type inference or garbage collectors requires a lot of specialized and even some academic knowledge. Scala's author has used the JavaVM to avoid doing a lot of low-level work. D's type system isn't so bad. It has limits, and some of such limits may be lifted a little, but you can do a lot of things with the D language anyway, see some of the things I've done in my dlibs. Keeping the type system simpler has some advantages too, for example compilation speed.
Especially the array literal type inference is really naive.<
I'm sure it's not hard to fix the array literal type inference that currently is not good; it's just that Walter isn't interested in doing it, or he thinks things are good like this, like for the half-unfinished module system.
- to summarize: I use Scala for high level tasks, and came back to D when I
need to see the actual machine code and optimize some tight inner loop. D is
sometimes more suitable for this than C/C++ since it has a bit saner syntax and
high level abstractions.<
Today when you need high performance you need the GPU or to use SSE registers very well. A usage example of the GPU:
http://www.smartwikisearch.com/algorithm.html

In the end if you need really fast programs that have to perform heavy numerical computations you need languages like Python (plus the right libs, like CorePy); D (and Scala) isn't up to the task yet:
http://www.corepy.org/
http://mathema.tician.de/software/pycuda
http://python.sys-con.com/node/926439
http://pypi.python.org/pypi/python-opencl/0.2

Bye,
bearophile
Aug 24 2009
prev sibling next sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Jari-Matti Mäkelä wrote:
 bearophile wrote:
 

 one of the languages to be followed more, because they share some of
 future purposes of D2/D3.

 A small presentation about the close future of Scala (it's not a general
 introduction to Scala): "Scala -- The Next 5 Years" by Martin Odersky:
 http://www.scala-lang.org/sites/default/files/odersky/scalaliftoff2009.pdf
 Scala is an impressive language and overall well designed. There are
 certainly truckloads of features that could be taken from Scala to D. But
 I'm afraid that the desire to have familiarity and compatibility with the
 C/C++ family is more important in this community than cool new functional
 features. Here's a quick comparison of some factors:

 - community: From what I've gathered, the Scala community mostly consists
 of much more experienced computer scientists and programmers (I don't mean
 industrial boiler-plate experience but experience with different kinds of
 languages, PL concepts and e.g. sound argumentation). These programmers
 aren't afraid of radical new ideas if it helps every day coding. These
 guys hate unorthogonality and love rigorous definition of semantics. They
 also want to discuss language issues and unlike Walter, Odersky doesn't
 lurk silently when important things are being discussed. This is a huge ++
 to the PR. He also welcomes academics and doesn't ask them to go back to
 their ivory tower like most in D's community do. I understand embracing
 the industry, too, but it hasn't brought much money to D's development
 yet.

 - bloat: Scala is more lightweight. I've heard Walter say that he doesn't
 like e.g. library defined control structures -
Actually, you can do them with "lazy" function arguments. There was an example somewhere of doing control structures with it.
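A minimal sketch of such a lazy-argument control structure, modeled on the
dotimes example in the D function documentation (the usage here is
illustrative, not code from this thread):

import std.stdio;

// A user-defined control structure: 'exp' is a lazy argument, so it is
// evaluated anew each time it is referenced inside the function.
void dotimes(int count, lazy void exp)
{
    for (int i = 0; i < count; i++)
        exp();
}

void main()
{
    int x = 0;
    dotimes(3, writefln("x = %s", ++x));  // prints x = 1, x = 2, x = 3
}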
 it's double-edged sword, and 
 D and Scala have taken different paths here (in D if something is commonly 
 used and it can be made built-in, it will be added to the compiler, in Scala 
 it's the opposite).
That's not quite right. I'll add things to the core if there is a good reason to - the compiler can do things a library cannot. For example, string literals.
 Scala has a very lightweight language core, and many 
 additional features are defined in libraries. Several optimizations that 
 improve the performance of HOFs are already known, but the compiler and 
 virtual machine are not yet as good as they can be. In theory a good 
 supercompiler can make Scala as fast as D.
I've been hearing that (about Java, same problem) for as long as Java has been around. It might get there yet, but that won't be in the near future.
 I personally find it funny that 
 the meta-programming features in D are perfect for shrinking the language 
 core, but every year new features still keep creeping in.
Actually, some features are being removed. Imaginary and complex variables, for one. There's some work being done to rewrite D forms into simpler D forms, saving hundreds of lines of code in the compiler.
 - dynamics: Scala is more dynamic (reflection, class loaders etc.) Thanks to 
 JVM.
Yes, but an interpreter or JIT is required to make that work. That makes the language binary not lightweight.
 - OOP: Scala supports dynamic OOP optimizations unlike D (unless a VM is 
 used).
Do you mean knowing a class or virtual method has no descendants? Sure, you need to know the whole program to do that, or just declare it as final.
 Thanks to the JIT compiler. The new 2.8 supports new static 
 optimizations similar to what C++ & D have had.
 
 - syntax: Scala has a consistent and orthogonal syntax. Syntactic sugar is 
 used sparingly and when it's being used, it shaves off boilerplate quite a 
 bit.
   * e.g. (_._2 * _._1) is something like (tuple a, tuple b) { return a(1) * 
 b(0); } in D.. I leave the definition of the tuple type as an exercise to 
 the reader. 
   * (A => B) => (C => D) vs (B function(A)) function (D function(C))
   * case class foo(val a: Int, var b: String) is somewhere between 10-30 LOC 
 in D
   * In D syntactic sugar often saves only a couple of characters (like the 
 new template T!X syntax)
 
 - modularity & types: Scala supports modularity much better IMO (pure OOP, 
 self types etc.). The abstractions are well suited for most tasks. But this 
 is a bit hard to compare objectively.
 
 - high level features: Scala unifies OOP and FP. It also has novel new OOP 
 concepts.
 
 - low level features: D wins here, OTOH a low level language isn't good for 
 e.g. sandboxed environments
Sure, but there's the Safe D subset, and also D isn't intended for non-programmers to download untrusted source code from the internet and run.
 - memory management: the JVM's GC is pretty mature, but of course manual 
 memory management isn't as easy as in D
 
 - compatibility: D wins (?) here if C/C++ compatibility is important, but 
 Scala is compatible with the large existing code base of Java, though
You can mechanically translate Java to D, but it still requires some manual touch-up.
 - bugs: IMHO the unspecified parts of D and the huge amount of bugs made it 
 unusable for me. Luckily I found Scala and have been really happy with it. 
 I've only found maybe 1-2 bugs in it during the last 1.5 years. I usually 
 find about 5-10 bugs in DMD in 15 minutes after coming back to D.
I couldn't find any bugs you've submitted to the D bugzilla. If you don't submit them, they won't get fixed <g>.
 And I'm 
 also so happy to find that thanks to authors' knowledge of type systems 
 (dependent types, HM, System F, etc.) Scala is a genius at inferring types. 
 D doesn't really have a clue.
Can you give an example?
 Especially the array literal type inference is really naive.
How should it be done?
 - to summarize: I use Scala for high level tasks, and came back to D when I 
 need to see the actual machine code and optimize some tight inner loop. D is 
 sometimes more suitable for this than C/C++ since it has a bit saner syntax 
 and high level abstractions. But in general I nowadays write 90% of my code 
 in Scala. I'm much happier and more productive writing Scala. YMMV
I appreciate you taking the time to tell us your impressions on this.
Aug 24 2009
Ary Borenszweig <ary esperanto.org.ar> writes:
Walter Bright wrote:
 Jari-Matti Mäkelä wrote:
 bearophile wrote:
 - OOP: Scala supports dynamic OOP optimizations unlike D (unless a VM 
 is used).
Do you mean knowing a class or virtual method has no descendants? Sure, you need to know the whole program to do that, or just declare it as final.
I think the standard name is "adaptive optimization":

http://en.wikipedia.org/wiki/Adaptive_optimization

"Adaptive optimization is a technique in computer science that performs
dynamic recompilation of portions of a program based on the current
execution profile."

"Consider a hypothetical banking application that handles transactions one
after another. These transactions may be checks, deposits, and a large
number of more obscure transactions. When the program executes, the actual
data may consist of clearing tens of thousands of checks without
processing a single deposit and without processing a single check with a
fraudulent account number. An adaptive optimizer would compile assembly
code to optimize for this common case. If the system then started
processing tens of thousands of deposits instead, the adaptive optimizer
would recompile the assembly code to optimize the new common case. This
optimization may include inlining code or moving error processing code to
secondary cache."
Aug 24 2009
Walter Bright <newshound1 digitalmars.com> writes:
Ary Borenszweig wrote:
 Walter Bright wrote:
 Jari-Matti Mäkelä wrote:
 bearophile wrote:
 - OOP: Scala supports dynamic OOP optimizations unlike D (unless a VM 
 is used).
Do you mean knowing a class or virtual method has no descendants? Sure, you need to know the whole program to do that, or just declare it as final.
 I think the standard name is "adaptive optimization":

 http://en.wikipedia.org/wiki/Adaptive_optimization

 "Adaptive optimization is a technique in computer science that performs
 dynamic recompilation of portions of a program based on the current
 execution profile."

 "Consider a hypothetical banking application that handles transactions one
 after another. These transactions may be checks, deposits, and a large
 number of more obscure transactions. When the program executes, the actual
 data may consist of clearing tens of thousands of checks without
 processing a single deposit and without processing a single check with a
 fraudulent account number. An adaptive optimizer would compile assembly
 code to optimize for this common case. If the system then started
 processing tens of thousands of deposits instead, the adaptive optimizer
 would recompile the assembly code to optimize the new common case. This
 optimization may include inlining code or moving error processing code to
 secondary cache."
It's also called profile guided optimization, but Jari-Matti said it was "OOP" related, so I wondered how that fit in.
Aug 24 2009
Jari-Matti Mäkelä <jmjmak utu.fi.invalid> writes:
Walter Bright wrote:

 Ary Borenszweig wrote:
 Walter Bright wrote:
 Jari-Matti Mäkelä wrote:
 bearophile wrote:
 - OOP: Scala supports dynamic OOP optimizations unlike D (unless a VM
 is used).
Do you mean knowing a class or virtual method has no descendants? Sure, you need to know the whole program to do that, or just declare it as final.
 I think the standard name is "adaptive optimization":

 http://en.wikipedia.org/wiki/Adaptive_optimization

 "Adaptive optimization is a technique in computer science that performs
 dynamic recompilation of portions of a program based on the current
 execution profile."

 "Consider a hypothetical banking application that handles transactions one
 after another. These transactions may be checks, deposits, and a large
 number of more obscure transactions. When the program executes, the actual
 data may consist of clearing tens of thousands of checks without
 processing a single deposit and without processing a single check with a
 fraudulent account number. An adaptive optimizer would compile assembly
 code to optimize for this common case. If the system then started
 processing tens of thousands of deposits instead, the adaptive optimizer
 would recompile the assembly code to optimize the new common case. This
 optimization may include inlining code or moving error processing code to
 secondary cache."
It's also called profile guided optimization, but Jari-Matti said it was "OOP" related, so I wondered how that fit in.
I meant this:

"Another important example of this kind of optimization is
class-hierarchy-based optimization. A virtual method invocation, for
example, involves looking at the class of the receiver object for the call
to discover which actual target implements the virtual method for the
receiver object. Research has shown that most virtual invocations have
only a single target for all receiver objects, and JIT compilers can
generate more-efficient code for a direct call than for a virtual
invocation. By analyzing the class hierarchy's state when the code is
compiled, the JIT compiler can find the single target method for a virtual
invocation and generate code that directly calls the target method rather
than performing the slower virtual invocation. Of course, if the class
hierarchy changes and a second target method becomes possible, then the
JIT compiler can correct the originally generated code so that the virtual
invocation is performed. In practice, these corrections are rarely
required. Again, the potential need to make such corrections makes
performing this optimization statically troublesome."

http://www.ibm.com/developerworks/java/library/j-rtj2/index.html
Aug 25 2009
Walter Bright <newshound1 digitalmars.com> writes:
Jari-Matti Mäkelä wrote:
 I meant this
 
 "Another important example of this kind of optimization is class-hierarchy-
 based optimization. A virtual method invocation, for example, involves 
 looking at the class of the receiver object for the call to discover which 
 actual target implements the virtual method for the receiver object. 
 Research has shown that most virtual invocations have only a single target 
 for all receiver objects, and JIT compilers can generate more-efficient code 
 for a direct call than for a virtual invocation. By analyzing the class 
 hierarchy's state when the code is compiled, the JIT compiler can find the 
 single target method for a virtual invocation and generate code that 
 directly calls the target method rather than performing the slower virtual 
 invocation. Of course, if the class hierarchy changes and a second target 
 method becomes possible, then the JIT compiler can correct the originally 
 generated code so that the virtual invocation is performed. In practice, 
 these corrections are rarely required. Again, the potential need to make 
 such corrections makes performing this optimization statically troublesome."
 
 http://www.ibm.com/developerworks/java/library/j-rtj2/index.html
I'm not quite sure what that means, but I think it means nothing more than
noting that a method is not overridden, and so can be called directly.

Currently, this optimization happens in D if a method or class is
annotated with 'final'.

It is possible for the compiler to do this if flow analysis can prove the
direct type of a class reference, but the optimizer currently does not do
that. It is also possible for the compiler to determine that methods are
final automatically if it knows about all the modules that import a
particular class.
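A minimal sketch of the 'final' case described above (illustrative code,
not from the thread):

class C
{
    void f() { }        // virtual by default: called through the vtable
    final void g() { }  // cannot be overridden, so the compiler may call
                        // (and even inline) it directly
}

void test(C c)
{
    c.f();  // indirect call through c's vtable
    c.g();  // direct call, no vtable lookup needed
}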
Aug 25 2009
Jari-Matti Mäkelä <jmjmak utu.fi.invalid> writes:
Walter Bright wrote:

 Jari-Matti Mäkelä wrote:
 - bloat: Scala is more lightweight. I've heard Walter say that he doesn't
 like e.g. library defined control structures -
Actually, you can do them with "lazy" function arguments. There was an example somewhere of doing control structures with it.
Agreed, you /can/ do something similar. But in Scala it's the standard way
of doing things. If you compare the grammars of both languages, you'll see
that Scala is a bit lighter than D (see
http://www.scala-lang.org/sites/default/files/linuxsoft_archives/docu/files/ScalaReference.pdf)
 
 it's double-edged sword, and
 D and Scala have taken different paths here (in D if something is
 commonly used and it can be made built-in, it will be added to the
 compiler, in Scala it's the opposite).
That's not quite right. I'll add things to the core if there is a good reason to - the compiler can do things a library cannot. For example, string literals.
I exaggerated a bit. But there are some constructs that some think should not be there, e.g. foreach_reverse.
 - bugs: IMHO the unspecified parts of D and the huge amount of bugs made
 it unusable for me. Luckily I found Scala and have been really happy with
 it. I've only found maybe 1-2 bugs in it during the last 1.5 years. I
 usually find about 5-10 bugs in DMD in 15 minutes after coming back to D.
I couldn't find any bugs you've submitted to the D bugzilla. If you don't submit them, they won't get fixed <g>.
I've submitted a couple of reports years ago and luckily some of them have already been fixed. Maybe the search is broken.
 And I'm
 also so happy to find that thanks to authors' knowledge of type systems
 (dependent types, HM, System F, etc.) Scala is a genius at inferring
 types. D doesn't really have a clue.
Can you give an example?
http://d.puremagic.com/issues/show_bug.cgi?id=3042

auto foo = [ 1, 2L ];           // typeof == int[2u]
auto foo = [ 2L, 1 ];           // typeof == long[2u]
auto foo = [ "a", "abcdefgh" ]; // typeof == char[1u][2u] in D1
auto foo = [ [], [1,2,3] ];     // doesn't even compile
 
 Especially the array literal type inference is really naive.
How should it be done?
You shouldn't use the type of the first given element when constructing
the type of the array. If you have [ e_1, ..., e_n ], the type of the
literal is unify(type_of_e_1, ..., type_of_e_n) + "[]". For instance:

=> typeof([ [], [1,2,3] ])
=> unify( typeof([]), typeof([1,2,3]) ) + "[]"
=> unify( "a[]", unify(typeof(1),typeof(2),typeof(3)) + "[]" ) + "[]"
=> unify( "a[]", unify("int","int","int") + "[]" ) + "[]"
=> unify( "a[]", "int" + "[]" ) + "[]"
=> unify( "a[]", "int[]" ) + "[]"  // a is a local type var, subst = { a -> int }
=> "int[]" + "[]"
=> "int[][]"
Aug 25 2009
Walter Bright <newshound1 digitalmars.com> writes:
Jari-Matti Mäkelä wrote:
 Agreed, you /can/ do something similar. But in Scala it's the standard way 
 of doing things. If you compare the grammars of both languages, you'll see 
 that Scala is a bit lighter than D (see
 http://www.scala-lang.org/sites/default/files/linuxsoft_archives/docu/files/ScalaReference.pdf)
Yes, it's a bit different philosophy. I like the advantage of built in
control structures, because that leads to commonality between code bases. I
wished to avoid things like C++ string, where everyone invents their own
string class that's incompatible with everyone else's. Whether that latter
effect occurs in Scala or not, I don't know.
 I exaggerated a bit. But there are some constructs that some think should 
 not be there, e.g. foreach_reverse.
Perhaps, but you can just ignore it.
 I've submitted couple of reports years ago and luckily some of them have 
 already been fixed. Maybe the search is broken.
Can you provide bug numbers?
 Scala is a genius at inferring
 types. D doesn't really have a clue.
Can you give an example?
 http://d.puremagic.com/issues/show_bug.cgi?id=3042

 auto foo = [ 1, 2L ];           // typeof == int[2u]
 auto foo = [ 2L, 1 ];           // typeof == long[2u]
 auto foo = [ "a", "abcdefgh" ]; // typeof == char[1u][2u] in D1
 auto foo = [ [], [1,2,3] ];     // doesn't even compile
 Especially the array literal type inference is really naive.
How should it be done?
 You shouldn't use the type of the first given element when constructing
 the type of the array. If you have [ e_1, ..., e_n ], the type of the
 literal is unify(type_of_e_1, ..., type_of_e_n) + "[]". For instance:

 => typeof([ [], [1,2,3] ])
 => unify( typeof([]), typeof([1,2,3]) ) + "[]"
 => unify( "a[]", unify(typeof(1),typeof(2),typeof(3)) + "[]" ) + "[]"
 => unify( "a[]", unify("int","int","int") + "[]" ) + "[]"
 => unify( "a[]", "int" + "[]" ) + "[]"
 => unify( "a[]", "int[]" ) + "[]"  // a is a local type var, subst = { a -> int }
 => "int[]" + "[]"
 => "int[][]"
Ok.
Aug 25 2009
Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Walter Bright wrote:
 Especially the array literal type inference is really naive.
How should it be done?
 You shouldn't use the type of the first given element when constructing
 the type of the array. If you have [ e_1, ..., e_n ], the type of the
 literal is unify(type_of_e_1, ..., type_of_e_n) + "[]". For instance:

 => typeof([ [], [1,2,3] ])
 => unify( typeof([]), typeof([1,2,3]) ) + "[]"
 => unify( "a[]", unify(typeof(1),typeof(2),typeof(3)) + "[]" ) + "[]"
 => unify( "a[]", unify("int","int","int") + "[]" ) + "[]"
 => unify( "a[]", "int" + "[]" ) + "[]"
 => unify( "a[]", "int[]" ) + "[]"  // a is a local type var, subst = { a -> int }
 => "int[]" + "[]"
 => "int[][]"
Ok.
Walter: I told you :o). I'd already defined the unify meta-function, it's
called CommonType, see
http://www.digitalmars.com/d/2.0/phobos/std_traits.html#CommonType

You pass any number of types and it will figure out the type that all
types can be converted to. Using that, it is trivial to define functions
that infer array types properly. At a point I had an array() function that
took any number of arguments and returned a correctly-typed array. Then I
dropped that and defined array() in std with a different meaning
(transform a range into an array).

Andrei
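A minimal sketch of such a function, built on std.traits.CommonType; the
name makeArray is hypothetical (chosen to avoid clashing with std.array's
array()), and this is not Andrei's original code:

import std.traits;

// Type the resulting array by unifying the types of all the elements,
// instead of taking the type of the first one.
CommonType!(T)[] makeArray(T...)(T elems)
{
    alias CommonType!(T) E;
    E[] result;
    foreach (e; elems)
        result ~= e;  // each element implicitly converts to E
    return result;
}

// usage: auto a = makeArray(1, 2L, 3);  // typeof(a) == long[]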
Aug 25 2009
Bill Baxter <wbaxter gmail.com> writes:
On Tue, Aug 25, 2009 at 11:47 AM, Andrei
Alexandrescu <SeeWebsiteForEmail erdani.org> wrote:
 Walter Bright wrote:
 Especially the array literal type inference is really naive.
How should it be done?
You shouldn't use the type of the first given element when constructing
the type of the array. If you have [ e_1, ..., e_n ], the type of the
literal is unify(type_of_e_1, ..., type_of_e_n) + "[]". For instance:

=> typeof([ [], [1,2,3] ])
=> unify( typeof([]), typeof([1,2,3]) ) + "[]"
=> unify( "a[]", unify(typeof(1),typeof(2),typeof(3)) + "[]" ) + "[]"
=> unify( "a[]", unify("int","int","int") + "[]" ) + "[]"
=> unify( "a[]", "int" + "[]" ) + "[]"
=> unify( "a[]", "int[]" ) + "[]"  // a is a local type var, subst = { a -> int }
=> "int[]" + "[]"
=> "int[][]"
Ok.
Walter: I told you :o). I'd already defined the unify meta-function, it's
called CommonType, see
http://www.digitalmars.com/d/2.0/phobos/std_traits.html#CommonType
You pass any number of types and it will figure out the type that all
types can be converted to. Using that, it is trivial to define functions
that infer array types properly. At a point I had an array() function that
took any number of arguments and returned a correctly-typed array. Then I
dropped that and defined array() in std with a different meaning
(transform a range into an array).

That's also the rule used by NumPy. I found D's array behavior very
surprising.

--bb
Aug 25 2009
bearophile <bearophileHUGS lycos.com> writes:
Walter Bright:

Actually, you can do them with "lazy" function arguments. There was an example
somewhere of doing control structures with it.<
There are some problems with this:

- Are current (especially LDC) compilers able to inline those lazy
delegates? Scala compiler contains some parts to do that at compile-time
(so it's not done at runtime by the JavaVM).

- I have put inside my dlibs a select() (adapting code written by another
person) that uses lazy arguments to implement an eager (it can't be lazy,
unfortunately) array comprehension. I've tried to see how the LDC compiles
it and people there have shown me distaste for that code of mine, even
just for a benchmark. So it seems the D community doesn't like to use lazy
arguments to create control structures.

- Andrei has shown so much distaste for such things that Phobos2 doesn't
usually even use normal delegates, and you have even added a "typeless"
way to give a delegate to a template in D2. This shows there's little
interest among D developers to go the way of Scala. Scala uses delegates
for those purposes, and then inlines them.
I've been hearing that (about Java, same problem) for as long as Java has been
around. It might get there yet, but that won't be in the near future.<
Today Java is very fast, especially for very OOP-style code. Sometimes
programs in C++ can be a little faster, but generally no more than a
factor of 2. C# on dotnet too is fast, for example its GC and associative
arrays are much faster.
Yes, but an interpreter or JIT is required to make that work. That makes the
language binary not lightweight.<
D can be improved/debugged in several ways in this regard, even if keeps not using a VM.
Do you mean knowing a class or virtual method has no descendants? Sure, you
need to know the whole program to do that, or just declare it as final.<
I can see there's a lot of confusion about such matters. Experience has
shown that class-hierarchy-based optimization isn't very effective,
because in most practical programs lots of virtual calls are bi- or
multi-morphic. Other strategies like "type feedback" work better. I have
already discussed this topic a little, but it was on the D.learn
newsgroup, so you have missed it.

A good old paper about this topic: "Eliminating Virtual Function Calls in
C++ Programs" (1996), by Gerald Aigner, Urs Hölzle:
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.7.7766

There are other papers written on this topic, and some of them are more
modern/updated too (this one is about 13 years old), but it's a good
starting point. If you want more papers please ask.

Note that the LDC compiler has Link-Time Optimization too, and LTO can
also be done when you "know the whole program". If the front-end gives
some semantic annotations to LDC, it can do powerful things during LTO.
Can you give an example?<
I don't know Scala enough to give you examples, so I leave this to Jari-Matti. But I think Scala uses a Hindley-Milner type inference algorithm, which is another class of type inferencing. I am not asking you to put Hindley-Milner inside D.
[array literal type inference] How should it be done?<
Silently dropping information is bad, so cutting strings according to the
length of the first one as in D1 is bad.

The type of an array literal has to be determined by the type specified by
the programmer on the right. If such annotation is absent (because there's
an auto, or because the array is inside an expression) the type has to be
the tightest one able to represent all the types contained in the array
literal (or raise an error if none can be found).

By default array literals have to produce dynamic arrays, unless the
programmer specifies that he/she wants a fixed-size one.

(People have been asking for similar things for years; it's not a new
topic invented by Jari-Matti M. or by me).

Bye,
bearophile
Aug 25 2009
Walter Bright <newshound1 digitalmars.com> writes:
bearophile wrote:
 Walter Bright:
 
 Actually, you can do them with "lazy" function arguments. There was
 an example somewhere of doing control structures with it.<
 There are some problems with this:
 - Are current (especially LDC) compilers able to inline those lazy
 delegates?
No, but it's not a technical issue, just more of no demand for it. But this would not be a reason to use Scala, as one doesn't use Scala for performance oriented programs.
 I've tried to see how the LDC compiles it and people
 there have shown me distaste for that code of mine, even just for a
 benchmark. So it seems the D community doesn't like to use lazy
 arguments to create control structures. - Andrei has shown so much
 distaste for such things that Phobos2 doesn't usually even use
 normal delegates, and you have even added a "typeless" way to give a
 delegate to a template in D2. This shows there's little interest
 among D developers to go the way of Scala. Scala uses delegates for
 those purposes, and then inlines them.
D supports many styles of programming. Whether a particular style is popular or not is not really the issue - if someone wants to use that style, it is available. If someone builds a compelling case for the advantages of such a style, it can catch on. We can see this in C++ over time, where different styles have gained ascendancy and then were replaced by new styles.
 I've been hearing that (about Java, same problem) for as long as
 Java has been around. It might get there yet, but that won't be in
 the near future.<
Today Java is very fast, especially for very OOP-style code. Sometimes
programs in C++ can be a little faster, but generally no more than a
factor of 2. C# on dotnet too is fast, for example its GC and associative
arrays are much faster.
A factor of 2 is a big deal.
 Yes, but an interpreter or JIT is required to make that work. That
 makes the language binary not lightweight.<
D can be improved/debugged in several ways in this regard, even if keeps not using a VM.
Sure, but there's still a huge difference.
 Do you mean knowing a class or virtual method has no descendants?
 Sure, you need to know the whole program to do that, or just
 declare it as final.<
I can see there's a lot of confusion about such matters. Experience has
shown that class-hierarchy-based optimization isn't very effective,
because in most practical programs lots of virtual calls are bi- or
multi-morphic. Other strategies like "type feedback" work better. I have
already discussed this topic a little, but it was on the D.learn
newsgroup, so you have missed it. A good old paper about this topic:
"Eliminating Virtual Function Calls in C++ Programs" (1996), by Gerald
Aigner, Urs Hölzle.
bearophile <bearophileHUGS lycos.com> writes:
Walter Bright:

No, but it's not a technical issue, just more of no demand for it. But this
would not be a reason to use Scala, as one doesn't use Scala for performance
oriented programs.<
The usage of higher-order functions is pervasive in Scala, and they use
delegates. So optimizing those delegates was quite high on the priority
list of optimizations to perform. In functional-style programming people
use first-class functions all the time, so they have to be optimized.
That's why D2 may need to do the same if it wants people to use
functional-style programming :-)
DMC++ does a good job of eliminating many virtual calls.<
I have not found much of such capability in DMD. If DMD performs it too,
can you show me one example where it's performed? :-)
[LTO in LDC] I didn't know that.<
Note: LTO isn't done automatically yet by LDC, so to do it you have to
input 3 different commands. Ask (me, for example) if you want to know
them.

Bye,
bearophile
Aug 25 2009
next sibling parent "Danny Wilson" <bluezenix gmail.com> writes:
On Wed, 26 Aug 2009 02:04:18 +0200, bearophile
<bearophileHUGS lycos.com> wrote:

 Note: LTO isn't done automatically yet by LDC, so to do it you have to  
 input 3 different commands.
 Ask (to me, for example) if you want to know them.
Well I do :-)
Aug 25 2009
Walter Bright <newshound1 digitalmars.com> writes:
bearophile wrote:
 Walter Bright:
 
 No, but it's not a technical issue, just more of no demand for it.
 But this would not be a reason to use Scala, as one doesn't use
 Scala for performance oriented programs.<
 The usage of higher-order functions is pervasive in Scala, and they use
 delegates. So optimizing those delegates was quite high on the priority
 list of optimizations to perform. In functional-style programming people
 use first-class functions all the time, so they have to be optimized.
 That's why D2 may need to do the same if it wants people to use
 functional-style programming :-)
There's quite a lot of D specific optimizations that are not done at the moment, simply because of other more pressing issues.
 DMC++ does a good job of eliminating many virtual calls.<
I have not found much of such capability in DMD. If DMD performs it too,
can you show me one example where it's performed? :-)
DMC++, not DMD. I haven't done the effort yet to do it in DMD.
Aug 25 2009
Jari-Matti Mäkelä <jmjmak utu.fi.invalid> writes:
bearophile Wrote:

 Walter Bright:
 
Actually, you can do them with "lazy" function arguments. There was an example
somewhere of doing control structures with it.<
 There are some problems with this:

 - Are current (especially LDC) compilers able to inline those lazy
 delegates? Scala compiler contains some parts to do that at compile-time
 (so it's not done at runtime by the JavaVM).

 - I have put inside my dlibs a select() (adapting code written by another
 person) that uses lazy arguments to implement an eager (it can't be lazy,
 unfortunately) array comprehension. I've tried to see how the LDC
 compiles it and people there have shown me distaste for that code of
 mine, even just for a benchmark. So it seems the D community doesn't like
 to use lazy arguments to create control structures.

 - Andrei has shown so much distaste for such things that Phobos2 doesn't
 usually even use normal delegates, and you have even added a "typeless"
 way to give a delegate to a template in D2. This shows there's little
 interest among D developers to go the way of Scala. Scala uses delegates
 for those purposes, and then inlines them.
The call-by-name trailing block syntax is also more uniform with the
built-ins in Scala, e.g.:

scala> object Helpers { def run_twice[T](block: => T) = { block; block } }
defined module Helpers

scala> import Helpers._
import Helpers._

scala> var i = 1
i: Int = 1

scala> run_twice { i += 1; i }
res5: Int = 3

scala> run_twice { print("foo\n") }
foo
foo
I've been hearing that (about Java, same problem) for as long as Java has been
around. It might get there yet, but that won't be in the near future.<
Today Java is very fast, especially for very OOP-style code. Sometimes
programs in C++ can be a little faster, but generally no more than a
factor of 2. C# on dotnet too is fast, for example its GC and associative
arrays are much faster.
Java is /fast enough/ for many (if not most) purposes. People even use languages with slower implementations like php or javascript these days (ever heard of web 2.0?). Often the other aspects of the language have a larger impact on real world projects than raw execution speed.
[array literal type inference] How should it be done?<
Silently dropping information is bad. So cutting strings according to the length of the first one as in D1 is bad.
Until now I've mostly been experimenting with D1. But this seems to be fixed in D2.
 The type of an array literal has to be determined by the type specified by the
programmer on the right. If such annotation is absent (because there's an auto,
 or because the array is inside an expression) the type has to be the
 tightest one able to represent all the types contained in the array
 literal (or raise an error if none can be found).
 By default array literals have to produce dynamic arrays, unless the
programmer specifies that he/she wants a fixed-size one.
The literals should also handle cases like [[],[],[1,2]], not only the string case. What makes this frustrating is that one can assume that since the generic type inference engine isn't used consistently in all cases, there isn't one in dmd. This might result in more corner cases in other parts of the language.
Aug 25 2009
Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
Jari-Matti Mäkelä wrote:
 bearophile wrote:
 

 one of the languages to be followed more, because they share some of
 future purposes of D2/D3.

 A small presentation about the close future of Scala (it's not a general
 introduction to Scala): "Scala -- The Next 5 Years" by Martin Odersky:
 http://www.scala-lang.org/sites/default/files/odersky/scalaliftoff2009.pdf
 Scala is an impressive language and overall well designed. There are
 certainly truckloads of features that could be taken from Scala to D. But
 I'm afraid that the desire to have familiarity and compatibility with the
 C/C++ family is more important in this community than cool new functional
 features. Here's a quick comparison of some factors:
[snip]

You are making very good points, thanks for an insightful post. A few of
them are slightly inaccurate, but those were discussed by Walter already.
I'd like to make a few high-level comments.

I find the lack of rigor in certain parts of D as damaging as you do.
Fortunately, that has improved by leaps and bounds and is still improving.
When I joined Walter with working on D back in 2006, D was quite a lot
less theoretically-motivated - it was all greasy nuts and bolts and pegs
hammered in place. There was a tacit agreement between Walter and me that
D will improve the rigor of its ways, and indeed we have made strides in
that respect. If things work out properly, we'll be able to define SafeD -
a proper, checkable safe subset of the language.

It would be great if there were more PL heavyweights in the D community
(as you mention there are in Scala), but D's community is in large part
young and enthusiastic, and that's an asset. Possibly some of the young
students using D will become PL experts themselves and help improve it.
(Unlike you, I think that by and large there is no hatred or shunning of
the ivory tower.) Among other things, it would be great if you yourself
hung out more around here. It's a risky proposition, but I guess it's a
simple fact that you have a better chance at making a vital contribution
to D than to Scala.

About unifying FP and OOP, I hope you can educate me a bit. My
understanding from a cursory search is that Scala does not have true
immutability and purity, only Java-style by-convention immutability a la
String class. Is that correct? If that's the case, I think Scala got the
syntactic sugar part of FP right (pattern matching) but not the purity and
immutability aspects, which I believe are much more important. These
aspects are related to concurrency as well, where I think Scala has a
distinctly underwhelming lineup. Even with what we know we can implement
now in D concurrency-wise, we'll be in better shape than many (most? all?)
other languages. Stay tuned.

I agree that D could improve its type deduction - it doesn't even do
Hindley-Milner. (Incidentally, you chose a particularly poor example
regarding the type of array literals - those are very easy to type
properly (and in fact I do so with a library function), but Walter
insisted that typing by the first member is simple enough for people to
understand and remember.) I'm afraid D's type deduction abilities will
stay at about this level at least for D2. The good news is that, due to
templates' deferred typechecking, I haven't found that to be a huge
problem in practice.

Andrei
Aug 25 2009
Jari-Matti Mäkelä <jmjmak utu.fi.invalid> writes:
Andrei Alexandrescu Wrote:

 Jari-Matti Mäkelä wrote:
 bearophile wrote:
 

 one of the languages to be followed more, because they share some of
 future purposes of D2/D3.

 A small presentation about the close future of Scala (it's not a general
 introduction to Scala): "Scala -- The Next 5 Years" by Martin Odersky:
 http://www.scala-lang.org/sites/default/files/odersky/scalaliftoff2009.pdf
 Scala is an impressive language and overall well designed. There are
 certainly truckloads of features that could be taken from Scala to D. But
 I'm afraid that the desire to have familiarity and compatibility with the
 C/C++ family is more important in this community than cool new functional
 features. Here's a quick comparison of some factors:
 [snip]

 You are making very good points, thanks for an insightful post. A few of
 them are slightly inaccurate, but those were discussed by Walter already.
 I'd like to make a few high-level comments.

 I find the lack of rigor in certain parts of D as damaging as you do.
 Fortunately, that has improved by leaps and bounds and is still improving.
 When I joined Walter with working on D back in 2006, D was quite a lot
 less theoretically-motivated - it was all greasy nuts and bolts and pegs
 hammered in place. There was a tacit agreement between Walter and me that
 D will improve the rigor of its ways, and indeed we have made strides in
 that respect. If things work out properly, we'll be able to define SafeD -
 a proper, checkable safe subset of the language.
That's great to hear.
 It would be great if there were more PL heavyweights in the D community 
 (as you mention there are in Scala), but D's community is in large part 
 young and enthusiastic, and that's an asset. Possibly some of the young 
 students using D will become PL experts themselves and help improve it. 
That's more likely. I can see why the approach doesn't attract PL post-grad students, but that's not necessarily a huge problem.
 (Unlike you, I think that by and large there is no hatred or shunning of 
 the ivory tower.)
I've experienced this kind of attitude in D's irc channel and sometimes
here on the newsgroup. It's quite understandable: since D consists of so
many heterogeneous features, different kinds of aspects motivate us. To me
the coherent "user interface" of the language and its semantics are one of
the appealing sides, but for some other people things like backend
optimizations and compatibility with existing C/C++ libraries matter more,
and e.g. rigorous language semantics are just bikeshedding.

I have little problem reading or writing code written in a cryptic
unorthogonal language - on the contrary I sometimes find it an exciting
way of spending time. But when I sometimes suggest some feature or change
in semantics, my goal is to make the language easier to use for all of us.
 Among other things, it would be great if you yourself 
 hung out more around here. It's a risky proposition, but I guess it's a 
 simple fact that you have a better chance at making a vital contribution 
 to D than to Scala.
I do hang out here, have done so since 2003. And I have some hopes that D becomes useful for me again. I've liked the direction D2 is heading towards, and once the largest problems have been ironed out, I might actually start using it (or D3), and I'm sure there are others who will do the same.
 About unifying FP and OOP, I hope you can educate me a bit. My 
 understanding from a cursory search is that Scala does not have true 
 immutability and purity, only Java-style by-convention immutability a la 
 String class.
It doesn't have a system similar to what D2 has, but the Java-style
immutability is used to provide e.g. immutable collections. I'm expecting
a change here later when concurrency becomes a larger issue (it should be
already, but you only have so much time!). I've done some research on
concurrency issues lately and it seems to me that there are so many
computational models to choose from that in some sense it might be too
early to stick with one.
 Is that correct? If that's the case, I think Scala got the 
 syntactic sugar part of FP right (pattern matching) but not the purity 
 and immutability aspects, which I believe are much more important.
Agreed, it's not a pure functional language. In fact the functional feel is quite different from the old school functional languages.
 I agree that D could improve its type deduction - it doesn't even do 
 Hindley-Milner. (Incidentally, you chose a particularly poor example 
 regarding the type of array literals - those are very easy to type 
 properly (and in fact I do so with a library function), but Walter 
 insisted that typing by the first member is simple enough for people to 
 understand and remember.) I'm afraid D's type deduction abilities will 
 stay at about this level at least for D2. The good news is that, due to 
 templates' deferred typechecking, I haven't found that to be a huge 
 problem in practice.
Yea, I did some other tests and it really seems the type inference has gotten much better. It just felt dishonest to claim something without providing examples. I expected more problems, but was happy to see that many cases indeed worked now.
Aug 25 2009