
digitalmars.D - Most popular programming languages 1965-2019 (visualised)

reply Ethan <gooberman gmail.com> writes:
https://www.youtube.com/watch?v=Og847HVwRSI

While not surprising, it was still fascinating watching 
Objective-C come out of nowhere to make the list 25 years after 
it was first released.
Oct 10 2019
next sibling parent reply kinke <noone nowhere.com> writes:
On Thursday, 10 October 2019 at 16:32:44 UTC, Ethan wrote:
 https://www.youtube.com/watch?v=Og847HVwRSI

 While not surprising, it was still fascinating watching 
 Objective-C come out of nowhere to make the list 25 years 
 after it was first released.
Presumably the effect of a few opinionated decision makers in a way-too-big corporation with a shameless preference for closed ecosystems. Rightfully dropping into oblivion again by the looks of it [the language, not the corporation ;)].
Oct 10 2019
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 10 October 2019 at 19:17:03 UTC, kinke wrote:
 Presumably the effect of a few opinionated decision makers in a 
 way-too-big corporation with a shameless preference for closed 
 ecosystems. Rightfully dropping into oblivion again by the 
 looks of it [the language, not the corporation ;)].
Not really oblivion. You still need Objective-C++ to interface gracefully with Swift, but yes, it is more of an interfacing tool than a productivity tool. Anyway, NeXT wasn't a big corporation, but you are right that Jobs was opinionated. That said, the dynamic aspects of Objective-C are well suited to GUI development. And the big advertising point for NeXT was the OO GUI + hardware, so it made some sense, as there were few available alternatives (Smalltalk perhaps). It was kind of similar to Dart being plugged by Google for frontend development, not Go.
Oct 10 2019
prev sibling parent Laeeth Isharc <laeeth laeeth.com> writes:
On Thursday, 10 October 2019 at 19:17:03 UTC, kinke wrote:
 On Thursday, 10 October 2019 at 16:32:44 UTC, Ethan wrote:
 https://www.youtube.com/watch?v=Og847HVwRSI

 While not surprising, it was still fascinating watching 
 Objective-C come out of nowhere to make the list 25 years 
 after it was first released.
Presumably the effect of a few opinionated decision makers in a way-too-big corporation with a shameless preference for closed ecosystems. Rightfully dropping into oblivion again by the looks of it [the language, not the corporation ;)].
https://computerhistory.org/blog/the-deep-history-of-your-apps-steve-jobs-nextstep-and-early-object-oriented-programming/ NeXT wasn't big at the time. They needed to do something to increase developer productivity for GUIs, and Objective-C was one early answer. The company formed to commercialise it got crushed by C++. Jobs was already using Objective-C, so he licensed it and hired programmers from that company. So there are all kinds of ways things can happen, and sometimes the path is quite unexpected.
Oct 14 2019
prev sibling next sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Thursday, 10 October 2019 at 16:32:44 UTC, Ethan wrote:
 https://www.youtube.com/watch?v=Og847HVwRSI

 While not surprising, it was still fascinating watching 
 Objective-C come out of nowhere to make the list 25 years 
 after it was first released.
Actually, it is surprising, because it is wrong. Assembler and BASIC were much larger in the mid 80s. 4GLs should have had a fairly strong presence in the late 80s. Etc. It is most likely based on surveys of big corporations. Most programming happened outside of those.
Oct 10 2019
next sibling parent reply Dennis <dkorpel gmail.com> writes:
On Friday, 11 October 2019 at 06:30:28 UTC, Ola Fosheim Grøstad 
wrote:
 Assembler and BASIC were much larger in the mid 80s. 4GLs should 
 have had a fairly strong presence in the late 80s. Etc.

 It is most likely based on surveys of big corporations. Most 
 programming happened outside of those.
Do you have a source saying that those languages were more popular in the 80s / that most programming happened outside corporations?
Oct 10 2019
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 11 October 2019 at 06:50:38 UTC, Dennis wrote:
 Do you have a source saying that those languages were more 
 popular in the 80s / that most programming happened outside 
 corporations?
Do you need a source??? I happened to be alive in the 80s. There is no possible way that Ada accounted for 43% of all programs written in 1986. Most programming clearly happened outside BIG corporations in the 80s, yes. GitHub is a completely different dataset than these (presumably US-biased) surveys and measures completely different behaviour.
Oct 11 2019
parent reply Dennis <dkorpel gmail.com> writes:
On Friday, 11 October 2019 at 07:05:12 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 11 October 2019 at 06:50:38 UTC, Dennis wrote:
 Do you have a source saying that those languages were more 
 popular in the 80s / that most programming happened outside 
 corporations?
Do you need a source???
Well, I don't _need_ a source; it's not an important question to me. I was just wondering if you had one, since you saying "they're wrong, their data is biased" is a bit ironic when your own source is "personal experience", which is notoriously biased ;)
Oct 11 2019
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 11 October 2019 at 07:40:27 UTC, Dennis wrote:
 source is "personal experience", which is notoriously biased ;)
I'm not sure if it is biased to say that the ocean is wet if you actually are standing in it. Keep in mind that 1986 was the heyday of 8-bit computers, low on memory and with diskettes for storage, in a blooming market for small business and home computing. Lots of small businesses were looking for ways to move their existing back office onto computers, e.g. simple filing-cabinet-style databases or custom software. No need for Ada, which you only needed to get US government contracts, like DoD projects. Besides, even for big US corporations, 43% in Ada sounds excessive. Might be that they just ticked off which languages they had some project in, but not how many projects. Dunno. The stats in the video seem unreasonable all over the place. Basically, you cannot aggregate data in the way the author of the video has. It isn't sound. You don't get an apple pie if you throw oranges into the mix.
Oct 11 2019
next sibling parent Dennis <dkorpel gmail.com> writes:
On Friday, 11 October 2019 at 08:06:02 UTC, Ola Fosheim Grøstad 
wrote:
 I'm not sure if it is biased to say that the ocean is wet if 
 you actually are standing in it.
Being wet is not quantitative. A better analogy is you saying "the ocean can't possibly be 17 degrees Celsius on average, I'm standing in it and it feels warmer than that".
 Basically, you cannot aggregate data in the way the author of 
 the video has.
You can totally criticize the method, but you can't dismiss the results because they contradict your personal experience (which is at least as biased as whatever statistical methods the author used).
Oct 11 2019
prev sibling next sibling parent reply Chris <wendlec tcd.ie> writes:
On Friday, 11 October 2019 at 08:06:02 UTC, Ola Fosheim Grøstad 
wrote:

 Basically, you cannot aggregate data in the way the author of 
 the video has. It isn't sound. You don't get an apple pie if 
 you throw oranges into the mix.
I agree, but it is still fascinating to see the rise of C, then Java and Python (we all know that, but visualization helps). And JS growing and shrinking and growing again, and PHP. It basically is the history of technology, and I recognize it especially from the 90s onward, as in internet > mobile devices. Objective-C could only become "big" because of Apple, but when I first saw Mac OS X, I knew they'd be big. A lot of people laughed and said "Oh, the shiny icons, all Mickey Mouse!" But Jobs did the job well. JS obviously succeeded due to the internet and the name that lived off Java. Now, this raises the question: To what extent do PLs influence the course of technology (e.g. C in the 80s) and to what extent does the demand / the market created by new technologies influence PLs and their use? It's a bit like the chicken and the egg, ain't it? If anything, the video depicts a changing world and society and PLs are just one indicator. I'd love to read a well-researched book about it.
Oct 11 2019
next sibling parent Chris <wendlec tcd.ie> writes:
On Friday, 11 October 2019 at 10:06:46 UTC, Chris wrote:

 Now, this raises the question: To what extent do PLs influence 
 the course of technology (e.g. C in the 80s) and to what 
 extent does the demand / the market created by new technologies 
 influence PLs and their use? It's a bit like the chicken and the 
 egg, ain't it?
Oh, another thing: when and how did the general availability (open source and "powerful" PCs) of PLs begin, i.e. when did people (nerds) start to write software at home? This, of course, influenced the course of IT big time. The usual chicken and egg: more powerful machines (reasonably priced), more nerds/devs; more nerds/devs, more PCs, etc. Interesting fact (Europe): it wasn't until the late 2000s that companies no longer demanded a CS degree but realized that a lot of people were literate in terms of programming just because they would do it at home as a hobby. I was really surprised the first time I read something like "degree in CS or experience in XYZ programming". And I remember the flamewars on the internet after Apple had introduced Xcode ("Now every idiot can program, the standard of software will go down!" - didn't happen, btw). Nowadays if you don't have something like Xcode (cf. Android Studio), you're out of the race. I.e. corporations empower people and lock them in at the same time (market share), and people break out with the help of OSS and other corporations. I remember the dark days when OSS was considered the "great unwashed"; now no corporation can do without it. Apple was one of the first, before that Sun with Java? Please correct me if you know more. I don't have the whole picture. Anyway, corporations create demands, users create demands. An interesting feedback loop; to me it's nowhere clearer than in software. It could be used for courses in economics.
Oct 11 2019
prev sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 11 October 2019 at 10:06:46 UTC, Chris wrote:
 Objective-C could only become "big" because of Apple, but when 
 I first saw Mac OS X, I knew they'd be big. A lot of people 
 laughed and said "Oh, the shiny icons, all Mickey Mouse!" But 
 Jobs did the job well.
Right, but there is also a social factor. Many had fond memories of their first Mac, their first computer. Although the machine itself was crazy expensive, Apple provided "cheap" laser printers by driving them from the computer rather than building the rendering engine into the printer. So Mac + printer was not unreasonable for office use with high-quality printing. So, in this period of Microsoft being too dominant, there were plenty of buyers who wanted the Mac to be great again.
 Now, this raises the question: To what extent do PLs influence 
 the course of technology (e.g. C in the 80s) and to what 
 extent does the demand / the market created by new technologies 
 influence PLs and their use? It's a bit like the chicken and the 
 egg, ain't it?
JavaScript clearly had an impact, but it might have happened with another language too. As a consequence it is very difficult to say what would have happened. Would Go and Swift have the same feature set if D had not existed? Difficult to say. Have authors of other languages read the D forums and gotten inspiration from what they have read? Maybe, I don't know. Swift has at least picked up lambdas like this "{$0 < $1}", maybe all on their own, maybe they were inspired by /bin/sh, but I remember arguing for it in the forums. We'll never know how languages actually evolve... social dynamics are kind of messy.
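For anyone who hasn't seen that Swift syntax, here is a minimal sketch of the shorthand closure arguments (my own illustration, not something from the video or from D):

    // Swift: $0 and $1 are anonymous shorthand names for the two
    // parameters of the comparator closure passed to sorted(by:).
    let numbers = [3, 1, 2]
    let ascending = numbers.sorted(by: { $0 < $1 })
    print(ascending)  // prints [1, 2, 3]

The same call can be written with explicit parameters, sorted(by: { a, b in a < b }); the $0/$1 form is just the abbreviated spelling.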
 If anything, the video depicts a changing world and society and 
 PLs are just one indicator.
Yeah, but it is a bit scary that anything that is presented visually in a crisp and clean manner, based on "reputable datasets", is intuitively taken as true. Human beings have very little resistance to certain kinds of rhetoric. For this topic it was not a big deal, but for other topics the political connotations are not so great. Especially in this day and age of AI recommender systems ("Did you like this biased presentation? Then you will probably also like this biased presentation!" ;-)
Oct 11 2019
parent Chris <wendlec tcd.ie> writes:
On Friday, 11 October 2019 at 17:05:47 UTC, Ola Fosheim Grøstad 
wrote:

 Right, but there is also a social factor. Many had fond 
 memories of their first Mac, their first computer. Although the 
 machine itself was crazy expensive, Apple provided "cheap" 
 laser printers by driving them from the computer rather than 
 building the rendering engine into the printer. So Mac + printer 
 was not unreasonable for office use with high-quality printing. 
 So, in this period of Microsoft being too dominant, there 
 were plenty of buyers who wanted the Mac to be great again.
Social factors, yes: a lot of Mac users worked in the "cool" sectors (layout, magazines, media in general). Apple had also struck a chord with their shiny animated icons and a computer that just worked when you bought it. With Windows there were still numerous issues. Windows had a "nerdy" GUI, and MS wanted to lock users in; e.g. there were always issues with Java, whereas Apple shipped their Macs with Java - and Xcode. So Apple was also becoming a sound platform for software development. Ironically, all that changed after the advent of the iPhone, when Apple began to lock users in and others out (just like MS had done before). I switched to Linux, as I had promised I would if they started to lock users and devs in - or out.
 JavaScript clearly had an impact, but it might have happened 
 with another language too. As a consequence it is very 
 difficult to say what would have happened.
JS was in the right place at the right time, and it was clever marketing, because the WWW would become huge anyway.
 Would Go and Swift have the same feature set if D had not 
 existed? Difficult to say. Have authors of other languages read 
 the D forums and gotten inspiration from what they have read? 
 Maybe, I don't know. Swift has at least picked up lambdas like 
 this "{$0 < $1}", maybe all on their own, maybe they were 
 inspired by /bin/sh, but I remember arguing for it in the 
 forums. We'll never know how languages actually evolve... 
 social dynamics are kind of messy.
I don't know. D's problem is that D devs don't know what they want, what D is supposed to achieve, so they have a new pet project every few months / years. Others just watch and pick and choose? But I don't know what impact D really has.
 Yeah, but it is a bit scary that anything that is presented 
 visually in a crisp and clean manner, based on "reputable 
 datasets", is intuitively taken as true. Human beings have very 
 little resistance to certain kinds of rhetoric. For this topic it 
 was not a big deal, but for other topics the political 
 connotations are not so great. Especially in this day and age of 
 AI recommender systems ("Did you like this biased presentation? 
 Then you will probably also like this biased presentation!" ;-)
We all knew that anyway, that and why C, JS and Objective-C got bigger. So I think we're safe here.
Oct 14 2019
prev sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 11 October 2019 at 08:06:02 UTC, Ola Fosheim Grøstad 
wrote:
 Keep in mind that 1986 was the heyday of 8-bit computers, low 
 on memory and with diskettes for storage, in a blooming market for small business
Expanding on this so younger people understand the difference between the 80s and now. 1981-1986 was a period where personal computing became available and small businesses were able to purchase computers (with or without a tiny hard drive). This is kinda the 8/16-bit computing era. Many of these computers shipped with BASIC available. In the beginning, when there was little software available, people could write applications in BASIC and sell them as commercial products. As the market matured, competition got harder and BASIC was no longer an option for commercial products. In the mid 80s there were already thousands of software packages available for the IBM PC, and an unknown but similarly large number for 8-bit home computers. Low memory footprint meant that programs were short and focused on a limited task, and that developers could ship new applications after only a few months of work. On 8-bit home computers, many of the early applications were written in BASIC, then a mix of BASIC and machine language, and as the competition got harder the best apps/games were written in pure machine language to get the most out of very limited hardware. Embedded programming was also typically done in machine language. The threshold for starting up a small software company was now much lower than for the big mainframes... So a lot of programs were written, on cheap computers, using very crude tools. Some small developers would consider a macro assembler a luxury item... The old computing manufacturers completely missed this boat (most of them) and that left them in the dust. They relied on expensive hardware, expensive software, expensive manpower, high margins, small volume, large profits. So they viewed the low-margin, high-volume, small-computer market as something completely separate and somewhat insignificant, and thus "surveys" prior to 1990 are likely to treat their established market as the serious computing market, completely separate from the personal computer market. This didn't go well: IBM evaporated, SUN died, SGI died, DEC evaporated and so on. 1987-1994 could be viewed as the 16/32-bit era, where non-GC high-level programming also took off in the personal computing space... From 1995 onwards more memory was available, and GC'd high-level programming and web apps start to dominate... This is where D belongs. Anyway, measuring language popularity is problematic. Programmers do not necessarily like the language they have to use at work, and don't necessarily use the same language at home. (We can assume that people who use D do so because they want to use it. Not so for Ada, which was known to be met with resistance and was most likely adopted primarily to get government contracts.) Also, far more software is written in Java, JavaScript and PHP than can be seen on GitHub. Many deployments on the web are closed-source deployments. So... these kinds of "I have statistics" claims are not as objective as they may seem. Getting conclusive results from quantitative analysis requires a very keen mindset and a lot of attention to detail and context. There is no certainty in large numbers... although people find large datasets convincing. A dangerous fallacy... Want to do data analysis? 1. Find high-quality data. 2. Then get high quantity. 3. Then limit your findings based on a solid understanding of the contexts of the data collection.
Oct 11 2019
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 11 October 2019 at 10:40:13 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 11 October 2019 at 08:06:02 UTC, Ola Fosheim Grøstad 
 wrote:
 Keep in mind that 1986 was the heyday of 8-bit computers, low 
 on memory and with diskettes for storage, in a blooming market for small business
Expanding on this so younger people understand the difference between the 80s and now. 1981-1986 was a period where personal computing became available and small businesses were able to purchase computers (with or without a tiny hard drive). This is kinda the 8/16-bit computing era. Many of these computers shipped with BASIC available. In the beginning, when there was little software available, people could write applications in BASIC and sell them as commercial products. As the market matured, competition got harder and BASIC was no longer an option for commercial products. In the mid 80s there were already thousands of software packages available for the IBM PC, and an unknown but similarly large number for 8-bit home computers. Low memory footprint meant that programs were short and focused on a limited task, and that developers could ship new applications after only a few months of work. On 8-bit home computers, many of the early applications were written in BASIC, then a mix of BASIC and machine language, and as the competition got harder the best apps/games were written in pure machine language to get the most out of very limited hardware. Embedded programming was also typically done in machine language. The threshold for starting up a small software company was now much lower than for the big mainframes... So a lot of programs were written, on cheap computers, using very crude tools. Some small developers would consider a macro assembler a luxury item... The old computing manufacturers completely missed this boat (most of them) and that left them in the dust. They relied on expensive hardware, expensive software, expensive manpower, high margins, small volume, large profits. So they viewed the low-margin, high-volume, small-computer market as something completely separate and somewhat insignificant, and thus "surveys" prior to 1990 are likely to treat their established market as the serious computing market, completely separate from the personal computer market. This didn't go well: IBM evaporated, SUN died, SGI died, DEC evaporated and so on.
[snip] Care to write a book? I think you, Paulo Pinto and Walter and others here could write a good book about it. I find it fascinating how companies like SUN etc. defeated themselves. The mechanisms are fascinating, and, as I said, it's a fascinating topic for economics. The history of technology is fascinating, from the first time humans could control fire, through the wheel, to the internet. However, software gives you the chance to study the development of technology in fast motion. Things have developed incredibly fast, but not as fast as they could. What are the factors? Marketing strategies, narrow-mindedness etc.
Oct 11 2019
next sibling parent reply IGotD- <nise nise.com> writes:
On Friday, 11 October 2019 at 11:08:55 UTC, Chris wrote:
 the best apps/games were written in pure machine language to 
 get the most out of very limited hardware. Embedded programming 
 was also typically done in machine language.
This is actually an urban legend. The applications that needed the most performance in the 1980s were mostly written in C (Borland C was really popular during the 80s) with a few optimized parts done in assembler. Very few programs were done in pure assembler. There wasn't any need to write everything in assembler except certain optimized loops. It is simple to check this: you can just search your old DOS .exe files for "Borland", for example, and you will be surprised how many DOS programs used C during the 80s. I suspect, as previously mentioned, that this survey is based on large companies. Ada has a suspiciously large cut during the 80s. Also, what is it based on? Per worker, per product, per company? Ada was probably big during the 80s because it was the height of the Cold War, but it still seems a bit too high, I think.
Oct 11 2019
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 11 October 2019 at 11:23:34 UTC, IGotD- wrote:
 On Friday, 11 October 2019 at 11:08:55 UTC, Chris wrote:
 the best apps/games were written in pure machine language to 
 get the most out of very limited hardware. Embedded programming 
 was also typically done in machine language.
This is actually an urban legend. The applications that needed the most performance in the 1980s were mostly written in C (Borland C was really popular during the 80s) with a few optimized parts done in assembler. Very few programs were done in pure assembler. There wasn't any need to write everything in assembler except certain optimized loops.
A lot of programming for 8-bit computers was done in pure assembler. Also, embedded CPUs like the 6800 were quite limited (256 bytes of static RAM), so assembler was in many ways suitable. 16-bit computers like the IBM PC had more RAM, but even people who wrote in Turbo Pascal in the late 80s would drop down to assembler for performance where needed.
 It is simple to check this: you can just search your old DOS 
 .exe files for "Borland", for example, and you will be surprised how 
 many DOS programs used C during the 80s.
So maybe it wasn't clear from what I wrote, but in my experience there were essentially two segments up to 1986: 1. The limited "16-bit" 8088 CPU with an 8-bit data bus for IBM and CP/M business-oriented machines that could address more than 64KB. 2. The 8-bit small-business/home-owner segment that often had 16/32/64/128 KB of RAM and Z80 or 6502/6510 CPUs. Although I know that some established developers in the later years used cross-compilers when writing for 8-bit CPUs, that wasn't what most did in the early days. Then later some computers shipped with more than one CPU so that they could be used both for business and games; wasn't the Commodore 128 one of those? I believe it also had some kind of bank-switching mechanism so that it was possible to access 128KB, but I didn't use this one myself. I have heard that it was used for cross-platform development of Commodore 64 software, so they would write software on the CBM128 and execute it on the CBM64; obviously that would make it possible to use better tools. The first assembler I used on the C64 was written in BASIC, read from tape. If my machine language program crashed, then I would have to reload the assembler from tape... There were better solutions around, like ROM cartridges... but either way, you had to be patient. IIRC the best-known 8-bit game composer, Rob Hubbard, wrote his music engine in assembler and entered his tunes in hex... These players were very small, as they were added to games that were already cramped for space, and the tunes had to be several minutes long. I believe he sometimes would use RAM areas that sat behind memory-mapped areas and the like (ROM/registers) because memory was so tight.
 I suspect, as previously mentioned, that this survey is based on 
 large companies. Ada has a suspiciously large cut during the 
 80s. Also, what is it based on? Per worker, per product, per 
 company? Ada was probably big during the 80s because it was the 
 height of the Cold War, but it still seems a bit too high, I think.
It certainly seems high if you include countries outside the US. I also strongly suspect many US software development companies in the 80s simply wanted to be able to take on Ada development as a strategic "we have consultants that know Ada" thing, since the US government required Ada, but those programmers might also do C/Pascal in their daily work when working on other projects...
Oct 11 2019
prev sibling parent reply Chris <wendlec tcd.ie> writes:
On Friday, 11 October 2019 at 11:23:34 UTC, IGotD- wrote:
 This is actually an urban legend. The applications that needed 
 the most performance in the 1980s were mostly written in C (Borland 
 C was really popular during the 80s) with a few optimized parts 
 done in assembler. Very few programs were done in pure 
 assembler. There wasn't any need to write everything in 
 assembler except certain optimized loops.

 It is simple to check this: you can just search your old DOS 
 .exe files for "Borland", for example, and you will be surprised how 
 many DOS programs used C during the 80s.

 I suspect, as previously mentioned, that this survey is based on 
 large companies. Ada has a suspiciously large cut during the 
 80s. Also, what is it based on? Per worker, per product, per 
 company? Ada was probably big during the 80s because it was the 
 height of the Cold War, but it still seems a bit too high, I think.
Big corporations still widely used assembly in the 80s (the suicide rates were highest among assembly programmers - no joke). Some people thought that C wasn't that different, so why bother? However, it soon became clear that a. if the assembly programmer left (or killed himself), nobody else could make sense of the program, and b. although C was 10% slower, squeezing out the last 10% wasn't worth it (law of diminishing returns). I have it on good authority that the civil service still uses assembler in certain areas (revenue). I wonder why?
Oct 11 2019
next sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 11 October 2019 at 12:00:38 UTC, Chris wrote:
 returns). I have it on good authority that the civil service 
 still uses assembler in certain areas (revenue). I wonder why?
Interesting. Maybe they use assembler because a compiler could inject malicious code? Doesn't seem likely, but this reminds me of a fairly new research topic of «proof-carrying machine language». But you are right, I've heard several people state that the late-80s Motorola 68000 machine language was so programmer-friendly that there was no real reason to write code in C... I remember taking a course on operating system kernels where I had the choice to use Motorola 68000 assembly or C, and I did everything in assembly because it seemed both easier and more fun. Actually, I believe I used assembly instead of C on the on-paper exam as well... because it seemed easier, I suppose. Anyway, both the 68000 instruction set and the first MIPS instruction set are very programmer-friendly. So it all depends, I guess.
Oct 11 2019
parent Chris <wendlec tcd.ie> writes:
On Friday, 11 October 2019 at 12:14:02 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 11 October 2019 at 12:00:38 UTC, Chris wrote:
 returns). I have it on good authority that the civil service 
 still uses assembler in certain areas (revenue). I wonder why?
Interesting. Maybe they use assembler because a compiler could inject malicious code?
My guess is that the civil servants who had learned how to program in assembler didn't want to change / retrain, and since they couldn't be fired they continued using assembler, but it might also be a security issue. I know that the public sector often has the oldest systems for several reasons: 1. security and stability: a new system introduces new errors / vulnerabilities and they can't afford to "not work" for a day or two, 2. reluctance of employees to learn something new, 3. old contracts etc. Then again, they have no problem accidentally deleting all your records (it has happened to thousands of people). Schools are often very conservative because a. teachers don't want to learn something new (_they_ are the teachers after all, why should they learn anything?), b. the IT guy (who is often a teacher) learned how to use Internet Explorer, and Chrome or Firefox is just too much! Personally, I couldn't live without checking out new technologies.
Oct 11 2019
prev sibling next sibling parent Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Friday, 11 October 2019 at 12:00:38 UTC, Chris wrote:
 On Friday, 11 October 2019 at 11:23:34 UTC, IGotD- wrote:
 This is actually an urban legend. The applications that needed 
 the most performance in the 1980s were mostly written in C 
 (Borland C was really popular during the 80s) with a few 
 optimized parts done in assembler. Very few programs were done 
 in pure assembler. There wasn't any need to write everything 
 in assembler except certain optimized loops.

 It is simple to check this: you can just search your old DOS 
 .exe files for "Borland", for example, and you will be surprised 
 how many DOS programs used C during the 80s.

 I suspect, as previously mentioned, that this survey is based on 
 large companies. Ada has a suspiciously large cut during the 
 80s. Also, what is it based on? Per worker, per product, per 
 company? Ada was probably big during the 80s because it was 
 the height of the Cold War, but it still seems a bit too high, I think.
Big corporations still widely used assembly in the 80s (the suicide rates were highest among assembly programmers - no joke). Some people thought that C wasn't that different, so why bother? However, it soon became clear that a. if the assembly programmer left (or killed himself), nobody else could make sense of the program, and b. although C was 10% slower, squeezing out the last 10% wasn't worth it (law of diminishing returns). I have it on good authority that the civil service still uses assembler in certain areas (revenue). I wonder why?
It also depended on the CPU used. Programming something big in x86 assembly is akin to torture; for m68k, not so much. There were companies that built big software packages in pure assembler on the m68k machines. On the Atari ST, for example, there was the company CCD, who wrote the platform's best programming editor, Tempus, in assembly. They also built Tempus Word, a full-featured text processing suite which was so much better than Microsoft's offering of the time, and all written in pure assembly.
Oct 12 2019
prev sibling parent Patrick Schluter <Patrick.Schluter bbox.fr> writes:
On Friday, 11 October 2019 at 12:00:38 UTC, Chris wrote:
 On Friday, 11 October 2019 at 11:23:34 UTC, IGotD- wrote:
 This is actually an urban legend. The applications that needed 
 the most performance in the 1980s were mostly written in C 
 (Borland C was really popular during the 80s) with a few 
 optimized parts done in assembler. Very few programs were done 
 in pure assembler. There wasn't any need to write everything 
 in assembler except certain optimized loops.

 It is simple to check this: you can just search your old DOS 
 .exe files for "Borland", for example, and you will be surprised how 
 many DOS programs used C during the 80s.

 I suspect, as previously mentioned, that this survey is based on 
 large companies. Ada has a suspiciously large cut during the 
 80s. Also, what is it based on? Per worker, per product, per 
 company? Ada was probably big during the 80s because it was 
 the height of the Cold War, but it still seems a bit too high, I think.
Big corporations still widely used assembly in the 80s (the suicide rates were highest among assembly programmers - no joke). Some people thought that C wasn't that different, so why bother? However, it soon became clear that a. if the assembly programmer left (or killed himself), nobody else could make sense of the program, and b. although C was 10% slower, squeezing out the last 10% wasn't worth it (law of diminishing returns). I have it on good authority that the civil service still uses assembler in certain areas (revenue). I wonder why?
As I already said in my previous post, it depends on the architecture and the culture of the platform. For instance, when I was still a consultant at Deloitte in Luxembourg, I was offered a mission at a German bank to program in assembly on mainframes. Apparently, it is still quite common to program the big-iron IBM mainframes (Z servers) in assembler. The IBM macro assemblers are so advanced that they allow quite high-level constructs, which makes programming with them not much different from programming in COBOL or even C.
Oct 12 2019
prev sibling parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 11 October 2019 at 11:08:55 UTC, Chris wrote:
 Care to write a book? I think you, Paulo Pinto and Walter and 
 others here could write a good book about it.
I am sure somebody has done so? There is at least a scientific journal about the history of computing where articles describe old systems in detail in order to record the history for future generations. (I did write an article about the first user-built graphical MUD on the Internet, though. I have to put it on the web some day.) Actually, that would be a good theme for a YouTube channel.
 I find it fascinating how companies like SUN etc. defeated 
 themselves.
Yeah, SUN and SGI had some great tech ideas, and I assume they also had great engineers, and it still didn't work out. I wonder what they could have come up with if they had addressed the personal computing space. They didn't survive networked clusters of Linux commodity PCs and fast, cheap Ethernet interconnects...
 Things have developed incredibly fast, but not as fast as they 
 could. What are the factors? Marketing strategies, 
 narrow-mindedness etc.
Right, and there are some recurring themes. Like the introduction of the iPad was kinda like the 80s all over. People got iPads, were fascinated by the hardware and were looking high and low to find applications to run on it, which in the early days were not polished. It was not obvious what they could use it for, so people created many kinds of apps, and users were looking for the next great thing to try out. That's pretty much what the early personal computing era was like too. People had very little preconception of what was possible with their hardware and would look for new and interesting software to run on it. Today there seems to be stagnation and lots of copying. The most profitable and marketable ideas are rehashed in 100s, if not 1000s, of variations, and new and unique ideas kind of drown in the noise. So now you not only have to create an interesting, polished app, you also need to understand marketing really well (and have money to do it). Seems that the early days of new tech are the most interesting times; then we hit a low-creativity equilibrium. Kinda sad... so much potential that is probably overlooked. There might be similar dynamics in relation to programming languages. Gonna think about that some.
Oct 11 2019
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 11 October 2019 at 12:59:48 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 11 October 2019 at 11:08:55 UTC, Chris wrote:
 Care to write a book? I think you, Paulo Pinto and Walter and 
 others here could write a good book about it.
I am sure somebody has done so? There is at least a scientific journal about the history of computing where articles describe old systems in detail in order to record the history for future generations. (I did write an article about the first user-built graphical MUD on the Internet, though. I have to put it on the web some day.)
I'd be interested in the dynamics: market, demand, the feedback technology <=> user, i.e. what drives what at what stage. At what stage do users have the upper hand and at what stage are users guided (or nudged) by technologies.
Oct 11 2019
parent reply Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 11 October 2019 at 13:15:24 UTC, Chris wrote:
 I'd be interested in the dynamics: market, demand, the feedback 
 technology <=> user, i.e. what drives what at what stage. At 
 what stage do users have the upper hand and at what stage are 
 users guided (or nudged) by technologies.
That is a very interesting topic. Feel free to send me an email if you want to discuss it further. I believe I have some books related to this in one way or another in another location, gotta have a look at that when I get the opportunity. Maybe something along the lines of The Evolution of Cooperation by Axelrod is relevant (I don't quite remember the angle now, so gotta browse through my bookshelf later :-). I used to be interested in online worlds where the users themselves can build or at least create societies with activities of some sort; the paper I talked about above described such a place. You played the adventure games, and when you had done them all, you would create new adventure games for others. (I guess many text MUDs work that way.) Of course, online games and online communities are not strictly market related, but there are at least stages that users go through that one has to think about when designing online games and online services. I haven't really followed this topic much since 2005, but might look into it again... So yeah, certainly interested, and could also do some article searches on the topic when I have time. :-)
Oct 11 2019
parent reply Chris <wendlec tcd.ie> writes:
On Friday, 11 October 2019 at 13:36:20 UTC, Ola Fosheim Grøstad 
wrote:
 On Friday, 11 October 2019 at 13:15:24 UTC, Chris wrote:
 I'd be interested in the dynamics: market, demand, the 
 feedback technology <=> user, i.e. what drives what at what 
 stage. At what stage do users have the upper hand and at what 
 stage are users guided (or nudged) by technologies.
That is a very interesting topic. Feel free to send me an email if you want to discuss it further. I believe I have some books related to this in one way or another in another location, gotta have a look at that when I get the opportunity. Maybe something along the lines of The Evolution of Cooperation by Axelrod is relevant (I don't quite remember the angle now, so gotta browse through my bookshelf later :-). I used to be interested in online worlds where the users themselves can build or at least create societies with activities of some sort; the paper I talked about above described such a place. You played the adventure games, and when you had done them all, you would create new adventure games for others. (I guess many text MUDs work that way.) Of course, online games and online communities are not strictly market related, but there are at least stages that users go through that one has to think about when designing online games and online services. I haven't really followed this topic much since 2005, but might look into it again... So yeah, certainly interested, and could also do some article searches on the topic when I have time. :-)
It'd be interesting to study software, hardware and IT technology in general in terms of the Austrian School - Ludwig von Mises [1] and Friedrich von Hayek [2]. I think the period from 1970 till today is the perfect example if one wants to analyze how systems evolve, which turns they take, the various forces that work within them - and how the state still hasn't figured out how to control IT. Fascinating. [1] https://en.wikipedia.org/wiki/Ludwig_von_Mises [2] https://en.wikipedia.org/wiki/Friedrich_Hayek
Oct 11 2019
next sibling parent Ola Fosheim Grøstad <ola.fosheim.grostad gmail.com> writes:
On Friday, 11 October 2019 at 13:59:35 UTC, Chris wrote:
 It'd be interesting to study software, hardware and IT 
 technology in general in terms of the Austrian School - Ludwig 
 von Mises [1] and Friedrich von Hayek [2].
I don't know much about the Austrian School, unfortunately. Are you thinking about this: https://en.wikipedia.org/wiki/Austrian_business_cycle_theory
 I think the period from 1970 till today is the perfect example 
 if one wants to analyze how systems evolve, which turns they 
 take, the various forces that work within them - and how the 
 state still hasn't figured out how to control IT. Fascinating.
Yes, that is true. Networked computers are creating an alternative landscape, with alternative real estate and everything that comes with that. But I think one would have to draw on many fields. There are some complex social dynamics in this evolution.
Oct 11 2019
prev sibling parent reply Laeeth Isharc <laeeth laeeth.com> writes:
On Friday, 11 October 2019 at 13:59:35 UTC, Chris wrote:
 On Friday, 11 October 2019 at 13:36:20 UTC, Ola Fosheim Grøstad 
 wrote:
 On Friday, 11 October 2019 at 13:15:24 UTC, Chris wrote:
 I'd be interested in the dynamics: market, demand, the 
 feedback technology <=> user, i.e. what drives what at what 
 stage. At what stage do users have the upper hand and at what 
 stage are users guided (or nudged) by technologies.
That is a very interesting topic. Feel free to send me an email if you want to discuss it further. I believe I have some books related to this in one way or another in another location, gotta have a look at that when I get the opportunity. Maybe something along the lines of The Evolution of Cooperation by Axelrod is relevant (I don't quite remember the angle now, so gotta browse through my bookshelf later :-). I used to be interested in online worlds where the users themselves can build or at least create societies with activities of some sort; the paper I talked about above described such a place. You played the adventure games, and when you had done them all, you would create new adventure games for others. (I guess many text MUDs work that way.) Of course, online games and online communities are not strictly market related, but there are at least stages that users go through that one has to think about when designing online games and online services. I haven't really followed this topic much since 2005, but might look into it again... So yeah, certainly interested, and could also do some article searches on the topic when I have time. :-)
It'd be interesting to study software, hardware and IT technology in general in terms of the Austrian School - Ludwig von Mises [1] and Friedrich von Hayek [2]. I think the period from 1970 till today is the perfect example if one wants to analyze how systems evolve, which turns they take, the various forces that work within them - and how the state still hasn't figured out how to control IT. Fascinating. [1] https://en.wikipedia.org/wiki/Ludwig_von_Mises [2] https://en.wikipedia.org/wiki/Friedrich_Hayek
I went in 1993 to see Don Lavoie at the Center for the Study of Market Processes at George Mason University, one of the three important centres of the Austrian school at that time (Israel Kirzner at NYU and Murray Rothbard at Auburn were the other ones). And whilst I was waiting for him I found a very interesting thesis by a guy on software components, applying Austrian capital theory sorts of ideas to software. I read the whole thing there and recently wondered what had happened to the author, because open source was sort of what he talked about, if not what he had in mind. It was Brad Cox, creator of Objective-C. There are two strains of thought when it comes to thinking about uncertainty and capital within the Austrian school. The function of pure profit is to stimulate awareness of possibilities for greater economic coordination. But there is something creative about the act of perception, and that has different implications depending on how you look at it. Time, ignorance, expectations, uncertainty, entrepreneurship, discovery - quite important topics where one can learn from Austrian and post-Keynesian thinking. So Austrian economics has had quite a lot of influence on how we approach things, including with technology.
Oct 14 2019
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Monday, 14 October 2019 at 16:23:01 UTC, Laeeth Isharc wrote:
 [snip]


 I went in 1993 to see Don Lavoie at the Center for the Study of 
 Market Processes at George Mason University, one of the three 
 important centres of the Austrian school at that time (Israel 
 Kirzner at NYU and Murray Rothbard at Auburn were the other 
 ones).
I did my MA in Econ at NYU and participated in the colloquium that Kirzner still showed up at from time to time.
Oct 14 2019
parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Monday, 14 October 2019 at 16:56:24 UTC, jmh530 wrote:
 On Monday, 14 October 2019 at 16:23:01 UTC, Laeeth Isharc wrote:
 [snip]


 I went in 1993 to see Don Lavoie at the Center for the Study 
 of Market Processes at George Mason University, one of the 
 three important centres of the Austrian school at that time 
 (Israel Kirzner at NYU and Murray Rothbard at Auburn were the 
 other ones).
I did my MA in Econ at NYU and participated in the colloquium that Kirzner still showed up at from time to time.
Speaking of the EU, I'm still a big fan of Huerta de Soto, but here we move towards money ... Anyway, the importance of the impact of being "simply human" in economics can be shifted to a lot of disciplines, including technical ones. And I'm convinced that this is the most powerful factor that is driving things nowadays.
Oct 14 2019
parent reply jmh530 <john.michael.hall gmail.com> writes:
On Monday, 14 October 2019 at 17:42:33 UTC, Paolo Invernizzi 
wrote:
 [snip]

 Speaking of the EU, I'm still a big fan of Huerta de Soto, but here 
 we move towards money ...
It's been years since I read him, but I recall him making some pretty weak arguments about fractional reserve banking being fraud. I believe either Larry White or George Selgin responded to the arguments, by him and others, sufficiently to suit me. Other Austrians who have the same beliefs about FRB tended to make very bad hyperinflation forecasts in the aftermath of the Great Recession. That being said, I don't know what de Soto predicted at the time.
Oct 14 2019
parent reply Paolo Invernizzi <paolo.invernizzi gmail.com> writes:
On Monday, 14 October 2019 at 18:21:24 UTC, jmh530 wrote:
 On Monday, 14 October 2019 at 17:42:33 UTC, Paolo Invernizzi 
 wrote:
 [snip]

 Speaking of the EU, I'm still a big fan of Huerta de Soto, but 
 here we move towards money ...
It's been years since I read him, but I recall him making some pretty weak arguments about fractional reserve banking being fraud. I believe either Larry White or George Selgin responded to the arguments, by him and others, sufficiently to suit me. Other Austrians who have the same beliefs about FRB tended to make very bad hyperinflation forecasts in the aftermath of the Great Recession. That being said, I don't know what de Soto predicted at the time.
Oh, I don't want to turn this post into a discussion around economics, but I believe, as he does, that fractional reserve IS fraud ... White and Selgin simply missed the point, but, again, who cares in the end ... Regarding hyperinflation, well, we have been in a debt deflation since 2008; Japan's history is here to tell us that the US and the EU are doomed to a long journey through low or negative rates... today the Fed's 60 billion/month of "it's not QE, it's a different thing!" is there to testify to that ... everything is under control ... But in the end, as usual, a default WILL happen, and we will see if the Austrians' fate will be legend or hard truth ... End of economic rant, but, hey, I've enjoyed the digression!
Oct 14 2019
parent jmh530 <john.michael.hall gmail.com> writes:
On Monday, 14 October 2019 at 18:53:15 UTC, Paolo Invernizzi 
wrote:
 [snip]

 Oh, I don't want to turn this post into a discussion around 
 economics, [snip]
Probably a good idea!
Oct 14 2019
prev sibling parent Arun Chandrasekaran <aruncxy gmail.com> writes:
On Friday, 11 October 2019 at 06:30:28 UTC, Ola Fosheim Grøstad 
wrote:
 On Thursday, 10 October 2019 at 16:32:44 UTC, Ethan wrote:
 https://www.youtube.com/watch?v=Og847HVwRSI

 While not surprising, it was still fascinating watching 
 Objective-C come out of nowhere to make the list 25 years 
 after it was first released.
Actually, it is surprising, because it is wrong. Assembler and BASIC were much larger in the mid 80s. 4GLs should have had a fairly strong presence in the late 80s. Etc. It is most likely based on surveys of big corporations. Most programming happened outside of those.
Coincidentally, this talk brings up the facts behind the PL popularity trends. https://youtu.be/QyJZzq0v7Z4?t=124 Especially the fact that $$$ were spent by Sun on Java marketing, and JS just rode along on the Java buzzword. I am wondering how much money Mozilla/Google would have spent on marketing Rust/Go.
Oct 11 2019
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2019-10-10 18:32, Ethan wrote:
 https://www.youtube.com/watch?v=Og847HVwRSI
 
 While not surprising, it was still fascinating watching Objective-C 
 come out of nowhere to make the list 25 years after it was first 
 released.
The introduction of the App Store on the iPhone and letting third-party developers create apps for the iPhone. -- /Jacob Carlborg
Oct 11 2019