What is the language of the future?

Started by Jeremy Dunn, June 13, 2009, 11:24:58 AM


Jeremy Dunn

This post is simply my amateur musings about the future of programming, and I am asking for your thoughts. One of the things I was hoping would improve is that the environment would allow you to write code in any language you wanted, so that usefulness would determine which languages get used. Every application I own locks me into limited languages instead of giving me freedom. Why is this and what can be done about it?



Is Mathematica the future? It is up to 2700+ functions and counting. Is this a new idea to simply grow the language to the point where it starts eating everything around it? There is something to be said for this approach and I think we can agree that Mathematica is a pretty cool product. Is Mathematica going to be like China and quietly grow in the background until everyone realizes too late that they can't stop it?



What is the syntax of the future? Can much more be done? We here obviously prefer LISP syntax because of (to us) its simplicity. Why has no one done objective testing of different languages to determine which ones are most likely to be difficult to learn or apply properly? Why is there nothing but idiotic flame wars on the subject? You would think the software industry would investigate this more with billions of man-hours of work on the line. Is it like music, where one person likes jazz and another prefers classical with no objective standard? Do two people prefer different syntaxes because their brains actually perceive differently, and each syntax is ideal for that type of thought? Can we show that a given language is better even though another one is an individual's favorite pick?



What trends do you see? What do you think the future of programming will be?

Kazimir Majorinc

#1
Jeremy, my take on these issues is:


Quote from: "Jeremy Dunn"Every application I own locks me into limited languages instead of giving me freedom. Why is this and what can be done about it?


I think it is a balance between the human instincts for union and separation (and even fighting). It would be better if people were more about cooperation than competition or fighting, but culture has not developed enough to make up for our instincts yet.


Quote: Mathematica ...


Wolfram Research invests in technology that will not become obsolete, so, yes, they are very promising. But they are still a surprisingly small company (400 employees?), so their technological advantage cannot be that big yet.


Quote"What is the syntax of the future? Can much more be done? We here obviously prefer LISP syntax because of (to us) its simplicity. Why has no one done objective testing of different languages to determine which ones are most likely to be difficult to learn or apply properly?"


There has been some research on that. In this paper Lisp comes out relatively well: http://www.flownet.com/gat/papers/lisp-java.pdf But language designers try to be innovative: not to make a product that caters to people's habits, but to improve those habits.


Quote"Why is there nothing but idiotic flaming wars on the subject? You would think the software industry would investigate this more with billions of manhours of work on the line. Is it like music where one person likes jazz and another prefers classical with no objective standard?



What trends do you see? What do you think the future of programming will be?"


Yes, some people like one language, other people like another. Even in mathematics some people prefer short notation and others prefer long. But it is important that people share the same language. A market produces quality only if consumers are able to recognize quality, and with programming, as with art or philosophy, that is not so simple. So I expect very slow progress in the mainstream.
WWW site: http://kazimirmajorinc.com/; blog: http://kazimirmajorinc.blogspot.com

unixtechie

#2
The most successful programming language (or several of them) will be based on innate patterns of thinking. This is not speculative: those patterns can be observed in, and picked up from, human languages.



Let me explain. Functional programming (the mathematics it is based on being completely irrelevant to its success among human practitioners) uses the same mechanism that is built into human languages:



A man killed a dog.

A middle-aged man killed a pitbull of unknown origin.

A middle-aged man, who was walking in the park at midday, killed a raging pitbull of unknown origin which had previously attacked -- and so on.



The style of functional programming uses substitution for some entities, expanding them from "one word" to "a phrase, the evaluation of which provides that word".



Compare this to "procedural languages". They cannot insert entities into entities, and therefore a programmer is forced to speak in a series of the simplest sentences:



There was a man. A man was middle-aged. The man was walking. Walking was done in the park. ..................



In programming terms, each of these simplest statements brings to life an intermediary variable; the next statement then acts on the produced variable, producing its own intermediary thingy, and so on.



The ability to insert phrases into phrases, which Lisp and functional languages are so good at, RIDS your brain of the preoccupation with juggling all those intermediaries, and so immediately lets you think more easily and clearly.
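
A minimal newLISP sketch of the contrast (the arithmetic is made up purely for illustration):

    ;; functional style: phrases nested inside phrases, no intermediaries
    (println (* (+ 2 3) 4))        ; => 20

    ;; procedural style: every step brings an intermediary variable to life
    (set 'sum (+ 2 3))
    (set 'product (* sum 4))
    (println product)              ; => 20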



That is not all. We also have "object-oriented" semantics, but that one is so unnatural and outrageous that I'd be afraid to speak of it myself -- I refer you instead to the (unnecessarily long, but) good essay "Execution in the Kingdom of Nouns". Do read it:

http://steve-yegge.blogspot.com/2006/03/execution-in-kingdom-of-nouns.html



The language of the future makes thinking easier BY WORKING ACCURATELY with the built-in patterning humans are good at (as can be observed from the structure of "real" human languages), and BY NOT overburdening humans with housekeeping and intermediate entities, which appear as a result of a language's crude design.



Not only the built-in language patterns matter; so do the micro-actions one takes during the act of programming.

"Literate programming", which is a system to process macros CREATED IN A HUMAN LANGUAGE - literate programming in essence is programming IN PSEUDOCODE, familiar to anyone who studied CS and read humanly understandable explanations -- is a leading approach in such meta-management aids.



Most programmers use program structure - like subs/procedures - to help their brains "chunk" and process the stuff. Literate programming unloads this organizational level from the code and splits housekeeping from coding, again raising the "brainpower", because no management tasks need to be tracked at the moment of programming any more.
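
A minimal newLISP sketch of that chunking idea (the names and data are invented purely for illustration): each chunk gets a name that reads like the pseudocode it came from, so the last line can be murmured as a sentence.

    ;; "fetch the scores, drop the failing ones, report the rest"
    (define (fetch-scores) '(72 85 90 61))                      ; stand-in data source
    (define (drop-failing scores) (filter (fn (s) (>= s 75)) scores))
    (define (report scores) (println "passing: " scores))

    (report (drop-failing (fetch-scores)))                      ; => passing: (85 90)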

unixtechie

#3
.. and by the way, the deep psychological reason - the need to make languages as natural (i.e., as close to human language) as possible - is why I suggest that the documentation of programming languages be rewritten WITH NATURAL PHRASES.



"Old style" (think of 1970s, the tradition is well alive today) when instead of giving a practical guidance to a programmer, the manuals would introduce a new notation, then describe the language operators in that notation:
Quote "Terms and definitions" - "every activation has an activation point, an activation period and activation end"



Note 2: Semantically, the binding relation might be imagined to be materialized in some entity, the binding. Such a binding entity is constructed at run time and destroyed later, or might have indefinite extent.



A defining form is a toplevel special form (see §12.3) that establishes a binding between name and an object which is the result of handling the arguments according to the semantics implied by defining-form-name; it is a violation if a defining form is not a toplevel form.


(extracted from some LISP documentation)

THIS is what makes the language "not widely accepted", not any imagined technical flaws. A human must INTERNALLY TRANSLATE the crap into PRACTICAL terms and instructions, and this work is excruciatingly painful.



The newer Internet generation of the 1990s came up with an immensely better approach to documentation - practical examples in the first place, generalizations second - and rid their heads of the need to retranslate the stuff from governese into human.



However, that is not enough. Each and every operator must be given A PHRASE in a human language that it implements:

Not "cons exp-1 exp-2" -- but

CONS(truct) list_member1 list_member2 ---- return constructed list (member1 member2)

not "find str str" -- but "find this_string inside_this_string" --- return offset_number



and so on
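
newLISP itself already behaves the way those two phrases read, as a quick check shows:

    (cons 'member1 'member2)    ; => (member1 member2), a constructed list
    (find "lo" "hello world")   ; => 3, the offset of "lo" inside "hello world"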



The reasons are psychological: ZERO internal translation must be done by the user; he should get a phrase that he can murmur to himself while "speaking" the operator during programming.



Only by being natural to thinking can languages succeed, because the capacity of the human mind, especially "short-term memory" capacity, is limited - and by disposing of the unnecessary load on thinking, we can advance it at virtually no cost.

Kazimir Majorinc

#4
And what do you think about these issues, Jeremy?
WWW site: http://kazimirmajorinc.com/; blog: http://kazimirmajorinc.blogspot.com

Jeremy Dunn

#5
Quote from: "Kazimir Majorinc"And what do you think about these issues, Jeremy?


Well, I am not sure about Unixtechie's belief in using the principles of "natural" language. In natural language we do not make use of parentheses and cannot nest expressions more than a couple of levels without getting lost. Larry Wall tried to use linguistic principles in Perl's syntax, but my impression is that it is absurdly top-heavy, with dozens of syntax rules that a mere mortal like me cannot keep straight in my head without constantly referring to a crib sheet. The most effective languages seem to use our visual processing rather than the linguistic kind. Most text editors now use color, as there is a generally acknowledged belief that this aids accuracy and clarity. I believe that code will take more advantage of the visual system and perhaps move into 3D isometric representations. I don't believe this would be just a gimmick; our visual system starves for 3D, and it is a powerful method of representation.



There needs to be more of a concerted effort to make software as language-neutral as possible. Couldn't a system be developed to specify a language's syntax in terms of a neutral intermediary like Backus-Naur notation? The translator would go from the language to Backus-Naur form and then translate that into code for the computer. This probably would not result in optimum performance, but performance is not everything. There is a need to free programmers from being forced to use languages that they don't want to. I see this as a real roadblock to progress.
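
A toy newLISP sketch of what I mean (both surface syntaxes and the neutral form here are invented purely for illustration): each front end would parse its own surface syntax into a shared intermediate form, and only that form would be compiled for the machine.

    ;; two hypothetical surface syntaxes mapped to one neutral intermediate form
    (set 'rules '(
        ("if C then A else B" (if C A B))     ; BASIC-like surface syntax
        ("C ? A : B"          (if C A B))))   ; C-like surface syntax

    (println (lookup "C ? A : B" rules))      ; => (if C A B)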



We LISPers, I think, have a love for higher levels of abstraction, and that is the fuel for jokes about LISPers. I may be wrong, but I don't expect to see tremendous progress in new abstractions. From what I see, most modern languages seem to reinvent abstractions that were already part of LISP years ago and then act as if they were there first.



I think there is something to be said for the Mathematica approach. With a large vocabulary of functions, Mathematica is getting big enough to have some claim to being an actual language. At least the user of Mathematica does not have to become a math major to have very complex algorithms in his toolbox. I know Lutz said in one posting that he didn't want to see newLISP become Mathematica, but would that really be so awful?



There seems to be this major divide between object-oriented and functional programming, with people drawing a sharp line in the sand. As I mentioned before, I suspect we do not all think the same. Experiments have shown conclusively that some people hear the same sequence of audio tones differently. I think it likely that some people are hardwired for object-oriented thinking and others for functional. There may be no way around this other than having multiple languages. It would be nice if someone could test this hypothesis to see if it is true. The language that is mathematically optimal is likely to be incapable of being grasped by ordinary people.

TedWalther

#6
Well, let us look at two new directions.



Haskell provides "monads".  I don't understand them one bit, not at all, and don't give two figs.  But by osmosis, I've absorbed the idea that they are a higher-level abstraction of lambda functions.  That might be a future thing that gets incorporated into all languages.



Lua tries to go LISP one better by using tables rather than lists as the default data type.  I have no experience at all with Lua, so I don't know how that is working out, except that it seems to be used a lot in games and in fun little hacker tools like "nmap".



And recently Google introduced a table-based rather than list-based version of its search service.  Was it cribbing from Lua?



COBOL, LISP, FORTRAN, and C have survived.  Algol and Pascal fell by the wayside, but REXX and the POSIX shell borrowed from them.  C++ and Java are the new COBOLs.  JavaScript has a lot of potential because it is SO DAMN SIMPLE.  Just like C originally was.  Really, JavaScript is a stripped-down LISP wrapped in C syntax, with the MOO object system thrown in.



Future?  Who can predict?  Based on the widespread use of Visual Basic, JavaScript, and PHP, I think newLISP has a DAMN good shot at being the eventual language of the future; it is doing everything RIGHT... all the key libraries in place, all the important idioms documented, friendly and excellent user support...  Plus the niftiness of LISP.
Cavemen in bearskins invaded the ivory towers of Artificial Intelligence.  Nine months later, they left with a baby named newLISP.  The women of the ivory towers wept and wailed.  "Abomination!" they cried.

didi

#7
As a hobby programmer I maybe don't have the right view, but I think the language of the (farther?) future is still not here.

My dream for a long time has been a new programming language: self-optimizing, self-organizing, self-simplifying. And I think newLISP is the first step toward it.

TedWalther

#8
newLISP already includes the good ideas from Modula-2 and a bunch of other languages, and it is super simple to use and learn.  I do think it is the first step toward the future language.  At some point we just need to figure out how better to incorporate assembly language as a standardized thing.



Perhaps this: compile the asm code into a byte string, place it in memory, take the address of the first byte, and assign it to a symbol; then when it is called the same way a function is called, it is just a jmp to that address?  Is that already how it works?  I'd really like asm functions that don't get clobbered in the call/jsr and ret/return calls.



Perhaps for 11.0. :-)



Ted
Cavemen in bearskins invaded the ivory towers of Artificial Intelligence.  Nine months later, they left with a baby named newLISP.  The women of the ivory towers wept and wailed.  "Abomination!" they cried.

Lutz

#9
It has always been possible:



http://www.newlisp.org/index.cgi?page=Embedded_Binary



and here:



http://www.newlisp.org/newlisp_manual.html#cpymem





PS: this example works only on 32-bit newLISP on Intel processors.
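
For other platforms, one portable route (not the cpymem trick above, just a rough sketch; the library and symbol names are made up) is to assemble the code into a shared library and pull it in with import, like any C function:

    ; illustrative build step: nasm -f elf64 add.asm && ld -shared -o libadd.so add.o
    (import "./libadd.so" "asm_add")    ; asm_add is a hypothetical exported symbol
    (println (asm_add 2 3))             ; arguments and return value pass as integers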

TedWalther

#10
Ah, so when I define a function to point to a memory address, then call it, it is a direct jmp?  I was concerned that argument parsing would interfere a little bit, or that registers might get clobbered on return.



For instance, let us say I make an asm function that does some hardware-dependent stuff.  It leaves a return value in a particular register.



Then I immediately call another asm function I've defined, which stuffs that register value into a newLISP symbol.  Will that work?



Also, I know it is possible in newLISP right now; it is just a matter of formalizing it and incorporating the assembly language for each architecture.  But that can wait.



Lutz, what do you think of the way Parrot implements continuations using registers instead of a stack?  The developers say it is very fast and cheap to make continuations that way.  Does newLISP support continuations?  (Like monads, I don't really understand continuations, just that they are buzzy, trendy computer-theory ideas.)
Cavemen in bearskins invaded the ivory towers of Artificial Intelligence.  Nine months later, they left with a baby named newLISP.  The women of the ivory towers wept and wailed.  "Abomination!" they cried.

Elica

#11
Ah, IMHO programming languages will die within the span of a human life. What I actually mean is that in the future people will communicate with machines (this includes "programming") in a non-verbal, non-language way. I don't know the details yet (ha-ha!), but it could be something like a direct brain-to-machine connection, so machines will be "wired" to us like external limbs...



So in a hundred years the only programming languages one will see will be those in the Museums of Ancient Programming.

Jeremy Dunn

#12
Quote from: "Elica"Ah, IMHO programming languages will die within the span of a human life. What I actually mean is that in the future people would communicate with machines (this includes "programming") in a non-verbal and non-language approach. I don't know the details yes (ha-ha!) but it could be something like direct brain-to-machine connection, so machines will be "wired" to us like external limbs...



So in a hundred years the only programming languages one will see will be those in the Museums of Ancient Programming.


I am very skeptical of this. Do you really want someone drilling a hole in your head and making these very delicate connections, at your risk, given what we know of hospital goof-ups? And every time a correction or replacement is needed, they have to grind and slice you some more? How much of your gray matter are you willing to have them dice open to make you a super cyborg? Do you trust these microchips to sit in your head for 60 years and be completely unaffected by the surrounding biomaterial they are embedded in? Oh, and if you have the ability to interface directly with the outside world, what is to stop some hacker from downloading some crap onto your chip that makes you have an epileptic seizure every 10 minutes or throw up continually? I think I will let you be the daring test case to lead the way for the rest of us.

Elica

#13
I'm afraid you are interpreting my words too literally. There is no need for medical intervention, no wires, no chips. You've watched too many crap sci-fi movies.



Just kidding, of course.



What I imagine is an interface that is not bound to a physical connection in the conventional sense. Let's say you can control a device with your thoughts, your gestures, your facial expression, or whatever you may think of.



Also, I'm not sure that there will be any programming at all (the way we understand it now). No languages with grammars and syntax rules, no processes like "compiling", "linking", etc.



Just imagine someone 30-40 years ago asking: "What is the future of computer cards? What will a computer card look like in 2000? Will they still be rectangular? Will the holes still be arranged in a grid?"



http://www.vampandtramp.com/finepress/h/hanmer-Beaut-L.jpg





But it all depends on the definition of "future". For me, "future" is 100 years from now. And to be perfectly honest, I'm not sure there will be computers in 2109!!!