Programming methodologies
All the research on high-level programming methodologies could be summarized in just a couple of sentences:
Write declarative, mostly functional, specialized DSLs, package them as libraries of higher-order functions, and design layers of these DSLs. And that is, literally, it.
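A minimal sketch of the idea (the names select, suchThat and orderedBy are illustrative, not from any standard library): each layer is just a library of higher-order functions, and the top layer reads almost declaratively at the use site.

import Data.List (sortOn)

-- Layer 1 is the Prelude itself: generic combinators (map, filter, sortOn).

-- Layer 2: a tiny query vocabulary, defined by renaming and composing layer 1.
select :: (a -> b) -> [a] -> [b]
select = map

suchThat :: (a -> Bool) -> [a] -> [a]
suchThat = filter

orderedBy :: Ord k => (a -> k) -> [a] -> [a]
orderedBy = sortOn

-- Layer 3, the use site, reads like a sentence:
-- select fst (suchThat ((> 0) . snd) (orderedBy fst pairs))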
Ideally, one strives to have almost exactly declarative mathematical notation at the use site, like the list comprehensions of Haskell:
map f xs = [f x | x <- xs] -- or filter p xs = [x | x <- xs, p x]
Hutton's dialect of Haskell has monad comprehensions too. The infamous do-notation is another example of a declarative DSL which hides the ugliness of implementation details.
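To see what the sugar hides, compare a do-block over the list monad with its desugaring into explicit (>>=) plumbing; for lists, the monad comprehension [(x, y) | x <- [1, 2, 3], y <- [x .. 3]] denotes the same computation.

pairs :: [(Int, Int)]
pairs = do            -- the declarative surface syntax
  x <- [1, 2, 3]
  y <- [x .. 3]
  return (x, y)

-- What the compiler actually sees after desugaring:
pairs' :: [(Int, Int)]
pairs' = [1, 2, 3] >>= \x -> [x .. 3] >>= \y -> return (x, y)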
The most successful applications, such as TeX or Rails, are sets of specialized DSLs.
The only relevant recent development worth mentioning is so-called DDD - Domain-Driven Design - which is a high-level framework for organizing domain knowledge and extracting it from the jargon of professionals in the field.
The main point is that the evolved vocabulary and standard idioms should be incorporated into and directly expressed in the code. There should not be any translation of terms (nouns and verbs) and, ideally, no introduction of unnecessary, redundant abstractions, which is the root of all evil.
And that is really it.
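To make the DDD point concrete, here is a hedged sketch using the classic shipping domain (the types and the rule are illustrative, not taken from any particular codebase): the experts' nouns become types, their verbs become functions, and a business rule reads almost exactly like the expert's sentence.

newtype TrackingId = TrackingId String deriving (Eq, Show)

data Port = Rotterdam | Shanghai | Oakland deriving (Eq, Show)

data Cargo = Cargo { trackingId :: TrackingId, destination :: Port }
  deriving Show

data HandlingEvent = Loaded Port | Unloaded Port | Claimed deriving Show

-- "A cargo is misdirected when it is unloaded at a port other than its
-- destination" -- the rule is stated in the domain's own vocabulary.
misdirected :: Cargo -> HandlingEvent -> Bool
misdirected cargo (Unloaded port) = port /= destination cargo
misdirected _     _               = False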
One models actual domain knowledge with math (sets, relations and logic), then expresses it as a set of core types and implements a declarative DSL around these core types.
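A minimal sketch of what that looks like, assuming nothing beyond Data.Set: a relation is literally a set of pairs, exactly as in the math, and the DSL is a handful of declarative combinators over that core type.

import Data.Set (Set)
import qualified Data.Set as Set

-- The core type is the mathematical object itself.
type Relation a b = Set (a, b)

-- The image of a set under a relation.
image :: (Ord a, Ord b) => Relation a b -> Set a -> Set b
image r xs = Set.fromList [y | (x, y) <- Set.toList r, x `Set.member` xs]

-- Relational composition, written the way it is defined on paper.
compose :: (Ord a, Eq b, Ord c) => Relation a b -> Relation b c -> Relation a c
compose r s = Set.fromList
  [ (x, z) | (x, y) <- Set.toList r, (y', z) <- Set.toList s, y == y' ]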
This is how a single person (Knuth) or a small team (Rails) could bootstrap a really sophisticated project using very limited resources (both time and manpower). Getting the principles right is the key.
Notice how successful Rails is, while being implemented in a crappy language, and how manageable and virtually bug-free TeX is, being a huge collection of specialized packages. This is where a proper theory works.
Imagine using better tools for the job. It is a well-known, experimentally established fact that a solution in Haskell could be a factor of 10 smaller in lines of code (due to sum types and pattern matching on them).
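A sketch of why: the classic expression-evaluator example, where one sum type plus total pattern matching replaces what would be a class hierarchy with a visitor in a mainstream OO language.

data Expr
  = Lit Int
  | Add Expr Expr
  | Mul Expr Expr
  deriving Show

-- The whole evaluator; the compiler checks that every case is covered.
eval :: Expr -> Int
eval (Lit n)   = n
eval (Add a b) = eval a + eval b
eval (Mul a b) = eval a * eval b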
So, FP (Milner, Wadler, Cardelli), modularity (the ADTs of Liskov) and the right tools together could guarantee that one at least would not fail miserably, and that even a partial solution would be reusable and worth the time and effort spent.
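A hedged sketch of the Liskov-style abstract data type discipline in Haskell (the Stack module is illustrative): the module exports the type and its operations but not the constructor, so clients can only go through the minimal interface and the representation stays free to change.

module Stack (Stack, empty, push, pop) where

-- The representation is private: the Stack constructor is not exported.
newtype Stack a = Stack [a]

empty :: Stack a
empty = Stack []

push :: a -> Stack a -> Stack a
push x (Stack xs) = Stack (x : xs)

pop :: Stack a -> Maybe (a, Stack a)
pop (Stack [])       = Nothing
pop (Stack (x : xs)) = Just (x, Stack xs)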
C++
While Java is the monument to human stupidity, C++ is a tribute to brain-dead over-“engineering” (imagine designing a language without putting composition at its core, while ignoring everything that the Lisp, ML and Smalltalk communities have researched).
In C++ one is always very explicit and very verbose about every particular implementation detail. Not just verbose, but cryptic, which, of course, helps with employment - memorizing lots of details and particulars is what most people consider knowledge.
This habit of being over-verbose about particulars leads to the creation of unnecessary, redundant and particularly ugly abstractions, which quickly become standardized and set in stone.
Modern C++ is such a cathedral of solidified crap. Compared to PHP or node_modules, however, it is a cathedral indeed. Just don’t look too closely at what exactly it is made of.
Old timers
Programmers from the golden era (the 80s and early 90s, which culminated in the Haskell 98 Report and has been in steady decline since) used to have a solid background in mathematics, so the interfaces they designed were based on well-known mathematical structures, with mathematical rigor and attention to detail. They understood the importance of minimal interfaces and uniformity - everything that the mathematicians of the past realized while studying actual and generalized patterns of reality (which is what proper math is about).
No one writes like this anymore, and this is the real cause of suffering. The last old-timer language was David Moon’s PLOT (the Programming Language for Old Timers), or we could consider Scala 3 the last worthy member of the ML family.