Originally, this post was started to commemorate the release of CP/M, the first operating system for microprocessors. My best guess for its release date is 1974-04-25. Many sources state April, but are hesitant to specify an exact date. The person responsible for its development was Gary Kildall (1942 – 1994). This weblog post is published on the thirtieth anniversary of Kildall’s death, rather than on the fiftieth anniversary of the first CP/M release.
Kildall had his origins in Seattle, Washington. His grandfather was a Norwegian immigrant who ran a navigation school. Kildal is the name of a farm at Hægeland, Vest-Agder, in the south of Norway. His maternal ancestors had their roots in Långbäck, Skellefteå, Sweden, before his maternal grandmother, Sophia Lundmark, emigrated to Edmonton, Alberta, Canada. His mother later moved on to Seattle.
Kildall was awarded a doctorate in computer science from the University of Washington in 1972. He then worked as a computer science instructor at the Naval Postgraduate School in Monterey, California. This was to fulfill his military conscription obligations, since the US was still engaged in the Vietnam War.
Later that year, to learn more about processors, Kildall bought an Intel 4004 processor and began writing experimental programs for it. Intel lent him 8008 and 8080 processor systems. In 1973, Kildall developed the first high-level programming language for microprocessors, called PL/M = Programming Language for Microcomputers. It incorporated ideas from: PL/I = Programming Language One, developed at IBM starting in 1964; ALGOL = Algorithmic Language, originally from 1958; and XPL = expert’s programming language, from 1967.
One of the first software projects Kildall worked on, with Ben Cooper, was the Astrology Machine. It is generally regarded as unsuccessful, but it gave Kildall an opportunity to field test programs he had written: a debugger, an assembler, part of an editor, and a BASIC interpreter that he used for programming.
In 1974, Kildall developed CP/M = Control Program/Monitor (originally)/ Control Program for Microcomputers (later). Intergalactic Digital Research (originally) /Digital Research, Inc. (DRI, later) was established by Kildall and his wife Dorothy McEwen (1943 – 2005) to market CP/M.
In 1975, Kildall developed a set of BIOS = Basic Input/Output System routines: firmware used to provide runtime services for operating systems and programs, and to perform hardware initialization during the boot process = power-on startup. BIOS initially allowed 8080 and compatible microprocessor-based computers to run the same operating system on any new hardware with trivial modifications. Later, different BIOS implementations were made for other computer families.
A source-to-source translator = source-to-source compiler (S2S compiler) = transcompiler = transpiler, is a type of translator that takes the source code of a program written in one programming language as its input and produces equivalent source code in the same or a different programming language. In 1981, DRI introduced one of these, calling it a binary recompiler. XLT86, written by Kildall, translated .ASM source code for the Intel 8080 processor (in a format compatible with the ASM, MAC or RMAC assemblers) into .A86 source code for the 8086 (compatible with ASM86).
Kildall initiated the creation of the first diskette track buffering schemes, read-ahead algorithms, file directory caches, and RAM drive emulators.
At this point it becomes difficult to separate Kildall’s role as innovator/ inventor from his role as initiator/ project manager/ executive, where other engineers at DRI and elsewhere made significant technical contributions.
For example, Tom Rolander made most of the development contributions to CP/M starting in 1979. These later resulted in operating systems with preemptive multitasking and windowing capabilities, as well as menu-driven user interfaces: Multi-Programming Monitor Control Program (MP/M), Concurrent CP/M, Concurrent DOS and DOS Plus.
In 1984, DRI started development of the Graphics Environment Manager (GEM), known primarily as the native graphical user interface of the Atari ST series of computers, providing a desktop with windows, icons, menus and pointers (WIMP). It was an outgrowth of a more general-purpose graphics library known as Graphics System Extension (GSX), written by a team led by Don Heiskell from about 1982. Another major contributor was Lee Jay Lorenzen at Graphic Software Systems.
Kildall and Rolander founded Activenture in 1984. They created the first computer interface for video disks to allow automatic nonlinear playback, presaging today’s interactive multimedia. This company became KnowledgeSet in 1985, which developed the file system and data structures for the first consumer CD-ROM, an encyclopedia for Grolier.
On 1964-05-01, mathematicians John G. Kemeny (1926 – 1992) and Thomas E. Kurtz (1928 – ) successfully ran a program written in BASIC = Beginner’s All-Purpose Symbolic Instruction Code, on Dartmouth College’s General Electric GE-225 mainframe computer.
Deception is a key word in the story of General Electric’s (GE) computers. GE was founded 1892-04-15 in Schenectady, New York. In the 1960s, Chairman Ralph J. Cordiner (1900 – 1973) had forbidden GE from entering the general-purpose computer business. Homer R. “Barney” Oldfield (1916 – 2000), General Manager of GE’s Computer Department in Phoenix, Arizona, claimed that the GE-200 series would be industrial control computers. When Cordiner discovered he had been duped, he immediately fired Oldfield. Despite this, production of the computer series continued as a profitable venture for several years.
For the technically interested only: the GE-225 used a 20-bit word, of which 13 bits could be used for an address. Along with a central processing unit (CPU), the system could also contain a floating-point unit (FPU) = Auxiliary Arithmetic Unit (AAU). Alternatively, there was a fixed-point decimal option with three six-bit decimal digits per word. It had eleven I/O channel controllers. The machines were built using about 10 000 discrete transistors and 20 000 diodes. They used magnetic-core memory; a standard 8 kiloword system held 186 000 magnetic cores. A base-level mainframe weighed about 910 kg. GE sold a variety of add-ons, including storage disks and printers.
I imagine that both GE and the computer department at Dartmouth were attempting to use each other. GE probably wanted a functioning operating system, but didn’t have the human resources to make one. Dartmouth College wanted a computer, but didn’t have the money to buy one. The result was a compromise that benefited both. In 1963, Kemeny applied for a National Science Foundation grant to bring a GE-225 computer to Dartmouth and build a general-purpose time-sharing system, essentially an operating system. This time-sharing approach would allow others (read: faculty and students) to access the mainframe computer and run programs using BASIC. It took just a year to implement.
While Dartmouth College copyrighted BASIC, it was made freely available to everyone. The name originated from Kurtz’s wish for a simple but meaningful acronym. Kurtz, in an open letter, reiterates that BASIC was invented to give students a simple programming language that was easy to learn. It was meant for amateurs, not computing professionals.
Language standards were created: the European Computer Manufacturers Association (ECMA) was founded in 1961 to promote computing standards in Europe. It released its version of a BASIC standard in 1986. This was followed in 1987 by the American National Standards Institute (ANSI) releasing its version. In 1994, ECMA was renamed Ecma International.
In 1975, Paul Allen and Bill Gates adapted BASIC for personal computers like the Altair 8800. I am uncertain of Gates’ motivations. At this early stage, he undoubtedly appreciated BASIC, as he fell into the amateur category, rather than being a professional (system) programmer. My perspective is that he capitalized on the work of others: Kemeny and Kurtz with BASIC; Tim Paterson (1956 – ) with the Quick and Dirty Operating System (QDOS) at Seattle Computer Products, which became MS-DOS for Microsoft.
In 1976, Steve Wozniak developed a BASIC interpreter for the Apple I, which subsequently became Integer BASIC for the Apple II in 1977. More BASICs followed with the IBM personal computer (PC), launched in 1981.
In 1991, Microsoft developed Visual Basic. Over the years new variants emerged, such as Microsoft Small Basic (2008), which teaches beginners programming concepts. BASIC and similar languages are important because they emphasize simplicity, readability, and ease of use.
Kemeny and Kurtz’ work on BASIC was recognized by the Institute of Electrical and Electronics Engineers (IEEE) as part of its Milestone program, which marks historic places of human innovation around the world. Places honored include Thomas Edison’s lab in Menlo Park, New Jersey, where he invented the light bulb and phonograph, and the hilltop outside Bologna, Italy, where Guglielmo Marconi sent his first radio transmissions. On 2021-02-22, a plaque was placed outside the computer lab at Collis Center, 2 N Main St, Hanover, NH 03755, U.S.A. The citation reads: Beginner’s All-purpose Symbolic Instruction Code (BASIC) was created in this building. During the mid-1970s and 1980s, BASIC was the principal programming language used on early microcomputers. Its simplicity and wide acceptance made it useful in fields beyond science and mathematics, and enabled more people to harness the power of computation.
Notes
Kemeny was president of Dartmouth College from 1970 to 1981 and pioneered the use of computers in tertiary education. He chaired the presidential commission that investigated the meltdown of the Unit 2 reactor (TMI-2) at the Three Mile Island Nuclear Generating Station on the Susquehanna River, near Harrisburg, PA. The accident occurred 1979-03-28, and released radioactive gases and radioactive iodine into the environment, resulting in the worst accident in American commercial nuclear power plant history.
I have written about Dartmouth College before. The Synclavier synthesizer was developed there.
Of course, I am hoping that readers will mistake Prolog for Prologue = an introduction to something. In addition, I hope that the poster displayed above will induce a feeling of calmness in these same readers, so that they will be able to approach the real content of this weblog post with detachment, but not indifference. The main problem with the poster is that almost everything about it apart from its wording (especially its signal-red background, but also its large sans-serif white lettering and the British crown) reinforces a feeling of danger!
Wikipedia tells us that Keep Calm and Carry On was a propaganda poster produced by the British government in 1939, in preparation for World War II. The poster was intended to raise the morale of the British public. Although 2.45 million copies were printed, the poster was only rarely displayed in public and was little known until a copy was rediscovered in 2000 at Barter Books, a bookshop in Alnwick, a market town in Northumberland, in north-east England.
Some topics, toothaches in particular, or dentistry more generally, do not induce calmness. Instead, they increase the flow of adrenaline and other forms of psychomotor agitation, resulting in psychological and physical restlessness. Thus, before confessing what this post is really about, I want to reassure readers that it is a topic that can be fun, if approached correctly. Initially, I had thought of dividing the topic into multiple parts and publishing them at the rate of one part a day, over more than a week. The parts are still subdivided, but each reader will have to determine her/ his/ its etc. own consumption rate.
One
I am used to dealing with actors, people pretending to be someone else. In the process, these people have helped me develop my own acting talents. Some of the actors I had to deal with had failed their auditions, often called court appearances or trials. One of the consequences of such a failure could be imprisonment at the Norwegian low-security prison where I was assigned as their teacher.
Other actors were youth in the final years of their compulsory education, at senior secondary school. They had to attend school, but some of them were better than others at presenting themselves in a positive light. Not that everyone sought positivity. In a Media and Communication English class, I once asked the pupils to write about something they wanted to accomplish in the future, and why they wanted to do so. The reply that created the most work, not just for myself, but for the student, the school principal, the school psychologist and others, was an essay that detailed how this person wanted to become a mass murderer. Afterwards, he claimed that this was a work of fiction.
I have experienced a lot of acting performances by students. The most problematic actors are those who pretend they understand a topic when they have absolutely no idea about it. The role of the teacher is to channel student activity so that the student finds a route that suits her/ his personality, and is effective at helping the student learn new sets of knowledge and develop new skills. This route-finding skill is the primary talent needed to teach.
Two
This weblog post’s topic is programming, in a specific language. While numbers vary with the situation, perhaps ten percent of actors will delight in learning the programming language they are confronted with. A similar number, give or take, will not master anything. Those remaining in the middle will accept programming languages as a necessary evil in this internet age. Stated another way, a small percentage will find their own route without assistance, another small percentage will never find a route, while most people in the middle will struggle to varying degrees, but ultimately find a route, hopefully one that suits their personality.
The main difficulty in learning to program is that schools begin computer science studies assuming that students will want to learn the particular language being offered. Admittedly, some languages are fairly general, including some designed more for teaching/ learning than for practical applications. Pascal is probably the best example of such a language. However, my contention is that the first computing course a student takes should look at programming principles.
I was fortunate to read Bruce J. MacLennan’s Principles of Programming Languages: Design, Evaluation and Implementation (1983). A second edition was published in 1987, and a third in 1999. There is not much difference between the three editions, and the same languages are discussed in all of them: pseudo-code interpreters, Fortran, Algol-60, Pascal, Ada, Lisp, Smalltalk and Prolog. All editions explain that computer languages can have different purposes, and ask readers to examine the purpose of each programming language. Not everyone should learn the same one. Before deciding to learn programming, people should know what they want to do with a language after they have learned its basics. Much of the time the answer is: learn a more appropriate language.
Three
The Prolog in the title of this post refers to the Prolog programming language. Fifty years ago, in 1972, Prolog was created by Alain Colmerauer (1941 – 2017), a French computer scientist at Aix-Marseille University, based on the work of Robert Kowalski (1941 – ), an American-British computer scientist, at the time at the University of Edinburgh.
Prolog is a logic programming language associated with artificial intelligence and computational linguistics. That doesn’t say much. It might be more understandable to say that students typically learn Prolog by creating a program/ system that shows social relationships between people. Despite their reputation as rather awkward social creatures, even computer scientists are capable of understanding some social markers: mother, father, daughter, son, at a minimum. Thus, even computer scientists can construct a system that will determine and then show the relationship between any two people. The system can be constructed slowly, so that initially only, say, four relationships are allowed. Outside of those four choices, people will be labelled as having no relationship. However, in subsequent iterations, the number of relationships can be expanded, almost indefinitely.
Prolog consists of three main components: 1) a knowledge base = a collection of facts and rules fully describing knowledge in the problem domain; 2) an inference engine, which chooses which facts and rules to apply when attempting to solve a user query; 3) a user interface, which takes in the user’s query in a readable form and passes it to the inference engine. Afterwards, it displays the results to the user.
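To make this concrete, here is a minimal sketch of such a system, with invented people and just four relationships. The facts and rules below form the knowledge base, and a query is what the user interface hands to the inference engine:

```prolog
% Knowledge base: facts about an invented family.
male(tom).
male(per).
female(anna).
parent(tom, anna).              % tom is a parent of anna
parent(anna, per).

% Knowledge base: rules deriving the four basic relationships.
father(X, Y)   :- parent(X, Y), male(X).
mother(X, Y)   :- parent(X, Y), female(X).
son(X, Y)      :- parent(Y, X), male(X).
daughter(X, Y) :- parent(Y, X), female(X).

% A later iteration can expand the system almost indefinitely, e.g.:
grandfather(X, Z) :- parent(X, Y), parent(Y, Z), male(X).
```

Loaded into an interpreter such as SWI-Prolog, the query ?- grandfather(tom, P). answers P = per, while ?- mother(anna, per). answers true.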
Four
Programming in Prolog, written by William F. Clocksin (1955 – ) & Christopher S. Mellish (1954 – ), is the most popular textbook about the language. Originally published in 1981, a revised (read: readable) second edition appeared in 1984. My copy has my name printed on the colophon page in capital letters in blue ink, considerably faded now, along with the date, 12 iii 1985.
It is not the only book about Prolog in my library. Among the thirteen others are: Dennis Merritt, Building Expert Systems in Prolog (1989); Kenneth Bowen, Prolog and Expert Systems (1991); Alain Colmerauer & Philippe Roussel, The Birth of Prolog (1992); Krzysztof R. Apt, From Logic Programming to Prolog (1997); and even an updated 5th edition of Clocksin and Mellish, subtitled Using the ISO Standard (2003). Part of the reason for this large number was my use of Prolog to teach expert systems.
Five
Expert systems are not particularly popular now. In artificial intelligence, popularity contests are being won by machine learning tools. Yet some people don’t need to be at the height of fashion or the cutting edge of technological advances, and can appreciate older approaches.
Edward Feigenbaum (1936 – ) constructed some of the first expert systems. He established the Knowledge Systems Laboratory (KSL) at Stanford University. Long words are often strung together to describe his work; a favourite phrase is “knowledge representation for shareable engineering knowledge bases and systems”. This was often condensed into the phrase expert system. He applied such systems mainly in different fields of science, medicine and engineering. KSL was one of several related organizations at Stanford. Others were: Stanford Medical Informatics (SMI), the Stanford Artificial Intelligence Lab (SAIL), the Stanford Formal Reasoning Group (SFRG), the Stanford Logic Group, and the Stanford Center for Design Research (CDR). KSL ceased to exist in 2007.
The focus of Feigenbaum, and American institutions more generally, was on rules-based systems. Typically, these found their way into shells = computer programs that expose an underlying program’s (including an operating system’s) services to a human user. These shells were produced by for-profit corporations, and would sit on top of Lisp, one of the programming languages commented on in two chapters of MacLennan’s book, and used extensively for artificial intelligence applications. Feigenbaum and his colleagues worked with several of these expert systems, including: ACME = Automated Classification of Medical Entities, which automates underlying cause-of-death coding rules; Dendral = a study of scientific hypothesis formation generally, but resulting in an expert system to help organic chemists identify unknown organic molecules by analyzing their mass spectra, combined with an existing but growing chemical knowledge base; and Mycin = an early backward-chaining expert system that identified infection-related bacteria and recommended specific antibiotic treatments, with dosage proposals adjusted for the patient’s mass/ weight. He also worked with SUMEX = Stanford University Medical Experimental Computer. Feigenbaum was a co-founder of two shell-producing companies: IntelliCorp and Teknowledge. Shells are often used by experts lacking programming skills, but fully capable of constructing if-then rules.
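To show what such if-then rules look like, here is a toy sketch in Prolog, in the backward-chaining spirit of Mycin. The findings, rules and treatment are invented for illustration, and are not from any real medical system:

```prolog
:- dynamic allergy/1.           % no allergies recorded in this toy session

% Invented patient findings.
symptom(fever).
symptom(elevated_white_cell_count).

% IF fever AND elevated white cell count THEN a bacterial infection.
infection(bacterial) :-
    symptom(fever),
    symptom(elevated_white_cell_count).

% IF a bacterial infection AND no recorded penicillin allergy
% THEN recommend penicillin.
recommend(penicillin) :-
    infection(bacterial),
    \+ allergy(penicillin).
```

The query ?- recommend(T). chains backwards through the rules and answers T = penicillin. A real system like Mycin also attached certainty factors to its rules, omitted here.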
Six
Prolog is frequently contrasted with Lisp, and offers a different approach to developing expert systems. Some users are fond of saying that Prolog has a focus on first-order logic. First-order is most appropriately translated as low-level, or even simple. The most important difference between the two languages is that anyone with average intelligence should be able to understand and work with Prolog. Much of the work done with Lisp involves higher orders of logic, often requiring the insights of real logicians with advanced mathematics in their backgrounds. An introductory logic course gives sufficient insight for anyone to work with Prolog.
Prolog is also claimed to be a more European approach. This probably has something to do with the way teaching is organized. In Norway, for example, a (Danish) royal decree from 1675, still valid today, requires all university students to undertake an Examen philosophicum, devised by advisor Peder Griffenfeld (from griffin, the legendary creature, plus field; originally Schumacher = shoemaker; 1635 – 1699). Under the Danish king Christian V (1646 – 1699), he became the king’s foremost adviser and, in reality, the actual ruler of Denmark (and Norway). In 1676 he fell into disfavour and was imprisoned. He was sentenced to death for treason, but the sentence was commuted to life imprisonment. He was a prisoner on Munkholmen, outside Trondheim, and about 55 km directly south-east of Cliff Cottage, for 18 years (1680 – 1698), and was released after 22 years of captivity.
Until the end of the 1980s, this exam involved an obligatory course in logic, including mathematical logic, along with other subjects. This means that almost every university student (at that time), no matter what they studied, had the necessary prerequisites to work with Prolog.
Seven
Expert systems often involve heuristics, an approach to problem solving using methods that are not expected to be optimal, perfect, or rational, but good enough/ satisfactory for reaching an approximate, immediate or short-term goal. George Pólya (1887 – 1985), who worked at Stanford from 1940 to 1953, and beyond, took up this subject in How to Solve It (1945). He advised: 1) draw a picture, if one has difficulty understanding a problem; 2) work backwards, if one can’t find a solution, assuming there is one, and see what can be derived from it; 3) develop a concrete example from an abstract problem; 4) solve a more general problem first; this involves the inventor’s paradox, where a more ambitious plan may have a greater chance of success.
One list of areas where expert systems can be used involves system control, in particular: 1) interpretation, making high-level conclusions/ descriptions based on raw data content; 2) prediction, proposing probable future consequences of given situations; 3) diagnosis, determining the origins/ consequences of events, especially in complex situations based on observable data/ symptoms; 4) design, configuring components to meet/ enhance performance goals, while satisfying design constraints; 5) planning, sequencing actions to achieve a set of goals with start and run-time constraints; 6) monitoring, comparing observed with expected behaviour, and issuing warnings if excessive variations occur; 7) repair, prescribing and implementing remedies when expected values are exceeded.
Sometimes one comes across Prolog tutorials that begin with subjective knowledge/ considerations. Music is a good example. Unfortunately, it is not always easy to remember if one has labelled something as thrash metal or punk, and this may have operational consequences. It is much easier to confirm that person X is person Y’s granddaughter, and that person Y is person X’s grandfather, especially if persons X and Y are members of your own family.
It is always hard to know which Prolog expert system implementation will impress readers most. Here are some choices: Bitcoinolog = configures bitcoin mining rigs for an optimal return on investment; CEED = Cloud-assisted Electronic Eye Doctor, for screening for glaucoma (2019); Sudoku = solves sudoku problems; an unnamed system constructed by Singla to diagnose 32 different types of lung disease (2013), and another for diabetes (2019); an unnamed system by Iqbal, Maniak, Doctor and Karyotis for automated fault detection and isolation in industrial processes (2019); and an unnamed system by Eteng and Udese to diagnose candidiasis (2020). These are just some of hundreds, if not thousands, many of them open source.
Eight
One of the challenges/ problems with expert systems is that the scope of their domain can be unknown. In other words, when a person starts using an implemented expert system, it can be unknown just how big or small the range of problems is that can be solved successfully with it. There can also be challenges with system feedback. What looks like an answer may be a default, because the system has insufficient insights (read: rules) to process the information. Expert systems do not rely on common sense, only on rules and logic. Systems are not always up to date, and do not learn from experience. This means that real living experts are needed to initiate and maintain systems. Frequently, an old system is an out-of-date system, one that may do more harm than good.
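The default-answer problem is easy to demonstrate: Prolog operates under a closed-world assumption, where anything not provable from the knowledge base is reported as false. Continuing the invented family example from earlier:

```prolog
?- father(X, bob).
false.
```

Here, false does not mean that bob has no father; it only means the system contains no facts about bob. A user who mistakes missing knowledge for a negative answer runs into exactly the feedback problem described above.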
This raises a question of responsibility/ liability in case the advice provided by a system is wrong. Consider the following candidates: the user, the domain expert, the knowledge engineer, the programmer of the expert system or its shell, or the company selling the software or providing it as an open-source product.
Infinity
Just before publication, I learned of the death of crime novelist Susie Steiner (1971 – 2022). I decided to mention her in this weblog post, when I read in her obituary that she had spotted a Keep Calm poster on the kitchen wall at a writing retreat in Devon. She was cheered by its message of stoicism and patience.
Speaking of kitchens, at one point my intention was to use Prolog to develop a nutritional expert system, one that would ensure a balanced diet over a week-long time frame, along with a varied menu for three meals a day. I still think that this would be a useful system. Unfortunately, I do not think that I am the right person to implement it, lacking both the stoicism and the patience to complete the undertaking.
Reflecting on Susie, I am certain that a Prolog system could be made to help writers construct their novels, especially crime fiction. A knowledge base could keep track of the facts, as well as the red herrings and other fish introduced to confuse readers and prevent them from solving the crime. Conversely, a Prolog system could also be built to help readers deconstruct these works, solve the crime and find textual inconsistencies.
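As a purely hypothetical sketch, with every predicate and name invented for illustration, such a plot-tracking knowledge base might begin like this:

```prolog
% Invented plot facts for an imaginary crime novel.
suspect(col_mustard).
suspect(miss_scarlett).
alibi(miss_scarlett, library).     % a confirmed alibi
clue(candlestick, conservatory).
red_herring(candlestick).          % misdirection, flagged for the author's records

% A suspect remains viable while no alibi has been recorded.
viable(S) :- suspect(S), \+ alibi(S, _).

% A clue is genuine if it has not been flagged as a red herring.
genuine(C) :- clue(C, _), \+ red_herring(C).
```

The author could query ?- viable(S). to check who can still plausibly be the culprit, and ?- genuine(C). to verify that enough honest clues remain for an attentive reader to solve the crime.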
Confessions
Readers should be delighted to hear that while writing this post I used my original Clocksin and Mellish book on a daily basis! Yes, it held my laptop open at an angle of about 145°, about 10° further open than without it. When writing on other topics, I also use other books for the same purpose. Note to self: ensure that your next laptop opens at least 180 degrees!
The writer should be dismayed about the length of this post. Patricia reminds me, repeatedly, that shorter is better. She felt last week’s post on Transition One was a more appropriate length. Transition One was written in the course of an hour, with a couple of additional proof-reading sessions. Writing Prolog took more than a year, with multiple writing sessions, each adding several paragraphs.
Forth is not a mainstream programming language. Whenever it is compared to something, the most operative word is different. It is almost like an assembly language: how a machine would interpret code if it used English, rather than 0s and 1s, to calculate and communicate. Some refer to Forth as a virtual machine, which is software pretending to be a physical machine. In part, this is because it is not just a programming language, but also an operating system. Despite this, Forth is simple. It can run in a few kilobytes (kB) of memory. When coded appropriately, it seems to be its own independent language, but with a lot of English-like words.
Forth was invented by Charles (Chuck) Havice Moore II (1938 – ) in 1970. It was operationalized by Elizabeth (Bess) Rather (1940 – ), who – with Moore – started Forth, Inc. in 1973. Rather refined and ported Forth to numerous platforms throughout the 1970s. She also chaired the ANSI Technical Committee that produced the ANSI Standard for Forth (1994).
Forth was made specifically for the real-time control of telescopes at the United States National Radio Astronomy Observatory and, later, at Kitt Peak National Observatory. A real-time response is one that guarantees that something will happen within a specified time period. In other words, it sets a deadline for something to happen, usually one that is relatively short. Thus, a real-time process is one that happens within defined time steps of some maximum duration.
Forth is the antithesis of Ada. Wikipedia defines Ada as “a structured, statically typed, imperative, and object-oriented high-level programming language, extended from Pascal and other languages.” In its purest form, Forth is none of these, with the exception of being imperative. Most computer languages are imperative: they use statements/ commands to change a program’s state. Ada originated in the 1970s because of US Department of Defense (DoD) concerns about the high number of programming languages being used in embedded computer systems. They wanted one language to do everything. Unfortunately, with Ada they created a monster: far too large and complex, slow to compile and difficult to run. A compiler (used with a compiled language) requires code to be translated into machine language before it can be run. In contrast, an interpreter (used with an interpreted language) directly executes instructions, without requiring them to have been translated into machine language in advance.
Allegedly, a Forth program can be compiled, but not if it contains words that are only evaluated at runtime: DOES>, EVALUATE and INTERPRET are three such words. If even one word has to be interpreted, the entire Forth dictionary would have to be embedded inside the program. Thus, Forth should always be treated as an interpreted language.
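DOES> is easiest to see in the textbook defining-word example, sketched below. Standard Forths already provide CONSTANT, so redefining it here is purely for illustration:

```forth
\ A defining word: CREATE builds a new dictionary entry, , (comma)
\ stores a value in its body, and DOES> supplies the run-time
\ behaviour shared by every word later defined with CONST.
: const ( n "name" -- )  create ,  does> @ ;

42 const answer    \ defines a new word, answer
answer .           \ executes it: pushes 42, then prints it
```

The action after DOES> (fetching the stored value) only runs when answer itself is executed, which is why such words keep a program tied to the interpreter.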
Forth is an appealing language because of its one and only guiding principle: Keep it simple! Part of this simplicity involves how the language is used. Leo Brodie, a third main contributor to the language, explains in Starting Forth, E2 (1987):

The interpreter reads a line of input from the user input device, which is then parsed for a word using spaces as a delimiter. When the interpreter finds a word, it looks it up in the dictionary. If the word is found, the interpreter executes the code associated with the word, and then returns to parse the rest of the input stream. If the word isn’t found, the word is assumed to be a number and an attempt is made to convert it into a number and push it on the stack; if successful, the interpreter continues parsing the input stream. Otherwise, if both the lookup and the number conversion fail, the interpreter prints the word followed by an error message indicating that the word is not recognised, flushes the input stream, and waits for new user input. (p. 14)

While the use of unusual words may make the above description seem complex, this is a much simpler approach than that used in most other computer languages. A graphic version is shown in the flowchart above. Parse is a word used extensively by people who construct compilers. It refers to the process of dividing a sentence (or, in computing, a statement) into words/ grammatical parts, and identifying the parts and their relationships to each other.
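The three outcomes of that loop can be seen in a short, hypothetical session; the exact error wording varies from one Forth system to another:

```forth
2 3 + .       \ + and . are found in the dictionary and executed: prints 5
17            \ not in the dictionary, but converts to a number: pushed on the stack
frobnicate    \ neither a word nor a number: an error is reported, input is flushed
```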
One major problem with Forth is that its dictionary, more often referred to as a library in other languages, is not uniform. Some implementations come with an adequate dictionary, others less so. Some use words the same way, others give the same word different meanings. This means that Forth implementations can produce very different results, depending on dictionary content. This weakness is probably the main reason why Forth is not treated seriously, and has not been extensively used.
Forth is a stack machine: a computer where the primary interaction is moving short-lived temporary values to and from a storage location that follows the rule last in, first out. A stack significantly reduces the complexity of a processor. Tasks are expressed as words. Simple tasks usually involve single words. More complex tasks connect together many smaller words, each of which accomplishes a distinct sub-task. Thus, a large Forth program is almost like a sentence, involving a hierarchy of words: distinct modules that communicate using a stack. Data is only added to the top of the stack, and removed from the top of the stack. Each word is built and tested independently. Provided that words are chosen appropriately, a Forth program resembles an English-language description of the program’s purpose.
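As a minimal sketch of this layered style (the word names are my own, not from any standard dictionary):

```forth
\ Each word is defined and tested independently, then composed.
: square ( n -- n*n )  dup * ;
: sum-of-squares ( a b -- a*a+b*b )  square swap square + ;

3 4 sum-of-squares .    \ prints 25
```

Read aloud, the second definition is nearly an English description of its purpose: square one number, swap, square the other, add.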
Forthwright
Forthright is an adjective used to describe a plainspoken/ frank/ blunt person. A person who develops/ modifies/ corrects/ improves/ uses Forth programs is referred to as a forthwright, a noun. Both words are pronounced the same way. A wright is a person who makes or repairs something. The original ca. 700 AD Old English wryhta referred to someone working with wood. Since then, the term has expanded to include many different occupations. Carpentier, now carpenter, was introduced into England only after the Norman conquest in 1066, effectively replacing wright to describe this role. In Scotland, wright is used much more extensively.
Raspberry Pi Pico
The traditional strength of Forth is its minimalist use of resources. This is more important than it may seem. Gordon Earle Moore (1929 – ) formulated an expectation in 1965, later termed Moore’s law by others, that computing capacity would double every year; some say every 18 months. This doubling cannot continue indefinitely. Many, including Moore, expect it to be invalid from about 2025, giving it a life span of 60 years. Even so, this means that even the most primitive of microprocessors made today has many orders of magnitude more capacity than anything made in, say, 1970. This is why many people can afford to use computer languages that are less than optimal.
In contrast to Moore’s law, Niklaus Emil Wirth (1934 – ) formulated a very different expectation in his 1995 article A Plea for Lean Software, later termed Wirth’s law by others: software is getting slower more rapidly than hardware is getting faster. Most computer scientists no longer make software that optimizes/ minimizes resource use, because they know that ample resources are available.
The few people who continue to use Forth do so because of their acute awareness of Wirth’s law; they see the negative impact of software bloat on a regular basis.
General Public Licence (GPL) and public-domain Forths exist for most modern operating systems, including Windows, Linux, MacOS, Android and some virtual machines. Such implementations include gForth and bigForth. Dale Schumacher forked the Raspberry Pi/ ARM port of JonesForth around 2014, and removed its dependency on Linux. It now runs bare-metal on a Raspberry Pi, booting directly into the Forth interpreter. Many important words have been re-implemented in assembly, or as part of the built-in definitions. Note: in computing, bare or bare metal refers to a computer executing instructions directly on a processor, aka logic hardware, without an intervening operating system.
Iteration #2 of Unit One (#2U1), my personal workshop, will officially commence on 2023-11-01, less than two years away. It will transform a construction-support workshop into a fabrication shop, as my career as a wright/ building constructor/ carpenter comes to an end, and my career as a millwright/ machinist begins. My primary emphasis is broad: mechatronics. The workshop’s role, though, is limited to fabrication. Electronics and programming will probably be done inside Cliff Cottage, while much of the thinking will take place wandering about in the woods.
The purpose of the workshop is for an old man to have fun, to build upon skills learned in the past, and to learn new 21st century skills, to keep his brain and body active. Hopefully, some useful and environmentally sensitive products will be made at it.
There are plans to use Forth as the official shop language/ operating system for computer numerical control (CNC) = the automated use of machine tools controlled by a computer. I expect to have one primary machine that can move in three dimensions and change heads as required. The two most important heads will be a router, which can shape materials as well as drill holes, and a laser cutter, which cuts more accurately and with less waste than a saw. I expect to concentrate on various types of hardwood as my primary material, but not to the exclusion of other materials. These are subtractive processes, which remove material. In contrast, 3D printers are additive.
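To hint at what such a shop language could look like, here is a hypothetical sketch; every word below is invented, and the two primitives are stubs that a real port would implement against the machine’s hardware:

```forth
\ Hypothetical stub primitives; a real port would pulse GPIO pins here.
: step-x ( -- ) ;            \ one motor step along the x axis (stub)
: spindle-on ( -- ) ;        \ start the spindle (stub)

\ Composed motion words, in the layered style described earlier.
: mm>steps ( mm -- steps )  80 * ;      \ assumes 80 motor steps per millimetre
: move-x ( mm -- )  mm>steps 0 ?do step-x loop ;

25 move-x    \ would step the router head 25 mm along x
```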
There is no need to waste money on expensive silicon if cheap silicon will do. The silicon needed to control a CNC mill will be a Raspberry Pi Pico microcontroller. It costs NOK 55 = US$ 6.05 = CA$ 7.75, on the day before publication. Any money saved on silicon will be put into better bearings, and improved versions of other machine components.
Forth is not for everyone. It is useful where there is a need for a real-time system involving mechanical movements. After milling machines and other types of tools, robots come to mind first, including unmanned underwater vehicles and drones. In all fairness, it should be mentioned that Forth is not the only language I intend to use in the future. Two others are Prolog and Lua. Prolog is a logic programming language developed in France in 1972, with a number of artificial intelligence applications. Lua is a multi-paradigm scripting language, developed in Brazil in 1993, with a basic set of features that can be extended to suit different problems.