A. L. Lloyd (1908 – 1982)

Bert, Albert Lancaster Lloyd, was an English folk singer and collector of folk songs. He appears in this weblog post as a key figure in the folk music revival of the 1950s and 1960s, especially with respect to protest songs. This weblog post is being published on the fortieth anniversary of his death, 2022-09-29.

At one level, The Penguin Book of English Folk Songs (1959) is Bert’s claim to fame. Yet the title page reads: edited by R(alph) Vaughan Williams (1872 – 1958) with the assistance of (Bert) A. L. Lloyd. The two editors are not equals, especially in England, the arch class society, as lived in the 1950s. On Goodreads, the book has attracted two reviewers. Paul Bryant gives it five stars, but seems to think listing song titles constitutes a review. V gives it three stars, then writes, “Now all I need to do is find some friends to drink with[,] then sing these [in a] lout-ish manner. In all seriousness though, it’s well laid out with a large notes section.” V, at least, understands that songs have emotional content.

I have not prioritized purchasing this book, but own its replacement, The New Penguin Book of English Folk Songs (2012) by Steve Roud (1949 – ) and Julia Bishop (? – ). On Goodreads, Paul Bryant also gives this book five stars, then comments: “Folk music is a bitterly fought over territory. The original peasants who knew this folk stuff were located and taken into protective custody by some Victorian clerics and given the third degree until they sang like canaries. The clerics then deleted anything that looked like smut and published some of it for thrusting young schoolboys and schoolgirls to sing before they took jobs with the Foreign and Colonial Service and died of dysentery and malaria. Meanwhile some communists began to complain that all this folk was reactionary wimping about love and saucy milkmaids and lords and ladies with cranberries growing out of their brains, and pointed out that factory workers and miners had also made up folk about blacklegs getting their guts spilt. Then other communists, ones with typewriters, said that vicarfolk was all made up by Percy Grainger and Sabine Baring-Gould who used to beat each other’s bare flesh with glove puppets. Folk song? Au contraire, I think you mean fake song! they said. Then some people claiming to be Bob Dylan said that folk was still happening and they proved it too. Then some marketing departments decided it was a good label to stick on anyone who wasn’t actively taking heroin, because a label of folk guaranteed that an album would sell at least 23 copies. So people who once leaned against a pile of unsold Joan Baez albums while they were not waiting for their man were now called folk. It didn’t matter that they’d all written their own songs and played them on saxophones, it was folk if the marketing department said so.”

Kitchen scales do not decisively measure a quality difference between two books. Yet, quantitative measures impact one’s appreciation. The original book has 128 pages, while its replacement has 608. In the General Introduction of the replacement, Roud writes on the earlier work: “Vaughan Williams was one of the last survivors of the great days of the Edwardian folk-song collectors and the grand old man of the English musical establishment, although he died just before the project was completed. By contrast, Lloyd was a journalist and freelance writer who was one of the most vocal of the new Folk Revival activists, criticizing, questioning, and politically committed to spreading the word of folk song to the people.” (ix)

Dave Arthur’s (c1940 – ) book, Bert, the life and times of A. L. Lloyd (2012), provides insights into Lloyd’s life, in contrast to the Wikipedia article, which simply provides a summary. While the Wikipedia article about Bert can be read in a few minutes, it is Dave’s book that provides a better understanding of his life and works.

Most of Bert’s family died early. In 1924, at the age of 16, after the death of his mother, and shortly before the death of his father, Bert was on his way to Australia on assisted passage, to find a new life as a farm labourer. Dave comments, “ … autodidacts like Bert are frequently such interesting people. Not bound by any ‘official’ canon or university reading list they are free to follow their noses and inclinations through half a millennium of printed books, and in the process explore bookish tributaries and bibliographic cul-de-sacs often missed by those more orthodox literary travellers who stick to the main routes.” (17) While accepting his fate as a farm labourer, he began to appreciate and collect Australian folk-songs. He also invested in HMV Plum Label (read: cheap) records, bought on speculation, to develop his musical understanding. He was not the typical sheep station hand. Immediately after the start of the Great Depression, in 1930, at the age of 21, Bert was back in England.

One of the insights Dave provides is that Bert was not always concerned with strict truth. For example, at times Bert described his father as a trawlerman = a type of deepwater, commercial fisherman. Dave mentions this, while at the same time presenting evidence contesting it. Reading about it, I could not help thinking about a recent American president, an obsessed storyteller.

Libraries often devote large areas for the storage and presentation of literature, where truth is of secondary or lower importance. These collections are not referred to as lies, but as fiction. Individual books are often judged on their emotional appeal, not on how little they deviate from the truth. The same can be said about folk songs, and music more generally. People listen to music because of its emotional appeal, not its strict adherence to truth.

Sometimes, emotional content can be received negatively. In my lifetime, the song with the greatest negative impact is Vinger av stål = Wings of Steel, used by Braathens SAFE = South American and Far East Airlines, to celebrate their 50th anniversary in 1996. It is so bad that I cannot even find it on YouTube. I could relate to a song mentioning aluminum, titanium, carbon fibre, or even a more generic light-alloy wing. It is perhaps fitting that Braathens was out of business by the new millennium. For decades they had argued for increased competition in the European airline market. When that opportunity came, they found that they did not have the characteristics needed to succeed.

Bert was often described as a collector of folk songs. Yet, one must ask, what is a collector? Is it simply a synonym for historian? I think not. A collector has an emotional connection with the material being collected. The collection process often changes the content. Yet, the collector themselves may not appreciate how this works in practice.

Stephen Sedley (1939 – ) comments: “I think that both Bert [Lloyd] and Ewan [MacColl/ James Henry Miller (1915 – 1989), another folk singer and folk song collector,] were unnecessarily embarrassed about admitting that they were adding or improving when, of course, the whole folk process had always been a process of adding and improving.” (28)

Bert was soon a member of the Communist Party of Great Britain. He developed a friendship with Arthur Leslie Morton (1903 – 1978), author of A People’s History of England (1938). My 1976 edition takes time to explain that the book originally stopped with the Spanish Civil War, while mine ends with the conclusion of World War II. In the previous century, politics was an important aspect of the folk music being collected.

One of my interests in life is a person’s relationship with technology. In his introduction, Dave describes some characteristics of Bert in terms of ca. 1972 audio equipment, a Swiss Revox reel-to-reel tape deck. At the time, this was the apex of amateur sound recording. The technological world fifty years ago was vastly different. Today, every hand-held device, aka smartphone, has audio capabilities that only the finest (read: expensive) sound studios could offer in 1972.

Attitudes to audio recording collections changed significantly from the mid-1950s to about 2000. In the previous century, people often took pride in displaying their recorded music collections. These collections allowed visitors to see if there was any overlapping interest. If there was, it could become a topic of conversation.

In that previous century, many musical choices involved selecting an appropriate format. The 78 rpm record was launched in 1928; the 33 rpm 12″ LP in 1948; the 45 rpm 7″ disk in 1949; the compact cassette in 1963; and the 8-track cartridge in 1964. Each of these increased the availability of popular music. Regardless of format, music consumption was limited by personal hardware and software investments. Even music from radio stations, which became widespread in the 1920s and was paid for by listening to advertisements, required investment in an AM radio. Wide-band FM radio offered high-fidelity broadcasting in 1933, but only became widespread after the introduction of FM stereo broadcasting in 1961. These formats continued to dominate music until the arrival of the 120 mm digital audio compact disk (CD) in 1982, forty years ago. In 1991 came MP3, a coding format for digital audio developed largely by the Fraunhofer Society, an applied research institution in Germany. This format was not tied to a physical device.

Neil Strauss explains that in 1995, material costs were 30 cents for the jewel case and 10 to 15 cents for the CD. The wholesale cost of CDs was $0.75 to $1.15, while the typical retail price of a prerecorded music CD was about $17. On average, a store received 35 % of the retail price, the record company 27 %, the musician 16 %, the manufacturer 13 %, and the distributor 9 %. Because of a perceived value increase, when 8-track cartridges, compact cassettes and CDs were introduced, each was marketed at a higher price than the format it succeeded, even though the cost to produce the media was reduced. Apple marketed MP3s for $0.99, and albums for $9.99, because the incremental cost to produce an MP3 is negligible.
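
As a back-of-the-envelope illustration of that arithmetic, here is a minimal Python sketch splitting the quoted US$ 17 retail price by the 1995 percentages above; the dollar amounts are simply products of those figures, not audited accounting.

```python
# Split a US$ 17 retail CD price using the 1995 percentages quoted above.
retail_price = 17.00
shares = {
    "store": 0.35,
    "record company": 0.27,
    "musician": 0.16,
    "manufacturer": 0.13,
    "distributor": 0.09,
}

for who, share in shares.items():
    print(f"{who:15s} {share:4.0%}  ${retail_price * share:5.2f}")
print(f"{'total':15s} {sum(shares.values()):4.0%}  ${retail_price:5.2f}")
```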

In terms of digital downloads, in 1997-09 Electric Barbarella, by Duran Duran of Birmingham, England, became the first digital single sold as a paid download, following Head First, by Aerosmith of Boston, Massachusetts, which had earlier been released as a free download.

Musical preferences are often based on two principles: repetition and containment. Repetition allows songs/ tracks to grow on people over time. Radio playlists reinforced repetition. Eager listeners would be exposed to multiple plays a day, for at least a week. This offered an opportunity for new songs to ingratiate themselves.

Containment allows songs/ tracks to be closely associated with specific, other songs/ tracks. From my childhood onward, music has typically been contained/ bundled in an album format.

What is the 2022 replacement for a music collection? One answer could be Spotify, the proprietary Swedish audio streaming service founded 2006-04-23 by Daniel Ek (1983 – ) and Martin Lorentzon (1969 – ). Some users describe streaming services as a faucet/ tap/ spigot. One turns it on, and music comes out.

One challenge with streaming is that it eliminates, or at least reduces, repetition. There is so much choice available that music is dismissed without being given a second chance. There is little opportunity to hear a track repeated.

Streaming replaces the album bundle/ container with the track, more specifically, one type of track labelled liked songs. There is no need for any form of continuity between any two liked songs. This makes the development of any emotional attachment between tracks difficult, if not nebulous.

Unfortunately, streaming can be an uncomfortable experience. Many have commented that accessing an infinite amount of music feels inappropriate. There seem to be two challenges, possibly related to our biological nature. First, there should be quantitative limitations placed on any collection. Yes, there should be some finite, maximum size. Second, content should be appropriately bundled.

Bert produced a large number of albums that unified their content. For example, The Iron Muse (A Panorama of Industrial Folk Song) is the title of two Topic Records albums. The first was an LP released in 1963, followed by a CD released in 1993. Topic Records is a British folk music label that began as an offshoot of the Workers’ Music Association in 1939, making it the oldest independent record label in the world.

Some people may appreciate this streamed approach to music, but dislike how it is manifest on Spotify, or similar services. For those who appreciate their music direct from the spigot, but dislike the streaming provider, Navidrome could be a solution. It is a simple, open-source, self-hosted streaming service: the music library lives on a home server, and a phone can reach it from any location, across various devices, optionally through a virtual private network (VPN). Physically, the server is a box that plugs into a router. It holds everything from Bandcamp purchases to ripped CD tracks. Users of Navidrome often comment about moving away from Big Streaming to a broader movement that incorporates small-scale tech projects and open-source services that are not resource- or energy-intensive.

In the 1950s and 1960s, when I first became a consumer of music, there was only a limited selection available. That selection was limited by recording studios, to ensure their investments produced an adequate economic return. Then, radio stations limited their play lists. After all, the purpose of a radio station is not to provide listeners with free music, but to sell advertisements. Those who were actually interested in music, typically invested in vinyl records.

Approaches to finding new music in the 2020s typically involve the internet. The online music store Bandcamp provides revenue to many musicians, taking a smaller cut of sales compared with streaming services. The Bandcamp Daily blog is a useful tool for finding interesting music.

Sometimes, other living human beings can provide new musical insights. Even if musical tastes differ, people can provide musical insights unavailable from an algorithm. While it might not be the best approach to start a conversation with a complete stranger, one can continue a conversation by asking a person to disclose their most interesting musical experience this week, possibly after divulging one’s own experience.

Local record stores don’t seem to exist here in Norway any more. Some white goods stores seem to have a small selection. Fortunately, there are a few second-hand stores that offer a selection of used records. In Inderøy, we have Kjæringa me’ Straumen, which has many musical gems in different formats, including CDs and LPs.

Other sources for music can be radio stations, local and physical or distant and online. They range from the ultra professional to the amateur. University/ college/ school stations may appeal to younger people, but there are also stations run by retired persons that target music of specific genres and eras.

DIY! Hardware suitable for the recording and editing of vocal music can be acquired for less than NOK 1 000/ US$ 100. Almost everything can be made using open source software and a few hardware components, such as a basic microphone (suggestion: Behringer XM8500), on an old but quiet laptop. This is goodenuf quality! In many cases, especially in terms of folk music, there should be no copyright issues, especially if distribution is limited to family and friends.

Notes

A minor character in this weblog post is Stephen Sedley. He lists carpentry, music and changing the world as his recreations. Novelist and screenwriter Ian McEwan (1948 – ) is more enthusiastic about Sedley’s literary merits, commenting on Ashes and Sparks: Essays on Law and Justice (2011): “you could have no interest in the law and read his book for pure intellectual delight, for the exquisite, finely balanced prose, the prickly humor, the knack of artful quotation and an astonishing historical grasp.” Since about 2003, Sedley has been known, especially, for his Laws of Documents, to be read as humour:

  1. Documents may be assembled in any order, provided it is not chronological, numerical or alphabetical.
  2. Documents shall in no circumstances be paginated continuously.
  3. No two copies of any bundle shall have the same pagination.
  4. Every document shall carry at least 3 numbers in different places.
  5. Any important documents shall be omitted.
  6. At least 10 per cent of the documents shall appear more than once in the bundle.
  7. As many photocopies as practicable shall be illegible, truncated or cropped.
  8. Significant passages shall be marked with a highlighter which goes black when photocopied.
  9. (a) At least 80 per cent of the documents shall be irrelevant. (b) Counsel shall refer in Court to no more than 5 per cent of the documents, but these may include as many irrelevant ones as counsel or solicitor deems appropriate.
  10. Only one side of any double-sided document shall be reproduced.
  11. Transcriptions of manuscript documents and translations of foreign documents shall bear as little relation as reasonably practicable to the original.
  12. Documents shall be held together, in the absolute discretion of the solicitor assembling them, by: a steel pin sharp enough to injure the reader; a staple too short to penetrate the full thickness of the bundle; tape binding so stitched that the bundle cannot be fully opened; or a ring or arch-binder, so damaged that the arcs do not meet.

In Norway, the government switched off analog radio signals in 2017. It was the first country in the world to end FM consumer radio transmissions. Millions of old radios were made obsolete, while consumers were encouraged to purchase new DAB+ radios. Here at Cliff Cottage, we have a DAB+ radio, bought specifically for use during emergencies. If we remember, it is tested annually, then put away until the next year.

The English translation of the name Kjæringa me’ Straumen would be something like the woman with the tide. It comes from Kjerringa mot strømmen, which expresses the opposite, the woman against the tide. Originally, this was the title of a Norwegian folk tale published by Asbjørnsen and Moe in 1871. Kjæringa is the Trøndersk dialect pronunciation and spelling of a term variously translated as woman, wife or (especially in other dialects) hag. In Trøndelag, the term is viewed positively. Straumen is the name of our village, but means the same thing as strømmen = tidal current.

Classification

If I had been born in the 21st century, I am certain that I would have avoided purchasing most of the 4 000 physical/ paper books that are found in our library. Most, but not all, because I appreciate many books precisely because of their images. While there are technical problems using an e-book reader to view high-definition images, these are ideal tools for reading novels and more general works.

Our physical books are organized using a decimal classification system, first developed by Francis Bacon (1561 – 1626), but expanded upon by Melvil Dewey (1851 – 1931). Some aspects of this topic have been discussed in an earlier weblog post. The issues discussed there will not be repeated, but augmented.

Our starting point for a classification system is Dewey Decimal Classification (DDC). DDC was first published in 1876. The latest printed version, 23, was published in 2011. The online version, WebDewey, is continuously updated. Unfortunately, the DDC system is problematic, much like the personality of Melvil Dewey. This system was originally positively received, and initially almost universally used, especially if the universe is restricted to American public libraries. Its focus is on a masculine, Christian, European-American homophobic world. We have introduced modifications.

First, we use a revised schedule for DDC 200 (Religion), developed by Paul Gerard, which gives a more equal weight to all religions, and provides adequate space for a full treatment of the Bahá’í Faith. This is referred to as the Phoenix schedule, and has been implemented many places, including Cliff Cottage.

Second, our geographical world view is non-standard, with a focus on at least three different geographical areas: Greater Vancouver in Canada, Trøndelag in Norway, and the Bay Area in California. Thus I have developed my own classification system, Geoscheme. The current version, E2, dates from 2016-05-07, and can be accessed below.

Geoscheme E2

While Dewey’s promotion of the metric system can be applauded, other areas of focus were less positively received and less successful, such as his promotion of a spelling reform, resulting in a permanent first name change from Melville, and a temporary last name change to Dui.

At the 2019 American Library Association annual conference, council document #50 presented a resolution on renaming the Melvil Dewey Medal, to remove Melvil Dewey’s association with the award. It was passed unanimously. Among the reasons cited are: “ … that he did not permit Jewish people, African Americans, or other minorities admittance to the resort owned by Dewey and his wife; … he was censured by the New York State Board of Regents for his refusal to admit Jews to his resort, whereupon he resigned as New York State Librarian; … Dewey made numerous inappropriate physical advances toward women he worked with and wielded professional power over; … during the 1906 ALA [= American Library Association] conference there was a movement to censure Dewey after four women came forward to accuse him of sexual impropriety, and he was ostracized from the organization for decades”.

Other Document Classification Systems

Perhaps the main challenge with library classification systems is their arrangement as hierarchical tree structures. As time progresses, the world of Melvil Dewey becomes less relevant. New categories become increasingly needed as old ones fade into the background. Increasingly, there is co-operation across fields, so that books (and other objects) need to display multiple classifications.

In Europe, the Universal Decimal Classification system dominates public libraries. It was developed by Paul Otlet (1868 – 1944) and Henri La Fontaine (1854 – 1943). They initially created the Répertoire Bibliographique Universel (RBU), starting in 1895. They then wrote to Dewey and received permission to translate his DDC into French. However, instead of translating, they made some radical innovations, such as adapting its enumerative classification approach in which all the subjects are listed and coded, into one that allows synthesis, essentially, the use of compound numbers to represent interrelated subjects. In addition, potential relations between subjects were identified, and symbols assigned to represent them. The result of this work, Manuel du Répertoire bibliographique universel, appeared in 1905. An outline of the UDC is available here.

So far, the important work of Charles Ammi Cutter (1837 – 1903) has been ignored in these weblog posts. Yet his Cutter Expansive Classification system is important. It uses seven separate schedules, each designed for libraries of different sizes. The first schedule is the most basic. After this, each schedule expands from the previous one. Cutter provided instructions on how a library might expand from one schedule to the next, as it grows.

The Library of Congress Classification (LCC) was developed by Norwegian-born librarian J. C. M. Hanson (1864 – 1943) from Cutter’s system, starting in 1897. It replaced the fixed location system developed by Thomas Jefferson (1743 – 1826). The major flaw with LCC is its absence of a sound theoretical basis. Classification decisions were driven by practical needs, rather than epistemology: it is focused on books found in one library’s collection, and does not attempt to classify human knowledge of the world.

Digital Documents

Our digital documents, including text, image and audio files, are stored on a server, where several copies are kept in case of disk failure, along with other copies on external hard drives. These documents do not have the same need for a classification system, because they can be searched for in different ways. It takes only a few seconds to transfer them to other devices, such as laptops, stationary machines or e-book readers.

The Five Laws of Library Science

This weblog post is being published on the fiftieth anniversary of the death of Shiyali Ramamrita Ranganathan (1892-08-12 – 1972-09-27). He was an Indian librarian and mathematician who developed the Five Laws of Library Science (1931) and the Colon Classification System (1933).

Shiyali Ramamrita Ranganathan

The five laws are:

  1. Books are for use: This focuses attention on access-related issues, such as library location, loan policies, hours and days of operation, the quality of staffing and even more mundane matters, such as library furniture and temperature control.
  2. Every reader his or her book: Because libraries serve a wide variety of patrons, they have to acquire literature to fit a wide variety of needs. Everyone is different and each person has different tastes regarding the books they choose.
  3. Every book its reader: All books have a place in a library, even if only a small demographic chooses to read them.
  4. Save the time of the reader: All patrons should be able to easily locate the materials they desire quickly and efficiently.
  5. A library is a growing organism: a library is a dynamic institution. Books, methods and the physical space need to be updated over time.

There have been numerous updates and modifications of these laws over time. In 2004, librarian Alireza Noruzi emphasized the web. In 2008, librarian Carol Simpson referred to media, more generally. In 2015, B. Shadrach referred to knowledge. In 2016, Achala Munigal focused on social media.

Colon Classification System

A faceted classification uses semantic categories, either general or subject-specific, that are combined to create the full classification entry.

In the Colon Classification system, originally presented in 1933, a book is assigned a set of values from each independent facet, using punctuation marks (most notably colons) and symbols between the facets to connect them.

The system is organized into 42 classes. In the 6th edition (2006), some examples are: Class D = Engineering, J = Agriculture, N = Fine Arts, U = Geography and X = Economics. Each class has its own specific characteristics, or facets. There are five fundamental categories that can be used to express the facets of a subject (a small faceting sketch follows the list):

  • Personality is the most specific or focal subject.
  • Matter is the substance, properties or materials of the subject.
  • Energy includes the processes, operations and activities.
  • Space relates to the geographic location of the subject.
  • Time refers to the dates or seasons of the subject.
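
A minimal Python sketch of how a faceted (PMEST) call number could be assembled in the spirit of Colon Classification. The facet indicator symbols follow the commonly cited CC convention (comma, semicolon, colon, dot, apostrophe); the main class letter matches the Agriculture example above, but the facet codes themselves are invented placeholders, not real CC numbers.

```python
# Assemble a faceted call number from PMEST facets, in the spirit of
# Colon Classification. Facet codes below are made-up placeholders.

FACET_INDICATORS = {
    "personality": ",",
    "matter": ";",
    "energy": ":",
    "space": ".",
    "time": "'",
}

def colon_number(main_class: str, **facets: str) -> str:
    """Join a main class with whichever PMEST facets are supplied."""
    number = main_class
    for facet in ("personality", "matter", "energy", "space", "time"):
        if facet in facets:
            number += FACET_INDICATORS[facet] + facets[facet]
    return number

# Hypothetical example: a work in Agriculture (class J), with placeholder
# codes for crop (personality), soil (matter), harvesting (energy),
# a place code (space) and a decade code (time).
print(colon_number("J", personality="381", matter="4", energy="7",
                   space="44", time="N5"))
# -> J,381;4:7.44'N5
```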

As e-reading increases, and works rely more on digital storage than physical storage, it becomes easier to use tags, rather than numbers, to classify books. With tags, one is no longer confined to the singularity of one classification system. Tags can be a mishmash of Dewey, Cutter, LCC, Colon or other features. There is no need to physically locate a book in order to read it. At Cliff Cottage, relatively fewer books are located on physical shelves in our library, while an increased number of books are found on virtual shelves on our server. There is no limit on the number of household residents who can access the same book simultaneously!

A book can be found and loaded onto an e-reader almost instantaneously. The current difficulty with such a system is being in agreement as to which tags are to be used.
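
A minimal sketch of what such a tag catalogue could look like on a home server, assuming a small household collection; the titles are borrowed from this post, and the tags are invented.

```python
# Each book keeps a free mix of tags (Dewey-like numbers, plain words),
# and lookup is by tag intersection rather than a single shelf position.
from typing import Iterable

catalogue: dict[str, set[str]] = {
    "The New Penguin Book of English Folk Songs": {"780", "folk", "england", "songbook"},
    "A People's History of England": {"942", "history", "england"},
    "belonging: a culture of place": {"305", "hooks", "kentucky", "essays"},
}

def find(tags: Iterable[str]) -> list[str]:
    """Return every title carrying all of the requested tags."""
    wanted = set(tags)
    return [title for title, book_tags in catalogue.items() if wanted <= book_tags]

print(find(["england"]))          # both England-tagged titles
print(find(["england", "folk"]))  # only the songbook
```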

Note: This weblog post has been in development for over two years. One text (A), mostly about Ranganathan, was originally written 2020-07-08; a second (B), about library classification systems more generally, on 2020-11-19; a third (C), mainly about classification as it applies to physical inventories, like screws, buttons, flour and yarn, was started on 2021-12-23. These were amalgamated on 2022-02-25 and further modified on 2022-08-31. This text didn’t work properly. On 2022-09-20, it was separated into two documents, essentially (A & B) and (C). The text as it appears here consists of the first two texts.

bell hooks (1952 – 2021)

bell hooks, 2009-11-01 Photo: Cmongirl

bell hooks (no capitals, please) was born Gloria Jean Watkins in Hopkinsville, Kentucky, on 1952-09-25, or 70 years before the publication of this weblog post. Her pen name is taken from her maternal great-grandmother who, according to Heather Williams “was known for her snappy and bold tongue, which [bell hooks/ Gloria Jean Watkins] greatly admired”. Williams further informs us that the name was put in lowercase letters “to distinguish [herself from] her great-grandmother.” It also signified that it was more important to focus on her works, not her personality, expressed as the “substance of books, not who I am.”

Perhaps the most important insight bell hooks brings is that communication and literacy, defined as the ability to read, write and think critically, are necessary for the feminist movement, because without them people may not recognize gender inequalities.

If there is a single work that would help people understand bell hooks, it is Ain’t I a Woman: Black Women and Feminism (1981). The title is not original. It was used by Sojourner Truth (1797 – 1883), as the publication title of an untitled speech given at a women’s convention in 1851 in Akron, Ohio. The fact that there are 130 years between these publications suggests that the status of Black women had not improved noticeably in that time.

Racism and sexism have doubly impacted the lives of Black women, so that they have the lowest status and worst conditions of any group in American society. Southern segregationists promoted a stereotype of Black female promiscuity and immorality. According to hooks, white female reformers were more concerned with white morality than the living conditions of Black Americans.

White society stereotyped white women as pure/ goddess/ virginal, in contrast to the stereotypical Black women depicted as seductive whores. This, in turn, justified the rape of Black women by white men. hooks views Black nationalism as patriarchal and misogynist.

The then-current feminist movement (from the 1970s), is seen as a (largely) white middle and upper class movement, unsympathetic to the needs of poor/ non-white women. This feminism actually reinforces existing patterns of sexism/ racism/ classism.

There are two other organizations with problematic starts, and legacies affecting Black women. The Nation of Islam dates from 1930. The Nation was started by Wallace Fard Muhammad (c. 1877 – c. 1934) in Detroit. After Muhammad’s disappearance, leadership was taken over by Elijah Muhammad (1897 – 1975), who expanded it to include schools/ banks/ restaurants/ stores/ truck and air based transportation systems/ publishing in 46 American cities. It also owned about 80 square kilometers of farmland in 1970. While all of these may be viewed positively, it was also a patriarchal organization that promoted gendered roles, and denied women leadership opportunities.

The Black Panther Party was started in the mid 1960s by Huey P. Newton (1942 – 1989) and Bobby Seale (1936 – ). They initially developed a 10-point manifesto. The Black Panther Party founded over 60 community support programs (renamed survival programs in 1971) including food banks, medical clinics, sickle cell anemia tests, prison busing for families of inmates, legal advice seminars, clothing banks, housing cooperatives, and their own ambulance service. The most famous of these programs was the Free Breakfast for Children program, which fed thousands of impoverished children daily during the early 1970s. Newton also co-founded the Black Panther newspaper service, which became one of America’s most widely distributed African-American newspapers.

To begin with, not all was well with the Black Panther Party. It too advocated violence, black masculinity and traditional gender roles. Thus, it was not a vehicle for improving the status of Black women. It was patriarchal and misogynist. However, things started to improve, especially from 1968, when women constituted two-thirds of the party.

In Black Looks: Race and Representation (1992) hooks takes an article by Audre Lorde (1934 – 1992) about black womanhood as a structure, then discusses how black women are imprisoned in a stereotype of violence that continues on through the generations. She believes that the narrative can be changed, but that it is hard. Black women are encouraged to discuss Black literature. Yet, this does not come with any guarantees of self-actualization. In particular, she refers to Celie, a character in Alice Walker’s (1944 – ) The Color Purple (1982), who escapes an abusive situation, only to return to a similar situation at the end of the novel. What these fiction writers are doing is breaking “new ground in that it clearly names the ways structures of domination, racism, sexism, and class exploitation, oppress and make it practically impossible for black women to survive if they do not engage in meaningful resistance on some level.” (p. 50) Angela Davis (1944 – ) and Shirley Chisholm (1924 – 2005) are presented as examples of Black women breaking the trend and resisting the cycles. Women of color need to engage in feminism and in the “decolonizing of our minds” in order to center “social change that will address the diversity of our experiences and our needs.” (p. 60)

Not being Black, female or queer pas gay, it is not my place to pass judgement on the previous two works. At some level there is an intellectual understanding, but no lived experience. This is not the case with the third, and last, book that I would like to discuss: belonging: a culture of place (2009). hooks begins chapter 2, Kentucky is My Fate, with: “If one has chosen to live mindfully, then choosing a place to die is as vital as choosing where and how to live. Choosing to return to the land and landscape of my childhood, the world of my Kentucky upbringing, I am comforted by the knowledge that I could die here.” This was her fate, in 2021.

She regards her upbringing in rural Kentucky, as an exposure to anarchy, where people are enabled to live a relatively free life, despite racial separatism, white exploitation and black oppression. She contrasts this with more general urban experiences, where rules were made, imposed and enforced by unknown others, where “black folks were forced to live within boundaries in the city, ones that were not formally demarcated, but boundaries marked by white supremacist violence against black people if lines were crossed. Our segregated black neighborhoods were sectioned off, made separate. At times they abutted the homes of poor and destitute white folks. Neither of these groups lived near the real white power and privilege governing all our lives.”

In her last chapter, 10: Earthbound: On Solid Ground, hooks discusses the concept of interbeing, “That sense of interbeing was once intimately understood by black folks in the agrarian South. Nowadays it is only those who maintain our bonds to the land, to nature, who keep our vows of living in harmony with the environment, who draw spiritual strength from nature. Reveling in nature’s bounty and beauty has been one of the ways enlightened poor people in small towns all around our nations stay in touch with their essential goodness even as forces of evil, in the form of corrupt capitalism and hedonistic consumerism, work daily to strip them of their ties with nature…. To look upon a tree, or a hilly waterfall, that has stood the test of time can renew the spirit. To watch plants rise from the earth with no special tending reawakens our sense of awe and wonder.”

While I am happy that bell hooks was able to return to Kentucky, it is not always possible for people to return to their own place. For most of my adult life, my home town, New Westminster, on the banks of Sto:lo, the Fraser River, has been economically inaccessible. Thus, I have had to create my own substitute, Cliff Cottage, at Vangshylla, in rural Inderøy, Trøndelag, Norway. This has not happened without my own internal protests! Despite these, it is a place that is suitable for my anarchist self. Rural landscapes make better use of their internal resources, and are closer to sustainable. Prices for housing are lower, so people can work less. The benefits of a rural lifestyle are real.

Urban landscapes, unfortunately, have become dependent on the massive import of external resources for their survival. They are no longer sustainable. People living there feel a need to work excessively just to pay for the basics of housing. The benefits of an urban lifestyle are largely a mirage. At one point I read that in 2020 New Westminster experienced the worst air quality in the world, due to the combined effects of the 2020 Western American wildfires and a fire at the old pier at the quay.

This week, I was sent two listings for houses for sale in Kerrisdale, a residential area in Vancouver, British Columbia, where Trish, my wife, grew up. The prices for these modest houses on smallish lots were between two and three million dollars, Canadian. I would discourage everyone from supporting this form of übercapitalism. Buying such a house is not in the spirit of bell hooks. It is hard to be an anarchist making monthly mortgage payments! It is hard to be an anarchist wasting income on unnecessary expenditures.

Plasma Kinetics

This illustration shows some of the applications for Plasma Kinetics hydrogen technology, which include aircraft and assorted types of land vehicles. Presumably, various types of vessels could also use it. Source: Plasma Kinetics.

Hydrogen-based storage technology could replace capacitor and battery technology for energy storage in vehicles, vessels and aircraft of various types and sizes. Previously, posts in this weblog have taken up a hydrogen station explosion and its aftermath. In addition, a flawed report about the economics of hydrogen and methane has been examined.

Plasma Kinetics hydrogen technology was introduced, and patented, in 2008. It was first claimed that the technology was transformational, then disruptive. Almost immediately restrictions were placed on their use of patents, effectively resulting in the technology being banned by the US government. That situation continued until 2017, when it was allowed to be commercialized. There were some restrictions imposed under the International Traffic in Arms Regulations (ITAR), which continues to restrict its export as a missile fuel.

Where Plasma Kinetics technology differs from other providers of hydrogen is that it does not need a compressed gas infrastructure to capture, move or distribute hydrogen. Instead, one common distribution method is to fill 19 l/ 7 kg containers with hydrogen, for sale at assorted local stores. Empty containers can be returned, in exchange for recharged containers. The stored hydrogen is non-flammable. Containers of hydrogen can be transported via truck, rail, or ship without restriction. There is no need to build compressed hydrogen gas stations. Plasma Kinetics systems are slightly larger, and only moderately heavier, than compressed gas carbon-fiber tanks at 700 bar. But solid storage containers are much easier to manage than compressed gas, and have a lower overall energy cost and a cleaner fabrication process. The company’s other claimed advantages are:

  • safe, non-flammable hydrogen storage in dense solid form;
  • hydrogen is zero-carbon;
  • no energy or pressure is required to collect and store hydrogen;
  • no pipelines or fixed structure pumping stations are required;
  • cassette, canister and other container systems can be easily recharged;
  • materials used are non-toxic and readily available worldwide;
  • the entire process is quiet.

The nano-graphite film recharges through 150 cycles and is fully recyclable. The reason for this limit is that the process only works with atomic hydrogen = 1H (where an atom consists of one proton and one electron, but no neutrons). This amounts to 99.98% of hydrogen found in the wild. Deuterium = 2H (where an atom consists of one proton, one neutron and one electron) amounts to 0.02% of the wild hydrogen population. It cannot be used in the energy system, so it accumulates on the film. It can, however, be retrieved when the storage units are recycled, and sold for a profit that exceeds the recycling costs!

Comparison between different hydrogen storage methods. Source: Plasma Kinetics.

My acquaintance with this technology came from a YouTube video (2021-06-24) on the E for Electric channel, when Sandy Munro was asked by Alex Guberman what he would do if he became CEO of Toyota for a day. Part of his answer involved Toyota acquiring, or at least developing a relationship with, Plasma Kinetics.

Some weeks later, in an interview with Paul Smith (2021-08-12), Smith explains how the technology works, starting at about 5m00s in. He claimed that 15 lbs provides 20 miles of range in a car. With a severe allergy to imperial units, I would probably have said that a 19 l/ 7 kg cartridge would provide an average car with sufficient energy for 30 km. Cylinders for trucks would be 20 x larger (140 kg). Four of those would allow a truck to travel 570 miles = ca. 900 km.

One of the main concerns with this technology is the capability of consumers to replace a 19 l/ 7 kg cartridge every 30 km. People expect a modern electric vehicle (EV) to have a range of at least 300 km, which would require a vehicle to carry ten such units, at a weight of 70 kg. It was pointed out that systems were being developed for the automatic removal and insertion of disks (in cars), and presumably cylinders (in trucks and airplanes).
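
A rough Python sketch of the cartridge arithmetic above; the 30 km-per-cartridge and 7 kg figures are the estimates quoted in this post, not manufacturer specifications.

```python
import math

# Cartridge count and mass for a desired driving range, using the rough
# figures quoted above: one 7 kg cartridge for every 30 km of range.
CARTRIDGE_MASS_KG = 7
KM_PER_CARTRIDGE = 30

def cartridges_for_range(target_km: float) -> tuple[int, int]:
    """Return (number of cartridges, total mass in kg) for a desired range."""
    count = math.ceil(target_km / KM_PER_CARTRIDGE)
    return count, count * CARTRIDGE_MASS_KG

print(cartridges_for_range(300))  # (10, 70): ten cartridges weighing 70 kg
```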

It was noted that while batteries are extremely efficient, the specific energy of hydrogen, expressed in terms of J/ kg, is three times that of a battery. Except, in some respects, one is comparing avocados with olives! The hydrogen needs to go through a fuel cell for its energy to be converted to electricity.

It should be noted that, prior to the hydrogen ending up in some container, water = H2O has been split in an electrolyzer, yielding two parts hydrogen (H2) for every one part oxygen (O2). Please do not ask what happens to the oxygen!

Both fuel cells and electrolyzers are becoming smaller, lighter and more reliable. Electrolyzers can be stationed at local wind or photo-voltaic farms, wastewater treatment facilities, or other climate friendly sources.

It was also pointed out that a conventional compressed hydrogen refueling station can cost US$ 2.5 to 3 million. This contrasts with a station for Plasma Kinetics containers that costs about US$ 200 000.

A fuel cell vehicle using this technology should be far cheaper to make than a battery electric vehicle. Some items are eliminated, others are repurposed. For example, the battery cooling system becomes a fuel cell cooling system. Some components remain the same, such as the electric motors. In essence, a heavy battery is being replaced with a much lighter fuel cell and the Plasma Kinetics photo release system for hydrogen. This should give the vehicle improved range.

Paul Smith concludes that interest in the technology is stronger in Asia and Europe, and much less so in North America. A fab = fabrication facility = factory, to make the equipment, costs about US$ 100 million.

In EV 2030 predictions, the challenges with fuel cells involve the energy costs of electrolyzing hydrogen from water, which account for somewhere between 25% (DC) and 31% (AC) energy losses. Then, processing of hydrogen in the fuel cell costs another 50%. This means that the energy value available to the motors is somewhere between 36 – 38%. In contrast, the energy value available with a battery is about 77%.
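
A rough chain-multiplication sketch behind those percentages, assuming the loss figures quoted above and a 50% loss in the fuel cell; depending on exactly which figures are combined, the hydrogen path lands in the mid-to-high thirties, against roughly 77% for the battery path.

```python
# Fraction of the original energy left for the motors along the hydrogen path,
# combining the electrolysis loss with a fuel-cell loss, as described above.
def hydrogen_path(electrolysis_loss: float, fuel_cell_loss: float = 0.50) -> float:
    return (1.0 - electrolysis_loss) * (1.0 - fuel_cell_loss)

print(f"DC electrolysis: {hydrogen_path(0.25):.1%}")  # about 37.5%
print(f"AC electrolysis: {hydrogen_path(0.31):.1%}")  # about 34.5%
print(f"Battery path:    {0.77:.1%}")                 # about 77%
```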

Since my prophecy quotient is already used up, I will only speak of dreams. One of these is that dynamic charging along highways will meet much of the vehicular need for electricity by 2050. Unfortunately, this is not supported by any evidence seen so far. Associated with this dream is that the cost of dynamic charging technology will be less than that of hydrogen containers and fuel cells, or equivalent battery-based components, in vehicles. An addendum to this dream is that solid-state batteries will become the norm, because of their increased specific energy, energy density and durability. Any such batteries will generally be much smaller and reserved for last-mile situations, something a 20 kWh battery would be able to supply.

Eurorack

For people living – possibly even born – in the 21st century, Eurorack is a major approach to acquiring an affordable synthesizer. It is not a specific instrument, but a modular synthesizer format originally specified in 1996 by Doepfer Musikelektronik. It has since grown in popularity, and as of 2018 has become the dominant hardware modular synthesizer format, with over 5000 modules available from more than 270 different manufacturers.

Stated another way: If you, as a synthesizer playing person, want to base your synthesizer on modular components, there is no point in acquiring anything that isn’t Eurorack compatible; If you are a synthesizer module manufacturer, there is no point in offering modules that aren’t Eurorack compatible. Eurorack is the unavoidable standard, the intersection between module consumers and producers. Here, in this weblog post, the Eurorack specifications will be examined in some detail.

The mechanical specifications for Eurorack are derived from Eurocard, but with additional power supply and bus board specifications. The power supply is currently specified as A-100 PSU3, updated in 2015. Many cases adhere to the A-100 PSU2 specification, which allows modules to fit into existing (read: used) rack cases.

The Doepfer bus board allows for a maximum of 14 modules to be plugged in. A standard Doepfer case, either rackmount or portable, consists of two rows of 84hp, 6U high, that contain one PSU and two bus boards.

A Doepfer A-100 modular synthesizer, with two modules from Analogue Solutions, and the remainder from Doepfer. Photo: Nina Richards, 2011-12-02.

Doepfer-style bus boards are circuit boards. An alternative to these is a flying bus board. These have similar connections but use a flexible ribbon. This is often preferred, as mounting circuit boards can sometimes prove difficult.

The modules themselves have to meet Eurocard specifications for printed circuit board (PCB) cards that can be plugged together into a standard chassis which, in turn, can be mounted in a 19-inch rack. The chassis consists of slotted card guides top and bottom, into which cards are slid so they stand on end. The spine of each card has one or more connectors which plug into mating connectors on a backplane at the rear of the chassis.

Module height was three rack units (3U) and the width was measured in the Eurocard-specific Horizontal Pitch (hp) standard, where 1hp equals 0.2 inches, or 5.08 mm. The modules were largely low-cost, compact, and had some components on their boards that were socketed instead of soldered down, so the user could, for example, upgrade to a better op-amp IC.

An unpopulated Doepfer LC6 Eurorack case, with power bus. Photo: Ashley Pomeroy, 2020-12-31.

Nathan Thompson, writing as nonlinearcircuits, has posted 33 laser-cut Eurorack cases, plus rails and some other components, on Thingiverse. Most of the cases date from 2015 and 2016.

Modules connect to a bus board using a 10-to-16 or 16-to-16 pin cable, depending on module design. These 16 pins are arranged in pairs and carry the following signals, from top to bottom: Gate, CV, +5V, +12V, GND, and -12V. The bottom 10 pins do most of the work, providing + and -12V to power the modules. The top two pins are for Doepfer’s optional internal CV and Gate routing. The +5V rail is used on some modules that require more power.

Plugging modules in is not always as simple as it seems. Experienced Eurorack users will rigorously check connections before powering up, no matter how long they’ve been working with the system. Typically, the red stripe on the ribbon cable connecting the modules to the bus board must line up with -12V. This should be labeled on the module, and is always labeled on the bus board. Plugging a module in incorrectly may have expensive ramifications.

The A-100 PSU2 provides 1 200 mA = milliamps of current on each of the +12V and -12V rails. This has to be compared with the current drawn by the modules, which has to be less than what the PSU specifies. The A-100 PSU3 also provides a +5V rail.

With the classic Doepfer case, a user would need to consume less than 1 200 mA on each rail. Modules should be split almost evenly between the two bus boards. If a module requires +5V, either a PSU3 has to be used (most manufacturers, including Doepfer, have supported this since 2015), or an adapter, which takes current from the +12V rail. With such an adapter, the amperage required on the +5V rail is subtracted from the current available on the +12V rail. The power specifications in Eurorack are not technically standardized, but most manufacturers follow the Doepfer standard.
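
A minimal power-budget sketch for a single Doepfer-style bus board, under the assumptions just described: 1 200 mA available on each of the +12V and -12V rails, with any +5V drawn through an adapter counted against the +12V rail. The module names and current draws are invented examples; real figures come from each module’s datasheet.

```python
# Simple per-rail current budget for one bus board.
RAIL_LIMIT_MA = 1200

# (name, +12 V draw, -12 V draw, +5 V draw) in mA; values are invented examples.
modules = [
    ("VCO",        60, 40,  0),
    ("Filter",     40, 40,  0),
    ("Digital FX", 120, 20, 80),
    ("Mixer",      30, 30,  0),
]

plus12 = sum(p12 + p5 for _, p12, _, p5 in modules)   # +5 V taken via adapter
minus12 = sum(m12 for _, _, m12, _ in modules)

print(f"+12 V rail: {plus12} mA of {RAIL_LIMIT_MA} mA "
      f"({'OK' if plus12 < RAIL_LIMIT_MA else 'over budget'})")
print(f"-12 V rail: {minus12} mA of {RAIL_LIMIT_MA} mA "
      f"({'OK' if minus12 < RAIL_LIMIT_MA else 'over budget'})")
```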

Perhaps the most important consideration, but one that may be difficult to answer for someone new to synthesizers and/ or Eurorack, is deciding on the type of rig to make.

Some people opt for a classic analogue synth, a rig capable of generating its own waveforms, with wave-shaping tools to add character, including textures and timbres, to the generated signal. Another approach is to build an effects rack that processes sound generated elsewhere. These can be monophonic, stereo or polyphonic. Beyond this, one can build a drum machine that is focused on rhythm, rather than more tonal qualities.

One major advantage of Eurorack is its modular nature, allowing an opportunity to add and delete modules. To construct a self-contained instrument one needs: an oscillator, a filter, a voltage controlled amplifier (VCA), two envelope generators, one for the filter and another for the VCA, an effects unit, a mixer and/ or an output module.

Beginners are often encouraged to choose an analogue oscillator. These are easy to find and use, while still offering opportunities for creative expression.

Voltage-controlled oscillators (VCOs) generate waveforms (sine, triangle, sawtooth, ramp or square) that are slightly unstable, with fluctuations in pitch and timbre as the voltage changes over time; this gives each oscillator a unique character.

Filters impact sounds the most. For better or worse, many modern synths use filters with characteristics that emulate those found on specific vintage synthesizers.

Robert Moog’s (1934 – 2005) lasting impact on synthesizers starts with his dictate of 1 V per octave. Increasing the voltage going into a VCO by 1 volt raises its pitch by one octave. To understand this, consider a piano and how it is tuned. Convention dictates that middle C is referred to as C4. Tuning is based on A4, the first A above, or to the right of, middle C. A4 has a standard frequency of 440 Hz. For convenience, it will be assumed that this is produced by a VCO signal of 4V. Thus, the relationship between note, voltage and frequency can be expressed by: A0 = 0V = 27.5 Hz; A1 = 1V = 55 Hz; A2 = 2V = 110 Hz; A3 = 3V = 220 Hz; A4 = 4V = 440 Hz; A5 = 5V = 880 Hz; A6 = 6V = 1 760 Hz; A7 = 7V = 3 520 Hz; A8 = 8V = 7 040 Hz. Note: Not all VCOs are tuned to A in this fashion. As can be seen above, this results in an exponential relationship between voltage and frequency, as each change of octave corresponds to a doubling or halving of the frequency. An accurate reproduction of this exponential curve in modules is difficult in analogue synthesizers, because temperature changes and the ageing of electronic components, often referred to as tracking errors, can impact pitch.

An aside: Many Japanese synthesizers, such as those made by Yamaha or Korg, use a system where voltage is proportional to frequency. If A1 = 1 V, then A2 = 2 V, A3 = 4 V and A4 = 8 V. In other words, it takes a doubling or halving of the voltage to produce an octave change.
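
To make the contrast concrete, here is a minimal Python sketch of the two control-voltage conventions, assuming the same convenience anchors used above (A0 = 0 V = 27.5 Hz for volt-per-octave, and 55 Hz per volt for the linear case); real modules differ in their reference points and tuning.

```python
A0_HZ = 27.5  # frequency assigned to 0 V in the volt-per-octave example above

def freq_volt_per_octave(voltage: float) -> float:
    """Exponential (Moog-style) scaling: each additional volt doubles the frequency."""
    return A0_HZ * 2 ** voltage

def freq_hertz_per_volt(voltage: float, hz_per_volt: float = 55.0) -> float:
    """Linear (Yamaha/Korg-style) scaling: frequency is proportional to voltage."""
    return hz_per_volt * voltage

if __name__ == "__main__":
    for v in range(0, 9):
        print(f"{v} V  ->  V/oct: {freq_volt_per_octave(v):7.1f} Hz   "
              f"Hz/V: {freq_hertz_per_volt(v):7.1f} Hz")
    # V/oct: 0 V = 27.5 Hz, 4 V = 440 Hz, 8 V = 7040 Hz (doubling per volt).
    # Hz/V:  1 V = 55 Hz, 2 V = 110 Hz; an octave needs a doubling of voltage.
```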

There are three basic approaches to acquiring modules that can be used with Eurorack. These are 1) assembled systems; 2) DIY from kits; 3) DIY from components. All three of these approaches will be discussed below.

Assembled Systems

Moog, in the late 1960s, released synthesizer modules Ic, IIc and IIIc, followed by the Ip, IIp and IIIp. These were followed in the early 1970s by System 10, 12, 15, 35 and 55. These were all extremely expensive, based on discrete transistor designs. The separate modules, such as oscillators, amplifiers, envelope generators, filters, noise generators, ring modulators, triggers and mixers, were connected using patch cords, which also enabled them to control each other. This produced a distinctive sound that made its way into many contemporary recordings. Production of all of these except the System 15, 35 and 55 modules had stopped by 1973. These last three lasted until 1981. Moog has released new versions of some of these since 2014, but they typically cost US$ 35 000.

The patents and other rights to Moog’s modular circuits expired in the 1990s. With the expiration of these rights, other manufacturers have been able to offer sound clones of these modules, many in the Eurorack format. Since 2020, Behringer has been one of these.

The Behringer 960 Sequential Controller. Photo: Behringer

The Behringer 960 Sequential Controller replicates the operation of its System 55 counterpart, but uses modern components, and is built so that it can fit in a standard Eurorack case. It is also affordable, at about NOK 1 600.

DIY from kits

Dreadbox Dysphonia Eurorack synth module. Photo: Dreadbox

For slightly more money, about NOK 2 200, one could also buy a Dreadbox Dysphonia, which was offered as a kit in 2021-11. As with many kits, it was made as a single run. Once the kits from that batch are sold, no additional kits will be made. Dreadbox describes this as buy now or cry later. On 2022-01-19, one was being offered for sale for NOK 4 000. Despite the hype, one can usually expect something similar to be offered in the future, but there will be differences, sometimes even improvements.

The main advantage of this kit is that it can be used as a stand-alone desktop synthesizer, or be fitted into a Eurorack. To facilitate both purposes, it comes with a USB to Eurorack power converter. This type of kit is claimed to be well suited for inexperienced DIY constructors. The instructions are typically easy to understand, and the kit easy to solder together!

The Dysphonia consists of 13 individual sections that offer an affordable, compact, modular patch system, if one is prepared to build the system from parts. A single analogue oscillator comes with 4 waveforms that can be patched independently through 3 VCAs = voltage controlled amplifiers, and a 3 channel mixer, before being subjected to a 24dB 4-pole lowpass filter and a 12dB 2-pole multimode filter. The lowpass filter can also self-oscillate to provide additional tones. In addition to an analogue LFO = low frequency oscillator and envelope, there is a digital modulator providing 4 different modes: LFO, Envelope, Random and CC = continuous control, a MIDI = musical instrument digital interface message capable of transmitting a range of values, usually 0-127. These are commonly used for controlling volume (#7), pan (#10), data slider position (#6), mod wheel (#1) and other variable parameters. They can enhance music, but an over-use of these messages can result in MIDI log-jam, where the amount of data sent exceeds the supported bandwidth. There is also a MIDI-to-CV = control voltage module, which provides analogue to digital and digital to analogue conversions, allowing the module to interact with a keyboard, computer, phone or almost any other device. There is also a Hybrid Echo module.
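
As an aside on the CC messages mentioned above, here is a minimal Python sketch of the raw three-byte form such a control change message takes under the MIDI 1.0 specification; the helper function and the example values are illustrative, and not specific to the Dysphonia.

```python
def control_change(channel: int, controller: int, value: int) -> bytes:
    """Build a 3-byte MIDI CC message: status byte, controller number, value (0-127)."""
    assert 0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127
    status = 0xB0 | channel          # 0xB0-0xBF = control change on channels 1-16
    return bytes([status, controller, value])

# Examples matching the controllers mentioned in the text:
mod_wheel = control_change(channel=0, controller=1, value=64)    # CC #1
volume    = control_change(channel=0, controller=7, value=100)   # CC #7
pan       = control_change(channel=0, controller=10, value=0)    # CC #10

print(mod_wheel.hex(" "), volume.hex(" "), pan.hex(" "))
# b0 01 40   b0 07 64   b0 0a 00
```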

DIY from components

One useful source of updated electronic information comes from Elektor magazine. A green subscription provides everything digitally, including back issues. Elektor publishes electronic projects, background articles and designs aimed at engineers, enthusiasts, students and others. It also sells printed circuit boards (PCBs), kits and modules. PCB design work is usually available without charge from their website. Microcontroller and/or computer based projects normally come with program source code and other necessary files.

This is also a good source of synth designs that take advantage of modern electronic components with methodologies that are suitable for hobbyists.

Gear Acquisition Syndrome

One of the major challenges with Eurorack is that it encourages the acquisition of excessive amounts of gear. Gear acquisition syndrome (GAS) is a real psychological challenge, satirically documented by Steely Dan guitarist Walter Becker (1950 – 2017) in a 1994-04 Guitar Player magazine article (p. 15), where G originally stood for guitar. Because the many providers of Eurorack offer a wide variety of relatively low-cost components, often with specific but limited characteristics, it is tempting to buy just one more! Some people realize compulsive shopping should be resisted. Those who need the advice will probably not follow it.

The seven key stages of GAS are discussed in a 2022-08-18 Music Radar article. These are: dissatisfaction, desire, ‘research’, the purchase, guilt, acceptance and relapse. Relapse, the last and “cruellest stage of GAS[,] can hit anywhere between a year to eighteen months after the purchase, although the time passed invariably depends on the amount of cash spent and the amount of meals you’ve had to eat from a tin as a consequence.” Once again, the article refers specifically to guitars, but it also applies to synths, and by extension Eurorack modules.

Another weblog post tentatively titled DIY Synths and currently scheduled for publication 2023-03-25, contains more detailed information about synth circuits, especially from kits.

Elizebeth Friedman (1892-1980)

One hundred and thirty years before the publication of this weblog post, Elizebeth Friedman née Smith (1892-08-26 – 1980-10-31) was born. She was known in later life for her expertise in cryptanalysis.

Cryptanalysis, popularly called code breaking, is an important branch of computer science. All practitioners are expected to have some basic understanding of it, at both practical and conceptual levels. It is a broader topic than many realize: it involves the analysis of information systems to understand what lies hidden. As expected, Wikipedia has an article on the topic. People often describe cryptanalytic efforts (often their own) as reverse engineering. Others may think they know what this term implies, but it is vague, and often deliberately used to cover up the specific techniques involved in ferreting out system information.
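To make the idea less abstract, here is a minimal sketch of the oldest cryptanalytic technique, frequency analysis: counting how often each letter appears in a ciphertext and comparing the result with the known letter frequencies of the suspected plaintext language. It is written in Python and is purely illustrative.

from collections import Counter

def letter_frequencies(text):
    # Count only alphabetic characters, ignoring case.
    letters = [c for c in text.upper() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    return {letter: count / total for letter, count in counts.most_common()}

# For a simple Caesar cipher, assuming the most frequent ciphertext letter
# corresponds to plaintext E will often reveal the shift.
def guess_caesar_shift(ciphertext):
    most_common = next(iter(letter_frequencies(ciphertext)))
    return (ord(most_common) - ord('E')) % 26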

American cryptanalysis begins at Riverbank Laboratories in Geneva, Illinois, where a number of documents on the subject were published, some of which are available for download. This was mainly the work of one woman, Elizebeth Friedman (1892-1980), sometimes assisted by her husband, William (1891-1961), who worked at Riverbank from 1915, then went on to lead the research division of the Army’s Signal Intelligence Service (SIS) in the 1930s, and some follow-on services into the 1950s. William’s one major contribution was inventing the term cryptanalysis!

Unfortunately for the world, some scientific practices are shameful. It is not so much the individual facts that are either stumbled upon or ignored that constitute a major problem, but rather how some practitioners (typically male) take credit for work performed by others (typically female). In addition, large numbers of women have been actively discouraged from pursuing scientific careers, simply on the basis of their gender. In the twentieth century, when they were permitted to participate, they were shunted into inferior positions/ roles, and their contributions were undervalued. Hopefully, in the twenty-first century, this will come to an end.

Elizebeth Friedman was the foremost cryptanalyst in the USA, exceeding her husband in ability. This is mentioned because, while they worked together, only William Friedman’s name appears on published documents, although many knew that these were sometimes written jointly, and often by Elizebeth alone. It is difficult (if not impossible) for me, a male, born more than half a century after her, to understand her situation, let alone her motivation for allowing her husband to receive full credit.

Yet, the most outrageous appropriation of her work did not involve her husband, but another American man who should have been regarded as Public Enemy #1, John Edgar Hoover (1895 – 1972), who took credit for much of Elizebeth’s cryptanalysis. As Wikipedia states, “Later in life and after his death, Hoover became a controversial figure as evidence of his secretive abuses of power began to surface. He was found to have exceeded the jurisdiction of the FBI, and to have used the FBI to harass political dissenters and activists, to amass secret files on political leaders, and to collect evidence using illegal methods. Hoover consequently amassed a great deal of power and was in a position to intimidate and threaten others, including multiple sitting presidents of the United States.”

Fake Science and its consequences

Originally, this post ended with the previous paragraph. Then, on 2022-07-22, allegations emerged that part of a key 2006 study of Alzheimer’s disease may have been fabricated. Matthew Schrag (1971 – ) found serious problems with underlying research led by Sylvain Lesné (1974 – ) on a specific protein. Science has now issued a statement about it. Images accompanying and supporting the research seem to have been altered.

One problem with fake research is that it often results in fakers getting undeserved research grants. More importantly, real researchers get denied these grants, which can mean that medical breakthroughs get delayed. This can result in unnecessary suffering, or even premature death. The amount of money involved can reach hundreds of millions of dollars. The wrong papers get cited: in this particular case, one published in Nature has been cited 2 300 times.

Schrag told Science, “You can cheat to get a paper. You can cheat to get a degree. You can cheat to get a grant. You can’t cheat to cure a disease.” This is the real essence of the problem. Fake research leads society down a cul-de-sac.

Much of the world is already treating science with contempt. In particular, there are climate deniers who use falsified science to justify their own claims. There are oil companies that have publicly denied the climatic impact of the carbon dioxide produced from the combustion of petroleum products, despite knowing about its consequences for about six decades. This is having a negative impact on billions of people, while a few others are provided an unwarranted life of luxury.

The world needs to prevent the publication of falsified reports, such as those by Lesné. Currently, the peer review system lacks a mechanism to prevent the publication of doubtful works. I remember my wife Trish’s cousin, Terry Heaps (1941 – 2017), discussing a paper he had peer reviewed as a resource economist. He had rejected it because it contained some bad data that produced incorrect conclusions. He had been alerted to this situation by an illustration. He notified the publication of this fact. The publication then notified the author. Despite this, the paper showed up in another publication, complete with the bad data and incorrect conclusions, but minus the illustration.

In addition to developing a system to prevent inappropriate works from being published, science needs to ensure that people who make valuable contributions, such as Elizebeth Friedman, are fully acknowledged.

Addendum

Once again, this post ended with the previous paragraph. Then, on 2022-08-21, X-ray evidence emerged that Wyndham Lewis (1882 – 1957) had deliberately destroyed Helen Saunders’ (1885 – 1963) missing artwork, Atlantic City (ca. 1915), by painting Praxitella (1921) on top of it. Students Rebecca Chipkin and Helen Kohn used X-ray and other imaging technology to investigate Praxitella, because of its “uneven texture and glimpses of bright red through cracks in the surface paint.”

Vorticism was an artistic movement, heavily influenced by cubism and futurism, whose artworks typically used bold colours, harsh lines and sharp angles. It originated with the Rebel Art Centre, started in 1914 by Wyndham Lewis and Kate Lechmere (1887 – 1976), who financed it. Helen Saunders and Jessica Dismorr (1885 – 1939) were associated with the movement as practicing painters. The movement also had literary supporters that included Ford Madox Ford (1873 – 1939), Ezra Pound (1885 – 1972), T S Eliot (1888 – 1965) and Rebecca West (1892 – 1983).

The Courtauld Gallery is an art museum in central London, established in 1932. It houses the collection of the Courtauld Institute of Art, a self-governing college of the University of London, specializing in art history. Barnaby Wright, deputy head of the Courtauld Gallery and a 20th-century art specialist, is quoted several times in the Guardian article.

“Saunders was a really interesting figure, but she was largely overshadowed by her male contemporaries. She and Jessica Dismorr were the backbone of the group,”

“In the prewar years, [Saunders] was one of the most radical painters and draughtspeople around. There were only a handful of people in Europe producing that type of hard-edged abstract painting and drawing.”

Atlantic City depicts a fragmented modern metropolis, almost certainly in the vibrant colours associated with the Vorticists. A black and white image of the painting appeared in Blast, the avant garde Vorticist journal.

Starting on 2022-10-14, the Courtauld Gallery will open Helen Saunders: Modernist Rebel, an exhibition of 18 of Saunders’ drawings and watercolours, tracing her artistic development. It will also show Praxitella, loaned from Leeds Art Gallery, alongside the X-ray and partial colour reconstruction of Atlantic City.

Finally, one has to ask why men deliberately destroy the work of women. From my perspective, social justice demands that the layers of paint constituting Praxitella be removed, to allow Atlantic City to re-emerge. Criminal actions cannot be allowed to triumph over legitimate ones.

Back to Elizebeth Friedman.

This post originally started at some forgotten point in 2020. It was based on one simple question: why is Alan Turing (1912-1954) remembered as a cryptanalyst, but not Elizebeth Friedman? Of course, Elizebeth was not the first cryptanalyst. The first known recorded explanation of cryptanalysis was given over 1 100 years earlier by Al-Kindi (c. 801–873).

Other important women cryptanalysts include Agnes Meyer Driscoll (1889-1971) and Joan Clarke (1917-1996).

Extreme Heat Belt

The counties marked in red are expected to experience temperatures of 125 °F = 51.67 °C at least one day a year, by 2053. This area is referred to by some as the Extreme Heat Belt. Screenshot of an Axios map, without the underlying data provided at the county level by the First Street Foundation.

The mission statement of the First Street Foundation reads: Make climate risk accessible, easy to understand and actionable for individuals, governments, and industry. A changing climate is impacting the risks facing American properties, communities, and businesses as perils like flood, fire, heat, and drought become more common, and more severe…. First Street Foundation is a non-profit research and technology group dedicated to quantifying and communicating those risks by incorporating world class modeling techniques and analysis with the most up to date science available in order to simply, and effectively, inform Americans of their risk today and into the future from all environmental changes.

Extreme heat refers to a maximum heat index greater than 125 °F, reached on at least one day a year. Currently (that is, in 2022) 8 million Americans are exposed to it. By 2030, some additional coastal areas in the Southeast and Mid-Atlantic may also experience a heat index at or above 125 °F. By 2053, the number of people exposed to extreme temperatures is expected to increase to 107 million.

Dangerous days have a heat index greater than 100 °F = 37.78 °C. The Gulf Coast and Southeast will see the highest chances and longest durations of exposure to these. While many places already experience more than 20 consecutive days with heat indices above 100 °F, by 2053 these streaks could involve up to 74 consecutive days.

Local hot days are days that exceed the temperatures typically experienced for a particular area. The West will have the highest chance of long durations of these.

Future cooling-driven increases in carbon emissions could aggravate warming further. Texas, Florida, California, Ohio and Missouri are the top 5 states with the greatest expected increase in cooling-related CO2 emissions between 2022 and 2053.

As a missionary for SI, the international system of units, temperature always presents a quandary. In this official system, temperature is measured in kelvin, with symbol K. Both the kelvin and Celsius scales use a 100 K/°C difference between the freezing and boiling points of water, at standard sea-level air pressure.

0 K is set to absolute zero, which is -273.15 °C, while 0 °C, on the Celsius scale, is set to the freezing point of water. On the Fahrenheit scale, water freezes at 32 °F = 0 °C and boils at 212 °F = 100 °C, resulting in a 180 °F difference between these two points. Thus, 125 °F = 324.8167 K.

SI clergy undoubtedly spend many sleepless nights pondering whether the extreme heat value should be increased to 325 K = 125.33 °F, or whether 50 °C = 122 °F should be used. Those prioritizing as little change as possible will support the former. Those wanting rounder values, ending in 0, will opt for the latter. The reason for this proposal is that the world needs a mechanism to compare extreme heat locations, which will require heat to be expressed in degrees Celsius. This is why, personally, 50 °C holds greater appeal, even if more locations in the world will fall into that category.
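The arithmetic behind these figures is simple enough to check in a few lines of Python; the conversion functions below are only a sketch, but they reproduce the values used in this section.

def f_to_c(f):
    return (f - 32) * 5 / 9

def c_to_f(c):
    return c * 9 / 5 + 32

def c_to_k(c):
    return c + 273.15

print(round(f_to_c(125), 2))           # 51.67 °C
print(round(c_to_k(f_to_c(125)), 4))   # 324.8167 K
print(round(c_to_f(325 - 273.15), 2))  # 125.33 °F, i.e. 325 K
print(c_to_f(50))                      # 122.0 °F, i.e. 50 °C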

Those wishing to be further perplexed by this topic are invited to read the Wikipedia article on thermodynamic temperature. In an imperfect world, every gram of improved understanding is worth the effort.

Analogue Electric Vehicles

A Woodpecker skateboard, to encourage young experimenters to investigate battery electric vehicles. Photo: Woodpeck.org

Part 1

On 2021-07-07, Robert N. Charette wrote an article in IEEE Spectrum, How Software Is Eating the Car, subtitled: The trend toward self-driving and electric vehicles will add hundreds of millions of lines of code to cars. Can the auto industry cope?

As usual, an article on Slashdot (/.) is my main source of biased opinions about a serious technological issue, with one typical comment given a score of 4: interesting. It read: “If you get something pre-1978 then the most sophisticated electronics in the vehicle will probably be the radio kit.” This was then followed by numerous comments about 1976 (and later) Chrysler Cordobas. This type of reasoning reaches its zenith with, “What was the last car without this nonsense? Makes me want to buy a classic car or motorcycle, just for the simplicity.”

Yes, for a long time the trend has been towards an increasing number of ECUs [= electronic control units], based on the design philosophy of, “If you want a new feature, you buy a box from a Tier 1 [top-level component supplier, such as Bosch] that provides the feature, and wire it in. As a general rule, automakers love outsourcing work; for most of them, their dream scenario is that everyone else does all the work for them and they just slap a badge on it and take a cut.”

Then Rei adds a long comment, scored 5: informative: “This article actually has it backwards. The first company to break with this philosophy was Tesla, which has from the beginning had a strong philosophy of in-house software design, and built basically a ‘car OS’ that offloads most vehicle software functionality into a handful of computers (with redundancy on safety-critical functionality). … Eventually everyone is going to have to either make their own ‘car OS’ stack or lease one from someone else. The benefits are just too significant[:] Lower hardware costs, lower assembly costs, lower power consumption, simpler cheaper lighter wiring harness, faster iteration time on new functionality, closer integration between different subsystems, you name it. This trend throws into reverse the notion of ever-increasing numbers of ECUs (which quite simply was an unsustainable trend).”

Who could possibly disagree?

Part 2

What is the minimal vehicle a person needs? Of course, there will be as many answers as there are people, and it will all be dependent on what they are doing. There are a lot of vehicles available, but I will not refer to them as choices. Some places lack trams or other forms of public transit. They may exist in other places, but run at inappropriate frequencies. Some communities lack bike lanes, forcing cyclists to compete for space with cars. Some streets are perpetually gridlocked.

Some people need to work, outside of their residences! Does one have to take children to kindergartens or schools? What distance does one have to travel to attain basic health and nutritional needs? Can this be done as part of a commute, or is a separate trip necessary? What about specialty shops? What is the distance to the nearest bus station/ train station/ airport/ international airport? Is there a need for a social life? Is one dependent on driving a car? Could a bicycle do for some items? Are trains or buses an option? So many questions, so few obvious answers.

Perhaps my own situation could be used as an example. Compared to most people, my life is simple: no job is calling me, and I am no longer responsible for looking after young children. Yesterday, I used a vehicle with a mass of about 1.5 megagrams (where 1 Mg = 1 000 kg) to drive 40 km. Admittedly, there are vehicles that weigh less than a car. A bicycle is probably the most efficient device for conveying people, and it can have a mass of about 5 to 20 kg. Yet, I would not feel safe riding one on the roads of rural Norway. There are no buses, but if I plan ahead and contact the appropriate office a day in advance, I might be able to use public transit, essentially a taxi charging bus rates, as long as I am willing to wait up to several hours for a return trip.

The most basic foods, as well as building supplies, can be purchased with a 14 km return trip across Skarnsund bridge in Mosvik, where there is even a coffee bar, with better than acceptable lattes. Basic health care (doctor, dentist, pharmacy, optometrist) and a larger selection of food and basic necessities are met by driving 26 km for a return trip in the opposite direction, into Straumen. More specialty shops are available in Steinkjer involving a 70 km round trip. This all involves driving. However, there is also a train station at Røra, 40 km round trip by car, that will allow one to connect with an international airport (TRD), and the fourth largest city in Norway, Trondheim, about 120 km away – 240 km round trip, with an even larger selection of shops and activities.

Part 3

I am in agreement with Rei that more software (and less hardware) is needed in vehicles. Yet, I read this week that General Motors is charging purchasers of many GMC, Buick and Cadillac vehicles, which are shipped with OnStar and Connected Services Premium Plan by default, $1 500 for the three-year plan that was once optional but is now required. Other companies are doing the same sort of thing. It is estimated that this revenue stream could give GM an additional $20 to $25 billion per year by 2030. BMW has made similar claims, anticipating additional revenue of about $5 billion per year by 2030. I do not want to help ensure that a wealthy elite continues to take more of an income pie that is already unfairly divided.

At issue is the right of consumers to direct access to vehicle data, which historically has been obtained from an on-board diagnostics (OBD-II) port in North America, or a European on-board diagnostics (EOBD) port, mandatory since 1996 and 2001, respectively. These ports allow vehicle owners and technicians access to vehicle data to assist with maintenance and repair. This situation is threatened by vehicle manufacturers, who want to use telematics = the sending of data wirelessly and directly to the manufacturer, restricting who else can see it. In 2021, about 50% of new cars had these connected capabilities, but no country had more than 20% of its vehicle fleet equipped; the USA had the most. By 2030, it is estimated that about 95% of new vehicles sold globally will have this connectivity, according to a study by McKinsey.
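For readers curious about what this direct access looks like today, the sketch below reads a few live values over the OBD-II port, using the third-party python-OBD library and assuming an ELM327-style adapter is plugged in. It is only an illustration of the consumer-accessible route that telematics threatens to bypass; which parameters are available varies by vehicle.

import obd

# Connect to an ELM327-compatible adapter plugged into the OBD-II port.
# With no arguments, python-OBD scans the usual serial/USB/Bluetooth ports.
connection = obd.OBD()

for command in (obd.commands.SPEED,
                obd.commands.RPM,
                obd.commands.COOLANT_TEMP):
    response = connection.query(command)
    if not response.is_null():
        print(command.name, response.value)

connection.close()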

While this data could provide economic and other benefits to car owners, vehicle manufacturers want to act as gatekeepers, determining who can access it, and at what cost. This is a detriment to consumers, and could result in: increased consumer costs; restrictions on consumer choices for maintenance and repair; safety and security issues involving the use of non-standard data types and formats; and privacy concerns. Automotive mechanics and other aftermarket providers can also be affected.

This has resulted in a consumer backlash, which I associate with the right-to-repair movement. There are already open-source groups working to ensure that consumers retain their rights. In addition, Automotive Grade Linux (AGL) is an open source project hosted by The Linux Foundation that is building an open operating system and framework for automotive applications. It was started in 2012, and currently has 146 corporate members.

I imagine that automotive manufacturers will try to add just enough proprietary software to their vehicles, to profit maximally from their investment. On the other hand, I see that there will be an incentive for ordinary consumers to demand right-to-repair legislation, and for guerilla activists to produce generic software substitutes where this is useful.

In Europe, repair is increasingly regarded as an essential consumer right and an environmental necessity. The main objective of the European Green Deal is to be climate neutral by 2050. The European Commission’s Circular Economy Action Plan (CEAP), published 2020-03, details how this goal is to be reached. To reduce waste, products have to be designed to last. If they don’t last, they shouldn’t be sold. To encourage the development of longer-lasting products, there could be lifespan labels, service manuals, and an EU-wide repairability index. This would encourage the market to compete on repairability and durability.

In 2020-11, the European Parliament voted overwhelmingly in favour of a right to repair, and insisted that the more conservative European Commission, the administrative arm, implement it. It also included repairability labelling.

In 2020-11, voters in Massachusetts approved Question 1, involving a right-to-repair Law, with almost 75 percent in favour. The law requires automakers to provide a way for car owners and their chosen repair shops to access vehicle data, including that sent wirelessly to the manufacturer. The intent of this law is to prevent manufacturers and dealerships from having exclusive access to data.

Massachusetts is the state where the first automotive right-to-repair law was passed in 2012. That law made car makers open up the data inside the car. Rather than create a state by state solution, automakers reached a nationwide agreement with car parts makers/ suppliers and repair shops on how to share the data. This agreement opened the OBD-II port. With this new and improved right-to-repair law, similar transformative actions are required.

There are an increasing number of underpaid programmers and other software and hardware specialists, unable to fully live the American (and Scandinavian) dream. Many of these would undoubtedly be willing to work as guerilla technologists to develop the tools needed for retrofitting vehicles with more consumer friendly components, especially after warranties have ended. There are an increasing number of inexpensive microprocessors and systems on a chip that can be used for these purposes.

Part 4

To put electric vehicles in perspective, one needs to return to 1965-11-05, when President Lyndon Johnson was given a copy of Restoring the Quality of Our Environment, a report by the Environmental Pollution Panel of the President’s Science Advisory Committee. By the writing of this post, people had had 20 735 days, or 56 years, 9 months and 8 days, to confront this challenge, but have failed miserably at the task.
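The elapsed time quoted above is easy to verify with a couple of lines of Python, assuming 2022-08-13 (the date given in the notes for Part 4) as the reference date.

from datetime import date

report = date(1965, 11, 5)        # Restoring the Quality of Our Environment handed over
reference = date(2022, 8, 13)     # assumed reference date for this part of the post

print((reference - report).days)  # 20735 days, i.e. 56 years, 9 months and 8 days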

One fundamental question is: where can younger people learn more about the construction of appropriate vehicles for the 21st century? Currently the most interesting project is Woodpecker, which describes itself as an “Open source based Carbon negative Electric Vehicle Platform. Woodpecker is a game changing micromobility vehicle to decrease CO2. Electrical propulsion allows to use solar and renewable power. Production of Wooden frame even decreasing CO2 because it is encapsulated by [wood] while growing. Vehicle built on Circular Economy concept – most parts are recyclable.” It appears to have originated in Latvia, and includes partnerships with many higher-educational institutions in the country. One problem with Woodpecker is that, as an organization, it is too closely bound to commercial enterprises. For example, a good starting point for most open-source projects is to become acquainted with their documentation. In this case, people interested in downloading the technical drawings must have a Trimble account, in order to use SketchUp.

Notes:

1. This post follows up some aspects of Vehicle Devices, published 2020-11-03. The division between parts is not based on content, but on time. Part 1 of this weblog post was originally written 2021-06-18 and saved at 10:49. It had been patiently waiting to be published. On 2022-08-12, outdated content was removed, and Part 2 was added, starting at 20:43. Part 3 was started on 2022-08-13 at about 07:40, while Part 4 was started on the same date at 08:48.

2. Trondheim claims to be the third largest city in Norway, but I would give that title to Stavanger. The challenge with Stavanger, is that its metropolitan area is divided between multiple municipalities. Yes, I am aware that I have offended some of my Norwegian readers, because of their origins in Trøndelag. However, Stavanger is the only place in Norway where I have ever been able to find/ buy root beer! This is probably due to Americans working in the oil industry, and living in the Stavanger area.

WBT

WBT = wet-bulb temperature. Yes, I appreciate short, cryptic post titles. That said, there is a serious point to this post, with life-saving potential, related to heatwaves.

A digital psychrometer (combined dry and wet-bulb thermometer).
The functions available with a modern psychrometer.

When a wet cloth/ wick covers the bulb of a thermometer, evaporation of the water cools the thermometer. This results in a WBT, which is equal to or below the dry temperature, measured on a thermometer without a wet cloth. The WBT reading reflects the humidity in the atmosphere. Humidity refers to the relative saturation of air with water. Low humidity means there is not much water in the air; high humidity means lots of water in the air. When the air can hold no more water, it is totally saturated, referred to as 100% relative humidity (RH). Air at a higher temperature is able to hold more water vapour than air at a lower temperature.

A related concept is that of dew point = the temperature to which air must be cooled to become saturated with water vapor, at the current relative humidity. At, and below this temperature, water vapor condenses to form dew, a liquid state of water.

WBT is important because it can measure heat-stress conditions, that affect many people. In fact, a high WBT can kill them. This happened mainly from 2021-06-24 to 2021-07-01 in British Columbia. At 100% RH, the WBT will equal the dry temperature.

A combined dry and wet-bulb thermometer is referred to as a psychrometer. While analogue models are available, they require either calculations or the reading of graphs to determine values. One should not overestimate the ability of a person to perform even simple calculations, when they are potentially dying of heat stroke. Digital models can be purchased at relatively low cost that do all of the calculations automatically. The model illustrated above costs NOK 255 = US$ 26.38 = CA$ 33.72, including taxes and delivery charges to Norway (as of 2022-08-01). The quality of this particular model has not been evaluated.

At Cliff Cottage, we have recently received temperature and humidity sensors that will be part of our weather station, a subsystem of our home automation system. These components are considerably cheaper than the Habotest model, but require electrical and mechanical work, as well as programming, to implement as a system. Thus, the first iteration is not cost effective and may be frustrating to make, but it will give satisfaction when completed.
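For anyone building a similar sensor subsystem, the wet-bulb temperature does not need a separate wet-bulb sensor: it can be estimated from dry-bulb temperature and relative humidity. The sketch below uses the Stull (2011) empirical approximation, which is valid roughly between 5% and 99% RH and -20 °C and 50 °C at sea-level pressure; it is an approximation, not a psychrometer reading.

import math

def wet_bulb_stull(t_c, rh_percent):
    """Approximate wet-bulb temperature (°C) from dry-bulb temperature (°C)
    and relative humidity (%), using Stull's 2011 empirical formula."""
    t, rh = t_c, rh_percent
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh)
            - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

print(round(wet_bulb_stull(20, 50), 1))  # about 13.7 °C
print(round(wet_bulb_stull(40, 75), 1))  # about 35.8 °C, near the 35 °C limit discussed below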

From 2021-06-20 to 2021-07-29, the British Columbia Coroners Service reported the heat-related deaths shown in the table below. There are some who feel the number of deaths was under-reported. Note that 445 of the 569 deaths (78%) occurred during the transition week between June and July.

Age group    Deaths
< 40              2
40-49            13
50-59            42
60-69           127
70-79           160
80-89           149
90+              76
Total           569
Heat-related deaths during the summer of 2021. Source: British Columbia Coroners Service.

British Columbia was only one of many jurisdictions that faced heat challenges in 2021. Temperature records are being broken regularly throughout the world. In the United States, every state and territory has had a maximum temperature that exceeded 37 °C. Of these 46 entities, only six had recorded maximum temperatures below 40 °C. Four states had maximum temperatures at or exceeding 50 °C: New Mexico, 50 °C; Nevada, 52 °C; Arizona, 53 °C; and, the highest, California, 57 °C. In Canada, Lytton, British Columbia, distinguished itself with 49.6 °C on 2021-06-29, the maximum ever recorded in the country. The next day, a wildfire destroyed most of the town. In Norway, the highest temperature recorded is 35.6 °C, at Nesbyen, on 1970-06-20.

The challenge with these high temperature values, is that they do not take into consideration humidity, which determines how people experience heat. Some locations on planet Earth may be approaching values that prevent human survivability. The countries most affected are Saudi Arabia and other Arabian Gulf states, Pakistan, India and Australia. What is not fully appreciated, is that indoor climates in temperate zones, can also create conditions that kill people during heat waves.

The difference between WBT and dry temperature measures how effectively people can cool themselves by sweating. Admittedly, this is a simplification because, in addition to humidity and temperature, solar radiation and wind speed also affect survivability. Yet, WBT is especially important in indoor environments, where deaths often occur in heatwaves.

Above a critical WBT, sweating no longer cools a person down; instead, body temperature rises steadily. This is the limit of human adaptability to extreme heat. If a person cannot escape these conditions, their body’s core temperature will rise beyond the survivable range, and organs will start to fail.

The critical WBT value for humans has usually been considered to be 35 °C, a situation a healthy person could survive for about six hours. One combination producing this is an air temperature of 40 °C and a relative humidity of 75%. This value comes from a 2010 theoretical study. However, research by Vecellio et al. found that this value only applies to young, healthy people. Real-world data indicate that the critical WBT value is closer to 31.5 °C.

This means that the numbers of people exposed to potentially deadly combinations of heat and humidity across the world would be vastly higher than previously thought. Many older and compromised people will experience dangerous conditions far below the threshold WBT.

In Canada, the humidex = humidity index has been used since 1965 to describe how hot the weather feels to the average person. It is a nominally dimensionless quantity, calculated from °C and the dew point. Values of: 20 to 29: little to no discomfort; 30 to 39: some discomfort; 40 to 45: great discomfort, avoid exertion; above 45: dangerous; heat stroke possible/ probable.

Humidex plot (Source: Morn, using Matplotlib code)
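For the curious, the humidex can be reproduced from air temperature and dew point with the commonly quoted Environment Canada formula; the short sketch below is only a check of the arithmetic, not an official implementation.

import math

def humidex(t_c, dew_point_c):
    """Humidex from air temperature (°C) and dew point (°C)."""
    dew_k = dew_point_c + 273.15
    # Vapour pressure in hPa at the dew point.
    e = 6.11 * math.exp(5417.7530 * (1 / 273.16 - 1 / dew_k))
    return t_c + 0.5555 * (e - 10.0)

print(round(humidex(30, 20)))  # about 38: some discomfort, approaching great discomfort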

The American Heat Index (HI) was developed from the Humidex in 1979. It is calculated using °F or °C and relative humidity. It makes assumptions about: human body mass, height, clothing, level of physical activity, individual heat tolerance, sunlight exposure, ultraviolet radiation exposure, and wind speed. Significant deviations from these will result in heat index values which do not accurately reflect the perceived temperature.

Heat Index plot (Source: Morn, using Matplotlib code)
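The heat index is usually computed with the Rothfusz regression used by the US National Weather Service, which works in °F and relative humidity; the sketch below reproduces the published tables to within a degree or two over the range where the standard assumptions hold, and is offered only as an illustration.

def heat_index_f(t_f, rh_percent):
    """Approximate heat index (°F) via the Rothfusz regression (NWS)."""
    t, rh = t_f, rh_percent
    return (-42.379 + 2.04901523 * t + 10.14333127 * rh
            - 0.22475541 * t * rh - 6.83783e-3 * t ** 2
            - 5.481717e-2 * rh ** 2 + 1.22874e-3 * t ** 2 * rh
            + 8.5282e-4 * t * rh ** 2 - 1.99e-6 * t ** 2 * rh ** 2)

print(round(heat_index_f(90, 70)))  # about 106 °F, roughly 41 °C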

In many situations, building construction results in indoor temperatures exceeding outdoor temperatures. Construction methods may prevent water-saturated air from leaving a building. In climates with high humidity, such as along the Gulf of Mexico coast and even on the Atlantic coast of Florida, in the United States, it is common to place a vapour barrier close to the outer wall, with negative consequences during heatwaves. Sometimes the best solution is to omit a vapour barrier. This is the opposite of the approach used in cold climates, such as Canada, Norway and the northern United States, where the vapour barrier is located on the inside of the outer wall.

Heatwave Precautions

Since many of the readers of this weblog are older, it is important for them to know what to do when temperatures rise.

A first step is to realize that an indoor environment can be particularly deadly, in part because there is no wind to increase evaporation rates, needed for effective sweating/ perspiration.

A second step is to track indoor temperatures. Even without a psychrometer or wet-bulb thermometer, one knows that the WBT will be below this dry temperature. This means that temperatures should probably not be allowed to rise above, say, 30 °C, without taking some action. Thus, moving to a shady, outdoor location, may reduce risk, compared with staying indoors.

Air conditioning units are another solution, but not everyone can afford them. Their acquisition typically has to be planned well in advance of a heatwave. Fans can be effective at increasing the quantity of air available for evaporation, but they usually should be used with an open window.

In some places, special shelters have been built/ commissioned, that people can visit without charge to find heat relief. While many people will search for information online about shelters located near them, there are other sources of information available. Public libraries are a great place to find this sort of information.

Candela C-8

A Swedish built Candela C-8 foiling cruiser. Photo: Candela.

As I attempted to write the first paragraph of this weblog post, an observer came by. So I made a comment, hoping for some encouragement: “Beautiful boat, isn’t it?”

“Not particularly,” came the reply. “It looks awfully cold. They even have to wear toques. If someone fell in the water, how would they get back onboard?”

At that point I realized, yet again, that the observer and I live in two different universes. In my universe, the boat motor stops, the foils sink and the hull floats on the water. The person in the water is dragged through the open transom onto the boat. It is probably one of the easiest boats in existence from which to effect a rescue. I replied, “Would you like some tea?”

Tea is one of her passions, while watercraft are one of mine. A major achievement was building a sailing dinghy, a 2.4 meter long Sabot, at the age of thirteen. In my adult life I have owned two sailboats, including a Eygthene 24 cruiser.

Theoretically, I share the same speed obsession as Toad of Toad Hall, as found in Kenneth Grahame’s (1859 – 1932) The Wind in the Willows (1908), but in a more maritime variant. I appreciate fast sailboats, including America’s Cup AC72 foiling catamarans, AC75 foiling monohulls and the even more affordable foiling Moth dinghies.

Practically, I usually sailed my cruiser from its harbour to a small inlet two nautical miles (2 NM ≈ 3.7 km) away. I would then anchor, and enjoy the tranquility of its relatively remote location. One could make that journey in almost any type of boat, including a kayak or a row boat. The advantages of a cruiser include its galley, bunks, head and shelter from inclement weather.

A glimmer of hope that I might appreciate motorized vessels occurred in 2015. Aspiring to develop a new industry here in Norway, I gave my Technology students an assignment to design an electrically powered, water-jet vessel, based on a surfboard. I introduced the topic by showing a video about river jetsurfing. Now there are foiling boards as well, as this video shows. There are other foiling boards available, but most of them use propellers rather than waterjets, something I find ill-advised.

The Candela C-8 impresses me in several different ways.

First, the hull is constructed out of carbon fibre, using vacuum moulding techniques to create a rigid platform to mount the driveline and foils, as well as the passenger accommodation. It is also lightweight. However, it is not something that I would like to see come into close contact with sharp rocks.

Second, the driveline is remarkable. The battery is enclosed in a waterproof container, to prevent salt water from damaging it, and is freshwater cooled. The 40 kWh lithium-ion NMC battery pack (from the BMW i3) could (theoretically) power the vessel for 50 NM = 92.6 km = 57.5 miles. However, even Candela admits that a more probable result is 40 NM at 20 knots = 2 hours. The motor uses 70 kW to take off and start foiling, 16 kW to foil at 23 knots, and 37 kW at full speed = 30 knots.
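These range figures can be sanity-checked with a little arithmetic on the quoted power draws. The sketch below treats the quoted figures as the whole story, ignoring reserves, take-off energy and battery ageing, so it is indicative only; the 20 kW draw at 20 knots is inferred from the quoted 40 NM in 2 hours, not a published figure.

BATTERY_KWH = 40.0  # nominal pack capacity quoted by Candela

def range_nm(speed_knots, draw_kw, usable_kwh=BATTERY_KWH):
    # Endurance in hours times speed in knots gives range in nautical miles.
    hours = usable_kwh / draw_kw
    return speed_knots * hours

print(round(range_nm(23, 16), 1))  # 57.5 NM, if the whole pack were usable at a steady 16 kW
print(round(range_nm(30, 37), 1))  # about 32.4 NM at full speed
print(round(range_nm(20, 20), 1))  # 40.0 NM, matching the quoted 2 hours at 20 knots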

The C-Pod showing foils and contra-rotating propellers. Photo: Candela.

The motor is housed underwater, which provides cooling and noise reduction. Further, it is equipped with contra-rotating propellers, that is, two propellers that rotate in opposite directions about a common axis, usually to minimize the effect of torque. This approach reduces the size of the propellers needed, but it is a more complicated (read: expensive) system that may require more maintenance. Candela, however, claims that its C-Pod requires no maintenance, will operate for 3 000 hours without service, and is built to last a human lifetime. In addition, there is no need to change oil or cooling fluid, as the sealed electric motors are cooled by the flow of seawater. It is important to note that with contra-rotating propellers, hydrodynamic gains may be partially offset by mechanical losses in the shafting.

An exploded view of the C-Pod driveline showing the two shafts, and twin motors. Photo: Candela.

Third, the flight control system uses ten sensors to estimate the position, velocity and acceleration of the boat on all axes, and to determine/ estimate the real-time system state. This allows the vessel to operate in rough seas and make sudden, sharp turns. It is so much quieter than a hovercraft.

Fourth, I suspect there is a brilliant navigation system provided, that will keep those onboard out of danger. In addition, I suspect there is a dead-man switch/ man-overboard button that, when engaged, will automatically maneuver the vessel back to the point where the person fell overboard, or became incapacitated.

With a starting price of €290 000, I cannot afford to buy a C-8. No, I have never bought lottery tickets, on principle, so I have no prospect of ever being able to afford one. I would like to encourage my younger friends and family to follow the used market. I estimate that a 20-year-old vessel (at about 20% of the price) will offer optimal value.

If any of my offspring are wondering what to get me for my 80th birthday in 2028, a day foiling would be ideal. They can even choose the location, with the Salish Sea, San Francisco Bay or the Stockholm archipelago, three of numerous possibilities.

Material: Carbon fibre
Weight: 1 605 kg (DC version)
Passengers: 8, including the driver
Length: 8.50 m
Width: 2.50 m
Speed: 24 kn cruise, 30 kn top
Motor: Candela C-Pod (45/50 kW)
Range: 50+ NM at cruising speed; +3 NM at 4 kn in limp-home mode
Draft: 0.5 m in shallow mode; 0.9 m in planing mode; 0.8 m while foiling; 1.5 m while not foiling, foils extended
Charging: 230 V × 1 × 16 A: 13 h; 230 V × 3 × 32 A: 2.5 h
Interface: 15.4-inch touch screen with Candela’s proprietary navigation and boat integration system. Free software upgrades included. One year of free sea chart upgrades included.
App: Candela app with position, state of charge, route statistics and more. Optional geo-fence.
Hull shape: The hybrid hull is shaped for frictionless planing in addition to low air resistance when foiling. In planing mode the foils are above the surface, which prevents fouling and corrosion.
Candela C-8 Specifications

For additional propaganda: https://candela.com